ON-DEMAND WEBINAR | 27:13
Launchpad Live Community Call, March 26
Get the most out of Launchpad and learn directly from the team.
This session covers: 1) using AI for sample data during preview, 2) GenAI Connect rule for attachment analysis, and 3) building Insights within Launchpad Studio
Host: Tim Miranda (Fellow, Solutions Architect, Launchpad Provider Success)
Guest speakers: Chloe Loh (Product Manager, Case Management); Adam Torrey (Sr Product Manager, User Experience); Dileep Thotakura (Sr Product Manager, Low Code Studios)
What if your apps could build better previews on their own, understand files instantly, and visualize data in real time?
That’s exactly what we’re unlocking in this Launchpad Live Community Call.
Watch this on-demand webinar and learn about:
✨ AI-generated sample data for faster, more realistic previews
✨ GenAI Connect rules for intelligent attachment analysis
✨ Smarter file handling with configurable LLMs
✨ Live Insights for real-time data visualization in Studio
If you're building with Launchpad, this is how you level up: faster workflows, smarter automation, and more powerful apps.
1 "Tim Miranda" (553248768)
00:00:00.814 --> 00:00:22.789
Yeah, folks. All right. Well, welcome to our Launchpad Live community call for March. I'm Tim Miranda. I'm a Provider Success manager here at Pega with Launchpad. The purpose of this call is to introduce some stuff that's new and exciting in Launchpad, and some different ways
2 "Tim Miranda" (553248768)
00:00:22.789 --> 00:00:55.500
of doing things. So we'll go through three topics today: a new ability to use AI to generate sample data when you're previewing your app, a new capability in our Gen AI Connect rule, and finally we'll talk about building Insights within Launchpad Studio and the cool preview capabilities that we have. Anytime during this, if you want to submit questions, we can get to those at the end. And then we'll wrap it up; we'll talk about some upcoming events and go from there.
3 "Tim Miranda" (553248768)
00:00:55.500 --> 00:01:16.610
Alright, so like I said, I'm Tim Miranda. We have three speakers today from our product management organization: Chloe from case management, Adam from user experience and Insights, and Dileep from low code studios, and they're gonna be walking us through several features in Launchpad.
4 "Tim Miranda" (553248768)
00:01:16.610 --> 00:01:36.660
So this live call is mostly for existing providers, people that are using Launchpad, building their applications, and want to get deeper into it. But of course, if you're new to Launchpad or looking at it, please stick around. We'll be showing off different aspects of Launchpad that you'll probably be interested in.
5 "Tim Miranda" (553248768)
00:01:36.660 --> 00:01:58.010
Alright, cool. So we're gonna start. I'm gonna introduce Chloe and talk about this new feature using AI to generate sample data. Chloe, welcome. I know that in my role I get to demo Launchpad to folks pretty often, and I love this idea that
6 "Tim Miranda" (553248768)
00:01:58.010 --> 00:02:09.975
the platform now has like a one-click way to quickly fill in the form using AI so I can show off my app to my stakeholders. Can you tell me a little bit about this new feature?
7 "Chloe Loh" (3401936896)
00:02:09.975 --> 00:02:29.210
Well, a good way to think about this feature is that it's really a developer productivity accelerator for previewing cases. When developers are building an application in Launchpad, a big part of that workflow is constantly previewing the case, just to check whether the forms behave correctly, whether validations work,
8 "Chloe Loh" (3401936896)
00:02:29.210 --> 00:03:00.180
and whether the overall flow makes sense. But the friction comes from the fact that every time they preview, they have to manually fill in a lot of fields, things such as names, emails, dates, and this gets repetitive very fast, as you can imagine. We observe that a typical enterprise application has maybe ten to twenty fields in a particular assignment, so developers can easily spend a minute just filling in the data. And when you multiply that across many iterations, it really slows down development.
9 "Chloe Loh" (3401936896)
00:03:00.180 --> 00:03:17.763
So what this feature does is very simple on the surface: it introduces a Fill form with AI button at the bottom of every assignment during preview. When the developer clicks on it, the system automatically populates all the editable fields with realistic sample data.
10 "Tim Miranda" (553248768)
00:03:17.763 --> 00:03:37.580
Cool. Why don't you show us this working? I'd like to see more of this. Okay, cool. So creating a case here, oh, I see this new button, Fill form with AI. Yeah, that's great. Like you said, as I'm iteratively developing the app and trying it
11 "Tim Miranda" (553248768)
00:03:37.580 --> 00:03:58.460
out, I don't have to keep filling in the same forms or manually create some sample data and try to prefill it. So let's see this. So show me, how does it know what type of data to fill into each field? Obviously it's not random; it seems to be contextual. How does it know?
12 "Tim Miranda" (553248768)
00:03:58.460 --> 00:04:00.310
Or what's needed for each field.
13 "Chloe Loh" (3401936896)
00:04:00.310 --> 00:04:20.180
So what's cool about this feature, as you mentioned just now, is that the AI is not generating values in isolation; it's using both field-level information and case context to guide what it produces. So on the field level, the system has signals such as the field type, whether it's a date or an email or a phone number or currency. And for the
14 "Chloe Loh" (3401936896)
00:04:20.180 --> 00:04:44.270
field name, it has the name itself, which gives it semantic meaning. And on top of that, it also has access to the broader form context, such as what the case type is, what the case description is, and even the stage of the workflow. So this is how the AI understands the overall scenario that the developer is working in. So, e.g., imagine that you are working in a pet care case type and
15 "Chloe Loh" (3401936896)
00:04:44.270 --> 00:05:15.649
you are trying to schedule a vet visit for your cat. So if there's a field such as the appointment date, the AI will generate a value in the right date-time format. And if there is a field like a description of the issue, the AI will take the pet care context into consideration and generate something more meaningful, like "the cat has not been eating well for the past week," instead of just generic placeholder text. So the goal here is to generate plausible,
16 "Chloe Loh" (3401936896)
00:05:15.649 --> 00:05:21.980
useful business data that helps the developer move through the form very quickly.
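Conceptually, the prompt assembly Chloe describes, combining field-level signals with case context, might look like this minimal Python sketch. All names here are hypothetical illustrations for this discussion; Launchpad's actual implementation is not public:

```python
# Sketch: combine field-level signals (type, name) with case context
# (case type, description, stage) into a single generation prompt.
# Hypothetical illustration, not Launchpad code.

def build_sample_data_prompt(fields, case_context):
    """Build an LLM prompt asking for a realistic value per field."""
    lines = [
        f"Case type: {case_context['case_type']}",
        f"Description: {case_context['description']}",
        f"Stage: {case_context['stage']}",
        "Generate a realistic sample value for each field below.",
    ]
    for f in fields:
        # The field name carries semantic meaning; the type constrains format.
        lines.append(f"- {f['name']} (type: {f['type']})")
    return "\n".join(lines)

prompt = build_sample_data_prompt(
    fields=[
        {"name": "Appointment date", "type": "DateTime"},
        {"name": "Description of issue", "type": "Text"},
    ],
    case_context={
        "case_type": "Pet care",
        "description": "Schedule a vet visit",
        "stage": "Intake",
    },
)
```

The point of the sketch is simply that both levels of signal end up in the same request, which is why the generated values come out contextual rather than random.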
17 "Tim Miranda" (553248768)
00:05:21.980 --> 00:05:31.995
And if I'm previewing, if I'm demoing this or trying it out, and I've typed in any information, does it get overwritten when I click that button? What happens with that?
18 "Chloe Loh" (3401936896)
00:05:31.995 --> 00:05:55.079
No, it does not get overwritten. This is something that we are very intentional about. Whenever data is applied, fields that the developer has already manually filled in are not overwritten by the AI. The AI only fills in the gaps; it helps complete the form while respecting whatever is already on the form itself. And at the same time,
19 "Chloe Loh" (3401936896)
00:05:55.079 --> 00:06:15.079
the developer can choose to overwrite some of the values that are written by the AI very quickly, say if they want to test edge cases or whatnot. Outside of that, we also make sure that this feature is very safe by making it strictly an authoring-time feature; it's only available to provider developers when they're previewing
20 "Chloe Loh" (3401936896)
00:06:15.079 --> 00:06:32.538
the application, so it does not appear at all at runtime, and subscriber users will never see it. Developers also have control over whether they want to enable this capability to begin with, so they can choose to enable or disable this particular feature in app settings.
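The gap-filling rule Chloe describes, where AI values land only in fields the developer left empty, can be sketched in a few lines (hypothetical illustration, not Launchpad code):

```python
# Sketch: AI-generated values fill only empty fields; anything the
# developer already typed is preserved. Hypothetical illustration.

def apply_ai_values(form_values, ai_values):
    """Return a merged form dict where AI fills only the gaps."""
    merged = dict(form_values)
    for field, value in ai_values.items():
        if not merged.get(field):  # missing or empty -> fill with AI value
            merged[field] = value
    return merged

form = {"Pet name": "Whiskers", "Owner email": ""}
ai = {"Pet name": "Buddy", "Owner email": "jane.doe@example.com"}
result = apply_ai_values(form, ai)
# "Whiskers" survives; only the empty email field gets the AI value.
```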
21 "Tim Miranda" (553248768)
00:06:32.538 --> 00:06:51.649
That's great. Cool. So, yeah, that's awesome. It's a great productivity boost for developers, and I like that it's really targeting the developers that are iterating through their app. We don't have to worry about this affecting the actual end users, the subscribers; it's a nice,
22 "Tim Miranda" (553248768)
00:06:51.649 --> 00:07:09.599
isolated feature. Yeah, I love it. This is great. Appreciate it, Chloe. I've actually been using this already today in an app I'm working on, so very exciting. Well, thank you. I appreciate your time on this one, Chloe. Thank you.
23 "Chloe Loh" (3401936896)
00:07:09.599 --> 00:07:11.631
Thank you. All right.
24 "Tim Miranda" (553248768)
00:07:11.631 --> 00:07:36.335
Cool, so for the next one I'm gonna bring Dileep in here from low code, and let's talk about the Gen AI Connect rule. We've introduced the Gen AI Connect rule in the past, a great way to have LLM calls and, you know, summary analysis directly built into your workflow. Dileep, tell me about this new capability with the Gen AI Connect rule and working with attachments.
25 "Dileep Thotakura" (3628892160)
00:07:36.335 --> 00:07:55.609
I'm pretty excited to talk about this new feature called attachment analysis, and I'm sure a lot of provider developers have adopted the GenAI Connect rule, which is a seamless way to integrate AI into your applications and enhance them. Now, this particular capability is even more powerful because developers can now
26 "Dileep Thotakura" (3628892160)
00:07:55.609 --> 00:08:15.609
configure the rule to analyze various attachment types. The files can be images, documents, Word files, PDFs, etc., and it purely depends on the models. But again, the benefit is that whenever your end users upload attachments, receipts, or any kind of documents, you open up the
27 "Dileep Thotakura" (3628892160)
00:08:15.609 --> 00:08:34.090
opportunity for your workflows to summarize, extract, or process the data automatically in a more meaningful way, because you now have the power of large language models sitting inside your workflows, and it tremendously improves your end user productivity as well as your application experience.
28 "Tim Miranda" (553248768)
00:08:34.090 --> 00:08:53.451
Yeah, cool. I mean, this is one of the great parts about Launchpad: attachments and files are not treated separately, they're part of your business data, right? They're fields in your model and they can be used here, as well as in a bunch of other places, just like all the data you collect from the user. So show me how this is working. Yeah.
29 "Dileep Thotakura" (3628892160)
00:08:53.451 --> 00:09:12.739
So let's imagine we have an expense management app in which we have a manual way of entering all the fields, which is troublesome. But now I'm automating it with AI, which means whenever my end users upload images of expenses, I want to extract them automatically. So I created a GenAI Connect rule and I'm proceeding
30 "Dileep Thotakura" (3628892160)
00:09:12.739 --> 00:09:32.739
to configure it. So by just enabling the attachment analysis checkbox, you get this capability, and then you select the field in which the attachment content is present. It is that simple. Now, in this case I'm also configuring the output structure in which the LLM has to respond. This is even more powerful, and it's
31 "Dileep Thotakura" (3628892160)
00:09:32.739 --> 00:09:52.739
the new capability that we have launched, where you are indicating to the large language model that when it extracts the data, it has to fill the data into these fields automatically. And you're no longer bound to your primary case, in this case an expense submission; you can actually choose a case or data object as well.
32 "Dileep Thotakura" (3628892160)
00:09:52.739 --> 00:10:07.735
So this adds even more power, and it removes the need for you to map the fields in a JSON transform; we do it automatically for you. So in this case, yeah, I'm also configuring the automation to consume the output.
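The structured-output mapping Dileep describes, declaring an expected output structure and having the reply land directly in typed fields, can be sketched roughly like this. The field names and types are hypothetical examples, and this is a conceptual illustration rather than what Launchpad does internally:

```python
# Sketch: declare the expected output fields, parse the LLM's JSON
# reply, and map only the declared fields (with type coercion) back
# into the case data. Hypothetical illustration, not Launchpad code.
import json

EXPECTED_FIELDS = {"merchant": str, "total": float, "date": str}

def map_llm_output(llm_reply_json, expected=EXPECTED_FIELDS):
    """Map an LLM JSON reply onto the declared field structure."""
    raw = json.loads(llm_reply_json)
    case_data = {}
    for field, ftype in expected.items():
        value = raw.get(field)
        try:
            # Coerce to the declared type; fall back to None on failure.
            case_data[field] = ftype(value) if value is not None else None
        except (TypeError, ValueError):
            case_data[field] = None
    return case_data

reply = '{"merchant": "Cafe Roma", "total": "42.50", "date": "2025-03-12", "extra": "ignored"}'
expense = map_llm_output(reply)
```

Undeclared keys in the reply are simply dropped, which is the sense in which the declared data model, not the raw LLM output, drives what lands in the case.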
33 "Tim Miranda" (553248768)
00:10:07.735 --> 00:10:34.469
Right. Yeah, no, that's great, because it's so easy to call the LLM using the Connect rule from a step in my workflow. And with the structured output, it's using the power of the LLM to say, look, this is the kind of structure I expect back; I've got data types, I've got embedded data, whatever I've defined in my data model in my application, and it's able to just directly map those back. So it's sort of all in
34 "Tim Miranda" (553248768)
00:10:34.469 --> 00:10:54.469
one step: call the LLM, get the response, get that back in my business data model, and go from there. Very cool. So for the end user here, you mentioned that when they're uploading a file, it's gonna get analyzed by the LLM, and there's a set of supported file types.
35 "Tim Miranda" (553248768)
00:10:54.469 --> 00:11:01.454
Can you talk a little bit more about that? Like what choices do users have? Does it depend on the LLM, et cetera?
36 "Dileep Thotakura" (3628892160)
00:11:01.454 --> 00:11:21.169
Sure, Tim. So we have a lot of models that are already supported in Launchpad; you can choose agents or model providers, et cetera. And the capability to analyze files purely depends on the models. In my experience, all the latest and greatest models, and especially the Pega default fast and Pega default smart ones,
37 "Dileep Thotakura" (3628892160)
00:11:21.169 --> 00:11:41.169
are capable of analyzing various file extensions, including images, documents, and PDFs. That's something that I really like about the power of Launchpad: we are abstracting the complexity for you, and you can choose any model of your choice that is provided and get the functionality.
38 "Dileep Thotakura" (3628892160)
00:11:41.169 --> 00:12:04.178
Now, in my experience, the Gemini models are even stronger and add a ton of accuracy and reliable results, so you can always try those. But the fast and smart defaults are also great models; whether it's analysis or extraction, it really depends on your use case.
39 "Tim Miranda" (553248768)
00:12:04.178 --> 00:12:20.029
Yeah, that's great. This is a great add-on for the Gen AI Connect rule, because traditionally, on other types of platforms, I might need to set up some sort of file parsing or extraction, or use an external
40 "Tim Miranda" (553248768)
00:12:20.029 --> 00:12:40.029
service or something. Here, since the Connect rule, like you said, abstracts out a lot of the complexity of even using an LLM, I can just drop this into my workflow, have an automation call it, get the response back, and be able to do real file analysis, whether that's extracting identifying information
41 "Tim Miranda" (553248768)
00:12:40.029 --> 00:13:18.873
and fields that are needed, or doing summarization, or finding things in documents and figuring out the next steps in my business logic. This looks really easy. I've actually started using this in some of the provider applications that I'm helping with, and it just works. So that's cool. Now, for existing apps, do I have to change anything here? Like, if I've already got Gen AI Connect built into my workflows, do I need to do anything with those existing Gen AI Connect calls, or do they work as is?
42 "Dileep Thotakura" (3628892160)
00:13:18.873 --> 00:13:45.334
Yeah, the new functionality that we brought does not affect your existing connectors; they'll work the way they are. But like I said, this new functionality of extracting automatically and mapping to the response structure that you expect is an even better way, so you can always upgrade, or for new enhancements in your workflows you can always use this feature. Whether it is attachment analysis or a basic connect, it is supported in both. Awesome.
43 "Tim Miranda" (553248768)
00:13:45.334 --> 00:13:56.220
Yeah, no, this is great. Looking forward to what else will be coming in the future with the GenAI Connect rule. Anything else you want to add for this one?
44 "Dileep Thotakura" (3628892160)
00:13:56.220 --> 00:14:13.050
Nope. I would love for all provider developers to use this and brag about this feature, because it is pretty easy; our functionality is always easy to configure and easy for your end users to experience for greater applications. Thanks. Awesome.
45 "Tim Miranda" (553248768)
00:14:13.050 --> 00:14:33.339
Thanks, Dileep. All right, cool. So moving on to our third topic, I'm gonna bring Adam into the conversation. You know, we've had Insights in Launchpad to be able to build visualizations, whether they're tables, graphs, charts, things like that. But we want to walk through how easy it is to do these days
46 "Tim Miranda" (553248768)
00:14:33.339 --> 00:14:45.435
with the preview data being used to show the information as you're configuring your insight. So Adam, tell me a little bit more about this feature.
47 "Adam Torrey" (3380011008)
00:14:45.435 --> 00:15:03.119
Yeah, no, thanks, Tim. This one's really exciting for me: being able to access that preview data when you're doing things within your application, whether it's data exploration or building content like insights and landing pages, and to be able to see that data. Number one, it's just more visually appealing, right? It's more interactive.
48 "Adam Torrey" (3380011008)
00:15:03.119 --> 00:15:23.119
And for those of us that consider ourselves data enthusiasts, it's really great to see the impact of your actions as it's happening in real time. Things such as adding groups to your table or your charts, changing the sort or the filter or other information; it really comes to life in front of you.
49 "Adam Torrey" (3380011008)
00:15:23.119 --> 00:15:47.414
And it really gives you some clarity around what you're actually looking at. For instance, if I'm on a dashboard, I might have a bunch of different insights, and I'd like to better understand the content of each one to be able to arrange it in the most meaningful way. It gives you that overall ease of use, unlocking more feature sets and things during your authoring experience.
50 "Tim Miranda" (553248768)
00:15:47.414 --> 00:15:58.403
Cool. Yeah, so why don't we see what this looks like? Because as a developer, where will I see this preview data? Where can I take advantage of this when I'm configuring all these different things related to insights?
51 "Adam Torrey" (3380011008)
00:15:58.403 --> 00:16:14.169
Sure, yes. So you'll see this in several different places. Here you'll see it right as you start exploring data. This is how you access the data in your application: you want to show it in a table, you wanna create a chart or a KPI, and then
52 "Adam Torrey" (3380011008)
00:16:14.169 --> 00:16:34.169
ultimately save that as an insight that can be used in other places in your application as well. So this is where that comes to life, and as I mentioned, being able to change grouping and sorting and see things happen in real time. And also when you are adding that to a landing page and authoring it there, whether that's just a landing page with a list of data, or
53 "Adam Torrey" (3380011008)
00:16:34.169 --> 00:16:54.169
the dashboard that you see here with a bunch of different representations of your data. It allows you to basically add those, arrange those, and make that really come to life, as well as when you're creating views in your application. So you're able to show that data, and it makes it easier for you to understand, as you're building it, what that ultimate
54 "Adam Torrey" (3380011008)
00:16:54.169 --> 00:16:59.679
experience for your subscriber and your user base is going to be.
55 "Tim Miranda" (553248768)
00:16:59.679 --> 00:17:18.499
So this is using the same data that I have in preview. I don't have to move any data around, right? So as I'm demonstrating the app to stakeholders, or trying out things, or verifying a new feature as I'm building the app, I'm building up cases and data records and things in my preview
56 "Tim Miranda" (553248768)
00:17:18.499 --> 00:17:46.069
session, and that stuff's just gonna show up here. Correct. Yes. Okay. Awesome. Yeah. So this is great, because that's the same data that I want my stakeholders to see; like, look, this is my vision for the dashboard, is this meeting what you're looking for? And so as a developer, I can see that myself before I even make it available for other people to see, or launch preview and try out other parts of the app.
57 "Tim Miranda" (553248768)
00:17:46.069 --> 00:18:17.212
I love that I can see that as I'm configuring the insight, and I can see that as I'm configuring the views that I'm gonna be dropping that insight into, whether they're dashboards or, you know, list views in other places; this really pulls it all together. What else? So as I'm configuring these visualizations, I can see here my pie chart's filled in, my bar chart's there; where else can I see the preview data as I'm dealing with insights?
58 "Adam Torrey" (3380011008)
00:18:17.212 --> 00:18:45.189
Sure, yeah. So, number one, as you mentioned, it saves time on having to launch preview to see this stuff, but there are also other features that really come to life with this capability. Top of mind for me is drill-down within a chart; that's something where I'm able to see multiple levels of data and really understand my data structures better. Similarly, when you are drilling down within your charts, there's that underlying view of records.
59 "Adam Torrey" (3380011008)
00:18:45.189 --> 00:19:05.189
So when you get to the bottom, right, I wanna identify a subset of data and go see those records. That lowest level can actually be configured as well, so this also helps you tailor it to your use case. You can also see the promoted filters on the top there, whether it's an insight or a page, and that allows more of that ad hoc style
60 "Adam Torrey" (3380011008)
00:19:05.189 --> 00:19:10.139
of filtering and querying, to see your data come to life as you need to see it.
61 "Tim Miranda" (553248768)
00:19:10.139 --> 00:19:25.319
Okay, cool. Yeah, this is great. This really makes it fast to configure the visualizations I need, and I already have providers starting to use this. They love that they can see
62 "Tim Miranda" (553248768)
00:19:25.319 --> 00:19:45.319
exactly what they're gonna get when they're showing off the app here in preview, and they can control this too. It's not generated data or made-up data; it's what they have in preview. So as they load in real examples, real data records that they need, or what they're gonna show to, say,
63 "Tim Miranda" (553248768)
00:19:45.319 --> 00:19:59.519
stakeholders, they'll be seeing this here in the authoring environment as well. So cool. Well, this is neat; appreciate it, Adam. I hope people go explore this right after this call and take a look.
64 "Adam Torrey" (3380011008)
00:19:59.519 --> 00:20:01.294
Great, thanks Tim. All right.
65 "Tim Miranda" (553248768)
00:20:01.294 --> 00:20:18.919
Cool. So, alright, we've been collecting some questions here, so what we'll do now is run through some of them. If we don't get to your question, we can also follow up offline, and we'll point you to some more resources as well so you can learn
66 "Tim Miranda" (553248768)
00:20:18.919 --> 00:20:24.900
more about these topics. Katie, do we have any initial questions to start with?
67 "Katie Cibulka" (1110992128)
00:20:24.900 --> 00:20:32.736
Yeah, of course. Let's start with Chloe. Can we choose the LLM model and its temperature for the data generation?
68 "Chloe Loh" (3401936896)
00:20:32.736 --> 00:20:53.879
Yeah, so by default, the feature comes with a pre-configured model as well as temperature, so developers can just click a button and quickly fill in the sample data right away without having to think about tuning anything. But under the hood, this is actually powered by a Gen AI Connect rule, which we talked about so much just now, and that rule is designed to be overridable.
69 "Chloe Loh" (3401936896)
00:20:53.879 --> 00:21:10.936
So if a developer does want more control, e.g., if they want to tweak the temperature, change the prompt, or switch to a different model, what they can do is just override that particular rule and make those changes themselves.
70 "Katie Cibulka" (1110992128)
00:21:10.936 --> 00:21:30.769
Great, thank you. Let's see, next question. You already answered this, but let's share it with the whole group. Can the GenAI Connect rule parse receipt line items to fill out
71 "Katie Cibulka" (1110992128)
00:21:30.769 --> 00:21:34.039
an embedded list of line items in the case?
72 "Dileep Thotakura" (3628892160)
00:21:34.039 --> 00:21:53.329
Yeah, sure. I have answered this, but I'll explain again. Yes, the Connect rule can parse the list of line items, provided that in your output configuration you have chosen the line items as a data object or case type object and selected the cardinality as multiple records.
73 "Dileep Thotakura" (3628892160)
00:21:53.329 --> 00:22:09.756
At that level we support scalar fields, but if a line item contains another level of complex object, that is not supported; that's an enhancement we have on the roadmap. Okay, thank you.
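The constraint Dileep describes, a list of line items whose fields must all be scalar, with deeper nesting rejected, can be sketched like this (hypothetical validator for illustration only, not Launchpad's actual behavior):

```python
# Sketch: line items (cardinality: multiple) may only contain scalar
# field values; a nested complex object inside a line item is rejected.
# Hypothetical illustration, not Launchpad's validator.

SCALAR_TYPES = (str, int, float, bool)

def validate_line_items(items):
    """Accept a list of dicts whose values are all scalar."""
    for item in items:
        for key, value in item.items():
            if not isinstance(value, SCALAR_TYPES):
                raise ValueError(f"Nested complex value not supported: {key}")
    return True

# A flat list of scalar-field line items is fine.
ok = validate_line_items([
    {"description": "Taxi", "amount": 23.10},
    {"description": "Lunch", "amount": 14.75},
])

# A second level of nesting inside a line item is rejected.
try:
    validate_line_items([{"description": "Hotel", "charges": [{"night": 1}]}])
    nested_ok = True
except ValueError:
    nested_ok = False
```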
74 "Katie Cibulka" (1110992128)
00:22:09.756 --> 00:22:17.374
One more for you, Dileep. Can you support multiple attachments at once?
75 "Dileep Thotakura" (3628892160)
00:22:17.374 --> 00:22:31.217
No, today we only support a single attachment in a single LLM call, but that is also an enhancement on the roadmap, and we will soon be able to deliver that. Great.
76 "Katie Cibulka" (1110992128)
00:22:31.217 --> 00:22:37.057
So Chloe, another one for you: are all field types supported for this feature?
77 "Chloe Loh" (3401936896)
00:22:37.057 --> 00:22:52.859
The feature is really optimized around the most common and practical field types that developers typically need help with during preview, so it handles scalar fields such as text, emails, numbers, dates, URLs, and whatnot.
78 "Chloe Loh" (3401936896)
00:22:52.859 --> 00:23:16.100
And there are some field types that are deliberately not supported. This includes things that require system relationships, such as user references or rule references, as well as things such as attachments, which take a long time for the AI to generate and would cause huge latency in the call itself. Gotcha.
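The supported-versus-skipped split Chloe describes can be pictured as a simple partition over field types. The type names here are hypothetical labels chosen for the sketch, not Launchpad identifiers:

```python
# Sketch: scalar field types get AI-generated sample values, while
# reference and attachment types are deliberately skipped.
# Hypothetical type names, not Launchpad identifiers.

SUPPORTED = {"Text", "Email", "Integer", "Decimal", "Date", "URL"}
SKIPPED = {"UserReference", "RuleReference", "Attachment"}

def fields_to_generate(fields):
    """Partition fields into AI-fillable and deliberately skipped."""
    fillable = [f for f in fields if f["type"] in SUPPORTED]
    skipped = [f for f in fields if f["type"] in SKIPPED]
    return fillable, skipped

fillable, skipped = fields_to_generate([
    {"name": "Owner email", "type": "Email"},
    {"name": "Assigned to", "type": "UserReference"},
    {"name": "Receipt", "type": "Attachment"},
])
# Only the email field is fillable; the reference and attachment are skipped.
```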
79 "Katie Cibulka" (1110992128)
00:23:16.100 --> 00:23:27.578
I'm looking through some of these, sorry. How accurate or realistic is the data that gets generated? Can developers actually rely on it for testing?
80 "Chloe Loh" (3401936896)
00:23:27.578 --> 00:23:44.219
I would say that the data is realistic enough for day-to-day testing, especially if you just want to move through a form quickly and validate the form and the flow itself. This is because the AI is using things such as the field types, the field names, as well as the overall case context.
81 "Chloe Loh" (3401936896)
00:23:44.219 --> 00:24:04.219
However, I won't position it as something that replaces all testing, but more as a fast starting point. So, e.g., if you need data that is targeted at very specific edge cases or business-critical values, then you might want to adjust the values manually to make sure that they pass validations and whatnot. The way
82 "Chloe Loh" (3401936896)
00:24:04.219 --> 00:24:14.712
I'd describe it is that it's a great first pass that saves you a lot of time, but you are still fully in control of the data.
83 "Katie Cibulka" (1110992128)
00:24:14.712 --> 00:24:23.454
Great. Dileep, are there any best practices to leverage while using this feature?
84 "Dileep Thotakura" (3628892160)
00:24:23.454 --> 00:24:42.439
Sure. So like I said, the model capabilities are documented; we have also made extensive documentation about attachment analysis and the best practices out there. But to me personally, choosing the model based on your use case is really important.
85 "Dileep Thotakura" (3628892160)
00:24:42.439 --> 00:25:02.439
When you use the output configuration, you can actually leverage the power of automatic mapping, and you don't have to write specific prompts in your connector to ask the LLM to extract; you just have to define your output structure well using the data modeling capabilities of Launchpad.
86 "Dileep Thotakura" (3628892160)
00:25:02.439 --> 00:25:22.439
We take on the heavy lifting of indicating to the LLM, hey, this is what the user is expecting, and it has to map those fields appropriately, and you can actually use that easily in your workflows. Again, one important thing is that whenever the model is unable to map into those fields,
87 "Dileep Thotakura" (3628892160)
00:25:22.439 --> 00:25:34.458
we do not break any of your behavior; we just fill in null or empty values, and you can always debug it from the tracer and proceed with your use cases.
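The failure behavior Dileep describes, never breaking the workflow and filling nulls for anything the model could not map, while keeping a record you can inspect (akin to checking the tracer), might look like this sketch (hypothetical illustration, not Launchpad code):

```python
# Sketch: mapping never raises; unmapped fields become None, and the
# misses are recorded for later debugging. Hypothetical illustration.

def safe_map(llm_output, expected_fields):
    """Map expected fields, filling None for anything the model missed."""
    mapped, misses = {}, []
    for field in expected_fields:
        if field in llm_output:
            mapped[field] = llm_output[field]
        else:
            mapped[field] = None   # never raise: fill null instead
            misses.append(field)   # keep a record for debugging
    return mapped, misses

mapped, misses = safe_map(
    {"merchant": "Cafe Roma"},
    expected_fields=["merchant", "total", "date"],
)
# The workflow proceeds with nulls for "total" and "date", and the
# misses list tells you which fields to investigate.
```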
88 "Katie Cibulka" (1110992128)
00:25:34.458 --> 00:25:40.897
Okay, great. Well, I think that's all the questions that we have for right now.
89 "Tim Miranda" (553248768)
00:25:40.897 --> 00:26:08.789
Okay. Cool. Thanks, everybody. Thanks, Katie, for walking through those. Alright, so I think we'll wrap it up here. Just to briefly mention again, coming up in May, on May 9, we have our next Launchpad Live community call. Again, we'll be talking about all sorts of different topics, new features and features that we want folks to really check out. Of course, follow us on LinkedIn
90 "Tim Miranda" (553248768)
00:26:08.789 --> 00:26:28.789
to find out about more events as they're happening. And if you go to launchpad.io, in the resources you can get to a bunch of really useful stuff: the release notes, which will have, every week or every couple of weeks, really detailed information about new and extended features. We have our help documentation in there as well. I just want to give a shout-out to
91 "Tim Miranda" (553248768)
00:26:28.789 --> 00:26:48.789
the AI assistant. While you're developing in Studio, the little purple icon in the upper right, use that. That thing will now source from our documentation; you can chat with it, ask questions, and it also sources from our webinars and our use cases, so everything that you can find on launchpad.io.
92 "Tim Miranda" (553248768)
00:26:48.789 --> 00:27:11.209
You can actually have an interactive chat now with the assistant, figure out the best way to do something, and figure out where to go instead of hunting through documentation. So please take a look at those, go to launchpad.io, and check out the upcoming events as well. Otherwise, thanks, everybody. These were three features that I use every day in my own Launchpad development, so I hope people get to
93 "Tim Miranda" (553248768)
00:27:11.209 --> 00:27:23.197
check them out as soon as they can. Thanks, everybody, for your participation today, and I think that's it for this Launchpad Live. Thank you, everybody. Thank you.
94 "Chloe Loh" (3401936896)
00:27:23.197 --> 00:27:27.360
Thank you.