Experience with OpenAI Assistants / Threads

Hi everyone,

After using the native Adalo AskChatGPT action (thanks to Adalo for that, it was such a good way to start with the GPT API), I figured out how to make my own custom action that addresses OpenAI directly, and this works just fine.

What I struggle with: I want to use OpenAI's Assistants, which (afaik) means not using /completions but /threads. So if I get this right, it's just a different way of using the API?

And I am really not a developer. Addressing the /completions API was fairly easy, but /threads seems to be a whole other thing…

Long story short: does anyone have experience with addressing OpenAI Assistants via the API and can tell me/us about their experience, dos and don'ts, etc.?

BR
Mat

PS: For those interested: I will use several different custom actions (each calling the GPT API) in my app, and all of them need a different (AI) skillset, as they are set up for different tasks. Later on I also want the AIs I have configured to learn from what they have done so far (like recursive learning), and if I have to address a fresh chat/AI every time, that seems like a stretch.
PPS: As I said: I'm neither an experienced developer nor an expert in AI/OpenAI and/or API topics, so if you have anything to teach me, please do so. I'm all ears :smiley:

Hello, Matt!

I’m struggling with a similar problem, so I was wondering if you ever figured out how to use threads? I’m already pretty skilled with /completions, but /threads is really getting to me, especially when it comes to creating a history.

Best Regards, Dore

Hi Dore,

Yes, I found a solution: I use a make.com scenario.

The example you see here gets info from my app, runs a search query on SerpApi, aggregates the information, passes it to my assistant, and then saves the assistant's answer directly to the Adalo database.
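For anyone who wants the same pipeline without make.com, the steps can be sketched as plain request builders. Nothing here actually sends a request; the SerpApi search endpoint and the Adalo Collections endpoint are the publicly documented ones, while the app/collection ids and the "Answer" column are placeholders:

```python
# Sketch of the make.com scenario as Python request builders.

def serpapi_search(api_key, query):
    # Step 1: run the search query on SerpApi (GET https://serpapi.com/search).
    return {"url": "https://serpapi.com/search",
            "params": {"engine": "google", "q": query, "api_key": api_key}}

def aggregate(serp_json):
    # Step 2: aggregate the organic results into one text block
    # that can be passed to the assistant as a single message.
    snippets = [r.get("snippet", "")
                for r in serp_json.get("organic_results", [])]
    return "\n".join(s for s in snippets if s)

def adalo_save(app_id, collection_id, answer):
    # Step 3: save the assistant's answer to the Adalo database via
    # the Collections API ("Answer" is a placeholder column name).
    return {"url": f"https://api.adalo.com/v0/apps/{app_id}"
                   f"/collections/{collection_id}",
            "json": {"Answer": answer}}
```

The assistant call between steps 2 and 3 is the same /threads flow discussed earlier in this topic.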

Hope this helps,
BR
Martin