Do we think responses will eventually be able to be streamed? This really shouldn’t be a make-or-break for my app, but since users are constantly chatting with the AI, waiting 10-20 seconds can come across as unprofessional. Anyone else feeling the same?
Yes, waiting 10-20 seconds is a lot. I don’t think Adalo has a solution for it.
Maybe we can use a third-party service and stream it on Adalo? Not sure if this works.
I personally tried to stream the response, but my test came back empty.
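For reference, this is roughly the kind of streaming call I was attempting outside Adalo. It’s a minimal sketch assuming the official openai Node SDK (v4+) and the Chat Completions endpoint with stream: true; the model name and prompt are just placeholders:

```ts
// Minimal sketch: stream a chat completion so tokens arrive as they are
// generated instead of after the full 10-20 second wait.
// Assumes the official "openai" Node SDK (v4+) and an OPENAI_API_KEY env var.
import OpenAI from "openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function main() {
  // stream: true makes the API return chunks (server-sent events) instead of
  // one JSON body, so the first tokens usually show up within a second or two.
  const stream = await client.chat.completions.create({
    model: "gpt-3.5-turbo", // placeholder model
    messages: [{ role: "user", content: "Say hello to an Adalo user." }],
    stream: true,
  });

  for await (const chunk of stream) {
    // Each chunk carries a small delta of the reply; append it to the UI
    // (or relay it to the app) as it arrives.
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

main();
```

The streaming itself works fine at this level; the hard part is getting Adalo to display the partial text as it arrives rather than waiting for the whole response.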
This lag is at the API level - you can see lots of discussions online about this. Not sure there is anything Adalo can do.
I think I’ll be able to make this work with a component that I’m about to release. I’ll let you guys know when it’s ready.
Yes, James, that would be great. Please keep us updated. Thanks!
Hello, with the ChatGPT component in the marketplace, we’re getting responses quickly, in less than 10 seconds.
Thank you!
That’s awesome to hear @Ali-Bazzi!
Thanks for the support!