AskGPT query is super slow

I'm using the AskGPT function to ask for a bedtime story and then show the result on the next screen. After firing the action, the user has to wait roughly a minute to find out whether anything came back or it just failed, and about 1 in 3 times it returns nothing. I tried making the prompts very short; that changed things a little, but the wait is still 40-50 seconds. Is anybody else having the same problem?

Have you tried using the chatGPT component in the marketplace?

Hm, I don't think so. I'm looking at the YouTube intro for the marketplace "ChatGPT Response" component (ChatGPT Response Component - Adalo Marketplace - YouTube) and at this comment from two weeks ago below it:


Q: Hi, can you connect it to your Adalo db and make it answer with the data you have there?
A: To the best of my knowledge I don't think it's possible. It may be if someone has found a workaround or something like that. Let me know if you have any more questions.


So maybe the response delay is because I use the Adalo DB? Here's my flow (a rough code sketch of it follows the list):

  1. The GPT prompt is populated from 10 different variables from the user collection and saved into a user field named "prompt"
  2. AskGPT fires this prompt, and the GPT response is also saved into a user field, named "GPTresponded"
  3. When the "GPTresponded" field is updated, the user is sent to the next screen and "GPTresponded" is shown
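For context, this is roughly what those three steps amount to if done outside Adalo, e.g. in a small custom backend or webhook. This is only a minimal sketch: the field names, model, and key handling are all assumptions, not what the Adalo component actually does internally.

```python
import requests

OPENAI_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = "sk-..."  # placeholder, would come from a secret/env var

def ask_gpt_for_user(user: dict) -> str:
    # Step 1: build one prompt string from the user's stored fields
    # (a few hypothetical fields here instead of all ten).
    prompt = (
        f"Write a short bedtime story for {user['child_name']}, "
        f"age {user['child_age']}, who loves {user['favourite_animal']}."
    )

    # Step 2: send the prompt to OpenAI's chat completions endpoint.
    resp = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 400,  # capping output length also caps the wait
        },
        timeout=60,
    )
    resp.raise_for_status()
    answer = resp.json()["choices"][0]["message"]["content"]

    # Step 3: the caller saves the answer back to the "GPTresponded" field
    # and only then navigates to the next screen.
    return answer
```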

If I used Xano or Airtable, or connected to GPT at the database level, would it make any difference in speed and reliability?

Hmm… I haven't tried that yet.

Maybe @Ali-Bazzi can answer? I know he's used the ChatGPT response component and loved it :slight_smile:

There can be many reasons why it's so slow, but a chat component inside Adalo will be slow in any case. Maybe you can play around with it, though? Don't use a list on the 'asking' and 'answering' screens — just inputs. Yes, that means the user only sees one answer at a time, but you can record the conversation and add a 'History' button (or something like that) that shows the whole conversation. I don't know if that works for you, but I tried the OpenAI API directly, and with no database involved in the answer I wait about 10 seconds. That's not a bad time.
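For example, here's a minimal sketch of what recording the conversation for a 'History' button could look like. In Adalo this would be a separate collection related to the user; the in-memory list and field names below are just assumptions to show the shape of the data.

```python
from datetime import datetime, timezone

# Stand-in for a "Messages" collection related to the user;
# an in-memory list here just to show the data shape.
history: list[dict] = []

def record_exchange(question: str, answer: str) -> None:
    # Save each question/answer pair as soon as the API responds.
    history.append({
        "question": question,
        "answer": answer,
        "created_at": datetime.now(timezone.utc).isoformat(),
    })

def render_history() -> str:
    # What a 'History' screen would list, newest exchange first.
    return "\n\n".join(
        f"Q: {item['question']}\nA: {item['answer']}"
        for item in reversed(history)
    )
```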

Hello @James_App_Maker and @Chhhh! The ChatGPT component works very well, and it can be used to get data from the database and then answer based on that data. I haven't faced any issues with it.

Thank you!
