Performance issue - Limit on number of rows

Hi @mdelvita, @Torrsy, @James_App_Maker,

I guess there are 3 questions mixed in one topic.

First is regarding the Sheety integration. Out of curiosity, I’ve tested this with a Google Sheet of 10K records (autogenerated) and 5 columns (ID, Name, Surname, City, Phone). I’ve successfully integrated it with Sheety and was able to add it as an external collection. See the screenshots of how I display the list with a filter by Name:


Sheety has very restrictive limits on request quantity on the free tier, so I didn’t test it extensively. But the integration is possible, so maybe if you could share more info about the problem you’re facing, someone can give advice.
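For anyone experimenting with Sheety outside Adalo too: Sheety exposes each sheet as a JSON REST endpoint. Below is a minimal Python sketch of fetching and filtering such data; the project URL, sheet name, and the lowercase `name` key are all hypothetical placeholders (your actual endpoint and key names come from your Sheety project). The example at the bottom filters local sample data only, so no network call is made:

```python
# Sketch of reading a Sheety-backed sheet and filtering rows by name locally.
# SHEETY_URL is a hypothetical placeholder -- Sheety generates one per project.
import json
from urllib.request import urlopen

SHEETY_URL = "https://api.sheety.co/yourProjectId/contacts/sheet1"  # hypothetical

def fetch_rows(url=SHEETY_URL):
    """Fetch all rows from a Sheety endpoint (returns a list of dicts)."""
    with urlopen(url) as resp:
        payload = json.load(resp)
    # Sheety wraps the rows in a key named after the sheet, e.g. {"sheet1": [...]}
    return next(iter(payload.values()))

def filter_by_name(rows, name):
    """Case-insensitive substring match on the 'name' field of each row."""
    needle = name.lower()
    return [r for r in rows if needle in str(r.get("name", "")).lower()]

# Example with local sample data (no network call):
sample = [
    {"id": 1, "name": "Alice", "city": "Rome"},
    {"id": 2, "name": "Bob", "city": "Oslo"},
]
print(filter_by_name(sample, "ali"))  # -> [{'id': 1, 'name': 'Alice', 'city': 'Rome'}]
```

With Adalo’s external collections you won’t write this code yourself, but it’s a quick way to sanity-check what the endpoint returns before wiring it up.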

As for the 2nd question, regarding CSV import to the Adalo DB - unfortunately, this is a known limitation. It is usually not recommended to upload more than 3K records at one time. I think this may be related to API timeouts; for the same collection as above, the upload to Adalo got through about 7.5K records before it stopped. So the general advice here is to break one CSV into several smaller ones, for example 1K records each, and upload them one by one.
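The “break one CSV into several” step can be automated with a small script. Here’s a minimal Python sketch, assuming the CSV has a header row (the output file naming is just an example):

```python
# Sketch: split a large CSV into ~1K-record chunks before uploading to Adalo.
# The header row is repeated in every chunk so each file imports on its own.
import csv

def _write_chunk(path, part, header, rows):
    """Write one chunk file, e.g. data.csv -> data_part1.csv (naming is an example)."""
    out = path.replace(".csv", f"_part{part}.csv")
    with open(out, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)
        writer.writerows(rows)

def split_csv(path, chunk_size=1000):
    """Split the CSV at `path` into chunk files; returns the number of files written."""
    parts = 0
    with open(path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        chunk = []
        for row in reader:
            chunk.append(row)
            if len(chunk) == chunk_size:
                parts += 1
                _write_chunk(path, parts, header, chunk)
                chunk = []
        if chunk:  # write the final, possibly smaller, chunk
            parts += 1
            _write_chunk(path, parts, header, chunk)
    return parts
```

For example, `split_csv("data.csv")` on a 2,500-record file would produce `data_part1.csv` and `data_part2.csv` with 1,000 records each and `data_part3.csv` with the remaining 500, each with its own header row.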

And for the general question regarding data handling: Adalo can handle large collections (the largest one I’ve seen is 100K records), but working with such collections requires some planning.
Based on my experience, I’d avoid displaying a collection of 10K records as an unfiltered list, and would always try to “break down” the whole data set into smaller parts. For example, for the product list mentioned by @Torrsy it could be a good idea to have it always filtered by category.
Also, for built-in collections you can enable the “load and display only visible items” setting, which may help when showing large lists.

By the way, Adalo has introduced DB Indexing for larger apps, so this might also improve performance.

Best,
Victor
