We’re about a month in since switching to dstore, and we must say we couldn’t be happier. The filtering speed on large datasets is seriously impressive, and coupled with OnDemandGrid it’s a fantastic solution.
We’re providing a reporting application that, if the user is silly enough, can pull down many, many records. We send pages of data (about 1k records each) from the server (node.js), and it all runs fine up to somewhere between 150k and 200k records. This is an extreme use case and one we’ll probably block in the application logic, but if a user decides to pull down a year’s worth of transaction records, the totals could reach those sorts of numbers. One question: are there any known or specific record-count limits in dstore we should be aware of? We’ve turned off tracking and force dgrid to refresh periodically as the data loads (roughly every 15k records, and once at the end of the batch), and it works really well considering the size of the dataset.
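For context, the client-side batching looks roughly like the sketch below. The store and grid are passed in as plain callbacks so the sketch runs anywhere; `loadInBatches`, `addRecords`, and `refreshGrid` are placeholder names, not dstore or dgrid APIs — in the real app they wrap a non-tracked dstore `Memory` store and the grid’s `refresh()`.

```javascript
// Sketch of the batching described above: pages of ~1k records arrive
// from the server, and the grid is only asked to redraw every
// `refreshEvery` records, plus once at the end of the batch.
function loadInBatches(pages, addRecords, refreshGrid, refreshEvery = 15000) {
  let sinceRefresh = 0;
  for (const page of pages) {
    addRecords(page);          // e.g. store.addSync(record) for each record
    sinceRefresh += page.length;
    if (sinceRefresh >= refreshEvery) {
      refreshGrid();           // e.g. grid.refresh()
      sinceRefresh = 0;
    }
  }
  refreshGrid();               // final redraw once the batch completes
}
```

Keeping tracking off and batching the redraws is what makes this viable; refreshing the grid per record would be far too chatty at these volumes.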
It’s an extreme case, but we’d appreciate any thoughts so we don’t chase our tails.