Hello,
I’ve created a model that emulates the real world by importing data in real time, and at this point it works perfectly.
The thing is, I need to generate some statistics and tables from it. I built them using Statistics Collectors and Calculated Tables. When I ran the model for a few hours, it didn’t keep a 1:1 ratio with the real world, which produced incorrect data. I assumed this was a problem with the Calculated Tables (they re-execute their SQL over all the data every time they update), so I rebuilt that part of the model using only plain tables and code, saving the results in Global Tables.
Now, when I run the model, it crashes after a few hours. If I open the Windows Task Manager, I can see it consuming about 1 MB more RAM every second.
Is there a way to collect this data without keeping all of it in RAM? What do you recommend?
I read something about bundles. Or would it be better to export the data to CSV every few hours and then clear the Global Tables? (I would prefer not to do that.)
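For context, the export-and-clear pattern I’m considering would look roughly like this (a generic Python sketch of the idea, not actual FlexSim/FlexScript code; the class and parameter names here are just placeholders):

```python
import csv

class FlushingCollector:
    """Accumulate rows in memory and periodically flush them to a CSV
    file, clearing the in-memory buffer so RAM use stays bounded."""

    def __init__(self, path, flush_every=10000):
        self.path = path
        self.flush_every = flush_every  # max rows held before writing out
        self.rows = []

    def record(self, row):
        self.rows.append(row)
        if len(self.rows) >= self.flush_every:
            self.flush()

    def flush(self):
        if not self.rows:
            return
        # Append mode: each flush adds rows to the same CSV on disk.
        with open(self.path, "a", newline="") as f:
            csv.writer(f).writerows(self.rows)
        self.rows.clear()  # free the memory once the rows are persisted

collector = FlushingCollector("stats.csv", flush_every=1000)
for t in range(5000):
    collector.record([t, t * 2])
collector.flush()  # write any remaining rows at the end of the run
```

The idea is that memory use is capped at `flush_every` rows instead of growing for the whole run, at the cost of the statistics living on disk rather than in a live table.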
Thank you!!