Array vs Bundle vs Table

Durga Ghosal asked:

I am trying to understand which has the greatest impact on model run speed: managing the data in an Array, a Bundle, or a Table.

Tags: FlexSim 17.1.6, model speed

Matt Long answered:

With that many rows of data I would use either a table or a bundle. As Jordan stated, a bundle is the most memory efficient, but a bundle can only store numbers and strings, and each column must be either all numbers or all strings.

Using a Global Table, you can keep the data as a node table (the default) or check the box to have it stored as a bundle. There is really no difference in the interface between the two, and the code for accessing values in the table stays the same either way, so it is easy to switch back and forth between the two storage methods and compare their speed, as Steven suggests.
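
For example, here is a minimal FlexScript sketch of writing to and reading from a Global Table (the table name "LogTable" and its two-column layout are placeholders, not from this thread). The same lines work whether the table is stored as a node table or as a bundle, as long as the bundle's columns are typed to match:

    Table logTable = Table("LogTable"); // look up the Global Table by name
    logTable.addRow();                  // with no arguments, appends a row at the end
    int row = logTable.numRows;
    logTable[row][1] = time();          // numeric column: current model time
    logTable[row][2] = "entered queue"; // string column (in a bundle, this whole column must be strings)
    double t = logTable[row][1];        // reading back uses the same syntax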


Jordan Johnson answered (comment converted to an answer by Emily Hardy):

This very much depends on what you want to do. All three are just about equally fast for read speeds. Bundles are the most memory efficient, storing very little data besides the actual values. Bundles can also be indexed for quick lookup. Tree tables and arrays are more memory intensive, but they can store a much wider range of values, including nodes, or even other tree tables or arrays. Usually, I would prefer the tree table over the array, because the interface for working with tree tables makes it much easier to see what's going on.
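
As a quick illustration of that flexibility, the FlexScript sketch below stores mixed values, including a node, in an Array (the object path "/Processor1" is a placeholder for any object in your model):

    // Arrays and node-table cells can hold numbers, strings, nodes,
    // and even nested arrays; a bundle column cannot.
    Array mixed = [42, "a string", node("/Processor1", model())];
    mixed.push([1, 2, 3]);          // nesting another array is fine
    treenode proc = mixed[3];       // FlexScript arrays are 1-indexed
    return mixed.length;            // 4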


Durga Ghosal commented:

I am working on a model that might have more than 40,000 rows of data. Which method would be best? I need to read the data and also maintain a log of it.

Steven Hamoen replied to Durga Ghosal:

@Durga Ghosal Considering all the variables involved (how you write to it, how you read from it, whether you use queries or loops, whether you have indexed fields, etc.), the best approach is simply to create a large table with data (copied from Excel) and run some tests to see what takes the most time for the operations you actually need.
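
A rough timing test along those lines might look like the sketch below (the table name "TestTable" is a placeholder; fill it with 40,000+ rows of numbers first). Run it from the Script Console once with the table stored as a node table and once with the bundle box checked, and compare how long each run takes:

    Table data = Table("TestTable");
    double total = 0;
    // Sequential read of every cell; swap in writes or queries
    // to match whatever your model actually does with the data.
    for (int row = 1; row <= data.numRows; row++) {
        for (int col = 1; col <= data.numCols; col++) {
            total += data[row][col];
        }
    }
    return total;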

