Idea

Cameron Pluim suggested

Be able to dump queries to a Table in memory

I have a situation where I query a global table to get a subset of the original values (Subset A). Then, depending on the value of the first entry of that query result, I need to query Subset A in various ways to create Subset B.

Currently, to accomplish this I have to dump Subset A into a "dummy" node so the values are stored somewhere; only then can I query Subset A normally. It would be nice if I could do something like the following:

Table myGlobalTable = reftable("GlobalTable1");

// Option 1: query() returns the result directly as a Table
Table SubsetA = query("SELECT * FROM $1 WHERE Column1 > 2", myGlobalTable);

// Option 2: a dumpquery() command dumps the last query result into a Table variable
Table SubsetA = nullvar;
query("SELECT * FROM $1 WHERE Column1 > 2", myGlobalTable);
dumpquery(SubsetA);

Table SubsetB = nullvar;
if (SubsetA[1][2] <= 10) {
	SubsetB = query("SELECT * FROM $1 WHERE Column3 = 2", SubsetA);
} else {
	SubsetB = query("SELECT * FROM $1 WHERE Column3 = 1", SubsetA);
}

This would remove the need to create a "dummy" node just to dump data into, and would let me create and store multiple query results within the same code block.


1 Comment

Matt Long commented

In FlexSim 2017 Update 1 you can use the Table.query() command to do exactly this: create a table in memory.
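
A minimal sketch of that usage (assuming a global table named GlobalTable1 with a numeric Column1, as in the original post; the result lives only in memory, with no dummy node involved):

Table SubsetA = Table.query("SELECT * FROM $1 WHERE Column1 > 2", Table("GlobalTable1"));

// Cells of the in-memory result can be read like any other table
double firstValue = SubsetA[1][2];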


Cameron Pluim commented

That's fantastic!

Cameron Pluim commented

@Matt Long, I just updated to 17.1.2 and tried using a table in memory, but it doesn't seem to work with column references.

In the attached model there are two script windows. In the first, I query the in-memory tables directly, and it throws an error saying:

Reference to Col 2 could not be resolved to a valid column. Parsing halted.

In the second script window I clone the in-memory tables to dummy nodes, and I'm able to query from those nodes correctly.

tableinmemory.fsm

anthony.johnson commented

Unfortunately, Table.query() was not designed for this specific situation. The new Table.query() features were meant to allow multiple query results to be active simultaneously, but those results cannot yet be passed back into Table.query() as source tables. See the documentation for p1-p9 of Table.query():

You will usually ... pass a reference to a table node here, defining the table to query ... If this parameter defines a table, it must be a tree-bound table or a direct reference to a node. In other words, it cannot be the result Table of another call to Table.query(), unless you clone that result to a tree-bound table with Table.cloneTo()
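
A minimal sketch of the workaround that documentation describes, assuming a spare global table to clone the result into (the name "DummyTable" here is just a placeholder):

Table SubsetA = Table.query("SELECT * FROM $1 WHERE Column1 > 2", Table("GlobalTable1"));

// The in-memory result can't be passed back into Table.query() as a source,
// so clone it to a tree-bound table first
SubsetA.cloneTo(Table("DummyTable"));

// The cloned copy can then be queried again
Table SubsetB = Table.query("SELECT * FROM $1 WHERE Column3 = 2", Table("DummyTable"));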

Although this might not cover your case, you can nest queries in 17.1. This would solve your test case, though not necessarily the scenario in your original question.

Table result = Table.query("SELECT * FROM (SELECT * FROM (SELECT * FROM $1 WHERE [Col 3] = 1) WHERE [Col 2] = 1) WHERE [Col 1] = 1", Table("GlobalTable1"));

I do believe you could solve your original question somehow, though, with a nested query.

Cameron Pluim commented

Thanks @anthony.johnson. I hadn't seen that documentation, but it does spell it out pretty clearly. Should I write this up as an idea to be implemented, or are there already plans to support this in the future?

Also, the nested queries look really useful, but from a readability standpoint I think I'll probably end up using cloneTo() to a dummy node more often than nested queries (depending on the table size).

I had actually set up the second test case to see which would be faster: a single query with multiple WHERE conditions, or multiple queries each with a single WHERE condition running on tables of decreasing size. But since each of the multiple queries would need to be copied to its own dummy node, the single query was significantly faster.
