Reputation: 49
What I have: the hostname/port of an always-running [q] session that exposes several kdb+ tables over our internal web interface. I can easily run [q] commands against it in a browser (or, using [hopen], from a local [q] session started on the command line).
What I need: a [q] script, or the knowledge of how to write one, that will automatically connect to the web-facing database, and copy over all of its tables into the localhost [q] session's working memory (without knowing all the table names in advance).
Concerns include:
The tables are huge. I'm prepared to wait on my machine if need be, but I do need this to work eventually.
While I can get a legible list of all the server's table names, I can never get it in a useful format (ideally it would be a list, rather than the Symbol that the hopen-ed [tables] command always gives me). Also, I'm told it may be possible to accomplish the transfers without ever explicitly querying the table names, though I can't imagine how; bonus points if you manage that.
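For concreteness, this is how I currently connect and list tables (the port number here is a placeholder):

/connect and ask the server for its table names
h:hopen `::1234;
h"tables[]"    / comes back as something like `trade`quote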
Upvotes: 0
Views: 179
Reputation: 3229
You can implement something like this:
/chunk size: number of rows to pull per remote query
.data.oc:1000;
/connect to the session using hopen
h:hopen `::1234;
/get the table names
tabs:h"tables[]";
/initialise each name locally as an empty list; the upserts below build the tables
{ .[x;();:;()] } each tabs;
/for each table name
{[tab]
/get the table count
c:h({count value x};tab);
oc:.data.oc;
/split the row range 0..c-1 into chunks of .data.oc rows (with oc:1000 that's 0-999; 1000-1999; ...)
idxl:$[c>oc; [ l: c div oc; ( (0;oc-1)+/:oc*til l),enlist (l*oc;c-1) ] ; enlist (0; c-1)];
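/e.g. with c:2500 and oc:1000, idxl is (0 999;1000 1999;2000 2499)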
/iterate over the chunks, pulling each index range from the server and upserting it into the local table
{[t;idx] t upsert h ({[t;y] ?[t; enlist (within;`i;y);0b;()] } ; t;idx ) }[tab] each idxl;
}each tabs
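For the bonus part: one way to avoid ever handling table names on the client is to have the server build a name-to-table dictionary in a single call, then `set` each entry locally. A sketch (note this pulls everything in one IPC message, so for huge tables the chunked approach above is much safer):

h:hopen `::1234;
/server builds a dictionary of name -> table; client assigns each pair as a global
d:h"tables[]!value each tables[]";
(key d) set' value d;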
Upvotes: 1