Reputation: 2000
I was wondering if I could build a sophisticated search application on top of the error logs that MarkLogic generates. For example:
xquery version "1.0-ml";

let $xyz := xdmp:http-get(
  "http://xyz:8001/get-error-log.xqy?filename=ErrorLog_1.txt",
  <options xmlns="xdmp:http">
    <authentication method="digest">
      <username>xyz</username>
      <password>xyz@123</password>
    </authentication>
  </options>)
return $xyz
Query Console crashed when I ran this, but would it work if I wrote a script and ran it instead? What would be the best way to do this without involving another application or language?
Upvotes: 0
Views: 53
Reputation: 3732
See: Similar Question
In V9 there is a REST API exposing an optimized search across all the log files in the cluster, with XML or JSON structured output available. This is not the same as an indexed query such as you would run against documents in the database, but it is highly performant even for large log files (GB+), as long as you restrict the result set to a reasonable size.
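For reference, this presumably refers to the Management API's GET /manage/v2/logs endpoint on port 8002. A minimal sketch of calling it from XQuery follows; the host, credentials, filename, and regex filter are placeholders, not values from the question:

xquery version "1.0-ml";

(: Sketch: search a log file through the Management API (port 8002).
   Host, credentials, filename, and the regex filter are placeholders. :)
xdmp:http-get(
  "http://localhost:8002/manage/v2/logs?filename=ErrorLog.txt&amp;regex=Exception&amp;format=json",
  <options xmlns="xdmp:http">
    <authentication method="digest">
      <username>admin</username>
      <password>admin</password>
    </authentication>
  </options>)

Narrowing the results with the regex, start, and end parameters is what keeps this performant on multi-GB logs.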
Upvotes: 0
Reputation: 7770
Ouch... streaming large log files seems like a not-so-good idea.
Have you looked at Ops Director? Maybe it does what you need.
Otherwise, in Node.js or some other language of your choice, I would parse the log files and submit them as structured content (see the sketch below for what that content might look like). Do this either as part of log rotation, or by having something monitor the file streams and keep pumping content to the database for each new line of each file in question. For this, you could probably create a module for your favourite syslog monitoring solution.
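A minimal sketch of the MarkLogic side of that idea, in XQuery: one log line goes in, one structured document comes out. The URI scheme, element names, and the line layout assumed here ("timestamp level: message") are illustrative assumptions, not part of any product API:

xquery version "1.0-ml";

(: Sketch: turn one ErrorLog line into a structured document.
   The line layout and element names are assumptions for illustration. :)
declare function local:ingest-line($file as xs:string, $line as xs:string)
{
  (: Assumed layout: "2017-06-09 12:34:56.789 Info: some message" :)
  let $timestamp := fn:substring($line, 1, 23)
  let $rest      := fn:substring($line, 25)
  let $level     := fn:substring-before($rest, ":")
  let $message   := fn:normalize-space(fn:substring-after($rest, ":"))
  return
    xdmp:document-insert(
      fn:concat("/logs/", $file, "/", xdmp:hash64($line), ".xml"),
      <log-entry>
        <file>{$file}</file>
        <timestamp>{$timestamp}</timestamp>
        <level>{$level}</level>
        <message>{$message}</message>
      </log-entry>)
};

local:ingest-line("ErrorLog_1.txt",
  "2017-06-09 12:34:56.789 Info: Merging 2 as part of merge")

Once entries land as documents like this, you get real indexed queries (cts:search, range indexes on level and timestamp), which is exactly what raw log files cannot give you.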
Upvotes: 2