Reputation: 513
I have an application that logs events to a table in MySQL; each entry (row) has a timestamp, a status, and a few other arbitrary bits of metadata.
In a very simplified form the log table might look something like:

log_id | datetime | result_code | message
I guess that asking Zabbix to go away and query the database directly for stats is probably a little ambitious, but I'm wondering how else it could be architected.
I could have a separate process querying the table and writing out a log file, but that feels a bit clunky. I could have a script running from zabbix_agent.conf, but I'm not sure how to convert that data into a metric that Zabbix can interpret.
Upvotes: 0
Views: 6370
Reputation: 404
For part (1) mentioned in the answer by asaveljevs, I would look at a MySQL trigger to help here, so that the trigger can invoke a script which pushes the info to Zabbix.
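Note that a stock MySQL trigger cannot execute an external program by itself, so in practice this usually needs something like the sys_exec() UDF from lib_mysqludf_sys, or a trigger that copies new rows into a queue table which a small script watches. Either way, the per-row push itself might look like the following minimal sketch; the server address, host name and item key are assumptions made for the example, and zabbix_sender ships with Zabbix:

#!/usr/bin/env python
# Minimal sketch of the "pushes the info to Zabbix" half of this idea; the
# Zabbix server address, host name and item key below are assumptions.
import subprocess
import sys

def push_row(result_code, message):
    # zabbix_sender -z <server> -s <host> -k <key> -o <value> sends one value
    subprocess.check_call([
        "zabbix_sender",
        "-z", "127.0.0.1",      # Zabbix server or proxy (assumed)
        "-s", "MySQL Server",   # host as configured in Zabbix (assumed)
        "-k", "app.log",        # trapper item key (assumed)
        "-o", "%s %s" % (result_code, message),
    ])

if __name__ == "__main__":
    # usage: push_row.py <result_code> "<message>"
    push_row(sys.argv[1], sys.argv[2])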
Upvotes: 0
Reputation: 4143
The answer by asaveljevs is really great and very detailed, and it will also give you the Zabbix log monitoring view. If that is not needed, a simpler approach might be to grab the entries from the database (probably by storing the last log_id and only grabbing newer entries) and then send them to Zabbix using zabbix_sender (see https://www.zabbix.com/documentation/2.2/manpages/zabbix_sender ).
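For illustration, a minimal sketch of that approach might look like the following. The database credentials, the host name "MySQL Server", the trapper item key "app.log" and the state file path are all assumptions made for this example; it relies on pymysql and on zabbix_sender's -T/-i options for timestamped batch input:

#!/usr/bin/env python
# Minimal sketch: poll the log table for rows newer than the last seen log_id
# and forward them to Zabbix in one zabbix_sender batch. All names below
# (credentials, host "MySQL Server", key "app.log", state file) are assumptions.
import subprocess
import pymysql

STATE_FILE = "/var/tmp/app_log_last_id"

def read_last_id():
    try:
        with open(STATE_FILE) as f:
            return int(f.read().strip())
    except (IOError, ValueError):
        return 0

def main():
    last_id = read_last_id()
    conn = pymysql.connect(host="localhost", user="monitor",
                           password="secret", db="app")
    with conn.cursor() as cur:
        cur.execute("SELECT log_id, UNIX_TIMESTAMP(`datetime`), result_code, message"
                    " FROM log WHERE log_id > %s ORDER BY log_id", (last_id,))
        rows = cur.fetchall()
    conn.close()
    if not rows:
        return
    # With -T, zabbix_sender expects lines of: <hostname> <key> <timestamp> <value>
    lines = []
    for log_id, ts, result_code, message in rows:
        value = "%s %s" % (result_code, message)
        lines.append('"MySQL Server" app.log %d "%s"' % (int(ts), value.replace('"', '\\"')))
        last_id = log_id
    subprocess.run(["zabbix_sender", "-z", "127.0.0.1", "-T", "-i", "-"],
                   input="\n".join(lines) + "\n", universal_newlines=True, check=True)
    with open(STATE_FILE, "w") as f:
        f.write(str(last_id))

if __name__ == "__main__":
    main()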
Upvotes: 1
Reputation: 2250
The way I understand this question, the goal is to monitor a database with log records in approximately the same way as Zabbix's built-in log monitoring.
If so, this question is in two parts: (1) how to poll the database for new records only and (2) how to send data to Zabbix in a manner that it can understand. I shall leave (1) to you, but will instead propose a way to deal with (2).
We can model the solution to (2) after the way the Zabbix agent deals with the Windows event log. If we ask the Zabbix agent on Windows to monitor, say, "eventlog[Application]", we will notice that it sends JSON like the following to the Zabbix server:
{
    "data": [
        {
            "clock": 1398753145,
            "ns": 928525552,
            "eventid": 9003,
            "host": "Windows 2008",
            "key": "eventlog[Application]",
            "lastlogsize": 51,
            "severity": 1,
            "source": "Desktop Window Manager",
            "timestamp": 1375273705,
            "value": "The Desktop Window Manager was unable to ..."
        }
    ],
    "request": "agent data"
}
Now, we can send whatever data we want using the same protocol. For instance, based on your simplified form of the log table, we can put "log_id" into "lastlogsize", "datetime" into "timestamp", "result_code" into "eventid", and "message" into "value". Then, we can send this data to the Zabbix server using the convenience script misc/debug/sender.pl available in the Zabbix source code:
$ cat mysql.json
{
    "data": [
        {
            "clock": 1398753145,
            "ns": 928525552,
            "eventid": 12345,
            "host": "MySQL Server",
            "key": "eventlog[mysql.log]",
            "lastlogsize": 1,
            "severity": 1,
            "source": "My Application",
            "timestamp": 1375273705,
            "value": "My Application was unable to ..."
        }
    ],
    "request": "agent data"
}
$ ./sender.pl -h 127.0.0.1 -p 10051 -i mysql.json
ZBXD^{
"response":"success",
"info":"processed: 1; failed: 0; total: 1; seconds spent: 0.000130"}
There are two caveats, though. One is that in order to enjoy the Windows event log facilities in the Zabbix frontend (like showing the Windows event log columns in "Monitoring" -> "Latest data"), the key should start with "eventlog[". The other is that "lastlogsize" should increase with each record that you send; if you are using "log_id" for "lastlogsize", that will come naturally.
Apart from that, you should now be able to enjoy Windows event log specific trigger functions like logeventid(), logseverity(), and logsource(), as well as the macros {ITEM.LOG.EVENTID}, {ITEM.LOG.SEVERITY}, and {ITEM.LOG.SOURCE}.
An example configuration of such an item might be as follows:
Upvotes: 3