Kiazim Khutaba

Reputation: 323

How does an Oracle ON LOGON trigger impact performance?

I've created a trigger for the logon event, as shown below:

CREATE OR REPLACE TRIGGER "log_users_session"
AFTER LOGON ON DATABASE
WHEN (USER = 'SomeUser')
BEGIN
   INSERT INTO "users_logon_log" ("username","date") VALUES ('SomeUser', SYSDATE);
END;

It's a big reporting database. I want to know: will this really slow down database performance, or does it have side effects?

My Oracle version is 19c.

Upvotes: 0

Views: 4374

Answers (3)

pifor

Reputation: 7882

If you only need to record connections to the database, I would simply use the database audit features: https://docs.oracle.com/en/database/oracle/oracle-database/19/dbseg/introduction-to-auditing.html#GUID-F901756D-F747-489C-ACDE-9DBFDD388D3E
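For example, with unified auditing in 19c, a logon-only policy might look like the following sketch (the policy name and `someuser` are placeholders, not anything from the question):

```sql
-- Create a unified audit policy that captures only logon events
CREATE AUDIT POLICY logon_audit_policy
  ACTIONS LOGON;

-- Enable it for the user of interest only
AUDIT POLICY logon_audit_policy BY someuser;

-- Review the recorded logons
SELECT dbusername, event_timestamp
  FROM unified_audit_trail
 WHERE action_name = 'LOGON'
 ORDER BY event_timestamp DESC;
```

This keeps the logon history inside the audit trail, so there is no user table to maintain and no trigger code on the logon path.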

Upvotes: 1

Littlefoot

Reputation: 142743

A few objections, if I may.

Get rid of double quotes when working with Oracle, i.e. not "log_users_session" but log_users_session. In Oracle, everything is (by default) stored in the data dictionary as uppercase, but you can reference it any way you want. With double quotes, you MUST always reference it with exactly that letter case, along with the double quotes.
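A quick illustration of what that means (the table names here are made up):

```sql
-- Quoted identifier: stored with exact case, must always be quoted
CREATE TABLE "demo_log" (id NUMBER);

SELECT * FROM demo_log;      -- fails: ORA-00942: table or view does not exist
SELECT * FROM "demo_log";    -- works; exact case and quotes required

-- Unquoted identifier: stored as uppercase, referenced in any case
CREATE TABLE demo_log2 (id NUMBER);

SELECT * FROM demo_log2;     -- works
SELECT * FROM DEMO_LOG2;     -- also works
```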

That also affects the column name "date". Once you remove the double quotes, you'd get date, and that's an invalid name because DATE is reserved as an Oracle datatype; so use log_date or something like that.
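Putting both points together, the trigger might be rewritten along these lines (a sketch, assuming the column is renamed to log_date; ORA_LOGIN_USER is the event attribute function Oracle documents for WHEN clauses of system triggers, and it returns the username in uppercase):

```sql
CREATE OR REPLACE TRIGGER log_users_session
AFTER LOGON ON DATABASE
WHEN (ORA_LOGIN_USER = 'SOMEUSER')
BEGIN
   -- USER returns the name of the session that just logged on
   INSERT INTO users_logon_log (username, log_date)
   VALUES (USER, SYSDATE);
END;
/
```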

As for your question: you decided to log only SomeUser, so, unless that user establishes a zillion connections, I wouldn't expect significant impact. Though, if it is a big reporting database and all users (read: people) use the same credentials when connecting/establishing a new session, then maybe. On the other hand, what's the purpose of such a setup? You'd get a large number of connections for the same user over the whole time you monitor it.

Basically, it just depends on what you do and how. It wouldn't cost much to try it and see how it behaves. If it affects performance, stop using it.

Upvotes: 2

sara0506

Reputation: 21

There will be a performance hit ranging from very minimal to significant, depending on the number of concurrent connections. The hit is roughly proportional to that number: more connections mean a bigger hit, fewer connections mean a smaller hit. It would be ideal to make the decision based on the average number of users who connect to the system at a given time. I implemented this for a 300 GB database with ~200 connections, and it didn't have much impact.

Also, the users_logon_log table should be maintained/cleaned up regularly so it doesn't grow too large and occupy significant disk space.
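As a sketch of such maintenance (the job name and 90-day retention are arbitrary choices, and the log_date column name assumes the table uses one), a daily purge could be scheduled like this:

```sql
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'PURGE_USERS_LOGON_LOG',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN
                          DELETE FROM users_logon_log
                           WHERE log_date < SYSDATE - 90;
                          COMMIT;
                        END;',
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',  -- run nightly at 02:00
    enabled         => TRUE);
END;
/
```

For very high insert rates, partitioning the table by date and dropping old partitions would be cheaper than bulk deletes.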

Upvotes: 1
