Reputation: 3709
We have an application with a huge amount of data: approximately 100 tables, most of which have 8-10 million rows. Suddenly we are facing performance issues, and it was discovered that CPU usage is too high on the Oracle server.
Since requests to the Oracle server come from different applications, is there a way to find out, on the Oracle server, which queries take the longest or consume the most CPU?
I would appreciate any responses or pointers on how to find this out.
Upvotes: 2
Views: 8028
Reputation: 36872
select elapsed_time/1000000 seconds, gv$sql.*
from gv$sql
order by elapsed_time desc;
It does not provide as much information as tools such as AWR, and queries will age out of the shared pool over time. But it is very quick and easy to run, and it requires fewer privileges.
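Since the question is specifically about CPU, a similar query ordered by cpu_time (also reported in microseconds in gv$sql) may be more directly useful. A minimal sketch, assuming a 12c-or-later database for the FETCH FIRST clause:

-- Top SQL currently in the shared pool, by CPU time consumed.
select cpu_time/1000000     cpu_seconds,
       elapsed_time/1000000 elapsed_seconds,
       executions, sql_id, sql_text
from   gv$sql
order  by cpu_time desc
fetch first 20 rows only;  -- on pre-12c, wrap the query and filter on ROWNUM

Dividing cpu_seconds by executions also helps distinguish one expensive statement from a cheap statement executed millions of times.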
Upvotes: 1
Reputation: 231721
If you are licensed to use AWR and your DBA knows how to generate an AWR report, ask your DBA to generate one for the period when CPU usage was too high. The report contains a number of sections, one of which lists SQL statements ordered by CPU usage; that will show you which statements consumed the most CPU over the time period in question.
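The report itself is generated with the supplied script @?/rdbms/admin/awrrpt.sql. If you would rather pull the same figures with a query, DBA_HIST_SQLSTAT keeps per-snapshot statistics. A sketch, assuming Diagnostics Pack licensing; the snapshot IDs 100 and 110 are placeholders for your problem window (look them up in DBA_HIST_SNAPSHOT):

-- Top SQL by CPU between two AWR snapshots.
select s.sql_id,
       sum(s.cpu_time_delta)/1000000 cpu_seconds,
       sum(s.executions_delta)       executions
from   dba_hist_sqlstat s
where  s.snap_id between 100 and 110
group  by s.sql_id
order  by cpu_seconds desc;

Unlike gv$sql, this survives statements aging out of the shared pool, which matters when the CPU spike happened hours ago.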
Upvotes: 3
Reputation: 1761
Sounds like you need to generate some SQL trace files and analyse the results using tkprof.
There's a good Ask Tom question here that should show you how to go about this: http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:969160000346108326
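For reference, the usual pattern is to enable tracing on the offending session with DBMS_MONITOR and then format the trace file with tkprof. A sketch; the sid/serial# values 123 and 4567 are examples you would look up in v$session, and the trace file name is hypothetical:

-- Enable SQL trace for one session (include waits, omit bind values).
begin
  dbms_monitor.session_trace_enable(
    session_id => 123, serial_num => 4567,
    waits => true, binds => false);
end;
/
-- ...let the workload run, then switch tracing off again:
begin
  dbms_monitor.session_trace_disable(session_id => 123, serial_num => 4567);
end;
/
-- Format the trace file from the server's trace directory, sorted by CPU:
-- $ tkprof ORCL_ora_12345.trc report.txt sort=execpu,fchcpu

The sort options put the statements with the highest execute- and fetch-phase CPU at the top of the tkprof report.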
Upvotes: 1