Arshya

Reputation: 690

Database Analysis

I am supposed to analyze an old existing database with 200+ tables, 100+ stored procedures, and some views.

I am supposed to find out which tables are not currently being used, visit all relationships, report the ones that are obsolete, find the ones that are missing, identify columns that are not being used, etc.

Today I spent 5 hours and only got through 7-8 tables. I am also confused about how to create the report: should I use Visio, Excel, or something else?

Please advise!

Thanks, Arshya

Upvotes: 1

Views: 120

Answers (2)

Mark Davidson

Reputation: 414

This is definitely a task you can't complete manually any time soon, and you'll need to use additional tools.

Here are several things you might want to do, with recommended tools:

  • Check which database objects are not referenced by any other objects and skip them in the initial analysis. You'll probably find a lot of these. Tool - ApexSQL Clean
  • Install an auditing tool and run it for several days to see which tables/stored procedures are being used the most. Tool - ApexSQL Audit
  • Check your applications and see which objects are being used there. If there are stored procedures or tables that are not referenced by any application code, you can probably mark them as candidates for deletion. If your code is .NET only, you can do this with ApexSQL Clean, but I'd recommend the good old grep functions.
  • Once you're done with the analysis, you might want to consider refactoring some tables or other objects to suit the new requirements. Tool - ApexSQL Refactor
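The grep idea in the third bullet can be sketched in Python: given a list of object names pulled from the database and a directory of application source, report which names never appear in the code. This is a rough, name-based check only (dynamic SQL will slip through it), and the object names and paths in the test are made-up placeholders:

```python
import os
import re

def find_unreferenced(object_names, source_dir):
    """Return the subset of object_names that never appears, as a whole
    word, in any source file under source_dir."""
    remaining = set(object_names)
    for root, _dirs, files in os.walk(source_dir):
        for filename in files:
            path = os.path.join(root, filename)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    text = f.read()
            except OSError:
                continue
            # Whole-word match so "Order" doesn't count as a hit for "Orders"
            remaining = {o for o in remaining
                         if not re.search(r"\b" + re.escape(o) + r"\b", text)}
            if not remaining:
                return set()  # everything is referenced somewhere
    return remaining
```

Anything this returns is only a candidate for removal; objects built up via dynamic SQL or ORM conventions still need a manual look.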

Generally, you can't go wrong with tools from Red Gate or Idera, but I prefer ApexSQL a bit more.

Upvotes: 1

ryan1234

Reputation: 7275

If you are using SQL Server, I would recommend Red Gate tools. http://www.red-gate.com/

I believe everything from SQL Server to Oracle to MySQL has tools to monitor transactions, who is connected to the database, and what queries they are running. I would start there.

You can also run queries to see if any fields contain only NULLs, hinting that they are never used.
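That check leans on the fact that `COUNT(column)` ignores NULLs, so a zero count alongside a positive `COUNT(*)` means the column is never populated. A small sketch, assuming SQL Server-style bracket quoting: the helper below only builds the per-table query text, which you would then run against the database yourself (table and column names in the test are placeholders):

```python
def null_check_sql(table, columns):
    """Build one query that counts non-NULL values per column.
    COUNT(col) ignores NULLs, so a 0 next to a positive COUNT(*)
    means the column is all NULL in that table."""
    counts = ",\n  ".join(
        "COUNT([{0}]) AS [{0}_non_null]".format(c) for c in columns)
    return "SELECT\n  COUNT(*) AS row_count,\n  {0}\nFROM [{1}];".format(
        counts, table)
```

Generating the query per table keeps it to one scan each, rather than one query per column.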

Finding out which columns are being used is more difficult. If you have the source code for the application(s) using the database, I would start there and try to write a program that parses all the database calls and generates a report. This can be very hard, though.
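A lightweight version of that report idea, which also answers the Excel question: count whole-word references to each object name across your source snippets and emit CSV, which opens directly in Excel. This is a crude textual count, not a real SQL parser, and the names in the test are placeholders:

```python
import csv
import io
import re

def reference_report(object_names, sources):
    """Count how many times each object name appears (as a whole word)
    across the given source strings; return the result as CSV text."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["object", "references"])
    for name in object_names:
        pattern = re.compile(r"\b" + re.escape(name) + r"\b")
        hits = sum(len(pattern.findall(s)) for s in sources)
        writer.writerow([name, hits])
    return out.getvalue()
```

Objects with a zero count become your shortlist for deeper investigation; sorting the CSV by the count column in Excel makes that obvious at a glance.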

Another idea (even though it isn't the best) is to make a copy of the database and switch your systems on development machines to use the restored copy. Then test your hunches about unused columns/tables by modifying the copy and logging the issues. Did you delete a table and the application crashed? Then it's important! (I realize this is a very painful thing to try, because you can't test everything.)

Unfortunately, a database schema ends up exposing an interface that many applications most likely rely on. The cat is out of the bag, so to speak. After doing some research, I think it would be acceptable to come back and say that it's best to leave things as is.

Upvotes: 1
