Reputation: 1
I have a database with around 600 tables in use in SIT. As we move to UAT, I need to compare the DDL of the tables (column names, data types, sizes) in SIT against those in UAT. I am mulling over the following approaches. (I do not have PERM space.) I would prefer to do this in the database rather than exporting the data and comparing it through UNIX.
FIRST APPROACH:
create volatile table SIT (s_tablename varchar(30), s_ddl varchar(30000)) on commit preserve rows;
create volatile table UAT (u_tablename varchar(30), u_ddl varchar(30000)) on commit preserve rows;
DATABASE SIT;
How can I generate the DDL of the above with a SELECT statement, so that I can insert the values into the tables above?
SELECT 'SHOW TABLE ' || TRIM(TABLENAME) from dbc.tables where databasename='SIT' and tablekind='T' order by tablename;
If I could capture the output of SHOW TABLE (i.e., the DDL) along with the table name, I could run a direct INSERT ... SELECT into the SIT table (something like below)
insert into SIT select trim(tablename), <DDL_OF_TABLE> from dbc.tables where databasename='SIT' and tablekind='T';
and insert into UAT similarly, so that I could LEFT JOIN them as below to find the differences.
select sit.s_tablename, uat.u_tablename
from sit
left join uat
  on sit.s_tablename = uat.u_tablename
 and sit.s_ddl = uat.u_ddl
where uat.u_tablename is NULL;
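A plain SELECT cannot return the output of SHOW TABLE, but the data dictionary does keep the text of the most recent DDL request for each object in DBC.Tables.RequestText. A minimal sketch of the insert, assuming RequestText is populated for your tables (it can be NULL when the request text was too long, and it may reflect a later ALTER rather than the original CREATE, so treat it as an approximation of SHOW TABLE, not a guaranteed equivalent):

```sql
-- Sketch only: RequestText holds the most recent DDL request for the
-- object and may be NULL or an ALTER statement rather than the full
-- CREATE TABLE text.
INSERT INTO SIT
SELECT TRIM(TableName), RequestText
FROM DBC.Tables
WHERE DatabaseName = 'SIT'
  AND TableKind = 'T';
```

If RequestText turns out to be unreliable for your tables, the dbc.columns route in the second approach is the more dependable in-database option.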
SECOND APPROACH:
Select all the column information from dbc.columns as below, from both SIT and UAT, and compare them in an Excel sheet (SIT and UAT are on two different servers).
SEL TABLENAME, COLUMNNAME, COLUMNFORMAT, COLUMNTYPE, COLUMNLENGTH, NULLABLE, DECIMALTOTALDIGITS, DECIMALFRACTIONALDIGITS FROM DBC.COLUMNS WHERE DATABASENAME='SIT' ORDER BY TABLENAME, COLUMNNAME;
2a)--
As the result set is huge (around 187,000 rows for around 600 tables), the Excel sheet takes very long to compare and produce the differences. I could break the above query into chunks and compare them; however, I am trying to achieve this in one pass.
2b)--
One more option is to import the result of the above SELECT into volatile tables through SQL Assistant (I do not have access to run load utilities) and do a MINUS operation as below.
sel * from SIT minus sel * from UAT;
However, I feel that loading around 187,000 rows into two volatile tables through the SQL Assistant import is a time-consuming process. Also, I would need to handle NULLs (using ZEROIFNULL or a similar function) in the SELECT over dbc.columns to avoid comparison issues.
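The NULL handling matters because MINUS treats two NULLs in DecimalTotalDigits or DecimalFractionalDigits as a mismatch in some comparison strategies (and a join on NULL never matches). A sketch of the MINUS comparison, assuming hypothetical volatile tables SIT_COLS and UAT_COLS loaded from the dbc.columns extract on each server:

```sql
-- Sketch: SIT_COLS and UAT_COLS are assumed names for the volatile
-- tables loaded via SQL Assistant import. COALESCE replaces NULL
-- decimal-digit values with a sentinel so rows compare cleanly.
SELECT TableName, ColumnName, ColumnType, ColumnLength,
       COALESCE(DecimalTotalDigits, -1)      AS TotDigits,
       COALESCE(DecimalFractionalDigits, -1) AS FracDigits
FROM SIT_COLS
MINUS
SELECT TableName, ColumnName, ColumnType, ColumnLength,
       COALESCE(DecimalTotalDigits, -1),
       COALESCE(DecimalFractionalDigits, -1)
FROM UAT_COLS;
```

Rows returned exist (with those attributes) in SIT but not in UAT; running the MINUS in the opposite direction catches columns present only in UAT.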
Can you please elaborate on the pros and cons of both approaches, or suggest a more feasible solution?
Upvotes: 0
Views: 3818
Reputation: 7786
Teradata Studio 15.00
Teradata Studio, the eventual replacement for the incumbent SQL Assistant, now offers a Compare Object Wizard to compare objects between the same or different Teradata databases. It places the object DDL side by side in the Compare Editor and highlights the differences. It can be downloaded free of charge from Teradata's Developer Exchange (registration required).
Third Party Schema Compare Tools
There are vendors that offer IDE tools with the ability to perform schema comparisons across database environments. Some offer free evaluation periods for their tools while others require you to license the product up front. AquaFold and AtanaSuite are two options that come to mind.
BTEQ Export of SHOW TABLE Output
You can script the SHOW TABLE output to be exported to a flat file for both SIT and UAT. With the flat files you can then use a tool like Notepad++ (an open-source Windows text editor) that can compare two files and highlight the differences. This may be a better option than the first two, which are not as friendly when a high volume of objects needs to be compared.
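A minimal sketch of such a BTEQ script, using placeholder connection details and a hypothetical table name (in practice you would generate the SHOW TABLE statements from the dbc.tables query in the question and paste them between the .EXPORT commands):

```sql
-- export_sit_ddl.bteq -- hypothetical script name; run as: bteq < export_sit_ddl.bteq
.LOGON sit_tdpid/your_user,your_password;   -- placeholder credentials
.EXPORT REPORT FILE = sit_ddl.txt;
SHOW TABLE SIT.SomeTable;                   -- one SHOW TABLE per table, generated from dbc.tables
.EXPORT RESET;
.LOGOFF;
.QUIT;
```

Run the same script against UAT (writing to a second file), then compare the two files in Notepad++.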
Another option, if you're savvy with UNIX scripting (sed, awk, regular expressions), is to script the comparison yourself. Your mileage may vary with this option depending on how thorough your scripting is.
Implement Better Version Control Processes
This is something I believe many shops could handle better, especially as the complexity of their enterprise data environment increases. The challenge is that many of the tools software engineers have come to rely on are not as friendly when it comes to dealing with DDL changes in a database. There are products out there for the better-known environments such as SQL Server and Oracle, but fewer for Teradata. Tools aside, in many shops this is more of a process issue relating to release management.
Hope this helps.
Upvotes: 0