vic

Reputation: 311

Alter the column type across several tables

In a PostgreSQL database I'm working on, half of the tables have one particular column, always with the same name, of type varchar(5). That size has become a bit too restrictive and I want to change it to varchar(10).

In my particular case the number of tables is small enough to do this by hand. But I was wondering how one could script it with a query for larger databases. It should generally be possible in just a few steps:

  1. Identify all the tables in the schema, then (somehow?) filter them down to those that actually contain the column.
  2. Generate an ALTER TABLE statement for each table found.

I have some idea of how to write a query that identifies all tables in the schema, but I wouldn't know how to filter them. And without the filter, I assume the generated ALTER TABLE statements would break.

It would be great if someone could share their knowledge on this.

Upvotes: 0

Views: 1697

Answers (1)

vic

Reputation: 311

Thanks to Abelisto for providing some guidance. In the end, this is how I did it.

First, I wrote a query that generates the ALTER TABLE statements. MyDB and MyColumn need to be replaced with actual values.

SELECT
  -- build one ALTER TABLE statement per table that contains the column
  'ALTER TABLE '||columns.table_name||' ALTER COLUMN '||columns.column_name||' TYPE varchar(20);'
FROM
  information_schema.columns
WHERE
  columns.table_catalog = 'MyDB' AND
  columns.table_schema = 'public' AND
  columns.column_name = 'MyColumn';

Then it was just a matter of executing the generated statements as a new query. All done.
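For completeness, the two steps can also be combined so the statements run immediately instead of being copy-pasted. This is only a minimal sketch, assuming the same MyDB / MyColumn placeholders as above and a PostgreSQL version that supports DO blocks (9.0+):

DO $$
DECLARE
  stmt text;
BEGIN
  -- generate the same ALTER TABLE statements as above, placeholders and all
  FOR stmt IN
    SELECT 'ALTER TABLE '||quote_ident(columns.table_name)||
           ' ALTER COLUMN '||quote_ident(columns.column_name)||
           ' TYPE varchar(20)'
    FROM information_schema.columns
    WHERE columns.table_catalog = 'MyDB' AND
          columns.table_schema = 'public' AND
          columns.column_name = 'MyColumn'
  LOOP
    EXECUTE stmt;  -- apply each statement right away
  END LOOP;
END $$;

The quote_ident() calls only matter for mixed-case or otherwise unusual table names; for plain lowercase identifiers the concatenation in the query above behaves the same.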

Upvotes: 2
