seddy

Reputation: 801

Is it possible to use a long id in Rails applications and persist it through to the test database?

The setup

We have some tables with very high id values, and as such they are bigints in production, which was achieved by running migrations that change the id columns to include limit: 8. This approach is outlined here: https://stackoverflow.com/a/5870148/2240218
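For illustration, a migration along those lines might look like this (the table name orders is hypothetical; in Rails 3/4 with Postgres, limit: 8 on an integer column produces a bigint):

```ruby
# Hypothetical example of the kind of migration described above:
# widening a primary key column to 8 bytes (bigint) in Postgres.
class ChangeOrdersIdToBigint < ActiveRecord::Migration
  def up
    change_column :orders, :id, :integer, limit: 8
  end

  def down
    change_column :orders, :id, :integer, limit: 4
  end
end
```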

Those migrations don't modify db/schema.rb, so when we run rake db:test:prepare, the test database is created with regular 4-byte integer columns, which have a maximum of about 2.1 billion (for what it's worth, we are using Postgres).
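For reference, the ~2.1 billion ceiling is just the 4-byte signed integer maximum; a quick Ruby sanity check:

```ruby
# Maximum values for Postgres "integer" (4 bytes) and "bigint" (8 bytes).
INT4_MAX = 2**31 - 1  # 2_147_483_647, the ceiling hit in the test database
INT8_MAX = 2**63 - 1  # what a bigint id column allows

puts INT4_MAX                 # => 2147483647
puts 3_000_000_000 > INT4_MAX # => true: such an id overflows a 4-byte column
puts 3_000_000_000 < INT8_MAX # => true: but fits comfortably in a bigint
```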


A note about our ids

For legacy reasons they are tied to being foreign keys from a third party system. We would ideally be using the id column as an internal surrogate primary key and the third party key would be a separate column entirely (which would remove this whole problem), but the overhead in this change is beyond what I'm trying to get to at the moment.


The problem

I'm trying to put some integration tests in place with real-world data, and some of these records have an id larger than 2.1 billion. We will have some calls into these external systems when running the tests (which we'll ultimately stub using VCR), so the ids need to be correct. However, when I try to use this data, it blows up because the value is too large for the column in the test database.

So my question is: is there any non-massively-hacky way to ensure these id columns are bigints in the test database after running db:test:prepare?

Upvotes: 1

Views: 66

Answers (1)

Philip Hallstrom

Reputation: 19899

Change the schema format from :ruby to :sql so that your schema dump is pure SQL. This should keep those large integers intact (as well as any stored procedures, etc. you might have).

In config/application.rb:

config.active_record.schema_format = :sql

http://guides.rubyonrails.org/active_record_migrations.html#types-of-schema-dumps

Upvotes: 1
