
Conversation

@akostadinov
Contributor

@akostadinov akostadinov commented Nov 24, 2025

With this script one can dump and restore 3scale data from one database instance to another. Possibly even between database types, e.g. from oracle to postgres. But I have only tested oracle -> oracle.

After migration, one has to (a console sketch follows below):

# Make Porta Operational:
#   If importing from a different base domain, setup config/domain_substitution.yml
#   Make sure apicast uses:
#    * correct master account url (Account.master #domain and #self_domain)
#    * correct token, see master -> account settings -> personal -> tokens
#   Resync backend: bundle exec rake backend:storage:enqueue_rewrite
#   If apicast domains are changing: Proxy.find_each { _1.update_domains; _1.save! }
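The same checklist, expressed as a hypothetical Rails console session (a sketch only; the domain values are placeholders, and the master-account attributes come from the notes above):

```ruby
# Hypothetical console session for the post-migration steps above.
# Domain values are placeholders; adjust to the target environment.
master = Account.master
master.domain      = "master.example.com"       # master account URL used by apicast
master.self_domain = "master-admin.example.com"
master.save!

# Resync backend storage (same effect as the rake task above):
# bundle exec rake backend:storage:enqueue_rewrite

# Recompute apicast proxy domains if the base domain changed
Proxy.find_each do |proxy|
  proxy.update_domains
  proxy.save!
end
```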

Maybe it needs a test, let me know. But it would only be within the current test database, not between database types.

Implements a comprehensive data migration script that can export and import all database tables via stdin/stdout in JSONL format. The script handles:

  • Batch processing of large tables (1000 records per batch)
  • Proper serialization of complex data types (YAML, JSON)
  • LOB handling for Oracle databases
  • Foreign key dependency ordering
  • Sequence/auto-increment fixes for Oracle, PostgreSQL, and MySQL
  • Support for composite primary keys and STI models
  • Callback-free imports to preserve data integrity

Usage:
RAILS_LOG_TO_STDOUT=false bundle exec rails runner dump_data.rb export > data.jsonl
bundle exec rails runner dump_data.rb import < data.jsonl
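For orientation, the export side of the above boils down to roughly the following (a minimal sketch, not the actual script; model resolution and edge cases such as composite keys are glossed over):

```ruby
# Minimal sketch of the export loop: one JSON object per line (JSONL),
# batched so large tables don't blow up memory. Names are illustrative.
BATCH_SIZE = 1000

ActiveRecord::Base.connection.tables.sort.each do |table|
  model = table.classify.safe_constantize || next   # the real script handles STI/irregular names
  model.unscoped.in_batches(of: BATCH_SIZE) do |batch|
    batch.each do |record|
      $stdout.puts({ 'table' => table, 'attributes' => record.as_json }.to_json)
    end
  end
  warn "  -> exported #{table}"
end
```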

🤖 Generated with Claude Code

@qltysh

qltysh bot commented Nov 24, 2025

❌ 96 blocking issues (100 total)

| Tool | Category | Rule | Count |
| --- | --- | --- | --- |
| rubocop | Style | Use warn instead of STDERR.puts to allow such output to be disabled. | 9 |
| reek | Lint | export_data contains iterators nested 2 deep | 5 |
| reek | Lint | find_model_for_table has the variable name 'e' | 3 |
| reek | Lint | export_data calls '"=" * 80' 2 times | 22 |
| reek | Lint | find_model_for_table doesn't depend on instance state (maybe move it to another class?) | 2 |
| rubocop | Style | Redundant begin block detected. | 2 |
| rubocop | Style | Avoid rescuing without specifying an error class. | 2 |
| rubocop | Lint | Use exception instead of e. | 2 |
| rubocop | Lint | Perceived complexity for export_data is too high. [23/8] | 2 |
| reek | Lint | export_data has approx 54 statements | 2 |
| rubocop | Lint | Assignment Branch Condition size for export_data is too high. [<40, 71, 23> 84.68/20] | 2 |
| rubocop | Lint | Cyclomatic complexity for export_data is too high. [19/7] | 2 |
| rubocop | Lint | Method has too many lines. [86/20] | 2 |
| reek | Lint | export_data manually dispatches method call | 2 |
| rubocop | Style | Use the return of the conditional for variable assignment and comparison. | 2 |
| rubocop | Style | Avoid using {...} for multi-line blocks. | 2 |
| rubocop | Style | Use $stderr instead of STDERR. | 12 |
| reek | Lint | export_data refers to 'model' more than self (maybe move it to another class?) | 10 |
| rubocop | Style | Incorrect formatting, autoformat by running qlty fmt. | 1 |
| rubocop | Style | Use %w or %W for an array of words. | 1 |
| rubocop | Lint | Useless assignment to variable - e. | 1 |
| rubocop | Lint | Do not prefix reader method names with get_. | 1 |
| rubocop | Lint | Block has too many lines. [64/25] | 1 |
| rubocop | Style | Use 2 (not -13) spaces for indentation. | 1 |
| rubocop | Style | Align else with if. | 1 |
| rubocop | Style | Use record_count.zero? instead of record_count == 0. | 1 |
| reek | Lint | export_data performs a nil-check | 1 |
| rubocop | Style | Extra empty line detected before the rescue. | 1 |
| rubocop | Lint | Do not suppress exceptions. | 1 |
| qlty | Structure | Function with high complexity (count = 5): find_model_for_table | 3 |

@qltysh one-click actions:

  • Auto-fix formatting (qlty fmt && git push)
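Most of the volume in the table above comes from two rules, Style/StderrPuts and Style/GlobalStdStream; the fix is mechanical (the variable names below are illustrative):

```ruby
# Flagged by the linters:
STDERR.puts "  -> Reset sequence #{sequence_name} to #{value}"

# Preferred:
warn "  -> Reset sequence #{sequence_name} to #{value}"
# or, where an IO object is needed:
$stderr.puts "  -> Reset sequence #{sequence_name} to #{value}"
```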

@akostadinov akostadinov requested a review from a team November 25, 2025 16:37
Contributor Author


Only a single test because calling the script is slow, so one chain of calls should be preferred. Locally it takes some 50 seconds.

Contributor


About tests.

  • I'm reluctant to introduce a new test to the suite that takes almost a minute to run, can't it really be optimized?
  • I think it would be better to have a unit test suite to test particular methods in the script, and an integration test suite like this, but with one or more tests for each command: import, export, truncate-all and fix-sequences (see the sketch after the log output below).
  • I executed the tests and the import process failed for me (MySQL):
 -> Reset auto_increment for proxy_rules to 4
  -> Reset auto_increment for referrer_filters to 2
  -> Reset auto_increment for service_tokens to 5
  -> Reset auto_increment for services to 4
  -> Reset auto_increment for settings to 3
W, [2025-11-27T12:24:53.019002 #107472]  WARN -- : Creating SystemOperation defaults

  -> ERROR importing system_operations on line 80: ActiveRecord::RecordNotUnique: Mysql2::Error: Duplicate entry '1' for key 'system_operations.PRIMARY'
  -> Failed record: {"id"=>1, "ref"=>"user_signup", "name"=>"New user signup", "description"=>nil, "created_at"=>"2025-11-27 11:24:27 UTC", "updated_at"=>"2025-11-27 11:24:27 UTC", "pos"=>nil, "tenant_id"=>nil}

  -> Backtrace:
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/mysql2-0.5.6/lib/mysql2/client.rb:151:in `_query'
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/mysql2-0.5.6/lib/mysql2/client.rb:151:in `block in query'
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/mysql2-0.5.6/lib/mysql2/client.rb:150:in `handle_interrupt'
  -> Exiting due to error
.
Expected: 0
  Actual: 1
test/integration/dump_data_test.rb:56:in `block in <class:DumpDataTest>'
...
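On the second point, a per-command test could look roughly like this (a sketch only; the script path, fixture handling, and the boot cost of each `rails runner` call are all assumptions):

```ruby
# Sketch of a per-command integration test. Paths and assertions are
# illustrative; each invocation still boots the whole Rails app.
require 'test_helper'

class DumpDataExportTest < ActionDispatch::IntegrationTest
  test 'export writes one JSON object per line and exits cleanly' do
    output = `RAILS_LOG_TO_STDOUT=false bundle exec rails runner script/dump_data.rb export`
    assert_predicate $?, :success?
    output.each_line { |line| assert_kind_of Hash, JSON.parse(line) }
  end
end
```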

Contributor Author


Strange because it worked in CI. Which MySQL version are you running?

Contributor Author


> I'm reluctant to introduce a new test to the suite that takes almost a minute to run, can't it really be optimized?

The issue is the startup time of the script. For testing the bare minimum, one has to dump data, clear data, and import data: 3 invocations that take time just to load all the classes. Otherwise the operations are not so slow. That's why I included only a single long test instead of many targeted ones. Not sure whether it could run faster as a rake task. Also, at some point some monkey patching was done to prevent callbacks; I have to check whether the final version is safe monkey-patching. Otherwise it may well be wiser to keep it as a script.
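For what it's worth, the callback-free part doesn't strictly need monkey patching: the bulk insert API already skips callbacks and validations. A sketch, assuming rows have been parsed from the JSONL and grouped by table (`rows_by_table` is hypothetical):

```ruby
# Sketch: insert_all bypasses model callbacks and validations entirely.
# `rows_by_table` is a hypothetical Hash of table name => array of attribute hashes.
rows_by_table.each do |table, rows|
  model = table.classify.safe_constantize || next
  rows.each_slice(1000) do |slice|
    model.insert_all(slice, record_timestamps: false)   # keep the dumped created_at/updated_at
  end
end
```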

Contributor Author


The monkey patching should be safe, I think, so a rake task can be done; I just don't want to spend time on it now.
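If it does become a rake task later, the wrapper itself can stay thin, something along these lines (a sketch, assuming the script's methods are extracted into a requirable file; task names, file path, and method signatures are made up):

```ruby
# lib/tasks/data_jsonl.rake -- hypothetical wrapper, assuming export_data/import_data
# are extracted from script/dump_data.rb into a plain Ruby file.
namespace :db do
  namespace :data do
    desc 'Export all tables as JSONL to stdout'
    task export_jsonl: :environment do
      require Rails.root.join('lib/data_dump').to_s
      export_data($stdout)
    end

    desc 'Import JSONL from stdin'
    task import_jsonl: :environment do
      require Rails.root.join('lib/data_dump').to_s
      import_data($stdin)
    end
  end
end
```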

Contributor


It failed with MySQL 8.0.

@codecov

codecov bot commented Nov 25, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 87.78%. Comparing base (1d10baa) to head (a7f791b).
⚠️ Report is 14 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #4178      +/-   ##
==========================================
- Coverage   87.81%   87.78%   -0.03%     
==========================================
  Files        1783     1783              
  Lines       44690    44690              
  Branches      686      686              
==========================================
- Hits        39243    39231      -12     
- Misses       5421     5433      +12     
  Partials       26       26              

☔ View full report in Codecov by Sentry.

Contributor

@jlledom jlledom left a comment


A few comments:

  1. The script doesn't support triggers, we need triggers to ensure tenant_id integrity, and probably sequences on oracle.
  2. Instead of outputting data to stdout and printing to stderr, I think it would be better to output/input data from files and use stdout and stderr like usually.
  3. Instead of a script, this would probably be a rake task
  4. At some point (in the task name, description, comments on the code, etc) I think we should mention that the only advantage of this script is to move data between db systems, if you want to dump/restore over the same system, better use rails standard tasks db:data:dump and db:data:load


# Export/Import all data from/to database via stdin/stdout
# Usage:
# RAILS_LOG_TO_STDOUT=false bundle exec rails runner dump_data.rb export > data.jsonl
Contributor


I tried this command and it sent logs to the output file anyway. So something is failing.

On the other hand, why set RAILS_LOG_TO_STDOUT at all? If we are sure we'll never want logs on stdout, couldn't we just set the env variable from inside the script, or use whatever other approach gets the same result?
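One way to do it from inside the script would be to point the loggers at stderr before doing anything else (a sketch; whether it catches everything emitted during boot is untested):

```ruby
# Keep stdout clean for JSONL data: route Rails/ActiveRecord logging to stderr.
# A sketch -- anything logged before this runs still goes wherever it was going.
logger = ActiveSupport::Logger.new($stderr)
Rails.logger = logger
ActiveRecord::Base.logger = logger
ActiveJob::Base.logger = logger if defined?(ActiveJob)
```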

Contributor Author


It's a few log lines and it does not break it.

Contributor


Can the import work with the traces in the jsonl file?

# Export/Import all data from/to database via stdin/stdout
# Usage:
# RAILS_LOG_TO_STDOUT=false bundle exec rails runner dump_data.rb export > data.jsonl
# bundle exec rails runner dump_data.rb import < data.jsonl
Contributor


I tried this command, exporting from MySQL and importing to PostgreSQL. It failed for me:

With the DB containing seed data:

  -> Reset sequence proxy_rules_id_seq to 6
  -> Reset sequence service_tokens_id_seq to 6
Warning: Error on line 73: PG::ForeignKeyViolation: ERROR:  update or delete on table "services" violates foreign key constraint "fk_rails_e4d18239f1" on table "api_docs_services"
DETAIL:  Key (id)=(2) is still referenced from table "api_docs_services".
  -> Reset sequence settings_id_seq to 53
W, [2025-11-27T13:17:26.802839 #158128]  WARN -- : Creating SystemOperation defaults

  -> ERROR importing system_operations on line 76: ActiveRecord::RecordNotUnique: PG::UniqueViolation: ERROR:  duplicate key value violates unique constraint "system_operations_pkey"
DETAIL:  Key (id)=(1) already exists.
  -> Failed record: {"id"=>1, "ref"=>"user_signup", "name"=>"New user signup", "description"=>nil, "created_at"=>"2025-11-26 12:26:27 UTC", "updated_at"=>"2025-11-26 12:26:27 UTC", "pos"=>nil, "tenant_id"=>nil}

  -> Backtrace:
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql_adapter.rb:894:in `exec_params'
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql_adapter.rb:894:in `block (2 levels) in exec_no_cache'
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/abstract_adapter.rb:1027:in `block in with_raw_connection'
  -> Exiting due to error

With an empty DB, no data nor schema:

  /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql/database_statements.rb:19:in `exec': PG::UndefinedTable: ERROR:  relation "web_hooks" does not exist (ActiveRecord::StatementInvalid)
LINE 10:  WHERE a.attrelid = '"web_hooks"'::regclass
                             ^
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql/database_statements.rb:19:in `block (2 levels) in query'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/abstract_adapter.rb:1027:in `block in with_raw_connection'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activesupport-7.1.5.2/lib/active_support/concurrency/null_lock.rb:9:in `synchronize'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/abstract_adapter.rb:999:in `with_raw_connection'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql/database_statements.rb:18:in `block in query'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activesupport-7.1.5.2/lib/active_support/notifications/instrumenter.rb:58:in `instrument'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/abstract_adapter.rb:1142:in `log'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql/database_statements.rb:17:in `query'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql_adapter.rb:1074:in `column_definitions'
	from /home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/abstract/schema_statements.rb:109:in `columns'
...

Contributor Author


You need a clean database state; there is a script command to wipe all tables when given a dump.
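For context, the wipe step is also where the foreign key errors above come from; the usual trick is to suspend referential integrity while truncating, something like this (a sketch, not the script's exact code; on PostgreSQL this needs sufficient privileges):

```ruby
# Sketch: truncate every table with FK checks suspended so that order
# doesn't matter. schema_migrations/ar_internal_metadata are kept.
conn = ActiveRecord::Base.connection
tables = conn.tables - %w[schema_migrations ar_internal_metadata]

conn.disable_referential_integrity do
  tables.each { |table| conn.truncate(table) }
end
```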

Contributor


I tried these commands now:

$ bundle exec rails db:drop db:create db:schema:load
$ bundle exec rails runner script/dump_data.rb import < db/completedb.jsonl

With MySQL 8.4:

...
  -> Reset auto_increment for service_tokens to 6
  -> Reset auto_increment for services to 6
  -> Reset auto_increment for settings to 53
W, [2025-11-28T09:18:06.532753 #45031]  WARN -- : Creating SystemOperation defaults

  -> ERROR importing system_operations on line 76: ActiveRecord::RecordNotUnique: Mysql2::Error: Duplicate entry '1' for key 'system_operations.PRIMARY'
  -> Failed record: {"id"=>1, "ref"=>"user_signup", "name"=>"New user signup", "description"=>nil, "created_at"=>"2025-11-26 12:26:27 UTC", "updated_at"=>"2025-11-26 12:26:27 UTC", "pos"=>nil, "tenant_id"=>nil}

  -> Backtrace:
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/mysql2-0.5.6/lib/mysql2/client.rb:151:in `_query'
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/mysql2-0.5.6/lib/mysql2/client.rb:151:in `block in query'
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/mysql2-0.5.6/lib/mysql2/client.rb:150:in `handle_interrupt'

With PSQL 15:

...
  -> Reset sequence service_tokens_id_seq to 6
  -> Reset sequence services_id_seq to 6
  -> Reset sequence settings_id_seq to 53
W, [2025-11-28T09:36:52.310116 #69186]  WARN -- : Creating SystemOperation defaults

  -> ERROR importing system_operations on line 76: ActiveRecord::RecordNotUnique: PG::UniqueViolation: ERROR:  duplicate key value violates unique constraint "system_operations_pkey"
DETAIL:  Key (id)=(1) already exists.
  -> Failed record: {"id"=>1, "ref"=>"user_signup", "name"=>"New user signup", "description"=>nil, "created_at"=>"2025-11-26 12:26:27 UTC", "updated_at"=>"2025-11-26 12:26:27 UTC", "pos"=>nil, "tenant_id"=>nil}

  -> Backtrace:
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql_adapter.rb:894:in `exec_params'
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/postgresql_adapter.rb:894:in `block (2 levels) in exec_no_cache'
/home/jlledom/.asdf/installs/ruby/3.1.5/lib/ruby/gems/3.1.0/gems/activerecord-7.1.5.2/lib/active_record/connection_adapters/abstract_adapter.rb:1027:in `block in with_raw_connection'
  -> Exiting due to error

tables = get_tables_to_process

# Tables with foreign key constraints that should be imported last
# Based on foreign keys defined in db/oracle_schema.rb
Contributor


Why oracle_schema.rb in particular?

Contributor Author


Claude artifact.
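An alternative to any hardcoded list (and to reading oracle_schema.rb at all) would be to derive the order from the live connection's foreign key metadata, roughly like this (a sketch; FK cycles and self-references would need extra handling):

```ruby
require 'tsort'

# Sketch: topologically sort tables so referenced tables are imported before
# the tables that reference them.
class TableGraph
  include TSort

  def initialize(connection)
    tables = connection.tables
    @deps = tables.to_h do |table|
      [table, connection.foreign_keys(table).map(&:to_table) & tables]
    end
  end

  def tsort_each_node(&block)
    @deps.each_key(&block)
  end

  def tsort_each_child(table, &block)
    @deps.fetch(table, []).each(&block)
  end
end

import_order = TableGraph.new(ActiveRecord::Base.connection).tsort
# referenced (parent) tables come first in import_order
```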

Comment on lines +369 to +373
if sequence_name.nil?
  raise "sequence not found (tried: #{full_sequence_name}, #{shortened_sequence_name}). " \
        "Table may be using a different auto-increment mechanism. " \
        "Oracle Enhanced Adapter supports: :sequence (default), :trigger, :identity, or :autogenerated."
end
Contributor


I don't get this:

  1. If we are importing the DB, is the user_sequences table supposed to be imported too?
  2. Is it auto-generated?
  3. Would it be returned by ActiveRecord::Base.connection.tables in the export process?
  4. Won't this auto-increment method require triggers to be imported also?

Contributor Author


Triggers are out of scope; the schemas of the Rails instances should already be in a working, compatible state. We don't do any DDL except for fixing up the auto-increment sequences.

Contributor


OK. Then it would be good to add a comment to the script to explain it.
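For reference, on the non-Oracle side the fix-sequences pass can mostly lean on what the adapters already provide (a sketch; the Oracle branch is exactly where the sequence-name lookup above comes in):

```ruby
# Sketch of resetting auto-increment state after an import. Only tables with
# a plain `id` primary key are handled; everything else is illustrative.
conn = ActiveRecord::Base.connection

conn.tables.each do |table|
  next unless conn.primary_key(table) == 'id'

  case conn.adapter_name
  when /postgresql/i
    conn.reset_pk_sequence!(table)
  when /mysql/i
    max_id = conn.select_value("SELECT MAX(id) FROM #{conn.quote_table_name(table)}").to_i
    conn.execute("ALTER TABLE #{conn.quote_table_name(table)} AUTO_INCREMENT = #{max_id + 1}")
  when /oracle/i
    # no one-liner here: the script drops and recreates the table's sequence,
    # hence the lookup in user_sequences discussed above
  end
end
```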

STDERR.puts " bundle exec rails runner dump_data.rb fix-sequences"
STDERR.puts ""
STDERR.puts " # Pipe directly between databases"
STDERR.puts " bundle exec rails runner dump_data.rb export | bundle exec rails runner dump_data.rb import"
Contributor


How is it possible this example would work? At least we should set DATABASE_URL on one of the two sides.

Contributor Author


Well, you must have a working database config file on both sides; it's not necessary to set variables. Although in our usage I think we always do.

Contributor


But the command as it is now:

bundle exec rails runner dump_data.rb export | bundle exec rails runner dump_data.rb import

It runs both sides of the pipeline from the same folder, with the same config. It will dump and reload the data from/to the same DB.
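For the pipe example to be meaningful, one of the two sides has to point at a different database, for example via DATABASE_URL on the importing side (placeholder values):

  bundle exec rails runner dump_data.rb export | DATABASE_URL=postgresql://user:pass@target-host/porta bundle exec rails runner dump_data.rb import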


@akostadinov
Contributor Author

> The script doesn't support triggers, we need triggers to ensure tenant_id integrity, and probably sequences on oracle.

DDL is out of scope.

> Instead of outputting data to stdout and printing to stderr, I think it would be better to output/input data from files and use stdout and stderr like usually.

It works, so I don't really want to spend more time on it for minor reasons. We can see later if changes are needed.

> Instead of a script, this would probably be a rake task

Initially I didn't intend to commit it to the repo, but then I thought it might be useful. Now it would require more work that I'm not really sure is worth investing.

> At some point (in the task name, description, comments on the code, etc) I think we should mention that the only advantage of this script is to move data between db systems, if you want to dump/restore over the same system, better use rails standard tasks db:data:dump and db:data:load

I didn't notice db:data:dump, but it is just a mysqldump wrapper. This one is more versatile, and it makes me think the ancients were also thinking in the direction of allowing data migration, but it didn't go far.

Your points are generally good. If you think it is not suitable for merging as is, I will have to think about whether to put more time into it or drop it altogether. I think it is worth having as a starting point, although not ideal. But let me know your thoughts.

@akostadinov akostadinov requested a review from jlledom November 27, 2025 17:41
@jlledom
Contributor

jlledom commented Nov 28, 2025

> At some point (in the task name, description, comments on the code, etc) I think we should mention that the only advantage of this script is to move data between db systems, if you want to dump/restore over the same system, better use rails standard tasks db:data:dump and db:data:load

> I didn't notice db:data:dump, but it is just a mysqldump wrapper. This one is more versatile, and it makes me think the ancients were also thinking in the direction of allowing data migration, but it didn't go far.

I think using db:data:dump is better for multiple reasons:

  1. It's the standard way to do it: there will be online docs, etc.
  2. DB agnostic: It will work whatever db you use.
  3. Not maintained by us: If tomorrow we want to add support for a new version of any RDBMS, we would have to come back to this script and update it also. Same with Rails versions.

> Your points are generally good. If you think it is not suitable for merging as is, I will have to think about whether to put more time into it or drop it altogether. I think it is worth having as a starting point, although not ideal. But let me know your thoughts.

Considering the points above and the addition of one minute to the integration tests, I don't think it's worth merging. The only advantage it has over db:data:dump is the sequence management and being able to migrate data between DBs. But we already decided we won't maintain DBs anymore after 2.15, so I don't think it's our job to provide such scripts. It's not directly related to the project.
