Populating Dockerized PostgreSQL with dumped data from host

I needed to populate a local SQLite database to do some development testing for a client. Her web app has a lot of moving parts and left something to be desired in terms of testing. Most of these steps were adapted from this wonderful post.

Long story short: I needed to recreate an issue discovered in production, but didn’t have any data with which to write tests or even recreate the issue manually. The whole thing is a huge pain in the butt, so I dumped the production database from Heroku and attempted a brute-force reconstruction of the issue. The Heroku instructions produced a file called latest.dump.

Create/run the container

I like using Docker for this kind of stuff. I don’t even run any databases on my host machine anymore.

From the Docker PostgreSQL documentation…

docker run --name some-postgres -e POSTGRES_PASSWORD=secret -p 5432:5432 -d postgres

Restore to local database

This is where the dumped data gets written to the Dockerized PostgreSQL database.

pg_restore --verbose --clean --no-acl --no-owner -h localhost -U postgres --create --dbname=postgres latest.dump


There are a couple of ways to find the database name: either log in as follows, or check the Heroku database dashboard. The password was set to secret when the container was created. It’s a good idea to log in anyway, if only to make sure the database is actually there.

psql -h localhost -p 5432 -U postgres --password

Dump the local database

I didn’t want to mess with the previous developer’s setup, so I took the wussy way out and used the existing SQLite configuration to run the app locally. This requires creating an SQLite DB with the PostgreSQL data dump. This command dumps that data:

pg_dump -h localhost -p 5432 -U postgres --password --data-only --inserts YOUR_DB_NAME > dump.sql

Create SQLite database

Again, these steps were adapted from this post.

Modify dump.sql

My favourite editor is vim. I used it to modify dump.sql.

Remove all the SET statements at the top of the file.

It’ll look something like this:

SET statement_timeout = 0;
SET lock_timeout = 0;
SET client_encoding = 'UTF8';
SET standard_conforming_strings = on;
SET check_function_bodies = false;
SET client_min_messages = warning;
SET row_security = off;
SET search_path = public, pg_catalog;

Get rid of it all.
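If you don’t fancy deleting them by hand, vim’s `:g/^SET /d` does it in one go. From the shell, a sketch assuming GNU sed (macOS/BSD sed wants `-i ''` instead of `-i`):

```shell
# Delete every line in dump.sql that starts with "SET " (in place)
sed -i '/^SET /d' dump.sql
```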

Remove all the setval queries

The post that I’m plagiarizing says these set the auto-increment IDs, which SQLite doesn’t need. Find and remove every line with a call to setval, i.e., everything that looks like this:

SELECT pg_catalog.setval('friendly_id_slugs_id_seq', 413, true);
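Deleting them by hand works, but a one-liner should too (again assuming GNU sed’s in-place `-i`):

```shell
# Strip every line containing a setval() call from dump.sql
sed -i '/setval/d' dump.sql
```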

Replace true with 't' and false with 'f'

Anything that looks like this:

INSERT INTO article_categories VALUES (4, 9, 13, true, '2011-11-22 06:29:07.966875', '2011-11-22 06:29:07.966875');
INSERT INTO article_categories VALUES (26, 14, NULL, false, '2011-12-07 09:09:52.794238', '2011-12-07 09:09:52.794238');

Needs to look like this:

INSERT INTO article_categories VALUES (4, 9, 13, 't', '2011-11-22 06:29:07.966875', '2011-11-22 06:29:07.966875');
INSERT INTO article_categories VALUES (26, 14, NULL, 'f', '2011-12-07 09:09:52.794238', '2011-12-07 09:09:52.794238');
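This can also be scripted, with a caveat: it’s a blunt instrument. The `\b` word boundaries (a GNU sed feature) keep it from touching words like "featured", but it could still mangle column data that literally contains a bare true or false, so eyeball the result:

```shell
# Swap bare SQL booleans for SQLite-style 't'/'f' values in dump.sql
sed -i "s/\btrue\b/'t'/g; s/\bfalse\b/'f'/g" dump.sql
```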

Wrap it all in a single transaction

If you don’t do this, the import will take forever. At the top of the file put

BEGIN;

and at the very bottom of the file put

COMMIT;
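Scripted, wrapping the file in a BEGIN;/COMMIT; pair might look like this (assuming GNU sed, where `1i` inserts a line before line 1):

```shell
# Wrap the whole dump in a single transaction:
# BEGIN; as the new first line, COMMIT; appended at the end
sed -i '1i BEGIN;' dump.sql
echo 'COMMIT;' >> dump.sql
```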
Create the SQLite database

Finally, after all that hard work:

sqlite3 development.sqlite < dump.sql
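A quick sanity check that the import actually produced something (assumes the sqlite3 CLI; article_categories is just the table from the examples above):

```shell
# List the tables that made it in, then count rows in one of them
sqlite3 development.sqlite '.tables'
sqlite3 development.sqlite 'SELECT count(*) FROM article_categories;'
```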

The app was already set up to work with the development.sqlite database. There were a couple of error messages when creating it, but they didn’t seem to impact the app. Everything worked fine, and I found a nifty new way to use Docker.