
How can I import a JSON file into PostgreSQL?

sql
json
postgresql
data-manipulation
by Alex Kataev · Nov 9, 2024
TLDR

To swiftly import a JSON file into PostgreSQL, apply the COPY command paired with the jsonb data type. In the scenario where you have a table with a jsonb column, use this snippet:

-- "Make a home, don't just expect a free couch!"
CREATE TABLE IF NOT EXISTS json_data_table (
    data jsonb
);

-- "Your JSON data needs a crib, this is how to get one!"
COPY json_data_table(data) FROM '/path/to/jsonfile.json';

Change json_data_table to match your table name and /path/to/jsonfile.json to the location of your JSON file. This will load your JSON file content as jsonb objects into PostgreSQL. Note that COPY's default text format reads one row per line, so this works best when each line of the file is one complete JSON document.
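Since COPY's text format expects one row per line, array-shaped or pretty-printed JSON needs flattening first. A minimal client-side sketch in Python (the helper name and sample data are hypothetical) that turns parsed objects into COPY-friendly lines, doubling backslashes because COPY's text format treats backslash as an escape character:

```python
import json

def to_copy_lines(objects):
    """Turn parsed JSON objects into COPY-friendly lines: one compact
    document per line, backslashes doubled because COPY's text format
    treats backslash as an escape character."""
    lines = []
    for obj in objects:
        compact = json.dumps(obj, separators=(",", ":"))
        lines.append(compact.replace("\\", "\\\\"))
    return lines

# Hypothetical sample data
sample = [{"name": "ada", "langs": ["sql", "python"]}, {"name": "linus"}]
for line in to_copy_lines(sample):
    print(line)
```

Write the resulting lines to a file and point COPY at it.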

Taking it a notch higher :arrow_up:

To master importing JSON files into PostgreSQL, I've got some extra tricks up my sleeve for you:

Struggling with nested JSON objects?

Create a staging table to unpack the nested JSON goodies:

-- "Nested JSONs are like those Russian dolls, let's open one up!"
CREATE TABLE staging_table (
    id serial PRIMARY KEY,
    raw_data jsonb
);

-- "Moving raw data, like moving houses...hard but fun"
COPY staging_table(raw_data) FROM '/path/to/jsonfile.json';

-- "Remember you are a data ninja, slicing and dicing data"
INSERT INTO target_table (column1, column2)
SELECT raw_data->>'key1',
       raw_data->'nested'->>'key2'
FROM staging_table;
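The -> and ->> operators behave like dictionary lookups: -> keeps the result as JSON, ->> extracts it as text. As a sanity check, here is the same extraction mirrored in plain Python (the key1/nested/key2 shape matches the hypothetical keys in the snippet above):

```python
import json

# One staged document, shaped like the hypothetical key1/nested.key2 example
raw_data = '{"key1": "v1", "nested": {"key2": "v2"}}'
doc = json.loads(raw_data)

column1 = doc.get("key1")                    # raw_data->>'key1'
column2 = doc.get("nested", {}).get("key2")  # raw_data->'nested'->>'key2'
print(column1, column2)
```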

Easier engagement with JSON data

You can use json_populate_recordset (or its jsonb twin, jsonb_populate_recordset) to make JSON data look like good old relational data:

-- "Creating a table is like choosing the right-sized Tupperware for your data"
-- raw_data is jsonb, so use the jsonb_ variant; the subquery must return
-- a single JSON array of records
CREATE TABLE json_records AS
SELECT *
FROM jsonb_populate_recordset(
    null::your_table_type,
    (SELECT raw_data FROM staging_table)
);
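Conceptually, json_populate_recordset maps each object in a JSON array onto your row type, filling in NULL wherever a key is missing. A rough client-side analogue in Python (column names and documents are illustrative):

```python
def populate_recordset(columns, docs):
    """Rough analogue of json_populate_recordset: map each JSON object
    onto a fixed column list, with None (NULL) where a key is absent."""
    return [tuple(doc.get(col) for col in columns) for doc in docs]

# Illustrative column names and documents
rows = populate_recordset(["id", "name"], [{"id": 1}, {"id": 2, "name": "x"}])
print(rows)
```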

Here's how to upsert data

In case your JSON contains data for existing records, slap in the ON CONFLICT clause to handle it all:

-- "Did someone say UPSERT? You're covered mate!"
-- Note: json_populate_recordset expects a JSON *array* of records
INSERT INTO target_table (id, data)
SELECT *
FROM json_populate_recordset(
    null::target_table,
    '[{"id": 1, "data": "new"}]'
)
ON CONFLICT (id) DO UPDATE
SET data = EXCLUDED.data;
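The ON CONFLICT (id) DO UPDATE clause behaves like a keyed dictionary merge: incoming rows overwrite existing rows with the same id, and new ids are simply inserted. A small Python sketch of that semantics (function name and sample rows are hypothetical):

```python
def upsert(existing, incoming):
    """Mimic INSERT ... ON CONFLICT (id) DO UPDATE SET data = EXCLUDED.data:
    matching ids get their data replaced, new ids are inserted."""
    table = {row["id"]: row["data"] for row in existing}
    for row in incoming:
        table[row["id"]] = row["data"]  # EXCLUDED.data wins on conflict
    return table

print(upsert([{"id": 1, "data": "old"}],
             [{"id": 1, "data": "new"}, {"id": 2, "data": "x"}]))
```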

Taking your game to the SQL stratosphere

Elevate your JSON import game with a common table expression (CTE):

-- "With great CTE, comes great convenience!"
WITH preprocessed_json AS (
    SELECT id, data
    FROM jsonb_populate_recordset(
        null::target_table,
        (SELECT raw_data FROM staging_table)
    )
)
INSERT INTO target_table (id, data)
SELECT id, data FROM preprocessed_json
ON CONFLICT (id) DO NOTHING;

Boost your query speed with JSONB

The jsonb data type stores data in a decomposed binary format. It's like having a power engine for your queries!

-- "JSONB is basically JSON on steroids"
ALTER TABLE target_table
ALTER COLUMN data TYPE jsonb
USING data::jsonb;
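Part of what jsonb buys you is normalization at write time: duplicate keys keep the last value, insignificant whitespace disappears, and input key order is not preserved. A quick Python illustration of the same normalization (sort_keys stands in for jsonb's internal key ordering):

```python
import json

# Duplicate keys and stray whitespace: jsonb normalizes all of this on write
doc = '{"b": 1,   "a": 2, "a": 3}'

# Last duplicate key wins, whitespace is dropped; sort_keys mimics
# jsonb not preserving the input key order
normalized = json.dumps(json.loads(doc), sort_keys=True, separators=(",", ":"))
print(normalized)
```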

Advanced tricks and tips

Harnessing meta-commands

Who's meta? Well, with psql, PostgreSQL's command-line interface, you can use \set and \copy to load files and execute SQL scripts:

-- "Meta is your new friend"
\set content `cat /path/to/jsonfile.json`
INSERT INTO json_data_table (data) VALUES (:'content');

Dealing with big files

When your JSON file is so large it breaks the scale, the CSV trick will save the day:

-- "How to feed a big JSON file to PostgreSQL? Just use CSV!"
-- Pick a quote and delimiter that can never appear in the JSON, so
-- commas and double quotes inside the documents survive the parse
\copy large_json_data_table (data) FROM '/path/to/large_jsonfile.json' (FORMAT csv, QUOTE e'\x01', DELIMITER e'\x02')
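Alternatively, if you'd rather keep CSV's default quoting, you can pre-quote each JSON document as a single CSV field on the client, so embedded commas and double quotes survive a plain (FORMAT csv) COPY. A sketch using Python's stdlib csv module (the helper name is hypothetical):

```python
import csv
import io
import json

def json_to_csv(objects):
    """Quote each JSON document as a single CSV field so embedded commas
    and double quotes survive a plain (FORMAT csv) COPY."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for obj in objects:
        writer.writerow([json.dumps(obj, separators=(",", ":"))])
    return buf.getvalue()

print(json_to_csv([{"a": "x,y"}]), end="")
```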

Turbocharge your JSON manipulation

jq and spyql can turn JSON data into SQL-compatible insert statements, faster than a twitchy rabbit:

# "Making JSON and SQL play well together"
cat /path/to/jsonfile.json | jq '...' | spyql '...' | psql -c "COPY target_table (field_list) FROM STDIN"
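If jq or spyql aren't at hand, a few lines of Python can play a similar role and emit INSERT statements directly. A hedged sketch (function and table names are illustrative; it escapes single quotes but otherwise assumes trusted input):

```python
import json

def json_to_inserts(table, objects):
    """Illustrative stand-in for a jq/spyql pipeline: emit one INSERT
    per JSON object, with the document quoted as a jsonb literal."""
    stmts = []
    for obj in objects:
        literal = json.dumps(obj).replace("'", "''")  # escape SQL single quotes
        stmts.append(f"INSERT INTO {table} (data) VALUES ('{literal}'::jsonb);")
    return stmts

for stmt in json_to_inserts("json_data_table", [{"a": 1}]):
    print(stmt)
```

Pipe the output into psql to run it.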