
How do I copy data from one table to another in Postgres using the COPY command

sql
data-transfer
copy-command
postgres
by Anton Shumikhin · Sep 13, 2024
TLDR

To efficiently transfer data from one table to another in PostgreSQL, use the \COPY meta-command:

  1. Export source table data as a CSV file:
    \COPY source_table TO 'file.csv' CSV HEADER; -- Here we go, CSV Magic!
  2. Import the CSV file into the destination table:
    \COPY target_table FROM 'file.csv' CSV HEADER; -- Let's feed it to our hungry table.

This seamlessly preserves the CSV format and headers. Moreover, \COPY can be executed from client interfaces, such as psql, without requiring server file access permissions.
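
If you only need some rows or columns, \COPY also accepts a query on export and a column list on import. A minimal sketch, with illustrative table, column, and file names:

\COPY (SELECT id, name FROM source_table WHERE active) TO 'subset.csv' CSV HEADER; -- Export a slice
\COPY target_table (id, name) FROM 'subset.csv' CSV HEADER; -- Import into matching columns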

Comprehensive Data Copy Guide

There's no one-size-fits-all solution when it comes to data transfer between tables. You might need different commands for unique scenarios. Let's examine some common requirements:

Cloning a table (structure + data)

CREATE TABLE new_table AS SELECT * FROM existing_table; -- "Clone Wars"

The statement above creates new_table with the same columns and data as existing_table. Note that indexes, constraints, and defaults are not copied.
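
If you want the same column layout without the rows, CREATE TABLE AS also accepts a WITH NO DATA clause:

CREATE TABLE new_table AS SELECT * FROM existing_table WITH NO DATA; -- Structure only, zero rows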

Copying specific columns

INSERT INTO new_table(colA, colB) SELECT col1, col2 FROM existing_table; -- "Pick and choose!"

In this case, you're selecting specific columns (col1, col2) from existing_table and inserting them into new_table (colA, colB). Make sure your columns align!

Structure first, data next

CREATE TABLE new_table (LIKE existing_table INCLUDING ALL); -- "Dress first, then fill pockets"
INSERT INTO new_table SELECT * FROM existing_table; -- "Stuffing pockets..."

new_table is first created with the structure of existing_table, and then data is inserted from existing_table into new_table.

Copy only some data (filtered)

INSERT INTO new_table SELECT * FROM existing_table WHERE condition; -- "Selective hearing!"

This command will copy only the rows that meet the filter condition from existing_table to new_table.

Remove duplicates during copy

INSERT INTO new_table SELECT DISTINCT ON (unique_column) * FROM existing_table ORDER BY unique_column; -- "We don't like clones!"

Only one row per unique_column value is copied from existing_table to new_table. Add further ORDER BY columns to control which row in each group survives; without an ORDER BY, the surviving row is unpredictable.

Transaction safety net

BEGIN; -- "Safety first!"
CREATE TABLE new_table (LIKE existing_table INCLUDING ALL);
INSERT INTO new_table SELECT * FROM existing_table;
-- Verify data before committing
SELECT COUNT(*) FROM new_table; -- "Counting sheep..."
COMMIT; -- or ROLLBACK in case of issues, you can't unscramble an egg!

Every operation is wrapped in a transaction for peace of mind. Either all the operations are successful, or none are—just like committing a crime, except legal.

Backup table before shenanigans

pg_dump -t existing_table -f table_backup.sql database_name  # "Safety net, here we come!"
psql database_name -f table_backup.sql  # Restore from the backup when needed

Back up the table before performing any risky operations, because sometimes prevention is better than cure.

Advanced Data Copy Methods

The \COPY command serves well for file-based data transport. Yet, PostgreSQL has more techniques for fine-grained control.

Data copy with conditional filters

SELECT * INTO new_table FROM existing_table WHERE condition; -- "Not everything that shines is gold!"

Copy complex datasets

CREATE TABLE new_table AS SELECT a.*, b.specific_column FROM existing_table a JOIN another_table b ON a.match_column = b.match_column; -- "Making friends (aka joins)!"

Visualization

Visualize the data transfer from one table to another in PostgreSQL as a well-organized transfer of goods:

[Source Table] 🛤️ ===(🚂 COPY Command)🔄💾===> [Target Table]

Akin to moving cargo, the COPY command shuttles the valuable data from one stop to the next with efficiency.

COPY (SELECT * FROM origin_table) TO '/path/to/file.csv' WITH CSV; -- 🚂 Loading the data cargo
COPY destination_table FROM '/path/to/file.csv' CSV; -- 🔄 Unloading at the intended station

Safeguard and Optimize your Data Transfer

Pre-import assessment

Before importing the data, check whether the destination table:

  • Has indexes worth dropping before the load and recreating afterwards (bulk inserts run faster without index maintenance).
  • Contains triggers that should be temporarily disabled.
  • Has constraints that could reject or slow down the incoming rows.
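
As a sketch of that pre-import housekeeping (disabling triggers requires owning the table; the index name is illustrative):

ALTER TABLE target_table DISABLE TRIGGER USER; -- Pause all user-defined triggers
DROP INDEX IF EXISTS target_table_big_idx; -- Drop a heavy index, recreate after the load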

Post-import verification

After successfully importing the data:

  • Verify the data quality and row counts.
  • Recreate any indexes you dropped.
  • Re-enable the previously disabled triggers.
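
The post-import steps might look like this, again with illustrative names:

SELECT COUNT(*) FROM target_table; -- Compare against the source row count
CREATE INDEX target_table_big_idx ON target_table (some_column); -- Rebuild the dropped index
ALTER TABLE target_table ENABLE TRIGGER USER; -- Switch triggers back on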