Export specific rows from a PostgreSQL table as INSERT SQL script
The quickest way to export specific rows from a PostgreSQL table as an INSERT script is pg_dump:
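A sketch of the command this section has in mind, using the placeholder names explained just below. One honest hedge: not every pg_dump build ships a --where option (many stock builds lack it), so confirm with pg_dump --help before relying on it.

```shell
# Hedge: check that your pg_dump build supports --where before use;
# --inserts makes the dump emit INSERT statements instead of COPY data.
pg_dump --username YourUserName --table your_table_name \
  --inserts --where="id = 123" YourDBName > inserts.sql
```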
Quick reference: replace YourDBName with your database name, YourUserName with your user name, and your_table_name with the table from which you want to extract rows; "id = 123" stands in for whatever condition you wish to apply. The resulting inserts.sql will contain the tailored INSERT commands.
Quick pointers on creating views
When you have to export the same specific rows repeatedly, a CREATE VIEW
command might be your savior. Views can act as a lens focusing on your desired data:
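As a sketch, reusing the placeholder names from earlier (MyView is itself a hypothetical name):

```shell
# Create a view that pins down the rows you keep exporting.
psql -U YourUserName -d YourDBName \
  -c "CREATE VIEW MyView AS SELECT * FROM your_table_name WHERE id = 123;"
```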
Once the view exists, you can run pg_dump again, this time on MyView:
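The dump step might look like the sketch below. A hedge worth stating: pg_dump ordinarily exports a view's definition rather than its rows, so the second command pulls the rows themselves through psql instead.

```shell
# Dump the view object (unquoted MyView folds to lowercase myview):
pg_dump --table myview --inserts YourDBName > my_view.sql

# To get the rows behind the view, select through it with psql:
psql -U YourUserName -d YourDBName \
  -c "COPY (SELECT * FROM MyView) TO STDOUT WITH (FORMAT csv, HEADER)" > view_rows.csv
```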
The game-changer here: there's no need to spell out the --where clause every time you export. Sweet, isn't it?
One-time tickets and Temporary Tables
If your requirement is a one-off export or requires running a few operations before exporting, consider creating a temporary table:
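A sketch of the setup, with the placeholder names from earlier. One hedge: a strictly TEMPORARY table is visible only to the session that created it, and pg_dump opens its own session, so a short-lived ordinary table is used here instead.

```shell
# TempTable is an ordinary short-lived table rather than a TEMPORARY
# one, since pg_dump connects separately and would not see another
# session's temp tables.
psql -U YourUserName -d YourDBName \
  -c "CREATE TABLE TempTable AS SELECT * FROM your_table_name WHERE id = 123;"
```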
Then proceed with pg_dump on TempTable. But remember: this table is for one-time use and should vanish once you are done. Who said "Now you see me, now you don't"?
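The dump-and-tidy-up step, sketched:

```shell
# Unquoted identifiers fold to lowercase, so pg_dump sees "temptable".
pg_dump --table temptable --inserts YourDBName > temp_inserts.sql

# One-time use: drop the scratch table when you are done.
psql -U YourUserName -d YourDBName -c "DROP TABLE TempTable;"
```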
Add Grep to the Mix
You may want to generate the full INSERT script first and then filter it down to specific rows with grep, something like:
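A quick-and-dirty sketch. One hedge: grep matches the value anywhere on the line, so it can also catch unrelated rows that happen to contain the same string.

```shell
# Dump the whole table as INSERTs, then keep only the matching rows.
pg_dump --table your_table_name --inserts YourDBName > all_inserts.sql
grep "123" all_inserts.sql > filtered_inserts.sql
```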
This gives you finer control over the exported data. The more control, the merrier the experience!
Defaulting to the COPY command
If you are more inclined to use the COPY command, export the table contents to an intermediate CSV file, then convert that CSV into INSERT statements:
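The export half, sketched with the placeholder names. Note that COPY TO a server-side file needs elevated privileges (superuser or the pg_write_server_files role); the \copy section further down avoids that.

```shell
# Writes /tmp/rows.csv on the database server, not your machine.
psql -U YourUserName -d YourDBName -c "COPY (SELECT * FROM your_table_name WHERE id = 123) TO '/tmp/rows.csv' WITH (FORMAT csv, HEADER);"
```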
Where there's a will, there's a way: for converting the CSV file into INSERT statements, you can use a small script or tool that handles this specific task.
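One such small script, as a minimal sketch: it assumes hypothetical columns (id, name) and a simple CSV with no embedded commas, quotes, or NULLs; real data deserves a proper CSV parser.

```shell
# Stand-in for the CSV produced by the COPY step above.
printf 'id,name\n1,Alice\n2,Bob\n' > rows.csv

# Skip the header, then turn each data row into an INSERT statement.
tail -n +2 rows.csv | while IFS=, read -r id name; do
  printf "INSERT INTO your_table_name (id, name) VALUES (%s, '%s');\n" "$id" "$name"
done > inserts.sql

cat inserts.sql
```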
For the rebellious hearts without superuser privileges
When you don't have superuser rights on the server, do not fret: the \copy command in psql can be your guardian angel, exporting data from the server to your local system:
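Sketched with the same placeholder names. Because \copy is a psql meta-command, the file is written on your machine and no server-side file privileges are needed.

```shell
# rows.csv lands in your local current directory.
psql -U YourUserName -d YourDBName -c "\copy (SELECT * FROM your_table_name WHERE id = 123) TO 'rows.csv' WITH (FORMAT csv, HEADER)"
```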
Remember, the output file created this way is used just like any other CSV export to build INSERT statements.
A friendly reminder on compatibility checks and conflict resolution
When exporting data, it's crucial to check compatibility with the target table's structure and data types. For instance, when using COPY to export rows to CSV and then producing INSERT commands, you need to match the data types and resolve all the juicy conflicts (escaping quotes, handling NULL values, and so on).
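The quote-escaping point, as a toy sketch: SQL string literals escape an embedded single quote by doubling it.

```shell
# Double embedded single quotes so the value is safe in a SQL literal.
value="O'Brien"
escaped=$(printf '%s' "$value" | sed "s/'/''/g")
printf "INSERT INTO your_table_name (name) VALUES ('%s');\n" "$escaped"
# Emits: INSERT INTO your_table_name (name) VALUES ('O''Brien');
```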
Battle of the Titans: Command line vs. GUI in Exports
Depending on your comfort level and the complexity of the operation, choosing between the command line and a GUI for exports makes a huge difference. The command line, with commands such as pg_dump and grep and its massive scope for scripting, is usually the go-to for regular and automated exports. GUI tools provide a simpler visual approach, but may not be as powerful as the command line. So choose your weapon wisely!
In the end, all's well with backups:
I cannot emphasize this enough: regular backups ensure data safety. They can also come in handy when building your INSERT script, if ever required. Keeping your backup game strong simplifies recovery and brings peace of mind, and who doesn't like that?
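For instance, the two everyday pg_dump backup flavours (file names are illustrative):

```shell
# Plain-SQL backup, restorable by feeding it to psql:
pg_dump YourDBName > backup.sql

# Custom-format backup, restorable (even selectively) with pg_restore:
pg_dump -Fc YourDBName > backup.dump
```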