Move SQL data from one table to another
Here's the quick answer for a clean, neat transfer:
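A minimal sketch of the transfer, using the placeholder names referenced below (new_table, old_table, column1, column2, condition stand in for your own schema):

```sql
-- Copy the matching rows, then remove them from the source.
-- Wrapping both steps in a transaction keeps the move atomic.
BEGIN TRANSACTION;

INSERT INTO new_table (column1, column2)
SELECT column1, column2
FROM old_table
WHERE condition;

DELETE FROM old_table
WHERE condition;

COMMIT;
```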
Adjust new_table, old_table, column1, column2, and condition as per your needs. Data integrity is maintained by ensuring only the intended rows are transferred.
When INSERT followed by DELETE isn't your cup of tea
Sometimes you may not be able to DELETE rows, for reasons such as retaining historical data. In these scenarios, mark the transferred rows instead of deleting them:
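One way to sketch this, assuming old_table has (or you add) a flag column named is_transferred (BIT/BOOLEAN, defaulting to 0):

```sql
-- Copy rows that haven't been transferred yet, then flag them
-- in the source instead of deleting them.
INSERT INTO new_table (column1, column2)
SELECT column1, column2
FROM old_table
WHERE condition AND is_transferred = 0;

UPDATE old_table
SET is_transferred = 1
WHERE condition AND is_transferred = 0;
```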
Here, is_transferred works as a flag indicating that a row has already been transferred.
A two-birds-one-stone approach in SQL Server
If you're using SQL Server, you can move rows in a single statement with the OUTPUT clause, saving you an extra statement and some disk I/O:
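A sketch of the single-statement move, again using the placeholder names from above:

```sql
-- Deleted rows are captured by OUTPUT and routed straight
-- into the destination table, all in one atomic statement.
DELETE FROM old_table
OUTPUT DELETED.column1, DELETED.column2
INTO new_table (column1, column2)
WHERE condition;
```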
This statement deletes rows from old_table and inserts them into new_table in one atomic operation.
Keeping primary keys intact
Conducting IDENTITY_INSERT prior to transferring data can preserve the primary key:
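A sketch for SQL Server, assuming id is the identity (primary key) column; note that an explicit column list is required when IDENTITY_INSERT is ON:

```sql
SET IDENTITY_INSERT new_table ON;

-- Explicitly copy the identity values so primary keys survive the move.
INSERT INTO new_table (id, column1, column2)
SELECT id, column1, column2
FROM old_table
WHERE condition;

SET IDENTITY_INSERT new_table OFF;
```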
Just remember to turn IDENTITY_INSERT ON before the operation and OFF after it.
Adapting to different SQL dialects
Different SQL engines (like MySQL, SQL Server, PostgreSQL) come with their own distinct syntax, and what's fun without some variety, right?
SQL Server
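In SQL Server, SELECT ... INTO can create the destination table on the fly (a sketch, assuming new_table does not exist yet):

```sql
-- SELECT ... INTO creates new_table with matching column types.
SELECT column1, column2
INTO new_table
FROM old_table
WHERE condition;

DELETE FROM old_table
WHERE condition;
```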
MySQL
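MySQL has no OUTPUT or RETURNING-into-INSERT equivalent, so the two-step pattern inside a transaction is the usual sketch:

```sql
START TRANSACTION;

INSERT INTO new_table (column1, column2)
SELECT column1, column2
FROM old_table
WHERE condition;

DELETE FROM old_table
WHERE condition;

COMMIT;
```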
PostgreSQL
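PostgreSQL can do the whole move in one statement with a data-modifying CTE (a sketch with the same placeholder names):

```sql
-- DELETE ... RETURNING feeds the removed rows to the INSERT.
WITH moved AS (
    DELETE FROM old_table
    WHERE condition
    RETURNING column1, column2
)
INSERT INTO new_table (column1, column2)
SELECT column1, column2
FROM moved;
```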
Be race-condition proof
Race conditions can appear when transfers happen concurrently. Use serializable transactions or row locks to ensure data consistency:
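A SQL Server sketch using table hints to hold the selected rows until the transaction commits:

```sql
BEGIN TRANSACTION;

-- UPDLOCK takes update locks on the rows read; HOLDLOCK keeps
-- them (serializable semantics) until COMMIT.
INSERT INTO new_table (column1, column2)
SELECT column1, column2
FROM old_table WITH (UPDLOCK, HOLDLOCK)
WHERE condition;

DELETE FROM old_table
WHERE condition;

COMMIT;
```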
Using UPDLOCK and HOLDLOCK prevents other transactions from meddling with the affected rows until the current transaction is committed.
Encountering common issues
Some issues you might face are:
- Data conflicts: Having primary keys or unique indexes producing violations when transferring data.
- Extended transactions: Lengthy transactions can hamper performance. Batching operations into digestible amounts might help here.
- Foreign key constraints: If the destination tables have dependencies, either transfer the dependent data first or temporarily disable the constraints for the operation.
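The batching advice above can be sketched in SQL Server like this (the batch size of 1000 is an arbitrary choice to tune for your workload):

```sql
-- Move rows in small batches so locks are held briefly
-- and the transaction log doesn't balloon.
WHILE 1 = 1
BEGIN
    DELETE TOP (1000) FROM old_table
    OUTPUT DELETED.column1, DELETED.column2
    INTO new_table (column1, column2)
    WHERE condition;

    -- Stop once a batch moves no rows.
    IF @@ROWCOUNT = 0 BREAK;
END
```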
Being SQL-smart
Here are some pro tips:
- Always test your queries on a non-production environment before running them on production.
- Perform a backup before running commands that can affect many rows. You wouldn't want an "Oh Snap!" moment, would you?
- Monitor performance implications; moving a lot of data could lock up tables and affect availability, something your boss wouldn't be too excited about.