
How do I convert a database row into a struct

sql
database-mapping
struct-conversion
sql-null-values
by Nikita Barsukov · Oct 15, 2024
TLDR

To convert a database row into a Go struct, use a database access library that binds query results automatically. Define a struct whose fields align with your table columns, write a query, and let the library's binding function populate the struct for you. Here's a short and punchy example using Go's widely-used sqlx library:

// Our struct acts as a mirror reflecting our database table columns
type User struct {
    ID   int    `db:"id"`
    Name string `db:"name"` // Boss level: creating your own field names
    // Add fields ad infinitum (try not to add fields ad nauseam)
}

// Assuming db is an *sqlx.DB instance and user is a freshly summoned User struct
var user User
err := db.Get(&user, "SELECT id, name FROM users WHERE id = ?", userID)
// Handle those errors like they're hot potatoes

The db:"field_name" tags act as our go-to field mappers, leading struct fields and database columns right to their perfect match. So, when you call db.Get, your struct gets a nice, big scoop of row data from the specific user denoted by userID.

Let's roll up our sleeves: Expanding on row to struct conversion

Playing tag with struct fields

To get our libraries (like sqlx) to play nice with our code, we need to display our struct field tags at all times:

type Product struct {
    SKU     string  `db:"sku"`      // It's like giving directions to a stranger
    Name    string  `db:"name"`     // Left at the red building, right at the big tree...
    Price   float64 `db:"price"`    // Just follow the tags
    InStock bool    `db:"in_stock"` // And voila! You've reached your destination
    // Keep on driving...
}

In sqlx, these tags behave like middlemen, clarifying column-to-field relationships for the library, resulting in a much smoother mapping journey.
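Under the hood there's no magic: mappers like sqlx walk the struct with reflection and read each field's `db` tag. Here's a miniature sketch of that mechanism; dbColumns is a hypothetical helper, not sqlx's actual internals:

```go
package main

import (
	"fmt"
	"reflect"
)

type Product struct {
	SKU   string  `db:"sku"`
	Price float64 `db:"price"`
}

// dbColumns collects the `db` tag of every struct field, which is
// essentially how a mapper decides which column feeds which field.
func dbColumns(v interface{}) []string {
	t := reflect.TypeOf(v)
	cols := make([]string, 0, t.NumField())
	for i := 0; i < t.NumField(); i++ {
		if tag := t.Field(i).Tag.Get("db"); tag != "" {
			cols = append(cols, tag)
		}
	}
	return cols
}

func main() {
	fmt.Println(dbColumns(Product{})) // [sku price]
}
```

This is also why the tags must match the column names exactly: the library only has the tag strings to go on.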

Handling sneaky null values

Database rows sometimes come down with a bout of NULL-itis, but Go's basic types (string, int, bool) have no null value — scanning a NULL into one is an error. To tackle nullable columns, turn to sql.NullString, sql.NullInt64, sql.NullBool (and so on):

type NullableProfile struct {
    Bio sql.NullString `db:"bio"`
    Age sql.NullInt64  `db:"age"`
    // Keep adding fields like there's no tomorrow
}

Each of these types carries a Valid flag (a plain bool field) so we can verify the column actually held a value before we even think of using it.

Absolute power of named parameters

With sqlx on your team, you'll find yourself with the ability to harness named parameters. This will without a doubt elevate your code readability and maintain a sweet spot of logical consistency:

query := `SELECT * FROM users WHERE id = :id` // Notice the ":" before "id"
namedParams := map[string]interface{}{"id": userID}
rows, err := db.NamedQuery(query, namedParams) // NamedQuery returns *sqlx.Rows, not a filled struct
// Errors are part and parcel. Ready your error handling techniques
defer rows.Close()
if rows.Next() {
    err = rows.StructScan(&user) // Scan the row into your struct yourself
}

Going to the market: Multiple row fetching

When you're tasked with bringing home multiple rows, skip the manual loop: sqlx's Select method fills an entire slice of structs in one call:

var users []User
err := db.Select(&users, "SELECT * FROM users")
// Handle error like a champ

Bid adieu to manual scanning

If we're stuck with the database/sql package alone, we have to bow to manually scanning every column into every field. To avoid this chore, consider switching to a friendlier library like scany:

import "github.com/georgysavva/scany/sqlscan" var user User err := sqlscan.Get(context.Background(), db, &user, "SELECT * FROM users WHERE id = ?", userID) // Got the error-handling reflexes activated yet?

Database compatibility: Your new best friend

When mapping rows to structs, have your Go-to compatibility guidebook at the ready. The standard Scan method says yea to most common types, but custom database types require slightly more sophisticated scanning strategies — typically implementing the sql.Scanner interface on your own type (and driver.Valuer for the write side).