
Let's say I have:

type User struct {
    ID    int64  `json:"id"`
    Posts []Post `json:"posts"`
}

type Post struct {
    ID   int64  `json:"id"`
    Text string `json:"text"`
}

The SQL query:

WITH temp AS (
    SELECT u.id AS user_id, p.id AS post_id, p.text AS post_text
    FROM users u
    JOIN posts p ON u.id = p.user_id
)
SELECT user_id, ARRAY_AGG(ARRAY[post_id::text, post_text])
FROM temp
GROUP BY user_id

What I want is to scan rows from the query above into a slice of User objects:

import (
    "context"
    "fmt"

    "github.com/jackc/pgx/v4/pgxpool"
    "github.com/lib/pq"
)

var out []User

rows, _ := client.Query(context.Background(), query) // No error handling for brevity

for rows.Next() {
    var u User

    if err := rows.Scan(&u.ID, pq.Array(&u.Posts)); err != nil {
        return
    }

    out = append(out, u)
}

As expected, the code above fails with:

pq: cannot convert ARRAY[4][2] to StringArray

This makes sense, but is there a way to read the SQL output into my slice of users?

1 Answer

Scanning of multi-dimensional arrays of arbitrary types, like structs, is not supported by lib/pq. If you want to scan such an array you'll have to parse and decode it yourself in a custom sql.Scanner implementation.

For example:

type PostList []Post

func (ls *PostList) Scan(src any) error {
    var data []byte
    switch v := src.(type) {
    case string:
        data = []byte(v)
    case []byte:
        data = v
    default:
        return fmt.Errorf("pq: cannot convert %T to PostList", src)
    }

    // The data var holds the multi-dimensional array value,
    // something like: {{"1","foo"}, {"2","bar"}, ...}
    // The above example is easy to parse but too simplistic,
    // the array is likely to be more complex and therefore
    // harder to parse, but not at all impossible if that's
    // what you want.

    return nil
}
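For the simple shape shown in the comment, the parse step might look like the sketch below. It assumes every element is double-quoted and contains no embedded quotes, commas, or braces; real array output can include escape sequences, so treat this as a starting point rather than a complete parser (`parseSimplePostArray` is a hypothetical helper name):

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

type Post struct {
	ID   int64  `json:"id"`
	Text string `json:"text"`
}

// inner matches one {"<id>","<text>"} element of the array literal.
// It only works when both fields are quoted and contain no embedded
// quotes, commas, or braces.
var inner = regexp.MustCompile(`\{"([^"]*)","([^"]*)"\}`)

// parseSimplePostArray decodes a literal like {{"1","foo"},{"2","bar"}}
// into a slice of Post values.
func parseSimplePostArray(data []byte) ([]Post, error) {
	var posts []Post
	for _, m := range inner.FindAllSubmatch(data, -1) {
		id, err := strconv.ParseInt(string(m[1]), 10, 64)
		if err != nil {
			return nil, err
		}
		posts = append(posts, Post{ID: id, Text: string(m[2])})
	}
	return posts, nil
}

func main() {
	posts, err := parseSimplePostArray([]byte(`{{"1","foo"},{"2","bar"}}`))
	fmt.Println(posts, err) // prints: [{1 foo} {2 bar}] <nil>
}
```
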

If you want to learn more about the PostgreSQL array representation syntax, see the Array Input and Output Syntax section of the PostgreSQL documentation.
An approach that does not require you to implement a parser for PostgreSQL arrays would be to build and pass JSON objects, instead of PostgreSQL arrays, to array_agg. The result of that would be a one-dimensional array with jsonb as the element type.

SELECT user_id, array_agg(jsonb_build_object('id', post_id, 'text', post_text)) 
FROM temp
GROUP BY user_id

Then the custom sql.Scanner for the list just needs to delegate to lib/pq's GenericArray, and a second, element-specific sql.Scanner delegates to encoding/json.

type PostList []Post

func (ls *PostList) Scan(src any) error {
    return pq.GenericArray{A: ls}.Scan(src)
}

func (p *Post) Scan(src any) error {
    var data []byte
    switch v := src.(type) {
    case string:
        data = []byte(v)
    case []byte:
        data = v
    default:
        return fmt.Errorf("pq: cannot convert %T to Post", src)
    }
    return json.Unmarshal(data, p)
}

type User struct {
    ID    int64    `json:"id"`
    Posts PostList `json:"posts"`
}
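To check the element-level scanner in isolation, you can hand it a single JSON document directly, which is roughly what GenericArray passes per element once it has split the one-dimensional array. This standalone sketch needs no database:

```go
package main

import (
	"encoding/json"
	"fmt"
)

type Post struct {
	ID   int64  `json:"id"`
	Text string `json:"text"`
}

// Scan decodes one jsonb element into the Post. This mirrors the
// element-specific sql.Scanner from the answer above.
func (p *Post) Scan(src any) error {
	var data []byte
	switch v := src.(type) {
	case string:
		data = []byte(v)
	case []byte:
		data = v
	default:
		return fmt.Errorf("cannot convert %T to Post", src)
	}
	return json.Unmarshal(data, p)
}

func main() {
	// Simulate the raw bytes the array scanner would hand over
	// for a single element.
	var p Post
	if err := p.Scan([]byte(`{"id": 2, "text": "bar"}`)); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", p) // prints: {ID:2 Text:bar}
}
```
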