
I'm trying to insert rows into a database table that has three columns: publication, publication_year, and authors. The authors column is of type json[]. Currently, I have the following code:

import psycopg2
from psycopg2.extras import execute_values

# Build the INSERT statement; execute_values fills in the VALUES %s placeholder
query = "INSERT INTO " + store_table_name + " (" + ", ".join(key_types.keys()) + ") VALUES %s"
try:
    # connect to the database
    conn, cur = pg_connect(DB_OSP, db_host, db_user, db_password)
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)

    logger.info('Adding %s rows to %s' % (len(data), store_table_name))

    execute_values(cur, query, data)

    cur.close()
    conn.commit()
# exception handling to follow

Here data is a list of tuples, where each entry is (publication, publication_year, authors). For example, an entry could look like:

('Sample Book', 2019, ['{"Author1_fname": "Billy", "Author1_lname": "Bob"}', '{"Author2_fname": "King", "Author2_lname": "Kong"}'])

As written, the code fails with a type mismatch: psycopg2 adapts the authors list as a text array instead of a JSON array. I also tried converting the array to a string, which didn't work either. I'm not quite sure how to explicitly add the "::json[]" cast when using execute_values. Does anyone have experience doing this?
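For reference, execute_values accepts a template parameter that controls how each row is rendered, which is one place an explicit cast can be spelled out. A minimal sketch (untested against a live database; the table and column names are taken from the question, and prepare_row is a hypothetical helper):

```python
import json

def build_insert(table_name, columns):
    """Build an INSERT ... VALUES %s statement for execute_values."""
    return "INSERT INTO {} ({}) VALUES %s".format(table_name, ", ".join(columns))

# Each array element is a JSON string; the per-row template adds an
# explicit cast so the server reads the text[] as json[].
TEMPLATE = "(%s, %s, %s::json[])"

def prepare_row(publication, year, authors):
    # Serialize each author dict to a JSON string for the array elements
    return (publication, year, [json.dumps(a) for a in authors])

def insert_rows(conn, table_name, rows):
    # Imported lazily so the helpers above are usable without psycopg2
    from psycopg2.extras import execute_values
    with conn.cursor() as cur:
        execute_values(
            cur,
            build_insert(table_name, ["publication", "publication_year", "authors"]),
            rows,
            template=TEMPLATE,
        )
    conn.commit()
```

Whether the json[] column is the right design is a separate question, as the comment below points out.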

  • Are you sure you want an array of json for authors? What you have is not a 'json array', it's an array of json objects, which is rarely useful. Commented Apr 10, 2020 at 4:27

1 Answer


Try using Json from psycopg2.extras by formatting your data as:

(
    "Sample Book",
    2019,
    Json([{"Author1_fname": "Billy", "Author1_lname": "Bob"},
          {"Author2_fname": "King", "Author2_lname": "Kong"}])
)
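A fuller sketch of this approach (make_row is a hypothetical helper, and the fallback branch only stands in for environments without psycopg2). Note that Json serializes the whole list into a single json value, so this fits a plain json column rather than json[], in line with the comment above:

```python
import json

# Authors as plain Python dicts rather than pre-serialized JSON strings
authors = [
    {"Author1_fname": "Billy", "Author1_lname": "Bob"},
    {"Author2_fname": "King", "Author2_lname": "Kong"},
]

def make_row(publication, year, authors):
    """Wrap the author list for insertion.

    psycopg2's Json adapter serializes the object with json.dumps and
    sends it to the server as one json value.
    """
    try:
        from psycopg2.extras import Json
        wrapped = Json(authors)
    except ImportError:
        # Illustrative fallback: what Json would serialize on the wire
        wrapped = json.dumps(authors)
    return (publication, year, wrapped)

row = make_row("Sample Book", 2019, authors)
# row can then be passed along with the other tuples to
# execute_values(cur, query, data)
```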

