
I am passing a custom callable to pandas.to_sql() via its method argument. The snippet below is from the pandas documentation on how to use it:

import csv
from io import StringIO

def psql_insert_copy(table, conn, keys, data_iter):
    """
    Execute SQL statement inserting data

    Parameters
    ----------
    table : pandas.io.sql.SQLTable
    conn : sqlalchemy.engine.Engine or sqlalchemy.engine.Connection
    keys : list of str
        Column names
    data_iter : Iterable that iterates the values to be inserted
    """
    # gets a DBAPI connection that can provide a cursor
    dbapi_conn = conn.connection
    with dbapi_conn.cursor() as cur:
        s_buf = StringIO()
        writer = csv.writer(s_buf)
        writer.writerows(data_iter)
        s_buf.seek(0)

        columns = ', '.join('"{}"'.format(k) for k in keys)
        if table.schema:
            table_name = '{}.{}'.format(table.schema, table.name)
        else:
            table_name = table.name

        sql = 'COPY {} ({}) FROM STDIN WITH CSV'.format(
            table_name, columns)
        cur.copy_expert(sql=sql, file=s_buf)
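For context, I call it roughly like this (a minimal sketch; the connection string, table name, and data source are placeholders, not my real setup):

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection details and names, not my real setup
engine = create_engine('postgresql+psycopg2://user:password@localhost:5432/mydb')

df = pd.read_csv('data.csv')  # placeholder data source
df.to_sql('my_table', engine, if_exists='append', index=False,
          method=psql_insert_copy)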

But while using this COPY functionality, I am getting the error

psycopg2.errors.InvalidTextRepresentation: invalid input syntax for integer: "3.0"

This is not a problem with the input: the same table schema and values were working initially, when I used the to_sql() function without the custom callable psql_insert_copy(). I am using a SQLAlchemy engine to get the connection cursor.
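One thing worth checking on my side is whether pandas has silently up-cast an integer column to float, since that would explain where the "3.0" text comes from. A quick diagnostic sketch (df is my DataFrame):

print(df.dtypes)
# If a column that is integer in the Postgres table shows up as float64 here
# (for example because it contains NaN), csv.writer will render its values
# as "3.0" and COPY will reject them.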

  • The error tells you the problem: select '3.0'::integer; gives ERROR: invalid input syntax for type integer: "3.0", whereas select '3'::integer; returns 3. You can't insert a float into an integer field. Commented Mar 1, 2021 at 16:00
  • Yes, that makes sense, but I am using the same schema and the same values that worked when I didn't use the Postgres COPY. That is the part confusing me. Commented Mar 2, 2021 at 2:32
  • COPY just takes what it is given on STDIN. It is the process that produces s_buf that is producing '3.0': either data_iter or csv.writer. Capture the intermediate output to see where (see the sketch after these comments). Commented Mar 2, 2021 at 15:36
  • I think when we use INSERT, an implicit type mismatch between the schema and the inserted rows is taken care of, whereas COPY does not perform such coercion. Commented Mar 3, 2021 at 3:59
  • Then your choices are: 1) clean the data to match the types in the table, 2) change the types in the table to match the data, or 3) follow @SargisKazaryan's answer and create a load table with all varchar fields, then clean up there before moving to the final table. Commented Mar 3, 2021 at 15:48
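As suggested in the comment above, a minimal way to capture the intermediate output is to peek at s_buf just before the COPY (a debugging sketch added by hand, not part of the pandas example):

# Inside psql_insert_copy, right after the first s_buf.seek(0):
preview = [s_buf.readline() for _ in range(5)]   # first few CSV lines that COPY will receive
print(''.join(preview))                          # check whether "3.0" is already in the buffer
s_buf.seek(0)                                    # rewind again so COPY reads from the start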

1 Answer


I would recommend using string (varchar) fields in the table for such loads, or writing the entire SQL script manually and specifying the types of the table fields explicitly.
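For example, from the pandas side this could look like the following (a sketch; the table name, column name, and engine are placeholders, and the dtype argument only takes effect when to_sql creates the table):

from sqlalchemy import types

# Option 1: clean the data so it matches the existing integer column
df['some_int_col'] = df['some_int_col'].astype('int64')   # requires the column to have no NaN values

# Option 2: let to_sql (re)create the table with explicit string fields
df.to_sql('my_table', engine, if_exists='replace', index=False,
          method=psql_insert_copy,
          dtype={'some_int_col': types.Text()})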
