11

I have a table in PostgreSQL named mytable, and I need to print the contents of this table to stdout from a Python application.

I'm currently doing the following:

        import sys
        import psycopg2

        conn = psycopg2.connect("dbname=postgres user=postgres password=psswd")
        cur = conn.cursor()
        cur.copy_to(sys.stdout, 'mytable', sep='\t')

However, I get some "\N" entries printed between some columns. I believe this happens because, somewhere during printing, a line exceeds the terminal width and wraps to the next line in the psql terminal, and that is why these \N's show up.

Output:

E0307   1       M       400     Ethan   UTDallas        12.98580404     \N      50.79403657     1
E0307   1       M       400     Lucas   Baylor  15.18511175     \N      56.87285183     3
E0307   1       M       400     Jackson Baylor  13.64228411     \N      56.87285183     3
E0307   1       M       400     Jacob   Baylor  13.19878974     \N      56.87285183     3
E0307   1       M       400     Samuel  Baylor  14.84666623     \N      56.87285183     3

My question is the following:

  1. How do I get rid of these \N in the output? Is there an alternative way of printing a table? I'm trying to avoid approaches where I have to execute an entire "SELECT * FROM my_table" query; I'd prefer something that just uses the name of the table to be printed.

  2. Also, how do I get the table headers in the printed output? I tried the following:

    cur.execute("COPY mytable TO STDOUT with csv header")

I get this error message:

ProgrammingError: can't execute COPY TO: use the copy_to() method instead

Also, I'm not sure if this is the best way, but it's something I tried. :)


3 Answers

7

I don't have a Postgres table handy to test this, but does this work for you?

import psycopg2 as pg
import pandas as pd
import pandas.io.sql as psql

connection = pg.connect("dbname=postgres user=postgres password=psswd")
#my_table   = pd.read_sql_table('mytable', connection)  # needs a SQLAlchemy connectable (see comments)
my_table    = pd.read_sql('SELECT * FROM mytable', connection)
another_attempt = psql.read_sql("SELECT * FROM mytable", connection)

print(my_table)

# OR
print(another_attempt)

3 Comments

I get this error: Traceback (most recent call last): File "<ipython-input-39-731d90653fa9>", line 1, in <module> my_table = pd.read_sql_table('team_totals', connection) File "C:\Users\peshalnayak\Anaconda3\lib\site-packages\pandas\io\sql.py", line 351, in read_sql_table raise NotImplementedError("read_sql_table only supported for " NotImplementedError: read_sql_table only supported for SQLAlchemy connectable.
OK, I'll update my answer to use the read_sql function, which takes a query.
OK, hoping either the my_table or another_attempt lines work. One uses the standard pd.read_sql, the second uses pd.io.sql, which I found suggested specifically for connecting to Postgres. Curious which works, or if both do.
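
The NotImplementedError in the comments comes from pd.read_sql_table requiring a SQLAlchemy connectable rather than a raw psycopg2 connection. A minimal sketch of that route, assuming SQLAlchemy is installed and reusing the credentials from the question:

import pandas as pd
from sqlalchemy import create_engine

# read_sql_table needs a SQLAlchemy engine, not a psycopg2 connection
engine = create_engine("postgresql+psycopg2://postgres:psswd@localhost/postgres")
my_table = pd.read_sql_table("mytable", engine)
print(my_table.to_string(index=False))  # prints headers; NULLs show up as NaN/None rather than \N
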
3

That \N is the default textual representation of a null value. It can be changed with the null parameter of copy_to.

To have the headers in the output, use copy_expert:

copy = "copy mytable to stdout with csv header delimiter '\t' null 'NULL'"
cursor.copy_expert(copy, sys.stdout)
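
Put together with the connection from the question, a complete sketch might look like this (the connection string is taken from the question, and the 'NULL' placeholder is just an example value):

import sys
import psycopg2

conn = psycopg2.connect("dbname=postgres user=postgres password=psswd")
cur = conn.cursor()
# copy_expert accepts the full COPY syntax, so HEADER and NULL can be used here
cur.copy_expert("copy mytable to stdout with csv header delimiter '\t' null 'NULL'", sys.stdout)
conn.close()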


1

As stated earlier by Neto, cur.copy_expert("sql statement", sys.stdout) will work. To use copy_to you need to pass the null parameter.

Try this if you choose the copy_to method (set the null value; see the docs). Print the column names first.

cur.execute("SELECT * FROM mytable LIMIT 0")   # zero-row query so cur.description gets populated
header = [col[0] for col in cur.description]
print('\t'.join(header))
cur.copy_to(sys.stdout, 'mytable', sep='\t', null='NULL')
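
If you want the column names without executing any SELECT against the table itself, one alternative (a sketch of my own, not taken from the answers) is to read them from information_schema first:

cur.execute(
    "SELECT column_name FROM information_schema.columns "
    "WHERE table_name = %s ORDER BY ordinal_position",
    ('mytable',))
print('\t'.join(row[0] for row in cur.fetchall()))
cur.copy_to(sys.stdout, 'mytable', sep='\t', null='NULL')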

