
I've been trying to export all the tables in my database into a single CSV file.

I've tried

import MySQLdb as dbapi
import sys
import csv
import time


dbname      = 'site-local'
user        = 'root'
host        = '127.0.0.1'
password    = ''
date        = time.strftime("%d-%m-%Y")
file_name   = date+'-portal'

query='SELECT * FROM site-local;' # <---- I'm stuck here

db=dbapi.connect(host=host,user=user,passwd=password)
cur=db.cursor()
cur.execute(query)
result=cur.fetchall()

c = csv.writer(open(file_name+'.csv','wb'))
c.writerow(result)

I'm a little stuck now; I hope someone can shed some light based on what I have.

  • I'm no MySQL expert, but I'd say there is no single statement that gets every record from every table in a database. RDBMSs return result sets with a fixed set of columns, so what you're trying doesn't really fit. What I would do is query the metadata tables for a list of all tables, then query each table and append its rows to your file. Commented Jun 16, 2016 at 2:21

2 Answers


Using Python isn't the way to go about this. The simplest solution is MySQL's SELECT ... INTO OUTFILE. It can dump even very large tables into CSV format quickly, without the need for CSV writers in your Python code.

From your other questions, I understand that the reason for the dump is to re-import into PostgreSQL. If that were not the case, you could simply use the mysqldump command to dump out the entire database at once.

If you want to dump each table in CSV format, it does call for a bit of code. Create a Python loop to iterate through all the tables, then execute a SELECT ... INTO OUTFILE query on each one, as sketched below.
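
Here is a minimal sketch of that loop, reusing the credentials from the question; the /tmp output path is an assumption, and the MySQL user needs the FILE privilege because INTO OUTFILE writes files on the server's filesystem, not the client's.

import MySQLdb as dbapi

db = dbapi.connect(host='127.0.0.1', user='root', passwd='', db='site-local')
cur = db.cursor()

# List every table in the database
cur.execute("SHOW TABLES")
tables = [row[0] for row in cur.fetchall()]

for t in tables:
    # The server writes each table straight to its own CSV file on its disk
    cur.execute(
        "SELECT * FROM `{0}` "
        "INTO OUTFILE '/tmp/{0}.csv' "
        "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n'".format(t)
    )

cur.close()
db.close()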




Consider iteratively exporting the SHOW CREATE TABLE output (to .txt files) and the SELECT * FROM output (to .csv files) for every table in the database. Since your related earlier questions indicate you need to migrate databases, you can then run the create-table statements (adjusting MySQL syntax for PostgreSQL, e.g. removing the ENGINE=InnoDB lines) and import the data from the csv files with PostgreSQL's COPY command. The csv files below include the table column headers, which fetchall() does not return.

db = dbapi.connect(host=host, user=user, passwd=password, db=dbname)   # SELECT THE DATABASE SO SHOW TABLES WORKS
cur = db.cursor()

# RETRIEVE TABLES
cur.execute("SHOW TABLES")
tables = []
for row in cur.fetchall():
    tables.append(row[0])

for t in tables:    
    # CREATE TABLES STATEMENTS
    cur.execute("SHOW CREATE TABLE `{}`".format(t))
    temptxt = '{}_table.txt'.format(t)

    with open(temptxt, 'w', newline='') as txtfile:
        txtfile.write(cur.fetchone()[1])                   # ONE RECORD FETCH

    # SELECT STATEMENTS
    cur.execute("SELECT * FROM `{}`".format(t))
    tempcsv = '{}_data.csv'.format(t)

    with open(tempcsv, 'w', newline='') as csvfile:
        writer = csv.writer(csvfile)
        writer.writerow([i[0] for i in cur.description])   # COLUMN HEADERS
        for row in cur.fetchall():
            writer.writerow(row)

cur.close()
db.close()
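
If it helps, here is a hedged sketch of the PostgreSQL import step using the COPY command through psycopg2's copy_expert; it assumes the tables have already been created on the PostgreSQL side, the *_data.csv files produced by the loop above sit in the current directory, and the connection details are placeholders.

import os
import glob
import psycopg2

pg = psycopg2.connect(host='127.0.0.1', dbname='portal', user='postgres', password='')
pgcur = pg.cursor()

for path in glob.glob('*_data.csv'):
    table = os.path.basename(path).replace('_data.csv', '')
    with open(path, 'r') as f:
        # WITH CSV HEADER skips the header row written by the export loop
        pgcur.copy_expert('COPY "{}" FROM STDIN WITH CSV HEADER'.format(table), f)

pg.commit()
pgcur.close()
pg.close()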

