I have a database that contains multiple tables, and I am trying to import each table as a pandas dataframe. I can do this for a single table as follows:

import pandas as pd
import pypyodbc

# one connection, reused for every query
conn = pypyodbc.connect(
    "DRIVER={SQL Server};"
    "SERVER=serveraddress;"
    "UID=uid;"
    "PWD=pwd;"
    "DATABASE=db"
)

df1 = pd.read_sql('SELECT * FROM dbo.table1', conn)

The number of tables in the database will change, and at any time I would like to be able to import each table into its own dataframe. How can I get all of these tables into pandas?

1 Answer

Depending on your SQL server, you can query its metadata to list the tables in a database.

For example:

tables_df = pd.read_sql("SELECT TABLE_NAME AS table_name FROM information_schema.tables WHERE TABLE_TYPE = 'BASE TABLE'", conn)
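If you are on SQL Server specifically, the sys.tables catalog view gives the same information. This is just an alternative sketch, assuming the same conn object as above:

# Alternative on SQL Server: list base tables from the sys.tables catalog view
tables_df = pd.read_sql('SELECT name AS table_name FROM sys.tables', conn)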

Now that your table names are accessible as a pandas dataframe, you just need to loop over them:

table_name_list = tables_df.table_name

select_template = 'SELECT * FROM {table_name}'
frames_dict = {}
for tname in table_name_list:
    query = select_template.format(table_name=tname)
    frames_dict[tname] = pd.read_sql(query, conn)

Your dictionary frames_dict now contains all of the dataframes, keyed by table name.
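If your tables live in more than one schema (the question uses dbo.table1), a schema-qualified variant avoids ambiguity. This is only a sketch, assuming the same conn object and that bracket-quoting the identifiers is acceptable on your server:

# Variant: build schema-qualified names from information_schema
tables_df = pd.read_sql("SELECT TABLE_SCHEMA AS table_schema, TABLE_NAME AS table_name FROM information_schema.tables WHERE TABLE_TYPE = 'BASE TABLE'", conn)

frames_dict = {}
for schema, tname in zip(tables_df.table_schema, tables_df.table_name):
    query = 'SELECT * FROM [{}].[{}]'.format(schema, tname)
    frames_dict[tname] = pd.read_sql(query, conn)

# Each dataframe is then available by table name, e.g. for the question's table1:
df_table1 = frames_dict['table1']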
