Is there any way to get pandas to read a table with array-typed columns directly into native structures? By default, an int[] column ends up as an object column containing Python lists of Python ints. There are ways to convert this into a column of Series, or better, a column with a MultiIndex, but these are very slow (~10 seconds) for 500M rows. It would be much faster if the data were loaded into that form in the first place. I don't want to unroll the arrays in SQL because I have very many array columns.
import pandas as pd
import sqlalchemy

url = "postgresql://u:p@host:5432/dname"
engine = sqlalchemy.create_engine(url)
df = pd.read_sql_query("select 1.0 as a, 2.2 as b, array[1,2,3] as c;", engine)
print(df)
print(type(df.loc[0, 'c']))     # list
print(type(df.loc[0, 'c'][0]))  # int
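For reference, this is the kind of post-load conversion I mean. It is a minimal sketch on a hand-built frame (the column names are made up, no database involved): exploding the object column of lists into a long-format frame with a (row, position) MultiIndex.

```python
import itertools
import pandas as pd

# Simulate what read_sql_query returns for an array column:
# an object column holding Python lists.
df = pd.DataFrame({"a": [1.0, 2.0], "c": [[1, 2, 3], [4, 5]]})

# Length of each list, used to repeat the row index.
lengths = df["c"].str.len()

# Long format: one row per array element, indexed by
# (original row, position within the array).
flat = pd.DataFrame(
    {"c": list(itertools.chain.from_iterable(df["c"]))},
    index=pd.MultiIndex.from_arrays(
        [
            df.index.repeat(lengths),
            list(itertools.chain.from_iterable(range(n) for n in lengths)),
        ],
        names=["row", "pos"],
    ),
)
print(flat)
```

This works, but the per-element Python object handling is what makes it slow at scale, which is why I'd prefer the driver or pandas to produce this layout directly.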