I am writing a pandas DataFrame to a Redshift database using pandas.DataFrame.to_sql() like so:
from sqlalchemy import create_engine
import pandas as pd
conn = create_engine('postgresql://user:[email protected]:5439/db')
dataframe.to_sql('table_name', conn, index=False, if_exists='replace', schema='schema_name')
In this dataframe I have a timestamp column which looks like this: 2020-03-02. But when I write it to the db it gets stored as a varchar, and I can't use this column to plot graphs which require a timestamp format.
Is there a way to define the column type when writing data to a db with pandas.to_sql()? I do not want to create a view or another table out of the one I am writing, because then I would need to drop the current table, the other one would depend on the original, and I would get an error: can't drop table because other objects depend on it.
Use sqlalchemy.types and specify a schema dictionary as the dtype argument to the pd.to_sql function. Check the accepted answer in this link - pandas to_sql all columns as nvarchar. Check here for supported sqlalchemy types.
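A minimal sketch of what that looks like. It uses an in-memory SQLite engine so it runs anywhere; for Redshift you would swap in the postgresql:// connection string from the question (and the schema= argument, which SQLite does not support). The table and column names here are made up for illustration. Note that the string column must also be parsed into actual dates before writing, otherwise declaring Date() in dtype alone will not help.

```python
import pandas as pd
from sqlalchemy import create_engine
from sqlalchemy.types import Date, Integer, String

# In-memory SQLite engine so this sketch is runnable anywhere.
# For Redshift, use the postgresql://user:pass@host:5439/db URL instead.
engine = create_engine('sqlite://')

# Hypothetical data mirroring the question: dates arrive as strings,
# which is why pandas infers the column as text (varchar) by default.
df = pd.DataFrame({
    'id': [1, 2],
    'event_date': ['2020-03-02', '2020-03-03'],
    'label': ['a', 'b'],
})

# Parse the strings into real date objects first.
df['event_date'] = pd.to_datetime(df['event_date']).dt.date

# dtype maps column names to SQLAlchemy types; any column not listed
# falls back to pandas' default type inference.
df.to_sql(
    'events',
    engine,
    index=False,
    if_exists='replace',
    dtype={'event_date': Date(), 'id': Integer(), 'label': String(10)},
)
```

After this, the column is created as DATE in the database rather than varchar, so it can be used directly in time-based plots and queries.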