
I am using read_sql_table() to fetch data from SQL into Python, where the imported data looks like:

column1       column2          column3
1.0            868.0            76225.0
0.0            2767.0           2763.0

When I read this table into a DataFrame, the columns are converted to float. Since I need those columns as integers, I'm using:

df['column2'] = df['column2'].fillna(0).astype('int')  (using fillna(0) because there are NaN values)

But I also want to convert the zeroes introduced by fillna(0) back to NaN.

If I try df['column2'].replace(0, np.nan, inplace=True), this not only converts the zeroes to NaN but also converts the integers back to float.
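The round-trip described above can be reproduced with a small sketch (the values are illustrative, standing in for the data returned by read_sql_table()):

```python
import numpy as np
import pandas as pd

# A float column as it arrives from SQL, with a missing value
s = pd.Series([868.0, np.nan, 2767.0], name='column2')

ints = s.fillna(0).astype('int')   # integer dtype, but the NaN has become 0
back = ints.replace(0, np.nan)     # NaN restored, but the dtype is float again

print(ints.dtype, back.dtype)
```

Replacing 0 with np.nan forces the column back to float because NumPy-backed integer arrays have no representation for NaN.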

Any help on how to convert float to integer after read_sql_table, without changing NaN to 0?

Thanks !!

  • By default NaN is a float, hence it converts the dtype to float; you might want to consider the nullable integer datatype, which is an integer dtype that allows NaN to be present. Commented Apr 19, 2020 at 15:00
  • I get TypeError: cannot safely cast non-equivalent object to int64 when I try df['column2'].fillna(0).astype('Int64') @anky Commented Apr 19, 2020 at 15:07

1 Answer


Since pandas version 0.24.0, the nullable Int64 dtype is available, which can store NaNs in an integer array.

So, you can use df['column2'] = df['column2'].astype('Int64')

This will convert all the float values to int while keeping the NaNs intact.
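A minimal sketch of that conversion, assuming a reasonably recent pandas and values mirroring the sample data:

```python
import numpy as np
import pandas as pd

s = pd.Series([868.0, np.nan, 2767.0], name='column2')

# Cast to the nullable integer dtype: whole-number floats become
# integers and NaN is kept as the missing-value marker <NA>
out = s.astype('Int64')
print(out)
```

Note that on older pandas releases (around 0.24.x), casting a float column with NaNs directly to 'Int64' could raise the TypeError mentioned in the comments; on recent versions the cast succeeds as long as every non-missing value is a whole number.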


References:

Official pandas documentation

