Mark Andrews

Reputation: 21

How to force dtype to integer in dataframe

I am working with a dataframe that has a column in which all values are None. By default the dtype for this column is object. I need it to be an integer type, likely int64. Why is irrelevant; that's what I need.
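For context, here is a minimal sketch of the situation (the column name 'field' is taken from the snippet below; the sample data is hypothetical):

import pandas as pd

# a column holding only None ends up with dtype object,
# because None is a plain Python object
df = pd.DataFrame({'field': [None, None, None]})
print(df['field'].dtype)  # object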

I've tried the following:

df['field'] = df['field'].astype(int)

This does not work. The error is:

ValueError: Cannot convert NA to integer

I've also tried:

insert_data['data_set_key'] = pd.to_numeric(insert_data['data_set_key'], errors='coerce')

This converts the dtype to float64, which isn't at all what I am looking for.
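A quick sketch of why that happens (hypothetical data): to_numeric has no integer value it can assign to a missing entry, so it coerces None to NaN, and NaN only exists as a float:

import pandas as pd

s = pd.Series([None, None], dtype=object)
converted = pd.to_numeric(s, errors='coerce')  # None becomes NaN
print(converted.dtype)  # float64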

ADDENDUM:

The dataframe is being used by SQLAlchemy to populate a table in an Oracle database. That table has an integer field that is not populated at this time. The field in the dataframe is loaded with None, which then has a dtype of object. When this data is loaded, the column seems to be interpreted as a CLOB type and Oracle pukes all over itself. If I change None to 1 it works fine, but that's not an option.
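For reference, the load is roughly of this shape (a sketch only; the connection string and table name here are hypothetical, since the actual code isn't shown):

from sqlalchemy import create_engine

# hypothetical connection details and table name
engine = create_engine('oracle+cx_oracle://user:password@host:1521/?service_name=svc')
insert_data.to_sql('target_table', engine, if_exists='append', index=False)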

Upvotes: 1

Views: 643

Answers (1)

Mark Andrews

Reputation: 21

One thing I had not tried, because it just shouldn't work, actually is the answer. Substituting np.NaN for None results in the table being populated with NULL in the desired field.
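In code, the fix amounts to something like this (a sketch; note that np.nan is the canonical spelling, and the np.NaN alias was removed in NumPy 2.0):

import numpy as np

# replace None with NaN: the column becomes float64 instead of object,
# and NaN is written to the database as NULL
insert_data['data_set_key'] = np.nan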

Upvotes: 1
