Souvik Ray

Reputation: 3038

Selecting column values of a dataframe which is in a range and put it in appropriate columns of another dataframe in pandas

I have a csv file which is something like below

date,mean,min,max,std
2018-03-15,3.9999999999999964,inf,0.0,100.0
2018-03-16,0.46403712296984756,90.0,0.0,inf
2018-03-17,2.32452732452731,,0.0,143.2191767899579
2018-03-18,2.8571428571428523,inf,0.0,100.0
2018-03-20,0.6928406466512793,100.0,0.0,inf
2018-03-22,2.8675703858185635,,0.0,119.05383697172658
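For reference, pandas parses the literal string inf to np.inf and empty fields to NaN when reading a file like this, so no special handling is needed at load time. A minimal sketch, assuming the file is saved as data.csv (a hypothetical name):

import pandas as pd

# 'inf' is parsed to np.inf and empty fields to NaN automatically.
df1 = pd.read_csv('data.csv')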

I want to select those column values which are > 20 and < 500, i.e. in the range (20, 500), and put those values along with the date into another dataframe. The other dataframe looks something like this

Date        percentage_change  location
2018-02-14  23.44              BOM

So I want to get the date and value from the csv and add them to the new dataframe in the appropriate columns. Something like

Date        percentage_change   location
2018-02-14  23.44               BOM
2018-03-15  100.0               NaN
2018-03-16  90.0                NaN
2018-03-17  143.2191767899579   NaN
....        ....                ....

Now I am aware of functions like df.max(axis=1) and df.min(axis=1), which give you the min and max, but I am not sure how to find values based on a range. So how can this be achieved?

Upvotes: 0

Views: 38

Answers (1)

jpp

Reputation: 164793

Given dataframes df1 and df2, you can achieve this by aligning column names, cleaning the numeric data, and then using pd.DataFrame.append.

import numpy as np
import pandas as pd

# Align column names with df2 and clean the numeric data:
# map inf and NaN to 0 so they can never fall inside the (20, 500) range.
df_app = df1.loc[:, ['date', 'mean', 'min', 'std']]\
            .rename(columns={'date': 'Date'})\
            .replace(np.inf, 0)\
            .fillna(0)

# Take the larger of the 'min' and 'std' values as the percentage change.
df_app['percentage_change'] = np.maximum(df_app['min'], df_app['std'])

# Keep only the rows whose value lies between 20 and 500.
df_app = df_app[df_app['percentage_change'].between(20, 500)]

# Append the relevant columns to df2.
res = df2.append(df_app.loc[:, ['Date', 'percentage_change']])

print(res)

#          Date location  percentage_change
# 0  2018-02-14      BOM          23.440000
# 0  2018-03-15      NaN         100.000000
# 1  2018-03-16      NaN          90.000000
# 2  2018-03-17      NaN         143.219177
# 3  2018-03-18      NaN         100.000000
# 4  2018-03-20      NaN         100.000000
# 5  2018-03-22      NaN         119.053837
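Note that this relies on older pandas behaviour in two places. Series.between is inclusive on both ends by default, whereas the question asks for strictly > 20 and < 500, and DataFrame.append was deprecated in pandas 1.4 and removed in 2.0. A minimal sketch of the modern equivalent, assuming pandas >= 1.3 for the inclusive keyword:

# Strict bounds (> 20 and < 500) need inclusive='neither' (pandas >= 1.3).
df_app = df_app[df_app['percentage_change'].between(20, 500, inclusive='neither')]

# DataFrame.append was removed in pandas 2.0; pd.concat is the replacement.
res = pd.concat([df2, df_app.loc[:, ['Date', 'percentage_change']]])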

Upvotes: 1
