Reputation: 447
I have some employee data that shows the date ranges for which they have requested leave:
emp_id,emp_name,from_date,to_date
101,kevin,2018-12-01,2018-12-05
104,scott,2018-12-02,2018-12-02
I am trying to convert the above format so that each date in the range appears as a separate row, as shown below:
emp_id,emp_name,date
101,kevin,2018-12-01
101,kevin,2018-12-02
101,kevin,2018-12-03
101,kevin,2018-12-04
101,kevin,2018-12-05
104,scott,2018-12-02
Could anyone advise how I could do this in pandas? Thanks.
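For reference, a minimal setup the answers below can run against - my assumption is that from_date and to_date should be parsed as datetimes:

import io
import pandas as pd

# hypothetical setup: load the sample data with the date columns parsed as datetimes
data = """emp_id,emp_name,from_date,to_date
101,kevin,2018-12-01,2018-12-05
104,scott,2018-12-02,2018-12-02"""
df = pd.read_csv(io.StringIO(data), parse_dates=['from_date', 'to_date'])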
Upvotes: 1
Views: 47
Reputation: 527
You can iterate over each row
df_dates = pd.concat([pd.DataFrame({'Date': pd.date_range(row.from_date, row.to_date, freq='D'),
                                    'Emp_id': row.emp_id,
                                    'Emp_Name': row.emp_name},
                                   columns=['Date', 'Emp_id', 'Emp_Name'])
                      for i, row in df.iterrows()], ignore_index=True)
print(df_dates)
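If you want to avoid building one DataFrame per row, here is a sketch of an alternative (assumes pandas 0.25+ for DataFrame.explode; column names follow the question rather than this answer):

# build a list of dates for each row, then explode the lists into separate rows
df_dates = (df.assign(date=[list(pd.date_range(s, e)) for s, e in zip(df['from_date'], df['to_date'])])
              .explode('date')
              .drop(columns=['from_date', 'to_date'])
              .reset_index(drop=True))
print(df_dates)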
Upvotes: 0
Reputation: 863166
Solution if emp_id values are unique - reshape by melt and resample with ffill:
df1 = (df.melt(['emp_id','emp_name'], value_name='date')
         .set_index('date')
         .drop('variable', axis=1)
         .groupby(['emp_id', 'emp_name'])
         .resample('d')[[]]
         .ffill()
         .reset_index()
      )
print (df1)
emp_id emp_name date
0 101 kevin 2018-12-01
1 101 kevin 2018-12-02
2 101 kevin 2018-12-03
3 101 kevin 2018-12-04
4 101 kevin 2018-12-05
5 104 scott 2018-12-02
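This works because melt leaves two rows per employee (one for from_date, one for to_date), so resampling daily within each emp_id/emp_name group fills in all the dates between them.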
Another solution - more general, only a default RangeIndex is necessary:
#ensure a default RangeIndex first (uncomment if needed)
#df = df.reset_index(drop=True)
df1 = (df.reset_index()
         .melt(['emp_id','emp_name','index'], value_name='date')
         .set_index('date')
         .drop('variable', axis=1)
         .groupby(['index'])
         .resample('d')[['emp_id','emp_name']]
         .ffill()
         .reset_index(level=0, drop=True)
         .reset_index()
      )
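Here the temporary index column keeps every original row in its own resample group, so duplicated emp_id/emp_name pairs are still expanded independently; reset_index(level=0, drop=True) then discards it again.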
Or use concat of Series created by date_range with itertuples, and then join:
df1 = (pd.concat([pd.Series(r.Index,
                            pd.date_range(r.from_date, r.to_date))
                  for r in df.itertuples()])
         .reset_index())
df1.columns = ['date','idx']
df1 = df1.set_index('idx').join(df[['emp_id','emp_name']]).reset_index(drop=True)
print (df1)
date emp_id emp_name
0 2018-12-01 101 kevin
1 2018-12-02 101 kevin
2 2018-12-03 101 kevin
3 2018-12-04 101 kevin
4 2018-12-05 101 kevin
5 2018-12-02 104 scott
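Each Series holds the original row label (r.Index) as its values and the per-row date_range as its index, so after reset_index the join on idx brings back the matching emp_id and emp_name.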
Upvotes: 1