
Reputation: 75

Group column by level and other column by other level pandas

I have weather data sampled hourly; each row contains Temp, Humidity, and Speed:

Timestamp      Humidity Temp    Speed
01/01/2019 00:00    57  23  2.222222222
01/01/2019 01:00    56  23  1.944444444
01/01/2019 02:00    55  23  1.944444444
01/01/2019 03:00    54  22  1.944444444
01/01/2019 04:00    55  22  1.944444444
01/01/2019 05:00    56  22  1.666666667
01/01/2019 06:00    57  22  1.666666667
01/01/2019 07:00    57  22  1.666666667
01/01/2019 08:00    57  23  1.944444444
01/01/2019 09:00    57  23  1.944444444
01/01/2019 10:00    55  23  2.222222222

I want to create a dataframe that contains the following :

Timestamp (Daily), Timestamp (Hourly), Temp (Hourly), Humidity (Hourly), Speed (Hourly), Temp (Daily), Humidity (Daily), Speed (Daily)

What is the fastest way of doing this (using a single groupby, for example)?

Below is my attempt:

        weather_data = weather_data.groupby(pd.Grouper(key='Timestamp', freq='1d'),pd.Grouper(key='Timestamp', freq='1d'))\
            .agg(Temp_Daily = ('Temp','mean'),
                 Humidity_Daily=('Humidity', 'mean'),
                 Speed_Daily=('Speed', 'mean'),
                 Temp = ('Temp', lambda x:x),
                 Humidity=('Humidity', lambda x: x),
                 Speed=('Speed', lambda x: x)
                 ).reset_index()

Upvotes: 1

Views: 33

Answers (1)

Alexander

Reputation: 109626

If I understand your intended output correctly, you can concatenate the daily averages to the original dataframe (after adjusting the column names).

import pandas as pd

# Sample data.
df = pd.DataFrame({
    'Timestamp': pd.date_range('2019-01-01 00:00', '2019-01-01 10:00', freq='H'),
    'Humidity': [57, 56, 55, 54, 55, 56, 57, 57, 57, 57, 55],
    'Temp': [23, 23, 23, 22, 22, 22, 22, 22, 23, 23, 23],
    'Speed': [2.222222222, 1.944444444, 1.944444444, 1.944444444, 1.944444444, 1.666666667, 1.666666667, 1.666666667, 1.944444444, 1.944444444, 2.222222222]
})


# Solution: broadcast the daily means back onto the hourly rows,
# then concatenate them next to the (suffixed) hourly columns.
df_daily = (
    df
    .groupby(df['Timestamp'].dt.date)[['Humidity', 'Temp', 'Speed']]
    .transform('mean')                 # daily mean repeated on every hourly row
    .add_suffix(' (daily)')
    .set_index(df['Timestamp'])
)
result = pd.concat([df.set_index('Timestamp').add_suffix(' (Hourly)'), df_daily], axis=1)

>>> result.shape
(11, 6)

>>> result
                     Humidity (Hourly)  Temp (Hourly)  Speed (Hourly)  \
Timestamp                                                               
2019-01-01 00:00:00                 57             23        2.222222   
2019-01-01 01:00:00                 56             23        1.944444   
2019-01-01 02:00:00                 55             23        1.944444   
2019-01-01 03:00:00                 54             22        1.944444   
2019-01-01 04:00:00                 55             22        1.944444   
2019-01-01 05:00:00                 56             22        1.666667   
2019-01-01 06:00:00                 57             22        1.666667   
2019-01-01 07:00:00                 57             22        1.666667   
2019-01-01 08:00:00                 57             23        1.944444   
2019-01-01 09:00:00                 57             23        1.944444   
2019-01-01 10:00:00                 55             23        2.222222   

                     Humidity (daily)  Temp (daily)  Speed (daily)  
Timestamp                                                           
2019-01-01 00:00:00                56     22.545455       1.919192  
2019-01-01 01:00:00                56     22.545455       1.919192  
2019-01-01 02:00:00                56     22.545455       1.919192  
2019-01-01 03:00:00                56     22.545455       1.919192  
2019-01-01 04:00:00                56     22.545455       1.919192  
2019-01-01 05:00:00                56     22.545455       1.919192  
2019-01-01 06:00:00                56     22.545455       1.919192  
2019-01-01 07:00:00                56     22.545455       1.919192  
2019-01-01 08:00:00                56     22.545455       1.919192  
2019-01-01 09:00:00                56     22.545455       1.919192  
2019-01-01 10:00:00                56     22.545455       1.919192  
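An equivalent approach, if you prefer an explicit join over positional concatenation, is to aggregate the daily means into their own frame and merge them back on the day key. This is only a sketch of the same idea (the `day` helper column and the `' (Daily)'` suffix are naming choices, not anything from the question):

```python
import pandas as pd

# Same sample hourly data as above.
df = pd.DataFrame({
    'Timestamp': pd.date_range('2019-01-01 00:00', '2019-01-01 10:00', freq='h'),
    'Humidity': [57, 56, 55, 54, 55, 56, 57, 57, 57, 57, 55],
    'Temp': [23, 23, 23, 22, 22, 22, 22, 22, 23, 23, 23],
    'Speed': [2.222222, 1.944444, 1.944444, 1.944444, 1.944444,
              1.666667, 1.666667, 1.666667, 1.944444, 1.944444, 2.222222],
})

# One row of means per calendar day, columns suffixed to avoid collisions.
daily = (
    df.groupby(df['Timestamp'].dt.normalize())[['Humidity', 'Temp', 'Speed']]
      .mean()
      .add_suffix(' (Daily)')
)

# Merge each hourly row with its day's averages via a temporary day key.
result = (
    df.assign(day=df['Timestamp'].dt.normalize())
      .merge(daily, left_on='day', right_index=True)
      .drop(columns='day')
)
```

The merge-based version keeps the hourly `Timestamp` as a regular column (handy if you still want both the daily and hourly timestamps visible), at the cost of one extra join compared with the `transform` + `concat` answer above.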

Upvotes: 1
