Reputation: 1765
I've been trying to plot a stacked bar chart from a dataframe for several hours. I'm sorry if this is a basic question, but I just can't make it work and I need help.
My dataframe looks like this:
_id date news_source
0 2715eeada6726024df20e6938ef09f64 2019-12-23 airport-suppliers.com
1 d068a3d0b24d2a348ff8c8a856aba86c 2019-12-23 airport-suppliers.com
17 552d7bb9f7d3fd689dd308dc7650baac 2019-12-23 airport-suppliers.com
20 82be33a041204fd008ba5093607310f6 2019-12-23 airport-suppliers.com
21 4044907f5b6d5610ec59a03c75e0554c 2019-12-23 airportsinternational.keypublishing.com
22 db4e1e4d1246abc3304e5d77688424dc 2019-12-23 airportsinternational.keypublishing.com
23 b7f57b63218190d249d19624bbdcb520 2019-12-23 internationalairportreview.com
27 84d5377bd8755a685100e408140c4ab1 2019-12-23 internationalairportreview.com
28 8289a1c1b3fa3f618c332d61023eae00 2019-12-16 passengerterminaltoday.com
29 f4f020f09ee5f95499a26c43cfd82d2d 2019-12-16 airportsinternational.keypublishing.com
.. ... ... ...
59 a18388a1c77889bdbe6aaa9238a8d21a 2019-12-16 airport-suppliers.com
62 5cd894a9fa587ab4267adfd23f01e1c4 2019-12-16 airportsinternational.keypublishing.com
66 bb7d05d61f999b1f0b317d21c6c23c0c 2019-12-16 airportsinternational.keypublishing.com
70 f49b9ce330198aec666cb90275d293b2 2019-12-16 internationalairportreview.com
71 af893db09fad9335413ce5c325ced712 2019-12-16 passengerterminaltoday.com
72 e21dc60cfda457b03a6dba6ab44aa3b1 2019-12-16 passengerterminaltoday.com
81 963760af4b4653d175902f4d6285ff0a 2019-12-16 passengerterminaltoday.com
82 778b572be28fd25f394cfa41bbc5aa4a 2019-12-16 airport-suppliers.com
The final plot I want is like this one, but with weekly dates instead of strategies, news_source instead of Products, and the counts staying the same.
What I tried was to group by date and news_source and then count the rows. After that, the rest of my work got messed up, and in the end I couldn't get the data into a format like the one in the example. Also, the number of unique news_source and date values may change over time, so I'm avoiding hardcoding things as much as I can.
The grouping:
groups = df.groupby(['date', 'news_source'])["_id"].count()
If you need them as a dictionary (note the defaultdict import):
from collections import defaultdict

counts = defaultdict(dict)
for index, count in zip(groups.index, groups):
    try:
        counts[index[0]][index[1]] += count
    except KeyError:
        counts[index[0]][index[1]] = count
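(The same dictionary can also be built with Series.items(); since groupby produces each (date, news_source) pair only once, the try/except isn't strictly needed. A minimal sketch:)

counts = defaultdict(dict)
# groups.items() yields ((date, news_source), count) pairs
for (date, source), count in groups.items():
    counts[date][source] = count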
Output is:
{'2019-12-16': {'airport-suppliers.com': 9,
'airportsinternational.keypublishing.com': 12,
'internationalairportreview.com': 19,
'passengerterminaltoday.com': 21},
'2019-12-23': {'airport-suppliers.com': 21,
'airportsinternational.keypublishing.com': 2,
'internationalairportreview.com': 5}}
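(For completeness, that nested dictionary can be turned into a stacked bar chart directly; a minimal sketch, assuming matplotlib is available:)

import pandas as pd
import matplotlib.pyplot as plt

# Outer keys (dates) become columns in pd.DataFrame(counts), so transpose to
# put the dates on the x-axis; fill missing (date, source) combinations with 0.
plot_df = pd.DataFrame(counts).T.fillna(0)
plot_df.plot.bar(stacked=True)
plt.show()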
If you know how to do this properly, any help would be appreciated, thanks.
Here is the code to generate a minimal reproducible example:
import pandas as pd
dates = ['2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-23', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16', '2019-12-16']
sources = ['airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'passengerterminaltoday.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'airport-suppliers.com', 'passengerterminaltoday.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'airport-suppliers.com', 'passengerterminaltoday.com', 'airport-suppliers.com', 'airport-suppliers.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airport-suppliers.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'airportsinternational.keypublishing.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'passengerterminaltoday.com', 'airport-suppliers.com', 'airport-suppliers.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com', 'internationalairportreview.com']
df = pd.DataFrame({"date": dates, "news_source": sources})
Upvotes: 0
Views: 41
Reputation: 2348
How about this? First, I computed counts for your data:
df1 = df.groupby(['date', 'news_source']).size().reset_index().rename(columns={0:'count'})
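(An equivalent, slightly more compact way to build df1, if you prefer, is to pass name= directly to reset_index:)

df1 = df.groupby(['date', 'news_source']).size().reset_index(name='count')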
Then, I used pd.crosstab with the index, columns, and values parameters set as below, plus an aggfunc, which is sum in this case.
pd.crosstab(index=df1['date'], columns=df1['news_source'], values=df1['count'], aggfunc=sum).plot.bar(stacked=True)
Result:
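If you don't need the intermediate df1, a shorter equivalent (a sketch of the same idea using unstack instead of crosstab) is:

import matplotlib.pyplot as plt

# Count rows per (date, news_source), pivot the news_source level into
# columns, and plot the resulting table as stacked bars.
(df.groupby(['date', 'news_source'])
   .size()
   .unstack(fill_value=0)
   .plot.bar(stacked=True))
plt.show()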
Upvotes: 1