Reputation: 79
I have built an API that parses the GitHub contribution data of an account, arranges it by month, week, or day, and returns it as JSON.
Responding to just one request takes approximately 2 seconds (about 1800 ms).
Link to my GitHub repository.
contributions.py in the repository is the Python code that does the above.
THE POINT OF THE QUESTION: What makes my API slow?
Is it just too much data to parse (about 365 entries)? Or the way the API builds the JSON string?
Thank you in advance for your answers and help.
Upvotes: 0
Views: 89
Reputation: 445
"Why is my code slow?" is a really hard question to answer. There's basically an unlimited number of possible reasons that could be. I may not be able to answer the question, but I can provide some suggestions to hopefully help you answer it for yourself.
There are dozens of questions to ask... What kind of hardware are you using? What kind of network/internet connection do you have? Is it just slow on the first request, or all requests? Is it just slow on the call to one type of request (daily, weekly, monthly) or all? etc. etc.
You indicate overall request times of ~1800 ms, but as you pointed out, there is a lot happening during the processing of that request. In my experience, one of the best ways to find out is often to add some timing code to narrow down the scope of the slowness.
For example, one quick and dirty way to do this is to use the Python time module. I quickly added some code to the weekly contributions method:
import time
# [...]
@app.route("/contributions/weekly/<uname>")
def contributionsWeekly(uname):
before = time.time()
rects = getContributionsElement(uname)
after = time.time()
timeToGetContribs = after - before
# [...]
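# (timeToIterateRects and timeToBuildJson are captured the same way in the elided code above)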
print(' timeToGetContribs: ' + str(timeToGetContribs))
print('timeToIterateRects: ' + str(timeToIterateRects))
print(' timeToBuildJson: ' + str(timeToBuildJson))
Running this code locally produced the following results:
timeToGetContribs: 0.8678717613220215
timeToIterateRects: 0.011543750762939453
timeToBuildJson: 1.5020370483398438e-05
(Note the e-05 on the end of the last time: that is scientific notation, so roughly 0.000015 seconds, a very tiny amount of time.)
From these results, we know that getting the contributions is taking the majority of the full request time. Now we can drill down into that method to try to further isolate the most time-consuming part. The next set of results shows:
timeToOpenUrl: 0.5734567642211914
timeToInstantiateSoup: 0.3690469264984131
timeToFindRects: 0.0023255348205566406
From this it appears that the majority of the time is spent actually opening the URL and retrieving the HTML (meaning that network latency, internet connection speed, GitHub server response time, etc. are the likely suspects). The next heaviest is the time it actually takes to instantiate the BeautifulSoup parser.
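For reference, here is a minimal sketch of what that drilled-down instrumentation might look like. I'm assuming getContributionsElement fetches the public contributions page with urllib and parses it with BeautifulSoup, as the timer names suggest; the real URL and parsing details are in contributions.py, so treat these as placeholders:
import time
import urllib.request
from bs4 import BeautifulSoup

def getContributionsElement(uname):
    # Placeholder URL; the actual one comes from contributions.py
    url = 'https://github.com/users/' + uname + '/contributions'

    before = time.time()
    html = urllib.request.urlopen(url).read()
    timeToOpenUrl = time.time() - before          # network latency + GitHub response time

    before = time.time()
    soup = BeautifulSoup(html, 'html.parser')
    timeToInstantiateSoup = time.time() - before  # parser construction / HTML parsing

    before = time.time()
    rects = soup.find_all('rect')                 # one <rect> per day in the contribution graph
    timeToFindRects = time.time() - before

    print('        timeToOpenUrl: ' + str(timeToOpenUrl))
    print('timeToInstantiateSoup: ' + str(timeToInstantiateSoup))
    print('      timeToFindRects: ' + str(timeToFindRects))
    return rects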
Take all of these concrete numbers with a grain of salt. These are on my hardware (a 12-year-old PC) and my local internet connection. On your system, the numbers will likely vary, and may even be significantly different. The point is, the best way to track down slowness is to go through some basic troubleshooting steps to identify where the slowness is occurring. After you've identified the problem area(s), you can likely search for more specific answers, or ask more targeted questions.
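If you want more detail than hand-placed timers, Python's built-in cProfile module can break the request down by function automatically. A minimal sketch, assuming it runs in the same module where getContributionsElement is defined and using a made-up username:
import cProfile
import pstats

# Profile one call and save the stats to a file
cProfile.run('getContributionsElement("some-username")', 'contrib.prof')

# Print the 10 functions with the highest cumulative time
pstats.Stats('contrib.prof').sort_stats('cumulative').print_stats(10)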
Upvotes: 1