Reputation: 3
I have a problem. I'm working with a 400 MB Postgres database, and I have to perform searches with a lot of different filters. It takes around a minute to load the page.
Here's an example from views.py; the task is to search for all permutations of a word's letters, like cat > act > atc, etc.:
def neighb(data, word):
    count = 0
    words = []
    mylist = re.findall(r'\w', word)
    combinations = list(itertools.permutations(mylist))
    result = ''
    for comb in combinations:
        for letters in comb:
            result += letters
        if data.filter(form_v=result).exists():
            count += 1
            words.append(result)
        result = ''
    return count, words
So, is there some way to make it faster?
Upvotes: 0
Views: 68
Reputation: 15370
There are a few things you are not doing optimally.
First: don't build strings like this:

    for letters in comb:
        result += letters

Do this instead:

    result = ''.join(comb)
Second: you should always try to make as few DB queries as possible. Your code runs one query per combination. Instead, filter by all the combinations at once and fetch the words that actually exist in the DB. That way you issue only one query.
    def neighb(data, word):
        mylist = re.findall(r'\w', word)
        combinations = [''.join(c) for c in itertools.permutations(mylist)]
        words = list(data.filter(form_v__in=combinations).values_list('form_v', flat=True))
        return len(words), words
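One further tweak worth considering: permutations of a word with repeated letters produce duplicate strings (e.g. 'noon' yields 24 permutations but only 12 distinct ones), so deduplicating with a set before querying shrinks the IN clause. Here is a minimal sketch of the idea outside Django, where a plain Python set (`db_words`, a hypothetical stand-in) plays the role of the table and the membership test mimics the single `form_v__in` query:

```python
import itertools
import re


def neighb(db_words, word):
    """Return (count, matches) of anagrams of `word` found in db_words."""
    letters = re.findall(r'\w', word)
    # A set comprehension drops duplicate permutations up front,
    # so the subsequent lookup handles each candidate only once.
    combinations = {''.join(p) for p in itertools.permutations(letters)}
    # One pass over the candidates, analogous to one `__in` query.
    found = sorted(c for c in combinations if c in db_words)
    return len(found), found


count, words = neighb({'act', 'cat', 'tac', 'dog'}, 'cat')
# count is 3, words is ['act', 'cat', 'tac']
```

In the Django version this just means building `combinations` with a set comprehension instead of a list comprehension; `filter(form_v__in=...)` accepts any iterable.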
Upvotes: 1