erogol

Reputation: 13614

How to compute vector-to-matrix Hamming distance in Python?

I'd like to develop a query system that finds the items most similar to a given one, based on a binary signature extracted from the data. I'm looking for the most efficient approach since I have runtime constraints. I tried scipy's distance functions, but they were too slow. Do you know any other useful library or trick to make this faster?

As an example scenario:

I have a binary query vector of length 68, and a dataset matrix of size 3000K x 68. I'd like to find the item in this matrix most similar to the given query using Hamming distance.

Thanks for any comments.
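For reference, a straightforward NumPy broadcasting baseline for this setup (a sketch using a smaller random dataset in place of the real 3000Kx68 matrix) could look like:

```python
import numpy as np

# Small stand-in for the 3000Kx68 dataset described above.
data = np.random.randint(0, 2, (3000, 68)).astype(bool)
query = np.random.randint(0, 2, 68).astype(bool)

# Broadcast the query against every row and count the mismatching bits.
dists = (data != query).sum(axis=1)
nearest = data[dists.argmin()]
```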

Upvotes: 2

Views: 2724

Answers (3)

Jan Kuiken

Reputation: 2010

Nice problem; I liked the answers of Alex and Piotr. My first naive attempt also resulted in a solution time of around 800 ms (on my system). My second attempt, using numpy's (un)packbits, gave a 4x speed increase.

import numpy as np

LENGTH = 68
K = 1024
DATASIZE = 3000 * K
DATA = np.random.randint(0, 2, (DATASIZE, LENGTH)).astype(bool)

def RandomVect():
    return np.random.randint(0, 2, (LENGTH)).astype(bool)

def HammingDist(vec1, vec2):
    return np.sum(np.logical_xor(vec1, vec2))

def SmallestHamming(vec):
    XorData = np.logical_xor(DATA, vec[np.newaxis, :])
    Lengths = np.sum(XorData, axis=1)
    return DATA[np.argmin(Lengths)]  # returns first smallest

def main():
    v1 = RandomVect()
    v2 = SmallestHamming(v1)
    print(HammingDist(v1, v2))

# okay, let's try to make it faster... (using numpy.(un)packbits)

DATA2 = np.packbits(DATA, axis=1)
NBYTES = DATA2.shape[-1]

BYTE2ONES = np.zeros((256), dtype=np.uint8)
for i in range(0,256):
    BYTE2ONES[i] = np.sum(np.unpackbits(np.uint8(i)))

def RandomVect2():
    return np.packbits(RandomVect())

def HammingDist2(vec1, vec2):
    v1 = np.unpackbits(vec1)
    v2 = np.unpackbits(vec2)
    return np.sum(np.logical_xor(v1, v2))

def SmallestHamming2(vec):
    XorData = DATA2 ^ vec[np.newaxis, :]
    Lengths = np.sum(BYTE2ONES[XorData], axis=1)
    return DATA2[np.argmin(Lengths)]  # returns first smallest

def main2():
    v1 = RandomVect2()
    v2 = SmallestHamming2(v1)
    print(HammingDist2(v1, v2))

Upvotes: 1

Alex Monras

Reputation: 195

I would be surprised if there were a significantly faster way than this: put your data into a pandas DataFrame (M), one vector per column, and your target vector into a pandas Series (x),

import numpy as np
import pandas as pd

rows = 68
columns = 3000
M = pd.DataFrame(np.random.rand(rows,columns)>0.5)
x = pd.Series(np.random.rand(rows)>0.5)

then do the following:

%timeit M.apply(lambda y: x==y).astype(int).sum().idxmax()

1 loop, best of 3: 746 ms per loop

Edit: Actually, I am surprised that this is a much faster way:

%timeit M.eq(x, axis=0).astype(int).sum().idxmax()

100 loops, best of 3: 2.68 ms per loop
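For what it's worth, `M.eq(x, axis=0)` counts the matching positions per column, and maximizing matches is equivalent to minimizing the Hamming distance (distance = rows - matches). The same computation in plain NumPy broadcasting might look like this (a sketch, not benchmarked here):

```python
import numpy as np

rows, columns = 68, 3000
M = np.random.rand(rows, columns) > 0.5  # one candidate vector per column
x = np.random.rand(rows) > 0.5           # query vector

# Counting equal entries per column; maximizing matches is the same as
# minimizing Hamming distance, since distance = rows - matches.
matches = (M == x[:, np.newaxis]).sum(axis=0)
best_col = matches.argmax()
```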

Upvotes: 0

Piotr Migdal

Reputation: 12792

Use cdist from SciPy:

from scipy.spatial.distance import cdist
Y = cdist(XA, XB, 'hamming')

Computes the normalized Hamming distance, or the proportion of those vector elements between two n-vectors u and v which disagree. To save memory, the matrix X can be of type boolean.

Reference: http://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.distance.cdist.html
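Applied to the question's setup, this might look as follows (a sketch with a smaller random dataset; `cdist` expects 2-D inputs, so the query is passed as a 1x68 matrix, and the returned distances are normalized by the vector length):

```python
import numpy as np
from scipy.spatial.distance import cdist

data = np.random.randint(0, 2, (3000, 68)).astype(bool)  # dataset rows
query = np.random.randint(0, 2, (1, 68)).astype(bool)    # 2-D for cdist

d = cdist(query, data, 'hamming')  # shape (1, 3000), values in [0, 1]
nearest = data[d.argmin()]         # row with the smallest distance
```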

Upvotes: 0
