Arindam Bose

Reputation: 197

MemoryError while calculating silhouette_score

I am running a KMeans clustering algorithm on a matrix with shape (190868, 35), using the following code:

for n_clusters in range(3, 10):
    kmeans = KMeans(init='k-means++', n_clusters=n_clusters, n_init=30)
    kmeans.fit(matrix)
    clusters = kmeans.predict(matrix)
    silhouette_avg = silhouette_score(matrix, clusters)
    print("For n_clusters =", n_clusters, "The avg silhouette_score is:", silhouette_avg)

and I am getting the following error:

Traceback (most recent call last):

  File "<ipython-input-6-be918e90030a>", line 5, in <module>
    silhouette_avg=silhouette_score(matrix,clusters)

  File "C:\Users\arindam\Anaconda3\lib\site-packages\sklearn\metrics\cluster\unsupervised.py", line 101, in silhouette_score
    return np.mean(silhouette_samples(X, labels, metric=metric, **kwds))

  File "C:\Users\arindam\Anaconda3\lib\site-packages\sklearn\metrics\cluster\unsupervised.py", line 169, in silhouette_samples
    distances = pairwise_distances(X, metric=metric, **kwds)

  File "C:\Users\arindam\Anaconda3\lib\site-packages\sklearn\metrics\pairwise.py", line 1247, in pairwise_distances
    return _parallel_pairwise(X, Y, func, n_jobs, **kwds)

  File "C:\Users\arindam\Anaconda3\lib\site-packages\sklearn\metrics\pairwise.py", line 1090, in _parallel_pairwise
    return func(X, Y, **kwds)

  File "C:\Users\arindam\Anaconda3\lib\site-packages\sklearn\metrics\pairwise.py", line 246, in euclidean_distances
    distances = safe_sparse_dot(X, Y.T, dense_output=True)

  File "C:\Users\arindam\Anaconda3\lib\site-packages\sklearn\utils\extmath.py", line 140, in safe_sparse_dot
    return np.dot(a, b)

MemoryError

If anyone knows a solution to this, please suggest one. I have tried specifying sample_size = 70000; the code runs but consumes all the memory and the system freezes. I have a Lenovo ThinkPad with 16 GB of RAM and an i7 processor.

Upvotes: 1

Views: 1005

Answers (1)

bbrady

Reputation: 36

MemoryError means there is not enough memory to allocate the NumPy array that silhouette_score builds internally: it computes the full pairwise distance matrix, which is n_samples × n_samples. So the solution is either to use less memory or to increase the available memory:
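A quick back-of-envelope check (using the shape from the question) shows why the full distance matrix cannot fit in 16 GB of RAM:

```python
# Rough estimate of the memory the full pairwise distance matrix needs.
# It is an n x n array of float64 (8 bytes per element).
n_samples = 190868                          # number of rows in the question's matrix
bytes_needed = n_samples ** 2 * 8
gib_needed = bytes_needed / 2 ** 30
print(f"~{gib_needed:.0f} GiB needed")      # hundreds of GiB, far beyond 16 GB of RAM
```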

Solution 1. Reduce memory usage by passing sample_size to silhouette_score

reference : https://stackoverflow.com/a/16425008/1229868

How to find the maximum suitable sample_size?

from sklearn import metrics

def eval_silhouette_score(matrix, clusters, sample_size):
    try:
        return metrics.silhouette_score(matrix, clusters, sample_size=sample_size)
    except MemoryError:
        return None

# Shrink the sample until silhouette_score fits in memory.
div_factor = 1.
silhouette_avg = None
while silhouette_avg is None:
    sample_size = int(len(clusters) / div_factor)
    silhouette_avg = eval_silhouette_score(matrix, clusters, sample_size)
    div_factor += 1.
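Note that the subsample is drawn randomly, so the score varies between runs; passing a fixed random_state to silhouette_score makes it reproducible. A minimal sketch with hypothetical stand-in data (the real matrix from the question is assumed to be much larger):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.RandomState(0)
matrix = rng.rand(5000, 35)                 # stand-in for the real (190868, 35) data

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
clusters = kmeans.fit_predict(matrix)

# sample_size limits the pairwise distance matrix to a 2000-row subsample;
# random_state fixes which rows are drawn, so repeated runs agree.
score = silhouette_score(matrix, clusters, sample_size=2000, random_state=42)
print(score)
```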

Solution 2. Install more physical memory :)

Upvotes: 1
