Khalid

Reputation: 661

Python OSError: [Errno 28] No space left on device

I get the following error while exporting a pandas DataFrame to a CSV file. I have enough space on my hard disk.

OSError: [Errno 28] No space left on device

What can be the reason for this? Many thanks in advance.

Upvotes: 29

Views: 87716

Answers (7)

DialFrost

Reputation: 1770

The ENOSPC ("No space left on device") error will be triggered in any situation in which the data or the metadata associated with an I/O operation can't be written down anywhere because of lack of space. This doesn't always mean disk space – it could mean physical disk space, logical space (e.g. maximum file length), space in a certain data structure or address space. For example you can get it if there isn't space in the directory table (vfat) or there aren't any inodes left. It roughly means "I can't find where to write this down".

Source

Python causing: IOError: [Errno 28] No space left on device: '../results/32766.html' on disk with lots of space

Also

It turns out the best solution for me here was to just reformat the drive. Once reformatted all these problems were solved.

Possible solutions
  • Reformat your drive (the OP of the other question has not updated their answer)
  • Clear out the /tmp directory, e.g. with rm -rf /tmp/*
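
If you want to check quickly whether it is bytes or something like inodes that has run out, here is a minimal diagnostic sketch in Python (assuming a POSIX system; target_dir is just a placeholder for the directory you are writing the CSV to):

    import os
    import shutil

    target_dir = "."   # hypothetical: the directory you are exporting the CSV to

    # Free bytes vs. total bytes on the filesystem holding target_dir
    usage = shutil.disk_usage(target_dir)
    print(f"Free space: {usage.free / 2**30:.1f} GiB of {usage.total / 2**30:.1f} GiB")

    # Free inodes vs. total inodes (POSIX only); running out of inodes also raises ENOSPC
    stats = os.statvfs(target_dir)
    print(f"Free inodes: {stats.f_favail} of {stats.f_files}")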

Upvotes: 1

Rohan Anand

Reputation: 3

I’ve encountered this issue myself and, although I haven’t found a permanent solution yet, a temporary workaround is to use Google Colab. Google Colab provides a cloud-based environment with more disk space, which can help avoid the "No space left on device" error you're facing. You can easily upload your pandas DataFrame to Google Drive and export it to a CSV file from there.
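
If you go that route, a minimal sketch of writing the DataFrame to Drive from a Colab notebook could look like this (the file name output.csv is just an example):

    # Only works inside a Google Colab notebook, where the google.colab package is available
    from google.colab import drive

    drive.mount('/content/drive')   # prompts you to authorize access to your Google Drive

    # 'output.csv' is a hypothetical file name; MyDrive is the root of your Drive after mounting
    df.to_csv('/content/drive/MyDrive/output.csv')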

Upvotes: 0

Morye

Reputation: 51

I was using pipenv to install a package.
I didn't have enough space in the /tmp folder on Linux, so the solution was to temporarily point the TMPDIR environment variable at another folder:

export TMPDIR=/path/to/directory/with/lot/of/space

More info about TMPDIR: docs.python
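
For pipenv the shell export above is what matters (it is inherited by the pipenv process), but if a Python script itself is filling /tmp, here is a minimal sketch of the same idea from inside Python (the path is just a placeholder):

    import os
    import tempfile

    # Hypothetical directory with plenty of free space
    os.environ["TMPDIR"] = "/path/to/directory/with/lot/of/space"

    # tempfile caches its temp directory after first use, so set it explicitly as well;
    # later tempfile calls (and subprocesses inheriting TMPDIR) will use the new location.
    tempfile.tempdir = os.environ["TMPDIR"]

    print(tempfile.gettempdir())   # should now print the directory set above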

Upvotes: 4

Cesar Galindo

Reputation: 1

Try truncating the .log files under /var/log.

Examples:

sudo truncate -s 0 /var/log/celery/*log
sudo truncate -s 0 /var/www/html/app/stderr.log

Upvotes: 0

Bastien Ho

Reputation: 829

Late answer, but maybe useful.

I encountered a similar issue:

OSError: [Errno 28] No space left on device:

But I had enough space:

$ df -h
Filesystem                  Taille Utilisé Dispo Uti% Mounted on
/dev/dm-0                   5,6G    3,2G  2,1G  62% /
udev                         10M       0   10M   0% /dev
tmpfs                       4,8G    481M  4,3G  10% /run
tmpfs                        12G    8,0K   12G   1% /dev/shm
tmpfs                       5,0M       0  5,0M   0% /run/lock
tmpfs                        12G       0   12G   0% /sys/fs/cgroup
tmp                         945M     23M  858M   3% /tmp
var                         5,6G    3,3G  2,0G  64% /var
data                         20G     11G  8,6G  56% /data

In fact, it's not a size problem, it's a count problem.

The evidence:

$ df -hi
Filesystem                  Inodes IUsed  IFree IUse% Mounted on
/dev/dm-0                   367K   307K    61K   84% /
udev                        3,0M    365   3,0M    1% /dev
tmpfs                       3,0M    597   3,0M    1% /run
tmpfs                       3,0M      2   3,0M    1% /dev/shm
tmpfs                       3,0M      9   3,0M    1% /run/lock
tmpfs                       3,0M     13   3,0M    1% /sys/fs/cgroup
tmp                          61K   5,1K    56K    9% /tmp
var                         367K    95K   273K   26% /var
data                        1,3M   1,3M      0  100% /data

There are no inodes left, so no new files can be created.

And this is clearly due to session management:

$ du --max-depth=1 --inodes -h /data/my-python-program/data/
1,2M    /data/my-python-program/data/sessions
178 /data/my-python-program/data/medias

The quick fix is to empty the session directory.

The clean fix is to clean up the sessions automatically, but at the time of writing I hadn't looked into how to do that.
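
For what it's worth, here is a minimal sketch of such an automatic cleanup (the sessions path and the 30-day cut-off are just examples, not what the application actually uses):

    import os
    import time

    SESSIONS_DIR = "/data/my-python-program/data/sessions"   # hypothetical path
    MAX_AGE = 30 * 24 * 3600                                  # 30 days, in seconds

    now = time.time()
    for name in os.listdir(SESSIONS_DIR):
        path = os.path.join(SESSIONS_DIR, name)
        # Remove only plain files that have not been modified for MAX_AGE seconds
        if os.path.isfile(path) and now - os.path.getmtime(path) > MAX_AGE:
            os.remove(path)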

Upvotes: 2

Dev Arora

Reputation: 157

Assuming you actually do have enough space on disk, i.e. nothing funny is going on disk-space-wise (temp files are being cleaned up, the OS is correctly reporting the amount of free space)...

This error while exporting with pandas specifically can be triggered when the DataFrame in question is too large for the write operation to handle in one go. You can try writing the DataFrame to CSV in batches instead. Done manually, that looks something like:

    INTERVAL = 50                     # number of rows per chunk
    num_rows = len(df)

    # Chunk boundaries: 0, INTERVAL, 2*INTERVAL, ..., num_rows
    indices = list(range(0, num_rows, INTERVAL))
    indices.append(num_rows)

    for i in range(len(indices) - 1):
        start = indices[i]
        end = indices[i + 1]
        table_segment = df.iloc[start:end]   # end is exclusive, so boundary rows are not duplicated
        if i == 0:
            # First chunk: create the file and write the header
            table_segment.to_csv("./test2.csv")
        else:
            # Later chunks: append without repeating the header
            table_segment.to_csv("./test2.csv", mode='a', header=False)

EDIT: Alternatively, the (new?) chunksize argument of DataFrame.to_csv does the same thing:

df.to_csv(path_or_buf="./test3.csv", chunksize=INTERVAL)

Lastly, there is a good answer explaining this error generally here: Python causing: IOError: [Errno 28] No space left on device: '../results/32766.html' on disk with lots of space

Upvotes: 1

Kartik Thakral

Reputation: 1

I was working with Docker containers, and increasing their swap space bypassed the error for me.

Upvotes: 0
