Reputation: 345
I have a file that is being written to by a background Python 3.5 process. The file was opened in this process (which is basically a running Python script), but I forgot to include a file close statement to flush out the buffer occasionally in that script. The script runs forever unless it is ended manually, and I (now) know that killing the Python process will cause all the data in the buffer to be lost. Is there any way to recover the data waiting to be written in this already running process?
Upvotes: 4
Views: 6723
Reputation: 2153
1) Get the PID of your python process
pgrep python
2) List file descriptors
ls -l /proc/{PID}/fd
3) Attach gdb and flush the stream
$ gdb
(gdb) attach {PID}
(gdb) call (int) fflush(0)
(gdb) detach
Note that fflush takes a FILE * pointer, not a file descriptor; passing 0 (NULL) asks the C library to flush every open output stream. Also be aware that Python 3 buffers writes in its own io layer rather than in C stdio, so this may not recover everything.
4) Check your file
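Step 2 above can also be done from Python instead of ls. This is a minimal sketch, assuming a Linux system where /proc/{PID}/fd exists and each entry is a symlink to the open file:

```python
import os

def open_files(pid):
    # Read /proc/{PID}/fd; each entry name is a descriptor number and
    # the symlink target is the path (or pipe/socket label) it refers to.
    fd_dir = "/proc/{}/fd".format(pid)
    files = {}
    for fd in os.listdir(fd_dir):
        try:
            files[int(fd)] = os.readlink(os.path.join(fd_dir, fd))
        except OSError:
            pass  # descriptor was closed while we were scanning
    return files
```

For example, open_files(os.getpid()) lists the current process's own descriptors, which helps you spot which number belongs to your output file.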
Upvotes: 2
Reputation: 9427
First call
f.flush()
and then
os.fsync(f.fileno())
to ensure that all internal buffers associated with f are written to disk. flush() moves the data from Python's internal buffer to the operating system; fsync() asks the operating system to commit it to disk.
Here is an example:
import os

with open("filename", "w") as f:
    while True:  # infinite program
        f.write("the text")
        f.flush()              # push Python's buffer out to the OS
        os.fsync(f.fileno())   # ask the OS to write it to disk
Upvotes: 5