Joinwei Zhang

Reputation: 21

How to avoid the OOM killer when coding in Python

I wrote a crawler in Python, but it often triggers the OOM killer, and the OOM killer freezes Linux so that I cannot even connect to the OS via SSH. I also wrote a script to protect memory: if memory usage exceeds 80% [memusage = (MemTotal - MemFree - Buffers - Cached) / MemTotal], it restarts the crawler. But that does not seem to work. So my question is: how do I avoid the OOM killer, and even if the OOM killer fires, is there some way to keep it from freezing the whole operating system?
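
For reference, a minimal sketch of the kind of watchdog described above, reading /proc/meminfo and restarting the crawler when the 80% threshold is crossed. The process name crawler.py and the restart commands are assumptions, not the actual script:

    #!/usr/bin/env python3
    # Watchdog sketch (assumed, not the asker's actual script): computes
    # memusage = (MemTotal - MemFree - Buffers - Cached) / MemTotal
    # from /proc/meminfo and restarts the crawler above 80% usage.
    import subprocess
    import time

    def mem_usage():
        """Return the fraction of memory in use, per the formula above."""
        info = {}
        with open("/proc/meminfo") as f:
            for line in f:
                key, rest = line.split(":", 1)
                info[key] = int(rest.split()[0])  # values are in kB
        used = info["MemTotal"] - info["MemFree"] - info["Buffers"] - info["Cached"]
        return used / info["MemTotal"]

    while True:
        if mem_usage() > 0.80:
            subprocess.call(["pkill", "-f", "crawler.py"])  # hypothetical process name
            subprocess.Popen(["python3", "crawler.py"])     # hypothetical restart command
        time.sleep(5)

Note that a polling watchdog like this can lose the race: by the time a check fires, the system may already be thrashing, so both the poll interval and the threshold matter.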

When the OOM killer fires, the screen prints the information below. Can someone explain this output to me?

ve:0kB present:16256kB pages_scanned:173280531 all_unreclaimable? yes
[9132.468227] lowmem_reserve[]: 0 829 829 829
[9132.468403] Normal free:3628kB min:3648kB low:4568kB high:5472kB active:614280kB inactive:22508kB present:849376kB pages_scanned:1415839762 all_unreclaimable? yes
[9132.468713] lowmem_reserve[]: 0 0 0 0
[9132.468883] DMA: 0*4kB 1*8kB 1*16kB 1*32kB 0*64kB 0*128kB 1*256kB 0*512kB 1*1024kB 1*2048kB 0*4096kB = 3384kB
[9132.469286] Normal: 7*4kB 5*8kB 0*16kB 1*32kB 1*64kB 1*128kB 1*256kB 0*512kB 1*1024kB 1*2048kB 0*4096kB = 3620kB
[9132.469674] Swap cache: add 0, delete 0, find 0/0, race 0+0
[9132.469825] Free swap = 0kB
[9132.469905] Total swap = 0kB
[9132.469986] Free swap:       0kB
[9132.472289] 218112 pages of RAM
[9132.472386] 0 pages of HIGHMEM
[9132.472469] 44668 reserved pages
[9132.472553] 5732 pages shared
[9132.472634] 0 pages swap cached
[9132.472760] 0 pages dirty
[9132.472837] 0 pages writeback
[9132.472919] 874 pages mapped
[9132.472999] 3343 pages slab
[9132.473082] 392 pages pagetables
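
As an aside on reading the log: the DMA: and Normal: lines above are the buddy allocator's per-zone free lists, printed as count*block-size pairs whose sum is the zone's total free memory. A quick arithmetic check of the Normal line:

    # "Normal: 7*4kB 5*8kB 0*16kB 1*32kB 1*64kB 1*128kB 1*256kB
    #          0*512kB 1*1024kB 1*2048kB 0*4096kB = 3620kB"
    counts = [7, 5, 0, 1, 1, 1, 1, 0, 1, 1, 0]
    sizes = [4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096]  # block sizes in kB
    print(sum(c * s for c, s in zip(counts, sizes)))  # -> 3620, matching the log's total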

Upvotes: 0

Views: 3623

Answers (2)

max333

Reputation: 97

As I understand it, you either have no swap partition or have disabled it; try adding swap. (Your log confirms this: "Total swap = 0kB".) With no swap, the kernel has nowhere to evict pages from memory, and the system can freeze.
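
For completeness, the usual steps to create and enable a 1 GiB swap file, wrapped in Python since that is the language the question uses; the path /swapfile and the size are assumptions, and this must run as root:

    # Standard swap-file setup (dd / mkswap / swapon), driven from Python.
    import subprocess

    cmds = [
        ["dd", "if=/dev/zero", "of=/swapfile", "bs=1M", "count=1024"],  # allocate 1 GiB
        ["chmod", "600", "/swapfile"],   # swap files must not be world-readable
        ["mkswap", "/swapfile"],         # format as swap
        ["swapon", "/swapfile"],         # enable immediately
    ]
    for cmd in cmds:
        subprocess.run(cmd, check=True)

To make the swap file persist across reboots, it also needs an entry in /etc/fstab.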

Upvotes: 3

MattH

Reputation: 38245

The OOM killer is one of the strangest parts of Unix to me; it never seems to go after the process that is actually using all the memory.

The solution is to not let your Python process consume too much memory. You can temporarily alleviate the problem by installing more memory.

However, the long-term solution is to write your crawler so that it doesn't eat all your memory.

Without seeing your code, we can only guess at where all the memory is going.
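
Purely as an illustrative sketch (we have not seen the actual crawler), two common sources of crawler memory growth are an unbounded URL frontier and reading whole response bodies into memory. Something like the following bounds both, assuming the requests library and a hypothetical process() handler:

    # Bounded-memory fetch loop (illustrative, not the asker's code).
    from collections import deque

    import requests

    MAX_FRONTIER = 10_000   # cap on URLs waiting to be fetched
    CHUNK_SIZE = 64 * 1024  # handle pages in 64 kB pieces, not whole bodies

    def crawl(seed_urls):
        # A bounded deque drops the oldest pending URL once the cap is hit,
        # so the frontier can never grow without limit.
        frontier = deque(seed_urls, maxlen=MAX_FRONTIER)
        seen = set()  # caution: this still grows; consider an on-disk store
                      # or a Bloom filter for long-running crawls
        while frontier:
            url = frontier.popleft()
            if url in seen:
                continue
            seen.add(url)
            with requests.get(url, stream=True, timeout=10) as resp:
                for chunk in resp.iter_content(chunk_size=CHUNK_SIZE):
                    process(chunk)  # hypothetical handler: parse links, append
                                    # new URLs to frontier, write data to disk

    def process(chunk):
        pass  # placeholder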

Upvotes: 2
