allstrives

Reputation: 634

Setting up Hadoop using Cloudera Manager on corporate cloud

I have a corporate cloud environment that allows me to create 4 VMs with a fairly high configuration (16 GB RAM and 130 GB of drive space each). I'm trying to create a Hadoop cluster and have run into issues. The OS is RHEL 6.4. Here is how the 130 GB is laid out, per df:

Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/mapper/rootvg-root
                       5160576   1454056   3444376  30% /
tmpfs                  8167172         0   8167172   0% /dev/shm
/dev/sda1               516040     40876    448952   9% /boot
/dev/mapper/local1vg-vol_01
                     123789284    683164 116817944   1% /vol_01

I think that as I start installing parcels, "/" and "/boot" fill up and Cloudera Manager gets stuck.

One solution I tried was to move /tmp under /vol_01 and make /tmp a symlink to /vol_01/tmp. That made the situation slightly better, but then /var was taking up a lot of space. I can't simply move /var under /vol_01 because the OS is using it (I guess; it fails to move some of the folders from /var). These mount points are predefined as part of the VM image. Is there any way to tell Cloudera Manager to install to a different location, or to modify my VM in a specific way? One workaround I'm considering is sketched below.
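The idea is to put the directories Cloudera Manager downloads parcels into onto /vol_01 before installation, via bind mounts. This is just a rough sketch; /opt/cloudera/parcels and /opt/cloudera/parcel-repo are what I believe the defaults are, not something I've verified on every CM version:

# Run on each host before installing parcels.
# (parcel-repo should only matter on the Cloudera Manager Server host.)
mkdir -p /vol_01/opt/cloudera/parcels /vol_01/opt/cloudera/parcel-repo
mkdir -p /opt/cloudera/parcels /opt/cloudera/parcel-repo

# Bind-mount so parcel downloads land on /vol_01 instead of "/".
mount --bind /vol_01/opt/cloudera/parcels /opt/cloudera/parcels
mount --bind /vol_01/opt/cloudera/parcel-repo /opt/cloudera/parcel-repo

# Persist the bind mounts across reboots.
cat >> /etc/fstab <<'EOF'
/vol_01/opt/cloudera/parcels      /opt/cloudera/parcels      none  bind  0 0
/vol_01/opt/cloudera/parcel-repo  /opt/cloudera/parcel-repo  none  bind  0 0
EOF

Cloudera Manager also seems to expose a parcel directory setting (and the agents have a parcel_dir entry in /etc/cloudera-scm-agent/config.ini), which might be a cleaner way to achieve the same thing, but I haven't confirmed the exact setting names on my CM version.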

I'm also aware of solutions like stopping all services before moving /var to /vol_01/var, so that all files get copied cleanly.
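Roughly what I have in mind for that, in case it helps (an untested sketch; the Cloudera service names are my assumption, and anything else still holding files open under /var would need to be stopped too):

# Stop everything that writes to /var first (Cloudera services shown;
# rsyslog, auditd and other daemons may also need stopping).
service cloudera-scm-agent stop
service cloudera-scm-server stop   # only where Cloudera Manager Server runs

# Copy /var to the large volume, preserving ownership, permissions, xattrs.
rsync -aHAX /var/ /vol_01/var/

# Bind-mount the copy over /var and persist it in /etc/fstab.
mount --bind /vol_01/var /var
echo '/vol_01/var  /var  none  bind  0 0' >> /etc/fstab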

I will try the options above in that order to find the best solution and post the result.

Upvotes: 3

Views: 245

Answers (1)

Rico

Reputation: 61571

You might have to go into single-user mode to move the /var directory. It's doable, but a lot of daemons use /var/run, for example.
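Something along these lines on RHEL 6 (just a sketch; lsof may need to be installed first):

# Drop to single-user mode so daemons release their files under /var.
telinit 1

# Before moving anything, see what still has files open under /var/run.
lsof +D /var/run

# ...do the copy/bind-mount of /var here, then return to multi-user mode:
telinit 3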

Upvotes: 1
