user1730083

Reputation: 31

Shell script to move files into a hadoop cluster

This may have been answered somewhere but I haven't found it yet.

I have a simple shell script that I'd like to use to move log files into my Hadoop cluster. The script will be called by Logrotate on a daily basis.

It fails with the following error: "/user/qradar: cannot open `/user/qradar' (No such file or directory)".

#!/bin/bash

#use today's date and time
day=$(date +%Y-%m-%d)

#change to log directory
cd /var/log/qradar

#move and add time date to file name
mv qradar.log qradar$day.log

#load file into variable
#copy file from local to hdfs cluster

if [ -f qradar$day.log ]

then
    file=qradar$day.log
    hadoop dfs -put /var/log/qradar/&file   /user/qradar

else
    echo "failed to rename and move the file into the cluster" >> /var/log/messages

fi

The directory /user/qradar does exist and can be listed with the Hadoop file commands. I can also manually move the file into the correct directory using the Hadoop file commands. Can I move files into the cluster in this manner? Is there a better way?

Any thoughts and comments are welcome. Thanks

Upvotes: 3

Views: 11482

Answers (1)

Chris White

Reputation: 30089

Is the &file a typo in the hadoop dfs -put line?

If not, then this is likely your problem: you're running the command hadoop dfs -put /var/log/qradar/ in the background (the ampersand runs the preceding command in the background), followed by a second command, file /user/qradar, which the shell then looks up on the local path.
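A quick way to see the difference, using echo in place of hadoop (the paths here are just illustrative):

```shell
dir=/var/log/qradar
file=qradar.log

# With a dollar sign the shell expands the variable first,
# so the command receives one complete path as its argument:
echo "$dir/$file"
# → /var/log/qradar/qradar.log

# With an ampersand, as in the question's script, the shell would instead
# run everything before the '&' in the background and then try to execute
# a separate command literally named 'file'.
```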

My guess is you meant the following (a dollar sign rather than an ampersand):

hadoop dfs -put /var/log/qradar/$file /user/qradar
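With that fix in place, here is a sketch of the question's script with the variable quoted and the exit status of the put checked as well. It assumes the same paths as the question and is wrapped in a function for illustration; hadoop dfs -put is kept as in the original:

```shell
#!/bin/bash
# Hedged sketch of the question's logrotate helper, with the $file fix.
rotate_and_put() {
    # use today's date
    day=$(date +%Y-%m-%d)

    cd /var/log/qradar || return 1

    # add the date to the file name; quoting keeps the name as one word
    mv qradar.log "qradar$day.log" || return 1

    if [ -f "qradar$day.log" ]; then
        file="qradar$day.log"
        # dollar sign (not ampersand), quoted, so the path expands
        # as a single argument; log a message if the copy fails too
        hadoop dfs -put "/var/log/qradar/$file" /user/qradar \
            || echo "failed to copy $file into the cluster" >> /var/log/messages
    else
        echo "failed to rename and move the file into the cluster" >> /var/log/messages
    fi
}
```

Defining it as a function means nothing runs until logrotate (or you) calls rotate_and_put, which makes it easier to test by hand first.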

Upvotes: 4
