Reputation: 1
I want to copy updated files from the local file system to Hadoop every hour, so I can put the job in cron. Is there a Hadoop command I can use to copy the updated files from local to HDFS?
Upvotes: 0
Views: 489
Reputation: 4818
Something like the code below, run in the folder with your files? (Note: hadoop fs -cp copies within HDFS; to copy from the local file system use hadoop fs -put or hadoop fs -copyFromLocal.)
files=$(find . -type f -mmin -60)
for f in $files
do
hadoop fs -put "$f" /hadoopdest
done
Maybe another loop for recently created files:
files_c=$(find . -type f -cmin -60)
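The two loops above can be folded into one cron-friendly function. This is only a sketch under assumptions: /hadoopdest is an example HDFS path, and the HADOOP_CMD override is a hypothetical hook added here so the file-selection logic can be tried without a Hadoop installation.

```shell
# sync_recent SRC DEST: push files modified or created in the last hour
# from local directory SRC to HDFS directory DEST.
# HADOOP_CMD defaults to the real client; override it (e.g. with an echo)
# to dry-run the selection logic on a machine without Hadoop.
sync_recent() {
    src=$1
    dest=$2
    # -mmin -60 catches modified files, -cmin -60 catches newly created ones;
    # reading line by line avoids the word-splitting of a for-in loop.
    find "$src" -type f \( -mmin -60 -o -cmin -60 \) | while IFS= read -r f
    do
        # hadoop fs -put copies local -> HDFS; -f overwrites an existing copy
        ${HADOOP_CMD:-hadoop fs} -put -f "$f" "$dest"
    done
}
```

A crontab entry to run it at the top of every hour could then look like (script path is an example): 0 * * * * /path/to/sync_recent.sh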
Upvotes: 0
Reputation: 343
You can use data ingestion tools like Flume or NiFi for this.
Please let me know if you need any help with these tools.
Upvotes: 0