M. Palsich

Reputation: 1818

Find Replace using Values in another File

I have a directory of files, myFiles/, and a text file values.txt in which the first column is a set of values to find and the second column holds the corresponding replacement values.

The goal is to replace all instances of find values (first column of values.txt) with the corresponding replace values (second column of values.txt) in all of the files located in myFiles/.

For example...

values.txt:

Hello Goodbye

Happy Sad

Running the command would replace all instances of "Hello" with "Goodbye" in every file in myFiles/, as well as replace every instance of "Happy" with "Sad" in every file in myFiles/.
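So, for a hypothetical file myFiles/example.txt (the name is only for illustration), a line such as

    Hello world, I am Happy

should become

    Goodbye world, I am Sad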

I've made as many attempts with awk/sed and so on as I could think of, but have failed to produce a command that performs the desired action.

Any guidance is appreciated. Thank you!

Upvotes: 1

Views: 2038

Answers (3)

James K. Lowden

Reputation: 7837

You can do what you want with awk.

#! /usr/bin/awk -f

# snarf in first file, values.txt: find value in column 1, replacement in column 2
FNR == NR {
    subs[$1] = $2
    next
}

# apply replacements to subsequent files
# (this assumes no replacement value contains its own search text,
#  otherwise the while loop would never finish)
{
    for( old in subs ) {
        while( start = index($0, old) ) {
            len = length(old)
            $0 = substr($0, 1, start - 1) subs[old] substr($0, start + len)
        }
    }
    print
}

When you invoke it, put values.txt as the first file to be processed.
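For example, with the script saved as replace.awk (the name is just an illustration), you could run:

    awk -f replace.awk values.txt myFiles/*

Note that it prints the result to standard output rather than editing the files in place, so redirect the output yourself if you want the files rewritten.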

Upvotes: 1

ppp

Reputation: 1005

  1. Read each line from values.txt
  2. Split that line into 2 words
  3. Use sed for each line to replace the 1st word with the 2nd word in all files in the myFiles/ directory

Note: I've used bash parameter expansion to split the line (${line% *} etc.), assuming values.txt is a space-separated, two-column file. If that's not the case, you may use awk or cut to split the line.
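To make those expansions concrete, here is what they produce for one sample line (a minimal illustration):

    line='Hello Goodbye'
    echo "${line% *}"   # Hello   -- the find value (everything before the last space)
    echo "${line#* }"   # Goodbye -- the replace value (everything after the first space)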

while read -r line; do
    sed -i "s/${line% *}/${line#* }/g" myFiles/*  # '-i' edits the files in place and 'g' replaces all occurrences on each line
done < values.txt
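One caveat: if the find or replace values contain characters that sed treats specially (/, &, ., * and so on), they would need escaping first. That aside, a slightly shorter variant lets read split the two columns itself (still assuming a space-separated, two-column values.txt):

    while read -r find replace; do
        sed -i "s/$find/$replace/g" myFiles/*   # same idea, with read doing the splitting
    done < values.txt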

Upvotes: 1

Kevin J. Rice

Reputation: 3373

Option One:

  1. create a python script
  2. with open('values.txt', 'r') as infile, etc., read the values.txt file into a python dict with 'from' as key and 'to' as value. close the infile.
  3. use os.listdir or glob to list the directory you want, iterate over the files, and for each one either popen a sed 's/from/to/g' command or read the file in yourself, iterating over all its lines and doing the find/replace on each.

Option Two:

  1. bash script
  2. in a loop, read in a from/to pair
  3. for each pair, invoke

    perl -p -i -e 's/from/to/g' dirname/*.txt

The second is probably easier to write, but with less exception handling. It's called 'Perl PIE' and it's a relatively famous hack for doing find/replace in lots of files at once.
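A minimal sketch of Option Two as a complete loop, assuming values.txt holds space-separated from/to pairs and the files live in myFiles/ as in the question (\Q...\E stops the from value from being treated as a regular expression; values containing slashes or shell-special characters would still need extra care):

    while read -r from to; do
        perl -p -i -e "s/\Q$from\E/$to/g" myFiles/*
    done < values.txt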

Upvotes: 0
