DanH

Reputation: 5818

Unable to grep file in Ubuntu, matching lines are blank?

I have a CSV file which is parsed by a bash script in order to find the position of a given ID within the file, as a means of tracking progress via JMeter.

The specific line I'm having trouble with consists of this code:

egrep -n -C 2 '49156$'

Previously I've run the script with no problem, but I think something funny has happened to the character encoding or the line endings, because now when I run this line I get no matches at all.

If I change the regex to check for an extra character, such as '49156.$', then I get the following output on OSX and Ubuntu:

OSX:

1307-node/49150
1308-node/49153
1309:node/49156
1310-node/49159
1311-node/49162

Ubuntu:

1307-node/49150
1308-node/49153

1310-node/49159
1311-node/49162

I need the script to run on an Ubuntu server, so I really need to get it working there. As I said, the script has run fine with previous CSVs, and I'm not sure how this most recent CSV has become altered.

If I look at the file in VIM using :set list I see the following:

node/49153$
node/49156$
node/49159$
node/49162$
node/49165$

I also tried matching a non-numeric character instead of the end-of-line anchor, using the pattern '49156[^0-9]', however this still produces the same result:

1307-node/49150
1308-node/49153

1310-node/49159
1311-node/49162

Other than the above, I'm not really sure what to test for next. Thanks for any advice :)

Upvotes: 0

Views: 139

Answers (1)

ruakh

Reputation: 183371

It sounds like your file might be using CRLF (carriage-return + line-feed) for line endings, instead of just LF (line-feed). (This might happen if, for example, your file was ever edited on a Windows machine.) So, I'd suggest running dos2unix on it, to remove any stray carriage-returns, and see if that fixes it.
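For reference, here is a minimal sketch of how you might confirm and fix this on the Ubuntu box (data.csv is just a placeholder for your CSV file):

# Show non-printing characters (GNU cat); CRLF line endings appear as ^M$
cat -A data.csv | head

# Count lines containing a stray carriage return
grep -c $'\r' data.csv

# Strip the carriage returns in place with dos2unix,
# or use tr if dos2unix isn't installed
dos2unix data.csv
# tr -d '\r' < data.csv > data_unix.csv

# Alternatively, make the pattern tolerate an optional CR before end-of-line
egrep -n -C 2 $'49156\r?$' data.csv

This would also explain why '49156.$' matched on OSX but not the plain '49156$': the '.' was consuming the carriage return that sits just before the line feed.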

Upvotes: 1
