NewUsr_stat

Reputation: 2587

read.table parses a text file in an unexpected way

I'm attempting to read a big matrix into R (64-bit). The matrix has 14,000 columns and 900 rows. The problem is that at certain points in the matrix, R splits a row onto a new line, and this happens more than once. I suspect the problem is R's memory, even though I'm using the 64-bit version. Can anyone help me? Thanks in advance.

Example of a single row (as seen from the Unix shell):

SELT 0.00134 TGH 0.776554 P53 0.23436 MYC 0.2351 BRCA 0.7654.... # (the line has 9,573 columns)

After reading the file with R's read.delim() or read.table():

SELT 0.00134 TGH 0.776554 P53 0.23436   
MYC 0.2351 BRCA 0.7654....

The fields MYC 0.2351 BRCA 0.7654 are wrongly placed on a new line.
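A quick way to check whether the file itself is consistent is to count the fields R sees on each line; a minimal sketch, assuming the file is called matrix.txt (the question does not give the file name):

    # Hypothetical file name; replace with the real path.
    fields <- count.fields("matrix.txt", sep = "")
    table(fields)                     # all rows should report the same field count
    which(fields != max(fields))      # rows that appear split or truncated

If count.fields() already reports unequal rows, the breaks are in the file itself (or introduced by quote/comment characters) rather than in read.table().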

Eleonora

Upvotes: 1

Views: 192

Answers (2)

Janez

Reputation: 36

I had the same problem. My solution was to add a header line to the .txt file with the names "V1", "V2", ..., "Vmax_number_of_columns". With that, it worked.
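If editing the file by hand is inconvenient, a similar effect can be had from R by passing the names through col.names, since read.table() can take the expected number of columns from their length. A sketch, assuming 14,000 columns and a file called matrix.txt (both assumptions, not given in the answer):

    # Assumed path and column count; adjust to your data.
    n_cols <- 14000
    m <- read.table("matrix.txt", header = FALSE,
                    col.names = paste0("V", seq_len(n_cols)))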

Upvotes: 2

NewUsr_stat

Reputation: 2587

We verified that the output of the parallel analysis we had performed was corrupted, which is why R "failed" to read the file correctly. Anyway, thanks a lot for your help!

Upvotes: 1
