fednem

Reputation: 127

write.csv strange encoding in R

I am encountering a very strange problem that I am not able to resolve by myself.

Suddenly, write.csv is encoding CSV files in a way that makes them impossible to read in LibreOffice. The command had always worked until today. Now, if I use write.csv (or its more general equivalent write.table) and then try to open the file with LibreOffice, all I get is a bunch of symbols and Asian characters. I don't really understand what's happening here; it seems that the default encoding of write.csv has changed by itself. The only thing I did differently today was reading some text files that were produced by the program E-Prime, for which I had to use the following command:

A <- read.delim("Pre_NewTask_Run1.txt", fileEncoding = "UCS-2LE")

Is it possible that this changed the default encoding used by write.csv? And if so, how can I change it back?

Thanks in advance for any help.

Upvotes: 1

Views: 1769

Answers (1)

Konrad

Reputation: 18585

It is difficult to give a precise answer without sample data or reproducible code. Having said that, as a first attempt you can force the export to use a specific encoding. For example, the code:

con <- file("filename", encoding = "UTF-8")
write.csv(..., file = con, ...)

would write the file as UTF-8. You can also run the l10n_info() command, which reports the locale encoding currently in use:

> l10n_info()
$MBCS
[1] FALSE

$`UTF-8`
[1] FALSE

$`Latin-1`
[1] TRUE

$codepage
[1] 1252
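As a minimal self-contained sketch of the same idea: write.csv also accepts a fileEncoding argument directly, which avoids managing the connection yourself. The file name "example.csv" and the data frame below are illustrative placeholders, not taken from your session.

```r
# Illustrative data frame (placeholder for your real data)
df <- data.frame(id = 1:3, label = c("a", "b", "c"))

# fileEncoding forces the output encoding regardless of the locale
write.csv(df, "example.csv", row.names = FALSE, fileEncoding = "UTF-8")

# Read the file back, stating the encoding explicitly so the result
# does not depend on the session's locale settings
check <- read.csv("example.csv", fileEncoding = "UTF-8")
```

Stating the encoding explicitly on both the write and the read side makes the round trip independent of whatever the session locale happens to be, which also sidesteps the problem of a locale having been altered by an earlier fileEncoding call.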

Upvotes: 2
