Reputation: 13
Seems like a simple problem, but even after searching the forum and the web I could not find an answer.
When I run my program in NetBeans, all the special characters like ä, ö, ü display correctly. But when I run the "jar" file of the same project (I did a clean and rebuild), strange characters such as @A, &$, and so on appear instead of the correct characters.
Any help would be appreciated.
//edited 22. 08. 2012 00:46
I thought the solution would be simpler, so I didn't post any code or details. OK then:
//input file is in UTF-8
try {
    BufferedReader in = new BufferedReader(new FileReader("fin.dir"));
    String line;
    while ((line = in.readLine()) != null) {
        processLine(line, 0);
    }
    in.close();
} catch (FileNotFoundException ex) {
    System.out.println(ex.getMessage());
} catch (IOException ex) {
    System.out.println(ex.getMessage());
}
I am displaying the characters this way:
JOptionPane.showMessageDialog(rootPane, "Correct!\n\n"
        + testingFin.getWord(), "Congrats", 1);
Upvotes: 1
Views: 2203
Reputation: 86774
From the description of FileReader:
Convenience class for reading character files. The constructors of this class assume that the default character encoding and the default byte-buffer size are appropriate. To specify these values yourself, construct an InputStreamReader on a FileInputStream.
If you're on Windows, the default encoding is typically windows-1252 (a close relative of ISO-8859-1), so as Jon commented, the encoding problem is occurring on input. Try this:
in = new BufferedReader(
        new InputStreamReader(new FileInputStream("fin.dir"), "UTF-8"));
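Here is a minimal self-contained sketch of that reading loop with the charset made explicit. It uses StandardCharsets.UTF_8 (available since Java 7) instead of the "UTF-8" string literal, and try-with-resources so the stream is closed even on error; the class name and the List return type are just for illustration, the asker would call processLine(line, 0) instead of collecting lines:

```java
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class Utf8ReadDemo {

    // Reads every line of a UTF-8 encoded file, decoding explicitly
    // so the result does not depend on the platform default charset.
    static List<String> readAllLines(String path) throws IOException {
        List<String> lines = new ArrayList<>();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(new FileInputStream(path), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                lines.add(line); // the asker's code calls processLine(line, 0) here
            }
        }
        return lines;
    }
}
```

With an explicit charset the ä, ö, ü characters decode the same way whether the program runs inside NetBeans or from a standalone jar.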
Upvotes: 1
Reputation: 6738
Add this to your NetBeans settings in YOURNETBEANS/etc/netbeans.conf, like this:
-J-Dfile.encoding=UTF-8
Upvotes: 0
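Note that netbeans.conf only affects programs launched from the IDE; for the standalone jar you would typically pass the same flag on the command line, e.g. java -Dfile.encoding=UTF-8 -jar yourapp.jar (yourapp.jar being a placeholder). A tiny check like the following shows which default charset the JVM actually picked up, so you can compare the IDE run against the jar run:

```java
import java.nio.charset.Charset;

public class EncodingCheck {
    public static void main(String[] args) {
        // The charset FileReader will use when none is specified.
        // If this differs between NetBeans and the standalone jar,
        // that explains the garbled characters.
        System.out.println(System.getProperty("file.encoding"));
        System.out.println(Charset.defaultCharset());
    }
}
```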