Reputation: 63
I've taken up a recent interest in making/emulating text-based RPGs from the '80s, like Rogue and its derivatives, which feature graphics made out of extended ASCII characters. So when it comes to creating and printing graphics to the console for these games, I figure that I should do the following:

1. Design the levels and whatnot in a text editor like Notepad;
2. Save those files as Unicode-encoded .txt files, since they contain extended ASCII;
3. Have my game program read the graphics from these files and print them, verbatim, to the console.

This seems like a fine plan to me, except there is one problem.
For the life of me, I can't get the program to output the extended ASCII characters properly. What generally happens is that the program will seem to read each single char from the file as a pair of ASCII chars. For example, the char '☺' would be output as "&;", or something like that.
In C++ and/or C#, how can I properly read extended ASCII chars from Unicode-encoded txt files, line by line, into a program and output those lines to the console window?
(I mean, I suppose I could write a translator function that takes the corrupted char pair, like "&;", and converts it back to the single character, like '☺', by way of a big ol' if-then statement or some cleverly deduced mathematical formula. But not only am I quite lazy, I would also very much be interested in knowing how C++/C# handle file I/O with non-ANSI-encoded .txt files, if they indeed have such mechanisms implemented!)
Upvotes: 0
Views: 1224
Reputation: 8946
I'm not into C# very much, so here is a sample for C++:
#include <stdio.h>
#include <wchar.h>   /* fgetws / fputws */
#include <locale.h>  /* setlocale */

int main()
{
    FILE *pFile;
    wchar_t mystring[100];

    /* Use the environment's locale so wide I/O knows how to
       decode the bytes in the file and encode them for the console. */
    setlocale(LC_ALL, "");

    pFile = fopen("myfile.txt", "r");
    if (pFile != NULL)
    {
        if (fgetws(mystring, 100, pFile) != NULL)
            fputws(mystring, stdout);
        fclose(pFile);
    }
    return 0;
}
I suggest using C-style I/O operations rather than C++ streams, because they give better performance (not an issue in your case, but a good habit). So open the file with fopen and read from it with fgetws; once you are done with the file's resources, don't forget to close it with fclose. Printing to the console must also be different (I mean, you should tell it that you use wide characters), so use fputws.
As an additional suggestion, you could read in binary format (append "b" to the mode, i.e. "rb" instead of "r", in fopen). That should give slightly better performance, but you may then need to implement some of the data parsing yourself.
Also, if you are looking for a C++-style solution, you could use streams, but you need to specify that you are working with wide characters: use std::wifstream instead of std::ifstream, std::wcout instead of std::cout, and so on, using the w prefix.
Upvotes: 0
Reputation: 9782
Since you control both sides (writing a text file and reading it back) things are very easy:
.NET uses UTF-8 encoding by default. If you use a StreamWriter to write a file, you can use a StreamReader to read the file back, and all characters will survive the round-trip unaltered.
Now the trick for you: if you want to manipulate such a file with an external editor, make sure the editor is able to read/write UTF-8 encoding. Use Notepad++; it will do.
Upvotes: 1