Reputation:
I'm writing my first application in Node.js. I am trying to read some data from a file where the data is stored in the JSON format.
I get this error:
SyntaxError: Unexpected token in JSON at position 0
at Object.parse (native)
Here is this part of the code:
//read saved addresses of all users from a JSON file
fs.readFile('addresses.json', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        storage = JSON.parse(data);
    }
});
Here is the console.log output (and I checked the .json file itself, it's the same):
Read JSON file: {
"addresses": []
}
That looks like valid JSON to me. Why does JSON.parse() fail, then?
Upvotes: 12
Views: 78104
Reputation: 689
const fs = require('fs');

// Route console output into a file instead of stdout
const myConsole = new console.Console(fs.createWriteStream('./output.json'));
myConsole.log(object); // object is whatever value you want written to the file
This will create an output file containing everything that is logged through console.log(object).
This is the easiest way to redirect console.log() output into a file.
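For example, a minimal usage sketch (the addresses value here is just placeholder data):
const fs = require('fs');
const myConsole = new console.Console(fs.createWriteStream('./output.json'));

// Anything logged through myConsole ends up in ./output.json instead of stdout.
// Note that console.log writes util.inspect-style text, not strict JSON.
myConsole.log({ addresses: [] });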
Upvotes: 0
Reputation: 1
As mentioned by TamusJRoyce, I ended up using the util.TextDecoder class as a robust way to read both UTF-8 without a BOM and UTF-8 with a BOM. Here's the snippet, assuming that the file input.json is UTF-8 (with or without a BOM) and contains valid JSON.
const fs = require('fs');
const util = require('util');

const rawdata = fs.readFileSync('input.json');      // Buffer of raw bytes
const textDecoder = new util.TextDecoder('utf-8');   // drops a leading BOM by default
const stringData = textDecoder.decode(rawdata);
const objects = JSON.parse(stringData);
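A quick sketch of why this works: the bytes below are the UTF-8 BOM (EF BB BF) followed by {}, and the decoder strips the BOM by default, so JSON.parse sees plain {}.
const { TextDecoder } = require('util');

// UTF-8 BOM + '{}' -- decode() drops the BOM, so parsing succeeds.
const withBom = Buffer.from([0xef, 0xbb, 0xbf, 0x7b, 0x7d]);
console.log(JSON.parse(new TextDecoder('utf-8').decode(withBom))); // {}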
Upvotes: 0
Reputation: 6872
JSON.parse() does not allow trailing commas, so you need to get rid of them:
JSON.parse(JSON.stringify(data));
You can find more about it here.
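For illustration, a minimal example of the trailing-comma restriction:
JSON.parse('{"addresses": []}');  // OK
JSON.parse('{"addresses": [],}'); // throws SyntaxError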
Upvotes: 10
Reputation: 168
It might be the BOM[1].
I did a test by saving a file with the content {"name":"test"} as UTF-8 + BOM, and it generated the same error.
> JSON.parse(fs.readFileSync("a.json"))
SyntaxError: Unexpected token in JSON at position 0
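If you want to reproduce it yourself, here is a minimal sketch that writes the BOM explicitly as \uFEFF (a.json is just a scratch file):
const fs = require('fs');

// Write '{"name":"test"}' preceded by a UTF-8 BOM, then parse it naively.
fs.writeFileSync('a.json', '\uFEFF{"name":"test"}');
JSON.parse(fs.readFileSync('a.json')); // throws SyntaxError at position 0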
And based on a suggestion here [2], you can replace it or drop it before you call JSON.parse().
For example:
var storage = {};
fs.readFile('a.json', 'utf8', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        console.log(typeof(data)); // "string", because an encoding was passed
        storage = JSON.parse(data.trim()); // trim() removes the leading U+FEFF
    }
});
or
var storage = {};
fs.readFile('a.json', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        console.log(typeof(data)); // "object" -- data is a Buffer here
        storage = JSON.parse(data.toString().trim());
    }
});
You can also remove the first 3 bytes (for UTF-8) by using Buffer.slice().
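For example, a minimal sketch of that approach, checking for the EF BB BF marker before slicing it off:
var storage = {};
fs.readFile('a.json', function (err, data) {
    if (data) {
        // data is a Buffer; drop the UTF-8 BOM (EF BB BF) if present, then parse.
        var hasBom = data[0] === 0xef && data[1] === 0xbb && data[2] === 0xbf;
        storage = JSON.parse((hasBom ? data.slice(3) : data).toString('utf8'));
    }
});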
Upvotes: 5
Reputation: 834
To further explain @Luillyfe's answer:
Ah-ha! fs.readFileSync("data.json") returns a JavaScript object!
Edit: The below is incorrect, but it summarizes what one might think at first!
I had thought that was a string. So if the file was saved as UTF-8/ASCII, it would probably not have an issue? The JavaScript object returned from readFileSync would convert to a string JSON.parse can parse? No need to call JSON.stringify?
I am using PowerShell to save the file, which seems to save it as UTF-16 (too busy right now to check). So I get "SyntaxError: Unexpected token � in JSON at position 0."
However, JSON.stringify(fs.readFileSync("data.json")) correctly converts that returned file object into a string that JSON.parse can handle.
The clue for me was my JSON file contents looking like the below (after logging it to the console):
�{ " R o o m I D _ L o o k u p " : [
{
" I D " : 1 0 ,
" L o c a t i o n " : " f r o n t " ,
" H o u s e " : " f r o n t r o o m "
}
}
That doesn't seem like something a file would load into a string...
What was incorrect (this does not crash, but instead converts the JSON file to gibberish!):
const jsonFileContents = JSON.parse(JSON.stringify(fs.readFileSync("data.json")));
I can't seem to find this documented anywhere, but it makes sense!
Edit: Um... that object is just a Buffer. Apologies for the above!
Solution:
const fs = require("fs");

function GetFileEncodingHeader(filePath) {
    const fd = fs.openSync(filePath, 'r');
    const bufferSize = 2;
    const buffer = Buffer.alloc(bufferSize);
    const readBytes = fs.readSync(fd, buffer, 0, bufferSize, 0);
    fs.closeSync(fd);

    if (readBytes) {
        return buffer.slice(0, readBytes).toString("hex");
    }

    return "";
}

function ReadFileSyncUtf8(filePath) {
    const fileEncoding = GetFileEncodingHeader(filePath);
    let content = null;

    if (fileEncoding === "fffe" || fileEncoding === "utf16le") {
        content = fs.readFileSync(filePath, "ucs2"); // UTF-16 Little Endian
    } else if (fileEncoding === "feff" || fileEncoding === "utf16be") {
        content = fs.readFileSync(filePath).swap16().toString("ucs2"); // UTF-16 Big Endian
    } else {
        content = fs.readFileSync(filePath, "utf8");
    }

    // trimStart removes the BOM character...but there may be a better way!
    return content.trimStart();
}

function GetJson(filePath) {
    const jsonContents = ReadFileSyncUtf8(filePath);
    console.log(GetFileEncodingHeader(filePath));

    return JSON.parse(jsonContents);
}
Usage:
GetJson("data.json");
Note: I don't currently have a need for this to be async. Add another answer if you can make it async!
Upvotes: 2
Reputation: 19347
Try it like this:
fs.readFile('addresses.json', 'utf-8', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        storage = JSON.parse(data);
    }
});
It's because of the BOM, which requires an encoding to be set when reading the file. It has been raised as an issue in the Node.js repository on GitHub:
https://github.com/nodejs/node-v0.x-archive/issues/186
Upvotes: 4
Reputation: 2294
You have a strange character at the beginning of the file:
data.charCodeAt(0) === 65279
I would recommend:
fs.readFile('addresses.json', function (err, data) {
    if (data) {
        console.log("Read JSON file: " + data);
        data = data.toString().trim(); // data is a Buffer; toString() then trim() drops the BOM
        //or data = JSON.parse(JSON.stringify(data.toString().trim()));
        storage = JSON.parse(data);
    }
});
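This works because U+FEFF (character code 65279, the BOM) counts as whitespace in JavaScript, so trim() strips it:
const text = '\uFEFF{"addresses": []}';
JSON.parse(text);        // throws SyntaxError
JSON.parse(text.trim()); // { addresses: [] }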
Upvotes: 29