Reputation: 6504
Problem :
I have a very complex JSON document that contains several integer values larger than 8 bytes (c_time, p_time) and some that are smaller (id in the example below)
Example structure:
{
"c_time": 18446744062065078000,
"p_time": 18446744062065078000,
"id" : 122,
"name" : "example",
... : ...
simple json structure
}
When I try to insert this JSON into MongoDB, I get the following error:
Exception MongoDB can only handle up to 8-byte ints
What I Tried:
To rectify this I have two options. One way is to parse through the JSON and remove all such large ints using del c_time. But this has an obvious problem: I lose valuable information.
Secondly, I want to convert these large ints to strings. I parsed through the JSON and tried converting them, but due to the complexity and depth of the structure it is difficult to convert all of them.
Is there a simple and effective way to convert all the long ints in a given JSON document into strings without a major performance penalty?
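One targeted option (a minimal sketch, not from the original post) is to walk the document recursively and stringify only the ints that fall outside MongoDB's signed 64-bit range, leaving small ints like id untouched. The function name stringify_large_ints and the bounds parameters are my own; it builds new dicts/lists rather than mutating in place:

```python
def stringify_large_ints(obj, lo=-2**63, hi=2**63 - 1):
    """Recursively convert ints outside the signed 64-bit range to strings."""
    if isinstance(obj, dict):
        return {k: stringify_large_ints(v, lo, hi) for k, v in obj.items()}
    if isinstance(obj, list):
        return [stringify_large_ints(v, lo, hi) for v in obj]
    if isinstance(obj, int) and not (lo <= obj <= hi):
        return str(obj)  # too big for BSON int64, store as string
    return obj
```

This keeps ordinary numeric fields queryable in MongoDB, at the cost of having to remember which fields were stringified when reading them back.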
Upvotes: 3
Views: 1446
Reputation: 1420
I've solved this issue by converting all int/float fields to strings on insert, and converting them back on read. This is the code in Python:
import types

def serialize(data):
    if isinstance(data, types.GeneratorType):
        data = list(data)
    return convert_numeric_to_str(data)

def deserialize(data):
    return convert_str_to_numeric(data)
and
def convert_numeric_to_str(d):
    cur_type = type(d)
    if cur_type == dict:
        for key, value in d.items():
            d[key] = convert_numeric_to_str(value)
    elif cur_type == list:
        for i, el in enumerate(d):
            d[i] = convert_numeric_to_str(el)
    else:
        if cur_type in [int, float]:
            d = str(d)
    return d
def convert_str_to_numeric(d):
    cur_type = type(d)
    if cur_type == dict:
        for key, value in d.items():
            d[key] = convert_str_to_numeric(value)
    elif cur_type == list:
        for i, el in enumerate(d):
            d[i] = convert_str_to_numeric(el)
    else:
        if cur_type == str:
            try:
                d = int(d)
            except ValueError:
                try:
                    d = float(d)
                except ValueError:
                    pass
    return d
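To illustrate the round trip, here is a compact standalone restatement of the same convert-everything approach (helper names to_str/to_num are mine, for brevity), followed by a sample document:

```python
def to_str(d):
    """Recursively turn every int/float into a string (insert path)."""
    if isinstance(d, dict):
        return {k: to_str(v) for k, v in d.items()}
    if isinstance(d, list):
        return [to_str(v) for v in d]
    if isinstance(d, (int, float)):
        return str(d)
    return d

def to_num(d):
    """Recursively turn numeric-looking strings back into numbers (read path)."""
    if isinstance(d, dict):
        return {k: to_num(v) for k, v in d.items()}
    if isinstance(d, list):
        return [to_num(v) for v in d]
    if isinstance(d, str):
        try:
            return int(d)
        except ValueError:
            try:
                return float(d)
            except ValueError:
                return d
    return d

doc = {"c_time": 18446744062065078000, "id": 122, "name": "example"}
stored = to_str(doc)    # every numeric field becomes a string
restored = to_num(stored)  # round-trips back to the original here
```

One caveat of this approach: any original string field that happens to look numeric (e.g. "42") will come back as a number on read, so the round trip is only lossless when no string fields are numeric-looking.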
Upvotes: 3