Reputation: 185
With Spring Data MongoDB, I need to update a document in Mongo.
My entity is defined like this:
@Document(collection = "Orders")
public class Order {

    @Id
    private Long id;
    private String clientContainerReference;
    private String status;
    private BigDecimal amount;
    private BigDecimal remainingQuantity;
    ...
}
At first, this document is inserted into Mongo with a remainingQuantity of 100. Next, the order is updated with a null remainingQuantity. After the update (upsert), remainingQuantity is still set to 100.
This is due to the class org.springframework.data.mongodb.core.convert.MappingMongoConverter: in its writeInternal method, a null check is done on every property of the document. If a property is null, it is excluded from the generated DBObject.
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
    public void doWithPersistentProperty(MongoPersistentProperty prop) {

        if (prop.equals(idProperty)) {
            return;
        }

        Object propertyObj = wrapper.getProperty(prop);
        if (null != propertyObj) {
            if (!conversions.isSimpleType(propertyObj.getClass())) {
                writePropertyInternal(propertyObj, dbo, prop);
            } else {
                writeSimpleInternal(propertyObj, dbo, prop);
            }
        }
    }
});
I can understand that this is more efficient, because the generated DBObject is smaller and the update request is easier for Mongo to digest. But how can I update fields to real null values?
More specifically, in my case all fields of all documents could be null, so I don't want to write a custom converter and map each Java field to a DBObject field one by one (something like the sketch below).
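To illustrate, this is the kind of hand-written per-field mapping I want to avoid (a sketch only; the getters are assumed and are not shown in the entity above):

// Sketch of the manual mapping I want to avoid: every property is copied by hand.
// BasicDBObject.put() keeps null values, so nulls would survive this way.
private DBObject toDBObjectByHand(Order order) {
    BasicDBObject dbo = new BasicDBObject();
    dbo.put("_id", order.getId());
    dbo.put("clientContainerReference", order.getClientContainerReference());
    dbo.put("status", order.getStatus());
    dbo.put("amount", order.getAmount());
    dbo.put("remainingQuantity", order.getRemainingQuantity()); // stays in the DBObject even when null
    return dbo;
}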
For my use case, I have created a "NullAwareMappingMongoConverter" that overrides MappingMongoConverter so that the converter also writes null values.
entity.doWithProperties(new PropertyHandler<MongoPersistentProperty>() {
    public void doWithPersistentProperty(MongoPersistentProperty prop) {

        if (prop.equals(idProperty)) {
            return;
        }

        Object propertyObj = wrapper.getProperty(prop);
        if (null != propertyObj) {
            if (!conversions.isSimpleType(propertyObj.getClass())) {
                writePropertyInternal(propertyObj, dbo, prop);
            } else {
                writeSimpleInternal(propertyObj, dbo, prop);
            }
        } else {
            // difference from the original converter: write the null value instead of skipping the property
            writeSimpleInternal(propertyObj, dbo, prop);
        }
    }
});
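To use it, the custom converter has to be registered with the MongoTemplate, roughly like this (a sketch only: the NullAwareMappingMongoConverter constructor is assumed to delegate to the superclass, and the exact MappingMongoConverter constructor arguments differ between Spring Data MongoDB versions):

// Sketch: wire the custom converter into the template.
// The converter constructor used here is an assumption and varies across versions.
MongoMappingContext mappingContext = new MongoMappingContext();
MappingMongoConverter converter = new NullAwareMappingMongoConverter(mongoDbFactory, mappingContext);
MongoTemplate mongoTemplate = new MongoTemplate(mongoDbFactory, converter);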
It's a very ugly solution, because MappingMongoConverter in Spring Data MongoDB has package visibility. Does Spring provide a way to say "don't ignore null values for these properties", with an annotation or something else?
Thank you
Here is the code used to update the entity:
public T setNotificationDateAndSave(T entity) {
    Assert.notNull(entity, "Entity must not be null!");

    // convert the entity to a DBObject (null properties are dropped here by the converter)
    BasicDBObject dbObject = new BasicDBObject();
    mongoTemplate.getConverter().write(entity, dbObject);

    DateTime expirationDate = getDeprecatedStatus().contains(getStatus(entity))
            ? new DateTime().plusSeconds(EXPIRE_AFTER_SECONDS) : null;
    dbObject.put(EXPIRATION_DATE_COLUMN, mongoTemplate.getConverter()
            .convertToMongoType(expirationDate, ClassTypeInformation.from(DateTime.class)));

    NotificationDateUpdate update = new NotificationDateUpdate(dbObject, NOTIFICATION_DATE_COLUMN);
    Query q = Query.query(Criteria.where("_id").is(entity.getId())
            .andOperator(Criteria.where(REGISTER_DATE_COLUMN).lte(entity.getRegisterDate())));
    try {
        mongoTemplate.upsert(q, update, parameterClass);
    } catch (DuplicateKeyException e) {
        logger.info(format("could not save notification : a more recent notification exists for collection %s and entity id : %s",
                parameterClass.getName(), entity.getId()));
    }
    return entity;
}
Upvotes: 5
Views: 4296
Reputation: 1307
I had the same problem, but I didn't want to mess with the converter; it felt too complicated for the need. I simply automated unsetting all the null fields of my object.
Slightly adapted to your code, it would look like this:
DBObject dbObject = new BasicDBObject();
mongoTemplate.getConverter().write(entity, dbObject);
Update update = Update.fromDBObject(dbObject);

// explicitly $unset every field that is null on the entity,
// since the converter left those fields out of the DBObject
for (Field f : entity.getClass().getDeclaredFields()) {
    try {
        f.setAccessible(true);
        if (f.get(entity) == null) {
            update.unset(f.getName());
        }
    } catch (IllegalAccessException e) {
        e.printStackTrace();
    }
}

mongoTemplate.upsert(query, update, entity.getClass());
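If the entity inherits fields from a superclass (the question works with a generic T), getDeclaredFields() will not return those inherited fields; a small variation that walks up the class hierarchy, as a sketch:

// Sketch: also unset null fields declared in superclasses,
// because getDeclaredFields() only returns fields of the class itself.
for (Class<?> c = entity.getClass(); c != null && c != Object.class; c = c.getSuperclass()) {
    for (Field f : c.getDeclaredFields()) {
        try {
            f.setAccessible(true);
            if (f.get(entity) == null) {
                update.unset(f.getName());
            }
        } catch (IllegalAccessException e) {
            e.printStackTrace();
        }
    }
}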
Upvotes: 2