Reputation: 1696
We have a large Spring Boot app using a Mongo database, and we are migrating our ORM from Morphia to Spring Data MongoDB. Morphia has an annotation, @Reference(ignoreMissing = true), which we use for our DBRefs. When we use the @Reference annotation on a List<Entity>, Morphia will skip over any null refs (i.e. refs it cannot load), which keeps nulls out of our list items.
Sample of our existing model class:
public class Book {

    @Reference(ignoreMissing = true)
    private List<Author> authors;
}
We are trying to reproduce the same behavior with Spring Data MongoDB's @DBRef annotation, but by default Spring always includes the null DBRefs in our List<Entity> class members.
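For reference, this is roughly what the same class looks like on the Spring side (a minimal sketch; as far as I can tell, @DBRef has no ignoreMissing-style attribute):

import java.util.List;

import org.springframework.data.mongodb.core.mapping.DBRef;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
public class Book {

    // Unlike Morphia's @Reference(ignoreMissing = true), a dangling ref here
    // comes back as a null element in the loaded list.
    @DBRef
    private List<Author> authors;
}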
To resolve this, as a start, I tried overriding the DbRefResolver.bulkFetch method that Spring uses for resolving DBRefs:
import static java.util.stream.Collectors.toList;
import java.util.List;
import java.util.Objects;
import org.bson.Document;
import org.jetbrains.annotations.NotNull;
import org.springframework.data.mongodb.MongoDatabaseFactory;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import com.mongodb.DBRef;

public class CustomDbRefResolver extends DefaultDbRefResolver {

    public CustomDbRefResolver(MongoDatabaseFactory mongoDbFactory) {
        super(mongoDbFactory);
    }

    // Filter out the refs that could not be loaded before they reach the mapped collection.
    @Override
    @NotNull
    public List<Document> bulkFetch(@NotNull List<DBRef> refs) {
        var docs = super.bulkFetch(refs);
        return docs.stream().filter(Objects::nonNull).collect(toList());
    }
}
But looking deeper into Spring's implementation, it doesn't use this method at all if the list of DBRefs has only one element in it:
List<Document> referencedRawDocuments = dbrefs.size() == 1
        ? Collections.singletonList(readRef(dbrefs.iterator().next()))
        : bulkReadRefs(dbrefs);
So now I'm stuck on how to make Spring ignore the null DBRefs for a list of any size, the way Morphia does. (I checked the Javadocs and there doesn't seem to be an ignore property like Morphia's.)
Upvotes: -1
Views: 74
Reputation: 1696
OK, since Spring doesn't call bulkFetch for collections of size 1, I ditched the CustomDbRefResolver and overrode the higher-level MappingMongoConverter instead. The DbRefResolver's methods get invoked during the execution of MappingMongoConverter.readCollectionOrArray, which can be overridden:
import static java.util.stream.Collectors.toList;
import java.util.Collection;
import java.util.List;
import java.util.Objects;
import org.springframework.data.mapping.context.MappingContext;
import org.springframework.data.mongodb.core.convert.DbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoPersistentEntity;
import org.springframework.data.mongodb.core.mapping.MongoPersistentProperty;
import org.springframework.data.util.TypeInformation;
import com.mongodb.DBRef; // make sure to use the right import!

public class CustomMappingMongoConverter extends MappingMongoConverter {

    public CustomMappingMongoConverter(DbRefResolver dbRefResolver,
                                       MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext) {
        super(dbRefResolver, mappingContext);
    }

    @Override
    protected Object readCollectionOrArray(ConversionContext context, Collection<?> source, TypeInformation<?> targetType) {
        var readResult = super.readCollectionOrArray(context, source, targetType);

        // Only post-process non-empty DBRef collections that were read into a List.
        if (source.isEmpty()
                || !(source.stream().findAny().orElse(null) instanceof DBRef)
                || !(readResult instanceof List)
                || ((List<?>) readResult).isEmpty()) {
            return readResult;
        }

        // Drop the entries whose referenced documents could not be resolved.
        return ((List<?>) readResult).stream().filter(Objects::nonNull).collect(toList());
    }
}
In order to use this converter with spring-data-mongodb, I had to override this part of my MongoConfiguration:
@Configuration
public class MongoConfiguration extends AbstractMongoClientConfiguration {

    @Override
    @Bean
    @NotNull
    public MappingMongoConverter mappingMongoConverter(MongoDatabaseFactory databaseFactory,
                                                       MongoCustomConversions customConversions,
                                                       MongoMappingContext mappingContext) {
        DbRefResolver dbRefResolver = new DefaultDbRefResolver(databaseFactory);
        MappingMongoConverter converter = new CustomMappingMongoConverter(dbRefResolver, mappingContext);
        converter.setCustomConversions(customConversions);
        converter.setCodecRegistryProvider(databaseFactory);
        return converter;
    }
}
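With that in place, lists of DBRefs of any size come back without null entries. A minimal usage sketch of the end result (BookService, loadAuthors and the getAuthors() getter are hypothetical names, not part of the code above):

import java.util.List;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.stereotype.Service;

// Hypothetical service, only to illustrate the behavior after the converter is registered.
@Service
public class BookService {

    private final MongoTemplate mongoTemplate;

    public BookService(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public List<Author> loadAuthors(String bookId) {
        Book book = mongoTemplate.findById(bookId, Book.class);
        // Author refs that no longer resolve are filtered out by CustomMappingMongoConverter,
        // so this list contains no null elements.
        return book.getAuthors();
    }
}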
Upvotes: 0