Reputation: 1091
The issue
A while back I started using MongoDB and Spring Data. I left most of the default functionality in place, so all of my documents were stored in MongoDB with a _class field pointing to the entity's fully-qualified class name.
Right away that didn't "smell" right to me, but I left it alone. Then recently I refactored a bunch of code, and suddenly none of my documents could be read back from MongoDB and converted into their (refactored/renamed) Java entities. I quickly realized it was because there was now a fully-qualified-classname mismatch. I also realized that, given that I might refactor again sometime in the future, I'd need to figure something else out if I didn't want all of my data to become unusable.
What I've tried
So that's what I'm doing, but I've hit a wall. I think that I need to do the following:
1. Annotate my documents with @TypeAlias("ta"), where "ta" is a unique, stable string.
2. Configure a TypeInformationMapper for Spring Data to use when converting my documents back into their Java entities; it needs to know, for example, that a type alias of "widget.foo" refers to com.myapp.document.FooWidget.
I determined that I should use a TypeInformationMapper of type org.springframework.data.convert.MappingContextTypeInformationMapper. Supposedly a MappingContextTypeInformationMapper will scan my entities/documents to find @TypeAlias'ed documents and store an alias-to-class mapping. But I can't pass that to my MappingMongoConverter; I have to pass a subtype of MongoTypeMapper. So I am configuring a DefaultMongoTypeMapper and passing a List containing my one MappingContextTypeInformationMapper as its "mappers" constructor arg.
Code
Here's the relevant part of my spring XML config:
<bean id="mongoTypeMapper" class="org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper">
<constructor-arg name="typeKey" value="_class"></constructor-arg>
<constructor-arg name="mappers">
<list>
<ref bean="mappingContextTypeMapper" />
</list>
</constructor-arg>
</bean>
<bean id="mappingContextTypeMapper" class="org.springframework.data.convert.MappingContextTypeInformationMapper">
<constructor-arg ref="mappingContext" />
</bean>
<bean id="mappingMongoConverter"
class="org.springframework.data.mongodb.core.convert.MappingMongoConverter">
<constructor-arg ref="mongoDbFactory" />
<constructor-arg ref="mappingContext" />
<property name="mapKeyDotReplacement" value="__dot__" />
<property name="typeMapper" ref="mongoTypeMapper"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="mongoDbFactory" />
<constructor-arg ref="mappingMongoConverter" />
</bean>
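For reference, the equivalent wiring in Java config would look roughly like the sketch below (same Spring Data MongoDB 1.x constructors that the XML relies on; the mongoDbFactory and mappingContext beans are assumed to be defined elsewhere, just as they are referenced above):

import java.util.Arrays;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.convert.MappingContextTypeInformationMapper;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

@Configuration
public class MongoConverterConfig {

    // Mirrors the XML wiring: a DefaultMongoTypeMapper backed by a single
    // MappingContextTypeInformationMapper, plugged into the MappingMongoConverter.
    @Bean
    public MappingMongoConverter mappingMongoConverter(MongoDbFactory mongoDbFactory,
                                                       MongoMappingContext mappingContext) {
        MappingContextTypeInformationMapper typeInfoMapper =
                new MappingContextTypeInformationMapper(mappingContext);
        DefaultMongoTypeMapper typeMapper =
                new DefaultMongoTypeMapper("_class", Arrays.asList(typeInfoMapper));

        MappingMongoConverter converter = new MappingMongoConverter(mongoDbFactory, mappingContext);
        converter.setMapKeyDotReplacement("__dot__");
        converter.setTypeMapper(typeMapper);
        return converter;
    }

    @Bean
    public MongoTemplate mongoTemplate(MongoDbFactory mongoDbFactory, MappingMongoConverter converter) {
        return new MongoTemplate(mongoDbFactory, converter);
    }
}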
Here's a sample entity/document:
@Document(collection="widget")
@TypeAlias("widget.foo")
public class FooWidget extends Widget {
    // ...
}
One important note is that any such "Widget" entity is stored as a nested document in Mongo. So in reality you won't really find a populated "Widget" collection in my MongoDB instance. Instead, a higher-level "Page" class can contain multiple "widgets" like so:
@Document(collection="page")
@TypeAlias("page")
public class Page extends BaseDocument {
    // ...
    private List<Widget> widgets = new ArrayList<Widget>();
}
The error I'm stuck on
What happens is that I can save a Page along with a number of nested Widgets in Mongo. But when I try to read said Page back out, I get something like the following:
org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [com.myapp.document.Widget]: Is it an abstract class?
I can indeed see pages in Mongo containing "_class" : "page", with nested widgets containing "_class" : "widget.foo". It just appears that the mapping is not being applied in reverse when the documents are read back.
Is there anything I might be missing?
Upvotes: 10
Views: 16723
Reputation: 47
If you use Spring Boot with auto-configuration, declaring the following bean can help:
@Bean
MongoManagedTypes mongoManagedTypes(ApplicationContext applicationContext)
        throws ClassNotFoundException {
    return MongoManagedTypes.fromIterable(
            new EntityScanner(applicationContext).scan(Persistent.class));
}
Instead of scanning only for @Document classes, it will scan for classes annotated with either @Document or @TypeAlias, since both of these annotations are meta-annotated with @Persistent.
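For completeness, here is the same bean inside a full configuration class with the imports spelled out (a sketch assuming Spring Boot 3.x with Spring Data MongoDB 4.x, where these classes live in the packages shown):

import java.util.Set;

import org.springframework.boot.autoconfigure.domain.EntityScanner;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.Persistent;
import org.springframework.data.mongodb.MongoManagedTypes;

@Configuration
public class MongoManagedTypesConfig {

    // Scan for every class carrying a @Persistent-meta-annotated annotation
    // (@Document, @TypeAlias, ...) and register it with Spring Data up front,
    // so aliases can be resolved when documents are read back.
    @Bean
    MongoManagedTypes mongoManagedTypes(ApplicationContext applicationContext) throws ClassNotFoundException {
        Set<Class<?>> entities = new EntityScanner(applicationContext).scan(Persistent.class);
        return MongoManagedTypes.fromIterable(entities);
    }
}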
Upvotes: 2
Reputation: 41
If you extend AbstractMongoConfiguration, you can override the getMappingBasePackage() method to specify the base package for your documents.
@Configuration
class RepositoryConfig extends AbstractMongoConfiguration {

    @Override
    protected String getMappingBasePackage() {
        return "com.example";
    }
}
Update: In spring-data-mongodb 2+ you should use:
@Configuration
class RepositoryConfig extends AbstractMongoConfiguration {

    @Override
    protected Collection<String> getMappingBasePackages() {
        return Arrays.asList("com.example");
    }
}
because getMappingBasePackage() is now deprecated and won't work.
Upvotes: 4
Reputation: 101
In the default setting, the MappingMongoConverter creates a DefaultMongoTypeMapper, which in turn creates a MappingContextTypeInformationMapper.
That last class is the one responsible for maintaining the typeMap cache between TypeInformation and aliases.
That cache is populated in two places; the one that matters here is the pass over mappingContext.getPersistentEntities() made when the mapper is created.
So if you want to make sure the alias is recognized in any context, you need to make sure that all your aliased entities are part of mappingContext.getPersistentEntities().
How you do that depends on your configuration. For instance:
- If you use AbstractMongoConfiguration, you can overwrite its getMappingBasePackage() to return the name of a package containing all of your entities.
- If you use Spring Boot, you can use @EntityScan to declare which packages to scan for entities.
- You can also build the entity set yourself and pass it to mongoMappingContext.setInitialEntitySet(), as in the sketch after this list.
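A minimal sketch of that last option (assuming Java config; the configuration class name is made up and imports for the entity classes themselves are omitted):

import java.util.HashSet;
import java.util.Set;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

@Configuration
public class MappingContextConfig {

    // Hand the aliased entity classes to the mapping context up front, so their
    // @TypeAlias values are cached before any document is read back.
    @Bean
    public MongoMappingContext mongoMappingContext() {
        Set<Class<?>> entities = new HashSet<Class<?>>();
        entities.add(Page.class);        // classes from the question; adjust to your model
        entities.add(FooWidget.class);   // every aliased Widget subtype needs to be listed

        MongoMappingContext mappingContext = new MongoMappingContext();
        mappingContext.setInitialEntitySet(entities);
        mappingContext.initialize();     // scans the initial entity set and registers the aliases
        return mappingContext;
    }
}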
One side note: for an entity to be discovered by a scan, it has to be annotated with either @Document or @Persistent.
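For instance, as a hypothetical variation on the model in the question, a widget subtype that only ever lives embedded inside a Page could carry @Persistent plus its alias instead of @Document and still be discovered:

import org.springframework.data.annotation.Persistent;
import org.springframework.data.annotation.TypeAlias;

// Hypothetical subtype: no @Document and no collection of its own, but the
// entity scan still picks it up (and registers "widget.bar") because of @Persistent.
@Persistent
@TypeAlias("widget.bar")
public class BarWidget extends Widget {
    // ...
}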
More information can be found in the spring-data-commons Developer Guide.
Upvotes: 9
Reputation: 212
Andreas Svensson is right; this can be done much more simply than described by Dave Taubler.
I posted a slightly more elaborate answer than Andreas's (including sample code) in this post. Excerpt:
So all you need to do is to declare an "unused" Repository-Interface for your sub-classes, just like you proposed as "unsafe" in your OP:
public interface NodeRepository extends MongoRepository<Node, String> {
    // all of your repo methods go here
    Node findById(String id);
    Node findFirst100ByNodeType(String nodeType);
    // ... etc.
}

public interface LeafType1Repository extends MongoRepository<LeafType1, String> {
    // leave empty
}

public interface LeafType2Repository extends MongoRepository<LeafType2, String> {
    // leave empty
}
Upvotes: -1
Reputation: 11
Today I ran into the exact same issue. After more research I found out that my subclass was missing a repository. It appears that Spring Data uses the repositories to determine which concrete subclass to create, and when one is missing it falls back to the superclass, which in this case is abstract.
So please try adding a FooWidgetRepository mapped to FooWidget with the correct ID type. It might work in your case as well.
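A minimal sketch of such a repository (the String id type is an assumption; the question doesn't show FooWidget's id field):

import org.springframework.data.mongodb.repository.MongoRepository;

// Intentionally empty: its only job is to make Spring Data aware of the
// FooWidget subtype (and its alias) at startup. The String id type is a guess.
public interface FooWidgetRepository extends MongoRepository<FooWidget, String> {
}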
Upvotes: 1
Reputation: 1091
I spent a bunch of time with my debugger and the Spring Data source code, and I learned that Spring Data doesn't handle polymorphism as well as it probably should, especially given the schema-less nature of NoSQL solutions like MongoDB. Ultimately what I did was write my own type mapper, and that wasn't too tough.
The main problem was that, when reading in my Page document, the default mappers used by Spring Data would see a collection called widgets, consult the Page class to determine that widgets pointed to a List of Widgets, and then consult the Widget class to look for @TypeAlias information. What I needed instead was a mapper that scanned my persistent entities up front and stored an alias-to-class mapping for later use. That's what my custom type mapper does.
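Roughly, the idea looks something like this sketch (illustrative only, not the exact code; it assumes the Spring Data Commons 1.x TypeInformationMapper interface, which works with plain Object aliases, whereas newer versions wrap aliases in an Alias type):

import java.util.HashMap;
import java.util.Map;

import org.springframework.data.annotation.TypeAlias;
import org.springframework.data.convert.TypeInformationMapper;
import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;

// Sketch: build the alias-to-class map eagerly from a known set of entity classes,
// so a read can resolve "widget.foo" even though the declared property type is the
// abstract Widget.
public class PreScanningTypeInformationMapper implements TypeInformationMapper {

    private final Map<Object, TypeInformation<?>> aliasToType = new HashMap<Object, TypeInformation<?>>();

    public PreScanningTypeInformationMapper(Iterable<Class<?>> entityClasses) {
        for (Class<?> entityClass : entityClasses) {
            TypeAlias alias = entityClass.getAnnotation(TypeAlias.class);
            if (alias != null) {
                aliasToType.put(alias.value(), ClassTypeInformation.from(entityClass));
            }
        }
    }

    public TypeInformation<?> resolveTypeFrom(Object alias) {
        return aliasToType.get(alias); // null lets the caller fall back to its defaults
    }

    public Object createAliasFor(TypeInformation<?> type) {
        TypeAlias alias = type.getType().getAnnotation(TypeAlias.class);
        return alias == null ? null : alias.value();
    }
}

A mapper like this can then be passed to the DefaultMongoTypeMapper from the configuration in the question, in place of (or alongside) the MappingContextTypeInformationMapper.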
I wrote a blog post discussing the details.
Upvotes: 6