Reputation: 51
I use the Sesame HTTP repository, and because I have a large schema, a repository that supports inferencing is too slow (especially when adding triples). So instead I use a simple memory repository (set up in the Workbench) and configure it at run-time to support inferencing, with the following lines in the page where I need it:
// wrap the memory store config in an RDFS forward-chaining inferencer config
ForwardChainingRDFSInferencerConfig inferMemStoreConfig =
        new ForwardChainingRDFSInferencerConfig(new MemoryStoreConfig(true));
SailRepositoryConfig repositoryTypeSpec = new SailRepositoryConfig(inferMemStoreConfig);
RepositoryConfig repConfig = new RepositoryConfig(repositoryID, repositoryTypeSpec);

// connect to the remote Sesame server and push the new configuration
RemoteRepositoryManager manager = new RemoteRepositoryManager(sesameServer);
manager.initialize();
Repository myRepository = manager.getRepository(repositoryID);
manager.addRepositoryConfig(repConfig);
So in the page where I add triples, how can I disable inferencing?
This is what I have tried:
// plain memory store config, this time without the inferencer
MemoryStoreConfig memStoreConfig = new MemoryStoreConfig(true);
SailRepositoryConfig repositoryTypeSpec = new SailRepositoryConfig(memStoreConfig);
RepositoryConfig repConfig = new RepositoryConfig(repositoryID, repositoryTypeSpec);

// push the configuration to the remote server and open the repository
RemoteRepositoryManager manager = new RemoteRepositoryManager(sesameServer);
manager.initialize();
Repository myRepository = manager.getRepository(repositoryID);
manager.addRepositoryConfig(repConfig);
myRepository.initialize();
Any help? A better approach maybe?
Upvotes: 2
Views: 396
Reputation: 22052
You cannot change a default Sesame repository's inferencing strategy at runtime like this. Once you have created the repository with a particular configuration, that configuration is fixed: the same store cannot be configured to be both inferencing and non-inferencing.
And even if you could change it, it wouldn't help you. I'm not sure what exactly you're trying to achieve, but adding data to a store with inferencing is slower because it, well, has to do inferencing. Disabling inferencing during loading but enabling it during querying is pointless, as all the inferencing work is done during loading, so in this scenario nothing would be inferred.
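To make that load-time behaviour concrete, here is a minimal, hypothetical sketch against the standard Sesame 2.x API (the class and instance names such as ex:Dog are made up for illustration): the ForwardChainingRDFSInferencer materializes inferred statements at the moment data is added, so no inferencing happens at query time.

import org.openrdf.model.URI;
import org.openrdf.model.ValueFactory;
import org.openrdf.model.vocabulary.RDF;
import org.openrdf.model.vocabulary.RDFS;
import org.openrdf.repository.Repository;
import org.openrdf.repository.RepositoryConnection;
import org.openrdf.repository.sail.SailRepository;
import org.openrdf.sail.inferencer.fc.ForwardChainingRDFSInferencer;
import org.openrdf.sail.memory.MemoryStore;

public class InferenceAtLoadTime {
    public static void main(String[] args) throws Exception {
        // embedded RDFS-inferencing memory store (same sail stack as the config above)
        Repository repo = new SailRepository(
                new ForwardChainingRDFSInferencer(new MemoryStore()));
        repo.initialize();

        ValueFactory vf = repo.getValueFactory();
        URI dog = vf.createURI("http://example.org/Dog");       // hypothetical class
        URI animal = vf.createURI("http://example.org/Animal"); // hypothetical class
        URI rex = vf.createURI("http://example.org/rex");       // hypothetical instance

        RepositoryConnection con = repo.getConnection();
        try {
            // the inferencer does its work here, while the statements are being added
            con.add(dog, RDFS.SUBCLASSOF, animal);
            con.add(rex, RDF.TYPE, dog);

            // the inferred statement (rex a Animal) is already materialized
            System.out.println(con.hasStatement(rex, RDF.TYPE, animal, true));
        } finally {
            con.close();
        }
        repo.shutDown();
    }
}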
You have several choices: one option is to work with a completely non-inferencing repository, and just do smarter querying to get what you need - most RDFS inheritance inferencing can be replaced by using queries.
For example, to get all subclasses of a class ex:A:
SELECT ?x
WHERE { ?x rdfs:subClassOf+ ex:A }
All (inherited) instances of ex:A:
SELECT ?i
WHERE { ?i a [ rdfs:subClassOf* ex:A ] }
And so on and so forth.
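As a rough sketch of that approach (assuming the standard Sesame 2.x API, a Sesame version with SPARQL 1.1 property-path support, and http://example.org/ standing in for the ex: namespace above), the second query can be run against a completely plain, non-inferencing memory store like this:

import org.openrdf.query.BindingSet;
import org.openrdf.query.QueryLanguage;
import org.openrdf.query.TupleQueryResult;
import org.openrdf.repository.Repository;
import org.openrdf.repository.RepositoryConnection;
import org.openrdf.repository.sail.SailRepository;
import org.openrdf.sail.memory.MemoryStore;

public class QueryWithoutInference {
    public static void main(String[] args) throws Exception {
        // plain memory store, no inferencer wrapped around it
        Repository repo = new SailRepository(new MemoryStore());
        repo.initialize();

        String query =
                "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#> " +
                "PREFIX ex: <http://example.org/> " +  // placeholder namespace
                "SELECT ?i WHERE { ?i a [ rdfs:subClassOf* ex:A ] }";

        RepositoryConnection con = repo.getConnection();
        try {
            // the property path walks the class hierarchy at query time
            TupleQueryResult result =
                    con.prepareTupleQuery(QueryLanguage.SPARQL, query).evaluate();
            try {
                while (result.hasNext()) {
                    BindingSet bs = result.next();
                    System.out.println(bs.getValue("i"));
                }
            } finally {
                result.close();
            }
        } finally {
            con.close();
        }
        repo.shutDown();
    }
}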
Another option is to look into one of the Sesame third-party backends, such as OWLIM, which has far more sophisticated inferencing support and better performance.
Upvotes: 1