TWdorado

Reputation: 117

OSGi and Sesame with OWLIM

Is there a simple way to get Sesame and OWLIM to work in an OSGi environment? Or is there another fast way to run inference over OWL data and store the results? I tried Jena with the built-in reasoner, Pellet, etc., but it's so slow. Then I tried Sesame with OWLIM and it was incredibly fast, but I can't get it to work with OSGi. Has anyone solved this problem?

Upvotes: 3

Views: 307

Answers (3)

christian.vogel

Reputation: 2137

One of my colleagues is currently working with Sesame, OWLIM and OSGi. It seems that for OWLIM you have to add additional VM arguments. You can read more in the news section of the OWLIM site; look for OSGi and SwiftOWLIM in this PDF as well. There is also a very interesting project that seems to be the right choice: amdatu-semanticweb. Unfortunately, there is no direct documentation, but the Amdatu projects are a good choice if you want ready-made OSGi components. Have a look, and I hope it helps you.

However, I would not recommend using the VM arguments, since they seem to rely on absolute paths, which is not flexible in my opinion.

Upvotes: 2

Jeen Broekstra

Reputation: 22042

I'm no OSGi expert, but Sesame, at least, is available as an OSGi bundle. While I don't think OWLIM is currently available as such, I know there has been some demand for this, so it might pay off to ask the OWLIM developers directly (via their support mailing list).
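To illustrate what consuming Sesame as an OSGi bundle looks like, here is a sketch of the `Import-Package` header a client bundle's `MANIFEST.MF` might carry. The package names correspond to the Sesame 2.x API (`org.openrdf.*`); the exact list and any version ranges depend on which parts of Sesame you use and on the bundle you deploy, so treat this as an assumption to verify against the bundle's exported packages.

```
Import-Package: org.openrdf.model,
 org.openrdf.repository,
 org.openrdf.repository.sail,
 org.openrdf.sail.memory
```

The OSGi framework will only resolve your bundle if another installed bundle (here, the Sesame distribution bundle) exports these packages.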

Upvotes: 0

Michael

Reputation: 4886

OWLIM is fast compared to something like Pellet because it materializes inferences; that is, at load time, it computes all inferences and puts them into the database. So when you run a query, you're just querying over the data; there is no extra reasoning work done at query time.

This eager materialization of inferences is very appropriate in situations where your data does not change frequently. However, in use cases where that is not the case, the overhead of maintaining the materialized inferences can be unacceptable.

The flip side of the coin is that for systems like Pellet, or other databases that use backward chaining-style approaches to reasoning, loads and data changes are not affected by inference, but the work of reasoning is done at query time, which can slow down queries. So you'd need to think about how you plan to use inference to know which approach will suit your needs; neither one is the "right" approach.
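The trade-off above can be made concrete with a toy sketch (this is not OWLIM's or Pellet's actual engine, just an illustration of the two strategies over a tiny `rdfs:subClassOf` hierarchy): forward chaining pays the reasoning cost once at load time and then answers queries by plain lookup, while backward chaining stores only the asserted triples and does the reasoning on each query.

```python
# Toy forward- vs. backward-chaining sketch over a tiny class hierarchy.
# All names and data here are made up for illustration.

SUBCLASS = {("Dog", "Mammal"), ("Mammal", "Animal")}  # rdfs:subClassOf facts
TYPES = {("rex", "Dog")}                              # rdf:type facts

def materialize(subclass, types):
    """Forward chaining: compute all entailed triples once, at load time."""
    sub = set(subclass)
    # Transitive closure of subClassOf (repeat until no new pairs appear).
    changed = True
    while changed:
        changed = False
        for (a, b) in list(sub):
            for (c, d) in list(sub):
                if b == c and (a, d) not in sub:
                    sub.add((a, d))
                    changed = True
    # Propagate rdf:type up the (closed) hierarchy.
    inferred = set(types)
    for (ind, cls) in types:
        for (sub_c, super_c) in sub:
            if cls == sub_c:
                inferred.add((ind, super_c))
    return inferred

def query_backward(individual, cls, subclass, types):
    """Backward chaining: do the reasoning work at query time instead.
    (Assumes an acyclic hierarchy; a real engine handles cycles.)"""
    if (individual, cls) in types:
        return True
    return any(
        sup == cls and query_backward(individual, mid, subclass, types)
        for (mid, sup) in subclass
    )

db = materialize(SUBCLASS, TYPES)  # load-time cost, then cheap queries
print(("rex", "Animal") in db)                            # True: plain lookup
print(query_backward("rex", "Animal", SUBCLASS, TYPES))   # True: computed now
```

If the data changes often, `materialize` must be redone (or incrementally maintained), which is exactly the overhead described above; if queries dominate and data is stable, paying that cost once is the better deal.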

I will say that the first time you query Pellet, it will usually have to do all the hard work of reasoning (classification, realization), which can make the first query very slow; but once that information is computed, if there are no changes to the data, later queries can be quite performant.

Disclaimer: I don't know how OWLIM works internally; that is just an educated guess from what I know about it. Also, I realize this does not answer your question -- I don't know of a way to use OSGi with OWLIM or other databases -- but I thought your comments merited some clarification, which was too long to fit into a comment =)

Upvotes: 0
