Reputation: 41
I am running a simple SPARQL query which aims at counting instances of data and object properties across entities of a given class, i.e. how many organizations have been provided with a name, an address, and so on. There is only one type of entity currently loaded in the triple store, so there is no need to filter on ?s. The query is below:
select ?attribName (count(?attribValue) as ?count)
from <http://example.com/graphs/orgs>
where
{
  ?s ?attribName ?attribValue .
}
group by ?attribName
This query times out. I can alternatively run a select distinct and obtain 5 or 6 results, but putting a higher limit on it also times out after 30 seconds. There are a few million entities.
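For reference, the distinct variant that does come back looks roughly like this (the limit value is only illustrative):

select distinct ?attribName
from <http://example.com/graphs/orgs>
where
{
  ?s ?attribName ?attribValue .
}
limit 10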
I would like to know how to modify the timeout for SPARQL queries (not optimize the query). I've tried various timeout settings in the Admin console, but none of them seems to have any effect: the query still times out after 30 seconds, whether it is run from Query Console or through the REST API.
Any idea what the right way to achieve this is? Thanks.
Upvotes: 1
Views: 174
Reputation: 41
Solved. It turns out I had not set the right timeout value in the Admin console. I went to the Admin console, selected the app servers in the left pane, and changed the "request timeout" field of each app server to a higher value. It initially did not work, but when the environment was redeployed the next day (i.e. MarkLogic was restarted), the new timeout took effect and the issue was solved.
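In case it helps someone who prefers to script this rather than click through the Admin console, the same "request timeout" setting can also be changed through the Admin API. A rough, untested sketch (the "Default" group and "App-Services" app server names are placeholders for your own setup):

xquery version "1.0-ml";
import module namespace admin = "http://marklogic.com/xdmp/admin"
  at "/MarkLogic/admin.xqy";

let $config := admin:get-configuration()
let $group := admin:group-get-id($config, "Default")
let $appserver := admin:appserver-get-id($config, $group, "App-Services")
(: raise the request timeout to 600 seconds (10 minutes) :)
let $new-config := admin:appserver-set-request-timeout($config, $appserver, 600)
return admin:save-configuration($new-config)

As noted above, the new value may only take effect after MarkLogic has been restarted.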
Upvotes: 0
Reputation: 3138
I don't think there is any 30-second timeout by default; you might be facing some other issue. Check this thread https://developer.marklogic.com/pipermail/general/2014-July/015483.html and you can also try something like:
import module namespace sem = "http://marklogic.com/semantics" at "/MarkLogic/semantics.xqy";

(: evaluate the query server-side through the semantics API :)
sem:sparql('select ?o
where
{
  ?s ?p ?o .
}
group by ?o')
which can help you increase the MarkLogic query timeout to a minimum of 10 minutes.
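If you only need more time for one particular query rather than for every request on the app server, you could also try raising the limit from inside the request. This is just an untested sketch, and xdmp:set-request-time-limit cannot go above the app server's "max time limit" setting:

xquery version "1.0-ml";
import module namespace sem = "http://marklogic.com/semantics" at "/MarkLogic/semantics.xqy";

(: allow this request to run for up to 10 minutes, capped by the app server's max time limit :)
xdmp:set-request-time-limit(600),
sem:sparql('select ?attribName (count(?attribValue) as ?count)
where
{
  ?s ?attribName ?attribValue .
}
group by ?attribName')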
Upvotes: 1