Reputation: 367
I am working on a project that requires validating many XML files against their XSDs. The trouble I am having is that many of the XSD files depend on other XSDs, which makes the usual validation approach troublesome. Is there an elegant way to resolve this issue?
If possible, I would prefer to work with those files in memory; they are not in a consistent directory structure that conforms to their import paths.
Just to note, I am working in Java.
Upvotes: 0
Views: 644
Reputation: 12817
Assuming here that you work with JAXP, so that you can call setSchema() on either SAXParserFactory or DocumentBuilderFactory.
One solution I was part of read all XSD sources into a single aggregated Schema object using SchemaFactory.newSchema(Source[] schemas). This aggregated Schema could then validate any XML document that referenced any "top" schema; all imported schemas had to be part of the aggregate. As I remember it, the Source array had to be ordered by dependency, so that if schema A imported schema B, B had to occur before A in the array.
Also, as I recall, <include> didn't work very well with this mechanism.
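A minimal sketch of the aggregated-schema approach, using two hypothetical in-memory schemas (the namespaces, element names, and XML instance below are made up for illustration; in your case the sources would come from wherever your files live). Note the dependency ordering: schema B, which A imports, comes first in the array, and A's `<xs:import>` carries no schemaLocation because B is resolved from the aggregate itself:

```java
import javax.xml.XMLConstants;
import javax.xml.transform.Source;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import java.io.StringReader;

public class AggregatedSchemaDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical dependency schema B (imported by A).
        String schemaB =
            "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema' "
            + "targetNamespace='urn:example:b' elementFormDefault='qualified'>"
            + "<xs:element name='b' type='xs:string'/>"
            + "</xs:schema>";
        // Hypothetical "top" schema A; the import has no schemaLocation,
        // so the type is resolved from the aggregated schema pool.
        String schemaA =
            "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema' "
            + "xmlns:b='urn:example:b' targetNamespace='urn:example:a' "
            + "elementFormDefault='qualified'>"
            + "<xs:import namespace='urn:example:b'/>"
            + "<xs:element name='a'><xs:complexType><xs:sequence>"
            + "<xs:element ref='b:b'/>"
            + "</xs:sequence></xs:complexType></xs:element>"
            + "</xs:schema>";

        SchemaFactory sf =
            SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        // Order by dependency: B (imported) before A (importer).
        Schema aggregated = sf.newSchema(new Source[] {
            new StreamSource(new StringReader(schemaB)),
            new StreamSource(new StringReader(schemaA))
        });

        // The aggregated Schema can now validate instances of the top schema.
        Validator validator = aggregated.newValidator();
        String xml = "<a xmlns='urn:example:a' xmlns:b='urn:example:b'>"
            + "<b:b>hello</b:b></a>";
        validator.validate(new StreamSource(new StringReader(xml)));
        System.out.println("valid");
    }
}
```

The Schema object is thread-safe and can be reused; only the Validator instances created from it are per-thread.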
Another solution would be to set an LSResourceResolver on the SchemaFactory. You would have to implement your own LSResourceResolver that serves byte or character streams based on the input to the resolver. I haven't personally used or researched this solution.
The first solution has, of course, the benefit that schema parsing and processing is done once and reused for all validations that follow; something that will probably be difficult to achieve with the second option.
Another thing to keep in mind (depending on your context): it is a good design choice to control the whole "resolving" process (i.e. control how the parsers get access to external resources), from a performance as well as a security perspective.
Upvotes: 1