Reputation: 11
When working on our robotics project, we want to use a mix of local SDF files that we wrote ourselves, and ones from the pydrake / manipulation packages. All the examples of yaml files we found in the manipulation course were either of the form:
- add_model:
...
file: file://absolute/path/to/file.sdf
...
or of the following form (the Rubik's cube example even writes the SDF file to the manipulation package library to use this).
- add_model:
...
file: package://package_name/path/to/file.sdf
...
Because we are developing with our own SDF files, the former seems like the better option, but then we end up with yaml files like the one below, which would be hard to maintain when merging versions in git.
- add_model:
...
file: file://absolute/path/to/project/on/my/computer/filename.sdf
...
Our current hack is to add a "gen_scenario_yaml.py" file that generates the yaml with the relevant absolute base paths whenever it is imported. Another alternative could be to write to the manipulation/pydrake package directories (though this would not be picked up by git). Is there a more elegant solution?
We tried setting paths of the form file://./relative/path/to/file.sdf, but this throws an error.
We also have a hacky solution of generating our yaml files on the fly, which works, but isn't elegant.
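For concreteness, the generation hack is roughly the following (just a sketch; the file names and layout are illustrative, not our exact script):

# gen_scenario_yaml.py -- rough sketch of our current hack (illustrative only).
import os

# Absolute project root, computed on whatever machine imports this module.
PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))

SCENARIO_TEMPLATE = """\
directives:
- add_model:
    name: my_object
    file: file://{root}/models/my_object.sdf
"""

def write_scenario(out_path=os.path.join(PROJECT_ROOT, "scenario.yaml")):
    # Bake this machine's absolute project path into the yaml that Drake will load.
    with open(out_path, "w") as f:
        f.write(SCENARIO_TEMPLATE.format(root=PROJECT_ROOT))

# Regenerate the yaml whenever this module is imported.
write_scenario()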
Upvotes: 0
Views: 163
Reputation: 1
Yes! You can do so by creating a local package for the files. First, create a package.xml file in the same folder as all of your SDF files. The package.xml file should look something like this...
<?xml version="1.0"?>
<package>
  <name>PACKAGE_NAME_HERE</name>
</package>
Then you should be able to reference any SDF in that folder from your .yaml file like so...
file: package://PACKAGE_NAME_HERE/SDF_NAME_HERE.sdf
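If the package isn't picked up automatically, you can register the package.xml with the parser's PackageMap before loading the scenario. A minimal sketch, assuming you load the scenario through pydrake's model directives (the paths and names below are placeholders):

# Sketch: register the local package so the package:// URIs in the yaml resolve.
import os
from pydrake.multibody.parsing import LoadModelDirectives, Parser, ProcessModelDirectives
from pydrake.multibody.plant import AddMultibodyPlantSceneGraph
from pydrake.systems.framework import DiagramBuilder

builder = DiagramBuilder()
plant, scene_graph = AddMultibodyPlantSceneGraph(builder, time_step=0.001)
parser = Parser(plant)

# Point the parser at the package.xml that lives next to your SDF files,
# using a path relative to this script so nothing machine-specific is hard-coded.
package_xml = os.path.join(os.path.dirname(__file__), "models", "package.xml")
parser.package_map().AddPackageXml(package_xml)

# Now package://PACKAGE_NAME_HERE/... URIs in the scenario yaml can be resolved.
directives = LoadModelDirectives(os.path.join(os.path.dirname(__file__), "scenario.yaml"))
ProcessModelDirectives(directives, plant, parser)

Because the path to package.xml is computed relative to the script, the yaml itself never contains a machine-specific absolute path, which avoids the git-merge problem in the question.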
Upvotes: 0