Reputation: 333
I have run into an issue with contact modeling in the robotics simulation package Drake.
I am trying to position-control an iiwa manipulator to act on a body attached to a screw joint.
What I expect is the nut progressing downwards. What I see is the end-effector sliding around the nut, unable to induce the nut's rotation along the bolt. Note: this is a continued investigation into this question.
As a simplifying experiment, I use ExternallyAppliedSpatialForce instead of the manipulator.
I apply a torque of 1 N·m about the bolt's Z axis, at the center of the bolt's axis. This produces the desired effect:
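Roughly, the force application looks like the sketch below (the body name and the `plant` / `plant_context` variables are placeholders for the ones in my actual setup):

```python
import numpy as np
from pydrake.multibody.plant import ExternallyAppliedSpatialForce
from pydrake.multibody.math import SpatialForce

# "nut" is a placeholder body name; plant and plant_context come from the
# existing diagram setup.
nut = plant.GetBodyByName("nut")

wrench = ExternallyAppliedSpatialForce()
wrench.body_index = nut.index()
wrench.p_BoBq_B = np.zeros(3)  # apply at the body origin, on the bolt axis
wrench.F_Bq_W = SpatialForce(
    tau=np.array([0.0, 0.0, 1.0]),  # 1 N·m about the Z axis (assumed aligned with the bolt)
    f=np.zeros(3))                  # no translational force

plant.get_applied_spatial_force_input_port().FixValue(
    plant_context, [wrench])
```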
Then I stop the artificial force application and try manipulation again, combined with contact visualization via ContactResultsToMeshcat:
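The visualization wiring is roughly the following sketch, assuming `builder`, `plant`, and `meshcat` already exist (in newer Drake releases the class is named ContactVisualizer, but the call is the same):

```python
from pydrake.multibody.meshcat import ContactResultsToMeshcat

# Draw contact force arrows in Meshcat; builder, plant, and meshcat are
# assumed to come from the existing diagram setup.
ContactResultsToMeshcat.AddToBuilder(builder, plant, meshcat)
```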
I also print the contact force magnitudes. The forces are in the range of 50-200 N, but the nut does not move.
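The printout is along these lines (a sketch, assuming point-contact results and a `plant_context` kept up to date by the simulator):

```python
import numpy as np

# Read the most recent contact results reported by the plant.
contact_results = plant.get_contact_results_output_port().Eval(plant_context)
for i in range(contact_results.num_point_pair_contacts()):
    info = contact_results.point_pair_contact_info(i)
    print(f"contact {i}: |f| = {np.linalg.norm(info.contact_force()):.1f} N")
```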
My question is: where should I take the investigation next, when a 1 N·m torque is able to turn the screw joint while a 200 N contact force cannot?
So far, after watching 10 replays of the visualization, I notice that only one finger of the end-effector is in contact at any given moment. So I think I must prepare less sloppy trajectories for the end-effector fingers (and for the whole arm, for that matter).
Upvotes: 2
Views: 153
Reputation: 333
The problem was most likely that the actuation port of the bolt-and-nut model was not connected to any input port, so the screw joint was not simulated properly (more details here).
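A sketch of the kind of fix, assuming the model instance is named "bolt_and_nut" (substitute the actual name from your SDF/URDF): connect an explicit zero actuation so the port is no longer left dangling.

```python
import numpy as np
from pydrake.systems.primitives import ConstantVectorSource

# "bolt_and_nut" is a placeholder model-instance name.
nut_model = plant.GetModelInstanceByName("bolt_and_nut")

# Feed explicit zero torque so the actuation input port is connected and the
# screw joint is simulated as intended.
zero_u = builder.AddSystem(
    ConstantVectorSource(np.zeros(plant.num_actuated_dofs(nut_model))))
builder.Connect(zero_u.get_output_port(),
                plant.get_actuation_input_port(nut_model))
```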
After the fix, the simulation looks like this:
Upvotes: 1
Reputation: 122
It could be the inverse dynamics controller used in ManipulationStation, which is different from (and softer than) the iiwa's real controller, that is making the robot's joints too soft and unable to turn the nut.
Specifically, the inverse dynamics controller scales the joint stiffness matrix by the mass matrix of the robot, whereas iiwa's real controller does not do that.
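As a rough sketch of that difference (gains and terms simplified, not the exact control laws):

$$\tau_{\text{InverseDynamics}} \approx M(q)\,\big[K_p(q_d - q) + K_d(\dot q_d - \dot q)\big] + \text{bias}(q, \dot q)$$
$$\tau_{\text{iiwa}} \approx K_p(q_d - q) + K_d(\dot q_d - \dot q) + \text{bias}(q, \dot q)$$

Since the mass-matrix entries for the distal joints are typically much smaller than 1, the same $K_p$ yields a much lower effective joint stiffness in the first form.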
Upvotes: 2