Reputation: 33
I am using a Jetson Xavier NX to capture frames from a USB camera with OpenCV, then I run a KCF tracker on the captured frames to track a certain object.
In parallel I am sending the tracking data to an ESP32, which captures IMU data from an MPU6050.
I can see a lot of inconsistency between when the tracker records movement and when the sensor records it: sometimes the difference is 70 milliseconds, and sometimes it gets up to 150 milliseconds. That much difference is really messing up my plans. Even the 70 millisecond difference seems too much considering I am running a 60 fps camera; it amounts to almost 5 frames of delay.
I checked the communication delay with a scope; it is a consistent ~3 milliseconds over UART.
Here's how I am grabbing frames:
video.open(
    "v4l2src device=/dev/video0 queue-size=1 "
    "! video/x-raw,width=640,height=480,framerate=60/1 "
    "! nvvidconv flip-method=2 "
    "! video/x-raw(memory:NVMM),format=I420 "
    "! nvvidconv "
    "! video/x-raw,format=BGRx "
    "! videoconvert "
    "! video/x-raw,format=BGR "
    "! appsink drop=true sync=false",
    cv::CAP_GSTREAMER
);
I use cv::VideoCapture::read() to capture frames.
The IMU sensor runs at 200 Hz.
Can anyone explain both the delay between the sensor and tracker measurements, and why that delay varies between 70 and 150 milliseconds?
Upvotes: 0
Views: 215