Reputation: 51
I'm trying to implement video stabilization using the OpenCV videostab module. I need to do it on a stream, so I'm trying to get the motion between two frames. After studying the documentation, I decided to do it this way:
estimator = new cv::videostab::MotionEstimatorRansacL2(cv::videostab::MM_TRANSLATION);
keypointEstimator = new cv::videostab::KeypointBasedMotionEstimator(estimator);
bool res;
auto motion = keypointEstimator->estimate(this->firstFrame, thisFrame, &res);
std::vector<float> matrix(motion.data, motion.data + (motion.rows*motion.cols));
where firstFrame and thisFrame are fully initialized frames. The problem is that the estimate method always returns a matrix like this:
In this matrix only the last value (matrix[8]) changes from frame to frame. Am I using the videostab objects correctly, and how can I apply this matrix to a frame to get the result?
Upvotes: 3
Views: 1197
Reputation: 289
I am new to OpenCV, but here is how I solved this issue. The problem lies in the line:
std::vector<float> matrix(motion.data, motion.data + (motion.rows*motion.cols));
For me, the motion matrix is of type 64-bit double (check yours from here), and copying its raw data into a std::vector<float> of 32-bit float elements messes up the values.
To solve this issue, try replacing the above line with:
std::vector<float> matrix;
for (auto row = 0; row < motion.rows; row++) {
    for (auto col = 0; col < motion.cols; col++) {
        // motion is CV_64F here, so read each element as double;
        // the cast narrows it to float for the vector.
        matrix.push_back(static_cast<float>(motion.at<double>(row, col)));
    }
}
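Alternatively, here is a sketch that avoids hard-coding the element type: let cv::Mat::convertTo produce a CV_32F copy first, whatever depth motion actually has, and copy the converted elements into the vector.
// Convert the motion matrix to 32-bit float regardless of its original depth,
// then copy the converted elements into the vector.
cv::Mat motion32f;
motion.convertTo(motion32f, CV_32F);
std::vector<float> matrix(motion32f.begin<float>(), motion32f.end<float>());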
I have tested the loop-based fix by running the estimator on a duplicate set of points, and it gives the expected results: most entries are close to 0.0, with matrix[0], matrix[4] and matrix[8] being 1.0 (using the author's code with this setup gave the same erroneous values as the author's picture shows).
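As for the second part of the question (applying the matrix to a frame), here is a rough sketch I have not verified in a full pipeline: since estimate returns a 3x3 transform, you can warp the current frame with cv::warpPerspective. Depending on the direction in which the motion is reported (first frame to current frame or the other way around), you may need to invert the matrix or pass cv::WARP_INVERSE_MAP.
// Rough sketch: warp the current frame with the estimated 3x3 motion matrix.
// thisFrame and motion are the variables from the question's snippet;
// warpPerspective needs <opencv2/imgproc.hpp>.
cv::Mat stabilized;
cv::warpPerspective(thisFrame, stabilized, motion, thisFrame.size());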
Upvotes: 0