Feignedjesse2

Reputation: 107

Robot odometry in LabVIEW

I am currently working on a (school-)project involving a robot having to navigate a corn field.

We need to make the complete software in NI Labview.

Because of the tasks the robot has to perform, it needs to know its position.

As sensors we have a 6-DOF IMU, some unreliable wheel encoders and a 2D laser scanner (SICK TIM351).

So far I have been unable to find any suitable algorithms or tutorials, so I am really stuck on this problem.

I am wondering if anyone has ever attempted to make SLAM work in LabVIEW, and if so, whether there are any examples or explanations of how to do this?

Or is there perhaps a toolkit for LabVIEW that contains this function/algorithm?

Kind regards, Jesse Bax 3rd year mechatronic student

Upvotes: 1

Views: 1101

Answers (2)

Kake_Fisk

Reputation: 1165

As Slavo mentioned, there's the LabVIEW Robotics Module, which contains algorithms like A* for pathfinding. But as far as I am aware, there is not much in it that helps you solve the SLAM problem. The SLAM problem consists of the following parts: landmark extraction, data association, state estimation and updating of the state.

For landmark extraction, you have to pick one or more features that you want the robot to recognize. This can for example be a corner or a line (a wall in 3D). To extract them you can for example use clustering, split-and-merge, or the RANSAC algorithm. I believe your laser scanner stores its points in a list sorted by angle, which makes the split-and-merge algorithm very feasible. RANSAC is the most accurate of them, but it also has a higher complexity. I recommend starting with some ideal data points for testing the line extraction: for example, put your laser scanner in a small room with straight walls, perform one scan, and save it to an array or a file. Make sure the contour is a bit more complex than just four walls, and remove noise either before or after the measurement.
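Since LabVIEW block diagrams can't be shown in text, here is a minimal Python sketch of the split-and-merge idea that you could port to a SubVI or MathScript node. It assumes the scan is already an angle-ordered list of (x, y) points; the `threshold` value and the point-count cutoff are arbitrary choices for illustration.

```python
import math

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (x0, y0), (x1, y1), (x2, y2) = p, a, b
    num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
    den = math.hypot(y2 - y1, x2 - x1)
    return num / den if den > 0 else math.hypot(x0 - x1, y0 - y1)

def split(points, threshold):
    """Recursively split an angle-ordered scan into line segments."""
    if len(points) < 3:
        return [points]
    a, b = points[0], points[-1]
    dists = [point_line_distance(p, a, b) for p in points]
    i = max(range(len(dists)), key=dists.__getitem__)
    if dists[i] > threshold:
        # Farthest point is off the candidate line: split there and recurse.
        return split(points[:i + 1], threshold) + split(points[i:], threshold)
    return [points]

# Example: an L-shaped contour (one corner) splits into two wall segments.
scan = ([(0.0, y / 10) for y in range(10, -1, -1)]
        + [(x / 10, 0.0) for x in range(1, 11)])
segments = split(scan, threshold=0.05)
print(len(segments))  # 2
```

A full implementation would also merge collinear neighbouring segments afterwards (the "merge" step) and fit each segment with least squares instead of using the endpoints.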

I haven't read up on good methods for data association, but a simple approach is to consider a measurement a new landmark if it is more than a certain distance away from every existing landmark, and otherwise update the nearest existing landmark.
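That nearest-neighbour-with-gate rule can be sketched in a few lines of Python; the `gate` distance here is an arbitrary illustrative value you would tune to your sensor noise.

```python
import math

def associate(measurement, landmarks, gate=0.5):
    """Return the index of the nearest landmark within `gate` metres,
    or None if the measurement should start a new landmark.
    `measurement` and each landmark are (x, y) positions in the map frame."""
    best_i, best_d = None, gate
    for i, lm in enumerate(landmarks):
        d = math.hypot(measurement[0] - lm[0], measurement[1] - lm[1])
        if d < best_d:
            best_i, best_d = i, d
    return best_i

landmarks = [(1.0, 1.0), (4.0, 0.0)]
print(associate((1.1, 0.9), landmarks))  # 0    -> update landmark 0
print(associate((2.5, 2.5), landmarks))  # None -> add as a new landmark
```

More robust schemes gate on the Mahalanobis distance of the EKF innovation rather than plain Euclidean distance, but the structure is the same.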

State estimation and updating of the state can be achieved with a complementary filter or the Extended Kalman Filter (EKF). The EKF is the de facto standard for nonlinear state estimation [1] and tends to work very well in practice. The theory behind the EKF is quite tough, but the implementation is a tad easier. I would recommend using the MathScript Module if you are going to program the EKF. The point of these two filters is to estimate the position of the robot from the wheel encoders and the landmarks extracted from the laser scanner.
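As a rough sketch of the predict/update cycle, here is one EKF step in Python (NumPy) for the localization half of the problem only: a unicycle robot state [x, y, θ], odometry in the predict step, and a range-bearing measurement of a landmark at a known position in the update step. Full SLAM would augment the state vector with the landmark positions as well; all noise values and the motion model here are illustrative assumptions.

```python
import numpy as np

def ekf_step(x, P, u, z, lm, Q, R, dt=0.1):
    """One EKF predict/update cycle for a unicycle robot state [x, y, theta].

    u  = (v, w)    odometry: linear and angular velocity
    z  = (r, phi)  measured range and bearing to a landmark
    lm = (lx, ly)  known landmark position (full SLAM estimates this too)
    Q, R           process and measurement noise covariances
    """
    v, w = u
    px, py, th = x

    # Predict: propagate the motion model and its Jacobian F.
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0, 1]])
    P_pred = F @ P @ F.T + Q

    # Update: expected range-bearing measurement and its Jacobian H.
    dx, dy = lm[0] - x_pred[0], lm[1] - x_pred[1]
    q = dx * dx + dy * dy
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - x_pred[2]])
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q),  0],
                  [ dy / q,          -dx / q,          -1]])
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    return x_pred + K @ y, (np.eye(3) - K @ H) @ P_pred
```

In MathScript the same matrix expressions carry over almost line for line; in pure G you would build them from the Linear Algebra VIs instead.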

As the SLAM problem is a big task, I would recommend programming it in multiple smaller SubVIs, so that you can properly test each part without too much added complexity.

There are also many good papers on SLAM.

Upvotes: 1

Slavo

Reputation: 492

LabVIEW provides the LabVIEW Robotics Module, and there are plenty of templates for it. First, you can check the Starter Kit 2.0 template, which gives you a simple working self-driving robot project. You can then develop your own application from that working model instead of starting from scratch.

Upvotes: 1
