Simple robot control and navigation based on aruco markers — part 2

Part 1 is here

Now that I have the robot working and under my control, let's make it drive around by itself. I want to use ArUco markers like this one:

The tools for finding and tracking these markers already exist in the OpenCV (Open Source Computer Vision) library, which I plan to use.

Python is the language of my choice, as it is very popular right now and I finally wanted to check it out. So please keep in mind that I have no prior experience with Python and my code probably doesn't look very good.

To see where the markers are, I needed a camera, and it had to be wireless, as I don't want any new wires on my ceiling. But the IP camera I have doesn't give a very good picture, so I thought I could use my Android phone as a webcam. After trying a few apps from the Google Play Store I settled on this one: IP Webcam. The app is great; it has everything I wanted. I started to play with Python and OpenCV, and in a short while I was able to take the video that the IP Webcam app sent from my phone, display it on my laptop screen, and find ArUco markers! It worked well enough. The only problem I had was latency: capturing the image on the phone, compressing it, sending it over the WiFi network, then receiving, decompressing, and processing it on the PC takes time. It wasn't bad, and this solution was still usable, but I think I found a better one.

Why not do all the image processing on the phone? New phones (and even older ones) are perfectly capable of processing video and finding ArUco markers. When the image processing is done on the phone, only the marker positions have to be sent over the network. And so the ArucoAndroidServer was born.


The app can be downloaded from HERE

The app finds the ArUco markers in the video from the camera. Each time you send 'g', it responds with the marker positions in JSON format:

  "aruco": [
      "ID": 2,
      "center": {
        "x": 266,
        "y": 259
      "heading": -3.1150837133093243,
      "markerCorners": [
          "x": 129,
          "y": 125
          "x": 396,
          "y": 129
          "x": 402,
          "y": 381
          "x": 137,
          "y": 401
      "size": 267

Now we just need to process this data and finally make the robot navigate.
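With a marker's center and heading in hand, steering toward a goal point reduces to comparing the robot's heading with the bearing to the goal. A naive sketch (the sign conventions are assumptions; image coordinates have y pointing down, so the angles may need flipping to match the app's heading):

```python
# Sketch: turn a marker's (center, heading) and a goal point into a
# crude drive command. Angle conventions are assumptions.
import math


def heading_error(center, heading, goal):
    """Angle the robot must turn, in radians wrapped to [-pi, pi]."""
    bearing = math.atan2(goal[1] - center[1], goal[0] - center[0])
    err = bearing - heading
    return math.atan2(math.sin(err), math.cos(err))  # wrap to [-pi, pi]


def drive_command(center, heading, goal, tol=0.2):
    """Very naive controller: turn until roughly facing the goal, then go."""
    err = heading_error(center, heading, goal)
    if err > tol:
        return "left"
    if err < -tol:
        return "right"
    return "forward"
```

Calling `drive_command` on every reply from the phone and sending the result to the robot from part 1 closes the loop.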
