My Personal Robotic Companion

SLAM and autonomous navigation with ROS + Kinect + Arduino + Android

https://github.com/sungjik/my_personal_robotic_companion


[Figure: patrick_the_robot flow chart — system overview]

  1. The Hardware
    – two geared DC motors with integrated encoders (RB30-MM7W-D W/EC 26P 12V)
    {gear ratio: 1/120, rated speed: 49.5 rpm, torque: 4 kgf·cm, encoder resolution: 26}
    – Xbox 360 Kinect
    – MacBook Air 11-inch running Lubuntu 14.04 and ROS Indigo
    – Galaxy S6 Edge w/ Android
    – Arduino Mega 2560
    – Adafruit Motor Shield v2
    – 6800mAh 3S LiPo battery (and balance charger)
    – DC to DC step up and step down converter (used to connect the Kinect to the battery)
    – LiPo alarm (to prevent completely discharging the battery)
    – Wood planks, metal connectors, nuts, bolts, wires, cables, etc.
  2. Arduino (motor_controller.ino)
    The main loop in motor_controller.ino converts encoder tick counts into the current RPM of both motors and controls the motor speeds using PID control; a rough sketch of the full loop appears below. The rosserial library conveniently lets the program communicate with the rest of the ROS nodes on the PC over USB. The program subscribes to the cmd_vel topic, which sets the desired motor speeds, and publishes the actual speeds on the rpm topic. I used two interrupt pins on the Mega, one for each encoder. The PID control and RPM publishing are all done in the same loop cycle, at a target rate of 10 Hz (every 100 ms). I modified the Adafruit Motor Shield v2 library so that setSpeed for DC motors uses the full 4096-step PWM resolution, i.e. in Adafruit_MotorShield.cpp:
    void Adafruit_DCMotor::setSpeed(uint16_t speed) {
      // pass the full 12-bit value straight through to the on-board PWM chip
      MC->setPWM(PWMpin, speed);
    }

    The uint8_t parameter in the corresponding declaration in Adafruit_MotorShield.h also needs to be changed to uint16_t. The Adafruit Motor Shield v2 has its own on-board PWM chip with 12-bit resolution, but for some strange reason the DC motor library only used 8 bits.
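    To make the control cycle concrete, here is a minimal sketch of how the tick-to-RPM conversion and the PID step might fit together in one 10 Hz loop. It is only an illustration: the variable names, pins, and gains below are assumptions, not the exact contents of motor_controller.ino.
    // Illustrative 10 Hz tick-to-RPM + PID cycle (names, pins, and gains
    // are assumptions, not the exact code in motor_controller.ino).
    volatile long ticks = 0;                  // incremented by the encoder ISR
    const double TICKS_PER_REV = 26.0 * 120;  // encoder counts * gear ratio
    const unsigned long LOOP_MS = 100;        // 10 Hz control loop
    double kp = 2.0, ki = 0.5, kd = 0.1;      // PID gains -- tune for your motors
    double integral = 0, prev_err = 0;
    double target_rpm = 0;                    // set from the cmd_vel callback

    void encoderISR() { ticks++; }

    void setup() {
      attachInterrupt(digitalPinToInterrupt(2), encoderISR, RISING);
    }

    void loop() {
      static unsigned long last = 0;
      unsigned long now = millis();
      if (now - last < LOOP_MS) return;
      double dt = (now - last) / 1000.0;
      last = now;

      noInterrupts();                         // read and reset the count atomically
      long t = ticks; ticks = 0;
      interrupts();

      double rpm = (t / TICKS_PER_REV) * (60.0 / dt);  // ticks/s -> rev/min

      double err = target_rpm - rpm;          // classic PID on the rpm error
      integral += err * dt;
      double deriv = (err - prev_err) / dt;
      prev_err = err;
      int pwm = constrain((int)(kp * err + ki * integral + kd * deriv), 0, 4095);
      // motor.setSpeed(pwm);   // full 12-bit range with the patch above
      // publish the measured rpm on the "rpm" topic via rosserial here
    }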
    References :
    – http://forum.arduino.cc/index.php?topic=8652.0
    – http://ctms.engin.umich.edu/CTMS/index.php?example=Introduction&section=ControlPID
    – http://playground.arduino.cc/Main/RotaryEncoders
  3. base_controller.cpp
    The base_controller node subscribes to the rpm topic from the Arduino and converts it into x velocity, theta velocity, xy position, and yaw. It also subscribes to the gyro topic from the Android phone and combines those readings with the RPM information to produce a combined yaw. The base_controller then publishes this information in the form of odometry and tf messages (a sketch of the rpm-to-odometry conversion appears after the parameter list below).
    Subscribed Topics : rpm(geometry_msgs::Vector3Stamped), gyro(geometry_msgs::Vector3)
    Published Topics : odom(nav_msgs::Odometry)
    Parameters:
    – publish_tf : whether to publish tf or not; set to false if you want to use robot_pose_ekf
    – publish_rate : rate in Hz for publishing odom and tf messages
    – linear_scale_positive : amount to scale translational velocity in the positive x direction
    – linear_scale_negative : amount to scale translational velocity in the negative x direction
    – angular_scale_positive : amount to scale rotational velocity counterclockwise
    – angular_scale_negative : amount to scale rotational velocity clockwise
    – alpha : value between 0 and 1, how much to weigh wheel encoder yaw relative to gyro yaw
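    To make the conversion concrete, here is a rough sketch of the differential-drive odometry update (illustrative only; the real base_controller.cpp also applies the scale corrections, gyro fusion, and covariance handling):
    #include <cmath>

    const double kPi = 3.14159265358979;

    struct Pose { double x = 0, y = 0, theta = 0; };

    // rpm_left/rpm_right come from the "rpm" topic; dt is the time since the
    // last update; wheel_diameter and track_width live in robot_specs.h.
    void updateOdometry(Pose &p, double rpm_left, double rpm_right, double dt,
                        double wheel_diameter, double track_width) {
      double v_left  = rpm_left  * kPi * wheel_diameter / 60.0;  // m/s
      double v_right = rpm_right * kPi * wheel_diameter / 60.0;
      double v = (v_right + v_left) / 2.0;              // forward velocity
      double w = (v_right - v_left) / track_width;      // angular velocity
      p.x     += v * std::cos(p.theta) * dt;
      p.y     += v * std::sin(p.theta) * dt;
      p.theta += w * dt;  // encoder yaw, later blended with the gyro yaw
    }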
    References :
    – https://github.com/Arkapravo/turtlebot
    – http://wiki.ros.org/turtlebot
    – http://wiki.ros.org/ros_arduino_bridge
  4. Android (gyro.py)
    I found it necessary to add an IMU to the robot to improve odometry, since odometry from the wheel encoders alone was prone to error from voltage changes, wheel diameter and track width miscalculations, etc. I used an app called HyperIMU from the Android app market, which publishes gyroscope, accelerometer, magnetometer, and other useful sensor readings over a UDP stream. I connected my Android phone to the PC via USB and set up USB tethering on the phone to get an IP address for the UDP stream. Getting the odometry to work with reasonable accuracy was one of the toughest parts of the project: there was always a discrepancy between what the software thought the robot was doing and what the robot was actually doing. At first I tried fusing magnetometer, accelerometer, and gyroscope data using a complementary filter and a Kalman filter, but in the end I found that a simple weighted average of the gyroscope data and wheel encoder data worked best (see the sketch below). To test whether the odometry was reasonably accurate, I followed the instructions in http://wiki.ros.org/navigation/Tutorials/Navigation%20Tuning%20Guide.
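    For reference, the weighted average amounts to something like this sketch, using the alpha parameter described in the base_controller section above (illustrative, not the exact code):
    // alpha = 1.0 trusts the wheel encoders completely, 0.0 trusts the gyro.
    double fuseYaw(double encoder_dtheta, double gyro_dtheta, double alpha) {
      return alpha * encoder_dtheta + (1.0 - alpha) * gyro_dtheta;
    }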
  5. Kinect
    I followed the instructions in http://wiki.ros.org/kinect/Tutorials/Adding%20a%20Kinect%20to%20an%20iRobot%20Create to connect my 11.1 V LiPo battery to the Kinect. I used an automatic step up/down DC converter instead of a voltage regulator to ensure a steady 12 V to the Kinect. I then installed the freenect library:
    sudo apt-get install ros-indigo-freenect-stack
    and used the depthimage_to_laserscan ROS package to publish a fake laser scan derived from the depth image produced by the Kinect’s IR depth camera.
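    Conceptually, depthimage_to_laserscan takes one horizontal band of the depth image and projects each column’s depth into a planar range reading. A toy sketch of the idea (not the package’s actual code):
    #include <cmath>
    #include <vector>

    // depth_row: one row of the depth image, in meters; fov_rad: horizontal FOV.
    std::vector<float> depthRowToRanges(const std::vector<float> &depth_row,
                                        float fov_rad) {
      int n = depth_row.size();
      std::vector<float> ranges(n);
      for (int u = 0; u < n; ++u) {
        // approximate angle of this pixel column relative to the optical axis
        float angle = ((u - n / 2.0f) / n) * fov_rad;
        // depth is measured along the optical axis; convert to ray length
        ranges[u] = depth_row[u] / std::cos(angle);
      }
      return ranges;
    }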
    References :
    – https://github.com/turtlebot/turtlebot/blob/indigo/turtlebot_bringup/launch/3dsensor.launch
  6. teleop_twist_keyboard
    The teleop_twist_keyboard ROS package takes keyboard input and publishes cmd_vel messages.
  7. gmapping
    sudo apt-get install ros-indigo-slam-gmapping ros-indigo-gmapping
    The ROS gmapping package uses Simultaneous Localization and Mapping (SLAM) to produce a 2D map from laser scan data. I played around with the parameters linearUpdate, angularUpdate, and particles to get a reasonably accurate map. I largely followed the directions in http://www.hessmer.org/blog/2011/04/10/2d-slam-with-ros-and-kinect/ and referenced the parameter settings in the turtlebot stack (http://wiki.ros.org/turtlebot).
  8. navigation
    sudo apt-get install ros-indigo-navigation
    I used the amcl, base_local_planner, and costmap_2d packages from the ROS navigation stack, largely following the instructions in http://wiki.ros.org/navigation/Tutorials. One thing I had to do differently was adding map topics for the global and local costmaps in rviz instead of a grid topic. I also modified and used https://code.google.com/p/drh-robotics-ros/source/browse/trunk/ros/ardros/nodes/velocityLogger.py to calculate the acceleration limits of the robot (sketched below). After hours of tuning amcl, base_local_planner, and costmap_2d parameters, I finally got everything to work.
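    The idea behind the velocity logger is simple: command full speed, log the velocity reported by odometry, and take the steepest rise as the acceleration limit. A rough sketch of that calculation (illustrative, not the ardros script itself):
    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // v: velocity samples from odom, recorded at a fixed period dt (seconds).
    double estimateMaxAccel(const std::vector<double> &v, double dt) {
      double a_max = 0.0;
      for (std::size_t i = 1; i < v.size(); ++i)
        a_max = std::max(a_max, (v[i] - v[i - 1]) / dt);
      return a_max;  // use as acc_lim_x in the base_local_planner params
    }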

55 thoughts on “My Personal Robotic Companion”

  1. Hi,

    Very nice project. I am interested in building and learning ROS by actually making a robot like this. Is it possible for you to share the robot build step by step and make your code open source so that people like me can benefit from your work? I am a newbie to ROS and robotics. I have a TurtleBot that I have used to learn the basics of ROS and programming, but I would like to learn more by building my own machine. Can you tell me how much it costs to build the robot (excluding the Kinect and the laptop)? If I use the same hardware as you do, will I be able to make something like this? How much time is it going to cost me? I would really appreciate it if you could answer my questions and help me with this project.

    Thanks
    Alex

    1. Hi Alex,

      All of my code is in a public GitHub repo for all to see. Just click on the link at the very top of the blog post. The entire project took me about 3 months, working mostly on weekends.
      1. The hardware, excluding the Kinect, laptop, and Android phone, cost about 250 dollars (you can spend much less if you get weaker motors and a smaller battery pack). Just be sure to get DC motors with integrated encoders and a decent torque of at least 3 kgf·cm. The robotic platform itself is very easy and quite fun to build. You simply need to attach two motors to wheels, attach the motors to a flat frame, connect the motors to a motor driver, connect the motor driver and encoders to an Arduino and battery, and finally connect the Arduino to a laptop. Building the hardware can be done over a weekend or two.
      2. Naturally, the next step is writing an Arduino sketch that can calculate and control the speed of the DC motors. The most common approach is PID control, which you can learn about by googling something like “arduino PID control.” First try to learn some of the theory behind PID control; then take a look at the references in the post above and my code on GitHub to see how I implemented it.
      3. The next step is to interface the Arduino sketch with ROS. A nice package called rosserial lets you do exactly this, and you can read up on the documentation by googling rosserial. This is also why you might want to get an Arduino Mega instead of an Uno: rosserial can handle larger messages and more topics on a Mega than on an Uno (a minimal sketch of the wiring appears after this list). You can test whether your robot is working with ROS properly by installing and running the teleop_twist_keyboard package to drive your robot around. At this point, you have created a basic remote-controlled car.
      4. Next, you would want to connect the Kinect to the laptop and interface it with ROS. You can simply follow the steps I took in the post above.
      5. Finally, try to follow Hessmer’s tutorial on running gmapping and navigation. His tutorial is outdated, so you need to compare his launch files with the turtlebot launch files, which are up to date.
      6. At this point everything should be working, but rather miserably. You then enter an infinite loop of fine-tuning your robot’s odometry and the parameters in amcl, gmapping, and navigation. If you have any questions along the way, feel free to ask them here.
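      As promised in step 3, here is a minimal sketch of the basic rosserial wiring. The names and the per-wheel velocity mixing are placeholders, not the exact code in motor_controller.ino:
      #include <ros.h>
      #include <geometry_msgs/Twist.h>
      #include <geometry_msgs/Vector3Stamped.h>

      ros::NodeHandle nh;
      double target_left = 0, target_right = 0;

      void cmdVelCb(const geometry_msgs::Twist &msg) {
        // turn linear.x / angular.z into per-wheel targets (placeholder mixing)
        target_left  = msg.linear.x - msg.angular.z;
        target_right = msg.linear.x + msg.angular.z;
      }

      geometry_msgs::Vector3Stamped rpm_msg;
      ros::Subscriber<geometry_msgs::Twist> sub("cmd_vel", cmdVelCb);
      ros::Publisher pub("rpm", &rpm_msg);

      void setup() {
        nh.initNode();
        nh.subscribe(sub);
        nh.advertise(pub);
      }

      void loop() {
        // fill rpm_msg.vector.x / .y with the measured wheel RPMs, then:
        pub.publish(&rpm_msg);
        nh.spinOnce();
        delay(100);  // roughly the 10 Hz loop from the post
      }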

      Best of luck,
      Sung

  2. Cordial greetings Cha Sungjik, my name is Jesus Sandoval David Ortiz, from Colombia, Norte de Santander state, city of Cucuta. I am a student of Mechatronics Engineering at the University of Pamplona. I have seen the project published on your blog, and it really is an excellent, very complete project. My sincere congratulations.
    I am currently developing a robot with similar characteristics and components, with the aim of doing autonomous navigation and SLAM. The design and construction of the robot are documented on my blog, where the parts and components used are detailed. (https://davidsandovalort.wordpress.com/)
    I am in the process of learning ROS to program the robot, and it has been quite difficult. I want to ask a favor: could you explain how to run the commands in the package on GitHub?
    I would really appreciate it if you could answer the question and help me with my project.
    Apologies for my writing; I do not have a good command of the English language.

    Thank you

    Sincerely:
    Jesus Sandoval David Ortiz

    1. Hello Jesus,

      If you want to run my code on your robot, you will need to make some modifications.
      1. First clone my github repo using the command: $ git clone https://github.com/sungjik/my_personal_robotic_companion.git
      2. Install ros packages for navigation, freenect, and gmapping.
      3. If you aren’t using Adafruit’s Motor Shield v2 as your DC motor driver, you will need to make the appropriate changes to the motor_controller.ino file.
      For example, Adafruit’s motor shield library sets the motor speed and direction with the setSpeed and run methods, which you will need to adapt to the motor driver you are using.
      4. Change the wheel diameter, track width, etc. in robot_specs.h (a sketch of such a header appears at the end of this reply).
      5. Edit the static_transform_publisher args in robot_config.launch.
      You will need to change the static tf args, such as -0.17, to fit your robot’s dimensions. You can follow the directions in http://www.hessmer.org/blog/2012/02/11/ardros-transform-between-base_link-and-the-kinect-sensor/
      6. If you aren’t using an Android phone as an IMU like I did, comment out the imu_node lines in robot_config.launch.
      7. If you haven’t created a URDF model for your robot, comment out the urdf include lines in driver.launch, slam.launch, move_base.launch, and laser_scan.launch.
      8. Compile everything and source your packages as described here : http://wiki.ros.org/ROS/Tutorials/InstallingandConfiguringROSEnvironment
      9. Try driving around the robot : $ roslaunch my_personal_robotic_companion driver.launch
      10. For trying the laser scanner, SLAM gmapping, and navigation, roslaunch laser_scan.launch, slam.launch, and move_base.launch respectively.
      11. You will need to play around with the parameter settings in the launch files, such as the linear and angular scale constants in robot_config.launch, the params in amcl_diff.launch, the yaml files, etc. All the information you need to get everything properly set up and working is in the ROS navigation tutorials and the gmapping wiki page.
      Honestly, I think you will find it easier to start from scratch and use my code as a reference instead of trying to port it onto your robot.
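      As mentioned in step 4, robot_specs.h holds the robot’s physical constants. A sketch of what such a header might contain (the names and values here are illustrative; check the repo for the real ones):
      #ifndef ROBOT_SPECS_H
      #define ROBOT_SPECS_H

      // Illustrative values -- measure and replace these for your own robot.
      #define GEAR_RATIO      120     // 1/120 gearbox (from the parts list)
      #define ENCODER_CPR     26      // encoder counts per motor revolution
      #define WHEEL_DIAMETER  0.12    // meters (hypothetical)
      #define TRACK_WIDTH     0.30    // meters between wheel centers (hypothetical)
      #define MAX_RPM         49.5    // rated motor speed (from the parts list)

      #endif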

      Best of luck,
      Sung

  3. Hello Sung,

    I’m a student at the University of Technical Education in Vietnam. I’m doing my final project and I found your amazing project, so I decided to follow your tutorial. I set up Ubuntu on my laptop, but I don’t know what to do next. Can you help me out? I am now planning to buy the hardware, but when I saw your demo pictures and section 1 (Hardware) at the top of the blog post, I felt a little confused about the “Adafruit Motor Shield v2” and the “DC to DC step up and step down converter”, because I don’t know how to buy these things in my country. Can I replace them with something else? And could you give me some more pictures of how to connect the hardware? I’m a beginner in ROS and robotics, so my skills are very basic!
    I will be happy if I get your help! My English is not good, so I am really sorry if my reply is difficult to understand.

    Thank you,
    Giang

    1. Hi Giang, sorry for the late reply. You can use any DC motor shield; just make sure you can use it with an Arduino. You will need the DC to DC step up converter if you plan to connect your Kinect to an 11.1 V LiPo battery pack like I did. The converter simply changes the voltage from 11.1 V to the 12 V that the Kinect requires. The hardware setup is pretty straightforward: you just need to connect the motors and wheel encoders to the motor shield and connect the motor shield to the Arduino. You can google each step, e.g. connecting wheel encoders to a motor shield and Arduino, and find tons of tutorials. I think the diagram at the top explains the overall big picture. For more detailed steps, please take a look at my replies to the comments above.

      1. Hi Sung!
        Thank you for your reply! I’m really happy right now! While waiting, I read through all of your replies to the comments on your blog, including the comments on the “about me” page, and they all helped me so much. I did most of the steps! Right now I am driving my robot around using teleop_twist_keyboard. It’s really fun; my robot has hit the obstacles many times. I think this problem is with my PID. Can you give me some advice?
        And when my robot does SLAM, the map can’t be built. Its axis just spins around while the robot doesn’t move. Then I tried driving my robot around, but things did not look any better, and the result is a mess. My partner and I laughed out loud. We don’t know what to do to solve this problem!
        Can you help me out!? 🙂
        Anyway, thank you for everything! I think God has led me to you! You know, I was stuck on everything in the beginning! But now…… thank God! 🙂

        Once again, Thank you! God bless you!
        Giang

      2. Hi Sung!

        The link above is a video showing my problem!
        I would really appreciate it if you could help me with this!

        Thank you,

        Giang

  4. And can you give me a paper or something about the theory and the algorithms? Because I just built the robot without clearly understanding what makes it work! And that’s not good at all.
    Thank you,
    Giang.

    1. Hi Giang, sorry for the late reply. I hope you solved the problem in your YouTube video, but just in case you haven’t figured it out yet, I think it is probably because of your odometry. Try to look at your tf (transform) tree and see if all the nodes are connected; I forgot how, but if you browse through the ROS tutorials there should be one on how to view the tf tree. Also make sure all your axes are pointing where they should be pointing, i.e. the robot’s orientation (left, right, up, down) matches your rviz representation. As for the theory behind SLAM, I would recommend “SLAM for dummies” from MIT (google it and you’ll find a PDF).

  5. Hi Sung, I have a problem; can you help me? When I type the command $ roslaunch my_personal_robotic_companion driver.launch, the output is: “Unable to sync with device; possible link problem or link software version mismatch such as hydro rosserial_python with groovy Arduino”.
    What should I do? I have tried the ros_lib example (Blink) and it worked well, and I tried replacing the rosserial in this package with rosserial indigo, but I still get the same error.

  6. Thank you for the amazing post! I am building this for my senior design project in electrical engineering. I was wondering about using a Raspberry Pi instead of a laptop on the robot: are there any major changes I would have to make on the ROS side? I was planning on using a PC to remote-desktop into the Pi.
    Thanks

    1. That was actually something I considered for quite a while! I also had a remote desktop client on my MacBook Air so that I could see the rviz visualizations from my larger MacBook Pro. I’m not sure you will be able to run all the ROS nodes and rviz on a Raspberry Pi, though, which is why I decided to put a MacBook Air on the robot instead of a Raspberry Pi. You could try to cut out all the non-vital ROS nodes and optimize your code to make everything run on the Pi; it might just work with the new Raspberry Pi 3. Personally, I think gmapping and navigation would work but be quite slow on a Pi, and rviz would be nearly impossible to run, so no cool visualizations.

  7. Hey brother, I cloned your files using the git clone command. Whenever I attempt to launch the laser_scan.launch file I get the following error. I tried to follow the tutorials on sourcing files, but it seems like you didn’t set up a catkin workspace, so I’m a little lost.

    raise ResourceNotFound(name, ros_paths=self._ros_paths)
    ResourceNotFound: my_personal_robotic_companion
    ROS path [0]=/opt/ros/indigo/share/ros
    ROS path [1]=/opt/ros/indigo/share
    ROS path [2]=/opt/ros/indigo/stacks
    kuljot@kuljot-Lenovo-IdeaPad-U530-Touch:~/my_personal_robotic_companion/my_personal_robotic_companion/launch$

    Any thoughts on what might fix this issue?

    Thanks

      1. Hey guys, have you modified the Adafruit motor controller code for the VNH5019 motor driver? I have the same one, but I don’t know much about coding. Could you share a motor_controller.ino for the Pololu VNH5019 motor driver, please?

  8. Hey bro Sungjik, gyro.py gives me this error:
    Traceback (most recent call last):
    File "gyro.py", line 107, in <module>
    imu_publisher(sock)
    File "gyro.py", line 37, in imu_publisher
    sock.bind((host,port))
    File "/usr/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)
    socket.error: [Errno 99] Cannot assign requested address

  9. Hello,
    I’m very interested in your SLAM robot, and I’m trying to make a SLAM robot based on yours.
    Currently, I’m writing base_controller.cpp.

    I have something to ask you .

    (1) Regarding lines 142 to 173 in base_controller.cpp:
    What is the meaning of these covariance values?
    How did you calculate them?
    One of the covariance elements is 35; what does it mean?
    Where is the covariance used?

    1. I have the same doubt. I’m modifying base_controller.cpp now, but I don’t know what those values mean. Does it matter if I modify them, or should I leave them alone?
      Thanks
      Gerson

  10. Very nice job!
    Do you have some sample code to connect with robot_localization or robot_pose_ekf?
    In your code you start to fill in the covariance.
    Currently I am working with ros_arduino_bridge.
    Thank you very much for the tutorial.

  11. Sung, this is not only a great bot, but a terrific tutorial as well. I’m going to follow your steps here with one exception: I’m going to try to use the Intel RealSense camera. If that doesn’t work, I’ve got a Kinect. But my question is about the laptop. Outside of the USB 3 issue (for the RealSense), could I use a Pi 3B for ROS? Would it have the processing power to handle things? (I’ve got a 2015 MBP I can use if need be; just trying to see what I can get away with.) I should state that I’ve never worked with ROS, so this will be my first foray.
    Thanks!

  12. Hello Sung
    I must say this is an excellent project. I’m doing a similar one, but with a Create 2 base. The Create is notorious for its unreliable encoders, so the odometry info you can extract is really bad. I came across your blog while researching how to integrate a gyro into my project. It’s a pretty good idea to use your phone’s gyro for this purpose, so thanks for the tip and instructions. I will definitely try it and see if it is reliable enough before buying a separate gyro. Correct me if I’m wrong, but the linear scale and alpha parameters are what you used to fine-tune/calibrate your combined odometry, right? Did you follow any specific method to find them? Also, have you considered using robot_pose_ekf?

  13. Hello
    I have a problem ;(
    “Unable to sync with device; possible link problem or link software version mismatch such as hydro rosserial_python with groovy Arduino”
    I checked on Lubuntu 14.04 with ROS Indigo, and on Ubuntu 14.04 and Ubuntu 15.04 with ROS Jade.
    Help me please

  14. Dear Panda Burkhardt, all the files are intact. You should just replace “patrick_the_robot” with “my_personal_robotic_companion” wherever it appears in $(find …) in your launch files.
    And make sure the following line in your robot_config.launch is suited to your robot design:

    <node pkg="tf" type="static_transform_publisher" name="base_to_kinect_broadcaster" args="-0.17 0.04 0.1975 0 0 0 /base_link /camera_link 100" />

  15. Ashish Menon here! My question is: when the bot is placed in the room for the first time, is the mapping of the room done autonomously or by keyboard commands?

    1. It is done by keyboard commands through the teleop_twist_keyboard node. Do you know how to do something similar, but autonomously?

  16. Hey Sung,

    thank you so much for the great work. I just wonder why you did not use an MPU-6050 (gyro + accelerometer) or an MPU-9150 (gyro + accelerometer + compass) directly on the Arduino to publish odometry data, instead of the Android mobile phone?

    Thank you again
    Armin

  17. Is the robot_specs.h header included in base_controller.cpp the same robot_specs.h as in the Arduino motor controller sketch?
    My question is about:

    In base_controller.cpp:
    #include <robot_specs.h>

    In motor_controller.ino:
    #include "robot_specs.h"

    Do these come from the same header?

    Thank you for your tutorial.
    Thank you for your tutorial


  19. Hello Sung,
    How important is it to have a URDF file that describes the robot accurately? Did you make a URDF file for your robot? If not, how is your robot aware of its own shape while navigating around obstacles?
    Thanks!

  20. I have a problem with HyperIMU. While running, it crashes at “waiting for device …” and receives an incomplete UDP packet. Help me.

  21. UDP is fine now; there was a problem with the odometry. My motors have a gear ratio of 29:1 while the project uses 1:120, which meant that about 11 turns of the whole robot showed up as one revolution in rviz.

  22. I had not visited your blog for a while, because it seemed boring to me, but the latest posts are of good quality, so I suppose I will add you to my daily list of websites. You deserve it, friend. 🙂

    Greetings

  23. Hey, I am also working on ROS. I just want to run this algorithm; can you tell me how to run it on my PC, step by step?
