Now that I have put my robot together, what's next?
The only documentation I could find on the Internet about this specific robot is the Artificial Human Companions blog. Its author, Simon Birrel, bought the robot in early 2016 and has written a couple of good articles to share what he learnt.
Lack of documentation was not a huge problem, as almost all tools and programs for TurtleBot2 can also be applied to the Deep Learning Robot.
So, by referring to Simon's posts and other TurtleBot2 resources, I have experimented with the following:
- Learn about ROS
- Use VMware Fusion to run Ubuntu on my MacBook Pro
- Get IP address of robot from computer
- Check the battery level of the robot
- Miscellaneous network setup
- Use SLAM (Simultaneous Localization and Mapping) program to create an indoor map of my home
I will comment on each of these activities in the sections below.
Learn about ROS
The Deep Learning Robot, like TurtleBot2, runs on Robot Operating System (ROS).
Despite the name, ROS is not actually an operating system but rather a framework and set of programs that run on a "real" operating system like Ubuntu. ROS provides abstraction over low-level device control so that developers can focus on programming high-level behaviour. It provides a standard mechanism for pieces of robotics software to be written and to communicate with one another.
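The communication mechanism at the heart of ROS is publish/subscribe over named topics. As a rough illustration of the idea (plain Python, not the actual rospy API), the pattern looks something like this:

```python
# Toy sketch of the publish/subscribe pattern ROS uses.
# This is NOT rospy; it only mimics the idea of nodes
# exchanging messages over named topics via a broker.
from collections import defaultdict

class Broker:
    """Stands in for the ROS master plus topic transport."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []

# One "node" subscribes to a topic...
broker.subscribe("/mobile_base/sensors/core", received.append)

# ...and another "node" publishes a message to it.
broker.publish("/mobile_base/sensors/core", {"battery": 160})

print(received)  # [{'battery': 160}]
```

In real ROS, the master only matches publishers with subscribers; the nodes then talk to each other directly, which is why the networking setup later in this post matters.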
There are many resources out there about ROS. I would recommend one that I found particularly helpful: "Programming for Robotics - ROS" provided by ETH Zurich, a robotics course with lectures and exercises freely available online.
This is not an online course in the traditional sense. There are no forums or multiple-choice quizzes, just four video lectures and exercises, but they explain the material very well. The best part is that they provide an Ubuntu image file ready to be used in VMware Fusion, so I could play around with ROS commands immediately, without installing anything or fixing dependency problems.
I also found the official ROS wiki introduction and basic concepts very helpful. Give it a go here.
Use VMware Fusion to run Ubuntu on my MacBook Pro
I need to have a local environment that runs ROS, so that I can communicate with the robot and develop ROS programs easily. My options are:
- Install ROS on my MacBook Pro - People say this will be difficult because ROS does not officially support MacOS.
- Load Ubuntu on the MBP, then install ROS on Ubuntu - This is a possible setup and will perform better than the next option. However, leaving the MacOS environment (or working with dual-boot all the time) is probably not suitable for everyone. Not me, anyway.
- Load Ubuntu on a local virtual machine, then install ROS on Ubuntu - Everyone says this is the best for beginners. So I went with it.
The main options for virtual machine software on MacOS are VMware Fusion and VirtualBox. I have used both before, and VMware Fusion gave me far fewer problems and ran more smoothly, so I went with it, and it worked very well as expected. It is a paid product but worth the money.
Get IP address of robot from computer
I have to know the robot's IP address before I can ssh to it. How can I get its IP without connecting a monitor, keyboard and mouse? I could use nmap to show the IP addresses of all the devices connected to my network.
sudo nmap -sn 192.168.1.1/24
-sn means "no port scan", so the command takes much less time to run.
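The output of nmap -sn includes an "Nmap scan report" line for each live host. To pull out just the addresses from a script, a small parse like this works (the sample output below is fabricated for illustration):

```python
import re

# Fabricated sample output in the shape `nmap -sn` produces.
sample_output = """\
Starting Nmap 7.80 ( https://nmap.org )
Nmap scan report for 192.168.1.1
Host is up (0.0040s latency).
Nmap scan report for 192.168.1.23
Host is up (0.012s latency).
Nmap done: 256 IP addresses (2 hosts up) scanned in 3.21 seconds
"""

def live_hosts(nmap_output):
    """Extract the IP address from each 'Nmap scan report' line."""
    return re.findall(r"Nmap scan report for (\d+\.\d+\.\d+\.\d+)",
                      nmap_output)

print(live_hosts(sample_output))  # ['192.168.1.1', '192.168.1.23']
```

With the list narrowed down to a few candidates, it is quick to try ssh against each one until the robot answers.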
Check the battery level of the robot
It is not very nice to have the robot suddenly shut down due to running out of battery. To check the battery level:
# Run on the robot if not already running.
roslaunch turtlebot_bringup minimal.launch

# Run on the local machine to check the topic /mobile_base/sensors/core
rostopic echo /mobile_base/sensors/core
A lot of messages will be displayed. Use Ctrl-C to exit rostopic. Check the value of the "battery" attribute; it should be about 160 when the battery is full.
To get battery level readings as a percentage, I would have to write a program for it. These two TurtleBot tutorials talk about how to do that.
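As a rough sketch of what such a program computes: the raw reading is the battery voltage in tenths of a volt, so ~160 full is roughly 16 V, and the percentage is a linear interpolation between an empty and a full reading. The full and empty thresholds below are assumptions for illustration; calibrate them against your own robot's readings.

```python
def battery_percent(raw, full=160.0, empty=130.0):
    """Convert the raw battery reading from /mobile_base/sensors/core
    (tenths of a volt) to a rough percentage.

    `full` and `empty` are ASSUMED thresholds for illustration --
    measure your own robot's readings to calibrate them."""
    pct = 100.0 * (raw - empty) / (full - empty)
    return max(0.0, min(100.0, pct))  # clamp to [0, 100]

print(battery_percent(160))  # 100.0
print(battery_percent(145))  # 50.0
```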
Miscellaneous network setup
I followed the steps from Simon's blog post that make my ROS development life much easier:
- Set a hostname for my Ubuntu workstation and the robot.
- Set up the robot to be reachable by hostname and to appear in Finder, by installing:
  - avahi, an open-source Bonjour implementation that allows hosts to be reached by hostname instead of IP.
  - netatalk, an open-source implementation of the Apple Filing Protocol (AFP) that allows the robot to appear in my MacBook's Finder as a network drive.
- Set up environment variables that allow ROS to work well across my computer and the robot
- See a live view from the robot's camera
- Visualise the depth image from the 3D Camera
Above: 3D depth image of me sitting in front of the robot.
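For reference, the environment variables in question are ROS_MASTER_URI and ROS_HOSTNAME. A typical setup, assuming the hostnames robot.local and workstation.local from the avahi step (yours will differ), looks like this in each machine's ~/.bashrc:

```shell
# On the robot (which runs roscore) -- hostnames are examples:
export ROS_MASTER_URI=http://robot.local:11311
export ROS_HOSTNAME=robot.local

# On the workstation:
export ROS_MASTER_URI=http://robot.local:11311  # point at the robot's master
export ROS_HOSTNAME=workstation.local
```

Both machines point ROS_MASTER_URI at the robot, since that is where roscore runs, while each sets ROS_HOSTNAME to its own name so nodes can reach each other directly.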
I also updated the original password of the robot and the workstation.
Use SLAM (Simultaneous Localization and Mapping) program to create an indoor map of my home
Following another post from Simon, I was able to run a program that makes the robot build a map while navigating around my home. I could then click somewhere on the map, and the robot would go there.
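Under the hood, SLAM programs build an occupancy grid: each cell holds an estimate of whether that patch of floor is free or blocked, updated from range readings as the robot moves. This heavily simplified sketch (plain Python, not the real SLAM algorithm, which also has to estimate the robot's own pose) shows the mapping half of the idea:

```python
# Heavily simplified occupancy-grid sketch -- NOT a real SLAM
# algorithm, just the idea of turning range readings into a map.
FREE, OCCUPIED, UNKNOWN = ".", "#", "?"

def blank_map(width, height):
    """All cells start out unknown."""
    return [[UNKNOWN] * width for _ in range(height)]

def mark_reading(grid, x0, y0, x1, y1):
    """Record one horizontal 'beam' from the robot at (x0, y0) to an
    obstacle at (x1, y1): cells along the beam are free, the cell at
    the end is occupied."""
    for x in range(x0, x1):
        grid[y0][x] = FREE
    grid[y1][x1] = OCCUPIED

grid = blank_map(6, 2)
mark_reading(grid, 0, 0, 4, 0)  # robot at (0, 0) sees a wall 4 cells away

print("".join(grid[0]))  # "....#?"
```

Navigation then works against the same grid: clicking a goal on the map asks a planner to find a path through free cells.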
I now have an understanding of how ROS works and am able to run different ROS nodes to do basic things, but I still have not written a single line of code. That will be my next challenge.