Using Sensors via ROS (Tue Sep 17, lecture 6)
How do we use sensor data to actually do something useful?

Homework due for today


  1. Read: Read Chapter 7 of PRR and remember to refer to 7: Wander Bot as you try to understand and test out the code. Believe me you won’t really understand the chapter until you get the code to run! Consider the following warmup questions:
    1. As we are talking about sensors, think about this: in the Red Light! Green Light! program, what plays the role of a sensor?
    2. This is tricky: the formula on page 104: bearing = msg.angle_min + i * msg.angle_max / len(msg.ranges) is incorrect (i.e. it’s an error in the book!) What is the correct formula, or what is the error?
    3. In Wander-bot, where is the code that ensures the robot doesn’t run into a wall while it is wandering? What line(s)?
    4. Please write one or two things that are still confusing to you; if it’s all clear, then please write one or two major takeaways. Deliverable: Your responses to the above questions, in brief, in a pdf with your name and homework number at the top.
  2. Lidar: I want you to use your intuition and your knowledge of geometry. Try to understand the data that comes from the LIDAR. You can think of them as “rays” that go in all directions. The data you get back is an array of distances to the first obstacle. Using words, diagrams and/or pseudo code, try and answer:
    • Given a single LIDAR sensor, how would you detect that you are about to crash into a wall?
    • How would you compute the angle at which you are approaching the wall? Deliverable: Submit your response as a pdf. Points off for a superficial answer.
  3. Read: Wall Following Algorithms Paper. Write up what your major takeaways are from this paper. Deliverable: 1 page max pdf writeup


  • Let’s talk over your answers to the homework questions
  • Requested workshops: #1 is running your code on an actual robot
  • How to deliver this?
  • New OH times
  • Discussion of the pace of the course so far. Fast? Slow?
  • Grading is happening
  • Show projects area being developed

Review - how it works in ROS

  • Almost all your code in ROS runs as a node
  • Almost all information is transferred in the form of messages
  • Sensors
    • Are electronic devices
    • Wired to the built-in computer
    • In the case of the TB3, wired to the Arduino
    • Code on the Arduino reads the I/O and participates as a node in ROS
    • Publishing information
  • ROS info required
    • What is the “topic” that the device publishes to
    • How frequently does it publish (e.g. 10 times per second, once every second)
    • And what is the message type
  • Acting on the sensor data
    • A node subscribes to that topic
    • Writes a “handler” which is invoked each time the sensor node publishes
  • Simulation and IRL are different
    • Sensor’s specifications are different (subtly)
    • Ranges and what happens when out of range
    • Degree of “randomness”
    • Exact dimensions of robots, obstacles, etc.
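The publish/subscribe flow described above can be sketched without ROS at all. The `Topic` class below is a hypothetical stand-in (not a rospy API) that shows the essential pattern: a node registers a handler on a topic, and the handler is invoked each time the sensor node publishes:

```python
# Minimal stand-in for the ROS publish/subscribe pattern.
# "Topic" is an illustration class, not part of rospy.

class Topic:
    """A named channel that delivers each published message to all handlers."""
    def __init__(self, name):
        self.name = name
        self.handlers = []

    def subscribe(self, handler):
        # A node registers a callback ("handler") on the topic.
        self.handlers.append(handler)

    def publish(self, msg):
        # Each publish invokes every subscriber's handler with the message.
        for h in self.handlers:
            h(msg)

# A "sensor node" publishes scan readings; a "control node" reacts to them.
scan = Topic("scan")
received = []
scan.subscribe(lambda msg: received.append(msg))

scan.publish({"ranges": [1.2, 0.8, 3.4]})   # the sensor fires at, say, 10 Hz
scan.publish({"ranges": [1.1, 0.7, 3.3]})
```

In real ROS the two sides correspond to `rospy.Subscriber(...)` with a callback in one node and `rospy.Publisher(...).publish(msg)` in another, with the master matching them up by topic name and message type.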
NOTE! All the book instructions make reference to "turtlebot_xxx". We will be using "turtlebot3_xxx"


  • We will modify the instructions from the book so follow along here!
  • Code will be part of prrexamples package which we have been using: prrexamples
  3. Set up the simulation packages for Turtlebot3 by following 11. Simulation. In particular, make sure you add the turtlebot3_simulations package, as follows:
$ cd ~/catkin_ws/src/
$ git clone
$ cd ~/catkin_ws && catkin_make

Red Light/Green Light

  • Pretty trivial, toe in the water
    • Drive forward for a bit
    • Stop.
    • Repeat
  • Look at
  • Run code:
$ roslaunch turtlebot3_gazebo turtlebot3_world.launch
$ ./
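The timing logic behind Red Light/Green Light can be sketched as a pure function: given the elapsed time, decide whether to drive or stop. The 3-second period and 0.2 m/s speed below are assumptions for illustration, not necessarily the book's values:

```python
# Sketch of the Red Light/Green Light alternation.
# Period and speed are assumed values, not the book's exact constants.
PHASE = 3.0  # seconds driving, then seconds stopped

def forward_speed(elapsed, phase=PHASE, speed=0.2):
    """Return the linear velocity to publish at a given elapsed time."""
    # Even-numbered phases drive forward; odd-numbered phases stop.
    phase_index = int(elapsed // phase)
    return speed if phase_index % 2 == 0 else 0.0
```

In the real node this value would be placed in `Twist().linear.x` and published on the `cmd_vel` topic on every loop iteration.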

Reading sensor data

  • TB3 has a lidar
  • Lidar returns an array of 360 doubles (and some other stuff)
  • 0 degrees is straight ahead
  • Each element has the distance to the nearest obstacle (in meters)
  • Compute bearing (in degrees) of element i as follows:
bearing = (msg.angle_max - msg.angle_min) / (len(msg.ranges) - 1) * i + msg.angle_min
  • To compute the distance to the obstacle immediately ahead (0 degrees)
    • msg.ranges[0]
  • To compute the nearest obstacle:
    • nearest_obstacle = min(msg.ranges)
  • But of course those won’t be very useful (why?)
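The bearing formula and the nearest-obstacle computation can be checked with plain Python. `FakeScan` is a stand-in for the real `sensor_msgs/LaserScan` message; the values in it are made up for illustration. It also shows why a bare `min()` isn't very useful: real readings can contain 0.0 (or inf) for beams with no return, so those have to be filtered out first:

```python
import math

class FakeScan:
    """Stand-in for sensor_msgs/LaserScan: 360 beams over a full circle."""
    def __init__(self):
        self.angle_min = 0.0
        self.angle_max = math.radians(359)
        self.ranges = [3.5] * 360  # 3.5 m used here as the max range
        self.ranges[0] = 0.6       # obstacle straight ahead
        self.ranges[90] = 0.0      # 0.0 stands for "no return" on this beam

def bearing(msg, i):
    # Corrected formula: interpolate i between angle_min and angle_max.
    # Result is in the same units as angle_min/angle_max (radians).
    return (msg.angle_max - msg.angle_min) / (len(msg.ranges) - 1) * i + msg.angle_min

def nearest_valid(msg):
    # Filter out zero/infinite readings before taking the minimum.
    valid = [r for r in msg.ranges if 0.0 < r < float("inf")]
    return min(valid)

msg = FakeScan()
ahead = msg.ranges[0]   # distance to the obstacle straight ahead
```

Note that `min(msg.ranges)` on this scan returns 0.0 — the no-return beam, not a real obstacle — which is exactly why the naive version isn't very useful.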

  • Experiment by running program and understanding it
  • Experiment by seeing how the scan data changes based on where the simulated robot is
$ roslaunch turtlebot3_gazebo turtlebot3_world.launch
$ rostopic echo scan

Wander Bot

  • Program will perform a pretty stupid trick
  • It will drive in a straight line for some seconds
  • Then it will spin in place for some more seconds
  • If it is too near an obstacle then it will spin in place
  • It turns out this is a pretty terrible algorithm
  • It gets stuck all the time
  • Take a look at the annotated code: Wander Bot Code

    • rospy.Time.now() returns a time structure with
    • fields secs and nsecs
    • Comparison operator is defined
    • rospy.Duration(x) is an interval, in seconds
  • Now run the program
$ roslaunch turtlebot3_gazebo turtlebot3_stage_3.launch
$ chmod +x
$ ./
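Wander Bot's drive/spin cycle can be sketched as a small state machine. The clock is passed in as a plain number so the sketch runs without rospy; the state names and the 5 s / 2 s / 0.5 m constants are illustrative assumptions, not the book's exact code:

```python
# Illustrative state machine for Wander Bot's drive/spin cycle.
# Durations, threshold, and state names are assumed, not the book's values.
DRIVE_SECS = 5.0   # drive straight this long
SPIN_SECS = 2.0    # then spin in place this long
TOO_CLOSE = 0.5    # spin early if an obstacle is nearer than this (m)

class Wander:
    def __init__(self, now):
        self.state = "drive"
        self.state_change_time = now + DRIVE_SECS

    def update(self, now, range_ahead):
        # Mirrors the pattern of comparing rospy.Time.now() against a
        # saved time plus a rospy.Duration(...).
        if self.state == "drive":
            if now >= self.state_change_time or range_ahead < TOO_CLOSE:
                self.state = "spin"
                self.state_change_time = now + SPIN_SECS
        else:
            if now >= self.state_change_time:
                self.state = "drive"
                self.state_change_time = now + DRIVE_SECS
        return self.state
```

The real node does the same thing inside a `rospy.Rate`-driven loop, publishing a forward `Twist` in the drive state and a rotating one in the spin state.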


  • Logic in a .py program can establish behaviors
  • All .py programs become nodes (almost)
  • Nodes can publish and subscribe, and do both at the same time
  • Need to know the exact names of the topics
  • Simulator is used for an idealized context for testing

Example - Simple “Roomba” pattern

Roomba is iRobot’s robotic vacuum cleaner. It seems to just drive forward until it gets to a wall and then turn some amount and then continue. We want to think through a simplistic version of this pattern.

  • Loop forever
    • While there is a wall ahead
      • Rotate in place
    • Move forward until there’s a wall ahead
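The loop above can be reduced to a pure decision function, which makes the questions concrete — in particular, what counts as "a wall ahead"? The 0.7 m threshold here is an assumed value, not something from the source:

```python
WALL_THRESHOLD = 0.7  # meters; assumed value, would be tuned on a real robot

def next_action(range_ahead):
    """Decide the Roomba-style action from the forward lidar reading."""
    # While there's a wall ahead: rotate in place; otherwise move forward.
    return "rotate" if range_ahead < WALL_THRESHOLD else "forward"

# Dry-run the loop against a canned sequence of forward readings.
readings = [2.0, 1.5, 0.6, 0.5, 1.8]
actions = [next_action(r) for r in readings]
# actions: forward, forward, rotate, rotate, forward
```

Even this toy version surfaces the open questions: how much to rotate each step, and what the robot should do if it ends up facing into a corner.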
Discussion about this approach
  • What questions need to be answered?
  • Will it work?
Let’s look at some code that uses sensors
  • prrexamples/
  • prrexamples/

Next Class