TT101: Basics of Mobile Robotics
Introductory mobile robots course for absolute beginners
In TT101: Basics of Mobile Robotics, we discuss the fundamentals of mobile robots, including sensors, perception, and algorithms for navigation. This lecture series aims to provide structured material that helps absolute beginners get started with mobile robotics.
TT101.1: Course Introduction
As the first lecture of the TT101: Basics of mobile robotics series, this video goes over the scope of the lecture series and who can benefit from the course content. The course focuses on mobile robotics and covers the fundamentals, excluding those required for designing and fabricating robots and robot components. All lectures are on-demand videos curated chronologically in a playlist.
TT101.2: Components of mobile robots
In lecture 2 of TT101: Basics of mobile robotics, we discuss various components of mobile robots. This lecture has a high level overview of the types of sensors, types of intelligence (for planning and decision making), types of actuators and joints and primary control paradigms.
TT101.3: Preliminaries and notations
In lecture 3 of TT101: Basics of mobile robotics, we cover the preliminaries and notation that serve as building blocks for the upcoming lectures. Here we discuss:
- State spaces (Configuration spaces, Degrees of Freedom, Work space, Task Space)
- Frames of References
- Transformations (Translation, Rotation, Affine transform, Homogeneous transform, Chained rotations, undoing transformations)
- Various ways to represent rotations (Euler angles, roll-pitch-yaw, quaternions, angle-axis representation)
- Kinematic constraints (Holonomic, non-holonomic, redundant)
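To make the transformation ideas above concrete, here is a minimal sketch in plain Python, assuming a 2D robot for simplicity; the function names are illustrative, not from the lecture. It builds a homogeneous transform, applies it to a point, chains transforms by matrix multiplication, and undoes a transform via its rigid-body inverse:

```python
import math

def make_transform(theta, tx, ty):
    """Build a 3x3 homogeneous transform: rotate by theta, then translate by (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx],
            [s,  c, ty],
            [0,  0,  1]]

def matmul(a, b):
    """Multiply two 3x3 matrices; chaining transforms is just matrix multiplication."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(t, point):
    """Apply a homogeneous transform to a 2D point (lifted to [x, y, 1])."""
    x, y = point
    v = [x, y, 1.0]
    r = [sum(t[i][k] * v[k] for k in range(3)) for i in range(3)]
    return (r[0], r[1])

def invert(t):
    """Undo a rigid transform: transpose the rotation, negate the rotated translation."""
    c, s = t[0][0], t[1][0]
    tx, ty = t[0][2], t[1][2]
    return [[ c,  s, -(c * tx + s * ty)],
            [-s,  c, -(-s * tx + c * ty)],
            [ 0,  0, 1]]
```

For example, composing a robot-to-world transform with a sensor-to-robot transform via `matmul` yields the sensor-to-world transform, and `invert` converts points back the other way.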
TT101.4: Sensors for mobile robots
In Lecture 4 of the TT101: Basics of mobile robotics series, we look into sensors for mobile robots. We discuss the need for sensors on mobile robots and various types of exteroceptive (cameras, lidars, sonar) and interoceptive (odometers, IMUs, gyroscopes, accelerometers, magnetometers) sensors, including a high-level overview of their working principles. We also look into challenges that arise when deploying such sensors on mobile robots, such as noise, sensor uncertainty, and rogue sensors.
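As a toy illustration of sensor noise (the sensor model and numbers below are invented for the sketch, not from the lecture), the snippet simulates a range sensor corrupted by zero-mean Gaussian noise and shows that averaging repeated readings tightens the estimate of the true distance:

```python
import random
import statistics

def read_range_sensor(true_distance, noise_std, rng):
    """Simulate one reading from a hypothetical range sensor (sonar/lidar-like)
    whose output is the true distance plus zero-mean Gaussian noise."""
    return true_distance + rng.gauss(0.0, noise_std)

# Averaging n independent readings shrinks the standard error roughly as 1/sqrt(n),
# which is one simple way robots cope with noisy sensors.
rng = random.Random(42)          # fixed seed so the simulation is repeatable
readings = [read_range_sensor(2.0, 0.05, rng) for _ in range(1000)]
estimate = statistics.fmean(readings)
```

A single reading can be off by several centimetres here, while the averaged estimate lands within a few millimetres of the true 2.0 m distance.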
TT101.5: From Sensing to Perception
In Lecture 5 of the TT101: Basics of mobile robotics series, we look into perception for mobile robots. Sensors report raw stimuli as-is; the data then passes through a perceptual pipeline so that we, or the robot, can make sense of it, or in other words, associate context with the data.
In this lecture, we look into two of the most well-studied perceptual pipelines, the audio and visual pipelines, and also look at how perception can be deceptive. Perception is just an interpretation of the sensed data and, as such, is rarely fully accurate. On top of this, it can lead to illusions and aliasing, which in turn might confuse the robot.
We also look into synesthesia, a rare neurological condition that might have potentially beneficial qualities for a mobile robot operating in unknown environments, if one could artificially induce such a condition.
TT101.6: From Perception to State Estimation
In Lecture 6 of the TT101: Basics of mobile robotics series, we look into state estimation for mobile robots.
Sensors report raw stimuli as-is; the data then passes through a perceptual pipeline so that we, or the robot, can make sense of it, or in other words, associate context with the data. Thereafter, the robot needs to estimate its own state and the instantaneous state of the world in which it is operating in order to take suitable actions.
This lecture introduces mapping, active vs. passive localization (global and local localization), and filtering. Filtering is often misinterpreted as throwing data away; in state estimation it instead means fusing data from various sensors to estimate the state and reduce the uncertainty of that estimate.
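As a sketch of the "filtering as fusion" idea, assume two independent sensors report Gaussian estimates of the same quantity (the numbers and the helper name `fuse` are illustrative, not from the lecture). Inverse-variance weighting combines them into a single estimate whose variance is smaller than either input, showing that fusion reduces uncertainty rather than discarding data:

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity using
    inverse-variance weighting: the more certain sensor gets more weight,
    and the fused variance is never larger than either input variance."""
    w = var_b / (var_a + var_b)                      # weight on estimate A
    fused_mean = w * mean_a + (1.0 - w) * mean_b
    fused_var = (var_a * var_b) / (var_a + var_b)    # harmonic-style combination
    return fused_mean, fused_var
```

For example, fusing two equally uncertain readings halves the variance; fusing a precise reading with a vague one stays close to the precise one.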
TT101.7: Towards recursive state estimation
In Lecture 7 of the TT101: Basics of mobile robotics series, we look into recursive state estimation for mobile robots. As the robot moves around in its environment and gathers new sensor observations, it needs to constantly update its state estimate. This is known as recursive state estimation, where the state often refers to the robot's pose and the map of the environment.
In the real world, robots often get deployed in unknown environments, meaning they have to build a map and localize themselves with respect to the map being generated. This is known as Simultaneous Localization and Mapping, or SLAM for short, which is the focus of this lecture.
In this lecture, we look into various SLAM approaches, such as filter-based, graph-based, keyframe-based, and bio-inspired approaches. Some examples of such approaches included in this lecture are:
- Kalman Filter
- Extended Kalman Filter (EKF)
- Unscented Kalman Filter (UKF)
- Particle Filter (PF)
- Rao-Blackwellized Particle Filter (RBPF)
- ORB-SLAM
- Google Cartographer SLAM
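As a minimal sketch of the recursive idea behind the filters listed above, here is a 1D Kalman filter with invented numbers (far simpler than any real SLAM system; the motion and measurement values are made up for illustration). Each cycle predicts the new state from a motion command, which grows uncertainty, and then corrects it with a measurement, which shrinks uncertainty:

```python
def kf_predict(x, p, u, q):
    """Prediction step: apply motion command u; variance grows by process noise q."""
    return x + u, p + q

def kf_update(x, p, z, r):
    """Correction step: blend the prediction with measurement z (noise variance r)."""
    k = p / (p + r)                  # Kalman gain: how much to trust the measurement
    return x + k * (z - x), (1.0 - k) * p

# One predict/update cycle of recursive state estimation in 1D:
x, p = 0.0, 1.0                          # initial belief: position 0, variance 1
x, p = kf_predict(x, p, u=1.0, q=0.5)    # commanded to move forward 1 unit
x, p = kf_update(x, p, z=1.2, r=0.5)     # a range measurement reports 1.2
```

Because the output belief `(x, p)` feeds straight into the next cycle's prediction, the same two functions run forever as new observations arrive; that loop is what makes the estimation "recursive".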
TT101.8: Introduction to Robot Operating System (ROS)
In Lecture 8 of the TT101: Basics of mobile robotics series, we look into ROS, the Robot Operating System. This is a guest lecture presented by Sakshay Mahna from @RoboticswithSakshay, an undergraduate Computer Science student with a passion for robotics and Artificial Intelligence (AI).
Robot Operating System, or ROS for short, is an interface between the hardware and software components of the robot.
Given the modular and open-source nature of this framework, one can easily reuse parts of the publicly available ROS packages to build robotic applications, saving some of the programming workload. This lecture also includes a worked example with some DIY exercises, giving learners a chance to get their hands dirty with a simulated TurtleBot 3 robot controlled via ROS.