We’re excited to present two new papers at ICRA 2021 in Xi’an, China! Our first paper describes a learning-based technique to adjust camera gain and exposure parameters to maximize the number of inlier feature matches for visual motion estimation under challenging lighting conditions. We demonstrate successful navigation through road tunnels – a situation where competing algorithms, and built-in auto-exposure and automatic gain adjustment, fail. Grab the preprint here: https://arxiv.org/abs/2102.04341; this work will appear in RA-L and at ICRA.
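To give a rough sense of the underlying idea, a camera parameter such as exposure can be adjusted by hill-climbing against an image-quality metric. Note that everything below – the gradient-based metric, the simulated camera, and the hill-climbing step – is a hypothetical stand-in for illustration; the paper instead learns a model tied directly to feature-match inliers.

```python
import numpy as np

def gradient_metric(img):
    """Proxy image-quality metric: mean gradient magnitude. (The paper
    learns a predictor of matchable features; this heuristic is only a
    stand-in for illustration.)"""
    gx = np.diff(img.astype(float), axis=1)
    gy = np.diff(img.astype(float), axis=0)
    return np.abs(gx).mean() + np.abs(gy).mean()

def adjust_exposure(capture, exposure, step=0.1):
    """One hill-climbing step: try slightly shorter and longer exposures
    and keep whichever scores best under the metric."""
    candidates = [exposure * (1 - step), exposure, exposure * (1 + step)]
    scores = [gradient_metric(capture(e)) for e in candidates]
    return candidates[int(np.argmax(scores))]

# Simulated camera: a ramp scene whose highlights clip when over-exposed
# and whose contrast shrinks when under-exposed.
scene = np.tile(np.linspace(0.2, 1.0, 64), (64, 1))
capture = lambda e: np.clip(scene * e * 255.0, 0.0, 255.0)

e = 2.0                       # start badly over-exposed
for _ in range(20):
    e = adjust_exposure(capture, e)
# e settles near 1.0, where the scene just avoids saturation
```

A real implementation would of course score live camera frames rather than a simulated scene, and the paper replaces the hand-crafted metric with a learned one.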
Our second paper presents a continuous-time approach for extrinsic calibration of cameras and 3D millimetre-wave radar sensors. This research, carried out in partnership with the LAMOR group at the University of Zagreb, enables accurate in-field calibration without the need for radar retroreflectors or specialized visual targets. Grab the preprint here: https://arxiv.org/abs/2103.07505.
Please join us for the presentation sessions at ICRA to learn more!
Congratulations to our alum Valentin Peretroukhin, who received the 2020 Gordon N. Patterson Award for the top PhD dissertation at UTIAS! His thesis, “Learned Improvements to the Visual Egomotion Pipeline,” is available on the laboratory Publications page. Great work, Valentin!
Congratulations to Ph.D. student Olivier Lamarre and postdoc Ahmad Bilal Asghar on winning 1st prize at the IROS 2020 Workshop on Planetary Exploration Robots! The associated short abstract describes considerations related to traversability uncertainty for long-duration rover navigation planning on remote planetary surfaces. Great work! Thanks to Moog for sponsoring the award!
Starting January 1, 2021, Prof. Kelly will serve as an Associate Editor for the IEEE Robotics and Automation Society’s Robotics & Automation Magazine (RAM). The magazine has over 12,000 readers and is consistently listed by Thomson’s Journal Citation Reports (JCR) as one of the most highly ranked publications in both the Robotics (#7) and Automation & Control (#15) categories, with an impact factor of 4.250 in 2018 (5-year IF: 4.816). The magazine publishes four issues per year: March, June, September, and December.
New RA-L and IROS 2020 work with the RLAI Lab at the University of Alberta on learning robust latent dynamics from images only. We demonstrate how latent dynamics can be made more robust for real-world robotic applications by composing a Kalman filter with a learned, heteroskedastic uncertainty model. Watch the YouTube video to find out more!
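As a toy sketch of the composition idea: a standard Kalman measurement update naturally downweights observations that arrive with a large predicted noise covariance, so plugging a per-measurement (heteroskedastic) uncertainty model into the update makes the filter robust to corrupted observations. The `learned_noise` heuristic below is a purely hypothetical stand-in for the trained network in the paper.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update with a per-measurement noise R."""
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ (z - H @ x)                 # corrected state
    P = (np.eye(len(x)) - K @ H) @ P        # corrected covariance
    return x, P

def learned_noise(z):
    """Stand-in for a learned heteroskedastic model: the paper predicts
    measurement uncertainty from images with a network; here we simply
    inflate R for implausibly large observations (hypothetical heuristic)."""
    base = 0.05
    return np.eye(len(z)) * (base + 10.0 * float(np.abs(z).max() > 5.0))

# Toy constant-position filter: the middle observation is a gross outlier,
# but its inflated R means it barely moves the state estimate.
x, P, H = np.zeros(2), np.eye(2), np.eye(2)
for z in [np.array([0.1, -0.2]), np.array([50.0, 0.0]), np.array([0.0, 0.1])]:
    x, P = kalman_update(x, P, z, H, learned_noise(z))
```

Because the filter equations are differentiable, the uncertainty model can be trained end-to-end through the update – the property the paper exploits.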
Have you always wondered whether your extrinsic sensor transformation globally minimizes its calibration cost function? With our latest work on certifiable monocular hand-eye calibration, wonder no more! Check out the IEEE MFI 2020 paper – we prove that trajectories satisfying observability requirements lead to convex relaxations that are inherently stable to measurement error. The open source implementation of our method is fast, requires no calibration targets, and works for a wide variety of sensors, including monocular cameras!
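For readers new to the problem: hand-eye calibration seeks the fixed sensor-to-sensor transform X relating paired relative motions, via the classic equation AX = XB. As a minimal sketch of that structure (using the classical axis-alignment rotation solve, not the certifiable formulation in our paper), the rotation part can be recovered by aligning the rotation axes of corresponding motions with an SVD:

```python
import numpy as np

def rot_from_axis_angle(axis, angle):
    """Rodrigues' formula: rotation matrix from axis-angle."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * K @ K

def log_axis(R):
    """Unnormalized rotation axis (2 sin(theta) * axis) from a rotation matrix."""
    return np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])

def handeye_rotation(As, Bs):
    """Solve R_A R_X = R_X R_B in the least-squares sense: the rotation
    axes satisfy a_i = R_X b_i, so align them with an orthogonal
    Procrustes (Kabsch) solve. Needs >= 2 non-parallel motion axes."""
    H = sum(np.outer(log_axis(B), log_axis(A)) for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

# Synthetic check: choose a ground-truth extrinsic rotation and two motions.
R_X = rot_from_axis_angle(np.array([1.0, 2.0, 0.5]), 0.7)
Bs = [rot_from_axis_angle(np.array([0.0, 0.0, 1.0]), 0.9),
      rot_from_axis_angle(np.array([1.0, 0.0, 0.0]), 0.4)]
As = [R_X @ B @ R_X.T for B in Bs]
R_est = handeye_rotation(As, Bs)   # recovers R_X on this noise-free data
```

Local solvers like this offer no global guarantee under noise; the contribution of the paper is a convex relaxation whose solution is certifiably globally optimal when the trajectory satisfies the observability requirements.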
Congratulations to lab members Valentin Peretroukhin and Matthew Giamou and to our collaborators, David M. Rosen, W. Nicholas Greene, and Nicholas Roy at MIT for winning this year’s RSS Best Student Paper Award! Full details and code for the paper, “A Smooth Representation of SO(3) for Deep Rotation Learning with Uncertainty,” are available here. Great work all!
Just a couple of days until the RSS 2020 Workshop on Power-On-and-Go Robots: ‘Out of the Box’ Systems for Real-World Applications! We are extremely excited about the event, which will be streamed live via Zoom. Full details are available at https://www.power-on-and-go.net/.
The workshop will bring together researchers from diverse backgrounds to address topics related to power-on-and-go robots: robotic systems that are able to successfully deal with new situations fluidly and to adapt immediately to new environments or to changes in their own operating parameters. We have a fantastic lineup of speakers and panelists, including Hadas Kress-Gazit (Cornell), Stefan Leutenegger (Imperial College), Nathan Michael (CMU), Arne Sieverling (Realtime Robotics), Luca Carlone (MIT), Ali Agha (JPL), Dorsa Sadigh (Stanford), and Gaurav Sukhatme (USC)!
As a follow-on to the workshop, our Call for Papers for a special issue of the journal Autonomous Robots is out now as well, with more details and deadlines available here. We hope you will be able to join us for an insightful virtual event!
We’re excited that three lab papers will be presented at this year’s (virtual) ICRA 2020 conference! Highlights and video links are below.
Check out our new extension to DPC-Net (from ICRA 2018): we show that DPC networks can be trained in a fully self-supervised manner, which improves accuracy and allows for retraining online in new environments!
Got features? Our recent RA-L and ICRA 2020 work demonstrates how to learn maximally-matchable image mappings to dramatically reduce the data needed for experience-based navigation.
Check out our work on a QCQP approach to inverse kinematics for redundant manipulators. We show that this difficult, nonconvex problem often admits a provably tight convex relaxation that can be efficiently solved! Coming soon to MoveIt!
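To see where the QCQP structure comes from, write the unknowns as the cosines and sines of the joint angles: the forward kinematics becomes quadratic in these variables, and each joint contributes a quadratic unit-circle constraint. The planar 2R arm below is a toy example of this reformulation (not the paper's redundant-manipulator setting or its relaxation); we solve it in closed form and verify the QCQP constraints hold.

```python
import numpy as np

# For a planar 2R arm with link lengths l1, l2 and variables
# c_i = cos(theta_i), s_i = sin(theta_i), the end-effector position is
#   x = l1*c1 + l2*(c1*c2 - s1*s2),  y = l1*s1 + l2*(s1*c2 + c1*s2),
# which is quadratic in (c1, s1, c2, s2), subject to the quadratic
# constraints c_i^2 + s_i^2 = 1: a QCQP.

def ik_2r(x, y, l1, l2):
    """Closed-form IK for a reachable planar target (elbow-down branch)."""
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2.0 * l1 * l2)
    s2 = np.sqrt(max(0.0, 1.0 - c2**2))
    t2 = np.arctan2(s2, c2)
    t1 = np.arctan2(y, x) - np.arctan2(l2 * s2, l1 + l2 * c2)
    return t1, t2

l1, l2 = 1.0, 0.8
target = (1.2, 0.9)
t1, t2 = ik_2r(*target, l1, l2)
c1, s1, c2, s2 = np.cos(t1), np.sin(t1), np.cos(t2), np.sin(t2)

# Evaluate the quadratic forward-kinematics model at the solution; it
# reaches the target while satisfying both unit-circle constraints.
fx = l1 * c1 + l2 * (c1 * c2 - s1 * s2)
fy = l1 * s1 + l2 * (s1 * c2 + c1 * s2)
```

For redundant manipulators no such closed form exists and the QCQP is nonconvex; the result in the paper is that its semidefinite relaxation is often provably tight, so the global solution can be extracted from a convex solve.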