Vision-based Simultaneous Localisation and Mapping (SLAM) is an AI technology that enables devices to track their own position while building a map of general, unstructured environments, using low-cost cameras and efficient onboard processing. An Imperial College London team, led by Professor Andrew Davison and Dr Stefan Leutenegger in the Department of Computing, has made a sequence of highly influential advances in SLAM algorithms: (i) drift-free long-term 3D localisation; (ii) detailed scene reconstruction; and (iii) semantic mapping that localises objects.
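To give a rough intuition for the joint estimation at the heart of SLAM (this is a deliberately simplified toy sketch, not the team's algorithms), consider a one-dimensional robot that integrates noisy odometry for its pose and fuses repeated landmark observations into a map; the function name and data layout below are illustrative assumptions:

```python
def slam_1d(odometry, observations):
    """Toy 1-D SLAM sketch (illustrative only).

    odometry: list of reported step lengths between successive poses.
    observations: list of (pose_index, landmark_id, relative_distance).
    Returns (poses, landmark_estimates).
    """
    # Dead-reckon poses by integrating odometry; in a real system
    # this is where drift accumulates and must be corrected.
    poses = [0.0]
    for step in odometry:
        poses.append(poses[-1] + step)

    # Build the map: project each observation into the world frame
    # using the pose it was made from, and average per landmark.
    sums, counts = {}, {}
    for pose_index, lm, rel in observations:
        world = poses[pose_index] + rel
        sums[lm] = sums.get(lm, 0.0) + world
        counts[lm] = counts.get(lm, 0) + 1
    landmarks = {lm: sums[lm] / counts[lm] for lm in sums}
    return poses, landmarks


# Example: two unit steps; the same landmark seen at distance 5.0
# from pose 0 and distance 3.0 from pose 2.
poses, landmarks = slam_1d([1.0, 1.0], [(0, "a", 5.0), (2, "a", 3.0)])
# poses -> [0.0, 1.0, 2.0]; landmarks -> {"a": 5.0}
```

Real visual SLAM systems estimate 6-degree-of-freedom camera poses and dense or semantic 3D maps, and jointly optimise poses and map rather than averaging, but the coupling shown here (poses inform the map, and map re-observations can in turn correct pose drift) is the essential idea.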

These new algorithms have been used as key features in Dyson’s first-ever robotic products, the Dyson 360 Eye and 360 Heurist robot vacuum cleaners. Imperial spin-out SLAMcore is commercialising a broad range of other applications in commercial and consumer robotics. SLAM is also used for tracking and mapping in virtual and augmented reality, and the team’s algorithms have contributed to Microsoft’s Kinect and HoloLens products, as well as to systems at Meta/Oculus via its acquisition of the startup Surreal Vision.