Algorithms and Augmented/Virtual Reality
Simultaneous Localization and Mapping (SLAM) is the computational problem of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent’s location within it. SLAM algorithms are used in navigation, robotic mapping, and odometry for virtual and augmented reality. Because they must run within the available compute, sensor, and power budgets, SLAM algorithms do not aim at perfection but at operational adequacy.
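The two halves of the problem can be seen in a deliberately naive sketch: the agent integrates odometry to track its own pose (localization) while converting range-and-bearing observations into map-frame landmark estimates (mapping). All names here are illustrative, not from any real SLAM library, and a real system would add probabilistic filtering or graph optimization to correct the drift this version accumulates.

```python
import math

def slam_step(pose, odom, observations, landmarks):
    """Advance the pose by one odometry reading, then map each observation.

    pose: (x, y, heading); odom: (distance, turn) -- both illustrative.
    observations: list of (landmark_id, range, bearing) in the robot frame.
    landmarks: dict updated in place with map-frame landmark positions.
    """
    x, y, th = pose
    dist, turn = odom
    # Localization: integrate odometry (dead reckoning).
    th = th + turn
    x = x + dist * math.cos(th)
    y = y + dist * math.sin(th)
    # Mapping: project each observation into the map frame using the
    # freshly updated pose -- the "simultaneous" part of SLAM.
    for lid, rng, bearing in observations:
        landmarks[lid] = (x + rng * math.cos(th + bearing),
                          y + rng * math.sin(th + bearing))
    return (x, y, th)

# Usage: drive forward 1 unit, then observe landmark 0 dead ahead at range 2.
landmarks = {}
pose = slam_step((0.0, 0.0, 0.0), (1.0, 0.0), [(0, 2.0, 0.0)], landmarks)
```

Dead reckoning alone drifts without bound; practical SLAM closes the loop by re-observing known landmarks and correcting both the pose and the map.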
In augmented reality, the computer uses sensors and algorithms to determine the position and orientation of a camera. AR technology then renders 3D graphics as they would appear from that camera’s viewpoint, superimposing the computer-generated imagery on the user’s view of the real world.
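Rendering "from the camera's viewpoint" boils down to projecting virtual 3D points through a camera model. A minimal pinhole-projection sketch, assuming a camera with no rotation looking along +z (the function name and parameters are hypothetical, chosen for illustration):

```python
def project_point(point_world, cam_pos, focal, width, height):
    """Project a 3D world point into pixel coordinates.

    Assumes an unrotated pinhole camera at cam_pos looking along +z,
    with focal length in pixels and the principal point at image center.
    Returns None for points behind the camera.
    """
    # Transform the point into camera coordinates (translation only here;
    # a real AR pipeline also applies the camera's rotation).
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    if z <= 0:
        return None  # behind the camera, not visible
    # Perspective division: distant points land closer to the center.
    u = width / 2 + focal * x / z
    v = height / 2 + focal * y / z
    return (u, v)

# Usage: a virtual object 5 units straight ahead projects to image center.
uv = project_point((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 800.0, 640, 480)
```

The SLAM-estimated camera pose supplies `cam_pos` (and, in full, the rotation), which is what keeps the overlay locked to the real scene as the camera moves.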
In virtual reality, the computer uses similar sensors and mathematics. However, rather than locating a real camera within a physical environment, it tracks the position of the user’s eyes within a simulated environment. If the user’s head turns, the graphics update accordingly. Rather than compositing virtual objects with a real scene, VR technology creates a convincing, interactive world for the user. Is AR/VR a metamorphosis for the UX designer?
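The "head turns, graphics react" step is, at its core, a rotation applied to the view direction. A minimal sketch for a single yaw axis (function name illustrative; real VR runtimes track full 3-axis orientation with quaternions):

```python
import math

def rotate_view(forward, yaw):
    """Rotate a 2D forward vector (x, z) by a head yaw angle in radians.

    Standard 2D rotation; a VR compositor does the 3D equivalent every
    frame from the headset's tracked orientation.
    """
    x, z = forward
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x - s * z, s * x + c * z)

# Usage: looking along +z, then turning the head 90 degrees left.
new_forward = rotate_view((0.0, 1.0), math.pi / 2)
```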
Facebook announced the open sourcing of ‘Detectron’, a software system that implements state-of-the-art object detection algorithms. It is the company’s platform for object detection research, built on a deep learning framework. Its algorithms examine video input and infer which discrete objects make up the scene.
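Detectron’s own API is beyond this note’s scope, but one standard step shared by such detectors is non-maximum suppression: the network proposes many overlapping boxes per object, and NMS keeps only the highest-scoring box among heavily overlapping ones. A minimal pure-Python sketch (data and thresholds illustrative):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def nms(detections, iou_thresh=0.5):
    """Greedy non-maximum suppression over (score, box) pairs.

    Visit detections best-first; keep one only if it does not overlap
    an already-kept box by more than iou_thresh.
    """
    keep = []
    for det in sorted(detections, reverse=True):
        if all(iou(det[1], k[1]) < iou_thresh for k in keep):
            keep.append(det)
    return keep

# Usage: two near-duplicate boxes on one object, one box on another.
kept = nms([(0.9, (0, 0, 10, 10)),
            (0.8, (1, 1, 11, 11)),
            (0.7, (20, 20, 30, 30))])
```

Here the 0.8-scoring box overlaps the 0.9 box too heavily and is suppressed, while the distant 0.7 box survives as a separate object.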