AR - Augmented Reality

Keynote reel:

A bit of history:




How it works

sensor fusion of camera, gyroscope, accelerometer, and other motion data


multiplatform mobile support (ARKit on iOS, ARCore on Android)

Device tracking

track the device’s position and orientation in physical space.
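In Unity, device tracking drives the AR camera's transform, so the device pose can be read directly from it. A minimal illustrative sketch (the class name `DevicePoseLogger` is hypothetical):

```csharp
using UnityEngine;

public class DevicePoseLogger : MonoBehaviour
{
    // Assign the AR Camera (child of the XR/AR Session Origin) in the Inspector.
    [SerializeField] Camera arCamera;

    void Update()
    {
        // The tracking system updates this transform each frame,
        // giving the device's position and orientation in session space.
        Vector3 position = arCamera.transform.position;
        Quaternion orientation = arCamera.transform.rotation;
        Debug.Log($"Device pose: {position} / {orientation.eulerAngles}");
    }
}
```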

Plane detection

detect horizontal and vertical surfaces
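A sketch of reacting to detected planes with AR Foundation's `ARPlaneManager` (API as in AR Foundation 4/5; `PlaneLogger` is an illustrative name):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes .updated and .removed lists.
        foreach (var plane in args.added)
            Debug.Log($"New {plane.alignment} plane, size {plane.size}");
    }
}
```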

Point clouds

detect feature points; point clouds are also known as feature points

Anchors

an arbitrary position and orientation that the device tracks
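In AR Foundation, a pose can be anchored by attaching an `ARAnchor` component to a GameObject; the session then keeps that pose corrected as tracking improves. A minimal sketch (`AnchorPlacer` is an illustrative name):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorPlacer : MonoBehaviour
{
    // Returns a GameObject pinned to the given pose in physical space.
    // Requires an active ARSession in the scene.
    public ARAnchor PlaceAnchor(Pose pose)
    {
        var go = new GameObject("anchor");
        go.transform.SetPositionAndRotation(pose.position, pose.rotation);
        return go.AddComponent<ARAnchor>();
    }
}
```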

Light estimation

estimates for average color temperature and brightness in physical space.
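Light estimates arrive with each camera frame via `ARCameraManager.frameReceived`; a common use is driving a directional light so virtual content matches the room. A hedged sketch (`LightEstimator` is an illustrative name; values are nullable because not every platform reports every estimate):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class LightEstimator : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;
    [SerializeField] Light sceneLight; // enable "use color temperature" on the light

    void OnEnable()  => cameraManager.frameReceived += OnFrame;
    void OnDisable() => cameraManager.frameReceived -= OnFrame;

    void OnFrame(ARCameraFrameEventArgs args)
    {
        var estimate = args.lightEstimation;
        if (estimate.averageBrightness.HasValue)
            sceneLight.intensity = estimate.averageBrightness.Value;
        if (estimate.averageColorTemperature.HasValue)
            sceneLight.colorTemperature = estimate.averageColorTemperature.Value;
    }
}
```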

Environment probe

a means for generating a cube map to represent a particular area of the physical environment.

Face tracking

detect and track human faces.

2D image tracking

detect and track 2D images.
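Tracked images are reported through `ARTrackedImageManager`, which matches the camera feed against a reference image library configured in the project. A minimal sketch (`ImageTrackingLogger` is an illustrative name):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ImageTrackingLogger : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager;

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        // Each hit corresponds to an entry in the reference image library.
        foreach (var image in args.added)
            Debug.Log($"Detected reference image: {image.referenceImage.name}");
    }
}
```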

3D object tracking

detect 3D objects.


Scene reconstruction (LiDAR)

generate triangle meshes that correspond to the physical space

Body tracking

2D and 3D representations of humans recognized in physical space

Collaborative participants

track the position and orientation of other devices in a shared AR experience

Human segmentation

determines a stencil texture and depth map of humans detected in the camera image


Raycast

queries physical surroundings for detected planes and feature points
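A typical use of raycasting is tap-to-place: cast from a screen touch against detected planes and spawn content at the first hit. A sketch with AR Foundation's `ARRaycastManager` (`TapToPlace` and `prefab` are illustrative names; hits come back sorted by distance):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TapToPlace : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject prefab;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Only hit points that fall inside a detected plane's boundary.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
            Instantiate(prefab, hits[0].pose.position, hits[0].pose.rotation);
    }
}
```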

Pass-through video

optimized rendering of mobile camera image onto touch screen as the background for AR content

Session management

automatic manipulation of the platform-level configuration when AR features are enabled or disabled


Occlusion

allows for occlusion of virtual content by detected environmental depth (environment occlusion) or by detected human depth (human occlusion)
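Occlusion is configured through `AROcclusionManager`; requesting a mode is a hint, and platforms that don't support it simply ignore the request. A sketch (`OcclusionSetup` is an illustrative name):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class OcclusionSetup : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Start()
    {
        // Environment occlusion: depth of walls, tables, etc.
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Best;
        // Human occlusion: per-pixel depth and stencil of detected people.
        occlusionManager.requestedHumanDepthMode = HumanSegmentationDepthMode.Best;
        occlusionManager.requestedHumanStencilMode = HumanSegmentationStencilMode.Best;
    }
}
```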


AR Foundation


Test project

art XR lecture: