Conceptual Work: Eye-Tracking Influencing the AR User Experience
All of the conceptual videos were made in Adobe After Effects.
AR Prototyping Work
Object recognition with a virtual object anchored to the recognized object (Prototyped using Unity and ARKit 2; runs on iPhone 6s and newer)
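The prototype itself was built in Unity, but for reference this is a minimal native Swift sketch of the same ARKit 2 object-detection flow: run world tracking with a set of scanned reference objects and attach a virtual node when one is recognized. The asset-group name and the marker geometry are placeholders.

```swift
import ARKit
import SceneKit

class ObjectDetectionViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.delegate = self

        // Load reference objects scanned beforehand ("ScannedObjects" is a placeholder group name).
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionObjects = ARReferenceObject.referenceObjects(
            inGroupNamed: "ScannedObjects", bundle: nil) ?? []
        sceneView.session.run(configuration)
    }

    // Called when ARKit recognizes one of the scanned reference objects.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARObjectAnchor else { return }

        // Attach a simple virtual marker on top of the recognized object.
        let marker = SCNNode(geometry: SCNSphere(radius: 0.02))
        marker.geometry?.firstMaterial?.diffuse.contents = UIColor.cyan
        node.addChildNode(marker)
    }
}
```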
Augmenting 3D assets on top of the real world (Prototyped using Unity and ARKit 2; runs on iPhone 6s and newer)
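Again the prototype was built in Unity; as a rough equivalent, a native Swift/ARKit sketch of placing a 3D asset on a detected plane looks like this. The asset file name is hypothetical.

```swift
import ARKit
import SceneKit

class PlacementViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]   // detect floors and table tops
        sceneView.session.run(configuration)

        let tap = UITapGestureRecognizer(target: self, action: #selector(placeAsset(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    // Place the 3D asset where the user taps on a detected plane.
    @objc func placeAsset(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        guard let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first,
              let assetScene = SCNScene(named: "art.scnassets/asset.scn"),   // placeholder asset
              let assetNode = assetScene.rootNode.childNodes.first else { return }

        let t = result.worldTransform.columns.3
        assetNode.position = SCNVector3(t.x, t.y, t.z)
        sceneView.scene.rootNode.addChildNode(assetNode)
    }
}
```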
Real-time point cloud generation using 6d.ai; note how the rain is occluded by the TV (Prototyped in Xcode, using a particle system for the rain and the 6d.ai API for real-time SLAM)
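The 6d.ai SDK is proprietary, so this is not its API; it is only a generic SceneKit sketch of the occlusion trick the effect relies on: the reconstructed environment mesh writes depth but no color, so virtual content behind it (here, the rain particles) is hidden while the camera image stays visible. The `environmentMeshNode` parameter stands in for whatever node holds the SLAM mesh.

```swift
import SceneKit

// Depth-only material: writes to the depth buffer but not the color buffer,
// so it occludes virtual content without drawing anything itself.
func makeOcclusionMaterial() -> SCNMaterial {
    let material = SCNMaterial()
    material.colorBufferWriteMask = []      // no color output
    material.writesToDepthBuffer = true     // still contributes depth
    return material
}

// Apply the occlusion material to the reconstructed environment mesh.
func applyOcclusion(to environmentMeshNode: SCNNode) {
    environmentMeshNode.geometry?.materials = [makeOcclusionMaterial()]
    environmentMeshNode.renderingOrder = -1 // render before the virtual content it should occlude
}
```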
Modelled an array of cubes in Blender and imported it into SceneKit (Xcode). The prototype runs on iPhone 6s and newer.
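A minimal sketch of the import step, assuming the Blender export was converted to a .scn file inside the Xcode asset catalog (the file name and placement offset are placeholders):

```swift
import SceneKit
import ARKit

// Load the cube array exported from Blender and place it in front of the camera origin.
func loadCubeArray(into sceneView: ARSCNView) {
    guard let cubeScene = SCNScene(named: "art.scnassets/cubeArray.scn") else { return }

    let cubeNode = SCNNode()
    for child in cubeScene.rootNode.childNodes {
        cubeNode.addChildNode(child)          // reparent the imported geometry under one node
    }
    cubeNode.position = SCNVector3(0, 0, -0.5) // half a metre in front of the session origin
    sceneView.scene.rootNode.addChildNode(cubeNode)
}
```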
Gaze-direction detection using ARKit in Xcode (used open-source code to estimate the gaze direction from the orientation of the eyes)
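The prototype builds on open-source gaze-estimation code; the underlying ARKit data it starts from looks roughly like this: face tracking supplies per-eye transforms and an estimated look-at point every frame, which the open-source code then projects onto the screen.

```swift
import ARKit

class GazeViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARFaceTrackingConfiguration.isSupported else { return } // requires a TrueDepth camera
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // ARKit updates the face anchor each frame with eye poses and a look-at estimate.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        let lookAt = faceAnchor.lookAtPoint        // gaze target in face-anchor space
        let leftEye = faceAnchor.leftEyeTransform  // 4x4 pose of the left eye
        let rightEye = faceAnchor.rightEyeTransform
        print("gaze:", lookAt, "left eye:", leftEye.columns.3, "right eye:", rightEye.columns.3)
    }
}
```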
Prototyped the idea of virtual desktops in AR: here I've placed three interactive web views in front of me. Implemented in SceneKit/ARKit using Xcode.
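A simplified sketch of the panel layout, assuming each "desktop" is an SCNPlane textured from a WKWebView. Feeding live, touch-interactive web content into a SceneKit material takes more work than this (touch events have to be forwarded manually), so this only illustrates how the three panels are arranged.

```swift
import SceneKit
import WebKit

// Build three web-view panels fanned out one metre in front of the user.
func makeDesktopPanels(urls: [URL]) -> [SCNNode] {
    return urls.prefix(3).enumerated().map { index, url in
        let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 800, height: 600))
        webView.load(URLRequest(url: url))

        let plane = SCNPlane(width: 0.6, height: 0.45)        // roughly 60 x 45 cm panel
        plane.firstMaterial?.diffuse.contents = webView.layer // simplified: layer used as texture
        plane.firstMaterial?.isDoubleSided = true

        let node = SCNNode(geometry: plane)
        node.position = SCNVector3(Float(index - 1) * 0.7, 0, -1.0) // left, centre, right
        node.eulerAngles.y = Float(index - 1) * -0.3                // angle the side panels inward
        return node
    }
}
```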
Face tracking with ARKit, built in Unity.
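This one was built with Unity's ARKit plugin; as a native illustration of the same data, ARKit's face anchor exposes per-expression blend-shape coefficients that drive the tracked face each frame.

```swift
import ARKit

// Read a couple of the blend-shape coefficients ARKit reports for the tracked face.
func handleFaceUpdate(_ anchor: ARFaceAnchor) {
    let blendShapes = anchor.blendShapes
    let jawOpen = blendShapes[.jawOpen]?.floatValue ?? 0        // 0 = closed, 1 = fully open
    let leftBlink = blendShapes[.eyeBlinkLeft]?.floatValue ?? 0 // 0 = open, 1 = closed
    print("jawOpen:", jawOpen, "eyeBlinkLeft:", leftBlink)
}
```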
Real-time point cloud generation using 6d.ai.
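Since the 6d.ai SDK cannot be shown here, this sketch visualises the comparable sparse point cloud that ARKit exposes per frame, rendered the same way the prototype displays the 6d.ai reconstruction: one small node per feature point.

```swift
import ARKit
import SceneKit

// Draw the current frame's sparse feature points as small yellow spheres under `parent`.
func renderFeaturePoints(from frame: ARFrame, under parent: SCNNode) {
    guard let cloud = frame.rawFeaturePoints else { return }

    parent.childNodes.forEach { $0.removeFromParentNode() } // clear last frame's points
    for point in cloud.points {
        let dot = SCNNode(geometry: SCNSphere(radius: 0.002))
        dot.geometry?.firstMaterial?.diffuse.contents = UIColor.yellow
        dot.position = SCNVector3(point.x, point.y, point.z)
        parent.addChildNode(dot)
    }
}
```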
