Abstract
Facial Augmented Reality (L)
Status: Completed
Overview
Facial augmented reality applications like MSQRD or FaceSwap are quickly gaining popularity among mobile users. In these applications, a smartphone or tablet acts as a 'digital mirror' that augments the user's face with apparel or overlays it with an animated character mask. Existing approaches target entertainment use cases and do not aim for photorealistic augmentation; in particular, the lack of ambient lighting, shadows, and specularities is apparent in current facial AR applications. One reason for this is that photorealism requires very precise registration and real-time estimation of physical quantities such as face material properties and environment lighting, which are not straightforward to obtain on resource-constrained mobile devices.
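To make the last point concrete (this is a standard simplification from the inverse-lighting literature, not a model stated in the project description): if the face is approximated as a Lambertian surface under distant, low-frequency lighting, the observed intensity at a pixel p can be written with second-order spherical harmonics as

I(p) \approx \rho(p) \sum_{l=0}^{2} \sum_{m=-l}^{l} L_{lm} \, Y_{lm}\bigl(n(p)\bigr),

where \rho(p) is the face albedo, n(p) the surface normal, Y_{lm} the spherical-harmonics basis functions, and L_{lm} the unknown lighting coefficients. Estimating the L_{lm} (and \rho) in real time is the part that is hard on mobile hardware.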
The goal of this project is to investigate how we can estimate environmental lighting conditions on unmodified mobile devices. This is a very difficult problem in general, but in facial AR applications we can exploit additional cues from the face (symmetries, the expected head shape, expected face colors, eyeball reflections, etc.), all of which can make the problem easier.
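As a rough sketch of one possible direction (illustrative only, not the project's actual implementation), the face cues above could feed a simple least-squares lighting estimate: given per-pixel surface normals from a fitted 3D face model and the observed pixel intensities, one can solve for the second-order spherical-harmonics lighting coefficients under a Lambertian, uniform-albedo assumption. The function name, interface, and these assumptions are hypothetical.

import numpy as np

def estimate_sh_lighting(normals, intensities, albedo=1.0):
    # normals: (N, 3) unit surface normals sampled on the face
    #          (e.g. from a fitted 3D face model -- an assumption, not
    #          something specified in the project description)
    # intensities: (N,) observed grayscale pixel values at the same samples
    # Returns 9 spherical-harmonics lighting coefficients that best explain
    # the observations under a Lambertian reflectance model.
    nx, ny, nz = normals[:, 0], normals[:, 1], normals[:, 2]
    # Second-order SH basis evaluated at the normals (constant factors are
    # folded into the recovered coefficients for simplicity).
    Y = np.stack([
        np.ones_like(nx),        # l = 0
        ny, nz, nx,              # l = 1
        nx * ny, ny * nz,        # l = 2
        3.0 * nz**2 - 1.0,
        nx * nz,
        nx**2 - ny**2,
    ], axis=1)
    # Lambertian image formation: intensities ~ albedo * (Y @ coeffs);
    # solve for coeffs in the least-squares sense.
    coeffs, *_ = np.linalg.lstsq(albedo * Y, intensities, rcond=None)
    return coeffs

# Example with synthetic data: random normals lit from a single direction.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = rng.normal(size=(500, 3))
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    light = np.array([0.3, 0.5, 0.8])
    light /= np.linalg.norm(light)
    obs = np.clip(n @ light, 0.0, None)   # ideal Lambertian shading
    print(estimate_sh_lighting(n, obs))

In a real mobile pipeline the normals and albedo would come from a face tracker and a face reflectance prior, and the solve would have to run within the per-frame time budget.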
Student: Andreas Hess
Contact: Gábor Sörös