Banuba

Banuba is a computer vision startup developing augmented reality and 3D facial animation software for mobile devices. The company focuses on face augmented reality, producing its Face AR software development kit (SDK) to enable AR experiences for the front-facing camera. The company is headquartered in Hong Kong, with an R&D center in Minsk, Belarus, and offices in Krakow, Poland, and Limassol, Cyprus.
By November 2019, Banuba had 32 granted and published patents and 9 filed patent applications in the fields of computer vision, face detection, face tracking, image processing, avatar generation and design.
History
Banuba was founded in May 2016 as an AI laboratory developing computer vision and augmented reality technologies for mobile devices. In February 2017, Banuba secured a $5 million investment from Viktor Prokopenya's VP Capital and the Gutseriev family's Larnabel Ventures. The investment was spent on the development of its technology portfolio and on internal product development.
On 13 March 2017, President of Belarus Alexander Lukashenko visited Banuba to discuss with technology entrepreneurs and investors Viktor Prokopenya and Mikhail Gutseriev the idea of creating an IT Development Council as part of the Decree on the Development of the Digital Economy.
In November 2018, Banuba received a further $7 million in funding from the same investors and announced the launch of its new software development kit for third-party companies and brands wanting to use 3D Face AR in their apps.
In July 2019, Banuba was listed among the 10 most valuable European startups specializing in face recognition and tracking, according to Silicon Canals.
Technologies
Face tracking detects and tracks the presence of a human face in a digital video frame in real time. The technology builds a 3D model of the face by estimating 37 facial characteristics represented as morphs of a default face mesh, bypassing a separate landmark-detection step.
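As a rough illustration of how a morph-based tracker of this kind can drive a face mesh, the sketch below reconstructs a mesh as the default mesh plus a weighted sum of per-morph vertex offsets; the array shapes, vertex count and function names are illustrative assumptions, not Banuba's actual API.

import numpy as np

def reconstruct_face_mesh(default_mesh, morph_deltas, weights):
    """Blendshape-style reconstruction: default mesh plus weighted morph offsets.

    default_mesh : (V, 3) vertex positions of the neutral face mesh.
    morph_deltas : (37, V, 3) vertex-offset field per tracked characteristic.
    weights      : (37,) per-frame morph weights estimated by the tracker.
    """
    return default_mesh + np.tensordot(weights, morph_deltas, axes=1)

# Illustrative usage with placeholder data (real values would come from the tracker).
V = 468                                              # assumed vertex count
mesh = reconstruct_face_mesh(np.zeros((V, 3)),
                             0.01 * np.random.randn(37, V, 3),
                             np.random.rand(37))
print(mesh.shape)  # (468, 3)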
Face segmentation labels facial regions, including the nose, mouth, eyes and hair, to allow for facial modifications such as changing the size, shape and color of facial features in filters, beautification applications and virtual makeovers.
Face beautification transforms the visual appearance of the user to adjust facial symmetry, smooth skin, correct tone, recolor hair, whiten eyes and teeth, morph the face or apply virtual makeup.
Face filtering uses face tracking and 3D rendering technologies such as 3D modelling, animation, physically based rendering (PBR), image-based lighting, billboard animation, facial morphing and other techniques to enable AR experiences for the front-facing camera, including facial animation and 3D mask application.
Background subtraction technology uses a convolutional neural network to identify and tag either the human subject or the background of an image, allowing for background removal or real-time background animation in both full-body and selfie modes.
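Applying such a segmentation mask is typically a compositing step: the person-probability map produced by the network is thresholded and used to blend the subject over a new background. The sketch below assumes that generic pipeline with hypothetical inputs, not Banuba's implementation.

import numpy as np

def replace_background(frame, person_prob, new_background, threshold=0.5):
    """Composite the segmented subject over a replacement background.

    frame          : (H, W, 3) uint8 camera frame.
    person_prob    : (H, W) float map in [0, 1] from a segmentation CNN.
    new_background : (H, W, 3) uint8 image to place behind the subject.
    """
    alpha = (person_prob > threshold).astype(np.float32)[..., None]  # hard matte
    composite = alpha * frame + (1.0 - alpha) * new_background
    return composite.astype(np.uint8)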
Avatar generation estimates the shape of the face and facial features such as the eyes, face contour, lips, nose and eyebrows to automatically build a 3D model and match it to the face mesh, so that avatars copy the user's facial expressions in real time.
The age and ethnicity estimation system consists of age and ethnicity feature extraction followed by feature classification with convolutional neural networks.
Gender detection uses a neural network trained on an annotated dataset to classify a face as male or female.
Emotion recognition estimates facial expressions in real time, using morphable face models in conjunction with neural networks to detect happiness, anger, joy, sadness and surprise.
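The attribute estimators above (age, ethnicity, gender, emotion) share the same two-stage structure: convolutional feature extraction followed by classification. A minimal PyTorch sketch of that structure is shown below; the layer sizes, input resolution and class counts are assumptions for illustration only.

import torch
import torch.nn as nn

class FaceAttributeClassifier(nn.Module):
    """Toy CNN: convolutional feature extractor followed by a classification head."""

    def __init__(self, num_classes):
        super().__init__()
        self.features = nn.Sequential(                        # feature-extraction stage
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)          # classification stage

    def forward(self, x):                                     # x: (N, 3, H, W) face crops
        return self.classifier(self.features(x).flatten(1))

# e.g. two classes for gender, more for age bands, ethnicities or emotions
model = FaceAttributeClassifier(num_classes=2)
logits = model(torch.randn(1, 3, 112, 112))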
The Action Units system is based on an extended morphable three-dimensional face model that includes an extended set of morphs and anthropometry coefficients to track the user's facial expressions and animate 3D avatars and emoji.
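Conceptually, driving an avatar from such coefficients is a retargeting step: each tracked morph or action-unit weight is mapped onto the corresponding blendshape of the avatar's own rig. The linear mapping and array shapes below are assumed purely for illustration.

import numpy as np

def animate_avatar(avatar_neutral, avatar_blendshapes, tracked_weights, retarget_matrix):
    """Map tracked facial morph weights onto an avatar's blendshape rig.

    avatar_neutral     : (V, 3) neutral avatar mesh.
    avatar_blendshapes : (K, V, 3) per-blendshape vertex offsets of the avatar.
    tracked_weights    : (M,) morph / action-unit weights from the face tracker.
    retarget_matrix    : (K, M) linear mapping from tracked morphs to avatar blendshapes.
    """
    avatar_weights = np.clip(retarget_matrix @ tracked_weights, 0.0, 1.0)
    return avatar_neutral + np.tensordot(avatar_weights, avatar_blendshapes, axes=1)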
Heart-rate tracking analyses subtle color variations of facial regions over time in order to estimate the pulse frequency.
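This kind of remote photoplethysmography is commonly implemented by averaging a color channel over a skin region in every frame and picking the dominant frequency of the resulting signal; the sketch below assumes that generic pipeline, not Banuba's specific method.

import numpy as np

def estimate_pulse_bpm(green_means, fps):
    """Estimate pulse rate from per-frame mean green-channel values of a skin region.

    green_means : 1-D array with one mean green value per frame (e.g. forehead or cheeks).
    fps         : camera frame rate in frames per second.
    """
    signal = green_means - np.mean(green_means)           # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)     # frequencies in Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)                # plausible pulse range: 42-240 bpm
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                                 # beats per minute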
Eye tracking technology combines machine learning with camera projection and statistical algorithms. A "random forest" strategy combines many regression trees to achieve better accuracy when predicting the eye-gaze vector and pupil geometry; camera projection then transforms the predicted quantities into a geometric form (points, arrows) on the mobile device screen.
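A simplified version of that pipeline, with hypothetical eye-region features, a scikit-learn random forest standing in for the actual model, and a basic screen-plane projection, might look like the following sketch.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: per-frame eye-region features -> 3-D gaze direction.
X_train = np.random.rand(500, 12)        # e.g. pupil position and eye-corner geometry
y_train = np.random.randn(500, 3)        # gaze direction in camera coordinates

gaze_model = RandomForestRegressor(n_estimators=100, random_state=0)
gaze_model.fit(X_train, y_train)         # random forests support multi-output regression

def gaze_to_screen_point(gaze_dir, eye_depth_m, px_per_m, screen_center_px):
    """Intersect the gaze ray from the eye with the screen plane and convert to pixels.

    Assumes the screen lies in the camera's z = 0 plane and the gaze points toward it.
    """
    scale = eye_depth_m / max(abs(gaze_dir[2]), 1e-6)     # ray-plane intersection
    offset_m = gaze_dir[:2] * scale                       # horizontal/vertical offset in metres
    return screen_center_px + offset_m * px_per_m

gaze_dir = gaze_model.predict(np.random.rand(1, 12))[0]
point = gaze_to_screen_point(gaze_dir, eye_depth_m=0.3,
                             px_per_m=5000.0, screen_center_px=np.array([540.0, 960.0]))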
Hand segmentation and hand tracking algorithms track the palm and finger points of a hand skeleton model, allowing the hand's position and orientation to be estimated in real time.
Product
Banuba's Face AR SDK enables face filtering, virtual makeup, beautification, avatar generation, virtual try-on and other facial augmented reality features in web and mobile apps. The company provides SDK licensing for brands and uses its Face AR SDK to build mobile consumer apps.
Face AR performance is achieved through precise face detection and face tracking, and by tailoring the technologies to different lighting conditions, skin tones and low-power devices. The software features advanced 3D rendering technology that allows users to create realistic AR effects in terms of color, texture, shape and behavior.
Application
Banuba's technologies find application in retail and e-commerce, enabling virtual try-on solutions for makeup, glasses, accessories and headwear. Mobile developers integrate Face AR features into social media apps, dating platforms, video streaming solutions, video editors, mobile games and AR advertising campaigns.
Yandex used the Face AR SDK to build an AR video editor into its fashion app Sloy. The Icelandic startup Teatime Games develops AR-powered mobile social games with video chat built with Banuba's Face AR technology.
Proprietary Mobile Applications
The company uses its Face AR technology to develop its own mobile applications with face filters, AR effects and face beautification, including Banuba Face Filters, My Banuba: Family Face Filters, EasySnap selfie editor, INNER, FunCam Football 2018 and VideoMoji.
 