Behavioral Geovector

Behavioral Vector Geolocation (Behavioral Geovector) is a process that determines the position of a person and captures the direction of their gaze. By knowing a user's position and eye direction, and subsequently mapping those coordinates, it is possible to determine all the objects that were looked at from each viewpoint.
Definition
Geolocation is the identification of the real-world geographic location of an Internet-connected computer, mobile device, website visitor, or similar object.
A vector (Euclidean vector, in mathematics) is represented by a line segment with a definite direction, connecting an initial point A with a terminal point B. It is fully described by both its magnitude and its direction.
Vector geolocation introduces a third concept, behavior: the actions or reactions of an organism in relation to its environment.
History
The company Geovector began conducting research into augmented reality in the early 1990s. The basic concept around which most of Geovector's R&D has concentrated is knowing the position and orientation of the device. This allows the application to provide digital information associated with a place in the real world. Position and direction data can be used to create a virtual vector which intersects with objects indexed in databases by their latitude and longitude coordinates.
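The position-plus-orientation intersection described above can be sketched in a few lines of code. This is a minimal illustration, not Geovector's actual implementation: the function names (`bearing_to`, `objects_in_sight`) and the flat angular-tolerance model are assumptions made for the example.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def objects_in_sight(user_lat, user_lon, heading, objects, tolerance=10.0):
    """Return the names of objects whose bearing from the user lies within
    `tolerance` degrees of the device heading, i.e. objects the user is
    pointing at. `objects` is a list of (name, lat, lon) tuples."""
    hits = []
    for name, lat, lon in objects:
        b = bearing_to(user_lat, user_lon, lat, lon)
        diff = abs((b - heading + 180) % 360 - 180)  # smallest angular difference
        if diff <= tolerance:
            hits.append(name)
    return hits
```

A production system would also index objects spatially and account for distance and occlusion; the sketch only checks angular alignment.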
In 2006, Geovector and Mapion delivered the world's first pointing-based search for mobile phones, able to measure both the position (geolocation) and the direction (vector) of movement of the user. As early as 2001 in France, a reflection on the geovector concept was carried out by Antoine Polatouche and Yves Apeloig; this resulted in the general concept of Behavioral Vector Geolocation, or Behavioral Geovector.
An experiment was conducted in 2004 to assess whether the concept worked effectively: a prototype device was built in collaboration with Honeywell in France. This head-mounted prototype, which measured location and head elevation and thereby allowed interaction with human behavior, was tested in 2005. Another experiment was carried out with six operational devices installed as an interactive audio guide application in a real environment, the Château de Lourmarin in France, on 7 July 2006.
Today, the main information and communication technology companies are focusing their R&D on the geovector market. In the near future, the process will be able to determine what you look at and automatically search the web for the corresponding information. The big question is: who will build and maintain this huge database and all the associated services?
The Behavioral Geovector concept
The behavioral geovector allows virtual interaction with the environment, hands-free and without manipulating any electronic device, using a user interface whose parameters are:
* The subject location (geolocation)
* The sight direction (vector)
* Measurable physical reactions of the user's behavior, whatever his/her dynamic position (movement)
Processing the recorded and collected data makes it possible to build a behavioral model of the user and to reproduce a complete virtual world, for use either in real time or after his 'behavioralised' journey.
The application field of such a concept covers interactive mobile systems as well as hands-free control systems (e.g. for disabled people). This concept was presented for the first time in France in 2001 and gave birth to a behavioral interface that can:
* determine the user’s position: Geolocation
* determine the user’s sight direction in 3D space in order to find out what is looked at: Vector
* determine head movement in space automatically and continuously, according to a specific algorithm. Since human behavior can largely be measured through head movements, the system can build a usable and significant model of the user's behavior: Behavioral
* continuously and automatically record data in order to accurately reconstruct the user's journey as he lived it: what he saw, his preferences…
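The record-and-reconstruct idea in the list above can be illustrated with a small data model. This is a hypothetical sketch: the `GazeSample` record and `journey_summary` function are invented for the example, assuming the device logs one sample per second.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class GazeSample:
    t: float        # timestamp in seconds
    lat: float      # user position (geolocation)
    lon: float
    heading: float  # head/sight direction in degrees (vector)
    target: str     # object identified along the line of sight, or "" if none

def journey_summary(samples, dt=1.0):
    """Total time spent looking at each object, assuming one sample every
    `dt` seconds -- a rough 'preferences' profile of the recorded journey."""
    seconds = Counter()
    for s in samples:
        if s.target:
            seconds[s.target] += dt
    return dict(seconds)
```

Replaying such a log in order is what allows the journey to be reconstructed "as he lived it".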
Patents related to this concept were registered in 2003 and 2005.
Principles
Behavioral Geovector can establish a coherent and stable link between the real world and its virtual reconstruction, a meta-universe (multiverse) in which the person is the center, according to his/her sight direction (orientation vector) and behavior. It represents a geometry that varies with each user.
Behavioral Geovector allows devices to be controlled according to an established protocol, which makes the system a very powerful aid for disabled people. By connecting the device to an embedded or remote database, the system will search, build and/or broadcast multimedia information in the form of a visual reconstitution. This reconstitution is highly pertinent, as it is accurately sequenced according to the user's interests, based on what he/she looked at during the visit. This is the visual behavior.
By connecting the device to another electronic device, it is possible to produce a hands-free action defined by the behavior: steering a wheelchair, answering a call on a smartphone…
Unlike other behavioral technologies, e.g. wearable computers, Behavioral Geovector does not measure brain activity. It requires no additional equipment and emits no radio-frequency waves. It is therefore non-invasive and completely harmless.
The concept and the device have been recognized by the European Commission for ICT inclusion and by CEDRIC (IT research and study center) of CNAM (National Center for Arts and Crafts, Paris), which signed an agreement with the inventor of the concept to carry out advanced experiments.
Percipio
The first fully operational consumer product, Percipio, measures and analyzes visual behavior in order to build a behavioral model. It is produced by Eshkar, a company created to develop various applications, and the device is assembled by its industrial partner Pallard Industrie. A specific two-wing design provides an audio system without physical contact, which makes the device highly hygienic.
The system can deliver sequenced multimedia content according to what the user looks at, indoors or outdoors, and reconstitutes the user's journey throughout a museum according to his/her interests and what he/she visited or looked at. This reconstitution can be enhanced according to the user's interests and preferences. It is a hands-free intelligent tour assistant for able or disabled persons.

The device was tested in a live environment between July 2006 and May 2008 at Lourmarin castle in France, in order to validate the reliability of its audio guide application. During this experiment, visitors could use the device in the same way they would ordinarily use a standard audio guide. It has also been tested live in the Musée de l'Arles et de la Provence antiques and in the Palace of Versailles.
The device does not impair the user in any way. He/she can interact normally with his/her environment (e.g. communicate with others). It can be used by the hearing or visually impaired or other handicapped persons. It can be connected to a PDA or a Smartphone for Internet connectivity and data storage.

Function
It is an interactive mobile information system which operates according to the user's mobility, without any manipulation. It is hands-free and can work indoors as well as outdoors without any additional equipment. It is the first system of its kind that can be used by anybody, including impaired people.
The system tracks each and every head movement of the user to generate a meta-universe from the real environment. It rebuilds the user's journey wherever he/she goes and looks. It also acts as a more intelligent audio guide, responding to what the visitor looks at; e.g. it can instruct: "Stop here, look to your right"…
The description of the museum masterpieces or monument is loaded onto the device. The visitor wears Percipio and starts his/her visit. According to where he/she goes and what he/she looks at, the device guides him/her, delivering specific comments through the speakers at all times. When the visitor stops in front of a masterpiece for more than a few seconds, the system recognizes it and starts the commentary. As soon as he/she walks away, the system instantly and automatically detects this and stops commenting on the masterpiece. The system sequences and delivers information according to the user's real-time interest during the visit.
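The start/stop behavior just described (begin commentary after the visitor dwells on an exhibit, stop as soon as they look away) can be modeled as a tiny state machine. This is a hedged sketch, not the patented algorithm; the `audio_guide_step` function and the 3-second dwell threshold are assumptions.

```python
def audio_guide_step(state, target, now, dwell=3.0):
    """One update of a minimal dwell-time trigger.
    `state` is (current_target, since, playing); `target` is the exhibit the
    visitor currently faces ("" if none); `now` is the time in seconds.
    Returns (new_state, action) where action is None, "play:<name>" or "stop"."""
    current, since, playing = state
    if target != current:                       # gaze moved: reset timer, stop audio
        return (target, now, False), ("stop" if playing else None)
    if not playing and target and now - since >= dwell:
        return (current, since, True), f"play:{target}"  # dwell reached: start commentary
    return state, None
```

Feeding this one gaze sample per tick reproduces the described behavior: commentary starts only after a sustained look, and halts the moment the visitor walks away.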
The technology
The technology combines a magnetometer, an accelerometer, GPS, infrared, a compass, piezoelectric speakers, and a patented algorithm.
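One reason an accelerometer sits alongside the magnetometer is tilt compensation: a head-mounted compass must stay accurate when the wearer looks up or down. The sketch below shows the common textbook approach (not necessarily the patented one), projecting the magnetic field onto the horizontal plane using the gravity vector; the axis conventions are assumptions.

```python
import math

def tilt_compensated_heading(mag, acc):
    """Heading in degrees from a magnetometer vector `mag` and an
    accelerometer vector `acc` (both 3-tuples in the device frame).
    The accelerometer gives the gravity direction, which is used to
    project the magnetic field onto the horizontal plane so the
    heading stays stable as the head pitches or rolls."""
    ax, ay, az = acc
    roll = math.atan2(ay, az)                       # rotation about the forward axis
    pitch = math.atan2(-ax, math.hypot(ay, az))     # nose-up / nose-down angle
    mx, my, mz = mag
    # rotate the magnetic vector into the horizontal plane
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-yh, xh)) % 360
```

With a level device the result reduces to a plain 2-D compass reading; pitching the head up or down leaves the computed heading unchanged, which is exactly the property a head-mounted pointing device needs.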
Applications
- Culture and tourism: visit a site or a museum with all the necessary related information delivered according to the visitor's head movements (what you see is what you hear), e.g. pilgrimage guides.
- Professional exhibitions
- Shopping mall customer behavior measurement: in exchange for shopping vouchers, customers can wear the device while shopping; it records the customer's behavior and what he/she looked at.
- Military
- Pursuit games, treasure hunt, urban games
- Multimedia and entertainment: a new device for any video game console able to analyze the player’s behavior and allowing avatar-type interaction
- Lifestyle and healthcare
- Medical applications: mobility behavior analysis and wound recovery
- Sports: can measure how long it takes an injured athlete to fully recover
- Elderly or handicapped persons: can guide a visually impaired person throughout his/her journey
- Education