Affective gaming

Affective gaming uses the player's current emotional state to influence game-play. Affective gaming can be instrumented through biofeedback, but it is important to note that simply using, for instance, the flexing of a muscle as a way of controlling a character is merely biofeedback gaming. To make a game affective, one has to use the biometric data to try to understand and interpret the player's emotional states, feelings and reactions.
Biofeedback
Seeing that your heart-rate is high in a HUD while playing a game, and then realizing you need to take a deep breath and calm down, is an example of biofeedback.
Biofeedback mechanisms for games can be divided into two sub-groups. Indirect biofeedback consists of physiological processes like heart-rate, pulse, respiratory rate and body temperature. Direct biofeedback consists of things we are directly in control of, like the flexing of a muscle. For affective games, indirect biofeedback is where most of the unconscious processes lie, which makes it the most suitable to analyze in order to understand the user's emotional states and reactions. There are exceptions: for instance, one can manipulate the respiratory rate by being aware of it, and muscles can also contract involuntarily.
To use biofeedback, sensors are needed. Newer gaming equipment like the Mionix Naos QG allows us to get heart rate and galvanic skin response without attaching several separate sensors.
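As a minimal sketch of how such readings could be turned into an affective signal, the function below combines heart rate and galvanic skin response into a single arousal estimate. The baselines and weighting are invented for illustration; a device such as the Mionix Naos QG would supply the raw readings through its own SDK.

```python
# Sketch: estimating arousal from heart rate and galvanic skin response (GSR).
# Baseline values and the 50/50 weighting are illustrative assumptions.

def arousal_level(heart_rate, gsr, baseline_hr=70.0, baseline_gsr=2.0):
    """Combine normalized deviations from the player's resting baseline
    into a single arousal estimate in roughly the 0..1 range."""
    hr_delta = max(0.0, (heart_rate - baseline_hr) / baseline_hr)
    gsr_delta = max(0.0, (gsr - baseline_gsr) / baseline_gsr)
    return min(1.0, 0.5 * hr_delta + 0.5 * gsr_delta)

print(arousal_level(70, 2.0))   # resting player -> 0.0
print(arousal_level(105, 3.0))  # elevated readings -> 0.5
```

A real implementation would calibrate the baselines per player during a resting period, since resting heart rate and skin conductance vary widely between individuals.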
Existing work
Allanson, Dix and Gilleade (2005)
Allanson, Dix and Gilleade laid the foundation and first definitions of affective gaming.
They used a biosensor to capture the ECG HRV (Electrocardiogram Heart Rate Variability) and GSR (Galvanic Skin Response) of a player in real-time. Although those two measurements do not give a complete biometric analysis, they do provide insight into the player's reaction to specific game events.
The aspects they changed within the game, based on the biometric data combined with <u>predefined base levels</u>, were:
* Based on heart-rate
** The character's speed
** Screen-shake
** Shader effects
*** Red filter (excited/stressed)
*** White and black filter (calm)
* Based on skin response
** The volume of the environment
** The gravity and density of the environment
** Semi-transparent environment
* Based on both
** Weapon damage
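The comparison against predefined base levels can be sketched as follows. The thresholds and effect names are invented for illustration, not taken from the paper.

```python
# Sketch of the Allanson-style mapping above: compare a live heart-rate
# reading against a predefined base level and pick a shader effect.
# The base level and the 1.2/0.9 thresholds are illustrative assumptions.

BASE_HEART_RATE = 75  # predefined base level (beats per minute)

def shader_for_heart_rate(heart_rate):
    """Red filter when excited/stressed, black-and-white when calm."""
    if heart_rate > BASE_HEART_RATE * 1.2:
        return "red_filter"
    if heart_rate < BASE_HEART_RATE * 0.9:
        return "black_white_filter"
    return "none"

print(shader_for_heart_rate(100))  # -> red_filter
print(shader_for_heart_rate(60))   # -> black_white_filter
```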
Kalyn, Lough, Mandryk and Nacke (2011)
Kalyn, Lough, Mandryk and Nacke utilized both indirect and direct physiological control/biofeedback in order to enhance game interaction.
To get all the inputs, they attached various sensors to the players' bodies and used those as a means to control the game (video). The aspects they used biofeedback to change within the game:
* Enemy Target Size
** Respiratory rates (C1)
** Skin conductance (C2)
* Length of the flame (Flamethrower)
** Skin conductance (C1)
** Respiratory rates (C2)
* Speed and jump height
** Heart-beat (C1)
** Muscle contraction (C2)
* Weather condition and boss speed (final boss)
** Body temperature (C1)
** Heart-beat (C2)
* Medusa's gaze (Eye tracker, froze enemies and moving platforms)
** Gaze (C1)
** Gaze (C2)
C1 denotes the first condition, and C2 the second.
They found that direct physiological functions can be used as a means of replacing traditional controller input, but they should reflect an action in the real world. Indirect physiological control is most appropriately used indirectly, to change the environment and similar aspects.
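Their finding can be sketched as two contrasting handlers: a direct signal replaces a controller input when it mirrors a real-world action, while an indirect signal quietly steers the environment. All names and thresholds below are illustrative.

```python
# Sketch of the finding above. Direct control: a muscle flex maps to an
# action the flex resembles (a jump). Indirect control: heart-rate scales
# an environmental parameter. Thresholds are illustrative assumptions.

def handle_direct(muscle_contraction):
    """Direct signal replaces a button press for a matching action."""
    return "jump" if muscle_contraction > 0.5 else "idle"

def handle_indirect(heart_rate):
    """Indirect signal scales the boss speed, invisible to the player."""
    boss_speed = 1.0 + max(0.0, (heart_rate - 70) / 100.0)
    return round(boss_speed, 2)

print(handle_direct(0.8))   # -> jump
print(handle_indirect(90))  # -> 1.2
```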
Torres (2013)
Torres wrote his Master's thesis on "Development of Biofeedback Mechanisms in a Procedural Environment Using Biometric Sensors", building on the papers mentioned previously in this section. He looked at how to utilize both indirect and direct biofeedback appropriately.<ref name=":1" />
He created a biofeedback game development framework named The Emotion Engine (E²), not to be confused with Sony's Emotion Engine CPU.
E² was applied in the survival-horror game VANISH.
E² - The Emotion Engine
The Emotion Engine uses different types of indirect biofeedback to change the game:
Visible Indirect Biofeedback (V-IBF) - Adjusting aspects of the game which are noticeable by the player.
Non-Visible Indirect Biofeedback (NV-IBF) - Adjusting aspects of the game which are not noticeable by the player.
Emotional Regulation Biofeedback (ERB) - Studies the player through the game, and uses the information to dynamically trigger/adjust specific game mechanics.
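One way the three categories above might be dispatched is sketched below. The category names come from the thesis; the handlers, effects and thresholds are hypothetical.

```python
# Sketch: dispatching the three E² feedback categories. The specific
# effects (screen tint, spawn chance, scare trigger) are assumptions.

def apply_biofeedback(category, arousal):
    if category == "V-IBF":
        # Visible: adjust something the player notices, e.g. screen tint.
        return {"screen_tint": arousal}
    if category == "NV-IBF":
        # Non-visible: adjust something the player does not notice,
        # e.g. enemy spawn probability.
        return {"spawn_chance": round(0.1 + 0.4 * arousal, 2)}
    if category == "ERB":
        # Emotional regulation: trigger a mechanic based on the player
        # profile built up during play (e.g. scare a too-calm player).
        return {"trigger_scare": arousal < 0.3}
    raise ValueError(f"unknown category: {category}")

print(apply_biofeedback("NV-IBF", 0.5))  # -> {'spawn_chance': 0.3}
```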
Potential applications
Biofeedback can be utilized for competitive games, e-sports and casting of e-sports. With biofeedback the e-sports players can see when they need to calm down, or when another team member needs help as they might be "stressing out". It can also help the e-sports players and teams to analyze their performance, and it provides deeper analysis opportunities for casters.
Many games today change aspects of the story based on choices and decisions you make during the game; for instance, Dark Souls III has four different endings.
One could incorporate biofeedback mechanisms to also tailor the game based on the player's emotional states and reactions to events within the game. In combination with VR, this could create more immersive and even more personalized game-play than previously possible.
The technology can also contribute to various aspects within e-health systems like Qualcomm's , and contribute to big data projects.
Problems
For biofeedback devices like the Mionix Naos QG, it could be problematic for game developers to support the sensors if there were a sudden boom in devices incorporating different kinds of sensors and methods for reading the data. If there were 50 different devices, each with different sensors and data formats, it would be difficult for game developers to support 50 different standards.
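One common remedy for this kind of fragmentation, sketched below, is a shared sensor interface that device-specific adapters implement, so game code depends on one API rather than 50. The interface and device classes here are hypothetical, not an existing standard.

```python
# Sketch: a common sensor interface behind which device adapters hide
# vendor-specific details. The adapter's fixed return values stand in
# for real SDK calls and are purely illustrative.

class BiofeedbackSensor:
    """Minimal common interface every device adapter implements."""
    def read_heart_rate(self):
        raise NotImplementedError
    def read_gsr(self):
        raise NotImplementedError

class NaosQGAdapter(BiofeedbackSensor):
    """Would wrap the vendor SDK; returns fixed values here."""
    def read_heart_rate(self):
        return 72
    def read_gsr(self):
        return 2.1

def game_loop_tick(sensor):
    # Game code only ever sees the common interface, never the device.
    return {"hr": sensor.read_heart_rate(), "gsr": sensor.read_gsr()}

print(game_loop_tick(NaosQGAdapter()))  # -> {'hr': 72, 'gsr': 2.1}
```

Supporting a new device would then mean writing one adapter, rather than changing every game that reads biometric data.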