Eco-innovation diffusion

Eco-innovation diffusion is a research subfield of innovation diffusion that seeks to explain how, why, and at what rate new "ecological" ideas and technologies spread. Eco-innovations refer to a wide range of innovations such as efficient energy use, renewable energy technologies, pollution prevention schemes, waste management equipment, green technology and sustainable agriculture. Although the eco-innovation literature typically focuses on policy, regulations, technology, market, and firm-specific factors rather than on diffusion, understanding eco-innovation diffusion has recently gained importance.
What drives eco-innovation?
Micro (or company) level
There is no consensus as to what factors drive eco-innovation at the company level. Studies leaning towards the field of innovation indicate that demand factors play an important role in the generation of eco-innovations. On the other hand, insights from the management literature on corporate social responsibility (CSR) suggest that societal pressure and demand for environmentally friendly products and processes may affect a firm's decision to undertake eco-innovations, but have little or no impact on the level of investment in eco-innovations. This may suggest that firms initiate eco-innovations in order to satisfy minimum customer and societal requirements, while increased investment in eco-innovations is stimulated by other factors such as cost savings, firms' organizational capabilities and stricter regulations. The stringency of environmental regulations also matters: firms respond to stricter environmental regulations with higher levels of eco-innovation, although only the least and the most innovative firms are driven primarily by regulatory requirements. The "not in my backyard" (NIMBY) concept is often used to describe what at first seems to be a confusing intention-behavior gap between high levels of public support for eco-innovations and frequent non-engagement or even local hostility towards specific projects, such as onshore wind farms. As Gyamfi et al. and Byrka et al. argue, social psychology and economic behavior models should be used to overcome these challenges.
A well-explored factor that fosters the diffusion of eco-innovations is conformity to others. Innovations naturally involve uncertainty triggered by insufficient knowledge. In such situations of uncertainty, the opinions and behaviors of friends and neighbors often serve people as a guideline for their own behavior. In empirical research, peer effects were found to increase the adoption of solar panels, electric vehicles, sustainable transport and residential photovoltaic systems, and to reduce residential energy usage in households.
A less studied factor is the cost related to adoption, which may effectively hinder this process. If external barriers are modeled in the literature at all, they usually refer to price regulations and the economic burden placed on customers. To model the diffusion of eco-innovations, two main classes of models are considered in the literature: aggregate innovation diffusion models and agent-based models.
It was the seminal works of Fourt and Woodlock, Rogers, and Bass in the 1960s that triggered studies on innovation diffusion. In particular the Bass model, which is defined by a simple differential equation that characterizes diffusion as a contagious process initiated by mass media and propelled by word-of-mouth communication, has seen countless applications. In the context of eco-innovations, variants of the Bass model have been used, among others, to study the diffusion of wind power technology, green electricity tariffs, stationary fuel cells, photovoltaic-system support schemes and consumer demand for smart metering tariffs.
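To make the mechanism concrete, the following is a minimal numerical sketch of the Bass model, <math>dF/dt = (p + qF)(1 - F)</math>, where <math>F</math> is the cumulative fraction of adopters; the coefficient values used here are common textbook illustrations rather than estimates for any particular eco-innovation:

<syntaxhighlight lang="python">
import numpy as np

def bass_adoption(p=0.03, q=0.38, m=1000, T=30, dt=0.01):
    """Euler integration of the Bass model dF/dt = (p + q*F)(1 - F).

    p: coefficient of innovation (mass-media influence)
    q: coefficient of imitation (word-of-mouth influence)
    m: market potential (total number of eventual adopters)
    """
    steps = int(T / dt)
    F = np.zeros(steps)                      # cumulative adopter fraction
    for t in range(1, steps):
        F[t] = F[t - 1] + dt * (p + q * F[t - 1]) * (1 - F[t - 1])
    return m * F                             # cumulative adopters over time

adopters = bass_adoption()
print(f"adopters after 10 periods: {adopters[int(10 / 0.01)]:.0f}")
</syntaxhighlight>

The innovation coefficient <math>p</math> drives early, media-initiated adoption, while the imitation coefficient <math>q</math> produces the characteristic S-shaped word-of-mouth takeoff.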
What makes a good eco-innovation diffusion (agent-based) model?
It is not easy to define what makes a good model for the diffusion of eco-innovations. Simplicity and usefulness are the key factors; the challenge, however, resides in reducing the often complex theories of social science and a complex reality to simple sets of rules. A common starting point, advocated among others by Byrka et al., is to describe agents by binary states ("adopted" or "not adopted"). The idea of binary states is natural from the social point of view, since a dichotomous response format with 1 for "yes", "true" or "agree" and 0 for "no", "false" or "disagree" as response options is one of the most common in social experiments. Agents with more than two states or continuous opinions have also been considered in the literature, but Byrka et al. argue for the simpler binary description. However, conformity is not the only type of social response, and the other three main types (independence, anti-conformity and congruence) should also be taken into account. Recently these types of social response have been formalized within the class of "spinson models", i.e. models of interacting dichotomous agents.
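As one rough illustration of how the four response types can be encoded for a binary agent, consider the sketch below; the mapping (in particular treating independence as a random resetting of the state and congruence as keeping a state that already matches the group) is one possible formalization, not the definitive one used in the spinson literature:

<syntaxhighlight lang="python">
import random

def respond(own_state, group_state, response_type):
    """Return the new state (0/1) of a binary agent ('spinson') exposed
    to a unanimous influence group holding group_state.
    The mapping below is an illustrative formalization."""
    if response_type == "conformity":       # yield to group pressure
        return group_state
    if response_type == "anti-conformity":  # deliberately oppose the group
        return 1 - group_state
    if response_type == "independence":     # ignore the group entirely
        return random.randint(0, 1)
    if response_type == "congruence":       # already in agreement, keep state
        return own_state
    raise ValueError(response_type)
</syntaxhighlight>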
Difficulty of engagement
Difficulty is considered a key predictor of behavior in psychology and is included in the most recognized models of behavior change, such as the Theory of Planned Behavior. Typically, behavioral difficulty is considered to be subjective and person-dependent (relying on people's perceptions). However, a more objective, person-independent difficulty seems more relevant for environmental or energy policy, because people often fail to recognize how challenging the barriers they have to overcome really are, and their perceptions may depend on mood or current circumstances. A model of behavior change with person-independent difficulty, known as the Campbell Paradigm, has been proposed in response. This model treats the likelihood of individual behavior as a function of a person's attitude and of the difficulty of engaging in this behavior. The more demanding the barriers are, the more favorable an attitude towards a general goal, such as environmental protection, a person needs in order to overcome them. The relation between the difficulty of a behavior, a person's attitude and the resulting behavior can be computed using a one-parameter logistic Rasch model, with the difficulty of a given behavior estimated from the proportion of persons who engage in it. Since it can be easily measured through market surveys, the difficulty of engagement provides a much needed practical link between the model and the real world.
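In the standard one-parameter logistic (Rasch) parameterization, which is presumably the form meant here, the probability that person <math>k</math> engages in behavior <math>i</math> depends only on the difference between the person's attitude <math>\theta_k</math> and the behavior's difficulty <math>\delta_i</math>:

<math>P(x_{ik}=1) = \frac{\exp(\theta_k - \delta_i)}{1 + \exp(\theta_k - \delta_i)}.</math>

Engagement is at 50% likelihood when attitude exactly matches difficulty; more demanding behaviors (larger <math>\delta_i</math>) require correspondingly more favorable attitudes.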
The size of the influence group
The optimal group size for discussions, collaboration, etc., has been an issue of interest for decades. In the management context, Hackman and Vidmar concluded from a cross-sectional study that the optimal team size was 4.6 members, while Wheelan found that groups containing 3 to 6 members were significantly more productive and more developmentally advanced than larger groups. From the point of view of social influence, the group cannot be too small (it has to be of a sufficient size to invoke social pressure) nor too large (people tend to discuss in small groups); the "optimal" group size has been found to vary between 3 and 5, depending on the experiment. For these reasons, instead of threshold models, so-called q-voter models may be used, in which each individual interacts with a group of <math>q</math> of its neighbors (in a social network context, neighbors are all nodes with a link to the target node) rather than with all of them, as the sketch below illustrates.
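Below is a minimal sketch of such an update rule under a unanimity variant; the data structures and the handling of agents with fewer than <math>q</math> neighbors are illustrative choices, and published variants of the model treat non-unanimous groups differently:

<syntaxhighlight lang="python">
import random

def q_voter_step(state, neighbors, q=4):
    """One asynchronous update of a q-voter-type model.

    state:     dict mapping agent -> 0/1 (not adopted / adopted)
    neighbors: dict mapping agent -> list of its network neighbors
    q:         size of the randomly drawn influence group
    """
    agent = random.choice(list(state))
    if len(neighbors[agent]) < q:
        return                            # too few neighbors to form a group
    panel = random.sample(neighbors[agent], q)
    opinions = {state[n] for n in panel}
    if len(opinions) == 1:                # unanimous influence group
        state[agent] = opinions.pop()     # agent yields to group pressure
</syntaxhighlight>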
Unanimity or majority
Conformity can be modeled in different ways, but the most common is to assume that a predefined ratio <math>r\in (0,1]</math> of adopted neighbors is needed to influence an agent: <math>r = 1</math> corresponds to unanimity, whereas any <math>r > 0.5</math> yields a majority rule. Independence from group pressure, in turn, can be introduced on the agent level in two distinct ways:
* In the person approach there is a fraction <math>p</math> of individuals that are permanently immune to social influence and a fraction <math>(1-p)</math> of agents who are susceptible to it. The distribution of individual independence levels consists of two peaks: one of height <math>p</math> at <math>p_i = 1</math> and the second of height <math>(1-p)</math> at <math>p_i = 0</math>; it can be treated as an approximation of a bimodal distribution.
* In the situation approach all agents have the same level of independence <math>p</math>, i.e. every individual behaves independently with probability <math>p</math> and conforms to group pressure with probability <math>(1-p)</math>. The distribution of independence consists of a single peak of height 1 at <math>p_i = p</math> and can be treated as an approximation of a uni-modal bell-shaped curve.
These two extreme scenarios, although introduced on the agent level, can yield very different results on the aggregate or system level; a minimal sketch contrasting them is given below.
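The scheme below is illustrative (the names and the population size are arbitrary); under these assumptions both variants produce the same average share of independent acts:

<syntaxhighlight lang="python">
import random

N, p = 1000, 0.2  # illustrative population size and independence level

# Person approach: a fixed fraction p of agents is permanently immune
# to social influence; the remaining agents always conform.
immune = set(random.sample(range(N), int(p * N)))

def acts_independently_person(agent):
    return agent in immune

# Situation approach: any agent acts independently with probability p
# at each interaction, regardless of who it is.
def acts_independently_situation(agent):
    return random.random() < p
</syntaxhighlight>

Both rules give the same expected fraction <math>p</math> of independent acts, but the person approach freezes the heterogeneity at the agent level, which is what can drive the different aggregate outcomes.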
A separate modeling choice concerns the structure of the social network on which agents interact. As there is no single, optimal structure, different specifications have been considered in the literature (a short sketch generating several of them follows this list):
* Complete graphs, in which every agent is connected with every other agent. Although this is unrealistic for large social networks, it is suitable for describing small groups (or cliques) within larger structures. This topology allows for analytical treatment.
* Regular networks, like the chessboard, which are commonly used for spatial modeling.
* Random graphs, in which any two agents are connected with some probability, independently of the other links. This topology also allows for analytical treatment, but is unrealistic as it does not admit high clustering. Yet, it often serves as a benchmark for comparisons with other network structures.<ref name=":4" />
* Small-world networks, a much more realistic topology that possesses two desired features: short path lengths and high clustering. Real communities exhibit the small-world topology with strong clustering.
* Scale-free or Barabási-Albert networks, which are characterized by a power-law tail in the degree distribution. Structures like the Internet are regarded as scale-free networks.
* Real-world networks, like "friends lists" from Facebook or "circles" from Google+.
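For readers who want to experiment, standard generators in the Python networkx library cover most of these topologies; the sizes and parameters below are arbitrary illustrative choices:

<syntaxhighlight lang="python">
import networkx as nx

N, k = 1000, 4  # illustrative network size and mean degree

graphs = {
    "complete":    nx.complete_graph(50),               # small clique
    "regular":     nx.grid_2d_graph(30, 30),            # chessboard lattice
    "random":      nx.erdos_renyi_graph(N, k / N),      # Erdos-Renyi
    "small-world": nx.watts_strogatz_graph(N, k, 0.1),  # Watts-Strogatz
    "scale-free":  nx.barabasi_albert_graph(N, k // 2), # Barabasi-Albert
}

# High clustering is what distinguishes small-world graphs from random
# graphs of comparable density.
for name, g in graphs.items():
    print(f"{name:12s} average clustering = {nx.average_clustering(g):.3f}")
</syntaxhighlight>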
 