Another relevant feature is the sensitivity of complex adaptive systems to their environment. For example, Langton’s (1992) artificial life simulations showed that dramatically different system-level behaviors can emerge from small perturbations of environmental conditions: small changes led to three distinct states that might be called “highly ordered”, “edge of chaos”, and “highly chaotic” (Langton, 1992). While the highly ordered and highly chaotic states resulted in failure, systems that evolved to the “edge of chaos” produced the type of complex and adaptive responses that allowed life to thrive. Moreover, the differences between these states were quite salient, implying that such systems have an internal threshold function that produces abrupt ‘tipping point’ transitions between adaptive and maladaptive states.
Synthesis and Recapitulation
In each literature, we found evidence for an inverted U-shaped relationship between the amount of structure and performance. We were particularly struck by the commonality across diverse structures in the organization studies, network sociology, strategy, and complexity theory streams of research. Whether the structures are roles, linkages, rules, or schemata, a common logic of adaptation appears to underlie all of these observations – i.e., a moderate amount of structure leads to higher performance (Kauffman, 1993; Gell-Mann, 1994).
The logic that underlies these observations leads to two hypotheses that form the theoretical core of the emerging theory that we seek to explore and extend. The first hypothesis links the amount of structure with high performance. Consistent with research in organization studies, network sociology, strategy, and complexity theory, we argue that over-structured systems constrain behavior by impeding improvisational response to dynamic environments (Weick, 1976; Reynolds, 1987; Langton, 1992; Kauffman, 1993), whereas systems that are under-structured lack the coherence to efficiently respond to changes in these environments (Brown and Eisenhardt, 1998; Weick, 1998). This points to the existence of an optimal level of organizational structure – i.e., a region of moderate structural complexity giving rise to the highest performance.
H1: Organizational performance has an inverted U-shaped relationship with the amount of structure.
The second hypothesis links environmental dynamism to the optimal amount of structure, and builds on the inverted U-shaped relationship between performance and structure (H1). Consistent with extant research (Haveman, 1992; Pisano, 1994; Eisenhardt and Tabrizi, 1995), we argue that as environmental dynamism increases, the need to respond quickly and flexibly to changing opportunities becomes more critical than efficiency for achieving effective performance. Simpler structures therefore become more useful because they are applicable to a broad array of opportunities (Brown and Eisenhardt, 1998; Rowley et al., 2000) and enable improvised actions that are better suited to the specific demands of capturing any particular opportunity (Miner et al., 2001; Rindova and Kotha, 2001). Conversely, as environmental dynamism decreases, more structure becomes more effective (Miller and Shamsie, 1996). In these settings, managers can establish complicated and dense structures that are customized to the environment because change occurs infrequently and often incrementally (Tushman and Anderson, 1986; Miller and Shamsie, 1996; Siggelkow, 2001). Thus, while structure is less effective in more dynamic environments where the flexibility to adjust to new opportunities is key, it is more effective in less dynamic markets where efficiency is critical and possible.
H2: As environmental dynamism increases, the optimal amount of structure decreases.
While these hypotheses capture the core theoretical relationships regarding the tension between too much and too little structure that appear in the extant literature, they leave open key theoretical issues. For example, it is unclear whether the inverted U-shaped relationship is symmetric – that is, whether it is more advantageous to err on the side of too much structure or too little. It is also unclear whether there is a wide range of optimal structures, suggesting that the optimal structure is easy to find and manage, or a very narrow range, suggesting a managerially challenging edge of chaos. It is further unclear how, if at all, various attributes of market dynamism (e.g., velocity, complexity, ambiguity, unpredictability) might influence the tension between too much and too little structure. Finally, the role of mistakes has not been explored. Thus, our research has two objectives: to confirm the internal validity of the above hypotheses using the precision of simulation, and, more significantly, to conduct virtual experiments that probe the open theoretical issues, thereby elaborating and extending the theory.
METHODS
We conduct this research using simulation methods. Specifically, we model the environment as a flow of heterogeneous opportunities, and the organization as a collection of rules for executing those opportunities. We chose simulation because it is a particularly effective method for research such as ours where the basic outline of the theory is understood, but its underlying theoretical logic is limited (Davis et al., 2006). That is, simulation “facilitates the identification of structures common to different narratives” (Rudolph and Repenning, 2002: 4). Given its computational precision, simulation is useful for internal validation of underlying theoretical logic as well as elaboration and extension of theory through experimentation (March, 1991; Zott, 2003). The result is often a more complete theory with greater clarity in its assumptions, theoretical relationships, constructs and underlying logic. As Sastry (1997: 237) notes, simulation helps “examine the completeness, consistency, and parsimony of the causal explanation laid out in an existing theoretical model.”
Simulation is also a particularly useful method when the focal phenomenon is non-linear (Carroll and Burton, 2000; Davis et al., 2006). While inductive case and statistical methods may indicate the presence of non-linearities, they are less precise than simulation in the calibration of these non-linearities, particularly complex ones such as tipping points, cusps, and skews. Fundamental tensions (such as in our study) often exhibit these non-linear effects (Rudolph and Repenning, 2002). Simulation is also particularly useful for research such as ours in which empirical data are challenging to obtain (Davis et al., 2006). For example, simulation enables us to study mistakes, which are often difficult to measure (e.g., people are reluctant to acknowledge them), and to unpack the distinct effects of environmental dimensions that may be difficult to disentangle in actual environments. Finally, simulation is particularly useful for understanding longitudinal and process phenomena such as ours because it can track these processes over longer time periods than would be realistically possible with empirical data (Sastry, 1997; Zott, 2003).
While several simulation approaches (e.g., system dynamics, genetic algorithms) are available, we use stochastic process modeling. This approach enables researchers to custom-design the simulation because it makes no particular assumptions about the research question and is not constrained by an explicit problem structure (e.g., a cellular automata grid). Rather, it allows the researcher to piece together processes that closely mirror the focal theoretical logic, bring in multiple sources of stochasticity (e.g., arrival rates of opportunities), and characterize them with a variety of stochastic distributions (e.g., Poisson, gamma, normal) (Davis et al., 2006).
Stochastic modeling is a particularly effective choice for our research because the problem structure does not fit well with any structured approach. Stochastic modeling enables us to accurately represent the phenomenon rather than force-fitting it into an ill-suited structured approach. Further, since the rough outline of our base phenomenon is established in the empirical literature, using stochastic modeling enhances the likelihood of a realistic model (Burton and Obel, 1995; Carroll and Harrison, 1998; Burton, 2003), and thus mitigates a key criticism of simulation. Finally, stochastic modeling is also a particularly effective choice because its multiple sources of stochasticity offer greater latitude than more structured approaches to experiment and develop new theoretical insights (Davis et al., 2006). For example, we are able to include several sources of stochasticity that are theoretically important in this research (e.g., improvisation of behaviors, flow of opportunities) and to experiment with theoretically relevant environmental attributes (e.g., velocity, ambiguity).
Modeling the Environment and Organization
There are two major components of our simulation model: environment and organization. We model the environment as a flow of heterogeneous opportunities. This is consistent with the conceptualization of market dynamism in the entrepreneurship (Shane, 2000) and Austrian economics (Hayek, 1945; Kirzner, 1997) literatures where such environments are focal. We model the organization as a collection of rules for executing these opportunities. This conceptualization is consistent with the organization theory and strategy literatures which indicate that rules are an important type of structure in dynamic environments (Brown and Eisenhardt, 1997), and can be used to capture key market opportunities (Rindova and Kotha, 2001; Bingham and Eisenhardt, 2005). While we could have operationalized structure in several ways (e.g., nodes, linkages, roles), we chose rules because of their importance in dynamic markets (Burgelman, 1994) and their relevance in organization theory (Cyert and March, 1963) and strategy (Nelson and Winter, 1982; Teece et al., 1997), our two primary areas of interest.
A brief outline of the model follows. The organization has a set of rules to capture opportunities in its surrounding environment. In each time period, the organization takes some rule-based actions to capture a given opportunity (e.g., selection of opportunities, execution of opportunities). But since the organization typically does not have rules covering all aspects of an opportunity, it also takes some flexibly improvised actions. Thus, the firm takes both rule-based and improvised actions to capture opportunities. When these actions match the opportunity, the opportunity is captured and firm performance increases by the value of the opportunity. A key point is that the firm’s actions (both rule-based and improvised) require managerial attention, which is bounded (March and Simon, 1958). Further, since improvised action involves real-time sensemaking (Weick, 1993) and thoughtful convergence of design and action (Miner et al., 2001), it requires more attention than rule-based action. Therefore, the organization has a limited number of actions that it can take in any one time period. When the firm’s attention budget is used up, the firm cannot take actions until its attention is replenished in the next time period.
As in all research, we make several assumptions. First, in order to focus on the effects of the amount of structure on performance, we assume that all rules are appropriate for at least some opportunities. In addition, since recent empirical research indicates that simple heuristics such as ours are learned quickly and stabilize rapidly (Bingham and Eisenhardt, 2005), we assume that the rules have already been learned, and that adaptation to new environments occurs through improvised action. Finally, we assume that the effects of competitors are realized through the flow of opportunities, an assumption that mirrors the Austrian economics argument that market dynamism is endogenously created by the actions of competitors (Kirzner, 1997). Research on competitive action suggests that this is likely to be a valid assumption in dynamic environments (D'Aveni, 1994; Roberts, 1999; Hill and Rothaermel, 2003). Nonetheless, while reasonable, these assumptions could be explored in future research.
Environment: Market Dynamism
While some research conceptualizes the environment in terms of a single environmental attribute such as “velocity” (Eisenhardt, 1989) or “complexity” (Gavetti, Levinthal, and Rivkin, 2005), we model the environment as a heterogeneous flow of opportunities characterized by four dimensions – velocity, complexity, ambiguity, and unpredictability. We chose these dimensions because they are the attributes that research has shown to be relevant in dynamic markets (D'Aveni, 1994; Grant, 1996; Eisenhardt and Tabrizi, 1995; Rindova and Kotha, 2001).
We define velocity as the pace of opportunity flow into a given environment. The velocity of opportunity flow is a key dimension of market dynamism because it influences the nature of major organizational activities like strategic decision making (e.g., Hickson et al., 1986; Eisenhardt, 1989) and product innovation (Eisenhardt and Tabrizi, 1995). The Internet bubble is an example of an environment with a high velocity of opportunities. We define complexity as the degree to which environmental opportunities have many features that must be successfully executed. Factors like institutional norms (DiMaggio and Powell, 1983), geographic or material-resource constraints (Dill, 1958), and technical challenges (Tyre and Orlikowski, 1993) increase the complexity of opportunities because they increase the number of opportunity features that must be correctly executed. The implication of higher complexity is that opportunities become more difficult to capture. Biotechnology is an example of a high complexity environment because many facets of an opportunity must be correct to achieve success (Hill and Rothaermel, 2003). Ambiguity is the degree to which the key features of opportunities are difficult to interpret. Ambiguity is an important aspect of the environment because ambiguous environments are equivocal, and so are challenging to perceive (March and Olsen, 1976; Hickson et al., 1986). Nascent markets such as nanotechnology are environments with high ambiguity (Santos and Eisenhardt, 2006). Unpredictability is the degree to which past opportunities are dissimilar from present ones and so are unforeseeable. In predictable environments, leaders can use anticipated patterns to capture opportunities (Dess and Beard, 1984). In contrast, leaders have few or no patterns to exploit in unpredictable environments (Baum and Wally, 1999). Growth markets, for example, are often unpredictable (Eisenhardt and Schoonhoven, 1990).
Operationalization of Market Dynamism
Velocity is operationalized as the rate of opportunity flow into the environment. Specifically, a Poisson distribution is used to model a stochastic flow of opportunities into the environment with velocity λ. A Poisson distribution, p(k), describes the probability of k opportunities arriving in t timesteps and is determined by the single rate parameter λ:
p(k) = \frac{(\lambda t)^{k} e^{-\lambda t}}{k!} \qquad (1)
The Poisson distribution is well known and widely used to model arrival flows (Cinlar, 1975; Glynn and Whitt, 1992). It is particularly attractive here, and in many simulations, because it makes few underlying assumptions about the timing of opportunities (Law and Kelton, 1991). Although λ can range from 0 to infinity, we fix an upper bound because bounded rationality and resource constraints limit the number of opportunities that can be introduced into the environment (Shane, 2000).
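To make the operationalization concrete, the following is a minimal Python sketch of drawing opportunity arrivals from a Poisson distribution. It is an illustrative reconstruction, not the authors’ code (the model was implemented in Matlab and Java), and the parameter values shown are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def opportunity_arrivals(lam, n_timesteps):
    """Draw the number of opportunities arriving in each timestep
    from a Poisson distribution with rate (velocity) lam."""
    return rng.poisson(lam=lam, size=n_timesteps)

arrivals = opportunity_arrivals(lam=1.4, n_timesteps=200)  # a 'high' velocity run
```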
Complexity is operationalized as the number of features of an opportunity that must be correctly executed to capture the opportunity. Similar to computational complexity (i.e., the number of states needed to complete a computation) (Simon, 1962; Sipser, 1997), greater complexity makes an opportunity more difficult to capture because the organization must get many features right to achieve success. Specifically, complexity is operationalized as an integer indicating the number of opportunity features that must be correct in order to capture a given opportunity. Since each opportunity has 10 features, complexity ranges from 0 to 10.
Ambiguity is operationalized as the difference between the actual features of an opportunity and its features as perceived by the firm. The actual features of opportunities are modeled as a 10-element bit string (i.e., vector) of 1s and 0s – e.g., 0100100110. The perceived features of the same opportunity are also a 10-element bit string of 1s and 0s, but differ from the actual features by those features for which perception does not match reality – e.g., 0110100110. Thus, environmental ambiguity is operationalized by the proportion of perceived features that differ from actual features. For example, ambiguity = 0.1 could produce the example bit strings above since they differ by only 1 element of 10. This is a particularly useful way to model environmental ambiguity because it makes precise the relationship between the perceived and actual features of opportunities. Ambiguity ranges from 0 to 1.
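A minimal sketch of this perception process follows, assuming (as one plausible implementation) that each actual feature is flipped independently with probability equal to the ambiguity parameter; the exact mechanism is given in the Technical Appendix, so treat this as illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def perceive(actual, ambiguity):
    """Return the perceived feature vector: each actual feature is
    flipped (1 <-> 0) with probability equal to the ambiguity level."""
    flips = rng.random(actual.size) < ambiguity
    return np.where(flips, 1 - actual, actual)

actual = np.array([0, 1, 0, 0, 1, 0, 0, 1, 1, 0])  # e.g., 0100100110
perceived = perceive(actual, ambiguity=0.1)         # ~1 of 10 features differs
```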
Unpredictability is operationalized by the extent to which the features of opportunities in the present are dissimilar to the features of opportunities in the past. We use an entropy measure of unpredictability (Cover and Thomas, 1991), and vary entropy by varying the probability, p, of a given feature being a ‘1’. Formally, the entropy, H, of an opportunity is the negative sum, over all possibilities (here only two: ‘1’ or ‘0’), of p·log2(p):
H = -\sum_{i \in \{0,1\}} p_i \log_2 p_i \qquad (2)
For example, when the probability of a feature being ‘1’ is 0.5, or p(1)=0.5 – and, by implication, the probability of its being ‘0’ is 0.5, or p(0)=0.5 – the entropy is at its maximum of H=1 and unpredictability is thus high as well (Cover and Thomas, 1991). In contrast, when there is a higher probability of elements equal to ‘1’ than ‘0’, the opportunities have a lower entropy (H<1) and thus less unpredictability. The implication of less unpredictability is that leaders are more able to use rules to exploit underlying patterns in the opportunity flow because these patterns are more likely to exist. For example, a stream of opportunities with a high probability of 6 recurring 1s (1111110101... 1111110010... 1111110110) can be matched by rules with that same pattern. Thus, we operationalize unpredictability by giving the rules the same proportion of ‘1’s as we give the opportunities. This operationalization of unpredictability is especially useful because it captures the insight that organizations can learn underlying patterns of opportunities and incorporate them into their heuristics (Cyert and March, 1963; Bingham and Eisenhardt, 2005). In particular, less unpredictability is associated with more pattern and more unpredictability with less pattern. Unpredictability ranges from 0 to 1. See the Technical Appendix for more detail.
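A short sketch of the entropy calculation in equation (2), together with drawing opportunity features with a given probability of ‘1’; the function names are hypothetical.

```python
import numpy as np

def binary_entropy(p_one):
    """Entropy H (equation 2) of a feature that is '1' with probability p_one."""
    probs = np.array([p_one, 1.0 - p_one])
    probs = probs[probs > 0]          # treat 0 * log(0) as 0
    return float(-(probs * np.log2(probs)).sum())

print(binary_entropy(0.5))  # 1.0   -> maximal unpredictability
print(binary_entropy(0.9))  # ~0.47 -> more pattern, less unpredictability

rng = np.random.default_rng(seed=3)

def draw_opportunity(p_one, n_features=10):
    """Draw a 10-element feature vector with P(feature = 1) = p_one."""
    return (rng.random(n_features) < p_one).astype(int)
```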
In addition, each opportunity has an associated perceived payoff value that executives believe a priori will be gained if the opportunity is correctly captured, an actual payoff value that the firm actually receives if the opportunity is correctly captured, and a window of opportunity during which an opportunity can be captured. There is no a priori knowledge of the exact length of the window of opportunity or the actual magnitude of the payoff, consistent with real environments (Kirzner, 1997; Shane, 2000). The operationalization of each dimension, the opportunity payoffs, and windows of opportunity are described further in the Technical Appendix.
In summary, we model the environment as a flow of heterogeneous opportunities. Four dimensions (i.e., velocity, complexity, ambiguity, and unpredictability) are defined and operationalized as parameters that describe this flow. These dimensions take specific values in each simulation run, but can be varied across runs. Opportunities flow into the environment at a velocity λ. More or fewer features of a given opportunity must be correctly acted upon depending on complexity. The degree to which the actual features of opportunities differ from the perceived features seen by firms is specified by ambiguity. Unpredictability is the degree to which features of opportunities are dissimilar to one another over time.
Organization: Rules as Structure
We model the organization as a set of rules for capturing opportunities. Following Eisenhardt and Sull (2001), we use five types of rules: boundary, priority, how-to, timing, and exit rules. These types have also emerged in empirical research (Bingham and Eisenhardt, 2005), and characterize the types of rules that appear in the literature on dynamic environments (Burgelman, 1994; 1996; Gersick, 1994; Brown and Eisenhardt, 1997; Rindova and Kotha, 2001; Galunic and Eisenhardt, 2001; Miner et al., 2001). Each rule relates to particular actions to be taken with respect to capturing an opportunity (Table 1). Collectively, these rules partially determine which opportunities are chosen (boundary rules), in what order (priority rules), how to execute them (how-to rules), how many opportunities to address at a time (timing rules), and when to stop exploiting an opportunity (exit rules), and so form the framework of rules (i.e., improvisational referents) within which flexible action occurs (Weick, 1998).
Operationalization of Five Rule Types
Boundary rules determine which opportunities leaders will attempt to capture. Boundary rules are critical because they define the scope of market opportunities within which firms operate (Santos and Eisenhardt, 2005). For example, pharmaceutical companies often use rules to decide which drug development opportunities to consider. These rules might specify the size of the market, therapeutic area, and other factors in deciding whether to pursue an opportunity. These rules can often be framed as if/then statements (March and Simon, 1958). For example, a boundary rule might be: “If the drug development opportunity 1) is within cardiology, 2) has a projected market of at least $100 million/year, and 3) draws on the related experience of at least one senior scientist, then consider the opportunity.” In addition, this same pharmaceutical firm might have other boundary rules related to other therapeutic areas, market sizes, or factors such as geography and scientific difficulty. Thus, boundary rules classify opportunities as either outside or inside the firm’s purview.
Boundary rules are operationalized as a 10-element bit string of 1s, 0s, and ?s – e.g., 01?0??011?. To determine whether an opportunity is within the firm’s boundaries, each boundary rule element with a 1 or 0 is compared to the corresponding perceived feature of the opportunity. If the perceived feature has the same 1 or 0 in the corresponding place, the element ‘matches’. All elements in a boundary rule with a 1 or 0 must match for the rule to classify the opportunity as inside the pool of opportunities to pursue further. That is, 1s and 0s represent conditions that must match the corresponding perceived features of the opportunity. ‘?’ elements are not checked for a match, so boundary rules with more ‘?’s are less structured.
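This matching logic can be sketched in a few lines of Python (an illustrative reconstruction; rules and features are shown as strings for readability):

```python
def boundary_match(rule, perceived):
    """True if every non-'?' element of the boundary rule equals the
    corresponding perceived feature of the opportunity."""
    return all(r == '?' or r == f for r, f in zip(rule, perceived))

print(boundary_match("01?0??011?", "0100100110"))  # True: all checked elements agree
print(boundary_match("01?0??011?", "1100100110"))  # False: first element differs
```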
Priority rules rank opportunities that have successfully passed through the boundary rules. An example of a priority rule is Intel’s rule for allocating manufacturing capacity to alternative semiconductor products (i.e., opportunities) according to their corresponding profit margins (Burgelman, 1996: 205). Applying this rule resulted in a ranking of semiconductor product manufacturing opportunities, and so prioritized the products that were made. Consistent with Burgelman (1996), priority rules are operationalized by the rankings of the opportunities according to their perceived payoff values.
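As a sketch (with illustrative names), ranking opportunities by perceived payoff is a simple sort:

```python
def prioritize(opportunities):
    """Rank the opportunities that passed the boundary rules,
    highest perceived payoff first."""
    return sorted(opportunities, key=lambda o: o["perceived_payoff"], reverse=True)
```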
How-to rules specify the actions for executing the opportunities. For example, Galunic and Eisenhardt (2001) find a how-to rule for executing “patching” opportunities in a high-performing, multi-business corporation that stated, “Always assign new product-market charters to divisions that 1) have relevant product-market experience and 2) are currently assigned to charters with shrinking market size or fading profit margins.” How-to rules can relate to a variety of opportunities that are embedded in activities such as manufacturing, sales, internationalization and acquisitions.
How-to rules are operationalized with a 10-element bit string of 1s, 0s, and ?s (e.g., 0?1?10???0). For each 1 and 0, the firm applies its how-to rule to an opportunity by taking the corresponding rule-based action. For each ‘?’, the firm randomly improvises a 1 or 0 (e.g., 0111100110), and then compares this set of 10 actions to the opportunity’s features. If the number of actions (both rule-based and improvised) that match the actual features of the opportunity equals or exceeds the environmental complexity of the opportunity, then the opportunity has been captured and the firm gains the actual payoff value of that opportunity for the time period. For example, if the environmental complexity = 6 and the actions above – 0111100110 – are compared to the opportunity 0110101010, then the opportunity is successfully captured because 7 of the actions were correctly taken. This is an effective operationalization of how-to rules (i.e., heuristics that specify actions for executing opportunities) because it captures the idea that while certain actions are specified by the rules, others are left open to real-time improvisation (Brown and Eisenhardt, 1997; Miner et al., 2001).
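The following sketch combines the how-to rule, improvisation, and the complexity threshold defined earlier. It is illustrative rather than the authors’ implementation; the improvised actions are random, so repeated calls can yield different outcomes.

```python
import random

random.seed(1)

def execute(how_to_rule, actual_features, complexity):
    """Take rule-based actions for each specified element and improvise a
    random 0/1 for each '?'; the opportunity is captured if at least
    `complexity` actions match its actual features."""
    actions = [r if r != '?' else random.choice('01') for r in how_to_rule]
    matches = sum(a == f for a, f in zip(actions, actual_features))
    return matches >= complexity

# The rule 0?1?10???0 may improvise to 0111100110, which matches the
# opportunity 0110101010 in 7 places, so with complexity = 6 it is captured.
captured = execute("0?1?10???0", "0110101010", complexity=6)
```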
Timing rules specify the maximum number of opportunities that a firm can attend to at any given time. Timing rules have been found in several studies of organizational processes in dynamic environments. For example, Brown and Eisenhardt (1997) find that managers use timing rules to pace the execution of multiple product development opportunities according to a temporal rhythm (e.g., one new product every 9 months). Similarly, Bingham and Eisenhardt (2005) note that entrepreneurs use timing rules to specify the number of new country entry opportunities to exploit simultaneously. Gersick (1994) also finds that entrepreneurs use timing rules to delimit the opportunities that are simultaneously addressed. Timing rules are operationalized as the number of opportunities that can be examined at any given time.
Finally, exit rules indicate when to stop the execution of an opportunity. Exiting fading opportunities in a timely fashion is critical for firms because it frees up resources for capturing new opportunities (Burgelman, 1994). In this simulation, exit rules are operationalized by comparing the remaining actual payoff for each currently captured opportunity that is about to reach the end of its window of opportunity with the perceived payoff available for the highest priority opportunity that could be addressed, and then choosing the opportunity with the highest perceived payoff. This operationalization of exit rules captures the idea that firms stop exploiting opportunities when they are about to end and when attractive new opportunities are available.
Operationalization of the Amount of Structure and Performance
Amount of Structure. We operationalize the amount of structure in two ways in order to ensure more robust results. The first operationalization is simply the total number of rules of all types that the firm uses. This operationalization is consistent with theoretical notions of structure such as Simon’s (1962), in which the amount of structure is associated with the number of components. In the context of our research, this suggests that an increase in the number of rules increases structure through the number of specified actions, some of which are simultaneous. The second operationalization focuses on the amount of structure in the how-to rules because of their importance in capturing opportunities. For example, recent empirical research by Bingham and Eisenhardt (2005) on the emergence of rules for capturing opportunities indicates that the number of how-to rules exhibits the greatest variance across firms, and is the type of rule that is most closely associated with capturing opportunities and creating high performance. Specifically, the second operationalization of the amount of structure, number of rules, is the total number of rule-based actions specified in the how-to rule (ranging from 0 to 10) – i.e., the number of how-to rule elements that have either a 1 or a 0 value. For example, the number of rules in the how-to rule bit string 01?0??011? is 6. For ease of exposition, we term a small to moderate number of rules (i.e., 3-5) “simple rules”.
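Under the second operationalization, the amount of structure is simply the count of specified (non-‘?’) elements in the how-to rule, as in this one-function sketch:

```python
def amount_of_structure(how_to_rule):
    """Number of how-to rule elements specified as '0' or '1' (not '?')."""
    return sum(r != '?' for r in how_to_rule)

print(amount_of_structure("01?0??011?"))  # 6, as in the example above
```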
Performance. Performance is operationalized as the sum of the actual payoffs of all captured opportunities, across all time periods, where each opportunity’s actual payoff is drawn from a normal distribution. This operationalization is particularly appropriate for our research because it is consistent with the empirical research on dynamic markets indicating that performance is the result of a series of advantages and their related payoffs (D'Aveni, 1994; Rindova and Kotha, 2001; Roberts, 1999). See the Technical Appendix for details.
Simulating the Model
We implemented this model in Matlab and Java. The flow of the computer program is outlined below (see also Table 1), while key parameter and distribution details are in the Technical Appendix. Initially, the organization’s structure (i.e., its rules) and environment (i.e., the velocity, complexity, ambiguity, and unpredictability of the flow of opportunities) are randomly set using draws from probability distributions (Law and Kelton, 1991). In each timestep, opportunities flow into the environment at velocity λ. The organization takes rule-based actions (on a 10-element bit string of 0s and 1s) in an attempt to capture some of these opportunities. But it typically does not have rules to cover all facets of an opportunity, and so it also takes some flexibly improvised actions. When these actions (both rule-based and improvised) match the opportunity (i.e., the number of element matches equals or exceeds the environmental complexity), the opportunity is captured and firm performance increases by the actual payoff value of the opportunity. A priori, the organization perceives the length of the window of opportunity and the size of the payoff, but (as in real firms) their actual values may not correspond to these perceptions. Consistent with cognitive limits (March and Simon, 1958; Ocasio, 1997), the organization has a fixed attention budget for each timestep that is decreased when rules are compared to opportunities (e.g., boundary rules are compared to the opportunity) and when rule-based and/or improvised actions are taken. Since improvised action involves real-time sensemaking (Weick, 1993) and thoughtful convergence of design and action (Miner et al., 2001), improvised action requires more attention than rule-based action. If the organization uses all of its attention in a given timestep, it must wait until its stock of attention is reset in the next timestep. At the end of t=200 timesteps, the simulation ends and performance is computed. We chose this number of timesteps because it is large enough to allow sufficient opportunities to flow into the environment such that any initialization effects on the findings are mitigated (Law and Kelton, 1991).
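The following condenses this program flow into a single-run Python sketch. It is a reconstruction under stated assumptions, not the authors’ Matlab/Java code: the attention budget, attention costs, and payoff distribution are illustrative values, and boundary, priority, timing, and exit rules, perception ambiguity, and windows of opportunity are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

N_FEATURES, T = 10, 200
ATTENTION_BUDGET = 20              # assumed per-timestep attention stock
COST_RULE, COST_IMPROVISED = 1, 2  # improvised action costs more attention

def run_once(how_to_rule, lam, complexity, p_one):
    """One simulation run: returns cumulative performance over T timesteps."""
    performance = 0.0
    for _ in range(T):
        attention = ATTENTION_BUDGET           # attention resets each timestep
        for _ in range(rng.poisson(lam)):      # opportunities arrive (velocity)
            actual = (rng.random(N_FEATURES) < p_one).astype(int)
            actions, cost = [], 0
            for r in how_to_rule:              # rule-based vs improvised actions
                if r == '?':
                    actions.append(rng.integers(0, 2)); cost += COST_IMPROVISED
                else:
                    actions.append(int(r)); cost += COST_RULE
            if cost > attention:               # attention budget exhausted
                break
            attention -= cost
            if (np.array(actions) == actual).sum() >= complexity:
                performance += rng.normal(10.0, 2.0)   # assumed payoff draw
    return performance
```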
Monte Carlo Simulation Experiments
Given our stochastic modeling approach, we use standard Monte Carlo simulation techniques in order to make more accurate inferences about the simulation results. In the Monte Carlo approach, an experiment is a simulation with fixed parameter settings that is run multiple times (Law and Kelton, 1991). These results are then averaged, and their confidence intervals are calculated (Kalos and Whitlock, 1986). Thus, for any given experiment, the result is the mean performance (and related confidence interval) over multiple simulation runs, and is more reflective of the underlying processes under investigation than those produced by a single simulation run.
Each experiment consists of 30-50 simulation runs (each with 200 timesteps, as noted above). We ran experiments for a wide range of values of each environmental dimension (e.g., velocity), averaging the results and calculating confidence intervals over the 30-50 runs at each value. Given space limitations, we report only representative ‘low’ and ‘high’ values for each environmental dimension. Specifically, we plot the relationship between the amount of structure and performance for these representative values of the environmental dimensions. Confidence intervals in the form of error bars (i.e., the standard error: the square root of the variance divided by the number of runs) are included to enable more accurate statistical interpretation of the results, as is standard in Monte Carlo experiments (Kalos and Whitlock, 1986).
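A minimal sketch of this Monte Carlo procedure, reusing the hypothetical run_once function from the earlier sketch:

```python
import numpy as np

def monte_carlo(run_fn, n_runs=50, **params):
    """Repeat an experiment n_runs times; return mean performance and its
    standard error (square root of the variance divided by the number of runs)."""
    results = np.array([run_fn(**params) for _ in range(n_runs)])
    return results.mean(), np.sqrt(results.var(ddof=1) / n_runs)

# e.g., mean, se = monte_carlo(run_once, n_runs=50, how_to_rule="0?1?10???0",
#                              lam=1.4, complexity=6, p_one=0.5)
```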
RESULTS
Amount of Structure and Performance
The first set of results examines the relationship between the amount of structure and performance. H1 proposes that performance has an inverted U-shaped relationship with the amount of structure. Figure 1 plots performance versus the total number of rules, the first operationalization of the amount of structure. Consistent with H1, the results show that organizations with few or many rules perform worse than those with a moderate number of total rules. Intriguingly, these results are asymmetric. The left side of the inverted U-shaped curve is steeper than the right side – i.e., too little structure produces a catastrophic drop in performance, while too much structure produces a more gradual decline.
Figure 2 plots the results for the second operationalization of the amount of structure, number of rules (measured in terms of how-to rules), and performance. Again, consistent with H1, there is an inverted U-shaped relationship such that moderate structure generates the highest performance. Again, there is an asymmetry such that it is more catastrophic to have too little structure than too much (i.e., the left side of the inverted U-shaped curve is steeper than the right), replicating the perils of too little structure. We return to this finding later in this section.
Unpacking Market Dynamism
H2 proposes that an environmental contingency affects the trade-off between flexibility and efficiency – i.e., that the optimal amount of structure decreases with increasing market dynamism. The next results examine H2, as well as H1, in the context of high and low values for each dimension of market dynamism (i.e., velocity, complexity, ambiguity, and unpredictability) while holding the other three constant at moderate values.
Environmental Velocity. Figure 3 depicts the effect of increasing environmental velocity (i.e., rate of opportunity flow) on performance by superimposing the resulting curves of two representative values. That is, we plot the results that correspond to ‘low’ and ‘high’ values of velocity (λ=0.6 and 1.4) in order to examine the effects of velocity on H1 and H2. The results support H1 in both environments – i.e., both curves have an inverted U-shape.
In contrast, the results do not support H2. That is, within the precision of the simulation, the optimal amount of structure is the same for both high and low velocity environments. Further, while the optimal amount of structure is the same in the two velocity conditions, their performance is unexpectedly not. That is, for a given amount of structure, firms in high velocity environments have higher performance than those in low velocity ones. Overall, this suggests that the large number of opportunities that occurs in high velocity environments yields better performance for all levels of structure, ceteris paribus.
Environmental Complexity. Figure 4 depicts the effect of increasing environmental complexity (i.e., difficulty of capturing opportunities in the environment) on performance by superimposing the results of representative ‘low’ and ‘high’ values of complexity (4 and 8). H1 is supported in both high and low complexity environments. H2 is again not supported. That is, within the precision of the simulation, the optimal amount of structure is the same for both high and low environmental complexity. Yet surprisingly, performance at the optimal structure differs between the two environments. That is, firms perform better in low complexity environments at all levels of structure, a finding that is opposite to that for velocity. We return to these findings in the discussion.
Environmental Ambiguity. Figure 5 shows the effect of increasing environmental ambiguity (i.e., difficulty in correctly perceiving opportunities) on performance by superimposing the results of the two representative cases that correspond to ‘low’ and ‘high’ values of ambiguity (0.0 and 0.2). H1 is again supported in both environments – i.e., the curves have an inverted U-shape.
H2 is again not supported. That is, the optimal amount of structure is the same in both low and high ambiguity environments. Yet unexpectedly, both the range of optimal structures and the peak performance at the optimal structure differ in low vs. high ambiguity environments. When ambiguity is high, there is a wide range of optimal structures and a lower level of peak performance. This suggests an environment in which it is relatively easy for managers to find an optimal structure, but they will not achieve particularly high performance. This is consistent with a chance-dominated environment where skill offers little advantage. In contrast, when ambiguity is low, there is a narrow range of optimal structures and a higher level of peak performance. This suggests an environment in which it is more difficult for managers to find an optimal structure, but they will achieve particularly high performance when they do. To the extent that more skilled executives can manage the trade-off between efficiency and flexibility (and so locate the optimal structure), this is consistent with a skill-dominated environment. We return to these findings in the discussion.
Environmental Unpredictability. Figure 6 reveals the effects of environmental unpredictability (i.e., the dissimilarity between past and current opportunities) on performance by superimposing the results of two representative cases that correspond to ‘low’ and ‘high’ unpredictability (0.5 and 0.8). Again, the inverted U-shaped relationship (H1) is supported in both environments. But unlike the previous environmental dimensions (i.e., velocity, complexity and ambiguity), H2 is supported. Thus, unpredictability is the key environmental dimension that ties the optimal amount of structure to market dynamism (H2).
There are also unexpected findings related to the range of optimal structures. Specifically, when environments have low unpredictability, there is an inverted U-shaped “plateau” relationship between structure and performance. This suggests a forgiving environment in which there is a wide range of optimal structures with roughly the same performance. In contrast, when environments have high unpredictability, there is an inverted V-shaped “peak” relationship between structure and performance. This suggests a punishing environment in which the optimal amount of structure is challenging to find, hard to maintain even in the face of small perturbations of structure and environment, and costly to miss, since performance is very low away from the optimum. In other words, there is a strong tension between too much and too little structure. More strikingly, the findings are consistent with an edge of chaos in which only a thin range of structures leads to superior performance. Thus, in contrast to forgiving low unpredictability environments, high unpredictability environments are punishing, with sharp tipping points and catastrophic failures for both too much and too little structure.
Mistakes. In order to more fully understand the theoretical logic underlying H1 and H2, we examined the frequency and size of the mistakes committed as the organization attempts to capture opportunities. We define a mistake as an application of a rule to an opportunity that does not result in successful execution. We first plot the number of mistakes committed at each number of rules (Figure 7a). The results indicate that mistakes increase dramatically as structure decreases. We then compute the frequency distribution of the size of mistakes at each number of rules and for several values of unpredictability (Figure 7b). There are 9 values (rows) of structure, ranging from low=1 to high=9, and 3 values (columns) of unpredictability: high, low (the same values used above), and very low. Mistake size is the number of rule-based actions that are incorrect for a given opportunity.
These results provide several additional insights. First, they clarify the intuition for the asymmetry of the inverted U-shaped relationship between structure and performance (H1). Specifically, in predictable environments, increasing structure has several effects: it reduces the mean number of mistakes, lowers the total number of mistakes, and eliminates large mistakes. That is, opportunities are very efficiently executed with few and small errors. Thus, while high structure limits flexibility and so the number of types of opportunities that can be addressed, it does enable the efficient execution of opportunities that fit the rules. In other words, while fewer acceptable opportunities decrease performance, fewer mistakes increase it. These effects are partially counteracting, leading to a gradual decline in performance as the amount of structure increases.
In contrast, in unpredictable environments, increasing structure does not lower the mean number of mistakes, the total number of mistakes, or the number of large mistakes. Rather, the mistake distribution is similar across different amounts of structure in unpredictable environments. Thus, unpredictable environments are unforgiving of both too much and too little structure, as indicated by the inverted V-shaped “peak” relationship. In addition, at this edge of chaos that exists in highly unpredictable markets, executives are likely to make many mistakes, have a widely varying scale of mistakes, and commit some large mistakes.
DISCUSSION
We began by noting that a fundamental trade-off between flexibility and efficiency appears to drive a common finding – an inverted U-shaped relationship between the amount of structure and performance. Less structure keeps the organization more open to respond opportunistically to fresh, unexpected opportunities. More structure keeps the organization tightly focused on the efficient capture of anticipated opportunities. Yet despite the ubiquity of this tension in several literatures, its core theoretical logic is relatively unexamined. Using simulation, we explore the trade-off between flexibility and efficiency by examining the tension between too much and too little structure, the environmental contingencies that influence this tension, and its related effects on performance.
Principal Results
We have several principal results. First, we find an inverted U-shaped relationship between the amount of structure and performance. That is, we confirm the tension between too much and too little structure. More significantly, we add the insight that this relationship is unexpectedly asymmetric. That is, while too little structure creates a precipitous drop in performance, too much structure creates only a gradual decline. Therefore, it is riskier to err on the side of too little structure. The underlying intuition is that, when structure is low, organizations have difficulty gaining traction and end up in incoherent improvisation. In contrast, when structure is high, organizations pursue fewer opportunities. But the performance decline is gradual because these remaining opportunities are reliably and efficiently executed.
Second, we find that unpredictability is the key aspect of the environment that influences the trade-off between flexibility and efficiency. While previous research suggests that the optimal amount of structure should decrease with increasing market dynamism (Lawrence and Lorsch, 1967; Pisano, 1994; Eisenhardt and Tabrizi, 1995), we extend this work in several ways. We identify unpredictability as the particular aspect of market dynamism that drives this relationship. In contrast, other frequently mentioned features of environmental dynamism such as ambiguity (March and Olsen, 1976; Rindova and Kotha, 2001), velocity (Eisenhardt, 1989), and complexity (Gavetti et al., 2005) do not appreciably influence the optimal amount of structure. Specifically, as environmental unpredictability increases, the optimal amount of structure decreases because less structure enables more flexible response to increasingly unforeseen opportunities.
We also add an unexpected insight by noting that the range of optimal structures decreases with increasing unpredictability. That is, in very predictable environments, the range of optimal structures is an inverted U-shaped “plateau” with a wide range of equally high-performing structures. This suggests robustness of action such that many structures are roughly equivalent. Therefore, executives can easily achieve and maintain an optimal structure. Further, they can select structures that favor particular features without sacrificing much performance. For example, executives with particular interest in minimizing errors can design high-reliability organizations that have extensive structure (Perrow, 1984; Weick and Roberts, 1993) with little or no performance penalty. In contrast, in very unpredictable environments, the range of optimal structures shrinks to an inverted V-shaped “peak”. Here only a narrow range of structures is high performing, and the drop-off from peak performance is steep. Since even minor perturbations in structure or environment can have catastrophic consequences, these environments are highly “attention-demanding” and require managing the amount (not just the content) of structure. Overall, highly unpredictable environments are unforgiving: performance is precarious and mistakes are often fatal.
Third, we find that other dimensions of market dynamism (i.e., velocity, complexity, ambiguity) have unique effects. High velocity environments are particularly attractive. Since these environments are opportunity-rich, executives can be selective, and so choose only the highest payoff opportunities. In addition, this finding offers further insight into why rapid executive actions such as fast strategic decision making (Eisenhardt, 1989) and fast product innovation (Eisenhardt and Tabrizi, 1995) are especially effective in high velocity environments. In these opportunity-rich environments, there are many high-payoff opportunities. By acting quickly, executives can secure a larger number of these superior payoff opportunities, and so achieve very high performance. In contrast, by acting slowly, executives are likely to secure fewer opportunities and to exploit them for less time, leading to lower performance. The attractiveness of high velocity environments may also explain why the Internet era (characterized by a high velocity of opportunities) had a surprisingly low failure rate. Although many firms died, the death rate was unusually low when compared with the overall number of foundings (Goldfarb, Kirsch, and Miller, 2005). Overall, we find that high velocity environments are attractive for gaining high performance.
In contrast, complex environments are particularly unattractive. In highly complex environments, opportunities have many features that executives must execute correctly in order to capture them. Thus, these opportunities are challenging to capture, and performance is correspondingly low. This finding extends prior research by clarifying why firms in complex environments such as biotechnology (Owen-Smith and Powell, 2003) and “green” power (Sine, Haveman, and Tolbert, 2005) often perform poorly even when their executives have high domain expertise. In these technically and institutionally complex markets, executives must achieve success in many areas (e.g., technical, safety, regulatory, manufacturing, investment, and complementary products) in order to capture an opportunity. Thus, they can address relatively few opportunities, and are likely to have a low probability of success when they do. Overall, we find that high complexity environments are unattractive for gaining high performance.
Our findings for environmental ambiguity are especially intriguing. When ambiguity is high, executives are unable to perceive opportunities accurately. Here, we find that a wide range of optimal structural alternatives exists, and that these alternatives have roughly equivalent, albeit low, performance. Thus, performance is not well determined by the skill of finding an optimal amount of structure, and so offers no advantage to highly skilled executives (i.e., those more likely to find the optimal amount of structure). In contrast, in low ambiguity environments, executives are able to perceive opportunities accurately. Here, we find that the range of optimal structures narrows, which favors skilled executives (i.e., those who can more easily find the optimal point). Moreover, peak performance at the optimal structure is higher in low ambiguity environments than in high ambiguity ones because executives can be more accurate regarding the optimal amount of structure when they clearly perceive the environment. The key insights are that skilled executives (i.e., those able to locate the optimal structure) should avoid high ambiguity environments, where their skill offers no advantage and performance is poor, and seek low ambiguity ones, where their skills are advantageous and high performance is achievable. In contrast, unskilled executives (i.e., those less able to find the optimal point) should do the reverse – i.e., rely on chance, and stay in more forgiving (albeit low-performing) high ambiguity environments.
The ambiguity finding also relates to research on high performance in nascent markets (where ambiguity is particularly high). This research indicates that successful entrepreneurial firms shape these new markets by tactics such as creating varied aspects of industry structure (Rao, 1994; Rindova and Fombrun, 2001; Santos and Eisenhardt, 2006). We extend this research by clarifying that the underlying strategic logic of these activities is the lowering of environmental ambiguity. This, in turn, changes the nature of competition from a game of luck to one of skill in which effective entrepreneurs (i.e., those that can find the optimal structure) are most successful. Thus, it is advantageous for skilled executives to structure ambiguous environments.
The ambiguity finding also relates to analogic reasoning (Hargadon, 2003). Analogic reasoning provides a cognitive map of a given situation that can accelerate understanding (Weick, 1993). Prior research emphasizes the value of analogies in complex environments because they jumpstart the capture of opportunities by specifying some correct actions (cf. Gavetti et al., 2005). We extend this work by indicating other environments where analogy is important and where it is not. For example, unpredictable environments lack the stability that makes any particular analogy helpful, and are highly sensitive to the amount of structure. Here analogies are less effective, and may even damage performance if they add an inappropriate amount of structure. High velocity environments are opportunity-rich, and so analogies are also less helpful because the speed of opportunity capture (not reduction of complexity) is key. Finally, analogic reasoning is likely to be helpful in ambiguous environments, but only when executives vigorously act to structure the environment to match the analogy. Simply organizing internally to fit an analogy is unlikely to be helpful because competitors are likely to have alternative analogies and will structure the environment according to their own.
Finally, our results provide organizational and strategic texture for the concept of the edge of chaos (Kauffman, 1993; Carroll and Burton, 2000). Previous research offers only vague definitions and anecdotal evidence for the edge of chaos such as popular catch-phrases including “snooze, you lose” and “only the paranoid survive” (Brown and Eisenhardt, 1998). In contrast, we suggest several new insights on this intriguing construct. First, we identify when the edge of chaos, which has been defined as the phase transition between order and disorder (Kauffman, 1993), is likely to occur – i.e., in highly unpredictable environments. Here, the relationship between structure and performance is a narrow, inverted V-shaped relationship with abrupt tipping points on both sides of the optimal structure, consistent with an edge of chaos. Second, we characterize the distribution of mistakes at the edge of chaos – i.e., many errors of widely varying size such that firms are likely to experience both small oversights as well as some potentially debilitating miscalculations. We do not, however, find a power law distribution such that there are many more small mistakes. Rather, the distribution approximates normal with a non-trivial probability of large errors. Third, we provide insight into energy at the edge of chaos. Prior research indicates that the edge of chaos is a dissipative equilibrium, an unstable critical point which requires constant energy expenditure to maintain (Prigogine and Stengers, 1984). We extend this work to organizations and strategies by indicating that the energy required to remain at the optimal structure (i.e., edge of chaos) is focused on “effort-demanding” improvisation and mistake recovery processes that are essential to maintain the optimal structure. Overall, we contribute to complexity thinking by more sharply specifying the edge of chaos within organizational and strategic contexts – i.e., the edge of chaos occurs in highly unpredictable environments, is crucial to find, and requires significant effort to sustain as even small environmental or structural perturbations can tip organizations into catastrophic failures.
Towards a Pluralistic View of Strategies
Broadly, our work has implications for strategy. First we offer insight into the strategic logic of opportunity and the related view of strategy as simple rules (Eisenhardt and Martin, 2000; Eisenhardt and Bingham, 2006). According to this strategic logic, firms can achieve high performance in dynamic environments by using a few simple rules to guide the capture of fleeting opportunities (e.g., Gersick, 1994; Burgelman, 1996; Miner et al., 2001; Galunic and Eisenhardt, 2001; Rindova and Kotha, 2001). Eisenhardt and Bingham (2005) offer insight into their content. Our research further extends these findings by providing internal validation of the core theoretical logic. That is, we internally validate the theoretical logic that links these rules and their number to opportunity capture and performance through a combination of real-time action and rule following. Thus, like other simulations that provide internal validation of theory (e.g., Sastry, 1997 for punctuated equilibrium), our simulation confirms the internal validity of the strategic logic of opportunity.
Second, we offer insights into the boundary conditions of several strategic logics. For instance, in a positioning logic, executives achieve superior performance by building tightly linked activity systems in valuable strategic positions such as low-cost or high differentiation (Porter, 1996; Rivkin, 2000). Our findings add to this view by clarifying that such “high-structure” strategies are especially effective in predictable markets. Further, our findings point to a deeper understanding of why tightly linked activity systems are high performing in such predictable environments – i.e., while fewer opportunities are available, highly structured strategies in the form of activity systems produce both few and small mistakes. Therefore, they efficiently capture a flow of similar opportunities. In addition, since there are many possible high-performing structures in predictable environments, our findings indicate that executives can achieve good performance with many strategies. Indeed, these numerous optimal strategic alternatives help to explain why multiple differentiated positions are often viable in predictable environments (Porter, 1996). Finally, our findings indicate that, once achieved, competitive advantage gained through positioning is relatively robust to environmental and structural perturbations, making competitive advantage and performance sustainable. This suggests that positioning logic creates a stable equilibrium.
By contrast, in an opportunity logic, executives achieve superior performance by using a few simple rules or heuristics to flexibly capture fleeting opportunities (Eisenhardt and Bingham, 2006). Our findings add to this view by indicating that "low-structure" opportunity logic is effective in unpredictable environments, thereby sketching a boundary condition between positioning and opportunity strategic logics. Our findings further suggest a subtle insight regarding the precarious nature of competitive advantage. While prior research argues that dynamic environments enable a series of short-term advantages (D'Aveni, 1994), we note that competitive advantage in these environments is unstable and that its duration is unforeseeable (not necessarily short-term). Overall, this insight suggests that firms with a strategic logic of opportunity are threatened by internal collapse such that they can fail from within (by having too much or too little structure), not just from external competition. This threat of internal collapse offers an alternative explanation of intra-industry performance heterogeneity that differs from path-dependent explanations (McGahan and Porter, 1997; Bowman and Helfat, 2001). That is, we highlight the role of the optimal amount of structure and the edge-of-chaos tipping points in creating heterogeneous firm performance within industries.
Toward Adaptive Organization
Broadly, our work also has implications for organization theory. At the heart of our research is the trade-off between flexibility and efficiency in dynamic environments. Less structure enables the flexible capture of serendipitous opportunities through the combination of limited structure and real-time action. But with too much improvisation, the organization runs the risk of incoherence and drift. More structure enables tight focus on the efficient capture of expected opportunities. But with too much structure, the organization runs the risk of stagnation and misalignment with the environment. The essence of flexibility is thus the messy capture of unexpected opportunities, while the essence of efficiency is the smooth capture of the anticipated. We internally validate and extend this logic, and show how this trade-off is affected by distinct dimensions of the environment (i.e., velocity, complexity, ambiguity, and unpredictability).
We conclude with insights into how the trade-off between flexibility and efficiency affects types of organizations. For young firms (i.e., de novo or venture firms) that typically have little structure, the challenge in any environment is the same. The imperative is to gain enough structure before failure ensues. Legitimation and competition, of course, matter to the performance of young firms. But the key insight here is that sufficient and repeatable structure is also essential. Without structure, there is neither traction nor coherence, and it is impossible to adapt when the organization is incoherent. Thus, the well-known liability of newness may mask a liability of too little structure.
In contrast, for established firms (i.e., incumbent or old firms) that often have significant structure in terms of roles, rules, and linkages across units, the challenge varies in different environments. If the environment is predictable, our findings indicate that it is advantageous to create extensive structure. The number and size of mistakes drop. Modest executive attention and effort are needed to manage the amount of structure. The organization can gain a stable equilibrium such that it is robust to structural and environmental changes. But as the environment becomes unpredictable, or when executives diversify into new environments, our findings indicate two key challenges. One is to simplify the organization structure. The other is the much more subtle challenge of a dramatically different mindset. This mindset involves executives being discriminating in managing the amount of structure (not just its content) at an "edge of chaos". It also involves tolerating more, and sometimes large, mistakes that must be quickly fixed, and expending more effort and attention on managing the organization than in more predictable environments. Simply put, managing in unpredictable environments is different, harder, and more precarious than in predictable ones, as organizations can gain only an unstable or dissipative equilibrium. Thus, the well-known liability of age may be as much a structural as an age phenomenon. Overall, the irony of adaptation is that, as it becomes more crucial to adapt to environmental change, it also becomes more challenging to do so.
CONCLUSION
Much organization theory emphasizes inertia, arguing that it is challenging and even foolhardy for organizations to attempt adaptation (Hannan and Freeman, 1984; Leonard-Barton, 1992; Tripsas and Gavetti, 2000). Other theory emphasizes that organizations should buffer against environmental change through devices such as interlocking boards (Pfeffer and Salancik, 1978) and legitimating practices (DiMaggio and Powell, 1983). Similarly, much strategy theory emphasizes the path-dependent evolution towards stable configurations of activity systems and resources (Porter, 1996; Bowman and Helfat, 2001; Eisenhardt and Bingham, 2006).
Yet, there is also broadly scattered research that reveals the trade-off between flexibility and efficiency, and the related (and ubiquitous) tension between too much and too little structure in dynamic environments. Using simulation, we confirm the underlying theoretical logic of an inverted U-shaped relationship between structure and performance. More important, we extend these ideas to include a deeper understanding of the symmetry of that relationship, its environmental and organizational contingencies, and the ease with which high performance can be achieved.
We conclude with an optimistic view of adaptation that emphasizes how and when strategic and organizational adaptation occur. Adaptation depends upon balancing flexibility and efficiency through the improvisational combination of moderate structure and flexible action. More significant, we observe the importance of concepts from complexity science surrounding the optimal amount of structure, phase transitions between order and disorder, and cascading failures triggered by even small perturbations in the environment and organization.
REFERENCES
Albert, Réka, Hawoong Jeong, and Albert-László Barabási
2000 "Error and attack tolerance of complex networks." Nature, 406: 378-382.
Amabile, Teresa
1996 Creativity in context. Boulder, Colo.: Westview Press.
Anderson, Philip
1999 "Complexity Theory and Organization Science." Organization Science, 10: 216-232.
Arthur, W. Brian
1994 Increasing Returns and Path Dependency in the Economy. Ann Arbor, MI: U of Michigan Press.
Barnard, Chester I.
1938 The Functions of the Executive. Cambridge, MA: Harvard University Press.
Baum, J. A. C., and S. Wally
2003 "Strategic decision speed and firm performance." Strategic Management Journal, 24: 1107-1129.
Bigley, Gregory A. and Karlene H. Roberts
2001 "The Incident Command System: High-reliability Organizing for Complex and Volatile Task Environments." Academy of Management Journal, 44: 1281-1299.
Bingham, Christopher B., and Kathleen M. Eisenhardt
2005 "Unveiling the creation and content of strategic processes: How and what firms learn from heterogeneous experience." University of Maryland working paper.
Birkinshaw, Julian, and Neil Hood
1998 "Multinational subsidiary evolution: Capability and charter change in foreign-owned subsidiary companies." Academy of Management Review, 23: 773-795.
Bower, Joseph L., and Clark Gilbert
2005 "Pandesic: The Challenges of a New Business Venture." 1-19: Harvard Business School.
Bowman, E., and Constance E. Helfat
2001 "Does corporate strategy matter?" Strategic Management Journal, 22: 1-23.
Bradach, Jeffrey L.
1997 "Using the plural form in the management of restaurant chains." Administrative Science Quarterly, 42: 276-304.
Brown, Shona L., and Kathleen Eisenhardt
1998 Competing on the Edge - Strategy as Structured Chaos. Boston, MA: Harvard Business School Press.
Brown, Shona L., and Kathleen M. Eisenhardt
1997 "The Art of Continuous Change: Linking Complexity Theory and Time-paced Evolution in Relentlessly Shifting Organizations." Administrative Science Quarterly, 42: 1-34.
Burgelman, Robert A.
1994 "Fading Memories: A Process Theory of Strategic Business Exit in Dynamic Environments." Administrative Science Quarterly, 39: 24-56.
1996 "A Process Model of Strategic Business Exit: Implications for an Evolutionary Theory of Strategy." Strategic Management Journal, 17: 193-214.
Burns, Tom, and G. M. Stalker
1961 The management of innovation. London: Tavistock.
Burt, Ronald
1992 Structural Holes. Cambridge: Harvard University Press.
Burton, Richard M.
2003 "Building Computational Laboratories for Organization Science: Questions, Validity, and Docking." Computational & Mathematical Organzation Theory, 9: 91-108.
Burton, Richard M., and Borge Obel
1995 "The Validity of Computational Models in Organization Science: From Model Realism to Purpose of the Model." Computational and Mathematical Organization Theory, 1: 57-72.
Cameron, Kim, M. Kim, and D. Whetten
1987 "Organizational effects of decline and turbulence." Administrative Science Quarterly, 32: 222-240.
Carroll, Glenn, and J. Richard Harrison
1998 "Organizational Demography and Culture: Insights from a Formal Model and Simulation." Administrative Science Quarterly, 43: 637-667.
Carroll, Tim N., and Richard M. Burton
2000 "Organizations and Complexity: Searching for the Edge of Chaos." Computational & Mathematical Organization Theory, 6: 319-337.
Chung, Chi-Nien, Ishtiaq Mahmood, and Will Mitchell
2005 "The Janus face of intra-firm ties: Group-wide and affiliate level innovation by multi-business firms in Taiwan." Working Paper.
Cinlar, E.
1975 Introduction to Stochastic Processes. Englewood Cliffs, NJ: Prentice-Hall.
Cover, Thomas, and Joy Thomas
1991 Elements of Information Theory. New York: Wiley and Sons.
Cyert, R. M., and James G. March
1963 A Behavioral Theory of the Firm. Englewood Cliffs, NJ: Prentice-Hall.
D'Aveni, Richard A.
1994 Hypercompetition: Managing the Dynamics of Strategic Maneuvering. New York: The Free Press.
Davis, Jason, Christopher B. Bingham, and Kathleen Eisenhardt
2006 "Developing Theory Through Simulation Methods." Academy of Management Review, forthcoming.
Dess, Gregory, and D. Beard
1984 "Dimensions of organizational task environments." Administrative Science Quarterly, 29: 52-73.
Dill, William
1958 "Environment as an influence on managerial autonomy." Administrative Science Quarterly, 2: 409-443.
DiMaggio, Paul J., and Walter W. Powell
1983 "The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields." American Sociological Review, 48: 147-160.
Eisenhardt, Kathleen M.
1989 "Making Fast Strategic Decisions in High-Velocity Environments." Academy of Management Journal, 32: 543-576.
Eisenhardt, Kathleen M. and Mahesh Bhatia
2002 "Organizational Computation and Complexity." In Companion to Organizations, Joel Baum (Ed.), Blackwell Business, Oxford UK.
Eisenhardt, Kathleen M. and Christopher B. Bingham
2006 "Disentangling Resources from the Resource-based View: A Typology of Strategic Logics and Competitive Advantage." Managerial and Decision Economics, forthcoming.
Eisenhardt, Kathleen, and Jeffrey A. Martin
2000 "Dynamic capabilities: What are they?" Strategic Management Journal, 21: 1105-1121.
Eisenhardt, Kathleen M. and Claudia Bird Schoonhoven
1990 "Organizational Growth: Linking Founding Team, Strategy, Environment and Growth among U.S. Semiconductor Ventures, 1978-1988." Administrative Science Quarterly, 35: 504-529.
Eisenhardt, Kathleen, and Donald Sull
2001 "Strategy as Simple Rules." Harvard Business Review, Jan-Feb.
Eisenhardt, Kathleen, and Behnam Tabrizi
1995 "Accelerating Adaptive Processes: Product Innovation in the Global Computer Industry." Administrative Science Quarterly, 40: 84-110.
Feldman, Martha S., and Brian T. Pentland
2003 "Reconceptualizing Organizational Routines as a Source of Flexibility and Change." Administrative Science Quarterly, 48: 94-118.
Fligstein, Neil
2001 The Architecture of Markets: An Economic Sociology of Twenty-first Century Capitalist Societies. Princeton, NJ: Princeton University Press.
Galaskiewicz, Joseph
1985 "Interorganizational Relations." Annual Review of Sociology, 11: 281-304.
Galbraith, Jay
1974 "Organization Design: An Information Processing View." Interfaces, 4: 28-36.
Galunic, Charles, and Kathleen M. Eisenhardt
2001 "Architectural Innovation and Modular Corporate Forms." Academy of Management Journal, 44: 1229-1250.
Galunic, D. Charles, and Kathleen M. Eisenhardt
1996 "The evolution of intracorporate domains: Divisional charter losses in high-technology, multidivisional corporations." Organization Science, 7: 255-282.
Gavetti, G., Dan Levinthal, and Jan W. Rivkin
2005 "Strategy making in novel and complex worlds: the power of analogy." Strategic Management Journal, 26: 691-712.
Gell-Mann, Murray
1994 The Quark and the Jaguar: Adventures in the Simple and the Complex. New York: WH Freeman.
Gersick, Connie J. G.
1994 "Pacing Strategic Change: The Case of a New Venture." Academy of Management Journal, 37: 9-45.
Gilbert, Clark G.
2005 "Unbundling the Structure of Inertia: Resource vs. Routine Rigidity." Academy of Management Journal, 48: 741-763.
Glynn, Peter, and Ward Whitt
1992 "The Asymptotic Efficiency of Simulation Estimators." Operations Research, 40: 505-520.
Goldfarb, B., David A. Kirsch, and D. Miller
2005 "Was There Too Little Entry During the Dot Com Era?" Working paper.
Grant, Robert M.
1996 "Prospering in Dynamically-competitive Environments: Organizational Capability as Knowledge Integration." Organization Science, 7: 375-387.
Hannan, Michael T., and John Freeman
1984 "Structural Inertia and Organizational Change." American Sociological Review, 49: 149-164.
Hansen, Morten T.
1999 "The Search-Transfer Problem: The Role of Weak Ties in Sharing Knowledge Across Organizational Subunits." Administrative Science Quarterly, 44: 82-111.
Hargadon, Andrew
2003 How Breakthroughs happen: The Surprising Truth about how Companies Innovate. Cambridge, MA: Harvard Business School Press.
Hargadon, Andrew and Robert I. Sutton
1997 "Technology Brokering and Innovation in a Product Development." Administrative Science Quarterly, 42: 716-749.
Hatch, M. J.
1998 "Jazz as a metaphor for organizing in the 21st century." Organization Science, 9: 556-557.
Haveman, Heather A.
1992 "Between a Rock and a Hard Place: Organizational Change and Performance under Conditions of Fundamental environmental Transformation." Administrative Science Quarterly, 37: 48-75.
Hayek, F.A.
1945 "The use of knowledge in society." American Economic Review, 35: 519-530.
Henderson, Rebecca M., and Kim B. Clark
1990 "Architectural Innovation: The Reconfiguration of Existing Product Technologies and the Failure of Established Firms." Administrative Science Quarterly, 35: 9-30.
Hickson, D., R. Butler, D. Cray, G. Mallory, and D. Wilson
1986 Top Decisions: Strategic decision making in organizations. San Francisco: Jossey-Bass.
Hill, Charles W. L., and F.T. Rothaermel
2003 "The performance of incumbent firms in the face of radical technological innovation." Academy of Management Review, 28: 257-274.
Holland, J.H.
1992 Adaptation in natural and artificial systems, 2nd ed. Cambridge, MA: MIT Press.
Kalos, Malvin, and Paula Whitlock
1986 Monte Carlo Methods, Vol. 1: Basics. New York: Wiley-Interscience.
Karim, Samina and Will Mitchell
2000 "Path-dependent and Path-breaking Change: Reconfiguring Business Resources Following Acquisitions in the U.S. Medical Sector, 1978-1995." Strategic Management Journal, 21: 1061-1081.
Katila, Riitta, and Gautam Ahuja
2002 "Something old, something new: A longitudinal study of search behavior and new product introduction." Academy of Management Journal, 45: 1183-1194.
Kauffman, Stuart
1989 "Adaptation on rugged fitness landscapes." In E. Stein (ed.), Lectures in the Science of Complexity. Reading, Mass.: Addison-Wesley.
1993 The Origins of Order. New York, NY: Oxford University Press.
Kirzner, I.
1997 "Entrepreneurial Discovery and the Competitive Market Process: An Austrian Approach." Journal of Economic Literature, XXXV: 60-85.
Krackhardt, David
1992 "The Strength of Strong Ties: The Importance of Philos in Organizations." In N. Nohria, and R. G. Eccles (eds.), Networks and Organizations: Structure, Form, and Action. Boston: Harvard Business School Press.
Langton, C.
1992 "Life at the edge of chaos." In C. Langton et al. (eds.), Artificial Life II: Santa Fe Institute Studies in the Sciences of Complexity: 41-91. Reading, MA: Addison-Wesley.
Law, Averill M., and David W. Kelton
1991 Simulation Modeling and Analysis, 2nd ed. New York, NY: McGraw-Hill.
Lawrence, Paul R., and Jay W. Lorsch
1967 Organization and Environment: Managing Differentiation and Integration. Boston: Harvard University.
Leonard-Barton, Dorothy
1992 "Core Capabilities and Core Rigidities: A Paradox in Managing New Product Development." Strategic Management Journal, 13: 111-125.
March, James G.
1991 "Exploration and Exploitation in Organizational Learning." Organization Science, 2: 71-87.
March, James G., and Johan P. Olsen
1976 Ambiguity and Choice in Organizations. Bergen: Universitetsforlaget.
March, James, and Herbert Simon
1958 Organizations. New York: Wiley.
Markides, Constantinos C., and Paul Geroski
2004 Fast Second: How Smart Companies Bypass Radical Innovation to Enter and Dominate New Markets. San Francisco: Jossey-Bass.
McGahan, Anita M., and Michael E. Porter
1997 "How Much Does Industry Matter, Really?" Strategic Management Journal, 18: 15-30.
Miller, Danny, and Peter H. Friesen
1980 "Momentum and Revolution in Organizational Adaptation." Academy of Management Journal, 23: 591-614.
Miller, Danny, and Jamal Shamsie
1996 "The Resource-Based View of the Firm in two Environments: The Hollywood Film Studios From 1936 to 1965." Academy of Management Journal, 39: 519-543.
Miner, A., P. Bassoff, and C. Moorman
2001 "Organizational Improvisation and Learning: A Field Study." Administrative Science Quarterly, 46: 304-337.
Mintzberg, Henry, and Alexandra McHugh
1985 "Strategy Formation in an Adhocracy." Administrative Science Quarterly, 30: 160-197.
Mintzberg, Henry, and James A. Waters
1982 "Tracking Strategy in an Entrepreneurial Firm." Academy of Management Journal, 25: 465-499.
Moorman, Christine, and Anne S. Miner
1998 "Organizational Improvisation and Organizational Memory." Academy of Management Review, 23: 698-723.
Mowery, David C.
1996 The International Computer Software Industry: A Comparative Study of Industry Evolution and Structure. Washington, DC: National Academy Press.
Nelson, Richard R., and Sidney G. Winter
1982 An Evolutionary Theory of Economic Change. Cambridge, Massachusetts: Belknap - Harvard University Press.
Ocasio, William
1997 "Towards an Attention-Based View of the Firm." Strategic Management Journal, 18: 187-206.
Okhuysen, Gerardo Andres, and Kathleen M. Eisenhardt
2002 "Integrating Knowledge in Groups: How Formal Interventions Enable Flexibility." Organization Science, 13: 370-386.
Orton, J. Douglas, and Karl E. Weick
1990 "Loosely Coupled Systems: A Reconceptualization." Academy of Management Review, 15: 203-223.
Owen-Smith, Jason, and W. W. Powell
2003 "Knowledge Networks as Channels and Conduits: The Effects of Spillovers in the Boston Biotechnology Community." Organization Science (forthcoming).
Perrow, Charles
1984 Normal Accidents: Living with High-Risk Technologies. New York: Basic Books.
Pfeffer, Jeffrey, and Gerald Salancik
1978 The External Control of Organizations: A Resource Dependence Perspective. New York: Harper & Row Publishers.
Pisano, Gary P.
1994 "Knowledge, Integration, and the Locus of Learning: An Empirical Analysis of Process Development." Strategic Management Journal, 15: 85-100.
Podolny, J.
1994 "Market Uncertainty and the Social Character of Economic Exchange." Administrative Science Quarterly, 39: 458-483.
Porter, Michael E.
1996 "What is strategy?" Harvard Business Review, 74: 61-78.
Powell, Walter W.
1990 "Neither Market or Hierarchy: Network Forms of Organization." In B. M. Staw, and L. L. Cummings (eds.), Research in Organizational Behavior: 295-336: JAI Press.
Prigogine, Ilya, and Isabelle Stengers
1984 Order Out of Chaos: Man's New Dialogue with Nature. New York: Bantam Books.
Rao, Hayagreeva
1994 "The Social Construction of Reputation: Certification Contests, Legitimation, and the Survival of Organizations in the American Automobile Industry: 1895-1912." Strategic Management Journal, 15: 29-44.
Reynolds, C. W.
1987 "Flocks, Herds, and Schools: A Distributed Behavioral Model, in Computer Graphics." SIGGRAPH '87: 25-34.
Rindova, Violina P. and Charles J. Fombrun
2001 "Entrepreneurial Action in the Creation of the Speciality Coffee Niche." In C. B. Schoonhoven
and E. Romanelli (Eds.), The Entrepreneurship Dynamic. Stanford CA: Stanford University Press.
Rindova, Violina, and Suresh Kotha
2001 "Continuous Morphing: Competing through Dynamic Capabilities, Form, and Function." Academy of Management Journal, 44: 1263-1280.
Rivkin, Jan W.
2000 "Imitation of Complex Strategies." Management Science, 46: 824-844.
Rivkin, Jan W., and Nicolaj Siggelkow
2003 "Balancing Search and Stability: Interdependencies Among Elements of Organizational Design." Management Science, 49: 290-311.
Roberts, Peter W.
1999 "Product Innovation, Product-Market Competition and Persistent Profitability in the U.S. Pharmaceutical Industry." Strategic Management Journal, 20: 655-670.
Roberts, Peter W., and Kathleen Eisenhardt
2003 "Austrian insights on strategic organization: from market insights to implications for firms." Strategic Organization, 1: 345-352.
Rowley, T.J., D. Behrens, and David Krackhardt
2000 "Redundant governance structures: An analysis of structural and relational embeddedness in the steel and semiconductor industries." Strategic Management Journal, 21: 369-386.
Rudolph, Jenny, and Nelson Repenning
2002 "Disaster Dynamics: Understanding the Role of Quantity in Organizational Collapse." Administrative Science Quarterly, 47: 1-30.
Rumelt, Richard P.
1991 "How Much Does Industry Matter." Strategic Management Journal: 167-185.
Santos, Filipe M, and Kathleen Eisenhardt
2006 "Constructing Niches and Shaping Boundaries: Entrepreneurial Action in Nascent Fields." Working Paper INSEAD.
Santos, Filipe M., and Kathleen M. Eisenhardt
2005 "Organizational boundaries and theories of organization." Organization Science, 16: 491-508.
Sastry, M. Anjali
1997 "Problems and paradoxes in a model of punctuated organizational change." Administrative Science Quarterly, 42: 237-275.
Schilling, Melissa A., and H. Kevin Steensma
2001 "The Use of Modular Organizational Forms: An Industry-Level Analysis." Academy of Management Journal, 44: 1149-1168.
Schumpeter, Joseph A.
1934 The theory of economic development, 7th ed. Cambridge, Massachusetts: Harvard University Press.
Scott, W. Richard
2003 Organizations: Rational, Natural and Open Systems, 5th ed. Upper Saddle River, NJ: Prentice Hall.
Selznick, Philip
1957 Leadership and Administration. New York: Harper & Row.
Shane, Scott
2000 "Prior knowledge and the discovery of entrepreneurial opportunities." Organization Science, 11: 448-469.
Shapiro, C., and H. Varian
1999 "The Art of Standards Wars." California Management Review, 41: 8-32.
Siggelkow, Nicolaj
2001 "Change in the Presence of Fit: the Rise, the Fall, and the Renascence of Liz Claiborne." Academy of Management Journal, 44: 838-857.
Simon, Herbert A.
1962 "The Architecture of Complexity." Proc. Amer. Philos. Soc., 106: 467-482.
1996 Sciences of the Artificial, 3rd ed. Cambridge: MIT Press.
Sine, Wesley D., Hitoshi Mitsuhashi, and David A. Kirsch
2005 "Revisiting Burns and Stalker: Formal structure and new venture performance in emerging economic sectors." Academy of Management Journal, forthcoming.
Sine, Wesley D., Heather Haveman, and Pamela S. Tolbert
2005 "Risky Business: Entrepreneurship in the New Independent Power Sector". Administrative Science Quarterly, 50: 200-233.
Sipser, Michael
1997 Introduction to the theory of computation. Boston: PWS Publishing Company.
Stark, David
1996 "Recombinant Property in East European Socialism." American Journal of Sociology, 101: 993-1027.
Stinchcombe, Arthur L.
1965 "Social Structure and Organizations." In J. G. March (ed.), Handbook of Organizations: 142-193. Chicago: Rand McNally & Company.
Strogatz, Steven
2001 Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering: Perseus Books Group.
Teece, David J., Gary Pisano, and Amy Shuen
1997 "Dynamic Capabilities and Strategic Management." Strategic Management Journal, 18: 509-533.
Tripsas, Mary
1997 "Unraveling the process of creative destruction: Complementary assets and incumbent survival in the typesetter industry." Strategic Management Journal, 18: 119-142.
Tripsas, Mary, and Giovanni Gavetti
2000 "Capabilities, Cognition and Inertia: Evidence from Digital Imaging." Strategic Management Journal, 21: 1147-1162.
Tushman, Michael L., and Philip Anderson
1986 "Technological Discontinuities and Organizational Environments." Administrative Science Quarterly, 31: 439-465.
Tushman, Michael, and Charles A. O'Reilly III
1996 "Ambidextrous organizations: Managing evolutionary and revolutionary change." California Management Review, 38: 8-30.
Tyre, M, and W Orlikowski
1993 "Exploiting opportunities for technological improvement." Sloan Management Review, 35: 13-26.
Uzzi, Brian
1997 "Social Structure and Competition in Interfirm Networks: The Paradox of Embeddedness." Administrative Science Quarterly, 42: 36-67.
Uzzi, Brian, and Jarrett Spiro
2005 "Collaboration and Creativity: The Small World Problem." American Journal of Sociology (Forthcoming).
Vermeulen, Freek, and Harry Barkema
2002 "Pace, Rhythm, and Scope: Process Dependence in Building a Profitable Multinational Corporation." Strategic Management Journal, 23: 637-653.
Watts, Duncan, Peter Sheridan Dodds, and M.E.J. Newman
2002 "Identity and Search in Social Networks." Science, 296: 1302-1305.
Weick, Karl E.
1976 "Educational Organizations as loosely coupled systems." Administrative Science Quarterly, 21: 1-19.
1993 "The Collapse of Sensemaking in Organizations: The Mann Gulch Disaster." Administrative Science Quarterly, 38: 628-652.
1998 "Improvisation as a Mindset." Organization Science, 9: 543-555.
Weick, Karl E., and Karlene H. Roberts
1993 "Collective Minds in Organizations: Heedful Interrelating on Flight Decks." Administrative Science Quarterly, 38: 357-381.
Weisstein, Eric W.
2004 "Sample Raw Moment." From MathWorld--A Wolfram Web Resource http://mathworld.wolfram.com/SampleRawMoment.html.
Williams, Charles, and Will Mitchell
2004 "Focusing Firm Evolution: The Impact of Information Infrastructure on Market Entry by U.S. Telecommunications Companies, 1984–1998." Management Science, 50: 1561-1575.
Zott, Christoph
2003 "Dynamic Capabilities and the Emergence of Intra-industry Differential Firm Performance: Insights from a Simulation Study." Strategic Management Journal, 24: 97-125.
Table 1: Types of Rules for Opportunity Capture
[Table not reproduced here. Columns: Rule Type, Definition, Operationalization; each rule is operationalized in the form "If opportunity meets conditions, then take actions to execute."]
Table 2: Organization and Environment: Definitions, Operationalization and Results
[Table not reproduced here. Columns: Construct, Range of Values (Low/High), Definition, Operationalization, Key Results.]
TECHNICAL APPENDIX
Operationalization and Initialization of Opportunities
As described in the Methods section, each opportunity is composed of a 10-element perceived features vector (i.e., bit string), a 10-element actual features vector, an integer window of opportunity, a scalar actual payoff, and a scalar perceived payoff. The feature vectors are produced by an algorithm that randomly assigns each element either a 1 or a 0. The perceived features vector differs from the actual features vector by a proportion of elements set by the environmental ambiguity parameter. The exact elements that differ are randomly chosen. The window of opportunity for each opportunity is drawn from a normal distribution with mean m = 20 and variance v = 5, and then rounded to the nearest integer greater than 0. The actual payoff is drawn from a normal distribution with m = 30 and v = 5. The perceived payoff is produced by adding a (potentially negative) Gaussian white noise element – i.e., a value drawn from a normal distribution with m = 0 and v = 10 – to the actual payoff. We have conducted sensitivity analyses with several distributions and many different parameter settings for these operationalizations, and found no significant differences in our findings. Therefore, we chose representative distributions and parameter settings.
Operationalization and Initialization of Rules
We operationalized each type of rule (i.e., boundary, priority, how-to, timing, exit) such that the number of free parameters upon which the results depend is minimized. Most rules also have an amount of structure that can be varied to examine the relationship between structure and performance. The boundary and how-to rules each consist of a 10-element vector of 0's, 1's, and ?'s; a smaller number of ?'s increases the amount of structure. The priority rule is simply the ranking of the selected opportunities by their perceived payoff. The timing rules specify the number of opportunities that can be captured at one time. In addition, the timing of opportunities relies on two additional parameters not described in the Methods section. These parameters relate to latencies at the beginning and ending of opportunity capture. As Sastry (1997) indicates, there is some delay in both beginning and ending actions in response to changes. In this research, birth latency is the amount of time required to start capturing an opportunity, while death latency is the amount of time needed to stop executing an opportunity. We set both birth latency and death latency to 5 timesteps, a moderate value that was fixed across all experiments and that produced findings that were typical across a range of settings, and assumed that the organization is aware of the timing of these latencies. These latencies are used in both the timing and exit rule algorithms. The exit rules assume that the organization can perceive the ending of the window of opportunity. In particular, we use an additional parameter for the exit rules – i.e., the projected endtime, which is defined as the percentage of the window of opportunity time period that must elapse before an exit is made. We set the projected endtime to 80% of the window of opportunity. Once an organization reaches the projected endtime for a given opportunity, the organization can continue to exploit this opportunity or, if a higher-valued opportunity is available, end the current opportunity and begin the higher-valued one. This enables firms to gain most of the payoff from each opportunity that has been captured without excessive churning (Sastry, 1997). We selected values for the fixed parameters (e.g., birth latency, death latency, projected endtime) after extensive sensitivity analyses to ensure that these values produced representative findings.
We initialized the rules in a similar way to the opportunities. For example, the boundary and how-to rules are initialized as 10-element vectors, but with ?'s (elements that can be improvised) scattered throughout a string of 1s and 0s (rules to be followed). The exact placement of 1's, 0's, and ?'s is randomly assigned, with the number of ?'s fixed. Thus, the amount of structure is operationalized by the number of ?'s. For example, if a boundary rule's number of rules is set to 6, then 0?0?1?01?0 or any other permutation could result, as long as four ?'s were assigned. The how-to and boundary rules also have one additional feature. Specifically, unpredictability influences the degree to which there is a foreseeable pattern in the flow of opportunities. Thus, as unpredictability decreases, organizations can increasingly learn how to respond to these opportunities by recognizing this pattern, and this learning can be incorporated into their rules (Bingham and Eisenhardt, 2005). We assume that this learning by the focal firm has occurred, and so we initialize the organization's set of rules to reflect the degree to which there is a foreseeable pattern of opportunities. For instance, when there is high environmental unpredictability with entropy H = 1, we randomly assign an equal 50/50 split of 1s and 0s to the actual features vector of the opportunity, as described in the Methods section for operationalizing environmental unpredictability. Correspondingly, the boundary and how-to rules are also initialized to have an approximately 50/50 split of 1s and 0s (although the total number of 0's and 1's depends upon the value of the amount of structure). When there is low environmental unpredictability (e.g., a 70/30 split of 1s and 0s in the actual features vector of opportunities), the rules will also have a 70/30 split of 1s and 0s, reflecting the degree to which the organization has learned that the opportunities in the environment have this higher proportion of 1's. Thus, we assume that organizations have learned patterns in the opportunity flow if they exist, have learned more when unpredictability is lower, and have adjusted their rules to reflect this learning prior to the simulation. The embedding of learning within rules is consistent with much prior research (Cyert and March, 1963; Burgelman, 1994; Bingham and Eisenhardt, 2005).
Improvisation and Attention
A key feature of our model is the improvisation of action. As outlined in the Methods section, the organization takes various actions as it attempts to capture opportunities. Some of these actions are rule-based and some are improvised. As the amount of structure decreases, the amount of improvised action increases and so enables the organization to respond flexibly to unexpected opportunities. In contrast, as the amount of structure increases, the amount of rule-based action increases and so enables the organization to be efficient. More specifically, when a rule (e.g., 0?1?10???0) is applied to a given opportunity, the organization follows the rule for each element specified by a '0' or a '1'. Thus, it takes some rule-based actions. In addition, the organization randomly improvises a '0' or '1' action for each '?' placeholder, with a 50/50 likelihood of each outcome. Thus, it takes some improvised actions. Overall, this process produces a set of actions (e.g., 0111100110, where the 2nd, 4th, 7th, 8th, and 9th actions are improvised) that can be compared for a match with a given opportunity (e.g., 0110101010). When enough of the actions match the features of the opportunity, as specified by the complexity of the opportunity, the opportunity is captured and the organization gains the opportunity's actual payoff. When an insufficient number of actions matches the opportunity features, the opportunity remains in the environment. Depending upon the attention available (see below), the organization may continue to improvise in order to try to capture the opportunity. When an opportunity reaches the end of its window of opportunity, it leaves the environment.
Another key feature of the model is attention. As in actual organizations, we assume that the organization has a finite amount of attention. In particular, the organization has a fixed attention budget. In each timestep, the attention budget is decremented for each checking of rules against opportunities, each rule-based action, and each improvised action. Consistent with research on improvisation (Weick, 1993; Brown and Eisenhardt, 1997; Miner et al., 2001) and the use of rules (Cyert and March, 1963), we assume that an improvised action takes more attention than simply checking or applying a rule, because improvisation has enhanced demands for sense-making regarding the particular characteristics of the situation (Weick, 1993) and for blending the design of actions with their execution (Miner et al., 2001). Thus, consistent with empirical evidence, we set the attention required to check the match of a rule with an opportunity or to take a rule-based action at 1 unit of attention, and the attention required for each improvised action at 10 units. Our sensitivity analyses indicate the findings are robust to a broad range of variations in the amount of attention that an improvised action requires; we chose 10 as a representative value. In any given timestep, the attention budget is decremented until it is depleted or the timestep ends. Action stops if the attention budget is completely depleted, and the budget is replenished at the beginning of each new timestep. We set the attention budget to 2800 attention units. Sensitivity analyses that varied this parameter showed that increasing the budget simply increases the number of opportunities that can be executed in a given timestep, and that these variations (above a minimal threshold) do not produce a qualitatively different set of findings. Therefore, we chose this representative value for our simulation runs. Finally, in any given timestep, rules are checked against opportunities for a match, and rule-based and improvised actions are taken as long as attention is still available.
Performance and Error Constructs in Monte Carlo Experiments
As noted in the Methods section, we use standard Monte Carlo techniques (Law and Kelton, 1991). Specifically, each simulation run consists of 200 timesteps, and each experiment consists of 30-50 simulation runs for each amount of structure and each specific high or low value of the environmental dimensions. The results of these simulation experiments are graphed in a consistent fashion in Figures 1-6. That is, each point represents the results of one simulation experiment: the mean performance computed across all 30-50 simulation runs (Y-axis) for a given amount of structure (X-axis). A curve is then interpolated between the mean performance values by connecting these points with lines.
In order to describe the inverted U-shaped curves in the Results section, we use three performance constructs – performance peak, optimal amount of structure, and performance peak robustness. The performance peak is the point of maximum performance that the organization can achieve. For example, the peak performance in Figure 1 is approximately 300. The optimal amount of structure is the value of structure (either number of total rules or number of rules) at which the performance peak (i.e., highest performance) occurs. Since the inverted U-shaped relationship between structure and performance can be more or less like a plateau, there may be a range of optimal structures. In this case, the exact optimal point is defined as the 'center' value of this set of high-performing points. In other words, it is the first 'sample raw moment' of the amount of structure weighted by performance – i.e., the performance-weighted 'center' of the set of data (Weisstein, 2004). We chose this definition and related calculation because of its precision and because the 'sample raw moment' corresponds to the intuitive 'center' of the set of optimal values. For example, the exact optimal point on the x-axis of Figure 1 is approximately 12. Finally, performance peak robustness describes the width of the range of optimal structures. This robustness reflects the degree to which changes in the amount of structure are unlikely to affect performance. Performance peak robustness is calculated graphically by defining a performance cutoff line for 'high' performance and then calculating the distance between the two amounts of structure that intersect it. Large performance peak robustness denotes a wide range of optimal structures that can produce the performance peak. Small performance peak robustness denotes a narrow range of optimal structures, so that performance is sensitive to even small changes in the amount of structure. For example, for Figure 1, a performance cutoff of 225 produces performance peak robustness = 5 (i.e., the distance between structure values of 20 and 25).
As in all stochastic processes and related phenomena (regardless of whether empirical or simulated), the results of Monte Carlo experiments will typically vary across simulation runs even when construct parameter values are fixed (Law and Kelton, 1991). Therefore, we compute not only the mean performance for a given experiment, but also its variability in terms of error variances. Specifically, we plot both a performance mean for each value of the amount of structure and an associated 'error bar' that indicates the variability of each result, a standard graphical method for Monte Carlo outputs (Kalos and Whitlock, 1986). We compute the length of the error bar as the square root of the ratio of the error variance of each experiment to the number of trials (i.e., simulation runs) – that is, the standard error of the mean. These error bars provide an intuitive, visual display of the confidence intervals surrounding a result. As a rule of thumb, if the mean of one result is contained within the error bars of another result, then the two are not significantly different.
Sensitivity Analyses
We performed extensive sensitivity analyses for all the key amount of structure-performance relationships described in the Results section. By sensitivity analysis, we mean a thorough exploration of the parameter space to discover whether a given finding remains when primary and secondary construct values (i.e., parameters) are varied. That is, we not only varied the amount of structure operationalizations (i.e., number of rules, number of total rules) that were the primary focus of the research, but also secondary constructs, in order to ensure the robustness of the results. We chose the specific values for presentation because they represent 'extreme' values of a parameter or 'midpoint' values nestled in between already tested values, as appropriate. Thus, we explored the parameter space in a very fine-grained way. In particular, we paid special attention to exploring the full range of the environmental dimension values – i.e., velocity, complexity, ambiguity, and unpredictability. Since velocity (λ) is unbounded in a Poisson distribution but actual organizations are both cognitively and resource bounded, we placed an upper bound on λ at the value for which the number of opportunities is an order of magnitude greater than the organization could capture in any timestep. We then thoroughly explored velocity at a variety of parameter values including 0, .4, .6, .8, 1.2, 1.4, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3, 4, and 5. All results are consistent with Figure 3. We also explored complexity, which ranges from 0 to 1, with a variety of parameter values including 0, .2, .3, .4, .5, .6, .7, .8, .85, and .9. All results are consistent with those in Figure 4. We tested ambiguity, which ranges from 0 to 1, with a variety of parameter values including 0, .1, .2, .25, .3, .4, .6, .8, and 1.0. All results are consistent with those in Figure 5. Unpredictability, which is operationalized with an entropy measure, ranges from 0 to 1 in our tests. We tested the sensitivity of the unpredictability results with a variety of parameter values for the proportion of 1s including 0, .1, .2, .3, .4, .5, .6, .7, .8, .9, and 1.0. All results are consistent with those in Figure 6.
ENDNOTES
There are several interpretations of complexity theory and the related complexity sciences – e.g., formulations by Kauffman, Gell-Mann, Langton, and Holland. We follow the conception of complexity theory closest to Kauffman and Gell-Mann. Specifically, we note that the "complexity" in "complexity theory" refers to the outcomes of systems – namely, the outcome of moderately connected systems (i.e., complex adaptive systems) is complex, such that resulting system behaviors are innovative, adaptive, varied, and surprising (Gell-Mann, 1994). This view differs most notably from Simon's (1962) notion of the complexity of systems like organizations, which refers to the degree to which they have many parts and are tightly interconnected.
We considered several approaches, including NK modeling, but none matched our research problem and objectives well. For example, NK modeling is an optimization approach that focuses on the time of the search process to find a peak, usually on a fixed landscape. In contrast, we were interested in a descriptive approach that would enable a rich examination of multiple dimensions of a dynamic environment (not a fixed landscape) and the long-term performance effects of particular structures for a series of opportunities (not the search time).
We set the proportion of 1s and 0s in the 10-element rules to the same proportion as the environmental unpredictability. Thus, we assume that organizations have learned any previous patterns in the underlying flow of opportunities and have incorporated them into their rules. This is consistent with empirical research that firms learn and stabilize their rules quickly (Bingham and Eisenhardt, 2005).
We experimented with multiple values for the amount of attention required for improvised action relative to rule-based action. We found no qualitative differences in the findings, and so present the results for a representative value. Future research could explore this parameter further.
Additional results for other values of the environmental dimensions are available from the authors.
In Figures 1-6, each point represents the average over n = 30 simulation runs (except in Figure 1, which uses n = 50) with t = 200 timesteps each. We selected n = 30 as the number of simulation runs because exploratory analyses revealed that values of n > 30 yielded only insignificant incremental gains in reliability. We used n = 50 for the first operationalization of the amount of structure because of the added precision that is useful for the larger range of structure values. These results are representative of the findings produced by other construct values during our extensive exploration of the parameter space. See the Technical Appendix for more details.
Given space limitations, the following analyses are conducted for the second operationalization of structure. We chose this operationalization because it captures the focal importance of how-to rules in empirical settings (Bingham and Eisenhardt, 2005) and because it enables an easier-to-understand graphical representation. The results for both operationalizations are, however, qualitatively the same.
Sensitivity analyses indicate that the results produced by these choices were representative of other values of velocity.
Sensitivity analyses indicate that the results produced by these choices were representative of other values of complexity.
Sensitivity analyses indicate that the results produced by these choices were representative of other values of ambiguity.
Sensitivity analyses indicate that the results produced by these choices were representative of other values of unpredictability.
We omit the value of 10 because of the appearance of some undefined endpoint values.