In 2002, I published an article in the Business History Review, “Competition and Business Strategy in Historical Perspective,” that attempted to provide an overview of the evolution of practical—rather than academic—ideas about business strategy from the beginnings of the field through the second half of the 1990s. It concluded that, by then, (1) there had been a profusion of new ideas about strategy, which had probably started in the early to mid-1980s, and (2) many of the new ideas embodied some explicit dynamics, compared with the typically timeless or static approaches to strategy that had previously dominated the field. The article was, however, qualified in its predictions about the future; the first paragraph of the conclusion offered the following summary:
Dynamic thinking . . . has absorbed the bulk of academic strategists’ attention in the last fifteen-plus years. But when one looks at the practice of strategy in the late 1990s, this simple narrative is complicated by an apparent profusion of tools and ideas about strategy in particular and management in general.
In other words, I noted a trend toward more emphasis on dynamic than on static thinking about strategy through the 1990s but was uncertain—at a time when many sophisticated observers were alleging faddishness in the development of new ideas about strategy—about whether dynamics itself might simply be of passing interest.
This article updates the earlier one, based on what is getting close to twenty years of additional evidence about the evolution of business strategy. In line with the earlier article, it focuses on the development and diffusion of influential new ideas about strategy. This approach can now also draw on support from work published in the interim, most notably Walter Kiechel's (2010) history of the intellectualization of strategy and the categorization by Martin Reeves and his colleagues at the Boston Consulting Group (BCG) of salient strategy frameworks up to 2013.
This article finds several indications of a drop-off in the rate of development of new ideas about strategy since the second half of the 1990s. But it also indicates a continued focus on dynamics, which seems to have intensified over the past ten years in particular. The (cumulated) stock of dynamic frameworks has, based on the BCG enumeration, more than doubled over the last two decades. Such updating increases both the need and the empirical basis for some generalizations about the types of dynamic frameworks that managers are likely to find helpful and those that they are not—and also provides additional perspective on some of the older, static frameworks described in the earlier article. The objective is to move from additions of new frameworks to a large cumulated stock toward a critical assessment of frameworks, new and old.
Indicators of Innovation Rates
While there are no ideal indicators of rates of innovation in strategy—let alone performance measures—there are multiple indications that the rate has dropped off sharply since the levels reached in the 1990s (which my original Business History Review article characterized as being high from a historical perspective). One telling indication relates to Richard Pascale's importance-weighted citation index of new ideas about business/management through the mid-1990s, which he generously let me use in my 2002 article.
When I contacted Pascale recently for an update, he explained that the most recent version he had ran only through 2000—see Figure 1, which incidentally does seem to show a flattening toward the end of the period—and that he had discontinued his tracking because there just weren't enough new ideas with impact to bother!
Figure 1. Ebbs, flows, and residual impact of business fads, 1950–2000. (Source: Richard Pascale, updated figure, “Ebbs, Flows, and Residual Impact of Business Fads, 1950–1995,” published in his book Managing on the Edge: How Successful Companies Use Conflict to Stay Ahead [New York, 1990], 18–20.)
Another indication—cited by Adrian Wooldridge in a column for the Economist in spring 2015, titled “The Twilight of the Gurus” and subtitled “The management-pundit industry is a shadow of its former self”—pertains to the relative stability at the very top of Thinkers50's biennial ranking of leading management thinkers; Clayton Christensen and W. Chan Kim and Renée Mauborgne topped the list in 2013, for ideas developed a decade-plus earlier (disruptive innovation and blue ocean strategy, respectively).
Wooldridge happened to be my dinner guest at the most recent Thinkers50 gala, in November 2015, and we both noted with interest that Michael Porter had squeezed ahead of Christensen and Kim-Mauborgne to the number one spot, which seemed an instantiation of his point.
Along similar lines, it is the period through the mid-1990s that is characterized as “strategy heydays” in A. T. Kearney's recent history of strategy.
And as far as what the strategic consulting firms themselves now focus on, my interviews at three of the very largest of these firms (in fall 2013) indicate more of an emphasis on “capabilities” and long-term engagement with clients than on coming up with the next big new idea to be sold widely—which seemed much more the mindset in the 1980s and the 1990s.
Up-to-date data backstopping such impressions can be drawn from the recent classification of salient (practical) strategy frameworks by Martin Reeves and his colleagues at BCG.
More than five years ago, BCG launched a major exercise aimed at learning from the history of the strategy field that involved, among other things, the compilation of a chronology of salient strategic frameworks. In order to generate its list, BCG reviewed the academic literature (including my 2002 Business History Review article, which was cited as a key source) as well as the publications of other major strategy consulting firms and conducted interviews with leading academics, CEOs and senior managers, and its own senior officers.
A chronological list of the eighty-one salient strategy frameworks identified by BCG is provided in the Appendix, and the rate of innovation over time (through 2013) is summarized in Figure 2. The number of additions to this list increased from ten in the 1960s and nine in the 1970s to eighteen in the 1980s and a peak of twenty-five in the 1990s before subsiding to thirteen in the 2000s and to four so far in the 2010s (the remaining two frameworks on the list date from the late 1950s). So there does seem to have been a peak in the 1990s and a big drop-off since then.
Figure 2. Rate of innovation based on BCG list of eighty-one salient strategy frameworks. Note: Data shown represent new frameworks introduced during five-year periods. The first of the frameworks was from 1958 and the most recent from 2013. (Source: Figure created using data in Martin Reeves, Knut Haanæs, and Janmejaya Sinha, Your Strategy Needs a Strategy: How to Choose and Execute the Right Approach [Boston, 2015].)
I will discuss later in this article what I make of this recent drop-off in terms of its welfare implications. But before that, I want to update the second conclusion from my earlier article.
Dynamic versus Static Frameworks
The second conclusion of my 2002 Business History Review article was that ideas about strategy had increasingly come to involve some explicitly dynamic considerations: specifically, more or less explicit evocation of the passage of time and its effects. However, I was somewhat hesitant, back then, to rule out the possibility that dynamics itself might be a fad. The fifteen to twenty years of updating helps in this regard, indicating that despite the overall drop-off, there is a continued emphasis on new ideas that are dynamic rather than static. In other words, there are more reasons now than at the time of my original Business History Review article to think that dynamics is a sustained focus of strategic innovation rather than one of passing interest.
Once again, reaching such a conclusion requires assembling a handful of indicators that, while subject to different limitations, do tend to point in the same direction. Thus, the Wikibook Business Strategy, with its mostly chronological structure, deals with the 1990s under the rubric of “Strategic Change in the 1990s,” followed by a section on “Information- and Technology-Driven Strategy,” both of which would seem to be dynamic in their thrust.
And four of the five leading McKinsey ideas about strategy from the early 2000s, represented on the strategic thinking map in a McKinsey staff paper by John Stuckey (growth, portfolio of initiatives, corporate strategy that replaces [static] assessments of business attractiveness with [dynamic] ones of value creation potential, and creative destruction), evoke explicitly dynamic themes, as does, at least for me, the lone outside idea cited (capabilities-based strategy).
But again, the BCG schema provides perhaps the single best basis for concluding that the trend toward dynamic frameworks, evident by the second half of the 1990s, has persisted in the nearly twenty years since then. BCG has since published a book on the topic that, among other things, classified the frameworks into “classical” versus adaptive, visionary, shaping, and renewal. The latter four would seem to have a dynamic emphasis of some sort, whereas most of the ones included in the classical bucket might seem not to. Assuming as much, the BCG chronology implies that over the 1990s and 2000s, new strategy frameworks have come to be distributed roughly equally across adaptive frameworks, other nonclassical frameworks, and classical ones. But closer scrutiny of BCG's definitions makes me cautious about equating “nonclassical” with dynamic. Thus, my own contribution to BCG's list of eighty-one—a focus on commitment (irreversibility) that was described most fully in the 1991 book bearing that title—is classified by BCG as “classical” despite bearing the subtitle “The Dynamic of Strategy.”
Accordingly, I persuaded BCG to reclassify their eighty-one salient strategy frameworks into static versus dynamic, an exercise for which I am enormously grateful. The criterion used, in BCG's own words, is “dynamic strategy frameworks . . . deal with the evolution of strategic positions over time and issues arising from this change over time, whereas static frameworks deal with strategic positions and issues in a given moment.”
This sounds reasonable, although I will revisit how this criterion was operationalized below.
Based on BCG's classification, Figure 3 charts dynamic versus static strategic frameworks over time. Four of the eighteen salient strategy frameworks from the 1980s were classified as dynamic, versus thirteen of the twenty-five from the 1990s and ten of the seventeen in the period from 2000 to 2013.
Interest in dynamics really started to surge in the mid-1990s and seems to have intensified in the last ten years in particular, with seven of the eight frameworks from the last ten years (since 2005) being classified as dynamic.
Figure 3. Cumulated number of strategy frameworks, static versus dynamic. Note: Based on BCG's list of eighty-one frameworks and BCG's classification of them as static versus dynamic. (Sources: The frameworks are from Martin Reeves, Knut Haanæs, and Janmejaya Sinha, Your Strategy Needs a Strategy: How to Choose and Execute the Right Approach [Boston, 2015]. Classification of static and dynamic subsequently provided by BCG to author.)
I should add that while the BCG classification is very useful—it is the group's list of frameworks, for one thing—my own sense is that it may be too restrictive, overall, in classifying frameworks as dynamic. My own classification suggests that eight of the eighteen frameworks from the 1980s were dynamic, as were eighteen of the twenty-five in the 1990s and twelve of the seventeen since 2000, helping swell the cumulative total to forty-nine out of eighty-one by my count (versus thirty-seven by BCG's). A third classification, by Bruno Cassiman, who coedited a special issue of Management Science on strategic dynamics with me in 2007, yielded an overall dynamic/static split very close to mine.
But the commonality across these three different looks at the list of eighty-one frameworks is that dynamic frameworks have dominated numerically since the 1990s, and this trend does not seem to be weakening. So there is evidence to suggest that the second conclusion in my 2002 article—more attention to dynamic frameworks—still holds.
What should one make of the drop-off overall and the shift toward more attention to dynamics? And what, if anything, should be done?
Wooldridge himself is concerned about current “lethargy” because, “considering the resources that are devoted to thinking about management, it is remarkable how much virgin territory remains.”
But others hold different opinions. Thus, a decade ago, Stuckey of McKinsey sensed a slowdown and sounded an appreciative rather than disappointed note on the last page of his review of the state of strategy:
Most future developments in strategic thinking may be incremental rather than paradigm-shifting. Indeed, many will simply be repackagings of basic ideas, emerging at a particular time because a particular strategic question is on the minds of many clients (as risk management is today) or because someone finds a powerful new way to communicate an existing strategy insight—as, indeed, we have ourselves done, to great effect.
In assessing whether current levels of development of new ideas about strategy are too low or about right, it is also worth considering the period in which observations of drop-offs are most often anchored: the (all-time) peak in the 1990s. Back then, something unsustainable was clearly in the air. Wooldridge wrote about it at the time, in a book on management gurus coauthored with John Micklethwait—memorably titled The Witch Doctors—that highlighted the possibility of too much churn in management ideas.
(One of the memorable slams of fads I took away from that book was BOHICA, shorthand for “Bend over, here it comes again.”) And others had also expressed similar opinions by the time I started to conduct research for my original Business History Review article. Thus, Nitin Nohria and James Berkeley, writing in 1994, warned that “the adoption of off-the-shelf ‘innovations’ [for managers] continues at a disturbing rate.”
Were there too many new ideas about strategy in the 1990s (too many BOHICA moments) or too few recently (too few eureka moments)—or both, or neither? It is hard to say, partly because the data suggest big temporal variations in the rate at which new ideas and frameworks are developed; those variations seem to condition whether we worry about too many or too few ideas at a particular point in time, yet their drivers are not well understood. Saying that market forces should lead to appropriate outcomes misses the point about market imperfections on both the supply side and the demand side of the market for ideas about strategy that I emphasized toward the end of my 2002 article in Business History Review. To summarize: on the supply side, small-numbers competition raises issues about whether market mechanisms will deliver the right quantity and kind of innovation; on the demand side, there are the additional complications created by the informational imperfections of markets for ideas, as opposed to more conventional products. While the focus of concern has shifted from too many new ideas about strategy to too few, these supply-side and demand-side issues continue to be reasons not to assume that market outcomes are necessarily optimal.
And market failures are not the only source of difficulty in assessing the performance of markets for ideas. Here, I will briefly mention a few others that are also discussed at greater length in the concluding section of my 2002 article. (1) Markets for ideas are inherently more complex to analyze because ideas are slipperier, harder to distinguish from each other, and harder to measure than prototypical widgets—as well as more prone to information-related problems of various sorts. (2) Despite attracting lots of use, some ideas may turn out to be busts. Reengineering exploded from nowhere to $2.5 billion in consulting revenues by 1995, only to implode to $1 billion by 1998 (some of which seems due to “failures” and some to a shift in corporate emphasis in the United States from restructuring to growth). Wide usage is no guarantee of idea quality. (3) Quantity of usage can itself be difficult to measure. Thus, in terms of article counts, quality circles peaked in 1982 and then suffered an even bigger drop-off than reengineering. But they and total quality management (TQM) have had an enduring impact by being adopted and adhered to rather than abandoned. (4) Idea quality—which would seem to be as indispensable a part of any idea impact assessment as quantity of usage—is the really tricky measure, sometimes even after the fact. It is often hard to sort out idea quality from implementation quality in explaining performance. (5) Also, standard models of scientific progress (e.g., Thomas Kuhn's) imply that worthwhile big new ideas are likely to be very infrequent, and that most progress is associated with many small ideas. So a focus on big new ideas could be very limiting even if those ideas happened to be (relatively) easy to measure, because it misses out on many smaller improvements.
For all these reasons, I conclude that while there does seem to have been a significant drop-off since the second half of the 1990s in the rate of development of big new strategy ideas/frameworks, it is much harder to be definite about the welfare implications. And even if you believe that the drop-off is problematic, you may want to pause before encouraging a thousand flowers to bloom if you think that too much ideation went on in the 1990s.
In terms of what is to be done, I have another concern about simply assuming that we would be better off with a burst of new frameworks (although I would not rule out that possibility either). We already have eighty-one salient strategy frameworks, according to BCG. Is it likely that tossing a few more onto that heap will lead to significant progress? While I am unable to answer that question, I do think that whether we need new frameworks or not, we could benefit from some attempts to sort through the menagerie we already have. I will argue this point with applications: in the next section, by bringing some logical considerations to bear on the value of dynamic thinking and, more briefly and in the conclusion, by reconsidering classically static frameworks that preceded the burst of dynamic thinking.
By BCG's count, we have three dozen dynamic frameworks, and by my count, four dozen. My first instinct was to focus on the significant discrepancies between BCG's classification and mine to understand the reasons behind them. But the process of doing so shifted my interest away from more precisely defining “dynamic” toward making at least a partial attempt to sort through the large and still-growing stock of dynamic frameworks/ideas. Some can be very valuable to real managers, and some cannot. In what follows, I propose simple criteria for assessing the potential that “dynamic” frameworks actually offer for truly dynamic thinking: both thinking through time and thinking over time.
The two principal criteria come down to recognizing—but avoiding making extreme assumptions about—history dependence, on the one hand, and learning possibilities, on the other, as summarized in Figure 4. Extreme assumptions in either respect call the value of dynamic thinking into question in fundamental ways, which is perhaps why relatively few practical strategy frameworks achieve these extremes. But some do end up quite close to the extremes and, as such, can be treated as “ϵ-examples.”
Figure 4. The value of dynamic thinking. (Source: Author's depiction.)
Let us start with the arguments around the imperative of intermediate levels of irreversibility instead of either zero or complete irreversibility. Zero irreversibility would imply an ahistorical perspective, which, for an audience of business historians, might already be a fatal defect. But to connect better with the criterion of the value of dynamic thinking, note—as Kenneth Arrow did fifty years ago—that without irreversibility of any kind, choices could be reversed costlessly and there would be no need to look deep into the future.
In fact, a series of myopic decisions—a sequence of static optimization exercises—would be a perfectly adequate approach to strategy.
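Arrow's logic can be made concrete with a toy two-period planning problem (all payoffs hypothetical): when positions can be switched at no cost, myopic period-by-period optimization matches full lookahead, but once switching is costly enough to be effectively irreversible, the myopic path gets locked into an inferior position.

```python
from itertools import product

# Per-period payoffs of two market positions (hypothetical numbers).
PAYOFF = {1: {"A": 10, "B": 3},   # period 1: position A looks better
          2: {"A": 1,  "B": 12}}  # period 2: position B looks better

def total(path, switch_cost):
    """Total payoff of a two-period path; changing positions costs switch_cost."""
    p1, p2 = path
    return PAYOFF[1][p1] + PAYOFF[2][p2] - (switch_cost if p1 != p2 else 0)

def greedy(switch_cost):
    """Myopic play: optimize each period in isolation, given the current position."""
    p1 = max("AB", key=lambda p: PAYOFF[1][p])
    p2 = max("AB", key=lambda p: PAYOFF[2][p] - (switch_cost if p != p1 else 0))
    return total((p1, p2), switch_cost)

def lookahead(switch_cost):
    """Dynamic optimization: evaluate all paths before committing."""
    return max(total(path, switch_cost) for path in product("AB", repeat=2))

assert greedy(0) == lookahead(0) == 22   # fully reversible: myopia is harmless
assert greedy(20) == 11                  # irreversible: myopia locks in A ...
assert lookahead(20) == 15               # ... while lookahead commits to B early
```

The point of the sketch is exactly Arrow's: the value of looking deep into the future is created by the switching cost, not by the passage of time per se.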
Do any salient strategic frameworks exist that assume zero irreversibility? It is useful to begin by noting that all of the static frameworks effectively shunt such issues aside by ignoring time's arrow and its manifestation in the form of irreversibility. And even among dynamic frameworks, it is possible to specify some that seem underbounded by history. Consider, for instance, the most recent dynamic framework on BCG's list, Rita McGrath's “Transient Advantage.”
Her vision of corporations as shapeshifters that are continuously morphing conjures up for me an image of frogs jumping from lily pad to lily pad without regard to initial position. History dependence seems absent, at least in the form of any specific product market commitments. This might be (approximately) acceptable for a start-up that is still just a spark in somebody's mind, but for most enterprises, it would seem that where they are coming from in product market terms should have some bearing on where they choose to go.
The other end of this continuum is occupied by extreme irreversibility, which leaves no room whatsoever for managerial agency. Take, for instance, the resource-based view of the firm, dating back to the 1990s, which BCG classifies as static. I actually included it as one of “four strands of academic inquiry that embodied new approaches to thinking about the time dimension” that warranted individual discussion in the “Competitive Dynamics and History” section of my 2002 article.
But I can see BCG's point in that “high church” versions of resource-based theory—which place a great deal of emphasis on socially complex resources beyond the ability of managers to manage systematically—imply that firms and those who manage them must focus on competing with the resources they are stuck with (i.e., in the operational short run), as in more clearly static frameworks. Or, in the words of a CEO who—unusually—has heard about the resource-based view, it is basically resigned. In specific regard to dynamic thinking, because there is nothing to be done, there is no point in taxing oneself by thinking through or over time.
Organizational or population ecology is another frame that stresses high irreversibility and that has more currency academically than practically. Thus, BCG included organization ecology in its candidate list of strategic frameworks but then dropped it; I remember similar treatment, twenty years ago, when I was involved in the McKinsey Strategy Theory Initiative, which classified population ecology (along with institutional theory and some of the other frameworks discussed below under the rubric of uncertainty) as “anti-strategy.” The practical problem with such ecological explanations, and the theories of imprinting that underlie them, is that they leave little or no room for managerial action, at least once an organization has been founded—nor, indeed, for strategy.
For practical as opposed to academic purposes, however, the siren song and free spirits associated with zero irreversibility seem to be more of a danger than does overemphasis of the dead hand of the past. For an illustration of the pull that this extreme can exert even on academically grounded strategy frameworks, consider dynamic capabilities, which, along with the resource-based view, represent the two broad perspectives on strategy that perhaps wield the most academic influence today. I was much more positive about dynamic capabilities than the resource-based view of the firm in my 2002 article, because unlike resources in the resource-based view, which were taken as given, capabilities were supposed to be built and reinforced over time—that is, offered more room for managerial agency. I hoped that this still-emerging literature would afford micro insights into which capabilities to focus on and how to invest in them (or not).
The literature on dynamic capabilities has made quite a mark, but academically, it has not lived up to my expectations—nor, more importantly, to those of some of the academics more closely involved with it and therefore in a better position to offer credible critiques.
Thus, around the time that my Business History Review article appeared, Sidney Winter—coauthor with Richard Nelson of the influential book An Evolutionary Theory of Economic Change (1982)—was already warning that “some of the mystery and confusion surrounding the concept of dynamic capabilities arises from linking the concept too tightly to notions of generalized effectiveness at dealing with change.”
And Gary Pisano, a coauthor of the original piece on dynamic capabilities, has recently expressed a similar view:
Research on dynamic capabilities should focus on the connection between capability creation/investment decisions and competitive outcomes. . . . Dynamic capabilities depart from other perspectives on strategy by making capability investments a key strategic choice (albeit a constrained choice) of the firm. This was one of the original purposes of the Teece and Pisano (1994) and Teece, Pisano, and Shuen (1997) papers. Unfortunately, since that time, research on dynamic capabilities has come to focus on generalized capabilities for adaptation to change. While such flexibility is potentially useful in many contexts, it is not necessarily a basis for strategy.
Pisano's quote also suggests a way forward, by exploiting connections with game theory—another one of the four areas scouted in the “Competitive Dynamics and History” section of my Business History Review article. I think it is also fair to say that game theory has not had the impact on strategic management that early adopters such as myself had hoped. The most oft-cited problem is probably the sensitive dependence of the conclusions of conventional noncooperative game-theoretic models on a host of parameters, including factors unobservable by third parties (e.g., information sets). But progress has been made: John Sutton and Adam Brandenburger and Harborne Stuart Jr., for instance, use unconventional (and very different) methods to deduce robust bounds on the flow of firms’ (operating) profits, given opportunity costs and buyers’ willingness to pay, and work backward in time to figure out how anticipations in this regard will affect prior investments in shifting cost or willingness to pay—almost exactly what Pisano describes above as one of the original purposes of the dynamic capabilities literature.
From the academic perspective, progress is being made at tying things together in a coherent way.
Assertions about the uncertainty implicit (or explicit) in ideas about strategy are layered onto the prior set of conditions regarding intermediate irreversibility because, although uncertainty and its resolution affect learning possibilities, learning about the future is itself unnecessary with zero irreversibility and impossible to act upon with complete irreversibility. But under conditions of intermediate irreversibility, the injection of uncertainty into the problem transforms decision making because, as Arrow observed, “When there is uncertainty, there is usually the possibility of reducing it by the acquisition of information.”
That is why there is a peak in the middle of Figure 4 while the edges asymptote toward zero.
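Arrow's observation can be put in expected-value terms with a deliberately stylized one-shot investment (all numbers hypothetical): the option to learn before committing is worth the difference between acting on priors and acting after the uncertainty is resolved.

```python
# Hypothetical one-shot bet: succeed with probability p for +100, fail for -80.
p, win, lose = 0.5, 100.0, -80.0

# Acting on priors: invest only if the expected value beats doing nothing.
ev_uninformed = max(p * win + (1 - p) * lose, 0.0)           # 10.0

# Learning first: invest in the good state, walk away in the bad one.
ev_informed = p * max(win, 0.0) + (1 - p) * max(lose, 0.0)   # 50.0

value_of_information = ev_informed - ev_uninformed           # 40.0
```

This is just the textbook expected value of perfect information, but it illustrates why the curve peaks in the middle: if the outcome is already certain, there is nothing to learn, and if it is unknowable, there is no signal worth acquiring, so in both extremes the value of information collapses to zero.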
The depiction in Figure 4 of extreme uncertainty entirely overpowering dynamic thinking is perhaps a touch overdone. Even if luck were all that mattered, it would still be useful to know that one could then account for (dynamic) phenomena such as regression toward the mean. What the figure is meant to emphasize is that the level of uncertainty has a nonlinear effect on the value of dynamic thinking. Whether this point is already obvious is less important than the fact that it is often ignored. As Hart Posen and Daniel Levinthal note, “Although in substance this work is nuanced in its treatment of strategic responses to environmental change, in practice it is often interpreted as suggesting a simple positive relationship between the extent of environmental change and the merits of organizational adaptation.”
And their simulations confirm nonlinear effects of environmental turbulence on the relative mix of exploration versus exploitation, with the former commanding comparatively more attention at intermediate levels of turbulence.
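The flavor of that nonlinearity can be reproduced with a toy simulation loosely in the spirit of the Posen-Levinthal setup, though far cruder than their model (all parameters hypothetical): an epsilon-greedy learner faces a ten-armed bandit whose payoff probabilities are occasionally redrawn, so turbulence erodes the value of what has been learned.

```python
import random

def average_payoff(eps, turbulence, arms=10, periods=500, seed=0):
    """Epsilon-greedy learner on a bandit whose arm payoff probabilities are
    each redrawn with probability `turbulence` every period."""
    rng = random.Random(seed)
    true_p = [rng.random() for _ in range(arms)]   # latent payoff probabilities
    belief = [0.5] * arms                          # learner's running estimates
    pulls = [1] * arms
    earned = 0.0
    for _ in range(periods):
        for i in range(arms):                      # environmental turbulence
            if rng.random() < turbulence:
                true_p[i] = rng.random()
        if rng.random() < eps:                     # explore a random arm ...
            arm = rng.randrange(arms)
        else:                                      # ... or exploit current beliefs
            arm = max(range(arms), key=belief.__getitem__)
        reward = 1.0 if rng.random() < true_p[arm] else 0.0
        earned += reward
        pulls[arm] += 1
        belief[arm] += (reward - belief[arm]) / pulls[arm]  # incremental mean
    return earned / periods

# Compare no, moderate, and heavy exploration across turbulence levels.
for tau in (0.0, 0.02, 0.2):
    print(tau, [round(average_payoff(eps, tau), 3) for eps in (0.0, 0.1, 0.3)])
```

In runs of this toy, exploration tends to pay most at low-to-intermediate turbulence; when the environment is reshuffled constantly, knowledge decays before it can be exploited, so heavy exploration stops earning its keep. The exact numbers depend on the seed and parameters and should not be read as replicating the published simulations.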
Many strategic frameworks, of course, fixate on the opposite end of this continuum: they are either static, which means they offer no basis for learning over time, or they ignore the possibility of failure. (Thus, Kim and Mauborgne's work on value innovation, to take just one example, suffers a bit in this regard despite the undeniable influence it has had; developing value innovations or blue oceans is the type of exercise particularly prone to high failure rates.)
And new arguments about how uncertainty may be expungeable from the scene have been made by invoking, among other things, big data, machine learning, and fantastic increases in processing power as means toward realizing the end articulated two hundred years ago by the French mathematician Pierre-Simon Laplace:
An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom: for such an intellect, nothing would be uncertain and the future just like the past would be present before its eyes.
It is worth noting that since Laplace's time, work in quantum physics, meteorology, nonlinear dynamics, and so on has actually made scientists much less confident about perfect predictability, unprecedented levels of data crunching notwithstanding.
That is the extreme labeled “certain” on the axis involving uncertainty in Figure 4. The other extreme, labeled “unknowable,” is also worth commenting on. This is the case in which uncertainty renews itself at a rate that is impossible to reduce usefully through learning—a point of view built into strategy frameworks that primarily invoke luck, the vagaries of evolutionary processes, or chaotic, nonlinear dynamics (three frameworks that the McKinsey Strategy Theory Initiative lumped into “anti-strategy” twenty years ago, for being too random, along with population ecology and institutional theory, for being too deterministic). Empirical research suggests that the effects of luck, the vagaries of evolution, and so on are not overpowering, on average.
That said, situations exist in which uncertainty is much less easily reducible than in others, and extremes in this regard are what are labeled as “unknowable” in Figure 4.
Similar characterizations of levels of uncertainty and their connections to strategy have previously been provided by others. Thus, Hugh Courtney, Jane Kirkland, and Patrick Viguerie distinguished among four levels of uncertainty, with their Level 1 (“A Clear-Enough Future”) coming closest to the “certain” end of the axis in Figure 4 and their Level 4 (“True Ambiguity”) coming closest to “unknowable.”
They also suggest that intermediate levels of uncertainty predominate in the real world, that Level 1 situations account for most of the rest, and that Level 4 situations are quite rare and tend to migrate toward one of the other levels over time—that is, they do not involve unremitting “unknowability” in the extreme sense that the end of the axis in Figure 4 is meant to imply. And Arnoud de Meyer has a four-level schema that culminates in chaos, which would also seem to come close to the extreme of unknowability.
Note that the latter is meant to be an extreme scenario that rules out learning over time. In the absence of learning, one might have to think through time in a deep way once, but not recurrently over time—whereas truly dynamic thinking, as defined above and reflected in Figure 4, embodies both.
Courtney, Kirkland, and Viguerie also assert that managers are too prone to leap to extreme assumptions about uncertainty—that it is either zero or overpowering—instead of taking more seriously the middle range, which is usually the most common.
In parallel with the earlier point regarding extreme assumptions about irreversibility, this is another liability of dichotomization: it is generally important to steer a middle course, neither ignoring uncertainty nor treating it as overpowering and thus as sufficient justification for “gut-based” decision making. One often useful way of tying uncertainty and irreversibility together is the learn-to-burn rate: the rate at which useful information is being received from a particular course of action divided by the rate at which sunk or opportunity costs are being incurred in pursuing it.
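The learn-to-burn rate lends itself to a simple worked illustration. The sketch below compares two hypothetical courses of action; the numbers and names are invented, not drawn from any source cited here:

```python
# Illustrative sketch of the learn-to-burn rate: the rate at which a course
# of action yields useful information divided by the rate at which it sinks
# costs. All figures below are hypothetical.

def learn_to_burn(info_per_period: float, cost_per_period: float) -> float:
    """Units of useful information gained per dollar sunk, per period."""
    if cost_per_period <= 0:
        raise ValueError("burn rate must be positive")
    return info_per_period / cost_per_period

# Two hypothetical courses of action: a cheap pilot that teaches a lot,
# and a full-scale rollout that teaches little per dollar committed.
pilot = learn_to_burn(info_per_period=8.0, cost_per_period=2.0)     # 4.0
rollout = learn_to_burn(info_per_period=3.0, cost_per_period=12.0)  # 0.25

# When uncertainty is still high, the higher ratio favors the pilot:
# more learning per unit of irreversible commitment.
assert pilot > rollout
```

The ratio is deliberately crude; its point is to force explicit comparison of what an action teaches against what it irreversibly commits.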
Having emphasized the general importance and usefulness of learning opportunities, one could go further and try to specify how their structure ties in with “better” (i.e., dynamically more useful) strategy frameworks. It seems that in a number of dynamic strategy frameworks, there is only one big thing to be learned, and once that happens, the game is, in that sense at least, over. Think, for instance, of Christensen's theory of disruptive innovation, still highly influential despite the weaknesses in its competitive logic and empirical base that have recently been pointed out.
But work in cybernetics by Ross Ashby and others suggests that adaptive self-stabilization in an ever-changing environment requires not simple rules but a double-loop system: one in which the lower-level loop might work like a thermostat, while a second, higher-level loop can reprogram it.
Organization theorists Chris Argyris and Donald Schön developed an analogous distinction between single-loop and double-loop organizational learning.
And Joan Ricart i Costa and I have reframed and analyzed such notions in terms of the tension between two types of efficiency (or actually, efficiency-oriented search processes): static efficiency, which involves continuous search for improvements within a fixed set of initial conditions, and dynamic efficiency, which involves reconsideration of initial conditions.
The key point here is the distinction between two ways of thinking dynamically about strategy: those that encourage reconsideration of the broader lay of the land versus those entailing relentless pursuit of one particular path through it. And inevitably, strategy frameworks that focus on one kind of dynamic—whether the experience curve in the old days or disruptive innovation today—are somewhat incomplete in this sense. At a minimum, alertness to other types of dynamics or, more broadly, ways of reframing the discussion remain essential.
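The single-loop versus double-loop distinction can be made concrete with a minimal sketch in the spirit of Ashby and of Argyris and Schön. The thermostat and its setpoint logic are hypothetical illustrations, not implementations from the cited sources:

```python
# Minimal sketch of single-loop vs. double-loop adaptation. The setpoint
# logic is a hypothetical illustration of the distinction, not code from
# Ashby or from Argyris and Schon.

class Thermostat:
    """Single (lower-level) loop: corrects deviations from a fixed setpoint."""
    def __init__(self, setpoint: float):
        self.setpoint = setpoint

    def correct(self, temperature: float) -> str:
        # Static efficiency: search for improvement within fixed conditions.
        if temperature < self.setpoint:
            return "heat"
        if temperature > self.setpoint:
            return "cool"
        return "hold"

def reprogram(t: Thermostat, occupants_home: bool) -> None:
    """Second, higher-level loop: reconsiders the initial conditions
    (the setpoint itself) rather than just correcting deviations."""
    t.setpoint = 21.0 if occupants_home else 15.0

t = Thermostat(setpoint=21.0)
assert t.correct(18.0) == "heat"    # single-loop correction

reprogram(t, occupants_home=False)  # double-loop: the goal itself changes
assert t.correct(18.0) == "cool"    # same input, new frame, new response
```

The same input produces a different response once the higher-level loop has reconsidered the frame, which is the sense in which frameworks locked onto one path (one fixed setpoint) are incomplete.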
It is worth adding that similar thoughts about strategy are suggested in Sir Lawrence Freedman's magisterial recent book on strategy in war, politics, and business. According to him,
A strategy could never really be considered a settled product, a fixed reference point for all decision-making, but rather a continuing activity, with important moments of decision. Such moments could not settle matters once and for all, but provided the basis for moving on until the next decision.
One of the reasons I like this quote very much is probably that it coincides with the thrust of my 1991 book, Commitment, which identified important moments of decision in terms of their irreversibility and provided a framework for analyzing them.
Looking across both dimensions—irreversibility and uncertainty—suggests the utility of focusing on midrange theories, involving intermediate settings along both dimensions. Such midrange theories are of particular interest in many other areas of business economics and finance as well. Modern contract theory is useful only if contracts are neither complete nor completely ineffective. The theory of business competition applies to the (large) space between perfect competition and a monopolist unthreatened by entry. In the international domain, I have stressed that semiglobalization is both empirically realistic and logically necessary for international business to have distinctive content: with zero cross-border integration, countries would all be isolated and country-by-country analysis would suffice; with complete cross-border integration, the world would in effect be one large country. And closest to the domain scouted in this paper, Levinthal has argued for a focus on intermediate levels of rationality in the strategy field.
Such midrange theories and the nonlinearities implicit in them are worthy of attention even though they cannot be fitted into that favorite strategic communications device: a two-by-two matrix with a categorical imperative for each of its four cells. Note that Figure 4 resorted, in effect, to a three-by-three structure.
Finally, even if you remain unconvinced that intermediate levels—defined broadly—of irreversibility and uncertainty should be treated as markers of particular strategic interest in a dynamic context, the very plethora of strategic frameworks does suggest that something is to be said for sorting through them. In other words, there is a case for shifting at least some attention from the accumulation of frameworks to their assessment.
This update of my 2002 article in Business History Review has concluded that a drop-off does seem to have occurred in the development of new ideas about strategy since a peak in the mid- to late 1990s, but that an even earlier pattern—attention to explicitly dynamic strategy frameworks—has persisted despite the drop-off. And this article has also proposed two criteria—intermediate levels of irreversibility and of uncertainty—for sorting through our still rapidly growing stock of dynamic frameworks to see whether they actually require truly dynamic thinking. Quite a few putatively dynamic frameworks seem subject to some practical limitations in these respects.
It is worth adding that it is also possible to use screens to sort through our stock—even larger, according to BCG's classifications—of static frameworks. Reconsider, for instance, the two sets of ideas about strategy emphasized toward the end of the static portion of my 2002 article: Porter's five-forces framework for the structural analysis of industries and the insights developed by a range of authors regarding the analysis of relative cost and willingness-to-pay to determine competitive position.
Both of these sets of ideas continue to seem relevant decades later. In regard to the former, there are a number of indications that the recent surge in the average profitability of U.S. business is at least partly due to increases in domestic concentration. Thus, according to the Economist, the percentage of revenues accounted for by concentrated industries—those in which the four biggest firms together control more than one-third of the market—increased from 28 percent of the total in 1997 to 42 percent in 2012.
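The Economist's measure can be made concrete with a small sketch: compute each industry's four-firm concentration ratio (CR4) and then the share of total revenues coming from industries where CR4 exceeds one-third. The industry names and revenue figures below are invented for illustration:

```python
# Illustrative sketch of the Economist-style concentration measure: the
# share of total revenues accounted for by industries whose four biggest
# firms together control more than one-third of the market. Data invented.

def cr4(firm_revenues: list[float]) -> float:
    """Four-firm concentration ratio: top-4 share of industry revenue."""
    total = sum(firm_revenues)
    top4 = sum(sorted(firm_revenues, reverse=True)[:4])
    return top4 / total

industries = {
    # industry: hypothetical firm revenues ($bn)
    "airlines":    [40, 35, 30, 25, 5, 5],   # concentrated
    "restaurants": [3, 3, 2, 2] + [1] * 40,  # fragmented
}

concentrated_rev = sum(sum(revs) for revs in industries.values()
                       if cr4(revs) > 1 / 3)
total_rev = sum(sum(revs) for revs in industries.values())
share = concentrated_rev / total_rev  # fraction of revenue in
                                      # concentrated industries
```

With these invented figures, the airline industry clears the one-third threshold while the restaurant industry does not, so only the former's revenues count toward the concentrated share.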
And analyses of costs and willingness-to-pay as fundamental primitives are emphasized not just in strategy textbooks but also in how-to books meant to help readers achieve both personal and professional objectives.
The longevity of both sets of ideas suggests additional kinds of screens that might be used to sort through ideas about strategy. Porter's five-forces framework, and particularly his emphasis on concentration and barriers to entry (which are related, of course), was based on hundreds of empirical studies in industrial organization economics through the mid-1970s. This sort of empirical base might be treated as another screen to be applied to strategy frameworks. And while many best-selling strategy books claim to be based on rigorous research, it is worth recalling Phil Rosenzweig's (2007) strictures about the delusion of rigorous research and particularly his explanations of why “some of the most successful business books of recent years, perched atop the bestseller lists for months on end, cloak themselves in the mantle of science, but have little more predictive power than a pair of coconut headsets on a tropical island.”
Unless you are an expert on selection bias, matching techniques, endogeneity, and the like, it is better not to take at face value an author's claims that a book is based on rigorous research, but instead to look for corroboration from independent researchers (i.e., a broad empirical base).
Analyses of costs and willingness-to-pay actually emerged, unlike Porter's five-forces framework, from business practice—particularly the synthesizing efforts of consulting firms. But, as noted above, we now have formal theoretical analyses, using very different methods, by Sutton and Brandenburger and Stuart, that deduce robust bounds on the flow of firms’ (operating) profits given opportunity costs and buyers’ willingness-to-pay. This kind of theoretical base should bolster one's confidence that these ideas, while static in one sense, are ideas for the long term.
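The logic behind those bounds treats willingness-to-pay and opportunity cost as the fundamental primitives: whatever price is struck, buyer surplus and seller profit together exhaust the value created, so operating profit is bounded above by that value. A minimal sketch, with invented numbers:

```python
# Minimal sketch of the cost/willingness-to-pay primitives discussed above,
# in the spirit of Brandenburger and Stuart's value-based analysis.
# All numbers are invented for illustration.

def value_created(willingness_to_pay: float, opportunity_cost: float) -> float:
    """The total pie a transaction can create, to be divided between
    buyer surplus (WTP - price) and seller profit (price - cost)."""
    return willingness_to_pay - opportunity_cost

wtp, cost, price = 100.0, 60.0, 75.0
pie = value_created(wtp, cost)  # 40.0
buyer_surplus = wtp - price     # 25.0
seller_profit = price - cost    # 15.0

# However the price is set, the two shares exhaust the pie, so operating
# profit can never exceed the value created.
assert buyer_surplus + seller_profit == pie
assert 0 <= seller_profit <= pie
```

Moving the price within the (cost, WTP) interval only redistributes the pie between the two parties, which is why the bound on profit is robust to the details of how prices get set.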
To summarize, we should be able to sort through our large and still-growing stock of practical ideas about strategy by using logical, theoretical, and empirical screens. A framework with a solid grounding in at least one of these respects is likely to prove more reliable in terms of its instrumental utility than one that is deficient in all three.
And to recognize as much is to shift some attention away from chronologies of frameworks to historiography that attempts to assess them in some fashion.