Introduction
While most of the prior work in strategy-as-practice (SAP) research has been conceptual or qualitative in nature (e.g., Kohtamäki et al. 2022), there is also potential in studying strategy practices quantitatively. A number of benefits can be gained in comparison to a solely qualitative research orientation. Qualitative methods have advanced the strategic management field with groundbreaking theoretical and empirical insights. Their importance in theory building is incontestable, as demonstrated by some of the highly influential qualitative articles from our field (e.g., Barley 1986; Brown and Eisenhardt 1997; Burgelman 1983). Despite these advantages, however, reliance on a single dominant research method can also be constraining. A broader range of methods may be useful for examining the macro-level patterns emerging from micro-level data, for establishing boundary conditions or for showing that qualitative insights also have broader generalizability (Edmondson and McManus 2007). Moreover, the innovative use of quantitative methods could lead to novel insights that might not be achievable with purely qualitative research designs (e.g., Whittington, Yakis-Douglas and Kwangwon 2016). Finally, calls have been made to examine the performance implications of specific practices across populations of organizations through systematic, large-scale, quantitative studies (Bromiley and Rau 2016).
SAP research has historically relied strongly on qualitative data and related research designs in order to gain a deeper understanding of the micro-level strategy practices that the dominant quantitative research methods could not capture. Because of this important mission, an epistemic culture has emerged over time around the study of strategy practices. The term ‘epistemic culture’ refers to how a research community generates knowledge. It is an implicit property and can be inferred from the dominant research practices at work in a research stream (Knorr Cetina 1999). The epistemic culture of SAP research has been strongly influenced by sociological practice theory, in which the use of qualitative research methods has been particularly prominent.
While the epistemic culture of a research stream plays a strong role in the choice of a research method, the maturity of the research field also drives decisions on appropriate methods. SAP research has experienced enormous growth in the past decade (e.g., Kohtamäki et al. 2022). It has arguably entered an increasingly consolidated and mature ‘harvesting’ phase (Jarzabkowski, Seidl and Balogun 2022), such that the concepts and relationships studied by SAP scholars have become increasingly refined and precise. With increasing maturity, research on strategy practices may continue to benefit from the insights emerging from in-depth qualitative research. To move an increasingly mature field forward, however, it may also benefit from the use of quantitative research designs (Edmondson and McManus 2007).
Accordingly, we call for a more prevalent use of quantitative research in the SAP domain. SAP research will benefit from an increasing formalization and fine-tuning of established concepts and the constructs used to measure them, which will also help establish enhanced comparability of findings across researchers and contexts. Specifically, we argue that mixed-method studies that combine qualitative and quantitative data hold major potential in the field of SAP research (see, for example, Jick 1979; Netz, Svensson and Brundin 2020). On the one hand, adding quantitative data to qualitative studies may help in providing more and different types of evidence and in producing further generalizability for some of the more exploratory findings (e.g., Creswell 2008). On the other hand, quantitative analysis can be used to identify patterns of behaviour that can then be zoomed in on with the help of qualitative analysis.
The chapter proceeds as follows. We start by briefly reviewing and discussing the use of quantitative research methods in closely related strategy research streams: upper echelons research, which focuses on top management teams; research on middle management; strategic decision-making; strategic consensus; and strategic issues and initiatives. After synthesizing the lessons learned with the established quantitative methods, we discuss a number of innovative quantitative research methods that could further advance SAP research. These methods include computer-aided content analysis, topic modelling and machine learning (ML), network analysis, sequence analysis and event history analysis. We offer a brief introduction to each method and highlight possible avenues for studying strategy practices. The discussion of the different quantitative research methods is by no means meant to be exhaustive. Instead, we focus on a set of quantitative research methods, both established and novel, that SAP researchers can potentially use to enrich their research designs.
Quantitative Work in Related Streams
Despite the scarcity of quantitative research on strategy practices, there is research in closely related streams that one can build on. For example, behavioural strategy (e.g., Bolinger et al. 2022; Powell, Lovallo and Fox 2011), micro-foundations of strategy (e.g., Felin, Foss and Ployhart 2015; Foss and Pedersen 2016) and strategy process (e.g., Burgelman et al. 2018) researchers have benefited from the use of quantitative research methods to study related phenomena. For instance, Reitzig and Sorenson (2013) analysed the failure of organizations to adopt an idea or innovation due to organizational behavioural biases by examining data on innovation proposals inside a large, multinational consumer goods firm. Relatedly, Tarakci et al. (2018) examined data collected from 123 senior middle managers of a Fortune 500 firm to understand what drives middle managers to search for new strategic initiatives and champion them to top management, and how they respond to different types of performance feedback.
Among the different phenomena, prior research on top management teams, middle management, strategic decision-making, strategic consensus, and strategic issues and initiatives provides potentially useful examples on which to build one’s own research on strategy practices. Like SAP research, which focuses on practices, praxis and practitioners, these research streams are interested in the role of practitioners (e.g., top management teams (TMTs), board members and middle management, but also frontline employees and external actors, such as consultants, advisers or other stakeholders), practices (e.g., certain strategy-making conventions and tools) and praxis (the concrete use of strategy-making tools and the concrete enactment of strategy-making conventions, e.g. during strategy meetings).
When making the move from a qualitative research design to a quantitative one, the central decisions relate to: (1) the choice of the object of analysis; (2) the development or use of established constructs that can be measured and replicated across the different objects of analysis; (3) the development of appropriate control variables to account for alternative theoretical explanations and to control for the contextual differences across the different objects of analysis; and (4) the logic of reasoning as to how the different objects of analysis could cause the outcome variable of interest. We discuss these choices briefly next in the context of the related research streams.
Top Management Teams
The common objects of analysis in the research on top management teams are the team members – such as the CEO, CFO, COO or CSO (Menz and Scheef 2014) – or the top management teams across firms. Since Hambrick and Mason (1984) first proposed that the demographic backgrounds of top management team members are likely to matter in terms of their behaviour, quantitative research into TMTs has expanded rapidly. The ease of measurement has enabled researchers to develop different kinds of quantitative research settings, which have helped deepen the understanding of the effects of TMT member characteristics and heterogeneity on top management team dynamics across firms. Over time, increasingly sophisticated constructs have emerged to go beyond the original demographic variables. On the one hand, research has gone deeper into measuring and examining the different characteristics of top management team members, such as their personality characteristics (Chatterjee and Hambrick 2007, 2011; Hayward and Hambrick 1997) or cognitive characteristics (Graf-Vlachy, Bundy and Hambrick 2020). On the other hand, the emergence of a construct measuring managerial discretion has enabled researchers to explain why CEOs and TMTs do not always matter equally (Carpenter and Golden 1997; Finkelstein and Hambrick 1990).
It has also been found that, for example, managerial discretion is highly context-dependent and can play out differently in different cultural contexts (e.g., Crossland and Hambrick 2011). Moreover, connecting different constructs to outcomes requires an understanding of the interdependencies and the processes with which the TMT interacts, which has led to research on TMT interdependencies and team processes (Barrick et al. 2007). The relationship between the research on top management teams and SAP is intriguing. When making sense of the relationship, Don Hambrick noted in his keynote speech at the European Academy of Management conference in June 2014 that he sees himself as an epidemiologist, aiming to understand the overall patterns in the data, and SAP research as the work of microbiologists trying to figure out the specific biological mechanisms at play.
Middle Management
The common object of analysis in research on middle management is an individual middle manager (see, e.g., Wooldridge and Floyd 1990; Wooldridge, Schmid and Floyd 2008). As with research on top management, middle management and its role represent a fruitful arena for quantitative analysis. An advantage of quantitative research on middle managers is the possibility of creating a sufficiently large sample for quantitative analysis even from within one firm (see, e.g., Glaser, Stam and Takeuchi 2016). Yet we see this as a still underutilized research opportunity.
Even though there is already an extensive body of quantitative research on the role of middle management (for a review, see Wooldridge, Schmid and Floyd 2008), there is significantly less work on it than, for example, on the role of top management teams. Recent research has started to investigate the role of middle management characteristics in innovation outcomes (see Heyden, Sidhu and Volberda 2015; Schubert and Tavassoli 2020). Some of the most central constructs of this research stream relate to the roles of middle managers in relation to strategy and the nature of their involvement in strategy development (Collier, Fishwick and Floyd 2004; Floyd and Wooldridge 1992; Wooldridge and Floyd 1990). While there has also been work on the contextual determinants of the roles and perceptions of middle managers in different organizational contexts (Currie and Procter 2005; Floyd and Wooldridge 1992; Thomas and Ambrosini 2015), there is still a major further research opportunity on this front as well. Finally, the common logic of reasoning in research on middle management is that involvement in the strategy process tends to create the highest commitment and that middle management behaviours play a major role in influencing strategy implementation (e.g., Guth and Macmillan 1986; Mantere 2008; Ren and Guo 2011).
Strategic Decision-Making
The common object of analysis in strategic decision-making research is either an individual strategic decision or the process that led to the decision. The central constructs in this research stream relate to the formalization (Papadakis, Lioukas and Chambers 1998), comprehensiveness (Atuahene-Gima and Li 2004; Fredrickson 1984; Fredrickson and Mitchell 1984) and speed (Baum and Wally 2003; Eisenhardt 1990) of the process. The moderating influence of different environmental and organizational contexts – dynamic versus mature – on strategic decision processes and strategic decision-making performance has also been extensively studied, with somewhat mixed results (Hough and White 2003; Papadakis, Lioukas and Chambers 1998). The common logic of reasoning in strategic decision-making research is that comprehensiveness and rationality contribute to decision-making performance in general, and that the speed of decision-making contributes to strategic decision-making performance in fast-moving, turbulent environments. More recently, scholars have delved into the micro-cognitive underpinnings of strategic decision-making (see Alves et al. 2021; Laureiro-Martínez and Brusoni 2018; Laureiro-Martínez et al. 2015; Mitchell, Shepherd and Sharfman 2011). Together, these findings may be deepened further with the SAP lens.
Strategic Consensus
Research on strategic consensus can in some respects be seen as an integrative area that cuts across the above research streams. The common object of analysis in strategic consensus research is either a strategic decision around which the consensus is formed or a set of teams or organizations that either reach or do not reach consensus (Bourgeois 1980; Dess 1987; Iaquinto and Fredrickson 1997; Kellermanns et al. 2005; Tarakci et al. 2014). The most central constructs of the quantitative research on strategic consensus relate to the locus and content of consensus (e.g., Kellermanns et al. 2005), and extensive research shows that the performance implications of consensus are context-, organization- and decision-dependent (Henderson and Mitchell 1997; Homburg, Krohmer and Workman 1999; Kellermanns et al. 2005, 2011; Walter et al. 2013).
The common logic of reasoning in this research is that reaching consensus is positively related to team and firm performance and that the relationship is moderated by organizational and environmental context. A group of researchers from Rotterdam University put forward an innovative quantitative method for mapping organizational consensus in a multi-team setting (Tarakci et al. 2014). Using a survey of different teams’ rankings of their own priorities regarding their firm’s strategy, the authors were able to use multidimensional scaling to map the different team-level cognitions in the organization and demonstrate how different teams’ priorities differ from each other. The method provides an interesting way to examine organizations simultaneously on three levels – organization, team and individual – and to develop an understanding of the effects of strategic interventions on changes in priorities and strategic consensus.
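To make the mapping idea concrete, the following sketch applies classical (Torgerson) multidimensional scaling to team-level priority rankings. The teams, the rankings and the choice of Euclidean distance are invented for illustration and are not taken from Tarakci et al. (2014):

```python
import numpy as np

# Hypothetical data: each team ranks the same five strategic priorities
# (1 = most important). Rows are teams, columns are priorities.
rankings = np.array([
    [1, 2, 3, 4, 5],   # top management team
    [1, 3, 2, 5, 4],   # sales team
    [5, 4, 1, 2, 3],   # R&D team
    [4, 5, 2, 1, 3],   # operations team
], dtype=float)

n = len(rankings)
# Squared Euclidean distances between the teams' ranking profiles.
d2 = ((rankings[:, None, :] - rankings[None, :, :]) ** 2).sum(axis=-1)

# Classical (Torgerson) MDS: double-centre the squared distance matrix
# and take the two leading eigenvectors as map coordinates.
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ d2 @ J
eigvals, eigvecs = np.linalg.eigh(B)        # eigenvalues in ascending order
top = np.argsort(eigvals)[::-1][:2]         # indices of the two largest
coords = eigvecs[:, top] * np.sqrt(np.maximum(eigvals[top], 0.0))

# Teams whose priorities agree end up close together on the map.
for team, (x, y) in zip(["TMT", "Sales", "R&D", "Operations"], coords):
    print(f"{team:<10} ({x:+.2f}, {y:+.2f})")
```

In this toy example the TMT and the sales team, whose rankings nearly agree, land close together on the resulting map, while the R&D and operations teams form a separate cluster; in a real study the distances between clusters would indicate where strategic consensus is lacking.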
Strategic Issues and Initiatives
Finally, the common objects of analysis in research on strategic issues and initiatives tend to be the strategic issues that a firm is facing (Chattopadhyay, Glick and Huber 2001; Dutton and Ashford 1993; Dutton et al. 1997; Laamanen et al. 2018; Thomas and McDaniel 1990) or the strategic initiatives that it is launching and running (e.g., Lechner and Floyd 2012; Lechner, Frankenberger and Floyd 2010). The common constructs in strategic issue research relate to managers’ interpretations of strategic issues, that is, events that have the potential to affect organizational objectives (Ansoff 1980). For example, strategic issue management researchers have found that managers’ perception of issues as either opportunities or threats (Jackson and Dutton 1988) importantly shapes outcomes.
In contrast, the research on strategic initiatives tends to focus more on the effects of the explorative versus exploitative nature of the initiatives (Lechner and Floyd 2007, 2012), how they are coordinated at the corporate level (Lechner and Kreutzer 2010) or how they are positioned in the inter-group networks inside the firm (Lechner, Frankenberger and Floyd 2010). Both research streams are interested in contextual influences, either on strategic issue interpretation or on strategic initiative performance. The common logic of reasoning is that the nature of the issue or initiative, and how it is managed, is related to the performance of the firm. Taking into account the large number of issues and initiatives that even individual large firms are running, there is a major further research opportunity in this area to analyse more deeply the strategy practices associated with the management of strategic issues and initiatives.
As a whole, the five phenomenon-based research streams can provide a foundation on which one can build one’s own quantitative research on strategy practices. While the research on strategy practices adds the more micro-level sociological lens to the study of organizational phenomena, the five related research streams provide a wealth of well-established constructs, relationships and insights into the contextual moderators on which to build a quantitative research agenda.
Lessons Learned for Strategy Practice Research
A number of criteria for assessing the quality of quantitative research have emerged over time. SAP scholars can take these on as ‘best practices’ when examining focal processes and phenomena through a quantitative lens. These include: (1) using established constructs; (2) avoiding common method bias; (3) controlling for unobserved outliers; (4) examining reverse causality; and (5) avoiding endogeneity. Countering these potential biases when developing the empirical research design helps alleviate some of the most common challenges that typically emerge in the journal review processes. We discuss them briefly below.
Use of established constructs. As there is typically extensive, contextually rich data available on a firm’s strategy practices and the context surrounding them, it is easy to get distracted by the richness of the data. In order to build on prior work and to provide a basis for future studies in similar and different contexts, it is useful to complement the richness of the explanation with a quantitative analysis using established, pre-tested constructs. If such constructs do not exist, the creation and testing of new constructs could in itself be an important contribution that moves the research area forward.
Common method bias. When one carries out surveys, either within a single organization or across multiple organizations, there is a danger of the results being confounded by common method bias. If, for example, one measures both the dependent and the independent construct based on the perception of the same respondent without triangulating with external data, there is a danger that the characteristics of the respondent affect the results more than the actual constructs that one is interested in studying.
Unobserved outliers. As there are often multiple other events going on in parallel in an organization, it is important to be able to control for the other events occurring at the same time. For example, a firm could be engaging in particular types of strategy activities because of increasing competitive pressure from the market, and it could be the market pressure, rather than the strategy practices, that drives the dependent variable.
Reverse causality. With cross-sectional data collection, it is commonly impossible to establish the direction of causality when two variables relate to each other. In order to say something about cause and effect, it would be useful to have multiple waves of surveys at different points in time and then examine how a change in the independent variable relates to a change in the dependent variable. Another alternative is to combine quantitative survey-based evidence with qualitative research to help interpret the relationship.
Endogeneity. Endogeneity is a common problem in most strategy research, because many organizational behaviours are path-dependent. An example of a possible endogeneity problem would be an analysis that relates the capabilities of a strategic issue management team to the successful resolution of the strategic issue. One might be surprised to find that the relationship is statistically significantly negative. This might not be because better-quality strategic issue management teams do not do good work, however. Rather, it could be that the difficulty of the strategic issue affects both the type of issue management team that is assigned to the task – a more capable team for the more difficult strategic issues – and the difficulty of reaching a successful outcome. There are a number of ways to alleviate endogeneity, ranging from two-stage regression analysis to the inclusion of further control variables. Often, however, the influence of endogeneity cannot be fully eliminated from the analyses.
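To illustrate how endogeneity can flip a relationship and how two-stage regression can correct it, consider the following simulation. The variable names, effect sizes and the instrument (say, which teams happened to be available when the issue arose) are all invented for the sake of the example:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

# Unobserved issue difficulty drives both which team is assigned
# (harder issues get more capable teams) and the outcome.
difficulty = rng.normal(size=n)
availability = rng.normal(size=n)   # instrument: unrelated to difficulty
capability = difficulty + availability + rng.normal(size=n)
resolution = 1.0 * capability - 4.0 * difficulty + rng.normal(size=n)

def ols(y, x):
    """Intercept and slope from a simple OLS regression of y on x."""
    X = np.column_stack([np.ones(len(x)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Naive OLS: capability looks harmful because difficulty is omitted.
naive = ols(resolution, capability)[1]

# Two-stage least squares: stage 1 predicts capability from the
# instrument; stage 2 regresses the outcome on that prediction.
b0, b1 = ols(capability, availability)
capability_hat = b0 + b1 * availability
two_stage = ols(resolution, capability_hat)[1]

print(f"true effect 1.0 | naive OLS {naive:+.2f} | 2SLS {two_stage:+.2f}")
```

In this simulated setting the naive slope comes out negative even though the true causal effect of capability is positive; instrumenting capability with a variable that is unrelated to issue difficulty recovers the positive effect. Finding a credible instrument in real organizational data is, of course, the hard part.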
Innovative Research Approaches for Quantitative Research on Strategy Practices
SAP research typically involves the mapping of individual- or group-level activities in the process of strategizing, potentially complemented through links to meso- or macro-level outcomes or events. Reflecting the need to do in-depth research, descriptive, process and case studies tend to dominate current research on strategy practices. They offer the richness of explanation necessary to uncover novel phenomena and to understand them holistically. Quantitative methods are useful in complementing the qualitative empirical approaches. They enable one to move from a focus on micro-level patterns and principles to developing causal theories, making predictions, adding contingencies and yielding external validity to the research (Kaplan 2007). In this section, we present several quantitative methodologies and their usefulness in: (1) quantifying qualitative data surrounding strategy practices; (2) formalizing the sequential nature and temporal aspects of strategizing processes; and (3) creating a formal link to meso- or macro-events, outcomes or antecedents. Several quantitative methods seem particularly promising in assisting the realization of these goals:
Computer-aided content analysis: to deductively quantify and track the evolution of qualitative data, such as speeches, texts, images and videos on strategy practices;
Topic modelling and machine learning (ML): to apply algorithm-supported induction and artificial intelligence in the analysis of text, images and organizational digital exhaust (e.g., calendar invites, intranet documents, meeting notes) to understand strategy practices;
Network analysis: to quantify the relational nature of strategy practice data, for example, patterns of interactions between practitioners through the study of communication networks, etc.;
Sequence analysis: to uncover and formalize the sequential patterns with which strategy practices unfold over time; and
Event history analysis: to model the durational aspects with which strategy practices unfold over time and the circumstances that increase/decrease the likelihood for this process.
These five methods are not substitutes for qualitative in-depth studies or for each other. They can be used in multiple complementary ways. Content or network analyses can serve to translate qualitative data to the quantitative research domain. ML and optimal matching algorithms are able to capture patterns and sequences, whereas event history analysis models the transition rate of moving from one state to another and helps explain how other variables affect this transition rate. The different methods can be combined with each other or with qualitative data sources in mixed-method research designs. Alternatively, they could be integrated into set-theoretic approaches to study organizational configurations and explore asymmetric causality (Fiss 2007; Ragin 2014).
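To give a feel for how such methods operate, the sketch below implements the edit-distance computation at the heart of optimal matching, applied to hypothetical strategy-making episodes coded by activity; the activity codes and the cost settings are illustrative assumptions:

```python
def optimal_matching_distance(seq_a, seq_b, indel=1.0, sub=1.5):
    """Edit distance between two coded activity sequences, with
    user-set insertion/deletion (indel) and substitution costs."""
    m, n = len(seq_a), len(seq_b)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = i * indel
    for j in range(1, n + 1):
        d[0][j] = j * indel
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + indel,       # deletion
                          d[i][j - 1] + indel,       # insertion
                          d[i - 1][j - 1] + cost)    # substitution / match
    return d[m][n]

# Hypothetical strategy-making episodes, coded by activity:
# P = planning workshop, M = board meeting, R = review, C = consultation
episode_1 = ["P", "M", "R", "R", "C"]
episode_2 = ["P", "C", "M", "R", "C"]
episode_3 = ["C", "C", "C", "M", "P"]

print(optimal_matching_distance(episode_1, episode_2))  # similar episodes
print(optimal_matching_distance(episode_1, episode_3))  # dissimilar episodes
```

The resulting pairwise distances between episodes can then feed into clustering or multidimensional scaling to identify typical strategizing sequences across an organization or a sample of firms.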
In the following sections, we briefly introduce this quantitative methodological toolkit, link it to common threads in SAP research, discuss how to use the methodologies and provide pointers for further literature and some of the available toolkits. We place the ‘application-focused’ part of our writing into separate boxes.
Computer-Aided Content Analysis
Computer-aided content analysis (CCA) has a long tradition and a broad field of application across the social sciences. CCA seeks to deductively quantify the content of textual data, as well as of non-textual data, including visual, auditory and motion data. While CCA is still widely applied, strategy research is also increasingly making use of topic modelling and ML techniques based on algorithm-based induction. In the following two sections, we review both approaches.
Text-Based Content Analysis
CCA seeks to quantify the content of textual data by looking for the prevalence of certain keywords. The underlying idea of standard computer-aided text analysis (CATA) is that the frequency of word use is a reflection of cognitive saliency (Huff 1990). Categories of language reflect how people perceive the world. As such, content analytical techniques are particularly worthwhile for capturing cognitive, cultural, communicative or discursive processes and phenomena (see mechanics in Box 32.1). For instance, several highly cited discourse-analytic studies rely on CATA to substantiate or complement their findings (e.g., Fiss and Hirsch 2005). Although CATA offers higher reliability than human coding with lower cost and greater speed, a review of content analysis in organizational studies finds that fewer than 25 per cent of the articles in major management journals in the past twenty-five years used CATA in their content analysis processes (Duriau, Reger and Pfarrer 2007).
Text analysis typically serves one of two purposes: the categorization of texts or the scaling of texts according to some scale of interest – for example, the conservative–liberal spectrum. For instance, Figure 32.1 illustrates the relative distributions of words that may serve as indicators for the underlying conceptualizations of a technological change across companies A, B and C. This may enable researchers to compare differences in attention allocation patterns across companies.

Figure 32.1 Word counts as differing attention allocations, stratified by companies
Structure of analyses: to conduct analyses, algorithms typically require the body of texts and an analysis scheme that specifies the parameters of the study to be performed – for instance, keyword lists. Here, one can draw on predefined dictionaries, build one’s own dictionary of keywords (which requires validation to ensure internal validity) or use a combination of both (see Graf-Vlachy, Bundy and Hambrick 2020). Frequently used software packages for CATA include LIWC, DICTION, ProSuite, WordStat, QDA Miner and General Inquirer.
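As a minimal sketch of the dictionary-based logic such packages implement, the following counts category keywords relative to text length. The two-category dictionary and the sample statement are invented for illustration and are not drawn from LIWC or DICTION:

```python
import re
from collections import Counter

# Invented mini-dictionary: two categories of strategy-related language.
dictionary = {
    "opportunity": {"opportunity", "growth", "potential", "promising"},
    "threat": {"threat", "risk", "decline", "pressure"},
}

def cata_scores(text):
    """Relative frequency of each dictionary category in a text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = max(len(words), 1)          # avoid division by zero
    return {cat: sum(counts[w] for w in keywords) / total
            for cat, keywords in dictionary.items()}

statement = ("The merger is a growth opportunity with real potential, "
             "although competitive pressure remains a risk.")
print(cata_scores(statement))
```

In an actual study, the dictionary would be validated against human coding, and the scores would be computed across a corpus of, for example, annual reports, press releases or strategy meeting transcripts.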
When working with CATA tools, there are two main choices: developing custom dictionaries, which is an iterative and time-consuming process, or making use of existing dictionaries (Neuendorf 2002). The predefined dictionaries in the DICTION software program, for example, have been used to examine the language of charismatic leadership (Bligh, Kohles and Meindl 2004). Research has also drawn upon Martindale’s Regressive Imagery Dictionary to differentiate between image- versus concept-based words and their respective impacts on charismatic leadership (see Emrich et al. 2001; Seyranian and Bligh 2008). Alternatively, the dictionaries in LIWC have been used to study social evaluations of the focal firm or CEO characteristics. To illustrate, LIWC has been used to capture the (positive or negative) tenor of newspaper data in order to study, for example, an actor’s legitimacy, reputation or celebrity (see Pfarrer, Pollock and Rindova 2010). Furthermore, it has been used to study CEOs’ temporal focus (see Nadkarni and Chen 2014) or cognitive complexity (see Graf-Vlachy, Bundy and Hambrick 2020). As Pfarrer, Pollock and Rindova (2010: 1146) emphasize:
content analysis techniques can help bridge the gap between large-sample archival research, which may suffer from internal validity issues, and small sample research, which allows for the collection of primary data and in-depth analyses but may suffer from external validity problems. Analyzing the content of press releases, media coverage, or stakeholder blogs can enhance archival research (which has been criticized for failure to provide insight into cognitive processes), while maintaining the advantages of using large samples.
Although Short et al. (2010) highlight that construct validity is still often a problematic issue in content analysis, they provide specific guidelines for ensuring construct validity in the application of CATA tools. Human coding procedures remain relevant when working with automated content analysis tools, specifically for the origination of content analysis schemes that eventually become CATA algorithms, for the measurement of highly latent constructs and for the ongoing validation of CATA measures. In this respect, scholars have used more qualitative analysis as an input to the more quantitative content analysis. For instance, scholars have used card-sorting techniques to uncover concept categories that then serve as input for the content analysis (Nadkarni and Barr 2008). Moreover, scholars have started to expand and complement the established dictionaries by creating their own content analysis algorithms with features that are not covered by the traditional LIWC or DICTION algorithms (see Graf-Vlachy, Bundy and Hambrick 2020; Nadkarni, Pan and Chen 2019). For instance, a core focus has been on the elaboration of algorithms for the study of sentiments, that is, the attitude that one holds towards a particular entity. These so-called sentometrics studies focus on the computation of sentiment from any type of qualitative data (see Algaba et al. 2020). The related software is available in open-source form (see, for example, https://sentometrics-research.com). Furthermore, to ensure internal and external validity, scholars have conducted separate studies to validate the archival measures (Nadkarni and Chen 2014).
Others have combined multiple qualitative, semiotic and quantitative content analyses in the study of cultural registers and related keywords (Weber 2005).
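The basic logic of dictionary-based CATA can be sketched in a few lines of code. The word lists below are invented for illustration only; an actual study would rely on validated dictionaries (such as the LIWC categories) or on a carefully constructed and validated custom list:

```python
import re
from collections import Counter

# Hypothetical keyword dictionaries; real studies would use validated
# dictionaries (e.g., LIWC categories) or custom, validated word lists.
DICTIONARIES = {
    "positive": {"growth", "success", "strong", "opportunity"},
    "negative": {"decline", "risk", "weak", "loss"},
}

def cata_scores(text):
    """Return, per category, the share of words matching the dictionary."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = sum(counts.values())
    return {
        category: sum(counts[w] for w in keywords) / total
        for category, keywords in DICTIONARIES.items()
    }

press_release = "Strong growth this year, despite the risk of decline in Europe."
print(cata_scores(press_release))
```

The scores are simple word-share measures per category, which is essentially what dictionary-based tools compute before any further statistical analysis.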
Non-Text-Based Content Analysis: Visuals, Voice and Motion
Recent research has also started to expand content analysis to non-text-based data sources, such as visual data (see Höllerer et al. 2013; Meyer et al. 2013) as well as speech and motion data (see Clarke, Cornelissen and Healey 2019; Soleymani et al. 2017). For example, content analysis of images may differentiate between compositional and semiotic analysis. Others draw on multiple complementary data sources to improve and go beyond text-based analyses, for instance, of sentiments. Because sentiments can be uncovered from affective traces in facial and vocal expressions, so-called multimodal sentiment analysis opens up new avenues for analysing facial and vocal expressions in addition to text-based expressions of sentiments (see Algaba et al. 2020; Soleymani et al. 2017). These novel forms of computer-aided content analysis may be of particular interest for studying strategizing phenomena from a SAP lens.
In SAP research, content analysis could be used in multiple ways.
Content analysis tools enable one to effectively and reliably analyse vast amounts of text, video and image data about strategizing activities, such as meetings, plans and presentations, emails, company websites, intranets, internal documents, etc. These kinds of data sources may help shed light on the micro-level dynamics of strategy practices and connect them to strategy process phenomena. For example, content analysis may help uncover the nature, sequence patterns or effects of speech practices on strategic change implementation (see Reuter and Krauspe 2022; Seyranian and Bligh 2008) or on consensus-building within strategic decision-making teams. It may further help identify the nature, sequence patterns or effects of consulting practices on initiative success.
A further avenue for the application of CATA in SAP research is the study of media coverage (for example, text and image) and its relation to the strategic responses of the firm. Content analysis tools help uncover the distinct practices with which firms react to media tenor, discourse, etc.
Topic Modelling and Machine Learning (ML)
While CCA is largely deductive in nature, significant methodological advancements have been made in the area of ML to conduct algorithm-supported induction (Choudhury, Allen and Endres 2021; Shrestha et al. 2021). Topic modelling is a form of ML that allows researchers to unearth phenomenon-based constructs and grounded conceptual relationships in textual data, exploiting the increasing capability of ‘intelligent’ algorithms to ‘understand’ the meaning of text. Instead of looking for the prevalence of certain keywords, it constructs relational maps of language use. It uses statistical associations of words in a text to generate latent topics – clusters of co-occurring words in a text – without relying on predefined, explicit dictionaries or interpretative rules (e.g., Haans 2019; Kaplan and Vakili 2015). In contrast to dictionary-based analysis, the generated topics are not mutually exclusive: individual words appear across topics with different probabilities, and the topics themselves may overlap or cluster. Topic modelling has the potential to detect novelty and emergence, to develop inductive classification systems, and to analyse frames and social movements, cultural dynamics, or online audiences and products (Hannigan et al. 2019). The topic modelling process begins with the preparation of sets of texts to be analysed, then requires analytical choices that determine how topics are identified within those texts, and finally consists of crafting the topics into constructs, causal links or mechanisms. It is thus not a mechanistic process, but requires careful interpretive decisions and theory work. Hannigan et al. (2019) provide detailed recommendations for the individual steps and choices involved.
ML techniques more broadly can also be used to examine other kinds of data generated by organizations that need not be in textual or image form. This data, also called the ‘digital exhaust’ created by organizational members, includes, for example, data on employees’ calendar markings, documents posted on the intranet, discussion boards, statistics on the use of different software tools in the organization (e.g., the frequency of usage of Zoom), meeting recordings and transcriptions, employees’ physical location data, offers sent to customers and data generated by other customer interactions by hundreds or even thousands of employees every day. Microsoft 365 and Teams alone (to name just two commonly used software platforms) collect data on each individual across eight categories through eleven popular apps, including Outlook and Excel, recording every interaction and its content.
Tapping into this digital exhaust generated by organizational members represents a treasure trove for conducting different kinds of quantitative analyses of strategy practices, provided one is able to negotiate access to the data generated by an organization. Calls have also been made for organizations themselves to tap into this opportunity to enhance their competitive advantage. A recent Forbes article (English 2021) noted that effective use of the available data can serve: (1) to delayer the organization through enhanced coordination and monitoring; (2) to identify the organization’s linchpins through communication flows and collaborations in the organization; (3) to identify internal imbalances and weak spots in the organization; (4) to enhance organization design and processes; and (5) to monitor strategy enactment and mood in the organization. While such data are typically highly company-specific, the amounts of data are so vast that identifying and understanding the emerging patterns requires the use of advanced ML algorithms and quantitative methods.
Several authors have provided detailed instructions on how to technically implement and theorize with the help of different ML algorithms. Shrestha et al. (2021) suggest the use of algorithm-supported induction in these kinds of situations with vast amounts of different types of data to both develop and test theory. They put forward a four-step process through which inductive theory development through ML can be achieved: (1) splitting the sample into two parts (focal sample and holdout sample); (2) detection of robust and interpretable patterns; (3) theory formulation; and (4) theory testing with the holdout sample. Choudhury, Allen and Endres (2021) provide guidance for evaluating model performance, highlight human decisions in the process, and warn of common misinterpretation pitfalls.
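A stylized sketch of this four-step process follows, using simulated data and an interpretable decision tree as the pattern detection step. All variable names and data are hypothetical:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Simulated 'digital exhaust': per employee, weekly meeting hours and
# intranet posts, plus a flag for involvement in strategy work (invented).
X = rng.normal(size=(400, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# Step 1: split into a focal sample and a holdout sample.
X_focal, X_hold, y_focal, y_hold = train_test_split(X, y, random_state=0)

# Step 2: detect robust, interpretable patterns in the focal sample.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_focal, y_focal)
print(export_text(tree, feature_names=["meeting_hours", "intranet_posts"]))

# Steps 3-4: the printed rules support theory formulation, which is then
# tested out-of-sample on the holdout data.
print("holdout accuracy:", tree.score(X_hold, y_hold))
```

The shallow tree stands in for any interpretable pattern detector; the point is the separation between pattern discovery (focal sample) and theory testing (holdout sample).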
For SAP research, exciting new opportunities arise from the application of ML algorithms to different types of data:
First, this method allows SAP scholars to extend the samples upon which they typically build their studies, ‘expanding the spectrum of questions that can be rigorously addressed’ (Vaara and Fritsch 2021: 1176). In this regard, SAP scholars could, for example, more extensively analyse and quantify the different types and styles of communication by strategists. A recent study by Choudhury, Wang, Carlson and Khanna, for instance, combines video data with interview transcripts to discover CEO oral communication styles that incorporate both verbal and non-verbal aspects of communication, coding speech and facial expression as ‘excitable’, ‘stern’, ‘dramatic’, ‘rambling’ or ‘melancholic’ and linking them to distinct strategic behaviours (Choudhury et al. 2019).
Moreover, topic modelling may be fruitfully combined with a number of other research methods that SAP scholars have frequently drawn from, including critical discourse analysis (Aranda et al. 2021) or the so-called ‘Gioia method’ of grounded theory building (e.g., Croidieu and Kim 2018). This would allow SAP scholars to address the calls to also use quantitative data in their inductive studies, to develop more mixed-methods studies, and to develop ‘an epistemically objective science of a domain that is ontologically subjective’ (e.g., Vaara and Fritsch 2021: 1176).
Overall, the methodological progress made in ML algorithms and artificial intelligence research enables strategy practice researchers to use the extensive amounts of data generated by organizations more effectively to enhance the understanding of how strategies are created and enacted and how the feedback loop from strategy creation and enactment can be enhanced further (see e.g., Weiser, Jarzabkowski and Laamanen 2020).
Network Analysis
Over the past thirty years or so, network analysis has become a ubiquitous methodology across the social sciences. As such, classes in network analysis are a mandatory component of most PhD programmes in sociology, and increasingly also in management and strategy (see mechanics in Box 32.2). Social network analysis has been extensively employed by strategy and management scholars to analyse social network structures and dynamics within and between organizations (e.g., Jacobsen, Stea and Soda 2022; Rab 2018). In social network analysis, social structures, such as teams, organizations and industries, are conceptualized and analysed as networks. Here, a network is typically defined as a set of nodes that are connected by dyadic ties of the same type. Nodes typically represent individuals, groups or organizations. Ties, in turn, can represent flows of information, communication or resource exchanges between nodes, as well as attitudes between dyads of nodes, such as friendship or trust.
There are several authoritative sources on how to design and conduct network analytic studies that SAP scholars can draw on when developing and implementing their research designs (e.g., Borgatti, Everett and Johnson 2013; Wasserman and Faust 1994).
(1) Data collection: One of the biggest differences between network analysis and other widely applied quantitative analysis techniques is the nature of the data that needs to be collected. Instead of collecting information about individual cases (e.g., managers, groups, organizations), as would be required for more traditional methods of data analysis, network analysis requires the collection of information about the relationships between actors.
Important steps in the data collection procedure include the identification of the population of nodes that the focal study comprises as well as the determination of appropriate data sources. With regard to determining an appropriate population, setting the right boundary for the study is most important. Boundaries can, for instance, be determined by theoretical criteria or by attribute-based criteria, such as top management team membership or involvement in strategic initiatives. Because of the possibility of collecting data in very detailed and fine-grained ways, fascinating network analytic studies have, for instance, been developed based on data collected about members of a single case organization (e.g., Krackhardt 1990). With regard to data sources, network analytic studies can draw on various options, including archival data, interviews, observations and surveys. In any case, network analysis often goes hand in hand with ethnographic immersion in the focal research context. Ethnography at the front end can, for instance, help in selecting the right research questions to investigate, while an ethnographic investigation at the end can help scholars interpret the results of the quantitative analysis.
(2) Data analysis: For the analysis of collected data, network analysis draws from two distinct areas of mathematics: matrix algebra serves to record and analyse relations between nodes as variables, while graph theory helps analyse the ties among nodes and calculate such concepts as paths. Scholars have developed a distinctive set of software packages for the quantitative analysis of networks. Among these, UCINET and NetDraw are perhaps the most widely applied. UCINET relies on matrix algebra for calculating various network-related measures – for example, the ‘centrality’ of a node in a given network, or the ‘cohesion’ of a given network – and can also be used for testing hypotheses. NetDraw, in turn, is a network visualization software. Relying on graph theory, NetDraw allows for the graphic representation of networks, including relations and attributes. In addition, there are several other software packages that fulfil more specialized roles in the analysis of networks. WordNet, for instance, is a software package that can be applied for various types of semantic network analysis (and thus for the analysis of, for instance, discourses, narratives and cognitive maps). Moreover, SIENA is of particular interest, as it allows modelling and analysing how networks evolve and change over time.
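As a minimal illustration of such measures, the following sketch uses the open-source Python package networkx rather than UCINET; the advice network among strategy practitioners is invented for illustration:

```python
import networkx as nx

# Hypothetical advice network in a strategy team: an edge means the two
# practitioners regularly exchange strategic advice (invented data).
G = nx.Graph()
G.add_edges_from([
    ("CEO", "CSO"), ("CEO", "CFO"), ("CSO", "Consultant"),
    ("CSO", "UnitHeadA"), ("CSO", "UnitHeadB"), ("UnitHeadA", "UnitHeadB"),
])

degree = nx.degree_centrality(G)            # share of possible ties a node has
betweenness = nx.betweenness_centrality(G)  # brokerage across shortest paths
density = nx.density(G)                     # cohesion of the whole network

print(max(degree, key=degree.get))  # most connected practitioner
print(round(density, 2))
```

In this toy network the CSO is the most central broker, which is exactly the kind of structural insight centrality and cohesion measures are designed to surface.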
For SAP scholars, the use of network methods provides a range of advantages and opportunities:
For sociological practice theorists, network analysis has always been a legitimate tool. Bourdieu, in particular, repeatedly relied on network analytic methods in his surveys and for structuring and quantifying his qualitative field observations. It is thus unsurprising that some of Bourdieu’s most central theoretical concepts, such as social capital, social position and social field, can be well captured with network analytic methods. In addition, Giddens’s theory of structuration has been empirically explored by means of network analysis in very fruitful ways. For instance, Barley’s (1986) landmark study of structuration processes in hospital organizations – arguably a role model for SAP scholarship (see Johnson et al. 2007) – receives much of its succinct empirical grounding from the dyadic network analytic methods it employs. Additionally, combining symbolic interactionism and network analysis can yield fascinating research insights – as shown, for instance, in a study of the micro-level communicative turns in meetings among middle and top managers (Gibson 2005).
Network analysis is not limited to the analysis of social networks. It is also increasingly applied in investigations of cultural or cognitive structures (cf. DiMaggio 2011; Leifeld 2017) – such as discourses, cultural content, narratives and mental models – that have been important research foci for SAP scholars. In this case, macro-level discourses, narratives and micro-level mental models have all been conceptualized and analysed as interrelated networks of topics or concepts. One can thus rely on semantic network analysis to investigate, for instance, the centrality of a focal topic in a given discourse, or the complexity of a given story or mental model (e.g., Blaschke, Schoeneborn and Seidl 2012; Liu et al. 2012; Martens, Jennings and Jennings 2007; Pachucki and Breiger 2010). Analyses of cultural networks have, for instance, been conducted in recent research on institutional agency and institutional change (e.g., van Wijk et al. 2013). Additionally, the analysis of so-called ‘narrative networks’ plays an increasingly prominent role in practice-based studies of organizational routines (e.g., Pentland and Feldman 2007; Pentland and Kim 2021).
To conclude, network analysis constitutes a fascinating avenue for future SAP research. It offers several important advantages that can help scholars push the boundaries of our understanding of communication, attention allocation and strategy emergence in organizations (see e.g., Rhee and Leonardi 2018), as well as of extant applications of social practice theories more generally.
Sequence Analysis
Sequence analysis involves ‘a family of methods that can be used to identify, describe, compare and visualize patterns in sequentially ordered data’ (Mahringer and Pentland 2021: 172). It enables studying how ‘things’ ‘emerge, develop, grow or terminate over time’ and across patterns (Langley et al. 2013). Specifically, sequence analysis can be helpful for addressing three types of questions: (1) whether some typical sequence exists; (2) why it exists; and (3) what the consequences are (Abbott 1990: 375). Regarding the latter, and by formalizing sequence patterns, for example with optimal matching algorithms (see mechanics in Box 32.3), sequence analysis methods can be readily complemented with inferential statistics to link sequence patterns with antecedents and outcomes.
Optimal matching analyses study data describing sequences of states or actions by calculating distance metrics between individual sequences. Typical applications of this capability are:
(1) The clustering of sequences into data-driven taxonomies.
(2) The comparison of sequences to theoretically or empirically derived reference sequences and typologies.
In both of these applications, the optimal matching algorithm uses dynamic programming techniques to determine the ‘cheapest’ set of edits (e.g., the smallest number) needed to transform one sequence into the other. As different types of edits may imply different ‘costs’/weights, the total cost of edits, which represents the resulting distance measure, may vary with scholarly choices.
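The underlying dynamic programme can be sketched as follows. The indel and substitution costs, and the coded episode sequences, are illustrative choices that a study would need to justify:

```python
def optimal_matching_distance(seq_a, seq_b, indel=1.0, sub=2.0):
    """Minimum total cost of insertions, deletions and substitutions
    needed to transform seq_a into seq_b (classic dynamic programme)."""
    n, m = len(seq_a), len(seq_b)
    # dp[i][j] = cheapest transformation of seq_a[:i] into seq_b[:j]
    dp = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * indel
    for j in range(1, m + 1):
        dp[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            dp[i][j] = min(dp[i - 1][j] + indel,      # delete
                           dp[i][j - 1] + indel,      # insert
                           dp[i - 1][j - 1] + cost)   # substitute or match
    return dp[n][m]

# Two coded strategy episodes: P=presentation, D=discussion, V=vote
print(optimal_matching_distance("PDDV", "PDV"))  # one deletion: 1.0
```

Varying the `indel` and `sub` parameters changes the resulting distance, which is precisely why these cost choices require theoretical justification.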
While the calculation of the distance measure is taken care of by the statistical software package – to illustrate, the Stata community offers an ado-file for sequence analysis, SEQCOMP – the scholar needs to specify the coding logic of the sequences as well as which sequences should be compared to each other. For this, they need to define the dimension of interest and its respective coding scheme. Further, the scholar needs to theoretically justify the reference object against which they seek to compare individual observations. In the example illustrated by Figure 32.2, the stylized strategic planning process (T) and the rhythmic pattern (G) would need to originate from sound theoretical reasoning. Recommended tools for applying this method are Stata, DTA or TraMineR, a software package for R which offers different methods for whole sequence analysis (including optimal matching), pattern mining and narrative network analysis (Mahringer and Pentland 2021).

Figure 32.2 Optimal matching analysis for sequence analysis
Scholars have identified different categories of sequence analysis methods, including whole sequence methods, pattern mining methods and so-called ‘narrative network’ methods (e.g., Mahringer and Pentland 2021; Pentland and Kim 2021). Of these, whole sequence methods have perhaps been most frequently employed in management research and in the social sciences more generally. As the name suggests, whole sequence methods focus on complete sequences of action as units of analysis. For analysing and conceptualizing whole action sequences, optimal matching analysis is the ‘canonical’ technique (Abbott and Forrest 1986; Abbott and Hrycak 1990). The optimal matching algorithm can be meaningfully applied to sequences with any finite set of different states. It is, however, also possible to code more than one dimension simultaneously (Biemann and Datta 2014), meaning that each coded entity within a sequence can be categorized according to different characteristics. The optimal matching algorithm itself calculates distances among sequences by computing the number of insertions, deletions and substitutions that are needed to transform one sequence into the other. It is thus usually applied in combination with cluster analysis to inductively derive groups of similar sequence patterns. It is therefore fundamentally an exploratory tool which helps to make sense of large amounts of process data.
With this general structure, the method has proven to be very versatile. Although initially developed in biology to identify patterns in DNA sequences, optimal matching soon migrated to sociology and was used to study topics as diverse as the transition from school to work (McVicar and Anyadike-Danes 2002; Scherer 2001), links followed in website visits (Wang and Zaïane 2002), time use (Wilson 2006), the interplay of housing, employment, marriage and fertility (Pollock 2007), and lynching incidents in the American South (Stovel 2001). In strategy and organization research, too, optimal matching is increasingly used (Langley et al. 2013). Examples include analyses of information systems’ implementation processes (Sabherwal and Robey 1993), competitive dynamics (Ferrier 2001), pathways in the development of organizational networks (Stark and Vedres 2006), product development processes (Salvato 2009), patterns in acquisition and alliance behaviour (Shi and Prescott 2011), sequences of job experiences (Leung 2014), and especially research on the dynamics of organizational routines, where sequence analysis is employed with increasing frequency (Mahringer and Pentland 2021).
For SAP researchers, sequence analysis in general and optimal matching in particular imply promising avenues for formalizing processual aspects and temporal dynamics in the study of strategy practices, praxis and practitioners (Whittington 2006). Several areas of application come to mind:
Strategizing unfolds in episodes, such as meetings, strategy events, investor presentations or awaydays (Jarzabkowski, Balogun and Seidl 2007). Furthermore, strategy practices tend to be highly ritualized (Johnson et al. 2010). Sequence analysis enables the identification of a range of prototypical patterns of dramaturgies and activities with which strategic episodes unfold. For example, detailed meeting observations can be coded according to the use of certain tools, ceremonial gestures or the prevalence of certain themes. On this basis, empirical taxonomies of the dynamics of strategic episodes could be developed, and it could be investigated whether certain dynamics are more conducive to successful outcomes of strategic episodes than others.
Consensus among strategists positively shapes team and organizational outcomes (Kellermanns et al. 2005). Optimal matching might be used to study how sequences of discursive processes, communicative interactions or behavioural dynamics and associated practices shape the emergence of consensus among strategy practitioners over time.
A core concern of the SAP lens has been to identify the roles of strategy practitioners and professionals – for example, consultants, strategy directors and the like – that are more than organizational roles (Whittington 2007; Whittington et al. 2017). Optimal matching techniques lend themselves particularly well to shedding light on strategy practitioners’ career trajectories and developmental paths. In this regard, Abbott and Hrycak (1990), for example, studied the careers of German musicians in the eighteenth century, while Blair-Loy (1999) studied women’s careers in finance.
Salvato’s (2009) analysis of new product development processes at the Italian company Alessi is an excellent example of the potential usefulness of optimal matching techniques in future SAP research. He compiled and analysed rich descriptive narratives that detail the sequences of events for each of the ninety new product innovation projects he covered over a fifteen-year period, applied optimal matching analysis, and developed a taxonomy of new product development processes. Salvato’s study also emphasizes the iterative nature of interpretation and algorithmic distance calculation that characterizes the optimal matching method: while the method starts with an interpretative process to develop the initial categorization scheme, it also requires interpretation at the end – going back to the raw data and/or conducting additional interviews – to transform the numerically derived temporal profiles into meaningful conceptualizations of developmental paths.
Event History Analysis
Many theories involve arguments on the effect of time on the occurrence of discrete events. Methodologically, this reflects the question of how the transition rate of moving from one state to another is affected by other variables of interest. To investigate this question, event history methodology derives conditional likelihoods for moving from one state to another from properties of the underlying data, notably from the time spans between individual transition events (typically referred to as ‘spell lengths’) and with all covariates considered (see mechanics in Box 32.4). Strategy scholars have made use of event history analyses, for instance, for studying competitive behaviours among firms (e.g., Asgari, Singh and Mitchell 2017; Yu and Cannella 2007). In organization theory, event history analysis constitutes the standard methodology in ecological research (see Tuma and Hannan 1984). Organization scholars have also used this method to uncover, for example, the diffusion of practices within a given context (Greve 1996; 1998) and the patterns of attention allocation to emerging industry trends (Eggers and Kaplan 2009; Maula, Keil and Zahra 2013). As event history models consider one transition at a time, the method is less suited for studying processes with recurring events. In these cases, each transition has to be modelled separately and the covariates need to be assigned appropriately each time. This effectively renders the longitudinal character of the analysis cross-sectional, such that time dummy variables may be needed to restore the procedural character of longer event sequences.
If the research focus lies on such full sequences of multiple transitions – for example, because one wants to investigate whether the stages of the sequence fit together – optimal matching analysis is potentially the more appropriate method.
Event history analyses study the time to transition between states. The method considers the state of various covariates and is thus apt to explain an event’s ‘history’, hence the name event history analysis. Using different types of clock variables as dummies, the method also allows the testing of dynamic theories on strategy practices. Figure 32.3 illustrates such a case. Using a firm’s strategy revision as a defining moment for the initiation of sequences of strategic practices, the event history method could provide formal evidence for managers’ increased inclination to engage in strategic practices following strategy revisions.

Figure 32.3 Event history analysis for sequence analysis
The method requires material preparatory work in the form of data transformations. Specifically, it requires the scholar to generate, for each firm or actor, a series of ‘spells’ (durations in time units between the transition steps of interest) that stretches across the study’s observation period. After this step, the overall set of spells needs to be brought into a cross-sectional sample, for which time-varying covariates need to be assigned in a manner that reflects the covariates’ changing values over time. Recommended tools for applying this method are Stata, SPSS and SAS; the transformation of spells can also be done easily using a spreadsheet program.
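This transformation of an event log into spells can be sketched in a few lines of code; the firms, event years and observation window are invented for illustration:

```python
# Hypothetical event log: for each firm, the years in which it revised
# its strategy (invented data; real studies draw on archival sources).
events = {
    "FirmA": [2001, 2004, 2010],
    "FirmB": [2003, 2005],
}
STUDY_END = 2012  # end of observation window -> final spell is censored

rows = []
for firm, years in events.items():
    starts = years
    ends = years[1:] + [STUDY_END]
    for i, (start, end) in enumerate(zip(starts, ends)):
        rows.append({
            "firm": firm,
            "spell": i + 1,
            "duration": end - start,   # spell length in years
            "event": end != STUDY_END, # False = right-censored spell
        })

for row in rows:
    print(row)
```

The resulting cross-sectional rows of durations with censoring flags are the input format event history models expect; time-varying covariates would then be merged onto each spell.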
For SAP researchers, these capabilities may imply promising avenues for uncovering durational aspects of strategizing practices, practitioners and praxis. Exemplary areas of application include the following.
An important application of event history analysis in SAP research is future research on strategy implementation. In this regard, one could conceptualize and analyse strategy implementation in terms of the diffusion of novel strategy concepts and practices within the organization. Although this topic has been considered at the industry level, there has been less research so far on the diffusion of practices, their antecedents and their outcomes inside the organization. Consider, for example, an organization undergoing a large strategic change. Event history analysis could be used to model the time to adoption of novel concepts and practices among groups, departments or units that vary in their resistance to the change.
Event history analysis can also pave the way for new directions in research on strategic issues: a survival analysis may, for instance, uncover how the strategic agendas of organizations evolve and change over time. One can model the rate at which strategic priorities ‘die’ within the organization and leave the strategic agenda, as well as the antecedents that raise or lower this survival rate.
Strategy work is accompanied by dominant discursive practices (Mantere and Vaara 2008). Event history analyses may enable the modelling of discursive episodes and shifts in dominant discourses, as well as the antecedents that increase or decrease the likelihood of such shifts.
Conclusion
We have argued in this chapter that, even though most of the prior work in SAP research has been conceptual or qualitative in nature, there would be major potential in researching strategy practices quantitatively as well. The related research on behavioural strategy (e.g., Gavetti 2012; Powell, Lovallo and Fox 2011) and the micro-foundations of strategy (Eisenhardt, Furr and Bingham 2010; Felin and Foss 2009; Felin et al. 2012; Foss and Lindenberg 2013; Greve 2013) has made good progress in combining conceptual work with both qualitative and quantitative methods.
There are a number of benefits that could be gained in comparison to a solely qualitative research orientation. Expanding the use of quantitative methods in studying strategy practices would enable the development and validation of constructs, the verification of the applicability of results across different contexts, and, as a whole, a more effective and integrative accumulation of empirical evidence within the broader research community. Moreover, applying innovative quantitative methods could also lead to novel insights that might not be achievable with purely qualitative research designs.
Our chapter offers two main contributions to SAP research. First, we outline several related strategic management research areas that align well with the SAP research agenda and that have used quantitative methods extensively to examine practitioners and practices. Over the years, these areas have accumulated a wealth of constructs and measures that SAP scholars can draw from and connect their research to in order to develop quantitative SAP research. This would enable not only a more straightforward accumulation and formalization of SAP research findings but also a better integration of these findings into the broader strategy research field.
Second, we have also introduced several quantitative research methods that could help SAP scholars go one step further in examining the antecedents, occurrence and implications of strategy practices. Our selection of these research methods is not meant to be exhaustive, but, rather, to introduce a set of methods that may be particularly insightful for SAP scholarship. Together, these methods help advance the SAP agenda by providing a toolkit for the holistic quantitative analysis of strategy practices. As such, this toolkit provides robust means for analysing strategy practices’ content (with content analysis and ML), context (with network analysis), occurrence (with event history analysis) and sequences (with optimal matching analysis). Moreover, these methods can be combined in a number of complementary ways.
Overall, we hope that our chapter encourages scholars to pursue quantitative SAP research as well. Developing a more balanced epistemic culture that encourages and accepts not only qualitative but also quantitative research holds a wealth of potential, and will greatly facilitate the further development and effectiveness of SAP research.


