The Power of Ranking: The Ease of Doing Business Indicator and Global Regulatory Behavior


We argue that the World Bank has successfully marshaled the Ease of Doing Business (EDB) Index to amass considerable influence over business regulations worldwide. The EDB Index is a global performance indicator (GPI), and GPIs—especially those that rate and rank states against one another—are intended to package information to influence the views of an audience important to the target, such as foreign investors or voters, thus generating pressures that induce a change in the target's behavior. The World Bank has succeeded in shaping the global regulatory environment even though it has no explicit mandate over regulatory policy and despite questions about the EDB's accuracy and the policy tradeoffs it requires. We show that the EDB has a dominant market share among business climate indicators. We then use media analyses and observational data to show that the EDB has motivated state regulatory shifts: states respond to being publicly ranked, and some restructure bureaucracies accordingly. Next we explore plausible influence channels for the EDB ranking and use an experiment involving US portfolio managers to build on existing economics research and examine whether the rankings influence investor sentiment. Using a case study of India's multiyear interagency effort to rise in the EDB rankings, as well as its decision to create subnational EDB rankings, we bring the strands of the argument together by showing how politicians see the ranking as affecting domestic politics, altering investor sentiment, and engaging bureaucratic reputation. Overall, a wide variety of evidence converges to illustrate the pressures through which the World Bank has used state rankings to achieve its vision of regulatory reform.

The main advantage of showing a single rank: it is easily understood by politicians, journalists, and development experts and therefore created pressure to reform. As in sports, once you start keeping score everyone wants to win.

—World Bank Staff Report, 20051

Stripping the ordinal rankings and “reforming” the report's methodology would have the effect of completely destroying the report's credibility and usefulness as a policy tool.

—Steve Hanke, director of the Cato Institute's Troubled Currencies Project, in response to a Chinese-led effort to remove the rankings2

The world is increasingly governed not by force but by information that moves markets, affects reputations, and impinges on national security. Global performance indicators (GPIs), especially regimes that rate and rank states against one another, consciously package information to influence the priorities of states, the perceptions of publics, and the decisions of economic actors. As the introduction to this symposium suggests, GPIs constitute an increasingly important form of social pressure around the world. Their creators promote them to change the information environment of communities important to the target, thereby changing the target's behavior. All social pressure is exerted through information: sometimes relying on evidence and rational argument, often using emotive persuasion, and occasionally making implicit or explicit demands for conformity. This information is intended to affect the views of an audience of importance to the target, anticipating that the target will care about and respond to those views.3

That is precisely what GPIs aim to do. Wielding comparative information using simple rankings is designed to alter shared information, affect third-party beliefs and opinions, and ultimately convince targets that their reputation or relative status is at stake, potentially with material and/or social consequences. Social pressure of this kind is evident in the area of business (de)regulation. Since the mid-2000s the World Bank has used rankings to influence the regulatory policies of countries worldwide. By creating the Doing Business Report and the Ease of Doing Business (EDB) Index, the bank has decisively shaped states’ regulatory behavior, especially in emerging markets and developing countries. Even though the EDB is formally a noncoercive reporting exercise and may not always accurately reflect appropriate regulation, its existence has influenced governments around the world to change their economic and regulatory policies to meet the bank's expectations. By benchmarking and especially by ranking, the bank intentionally exerts competitive social pressure on states to deregulate.4 If the World Bank simply wanted to exert traditional economic pressure, it has long had the tools at hand and would scarcely need to construct and propagate such an elaborate way to change the broader informational environment. Instead, the bank has chosen to innovate by manipulating information that influences official reputations and states’ status.

We explore the bank's intentions in establishing its deregulatory ranking system. We build a prima facie case for the EDB's social influence by demonstrating its salience in the media and on the minds of high government officials. Plausible observational evidence demonstrates an average global correlation between publicizing the rankings, bureaucratic adaptations responding to the rankings, and an acceleration in actual policy reforms. A survey experiment and case study unpack causal mechanisms. By manipulating the information available to an elite panel of investors, we demonstrate that EDB rankings affect assessments of investment opportunities within a controlled experiment involving investors. A case study of India informed in part by leaked documents brings the strands of the argument together and provides evidence that politicians see the ranking as affecting domestic politics, altering investor sentiment, and engaging bureaucratic reputations. The case demonstrates holistically that altering information allows the World Bank to intensify its influence on states whose national politicians and bureaucrats believe their reputations and ability to attract business are at stake. Consequently, they strive to move up the rankings. Overall, a broad range of evidence, each source and method tailored to a specific step in the argument, shows that the bank has intentionally and successfully packaged information to maximize its influence on states to reform business regulations in emerging markets around the world.

Comparative Information and Social Pressure: A Theory of the Influence of the EDB

The World Bank's use of the EDB index is a prime example of the mechanisms discussed in the symposium introduction.5 For decades, the bank has used the traditional tools of loans and technical assistance to influence development strategies. For a number of reasons—including the possibility of growing skepticism about the legitimate role of international organizations in traditional areas of state sovereignty6—traditional tools of economic leverage were seen as undesirable and/or ineffective ways to encourage business deregulation. Instead, the bank intentionally chose a communication device that uses the views of other actors to encourage change. Rankings served that purpose. They simplify a complex regulatory reality, compare all states along a set of actionable indicators, and publicize the resulting rankings to media hungry for simple headlines. Investors looking for rules of thumb to guide their decisions pay attention. Constituents use them to pass simple judgments on policies and politicians. Knowing this, the bank sees an opening to use information through these audiences to achieve results. In other words, this is a bank-initiated application of social pressure. In anticipation or response, governments alter priorities, make bureaucratic changes, and intensify their engagement with the bank to improve their rankings. Figure 1 illustrates schematically how this process works.

Figure 1. Theory of the social influence of the Ease of Doing Business ranking

Constructing comparative judgments is crucial in this process.7 By engaging the right-hand segments of the loop in Figure 1, rankings reverberate and magnify whatever direct influence the World Bank may traditionally have had on states. Since the bank publishes overall and sub-index rankings, it could not be easier to sort states by their total number of reforms or a specific reform category.8 The format is important because broad social engagement is much less likely to be activated by raw data alone than by comparisons.9

This is precisely the mechanism that the symposium introduction theorizes. When the bank deploys “business climate” information in a simple comparative format such as the EDB Index, it effectively changes the information environment for economic and political groups important to the target state. Not only do the bank staff and the ministry of development (for example) know their rating; they both know that investors know, citizens have gotten wind of the ranking, and other states have become aware as well. This is the essence of social pressure: it engages the reputations and status concerns of relevant bureaucrats and politicians, in some cases fueled by the national pride of domestic publics more generally.10 When King Abdullah of Saudi Arabia declared in 2006, “I want Saudi Arabia to be among the top ten countries in Doing Business in 2010. No Middle Eastern country should have a better investment climate by 2007,”11 he was displaying a status motivation that has no other metric than his kingdom's relative performance on the bank's narrowly defined but highly focal scale.

Social pressure is not a bilateral relationship between the World Bank and a state; our theory stresses that the World Bank alters the informational environment through the EDB ranking, which in turn stimulates (often implicit) group pressures on states to reform. Were it not for the anticipated public response, the bank would not be able to exert social pressure of the kind described here (though any economic leverage it may have would remain intact). Governments are likely to care about the beliefs of two groups in particular: domestic constituents (voters, business groups)12 and international investors. For domestic businesses, the rankings uniquely reveal how much more heavy-handed their government is than its peers. World Bank rankings recalibrate expectations and legitimate demands for a reduction in red tape associated with conducting business. International investors may be influenced by a state's EDB rankings as well.13 Even more importantly, state regulators believe that the rankings influence private investment decisions and will try proactively to improve their rankings to attract investment.14 Market actors15 use the bank's rankings as a credible shorthand for a competently regulated economy. Perhaps for this reason, EDB rankings correlate with investment flows, consistent with a claim that good ratings attract business.16 Unfortunately, existing studies do not distinguish between the underlying “business environment” the EDB is meant to reflect and the signal sent by the ratings per se. Methods isolating the influence on investor opinions and beliefs of information packaged as ratings are essential to our argument.

But why should bureaucrats—some of whom may collect rents from existing inefficient red tape—care about such information? EDB rankings also reflect on the personal competence of an individual government minister or that of a department or bureaucracy.17 Some EDB subindicators are specific enough to implicate the professionalism of business regulators, encouraging policy reform before the next “grading period” to avoid opprobrium. Ongoing EDB monitoring and publicity prompt bureaucracies to develop institutionalized routines and capacities, especially in middle-tier emerging markets where incentives to develop a reputation for a business-friendly environment are strongest.18

Finally, governments can use the EDB rankings strategically to gain support for their policies. GPIs can help leaders overcome rent-seeking politicians or competition-fearing monopolies by empowering allies, shaming bureaucrats, mobilizing publics, and promising to attract investment.19 External validation (or criticism) from a credible institution can be part of a strategy to bolster a broad domestic coalition for reform.20 External pressure in the form of rankings is sometimes a politically useful tool to accomplish leaders’ objectives in the face of domestic resistance. This possibility is evident in India, where Prime Minister Modi has emulated the World Bank's tactics intranationally to intensify social pressure on Indian bureaucrats around the country to improve their performance.

EDB Background

Economic Theories and Bank Motives

Over the course of the 1990s, a remarkable development was afoot in one of the most important public investment bureaucracies in the world. The World Bank, whose legal mandate was to promote investment by guaranteeing loans and supplementing private finance, began to turn its attention to what it saw as one underlying reason for underinvestment in the first place: burdensome business regulations.21 In the spirit of the times, academic and bank researchers began to collect information that would speak to the empirical links between regulatory burdens, investment, and economic outcomes such as growth and development.22 They developed the concepts and methods underlying the indices on which the rankings were to eventually be based in a widely cited set of academic and policy papers that reflected the deregulatory and pro-investor approaches that were reaching their height at the time.23

The EDB index was “built on the premise that firms are more likely to flourish if they have to abide by fewer, cheaper, and simpler regulations.” It seeks to assess “the burden of regulation … as seen from the private firm's point of view,” not the net social benefits of regulation, and not net poverty reduction.24 A ranking that rewards reduced business costs was justified theoretically on the grounds that overregulation stifles business activity, stunting growth and development. In August 2002, the bank noted its assessments were meant to set standards and to be actionable: “The [EDB] database differs from existing cross-country reports … which … do not identify the nature of regulatory reforms required to improve the investment climate. Doing Business aims to provide a new set of objective, quantifiable measures of business regulations and their enforcement.”25

The decision to rank was a deliberate part of the strategy to affect policy. EDB's “lively communication style” was designed specifically to establish benchmarks and set states in competition with one another in support of the World Bank's development agenda.26 To promote its “flagship knowledge product,”27 the bank staff carry out a massive media campaign every year when they release the ratings. A separate indicator-based reform team works with countries to target policies effectively.

Market Share

The EDB product line has a robust online presence, including a Wikipedia page, Facebook and LinkedIn accounts, several YouTube videos, and content on ChartsBin and SlideShare. Consequently, the EDB Index enjoys tremendous “market share” among the growing list of GPIs that deal with national business environments. To illustrate, we selected seven of the EDB's closest cognate assessments and searched a database of over fifty thousand online media sources (news organizations, blogs, and other media).28 The EDB brand dominates the market for easy-to-access comparative rankings of country performance, as Table 1 clearly shows. In fact, the EDB had more mentions in the media between 2010 and 2017 than the other seven cognate indicators combined. In 2017 the Doing Business website had nearly 5 million annual visitors, 166 times as many as in 2003 (Figure 2).

Table 1. Market share of the ease of doing business index

Notes: Showing the number and share of hits. Results generated from Harvard Berkman Center, “Media Cloud Database,” 2017.

Figure 2. Doing Business website visits, annually (2003–2016)


Despite its dominance, the EDB indicator inhabits a contested space and faces criticisms about its accuracy and validity. One critical study compared EDB's de jure measures of regulations with de facto measures from World Bank firm surveys and found significant differences between the two.29 Firms in some countries with low ranks in categories such as legal requirements for construction permits actually obtained permits faster than firms in higher-ranked countries, a pattern that also holds across many other EDB subcategories. The rankings based on formal laws were largely unrelated to actual business practice. The EDB Index has even been assailed for frequent changes in methodology that “had the appearance of being politically motivated.”30 Whether this is true or not, it illustrates disagreement over what the rankings capture.

The ranking criteria face some sharp ideological criticism for their deregulatory biases. Unions and the International Labor Organization (ILO) have criticized the EDB for neglecting the consequences of business deregulation for workers, and the bank eventually removed labor-related components from the index.31 The EDB has likewise been criticized on environmental grounds for downplaying the importance of environmental assessments in favor of a streamlined permits process that could increase the risk of natural disasters.32 Many have questioned whether restrictions on female participation in business should be included in the ranking. When the bank's data on “Women and the Law” were included in the rankings, states like Saudi Arabia tumbled downward. These examples suggest that states have reasons to wonder whether competing to ascend the rankings could create new problems or exacerbate existing ones. Unsurprisingly, competitors have developed and deployed alternative measures for states’ business environments (Table 1). The EDB faces competition from GPIs that prioritize low taxes and limited government (Heritage and Fraser), that include the informal sector (Global Entrepreneurship Monitor), and that include labor (Global Competitiveness Index).

Simplicity, Salience, and Competition: Prima Facie Evidence of the Theory in Practice

Despite questions about its singular deregulatory emphasis and validity, the EDB rankings have become quite salient. Within the first year of publicizing the rankings, leaders from many countries, including Algeria, Burkina Faso, Malawi, Mali, and São Tomé and Príncipe, had reportedly requested not general regulatory advice but specific advice on how to improve their standings. These requests prompted the first epigraph to this paper, from a 2005 staff report marveling at the competitive state response.33 The World Bank itself has succinctly summarized our theory of social pressure: decision makers view the EDB index as a system that compares performance, engages reputations, and incites competition. The bank explicitly and intentionally designed an assessment system calculated to draw attention to a few very simple criteria that are plausibly but not unequivocally associated with a “better” business environment. The index became focal in part because it was one of the first to successfully harness broader intellectual and ideological trends, to link development with a country's business-friendly environment, and thus to ride the crest of the deregulatory wave of the Washington Consensus touted by prominent economists. It has also been advocated by arguably the most central development institution in the world, leveraging the World Bank's credibility.

The EDB also benefits from its quantitative clarity. The ranking simply rewards any policy that reduces the time or the cost of doing business.34 The bank chose not to cloud this focal concept with alternative or countervailing values such as fair business, socially responsible business, labor protection, or environmental considerations.35 The bank further reinforces the EDB's legitimacy by referring to the rankings themselves as “data” on par with the rest of the World Development Indicators.36 As a result, the EDB has survived political pushback from powerful states such as China and Russia and has become focal enough to significantly influence the behavior of states.

Evidence of our theory in practice can be found in policymakers’ own words. Over the past decade, policymakers around the world have spoken and acted as though the EDB matters greatly. Countries openly publicize their plans to undertake reforms. Georgia—which some have criticized for gaming the system—announced concerted efforts to rise from one-hundredth to the top twenty in two years.37 National officials in Yemen,38 Portugal,39 Mauritius,40 El Salvador,41 and India42 have also highlighted EDB as motivating reforms.

To test the general plausibility of this claim, we examined a near-comprehensive set of press statements and stories for 2016 in English from the Lexis Nexis database. While hundreds of stories mention the EDB Index, our specific interest was in the fifty-one English-language stories covering twenty-six countries that directly cite high-ranking government officials. Illustrating the seriousness with which countries take the EDB Index, 14 percent of the officials cited are heads of state, and another 47 percent are either ministers or deputy ministers, making up over 60 percent of the stories. The remaining stories quoted spokespeople for these offices.

These statements demonstrate the theorized channels of influence. When countries improve, officials highlight this accomplishment: 18 percent brag about progress on the index. Comparisons are rife: 14 percent of officials compare their countries to others. For example, the undersecretary to Cyprus's president, who heads the president's administrative reform unit, noted that Cyprus ranked twenty-fifth of twenty-eight EU states and that “our performance there is not good.” Fifteen percent of the stories mention specific bureaucracies tasked with improving the EDB score, potentially amplifying reputational concerns. Most of the stories identify specific policy measures taken and link them to the EDB Index. Indonesia's agrarian and spatial planning minister noted specifically that a “ministerial regulation was made to respond to a survey by the World Bank on the ease of doing business.”43

Many officials stressed the desire to improve their rankings. For half the countries, official statements—usually by a head of state—publicly commit to a specific target ranking. For example, Indonesian President Jokowi announced “a policy intended to improve Indonesia's position in the World Bank's Ease of Doing Business rankings from 109 to 40.” In Bangladesh, a high-level official noted that it was “the prime minister's demand to see Bangladesh among the countries with a double-digit position (10 to 99) in the ‘ease of doing business index.’ It's an aggressive target, but achievable.” In Kazakhstan, Erbolat Dossaev, minister of national economy, committed to reach the top thirty, “an objective set by the president of Kazakhstan, Nursultan Nazarbayev.” None other than President Vladimir Putin of Russia has gotten into the game. A story reports that “Russia's high positions in the Doing Business ranking were one of the objectives provided in the president's May decrees of 2012. Russia is to go up from the 120th position in 2011 to the fiftieth in 2015 and to the twentieth in 2018.”

This evidence shows that high-level government officials make explicit comparative judgments and set goals based on the EDB Index. Some also believe their efforts will be rewarded in a very tangible way—by attracting investment. Serbia's Prime Minister Aleksandar Vucic acknowledged this explicitly, stating that “Serbia wants to enter the top thirty countries on the World Bank's list. This is very important for the citizens of Serbia because the better positioned we are, the more we will be able to attract foreign and domestic investors.” There is ample prima facie evidence that the EDB Index has motivated a wide range of states, especially those with emerging markets, to make policy reforms that will be tallied by the EDB Index.

Observational Evidence: Bureaucratic Restructuring and Policy Reforms

The EDB system has a clear bureaucratic imprint in many states, and new dedicated structures help states ascend the rankings more efficiently. Evidence also suggests that the bank's strategy of public competition has paid reform dividends: states have responded by reducing costs and time associated with starting a business once the rankings were made public.

Bureaucratic Efforts

Since 2006 when the bank started tracking, countries have undertaken 3,057 sets of reforms related to the EDB.44 Many of these reforms appear to be concerted efforts to improve the rankings as countries initiate collaborations with bank staff in response to the rankings. For example, in 2006, Azerbaijan's president declared the country's ranking “unacceptable,” and sent a working group to consult with the bank to design reforms that moved Azerbaijan up in the rankings.45 In February 2008 the Albanian government asked the World Bank's Doing Business Reform Unit to review proposed legislation to protect investors and then, one month later, unanimously enacted it.46 Such consultations are frequent. Between November 2013 and October 2014 alone, the EDB team received over 160 queries from countries, which suggests that bureaucracies are now configured to respond to the bank's policy advice.47 More than fifty states have formed or designated “reform committees” that, according to the bank, “use the Doing Business indicators as one input to inform their programs for improving the business environment.”48

Table 2 lists the reform committees in place as of 2015. Although countries with reform committees resemble those without in terms of relevant factors such as GDP growth, World Bank loans, regime type, GDP per capita, or even initial EDB ranking,49 they differ with respect to their EDB performance over time. Between 2007 and 2014, they undertook many more reforms (an average of 2.7 reforms per year compared to 1.2 for those without committees, a statistically significant difference), and whereas the countries without committees dropped ten spaces in the rankings during this period, the ones with committees rose by nine.50

Table 2. Countries with reform committees directly using the EDB data

Notes: No information is available for the precise date when committees were formed. These are all reform committees in existence as of 2015. Source: World Bank.
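The comparison above is a simple difference of annual means. As an illustration only, a gap of the size the article reports (2.7 versus 1.2 reforms per year) can be checked for significance with a two-sample test; the group sizes and the Poisson data-generating process below are our assumptions, not the authors' data.

```python
import numpy as np
from scipy import stats

# Synthetic annual reform counts (illustrative assumption: Poisson draws
# centered on the reported group means of 2.7 and 1.2 reforms per year).
rng = np.random.default_rng(1)
with_committee = rng.poisson(lam=2.7, size=50)      # ~50 committee countries
without_committee = rng.poisson(lam=1.2, size=120)  # remaining countries

# Welch's t-test: allows unequal variances and unequal group sizes.
t_stat, p_value = stats.ttest_ind(with_committee, without_committee,
                                  equal_var=False)
print(with_committee.mean(), without_committee.mean(), p_value)
```

With group means this far apart, the difference is significant at conventional levels even on modest samples.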

The rankings boost is not merely proportional to the number of reforms for countries with such committees; they systematically get a bigger “ranking bang” for their “reform buck.” We coded the annual number of reforms by each subindicator of the Doing Business Index,51 using a more fine-grained count than the bank's own data and coding for whether the reforms were positive or negative, based on bank descriptions.52 A new variable, total reforms, accounts for both positive and negative reforms. This variable has a mean of 1.6 (and a standard deviation of 2.4), suggesting that on average countries undertook a net of 1.6 reforms a year. The range is from -6 to 17 (some reforms are negative). It turns out that, per reform, the countries with designated committees moved up more in the rankings (about 1.03 places) than those without such committees (up about .55 places). In other words, countries with committees got nearly double the rankings’ reward for each reform effort. In Table 3, this total reforms variable is used to predict overall EDB ranking in the subsequent year, using a normal linear regression model and controlling for past ranking as well as year fixed effects to account for any minor methodological changes over time. Models 1 and 2 demonstrate the separate effects of reform committee and total reforms, models 3 and 4 illustrate the total reforms for countries with and without reform committees as separate subgroups, and model 5 uses an interaction term between reform committee and total reforms to demonstrate that the relationship between total reforms and the EDB ranking differs by whether countries have reform committees. The analysis suggests that focused bureaucratic organization produces more strategic responses to the rankings, and not just to global market pressures unrelated to the EDB. As leaders’ own commentary in the media suggests, many states undertake specific reforms strategically to improve their rankings.53

Table 3. The efficiency of bureaucratic reforms for ascending the rankings

Notes: Dependent variable is EDB Ranking. The coefficients that improve rankings are negative as countries move toward being number 1.
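The model-5 specification described above, an interaction between total reforms and a reform-committee dummy alongside a lagged ranking and year fixed effects, can be sketched as follows. The panel is synthetic and the variable names are our own; this is a minimal illustration of the specification, not the authors' code or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic country-year panel (all values are illustrative assumptions).
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "committee": rng.integers(0, 2, n),    # reform-committee dummy (0/1)
    "total_reforms": rng.poisson(1.6, n),  # net reforms, mean 1.6 as in the text
    "lag_rank": rng.integers(1, 190, n),   # previous-year EDB ranking
    "year": rng.integers(2007, 2015, n),
})
# Build the outcome so each reform improves (lowers) the rank by ~0.55
# places, with an extra ~0.48 places per reform for committee countries.
df["edb_rank"] = (df["lag_rank"]
                  - 0.55 * df["total_reforms"]
                  - 0.48 * df["total_reforms"] * df["committee"]
                  + rng.normal(0, 2.0, n))

# OLS with an interaction term, a lagged dependent variable, and year
# fixed effects (C(year)); negative coefficients mean an improved rank.
model = smf.ols("edb_rank ~ total_reforms * committee + lag_rank + C(year)",
                data=df).fit()
print(model.params["total_reforms"])            # per-reform effect, no committee
print(model.params["total_reforms:committee"])  # additional effect with one
```

The interaction coefficient is what captures the extra “ranking bang per reform buck” that committee countries receive.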

Empirical Context: Publicity's Impact

The theory advanced in this symposium claims that ranking publicity per se should matter for reform. This is challenging to assess since the World Bank's monitoring, reporting, and public ranking have been introduced gradually. In the late 1990s several indices around competitiveness were emerging—the bank was not the first to capture the field. The idea for the EDB report arose with a paper by Djankov and colleagues on “The Regulation of Entry,” which has been cited over 3,000 times and was well known before it appeared in print in 2002.54 The paper ranked regulation on entry procedures derived from 1999 data on eighty-five countries representing a wide array of regime types and other characteristics. In 2002, the World Bank issued the first data on its website (roughly covering 2001), thus commencing the formal period of monitoring and rating. The early data covered 110 countries, but selection into that sample is not significantly correlated with the outcome variable measuring the regulations.55 In 2004, a report covering 145 countries was issued for the first time, attracting more attention to the ratings and monitoring, but still without rankings. In 2005, the report included top-twenty and also worst-performer lists, essentially constituting a protoranking. By 2006 (covering 2005 data), a true ranking of all countries debuted. The ranking's introduction was by no means a clean break, which makes it difficult to detect a precise ranking-publicity effect.

Despite the gradual introduction of ratings and rankings, we hypothesize that the full publication of rankings in 2006 should be associated with greater efforts to reform and therefore greater reductions in the relevant measures after 2006. The dependent variables are four indicators that were first published for “Starting a Business,” the most often referenced component of the index. Larger numbers represent higher costs or longer waits, and so are considered worse from a business perspective. Data were recovered from “the Wayback Machine” Internet archive for years prior to publication. Table 4 displays the indicators and the years the data collection began.56

Table 4. Overview of de jure reform measures (dependent variables)

Notes: Source is EDB website. Years published covers data from the prior year.

Average values of these indicators show steady declines, meaning it is easier and cheaper to do business in these countries on average over time. Many more countries have been progressing each year than retrogressing. In 2002, for example, only 13 percent of the countries required fewer than six procedures to start a business. By 2014, half of the countries had fallen below six procedures. Also by 2014, in nearly a quarter of countries one could start a business in about a week, something that had been possible in less than 5 percent of countries in 2002. Figures 3 and 4 show the number of countries over time improving and backsliding on these measures, keeping in mind that countries face a floor effect which at some point makes further reductions difficult or impossible.57

Figure 3. Days to start a business

Figure 4. Procedures to start a business

To examine the association between ratings and the introduction of rankings we use a simple time-series regression model that includes controls for the most salient economic indicators: polity (as a measure of regime type), gdp, population, gdp growth, loans from the world bank, as well as a lag of the outcome variable for each of the four sets of models associated with the four subindicators in Table 4. The economic and outcome variables are all logged.

The underlying hypothesis is that 2006 represents a break point in the trend. This is a hard test because of the gradual introduction of the monitoring, rating, and ranking scheme and the expectation that policy reactions take some time. The key explanatory variable is ranked, which equals 1 for all countries in year 2006 and afterward. Two different specifications were run for each outcome. The first includes ranked, the control variables, the lagged outcome, and country fixed effects. Ranked is expected to be negative and significant in this model. The second model adds a year trend variable. A negative and significant coefficient on ranked would indicate greater improvements after the introduction of the rankings.
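The core of the second specification, a post-2006 dummy plus a year trend with country fixed effects, can be sketched as follows. This is an illustrative toy, not the authors' data or full model: fixed effects are absorbed by demeaning each variable within country (the "within" transform), and the synthetic panel is noise-free so the fitted coefficients recover the values built in.

```python
# Hedged sketch: country fixed effects via within-country demeaning,
# with a post-2006 "ranked" dummy and a linear year trend.
# Synthetic, noise-free panel; the true ranked effect is set to -0.30.

def demean_within(values, groups):
    """Subtract each group's mean from its members (the 'within' transform)."""
    totals, counts = {}, {}
    for g, v in zip(groups, values):
        totals[g] = totals.get(g, 0.0) + v
        counts[g] = counts.get(g, 0) + 1
    means = {g: totals[g] / counts[g] for g in totals}
    return [v - means[g] for g, v in zip(groups, values)]

countries = ["A", "B", "C"]
alpha = {"A": 3.0, "B": 2.5, "C": 3.4}          # country fixed effects
rows = [(i, t) for i in countries for t in range(2002, 2011)]
ranked = [1.0 if t >= 2006 else 0.0 for i, t in rows]
trend = [float(t - 2002) for i, t in rows]
y = [alpha[i] - 0.30 * r - 0.05 * s            # logged outcome, by construction
     for (i, t), r, s in zip(rows, ranked, trend)]

groups = [i for i, t in rows]
yd = demean_within(y, groups)
rd = demean_within(ranked, groups)
td = demean_within(trend, groups)

# two-regressor OLS on demeaned data (no intercept needed after demeaning)
srr = sum(a * a for a in rd)
stt = sum(a * a for a in td)
srt = sum(a * b for a, b in zip(rd, td))
sry = sum(a * b for a, b in zip(rd, yd))
sty = sum(a * b for a, b in zip(td, yd))
det = srr * stt - srt * srt
b_ranked = (stt * sry - srt * sty) / det        # recovers -0.30
b_trend = (srr * sty - srt * sry) / det         # recovers -0.05
```

A negative b_ranked after the trend is partialed out is exactly the pattern the text interprets as a break at 2006 rather than a continuation of a pre-existing decline.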

Table 5 displays the results. In models 6 to 9, which have only country fixed effects, ranked is associated with reductions in the time, costs, and procedures associated with starting a business, indicating greater improvements after 2006 than before. That 2006 presents a clear break in a trend is evident for two of the four variables, procedures and cost, in models 10 to 13, the second set of models, which add a year trend. In all cases the coefficients are small, suggesting the effects are modest. It is important to interpret these findings in the context of the analysis as a whole. Given the unfavorable conditions for observing any clear break arising from the gradual introduction of the monitoring and rating scheme prior to the ranking, these results, combined with the evidence that specific countries are highly motivated by the rankings, plausibly support the argument that publicizing the rankings has contributed to reforms, and that efforts to improve have been more intense since the rankings were introduced.

Table 5. The effect of ranking on reductions in time, cost, and procedures for starting a business

Notes: All reform variables and economic variables are logged. All explanatory variables except year are lagged one year.

EDB Channels of Influence: Altering Information for Investors

Correlations that link ranking publicity to bureaucratic and policy reforms are one thing; causation involving specific mechanisms is another. We have argued that the World Bank promulgates ranking information to pressure states to conform to its favored policies. We have shown that governments pay attention to these rankings, that they have altered their bureaucracies strategically to enhance their performance on the rankings, and that media coverage suggests that competitive signaling to domestic constituencies and investors is one important reason. But does the information contained in the ratings themselves plausibly change important groups’ perceptions enough to encourage reform?

Governments have told us—repeatedly and in public—that ascending the EDB rankings will improve their countries’ ability to attract business investment. It is therefore tempting simply to run a regression to see whether improvements in the ratings do attract more capital, but this would not help to understand the effect of rankings per se because it is nearly impossible with observational data to separate the ranking effects from the underlying qualities that rankings purport to measure. In fact, economists have run such tests, and have shown that the EDB rankings are, as expected, highly predictive of inward FDI when included in standard models of foreign investment flows. These studies conflate the ranking information with the underlying business environment and assume they are the same thing. Corcoran and Gillanders, for example, assert that the EDB rankings are “a very objective measure of regulation,”58 and their study cannot—and was not designed to—separate ranking pressure from underlying characteristics of the regulatory environment. The ease-of-starting-a-business component of the rankings has also been used to predict new business start-ups, which is offered as evidence of the positive effects of “good governance,” but it could just as well entail a distinct ranking effect. The point is that we cannot distinguish these claims with such correlations.59 Critical legal research60 as well as statistical studies61 have warned of the methodological, substantive, and conceptual problems with relying on the EDB indicators for assessing the business environment. To accept any EDB-investment correlation at face value reinforces the common but potentially fallacious assumption that rankings are meaningful—an assumption that fuels their impact, but which is precisely the relationship scrutinized here.

A better way to explore the causal claim about the power of the ranking per se is with a survey experiment. The goal is to test whether the “false reductionism”62 of the EDB rankings affects how investors assess investment risks. No study to date—positive or critical—has shown that the rankings frame how investors think about risk. To do so we recruited 150 investment professionals, held constant the macroeconomic information for a hypothetical “emerging market economy” (based on India and using Indian macroeconomic information), and varied the EDB rankings as the treatment.63 We hypothesize that even when controlling for important economic and political conditions, information about EDB rankings will influence the willingness to recommend an investment in the ranked country. By changing investors’ information set, we are testing whether the upper right loop of the argument in Figure 1 causally alters investors’ perceptions and therefore plausibly provokes the “government concern” in the lower part of the figure within the confines of the experiment.

A perfect approximation of the real-world information environment that informs investment assessment is experimentally unattainable. We do not purport to estimate the EDB's impact on investments in the real world, but rather to show it is plausible that EDB rankings—which may or may not reflect a meaningful reality—prime investment attitudes. In actuality, investors confront a more crowded information environment than that in the experiment. But it is also clear that investors depend heavily on a few crucial economic indicators as well as other heuristics when making decisions.64 We included critical macroeconomic information and alerted investors that the hypothetical country was an “emerging market economy,” capturing enough of the salient features of investor decision making to glean some insights into the EDB's influence on investors’ assessments.

The panel of 150 investors was recruited by Qualtrics through a partnership with over twenty Golden Mean certified and actively managed online market research panel providers.65 Respondents were subjected to comprehension checks, asked to answer free-response questions, highly compensated for their time, and directly recruited by Qualtrics—which verified their status as industry professionals. To be clear: we do not claim to have recruited a “globally representative sample of investors”—which we would not begin to know how to define. The respondents are upper-middle-class investment professionals living across the United States. All participants had over five years of experience in the investment industry. About half had over twenty years of experience. Roughly half held high-level positions at their investment firms, such as senior director, managing director, vice president, partner, principal, or president/CEO. Investor strategies varied, with nearly half identifying as value investors and others identifying as macro, stock, bond, long/short, and activist investors. The average respondent was fifty years old; the oldest was seventy-eight and the youngest was twenty-six. Roughly three-quarters of respondents were male.

Portfolio managers made up three-quarters of the respondents, while the others worked in private equity, venture capital, bank lending, and other investment sectors. Portfolio managers are a hard test for EDB influence. Because they buy and sell securities of foreign firms that are already operating in difficult environments, they should be less sensitive to the EDB ranking than direct investors, for whom day-to-day business operations are a primary concern. Portfolio investment is nonetheless of significant concern to emerging-market states, since its rapid outflow can precipitate currency and financial crises, which makes the experiment all the more relevant to emerging economies. While we make no claim of representativeness, this panel is one of the few in international relations research to recruit relevant professionals rather than draw from students or the general population.

To avoid self-selection bias, recruitment did not involve any discussion of the survey contents. Respondents received a nontrivial incentive for their participation from Qualtrics or its market research partners, and the response rate was 32 percent. The survey asked respondents to consider an investment in an unnamed emerging-market country. To ensure findings are not an artifact of hypothetical conditions, we used India's true (announced) EDB goals and macroeconomic information—the exact information with which investors and the broad public have been “treated” in reality. Respondents were assigned to one of three groups: a control group and two treatment groups. Those in the Control Group (no EDB information) were given four macroeconomic facts about an unnamed country which, unknown to them, was based on India: real GDP growth: 7 percent; inflation rate: 6 percent; unemployment rate: 10 percent; per capita income: $6,000.66 Those in Treatment Group 1 were given these same four macroeconomic facts but were also told that the unnamed country had an EDB rank of 30, which in fact is Prime Minister Modi's target rank for India. Those in Treatment Group 2 were given the same four macroeconomic facts but were told that the unnamed country had an EDB rank of 130, which is India's pre-reform rank. Thus, both the panel recruits and the information they were provided are highly realistic, imbuing the survey with as much external validity as is possible in the inherent confines of an experimental setting. To minimize any possible framing effects, all information was presented simultaneously in randomized order in a list so that the EDB indicator received no undue attention.

Respondents were asked, all things equal and based only on the information they were given, how likely they would be to recommend investment in the unnamed country. Answers were scored on a seven-point Likert scale with 7 serving as the highest likelihood of recommending investing and 1 the lowest. Higher scores and positive coefficients reflect an increase in likelihood of investment. In addition to ordinary least squares (OLS), three other tests were used: a boot-strapped T-test, a nonparametric Wilcoxon rank sum test, and OLS including a series of controls such as investment industry, investment strategy, title, experience, and the respondent's assumption of where the country was located.
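One of the robustness checks named above, the bootstrapped test of the treatment-group difference in mean Likert scores, can be sketched in a few lines. The scores below are synthetic, chosen only to mimic a roughly one-point gap on a seven-point scale; they are not the authors' data.

```python
# Hedged sketch of a percentile-bootstrap test for a difference in mean
# Likert scores between two treatment groups. Synthetic data throughout.
import random

random.seed(42)

def bootstrap_diff_ci(a, b, reps=5000, alpha=0.05):
    """Percentile bootstrap confidence interval for mean(a) - mean(b)."""
    diffs = []
    for _ in range(reps):
        ra = [random.choice(a) for _ in a]      # resample each group with replacement
        rb = [random.choice(b) for _ in b]
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    lo = diffs[int((alpha / 2) * reps)]
    hi = diffs[int((1 - alpha / 2) * reps) - 1]
    return lo, hi

# synthetic 7-point Likert responses: rank-30 group vs. rank-130 group
rank30 = [6, 5, 6, 7, 5, 6, 4, 6, 5, 7, 6, 5, 6, 6, 5, 7, 4, 6, 5, 6]
rank130 = [4, 5, 4, 3, 5, 4, 5, 4, 3, 5, 4, 5, 4, 3, 4, 5, 4, 4, 5, 3]

observed = sum(rank30) / len(rank30) - sum(rank130) / len(rank130)
lo, hi = bootstrap_diff_ci(rank30, rank130)
# if the interval excludes zero, the gap is significant at the 5 percent level
```

The bootstrap makes no normality assumption, which is one reason to pair it with OLS and a nonparametric rank test when the outcome is an ordinal Likert score.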

The EDB ranking significantly affected investors’ expressed intent to recommend investment. Relative to respondents who were told the unnamed country had an EDB rank of 130 (Treatment 2), those told it had a rank of 30 (Treatment 1) said on average that they would be far likelier to recommend investment (by more than one full point on a seven-point scale; or roughly 19% more likely). This finding was significant across all four statistical tests at p < .01 (Table 6).

Table 6. Experimental results of ranking differences on investment likelihood

Notes: A positive coefficient entails a higher Likert score and a greater willingness to invest in the first group relative to the second. * p < .1; ** p < .05; *** p < .01.

1 Industry, strategy, title, experience, assumed region.

Relative to those told the unnamed country had an EDB Rank of 130, those in the control group said they would be much more likely to recommend investment (by .97 points on a seven-point scale, or roughly 18% more likely).67 This was significant at p < .01 across all four tests and suggests politicians may be right to fear that a poor EDB ranking could reduce investment. Within this experiment, a higher EDB rank induced greater investment enthusiasm than a lower rank, and a low rank significantly depressed willingness to recommend investment relative to no EDB ranking information at all.68

In a free-response section after the survey, many respondents who received the EDB rank of 30 noted its influence on their investment recommendation. One respondent wrote: “While real GDP growth is substantial, the high unemployment rate is of some concern … [and] already high inflation could get worse … Ease of doing business certainly helps however.” Another thought EDB helped mitigate uncertainty: “while there are risks … it is comparatively easy to do business.” One investor even noted that the country was a “great growth opportunity” because of its “low economic barriers” as indicated by the EDB ranking. Conversely, those who received the low ranking of 130 also suggested it guided their decisions negatively. One respondent argued, “While the GDP growth and income numbers suggest potential, the unemployment rate and poor ease of doing business rank indicate some structural issues with the country and its governance” (italics added). Finally, the survey also revealed that 40 percent of our sampled investors have consulted EDB rankings or reports, which makes even more plausible the idea that EDB could shape investor behavior.69

Finally, compared with those who received a rank of 130, respondents who received an EDB rank of 30 rated the government (on seven-point Likert scales) as more competent, less corrupt, more likely to attract competing investment, and less likely to discriminate between foreign and domestic investors. While these results do not quite achieve statistical significance, they tell a consistent story. They also suggest that the EDB may appeal to governments as an “easier” way to attract capital than a far-reaching anticorruption campaign or an expensive infrastructure program, as the case of India shows.

The conclusions drawn from this experiment are significant but limited. Emphatically, they do not prove that high EDB rankings actually increase investment. They do show that, at least within the experiment, the EDB ranking is an important piece of information that can frame how investors assess investment risk. The framing effect in this experiment illustrates precisely the hypothesized causal mechanism. One might be concerned that surveyed investors are simply expressing a pre-existing belief that high rankings represent genuinely better business conditions. This may be so, but it is not inconsistent with the proposed mechanism that ranking information influences investor thinking; such a belief suggests merely that investors find the rankings useful, perhaps even credible. We are agnostic about what the EDB “really” captures (though we have cited several skeptical studies). In the hands of the World Bank, such ratings induce pressure—based on officials’ beliefs about investors’ beliefs informed by the ranking—to adopt the reforms that will boost their state's rankings.70

Tracing Influence Channels in Modern India

Now that we have assembled evidence that the EDB has shaped state behavior and suggested that it can frame the attitudes of investors and other salient groups, our case study explores these mechanisms in the case of India through Indian Hindi and English-language media, primary sources, and reports on leaked documents related to India's EDB ambitions. It shows that the EDB indicator has created social pressure by encouraging political actors to believe their status is at stake and in competition with other countries and even other Indian states. That competition is heightened because of the perception that investment and political support are at stake. Indian officials seek to improve their EDB rankings because they believe it will win votes, secure investment, and improve official reputations—and they organize major interagency efforts to ascend the rankings. The case ties together much of the preceding evidence, demonstrating that changes in informational framing spark concerns about current and future rankings as well as incentivize state reform behavior.

India provides an important—though not an obvious—case to explore these channels. It is significant for its sheer size. If the EDB Index influences policy in the fifth largest economy in the world, the “average” effects described in the observational analysis are even more important. The research is based exclusively on independent (non-World Bank) evidence. It demonstrates that Narendra Modi's reformist government has made climbing the EDB ranks a central feature of his government's agenda. The effort has been mentioned in party platforms, is explicitly coordinated through interagency mechanisms, and is implemented in part through local governments by using subnational rankings to stimulate competition, embarrass opponents, and reward supporters. Together, the contextualized Indian evidence strongly suggests that Indian political figures and bureaucrats anticipate and react to the EDB pressures transmitted through political, investment, and bureaucratic channels. It also demonstrates strategic behavior on the part of Modi's government, with EDB-related reforms undertaken in large part for their value in lifting India's ranking.


Narendra Modi began to focus on the EDB Index late in his 2013 campaign for prime minister. Emphasizing the business-friendly roots of his political party, the Bharatiya Janata Party (BJP), Modi blamed India's poor rating on the ruling Congress Party and promised to improve the ranking. The BJP implicitly included EDB Index improvement in its 2014 party platform when it promised “making ‘doing business’ in India easy.”71

Not long after Modi assumed power with the largest parliamentary majority in decades (2014), he announced the “Make in India” program, a set of policies intended to attract investment and transform India into a manufacturing powerhouse. The EDB Index was central to this new campaign. It was linked to manufacturing and investment within the BJP policy platform, and in subsequent official policy. In fact, Modi first formally announced his EDB initiative in a major national speech launching the Make in India Campaign. The effort to improve India's EDB ranking is integral to the country's most visible domestic economic program and is a signature Modi initiative.

Modi has always been clear that his EDB-related reforms were not about improving microeconomic incentives but about signaling a welcoming investment climate through a higher EDB ranking. In his speech announcing his EDB effort, Modi declared that “industrialists don't come due to some fancy incentive scheme. One can say you will get this or that we will make this tax free or that tax free. Incentives don't work.”72 Instead, “the investor first wants the security of his investment. Growth and profit come later,” Modi argued. For that reason, India needed to send a signal to investors that “your money will not sink.” The EDB initiative was part of that signaling effort and Modi committed his entire team in government to improve India's ranking from 130 to 50, and then later to 30. While the reforms adopted may well have economic benefits that ordinarily could explain their adoption, they were undertaken for symbolic rather than economic value. The prime minister's words and behavior reveal a belief that rankings matter more than economic incentives—they improve India's reputation, and thereby attract investment. This viewpoint also appears in leaked documents related to India's EDB efforts. Two months after Modi was elected, Principal Secretary Nripendra Misra held a high-level meeting to discuss a “concrete strategy for moving up India's rank” on the EDB ranking, with one senior economist later summarizing the meeting in a leaked memo as “focused on the more immediate and quickly doable process improvements.”73 The fact that the EDB effort began by focusing on easier improvements is fully consistent with the argument that countries pursue those reforms that offer the biggest “bang for the buck.”

Coordinated Efforts to Improve India's EDB Ranking

Modi followed up his 2014 announcement of an EDB initiative with a wide-ranging, interagency coordinated effort to improve the country's ranking. India's most powerful bureaucrat, the cabinet secretary, called high-profile meetings of senior officials to discuss how to improve India's ranking.74 These efforts were coordinated through the Department of Industrial Policy and Promotion (DIPP), which was tasked with leading Modi's “Make in India” campaign and coordinating state-level reforms. Roughly a month after Modi announced the initiative, DIPP published a report with forty-six policy proposals across several government ministries hewing almost precisely to the bank's subindicators and intended to improve India's ranking. The Indian government has adopted many of these reforms, including reducing the number of days it takes to register a business from twenty-seven to one; simplifying application forms for industrial licenses; placing license applications online; exempting several businesses from licensing requirements; extending the validity of licenses; raising FDI caps in several industries; introducing a new regulatory reform law; simplifying import-export documentation; and abolishing the Soviet-style planning commission. At the subnational level, in December 2014, the DIPP sponsored a meeting of central and local governments where state leaders committed to a ninety-eight-point action plan to improve EDB at the local level.75 DIPP also created a list of 344 recommendations for state-level governments76 and organized meetings through which states were to share their best practices.77 Regardless of whether these reforms have economic benefits, DIPP generally discusses them as ways of improving India's ranking.

The Modi government—convinced of the very EDB influence channels we identify—even chose to reproduce the international EDB competition domestically among Indian states. In concert with the World Bank, the central government created its own state-level EDB indicator to score India's states on their compliance with the ninety-eight-point action plan and publicly praise or criticize them for their performance. In one report, seven Indian states led by the BJP made the top ten, suggesting either party-line cooperation or efforts to reward political allies through the ranking. These rankings were then used as framing devices in domestic politics, instruments to attract state investment, and as proxies for bureaucratic competence. For example, during a visit to BJP-governed Jharkhand, Modi praised its leaders for working hard to improve their EDB ranking.78 In advance of critical elections in Bihar that would determine the balance of power in India's upper house of parliament, Modi's finance minister attacked Nitish Kumar, the chief minister of Bihar, for his state's low EDB ranking in 2015: “Nitish says let us debate the development issue. What is there to debate? This debate is over. Gujarat [the state Modi previously managed] is number one and Bihar stands at twenty-one [on EDB]. The economy speaks through statistics and not through debate.”79

Since Modi took office, the World Bank has supported India's attempts to climb the EDB Index, and publicly praised the government for its ambition. The bank explicitly recognizes that its rankings shape Indian politics. It offered an explanation for why India's ranking ascended so little in Modi's first year that absolved him of responsibility; praised him regularly for his cooperation with the bank; and even sent the World Bank CEO to attend Modi's celebratory address on India's thirty-place climb and first-ever entry into the top one hundred ranks.80 Attesting to the importance of the rankings, Indian officials have also actively lobbied the World Bank's Doing Business team to improve their scores.81 As one senior government official involved in those meetings noted, “We listed a host of measures we have taken to cut red tape and improve business environment in the country. We are confident of seeing a substantial improvement in our ranking this year.”82

EDB Index Channels of Influence

The India case illustrates several of the influence channels described in the symposium introduction. First, it demonstrates domestic political channels. Indian politicians acted as if they thought the Indian public might be sensitive to the EDB Index's status implications. Indeed, when Modi was an opposition politician, his party used the country's low ranking to shame the incumbent government. Modi himself campaigned on the promise of making improvements. In office, he is making better rankings a priority.83 Despite criticism, he has doubled down on his commitment to improve India's ranking and has, if anything, scaled up his ambitions by setting a new goal to rank in the top thirty. Importantly, Modi made this commitment credible by promising to achieve a high target rank before the next election, allowing voters to punish him for failure. He has hitched his domestic political reputation to the rankings—not to a specific growth figure or a poverty-reduction goal. His own public commitments—and the bank's efforts to avoid embarrassing him—suggest the ranking competition is a significant driver of Indian policy.

Even as India's growth slowed to a three-year low ahead of the next national elections, Modi leaned on India's thirty-place climb in EDB rankings to demonstrate he had improved India's status and its economy.84 He gave a major address dedicated to India's climb in the rankings and trumpeted India's success relative to other countries: “This year, India's jump in ranking is the highest [of all countries]. India has been identified as one of the top reformers … [and] may become an example for many other nations.” To make EDB even more politically salient, Modi linked it to the “life of a common man” by recasting it as the “Ease of Living Life” indicator. He also leveraged the rating to shame his opposition:

Had these kind of reforms … been carried out during [the opposition's] tenure, then our ranking would have improved much earlier. And the credit for improvement in ranking would have gone to them … they did nothing and have had been raising questions about someone who has been doing something. It's just a coincidence that the World Bank started the process of releasing the ease of doing business ranking in 2004. It's an important year. And all of you know who was in the government [the Congress Party] since then till 2014.85

One of India's major newspapers, The Indian Express, wrote that India's thirty-rank increase “comes as a shot in the arm for the Narendra Modi government amid dissenting voices in certain quarters about implementation of the Goods and Services Tax (GST) as well as demonetization,” two policies that had called into question Modi's economic credentials.86 For that reason, Modi's chief political opponent, Rahul Gandhi, had made attacking Modi's EDB gains a part of his stump speech, stating that Modi's team “listens to outsiders” and should instead ask the Indian people “whether ease of doing business has improved for them … What is spoken abroad is truth for this government but what the poor say in India is farce.”87 These statements together demonstrate that India's leading politicians act as if EDB shapes domestic politics and recognize the high ranking as enhancing status for India.

Second, the case demonstrates that political figures in emerging markets act as if they believe the rankings will affect investment levels, which complements the experimental evidence. Modi repeatedly states the belief that EDB affects investment levels. A review of all Modi's foreign addresses establishes that he has broadcast his ambitions on the EDB on virtually every foreign trip for three straight years, including addresses before Davos and the G20 as well as to audiences in capital-rich countries like the United States, China, France, Germany, Japan, Saudi Arabia, South Korea, and the United Kingdom, among others.88 Modi even created a joint “Ease of Doing Business Group” with the United States during the first US-India Strategic and Commercial Dialogue—another signal to the global community that India is a secure and easy place to do business—and has repeatedly declared his belief that EDB efforts have helped attract record investment.89

Third, the case demonstrates the importance of bureaucratic channels. Modi embedded the EDB effort in the national bureaucracy, created interagency structures to improve the ranking, and tied its success or failure at the national level to specific officials. At the state level, he launched a subnational ranking mechanism and used state EDB rankings to praise reformers and shame laggards, triggering reputational mechanisms among local Indian politicians and bureaucrats. He has publicly acknowledged these mechanisms and declared that under his subnational ranking system states are “often competing with each other in implementing business reforms,” which will help the country's overall ranking.90

The channels discussed in the symposium introduction are on full display in India. Modi would have been a reformer regardless, but he latched onto the EDB Index as a tool because he believes that domestic political actors, foreign investment communities, and professional bureaucrats care about the index too. The index became influential in giving content to his reform ambitions. He even encouraged his subordinates to pursue EDB-tailored reforms. Thus, the EDB Report and rankings are clearly shaping the policy response in one of the world's largest and fastest growing economies.


GPI creators aim not only to call attention to their issue and set standards of appropriate behavior; most hope to change policy outputs and—ultimately—outcomes. By relying on multiple forms of data, we have presented considerable evidence that the World Bank's EDB Index motivates reforms, perhaps even above and beyond those one might expect from consulting with or borrowing from the World Bank alone. One interviewee in the investment consulting industry exclaimed unprompted that the EDB Index was one of the most effective things the World Bank had ever done.91

The news is good for those who support the contents of the EDB Index and want to use it as a model in other areas.92 For those who believe the EDB Index is flawed, its influence is cause for concern. Despite episodes of pressure from countries such as China and organizations such as the ILO to withdraw the rankings or alter the criteria, the bank has continued to rank states because it believes the index is indeed an effective tool.93 Both the criticism of the EDB Index and the bank's refusal to drop it presume that EDB rankings have an effect—for good or for ill—on reform policy. Ours is the first study to systematically document the major influence channels that connect an annually published rank-ordered list of countries with powerful policy trends and consequential shifts in state behavior.

The most important message of this research is what it says about new ways to capture governance spaces and exert social pressure by using ingenious forms of communication. GPIs are communication strategies to draw attention to issues, and to define problems and offer solutions using extreme forms of simplification. As such, they are an international counterpart to “nudge” tactics much touted by behavioral economists and psychologists as ways to shift human behavior in desired ways.94 Actors who try to create competitive dynamics and other forms of social pressure through ranking systems know that they oversimplify reality, strip concepts of their context and history, and offer a false sense of precision and certainty.95 But the point of ranking systems is to change behavior, not to faithfully render reality. The ILO has understood this point very well and has been a strong proponent of keeping the labor flexibility measures out of the bank's overall EDB Index, while countries like Saudi Arabia have balked at the recent addition of gender components.

This deep dive into the World Bank indicator has important high-altitude implications for global information politics and governance. It reminds us that information is not neutral but is an important power resource. The World Bank has used the EDB Index to consolidate its authority to address not just development lending but business regulation as well. Arguably, the case of the World Bank's EDB Index suggests that the cumulative effect of widespread comparative quantification is to reinforce global power structures.96 That said, there is some evidence that alternative power centers—notably China—understand the game. China will soon launch a few new rankings of its own, and its Asian Infrastructure Investment Bank may eventually be as much an opportunity to offer alternative scorecards for states as it will be a resource for finance.

This study helps explain the influence of rankings in international relations. Combined, the evidence goes beyond the standard scrutiny of the validity of the EDB data to show that rankings stimulate competitive dynamics with policy consequences. These findings invite examination of related questions. For example, is it wise to pursue complex policies of deregulation by deploying simple heuristics, such as ranking systems? Do states regularly game such systems to improve their scores rather than select the most appropriate policies?97 Who gains “authority” to rank, and why? Is it fair that a few actors worldwide can use first-mover advantage and other strategic positions to set standards over which states are then pressured to compete? How should the use of GPIs as tools of governance themselves be governed—purely by the marketplace of ideas? These and other questions need answers if we are to understand the full range of normative issues associated with the power of assessments in global governance.

1. Djankov et al. 2005. From 2001 to 2005 the bank did not rank. Data that would eventually form the basis of the rankings were first published in the fall of 2001 on the bank's website.

2. Steve H. Hanke, “Singapore Leads the Way in Doing Business,” Cato Institute, 19 September 2013, <>.

3. Nugent 2013. Unlike much of the socialization and social pressure literature in international relations (relating to human rights, for example), this definition does not take a position on whether social pressure is used for objectively good purposes. Our definition of social pressure can also be used as a synonym for peer pressure.

4. “Doing Business—About Us,” Doing Business 2019, World Bank <>.

5. Kelley and Simmons 2019.

6. Zürn 2018.

7. Sinclair 2008.

8. See “Doing Business—Reform Count,” retrieved from <>.

9. Hansen and Mühlen-Schulte 2012; Robson 1992.

10. Kelley 2017; Kelley and Simmons 2015.

11. World Bank Group 2008.

12. Dai 2007.

13. This claim is tested in the investor survey experiment we describe later.

14. Jayasuriya 2011. Media analysis speaks directly to this claim. Some scholars argue that states use the EDB rankings specifically as a form of “competitive signaling” to investors and other stakeholders. Appel and Orenstein 2018.

15. We cast this argument in terms of investors and markets, but for some countries, EDB indicators are more important to access nonmarket development aid: EDB subindicators are used in awarding Millennium Challenge Corporation (MCC) funding. See “Guide to the Indicators and the Selection Process, FY 2015,” Millennium Challenge Corporation, retrieved from <>. See also “Business Start-Up Indicator,” 2007, Millennium Challenge Corporation, retrieved from <>. The MCC entered into operation in 2004, before the bank started ranking.

16. Corcoran and Gillanders 2015; Klapper, Amit, and Guillén 2010.

17. Kelley 2017.

18. Evidence analyzed later on reform committees, bureaucratic statements to the press, and the robust interagency process and subnational competition underway in India's EDB reform effort all suggest that the perception of bureaucratic competency is often at stake.

19. DeMarzo 1992; Kahler 1994.

20. Kelley 2004.

21. The bank's legal mandate is discussed in “IBRD Articles of Agreement: Article I,” World Bank, 27 June 2012, <>.

22. For broader trends see “International Standard Cost Model Manual: Measuring and Reducing Administrative Burdens for Business,” 2004, OECD <>. For the EU, see Boheim 2006.

23. See the papers posted on the Doing Business website's methodology page at <>. See especially Djankov et al. 2002 which describes barriers to setting up businesses around the world and has been cited more than 3,000 times.

24. Independent Evaluation Group 2008.

25. “About Doing Business,” from the Wayback Machine, World Bank Group, 2002. <>.

26. Independent Evaluation Group 2008.

27. Ibid.

28. Accessed via the Berkman Center, Harvard University. See “Media Cloud,” Berkman Klein Center for Internet and Society, 2019 <>.

29. Hallward-Driemeier and Pritchett 2011.

30. Josh Zumbrun and Ian Talley, “World Bank Unfairly Influenced Its Own Competitiveness Rankings,” The Wall Street Journal, 12 January 2018; “Paul Romer Quits After an Embarrassing Row,” The Economist, 25 January 2018, retrieved from <>.

31. See the critique of the International Confederation of Free Trade Unions (ICFTU) by Bakvis 2007.

32. See, for example, in the case of India, Manju Menon and Kanchi Kohli, “Is Ease of Doing Business Undermining Green Norms?” DNA India, 14 November 2017, retrieved from <>.

33. Djankov et al. 2005. From 2001 to 2005 the bank did not rank. Data that would eventually form the basis of the rankings were first published in the fall of 2001 on the bank's website.

34. For example, days to enforce a contract, and cost of contract enforcement as a share of the total claim. There are just a few exceptions, such as the “quality of judicial processes index” which is a subindicator under “enforcing contracts.” See “Doing Business–Enforcing Contracts,” World Bank 2019, retrieved from <>.

35. The bank maintains a database on labor protections, but does not rank states in this area and does not combine labor and business regulations for a composite score.

36. “Ease of Doing Business Index Databank,” World Bank, 2018, retrieved from <>.

37. Schueth 2011.

38. World Bank Group 2009.

39. World Bank Group 2008.

40. World Bank Group 2009.

41. World Bank Group 2007.

42. Discussed in detail later, relying only on non-bank sources.

43. A complete file of all the quotes and sources is available online in the appendix.

44. See World Bank, “Doing Business—Reform Count.” In reality, even more reforms occurred, because multiple reforms within a given indicator in a given year are counted as just one reform.

45. World Bank Group 2008.

46. World Bank Group 2009, 55–56.

47. World Bank Group 2014.

48. The appendix lists the countries by region.

49. See Appendix Table A1.

50. See Table A3 in the appendix. Using normalized rankings instead results in a drop of 5 and an increase of 10, but the general picture is the same.

51. The indicators and methodology are explained at length online. See “Doing Business—Starting a Business Methodology” at <>.

52. This coding is discussed in the appendix.

53. While it is hard to know whether such reforms are more or less appropriate than those made without the bank's close guidance and without rankings in mind, an analogy to the phenomenon of teaching (and learning) to the test is potentially helpful. The literature is voluminous, especially in the wake of No Child Left Behind policies of the 2000s. See, for example, Jensen et al., n.d.; Menken 2006. Much of this literature suggests that overreliance on standardized tests shifts resources and is associated with more superficial learning. While we are agnostic about the quality of EDB-inspired reforms, we use this analogy to understand the motivation for making them in the first place. Although we have documented the controversy over the validity of the EDB criteria, assessment of reform quality is beyond our scope here.

54. “Doing Business–Starting a Business Methodology.” Djankov et al. 2002.

55. GDP per capita, GDP growth, democracy, population size, and international or civil conflict rarely correlate significantly with reform and tend not to predict selection into the sample in 2001. Not even the total volume of loans to a country predicts either selection into the original group of rated states or improved business-reform measures. See Table A1 in the appendix.

56. To provide a comparable time series for research, the World Bank back-calculates to adjust for changes in methodology, but these corrections have been made only since the 2003 data (in the 2004 report). Therefore, if the 2001 and 2002 data were affected, the biggest methodology-induced drop will occur between 2002 and 2003, which is a year before rankings existed. This would bias the findings against our hypothesis, because it would make a preranking year appear to have large improvements.

57. It is not feasible to explore similar trends for the other two variables because they are based on GDP and therefore display minor absolute changes even when a country takes no action, simply as a result of the change in GDP that inevitably occurs in any given year.

58. Corcoran and Gillanders 2015, 105.

59. Klapper, Amit, and Guillén 2010.

60. Michaels 2009.

61. Pinheiro-Alves and Zambujal-Oliveira 2012.

62. Michaels 2009, 794–95.

63. This experiment was preregistered with <>.

64. For an in-depth exploration, see Mosley 2000. Sometimes these heuristics are surprisingly unrelated to economic fundamentals. See, for example, Gray 2013.

65. It was pretested on “mTurk Masters.” Treatment effects were present for both experiments, but were stronger and more significant for investors, who are more familiar with investment decision making.

66. To check whether people were guessing that this was India, we asked participants to later identify which region they thought the country was in and no clear pattern emerged. This variable was also used as a control.

67. We also asked respondents what their preferred return would be for this investment. Most respondents complained that this question was too difficult to answer. Consequently, answers to it exhibited a wide dispersion and no significant differences among groups.

68. We included one question to ascertain whether the issuer of a hypothetical ranking might affect the willingness of participants to use that ranking. We asked respondents to imagine that four different organizations—the World Bank, the Economist magazine, the Heritage Foundation, and the Brookings Institution—all published a fictitious “Global Competitiveness Ranking” (GCR) and then asked which of these organizations' GCR they would be most likely to use. We found that 44 percent chose the World Bank's GCR and 33 percent chose the Economist’s, with the remainder split between Brookings and Heritage.

69. We acknowledge that self-reporting bias may inflate this number.

70. In a separate experiment described in the appendix, we found similar ranking effects on the part of the Indian general public, illustrating the potential for a “domestic constituency effect” described in the symposium introduction. Members of the general public typically have almost no sense of how the business environment at home compares with that elsewhere. The EDB may be one of the few ways these groups can come to learn about whether or not it is reasonable to expect one's own government to do a lot better than it has done to date. In this separate experiment conducted with Indian citizens, information about India's EDB rank of 130 was held constant, but the rank of China—a status competitor for India—was manipulated. A high Chinese ranking was found to stimulate competitive expectations and increase the importance Indians attached to a high EDB ranking and the priority they placed on a better business climate. The experiment offers another plausible way the bank leverages its ability to apply social pressure to conform. See the appendix for a description of the experiment and findings.

71. “BJP Election Manifesto 2014: Ek Bharat Shreshthah Bharat,” from the Wayback Machine, BJP <>.

72. Modi 2014.

73. Akshay Deshmane, “How Modi and Jaitley Gamed the World Bank's Doing Business Rankings,” Huffington Post India, 20 November 2018, retrieved from <>.

74. “DIPP Suggests Steps to Improve Business Climate,” The Hindu Business Line, 22 October 2014, retrieved from <>.

75. Sai Nidhi, “Ease of Doing Business: Here's All You Need to Know About the Top Ten States,” Daily News and Analysis, 16 September 2015, retrieved from <>.

76. “Business Reform Action Plan 2016 for States/UTs,” Department of Industrial Policy and Promotion: Ministry of Commerce and Industry, Government of India, 2015, retrieved from <>.

77. Ruchika Chitravanshi, “States Share Best Ideas to Lift India's Global Ease-of-Doing Business Ranking,” The Economic Times, 19 October 2015, retrieved from <>.

78. Modi 2015b.

79. “Arun Jaitley Mocks Nitish Kumar on Development; Says Gujarat Number One, Bihar at Twenty-one,” The Economic Times, 17 September 2015, retrieved from <>.

80. “World Bank ‘Ease of Doing Business’ Report Doesn't Factor in Modi Government's Reforms: BJP,” The Economic Times, 30 October 2014, retrieved from <>. These actions attest to the bank's willingness not only to engage in social pressure through rankings but also to engage in related strategies of back-patting for favorite pupils.

81. “DIPP Urges World Bank to Upgrade India's Ease of Doing Business Ranking,” The Economic Times, 9 June 2015, retrieved from <>.

82. Ibid.

83. Modi 2015a.

84. Modi 2017.

85. Ibid.

86. “This Is What Helped India Go Up in World Bank Rankings in ‘Ease of Doing Business,’” The Indian Express, 31 October 2017, retrieved from <>.

87. “Ease of Doing Business: Rahul Gandhi, Arun Jaitley Take Potshots at Each Other Over India's Ranking,” Hindustan Times, 1 November 2017, retrieved from <>.

88. For example, see Modi 2015c; “Full Text of Prime Minister Narendra Modi's Speech at the India-China Business Forum in Shanghai,” NDTV, 16 May 2015, retrieved from <>.

89. Modi 2017.

90. Ibid.

91. Anonymous interview, August 2014.

92. Independent Evaluation Group 2008.

93. In a 2013 formal review Indian officials expressed that they were unhappy with India's rankings. The group discussed tensions over the rankings and once again recommended that they be removed. The bank ignored the recommendation. “Stand Up for ‘Doing Business,’” The Economist, 25 May 2013, retrieved from <>; Independent Evaluation Group 2008.

94. Thaler and Sunstein 2008.

95. Merry 2011.

96. Löwenheim 2008.

97. While countries often start with easier, more actionable, reforms, we explored gaming in several ways, but found no systematic evidence for it.

Supplementary Material

Supplementary material for this article is available at <>.


Appel, Hilary, and Orenstein, Mitchell A.. 2018. From Triumph to Crisis: Neoliberal Economic Reform in Postcommunist Countries. Cambridge University Press.
Bakvis, Peter. 2007. How the World Bank and IMF Use the Doing Business Report to Promote Labour Market Deregulation in Developing Countries. PB 15-06-06. International Confederation of Free Trade Unions. Available at <>.
Boheim, Michael. 2006. Pilot Project on Administrative Burdens. European Commission.
Corcoran, Adrian, and Gillanders, Robert. 2015. Foreign Direct Investment and the Ease of Doing Business. Review of World Economics 151 (1):103–26.
Dai, Xinyuan. 2007. International Institutions and National Policy. Cambridge University Press.
DeMarzo, Peter. 1992. Coalitions, Leadership, and Social Norms: The Power of Suggestion in Games. Games and Economic Behavior 4 (1):72–100.
Djankov, Simeon, La Porta, Rafael, Lopez-de-Silanes, Florencio, and Shleifer, Andrei. 2002. The Regulation of Entry. Quarterly Journal of Economics 117 (1):1–37.
Djankov, Simeon, Manraj, Darshini, McLiesh, Caralee, and Ramalho, Rita. 2005. Doing Business Indicators: Why Aggregate and How to Do It. Doing Business Project. World Bank Group. <>.
Doing Business 2015: Going Beyond Efficiency. 2014. The World Bank.
Gray, Julia. 2013. The Company States Keep: International Economic Organization and Sovereign Risk in Emerging Markets. Cambridge University Press.
Hallward-Driemeier, Mary, and Pritchett, Lant. 2011. How Business Is Done and the “Doing Business” Indicators: The Investment Climate When Firms Have Climate Control. World Bank Policy Research Working Paper Series, no. 5563. World Bank.
Hansen, Hans Krause, and Mühlen-Schulte, Arthur. 2012. The Power of Numbers in Global Governance. Journal of International Relations and Development 15 (4):455–65.
Independent Evaluation Group. 2008. An Independent Evaluation: Taking the Measure of the World Bank IFC Doing Business Indicators. World Bank. Available at <>.
Jayasuriya, Dinuk. 2011. Improvements in the World Bank's Ease of Doing Business Rankings: Do They Translate into Greater Foreign Direct Investment Inflows? World Bank Policy Research Working Paper no. 5787. World Bank.
Jensen, Jamie L., McDaniel, Mark A., Woodard, Steven M., and Kummer, Tyler A.. n.d. Teaching to the Test … or Testing to Teach: Exams Requiring Higher Order Thinking Skills Encourage Greater Conceptual Understanding. Educational Psychology Review 26 (2):307–29.
Kahler, Miles. 1994. External Influence, Conditionality, and the Politics of Adjustment. In Voting for Reform: Democracy, Political Liberalization, and Economic Adjustment, edited by Haggard, Stephan and Webb, Steven Benjamin, 89–136. Oxford University Press.
Kelley, Judith G. 2004. Ethnic Politics in Europe: The Power of Norms and Incentives. Princeton University Press.
Kelley, Judith G. 2017. Scorecard Diplomacy: Grading States to Influence Their Reputation and Behavior. Cambridge University Press.
Kelley, Judith G., and Simmons, Beth A.. 2015. Politics by Number: Indicators as Social Pressure in International Relations. American Journal of Political Science 59 (1):55–70.
Kelley, Judith G., and Simmons, Beth A.. 2019. The Power of Global Performance Indicators. International Organization 73 (3). <>.
Klapper, Leora, Amit, Raphael, and Guillén, Mauro F.. 2010. Entrepreneurship and Firm Formation Across Countries. In International Differences in Entrepreneurship, edited by Lerner, Joshua and Schoar, Antoinette, 129–58. University of Chicago Press.
Löwenheim, Oded. 2008. Examining the State: A Foucauldian Perspective on International “Governance Indicators.” Third World Quarterly 29 (2):255–74.
Menken, Kate. 2006. Teaching to the Test: How No Child Left Behind Impacts Language Policy, Curriculum, and Instruction for English Language Learners. Bilingual Research Journal 30 (2):521–46.
Merry, Sally Engle. 2011. Measuring the World: Indicators, Human Rights, and Global Governance: With CA Comment by John M. Conley. Current Anthropology 52 (3):83–95.
Michaels, Ralf. 2009. Legal Origins Thesis, Doing Business Reports, and the Silence of Traditional Comparative Law. American Journal of Comparative Law 57 (4):765–95.
Modi, Narendra. 2014. Text of Prime Minister Shri Narendra Modi's Address at the Launch of “Make in India” Global Initiative. 25 September. <>.
Modi, Narendra. 2015a. Text of PM's Letter to the People on Economic Issues. Bharatiya Janata Party. <>.
Modi, Narendra. 2015b. Through Mudra Yojana We Want to Accelerate Development Process in India: PM at Inauguration of Mega Credit Camp in Jharkhand. Bharatiya Janata Party. Available at <>.
Modi, Narendra. 2015c. Text of PM's Statement at India-Republic of Korea CEOs Forum. Bharatiya Janata Party. 19 May. Available at <>.
Modi, Narendra. 2017. PM Modi Attends Programme on “Ease of Doing Business.” New Delhi, 4 November. <>.
Mosley, Layna. 2000. Room to Move: International Financial Markets and National Welfare States. International Organization 54 (4):737–73.
Nugent, Pam. 2013. Social Pressure. Psychology Dictionary. 13 April. Available at <>.
Pinheiro-Alves, Ricardo, and Zambujal-Oliveira, João. 2012. The Ease of Doing Business Index as a Tool for Investment Location Decisions. Economics Letters 117 (1):66–70.
Robson, Keith. 1992. Accounting Numbers as “Inscription”: Action at a Distance and the Development of Accounting. Accounting, Organizations, and Society 17 (7):685–708.
Schueth, Sam. 2011. Assembling International Competitiveness: The Republic of Georgia, USAID, and the Doing Business Project. Economic Geography 87 (1):51–77.
Sinclair, Timothy J. 2008. The New Masters of Capital: American Bond Rating Agencies and the Politics of Creditworthiness. Cornell University Press.
Thaler, Richard H., and Sunstein, Cass R.. 2008. Nudge: Improving Decisions about Health, Wealth, and Happiness. Yale University Press.
World Bank Group. 2007. Celebrating Reform 2007: Doing Business Case Studies. World Bank.
World Bank Group. 2008. Celebrating Reform 2008: Doing Business Case Studies. World Bank.
World Bank Group. 2009. Celebrating Reform 2009: Doing Business Case Studies. World Bank.
Zürn, Michael. 2018. A Theory of Global Governance: Authority, Legitimacy, and Contestation. Oxford University Press.


We thank Andrew Heiss for outstanding assistance with the empirical analysis and graphics. We also thank all the participants at the May 2016 workshop on Assessment Power in World Politics at Harvard University; participants in colloquia at the Center for Advanced Study, Princeton; International Relations Department, London School of Economics; Legal Studies and Business Ethics Department of the Wharton School of Business; and the Department of Politics and IR, Oxford University. For detailed feedback, special thanks to Indermit Gill, Philip Keefer, Christopher Lucas, Eddy Malesky, Richard Messick, Rita Ramalho, Dani Rodrik, Sylvia Solf, Anton Strezhnev, and Jonas Tallberg.