I came to Wall Street in 1970 while still finishing my doctoral dissertation for Cambridge University because I knew what I did not want to do. During the summers of 1965 and 1966, on either side of my first year at Cambridge, I had served as an intern for the Senate Finance Committee as LBJ’s triumphant Great Society Administration began its catastrophic dissolution. I emerged from the experience permanently immunized against Potomac Fever.
In 1968 and 1969, I had worked nine-to-five in the Public Record Office in London’s Chancery Lane, conducting primary research on the economic policies of the Labour government of 1929–1931. I expected that I would return from Cambridge to pursue an academic career in economics in the United States, so the following Christmas vacation I interviewed my way from my alma mater, Princeton, by way of Yale, to Harvard and MIT just before the academic job market was submerged by the deluge of graduate students whose scholarly ambitions had been intensified by the Vietnam War. In those slack market conditions, there was an opportunity available in each school’s Economics department. The results of the interviews were uniformly positive, but each offer of employment came with a common curse that expressed itself in the suggestion that I might be more comfortable in a department of Politics or Government or History, rather than Economics.
The discipline of economics was then accelerating its transition to formal methods, mathematical models and quantitative techniques. Practitioners who did not deploy the toolkit, and topics that did not lend themselves to quantitative analysis and mathematical rigor, were being nudged to the sidelines. In 1994, Paul Krugman, meditating on the marginalization of the great development economist Albert Hirschman, recalled how maps of Africa evolved beginning in the fifteenth century, when distances and coastlines were inaccurate but the interior was rich in details, some real (the great city of Timbuktu), some imaginary (“men with mouths in their stomachs”):
Over time, the art of map-making and the quality of information used to make maps got steadily better. The coastline of Africa was first explored, then plotted with growing accuracy, and by the 18th century that coastline was shown in a manner essentially indistinguishable from that of modern maps. Cities and peoples along the coast were also shown with great fidelity.
On the other hand, the interior emptied out. The weird mystical creatures were gone, but so were the real cities and rivers. In a way, Europeans had become more ignorant about Africa than they had been before …
Between the 1940s and the 1970s something similar happened to economics. A rise in the standards of rigor and logic led to a much improved level of understanding of some things, but also led for a time to an unwillingness to confront those areas the new technical rigor could not yet reach. Areas of inquiry that had been filled in, however imperfectly, became blanks.Footnote 1
My research agenda, the intersection of politics and economics at times of extreme financial crisis, lay in one of those blank spaces. It and I were both found wanting, informed as we were by Cambridge economics.Footnote 2
I had been drawn to Cambridge in the first place by the magnetic power of Keynes (dead since 1946), whose legacy defined a distinctive approach to economic problems. In his preface to The General Theory of Employment, Interest and Money, Keynes wrote that “the composition of this book has been for the author a long struggle of escape”Footnote 3 from the “classical” paradigm of economics in which he had been educated. By the late 1960s, the classical paradigm had risen anew, now in formal mathematical garb, as neoclassical economics. But, committed to Keynes’s thinking under the tutelage of his leading student, Richard Kahn, I had come to read a deep philosophical message behind the Keynesian revolution in economic theory and policy, one that represented nothing less than an alternative statement of the purpose of economics. This statement turned on a radically different understanding of the nature of the world with which economists and their discipline engage.
I summarize now what I understood fifty years ago. Neoclassical economics concerns itself with analyzing how rational agents, endowed with relevant information, more or less efficiently allocate scarce resources. In this reading of the world, time is an ahistorical index of sequence that merely indicates the order in which events occur. Keynes’s economics, on the other hand, explores the decisions (and the aggregate effects of those decisions) made by people who know that they do not and cannot know enough about the future, but who will nonetheless suffer the consequences of whatever they decide. In Keynes’s reading of the world, time past is problematically comprehended history, and time future is a world of contingency and chance – and at the core of a capitalist economy are investment decisions that incorporate that uncertain future. As Keynes emphasized in The General Theory: “The outstanding fact is the extreme precariousness of the basis of knowledge on which our estimates of prospective yield have to be made.”Footnote 4
Beginning in the mid-1950s, the “war of the two Cambridges” animated the discipline. By the late 1960s, as Krugman retrospectively observed, the war was over, unequivocally won by MIT and Harvard. Even to a research student in old Cambridge, this was clear. My interpretation was that Paul Samuelson’s neoclassical synthesis had accommodated the Keynesian revolution by sleight of hand. Success in the pursuit of economic efficiency by rational agents presumes that all available resources are fully employed at all times, and Keynesian macroeconomic policy was invoked to ensure that such would be the case. The Keynesian revolution, far from entailing the reconstruction of the foundations of economics, served as a handy footnote.
The “Bastard Keynesians” of new Cambridge, as Keynes’s student Joan Robinson provocatively called them, had appropriated the mantle of Keynesianism while abandoning the ontological core of Keynes’s thinking. Some fifteen years after I left academia, Hy Minsky summarized his indictment of Samuelson’s achievement: “the neoclassical synthesis became the economics of capitalism without capitalists, capital assets and financial markets.”Footnote 5
I returned to Cambridge at the start of 1969 determined to complete my thesis and earn my doctorate, whatever its apparent irrelevance to mainstream economics. Beyond that, I only knew that I did not want to return to a dysfunctional Washington, and at the age of 26 I certainly did not want to keep going to school. And so, by unanticipated default, I entered the world that Keynes had described with such telling insight in the essential Chapter 12 of The General Theory: the world of the financial markets – that is, the world of Wall Street. I did not appreciate at the time that my four years at Cambridge had endowed me with advantages of great prospective value.
Most obviously, I had been mentally living in the world of 1929–1931, a period that had demonstrated the interdependence of the financial system and the market economy, as well as the occasional need each could have for state intervention at times of extreme stress. Beyond consideration of the content and context of macroeconomic policy, my study of that period also forced my attention to the microeconomics of bubbles and crashes. I later came to appreciate that the stock market boom that culminated in the Great Crash of 1929, and the Global Financial Crisis of 1931 that transformed a recession into the Great Depression, were previews for the movies we all lived through during the dotcom/telecom bubble of 1998–2000 and the Global Financial Crisis that began in 2007.
Thus I was already equipped with a peculiar set of framing concepts and historical metaphors when, in 1970, I stumbled into F. Eberstadt & Co., one of the numerous investment banking partnerships that peopled Wall Street in those days. Those concepts and metaphors have shaped my professional career for over forty years, proving extraordinarily relevant at critical moments. They have also motivated me to observe and engage with the evolving disciplines of economics and finance, even while standing apart from the academic mainstream for a generation and while deploring the intellectual and institutional chasm that opened up between economics and finance after Keynes’s death.
From Old Wall Street to New
In 1970, Wall Street was run by a generation who had grown up under the shadow of the Crash of 1929 and the Great Depression. In 1937, the New York Society of Security Analysts was founded; only three graduates of the Harvard Business School went to work on Wall Street; and Richard Whitney, recent past President of the New York Stock Exchange (NYSE), went to jail for stealing his clients’ money.Footnote 6 Thirty-three years later, the Generation of 1937 was in charge. They hardly noticed that 1970 was also the year when the National Association of Securities Dealers agreed to create NASDAQ in order to automate the trading of stocks that could not qualify for the NYSE.
The structure of Wall Street in 1970 reflected three institutional facts. First, prior to that year, all member firms of the NYSE were required to be general partnerships, which entailed unlimited financial liability for their principals and limited access to external capital. Second, the NYSE maintained a schedule of fixed brokerage commissions that all member firms were required to charge their clients. Finally, the Glass–Steagall Act of 1933, separating the business of commercial banking from that of investment banking, was the law of the land. As one relevant metric of that institutional landscape, when in 1970 I chose not to pursue the chance to start in the bullpen at Morgan Stanley (it would have meant giving up my pursuit of a Cambridge doctorate when it was almost within grasp), the firm had some 250 employees and total capital of $7.5 million; the equivalent numbers forty-seven years later were 56,000 employees and $254 billion.Footnote 7
Still sheltered from competitive pressures, the commercial bankers of the day were a dull lot, generally confined to taking deposits and making loans. The first signs of their awakening to new opportunities could just be discerned in the unintended consequences of the Johnson Administration’s attempts to protect the dollar against the threats posed by the financial demands of the Vietnam War. As a direct response to US financial protectionism, the “Eurodollar” markets emerged in London. There, dollars that had accumulated offshore in consequence of growing deficits in the US balance of payments with the rest of the world could be freely borrowed and lent. In these markets, American commercial banks competed as underwriters of loans, unconstrained by domestic legislative and regulatory barriers.
The investment bankers of Wall Street served collectively as the agents for their clients – corporate, institutional and retail. They bought and sold securities and other financial assets, underwrote new issues of debt and equity, and advised on corporate strategy and merger and acquisition transactions. They were structured in a well-defined hierarchy. At the top of the heap were the white-shoe corporate advisory firms, led by the bulge-bracket lead managers of quality underwritings: Morgan Stanley; First Boston; Kuhn, Loeb; and Dillon, Read. These firms, in turn, enforced a strict ranking of status among their lesser counterparts that was published to the world in the order in which the firms appeared in the tombstone advertisements that accompanied every public offering of securities.
The retail-oriented firms, led by Merrill Lynch, distributed new securities and aggregated demand and supply for existing ones through more or less national networks of brokers; they were called wire houses because their branch offices were linked to the trading floor by telegraph (yes, still!) and telephone wires. The block trading firms – Goldman Sachs, Salomon Brothers, Bear Stearns – had the brains and the guts to put their own (still quite modest) capital at risk on behalf of their clients and themselves. They were below the salt, with the increasingly clear exception of Goldman, whose rise to respectability reflected more than thirty years of labor by Sidney Weinberg to overcome the taint left by its exploitation of its customers in the stock market bubble of the 1920s. And literally hundreds of niche partnerships thrived, subsidized by the NYSE commission schedule to compete for business by any means other than price.
As Chernow notes in The House of Morgan, the traditional “religious segregation” of Wall Street was “crumbling” but still visible.Footnote 8 By and large, the leading advisory firms in the Street remained WASP, with the exception of Kuhn, Loeb (whose past preeminence was plainly fading). The trading powerhouses were Jewish. Merrill, with its sales army known as “We, the People,” was Irish. It was barely possible to meet an Italian-American outside the mailroom in any of them. Professional women were virtually nonexistent in the established firms: Muriel (“Mickie”) Siebert was the first woman allowed to buy a seat on the NYSE, at the peak of the bull market in the late 1960s, and she had to start her own firm to have a place from which to trade.
The culture of Wall Street was a holdover from the days when the brokers were big and the clients were small. The canonical story went back to before the Great Crash, and every new entrant heard it. I was told it this way: in, say, 1928, a fellow from, say, Indianapolis came to New York with his wife and visited an old college pal in the latter’s skyscraper office downtown. His friend escorted them to a window and showed them the sights: “There’s Mr. Hutton’s yacht; there’s Mr. Dillon’s yacht; and there, there, is Corsair, Mr. Morgan’s yacht.” “Yes,” the visitor replied, “but where are the customers’ yachts?”Footnote 9
As late as 1967, The Money Game – an account of the great postwar bull market, which is in equal measure insightful and hilarious – depicted a culture that in its essence was recognizably continuous with that of generations past, despite such institutional intrusions as the Securities and Exchange Commission (SEC). In his focus on brokers doing well by servicing new institutional clients from a fee schedule that harked back to another era, author “Adam Smith” (George Goodman, a former Rhodes Scholar in disguise) identified the force that would obliterate old Wall Street.
After World War II, the architecture of finance in the United States was transformed by the rise of investing institutions. Institutional investors had existed since time immemorial: the trust departments of banks, the investing side of insurance companies, the investment trusts (closed-end funds) organized by brokers to aggregate their clients’ capital. What drove structural transformation was the postwar institutionalization of savings, first through the broad emergence of defined-benefit pension plans, encouraged by amendments to the tax code, for both private- and public-sector employees. In parallel, newly founded open-end mutual funds competed for retail investors who were gradually emerging from the shadow of the Crash and the Depression. This was the source of the growing weight of the block trading firms, who provided liquidity to customers who had to trade in scale.
The growth of the institutional equity market was the counterpart to the long bear market in bonds that set in after World War II. This was partly a function of the fact that, contrary to popular fears and the conventional wisdom of most economists, the end of the artificial demands of military mobilization did not return the world to the conditions of the Depression: rather, economic growth drove profits and real incomes into a golden age of broad-based prosperity.
Unlike the aftermath of previous great wars – the Napoleonic Wars, the Civil War and World War I – in the United States, as in the United Kingdom and across Europe, the inflation of war was not followed by postwar deflation. In the first instance, this was directly the result of far wiser techniques of public finance in both the United States and Britain, which had relied on comprehensive rationing and direct industrial controls to divert resources to the war effort while protecting the build-up of voluntary and forced savings from dissipation by unconstrained price increases.Footnote 10 A commitment to full employment and sheer growth in the scale of the public sector to serve commitments to both social and – with the onset of the Cold War – national security were accompanied by a persistent, gradual inflation that continued after an uncorrected spike during the Korean War.
In 1959, anticipating and accelerating the future, three young men – Bill Donaldson, Dan Lufkin and Dick Jenrette – had started a firm (DLJ) to implement a great idea. As the post-World War II bull market in common stocks increasingly drew investors out of the extended, anti-equity trauma of the Depression, they all had to pay the fixed NYSE commissions. Previously, competition for business had involved a number of noneconomic factors – old school ties and the “three Bs” (booze, babes and baseball tickets) prominent among them. DLJ competed for business by offering documented recommendations for stock purchases and (less commonly) sales on the basis of fundamental investment research; this was novel and needed. It was the first firm to define itself as a research brokerage, and it was followed by many others. In 1972, Institutional Investor magazine – its own existence emblematic of the new structure of finance – inaugurated an annual “All America” ranking of institutional research analysts.
Between autumn 1973 and spring 1975, old Wall Street entered an accelerating process of irreversible change. The first shock came with the oil embargo of October 1973. Inflation rates and interest rates soared to levels not previously experienced in peacetime. By the summer of 1974, Watergate had paralyzed Washington just when Wall Street both needed help and knew that it did. Nixon was in the bunker, the Watergate Committee was closing in, and the Dow Jones Industrial Average, which had tried and failed three times to hold steady above the iconic 1,000 level, was falling back toward 500.
The bear market that began in the fall of 1973 provided the context for the structural reforms that would transform Wall Street over the next generation. But the underlying cause of these reforms was the reversal of position between the Street’s agents and their customers. As brokers, Wall Street firms had become small relative to their institutional clients. And as investment bankers, they had become small relative to their premier corporate clients – AT&T, DuPont, GE, GM, IBM – which had become sufficiently substantial to access the capital markets on their own. In the commercial paper market, the great business corporations created a new capital market as they lent their excess cash to each other.
The transformational reform for the brokerage business took place on May 1, 1975: May Day. The NYSE suspended its fixed commission schedule, and member firms were free to negotiate with their customers. For most firms, negotiating with a pension fund was easy: just say “yes” to whatever rate the client proposed. And so brokerage commissions began their monotonic descent from more than 20 cents per share on an institutional-size, 10,000-share block toward zero.
Seven years later, the SEC confirmed the transformation of the world of corporate finance by promulgating Rule 415, allowing shelf registrations for qualified issuers, who could thus avoid the expense of underwritten offerings by putting registered securities “on the shelf” to be sold when demand presented itself at an acceptable price. Each policy initiative reduced the rewards available for Wall Street firms acting as agents and generated powerful incentives for them to reposition themselves as principals.
The narrative of Wall Street’s evolution since 1970 confirms that one abiding law of history is the law of unintended consequences.Footnote 11 Deregulation of the capital markets beginning in 1975 achieved its intended result. It brought vastly improved transactional efficiency, represented most visibly by a radical decline in brokerage commissions and an enormous increase in trading volumes. But it also radically reduced informational efficiency. When the subsidy from brokerage commissions disappeared, fundamental investment research evolved from being a public good openly offered by the brokers and dealers on the sell side to become a proprietary asset of the buy side, held by managers of financial assets, from pension funds and mutual funds to hedge funds.
Moreover, the changes revolutionized the institutional structure of the financial markets. Institutional investors who demanded the best execution from their brokers at the lowest net price spawned a set of bigger, smarter, tougher counterparties who made unimaginably more money as principals than they ever could have as agents. In parallel, under pressure from leading theorists and practitioners, the regulators unleashed capital market competition from its post-Great Crash shackles. Liberated from unlimited liability and more or less insured against liquidation by deposit insurance or the lender-of-last-resort powers of the Federal Reserve, Wall Street’s banks enjoyed a position on the risk–reward spectrum never before experienced in the history of financial capitalism. Enabled by advanced computer technology and modern finance theory, they were free to construct an infinite web of derivative securities in which every player had the opportunity to become too systemic to fail lest the circle of issuers and purchasers be broken at any link.
Wall Street’s transformation expressed itself through the progressive securitization of one asset class after another, beginning with mortgages around 1980. Financial instruments that had been held on the books of the originating creditor became tradable securities, so the prices at which they traded became subject to the same dynamics of bubble and crash that characterize all markets in securities. As principals in the markets they had invented, the players in the new Wall Street rendered themselves utterly dependent on the presumption of liquidity in the markets in which they dealt. That is, they had to rely on the ability to transform any asset into cash at a predictable, historically consistent cost (funding liquidity) and on the continuity of trading in the markets where the assets they held were priced (market liquidity).
Here is a critical instance of the dependence of practice on theory. Theory asserted that the statistical attributes of the instruments the Wall Street firms bought and sold – their average return, their volatility, the correlation of return and of volatility among different securities, and especially their liquidity – could be relied on to be stable over time, and that cash would always be available on predictable terms. My practice as an apprentice venture capitalist would teach me that Cash and Control – assured access to sufficient cash in time of crisis to buy the time needed to understand the unanticipated, and sufficient control to use the time effectively – is the joint hedge against the inescapable uncertainties of economic and financial existence. The big banks and their regulators chose theory over practice as long as they could and were validated by Alan Greenspan’s Federal Reserve when market reality challenged them, as it did in the 1987 stock market crash; in the Asian Flu, the Russian default and the collapse of the hedge fund Long Term Capital Management in 1998; and in the bursting of the dotcom/telecom bubble in 2000.Footnote 12
This intellectual construct enabled the excesses of 1929 to be emulated, even exceeded, eighty years later. The great financial institutions acted as though sufficient cash would always be available whenever needed, and in 2008, for the first time in three generations, they brought the capitalist system to its knees when they discovered that only institutions of the state could deliver the cash they needed to survive.
The Transformation of F. Eberstadt
How Wall Street transformed itself from a private club of highly paid agents into an enormously more profitable band of dealers resonates with the narrative of how my colleagues at F. Eberstadt and I were forced to evolve from investment banking agents into venture capital principals. In each instance, competitive pressures forced innovation on those who had enjoyed the economic rents provided by membership in a closed cartel. One decisive difference, however, was that F. Eberstadt was not too big to fail. We knew that in time of need our survival would depend on access to the cash owned by our best clients. So we learned to do what we had to do in order to deserve access to that cash.
I lived through and, to some degree, led the transformation of the institutional research business during the ten years that followed 1975’s May Day. The Eberstadt firm possessed a remarkable endowment with which to face the new era. It had been founded in 1931, just as the Great Crash turned into the Great Depression, by a great financier, Ferdinand Eberstadt. In the early 1920s, Clarence Dillon, the original “Wolf of Wall Street,” had recruited Eberstadt to the investment banking firm of Dillon, Read. Eberstadt led Dillon, Read to play a major role in refinancing German industry after the losses of World War I and the hyperinflation of 1922–1923. As the peak of the bull market approached toward the end of the decade, he suggested to Dillon that his partnership share might be increased to reflect more closely his contribution to the firm’s profits. According to Eberstadt, Dillon responded, “You’re not happy here, are you?”
So, in 1928, Eberstadt was freed to play a leading role in the drafting of the Young Plan (named for Owen Young, then chair of General Electric), a collaborative, quixotic effort to reduce to manageable scale the burden of reparations established at the Versailles Peace Conference in 1919. Eberstadt completed this pro bono assignment and returned to Wall Street at the worst imaginable time. In 1930, he put his considerable capital into one of the then major wire houses, Otis and Company, which closed its doors less than two years later with unlimited liability to its partners. Eberstadt literally walked across the Street and, with $15,000 and some used furniture proffered by friends such as Averell Harriman, started his own firm, determined that no one else would ever again have the opportunity to lose his money.
During World War II, Eberstadt returned to the public sector, first as head of the Army/Navy Munitions Board and then as Vice Chair of the War Production Board. At the latter he directed the implementation of the Controlled Materials Plan, a conceptually brilliant and operationally effective tool for directing the mobilization of American industry for total war by controlling the physical allocation of three critical inputs: steel, aluminum and copper.Footnote 13 After the war, the Eberstadt Report on America’s national security architecture led directly to the founding of the National Security Council and the passage of the National Security Act of 1947, which created the Department of Defense.
In my father’s chronicle of the industrial mobilization that led to Allied victory, The Struggle for Survival, Eberstadt emerges as the only heroic figure other than FDR himself. He was a
financier intimately acquainted with the workings of industry, a magnate frankly sympathetic with the claims and contributions of labor, a remarkably blunt and forceful character of scholarly attainments and penetrating intellect, an administrator able to master endless detail and yet to formulate comprehensive and workable over-all policy.Footnote 14
With regard to Eberstadt the man, I had the best of both worlds. I knew him from my boyhood, and he was a mentor to me until his death. He instilled in me the idea that Wall Street and Washington were and are ever locked into mutual interdependence. The libertarian bankers who despise the idea of government interference in any economic or financial activity are as suicidally unrealistic as those political entrepreneurs who do not appreciate that every public policy is inevitably subject to the direct or indirect test of the financial markets’ temperament.
It was Eberstadt who, in conjunction with my father, led me to focus my doctoral research on the formulation of economic policy in time of crisis. Eberstadt exemplified in one person the game played between the practitioners of financial capitalism and those who control the apparatus of the state. Through his firm, he operated effectively in the game played between the market economy and the sources of finance. On the other hand, I never actually worked for Eberstadt and so was never subject to his dictatorial rule.
When Eberstadt died, in 1969, at the age of 79, he was still Managing Partner of his firm. To the end, he held a morning meeting every day, where each partner reported what he had done in the previous twenty-four hours and how he proposed to pass the next twenty-four. This was not an environment in which a new generation of entrepreneurial leaders was likely to thrive; indeed, many of his partners left to pursue their own destinies over the years. Fortunately, shortly before his death, Eberstadt agreed to a recapitalization of the firm that created at least the opportunity for it to survive his own demise.
Because of his genius and despite his need for absolute control, Eberstadt left behind three franchises. The firm’s investment banking team sponsored emerging companies that he liked to call his “baby blue chips.” The team advised the companies on strategy, negotiated mergers and acquisitions on their behalf, and raised debt and equity capital for them in the financial markets. The second franchise, Chemical Fund, was a phenomenon. It was the first mutual fund to be started after the Crash of 1929 and the first to focus on the new science-based growth industries, starting with chemicals and moving on to pharmaceuticals, then electronics and computing. Not coincidentally, it was also the first mutual fund to reach $1 billion in assets under management. Through the mid-1970s, Chemical Fund’s management fees could be counted on to cover the firm’s basic operating expenses. Third, and least in prominence, was an institutional research brokerage business that had been spun off from Chemical Fund and had a similar focus.
Eberstadt also left behind a set of senior partners who proved unable to defend, let alone renew, the first two franchises. By the late 1970s, virtually all of the firm’s inherited investment banking clients had been poached. And Chemical Fund, which had built its outstanding investment record on long-term holdings of the great postwar engines of growth and innovation – DuPont and Pfizer, IBM and Xerox – followed these and the other “Nifty Fifty one-decision” stocks over the cliff and into the abyss of the 1973–1975 bear market.
That the firm had a future at all was the work of two men. Pike Sullivan and Ed Giles had joined with Eberstadt’s son-in-law in the early 1960s to launch the firm’s institutional research business. The son-in-law gave up waiting for the succession, but Sullivan and Giles stayed on as Eberstadt’s most junior partners in the part of the firm most removed from him. Sullivan built and ran the firm’s sales and trading activities. He possessed a remarkable, if inarticulate, instinct for stock selection. He applied a simple analytical construct, dividing the world and its contents into a two-by-two matrix: one dimension ranged from the “simple” to the “complicated” and the other ran from the “real” to the “remote.” The secret of management, whether of investments or of the firm, was to live to the maximum extent possible in the quadrant that was both simple and real, and to avoid all that was both complicated and remote.
Giles had joined Eberstadt as Chemical Fund’s chemical analyst and had recruited the research team for the institutional business. He combined deep knowledge of the dynamics of the first science-based industry with insatiable curiosity about the context in which its industrial participants operated and in which their securities were valued. In contrast to Sullivan, Giles would habitually lead an audience through a complex and nuanced discussion of the global chemical industry or some particular segment of it and punctuate each stage of the argument by saying: “Do you follow me? Well, it’s not that simple!” They made a formidable team.
As their seniors followed their failing franchises down and out, Sullivan and Giles inherited the opportunity to reinvent the firm around the research core. In 1979, they took the decisive step of selling the remains of Chemical Fund to Marsh & McLennan, then owner of the Putnam Group of mutual funds. Those partners who went with Chemical Fund received compensation for their Eberstadt interest entirely in Marsh & McLennan stock. Those who were invited and chose to take the risk of a restart received a portion in such liquid form and the balance in the stock of “new Eberstadt,” the first explicitly defined “research-based investment banking firm.”
The idea was simple. Since the institutional clients would no longer pay us enough in commissions for us to earn a worthwhile return on our investment in research, we had to generate other income streams. Three were available. All depended on repurposing the research team from generating commissions on the trading desk to generating fees from corporate clients: strategy advice, mergers and acquisitions, and corporate finance. The tight focus of the firm’s research on the “high IQ” industries meant that the small group of investment bankers, whose purpose was to leverage the knowledge and insights of the research analysts, had no choice but to go native into the chemical, healthcare and emerging IT sectors. At a time when the major Wall Street firms would not let analysts near their corporate clients for fear they might say something “smart” and undermine the relationship, our model turned on the analysts telling the bankers where to go and what to do.
Mastering the Pragmatics of Finance
During the half-dozen years of old Eberstadt’s unwinding, I had been engaged in absorbing the basics of the business as an apprentice in the investment banking department. And the firm had enough business in the early 1970s to provide a comprehensive education in corporate valuations, public offerings of debt and equities, and merger and acquisition transactions, including hostile takeovers. The most fundamental lesson arose from the most mundane of work: the valuation of private companies, a reliable fee-generating practice.
I learned to pursue parallel but methodologically independent approaches. First, one would project forward estimates of future cash flows, discounting them back at a rate judged to reflect an appropriate level of idiosyncratic risk specific to the perceived stability of the business and its competitive position, as well as to market rates of interest. In the language of the new finance theory just being propagated, this defined the “fundamental,” as if only one such number could be generated and as if all interested parties would agree on it. In practice, complementary approaches were invoked as well. Second, one would identify more or less comparable public companies, then introduce market metrics such as price/earnings and market/book value ratios, making appropriate adjustments to reflect the particulars of each company in question. Finally, one would estimate the likely net realization from a hypothetical sale of the business, having due regard for “what a willing buyer would pay a willing seller, neither under any compulsion to transact.”
The layers of judgment embedded in each of these methodologies for valuing companies were as evident then as they are now. In a financial universe transformed institutionally beyond imagining from that of the early 1970s, the same techniques remain central to the discipline, and they are just as dependent on judgment as ever, regardless of the reservoirs of data and massive computing power brought to bear.Footnote 15
Such valuations were and are typically used in court, or at least subject to legal review, when they are needed to help settle an estate or for tax purposes. I learned that the professional goal was to manage each of the processes so the resulting numbers would be within approximately 10 percent of each other: farther apart, and the disparity would threaten the edifice of legitimizing objectivity; too close, and the coincident accuracy would raise suspicions. Practice drove any belief in a single, verifiable fair or fundamental value out of my brain long before I seriously thought through the theoretical impossibilities of the Efficient Market Hypothesis and its assertion that market prices could be relied on to represent accurately that fair and fundamental value.Footnote 16
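To make the layers of judgment concrete, here is a minimal sketch with an invented company, invented cash flows and invented market multiples (none of them drawn from any actual engagement), laying the three parallel approaches side by side:

```python
# Illustrative only: the company, its cash flows and the market multiples are invented.

def dcf_value(cash_flows, discount_rate, terminal_growth):
    """Discount projected free cash flows, plus a terminal value on the final year."""
    pv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    return pv + terminal / (1 + discount_rate) ** len(cash_flows)

def comparables_value(earnings, book_value, peer_pe, peer_pb, private_discount=0.20):
    """Apply public-company multiples, then haircut for illiquidity and size."""
    indicated = 0.5 * (earnings * peer_pe + book_value * peer_pb)
    return indicated * (1 - private_discount)

def sale_value(operating_profit, buyer_multiple, transaction_costs=0.05):
    """Estimate net proceeds from a hypothetical willing-buyer / willing-seller sale."""
    return operating_profit * buyer_multiple * (1 - transaction_costs)

# Hypothetical inputs: judgment enters at every line.
projected_cf = [4.0, 4.6, 5.3, 6.1, 7.0]   # free cash flow, $ millions
valuations = {
    "Discounted cash flow": dcf_value(projected_cf, discount_rate=0.14, terminal_growth=0.03),
    "Public comparables": comparables_value(earnings=5.0, book_value=30.0, peer_pe=12.0, peer_pb=2.0),
    "Hypothetical sale": sale_value(operating_profit=6.0, buyer_multiple=9.0),
}
for label, value in valuations.items():
    print(f"{label:22s} ${value:5.1f}m")

spread = max(valuations.values()) / min(valuations.values()) - 1
print(f"Spread across methods: {spread:.0%}")   # the craft lay in keeping this spread modest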
I was fortunate to learn so early in my career the value of viewing the “fundamental” – a central building block of modern finance theory and neoclassical economics alike – with suspicion. Such skepticism faced in two directions. With respect to financial assets, the anchor of a value around which prices are supposed to fluctuate is itself a problematic entity, subject to divergent opinions and estimates. The same stance applies to the calculations that rationalized investment in the physical assets of the so-called real economy – and more so to the extent that those assets embodied innovative technology.
Of course, individual cases are situated along a spectrum that runs from relative continuity and predictability to outright ignorance. At one extreme, in 1970, AT&T functioned as a legislated monopoly that controlled the pace at which essential services would be extended and new technology deployed. Because its revenue and cash flow grew monotonically, it could reliably forecast the return it would generate from any new investment. And its shareholders, informed by a stated and rigorously maintained dividend payout policy, could predict the return they would receive. At the other and more sporty end of the spectrum were, and are, the host of start-ups venturing into the economic and financial unknown and unknowable.
Years later at Warburg Pincus, I would instruct my team that they were allowed to run one instance of a financial model of a start-up to check for logical consistency, but if they insisted on running more instances in the hope of defining the prospective rate of return, we would not do the deal. The parameters of such a model were necessarily so soft that any net present value of expected future cash flows could readily be generated.
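A minimal sketch of the point, with an entirely invented start-up: three combinations of growth and discount assumptions, each defensible on its own in a boardroom, swing the net present value of the very same operating plan from clearly negative to strongly positive.

```python
# Illustrative only: the start-up's cash-flow profile and the assumption ranges are invented.
# The point: with parameters this soft, almost any net present value can be produced on demand.

def startup_npv(initial_burn, year1_revenue, revenue_growth, margin_at_scale,
                discount_rate, years=10):
    """Crude ten-year model: losses in the early years, operating profit as revenue scales."""
    npv = -initial_burn
    revenue = year1_revenue
    for t in range(1, years + 1):
        margin = min(margin_at_scale, -0.50 + 0.15 * t)   # margin ramps up from deep losses
        npv += revenue * margin / (1 + discount_rate) ** t
        revenue *= 1 + revenue_growth
    return npv

scenarios = [(0.30, 0.35), (0.50, 0.28), (0.70, 0.20)]    # (revenue growth, discount rate)
for growth, rate in scenarios:
    npv = startup_npv(initial_burn=5.0, year1_revenue=2.0,
                      revenue_growth=growth, margin_at_scale=0.20, discount_rate=rate)
    print(f"growth {growth:.0%}, discount rate {rate:.0%}: NPV = {npv:+6.1f} ($ millions)")
```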
Understanding that the fundamental is an uncertain construct, even when applied to an established and ostensibly secure business, has strategic importance. At a systemic level, it forces recognition of the waste that must be generated by any process of economic development and growth through time. Schumpeter’s process of creative destruction can only proceed by trial and error. We see that which is created through the lens of survivors’ bias and ignore the “hopeful monsters” that economic evolution has spawned and left behind in metaphorical emulation of Darwin’s process of natural selection. No doubt every one of them was launched on the basis of an exercise in forecasting future revenues, costs and an expected value to be compared with a rough estimate of the cost of capital. As Schumpeter well knew, the wastage is the measure of the inescapable uncertainty that attends the practice of doing capitalism:
We need only visualize the situation of a man who would … consider the possibility of setting up a new plant for the production of cheap aeroplanes which would pay only if all people who drove motorcars could be induced to fly. The major elements in such an undertaking simply cannot be known … Neither error nor risk expresses adequately what we mean.Footnote 17
Inventing the “Post-venture Private Placement”
All of this seemed to be decoupled from the institutional research business through the mid-1970s. But, as impossible as it may be to conceive of today, it was possible then to live multiple professional lives in an investment firm like Eberstadt. So, while I was being paid as an apprentice, then journeyman investment banker, I continued to write and lecture on the increasingly fraught state of the domestic and global political economies. In particular, during the winter of 1973–1974, when Wall Street generally held Watergate to be a partisan political sideshow, I was speaking and writing on its economic consequences. My thesis was that Nixon’s loss of authority would cripple the government’s ability to meet the need for political underwriting of the financial and economic risks that the first oil shock was generating. The game between the state and the market economy of 1931, which I had explored in depth during my years at Cambridge, had returned, if this time only as a shadow and a warning. The persistent relevance of that warning reaches forward as well as backward. In 2008, it was precisely the authority of those in charge of states from Washington to Beijing by way of London and Berlin that put a floor under the Global Financial Crisis. In 2017, even as the prices of financial assets continue to reflect the extraordinarily low discount rates that are the result of extraordinarily unconventional central bank monetary policy, we may confidently expect financial consequences from the loss of credible leadership in Trump’s White House and Brexit’s Downing Street. And they will not be pretty.
Around 1975, my work in this domain attracted the attention of Ed Giles. He asked me to produce regular research reports on the political economy, to be published as part of the firm’s offerings to our institutional clients. In this back-door manner, I had the opportunity to align myself with the “smart guys” in the firm as the old investment banking franchise into which I had been hired was fading away. By the time we split up the firm in 1979, I was working increasingly closely with key members of the research team on specific opportunities to generate investment banking deals, and with Giles and Sullivan on developing the business model for the new firm.
The most economically significant new business that we created in new Eberstadt was what we called post-venture private placements, the sale of unregistered securities by emergent companies to institutional investors. This business proved to be as good as the market for initial public offerings (IPOs) was bad. Understanding this dynamic is crucial. The public equity markets exist to provide liquidity to investors who can correct an investment error by selling the shares back to the market. But liquidity in any market is fragile and vulnerable. It is subject to two different threats: one-sided market opinion and the existence of categories of securities that are deemed too risky for trading.
If market opinion is heavily one-sided and investors are united in the belief that a given share or the market as a whole can move only in one direction – the conditions that enable a bubble or a crash – then the premium or discount that an investor must pay or can receive will be large. Under extreme conditions of panic, as in the autumn of 2008, the discount may become infinite. As Keynes wrote in The General Theory: “Best of all if we should know the future. But, if not, then … it is important that opinions should differ.”Footnote 18 Precisely because no one can know the value of the fundamental for sure, markets offer liquidity as those with different opinions bid and offer prices that correspond to their differing views. So, at a fundamental level, uncertainty explains why financial markets exist in the real world.
As for the second threat, there have been repeated episodes when whole categories of securities – for example, debt securities of governments that have defaulted on their obligations – have been deemed too risky for trading. Most relevant to my own career and to the dynamics of the Innovation Economy, there are times when the common stock of new companies is judged unsuitable for introduction to the public market. Generally, when aversion to perceived risk is high and bear market conditions prevail, the IPO window closes. Such were market conditions after the oil shock of 1973 and through the remainder of the 1970s. During those years, the number of venture-backed companies that managed to go public was very small. A few names stand out: Cray Research (1976), Tandem Computer (1977) and Federal Express (1978).
At roughly the same time, we discovered at Eberstadt that we could mobilize large sums of capital – tens of millions of 1980-vintage dollars, worth slightly more than three times as much today – in order to fund the sort of emerging company that in normal times would have gone public. In 1980, when the first $100 million venture fund was just being raised (by my present firm, Warburg Pincus, as it happens) and when the typical IPO amounted to only $10 million in aggregate proceeds, this was real money. The basis of the business was the relationship of trust that Eberstadt had established with a highly diverse set of our best institutional clients: ranging from the State Farm Insurance Company, to private investment advisory firms around the United States, to various branches of the big Swiss banks and members of the private banking confraternity in Geneva, to the Scottish investment trusts. These relationships, in turn, reflected underlying economic self-interest: because these investors were so important to the overall revenues of our firm, they knew that we could not afford to exploit them.
The terms of these private placements reflected a balance of issues. On the one hand, by sponsoring a company, we were certifying its post-venture status as a revenue-generating business delivering (or soon to deliver) positive cash flow from operations. This was instantiated by the form of the equity securities our clients bought: straight common stock, underneath the convertible preferred shares typically purchased by venture capitalists. Of course, this subordination was powerfully attractive to the entrepreneurial founders and the venture capital backers of the issuers. On the other hand, the price our clients paid reflected the scarcity of capital and the lack of liquidity.
One of our early successes was a medical device company called IMED, a leader in computer-based pumps to control the intravenous infusion of fluids and drugs. When we sold shares to our clients, the company already had annual revenues of $35 million, and it was growing at some 30 percent per year with an operating profit margin of 20 percent. We valued the company at $50 million, perhaps half or less of the valuation freely traded shares of a comparable company would have received in the public market under less stressed conditions. Barely two years later, Warner–Lambert bought IMED for $465 million in cash.
An echo of the Eberstadt post-venture financing business could be heard in the pseudo-market that has emerged since 2010 around the most visible exemplars of the consumer-oriented internet that are still privately held. The purchase of shares without SEC registration by passive investors bears a passing resemblance to our innovation of more than thirty years ago. The one clear link is the absence of an active IPO market. The primary difference is valuation: the buyers of these shares have been paying premium prices, as if they had the certainty of liquidity that only a deep trading market can provide. By contrast, our post-venture private placements were typically priced at a 40 percent or even greater discount to the valuations of roughly comparable public companies.
As I write in 2017, the Unicorn Bubble appears to be fading, as a growing number of companies that have succeeded in going public trade below their previous private valuations. Underwritten by the historically unprecedented, sustained low levels of real interest rates that have been the consequence of excessive reliance on central bank ease through the painfully long recovery from the Global Financial Crisis and the Great Recession, the Unicorn Bubble is doubly vulnerable: at the micro-level, to the “marks to reality” generated by active trading markets for those companies that do go public; at the macro-level, from a return to traditional levels of interest rates and credit spreads as the major central banks normalize monetary policy. Even before such confrontation with reality, a recent detailed analysis of the individual funding rounds of 135 asserted Unicorns indicates that the valuations of all were exaggerated by not accounting for the specific favorable terms offered later-stage investors that detract from the value available to earlier rounds and to the underlying common shares held by founders and employees:
We value unicorns using financial terms from legal filings and find reported unicorn post-money valuations average 50% above fair value, with 15 being more than 100% above. Reported valuations assume all shares are as valuable as the most recently issued preferred shares. We calculate values for each share class, which yields lower valuations because most unicorns gave recent investors major protections such as … IPO return guarantees (14%), vetoes over down-IPOs (24%), or seniority to all other investors (32%). Common shares lack all such protections and are 58% overvalued. After adjusting for these valuation-inflating terms, almost one-half (65 out of 135) of unicorns lose their unicorn status.Footnote 19
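The mechanics the study describes can be illustrated with a deliberately invented capitalization table: a reported post-money valuation prices every share, preferred and common alike, at the price of the newest and most protected preferred round, while a class-by-class valuation marks down the shares that lack the new round’s protections. The share counts, prices and per-share values below are hypothetical, not taken from the study.

```python
# Illustrative only: the share counts, prices and per-share values are invented,
# not taken from the study quoted above.

shares = {
    # class: (shares outstanding in millions, estimated fair value per share in $)
    "Series D preferred (IPO return guarantee)": (10, 10.00),   # the newly issued round
    "Series A-C preferred (fewer protections)": (30, 8.00),
    "Common stock (founders and employees)": (60, 6.50),
}

newest_preferred_price = 10.00
post_money = newest_preferred_price * sum(count for count, _ in shares.values())
fair_value = sum(count * value for count, value in shares.values())

print(f"Reported post-money valuation: ${post_money:,.0f} million")
print(f"Class-by-class fair value:     ${fair_value:,.0f} million")
print(f"Overstatement:                 {post_money / fair_value - 1:.0%}")
```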
But the collapse of the Unicorn Bubble poses no systemic risk, given that in aggregate the funds involved are relatively modest and unleveraged.
As always, the majority of hopeful monsters will fail while a few amazing winners will again demonstrate the power of productive bubbles to finance innovation at the frontier. As well, the entrepreneurs and directors of the 250 or so currently existing Unicorns will discover that there is a fundamental difference between meeting operating expenses with proceeds from the sale of goods and services and meeting them with proceeds from the issuance of securities to inevitably fickle investors. And, finally, a set of former fund managers at major institutions who drove the bubble will discover the lesson that I learned at first hand during my apprenticeship on old Wall Street. When liquidity is available, escape from error is available, even if a loss must be accepted. When liquidity is not available – whether by reason of contract or law or an adverse change in market conditions – the path to redemption is laborious work, at best, as Virgil said of the return from hell.
Early in my apprenticeship at Eberstadt, when I was splitting my time between serving as a political economist and as a trainee investment banker, I discovered computers. That is to say, I discovered why computers are interesting. This came about as an unexpected result of the collapse of the post-World War II golden age. From the autumn of 1973, under the impact of the oil embargo and energy crisis triggered by the Yom Kippur War, both political and market processes broke down, nowhere more definitively than in the United States. Making sense of the new economic environment in which the financial markets were functioning was as challenging as it was necessary.
By the early 1970s, the macroeconomics of Samuelson’s neoclassical synthesis, universally and misleadingly termed Keynesian, had come to be intimately associated with large-scale econometric models. Otto Eckstein’s DRI (Data Resources, Inc.) Model, based on his research at Harvard, led the field, with competition in the commercial world from Michael Evans’s Chase Econometrics and Lawrence Klein’s Wharton Model. Every major central bank had its own version, as did the Treasury Department. Derived from the work that had won Jan Tinbergen his share of the first Nobel Prize in Economics, these models all deployed a statistical methodology intended to define consistent relationships between variables, using the correlations between historical time series to establish predictable patterns of systemic behavior.
From the beginning of the econometrics enterprise in the late 1930s, Keynes had raised objections to the whole procedure, even though he had championed the development of the national income statistics that populated the models.Footnote 1 Tinbergen himself emphasized the practical promise of econometrics:
The establishment of a system of equations compels us to state clearly hypotheses about every sphere of economic life and, in addition, to test them statistically. Once stated, the system enables us to distinguish sharply between all kinds of variation problems. And it yields clear-cut conclusions. Differences of opinion can, in principle, be localised, i.e., the elementary equation in which the difference occurs can be found. Deviations between theory and reality can be measured.Footnote 2
Keynes, in response, identified a number of technical issues that infest any attempt to derive causal relationships from statistical correlations. He then turned to the core structural issue: the instability of behavioral relationships through time. This is what undermines Tinbergen’s project at the most fundamental level and renders econometrics finally unable to serve Tinbergen’s second purpose: to test the validity of alternative economic theories. Fifty years later, Pesaran and Smith evaluated the prewar argument between Tinbergen and Keynes:
Given that there were not strong a priori reasons for believing economic relations to be stable over time, and the fact that estimated equations are prone to structural change, one is forced to agree with Keynes that at a logical level econometric inference, like other forms of inference, is unsupportable.Footnote 3
In practice, Keynes himself had emphasized that in the face of uncertainty
we have tacitly agreed, as a rule, to fall back on a convention. The essence of this convention … lies in assuming that the existing state of affairs will continue indefinitely, except in so far as we have specific reasons to expect a change.Footnote 4
As David Hume had asserted some 150 years before Keynes, reliance on the convention of continuity underlies the observable stability of behavioral relationships in “normal” times, even though the “precariousness” (Keynes’s term) of such foundations can also be observed in the drastic regime shifts represented by bubbles and crashes.Footnote 5 And so, as Pesaran and Smith conclude, “it does not follow that econometrics is useless.” Indeed, it was their “practical usefulness in decision-making and policy formation” that drove the proliferation of econometric models in the postwar decades.Footnote 6
In the winter of 1973–1974, however, Keynes’s attack on “the promise of structural stability” dominated.Footnote 7 For the first energy crisis drove all the key variables of the models – interest rates, inflation rates, unemployment rates and, with the contemporaneous collapse of the Bretton Woods international financial system, exchange rates – beyond the ranges that had been observed during the mere quarter-century in which national economic statistics had been systematically collected. The functional relationships that had been defined on the data from this period and that constituted the guts of the models were left floating in air, decoupled from empirical observation.
In a series of papers written for Eberstadt’s institutional clients, I defined this as “the database problem.” The econometric models represented a statistical economy whose behavior was supposed to evolve in close emulation of the underlying complex networks of agents and institutions, stocks and flows, goods and services, money and credit. But, beginning in 1973, the world economy was ejected from the models. We were living “outside the database.” Whether or not a given dependent variable would exhibit the same relationship to the supposedly relevant independent variables was an entirely arbitrary judgment once the latter moved to levels never before observed. So, quite apart from the econometric models’ standing in economic logic, as practical tools for prediction they had broken down as thoroughly as the political economy they were supposed to represent.
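A minimal sketch of the database problem, using synthetic data rather than any actual postwar series: an equation estimated on the historically observed range of a variable may fit acceptably in-sample and yet say nothing useful once that variable jumps outside the range on which it was fit.

```python
# Illustrative only: synthetic data standing in for a postwar macroeconomic time series.
import numpy as np

rng = np.random.default_rng(0)

# "History": an explanatory variable observed only between 2 and 6 (say, an interest rate),
# generated by a relationship that is only locally close to linear.
x_hist = rng.uniform(2.0, 6.0, 100)

def true_relation(x):
    return 5.0 + 1.5 * x - 0.25 * (x - 4.0) ** 3

y_hist = true_relation(x_hist) + rng.normal(0.0, 0.3, x_hist.size)

# Fit the kind of linear structural equation the postwar models relied on.
slope, intercept = np.polyfit(x_hist, y_hist, 1)

# The 1973-style shock: the variable moves far beyond anything in the estimation sample.
for x_new in (4.0, 8.0, 12.0):
    predicted = intercept + slope * x_new
    print(f"x = {x_new:4.1f}   model predicts {predicted:7.1f}   "
          f"underlying relation gives {true_relation(x_new):7.1f}")
```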
Agent-based Simulation Models
In 1975, my persistent search for alternative tools with which to evaluate global economic discontinuity led me to a warehouse off Kendall Square in East Cambridge, Massachusetts. There I found a band of academic refugees, led by a young scholar named Nathaniel Mass. They had been students of Jay Forrester, renowned first for his leadership of MIT’s pioneering “Whirlwind” computer project and then for his development of a methodology for representing the behavior of complex systems by capturing both the positive feedback effects that amplify initial movements and the negative feedback that dampens them.
At the end of the 1960s, in collaboration with Donella Meadows, Dennis Meadows and others, Forrester applied his system dynamics to economic systems, an effort that culminated notoriously in The Limits to Growth: A Report for the Club of Rome’s Project on the Predicament of Mankind.Footnote 8 Forrester’s design goal was to represent complex systems with parsimonious models that would reveal the systems’ modes of behavior, and he did so with the discipline of an engineer. But the system dynamics model deployed in The Limits to Growth was so parsimonious that it lacked a price mechanism. As a result, increasing demands on resources, driven by population growth and rising incomes, led monotonically to resource exhaustion, since rising consumption and stretched supply generated no price signals to ration demand and divert investment toward the development of alternatives.
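The omission is easy to demonstrate with a deliberately toy stock-and-flow sketch – emphatically not the World3 model itself, and every parameter below is invented: without a price signal, exponentially growing demand simply exhausts the resource stock, whereas even a crude scarcity price that rations demand slows depletion dramatically.

```python
def simulate(price_feedback: bool, years: int = 150) -> float:
    """Toy resource-depletion loop; all numbers are illustrative only."""
    resource = 1000.0   # initial resource stock
    demand = 1.0        # initial annual demand
    growth = 0.03       # demand grows 3% a year (population, rising incomes)
    for _ in range(years):
        if price_feedback:
            # Scarcity raises price; a higher price rations demand and,
            # implicitly, diverts some of it toward substitutes.
            scarcity_price = 1000.0 / max(resource, 1e-6)
            effective_demand = demand / scarcity_price
        else:
            effective_demand = demand
        resource = max(resource - effective_demand, 0.0)
        demand *= 1 + growth
    return resource

print("no price mechanism:  remaining stock =", round(simulate(False), 1))
print("with price feedback: remaining stock =", round(simulate(True), 1))
```

Run without the feedback, the stock is exhausted well before the horizon; run with it, a meaningful fraction survives. That missing loop was the substance of the economists' objection.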
Under assault from economists of all persuasions, the younger members of the team had learned the appropriate lessons. Now isolated from MIT’s engineering and economics departments, they set about constructing a national economic model piece by piece, from the bottom up, incorporating both price mechanisms and actions by financial institutions. Their goal was to simulate the behavior of a monetary production economy by tracking the collective behavior of agents that were realistically defined with respect to the data they could observe, the instruments they could control and the constraints to which they were subject. This was the opposite of the reductionism of neoclassical economics, and therefore all the more appealing. The work was an early exercise in what have come to be known generically as agent-based models: comprehensive, alternative approaches to representing how a market economy evolves through time.Footnote 9
When I learned what Mass and his colleagues were up to, a very large penny dropped in my mind. I realized that the transformational function of computers went far beyond their ability to perform arithmetic and statistical operations on ever larger quantities of data. Computers could serve as simulation engines, making it possible to address problems too complicated to solve analytically and enabling analysts to represent the behavior of systems too complex to model by hand. My immediate response was to engage actively with the System Dynamics National Modeling Project as its practitioners moved from specification of the production sector to construct the financial and government sectors of their model.
In the spring of 1977, I attempted to lay out for our clients how agent-based simulation models differed from econometric models driven by statistical correlations. Econometrics generated “prediction models” that were valued to the extent that they yielded accurate forecasts of economic and financial variables. The experience of the previous few years had shown how unreliable such tools could be in the face of radical discontinuities. The MIT agent-based simulation project, by contrast, was an “exploration model” whose explicit microstructure offered the prospect of being able to trace the nonlinear, disequilibrium consequences of the behavior of the participating agents. It is true that agent-based simulation had limited immediate utility as a prediction model, but the opportunity to follow the simultaneous evolution of individual behaviors and emergent systemic phenomena was novel and provocative. Writing more than thirty years later, Doyne Farmer and Duncan Foley expressed the promise I saw then:
To understand what such a model would be good for it is useful to make a comparison to climate models. We specifically compare to climate rather than weather because we think that it will be a long time before such models will be useful for short term forecasting (though this is not impossible). We think the main utility of such models will be to model the equivalent of the economic climate: For example, when the economy is at a given point in the business cycle, what central bank actions tend to be most effective?Footnote 10
My hope that agent-based simulations would develop into a full-blown methodological alternative was frustrated in the field of economics, even though the approach has thrived in such fields as epidemiology and climate studies. Indeed, all the natural sciences have adopted simulation techniques as they grapple with the dynamics of systems revealed by ever more fine-grained tools of analysis to be dauntingly complex. As discussed in the Coda to this second edition, the reappearance of such models in economics largely reflects the shift toward more realistic engagement with the world, induced by the Global Financial Crisis and its economic consequences. Back in the 1970s, however, Mass and his team took their project out of MIT on a quixotic venture to apply their model as a tool for macroeconomic forecasting, with the predictable (and predicted) result: the assertion of a superior methodology was irrelevant to potential clients whose sole criterion of merit was the short-term accuracy of the model’s prediction of such variables as GDP growth rates and market interest rates.
The demise of the MIT System Dynamics National Model in no way inhibited my determination to learn more about the uses of computing. This, in turn, required learning as much as I could digest about the range of contributing disciplines, including semiconductor physics, digital logic and software engineering. It also involved constructing access to the commercial activities that were emerging not only within the behemoth of IBM but also on the Route 128 periphery north and west of Boston and, barely discernibly, in the potato fields south of Palo Alto.
Along the way, I discovered a history – still living then, now all but forgotten – that represented perhaps the most productive collaboration ever in the game between the American state (or, indeed, any other state) and the market economy. Understanding how the US government’s unprecedented investment in fundamental science and related technologies fostered the emergence of computers and all things digital is central to understanding, first, the emergence of a venture capital industry focused predominantly on information technology and, second, the creation of the new digital economy that venture capitalists and the financial markets have funded over the past generation.
Government Investment in Science and Technology
During World War II, the United States had followed the United Kingdom’s example in mobilizing science for war. In addition to funding the development and procurement of advanced technological products, from radar to the atomic bomb, the US government invested in the scientific sources of technological innovation. At war’s end, Vannevar Bush, who had served FDR as founder and Director of the Office of Scientific Research and Development (OSRD), delivered to President Truman a prospectus for continuing this investment of public funds. In Science, the Endless Frontier, Bush argued:
The Government should accept new responsibilities for promoting the flow of new scientific knowledge and the development of scientific talent in our youth. These responsibilities are the proper concern of the Government, for they vitally affect our health, our jobs and our national security. It is in keeping also with the basic United States policy that the Government should foster the opening of new frontiers and this is the modern way to do it. For many years the Government has wisely supported research in our agricultural colleges and the benefits have been great. The time has come when such support should be extended to other fields.Footnote 11
Bush explicitly advocated funding basic research, calling it the “scientific capital” that “creates the fund from which the practical application of knowledge must be drawn,”Footnote 12 and he argued for making the results of scientific research broadly available to industry and the public at large.
For five years, implementation of Bush’s vision for permanent programs of state investment in fundamental science was delayed by arguments over the extent and manner of political control. Into the institutional vacuum moved the more entrepreneurial agents in the public sector, which took control of elements of the OSRD’s domain: the newly created Atomic Energy Commission assumed responsibility for nuclear research; the National Institute of Health pluralized its name and took over OSRD’s programs of extramural grants for life sciences research; and the Office of Naval Research emerged as the vanguard of the newly formed Department of Defense, focusing on the range of sciences and technologies that supported the development of microelectronics and digital computing. In 1950, the outbreak of the Korean War finally induced the creation of the National Science Foundation (NSF), a relatively modest version of the all-encompassing National Research Foundation envisioned by Bush.Footnote 13
The NSF was endowed with a broad mandate across both the natural and the social sciences, but the Office of Naval Research’s initiative pointed the way. National funding of the basic research that enabled the IT revolution emerged largely from the Defense Department. The Soviet threat, crystallized in the years following 1945 and amplified by the Korean War in 1950 and the launch of Sputnik in 1957, was the context for the US military’s massive commitment to renewing its wartime role as the principal financier of scientific and technological research and the principal customer of the products generated therefrom.Footnote 14 Indeed, the fact that various arms of the Defense establishment had become sophisticated purchasers of advanced digital technology may have been more significant than the government’s direct funding of research, for it both enabled substantial investments in productive capacity and know-how by the industrial side of the military–industrial complex and encouraged the sharing of expertise by requiring second sources of supply and cross-licensing of patents.Footnote 15
Kira Fabrizio and David Mowery summarize the essential elements of federal policy:
The IT sector, which scarcely existed in 1945, was a key focus of federal R&D and defense-related procurement spending for much of the postwar period. Moreover, the structure of these federal R&D and procurement programs exerted a powerful influence on the pace of development of the underlying technologies and the structure of the industries that developed these technologies for defense and civilian applications.Footnote 16
And the scale was substantial: for twenty-five years through 1978, federal sources accounted for more than 50 percent of national R&D expenditures and exceeded the R&D expenditures of all other OECD governments combined.Footnote 17 As Henry Kressel, my partner and collaborator at Warburg Pincus, would write in retrospect, drawing on his own entry into the digital research enterprise at RCA’s Sarnoff Laboratory around 1960: “The real visionaries in the early days were to be found in U.S. defense organizations.”Footnote 18
The Computer Industry in the 1980s
By 1980, the world of computing had stabilized. IBM dominated commercial data processing across the corporate world. The “seven dwarves” – the BUNCH companies (Burroughs, Univac, NCR, Control Data and Honeywell) plus the computer divisions of GE and RCA – all knew that IBM was more than a mere competitor. IBM defined and managed the environment in which they sought to survive. Digital Equipment Corporation (DEC) led the minicomputer industry. At the peak of that segment there were some 200 companies focused on automating manufacturing management and financial reporting for smaller companies and for divisions of large ones.
All the computer companies were vertically integrated: that is, the core processing engines were built according to proprietary designs that ran proprietary operating systems often bundled with their own application software and peripheral devices. The goal was to manage complexity for the customer – an important need given the novelty of the technology and the scarcity of trained personnel. The cost was laggardly innovation and customer lock-in. The resultant profit margins were too good to last.
The disruptive transformation of the vertical, centralized computer industry into a horizontally layered, distributed industry was the work of two decades. The developments unfolded on multiple fronts. One was in computer-aided design, manufacturing and engineering software. Here, the technical members of staff required the continuing, dedicated power of a machine that could run complex, data-intensive algorithms. The engineering workstation, networked to specialized servers, challenged and defeated the minicomputer: Sun Microsystems defeated DEC. It was especially significant that the software technologies deployed by the victors embodied open standards. Although the software was customizable by particular vendors, who were subject to competitive pressure to experiment and innovate, the interfaces were collectively agreed and were accessible to all. Thus, products from different vendors could be integrated into a working system.
The UNIX operating system was developed in the Bell Laboratories of AT&T and licensed to the world as a consequence of the 1956 consent decree with the US Department of Justice that precluded AT&T from competing in the nascent commercial computing markets. The Ethernet networking protocol was developed at Xerox’s PARC and successfully promoted as an open standard, in collaboration with DEC and Intel, in competition with IBM’s proprietary offering. The profit margins of the new players who used these open standards were distinctly lower than their entrenched competitors’, but their growth was positive and accelerating.
Client–server computing was proven in niche technical applications where the scale of the addressable market was measured by the number of “seats” occupied by distinct categories of engineers. In time, as the technology matured and as the performance of general-purpose microprocessors grew to match and then to exceed the custom hardware at the core of IBM’s mainframes, client–server systems would penetrate the enormously larger commercial markets, IBM’s domain. The manner in which I was educated to appreciate the economic and investment significance of this revolution came by way of an abstruse academic exercise.
Real Lessons from Artificial Intelligence
A persistent interest in innovative applications of computing had led me to the frustrating frontier of artificial intelligence (AI). Around 1980, Phil Meyer, a colleague of mine at Eberstadt who had followed the oil boom from the electronics industry to the sophisticated end of the oilfield equipment and supply industry, discovered that the chair of the oilfield services company Schlumberger, Jean Riboud, had recruited the entire team of AI researchers from the Stanford Research Institute. On the simple premise that if Schlumberger was interested in AI, then we should be as well, Phil set out to learn all he could, in the style of the unreconstructed, old-fashioned investment analyst that he was. Riboud’s immediate purpose was relatively easy to discern. The core of Schlumberger’s extraordinarily profitable business was the generation of data from proprietary instruments inserted into oil wells to enable analysts to estimate the likelihood of finding hydrocarbons and to assess the magnitude of any find. The data were being interpreted by human experts; if their work could be even partially automated, then Schlumberger’s productivity and profits would rise in tandem.
Phil correctly read Schlumberger’s project as indicative of a broader potential. And so he set out, with me in tow, to explore the broad world of academic and industrial AI research. MIT’s Artificial Intelligence Laboratory, established in 1970 and managed by Pat Winston, was the first stop. MIT was the logical place for us to start because of its early engagement with the field and because we had already built a relationship with its Industrial Liaison Program. But the range of relevant research establishments was broad, covering a multitude of academic and industrial labs and a growing number of start-ups staffed therefrom. Projects included primitive exercises in machine learning of the physical, and the definitely more problematic metaphysical, worlds that humans find it natural to navigate. The center of gravity of these research endeavors lay in “expert systems,” software programs populated with relevant rules for decision-making in specified domains, from medical diagnosis (University of Pittsburgh) to management of shipboard propulsion systems (Bolt, Beranek and Newman). That the researchers in the field had their own chosen language, LISP, and required specialized workstations optimized to run LISP added a certain esoteric allure and exemplified the cult nature of the enterprise.
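For readers who never encountered one, the sketch below gives the flavor of the “expert system” pattern: a fixed collection of hand-written if-then rules fired against asserted facts in a narrow domain. Every rule here is fabricated for illustration and has no clinical standing whatsoever.

```python
# A toy illustration of the era's "expert system" pattern: hand-written
# if-then rules in a narrow domain. The rules are invented for illustration.

RULES = [
    ("r1", lambda f: f.get("fever", False) and f.get("cough", False),
     "suspect respiratory infection"),
    ("r2", lambda f: f.get("rash", False) and not f.get("fever", False),
     "suspect contact allergy"),
    ("r3", lambda f: f.get("fever", False) and f.get("stiff_neck", False),
     "refer immediately"),
]

def consult(facts):
    """Fire every rule whose condition matches the asserted facts."""
    return [conclusion for _, condition, conclusion in RULES if condition(facts)]

print(consult({"fever": True, "cough": True}))
# -> ['suspect respiratory infection']
# The brittleness is the point: a situation not anticipated by the rules'
# authors produces no answer at all, and nothing in the system reads context.
```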
Over the course of a year or more, we visited all the labs and met all the start-ups. We managed not to lose any of our own or our clients’ money in the process largely because along the way we had met a legendary figure on the West Coast who, we were persistently informed, was a radical critic of the AI research agenda and spoke from a position of authority within the world of computer science. At the suggestion of Howard Austen, a knowledgeable, credible and connected consultant whom Phil had retained, one evening after dinner I found myself knocking on the door of a small, nondescript building on a side street off Page Mill Road in Palo Alto. Eventually I was admitted and escorted to a conference room, where I was joined by a tall, bearded man. This was John Seely Brown, or JSB as he had already come to be known. JSB was still an independent scientist and not yet director of Xerox PARC, but he was already a figure with extraordinary reach across the entire space of information technology, from the physics that constituted its foundation to the epistemological issues that attended its applications. For the next two hours, he and I set out to discover what we were talking about – which meant discovering how to find out what we were talking about.
JSB and I bonded over the impossibility of deriving semantic information – “meaning,” that is – from the syntactical rules of language. The purportedly expert systems could replicate only the most simplistic of intelligent behaviors, those following well-defined rules. Genuine expertise, by contrast, is a function of, first, perceiving patterns that distinguish possible signals amid a world of noise and, next, bringing experience to bear in order to interpret them. Meaning, that is to say, is relative to context, and learning to read context adequately is a lifetime’s work.
The ability of computers to track the evolution through time of the elements of complex systems, like their ability to estimate correlations across ever larger sets of data, made them increasingly useful tools for extending the application of human intelligence. However, the expectation that they could be developed into autonomous substitutes for human intelligence was doomed then and remains problematic today, even as the second great wave of investment in AI, funding innovative methods of machine learning, gains cumulative strength.
Lessons from the collapse of AI’s first hype cycle in the 1980s remain relevant in assessing the prospects of the second cycle in which we are now embedded. This time, machine learning techniques are properly focused on pattern recognition in the oceans of data generated and captured in an increasingly digitalized world. We will consider the promises, threats and limits of contemporary AI in Chapter 12. Here it is appropriate to note that transforming data into information, and from that information extracting actionable meaning, remains context-dependent. Bootstrapping the understanding of context from the bottom-up examination of data remains a daunting challenge. That is why the increasingly rich spectrum of machine learning successes shares a common, general character. Either there is an objectively definable pattern to identify (as in the case of image recognition) or there are exogenously defined rules (as in the case of games, from Chess to Go to Poker). When the meaning of the pattern in question is subject to real-time negotiation – as was precisely the case in my first conversation with JSB – comprehending the context is necessarily a shared creative act for which humans remain better equipped than machines.Footnote 19
JSB offered me informal access to PARC, where I got to play with the Xerox Star, the first PC that could run a graphical user interface and be managed by a mouse, and where I had the opportunity to be a naive guinea pig for the assertedly intuitive directions for operating the holy grail: the digital copiers that would transform Xerox’s core business. At this frontier of digital innovation, a future of intelligent client computers distributed across networks and drawing on the power of dedicated servers could be lived in real time.
This was some years before Xerox finally learned how to earn a return on PARC’s extraordinary innovations in the architecture, technology and application of digital systems.Footnote 20 The profitability of Xerox’s patented position in the copier market meant that no start-up business could compete with the economics of the existing business, so the company was passively watching entrepreneurs depart to start new companies when headquarters refused to commit the funds required to turn invention into commercially significant innovation. This was a powerful lesson in one way that the innovator’s dilemma expresses itself in action, crippling the ability of a company with surplus resources to exploit commercially the innovations generated by research funded with those resources.Footnote 21 Warburg Pincus would benefit hugely from such corporate paralysis when it came time, more than a decade later, to challenge IBM’s core engine of monopoly profit. As for Xerox, it finally began earning a return on PARC’s innovations years later by taking minority stakes in spin-off ventures.
In March 1983, we at Eberstadt sponsored a colloquium on artificial intelligence at MIT.Footnote 22 One of my assignments was to induce JSB to participate in what was bound to be a festival of promises that he profoundly disbelieved could be kept. His lecture, on “the high road and the low road” of AI research, stays with me to this day. The high road was the project to give computers the ability to think like human beings. When that project had failed, as JSB correctly anticipated it would, we would be rewarded nonetheless for having, of necessity, followed the low road of incrementally improving how human beings and computers interact. Three years later, the Dreyfus brothers nailed the lid on the pretensions of the first generation of AI research with their definitive work, Mind over Machine.Footnote 23 With their behavior rigidly dictated by rules imposed from outside, the so-called expert systems were actually emulating human apprentices.
My engagement with Xerox PARC provided an education at the frontier of innovation in information technology. More than twenty years later, it would pay an exceptional dividend by validating our shared critique of the initial, fundamentally misguided approach to AI. As JSB had anticipated, the work “wasted” on AI would contribute to the integration of computers into the working and social lives of human beings for three decades and more.
Paul Ricci was a young member of the PARC staff when JSB introduced us in 1983. In September 2000, he left Xerox, where he had risen to the role of group Vice President of Marketing, to become CEO of one of Xerox’s family of sponsored spin-offs, Scansoft. Endowed with Xerox’s optical character recognition technology, which is used to scan paper documents into digital formats, Scansoft was struggling to reach profitable scale as an independent public company. Once Paul had acquired Scansoft’s principal competitor and established a sustainable – albeit slow-growing – base of cash flow from operations, he decided that the time was ripe to address automatic speech recognition, a domain of technological invention that seemed to be persistently a decade from commercial maturity.
Automatic speech recognition was one of those fields of AI research where the rules-based approach had failed to produce adequate results. Paul’s bet succeeded largely because research and development in speech recognition was turning toward the application of increasingly sophisticated statistical techniques to ever larger datasets using ever more powerful computers to identify ever more subtle correlations. In other words, researchers broke the code in speech recognition by using computers as computers, not as pathetically inadequate simulacra of the human mind. It should come as no surprise that much of the underlying science and technology was funded by the Defense Department and other arms of the government.
The first time Paul approached me at Warburg Pincus to ask me to consider backing his vision was in 2002. I listened with academic interest and minimally restrained skepticism. But within two years Paul had purchased relevant technology, and Scansoft had begun to demonstrate both step-function increases in the accuracy of speech recognition and meaningful revenue from a variety of applications of the technology. Moreover, Paul understood that the critical factor was not the raw accuracy of the recognition engine but, as always, customer satisfaction. Unlike the techno-geeks who had been driving the technology for decades, Paul recognized that turning automatic speech recognition from a laboratory curiosity and a science-fiction fantasy into a large-scale commercial solution required taking seriously the delicate process of engineering human beings into the system as back-up and for quality control.
Warburg Pincus acquired Xerox’s residual ownership of Scansoft in March 2004, when annual revenues were somewhat above $100 million. We subsequently funded several strategic acquisitions while the company was on its way to approximately $1.5 billion in revenues in 2011, prior to Warburg Pincus’s exit from its successful investment (and $2 billion in revenues today). One of those acquisitions was the Stanford Research Institute’s entry in the game, Nuance Communications, which carried a far more relevant name for the leader in speech recognition than Scansoft. As Nuance, the company established leadership positions in a range of major markets: voice control of mobile devices; automation of enterprise call centers; dictation, both general purpose under the Dragon brand and with specific applications, such as medical transcription. Extension of the statistical approach to automatic speech recognition led Nuance to the frontier of the current, second generation of AI research: computerized natural language understanding, applicable not only to digital transcripts but to all media of human communication.
The Return of the IPO Market
The passage from PARC in the early 1980s to Nuance in one professional and multiple technological generations illustrates the continuity available from human relationships through discontinuous shifts at the frontier of technology. Long before I reconnected with Paul, when I was still in the microworld of Eberstadt’s research-based investment banking practice, the first hint of a major shift in the capital market context came in the autumn of 1980, when the hugely successful offerings of Genentech and Apple signaled the end of the IPO drought in which we had thrived. The Volcker credit crunch that broke inflation with double-digit interest rates postponed the revival, but by the autumn of 1982 the writing was on the wall or, rather, the growl was in my ear.
The year before, we had financed Daisy Systems, a pioneer in computer-based electronic engineering software. Daisy’s lead venture capital investor was a remarkable individual named Fred Adler. With extraordinary analytical powers and intense purpose, Fred had worked his way from poor Jewish Brooklyn through Harvard Law School to a leading Irish Catholic law firm in New York. From that base, he had put his talents to work as a turnaround artist, taking operational control of troubled businesses and driving them to positive cash flow. His successes reached from Loehmann’s, a chain of discount women’s clothing stores based in New York, to a Silicon Valley semiconductor company backed by some of the Valley’s venture capital elite. Fred’s motto, displayed on needlework pillows in his office, was: “Corporate Happiness Is Positive Cash Flow.” Paying bills by selling products to customers for more than it cost to develop and deliver them endows a company with strategic freedom: independence from the volatile vagaries of the capital markets. This is the always relevant lesson that today’s Unicorns sooner or later will be compelled to learn, and in the vast majority of cases it is bound to be learned the hard way.
Fred had made his decisive step to become a venture capitalist in 1969 by mobilizing the capital to back a brilliant engineer, Ed de Castro, who had left DEC to start Data General, one of the top tier of minicomputer companies that emerged in the 1960s and 1970s. By 1980, Fred had built a substantial venture capital firm, based in New York, with a portfolio that extended from Israel to Silicon Valley. Daisy Systems, a leader in computer-aided engineering for the design of printed circuit boards, was one of his most promising investments when I proposed that we do a second private placement to fund its growth. Specifically, I proposed using roughly the same valuation metrics as the year before, despite hints that the IPO window might be finally opening. My argument was that returning to our institutional clients, long-term equity investors who already knew the company and owned the stock, was a safety play for which a substantial discount from a hypothetical future IPO was appropriate. I knew the game had changed when, on a cold evening in November, I stopped at a pay phone in Columbus Circle on the way home to catch up with Fred, who reported, “Sandy Robertson just told me he’ll do Daisy at Janeway-plus-10 percent!”
Fred and Daisy stayed with Eberstadt for this financing despite the offer of a higher valuation, but the game had indeed changed. As the window opened with breadth and depth and even some speculative excess, both the institutions and the bankers woke up. The latter were led by the “Four Horsemen” of the venture capital ecosystem: Alex. Brown, Hambrecht & Quist, Robertson Stephens (Sandy Robertson’s firm) and Rothschild, Unterberg & Towbin. Even the major firms, such as Morgan Stanley and Goldman Sachs, were drawn to the new business opportunity represented by financing venture-backed IT and biotech companies in the public equity market. And all recognized that pairing research analysts with investment bankers was the way to win the business and to market the stock.
What only some fifteen years before had been a marginal, hardly respectable activity – peddling shares in speculative, early-stage companies to risk-seeking retail investors – had become a worthy and substantial line of business. In what seemed like a heartbeat, our innovation of the late 1970s, research-based investment banking, had become business as usual, although in the generic model it was the bankers who told the analysts where to go and what to do.
By 1984, it had become clear that fundamental investment research as a product of the “sell side” of the market had two possible futures that were emphatically not mutually exclusive: commoditization and prostitution. As food for institutional investors, the path forward was toward commoditization. By the mid-1980s, commission rates on large institutional blocks of shares had fallen more than 50 percent from their former fixed levels, breaking through 10 cents per share (“a dime a dance”) with no bottom above zero in sight. In the absence of a cartel-based subsidy from brokerage transactions, sell-side research could not pay for itself: why would any institution pay for an investment idea if the vendor was simultaneously offering it to all others? In economists’ jargon, the output of sell-side research was a nonexcludable, nonrivalrous good that, once published, any number of competitors would consume and replicate simultaneously without the protection that copyright or patent law offered other forms of intellectual property.
No wonder, then, that talent started leaving the research departments of Wall Street brokers in order to monetize the value of their knowledge almost as soon as commission rates began to fall. By 1980, two of the top analysts of the computer industry had shown the way. Gideon Gartner had left Oppenheimer & Company to start his highly successful business, advising corporate clients on their IT purchasing decisions. And Ben Rosen had left Morgan Stanley to start his newsletter and conference business before he co-founded the most successful new venture capital firm of the 1980s.
The alternative path, the one toward prostitution, was already apparent fifteen years before the revelations that followed the dotcom/telecom bubble. A direct example of how economic incentives constantly threatened analytical objectivity came in our corporate advisory business. Ed Giles, in addition to serving as President and Research Director of Eberstadt, continued to function as the best-ever investment analyst of the chemical industry. In the mid-1970s, he had hired arguably the second-best analyst out of a provincial trust department. One of the chemical companies with whom they had built a close and rewarding relationship was Hercules. There came a day when number two came into Giles’s office with the news that the numbers did not add up to what Hercules was forecasting. So Giles called the Hercules CEO, whose response was: “Ed, ignore it. You’re a president and I’m a president. We have people who worry about the numbers.”Footnote 24
Research-driven investment banking existed in a sea of conflicting interests. When we first met with the management of an interesting company, we would begin by explaining that we had no idea whether we would end by proposing to sponsor the company to our institutional clients (and join them in buying the stock), or propose a merger or acquisition or invite them to put us on retainer to provide strategic advice. We did intend to demonstrate that we understood their business better than any other financial firm. I used to say, “Conflicts of interest exist; the difference between children and grown-ups is that the latter know how to manage them.” Perhaps it should not be surprising that, when the stakes increased exponentially during the Internet Bubble, so many senior bankers and analysts proved themselves to be children. But by then we had long since declared victory and sold our firm.
At Eberstadt, before fleeing from the dual dooms to which the research business was fated, we attempted to escape by moving upstream, incrementally shifting our role from that of investment banking agent to venture capital principal. In 1981, I had hired Jack Lasersohn into our investment banking group. Jack had been top of his class at Yale Law School and was an associate at Cravath, Swaine and Moore when he decided that he needed a more entrepreneurial career path than that available at one of the most prestigious corporate law firms in Wall Street.
Having been an undergraduate student of physics at Yale, Jack was fascinated with computers and computing and was determined to participate more directly in the industry. When we responded as a firm to persistent requests to establish a conventional limited partnership to serve our clients who wanted to be able to make a single decision in committing to our stream of private placements, Jack took the lead in managing Post-Venture Capital, LP, which was chartered to invest broadly in venture opportunities, not just in the deals we originated.
This shift in our center of gravity was not entirely voluntary. Not every one of the companies we backed as investment bankers performed like IMED and Daisy. Given our commitment to our institutional clients, when a company performed badly, we had no choice but to intervene. As I said regularly to our investors, “If we ever lose one of these companies, I will be in the emergency room with my thumb on the carotid artery, covered in blood.”
In such circumstances, our challenge was to work our way out of the role of hired gun in order to sit on the venture capitalists’ side of the table. This was not an easy task, especially given that we had sold, and our clients had bought, common stock that carried neither preferential rights nor board representation. But in critical circumstances our relationship with our institutional clients provided the necessary source of leverage. Before we came to establish our own modest venture capital funds, and beyond any commitments our clients made to them, those clients had deep pockets, far deeper than those of any venture capitalist or of all of them in combination. In practice, their money was accessible only with our active support and on terms we recommended, which endowed us with the ability to act as principal by proxy.
So I learned the venture business by coming in the back door as a sort of cross between a police officer and a garbage collector. By far the most effective mentor I had in this career-changing transition was Fred Adler, that same lead investor in Daisy Systems. While he was operating under the guise of a venture capitalist, Fred’s excellence lay in his ability to take a business apart analytically and dissect the interaction of its functional operations and its financial cash flows. He was a notoriously difficult human being, treating CEOs as subordinates, and subordinates as trash. I used to tell him that the greatest compliment he ever paid me was that he never offered me a job. But it was through two collaborations with Fred that I learned the substantive consequences of taking responsibility as a financier for the economic life of an operating business.
Bethesda Research Laboratories
The first of these collaborations concerned Bethesda Research Laboratories (BRL), a pioneering producer of enzymes and other biological products needed by all who were active in the nascent field of molecular biology and the technologies of genetic engineering. Eberstadt’s involvement with BRL began as a legacy of the firm’s old investment banking franchise. One of BRL’s co-founders was married to an heir to one of Ferdinand Eberstadt’s baby blue chips, the medical device manufacturer Becton, Dickinson.
At new Eberstadt, we had become intrigued with biotechnology in the late 1970s. In 1977, Bob Swanson, the business co-founder of Genentech and a former sell-side investment analyst who knew Eberstadt from those days, had called me to introduce his start-up. After serious exploration of the emergent science of molecular genetics and its potential to deliver clinically effective, commercially significant therapeutic and diagnostic products, we decided not to participate as financiers. Despite the government’s growing support for research in the life sciences through the National Institutes of Health (NIH), the time line from laboratory to clinic was certain to be so long, and the rate of attrition from candidate molecules to FDA-approved drug was certain to be so high, that investment returns were bound to be hugely speculative. No biotech start-up could be expected to reach positive cash flow from operations during the lifetime of the venture funds that launched it. Investment success across the prospective new industry would be far more dependent on the varying state of the public equity markets, for both primary financing and ultimate liquidity, than on the scientific and operational success of the ventures.
Here was a signal example of the game played between financial capitalism and the market economy.Footnote 1 Years later, as I contemplated the persistent determination of venture capitalists to invest in biotechnology, and the persistent willingness of the IPO market to accept and fund ventures with no predictable date for generating revenues, let alone positive cash flow from operations, I came to appreciate a significant truth. In biotech, as in information technology, all ventures are launched into a fog of uncertainty. Both IT and biotech ventures face varying degrees of technology risk: can they actually produce what they are founded to make? As Pike Sullivan characteristically put it: “when you plug it in, does it light up?” In addition, IT start-ups face at least as much market risk: if it does light up, does anyone care? But here lies the fundamental difference: market risk for biotech is radically lower.
With well-defined target patient populations and third-party funding of demand – and conditioned “only” on successfully gaining FDA approval – the prospective revenues of a biotech start-up can actually be modeled at launch, unlike any venture in any other field. Nonetheless, the odds against are long, the number of serially successful biotech VCs dauntingly few and, in retrospect, I do not regret the decision.
We did decide at the time to remain engaged as students of the science and its long-term potential to influence and even transform the healthcare industry. We hired Scott King, a young Ph.D. biologist out of Harvard, who would be dedicated to the nascent domain, and we developed a collaboration with MIT’s Industrial Liaison Program that resulted in a symposium titled “Biotechnology: Status and Prospects.” The conference brought our investment clientele together with scientific leaders in the field. It took place on October 15, 1980 – by coincidence the day after Genentech’s IPO brought the genetics revolution to the investing public’s attention.
Beyond our work with MIT on the research side of our business, we were also attuned to the potential for a “Levi Strauss opportunity.” Rather than backing any of the host of start-ups panning for gold, we wanted to find a business that delivered what all of the prospectors needed to do their work, including those still ensconced in Big Pharma and in academia. This is what BRL did, offering a growing range of the molecular tools needed to conduct genetic engineering. With the NIH as its anchor client, BRL was growing fast and had already attracted a major venture capital investor. Given our demonstrable understanding of the company’s market and technology and a growing track record of success in bringing institutional equity to support the sort of company that BRL appeared to be, in 1981 Eberstadt was hired to execute a private placement that would carry the company through the estimated two years needed to reach the promised land of positive cash flow from operations. And this we did, selling some $20 million worth of common stock (more than $60 million in today’s money) to our best institutional clients.
In barely three months, we learned the truth of the adage “No business is so good that it cannot be destroyed by incompetent management.” The co-founder’s father-in-law, although not a member of the board of directors, had prudently mandated BRL’s choice of outside counsel, thereby maintaining oversight of his financial and familial investment. This was how, in January 1982, we discovered that the young entrepreneur and his scientific partner, despite the presence of that major venture firm on the board, had gone mad. The capital that was to fund BRL over the better part of the two years needed for the achievement of sustainable cash flow had disappeared in a spending spree on people and equipment and facilities unconstrained by any business discipline at all.
I recall hearing the news on a Friday. The initial shock expressed itself in preemptive regret for the loss of what had been a promising business: not BRL, but our own post-venture corporate finance business – and with it, of course, my own career as an entrepreneurial financier. Through the ensuing sleepless weekend, however, I worked my way through the pragmatic logic of the situation. BRL was indeed a promising business with more than $10 million of annual revenue, and it was growing rapidly in a rapidly growing market. In other words, it was worth saving. To save it, however, was going to take time and money: money to buy the time needed to cut costs and stabilize operations. Our clients had ample additional resources from which to fund the turnaround, but we could not ask them for more cash unless we could do so in partnership with new leadership whom we and they could trust to use their money effectively. But, of course, we were hired agents with no seat on the board, and our clients owned common stock with no defensive protections against just such circumstances.
The order of action resolved itself into a conceptually simple sequence of events, each of which had to occur so that BRL – and our business and my career – might be saved. First, we had to secure the commitment of an experienced, credible, operational war leader who would join forces with us. Then, in partnership with this ally, we had to secure effective control of the company, subject to raising the needed new capital. In turn, we would bring our new leader and an agreed turnaround plan to our investors as the joint prerequisites for the needed new investment. Subsequent to the required radical surgery, we would recruit long-term successor management. On Monday morning, with the unanimous support of my partners, I called Fred Adler.
Fred had substantial capital in his Venad fund, but I began by explaining that we had no need for his fund’s cash. In fact, it was critically important that we clear the way for our investors to be the sole funders of the turnaround operation in order to maximize their opportunity to recoup the loss of their original investment. Rather, I told Fred, we wanted to hire him to plan and execute the turnaround, and to this end I offered him 10 percent of BRL’s equity if, as and when we secured effective control and refinanced the business, a commitment that, of course, at the time of the offer, it was not yet legally or practically possible for us to deliver on. I subsequently learned that Fred’s acceptance of our proposal generated intense conflict with his junior partners, who understandably objected to the obvious conflict of interest with his own obligations to his firm and the fund he had raised. At the time, both issues proved to be blessedly irrelevant to his decision to join the project.
The next step was for Fred and me to invite the principals of the incumbent venture firm to meet us in New York for what proved to be a remarkably efficient confrontation with reality. The process was helped by the fact that the venture capitalists knew they did not command the resources required to save BRL. Their choice was clear: immediate and very public bankruptcy and loss of all of their investment, or surrender of their protections against the substantial dilution that our investors’ refinancing of BRL was bound to entail. They acquiesced completely.
The following step was more melodramatic. We had to secure complete agreement to our plan from the two founders of BRL, who still owned effective control of the company. My partner John Hogan and I arrived at the company’s building in Gaithersburg, Maryland, in the afternoon, knowing that if we did not get a signed agreement by that evening to the terms of an emergency bridge loan, which carried with it transfer of control, BRL would not meet its payroll on the following day. Fred was in New York, available to join us by phone at any time.
The founders’ incompetence as businesspeople was easily matched by their powers of denial and evasion. Fred’s extensive repertoire of threats and promises was not prevailing until, long after nightfall, a telephone message was delivered to the office where we were meeting. BRL’s products – restriction enzymes and nucleic acids and other molecular tools of biotechnology – physically existed inside inoculated eggs that were held in a special-purpose rented warehouse. The owner of the warehouse now advised that if he were not paid his overdue rent by the next morning he would literally pull the plug on BRL’s eggs, which meant pulling the plug on all its inventory of products for sale, which meant pulling the plug on the company itself. This, finally, was the catalyst for capitulation.
Within twenty-four hours, Fred had become chair of a newly created executive committee of the board. Within weeks, he directed a substantial restructuring of the business, while we brought in $5.5 million of new capital from our investors. My own transition from agent to principal was confirmed as I, too, joined BRL’s board. By June 1982, Jim Barrett had been recruited from SmithKline to lead BRL, and the company was back on track.
One year later, Ed Giles and I led a strategic process to merge BRL with the GIBCO life sciences division of the Dexter Corporation (which, not coincidentally, had been the first strategic advisory client of new Eberstadt’s research-based corporate business). This merger created Life Technologies, a strongly profitable business with $100 million in revenue – indeed the Levi Strauss of the biotechnology industry.Footnote 2 And Fred had succeeded in constructing an outstanding group of scientific advisors including, most notably, Richard Axel of Columbia, who had already pioneered recombinant DNA technology for application in mammalian cells and who would, some twenty years later, share the Nobel Prize in Physiology or Medicine for elucidation of the genetic basis of the sense of smell. Life Technologies completed an IPO in June 1986 on a basis that provided liquidity on attractively profitable terms to Eberstadt’s investors.
The BRL saga was an intense education in the financial economics of uncertainty at the micro-level. Despite our research into the emergent biotechnology industry and into BRL’s own operations, we had made investments while we were fundamentally ignorant of the competence and integrity of the company’s management. Unlike investors in a public company, when we began to learn what we had not known, our clients could not get out: the illiquidity discount was infinite.
Hedges against Uncertainty
Could we and our investors have hedged against our necessary ignorance? From a pre-2008 point of view – but definitely not from a post-2008 perspective – it is tempting to imagine a derivatives market in which we could have purchased that hedge. In neoclassical economic theory, the central notion of general equilibrium depends on the existence of just such a market. In that fantastical virtual space, rational agents protect themselves from the ontological uncertainties of life by trading Arrow–Debreu securities (named for Kenneth Arrow and Gerard Debreu, both winners of the Nobel Memorial Prize in Economics).Footnote 3 These securities are conceived to provide exactly the state-contingent insurance for which all of us yearn. Markets for goods and services and assets are made complete by the supposition that at any point in time one can buy and sell insurance over every possible future state of the world.
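The textbook construction is simple to state in the abstract (what follows is a sketch of the standard formalism, not of anything that was actually tradable): an Arrow–Debreu security for state $s$ pays one unit if and only if state $s$ occurs, and trades today at a state price $q_s$. With a complete set of such claims over the set $S$ of possible states, any asset with state-contingent payoffs $x_s$ is priced as

$$p = \sum_{s \in S} q_s \, x_s,$$

and any desired payoff profile can be assembled by bundling the elementary claims. The completeness assumption does all the work: it presumes that every relevant future state can be enumerated, contracted upon and priced in advance.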
In the spirit of Arrow–Debreu, let us suppose that an active market in credit default swaps – insurance against the bankruptcy of a company that evolved in the first decade of the twenty-first century – had existed in 1981. Could we have bought protection so we would have been indifferent to the failure of a business on whose quality and prospects we had bet our reputation? Even at the peak of the credit bubble in the first half of 2007, there were only a limited number of corporations against whom it was possible to purchase single-name credit insurance. Any institution prepared to write such a contract on a company at BRL’s stage of development would either have had to charge a premium so huge as to make the hedge uneconomic on its face or itself have been so obviously incapable of evaluating and pricing risk as to be utterly unreliable as a counterparty.
In other words, a market mechanism for hedging the sort of ontological uncertainties that proliferate where entrepreneurial innovation meets emerging commercial opportunity has never existed, is unlikely ever to exist, and will not persist if someone is foolish enough to create it. Here is another aspect of the game between the market economy and financial capitalism: however stationary the processes of the market economy may appear to be, contracts that will guarantee the persistence of such stability through time will never be valid under all the limitless alternative states of the world that may obtain.
Does the specific instance of BRL’s rescue convey some more general lesson? It does. The conjunction of available surplus cash and our success with Fred in leveraging access to that cash to wrest effective control of the company from its founders constituted a retrospective hedge against the adverse consequences of having bet on incompetent managers and inattentive directors. But the succession of contingencies on which our improvised rescue mission depended was terrifyingly tenuous. How much more efficient (as well as less emotionally arduous!) it would have been to hold effective control in the first place so that, if needed, the surplus cash could have been deployed without the necessity of the face-off with the venture capitalists and the late-night cliff-hanger with the founders!
Ever since BRL, I have known that Cash and Control represent the sole conjoint hedge against the radical uncertainty that comes with the opportunity to seek outsize returns from making illiquid investments. This is a more complex proposition than venture capitalists’ clichéd Golden Rule – “Whoever has the gold makes the rules” – which addresses the straightforward, bilateral game between the venture capitalist and the entrepreneur. Cash and Control relates to the open-ended, multi-dimensional game we are doomed to play with the universe at large, addressing the infinite range of possible threats to continuity from outside the frontiers of the enterprise.
My experiences in discovering how to construct defenses against the vagaries of living in this uncertain economic world are far from unique. The most successful venture-backed companies typically hold cash reserves far in excess of what conventional economic theory can rationalize as efficient. To pick five stand-out leaders of the digital economy non-randomly, as of mid-2017 Facebook held $36 billion in cash and short-term investments; Alphabet (parent of Google) held $86 billion; Microsoft held $133 billion; Apple held $77 billion, plus no less than an additional $185 billion in marketable long-term investments. And Amazon, despite Jeff Bezos’s twenty-year determination to re-invest in maximizing growth, nonetheless held $26 billion.Footnote 4 No doubt, a substantial portion of the financial reserves reflects the incentive to generate profits in low-tax regimes and hold cash receipts there, given the exemption from US taxation of profits held offshore nominally for reinvestment. But having accepted radical technological risk in the development of novel products and services, along with radical market risk to discover whether there are customers for their inventions, even the biggest winners in the Innovation Economy understandably choose to accept no financial risk whatsoever.
Although the context was different, the same strategy expressed itself in the fortress balance sheet that Jamie Dimon succeeded in building at J. P. Morgan Chase in anticipation of the Crisis of 2008. At the global level of the game, the turn toward aggressively mercantilist policies by the nation-states of East Asia, led by China, in direct response to the destruction wrought by the International Monetary Fund (IMF) in the late 1990s, has the same pragmatic motivation.Footnote 5 To avoid the threat that the IMF would again impose severe reductions in spending and increases in taxes, thereby accelerating and deepening the contraction of their economies into recession, these nations were determined to achieve the autonomy that Cash and Control ensures.
At the national level, there is a reason why policies aimed at accumulating cash and ensuring autonomy of action are termed protectionist, whether they are implemented by way of an undervalued currency or legislated tariffs and subsidies. Of course, political leaders in these instances are also serving the economic interests of those in the market economy who export and thrive on protectionist policies at the expense of the mass of consumers who suffer at the margin from the adverse shift in the terms of trade with the external world. And, of course, the intensely focused interest of the few whose wealth buys access to those in power always tends to trump the diffuse interests of the many.
The political economy of protection extends far beyond the narrow confines of the efficiency of markets. Only nations that are the most competitively productive and that hold substantial net balances of international assets can afford to implement the pieties of free trade without fear – think Great Britain in 1846 or the United States in 1945 and, as discussed at length in the Conclusion, China today. Friedrich List put it succinctly some 170 years ago: “Any power which by means of a protective policy has attained a position of manufacturing and commercial supremacy can (after she has attained it) revert with advantage to a policy of free trade.”Footnote 6 All other participants on all the fields on which the game is played are on notice to develop strategies of self-insurance.
MicroPro International
The second collaboration with Fred provided an education at an even more granular level. Through working with him to save MicroPro International, I learned how to exercise operational control during crisis – how, that is, to play the role of turnaround expert. The collaboration also represented Eberstadt’s most salient engagement with the PC revolution.
By the early 1980s, the killer applications for the PC had been discovered: tools for automating office work. The most visible was the electronic spreadsheet, and the first electronic spreadsheet to hit the market was VisiCalc, a product that rapidly became a brand. It was made by a company called Personal Software, which changed its name to VisiCorp, gained backing from venture capitalist Arthur Rock and from Venrock (the venture capital arm of the Rockefeller family) and hired a senior manager from Intel as CEO. VisiCorp appeared to be unstoppable. We at Eberstadt had built a relationship with Arthur Rock. According to legend, Rock had followed the New York Giants west to San Francisco and become a founder of Silicon Valley venture capital. He had orchestrated the start-up capital for Intel; had been an investor in Scientific Data Systems, the first computer company to be acquired for $1 billion (by Xerox in 1969); and, before VisiCorp, had joined with Venrock in funding Apple.
Second only to VisiCalc as an early winner was the leading word-processing software program, WordStar, spawned by MicroPro International, whose venture capital investor was none other than Fred Adler. My partner Jack Lasersohn and I set about creating the opportunity to finance these two leaders in this most dynamic market at a time when access to the public IPO market was still uncertain. VisiCorp was undoubtedly the class act, with its premier venture capital backing and professional leadership. Yet we chose to commit to MicroPro, pushed in good part by a valuation of VisiCorp that its board insisted should reflect the quality of its brand above and beyond its operating results. In addition, with BRL we had already experienced first-hand Fred’s distinctive ability to cross over from independent investor to operational leader. This provided a substantial degree of insurance for making an illiquid investment in the immature and volatile world of the PC. It was an insurance policy on which we would have to make a claim.
In early 1982, MicroPro was riding a rocket. Revenues, at $4.2 million in the fiscal year ended August 31, 1981, were on the way to $22.3 million for the new fiscal year. Seymour Rubenstein, MicroPro’s founder, had possessed the vision as early as 1977 to imagine a word-processing program that would run on any of the new wave of personal computers, at a time when the term “word processor” referred to a closed, dedicated and expensive machine from IBM, Xerox or Wang. Rubenstein had teamed up with a genius programmer named Rob Barnaby to produce the first version of WordStar in 1979. When it took off in 1981, he accepted an investment from Fred.
In June 1982, Eberstadt completed another in our succession of post-venture private placements, delivering on the order of $10 million (more than $30 million in 2017 dollars) in return for unregistered, illiquid shares of common stock. And then, with extraordinary speed, MicroPro proceeded to blow up. Revenue forecasts proved as ephemeral as the growth in expenses was inexorable. In the quarter ended November 30, 1982, the company managed to lose $1.5 million on only $6.3 million of revenue.
Fred became an effective ally to us and to our abused clients – a circumstance made more likely by the fact that one of our lead investors, General Electric’s pension fund, was also one of his lead limited partners. Fred’s intervention was needed not least because it turned out that Rubenstein, along with a number of his followers, was a devotee of Werner Erhard and his self-empowerment movement EST. As I came to learn, ESTies (or “EST-holes” as they were called by those who knew them well) believed that “we are each responsible for our own self.” This could all too readily be translated into the maxim: “If I screw you, it’s your own fault.”
As it turned out, Fred convinced Rubenstein that if the latter wished to avoid litigation, a price adjustment to our financing was in order. It was duly delivered by way of 250,000 additional shares issued to our investors at no additional cost. Along with negotiating this compensation, Fred took over operational control of the company, and I joined him as a sort of adjutant with Jack at my side. Fred called for a detailed structural layout of the company, with every employee tagged with her or his direct compensation and placed in the appropriate functional role and reporting relationship. We were assisted by Henry Montgomery, an experienced finance professional whom Fred had managed to convince Rubenstein to hire as CFO and who had no interest whatsoever in denying reality or in resisting the need for drastic action.
Being immediately on hand, Fred was able to reduce headcount by some 20 percent while maintaining operational continuity of the business and without needing any additional capital. More broadly, he showed me how to implement his dictum, “There is no such thing as a fixed cost; what matters is how much time and money it takes to turn what appears to be a fixed cost into a variable one.”
The deep lesson I learned from Fred in this case was to understand the internals of a business by following the cash. He liked to deploy a time-worn anecdote to explain how he learned the fundamental importance of cash flow. Every morning his father would count the currency and coins on his bureau before putting them into his pocket. Every evening, his father would return home from work, empty his pockets and again count the cash. If he had more than he had started with, it had been a good day.
Fred’s emphasis on the primacy of cash flow took on progressively greater significance in years to come. Generally accepted accounting principles (GAAP) seek to match costs with sales by accruing expenses and deferring recognition of revenue independent of the actual transfer of cash between buyers and sellers. The consequent disparities between cash flow and reported profits used to be relatively easy to track. But, by the turn of the millennium, accountants had fallen in love with the economics of efficient markets. They began to require that assets and liabilities on the balance sheet be “marked to market” at “fair value,” as if the latter bore a necessary and consistent relationship to the prices generated from time to time in the inevitably less than perfectly efficient markets of the real world.
This meant that ever more experience and expertise were required to reverse-engineer the GAAP financial statements in order to expose the actual underlying cash flows. Thus, as with the elimination of fixed brokerage commissions a quarter-century earlier, an initiative motivated by an explicit commitment to increasing economic efficiency had perverse consequences. In this case, too, the financial markets were rendered less informationally efficient, to the significant benefit of professional investors with the time, skill and motivation to undo the accountants’ work. As we shall see in Chapter 8, the process of pursuing transactional efficiency at the expense of informational efficiency has been carried a gigantic step further by the rise of index funds and exchange-traded funds (ETFs).
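To make concrete what “following the cash” means once the accountants’ work has to be undone, here is a minimal sketch with entirely hypothetical figures – not drawn from any company discussed here – showing how accrual accounting can report a profit in a period during which the business actually consumed cash:

    # Hypothetical one-period reconciliation of accrual profit to operating cash flow.
    # All figures are invented for illustration only.
    revenue_recognized = 100.0   # sales booked under GAAP this period
    cash_collected     = 60.0    # cash actually received from customers
    expenses_accrued   = 70.0    # costs matched against the recognized revenue
    cash_paid_out      = 85.0    # cash actually paid to suppliers and employees

    accrual_profit = revenue_recognized - expenses_accrued       # what the income statement reports
    operating_cash_flow = cash_collected - cash_paid_out         # what the bank balance experienced

    print(f"Reported profit:     {accrual_profit:+.1f}")         # +30.0
    print(f"Operating cash flow: {operating_cash_flow:+.1f}")    # -25.0

Reverse-engineering a real set of GAAP statements is vastly more involved, but the direction of the exercise is the same: strip away the accruals and fair-value adjustments until what remains is the movement of cash.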
Fred fixed MicroPro. After that disastrous first quarter, the full fiscal year ending August 31, 1983 showed revenues doubling to $45 million and net after-tax profits in excess of 10 percent. The next problem was to convince Rubenstein that he had to turn over managerial control to a professional CEO and yield ownership control if he were to be able to realize a return on his entrepreneurial vision through an IPO. Rubenstein had accumulated sufficient legal baggage to make his appearance in a prospectus as CEO and controlling stockholder problematic. Fred and I recruited Glenn Haney from Sperry-Univac as CEO. And Kit Kaufman of the Heller Ehrman law firm came up with a creative solution to the control issue, known as Founder’s Common Stock. This had the peculiar attribute that, as long as it represented 10 percent or more of the total common shares, it could only be voted to elect one member of the board. It would convert to full voting common stock, share for share, at the time of a merger or sale of the company or when it was transferred to a party entirely independent of Seymour Rubenstein.
MicroPro went public in March 1984 at a valuation of $125 million, and Rubenstein got to sell some $8 million of stock at the offering. The stock held up for a year or so, allowing all of the investors to achieve liquidity before the entry of new competitors, first WordPerfect and then Microsoft Word, cut off WordStar’s growth. Its transient market leadership had been based on code that was necessarily written in a low-level software language to generate acceptable performance from the 8-bit microprocessors that were the engines of the first generation of PCs. The new competitors had been designed from scratch to run on the next generation of 16-bit microprocessors, whose greater processing performance and memory enabled them to support a graphical user interface and to deliver WYSIWYG (“What You See Is What You Get”) renderings of text on the screen. By the time MicroPro’s new management realized it had to make its own core product obsolete, it was too late. This was another lesson to be learned and retained.
MicroPro had performed well enough long enough to deliver liquidity to all of its stockholders, not just Rubenstein. The contrast with VisiCorp was stark. There, management and board alike had been blindsided when Mitch Kapor, a top developer, left to start his own company and launched Lotus 1–2–3, a product that integrated charts and graphs with numerical spreadsheets to deliver killer competition-not-in-kind. VisiCorp never managed to go public, and its investors were entirely liquidated, not liquefied.
Only years later did I realize that in our idiosyncratic collaboration with Fred we were reinventing a wheel originally fashioned by J. P. Morgan himself. Naomi Lamoreaux and her co-authors summarize the process:
Morgan had worked out a technique for building investors’ confidence when he reorganized bankrupt railroads during the 1890s, putting his own people on the boards of directors to reassure stockholders that the business would be run in their interests. The railroads’ return to profitability enhanced his reputation, and Morgan used the same method to promote the securities of the giant consolidations he orchestrated at the turn of the century. Studies … suggest that stockholders responded by flocking to buy the securities of “Morganized” firms and also profited handsomely from their purchases.Footnote 7
Institutional Revolution
The context in which Eberstadt reinvented itself from agent to principal was one of industry-wide institutional revolution. The man who had offered me the chance to give up my doctoral ambitions and join the Morgan Stanley bullpen back in 1970 was Fred Whittemore. “Father Fred” was the longtime head of syndicate at the number one investment banking franchise in the Street, and thus the chief arbiter of its hierarchy of status. In 1979, Whittemore dismissed Dillon, Read and Kuhn, Loeb from the bulge bracket and replaced them with Merrill Lynch, Goldman Sachs and Salomon Brothers. National distribution and trading muscle trumped tradition. Also in 1979, Morgan Stanley itself had a painful moment when IBM required, not requested, that the firm share leadership of a $1 billion debt offering with Salomon. Morgan Stanley refused to surrender its traditional sole manager role, and the upstart Salomon got the business on its own.Footnote 8
As corporate clients learned to use their power, the old traditions of relationship banking faded away. Merger and acquisition advice had been a free service offered by bankers to long-term clients. Beginning in the mid-1970s, it rapidly became a transactional service, with every deal standing on its own and every firm charging what the traffic would bear on a deal-by-deal basis. Lewis Bernard of Morgan Stanley remarked in 1978, “Clients will do more for themselves. Our principal competition is our clients.”Footnote 9 In turn, the major firms, led by Goldman Sachs and Morgan Stanley, began to invest in the people and the computer systems necessary to compete effectively against their clients – institutional and corporate – from the trading desk. This reversal of position was the fundamental change that defined the business of the investment banks, both the independents and those captive inside the universal banks such as Citibank and J. P. Morgan, in the run-up to the Crisis of 2007–2009.
During the 1980s, two developments confirmed the irreversible transformation. To obtain the capital necessary to compete as principals, the investment banking firms had to go public. In 1970, the NYSE had relaxed its generations-old prohibition to make this possible, but the opportunity had only been taken by the major retail wire houses. They had gone public to fund their investments in their branch office networks and in the first generation of the computer systems forced on them by the paperwork crisis of the late 1960s, when stock trading choked on rising volume. Now the wholesale banks followed, leveraging advanced computer systems to trade against their clients with the first generation of mathematical models for pricing financial assets.
At the same time, the Federal Reserve and the SEC began to let the commercial banks creep back into the investment banking business. Glass–Steagall had been established fifty years before to protect the retail depositors of commercial banks against the volatility of the financial markets and against the greed of bank managements intent on exploiting that volatility. The contemporaneous creation of the Federal Deposit Insurance Corporation meant that Glass–Steagall also protected taxpayers generally, since it was now they who ultimately stood behind the insured deposits. These actions condemned the commercial banks to the slow-or-no-growth business of lending to corporate customers that were not substantial enough to access the capital markets directly.
The most aggressive of the commercial banks had historically been the most conservative: J. P. Morgan, the commercial banking side of the House of Morgan, which Glass–Steagall had divided from Morgan Stanley. By 1987, with Glass–Steagall still nominally in force, J. P. Morgan’s fee revenues exceeded its income from the net interest spread on its lending business. The opportunity to play across all of the wholesale financial markets in London with the growth of the Eurodollar markets was a training experience not only for J. P. Morgan but for its commercial banking competitors as well. Appropriately, it was also in 1987 that Dennis Weatherstone, a working-class Brit without a university degree, who had grown up on the foreign exchange trading desk, became the bank’s President.Footnote 10
Wall Street was becoming open to new talent and, with more than a little lag, so was the City of London. There, hard on the heels of the Big Bang of 1986, which eliminated restrictive practices and longstanding guild-like monopolies, the “barrow boys” on the trading desks were generating more profit and taking home more money than the public school and Oxbridge-educated blue bloods of corporate finance and advisory services. As brokerage commissions declined in all markets, volume rose more than proportionately, and trading activity as a source of revenue and profit rose with volume. Moreover, the accelerating proliferation of computers, moving from routine back-office accounting functions toward the trading desk on the front line, created space for new players with new skills.
The combination of intellectual and temperamental qualities that make for successful trading – intense focus, infinite patience for haggling, a propensity for gambling – had always earned a return in the market, albeit a highly volatile one. Now those skills became ever more central to the economics of both banks and brokers. Further, the ability to analyze market data and to devise innovative trading strategies began to generate value. Those with such expertise, whether traders or trading strategists, tended not to be heirs of old Wall Street and the City of London. Relationships yielded to transactions as the source and measure of value, and the sociology of the financial markets was transformed. As more and more classes of financial assets were transformed into tradable securities – from residential and commercial mortgages, to corporate and credit card receivables, to student loans, and on and on – there were ever more transactions and ever more opportunities for the dealer banks to earn attractive spreads versus their less-informed clients.
Analytical skill, the mastery of quantitative techniques, and an all-consuming work ethic – these were required to populate the vast expansion of investment banking practices. First-class credentials that testified to such abilities now trumped family and old school ties. But something was lost as well. There was not much room for eccentricity in the new Wall Street. Perhaps that was merely an aesthetic loss. But formulaic finance and the computers that enabled it made it easy to substitute an algorithm for judgment. When I was learning how to value private companies in the early 1970s, the tools at hand were a Monroe electromechanical calculator and a book of logarithms: it took half a day to run a case, and the analyst thought long and hard about the assumptions employed. Barely ten years later, when I addressed the artificial intelligence colloquium at MIT on the valuation of ventures, the Hewlett-Packard digital calculator had already made it possible to generate innumerable cases at the push of a few buttons, so it was easy to construct whatever model was needed to rationalize the prospect of earning the required rate of return. “And then,” I said at the colloquium, “came VisiCalc.”
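As an illustration of how push-button modeling invites assumption-shopping – a sketch of my own, with invented numbers rather than any actual valuation from those years – a few lines suffice to search growth assumptions until the model “delivers” the required return:

    # Hypothetical sketch of assumption-shopping in a simple venture valuation.
    # None of these numbers come from the text; they only show how easily a model
    # can be tuned until the target payoff appears.
    def value_at_exit(revenue_now, growth, years, exit_multiple):
        return revenue_now * (1 + growth) ** years * exit_multiple

    target_multiple = 10.0   # the payoff "required" to rationalize the investment
    investment = 5.0         # $5m hypothetically invested today
    ownership = 0.25         # for a quarter of the company

    for growth in (0.4, 0.6, 0.8, 1.0):
        exit_value = value_at_exit(revenue_now=2.0, growth=growth, years=5, exit_multiple=4.0)
        payoff = ownership * exit_value / investment
        flag = "  <- 'justified'" if payoff >= target_multiple else ""
        print(f"growth {growth:.0%}: payoff {payoff:.1f}x{flag}")

Only the most aggressive growth assumption clears the hurdle, and it takes seconds, not half a day, to find it.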
The lessons I learned from collaborating with Fred Adler to generate positive returns from start-up ventures that had seemed destined to go bankrupt cut against the grain of how modern finance theory instructs investors to manage risk – namely, by diversifying. For the venture capital investor, a fund portfolio typically consists of no more than twenty-five positions, usually in no more than two or three industrial sectors and often concentrated in only one; this is hardly an opportunity for substantial diversification. Moreover, each position is definitionally immature as a business and is subject to failure along any of several dimensions, including managerial competence, technological efficacy and market acceptance. In a venture capital portfolio, that is to say, idiosyncratic risk is both very great and quite homogeneous. And, as in the case of BRL, it cannot be hedged through any sort of transactions in markets that either do not or cannot exist. Thus, the counterpart of learning the game of venture capital in the trenches was learning that modern finance theory is largely irrelevant to its practice.Footnote 11
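A minimal simulation, with parameters chosen purely for illustration, suggests how little diversification a concentrated venture portfolio actually buys: when a couple of dozen failure-prone positions share a common sector factor, the dispersion of fund outcomes stays stubbornly wide.

    # A minimal sketch (invented parameters) of the risk in a sector-concentrated,
    # failure-prone venture portfolio versus a much larger one.
    import numpy as np

    rng = np.random.default_rng(0)

    def portfolio_multiple(n_positions, trials=10_000, p_fail=0.6, sector_corr=0.5):
        # A shared sector shock moves every position in the portfolio together.
        sector = rng.normal(0.0, 1.0, size=(trials, 1))
        idio = rng.normal(0.0, 1.0, size=(trials, n_positions))
        signal = np.sqrt(sector_corr) * sector + np.sqrt(1 - sector_corr) * idio
        # Each company either fails outright or returns a lognormal multiple.
        fails = rng.random((trials, n_positions)) < p_fail
        multiples = np.where(fails, 0.0, np.exp(0.5 + signal))
        return multiples.mean(axis=1)   # equal-weighted fund multiple per trial

    for n in (25, 500):
        m = portfolio_multiple(n)
        print(f"{n:>4} positions: mean multiple {m.mean():.2f}, std {m.std():.2f}")

Under these invented assumptions, moving from 25 positions to 500 narrows the spread of outcomes only modestly, because the shared sector exposure – the homogeneous idiosyncratic risk described above – cannot be diversified away inside the portfolio.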
My collaboration with Fred not only defined a technique for addressing the chances and contingencies that face the venture capitalist; his mentoring also represented a case study in what Perry Mehrling calls “the money view,” which focuses on the continuously evolving present moment in which “cash flows emerging from past real investments meet cash commitments entered in anticipation of an imagined future.”Footnote 12 In fact, as I discuss at length in Chapter 8, what Fred taught me at the level of practice corresponds exactly with what Hy Minsky was teaching me concurrently at the level of theory. Minsky’s “survival constraint” binds at the point at which currently due obligations cannot be met from operating cash flow, from new security issuance or borrowings or from asset sales. So I had the opportunity twice over to absorb the core of Minsky’s extension of Keynes to encompass the “Financial Instability Hypothesis” more than twenty years before the world would have its Minsky moment in September 2008.Footnote 13
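Stated in my own shorthand rather than in Minsky’s notation, the survival constraint just described reduces to a one-line test:

    # A restatement, in my own labels, of the survival constraint described above:
    # it binds when the cash obligations falling due now cannot be met from the
    # available sources of cash.
    def survival_constraint_binds(obligations_due, operating_cash_flow,
                                  new_financing, asset_sales):
        return obligations_due > operating_cash_flow + new_financing + asset_sales

    # e.g. survival_constraint_binds(10.0, 4.0, 3.0, 2.0) -> True: a hypothetical squeeze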
While I was making my professional transition from investment banker to venture capitalist, it was becoming clear that the Eberstadt firm as a whole could not succeed in similarly transforming itself. To try would have entailed dismissing more than half of our partners and employees. Not even Fred could have worked out how we could reduce costs to where they could be covered by the management fees from two modest post-venture funds plus investment banking and corporate advisory fees while we maintained our distinctive competitive position. The relationships that the brokerage business had created with our institutional clients were crucial to the research-based investment banking model even as the brokerage business generated ever less direct revenues. We were facing a contradiction in our operating environment that we lacked the power, not the understanding, to resolve.
So, in September 1985, we were fortunate to learn that Robert Fleming & Company, a global investment manager and bank and a major institutional client of the firm, was seeking an American link between its London base and the hugely successful Asian joint venture it had established with Jardine Matheson in Hong Kong. As we walked back from closing the sale of Eberstadt to Robert Fleming, Pike Sullivan remarked: “Well, we got rid of the black queen.”