In this autobiographical statement, the author conveys several lessons learned over the course of developmental research on social cognition, moral commitment, character, and purpose. The lessons include: (1) always check theoretical claims against real-world observations and intuitions; (2) always start a research program with deep attention to the field's past achievements, including those made decades ago; (3) employ available insights from humanities domains such as literature, philosophy, and theology; and (4) approach any new research topic with both small-scale idiographic methods and larger-scale nomothetic study methods. The chapter closes with an explanation of the author's choice to focus on problem-centered research rather than general theory-building, with the hope that studying specific problems would inform theory-building, so that the research benefits would flow both ways, from theory to real-life problems and back again.
My intellectual interests in human development began in high school, and at that time my interests were literary in nature, as they had to be: in those days psychology wasn't taught in high school. But I was fortunate to have chosen German as my foreign language. In German literature, there is a Bildungsroman tradition that produced compelling "coming of age" stories. I remember becoming fascinated with the accounts of challenges and accomplishments of youth development that I found in Goethe's Wilhelm Meister and Thomas Mann's Tonio Kröger, and after that avidly reading, in a similar light, writings in English such as Joyce's Portrait of the Artist as a Young Man, Dickens' Great Expectations, and all of J. D. Salinger. In these readings, I was taken by the common themes of learning by observing, growing through experimentation, youthful ambition, and the wonders of discovery. I loved the idea of personal progress as an objective of growing up.
While still in high school, I tried my hand at writing some fictional coming-of-age stories. It did not take me long to realize that I did not have the outside-the-box imagination necessary for a career in creative writing. But sometime in the summer after my high school graduation I stumbled upon a book of readings (History, Psychology, and Science: Selected Papers) assembled by E. G. Boring, one of US psychology’s founding figures. I remember nothing specific about any of these readings, but I do recall becoming transfixed by the possibility of studying in a systematic way the developmental themes I had encountered in the coming-of-age fiction that had intrigued me in high school. When I started college at Harvard, I signed up for an introductory psychology course right away.
Once again, I was fortunate. That year, Roger Brown was assigned to teach Harvard’s introductory course in social psychology, and it was a mesmerizing offering. Roger included all kinds of material – such as the work of naturalists like Tinbergen and developmentalists such as Piaget and Kohlberg – that would not have appeared in a more conventional treatment of social psychology. Roger’s course had the same quality as his famous, broadly conceived textbook of that name. It gave me an ideal introduction to the overall field. The following year I was brought deeper into developmental studies by Jerome Kagan’s galvanizing introductory course. Although later I was to have some disagreements with Kagan on research matters, he stoked my interest in developmental psychology with stunningly brilliant lectures in his basic course.
The other lasting discovery I made while in college was the appeal of qualitative research designs, in particular semi-structured interviews and the associated interpretive schemes. One of Roger Brown’s graduate students, Doug Carmichael, took enough of an interest in me to give me paperback editions of Piaget’s Language and Thought of the Child and The Child’s Conception of the World, where I discovered the interview approach (la méthode clinique) that was to become my lifelong method of choice for exploring human development.
This was a personal choice that followed directly from my original interest in stories from a literary tradition, stories that capture many aspects of the whole person, mind, and heart and all the nuances of everyday experience. Interviews can provide a window into virtually any part of the person’s experience. But my personal choice did not stand alone in the context of those times (the mid to late 1960s). American psychology was just emerging from the shackles of behaviorism, and the associated commitment to exclusively quantitative, “experimental” methods, and there was a glow of almost giddy exuberance as cognition, the mind, and all forms of human development became acceptable targets of exploration. At Harvard, the excitement was tangible. Jerome Bruner (who gleefully described behaviorism with the decidedly unscientific adjective “crass”) had established a Center for Cognitive Studies, Erik Erikson was lecturing in grand style, and Roger Brown was hosting advanced seminars that pushed the boundaries of language research into all aspects of mental experience. As a capstone to my own undergraduate education, I was granted the opportunity to have Roger as my senior thesis advisor on the unlikely topic of how children processed the emotions enacted by Agamemnon and Achilles in Homer’s Iliad.
My work on that senior thesis revealed to me the orientation that would characterize my particular approach to research in human development. The revelation came about as a result of a disagreement I had with Jerome Kagan, one of the members of the faculty committee set up to review my thesis. In the thesis, I described two differing modes of understanding that children expressed when speaking about the emotions (anger, jealousy, fear, and so on) that the characters displayed in the Iliad. One of the modes succeeded in capturing the emotions accurately, whereas the other was capricious and unreliable in trying to identify the right emotions. Kagan asked me to control for the number of words that the children in each camp used, under the assumption that having superior linguistic facilities (vocabulary, sentence construction, and so on) would explain any differences in identifying the emotions correctly. I resisted, because it seemed to me that any association between better answers and number of words could reflect the opposite causality: a superior understanding could lead to greater word use, rather than the other way around. But in retrospect, Kagan's suggestion would have added to the information I had to work with, and I should have taken it, rather than stubbornly sticking to my own simplistic assumptions. Yet this experience revealed to me my personal aversion to the hypothesis-testing mentality that relies on an assortment of controls whose ultimate meaning is far from certain, and it awakened my resolve not to explain psychological data with any single form of determinism, linguistic or otherwise.
Summing up where I stood as a neophyte developmental scientist before graduate school: I had determined that my interests lay in the development of the whole person; that my methods of investigation would draw from the interview and narrative traditions rather than experimental methods favored by behaviorism and its sequelae; and that, as a result of these choices, my own efforts would be dedicated more to open-ended discovery research than to hypothesis testing.
In my senior year of college, I applied for doctoral training at Harvard, Berkeley, and other places that were investing in human development, just then becoming a growth area. My statement of academic purpose, as I recall, proposed a plan to study "developmental sociology," whatever that might be (and I had only a vague idea myself). I was hoping to combine insights from Jean Piaget, Erik Erikson, Erving Goffman, and Talcott Parsons to discover the ways that human progress could foster social harmony and personal fulfillment. It was mostly idealistic nonsense, and my great advisor Roger Brown, who read my application after I had submitted it to Harvard, called me on it. In a departure from his usual considerate and gentle manner, he spoke with a tone of annoyance in his voice when he told me that the committee on graduate admissions at Harvard had turned down my application. He asked me (and I still remember his exact words) why I had "made such a mess of" my application. He graciously offered me the chance to submit a revised statement, but I felt too embarrassed (and/or peeved) to do that. Fortunately, Berkeley overlooked what I wrote on its application and accepted me, likely because Roger Brown had written me a complimentary letter of recommendation.
Developmental studies were a high priority at Berkeley when I arrived in 1970, as they were in many other psychology departments at the time. I did manage to study how social and individual dimensions of life experience come together (or "dynamically interact," as we now say) to spur growth in mind, character, and other adaptive capacities. But given how much is now known about the dynamic interplay of developmental systems, it is hard to recapture how out of the box this line of research seemed in the early 1970s. I remember the puzzlement I encountered while seeking a dissertation committee for a study of children's social cognition. Some of the faculty I spoke with had not heard the phrase "social cognition" and wondered whether it was even a real thing. But my two closest advisors, Jonas Langer and Paul Mussen, were open-minded (even curious) about this direction and gave me the support I needed to pursue it. Still, I felt I needed to justify my interests in this new area with a dedicated essay that became one of my early publications after graduate school (see Damon, 1979).
The research I began while a graduate student at Berkeley was aimed at uncovering the previously uncharted depths of social cognition that young people use in their everyday lives. At that time, not only were many in the field unacquainted with the very notion of social cognition, but the relatively few studies that had been done were ill-conceived studies of "person perception" (a phrase that stood for social cognition in those days). Those early studies concluded that a child's understanding of persons moves from the "overt" (surface qualities such as physical looks) to the "covert" ("inner" traits such as thoughts and motives). I knew that this couldn't be true because of my own young children's frequent ruminations about the intentions of family members and peers. Moreover, I believed that, for reasons of social adaptation, the more important focus of social cognition is on the relations and transactions between persons, rather than on persons as discrete individuals.
Another well-known developmental perspective on social understanding at that time was Kohlberg’s moral reasoning stage system, which described the origins of moral cognition as a total respect for power and authority. In this view, children begin thinking about the social world as structured around the commands of authority figures such as parents and God, and they make judgments about right and wrong accordingly. But, again, I knew from my own children that fairness is a big deal even to preschoolers; and they do not rely on adult commands to pursue fairness. A child will share toys, and expect sharing from others, without being told to do so by a higher authority. This reveals that, at least in the area of distributive justice, even very young children have their own active views that are distinct from, and can even disagree with, those of powerful adults. As I prepared for my dissertation, I discovered old studies that confirmed this, conducted by the pioneering child psychologist Charlotte Bühler back in 1920s Vienna. This discovery made me aware of our field’s unfortunate tendency to forget (or ignore) its own past achievements, which is a systemic mistake that I have tried not to perpetuate.
I reported findings from my early studies in the standard developmental journals, but I found the format of book-length treatment to be more congenial to my narrative interests. Soon after graduate school, while in the first year of my assistant professorship at Clark, I wrote The Social World of the Child, covering my own early research on children’s concepts of fairness and the work of other psychologists who were writing about children’s concepts of friendship and authority. The book was well received, landing the lead review in Contemporary Psychology (the prominent journal founded by E. G. Boring), which gave me special satisfaction. As my career developed, I devoted far more of my writing efforts to books than journals, in part because discursive book writing comes more naturally to me and in part because books often have a broader public audience. Yet I have continued writing journal articles where appropriate, because journals that demand extensive peer review deservedly have greater credibility in the scientific community; and I know I have a responsibility to subject my work to the rigors of such review on a regular basis.
For the remainder of the 1980s, I conducted research that charted the growth of social and self-concepts from childhood through adolescence. I wound down this phase of my research when it dawned on me that none of these studies had shed enough light on actual social behavior. Then, in 1988, an opportunity to examine behavior in real-life contexts was offered to Anne Colby (my spouse) and me by a Social Science Research Council committee (which included David Feldman and Howard Gardner) that gave us a small grant to study "moral giftedness."
At the time the only paradigm that might be used to address this subject was Kohlberg’s stage system, which defined an elevated mode of reasoning (Stage 6) that was so rare that it was actually left out of the official Kohlberg stage-scoring manual. Anne and I agreed that this mode was too ethereal to provide a sufficient basis for anything as bighearted as moral giftedness. We knew we would need to examine character strengths such as courage and truthfulness. Kohlberg had intentionally dismissed these as haphazard components of an unanalyzable “bag of virtues.” So, although Anne and I each had found value in Kohlberg’s work, we disagreed with him about the importance of character strengths (or “virtues”), which we intuited would be more closely linked to behavior than stages of reasoning.
With a total of $12,000 in funding, half provided by SSRC and half by a California group (the Institute of Noetic Sciences) interested in “the altruistic spirit,” Anne and I launched the study of “moral exemplars” that was to result in our book Some Do Care: Contemporary Lives of Moral Commitment. To select the moral exemplars for our study, we spent a year on a nominating process that enlisted twenty prominent philosophers and theologians to help us develop selection criteria and identify living people who met those criteria. We achieved solid consensus on the criteria (despite the variety of ideologies represented by our nominators), and we arrived at a sample of twenty-three living nominees who agreed to be studied through our interview and case study methods. The book that reported our findings gained widespread attention, both for its findings and its methodology, and it is still in wide circulation today.
The findings of Some Do Care yielded several surprises. We had gone into the study expecting to hear these extraordinary individuals inform us of how they sustained their courage and controlled their fears when they felt called on to lay their lives on the line for civil rights, justice, world peace, and other moral causes. But all of our subjects denied needing courage for what they did. They felt they had no choice other than to do what they believed right. Whenever faced with daunting moral challenges, they mostly needed to remind themselves who they were. We took this finding as an indicator that it is moral identity, rather than moral reasoning, that is the key factor in moral behavior.
On the methodology front, the book introduced a new version of psychology's venerable but out of fashion case study approach, a method we called "exemplar methodology." The idea is that in-depth studies of extraordinary individuals can uncover normal developmental processes in a uniquely clear way. The book, and follow-up special sessions at SRCD, SRA, and elsewhere, spawned an upsurge of interest in exemplar methodology. Over the thirty-year timespan since the original publication of Some Do Care, exemplar methodology has become a crucial research tool used widely in psychological studies of capacities such as creativity and wisdom, in moral and character development research, and in studies of "Good Work" that Howard Gardner, Mihaly Csikszentmihalyi, and I conducted in the early years of the twenty-first century.
The Good Work Project, led by Howard Gardner and described elsewhere in this volume, explored how dedicated professionals manage to conduct work that is both excellent and ethical, even under conditions of pressure and negative incentives. The Good Work Project examined vocations such as education, journalism, science, and business. The finding from the Good Work research that most interested me was the lucidity and conviction with which dedicated workers committed themselves to their field's public mission. This led me into the third phase of my research career, studies in the formation of purpose. I saw purpose as the personal (and thus psychological) equivalent of a mission (a sociological notion) in a vocational field. Just as the Good Work Project aimed to understand how vocational fields pursue public missions, I wished to understand how individuals build and sustain their personal purposes.
My focus on purpose also drew on my longstanding interest in positive features of human development. As I noted, the type of fairness that my dissertation focused on was distributive justice – or what I called “positive justice” in my dissertation and subsequent Social World of the Child – to distinguish it from retributive or punitive justice. More recently, I had been a participant in two lively movements that emphasized the positive dimensions of psychological functioning: positive psychology, founded by Martin Seligman and Mihaly Csikszentmihalyi in 2000, and “PYD” (“Positive Youth Development”), for which Peter Benson, Richard Lerner, and I were the initial advocates in the late 1990s. Both movements urged greater attention to the psychological strengths and assets that all people display. The two movements shared the sense that previous psychological writings had distorted human nature by their overemphasis on problems, neuroses, conflicts, and what Peter Benson called their focus on “deficits.”
The importance of purpose for leading a fulfilling life had been recognized for centuries, in philosophy, religious teachings, and popular writings. I found the philosophical and spiritual writings on purpose to be rich and informative. Before settling into my own research approach, I digested the highlights of these writings into a short book of sayings and quotes, interspersed with essays I wrote summarizing what was then known about how purpose functions in human life and how it develops over the lifespan. The eminent author David Myers did me the honor of writing an introduction to this little book, which I entitled Noble Purpose. Immediately after that, I launched my research program on how people at all ages find and pursue purpose, beginning in early adolescence and continuing through the entire lifespan.
Psychological writings about purpose predated my research. Austrian psychiatrist Viktor Frankl, while imprisoned in a German concentration camp, conceived his psychological theory that identified purpose as a noble antidote to life's destabilizing misfortunes and stresses. Frankl wrote that commitment to a purpose can provide resilience against psychological maladies such as anxiety, depression, and despair. In a positive sense, purpose can provide inspiration, energy, and contentment. Although his theory was published in English as Man's Search for Meaning, Frankl in fact used the German word Zweck, connoting "purpose" rather than "meaning." His book's actual title before its English translation was Nevertheless Say Yes to Life.
Frankl founded a school of psychological counseling called "logotherapy," based on the idea that purpose should serve as a primary objective of a well-lived human life – not, as other schools of psychology (such as behaviorism and Freudianism) had claimed, last in line after the satisfaction of a host of biological and material desires. Frankl's ideas about purpose were nonreligious. More recently, Pastor Rick Warren offered an influential examination of faith-based, Christian purposes in his Purpose-Driven Life (2002). My own focus has been on the entire range of purposes, from the mundane to the heroic, that provide direction, meaning, and fulfillment to people of all ages, abilities, and backgrounds. I have looked for purposes wherever I can find them, and I've tried to explain how they develop and how they affect those who pursue them.
Although purpose is associated with related concepts such as passion and meaning (and is often spoken of interchangeably with such concepts), it is distinct from each of them. This is important to understand, because science should not use more than one name for the same concept, nor should it use the same term differently on separate occasions (and the same applies to rigorous practice and scholarship in general). After examining how purpose has been used in a variety of scholarly fields, the definition that my team and I came up with to capture its unique features is this: Purpose is an active commitment to accomplish an aim that is both meaningful to the self and of consequence to the world beyond the self.
On the everyday front, purpose can be found in a parent caring for a child. It can be found in workers doing their jobs with pride and responsibility. Citizens voting or campaigning display civic purpose. Neighbors organizing a block party display community purpose. Members of religious congregations pursue purposes of faith or spirituality. Musicians practicing scales, artists painting the colors of a sunset, and poets crafting poems are pursuing aesthetic purposes. Supervisors who train employees for vocational growth manifest mentoring purposes.
Purposes need not be heroic or extraordinary to provide their psychological and social benefits. But this is not to say that purposes are quickly or easily found. My teams’ research – in both our qualitative and quantitative studies – has shown that purposefulness is a late-developing capacity that grows gradually over the course of the lifespan for those who have had the opportunity to commit themselves to causes they believe in.
In addition to my research, I have dedicated part of my scholarly career to editing collections of other scholars’ writings. Two engagements were especially significant: in 1978, I founded the series New Directions for Child and Adolescent Development; and in 1998 I became editor-in-chief of The Handbook of Child Psychology (fifth edition, followed by a co-editorship, with Richard Lerner, of the sixth edition in 2006). Editing the New Directions series enabled me to learn about vast areas of the field beyond my own. Based on my work with that series, I was offered the editorship of The Handbook of Child Psychology, which for over sixty years had been the landmark organizer of research in our field. With an extremely talented group of co-editors (Richard Lerner, Robert Siegler, Deanna Kuhn, Nancy Eisenberg, Irving Sigel, and Anne Renninger), we produced a four-volume set that gave the handbook’s grand tradition a forward-looking treatment similar to the New Directions series.
When I was in graduate school, I had the impression that the pinnacle of achievement in a social science was to create a theoretical model that would leave a mark on the field. Grand theories were the order of the day, esteemed and hotly debated. In psychology, attention was riveted on distinctions between Freud, Skinner, Piaget, Werner, Vygotsky, Bronfenbrenner, and a host of successor theoretical-model-builders. To do basic research in service of creating, improving, or even rejecting one of the field's grand theories seemed like the highest possible intellectual calling.
In some ways, a concern with the “big picture” of theory had been a welcome change for American psychology, which was just emerging from a long bout of what had been derided as “dust-bowl empiricism.” The field of child development had moved past the lifeless agenda of simply cataloguing skills and behaviors of children at various phases of their growth trajectories, such as recording “a day in the life of the child” from dawn to dusk. The theoretical debates of the mid to late twentieth century brought excitement and drama to the field and added value to the research that scholars were choosing to do. When I edited The Handbook of Child Psychology in 1998, it was the first of its four volumes – the one dedicated to the major theoretical models of our times – that felt most crucial to me. But when I edited the Handbook again in 2006, I did not feel the same way. This time the volumes on research and practice felt more pressing.
Theoretical discourse has its limitations, especially when it becomes detached from the problems and data that the theories are intended to explain. It was hard to use insights generated by nuanced discussions, for example, contrasting Piaget with Vygotsky, to answer common questions such as: What kinds of schooling work best for all the various kinds of students today? How do television, computers, or video games affect learning during childhood and adolescence? What kinds of friendships do youngsters benefit most from? Why do many youngsters gravitate towards anti-social and destructive behavior? How can adolescents acquire goals and motives that will shape their life choices in positive directions? What family patterns influence the perspectives of the young? Have such influences changed over time? Do they vary across cultural context? These are the kinds of questions that people outside our field seek answers to.
It seemed to me, then, that a more problem-centered type of research was needed to address such questions. In fact, I was by no means the only scholar who made this observation. In 1975, my old mentor Roger Brown announced in a textbook that “the days of the grand theory are over.” Yet a return to sterile empiricism would lead nowhere. What seemed apparent to me was that the profound conceptual work that had gone into building and critiquing psychological theories could be mined for its potential for helping us to understand the common and fundamental problems of human development.
For my own work, I decided to pursue problems of interest that could be addressed by an approach informed by theoretical distinctions but not bound to any one theoretical or ideological system. I also hoped that grappling with such problems would inform theory-building, so that the benefits of such research could flow both ways, from theory to real-life problems and back again. In this way, my research has followed the climate of our times, although it is also the case that I have constructed it by my own lights.
My words of advice for younger scholars follow directly from what I have found most rewarding in my own career: choosing to do what I believed in and what I found compatible with my particular interests and abilities. This often meant ignoring the most apparent benchmarks of achievement in my field, such as conventional study designs and high numbers of publications in prestigious journals. But I am certain that the value of my scholarly contributions has benefited from such choices. Moreover, my own career situation has worked out well over the years. I cannot say whether this has been due to luck, to the choices I made, or, most likely, to a combination of the two. But during any ups and downs I experienced along the way, I always had the reassurance that, whatever happened, I was doing something I believed in. As my research on the correlates of purpose has shown me, this can be a reward in itself.