YPP Network Description

The MacArthur Research Network on Youth and Participatory Politics (YPP) formed out of recognition that youth are critical to the future of democracy and that the digital age is introducing technological changes that are impacting how youth develop into informed, engaged, and effective actors.

Howard Gardner
Hobbs Professor of Cognition and Education / Harvard Graduate School of Education

Three Founders of The Good Work Project Named Among Most Influential Psychologists

April 19, 2018 - 9:34am

Howard Gardner, Mihaly Csikszentmihalyi, and William Damon, the three founders of the Good Work Project, have been recognized among the fifty most influential living psychologists in an online ranking by TheBestSchools.org.

Howard Gardner comments below on this news.

Generally speaking, I pay little attention to rankings. To be sure, it’s nice to be ranked higher, rather than lower. But I am sufficiently familiar with the process to know that one can easily finagle the ratings; and that they are, at best, a very imperfect index of quality, however scrupulously they are assembled and displayed.

That said, I was pleased to see the list of “The 50 Most Influential Living Psychologists in the World” as determined by The Best Schools. On that list were my close colleagues Mihaly (Mike) Csikszentmihalyi and William (Bill) Damon, and me.

In 1994-1995, the three of us had the privilege of full-year fellowships at the Stanford Center for Advanced Study in the Behavioral Sciences. And it was there that we conceived of a ten-year project—a study of the professions—which came to be called “The Good Work Project.” That project yielded ten books, scores of articles, and various tools for the workplace. To this day, almost a quarter of a century later, we each continue our own work in the spirit of the original project—in my case, under the title “The Good Project.”

I am pleased to thank the Center for Advanced Study in the Behavioral Sciences for enabling this collaboration and to salute Mike and Bill, my close colleagues and friends—now recognized as “influential psychologists.”

Howard Gardner

To learn more about this work, please visit TheGoodProject.org.

Categories: Blog

American Philosophical Society Publishes Jerome Bruner Memoir

April 16, 2018 - 12:32pm

Howard Gardner’s memoir of Jerome Bruner, the pioneering cognitive psychologist who passed away at the age of 100 in 2016, has been published by the American Philosophical Society.

In this reflection, Gardner provides an overview of Bruner’s life, work, and influence, including personal recollections. The essay appeared in the December 2017 issue of the Proceedings of the American Philosophical Society.



A Requiem for “Soc Rel”: Here’s to Synthesizing Social Science

April 10, 2018 - 9:57am

As both an undergraduate at Harvard College in the early 1960s and a doctoral student at the Harvard Graduate School of Arts and Sciences in the late 1960s, I studied in a field called “Social Relations”—universally shortened to “Soc Rel” (and pronounced “Sock Rell”). Right after I received my degree in 1971, the field was terminated. Almost no one nowadays has even heard of Soc Rel, and accordingly, its demise has not been lamented. Yet I believe it was an excellent example of interdisciplinary social science. We should seek to preserve the valuable lessons that it embodied.

First, a brief potted history. In the late 19th century and early 20th century, the social sciences were born. Following loosely on European examples, American scholars began to carry out empirical research in psychology (e.g. experiments); in sociology (e.g. surveys); and in anthropology (e.g. field work in remote cultures). (Comparable work was carried out in other social sciences, such as political science, economics, and linguistics, but that’s another story.) These fields of study often spawned departments in universities; and there were also collective enterprises across fields—as organized, for example, in the New York-based Social Science Research Council (SSRC), founded in 1924. (SSRC supported my research in the early 1970s, and I subsequently served on its Board.)

The period before, during, and after the Second World War saw considerable interdisciplinary work in the social sciences (some, indeed, spurred by WWII). In my own field of developmental psychology, there were Bureaus of Child Welfare in several Midwestern universities, and Committees or Departments of Human Development at schools like Yale and the University of Chicago.

The establishment of institutes, committees, and departments is almost always a joint product of history, biography, and funding (chiefly private foundations, in those days). Disciplines develop alone, in tandem, or, more rarely, together, while scholars from these fields also carry out their work alone, in tandem, and, more rarely, together. At Harvard, during the 1940s, there was an unusual collection of distinguished scholars who came to know one another and to be invigorated by one another’s work. Specifically, the major figures were psychologists Gordon Allport and Henry A. Murray; anthropologists Clyde Kluckhohn and Cora DuBois; and sociologists Samuel Stouffer and Talcott Parsons. Among these scholars, Parsons was notably ambitious: he was intellectually ambitious, trying to tie the social sciences together through a conceptual framework (very influential in its time, now largely forgotten); and he was organizationally ambitious as well, thinking/hoping that the heretofore separate disciplines could, if integrated, prove to be far greater than the sum of their parts.

Hence in 1946 the Department of Social Relations was launched, as both an undergraduate major (or concentration) and as a doctoral degree department. (The name is truly awful!) For a while it thrived, because of the leading scholars involved; because of the interesting work that they carried out, sometimes jointly; and, it has to be stated, because Soc Rel was seen as being an easy major, one favored by many athletes.

And now an autobiographical turn. When I entered college in 1961, I had not heard of Soc Rel (probably very few high school students had). I assumed that I would be a history major and that I would go on to law school (adults had often told me that I would become a lawyer some day). But through a combination of circumstances, I became interested in this new field of study (new to me, still new to the academy), and when I was turned off by my sophomore tutorial in history, I decided to switch to Soc Rel—which turned out to be a fine home for my interests and my intellectual style.

When I look back on this experience and wonder why I was attracted to Soc Rel, I can identify two separate reasons. On the one hand, I liked very much several of the major professors—sociologists David Riesman, Daniel Bell, and Charles Tilly, personality psychologist Henry Murray, cognitive psychologist Jerome Bruner, psychologist of language Roger Brown, anthropologists David Maybury-Lewis and Lawrence Wylie, and several others (alas, no women scholars). (I also liked the work of social psychologist Stanley Milgram, though we clashed personally.) Above all, there was the eminent psychoanalyst Erik Erikson, and I was fortunate enough to be his tutee as both a junior and senior in college. Though Erikson often told his students “Don’t try to be me,” it’s clear that he was a role model for me throughout college, just as Roger Brown and Jerome Bruner became role models in graduate school.

The other reason had to do with the kind of work that was central to scholarship in this area—and the real reason for this blog!

While the scholars in this field usually had their own specific expertise—ranging from linguistics to psychoanalysis—they moved readily and comfortably across the social scientific terrain. Riesman and Erikson—the individuals who had the greatest influence on me in college and later in life—did not represent a discipline at all. Riesman was trained as a lawyer, not a social scientist; and Erikson had never gone to college! To try to put them into a disciplinary bin was hopeless. Instead, they carried out what I have called “synthesizing social science.”

In this work, they surveyed large bodies of knowledge, did considerable field work, and then put together powerful syntheses. Most famously, David Riesman (and his colleagues Nathan Glazer and Reuel Denney) focused on the new social arrangements emerging in the United States. They contrasted the “tradition-directed” perspective of 18th-century Americans with the “inner-directed” perspective of the frontier-attracted 19th century and the new “other-directed” perspective of suburban Americans (white middle class, we would now underscore) of their own period. For his part, Erikson observed widely across several distinctly different societies, probed the life cycle through psychoanalytic sessions with hundreds of patients, and put forth his theory of eight stages of the life cycle—beginning with the conflict between trust and mistrust of the infant; highlighting the crisis of identity versus role diffusion of adolescence and early adulthood; and culminating in the struggle between integrity and despair as one’s powers wane in old age.

It would take many pages to detail how these authorities went about their work and reached their conclusions—and this is a blog, not a door-stopping book. But it may help to point out that the scholarly efforts of these researchers and writers—and others in the “Soc Rel” tradition—fell between two poles. While their work often came up with “easy to summarize” conclusions, it was not “mere” journalism; these authorities spent years observing and reflecting and took their time in reaching and expressing their conclusions. (They also wrote well!) On the other hand, the work was not quantitative science. While there were certainly “data,” they were drawn from informal observations rather than from large surveys or carefully controlled experiments, complete with tests of statistical significance.

Put differently, these were informed individual and societal portraits more than traditional science: not putting forth claims that could be “tested” in the sense of Karl Popper, but rather sense-making syntheses that sought to capture the world in its complexities. Neither Riesman nor Erikson nor their colleagues would have dreamed of claiming that they had obtained “truths” in the manner of an astronomer or a geneticist.

And there, perhaps, lies the major explanation for the decline and demise of Soc Rel. Within universities, individual departments, and especially their doctoral training programs, are powerful entities. With the passage of time (and the passing of the pioneers), up-and-coming scholars wanted to be known as developmental psychologists, or sociologists of religion, or physical anthropologists—and not as experts in “Soc Rel” or even as synthesizing or qualitative social scientists.

But there were also the factors of age and succession. As I was going to graduate school, the founders of Soc Rel were all retired or about to retire—and only rarely had they nurtured successors of equal scholarly eminence and organizational skill. With first-rate scholars retreating to their disciplinary trenches and budding Soc Rel scholars (like me!) who were less eminent, the pull toward safe and secure traditional departments was powerful.

Indeed, the demise of Soc Rel in the early 1970s could be well analyzed in terms of its constituent disciplines. There was the psychology of ego on the part of ambitious faculty; the sociology of departmental power; and the ethnography of a particular set of characters who had shared a vision but had not built the infrastructure or recruited the next generation of leaders. RIP Soc Rel.

But not entirely. As I and my colleagues pass the age of the founders, some of us still carry the Soc Rel banner. Among my own classmates, Rick Shweder of the University of Chicago clearly spans the range of disciplines; and among colleagues at other schools, Mihaly Csikszentmihalyi, long at Chicago and now at Claremont Graduate School, moves easily among the social sciences and also writes in the synthesizing mode of Riesman, Erikson, and their associates—nearly all male, given the university environment of the period. And in recent memory, there were other scholars who clearly carried the Soc Rel banner—for example, sociologists Robert Bellah and Neil Smelser.

I am bold enough to assert that there will long be a need—and perhaps also a hunger—for the kind of synthesizing social science embodied by the leaders of Soc Rel. To be sure, without institutional support (from universities and philanthropists), it will be more difficult to pull off this approach. But I have sufficient optimism that young scholars with the “Soc Rel” gene will be able to learn from the powerful role models of an earlier generation and to continue to compose impressive works in that tradition. How else will we understand the times in which we live, and the people with whom we live?


Bill Drayton and Howard Gardner in Conversation

April 2, 2018 - 8:52am

In December 2017, Bill Drayton and Howard Gardner—friends since their days as classmates at Harvard College—had a wide-ranging conversation at the Harvard Club in New York City.

This lightly edited version of that conversation highlights Bill’s vision of the changes that are taking place at rapid pace around the world and how they can and should lead to a world of change-makers.

Bill Drayton, Harvard Class of 1965 Yearbook Photo

Howard Gardner, Harvard Class of 1965 Yearbook Photo

Gardner: So, I’m talking to my longtime friend Bill Drayton. He has talked about what it took to get the big story about changemakers out, and there’s a lot more planning and a lot more time than I would have thought. He didn’t just talk to reporters who said, “Yeah, I’d like to talk to Mr. Drayton and find out what you’re up to.”

Instead, he spoke to people in the media and found somebody who was an up-and-coming writer — this is David Bornstein. And there were articles published, and then a major book. But the point that really hit home with me is that it helps to understand what a changemaker is if you’ve become one yourself.

Now that was probably not thought of as part of the plan initially, but do I have it right?

Drayton: Well, David became a social entrepreneur, introducing solutions journalism much later.

Gardner: Mmm.

Drayton: The original insight, I think, is that if you’ve had a big story, it doesn’t fit into the journalistic story-writing time-frame. You have to find people who are big enough to break a big story. And then you can help them do that, but it’s a completely different type of conversation — “big framework change story” — and it’s ripe. So, there’s some wonderful writers that we’ve been talking to for years who genuinely understand “everyone a changemaker”, but they haven’t done anything.

In terms of writing.

They’ve done other things.

So one of them has come and said to me, “I’d like to do a book on this.” And another has, as a trustee of an institution, gotten that institution to move, but he hasn’t been writing. Now we’re at the point where we’re in the tipping mode, and so it’s the right judgment for a publisher, or an editor, or an individual writer to say, “Not only is this where the world is going, but it’s now here.” And we’re at the stage where millions and millions of people are going to look dumb if they…

Gardner: (laughs)

Drayton: Just in the last year, literally the last nine months, we have several publishers who’ve now made this decision. It’s a very interesting measure of where we are in the tipping process.

Gardner: So, can you spell that out a bit? What does it mean for a publisher to get it and to be an embodier?

Drayton: Well, it’s always been the case. A great editor or publisher sees a big story and they make a judgment: “This is important. This is important for the readers I want or have, and it’s time.” If they make those judgments correctly, they will get the writers and therefore the readers, and therefore the advertisers and the elan inside their organization. … So, there’s this wonderful book, I just love it, Bully Pulpit. Do you know of it?

Well, it’s by Doris Kearns Goodwin, on progressivism, and the subtitle is “Theodore Roosevelt, William Howard Taft, and the Golden Age of Journalism.”

I love this book, because it’s so right on this point and, you know, I give this book to people (laugh) when I’m trying … You have the opportunity to do this.

Gardner: McClure’s Magazine?

Drayton: McClure’s. Exactly.

So, Mr. S.S. McClure founds this magazine and he sees the big story. Now, all this stuff is going on. Industrial revolution, and uncontrolled capitalism, and the farmers are mad at the railroads, and other people are upset about the food being unsafe, but no one sees the whole picture. And they don’t understand how it really works.

And so he goes to Paris for Ida Tarbell. As a young woman, she said, “I’m not doing the mom thing. I’m going to be a writer.” She’s struggling, writing on the Left Bank. McClure goes to her and says, “Ida, you grew up in western Pennsylvania. You remember what happened to the independent oil people there? Now if you really do take on investigative reporting about Standard Oil and John D. Rockefeller, et cetera, it may take you two years, but then, month by month, we will publish this.” And his advice to her was, “This is going to be complicated…. build the storyline around the story of John D. Rockefeller.” So, that’s what made Ida Tarbell’s career. And then it was Ray Baker and various others. And so this is a great editor who saw the moment. This is a framework change when people in America could see, “Oh, that’s the problem. And here’s what we can do about it.”

Gardner: So, let me ask you about something that just came up today when I was doing an interview with a leader at the local college here. The person said to me, “You know, we’re trying to mobilize the students, but we’ve been behind. We’ve been behind because we’ve been using email, and of course they’re using Twitter and other social media platforms, and we just don’t have either the knowledge or the personnel to work on that.” So, you and I have been talking about books and magazines, but I’ve also said to you people don’t read anymore, meaning that they don’t read books. So, to what extent are you thinking in terms of the 21st century media rather than books and magazines?

Drayton: I don’t think it matters whether it’s this or that…

Sure, the medium has an impact, but this is a gigantic framework change. That’s just a fact, F-A-C-T.

From the year 1700 to now, the rate of change and degree of connection has been going up exponentially. Mirror curve going down: negative demand for repetition. Those are facts. Here we are. Old system dead. It’s continuing because of inertia, with more and more pieces falling off because they are failing and some other pieces making it into the new structure, and they will survive. We are just at a point where the old system doesn’t work anymore. And you know, if you are a 12-year-old or a 15-year-old, you have to figure out that this is the new game and to play in this game, right now, you’ve got to practice being a changemaker.

And you want to be a part of an “everyone a changemaker” school, youth program, whatever. Your parents have to figure it out. And the school board and the education writers. Everybody. This is a very profound framework change. People get lost in many distractions including social media because they don’t see the strategic change or their way forward. It’s the society, the schools, and youth programs. It’s terrible. So many are deeply disempowered.

Gardner: Yeah, but what you have to realize, what you have to concede, is that most of the use of social media platforms is no better. In fact, it’s frivolous.

Drayton: So, when you have an “everyone a changemaker” school — and we have a lot of them because of the Fellows and because of the Youth Venture initiative — it’s very different. It’s the norm for kids to have an idea and build a team and build something and make it work. And they are all being invited to participate as contributors or clients of the others. And it’s the norm. And people are empowered. And the moment you got your power and you know you can change the world …

Gardner: Yeah.

Drayton: You can express love and respect and action. You know what reaction that brings. You have the skills that the world wants.

Gardner: Tell me if I have the chronology at all right. You’ve been thinking about and developing both the idea and the practice of changemaker for decades. The notion of empathy, which at one point I think you called active empathy or something like that, is a somewhat newer idea in your conspectus. Is that right?

Drayton: I think it’s always the case. Great entrepreneurs have to intuitively know where the world is going to be in 20 years. And they have to be right about that. Because they’re launching a change that’s going to take 20 years to get there. And if they don’t know what the environment is going to be, they’re not going to succeed. So, we were like that. We knew consciously — the curve sort of knew that. And we could see empirically that a wave of social entrepreneurs, which is the cutting edge of this thing, were moving into the social arena.

The “everyone a changemaker” revolution started around 1700 with business. In 1980, it moved into the social arena. You’ve got a wave of social entrepreneurs. And we have a framework change goal introducing the construct of social entrepreneurship, which is deeply empowering for everyone, including millions and millions of people who will never be entrepreneurs.

You can care, you can organize. That’s practical. It’s respectable. In fact, people will respect you for doing this. That is empowering for people, so that’s part of the “everyone a changemaker” revolution.

We did not consciously articulate our “everyone a changemaker” insight until about 12 years ago, but we knew it intuitively.

Gardner: Yeah. And I think, even if you didn’t put it into words, for people who knew what you were about, this was not a mystery to them.

Drayton: But it makes a big difference when you say it out loud. Everyone is a Change-maker. So, it’s not just a small number of people or some people, but everybody.

Because everyone has to be; otherwise, you’re marginalized. The “new inequality” is: Are you a changemaker or are you out of the game? All the old inequalities are still there, but if you were in a winning group in the past and you missed this turning point, you are now on the wrong side of the new big divide. And if you’re part of a group that was doing badly before and you got on this, you’re part of the new winning arrangement.

Gardner: Or at least you’re in the game.

Drayton: There is such overwhelming demand for people who have changemaker skills, especially at anything resembling a high level. Sure, the old prejudices are there, but this is a turning point that’s so powerful …. just look at the difference. How long did it take for the new wealth, not based on land, to be very powerful and very prestigious? And it took over …

Gardner: But not very empathically, which is why I want to ask: When did empathy become a major part of the narrative?

Drayton: Well, first of all, it’s always been there. So, one of our four criteria for the election of staff or Ashoka Fellows has always been ethical fiber.

Gardner: Yeah, and that’s true. I know that from you.

Drayton: And we know you can’t be a good entrepreneur if you don’t have that (empathic trait).

Gardner: Well, the former head of Uber does not … the reason I’m pushing this is it seems to me, you and I have written and talked about this before that changemakers can cause a lot of trouble and that’s where I think …

Drayton: Yes …

Gardner: The move toward empathy is so important.

Drayton: No, you’re exactly right. So, I’m sorry. I know I’m jumping into the future, so let me just finish up on that last strand for a moment and then I’ll focus on empathy.

So, in the human lifecycle, you’ve got to master cognitive empathy really early. Because you need that to be able to master the other three essential changemaking skills: sophisticated teamwork, a new kind of leadership (the opposite of the old), and changemaking. And if you don’t have the cognitive empathy skills that allow you to serve the good of all, you will be rejected, marginalized. This is the first generation where you can’t be a good person by diligently following the rules. They increasingly aren’t there as change accelerates. You will hurt people. You will disrupt groups. So, cognitive empathy is completely fundamental.

Now, we are in the transition zone. The old system was rules, punishment, fear. Well, that doesn’t work very well. So, of course, you have people who are taking advantage of the transition. And, once you’re in the new arrangements where every institution absolutely has to have people who have mastered this set of skills, starting with cognitive empathy, it allows you to be a trustworthy, good person committed to the good of all. Every organization needs you to use these skills for its purpose. You can’t have everyone be powerful and not have this. So, you move into a coherent system where everyone wants this because it makes you healthier, happier, and longer-lived. Everyone around you wants you to have it, and they help you, as does every institution. The transition is the messy part. People are no longer living in tiny communities where you don’t need this empathy-based ethics.

Gardner: Or you have it within the community — what I call neighboring morality — but it doesn’t extend beyond the community, what I call ‘the ethics of roles’.

Drayton: Why was psychology invented as a field in the late 19th and early 20th centuries? Because we really needed it.

And all those little communities, tiny little communities of 500 or 1,000 people, most people don’t even need … Not only the village, but they’re part of it. I mean, it’s astonishing. Even today, there are vast numbers of people who live on one street in a village and that’s it.

Gardner: Interesting. You may not know this, but Freud initially studied hysteria. And his cases were about hysterics. It turned out that they were overwhelmingly women,… they tended to be women who had grown up in the (small country villages), so to speak, and then moved to the city, and they couldn’t deal with it. So in a sense, I mean, psychology was probably invented for many reasons in many places, but it’s interesting that it was that dislocation, what the Germans called the move from Gemeinschaft to Gesellschaft, from community to society, that was very disruptive for generations.

Drayton: Community and a system of living by rules, you do what your father did and your mother did, et cetera, et cetera. By the late 19th, early 20th century, the rate of change in major parts of the world had reached the point that people desperately needed these sets of tools, so the field of psychology is invented, and it’s popularized almost immediately. I do not think that was an accident.

Gardner: And actually, even though I’m not particularly a fan of how it’s done, the big development in psychology in the last 20 years has been positive psychology. I don’t know if you’ve heard that phrase or not, and it does try to focus on happiness and empathy and things like that. But to me it’s too sloganish, faddish, and a little bit creepy. But you know, it’s certainly important.

Drayton: What gives you stability in managing and understanding this is if you start with “This is the way the world is organized and will be organized” and work from that to what’s needed. So, 150 years ago, we needed everyone to be literate in written language. Everyone. Completely radical idea. It just was needed. Now, we face a similar reality-based need — that all humans learn how to contribute value in an everything-changing world as changemakers.

All this is building, in my view, to humanity becoming one big organism. We’re developing very rapidly to operate like a brain. For any particular goal or need, a very large part — 20% or so — of the brain lights up and, you know, memories, and the inner ear and the left toe, they are all ready for you.

The same thing is true with humans as a species now. And this is now going into the future…. the entrepreneurs, the big nature framework change, pattern change entrepreneurs deeply from within committed to the good of all — that’s what a social entrepreneur is. It’s not defined by subject matter. That group of entrepreneurs is (parenthetically) the only group that takes everything into account because that’s what their value system is. They are not in it for the shareholders or this or that ideological point of view. And they are not the lazy person who leaves pieces out. This is the group of entrepreneurs that you want to be as powerful as possible. And that’s what our “Collaborative Entrepreneurship Jujitsu” (CEJ) process is about. And, we are now restructuring the movement so that, right down to the budgets, we build around the CEJs — not geography, not function, not subject matter.

So, a new issue comes out. All around the world, we don’t control this. Great entrepreneurs come up. They see it. They have a solution. And we are really good at spotting them, bringing them together, and seeing the pattern. And this is virtually the only focus group with any value about the future, because this is a focus group of entrepreneurs who can’t succeed in their life if they make bad judgments about what the world is going to be like in 20 years. And so, overwhelmingly, 90-95% of the 1,000 (Ashoka) Fellows who are focused on kids put kids in charge.

It’s the same ratio when you look at health. It’s a smaller number of 600 or 700 Fellows. Overwhelmingly, they put patients, family, friends, neighbors, and peers in charge. Both of those patterns fit what an “everyone a changemaker” world needs and the way it’ll work. So, we have two independent ways of saying, “Okay, check.” This is where we have to end up. And both of them are in the frame of what’s good for the whole of life, not for this piece or that piece. So, then, the methodology we’ve developed in the last three years, which is totally thrilling, is working. We’re far enough in now that I can tell you it’s working. It’s a four-stage process of how the team of leading social entrepreneurs builds teams of others — I’ll explain that — and opens that up to absolutely every single human being so that you get into the mass tipping stage three.

Stage One is recruiting Big Game capacity co-leaders, chiefly Fellows. Winning top Big Game organizations with huge power is the first part of Stage Two. The other half is making the ideas, services, and links open to all, e.g., the 14-year-old girl in a district town in central India. Stage Three is the soap opera with daily episodes. That’s what we must provide once you have tens of millions of people who have to find a safe place to get evocative stories so they can understand and participate in these changes and not get left behind. Then, there’s this huge demand for publishers. And that’s where we are — we’re very close to that.

And so, when we go to publishers, we’re saying everything I’ve said to you, and this is your big strategic opportunity, a really big one. So, once you see it’s in everyone, meaning literally everyone (a powerful person, the giver, the changemaker world), then you’ve got to have a different definition of growing up, which starts with cognitive empathy, which you have to have to be a good person. Without that, you don’t have the skill to be in life for the good of all. Which is why that particular definition of empathy is so completely critical.

Then, the younger, the better, but …

Gardner: No, just a definitional question. Jujitsu, how you’re using it. Pressure, counter-pressure, any challenges?

Drayton: So, this image is a slight simplification of jujitsu. But this great big gorilla of a person is charging at you and you’re a little person.

Gardner: (laugh)

Drayton: And you hold your pinkie out and you do at just the right point and the gorilla goes head over heels and crashes. So, it’s knowing exactly the move that will use the energy of the forces that are there, of the existing world, to tip it.

Gardner: So, I have three things that I want to ask you about, and since you know me, you’re not going to be surprised by any of them.

The first is Trump and Trumpism. The other two are bigger issues. One is artificial intelligence. And the other one is brain and genetic manipulation. Because one thing that hasn’t changed in millennia is what human beings are, but those two things could. And so, I’m wondering there to what extent do you think about them. So, you can take those in any order.

Drayton: Well, Trump is very easy, because that’s happening all over the world.

Gardner: Yep.

Drayton: I mean, he’s just a particularly egregious person.

You know, the prime minister of India is an RSS monk. And that’s pretty extreme. You see this pattern all over the world. We have increasing we-versus-them politics. Pretty much everywhere in the world, income distributions are getting worse, regardless of the nature of the economy or the ideology of the country. What’s happening is the new inequality. Many people (including almost everyone you and I know personally), even though they don’t say it out loud, are in the new economy. They’re doing very well. Their incomes are rising.

I mean, the support is from both people who have less and people who have more — it’s an unholy alliance. Just like the Southern Democrats and the northern Liberals in the 30s.

Gardner: I don’t understand. What’s the alliance?

Drayton: I mean, Trump is supported both by people who have less and less and by people who are very, very rich and don’t want to give up anything. And if you look at the tax bill (passed in December 2017, after this discussion), that’s the evidence for it. It’s an unholy alliance. It’s the opposite of jujitsu. And you’re right, I mean, Scandinavia is less troubled than other parts of the world, but even there you see it. Yeah. But, your point here would be that when it looks like big change is going to happen, people retrench in various ways.

I don’t think this is necessary. Our job is to stop it.

Everyone is a changemaker. I mean that very literally. It has to be. There’s no guarantee that it comes out right. What’s happening now is that the people who are not in the game and don’t have the skills and are not developing the skills — every year the level of skills required is going up. It’s not going from A to B. We’re on an exponential curve. So, you know, they’re falling further and further behind.

Gardner: Yeah. But that leads right into artificial intelligence, which is what I worry about. Because more and more things that people used to do are now done more efficiently, more effectively by machines. Much of what we have traditionally considered to be work — something you know a great deal about, ordinary labor and white collar work — is transformed, and there may not be a replacement for it.

Drayton: Well, I mean, I don’t know about you, but I don’t want to be a lawyer. I don’t want to be a truck driver. I think good riddance.

Gardner: But, radiologists are also being replaced. 

Drayton: Who wants to be a radiologist?

Gardner: (laugh) Probably a lot of people in this club where we are meeting.

Drayton: (laugh)

Gardner: But, you know, people have to have a livelihood, right?

Drayton: Yes, actually we can figure this out. So, here are things that we know we’re going to need more and more of and are very satisfying for both sides of the transaction. All of us need help growing up and not just for 20 years, but we’re going to have to keep growing up as the world changes faster and faster and …

Gardner: That I certainly agree with.

Drayton: And well, you know, you help me, I help you.

Gardner: (laugh)

Drayton: Can we figure out how to monetize that?

Gardner: (laugh)

Drayton: Well, we did that, you know, with potatoes and beets. Why can’t we do that with different ways of helping one another?

Gardner: Well, especially if we live longer and longer, it’s a new kind of problem.

Drayton: Well, and …

Gardner: I mean, as you know Social Security was based on the assumption that not many would reach our age.

Drayton: Yes. And so, of course it’ll be different, but you know, we’ve gone from 97% of the world’s people living in small, isolated agricultural villages to 2% of the U.S. population doing agriculture. And we’re not poor. We haven’t collapsed because of that.

Gardner: Yeah, but it is true that the new entrepreneurs in the technological world are able to do what they do with very few people. That’s the big difference from steel and coal and so on.

Drayton: And, you know, if you look at what’s going on inside Google, they’ve got some areas that are so repetitive. And it’s hard for Google to have the fluid, open team of teams architecture for those who work in these areas. That’s a problem for them. But the rest of it is a fluid open team of teams. That’s one of the reasons they are a successful company. And, you know, if we’ve gone from 97% agricultural to 15%, are we really going to die if it goes to 2%? No. I think it would be marvelous if all that energy, people had the ability to work together as a species-wide brain, providing consciousness to the universe. Which means we’ve got to help one another. And why do we have this artificial boundary between people and the rest of a) life and b) creation?

It doesn’t make sense. I believe in the South Asian point of view versus the Abrahamic point of view, which is very homocentric. But anyway, you can leave that aside if you want …

Gardner: I think it’s an interesting and important idea. I guess if you look at the history of humankind in a generous spirit, our circles have gotten larger.

Drayton: This is what sets humans apart. I just read this article about how the first multicellular creature came about. Apparently, the main hypothesis is that one cell ate another. And the other said, “Oh, this is very comfy in here.” And then you had two cells, and you start creating organelles, and now it’s billions of cells that make us up. And it’s the story of cooperation being what wins, which is very different from the popular image of Darwin. But the evolutionary success story is cooperation.

Now, think about the frontiers, the projects that we humans can give ourselves to figure out how to cooperate better, more. And, as I say, why limit it? What if we find life elsewhere? Well, we will find life elsewhere. That’s obvious.

Gardner: Yeah, I guess it depends on what you mean by ‘life’. Most people are interested in whether there are entities that we could have some communication with. But, of course, that’s a much higher bar than, uh …

Drayton: Well, if you’re a South Asian …you think, you are one with an amoeba or a rock.

Gardner: Mmm.

Drayton: The South Asian point of view is that the universe is one. And nirvana is your getting back to that point. You have universal, complete 100% empathy, unity with the universe as a whole. Not just humans. I don’t know if it’s right or not …

But why not? It’s certainly an attractive idea. And you know, if you observe even relatively simple forms of life, it’s hard not to empathize. Children certainly empathize. And Howard, I saw my first mountain lion in life ever this September.

I was coming up this mountainside and it was reasonably steep. It was above the tree line. I don’t know what it was doing up there. It’s not supposed to be there, but there it was. So, I looked up and there was this mountain lion 40 or 50 feet ahead just walking across. I’m going up. It’s going this way. And it was just this amazingly beautiful powerful being. You could see the muscles, that long cat-like stride. And it was beautiful. Now, how could you not empathize?

If we don’t have the imagination to spot value creating opportunity, something is wrong with this picture.

Gardner: Yeah, but what I’m just thinking is that these wonderful epistemologies and eschatologies unfortunately coexist with, you know, what’s happening in Myanmar and, as you said, in India and so on. And, you and I are both betting on the better angels. I am just much less optimistic than you are. And that’s in part because, you know, you’ve made moves, which have been reinforced in the sense that they work better than many people would have expected. And then, you keep raising the gauntlet or the envelope or the jujitsu movement.

In contrast, my training is basically to be critical and skeptical and to think about why things might not work, even though I love what you’re doing and I want it to work.

We would be kind of a yin and yang if we were in the same office every day.

My mentor Jerome Bruner, whom you know of, and George Miller started the Center for Cognitive Studies at Harvard. Bruner was a great optimist. He said, “Every day, I would plan for optimism and Miller would plan for pessimism.” (laugh) And much, much nicer to be on the optimistic side.

Drayton: But it’s not just about optimism. It’s fact.

Gardner: Well, it’s fact that things are changing quickly and that most of us are not able to instantly get with the program. It’s not fact what will happen. That’s the Trump phenomenon.

Drayton: But, Howard, we have gone from no life to one cell to two cells to us being here. Cooperation wins.

That’s the story. That’s a fact. And if you look at all three and a half billion years or whatever it is, it’s an exponential curve. And we are privileged to be at the point where we leave the world of rules …

Rules will still be there to help us, but they are really increasingly minor actors. It’s not rules, punishment, fear. It’s people from their deepest beings freed up to be good people, to express love and respect. To have this way of living, with everything around them reinforcing that and helping that. We don’t have that now because we are in this wretched transition. And so, of course, we have craziness at the moment.

We are hurting so many people now.

Gardner: Norman Ornstein, the political commentator, said something quite shrewd about Trump. He said, “You know, we are a nation with laws. But we’re also a nation of norms.” And, you know, the number of norms about how one treats people, whether it’s publicly or in your own circle, that have been basically exploded by one man and his circle is remarkable. So, the way I think about this in my work: I’m very familiar with what it means to be in a profession – and transformative and disruptive things are happening to all manner of professions, from professors to radiologists to journalists.

But the way I put it is, maybe the professions will disappear, but I wouldn’t want to be living in a world where it wouldn’t make sense to say, “He or she is acting like a professional, is very professional. And he or she isn’t.” Whether it’s the people who serve here in the club, or the people who are served, you can evaluate each person on whether he or she acts professionally.

Drayton: So, in a world defined by change, not repetition, it’s totally psychologically stable, because people are changemakers. They have those skills. You know, I would go crazy if you put me on a repetitive job. I would be very unhappy and then I would either leave or I would blow the thing up.

People have, for millennia, been living in a world of repetition. Efficiency in repetition was the game. You learn one skill, you repeat it for life in a world of workplaces with walls. Well, this world is going away, has largely already gone away. And many of these people have no clue.

Cognitive empathy is hard. You’ve got to connect the mirror neurons and the cerebral cortex and God knows what else to make that work. And, you’ve got to practice it a lot and then you’ve got to constantly be building up your map of how the world works, because otherwise you can’t understand the kaleidoscope of constantly changing, morphing contexts and combinations.

Gardner: You know, you were talking about how the world has been a world of rules and the invention of science is very interesting in that regard. 

Because when you were talking about how you would be bored out of your mind, I posit that people go into science in particular with that motivation. Though it may be understanding amoebas …

Drayton: Not wanting…?

Gardner: Not wanting to live in a world of routine. And those of us in science do depend on people who will be more routine to make sure the cages are clean and so on. That “scientific attitude” was not really manifested in much of the world beforehand.

Drayton: It was one period and it was …

Gardner: The Greeks.

Drayton: And North India at the same time.

Gardner: The so-called axial age.

Drayton: And what happened there was that you had a series of towns that experienced a town meeting-like, civic community. It was so far superior to kings and tyrants. And then, there was a coalition of cities like that in both places. So, you had a critical mass, and for 150 years you had the first outbreak of the “everyone a changemaker” world, leading to the invention of history, geometry, and so much more.

Gardner: Or at least the people who counted, which would have been male voters …

Drayton: But, if you listen to some of the Delian League tales of these times, the women were not equal, but they participated in the culture too. 

Then Philip of Macedon modernizes the phalanx. And in his generation, that’s the end of the city-states in Greece. And his son Alexander went on to demolish everything else.

And you know, the phalanx is a form of organization. The team is the next form. In a team, everyone is responsible for helping everyone else individually and collectively build the skills and teamwork needed for overall success. That means also helping build and constantly rebuild the synaptic architecture of the team.

Gardner: Let me switch gears radically because something happened earlier this week. And, I was surprised because I made a connection, which I hadn’t made before. At Project Zero, which is our organization, we had a visit from a major educational policy maker.

We were talking about international tests where the performance of all countries is compared. You know, Finland is first, Singapore is second, the US is 45th, etc. I asked him, “How about if individual states or individual cities could be included in the rankings and not just ‘official’ nations?” And he said, “That would be fine with us,” which I was pleasantly surprised to hear him say.

So, how do you deal with the nation thing? I mean, we’re talking about the Delian League and you know, it’s been a motif in various ways for as long as we have recorded history.

Drayton: Well, nation states are actually pretty recent.

Gardner: Well, nation states date from the 17th century, but I mean the notion that you are a Roman or that you are a Persian goes back thousands of years.

I’m saying, here we are, you know, in a world which is currently defined by nations. If you were the czar so to speak, what scenario do you see?

Drayton: We are a part of building that alternative. So, Jean Monnet set this in motion for Europe. Europe’s the most balkanized, the most tribal of all continents bar none. And it’s caused huge trouble for the world. So, Monnet went after that by articulating the positive goal of a united Europe and then working from a citizen group, finding political opportunities that were “wins” for all the politicians to build European institutions. And then you’ve got a positive dynamic: the idea builds institutions, and the institutions strengthen the idea and make it more credible.

I think it’s a prototype. That’s why I think he is the second greatest person in the last century.

Gardner: I guess, part of what I’m saying is it was easy to underestimate the reactionary powers.

Drayton: But, Howard, we’re just now getting to the turning point.

Gardner: I mean, people have seen this coming for a long time.

Drayton: The Renaissance was an effort to get back to the city states organized the way the Greek polises were. And you know, Charles of France and his medieval army made that a crazy idea. And England develops the first integrated/de-centralized society bigger than a town meeting. And that provides a critical piece. So, that wasn’t that long ago. And you know, this has been evolving very rapidly. Those curves, since 1700, are mathematically exponential: the rate of change and the degree of interconnection are going up, and the demand for repetition is going down. Those are just happening. So, you know, of course, the transition is a mess. That’s what I keep saying.

Gardner: You have to hope you’re right.

Drayton: So, our job, for all of us, is to build the reality. And the form, as usual, will follow the reality.

Gardner: What about changemaker schools?

Drayton: Most principals and teachers do not define themselves as big. Their job is not to change the world. They wouldn’t choose those jobs if that was how they defined themselves. And they don’t have practice in the big game. And so, this is not where we’re going to find the leadership for the change that’s going on now that we need. It’s first the Fellows.

Somewhere between 1,000 and 1,300 Ashoka Fellows are focused primarily on kids. And even Fellows that are not primarily focused on kids deal with kids. You can’t deal with the “new inequality” if you don’t deal with kids. Almost all of the Fellows put kids in charge. And when we help a community see what this means, it changes their lives in really profound ways.

Gardner: So, the kids are educating the adults in a certain way?

Drayton: Well, if you look at the “Your Kids” tool, that’s exactly what happens. So, roughly 10 days ago, we were in Phoenix with the top managers of Boehringer Ingelheim US. And we did “Your Kids” with the managers. They overwhelmingly want it. They want to be trained in how to do it. We’ve done it with machinists. We are working with the service workers, vis-a-vis the janitors and lunch ladies in the schools. This is their way to get dignity as well as make sure their kids succeed.

In that methodology, which we’ve discussed before, the host organization person says “It’s a world of change. We either are there or we must get there, but this is about your kids. They’re going to have to live in this world of change.” Ashoka then says the same thing in a different way. And then, a young person who has her power stands up and tells her story. And that’s it. People in the room, they’ve been told twice, world of change. This is really important for something that most people care about more than anything else in their life. And then they see this young woman who has her power. She is going to be a happy, healthy, long-lived person. The world wants her. She knows it. She has it. That’s the turning point. They have just seen and felt what their success as a parent or grandparent requires.

Ashoka then asks three questions: “Is your daughter practicing changemaking like Daniella?” “Does your daughter have Daniella’s power?” “Does America or Brazil or whatever have a future if all our kids don’t have this?” Then we jump into what you must and easily can do to help your daughter get her power. The key to the success of Your Kids is the young person. That’s it, Howard — I just get goosebumps every time I see a young person who has that power. It’s just, you know, all animals have evolved to feel deep satisfaction when their young can fly. This is life success…

We’ve got four major new thrusts, all mutually reinforcing. We are going to start moving them out, this year. First, LeadYoung, stories of young changemakers for school intranets and more. Second, Ashoka Young Changemakers. These kids have to apply to be co-leaders in the Everyone a Changemaker movement, because we need them to do that. And they have to make that decision.

Third, Peer-to-Peer Allies. We never have “trainings”. At least if I have my way. Anytime the word “training” comes up, I try to cut its head off! Then, fourth, Your Kids. You can see how all four reinforce one another, feed one another. And the demand is overwhelming. It’s absolutely overwhelming. We’re way behind. We underestimated the market.

Well, we’ve been saying to universities, through Ashoka U, that probably one of the most strategic things you need to do is reach out to the high schools and middle schools that feed you and help them see the new pedagogic reality …

Gardner: Absolutely.

Drayton: And you are championing the young people in those schools who are being powerful now. You. NYU or Arizona State or wherever. And, then, of course, you’re going to be able to recruit the people who are going to be the changemakers, the really strong ones.

And they will define your campus culture, and they will be really successful alumni in the new game. This is the smartest thing you can do.

New York City does really well when it has good immigrants.

It’s a city designed for that. So, why doesn’t the city go out, you know, like a football team, and recruit changemaker immigrants?

Gardner: So, you were talking about John Lindsay, the Mayor of New York City, and immigrants. This must have been in the late 60s, early 70s.

Drayton: So, my prescription is that the city should go out and go to places like Gujarat and Maharashtra and recruit. And you get people who have that energy. The city is made for them. This city is made for immigrants to start things …

Gardner: But, I did have one other idea. You were concerned, and I think even upset, by something that I told you or wrote you a year or two ago. But there’s a way of spinning it so that it fits very much into what we are saying now. The one finding from our study, which we already discerned years ago, was that mental health is the biggest problem everywhere — from the elite schools to the most unselective.

Maybe this is a symptom of the fact that the kids realize that the world is changing in a way that they are not going to be able to deal with. And that’s all they know because they haven’t been at your feet.

But it’s not to say that if they end up in a changemaker environment at ASU or NYU, they’ll suddenly change. But it could be a symptom.

Drayton: It’s hard, I think, for any of us to even begin to imagine what it’s like if you know the world doesn’t need you, doesn’t want you. I mean, that’s terrible. And then, you’re in this wretched school and everyone tells you “you can’t”. And you’re being narrowed and narrowed, which is the exact opposite … I mean, it’s a terrible thing that’s happening.

Now, I didn’t have that sort of a childhood.

Gardner: None of us did.

Drayton: Right.

The fact is that this world is here now and you don’t have the opportunity to just follow a set of skills and a set of rules. That option is gone. And if it isn’t gone now, it will be in five years. This is the new inequality.

Gardner: Bill, I always thought of you as being a species unto yourself, trying to find conspecifics. And there were conspecifics. They could have been anywhere. But you were a mutation, an aberration.

And now, we’ve reached the point, for better or worse, where we need the aberration, the mutation, to become part of the natural DNA. And this is why starting very early is essential, because to undo the damage that is …

Drayton: Absolutely. Yes.


Two Departures from the Professoriate: A World Apart

March 27, 2018 - 7:29am

Matt Welsh was a highly talented assistant professor of computer science at Harvard University. Like some but by no means all junior professors, he was approved to become a full-time, tenured professor—indeed, he was the occupant of a named chair, the “Gordon McKay Professor of Computer Science.”

Just months after receiving tenure, Welsh resigned his professorship to become a fellow at Google. This decision caused quite a stir. Because so many were surprised, he decided to explain why.

In an article with the straightforward title “Why I’m Leaving Harvard,” Welsh begins by saying that he did not have any major problems with his work at Harvard. He liked his colleagues, said that the students were the best that he could ever hope for, and underscored that he had plenty of support for the research that he wanted to do.

But he went on to say, “There is one simple reason that I’m leaving academia: I simply love the work I’m doing at Google. I get to hack all day working on problems that are orders of magnitude larger and more interesting than I can work on at any university… [W]orking at Google is realizing the dream I’ve had of building big systems… I can just sit down and write the code and deploy the system, on more machines than I will ever have access to at a university.”

Let’s take Welsh at his word—and assume that he did not leave Harvard simply (or primarily) to triple his salary or get a mortgage-free house or guaranteed scholarships for members of his family. In our country, few would say that Welsh should be deprived of the opportunity to fulfill his life’s dream.

Yet as a fellow professor and as one who believes in educational institutions, I am disappointed—in him and/or in the system. 

From the time of graduate school if not before, Welsh was supported in his pursuit of the doctorate and of post-doctoral work—on the tacit assumption that, if he had the opportunity, he would join the professoriate. Citizens (via their taxes) as well as private funding agencies put their faith in him. And now, he is working for private industry—admittedly having lots of fun, perhaps doing some good, but the more cynical would say that he has “gone over to the dark side.”

Consider an entirely different case—that of Erin Bartram. Trained as an historian of 19th century America, Bartram was an assistant professor at a much less prestigious school—the University of Hartford. After years of searching unsuccessfully for a tenure track job, she decided to leave academe. She would have done so silently and without any public knowledge had she not decided to write an essay titled “The Sublimated Grief of the Left Behind” (an example of a genre apparently dubbed “quit lit”). In her soul-searching, evocative piece, she notes, “We don’t want to face how much knowledge [a] colleague has in their head that’s just going to be lost to those who remain, and even worse, we don’t want to face how much knowledge that colleague has in their head that’s going to be utterly useless for the rest of their lives.” To her surprise, the essay went viral, and as Bartram comments, subsequently and ruefully, had that not happened, “I would have been nobody.”

Of course, and alas, Bartram’s story is far more common than Welsh’s. Every year, many hundreds of young scholars—primarily in the humanities and the “softer” social sciences—receive their doctorates and try, unsuccessfully, to find a full-time tenure-track position. Some find a post doc position for a year or two; some fill in for a professor who is on sabbatical; some moonlight on several campuses (so-called “taxi cab professors”); some end up teaching in high schools or for-profit institutions; and some, and one could even call them “lucky,” end up teaching at a second or third tier school, or a community college, with a teaching (and perhaps also an advising) load so heavy that there is essentially no chance that they can carry out the scholarship that they were trained to do—and that they presumably want to do.

And many quit the academy altogether as Bartram has apparently done—sometimes trying to be “independent scholars,” more often seeking and accepting positions that would have been more appropriate for those who have not spent years fulfilling the requirements for a doctoral degree.

Darting back to the words of Welsh, these less fortunate young scholars would not be soothed by his concession: “I also admire the professors who flourish in an academic setting, writing books, giving talks, mentoring students, sitting on government advisory boards, all that. I never found most of these things very satisfying, and all of that extra work only takes away from time spent building systems, which is what I really want to be doing.”

I hope that readers of this blog join me in asking, “What’s wrong with this picture?”, or, more properly, “What’s wrong with these pictures?” It’s lamentable that Welsh does not appreciate many facets of the traditional professorship (which presumably he should have known about by the second year of doctoral training); it’s tragic that Bartram is one of thousands of trained scholars who never get the opportunity to teach students who want to learn and to add their own bricks—small and not so small—to the edifice of knowledge in their chosen field of study.

At least in the United States, from pre-school to graduate school, education is no longer a public good; it’s become a private good. The lucky few get to do just what they want to do—even if they never see another student or teach another class. A large majority of those with doctorates would give anything to take the place of the “leaver,” but never gain the chance.

Other than remaining with the unsatisfactory status quo, could this situation be handled differently?

One solution, with a long history in Europe, is to have two separate tracks—in practice, or at least in theory. For a fortunate few, there is a research track, wherein you join an institute, get to carry out the research that you want to, and never need to teach any students unless you so desire. For the vast majority, you either begin by teaching secondary students far from the metropolis, or you consign yourself to teaching huge lectures in the big universities, without having any contact with students, many of whom never show up in class and most of whom will never graduate. You are solely a teacher—not a teacher-researcher-scholar.

Another solution would be for the colleges and universities to cease graduate training altogether and let wealthy private companies set up their own schools. As I understand it, Google hires hundreds of post-docs, most trained at prestigious universities. Why not have Google, or Amazon, or Microsoft, or “the next big tech company” train the next generation of in-house researchers? We’d have to decide (as a society) whether to award doctorates, doctorates with an asterisk (PhD, with a Twitter bird or Amazon shopping cart next to it), or some newfangled degree.

Yet another possibility. Perhaps students who elect to pursue a PhD would understand from the beginning that there is a respectable alternative lifestyle, called the independent scholar. After all, this option was in effect the tradition in much of Europe over the centuries. And of course, if you or your family are wealthy enough, this remains a viable pathway today. Not for Bartram, alas; she has to worry about how to pay next month’s rent.

But there is a better way, though I have to admit it is no longer the American way: an agreed-upon bargain between our higher degree awarding institutions and our talented students who want to be teachers and scholars. If our institutions train you to be a skilled scholar and teacher, you commit to giving back—to staying within the professoriate, barring unusual circumstances. (Presumably Welsh could even come to like, to cherish his Harvard undergraduates.) Conversely, if we take you on as a student and you complete the requirements successfully, we commit to providing a job which makes use of your talents. If that means radically reducing the number of doctorates in history or in Romance languages, better to do so “up front” than to hold out the false hope of a job—and in the process, ensuring many repetitions of Erin Bartram’s sad saga.

Categories: Blog

“Why Are You Doing That Research?”

March 12, 2018 - 8:01am

I am often asked about how I became interested in a certain line of research. Recently, people have asked me why, at the age of 70, I embarked on a very large empirical study of higher education in the United States.

One answer: I’ve always been interested in education. As a young child, I thought that one day I would teach classes to children of every age. I’ve stayed in school all my life. But for the first half of my scholarly career, I carried out studies in developmental psychology and neuropsychology, without a particular focus on education.

That situation changed for two reasons. First of all, my theory of multiple intelligences, never of particular interest to psychologists, proved of great interest to educators. (One inevitably notices what others notice.) Second, I took a teaching job at the Harvard Graduate School of Education. My students were largely involved in education but not in higher education.

As a result, if you look at my published writings in education from 1985 to 2015, they have focused almost entirely on K-12 education.

But while my research veered toward pre-collegiate education, my colleagues and I were also undertaking a major study of professions in American life. That study, known initially as The Good Work Project, took place over 10 years. It yielded a variety of publications and toolkits, now collated under the label of The Good Project.

In that research, my colleagues—Wendy Fischman and Lynn Barendsen—and I were disturbed by a particular finding. Young people whom we interviewed wanted to do “good work” but felt they could not afford to do it at that point in their lives. They wanted to be successful and well-off, and so they, like their peers, had to be willing to cut corners—or so they thought. They told us that one day, once they had achieved their material goals, then they would do good work and model such work for other people.

We wondered whether we could do something with college students to orient them to the importance of carrying out good work from the start—work that is technically excellent, personally engaging, and—most important—carried out in an ethical way.

Accordingly, we looked for opportunities where we could work with college students on these issues. The first opportunity arose at Colby College in Waterville, Maine. There, a colleague, Sandy Maisel, invited us to teach a course in ethics, along with the nearby Institute for Global Ethics. With a set of engaged students, we reviewed many of the ethical dilemmas that arise in work life. The course worked out well.

The following year, I had the opportunity to carry out a similar, briefer course at Amherst College with the then-president, Tony Marx.

While carrying out these experimental courses, we had also been speaking to educators at Harvard College—much closer to home!

In talks with the Freshman Dean’s Office, particularly with Dean Tom Dingman and his associate Katie Steele, we began to design a course for Harvard freshmen which we ultimately called “Reflecting on Your Life.” That course, now in its 11th year, continues to this day.

Especially important was the collegiality of Dick Light, a long-time friend and colleague and, unlike me, a genuine expert in higher education. Dick was a full collaborator in the design and execution of “Reflecting on Your Life.”

Over the years, Dick and I spoke frequently and at length about what it takes to make the most of the college experience—and to do so in a way that makes sense to you but also to others in your present and future worlds. That, in a nutshell, is the core idea of “Reflecting on Your Life.” As a result of these conversations, Dick and I wondered whether it might be opportune to carry out an empirical study of higher education in the United States today.

We were struck by two phenomena. On the one hand, nearly every month, new books appear—often rather depressing in tone—about the state of higher education in America. On the other hand, most of these books have far more “attitude” and considerably more recommendations than they have data carefully gathered and thoroughly analyzed.

Over the course of many months and many conversations, there arose a project, which we initially called “Liberal Arts and Sciences in the 21st Century.” Now in its fifth year, the project has carried out close to 2,000 interviews on 10 deliberately disparate campuses. We speak to all the major stakeholders on each campus. In our interviews, we ask participants, particularly students, about their perspectives on excellence, ethics, and engagement as these virtues relate to the college experience, in both academic and campus life.

Soon we will turn our attention to making sense of these in-depth interviews. In the meantime, Dick has been promoted to Senior Research Advisor, and Wendy Fischman, who worked with me initially on The Good Work Project, is now Senior Project Manager. Over the next months and years, we will report many of our impressions and findings on this blog, Life-Long Learning.

Categories: Blog

Early Stops on My Quest for Mind

February 26, 2018 - 2:01pm

As an adolescent, I became interested in the field of psychology. My interests in psychological topics resembled those of most American adolescents—curiosity about my own personality, emotions, family relations, ambitions, and anxieties. And so it is not surprising that when I had the chance (as an undergraduate), I seized the opportunity to study with Erik Erikson, a famous psychoanalyst. As a corollary, my reading and writing in psychology were primarily in the psychoanalytic and dynamic psychological traditions. Indeed, I initially applied to graduate school to study clinical psychology, though in the end I chose not to pursue that profession.

And that is because, right after graduating from college, thanks to a chance summer job with a noted scholar named Jerome Bruner, I encountered a quite different area of psychology. Cognitive psychology was on the ascent—more respectable within the discipline of “psychological science,” and (most importantly) of greater interest to me. The cognitive area also explores the mind (that’s what psychology does!), but it zeroes in on the mind as it thinks, reasons, creates, solves problems, finds problems, and, more broadly, uses language and other symbol systems. We might say that cognitive psychology is situated in the senior common room of the academy, while dynamic psychology inhabits the room with the couch. Before long, I had become a convert to the branch of psychology represented by Professor Bruner and his colleagues.

My first book for the general reader was The Quest for Mind: Piaget, Lévi-Strauss and the Structuralist Movement. In this book, published in 1973, I described the major methodologies and findings of two thinkers who could readily be called cognitivists: Jean Piaget, a biologist-turned-psychologist, the master of cognitive development in children and adolescents; and Claude Lévi-Strauss, an anthropologist turned systematic thinker, the master—in his memorable phrase—of “the savage mind.”

For a first book, The Quest for Mind was widely reviewed and well received, and it gave me a modest reputation as a reliable synthesizer of the works of others. In the book, I was doing what one is trained to do in high school and the university—understanding and summarizing clearly the ideas of others, with only a bit of personal commentary. My scholarly contribution, if any, was to find intriguing similarities and differences between the ideas and findings of Piaget and Lévi-Strauss. (Only later did I stick my own scholarly neck further out and write about my own developing ideas—most conspicuously, the theory of multiple intelligences.)

In retrospect, I have become curious about why I chose to write about these two figures—Piaget and Lévi-Strauss—and why and how I chose to link them. I don’t know precisely what I thought in the early 1970s, but I attempt here to resurrect the intellectual milieu of the time.

Following my introduction to cognitive studies courtesy of Professor Bruner, I spent a year in England. I read widely. Two of the scholars who most impressed me were Piaget and Lévi-Strauss. Personal encounters also mattered. In 1965, Lévi-Strauss was invited to London to give the prestigious Huxley Lecture to the Royal Anthropological Institute of Great Britain and Ireland. I attended and was impressed with his wide knowledge, his carefully crafted arguments, and his ability to interact in a sophisticated manner (and in excellent English) with his British peers—specifically the noted anthropologists Edmund Leach and Rodney Needham. And then, revealingly, at the start of my honeymoon with Judy Krieger, who was also a student of cognitive development, we elected to travel to Geneva and sit in on Piaget’s weekly seminar. We met “Le Patron,” as he was nicknamed, and exchanged pleasantries. Thereafter, on several occasions when Piaget came to the United States, I went to hear him speak.

And so, by the summer of 1966, I was already immersed in the worlds of Piaget and Lévi-Strauss, both through their writings and through personal observations. As far as I can recall, at that time their American readerships did not overlap very much—experts in child development and education were reading and arguing about Piaget; scholars in sociology and anthropology were reading and debating with Lévi-Strauss.

I believe that I was attracted by two features. First of all, I was intrigued by the concrete details of Piaget’s clever experiments with children of different ages, and by Lévi-Strauss’s vivid accounts of his field work in South America as well as his keen analyses of exotic myths and enigmatic kinship relations. Scholars were trying to map out how these two intriguing populations (Western children and indigenous populations) thought about the worlds of objects and of persons.

Second, and importantly, I was impressed by the systematicity of their thinking. Both Piaget and Lévi-Strauss were trying to lay out the logical structures that gave rise to the words and the behaviors that they and other social scientists had observed. More specifically, when Piaget asked young persons to explain physical phenomena (like the conservation of mass in the face of various physical distortions) or moral dilemmas (like the equitable distribution of desirable objects), he claimed that specific logical structures were at work. He then expressed these structures in the language of logic—technically, group and grouping theories. And when Lévi-Strauss described the manner in which choices of mates were facilitated or prohibited in different cultures, or how themes (like power) and contrasts (like the raw and the cooked) were captured in myths, he too laid out the basic structures in algebraic form. In retrospect (after almost fifty years!), I would say I was too impressed with Piaget’s and Lévi-Strauss’s penchant for describing the logics that underlie the thought and behavior of children, of inhabitants of distant societies, and (of course, by implication) of ourselves. I was looking, so to speak, for “dry land” that undergirded seemingly messy human behaviors.

And then I made a leap, one that, as an American, I was perhaps more poised to make than a European student would have been. I saw both Piaget (Genevan) and Lévi-Strauss (Parisian) as rationalists, growing out of a lengthy French intellectual tradition that placed a premium on logic. That tradition, dating back at least to the time of René Descartes, conceived of the mind as a privileged territory, quite apart from more mundane physical or physiological material. Not that Piaget and Lévi-Strauss were dualists in any literal sense: they both fully subscribed to the biological and physical scientific views of the time. And yet, perhaps more so than those in an Anglo-American empiricist tradition, Piaget and Lévi-Strauss continued to believe that one could study the mind directly, by-passing more mundane or more materialist bases.

It’s a bit odd that I was attracted to this formalist way of thinking. I was not trained in mathematics or physics and in those days had but modest interest in the biological sciences. But I have never forgotten a casual remark that Bruner made to me during that fateful summer of 1965—he said, “Howard, you think like a physicist.” If there was any force to this comment—the kind of sentence that almost everyone else present would have immediately forgotten but that has haunted me for decades—it raised the possibility that I was looking for a more formal way of thinking about the rich qualitative phenomena that interested me.

Quite possibly Bruner also realized that I was looking for a firmer foundation for my earlier interests. As a student in an interdisciplinary field called “social relations,” I had initially been most attracted to the writings of my tutor Erikson and also those of the sociologist David Riesman. Neither of these thinkers had any attraction to formal analysis (indeed, Erikson had never gone to college, and Riesman had not earned a Ph.D.). Perhaps I found myself lacking a powerful response to criticisms levelled at the seemingly artistic approaches to scholarship assumed by these mentoring figures—criticisms on the part of my friends, my other teachers, or my superego.

Anyway, whatever the cognitive or affective motivators, I began to think about the relations between, as well as the differences across, these two towering intellects. I then decided to write an article comparing their contributions. To my pleasure, it was accepted and published in Social Research, an established social science journal. Almost on a whim, I sent the article to Piaget and Lévi-Strauss. To my delight and astonishment, both men responded quickly and personally—to me, a mere graduate student (though Piaget conferred a doctorate on me in his salutation “Cher Docteur Gardner”). Only in the last few years, in digging up their correspondence, did I realize that their typed letters bore the same date—April 10, 1970. Those letters, proud possessions, now hang side by side in my office. I hope that they signal to me and to my students the importance—indeed the irreplaceability—of substantive communication among scholars and between mentors and mentees.

Before long, this article and comments from colleagues stimulated me to write the aforementioned book. (I note that, in that 1970 article, I penned the pregnant phrase, “A task for a book, without question, but one worth sketching at this juncture.”) I conducted memorable interviews with both Piaget and Lévi-Strauss. I’d like to think that those personal encounters lent the book a pertinence and passion that took it beyond their printed words.

I was fortunate to find a wonderful editor, Dan Okrent; a distinguished publisher, Alfred A. Knopf; and a remarkable copy editor, Mel Rosenthal. That publishing team helped me to produce a book that I still have positive feelings about.

Now, because of two recent developments, I have a somewhat different slant on this book.

Development 1: For a volume in her honor, I was asked by my friend and colleague Sherry Turkle to write about one of her books, and together we decided that I should write about her first book, Psychoanalytic Politics. Sherry’s book was also a study of the French intellectual tradition—in her case, focusing on another intellectual luminary, Jacques Lacan, a revolutionary psychoanalyst who was far more controversial than either Piaget or Lévi-Strauss. While I make no claim to have probed Sherry’s own motivations, I see both of us as having stretched to master another intellectual culture, one quite alien to ours, as a means of expanding our own social scientific understanding.

Development 2: As part of our current large national study of higher education, I’ve become intrigued by those experiences that students (and other informants) consider to be transformational. More frequently than I could ever have anticipated, students today nominate their time abroad—the proverbial junior year, though it can occur in any year of college, or in a gap year, a summer internship, or, as in my own case, a postgraduate fellowship. In one sense, it’s disappointing that so many young persons describe as particularly transformative their time away from the home campus. (Why bother to have a well-appointed home campus? Does an airplane ticket suffice?) But perhaps one needs both the time on campus and the time away for such an effect to be felt.

Both Sherry Turkle and I had spent time abroad before we embarked on these particular studies. But I think in my case—and I would speculate in Sherry Turkle’s case as well—this was not just a cultural year abroad. It was an intellectual voyage abroad, immersed in another cultural tradition, that expanded our own understanding of the work in which we were interested—and, perhaps, in ways that we ourselves only dimly comprehend, affected what we did thereafter. Indeed, my decision—while I was still studying the texts of Piaget and Lévi-Strauss—to study neuroscience and spend 15 years working on a neurological ward reveals yet another effort to find some “hard ground” undergirding the human issues in which I was interested. But these speculations and subsequent events are stuff for another blog, at another time.

Categories: Blog

Audi Aicon Campaign Uses Multiple Intelligences

February 14, 2018 - 12:07pm

German car company Audi has unveiled the Aicon, an electric self-driving car, and a recent advertising campaign for the model uses Howard Gardner’s theory of multiple intelligences to highlight the vehicle’s various features.

For example, the electric motor is highlighted as evidence of the car’s naturalist intelligence, while its autonomous driving capabilities are cited as evidence of mathematical intelligence.

Gardner comments on this development below:

My friend, Tom Hoerr, a leading authority on the theory of multiple intelligences, brought this advertisement to my attention. When I developed the concept of MI, I never anticipated how much mileage (!) others might get out of the idea. As I quipped to my children, they may get a kick out of this, but not a kickback—the idea of multiple intelligences has always been in the public domain.

Click here to see more.

Categories: Blog

Reflections on Transformations

February 7, 2018 - 8:42am

In several blogs in this series, I have written about the transformative powers of education. Drawing on my own experiences over a long life spent almost exclusively in educational institutions, I have recalled mentors, books, travels, and meetings—usually ones that had a positive transformative effect, though I have also mentioned at least one transformational experience that was decidedly negative. I have not yet written about the family, friends, colleagues, or, indeed, enemies who have also had major influences on my development. Perhaps someday…

It’s opportune to step back and to think more broadly about the meaning of transformation and the processes by which it may occur. I’ll approach this in a somewhat schematic way—with the thought that I and others can fill in the blanks, or, if more appropriate, leave them blank!

Size of Transformation

As one who has spent much of every day reading, I have little hesitation in saying that becoming literate is a Big T Transformation. (Indeed, it has been for the species, as well as for billions of individuals.) Writing belongs in the same category.

But there are also smaller transformations in the realm of literacy. When I learned to dictate essays, rather than scrawling them by hand, that prosthesis certainly changed my life in significant ways—but I would not consider it a major transformation. By analogy, learning to drive a standard car constitutes a major transformation in the lives of most young people—but then, learning to drive a truck or even a bus is not nearly so great a leap. (Driverless cars may constitute a different kind of transformation!)

Length of Time for Transformative Effect

Take teaching as an example. I have always wanted to be a teacher and, indeed, began to teach others when I was very young (no doubt making a mess of things). A major transformation occurred when I was able to prepare a lesson plan and then, if something unexpected but promising occurred, to toss the lesson plan aside and, so to speak, go with the flow. This transformation required both mastery of materials and the flexibility to juggle priorities, while honoring the broader aims of the lesson or the course.

Speaking of teaching, I can point to a transformation that took much less time. For years, in any big course assigned to me, I simply lectured for most of an hour—as I like to quip, “easier for the teacher, easier for the students.” But about 20 years ago, when video became widely available, I decided to record all of my lectures; to ask students to view the videos prior to class (most did); and then to base the class on a discussion of the recorded lecture (and associated readings). This shift took time to effect—but I would say that after one year (which would be 10-20 recorded lectures), I became able to lead comfortably what we now call a “flipped classroom.”

Areas of Transformation

Those who know me (which includes faithful readers of this blog) realize that I focus, probably too much, on academic and intellectual matters… the likely fate of a professor, I suppose. But in conversations with Rakesh Khurana, the remarkable Dean of Harvard College, I’ve become convinced that higher education in the liberal arts and sciences should strive to bring about three changes (in students and others):

1) changes in how one conceptualizes the world of ideas and associated practices;

2) changes in how one relates to peers and other individuals; and

3) changes in how one thinks about oneself.

As with the trio of good worker, good citizen, and good person (see thegoodproject.org), one can achieve one “good” without nudging the others. It is certainly possible for a college (or a high school or even a summer camp) to affect one of these spheres without affecting the others. There are certainly students whose way of interacting with peers or teachers is significantly affected in one way, without comparable shifts in the other spheres—and vice versa. Moreover, in any particular case, there may be good reasons not to change the way that one thinks about others, nor the ways in which one interacts with others. But in most cases, young people benefit if all three modes of being are changed in a significant way—hopefully, of course, in a productive direction.

Let me be concrete—and personal. I would like young people to attend schools which strive to bring about these transformations. Along those lines, I hope that most young people will have the opportunity to consider—and perhaps pursue—career paths other than the ones that they (and, most probably, their hovering parents) had thought were their “chosen” trajectories. I am not so pleased when such young people have embraced a passion or a mission, only to abandon that worthy goal in favor of the easiest or most lucrative path. By the same token, one could arrive at college as a genuine democrat (small d!) and leave as a snob, a non-caring member of an elite—a less than happy transformation, by my values.

Deceptive Transformations and Non-Transformations

Though we have learned to give tests or award badges for many skills, we are far from having reliable ways of ascertaining whether transformations have taken place. All of us can think of experiences which seemed very important at the time—a first date, a first love, a first trophy, a first public embarrassment—but whose distinctiveness seems to have faded or disappeared over time. Call this a false positive.

By the same token, we can certainly think of experiences which, at the time, seemed casual or unimportant but whose significance looms larger as our lives unfold. For example, when, in the spring of 1967, one of my teachers suggested that I drive out to Brandeis University to meet Professor Nelson Goodman, I had no idea that this meeting would lead me to help launch an organization—Project Zero—that has been central in my life for a half century. Nor, when on short notice I decided to hitch a ride to Ann Arbor, Michigan, did I have any idea that a chance conversation with the driver—a young scholar named David McNeill—would both lead me to a new field of study (cognitive psychology) and introduce me to Judy Krieger, the fellow student who became my wife and the mother of three of my children. Call these the “hidden triggers” of life-changing transformations.

As we think about transformations, some other questions arise. We think of transformations as moving forward—but there are certainly regressions. As individuals grow older and, in less happy cases, develop dementia, they can revert to thinking in a qualitatively more primitive way. There is also the question of whether societies can actually mandate transformations. Of course they can do so in a legal way (for example, “I now pronounce you husband and wife”) or via a dramatic experience (for example, circumcision as part of an initiation rite). But it is less clear that you yourself are changed by such a legalistic process—though the ways in which others treat you may well alter significantly (in which case we can say that the others have been at least slightly transformed!).

Disciplinary Transformations 

And as a scholar, I should mention the transformation of fields of knowledge—as described dramatically by the historian of science Thomas Kuhn in his famous book The Structure of Scientific Revolutions. Any field can be so transformed—think Darwin, think Einstein, think Picasso (or Virginia Woolf or Duke Ellington), think Stravinsky (or The Beatles). The fields may be transformed but not necessarily their long-time practitioners—as has often been quipped, “In the academy, change occurs one funeral at a time.”

Closing Challenge

If you accept my claim that transformations are important—indeed, that they constitute a central goal of education—you need to confront an unsettling situation. At present, we have few if any systematic ways of studying transformations. Models occur to me—models from biology (morphogenesis), models from mathematics (chaos theory), and models from physics (change of state). But when it comes to transformations of human beings, we remain moored to far older methods—introspections, observations, and an occasional low-key measure. Indeed, in our study of higher education, we attempt to determine whether an individual has learned to think in a liberal arts way… certainly not a way that is embedded in the genome, nor, indeed, available in most cultures across the centuries. Another story, another time.

Categories: Blog

2018 RHSU Edu-Scholar Public Influence Rankings

January 23, 2018 - 9:08am

Howard Gardner has been ranked second among the most influential education thinkers in a 2018 ranking.

The annual RHSU Edu-Scholar Public Influence Rankings, released by Rick Hess in his column in Education Week, is a list of American university-based scholars who are shaping educational policy and practice. The ranking is based on factors such as Google Scholar and Amazon rankings, as well as press and web mentions.

The 2018 ranking of second place is a three-spot jump for Gardner, who has moved up from the #5 slot in 2017. Rounding out this year’s top five are Linda Darling-Hammond, Angela Duckworth, Gloria Ladson-Billings, and Diane Ravitch.

Click here to read the release in full, and as always, congratulations to all the scholars who made the list.

Categories: Blog

Transformative Experiences: Positive and Negative

January 22, 2018 - 11:57am

It is certainly reasonable for education to foster transformative experiences. I am skeptical of any educational program, whether preschool or adult classes, which denies that it seeks to bring about such experiences. To be sure, transformation cannot be guaranteed. Moreover, there can be false positives—one thinks that the trip to Europe with one’s friends was transformative, but it actually disappears from memory with little trace. And there can be false negatives: one takes a course in art history and gives it a low rating. But later in life, one calls regularly and gratefully on knowledge and skills from the course as one begins to collect or indeed to make drawings.

In my own case, during my years as a graduate student at Harvard, I had two experiences which, at least in retrospect, prove to have been quite transformative. One was decidedly negative, the other clearly positive.

The bad news first. As a beginning graduate student in developmental psychology, I was required to take a course in social psychology. The course was taught by two young instructors: Stanley Milgram, an expert in the experimental study and manipulation of human behavior, and Thomas Pettigrew, an expert on the nature and sources of prejudice. Each week, about 15 graduate students read key texts and then, seminar style, sat around a table and discussed the readings critically.

I took the course a few years after Milgram had carried out and published his path-breaking studies on “obedience to authority.” One week, we read and discussed a key article about the amazing finding—that, contrary to what the overwhelming majority of psychiatrists had predicted, most American subjects would deliver a powerful electric shock to another individual simply because a man wearing a laboratory jacket had so instructed them.

In the course of the discussion, I made a few critical remarks about the experiment. I wish that I could remember their substance; I do remember that they were quite reasonable comments, and I think that I presented them in a polite or at least a non-confrontational manner.

What then happened, fifty years after the event, still makes me shudder. For several minutes Milgram viciously attacked me, saying that I was trying to ruin him, destroy his career, undermine social psychological experiments, and blow up the field of social psychology. None of these comments were fair-minded—indeed, I would call them paranoid.

What happened thereafter was even worse. Not a single person in the room—neither my fellow students, nor the other faculty member—rose to my defense in any way. Rather, like a sudden, one-time explosion, the episode passed, and we went on to other comments on other readings.

Only after the session did my fellow classmates and the other professor come and speak to me and, in effect, apologize for Milgram’s unprompted explosion. I was too stunned to ask them why and, indeed, sought to suppress, if not banish, the experience altogether from my mind.

But in fact the opposite occurred. I learned invaluable lessons from the experience, ones that have stayed with me to this day. First, you can be attacked by an authority figure, and there is nothing that you can do about it at the time. Second, do not expect to be defended, even by individuals who know better. Third, continue to speak your mind, but strive to do so in as non-confrontational a manner as possible. And, if you cross over a line, apologize.

You may wonder what happened, in a small academic department, between Milgram and me. We never discussed the event. We entered into the more traditional student-faculty relationship, and he even read and commented on some of my papers. And later on, well after I graduated, we had a few professional contacts. Milgram died at a very young age—his loss to the field (as well as to his family and friends) was severe, and I have long since forgiven him, though I have never forgotten him.

On to the positive experience—and happily so. As a graduate student, I was a founding member of Harvard Project Zero, a research group in education which happily survives to this day, fifty years later. Project Zero was initially directed by Nelson Goodman, an eminent philosopher who was particularly interested in the nature of different kinds of symbols, including those in the arts. (With David Perkins, I co-directed Project Zero from 1972-2000.)

Goodman and I became quite close, and we even collaborated on projects—he as a seasoned philosopher, I as a budding psychologist. We were both interested in how different kinds of symbols are understood and processed—for example, to use the language of Susanne Langer (see here), how human beings process written and spoken language as compared to how we process the visual arts or dance.

At that time, studies of the functions of the two halves of the brain were becoming known, due chiefly to experimental procedures whereby one can send stimuli only to the left or the right hemisphere. Both Goodman and I wondered whether the difference between language (and what is often called a discursive symbol) and depiction (what is often called a presentational symbol) might be respected (respectively!) by the left and right hemispheres of the brain—thus giving a “material” basis for a distinction important to philosophers and psychologists.

It so happened that Norman Geschwind, a brilliant young neurologist working in Boston, had been studying just this issue—disorders of higher cortical function in brain-damaged patients. Possibly with my help, Goodman invited Geschwind to speak to our small research group, then housed in a small building on Prescott Street in Cambridge.

As with the Milgram episode, I do not remember the details of Geschwind’s presentation—though I could certainly invent one convincingly because I heard him lecture at least 100 times in the succeeding fifteen years. But as an individual interested in artistry (then the principal focus of Project Zero), I was fascinated to learn that famous artists had been studied after they had sustained damage to their brains—and that what they were still able to do, as opposed to capacities that had been impaired, revealed important information about artistry. So, for example, a famous composer had sustained damage to certain areas of the left hemisphere; he could no longer speak, but he could still compose. In contrast, an eminent painter with damage to certain areas of the right hemisphere could still speak about his drawings, but the spatial configurations were greatly distorted.

Not only were these findings fascinating and counter-intuitive. I came to realize that a study of the brain—and particularly of cortical pathology—might hold answers to questions about the nature and organization of artistry. These were questions that had haunted me but for which I had hitherto lacked both populations to study and methods by which to study them.

Not long thereafter, I made one of the major decisions of my then young scholarly life. Instead of continuing work in developmental psychology and looking for a teaching job, I would instead seek to do postdoctoral work in neuropsychology with Norman Geschwind. He was kind enough to agree. I did a three-year postdoctoral fellowship under his guidance, and we became colleagues and friends until his untimely death in 1984—when he, like Milgram, was still in the prime of life.

I have one other memory from Geschwind’s visit to Project Zero. He was scheduled to speak in the afternoon. I had to return home after the formal talk because I had recently become a father and wanted to be with my wife and daughter. But after dinner, I returned to Project Zero, and what had started out as a standard seminar was still proceeding until well into the night.

I cannot draw any deep conclusions from these two examples—one of very short duration but with long-time personal consequences, the other somewhat longer and with long-time scholarly consequences. But I can say to every student—of any age—that you should always be open to such life-changing experiences and seek to give them as positive a spin as possible. And I can say to every teacher, as Henry Adams memorably wrote, “A teacher affects eternity; he can never tell where his influence stops.”

Categories: Blog

Travel as Transformational

January 5, 2018 - 11:51am

In our large national study of higher education, we ask students—and others connected with colleges and universities—whether college can or should be a “transformational experience.” Recently, we have also asked informants whether they can name a transformational educational experience of their own, either within or beyond traditional schooling. Often, they mention foreign travel—typically, the proverbial “junior year abroad.”

I am fortunate to have had several transformational experiences in college. I am also fortunate to have had the opportunity to travel a good deal. Alas, as is the case with many of my colleagues, the travel is all too often via air, to a hotel, a speaking venue, possibly a quick walk down the main promenade or through a well-regarded museum or some other iconic public building, followed by a quick return to the airport and, if fortunate, an on-time flight back home. (As if to prove the point, this blog has been drafted and edited on plane trips.)

When I reflect on my own education, three travel experiences stand out as transformational.

A Year Abroad Following College

Upon my graduation from Harvard College, I was lucky enough to receive a year’s support to study in London and to travel elsewhere as well. Perhaps wisely, perhaps foolishly, I did only minimal study in a formal sense. Instead, often with friends who were also residing in London or nearby, I took advantage of the year to immerse myself in the artistic culture of England—attending many theatrical productions, especially at the recently launched National Theatre; visiting numerous museums and other cultural sites; travelling throughout the British Isles and Western Europe; and making a memorable train ride through Communist Eastern Europe to Moscow and Leningrad (now St. Petersburg again) in the heartland of what was then the Soviet Union. I also read widely, in a few European languages, and spent a great deal of time writing—among the products were a diary, various shorter articles, and a long and totally unpublishable novel. I also maintained a long-distance relationship which culminated in a June wedding in London and a summer honeymoon motoring throughout Europe.

Back in the day, Americans of means did their “European tour”—partly to see iconic sites, partly to acquire culture or become cultured. Indeed, my parents, who had fled Nazi Germany in the late 1930s, had taken my sister Marion and me for our first trip abroad in the summer just before I began college. But it was really during the post-graduate year—the other bookend of college, so to speak—that I acquired some intimacy with European culture, or, as members of my family now put it, received the requisite “culture credits.” I’m now grateful that I can continue to draw on these credits nearly every day—even more so in late than in mid-life.

The two other examples relate much more closely to my own vocation as a teacher and scholar:

A First Visit to Reggio Emilia in Northern Italy

In the late 1970s, I was a young researcher in developmental psychology with a particular interest in the arts (thanks, in part, to the year spent in England). At that time, I heard about extraordinary schools for young Italian children begun in the early 1960s in a small city, hitherto unknown to me, called Reggio Emilia (hereafter RE). I also began to contribute modestly to a publication about early childhood education, emanating from RE, with the neat name Zerosei (“Zero to Six”).

I was delighted when my wife Ellen Winner, an expert in artistic development, and I were invited to visit RE in the early 1980s. At that time, the inspiring founder of the RE educational experiment, Loris Malaguzzi, was very much alive. He and his close colleagues, who had been with him almost from the start, hosted Ellen and me for several memorable days. We spent a lot of time in the local schools, visiting and interacting with students, teachers, and parents—and perhaps surprisingly, the language barrier did not hinder communication. Indeed, it may even have had certain advantages, heightening information taken in via other senses. I became an unabashed admirer of these schools, with their foci on family participation, art and design, explorations of the urban environment, and hands-on activities. Malaguzzi’s evocative “hundred languages of children” dovetailed neatly with my newly developed theory of multiple intelligences.

In 1994, Malaguzzi died suddenly. I was concerned that his magnificent educational experiment might founder. I approached one of the funders of our own research, the Atlantic Philanthropies (AP), and I asked whether that funding agency might provide support so that the surviving educators in RE would have the opportunity to reflect, regroup, and anticipate next stages and challenges. Due to legal restrictions, AP was not able to provide direct funding to RE. But AP encouraged us to create a joint project in which our research team at Harvard would have the option of directing some funds to RE (in effect, subcontracting).

Thence began a wonderful collaboration—still ongoing—between the educators in northern Italy and the educational researchers at Project Zero in Cambridge. Most of the exciting practices and stances came from the Italian educators. But our research group made substantive contributions in helping RE personnel to understand better what is distinctive about their practices; how best to describe them (in Italian as well as English); and how these insights and processes might be conveyed to and implemented by motivated educators throughout the world. In a well-received book called Making Learning Visible, we described practices like collaborative learning, documentation of student work, and the cultivation of expertise in pedagogy and in the arts.

A Surprise Visit to the Middle Kingdom

China was long characterized as the “Middle Kingdom.” In 1949, it proclaimed itself the People’s Republic of China, but it was referred to in the West in the 1950s and 1960s as Communist China. Beginning in 1966, the leadership of China launched a large-scale, quite violent, and (it is now universally agreed) highly destructive movement called the Cultural Revolution. Only with the death of Chairman Mao Zedong in 1976, and the ascension to power of Deng Xiaoping in 1978, did a calmer and more constructive China come into being.

In 1980, I knew almost nothing about China—it was far away both geographically and culturally. No one was more surprised than I when I was invited to join what was described as the first post-Cultural Revolution trip to China undertaken by Harvard University. (The circumstances were idiosyncratic, having to do with the lifelong friendship between my then Dean, Paul Ylvisaker, and the long-suffering president of a Chinese university, Xia Shuzhang.) Along with about ten colleagues, I spent two weeks in China, visiting several cities, learning about the culture as it had evolved over the millennia, being introduced to the nation’s recent turbulent history, and, most memorably, meeting with dozens of academics who had been horribly mistreated for a decade and were still shell-shocked.

The single trip was memorable and certainly raised my consciousness about China (such “consciousness raising” is certainly one of the main dividends of travel). But China would not have become transformative for me had it not been for another unforeseen set of events.

Two years later, I was invited to join a delegation of American arts educators travelling to China—once again described as the first such organized trip to China. As a minor figure in the delegation, I had planned to under-dress and to carry little luggage with me (Rule #1 of Gardner travel: “Travel light!”). But the day before our delegation was scheduled to depart from JFK Airport, the designated leader of the delegation became ill. As the only member of the delegation who had been in China before, I was asked to lead. Dressed in one crumpled suit along with one moth-eaten sweater, I led twelve far more distinguished American arts educators on a visit to China which included a major conference of arts educators from the two nations.

Those who know little about China might well assume that in a determinedly Communist country, there is little status hierarchy—whatever their position, the cadres look, dress, and act alike. And you—like me—would be completely wrong. Everywhere we went, it was easy to tell who was the leader, who were the other administrators, and who were relatively lower-status artists and academics. By the same token, wherever I went—and with whomever I dealt—I was THE professor, THE leader. I learned that where one sits, next to whom one sits, and who makes the toast and receives the obligatory gifts mattered in this context; as the designated head of the delegation, I remained the person-of-the-hour. (No doubt my threadbare dress was noted and remarked upon by many people.)

During the course of this tour, I learned a lot about education in the arts in China. I hope that our Chinese counterparts learned a lot as well about the situation in the United States and the West. But I was treated as a “big deal”—even though in no way was that true. (At the time I was not even a tenured professor, let alone a star of any sort or of any sector.)

That said, I am certain of one thing. What most impressed the members of the Chinese delegation was what happened when we Americans had to decide on something. In front of our dumbfounded Chinese colleagues, I would poll my American colleagues orally, weigh pros and cons, often take a vote, and we would eventually decide on a course, by vote or by consensus. Women counted as much as men; academics, artists, and administrators had equal voice. Whatever China called itself or however it styled itself, we were giving them a lesson in democratic procedures.

Even with two trips under my belt, I still assumed that my Chinese adventures were a sideshow. But shortly thereafter, I became co-director of a large-scale study of arts education in China and the U.S. Thereafter, my colleagues and I made several trips to China—with my wife and our then infant son Benjamin joining us in 1987. Despite the fact that I never learned to speak or read Chinese, I became de facto an expert on the comparative study of education in the arts in our two countries. Indeed, after my fourth trip to China, I wrote a book about my experiences, with the title To Open Minds: Chinese Clues to the Dilemma of American Education.

The theme of the book—the dilemma alluded to in the subtitle—foregrounded the contrasting approaches to creativity in our two countries. Briefly, in the U.S. and other Western countries, including the Italy of Reggio Emilia, we value the creative explorations undertaken by young children. Only after relatively unstructured time in the first years of life do we introduce and value more systematic, disciplined study. In contrast, in China, the emphasis from the first falls heavily on disciplined learning—as the oft-repeated cliché has it, “One must walk before one can run.” Only much later, after discipline and skills are completely ensconced, are certain individuals encouraged—or at least permitted—to take more of an imaginative or creative leap.

I argued that either approach to the nurturing of creativity is valid. The risk for the Western approach is that one becomes so attracted to exploration that one never acquires the essential skills and discipline. The risk for the Chinese approach is that skills become so entrenched that one never takes a risk—or that by the time that one is prepared to branch out, it may be too late to accomplish anything that is truly innovative.

To Open Minds was written in a burst of energy in 1987 and published in 1989, just about the time of the horrific mass killings, mostly of young students, in Beijing’s Tiananmen Square. I was so upset by this brutality that I did not visit China for many years. Indeed, by the time of my first post-Tiananmen trip in 2004, it had become a totally different country. In some ways, China has become more like the United States. In other ways, especially in light of American state and federal educational policies, the United States has moved to a more classically Chinese orientation of drill-and-kill, unfolding in a pervasive omni-testing environment.

With the benefit of the passage of time and the shifting of norms, I see the two approaches to creativity as a continuing oscillation within and also between our two nations. And of course, any account of creativity has been complexified these days by the introduction of powerful digital tools and devices, essentially unknown at the time of my first trips to China and Reggio Emilia alike.

What do we gain from travel, particularly travel that we venture to characterize as transformational? I suggest at least three benefits.

  1. In the case of my first trip to Europe and my post-college year wandering about England and the Continent, I learned far more about the Western cultural heritage—the background of my own family, as well as the intellectual roots and scope of what I had studied in the humanities (and some sciences) in high school and college. Let’s call this depth.
  2. Reggio Emilia exposed me to ways of teaching and learning that I might have read about in the writings of progressive educators like John Dewey or might have seen in certain progressive schools that I visited in the American northeast. But I had never seen these ideas realized with the seriousness, vividness, and longevity that I saw every day, nearly every hour, on every visit, to RE. Let’s call this realization of potential.
  3. With its long history, and its recent turbulence, China constituted the most alien travel experience I have had. The alienation (in the literal sense) was underscored because I do not read or speak Chinese and because I lived alone for a month in Xiamen (and that’s where, on long and lonely evenings, I began to write To Open Minds.) But ever since that trip, whenever I consider any cultural issue, I have China in mind as an alternative, a radically different culture against which I can test my own assumptions and predilections. Let’s call this a comparison case.

And while I have not continued to visit and study China (life is short), I have had many excellent students from China; I continue to follow political and artistic events in China; and, as we plow through a new century, I am open to the possibility that this one may prove to be the “Chinese Century.” As a Westerner and democrat, I hope that China will be less Stalinist, and more Confucian.

Stepping back from travel, I believe that these benefits can come from other experiences—wide reading can certainly deepen one’s knowledge, flesh out hypotheses and intimations, and provide vivid comparisons. But, all the same, travel to faraway places probably achieves these dividends most vividly—and it’s also the most fun.

Categories: Blog

Project Zero: Celebrating 50 Years

December 18, 2017 - 9:24am

Founded in 1967 as an investigation of arts education, Project Zero turned 50 years old in 2017 and now encompasses research in topics ranging from understanding and creativity to interdisciplinary learning and ethics.

On October 13, 2017, PZ held a public forum at the Harvard Graduate School of Education to celebrate this milestone, reminiscing on half a century of contributions to the field of education and beyond, and looking forward to the future as a center of innovative thinking.

Speakers at this forum included:

  • James Ryan, Dean of the Harvard Graduate School of Education
  • Drew Faust, President of Harvard University
  • Daniel Wilson, Director of Project Zero 
  • Howard Gardner, Principal Investigator and former Co-Director at Project Zero
  • David Perkins, Principal Investigator and former Co-Director at Project Zero
  • Shari Tishman, Principal Investigator and former Director at Project Zero
  • Steve Seidel, Principal Investigator and former Director at Project Zero

A video of this event is available below. Throughout the year, PZ will be commemorating its 50th year through particular themes and events. Check their website for more information.

Categories: Blog

On Liberal Education: Views from Abroad

December 11, 2017 - 1:42pm

In the United States, when we contemplate the phrases “liberal education,” “liberal arts education,” or “education in the liberal arts and sciences,” we face two essentially opposed perspectives. On the one hand, the years beyond high school have long been seen as a period when young people can leave home, spend several years in a comfortable setting (perhaps near, perhaps distant from their families), mix with peers, enjoy an active social life, and perhaps learn things that are interesting and useful. We can call this the romantic view of higher education. More recently, however, the high expense of higher education, as well as the lesser likelihood of finding a good job right after graduation, has led to a less happy perspective. Perhaps college is not worth it; indeed, in recent polls, politically conservative respondents actually indicate that higher education is bad for the national interests—a pattern of response that would have been unthinkable a few decades ago. We can call this the disenchanted view.

In other parts of the world, higher education has had a quite different history. First, it has been restricted to a small elite: students who have done well in secondary school and have passed a challenging completion exam. Second, it is usually pointedly vocational; one goes to university to become an engineer, a lawyer, or a physician. Third, and importantly, it has traditionally been free or of low cost, with the vast majority of students living at home and not “on campus.”

But recently, notably in Europe and Asia, an increasing number of institutions of higher learning—both government-sponsored and for-profit—have been launched. (It’s been estimated that there are about 200 self-styled liberal education institutions outside the United States.) Students who decide to enroll have a “blanker slate”—they do not arrive with a disposition to romanticize or castigate this form of education. Taking advantage of this situation, three scholars (Jakob Tonda Dirksen, Daniel Kontowski, and David Kretz) have asked students who are attending or have attended liberal arts institutions in Europe to answer the question, “What is liberal education, and what could it be?”. The editors have published the responses of 17 students in the slim volume What is Liberal Education and What Could It Be? European Students on Their Liberal Arts Education.

In some ways, the respondents are reminiscent of students at select American liberal arts schools. By their own testimony, they tend to come from relatively affluent backgrounds—and yet also have to defend themselves against friends and family members who ask them why they are not pursuing a vocational career. As Leon says of his school, “Leuphana University is a quite homogeneous space where many of us students come from middle-class backgrounds, spent a year volunteering before entering university, and speak at least three languages.”

More so than most American students, those enrolled in European colleges that style themselves as “liberal arts” centers see themselves as risk takers. The kind of education that they have chosen to pursue is unfamiliar to many in their worlds—and so they feel that they have proclaimed themselves as different from their peers. In this sense, they are more like first-generation students in the United States: they have placed a distance between themselves and both family members and secondary school peers—who, if they had pursued tertiary education at all, would have been more likely to pursue a conventional degree in a single subject matter (like economics) or in a professional career.

These select students declare their uniqueness in what they say and how they say it. Consider this evidence:

  • Some lines from the poem “Artists and Scientists: The Uncommon View” by Nathalie (Leiden University College):

We are the artists one hasn’t seen before
Since we draw connections through actions, reactions, and dissatisfaction
We are the scientists of the shades of grey
When everyone’s leaving, we smile and stay to inspect it all

  • Teun (University College, Utrecht) on “The Headaches and Joys of an Open Curriculum”:

For the Renaissance women and men, an opportunity to avoid choosing
For the Tailors, a way to choose precisely what they wanted
For the Shoppers, a chance to try different courses and see what grabs them
For the Avoiders, a way to avoid courses or approaches that they feared or were not interested in

  • Sem (University of Winchester) makes a drawing of the difference between the oblivious child and the one who has seen the light.

  • While Lukas (Leuphana University, Lüneburg) mixes Simon and Garfunkel’s lyrics with his own text about liberal education:

Simon and Garfunkel: “Like a bridge over troubled water I will lay me down”

Lukas: Against this background, my study was a salad bowl of experiences. All over the liberal arts, the “multi-, inter-, & trans-disciplinary hype of un-education accompanied me smoothly, carried me safe”

As should be evident, the writers collated by the editors are a lively group, not reluctant to express what is on their mind in artistic form.

Some of their testimony is more pointed and critical:

  • Iesse (Leuphana University in Lüneburg) wonders whether, instead of being critical of capitalism, he and his peers are being prepared to join the neo-liberal class—becoming in effect the future “Davos” women and men: “For me liberal education rather corresponds to the latest developments in capital society—its ideals of capital accumulation, market-liberalism, comprehensive competitiveness, and the inherent exploitations of capitalism. Whether one likes or dislikes this will certainly vary with context. I guess that many people are, like me, torn.”
  • Jacob (European College of Liberal Arts, Berlin) questions a number of widely held assumptions. He wonders whether liberal education truly achieves critical thinking (let alone the more radical challenging of assumptions); rather than being “interdisciplinary,” he suggests that it is pre-disciplinary; and he notes that it fails to ask whether a competitive career is really the sole aim of life. As he concludes, “modern liberal education misses the same introspective qualities that it fails to develop in its students.”

I’ve introduced some of the more exotic responses to the questions put forth by the editors—allusive in their use of artistic tools and/or pointedly critical of the programs in which they have matriculated.

But this sample in isolation gives a distorted picture of the testimonies in this collection. Overall, I was impressed by the thoughtfulness of the responses—whether literal or metaphoric, whether critical or complimentary. Here are some powerful points made by the students:

  • Drawing on Plato, Clara (Leiden University College) sees the education of the soul as the ideal conception of liberal education. She delineates how one might educate the soul of the good man (the ideal soul, the just soul, the educated soul) as one which is wise and courageous and which is able to moderate its desires so that it may focus on the achievement of higher (immaterial) goods. The true liberal artisans would get along well with Plato; they are open-minded critical thinkers, who do not back away from challenges.
  • Nathan (Amsterdam University College) points out that “the liberal arts have shown me that it is this professional and academic humility—at a time when young people are pressured to have clearly articulated convictions, interests, and ambitions—that will allow me to dare to explore disciplines beyond my specialization.”
  • Sanne (University College Roosevelt, Middelburg) praises the features of campus life that American students too often take for granted: “The university college made sure rent was affordable; living together next to your fellow students only minutes from the university made working together easy to arrange, and there were always people around to have a cup of tea with late at night. UCR students really formed a strong close-knit community.”
  • Arthur (King’s College, London) asserts that “the liberal arts afforded me the opportunity to think as a history student, as a philosophy student, as a film student, and as a literature student at the same time. I found that studying multiple different subjects at the same time allowed me to pool knowledge and different methodologies from each discipline for the benefit of a project. In addition I could take different ways of thinking from different disciplines to approach a subject in a new way.”
  • Brita (King’s College, London) declares that “ultimately liberal arts and its inter-disciplinarity has for me involved an acknowledgement of life as simultaneously meaningless and bursting with meaning. I am no longer able to conceptualize or express my life and future life without including art, personal growth, relationships and emotions, as well as academic and professional progress… Liberal arts can teach you what is good, what is bad, what you value, and what does not matter to you. Ultimately, what more could you ask of an education?”

These voices from young persons studying liberal arts outside of the United States are illuminating in two ways:

  1. They cast fresh light on features of a form of education that has long been associated with the United States—both its prestigious private institutions and its capacious public institutions—that may have become less visible and less vivid to those who have long taken their assumptions for granted.
  2. At a time when liberal education is under severe attack in the United States (for some valid reasons, but mostly for reasons that are ill-informed), this informal European study suggests some features that may flourish in soils remote from our shores.

Note: For expositional purposes, some of the quotations above have been lightly edited. I trust that the intended meaning always comes through.

Categories: Blog

The Price of Passion… And Its Rewards

November 28, 2017 - 12:47pm

Visiting a campus that is not very selective (I’ll call it “Downtown University”) as part of our study of higher education, I spoke to a middle aged painter (I’ll call him “Henry”) who teaches drawing and painting to undergraduates. A handful of his students hope to be able to make a living as artists of some sort. The vast majority take his courses because they would like a job in an arts-related business (perhaps fashion or communications or advertising); because they want to become art teachers in public schools; or out of curiosity, hobby, or—that bugaboo of contemporary non-vocational education—in order to meet “distribution requirements.”

In the course of our conversation, Henry mentioned that he himself had studied at a conservatory, where most of the students fancied themselves future artists or teachers of art; and that he had also taught at two highly selective Ivy League colleges. Curious to learn about his experiences at these more selective schools, I departed from our customary protocol and asked him to compare his “ivy-covered” students with those at Downtown University.

Henry thought for a while and then said, “Well, in many ways it is easier to teach drawing and painting at an Ivy School. The students are highly articulate, and since I like to give verbal feedback, it’s easy to explain to them what they might do differently and why and for what purpose. Also, I begin with highly technical lessons and, used to being obedient and to following rules, the students have fewer problems mastering technique than those who come from less privileged or more chaotic backgrounds.”

Henry paused again and added, “But there’s a big problem with many of the Ivy students. To get into these highly selective schools, students need to amass a portfolio of assets: high grades, high test scores, and a panoply of extra-curricular and service activities. I understand and respect that. But then when they arrive at college, they feel that they have to continue that pattern. They know no other! And so, come the weekend, they divide their time between homework, seeing friends, going to athletic events (if they are not actually on one of those numerous teams), or some other artistic or athletic or academic club. And before they know it, it’s late Sunday evening, if not early Monday morning.”

But to become an artist, Henry explained, “You need to have passion. Making art has to be the most important thing that you do. You need to be prepared to spend nights and all weekend on your painting or your mural or your triptych—in fact, you have to want to spend your time on that artistic endeavor. Of course, you pay a price, but it’s a price that you realize you have to pay, and you will want to continue to pay into the indefinite future.”

I did not want to put words into Henry’s mouth, but it seemed he was saying that, inadvertently, preparation for college may undermine the drive, passion, grit, and love that enable a young person to pursue certain careers, and especially a career in artistry, where no holds should be barred. If so, this is a steep price for an individual—or, indeed, for a culture—to pay, especially when the individual or the culture is unaware of this sacrifice.

To be sure, perhaps such individuals should be directed to artistic conservatories—to Juilliard or Curtis in music, to Rhode Island School of Design or Parsons School of Design in the visual arts. But then, two costs are incurred: the students themselves are deprived of a balanced education in the liberal arts and sciences, and their classmates lack contact with future major artists (no Yo-Yo Ma or Leonard Bernstein at Harvard, no Frank Stella at Princeton or Helen Frankenthaler at Bennington).

Perhaps there is a way to decrease the dilemma that Henry foregrounded. In current efforts to rethink college admission—for example, Turning the Tide—it’s been proposed that on their applications, students should only list 1-2 extra-curricular activities. Not only would this stricture slow down the trend toward quantity rather than quality, but it might reward those students who have a passion for the arts, or, indeed, for any hobby, discipline, or topic. And perhaps, in a similar vein, college students should be restricted to one major, rather than the two, or, increasingly, three majors, two badges, and a certificate to spare, that I’ve been hearing about of late.

Of course, as colleagues have reminded me, students (and their parents and advisers) are keen readers of changing signals. And so, if colleges decide to valorize those students who seem to have a passion, no doubt there will be efforts to “game the system.” At least some may attempt to “fake passion.” One has to hope that those who preside over college admissions will be able to discern which applicants are truly and passionately engaged in an activity and which simply purport to be passionate.

A bigger challenge is to change the way in which we as a society think about and admire children growing up. All of society recognizes that certain young persons will excel in an area—be it chess, spelling, baseball, or, to use examples from the arts, drawing, caricature, mime, or musical performance.  But all too often these young persons are seen as anomalies, as freaks, as Gladwellian outliers—and so, as not particularly relevant to the rest of society or, to be specific, to child-rearing at home, or classroom education at school.

If, instead, from a young age, children were encouraged to find an idea or activity that inspired them, that they enjoyed, that they wanted to get better at, and from which they gained “flow,” not only would we have more youths of passion and with passion, of purpose and with purpose. Equally important, we would be bestowing on these young people a gift that they would have for the rest of their lives. When I was young, I enjoyed playing the piano, quite possibly because my mother sat alongside me on most days. Now, as someone well on in years, I remain passionate about music. Whenever possible, I listen to music. And when I am home, I play the piano every day—only for myself, to be sure—and there is no activity, whatever its resonance of Walter Mitty, from which I gain more satisfaction. I am grateful that this passion has endured, and I wish that everyone had an activity from which they can gain sustenance throughout their lives.

Note: I thank Wendy Fischman, Lloyd Thacker, and Rick Weissbourd for their helpful comments on this piece.

Categories: Blog

Podcast: Alanis Morissette Interviews Howard Gardner

November 27, 2017 - 12:48pm

Howard Gardner has been interviewed by Alanis Morissette for her podcast “Conversation with Alanis Morissette.”

Morissette, a Grammy Award-winning singer and entertainer, spoke with Gardner about his work, ranging from the theory of multiple intelligences to The Good Project to his latest co-authored book, The App Generation.

To listen to the piece in full, click here.


Categories: Blog

Wolfram and Gardner Discuss Computational Thinking

November 15, 2017 - 1:35pm

Howard Gardner and Stephen Wolfram shared the stage on November 6, 2017, at the Harvard Graduate School of Education to discuss Wolfram’s theories of computational thinking.

Stephen Wolfram is the creator of several innovative computational systems and the founder and CEO of Wolfram Research. According to his website, he is a pioneer in the area of computational thinking, a mode of formulating problems and solutions, and has been responsible for many discoveries, inventions, and innovations in science, technology, and business.

A full video of the conversation is available via YouTube below.

Categories: Blog

Contrasting Views of Human Behavior and Human Mind: An Epistemological Drama in Five Acts

November 14, 2017 - 12:06pm

Last month, I received an unexpected communication from Dr. Henry (Hank) Schlinger, a scholar whom I did not know. As he pointed out, this was a somewhat delayed communication, since it referred to an article of mine written quite some time ago. 

In his note to me, Dr. Schlinger argued that I had been mistaken in my assertion that his brand of psychology—called behaviorism—has been discredited and that another brand of psychology—called cognitive psychology—had taken its place. And he took issue with the way in which I had dramatized this process—I had dubbed the change “the cognitive revolution”—and personalized it, citing the work of linguist Noam Chomsky as being a principal factor in challenging the behaviorist account of “verbal behavior” put forth by B.F. Skinner, a well-known psychologist.

After some reflection, I decided both to respond to Dr. Schlinger and to share the correspondence with Noam Chomsky, whom I have known for many years. (I also knew “Fred” Skinner, who was a neighbor, and who befriended my young son, Benjamin, with whom he walked around the neighborhood.) Chomsky responded and, with his permission, I quote his response here.

There ensued one more round of letters—and I’ve described the collection as “a suite of letters in five acts.” I reproduce the exchange here. I would like to think that it is an example of how scholars can disagree profoundly but do so in a respectful way. I thank both Hank Schlinger and Noam Chomsky for their cooperation.

Act I: An Opening Foray from Hank Schlinger

Dear Professor Gardner,

I know I’m a bit late to the game, but I just read your article “Green ideas sleeping furiously” (1995), and I have the following comments.

In your article, you said the following:

“Chomsky’s review of Verbal Behavior was a major event in the movement that was to topple behaviorism and itself become a new orthodoxy,” and “His own research, however, was quite specifically grounded in linguistics and took a decidedly unusual perspective on human language.”

As for Chomsky’s research, I’m curious what you’re referring to because I just looked at all the articles he lists on his CV and didn’t see one research article; that is, no experiments.

As to Chomsky’s review toppling behaviorism, I find that curious too because I’m a radical behaviorist and the last time I looked, I’m still here and teaching behavior analysis classes at my university. And there are thousands of other behavior analysts like me all over the world who belong to numerous professional organizations and who publish in journals devoted to the experimental, conceptual, and applied analysis of behavior.

As to the new orthodoxy, again I’m curious what that was or is. It certainly wasn’t Chomsky’s “theory” of 1957, because that “theory” is gone and his positions have changed with the intellectual wind as one would expect of a non-experimental rationalist.

As I wrote in 2008 on the 50th anniversary of Skinner’s book:

It seems absurd to suggest that a book review could cause a paradigmatic revolution or wreak all the havoc that Chomsky’s review is said to have caused to Verbal Behavior or to behavioral psychology. To dismiss a natural science (the experimental analysis of behavior) and a theoretical account of an important subject matter that was 23 years in the writing by arguably the most eminent scientist in that discipline based on one book review is probably without precedent in the history of science. 

To sum up the logical argument against Chomsky’s “review” of Skinner’s book Verbal Behavior in a rather pithy statement, a neuroscientist at Florida State University once asked rhetorically, “What experiment did Chomsky do?”

And for all of Chomsky’s and your diatribes against Skinner, his book, and the science he helped to foster, his book has been selling better than ever and is now being used as the basis of language training programs all over the world for individuals with language delays and deficits.

Science doesn’t proceed by rational argument, but by experimentation. The experimental foundation of behavior analysis is without precedent in psychology and the principles derived therefrom not only parsimoniously explain a wide range of human behaviors—yes, including language—but they have been used successfully to ameliorate behavioral problems in populations ranging from people diagnosed with autism to business and industry. And what have Chomsky’s “theories” enabled us to do?

I would say that the proof is in the pudding. The fact that some psychologists have not been convinced says a lot about them, but nothing about the pudding.

In case you’re interested, I’ve attached a couple of articles that bear on the subject. You might also want to check out this relevant article:

Andresen, J. T. (1990). Skinner and Chomsky 30 years later. Or: The return of the repressed. Historiographia Linguistica, 17(1-2), 145–165.

Sincerely, 

Hank Schlinger

 ***

Act II: Howard Gardner Responds

Dear Dr. Schlinger,

I appreciate your taking the time to write to me.

Clearly, we have very different views of science. As I understand it, for you science is totally experimental and good science has to change the world, hopefully in a positive direction.

I have a much more capacious view of science—going back to its original etymology as “knowledge.” There are many ways to know the world and that includes many forms of science. Much of Einstein’s work was totally theoretical; Darwin’s work was primarily observational and conceptual; whole fields like astronomy (including cosmology), geology, and evolutionary biology do not and often cannot carry out experiments.

An even more fundamental difference: I basically accept Thomas Kuhn’s argument, in The Structure of Scientific Revolutions, that the big changes in science involve the adoption of fundamentally different questions and even fundamentally different views of the world. Physics in Aristotle’s time turns out to have been a wholly different enterprise than it was for Newton; Einstein, and then quantum mechanics, entailed paradigm shifts again. A similar evolution/revolution occurred in other fields, ranging from biology to geology.

In the field that we both know—psychology—there were what are often called mini-paradigm shifts: from the associationism and structural-functionalism of the nineteenth century, to the behaviorism of the early decades of the 20th century, to the cognitive revolution (which I chronicled in The Mind’s New Science), and now, again, to the emergence of the cognitive neurosciences, including psychology.

These paradigm shifts occur for many reasons—and the shifts are not all progressive—but they affect what promising younger scientists (whether theoretically or empirically oriented) consider to be questions/problems worth investigating and how they proceed to investigate them.

It’s in this spirit, and on the basis of this analysis, that I, and many others, claim that over the last several decades, the behaviorist approach was replaced by a cognitive approach to psychological (and related) issues and questions. Neither Skinner nor Chomsky caused this change; but they serve as convenient “stand-ins” for a process that involved many scientists doing many kinds of theoretical and empirical work in many societies.

Turning to your specific point, neither I (nor, I believe Chomsky) dismiss the belief that one can affect behavior by rewards and punishment. Indeed, nearly everyone in the world believes this—including the proverbial grandmothers. From our perspective, the behaviorist approach has two crippling difficulties:

  1. When results come out differently than anticipated—for example, behavior changing for all time because of one positive or negative experience or behavior failing to change despite several experiences—then the analysis is simply reconfigured to account for the results. If a behavior changes, then it must have been reinforced. In that way, as with psychoanalysis, it becomes circular.
  2. While the experimental analysis of behavior may explain certain aspects of verbal behavior, it leaves out what many of us consider to be the most interesting and important set of questions: what is language, how does it differ from other human processes and behaviors, how do we account for the universals of language as well as the speed and similarity with which languages are acquired, despite their superficial differences.

None of this should be seen as an indication that your own work is anachronistic or as a critique of the work per se—but it is a claim that the world of science moves on and that what was on center stage in the U.S. (and the Soviet Union) seventy years ago is now decidedly a side show.

I may post parts of our exchange on my website. Please let me know if you prefer to be identified or not.

Sincerely,

Howard

*** 

ACT III: Communication from Noam Chomsky

Thanks for letting me see the exchange. I have a different view of what an experiment is. Take standard elicitation of the judgments about grammatical status and interpretation, e.g., the example that apparently troubled him: “colorless green ideas….”, “revolutionary new ideas…”, “furiously sleep ideas green colorless,” etc. – the kind of judgments that litter my papers and all papers on linguistics. Each is an experiment, in fact, the kind of experiment familiar for centuries in perceptual psychology. By now they have also been replicated very carefully by controlled experiments, e.g. Jon Sprouse’s, which show that the judgments used as illustrations in standard texts have about 98% confirmation under carefully controlled experiment. Furthermore, there is experimental work of the kind that Schlinger would regard as experiment under his narrow view, in psycholinguistics and neurolinguistics, confirming many of the conclusions drawn in theoretical work based on the usual kinds of (highly reliable) elicitation experiments, e.g. work showing crucially differential brain activity in invented languages that do or do not conform to deep linguistic universals.

In contrast, work in the Skinnerian paradigm has yielded essentially nothing involving language or other domains related to human (or even animal) higher mental processes. Or for that matter anywhere apart from extremely narrow conditions.

I always felt that the death-knell for Skinnerian (and indeed most) behaviorism was Lashley’s serial order paper, apparently ignored (as far as I could determine then, or have since) until I brought it up in my review. And the last nail in the coffin should have been Breland-Breland on instinctual drift. And shortly after a mass of work by others trained within that tradition: Brewer, Dulaney, by now too many others to mention.

Noam

 ***

ACT IV: Hank Schlinger’s Further Comments

Dear Howard,

Again, thank you for your reply. I appreciate the opportunity to have this exchange. Below are my comments.

1. Yes, we have different views of science, but you misread my view. I do not think science is or should be totally experimental, but I do believe that the natural sciences—and you, or other psychologists, may not want to include psychology in that exclusive club (see below)—have proceeded first by experimentation, the results of which led to laws and then theories, which were used to understand and make predictions about novel phenomena. And, while the goal of science is not necessarily to change the world, the natural sciences, through experimentation, have enabled us to cure and prevent diseases, for example, and to develop technologies that have dramatically changed our world, in many instances, for the better.

1a. Einstein’s theoretical work was based on the experimental foundation of physics. And while much of Darwin’s work was observational, he also conducted experiments, and his thinking was informed by experimental biology.

1b. It is true, as you say, that astronomers, geologists, and evolutionary biologists in some cases may not be able to conduct experiments, though sometimes they do—and must. But their theoretical work is predicated on the discovery of laws through experimentation with things here on earth that are observable, measurable, and manipulable. Otherwise, they are no better than philosophers.

2. I know you have written about the so-called cognitive revolution; I have your book. I say “so-called” because one psychologist’s cognitive revolution is another psychologist’s cognitive resurgence (Greenwood, 1999), myth (Leahey, 1992), or even rhetorical device (O’Donohue & Ferguson, 2003). As Leahey (1992) points out, “But we need not assume that Kuhn is good philosophy of science, and instead rescue psychology from the Procrustean bed of Kuhnianism. His various theses have been roundly criticized (Suppe, 1977), and the trend in history and philosophy of science today, excepting Cohen, is toward emphasizing continuity and development instead of revolution.” (p. 316).

3. As for the claim by you and other cognitive revolution proponents that “the behaviorist approach was replaced by a cognitive approach to psychological (and related) issues and questions,” not all cognitive psychologists adhere to that position. The cognitive psychologist Roddy Roediger (2004) called it a “cartoon view of the history of psychology.” That, plus the frequent statements by cognitivists that Chomsky’s review of Skinner’s Verbal Behavior not only demolished the book but behaviorism as well, remind me of the real fake news spewed by Fox News, and now Trump, that is accepted as truth because it is repeated so often. It’s a bit like saying that humans evolved from apes, ignoring that apes still exist. Yes, the predominant view among psychologists is a cognitive one, but that has always been the case. And behavior analysis still exists. The idea that there ever was a behavioristic hegemony is absurd. Even some of the so-called behaviorists, such as Tolman and Hull, were barely distinguishable from today’s cognitive psychologists.

4. Calling the results of decades of systematic experimentation on operant learning—which, by the way, is promoted in almost every introductory psychology textbook I have ever seen as the only method to discover cause and effect—“rewards and punishment” is like calling the centuries of experimental work which led to the theory of gravity “apples falling from trees,” which “nearly everyone in the world believes …including the proverbial grandmothers.” That fails to appreciate or even understand what systematic experimentation contributes to our understanding and, yes, knowledge, of the world.

5. Your depiction of the “two crippling difficulties” of the behaviorist approach is simply a caricature created by cognitivists to justify the necessity of their (the cognitivists’) anachronistic, dualistic view of psychology. Your first difficulty, offered without supporting references, remains an unsupported assertion. And numerous behavior analysts, starting with Skinner himself, have dealt effectively with your second difficulty. The fact that cognitivists refuse to be convinced is the real issue.

6. Back to the beginning, we—and I mean you and I as stand-ins for cognitive and behavioral psychologists—do have different views of science. My science is importantly based on, but not limited to, experimentation. In other words, going back to Watson’s (1913) call to action, a natural science. Yours is apparently based mostly on reason and logic (a rationalist position, like Chomsky’s), and as Skinner once wrote (in a book apparently relegated to the historical trash heap by the cognitivists’ hero—Chomsky) about appealing to hypothetical cognitive constructs to explain language behavior, “There is obviously something suspicious in the ease with which we discover in a set of ideas precisely those properties needed to account for the behavior which expresses them. We evidently construct the ideas at will from the behavior to be explained. There is, of course, no real explanation” (p. 6). This, in a nutshell, is the weakness of the cognitive approach.

As an editor of a mainstream psychology journal recently said in reply to a colleague of mine who wrote in his submission that “if psychology is to be a natural science, then it has to study the actual behavior of individual organisms,” “Why should psychology aspire to become a natural science? Psychology is a social science.”

This seems to be a (or the) critical difference between our respective disciplines.

Yours truly,

Hank

P.S. Here are a couple of more recent (than Kuhn) approaches to the philosophy of science.

Hull, D. L. (1988). Science as a process. Chicago: University of Chicago Press.

Hull, D. L. (2001). Science and selection: Essays on biological evolution and the philosophy of science. New York: Cambridge University Press.

———————————-

References

Greenwood, J. D. (1999). Understanding the cognitive revolution in psychology. Journal of the History of the Behavioral Sciences, 35, 1-22.

Leahey, T. H. (1992). Mythical revolutions in the history of American psychology. American Psychologist, 47, 308-318.

O’Donohue, W., & Ferguson, K. E. (2003). The structure of the cognitive revolution: An examination from the philosophy of science. The Behavior Analyst, 26, 85-110.

Roediger, H. L. (2004). What happened to behaviorism? APS Observer (https://www.psychologicalscience.org/observer/what-happened-to-behaviorism)

 ***

ACT V: Howard’s End (for this play…)

Dear Hank,  

Thanks for continuing our conversation. Here are some quick responses:

1. We do have different views of science but, in your recent note, you put forth a more reasonable perspective. You say that the natural sciences proceed from experimentation. I’d rather contend that science can proceed from observations, from experiments, from interesting ideas, and even from grand theories. The “conversation” is continuous and can go in many directions.

2. On the nature of experiments, Noam Chomsky makes an important point. There is not a sharp line between observation, informal investigation, and more formal experiments. When it comes to judgments of grammaticality, there is no need for large pools of subjects, control groups, or high-powered statistics. Almost all judgments are pretty clear—and the few ambiguous cases can be investigated more systematically, if there is reason to do so. And of course, modern linguistic theory has generated thousands of experiments, reported in dozens of journals.

The most difficult question you raise is whether there has indeed been a revolution, and whether Kuhn’s formulation helps us to understand what happened as cognitivism moved center stage (to continue my dramaturgical metaphor) and behaviorism became a sideshow. There is no way to ‘test’ these propositions. The discipline that will eventually determine whether my account of the last century, or your account of the last century, is more accurate is intellectual history or the history of science.

Indeed, we can each quote many contemporary scholars and observers who support ‘our’ respective positions, but in the end, the judgments that matter will be made by history.

3. That said, I don’t accept your contention that I am a rationalist and not an empiricist. The record does not support your contention (hundreds of empirical and experimental studies over almost five decades). In more recent years, I do think of my work as social science rather than natural science, but social science has empirical standards and measures as well, and I use them as rigorously as appropriate.

Best,

Howard

 ***

EPILOGUE:

With the fifth act completed, the curtain descends on our conversation… at least for now. But I’d be delighted if others who read the exchanges would join in.

Categories: Blog

Comment on “Three Cognitive Dimensions for Tracking Deep Learning Progress”

November 14, 2017 - 11:47am

The original metaphor for each of the several intelligences was that of a computer, or a computational device. I sought to convey that there exist different kinds of information in the world—information deliberately more abstract than a signal to a specific sensory organ—and that the human mind/brain has evolved to be able to assimilate and operate upon those different forms of information. To be more concrete, as humans we are able to operate upon linguistic information, spatial information, musical information, information about other persons, and so on—and these operations constitute the machinery of the several intelligences.

Even at the time that the theory was conceived—around 1980—I was at least dimly aware that there existed various kinds of computational processes and devices. And by the middle 1980s, I had become aware of a major fault-line within the cognitive sciences. On the one hand, there are those who (in the Herbert Simon or Marvin Minsky tradition) think of computers in terms of their operating upon strings of symbols—much like a sophisticated calculator or a translator. On the other hand, there are those who (in the David Rumelhart or James McClelland tradition) think of computers in terms of neural networks that change gradually as a result of repeated exposure to certain kinds of data presented in certain kinds of ways. A fierce battleground was the rivalry between these accounts of how human beings all over the world master language so efficiently—but the debate eventually played out with respect to many kinds of information.
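The fault-line between the two traditions can be made concrete with a small sketch. (This is my own illustration, not drawn from the post; the toy grammar and training data are invented.) A symbolic system applies an explicit rule to strings of symbols, while a one-neuron network in the Rumelhart and McClelland spirit is given no rule at all and instead adjusts its weights gradually through repeated exposure to labeled examples:

```python
# Toy contrast between the two computational styles described above.
# The grammar and the data are invented purely for illustration.

DETS, NOUNS = {"the", "a"}, {"dog", "cat", "bird"}

def symbolic_grammatical(phrase):
    """Symbol manipulation: apply the explicit rule S -> Det Noun."""
    words = phrase.split()
    return len(words) == 2 and words[0] in DETS and words[1] in NOUNS

# Connectionist style: a single "neuron" that stores no rule,
# only weights shaped by repeated exposure to examples.
VOCAB = sorted(DETS | NOUNS)

def features(phrase):
    """One-hot encode the first and second word of a two-word phrase."""
    w1, w2 = phrase.split()
    x = [0.0] * (2 * len(VOCAB))
    x[VOCAB.index(w1)] = 1.0
    x[len(VOCAB) + VOCAB.index(w2)] = 1.0
    return x

TRAIN = [("the dog", 1), ("a cat", 1), ("the bird", 1),
         ("dog the", 0), ("cat a", 0), ("bird cat", 0)]

weights = [0.0] * (2 * len(VOCAB))
bias = 0.0
for _ in range(100):  # many passes over the data; weights change gradually
    for phrase, label in TRAIN:
        x = features(phrase)
        score = sum(w * xi for w, xi in zip(weights, x)) + bias
        error = label - (1 if score > 0 else 0)
        weights = [w + 0.1 * error * xi for w, xi in zip(weights, x)]
        bias += 0.1 * error

def network_grammatical(phrase):
    """The learned judgment: no stored rule, only a weighted sum."""
    x = features(phrase)
    return sum(w * xi for w, xi in zip(weights, x)) + bias > 0
```

After training, the network agrees with the rule on the phrases it has seen, yet nowhere does it contain anything resembling "Det Noun": the knowledge is distributed across the weights. That difference in mechanism, despite similar behavior, is a miniature version of what the two camps argued over.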

Fast forward thirty years. Not only do we have computational devices that work at a speed and with amounts of information that were barely conceivable a few decades ago. We are also at the point where machines seem to have become so smart at so many different tasks—whether via symbol manipulation or parallel distributed processing or some other process or processes—that they resemble or even surpass the kinds of intelligence that, since Biblical times, we have comfortably restricted to human beings. Artificial intelligence has in many respects (or in many venues) become more intelligent than human intelligence. And to add to the spice, genetic manipulations and direct interventions on the brain hold promise—or threat—of altering human intelligence in ways that would have been inconceivable… except possibly to writers of science fiction.

In an essay “Three Cognitive Dimensions for Tracking Deep Learning Progress,” Carlos Perez describes the concept of AGI—self-aware sentient automation. He goes on to delineate three forms of artificial intelligence. The autonomous dimension reflects the adaptive intelligence found in biological organisms (akin to learning by neural networks). The computation dimension involves the decision making capabilities that we find in computers as well as in humans (akin to symbol manipulation). And the social dimension involves the tools required for interacting with other agents (animate or mechanical)—here Perez specifically mentions language, conventions, and culture.

These three forms of artificial intelligence may well be distinct. But it is also possible they may confound function (what a system is trying to accomplish) and mechanism (how the system goes about accomplishing the task). For instance, computation involves decision making—but decision making can occur through neural networks, even when intuition suggests that it is occurring via the manipulation of symbols. By the same token, the autonomous dimension features adaptation, which does not necessarily involve neural networks. I may be missing something—but in any case, some clarification on the nature of these three forms, and how we determine which is at work (or in play), would be helpful.

Returning to the topic at hand, Perez suggests that these three dimensions map variously onto the multiple intelligences. On his delineation, spatial and logical intelligences align with the computational dimension; verbal and intrapersonal intelligences align with the social dimension; and, finally, the bodily-kinesthetic, naturalistic, rhythmic-musical, and interpersonal intelligences map onto the autonomous dimension.

I would not have done the mapping in the same way. For example, language and music seem to me to fall under the computational dimension. But I applaud the effort to conceive of the different forms of thinking that might be involved as one attempts to account for the range of capacities of human beings (and, increasingly, other intelligent entities) that must accomplish three tasks: carry out their own operations by the available means; evolve in light of biological and other physical forces; and interact flexibly with other agents in a cultural setting. I hope that other researchers will join this timely effort.

(I thank Jim Gray and David Perkins for their helpful comments on this piece.)

Categories: Blog

The Professions: Can They Help Us Invigorate Non-Professional Education?

November 13, 2017 - 1:42pm

For many years, within the United States, the phrases “higher education” and “the professions” have evoked different associations. When you go to a four-year college to pursue higher education, you are supposed to sample broadly across subject matters and disciplines; hone your speaking and writing abilities; and master critical (and perhaps creative) thinking.

In contrast, when you seek training for a profession or vocation, traditionally after you have graduated from a four-year college, you master those skills and make those networking connections that will help you to succeed as a physician, lawyer, professor, social worker, or architect. Of course, in many other countries, you typically choose a profession after completing secondary school; and it is assumed (rightly or wrongly) that you have already accrued those skills and understandings that many Americans pursue in college.

Indeed, this “division of labor” has occurred in my own thinking and my own blogging. Until this past spring, I wrote a bi-weekly blog called “The Professional Ethicist.” In mid-2017, I suspended that blog so as to launch a new one called “Life-Long Learning.” Ultimately, this new blog, which you are reading, will focus increasingly on higher education, and specifically higher education of the non-vocational variety—think Princeton, think Pomona.

Yet, nowadays, as I have detailed on both blogs, the educational and vocational landscapes are undergoing tremendous changes, at a very rapid pace. In the case of the professions, an ever-increasing amount of the routine work is now being executed by smart apps or programs or by trained paraprofessionals; accordingly, the survival of “professions as we have known them” is by no means assured. With respect to higher education, the costs are so great, and the anxieties about finding work post-college are so acute, that the very phrase “liberal arts” is considered toxic. The search for vocational justifications of curricula (and even of extra-curricular activities) is ubiquitous.

Amidst this rapidly shifting domain, an understanding of professions may prove helpful to both sectors. On my definition, professionals are individuals who have acquired expertise in a practice valued by a society; are able to make complex and often vexing judgments in a fair and disinterested way; and, as a consequence of their expertise and their ethical fiber, are offered and merit trust, status, and reasonable compensation.

Though professions were at one time cordoned off from the rest of society, that situation no longer obtains. We can argue about whether that shift constitutes a desirable state of affairs. But I’ve come to realize that ultimately we would like expertise and ethics from every member of society, from every citizen. The phrases, “She is acting like a professional” and “How professionally done!” should be applicable to any worker, whether a plumber or waiter, a minister, musician, or mogul. Indeed, I would not want to live in a society where the notion of “behaving professionally” had lost its meaning.

How does this formulation link to higher education? Under reasonable conditions, any young person who has succeeded in secondary school and is attending college should be on her way to disciplined thinking—that is, being able to analyze issues and think in the way of a scientist (e.g. a biologist, a chemist), a social scientist (e.g. an economist, a psychologist), or a thinker in the humanities (e.g. a historian, a literary or artistic connoisseur). Mastering a particular discipline is not nearly as important as apprehending the ways in which various spheres of scholarship make sense of the world. College should be the time at which—and the place in which—students acquire ways of thinking that are elusive for most individuals until later adolescence. As possible candidates for these modes, I would suggest philosophical thinking (what are the enduring conundra that humans have struggled with, how have we done so, and how have we fared); interdisciplinary and synthetic thinking (how do we combine insights from, say, history and physics, in thinking about the concept of time); and an understanding of semiotics (what are the different symbol systems, ranging from written language to computer codes, by which individuals have captured and communicated their knowledge, and how do those symbol systems work). In future writings, I’ll flesh out these requirements.

By the completion of such a secondary (high school) and tertiary (college) education, students should know what these forms of expertise are like and also know, if not have mastered, the sector(s) where they would like to be employed, at least for a while. They are on the way to achieving one leg of professionalism—call it relevant knowledge and skills.

Which leaves the second facet: being aware of vexing problems, having the motivation to tackle them, and being committed to doing so in a disinterested and ethical manner. One established way of gaining this expertise is to work as an intern or apprentice in an office or company that exemplifies and transmits an impressive professionalism. (Conversely, an internship or apprenticeship where professionalism is routinely flouted portends future failure in thoughtful tackling of tricky dilemmas.)

My “modest proposal” is that the college itself should serve as a model of professionalism. Teachers, administrators, and other adult members of the institution should hold themselves to high standards, expect those standards to be observed by others, and hold accountable members of the community who disregard or undermine the standards. And going beyond specific individuals, the rules, structures, practices, and—an important word—the norms of the college community should capture and embody the values of a profession. In this case, the profession happens to be education and/or scholarly research. But colleges are inhabited by a range of professionals (from lawyers to engineers to ministers to nurses and physicians); accordingly, the community should model the stances of professions in general, and, equally important, what it means to behave in a professional manner.

This last paragraph may sound idealistic, if not “holier than thou”; but I mean it, seriously and literally. I have observed enough workers in numerous institutions over many years to feel confident in saying that some embody professionalism, while others flout it, knowingly or unknowingly. Moreover, ill-chosen leadership can rapidly undermine the professionalism of an institution (and if you think I have in mind the current executive branch of the federal government, I won’t dissuade you), and it’s much more difficult to resurrect professionalism than to wreck it.

The very fragility of many of our professions and many of our colleges may harbor a rare opportunity. If we were to take (as a primary mission) crafting our institutions of higher education as laboratories for the professions, we might end up strengthening both. And, indeed, if we look at the earliest years of our colleges in the United States, the picture I’ve presented here would be quite familiar. It’s perhaps worth noting that in the 17th century, it was the ministry for which college students in the American colonies were being prepared.
