John Hennessy is the chairman of Alphabet, the parent company of Google, and the former president of Stanford. He’s also an acclaimed computer scientist and cofounder of MIPS Technologies. He’s just published a fascinating new book, “Leading Matters,” and he agreed to sit for an interview about his experiences.
Nicholas Thompson: In the book you talk about a growing leadership crisis, and you mention some industries that have been faltering. But you don't mention Silicon Valley. Did you leave it out deliberately, or do you think there is a leadership crisis in Silicon Valley?
John Hennessy: The valley has its share of leadership crises. And I think there's also a growing challenge that these companies have now gotten to the size where their influence on the public is much larger. That's creating new leadership challenges that I think are going to require growth and new approaches in our leaders.
NT: What kind of new approaches in our leaders?
JH: Well I think we're going to have to think more carefully about some of the things that have become idioms in the valley, like, you know, “Move fast and break things.” When the number of people impacted when you say “break something” or the impact on their lives is very large, then we've got to think a little differently about that. It’s fine for a little tiny valley company maybe to do that, but it’s certainly not fine for Facebook or Google or Twitter.
So I think that's going to require more self-reflection and more examination of the impact of decisions. And it’s going to require some more long-term thinking. Short-term thinking, which has entered the tech sector, not just the business sector, I think can lead to serious mistakes.
NT: A lot of the people who lead the companies that are implicitly mentioned in what you just said came through Stanford. The head of product at Facebook was a student at Stanford when you were president. Instagram was started by students who were at Stanford when you were president. Snapchat was started at Stanford when you were president. Is there anything else you wish they had all learned when they were students there?
JH: At the time that all those people were undergraduates here, we didn't have a requirement on ethical reasoning and ethical decision-making. We now have a requirement for all undergraduates for courses in that. So I think that reflects a growing realization that many people who go into leadership positions don't have adequate preparation in thinking about these problems. And my view has been what tends to happen then is, people are put in situations that require rapid decision making and they have no background in how to deal with some of the ethical issues in that setting. They don't have a reference; they don't have a starting point. And because it's real time, they make mistakes. So I think our goal is to help educate them better so that they'll be a little more reflective and a little more cautious and maybe think about things from a slightly different perspective than they would have otherwise.
NT: I don't want to single out the four executives I just mentioned. But shouldn’t all the Stanford graduates who have become extremely important in Silicon Valley have gotten a lot of that anyway through the Stanford education? For starters, you have to take liberal arts courses as a freshman.
JH: You do. And I think some of those courses help develop aspects of people's perspective. For example, I think for a long time our curriculum has done a good job of developing an appreciation for diversity and different viewpoints. But I don't think we had a course focused specifically on thinking about ethical considerations and how they might play out over time. And that's something I think is necessary given the world we live in.
If you think about that young group of undergraduates coming in, a bunch of them are going to be doctors; a bunch are going to become business leaders; a bunch are going to be tech leaders; a bunch will go into politics. In all those areas, understanding ethical conduct and the consequences of important decisions seems absolutely crucial to me.
NT: I certainly do not disagree there. When did you start to realize that students were coming through, earning CS degrees, heading off into positions of great power but not having studied enough ethics?
JH: We've had an optional course for a long time that Eric Roberts, who was a longstanding associate chair for undergraduate education, put in place. But we only made it a university requirement probably about eight or nine years ago. And it applies to everybody.
NT: And there are particular ethics classes that are soon going to be offered, or perhaps have already started, in the CS department?
JH: Correct. So there are specific courses that try to bring to bear core ethical principles that arise in the tech sector. So let's just take one that's now the topic of hot discussion: bias in machine learning systems. Of course that bias essentially comes from the fact that the training data has bias, and it reflects all kinds of biases in society. So you, for example, take police records and you use those to train a system for future criminal cases. You're going to start arresting people on the basis of the fact that they resemble other people with existing criminal records. And that’s going to lead to all kinds of prejudicial and discriminatory things. We've got to train people to look for that, so when they build those next-generation machine learning systems, they are aware of that bias, and they pay attention to it and compensate for it.
NT: So Stanford will be teaching people both how to identify the biases and then also how to weigh different ethical factors when responding to them?
JH: Yes. And how to build systems that begin to compensate for that bias in the data.
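Hennessy doesn't describe a specific method here, but one widely taught compensation technique is "reweighing": give each training example a sample weight so that group membership and outcome label look statistically independent, neutralizing a group's over-representation among, say, past arrests. The sketch below is purely illustrative (the function and the toy "record" data are not from the interview), and real systems would apply these weights inside a model's training loop.

```python
from collections import Counter

def reweighing_weights(groups, labels):
    """Per-example weights that make each (group, label) pair count
    as if group and label were independent: weight = expected count
    under independence / observed count."""
    n = len(labels)
    group_counts = Counter(groups)
    label_counts = Counter(labels)
    pair_counts = Counter(zip(groups, labels))
    weights = []
    for g, y in zip(groups, labels):
        expected = group_counts[g] * label_counts[y] / n
        weights.append(expected / pair_counts[(g, y)])
    return weights

# Toy data: group "A" is over-represented among positive labels.
groups = ["A", "A", "A", "B", "B", "B"]
labels = [1, 1, 1, 1, 0, 0]
w = reweighing_weights(groups, labels)
# After weighting, the positive examples from each group carry
# equal total weight, so a model trained on them sees no group skew.
```

With these weights, group A's three positive examples each count 2/3 while group B's single positive example counts 2, so both groups contribute the same weighted mass of positives.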
NT: One of the critiques is that the reason the data is biased is because the people who built the systems are mostly men, many from similar backgrounds. How has Stanford done at diversifying its CS department? Or let me put it a different way: My understanding is that Stanford did a tremendous job under you, but that it's not close to 50-50 male-female and that there are still racial disparities as well.
JH: Yeah we're not close to 50-50, but I think after a long winter where women were not coming into computer science and being successful in the discipline, I think we’ve really made substantial progress in the last five years. So we're not 50-50 but probably the incoming class is roughly 35 percent women.
NT: How are you able to make that progress?
JH: The women were the key drivers. They formed interest and support groups. The hard thing is getting what you might think of as the icebreakers: the people who are on the leading edge. Because once it’s 35 percent, you know, you don't feel completely isolated; you don't feel like there's nobody like me in this class because there are only two women in the entire class. Go back to when Marissa Mayer was a major. You had these large classes with 100 students, and five, six, maybe 10 of them were women. And the women formed these support groups and study groups. And that really began to move the needle.
I think the other thing that happened is that the role of technology, quite frankly, changed. If you go back to the '80s, for so many young people who grew up and got interested in computing, their initial addiction was computer games, most of which involved killing monsters or people. And that was not something that attracted lots of young women. Now we've moved to social media, with technology playing a big role in people's lives. The students have started a course, Computer Science for Social Good: let's get together and work on technologies that can help improve the world and make it better. And I think that's inspired a lot of women to come into the field.
NT: And does the change in the statistics of the undergraduates match a change in the statistics of the faculty?
JH: Slowly. Faculty turnover is very slow. We're hiring women at a higher rate than they are represented in the pool. But it still takes a long time. Just to put it in context: when I was the provost, Stanford appointed its first woman dean of one of the seven schools, the dean of the law school at the time. By the time I was leaving as president, four of the seven schools had had women deans. And you know, we currently have a woman provost, we have a woman dean of engineering, we have a woman dean of the law school, we have a woman dean of humanities and sciences. Going back to 1999, we had had one woman dean in the entire history of the university. So that's a real, important change. And I think role models do matter and they inspire people. We still have a lot of work to do on getting more underrepresented minorities on the faculty. That's still a very slow process.
NT: What else is Stanford doing to prepare students for a world of job churn and artificial intelligence?
JH: We're trying to educate them and equip them with the tools they'll need given that jobs are going to change a lot. I don't believe this is going to lead to a nuclear winter in terms of no job possibilities, but it is going to lead to a lot of changes in what people's jobs are, re-skilling of some traditional roles as computers learn to do those tasks equally well, and new opportunities.
We’ve been talking about medicine and what the impact might be as diagnostic assistance technologies arrive. Let’s say computer programs that read radiographs can perform as well as radiologists. The opportunity is to then take the physicians, pull them away from looking at a machine, and engage them more with the patient. I know if I've got a serious disease, I don't want the physician looking at the computer, reading something off the screen to me. I want them to take what the computer can do well, then use that data, look me in the eye, and tell me: OK, you've got a serious disease; here's what we're going to do about it. I think we just need to re-conceive how people do their work and how they take advantage of technology. We're thinking about using artificial intelligence to accelerate and amplify what humans can do, not replace what they can do.
NT: I want to ask one question about distance education. You have a lot in the book about expanding the number of students who come to Stanford. There's a lot about the potential New York campus. There's relatively little about distance education, which is something I know you had talked about a lot previously. What is your view now on where distance education is going?
JH: My view is we have found a sector where distance education works and works really well. And it's continuing education, lifelong education, focused largely on people in professional roles where they either need to learn some new technology, some new capabilities, some new management skills, or they're doing a job switch. But they are characterized by people who already have a degree and are looking to enhance their skills. They're very focused, they are careful with the use of their time. They really want to learn and master something. So they find it a really great opportunity. And we’ve got lots of people doing that. You run a course now on blockchain and cybersecurity, and you get hundreds of people who want to come in and take that online course, who have heard about this technology; they’re practitioners in this field, they're interested in seeing how it might apply in their context, and they are extremely successful because for them learning that material is important. I think the place where we’ve struggled more is to see what role it can play in enhancing or reaching other nontraditional students.
You know, one that I've always wanted to see addressed is how you deal with remedial coursework, given the number of students in the U.S. who need remedial coursework before they're really ready to go to college. That's something we should be trying to do online. Figuring out how to make that work and make it compelling has proved to be much harder than we thought.
NT: Why has that been harder than you thought?
JH: That’s a really good question, and the answer is that learning is a much more complicated, human-centered process than we understood. It’s not just a question of watching a video, letting it wash over you, and having it happen. It's about motivation. It's about people, individuals, and their ability to focus on something, to try something hard, to get a problem wrong and then go back and fix it and understand the material. And so, to support that, our learning systems are going to have to be a lot more sophisticated than we initially thought.
NT: Last question. Of the many lessons in your book, what is the one that you think is most important for Silicon Valley right now? If all the leaders could read one chapter, which one?
JH: Leadership as service. If you lead a large organization in the valley now, or even a small one, you're serving your investors, your shareholders, you're serving your employees, you're serving your user base. It's not about you. It's about what you help accomplish for all those people.