Walter Isaacson: “The Innovators” NYC | Talks at Google

We are absolutely delighted to have Walter
Isaacson here with us today to talk about his latest
book, “The Innovators: How a Group of Hackers,
Geniuses, and Geeks Created the Digital Revolution.” I’m going to turn
it over to Walter to say a few things
about his book. But we have two
microphones and we’d love for folks to really
interact with Walter. WALTER ISAACSON: I hope we
can make it more interactive, but I will say a few
words, one of which is you’re very lucky
to be at Google. The book ends with
Google but in some ways, the whole revolution begins
with Google, as well. Because Vint Cerf is one of
the great heroes in this book. In fact, I did a Google
meeting in Washington last week with Vint and Bob Kahn about
the creation of the Internet Protocols. And one of the things that
has sort of surprised me about the digital
revolution is most of us have some sense of the
history of many revolutions. I mean, we know of George
Washington and Ben Franklin in the American Revolution,
or the scientific revolution– the role Galileo plays– or
the Industrial Revolution. But the history of
the digital revolution hasn’t really been written. And it’s got all these
unbelievably fascinating characters in it. Most of them you would have
heard of, but people you know wouldn’t have heard of– people
like Doug Engelbart, Alan Kay, Vint Cerf. And each one of them builds on
the creativity of the others. And they create teams
to do creativity. There’s a collaborative
DNA infused in the digital
revolution, partly because the tools of
the digital revolution were invented collaboratively–
like the ARPANET and the Internet Protocols
as well as the computer– and partly because the digital
revolution is about networking and connectivity,
and thus it allows for a collaborative process. In fact, when I was
doing this book, I was writing about how
the genetic code of the internet has infused into it this
collaborative nature, partly because it was originally
created as the ARPANET to allow time-sharing
of computers, as well as sharing of research
and collaborating on research. And I said, well, you
know, it’s strange that since Gutenberg invented
the movable-type printing press, books have been sort
of handed down by one author. Let me see if I can
use the internet even in doing this book. So I took chapters of this
book and posted it a year ago, year and a half ago,
on places like Medium. I’d been working
with Ev Williams, because before Twitter, he
had invented Blogger, which was an important step in
taking the World Wide Web from being a publishing medium
to being a community medium. And he had invented Medium. And I put it up and put it
on various wikified sites, so people can make notes on it. Now we’ve done
that for centuries. That’s why Ben Franklin
invented the postal service, so that his scientific and
philosophical papers could be circulated and people
could comment on them. And whenever I wrote a previous
book, or wrote for “Time,” you’d send your
article or book out for what we called
comments and corrections. And you’d send it
to 20, 40 people. The first day I put the chapter
on the personal computer and the California
cauldron of tribes in the early ’70s in which the
personal computer was born, I got 7,800 comments–
which is obviously much more than I’d ever gotten
before by circulating my papers to my friends–
including people who were there at the revolution. I had talked about the
“Whole Earth Catalog,” and the Demise Party for
the “Whole Earth Catalog,” and how that money from
that funded these community computer centers,
and how that was part of this cultural
stew that helped create the personal computer,
taking back access to tools. All of a sudden there’s
Stewart Brand saying, well you know, you
got it kind of wrong. We were dropping acid at the
Whole Earth Demise Party. And so we had this rock
band and they kept playing. But in between, we had
this stack of $100 bills and we were trying to figure
out how to divvy it up. And you know, he gave
me the color on that. Liza Loop, who I’d
never heard of, said wait, do you
not understand? It wasn’t just hippies
like Stewart Brand. There were those of us who
were community organizers. We were creating the
Community Memory Project. That’s how online community
bulletin boards really started. So now she’s in the book. And now I put all this stuff in. I even put in a whole
section on Dan Bricklin. You know, one of the things
that’s a problem in this book is there’s so much
you’ve got to leave out. But I realized that the
creation of application software was something I hadn’t
delved into enough. And Bricklin had done VisiCalc. And all of a
sudden he’s giving me all of his papers, when he was
sitting at the Harvard Business School trying to visualize
how he’d do VisiCalc. And so that section’s in. So the notion of
collaboratively creating a book seemed to come
naturally in a book about the digital revolution. And I hope someday– I mean,
Google Docs is a first step. Medium is a first step. Wikis are a step. I hope we will have ways
and tools in which narrative history can be curated
by somebody like myself but not written by me, where
every one of you who invents some particular
product at Google can say, OK, I was in on this. Let me upload my oral history,
my documents, my sketches, whatever it may be. And we’ll have living history
books that are collaboratively produced. But mainly, that
type of projection into the future– and it
comes from Alan Kay and Xerox PARC, who said, the best
way to predict the future is to invent it. But the best way to
invent the future is to realize that each
new invention is built upon the trajectory
from whence we came. And I really wanted to do that
for the digital revolution. It starts– my book,
at least– starts with Ada Lovelace, who
comes up with the concept of the computer algorithm,
but also the concept of the general-purpose
computer, and more importantly, the concept of connecting
the humanities to technology. She’s Lord Byron’s
daughter, a poet. But her mother, who
doesn’t particularly like Lord Byron when
Ada was growing up, wants to make sure she doesn’t
become a poet, so has her tutored in mathematics, mainly–
as if being tutored in math were an antidote to
becoming poetical. So she develops what she
calls poetical science. She combines the two, and looks
at how punch cards are being used in the looms of industrial
England to create tapestries. And her father was a Luddite. And I mean that literally. Lord Byron’s only speech
in the House of Lords was defending the followers
of Ned Ludd, the person who was smashing the mechanical
looms in England, because he thought they were putting
people out of work. But Ada loved the
looms, and the beauty of connecting the technology
and humanities together. And so she looked at her
friend Charles Babbage’s analytical engine, which was a
numbers calculator, basically to do differential equations. And she said with
punch cards, it can do not just
numbers, but anything that can be notated in symbols. Meaning, as she put
it, it can do art. It can do patterns. It can do words. It could even make musical
composition, something that would’ve made
Lord Byron blanch. And so that notion of a
general-purpose computer comes from her, but
also that notion of connecting the arts and
humanities side of our brain with the engineering and
tech side of our brain. And to me, that’s a thing
you see all the way through. And I’ll leap forward to
how it ends with Google, and then we can open
up for discussion. I really see that there
are two branches of thought in the digital revolution. There’s what I would call the
Ada Lovelace strand, which is the connection of
humans to their technology more intimately– connecting
humanities and technology. When she said, a machine will
be able to do everything, she then adds the caveat. She says, except, it will
never be able to think. It will not originate
thought or have imagination. Only humans will do that. And our machines will
amplify our imaginations and our creativity. And as partners in symbiosis,
technology and humans will work together. Exactly 100 years after that,
Alan Turing comes along. You’ll see the movie
in a few weeks, if you haven’t seen it already,
called “The Imitation Game.” He’s at Bletchley Park breaking
the German wartime codes. But he’s also, as he’s
using this machine to break human language
codes, thinking about the difference between
artificial intelligence and human intelligence. So he, like a good
historian, building on the inventions of others,
knows of Ada Lovelace. He’s read Ada’s notes on
Babbage’s analytical engine. And so he coins a phrase,
“Lady Lovelace’s objection,” to describe Ada Lovelace’s view
that machines will never think. And he says, how
do we know that? How would we know a
machine isn’t thinking? And he comes up with the
Imitation Game– which we now call the Turing
Test– which is, you put a machine in a room
and a human in a room, and you send in questions. And if after a
while you can’t tell the difference between
the machine and the human, then there is no
logical, sensible reason to say the machine’s
not thinking. You have no empirical data to
say the machine’s not thinking. Those of you who
did philosophy can try to knock down that
argument about consciousness and everything else. But leaving that aside,
that’s the other strand of the digital
revolution, which is those who believe
in machine learning and artificial intelligence. In your company, amongst your
three or four top people, Larry and Sergey and
Eric, there’s still that debate on how fast we
are doing machine learning. How are we going to create
robotics and machines that think without us? Versus, how do we intimately
connect humans to computers? I argue in the book–
not forcefully, because the book’s a narrative. It’s not preaching. It’s just a tale. But I just tell the
tale, which happens to tell you that ever since Alan
Turing came up with the Turing test, it’s always been,
20 years in the future there’ll be machines that can
think and leave us behind. The singularity will happen. I even go through
it– I tried not to embarrass some of my
colleagues in the media. Some are still around. But you can start in 1957
with the perceptron, one of the machines
that were supposed to mimic the neural
networks of the human brain. And it always says,
in 20 years, there’ll be machines that can
think without us. And every 20 years comes
along, and it’s always still another 20 years. You can read Ray Kurzweil. It’s always 20 years. Whether he’s telling you
in 1990 it will happen, or now it’s going to
happen, they always tell you it’s going to
happen in 20 years. There’s a wonderful
guy in the book, Lick Licklider, JCR Licklider. He comes up with the notion
of interactive computing, because he’s doing
an air defense system and you can’t batch process
when a missile may be coming. He also comes up with
graphical user interfaces, because you can’t make a mistake
and shoot down the wrong thing. He also comes up with what
he calls the Intergalactic Computer Network, because he
had a good sense of humor, and he had 23 air
defense stations that had to be connected. So when he goes to the
Pentagon and he funds it, it becomes the ARPANET,
then the internet. One of the things
Lick Licklider said is everybody keeps
telling me machines are going to think
without us, that we’re going to have robots and
artificial intelligence and machine learning. They always say it’s
happening in the future. And I say, in the
meantime, let’s just become more intimate
with our machines, connected more, have graphical
interfaces so we feel comfortable with them,
that sort of thing. Sometimes when I
say that the history of the digital
revolution has been the awesome success of the
Lady Lovelace strand, which is becoming more intimate–
and I get Google Glass this afternoon,
which is my next step to being more intimate
with my machines. I keep saying, that’s
been awesomely successful. And artificial
intelligence has been– there’s a wonderful line, not
about artificial intelligence but I use it about
that, which is, after decades of rampant growth,
artificial intelligence is now entering its infancy. It’s always 20 years away. And people say,
well, look at Google. Google is artificial– you
know, you can type in anything. And it’ll answer. It’s like a machine. It can pass, virtually,
the Turing test. First of all, it can’t. You can ask Google a
really difficult question that none of you
know the answer to, like what is the
depth of the Red Sea. Bing. Right up top. 5,267 feet. It knows it. Your smartest friends
don’t know that. But if you ask Google–
with all due respect– some question like can a
crocodile play basketball, you’re going to end up
with the Florida Gators schedule or something. But you ain’t going to
end up with an answer to that question. And those of us from Louisiana
know a crocodile’s not a gator. And so I say no. First of all, we haven’t gotten
to the point where our machine learning replicates, or tries
to replicate, the analog, messy, wetware of the human
brain in digital form. Secondly– and this is where
the book wraps it all together, with Larry and Sergey at
graduate school in Stanford. And when they decide to
create the algorithm, as you all know full
well, but most people don’t focus on– it’s
not a web crawler that goes around and decides
how to answer questions by some algorithm that does it. It crawls around and
collates, and brings together, billions of human judgments
made by real people every day when they put
a link on their website. It is the ultimate Ada Lovelace
integration partnership and symbiosis of human judgments
with machine algorithms. And so that is why Google
is so central to my book. And also why, 20 years from
now, either we will have robots and a singularity that
will have left us behind– in which case I will
apologize– or we won’t. And we’ll still be
following the trajectory of the digital revolution. And I’ll come back here
and say, I told you so. Anyway, thank you all. Let’s open it up, if we could. I want to tell you that Google
colleagues in Washington were not shy. They were peppering
me with questions. So do not let them down, please. Yes. Are you leaning
forward, or– Yeah. Oh, OK. AUDIENCE: Hi. So, what’s the biggest
surprise that you found when you were
researching this book? WALTER ISAACSON: I’ll
say this– not simply because you’re a woman–
but the biggest surprise was the pioneering role
of women in software. I knew about Ada. Actually, there’s a “New York
Times” piece by Nick Bilton a few weeks ago about my book,
in which I tell the story– and got it slightly
wrong, I’m sorry– which was that I learned about
Ada because my daughter, who was applying to college,
decided to do her entrance essay on Ada Lovelace. And I said, who’s Ada? I actually knew who Ada was, but
I couldn’t remember, frankly, what she had done, exactly. And so I got interested
in Ada Lovelace. But then I became
interested in the six women who programmed ENIAC. You know, boys with
their toys believed the hardware was the only thing. And so they left the
task that they thought was not quite as
important of programming it– replugging the cables
and all– to six great women mathematicians. And what surprised me– to get
more granular in the surprise– so I’m reading about Jean
Jennings Bartik and Frances Bilas, but also Grace
Hopper up at Harvard. And they all have PhDs
in math from the 1930s. One little surprise
was more women got PhDs in math– in absolute
number and in proportion– in the ’30s than during
the ’50s or ’60s. In other words,
women back then were much more advanced in
the world of mathematics than they later became. And I don’t know why the
backsliding happened. But one problem, I think, is
there were no role models. I mean, these six women who
programmed ENIAC didn’t become famous because partly,
it was wartime secrecy. And partly, they just– people
got written out of history. More women got degrees in
computer science in 1980 than got degrees in
computer science last year. In fact, the proportion
has been cut in half, from 38% to just over 17%. So all of that was weird to me. And I wanted to at least
explain how COBOL– all the great programming
languages– were done collaboratively by people like Grace Hopper. And I think it’s kind
of useful, if you’re going to have a revolution, to
make sure 100% of the people have the chance to be included. AUDIENCE: Having studied so
many of the great innovators of humanity, have you noticed
any common threads in how their personalities,
environments, techniques work to kind of result
in those innovations? WALTER ISAACSON: Well,
first off, I always say this isn’t a how-to book. Steve Jobs was, don’t
try this at home. Read Steve Jobs. OK, I’m going to–
likewise, this book. There are a lot of people who
write how-to books– you know, “Seven Secrets to Innovation”
or “12 Steps to Being a Leader.” A, I think those books suck. And secondly, it doesn’t
leave room for biographies. And biographies tell you,
people are more complex– that Bob Noyce is totally
different from Steve Jobs. And yet, Noyce was
a mentor to Jobs. And they both were creative. So I try to do it
through real people. I think the word “innovation”
has been so overused, it’s sapped of most
of its meaning. It’s become a buzzword. So I wanted to say, OK. Here are real people. You don’t know Engelbart that
well, but let me show you. Here’s how he made his leap. There are, having said
that, a few common threads. One of which is everybody
who’s a great genius– and this is a bad story, so
don’t take this to heart– dropped out of school. It’s like, forget it. You know, whether it’s Ben
Franklin or Mark Zuckerberg or Bill Gates or Steve
Jobs or Einstein, even, who runs away– this is
why I don’t get asked to speak at college graduations. But it’s not really that
they drop out of school. What they are is rebellious. They don’t like received wisdom. You can teach Einstein
the first paragraph of the Principia, which says
time marches along irrespective of how we observe it, and he
says, how do we know that? How would we test that? I mean, no received
wisdom is taken without pushing back
against authority. If you look at the people who
did the internet, part of them were pushing back
against authority because they were
avoiding the Vietnam War. And they were perpetual
graduate students– and they were just rebellious, as was
Steve, as was Bill Gates. So that ability to
question authority, think different,
as Steve would say. Think out of the box. That’s the common trait from
Ben Franklin to Einstein, to Bob Noyce, and Steve Jobs, and
Larry and Sergey, and others. Yeah. AUDIENCE: Hi. Thanks for being here. WALTER ISAACSON: Thanks. My pleasure. AUDIENCE: So I have a
question, not about this work, but about the fruition of
one of your previous works, which is what the process was
like working with Aaron Sorkin and the Steve Jobs notes
and things like that. WALTER ISAACSON: Well actually,
I admire Aaron Sorkin hugely, but I haven’t really
worked with him. I’ve met him a couple of times. You have to know
what you’re good at. I’ve proven on
the national stage I don’t know cable TV that well. When I was at CNN, it was
like– it’s just not my medium. And the same is true of movies. And so when they
bought the rights to the book, and
Aaron Sorkin– I said whoa, the
great screenwriter. They said, well, do you
want to be a consultant? Do you want to be part of it? I said no, actually, I don’t. Because I’m going
to be too literal. I’m not the best
person to do it. So I didn’t really–
I’ve kind of got to have a meal or two with
him, but I have not helped him, alas, write that screenplay. I wish I had the talent to
write screenplays and stuff, but I don’t. AUDIENCE: Have you read it? WALTER ISAACSON: Huh? AUDIENCE: Have you read it? WALTER ISAACSON: There
are different parts and different versions, and I
probably ought not go there, but yeah. AUDIENCE: Thanks. AUDIENCE: Hi. What other modern day
inventors, and innovators– even though you don’t like
the word– interest you? Who else do you– WALTER ISAACSON: Well I
love the word “innovators,” meaning people actually do it. I just don’t like the
concept of innovation as an abstract
concept that you can learn in seven easy lessons. It’s like, no. You have to see how
real innovators did it, which is why I
did– I love Elon Musk. I just was with him last
week in San Francisco. I got to interview
him and do some stuff. And I think he’s in a tougher
place than we in this room, meaning, it’s harder to
innovate in physical industries like transportation and cars. We have an
over-regulated society. We have– you know, you cannot
easily, in the garage, with, you know, the kid named
Raj from down the street, invent an auto company. But you can invent Apple. I also think he just thinks–
I mean, talk about rebellious, talk about questioning
authority, whatever. I think he’s very good. I’m not just blowing
smoke to think that I just get awed by Google at
every step of the way, and the new things they’re
doing, including cars, driverless cars and stuff. I watch Larry Page. I mean, I’m sure you all have
met him or listened to him. Larry’s mind is so fast that
the biggest mistake you can make is to have anything that
hints at a premise when you’re about to
ask him something. Because then he decides to drill
down and question the premise. Like, if you said,
because it’s sunny today, I want to ask you– and
then all of a sudden, it would be questioning
the whole question of sun. I mean, I’m exaggerating
there, but that’s a mark of a very questioning,
fertile mind, who is going to take nothing
as received wisdom. And his ability to tell the
tale of how he created Google is better than Steve Jobs’s. Steve was very intuitive. And I spent a lot
of time with him. But if I tried to drill
down with him on especially, say, software– like, OK,
there’s a Darwin kernel that you had in the
NeXT operating system, and to what extent did you
make this decision involving how it was going to be part
of the Apple operating system when you’re back at Apple? There’s no way. I’d peel back and you
know, I’d get a blank. So he was not deeply reflective
of how decisions are made. Now he would talk
intuitively about why the iPad had to
have a curved thing, or what he did in
Jony Ive’s studio. But he was not
self-analytic in terms of his decision-making process. I found Larry Page
in particular to be very smart about
understanding how his creative process worked. And I’m sorry I put it
at the end of the book, but you can skip forward and
skip all the middle parts if you want. But I find his interview
quite interesting. AUDIENCE: Hi. I had a question about your
methods as a biographer. Because writing about people
is one skill, but actually understanding them is
a very different one. So what is your approach
when you come across these very different,
unique people? Because if you come with
preconceived notions of how to analyze the life and
creativity of someone, you end up losing a lot. So what’s your method? Is it spending time with them? WALTER ISAACSON: Yeah. Narrative tends to
distort history, because you’re sort of
trying to push things into whatever arc you created. So you avoid trying to have
a narrative arc before you have the data
points that plot it. Secondly, when you’re
talking about people, there’s the simplest
of all things to do that I think most
biographers and journalists don’t do enough of, which
is, just think about it and look in yourself a bit. Like if Steve Jobs
did something– whether it was a personal thing,
like dealing with his firstborn child born out of wedlock,
or dealing with being ousted from Apple, and what he said
to somebody on the board and what happened– I
sometimes think, OK. Let me put myself
in that position. Let me try to feel it. Instead of playing gotcha,
where you kind of say– whether you’re writing
about a politician or you’re writing about the
head of the Centers for Disease Control or the head of Apple
or the Secretary of Health and Human Services– you
sort of say, wait a minute. I’ve been in those
situations before. I’ve made mistakes. I’ve done it not for evil
motives or bad motives, but because this happened. And you try very hard to
put yourself in their shoes and to see it as they
would at that moment, rather than imposing
what’s sometimes called presentism– a very bad
word– where a biographer will impose what we know at the
present on the subject who only knew what that subject
knew at that point. And sometimes people
say, well, how can Steve Jobs have done this? Or how could Bill
Gates have done this? I say, wait a minute. Have you ever been
in that situation? Think about it. How would you react? How would you feel? So I try to empathize more. And it sometimes means
sugarcoating things. I mean, there’s things
in the Steve Jobs book, if you’ll read, you’ll say,
wow, that was pretty bad. He yelled at somebody. And well, yeah, I yelled
at people occasionally. And it got them to do better
work– not when I did it, but when Steve did it. And so I try to put
that into a context. So you say, all
right, he was tough but I can now put myself
in his head, or her head. And it makes for
a more empathetic, thus a more sympathetic,
biography, which in some ways is bad, because you’re
not as tough of a writer. But I look at a Bob
Woodward and know I could never be a Bob Woodward,
because he’s much tougher. And I would be
saying, well, I can understand why Nixon did that. I probably would have
bugged– not really, but you know what I mean. You get two types
of biographers. Those that really
want to expose things, and those that somehow go soft,
as we’re sometimes accused of. But I want to say, yeah,
but understand– I mean, I was just re-reading
a part of my book because I was dealing
with Bill Gates, and I wanted to– There’s a
part of it in which he and Paul Allen have split it 50/50. And then Bill’s
doing all the work. And they’re in Albuquerque,
and he’s coding all night long. And he’s also hiring and
firing, going on sales calls. And he finally tells Paul
Allen, no, 60/40– and then I think 66, whatever, 33. And people have written
about that, including Paul, and say isn’t that horrible? You know, he jammed me. I was trying to remember
how I had handled it. And there’s a paragraph
I write that begins, to be fair to Bill Gates
comma– and then I explain, he was running the show. And so I try– is this
answering your question? I try to feel what
the person was feeling and maybe try to understand
the motives more. AUDIENCE: Yeah,
that’s very helpful. Thank you. WALTER ISAACSON: OK. AUDIENCE: Hello. So the Jobs biography
was a fascinating read. Thank you for that book. At the start of the book,
there is a little bit of a section in the
book where you mentioned that when Jobs asked
you to write about him, you were not very
certain about it. You had to take
some time to decide. So I was curious about, how
did you make that decision? I mean, aside from
what’s told in the book, more in terms of–
it’s probably easier to write about people who
are way back in history, like somebody like
Einstein, because you’re not interacting with them
on a daily basis. You have a lot of published
material available. So how do you decide
to start, for Jobs, and what’s your process? WALTER ISAACSON: Well,
I do go back and forth. Meaning the first
big book I did– well, it was something
a friend and I did, something called “The Wise Men,”
but then I did “Kissinger.” And trust me, after dealing
with Kissinger a whole lot, and then dealing with his
reaction to the book, I said, man, I’m going to do somebody
who’s been dead for 200 years. So I go back to Ben Franklin. Right? I don’t think I’m the
best historian you’ll ever have on this stage, even. I mean you can get the
Doris Kearnses up the kazoo. I’m also probably not
the best reporter. As I say, Bob Woodward
probably is a little bit more dogged than I am. But if you do the
Venn diagram, I’m pretty good at combining
archival historical research with interviews. And I happen to have one
lucky thing in my life, which is having been the editor of
“Time,” having written books. If I call Larry
Page, he says yes. Come on by. Let’s spend the
morning going over it. Whereas if the kid
from down the street trying to write a book
about it calls Larry Page, he’ll never get past– so I have
a little bit more entrée, which I don’t want to screw up. So I try to do more interviews. I mean, I drive up to
Gordon Moore’s house. I think that’s a problem
with journalism today is that it’s so easy for
people to be journalists and to write their own stuff,
that that notion of let me rent a car and drive
to Gordon Moore’s house and hear him tell the
story– there’s not as much of that done. And there’s more,
let me tell you why Moore’s law doesn’t
work or does work or give you my opinion on it. Totally understandable,
because A, you have to be somewhat
privileged to be able to get through and get
an appointment with some of these people. And B, it takes some resources
to fly out, rent cars, that sort of thing. So that’s something
I bring to the party. I’m just lucky that I’ve
had a couple of books that were successful,
so I can do that. So that’s what I
like to combine. Which gets to your
question of where do you pick, the Wayback Machine
versus somebody [INAUDIBLE]. When Steve first talked to
me about it, and I think I put this in the introduction. You know, I had done Ben
Franklin, I had done Einstein. And my first reaction
was like, OK, yeah. Franklin, Einstein, you? I had known him
since ’84 when he came to plug the Mac
at “Time” magazine. And I knew he was a genius and
wonderful, but he’s my age. And I’m thinking,
wait, I don’t want to write a biography
of somebody my age who’s still in the
middle of his career. What I actually said to him was,
yeah, that’d be really cool. But let’s wait 30
years until you retire. And then, you know, I was
just saying, put it off. And then, of course, I
realized he was sick, and that he had
called me, I think, right after he was diagnosed. So that put it in
a new perspective. And I also realize,
OK, I’m going to have a chance that
nobody else does. There are people who
know software engineering in this room. Every one of you probably
knows it a little bit. I mean, I try hard. I used to code. But I’m not– but
I have the ability to get a Steve Jobs to want
to spend 40 days of me just sitting in his backyard, taking
walks, and talking to him. And so I figured, you don’t
get that chance that often. I shouldn’t blow it. I shouldn’t deflect
it, especially when I knew he was sick. And very rarely does a
biographer or a journalist get to get that
close to a hugely brilliant, amazing subject. Obviously Boswell
does to Dr. Johnson. But you can list, probably on
one hand, the number of people who’ve gotten to spend an
enormous amount of time with a subject as
interesting as a Steve Jobs. So obviously, I
was going to do it. It was just a
question of timing. I had been working on this book,
“The Innovators,” for really 15 years, sort of off
and on, not knowing. I’d make Andy Grove Man
of the Year at “Time,” and spent some time with him. And then I’d
collect all my notes on how did Intel get formed. And I knew someday
I was going to try to do a history of the
digital revolution. And to me, this book is
the best of both worlds. Approximately half
the characters– there’s about 80 characters
in the book, 40 done deeply. And about half of
them were alive, and I could interview them. All of them who were alive,
I think I went to interview. And there are great archives,
oral histories, documents, and nobody has written a history
of the digital revolution. So it’s really the best of
both worlds, being a historian and being a journalist. AUDIENCE: So you, I think,
framed the book around people who are innovative or
innovators, as you call them. I was just curious to hear if
you had any thoughts about what it means for a company
to be innovative, if that’s even possible,
or if that’s just the product of a group of
innovators coming together. WALTER ISAACSON: Yeah,
it’s very difficult to be innovative
once you get big and get to become a company. I did ask Steve, what
was his best product? What was he most proud
of, his best innovation? I thought he’d say the
Mac, or maybe the iPhone. He said, no, creating
a product like the Mac or the iPhone is pretty hard. But what’s really
hard is creating a company that remains creative. And so the best thing
I created was Apple, because it’s able
to remain creative. Why has Apple been
able to do it? Take the iPod. I mean, just this
huge success– they’re making money hand over fist
in the early 2000s when this thing comes out. And it’s out of whole cloth. Who knew we needed 1,000
songs in our pocket? But we did. So then instead of being
happy, all of a sudden he’s bummed out. And it’s because he realizes
somebody could cannibalize it. That if the braindead
people who make cellphones figured out that they could put
music and create a smartphone, that would put iPods
out of business. So he creates the iPhone. And the people at
Apple say, well wait, that will cannibalize
our iPod business, right? And he says, yeah, but if
we don’t, somebody else will eat us for lunch. So that’s what it takes to avoid
what Clayton Christensen calls the Innovator’s Dilemma. It’s the willingness
to destroy in order to create something new. I’m a little woozy
because I was at 5:00 AM on Bloomberg financial
TV, God knows why. But a person kept asking
me, but isn’t Apple going to hurt the iPad? Aren’t they in
trouble with the iPad because their phones
are becoming bigger? And it’s true. I no longer take my iPad around
town with me, or even on trips sometimes, because
I’ve got my smartphone. And I said yes, but the
whole point I’m making is Apple’s not afraid to
say yes, a new iPhone will hurt the iPad market. But we’ve got to go for it. In fact, Apple was reticent. And they didn’t
make a big iPhone until, as you know quite well,
after Samsung, Google, Androids all were coming out. But it remained innovative. Now obviously, Google does that. Google all the time
is shooting the moon. Sergey in particular,
doing Google[x]. And that’s really cool. I look at more
staid corporations. IBM is a really
interesting example. It’s been around for
more than 100 years. So it kind of gets it. But you always think they’re
about to just jump the shark and be over. That was the way, back in my
book, when the PC comes along, and Apple comes along. And then suddenly IBM says, no,
we can actually create a PC. And they get away
from headquarters, and they do it in
Orlando, and boom. IBM PC comes out. Not bad. Likewise, Xerox decides, we
can’t just be a copier company. And they create Xerox PARC. But they don’t capitalize on it. They do the graphical
user interface. They do all the things
you find in the Mac, but they don’t– in the
Xerox Star, and the Alto– it just doesn’t get it,
and it doesn’t work. So most big companies
can’t innovate well. I think it takes a
visionary set of leaders who are willing to
break china– i.e., destroy the iPad to
make the bigger iPhone. And you know, I
watch Ginni Rometty, and I think OK, this
will be interesting. Can she remake
IBM one more time? She’s doing it a bit to get
back to machine learning by taking their cognitive
computing division– whatever you want to call
it– and making it so it’s a collaborative
division, where cognitive computing
is done with people. You take Watson and
pair it with doctors. You take Deep Blue
and it plays chess better when it
plays in partnership with people than
when it– you know. And so I think that she’s
trying to do this thing. I don’t know whether the
Watson division of IBM will be its saving grace. But I watch industries
that get disrupted, and it’s usually
because people are trying to protect what
they’ve invented before. AUDIENCE: Hi. I’m curious about
your relationship with your subjects. And how do you create a
context in which they’re going to reveal
themselves to you? Are there lessons
learned in your career? WALTER ISAACSON:
Yeah, one is just a simple one, which is listen. And the second is silence
is the best question. There were times, whether it was
Henry Kissinger or Steve Jobs, I’d say, OK. How’d you do this? And they’d sort of blow it off. I’d just sit there. People hate silence. Eventually, they will
start talking again. I don’t fully know–
I mean, I don’t want to be bragging, because
I know all my weaknesses. But I do have a particular
ability at times– and I think I learned it as
a police reporter for the New Orleans “Times-Picayune”– to
just go up to people and say, tell me the story. And you just wait. And you say, tell me the story. Everybody’s got a story. And everybody wants to tell it. And if you ask them
to do it as a story, they’re going to
tell you the story. I’ll tell a story
I haven’t told. I’m almost embarrassed
to say it in public. First day I ever
was a journalist. Summer job, “Times-Picayune,”
New Orleans. I’m on the 5:00 AM police beat. I get sent out. A young woman, a
young girl, has been killed on Carrollton
Avenue in New Orleans, and I’m sent there,
to the crime scene. I go, and interview the police. And back days before
cellphones or anything else, I go to a payphone,
phone in the story. And the early morning
rewrite man says, did you talk to her parents? I said of course not. I mean, the parents
are in the house. I’m not– He said go
talk to the parents. So I knock on the door. And I’m like, holy shit. Their daughter just got killed. And they talk to
me for a good hour. And after I finished,
I call the rewrite man. He says, did you get a picture? I said, no. He said, go back and
get the yearbook. And they give me the yearbook. So I learned a lesson, which
is just sit there and listen. People want to tell
you their story. And that has been lost a
little bit in journalism these days, which is people
want to engage on opinions. But they don’t just
say, tell me the story. AUDIENCE: Hi. Just interested to hear
your top list of people you would like to interview
throughout history. WALTER ISAACSON: Interview
throughout history. Well obviously the people
I’ve written about. I mean, if I could
have a beer tonight, it would be with Ben Franklin. And I think all of us would. I mean, he would be
totally blown away by the question of whether an
open Android system or a closed integrated system of
hardware and software made the most sense. He would love every– I mean,
he invented more devices than anybody in this room has
ever thought of inventing. He invented gadgets to take
books down from shelves. And he invented the pedometer. He invented, obviously,
bifocals and lightning rods. So this guy is the
most inventive person, and he loves and
embraces technology. I also like him because,
unlike Lord Byron, who was a Luddite, as I
said, he’s an optimist. One of the weird things I get
in this book– in interviews about it all the time–
is, isn’t technology bad? Isn’t it hurting us? Isn’t the NSA now spying on us? Isn’t life horrible? Isn’t Google, you know, keeping
track– I go, wait a minute. We own this technology. And isn’t it putting
us all out of work? Well no. Show me the data points. It’s creating– as it did in
the Industrial Revolution, despite what Lord
Byron thought– it’s creating all
sorts of new economies. So I’m very optimistic. And Ben Franklin was the
most optimistic innovator I’ve ever written about. He loved the concept
that’s at the heart of the digital
revolution, which is that the free flow of
peer-to-peer information will be the most empowering
way to create a new society, to create a democracy,
to transform the world. That’s what you do at Google. That’s what this
digital revolution has been about– personal
computers, distributed networks, peer-to-peer. And I would love, not
to go back and have dinner with Ben Franklin, but
to bring him forward and put him here, and to let him walk, and
to fit him with Google Glass. I mean shit! That would be so cool. Yes. AUDIENCE: Hi. Thanks for coming over. So I read your book
about Steve Jobs, and it was fascinating because
he was a very interesting character, and I enjoyed
learning about him from your perspective. The one thing I want to ask
about your memory with Steve Jobs, maybe that
wasn’t in the book, while you were working with
him, that you can share with us. WALTER ISAACSON: Memories? Yeah, I mean, there’s
a particular– and I tried to convey
it in the book. But the most striking
thing about him was how emotional he was. I mean, he would cry at times. He’d get worked up. And that intensity of emotion,
I think– somebody asked me, what are the keys to– I think
you’ve got to be passionate. But more than just passionate,
you’ve got to be emotional. Early on in the interviewing
process with him, I asked him what
makes a good– we got into this you got to be
a rebel, question authority thing that I said earlier. And he said, but you
know my manifesto. And it was the
“Think Different” ad from 1999 or so, which
you’re too young to remember. But it was this awesome ad
that had a print campaign which was Gandhi and Einstein and
it just said, think different. But there was a TV ad
that Steve helped write. So he’s sitting
there in his garden, and he recites the
entire 60-second ad. “Here’s to the crazy ones,
the misfits, the rebels, the round pegs in
the square holes.” And he goes on. And then he gets to the end,
and you know, “we at Apple, we celebrate them, because the
people who are crazy enough to think they can change the
world are the ones who do.” And I got a little
choked up just saying it, because I remember
when he said it to me. By the end, he’s crying. He’s choked up. And I’m going, what
just happened here? Did he get something in his eye? We were sitting in the garden. And he said, you just
have to excuse me. There are things that
make me so emotional, because they’re so beautiful
to me, and so meaningful to me, that I get choked up. And I go OK. That’s why he’s a bit
different from you and me. But that’s why he was Steve. AUDIENCE: Great. Walter, you seem so
optimistic about the future, about this digital age,
compared to other people from what we would
call old media. WALTER ISAACSON: That’s because
I got out in the nick of time, one step ahead of
the disruptors. AUDIENCE: And you alluded a
bit to the art of storytelling and listening being lost
from journalism today. I mean, what would
your advice be to those who continue– it was
just on CNBC this morning– hand wringing over the
demise of journalism, the demise of media. WALTER ISAACSON:
Actually, they were trying to get me to hand wring. I was on CNBC yesterday. I was on Bloomberg. Ah, journalism, demise, demise. There are a couple things. Journalism is not
in demise at all. This is the best era
ever for journalism. There’s journalists all over
the world doing amazing things. This is a very bad
period for the business model for journalism. Which is different–
I mean, it’s related. And the business
model got screwed up. And I was one of the
people who screwed it up. I was in charge of digital
media for Time, Inc. And right when the
web came along, we were being paid
$1 million a year for AOL and CompuServe and
others to put up our news and run the bulletin boards. And the web comes along,
and it’s like, great. We can publish our own. We don’t need AOL. And we were going to do exactly
what we did with AOL, which is put up advertising,
and we created banner ads, and charge the user a
certain amount– metered, or whatever it would be. From Madison Avenue, you could
look out of the Time Life building, and people were
carrying bags of money. They want to buy
ads, because they want to get with
it, this new medium. So we ended up not
charging for content, not trying to get
subscriptions, not having a micropayment system,
which Ted Nelson wanted to put in the hypertext,
which had been discussed at the Web Consortium
by Tim Berners-Lee. And everybody said, oh,
it wants to be free. And you don’t get it. You’re clueless if you
don’t make everything free, because we can all do– there is
no way advertising will support great journalism alone. Even Henry Luce
70 years ago said, if you’re depending on what
he called giveaway journalism to collect eyeballs
for advertisers, it’s not only morally
abhorrent, it’s economically self-defeating. I think the latter
part was worse to him. And it’s true. There are always going to
be more websites created than ad dollars added
in any given year. Chevrolet’s only going to
launch a certain number of cars. And even with all
sorts of new products, ad dollars go up at
best on a small slope, whereas the number of web
avails goes up exponentially. So that was a bad
business model. And it’s got a lot
of people screwed up. I think you could get
back to a good business model in some ways. First of all, the
genie is slowly being put back in the bottle. I subscribe to “Wall
Street Journal,” I subscribe to
“The Economist,” I subscribe to the “New
York Times,” or whatever. And believe it or
not, 800,000 people pay serious money to subscribe
to the “New York Times.” I don’t think subscriptions
is a great idea. I mean that’s fine,
but I don’t think it’s the way for all journalism. I don’t think bloggers
can get subscribers. I don’t think people who
write musicals or LARPs or whatever are going
to have subscribers. I think you want to be able
to pay per drink, or newsstand pay. And our financial system
is so undisrupted, and so screwed up.
I’m pessimistic about, it’s the banking system. I mean, if I’m trying to
give Betsy, my daughter, $50, sort of Popmoney kind of works. I end up with an Akimbo
Card because I can put money on an Akimbo Card
and designate it to whoever else has
that Akimbo Card. But I mean, this is nutty. So Bitcoin, I think, and other
cyber- and crypto-currencies could be the saving
grace, which is you don’t need the
banking system. You don’t need Paypal’s
clunky kludgy interface. You don’t need Visa
card passwords. You just say, oh. That article looks
really interesting. It’s $0.25. My browser will have
bitcoins or something in it. There’s about eight new
companies popping up doing it. I don’t even have
to authorize it. I can sort of set a level. Like if it asks me for
anything less than $5, don’t even ask, just
if I click on it, send out the cyber-currency. That will bring back a
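The threshold behavior Isaacson describes — payments under a user-set limit go through on a click, anything larger requires explicit approval — can be sketched as a tiny wallet object. This is a hypothetical illustration of the idea only; the names (`MicropaymentWallet`, `auto_approve_limit`) are invented for this sketch and do not correspond to any real browser or cryptocurrency API.

```python
# Hypothetical sketch of the auto-approve micropayment idea described above.
# Amounts below the threshold are paid without prompting; larger amounts
# need an explicit confirmation callback. All names are illustrative.

class MicropaymentWallet:
    def __init__(self, balance, auto_approve_limit=5.00):
        self.balance = balance
        self.auto_approve_limit = auto_approve_limit

    def pay(self, amount, confirm=None):
        """Pay `amount` if it is under the threshold, or if `confirm` approves it."""
        if amount > self.balance:
            return False  # insufficient funds
        if amount >= self.auto_approve_limit:
            # Over the threshold: require an explicit user confirmation.
            if confirm is None or not confirm(amount):
                return False
        self.balance -= amount
        return True

wallet = MicropaymentWallet(balance=20.00, auto_approve_limit=5.00)
print(wallet.pay(0.25))                          # $0.25 article: auto-approved -> True
print(wallet.pay(9.99))                          # over threshold, no confirmation -> False
print(wallet.pay(9.99, confirm=lambda a: True))  # user confirms -> True
print(round(wallet.balance, 2))                  # 20.00 - 0.25 - 9.99 = 9.76
```

The point of the design is that per-click friction disappears only for trivially small amounts, which is what would make a $0.25-per-article model tolerable.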
golden age of journalism, not simply because it would
bring back a revenue stream, but because it will
do something to get to the morally abhorrent part
of Henry Luce’s quote– which is if our goal is mainly
to aggregate eyeballs for advertisers
and marketers, then we will produce clickbait
that can do that. If our goal is to be beholden
to giving something of value to the reader that that
reader or user or viewer is not going to get
somewhere else– that isn’t just a commodity,
headlines, or clickbait, but something that’s a true
story that they don’t have elsewhere– the only way that
that becomes economically feasible is if you
are incented every day to produce something of such
value that the person will say, $0.25? Fine, fine– and click on it. And that will give us the
non-perverse incentive to avoid just doing
clickbait and saying, I’m going to do something
that somebody will value. And the only way to know
somebody values something is that they will
put a value on it, that they will pay
something for it. So I have this argument
with my daughter, who believes everything
should be free. And I explained to her,
since she’s a writer, and since she’s doing a
book or whatever, what do you think– you know,
how does this work? And don’t you want
to be incented to provide something so valuable
that it’s at least worth a buck to somebody
who’ll click on it? And that will help. And two other tech– I’m
sorry to go deep into this, but I worry about it– two
other technological things. I do think the internet and
the digital age in general tends to Balkanize. It used to be, I’m old
enough to remember growing up when Walter Cronkite said,
that’s the way it is. Because you could not
start a TV network. I mean, maybe Sarnoff did
when there were two of them and he started the third. But after that,
it’s kind of hard, before cable and digital comes
along to start a new network. So you had to
gather a broad base. You had to get a third of
the audience, at least, or you were failing. Nowadays, to win in
the digital world– whether you’re Fox Cable News or
a web– “Talking Points Memo”– you have to get like 1% of the
audience and be passionate, and you win. So it causes people to go for
passionate niche audiences, rather than the
mass audience, which allows it to be somewhat more
politically and ideologically Balkanized. And secondly, the
web inherently is better for information gathering
than it is for narrative. Narrative, especially
long-form narrative– and I don’t mean this because
technology makes us ADD. I just mean when you’re on the
web and you’re clicking around, that’s because you’re
putting together the thing. But if you want somebody
to tell you a whole story, if you want that “New
Yorker” essay on you know, Remnick in Moscow on Putin,
that demands– and you can do it online. It’s not like print
is better than online. But the web has all sorts of
landmines which we call links. And you know, you’re
into the third paragraph and all of the sudden
you’re floating off and looking at something else. So I think sustained
narrative is something we have to try to bring
back to journalism, which is, as I said, the
simplest six words. Let me tell you a story. AUDIENCE: Hopefully a
quick closing question. WALTER ISAACSON:
Yeah, it’s 12:59:27. AUDIENCE: Yeah. I’ll go real quick. As a fan of your
work, just curious if you know what your next
book or your next topic is going to be. WALTER ISAACSON: Yeah,
the ultimate connection between art and technology. Does anybody have
a copy of my book? Do you have it? I thought I saw
somebody with it. Yeah. If you’ll open to the very
last page of the narrative, I don’t announce it,
but I give a hint. Because if you had to
pick the ultimate symbol of the connection of the
sciences to the humanities, it would be “Vitruvian
Man,” the amazing drawing. And nobody has
fully done Leonardo with the science
connected to the art. And I want to go back
in the WayBack Machine so I don’t have to deal
with people anymore. And I want to go to Florence. So I’m going to go do Leonardo. Thank you. [APPLAUSE]
