
Steve Jobs: The Exclusive Biography by Walter Isaacson – review

Perhaps the funniest passage in Walter Isaacson's monumental book about Steve Jobs comes three quarters of the way through. It is 2009 and Jobs is recovering from a liver transplant and pneumonia. At one point the pulmonologist tries to put a mask over his face when he is deeply sedated. Jobs rips it off and mumbles that he hates the design and refuses to wear it. Though barely able to speak, he orders his doctors to bring five different options for the mask so that he can pick a design he likes. Even in the depths of his hallucinations, Jobs was a control-freak and a rude sod to boot. Imagine what he was like in the pink of health. As it happens, you don't need to: every discoverable fact about how Jobs, ahem, coaxed excellence from his co-workers is here.

As Isaacson makes clear, Jobs wasn't a visionary or even a particularly talented electronic engineer. But he was a businessman of astonishing flair and focus, a marketing genius, and – when he was getting it right, which wasn't always – had an intuitive sense of what the customer would want before the customer had any idea. He was obsessed with the products, rather than with the money: happily, as he discovered, if you get the products right, the money will come.

Isaacson's book is studded with moments that make you go "wow". There's the Apple flotation, which made the 25-year-old Jobs $256m in the days when that was a lot of money. There's his turnaround of the company after he returned as CEO in 1997: in the previous fiscal year the company lost $1.04bn, but he returned it to profit in his first quarter. There's the launch of the iTunes store: expected to sell a million songs in six months, it sold a million songs in six days.

When Jobs died, iShrines popped up all over the place, personal tributes filled Facebook and his quotable wisdom – management-consultant banalities, for the most part – was passed from inbox to inbox. This biography – commissioned by Jobs and informed by hours and hours of interviews with him – is designed to serve the cult. That's by no means to say that it's a snow-job: Isaacson is all over Jobs's personal shortcomings and occasional business bungles, and Jobs sought no copy approval (though, typically, he got worked up over the cover design).

But its sheer bulk bespeaks a sort of reverence, and it's clear from the way it's put together that there's not much Jobs did that Isaacson doesn't regard as vital to the historical record. We get a whole chapter on one cheesy ad ("Think Different"). We get half a page on how Jobs went about choosing a washing machine – itself lifted from an interview Jobs, bizarrely, gave on the subject to Wired. Want to know the patent number for the box an iPod Nano comes in? It's right there on page 347. Similarly, the empty vocabulary of corporate PR sometimes seeps into Isaacson's prose, as exemplified by the recurrence of the word "passion". There's a lot of passion in this book. Steve's "passion for perfection", "passion for industrial design", "passion for awesome products" and so on. If I'd been reading this on an iPad, the temptation to search-and-replace "passion" to "turnip" or "erection" would have been overwhelming.

Isaacson writes dutiful, lumbering American news-mag journalese and suffers – as did Jobs himself – from a lack of sense of proportion. Chapter headings evoke Icarus and Prometheus. The one on the Apple II is subtitled "Dawn of a New Age", the one on Jobs's return to Apple is called "The Second Coming", and when writing about the origins of Apple's graphical user interface (Jobs pinched the idea from Xerox), Isaacson writes with splendid bathos: "There falls a [sic] shadow, as TS Eliot noted, between the conception and the creation."

But get past all that pomp and there's much to enjoy. Did you know that the Apple Macintosh was nearly called the Apple Bicycle? Or that so obsessed was Jobs with designing swanky-looking factories (white walls, brightly coloured machines) that he kept breaking the machines by painting them – for example bright blue?

As well as being a sort-of-genius, Jobs was a truly weird man. As a young man, he was once put on the night-shift so co-workers wouldn't have to endure his BO. (Jobs was convinced his vegan diet meant he didn't need to wear deodorant or shower more than once a week.) He was perpetually shedding his shoes, and sometimes, to relieve stress, soaked his feet in the toilet. His on-off veganism was allied to cranky theories about health. When he rebuked the chairman of Lotus Software for spreading butter on his toast ("Have you ever heard of serum cholesterol?"), the man responded: "I'll make you a deal. You stay away from commenting on my dietary habits, and I will stay away from the subject of your personality."

That personality. An ex-girlfriend – and one, it should be said, who was very fond of him – told Isaacson that she thought Jobs suffered from narcissistic personality disorder. Jobs's personal life is sketchily covered, but what details there are don't charm. When he got an on/off girlfriend pregnant in his early 20s, he cut her off and aggressively denied paternity – though he later, uncharacteristically, admitted regretting his behaviour and sought to build a relationship with his daughter. (Jobs himself was adopted, and seems to have had what Americans call "issues around abandonment".)

He cheated his friends out of money. He cut old colleagues out of stock options. He fired people with peremptoriness. He bullied waiters, insulted business contacts and humiliated interviewees for jobs. He lied his pants off whenever it suited him – "reality distortion field" is Isaacson's preferred phrase. Like many bullies, he was also a cry-baby. Whenever he was thwarted – not being made "Man of the Year" by Time magazine when he was 27, for instance – he burst into tears.

As for critiquing the work of others, Jobs's analytical style was forthright: "too gay" (rabbit icon on desktop); "a shithead who sucks" (colleague Jef Raskin); "fucking dickless assholes" (his suppliers); "a dick" (the head of Sony music); "brain-dead" (mobile phones not made by Apple).

Nowadays we are taught that being nice is the way to get on. Steve Jobs is a fine counter-example. In 2008, when Fortune magazine was on the point of running a damaging article about him, Jobs summoned its managing editor, Andy Serwer, to Cupertino to demand he spike the piece: "He leaned into Serwer's face and asked, 'So, you've uncovered the fact that I'm an asshole. Why is that news?'"

Sam Leith's You Talkin' to Me? is published by Profile Books.


Making the iBio for Apple’s Genius


By Janet Maslin

Oct. 21, 2011

After Steve Jobs anointed Walter Isaacson as his authorized biographer in 2009, he took Mr. Isaacson to see the Mountain View, Calif., house in which he had lived as a boy. He pointed out its “clean design” and “awesome little features.” He praised the developer, Joseph Eichler, who built more than 11,000 homes in California subdivisions, for making an affordable product on a mass-market scale. And he showed Mr. Isaacson the stockade fence built 50 years earlier by his father, Paul Jobs.

“He loved doing things right,” Mr. Jobs said. “He even cared about the look of the parts you couldn’t see.”

Mr. Jobs, the brilliant and protean creator whose inventions so utterly transformed the allure of technology, turned those childhood lessons into an all-purpose theory of intelligent design. He gave Mr. Isaacson a chance to play by the same rules. His story calls for a book that is clear, elegant and concise enough to qualify as an iBio. Mr. Isaacson’s “Steve Jobs” does its solid best to hit that target.


As a biographer of Albert Einstein and Benjamin Franklin, Mr. Isaacson knows how to explicate and celebrate genius: revered, long-dead genius. But he wrote “Steve Jobs” as its subject was mortally ill, and that is a more painful and delicate challenge. (He had access to members of the Jobs family at a difficult time.) Mr. Jobs promised not to look over Mr. Isaacson’s shoulder, and not to meddle with anything but the book’s cover. (Boy, does it look great.) And he expressed approval that the book would not be entirely flattering. But his legacy was at stake. And there were awkward questions to be asked. At the end of the volume, Mr. Jobs answers the question “What drove me?” by discussing himself in the past tense.

Mr. Isaacson treats “Steve Jobs” as the biography of record, which means that it is a strange book to read so soon after its subject’s death. Some of it is an essential Silicon Valley chronicle, compiling stories well known to tech aficionados but interesting to a broad audience. Some of it is already quaint. Mr. Jobs’s first job was at Atari, and it involved the game Pong. (“If you’re under 30, ask your parents,” Mr. Isaacson writes.) Some of it, like an account of the release of the iPad 2, is so recent that it is hard to appreciate yet, even if Mr. Isaacson says the device comes to life “like the face of a tickled baby.”

And some is definitely intended for future generations. “Indeed,” Mr. Isaacson writes, “its success came not just from the beauty of the hardware but from the applications, known as apps, that allowed you to indulge in all sorts of delightful activities.” One that he mentions, which will be as quaint as Pong some day, features the use of a slingshot to launch angry birds to destroy pigs and their fortresses.

So “Steve Jobs,” an account of its subject’s 56 years (he died on Oct. 5), must reach across time in more ways than one. And it does, in a well-ordered, if not streamlined, fashion. It begins with a portrait of the young Mr. Jobs, rebellious toward the parents who raised him and scornful of the ones who gave him up for adoption. (“They were my sperm and egg bank,” he says.)

Although Mr. Isaacson is not analytical about his subject’s volatile personality (the word “obnoxious” figures in the book frequently), he raises the question of whether feelings of abandonment in childhood made him fanatically controlling and manipulative as an adult. Fortunately, that glib question stays unanswered.

Mr. Jobs, who founded Apple with Stephen Wozniak and Ronald Wayne in 1976, began his career as a seemingly contradictory blend of hippie truth seeker and tech-savvy hothead.

“His Zen awareness was not accompanied by an excess of calm, peace of mind or interpersonal mellowness,” Mr. Isaacson says. “He could stun an unsuspecting victim with an emotional towel-snap, perfectly aimed,” he also writes. But Mr. Jobs valued simplicity, utility and beauty in ways that would shape his creative imagination. And the book maintains that those goals would not have been achievable in the great parade of Apple creations without that mean streak.

Mr. Isaacson takes his readers back to the time when laptops, desktops and windows were metaphors, not everyday realities. His book ticks off how each of the Apple innovations that we now take for granted first occurred to Mr. Jobs or his creative team. “Steve Jobs” means to be the authoritative book about those achievements, and it also follows Mr. Jobs into the wilderness (and to NeXT and Pixar) after his first stint at Apple, which ended in 1985.

With an avid interest in corporate intrigue, it skewers Mr. Jobs’s rivals, like John Sculley, who was recruited in 1983 to be Apple’s chief executive and fell for Mr. Jobs’s deceptive show of friendship. “They professed their fondness so effusively and often that they sounded like high school sweethearts at a Hallmark card display,” Mr. Isaacson writes.

Of course the book also tracks Mr. Jobs’s long and combative rivalry with Bill Gates. The section devoted to Mr. Jobs’s illness, which suggests that his cancer might have been more treatable had he not resisted early surgery, describes the relative tenderness of their last meeting.

“Steve Jobs” greatly admires its subject. But its most adulatory passages are not about people. Offering a combination of tech criticism and promotional hype, Mr. Isaacson describes the arrival of each new product right down to Mr. Jobs’s theatrical introductions and the advertising campaigns. But if the individual bits of hoopla seem excessive, their cumulative effect is staggering. Here is an encyclopedic survey of all that Mr. Jobs accomplished, replete with the passion and excitement that it deserves.

Mr. Jobs’s virtual reinvention of the music business with iTunes and the iPod, for instance, is made to seem all the more miraculous. (“He’s got a turn-key solution,” the music executive Jimmy Iovine said.) Mr. Isaacson’s long view basically puts Mr. Jobs up there with Franklin and Einstein, even if a tiny MP3 player is not quite the theory of relativity. The book emphasizes how deceptively effortless Mr. Jobs’s ideas now seem because of their extreme intuitiveness and foresight. When Mr. Jobs, who personally persuaded musician after musician to accept the iTunes model, approached Wynton Marsalis, Mr. Marsalis was rightly more impressed with Mr. Jobs than with the device he was being shown.

Mr. Jobs’s love of music plays a big role in “Steve Jobs,” like his extreme obsession with Bob Dylan. (Like Mr. Dylan, he had a romance with Joan Baez. Her version of Mr. Dylan’s “Love Is Just a Four-Letter Word” was on Mr. Jobs’s own iPod.) So does his extraordinary way of perceiving ordinary things, like well-made knives and kitchen appliances. That he admired the Cuisinart food processor he saw at Macy’s may sound trivial, but his subsequent idea that a molded plastic covering might work well on a computer does not. Years from now, the research trip to a jelly bean factory to study potential colors for the iMac case will not seem as silly as it might now.

Skeptic after skeptic made the mistake of underrating Steve Jobs, and Mr. Isaacson records the howlers who misjudged an unrivaled career. “Sorry Steve, Here’s Why Apple Stores Won’t Work,” Business Week wrote in a 2001 headline. “The iPod will likely become a niche product,” a Harvard Business School professor said. “High tech could not be designed and sold as a consumer product,” Mr. Sculley said in 1987.

Mr. Jobs got the last laugh every time. “Steve Jobs” makes it all the sadder that his last laugh is over.

By Walter Isaacson

Illustrated. 630 pages. Simon & Schuster. $35.

The Books of The Times review on Saturday, about “Steve Jobs,” by Walter Isaacson, described Angry Birds, a popular iPhone game, incorrectly. Slingshots are used to launch birds to destroy pigs and their fortresses, not to shoot down the birds.




Walter Isaacson

Jefferson Lecture


By David Skinner

The story of Walter Isaacson—celebrated journalist, biographer, intellectual leader, and humanist—begins on May 20, 1952, when he was born at the Touro Infirmary in New Orleans. Much later on, he described his father, Irwin, as a “kindly Jewish distracted humanist engineer with a reverence for science.” His mother, Betsy, was a real estate broker for whom Walter would name his only child.

The Isaacsons were local boosters. They appreciated the unique racial and cultural mix of their neighborhood, Broadmoor, and joined a committee to help preserve it. The family lived on Napoleon Avenue, and Walter, the older of two brothers, was noted early on for his ambition. Student body president at the Isidore Newman School, he was also named “most likely to succeed.”

An article in the “Terrific Teens” column of the Times-Picayune reported that he’d been working to unite students of different religions and races to develop a program for tutoring poor children. He also joined a committee that worked to reopen a public pool that had been closed to sidestep integration. It was not yet clear that he wanted to be a writer, but a keenness to understand how the world worked and to find ways to address social problems was evident. He told the Times-Picayune columnist, Millie Ball, that he thought his future might be in sociology or political economics.

Another strain of his upbringing was literary. As Isaacson recently wrote in a personal essay in Louisiana Cultural Vistas, published by the Louisiana Endowment for the Humanities, his parents were proudly middlebrow. They subscribed to Time magazine, the Saturday Review, and the Book-of-the-Month Club, all staples of the mainstream cultural diet in those days.

In addition, he personally knew a bona fide novelist: Walker Percy, author of The Moviegoer, The Last Gentleman, and many later works that address a mixture of existential, religious, and scientific themes. This uncle of a childhood friend entertained occasional questioning from the future journalist about the messages written into his carefully layered books.

Throughout his life, Isaacson has shown a knack for meeting interesting and important people. In college, “he was the mayor of literary Harvard,” Kurt Andersen recently told Evan Thomas for an article in Humanities . Interviewing for his Rhodes scholarship, he nervously underwent a grilling from Willie Morris, the well-known writer and editor, and a young Arkansas lawyer named Bill Clinton.

Returning to New Orleans after studying philosophy at Oxford, Isaacson took a job as a reporter with the New Orleans States-Item, which later merged with the Times-Picayune. He covered City Hall and while looking around for sources found an especially valuable one in Donna Brazile, then guardian of access to Mayor Moon Landrieu, and later on a well-known adviser to President Clinton and Vice President Al Gore.

How Walter Isaacson went from covering City Hall in New Orleans to the editorial staff of Time magazine concerns one of the few instances when he was mistaken for a provincial. Hedley Donovan, Henry Luce’s chosen successor, had sent forth one of his editors to discover some young journalists from the Great Beyond west of the Hudson River. As this editor arrived in New Orleans, he could not help but learn about Isaacson, who was being touted by his newspaper for correctly predicting the outcome of a 12-candidate mayoral primary. Isaacson’s glory was short-lived, however, as he failed to correctly predict the winner of the runoff. The editor from Time nevertheless offered him a job.

Brought to New York City, Isaacson was presented to the editor in chief. As Isaacson tells the story,

Donovan proclaimed how pleased he was that they had found someone from “out there,” because far too many of the people at the magazine had gone to Harvard and Oxford. By the way, he asked, where did I go to school? I thought he was joking, so I just laughed. He repeated the question. The editor who had found me gave me a nervous look. I mumbled Harvard in a drawl that I hoped made it sound like Auburn. Donovan looked puzzled. I was whisked away. I do not recall ever being brought to meet him again.

The gift of knowing the right people, in Isaacson’s case, may very well be a happy side effect of wanting to know more about people, period. At a recent photo shoot, during a short break while equipment was being reset, Isaacson turned to one of the cameramen and said, very simply, “Tell me something about yourself.” On a sidewalk in D.C., he lately ran into a writer he’d worked with. The writer was coming back from lunch with some younger colleagues, and it wasn’t long before Isaacson was pumping the junior writers for information about what they were working on. Many journalists find it easy to go into interview mode, but the case of Walter Isaacson is that plus something else. In his younger days, he fancied he could be dropped into any small town and come out with a story. He even tested the theory, producing a series of articles on the lives of sharecroppers at a plantation in southeastern Louisiana.

At Time magazine, he got to work on national and international stories—big league journalism practiced with big league resources. In 1980 he covered the presidential campaign of Ronald Reagan. A picture from the time shows him looking barely old enough to buy a drink while being offered a treat from Nancy Reagan walking the aisle of the campaign plane like a stewardess. With access came a sense of responsibility. Isaacson and Evan Thomas coauthored a book that took readers beyond the weekly news cycle to look at how a group of privileged Ivy Leaguers from the same blue-blood milieu made Cold War history. The Wise Men directed a spotlight at such establishment figures as Averell Harriman and Dean Acheson to produce a group portrait of key supporting players who shaped American foreign policy after World War II through the Vietnam War.

The Time magazine formula of writing history on the spot, through the lives of historymakers, was an agreeable match for the intensely social and hard-working Isaacson. And being at Time brought him into the orbit of some of the most interesting people around. When, in 1984, Steve Jobs came to Time to tout his awesome new desktop, Isaacson, the only reporter on staff who wrote on a computer, was asked to sit in.

In the 1980s and ’90s he got to cover two of the greatest stories going. The first concerned the decline of the Soviet Union and its ripple effects across Eastern Europe. Seeing Lech Walesa rallying shipbuilders in Poland and the dissident Vaclav Havel becoming a leading light in then Czechoslovakia confirmed his belief that history is not simply the result of impersonal forces but that individuals play major roles—a view he happened to share with Henry Kissinger, about whom he wrote a thorough, not always friendly, but well-received biography in 1992.

The other major story was the digital revolution. Isaacson was promoted to new media editor for all of Time Warner for two years during the era of the Pathfinder website. One of the first large-scale entrants into digital journalism, Pathfinder.com combined content from Time, People, Sports Illustrated, and several other magazines, offering it free of charge. Though notable for its ambition and for bringing advertisers online, its go-big strategy failed to set a template for online writing and reporting. The near future proved more amenable to search engines and smaller news-gatherers. Isaacson was then named managing editor of Time, the most senior editorial position within the magazine. As the tide shifted away from political news, the magazine under Isaacson still looked to occupy a great breadth of common ground. He never abandoned the classic Luce editorial formula, but he looked to update it with sharper writing and a broader cultural scope that included an energetic commitment to the story of digital technology.

In 2001 he became the CEO of CNN, overseeing its operations during 9/11 and afterwards, a job he held until 2003. While at CNN, he began working on a biography of Benjamin Franklin. It was a good period for popular books about the American Founders, but Isaacson’s fondness for his subject is evident throughout. He seemed to identify with the lighthearted Franklin, a fellow lover of science and technology who, like Isaacson, made friends easily.

The life of the mind has become Walter Isaacson’s major subject, and it goes well with his day job as president and CEO of the Aspen Institute, which might be described as the Ben Franklin of think tanks: well connected, intellectually broad, and consistently practical-minded. Founded by Walter Paepcke as a bipartisan forum where leaders could escape the rough and tumble of daily politics to reflect on enduring values, the institute has become an important venue for education reformers, technologists, and global leaders.

One of the more commonly asked questions about Walter Isaacson is, How does he get so much work done? When asked by Humanities magazine, he replied, “I don’t watch TV. If you give up TV, it’s amazing how many hours there are between 7:00 p.m. and 1:00 a.m. in which you can do writing.”

The reason it’s a popular question is that while running the Aspen Institute, Isaacson has completed two generously sized biographies. The first, about Albert Einstein, forced him to confront a whole battery of research and writing challenges. Readers wondering whether he might have skipped some of the hard parts are greeted by several pages of acknowledgments, stating Isaacson’s various debts to numerous physics professors and Einstein scholars. But it was more than math homework that made the book a huge best-seller: In its descriptions of Einstein’s breakthroughs, Isaacson showed off a pictorial gift that helped him to describe some of what Einstein was visualizing when the physicist arrived at the general theory of relativity and his other discoveries.

Writing the biography of Steve Jobs required a spectacular commitment to journalistic principles, re-reporting oft-told stories, getting close to the mesmerizing and mercurial founder of Apple without falling under his spell, and tracing the sometimes technical steps of several major innovations. To make things more difficult, Isaacson was writing from within the whirlwind of the present moment, with its constant reminders that the person he was writing about was considered by many to be no mere mortal. That the biography doubles as an ethical portrait of Jobs is, of course, a credit to Isaacson’s careful study of his subject.

Still practicing the Henry Luce philosophy, Isaacson used the story of Steve Jobs to tell a major story of our times. And just as Jobs humanized the personal computer and portable devices to appeal to a large variety of consumers, Isaacson has humanized the complicated interior lives of a series of historical figures, helping us to better understand several people who have changed the world.


Walter Isaacson, best-selling author, acclaimed journalist, and president and CEO of the Aspen Institute, an educational and policy studies organization, will deliver the 2014 Jefferson Lecture in the Humanities on May 12 at 7:30 p.m. at the John F. Kennedy Center for the Performing Arts in Washington, DC.

Isaacson's Jefferson Lecture, "The Intersection of the Humanities and the Sciences," will touch on the careers of Leonardo da Vinci, Albert Einstein, Benjamin Franklin, Steve Jobs, Ada Lovelace, Walker Percy, and Edwin Land and others who fused humanistic thought with scientific discovery.

Tickets for the lecture will be available at www.neh.gov starting April 22.

One of the preeminent biographers of our time, Isaacson is best known for his complex and comprehensive portraits of some of history’s most innovative and influential minds. His 2011 biography Steve Jobs, based on more than 40 interviews with the Apple cofounder, delves into the obsessive perfectionism and drive that helped transform the personal computing, animated film, music, telecommunications, and digital publishing industries. Isaacson’s best-selling Einstein: His Life and Universe (2007) is a riveting portrayal of the unconventional thinker who forever changed science with his theory of relativity while still a 26-year-old patent clerk. His 2003 biography Benjamin Franklin: An American Life presents a detailed account of the playful and pragmatic founding father whose extraordinary accomplishments have had such far-reaching effects on American life, from the structure of our democracy to the shape of our bifocals.

Isaacson is also the author of American Sketches: Great Leaders, Creative Thinkers, and Heroes of a Hurricane (2009), Kissinger: A Biography (1992), and co-author of The Wise Men: Six Friends and the World They Made (1986).  He is currently working on a book on the innovators who helped create the digital age.

Isaacson’s biographies rank among the best of the genre, “educating us while demonstrating the continued fascination of the seriously examined life, rendered by Isaacson with the objectivity of a true historian and the flair of a born storyteller,” wrote Madeleine Albright, former U.S. Secretary of State, when Isaacson was named by TIME magazine among “The World’s 100 Most Influential People” in 2012. “Both as an author and as president of the intellectually fertile Aspen Institute, Isaacson is a purveyor of knowledge, a supplier to addicts who seek a deeper understanding of all manner of things.”

Walter Isaacson was born in New Orleans in 1952. He earned a B.A. in history and literature from Harvard College in 1974, and continued his studies in Philosophy, Politics, and Economics at Pembroke College of Oxford University as a Rhodes Scholar.

Isaacson began his career in journalism at The Sunday Times of London and then the New Orleans Times-Picayune/States-Item. He joined TIME magazine in 1978 and served as a political correspondent, national editor, and editor of new media before becoming the magazine’s 14th editor in 1996. He became chairman and CEO of CNN in 2001. In 2003 Isaacson was named president and CEO of the Aspen Institute, a nonpartisan educational and policy studies institute in Washington, DC.

He is the chairman emeritus of the board of Teach for America, which recruits recent college graduates to teach in underserved communities. He was appointed by President Barack Obama and confirmed by the Senate to serve as the chairman of the Broadcasting Board of Governors, which oversees Voice of America, Radio Free Europe, and other international broadcasts of the United States, a position he held until 2012. He is vice-chair of Partners for a New Beginning, a private-public group tasked with forging ties between the United States and the Muslim world. He serves on the boards of United Airlines and Tulane University and on the Board of Overseers of Harvard University. From 2005 to 2007, after Hurricane Katrina, he was the vice-chair of the Louisiana Recovery Authority.

Isaacson is a fellow of the Royal Society of Arts, which awarded him its 2013 Benjamin Franklin Medal, and a member of the American Philosophical Society. He lives with his wife and daughter in Washington, DC.


My fellow humanists,

I am deeply humbled to be here today. I know this is a standard statement to make at moments such as these, but in my case it has the added virtue of being true. There is no one on the list of Jefferson lecturers, beginning with Lionel Trilling, who is not an intellectual and artistic hero of mine, and I cannot fathom why I am part of this procession. But that makes me feel all the more humbled, so I thank you. 

It is particularly meaningful for me to be giving this lecture on the 25th anniversary of the one by Walker Percy. I took the train from New York for that occasion, looking out of the window and thinking of his eerie essay about the malaise, “The Man on the Train.” If memory serves, it was over at the Mellon Auditorium, and Lynne Cheney did the introduction.

Dr. Percy, with his wry philosophical depth and lightly-worn grace, was a hero of mine. He lived on the Bogue Falaya, a bayou-like, lazy river across Lake Pontchartrain from my hometown of New Orleans. My friend Thomas was his nephew, and thus he became “Uncle Walker” to all of us kids who used to go up there to fish, capture sunning turtles, water ski, and flirt with his daughter Ann. It was not quite clear what Uncle Walker did. He had trained as a doctor, but he never practiced. Instead, he seemed to be at home most days, sipping bourbon and eating hog’s head cheese. Ann said he was a writer, but it was not until his first novel, The Moviegoer, had gained recognition that it dawned on me that writing was something you could do for a living, just like being a doctor or a fisherman or an engineer. Or a humanist.

He was a kindly gentleman, whose placid face seemed to know despair but whose eyes nevertheless often smiled. He once said: “My ideal is Thomas More, an English Catholic who wore his faith with grace, merriment, and a certain wryness.” [i] That describes Dr. Percy well.

His speech twenty-five years ago was, appropriately enough for an audience of humanists, about the limits of science. “Modern science is itself radically incoherent, not when it seeks to understand subhuman organisms and the cosmos, but when it seeks to understand man,” he said. I thought he was being a bit preachy. But then he segued into his dry, self-deprecating humor. “Surely there is nothing wrong with a humanist, even a novelist, who is getting paid by the National Endowment for the Humanities, taking a look at his colleagues across the fence, scientists getting paid by the National Science Foundation, and saying to them in the friendliest way, ‘Look, fellows, it’s none of my business, but hasn’t something gone awry over there that you might want to fix?’” He said he wasn’t pretending to have a grand insight like “the small boy noticing the naked Emperor.” Instead, he said, “It is more like whispering to a friend at a party that he’d do well to fix his fly.” [ii]

The limits of science were a subject he knew well. He had trained as a doctor and was preparing to be a psychiatrist. After contracting tuberculosis, he woke up one morning and had an epiphany. He realized science couldn’t teach us anything worth knowing about the human mind, its yearnings, depressions, and leaps of faith.

So he became a storyteller. Man is a storytelling animal, and southerners especially so. Alex Haley once told someone who was stymied about how to give a lecture such as this that there were six magic words to use: “Let me tell you a story.” So let me tell you a story: Percy’s novels, I eventually noticed, carried philosophical, indeed religious, messages. But when I tried to get him to expound upon them, he demurred. There are, he told me, two types of people who come out of Louisiana: preachers and storytellers. For goodness sake, he said, be a storyteller. The world has too many preachers.

For Dr. Percy, storytelling was the humanist’s way of making sense out of data. Science gives us empirical facts and theories to tie them together. Humans turn them into narratives with moral and emotional and spiritual meaning.

His specialty was the “diagnostic novel,” which played off of his scientific knowledge to diagnose the modern condition. In Love in the Ruins , Dr. Thomas Moore, a fictional descendant of the English saint, is a psychiatrist in a Louisiana town named Paradise who invents what he calls an “Ontological Lapsometer,” which can diagnose and treat our malaise.

I realized that Walker Percy’s storytelling came not just from his humanism – and certainly not from his rejection of science. Its power came because he stood at the intersection of the humanities and the sciences. He was our interface between the two.

That’s what I want to talk about today. The creativity that comes when the humanities and science interact is something that has fascinated me my whole life.

When I first started working on a biography of Steve Jobs, he told me: “I always thought of myself as a humanities person as a kid, but I liked electronics. Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” [iii]

In his product demos, Jobs would conclude with a slide, projected on the big screen behind him, of a street sign showing the intersection of the Liberal Arts and the Sciences. At his last major product launch, the iPad 2, in 2011, he ended again with those street signs and said: "It's in Apple's DNA that technology alone is not enough — it's technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing.” That’s what made him the most creative technology innovator of our era, and that’s what he infused into the DNA of Apple, which is still evident today.

It used to be common for creative people to stand at this intersection. Leonardo da Vinci was the exemplar, and his famous drawing of the Vitruvian Man became the symbol, of the connection between the humanities and the sciences. “Leonardo was both artist and scientist, because in his day there was no distinction,” writes science historian Arthur I. Miller in his forthcoming book, Colliding Worlds.   

Two of my biography subjects embody that combination. Benjamin Franklin was America’s founding humanist, but he was also the most important experimental scientist of his era. And his creativity came from connecting the two realms.

We sometimes think of him as a doddering dude flying a kite in the rain. But his electricity experiments established the single-fluid theory of electricity, the most important scientific breakthrough of his era. As Harvard professor Dudley Herschbach declared: “His work on electricity ushered in a scientific revolution comparable to those wrought by Newton in the previous century or by Watson and Crick in ours.”

Part of his talent as both a scientist and humanist was his facility as a clear writer, and he crafted the words we still use for electrical flow: positive and negative charges, battery, condenser, conductor.

Because he was a humanist, he looked for ways that his science could benefit society. He lamented to a friend that he was “chagrined” that the electricity experiments “have hitherto been able to discover nothing in the way of use to mankind.” He actually did come up with one use early on. He was able to apply what he had learned to prepare the fall feast. He wrote, “A turkey is to be killed for our dinners by the electrical shock; and roasted by the electrical jack, before a fire kindled by the electrified bottle.” Afterwards he reported, “The birds killed in this manner eat uncommonly tender.” [iv] I think that I can speak for Dr. Percy and say that we Southerners ought to honor him as the inventor of the first fried turkey.

Of course his electricity experiments eventually led him to the most useful invention of his age: the lightning rod. Having noticed the similarity of electrical sparks and lightning bolts, he wrote in his journal the great rallying cry of the scientific method: “Let the experiments be made.” [v] And they were. He became a modern Prometheus, stealing fire from the gods.  Few scientific discoveries have been of such immediate service to mankind.

Franklin’s friend and protégé, and our lecture’s patron, Thomas Jefferson, also combined a love of science with that of the humanities. The week that he became Vice President in 1797, Jefferson presented a formal research paper on fossils to the American Philosophical Society, the scientific group founded a half century earlier by young Benjamin Franklin. Jefferson became president of the organization and held that post even as he served as President of the United States.

My point is not merely that Franklin and Jefferson loved the sciences as well as the arts. It’s that they stood at the intersection of the two. They were exemplars of an Enlightenment in which natural order and Newtonian mechanical balances were the foundation for governance.

Take for example the crafting of what may be the greatest sentence ever written, the second sentence of the Declaration of Independence.

The Continental Congress had created a committee to write that document. It may have been the last time Congress created a great committee. It included Benjamin Franklin, Thomas Jefferson, and John Adams.

When he had finished a rough draft, Jefferson sent it to Franklin in late June 1776. “Will Doctor Franklin be so good as to peruse it,” he wrote in his cover note, “and suggest such alterations as his more enlarged view of the subject will dictate?” [vi] People were more polite to editors back then than they were in my day.

Franklin made only a few changes, some of which can be viewed on what Jefferson referred to as the “rough draft” of the Declaration. (This remarkable document is at the Library of Congress.) The most important of his edits was small but resounding. Jefferson had written,  “We hold these truths to be sacred and undeniable…” Franklin crossed out, using the heavy backslashes that he often employed, the last three words of Jefferson’s phrase and changed it to the words now enshrined in history: “We hold these truths to be self-evident.”

The idea of “self-evident” truths came from the rationalism of Isaac Newton and Franklin’s close friend David Hume. The sentence went on to say that “all men are created equal” and “from that equal creation they derive rights.” The committee changed it to “they are endowed by their creator with certain inalienable rights.” [vii]

So here in the editing of a half of one sentence we see them balancing the role of divine providence in giving us our rights with the role of rationality and reason. The phrase became a wonderful blending of the sciences and humanities.

The other great person I wrote about who stood at the intersection of the sciences and humanities came at it from the other direction: Albert Einstein.

I have some good news for parents in this room. Einstein was no Einstein when he was a kid.

He was slow in learning how to talk. “My parents were so worried,” he later recalled, “that they consulted a doctor.” The family maid dubbed him “der Depperte,” the dopey one. [viii]

His slow development was combined with a cheeky rebelliousness toward authority, which led one schoolmaster to send him packing and another to amuse history by declaring that he would never amount to much. These traits made Albert Einstein the patron saint of distracted school kids everywhere.  But they also helped to make him, or so he later surmised, the most creative scientific genius of modern times.

His cocky contempt for authority led him to question received wisdom in ways that well-trained acolytes in the academy never contemplated. And as for his slow verbal development, he thought that it allowed him to observe with wonder the everyday phenomena that others took for granted. “When I ask myself how it happened that I in particular discovered relativity theory, it seemed to lie in the following circumstance,” Einstein once explained. “The ordinary adult never bothers his head about the problems of space and time. These are things he has thought of as a child. But I developed so slowly that I began to wonder about space and time only when I was already grown up. Consequently, I probed more deeply into the problem than an ordinary child would have.”

His success came from his imagination, rebellious spirit, and his willingness to question authority. These are things the humanities teach.

He marveled at even nature’s most mundane amazements. One day, when he was sick as a child, his father gave him a compass. As he moved it around, the needle would twitch and point north, even though nothing physical was touching it. He was so excited that he trembled and grew cold. You and I remember getting a compass when we were kids. “Oh, look, the needle points north,” we would exclaim, and then we’d move on – “Oh, look, a dead squirrel” – to something else. But throughout his life, and even on his deathbed as he scribbled equations seeking a unified field theory, Einstein marveled at how an electromagnetic field interacted with particles and related to gravity. In other words, why that needle twitched and pointed north.

His mother, an accomplished pianist, also gave him a gift at around the same time, one that likewise would have an influence throughout his life. She arranged for him to take violin lessons. After being exposed to Mozart’s sonatas, music became both magical and emotional to him.

Soon he was playing Mozart duets with his mother accompanying him on the piano. “Mozart’s music is so pure and beautiful that I see it as a reflection of the inner beauty of the universe itself,” he later told a friend. [ix] “Of course,” he added in a remark that reflected his view of math and physics as well as of Mozart, “like all great beauty, his music was pure simplicity.” [x]

Music was no mere diversion. On the contrary, it helped him think. “Whenever he felt that he had come to the end of the road or faced a difficult challenge in his work,” said his son, “he would take refuge in music and that would solve all his difficulties.” [xi] The violin thus proved useful during the years he lived alone in Berlin wrestling with general relativity. “He would often play his violin in his kitchen late at night, improvising melodies while he pondered complicated problems,” a friend recalled. “Then, suddenly, in the middle of playing, he would announce excitedly, ‘I’ve got it!’ As if by inspiration, the answer to the problem would have come to him in the midst of music.” [xii]

He had an artist’s visual imagination. He could visualize how equations were reflected in realities. As he once declared, “Imagination is more important than knowledge.” [xiii]

He also had a spiritual sense of the wonders that lay beyond science. When a young girl wrote to ask if he was religious, Einstein replied: “Everyone who is seriously involved in the pursuit of science becomes convinced that a spirit is manifest in the laws of the Universe – a spirit vastly superior to that of man, and one in the face of which we with our modest powers must feel humble.” [xiv]  

At age 16, still puzzling over why that compass needle twitched and pointed north, he was studying James Clerk Maxwell’s equations describing electromagnetic fields. If you look at Maxwell’s equations, or if you’re Einstein and you look at Maxwell’s equations, you notice that they decree that an electromagnetic wave, such as a light wave, always travels at the same speed relative to you, no matter if you’re moving really fast toward the source of the light or away from it. Einstein did a thought experiment. Imagine, he wrote, “a person could run after a light wave with the same speed as light.” [xv] Wouldn’t the wave seem stationary relative to this observer? But Maxwell’s equations didn’t allow for that. The disjuncture caused him such anxiety, he recalled, that his palms would sweat. I remember what was causing my palms to sweat at age 16 when I was growing up in New Orleans, and it wasn’t Maxwell’s equations. But that’s why he’s Einstein and I’m not.

He was not an academic superstar. In fact, he was rejected by the second best college in Zurich, the Zurich Polytech. I always wanted to track down the admissions director who rejected Albert Einstein. He finally got in, but when he graduated he couldn’t get a post as a teaching assistant or even as a high school teacher. He finally got a job as a third class examiner in the Swiss patent office.

Among the patent applications Einstein found himself examining were those for devices that synchronized clocks. Switzerland had just gone on standard time zones, and the Swiss, being rather Swiss, deeply cared that when it struck seven in Bern it would strike seven at that exact same instant in Basel or Geneva. The only way to synchronize distant clocks is to send a signal between them, and such a signal, such as a light or radio signal, travels at the speed of light. And you had this patent examiner who was still thinking, What if I caught up with a light beam and rode alongside it?

His imaginative leap – a thought experiment done at his desk in the patent office – was that someone travelling really fast toward one of the clocks would see the timing of the signal’s arrival slightly differently from someone travelling really fast in the other direction. Clocks that looked synchronized to one of them would not look synchronized to the other. From that he made an imaginative leap. The speed of light is always constant, he said. But time is relative, depending on your state of motion. 
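
Einstein’s conclusion can be put quantitatively. As a brief sketch of the standard special-relativity result (the formula is textbook physics, not a passage from the lecture): a clock moving at speed $v$ relative to an observer appears to that observer to run slow by the Lorentz factor $\gamma$:

```latex
% Time dilation: a moving clock's interval \Delta t is stretched,
% as seen by a stationary observer, to \Delta t'.
\Delta t' = \gamma\,\Delta t,
\qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
```

At everyday speeds $v \ll c$, so $\gamma \approx 1$ and the effect is imperceptible; as $v$ approaches the speed of light, $\gamma$ grows without bound, which is why clocks in motion genuinely disagree.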

Now if you don’t fully get it, don’t feel bad. He was still a third-class patent clerk the next year and the year after. He couldn’t get an academic job for three more years. That’s how long it took most of the physics community to comprehend what he was saying.

Einstein’s leap was not just a triumph of the imagination. It also came from questioning accepted wisdom and challenging authority. Every other physicist had read the beginning of Newton’s Principia, where the great man writes: “Absolute, true, and mathematical time, of itself and from its own nature, flows equably without relation to anything external.” Einstein had read that, too, but unlike the others he had asked, How do we know that to be true? How would we test that proposition?

So when we emphasize the need to teach our kids science and math, we should not neglect to encourage them to be imaginative, creative, have an intuitive feel for beauty, and to “Think Different,” as Steve Jobs would say. That’s one role of the humanities.

Einstein had one bad effect on the connection between the humanities and the sciences. His theory of relativity, combined with quantum theory that he also pioneered, made science seem intimidating and complex, beyond the comprehension of ordinary folks, even well-educated humanists. 

For nearly three centuries, the mechanical universe of Newton, based on absolute certainties and laws, had formed the psychological foundation of the Enlightenment and the social order, with a belief in causes and effects, order, even duty. Newton’s mechanics and laws of motion were something everyone could understand. But Einstein conjured up a view of the universe in which space and time were dependent on frames of reference. 

Einstein’s relativity was followed by Bohr’s indeterminacy, Heisenberg’s uncertainty, Gödel's incompleteness, and a bestiary of other unsettling concepts that made science seem spooky. This contributed to what C.P. Snow, in a somewhat overrated essay with one interesting concept, called the split between the two cultures. 

My thesis is that one thing that will help restore the link between the humanities and the sciences is the human-technology symbiosis that has emerged in the digital age.

That brings us to another historical figure, not nearly as famous, but perhaps she should be: Ada Byron, the Countess of Lovelace, often credited with being, in the 1840s, the first computer programmer.

The only legitimate child of the poet Lord Byron, Ada inherited her father’s romantic spirit, a trait that her mother tried to temper by having her tutored in math, as if it were an antidote to poetic imagination. When Ada, at age five, showed a preference for geography, Lady Byron ordered that the subject be replaced by additional arithmetic lessons, and her governess soon proudly reported, “she adds up sums of five or six rows of figures with accuracy.”

Despite these efforts, Ada developed some of her father’s propensities. She had an affair as a young teenager with one of her tutors, and when they were caught and the tutor banished, Ada tried to run away from home to be with him. She was a romantic as well as a rationalist.

The resulting combination produced in Ada a love for what she took to calling “poetical science,” which linked her rebellious imagination to an enchantment with numbers.

For many people, including her father, the rarefied sensibilities of the Romantic Era clashed with the technological excitement of the Industrial Revolution. Lord Byron was a Luddite. Seriously. In his maiden and only speech to the House of Lords, he defended the followers of Ned Ludd who were rampaging against mechanical weaving machines that were putting artisans out of work. But his daughter Ada loved how punch cards instructed those looms to weave beautiful patterns, and she envisioned how this wondrous combination of art and technology could someday be manifest in computers.

Ada’s great strength was her ability to appreciate the beauty of mathematics, something that eludes many people, including some who fancy themselves intellectual. She realized that math was a lovely language, one that describes the harmonies of the universe, and it could be poetic at times.

She became friends with Charles Babbage, a British gentleman-inventor who dreamed up a calculating machine called the Analytical Engine. To give it instructions, he adopted the punch cards that were being used by the looms.

Ada’s love of both poetry and math primed her to see “great beauty” in such a machine. She wrote a set of notes that showed how it could be programmed to do a variety of tasks. One example she chose was how to generate Bernoulli numbers. I’ve explained special relativity already, so I’m not going to take on the task of also explaining Bernoulli numbers except to say that they are an exceedingly complex infinite series that plays a role in number theory. Ada wrote charts for a step by step program, complete with subroutines, to generate such numbers, which is what earned her the title of first programmer.
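
For the curious, the numbers Ada tabulated can be generated today in a few lines. A minimal sketch in Python (the function name and the recurrence convention are my own choices, not Lovelace’s; her Note G used a different indexing and sign convention):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0, with the B_1 = -1/2 convention."""
    B = []
    for m in range(n + 1):
        if m == 0:
            B.append(Fraction(1))
        else:
            # Solve the recurrence for B_m in terms of B_0 .. B_{m-1}.
            B.append(-sum(Fraction(comb(m + 1, k)) * B[k]
                          for k in range(m)) / (m + 1))
    return B

print([str(b) for b in bernoulli(4)])  # ['1', '-1/2', '1/6', '0', '-1/30']
```

Ada’s program did essentially this on paper: a loop, a running store of earlier values, and a rule for combining them — which is why her charts read so recognizably to a modern programmer.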

In her notes Ada propounded two concepts of historic significance.

The first was that a programmable machine like the Analytical Engine could do more than just math. Such machines could process not only numbers but anything that could be notated in symbols, such as words or music or graphical displays. In short, she envisioned what we call a computer.

Her second significant concept was that no matter how versatile a machine became, it still would not be able to think . “The Analytical Engine has no pretensions whatever to originate anything,” she wrote. “It can do whatever we know how to order it to perform… but it has no power of anticipating any analytical relations or truths.” [xvi]

In other words, humans would supply the creativity.

This was in 1842. Flash forward one century.

Alan Turing was a brilliant and tragic English mathematician who helped build the computers that broke the German codes during World War II. He likewise came up with two concepts, both related to those of Lovelace.

The first was a formal description of a universal machine that could perform any logical operation.

Turing’s other concept addressed Lovelace’s contention that machines would never think. He called it “Lady Lovelace’s Objection.” [xvii] He asked, How would we know that? How could we test whether a machine could really think?

His answer got named the Turing Test. Put a machine and a person behind a curtain and feed them both questions, he suggested. If you cannot tell which is which, then it makes no sense to deny that the machine is thinking. This was in 1950, and he predicted that in the subsequent few decades machines would be built that would pass the Turing Test.

From Lovelace and Turing we can define two schools of thought about the relationship between humans and machines.

The Turing approach is that the ultimate goal of powerful computing is artificial intelligence: machines that can think on their own, that can learn and do everything that the human mind can do. Even everything a humanist can do.

The Lovelace approach is that machines will never truly think , and that humans will always provide the creativity and intentionality. The goal of this approach is a partnership between humans and machines, a symbiosis where each side does what it does best. Machines augment rather than replicate and replace human intelligence.

We humanists should root for the triumph of this human-machine partnership strategy, because it preserves the importance of the connection between the humanities and the sciences.

Let’s start, however, by looking at how the pursuit of pure artificial intelligence – machines that can think without us – has fared.

Ever since Mary Shelley conceived Frankenstein during a vacation with Ada’s father, Lord Byron, the prospect that a man-made contraption might have its own thoughts and intentions has been frightening. The Frankenstein motif became a staple of science fiction. A vivid example was Stanley Kubrick’s 1968 movie, 2001: A Space Odyssey , featuring the frighteningly intelligent and intentional computer, Hal.

Artificial intelligence enthusiasts have long been promising, or threatening, that machines like Hal with minds of their own would soon emerge and prove Ada wrong. Such was the premise at the 1956 conference at Dartmouth, organized by John McCarthy and Marvin Minsky, where the field of “artificial intelligence” was launched. The conferees concluded that a breakthrough was about twenty years away. It wasn’t. Decade after decade, new waves of experts have claimed that artificial intelligence was on the visible horizon, perhaps only twenty years away. Yet true artificial intelligence has remained a mirage, always about twenty years away.

John von Neumann, the breathtakingly brilliant Hungarian-born humanist and scientist who helped devise the architecture of modern digital computers, began working on the challenge of artificial intelligence shortly before he died in 1957. He realized that the architecture of computers was fundamentally different from that of the human brain. Computers were digital and binary – they dealt in absolutely precise units – whereas the brain is partly an analog system, which deals with a continuum of possibilities. In other words, a human’s mental process includes many signal pulses and analog waves from different nerves that flow together to produce not just binary yes-no data but also answers such as “maybe” and “probably” and infinite other nuances, including occasional bafflement. Von Neumann suggested that the future of intelligent computing might require abandoning the purely digital approach and creating “mixed procedures” that include a combination of digital and analog methods. [xviii]

In 1958, Cornell professor Frank Rosenblatt published a mathematical approach for creating an artificial neural network like that of the human brain, which he called a “Perceptron.” Using weighted statistical inputs, it could, in theory, process visual data. When the Navy, which was funding the work, unveiled the system, it drew the type of press hype that has accompanied many subsequent artificial intelligence claims. “The Navy revealed the embryo of an electronic computer today that it expects will be able to walk, talk, see, write, reproduce itself and be conscious of its existence,” the New York Times wrote. The New Yorker was equally enthusiastic. “The Perceptron, …as its name implies, is capable of what amounts to original thought,” it reported. [xix]
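
Rosenblatt’s core idea — nudge the weights a little whenever the thresholded output is wrong — can be sketched in a few lines of Python. This is a toy reconstruction, not Rosenblatt’s actual Mark I hardware; the AND-gate data, learning rate, and epoch count are my own choices for illustration:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Rosenblatt's learning rule: adjust weights toward the target
    whenever the thresholded prediction is wrong."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred          # -1, 0, or +1
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# A linearly separable task (logical AND) that a perceptron can learn;
# XOR, famously, is beyond a single perceptron.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
       for (x1, x2), _ in AND])  # [0, 0, 0, 1]
```

The gap between this humble rule and the 1958 headlines about a machine “conscious of its existence” is the gap the rest of this story keeps returning to.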

That was almost sixty years ago. The Perceptron still does not exist. However, almost every year since then there have been breathless reports about some “about-to-be marvel” that would surpass the human brain, many of them using almost the exact same phrases as the 1958 stories about the Perceptron.

Discussion about artificial intelligence flared up a bit after IBM’s Deep Blue, a chess-playing machine, beat world champion Garry Kasparov in 1997 and then Watson, its natural-language question-answering cousin, won at Jeopardy! against champions Brad Rutter and Ken Jennings in 2011. But these were not true breakthroughs of artificial intelligence. Deep Blue won its chess match by brute force; it could evaluate 200 million positions per second and match them against 700,000 past grandmaster games. Deep Blue’s calculations were fundamentally different, most of us would agree, from what we mean by “real” thinking. “Deep Blue was only intelligent the way your programmable alarm clock is intelligent,” Kasparov said. “Not that losing to a $10 million alarm clock made me feel any better.” [xx]
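
Brute-force game search of the Deep Blue kind rests on a simple recursion, minimax: assume each side plays its best reply and score a position by looking ahead. A toy sketch in Python (the three-branch game tree is invented for illustration; Deep Blue added enormous depth, custom chess hardware, pruning, and a grandmaster-tuned evaluation function):

```python
def minimax(node, maximizing):
    """Score a game tree: leaves are integers (position values),
    interior nodes are lists of children; players alternate max/min."""
    if isinstance(node, int):
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Each sublist is a set of opponent replies; the machine (maximizer)
# picks the branch whose worst case is best.
tree = [[3, 12], [2, 4], [14, 1]]
print(minimax(tree, True))  # 3
```

Scaled up to 200 million positions per second, this mechanical look-ahead is what beat Kasparov — which is exactly why he could dismiss it as an expensive alarm clock.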

Likewise, Watson won at Jeopardy! by using megadoses of computing power: It had 200 million pages of information in its four terabytes of storage; the entire Wikipedia accounted for merely 0.2% of that data. It could search the equivalent of a million books per second. It was also rather good at processing colloquial English. Still, no one who watched would bet on it passing the Turing Test. For example, one question was about the “anatomical oddity” of former Olympic gymnast George Eyser. Watson answered, “What is a leg?” The correct answer was that Eyser was missing a leg. The problem was understanding an “oddity,” explained David Ferrucci, who ran the Watson project at IBM. “The computer wouldn’t know that a missing leg is odder than anything else.” [xxi]

Here’s the paradox: Computers can do some of the toughest tasks in the world (assessing billions of possible chess positions, finding correlations in hundreds of Wikipedia-sized information repositories), but they cannot perform some of the tasks that seem most simple to us mere humans. Ask Google a hard question like, “What is the depth of the Red Sea?” and it will instantly respond 7,254 feet, something even your smartest friends don’t know. Ask it an easy one like, “Can an earthworm play basketball?” and it will have no clue, even though a toddler could tell you, after a bit of giggling. [xxii]  

At Applied Minds near Los Angeles, you can get an exciting look at how a robot is being programmed to maneuver, but it soon becomes apparent that it still has trouble navigating across an unfamiliar room, picking up a crayon, or writing its name. A visit to Nuance Communications near Boston shows the wondrous advances in speech recognition technologies that underpin Siri and other systems, but it’s also apparent to anyone using Siri that you still can’t have a truly meaningful conversation with a computer, except in a fantasy movie. A visit to the New York City police command system in Manhattan reveals how computers scan thousands of feeds from surveillance cameras as part of a “Domain Awareness System,” but the system still cannot reliably identify your mother’s face in a crowd.

There is one thing that all of these tasks have in common: even a four-year-old child can do them.

Perhaps in a few more decades there will be machines that think like, or appear to think like, humans. “We are continually looking at the list of things machines cannot do – play chess, drive a car, translate language – and then checking them off the list when machines become capable of these things,” said Tim Berners-Lee, who invented the World Wide Web. “Someday we will get to the end of the list.” [xxiii]  

Someday we may even reach the “singularity,” a term that John von Neumann coined and the science fiction writer Vernor Vinge popularized, which is sometimes used to describe the moment when computers are not only smarter than humans but can also design themselves to be even supersmarter, and will thus no longer need us mere mortals. Vinge says this will occur by 2030. [xxiv]  

On the other hand, this type of artificial intelligence may take a few more generations or even centuries. We can leave that debate to the futurists. Indeed, depending on your definition of consciousness, it may never happen. We can leave that debate to the philosophers and theologians.

There is, however, another possibility: that the partnership between humans and technology will always be more powerful than purely artificial intelligence. Call it the Ada Lovelace approach. Machines would not replace humans, she felt, but instead become their collaborators. What humans – and humanists – would bring to this relationship, she said, was originality and creativity.

The past fifty years have shown that this strategy of combining computer and human capabilities has been far more fruitful than the pursuit of machines that could think on their own.

J.C.R. Licklider, an MIT psychologist who became the foremost father of the Internet (up there with Al Gore), helped chart this course back in 1960. His ideas built on his work designing America’s air defense system, which required an intimate collaboration between humans and machines.

Licklider set forth a vision, in a paper titled “Man-Computer Symbiosis,” that has been pursued to this day: “Human brains and computing machines will be coupled together very tightly, and the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.” [xxv]

Licklider’s approach was given a friendly face by a computer systems pioneer named Doug Engelbart, who in 1968 demonstrated a networked computer with an interface involving a graphical display and a mouse. In a manifesto titled “Augmenting Human Intellect,” he echoed Licklider. The goal, Engelbart wrote, should be to create “an integrated domain where hunches, cut-and-try, intangibles, and the human ‘feel for a situation’ usefully coexist with… high-powered electronic aids.” [xxvi]

Richard Brautigan, a poet based at Caltech for a while, expressed that dream a bit more lyrically in his poem “All Watched Over by Machines of Loving Grace.” It extolled “a cybernetic meadow / where mammals and computers / live together in mutually / programming harmony.” [xxvii]

The teams that built Deep Blue and Watson later adopted this symbiosis approach, rather than pursuing the objective of the artificial intelligence purists. “The goal is not to replicate human brains,” said John E. Kelly, the director of IBM Research. “This isn’t about replacing human thinking with machine thinking. Rather, in the era of cognitive systems, humans and machines will collaborate to produce better results, each bringing their own superior skills to the partnership.” [xxviii]

An example of the power of this human-machine symbiosis arose from a realization that struck Kasparov after he was beaten by Deep Blue. Even in a rule-defined game such as chess, he came to believe, “what computers are good at is where humans are weak, and vice versa.” That gave him an idea for an experiment. “What if instead of human versus machine we played as partners?” 

This type of tournament was held in 2005. Players could work in teams with computers of their choice. There was a substantial prize, so many grandmasters and advanced computers joined the fray. But neither the best grandmaster nor the most powerful computer won; symbiosis did. The final winner was two American amateurs who used three computers at the same time and knew how to manage the process of collaborating with their machines. “Their skill at manipulating and ‘coaching’ their computers to look very deeply into positions effectively counteracted the superior chess understanding of their grandmaster opponents and the greater computational power of other participants,” according to Kasparov. [xxix] In other words, the future might belong to those who best know how to partner and collaborate with computers.

In a similar way, IBM decided that the best use of Watson, the Jeopardy!-playing computer, would be for it to collaborate with humans, rather than try to top them. One project involved reconfiguring the machine to work in partnership with doctors on cancer diagnoses and treatment plans. The Watson system was fed more than two million pages from medical journals and 600,000 pieces of clinical evidence, and it could search up to 1.5 million patient records. When a doctor put in a patient’s symptoms and vital information, the computer provided a list of recommendations ranked in order of its level of confidence. [xxx]
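The ranked-list step described above can be pictured with a minimal sketch, assuming each candidate treatment carries a confidence score. The treatment names and numbers are invented for illustration; this is not IBM’s actual pipeline:

```python
# Minimal sketch of confidence-ranked recommendations.
# Each candidate treatment carries a confidence score in [0, 1];
# the list is returned sorted from most to least confident.
# All names and numbers are invented for illustration.

def rank_recommendations(candidates: dict[str, float]) -> list[tuple[str, float]]:
    """Sort candidate treatments by descending confidence."""
    return sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)

scored = {
    "treatment A": 0.62,
    "treatment B": 0.87,
    "treatment C": 0.31,
}

for name, confidence in rank_recommendations(scored):
    print(f"{name}: {confidence:.0%} confidence")  # most confident option first
```

Presenting the scores alongside the ranking, rather than a single flat answer, is what lets the doctor treat the output as a colleague’s suggestion instead of a verdict.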

In order to be useful, the IBM team realized, the machine needed to interact with human doctors in a humane way – a manner that made collaboration pleasant. David McQueen, the Vice President of Software at IBM Research, described programming a pretense of humility into the machine. “We reprogrammed our system to come across as humble and say, ‘here’s the percentage likelihood that this is useful to you, and here you can look for yourself.’” Doctors were delighted, saying that it felt like a conversation with a knowledgeable colleague. “We aim to combine human talents, such as our intuition, with the strengths of a machine, such as its infinite breadth,” said McQueen. “That combination is magic, because each offers a piece that the other one doesn’t have.” [xxxi]

This belief that machines and humans will get smarter together, playing to each other’s strengths and shoring up each other’s weaknesses, raises an interesting prospect: perhaps no matter how fast computers progress, artificial intelligence may never outstrip the intelligence of the human-machine partnership.

Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: it appears to feel and perceive emotions, appreciate beauty, create art, and have its own desires. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to “originate” its own thoughts that go beyond what we humans program it to do.

There would, however, still be another hurdle before we could say that artificial intelligence has triumphed over human-technology partnership. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence. Instead, it would ask whether the machine accomplishes these tasks better when whirring away completely on its own, or whether it does them better when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will indefinitely be more powerful than an artificial intelligence machine working alone?

If so, then “man-machine symbiosis,” as Licklider called it, will remain triumphant. Artificial intelligence need not be the holy grail of computing. The goal instead could be to find ways to optimize the collaboration between human and machine capabilities – to let the machines do what they do best and have them let us do what we do best. 

If this human-machine symbiosis turns out to be the wave of the future, then it will place a premium on those who can stand at the intersection of the humanities and the sciences. That interface will be the critical juncture. The future will belong to those who can appreciate both human emotions and technology’s capabilities.

This will require more than a feel for science, technology, engineering, and math alone. It will also depend on those who understand aesthetics, human emotions, the arts, and the humanities.

Let’s look at two of the most brilliant contemporary innovators who understood the intersection of humans and technology: Alan Kay of Xerox PARC and Steve Jobs of Apple.

Alan Kay’s father was a physiology professor and his mother was a musician. “Since my father was a scientist and my mother was an artist, the atmosphere during my early years was full of many kinds of ideas and ways to express them,” he recalled. “I did not distinguish between ‘art’ and ‘science’ and still don’t.” He went to graduate school at the University of Utah, which then had one of the best computer graphics programs in the world. He became a fan of Doug Engelbart’s work and came up with the idea for a Dynabook, a simple and portable computer, “for children of all ages,” with a graphical interface featuring icons that you could point to and click. In other words, something resembling a MacBook Air or an iPad, thirty years ahead of its time. He went to work at Xerox PARC, where a lot of these concepts were developed.

Steve Jobs was blown away by these ideas when he saw them on visits to Xerox PARC, and he was the one who turned them into a reality with his team at Apple. As noted earlier, Jobs’s core belief was that the creativity of the new age of technology would come from those who stood at the intersection of the humanities and the sciences. He went to a very creative liberal arts college, Reed, and even after dropping out hung around to take courses like calligraphy and dance. He combined his love of beautiful lettering with his appreciation for the bit-mapped screen displays engineered at Xerox PARC, which allowed each and every pixel on the screen to be controlled by the computer. This led to the delightful array of fonts and displays he built into the first Macintosh and which we now can enjoy on every computer.

More broadly, Jobs was a genius in understanding how people would relate to their screens and devices. He understood the emotion, beauty, and simplicity that make for a great human-machine interface. And he ingrained that passion and intuition into Apple, which under Tim Cook and Jony Ive continues to startle us with designs that are profound in their simplicity.

Alan Kay and Steve Jobs are refutations of an editorial that appeared a few months ago in the Harvard Crimson, titled “Let Them Eat Code,” which poked fun at humanities-lovers who decried the emphasis on engineering and science education. The Crimson wrote:

   We’re not especially sorry to see the English majors go. Increased mechanization and digitization necessitates an increased number of engineers and programmers. Humanities apologists should be able to appreciate this. It’s true that fewer humanities majors will mean fewer credentialed literary theorists and hermeneutic circles. But the complement—an increased number of students pursuing degrees in science, technology, engineering, and math—will mean a greater probability of breakthroughs in research. We refuse to rue a development that has advances in things like medicine, technological efficiency, and environmental sustainability as its natural consequence. To those who are upset with the trend, we say: Let them eat code. [xxxii]

Let me remind the Crimson editors that Bill Gates, who focused relentlessly on applied math and engineering when he was at Harvard, produced a music player called the Zune. Steve Jobs, who studied dance and calligraphy and literature at Reed, produced the iPod.

I hasten to add that I deeply admire Bill Gates as a brilliant software engineer, business pioneer, philanthropist, moral person, and (yes) humanist in the best sense. But there may be just a tiny bit of truth to Steve Jobs’s assertion about Gates: “He’d be a broader guy if he had dropped acid once or gone off to an ashram when he was younger.” At the very least, his engineering skills may have benefited a bit if he had taken a few more humanities courses at Harvard.

That leads to a final lesson, one that takes us back to Ada Lovelace. In our symbiosis with machines, we humans have brought one crucial element to the partnership: creativity. “The machines will be more rational and analytic,” IBM’s research director Kelly has said. “People will provide judgment, intuition, empathy, a moral compass, and human creativity.”

We humans can remain relevant in an era of cognitive computing because we are able to “Think Different,” something that an algorithm, almost by definition, can’t master. We possess an imagination that, as Ada said, “brings together … ideas and conceptions in new, original, endless, ever-varying combinations.” We discern patterns and appreciate their beauty. We weave information into narratives. We are storytelling animals. We have a moral sense.

Human creativity involves values, aesthetic judgments, social emotions, personal consciousness, and yes, a moral sense. These are what the arts and humanities teach us – and why those realms are as valuable to our education as science, technology, engineering, and math. If we humans are to uphold our end of the man-machine symbiosis, if we are to retain our role as partners with our machines, we must continue to nurture the humanities, the wellsprings of our creativity. That is what we bring to the party.

“I have come to regard a commitment to the humanities as nothing less than an act of intellectual defiance, of cultural dissidence,” the New Republic literary editor Leon Wieseltier told students at Brandeis a year ago. “You had the effrontery to choose interpretation over calculation, and to recognize that calculation cannot provide an accurate picture, or a profound picture, or a whole picture, of self-interpreting beings such as ourselves. There is no greater bulwark against the twittering acceleration of American consciousness than the encounter with a work of art, and the experience of a text or an image.” [xxxiii]

But enough singing to the choir. No more nodding amen. Allow me to deviate from storytelling, for just a moment, to preach the fourth part of the traditional five-part Puritan sermon, the passages that provide a bit of discomfort, perhaps even some fire and brimstone about us sinners in the hands of an angry God.

The counterpart to my paean to the humanities is also true. People who love the arts and humanities should endeavor to appreciate the beauties of math and physics, just as Ada Lovelace did. Otherwise, they will be left as bystanders at the intersection of arts and science where most digital-age creativity will occur. They will surrender control of that territory to the engineers.

Many people who extol the arts and the humanities, who applaud vigorously the paeans to their importance in our schools, will proclaim without shame (and sometimes even joke) that they don’t understand math or physics. They would consider people who don’t know Hamlet from Macbeth to be uncultured, yet they might merrily admit that they don’t know the difference between a gene and a chromosome, or a transistor and a diode, or an integral and a differential equation. These things may seem hard. Yes, but so, too, is Hamlet. And like Hamlet, each of these concepts is beautiful. Like an elegant mathematical equation, they are brushstrokes of the glories of the universe.

Trust me, our patron Thomas Jefferson, and his mentor Benjamin Franklin, would regard as a Philistine anyone who felt smug about not understanding math or complacent about not appreciating science.

Those who thrived in the technology revolution were people in the tradition of Ada Lovelace, who saw the beauty of both the arts and the sciences. Combining the poetical imagination of her father Byron with the mathematical imagination of her mentor Babbage, she became a patroness saint of our digital age. 

The next phase of the digital revolution will bring a true fusion of technology with the creative industries, such as media, fashion, music, entertainment, education, and the arts. Until now, much of the innovation has involved pouring old wine – books, newspapers, opinion pieces, journals, songs, television shows, movies – into new digital bottles. But the interplay between technology and the creative arts will eventually result in completely new formats of media and forms of expression. Innovation will come from being able to link beauty to technology, human emotions to networks, and poetry to processors.

The people who will thrive in this future will be those who, as Steve Jobs put it, “get excited by both the humanities and technology.” In other words, they will be the spiritual heirs of Ada Lovelace, people who can connect the arts to the sciences and have a rebellious sense of wonder that opens them to the beauty of both. 

[i] Walker Percy interview, The Paris Review , Summer 1987. This draws from a piece I wrote on Percy in American Sketches (Simon and Schuster, 2009).

[ii] Walker Percy, “The Fateful Rift: The San Andreas Fault in the Modern Mind,” Jefferson Lecture, May 3, 1989.

[iii] Author’s interview with Steve Jobs.

[iv] Benjamin Franklin to Peter Collinson, Apr. 29, 1749 and Feb. 4, 1750.

[v] Benjamin Franklin to John Lining, March 18, 1755.

[vi] Thomas Jefferson to Benjamin Franklin, June 21, 1776. This draws from my Benjamin Franklin: An American Life (Simon and Schuster, 2003).

[vii] See, Pauline Maier, American Scripture (New York: Knopf, 1997); Garry Wills, Inventing America , (Garden City: Doubleday, 1978); and Carl Becker, The Declaration of Independence (New York: Random House 1922).

[viii] Albert Einstein to Sybille Blinoff, May 21, 1954. This draws from my Einstein: His Life and Universe (Simon and Schuster, 2007).

[ix] Peter Bucky, The Private Albert Einstein , (Andrews Macmeel, 1993), 156.

[x] Albert Einstein to Hans Albert Einstein, Jan. 8, 1917.

[xi] Hans Albert Einstein interview in Gerald Whitrow, Einstein: The Man and His Achievement (London: BBC, 1967), 21.

[xii] Bucky, 148.

[xiii] George Sylvester Viereck, Glimpses of the Great (New York: Macaulay, 1930), 377. (First published as “What Life Means to Einstein,” Saturday Evening Post, October 26, 1929.)

[xiv] Einstein to Phyllis Wright, Jan. 24, 1936.

[xv] Albert Einstein, “Autobiographical Notes,” in Paul Arthur Schilpp, ed. Albert Einstein: Philosopher-Scientist (La Salle, Ill.: Open Court Press, 1949), 53.

[xvi] Ada, Countess of Lovelace, “Notes on Sketch of The Analytical Engine,” October, 1842.

[xvii] Alan Turing, “Computing Machinery and Intelligence,” Mind , October 1950.

[xviii] John von Neumann, The Computer and the Brain (Yale, 1958), 80.

[xix] Gary Marcus, “Hyping Artificial Intelligence, Yet Again,” New Yorker,  Jan. 1, 2014, citing: “New Navy Device Learns by Doing”, (UPI wire story) New York Times, July 8, 1958; “Rival”, The New Yorker, Dec. 6, 1958.

[xx] Garry Kasparov, “The Chess Master and the Computer,” The New York Review of Books, Feb. 11, 2010; Clive Thompson, Smarter Than You Think (Penguin, 2013), 3.

[xxi] “Watson on Jeopardy,” IBM’s “Smarter Planet” website, Feb. 14, 2011.

[xxii] Gary Marcus, “Why Can’t My Computer Understand Me,” New Yorker, Aug. 16, 2013, coined the example I adapted.

[xxiii] Author’s interview with Tim Berners-Lee.

[xxiv] Vernor Vinge, “The Coming Technological Singularity,” Whole Earth Review, Winter 1993.

[xxv] J.C.R. Licklider, “Man-Computer Symbiosis,” IRE Transactions on Human Factors in Electronics, March 1960.

[xxvi] Douglas Engelbart, “Augmenting Human Intellect,” Prepared for the Director of Information Sciences, Air Force Office of Scientific Research,  October 1962.

[xxvii] First published in limited distribution by The Communication Company, San Francisco, 1967.

[xxviii] Kelly and Hamm, 7.

[xxix] Kasparov, “The Chess Master and the Computer.”

[xxx]  “Why Cognitive Systems?” IBM Research website, http://www.research.ibm.com/cognitive-computing/why-cognitive-systems.shtml.

[xxxi] Author’s interview.

[xxxii] “Let Them Eat Code,” the Harvard Crimson , Nov. 8, 2013.

[xxxiii] Leon Wieseltier, “Perhaps Culture Is Now the Counterculture: A Defense of the Humanities,” The New Republic, May 28, 2013.

May/June 2014


Book review: 'Steve Jobs' by Walter Isaacson


By Laura June


Walter Isaacson’s biography of  Steve Jobs  is in some ways another product created from the mind of its subject. Though Jobs was insistent that he wouldn’t interfere with the writing of the book (and in fact he seems not to have read any part of it), he hand-picked Isaacson to lay down his legacy for all to see. Why he chose him is not surprising: Isaacson’s biographies of Benjamin Franklin and Albert Einstein are engrossing, epic, and readable studies of men who changed history. That Steve Jobs saw himself in this light (and such august company) is neither shocking nor unjustified. And while Isaacson never shies away from Jobs’s often vitriolic temper (and indeed he sometimes seems to dwell on it to make his point), it is clear that in some respects, Steve Jobs is a book told through the often discussed "reality distortion field" of Steve Jobs himself: though other opinions or sides to a story are presented, Steve always has the last, blunt word.

The unprecedented access to Jobs, and his blessing to interview those close to him, presents the reader with a vast and exceedingly complex — but also incredibly consistent — portrait of the man who created Apple and some of the most important technology products of this century. In many ways, the Jobs of the early ’80s at the outset of his breathtaking career is the same feisty and impetuous man we find at the end of the book, picking apart his plans to build a yacht that he knew he would likely never see to completion. Jobs, at least according to this tale, didn’t evolve so much as he forced the world around him to do so. Isaacson’s mastery of the form is evident throughout, and he weaves the tale of Jobs’s life deftly.

For technology enthusiasts and those who followed Steve Jobs’s life as though he were Bob Dylan, the biography reinforces the previously known timeline. Jobs’s own admission early in the process with Isaacson that he didn’t "have any skeletons" in his "closet that can’t be allowed out" is largely true (Isaacson, xx). There are no shocking revelations, but the nuance brought to the events by the wide array of characters Isaacson spent time with, and Jobs’s candid and original perspective, never fail to bring well-known events into sharp and personal focus. One example, well documented in the media at the time and given several pages of attention in the book, is the issue of the iPhone 4’s antenna problems. The story, as told in the book, is significant for a few reasons. First, the book reveals that the band of steel around the edge of the phone was never a big hit with Apple’s engineers, who warned that it could cause reception problems. But Apple’s SVP of Industrial Design Jonathan Ive and Steve Jobs, living deep in the "reality distortion field" which is repeatedly referred to in the book (and which Jobs’s wife more strikingly terms "magical thinking"), insisted that the engineers could figure out how to make it work, to the point that they (Ive and Jobs) even resisted putting a clear coating of varnish on the band to make problems less likely. Second, when problems did, in fact, arise, the book makes clear how personally Jobs took the entire situation, going so far as to adamantly suggest that Apple simply ignore the issue, because in his mind there was no problem, saying, "Fuck this, it’s not worth it" (Isaacson, 521). Only when Tim Cook implored him to face facts did Jobs decide to hold a press conference and offer solutions.

Likewise, it is almost amusing and even a bit sad to read of Jobs’s depression and anger on the evening following the debut of the iPad. Isaacson was, by then, somewhat embedded in the Jobs household, and he notes that "as we gathered in his kitchen for dinner, he paced around the table calling up emails and web pages on his iPhone." Jobs told him, "I got about eight hundred email messages in the last twenty-four hours. Most of them are complaining. There’s no USB cord! There’s no this, no that. Some of them are like, ‘Fuck you, how can you do that?’ I don’t usually write people back, but I replied, ‘Your parents would be so proud of how you turned out.’ And some don’t like the iPad name, and on and on. I kind of got depressed today. It knocks you back a bit" (Isaacson, 495). In this and every previous or future launch, Jobs took the products, and their reception, very personally. In every phase of development, from inception to advertisements, he was a dictator, and, as the book underlines quite clearly, people who reacted badly or were underwhelmed simply didn’t get it. The book is rife with such personal perspectives of what are hallowed occurrences in the timeline of Jobs and Apple.

Jobs’s many achievements are tallied in detail, and while they are well known — the Macintosh, Pixar, the iMac, the iPhone, the iPad — it was previously only assumed that Jobs was closely involved. Now all of his interactions with Apple’s products are truly exposed, in great, painstaking detail. That Jobs was exhaustively involved from beginning to end in the creation of these products and companies — even during the years in which he was gravely ill — is a testament to his work ethic, his creativity, and his genius. While Steve Jobs never shies away from turning a critical eye on its subject, it rightfully gives much credit to Jobs where it is due. People have long pointed out that Jobs could be an "asshole," and while the book never outright denies such a description, the sheer volume of his achievements and creations often puts the erratic and childish behavior into soft focus. In fact, the book seems to suggest that Jobs’s fantastic career was born out of his harsh, demanding attitude, rather than in spite of it. "I don’t think I run roughshod over people," Jobs told Isaacson, "but if something sucks, I tell people to their face. It’s my job to be honest. I know what I’m talking about, and I usually turn out to be right. That’s the culture I tried to create. We are brutally honest with each other, and anyone can tell me they think I am full of shit and I can tell them the same" (Isaacson, 568-569). Rather than exposing Jobs as an "asshole," the biography presents, front to back, a human being who was essentially incapable of being phony, even if being phony would have made him appear better to others.

The book also emphasizes, in anecdotes that probably aren’t totally surprising, Jobs’s belief, from the beginning of his career to the end of it, that everything should be (and was, if possible) in his control. This meant not just making hardware and software into a closed ecosystem, but also controlling what could be done with the actual products once purchased. The stubborn surety that he knew what was right for himself and everyone else famously resulted in Macs and iPhones that were hard to open up and hack (Apple even added special screws to the latter to make it more difficult), and in the fact that the iPad wouldn’t display Flash. It also resulted, however, in Jobs stubbornly and often refusing to eat (even when sick), in a belief that being vegan meant he didn’t have to shower, and in his refusal, for nine months in 2003, to let his doctors remove the cancerous tumor on his pancreas.

Jobs’s managerial style (or lack of one) had been previously well documented after his ouster from Apple, but the biography is probably at its harshest when describing his various working relationships with other people. We are presented with personal accounts of a well-known volatility that is increasingly shocking, sometimes delusional, and always, in the mind of its subject, justified. One of the true revelations of the book is that Steve Jobs cried — a lot, and in the presence of his co-workers. From the earliest days of his career, when he cried to Steve Wozniak’s father Jerry about getting Woz to come work at Apple full time, he broke down in tears regularly when frustrated, when cornered, when happy or touched, and when angry. Though his return to Apple did seem to bring some temperance and evenness to his management efforts, Jobs never stopped openly crying when emotion overwhelmed him.

The sections where Bill Gates — who was sometimes an insider and sometimes not — weighs in are variously the most touching, sometimes the most interesting, and often do the most to underline the great chasm of difference between the two personalities. While Jobs avoids branding him with his favorite and oft-used title "bozo," Gates, in this tale, truly doesn’t get it a lot of the time, but he gets that he doesn’t get it. On the success of the iPad, Gates tells Isaacson, "Here I am, merely saving the world from malaria and that sort of thing, and Steve is still coming up with amazing new products," adding, "Maybe I should have stayed in that game" (Isaacson, 553).

Throughout the book, Jobs is incredibly and sometimes amusingly cutting about various friends, former colleagues, business associates, and even celebrities. Many people, in his view (including but not limited to John Mayer, President Obama, Google, and Rupert Murdoch), were constantly "blowing it." He makes it clear that grudges held could often be permanent. When speaking of Jon Rubinstein, a former Apple executive who helped give birth to the iPod and was then head of Palm, Jobs admits to having emailed Bono, a Palm investor, to complain when the company began trying to make an iPhone competitor. Bono replied that Jobs’s remarks were akin to "the Beatles ringing up because Herman and the Hermits have taken one of their road crew" (Isaacson, 459). "The fact that they [Palm] completely failed salves that wound," Jobs says (Isaacson, 460).

Jobs’s perspective that certain things "sucked" could often be influenced by other factors. For example, it’s hard to tell whether Jobs truly thought that Android was "crap," or whether he said it because he was embroiled in a lengthy battle against Google over patent infringement. What emerges from the Android discussion, however, is that Jobs passionately believed that it was a stolen product. Isaacson was with Jobs the week Apple filed its lawsuit against Google, when Jobs was the "angriest he’d ever seen him."

"Our lawsuit is saying, ‘Google, you fucking ripped off the iPhone, wholesale ripped us off.’ Grand theft. I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong. I’m going to destroy Android, because it’s a stolen product. I’m willing to go to thermonuclear war on this. They are scared to death, because they know they are guilty. Outside of Search, Google’s products — Android, Google Docs — are shit" (Isaacson, 511-512). In fact, few of the people and companies Jobs sets his sights on manage to cut the mustard on any level. Notable exceptions are the Beatles (whom Jobs talks about at length in one of the most insightful sections of the book), his wife Laurene, and Jony Ive.

Though none of the Beatles weigh in on Jobs, both Laurene and Ive do, and Ive in particular seems to grapple with Jobs’s personality, telling Isaacson, "He’s a very, very sensitive guy. That’s one of the things that make his antisocial behavior, his rudeness, so unconscionable" (Isaacson, 462). Ive is significant to the book in other ways, as Jobs’s main creative brother-in-arms, and, as the story progresses, it is clear that both men struggled with the idea of a post-Jobs Apple. For nearly the entire latter half of the book, and much of Jobs’s "phase two" at Apple, his health was a near constant concern for those closest to him, and Ive was in that inner circle. When Jobs returned from a two-month stay in Memphis in May 2009 following his liver transplant, Ive and Cook were there to meet him and his wife on the tarmac. Ive was "devastated" and felt "underappreciated" amid media stories questioning the ability of Apple to innovate without Jobs, while Jobs was somewhat miffed at Cook’s earnings call, where Cook suggested that Apple could do just that. "He didn’t know whether to be proud or hurt that it might be true," Isaacson writes. "There was talk that he might step aside and become chairman rather than CEO. That made him all the more motivated to get out of his bed, overcome the pain, and start taking his restorative walks again" (Isaacson, 488). The book thus is oddly positioned in that its subject, near the end of the story, is well aware that he is very likely near the end of his career, and indeed, he tells Isaacson at their last meeting, "I’ve done all that I can do" (Isaacson, 559).

In that respect, Jobs the man is consistent throughout, expressing little regret or dissatisfaction with himself, except for his repeated wish that he had spent more time with his children, who, he says, were his main reason for cooperating with and encouraging a biography at all. In a world where people and media will pay actual money for one glimpse of a dying and frail CEO, Steve Jobs will not be the final book on the man, but it will be the only one told largely in his words, and the only one in which he had the final say on its cover. All the other books will no doubt be written by bozos who blow it.

'Steve Jobs': An apt portrait of a jerk and a genius

Book review: Walter Isaacson's biography is a compelling book that, like Jobs himself, doesn't sugarcoat the Apple leader's history. But in it, Jobs does get the last word.

  • Shankland covered the tech industry for more than 25 years and was a science writer for five years before that. He has deep expertise in microprocessors, digital photography, computer hardware and software, internet standards, web technology, and more.

Amid the choking fumes from the Apple flame wars, Walter Isaacson's biography of Steve Jobs comes as a breath of fresh air.

Jobs, along with the bold company he built, gets people's blood boiling with loyalty and with loathing. Vitriol often is the chief characteristic of debates between fans of Macs and of Windows PCs, between fans of iOS and of Android.

Isaacson, though, has done an admirable job navigating the minefields with his biography, simply titled "Steve Jobs." The result is a book that, although not perfect, is a reliable and captivating guide to a man who reshaped the computing industry and more.

It helps that Jobs' life is packed with drama. And it helps that, when Jobs died of cancer this month at the relatively young age of 56, Apple found itself at the height of its power.

There was a risk, as an authorized biography, that the book could have been tame, but it's not. Jobs himself urged Isaacson to write it and, after initial "skittishness," encouraged those he's known to open up to the writer. Ultimately, just as Jobs told Isaacson that "my job is to say when something sucks rather than sugarcoat it," Isaacson has presented an unvarnished view of Jobs. That means we get to hear about the employees he treated harshly, the management incompetence that defeated some of his dreams and got him ejected from Apple, and the first daughter he largely abandoned for years.

It would have been impossible, of course, to overlook Jobs' temper, his impatience, his brutal treatment of co-workers, his callous treatment of his first child, and his unforgiving separation of the world's population into A-team gods in the one corner and shitheads and bozos in the other.

Isaacson, while opining that the "nasty edge to his personality was not necessary," more often presents Jobs' harshness as effective. "Dozens of the colleagues whom Jobs most abused ended their litany of horror stories by saying that he got them to do things they never dreamed possible," Isaacson wrote.

It's not clear whether Jobs could have left a legacy that was more humanitarian--a sequel, perhaps, to the HP Way that Bill Hewlett and Dave Packard established at Hewlett-Packard, a company that Jobs admired. It is clear, though, that Jobs couldn't be bothered to behave otherwise.

"This is who I am, and you can't expect me to be someone I'm not," Jobs told Isaacson.

Jobs' primary legacy certainly will be Apple and its products, just as Mozart's difficult personality has faded as his music lives on. But as his pursuit of Isaacson demonstrates, Jobs cared deeply about how history sees him. His battles with cancer led him to an articulate awareness of his own mortality, and it appears that Jobs made the calculation that an independent but authorized biography would be better than writing his memoirs.

"Jobs surprised me by readily acknowledging that he would have no control over it or even the right to see it in advance," Isaacson said of the biography, which he began in earnest in 2009.

Through the course of more than 40 interviews with the author, Jobs' famed "reality distortion field" presumably was in effect, but Isaacson also interviewed more than 100 others--and Jobs certainly has made plenty of enemies over the decades.

Praise for an industry titan

Isaacson probably is more right than wrong to conclude that history will place Jobs in the "pantheon right next to Edison and Ford," but I fear Isaacson gives the infamously micromanagerial Jobs a bit too much credit for developing products himself.

Clearly he was a profoundly hands-on executive, from the earliest days designing the user interface for his and Steve Wozniak's Blue Boxes for phone hacking to the latter years with iPhones and iPads. And certainly Jobs' leadership was an essential ingredient in the company's present success. But it's hard to assess the true effect of a legion of foot soldiers.

Isaacson nibbles at the issue. He offers anecdotes from employees who saw Jobs reject their ideas one day and present them as his own the next. He quotes Apple lead designer Jonathan Ive as saying, "I pay maniacal attention to where an idea comes from, and I even keep notebooks filled with my ideas. So it hurts when he takes credit for one of my designs." Perhaps it's the nature of biographies, which place a single person at the center of an episode of a historical narrative, to overemphasize one person's importance.

And every now and again, a little bit of the Apple fanboy creeps into Isaacson's view. It's true that Apple has helped improve user interfaces of digital devices, but inscrutable error messages and crashes are hardly unique to Windows. And when Isaacson declares that Jobs "launched a series of products over three decades that transformed whole industries," rightly listing several such as the Apple II, Macintosh, iPod, iPhone, and the App Store for iOS software, he prematurely includes iCloud in the list, too.

iCloud has only begun to hit the market, and Google--often with the very Android products Jobs castigates--has shown a greater ability to transform the world through cloud computing. iCloud shows promise for keeping devices in sync, but when it comes to the deep integration of the Internet into computing, Apple so far hasn't been driving the industry. Google Docs, for all its warts, shows more signs of shaking up the Microsoft Office status quo than Apple's alternatives. And it's Google Maps on which so much of the iPhone's location smarts is built.

Tempered by reality

But in the scheme of things, this criticism is secondary. Isaacson--a seasoned author who was managing editor of Time and who has written biographies of Albert Einstein and Benjamin Franklin--didn't write a hagiography. He praises Jobs for his accomplishments, but he also brings up Jobs' mistaken view as a young man that being a fruitarian would neutralize his body odor and allow him to bathe but once a week.

Indeed, sometimes it can be painful reading about Jobs' behavior. Nowhere is this more true than in his handling of his first child, Lisa Brennan-Jobs, the daughter of one-time girlfriend Chrisann Brennan.

"At times he was able to distort reality not just for others but even for himself. In the case of Brennan's pregnancy, he simply shut it out of his mind," Isaacson writes. Jobs apparently tried to take some of his own medicine later--the unvarnished truth--but his rocky history with his first daughter showed it to be a lifelong struggle. "I wish I had handled it differently. I could not see myself as a father then, so I didn't face up to it....I tried to do the right thing. But if I could do it over, I would do a better job."

Brennan-Jobs lived with her father for four years after her school warned that things were bad with her mother. And Chrisann Brennan would walk over to Jobs' house and yell from the yard. But Brennan saw Jobs as having some responsibility for that behavior and for the problems that led to their daughter moving in with him:

"Do you know how Steve was able to get the city of Woodside to allow him to tear his Woodside home down? There was a community of people who wanted to preserve his Woodside house due to its historical value, but Steve wanted to tear it down and build a home with an orchard. Steve let that house fall into so much disrepair and decay over a number of years that there was no way to save it. The strategy he used to get what he wanted was to simply follow the line of least involvement and resistance. So by his doing nothing on the house, and maybe even leaving the windows open for years, the house fell apart. Brilliant, no?...In a similar way did Steve work to undermine my effectiveness AND my well being at the time when Lisa was 13 and 14 to get her to move into his house. He started with one strategy but then it moved to another easier one that was even more destructive to me and more problematic for Lisa. It may not have been of the greatest integrity, but he got what he wanted."

Fortunately, Isaacson treats these prickly issues with cool dispassion, neither shrinking from them nor apologizing for Jobs' behavior.

He also accurately assesses the difficulties Jobs must have had reconciling his business success with the Dylan-loving, 1960s-era rebelliousness and affinity for counterculture: "He refused such trappings as having a 'Reserved for CEO' spot, but he assumed for himself the right to park in the handicapped spaces. He wanted to be seen (both by himself and by others) as someone willing to work for $1 a year, but he also wanted to have huge stock grants bestowed upon him. Jangling inside him were the contradictions of a counterculture rebel turned business entrepreneur, someone who wanted to believe that he had turned on and tuned in without having sold out and cashed in."

It's an apt assessment of a man who probably saw the world and himself as more straightforward than either really were. But ultimately, perhaps the fact that the reality distortion field worked on Steve Jobs, too, is what gave him the power to carve out such a position of influence in industry and history.

Jobs got an opportunity to reshape the computing industry. With Apple, he made the most of it--twice--and with Walter Isaacson, he took that opportunity again. Isaacson presents Jobs as he was, but Jobs gets his "One More Thing" moment too, in the form of 1,493 words written shortly before his death, the closing words of the book. It summarizes his view of leadership, innovation, and changing the world. Countless people will read it.

All most of us get when we die is a last will and testament.

Disclosure: "Steve Jobs" is published by Simon & Schuster, which like CNET is owned by CBS.

About the author

Walter Isaacson

Walter Isaacson is the bestselling author of biographies of Jennifer Doudna, Leonardo da Vinci, Steve Jobs, Benjamin Franklin, and Albert Einstein. He is a professor of history at Tulane and was CEO of the Aspen Institute, chair of CNN, and editor of  Time . He was awarded the National Humanities Medal in 2023. Visit him at Isaacson.Tulane.edu.

‘Steve Jobs’ by Walter Isaacson

New biography details the two sides of Apple creator Steve Jobs

In a new biography, author Walter Isaacson describes Steve Jobs as selfish and arrogant as well as the creative genius behind Apple’s successful products.

Life often reduced Steve Jobs to tears. But he rarely suffered alone. The cofounder of Apple Inc. spread his unhappiness like a virus, abusing his friends, neglecting his family, insulting and reviling his colleagues. And almost to a person, they loved Jobs to the end.

It’s a neat trick, inspiring such extremes of loyalty and dread. Jobs carried it off, while endearing himself to the millions who bought his company’s products. It didn’t hurt that those products were magical - computers and smartphones, tablets and music players, all designed to a standard of elegance and efficiency no one else could match. Jobs must have figured that excellence covers a multitude of sins, and maybe he was right.

Advertisement

You can judge for yourself. With death on the horizon, Jobs asked veteran journalist Walter Isaacson to tell his side of the story. I’m not sure Jobs would consider Isaacson’s book a complete success. Workmanlike and efficient, it’s as thorough an overview of the man’s life as anyone could wish, beginning with the stories of his biological and adoptive parents, through his youth in the burgeoning Silicon Valley, his fateful friendship with Apple cofounder Steve Wozniak, and the wild swings of his three-decade career. Still, the prose hardly evokes the graceful sense of fun that pervades Jobs’s creations. The book reads as if it were written on a Windows PC, not a Macintosh.

With death on the horizon, Jobs asked veteran journalist Walter Isaacson (pictured) to tell his side of the story.

Never mind. Isaacson is a sympathetic but unsparing biographer. In his hands, Jobs’s story contains more than enough dark magic to keep the pages turning.

It helps that Jobs was in many respects an appalling human being - selfish, arrogant, utterly contemptuous of anybody who didn’t measure up. His archrival, Microsoft Corp’s ruthless cofounder Bill Gates, has lately acquired a patina of geeky saintliness through his charitable foundation. Not Jobs, who combined ascetic money-doesn’t-matter Buddhism with a Scroogish contempt for philanthropy.

Yet the same man radiated a childlike charm and wielded a brilliant sense of style that enthralled pretty much everybody. He wasn’t an engineer or a programmer, but he was a master at understanding what engineers and programmers did, and more importantly what they could do if Jobs was there to point them in the right direction.

This native talent enabled Jobs to launch Apple, and the personal computer industry. It helped Jobs to survive his devastating ouster from the company in 1985, and more than a decade in the wilderness, and it gave him the wit to see the potential in a little computer graphics business called Pixar, which made cute animated movies on the side. And, of course, the massive success of Pixar’s feature film “Toy Story’’ made possible Jobs’s triumphant return to Apple. Then came the final 15 years of Jobs’s life, when his aesthetic brilliance and ruthless perfectionism produced an almost miraculous series of superb products - the iMac, the iPod music player, the iPhone, and the iPad tablet computer.

Isaacson retells the story with lucidity and sharp insight. He’s especially good with palace coups - the one in 1985 when Apple chief executive John Sculley, whom Jobs had lured away from PepsiCo., exiled Jobs, and the 1997 rebound, in which Jobs overthrew the colorless, stodgy Gil Amelio and retook the helm.

Throughout, Jobs never wavered in his merciless perfectionism, an attitude that produced delightful digital products, but endless torments for all who worked with him. Those with the smarts and guts to fight back against his whims were Jobs’ “A players,’’ the best and brightest. The rest were “bozos,’’ or worse.

Jobs found that brutalizing his colleagues was an excellent way to filter the corporate talent pool; his best engineers and designers got better and the losers got out. Seen merely as a management tool, his cruelty is easily excusable. But for Jobs meanness wasn’t just a tactic, but a way of life. “Because of how very sensitive he is,’’ said Apple product design ace Jonathan Ive, “he knows exactly how to efficiently and effectively hurt someone.’’

Jobs’s cruelty wasn’t confined to the office. An adopted child, he speaks of gratitude to his birth mother for her decision not to abort him. Yet Jobs tells Isaacson that when he got girlfriend Chrisann Brennan pregnant at age 23, he’d wanted her to get an abortion. When Brennan didn’t, Jobs tried to deny he was the father, and had no contact with the child for a decade.

The narrative comes most fully to life toward the end, as Jobs’s body betrays him. The most heralded bit of news, about Jobs’s nine-month delay in seeking surgery for his pancreatic cancer, was broken years ago by Fortune magazine, as Isaacson himself reminds us. But we learn how close Jobs came to death two years ago. His liver began to fail him in early 2009; doctors warned that without a transplant, he’d likely be finished by April. He was saved by the death of a young man in a car crash on March 21. Within days, Jobs contracted pneumonia and once again barely survived.

He was being given a little more time, and he didn’t waste it. Jobs unveiled his last great product, the iPad, in January of 2010. Then, as the cancer reasserted itself, he made peace with family members and business rivals too - even Jobs’s enemies revered him.

The nastiness won’t be entirely forgotten. Isaacson’s magisterial account ensures it. But those who suffered Jobs’s wrath have forgiven him. Whatever was ugly in him is gone now; all that’s left is the magic.

Hiawatha Bray can be reached at [email protected] .

Steve Jobs, Walter Isaacson - Book Summary

This book chronicles the daring and adventurous life of Steve Jobs, the innovative businessman and eccentric founder of Apple. Drawing on Jobs' early experiences and his aspiration to become a worldwide technology icon, Steve Jobs describes his successful business journey as well as the battles he had to confront along the way.

This book is for:  

  • Anyone curious about the interesting life of Steve Jobs;
  • Anyone curious as to how Apple managed to achieve the enormous success it is now;
  • Anyone inspired by the man who made Apple the tech giant it is today.

About the author:

Walter Isaacson is an American writer and biographer. He is a former managing editor of TIME magazine and a former chairman and CEO of the CNN news network. Isaacson has written best-selling biographies of Albert Einstein and Benjamin Franklin, and he is also the author of American Sketches (2009).

What does this book have for you? Find out why Steve Jobs' Apple became an icon of technology around the world.    

There's absolutely no denying the role Steve Jobs played in shaping our tech world today.  

A single-minded perfectionist, Steve Jobs had a vision of changing the world through technology.

In this best-selling biography, you'll learn that while perfectionism and desire drove Jobs to achieve greatness, it was his personality that caused discord and conflict. In his relationships with employees and co-workers, Jobs' behavior was often viewed as highly offensive, although he frequently argued that he was simply trying to motivate his employees to do their best work.

The summary pages that follow detail the remarkable life of one of the most influential tech entrepreneurs of our time, while also telling the delightful story of a prank-born partnership that later built one of the most valuable technology companies in the world.

Also in these summary pages you will learn:

  • How LSD led to the formation of today's technology;
  • Why Woody and Buzz Lightyear wouldn't exist without Steve Jobs;
  • Why Jobs believed he could cure his cancer with acupuncture and a fruit diet.

A skillful father and a mischievous best friend gave Jobs his passion for engineering and design.

On February 24, 1955, a boy was born to Abdulfattah Jandali and Joanne Schieble.

They did not raise the child, however. Schieble came from a very strict Catholic family that would not accept her having a child with a Muslim man, and she was forced to give the child up for adoption.

And so the child grew up in the home of Paul and Clara Jobs, a couple living in Silicon Valley. They named the baby Steven.

Paul Jobs was a mechanical engineer specializing in cars and it was he who opened the door that brought Steve into the world of engineering.

From an early age, Paul tried to instill his love of mechanics in Steve. Steve once said that he was impressed by his father's focus on craftsmanship. If the family needed a cabinet, Paul could easily build one, and he let Steve help with the work.

In addition, the family's smart yet affordable Eichler home - a modern house with floor-to-ceiling glass walls and open, expansive floors - ignited Steve's passion for clean, simple design.

Then, in high school, Steve Jobs met Steve Wozniak, the two quickly became close friends.

Wozniak was five years older than Jobs and a genius computer engineer, and from him Jobs learned a great deal about computers.

In many ways, Jobs and Wozniak were both still kids, and both loved to play pranks. But they also loved the world of electronics and wanted to create something.

Combining the two personalities, in 1971 they released their first product: the "Blue Box", a device that allowed users to make long-distance calls completely free.

Wozniak provided the design and Jobs handled the business; each invested $40, and they sold the device for $150.

The pair sold nearly 100 units, showing them what they could do with Wozniak's mechanical engineering and Jobs' vision, and it was also the beginning of the path to creating Apple.

Spirituality, LSD, and the arts shaped Jobs' taste and intense focus.

By the late 1960s, the worlds of computer geeks and the "hippie" lifestyle had begun to overlap.

So, in addition to his fascination with math, science, and electronics, Jobs immersed himself in the counterculture and started taking LSD, a strong hallucinogen.

Jobs later attributed much of his refined aesthetic sense and intense focus to his experiences with hallucinogenic drugs.

In 1972, Jobs enrolled at Reed College, a private liberal arts school in Oregon, and from then on both his spiritual exploration and his LSD use with friends became serious.

Jobs felt that taking these drugs reinforced his sense of the important things in life by showing that "there is no flip side of the coin". For Jobs, creating great things was more important than anything else.

Eager to explore the spiritual culture of the East, Jobs even went to India, where he stayed for seven months. Buddhism in particular became an important part of his personality, influencing his minimalist aesthetic and exposing him to the power of intuition.

Both interests - LSD and spirituality - helped him develop a steady focus, which came to be known as Jobs's "reality distortion field": once he decided what should happen, he simply made it happen by bending reality to his will.

Another factor that shaped Jobs's minimalist aesthetic was his devotion to art. Throughout his career, Jobs insisted many times that Apple products should be clean and simple.

This idea was formed during his college years. Despite dropping out of his degree, Jobs was allowed to keep sitting in on classes, which he did purely to enrich himself. One of them was a calligraphy class, and the skills he picked up there later became a key element of the Apple Mac's graphical user interface.

A visit to an apple farm gave them a name; a bold vision and hard work made a company.

It seems like a strange combination: a spiritual LSD enthusiast with a background in the computer industry. But in the early 1970s, many people were beginning to see computers as symbols of personal expression.

While Jobs was absorbed in LSD and Zen, he dreamed of starting his own business. Around the same time, his friend Steve Wozniak hit upon the idea for the modern personal computer.

In the early days of Silicon Valley's technological revolution, Steve Wozniak joined the Homebrew Computer Club, a place where computer geeks met to exchange ideas and where two seeming opposites, counterculture and technology, came together perfectly.

It was also here that Wozniak got his idea. Computers at the time required many separate hardware components to operate, making them extremely complicated to manage and use. Wozniak envisioned a self-contained device with keyboard, computer, and monitor all in one package.

At first, Wozniak intended to give his design away to everyone for free, in keeping with Homebrew tradition. However, Jobs insisted that they should profit from Wozniak's invention.

So in 1976, with just $1,300 in start-up capital, Jobs and Wozniak founded the Apple Computer Company.

On the day they were choosing a name, Jobs had just visited an apple farm; and because the name was simple, fun, and approachable, Apple was born.

Jobs and Wozniak worked extremely hard for a month to build 100 computers by hand. Half of them were sold to a local computer dealer, and the other half went to friends and other customers.

After just 30 days, Apple's first computer, the Apple I, was turning a profit.

Jobs and Wozniak made a very strong team: Wozniak was a technical wizard, and Jobs was a visionary who saw the world-changing potential of the personal computer.

Jobs was a controlling and capricious boss, driven by perfectionism.

Those who knew Jobs agree that he was a domineering and exceptionally capricious personality. If work did not meet his standards, he would get angry and berate people.

But why did Jobs have such a bad temper?

In short, he was an extreme perfectionist. Jobs wanted the Apple II to be the perfect design, fully equipped and tightly integrated. The Apple II team made it a success on its release in 1977, but the effort drained people's energy and spirit.

If Jobs felt an employee's work was bad, he would tell them it was a pile of rubbish, and things became extremely serious if he found even a small mistake.

As Apple grew stronger, Jobs became increasingly erratic. Mike Scott was even appointed Apple's president, with the main task of curbing Jobs's temper.

Scott essentially confronted Jobs on issues other employees didn't have the energy to fight over. This often led to disagreements, sometimes even bringing Jobs to tears, because he found giving up control of Apple genuinely difficult.

Jobs felt extremely frustrated when Scott tried to put limits on his perfectionism. For his part, Scott didn't want Jobs's perfectionism to override pragmatism.

For example, Scott intervened when Jobs decided that none of nearly 2,000 shades of beige was good enough for the Apple II's case, and again when Jobs spent days deciding just how rounded the computer's corners should be. Scott simply focused on getting machines built and sold.

As long as the company kept running smoothly, these clashes remained manageable. But as we will see, that was not the end of the story.

The Macintosh made Jobs a technology icon, but Jobs' temperament brought him down.

The Apple II, which sold about six million units, is seen as the spark that launched the personal computer industry.

But for Jobs it was not a complete success, because the Apple II was Wozniak's masterpiece, not his.

Jobs wanted to create a machine that would, in his words, "put a dent in the universe." Driven by this ambition, Jobs began working on the Macintosh, a successor to the Apple II that would change the face of the personal computer and make him a technology icon.

However, the Macintosh was not Jobs's invention: he effectively took the Macintosh project away from its creator, Jef Raskin, a computer-interface expert. Jobs took the idea and built a machine that ran on a microprocessor powerful enough for sophisticated graphics and could be controlled with a mouse.

The Macintosh became an unprecedented success, thanks in part to a lavish promotional campaign that included a sensational TV commercial, now known as the "1984" ad, directed by Hollywood filmmaker Ridley Scott. Riding the ad's popularity, the Macintosh set off a wave of public excitement about both Jobs and the product.

With characteristic ingenuity, Jobs landed high-profile pieces in a number of prominent magazines by leading journalists to believe that each was getting an "exclusive" interview.

His strategy worked, and the Macintosh made Jobs rich and famous. He became such a celebrity that he was able to invite singer Ella Fitzgerald to perform at his lavish 30th birthday party.

However, the same personality that helped Jobs make the Macintosh a success also got him fired.

Jobs's perfectionism and abusive attitude toward Apple employees did not diminish. He would routinely call people "assholes" if he felt they weren't focused on perfection.

Those attitudes eventually drained the patience of the company's leadership. And in 1985, they decided to force Jobs out.

Jobs failed with NeXT and then succeeded with Pixar, an animation company.

After recovering from his firing from Apple, Jobs realized he could now be exactly who he wanted to be, good points and bad points alike.

He first started a computer company focused on the education market, called NeXT.

With NeXT, Jobs indulged his passion for design. He paid $100,000 for the logo design and insisted that the machine itself, a perfect cube, be flawless.

But Jobs's perfectionism made engineering and production extremely difficult. For example, the sides of the cube had to be cast individually, using molds that cost up to $650,000.

Jobs's stubbornness became the death knell for NeXT. The project nearly ran out of money, the product was delayed for years, and in the end the machine was too expensive for consumers. With its high price tag and small software library, NeXT barely made a mark on the computer industry.

During the same period, Jobs bought a controlling stake in Pixar. As chairman, Jobs had invested in a business that was a perfect blend of technology and art.

By 1988, Jobs had poured $50 million into Pixar while still losing money on NeXT.

But after years of financial struggle, the studio released Tin Toy, a short film that showcased Pixar's unique vision for computer animation. Tin Toy won the Academy Award for Best Animated Short Film for 1988.

So Jobs shifted his focus from hardware and software production, where he was losing a great deal of money, to Pixar, a promising, cutting-edge animation company.

And finally, Pixar teamed up with Disney to produce its first feature film, Toy Story. Released in 1995, Toy Story topped the list of the highest-grossing films of the year. When Pixar went public, Jobs's stake (80% of the company) was worth 20 times what he had invested: $1.2 billion.

Away from Apple, Jobs improved his personal life, reconnecting with his biological family.  

Besides what he learned during his 12 years away from Apple, Jobs also developed his personal life.

In 1986, after the death of his adoptive mother, Jobs grew curious about his origins and decided to find his biological mother.

When he found Joanne Schieble, she was very emotional and said she regretted having given Jobs up for adoption.

Jobs was also surprised to learn that he had a younger sister, Mona Simpson. Sharing a strong passion for art and a strong will, the two became close.

In 1996, Simpson published a novel titled A Regular Guy, whose main character is based on Jobs and shares many aspects of his personality. However, not wanting any conflict with his newly found sister, Jobs never read the novel.

Jobs had also met Laurene Powell. The couple married in 1991, in a ceremony led by Jobs's longtime Zen teacher; Powell was already pregnant with their first child, Reed Paul Jobs. They later had two more children, Erin and Eve.

With encouragement from Powell, Jobs also tried to spend more time with Lisa Brennan, the daughter from his first serious relationship, who had become estranged from him.

Jobs tried to be a better father to Lisa, and eventually she moved in with him and Powell until she left for Harvard.

Lisa grew up with the same temperament as Jobs; neither was good at reaching out and making amends, and they could go months without exchanging a word.

In a broader sense, the way Jobs treated the people around him mirrored the way he worked: either intensely passionate or ice cold.

With Apple on the verge of collapse, Jobs returned, first as an advisor and eventually as CEO.

In the years after firing Jobs, Apple gradually went downhill and was in danger of bankruptcy.

To prevent this, Gil Amelio was named CEO in 1996. Amelio knew that to get Apple back on track, it needed to merge with a company that had fresh ideas.

And for that reason, in 1997 Amelio acquired NeXT, and Jobs became an advisor to Apple.

Once back at Apple, Jobs gathered as much power as he could. He quietly built a power base by placing his favorite NeXT employees in high positions within Apple.

During this period, Apple's management realized that Amelio would not be the company's savior, but that it might have a chance again with Jobs.

So the board asked Jobs to return as CEO. Unexpectedly, though, Jobs declined the offer. Instead, he wanted to stay on as an advisor and help find a new CEO.

As an advisor, Jobs kept increasing his influence inside Apple. He pushed the board to resign (the very board that had offered him the CEO position) because he felt they were too slow to change the company.

Jobs also succeeded in striking a deal with rival Microsoft, prompting them to make a new version of Microsoft Office for the Mac, thus ending a decade of hostility and dramatically accelerating sales of Apple products.

And finally, after much hesitation, Jobs became CEO and pushed the company to make fewer products.

Jobs terminated the licensing agreements Apple had with a few other manufacturers and decided to focus the company on making just four great computers: a desktop and a laptop each for the professional market and the consumer market.

In 1997, Apple lost $1.04 billion. But in 1998, Jobs's first year as CEO, the company made a $309 million profit. Jobs had truly saved Apple.

Bold ideas and forward-thinking designs made the first iMac and Apple Store hugely successful.

When Jobs recognized Jony Ive's design talent, he made Ive the second most powerful person at Apple, just behind himself. Thus began a collaboration that became one of the most important design partnerships of the era.

The first product Jobs and Ive designed together was the iMac, a desktop computer priced at about $1,200 and aimed at everyone.

With the iMac, Jobs and Ive challenged conventional ideas about what a desktop computer should look like. Their choice of a translucent blue case reflected their obsession with creating a computer that was perfect inside and out; it also gave the machine a playful look.

Unveiled in May 1998, the iMac became the best-selling product in Apple's history.

However, Jobs began to worry that Apple's distinctive products would get lost among the mass of other electronics in the vast technology market. His solution was the Apple Store, a way for the company to control the entire retail process.

The Gateway computer company had suffered financial losses after opening its own retail stores, so Apple's management opposed Jobs's idea. Jobs was convinced he was right, though, and management eventually agreed to test four Apple Stores.

Jobs started by building a prototype store, outfitting it to perfection and attending to every detail of the service and overall aesthetic. He emphasized minimalism throughout, from the moment customers entered the store to the moment they left.

In May 2001, the first Apple Store opened. It was a resounding success, as Jobs's careful design elevated both the retail experience and the brand's image.

In fact, the Manhattan store went on to become the highest-earning store in New York, outdoing established names like Saks Fifth Avenue and Bloomingdale's.

Desperate for total digital control, Jobs created the iPod, iPhone, and iPad.

Following the success of the Apple Store and the iMac, Jobs came up with a completely new strategy. His vision was a personal computer at the heart of a new digital lifestyle.

He called it the "digital hub" strategy.

This strategy envisioned the personal computer as a control center for a range of devices, from music players to cameras.

As a first step in realizing this idea, Jobs decided that a music player would be Apple's next product.

In 2001, Apple released the iPod, a streamlined device with the now-famous scroll wheel, a small screen, and a new miniature hard-disk technology.

Critics questioned whether consumers would shell out $399 for a music player, but Apple succeeded: by 2007, the iPod accounted for half of Apple's revenue.

The next step was to design an Apple cell phone, because Jobs had foreseen that a phone with a built-in music player would eventually make the iPod superfluous.

In 2007, Apple released the first-generation iPhone. It combined two seemingly impossible technologies: a touchscreen that could track multiple touch points at once, and a tough glass cover known as Gorilla Glass.

Once again, critics cast doubt on Apple's strategy, arguing that no one would shell out $500 for a cell phone; and again Jobs proved them wrong. By the end of 2010, the iPhone accounted for more than half of all profits in the worldwide mobile-phone market.

The final step in Jobs's strategy was the iPad tablet.

Apple unveiled the iPad in January 2010, months before it went on sale, and the press, which had not yet tried the device, underestimated it.

But when the iPad officially launched, it became a resounding success. Apple sold more than a million units in the first month and 15 million over the next nine months.

With the release of the iPod, iPhone, and iPad, it became clear that Jobs's bold ideas had succeeded in changing the electronics industry.

Jobs's insistence on perfect, closed systems reflected his obsession with control.

Throughout his career, Jobs maintained that a closed, tightly integrated system would give customers the best experience. The idea reflected Jobs's desire for control: once he shipped a system, users were prevented from modifying it.

This obsession with control caused major conflicts, especially with Microsoft and Google.

Bill Gates had very different ideas about business and technology: he was always willing to license his company's systems and software to partners. Microsoft even wrote software for the Macintosh.

However, the friendly business relationship between Jobs and Gates turned into a lifelong rivalry.

When Gates released the Windows operating system, Jobs accused him of copying the Macintosh's interface. In fact, both systems had "borrowed" ideas from another tech company, Xerox.

Toward the end of his career, Jobs also went after Google, arguing that its Android system copied heavily from Apple.

While Microsoft and Google both believed that open systems and natural competition would determine which technology was superior, Jobs maintained that in the end both companies had stolen their ideas from Apple.

But Jobs's battles weren't only between companies. Jobs also fought relentlessly for perfection within Apple, and employees who didn't measure up were fired. Under Jobs, there was no tolerance for anything that undermined Apple's quality.

Whenever he decided that someone wasn't an "A player" willing to work 90 hours a week, he didn't urge them to do better. Instead, he fired them on the spot.

And when a supplier had problems delivering its chips on time, Jobs became furious and cursed at them ferociously. That reaction was a sign of Jobs's fearsome perfectionism.

Jobs long resisted conventional treatment for his cancer, and he died in 2011.

Jobs first found out he had cancer during a checkup in October 2003.

Unfortunately, Jobs tackled cancer the same way he tackled his designs: ignoring conventional wisdom and deciding to fight the battle his own way.

He refused an operation for nine months, relying instead on acupuncture and a vegetarian diet. As time went on, the tumor grew, and eventually Jobs had to undergo surgery to have it removed.

When the cancer returned in 2008, Jobs once again insisted on fruit and vegetable diets as a cure, and he lost 40 pounds.

Finally, Jobs was persuaded to have a liver transplant; but afterward his health deteriorated seriously and never fully recovered.

Jobs died in 2011. He left behind, as his legacy, one of the biggest technology companies in the world.

Everything Jobs did in life was the product of incredible determination. Before he died, Jobs said, "I've had a blessed life, a wonderful career. I did everything I could."

Unlike many innovators, Jobs's personality is fully expressed in his inventions: all Apple products form tightly closed systems that integrate both hardware and software.

And while Microsoft's expansion strategy of licensing its Windows operating system led it to dominate the operating-system industry for years, Jobs's model proved advantageous in the long run, because it ensured a seamless user experience from start to finish.

Shortly before his death, Jobs was able to see Apple surpass Microsoft as the most valuable technology company in the world.

The main message of this book is:

Steve Jobs grew up in Silicon Valley at the intersection of art and technology, where the hippie counterculture met computer enthusiasts. There he formed the friendship that led to the birth of Apple and changed the technology world. During his lifetime, Jobs succeeded in transforming our relationship with technology, inventing digital devices with streamlined designs and user-friendly interfaces.


  22. ‎Remarkable Marketing: Steve Jobs: B2B Marketing Lessons from His

    The truth is that the behemoth has endured ups and downs to become one of the greatest brands of all time, and all under the leadership of Steve Jobs. So in this episode, we're taking marketing lessons from Steve Jobs based on his biography by Walter Isaacson with the help of our guest, Jotform CMO Steve Hartert.

  23. Steve Jobs: A Biography: Isaacson, Walter: 9781410445223: Amazon.com: Books

    Based on more than forty interviews with Jobs conducted over two years--as well as interviews with more than a hundred family members, friends, adversaries, competitors, and colleagues--Walter Isaacson has written a riveting story of the roller-coaster life and searingly intense personality of a creative entrepreneur whose passion for perfection and ferocious drive revolutionized six ...

  24. Steve Jobs: B2B Marketing Lessons from His Biography by Walter Isaacson

    The truth is that the behemoth has endured ups and downs to become one of the greatest brands of all time, and all under the leadership of Steve Jobs. So in this episode, we're taking marketing lessons from Steve Jobs based on his biography by Walter Isaacson with the help of our guest, Jotform CMO Steve Hartert.

  25. Steve Jobs: The Exclusive Biography by Isaacson, Walter (2015

    Walter Isaacson's "enthralling" (The New Yorker) worldwide bestselling biography of Apple cofounder Steve Jobs. Based on more than forty interviews with Steve Jobs conducted over two years--as well as interviews with more than 100 family members, friends, adversaries, competitors, and colleagues--Walter Isaacson has written a riveting story of the roller-coaster life and searingly intense ...