Tuesday, October 25, 2011

Steve Jobs: A Genius, Yes; A Role Model for the Rest of Us, No Way

The nearly three weeks since Steve Jobs’s death have felt like an extended tribute to the world’s first global head of state. The memorial ceremonies worldwide, the special commemorative issues and, today, the release of Walter Isaacson’s Steve Jobs, all bear testament to the Apple founder’s legacy. Jobs deserved it. As Isaacson pointed out on CBS’s 60 Minutes last night, Jobs transformed personal computers, telephones, even retail stores, among other industries—and he would probably have taken on television, had he lived long enough.

Many heads of state assuredly do not merit such eulogies. Gaddafi is dead. And when the Turkmens turned out to mourn Saparmurat Atayevich Niyazov in 2006, they were probably secretly celebrating at least the recovery of the month of January, as Niyazov had renamed the first month of the year after his personal honorific, Türkmenbaşy.

One thread among the encomiums suggests that the world would be a better place if we just had more Steve Jobs in high places. Consider this from Thomas Friedman: “The melancholy over Steve Jobs’s passing is not just about the loss of the inventor of so many products we enjoy. It is also about the loss of someone who personified so many of the leadership traits we know are missing from our national politics.”

It would be unfortunate if the remembrance of Jobs spawned a legion of Steve wannabes. Jobs, in geekspeak, was an “N of 1.” Jobs’s perfectionism and design sense helped establish Apple’s signature “iBrands,” but these traits also translated, to some extent, into a toxic personality that could have served as a model for the Kevin Spacey character in the movie “Horrible Bosses.” In the film, Dave Harken implies that a promotion awaits one of his employees but ends up awarding it to himself. The Jobs equivalent: stiffing early Apple employees out of stock options when the company first went public. The guy was a…

In the weeks since his death, Jobs has been compared to Einstein and Edison. Maybe so. But the problem with using his interpersonal style as a management role model is that the rest of us, to parrot Apple advertising, will assuredly blow it. In business, the control freak boss—the emblematic Jobs model—is a recipe for unintentionally delivering your best employees as new hires to your closest competitors.

Millions of people have to manage others, and this challenge doesn’t necessarily bring out the best in us. A 2005 article by two psychologists from the University of Surrey, “Disordered Personalities at Work,” found that senior British executives were more likely to demonstrate histrionic personality disorder than criminal psychiatric patients at Broadmoor Special Hospital in Berkshire, England, and were equally likely to show narcissistic traits (grandiosity and a lack of empathy, among others) and compulsive ones (perfectionism and a dictatorial bent). Is it that this type of person is attracted to the job, or that the workplace encourages this type of behavior? Who knows? But exhorting subordinates to “insanely great” levels of performance, to quote Jobs’s hyperbolic rhetoric, is more likely to initiate a collective bargaining drive than to produce the next iPad.

Even Jobs may have been at his best when he left behind the persona of the old Steve. James Surowiecki, the New Yorker writer and author of The Wisdom of Crowds: Why the Many Are Smarter Than the Few and How Collective Wisdom Shapes Business, Economies, Societies and Nations, noted in that magazine how Jobs loosened up in recent years on his insistence on totally closed architectures. The old Steve might have forbidden MP3s on iPods and apps for iPhones and iPads. Giving up a modicum of control eventually propelled the company to heights it had never before experienced—and cemented Jobs’s legacy in the most histrionic terms imaginable.

What the World Looks Like If You Move Backward in Time

In this video, a woman walks, skips, and jump-ropes through the streets of New York while everyone else moves backwards. Of course, she’s the one moving backwards—a vivid illustration of the time-reversibility of the laws of physics.
The most remarkable thing is how the passers-by seem not to notice. A New York minute, it seems, is not just sped up, but time-reversal invariant.

Wednesday, October 19, 2011

Free Will and Quantum Clones: How Your Choices Today Affect the Universe at its Origin

The late philosopher Robert Nozick, talking about the deep question of why there is something rather than nothing, quipped: “Someone who proposes a non-strange answer shows he didn’t understand the question.” So, when Scott Aaronson began a talk three weeks ago by saying it would be “the looniest talk I’ve ever given,” it was a good start. At a conference on the nature of time—a question so deep it’s hard even to formulate as a question—“loony” is high praise indeed. And indeed his talk was rich in ambition and vision. It left physics überblogger Sabine Hossenfelder uncharacteristically lost for words.
As part of his general push to apply theoretical computer science to philosophy, Aaronson has been giving thought to that old favorite of college metaphysics classes and late-night dorm-room bull sessions: free will. Do we have autonomy, or are our choices preordained? Is that a false choice? What does it mean to be free, anyway? For some of Aaronson’s earlier thoughts, see his lecture and blog post. Though hard to summarize, his talk (slides here) can be broken down into two parts.

First, he sought to translate fuzzy notions of free will into a concrete operational definition. He proposed a variation on the Turing Test which he calls the Envelope Argument or Prediction Game: someone poses questions to you and to a computer model of your brain, trying to figure out who’s the human. If a computer, operating deterministically, can reproduce your answers, then you, too, must be operating deterministically and are therefore not truly free. (Here, I use the word “deterministically” in a physicist’s or philosopher’s sense; computer scientists have their own, narrower meaning.) Although the test can never be definitive, the unpredictability of your responses can be quantified by the size of the smallest computer program needed to reproduce those responses. Zeeya Merali gave a nice summary of Aaronson’s proposal at the Foundation Questions Institute blog.
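To get a feel for what such a complexity measure looks like in practice, here is a minimal sketch in Python. It is not Aaronson’s proposal; it simply uses compressed length as a crude, computable stand-in for the size of the smallest program that reproduces a transcript of answers (the true measure, Kolmogorov complexity, is uncomputable), and the function name is mine, invented for illustration.

```python
import zlib

def unpredictability_score(responses):
    """Crude stand-in for 'the size of the smallest program that
    reproduces these responses': the length of the transcript after
    compression. (True Kolmogorov complexity is uncomputable.)"""
    transcript = "\n".join(responses).encode("utf-8")
    return len(zlib.compress(transcript, 9))

# Highly repetitive answers compress to almost nothing...
rote = ["no"] * 50
# ...while more varied answers compress less well.
varied = ["maybe", "ask me tomorrow", "42", "depends on lunch", "why not?"] * 10

print(unpredictability_score(rote))    # small number
print(unpredictability_score(varied))  # noticeably larger
```

By this crude measure, the rote answerer looks almost perfectly predictable—a point that becomes relevant below.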
The output of this game, as Aaronson portrayed it, would be a level of confidence for whether your will is free or not. But I think it might be better interpreted as a measure of the amount of free will you have. Last year, quantum physicists Jonathan Barrett and Nicolas Gisin argued that free will is not a binary choice, live free or die, but a power that admits of degree. They proposed to quantify free will using quantum entanglement experiments. Freedom of will enters into these experiments because physicists make a choice about which property of a particle to measure, and the choice affects the outcome. Such experiments are commonly taken as evidence for spooky action at a distance, because your choice can affect the outcome of a measurement made at a distant location. But they can also be interpreted as a probe of free will.

If there are, say, 1000 possible measurements, then complete freedom means you could choose any of the 1000; if your choice were constrained to 500, you would have lost one bit of free will. Interestingly, Barrett and Gisin showed that the loss of even a single bit would explain away spooky action. You wouldn’t need to suppose that your decision somehow leaps across space to influence the particle. Instead, both your choice and the outcome could be prearranged to match. What is surprising is how little advance setup would do the trick. The more you think about this, the more disturbed you should get. Science experiments always presume complete freedom of will; without it, how would we know that some grand conspiracy isn’t manipulating our choices to hide the truth from us?
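As a back-of-envelope reading of that bookkeeping (my gloss, not Barrett and Gisin’s formal definition), measuring freedom in bits just means taking a logarithm of the number of available choices:

```latex
\[
  \text{free will (bits)} \;=\; \log_2 N_{\text{choices}},
  \qquad
  \log_2 1000 \approx 9.97, \quad \log_2 500 \approx 8.97,
\]
\[
  \text{so constraining 1000 options down to 500 costs }
  \log_2\!\frac{1000}{500} = 1 \text{ bit.}
\]
```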

Back to Aaronson’s talk. After describing his experiment, he posed the question of whether a computer could ever convincingly win the Prediction Game. The trouble is that a crucial step—doing a brain scan to set up the computer model—cannot be done with perfect fidelity. Quantum mechanics forbids you from making a perfect copy of a quantum state—a principle known as the no-cloning theorem. The significance of this depends on how strongly quantum effects operate in the brain. If the mind is mostly classical, then the computer could predict most of your decisions.

Invoking the no-cloning theorem is a clever twist. The theorem derives from the determinism—technically, unitarity—of quantum mechanics. So here we have determinism acting not as the slayer of free will, but as its savior. Quantum mechanics is a theory with a keen sense of irony. In the process of quantum decoherence, to give another example, entanglement is destroyed by… more entanglement.
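For readers who want to see where the theorem comes from, here is the standard textbook argument from linearity (a generic sketch, not anything specific to Aaronson’s talk): suppose some fixed unitary $U$ could copy every state onto a blank register.

```latex
\[
  U\,|\psi\rangle|0\rangle = |\psi\rangle|\psi\rangle,
  \qquad
  U\,|\phi\rangle|0\rangle = |\phi\rangle|\phi\rangle .
\]
\[
  \text{Linearity then gives, for } |\chi\rangle = a|\psi\rangle + b|\phi\rangle:
  \quad
  U\,|\chi\rangle|0\rangle
  = a\,|\psi\rangle|\psi\rangle + b\,|\phi\rangle|\phi\rangle,
\]
\[
  \text{whereas a true copy would be }
  |\chi\rangle|\chi\rangle
  = a^2|\psi\rangle|\psi\rangle + ab\,|\psi\rangle|\phi\rangle
  + ab\,|\phi\rangle|\psi\rangle + b^2|\phi\rangle|\phi\rangle .
\]
```

The two expressions agree only when one of $a$ or $b$ vanishes (or the two states coincide), so no single $U$ can clone arbitrary unknown states.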

As fun as Aaronson’s game is, I don’t see it as a test of free will per se. As he admitted, predictable does not mean unfree. Predictability is just one aspect of the problem. In the spirit of inventing variations on the Turing Test, consider the Toddler Test. Ask a toddler something, anything. He or she will say “no.” It is a test that parents will wearily recognize. The answers, by Aaronson’s complexity measure, are completely predictable. But that hardly reflects on the toddler’s freedom; indeed, toddlers play the game precisely to exercise their free will. The Toddler Test shows the limits of predictability, too. Who knows when the toddler will stop playing? If there is anybody in the world who is unpredictable, it is a toddler. What parents would give for a window in their skulls!

Yet no one denies that toddlers are composed of particles that behave according to deterministic laws. So how do you square their free will with those laws? Like cosmologist Sean Carroll, I lean toward what philosophers call compatibilism: I see no contradiction whatsoever between determinism and free will, because they operate at two different levels of reality. Determinism describes the basic laws of physics. Free will describes the behavior of conscious beings. It is an emergent property. Individual particles aren’t free. Nor are they hot, or wet, or alive. Those properties arise from particles’ collective behavior.

To put it differently, we can’t talk about whether you have free will until we can talk about you. The behavior of particles could be completely preordained by the initial conditions of the universe, but that is irrelevant to your decisions. You still need to make them.

What you are is the confluence of countless chains of events that stretch back to the dawn of time. Every decision you make depends on everything you have ever learned and experienced, coming together in your head for the first and only time in the history of the universe. The decision you make is implicit in those influences, but they have never all intersected before. Thus your decision is a unique creative act.

This is why even the slightest violation of free will in a quantum entanglement experiment beggars belief. “Free will” in such an experiment means simply that your choice of what to measure is such a distant cousin of the particle’s behavior that the two have never interacted until now.

This is where we get into the second big point that Aaronson made in his talk, about just how creative an act it was. Even if the influences producing a free choice have never interacted before, they can all be traced to the initial state of the universe. There is always some uncertainty about what that state was; a huge range of possibilities would have led to the universe we see today. But the decision you make resolves some of that uncertainty. It acts as a measurement of those countless influences.

Yet in a deterministic universe, there is no justification for saying that the initial state caused the decision; it is equally valid to say that the decision caused the initial state. After all, physics is reversible. What determinism means is that the state at one time implies the state at all other times. It does not privilege one state over another. Thus your decision, in a very real sense, creates the initial conditions of the universe.
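A toy model may make the symmetry vivid. The sketch below (an illustration of time-symmetric determinism only, not a model of real physics) runs a reversible update rule forward and then backward, recovering the starting state exactly: the later state fixes the initial conditions just as surely as the other way around.

```python
# A reversible toy dynamics: each update is invertible, so the state at
# any one time determines the state at every other time, forward or backward.

def step_forward(prev_row, cur_row):
    # Second-order rule: next cell = (left neighbor XOR right neighbor) XOR previous cell.
    n = len(cur_row)
    nxt = [cur_row[(i - 1) % n] ^ cur_row[(i + 1) % n] ^ prev_row[i] for i in range(n)]
    return cur_row, nxt

def step_backward(cur_row, nxt_row):
    # The same formula solved for the earlier row (XOR is its own inverse).
    n = len(cur_row)
    prev = [cur_row[(i - 1) % n] ^ cur_row[(i + 1) % n] ^ nxt_row[i] for i in range(n)]
    return prev, cur_row

initial = ([0] * 8, [0, 0, 0, 1, 0, 0, 0, 0])
state = initial
for _ in range(20):
    state = step_forward(*state)   # run "time" forward
for _ in range(20):
    state = step_backward(*state)  # now run it backward
assert state == initial  # the initial conditions were implicit in the later state
```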

This backward causation, or retrocausality, was the “loony” aspect of Aaronson’s talk. Except there’s nothing loony about it. It is a concept that Einstein’s special theory of relativity made a live possibility. Relativity convinced most physicists that we live in a “block universe” in which past, present, and future are equally real. In that case, there’s no reason to suppose the past influences the future, but not vice-versa. Although their theories shout retrocausality, physicists haven’t fully grappled with the implications yet. It might, for one thing, explain many of the mysteries of quantum mechanics.

In a follow-up email, Aaronson told me that the connection between free will and cosmic initial state was also explored by philosopher Carl Hoefer in a 2002 paper. What Aaronson has done is apply the insights of quantum mechanics. If you can’t clone a quantum state perfectly, you can’t clone yourself perfectly, and if you can’t clone yourself perfectly, you can’t ever be fully simulated on a computer. Each decision you take is yours and yours alone. It is the unique record of some far-flung collection of particles in the early universe. Aaronson wrote, “What quantum mechanics lets you do here, basically, is ensure that the aspects of the initial microstate that are getting resolved with each decision are ‘fresh’ aspects, which haven’t been measured or recorded by anyone else.”

If nothing else, let this reconcile parents to their willful toddlers. Carroll once wrote that every time you break an egg, you are doing observational cosmology. A toddler playing the “no” game goes you one better. Every time the toddler says no, he or she is doing cosmological engineering, helping to shape the initial state of the universe.

Monday, October 17, 2011

The Secret Origin of The Phantom Menace 3-D Movie Poster



Lucasfilm Marketing Executive: Mr. Lucas? (knocking on door)
George Lucas: Come in.
Lucasfilm Marketing Executive: Sir, the marketing team is having a hard time deciding on a movie poster for the 3-D re-release of The Phantom Menace.
George Lucas: What's the problem?
Lucasfilm Marketing Executive: Our research team indicates everyone hates The Phantom Menace.
George Lucas: Really?
Lucasfilm Marketing Executive: Very much so.
George Lucas: I had no idea.
Lucasfilm Marketing Executive: ...really?
George Lucas: What don't they like?
Lucasfilm Marketing Executive: Everything, really. Jar Jar. The story. Jake Lloyd. The borderline racist aliens. Natalie Portman macking on a 10-year-old. "Yippee." Jar Jar. Anakin building C-3PO for no reason whatsoever. Midichlorians. All the dull political bullshit. Jar Jar. T--
George Lucas: I think you said Jar Jar a couple of times already.
Lucasfilm Marketing Executive: People really don't like Jar Jar.
George Lucas: I had no idea.
Lucasfilm Marketing Executive: (stares flatly) Anyways, we were planning on just using the old Drew Struzan poster, but it features Jar Jar and Jake Lloyd and a lot of other things that test poorly.
George Lucas: Well, people can't hate everything about the movie. I'm not Michael Bay, for fuck's sake.
Lucasfilm Marketing Executive: Well, that's true. In our polling, we found two things that people didn't actively dislike about The Phantom Menace.
George Lucas: What are they?
Lucasfilm Marketing Executive: Podracing and Darth Maul.
George Lucas: They like podracing and Darth Maul?
Lucasfilm Marketing Executive: No. They like Darth Maul and his fight scene. They simply don't hate podracing.
George Lucas: Okay. Well, just put a big Darth Maul on the poster and a couple of pods.
Lucasfilm Marketing Executive: ...
George Lucas: ...
Lucasfilm Marketing Executive: ...really?
George Lucas: Yeah, why not?
Lucasfilm Marketing Executive: Well, because it makes no sense. No Anakin, the ostensible protagonist of the entire Star Wars saga? No Liam Neeson or Natalie Portman or Ewan McGregor, the stars of the movie?
George Lucas: Put a tiny one of them in there in a corner or something, but it should basically just be Maul and a pod.
Lucasfilm Marketing Executive: Sir, with all respect, Darth Maul has three lines and maybe three total minutes of screentime. Watto is a more prominent character.
George Lucas: Do people like Watto?
Lucasfilm Marketing Executive: Not even slightly.
George Lucas: Then just put a big Darth Maul and a podracing pod.
Lucasfilm Marketing Executive: Sir, with a decreasing amount of respect, this would be like marketing the Harry Potter movies with a poster with no Harry, Ron or Hermione on it, and maybe just one of the Weasley Twins.
George Lucas: I see no problem with that.
Lucasfilm Marketing Executive: Fine, fuck it. Whatever. Darth Maul and some pods. I'll tell graphic design.
George Lucas: Thanks, Barry.
Lucasfilm Marketing Executive: (mumbles) My name is Jonathan.
George Lucas:  (turns to desk, presses intercom) Sally, get me the douchebag with the hat that does Clone Wars.
Lucasfilm Receptionist: You mean Dave Filoni?
George Lucas: Yeah, Midnight Cowboy. Whatever his name is.
Lucasfilm Receptionist: Yes, sir. (30 seconds pass)
Dave Filoni (via intercom): Yes, sir? You wanted to speak to me?
George Lucas: Hat guy?
Dave Filoni: (sighs) ...yes?
George Lucas: You're wearing the hat right now, aren't you?
Dave Filoni: (sighs) ...yes.
George Lucas: I knew it, you douchebag. Anyways, I want you to put Darth Maul in Clone Wars.
Dave Filoni: Sir?
George Lucas: You heard me. Put Maul in the cartoon. I'm putting him on the Phantom Menace 3-D poster, so the poster can promote the cartoon and the cartoon can promote the poster. I mean movie.
Dave Filoni: Isn't Darth Maul dead, sir?
George Lucas: Isn't it ridiculous to wear a cowboy hat after 1898? Just do it.
Dave Filoni: But sir... we specifically created Savage Opress, Darth Maul's brother, because you wouldn't let me use Darth Maul last year. In fact you said it was a stupid idea.
George Lucas: Well, now it's a good idea.
Dave Filoni: But now there will effectively be two Darth Mauls in Clone Wars.
George Lucas: Exactly! And we'll market them both! So shut up and do it before I hire some idiot with a sombrero to replace your ass.
Dave Filoni: Y-yes sir. (hangs up)
George Lucas: Goddamn I'm good.

Friday, October 7, 2011

How Steve Jobs's Early Vision For Apple Inspired A Decade Of Innovation

Steve Jobs, co-founder of Apple, has passed away at the age of 56, leaving behind a larger-than-life legacy that no obituary could possibly capture. As colleagues, family members, and all those whom he inspired begin to reflect on his life and impact, it's impossible to do so without feeling an almost shared sadness, as if the world is collectively mourning the loss of a close relative--even if most of us were never fortunate enough to meet him. We all knew this day was coming, but we can't believe it came so soon.
Simply put, Steve Jobs made our lives better, and the world is a worse place without his presence and vision. In his memory, we'll be re-publishing stories on Jobs and all that he came to represent. Please leave your thoughts and memories in the comments below.
Steve Jobs's return to Apple in 1997 is often referred to as the greatest second act in business history. He had been ousted more than a decade earlier in 1985, and was forced to watch helplessly as the company he built tumbled toward bankruptcy, hampered by poor management, a weak product line, and a dearth of innovation.
That all changed when Jobs came back and breathed new life into the struggling company. We know how the story goes from there: Apple unveiled revolutionary products--the iMac, Mac OS X, iTunes, the iPod, the iPhone, and the iPad--which led to unprecedented growth. When Jobs returned in 1997, Apple shares were trading for barely a couple of dollars; today, Apple stock hovers around $380 a share, and recently shot past $400, briefly making Apple the most valuable brand and company in the world.
But to get to that point, Jobs had to do more than introduce flashy products. He had to define Apple's future. And he did so over the years, fighting off skeptics, refocusing the company, and most importantly, giving Apple a long-term vision.
Responding To Critics
Steve Jobs's return was never a cakewalk--many were skeptical that Jobs would be able to rebuild Apple. After all, NeXT, the ultra-high-end computer company he started while away, failed to crack the mainstream market (though Apple ended up acquiring the startup for $400 million). In this rare Q&A session at 1997's Worldwide Developers Conference (WWDC), an audience member takes Jobs to task, angrily questioning his technical understanding and telling Jobs that he doesn't know "what he's talking about."
Jobs responds calmly to the question, even going so far as to say, "People like this gentleman are right." He apologizes for his mistakes in the past, acknowledges there will likely be more mistakes in the future, and admits he does not have all the answers. However, he says, "We've tried to come up with a strategy and vision for Apple--it started with: 'What incredible benefits can we give the customer?' [And did] not start with: 'Let's sit down with the engineers, and figure out what awesome technology we have and then figure out how to market that.'"
Jobs goes on to cite the reaction consumers had when first seeing the laser printer. "People went, 'Wow, yes!'" Jobs said. "That's where Apple has to get back to."
Focusing On Saying No
Apple in the 1990s had a lot more products than the Apple of the aughts. QuickTake digital cameras, LaserWriter printers, Newton PDAs--all of these product lines were discontinued when Jobs came back. Jobs's thinking can be heard in his remarks at WWDC 1997, when he explained how Apple had lost its way.
"Apple suffered for several years from lousy engineering management," he said. "There were people that were going off in 18 different directions…What happened was that you looked at the farm that's been created with all these different animals going in all different directions, and it doesn't add up--the total is less than the sum of the parts. We had to decide: What are the fundamental directions we are going in? What makes sense and what doesn't? And there were a bunch of things that didn't."
"Focusing is saying yes, right? No. Focusing is about saying no. You've got to say, no, no, no," Jobs continued. "The result of that focus is going to be some really great products where the total is much greater than the sum of the parts."
Burying The Hatchet, Letting Apple Be Apple
Microsoft was likely the main reason Apple had lost its way. The bitter battle between the two companies--for market share, over operating system superiority and patent issues--ended with Apple in a significant hole, trying to bite off more than it could chew.
Steve Jobs--who probably more than anyone had the right to be bitter about Microsoft--decided on his return to put the intense rivalry to rest. "Relationships that are destructive don't help anyone," Jobs said at MacWorld in 1997. "I'd like to announce one of our first partnerships today, and a very, very meaningful one, and that is with Microsoft."
Predictably, the crowd groaned and booed--even more so when Bill Gates made a cameo appearance via satellite. But the partnership showed just how far Jobs had come, and just how much he and his vision for Apple had matured. The deal proved incredibly important for the company: Microsoft injected $150 million into Apple; agreed to provide Macs with Microsoft Office; and agreed to patent cross-licensing and Java collaboration.
But most important was just how much Jobs dramatically changed the direction of his company, during one 12-minute speech. "If we want to move forward and see Apple healthy and prospering again, we have to let go of a few things here. We have to let go of this notion that for Apple to win, Microsoft has to lose," Jobs said. "We have to embrace the notion that for Apple to win, Apple has to do a really good job. If others are going to help us, that's great. Because we need all the help we can get…The era of setting this up as a competition between Apple and Microsoft is over."
The Apple Hierarchy Of Skepticism, Redefining Product Strategy
At MacWorld 1998, Steve Jobs finally got the chance to silence a few skeptics. After the release of the iMac, he had the facts to back up his success. Profits were surging--$47 million in his first quarter, $55 million in his second. He had launched Apple.com, which he refers to here as the "gold standard of e-commerce," a site that rocketed from 1 million hits per day to more than 10 million. And Apple's market value had quadrupled to roughly $4 billion.
"This went a long way to convince a lot of the skeptics," Jobs said. "When I came to Apple a year ago, all I heard was that Apple is dying, that Apple can't survive. Turns out that every time we convince people that we've accomplished something at one level, they come up with something new. I used to think this was a bad thing. I thought, 'When are they ever going to believe that we're going to turn this thing around?' But actually now I think it's great."
In what he calls the "Apple Hierarchy of Skepticism," Jobs lays out all the ways critics will be skeptical going forward--in many ways, the critics would never be silenced, and as Jobs said at the time, that's a good thing. It meant Apple was ahead of the curve--that it was taking risks, and trying to innovate.
One of the riskier moves Jobs also took that year was to redefine Apple's product strategy. At the time, most other OEMs were spitting out dozens of products into the market--Apple was no different. "When we got to the company a year ago, there were a lot of products--15 product platforms with a zillion variants of each one," Jobs said. "I couldn't even figure this out myself, after about three weeks: How are we going to explain this to others when we don't even know which products to recommend to our friends?"
Jobs dramatically changed and streamlined Apple's product roadmap. The company, he said, would now just offer four products, a simple offering plan that has allowed Apple to differentiate itself over the years from the endless options available from competitors such as HP, Dell, and others.
The Big Picture: Vertical Integration = Customer Experience
"I remember two-and-a-half years ago when I got back to Apple, there were people throwing spears, saying, 'Apple is the last vertically integrated PC manufacturer. It should be broken up into a hardware company, a software company, what have you,'" Jobs said in 2000 at MacWorld. "And it's true, Apple is the last company in our industry [that's vertically integrated]. What that also means if managed properly is that it's the last company that can take responsibility for customer experience--there's nobody left."
That strategy--to keep Apple a hardware-software integrated company--has allowed Apple to thrive in recent years. It was the opposite of the approach taken by Microsoft, which licensed its OS, Windows, to device makers. And even with critics arguing that Apple should follow suit, Jobs resisted--in fact, Apple pre-Jobs-comeback had tried (and failed) to license its operating system to OEMs such as Gateway. Jobs's devotion to hardware-software interplay led to breakthroughs with the Mac/OS X and iPhone/iPad/iOS.
"There's no other company left in this industry that can bring innovation to the marketplace like Apple can. It means that we don't have to get 10 companies in a room to agree on everything to innovate--we can decide ourselves to place our bets," Jobs said at the time. "We're going to integrate these things together in ways that no else in this industry can to provide a seamless user experience where the whole is greater than the sum of the parts. We're the last guys left in this industry than can do it. And that's what we're about."
The Digital Hub
At MacWorld 2001, Jobs began by looking backward to describe what he called the golden ages of the PC, from the age of productivity to the age of the Internet. He then spoke at length about what the next golden era would be.  
"I'd like to tell you where we are going," Jobs said. "What is our vision?"
Describing the "explosion of new digital devices" such as cell phones and music players, Jobs said he envisioned Apple (the Mac, specifically) becoming the new "digital hub for our emerging digital lifestyle." While he didn't explicitly say what products were likely to come from Apple--Jobs would never do such a thing--he described a future very much like the one we're living today, where Macs, iPods, iPhones, and iPads are central to our digital media experience.
"We don't think the PC is dying," Jobs said. "We think it's evolving."  
Many of Apple's products and innovations from the decade that lay ahead can be traced back to this talk, and to speeches Jobs had delivered in the years prior--to what Jobs had envisioned for the company all the way back in 1997.
And in true Jobsian style, he concluded his 2001 MacWorld address with what is now justified hyperbole. 
"We think this is going to be huge," Jobs said.
And he was right.

Steve Jobs 1955 - 2011

Apple has confirmed that Steve Jobs died today. His death came exactly six weeks after he resigned as CEO of Apple. The company did not specify a cause of death, but Jobs was diagnosed with pancreatic cancer in 2004 and underwent a liver transplant in 2009. He took a medical leave of absence beginning in January. In August, he announced he was stepping down altogether. A statement from Jobs' family said he "died peacefully surrounded by his family." He was 56.
Oct. 5, 2011 23:55 UTC
Apple Media Advisory
CUPERTINO, Calif.--(BUSINESS WIRE)-- Apple CEO Tim Cook today sent the following email to all Apple employees:
Team,
I have some very sad news to share with all of you. Steve passed away earlier today.
Apple has lost a visionary and creative genius, and the world has lost an amazing human being. Those of us who have been fortunate enough to know and work with Steve have lost a dear friend and an inspiring mentor. Steve leaves behind a company that only he could have built, and his spirit will forever be the foundation of Apple.
We are planning a celebration of Steve’s extraordinary life for Apple employees that will take place soon. If you would like to share your thoughts, memories and condolences in the interim, you can simply email rememberingsteve@apple.com.
No words can adequately express our sadness at Steve’s death or our gratitude for the opportunity to work with him. We will honor his memory by dedicating ourselves to continuing the work he loved so much.
Tim
Statement by Steve Jobs’ Family
PALO ALTO, Calif.--(BUSINESS WIRE)--Steve Jobs’ family today made the following statement regarding his death:
Steve died peacefully today surrounded by his family.
In his public life, Steve was known as a visionary; in his private life, he cherished his family. We are thankful to the many people who have shared their wishes and prayers during the last year of Steve’s illness; a website will be provided for those who wish to offer tributes and memories.
We are grateful for the support and kindness of those who share our feelings for Steve. We know many of you will mourn with us, and we ask that you respect our privacy during our time of grief.

Statement by Apple’s Board of Directors
CUPERTINO, Calif.--(BUSINESS WIRE)-- We are deeply saddened to announce that Steve Jobs passed away today.
Steve’s brilliance, passion and energy were the source of countless innovations that enrich and improve all of our lives. The world is immeasurably better because of Steve.
His greatest love was for his wife, Laurene, and his family. Our hearts go out to them and to all who were touched by his extraordinary gifts.