January 16-17, 2013 Santa Clara Compute Summit Wednesday, January 16, 13


TRANSCRIPT

January 16-17, 2013, Santa Clara

Compute Summit


Compute Summit

The Architecture of Participation: Lessons from open source software for open source hardware

Tim O’Reilly (@timoreilly), O’Reilly Media



“History doesn’t repeat itself, but it does rhyme.”

- Mark Twain


I’m going to look both backwards and forwards in this talk. Before talking about the future, it’s worthwhile to reflect on the past, to see what lessons we can take from it. As Mark Twain said, “History...


The computing universe that we take for granted began with a profound act of open source hardware. As George Dyson explains in his book Turing’s Cathedral, John von Neumann and his colleagues at the Institute for Advanced Study put the fundamental architecture of stored program computers, the ancestral architecture reflected in all computers we use today, into the public domain, declining to seek any patents. This was an act of radical idealism. But it wasn’t “political” in the sense that “free software” came to be seen in the 1980s. It was really all about the sense that this technology was too important, too fundamental for one organization to try to wring proprietary advantage out of it. For computing to reach its full, world-changing potential, it had to be available for everyone to build on. That’s the spirit with which the Open Compute Project also operates.

“What we’re selling to users of open source is control.”

- Michael Tiemann, Red Hat


Michael Tiemann echoed this sense of the importance of the users of technology being in charge of their own destiny when he said “What we’re selling...” (This is probably not an exact quote, just a memory of a conversation we had the better part of a dozen years ago.)

Why Fidelity and Goldman Sachs Care About Open Compute

“In the past we needed to simplify and reduce motherboards of unnecessary proprietary components, open up and simplify management software, maximize hands-free management software, and so on, in order to make them work efficiently for us. This behavior was similar to Facebook’s early days’ server OEM experiences. Despite our numerous attempts in the past to influence design, none of the server providers listened to our needs. Although not an ideal design, we maximized power efficiency and automated system management as best we could.”

“What jumped out at us last summer at the OCP summit was that for the first time the non-hyperscale world could access many of the same design points, ODMs, simplifications and “freedom of choice” advantages indigenous to the hyperscale Web 2.0 world. This open access led to an evolution in thinking.”

- Peter Krey, in A Concise History of AMD’s Roadrunner Server for OCP http://www.opencompute.org/2012/06/25/a-concise-history-of-amds-roadrunner-server-for-ocp/


It’s this sense of users working together to advance the state of the art that comes through in the OCP blog post explaining why Fidelity Investments and Goldman Sachs came to work together on the AMD Roadrunner server for OCP. “Despite our numerous attempts...”


This is also the underlying spirit of Richard Stallman’s Free Software Definition, the first overtly political statement of user rights, from the early 1980s.

Unfortunately, the Free Software Foundation brought a lot of stridency to the discussion, along with what became, to my mind, an excessive focus on legal means - licenses - as the heart of the free software story.

“Given enough eyeballs, all bugs are shallow”

- Eric Raymond


The open source movement emerged in 1998 with a more pragmatic approach, selling openness as a benefit, providing better software development practices through community.


But there was still an unfortunate focus on licensing as the heart of the open source movement. Open source was ultimately defined by a set of approved licenses.

We need to get away from the narrative that makes us focus on licensing. The most important things are system architecture, community, and tools and practices for actually sharing our work. Licenses are just a way of making sure that bad actors don’t ruin the party.


I like how the OCP seems pretty clear about this. The licenses are simple, focused mainly on restricting patent assertions, and the emphasis is on providing specifications and implementation details. You know that it’s about working designs, and about community.


The core advances of open source software, in my opinion, have always come from people who are more pragmatic. I was around in the early days of Unix, and what drove code sharing wasn’t radical idealism or licenses. The early Unix code wasn’t shared under a license that would have qualified as open source, but it was open enough. Early versions of Unix were developed collaboratively by hundreds of developers at a loose network of institutions, most notably AT&T, where it started, and UC Berkeley.

That community collaboration is a big part of what OCP is trying to recreate. The future of OCP depends on you. It is your contributions that will push it forward. Own it!

“No matter who you are, most of the smart people work for someone else.”

- Bill Joy


The importance of open source in enabling a distributed community is summed up in what has been referred to as Joy’s Law. “No matter...” Tapping all that talent outside your own walls is what makes open source so fabulous for innovation.

“Richard Stallman talks about the evil of copyright, and says we need copyleft to fix it. At Berkeley, we just say ‘Go down to Copy Central and copy it.’”

- Kirk McKusick, head of the Berkeley Unix project


But there was another element to the early spread of open source. Unix was the first operating system that became divorced from the underlying hardware. It ran on many different machines with very different architectures. Code from one machine couldn’t simply be run on another; it had to be recompiled. With a standardized hardware architecture, PC software could be distributed in binary. Unix *had* to be distributed in source form, because that was the only way to get the software to run. All of us spent time “porting” programs we’d received to account for either differences in hardware architecture or differences between various implementations of the operating system. In addition, because it was initially a non-commercial operating system, software was shared freely. Unix was developed collaboratively by hundreds of developers across many organizations. In an odd way, open source was a response to the problem of incompatibility.


We know all about incompatibility in the hardware world. My friend Nat Torkington once said that there must be a special circle of Dante’s hell reserved for the makers of incompatible power supplies for consumer devices. In his infernal vision, the manufacturers of such devices were all condemned to a hell of perpetual sexual arousal combined with incompatible sexual organs. I’m sure that those of you in the data center world know people who belong in this same hell. But I thought this image of one of Dante’s circles of hell recreated in lego tells another story about architecture. You can make anything out of lego, because the pieces are designed to fit together.

“The book is perhaps most valuable for its exposition of the Unix philosophy of small cooperating tools with standardized inputs and outputs, a philosophy that also shaped the end-to-end philosophy of the Internet. It is this philosophy, and the architecture based on it, that has allowed open source projects to be assembled into larger systems such as Linux, without explicit coordination between developers.”


This is also what was so powerful about Unix, the system that Linux emulated. It wasn’t itself open source by today’s standards of licensing, but it had an architecture that allowed it to be developed collaboratively by a community of loosely connected developers. It was the architecture that mattered. In writing an entry for this classic book on Wikipedia, I wrote... I believe this philosophy of interoperable components is also at the heart of the OCP vision.
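The pipes-and-filters philosophy described above can be sketched in a few lines. The stage functions here are hypothetical stand-ins for the real `grep`, `sort`, and `uniq -c` tools, meant only to show how stages that agree on a standardized input and output (plain lines of text) compose without any explicit coordination:

```python
from itertools import groupby

# Each "filter" consumes lines and produces lines; none of them knows
# anything about the others. That shared convention is the whole contract.

def grep(pattern, lines):
    """Keep only lines containing the pattern (like grep)."""
    return (line for line in lines if pattern in line)

def sort_lines(lines):
    """Sort lines lexically (like sort)."""
    return iter(sorted(lines))

def uniq_c(lines):
    """Count runs of adjacent identical lines (like uniq -c)."""
    return (f"{len(list(group))} {key}" for key, group in groupby(lines))

# Composing the stages works exactly like a shell pipeline:
log = ["GET /index", "GET /about", "GET /index", "POST /login"]
for line in uniq_c(sort_lines(grep("GET", log))):
    print(line)
```

Because every stage speaks the same "language" of text lines, a new filter written by a stranger drops into the pipeline unchanged, which is the point of the philosophy.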

“The architecture of participation”

“I couldn’t have written a new kernel for Windows even if I had access to the source code. The architecture just didn’t support it.”

- Linus Torvalds


I heard another striking assertion about the importance of architecture fifteen years or so ago in a conversation with Linus Torvalds. He observed...

That term “architecture” stuck in my head, and I realized how true it was of all the most successful open source projects - that it was far more than a matter of just releasing source code. It was designing systems in such a way that someone could bite off a manageable chunk and modify, replace, or extend it. I call this “the architecture of participation.” Some systems are designed for participation; others are not.

The internet would not exist without open source software


Here’s an even stronger assertion: “The internet....” And that’s not just because the initial implementations of TCP/IP and related tools like the DNS came out of Berkeley Unix and were open source. It’s not just because the services we all take for granted are built on top of an open source foundation. It’s because the very architecture of the internet and the web is shaped by open source.


Tim Berners-Lee put the web into the public domain, and that was a profound act of open source software. But the software that Tim wrote is long gone, subsumed by other software that built on the architecture, communication protocols, and markup language that he designed. An even deeper contribution was the fundamental architecture of the web, which allowed anyone to put up a site without permission from anyone - all they had to do was speak the same language and communication protocol.


You also see this architectural element in the success of the Apache web server. I remember back in the mid 90s, when there was this media hysteria that Apache wasn’t keeping up, because it wasn’t adding features as fast as Netscape’s web server or Microsoft IIS. The folks at Apache were clear: We’re an HTTP server. We have an extension layer (read “we are a platform”) that allows other people to add new features. Fifteen years later, Apache is still the dominant web server, and Netscape and IIS are footnotes in history.

Work on stuff that matters


Moving on to another topic.

I’ve made a practice for the past half-dozen years of asking the tech industry to work on stuff that matters. The Open Compute Project matters, and I want to give you some forward-looking perspective on just why I think it does.

opportunities for innovation

minimize environmental impact

bring computing to people at the lowest cost and widest distribution

improved upon by anyone


I want to start with the mission statement for OCP. I’ve pulled out some key phrases. This is idealism of the kind expressed by von Neumann and his colleagues at the Institute for Advanced Study.

Why does this matter so much?


The traditional wisdom was always that there weren’t that many companies of Google or Facebook scale. We now know better.


What we’re really engaged in is building a platform for a global internet operating system. Back in 2002, I ran a conference entitled Building the Internet Operating System. In his keynote at that conference, Clay Shirky told a thought-provoking story. He remarked on the assertion by IBM CEO Thomas Watson Jr. that he saw no need for more than five computers in the world. “We now know that he was wrong,” said Clay. The audience nodded, thinking of the millions of PCs in the world. Today, it’s billions of smartphones. But then Clay delivered his devastating punch line: “We now know that he overstated the number by four.” We are moving towards a world that can be thought of as one global computer. The big battle in computing is about who will control the operating system for that computer.


With the rise of applications like Facebook, which reach a billion people, you can see why this matters. Before we know it, there will be applications with many billions of users.


And of course, the smartphone is really just a portal to network services.

What happens when the kind of collective intelligence applications of the web are driven by sensors rather than people typing on keyboards?


But the big question I’ve been asking myself for the past half dozen years is this: “What happens...”

The Google Autonomous Vehicle


We see this in unexpected places, such as the Google autonomous vehicle. This car is thought-provoking on a number of levels.

2005: Seven Miles in Seven Hours


You see, back in 2005, the car that won the DARPA Grand Challenge went seven miles in seven hours.

AI plus the recorded memory of augmented humans


What was the difference? It turns out that the autonomous vehicle is made possible by Google Streetview. Google had human drivers drive all those streets in cars that were taking pictures, and making very precise measurements of distances to everything. The autonomous vehicle is actually remembering the route that was driven by human drivers at some previous time. That “memory”, as recorded by the car’s electronic sensors, is stored in the cloud, and helps guide the car. As Peter Norvig of Google pointed out to me, “picking a traffic light out of the field of view of a video camera is a hard AI problem. Figuring out if it’s red or green when you already know it’s there is trivial.” Effectively, the Google autonomous vehicle is part of a cloud-based system reliant on what I’ve called “the global brain.”
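Norvig’s point can be made concrete: finding the light in an arbitrary frame is the hard perception problem, but once the stored map says where it is, deciding red versus green reduces to a trivial comparison. The frame, coordinates, and function below are invented purely for illustration:

```python
# Sketch: given a known bounding box for the traffic light (supplied by
# a prior map, not by perception), classify it by comparing the average
# red and green channels in that region. All values here are made up.

def light_color(image, box):
    """Average the region's RGB pixels and compare red vs. green."""
    x0, y0, x1, y1 = box
    pixels = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    r = sum(p[0] for p in pixels) / len(pixels)
    g = sum(p[1] for p in pixels) / len(pixels)
    return "red" if r > g else "green"

# A tiny fake 2x2 "frame" whose known light region is mostly red:
frame = [[(200, 30, 20), (190, 40, 25)],
         [(180, 35, 30), (210, 25, 15)]]
print(light_color(frame, (0, 0, 2, 2)))  # prints "red"
```

The hard part - knowing the box - was done earlier, by humans driving the route with instrumented cars; the car at runtime only does the easy part.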


You can see this same “data center behind apps in everyday life” in applications like Square. Square is revolutionizing the retail experience for small merchants. I don’t know how many of you have tried the combination of Square Register and the Square wallet app. It automatically checks you in when you walk into a participating merchant. Your name and face appear on the register, and since your payment details are already on file, all the retail clerk has to do is confirm your identity, as shown in this screen shot.


We can also see this in the Apple Store. If you squint a little, you can see the Apple Store clerk as a cyborg. Where most stores (at least in America) have used technology to eliminate salespeople, Apple has used it to augment them. Each store is flooded with smartphone-wielding salespeople who are able to help customers with everything from technical questions to purchase and checkout. Walgreens is experimenting with a similar approach in the pharmacy, and US CTO Todd Park foresees a future in which health workers will be part of a feedback loop including sensors to track patient data coupled with systems that alert them when a patient needs to be checked up on. The augmented home health worker will allow relatively unskilled workers to be empowered with the much deeper knowledge held in the cloud.


Or consider how a taxi service like Uber creates a system connecting passenger and driver - with a data-center app providing the dispatching, coordination, billing, and reputation system that ties it all together.


They said that CES was “the break out year for the Internet of Things.” While much of it may be hype, we do see a lot of activity around ideas like the connected car, smart homes, and the quantified self - all consisting of sensor driven apps with big data back ends.


Perhaps the most striking development on the Internet of Things front is what GE is calling “the Industrial Internet”. I spoke at GE’s event in San Francisco a few months ago. Jet engines equipped with sensors are putting out a terabyte of data a day. Analysis of that data can improve fuel efficiency, predict when parts are breaking down and require service, and much more. Again, devices are being woven into something greater, and there’s a data center somewhere in the background.


I really started thinking about the operational implications of the internet as operating system back in 2006. I wrote a blog post called Operations: The New Secret Sauce, which made the assertion that in the future, operations - what’s now called “devops” in particular - would become a key competency not just for internet companies but also for the enterprise.


That prediction led a group of operations professionals to ask me to launch “a gathering place for their tribe.” That became our Velocity Conference, which focuses on web performance and operations, and increasingly, the back end for the internet of everything. What you do with the Open Compute Platform is very much part of that same story.

Large cloud end users “are beginning to see devops, openstack, open source methods, and hardware as one long continuum.”

- Bob Ogrey, AMD


Bob Ogrey of AMD reportedly described how the devops movement, open source, and open hardware are all part of the same enterprise transformation. He said....

“Being a developer on someone’s platform will mean being hosted on their infrastructure.”

- Debra Chrapaty, Microsoft, 2006


But that’s a key to why the Open Compute Platform really matters. Remember what Michael Tiemann said about the benefit of open source being control by users? The conversation that led me to write that 2006 blog post about web operations was with Debra Chrapaty who at the time ran operations for Windows Live. (She’s now the CIO of Zynga.) She said, “Being a developer...” Since I was talking with her at the O’Reilly Open Source Convention, that led me to ask, “who will control that platform?” and to make the case that what we now call the cloud, not the desktop, should be the focus of the open source community.

“What we are creating now is a monster whose influence is going to change history, provided there is any history left.”

- John von Neumann


But as I’ve suggested here, the enterprise transformation is only the tip of the iceberg. We’re talking about something that is incredibly pervasive.

von Neumann’s wife Klari recalled him waking up one night in 1945 in a cold sweat. He said: “What we are creating...” She thought he was talking about the atom bomb, but George Dyson argues that his greater worry was the growing power of machines. Again, if, as Michael Tiemann notes, open source is about control, our ability to have distributed control over the hardware of the global brain may turn out to be very important.

“The species of devices of which this is to be the first representative is so radically new that many of its uses will become clear only after it is put into operation. These uses which are not, or not easily, predictable now are likely to be the most important ones.”

- John von Neumann

But I prefer to end on a more hopeful note. In a letter to one of the military funders of the first computing project at the Institute for Advanced Study, von Neumann wrote, “The species...”

That’s the real beauty of open source, that it allows everyone to play a role in inventing the future. Your creativity is what will make this a success. Go forth and make the future happen! Thank you very much.