
The paradigm shift in computing
Abu Sebastian
IBM Research – Zurich, 8803 Rüschlikon, Switzerland.

Computers have played a key role in shaping human history over the past several decades. Computing technology has led to dramatic improvements in productivity and the standard of living. In that sense, there are parallels with the industrial revolution, in which steam engines, automobiles and airplanes significantly altered the course of human history. While these inventions mostly augmented our physical abilities, computers have been enhancing our intelligence. It is also worth noting that, while Eastern economies such as India were notably absent during the industrial revolution, companies and individuals of Indian origin continue to play a key role in shaping the computing revolution.

In this article, I would like to discuss a paradigm shift that is occurring in the field of computing technology. The ongoing developments will have significant ramifications for the way our society functions. John Kelly, the director of IBM Research, refers to this as the third era of computing [1].

In the first era of computing, computers were essentially tabulating machines that counted things. These early computers were used in a number of applications, including the control of looms and other industrial equipment. Note that, historically, the term "computers" referred to human clerks who specialized in making calculations using various established methods. Thousands of them were employed in commerce, government, and research establishments.

The second era of computing, known as the programmable computing era, emerged in the 1940s. The foundations of this computing paradigm were laid by pioneering thinkers such as Alan Turing and John von Neumann. This era is characterized by the so-called "if A, then B" logical statements programmed into a computer. Electronic devices governed by software programs perform calculations on data, execute logical sequences of steps, and store the results. This is a powerful paradigm that has served us very well over many decades.

However, in recent years we have been witnessing something quite remarkable. There is an explosion of data [2]. With the multitude of digital devices around us, such as mobile phones and digital cameras, gigantic amounts of information are created every second. The emergence of social networking, sensor networks, and huge storehouses of business, scientific and government records creates an abundance of information referred to as "big data". It is estimated that the digital universe is growing at about 60 percent each year!

Moreover, we have started realizing the significant power of all this data at our disposal. "Big data" is a natural resource ready to be mined. This data enables us to understand incredibly complex economies and societies. We have realized that a statistical approach, based on analyzing vast amounts of information using powerful computers and sophisticated algorithms, produces something similar to intelligence or knowledge. This has brought us to the third era of computing, namely, the so-called cognitive era. Note that cognitive computing differs significantly from the attempts at artificial intelligence a couple of decades ago. There, the attempt was to somehow program computers to exhibit intelligent behavior. Those attempts failed to deliver on their promise, except in some niche applications such as voice recognition and industrial robotics.
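To make the contrast concrete, here is a purely illustrative toy in Python (not how Watson or any production system actually works, and with made-up example data): on one side a hand-written "if A, then B" rule of the programmable era, on the other a rule whose behavior is estimated from labelled examples.

```python
from collections import Counter

# Programmable era: the rule is written by a human.
def spam_rule(message):
    return "lottery" in message or "prize" in message

# Statistical approach: count how often each word appears in labelled
# examples and score new messages against those counts.
examples = [("you won a lottery prize", True),
            ("lunch at noon?", False),
            ("claim your prize now", True),
            ("meeting notes attached", False)]

spam_counts, ham_counts = Counter(), Counter()
for text, is_spam in examples:
    (spam_counts if is_spam else ham_counts).update(text.split())

def learned_rule(message):
    words = message.split()
    return sum(spam_counts[w] for w in words) > sum(ham_counts[w] for w in words)

print(spam_rule("free prize inside"), learned_rule("free prize inside"))  # True True
```

The first function encodes the programmer's knowledge directly; the second derives comparable behavior from data, which is the essence of the statistical shift described above, albeit at a minuscule scale.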

FIG. 1. IBM’s Watson computer competing in a Jeopardy! match.

A significant milestone in cognitive computing was when IBM's Watson computer bested two past grand champions, Brad Rutter and Ken Jennings, on the TV quiz show Jeopardy! [3]. Watson demonstrated that computers could now derive meaning from the unstructured knowledge existing in books, newspapers, magazines, web sites, social media, and anything expressed in natural language. In response to the questions posed in the quiz show, Watson could rely on all this information, assess its confidence level and, when sufficiently confident, beat the competitors to the buzzer. It can be seen that similar expertise can readily be applied in medical diagnosis, financial advice, customer service, etc. In fact, in collaboration with Memorial Sloan-Kettering Cancer Center, researchers are investigating the application of Watson as a medical advisor [4].

So how does a cognitive computer differ from a traditional computer? While traditional computers must be programmed by humans to perform specific tasks, cognitive computers such as Watson will learn from their interactions with data and humans and will be able, in a sense, to program themselves to perform new tasks. Traditional computers are designed to calculate rapidly, while cognitive computers are designed to draw inferences from data. Traditional computers have only basic sensing capabilities, whereas cognitive computers will significantly augment our sight, hearing, smell and touch. With cognitive computing, we will be able to harvest insights from huge quantities of data to handle complex situations, make more accurate predictions about the future, and better anticipate the unintended consequences of actions. This will enable us to see the big picture and make better and more objective decisions.

Another tantalizing prospect of cognitive computing is that of discovering and exploring new ideas, including new scientific laws governing the universe.

The advent of cognitive computing does not mean that stored-program computers will cease to be relevant. In most instances, we will see a hybrid system in which both types of computing co-exist. In the same vein, cognitive computers are not meant to replace humans. Instead, they will augment our cognitive capabilities to process the significant amount of data at our disposal. To paraphrase John Kelly: "Humans and machines will collaborate to produce better results - each bringing their own superior skills to the partnership. The machines will be more rational and analytic - and, of course, possess encyclopedic memories and tremendous computational abilities. People will provide judgment, intuition, empathy, a moral compass and human creativity." [1]

As an electrical engineer, what excites me most about the arrival of cognitive computing is the need for a new computer architecture. Most computers in use today are based on the von Neumann architectural principles laid out in 1945 by the Hungarian-American mathematician John von Neumann. The key components of such an architecture are a central processing unit (CPU), memory units, and a "bus" that interconnects those components to move data around. Data is processed via calculations and the application of logic in the CPU. The data, as well as the instructions to be executed by the CPU, are stored in the memory.
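As a point of reference, here is a deliberately minimal sketch in Python of this organization; the tiny instruction set and the addresses are invented for illustration and do not correspond to any real processor. What matters is that instructions and data share one memory, and the CPU reaches both over the bus.

```python
# Toy von Neumann-style machine: program and data live in the same memory.
memory = {
    0: ("LOAD", 100),    # acc <- memory[100]
    1: ("ADD", 101),     # acc <- acc + memory[101]
    2: ("STORE", 102),   # memory[102] <- acc
    3: ("HALT", None),
    100: 2, 101: 3, 102: 0,   # data shares the same address space
}

def run(memory):
    pc, acc = 0, 0                 # program counter and accumulator (the "CPU state")
    while True:
        op, addr = memory[pc]      # fetch: one trip to memory per instruction
        pc += 1
        if op == "LOAD":
            acc = memory[addr]     # a further trip to memory for the operand
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return

run(memory)
print(memory[102])  # 5
```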

In the von Neumann architecture, each processing step in the CPU requires the movement of data and instructions between the CPU and the memory. Note that in state-of-the-art processors, the CPU clock frequency is in excess of 1 GHz, whereas the time to access dynamic random access memory is on the order of tens of nanoseconds. A random access to a storage element, such as the hard disk, has a mean latency on the order of milliseconds. To put this into perspective, if the time it took the CPU to perform one instruction were 1 second, fetching data from the hard disk to perform that instruction could take a month of waiting! Naturally, this slows down the whole computing process whenever a large amount of data needs to be processed, as in the case of cognitive computing. This limitation is referred to as the von Neumann bottleneck.
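A quick back-of-the-envelope calculation, using typical orders of magnitude rather than measurements of any particular system, shows where the month-long wait in the analogy comes from.

```python
# Illustrative latencies (typical orders of magnitude, not measurements).
cpu_cycle   = 1e-9    # ~1 ns per cycle at a clock frequency above 1 GHz
dram_access = 50e-9   # DRAM access: tens of nanoseconds
disk_access = 3e-3    # hard-disk random access: a few milliseconds

# Rescale so that one CPU cycle corresponds to one second of "human time".
scale = 1.0 / cpu_cycle
print(f"DRAM access ~ {dram_access * scale:,.0f} s (about a minute)")
print(f"Disk access ~ {disk_access * scale / 86400:,.0f} days (about a month)")
```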

It is clear that the von Neumann architecture is highly inefficient for such data-intensive, sense-making, insight-extracting, problem-solving cognitive computers. IBM Watson, which won the Jeopardy! show, consisted of 2880 computing cores (roughly ten refrigerators' worth of size and space) and required about 80 kW of power and 20 tonnes of air-conditioned cooling capacity [5]. Compared to this, the human brains of Watson's competitors each occupied less space than a 2-litre soft-drink bottle and consumed power on the order of 10 W, the equivalent of that consumed by a light bulb.

Hence, it is apparent that for cognitive computing, the structure of the human brain could serve as a blueprint. The human brain evolved over millions of years to become a remarkable instrument of cognition. Inside the brain, information is processed in parallel, and computation and memory are entwined. Each neuron is connected to many others, and the strength of these connections (known as synapses) changes constantly as the brain learns. These attributes are thought to be crucial to learning and memory.
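To illustrate what "computation and memory entwined" can mean, here is a toy Hebbian sketch in Python; it is not a biologically faithful model and not the learning rule of any project mentioned below, just a small layer whose synaptic weights are strengthened whenever pre- and post-synaptic activity coincide.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 8, 4
weights = rng.normal(0.0, 0.1, size=(n_post, n_pre))   # synaptic strengths

def hebbian_step(weights, pre_activity, learning_rate=0.01):
    """Compute the post-synaptic response, then adapt the synapses in place."""
    post_activity = np.tanh(weights @ pre_activity)                    # computation ...
    weights += learning_rate * np.outer(post_activity, pre_activity)  # ... and memory, in the same weights
    return post_activity

for _ in range(100):
    hebbian_step(weights, rng.random(n_pre))   # the network changes as it "experiences" inputs
```

Every input both uses the weights and rewrites them; that is precisely the property attributed to the brain above, and precisely what the von Neumann split between CPU and memory lacks.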

It is fascinating that one of the fundamental quests of humanity, namely, that of figuring out how our brain functions, has found a synergy with the need to develop a computing architecture for cognitive computing. The former objective is being actively researched around the world [6]. In Europe, the Human Brain Project aims to develop a large-scale computer simulation of the brain, whereas in the United States, the Brain Activity Map is working towards establishing a functional connectome of the entire brain. In April 2013, US President Barack Obama announced the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies). A significant research effort towards building brain-inspired computing hardware is the US defense-sponsored project "SYNAPSE", in which IBM is a key partner [7]. In this project and elsewhere, the main focus is on mimicking the biophysics of neural systems in silicon [8,9]. More recently, researchers have also been investigating the application of novel nanoscale devices known as memristors to mimic synaptic plasticity [5].

To summarize, we are witnessing the third era of computing, namely, cognitive computing. Cognitive computers will leverage the vast amount of digital data being created to significantly augment our cognitive capabilities. However, the conventional von Neumann architecture that traditional computers rely on will prove highly inefficient for the data-centric nature of cognitive computing. There is significant ongoing research towards developing an alternative computer architecture, one that will require innovations in several diverse fields such as neuroscience and nanotechnology.

[1] Kelly III, J. E. & Hamm, S. Smart Machines: IBM's Watson and the Era of Cognitive Computing (Columbia University Press, 2013).
[2] Gantz, J. F. & Chute, C. The diverse and exploding digital universe: An updated forecast of worldwide information growth through 2011 (IDC, 2008).
[3] Ferrucci, D. A. Introduction to "This is Watson". IBM Journal of Research and Development 56, 1–1 (2012).
[4] Upbin, B. IBM's Watson gets its first piece of business in healthcare. Forbes. Retrieved March 31, 2014, http://www.forbes.com/sites/bruceupbin/2013/02/08/ibms-watson-gets-its-first-piece-of-business-in-healthcare (2013).
[5] Kuzum, D., Yu, S. & Wong, H. P. Synaptic electronics: materials, devices and applications. Nanotechnology 24, 382001 (2013).
[6] Kandel, E. R., Markram, H., Matthews, P. M., Yuste, R. & Koch, C. Neuroscience thinks big (and collaboratively). Nature Reviews Neuroscience 14, 659–664 (2013).
[7] Modha, D. S. et al. Cognitive computing. Communications of the ACM 54, 62–71 (2011).
[8] Arthur, J. V. et al. Building block of a programmable neuromorphic substrate: A digital neurosynaptic core. In The 2012 International Joint Conference on Neural Networks (IJCNN), 1–8 (IEEE, 2012).
[9] Indiveri, G. et al. Neuromorphic silicon neuron circuits. Frontiers in Neuroscience 5 (2011).