
When a Computer Outsmarted a Master


On May 11, 1997, a single event signaled a turn in the world of computing. The reigning chess champion, Garry Kasparov, was defeated in a six-game match by a computer. The computer, known as Deep Blue, was created by IBM to tackle the complexity of chess. If the robotic process automation robots of 2014 had a grandfather, it would be Deep Blue.


Conquering the game of chess had been a goal for computer scientists since the 1950s. Various computers were built for the purpose, but none could hold its own against a true chess master. Then, in 1985, a graduate student at Carnegie Mellon University named Feng-hsiung Hsu began work on a chess-playing computer as his dissertation project. He was joined by a classmate, Murray Campbell, and the two were later hired by IBM to continue the project with an expanded team.

Kasparov actually won the first match proposed by IBM, in 1996. The 1997 contest was a rematch, held after significant upgrades were made to Deep Blue. In a small television studio, Kasparov faced off against the computer (with a human operator moving the pieces on its behalf). Kasparov and the computer each won a game, followed by three draws, and the match concluded with a mistake by Kasparov that allowed Deep Blue to seize victory. Kasparov later claimed that the computer had cheated, but Deep Blue was retired soon afterward, so there was never a rematch.

Of course, technology has since progressed to the point where ordinary computer programs (no dedicated hardware required) routinely beat chess masters. The key to Deep Blue's success was applying massive computation and large data sets to a complex problem, an approach IBM called deep computing. It also made use of parallel processing, which splits a problem into many pieces that are worked on simultaneously by multiple CPUs. Problems are solved much faster this way, and parallel processing is at the heart of all high-performance computing today. At the time of the match, Deep Blue could examine 200 million positions per second and look ahead as many as 14 moves.
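The "split the problem into pieces" idea can be illustrated with a toy sketch (not Deep Blue's actual code): each candidate move at the root of a game tree is scored by a separate worker using a basic minimax search, and the results are combined at the end. The tree here is a made-up example, with integers as leaf evaluations.

```python
# Toy sketch of parallel game-tree search: score each root move in a
# separate worker, then pick the best. Purely illustrative.
from concurrent.futures import ThreadPoolExecutor

def minimax(node, maximizing):
    """A node is either an int (leaf score) or a list of child nodes."""
    if isinstance(node, int):
        return node  # static evaluation at a leaf
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

def best_move_parallel(root_moves):
    # Each top-level move is searched by its own worker -- a simple
    # version of splitting one problem across multiple processors.
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(lambda m: minimax(m, False), root_moves))
    best = max(scores)
    return scores.index(best), best

# Three hypothetical candidate moves and their subtrees:
tree = [[3, [5, 1]], [2, 8], [[6, 4], 0]]
move, score = best_move_parallel(tree)  # -> move 0, score 3
```

Real engines add far more (alpha-beta pruning, opening books, endgame tables, and in Deep Blue's case custom chess chips), but the structure of dividing the search across processors is the same in spirit.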

Chess was an appropriate challenge for pushing the limits of computing because the game combines a set of simple rules with immense possibility. What IBM achieved with Deep Blue led to remarkable developments in deep computing and parallel processing, which are now used by businesses and science labs across the globe. It was a significant step forward in the history of robotics, and we are still reaping the benefits today.

This is the third part of a series on the history of computer robotics. Read the first and second parts <em>here</em> and <em>here</em>.
