On the occasion of IBM's "Watson" computer beating two human "Jeopardy!" champions (though it could not properly identify Chicago in the "Final Jeopardy!" round), I note the following chronology:
- 1940s: computer power is measured in hundreds of operations per second, and we use computers to do things like crack the Enigma cipher and win World War II.
- 1960s: computer power just makes it into the millions of operations per second, and we use computers to put men on the moon and control the national telephone network.
- 1980s: computer power is comfortably in the tens of millions of operations per second, microchip technology crams it onto tiny chips, and we use them to put a phone in every pocket, so people can annoy each other while driving 60 mph through a school zone.
- late 1990s: we break into the billions of operations per second, and use this power to render really realistic blood in first-person-shooter video games.
- 2000s: computer power is now in the tens of billions of operations per second, and we use this power to win a TV trivia game show.
Conclusion: over time, the power of the computer multiplied by the usefulness of the application remains constant.
The logical extension is that when we finally do build a computer that matches the power of the human brain, all it will do is sit around, watch TV, and play video games. Which might be a useful thing, if it frees us from those tasks.