There are three major interpretations of the technological singularity:
I.J. Good's intelligence explosion, Vernor Vinge's event horizon, and Ray Kurzweil's law of accelerating returns.
I.J. Good's concept of an "intelligence explosion" can best be defined in his own words:
: ''"Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make." ''
This need not be confined to a machine, of course. Some singularitarians expect the intelligence explosion to be heralded instead by brain-computer interfaces (BCIs).
Vernor Vinge's "event horizon" is less concretely defined. It is an analogy to a gravitational singularity in physics, such as the one at the center of a black hole. As you approach a black hole, spacetime behaves more and more strangely until you cross the "event horizon," beyond which no information can escape and known physics breaks down. Vinge postulates that a similar barrier exists in history: as progress accelerates, there will eventually come a point past which no predictions can be made, the future having become far too complex for a human mind to understand.
Ray Kurzweil's "law of accelerating returns" extrapolates today's exponential growth back to the beginning of life on Earth, and uses that long-running trend as justification for expecting the growth to continue far into the future.
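The core of the law of accelerating returns is pure exponential extrapolation. A minimal sketch of that arithmetic, assuming a purely hypothetical two-year doubling time (the specific parameter is illustrative, not a figure from Kurzweil):

```python
# Illustrative exponential extrapolation in the spirit of the
# "law of accelerating returns". The doubling time and baseline
# are hypothetical parameters chosen for the example.

def capability(years_from_now, doubling_time_years=2.0, baseline=1.0):
    """Capability after a given number of years of uninterrupted
    exponential growth with a fixed doubling time."""
    return baseline * 2 ** (years_from_now / doubling_time_years)

# With a 2-year doubling time, capability grows 32-fold in a decade
# and roughly 1000-fold in two decades.
print(capability(10))  # 32.0
print(capability(20))  # 1024.0
```

The striking feature, and the reason the argument is contested, is that the same formula yields modest near-term change but astronomical long-term change, so conclusions are extremely sensitive to whether the exponential trend actually holds.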