The question of what happens when machines get to be as intelligent as and even more intelligent than people seems to occupy many science-fiction writers. The Terminator movie trilogy, for example, featured Skynet, a self-aware artificial intelligence that served as the trilogy’s main villain, battling humanity through its Terminator cyborgs. Among technologists, it is mostly “Singularitarians” who think about the day when machines will surpass humans in intelligence. The term “singularity” as a description for a phenomenon of technological acceleration leading to “machine-intelligence explosion” was coined by the mathematician Stanislaw Ulam in 1958, when he wrote of a conversation with John von Neumann concerning the “ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.” More recently, the concept has been popularized by the futurist Ray Kurzweil, who pinpointed 2045 as the year of singularity. Kurzweil has also founded Singularity University and the annual Singularity Summit. (via The Consequences of Machine Intelligence - Moshe Y. Vardi - The Atlantic)
1. Amara’s Law: “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run” (via Four Geeky Laws That Rule Our World - Neatorama)
That effectively means that while your total content experience perhaps doesn’t need to be designed for a smartphone, at least the initial part of it should be, and that part should be integrated with how that content might be used on other devices — so, for example, watching a film first on a phone and then finishing it on a TV, or starting a shopping experience on a phone and completing it on a PC….
Singularity: What do you think about the technological singularity? Do you think it will happen? If so do you fear it?
What is the Technological Singularity?
The technological singularity is the hypothetical future emergence of greater-than-human superintelligence through technological means. Since the capabilities of such intelligence would be difficult for an unaided human mind to comprehend, the occurrence of a technological singularity is seen as an intellectual event horizon, beyond which events cannot be predicted or understood. - Wikipedia
Rate of Technological Change May Be Outstripping Humans’ Ability to Manage and Adapt to It
Our relationship with tools dates back millions of years, and anthropologists still debate whether it was the intelligence of human-apes that enabled them to create tools or the creation of tools that enabled them to become intelligent.
In any case, everyone agrees that after those first tools had been created, our ancestors’ intelligence coevolved with the tools. In the process our forebears’ jaws became weaker, their digestive systems slighter, and their brains heavier.
Chimpanzees, genetically close to us though they are, have bodies two to five times as strong as ours on a relative basis and brains about a quarter as big. In humans, energy that would have gone into other organs instead is used to run energy-hungry brains. And those brains, augmented by tools, more than make up for any diminishment in guts and muscle. Indeed, it’s been a great evolutionary trade‑off: There are 7 billion people but only a few hundred thousand chimpanzees.
In the distant past our tools improved slowly enough to allow our minds, our bodies, our family structures, and our political organizations to keep up. The earliest stone tools are about 2.6 million years old. As those and other tools became more refined and sophisticated, our bodies and minds changed to take advantage of their power. This adaptation was spread over more than a hundred thousand generations.
The end of death?
‘Mind uploading’ featured in academic journal special issue for first time
The Special Issue on Mind Uploading (Vol. 4, issue 1, June 2012) of the International Journal of Machine Consciousness, just released, “constitutes a significant milestone in the history of mind uploading research: the first-ever collection of scientific and philosophical papers on the theme of mind uploading,” as Ben Goertzel and Matthew Iklé note in the Introduction to this issue. “Mind uploading” is an informal term that refers to transferring the mental contents from a human brain into a different substrate, such as a digital, analog, or quantum computer. It’s also known as “whole brain emulation” and “substrate-independent minds.” Serious mind uploading researchers have emerged recently, taking this seemingly science-fictional notion seriously and pursuing it via experimental and theoretical research programs, Goertzel and Iklé note. (via ‘Mind uploading’ featured in academic journal special issue for first time | KurzweilAI)
We should be able to jump galaxies by now with an iPhone, right? No, we don’t have coverage there yet ;O)