CNN: Your Behavior Will Be Controlled by a Brain Chip

“Smart phone will be implanted”

Paul Joseph Watson
Infowars.com
October 9, 2012

A new CNN article predicts that within 25 years people will have microchips embedded in their brains that will allow their behavior to be controlled by a third party.

The story, entitled “Smartphone of the future will be in your brain,” offers a semi-satirical look at transhumanism and the idea of humans becoming part cyborg by having communications devices implanted in their bodies.

Predicting first the widespread popularity of wearable smartphones, already in production by Google, the article goes on to forecast how humans will communicate by the end of the century.

“Technology takes a huge leap in 25 years. Microchip can be installed directly in the user’s brain. Apple, along with a handful of companies, makes these chips. Thoughts connect instantly when people dial to ‘call’ each other. But there’s one downside: ‘Advertisements’ can occasionally control the user’s behavior because of an impossible-to-resolve glitch. If a user encounters this glitch — a 1 in a billion probability — every piece of data that his brain delivers is uploaded to companies’ servers so that they may ‘serve customers better.’”

The tone of the CNN piece is somewhat sophomoric, but the notion that humans will eventually merge with machines as the realization of the technological singularity arrives is one shared by virtually all top futurists.

Indeed, people like inventor and futurist Ray Kurzweil don’t think we’ll have to wait 25 years to see smartphones implanted in the brain. He sees this coming to pass within just 20 years.

In his 1999 book The Age of Spiritual Machines, Kurzweil successfully predicted the arrival of the iPad, Kindle, iTunes, YouTube and on-demand services like Netflix.

By 2019, Kurzweil forecasts that wearable smartphones will be all the rage, and that by 2029, computers and cellphones will be implanted in people’s eyes and ears, creating a “human underclass” that is viewed as backwards and unproductive because it refuses to acquiesce to the singularity.

Although the CNN piece doesn’t even foresee implantable brain chips until the end of the century, Kurzweil’s predictions go far beyond this. According to him, by 2099 the entire planet will be run by artificially intelligent computer systems which are smarter than the entire human race combined – similar to the Skynet system fictionalized in the Terminator franchise.

Humans who have resisted altering themselves by becoming part-cyborg will be ostracized from society.

“Even among those human intelligences still using carbon-based neurons, there is ubiquitous use of neural implant technology, which provides enormous augmentation of human perceptual and cognitive abilities. Humans who do not utilize such implants are unable to meaningfully participate in dialogues with those who do,” writes Kurzweil.

Kurzweil’s forecasts are echoed by Sun Microsystems’ Bill Joy, who in a 2000 Wired Magazine article entitled “Why the Future Doesn’t Need Us” predicted that technological advancements in robotics would render most humans obsolete.

As a result, the elite “may simply decide to exterminate the mass of humanity,” wrote Joy.

*********************

Paul Joseph Watson is the editor and writer for Prison Planet.com. He is the author of Order Out Of Chaos. Watson is also a regular fill-in host for The Alex Jones Show and Infowars Nightly News.

Regulate Brain Implant Technology before Human Rights abuses become common practice!

The advances in some areas of the brain sciences, and the threats they could pose in the future, were already apparent to Ellen M. McGee and Gerald Q. Maguire in 1999. They discussed ear and eye implants, but also more advanced implants and environmental sensors capable of spying on people and controlling human behavior and the human mind. The ethical questions they raised then are just as important today.

Since 1999, implant technology has progressed very quickly. It is already possible to connect a human brain to a computer, creating today’s cyborgs.

Technical innovation, scientists claim, is neither good nor bad in itself; what matters is how it is used and the moral and ethical consequences that follow when it is used unethically. Today the technology and its applications remain completely or partially unregulated. There are still no laws that recognize or regulate how, and to what extent, human brain functions may be accessed or used, leaving a huge range of possibilities open to anyone who gets their hands on this technology – even when it is tested on people with more conventional implants, such as cochlear implants, eye implants or pacemakers.

Because brain chips are such a huge research area right now, and because so many kinds have already been developed, it is important to formulate strategies and guidelines today that can at least diminish some of the consequences of this technology and eliminate abuses. Implanting this technology in the human body without a person’s knowledge or consent must be prohibited.

Soon enough, the technology will be widespread enough to be used in ordinary medicine, for example as nanotechnology or as part of vaccinations against viruses of any kind. The people subjected to it must be informed, and humanity must know what the technology is capable of.

Paradoxically, brain implant technology receives little or no attention or ethical debate, even though its potential to affect and change human beings is enormous. The threat posed by implant technology is in fact greater than that of genetic changes or enhancements. Genetic changes are sharply limited by human biology; human-machine hybrids do not share those limitations. A computer connected to a human brain can share information at a distance, so the potential for computer chips implanted in the human brain to change humanity is far greater.

Ellen M. McGee and Gerald Q. Maguire (2007). Becoming Borg to Become Immortal: Regulating Brain Implant Technologies. Cambridge Quarterly of Healthcare Ethics, 16, pp. 291–302. doi:10.1017/S0963180107070326

Original: http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=1017164

IBM gets close to mimicking a human brain!

Man vs. machine

Computer chip from IBM gets close to mimicking a human brain

By Jordan Robertson Friday, August 19, 2011

Computers, like humans, can learn. But when Google tries to fill in your search box based only on a few keystrokes, or your iPhone predicts words as you type a text message, it’s only a narrow mimicry of what the human brain is capable of.

The challenge in training a machine to behave like a human brain is technological and physiological, testing the limits of computer and neuroscience. But IBM researchers say they’ve made a key step toward combining the two worlds.

 

The company announced it has built two prototype chips that it says process data more like how humans digest information than the chips that now power PCs and supercomputers.

The chips represent a milestone in a six-year project that has involved 100 researchers and $41 million in funding from the government’s Defense Advanced Research Projects Agency, or DARPA. IBM has also committed an undisclosed amount of money.

The prototypes offer further evidence of the growing importance of “parallel processing,” or computers doing multiple tasks simultaneously. That is important for rendering graphics and crunching large amounts of data.
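To make the idea concrete, here is a minimal, hypothetical Python sketch of parallel processing in that sense: one data-crunching job split across several worker processes so the pieces run simultaneously. The crunch function, the four-way split, and the data itself are invented for illustration and have nothing to do with IBM’s chips.

```python
# Toy sketch of parallel processing: the same crunching task split across
# four worker processes instead of being run one piece at a time.
from concurrent.futures import ProcessPoolExecutor

def crunch(chunk):
    """Stand-in for a data-crunching task on one slice of a large dataset."""
    return sum(v * v for v in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]          # four interleaved slices
    with ProcessPoolExecutor(max_workers=4) as pool:
        partials = list(pool.map(crunch, chunks))    # slices processed simultaneously
    print(sum(partials))                             # same total as crunching serially
```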

The uses of the IBM chips so far are prosaic, such as steering a simulated car through a maze, or playing Pong. It may be a decade or longer before the chips make their way out of the lab and into actual products.

 

But what’s important is not what the chips are doing, but how they’re doing it, said Giulio Tononi, a professor of psychiatry at the University of Wisconsin at Madison who worked with IBM on the project.

The chips’ ability to adapt to types of information that they weren’t specifically programmed to expect is a key feature.

“There’s a lot of work to do still, but the most important thing is usually the first step,” Tononi said in an interview. “And this is not one step; it’s a few steps.”

Technologists have long imagined computers that learn like humans. Your iPhone or Google’s servers can be programmed to predict certain behavior based on past events. But the techniques being explored by IBM and other companies and university research labs around “cognitive computing” could lead to chips that are better able to adapt to unexpected information.

IBM’s interest in the chips lies in their ability to potentially help process real-world signals, such as temperature or sound or motion, and make sense of them for computers.

 

IBM, based in Armonk, N.Y., is a leader in a movement to link physical infrastructure, such as power plants or traffic lights, and information technology, such as servers and software that help regulate their functions. Such projects can be made more efficient with tools to monitor the myriad analog signals present in those environments.

Dharmendra Modha, project leader for IBM Research, said the new chips have parts that behave like digital “neurons” and “synapses” that make them different from other chips. Each “core,” or processing engine, has computing, communication and memory functions.

“You have to throw out virtually everything we know about how these chips are designed,” he said. “The key, key, key difference really is the memory and the processor are very closely brought together. There’s a massive, massive amount of parallelism.”
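As a rough, toy illustration of that description (an invented sketch, not IBM’s actual design), the snippet below gives each “core” its own synaptic weights (the memory) sitting right next to its spiking update rule (the processing), and lets every core be stepped independently, which is where the parallelism comes from. The class name, threshold and leak values are assumptions made up for the example.

```python
# Toy model of the "neurons and synapses" idea: memory (weights) lives inside
# the same core that does the processing, and cores step independently.
import random

class NeuroCore:
    """One hypothetical core holding its synapse weights and neuron state together."""

    def __init__(self, n_inputs, threshold=1.0, leak=0.9):
        self.weights = [random.uniform(0.0, 0.5) for _ in range(n_inputs)]  # local 'synapses'
        self.potential = 0.0          # membrane potential of the core's 'neuron'
        self.threshold = threshold
        self.leak = leak

    def step(self, spikes):
        """Integrate incoming 0/1 spikes and fire if the threshold is crossed."""
        self.potential = self.leak * self.potential + sum(
            w for w, s in zip(self.weights, spikes) if s
        )
        if self.potential >= self.threshold:
            self.potential = 0.0      # reset after firing
            return 1                  # output spike
        return 0

# Step a small population of independent cores on the same input spike train.
cores = [NeuroCore(n_inputs=8) for _ in range(4)]
spikes_in = [random.randint(0, 1) for _ in range(8)]
print([core.step(spikes_in) for core in cores])
```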

The project is part of the same research that led to IBM’s announcement in 2009 that it had simulated a cat’s cerebral cortex, the thinking part of the brain, using a massive supercomputer. Using progressively bigger supercomputers, IBM previously had simulated 40 percent of a mouse’s brain in 2006, a rat’s full brain in 2007, and 1 percent of a human’s cerebral cortex in 2009.

A computer with the power of a human brain is not yet near. But Modha said the latest development is an important step.

“It really changes the perspective from ‘What if?’ to ‘What now?’” Modha said. “Today we proved it was possible. There have been many skeptics, and there will be more, but this completes in a certain sense our first round of innovation.”

– Associated Press

They Really Do Want To Implant Microchips Into Your Brain

Michael Snyder
American Dream
Aug 2, 2012

Are you ready to have a microchip implanted into your brain? That might not sound very appealing to you at this point, but this is exactly what the big pharmaceutical companies and the big technology companies have planned for our future.

 

They are pumping millions of dollars into researching “cutting edge” technologies that will enable implantable microchips to greatly “enhance” our health and our lives. Of course nobody is going to force you to have a microchip implanted into your brain when these chips are first introduced. Initially, brain implants will be marketed as “revolutionary breakthroughs” that can cure chronic diseases and that can enable the disabled to live normal lives. Once the “benefits” of such technology are demonstrated to the general public, most people will soon want to become “super-abled”.

Just imagine the hype that will surround these implants when people discover that you can get rid of your extra weight in a matter of days or that you can download an entire college course into your memory in just a matter of hours. The possibilities for this kind of technology are endless, and it is just a matter of time before having microchips implanted into your brain is considered to be quite common. What was once science fiction is rapidly becoming reality, and it is going to change the world forever.

But aren’t there some very serious potential downsides to having microchips implanted into our brains?

Of course there are.

Unfortunately, this technology is not as far off as you might think, and most people are not even talking about what the negative consequences might be.

According to a recent article in the Financial Times, the pharmaceutical company of the future will include a “bioelectronics” business that “treats disease through electrical signalling in the brain and elsewhere.”

Diseases such as diabetes and epilepsy and conditions such as obesity and depression will be treated “through electronic implants into the brain rather than pills or injections.”

These implants will send electrical signals to cells and organs that are “malfunctioning”. People will be totally “cured” without ever having to pop a pill or go under the knife.

It sounds too good to be true, right?

Well, the Financial Times says that British pharmaceutical giant GlaxoSmithKline is working very hard to develop these kinds of technologies. Moncef Slaoui, the head of research and development at GlaxoSmithKline, says that the “challenge is to integrate the work – in brain-computer interfaces, materials science, nanotechnology, micro-power generation – to provide therapeutic benefit.”

If a brain implant could cure a disease that you have been suffering from your whole life would you take it?

A lot of people are going to be faced with that kind of a decision in future years.

And this kind of technology is advancing very rapidly. In fact, some researchers have already had success treating certain diseases by implanting microchips into the brains of rats. The following is from a recent Mashable article….

Stroke and Parkinson’s Disease patients may benefit from a controversial experiment that implanted microchips into lab rats. Scientists say the tests produced effective results in brain damage research.

Rats showed motor function in formerly damaged gray matter after a neural microchip was implanted under the rat’s skull and electrodes were transferred to the rat’s brain. Without the microchip, rats with damaged brain tissue did not have motor function. Both strokes and Parkinson’s can cause permanent neurological damage to brain tissue, so this scientific research brings hope.

In addition, the U.S. government has been working on implantable microchips that would monitor the health of our soldiers and enhance their abilities in the field.

So this technology is definitely coming.

But it must be very complicated to get a microchip implanted into your brain, right?

Actually it is fairly simple.

According to an article in the Wall Street Journal, the typical procedure is very quick and it often only requires just an overnight stay in the hospital….

Neural implants, also called brain implants, are medical devices designed to be placed under the skull, on the surface of the brain. Often as small as an aspirin, implants use thin metal electrodes to “listen” to brain activity and in some cases to stimulate activity in the brain. Attuned to the activity between neurons, a neural implant can essentially “listen” to your brain activity and then “talk” directly to your brain.

If that prospect makes you queasy, you may be surprised to learn that the installation of a neural implant is relatively simple and fast. Under anesthesia, an incision is made in the scalp, a hole is drilled in the skull, and the device is placed on the surface of the brain. Diagnostic communication with the device can take place wirelessly. When it is not an outpatient procedure, patients typically require only an overnight stay at the hospital.

But is it really safe to have a device implanted into your head that can “talk” directly to your brain?

Many large corporations are banking on the fact that, in a world that is always hungry for new technology, most people will not be bothered by such things.

For example, Intel is working on sensors that will be implanted in the brain that will be able to directly control computers and cell phones. The following is an excerpt from a Computer World UK article….

By the year 2020, you won’t need a keyboard and mouse to control your computer, say Intel researchers. Instead, users will open documents and surf the web using nothing more than their brain waves.

Scientists at Intel’s research lab in Pittsburgh are working to find ways to read and harness human brain waves so they can be used to operate computers, television sets and cell phones. The brain waves would be harnessed with Intel-developed sensors implanted in people’s brains.

The scientists say the plan is not a scene from a sci-fi movie; Big Brother won’t be planting chips in your brain against your will. Researchers expect that consumers will want the freedom they will gain by using the implant.

Once again, this is not something that will be forced on you against your will.

These big corporations are banking on the fact that a lot of people will want to get these brain implants.

Even now, some video game makers are developing headsets that allow users to play games using their brain waves rather than a joystick or a control pad.

Other companies want to make it possible to directly connect your brain to the Internet.

As I have written about previously, IBM is aggressively working to develop this kind of technology. The following is from a recent IBM press release….

IBM scientists are among those researching how to link your brain to your devices, such as a computer or a smartphone. If you just need to think about calling someone, it happens. Or you can control the cursor on a computer screen just by thinking about where you want to move it.

Scientists in the field of bioinformatics have designed headsets with advanced sensors to read electrical brain activity that can recognize facial expressions, excitement and concentration levels, and thoughts of a person without them physically taking any actions.

The potential “benefits” of such technology are almost beyond imagination. An article on the website of the Science Channel put it this way….

If you could pump data directly into your gray matter at, say, 50 mbps — the top speed offered by one major U.S. internet service provider — you’d be able to read a 500-page book in just under two-tenths of a second.
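That figure roughly checks out as a back-of-envelope calculation, if one assumes (our assumption, not the article’s) around 2,000 characters of plain text per page stored at one byte each:

```python
# Back-of-envelope check of the quoted 50 Mbps / 500-page figure.
pages = 500
chars_per_page = 2_000            # assumed average plain-text page
bits_per_char = 8
link_speed_bps = 50_000_000       # 50 Mbps

book_bits = pages * chars_per_page * bits_per_char   # 8,000,000 bits
print(book_bits / link_speed_bps)                    # 0.16 s, just under two-tenths of a second
```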

How would the world change if you could download a lifetime of learning directly into your brain in a matter of weeks?

The possibilities are endless.

But so is the potential for abuse.

Implantable microchips that can “talk” directly to the brain would give a tyrannical government the ultimate form of control.

If you could download thoughts and feelings directly into the brains of your citizens, you could achieve total control and never have to worry that they would turn on you.

In fact, you could potentially program these chips to make your citizens feel good all the time. You could have these chips produce a “natural high” that never ends. That would make your citizens incredibly dependent on the chips and they would never want to give them up.

This kind of technology has the potential to be one of the greatest threats to liberty and freedom in the history of mankind.

At first these implantable microchips will be sold to us as one of the greatest “breakthroughs” ever, but in the end they could end up totally enslaving us.

So I will never be taking any kind of a brain implant, and I hope that you will not either.

 

Scientists to build ‘human brain’: Supercomputer will simulate the entire mind and will help fight against brain diseases


  • The ‘brain’ will take 12 years to build
  • It will feature thousands of three-dimensional images built around a semi-circular ‘cockpit’

PUBLISHED: 18:27 GMT, 15 April 2012 | UPDATED: 19:14 GMT, 15 April 2012 

The human brain’s power could rival any machine. And now scientists are trying to build one using the world’s most powerful computer.

It is intended to combine all the information so far uncovered about its mysterious workings – and replicate them on a screen, right down to the level of individual cells and molecules.

If it works it could be revolutionary for understanding devastating neurological diseases such as Alzheimer’s and Parkinson’s, and even shedding light into how we think, and make decisions.

 
Ambitious: Scientists are hoping to build a computer that will simulate the entire human brain

Leading the project is Professor Henry Markram, based in Switzerland, who will be working with scientists from across Europe, including the Wellcome Trust Sanger Institute at Cambridge.

They hope to complete it within 12 years. He said: ‘The complexity of the brain, with its billions of interconnected neurons, makes it hard for neuroscientists to truly understand how it works.

‘Simulating it will make it much easier – allowing them to manipulate and measure any aspect of the brain.’

Housed at a facility in Dusseldorf in Germany, the ‘brain’ will feature thousands of three-dimensional images built around a semi-circular ‘cockpit’ so scientists can virtually ‘fly’ around different areas and watch how they communicate with each other.

It aims to integrate all the neuroscience research being carried out all over the world – an estimated 60,000 scientific papers every year – into one platform.

The project has received some funding from the EU and has been shortlisted for a 1 billion euro (£825 million) EU grant which will be decided next month.

When complete it could be used to test new drugs, which could dramatically shorten the time required to license them compared with human trials, and pave the way for more intelligent robots and computers.

There are inevitably concerns about the consequences of this ‘manipulation’ and creating computers which can think for themselves. In Germany the media have dubbed the researchers ‘Team Frankenstein’.

 
The various areas of the human brain
Graphic: Corbis

But Prof Markram said: ‘This will, when successful, help two billion people annually who suffer from some type of brain impairment.

‘This is one of the three grand challenges for humanity. We need to understand earth, space and the brain. We need to understand what makes us human.’

Over the past 15 years his team have painstakingly studied and managed to produce a computer simulation of a cortical column – one of the small building blocks of a mammal’s brain.

They have also simulated part of a rat’s brain using a computer. But the human brain is a totally different proposition.

High energy consumption: The computer will require the output of a nuclear power station like Sellafield, pictured here

Read more: http://www.dailymail.co.uk/sciencetech/article-2130124/Scientists-build-human-brain-Supercomputer-simulate-mind-exactly-help-fight-brain-diseases.html#ixzz1yiRQqhoy

Mind-boggling! Science creates computer that can decode your thoughts and put them into words

  • Technology could offer lifeline for stroke victims and people hit by degenerative diseases
  • In the study, a computer analyzed brain activity and reproduced words that people were hearing 

By Tamara Cohen
05:49 GMT, 1 February 2012

It sounds like the stuff of science fiction dreams – or nightmares.

Scientists believe they have found a way to read our minds, using a computer program that can decode activity in our brains and put it into words.

They say it could offer a lifeline to those whose speech has been affected by stroke or degenerative disease, but many will be concerned about the implications of a technique that can eavesdrop on thoughts and reproduce them.


 

Scientific breakthrough: An X-ray CT scan of the head of one of the volunteers, showing electrodes distributed over the brain’s temporal lobe, where sounds are processed

 
 
 
 

Weird science: Scientists believe the technique, shown here, could also be used to read and report what they were thinking of saying next

Neuroscientists at the University of California Berkeley put electrodes inside the skulls of brain surgery patients to monitor information from their temporal lobe, which is involved in the processing of speech and images.

As the patient listened to someone speaking, a computer program analysed how the brain processed and reproduced the words they had heard.

 

 

The scientists believe the technique could also be used to read and report what patients were thinking of saying next.

In the journal PLoS Biology, they write that it takes attempts at mind reading to ‘a whole new level’.

 

Brain workings: Researchers tested 15 people who were already undergoing brain surgery to treat epilepsy or brain tumours

 

Words with scientists: The top graphic shows a spectrogram of six isolated words (deep, jazz, cause) and pseudo-words (fook, ors, nim). At bottom, the speech segments show how the words were reconstructed based on findings from the electrodes

Robert Knight, professor of psychology and neuroscience, added: ‘This is huge for patients who have damage to their speech mechanisms because of a stroke or Lou Gehrig’s [motor neurone] disease and can’t speak.

‘If you could eventually reconstruct imagined conversations from brain activity, thousands could benefit.’

 

The researchers tested 15 people who were already undergoing brain surgery to treat epilepsy or brain tumours.

They agreed to have up to 256 electrodes put on to the brain surface, as they listened to men and women saying individual words including nouns, verbs and names.

 

Testing: As a subject listened to someone speaking, a computer program analysed how the brain processed and reproduced the words they had heard

Breakthrough: The ability to scan the brain and read thoughts could offer a lifeline to those whose speech has been affected by a stroke or degenerative disease


A computer programme analysed the activity from the electrodes, and reproduced the word they had heard or something very similar to it at the first attempt.

 
 

Co-author Brian Pasley said there is already mounting evidence that ‘perception and imagery may be pretty similar in the brain’.

Therefore with more work, brain recordings could allow scientists to ‘synthesise the actual sound a person is thinking, or just write out the words with a type of interface device.’

Their study also shows in sharp relief how the auditory system breaks down sound into its individual frequencies – a range of around 1 to 8,000 Hertz for human speech.
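As a rough illustration of that decomposition (a minimal sketch with standard signal-processing tools, not the study’s own code), the snippet below turns a synthetic speech-band signal into a spectrogram and keeps only the roughly 1 to 8,000 Hz range mentioned above; the sample rate, tone frequencies and window length are arbitrary assumptions.

```python
# Minimal sketch: break a signal into its component frequencies over time
# (a spectrogram), restricted to the ~1-8,000 Hz band of human speech.
import numpy as np
from scipy import signal

fs = 16_000                               # sample rate comfortably covering 8 kHz content
t = np.arange(0, 1.0, 1 / fs)
# Stand-in for a spoken word: two tones plus a little noise.
x = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 2_000 * t)
x += 0.05 * np.random.randn(t.size)

freqs, times, power = signal.spectrogram(x, fs=fs, nperseg=512)
speech_band = (freqs >= 1) & (freqs <= 8_000)
print(power[speech_band].shape)           # (frequency bins in band, time frames)
```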

Pasley told ABC News: ‘This study mainly focused on lower-level acoustic characteristics of speech. But I think there’s a lot more happening in these brain areas than acoustic analysis’.

He added: ‘We sort of take it for granted, the ability to understand speech. But your brain is doing amazing computations to accomplish this feat.’

 
 

Analyzing words: This graphic breaks down the three ways the brain hears spoken words and processes sounds

This information does not change inside the brain but can be accurately mapped and the original sound decoded by a computer. British expert Professor Jan Schnupp, from Oxford University, who was not involved in the study, said it was ‘quite remarkable’.

‘Neuroscientists have of course long believed that the brain essentially works by translating aspects of the external world, such as spoken words, into patterns of electrical activity’, he said.

‘But proving that this is true by showing that it is possible to translate these activity patterns back into the original sound (or at least a fair approximation of it) is nevertheless a great step forward, and it paves the way to rapid progress toward biomedical applications.’

He played down fears it could lead to a range of ‘mind reading’ devices, as the technique can only, at the moment, be done on patients willing to have surgery.

Non-invasive brain scans are not powerful enough to read this level of information so it will remain limited to ‘small numbers of willing patients’.

He added: ‘Perhaps luckily for all those of us who value the privacy of their own thoughts, we can rest assured that our skulls will remain an impenetrable barrier for any would-be technological mind hacker for any foreseeable future.’


Read more: http://www.dailymail.co.uk/sciencetech/article-2094671/Mind-boggling-Science-creates-decode-thoughts-words.html#ixzz1wjAdr1ov

2050 – and immortality is within our grasp


 David Smith, technology correspondent

Britain’s leading thinker on the future offers an extraordinary vision of life in the next 45 years

Cross section of the human brain

Supercomputers could render the wetware of the human brain redundant. Photograph: Gregor Schuster/Getty Images

Aeroplanes will be too afraid to crash, yoghurts will wish you good morning before being eaten and human consciousness will be stored on supercomputers, promising immortality for all – though it will help to be rich.

These fantastic claims are not made by a science fiction writer or a crystal ball-gazing lunatic. They are the deadly earnest predictions of Ian Pearson, head of the futurology unit at BT.

‘If you draw the timelines, realistically by 2050 we would expect to be able to download your mind into a machine, so when you die it’s not a major career problem,’ Pearson told The Observer. ‘If you’re rich enough then by 2050 it’s feasible. If you’re poor you’ll probably have to wait until 2075 or 2080 when it’s routine. We are very serious about it. That’s how fast this technology is moving: 45 years is a hell of a long time in IT.’

Pearson, 44, has formed his mind-boggling vision of the future after graduating in applied mathematics and theoretical physics, spending four years working in missile design and the past 20 years working in optical networks, broadband network evolution and cybernetics in BT’s laboratories. He admits his prophecies are both ‘very exciting’ and ‘very scary’.

He believes that today’s youngsters may never have to die, and points to the rapid advances in computing power demonstrated last week, when Sony released the first details of its PlayStation 3. It is 35 times more powerful than previous games consoles. ‘The new PlayStation is 1 per cent as powerful as a human brain,’ he said. ‘It is into supercomputer status compared to 10 years ago. PlayStation 5 will probably be as powerful as the human brain.’

The world’s fastest computer, IBM’s BlueGene, can perform 70.72 trillion calculations per second (teraflops) and is accelerating all the time. But anyone who believes in the uniqueness of consciousness or the soul will find Pearson’s next suggestion hard to swallow. ‘We’re already looking at how you might structure a computer that could possibly become conscious. There are quite a lot of us now who believe it’s entirely feasible.

‘We don’t know how to do it yet but we’ve begun looking in the same directions, for example at the techniques we think that consciousness is based on: information comes in from the outside world but also from other parts of your brain and each part processes it on an internal sensing basis. Consciousness is just another sense, effectively, and that’s what we’re trying to design in a computer. Not everyone agrees, but it’s my conclusion that it is possible to make a conscious computer with superhuman levels of intelligence before 2020.’

He continued: ‘It would definitely have emotions – that’s one of the primary reasons for doing it. If I’m on an aeroplane I want the computer to be more terrified of crashing than I am so it does everything to stay in the air until it’s supposed to be on the ground.

‘You can also start automating an awful lot of jobs. Instead of phoning up a call centre and getting a machine that says, “Type 1 for this and 2 for that and 3 for the other,” if you had machine personalities you could have any number of call staff, so you can be dealt with without ever waiting in a queue at a call centre again.’

Pearson, from Whitehaven in Cumbria, collaborates on technology with some developers and keeps a watching brief on advances around the world. He concedes the need to debate the implications of progress. ‘You need a completely global debate. Whether we should be building machines as smart as people is a really big one. Whether we should be allowed to modify bacteria to assemble electronic circuitry and make themselves smart is already being researched.

‘We can already use DNA, for example, to make electronic circuits so it’s possible to think of a smart yoghurt some time after 2020 or 2025, where the yoghurt has got a whole stack of electronics in every single bacterium. You could have a conversation with your strawberry yogurt before you eat it.’

In the shorter term, Pearson identifies the next phase of progress as ‘ambient intelligence’: chips with everything. He explained: ‘For example, if you have a pollen count sensor in your car you take some antihistamine before you get out. Chips will come small enough that you can start impregnating them into the skin. We’re talking about video tattoos as very, very thin sheets of polymer that you just literally stick on to the skin and they stay there for several days. You could even build in cellphones and connect it to the network, use it as a video phone and download videos or receive emails.’

Philips, the electronics giant, is developing the world’s first rollable display which is just a millimetre thick and has a 12.5cm screen which can be wrapped around the arm. It expects to start production within two years.

The next age, he predicts, will be that of ‘simplicity’ in around 2013-2015. ‘This is where the IT has actually become mature enough that people will be able to drive it without having to go on a training course.

‘Forget this notion that you have to have one single chip in the computer which does everything. Why not just get a stack of little self-organising chips in a box and they’ll hook up and do it themselves. It won’t be able to get any viruses because most of the operating system will be stored in hardware which the hackers can’t write to. If your machine starts going wrong, you just push a button and it’s reset to the factory setting.’

Pearson’s third age is ‘virtual worlds’ in around 2020. ‘We will spend a lot of time in virtual space, using high quality, 3D, immersive, computer generated environments to socialise and do business in. When technology gives you a life-size 3D image and the links to your nervous system allow you to shake hands, it’s like being in the other person’s office. It’s impossible to believe that won’t be the normal way of communicating.

“Humanity is about going beyond biological limitations”

Leonardo da Vinci’s drawing of The Vitruvian man.

NEW YORK – Dreams of immortality inspired the fantastical tales of Greek historian Herodotus and Spanish explorer Juan Ponce de Leon’s legendary search for the fountain of youth. Nowadays, visionaries push for the technologies to transplant human brains into new bodies and download human consciousness into hologram-like avatars.

The latest science and schemes for achieving long life and the “singularity” moment of smarter-than-human intelligence came together at the Singularity Summit held here October 15-16. Some researchers explored cutting-edge, serious work about regenerating human body parts and defining the boundaries of consciousness in brain studies. Other speakers pushed visions of extending human existence in “Avatar”-style bodies — one initiative previously backed by action film star Steven Seagal — with fuzzier ideas about how to create such a world.

Above all, the summit buzzed with optimism about technology’s ability to reshape the world to exceed humanity’s wildest dreams, as well as a desire to share that vision with everyone. True believers were even offered the chance to apply for a credit card that transfers purchase rewards to the Singularity Institute.

“Humanity is about going beyond biological limitations,” said Ray Kurzweil, the inventor and futurist whose vision drives the Singularity Institute.

Rebuilding a healthy body

The most immediate advances related to living longer and better may come from regenerative medicine. Pioneering physicians have already regrown the tips of people’s fingers and replaced cancer-ridden parts of human bodies with healthy new cells.

“What we’re talking about here is not necessarily increasing the quantity of life but the quality of life,” said Stephen Badylak, deputy director of the McGowan Institute for Regenerative Medicine at the University of Pittsburgh in Pennsylvania.

Success so far has come from using a special connective tissue — called the extracellular matrix (ECM) — to act as a biological scaffold for healthy cells to build upon. Badylak showed a video where his team of surgeons stripped out the cancerous lining of a patient’s esophagus like pulling out a sock, and relined the esophagus with an ECM taken from pigs. The patient remains cancer-free several years after the experimental trial.

The connective tissue of other animals doesn’t provoke a negative response in human bodies, because it lacks the foreign animal cells that would typically provoke the immune system to attack. It has served the same role as a biological foundation for so long that it represents a “medical device that’s gone through hundreds of millions of years of R&D,” Badylak said.

If work goes well, Badylak envisions someday treating stroke patients by regenerating pieces of the functioning human brain.

Live long and prosper

The work of such researchers could do more than just keep humans happy and healthy. By tackling end-of-life chronic diseases such as cancer, medical advances could nearly double human life expectancy in the U.S. from almost 80 years to 150 years, said Sonia Arrison, a futurist at the Pacific Research Institute in San Francisco, Calif.

Long-lived humans could lead to problems such as anger over a “longevity gap” between haves and have-nots and perhaps add to stress on food, water and energy sources. But Arrison took a more positive view of how “health begets wealth” in a talk based on her new book, “100 Plus” (Basic Books, 2011).

Having healthier people around for longer means that they can remain productive far later in life, Arrison pointed out. Many past innovators accomplished some of their greatest or most creative work relatively late in life — Leonardo da Vinci began painting the Mona Lisa at 51, and Benjamin Franklin conducted his kite experiment at 46.

“Innovation is a late-peak field,” Arrison told the audience gathered at the Singularity Summit.

Even religion might find a renewed role in a world where death increasingly looks far off, Arrison said. Religion remains as popular as ever despite a doubling of human life expectancy up until now, and so Arrison suggested that religions focused on providing purpose or guidance in life could do well. But religions focused on the afterlife may want to rethink their strategy.

Making ‘Avatar’ real (or not)

The boldest scheme for immortality came from media mogul Dmitry Itskov, who introduced his “Project Immortality 2045: Russian Experience.” He claimed support from the Russian Federation’s Ministry of Education and Science, as well as actor Seagal, to create a research center capable of giving humans life-extending bodies.

Itskov’s wildly ambitious plans include creating a humanoid avatar body within five to seven years, transplanting a human brain into a new “body B” in 10 to 15 years, digitally uploading a human brain’s consciousness in 20 to 25 years, and moving human consciousness to hologram-like bodies in 30 to 35 years.

That vision may have exceeded even the optimism of many Singularity Summit attendees, given the apparent restlessness of the crowd during Itskov’s presentation. But it did little to dampen the conference’s overall sense that humanity has a positive future within its collective grasp — even if some people still need to be convinced.

“We are storming the fricking barricades of death, both physically and intellectually, so we have to make it sexy,” said Jason Silva, a filmmaker and founding producer/host for Current TV.

By Jeremy Hsu

10/17/2011 7:39:40 PM ET


The missing link between us and the future

In the early 1990s, the IT industry got very excited about virtual reality, the idea that you could use some sort of headset display to wander around in a 3d computer-generated world. We quickly realised there are zillions of variations on this idea, and after the one that became current computer gaming (3d worlds on a 2d monitor) the biggest of the rest was augmented reality, where data and images could be superimposed on the field of view.

Now, we are seeing apps on phones and pads that claim to be augmented reality, showing where the nearest tube station is, for example. To a point I guess they are, but only in as far as they can let you hold up a display in front of you and see images relevant to the location and direction. They hardly amount to a head-up display, and fall a long way short of the kind of superimposition we’ve been used to in sci-fi since Robocop or Terminator. It is clear that we really need a proper head-up display, one that doesn’t require you to take a gadget out and hold it up in front of you.

There are some head-up displays out there. Some make overlay displays in a small area of your field of view, often using small projectors and mirrors. Some use visors. However, the video-visor-based displays are opaque. They are fine for watching TV or playing games while seated, but not much use for wandering around.

This will change in the next 18 months – 2 years. Semi-transparent visors will begin to appear then. The few years after that will undoubtedly see rapid development of them, eventually bringing a full hi-res 3d overlay capability. And that will surely be a major disruptive technology. Just as we are getting used to various smart phones, pads, ebook readers and 3d TVs, they could all be absorbed into a general purpose head-up display that can be used for pretty much anything.

It is hard to overstate the potential of this kind of interface once it reaches good enough quality. It allows anything from TV, games, or the web, to be blended with any real world scene or activity. This will transform how we shop, work and socialise, how we design and use buildings, and even how we use art or display ourselves. Each of these examples could easily fill a book.  The whole of the world wide web was enabled by the convergence of just the computing and telecoms industries. The high quality video visor will enable convergence of the real world with the whole of the web, media, and virtual worlds, not just two industry sectors. Augmented reality will be a huge part of that, but even virtual reality and the zillions of variants can then start to be explored too.

In short, the semi-transparent video visor is the missing link. It is the biggest bottleneck now stopping the future arriving. Everything till we get that is a sideshow.

Artificial Hippocampus, the Borg Hive Mind, and Other Neurological Endeavors

November 15

Many of us know about the ‘Borg Hive Mind’ from TV programs where the characters are linked through brain-to-brain or computer-to-brain interactions. However, this is more than a science fiction fantasy. The idea was contemplated seriously in the 2002 National Science Foundation report, Converging Technologies for Improving Human Performance: Nanotechnology, Biotechnology, Information Technology and Cognitive Science. ‘Techlepathy’ is the word coined, referring to the communication of information directly from one mind to another (i.e. telepathy) with the assistance of technology.

Many research activities focus on neuro-engineering and the cognitive sciences. Many neuroscientists and bioengineers now work on:

  • cognitive computing
  • digitally mapping the human brain (see here and here); the mouse brain map has just been published
  • developing microcircuits that can repair brain damage, and
  • other numerous projects related to changing the cognitive abilities and functioning of humans, and artificial intelligence.

Journals exist for all of these activities — including the Human Brain Mapping journal. Some envision a Human Cognome Project. James Albus, a senior fellow and founder of the Intelligent Systems Division of the National Institute of Standards and Technology, believes the era of ‘engineering the mind’ is here. He has proposed a national program for developing a scientific theory of the mind.

Neuromorphic engineering, Wikipedia says, “is a new interdisciplinary discipline that takes inspiration from biology, physics, mathematics, computer science and engineering to design artificial neural systems, such as vision systems, head-eye systems, auditory processors, and autonomous robots, whose physical architecture and design principles are based on those of biological nervous systems.”

There are many examples.

Researchers from Harvard University have linked nanowire field-effect transistors to neurons. Three applications are envisioned: hybrid biological/electronic devices, interfaces to neural prosthetics, and the capture of high-resolution information about electrical signals in the brain. Research is advancing in four areas: neuronal networks, interfaces between the brain and external neural prosthetics, real-time cellular assays, and hybrid circuits that couple digital nanoelectronic and biological computing components.

Numenta, a company formed in 2005, states on its webpage that it “is developing a new type of computer memory system modelled after the human neocortex.”

Kwabena Boahen, an associate professor of bioengineering at Stanford University, has developed Neurogrid, “a specialized hardware platform that will enable the cortex’s inner workings to be simulated in detail — something outside the reach of even the fastest supercomputers.” He is also working on a silicon retina and a silicon chip that emulates the way the juvenile brain wires itself up.

Researchers at the University of Washington are working on an implantable electronic chip that may help to establish new nerve connections in the part of the brain that controls movement.

The Blue Brain project — a collaboration of IBM and the Ecole Polytechnique Federale de Lausanne, in Lausanne, Switzerland – will create a detailed model of the circuitry in the neocortex.

A DNA switch ‘nanoactuator’ has been developed by Dr. Keith Firman at the University of Portsmouth and other European researchers, which can interface living organisms with computers.

Kevin Warwick had an RFID transmitter (a future column will deal with RFID chips) implanted beneath his skin in 1998, which allowed him to control doors, lights, heaters, and other computer-controlled devices in his proximity. In another experiment, he and his wife Irena each had electrodes surgically implanted in their arms. The electrodes were linked by radio signals to a computer, which created a direct link between their nervous systems. Kevin’s wife could feel when he moved his arm.

In his book I, Cyborg, Kevin Warwick imagines that 50 years from now most human brains will be linked electronically through a global computer network.

St. Joseph’s Hospital in the United States has implanted neurostimulators (deep brain stimulators) using nanowires to connect a stimulating device to the brain. A pacemaker-like device is implanted in the chest, and flexible wires are implanted in the brain. Electrical impulses sent from the ‘pacemaker’ to the brain are used to treat Parkinson’s, migraine headaches, chronic pain, depression and obsessive-compulsive disorder, improve the mobility of stroke victims, and curb cravings in drug addicts.

In 2003/2004 a variety of publications (see links below) reported on the efforts of professor Theodore W. Berger, director of the Center for Neural Engineering at the University of Southern California, and his colleagues to develop the world’s first brain prosthesis – an ‘artificial hippocampus’ which is supposed to act as a memory bank. These publications highlighted in particular the use of such implants for Alzheimer’s patients.

The research program is proceeding in four stages: (1) tests on slices of rat brains kept alive in cerebrospinal fluid… reported as successful in 2004; (2) tests on live rats which are to take place within three years; (3) tests on live monkeys; and (4) tests on humans — very likely on Alzheimer’s patients first.

The Choice is Yours

If these advancements come to pass, they will create many ethical, legal, privacy and social issues. For the artificial hippocampus we should ask: would brain implants force some people to remember things they would rather forget? Could someone manipulate our memory? What would be the consequence of uploading information (see my education column)? Will we still have control over what we remember? Could we be forced to remember something over and over? If we can communicate with each other through a computer what will be the consequence of a Global Brain?

It is important that people become more involved in the governance of neuro-engineering and cognitive science projects. We should not neglect these areas because we perceive them to be science fiction. We also need to look beyond the outlined ‘medical applications.’ If the artificial hippocampus works, it will likely be used for more than dealing with diseases.

I will cover brain-machine interfaces, neuro-pharmaceutical-based ‘cognitive enhancement,’ and neuroethics and the ethics of artificial intelligence in future columns.

Gregor Wolbring is a biochemist, bioethicist, science and technology ethicist, disability/vari-ability studies scholar, and health policy and science and technology studies researcher at the University of Calgary. He is a member of the Center for Nanotechnology and Society at Arizona State University; Member CAC/ISO – Canadian Advisory Committees for the International Organization for Standardization section TC229 Nanotechnologies; Member of the editorial team for the Nanotechnology for Development portal of the Development Gateway Foundation; Chair of the Bioethics Taskforce of Disabled People’s International; and Member of the Executive of the Canadian Commission for UNESCO. He publishes the Bioethics, Culture and Disability website, moderates a weblog for the International Network for Social Research on Disability, and authors a weblog on NBICS and its social implications.

Resources