
As computers get smarter, faster and more compact, we often find ourselves wondering: how much further can we push this technology?
The singularity is a hypothetical point at which artificial intelligence surpasses human intelligence and brings about radical change in human nature. While the notion sounds closer to science fiction than science fact, recent breakthroughs in computer processing have produced machines that can mimic the human brain.
In a statement last week, chipmaker Qualcomm announced it was making headway on its “biologically-inspired” processor, which is modelled on real-life neurons.
“Instead of preprogramming behaviours and outcomes with a lot of code, we’ve developed a suite of software tools that enable devices to learn as they go and get feedback from their environment,” said Samir Kumar, director of business development at Qualcomm.
The tech giant recently set up operations in Cork, creating 100 digital IT security positions, and has expressed interest in establishing a research and development wing, which could lead to up to 150 more jobs.
Qualcomm has already built a robot that uses this ground-breaking technology. The machine learns by means of a reward system: if it performs a task correctly, it is sent a “good robot” message.
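Qualcomm has not published the details of its learning algorithm, but the reward mechanism it describes resembles classic reinforcement learning. The following is a minimal Python sketch of that general idea, assuming a toy robot that chooses among a few hypothetical actions and nudges its value estimates toward the “good robot” reward signal; it illustrates the technique, not Qualcomm’s actual implementation.

    import random

    # Toy reward-driven learning (illustrative only, not Qualcomm's algorithm):
    # the "robot" picks an action, receives a reward of 1 when the action
    # matches the desired behaviour (the "good robot" signal) and 0 otherwise,
    # then nudges its estimate of that action's value toward the reward.

    ACTIONS = ["turn_left", "turn_right", "stop"]   # hypothetical actions
    DESIRED = "stop"                                # behaviour we want to teach
    LEARNING_RATE = 0.1
    EPSILON = 0.2                                   # chance of trying a random action

    values = {a: 0.0 for a in ACTIONS}              # estimated value of each action

    for step in range(1000):
        # Explore occasionally; otherwise exploit the best-known action.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(values, key=values.get)

        reward = 1.0 if action == DESIRED else 0.0  # the "good robot" signal
        values[action] += LEARNING_RATE * (reward - values[action])

    print(max(values, key=values.get))              # prints "stop" after training

Over many trials the rewarded action accumulates the highest value estimate, so the robot comes to prefer it; the occasional random action keeps it sampling alternatives rather than locking in too early.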
The company’s “neuro-inspired” chips will find their way into robots, vision systems, brain implants and smartphones. They are designed to be massively parallel, reprogrammable, and capable of cognitive tasks such as classification and prediction.
The ultimate aim is for users to be able to train their devices. Using this technology in smartphones opens up the possibility of a customised user experience for each individual.
Enabling devices to see and perceive the world as humans do is a goal Qualcomm believes is realistically within reach. “A major pillar of Zeroth processor function is striving to replicate the efficiency with which our senses and our brain communicate information,” said Kumar.
Other companies, such as IBM and Google, are also investing millions in the field of cognitive computing. Last year Google unveiled a “neural network” that taught itself to identify cats after being exposed to YouTube videos.
Earlier this month IBM announced a collaborative research initiative with four leading universities. The study seeks to develop a system that can learn, reason and help human experts make complex decisions.
“I believe that cognitive systems technologies will make it possible to connect people and computers in new ways so that–collectively–they can act more intelligently than any person, group, or computer has ever done before,” said Thomas Malone, Director of the MIT Centre for Collective Intelligence, in a press release.
What will come of this research, and how will it affect the average user’s virtual experience? Only time will tell. In the meantime, we will have to make do with our not-so-brainy smartphones.