Evangelizing Mainframe

The Ongoing Evolution of Cognitive Computing

You probably think mainframe computers are pretty smart—after all, they get through a lot of work accurately and speedily. You probably think your smartphone and tablet are none too shabby when it comes to their processing ability either. But the truth is, they’re old hat!

IBM has announced a new approach to building computers and a new way of programming them that it's calling cognitive computing. And it's a whole generation ahead of classic von Neumann architecture and programmable computing.

The SyNAPSE team at IBM has worked with Cornell University and iniLabs, Ltd., to produce what it calls corelet programming. Let's start with the name: SyNAPSE stands for Systems of Neuromorphic Adaptive Plastic Scalable Electronics. The corelet model is described as a high-level representation of a software program built from reusable building blocks of code. Each corelet handles a simple function but, much as in the human brain, when you start plugging lots of them together they can handle complex functions. So programmers can build large, complex programs with relatively little effort by reusing corelets that already exist.
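
To make the compositional idea concrete, here's a minimal, purely hypothetical sketch in Python. It is not IBM's actual corelet language; the Corelet and compose names are invented for illustration. It simply shows small building blocks, each doing one simple job, being wired together into something more capable:

class Corelet:
    """A tiny reusable building block that wraps one simple function."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def __call__(self, value):
        return self.fn(value)

def compose(name, *corelets):
    """Build a bigger corelet by feeding each corelet's output into the next."""
    def pipeline(value):
        for corelet in corelets:
            value = corelet(value)
        return value
    return Corelet(name, pipeline)

# Two simple corelets, each handling one simple function
threshold = Corelet("threshold", lambda spikes: [1 if s > 0.5 else 0 for s in spikes])
count = Corelet("count", lambda spikes: sum(spikes))

# Plugged together, they handle a more complex function
active_inputs = compose("active_inputs", threshold, count)
print(active_inputs([0.2, 0.7, 0.9, 0.1]))  # prints 2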

This new programming model is designed for a new generation of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing systems. The current generation of computers is based on a design from just after World War II and is built for sequential processing. What those machines can't do well is handle real-time processing in an environment with lots of data and lots of noise. Human brains, which these corelets mimic, have evolved to recognize, interpret and act on perceived patterns quickly and easily. And they can do all that in a relatively small space and with modest energy demands. The new generation of computing systems will be capable of perception, cognition and action.

IBM’s plan is to build systems from these chips that would bring the real-time capture and analysis of various types of data closer to the point of collection. It seems they would gather not only symbolic data, which is fixed text or digital information, but also sub-symbolic data, which is sensory-based and whose values change continuously. IBM goes on to suggest that this raw data reflects activity of every kind in the world, including commerce, social interaction, logistics, location, movement and environmental conditions.
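
To illustrate the distinction being drawn, here is a small, hypothetical Python sketch (the record fields and function names are invented, not IBM's): symbolic data arrives as fixed, discrete fields, while sub-symbolic data is a continuously varying stream of sensor readings sampled close to the source.

import random
import time

def read_symbolic_record():
    # Symbolic data: fixed text or digital fields, e.g. a logistics record
    return {"shipment_id": "SHP-1042", "status": "IN_TRANSIT", "dock": 7}

def stream_subsymbolic(samples=5):
    # Sub-symbolic data: continuously changing sensory values, e.g. a temperature probe
    for _ in range(samples):
        yield {"timestamp": time.time(), "celsius": 20.0 + random.uniform(-0.5, 0.5)}

print(read_symbolic_record())
for reading in stream_subsymbolic():
    print(reading)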

This project has cost IBM about $53 million so far, with $2 million in new funding recently pledged by the U.S. Defense Advanced Research Projects Agency (DARPA)—the people who had a hand in the development of the Internet.

This recent announcement about a new programming model follows a number of earlier announcements. In 2011, IBM showed off its “neurosynaptic cores.” In 2009, it announced it had simulated a cat’s cerebral cortex. In 2007, it simulated a rat’s brain. And in 2006, it had simulated 40 percent of a mouse’s brain.

In “Smart Machines: IBM’s Watson and the Era of Cognitive Computing,” John E. Kelly III, director of IBM Research, and Steve Hamm, a writer at IBM and former business and technology journalist, say:

“Tomorrow’s cognitive systems will be fundamentally different from the machines that preceded them. While traditional computers must be programmed by humans to perform specific tasks, cognitive systems will learn from their interactions with data and humans and be able to, in a sense, program themselves to perform new tasks. Traditional computers are designed to calculate rapidly; cognitive systems will be designed to draw inferences from data and pursue the objectives they were given. Traditional computers have only rudimentary sensing capabilities such as license-plate-reading systems on toll roads. Cognitive systems will be able to sense more like humans do. They’ll augment our hearing, sight, taste, smell, and touch. In the programmable-computing era, people have to adapt to the way computers work. In the cognitive era, computers will adapt to people. They’ll interact with us in ways that are natural to us.”

In the days of big data, the Internet of Things and smarter analytics, cognitive computing represents an order-of-magnitude shift in the computing power that will be available, and perhaps in the simplicity of programming it. It certainly has the potential to change much of what we do and the environment we live in.

However, there’s a thought lurking in a tiny corner of my mind: will we one day see a cognitive computer on the couch with its therapist?

Trevor Eddolls is CEO at iTech-Ed Ltd., an IT consultancy. For many years, he was the editorial director for Xephon’s Update publications and is now contributing editor to the Arcati Mainframe Yearbook. Eddolls has written three specialist IT books, and has had numerous technical articles published. He currently chairs the Virtual IMS and Virtual CICS user groups.

Posted: 8/27/2013 1:01:01 AM by Trevor Eddolls
