
Sometimes it feels like we are kidding ourselves about innovation. On the one hand, just about everyone in technology brags about how innovative they are or how innovative they want to be. On the other, when I describe some really innovative scenarios, I often hear things like "That's scary!"

Here’s an example: When you walk into a retail store, a sales clerk walks up to you, greets you by name, and offers you a sale on items that you have bought from them in the past.

Not scary enough? What if the sales clerk then asks about your spouse by name and suggests something that might be a nice gift for them – in their size, in colors they like?

While these examples are fairly simple (and easily implemented today), they barely scratch the surface of smarter technology. What should really blow you away are not these simple things we can do today, but the truly innovative artificial intelligence and deep learning we'll be able to achieve in the very near future.

But before we get into those capabilities, let’s take a look back at how we got here:

Generations of Computing

Advances in modern computer technology were really born in the 1930s from the work of Alan Turing (subject of the 2014 movie The Imitation Game) and other computer scientists of that era. The critical invention of the time was the digital computing device, which has since become the center of the modern technology age.

First Generation Computing (1950s)

The first generation of digital computing devices relied on vacuum tubes to send electrical signals. In computing, those signals were either on (1) or off (0). The limitation of such machines was that they were very large, very expensive, and very slow. But, for the first time, calculations could be automated on a large scale with accuracy greater than most humans could perform.
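Those on/off signals are the bits that all digital computing is still built on. As a minimal sketch (not from the article), here is how a row of on/off states encodes a number in binary:

```python
# Each vacuum tube (or, later, transistor) holds one bit: on (1) or off (0).
# Reading a row of states as a binary number, most significant bit first.
states = [1, 0, 1, 1]  # on, off, on, on

value = 0
for bit in states:
    value = value * 2 + bit  # shift left and add the next bit

print(value)  # binary 1011 -> decimal 11
```

The same principle scales from four tubes to the billions of transistors in a modern chip.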

Second Generation Computing (1960s)

The invention of the transistor for computing was a huge step forward. The transistor itself contained electrical leads and a semiconducting material, and it replaced vacuum tubes at a fraction of the price and space. This critical step in the advancement of computers started the explosive growth of computing devices that continues to this day.

Third Generation Computing (1970s)

While the transistor is still the fundamental building block of computers, the integrated circuit packed transistors at far greater density than individual components and could be dedicated to discrete functions, such as mathematics or graphics processing. The integrated circuit brought forth more powerful machines and continued to dramatically reduce the cost of building computers.

Fourth Generation Computing (1980s)

The concept of Moore's Law really blossomed with the introduction of the microprocessor. The microprocessor is essentially the brains of an individual computer (the CPU, or central processing unit): a single integrated circuit that packs the machine's core logic into a smaller and smaller space. As Moore's Law predicted, the number of transistors on a single chip would double roughly every two years (often quoted as every 18 months) – and it largely has, from those early days of the microprocessor up to today.
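To see what that doubling rate implies, here is a back-of-the-envelope projection. The 18-month doubling period is the commonly quoted figure, and the Intel 4004's roughly 2,300 transistors in 1971 is an illustrative baseline I've chosen, not a number from the article:

```python
# Idealized Moore's Law projection: one doubling every 18 months.
initial_transistors = 2300   # Intel 4004, 1971 (illustrative baseline)
start_year, end_year = 1971, 2020
doubling_period = 1.5        # years per doubling (the "18 months" figure)

doublings = (end_year - start_year) / doubling_period
projected = initial_transistors * 2 ** doublings

print(f"{doublings:.1f} doublings -> ~{projected:.1e} transistors")
```

The idealized curve projects on the order of ten trillion transistors by 2020, comfortably ahead of real chips (tens of billions) – a hint that the pace has already slowed from the straight-line ideal.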

The implication of the microprocessor was that we could have a computer in every business, in every home, in every car, and just about everywhere else.

Stuck in Fourth Gear

But for all the amazing advances we have made in computing, we seem to be stuck in the Fourth Generation, which has been with us since the 1980s. Previously, we saw a major breakthrough about every decade, but it has now been more than 30 years. Yes, there have been advances in networking, storage, and other areas of technology. However, as we near the end of Moore's Law (the point where packing more transistors onto a single chip will no longer be physically possible), we need to find another way of structuring computing devices if we want to achieve the same level of explosive advancement.

Today, most major researchers and publications predict that the next great evolution of computing devices is going to be in Artificial Intelligence (or the Fifth Generation of computing). But what is that, and how will it impact corporations struggling with their current technology investments and innovations? Find out in my next blog: “Innovating with Artificial Intelligence – Part 2: Computing for Business.”