Since the earliest days of computers, we’ve known the incredible promise they held. Inventors like Charles Babbage and Alan Turing were visionaries who recognized that these tools – like the steam shovel and printing press before them – had the power not only to replicate the work of dozens of humans, but to fundamentally change history.
While technology has improved in leaps and bounds since the first computers, the fact is that the basic architecture on which those computers are built hasn’t really changed in more than 60 years. Today, we’re feeling the effects of that incremental progress. As we begin to outgrow those early systems and the sun sets on Moore’s law – which assured us that processor performance would double every two years or so – we look forward to a new, more complex era.
Our world today is connected – smart cars, smart homes, smart factories, smart bodies. The amount of data we are creating is staggering, and our expectations for what we will be able to do with that data are being limited by today’s tools.
This brings with it not only opportunity, but also a necessity to do something revolutionary, truly innovative and unconventional. Together, we can collaborate openly to find solutions and bring benefits that will once again chart the course of history.
We now find ourselves on the cusp of a near future that will bring with it a wealth of possibilities. Our current computational power is something Babbage could not have even dreamed of. The potential of what’s on the horizon of computing is something truly remarkable. However, this immense potential brings with it significant challenges – and it is our responsibility to do everything in our power to overcome them.
New emerging technologies could be the tools we need to flourish as a society. At HPE, we aim to harness the transformative power of these technologies to achieve our purpose – to advance the way people live and work. Only then can we start to unlock our technological potential. What if we could:
⦁ Identify a sustainable power source for our planet?
⦁ Offer financial inclusion for the underbanked?
The main roadblock on this journey to a better world lies in the limitations of our conventional computing solutions. There is a growing data deluge as our ambitions grow faster than our computers improve. Every two years we create more data than has ever been created before, with the majority of it originating at the edge, or the periphery of the network.
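The claim that each two-year period produces more data than all of history combined follows directly from the arithmetic of doubling: if output doubles every period, the newest period always exceeds the sum of every period before it. A minimal sketch of that reasoning, using purely illustrative numbers rather than real data volumes:

```python
# Illustrative only: if data creation doubles every two-year period,
# each new period's output exceeds everything created before it,
# because 2^n > 2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1.

new = 1.0    # data created in the first two-year period (arbitrary units)
prior = 0.0  # total data created in all earlier periods

for period in range(1, 6):
    print(f"period {period}: created {new:g}, all prior data {prior:g}")
    prior += new  # fold this period into the historical total
    new *= 2      # doubling every two-year period
```

Every line printed shows the new period outpacing the entire accumulated past, which is why a doubling cadence in data growth so quickly overwhelms systems that improve only linearly.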
But as our high-powered, processor-centric architecture meets the limits of physics, we see that the important questions of our age cannot be answered by the linear improvements of our current systems.
Combined with this exponential growth in data is the increasingly pressing need to act in order to begin solving these questions. And we’ve never had so little time to do so. Companies and governments are preparing for the future today, but tomorrow’s computing will need to be drastically different to meet the changes that we’re seeing, so an understanding of the next-generation computing ecosystem – and how best to invest in it – is vital.
We need a new paradigm that reinvents the most basic functions of a computing system from the ground up. Memory-Driven Computing is one approach that delivers an entirely new way of computing specifically designed for the Big Data era.
Only through a new architecture like Memory-Driven Computing will it be possible to simultaneously work with every digital health record of every person on earth, every trip of Google’s autonomous vehicles and every data set from space exploration – getting to answers and uncovering new opportunities at unprecedented speeds.
By eliminating the inefficiencies of traditional systems today, Memory-Driven Computing reduces the time needed to process complex problems from days to hours, hours to minutes, minutes to seconds, to deliver real-time intelligence.
The problem with quantum computing
Quantum computing is one technology in the global spotlight at the moment. Undeniably, quantum computers will be able to achieve some amazing things, like the discovery of new drugs and materials, so organizations are rightly exploring this technology. But as a society we can’t rely on quantum computers as the sole solution, because quantum computing only solves quantum problems.
For instance, quantum computers can’t analyze and derive insights from the massive amount of sensor data our society is producing – from factories and connected cars to airports and security infrastructure. And quantum systems require a tremendous amount of energy.
A need for customized solutions
The solutions are multifaceted, with ripple effects – like power consumption and data protection – extending beyond the four walls of a company or research lab, or even a country. I believe these emerging technologies should be evaluated in a way that’s integrated and comprehensive, focusing not just on the computer architecture, but also on the skills, science, research and wider societal implications.
As the complexity of the demands we place on computers increases, so too does our need for customized solutions, built for the problem at hand. A holistic view of future challenges combined with a new type of computing will allow for tailored solutions to some of the most significant problems facing our world. Imagine the impact these fine-tuned, powerful technologies could have on mapping the universe or the human brain. Consider the importance of speed-to-results in the case of human trafficking or cancer research.
The challenges we’re facing are not insurmountable, and for governments and businesses around the world, the question is how to best prepare to meet these challenges head-on and succeed.
The next generation of computers will be powerful assets in solving global problems and enhancing the way we live and work. But to be most potent, these tools should be viewed and applied through a holistic lens. Our digitally transforming world demands that we look ahead and defy convention. It is our duty to do so.