As IT organizations adjusted to meet the new demands created by the pandemic, the pace of digital transformation actually increased, rather than halting or slowing as some expected. We are hearing from customers that they have moved beyond survival mode and are looking at how they can invest more intelligently.
Customers are trying to harness their data’s value across the spectrum of analytics and AI. Organizations are also focused on managing a massive increase in their data, especially from connected devices and non-traditional sources outside the data center.
As 5G begins to make its impact felt, the volume of data is going to grow exponentially as tens of thousands of connected devices come online in IoT (internet of things) deployments, particularly in industrial uses.
It’s all in real-time
The global ‘datasphere’ is projected to nearly quadruple in the next five years. What will set successful organizations apart will be their ability to drive insights and value from data in real time. Business intelligence and analytics will happen in real time, enabling critical decisions in milliseconds.
Data is no longer processed only within a traditional data center. It is created at multiple access points, and the latest innovations now allow customers to do more with their data, processing and analyzing it at the edge and in the cloud.
We start by exploring what ‘data-centered’ organizations need to get faster, real-time insights, whether in the cloud, at the edge, or in traditional databases. One proven technique uses High Performance Data Analytics (HPDA) to achieve real-time results.
Built around it
When we say ‘data-centered’, we are talking about people who use data to dramatically accelerate their organizations, improve their industries and strive to solve humanity’s greatest challenges, using technology for the greater good.
Central to managing this data windfall is HPDA, the convergence of Big Data and High Performance Computing (HPC). Traditional Big Data analytics engines are excellent at storing and managing large amounts of unstructured data that can be mined for trends and insights.
However, the speed and latency requirements of real-time analytics create significant challenges for traditional Big Data systems.
Finding the fit
The challenge is that for customers running these workloads, traditional scale-out computing is not cost-effective, nor does it provide the performance needed to analyze data in real-time. Customers with common data center platforms typically must make trade-offs between storage, memory and compute performance – and then decide which combination best suits their specific workload needs.
Running these compute-intensive workloads often requires not only the performance of HPC, but also the storage and memory capacity suited to Big Data.
For even greater performance, GPUs (graphics processing units) can be leveraged for ultra-fast data processing by loading entire datasets into GPU memory and applying GPU acceleration. An added benefit is that GPUs make organizations AI-ready: these built-in AI capabilities give customers the ability to move easily beyond HPDA to complex Artificial Intelligence models.
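As a minimal illustrative sketch of the in-memory, vectorized style of analytics described above: the example below uses NumPy on synthetic sensor data (all names and values here are hypothetical, not Lenovo’s stack). Because the CuPy library mirrors the NumPy API, the same code can run against GPU memory by swapping the import for `import cupy as np` on a GPU-equipped system.

```python
import numpy as np  # with CuPy installed, "import cupy as np" runs the same code in GPU memory

# Simulated sensor readings from a hypothetical IoT deployment
rng = np.random.default_rng(42)
readings = rng.normal(loc=50.0, scale=5.0, size=1_000_000)

# Load the dataset into memory once, then run vectorized analytics over all of it
mean = readings.mean()
p99 = np.percentile(readings, 99)           # 99th-percentile threshold
anomalies = readings[readings > p99]        # flag the top 1% as outliers

print(f"mean={mean:.2f}, p99={p99:.2f}, anomalies={anomalies.size}")
```

The key design point is that the whole dataset stays resident in (GPU) memory, so each analytic pass is a single vectorized operation rather than a scan through storage.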
We have all learned this year that the future is unpredictable. We can prepare, but we must innovate – together – to continue adapting to changing workloads and customer demands. Together, we help customers discover the potential and benefits of AI, develop AI applications using AI-optimized hardware and their choice of AI frameworks and deploy rapidly using simplified end-to-end solutions.
– Christopher Cooper, General Manager, Middle East and Africa, Lenovo Data Center Group.