It’s fair to say the era of GenAI has sparked a data-driven revolution — companies today are investing in AI projects at an unprecedented pace, with venture capitalists pouring close to $40 billion into related startups in the first half of this year alone.

At the same time, this surge in new data has presented an even greater challenge: the scale of information being created threatens to overwhelm traditional data processing.

In SQream’s 2024 State of Big Data Analytics Report, the tech startup highlights a shift in how data projects are being handled to tackle the rising costs and diminishing returns associated with CPU-based systems.

In today’s market, the rising expense of analytics and AI is pushing companies into a corner, forcing decision makers to trade off query complexity, data volume, and project scope against one another.

Additionally, CPU-based processing often creates bottlenecks that hinder project success and stifle innovation — an approach that isn’t sustainable.

SQream’s report, based on a survey of 300 senior data management professionals at US companies spending at least $5 million annually on cloud and infrastructure, provides insight into a shift playing out in the strategies of industry leaders, including a move towards GPUs.

Findings from the report include:

The CPU Myth: While 78% of companies believe adding more CPUs will have the most significant impact on their 2024 data analytics and AI/ML goals, adding GPU instances ranks a close second.

The “Bill Shock” Epidemic: A significant portion of respondents report frequent surprises in their cloud analytics bills, with costs regularly exceeding expectations.

High Costs Hinder Progress: Higher costs associated with ML experimentation are the primary roadblock for 41% of companies when it comes to data analytics and AI projects.

Project Failure Breakdown: Insufficient budget was the leading culprit behind 2023 project failures, mirroring the report’s overall cost concerns. Poor data preparation and cleansing were also significant contributors.

GPU Acceleration for Existing Pipelines: GPU processing can significantly accelerate existing data analytics pipelines without requiring a complete overhaul, allowing businesses to unlock the power of AI and address the data deluge without disrupting current workflows.

Rightsizing the Cloud Challenge: Recognizing the cost burdens, a staggering 92% of companies are actively working to optimize their cloud spending on analytics. Nearly half admit to compromising on query complexity and project volume to manage these costs.

Read the full report from the company here.

Featured photo of SQream CEO Ami Gal