Cloudera has announced the release of Cloudera Data Services, designed to bring Private AI on-premises. This development aims to provide enterprises with secure, GPU-accelerated generative AI capabilities directly within their firewalls. The company highlights that this move allows organizations to build and scale their own sovereign data cloud in their data centers, addressing common security concerns.

According to Cloudera, it is the only vendor offering the full data lifecycle with consistent cloud-native services across both on-premises and public cloud environments. A significant barrier to AI adoption for many enterprises has been concern about data security and intellectual property. An Accenture report indicates that 77% of organizations lack the foundational data and AI security practices needed to protect critical models, data pipelines, and cloud infrastructure. Cloudera’s latest offering seeks to directly mitigate these risks, aiming to accelerate the journey from AI prototype to production.

This release extends the benefits of Cloudera Data Services to an organization’s data center, which the company states can lead to reduced infrastructure costs and streamlined data lifecycles, ultimately boosting data team productivity. It also promises accelerated workload deployment, enhanced security through automation, and faster time-to-value for AI deployments. Users are expected to gain cloud-native agility behind their firewall, enabling efficient scaling without compromising security.

As part of this release, Cloudera AI Inference Service and AI Studios are now available for on-premises deployment. Previously, these tools were exclusive to cloud environments. Their on-premises availability is intended to empower organizations to accelerate AI adoption and securely develop and run GenAI applications within their own data centers, keeping sensitive intellectual property protected.

Cloudera AI Inference Service, accelerated by NVIDIA and now available on premises, is described as an industry-first AI inference service with embedded NVIDIA NIM microservice capabilities. It is designed to streamline the deployment and management of large-scale AI models within the data center, where data securely resides. Cloudera AI Studios on premises aims to democratize the entire AI application lifecycle by offering low-code templates for building and deploying GenAI applications and agents.

An independent “Total Economic Impact (TEI)” study by Forrester Consulting, commissioned by Cloudera, found that a composite organization using Cloudera Data Services on premises experienced an 80% faster time-to-value for workload deployment, a 20% increase in productivity for data practitioners and platform teams, and overall savings of 35% from the modern cloud-native architecture. The study also noted operational efficiency gains, with some organizations improving hardware utilization and reporting reduced capacity needs after modernization.

Sanjeev Mohan, an industry analyst, commented, “Historically, enterprises have been forced to cobble together complex, fragile DIY solutions to run their AI on-premises. Today the urgency to adopt AI is undeniable, but so are the concerns around data security.”

Leo Brunnick, Cloudera’s Chief Product Officer, stated, “Cloudera Data Services On-Premises delivers a true cloud-native experience on-premises, providing agility and efficiency without sacrificing security or control.”

Toto Prasetio, Chief Information Officer of BNI, an early adopter, said, “This technology provides the essential infrastructure to securely and efficiently expand our generative AI initiatives, all while adhering to Indonesia’s dynamic regulatory environment. It marks a significant advancement in our mission to offer smarter, quicker, and more dependable digital banking solutions to the people of Indonesia.”
