Red Hat accelerates AI/ML workflows and delivery of AI-powered intelligent applications with Red Hat OpenShift
Via Red Hat Newsroom
Mar 24, 2020
March 24, 2020 - RALEIGH, N.C., USA: Red Hat, Inc., the world's leading provider of open source solutions, today highlighted that more organizations are using Red Hat OpenShift as the foundation for building artificial intelligence (AI) and machine learning (ML) data science workflows and AI-powered intelligent applications. OpenShift helps provide agility, flexibility, portability and scalability across the hybrid cloud, from cloud infrastructure to edge computing deployments, a necessity for developing and deploying ML models and intelligent applications into production more quickly and without vendor lock-in.
As a production-proven enterprise container and Kubernetes platform, OpenShift delivers integrated DevOps capabilities for independent software vendors (ISVs) via Kubernetes Operators and NVIDIA GPU-powered infrastructure platforms. This combination can help organizations simplify the deployment and lifecycle management of AI/ML toolchains as well as support hybrid cloud infrastructure. With these enhancements, data scientists and software developers are empowered to better collaborate and innovate in the hybrid cloud rather than simply manage infrastructure resource requests.
Customer and ecosystem interest in AI/ML
The customer momentum seen by Red Hat validates the AI/ML findings from the recent 2020 Red Hat Global Customer Tech Outlook report. The report surveyed 876 Red Hat customers on their top IT priorities and found that 30% of respondents plan to use AI/ML over the next 12 months, ranking AI/ML as the top emerging technology workload consideration for companies surveyed in 2020.
For example, Kasikorn Business-Technology Group (KBTG) supports the day-to-day operations of KBank, one of Thailand’s largest commercial banks, and also provides technology developer and partner services for fintech firms across Thailand. To support the doubling of KBank’s user base, KBTG developed the K PLUS AI-Driven Experience (KADE) to help analyze customer behavior and deliver a more personalized experience. KBTG also launched UCenter, a unified notification feed system, built and deployed on Red Hat OpenShift. Other customers running AI/ML solutions on Red Hat OpenShift include Boston Children's Hospital and more.
Along with these customers using OpenShift to accelerate AI/ML workflows and deliver AI-powered intelligent applications, AI/ML ISV partners including CognitiveScale, Dotscience, NVIDIA and Seldon have recently developed OpenShift integrations via certified Kubernetes Operators. OpenShift is also powering IBM Cloud Paks to help customers accelerate their journey to the cloud and transform business operations in support of new workloads, including AI/ML/DL.
Additionally, to streamline the adoption of AI-enabled infrastructure in enterprise datacenters, Red Hat has collaborated with Hewlett Packard Enterprise (HPE) and NVIDIA on a new Accelerated AI Reference Architecture, which offers design and deployment guidelines to help mutual customers bring AI-based applications to production more quickly.
Driving open AI innovation
Red Hat continues to be an active contributor to the Kubeflow open source community project, which focuses on simplifying ML workflows for Kubernetes while enhancing workload portability and scalability. Kubeflow can now run on OpenShift, and a Kubeflow Kubernetes Operator is in development to help simplify the deployment and lifecycle management of Kubeflow on OpenShift.
Additionally, Red Hat leads the Open Data Hub community project to provide a blueprint for building an AI-as-a-Service platform with Red Hat OpenShift, Red Hat Ceph Storage and more. Open Data Hub v0.5.1 is now available and includes tools like JupyterHub 3.0.7, Apache Spark Operator 1.0.5 for managing Spark clusters on OpenShift and the Apache Superset data exploration and visualization tool.
To learn more about how Red Hat is helping organizations globally accelerate AI/ML workflows and deliver AI-powered intelligent applications, please visit www.openshift.com/ai-ml.
To experience Red Hat’s open source solutions that fuel many emerging workloads, including AI/ML/DL, join the online presentations at NVIDIA’s GTC Digital event. Red Hat’s experts will showcase how scalable software infrastructure from Red Hat can be deployed in a range of scenarios, from virtualized environments in corporate datacenters to massive-scale services on public clouds.
Supporting quotes
*Ashesh Badani, senior vice president, Cloud Platforms, Red Hat*
"AI/ML represents a top emerging workload for Red Hat OpenShift across hybrid cloud and multicloud deployments for both our customers and for our partners supporting these global organizations. By applying DevOps to AI/ML on the industry’s most comprehensive enterprise Kubernetes platform, IT organizations want to pair the agility and flexibility of industry best practices with the promise and power of intelligent workloads. We’re pleased to help support these initiatives through our extensive partner ecosystem’s use of Certified Kubernetes Operators."
*Matt Sanchez, chief technology officer, CognitiveScale*
"Combining the power of CognitiveScale Certifai platform and Red Hat OpenShift via Kubernetes Operators integration enables our mutual customers to accelerate AI/ML workflows across hybrid and multi cloud deployments. This can result in faster delivery of intelligent applications and more simplified IT operations."
*Luke Marsden, founder and chief executive officer, Dotscience*
"AI projects often fail due to differences between software DevOps and MLOps. As projects scale up, lack of control can result in chaos and pain. The Dotscience operator for OpenShift helps AI projects to deliver business value faster and reduces risk with specialized MLOps tooling across the build, deploy & monitor lifecycle."
*Thanussak Thanyasiri, senior delivery manager, Kasikorn Business-Technology Group (KBTG)*
"The UCenter project has proven that Red Hat can fully support our ambitions. Being able to scale without compromising availability and security is critical to our ambitious growth targets, and Red Hat gives us that capability. We have identified five additional applications and use cases, such as threat detection, where we are confident that Red Hat can also add value. We are delighted with the service provided by Red Hat, particularly its local technical expertise, so we look forward to deepening our relationship."
*Justin Boitano, general manager, enterprise and edge computing, NVIDIA*
"Deeper integration between Red Hat OpenShift and NVIDIA’s GPU Operator accelerates the deployment of AI solutions for our customers. The combination of these technologies makes it easier for IT organizations to access and scale GPU-powered, hybrid-cloud infrastructure."
*Alex Housley, founder and chief executive officer, Seldon*
"Seldon’s integrations with Red Hat OpenShift via Kubernetes Operators help organizations speed up deployment of machine learning models across the hybrid cloud, providing faster roll-outs of AI-powered digital services."
Additional resources
- AI/ML on OpenShift explainer video, webpage
- Red Hat blog on hardware acceleration on OpenShift with NVIDIA GPU operator
- Red Hat blog on how to deploy Kubeflow on OpenShift
- Open Data Hub 0.5.1 release blog