AI platform provider Clarifai is publicly previewing compute orchestration for AI workloads that the company says works with any AI model, on any compute, at any scale.

Clarifai said its vendor-agnostic platform can build and orchestrate AI workloads across any hardware provider, cloud provider, on-premises, or air-gapped environment, helping enterprises optimize AI performance and spending while eliminating vendor lock-in. Users can bring their own AI workloads, customize them with Clarifai’s full-stack AI platform, orchestrate them across any compute, or use Clarifai’s SaaS compute, all while centrally managing costs, governance, and performance through a unified control plane, the company said.

The Clarifai compute orchestration layer provides the following:

  • A control plane for governing access to AI resources, monitoring performance, and managing costs.
  • The ability to deploy a model for inference on any hardware vendor’s compute in the cloud, on-premises, in air-gapped environments, or in SaaS environments.
  • Integration with a full stack of AI tools.
  • Maintenance of enterprise security and flexibility, with the ability to deploy the compute cloud into a customer’s virtual private cloud (VPC) or on-premises Kubernetes cluster.
  • The ability to administer and allocate access to AI resources across projects and teams.
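The core idea behind such a control plane can be illustrated with a minimal, purely hypothetical sketch: a registry of heterogeneous compute targets plus a placement policy that routes each deployment to the cheapest eligible target. None of these class or method names come from Clarifai’s actual API; this is a conceptual illustration only.

```python
from dataclasses import dataclass

# Hypothetical compute target — "kind" mirrors the environments the
# article lists: cloud, on-prem, air-gapped, or SaaS.
@dataclass
class ComputeTarget:
    name: str
    kind: str           # e.g. "cloud", "on-prem", "air-gapped", "saas"
    cost_per_hour: float

class ControlPlane:
    """Toy unified control plane: registers compute targets and routes
    a model deployment to the cheapest target the policy allows."""

    def __init__(self):
        self.targets = []

    def register(self, target: ComputeTarget) -> None:
        self.targets.append(target)

    def deploy(self, model_name: str, allowed_kinds: set) -> str:
        # Governance step: filter targets by the kinds this team may use.
        eligible = [t for t in self.targets if t.kind in allowed_kinds]
        if not eligible:
            raise ValueError("no eligible compute target")
        # Cost-management step: pick the cheapest eligible target.
        choice = min(eligible, key=lambda t: t.cost_per_hour)
        return f"{model_name} -> {choice.name}"

if __name__ == "__main__":
    cp = ControlPlane()
    cp.register(ComputeTarget("aws-gpu", "cloud", 4.0))
    cp.register(ComputeTarget("dc-k8s", "on-prem", 1.5))
    # Placement falls out of the policy: restricting a team to cloud
    # compute changes where the same model lands.
    print(cp.deploy("my-model", {"cloud", "on-prem"}))
    print(cp.deploy("my-model", {"cloud"}))
```

A production system would of course layer authentication, monitoring, and scheduling onto this skeleton; the sketch only shows how a single placement policy can unify governance and cost decisions across environments.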

Users can sign up for Clarifai’s compute orchestration preview at the Clarifai website.