
OCI... The Next Standard for AI Infrastructure?

Explore why OCI is emerging as the winning standard for AI/ML artifacts and how standardization is bringing order to the fragmented MLOps landscape with Görkem Ercan, CTO at Jozu

Rohit Raveendran · Co-Founder & VP Engineering
November 17, 2025 · 5 min read

When we talk about AI systems today, it's easy to forget how messy things look behind the scenes. Models trained in one environment, tested in another, and deployed somewhere completely different. Data versions don't match, pipelines behave differently on staging and prod, and everyone's debugging the same issue in a different place.

In the latest episode of the AI x DevOps Podcast, Rohit from Facets sat down with Görkem Ercan, CTO of Jozu, to unpack this problem and how the next phase of AI engineering might take inspiration from one of DevOps' greatest breakthroughs: the Open Container Initiative (OCI).

The Parallels Between DevOps and AI

The discussion began with an observation: 10 years ago, DevOps teams faced almost the same kind of fragmentation AI teams are living through now. Every environment behaved differently, deployments broke easily, and reproducibility was mostly luck. Then containers arrived, creating a universal format for packaging and running applications.

Today, AI teams are in that pre-container phase. Training workflows, model evaluation, and deployment all live in silos. The lack of a common structure means the same problem gets solved five different ways inside one organization. As Görkem put it, "It's a fractured ecosystem held together by scripts."

Why Standardization Is Becoming Urgent

Machine learning pipelines are inherently unpredictable. Run the same code twice and you might not get the same result. Combine that with the number of tools involved (experiment tracking, serving frameworks, monitoring systems) and you get a workflow that is hard to govern, harder to reproduce, and nearly impossible to audit end to end.

Görkem explained that what's missing isn't more automation; it's standardization. Just as OCI gave DevOps a way to define how code moves through environments, AI needs a universal way to package and ship artifacts, whether they're models, datasets, or prompts. Only then can governance, provenance, and reproducibility become defaults instead of afterthoughts.
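
What could that look like in practice? Here's a minimal sketch of a model, its training data, and a prompt packaged together as one OCI artifact. The manifest layout follows the OCI image spec; the file names, the artifactType, and the vnd.example.* media types are placeholders for illustration, not an established standard.

```python
import hashlib
import json
from pathlib import Path

def descriptor(path: str, media_type: str) -> dict:
    """Build an OCI content descriptor: media type, sha256 digest, and size."""
    data = Path(path).read_bytes()
    return {
        "mediaType": media_type,
        "digest": "sha256:" + hashlib.sha256(data).hexdigest(),
        "size": len(data),
    }

# The structure follows the OCI image manifest spec; artifactType, layer
# media types, and file names are illustrative placeholders.
manifest = {
    "schemaVersion": 2,
    "mediaType": "application/vnd.oci.image.manifest.v1+json",
    "artifactType": "application/vnd.example.ai.bundle.v1",
    "config": {
        # The spec-defined "empty" config for artifacts that don't need one.
        "mediaType": "application/vnd.oci.empty.v1+json",
        "digest": "sha256:" + hashlib.sha256(b"{}").hexdigest(),
        "size": 2,
    },
    "layers": [
        descriptor("model.safetensors", "application/vnd.example.ai.model"),
        descriptor("training-data.parquet", "application/vnd.example.ai.dataset"),
        descriptor("system-prompt.txt", "application/vnd.example.ai.prompt"),
    ],
}

print(json.dumps(manifest, indent=2))
```

Pushed to any OCI-compliant registry, the whole bundle becomes addressable by a single immutable digest, so the model, its data, and its prompt always travel together.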

OCI: A Foundation That Already Works

OCI isn't new; it's the standard behind Docker and Kubernetes (technologies that reshaped how we build and run applications). What makes it interesting in the AI context is that it already solves most of the problems teams struggle with: version control, immutability, access management, and trust.

If AI artifacts could live inside OCI-compliant registries, every model could carry its full lineage — from the data that trained it to the configuration that deployed it. That's the kind of traceability AI teams are missing today.
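
To sketch how that lineage could ride along with the artifact itself: OCI manifests support free-form annotations. The org.opencontainers.image.* keys below are standard annotation keys from the spec; the vnd.example.ai.* keys and the registry references are hypothetical, purely to illustrate the shape of the metadata.

```python
import json

# Lineage metadata expressed as OCI manifest annotations (a sketch).
lineage = {
    "org.opencontainers.image.created": "2025-11-17T10:00:00Z",
    "org.opencontainers.image.source": "https://github.com/acme/churn-predictor",  # training code repo
    "org.opencontainers.image.revision": "9f2c1ab",                                 # commit that produced the model
    "vnd.example.ai.training-data": "registry.example.com/datasets/churn:2025-10",  # dataset it was trained on
    "vnd.example.ai.eval-report": "registry.example.com/evals/churn:2025-10",       # evaluation results
    "vnd.example.ai.deploy-config": "registry.example.com/configs/churn:prod",      # configuration that deployed it
}

# Attached to the model's manifest, these travel with the artifact wherever it is pulled.
print(json.dumps({"annotations": lineage}, indent=2))
```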

What This Means for AI Product Builders

For teams building AI products, the implications are big. Imagine models being versioned, signed, and governed just like application code. Imagine your data scientists and DevOps engineers working through the same pipeline without reinventing tools for every new project.
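
As a rough illustration of what "signed like application code" could mean: signing tools built for OCI, such as cosign, operate on registry references rather than on container images specifically, so a model stored as an OCI artifact could plausibly be signed and verified with the same workflow. The registry path and key files below are placeholders.

```python
import subprocess

# Hypothetical model reference in an OCI registry (placeholder values).
MODEL_REF = "registry.example.com/models/churn-predictor:1.4.0"

# Sign the artifact with a private key after pushing it to the registry.
subprocess.run(["cosign", "sign", "--key", "cosign.key", MODEL_REF], check=True)

# At deploy time, refuse to serve a model whose signature does not verify.
subprocess.run(["cosign", "verify", "--key", "cosign.pub", MODEL_REF], check=True)
```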

This is the kind of structural shift that can turn AI from a set of experimental workflows into a repeatable, scalable production system. It's what made DevOps work and it's what AI engineering needs next.

The Road Ahead

The AI industry doesn't need another framework; it needs a foundation. Containers did it for software. OCI could do it for AI.

It won't happen overnight, but the direction feels inevitable. Once models, datasets, and prompts can move through the same infrastructure with the same consistency as code, AI development will finally start to look less like research and more like engineering.

Because the next big breakthrough in AI might not come from a bigger model, but from a better system that knows how to handle one.


Want to hear the full conversation? Listen to the complete episode

Tags

#oci #ai #mlops #devops