Knative Matures: The Serverless Layer for Kubernetes

Knative is now officially production-ready.

At least, the CNCF (Cloud Native Computing Foundation) regards it as such: it has promoted the project to graduated status, the highest maturity level in its nomenclature.

Google originated this serverless framework for Kubernetes and opened the project to the community in the summer of 2018. It entered the CNCF, at the incubation stage, in March 2022, a few months after the 1.0 release. At the time, VMware, IBM and Red Hat were the main contributors: VMware held the majority of seats on the Technical Oversight Committee, while IBM and Red Hat were also represented there and each held a seat on the upper body, the Steering Committee.

Red Hat and VMware as the leading champions

In 2024, the Technical Steering Committee was merged into the Steering Committee. Two Red Hat employees sit on it, alongside representatives from CoreWeave and Stacklok, as well as a former Pivotal-VMware-Broadcom executive who marketed Knative as part of the Tanzu platform.


The project is currently divided into seven working groups:

  • Functions (led by Red Hat and VMware employees)
  • Serving (led by the former Pivotal-VMware-Broadcom executive)
  • Eventing (Red Hat)
  • UX (OCAD University and the University of Toronto)
  • Operations (Bloomberg)
  • Productivity* (Cisco and Red Hat)
  • Security (IBM and VMware)

Bloomberg is particularly involved, as it is among the organizations using Knative, alongside, for example, Alibaba Cloud, Blue Origin, Box, Gojek, ManoMano, Scaleway and Tata Communications.

An AI positioning, including agentic AI

The middleware trio Serving, Eventing and Functions constitutes the project’s functional core.

Serving provides the means to deploy and manage stateless HTTP services in a serverless manner.
Eventing offers a set of APIs for implementing an event-driven architecture; it relies on the CloudEvents specification.
Functions builds on Serving and Eventing to help deploy functions packaged as OCI images.
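As an illustration, here is a minimal Knative Service manifest of the kind Serving manages (the service name and container image are hypothetical); Serving routes HTTP traffic to the deployed revision and scales it down to zero when idle:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello                              # hypothetical service name
spec:
  template:
    spec:
      containers:
        - image: ghcr.io/example/hello:latest  # hypothetical OCI image
          env:
            - name: TARGET
              value: "Knative"
```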

Serving and Eventing share two sub-projects, Client and Operator. The former provides a command-line tool (kn) for creating Knative resources without having to edit YAML files; the latter helps install the two building blocks on Kubernetes clusters.
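For illustration, a few invocations of the kind the Client's kn CLI enables (the service name and image are hypothetical, and a cluster with Knative Serving installed is assumed):

```shell
# Deploy a Knative Service without writing any YAML
# (hypothetical name and image; requires a cluster with Knative Serving)
kn service create hello --image ghcr.io/example/hello:latest --env TARGET=Knative

# Inspect and update it in place
kn service list
kn service update hello --env TARGET=Kubernetes

# Remove it
kn service delete hello
```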

Lately, the project’s public communication has leaned noticeably toward LLMs, particularly around KServe, a model server built on Knative Serving, but also in support of several use cases. These are driven in particular by Red Hat (inference with Llama Stack, agentic AI for handling customer conversations…) and by IBM (training models within the watsonx Assistant service).

* “Health” of the project: tests, infrastructure, CI/CD, etc.

Dawn Liphardt

I'm Dawn Liphardt, the founder and lead writer of this publication. With a background in philosophy and a deep interest in the social impact of technology, I started this platform to explore how innovation shapes — and sometimes disrupts — the world we live in. My work focuses on critical, human-centered storytelling at the frontier of artificial intelligence and emerging tech.