
This blog post focuses on new features and improvements. For a complete list, including bug fixes, please see the release notes.

Building Production-Ready Agentic AI at Scale

Agentic AI systems are moving from research prototypes to production workloads. These systems don’t just generate responses. They reason over multi-step tasks, call external tools, interact with APIs, and execute long-running workflows autonomously.

But production agentic AI requires more than powerful models. It requires infrastructure that can deploy agents reliably, manage the tools they depend on, handle state across complex workflows, and scale across cloud, on-prem, or hybrid environments without vendor lock-in.

Clarifai’s Compute Orchestration was built for this. It provides the infrastructure layer to deploy any model on any compute, at any scale, with built-in autoscaling, multi-environment support, and centralized control. This release extends those capabilities specifically for agentic workloads, making it easier to build, deploy, and manage production agentic AI systems.

With Clarifai 12.1, you can now deploy public MCP (Model Context Protocol) servers directly on the platform, giving agentic models access to browsing capabilities, real-time data, and developer tools without managing server infrastructure. Combined with support for custom MCP servers and agentic model uploads, Clarifai provides a complete orchestration layer for agentic AI: from development to production deployment.

This release also introduces Artifacts, a versioned storage system for files produced by pipelines, and Pipeline UI improvements that streamline monitoring and control of long-running workflows.

Let’s walk through what’s new and how to get started.

Deploying Public MCP Servers for Agentic AI

Agentic AI systems break when models can’t access the tools they need. A reasoning model might know how to browse the web, execute code, or query a database, but without the infrastructure to actually call those tools, it’s limited to generating text.

Model Context Protocol (MCP) servers solve this. They’re specialized web services that expose tools, data sources, and APIs to LLMs in a standardized way. An MCP server acts as the bridge between a model’s reasoning capabilities and real-world actions, like fetching live weather data, navigating web pages, or interacting with external systems.
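Under the hood, MCP communication is JSON-RPC 2.0, with standard methods such as tools/list and tools/call. A minimal sketch of the request a model runtime sends to invoke a tool (the weather tool name and its arguments are hypothetical):

```python
import json

def mcp_tool_call(call_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request for the MCP `tools/call` method."""
    request = {
        "jsonrpc": "2.0",
        "id": call_id,
        "method": "tools/call",  # standard MCP method for invoking a tool
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

# Hypothetical weather lookup, as exposed by a weather MCP server
payload = mcp_tool_call(1, "get_weather", {"location": "San Francisco"})
print(payload)
```

The server replies with a JSON-RPC response whose result carries the tool output, which the model then folds back into its reasoning.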

Clarifai already supports custom MCP servers, allowing teams to build their own tool servers and run them on the platform using Compute Orchestration. This gives full control over which tools agents can access, but it requires writing and maintaining custom server code.

With 12.1, we’re making it easier to get started by adding support for public MCP servers. These are open-source, community-maintained MCP servers that you can deploy on Clarifai with a simple configuration, without writing or hosting the server yourself.

How Public MCP Servers Work

Public MCP servers are deployed as models on the Clarifai platform. Once deployed, they run as managed API endpoints on Compute Orchestration infrastructure, handling tool execution and returning results to agentic models during inference.

Here’s what the workflow looks like:

  1. Deploy a public MCP server as a model on Clarifai using the CLI or SDK
  2. Connect it to an agentic model that supports tool calling and MCP integration
  3. The model discovers available tools from the MCP server during inference
  4. The model calls tools as needed, and the MCP server executes them and returns results
  5. The model uses those results to continue reasoning or complete the task
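The loop in steps 3 through 5 can be sketched as follows. Everything here is stubbed for illustration: the tool registry stands in for an MCP server, and the scripted model stands in for an agentic LLM.

```python
# Minimal agentic tool-calling loop: the "model" either requests a tool
# call or returns a final answer; the "MCP server" executes tools.

TOOLS = {  # stand-in for tools discovered from an MCP server
    "get_weather": lambda location: f"Sunny in {location}",
}

def scripted_model(history):
    """Stand-in for an agentic LLM: request a tool once, then answer."""
    if not any(turn["role"] == "tool" for turn in history):
        return {"tool": "get_weather", "args": {"location": "Paris"}}
    return {"answer": "It is sunny in Paris."}

def run_agent(task: str) -> str:
    history = [{"role": "user", "content": task}]
    while True:
        action = scripted_model(history)
        if "answer" in action:                            # step 5: task complete
            return action["answer"]
        result = TOOLS[action["tool"]](**action["args"])  # step 4: execute tool
        history.append({"role": "tool", "content": result})

print(run_agent("What's the weather in Paris?"))
# → It is sunny in Paris.
```

In production, the model and tool execution run on Compute Orchestration rather than in-process, but the control flow is the same.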

The entire flow is managed by Compute Orchestration. The MCP server runs as a containerized deployment, scales based on demand, and can be deployed across any compute environment (cloud, on-prem, or hybrid) just like any other model on the platform.

Available Public MCP Servers

We’ve published several open-source MCP servers on the Clarifai Community that you can deploy today:

Browser MCP Server
Gives agentic models the ability to navigate web pages, extract content, take screenshots, and interact with web forms. Useful for research tasks, data gathering, or any workflow that requires real-time web interaction.

Weather MCP Server
Provides real-time weather data lookup by location. A simple example of how MCP servers can connect models to external APIs without requiring the model to handle authentication or API-specific logic.

These servers are already deployed and running on the platform. You can use them directly with any agentic model, or reference them as examples when deploying your own public MCP servers.

Deploying Your Own Public MCP Server

If you want to deploy an open-source MCP server from the community, the process is straightforward. You provide a configuration pointing to the MCP server repository, and Clarifai handles containerization, deployment, and scaling.

Here’s an example of deploying the Browser MCP server using the same workflow as uploading a custom model. The full example is available in the Clarifai runners-examples repository.

The configuration follows the same structure as any other model upload on Clarifai. You define the server’s runtime, dependencies, and compute requirements, then upload it using the CLI:

clarifai model upload

Once deployed, the MCP server becomes a callable API endpoint.
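For illustration only, the upload configuration is a config.yaml along these lines. The field names below follow Clarifai’s standard model-upload config, but the exact schema for MCP servers may differ; treat every value here as an assumption and consult the runners-examples repository for the real file:

```yaml
# config.yaml — hedged sketch of an MCP server packaged as a Clarifai model
model:
  id: browser-mcp-server        # hypothetical model ID
  user_id: your-user-id
  app_id: your-app-id
  model_type_id: mcp            # assumed type identifier for MCP servers

build_info:
  python_version: "3.12"

inference_compute_info:
  cpu_limit: "1"
  cpu_memory: 2Gi
  num_accelerators: 0           # browser/weather tools need no GPU
```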

Using MCP Servers with Agentic Models

Several models on the Clarifai platform natively support agentic capabilities and can integrate with MCP servers during inference. These models are built with tool calling and iterative reasoning, allowing them to discover, call, and process results from MCP servers without additional configuration.

Models with agentic MCP support include:

When you call one of these models through the Clarifai API, you can specify which MCP servers it should have access to. The model handles tool discovery and execution during inference, iterating until the task is complete.

You can also upload your own agentic models with MCP support using the AgenticModelClass. This extends the standard model upload workflow with built-in support for tool discovery and execution. A complete example is available in the agentic-gpt-oss-20b repository, showing how to upload an agentic reasoning model that integrates with MCP servers.

Why This Matters for Production Agentic AI

Deploying MCP servers on Compute Orchestration means you get the same infrastructure benefits as any other workload on the platform:

  • Deploy anywhere: MCP servers can run on Clarifai’s shared compute, dedicated instances, or your own infrastructure (VPC, on-prem, air-gapped)
  • Autoscaling: Servers scale up or down based on demand, with support for scale-to-zero when idle
  • Centralized control: Monitor performance, manage costs, and control access through the Clarifai Control Center
  • No vendor lock-in: Run the same MCP servers across different environments without reconfiguration

This is production-grade orchestration for agentic AI. MCP servers aren’t just running locally or on a single cloud provider. They’re deployed as managed services with the same reliability, scaling, and control you’d expect from any enterprise AI infrastructure.

For a step-by-step guide on deploying public MCP servers, connecting them to agentic models, and building your own tool-enabled workflows, check out the Clarifai MCP documentation and the examples in the runners-examples repository.

Artifacts: Versioned Storage for Pipeline Outputs

Clarifai Pipelines, introduced in 12.0, let you define and execute long-running, multi-step AI workflows directly on the platform. These workflows handle tasks like model training, batch processing, evaluations, and data preprocessing as containerized steps that run asynchronously on Clarifai’s infrastructure.

Pipelines are currently in Public Preview as we continue iterating based on user feedback.

Pipelines produce data: model checkpoints, training logs, evaluation metrics, preprocessed datasets, configuration files. These outputs are valuable, but until now there was no standardized way to store, version, and retrieve them within the platform.

With 12.1, we’re introducing Artifacts, a versioned storage system designed specifically for files produced by pipelines or user workloads.

What Are Artifacts

An Artifact is a container for any binary or structured file. Each Artifact can have multiple ArtifactVersions, capturing distinct snapshots over time. Every version is immutable and references the actual file stored in object storage, while metadata like timestamps, descriptions, and visibility settings is tracked in the control plane.

This separation keeps lookups fast and storage costs low.
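One way to picture that separation, as an illustrative sketch rather than the SDK’s actual classes: version metadata lives in the control plane, while the bytes stay in object storage and are referenced by key.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)          # immutable: a version never changes once created
class ArtifactVersion:
    version_id: str
    storage_key: str             # pointer to the file in object storage
    created_at: str
    description: str = ""

@dataclass
class Artifact:
    artifact_id: str
    versions: list = field(default_factory=list)

    def add_version(self, version: ArtifactVersion) -> None:
        self.versions.append(version)   # new snapshot; old ones are untouched

    def latest(self) -> ArtifactVersion:
        return self.versions[-1]

ckpt = Artifact("model-checkpoints")
ckpt.add_version(ArtifactVersion("v1", "s3://bucket/ckpt-epoch1.pt", "2026-01-01"))
ckpt.add_version(ArtifactVersion("v2", "s3://bucket/ckpt-epoch2.pt", "2026-01-02"))
print(ckpt.latest().storage_key)   # → s3://bucket/ckpt-epoch2.pt
```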

Why Artifacts Matter

Reproducibility: Save the exact files (weights, checkpoints, configs, logs) that produced results, making experiments reproducible and auditable.

Resume and checkpointing: Pipelines can resume from saved checkpoints instead of recomputing, saving time and cost on long-running jobs.
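A checkpoint-resume step can be sketched like this; the checkpoint is a local file here, standing in for an artifact download, and all names are hypothetical:

```python
import json
import os
import tempfile

def run_training(checkpoint_path: str, total_epochs: int = 5) -> int:
    """Resume from a saved checkpoint if one exists; otherwise start fresh."""
    start_epoch = 0
    if os.path.exists(checkpoint_path):           # checkpoint found: resume
        with open(checkpoint_path) as f:
            start_epoch = json.load(f)["epoch"] + 1
    for epoch in range(start_epoch, total_epochs):
        # ... train one epoch, then persist progress as a checkpoint
        with open(checkpoint_path, "w") as f:
            json.dump({"epoch": epoch}, f)
    return start_epoch

ckpt_file = os.path.join(tempfile.mkdtemp(), "ckpt.json")
run_training(ckpt_file, total_epochs=3)          # first run: epochs 0-2
resumed_at = run_training(ckpt_file, total_epochs=5)
print(resumed_at)                                # → 3 (skips completed epochs)
```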

Version control: Track how model checkpoints evolve over time or compare outputs across different pipeline runs.

Using Artifacts with the CLI

The Clarifai CLI provides a simple interface for managing artifacts, modeled after familiar commands like cp for upload and download.

Upload a file as an artifact:

Upload with a description and visibility:

Download the latest version:

Download a specific version:

List all artifacts in an app:

List versions of a specific artifact:

The CLI handles multipart uploads for large files automatically, ensuring efficient transfers even for multi-gigabyte checkpoints.

Using Artifacts with the Python SDK

The SDK provides programmatic access to artifact management, useful for integrating artifact uploads and downloads directly into training scripts or pipeline steps.

Upload a file:

Download a specific version:

List all versions of an artifact:

Artifact Use Cases

Model training workflows: Upload model checkpoints after each training epoch. If training is interrupted, resume from the last saved checkpoint instead of restarting from scratch.

Pipeline outputs: Store evaluation metrics, preprocessed embeddings, or serialized configurations produced by pipeline steps. Reference these artifacts in downstream steps or share them across teams.

Experiment tracking: Version control for all outputs related to an experiment. Track how model performance evolves across training runs or compare artifacts produced by different hyperparameter configurations.

Artifacts are scoped to apps, just like Pipelines and Models. This means access control, versioning, and lifecycle policies follow the same patterns you’re already using for other Clarifai resources.

Pipeline UI Improvements

Managing long-running workflows requires visibility into what’s running, what’s queued, and what failed. With this release, we’ve added several UI improvements to make it easier to monitor and control pipeline execution directly from the platform.

What’s New

Pipelines List
View all pipelines in your app from a single interface. You can see pipeline metadata, creation dates, and quickly navigate to specific pipelines without needing to use the CLI or API.

Pipeline Versions List
Each pipeline can have multiple versions, representing different configurations or iterations of the workflow. The new Versions view lets you browse all versions of a pipeline, compare configurations, and select which version to run.

Pipeline Version Runs View
This is where you monitor active and completed runs. The Runs view shows execution status, timestamps, and logs for each run, making it easier to debug failures or track progress on long-running jobs.

Quick switching between pipelines and versions
Navigate between pipelines, their versions, and individual runs without leaving the UI. This makes it faster to compare results across different pipeline configurations or troubleshoot specific runs.

Start / Pause / Cancel Runs
You can now start, pause, or cancel pipeline runs directly from the UI. Previously, this required CLI or API calls. Now you can stop a run that’s consuming resources unnecessarily, or pause execution to inspect intermediate state.

View run logs
Logs are streamed directly into the UI, so you can monitor execution in real time. This is especially useful for debugging failures or understanding what happened during a specific step in a multi-step workflow.

These improvements make pipelines more accessible for teams that prefer working through the UI rather than exclusively through the CLI or SDK. You still have full programmatic access through the API, but now you can also manage and monitor workflows visually.

Pipelines remain in Public Preview. We’re actively iterating based on feedback, so if you’re using pipelines and have suggestions for how the UI or execution model could be improved, we’d love to hear from you.

For a step-by-step guide on defining, uploading, and running pipelines, check out the Pipelines documentation.

Additional Changes

Retirement of the Community Plan

We’ve retired the Community Plan and migrated all users to our new Pay-As-You-Go plan, which offers a more sustainable and competitive pricing model.

All users who verify their phone number receive a $5 free welcome bonus to get started. The Pay-As-You-Go plan has no monthly minimums and far fewer feature gates, making it easier to test and scale AI workloads without upfront commitments.

For more details on the new pricing structure, see our recent announcement on Pay-As-You-Go credits.

Python SDK Updates

We’ve made several improvements to the Python SDK to improve reliability, developer experience, and compatibility with agentic workflows.

  • Added the load_concepts_from_config() method to VisualDetectorClass and VisualClassifierClass to load concepts from config.yaml.
  • Added a Dockerfile template that conditionally installs packages required for video streaming.
  • Fixed deployment cleanup logic to ensure it targets only failed model deployments.
  • Implemented an automatic retry mechanism for OpenAI API calls to gracefully handle transient httpx.ConnectError exceptions.
  • Fixed attribute access for OpenAI response objects in agentic transport by using hasattr() checks instead of dictionary .get() methods.
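The retry behavior described in the fourth bullet follows a common pattern. A generic sketch with illustrative exception types and backoff values, not the SDK’s actual implementation:

```python
import time

def retry(max_attempts=3, base_delay=0.01, retryable=(ConnectionError,)):
    """Retry a function on transient errors with exponential backoff."""
    def decorator(fn):
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return fn(*args, **kwargs)
                except retryable:
                    if attempt == max_attempts - 1:
                        raise            # out of attempts: surface the error
                    time.sleep(base_delay * 2 ** attempt)
        return wrapper
    return decorator

calls = {"n": 0}

@retry(max_attempts=3)
def flaky_api_call():
    """Stand-in for an API call that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network failure")
    return "ok"

print(flaky_api_call())   # → ok (after two retried failures)
```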

For a complete list of SDK updates, see the Python SDK changelog.

Ready to Start Building?

You can start deploying public MCP servers today to give agentic models access to browsing capabilities, real-time data, and developer tools. Deploy them on Clarifai’s shared compute, dedicated instances, or your own infrastructure using the same orchestration layer as your models.

If you’re running long-running workflows, use Artifacts to store and version files produced by pipelines. Upload checkpoints, logs, and outputs directly through the CLI or SDK, and resume execution from saved state when needed.

For teams managing complex pipelines, the new UI improvements make it easier to monitor runs, view logs, and control execution without leaving the platform.

Pipelines and public MCP server support are available in Public Preview. We’d love your feedback as you build.

Sign up here to get started with Clarifai, or check out the documentation. If you have questions or need help while building, join us on Discord. Our community and team are there to help.


