
This blog post focuses on new features and improvements. For a comprehensive list, including bug fixes, please see the release notes.

Three-Command CLI Workflow for Model Deployment

Getting models from development to production usually involves multiple tools, configuration files, and deployment steps. You scaffold a model locally, test it in isolation, configure infrastructure, write deployment scripts, and then push to production. Each step requires context switching and manual coordination.

With Clarifai 12.2, we've streamlined this into a 3-command workflow: model init, model serve, and model deploy. These commands handle scaffolding, local testing, and production deployment with automatic infrastructure provisioning, GPU selection, and health checks built in.

This isn't just faster. It removes the friction between building a model and running it at scale. The CLI handles dependency management, runtime configuration, and deployment orchestration, so you can focus on model logic instead of infrastructure setup.

This release also introduces Training on Pipelines, allowing you to train models directly within pipeline workflows using dedicated compute resources. We've added Video Intelligence support through the UI, improved artifact lifecycle management, and expanded deployment capabilities with dynamic nodepool routing and new cloud provider support.

Let's walk through what's new and how to get started.

Streamlined Model Deployment: 3 Commands to Production

The typical model deployment workflow involves multiple steps: scaffold a project structure, install dependencies, write configuration files, test locally, containerize, provision infrastructure, and deploy. Each step requires switching contexts and managing configuration across different tools.

Clarifai's CLI consolidates this into three commands that handle the entire lifecycle from scaffolding to production deployment.

How It Works

1. Initialize a model project

clarifai model init --toolkit vllm --model-name Qwen/Qwen3-0.6B 

This scaffolds a complete model directory with the structure Clarifai expects: config.yaml, requirements.txt, and model.py. You can use built-in toolkits (HuggingFace, vLLM, LMStudio, Ollama) or start from scratch with a base template.

The generated config.yaml includes sensible defaults for runtime settings, compute requirements, and deployment configuration. You can modify these or leave them as-is for basic deployments.
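To give a feel for what the scaffold produces, here is a sketch of a generated config.yaml. The field names and values below are illustrative assumptions, not the exact schema the CLI emits — check the generated file in your own project:

```yaml
# Hypothetical sketch of a scaffolded config.yaml; fields are illustrative.
model:
  id: qwen3-0-6b
  model_type_id: text-to-text
toolkit: vllm
inference_compute_info:
  cpu_limit: "2"
  cpu_memory: 8Gi
  num_accelerators: 1
```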

2. Test locally

clarifai model serve 

This starts a local inference server that behaves exactly like the production deployment. You can test your model with real requests, verify behavior, and iterate quickly without deploying to the cloud.

The serve command supports multiple modes:

  • Environment mode: Runs directly in your local Python environment
  • Docker mode: Builds and runs in a container for production parity
  • Standalone gRPC mode: Exposes a gRPC endpoint for integration testing

3. Deploy to production

clarifai model deploy 

This command handles everything: validates your config, builds the container, provisions infrastructure (cluster, nodepool, deployment), and monitors until the model is ready.

The CLI shows structured deployment phases with progress indicators, so you can see exactly what's happening at each step. Once deployed, you get a public API endpoint that's ready to handle inference requests.

Intelligent Infrastructure Provisioning

The CLI now handles GPU selection automatically during model initialization. GPU auto-selection analyzes your model's memory requirements and toolkit specifications, then selects appropriate GPU instances.

Multi-cloud instance discovery works across cloud providers. You can use GPU shorthands like h100 or legacy instance names, and the CLI normalizes them across AWS, Azure, DigitalOcean, and other supported providers.
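The normalization idea can be pictured with a toy lookup table. This is purely an illustration of the concept — the instance names here are real provider instance types, but the mapping itself is invented, not Clarifai's actual table:

```python
# Toy illustration of GPU-shorthand normalization across cloud providers.
# The table is invented for illustration; the real mapping lives in the CLI.
INSTANCE_TABLE = {
    ("aws", "h100"): "p5.48xlarge",
    ("aws", "a10g"): "g5.xlarge",
    ("azure", "h100"): "Standard_ND96isr_H100_v5",
}

def normalize_instance(provider: str, gpu: str) -> str:
    """Map a GPU shorthand like 'h100' to a provider-specific instance name."""
    try:
        return INSTANCE_TABLE[(provider.lower(), gpu.lower())]
    except KeyError:
        raise ValueError(f"no {gpu} instance known for provider {provider}")
```

The point is that you write `h100` once and the tooling resolves the provider-specific name, whichever cloud the deployment lands on.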

Custom Docker base images let you optimize build times. If you have a pre-built image with common dependencies, the CLI can use it as a base layer for faster toolkit builds.

Deployment Lifecycle Management

Once deployed, you need visibility into how models are running and the ability to control them. The CLI provides commands for the full deployment lifecycle:

Check deployment status:

clarifai model status --deployment <deployment-id> 

View logs:

clarifai model logs --deployment <deployment-id> 

Undeploy:

clarifai model undeploy --deployment <deployment-id> 

The CLI also supports managing deployments directly by ID, which is useful for scripting or CI/CD pipelines.
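In a CI/CD job it can help to build CLI invocations programmatically rather than string-concatenating them. The subcommand names below come from the commands above; the wrapper function itself is a hypothetical sketch:

```python
import subprocess

def model_cli(action: str, deployment_id: str) -> list[str]:
    """Build the argv for a deployment-lifecycle command (status/logs/undeploy)."""
    if action not in {"status", "logs", "undeploy"}:
        raise ValueError(f"unsupported action: {action}")
    return ["clarifai", "model", action, "--deployment", deployment_id]

# A pipeline step might then run, for example:
# subprocess.run(model_cli("status", "dep-abc123"), check=True)
```

Passing an argv list (rather than a shell string) avoids quoting issues when deployment IDs come from pipeline variables.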

Enhanced Local Development

Local testing is essential for fast iteration, but it often diverges from production behavior. The CLI bridges this gap with local runners that mirror production environments.

The model serve command now supports:

  • Concurrency controls: Limit the number of simultaneous requests to simulate production load
  • Optional Docker image retention: Keep built images for faster restarts during development
  • Health-check configuration: Configure health-check settings using flags like --health-check-port, --disable-health-check, and --auto-find-health-check-port

Local runners also support the same inference modes as production (streaming, batch, multi-input), so you can test complex workflows locally before deploying.
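As a sketch of what "real requests" against a local runner might look like, here is a stdlib-only helper that builds an OpenAI-style chat-completions request. The endpoint path, port, and API shape are assumptions (common for vLLM-backed models) — substitute whatever `clarifai model serve` prints for your setup:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but don't send) an OpenAI-style chat-completions request.
    The /v1/chat/completions path and local port are assumptions."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# e.g. urllib.request.urlopen(build_chat_request("http://localhost:8000",
#                                                "Qwen/Qwen3-0.6B", "Hello!"))
```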

Simplified Configuration

Model configuration used to require manually editing YAML files with exact field names and nested structures. The CLI now handles normalization automatically.

When you initialize a model, config.yaml includes only the fields you need to customize. Smart defaults fill in the rest. If you add fields with slightly incorrect names or formats, the CLI normalizes them during deployment.
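Conceptually, that normalization step maps near-miss field names onto canonical ones. A toy sketch of the idea (not Clarifai's implementation — the alias table here is invented):

```python
# Toy config-key normalization: rename near-miss keys to canonical forms.
# The alias table is invented for illustration only.
ALIASES = {
    "cpu-limit": "cpu_limit",
    "cpuLimit": "cpu_limit",
    "memory": "cpu_memory",
}

def normalize_config(cfg: dict) -> dict:
    """Return a copy of cfg with aliased keys renamed; other keys pass through."""
    return {ALIASES.get(key, key): value for key, value in cfg.items()}
```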

This reduces configuration errors and makes it easier to migrate existing models to Clarifai.

Why This Matters

The three-command workflow removes friction from model deployment. You go from idea to production API in minutes instead of hours or days. The CLI handles infrastructure complexity, so you don't have to be an expert in Kubernetes, Docker, or cloud compute to deploy models at scale.

This also standardizes deployment across teams. Everyone uses the same commands, the same configuration format, and the same testing workflow. This makes it easier to share models, reproduce deployments, and onboard new team members.

For a complete guide on the new CLI workflow, including examples and advanced configuration options, see the Deploy Your First Model via CLI documentation.

Training on Pipelines

Clarifai Pipelines, launched in 12.0, let you define and execute long-running, multi-step AI workflows. With 12.2, you can now train models directly within pipeline workflows using dedicated compute resources.

Training on Pipelines integrates model training into the same orchestration layer as inference and data processing. This means training jobs run on the same infrastructure as your other workloads, with the same autoscaling, monitoring, and cost controls.

How It Works

You can initialize training pipelines using templates via the CLI. This creates a pipeline structure with pre-configured training steps. You specify your dataset, model architecture, and training parameters in the pipeline configuration, then run it like any other pipeline.

The platform handles:

  • Provisioning GPUs for training workloads
  • Scaling compute based on job requirements
  • Saving checkpoints as Artifacts for versioning
  • Monitoring training metrics and logs
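Put together, a training pipeline configuration might look something like the sketch below. This is purely illustrative — the field names are assumptions, so consult the Pipelines documentation for the actual template schema:

```yaml
# Hypothetical training-pipeline sketch; field names are illustrative only.
pipeline:
  id: train-my-classifier
  steps:
    - id: train
      template: model-training
      config:
        dataset_id: my-dataset
        model_type: visual-classifier
        epochs: 10
      compute:
        num_accelerators: 1
```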

Once training completes, the resulting model is automatically compatible with Clarifai's Compute Orchestration platform, so you can deploy it using the same model deploy workflow. Read more about Pipelines here.

UI Experience

We've also launched a new UI for training models within pipelines. You can configure training parameters, select datasets, and monitor progress directly from the platform without writing code or managing infrastructure.

This makes it easier for teams without deep ML engineering expertise to train custom models and integrate them into production workflows.

Training on Pipelines is available in Public Preview. For more details, see the Pipelines documentation.

Artifact Lifecycle Improvements

With 12.2, we've improved how Artifacts handle expiration and versioning.

Artifacts no longer expire automatically by default. Previously, artifacts had a default retention policy that could delete them after a certain period. Now, artifacts persist indefinitely unless you explicitly set an expires_at value during upload.

This gives you full control over artifact lifecycle management. You can set expiration dates for temporary outputs (like intermediate checkpoints during experimentation) while keeping production artifacts indefinitely.
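If you do want a temporary artifact, you can compute the explicit expiry timestamp yourself. The expires_at field name comes from this release; how you pass it at upload time depends on the client you use, so the helper below is just a sketch:

```python
from datetime import datetime, timedelta, timezone

def expires_at(days: int) -> str:
    """ISO-8601 UTC timestamp `days` from now, e.g. for a temporary checkpoint."""
    return (datetime.now(timezone.utc) + timedelta(days=days)).isoformat()

# e.g. pass expires_at(7) as the expires_at value for a one-week artifact
```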

The CLI now displays latest-version-id alongside artifact visibility, making it easier to reference the most recent version without listing all versions first.

These changes make Artifacts more predictable and easier to manage for long-term storage of pipeline outputs.

Video Intelligence

Clarifai now supports video intelligence through the UI. You can connect video streams to your application and apply AI analysis to detect objects, track movement, and generate insights in real time.

This expands Clarifai's capabilities beyond image and text processing to handle live video feeds, enabling use cases like security monitoring, retail analytics, and automated content moderation for video platforms.

Video Intelligence is available now.

Deployment Improvements

We've made several improvements to how deployments work across compute infrastructure.

Dynamic nodepool routing allows you to attach multiple nodepools to a single deployment with configurable scheduling strategies. This gives you more control over how traffic is distributed across different compute resources, which is useful for handling spillover traffic or routing to specific hardware based on request type.

Deployment visibility has been improved with status chips and enhanced list views across Deployments, Nodepools, and Clusters. You can see at a glance which deployments are healthy, which are scaling, and which need attention.

New cloud provider support: We've added DigitalOcean and Azure as supported instance providers, giving you more flexibility in where you deploy models.

Start and stop deployments explicitly: You can now pause deployments without deleting them. This preserves configuration while freeing up compute resources, which is useful for dev/test environments or models with intermittent traffic.

The redesigned Deployment details page provides expanded status visibility, including replica counts, nodepool health, and request metrics, all in one view.

Additional Changes

Platform Updates

We've introduced several UI improvements to make the platform easier to navigate and use:

  • New Model Library UI provides a streamlined experience for browsing and exploring models
  • Global search added to the navbar for quick access to models, datasets, and workflows
  • New account experience with improved onboarding and settings management
  • Home 3.0 interface with a refreshed design and better organization of recent activity

Playground Improvements

The Playground now includes major upgrades, with multi-panel (compare mode) support, improved workspace handling, and smarter model auto-selection. Model selections are panel-aware to prevent cross-panel conflicts, and the UI can display simplified model names for a cleaner experience.

Pipeline Step Visibility

Now you can set pipeline steps to be publicly seen throughout initialization via each the CLI and builder APIs. By default, pipelines and pipeline step templates are created with PRIVATE visibility, however you may override this when sharing workflows throughout groups or with the group.

Modules Deprecation

Support for Modules has been fully removed. Modules previously extended Clarifai's UIs and enabled customized backend processing, but they have been replaced by more flexible features like Artifacts and Pipelines.

Python SDK Updates

We've made several improvements to the Python SDK, including:

  • Fixed the ModelRunner health server starting twice, which could cause "Address already in use" errors
  • Added admission-control support for model runners
  • Improved signal handling and zombie-process reaping in runner containers
  • Refactored the MCP server implementation for better logging clarity

For a complete list of SDK updates, see the Python SDK changelog.

Ready to Start Building?

You can start using the new 3-command deployment workflow today. Initialize a model with clarifai model init, test it locally with clarifai model serve, and deploy to production with clarifai model deploy.

For teams running long-running training jobs, Training on Pipelines provides a way to integrate model training into the same orchestration layer as your inference workloads, with dedicated compute and automatic checkpoint management.

Video Intelligence support adds real-time video stream processing to the platform, and deployment improvements give you more control over how models run across different compute environments.

The new CLI workflow is available now. Check out the Deploy Your First Model via CLI guide to get started, or explore the full 12.2 release notes for complete details.

Sign up here to get started with Clarifai, or check out the documentation for more information.

If you have questions or need help while building, join us on Discord. Our community and team are there to help.
