Saturday, March 14, 2026

Three-Command CLI Workflow for Model Deployment


This blog post focuses on new features and enhancements. For a comprehensive list, including bug fixes, please see the release notes.


Getting models from development to production usually involves multiple tools, configuration files, and deployment steps. You scaffold a model locally, test it in isolation, configure infrastructure, write deployment scripts, and then push to production. Every step requires context switching and manual coordination.

With Clarifai 12.2, we've streamlined this into a 3-command workflow: model init, model serve, and model deploy. These commands handle scaffolding, local testing, and production deployment with automatic infrastructure provisioning, GPU selection, and health checks built in.

This isn't just faster. It removes the friction between building a model and running it at scale. The CLI handles dependency management, runtime configuration, and deployment orchestration, so you can focus on model logic instead of infrastructure setup.

This release also introduces Training on Pipelines, letting you train models directly within pipeline workflows using dedicated compute resources. We've added Video Intelligence support through the UI, improved artifact lifecycle management, and expanded deployment capabilities with dynamic nodepool routing and new cloud provider support.

Let's walk through what's new and get started.

Streamlined Model Deployment: 3 Commands to Production

The typical model deployment workflow involves multiple steps: scaffold a project structure, install dependencies, write configuration files, test locally, containerize, provision infrastructure, and deploy. Each step requires switching contexts and managing configuration across different tools.

Clarifai's CLI consolidates this into three commands that handle the entire lifecycle from scaffolding to production deployment.

How It Works

1. Initialize a model project

clarifai model init --toolkit vllm --model-name Qwen/Qwen3-0.6B

This scaffolds a complete model directory with the structure Clarifai expects: config.yaml, requirements.txt, and model.py. You can use built-in toolkits (HuggingFace, vLLM, LMStudio, Ollama) or start from scratch with a base template.

The generated config.yaml includes smart defaults for runtime settings, compute requirements, and deployment configuration. You can modify these or leave them as-is for basic deployments.
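As a rough illustration, a generated config.yaml might look something like the sketch below. The section and field names here are assumptions for illustration, not the exact schema; check the scaffolded file for the real layout:

```yaml
# Illustrative sketch only; the actual generated schema may differ.
model:
  id: qwen3-0-6b
  model_type_id: text-to-text

build_info:
  python_version: "3.11"

inference_compute_info:
  cpu_limit: "2"
  cpu_memory: "16Gi"
  num_accelerators: 1
```

For basic deployments you would typically only touch the model id and the compute limits, since the defaults cover the rest.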

2. Test locally

clarifai model serve

This starts a local inference server that behaves exactly like the production deployment. You can test your model with real requests, verify behavior, and iterate quickly without deploying to the cloud.

The serve command supports several modes:

  • Environment mode: Runs directly in your local Python environment
  • Docker mode: Builds and runs in a container for production parity
  • Standalone gRPC mode: Exposes a gRPC endpoint for integration testing
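As a small sketch, a wrapper script might document the mode choice for a team like this. The three modes come from the list above, but this post doesn't spell out their flag names, so any flag in the comments below is an assumption:

```shell
#!/bin/sh
# Map a short mode name to what `clarifai model serve` does in that mode.
# The modes themselves are from this release; flag spellings are assumed.
mode_desc() {
  case "$1" in
    env)    echo "run directly in the local Python environment" ;;
    docker) echo "build and run in a container for production parity" ;;
    grpc)   echo "expose a standalone gRPC endpoint" ;;
    *)      echo "unknown mode" ;;
  esac
}

mode_desc docker
# With the CLI installed, environment mode is simply:
#   clarifai model serve
```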

3. Deploy to production

clarifai model deploy

This command handles everything: it validates your config, builds the container, provisions infrastructure (cluster, nodepool, deployment), and monitors until the model is ready.

The CLI shows structured deployment phases with progress indicators, so you know exactly what's happening at each step. Once deployed, you get a public API endpoint that's ready to handle inference requests.

Intelligent Infrastructure Provisioning

The CLI now handles GPU selection automatically during model initialization. GPU auto-selection analyzes your model's memory requirements and toolkit specifications, then selects appropriate GPU instances.

Multi-cloud instance discovery works across cloud providers. You can use GPU shorthands like h100 or legacy instance names, and the CLI normalizes them across AWS, Azure, DigitalOcean, and other supported providers.
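To make the shorthand normalization concrete, here is an illustrative sketch of the kind of mapping involved. The instance names are real AWS and Azure H100/A100 SKUs, but the lookup itself is our illustration, not Clarifai's actual table:

```shell
#!/bin/sh
# Illustration of shorthand -> provider instance-name normalization.
# Not Clarifai's real mapping, just the shape of the idea.
normalize_gpu() {
  shorthand="$1"; provider="$2"
  case "$shorthand:$provider" in
    h100:aws)   echo "p5.48xlarge" ;;
    h100:azure) echo "Standard_ND96isr_H100_v5" ;;
    a100:aws)   echo "p4d.24xlarge" ;;
    *)          echo "unsupported" ;;
  esac
}

normalize_gpu h100 aws
```

The point of the real feature is that you write `h100` once and the CLI resolves the provider-specific name for you.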

Custom Docker base images let you optimize build times. If you have a pre-built image with common dependencies, the CLI can use it as a base layer for faster toolkit builds.

Deployment Lifecycle Management

Once deployed, you need visibility into how models are running and the ability to control them. The CLI provides commands for the full deployment lifecycle:

Check deployment status:

clarifai model status --deployment

View logs:

clarifai model logs --deployment

Undeploy:

clarifai model undeploy --deployment

The CLI also supports managing deployments directly by ID, which is useful for scripting or CI/CD pipelines.
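For example, a CI job might wait until a deployment reports ready before running smoke tests. The sketch below assumes the status command prints a line containing READY when healthy; that output format, the deployment ID variable, and the stub are all assumptions for illustration:

```shell
#!/bin/sh
# Wait-for-ready sketch for a CI pipeline. is_ready is plain string
# matching; the commented clarifai call shows where the real CLI
# would slot in.
is_ready() {
  case "$1" in
    *READY*) return 0 ;;
    *)       return 1 ;;
  esac
}

# In a real pipeline (output format assumed):
#   STATUS="$(clarifai model status --deployment "$DEPLOYMENT_ID")"
STATUS="state: READY"   # stub for illustration

if is_ready "$STATUS"; then
  echo "deployment ready"
else
  echo "still provisioning"
fi
```

In practice you would loop with a sleep between polls and a timeout, then fail the job if the deployment never becomes ready.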

Enhanced Local Development

Local testing is critical for fast iteration, but it often diverges from production behavior. The CLI bridges this gap with local runners that mirror production environments.

The model serve command now supports:

  • Concurrency controls: Limit the number of simultaneous requests to simulate production load
  • Optional Docker image retention: Keep built images for faster restarts during development
  • Health-check configuration: Configure health-check settings using flags like --health-check-port, --disable-health-check, and --auto-find-health-check-port

Local runners also support the same inference modes as production (streaming, batch, multi-input), so you can test complex workflows locally before deploying.
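As a small sketch, a dev script might pin the health-check port when CI provides one and otherwise let the CLI find a free port. The health-check flags are the ones named above; the environment variable and the way the command string is assembled are our illustration:

```shell
#!/bin/sh
# Build the serve command: explicit health-check port in CI,
# auto-discovered port for local development.
if [ -n "${CI_HEALTH_PORT:-}" ]; then
  SERVE_CMD="clarifai model serve --health-check-port $CI_HEALTH_PORT"
else
  SERVE_CMD="clarifai model serve --auto-find-health-check-port"
fi

echo "$SERVE_CMD"
# eval "$SERVE_CMD"   # uncomment to actually start the local server
```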

Simplified Configuration

Model configuration used to require manually editing YAML files with exact field names and nested structures. The CLI now handles normalization automatically.

When you initialize a model, config.yaml includes only the fields you need to customize. Smart defaults fill in the rest. If you add fields with slightly incorrect names or formats, the CLI normalizes them during deployment.

This reduces configuration errors and makes it easier to migrate existing models to Clarifai.

Why This Matters

The three-command workflow removes friction from model deployment. You go from idea to production API in minutes instead of hours or days. The CLI handles infrastructure complexity, so you don't have to be an expert in Kubernetes, Docker, or cloud compute to deploy models at scale.

This also standardizes deployment across teams. Everyone uses the same commands, the same configuration format, and the same testing workflow. This makes it easier to share models, reproduce deployments, and onboard new team members.

For a complete guide to the new CLI workflow, including examples and advanced configuration options, see the Deploy Your First Model via CLI documentation.

Training on Pipelines

Clarifai Pipelines, introduced in 12.0, let you define and execute long-running, multi-step AI workflows. With 12.2, you can now train models directly within pipeline workflows using dedicated compute resources.

Training on Pipelines integrates model training into the same orchestration layer as inference and data processing. This means training jobs run on the same infrastructure as your other workloads, with the same autoscaling, monitoring, and cost controls.

How It Works

You can initialize training pipelines using templates via the CLI. This creates a pipeline structure with pre-configured training steps. You specify your dataset, model architecture, and training parameters in the pipeline configuration, then run it like any other pipeline.

The platform handles:

  • Provisioning GPUs for training workloads
  • Scaling compute based on job requirements
  • Saving checkpoints as Artifacts for versioning
  • Tracking training metrics and logs

Once training completes, the resulting model is automatically compatible with Clarifai's Compute Orchestration platform, so you can deploy it using the same model deploy workflow. Read more about Pipelines here.
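Sketched end to end, the flow might look like the commands below. The pipeline subcommands and flags are assumptions (the release describes template-based initialization via the CLI without spelling out exact flags); only model deploy is documented above:

```shell
#!/bin/sh
# Assumed shape of a train-then-deploy flow; the commented flag
# spellings are illustrative, not documented.
TEMPLATE="training"

# clarifai pipeline init --template "$TEMPLATE"   # scaffold training steps
# clarifai pipeline run                           # train on dedicated compute
# clarifai model deploy                           # deploy the trained model
echo "would scaffold a pipeline from template: $TEMPLATE"
```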

UI Experience

We've also introduced a new UI for training models within pipelines. You can configure training parameters, select datasets, and monitor progress directly from the platform without writing code or managing infrastructure.

This makes it easier for teams without deep ML engineering expertise to train custom models and integrate them into production workflows.

Training on Pipelines is available in Public Preview. For more details, see the Pipelines documentation.

Artifact Lifecycle Enhancements

With 12.2, we've improved how Artifacts handle expiration and versioning.

Artifacts no longer expire automatically by default. Previously, artifacts had a default retention policy that could delete them after a certain period. Now, artifacts persist indefinitely unless you explicitly set an expires_at value during upload.

This gives you full control over artifact lifecycle management. You can set expiration dates for short-lived outputs (like intermediate checkpoints during experimentation) while keeping production artifacts indefinitely.
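For instance, you might attach an expiry only to throwaway checkpoints. The sketch below computes an ISO-8601 timestamp one week out; the upload command and --expires-at flag spelling are assumptions based on the expires_at field described above:

```shell
#!/bin/sh
# Compute an expiry one week from now in UTC (GNU date syntax, with a
# BSD/macOS fallback), for use with a temporary artifact upload.
EXPIRES_AT="$(date -u -d '+7 days' '+%Y-%m-%dT%H:%M:%SZ' 2>/dev/null \
  || date -u -v+7d '+%Y-%m-%dT%H:%M:%SZ')"

echo "$EXPIRES_AT"
# Assumed flag spelling, for illustration only:
#   clarifai artifact upload ./checkpoint.bin --expires-at "$EXPIRES_AT"
```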

The CLI now displays latest-version-id alongside artifact visibility, making it easier to reference the latest version without listing all versions first.

These changes make Artifacts more predictable and easier to manage for long-term storage of pipeline outputs.

Video Intelligence

Clarifai now supports video intelligence through the UI. You can connect video streams to your application and apply AI analysis to detect objects, track movement, and generate insights in real time.

This expands Clarifai's capabilities beyond image and text processing to handle live video feeds, enabling use cases like security monitoring, retail analytics, and automated content moderation for video platforms.

Video Intelligence is available now.

Deployment Enhancements

We've made several improvements to how deployments work across compute infrastructure.

Dynamic nodepool routing lets you attach multiple nodepools to a single deployment with configurable scheduling strategies. This gives you more control over how traffic is distributed across different compute resources, which is useful for handling spillover traffic or routing to specific hardware based on request type.

Deployment visibility has been improved with status chips and enhanced list views across Deployments, Nodepools, and Clusters. You can see at a glance which deployments are healthy, which are scaling, and which need attention.

New cloud provider support: We've added DigitalOcean and Azure as supported instance providers, giving you more flexibility in where you deploy models.

Start and stop deployments explicitly: You can now pause deployments without deleting them. This preserves configuration while freeing up compute resources, which is useful for dev/test environments or models with intermittent traffic.

The redesigned Deployment details page provides expanded status visibility, including replica counts, nodepool health, and request metrics, all in one view.

Additional Changes

Platform Updates

We've introduced several UI improvements to make the platform easier to navigate and use:

  • New Model Library UI provides a streamlined experience for browsing and exploring models
  • Universal Search added to the navbar for quick access to models, datasets, and workflows
  • New account experience with improved onboarding and settings management
  • Home 3.0 interface with a refreshed design and better organization of recent activity

Playground Improvements

The Playground now includes major upgrades to the Universal Search experience, with multi-panel (compare mode) support, improved workspace handling, and smarter model auto-selection. Model selections are panel-aware to prevent cross-panel conflicts, and the UI can display simplified model names for a cleaner experience.

Pipeline Step Visibility

You can now set pipeline steps to be publicly visible during initialization through both the CLI and builder APIs. By default, pipelines and pipeline step templates are created with PRIVATE visibility, but you can override this when sharing workflows across teams or with the community.

Modules Deprecation

Support for Modules has been fully dropped. Modules previously extended Clarifai's UIs and enabled customized backend processing, but they have been replaced by more flexible features like Artifacts and Pipelines.

Python SDK Updates

We've made several improvements to the Python SDK, including:

  • Fixed the ModelRunner health server starting twice, which could cause "Address already in use" errors
  • Added admission-control support for model runners
  • Improved signal handling and zombie process reaping in runner containers
  • Refactored the MCP server implementation for better logging clarity

For a complete list of SDK updates, see the Python SDK changelog.

Ready to Start Building?

You can start using the new 3-command deployment workflow today. Initialize a model with clarifai model init, test it locally with clarifai model serve, and deploy to production with clarifai model deploy.

For teams running long-running training jobs, Training on Pipelines provides a way to integrate model training into the same orchestration layer as your inference workloads, with dedicated compute and automatic checkpoint management.

Video Intelligence support adds real-time video stream processing to the platform, and the deployment improvements give you more control over how models run across different compute environments.

The new CLI workflow is available now. Check out the Deploy Your First Model via CLI guide to get started, or explore the full 12.2 release notes for complete details.

Sign up here to get started with Clarifai, or check out the documentation for more information.

If you have questions or need help while building, join us on Discord. Our community and team are there to help.
