Introduction
Whether you are refactoring legacy code, implementing new features, or debugging complex issues, AI coding assistants can speed up your development workflow and reduce time-to-delivery. OpenHands is an AI-powered coding framework that acts like a real development partner: it understands complex requirements, navigates entire codebases, writes and modifies code across multiple files, debugs errors, and can even interact with external services. Unlike traditional code completion tools that suggest snippets, OpenHands acts as an autonomous agent capable of carrying out full development tasks from start to finish.
On the model side, GPT-OSS is OpenAI's family of open-weight large language models built for advanced reasoning and code generation. These models, released under the Apache 2.0 license, bring capabilities that were previously locked behind proprietary APIs into a fully accessible form. GPT-OSS-20B offers fast responses and modest resource requirements, making it well-suited for smaller teams or individual developers running models locally.
GPT-OSS-120B delivers deeper reasoning for complex workflows, large-scale refactoring, and architectural decision-making, and it can be deployed on more powerful hardware for higher throughput. Both models use a mixture-of-experts architecture, activating only the parts of the network needed for a given request, which helps balance efficiency with performance.
This tutorial will guide you through creating a complete local AI coding setup that combines OpenHands' agent capabilities with GPT-OSS models.
Tutorial: Building Your Local AI Coding Agent
Prerequisites
Before we begin, ensure you have the following:
Get a PAT key: To use OpenHands with Clarifai models, you'll need a Personal Access Token (PAT). Log in or sign up for a Clarifai account, then navigate to your Security settings to generate a new PAT.
Get a model: Clarifai's Community offers a wide selection of cutting-edge language models that you can run using OpenHands. Browse the community to find a model that best fits your use case. For this example, we'll use the gpt-oss-120b model.
Install Docker Desktop: OpenHands runs inside a Docker container, so you'll need Docker installed and running on your system. You can download and install Docker Desktop for your operating system from the official Docker website. Be sure to follow the installation steps specific to your OS (Windows, macOS, or Linux).
Step 1: Pull the Runtime Image
OpenHands uses a dedicated Docker image to provide a sandboxed execution environment. You can pull this image from the all-hands-ai Docker registry.
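A minimal sketch of the pull command, assuming the runtime image naming used in the OpenHands documentation; the version tag below is illustrative, so check the OpenHands release notes for the current tag:

```shell
# Pull the sandboxed runtime image that OpenHands uses to execute agent actions
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.39-nikolaik
```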
Step 2: Run OpenHands
Start OpenHands using the following docker run command.
This command launches a new Docker container running OpenHands with all necessary configuration, including environment variables for logging, Docker engine access for sandboxing, port mapping for web interface access on localhost:3000, persistent data storage in the ~/.openhands folder, host communication capabilities, and automatic cleanup when the container exits.
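A sketch of that command, assuming the image names from the OpenHands documentation; the version tags are examples, so substitute the current release:

```shell
# Launch the OpenHands app container:
# - SANDBOX_RUNTIME_CONTAINER_IMAGE: runtime image pulled in Step 1
# - LOG_ALL_EVENTS: verbose event logging
# - docker.sock mount: lets OpenHands spawn sandbox containers
# - ~/.openhands mount: persists conversations and settings
# - -p 3000:3000: exposes the web UI on localhost:3000
# - --add-host: allows the container to reach services on the host
# - --rm: cleans up the container when it exits
docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.39-nikolaik \
    -e LOG_ALL_EVENTS=true \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ~/.openhands:/.openhands \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:0.39
```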
Step 3: Access the Web Interface
After running the docker run command, monitor the terminal for log output. Once the application finishes its startup process, open your preferred web browser and navigate to: http://localhost:3000
At this point, OpenHands is successfully installed and running on your local machine, ready for configuration.
Step 4: Configure OpenHands with GPT-OSS
To configure OpenHands, open its interface and click the Settings (gear icon) in the bottom-left corner of the sidebar.
The Settings page allows you to connect OpenHands to an LLM, which serves as its cognitive engine, and integrate it with GitHub for version control and collaboration.
Connect to GPT-OSS via Clarifai
On the Settings page, go to the LLM tab and toggle the Advanced button.
Fill in the following fields for the model integration:
Custom Model: Enter the Clarifai model URL for GPT-OSS-120B. To ensure OpenAI compatibility, prefix the model path with openai/, followed by the full Clarifai model URL: "openai/https://clarifai.com/openai/chat-completion/models/gpt-oss-120b"
Base URL: Enter Clarifai's OpenAI-compatible API endpoint: "https://api.clarifai.com/v2/ext/openai/v1"
API Key: Enter your Clarifai PAT.
After filling in the fields, click the Save Changes button at the bottom-right corner of the interface.
While this tutorial focuses on the GPT-OSS-120B model, Clarifai's Community has over 100 open-source and third-party models that you can access through the same OpenAI-compatible API. Simply replace the model URL in the Custom Model field with any other model from Clarifai's catalog to experiment with different AI capabilities and find the one that best fits your development workflow.
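To sanity-check the connection outside OpenHands, you can call the endpoint directly. This is a hedged sketch assuming the Base URL and model URL shown above; CLARIFAI_PAT is a placeholder environment variable holding your token:

```shell
# Send a test chat completion to Clarifai's OpenAI-compatible endpoint
curl -s https://api.clarifai.com/v2/ext/openai/v1/chat/completions \
  -H "Authorization: Bearer $CLARIFAI_PAT" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "https://clarifai.com/openai/chat-completion/models/gpt-oss-120b",
        "messages": [{"role": "user", "content": "Write a one-line docstring for a function that reverses a string."}]
      }'
```

A successful response confirms your PAT and model URL are valid before you wire them into the OpenHands settings.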
Step 5: Integrate with GitHub
Within the same Settings page, navigate to the Integrations tab.
Enter your GitHub token in the provided field, then click Save Changes in the bottom-right corner of the interface to apply the integration.
Step 6: Start Building with AI-Powered Development
Next, click the plus (+) Start new conversation button at the top of the sidebar. From there, connect to a repository by selecting your desired repo and its branch.
Once selected, click the Launch button to begin your coding session with full repository access.
In the main interface, use the input field to prompt the agent and begin generating your code. The GPT-OSS-120B model will understand your requirements and provide intelligent, context-aware assistance tailored to your connected repository.
Example prompts to get started:
- Documentation: "Generate a comprehensive README.md file for this repository that explains the project purpose, installation steps, and usage examples."
- Testing: "Write detailed unit tests for the user authentication functions in the auth.py file, including edge cases and error handling scenarios."
- Code Enhancement: "Analyze the database connection logic and refactor it to use connection pooling for better performance and reliability."
OpenHands forwards your request to the configured GPT-OSS-120B model, which responds with code suggestions, explanations, and implementations grounded in your project context. When you're satisfied, you can push your code to GitHub directly from the interface, maintaining full version control integration.
Conclusion
You've set up a fully functional AI coding agent that runs entirely on your local infrastructure using OpenHands and the GPT-OSS-120B model.
If you want to use a model running locally, you can set it up with local runners. For example, you can run the GPT-OSS-20B model locally, expose it as a public API, and use that URL to power your coding agent. Check out the tutorial on running gpt-oss models locally using local runners.
If you need more computing power, you can deploy gpt-oss models on your own dedicated machines using compute orchestration and then integrate them with your coding agents, giving you greater control over performance and resource allocation.