Introduction
Whether you are refactoring legacy code, implementing new features, or debugging complicated issues, AI coding assistants can speed up your development workflow and reduce time-to-delivery. OpenHands is an AI-powered coding framework that acts like a real development partner: it understands complex requirements, navigates entire codebases, writes and modifies code across multiple files, debugs errors, and can even interact with external services. Unlike traditional code completion tools that suggest snippets, OpenHands acts as an autonomous agent capable of carrying out full development tasks from start to finish.
On the model side, GPT-OSS is OpenAI's family of open-source large language models built for advanced reasoning and code generation. These models, released under the Apache 2.0 license, bring capabilities that were previously locked behind proprietary APIs into a fully accessible form. GPT-OSS-20B offers fast responses and modest resource requirements, making it well-suited for smaller teams or individual developers running models locally.
GPT-OSS-120B delivers deeper reasoning for complex workflows, large-scale refactoring, and architectural decision-making, and it can be deployed on more powerful hardware for higher throughput. Both models use a mixture-of-experts architecture, activating only the parts of the network needed for a given request, which helps balance efficiency with performance.
This tutorial will guide you through creating a complete local AI coding setup that combines OpenHands' agent capabilities with GPT-OSS models.
Tutorial: Building Your Local AI Coding Agent
Prerequisites
Before we begin, make sure you have the following requirements:
Get a PAT key — To use OpenHands with Clarifai models, you'll need a Personal Access Token (PAT). Log in or sign up for a Clarifai account, then navigate to your Security settings to generate a new PAT.
Get a model — Clarifai's Community offers a wide selection of cutting-edge language models that you can run using OpenHands. Browse the community to find a model that best fits your use case. For this example, we'll use the gpt-oss-120b model.
Install Docker Desktop — OpenHands runs inside a Docker container, so you'll need Docker installed and running on your system. You can download and install Docker Desktop for your operating system from the official Docker website. Be sure to follow the installation steps specific to your OS (Windows, macOS, or Linux).
Step 1: Pull the Runtime Image
OpenHands uses a dedicated Docker image to provide a sandboxed execution environment. You can pull this image from the all-hands-ai Docker registry.
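For example, the pull command looks like this. The exact version tag changes between OpenHands releases, so treat the tag below as a placeholder and check the official OpenHands documentation for the current one:

```shell
# Pull the sandboxed runtime image from the all-hands-ai registry.
# The 0.39-nikolaik tag is an example; use the tag recommended for
# the OpenHands version you plan to run.
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.39-nikolaik
```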
Step 2: Run OpenHands
Start OpenHands using the following docker run command.
This command launches a new Docker container running OpenHands with all the necessary configuration, including environment variables for logging, Docker engine access for sandboxing, port mapping so the web interface is reachable on localhost:3000, persistent data storage in the ~/.openhands folder, host communication capabilities, and automatic cleanup when the container exits.
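As a sketch, a docker run invocation along these lines covers the options described above. The version tags and the openhands-app container name are assumptions for illustration; match them to the release you actually pulled:

```shell
# -it --rm            : interactive session, container cleaned up on exit
# SANDBOX_RUNTIME_... : runtime image used for the sandboxed environment
# LOG_ALL_EVENTS      : verbose event logging
# docker.sock mount   : gives OpenHands access to the Docker engine
# ~/.openhands mount  : persistent data storage on the host
# -p 3000:3000        : web interface on http://localhost:3000
# --add-host          : lets the container reach services on the host
docker run -it --rm --pull=always \
  -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.39-nikolaik \
  -e LOG_ALL_EVENTS=true \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v ~/.openhands:/.openhands \
  -p 3000:3000 \
  --add-host host.docker.internal:host-gateway \
  --name openhands-app \
  docker.all-hands.dev/all-hands-ai/openhands:0.39
```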
Step 3: Access the Web Interface
After running the docker run command, monitor the terminal for log output. Once the application finishes its startup process, open your preferred web browser and navigate to: http://localhost:3000
At this point, OpenHands is successfully installed and running on your local machine, ready for configuration.
Step 4: Configure OpenHands with GPT-OSS
To configure OpenHands, open its interface and click the Settings (gear icon) in the bottom-left corner of the sidebar.
The Settings page allows you to connect OpenHands to an LLM, which serves as its cognitive engine, and integrate it with GitHub for version control and collaboration.
Connect to GPT-OSS via Clarifai
On the Settings page, go to the LLM tab and toggle the Advanced button.
Fill in the following fields for the model integration:
Custom Model — Enter the Clarifai model URL for GPT-OSS-120B. To ensure OpenAI compatibility, prefix the model path with openai/, followed by the full Clarifai model URL: "openai/https://clarifai.com/openai/chat-completion/models/gpt-oss-120b"
Base URL — Enter Clarifai's OpenAI-compatible API endpoint: "https://api.clarifai.com/v2/ext/openai/v1"
API Key — Enter your Clarifai PAT.
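Before saving, you can optionally sanity-check the endpoint and your PAT from a terminal. This curl sketch assumes your token is exported as CLARIFAI_PAT; note that the model field here uses the bare Clarifai model URL, while the openai/ prefix is only needed inside OpenHands:

```shell
# Export your Clarifai Personal Access Token (replace the placeholder).
export CLARIFAI_PAT="your-personal-access-token"

# Minimal chat-completion request against Clarifai's OpenAI-compatible
# endpoint; a JSON response confirms the PAT and model URL are valid.
curl -s https://api.clarifai.com/v2/ext/openai/v1/chat/completions \
  -H "Authorization: Bearer $CLARIFAI_PAT" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "https://clarifai.com/openai/chat-completion/models/gpt-oss-120b",
        "messages": [{"role": "user", "content": "Say hello in one word."}]
      }'
```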
After filling in the fields, click the Save Changes button in the bottom-right corner of the interface.
While this tutorial focuses on the GPT-OSS-120B model, Clarifai's Community has over 100 open-source and third-party models that you can easily access through the same OpenAI-compatible API. Simply replace the model URL in the Custom Model field with any other model from Clarifai's catalog to experiment with different AI capabilities and find the one that best fits your development workflow.
Step 5: Integrate with GitHub
Within the same Settings page, navigate to the Integrations tab.
Enter your GitHub token in the provided field, then click Save Changes in the bottom-right corner of the interface to apply the integration.
Step 6: Start Building with AI-Powered Development
Next, click the plus (+) Start new conversation button at the top of the sidebar. From there, connect to a repository by selecting your desired repo and its branch.
Once selected, click the Launch button to begin your coding session with full repository access.
In the main interface, use the input field to prompt the agent and begin generating your code. The GPT-OSS-120B model will understand your requirements and provide intelligent, context-aware assistance tailored to your connected repository.
Example prompts to get started:
- Documentation: "Generate a comprehensive README.md file for this repository that explains the project purpose, installation steps, and usage examples."
- Testing: "Write detailed unit tests for the user authentication functions in the auth.py file, including edge cases and error handling scenarios."
- Code Enhancement: "Analyze the database connection logic and refactor it to use connection pooling for better performance and reliability."
OpenHands forwards your request to the configured GPT-OSS-120B model, which responds by generating intelligent code suggestions, explanations, and implementations that understand your project context. When you're satisfied, you can seamlessly push your code to GitHub directly from the interface, maintaining full version control integration.
Conclusion
You've set up a fully functional AI coding agent that runs entirely on your local infrastructure using OpenHands and the GPT-OSS-120B model.
If you want to use a model running locally, you can set it up with Local Runners. For example, you can run the GPT-OSS-20B model locally, expose it as a public API, and use that URL to power your coding agent. Check out the tutorial on running gpt-oss models locally using Local Runners here.
If you need more computing power, you can deploy gpt-oss models on your own dedicated machines using Compute Orchestration and then integrate them with your coding agents, giving you greater control over performance and resource allocation.