
Run Multiple AI Coding Agents in Parallel with Container-Use from Dagger


In AI-driven development, coding agents have become indispensable collaborators. These autonomous or semi-autonomous tools can write, test, and refactor code, dramatically accelerating development cycles. However, as the number of agents working on a single codebase grows, so do the challenges: dependency conflicts, state leakage between agents, and the difficulty of tracking each agent's actions. The container-use project from Dagger addresses these challenges by providing containerized environments tailored for coding agents. By isolating each agent in its own container, developers can run multiple agents concurrently without interference, inspect their actions in real time, and intervene directly when necessary.

Traditionally, when a coding agent executes tasks such as installing dependencies, running build scripts, or launching servers, it does so within the developer's local environment. This approach quickly leads to conflicts: one agent may upgrade a shared library in a way that breaks another agent's workflow, or an errant script may leave behind artifacts that muddy subsequent runs. Containerization elegantly solves these issues by encapsulating each agent's environment. Rather than babysitting agents one at a time, you can spin up entirely fresh environments, experiment safely, and discard failures instantly, all while maintaining visibility into exactly what each agent executed.

Moreover, because containers can be managed with familiar tools (Docker, git, and standard CLI utilities), container-use integrates seamlessly into existing workflows. Instead of locking into a proprietary solution, teams can keep their preferred tech stack, whether that means Python virtual environments, Node.js toolchains, or system-level packages. The result is a flexible architecture that lets developers harness the full potential of coding agents without sacrificing control or transparency.

Installation and Setup

Getting started with container-use is straightforward. The project provides a Go-based CLI tool, 'cu', which you build and install with a simple 'make' command. By default, the build targets your current platform, but cross-compilation is supported via the standard 'TARGETPLATFORM' environment variable.

# Build the CLI tool
make

# (Optional) Install into your PATH
make install && hash -r

After running these commands, the 'cu' binary becomes available in your shell, ready to launch containerized sessions for any MCP-compatible agent. If you need to compile for a different architecture, say ARM64 for a Raspberry Pi, simply prefix the build with the target platform:

TARGETPLATFORM=linux/arm64 make

This flexibility ensures that whether you are developing on macOS, Windows Subsystem for Linux, or any flavor of Linux, you can generate a platform-specific binary with ease.

Integrating with Your Favorite Agents

One of container-use's strengths is its compatibility with any agent that speaks the Model Context Protocol (MCP). The project provides example integrations for popular tools such as Claude Code, Cursor, GitHub Copilot, and Goose. Integration typically involves adding 'container-use' as an MCP server in your agent's configuration and enabling it:

Claude Code uses an NPM helper to register the server. You can merge Dagger's recommended instructions into your 'CLAUDE.md' so that running 'claude' automatically spawns agents in isolated containers:

  npx @anthropic-ai/claude-code mcp add container-use -- $(which cu) stdio
  curl -o CLAUDE.md https://raw.githubusercontent.com/dagger/container-use/main/rules/agent.md

Goose, an open-source agent framework, reads its configuration from '~/.config/goose/config.yaml'. Adding a 'container-use' extension there directs Goose to launch each agent session inside its own container:

  extensions:
    container-use:
      name: container-use
      type: stdio
      enabled: true
      cmd: cu
      args:
        - stdio
      envs: {}

Cursor, the AI code assistant, can be hooked in by dropping a rule file into your project. With 'curl' you fetch the recommended rule and place it in '.cursor/rules/container-use.mdc'.
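
A sketch of that step, reusing the agent.md rules file referenced above (Dagger's repository may ship a Cursor-specific rule, so treat the source path as an assumption and check the project docs for the canonical one):

  curl --create-dirs -o .cursor/rules/container-use.mdc https://raw.githubusercontent.com/dagger/container-use/main/rules/agent.md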

VS Code and GitHub Copilot users can update their 'settings.json' and '.github/copilot-instructions.md' respectively, pointing to the 'cu' command as the MCP server. Copilot then executes its code completions inside the encapsulated environment. Kilo Code integrates via a JSON-based settings file, letting you specify the 'cu' command and any required arguments under 'mcpServers' (see the sketch below). Each of these integrations ensures that, no matter which assistant you choose, your agents operate in their own sandbox, removing the risk of cross-contamination and simplifying cleanup after each run.
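
For reference, a minimal 'mcpServers' entry might look like the following. The 'command' and 'args' keys follow the common MCP client convention; the exact schema varies between Kilo Code, VS Code, and other clients, so adapt it to your tool's settings format:

  {
    "mcpServers": {
      "container-use": {
        "command": "cu",
        "args": ["stdio"]
      }
    }
  }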

Hands-On Examples

To illustrate how container-use can transform your development workflow, the Dagger repository includes several ready-to-run examples. These demonstrate typical use cases and highlight the tool's flexibility:

  • Hello World: In this minimal example, an agent scaffolds a simple HTTP server, say using Flask or Node's 'http' module, and launches it inside its container. You can hit 'localhost' in your browser to confirm that the agent-generated code runs as expected, fully isolated from your host system.
  • Parallel Development: Here, two agents spin up distinct versions of the same app, one using Flask and another using FastAPI, each in its own container and on separate ports. This scenario demonstrates how to evaluate multiple approaches side by side without worrying about port collisions or dependency conflicts.
  • Security Scanning: In this pipeline, an agent performs routine maintenance: updating vulnerable dependencies, rerunning the build to ensure nothing broke, and producing a patch file that captures all changes. The entire process unfolds in a throwaway container, leaving your repository in its original state unless you decide to merge the patches.

Running these examples is as simple as piping the example file into your agent command. For instance, with Claude Code:

cat examples/hello_world.md | claude

Or with Goose:

goose run -i examples/hello_world.md -s
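
Because each run gets its own container, you can also kick off several of these at once, for example from two terminals, and let the agents work side by side without clobbering each other. A sketch of the idea, reusing the same example file:

# Terminal 1: one agent tackles the task with Claude Code
cat examples/hello_world.md | claude

# Terminal 2: a second agent works on the same task with Goose
goose run -i examples/hello_world.md -s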

After execution, you'll see each agent commit its work to a dedicated git branch that represents its container. Inspecting these branches with 'git checkout' lets you review, test, or merge changes on your own terms.

One common concern when delegating tasks to agents is knowing what they actually did, not just what they claim. container-use addresses this through a unified logging interface. When you start a session, the tool records every command, output, and file change into your repository's '.git' history under a special remote called 'container-use'. You can follow along as the container spins up, the agent runs commands, and the environment evolves.
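
A minimal sketch of that review loop, assuming branch names of the form container-use/<environment>; the actual names depend on what your agent created:

# Fetch the work recorded under the container-use remote
git fetch container-use
# List the branches the agents produced
git branch -r
# Check out one agent's branch and inspect exactly what it committed
git checkout container-use/<environment>
git log --stat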

If an agent encounters an error or goes off track, you don't have to watch logs in a separate window; a single command brings up an interactive view.
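
A sketch of that invocation, assuming the watch subcommand exposed by the cu CLI (check 'cu --help' for the exact name in your build):

cu watch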

This live view shows you which container branch is active and the latest outputs, and even gives you the option to drop into the agent's shell. From there, you can debug manually: inspect environment variables, run your own commands, or edit files on the fly. This direct intervention capability ensures that agents remain collaborators rather than inscrutable black boxes.

While the default container images provided by container-use cover many Node, Python, and system-level use cases, you may have specialized needs, say custom compilers or proprietary libraries. Fortunately, you can control the Dockerfile that underpins each container. By placing a 'Containerfile' (or 'Dockerfile') at the root of your project, the 'cu' CLI will build a tailored image before launching the agent. This approach lets you pre-install system packages, clone private repositories, or configure complex toolchains, all without affecting your host environment.

A typical custom Dockerfile might start from an official base image, add OS-level packages, set environment variables, and install language-specific dependencies:

# Base image plus OS-level packages the agents will need (python3-pip added so pip is available)
FROM ubuntu:22.04
RUN apt-get update && apt-get install -y git build-essential python3 python3-pip
# Example environment variable and working directory for agent runs
ENV PYTHONUNBUFFERED=1
WORKDIR /workspace
# Project-specific Python dependencies
COPY requirements.txt .
RUN pip3 install -r requirements.txt

Once you've defined your container, any agent you invoke will operate inside that context by default, inheriting all of the pre-configured tools and libraries you need.

In conclusion, as AI agents take on increasingly complex development tasks, the need for robust isolation and transparency grows in parallel. container-use from Dagger offers a pragmatic solution: containerized environments that ensure reliability, reproducibility, and real-time visibility. By building on standard tools, including Docker, Git, and shell scripts, and offering seamless integrations with popular MCP-compatible agents, it lowers the barrier to safe, scalable, multi-agent workflows.


Sana Hassan, a consulting intern at Marktechpost and dual-degree student at IIT Madras, is passionate about applying technology and AI to address real-world challenges. With a keen interest in solving practical problems, he brings a fresh perspective to the intersection of AI and real-life solutions.
