The chatter around artificial general intelligence (AGI) may dominate headlines coming out of Silicon Valley companies like OpenAI, Meta and xAI, but for enterprise leaders on the ground, the focus is squarely on practical applications and measurable outcomes. At VentureBeat's recent Transform 2025 event in San Francisco, a clear picture emerged: the era of real, deployed agentic AI is here, it is accelerating, and it is already reshaping how businesses operate.
Companies like Intuit, Capital One, LinkedIn, Stanford University and Highmark Health are quietly putting AI agents into production, tackling concrete problems and seeing tangible returns. Here are the four biggest takeaways from the event for technical decision-makers.
1. AI agents are moving into production, faster than anyone realized
Enterprises are now deploying AI agents in customer-facing applications, and the trend is accelerating at a breakneck pace. A recent VentureBeat survey of 2,000 industry professionals, conducted just before VB Transform, revealed that 68% of enterprise companies (those with 1,000+ employees) had already adopted agentic AI, a figure that seemed high at the time. (In fact, I worried it was too high to be credible, so when I presented the survey results on the event stage, I cautioned that the high adoption rate might reflect VentureBeat's particular readership.)
Still, new data validates this rapid shift. A KPMG survey released on June 26, a day after our event, shows that 33% of organizations are now deploying AI agents, a surprising threefold increase from just 11% in the previous two quarters. This market shift validates the trend VentureBeat first identified just weeks earlier in its pre-Transform survey.
This acceleration is being fueled by tangible results. Ashan Willy, CEO of New Relic, noted a staggering 30% quarter-over-quarter growth in monitoring of AI applications by its customers, driven primarily by those customers' move to adopt agents. Companies are deploying AI agents to help customers automate workflows they need help with. Intuit, for instance, has deployed invoice generation and reminder agents in its QuickBooks software. The result? Businesses using the feature are getting paid five days faster and are 10% more likely to be paid in full.
Even non-developers are feeling the shift. Scott White, the product lead for Anthropic's Claude AI product, described how he, despite not being a professional programmer, is now building production-ready software features himself. "This wasn't possible six months ago," he explained, highlighting the power of tools like Claude Code. Similarly, OpenAI's head of product for its API platform, Olivier Godement, detailed how customers like Stripe and Box are using its Agents SDK to build out multi-agent systems.
2. The hyperscaler race has no clear winner as multi-cloud, multi-model reigns
The days of betting on a single large language model (LLM) provider are over. A consistent theme throughout Transform 2025 was the move toward a multi-model, multi-cloud strategy. Enterprises want the flexibility to choose the best tool for the job, whether that's a powerful proprietary model or a fine-tuned open-source alternative.
As Armand Ruiz, VP of AI Platform at IBM, explained, the company's development of a model gateway, which routes applications to whichever LLM is best and most performant for the specific use case, was a direct response to customer demand. IBM started by offering enterprise customers its own models, then added open-source support, and finally realized it needed to support all models. This need for flexibility was echoed by XD Huang, the CTO of Zoom, who described his company's three-tiered model approach: supporting proprietary models, offering its own fine-tuned model, and allowing customers to create their own fine-tuned versions.
This trend is creating a powerful but constrained ecosystem, one where GPUs and the power needed to generate tokens are in limited supply. As Dylan Patel of SemiAnalysis and fellow panelists Jonathan Ross of Groq and Sean Lie of Cerebras pointed out, this puts pressure on the profitability of the many companies that simply buy more tokens when they become available, instead of locking in revenue as the cost of those tokens continues to fall. Enterprises are getting smarter about how they use different models for different tasks to optimize for both cost and performance, and that may sometimes mean not just relying on Nvidia chips but adopting far more customized hardware, something also echoed in a VB Transform session led by Solidigm on the emergence of customized memory and storage solutions for AI.
3. Enterprises are focused on solving real problems, not chasing AGI
While tech leaders like Elon Musk, Mark Zuckerberg and Sam Altman talk about the dawn of superintelligence, enterprise practitioners are rolling up their sleeves and solving immediate business challenges. The conversations at Transform were refreshingly grounded in reality.
Take Highmark Health, the nation's third-largest integrated health insurance and provider company. Its chief data officer, Richard Clarke, said it is using LLMs for practical applications like multilingual communication, to better serve its diverse customer base, and for streamlining medical claims. In other words, leveraging the technology to deliver better services today. Similarly, Capital One is building teams of agents that mirror the functions of the company, with specific agents for tasks like risk evaluation and auditing, along with helping its car dealership clients connect customers with the right loans.
The travel industry is also seeing a pragmatic shift. CTOs from Expedia and Kayak discussed how they are adapting to the new search paradigms enabled by LLMs. Users can now search for a hotel with an "infinity pool" on ChatGPT, and travel platforms need to incorporate that level of natural language discovery to stay competitive. The focus is on the customer, not the technology for its own sake.
4. The future of AI teams is small, nimble and empowered
The age of AI agents is also transforming how teams are structured. The consensus is that small, agile "squads" of three to four engineers are most effective. Varun Mohan, CEO of Windsurf, a fast-growing agentic IDE, kicked off the event by arguing that this small team structure allows for rapid testing of product hypotheses and avoids the slowdown that plagues larger groups.
This shift means that "everyone is a builder" and, increasingly, "everyone is a manager" of AI agents. As GitHub and Atlassian noted, engineers are now learning to manage fleets of agents. The skills required are evolving accordingly, with a greater emphasis on clear communication and strategic thinking to guide these autonomous systems.
This nimbleness is supported by a growing acceptance of sandboxed development. Andrew Ng, a leading voice in AI, advised attendees to leave safety, governance and observability to the end of the development cycle. While this might sound counterintuitive for large enterprises, the idea is to foster rapid innovation within a controlled environment in order to prove value quickly. This sentiment was mirrored in our survey, which found that 10% of organizations adopting AI have no dedicated AI safety team, suggesting a willingness to prioritize speed in these early stages.
Together, these takeaways paint a clear picture of an enterprise AI landscape that is maturing rapidly, moving from broad experimentation to focused, value-driven execution. The conversations at Transform 2025 confirmed that companies are deploying AI agents today, even if they have had to learn tough lessons along the way. Many have already gone through one or two big pivots since first trying out generative AI a year or two ago, so it pays to get started early.
For a more conversational dive into these themes and further analysis from the event, you can listen to the full discussion I had with independent AI developer Sam Witteveen on our recent podcast below. We've also just uploaded the main-stage talks from VB Transform here, and our full coverage of articles from the event is here.
Listen to the VB Transform takeaways podcast with Matt Marshall and Sam Witteveen here: