In this tutorial, we introduce an advanced AI agent built on Nebius' robust ecosystem, particularly the ChatNebius, NebiusEmbeddings, and NebiusRetriever components. The agent uses the Llama-3.3-70B-Instruct-fast model to generate high-quality responses, incorporating external functionalities such as Wikipedia search, contextual document retrieval, and safe mathematical computation. By combining structured prompt design with LangChain's modular framework, this tutorial demonstrates how to build a multi-functional, reasoning-capable AI assistant that is both interactive and extensible. Whether for scientific queries, technological insights, or basic numerical tasks, this agent showcases the potential of Nebius as a platform for building sophisticated AI systems.
!pip install -q langchain-nebius langchain-core langchain-community wikipedia
import os
import getpass
from typing import List, Dict, Any
import wikipedia
from datetime import datetime
We begin by installing the essential libraries, including langchain-nebius, langchain-core, langchain-community, and wikipedia, which are required for building a feature-rich AI assistant. We then import the necessary modules, such as os, getpass, datetime, and the typing utilities, and bring in the wikipedia package for external data access.
from langchain_core.documents import Document
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough
from langchain_core.tools import tool
from langchain_nebius import ChatNebius, NebiusEmbeddings, NebiusRetriever
if "NEBIUS_API_KEY" not in os.environ:
os.environ["NEBIUS_API_KEY"] = getpass.getpass("Enter your Nebius API key: ")
We import core components from LangChain and Nebius to enable document handling, prompt templating, output parsing, and tool integration. This sets up the key classes, ChatNebius for language modeling, NebiusEmbeddings for vector representation, and NebiusRetriever for semantic search. The user's Nebius API key is read securely with getpass so that subsequent API interactions can be authenticated.
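Before wiring up the full agent, it can be worth a quick smoke test of the credentials. The snippet below is a minimal, optional sketch that is not part of the tutorial's agent code; it assumes the standard LangChain chat-model and embeddings interfaces (invoke and embed_query) that langchain-nebius exposes, and reuses the same model name as the agent.

# Optional smoke test: confirm the API key works before building the agent.
# Assumes the standard LangChain interfaces implemented by langchain-nebius.
llm = ChatNebius(model="meta-llama/Llama-3.3-70B-Instruct-fast")
print(llm.invoke("Reply with a single word: ready?").content)

embeddings = NebiusEmbeddings()
vector = embeddings.embed_query("quantum computing")
print(f"Embedding dimension: {len(vector)}")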
class AdvancedNebiusAgent:
    """Advanced AI Agent with retrieval, reasoning, and external tool capabilities"""

    def __init__(self):
        self.llm = ChatNebius(model="meta-llama/Llama-3.3-70B-Instruct-fast")
        self.embeddings = NebiusEmbeddings()
        self.knowledge_base = self._create_knowledge_base()
        self.retriever = NebiusRetriever(
            embeddings=self.embeddings,
            docs=self.knowledge_base,
            k=3
        )

        self.agent_prompt = ChatPromptTemplate.from_template("""
You are an advanced AI assistant with access to:
1. A knowledge base about technology and science
2. Wikipedia search capabilities
3. Mathematical calculation tools
4. Current date/time information

Context from knowledge base:
{context}

External tool results:
{tool_results}

Current date: {current_date}

User Query: {query}

Instructions:
- Use the knowledge base context when relevant
- If you need additional information, mention what external sources would help
- Be comprehensive but concise
- Show your reasoning process
- If calculations are needed, break them down step by step

Response:
""")
    def _create_knowledge_base(self) -> List[Document]:
        """Create a comprehensive knowledge base"""
        return [
            Document(
                page_content="Artificial Intelligence (AI) is transforming industries through ML, NLP, and computer vision. Key applications include autonomous vehicles, medical diagnosis, and financial trading.",
                metadata={"topic": "AI", "category": "technology"}
            ),
            Document(
                page_content="Quantum computing uses quantum mechanical phenomena like superposition and entanglement to process information. Companies like IBM, Google, and Microsoft are leading quantum research.",
                metadata={"topic": "quantum_computing", "category": "technology"}
            ),
            Document(
                page_content="Climate change is caused by greenhouse gas emissions, primarily CO2 from fossil fuels. Renewable energy sources are crucial for mitigation.",
                metadata={"topic": "climate", "category": "environment"}
            ),
            Document(
                page_content="CRISPR-Cas9 is a revolutionary gene editing technology that allows precise DNA modifications. It has applications in treating genetic diseases and improving crops.",
                metadata={"topic": "biotechnology", "category": "science"}
            ),
            Document(
                page_content="Blockchain technology enables decentralized, secure transactions without intermediaries. Beyond cryptocurrency, it has applications in supply chain, healthcare, and voting systems.",
                metadata={"topic": "blockchain", "category": "technology"}
            ),
            Document(
                page_content="Space exploration has advanced with reusable rockets, Mars rovers, and commercial space travel. SpaceX, Blue Origin, and NASA are pioneering new missions.",
                metadata={"topic": "space", "category": "science"}
            ),
            Document(
                page_content="Renewable energy costs have dropped dramatically. Solar & wind power are now cheaper than fossil fuels in many regions, driving global energy transition.",
                metadata={"topic": "renewable_energy", "category": "environment"}
            ),
            Document(
                page_content="5G networks provide ultra-fast internet speeds and low latency, enabling IoT devices, autonomous vehicles, and augmented reality applications.",
                metadata={"topic": "5G", "category": "technology"}
            )
        ]
    @tool
    def wikipedia_search(query: str) -> str:
        """Search Wikipedia for additional information"""
        try:
            search_results = wikipedia.search(query, results=3)
            if not search_results:
                return f"No Wikipedia results found for '{query}'"
            page = wikipedia.page(search_results[0])
            summary = wikipedia.summary(search_results[0], sentences=3)
            return f"Wikipedia: {page.title}\n{summary}\nURL: {page.url}"
        except Exception as e:
            return f"Wikipedia search error: {str(e)}"

    @tool
    def calculate(expression: str) -> str:
        """Perform mathematical calculations safely"""
        try:
            allowed_chars = set('0123456789+-*/.() ')
            if not all(c in allowed_chars for c in expression):
                return "Error: Only basic mathematical operations allowed"
            result = eval(expression)
            return f"Calculation: {expression} = {result}"
        except Exception as e:
            return f"Calculation error: {str(e)}"
    def _format_docs(self, docs: List[Document]) -> str:
        """Format retrieved documents for context"""
        if not docs:
            return "No relevant documents found in knowledge base."
        formatted = []
        for i, doc in enumerate(docs, 1):
            formatted.append(f"{i}. {doc.page_content}")
        return "\n".join(formatted)

    def _get_current_date(self) -> str:
        """Get the current date and time"""
        return datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    def process_query(self, query: str, use_wikipedia: bool = False,
                      calculate_expr: str = None) -> str:
        """Process a user query with optional external tools"""
        relevant_docs = self.retriever.invoke(query)
        context = self._format_docs(relevant_docs)

        tool_results = []
        if use_wikipedia:
            wiki_keywords = self._extract_keywords(query)
            if wiki_keywords:
                wiki_result = self.wikipedia_search(wiki_keywords)
                tool_results.append(f"Wikipedia Search: {wiki_result}")
        if calculate_expr:
            calc_result = self.calculate(calculate_expr)
            tool_results.append(f"Calculation: {calc_result}")
        tool_results_str = "\n".join(tool_results) if tool_results else "No external tools used"

        chain = (
            {
                "context": lambda x: context,
                "tool_results": lambda x: tool_results_str,
                "current_date": lambda x: self._get_current_date(),
                "query": RunnablePassthrough()
            }
            | self.agent_prompt
            | self.llm
            | StrOutputParser()
        )
        return chain.invoke(query)
    def _extract_keywords(self, query: str) -> str:
        """Extract keywords for a Wikipedia search"""
        important_words = []
        stop_words = {'what', 'how', 'why', 'when', 'where', 'is', 'are', 'the', 'a', 'an'}
        words = query.lower().split()
        for word in words:
            if word not in stop_words and len(word) > 3:
                important_words.append(word)
        return ' '.join(important_words[:3])
    def interactive_session(self):
        """Run an interactive session with the agent"""
        print("🤖 Advanced Nebius AI Agent Ready!")
        print("Features: Knowledge retrieval, Wikipedia search, calculations")
        print("Commands: 'wiki:' for Wikipedia, 'calc:' for math")
        print("Type 'quit' to exit\n")

        while True:
            user_input = input("You: ").strip()
            if user_input.lower() == 'quit':
                print("Goodbye!")
                break

            use_wiki = False
            calc_expr = None
            if user_input.startswith('wiki:'):
                use_wiki = True
                user_input = user_input[5:].strip()
            elif user_input.startswith('calc:'):
                parts = user_input.split(':', 1)
                if len(parts) == 2:
                    calc_expr = parts[1].strip()
                    user_input = f"Calculate {calc_expr}"

            try:
                response = self.process_query(user_input, use_wiki, calc_expr)
                print(f"\n🤖 Agent: {response}\n")
            except Exception as e:
                print(f"Error: {e}\n")
The core of the implementation is encapsulated within the AdvancedNebiusAgent class, which orchestrates reasoning, retrieval, and tool integration. It initializes a high-performance LLM from Nebius (meta-llama/Llama-3.3-70B-Instruct-fast) and sets up a semantic retriever over the embedded documents, forming a mini knowledge base that covers topics such as AI, quantum computing, blockchain, and more. A dynamic prompt template guides the agent's responses by including retrieved context, external tool outputs, and the current date. Two built-in tools, wikipedia_search and calculate, extend the agent's functionality by providing access to external encyclopedic information and safe arithmetic computation, respectively. The process_query method brings it all together, dynamically invoking the prompt chain with context, tools, and reasoning to generate informative, multi-source answers. An optional interactive session enables real-time conversations with the agent, recognizing special prefixes such as wiki: or calc: to activate external tool support; a short usage sketch follows below.
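To make that control flow concrete, here is a small usage sketch built only from the class defined above; it mirrors what the interactive loop does when it sees the wiki: and calc: prefixes.

agent = AdvancedNebiusAgent()

# Knowledge-base-only answer: the retriever supplies context, no external tools.
print(agent.process_query("How does CRISPR-Cas9 work?"))

# Equivalent of typing "wiki: blockchain in healthcare" in the interactive session.
print(agent.process_query("blockchain in healthcare", use_wikipedia=True))

# Equivalent of typing "calc: (20 * 1.25) - 20" in the interactive session.
print(agent.process_query("Calculate (20 * 1.25) - 20", calculate_expr="(20 * 1.25) - 20"))

# Or hand control over to the REPL-style loop itself.
# agent.interactive_session()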
if __name__ == "__main__":
    agent = AdvancedNebiusAgent()

    demo_queries = [
        "What is artificial intelligence and how is it being used?",
        "Tell me about quantum computing companies",
        "How does climate change affect renewable energy adoption?"
    ]

    print("=== Nebius AI Agent Demo ===\n")
    for i, query in enumerate(demo_queries, 1):
        print(f"Demo {i}: {query}")
        response = agent.process_query(query)
        print(f"Response: {response}\n")
        print("-" * 50)

    print("\nDemo with Wikipedia:")
    response_with_wiki = agent.process_query(
        "What are the latest developments in space exploration?",
        use_wikipedia=True
    )
    print(f"Response: {response_with_wiki}\n")

    print("Demo with calculation:")
    response_with_calc = agent.process_query(
        "If solar panel efficiency improved by 25%, what would be the new efficiency if current is 20%?",
        calculate_expr="20 * 1.25"
    )
    print(f"Response: {response_with_calc}\n")
Finally, we showcase the agent's capabilities through a set of demo queries. The script begins by instantiating the AdvancedNebiusAgent, followed by a loop that processes predefined prompts related to AI, quantum computing, and climate change, demonstrating the retrieval functionality. It then performs a Wikipedia-enhanced query about space exploration, using real-time external information to supplement the knowledge base. Lastly, it runs a mathematical scenario involving solar panel efficiency to exercise the calculation tool. These demos collectively illustrate how Nebius, combined with LangChain and well-structured prompts, enables intelligent, multi-source query handling in a real-world assistant.
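Because the knowledge base is just a list of Document objects, the agent is straightforward to extend. The sketch below is illustrative rather than part of the original code: the edge-computing document and its metadata are invented for the example, and it simply reuses the NebiusRetriever constructor arguments already seen in __init__ to rebuild the retriever after appending a new entry.

# Illustrative extension: add a new topic and rebuild the retriever so the
# new document becomes searchable alongside the existing ones.
new_doc = Document(
    page_content="Edge computing moves computation closer to data sources, "
                 "reducing latency for IoT and real-time analytics workloads.",
    metadata={"topic": "edge_computing", "category": "technology"},
)

agent.knowledge_base.append(new_doc)
agent.retriever = NebiusRetriever(
    embeddings=agent.embeddings,
    docs=agent.knowledge_base,
    k=3,
)
print(agent.process_query("What is edge computing used for?"))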
In conclusion, this Nebius-powered agent exemplifies how to effectively combine LLM-driven reasoning with structured retrieval and external tool usage to build a capable, context-aware assistant. By integrating LangChain with the Nebius APIs, the agent accesses a curated knowledge base, fetches live data from Wikipedia, and handles arithmetic operations with safety checks. The tutorial's modular architecture, featuring prompt templates, dynamic chaining, and customizable inputs, provides a solid blueprint for developers seeking to create intelligent systems that go beyond static large language model (LLM) responses.