This directory contains example scripts demonstrating how to use the OpenGradient Python SDK for various use cases, from basic model operations to advanced agent integrations.
Before running any examples, ensure you have:
- Installed the SDK:

```shell
pip install opengradient
```

- Set up your credentials: Configure your OpenGradient account using environment variables:
- `OG_PRIVATE_KEY`: Private key funded with Base Sepolia OPG tokens for x402 LLM payments (can be obtained from our faucet).
- `OG_ALPHA_PRIVATE_KEY`: (Optional) Private key funded with OpenGradient testnet gas tokens for Alpha Testnet on-chain inference. Falls back to `OG_PRIVATE_KEY` when not set.
- `OG_MODEL_HUB_EMAIL`: (Optional) Your Model Hub email for model uploads.
- `OG_MODEL_HUB_PASSWORD`: (Optional) Your Model Hub password for model uploads.
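For reference, a typical shell setup might look like this (every value below is a placeholder, not a real credential):

```shell
# Required: pays for x402 LLM inference on Base Sepolia
export OG_PRIVATE_KEY="0x<your-base-sepolia-private-key>"

# Optional: separate key for Alpha Testnet on-chain inference
export OG_ALPHA_PRIVATE_KEY="0x<your-alpha-testnet-private-key>"

# Optional: Model Hub credentials for model uploads
export OG_MODEL_HUB_EMAIL="you@example.com"
export OG_MODEL_HUB_PASSWORD="<your-password>"
```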
You can also use the configuration wizard:
```shell
opengradient config init
```

Creates a new model repository on the Model Hub.
```shell
python examples/create_model_repo.py
```

What it does:
- Creates a new model repository with a name and description
- Returns a model repository object with version information
Uploads a model file to an existing model repository.
```shell
python examples/upload_model_to_hub.py
```

What it does:
- Creates a model repository (or uses an existing one)
- Uploads an ONNX model file to the repository
- Returns the model CID (Content Identifier) for use in inference
Note: Requires Model Hub credentials (`OG_MODEL_HUB_EMAIL` and `OG_MODEL_HUB_PASSWORD`).
Runs a basic LLM chat completion.
```shell
python examples/llm_chat.py
```

What it does:
- Sends a multi-turn conversation to an LLM
- Uses x402 protocol for payment processing
- Returns the model's response
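The multi-turn conversation this example sends is a standard list of role/content messages. A minimal sketch (the contents below are illustrative, not the script's actual prompts):

```python
# A multi-turn conversation: an optional system message first,
# then alternating user/assistant turns.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
    {"role": "user", "content": "And what is its population?"},
]

# This list is passed as-is to the chat call, e.g.
# og_client.llm.chat(model=..., messages=messages).
roles = [m["role"] for m in messages]
print(roles)  # ['system', 'user', 'assistant', 'user']
```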
Runs a streaming LLM chat completion.
```shell
python examples/llm_chat_streaming.py
```

What it does:
- Sends a multi-turn conversation to an LLM with streaming enabled
- Demonstrates real-time token streaming
- Returns chunks as they arrive from the model
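The consumption pattern for streamed output is roughly the loop below; `fake_stream` is a stand-in for the iterator the streaming chat call returns (the real chunk objects are an SDK detail, assumed here to carry plain text):

```python
def fake_stream():
    """Stand-in for the chunk iterator returned by a streaming chat call."""
    for token in ["Hel", "lo", ", ", "world", "!"]:
        yield token

# Print tokens as they arrive, while accumulating the full response.
parts = []
for chunk in fake_stream():
    print(chunk, end="", flush=True)  # real-time display
    parts.append(chunk)

full_response = "".join(parts)
```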
Demonstrates LLM tool/function calling.
```shell
python examples/llm_tool_calling.py
```

What it does:
- Defines a tool (weather lookup) and passes it to the LLM
- The model decides when to invoke tools based on the user's query
- Uses x402 protocol for payment processing
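A tool definition like the weather lookup in this example is typically expressed as an OpenAI-style function schema; the name and parameters below are illustrative, not the script's exact definitions:

```python
# Hypothetical weather-lookup tool in OpenAI-style function-calling format.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name, e.g. 'Paris'",
                },
            },
            "required": ["city"],
        },
    },
}

# The schema is passed alongside the messages, e.g.
# og_client.llm.chat(model=..., messages=..., tools=[get_weather_tool]);
# the model then decides whether to emit a call to it.
```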
Examples for features only available on the Alpha Testnet are located in the alpha/ folder. These include:
- Model inference (`run_inference.py`)
- Embeddings models (`run_embeddings_model.py`)
- Workflow creation and usage (`create_workflow.py`, `use_workflow.py`)
See alpha/README.md for details.
Creates a basic LangChain ReAct agent powered by an OpenGradient LLM.
```shell
python examples/langchain_react_agent.py
```

What it does:
- Uses `og.agents.langchain_adapter` to create a LangChain-compatible LLM
- Sets up a LangGraph ReAct agent with a custom tool
- Demonstrates tool calling via x402 payment processing
Example use case: Building conversational agents with tool access, task automation.
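LangChain derives a tool's schema from a plain Python function's type hints and docstring, so a custom tool for the agent above can start as an ordinary function. The function below is a made-up illustration; in the script it would be wrapped with LangChain's tool decorator and handed to the agent:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the product."""
    # With LangChain, decorating this function (e.g. with @tool) lets the
    # agent call it; the type hints and docstring become the tool's schema.
    return a * b

result = multiply(6, 7)
```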
Chat with digital twins from twin.fun via OpenGradient verifiable inference.
```shell
python examples/twins_chat.py
```

What it does:
- Connects to the Twins API to chat with digital twins of public figures
- Demonstrates multi-turn conversations with different twin personas
- Uses TEE-based verifiable inference for trustworthy responses
Required environment variables:
- `OG_PRIVATE_KEY`: Your Ethereum private key
- `TWINS_API_KEY`: Your Twins API key
Example use case: Building applications that interact with digital twin personas through verified AI inference.
All examples use a similar pattern to initialize the OpenGradient client:
```python
import os

import opengradient as og

og_client = og.Client(
    private_key=os.environ.get("OG_PRIVATE_KEY"),  # Base Sepolia OPG tokens for LLM payments
    alpha_private_key=os.environ.get("OG_ALPHA_PRIVATE_KEY"),  # Optional: OpenGradient testnet tokens for on-chain inference
    email=os.environ.get("OG_MODEL_HUB_EMAIL"),
    password=os.environ.get("OG_MODEL_HUB_PASSWORD"),
)
```

Basic inference pattern:
```python
result = og_client.alpha.infer(
    model_cid="your-model-cid",
    model_input={"input_key": "input_value"},
    inference_mode=og.InferenceMode.VANILLA,
)
print(f"Output: {result.model_output}")
print(f"Tx hash: {result.transaction_hash}")
```

LLM chat pattern:
```python
completion = og_client.llm.chat(
    model=og.TEE_LLM.CLAUDE_HAIKU_4_5,
    messages=[{"role": "user", "content": "Your message"}],
)
print(f"Response: {completion.chat_output['content']}")
```

Browse available models on the OpenGradient Model Hub. Each model has a CID that you can use in your code.
- Run `opengradient --help` for a CLI command reference
- Visit our documentation for detailed guides
- Check the main README for SDK overview