RustCloud is a Rust library that hides the differences between APIs provided by various cloud providers (AWS, GCP, Azure, and more), letting you manage cloud resources through a single, consistent interface.
Note: This is the Rust port of the original gocloud library. The install instructions below are for Rust/Cargo — ignore any Go references you may see in older branches.
- Overview
- Service Types
- Supported Providers
- Getting Started
- Credential Setup
- Usage
- LLM Provider Abstraction
- Development
- Running Tests
- Contributing
## Overview

The core idea is straightforward: you should be able to switch between AWS and GCP (or add a new provider entirely) without rewriting your application logic. RustCloud defines traits for each service category, and each cloud provider implements those traits.
```text
      Your application code
                │
                ▼
     ┌──────────────────────┐
     │   RustCloud Traits   │  ← unified API surface
     └──────────┬───────────┘
                │
         ┌──────┼──────┐
         ▼      ▼      ▼
        AWS    GCP   Azure ...
```
All I/O is async (backed by Tokio), and errors are returned as the CloudError enum so you can match on them precisely.
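To make the trait-per-category pattern concrete, here is a minimal, self-contained sketch. The `Compute` trait, the error variants, and the provider structs below are illustrative stand-ins, not RustCloud's actual types: the real traits are async and the real `CloudError` has its own variants.

```rust
use std::fmt;

// Illustrative error enum -- RustCloud's real CloudError has different variants.
#[derive(Debug)]
enum CloudError {
    AuthFailed(String),
    NotFound(String),
}

impl fmt::Display for CloudError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            CloudError::AuthFailed(msg) => write!(f, "auth failed: {msg}"),
            CloudError::NotFound(id) => write!(f, "not found: {id}"),
        }
    }
}

// One trait per service category; each provider implements it.
// (RustCloud's actual traits are async; this sketch is sync for brevity.)
trait Compute {
    fn create_instance(&self, image: &str) -> Result<String, CloudError>;
}

struct Aws;
struct Gcp;

impl Compute for Aws {
    fn create_instance(&self, image: &str) -> Result<String, CloudError> {
        Ok(format!("aws-instance-from-{image}"))
    }
}

impl Compute for Gcp {
    fn create_instance(&self, image: &str) -> Result<String, CloudError> {
        Ok(format!("gcp-instance-from-{image}"))
    }
}

// Application logic depends only on the trait, so providers are swappable
// and errors can be matched on precisely.
fn provision(provider: &dyn Compute) -> String {
    match provider.create_instance("base-image") {
        Ok(id) => id,
        Err(e) => panic!("provisioning failed: {e}"),
    }
}

fn main() {
    println!("{}", provision(&Aws));
    println!("{}", provision(&Gcp));
}
```

Swapping `&Aws` for `&Gcp` changes the backend without touching `provision`, which is the whole point of the design.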
## Service Types

| Type | Description |
|---|---|
| Compute | Manage virtual machines and cloud servers |
| Container | Deploy and manage containerized workloads |
| Database | Interact with managed database services |
| Storage | Object storage, block storage, and archival |
| Network | Load balancers and DNS management |
| Security | Identity, access management, and key management |
| AI/ML | Machine learning and LLM provider abstractions |
## Supported Providers

### AWS

| Category | Service |
|---|---|
| Compute | EC2, ECS, EKS |
| Database | DynamoDB |
| Management | CloudWatch |
| Network | Route53, Elastic Load Balancing |
| Security | IAM, KMS |
| Storage | S3, Glacier, Block Storage |
Examples: `examples/aws/`
### GCP

| Category | Service |
|---|---|
| AI / ML | AutoML |
| App Services | Cloud Pub/Sub |
| Compute | Compute Engine, GKE |
| Database | Bigtable, BigQuery |
| Network | Cloud DNS, Load Balancing |
| Storage | Cloud Storage |
Examples: `examples/gcp/`
### Azure

| Category | Service |
|---|---|
| Auth | Azure authentication |
| Storage | Blob Storage |
### DigitalOcean

| Category | Service |
|---|---|
| Compute | Droplets |
| Network | Load Balancer |
| DNS | DigitalOcean DNS |
| Storage | Block Storage |
## Getting Started

You need a working Rust toolchain. If you don't have one:
```sh
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```

This installs rustup, cargo, and the stable Rust compiler. The project requires the Rust 2021 edition.
RustCloud is not yet published to crates.io, so reference it directly from the repository:
```toml
[dependencies]
# Cargo locates the `rustcloud` package inside the repository by name;
# git dependencies have no `subdirectory` key.
rustcloud = { git = "https://github.com/c2siorg/RustCloud" }
tokio = { version = "1", features = ["full"] }
```

Alternatively, clone and build locally:

```sh
git clone https://github.com/c2siorg/RustCloud
cd RustCloud/rustcloud
cargo build
```

## Credential Setup

RustCloud uses the standard credential mechanisms for each provider, so you don't need any custom config file format.
### AWS

The AWS SDK for Rust uses the same credential chain as the AWS CLI. The easiest options are environment variables or the shared credentials file.
Environment variables:

```sh
export AWS_ACCESS_KEY_ID="your-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export AWS_DEFAULT_REGION="us-east-1"
```

Or a credentials file at `~/.aws/credentials`:

```ini
[default]
aws_access_key_id = your-key-id
aws_secret_access_key = your-secret-key
```

### GCP

GCP uses Application Default Credentials (ADC). Point the environment variable at your service account key file:

```sh
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"
```

You can download a service account key from the GCP Console. If you're running locally with the gcloud CLI installed, `gcloud auth application-default login` works too.
### Azure

```sh
export AZURE_CLIENT_ID="your-client-id"
export AZURE_CLIENT_SECRET="your-client-secret"
export AZURE_TENANT_ID="your-tenant-id"
```

### DigitalOcean

```sh
export DIGITALOCEAN_TOKEN="your-token"
```

## Usage

All operations are async and return `Result<_, CloudError>`. A minimal example using the AWS EC2 module:
```rust
use aws_config::meta::region::RegionProviderChain;
use aws_sdk_ec2::Client;
use rustcloud::aws::aws_apis::compute::aws_ec2;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let region = RegionProviderChain::default_provider().or_else("us-east-1");
    let config = aws_config::from_env().region(region).load().await;
    let client = Client::new(&config);

    let instance_id = aws_ec2::create_instance(&client, "ami-0abcdef1234567890").await?;
    println!("Created instance: {}", instance_id);
    Ok(())
}
```

For more complete examples, see the `examples/` directory — each service has its own markdown file with copy-pasteable snippets.
## LLM Provider Abstraction

One of the newer additions to RustCloud is a unified interface for interacting with large language model (LLM) providers. The goal is the same as the rest of the library: write your AI code once, swap the backend provider without touching application logic.
```rust
use rustcloud::traits::llm_provider::LlmProvider;
use rustcloud::types::llm::{LlmRequest, ModelRef, Message};

// Any provider that implements LlmProvider can be used here
async fn ask(provider: &dyn LlmProvider, question: &str) {
    let req = LlmRequest {
        model: ModelRef::Logical {
            family: "gemini".to_string(),
            tier: Some("pro".to_string()),
        },
        messages: vec![Message {
            role: "user".to_string(),
            content: question.to_string(),
        }],
        max_tokens: Some(512),
        temperature: Some(0.7),
        system_prompt: None,
    };
    let response = provider.generate(req).await.unwrap();
    println!("{}", response.text);
}
```

| Method | Description |
|---|---|
| `generate` | Standard text generation (request/response) |
| `stream` | Streaming generation via async `Stream` |
| `embed` | Get embeddings for a list of texts |
| `generate_with_tools` | Text generation with tool/function calling |
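To see how the abstraction plays out end to end, here is a self-contained sketch with a mock backend. The trait and types below are simplified stand-ins inferred from the snippet above (sync instead of async, a single prompt string instead of a message list, `generate` only); RustCloud's actual `LlmProvider` trait is async and richer.

```rust
// Simplified stand-ins for rustcloud's LLM types (the real ones are async).
#[derive(Debug, Clone)]
enum ModelRef {
    Provider(String),
    Logical { family: String, tier: Option<String> },
    Deployment(String),
}

struct LlmRequest {
    model: ModelRef,
    prompt: String,
}

struct LlmResponse {
    text: String,
}

trait LlmProvider {
    fn generate(&self, req: LlmRequest) -> Result<LlmResponse, String>;
}

// A mock backend: echoes the prompt, resolving logical model refs itself.
struct MockProvider;

impl LlmProvider for MockProvider {
    fn generate(&self, req: LlmRequest) -> Result<LlmResponse, String> {
        let model = match req.model {
            ModelRef::Provider(id) => id,
            ModelRef::Logical { family, tier } => {
                format!("{family}-{}", tier.unwrap_or_else(|| "default".into()))
            }
            ModelRef::Deployment(name) => format!("deployment:{name}"),
        };
        Ok(LlmResponse {
            text: format!("[{model}] {}", req.prompt),
        })
    }
}

// Calling code depends only on the trait, so backends are swappable.
fn ask(provider: &dyn LlmProvider, question: &str) -> String {
    let req = LlmRequest {
        model: ModelRef::Logical { family: "gemini".into(), tier: Some("pro".into()) },
        prompt: question.to_string(),
    };
    provider.generate(req).unwrap().text
}

fn main() {
    println!("{}", ask(&MockProvider, "hello"));
}
```

Because `ask` only sees `&dyn LlmProvider`, a real Vertex AI or Bedrock backend could replace `MockProvider` without changing the calling code.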
Instead of embedding provider-specific model IDs everywhere, `ModelRef` gives you three options:
```rust
// Reference a specific provider model ID (e.g., for Bedrock, Vertex AI)
ModelRef::Provider("anthropic.claude-3-sonnet-20240229-v1:0".to_string())

// Reference a model logically — the provider implementation resolves this
ModelRef::Logical { family: "claude".to_string(), tier: Some("sonnet".to_string()) }

// Reference a named deployment (e.g., Azure OpenAI deployments)
ModelRef::Deployment("my-gpt4-deployment".to_string())
```

This abstraction is what makes it practical to target Vertex AI, AWS Bedrock, and Azure OpenAI from the same calling code.
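One way a backend implementation might resolve a logical reference is a plain lookup table. This is an illustrative sketch, not RustCloud's actual resolution logic; the `Resolver` type is hypothetical, and the only real model ID is the Bedrock Claude example quoted above.

```rust
use std::collections::HashMap;

// Hypothetical resolver: maps a (family, tier) pair to a backend-specific model ID.
struct Resolver {
    table: HashMap<(String, String), String>,
}

impl Resolver {
    // Example table for a Bedrock-style backend; check the backend's
    // model catalog for the real, current IDs.
    fn bedrock_example() -> Self {
        let mut table = HashMap::new();
        table.insert(
            ("claude".into(), "sonnet".into()),
            "anthropic.claude-3-sonnet-20240229-v1:0".into(),
        );
        Resolver { table }
    }

    // Returns None when no mapping exists, so callers can surface
    // an "unknown model" error instead of guessing.
    fn resolve(&self, family: &str, tier: &str) -> Option<&str> {
        self.table
            .get(&(family.to_string(), tier.to_string()))
            .map(String::as_str)
    }
}

fn main() {
    let r = Resolver::bedrock_example();
    println!("{:?}", r.resolve("claude", "sonnet"));
}
```

Each backend would carry its own table, which is what lets the same `ModelRef::Logical` value mean different concrete models on Bedrock, Vertex AI, or Azure OpenAI.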
The GSoC 2026 project is extending this by adding concrete implementations for BigQuery, Vertex AI, AWS Bedrock GenAI, and Azure OpenAI. If you're interested in contributing, see issue #36 and the #rust-cloud Slack channel.
## Development

```sh
git clone https://github.com/c2siorg/RustCloud
cd RustCloud/rustcloud
cargo build
```

Before submitting a PR, run the formatter and linter:

```sh
cargo fmt
cargo clippy -- -D warnings
```

See CONTRIBUTING.md for the full contribution guide.
## Running Tests

```sh
cd RustCloud/rustcloud
cargo test
```

To run tests for a specific provider:

```sh
cargo test aws   # all AWS tests
cargo test gcp   # all GCP tests
```

Important: Integration tests run against real cloud accounts and create live infrastructure. Make sure you clean up any instances, storage buckets, load balancers, and DNS records after running them — check each provider's console.
GCP note: Some GCP tests currently have a known compilation issue with struct initialization (see #14). Unit tests and AWS tests compile and run correctly.
## Contributing

Contributions are welcome. A few things to keep in mind:
- Comment on an issue before starting work — it avoids duplicate effort
- Keep PRs focused; one logical change per PR is easier to review
- Run `cargo fmt` and `cargo clippy` before pushing
- Add tests for new functionality
For details, see CONTRIBUTING.md. To discuss ideas or ask questions, join the #rust-cloud channel on c2si.slack.com.
## License

Apache 2.0 — see LICENSE.