One engine. Four ways to ship it.

Pick the layer that matches your app. Go straight to the metal with Rust, ship the Apple-native path with Swift, stay productive in Dart, or keep your app in React Native. Same inference core. Same on-device model story.

Rust

The source of truth. Direct access to `ChatEngine`, model configs, streaming, and the full async API when you want Onde without a wrapper.

Swift

The Apple-native path. `onde-swift` installs with Swift Package Manager, flips to a local XCFramework for SDK work, and already ships in Klepon across iPhone, Mac, Apple TV, Vision Pro, and Apple Watch.

Dart

Flutter-first and practical. One plugin, one Dart API, local inference on iPhone and Mac without inventing a backend just to send tokens around.

React Native

JavaScript on top, native engine underneath. Keep your React Native app and still run the model on-device.

Current scope

Today the product is strongest in chat: model loading, history, platform defaults, one-shot generation, and streaming. That is enough to build real apps, not just toy demos.

Quick start

Start with the crate.

Add `onde` via Cargo. If you want to test model downloads or GGUF output before wiring it into app code, use Onde CLI. If you want the bigger argument for local inference, read The Forward Pass.

# via CLI
cargo add onde

# or in Cargo.toml
[dependencies]
onde = "1.1.1"

Usage

Load. Prompt. Ship.

The Rust API stays direct. Load the platform default model, send a message, get a result. No extra service layer. No cloud round-trip pretending to be a feature.

use onde::inference::{ChatEngine, GgufModelConfig};

// Create the engine and load the platform default model,
// with an optional system prompt.
let engine = ChatEngine::new();
engine
    .load_gguf_model(
        GgufModelConfig::platform_default(),
        Some("You are a helpful assistant.".into()),
        None,
    )
    .await?;

// One-shot generation: send a message, print the reply.
let result = engine.send_message("Hello!").await?;
println!("{}", result.text);
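The snippet above is one-shot generation; the engine also supports streaming. A minimal sketch of what a streaming call might look like, assuming a hypothetical `stream_message` method that yields token chunks as an async stream (the method name, chunk shape, and `futures_util` dependency are assumptions for illustration, not the confirmed API):

// Hypothetical streaming sketch -- check the crate docs for the real API.
use futures_util::StreamExt;

let mut stream = engine.stream_message("Tell me a story.").await?;
while let Some(chunk) = stream.next().await {
    // Print each token as it arrives instead of waiting for the full reply.
    print!("{}", chunk?.text);
}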

Coverage

Apple first. Not Apple only.

Onde started with Apple silicon in mind, but the Rust core is wider than that now. Metal on Apple platforms. CPU paths on Android, Linux, and Windows. If you want to watch the network side of Onde, open Onde Inference Pulse.

Platform   Target                                                 Status
macOS      aarch64-apple-darwin                                   Ready
iOS        aarch64-apple-ios / aarch64-apple-ios-sim              Ready
tvOS       aarch64-apple-tvos / aarch64-apple-tvos-sim            Ready
visionOS   aarch64-apple-visionos / aarch64-apple-visionos-sim    Ready
watchOS    aarch64-apple-watchos / aarch64-apple-watchos-sim      Ready
Android    aarch64-linux-android                                  Ready
Linux      x86_64-unknown-linux-gnu / aarch64-unknown-linux-gnu   Ready
Windows    x86_64-pc-windows-msvc                                 Ready