sceneview-android

Introduction: SceneView is a 3D and AR Android Composable and View built on Google Filament and ARCore. It is a Kotlin replacement for Sceneform.

3D & AR for every platform.

Build 3D and AR experiences with the UI frameworks you already know. Same concepts, same simplicity — Android, iOS, Web, Desktop, TV, Flutter, React Native.

Android 3D · Android AR · iOS / macOS / visionOS · sceneview.js · MCP Server · Flutter · React Native


Try the demo apps

See SceneView capabilities in action — install the live demos in one tap:

Get it on Google Play  Download on the App Store  Open the Web Playground

Browse all sample sources in samples/ — Android · iOS · Web · Desktop · TV · Flutter · React Native.

Tip — every demo opens directly via https://sceneview.github.io/open?demo=<id>. For example, …/open?demo=ar-rerun lands straight on the AR Rerun debug screen with a single tap from any QR code or link.


Quick look

// Android — Jetpack Compose
SceneView(modifier = Modifier.fillMaxSize()) {
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }
}
// iOS — SwiftUI
SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz")
        .scaleToUnits(1.0)
}
<!-- Web — friendly DSL (Filament.js engine + SceneView wrapper) -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>
<script> SceneView.modelViewer("canvas", "model.glb") </script>
# Claude — ask AI to build your 3D app
claude mcp add sceneview -- npx sceneview-mcp
# Then ask: "Build me an AR app with tap-to-place furniture"

No engine boilerplate. No lifecycle callbacks. The runtime handles everything.


Platforms

Platform Renderer Framework Status
Android Filament Jetpack Compose Stable
Android TV Filament Compose TV Alpha
iOS / macOS / visionOS RealityKit SwiftUI Alpha
Web Filament.js (WASM) Kotlin/JS + sceneview.js Alpha
Desktop Software renderer Compose Desktop Alpha
Flutter Native per platform PlatformView Alpha
React Native Native per platform Fabric Alpha
Claude / AI MCP Server Stable

Install

Android (3D + AR):

dependencies {
    implementation("io.github.sceneview:sceneview:4.0.9")     // 3D
    implementation("io.github.sceneview:arsceneview:4.0.9")   // AR (includes 3D)
}

iOS / macOS / visionOS (Swift Package Manager):

https://github.com/sceneview/sceneview-swift.git  (from: 4.0.9)

Web (sceneview.js — friendly DSL, two <script> tags):

<!-- 1. Filament.js engine (WASM) -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<!-- 2. SceneView wrapper (exposes SceneView.modelViewer / .create / .startAR) -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>

Web (Kotlin/JS):

dependencies {
    implementation("io.github.sceneview:sceneview-web:4.0.9")
}

Claude Code / Claude Desktop:

claude mcp add sceneview -- npx sceneview-mcp
{ "mcpServers": { "sceneview": { "command": "npx", "args": ["-y", "sceneview-mcp"] } } }

Desktop / Flutter / React Native: see samples/


3D scene

SceneView is a Composable that renders a Filament 3D viewport. Nodes are composables inside it.

SceneView(
    modifier = Modifier.fillMaxSize(),
    engine = rememberEngine(),
    modelLoader = rememberModelLoader(engine),
    environment = rememberEnvironment(engine, "envs/studio.hdr"),
    cameraManipulator = rememberCameraManipulator()
) {
    // Model — async loaded, appears when ready
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }

    // Geometry — procedural shapes
    CubeNode(size = Size(0.2f))
    SphereNode(radius = 0.1f, position = Position(x = 0.5f))

    // Nesting — same as Column { Row { } }
    Node(position = Position(y = 1.0f)) {
        LightNode(apply = { type(LightManager.Type.POINT); intensity(50_000f) })
        CubeNode(size = Size(0.05f))
    }
}
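The scaleToUnits parameter used above normalizes a model so its largest extent matches the requested size in world units. As a rough, library-free sketch of the underlying math (Box3 and the function here are illustrative names, not the SceneView API):

```kotlin
// Illustrative only: how a scaleToUnits-style normalization can be computed.
// Given a model's axis-aligned bounding box, find the uniform scale factor
// that makes its largest extent equal to the requested size in world units.
data class Box3(
    val minX: Float, val minY: Float, val minZ: Float,
    val maxX: Float, val maxY: Float, val maxZ: Float
)

fun scaleToUnits(bounds: Box3, units: Float): Float {
    val extent = maxOf(
        bounds.maxX - bounds.minX,
        bounds.maxY - bounds.minY,
        bounds.maxZ - bounds.minZ
    )
    return if (extent > 0f) units / extent else 1f
}
```

For example, a model 2 m tall with scaleToUnits = 1.0f ends up scaled by 0.5 so it fits in a 1-unit cube.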

Node types — 26+ composables

Category Nodes What they do
Models ModelNode glTF/GLB with skeletal/morph animations. isEditable = true for gestures.
Primitives CubeNode · SphereNode · CylinderNode · ConeNode · TorusNode · CapsuleNode · PlaneNode Procedural geometry, parametric size/segments
Curves & shapes LineNode · PathNode · ShapeNode Single segments, polylines, extruded 2D polygons
Custom geometry GeometryNode · MeshNode Direct Filament IndexBuffer / VertexBuffer
Surfaces ImageNode · VideoNode · BillboardNode PNG/JPG plane, video plane (MediaPlayer), camera-facing sprite
3D text TextNode World-space text label that always faces the camera
Compose-in-3D ViewNode Any Compose UI rendered as a 3D surface — buttons, lists, animations
Lighting LightNode · ReflectionProbeNode · DynamicSkyNode · FogNode Sun/dir/point/spot lights, local IBL, time-of-day sky, atmospheric fog
Physics PhysicsNode Simple rigid-body simulation (gravity, collisions)
Cameras CameraNode · SecondaryCamera Main and picture-in-picture cameras
Group Node Empty pivot for nesting and transform inheritance
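The primitive nodes above generate parametric meshes on the fly. A dependency-free sketch of the kind of vertex grid a UV sphere produces (illustrative math only, not the SceneView implementation):

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin

// Illustrative only: parametric UV-sphere positions, the kind of data a
// SphereNode-style primitive would feed into a vertex buffer. A sphere with
// `stacks` latitude bands and `slices` longitude bands yields a grid of
// (stacks + 1) * (slices + 1) vertices (seams are duplicated for UVs).
fun uvSphere(radius: Float, stacks: Int, slices: Int): List<Triple<Float, Float, Float>> {
    val verts = mutableListOf<Triple<Float, Float, Float>>()
    for (i in 0..stacks) {
        val phi = PI * i / stacks            // 0..π, pole to pole
        for (j in 0..slices) {
            val theta = 2 * PI * j / slices  // 0..2π around the vertical axis
            verts += Triple(
                (radius * sin(phi) * cos(theta)).toFloat(),
                (radius * cos(phi)).toFloat(),
                (radius * sin(phi) * sin(theta)).toFloat()
            )
        }
    }
    return verts
}
```

With 16 stacks and 32 slices this produces 17 × 33 = 561 positions; the segment counts are the "parametric size/segments" knobs the table refers to.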

AR scene

ARSceneView is SceneView with ARCore. The camera follows real-world tracking.

var anchor by remember { mutableStateOf<Anchor?>(null) }

ARSceneView(
    modifier = Modifier.fillMaxSize(),
    planeRenderer = true,
    onSessionUpdated = { _, frame ->
        if (anchor == null) {
            anchor = frame.getUpdatedPlanes()
                .firstOrNull { it.type == Plane.Type.HORIZONTAL_UPWARD_FACING }
                ?.let { frame.createAnchorOrNull(it.centerPose) }
        }
    }
) {
    anchor?.let {
        AnchorNode(anchor = it) {
            ModelNode(modelInstance = helmet, scaleToUnits = 0.5f)
        }
    }
}

Plane detected → anchor set → Compose recomposes → model appears. Clear anchor → node removed. AR state is just Kotlin state.

AR node types

Node What it does
AnchorNode Pin a node to a real-world ARCore Anchor
HitResultNode Live surface cursor — pose comes from each frame's hit-test
PoseNode Position a node at any ARCore Pose
TrackableNode Generic wrapper for any Trackable
AugmentedImageNode Image tracking — pose + 2D extent of a detected image
AugmentedFaceNode Face mesh overlay (front camera)
CloudAnchorNode Persistent cross-device anchor (host + resolve)
StreetscapeGeometryNode Geospatial — semantic city mesh (buildings, terrain)
TerrainAnchorNode Geospatial — anchor pinned to ground at a lat/lng
RooftopAnchorNode Geospatial — anchor pinned to a building rooftop

AR features

Every ARCore feature surfaced as a Compose-friendly API:

Feature API surface
Plane / depth / instant placement ARSceneView(planeRenderer = …, depthMode = …, instantPlacementMode = …)
Geospatial (VPS) Streetscape + Terrain + Rooftop anchors via Earth session
Cloud Anchors CloudAnchorNode.host(ttlDays = N) + .resolve(id)
Augmented Faces & Images AugmentedFaceNode, AugmentedImageDatabase, runtime image add
Image Stabilization (EIS) ARSceneView(imageStabilizationMode = ImageStabilizationMode.EIS)
Camera exposure & focus ARSceneView(cameraConfig = …), ARSceneScope.exposureCompensation
Record & Replay rememberARRecorder() to capture, ARSceneView(playbackDataset = file) to replay 1:1 — debug AR without a phone
Rerun.io live debug rememberRerunBridge() streams poses/planes/clouds to the Rerun viewer + a hosted /rerun/?url=… replay
Permission flow ARPermissionHandler — auto-detected from ComponentActivity

See docs/docs/ar-recording.md, RECORDING_PLAYBACK.md, and the AR Debug — Rerun.io section in llms.txt.


Capabilities

What you can do across all 3D and AR scenes — beyond placing nodes.

Capability What it gives you Where it lives
Gestures Drag, pinch-to-scale, two-finger rotate, elevate, tap. Per-node opt-in via isEditable. NodeGestureDelegate, OnGestureListener
Animations Skeletal/morph from glTF, plus per-node spring/property/smooth-transform. ModelNode.playAnimation(), NodeAnimationDelegate
Physics Rigid-body dynamics — gravity, collisions, impulses. Pure-KMP simulation (no JNI). PhysicsNode, sceneview-core
Collision & raycasting Ray vs Box / Sphere intersections, hit-testing, frustum culling. CollisionSystem, Ray, Box, Sphere
Procedural geometry Generators for cube/sphere/cylinder/cone/torus/capsule, plus extrusion from 2D shapes (Earcut + Delaunator). sceneview-core geometry + triangulation
HDR environment IBL lighting + skybox from .hdr / .ktx. Async load + reactive swap. EnvironmentLoader, rememberEnvironment
Custom materials Filament .filamat materials with parameters, plus built-in unlit / lit / overlay variants. MaterialLoader
Post-processing Bloom, depth of field, SSAO, vignette, color grading, tone mapping. View.bloomOptions, dynamicResolutionOptions, …
Compose UI in 3D Render any @Composable as a textured plane in world space — buttons, lists, animations, all interactive. ViewNode + ViewNode.WindowManager
Multiple cameras Picture-in-picture, mini-map, security-camera views. SecondaryCamera
Reactive scene graph Compose-driven recomposition: change state → tree updates. No imperative parent.addChild(). SceneScope / ARSceneScope DSL
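The collision and raycasting capability reduces to standard ray-primitive math. A minimal, dependency-free sketch of a ray-sphere test (Vec3 and raySphere are illustrative names, not the SceneView CollisionSystem API):

```kotlin
import kotlin.math.sqrt

// Illustrative only: the classic ray-sphere intersection behind hit-testing.
data class Vec3(val x: Float, val y: Float, val z: Float) {
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
    fun dot(o: Vec3) = x * o.x + y * o.y + z * o.z
}

// Returns the distance along the ray to the first hit, or null on a miss.
// `dir` is assumed to be normalized.
fun raySphere(origin: Vec3, dir: Vec3, center: Vec3, radius: Float): Float? {
    val oc = origin - center
    val b = oc.dot(dir)
    val c = oc.dot(oc) - radius * radius
    val disc = b * b - c
    if (disc < 0f) return null       // ray misses the sphere entirely
    val t = -b - sqrt(disc)
    return if (t >= 0f) t else null  // null when the hit is behind the origin
}
```

A ray fired from (0, 0, -5) down +Z hits a unit sphere at the origin at distance 4; shift the ray up by 2 units and it misses.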

Apple (iOS / macOS / visionOS)

Native Swift Package built on RealityKit. 19 node types mirroring the Android API.

SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz").scaleToUnits(1.0)
    GeometryNode.cube(size: 0.1, color: .blue).position(x: 0.5)
    LightNode.directional(intensity: 1000)
}
.cameraControls(.orbit)

AR on iOS:

ARSceneView(planeDetection: .horizontal) { position, arView in
    GeometryNode.cube(size: 0.1, color: .blue)
        .position(position)
}

Nodes available: ModelNode · GeometryNode (cube/sphere/cylinder/cone/torus/capsule/plane) · LightNode · ImageNode · VideoNode · TextNode · ViewNode · BillboardNode · MeshNode · LineNode · PathNode · ShapeNode · PhysicsNode · ReflectionProbeNode · DynamicSkyNode · FogNode · CameraNode · AugmentedImageNode · SceneReconstructionNode (visionOS scene mesh).

Plus the iOS RerunBridge with the same wire format as Android, and a NodeBuilder DSL for declarative composition outside SwiftUI.

Install: https://github.com/sceneview/sceneview-swift.git (SPM, from 4.0.9)


SceneView Web (JavaScript + Kotlin/JS)

The lightest way to add 3D to any website. Two <script> tags, one function call. Friendly DSL (25 KB) powered by Filament.js WASM (210 KB) — the same engine behind Android SceneView.

<!-- 1. Filament.js engine (WASM) -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<!-- 2. SceneView wrapper -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>
<script> SceneView.modelViewer("canvas", "model.glb") </script>

Note: the sceneview-web npm package is the lower-level Kotlin/JS UMD bundle — it expects a Filament global and does not include the friendly SceneView.modelViewer DSL. Use the snippet above for vanilla-JS sites. The npm package is intended for Kotlin/JS or webpack-based projects.

JavaScript API (script-tag):

  • SceneView.modelViewer(canvasOrId, url, options?) — all-in-one viewer with orbit + auto-rotate
  • SceneView.create(canvasOrId, options?) — empty viewer, load model later
  • viewer.loadModel(url) — load/replace glTF/GLB model
  • viewer.setAutoRotate(enabled) — toggle rotation
  • viewer.dispose() — clean up resources

WebXR — AR & VR in the browser

const ar = await SceneView.startAR("canvas", { hitTest: true })   // immersive-ar
const vr = await SceneView.startVR("canvas")                       // immersive-vr
Class Mode Use
ARSceneView immersive-ar Phone passthrough AR with hit-test, anchors, light estimation
VRSceneView immersive-vr Headset VR with controller input, reference spaces
WebXRSession both Low-level frame loop, XRHitTestSource, XRReferenceSpace

Kotlin/JS power-user API

For Kotlin Multiplatform projects, the same engine is exposed as a Kotlin/JS class with an OrbitCameraController, a geometry DSL, and reactive node updates:

implementation("io.github.sceneview:sceneview-web:4.0.9")

Install: npm install sceneview-web or CDN — Landing page · Playground · npm


Use with AI

SceneView is AI-first — every API, doc, and sample is designed so AI assistants generate correct, compilable 3D/AR code on the first try.

MCP Server (Claude, Cursor, Windsurf, etc.)

The official MCP server provides 28 tools, 33 compilable samples, a full API reference, and a code validator:

# Claude Code — one command
claude mcp add sceneview -- npx sceneview-mcp

# Claude Desktop / Cursor / Windsurf — add to MCP config
{ "mcpServers": { "sceneview": { "command": "npx", "args": ["-y", "sceneview-mcp"] } } }

Highlights: generate_scene, debug_issue, search_models (Sketchfab BYOK), analyze_project (audit existing app), validate_code (compile-check before sending), plus per-platform recipes for AR, physics, geometry, and Compose-in-3D.

Claude Code plugin (MCP + slash commands + hooks)

Want the MCP server plus the full SceneView contributor toolkit (one-shot release, review, cross-platform sync, version-bump, etc.) in a single install? Use the SceneView Claude Code marketplace:

/plugin marketplace add sceneview/claude-marketplace
/plugin install sceneview@sceneview

You get:

  • sceneview-mcp server — same as above, started automatically
  • 11 namespaced slash commands: /sceneview:contribute, /sceneview:release, /sceneview:review, /sceneview:test, /sceneview:document, /sceneview:quality-gate, /sceneview:publish-check, /sceneview:sync-check, /sceneview:version-bump, /sceneview:evaluate, /sceneview:maintain
  • Cross-platform reminder hooks — gentle nudges when you edit Android, iOS, Web, or KMP-core APIs to keep the other platforms in sync

Specialty MCP Servers

Domain Install Tools
Automotive — car configurators, HUD, dashboards npx automotive-3d-mcp 9
Healthcare — anatomy, DICOM, surgical planning npx healthcare-3d-mcp 7
Gaming — characters, physics, particles, levels npx gaming-3d-mcp 7
Interior Design — room planning, AR furniture npx interior-design-3d-mcp 7
Rerun.io — AR debug logging, visualization npx rerun-3d-mcp 5

ChatGPT / GitHub Copilot / Other AI

  • llms.txt — Machine-readable API reference at llms.txt (111 KB, 3000+ lines)
  • GitHub Copilot — Custom instructions in .github/copilot-instructions.md
  • Cursor — Rules file at .cursorrules
  • Windsurf — Rules file at .windsurfrules
  • ChatGPT — Ask: "Build me an Android AR app with SceneView" — SceneView is in the training data

Why AI recommends SceneView

  • Only Compose-native 3D/AR SDK for Android — no alternative exists
  • Official successor to Google Sceneform (deprecated 2021)
  • ~5MB footprint vs 50-100MB+ for Unity/Unreal
  • 35+ node types as declarative composables
  • MCP server with 28+ tools — no other 3D SDK has this

Listed on the MCP Registry. See the MCP README for full setup and tool reference.


Developer tools

AR Debug — hosted Rerun viewer

Tap Save & Share in the AR Rerun demo to flush a .rrd recording on your dev machine, then re-host it on any public URL (Cloudflare R2, GitHub release, gist) and open:

https://sceneview.github.io/rerun/?url=<encoded-public-url>

…in any browser to scrub the AR session frame-by-frame. No install, no Rerun viewer needed locally — perfect for attaching a fully-replayable session to a bug report. Powered by @rerun-io/web-viewer under SceneView branding.

See the AR Debug — Rerun.io section in llms.txt for the full architecture (live mode + save mode + control protocol) and the Kotlin API surface (RerunBridge.requestSaveAndShare).

Record & Replay AR sessions

  • Record & Replay AR sessions — capture an outdoor ARCore session once with ARRecorder, replay it 1:1 at the desk via ARSceneView(playbackDataset = file). Pair with the Rerun bridge for record-replay-inspect debugging. See docs/docs/ar-recording.md and the Record & Playback demo.

Architecture

Each platform uses its native renderer. Shared logic lives in KMP.

sceneview-core (Kotlin Multiplatform)
├── math, collision, geometry, physics, animation
│
├── sceneview (Android)      → Filament + Jetpack Compose
├── arsceneview (Android)    → ARCore
├── SceneViewSwift (Apple)   → RealityKit + SwiftUI
├── sceneview-web (Web)      → Filament.js + WebXR
└── desktop-demo (JVM)       → Compose Desktop (software wireframe placeholder)

Samples

Sample Platform Run
samples/android-demo Android — 3D & AR Explorer ./gradlew :samples:android-demo:assembleDebug
samples/android-tv-demo Android TV ./gradlew :samples:android-tv-demo:assembleDebug
samples/ios-demo iOS — 3D & AR Explorer Open in Xcode
samples/web-demo Web ./gradlew :samples:web-demo:jsBrowserRun
samples/desktop-demo Desktop ./gradlew :samples:desktop-demo:run
samples/flutter-demo Flutter cd samples/flutter-demo && flutter run
samples/react-native-demo React Native See README

Support

SceneView is free and open source. Sponsors help keep it maintained across 9 platforms.

Platform Link
:heart: GitHub Sponsors (0% fees) Sponsor on GitHub
:blue_heart: Open Collective (transparent) opencollective.com/sceneview
:star: MCP Pro (unlock all tools) sceneview-mcp.mcp-tools-lab.workers.dev/pricing

See SPONSORS.md for tiers and current sponsors.
