Scope4Mac


Make 6 FPS feel intentional. Beat-synced AI video tools for slow GPUs (and no GPU at all).

How it started

"Let me try Daydream on my Mac before switching to CUDA" escalated into a full MPS port with some novel features.

Scope runs at 6 FPS on an M2 Max, and only with SD-Turbo. At 6 FPS, every frame matters: you stop chasing generation speed and start controlling what happens between frames. I found it a genuinely interesting creative challenge, so I stuck with it and...

How it's going

This is a fully working fork of the official Scope distribution running natively on Apple Silicon. No CUDA, no cloud, just M2 Max 96GB unified memory and Metal. Many lessons learned and some new toys that will be particularly fun to bring back to Windows.

The ethos behind sticking with the M2 instead of switching to my shiny new 5090 is twofold: constraints boost creativity, and artists on less modern hardware deserve these tools too. If it works on Apple Silicon, it should work on any ancient NVIDIA card, I think.

Uncut 7-minute jam showcasing most of the new features
Dynamic parameter step sequencer, synced to Ableton via Ableton Link. Pre-release version

Beat-synced generation via Ableton Link

Scope4Mac joins the Ableton Link session as a peer — it syncs with Live, Resolume, TouchDesigner, or anything on the network that speaks Link.

16-step parameter sequencer locked to the beat grid at sixteenth-note resolution. Each step carries a value and a random deviation, scaled to the parameter's range. Hardwired tracks for diffusion strength and seed.
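A minimal sketch of how one step might resolve to a concrete value (the function name and signature are mine, not Scope's): the step's base value and random deviation are both fractions of the parameter's range, and the result is clamped back into that range.

```python
import random

def step_value(base: float, deviation: float, lo: float, hi: float,
               rng: random.Random) -> float:
    """Resolve one sequencer step: a base value plus a random deviation,
    both expressed as 0..1 fractions of the parameter's [lo, hi] range."""
    span = hi - lo
    jitter = rng.uniform(-deviation, deviation) * span
    value = lo + base * span + jitter
    return max(lo, min(hi, value))  # clamp to the parameter's range
```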

Dynamic tracks auto-populate from active pipeline nodes — add Bloom to the chain, its parameters appear as sequencer lanes. Remove it, they disappear.

Beat-reactive modulation: seed jumps and strength envelopes on beat boundaries. 60fps playhead via client-side extrapolation from Link's linear timeline model.
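The extrapolation itself is simple because Link's timeline is linear: given the last (beat, timestamp, tempo) snapshot, the beat position at any later instant is a straight line. A sketch, with hypothetical function names:

```python
def extrapolate_beat(beat0: float, t0: float, tempo_bpm: float, now: float) -> float:
    """Extrapolate the current beat from the last (beat, timestamp, tempo)
    snapshot: on Link's linear timeline, beat advances at tempo/60 per second."""
    return beat0 + (now - t0) * tempo_bpm / 60.0

def sixteenth_step(beat: float, steps: int = 16) -> int:
    """Map a beat position onto the 16-step grid at sixteenth-note
    resolution: 4 steps per beat, wrapping every 4 beats."""
    return int(beat * 4) % steps
```

The frontend can call this at 60 fps against `performance.now()`-style timestamps without ever waiting on the backend.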

Who needs high FPS when you have RIFE-varispeed and RIFE-buffered

Beat-gated frame release

Frames buffer into a ring buffer and release on the sixteenth-note grid. A 16-step binary gate pattern controls which subdivisions fire — click to toggle each gate open or closed. Closed gates hold the previous frame. This gives rhythmic control over the visual output: syncopated patterns, half-time feels, stutters. At 6 FPS generation / 107 BPM, the buffer fills in about 0.7 seconds then gates fire precisely on the grid. RIFE after the gate interpolates between the beat-locked frames for smooth output.
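The gate logic can be sketched in a few lines (class and method names are illustrative, not the actual implementation): frames queue into a bounded ring, and each sixteenth-note tick either pops a fresh frame through an open gate or re-emits the held frame through a closed one.

```python
from collections import deque

class BeatGate:
    """Ring buffer of generated frames, released on a 16-step gate pattern.
    Closed gates hold (repeat) the last released frame."""
    def __init__(self, pattern, capacity=16):
        self.pattern = list(pattern)          # 16 booleans, one per sixteenth
        self.buffer = deque(maxlen=capacity)  # oldest frames drop when full
        self.held = None                      # last frame that passed a gate

    def push(self, frame):
        self.buffer.append(frame)

    def release(self, step: int):
        """Called once per sixteenth-note tick with the current step index."""
        if self.pattern[step % len(self.pattern)] and self.buffer:
            self.held = self.buffer.popleft()
        return self.held                      # closed gate: previous frame again
```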

Frame gate sequencer. All Ableton Link modules have an Ableton UI skin.

Prompt timeline

Rebuilt the prompt timeline with DAW-style editing: scissors tool to split clips, hand tool to pan, select tool to drag clips in time. Live prompt tracking — the active block extends to follow playback position. Double-click to edit inline. Scissors and hand tools work during playback. Prompts update live via Enter during streaming.
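The scissors tool reduces to a small operation on the clip data model. A sketch under assumed names (`PromptClip` and `split_clip` are mine): a cut inside a clip yields two clips that share the cut point; a cut outside leaves it alone.

```python
from dataclasses import dataclass, replace

@dataclass
class PromptClip:
    start: float   # seconds on the timeline
    end: float
    text: str

def split_clip(clip: PromptClip, at: float):
    """Scissors tool: split one prompt clip into two at the playhead time.
    Returns the clip unchanged if the cut falls outside it."""
    if not (clip.start < at < clip.end):
        return [clip]
    return [replace(clip, end=at), replace(clip, start=at)]
```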

Cutup prompt sequence

Live performance pipeline

Hot-swap nodes during streaming — add, remove, reorder pre/postprocessors without dropping the WebRTC connection or restarting generation.
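The trick that makes this safe is atomic publication: the render loop only ever sees a complete node chain, never a half-edited one. A minimal sketch of that idea (not Scope's actual pipeline code):

```python
import threading

class Pipeline:
    """Hot-swap sketch: the render loop snapshots an immutable node tuple,
    and edits publish a freshly built tuple under a lock, so no frame
    ever sees a half-modified chain."""
    def __init__(self, nodes):
        self._lock = threading.Lock()
        self._nodes = tuple(nodes)

    def swap(self, nodes):
        rebuilt = tuple(nodes)            # build the new graph off the hot path
        with self._lock:
            self._nodes = rebuilt         # atomic publish; streaming continues

    def process(self, frame):
        with self._lock:
            chain = self._nodes           # snapshot for this frame
        for node in chain:
            frame = node(frame)
        return frame
```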

RIFE-Buffered: 6 FPS diffusion to 50+ FPS interpolated output on M2 Max at 8x multiplier.

6 processing nodes published as standalone Scope plugins:

RIFE-Buffered — frame interpolation, auto/manual mode, 2x-16x

RIFE-Varispeed — adaptive interpolation, MPS-optimised 8x cap

Fast Bloom — GPU glow via bilinear downsample-upsample, zero convolution overhead

Kaleidoscope — N-fold radial symmetry via polar coordinate folding

Feedback — TouchDesigner-style frame buffer with mix and decay

Invert — colour inversion with mix control

All parameters are runtime-controllable, OSC-addressable, and sequenceable.
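As one example of what these nodes do, the kaleidoscope's polar folding boils down to wrapping each pixel's angle into a single wedge and mirroring alternate halves so the seams match. A sketch of the angle math only (the full node maps pixels through polar coordinates on the GPU):

```python
import math

def fold_angle(theta: float, n: int) -> float:
    """N-fold radial symmetry: fold an angle into one wedge of width
    2*pi/n, mirroring the second half of the wedge so edges line up."""
    wedge = 2 * math.pi / n
    theta = theta % wedge
    if theta > wedge / 2:
        theta = wedge - theta   # mirror within the wedge
    return theta
```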

Dynamic parameter surface

The set of controllable parameters is derived from pipeline schemas at runtime. Any node's runtime parameters automatically get a sequencer track in the UI, an OSC address for external control, and type-safe coercion at the WebRTC boundary.
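The coercion step matters because values crossing the WebRTC data channel tend to arrive as strings or JSON numbers. A minimal sketch, with an invented schema-type vocabulary (Scope's real schema format differs):

```python
def coerce(value, schema_type: str):
    """Coerce a value arriving over the data channel into the type a
    node's schema declares. Type names here are illustrative only."""
    if schema_type == "int":
        return int(float(value))    # "8" and 8.0 both become 8
    if schema_type == "float":
        return float(value)
    if schema_type == "bool":
        return value in (True, 1, "1", "true", "True")
    return str(value)
```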

This is the foundation for the next step: Max for Live devices that auto-discover available parameters and drive them with audio analysis.

Architecture

Backend owns the beat clock. Frontend extrapolates for display only. The sequencer is a graph-level controller — it modifies pipeline parameters per-frame from the authoritative beat position, not as a node in the processing chain. Patterns are declarative: the frontend sends the full 16-step pattern, the backend applies the correct step each frame. No per-beat round trips. Hot-swap rebuilds the pipeline graph in ~300ms without tearing down the WebRTC session.
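The declarative half of this can be sketched in one function (names are mine): the backend holds the full 16-step pattern and, each frame, derives the step index from the authoritative beat position and overlays that step's parameter overrides.

```python
def apply_pattern(params: dict, pattern: list, beat: float) -> dict:
    """Backend-side sketch: from the authoritative beat position and the
    full 16-step pattern the frontend sent, pick this frame's step and
    overlay its parameter overrides. No per-beat round trips."""
    step = int(beat * 4) % len(pattern)   # sixteenth-note resolution
    updated = dict(params)
    updated.update(pattern[step])         # each step is a dict of overrides
    return updated
```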

Bugs found in upstream Scope

RIFE depth enum arrives as string "x8" via WebRTC — int("x8") silently falls back to 2x, users get 12 FPS instead of 48.
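The fix is a forgiving parse before the integer conversion. A sketch (function name is mine, not the upstream patch):

```python
def coerce_depth(value, default: int = 2) -> int:
    """Parse a RIFE depth setting that may arrive as 8, "8", or "x8".
    A bare int("x8") raises, triggering the silent 2x fallback;
    stripping the 'x' prefix first keeps the user's chosen multiplier."""
    try:
        return int(str(value).lstrip("xX"))
    except ValueError:
        return default
```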

RIFE FPS measurement feedback loop — processing time inflates the measured interval, auto-picker converges at 40 FPS regardless of input rate.
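One way to break that loop is to timestamp frames at the queue boundary, before any processing runs, and smooth the arrival intervals. A sketch of that idea, not the upstream code:

```python
class InputRateMeter:
    """Measure input FPS from frame *arrival* timestamps, taken before
    processing, so interpolation work cannot inflate the interval."""
    def __init__(self, alpha: float = 0.1):
        self.alpha = alpha          # EMA smoothing factor
        self.last_arrival = None
        self.interval = None        # smoothed seconds between arrivals

    def on_arrival(self, t: float):
        if self.last_arrival is not None:
            dt = t - self.last_arrival
            self.interval = dt if self.interval is None else (
                (1 - self.alpha) * self.interval + self.alpha * dt)
        self.last_arrival = t

    @property
    def fps(self):
        return 1.0 / self.interval if self.interval else None
```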

schemaFieldOverrides infinite render loop — unknown parameter names cause React to re-render endlessly (blank screen).

Plugin init crash — non-builtin pipelines receive base schema kwargs (height, width) that fixed __init__ signatures don't accept.

Seed LFO — fails silently when seed=0, and its counter never resets.

MPS findings

grid_sample with border padding: crashes on MPS, use zeros instead.

grid_sample with float16: produces scanlines on MPS, promote to float32.

int64 tensor ops: fall back to CPU on MPS, use int32 or float math.

Advanced indexing with int64: severe performance hit on MPS, use torch.gather with flat indices.

Design

Dual aesthetic: Mac OS X Aqua (brushed aluminum, traffic lights, pinstripe) for the application chrome, Ableton Live styling (dark LCD, amber accents, green transport) for the creative tools. Users familiar with either recognise the interaction patterns immediately.

Daydream Aqua icon

Daydream iTunes 2001. Now holds up to 200 songs!

Roadmap

Max for Live audio-reactive control — M4L devices auto-discover Scope parameters via REST API, send audio analysis (envelope follower, spectral centroid, onset detection) as OSC to the dynamic parameter surface. Architecture designed, OSC infrastructure complete.

Parameter envelopes — continuous curves alongside step values, interpolated against the local beat clock.

Open skin system — the dual aesthetic suggests swappable themes (Resolume dark, TouchDesigner minimal, system native).

RIFE structural fixes — input FPS at queue boundary, multi-frame preservation, enum coercion. Designed and implemented, reverted for stability, ready for isolated testing.

Links

Scope4Mac — https://github.com/555n/scope4mac

RIFE-Buffered — https://github.com/555n/scope-rife-buffered

RIFE-Varispeed — https://github.com/555n/scope-rife-varispeed

Fast Bloom — https://github.com/555n/scope-fast-bloom

Kaleidoscope — https://github.com/555n/scope-kaleidoscope

Feedback — https://github.com/555n/scope-feedback

Invert — https://github.com/555n/scope-invert

Early build of RIFE-Buffered node, hitting 50fps on M2

Early output = Buffered RIFE + Bloom (some fps eaten up by NDI as I couldn't get Syphon to work)
First iteration of the Aqua OSX 2001 skin
