VIBENET RESEARCH LAB

Twelve proofs-of-concept, one signal contract.

VIBEnet is the audible awareness layer between humans, agents, and devices. This lab publishes the twelve proofs-of-concept that demonstrate that claim end to end — from a single agent heard through a browser, to a cross-device mesh that renders one signal into audio, voice, AR, and cognitive gating. Each rung below has a status. Most are planned. Some are in progress. One is shipped.

LIVE ANCHOR

  1. 00:00
    nominal

    Task received. The field stabilizes before the run opens.

  2. 00:04
    advisory

    The main figure enters. Planning becomes audible before the trace gets crowded.

  3. 00:07
    advisory

    The run leaves the model for evidence and carries the same phrase forward.

  4. 00:22
    warning

    Source confidence drops. The score tightens before you inspect the evidence.

  5. 00:27
    handoff

    The phrase holds instead of bluffing. Human attention is requested inside the suspension pocket.

  6. 00:32
    recovery

    The final chorus opens and the field releases without erasing the uncertainty.

shipped: Running in production. Linked demo available.
in-progress: Actively building. Partial demo or prototype.
planned: Scoped, not yet started.
research: Still forming. Platform, ethical, or technical questions open.
  1. 01
    shipped

    VIBEnet Pulse

    I can hear what the agent is doing.

    nominal · advisory · warning · handoff · recovery

    A single autonomous agent runs a task. Every meaningful execution event (planning, tool call, retrieval, warning, handoff, completion) emits a Signal Contract event, which VIBEnet converts into a live ambient audio layer. The visitor listens to the agent work.

    Proves agent awareness by ear. A cold visitor should be able to close their eyes for twenty seconds and describe the arc of the task afterward. This rung is the primitive. Every higher rung builds on it.

    browser · tone.js · core product
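
The event-to-signal mapping this rung describes can be sketched in a few lines. The event names and channel assignments below are illustrative assumptions, not the canonical contract:

```typescript
// Hypothetical agent lifecycle events for this rung (names are illustrative).
type AgentEvent =
  | "planning" | "tool_call" | "retrieval"
  | "warning" | "handoff" | "completion";

// The public semantic channels named across the rungs.
type Channel =
  | "nominal" | "advisory" | "warning"
  | "critical" | "recovery" | "handoff" | "opportunity";

// Map each execution event onto the channel a renderer should play.
// These assignments are assumptions for the sketch.
function channelFor(event: AgentEvent): Channel {
  switch (event) {
    case "planning":
    case "retrieval":  return "advisory";
    case "tool_call":  return "nominal";
    case "warning":    return "warning";
    case "handoff":    return "handoff";
    case "completion": return "recovery";
  }
}
```

Everything downstream of this function is rendering: the browser layer only ever sees channels, never raw agent internals.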
  2. 02
    planned

    Sonic Trace Replay

    Listen to an agent run after the fact.

    nominal · advisory · warning · recovery

    Take one completed agent trace and render it as a thirty-second audio replay with timestamped event markers. Planning, tool calls, retries, cost spikes, errors, recovery, final answer — all become motifs played back in compressed time.

    Proves that operational logs can become auditable operational memory. The wedge is "Datadog trace, but audible" without overclaiming realtime observability.

    browser · serpradio trace · enterprise demo
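
The core of the replay is time compression: real trace timestamps squeezed into a fixed window while relative order and spacing survive. A minimal sketch, with an assumed event shape:

```typescript
// One timestamped event from a completed agent trace (illustrative shape).
interface TraceEvent { tMs: number; label: string; }

// Linearly compress a full trace into a fixed replay window (default 30 s),
// preserving relative timing so motifs land in the original order.
function toReplayTimeline(
  events: TraceEvent[],
  replayMs = 30_000,
): TraceEvent[] {
  if (events.length === 0) return [];
  const t0 = events[0].tMs;
  const span = Math.max(events[events.length - 1].tMs - t0, 1);
  return events.map(e => ({
    label: e.label,
    tMs: Math.round(((e.tMs - t0) / span) * replayMs),
  }));
}
```

A two-minute run becomes thirty seconds of audio; a retry storm that took ten seconds of wall time still occupies its proportional slice of the replay.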
  3. 03
    planned

    Audible Contract

    Agent compliance becomes hearable.

    nominal · warning · critical · handoff

    Define a small agent contract. Each rule has an audio consequence. Approved tool use sounds stable. Expensive model calls add a cost shimmer. Retry loops raise tension. Route or data-boundary violations trigger a warning interval. Human-approval requests fire a call-and-response motif. Clean completion resolves in cadence.

    Proves that compliance monitoring can be continuous and sonic, not only log-based and retrospective. Maps directly to the Constitutional CMS governance framework.

    browser · contract schema · governance
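
The rule-to-sound mapping above is essentially a lookup table. A sketch with illustrative rule names and consequence labels (the real contract schema would define both):

```typescript
// Contract rules and their audio consequences, as described above.
// Rule names and consequence labels are illustrative assumptions.
type RuleEvent =
  | "approved_tool_use" | "expensive_model_call" | "retry_loop"
  | "boundary_violation" | "approval_request" | "clean_completion";

const audioConsequence: Record<RuleEvent, string> = {
  approved_tool_use:    "stable drone",
  expensive_model_call: "cost shimmer",
  retry_loop:           "rising tension",
  boundary_violation:   "warning interval",
  approval_request:     "call-and-response motif",
  clean_completion:     "cadential resolution",
};
```

Because the table is total over the rule set, every contract event has exactly one sonic consequence, which is what makes the monitoring continuous rather than retrospective.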
  4. 04
    in-progress

    Human-Agent Call and Response

    The human talks, the agent answers, the system sings underneath.

    advisory · warning · handoff

    The human speaks through a phone or glasses microphone: what are you doing, why did tension rise, pause, summarize the last sixty seconds, escalate only if it gets worse. The agent responds in voice. Underneath both voices, ambient VIBEnet audio continues, keeping the human oriented to system state across the whole conversation.

    Proves that voice is the explicit channel and VIBEnet is the ambient channel. Those are different jobs. Speech says things; ambient audio maintains awareness.

    browser · websocket · voice agent
  5. 05
    planned

    Ray-Ban Audio Companion

    Glasses become agent-awareness earbuds.

    nominal · advisory · warning · handoff

    The phone runs a companion app or PWA. Ray-Ban Meta glasses act as open-ear audio output and microphone input over Bluetooth. The human walks, cooks, commutes, codes, or sits at the piano while agent state plays continuously in open air. A route anomaly triggers a motif shift. The human asks what changed. The agent answers.

    Proves that post-screen agent awareness works with consumer glasses today, before any display SDK access. This is likely the fastest glasses demo available.

    ray-ban meta · ray-ban display · oakley meta · phone bluetooth
  6. 06
    research

    Meta POV Agent

    The glasses camera gives the agent eyes.

    advisory · warning · handoff

    Use the Meta Wearables Device Access Toolkit to let a mobile companion app receive camera context from compatible AI glasses, then forward that context to a VIBEnet agent. The agent sees what the wearer sees — a laptop dashboard, an airport departures board, a printed meeting agenda, a piano session — and converts visible complexity into sonic summary.

    Proves VIBEnet can become a wearable perception layer, not only a website. Safe first demos target objects, documents, dashboards, and context. Face recognition and bystander identification are explicitly excluded.

    meta WDAT · camera · mobile companion
  7. 07
    planned

    Snap Spectacles VET Lens

    Agent state becomes AR atmosphere.

    nominal · advisory · warning · recovery · handoff

    A Lens Studio experience in which VET state renders as spatial AR above the wearer's workspace. Low tension shows as a soft stable aura. Higher energy speeds particle motion. Warning pulses amber. Completion expands a ring. Agent handoffs travel as arcs between nodes. In multi-agent rooms, each agent has a floating sonic and visual identity.

    Proves VIBEnet renders across audio and spatial visuals together. First demo: one floating agent orb above a laptop, changing color, motion, and sound with live agent state.

    snap spectacles · lens studio · snap cloud
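
The orb behavior described above is a pure mapping from state to visual parameters. A minimal sketch, assuming a VET-style state vector with fields in [0, 1] and illustrative constants:

```typescript
// A VET-style state vector (valence/energy/tension, assumed in [0, 1]),
// matching the flat renderer-facing fields named on the protocol page.
interface VetState { valence: number; energy: number; tension: number; }

// Illustrative mapping from VET state to AR orb parameters:
// particle speed follows energy, hue drifts from green toward amber
// as tension rises, and high tension turns on the warning pulse.
function orbParams(s: VetState) {
  return {
    particleSpeed: 0.2 + s.energy * 1.8,      // idle drift up to fast swirl
    hueDeg: Math.round(120 - s.tension * 75), // green (120) toward amber (45)
    pulse: s.tension > 0.7,                   // warning pulses amber
  };
}
```

The same function could back both the Lens Studio renderer and a browser preview, since it touches no device API at all.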
  8. 08
    research

    Crown Cognitive Gate

    The agent learns when not to interrupt.

    advisory · critical · handoff

    Use the Neurosity Crown's focus and calm probability metrics as a soft gate on VIBEnet's interruption density. High focus reduces audio intensity. Low focus permits richer explanatory narration. Low calm routes non-urgent alerts away. Critical agent events break through regardless, with a distinct warning motif.

    Proves that human-agent communication can respect the human nervous system. Framed as cognitive availability sensing, not mind reading. This is the politeness primitive — agents learning when, how, and how loudly to enter human awareness.

    neurosity crown · eeg · cognitive adaptive interface
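
The gating policy above reduces to a small decision function. A sketch with illustrative, uncalibrated thresholds on the focus and calm probabilities:

```typescript
// Soft gate on interruption density from Crown focus/calm probabilities.
// Thresholds are illustrative assumptions, not calibrated values.
type Verdict = "play_now" | "play_quietly" | "defer";

function gate(focus: number, calm: number, urgent: boolean): Verdict {
  if (urgent) return "play_now";          // critical events always break through
  if (focus > 0.7) return "play_quietly"; // high focus: reduce audio intensity
  if (calm < 0.3) return "defer";         // low calm: route non-urgent alerts away
  return "play_now";                      // otherwise richer narration is welcome
}
```

Note the ordering: urgency is checked first, so the critical path can never be suppressed by the politeness logic.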
  9. 09
    research

    Kinesis Agent Control

    Thought gesture as a command layer.

    handoff

    Use Neurosity Kinesis for two or three trained motor-imagery commands. Low-stakes actions only at first: pause or resume sonification, request agent explanation, mark the current moment as important, lower audio density. Every Kinesis action requires audible confirmation before execution.

    Proves human-agent communication does not have to route only through keyboard, mouse, voice, or touchscreen. Sequenced after Cognitive Gate because safer gating must be proven before command execution.

    neurosity crown · kinesis · motor imagery
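
The confirmation requirement is a two-step guard: a detected motor-imagery command is only proposed, and nothing executes until it is audibly confirmed. A minimal sketch with illustrative command names:

```typescript
// Every Kinesis command must be audibly confirmed before it runs.
// Command names are illustrative assumptions for this sketch.
type KinesisCommand =
  | "pause_sonification" | "request_explanation" | "mark_moment";

class ConfirmGate {
  private pending: KinesisCommand | null = null;

  // A detected command is only staged, never executed directly.
  propose(cmd: KinesisCommand): string {
    this.pending = cmd;
    return `confirm: ${cmd}?`; // spoken back to the wearer
  }

  // Returns the command to execute only after confirmation; one-shot.
  confirm(): KinesisCommand | null {
    const cmd = this.pending;
    this.pending = null;
    return cmd;
  }

  cancel(): void { this.pending = null; }
}
```

A false-positive detection therefore costs one spoken prompt, never an unintended action, which is the safety property this rung has to prove before any command layer ships.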
  10. 10
    in-progress

    Human-Origin Hybrid Composer

    One piano note becomes an adaptive agent language.

    nominal · advisory · recovery

    A human plays a motif at the piano — one note, a two-note interval, a phrase, or an improvisation. The system captures MIDI, velocity, timing, sustain, phrasing, and tags the gesture with VET coordinates. AI augmentation then harmonizes, orchestrates, stretches, inverts, layers, and spatializes. The same human-origin motif appears as browser piano, Ray-Ban earcon, Spectacles particle pulse, agent trace cue, Crown-adaptive variation, and live route signal in SERPRadio.

    Proves the system is neither human-only music nor generic AI music. It is human-origin, AI-augmented sonic intelligence. The authorship nuance is the moat.

    jamcorder · human-origin library · composer pipeline · vet tagging
  11. 11
    planned

    Cross-Device Mesh

    One signal, many renderers.

    nominal · advisory · warning · critical · recovery · opportunity · handoff

    A single Signal Contract object describes an entity, an event, a public channel, and six flat renderer-facing fields. That one object flows simultaneously to every connected renderer. Browser shows a trace timeline and live state. Phone brokers voice and WebSocket. Ray-Ban plays open-ear ambient sonification. Meta-compatible glasses add camera context and audio response. Snap Spectacles render spatial AR. Neurosity Crown gates cognitive interruption. Server logs keep an auditable signal trail.

    Proves VIBEnet is a mesh fabric, not a single app. This is the keynote demo.

    signal contract · mesh architecture · multi-device
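
Mechanically, the mesh is a fan-out: one object, many adapters, each rendering it its own way. A sketch with an assumed (simplified) signal shape and a renderer-per-device registry:

```typescript
// One Signal Contract object fanned out to every connected renderer.
// The signal and renderer shapes are simplified sketches, not the schema.
interface Signal { entity: string; event: string; channel: string; }
type Renderer = (s: Signal) => string;

// Each adapter receives the same object and renders it differently.
function fanOut(signal: Signal, renderers: Record<string, Renderer>): string[] {
  return Object.entries(renderers).map(
    ([name, render]) => `${name}: ${render(signal)}`,
  );
}
```

Adding a device means registering one more renderer; the signal itself never changes, which is the property that makes the contract, not any single app, the fabric.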
  12. 12
    in-progress

    SERPRadio Wearable Route Demo

    Flight intelligence you can hear while moving.

    nominal · opportunity · warning

    Put on the Ray-Bans. Say "monitor JFK to London." SERPRadio loads the JFK-LHR corridor. A calm motif plays when the route is stable. A warning motif plays when volatility rises. Ask "is this actionable?" The agent answers in voice. The Crown detects focus state and suppresses non-urgent narration. If Spectacles are worn, a route arc draws across the workspace with a VET-keyed aura.

    Proves VIBEnet is not abstract — it already has a live data domain with 231 NYC-origin routes in production. The first consumer-facing wedge.

    serpradio · ray-ban meta · voice agent · live data
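
The calm/opportunity/warning behavior above is a threshold policy over route state. A sketch assuming a normalized 0-1 volatility score and illustrative cutoffs:

```typescript
// Map route state to the motif the wearer hears.
// The 0-1 volatility scale and the 0.6 cutoff are illustrative assumptions.
type Motif = "calm" | "opportunity" | "warning";

function routeMotif(volatility: number, fareDropped: boolean): Motif {
  if (volatility > 0.6) return "warning"; // volatility rising: warning motif
  if (fareDropped) return "opportunity";  // favorable change: opportunity cue
  return "calm";                          // stable corridor: calm motif
}
```

Warning outranks opportunity by construction, so a fare drop on a destabilizing route still sounds like a warning first.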

The thread that connects them.

Every rung speaks the same Signal Contract. One JSON object carries an entity, an event, a public semantic channel, and six flat renderer-facing state fields. Each device adapter renders that signal differently: a browser shows a trace, glasses play an earcon, Spectacles draw an aura, the Crown gates interruption. The contract is the moat. Ship it open source, let the adapters proliferate, keep the rendering substrate proprietary.

One object carries entity, event, channel, valence, energy, tension, intensity, hue, and pulse. Read the protocol page for the canonical schema and full sample object.
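
For concreteness, the nine fields named above can be sketched as a type plus a sample object. The protocol page holds the canonical schema; the field types and the sample values here are assumptions:

```typescript
// A sketch of the one-object contract. Field types are assumptions;
// the canonical schema lives on the protocol page.
interface SignalContract {
  entity: string;   // who or what emitted the signal
  event: string;    // what just happened
  channel: "nominal" | "advisory" | "warning" | "critical"
         | "recovery" | "opportunity" | "handoff";
  // Six flat renderer-facing state fields, assumed 0-1 unless noted.
  valence: number;
  energy: number;
  tension: number;
  intensity: number;
  hue: number;      // assumed degrees, 0-360
  pulse: number;    // assumed pulses per second
}

// A hypothetical sample object, not the protocol page's canonical one.
const sample: SignalContract = {
  entity: "agent:researcher-1",
  event: "tool_call",
  channel: "advisory",
  valence: 0.6, energy: 0.4, tension: 0.2,
  intensity: 0.5, hue: 210, pulse: 1,
};
```
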

One human-origin motif. One live agent. One signal contract. Many devices.