Webarmonium is a new instrument.
Traditional instruments translate a performer's physical gestures into sound. Webarmonium replaces the individual performer's gesture with two forms of distributed interaction:
1. Network activity as unconscious performance
The landing page connects to three real-time data streams—Wikipedia edits, HackerNews posts, and GitHub commits. Each event triggers a note: its position determines frequency, its source determines timbre. The resulting composition emerges from collective human activity, with no single author.
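The event-to-note mapping above can be sketched as two pure functions. This is an illustrative sketch, not the project's actual code: the type names, the pentatonic scale, the three-octave range, and the source-to-waveform assignment are all assumptions.

```typescript
// Hypothetical sketch of "position determines frequency, source determines timbre".
type Source = "wikipedia" | "hackernews" | "github";
type Timbre = "sine" | "sawtooth" | "square";

interface StreamEvent {
  source: Source;
  position: number; // normalized to [0, 1], e.g. derived from the event payload
}

// Map normalized position onto a major-pentatonic scale spanning three octaves.
const SCALE_SEMITONES = [0, 2, 4, 7, 9];
const BASE_FREQ = 220; // A3

function eventToFrequency(ev: StreamEvent): number {
  const degree = Math.floor(ev.position * SCALE_SEMITONES.length * 3);
  const octave = Math.floor(degree / SCALE_SEMITONES.length);
  const semitone = SCALE_SEMITONES[degree % SCALE_SEMITONES.length] + 12 * octave;
  return BASE_FREQ * Math.pow(2, semitone / 12);
}

// Each stream gets a distinct oscillator shape, so sources are audibly separable.
function eventToTimbre(ev: StreamEvent): Timbre {
  switch (ev.source) {
    case "wikipedia":  return "sine";
    case "hackernews": return "sawtooth";
    case "github":     return "square";
  }
}
```

Because both functions are pure, replaying the same event stream reproduces the same music.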
2. Remote motor gestures as collaborative performance
In collaborative rooms, up to 4 users become performers/composers. Tap, hold, and drag gestures generate musical phrases; alternatively, automated gestures or the sequencer can play while you focus on shaping the sound with the synth panel. Additional users can join as listeners—hearing and watching without taking a slot. Co-compose music in an asynchronous, beat-quantized space. The room develops "environmental memory": interaction patterns shape its tonal character over 24 hours.
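Beat quantization is what lets asynchronous gestures from different users land on a shared rhythmic grid. A minimal sketch, assuming a fixed tempo and a sixteenth-note grid (both values are illustrative, not the project's actual settings):

```typescript
// Hypothetical beat quantization for asynchronous co-composition.
const BPM = 120;
const BEAT_MS = 60000 / BPM; // 500 ms per beat at 120 BPM
const GRID_MS = BEAT_MS / 4; // sixteenth-note grid: 125 ms

// Snap a gesture timestamp (ms since the room's shared epoch) to the
// nearest grid line, so every user's phrases align rhythmically.
function quantize(timestampMs: number): number {
  return Math.round(timestampMs / GRID_MS) * GRID_MS;
}
```

A gesture at 130 ms snaps back to 125 ms; one at 190 ms snaps forward to 250 ms.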
The system is fully deterministic: identical input produces identical output. There is no randomness, only the structured translation of human activity into sound.
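One way determinism can hold even for arbitrary stream data is to derive a note's position from a stable hash of the event's identifier rather than from a random number. The FNV-1a hash and the field name below are assumptions for illustration, not the project's documented method:

```typescript
// Hypothetical: deterministic position from an event id via FNV-1a (32-bit).
// Replaying the same event stream always yields the same positions, hence
// the same notes — structured translation, no randomness.
function fnv1a(s: string): number {
  let h = 0x811c9dc5;
  for (let i = 0; i < s.length; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h >>> 0;
}

// Normalize the 32-bit hash into [0, 1).
function positionFor(eventId: string): number {
  return fnv1a(eventId) / 0x100000000;
}
```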