A browser-based experiment in generating and shaping sound with JavaScript and the Web Audio API. The UI is a canvas with Bézier-style control lines; the drawn path is sampled into an audio buffer for playback.
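The core idea (sampling a drawn curve into audio data) can be sketched roughly as follows. This is an illustrative example, not the code this repo actually uses: the control points, function names, and one-cycle buffer are all assumptions.

```javascript
// Sketch: sample a cubic Bézier curve into a Float32Array, the way a drawn
// path can be turned into Web Audio sample data. Names and control points
// here are illustrative, not taken from this repo.

// Evaluate a cubic Bézier (closed-form de Casteljau) at parameter t in [0, 1].
function cubicBezier(p0, p1, p2, p3, t) {
  const u = 1 - t;
  return u * u * u * p0 + 3 * u * u * t * p1 + 3 * u * t * t * p2 + t * t * t * p3;
}

// Sample the curve's y values into an audio-rate buffer, clamped to [-1, 1]
// (the valid range for Web Audio sample data).
function sampleCurveToBuffer(points, length) {
  const data = new Float32Array(length);
  const [p0, p1, p2, p3] = points;
  for (let i = 0; i < length; i++) {
    const t = i / (length - 1);
    data[i] = Math.max(-1, Math.min(1, cubicBezier(p0, p1, p2, p3, t)));
  }
  return data;
}

// One cycle of a curve that swings from -1 up toward +1 and back.
const cycle = sampleCurveToBuffer([-1, 1, 1, -1], 128);

// In a browser, the samples would then be copied into an AudioBuffer:
//   const ctx = new AudioContext();
//   const buf = ctx.createBuffer(1, cycle.length, ctx.sampleRate);
//   buf.copyToChannel(cycle, 0);
//   const src = ctx.createBufferSource();
//   src.buffer = buf;
//   src.connect(ctx.destination);
//   src.start();
```

Clamping matters because Web Audio expects samples in [-1, 1]; a curve drawn on a canvas can overshoot that range once mapped from pixel coordinates.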
Open `snd.html` in a browser. If you use WAV loading, serve the project root with any static file server (for example, `python3 -m http.server`) so that relative paths like `../audio/` resolve.
Authoritative documentation lives in the GitHub Wiki for zFlux/soundwave. Edit pages in the wiki UI, or clone the wiki repository (https://github.com/zFlux/soundwave.wiki.git) and push Markdown changes; the github-wiki-setup wiki page covers the clone-and-publish workflow.
| Topic | Wiki page |
|---|---|
| Wiki home (reading order) | Home |
| Product vision, State A vs B, success metric | product-vision-and-state |
| Design decisions, AI notes, v1 scope | design-decisions |
| Canonical terminology | glossary |
| Why “draw the wave” is the wrong grain | design-grain-and-abstraction |
| Useful, playful ways to control sound | fun-useful-sound-control |
| Planned tabbed workflow (prompt → edit → arrange) | staged-workflow-design |
| How this codebase maps drawing to audio | how-soundwave-works |
| Wiki setup and relationship to this repo | github-wiki-setup |