Scriptable C# music engine for realtime playback, instruments, and VST3 hosting. It can also record mics and music, render audio, and cover a full DAW-like workflow if you know what you're doing.

MusicEngine is a C# scripting environment for music. It supports VSTs, sequencers, and patterns without forcing you into any workflow. You keep full control over time and timing. Everything is a tool, nothing is forced, and there are no hidden mathematical clamps. Almost anything can be turned into a variable and modulated.
Quick start:

```csharp
var synth = CreateSynth();
var pattern = CreatePattern(synth);
pattern.LoopLength = 4.0;
pattern.Note(60, 0.0, 0.5, 110);
pattern.Note(64, 0.5, 0.5, 110);
pattern.Note(67, 1.0, 1.0, 110);
pattern.Note(60, 2.0, 1.0, 110, slideTo: 67, slideTimeMs: 500);
pattern.NoteMs(72, 1500, 250, 110);
// Note(...) auto-uses ms when duration > 8.0 or beat > 32.0 (still supports beats)
pattern.Note(72, 1500, 250, 110);
var seq1 = pattern.Note(60, 0.0, 0.25, 100).Siquenz("0010101001101110");
seq1.Loop = true;
pattern.Play();
```

MIDI routing:

```csharp
var synth = CreateSynth();
midi.device(0).to(synth);
midi.device(0).cc(1).to(value => synth.SetParameter("cutoff", value));
midi.device(0).pitchbend().to(value => synth.PitchBend(value * 2f - 1f));
```

More MIDI details: Docs/Midi.md
Device/channel/route toggles:

```csharp
var synth = CreateSynth();
var midi1 = midi.Device(0); // device handle
var route1 = midi.Device(0).to(synth); // device -> synth route
midi1.Active(false); // disable device input
midi1.Active(true); // enable device input
midi.Device(0).Channel(1).Active(false); // disable channel 2 on device 0
route1.Active(false); // disable only this route
```

Effect chains:

```csharp
var fx = CreateEffect()
    .ReverbPreset("Hall")
    .BitCrushPreset("Lofi")
    .NoisePreset("Light")
    .FilterPreset("Low");
var ch1 = Audio.CreateChannel(1);
ch1.Effect(fx);
```

Preset factory:
```csharp
var ch2 = Audio.CreateChannel(2);
ch2.Effect(Effect.Reverb);
ch2.Effect(Effect.Drive);
```

Preset names:
```csharp
var fx2 = CreateEffect()
    .ReverbPreset("Hall")
    .DelayPreset("Echo")
    .DrivePreset("Warm");
Audio.CreateChannel(3).Effect(fx2);
```

Preset with overrides:
```csharp
var fx3 = CreateEffect()
    .ReverbPreset("Hall", r => r.Mix = 0.4f)
    .DelayPreset("Echo", d => d.TimeMs = 500f)
    .FilterPreset("Low", f => f.CutoffHz = 800f);
Audio.CreateChannel(4).Effect(fx3);
```

Standalone preset (for modulation):
```csharp
var reverbFx3 = ReverbPreset("Hall", r => r.Mix = 0.4f);
var mix = Mod.Var(reverbFx3, "Mix");
mix.Lfo(0.2f, 0.6f, rateHz: 0.5f);
```

Oscillators:

```csharp
var synth = CreateSynth();
var osc3 = synth.Oscillator();
osc3.Waveform = WaveType.Sine;
osc3.Level = 0.2f;
osc3.Pan = -0.2f;
var osc4 = synth.Oscillator();
osc4.Waveform = WaveType.Sine;
osc4.Level = 0.2f;
osc4.Pan = 0.2f;
osc4.ModToFilter = 0.2f;
```

Wavetables:

```csharp
var synth = CreateSynth();
// Built-in tables
synth.Osc1Wavetable = Wavetable.Saw(2048);
synth.Osc2Wavetable = Wavetable.WhiteNoise(2048);
// Custom file (first channel)
var wt = Wavetable.FromFile("my_wave.wav", maxSamples: 2048);
var osc = synth.Oscillator();
osc.Wavetable = wt;
osc.Enabled = true;
```

VST3 hosting:

```csharp
var vital = CreateVst("Vital");
vital.SetParameterNormalized("Cutoff", 0.5f);
midi.device(0).to(vital);
midi.device(0).pitchbend().to(value => vital.PitchBend(value * 2f - 1f));
var pattern = CreatePattern(vital);
pattern.LoopLength = 4.0;
pattern.Note(60, 0.0, 0.5, 110);
pattern.Note(64, 0.5, 0.5, 110);
pattern.Note(67, 1.0, 1.0, 110);
pattern.Play();
```

Note: VST UI embedding is Windows-only for now. Linux runs VST audio + parameters without the editor window.
VST state:

```csharp
var vital = CreateVst("Vital");
vital.State(); // on /S or exit, writes the state as base64 into the ()
```

Notes:
- The inline `State()` call is updated on refresh or exit, so you can copy/share the script.
- States are stored per script under `.musicengine/states/<script>/<name>.state`.
- Missing VSTs warn and stay silent instead of crashing.

Manual overrides still work:

```csharp
vital.LoadState("States.vital.state");
vital.SaveState("States.vital.state");
```

To send audio into Discord as a microphone, you need a virtual audio device (e.g. VB-CABLE or VoiceMeeter). MusicEngine can route the master or a channel to any Windows output device.
```csharp
Audio.Output.List(); // list render devices
Audio.Master.VirtualOut("CABLE Input"); // route master to virtual mic
var ch1 = Audio.CreateChannel(1);
ch1.VirtualOut("CABLE Input"); // route channel to virtual mic
```

Notes:
- Without a virtual audio device installed, Windows has no "virtual mic" target.
- In Discord, set Input Device to the virtual cable.
Note: audio input is Windows-only right now.

```csharp
Audio.Input.List(); // list capture devices
var mic = CreateMic(0); // or CreateInput(0)
var ch1 = Audio.CreateChannel(1);
ch1.Route(mic);
ch1.Gain(0.7);
```

Fluent instrument setup:

```csharp
CreateGeneralMidi()
    .Pan(0.2f)
    .Channel(0)
    .Name("GM_AcousticGrandPiano");
CreateSynth()
    .Volume(0.7f)
    .Pan(-0.2f);
CreateVst("Vital")
    .Volume(0.9f)
    .Pan(0.1f);
CreateMic(0)
    .Gain(0.8f)
    .Mute(false);
```

Modulation:

```csharp
// piano and synth are instruments created earlier
var pan = Mod.Pan(piano, 0f);
pan.Random(-0.5f, 0.5f, everyMs: 400);
var vol = Mod.Volume(synth, 0.7f);
vol.Lfo(0.2f, 0.9f, rateHz: 0.5f);
// Any property
var cutoff = Mod.Var(synth, "Cutoff");
cutoff.Random(0.2f, 0.9f, everyMs: 300);
```

General MIDI:

```csharp
var piano = CreateGeneralMidi();
piano.Program = GeneralMidiProgram.AcousticGrandPiano;
piano.Volume = 0.8f;
piano.Pan = 0f;
piano.Reverb = 0.2f;
piano.Channel = 0;
midi.device(0).to(piano);
midi.device(0).pitchbend().to(value => piano.PitchBend(value * 2f - 1f));
var pattern = CreatePattern(piano);
pattern.LoopLength = 4.0;
pattern.Note(60, 0.0, 1.0, 100);
pattern.Note(67, 1.0, 1.0, 100);
pattern.Note(72, 2.0, 2.0, 100);
pattern.Play();
```

Features:

- Low-latency MIDI routing and controller mapping.
- Scriptable pattern sequencer with loop length and per-note velocity.
- Built-in polyphonic synth with oscillators, ADSR, filter, LFO, and effects.
- General MIDI output wrapper for system MIDI devices.
- VST3 hosting with parameter control and editor support.
- Channel routing with per-channel effects and master processing.
- Recording taps for master and per-channel audio.
- Lighting and show control: DMX512, Art-Net, sACN (E1.31), MIDI Show Control.
- MIDI and timing: MIDI 1.0, MIDI 2.0, MTC, SMPTE/LTC sync, OSC control, MIDI screen/LED feedback with live data.
- Audio routing: Windows ASIO/WaveOut plus virtual outputs, Linux PortAudio (default device).
- Game engine integration: Unity/Unreal hooks, custom engine integration, realtime music logic control.
- Modular control: CV rack-style modularity via code for every variable and setting.
- DSP and hardware: planned DSP control surface and library to drive custom sound cards for full flexibility.
- The project uses NAudio for audio/MIDI and provides its own instrument wrappers.
- VST3 scanning can be configured via `MUSICENGINE_VST3_PATHS` (path-separator separated).
- Extra free instruments, kits, and tools live in the companion library: MusicEngine.Library.
MusicEngine.Library is a separate, community-driven repository of free instruments, kits, and utilities. It is optional: the engine runs without it. When present, you can load instruments with:

```csharp
var speech = LibraryApi("MusicEngine.Instruments.SpeechInstrument");
```

Use cases:
- Sample-based kits (drums, one-shots, multisamples).
- Procedural instruments (speech, drones, generators).
- Utility helpers to build instruments from local folders.
Missing library instruments are skipped and stay silent (no crash), similar to missing VSTs.
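A loaded library instrument should plug into the same pattern API as the built-in instruments. A minimal sketch, assuming the `SpeechInstrument` from the example above accepts patterns like any other instrument (that compatibility is an assumption, not documented here):

```csharp
// Sketch: sequence a library instrument like a built-in one.
// Assumption: library instruments work with CreatePattern like synths/VSTs do.
var speech = LibraryApi("MusicEngine.Instruments.SpeechInstrument");
var pattern = CreatePattern(speech);
pattern.LoopLength = 2.0;
pattern.Note(60, 0.0, 1.0, 100);
pattern.Play();
```

If the library is missing, the `LibraryApi(...)` call warns and the instrument stays silent, so the script still runs.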
- Install .NET SDK 10.0 (or newer).
- Windows: install Visual Studio 2022 with Desktop development with C++ (for the native VST3 host).
- Linux: install `cmake`, `build-essential`, and the PortAudio dev package (e.g. `portaudio19-dev`).
- Rider works fine, but MSBuild from Visual Studio Build Tools is still required for the Windows native layer.
Settings reference: Docs/Settings.md
Linux limitations:
- Output uses PortAudio (default device).
- VST UI embedding is not supported yet.
- Audio input/recording are not supported yet.
- Sample loading currently supports WAV (PCM/float) only.
The engine scans the default project folder: `Test Project/`.

Any `.cs` or `.csx` file can be a main script if it calls `File.Main();` or `File(Main, "Name");`.
To use a different project folder, set `MUSICENGINE_PROJECT_DIR` (or pass `--project-dir <path>`).
The engine looks for `Test Project/` first and then the legacy `Scripts/` folder inside that project directory.
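Putting this together, a minimal main script might look like the following. The file name and note values are illustrative; the calls are the ones shown in the quick-start example above:

```csharp
// MyTrack.csx — minimal main-script sketch (hypothetical file name).
// File.Main() marks this file as a main script so the project scan picks it up.
File.Main();

var synth = CreateSynth();
var pattern = CreatePattern(synth);
pattern.LoopLength = 4.0;
pattern.Note(60, 0.0, 1.0, 100);
pattern.Note(64, 1.0, 1.0, 100);
pattern.Note(67, 2.0, 2.0, 100);
pattern.Play();
```

Drop the file into `Test Project/` (or your `MUSICENGINE_PROJECT_DIR`) and the engine will find it on the next scan.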
Managed build (C#):
- Open the solution in Rider or Visual Studio and build the `MusicEngine` project.
Native VST3 host (C++):
- First-time setup (pull the Steinberg SDK into `external/vst3sdk`):

```shell
git clone https://github.com/steinbergmedia/vst3sdk.git external/vst3sdk
git -C external/vst3sdk submodule update --init --recursive
```
Windows build (MSBuild):
- Rider: open `MusicEngine.CppLayer/native/MusicEngine.CppLayer.Native.vcxproj` and build x64 Debug/Release.
- Visual Studio: open the same `.vcxproj`, select x64 + Debug/Release, then Build.
- Command line (PowerShell):

```shell
& "C:\Program Files\Microsoft Visual Studio\18\Community\MSBuild\Current\Bin\MSBuild.exe" "MusicEngine.CppLayer\native\MusicEngine.CppLayer.Native.vcxproj" /p:Configuration=Release /p:Platform=x64
```
Linux build (CMake):

```shell
cmake -S MusicEngine.CppLayer/native -B MusicEngine.CppLayer/native/build -DCMAKE_BUILD_TYPE=Release
cmake --build MusicEngine.CppLayer/native/build --config Release
```
The native build copies `MusicEngine.CppLayer.Native.dll` or `libMusicEngine.CppLayer.Native.so` into `MusicEngine\` automatically (and the C# build copies it to the output).
Troubleshooting native build:
- Error: `Cannot open include file: public.sdk/source/vst/hosting/module.h`
  - The SDK submodules are missing or empty. Run:

    ```shell
    git -C external/vst3sdk submodule update --init --recursive
    ```

  - If the submodules are present but empty, reset them:

    ```shell
    git -C external/vst3sdk/public.sdk reset --hard
    git -C external/vst3sdk/base reset --hard
    git -C external/vst3sdk/pluginterfaces reset --hard
    ```
Copyright 2026 watermann429 and contributors.