Check out the paper: https://neuripscreativityworkshop.github.io/2023/papers/ml4cd2023_paper23.pdf
HARP is an ARA plug-in that allows for hosted, asynchronous, remote processing of audio with deep learning models. HARP works by routing audio from a digital audio workstation (DAW) through Gradio endpoints. Because Gradio apps can be hosted locally or in the cloud (e.g., HuggingFace Spaces), HARP lets users of Digital Audio Workstations (e.g. Reaper) access large state-of-the-art models in the cloud, without breaking their within-DAW workflow.
HARP has been tested on Apple-silicon (ARM-based) Macs running macOS 13.0 and 13.4, using the REAPER digital audio workstation.
HARP requires a DAW that fully supports Audio Random Access (ARA) for VST3 plugins.
- Download the HARP DMG file from the HARP releases page
- Double-click the DMG file. This will open the window below
- Double-click "Your Mac's VST3 folder"
- Drag HARP.vst3 into the folder that was opened in the previous step
The Windows build is still under development. There are no current plans to support Linux.
- Install the latest Reaper
- Install HARP (see above)
- Start Reaper
- Open the preferences dialog by selecting the Reaper > Settings menu item
- Scroll down to Plug-ins > ARA and make sure "Enable ARA for plug-ins" is checked
- Restart Reaper
HARP should now be available as a VST3 plugin.
- Record a track in Reaper
- Select "FX" on the track's channel strip. This brings up the following dialog
- Add HARP(TeamUP) as a VST3 plugin. This will call up HARP
- Type the Gradio endpoint of an available HARP model where it says "path to a gradio endpoint", for example "hugggof/harmonic_percussive". This will bring up the model controls.
- Adjust the controls and hit "process"
- To hear your result, hit the space bar
Warning: by default, Reaper may block the use of hotkeys (e.g., Ctrl + A, C, or V) and the space bar within text fields. These can be enabled by checking "Send all keyboard input to plug-in" in the "FX" window.
While any model can be made HARP-compatible with the pyHARP API, the following models are currently available for use within HARP. Just enter the Gradio path for any of these models (e.g. "hugggof/pitch_shifter" or "descript/vampnet") into HARP.
- Pitch shifting: hugggof/pitch_shifter
- Harmonic/percussive source separation: hugggof/harmonic_percussive
- Music audio generation: descript/vampnet
- Convert Instrumental Music into 8-bit Chiptune: hugggof/nesquik
- Music audio generation: hugggof/MusicGen
- Pitch-preserving timbre-removal: cwitkowitz/timbre-trap
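For scripting or quick reference, the endpoints above can be kept in a small lookup table. This is a plain-Python sketch; the model names used as keys here are shorthand labels of our own, while the Gradio paths come straight from the list above:

```python
# Gradio paths for the HARP-ready models listed above.
HARP_MODELS = {
    "pitch_shifter": "hugggof/pitch_shifter",
    "harmonic_percussive": "hugggof/harmonic_percussive",
    "vampnet": "descript/vampnet",
    "nesquik": "hugggof/nesquik",
    "musicgen": "hugggof/MusicGen",
    "timbre_trap": "cwitkowitz/timbre-trap",
}

def gradio_path(name: str) -> str:
    """Return the Gradio path to paste into HARP's endpoint field."""
    try:
        return HARP_MODELS[name]
    except KeyError:
        raise ValueError(f"unknown model {name!r}; choose from {sorted(HARP_MODELS)}")
```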
We provide a lightweight API called pyHARP for building compatible Gradio audio-processing apps with optional interactive controls. This lets deep learning model developers create user interfaces for virtually any audio processing model with only a few lines of Python code.
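As a rough illustration of the kind of processing function such an app wraps, here is a plain-Python sketch. This is not the actual pyHARP API; the function name and the gain control are hypothetical stand-ins for whatever controls a model developer would expose:

```python
def apply_gain(samples: list[float], gain_db: float = 0.0) -> list[float]:
    """Scale audio samples by a gain in decibels (a hypothetical 'control').

    In a HARP-style Gradio app, a function like this receives the audio
    plus the values of the interactive controls, and returns processed audio.
    """
    factor = 10.0 ** (gain_db / 20.0)
    # Clip to [-1, 1] so the processed audio stays in a valid float range.
    return [max(-1.0, min(1.0, s * factor)) for s in samples]

# A gain of about -6 dB roughly halves the amplitude.
halved = apply_gain([1.0, -0.5], gain_db=-6.02)
```

A real HARP endpoint would operate on audio files rather than raw sample lists, with pyHARP handling the Gradio interface and control layout around a core function like this.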
To build the HARP plugin from scratch, perform the following steps:
Clone the HARP repo:
git clone --recurse-submodules git@github.com:audacitorch/HARP.git
cd HARP
macOS builds of HARP are known to work on Apple silicon only. We've had trouble getting REAPER and ARA to work together on x86. TODO: test on x86 Macs.
Configure:
mkdir build
cd build
cmake .. -DCMAKE_BUILD_TYPE=Debug

Build (replace NUM_PROCESSORS with the number of CPU cores to use):
make -jNUM_PROCESSORS
To specify which OSX architecture you'd like to build for, set CMAKE_OSX_ARCHITECTURES to either arm64 or x86_64:
For example, for an x86_64 build:
cmake .. -DCMAKE_OSX_ARCHITECTURES=x86_64

HARP has been tested on Windows 10 x64. You can check out the windowsBuild branch and follow the instructions there.
Codesigning and packaging for distribution is done through the script located at packaging/package.sh.
You'll need to set up a developer account with Apple and create a certificate for signing the plugin.
For more information on codesigning and notarization for mac, refer to the pamplejuce template.
The script requires the following variables to be passed:
# Retrieve values from either environment variables or command-line arguments
DEV_ID_APPLICATION # Developer ID Application certificate
ARTIFACTS_PATH # should be packaging/dmg/HARP.vst3
PROJECT_NAME # "HARP"
PRODUCT_NAME # "HARP"
NOTARIZATION_USERNAME # Apple ID
NOTARIZATION_PASSWORD # App-specific password for notarization
TEAM_ID # Team ID for notarization
Usage:
bash packaging/package.sh <Developer ID Application> <Artifacts Path> <Project Name> <Product Name> <Notarization Username> <Notarization Password> <Team ID>

After running package.sh, you should have a signed and notarized DMG file in the packaging/ directory.
TODO
- Download Visual Studio Code for Mac: https://code.visualstudio.com/
- Install Microsoft's C/C++ extension
- Open the "Run and Debug" tab in VS Code and press "create a launch.json file", selecting the LLDB environment
- Create a configuration for attaching to a process; here's an example launch.json you could use:
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "lldb reaper",
            "type": "cppdbg",
            "request": "launch",
            "program": "/Applications/REAPER.app/Contents/MacOS/REAPER",
            "args": [],
            "cwd": "${fileDirname}",
            "MIMode": "lldb"
        }
    ]
}
- Build the plugin using this flag:
-DCMAKE_BUILD_TYPE=Debug
- Run the debugger and add breakpoints
On Windows, here's an example launch.json using the Visual Studio debugger (cppvsdbg):
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "lldb reaper",
            "type": "cppvsdbg",
            "request": "launch",
            "program": "c:/Program Files/REAPER (x64)/reaper.exe",
            "args": [],
            "cwd": "${fileDirname}"
        }
    ]
}
