
Jamulus on Linux vs Windows #669

@seanogdelaney

Description


TL;DR: Windows and Linux perform almost exactly the same running Jamulus on the same hardware. Windows was very slightly faster on my specific hardware. Typically, the difference in audio drivers will be the main factor (in my case, the networking was faster on Windows).


Hi,

I've used jack_iodelay (Linux) and the RTL Utility (Windows) to compare round-trip latencies.
I'm using the same computer (dual boot), hardware, and AWS server for both tests.

My audio interface shows the same measured latency on both Linux and Windows (7.7 ms at a buffer size of 64 samples).
Jamulus reports an "overall delay" of 22 ms on Linux compared to 25 ms on Windows.
Measurements of the true total latency are 27 ms (Linux) and 33 ms (Windows).
Network jitter is typically ±1 ms.
Both versions of Jamulus were downloaded in the past week.

The network latency (9 ms ping), interface latency (7.7 ms), and jitter buffer size (4+4, "small buffers") are constant across both tests.
Subtracting those fixed costs, the Linux jitter buffers appear to account for about 10 ms of latency,
while the Windows jitter buffers account for about 15 ms.
Why is there a 50% difference here?
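To make the arithmetic above explicit (these are just my measured numbers, not anything read from Jamulus internals): subtracting the constant network and interface delays from each measured total leaves the portion attributable to jitter buffering and framing.

```python
# Back-of-envelope check: total measured latency minus network ping
# minus interface round-trip leaves the delay attributable to buffering.
PING_MS = 9.0        # network latency from ping
INTERFACE_MS = 7.7   # interface round-trip at 64-sample buffers

def buffering_ms(total_ms: float) -> float:
    """Latency left over after subtracting network and interface delay."""
    return total_ms - PING_MS - INTERFACE_MS

linux = buffering_ms(27.0)    # ~10 ms
windows = buffering_ms(33.0)  # ~16 ms
print(f"Linux: {linux:.1f} ms, Windows: {windows:.1f} ms")
```

The Windows residual comes out nearer 16 ms than 15 ms, but either way it is roughly 50% larger than the Linux figure.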

My collaborators use Windows and have slower networks, so a 5 ms improvement would be great!

In any case, Jamulus developers should be very proud. Fantastic software.

Thanks.

P.S. Linux allows the user to choose 32-sample buffers, which improves latency by a further 3–4 ms.
That does not seem to be possible on Windows with the Focusrite ASIO driver (Jamulus sets the buffer size; the minimum is 64).
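For anyone wondering why halving the buffer size saves several milliseconds rather than a fraction of one: each buffer of N samples at sample rate fs adds N/fs seconds of delay, and there are multiple buffer stages in the chain (capture, network framing, playback). A quick sketch, where the "roughly five stages" figure is my own assumption, not a count from the Jamulus source:

```python
# Hypothetical arithmetic: per-buffer delay is N samples / sample rate.
SAMPLE_RATE = 48_000  # Jamulus runs at 48 kHz

def buffer_ms(samples: int) -> float:
    """Delay contributed by one buffer of the given size, in milliseconds."""
    return 1000.0 * samples / SAMPLE_RATE

saving_per_stage = buffer_ms(64) - buffer_ms(32)  # ~0.67 ms per stage
print(f"saving per buffer stage: {saving_per_stage:.2f} ms")
# At roughly five buffer stages end to end, that is ~3.3 ms in total,
# consistent with the 3-4 ms improvement I measured.
```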

[Dell Latitude 7390, Intel Core i7, 16GB RAM, Focusrite 4i4 3rd Gen, 300Mbps Fibre internet (wired), Ubuntu 20.04 (ALSA driver) and Windows 10 (Focusrite ASIO Driver)]
