The idea is:
A new jitter buffer uses a timing histogram to compute the actual jitter relative to the audio device or system clock timer.
A bit sequence will be added to all transmitted frames to improve the jitter histogram computation.
Open issues:
What about compatibility with old Jamulus servers and clients?
How does the new auto jitter buffer algorithm behave? For what situation is it optimized? See, e.g., Variable auto jitter buffer (#417).
The old algorithm was designed to measure the block error rate for different jitter buffer sizes. What is the metric for the new auto jitter buffer algorithm to set the buffer size?
We are using an unused bit in the OPUS bit stream. Question: Why is this bit not used? Is it a reserved bit? Do we have a reference to the OPUS specification document where this bit is mentioned? Is there a risk that a later version of OPUS will use this bit, so that we cannot upgrade Jamulus to a new OPUS version?
Documentation of the algorithm should be added so that others can understand what's going on and can improve/tune the new algorithm.
A first patch is already available: #539.
See also the discussion in the pull request: #529 (comment)