Fixes to allow changing stream format #47
Conversation
Hey @plietar, thanks for the PR :) Can you explain a little more the reasoning behind changing the …
Sure. Disclaimer though: I've got very little experience with CoreAudio. Before this patch, setting the format worked (…). This PR makes …
This matches what `set_stream_format` does.
As I understand it, an audio unit can have several inputs and several outputs, and an 'element' is just an index of one of those (https://developer.apple.com/library/archive/documentation/MusicAudio/Conceptual/AudioUnitProgrammingGuide/TheAudioUnit/TheAudioUnit.html). Therefore, it should be possible, for example, to have several render callbacks for a single audio unit. An example would be a crossfade unit with 2 inputs: it would have 2 elements in its input scope and 1 in its output scope, and it would require either two render callbacks (one for each input) or two upstream audio units. This changes Element to be just a number and adds an explicit element parameter in all the places where it hasn't been present before (i.e. setting callbacks and input/output stream formats). I also had to change the handling of render callbacks a bit, since there can now be more than one of them for a single audio unit. This relates to the issue RustAudio#60 and PR RustAudio#47.
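To illustrate the idea of "one render callback per element", here is a minimal, hypothetical sketch (not the crate's actual API): callbacks are stored in a map keyed by a plain numeric `Element`, so a unit like the crossfade example can hold one callback for each input element. All names here are invented for illustration.

```rust
use std::collections::HashMap;

// An element is just an index into one of the audio unit's scopes.
type Element = u32;

// Simplified stand-in for a render callback; real CoreAudio callbacks
// fill audio buffers with interleaved or non-interleaved samples.
type RenderCallback = Box<dyn FnMut(&mut [f32])>;

#[derive(Default)]
struct AudioUnitSketch {
    // One callback per input element, e.g. two for a crossfade unit.
    callbacks: HashMap<Element, RenderCallback>,
}

impl AudioUnitSketch {
    // Mirrors the PR's idea of an explicit `element` parameter
    // when registering a callback.
    fn set_render_callback(&mut self, element: Element, cb: RenderCallback) {
        self.callbacks.insert(element, cb);
    }

    // Invoke the callback registered for one element, if any.
    fn render(&mut self, element: Element, buffer: &mut [f32]) -> bool {
        match self.callbacks.get_mut(&element) {
            Some(cb) => {
                cb(buffer);
                true
            }
            None => false,
        }
    }
}
```

The point of the sketch is only the data-structure change: once `Element` is a number rather than a fixed two-variant enum, a unit can carry any number of per-element callbacks.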
I've recently been working on using this crate for audio input and encountered the scope problem, which is actually a bit bigger than just this change. It took a lot of searching, but Figure 1-3 in https://developer.apple.com/library/archive/documentation/MusicAudio/Conceptual/AudioUnitHostingGuide_iOS/AudioUnitHostingFundamentals/AudioUnitHostingFundamentals.html makes it clear. For rendering audio, your application should use the input scope and element 0 (aka element::Output). For taking audio from the microphone, you need to use the output scope and element 1 (aka element::Input).
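The scope/element pairing above can be written down concretely. The numeric scope values below are the CoreAudio constants from the AudioUnit headers (kAudioUnitScope_Input = 1, kAudioUnitScope_Output = 2); the element indices follow Figure 1-3 of the hosting guide. The helper function is a hypothetical sketch, not part of the crate:

```rust
// CoreAudio scope constants (values from the AudioUnit framework headers).
const K_AUDIO_UNIT_SCOPE_INPUT: u32 = 1;
const K_AUDIO_UNIT_SCOPE_OUTPUT: u32 = 2;

// Bus (element) indices on an I/O unit, per Figure 1-3 of the
// Audio Unit Hosting Guide: element 0 is the output bus (speaker side),
// element 1 is the input bus (microphone side).
const ELEMENT_OUTPUT: u32 = 0;
const ELEMENT_INPUT: u32 = 1;

// Hypothetical helper: which (scope, element) pair to configure.
// Playback (your code renders audio into the unit): input scope, element 0.
// Capture (you pull microphone audio out of the unit): output scope, element 1.
fn scope_and_element(capture: bool) -> (u32, u32) {
    if capture {
        (K_AUDIO_UNIT_SCOPE_OUTPUT, ELEMENT_INPUT)
    } else {
        (K_AUDIO_UNIT_SCOPE_INPUT, ELEMENT_OUTPUT)
    }
}
```

The counter-intuitive part is that capture uses the *output* scope: from your application's point of view, the microphone data comes *out* of the I/O unit's input bus.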