CoreAudio on iOS lists no input devices #842
update: When I get the audio devices in Dart using audio_session:

final session = await AudioSession.instance;
List<AudioDevice> audioDevices = (await session.getDevices()).toList();
for (var device in audioDevices) {
  print("Device: ${device.name}, input: ${device.isInput}, type: ${device.type}");
}

the mic is found:
I think this is a
I had to add
Note: The same
Hope that helps.
Thanks for the investigation! I added
to
It might be a permissions issue, but the microphone permission modal pops up and Flutter has microphone permissions. Maybe just the Rust library it links to doesn't 🤔
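One way to rule that out is to check the record permission from native code. The snippet below is a small Swift sketch (not from this thread, and the function name is made up) that could be called from the iOS runner:

```swift
import AVFoundation

// Query the OS-level record permission. iOS grants microphone access
// per app, so the linked Rust library sees the same state as the
// Flutter side.
func checkRecordPermission() {
    switch AVAudioSession.sharedInstance().recordPermission {
    case .granted:
        print("record permission: granted")
    case .denied:
        print("record permission: denied")
    case .undetermined:
        AVAudioSession.sharedInstance().requestRecordPermission { granted in
            print("record permission granted: \(granted)")
        }
    @unknown default:
        break
    }
}
```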
I'm still investigating this issue and I have some news.
@tGrothmannFluffy you may also need to configure and activate the AVAudioSession. Set the category to playAndRecord.
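On the native side, that suggestion corresponds to roughly the following (a minimal Swift sketch, not code posted in the thread; the function name is illustrative and error handling is left to the caller):

```swift
import AVFoundation

// Configure the shared AVAudioSession for simultaneous playback and
// recording, then activate it before cpal enumerates devices or opens
// a stream. This is the configuration the comment above describes.
func activateAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)
}
```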
Oh goodness gracious!
I put the AVAudioSession setup into AppDelegate.swift.
@tGrothmannFluffy I don't know your app, but if you want to play nice with other apps that are currently running in the background, it is better to activate your audio session on demand and not at app start in the AppDelegate. Otherwise, for instance, Apple Music will stop playing when your app starts, and that may be behavior your users would not like.
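A sketch of that on-demand pattern, with illustrative function names: activate the session just before recording and release it afterwards so other apps can resume their audio.

```swift
import AVFoundation

// Activate the session only for the duration of a recording.
func beginRecordingSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)
}

// Deactivate when recording stops; .notifyOthersOnDeactivation lets
// apps that were interrupted (e.g. Apple Music) resume playback.
func endRecordingSession() throws {
    try AVAudioSession.sharedInstance().setActive(false, options: [.notifyOthersOnDeactivation])
}
```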
Thanks for the help!
Ok, turns out using audio_session in Dart/Flutter works:

final session = await AudioSession.instance;
await session.configure(const AudioSessionConfiguration.music().copyWith(
  avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
));
await session.setActive(true);

But I wonder, isn't this something cpal should do when opening a stream?
Hi there,
thanks for this awesome crate!
I am running into an issue on iOS (debug and release builds, on an iPhone and in the simulator). The project is a Flutter app using flutter_rust_bridge. Running this code:
logs:
I also get this error:
[aurioc] AURemoteIO.cpp:1151 failed: -10851 (enable 1, outf< 2 ch, 0 Hz, Float32, deinterleaved> inf< 2 ch, 0 Hz, Float32, deinterleaved>)
When I try to access host.default_input_device() it results in:
Could not get default input config: BackendSpecific { err: BackendSpecificError { description: "Invalid property value" } }

Same when I try to access device.supported_input_configs().unwrap():
called `Result::unwrap()` on an `Err` value: BackendSpecific { err: BackendSpecificError { description: "Invalid property value" } }
Microphone permissions are granted via the permission handler