Reputation: 291
I am attempting to send an audio stream out over a phone call on Android.
For example, to create an app which would play some custom on-hold music, or answer a call and play a recording/audio file. I know it is possible to have an app automatically answer a call, but can it send audio to the caller?
If it is possible, please let me know what classes/functions handle this.
Upvotes: 28
Views: 29560
Reputation: 23
Following up on a previous answer.
There was research being done in 2018 on forwarding custom audio on the Xiaomi Redmi Note 3:
https://xdaforums.com/t/direct-audio-playback-through-alsa.3806297/
More info can be found there.
First of all, you need your card name. You can get it from /proc/asound/cards. For example:

0 [msm8976tashalit]: msm8976-tashali - msm8976-tashalite-snd-card
                     msm8976-tashalite-snd-card

The card name will be "msm8976-tashalite-snd-card". The next thing is paths. You need to figure out how the codec needs to be prepared. This info lives in mixer_paths.xml, in /system/etc or /vendor/etc. Search for paths called deep-buffer-playback, low-latency-playback, compress-offload-playback, etc. Put the values from there into the "device" section. You also need a device number; see /proc/asound/pcm. Example:

00-09: (Compress1) :  : playback 1

The device number is 9. Volume is a hard one because Android changes it in software, so you can't see any difference when dumping the control variables (at least in my case). Use alsamixer and guess... This blog post can help you: https://arunraghavan.net/2016/01/audio-devices-and-configuration/
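For illustration, here is a minimal Java sketch of the discovery step, reading those two proc files (the class name is made up, and "Compress1" is taken from the example above). On recent Android, SELinux usually blocks apps from reading /proc/asound, so in practice this needs root or a shell:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class AlsaProbe {
    // Dump the sound cards the kernel knows about, e.g.
    // "0 [msm8976tashalit]: msm8976-tashali - msm8976-tashalite-snd-card"
    static void dumpCards() throws IOException {
        try (BufferedReader r = new BufferedReader(new FileReader("/proc/asound/cards"))) {
            for (String line; (line = r.readLine()) != null; ) {
                System.out.println(line);
            }
        }
    }

    // Find the PCM device number for a named stream in /proc/asound/pcm,
    // where lines look like "00-09: (Compress1) :  : playback 1".
    static int findDeviceNumber(String streamName) throws IOException {
        try (BufferedReader r = new BufferedReader(new FileReader("/proc/asound/pcm"))) {
            for (String line; (line = r.readLine()) != null; ) {
                if (line.contains(streamName)) {
                    // The "00-09" before the first colon is card-device
                    String[] ids = line.split(":", 2)[0].split("-");
                    return Integer.parseInt(ids[1]); // 9 in the example above
                }
            }
        }
        return -1; // stream not found
    }
}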
Upvotes: 0
Reputation: 2283
I know of apps that already send all system audio to the phone's earpiece, even the songs you play in your media player, without root or any other special permission.
After some research I found that it can be done using the AudioManager class. For example:
AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
// MODE_IN_COMMUNICATION makes the system route audio as if a VoIP call were active
audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
// With speakerphone off, output goes to the earpiece
audioManager.setSpeakerphoneOn(false);
Why not set the default output device to the earpiece and then play the audio using the MediaPlayer class?
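A rough sketch of that idea, assuming it runs inside an Activity, the app holds MODIFY_AUDIO_SETTINGS, and res/raw/music.mp3 exists (a made-up resource name). Note this only routes local playback to the earpiece; it does not put the audio on the uplink of a live call:

// Sketch: play a bundled file through the earpiece instead of the loudspeaker
void playThroughEarpiece() throws IOException {
    AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
    am.setMode(AudioManager.MODE_IN_COMMUNICATION); // behave like a VoIP call
    am.setSpeakerphoneOn(false);                    // earpiece, not loudspeaker

    MediaPlayer player = new MediaPlayer();
    player.setAudioStreamType(AudioManager.STREAM_VOICE_CALL); // must be set before prepare()
    AssetFileDescriptor afd = getResources().openRawResourceFd(R.raw.music);
    player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
    afd.close();
    player.prepare();
    player.start();
}

On newer API levels you would use setAudioAttributes() with USAGE_VOICE_COMMUNICATION instead of the deprecated setAudioStreamType().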
Upvotes: 1
Reputation: 894
This must be possible because phones will allow Bluetooth to send audio as a microphone.
Since I need to solve this problem, one idea I have is to essentially send a Bluetooth signal to myself, "cloaked" as microphone input.
Another idea (apologies if it sucks) is to reverse the speaker into a microphone. Since the speaker seems to have access to the required hardware, there may be a way to inject the audio data stream that way.
Since I'm entirely new to coding on Android with the SDK, please excuse the initial botchery of this post; I'll update as I/we figure this out.
Upvotes: 3
Reputation: 640
Getting the audio data is very possible, but writing audio to the upstream channel is not possible.
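For the first half of that claim, this is roughly what grabbing the call audio looks like with MediaRecorder. Note that AudioSource.VOICE_CALL is gated behind the system-only CAPTURE_AUDIO_OUTPUT permission on most builds, so for a normal app this only works on certain OEM firmwares (the output path here is made up):

void recordCallAudio() throws IOException {
    MediaRecorder recorder = new MediaRecorder();
    recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL); // uplink + downlink mix
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    recorder.setOutputFile(getFilesDir() + "/call.3gp"); // made-up path
    recorder.prepare();
    recorder.start();
    // ... later: recorder.stop(); recorder.release();
}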
Upvotes: 0
Reputation: 158
Capturing the remote submix audio requires the CAPTURE_AUDIO_OUTPUT permission. This permission is reserved for use by system components and is not available to third-party applications. :(
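For reference, this is roughly what such a capture would look like; in a third-party app the AudioRecord either fails to initialize or delivers silence, because CAPTURE_AUDIO_OUTPUT is a signature/privileged permission:

int sampleRate = 44100;
int bufSize = AudioRecord.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_IN_STEREO, AudioFormat.ENCODING_PCM_16BIT);
AudioRecord record = new AudioRecord(
        MediaRecorder.AudioSource.REMOTE_SUBMIX, // mix of what the system is playing
        sampleRate, AudioFormat.CHANNEL_IN_STEREO,
        AudioFormat.ENCODING_PCM_16BIT, bufSize);
if (record.getState() == AudioRecord.STATE_INITIALIZED) { // fails without the permission
    record.startRecording();
    short[] pcm = new short[bufSize / 2];
    int read = record.read(pcm, 0, pcm.length); // raw PCM of the submix
}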
Upvotes: 0
Reputation: 401
Writing to the phone call stream IS possible, but not from the app level on a stock (non-rooted) phone.
When a phone call is initiated, the mic is "typically" (it really depends on the specific phone) routed directly to the baseband, i.e., skipping the main processor altogether.
For outgoing audio: mic -> codec -> baseband
For incoming audio: baseband -> codec -> speaker
If it were instead always routed mic -> codec -> main processor -> codec -> baseband, then the stream would "presumably" be available, if the Android APIs (frameworks) supported accessing it.
The reason I say it is possible is that the audio (for nearly all smartphones now) is connected via SLIMbus, which allows dynamic changing of audio paths. It is, however, done in the kernel, via the codec driver living in ALSA.
So... were you so motivated, you could get the source for the Linux kernel for a phone and modify the codec/ALSA driver to let you change what happens when the call audio path is set up.
Of course, you would then incur latency on the new path, breaking the call/latency standards AT&T set up (that Audience helped them write...), and the baseband chip might reject your audio as not timely.
Lastly, you would need to modify the Android source (frameworks) to grow the API to support injecting audio onto that stream. (You'd need to make big mods to mediaserver, AudioFlinger in particular...)
It's complicated, but there is your answer. Cheers, :-)
Upvotes: 30