This is old news but I just heard about it and thought it was worth posting for future reference.
Over on the Andraudio list Robert Munro kindly pointed out that there was a question about the sad state of audio latency on Android at the Google I/O 2011 “Fireside Chat with the Android Team.” The question is at 40:24 in the video:
I’ve transcribed Dave’s answer below.
In response to “There are questions about audio latency being a problem for Android, are you seeing this Dave?”, Dave Sparks said:
“Latency is a big problem. We’re working at, hopefully we hope to be able to do something about it with ICS. As we investigated it it’s actually a pretty complex problem. There are a number of different places where latency gets introduced. Most of the latency is introduced below Android. Basically it’s happening in the drivers or in the chipsets or somewhere in there, and some of these are really obscene amounts like hundreds of milliseconds of latency in the audio path. So, that’s something we’re going to push on. We started/ I think we introduced something in CDD Gingerbread which was a “should” hit certain latencies.
There are some interesting problems we need to solve in the scheduler, so I’ll be talking to Rebecca shortly about this. Because the fair scheduler makes it really difficult to make sure that these low latency audio threads get scheduled when we need them to be scheduled every single time. That’s probably the biggest issue we’re running into right now. But it’s a problem we want to deal with and hopefully the next release will get it. Obviously it’s not going to solve the problems for legacy devices but it’s going to get better.”
Source:
Dave Sparks — Technical lead for the Android media framework.
Google I/O 2011, “Fireside Chat with the Android Team” May 10, 02:30PM – 03:30PM
On YouTube as “Google I/O 2011: Fireside Chat with the Android Team” at 40:24
www.youtube.com/watch?v=gfiYUL2exT8#t=2424s
11 Comments
Is the latency present on all Android models? And if so, will the general public notice this latency on their cell phones? I never noticed it.
It’s encouraging to know that the Android team is at least aware of the problem, but I’m concerned that engineers like Dave Sparks don’t have the political clout to get Google to lean hard on the device makers. The device makers themselves don’t feel any pressure from the market because consumers are completely ignorant of the problem. Part of the blame for this ignorance lies with the general tech media, who seem thrilled to delve into spec sheets when it comes to a device’s screen, processor, and camera but who wouldn’t know a DAC from their ass. The day Engadget pans an Android phone for 300 ms latency is the day we might see some movement on this problem.
Does anyone know if the new Galaxy Nexus at least meets the 45 ms output latency recommended in the latest CDD? It’s Google’s own dev phone, so they have no excuse if it doesn’t.
Also, great blog. I’ll be following it.
As someone who makes music in real time on the iPad (thanks to CoreMIDI) and would like to do so on my Android tablet, I think it’s just absurd that Google has pretty much ceded live music making to Apple.
Granted, it’s a rather small segment of the action on either OS, but it shows those EXTRA liberal arts sensibilities Steve Jobs spoke of, which the Apple team does build into their devices.
Apple invested the time in developing CoreAudio when nobody cared, starting in the early 2000s. It’s an interest that Apple and NeXT both shared, and which carried over when the two merged in 1997.
I take your meaning. Apple is an experience company, unlike search companies seeking new sources of revenue.
The Android proponents are in denial thinking that latency can somehow be overcome by Google’s engineering prowess. It’s a combined hardware/software problem. Android, the software, alone can’t solve it.
Possibly the best the Android proponents could hope for is specialized music-making hardware, which some company has yet to produce. The problem, though, is that Google’s monetization model won’t sustain such an effort for such a small interest.
@Jose: It’s a software engineering problem. Linux has great low latency support — on all hardware. Whether Google will fix it, well… that is, as you say, another matter.
@Matress Man: in this context, audio latency refers to the latency experienced by Android apps talking to the device’s audio hardware (audio recorders, music apps etc.); a minimal code sketch of that app-level path follows below. Android phones route audio from phone calls in proprietary ways that have separate latency behaviour compared to Android apps.
@yrag: I’m not even going to begin to engage in a discussion about Apple here. Let’s just say they have their own shortcomings. In any case, you don’t need “liberal arts sensibilities” to deliver low latency audio, you need good engineering. Which is something Google is supposed to have.
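To make that app-level path concrete, here is a minimal sketch of what a real-time audio app on Gingerbread/ICS typically does at the Java level: pull PCM from AudioRecord and push it straight back out through an AudioTrack. This is my own illustration rather than code from any particular app; it assumes the RECORD_AUDIO permission and omits error handling. The point is that even this tight loop sits on top of the minimum buffer sizes the platform reports, plus whatever the mixer and drivers add underneath, so it cannot get below the platform’s latency floor.

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioRecord;
    import android.media.AudioTrack;
    import android.media.MediaRecorder;

    // Minimal mic-to-speaker pass-through. Every buffer in this chain (the
    // record buffer, the playback buffer, and whatever the HAL/driver adds
    // underneath) contributes to the delay between speaking and hearing it.
    public class PassThrough implements Runnable {
        private static final int SAMPLE_RATE = 44100;
        private volatile boolean running = true;

        @Override
        public void run() {
            int recBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
            int playBytes = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, recBytes);
            AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, playBytes, AudioTrack.MODE_STREAM);

            short[] buffer = new short[recBytes / 2]; // 16-bit mono samples
            recorder.startRecording();
            player.play();
            while (running) {
                int read = recorder.read(buffer, 0, buffer.length);
                if (read > 0) {
                    player.write(buffer, 0, read); // blocks until the track accepts the data
                }
            }
            recorder.stop();
            recorder.release();
            player.stop();
            player.release();
        }

        public void stopLoop() { running = false; }
    }

A design point worth noting: both read() and write() here block, so the loop’s timing is entirely at the mercy of whatever buffer sizes the platform hands back.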
This is just another example of Google’s lack of firm control over Android hampering it. Their first mistake was not prioritizing audio – despite the fact that consumer music apps are some of the apps that really blow people away, like Songify or LaDiDa on the iPhone. This stuff sells iPhones. Their second mistake was to not enforce tighter standards so the consumer experience is largely the same across devices, and this is a problem for all kinds of interactions, especially the UI (scroll lag, etc.)
I certainly wouldn’t say users aren’t aware of the problem – if you’ve tried a voice changer app, or a guitar app (where you can pluck strings in real-time) you’ve encountered the horrible latency, which extends across all devices because Android just sucks at it. Again, it’s the kind of lack of attention to detail that will eventually doom Android to the “feature phone” level.
Also, just to clarify what “latency” in this context usually means – it means the time it takes for a real-time audio input to be processed and re-output as audio. In an Android voice-changer app, this means you won’t hear your voice for quite some time, even a whole second. This makes Songify impossible because the app isn’t getting the audio on beat, and I assume the latency varies a lot, so a developer can’t compensate (meaning anything you record with a backing track will be out of sync with the music). When you hit strings on a virtual guitar and there’s a delay, that’s probably partly terrible audio drivers, but the crappy prioritization of the UI also means that touches aren’t registered as fast as they could be.
This is also why the maker of Amplitube, the #1 guitar amp simulation software across all platforms, won’t develop for Android (see http://www.ikmultimedia.com/forum/viewtopic.php?p=3021_). No one wants to hear their guitar broadcast back to them 200ms later. Imagine trying to play a piano if you didn’t hear the sound of your playing for a quarter of a second! It doesn’t sound like a lot, but it makes it literally impossible to play on beat.
“Most of the latency is introduced below Android”
Every time a Google developer gives a statement regarding this issue it is utter bullsh*t. Sorry.
Issue 3434 has existed since 2009:
http://code.google.com/p/android/issues/detail?id=3434
It is embarrassing – how can it be that a multi-billion-dollar company is not able to solve an almost four-year-old problem?
I have seen all their moronic attempts to solve it (which don’t work):
– introducing an unnecessary layer of code (OpenSL ES)
– passing responsibility to hardware developers…the problem is that it doesn’t matter how well developed and fast your audio driver code is, Android’s audio system will slow everything down (see below)
– having a false concept of low latency: the feature android.hardware.audio.low_latency was introduced, which for Google means 45ms(!) and less…this is a joke! Every phone should be capable of 20ms by default; low latency should be something like 5ms and less (see the sketch below for how an app can query this flag and its minimum buffer size).
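For context on that last point: the 45ms figure corresponds to the android.hardware.audio.low_latency feature flag from the Gingerbread-era CDD. From an app there isn’t much you can do beyond asking whether the device claims the flag and looking at the smallest output buffer the platform will grant, roughly as in this sketch of mine; note that the buffer alone is only a lower bound, since the mixer and driver buffers underneath add more on top.

    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.media.AudioFormat;
    import android.media.AudioTrack;

    public final class LatencyInfo {

        // Does the device claim the CDD's low-latency audio feature
        // (android.hardware.audio.low_latency)?
        public static boolean claimsLowLatency(Context context) {
            return context.getPackageManager()
                    .hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);
        }

        // Lower bound on output latency imposed by the smallest AudioTrack
        // buffer the platform will accept; the real figure is higher once
        // the mixer and driver buffers underneath are counted.
        public static double minBufferLatencyMs(int sampleRate) {
            int minBytes = AudioTrack.getMinBufferSize(sampleRate,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            int frames = minBytes / 2;           // mono 16-bit: 2 bytes per frame
            return 1000.0 * frames / sampleRate; // e.g. 2048 frames @ 44100 Hz ≈ 46 ms
        }
    }

If a device were to report a 2048-frame minimum at 44100 Hz, that buffer alone would account for roughly 46ms, which gives a sense of where a 45ms bar comes from.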
Look at the design of Android’s audio system:
https://docs.google.com/drawings/edit?id=18MqktlXHzA7O8sAXQt0vmg-czFHVksx6qujQ_ZtWsSk&hl=en&authkey=CJGswKcN&pli=1
Anyone with a basic understanding of low-level audio programming will despair after seeing this code (it is open source): passing data via Binder and shared memory, thread locking with mutexes, huge fixed-length audio buffers… it is simply broken.
Someone even ported PulseAudio to Android and his results clearly show why we should blame Google…and only Google:
http://arunraghavan.net/2012/01/pulseaudio-vs-audioflinger-fight/
In short: PulseAudio has more than 150ms (!) less latency than AudioFlinger…this is a really, really huge difference!
I hope someday someone will do audio benchmarks for Android devices, print them out, tie them to a brick and throw it inside Google’s office!
…only joking, but this issue makes me mad.
I’ve just completed some latency tests on mobile phones. It appears some handsets must route mobile-call audio through the same audio API as apps, because the Samsung Galaxy S II adds 100ms of latency in each direction (200ms round trip).
That means the round-trip time for a call on Vodafone, which is typically 500ms between traditional mobiles, becomes 900ms between two Galaxy S IIs (500ms + 2 × 200ms), tested on both Gingerbread and ICS!
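For anyone who wants to reproduce this sort of measurement on a single handset, rather than across a network call as above, the usual do-it-yourself approach is a speaker-to-mic loopback test: start recording, play a short click, locate the click in the captured audio, and convert the sample offset to milliseconds. Below is a rough sketch of mine (not the commenter’s setup); it assumes a quiet room and the RECORD_AUDIO permission, and ignores error handling, so treat the number it produces as an approximation of the device’s output-plus-input path.

    import android.media.AudioFormat;
    import android.media.AudioManager;
    import android.media.AudioRecord;
    import android.media.AudioTrack;
    import android.media.MediaRecorder;

    // Crude speaker-to-mic loopback test: record continuously, emit a click,
    // then find the click in the captured audio. The sample offset between
    // "click written" and "click heard" approximates round-trip latency.
    public class LoopbackTest {
        private static final int SAMPLE_RATE = 44100;

        public static double measureRoundTripMs() {
            int recBytes = Math.max(
                    AudioRecord.getMinBufferSize(SAMPLE_RATE,
                            AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT),
                    SAMPLE_RATE * 2); // at least one second of capture space
            AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, recBytes);

            int playBytes = AudioTrack.getMinBufferSize(SAMPLE_RATE,
                    AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
            AudioTrack player = new AudioTrack(AudioManager.STREAM_MUSIC,
                    SAMPLE_RATE, AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, playBytes, AudioTrack.MODE_STREAM);

            short[] click = new short[256];
            click[0] = Short.MAX_VALUE; // single full-scale impulse

            short[] captured = new short[SAMPLE_RATE]; // one second of audio
            recorder.startRecording();
            player.play();
            player.write(click, 0, click.length);

            int total = 0;
            while (total < captured.length) {
                int n = recorder.read(captured, total, captured.length - total);
                if (n <= 0) break;
                total += n;
            }
            recorder.stop();
            recorder.release();
            player.stop();
            player.release();

            // Find the loudest sample; in a quiet room that is the click.
            int peak = 0;
            for (int i = 1; i < total; i++) {
                if (Math.abs(captured[i]) > Math.abs(captured[peak])) peak = i;
            }
            return 1000.0 * peak / SAMPLE_RATE;
        }
    }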
Haven’t seen an update on this for a while, any new news?