Parametric EQ and streaming audio on iOS: What's the hold-up?
Some time back I wrote about custom-equalizing headphones for one's ears. Similar to digital room correction (DRC), which corrects for speaker/room interactions, this involves compensating for the often very narrow but significant frequency peaks in one's particular headphone/ear system to achieve a flat perceived response.
Achieving this properly requires the use of either a notch filter circuit or a parametric equalizer. The latter seems like a particularly convenient solution, especially when implemented in software, and indeed there are various such plug-ins or other solutions available for audio playback on desktop and notebook computers. (I use the no-longer-supported Electri-Q plug-in with Foobar 2000 for music playback on my PC.)
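To make the "parametric equalizer" idea concrete: in software, each parametric band is typically a second-order (biquad) peaking filter with three knobs - center frequency, gain, and Q (bandwidth). Here's a minimal sketch in plain Python using the well-known Audio EQ Cookbook (RBJ) coefficient formulas; the function names and the example 7 kHz cut are my own illustration, not taken from any of the apps mentioned here:

```python
import math
import cmath

def peaking_eq_coeffs(f0, gain_db, q, fs):
    """Biquad coefficients for one parametric EQ band (Audio EQ Cookbook).

    f0: center frequency in Hz; gain_db: boost (+) or cut (-) in dB;
    q: bandwidth control (higher Q = narrower band); fs: sample rate in Hz.
    Returns (b0, b1, b2, a1, a2), normalized so a0 = 1.
    """
    a = 10.0 ** (gain_db / 40.0)          # amplitude factor
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    b0 = 1.0 + alpha * a
    b1 = -2.0 * math.cos(w0)
    b2 = 1.0 - alpha * a
    a0 = 1.0 + alpha / a
    a1 = -2.0 * math.cos(w0)
    a2 = 1.0 - alpha / a
    return (b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0)

def gain_at(coeffs, f, fs):
    """Magnitude response of the biquad at frequency f, in dB."""
    b0, b1, b2, a1, a2 = coeffs
    z = cmath.exp(-2j * math.pi * f / fs)  # z^-1 at this frequency
    h = (b0 + b1 * z + b2 * z * z) / (1.0 + a1 * z + a2 * z * z)
    return 20.0 * math.log10(abs(h))

# Example: a narrow -8 dB cut at 7 kHz to tame a headphone/ear-canal peak.
band = peaking_eq_coeffs(7000.0, -8.0, 8.0, 44100.0)
print(gain_at(band, 7000.0, 44100.0))  # full cut at the center frequency
print(gain_at(band, 1000.0, 44100.0))  # nearly 0 dB well away from it
```

This is exactly what a graphic equalizer can't do: its bands sit at fixed frequencies with fixed widths, so a narrow resonance between two sliders can't be targeted cleanly.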
On tablets and iPhones/iPods, things get trickier. There are parametric equalizer music playback apps available (such as the excellent Equalizer for iOS), which are fine if you're only listening to music files stored on the device itself. But if you're among the growing number of people using such devices to play streaming audio from files on a music server elsewhere on the network, you're out of luck.
For example, I currently use the iPeng app for streaming music playback on my iPad. It works great, but the only equalizer I have access to with this app is the default iOS graphic equalizer, which is useless for my purposes.
I have also looked into other streaming solutions, such as the JRiver Media Center software and its associated JRemote app. Although the JRiver software offers some sophisticated DSP equalization capabilities, it does not appear that they can be applied to streamed files.
(There are also a couple of other server-side possibilities - BruteFIR and InguzEQ - but implementing them can apparently be a challenge. I may still look into them more closely if/when I have the time.)
But there is hope. Shortly before my original post on this topic, a new iOS app appeared that enabled "live, app-to-app audio" for the first time. The app, Audiobus, was aimed at musicians using iDevices to produce music. With Audiobus-enabled apps, mobile musicians could now, for example, play and record a synthesizer app in real time into a separate DAW (digital audio workstation) app, and even add a separate effects app to that audio signal chain.
Needless to say, this opened up huge possibilities for mobile music making. And there's no reason it couldn't also expand playback processing options for basic music playback apps. For example, the same developer behind the Equalizer music player app mentioned above also makes a separate Audiobus-enabled parametric equalizer/compressor app - Remaster - aimed at musicians. If my streaming music player app were Audiobus-enabled (it currently isn't), I could simply add it to the Input slot in Audiobus with Remaster in the Effect slot, and have parametric equalization of streaming audio on my iDevice.
Of course processing resources are limited on tablets, and it's possible this could be an issue. But come on, it's about time this was tried.
More recently, with iOS 7, Apple introduced its own audio routing feature - Inter-App Audio - which accomplishes much the same thing as Audiobus. This offers yet another way for iOS audio apps to work together. So I'm now not-so-patiently waiting to see which iOS or Android app will be the first to deliver parametric equalization of streaming audio!