28 October 2013

In the previous post, I showed you some basic terminology in audio processing and several libraries for handling audio in iOS. That was just a kick-off! In this post, I will describe the principal architecture of audio (in iOS, certainly!) in more detail and give you some great documents to gain more audio knowledge.

I had an opportunity to work with audio, so I had to read the documentation carefully. However, there are plenty of notions I had never known, or even heard of, before. The ones I should mention are audio session, audio unit, audio category, and so on. Once again, we must familiarize ourselves with these definitions. Here we go!

Basic Definitions

Audio Session

An audio session is the intermediary between your app and iOS. It tells the system how your app's audio should interact with the rest of the device — for example, whether it mixes with the iPod's music or goes silent when the Ring/Silent switch is flipped.

Audio Category

An audio category is a key you set on the audio session to declare your app's intended audio behavior, such as playback only, recording only, or simultaneous playback and recording.
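As a minimal sketch of how a music-player app would declare its category with `AVAudioSession` (the error handling here is illustrative, and you must link AVFoundation.framework):

```objc
#import <AVFoundation/AVFoundation.h>

// Declare that this app plays audio that should keep going even when
// the Ring/Silent switch is set to silent.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:&error];
if (error) {
    NSLog(@"Could not set category: %@", error);
}
[session setActive:YES error:&error];
```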

Audio Queue

An audio queue is a C-based service in Audio Toolbox for recording and playing audio through a queue of buffers. It gives you more control than AVAudioPlayer — including access to the raw audio data as it passes through — at the cost of more setup.
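To give a flavor of the buffer-queue model, here is a stripped-down sketch of creating a playback queue for 16-bit stereo linear PCM. A real player would also allocate buffers, fill them in the callback, and re-enqueue them; this sketch only shows the shape of the API:

```objc
#import <AudioToolbox/AudioToolbox.h>

// Called whenever the queue has finished with a buffer. A real player
// refills inBuffer with audio data here and re-enqueues it.
static void MyOutputCallback(void *inUserData,
                             AudioQueueRef inAQ,
                             AudioQueueBufferRef inBuffer) {
    // Fill inBuffer->mAudioData, set inBuffer->mAudioDataByteSize,
    // then call AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}

void StartPlaybackQueue(void) {
    // Describe the format we intend to play: 44.1 kHz, 16-bit, stereo.
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger |
                               kLinearPCMFormatFlagIsPacked;
    format.mChannelsPerFrame = 2;
    format.mBitsPerChannel   = 16;
    format.mBytesPerFrame    = 4;  // 2 channels x 2 bytes each
    format.mFramesPerPacket  = 1;
    format.mBytesPerPacket   = 4;

    AudioQueueRef queue;
    AudioQueueNewOutput(&format, MyOutputCallback, NULL,
                        NULL, NULL, 0, &queue);
    AudioQueueStart(queue, NULL);
}
```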

Audio Unit

An audio unit is a software plug-in that carries out low-level tasks such as mixing, filtering, splitting, or, in general terms, digital signal processing. If you want to do more than just play back an audio file, you must know about audio units, and use the Audio Unit framework to achieve your goal.
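For example, the unit that talks directly to the device's input and output hardware is Apple's Remote I/O unit. A minimal sketch of locating and starting it looks like this (in practice you would also attach a render callback and set the stream format before starting):

```objc
#import <AudioUnit/AudioUnit.h>

// Describe the audio unit we want: Apple's Remote I/O output unit.
AudioComponentDescription desc = {0};
desc.componentType         = kAudioUnitType_Output;
desc.componentSubType      = kAudioUnitSubType_RemoteIO;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;

// Find the matching component and create an instance of it.
AudioComponent component = AudioComponentFindNext(NULL, &desc);
AudioUnit ioUnit;
AudioComponentInstanceNew(component, &ioUnit);

// A render callback and stream format would be configured here, then:
AudioUnitInitialize(ioUnit);
AudioOutputUnitStart(ioUnit);
```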

Audio Services in iOS

An audio service performs several specific tasks, so a framework may contain more than one service. The image below shows the three levels into which the services are divided: low-level, mid-level and high-level (we usually work with the last two).

(Image: the three levels of audio services in iOS — low-level, mid-level and high-level.)

Audio Framework in iOS

If you search for audio frameworks in the Documentation and API Reference in Xcode, you will find many of them, including Audio Toolbox, Core Audio, OpenAL and AVFoundation. So you may be confused about which framework to use. Generally: Audio Toolbox provides application-level services; Core Audio is a low-level API used to communicate with the hardware; OpenAL is an open-source audio library and also an application-level API (and, plus, usually used in games); AVFoundation is an Objective-C library for playing audio files.

According to Apple iOS documentation:

The Audio Toolbox framework (AudioToolbox.framework) provides interfaces for the mid- and high-level services in Core Audio. In iOS, this framework includes Audio Session Services, the interface for managing your application’s audio behavior in the context of a device that functions as a mobile phone and iPod.

The Audio Unit framework (AudioUnit.framework) lets applications work with audio plug-ins, including audio units and codecs.

The AV Foundation framework (AVFoundation.framework), available in iOS, provides the AVAudioPlayer class, a streamlined and simple Objective-C interface for audio playback.

The Core Audio framework (CoreAudio.framework) supplies data types used across Core Audio as well as interfaces for the low-level services.

The Core Audio Kit framework (CoreAudioKit.framework) provides a small API for creating user interfaces for audio units. This framework is not available in iOS.

The Core MIDI framework (CoreMIDI.framework) lets applications work with MIDI data and configure MIDI networks. This framework is not available in iOS.

The Core MIDI Server framework (CoreMIDIServer.framework) lets MIDI drivers communicate with the OS X MIDI server. This framework is not available in iOS.

The OpenAL framework (OpenAL.framework) provides the interfaces to work with OpenAL, an open source, positional audio technology.
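To see how streamlined the AVFoundation route is, here is a minimal AVAudioPlayer sketch. The file name "song.mp3" is a placeholder for a resource in your app bundle:

```objc
#import <AVFoundation/AVFoundation.h>

// "song.mp3" is a hypothetical bundled file. Keep a strong reference
// to the player (e.g. in a property); if it is deallocated, playback
// stops immediately.
NSURL *url = [[NSBundle mainBundle] URLForResource:@"song"
                                     withExtension:@"mp3"];
NSError *error = nil;
AVAudioPlayer *player =
    [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
[player prepareToPlay];
[player play];
```

Compare this handful of lines with the audio queue and audio unit sketches above — that contrast is exactly why AVFoundation comes first in the priority list below.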

In summary, to make an awesome iOS app we should focus on three frameworks (in descending priority): AVFoundation, Audio Toolbox and Core Audio.
