Audio Processing - The second step
In my previous post, I introduced some basic terminology in audio processing and several libraries for handling audio in iOS. That was just a kick-off! In this post, I will describe the audio architecture (in iOS, certainly!) in more detail and point you to some great documents for gaining more audio knowledge.
I had an opportunity to work with audio, so I had to read the documentation carefully. However, it contains plenty of notions I had never known, or even heard of, before: audio session, audio unit, audio category, and so on. Once again, we must familiarize ourselves with these definitions. Here we go!
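To make the audio session and category ideas concrete, here is a minimal sketch in Swift (assuming a modern iOS app target) that configures the app's shared audio session with the playback category before any sound is produced:

```swift
import AVFoundation

// The shared audio session tells the system how your app intends to
// use audio. .playback is one of the standard categories; others
// include .record, .playAndRecord, and .ambient.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default, options: [])
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}
```

Setting the category up front matters because it decides, for example, whether your audio keeps playing when the silent switch is on or whether it mixes with other apps' audio.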
An Audio Unit is a software plug-in that carries out low-level tasks such as mixing, filtering, splitting, or, in general terms, digital signal processing. If you want to do anything beyond simply playing back an audio file, you must know about audio units and use the Audio Unit framework to achieve your goal.
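As a sketch of what working at this level looks like (the component codes below are Apple-defined constants; the surrounding flow is a simplified outline, not a complete render setup), you describe the audio unit you want, locate it, and instantiate it through the Audio Toolbox C API:

```swift
import AudioToolbox

// Describe the remote I/O unit, the audio unit that talks to the
// device's audio hardware on iOS.
var description = AudioComponentDescription(
    componentType: kAudioUnitType_Output,
    componentSubType: kAudioUnitSubType_RemoteIO,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0
)

// Locate the matching component and create an instance of it.
if let component = AudioComponentFindNext(nil, &description) {
    var unit: AudioUnit?
    if AudioComponentInstanceNew(component, &unit) == noErr, let unit = unit {
        AudioUnitInitialize(unit)
        // ... set stream formats, attach a render callback, start the unit ...
        AudioUnitUninitialize(unit)
        AudioComponentInstanceDispose(unit)
    }
}
```

Notice how much more ceremony this takes compared with the high-level frameworks; that trade-off is exactly why Apple recommends dropping to this level only when you need it.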
Audio Services in iOS
An audio service is a component that performs a specific set of tasks, so a framework may contain more than one service. The image below shows the three levels into which the services are divided: low-level, mid-level, and high-level (we usually work with the last two).
Audio Framework in iOS
If you search for audio frameworks in Documentation and API in Xcode, you will find many of them, including Audio Toolbox, Core Audio, OpenAL, and AVFoundation, so you may be confused about which one to use. Generally speaking: Audio Toolbox provides application-level services; Core Audio is a low-level API used to communicate with the hardware; OpenAL is an open-source audio library and also an application-level API (commonly used in games); and AVFoundation is an Objective-C library for playing audio files.
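For a taste of the high-level route, here is a short Swift sketch that plays a bundled file through AVFoundation's AVAudioPlayer ("sound.mp3" is a placeholder asset name I picked for illustration):

```swift
import AVFoundation

// Keep a strong reference to the player; if it is deallocated,
// playback stops immediately.
var player: AVAudioPlayer?

func playBundledSound() {
    // "sound.mp3" is a hypothetical file assumed to ship in the app bundle.
    guard let url = Bundle.main.url(forResource: "sound", withExtension: "mp3") else {
        print("Audio file not found in bundle")
        return
    }
    do {
        player = try AVAudioPlayer(contentsOf: url)
        player?.prepareToPlay()
        player?.play()
    } catch {
        print("Could not create player: \(error)")
    }
}
```

Compare this with the Audio Unit setup earlier: for simple playback, a few lines of AVFoundation are all you need.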
According to Apple iOS documentation:
In summary, to make an awesome iOS app we should focus on three frameworks, in descending priority: AVFoundation, Audio Toolbox, and Core Audio.