Classes
The following classes are available globally.
-
@brief The AudioPlayer plays audio files using Core Audio services.
@discussion To play audio files, the AudioPlayer uses the EZAudioFile and EZOutput objects. The AudioPlayer creates an EZAudioFile instance to read audio data from the file and an EZOutput instance for playback; the EZOutput uses AudioUnit to run the incoming audio data through a playback graph of audio processing units.
This class acts as the master delegate (the EZAudioFileDelegate) over whatever EZAudioFile instance it is using for playback (the audioFile property), as well as the EZOutputDelegate and EZOutputDataSource over whatever EZOutput instance is set as the output. Classes that want the EZAudioFileDelegate callbacks should instead implement the AudioPlayerDelegate on the AudioPlayer instance.
Declaration
Objective-C
@interface AudioPlayer : NSObject <EZAudioFileDelegate, EZOutputDataSource, EZOutputDelegate>
Swift
class AudioPlayer : NSObject, EZAudioFileDelegate, EZOutputDataSource, EZOutputDelegate
-
This class is a UIViewController whose view shows a visualization for an encapsulated AudioPlayer (an audio player object).
Declaration
Objective-C
@interface AudioPlayerViewController : UIViewController <AudioPlayerDelegate>
Swift
class AudioPlayerViewController : UIViewController, AudioPlayerDelegate
-
An audio stream player based on Apple’s Core Audio framework. More specifically, Audio Queue Services and AudioFileStream are used to handle audio buffers and audio streaming respectively.
@discussion Special threading consideration: the audioQueue property should only ever be accessed inside a synchronized(self) block, and only after checking that ![self isFinishing].
Declaration
Objective-C
@interface AudioStreamPlayer : NSObject {
    NSURL *url;
    AudioQueueRef audioQueue;
    AudioFileStreamID audioFileStream;
    AudioFileID audioFile;
    AudioStreamBasicDescription asbd;
    NSThread *internalThread;
    AudioQueueBufferRef audioQueueBuffer[16];
    AudioStreamPacketDescription packetDescs[512];
    unsigned int fillBufferIndex;
    UInt32 packetBufferSize;
    size_t bytesFilled;
    size_t packetsFilled;
    _Bool inuse[16];
    NSInteger buffersUsed;
    NSDictionary *httpHeaders;
    NSString *fileExtension;
    AudioStreamPlayerState state;
    AudioStreamPlayerState laststate;
    AudioStreamPlayerStopReason stopReason;
    AudioStreamPlayerErrorCode errorCode;
    OSStatus err;
    _Bool discontinuous;
    pthread_mutex_t queueBuffersMutex;
    pthread_cond_t queueBufferReadyCondition;
    CFReadStreamRef stream;
    NSNotificationCenter *notificationCenter;
    UInt32 bitRate;
    NSInteger dataOffset;
    NSInteger fileLength;
    NSInteger seekByteOffset;
    UInt64 audioDataByteCount;
    UInt64 processedPacketsCount;
    UInt64 processedPacketsSizeTotal;
    double seekTime;
    BOOL seekWasRequested;
    double requestedSeekTime;
    double sampleRate;
    double packetDuration;
    double lastProgress;
    BOOL pausedByInterruption;
}
Swift
class AudioStreamPlayer : NSObject
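The threading rule in the discussion above can be sketched in plain C, with a pthread mutex standing in for Objective-C's @synchronized(self). The StreamPlayer struct and player_pause function are hypothetical stand-ins for illustration only, not part of the class's API:

```c
#include <pthread.h>
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical stand-in for the player's state; the real class guards
 * its AudioQueueRef with @synchronized(self) in Objective-C. */
typedef struct {
    pthread_mutex_t lock; /* plays the role of @synchronized(self) */
    bool finishing;       /* plays the role of [self isFinishing]  */
    void *audioQueue;     /* stand-in for the AudioQueueRef        */
} StreamPlayer;

/* Touch the queue only inside the critical section, and only after
 * checking the finishing flag. Returns true if the operation ran. */
bool player_pause(StreamPlayer *p) {
    bool attempted = false;
    pthread_mutex_lock(&p->lock);
    if (!p->finishing && p->audioQueue) {
        /* ... AudioQueuePause(p->audioQueue) would go here ... */
        attempted = true;
    }
    pthread_mutex_unlock(&p->lock);
    return attempted;
}
```

Performing the finishing check outside the lock would race with the internal thread tearing the queue down, which is why the documentation insists both happen together.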
-
EZAudio is a simple, intuitive framework for iOS and OSX. The goal of EZAudio was to provide a modular, cross-platform framework to simplify performing everyday audio operations like getting microphone input, creating audio waveforms, recording/playing audio files, etc. The visualization tools like the EZAudioPlot and EZAudioPlotGL were created to plug right into the framework’s various components and provide highly optimized drawing routines that work in harmony with audio callback loops. All components retain the same namespace whether you’re on an iOS device or a Mac computer, so an EZAudioPlot understands it will subclass a UIView on an iOS device or an NSView on a Mac.
Class methods for EZAudio are provided as utility methods used throughout the other modules within the framework. For instance, these methods help make sense of error codes (checkResult:operation:), map values between coordinate systems (MAP:leftMin:leftMax:rightMin:rightMax:), calculate root mean squared values for buffers (RMS:length:), etc.
Warning
As of 1.0 these methods have been moved over to EZAudioUtilities to allow using specific modules without requiring the whole library.
Declaration
Objective-C
@interface EZAudio : NSObject
Swift
class EZAudio : NSObject
-
The EZAudioDevice provides an interface for getting the available input and output hardware devices on iOS and OSX. On iOS the EZAudioDevice uses the available devices found from the AVAudioSession, while on OSX the EZAudioDevice wraps the AudioHardware API to find any devices that are connected, including the built-in devices (for instance, Built-In Microphone, Display Audio). Since the AVAudioSession and AudioHardware APIs are quite different, the EZAudioDevice has different properties available on each platform. The EZMicrophone now supports setting any specific EZAudioDevice from the inputDevices function.
Declaration
Objective-C
@interface EZAudioDevice : NSObject
Swift
class EZAudioDevice : NSObject
-
The EZAudioDisplayLink provides a cross-platform (iOS and Mac) abstraction over the CADisplayLink for iOS and CVDisplayLink for Mac. The purpose of this class is to provide an accurate timer for views that need to redraw themselves at 60 fps. This class is used by the EZAudioPlot and, eventually, the EZAudioPlotGL to provide a timer mechanism to draw real-time plots.
Declaration
Objective-C
@interface EZAudioDisplayLink : NSObject
Swift
class EZAudioDisplayLink : NSObject
-
The EZAudioFFT provides a base class to quickly calculate the FFT of incoming audio data using the Accelerate framework. In addition, the EZAudioFFT contains an EZAudioFFTDelegate to receive an event anytime an FFT is computed.
Declaration
Objective-C
@interface EZAudioFFT : NSObject
Swift
class EZAudioFFT : NSObject
-
The EZAudioFFTRolling, a subclass of EZAudioFFT, provides a class to calculate an FFT for an incoming audio signal while maintaining a history of audio data to allow much higher resolution FFTs. For instance, the EZMicrophone typically provides 512 frames at a time, but you would probably want to provide 2048 or 4096 frames for a decent looking FFT if you’re trying to extract precise frequency components. You will typically be using this class for variable length FFTs instead of the EZAudioFFT base class.
Declaration
Objective-C
@interface EZAudioFFTRolling : EZAudioFFT
Swift
class EZAudioFFTRolling : EZAudioFFT
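The rolling-history idea above can be sketched in plain C: keep a fixed window of the most recent samples and slide it forward as each smaller buffer arrives, so an FFT can always be taken over the full window. The rolling_append function and the 2048-sample WINDOW_SIZE are illustrative assumptions, not part of the EZAudioFFTRolling API:

```c
#include <string.h>
#include <stddef.h>

#define WINDOW_SIZE 2048 /* assumed FFT window size */

/* Slide the history window: drop the oldest samples and append the
 * newest buffer, so WINDOW_SIZE samples are always available even
 * though input arrives in smaller chunks (e.g. 512 frames). */
void rolling_append(float history[WINDOW_SIZE],
                    const float *buffer, size_t bufferSize) {
    if (bufferSize >= WINDOW_SIZE) { /* keep only the newest samples */
        memcpy(history, buffer + (bufferSize - WINDOW_SIZE),
               WINDOW_SIZE * sizeof(float));
        return;
    }
    /* shift out the oldest bufferSize samples ... */
    memmove(history, history + bufferSize,
            (WINDOW_SIZE - bufferSize) * sizeof(float));
    /* ... and append the incoming buffer at the end */
    memcpy(history + (WINDOW_SIZE - bufferSize),
           buffer, bufferSize * sizeof(float));
}
```

With a window like this, a 512-frame microphone callback can still drive a 2048-point FFT for finer frequency resolution.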
-
The EZAudioFile provides a lightweight and intuitive way to asynchronously interact with audio files. These interactions include reading audio data, seeking within an audio file, getting information about the file, and pulling the waveform data for visualizing the contents of the audio file. The EZAudioFileDelegate provides event callbacks for when reads, seeks, and various updates happen within the audio file to allow the caller to interact with the action in meaningful ways. Common use cases here could be to read the audio file’s data as AudioBufferList structures for output (see EZOutput) and visualizing the audio file’s data as a float array using an audio plot (see EZAudioPlot).
Declaration
Objective-C
@interface EZAudioFile : NSObject <NSCopying>
Swift
class EZAudioFile : NSObject, NSCopying
-
Undocumented
-
Undocumented
-
The EZAudioPlayer provides an interface that combines the EZAudioFile and EZOutput to play local audio files. This class acts as the master delegate (the EZAudioFileDelegate) over whatever EZAudioFile instance it is using for playback (the audioFile property), as well as the EZOutputDelegate and EZOutputDataSource over whatever EZOutput instance is set as the output. Classes that want the EZAudioFileDelegate callbacks should instead implement the EZAudioPlayerDelegate on the EZAudioPlayer instance. Since 0.5.0 the EZAudioPlayer offers notifications in addition to the usual delegate methods to allow multiple receivers to observe the EZAudioPlayer’s state changes, since one player will typically be used in one application. The EZAudioPlayerDelegate, the delegate, provides callbacks for high-frequency events that simply wrap the EZAudioFileDelegate and EZOutputDelegate callbacks, providing the audio buffer played as well as position updates (you will typically have one scrub bar in an application).
Declaration
Objective-C
@interface EZAudioPlayer : NSObject <EZAudioFileDelegate, EZOutputDataSource, EZOutputDelegate>
Swift
class EZAudioPlayer : NSObject, EZAudioFileDelegate, EZOutputDataSource, EZOutputDelegate
-
The EZAudioPlotWaveformLayer is a lightweight subclass of the CAShapeLayer that allows implicit animations on the path key.
Declaration
Objective-C
@interface EZAudioPlotWaveformLayer : CAShapeLayer
Swift
class EZAudioPlotWaveformLayer : CAShapeLayer
-
EZAudioPlot, a subclass of EZPlot, is a cross-platform (iOS and OSX) class that plots an audio waveform using Core Graphics. The caller provides a constant stream of updated audio data to the updateBuffer:withBufferSize: function, which in turn is plotted in one of the plot types:
- Buffer (EZPlotTypeBuffer) - A plot that only consists of the current buffer and buffer size from the last call to updateBuffer:withBufferSize:. This looks similar to the default openFrameworks input audio example.
- Rolling (EZPlotTypeRolling) - A plot that consists of a rolling history of values averaged from each buffer. This is the traditional waveform look.
Parent Methods and Properties
See EZPlot for full API methods and properties (colors, plot type, update function).
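One way the rolling plot's "values averaged from each buffer" reduction can work is sketched below in plain C: each incoming buffer collapses to a single plot point, the mean of the absolute sample values. The rolling_point function is a hypothetical helper for illustration, not part of the EZAudioPlot API:

```c
#include <math.h>
#include <stddef.h>

/* Reduce one audio buffer to a single plot point: the mean of the
 * absolute sample values. A rolling plot appends one such point per
 * buffer to build its scrolling waveform history. */
float rolling_point(const float *buffer, size_t bufferSize) {
    float sum = 0.0f;
    for (size_t i = 0; i < bufferSize; i++)
        sum += fabsf(buffer[i]);
    return bufferSize ? sum / (float)bufferSize : 0.0f;
}
```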
-
Undocumented
-
The EZAudioUtilities class provides a set of class-level utility methods used throughout EZAudio to handle common operations such as allocating audio buffers and structures, creating various types of AudioStreamBasicDescription structures, string helpers for formatting and debugging, various math utilities, a very handy check result function (used everywhere!), and helpers for dealing with circular buffers. These were previously on the EZAudio class, but as of the 1.0 release have been moved here so the whole EZAudio class is not needed when using only certain modules.
Declaration
Objective-C
@interface EZAudioUtilities : NSObject
Swift
class EZAudioUtilities : NSObject
-
The EZMicrophone provides a component to get audio data from the default device microphone. On OSX this is the default selected input device in the system preferences, while on iOS it uses the default RemoteIO audio unit. The microphone data is converted to a float buffer array and returned back to the caller via the EZMicrophoneDelegate protocol.
Declaration
Objective-C
@interface EZMicrophone : NSObject <EZOutputDataSource>
Swift
class EZMicrophone : NSObject, EZOutputDataSource
-
The EZOutput component provides a generic output to glue all the other EZAudio components together and push whatever sound you’ve created to the default output device (think opposite of the microphone). The EZOutputDataSource provides the required AudioBufferList needed to populate the output buffer, while the EZOutputDelegate provides the same kind of mechanism as the EZMicrophoneDelegate or EZAudioFileDelegate in that you will receive a callback that provides non-interleaved, float data for visualizing the output (done using an internal float converter). As of 0.4.0 the EZOutput has been simplified to a single EZOutputDataSource method and now uses an AUGraph to provide format conversion from the inputFormat to the playback graph’s clientFormat linear PCM formats, mixer controls for setting volume and pan settings, hooks to add in any number of effect audio units (see the connectOutputOfSourceNode:sourceNodeOutputBus:toDestinationNode:destinationNodeInputBus:inGraph: subclass method), and hardware device toggling (via EZAudioDevice).
Declaration
Objective-C
@interface EZOutput : NSObject
Swift
class EZOutput : NSObject
-
Undocumented
-
The EZRecorder provides a flexible way to create an audio file and append raw audio data to it. The EZRecorder will convert the incoming audio on the fly to the destination format so no conversion is needed between this and any other component. Right now the only supported output format is ‘caf’. Each output file should have its own EZRecorder instance (think 1 EZRecorder = 1 audio file).
Declaration
Objective-C
@interface EZRecorder : NSObject
Swift
class EZRecorder : NSObject