Android Jelly Bean (4.2):
Low-latency audio:
Android 4.2 improves support for low-latency audio playback, building on the improvements made in the Android 4.1 release for audio output latency using the OpenSL ES, SoundPool, and ToneGenerator APIs. These improvements depend on hardware support: devices that offer these low-latency audio features can advertise their support to apps through a hardware feature constant. New AudioManager APIs are provided to query the native audio sample rate and buffer size, for use on devices that claim this feature.
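For example, a minimal sketch of checking the feature flag and reading the new AudioManager properties (the class name and log tag are illustrative):

import android.content.Context;
import android.content.pm.PackageManager;
import android.media.AudioManager;
import android.util.Log;

// Sketch: check the low-latency feature flag and read the native output
// sample rate and buffer size exposed by the new AudioManager properties.
public final class LowLatencyAudioInfo {
    public static void logAudioProperties(Context context) {
        boolean lowLatency = context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);

        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
        String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);

        Log.d("AudioInfo", "low latency: " + lowLatency
                + ", native sample rate: " + sampleRate
                + ", frames per buffer: " + framesPerBuffer);
    }
}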
Android Jelly Bean (4.1):
Media codecs: The MediaCodec class provides access to low-level media codecs for encoding and decoding your media. You can instantiate a MediaCodec by calling createEncoderByType() to encode media or createDecoderByType() to decode media. Each of these methods takes a MIME type for the kind of media you want to encode or decode, such as "video/3gpp" or "audio/vorbis".
With an instance of MediaCodec created, you can then call configure() to specify properties such as the media format or whether or not the content is encrypted.
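As a rough sketch, creating and configuring an H.264 video decoder might look like the following (the MIME type, the 1280x720 placeholder format, and the output Surface are assumptions for illustration, not requirements of the API):

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

import java.io.IOException;

// Sketch: create and configure an H.264 video decoder with MediaCodec.
public final class DecoderSetup {
    public static MediaCodec createVideoDecoder(Surface outputSurface) throws IOException {
        // Placeholder format; in practice this usually comes from the media source.
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);

        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        // No MediaCrypto (content is not encrypted); pass
        // MediaCodec.CONFIGURE_FLAG_ENCODE here when configuring an encoder instead.
        decoder.configure(format, outputSurface, null, 0);
        decoder.start();
        return decoder;
    }
}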
Record audio on cue:
The new startRecording() method allows you to begin audio recording based on a cue defined by a MediaSyncEvent. The MediaSyncEvent specifies an audio session (such as one defined by MediaPlayer) which, when complete, triggers the audio recorder to begin recording. For example, you can use this functionality to play an audio tone that indicates the beginning of a recording session; recording then begins automatically, so you don't have to manually synchronize the tone with the beginning of recording.
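A sketch of that flow, assuming a MediaPlayer that plays the cue tone and a plain PCM AudioRecord (the sample rate and channel configuration below are placeholders, and the RECORD_AUDIO permission is required):

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaPlayer;
import android.media.MediaRecorder;
import android.media.MediaSyncEvent;

// Sketch: start recording only after the cue tone's audio session finishes playing.
// Assumes tonePlayer is already prepared with the tone to play.
public final class CuedRecording {
    public static AudioRecord recordAfterTone(MediaPlayer tonePlayer) {
        int sampleRate = 44100; // placeholder capture format
        int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                sampleRate, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufferSize);

        // The recorder is armed now but only starts capturing when playback
        // on the tone player's audio session completes.
        MediaSyncEvent sync =
                MediaSyncEvent.createEvent(MediaSyncEvent.SYNC_EVENT_PRESENTATION_COMPLETE);
        sync.setAudioSessionId(tonePlayer.getAudioSessionId());

        recorder.startRecording(sync);
        tonePlayer.start();
        return recorder;
    }
}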
Timed text tracks:
The MediaPlayer now handles both in-band and out-of-band text tracks. In-band text tracks come as a text track within an MP4 or 3GPP media source. Out-of-band text tracks can be added as an external text source via the addTimedTextSource() method. After all external text track sources are added, getTrackInfo() should be called to get the refreshed list of all available tracks in a data source.
To set the track to use with the MediaPlayer, you must call selectTrack(), using the index position for the track you want to use.
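For illustration, a sketch that adds an external SubRip file (the srtPath parameter is a placeholder) to an already-prepared MediaPlayer and selects the first timed-text track it finds:

import android.media.MediaPlayer;
import android.media.MediaPlayer.TrackInfo;
import android.media.TimedText;
import android.util.Log;

import java.io.IOException;

// Sketch: attach an external SubRip subtitle file to a prepared MediaPlayer
// and select the first timed-text track reported by getTrackInfo().
public final class SubtitleHelper {
    public static void addAndSelectSubtitles(MediaPlayer player, String srtPath)
            throws IOException {
        player.addTimedTextSource(srtPath, MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);

        // Refresh the track list after adding the external source.
        TrackInfo[] tracks = player.getTrackInfo();
        for (int i = 0; i < tracks.length; i++) {
            if (tracks[i].getTrackType() == TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT) {
                player.selectTrack(i);
                break;
            }
        }

        // Receive the timed text as playback reaches each cue.
        player.setOnTimedTextListener(new MediaPlayer.OnTimedTextListener() {
            @Override
            public void onTimedText(MediaPlayer mp, TimedText text) {
                if (text != null) {
                    Log.d("Subtitles", text.getText());
                }
            }
        });
    }
}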
Audio effects: The AudioEffect class now supports additional audio pre-processing types when capturing audio (a short code sketch follows the list):
- Acoustic Echo Canceler (AEC) with AcousticEchoCanceler removes the contribution of the signal received from the remote party from the captured audio signal.
- Automatic Gain Control (AGC) with AutomaticGainControl automatically normalizes the output of the captured signal.
- Noise Suppressor (NS) with NoiseSuppressor removes background noise from the captured signal.
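A minimal sketch of enabling all three pre-processors on a capture session; the session id would typically come from AudioRecord.getAudioSessionId(), and support for each effect is device-dependent, so availability is checked before creating it:

import android.media.audiofx.AcousticEchoCanceler;
import android.media.audiofx.AutomaticGainControl;
import android.media.audiofx.NoiseSuppressor;

// Sketch: attach the capture pre-processing effects to an audio session.
public final class CaptureEffects {
    public static void enableAll(int audioSessionId) {
        if (AcousticEchoCanceler.isAvailable()) {
            AcousticEchoCanceler aec = AcousticEchoCanceler.create(audioSessionId);
            if (aec != null) aec.setEnabled(true);
        }
        if (AutomaticGainControl.isAvailable()) {
            AutomaticGainControl agc = AutomaticGainControl.create(audioSessionId);
            if (agc != null) agc.setEnabled(true);
        }
        if (NoiseSuppressor.isAvailable()) {
            NoiseSuppressor ns = NoiseSuppressor.create(audioSessionId);
            if (ns != null) ns.setEnabled(true);
        }
    }
}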
Gapless playback:
You can now perform gapless playback between two separate MediaPlayer objects. At any time before your first MediaPlayer finishes, call setNextMediaPlayer() and Android attempts to start the second player the moment that the first one stops.
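A sketch of the hand-off, assuming two locally playable audio files (both path arguments are placeholders):

import android.media.MediaPlayer;

import java.io.IOException;

// Sketch: queue a second MediaPlayer so playback continues without a gap.
public final class GaplessPlayback {
    public static void playBackToBack(String firstTrackPath, String secondTrackPath)
            throws IOException {
        MediaPlayer first = new MediaPlayer();
        first.setDataSource(firstTrackPath);
        first.prepare();

        MediaPlayer second = new MediaPlayer();
        second.setDataSource(secondTrackPath);
        second.prepare();

        // The second player starts the moment the first one finishes.
        first.setNextMediaPlayer(second);
        first.start();
    }
}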
Android Ice Cream Sandwich (4.0):
Low-level streaming multimedia:
Android 4.0 provides a direct, efficient path for low-level streaming multimedia. The new path is ideal for applications that need to maintain complete control over media data before passing it to the platform for presentation. For example, media applications can now retrieve data from any source, apply proprietary encryption/decryption, and then send the data to the platform for display.
To support this low-level streaming, the platform introduces a new native API based on Khronos OpenMAX AL 1.0.1.
Audio remote controls: Android 4.0 adds a new audio remote control API that lets media applications integrate with playback controls that are displayed in a remote view. Media applications can integrate with a remote music playback control that’s built into the platform’s lock screen, allowing users to control song selection and playback without having to unlock and navigate to the music app.
Using the audio remote control API, any music or media app can register to receive media button events from the remote control and then manage play state accordingly. The application can also supply metadata to the remote control, such as album art or image, play state, track number and description, duration, genre, and more.
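A sketch of one way to wire this up with RemoteControlClient and registerMediaButtonEventReceiver(); "com.example.player.MediaButtonReceiver" is a hypothetical broadcast receiver that would be declared in the manifest, and the metadata values are placeholders:

import android.app.PendingIntent;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.media.AudioManager;
import android.media.MediaMetadataRetriever;
import android.media.RemoteControlClient;

// Sketch: register a RemoteControlClient so the lock-screen transport controls
// show this app's metadata and route media-button events to its receiver.
public final class LockScreenControls {
    public static RemoteControlClient register(Context context) {
        AudioManager audioManager =
                (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

        // Hypothetical receiver class name, declared in the app's manifest.
        ComponentName receiver =
                new ComponentName(context, "com.example.player.MediaButtonReceiver");
        audioManager.registerMediaButtonEventReceiver(receiver);

        Intent mediaButtonIntent = new Intent(Intent.ACTION_MEDIA_BUTTON);
        mediaButtonIntent.setComponent(receiver);
        PendingIntent pi = PendingIntent.getBroadcast(context, 0, mediaButtonIntent, 0);

        RemoteControlClient client = new RemoteControlClient(pi);
        audioManager.registerRemoteControlClient(client);

        // Publish play state and track metadata (placeholder values) to the remote view.
        client.setPlaybackState(RemoteControlClient.PLAYSTATE_PLAYING);
        client.editMetadata(true)
                .putString(MediaMetadataRetriever.METADATA_KEY_TITLE, "Track title")
                .putString(MediaMetadataRetriever.METADATA_KEY_ARTIST, "Artist name")
                .putLong(MediaMetadataRetriever.METADATA_KEY_DURATION, 240000L)
                .apply();

        return client;
    }
}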
New media codecs and containers:
Android 4.0 adds support for additional media types and containers to give developers access to the formats they need. For high-quality compressed images, the media framework adds support for WebP content. For video, the framework now supports streaming VP8 content. For streaming multimedia, the framework supports HTTP Live streaming protocol version 3 and encoding of ADTS-contained AAC content. Additionally, developers can now use Matroska containers for Vorbis and VP8 content.