MediaSync Class

Definition

The MediaSync class can be used to synchronously play audio and video streams.

C#

    [Android.Runtime.Register("android/media/MediaSync", ApiSince=23, DoNotGenerateAcw=true)]
    public sealed class MediaSync : Java.Lang.Object

F#

    [<Android.Runtime.Register("android/media/MediaSync", ApiSince=23, DoNotGenerateAcw=true)>]
    type MediaSync = class
        inherit Object
Inheritance
Java.Lang.Object → MediaSync
Attributes
RegisterAttribute

Remarks

The MediaSync class can be used to synchronously play audio and video streams. It can also be used to play an audio-only or video-only stream.

MediaSync is generally used like this:

    MediaSync sync = new MediaSync();
    sync.setSurface(surface);
    Surface inputSurface = sync.createInputSurface();
    ...
    // MediaCodec videoDecoder = ...;
    videoDecoder.configure(format, inputSurface, ...);
    ...
    sync.setAudioTrack(audioTrack);
    sync.setCallback(new MediaSync.Callback() {
        @Override
        public void onAudioBufferConsumed(MediaSync sync, ByteBuffer audioBuffer, int bufferId) {
            ...
        }
    }, null);
    // This needs to be done since sync is paused on creation.
    sync.setPlaybackParams(new PlaybackParams().setSpeed(1.f));

    for (;;) {
        ...
        // Send video frames to the surface for rendering, e.g., call
        // videoDecoder.releaseOutputBuffer(videoOutputBufferIx, videoPresentationTimeNs);
        // More details are available below.
        ...
        sync.queueAudio(audioByteBuffer, bufferId, audioPresentationTimeUs); // non-blocking.
        // The audioByteBuffer and bufferId will be returned via the callback.
        // More details are available below.
        ...
    }
    sync.setPlaybackParams(new PlaybackParams().setSpeed(0.f));
    sync.release();
    sync = null;

    // The following code snippet illustrates how video/audio raw frames are created by
    // MediaCodec instances, how they are fed to MediaSync and how they are returned by MediaSync.
    // This is the callback from MediaCodec.
    onOutputBufferAvailable(MediaCodec codec, int bufferId, BufferInfo info) {
        // ...
        if (codec == videoDecoder) {
            // The surface timestamp must contain the media presentation time in nanoseconds.
            codec.releaseOutputBuffer(bufferId, 1000 * info.presentationTimeUs);
        } else {
            ByteBuffer audioByteBuffer = codec.getOutputBuffer(bufferId);
            sync.queueAudio(audioByteBuffer, bufferId, info.presentationTimeUs);
        }
        // ...
    }

    // This is the callback from MediaSync.
    onAudioBufferConsumed(MediaSync sync, ByteBuffer buffer, int bufferId) {
        // ...
        audioDecoder.releaseOutputBuffer(bufferId, false);
        // ...
    }

The client needs to configure the corresponding sink by setting the Surface and/or AudioTrack, based on the type of stream it will play.

For video, the client needs to call createInputSurface() to obtain a surface on which it will render video frames.
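
In the C# binding the video path is wired up the same way. The following is a minimal sketch rather than a complete implementation; outputSurface, format, and videoDecoder (an Android.Media.MediaCodec) are assumed to be created elsewhere and are illustrative names only.

    MediaSync sync = new MediaSync();
    sync.SetSurface(outputSurface);                    // display surface, e.g. from a SurfaceView
    Surface inputSurface = sync.CreateInputSurface();  // the video decoder renders into this surface
    videoDecoder.Configure(format, inputSurface, null, (MediaCodecConfigFlags)0);  // no config flags for a decoder
    videoDecoder.Start();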

For audio, the client needs to set up the audio track correctly, e.g., using AudioTrack.MODE_STREAM. The audio buffers are sent to MediaSync directly via queueAudio(), and are returned to the client asynchronously via Callback.onAudioBufferConsumed(). The client should not modify an audio buffer until it is returned.
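
A corresponding C# sketch of the audio path is shown below, assuming the binding's callback mirrors the Java signature above; audioTrack (an AudioTrack created in streaming mode), audioDecoder, and the variables in the trailing fragment are illustrative only.

    class AudioConsumedCallback : MediaSync.Callback
    {
        readonly MediaCodec audioDecoder;

        public AudioConsumedCallback(MediaCodec audioDecoder)
        {
            this.audioDecoder = audioDecoder;
        }

        public override void OnAudioBufferConsumed(MediaSync sync, ByteBuffer audioBuffer, int bufferId)
        {
            // MediaSync has consumed the buffer, so it can now be returned to the codec.
            audioDecoder.ReleaseOutputBuffer(bufferId, false);
        }
    }

    sync.SetAudioTrack(audioTrack);
    sync.SetCallback(new AudioConsumedCallback(audioDecoder), null);
    // From the audio decoder's output handling, hand each buffer to MediaSync:
    sync.QueueAudio(audioByteBuffer, bufferId, presentationTimeUs);  // non-blocking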

The client can optionally pre-fill audio/video buffers by setting the playback rate to 0.0 and then feeding audio/video buffers to the corresponding components. This can reduce the chance of an initial underrun.
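
For example, the pre-fill pattern might look like this in C# (a sketch that reuses the sync object from the snippets above):

    sync.PlaybackParams = new PlaybackParams().SetSpeed(0.0f);  // remain paused while buffers are queued
    // ... queue some audio via QueueAudio and release a few decoded video frames to the input surface ...
    sync.PlaybackParams = new PlaybackParams().SetSpeed(1.0f);  // start synchronized playback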

Java documentation for android.media.MediaSync.

Portions of this page are modifications based on work created and shared by the Android Open Source Project and used according to terms described in the Creative Commons 2.5 Attribution License.

Constructors

MediaSync()

Class constructor.

Properties

Class

Returns the runtime class of this Object.

(Inherited from Object)
Handle

The handle to the underlying Android instance.

(Inherited from Object)
JniIdentityHashCode (Inherited from Object)
JniPeerMembers
PeerReference (Inherited from Object)
PlaybackParams

Gets the playback rate using PlaybackParams. -or- Sets the playback rate using PlaybackParams.

SyncParams

Gets the A/V sync mode. -or- Sets the A/V sync mode.

ThresholdClass

This API supports the Mono for Android infrastructure and is not intended to be used directly from your code.

(Inherited from Object)
ThresholdType

This API supports the Mono for Android infrastructure and is not intended to be used directly from your code.

(Inherited from Object)
Timestamp

Gets the current playback position.

Methods

Clone()

Creates and returns a copy of this object.

(Inherited from Object)
CreateInputSurface()

Requests a Surface to use as the input.

Dispose() (Inherited from Object)
Dispose(Boolean) (Inherited from Object)
Equals(Object)

Indicates whether some other object is "equal to" this one.

(Inherited from Object)
Flush()

Flushes all buffers from the sync object.

GetHashCode()

Returns a hash code value for the object.

(Inherited from Object)
JavaFinalize()

Called by the garbage collector on an object when garbage collection determines that there are no more references to the object.

(Inherited from Object)
Notify()

Wakes up a single thread that is waiting on this object's monitor.

(Inherited from Object)
NotifyAll()

Wakes up all threads that are waiting on this object's monitor.

(Inherited from Object)
QueueAudio(ByteBuffer, Int32, Int64)

Queues the audio data asynchronously for playback (AudioTrack must be in streaming mode).

Release()

Make sure you call this when you're done to free up any opened component instance instead of relying on the garbage collector to do this for you at some point in the future.

SetAudioTrack(AudioTrack)

Sets the audio track for MediaSync.

SetCallback(MediaSync+Callback, Handler)

Sets an asynchronous callback for actionable MediaSync events.

SetHandle(IntPtr, JniHandleOwnership)

Sets the Handle property.

(Inherited from Object)
SetOnErrorListener(MediaSync+IOnErrorListener, Handler)

Sets an asynchronous callback for error events.

SetSurface(Surface)

Sets the output surface for MediaSync.

ToArray<T>() (Inherited from Object)
ToString()

Returns a string representation of the object.

(Inherited from Object)
UnregisterFromRuntime() (Inherited from Object)
Wait()

Causes the current thread to wait until it is awakened, typically by being notified or interrupted.

(Inherited from Object)
Wait(Int64, Int32)

Causes the current thread to wait until it is awakened, typically by being notified or interrupted, or until a certain amount of real time has elapsed.

(Inherited from Object)
Wait(Int64)

Causes the current thread to wait until it is awakened, typically by being notified or interrupted, or until a certain amount of real time has elapsed.

(Inherited from Object)

Explicit Interface Implementations

IJavaPeerable.Disposed() (Inherited from Object)
IJavaPeerable.DisposeUnlessReferenced() (Inherited from Object)
IJavaPeerable.Finalized() (Inherited from Object)
IJavaPeerable.JniManagedPeerState (Inherited from Object)
IJavaPeerable.SetJniIdentityHashCode(Int32) (Inherited from Object)
IJavaPeerable.SetJniManagedPeerState(JniManagedPeerStates) (Inherited from Object)
IJavaPeerable.SetPeerReference(JniObjectReference) (Inherited from Object)

Extension Methods

JavaCast<TResult>(IJavaObject)

Performs an Android runtime-checked type conversion.

JavaCast<TResult>(IJavaObject)
GetJniTypeName(IJavaPeerable)

Applies to