CameraStreamer (v2.1)

Hints:

  • On Android, CameraStreamer renders video frames directly into a Surface, while on iOS it returns video frames of the desired output type.
  • When using multiple CameraStreamer instances, it is generally advised to mute audio on all of them, or on all except one.
  • iOS specific: CameraStreamer has an additional outputContentType field which determines the type of the output video frames (currently UIImage or CIImage).
  • Android specific: the CameraStreamer constructor takes a Surface parameter (most commonly constructed from the SurfaceTexture returned by TextureView::getSurfaceTexture()).
  • Android specific: the takeSnapshot() method is absent because the video decoder works directly with the passed Surface object and therefore has no access to raw frames. There is still an easy and convenient way to take a snapshot: TextureView::getBitmap() on the same view that holds the Surface from the previous step (see the sketch after this list).
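
A minimal Android sketch of that snapshot path in Kotlin, assuming the streamer's Surface was built from a TextureView available as textureView (illustrative name):

    import android.graphics.Bitmap
    import android.view.Surface
    import android.view.TextureView

    // Build the Surface handed to CameraStreamer from the TextureView's
    // SurfaceTexture; null while the SurfaceTexture is not yet available.
    fun surfaceFor(textureView: TextureView): Surface? =
        textureView.surfaceTexture?.let { Surface(it) }

    // "Snapshot" on Android: getBitmap() captures whatever frame is currently
    // rendered into that Surface (null if the SurfaceTexture is unavailable).
    fun takeSnapshot(textureView: TextureView): Bitmap? =
        textureView.bitmap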

class CameraStreamer

CameraStreamer(CameraStreamerDelegate delegate);

Constructs a CameraStreamer instance.

String rtspUrlString;

String representing the RTSP URL to be streamed. The streamer starts streaming as soon as this variable is set.
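
A minimal usage sketch in Kotlin; MyDelegate and the URL are illustrative, and on Android the constructor additionally receives the target Surface, as described in the hints above:

    // Construct the streamer; streaming starts once the RTSP URL is assigned.
    val streamer = CameraStreamer(MyDelegate())
    streamer.rtspUrlString = "rtsp://example.com/footage/42"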

play();

Resumes streaming: footage continues from the pause point, live continues from the current timestamp.

pause();

Pauses streaming (both live and footage).

seekToTime(float time);

Jumps to the given time in the [0 ... duration] range, in milliseconds, and starts playing (even if the footage was paused). Has no effect on live streaming.
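
A short Kotlin sketch of footage playback control using the calls above; streamer is assumed to be an already connected footage stream:

    streamer.pause()              // freeze footage playback
    streamer.seekToTime(30_000f)  // jump to 00:30; playback resumes even though it was paused
    if (!streamer.isPlaying()) {
        streamer.play()           // explicit resume, only needed if nothing restarted playback
    }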

boolean isPlaying();

Returns true if the playback has been started.

boolean isLiveStreaming();

Returns true if the stream is live rather than footage.

float getStartTime();

Returns the start time of the footage stream (0 for a live stream), in milliseconds since January 1, 1970 (Unix epoch).

float getEndTime();

Returns the end time of the footage stream (0 for a live stream), in milliseconds since January 1, 1970 (Unix epoch).

float getDuration();

Returns the duration of the footage stream (0 for a live stream), in milliseconds.

float getPosition();

Returns the current position in the [0 ... duration] range for a footage stream, or the current timestamp for a live stream, in milliseconds.

float getBufferTime();

Returns the buffer time of the footage stream (0 for a live stream), in milliseconds.
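
The getters above are enough to drive a progress bar for footage. A hedged Kotlin sketch; the live-stream guard relies on getDuration() returning 0 in that case:

    // Playback progress in [0, 1] for a footage stream; 0 for live streams.
    fun playbackProgress(streamer: CameraStreamer): Float {
        val duration = streamer.getDuration()
        return if (!streamer.isLiveStreaming() && duration > 0f)
            streamer.getPosition() / duration
        else
            0f
    }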

boolean muteAudio;

Mutes audio by ignoring the audio stream altogether.
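
A small Kotlin sketch of the multi-instance advice from the hints: keep audio on one primary streamer and mute the rest (the names are illustrative):

    fun muteAllExcept(primary: CameraStreamer, streamers: List<CameraStreamer>) {
        // Reference comparison: only the primary instance keeps its audio.
        streamers.forEach { it.muteAudio = it !== primary }
    }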

byte[] takeSnapshot();

iOS specific: returns the last decoded frame.

FrameDataType outputContentType;

iOS specific: the output content type of decoded frames (see FrameDataType).

protocol CameraStreamerDelegate

@optional onStarted();

Will be called when playback has effectively been started or resumed.

@optional onPaused();

Will be called when playback has been paused.

@optional onSoughtToTime(float time);

Will be called when playback has been sought to the given time.

@optional onEnded();

Will be called when playback has ended. Called only for footage streams, as a live stream cannot end.

@optional onDurationChanged(float duration);

Will be called when the footage stream's duration has changed.

@optional onBufferTimeChanged(float bufferTime);

Will be called when the footage stream's buffer time has changed.

@required onFrameDataReceived(FrameData frameData);

iOS specific: will be called with a FrameData instance when a new frame arrives.

@required onError(MobileSDKError error);

Will be called if there is an error in the streaming process.
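
A minimal Kotlin delegate sketch, assuming the Android binding exposes CameraStreamerDelegate as an interface whose @optional callbacks have empty default implementations (the exact shape is platform-specific):

    import android.util.Log

    class LoggingDelegate : CameraStreamerDelegate {
        override fun onStarted() {
            Log.d("CameraStreamer", "playback started or resumed")
        }

        override fun onEnded() {
            Log.d("CameraStreamer", "footage playback reached the end")
        }

        // iOS-specific callback; on Android frames go straight to the Surface,
        // so this is assumed to stay empty there.
        override fun onFrameDataReceived(frameData: FrameData) {}

        override fun onError(error: MobileSDKError) {
            Log.e("CameraStreamer", "streaming error: $error")
        }
    }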

enum FrameDataType (iOS specific)

case UIImage
case CIImage

class FrameData

FrameDataType getType();

iOS specific: returns the type of the frame's data.

byte[] getData();

Returns the data associated with the frame.

float getTimestamp();

Returns the timestamp associated with the frame.