
Java Media Framework

The Java Media Framework (JMF) is an application programming interface (API) for incorporating time-based media into Java applications and applets.

The JMF 1.0 API (the Java Media Player API) enabled programmers to develop Java programs that presented time-based media.

The JMF 2.0 API extends the framework to provide support for capturing and storing media data, controlling the type of processing that is performed during playback, and performing custom processing on media data streams.

JMF 2.0 also defines a plug-in API that allows developers to customize and extend JMF functionality.

JMF Media Processing Model

Media Streams

Media streams often contain multiple channels, called tracks
Example: an MPEG-1 file usually has 2 tracks:
  an audio track
  a video track
Demultiplexing extracts the individual tracks from a stream; multiplexing combines tracks into a single stream

Example

Process an MPEG-1 A/V media stream
  Transcode the video track to H.263
  Transcode the audio track to GSM

Steps:
1) Demultiplex to obtain the tracks
2) Decompress the video track
3) Compress it using H.263
4) Decompress the audio track
5) Compress it using GSM
6) Multiplex the two tracks
7) Save to a file

JMF Design Goals

Enable input, processing and output of time-based media

Provides a common cross-platform API for accessing underlying media frameworks

Extensible – support additional content types and formats, optimize handling of supported formats, create new presentation mechanisms

Supported Content Types

Supported types – not always both decode and encode; there are differences between the platform-independent and platform-dependent versions

Audio: WAV, GSM, MIDI, etc.

Image: JPEG, etc.

Video: H.261, H.263, MPEG-1, QuickTime, AVI, etc.

Recording, processing, and presenting time-based media

High-level Architecture

Some JMF Base Interfaces

Clock, Controller, Control

Time Model

The Clock interface defines basic timing and synchronization operations
A Clock contains a TimeBase
  The time-base time is based on the system clock and simply provides the current time
The Clock marks time for a particular media stream
  The media time is the current position within that media stream

Time Model

Clock

Playback rate: how fast the Clock is running in relation to its TimeBase
Examples:
  a rate of 1.0 represents normal running time
  a rate of 2.0 means the presentation will run at twice the normal rate

Clock transform:
  MediaTime = MediaStartTime + Rate × (TimeBaseTime − TimeBaseStartTime)
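A minimal sketch of this transform in plain Java (the variable names and values below are illustrative, not part of the JMF API):

  double mediaStartTime    = 5.0;   // seconds into the media where playback began
  double timeBaseStartTime = 0.0;   // time-base time when the Clock was started
  double timeBaseTime      = 2.0;   // current time-base time
  double rate              = 1.0;   // playback rate

  double mediaTime = mediaStartTime + rate * (timeBaseTime - timeBaseStartTime);
  // rate 1.0 -> the media position advances in step with the time base
  // rate 2.0 -> the media position advances twice as fast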

Example

Example: we have a 20 sec MPEG video stream
MediaStartTime = 10 secs, TimeBaseTime = 3 secs
TimeBaseStartTime = 0 secs, so TimeBaseTime − TimeBaseStartTime = 3 secs

MediaTime = MediaStartTime + Rate × (TimeBaseTime − TimeBaseStartTime)

So if Rate = 1.0, MediaTime = ??
Alternatively, if Rate = −2.0, MediaTime = ??

Achieving Synchronization

Example: want to force a video renderer to sync to the time base of an audio renderer

  x = audioRenderer.getTimeBase();
  videoRenderer.setTimeBase(x);

Both objects would now use the same source of time.
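A minimal sketch of the same idea with two realized Players (audioPlayer and videoPlayer are assumed to exist; setTimeBase can reject an incompatible time base):

  import javax.media.*;

  TimeBase tb = audioPlayer.getTimeBase();   // both Players must be Realized
  try {
      videoPlayer.setTimeBase(tb);           // video now follows the audio clock
  } catch (IncompatibleTimeBaseException e) {
      System.err.println("Cannot share this TimeBase: " + e);
  }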

Controller Interface

Defines the basic state and control mechanism for an object that controls, presents or captures time-based media

Two types of Controller: Players and Processors (considered later…)

Controllers

Controller lifecycle

Controller Events

JMF objects can post a MediaEvent
Events posted by a Controller:
  TransitionEvents – posted when a Controller changes state
  Change notification events, e.g. RateChangeEvent
  ControllerClosedEvents – posted when a Controller shuts down
There is a corresponding listener interface for each type of JMF object that can post MediaEvents

JMF Event Model

Controls

Mechanism for setting and querying attributes of an object

Certain objects expose Controls
  e.g. Controls are often used by PlugIns to provide access to their Control objects
Examples:
  FrameRateControl
  GainControl – can associate a listener to be notified when the volume changes
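For example, a minimal sketch using GainControl on a realized Player that has audio (player is assumed to exist):

  import javax.media.*;

  GainControl gain = player.getGainControl();   // null if the Player has no audio
  if (gain != null) {
      gain.setLevel(0.8f);                      // 0.0 = silence, 1.0 = full volume
      gain.addGainChangeListener(new GainChangeListener() {
          public void gainChange(GainChangeEvent e) {
              System.out.println("Volume changed, new level: " + e.getLevel());
          }
      });
  }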

Key objects in JMF

Managers, DataSources, Players, Processors, DataSinks

General Managers

Intermediary objects that enable new implementations of key interfaces
4 main types:
  Manager
  PlugInManager
  PackageManager
  CaptureDeviceManager

The Manager object

The Manager object is used for instantiating:
  DataSources – used to deliver time-based multimedia data
  Players – used to control and render multimedia data
  Processors – used to process data and output the processed data
  DataSinks – take a DataSource as input and render the output to a specified destination
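A minimal sketch of these factory methods (the media locations are illustrative, and the checked exceptions – NoPlayerException, NoProcessorException, NoDataSinkException, IOException – are omitted for brevity):

  import javax.media.*;
  import javax.media.protocol.DataSource;

  MediaLocator src = new MediaLocator("file:clip.mov");   // illustrative location

  DataSource ds  = Manager.createDataSource(src);         // delivers the media data
  Player player  = Manager.createPlayer(ds);              // controls and renders it

  // or, when the data needs processing before output:
  Processor proc = Manager.createProcessor(new MediaLocator("file:clip.mov"));
  // ... configure and realize proc, then hand its output to a DataSink:
  DataSink sink  = Manager.createDataSink(proc.getDataOutput(),
                                          new MediaLocator("file:out.mov"));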

The Manager Object

Data Model in JMF

Data Sources

Media players use DataSources to manage the transfer of media content
A DataSource encapsulates the location of the media and the protocol used to deliver it
Typically identified by a URL or a MediaLocator

Capture

Capture devices are represented as DataSources, e.g. microphone, video capture board, etc…
Devices can deliver multiple data streams
  e.g. audio and video from a camera
  e.g. multiple audio tracks from a recording session
You may then wish a single DataSource to contain multiple SourceStreams:
  Manager.createMergingDataSource(dataSources)   // takes an array of DataSources
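A minimal sketch of merging two capture streams into one DataSource (the capture locators are illustrative – real ones come from CaptureDeviceManager/CaptureDeviceInfo.getLocator(); checked exceptions such as IncompatibleSourceException are omitted):

  import javax.media.*;
  import javax.media.protocol.DataSource;

  DataSource audio = Manager.createDataSource(new MediaLocator("javasound://44100"));
  DataSource video = Manager.createDataSource(new MediaLocator("vfw://0"));

  DataSource merged =
      Manager.createMergingDataSource(new DataSource[] { audio, video });
  // "merged" now delivers both SourceStreams and can be handed to a Player or Processor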

Push and Pull Data Sources

Data sources can be categorized according to how data transfer is initiated:

Pull data source – the client initiates the data transfer, e.g. HTTP and FILE

Push data source – the server initiates the data transfer, e.g. broadcast and multicast media

Players

Processes an input stream and renders it at a precise time

Does not provide any control over the processing that it performs or how it renders the media data

Players

Player extends the Controller interface
  Has a lifecycle
  Sends media events

Player as a MediaHandler:
  player = Manager.createPlayer(myDataSource);
  player = Manager.createPlayer(myMediaLocator);
  player = Manager.createPlayer(myUrl);

Players

UI Components

Players provide access to UI components
A Player (or Processor) can provide two UI components:
  a visual component
  a control-panel component
These can be retrieved using the methods getVisualComponent() and getControlPanelComponent()
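A minimal sketch, assuming a realized Player and an AWT container (both methods return null if the corresponding component does not exist):

  import java.awt.*;

  Component visual  = player.getVisualComponent();
  Component control = player.getControlPanelComponent();

  if (visual != null)  add(visual,  BorderLayout.CENTER);
  if (control != null) add(control, BorderLayout.SOUTH);
  validate();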

Player States Continued

Players post transition events as they move between states (ControllerListener)
Is the Player in an appropriate state? Only certain methods make sense in certain states

  e.g. calling the getTimeBase method on an unrealized Player gives an error (NotRealizedError)
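A minimal sketch of such a guard, using the state constants defined on Controller:

  if (player.getState() >= Controller.Realized) {
      TimeBase tb = player.getTimeBase();   // safe: the Player is at least Realized
  } else {
      // calling getTimeBase() here would throw NotRealizedError
  }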

Processors

Can also be used to present media data
A specialized type of Player that provides control over the processing performed on the input media stream

Processing

Processor Stages

Additional Processor States

Two additional stand-by states: Configuring and Configured – once Configured, the TrackControls can be used

Processing Controls

For a given track, can control processing operations performed by the Processor by using the TrackControl for that track.

TrackControl[] tracks = processor.getTrackControls();

Can explicitly select the Effect, Codec and Renderer plug-ins to use:

tracks[1].setCodecChain(arrayOfCodecs);

Configuring the Processor

Consider using a Processor to transcode an (audio + video) QuickTime movie – changing the MPEG video track to H.263…

Processor p = Manager.createProcessor(dataSource);

p.configure();

p.setContentDescriptor(new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME));

TrackControl[] tcs = p.getTrackControls();

Returns an array, e.g. 2 track controls…

Configuring the Processor

Format f0 = new VideoFormat(VideoFormat.H263,
                            new Dimension(width, height),
                            Format.NOT_SPECIFIED,
                            Format.byteArray,
                            (float) frameRate);
Format f1 = new AudioFormat(AudioFormat.MPEG, 8000, 8, 2);
tcs[0].setFormat(f0);
tcs[1].setFormat(f1);
p.realize();
p.start();
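Note that configure() and realize() are asynchronous, so real code normally blocks until the matching transition event arrives. A minimal sketch of this pattern (along the lines of the standard JMF sample code, not shown on the slides; waitSync and stateOK are assumed fields of the enclosing class, which implements ControllerListener):

  private final Object waitSync = new Object();
  private boolean stateOK = true;

  // Block until the Processor reaches the given state (e.g. Processor.Configured).
  boolean waitForState(Processor p, int state) {
      synchronized (waitSync) {
          try {
              while (p.getState() < state && stateOK)
                  waitSync.wait();
          } catch (InterruptedException e) { }
      }
      return stateOK;
  }

  // ControllerListener callback that wakes up waitForState().
  public void controllerUpdate(ControllerEvent evt) {
      if (evt instanceof ConfigureCompleteEvent ||
          evt instanceof RealizeCompleteEvent ||
          evt instanceof PrefetchCompleteEvent) {
          synchronized (waitSync) { waitSync.notifyAll(); }
      } else if (evt instanceof ResourceUnavailableEvent) {
          synchronized (waitSync) { stateOK = false; waitSync.notifyAll(); }
      }
  }

Used as: p.configure(); waitForState(p, Processor.Configured); before calling setContentDescriptor and getTrackControls.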

Processor Summary

A Processor does not have to output data as a DataSource; such a Processor (i.e. one that renders the data) is effectively a configurable Player.

Data Storage and Transmission

DataSink
  Used to read data from a DataSource and render the media to an output destination
  Typical actions: write data to a file, across a network, etc.

Using the DataSink

MediaLocator dest = new MediaLocator("file://newfile.wav");
dsink = Manager.createDataSink(ds, dest);    // ds: the Processor's output DataSource
dsink.addDataSinkListener(this);
dsink.open();
p.start();
dsink.start();

Wait for the EndOfStream event, then close the DataSink and remove the listener:

dsink.close();
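A minimal sketch of the "wait for EndOfStream" step using an anonymous DataSinkListener (fileDone and waitFileSync are assumed fields of the enclosing class):

  import javax.media.datasink.*;

  dsink.addDataSinkListener(new DataSinkListener() {
      public void dataSinkUpdate(DataSinkEvent e) {
          if (e instanceof EndOfStreamEvent) {
              synchronized (waitFileSync) {
                  fileDone = true;
                  waitFileSync.notifyAll();   // wake up the waiting thread below
              }
          }
      }
  });

  synchronized (waitFileSync) {
      try {
          while (!fileDone)
              waitFileSync.wait();            // blocks until the sink has written everything
      } catch (InterruptedException ie) { }
  }
  dsink.close();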

Example

Applet Movie Player

Applet Movie Player

A simple Java applet that demonstrates how to create a media player with a media event listener. It plays the media clip right away and loops continuously.

<!-- Sample HTML

<applet code=TVApplet width=587 height=510>

<param name=file value="playme.mov">

</applet> -->

Basic Steps

Initialisation:
1. Retrieve the applet’s FILE parameter
2. Use this to locate the media file and build a URL
3. Create a Player using the Manager object
4. Register the applet as a ControllerListener

Steps 1 & 2: Resolving a URL for the media stream

// The applet tag should contain the path to the
// source media file, relative to the html page.
if ((mediaFile = getParameter("FILE")) == null)
    Fatal("Invalid media file parameter");

try {
    // Create a url from the file name and the url
    // to the document containing this applet.
    if ((url = new URL(codeBase, mediaFile)) == null)
        Fatal("Can't build URL for " + mediaFile);

Step 3: Using Manager to Create a Player

// Create an instance of a player for this media
try {
    player = Manager.createPlayer(url);
} catch (NoPlayerException e) {
    System.out.println(e);
    Fatal("Could not create player for " + url);
}

Step 4: Register applet as a ControllerListener

// Add ourselves as listener for player's events
player.addControllerListener(this);

} catch (MalformedURLException e) {
    Fatal("Invalid media file URL!");
} catch (IOException e) {
    Fatal("IO exception creating player for " + url);
}

Controlling the Player…

Starting the Player

public void start() {
    if (player != null)
        player.realize();
}

Stopping the Player

public void stop() {
    if (player != null) {
        player.stop();
        player.deallocate();
    }
}

Responding to media events

Need to implement ControllerListener

When the Player is realized, it posts a RealizeCompleteEvent
  Get the Visual component:

    if ((visualComponent = player.getVisualComponent()) != null)
        cPanel.add(visualComponent);

  Get the Control Panel component (in the same way)

When the media has reached the end, the Player posts an EndOfMediaEvent
  Rewind and start over:

    player.setMediaTime(new Time(0));
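Putting these fragments together, a minimal sketch of the applet's controllerUpdate method might look like this (cPanel, visualComponent and controlComponent are assumed fields of the applet):

  public synchronized void controllerUpdate(ControllerEvent event) {
      if (event instanceof RealizeCompleteEvent) {
          if ((visualComponent = player.getVisualComponent()) != null)
              cPanel.add(visualComponent);
          if ((controlComponent = player.getControlPanelComponent()) != null)
              cPanel.add(controlComponent);
          validate();
          player.start();                   // play the clip right away
      } else if (event instanceof EndOfMediaEvent) {
          player.setMediaTime(new Time(0)); // rewind…
          player.start();                   // …and loop continuously
      }
  }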

Extensibility

Can extend JMF functionality in two ways:

Using plug-ins
  Effectively implementing custom processing components that can be interchanged with the standard components used by a Processor

By direct implementation
  i.e. implementing directly the Controller, Player, Processor, DataSource, or DataSink interfaces
  e.g. implementing a player to utilise a h/w MPEG decoder
  e.g. integrating existing media engines

Interface PlugIn

The base interface for JMF plug-ins
A PlugIn is a media processing unit that accepts data in a particular format and processes or presents the data
Registered through the PlugInManager
Methods: open(), close(), getName(), reset()
Sub-interfaces: Codec, Effect, Demultiplexer, Multiplexer, etc.

Codecs

One input and one output
Methods:
  setInputFormat()
  setOutputFormat()
  getSupportedInputFormats()
  getSupportedOutputFormats()
  process(inputBuffer, outputBuffer)

Example 2

Accessing individual frames

FrameAccess

Problem: how to access individual decoded video frames from a Processor while processing the media. This could be used for scanning the decoded data, computing statistics for each video frame, etc.

FrameAccess

Solution: use a ‘pass-through’ plug-in codec as a callback when individual frames are being processed.

Steps:
1) Build the pass-through codec.
2) Create a processor from the input file. This processor will be used as a player to play back the media.
3) Get the TrackControls from the processor.
4) Set your codec on the video track: trackControl.setCodecChain(yourCodecs)

Basic code

// Get the video track as a TrackControl
TrackControl videoTrack = null;
for (int i = 0; i < tc.length; i++) {
    if (tc[i].getFormat() instanceof VideoFormat) {
        videoTrack = tc[i];
        break;
    }
}

// Instantiate & set the frame access codecs on the data flow path
// (setCodecChain can throw UnsupportedPlugInException)
Codec codec[] = { new PreAccessCodec(), new PostAccessCodec() };
videoTrack.setCodecChain(codec);

PreAccessCodec

public class PreAccessCodec implements Codec {

    void accessFrame(Buffer frame) {
        // Buffer timestamps are in nanoseconds; print the time in seconds
        long t = (long) (frame.getTimeStamp() / 10000000f);
        System.err.println("Pre: frame #: " + frame.getSequenceNumber() +
                           ", time: " + ((float) t) / 100f +
                           ", len: " + frame.getLength());
    }

…………………

Other methods, e.g. getSupportedInputFormats etc.

The Codec’s Process method

public int process(Buffer in, Buffer out) {
    // This is the "callback" to access individual frames.
    accessFrame(in);

    // Swap the data between the input & output.
    Object data = in.getData();
    in.setData(out.getData());
    out.setData(data);

    // Copy the input attributes to the output.
    out.setFormat(in.getFormat());
    out.setLength(in.getLength());
    out.setOffset(in.getOffset());

    return BUFFER_PROCESSED_OK;
}