The J2ME Mobile Media API


by Qusay H. Mahmoud, June 2003

Abstract
The Mobile Media API (MMAPI) is an optional package that supports multimedia applications on J2ME-enabled devices. This standard Java specification, defined by the Java Community Process (JCP) in JSR 135, is highly flexible. It has been designed to run with any protocol and format; for example, it doesn't specify that the implementation must support particular transport protocols such as HTTP or Real-Time Transport Protocol (RTP), or media formats such as MP3, MIDI, or MPEG-4. This article provides a technical overview of MMAPI's architecture and APIs, followed by a tutorial in which sample code demonstrates how MMAPI can be used to build multimedia-rich wireless Java applications. A complete media player is developed, and steps for testing it are provided.

Overview of MMAPI
MMAPI has been designed to run on any J2ME-based virtual machine, including the CDC and CLDC VMs. Sun's reference implementation runs on CLDC/MIDP for Windows 2000, and the J2ME Wireless Toolkit comes with the MMAPI. MMAPI's developers designed into it the following features:

Support for Tone Generation, Playback, and Recording of Time-Based Media: The package supports any time-based audio or video content.
Small Footprint: MMAPI works within the strict memory limits of CLDC devices.
Protocol- and Content-Agnostic: The API is not biased toward any specific content type or protocol.
Subsettable: Developers can limit support to particular types of content, basic audio for example.
Extensible: New features can be added easily without breaking older functionality. More importantly, additional formats can be easily supported, and the framework is in place for additional controls.
Options for Implementers: The API offers features for different purposes, and is designed to allow implementers to leave some features unimplemented if they cannot be supported.

Multimedia Processing
There are two parts to multimedia processing:

Protocol Handling: reading data from a source such as a file or a streaming server into a media-processing system.
Content Handling: parsing or decoding the media data and rendering it to an output device such as an audio speaker or video display.

To facilitate these operations, the API provides two high-level object types:

DataSource encapsulates protocol handling by hiding the details of how the data is read from its source. This object's utility methods enable the Player object to handle the content.
Player reads the data from DataSource, processes it, and renders it to an output device. This object provides methods to control media playback, including methods for type-specific controls to access features for specific media types.
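This division of labor can be sketched with two simplified interfaces. The following is an illustrative model only, not the real javax.microedition.media.Player and javax.microedition.media.protocol.DataSource types; the names SimpleDataSource, SimplePlayer, ByteArraySource, and CountingPlayer are hypothetical:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

// Protocol handling: where the bytes come from.
interface SimpleDataSource {
    InputStream getStream();
}

// Content handling: decoding and rendering the bytes.
interface SimplePlayer {
    void start();
}

// Stands in for a file or network protocol handler.
class ByteArraySource implements SimpleDataSource {
    private final byte[] data;
    ByteArraySource(byte[] data) { this.data = data; }
    public InputStream getStream() { return new ByteArrayInputStream(data); }
}

// A trivial "renderer" that just counts the bytes it consumes; a real
// Player would decode them and drive a speaker or display.
class CountingPlayer implements SimplePlayer {
    private final SimpleDataSource source;
    int bytesRendered;
    CountingPlayer(SimpleDataSource source) { this.source = source; }
    public void start() {
        InputStream in = source.getStream();
        try {
            while (in.read() != -1) bytesRendered++;
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```

The point of the separation is that either side can vary independently: a new protocol means a new DataSource, a new media format means a new Player, and neither needs to know about the other's internals.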

MMAPI specifies a third object, a factory mechanism known as the Manager, to enable your application to create Players from DataSources, and also from InputStreams. The overall architecture of MMAPI is shown in Figure 1:

Figure 1: The MMAPI Architecture
The Manager object provides the method createPlayer(), which is the top-level entry point into the API. Here's an example:

...
Player player = Manager.createPlayer(url);
...
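The url passed to createPlayer() is a media locator, a String of the form protocol:content location (for example, "http://webserver/music.mid"). A small, self-contained sketch of splitting such a locator into its two parts, using plain Java rather than MMAPI (the LocatorParts helper is hypothetical):

```java
import java.net.URI;
import java.net.URISyntaxException;

// Hypothetical helper (not part of MMAPI): split a media locator of the
// form "protocol:content location" into its protocol and content parts.
final class LocatorParts {
    final String protocol;
    final String content;

    LocatorParts(String locator) {
        try {
            URI uri = new URI(locator);
            if (uri.getScheme() == null) {
                throw new IllegalArgumentException("missing protocol: " + locator);
            }
            protocol = uri.getScheme();                // e.g. "http"
            content = uri.getSchemeSpecificPart();     // e.g. "//webserver/music.mid"
        } catch (URISyntaxException e) {
            throw new IllegalArgumentException(locator, e);
        }
    }
}
```

An MMAPI implementation performs the same kind of dispatch internally: the protocol part selects a DataSource, and the content's type selects a Player.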

The url specifies the protocol and the content, using the format protocol:content location. The application uses the methods of the returned Player to control the retrieval and playback of time-based media. The player's life cycle includes five states: UNREALIZED, REALIZED, PREFETCHED, STARTED, and CLOSED. Six of its methods result in state transitions:

realize()
prefetch()
start()
stop()
deallocate()
close()

When a player is created, it is in the UNREALIZED state. Calling realize() moves it to the REALIZED state and initializes the information the player needs to acquire media resources. Calling prefetch() moves it to the PREFETCHED state...
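The life cycle described above can be modeled as a small state machine. This is an illustrative sketch in plain Java, not the real javax.microedition.media.Player (which exposes the states as int constants), and it is deliberately stricter than the real API: a real Player performs implicit transitions, e.g. calling start() on an unrealized player realizes and prefetches it first.

```java
// States of the Player life cycle described in the text.
enum PlayerState { UNREALIZED, REALIZED, PREFETCHED, STARTED, CLOSED }

// Simplified model: each method checks the expected current state
// and moves to the next one; the real API is more permissive.
class LifecycleModel {
    PlayerState state = PlayerState.UNREALIZED; // state right after createPlayer()

    void realize()    { require(PlayerState.UNREALIZED); state = PlayerState.REALIZED; }
    void prefetch()   { require(PlayerState.REALIZED);   state = PlayerState.PREFETCHED; }
    void start()      { require(PlayerState.PREFETCHED); state = PlayerState.STARTED; }
    void stop()       { require(PlayerState.STARTED);    state = PlayerState.PREFETCHED; }
    void deallocate() { state = PlayerState.REALIZED; }  // releases scarce resources
    void close()      { state = PlayerState.CLOSED; }    // terminal state

    private void require(PlayerState expected) {
        if (state != expected) {
            throw new IllegalStateException("expected " + expected + ", was " + state);
        }
    }
}
```

Walking a player forward through realize(), prefetch(), and start(), then back through stop() and deallocate(), traces exactly the transitions the six methods listed above are responsible for.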