Caching Assets

If your application uses dynamically downloaded assets, as a Facebook application does when displaying your contacts' pictures, consider caching the assets on the device. Doing so avoids the overhead of downloading the same file every time your application initializes.

Save this technique for small files so that you don’t abuse the user’s storage capacity.

Here, we extract the name of the file from its URL and check whether the file already exists in the application storage directory. If it does, we load it locally. If it does not, we get the file from the remote server and save it to the device:

[code]

import flash.display.Loader;
import flash.events.Event;
import flash.filesystem.File;
import flash.filesystem.FileMode;
import flash.filesystem.FileStream;
import flash.net.URLLoader;
import flash.net.URLLoaderDataFormat;
import flash.net.URLRequest;
import flash.utils.ByteArray;

var stream:FileStream;
var fileName:String;
var file:File;
var urlLoader:URLLoader;
var url:String = "http://www.v-ro.com/cat.jpeg";

fileName = url.split("/").pop(); // cat.jpeg
file = File.applicationStorageDirectory.resolvePath(fileName);

if (file.exists) {
    var loader:Loader = new Loader();
    loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onLocal);
    loader.load(new URLRequest(file.url));
} else {
    urlLoader = new URLLoader();
    urlLoader.dataFormat = URLLoaderDataFormat.BINARY;
    urlLoader.addEventListener(Event.COMPLETE, onRemote);
    urlLoader.load(new URLRequest(url));
}

function onLocal(event:Event):void {
    event.currentTarget.removeEventListener(Event.COMPLETE, onLocal);
    addChild(event.currentTarget.content);
}

function onRemote(event:Event):void {
    event.target.removeEventListener(Event.COMPLETE, onRemote);
    var byteArray:ByteArray = event.target.data as ByteArray;
    // save the downloaded bytes to the application storage directory
    stream = new FileStream();
    stream.open(file, FileMode.WRITE);
    stream.writeBytes(byteArray, 0, byteArray.length);
    stream.close();
    // display the freshly downloaded image
    var loader:Loader = new Loader();
    loader.loadBytes(byteArray);
    addChild(loader);
}

[/code]

Sprite Sheet and Blitting

Xerox PARC created the bit-block transfer, BitBLT for short, for the Smalltalk system in 1975. Blitting is a computer graphics operation that takes at least one bitmap and combines it with another. It has been a standard technique since the beginning of the digital gaming industry.

Let’s see how we can use this technology for our purposes.

Blitting

A sprite sheet, also called a tile sheet animation, is a collection of images arranged into a single image. Traditionally, each sprite represents one frame of an animation.

In AIR, especially on mobile devices, the sprite sheet has benefits beyond animation. It can be used for ease of production and to load a single large image rather than monitoring many small loading processes. Also, bitmap manipulation is very quick. Figure 18-3 shows a sprite sheet for a walk cycle.

Figure 18-3. A sprite sheet for a walk cycle

The motion is created in code by copying various areas of the image to the screen. All cells have the same predictable width and height, so copying a rectangular region can be automated. Use dimensions that are integers, ideally powers of two, for faster calculation. Verify that the animated element sits in the same relative position in every cell, preferably anchored to the upper-left corner or the baseline.

Use the BitmapData.copyPixels method to copy a defined area. The required parameters are the source bitmap, a rectangle, and a destination point. The method copies a rectangular area from the source image to a destination area of the same size.
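
For instance, copying the first cell of a sheet into a destination could look like the following sketch; the sheet and destination BitmapData objects and the 64×64 cell size are illustrative:

[code]

import flash.geom.Point;
import flash.geom.Rectangle;

// sheet and destination are existing BitmapData objects
destination.copyPixels(sheet, new Rectangle(0, 0, 64, 64), new Point(0, 0));

[/code]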

Let’s import the graphics of the walking man shown in Figure 18-3 as a PNG image:

[code]

import flash.display.Bitmap;
import flash.display.BitmapData;
import flash.display.Loader;
import flash.events.Event;
import flash.net.URLRequest;

var complete:BitmapData;
var walkIndex:int = 0;
const TOTAL:int = 8;

var loader:Loader = new Loader();
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onLoaded);
loader.load(new URLRequest("/assets/walkingMan.png"));

function onLoaded(event:Event):void {
    event.target.removeEventListener(Event.COMPLETE, onLoaded);
    complete = Bitmap(event.target.content).bitmapData;
}

[/code]

Add an ENTER_FRAME event listener. Change the definition of the rectangle to copy a different segment of the original bitmap, and push these pixels into the BitmapData called cell. Wrap it into a Bitmap to render it to the screen:

[code]

import flash.geom.Point;
import flash.geom.Rectangle;

// cell dimensions: illustrative values; match your sprite sheet
const CELL_WIDTH:int = 64;
const CELL_HEIGHT:int = 128;

var cell:BitmapData = new BitmapData(CELL_WIDTH, CELL_HEIGHT, true, 0);
// reuse a single Bitmap rather than creating one per frame
var bitmap:Bitmap = new Bitmap(cell);
addChild(bitmap);

stage.addEventListener(Event.ENTER_FRAME, walk);

function walk(event:Event):void {
    if (complete == null) return; // wait until the sheet is loaded
    var rectangle:Rectangle = new Rectangle(
        CELL_WIDTH*walkIndex, 0,
        CELL_WIDTH, CELL_HEIGHT);
    cell.copyPixels(complete, rectangle, new Point(0, 0));
    walkIndex++;
    if (walkIndex == TOTAL) {
        walkIndex = 0;
    }
}

[/code]

Some open source AIR applications are available to ease the process of creating sprite sheets. Sprite Sheet Maker, by Mike Jones, lets you import separate PNGs, reorganize them as needed, and export them as the final sprite sheet (see http://blog.flashgen.com/2010/12/21/sprite-sheet-maker-preview/). SWFSheet, by Keith Peters, imports an .swf file and converts it to a sprite sheet (see http://www.bit-101.com/blog/?p=2939). Keith takes advantage of Flash as an animation tool and shows the .swf running. The number of frames captured can be adjusted, as can the frame area and the sprite dimensions, before exporting the final PNG.

Vector Graphics at Runtime

Vector graphics scale well, which makes them reusable and reduces production time, but they render slowly.

Scaling

Create your vector art at medium size and resize it on the fly as needed. This is a great technique for multiple-screen deployment. Let's assume your original art was created for an 800×480 device and the application detects a tablet at 1,024×600, a scale difference of about 30% (1,024/800 = 1.28). Let's scale up the art:

[code]

import flash.system.Capabilities;

var dpi:Number = Capabilities.screenDPI;
var screenX:int = Capabilities.screenResolutionX;
var screenY:int = Capabilities.screenResolutionY;
var diagonal:Number = Math.sqrt((screenX*screenX)+(screenY*screenY))/dpi;

// if the diagonal is 6 inches or more, we will assume it is a tablet
if (diagonal >= 6) {
    myVector.scaleX = myVector.scaleY = 1.30;
}

[/code]
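
Rather than hardcoding the factor, you could derive it from the detected resolution; a sketch, assuming the original art was produced for an 800-pixel-wide screen:

[code]

var scale:Number = Math.max(screenX, screenY)/800; // 1,024/800 = 1.28
myVector.scaleX = myVector.scaleY = scale;

[/code]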

cacheAsBitmap

If the object's only transformation is a translation along the x- or y-axis, use cacheAsBitmap:

[code]

myVector.cacheAsBitmap = true;

this.addEventListener(Event.ENTER_FRAME, moveArt);
function moveArt(event:Event):void {
    myVector.x += 1;
}

[/code]

cacheAsBitmapMatrix

To rotate, scale, or alpha the object, use cacheAsBitmapMatrix along with cacheAsBitmap; both are required for the caching to work.

[code]

import flash.geom.Matrix;

myVector.cacheAsBitmap = true;
myVector.cacheAsBitmapMatrix = new Matrix();

this.addEventListener(Event.ENTER_FRAME, onMoveArt);
function onMoveArt(event:Event):void {
    myVector.x += 1;
    myVector.rotation += 1;
}

[/code]

Vector to Bitmap

If the object is animated with filters, or if multiple vector assets need to be composited, draw them into a bitmap. Here, if the device is a tablet (as determined by the diagonal size), we scale the art and apply a drop shadow filter. We then draw the vector art into a bitmap:

[code]

import flash.display.Bitmap;
import flash.display.BitmapData;
import flash.filters.DropShadowFilter;

if (diagonal >= 6) {
    myVector.scaleX = myVector.scaleY = 1.30;
}

var shadow:DropShadowFilter = new DropShadowFilter();
shadow.distance = 5;
shadow.angle = 35;
myVector.filters = [shadow];

var bitmapData:BitmapData = new BitmapData(myVector.width, myVector.height, true, 0);
bitmapData.draw(myVector);
var bitmap:Bitmap = new Bitmap(bitmapData);
addChild(bitmap);

[/code]

Compositing Vector Graphics

A common use case for vector graphics is the creation of a customized avatar in a virtual world. Discrete parts, such as hair, shirt, and shoes, are separate assets. They can be changed and saved at runtime. The avatar, however, animates best as a whole, so once the avatar information is collected and all the discrete parts are loaded, they can be composited as one single object.

Use the BitmapData.draw method to render vector graphics into a bitmap. The first parameter, the only required one, is the source, either a bitmap or a vector. The following parameters are a Matrix, a ColorTransform, a blend mode, a clipping Rectangle, and smoothing (applicable only to a BitmapData source).

In this example, we use the Matrix to scale the assets down to half their original size. Define the dimensions of the BitmapData object used to store the pixels of the loaded assets, and a Matrix:

[code]

import flash.display.BitmapData;
import flash.geom.Matrix;
const CELL_WIDTH:int = 50;
const CELL_HEIGHT:int = 100;
var composite:BitmapData = new BitmapData(CELL_WIDTH, CELL_HEIGHT, true, 0);
var matrix:Matrix = new Matrix(0.50, 0, 0, 0.50, 0, 0);

[/code]

Create a Vector to store the path of the assets. The art must be loaded in the order it will be composited, as you would with layers in Flash or Photoshop:

[code]

var assets:Vector.<String> = new Vector.<String>();
assets[0] = PATH_SKIN_ASSET;
assets[1] = PATH_HAIR_ASSET;
assets[2] = PATH_SHIRT_ASSET;
var counter:int = 0;

[/code]

Load one image at a time using a Loader object. A counter variable is used to traverse the Vector and load all the assets. Each time an asset becomes available, it is drawn into the composite BitmapData. Note that the BitmapData was created with its transparent argument set to true, so areas of the vector without pixels remain transparent:

[code]

import flash.display.Loader;
import flash.events.Event;
import flash.net.URLRequest;

var loader:Loader = new Loader();
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onLoaded);
loading();

function loading():void {
    loader.load(new URLRequest(assets[counter]));
}

function onLoaded(event:Event):void {
    composite.draw(event.target.content, matrix);
    counter++;
    if (counter < assets.length) {
        loading();
    } else {
        loader.contentLoaderInfo.removeEventListener(
            Event.COMPLETE, onLoaded);
        display();
    }
}

[/code]

Once all the assets are loaded and composited, create a Bitmap to render the image to the screen:

[code]

import flash.display.Bitmap;

function display():void {
    var bitmap:Bitmap = new Bitmap(composite);
    addChild(bitmap);
}

[/code]

MovieClip with Multiple Frames

Art production may require the use of a movie clip with multiple frames. This is a familiar workflow for many designers, and the art remains resizable.

Converting a MovieClip with vector art to a bitmap for better rendering is a good practice, but neither cacheAsBitmap nor cacheAsBitmapMatrix works for a MovieClip with multiple frames. If you cache the art on the first frame, as the play head moves, the old bitmap is discarded and the new frame needs to be rasterized. This is the case even if the animation is just a rotation or a position transformation.

GPU rendering is not the technique to use in such a case. Instead, load your MovieClip without adding it to the display list.

This time, we need to load a single .swf file and traverse its timeline to copy each frame. Load the external .swf file comprising 10 frames:

[code]

var loader:Loader = new Loader();
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onLoadComplete);
loader.load(new URLRequest(PATH_TO_SWF));

[/code]

Create a rectangle for the cell at a predefined size. Traverse the MovieClip, and use draw, as before, to copy the vector into a bitmap. Then use copyPixels to add the new pixels into another BitmapData that is the width of the total number of frames:

[code]

import flash.display.MovieClip;
import flash.geom.Point;
import flash.geom.Rectangle;

var complete:BitmapData;

function onLoadComplete(event:Event):void {
    event.target.removeEventListener(Event.COMPLETE, onLoadComplete);
    var mc:MovieClip = event.target.content as MovieClip;
    var totalWidth:int = mc.totalFrames*CELL_WIDTH;
    complete = new BitmapData(totalWidth, CELL_HEIGHT, true, 0);
    var rectangle:Rectangle = new Rectangle(0, 0, CELL_WIDTH, CELL_HEIGHT);
    var bounds:int = mc.totalFrames;
    for (var i:int = 0; i < bounds; i++) {
        mc.gotoAndStop(i+1);
        var frame:BitmapData =
            new BitmapData(CELL_WIDTH, CELL_HEIGHT, true, 0);
        // pass a scaling Matrix as a second argument if needed
        frame.draw(mc);
        complete.copyPixels(frame, rectangle, new Point(i*CELL_WIDTH, 0));
        frame.dispose();
    }
    loader.unloadAndStop();
    mc = null;
}

[/code]

Display the large BitmapData into a Bitmap to render it to the screen:

[code]

var bitmap:Bitmap = new Bitmap(complete);
bitmap.smoothing = true;
addChild(bitmap);

[/code]

Use the reverse process to display a single frame. Here we copy the frame at index 3 (the fourth frame):

[code]

var cellBitmap:BitmapData = new BitmapData(CELL_WIDTH, CELL_HEIGHT, true, 0);
var rectangle:Rectangle =
    new Rectangle(CELL_WIDTH*3, 0, CELL_WIDTH, CELL_HEIGHT);
cellBitmap.copyPixels(complete, rectangle, new Point(0, 0));
var bitmap:Bitmap = new Bitmap(cellBitmap);
addChild(bitmap);

[/code]
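
Building on this, a minimal sketch could cycle through all the frames on ENTER_FRAME, assuming the complete BitmapData and the cell constants from above:

[code]

var playIndex:int = 0;
var cell:BitmapData = new BitmapData(CELL_WIDTH, CELL_HEIGHT, true, 0);
addChild(new Bitmap(cell));

stage.addEventListener(Event.ENTER_FRAME, onFrame);
function onFrame(event:Event):void {
    var rectangle:Rectangle =
        new Rectangle(CELL_WIDTH*playIndex, 0, CELL_WIDTH, CELL_HEIGHT);
    cell.copyPixels(complete, rectangle, new Point(0, 0));
    playIndex = (playIndex + 1) % 10; // the source clip has 10 frames
}

[/code]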

Next we will cover creating an animation from a sprite sheet bitmap in more detail.

Capturing Video

The native video camera can be used to capture video within AIR.

Video and the CameraUI Class

You can use the native camera within AIR to capture video. Your application needs the camera permission. In Flash Professional, select File→AIR Android settings→Permissions→Camera. In Flash Builder, add the following permission:

[code]<uses-permission android:name="android.permission.CAMERA"/>[/code]

The flash.media.CameraUI class is an addition to the ActionScript language to support the device’s native camera application. It is a subclass of the EventDispatcher class and is only supported on AIR for mobile.

This object allows you to launch the native camera application to shoot a video while your AIR application moves to the background.

When you use the native camera, it comes to the foreground, your AIR application moves to the background, and NativeApplication dispatches Event.DEACTIVATE. Make sure you don't have any logic that could interfere with the proper running of your application at that point, such as exiting. Likewise, when the native camera application quits and your AIR application comes back to the foreground, Event.ACTIVATE is dispatched.
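
As a minimal sketch, you could listen for both events on NativeApplication; the handler names are illustrative:

[code]

import flash.desktop.NativeApplication;
import flash.events.Event;

NativeApplication.nativeApplication.addEventListener(Event.DEACTIVATE, onDeactivate);
NativeApplication.nativeApplication.addEventListener(Event.ACTIVATE, onActivate);

function onDeactivate(event:Event):void {
    // the native camera is coming to the foreground;
    // pause noncritical work here, but do not exit
}

function onActivate(event:Event):void {
    // back in the foreground; resume what was paused
}

[/code]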

The first step is to verify that your device supports access to the camera by checking the CameraUI.isSupported property. Note that, as of this writing, Android does not support the front camera natively, and therefore neither does AIR:

[code]

import flash.media.CameraUI;

if (CameraUI.isSupported == false) {
    trace("no camera accessible");
    return;
}

[/code]

If it is supported, create an instance of the CameraUI class and register your application to receive camera events. A MediaEvent.COMPLETE is dispatched after the video is captured, an Event.CANCEL if no media is selected, and an ErrorEvent if an error occurs in the process:

[code]

import flash.events.ErrorEvent;
import flash.events.Event;
import flash.events.MediaEvent;
import flash.media.CameraUI;

var cameraUI:CameraUI = new CameraUI();
cameraUI.addEventListener(MediaEvent.COMPLETE, onComplete);
cameraUI.addEventListener(Event.CANCEL, onCancel);
cameraUI.addEventListener(ErrorEvent.ERROR, onError);

[/code]

Call the launch function and pass the type MediaType.VIDEO as a parameter. This will launch the camera in video mode automatically:

[code]

import flash.media.MediaType;

cameraUI.launch(MediaType.VIDEO);

function onCancel(event:Event):void {
    trace("camera canceled");
}

function onError(event:ErrorEvent):void {
    trace(event.text);
}

[/code]

The camera application is now active and in the foreground. The AIR application moves to the background and waits.

Once the event is received, the camera application automatically closes and the AIR application moves back to the foreground.

Video capture on Android requires a lot of memory. To avoid having the Activity Manager terminate the application, the capture setting is restricted to low resolution by default, which requires a smaller memory buffer.

MPEG-4 Visual, the Android low-resolution video format, is not supported by AIR, so captured videos cannot be played back inside AIR. A native application can be used to play back the recordings.

Currently, this functionality should only be used for capturing and not viewing unless you use the native application in the Gallery. The video is saved in a 3GP format that AIR does not support. Trying to play it back will just display a white screen.

In the following example, I provide the code for playback in AIR in case this is resolved in the future.

On select, a MediaEvent object is returned:

[code]

import flash.events.AsyncErrorEvent;
import flash.events.NetStatusEvent;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var videoURL:String;
var connection:NetConnection;
var stream:NetStream;

function onComplete(event:MediaEvent):void {
    videoURL = event.data.file.url;
    connection = new NetConnection();
    connection.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    connection.connect(null);
}

function onStatus(event:NetStatusEvent):void {
    switch(event.info.code) {
        case "NetConnection.Connect.Success" :
            connectStream();
            break;
        case "NetStream.Play.StreamNotFound" :
            trace("video not found " + videoURL);
            break;
    }
}

function connectStream():void {
    stream = new NetStream(connection);
    stream.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    stream.addEventListener(AsyncErrorEvent.ASYNC_ERROR, onAsyncError);
    var video:Video = new Video();
    video.attachNetStream(stream);
    stream.play(videoURL);
    addChild(video);
}

function onAsyncError(event:AsyncErrorEvent):void {
    trace("ignore errors");
}

[/code]

The Camera Class

The device’s camera, using the flash.media.Camera class, can be attached to a Video object in the AIR application. You can use this approach to simulate a web cam or for an Augmented Reality project.

The hardware orientation of the camera is landscape, so try to make your application’s orientation landscape too by changing the aspectRatio tag in your application descriptor:

[code]<aspectRatio>landscape</aspectRatio>[/code]

The setMode function is used to determine the video’s resolution:

[code]

import flash.media.Camera;
import flash.media.Video;

var camera:Camera = Camera.getCamera();
if (camera != null) {
    camera.setMode(stage.stageWidth, stage.stageHeight, 15, true);
    var video:Video = new Video(camera.width, camera.height);
    video.x = 100;
    video.y = 100;
    video.attachCamera(camera);
    addChild(video);
}

[/code]

Note that frames are only captured when the application is in the foreground. If the application moves to the background, capturing is paused but will resume automatically when the application moves to the foreground again.

You can query the camera's properties. Here are a few that may be helpful in your development:

[code]

camera.height;
camera.width;
camera.bandwidth;
camera.fps;
camera.muted;
camera.name;

[/code]

Documentation and Tutorials

Development around video is constantly evolving. The following two resources are among those that will help you to stay informed:

  • The Open Source Media Framework (http://www.opensourcemediaframework.com/resources.html) helps developers with video-related products. It is a good place to find code samples, tutorials, and other materials.
  • Lisa Larson-Kelly specializes in web video publishing and, more recently, mobile publishing. She offers free tutorials and a newsletter on the latest technology (http://learnfromlisa.com/).

Raw Data and the Sound Spectrum

With the arrival of digital sound, a new art form quickly followed: the visualization of sound.

A sound waveform is the shape of the graph representing the amplitude of a sound over time; this representation is also called the time domain. The amplitude is the distance of a point on the waveform from the equilibrium line. The peak is the highest point in a waveform.

You can read the digital signal in real time and use its amplitude values to represent the sound.

Making Pictures of Music is a project run by mathematics and music academics that analyzes and visualizes music pieces. It uses Unsquare Dance, a complex multi-instrumental piece created by Dave Brubeck. For more information, go to http://www.uwec.edu/walkerjs/PicturesOfMusic/MultiInstrumental%20Complex%20Rhythm.htm.

In AIR, you can draw a sound waveform using the computeSpectrum method of the SoundMixer class. This method takes a snapshot of the current sound wave and stores the data in a ByteArray:

[code]SoundMixer.computeSpectrum(bytes, false, 0);[/code]

The method takes three parameters. The first is the container ByteArray. The second optional parameter is FFTMode (the fast Fourier transform); false, the default, returns a waveform, and true returns a frequency spectrum. The third optional parameter is the stretch factor; 0 is the default and represents 44.1 kHz. Resampling at a lower rate results in a smoother waveform and a less detailed frequency. Figure 11-1 shows the drawing generated from this data.
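
For instance, the frequency spectrum shown in the lower half of Figure 11-1 could be requested by setting the second parameter to true, assuming the same bytes ByteArray used in the example below:

[code]

SoundMixer.computeSpectrum(bytes, true, 0); // FFT mode returns a frequency spectrum

[/code]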

Figure 11-1. A waveform (top) and a frequency spectrum (bottom), both generated from the same piece of audio by setting the fast Fourier transform value first to false and then to true

A waveform snapshot contains 512 floating-point values: 256 for the left channel followed by 256 for the right channel. Each value lies between -1 and 1 and represents the amplitude of a point on the sound waveform.

If you trace the length of the ByteArray, it returns a value of 2,048. This is because a floating-point value is made of four bytes: 512 * 4 = 2,048.
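
You can verify this with a quick trace, assuming a bytes ByteArray populated as in the example below:

[code]

SoundMixer.computeSpectrum(bytes, false, 0);
trace(bytes.length); // 2,048 bytes: 512 floats * 4 bytes each

[/code]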

Our first approach is to use the drawing API. Drawing a vector is appropriate for a relatively simple sound like a microphone audio recording. For a longer, more complex track, we will look at a different approach after this example.

We are using two loops to read the bytes, one at a time. The loop for the left channel goes from 0 to 256. The loop for the right channel starts at 256 and goes back down to 0. The value of each byte, between ‒1 and 1, is multiplied by a constant to obtain a value large enough to see. Finally, we draw a line using the loop counter for the x coordinate and we subtract the byte value from the vertical position of the equilibrium line for the y coordinate.

The same process is repeated on every ENTER_FRAME event until the music stops. Don't forget to remove the listener to stop calling the drawMusic function:

[code]

import flash.display.Sprite;
import flash.events.Event;
import flash.media.Sound;
import flash.media.SoundChannel;
import flash.media.SoundMixer;
import flash.net.URLRequest;
import flash.utils.ByteArray;

const CHANNEL_LENGTH:int = 256; // values per channel
// equilibrium line y position and byte value multiplier
const PEAK:int = 100;

var bytes:ByteArray;
var sprite:Sprite;
var soundChannel:SoundChannel;

bytes = new ByteArray();
sprite = new Sprite();
var sound:Sound = new Sound();
sound.addEventListener(Event.COMPLETE, onLoaded);
sound.load(new URLRequest("mySound.mp3"));
addChild(sprite);

function onLoaded(event:Event):void {
    soundChannel = event.target.play();
    soundChannel.addEventListener(Event.SOUND_COMPLETE, onPlayComplete);
    sprite.addEventListener(Event.ENTER_FRAME, drawMusic);
}

function drawMusic(event:Event):void {
    var value:Number;
    var i:int;
    SoundMixer.computeSpectrum(bytes, false, 0);
    // erase the previous drawing
    sprite.graphics.clear();
    // move to the far left
    sprite.graphics.moveTo(0, PEAK);
    // left channel in red
    sprite.graphics.lineStyle(0, 0xFF0000);
    for (i = 0; i < CHANNEL_LENGTH; i++) {
        value = bytes.readFloat()*PEAK;
        // increase the x position by 2 pixels
        sprite.graphics.lineTo(i*2, PEAK - value);
    }
    // move to the far right
    sprite.graphics.lineTo(CHANNEL_LENGTH*2, PEAK);
    // right channel in blue
    sprite.graphics.lineStyle(0, 0x0000FF);
    for (i = CHANNEL_LENGTH; i > 0; i--) {
        sprite.graphics.lineTo(i*2, PEAK - bytes.readFloat()*PEAK);
    }
}

function onPlayComplete(event:Event):void {
    soundChannel.removeEventListener(Event.SOUND_COMPLETE, onPlayComplete);
    sprite.removeEventListener(Event.ENTER_FRAME, drawMusic);
}

[/code]

On most Android phones, which have a screen width of 480 pixels, the waveform will draw offscreen on the right, up to pixel 512 (256 * 2). Consider presenting your application in landscape mode and centering the sprite container on the screen.
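
For example, the container could be centered horizontally with a sketch like this:

[code]

// center the 512-pixel-wide drawing, assuming landscape orientation
sprite.x = (stage.stageWidth - CHANNEL_LENGTH*2)/2;

[/code]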

For better performance, let's draw the vector into a bitmap. As a general rule on mobile devices, you should avoid drawing-API content that is redrawn every frame, as it degrades performance.

The Sprite is not added to the display list, and therefore is not rendered to the screen. Instead, we create a BitmapData and draw the sprite inside its rectangle:

[code]

import flash.display.Bitmap;
import flash.display.BitmapData;

var sprite:Sprite;
var bitmap:Bitmap;

sprite = new Sprite();
// create a BitmapData to draw the waveform into
var bitmapData:BitmapData = new BitmapData(480, PEAK*2, true, 0x00000000);
// store it in a Bitmap
bitmap = new Bitmap(bitmapData);
// position the Bitmap and add it to the displayList
bitmap.y = 200;
addChild(bitmap);

function drawMusic(event:Event):void {
    var value:Number;
    var i:int;
    SoundMixer.computeSpectrum(bytes, false, 0);
    // use sprite.graphics as before,
    // but do not render the sprite to the screen
    sprite.graphics.clear();
    sprite.graphics.moveTo(0, PEAK);
    sprite.graphics.lineStyle(0, 0xFF0000);
    for (i = 0; i < CHANNEL_LENGTH; i++) {
        value = bytes.readFloat()*PEAK;
        sprite.graphics.lineTo(i*2, PEAK - value);
    }
    sprite.graphics.lineTo(CHANNEL_LENGTH*2, PEAK);
    sprite.graphics.lineStyle(0, 0x0000FF);
    for (i = CHANNEL_LENGTH; i > 0; i--) {
        value = bytes.readFloat()*PEAK;
        sprite.graphics.lineTo(i*2, PEAK - value);
    }
    // instead, draw it into the bitmap
    // empty the bitmap first
    bitmapData.fillRect(bitmapData.rect, 0);
    // draw the sprite onto the bitmap image
    bitmapData.draw(sprite);
}

[/code]

Playing Sounds, Displaying Progress

Playing Sounds

In the earlier example, we used the play method without arguments. You can add optional parameters to gain more control over playback. The first parameter is the starting position in milliseconds; the second is the number of times to loop the sound.

In this example, the sound starts at the three-second position and loops five times:

[code]sound.play(3000, 5);[/code]

When the sound loops, it starts again at the same position, here the three-second mark.

The Sound class does not dispatch an event when it is done playing. The SoundChannel class is used for that purpose, as well as for controlling sound properties such as volume and for stopping playback. Create it when a Sound object starts playing. Each sound has its own channel:

[code]

import flash.media.SoundChannel;

var sound:Sound = new Sound();
sound.addEventListener(Event.COMPLETE, onLoaded);
sound.load(new URLRequest("mySound.mp3"));

function onLoaded(event:Event):void {
    sound.removeEventListener(Event.COMPLETE, onLoaded);
    var channel:SoundChannel = sound.play();
    channel.addEventListener(Event.SOUND_COMPLETE, playComplete);
}

function playComplete(event:Event):void {
    event.target.removeEventListener(Event.SOUND_COMPLETE, playComplete);
    trace("sound done playing");
}

[/code]
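
A common use of the starting-position parameter, together with SoundChannel, is pause and resume. A minimal sketch, assuming sound is loaded and channel references its playing SoundChannel:

[code]

var pausePosition:Number = channel.position;
channel.stop();
// later, resume where we left off
channel = sound.play(pausePosition);

[/code]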

Displaying Progress

There is no direct way to see playback progress, but you can build a timer to regularly display the channel position in relation to the length of the sound. The sound needs to be fully loaded to acquire its length:

[code]

import flash.events.TimerEvent;
import flash.utils.Timer;

var channel:SoundChannel;
var sound:Sound;
// load the sound as before
// once the sound is loaded:
var timer:Timer = new Timer(1000);
timer.addEventListener(TimerEvent.TIMER, showProgress);
channel = sound.play();
channel.addEventListener(Event.SOUND_COMPLETE, playComplete);
timer.start();

function showProgress(event:TimerEvent):void {
    // show progress as a percentage
    var progress:int = Math.round(channel.position/sound.length*100);
    trace(progress);
}

[/code]

Do not forget to stop the timer when the sound has finished playing:

[code]

function playComplete(event:Event):void {
    channel.removeEventListener(Event.SOUND_COMPLETE, playComplete);
    timer.stop();
    timer.removeEventListener(TimerEvent.TIMER, showProgress);
}

[/code]

You do not know the length of a streaming audio file until it is fully loaded. You can, however, estimate the length and adjust it as it progresses:

[code]

function showProgress(event:TimerEvent):void {
    var percentage:Number = sound.bytesLoaded/sound.bytesTotal;
    var estimate:int = Math.ceil(sound.length/percentage);
    // show progress as a percentage
    var progress:int = (channel.position/estimate)*100;
    trace(progress);
}

[/code]

Working with Sounds, Loading Sounds

Now that your sound files are ready, let’s see how we can use them. All sound-related classes belong to the flash.media package and will be introduced throughout this chapter.

Loading Sounds

The Sound class gives access to the audio information needed to load and play the sound file. It is a subclass of the EventDispatcher class.

As discussed before, your sound can be embedded, loaded as an external file from your application assets directory, or downloaded from a remote server. For the latter, advise your audience to use WiFi over 3G for a better experience.
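
For reference, a minimal sketch of the embedded option; the Embed metadata tag requires the Flex compiler used by Flash Builder, and the path and class name here are illustrative:

[code]

// declared as a class member
[Embed(source="assets/mySound.mp3")]
public var MySoundAsset:Class;

var embedded:Sound = new MySoundAsset() as Sound;
embedded.play();

[/code]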

If you try to play a sound that is not loaded, you will get a runtime error. Create a listener to be notified when the loading is complete, and then play your file:

[code]

import flash.events.Event;
import flash.media.Sound;
import flash.net.URLRequest;

var sound:Sound = new Sound();
sound.addEventListener(Event.COMPLETE, onLoaded);
var request:URLRequest = new URLRequest("mySound.mp3");
sound.load(request);

// sound fully loaded
function onLoaded(event:Event):void {
    sound.removeEventListener(Event.COMPLETE, onLoaded);
    sound.play();
}

[/code]

You can inform the user that the file has started to load:

[code]

sound.addEventListener(Event.OPEN, onOpen);
function onOpen(event:Event):void {
    trace("sound loading");
}

[/code]

If it is a large file, create a listener to display the progress:

[code]

import flash.events.ProgressEvent;

sound.addEventListener(ProgressEvent.PROGRESS, onLoading);
function onLoading(event:ProgressEvent):void {
    // display the percentage loaded
    trace((event.bytesLoaded/event.bytesTotal)*100);
}

[/code]

On Android devices, it is important to check for errors, particularly if the file is served from a remote server. Inform your user if there is a network issue:

[code]

import flash.events.IOErrorEvent;

sound.addEventListener(IOErrorEvent.IO_ERROR, onError);
function onError(event:IOErrorEvent):void {
    trace("sound cannot be loaded", event.text);
}

[/code]

Streaming

Streaming is the process of playing part of a sound file while other parts are loading in the background. The advantage is that you don’t need to wait for the whole file to download before playing. In addition, you can play very long tracks without memory constraints.

The audio files must be located on a streaming server. The quality of the server is an important factor in a good experience: 128 kbps is sufficient for most uses, but a musician can detect high-frequency artifacts in MP3 encoding. Encoding your audio at 192 kbps is a good compromise.

Requesting the file is the same process as before.

You can start playing the audio as soon as there is enough data in the buffer. Buffering is the process of receiving and storing audio data before it is played. The default buffer time is 1,000 milliseconds. You can override the default using the SoundLoaderContext class.

In this example, the buffer time is changed to five seconds:

[code]

import flash.media.SoundLoaderContext;

var sound:Sound = new Sound();
var request:URLRequest = new URLRequest("myStreamingSound.mp3");
var context:SoundLoaderContext = new SoundLoaderContext(5000, true);
sound.load(request, context);
sound.play();

[/code]

The SoundLoaderContext class is also used for security checks when loading sounds; the second constructor argument above, set to true, requests a policy file check, though this may not be required in AIR.

Streaming MP3 files is buggy when it comes to midstream bit rate changes, a technique often used by streaming services such as Internet radios. The player keeps using the bit rate declared at the start of the stream even after a change, so the audio sounds sped up or broken into chunks.

EXIF Data and the Map Object

A JPEG image stores location information if the user allows that feature. Let's look at an example in which the user chooses an image from the camera roll; we read its location information and display the corresponding map:

[code]

import com.google.maps.LatLng;
import com.google.maps.Map;
import com.google.maps.MapEvent;
import com.google.maps.MapType;
import com.google.maps.overlays.Marker;
import com.google.maps.overlays.MarkerOptions;
import flash.events.Event;
import flash.events.MediaEvent;
import flash.geom.Point;
import flash.media.CameraRoll;
import flash.net.URLRequest;
import jp.shichiseki.exif.*;

const KEY:String = YOUR_API_KEY;
const SITE:String = YOUR_SITE;

var cameraRoll:CameraRoll;
var exifLoader:ExifLoader;
var map:Map;

[/code]

Create your Map object as before:

[code]

map = new Map();
map.url = SITE;
map.key = KEY;
map.sensor = "false";
map.setSize(new Point(stage.stageWidth, stage.stageHeight));
map.addEventListener(MapEvent.MAP_READY, onMapReady);
addChild(map);

[/code]

Get an image from the device Gallery using the CameraRoll API:

[code]

function onMapReady(event:MapEvent):void {
    map.setCenter(
        new LatLng(40.736072, -73.992062), 14, MapType.NORMAL_MAP_TYPE);
    if (CameraRoll.supportsBrowseForImage) {
        var camera:CameraRoll = new CameraRoll();
        camera.addEventListener(MediaEvent.SELECT, onImageSelected);
        camera.browseForImage();
    }
}

[/code]

After the user selects an image, create an instance of the ExifLoader class and pass it the photo's URL. It will load the image and read its EXIF data:

[code]

function onImageSelected(event:MediaEvent):void {
    exifLoader = new ExifLoader();
    exifLoader.addEventListener(Event.COMPLETE, onExifRead);
    exifLoader.load(new URLRequest(event.data.file.url));
}

[/code]

If the image contains geolocation information, it is used to draw the map and a marker at the exact location:

[code]

function onExifRead(event:Event):void {
    var exif:ExifInfo = exifLoader.exif;
    if (exif.ifds.gps) {
        var gpsIfd:IFD = exif.ifds.gps;
        var exifLat:Array = gpsIfd["GPSLatitude"] as Array;
        var latitude:Number = shorten(exifLat, gpsIfd["GPSLatitudeRef"]);
        var exifLon:Array = gpsIfd["GPSLongitude"] as Array;
        var longitude:Number = shorten(exifLon, gpsIfd["GPSLongitudeRef"]);
        var marker:Marker = new Marker(new LatLng(latitude, longitude));
        map.addOverlay(marker);
        map.setCenter(new LatLng(latitude, longitude));
    }
}

function shorten(info:Array, reference:String):Number {
    // degrees, minutes, and seconds to decimal degrees,
    // for example, 40 deg 45' 27.5" = 40 + 45/60 + 27.5/3600 = 40.7576
    var degree:Number = info[0] + (info[1]/60) + (info[2]/3600);
    // south of the equator and west of Greenwich are negative
    if (reference == "S" || reference == "W") {
        degree *= -1;
    }
    return degree;
}

[/code]

Maps

Several geocoding systems and companies offer web services for the consumer market, all providing similar features. A map is received as a drawing or as a composite of satellite pictures or street tiles, the latter being more common on mobile devices. It can pan and zoom. Geographical locations or points of interest are represented as markers. Additional features include custom itineraries, the display of specific areas in color, driving and biking directions, and business searches.

Some of the better-known geocoding systems are Google Maps, Yahoo! Maps, Bing Maps, GeoNames, and USC Geocoder. As the technology is rapidly growing, this list may soon expand or change. Many map services get their information from NAVTEQ and Tele Atlas, companies that sell databases of geodata, or from MaxMind, which sells IP geolocation data. Google now has its own full set of geodata, gathered by its Street View cars.

Launching Google Maps

As we previously discussed, you can collect a point location (latitude, longitude) using the Geolocation class, and pass it to the device using a URI handler. It then presents the user with the option of using the native Maps application or launching Google Maps in the browser:

[code]<uses-permission android:name="android.permission.INTERNET" />[/code]

[code]

import flash.events.GeolocationEvent;
import flash.net.navigateToURL;
import flash.net.URLRequest;
import flash.sensors.Geolocation;

var geolocation:Geolocation = new Geolocation();
geolocation.addEventListener(GeolocationEvent.UPDATE, onTravel);

function onTravel(event:GeolocationEvent):void {
    geolocation.removeEventListener(GeolocationEvent.UPDATE, onTravel);
    var long:String = event.longitude.toString();
    var lat:String = event.latitude.toString();
    navigateToURL(
        new URLRequest("http://maps.google.com/?q=" + lat + "," + long));
}

[/code]

Note that if you navigate to http://maps.yahoo.com instead, launching the native Google Maps is not an option.

The major hurdle with this approach is that your application is now in the background and there is no direct way to go back to it unless you press the device’s native back button.

The Android SDK has a library for embedding maps into native applications with interactivity. AIR doesn’t support this feature at the time of this writing, but there are many other ways to offer a map experience to your audience. To demonstrate some of the map features within AIR, we will use the Yahoo! Maps Web Services (http://developer.yahoo.com/maps) and Google Maps API family (http://code.google.com/apis/maps/).

Static Maps

A static map may be sufficient for your needs. It provides a snapshot of a location; although it doesn't offer panning or zooming, it loads relatively quickly, even over a slow data connection.

The Yahoo! Map Image API

The Yahoo! Map Image API from Yahoo! Maps (http://developer.yahoo.com/maps/rest/V1/) provides a reference to a static map image based on user-specified parameters. This API requires an application ID. It doesn't restrict how you use the service, it serves images up to 1,024×1,024, and it has few customizable options.

To use the API, send a URLRequest with your parameters. In return, you receive the path to the image, which you then load using a Loader object. The next example uses the point location from geolocation, the stage dimensions for the image size, and 1 for street level (zoom goes up to 12 for country level):

[code]

import flash.display.Loader;
import flash.events.Event;
import flash.events.GeolocationEvent;
import flash.net.URLLoader;
import flash.net.URLRequest;
import flash.sensors.Geolocation;

var geolocation:Geolocation;
const YAHOO_URL:String =
    "http://local.yahooapis.com/MapsService/V1/mapImage";
const applicationID:String = "YOUR_YAHOO_APP_ID";
var urlLoader:URLLoader;
var loader:Loader;

function findLocation():void {
    if (Geolocation.isSupported) {
        geolocation = new Geolocation();
        geolocation.addEventListener(GeolocationEvent.UPDATE, onTravel);
    }
}

function onTravel(event:GeolocationEvent):void {
    var request:String = "?appid=" + applicationID
        + "&latitude=" + event.latitude
        + "&longitude=" + event.longitude
        + "&zoom=1"
        + "&image_height=" + stage.stageHeight
        + "&image_width=" + stage.stageWidth;
    urlLoader = new URLLoader();
    urlLoader.addEventListener(Event.COMPLETE, onXMLReceived);
    urlLoader.load(new URLRequest(YAHOO_URL + request));
}

function onXMLReceived(event:Event):void {
    urlLoader.removeEventListener(Event.COMPLETE, onXMLReceived);
    geolocation.removeEventListener(GeolocationEvent.UPDATE, onTravel);
    var xml:XML = XML(event.currentTarget.data);
    loader = new Loader();
    loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onLoaded);
    loader.load(new URLRequest(xml));
}

function onLoaded(event:Event):void {
    event.currentTarget.removeEventListener(Event.COMPLETE, onLoaded);
    this.addChild(event.currentTarget.content);
}

[/code]

You should see a map of where you are currently located.

The Google Static Maps API

The Google Static Maps API (http://code.google.com/apis/maps/documentation/staticmaps/) offers more features than the Yahoo! product, but at the time of this writing, it enforces a rule whereby static maps can only be displayed as browser content unless you purchase a Google Maps API Premier license. An AIR application is not considered browser content. Read the terms carefully before developing a commercial product using this API.

With this service, a standard HTTP request returns an image with the settings of your choice.

The required parameters are as follows:

  • center for location as an address or latitude/longitude (not required with marker)
  • zoom from 0 for the Earth to 21 for a building (not required with marker)
  • size (up to 640×640 pixels)
  • sensor (with or without use of GPS locator)

The maximum image size is 640×640. Unless you scale the image up, it will not fill the screen on most Android devices. Optional parameters are mobile, format, maptype, language, markers, visible, and path.

The following example requests a 480×640 image of Paris centered on the Eiffel Tower:

[code]<uses-permission android:name="android.permission.INTERNET" />[/code]

[code]

import flash.display.Loader;
import flash.events.Event;
import flash.net.URLRequest;

const GOOGLE_URL:String = "http://maps.google.com/maps/api/staticmap?";

function loadStaticImage():void {
    var request:String = "center=Eiffel+Tower,Paris,France"
        + "&zoom=16&size=480x640&sensor=false";
    var loader:Loader = new Loader();
    loader.contentLoaderInfo.addEventListener(Event.COMPLETE, imageLoaded);
    loader.load(new URLRequest(GOOGLE_URL + request));
}

function imageLoaded(event:Event):void {
    event.currentTarget.removeEventListener(Event.COMPLETE, imageLoaded);
    addChild(event.currentTarget.content);
}

[/code]

You should see on your screen the map of Paris, as shown in Figure 10-3.

Let's make another request using a dynamic location. The sensor parameter is now set to true, and we add the mobile parameter. The image size is also dynamic, choosing whichever value is smaller between Google's maximum and our stage size. We now use the hybrid version of the maptype:

[code]

import flash.display.Loader;
import flash.events.Event;
import flash.events.GeolocationEvent;
import flash.net.URLRequest;
import flash.sensors.Geolocation;

const GOOGLE_URL:String = "http://maps.google.com/maps/api/staticmap?";

var geolocation:Geolocation = new Geolocation();
geolocation.addEventListener(GeolocationEvent.UPDATE, onTravel);

function onTravel(event:GeolocationEvent):void {
    geolocation.removeEventListener(GeolocationEvent.UPDATE, onTravel);
    loadStaticImage(event.latitude, event.longitude);
}

function loadStaticImage(lat:Number, long:Number):void {
    var width:int = Math.min(640, stage.stageWidth);
    var height:int = Math.min(640, stage.stageHeight);
    var request:String = "center=" + lat + "," + long
        + "&zoom=15"
        + "&size=" + width + "x" + height
        + "&maptype=hybrid&mobile=true&sensor=true";
    var loader:Loader = new Loader();
    loader.contentLoaderInfo.addEventListener(Event.COMPLETE, imageLoaded);
    loader.load(new URLRequest(GOOGLE_URL + request));
}

function imageLoaded(event:Event):void {
    event.currentTarget.removeEventListener(Event.COMPLETE, imageLoaded);
    addChild(event.currentTarget.content);
}

[/code]

Figure 10-3. Map of Paris centered on the Eiffel Tower

The center parameter can be substituted with one or multiple markers. A marker can be given a color and a label, or it can be customized. This time we use the road map version of the maptype:

[code]

function loadStaticImage(lat:Number, long:Number):void {
    var width:int = Math.min(640, stage.stageWidth);
    var height:int = Math.min(640, stage.stageHeight);
    var request:String = "markers=size:large|color:blue|label:L|"
        + lat + "," + long
        + "&zoom=16"
        + "&size=" + width + "x" + height
        + "&maptype=roadmap&mobile=true&sensor=true";
    var loader:Loader = new Loader();
    addChild(loader);
    loader.load(new URLRequest(GOOGLE_URL + request));
}

[/code]

This is again a low-impact but limited solution. The user only sees a static image that cannot be scaled and conveys only basic information. Let's now look at using actual maps.

Dynamic Maps

Yahoo! and Google both provide well-documented AS3 libraries for use with their map APIs. The only restriction for both is a maximum of 50,000 uses per day.

Maps are slow to initialize. Create placeholder art to display instead of the default gray rectangle, and display an attractive loading animation. Do not forget to remove the art after the map appears. Any art under the map would slow down its performance.
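
A minimal sketch of that pattern, using the MapEvent.MAP_READY event from the Google API covered next, and assuming placeholder is existing artwork on the display list:

[code]

map.addEventListener(MapEvent.MAP_READY, onMapLoaded);

function onMapLoaded(event:MapEvent):void {
    // remove the placeholder art; anything under the map slows it down
    removeChild(placeholder);
    addChild(map);
}

[/code]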

The Google Maps API for Flash

With the Google Maps API for Flash (http://code.google.com/apis/maps/documentation/flash/), Flex developers can embed Google maps in Flash applications. Sign up for a Google Maps API key and download the Flash SDK. Use version 20 or later (map_1_20.swc or map_flex_1_20.swc), as version 19 has a known issue with ResizeEvent. Add the path to the .swc file in the library path and set Default Linkage to “Merged into code”. As you will see, this API offers a wealth of options that are easy to implement.

To set the library path in Flash Professional, go to File→Publish Settings. Click the tool icon next to Script. Select the Library Path tab and click the Flash icon to navigate to the .swc file. Once you’ve imported the file, change Default Linkage to “Merged into code”.

In Flash Builder, right-click your project, go to Properties→ ActionScript Build Path, and click Add SWC to navigate to the .swc file. “Merged into code” is the default setting.

Create a Map object and set its key and url properties. Entering both your API key and the site URL you submitted when you applied for the key is required, even though you are displaying the map not on your website but in a standalone Android application.

The sensor parameter, also required, states whether you use a GPS sensor. It needs to be a string, not a boolean. The map size, defined by setSize, is set dynamically to the dimensions of the stage.

When the map is ready, the geolocation listener is set up. After the first update is received, the setCenter function is called with location, zoom level, and the type of map to use. Finally, the zoom control is added. Figure 10-4 shows the result:

Figure 10-4. My current location

[code]<uses-permission android:name="android.permission.INTERNET" />[/code]

[code]

import com.google.maps.LatLng;
import com.google.maps.Map;
import com.google.maps.MapEvent;
import com.google.maps.MapType;
import com.google.maps.controls.ZoomControl;
import flash.events.GeolocationEvent;
import flash.geom.Point;
import flash.sensors.Geolocation;

const KEY:String = YOUR_API_KEY;
const SITE:String = YOUR_SITE;
var map:Map;
var geolocation:Geolocation;

map = new Map();
map.key = KEY;
map.url = SITE;
map.sensor = "true";
map.setSize(new Point(stage.stageWidth, stage.stageHeight));
map.addEventListener(MapEvent.MAP_READY, onMapReady);

function onMapReady(event:MapEvent):void {
    geolocation = new Geolocation();
    geolocation.addEventListener(GeolocationEvent.UPDATE, onTravel);
    addChild(map);
}

function onTravel(event:GeolocationEvent):void {
    geolocation.removeEventListener(GeolocationEvent.UPDATE, onTravel);
    map.setCenter(new LatLng(event.latitude, event.longitude),
        18, MapType.NORMAL_MAP_TYPE);
    map.addControl(new ZoomControl());
}

[/code]

Add a marker as a landmark and navigation aid. Here the marker is customized with a shadow, a blue color, and a defined radius. When you click on it, an information window opens with text content:

[code]

import com.google.maps.InfoWindowOptions;
import com.google.maps.MapMouseEvent;
import com.google.maps.overlays.Marker;
import com.google.maps.overlays.MarkerOptions;
import com.google.maps.styles.FillStyle;

var options:Object = {hasShadow:true,
    fillStyle: new FillStyle({color:0x0099FF, alpha:0.75}),
    radius:12
};
var marker:Marker =
    new Marker(new LatLng(45.7924, 15.9696), new MarkerOptions(options));
marker.addEventListener(MapMouseEvent.CLICK, markerClicked);
map.addOverlay(marker);

function markerClicked(event:MapMouseEvent):void {
    event.currentTarget.openInfoWindow(
        new InfoWindowOptions({content:"hello"}));
}

[/code]

Styled Maps support

In October 2010, Google announced support for Styled Maps on Flash, included in Flash SDK version 20 and up (see http://code.google.com/apis/maps/documentation/flash/maptypes.html#StyledMaps). This addition gives you control over color scheme and customization of markers and controls. It makes your map look more unique or match your brand and design. You can also write or draw over the map. The Google Geo Developers Blog (http://googlegeodevelopers.blogspot.com/2010/10/five-great-styled-maps-examples.html) shows some examples of how Styled Maps has been used.

Google Maps 5

Google Maps 5 was released in December 2010 (see http://www.mobilecrunch.com/2010/12/16/google-maps-5-with-3d-buildings-now-available-for-android/). It provides 3D building rendering, dynamic vector-based map drawing, and offline reliability. It is not yet supported in AIR; if you would like to see it supported, file a feature request on the Adobe site.

Reverse Geocoding

Unless you are displaying a map, providing an address or nearby points of interest is more tangible than latitude and longitude.

Reverse geocoding is the process of converting a point location into a readable address or place name. It is widely used in combination with location-based services (LBS) to retrieve local weather data or business information, as well as by public safety services such as Enhanced 911. Such information is not immediately available, but there are free services that provide it.

The Yahoo! Geocoding API is well documented and reliable. You need to apply for an application ID that is required as an argument in your requests. It is not exclusive to mobile use and can be used in the browser, assuming it has a way to detect location. Go to http://developer.yahoo.com/geo/placefinder/ and read the “How Do I Get Started” section to request an ID. You need to log in with a Yahoo! account and fill out the form. Provide the URL of a website, even though it will not be used on the device.

First, add the permission to go out to the Internet:

[code]<uses-permission android:name="android.permission.INTERNET" />[/code]

In this example, I receive the latitude and longitude data from a GeolocationEvent and pass it along with my required applicationID and gflags = R for reverse geocoding:

[code]

import flash.events.Event;
import flash.events.GeolocationEvent;
import flash.events.IOErrorEvent;
import flash.net.URLLoader;
import flash.net.URLRequest;
import flash.net.URLRequestMethod;
import flash.net.URLVariables;
import flash.sensors.Geolocation;

var geolocation:Geolocation;
const YAHOO_URL:String = "http://where.yahooapis.com/geocode";
const applicationID:String = "GET_YOUR_ID_ON_YAHOO_SITE";
var loader:URLLoader;

[/code]

Set the geolocation listener:

[code]

if (Geolocation.isSupported) {
    geolocation = new Geolocation();
    geolocation.addEventListener(GeolocationEvent.UPDATE, onTravel);
}

[/code]

Request reverse geocoding data from the Yahoo! service, passing the coordinates:

[code]

function onTravel(event:GeolocationEvent):void {
    var request:URLRequest = new URLRequest(YAHOO_URL);
    var variables:URLVariables = new URLVariables();
    // comma-separated coordinates for reverse geocoding
    variables.q = event.latitude.toString() + ","
        + event.longitude.toString();
    variables.gflags = "R";
    variables.appid = applicationID;
    request.data = variables;
    request.method = URLRequestMethod.GET;
    loader = new URLLoader();
    loader.addEventListener(Event.COMPLETE, onLocationLoaded);
    loader.addEventListener(IOErrorEvent.IO_ERROR, onError);
    loader.load(request);
}

function onError(event:IOErrorEvent):void {
    trace("error", event);
}

[/code]

Parse the XML received from the service to get city and country information:

[code]

function onLocationLoaded(event:Event):void {
    loader.removeEventListener(Event.COMPLETE, onLocationLoaded);
    geolocation.removeEventListener(GeolocationEvent.UPDATE, onTravel);
    var xml:XML = new XML(event.target.data);
    var city:String = xml.Result.city.text();
    var country:String = xml.Result.country.text();
    trace(xml);
}

[/code]

The XML comes back with a ResultSet.Result node, which includes a street address, city, and country. For example, this is the result for my office building located in New York City’s Times Square:

[code]

<Result>
<quality>99</quality>
<latitude>40.757630</latitude>
<longitude>-73.987167</longitude>
<offsetlat>40.757630</offsetlat>
<offsetlon>-73.987167</offsetlon>
<radius>500</radius>
<name>40.7576303 -73.98716655000001</name>
<line1>230 Rodgers &amp; Hammerstein Row</line1>
<line2>New York, NY 10036-3906</line2>
<line3/>
<line4>United States</line4>
<house>230</house>
<street>Rodgers &amp; Hammerstein Row</street>
<xstreet/>
<unittype/>
<unit/>
<postal>10036-3906</postal>
<neighborhood/>
<city>New York</city>
<county>New York County</county>
<state>New York</state>
<country>United States</country>
<countrycode>US</countrycode>
<statecode>NY</statecode>
<countycode>NY</countycode>
<hash/>
<woeid>12761367</woeid>
<woetype>11</woetype>
<uzip>10036</uzip>
</Result>

[/code]

This address is actually correct for this particular location. It is not my postal address, but it is where I am currently sitting on the west side of the building.

Other interesting data in the XML is woeid and woetype (“woe” is short for Where On Earth). GeoPlanet (http://developer.yahoo.com/geo/geoplanet/guide/) was developed by Yahoo! as a way to identify some features on Earth in a unique, language-neutral manner. This is particularly important because many places in the world have the same name, such as Paris, France, and Paris, Texas.

woeid is a 32-bit, unique, nonrepetitive identifier. Numbers follow a hierarchy such as country, state, and city. woetype is the type used to identify a place; in this case, 11 refers to the postal code.

Twitter and Flickr are currently using the GeoPlanet system.