Producing video with flash/air: png sequence rendering from swf playback

I’ve been quite busy at school here in Barcelona. Busy in fun ways too: recently I put flash to good use in a ‘creative project’ my mates and I had to pull through in the last couple of days before exams kicked in. We hacked together this short animation to welcome freshmen to next year’s csr classes:

View on Vimeo.

You have to love flash – you can pull off quite a few tricks in no time. All scenes were done in the flash authoring tool, and the swf files were rendered, with camera shake and flicker, to png sequences by a small air app. The non-flash part was simply putting everything in an avi container, compressing it and muxing it with audio – all done with virtualdub and megui.

Rendering swf playback to an image sequence

When working with video, I personally enjoy image sequences because they’re so easy to manage. My uncontested favourite format for image editing is png, mostly because it’s compact and lossless. In the case of dumping video frames from flash, it also has the advantage of compressing rasterised vector graphics very well.

The app I put together is a sort of batch processor that you compile with the rendering ‘script’ inlined in its code; it’s beyond simple, but it worked so well that I thought I’d post it here in case someone needs to do something similar.

Here’s the cleaned-up api:

public class PNGSequenceRenderer extends Sprite
{
	/**
	 * Sets the dimensions of the input swf material.
	 * @param	width (pixels)
	 * @param	height (pixels)
	 */
	public function setInputSize ( width : Number, height : Number ) : PNGSequenceRenderer;

	/**
	 * Sets the dimensions of the output png sequence.
	 * @param	width (pixels)
	 * @param	height (pixels)
	 */
	public function setOutputSize ( width : uint, height : uint ) : PNGSequenceRenderer;

	/**
	 * Sets the scale mode for the output, either 'show all' or 'no border'.
	 * @param	showAll true for 'show all', false for 'no border'.
	 */
	public function setScaleMode ( showAll : Boolean ) : PNGSequenceRenderer;

	/**
	 * Sets the level of camera shake. ( e.g. 0 - disabled, 1 - moderate, 2 - heavy. )
	 * @param	factor Camera shake amount.
	 */
	public function setCameraShake ( factor : Number = 0 ) : PNGSequenceRenderer;

	/**
	 * Sets the level of flicker. ( e.g. 0 - disabled, 1 - moderate, 2 - heavy. )
	 * @param	factor Level of shutter flicker.
	 */
	public function setFlicker ( factor : Number = 0 ) : PNGSequenceRenderer;

	/**
	 * Sets the output background transparency and fill color.
	 * @param	transparency Whether to draw on a transparency-enabled BitmapData.
	 * @param	color Background color for rendering.
	 */
	public function setBackground ( transparency : Boolean = false, color : uint = 0 )
		: PNGSequenceRenderer;

	/**
	 * Enqueues a scene for rendering.
	 * @param	url The location of the movie to render.
	 * @param	cameraShakeMultiplier Local camera shake adjustment.
	 * @param	flickerMultiplier Local flicker adjustment.
	 * @param	from First frame to render.
	 * @param	to Frame upon which to end the scene early.
	 */
	public function enqueueScene ( url : String, cameraShakeMultiplier : Number = 1,
		flickerMultiplier : Number = 1, from : uint = 0, to : uint = 0xffffff )
		: PNGSequenceRenderer;

	/**
	 * Begins loading scenes and outputting frames into the specified folder.
	 * If no other limit is specified, each scene is rendered until a pink
	 * ( solid #ff00ff ) frame is reached.
	 * @param	outputFolder The folder where all output files are to be put.
	 */
	public function render ( outputFolder : String ) : void;
}
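
For reference, here is roughly how camera shake and flicker can be applied at draw time. This is a hypothetical sketch, not the actual implementation: `rng.nextNormal ()` stands in for the normally-distributed Park-Miller prng the post links to below, and `outputWidth`, `inputWidth`, `frame`, `scene` and friends are made-up names.

```actionscript
import flash.geom.Matrix;
import flash.geom.ColorTransform;

// Camera shake: jitter the draw matrix by a few normally-distributed pixels.
var matrix : Matrix = new Matrix ();
matrix.scale ( outputWidth / inputWidth, outputHeight / inputHeight );
matrix.translate ( cameraShake * rng.nextNormal (), cameraShake * rng.nextNormal () );

// Flicker: scale the rgb channel multipliers around 1 each frame.
var brightness : Number = 1 + 0.1 * flicker * rng.nextNormal ();
var tint : ColorTransform = new ColorTransform ( brightness, brightness, brightness );

// Render the scene into the output bitmap with both effects applied.
frame.fillRect ( frame.rect, backgroundColor );
frame.draw ( scene, matrix, tint );
```
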

How to

To use it, you extend your app from the PNGSequenceRenderer class and put together the rendering script in the constructor body, something like:

public class Main extends PNGSequenceRenderer
{
	public function Main () : void
	{
		setCameraShake ( 1 );
		setFlicker ( 1 );
		setInputSize ( 800, 400 );
		setOutputSize ( 853, 480 );
		enqueueScene ( 'C:\\prj\\swfs\\scene01.swf' );
		enqueueScene ( 'C:\\prj\\swfs\\scene02.swf' );
		render ( 'C:\\prj\\output\\' );
	}
}

Compiling the project for the air 1.5 target will launch adl and run your app, which will load and play your scenes one by one, dumping the png sequences into the output folder. During the process you get some basic on-screen feedback, mainly which scene and frame is currently being rendered.

Note: before you jump into testing it out, remember to put a solid pink fill frame (#ff00ff) at the end of each scene – the renderer expects a frame with a pink middle pixel in order to proceed with the queue. Also, compile in release mode, as the PNG encoder runs two to three times faster without debug stuff in the bytecode.
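
Internally, the per-frame dump boils down to drawing the loaded scene into a BitmapData, checking the middle pixel, and encoding with as3corelib’s PNGEncoder. A simplified sketch – `scene`, `outputFolder`, `frameNumber` and `advanceQueue` are hypothetical names:

```actionscript
import flash.display.BitmapData;
import flash.filesystem.File;
import flash.filesystem.FileStream;
import flash.filesystem.FileMode;
import com.adobe.images.PNGEncoder;

// Rasterise the current frame of the loaded scene.
var frame : BitmapData = new BitmapData ( outputWidth, outputHeight, false, 0 );
frame.draw ( scene );

// A solid pink middle pixel marks the end of the scene.
if ( frame.getPixel ( outputWidth >> 1, outputHeight >> 1 ) == 0xff00ff )
{
	advanceQueue ();
}
else
{
	// Encode to png and write the file to the output folder.
	var file : File = new File ( outputFolder + 'scene01_' + frameNumber + '.png' );
	var stream : FileStream = new FileStream ();
	stream.open ( file, FileMode.WRITE );
	stream.writeBytes ( PNGEncoder.encode ( frame ) );
	stream.close ();
}
```
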

Once you have the image sequences ready, you can edit them in virtually any video editing software package (if you need to), and then package everything in a video container. The latter will come out pretty voluminous, so you’ll need to compress it with a codec, such as h.264. Finally, you mux it with the audio and you are ready for distribution.
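
If you prefer the command line over the gui tools below, ffmpeg (which also comes up in the comments) can do the packaging, compression and muxing in one go. A sketch – the frame rate, file names and audio format are assumptions, adjust them to your project:

	# Read the png sequence at 25 fps, compress to h.264, mux in the audio track.
	ffmpeg -framerate 25 -i scene01_%05d.png -i audio.wav \
	       -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest output.mp4
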

For the windows crowd, virtualdub is a great piece of software for playing with video, and also pulls off the packaging of the sequences into a container with a couple of mouse clicks. Also, check out megui for the video and audio compression and muxing, it makes it all real easy.


The code requires the PNGEncoder class from as3corelib, as well as the standard-normal-output Park-Miller prng class (described here and used for the flicker and camera shake effects).


6 Responses to “Producing video with flash/air: png sequence rendering from swf playback”

  • hi.

    Last year in March we built a video renderer for Nokia’s retail spaces.
    The system is based on a website with a WYSIWYG interface for creating content.
    Each newly generated piece is fed to an off-site mac mini that renders the frames and passes them to ffmpeg through a series of perl scripts, which after the render update the DB and email the rendered video (or GIF) to the creator, so it can be uploaded to a device running either S60, S60touch or S40 with the right screen format.
    It was a very fun project to develop and it was nice to find all the workarounds.
    We used Flash for the website, Air for the PNG renderer, and a linux VM for ffmpeg.
    All the system elements communicate over TCP sockets for handshaking, error and queue management and so on.

  • Which makes me think that if one has access to the shell of a linux server, one could encode the pngs in the flash client, send the frames to the server via http and encode them directly with ffmpeg, without the air app intermediary. This would eat up a lot of bandwidth though.

  • Great job! When CS3 was first released, one of the features I was most excited about was export to video – but it never really worked properly – so I’ll have to give this a go. Cheers!

  • Just developed an AIR utility to convert SWF to PNG sequences: – hope you find it useful….

  • Also, check this one out – faster PNG Encoder:
