Building animations from image series with ffmpeg

Markku Alho edited this page Sep 7, 2021 · 13 revisions

When working with Vlasiator simulation results, Analysator usually produces directories full of .png files. FFmpeg, the animation Swiss army knife, can be used to turn these into video files. (ffmpeg is probably available on all relevant systems via module load ffmpeg or apt install ffmpeg.)

A typical command line to do so looks like this:

ffmpeg -y -f image2 -start_number 1234 -framerate 5 -i ABC_Potato_%07d.png -vf "fps=5,scale=trunc(iw/2)*2:trunc(ih/2)*2" -c:v libx264 -preset slow -profile:v baseline -qp 18 -pix_fmt yuv420p outputfile.mp4

Watch out: the ordering of command line options matters. Options placed before -i apply to the input; options placed after it apply to the output.
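To make the ordering rule visible, the command above can be decomposed into named groups (the variable names here are purely illustrative):

```shell
# Options before -i configure how the input is read;
# options after -i configure how the output is encoded.
input_opts="-f image2 -start_number 1234 -framerate 5"
output_opts="-c:v libx264 -preset slow -qp 18 -pix_fmt yuv420p"
cmd="ffmpeg -y $input_opts -i ABC_Potato_%07d.png $output_opts outputfile.mp4"
echo "$cmd"
```

Moving, say, -framerate into the output group would change its meaning: it would then no longer control how fast the image sequence is read in.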

The individual parts of this command line are:

  • -y: Assume "yes" to every question, especially to overwriting the output file
  • -f image2 -start_number 1234 -framerate 5 -i ABC_Potato_%07d.png: This is the input specification. Read input files with names "ABC_Potato_0001234.png, ABC_Potato_0001235.png, ...", assuming an input framerate of 5 fps
  • -vf "fps=5,scale=trunc(iw/2)*2:trunc(ih/2)*2": Build a filter chain that enforces a 5 fps framerate and scales the image to an even number of pixels in each dimension. Some devices (hello, iPhone!) won't play videos with an odd width or height.
  • -c:v libx264 -preset slow: Create video with the H.264 codec, with a bunch of parameters set to reasonable defaults with the "slow" preset (For slightly smaller files, use the "veryslow" preset, for real-time video encoding, consider using "fast" or "veryfast").
  • -profile:v baseline: Only use codec features from the "baseline" feature set of H.264. This ensures that the video is playable on all devices. You get significantly smaller files using the "high" profile, but many mobile devices then refuse to play it.
  • -qp 18: Quantization parameter set to 18. This number determines the quality of the resulting video; smaller numbers give better quality. -qp 0 is perfectly lossless, -qp 10 is visually indistinguishable from the source even on great screens, and -qp 24 is about YouTube quality.
  • -pix_fmt yuv420p: Encode the resulting video with YUV 4:2:0 chroma subsampling (a horrible digital descendant of the NTSC colour encoding, with reduced chroma resolution). Unfortunately, this is the only pixel format reliably supported on all consumer hardware. If you leave this parameter out, the encoder will typically pick a full-chroma format such as yuv444p, which is much better in terms of scientific reproduction of the results (since it won't distort plotting colour scales), but no longer plays on Apple hardware.
  • outputfile.mp4: The output file that should be written to.
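The scale=trunc(iw/2)*2:trunc(ih/2)*2 filter simply rounds each dimension down to the nearest even number. The same arithmetic can be sketched in plain shell (the 1083×767 frame size is a made-up example):

```shell
# trunc(iw/2)*2 in the ffmpeg filter is integer division by 2, then times 2,
# i.e. round down to the nearest even number.
iw=1083; ih=767
echo "$(( iw / 2 * 2 ))x$(( ih / 2 * 2 ))"   # prints 1082x766
```

At most one row and one column of pixels are dropped, which is invisible in practice.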

Note:

  • You can specify different framerates for input and output. Since the input, in this case, is "just a bunch of files", that number can be pretty arbitrary. The filter chain defines the output framerate.
  • If a resulting video is too huge to be played in a browser (because, for example, your simulation has ridiculously high resolution), you can subsequently create lower-resolution or lower-quality versions of the video by re-encoding it: ffmpeg -i highres.mp4 -vf "scale=trunc(iw/8)*2:trunc(ih/8)*2" -c:v libx264 -preset slow -qp 24 lores.mp4 for a quarter-resolution, low-quality version, for example. Note the quotes around the filter (the parentheses would otherwise confuse the shell) and the rounding to even dimensions, as above.
  • Instead of -start_number 1234 -i formatted_file_%07d.png one can use regular glob patterns: -pattern_type glob -i 'some_imgs_*.png'. Unlike the %07d form, this does not fail if some frame is missing from the sequence.
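If glob patterns are not available (they are missing from some ffmpeg builds) and your frame numbering has gaps, another option is to renumber the files into the consecutive %07d pattern that the image2 demuxer expects. A minimal sketch, with made-up filenames standing in for real Analysator output:

```shell
# Create a few stand-in frames with gaps in the numbering.
mkdir -p frames
touch frames/ABC_Potato_0000001.png frames/ABC_Potato_0000005.png frames/ABC_Potato_0000009.png

# Rename them to a gap-free sequence; the shell glob expands in sorted
# order, so the original frame order is preserved.
n=0
for f in frames/ABC_Potato_*.png; do
    newname=$(printf 'frames/seq_%07d.png' "$n")
    mv "$f" "$newname"
    n=$((n + 1))
done
ls frames   # seq_0000000.png seq_0000001.png seq_0000002.png
```

The renumbered sequence can then be encoded with a plain -i frames/seq_%07d.png (no -start_number needed, since numbering begins at 0).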

See also

The FFmpeg FAQ page about making movies from images.