I have been filming out the window of subways lately and have come up with a foolproof method for creating slit scan images out of these vignettes. I use an iPhone 5s held in portrait mode, so the instructions below include a rotation of the video before compositing.
If they’re not already installed, install FFmpeg and ImageMagick using Homebrew:
$> brew install imagemagick
$> brew install ffmpeg
Next, in the terminal, navigate to the folder containing the .MOV file and create a folder to hold the frame images:
$> cd ~/myvideofolder
$> mkdir frames
If the video is long, you may want to do a test run at 1fps first. Note the -vf "transpose=1", which rotates the video 90° clockwise, and the -r 1, which sets the extraction rate to 1 frame per second.
$> ffmpeg -i YOURMOVIEFILE.MOV -vf "transpose=1" -r 1 -f image2 frames/image-%03d.jpg
To convert all the frames, leave out the -r 1. You can also remove the -vf "transpose=1" if you do not wish to rotate the video in this step.
$> ffmpeg -i YOURMOVIEFILE.MOV -vf "transpose=1" -f image2 frames/image-%03d.jpg
This should run for a few seconds and output a series of JPEG files in the frames folder. Now to slice the images up using mogrify. The crop geometry below takes a slice 1px wide by 1920px tall from the horizontal center (x offset 540) of every image in the folder. The image of the cyclist above actually used the flag -crop 5x1920+540+0 so that a 5px-wide sample of each image would be taken.
$> cd frames
$> mogrify -crop 1x1920+540+0 *.jpg
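As a concrete sketch of that wider 5px sample (the 1080x1920 frame size and 540px offset assume an iPhone-style portrait video; adjust them if yours differs):

```shell
# Wider strips: take a 5px-wide, full-height column from near the
# horizontal center of each frame. The final montage will then be
# 5px times the number of frames wide.
width=5
geometry="${width}x1920+540+0"
mogrify -crop "$geometry" *.jpg
```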
Opening any of the images now should yield a skinny vertical line of pixels. Now to splice them together with montage.
$> montage *.jpg -tile x1 -mode concatenate slitscan.jpg
$> open slitscan.jpg
Voilà, a slit scanned image composited from a portrait video using the terminal. If these instructions were useful to you or you have any ideas for improvement, please let me know.
Here’s another version from 3 minutes of video on the F train looking north in the late afternoon.
A more traditional up-down slit scan requires some logic in the bash execution:
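One way to sketch that logic (this is my assumption of the approach, not a script from the original post: each frame contributes one horizontal row, with the sampled y offset advancing a pixel per frame; the 1080x1920 frame size and the frames folder from the steps above are assumed):

```shell
#!/bin/bash
# Hypothetical up-down slit scan: crop a 1px-tall row from each frame,
# moving the row down one pixel per frame, then stack the strips
# vertically so time runs top-to-bottom in the final image.
mkdir -p strips
i=0
for f in frames/image-*.jpg; do
  y=$(( i % 1920 ))   # wrap around if there are more frames than rows
  convert "$f" -crop "1080x1+0+${y}" "strips/strip-$(printf '%05d' "$i").jpg"
  i=$(( i + 1 ))
done
montage strips/strip-*.jpg -tile 1x -mode concatenate slitscan-updown.jpg
```

Using -tile 1x instead of -tile x1 stacks the strips vertically, and zero-padding the strip filenames keeps the glob in frame order.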