I saw a great example of slit-scan videos recently, and it reminded me of an experimental time-masking video filter that I made way back when.
The basic idea is to use a grayscale image to mask time in a video feed. As the video frames stream in, I store them in a rolling array that serves as a buffer. To compute the new, distorted image, I loop over all the pixels in the mask image, take each pixel value's distance from white (255), and use that distance as an index into the frame buffer. So if the mask pixel is white (255), the distance is 0 and I grab the pixel from zero frames ago, that is, the most current frame. If the value is black (0), the distance is 255, so I get the pixel from 255 frames ago. If your camera is delivering 30 frames per second, that pixel is roughly eight and a half seconds old.
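The buffering and per-pixel lookup described above can be sketched in Python with NumPy. This is a minimal sketch, not the original filter's code: the class name, the fixed buffer length of 256, and the clamping of delays while the buffer is still filling are all my own assumptions.

```python
import numpy as np

class TimeMaskFilter:
    """Delay each pixel by its mask value's distance from white (255)."""

    def __init__(self, mask, buffer_len=256):
        # Grayscale uint8 mask, same height/width as the incoming frames.
        self.mask = mask
        # White (255) -> 0 frames ago; black (0) -> 255 frames ago.
        self.delays = 255 - mask.astype(int)
        self.buffer = []          # rolling frame buffer, newest first
        self.buffer_len = buffer_len

    def process(self, frame):
        # Push the newest frame onto the front of the rolling buffer.
        self.buffer.insert(0, frame)
        if len(self.buffer) > self.buffer_len:
            self.buffer.pop()
        # Until the buffer fills, clamp delays to the frames we actually have.
        delays = np.minimum(self.delays, len(self.buffer) - 1)
        stack = np.stack(self.buffer)              # shape: (n, h, w)
        rows, cols = np.indices(self.mask.shape)
        # For each pixel, pull the value from `delays` frames ago.
        return stack[delays, rows, cols]
```

For a color feed you would apply the same index to each channel; the lookup itself is identical.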
With that in place you can experiment with all sorts of different masks to see how they affect the video stream. For instance, if you make your mask a vertical linear gradient, you will get an effect that is very similar to a slit-scan video. Here is an example of that:
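A vertical gradient mask like that is easy to generate with NumPy. A minimal sketch, assuming a 640×480 frame size (the dimensions are arbitrary):

```python
import numpy as np

h, w = 480, 640
# Each row fades from white (255, no delay) at the top
# to black (0, maximum delay) at the bottom.
gradient = np.linspace(255, 0, h).astype(np.uint8)
mask = np.tile(gradient[:, None], (1, w))
```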
Here is an example that uses a radial gradient mask:
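A radial gradient mask can be built the same way, from each pixel's distance to the center. Again a rough sketch with assumed dimensions; which end you make white simply decides whether the center or the edges lag behind:

```python
import numpy as np

h, w = 480, 640
ys, xs = np.indices((h, w))
cy, cx = h / 2, w / 2
dist = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2)
# Center = white (no delay), edges fade to black (maximum delay).
mask = (255 * (1 - dist / dist.max())).astype(np.uint8)
```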
Masks that use gradients are a lot of fun because they make for really cool stretching effects. However, there are other options. For example, here are a couple of examples that use images for the mask: