During the COVID-19 confinement, I had to use Zoom for my daily chats and meetings with colleagues.

I had an old version installed, and after updating to a newer one, I found that Zoom offers Virtual Backgrounds, which is pretty fun, but the Linux version (3.5.3) lacks support for animated ones.

To have some extra fun during my calls, I wanted to integrate a fun filter, like the ones you might find in various video applications.

After some searching, I found that by combining multiple tools and commands I was able to “integrate” an ASCII art filter into my Linux Zoom setup.

Tools

Hasciicam

hasciicam is a project that converts video from a TV card or webcam into ASCII. It uses AAlib underneath to render a YUV420 video stream into ASCII art. It only outputs greyscale images, though.
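
For a quick try, hasciicam can render the webcam live. The flags below are from memory and may differ between versions, so check hasciicam -h; /dev/video0 is assumed to be the webcam.

hasciicam -m live -d /dev/video0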

Libcaca

libcaca is a graphics library that outputs text instead of pixels, so it can work on older video cards or text terminals. Unlike AAlib, its output is in color; the video stream is fed to it as rgb24.
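
A quick way to see libcaca in action is the img2txt utility from the caca-utils package, which renders a still image as ASCII in the terminal (photo.jpg below is just a placeholder):

sudo apt install caca-utils
img2txt photo.jpg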

V4L2

v4l2loopback is a kernel module that creates dummy video devices you can interact with via the V4L2 API.

sudo apt install v4l2loopback-utils v4l2loopback-dkms gstreamer1.0-tools
sudo modprobe v4l2loopback card_label="VirtualCam #0"
ls /dev/video1
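
Two module parameters are worth knowing about: video_nr pins the device number (so the dummy device reliably shows up as /dev/video1), and exclusive_caps=1 is often required for applications like Zoom or browsers to list the loopback device at all. If the virtual camera does not show up later on, try reloading the module like this:

sudo modprobe -r v4l2loopback
sudo modprobe v4l2loopback video_nr=1 exclusive_caps=1 card_label="VirtualCam #0"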

FFMPEG

ffmpeg is a great tool to manipulate audio and video.

Assuming your real webcam is attached and accessible via /dev/video0, you can simply “copy” its video stream and make it available through the dummy device provided by v4l2loopback:

ffmpeg -i /dev/video0 -f v4l2 -pix_fmt yuv420p /dev/video1

ffmpeg, if compiled with libcaca support, can produce colored ASCII art output:

ffmpeg -i /dev/video0 -c:v rawvideo  -pix_fmt rgb24 -f caca -
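
The caca output device also accepts a few options, such as window_title, which gives the rendering window a predictable name instead of the default (the output filename). That makes the window easier to capture later; “asciicam” below is just a name I picked:

ffmpeg -i /dev/video0 -c:v rawvideo -pix_fmt rgb24 -window_title "asciicam" -f caca -

If you set a title this way, use it as xname in the GStreamer capture further down instead of “pipe:”.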

GStreamer

Another way to feed a dummy device is to use a GStreamer pipeline. The v4l2sink element can be used to write video to V4L2 devices.

gst-launch-1.0 -v videotestsrc pattern=ball ! v4l2sink device=/dev/video1
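
To check that the dummy device actually receives frames, open it with any V4L2-capable player while the pipeline above is running, for example:

ffplay -f v4l2 /dev/video1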

Let’s go

Now let’s connect everything (the full sequence is also sketched as a script after this list):

  • Create a dummy video device (if not already available):
    sudo modprobe v4l2loopback card_label="VirtualCam #0"
    
  • Convert the webcam stream to ASCII art and render it in an X11 window:
    ffmpeg -i /dev/video0 -c:v rawvideo -pix_fmt rgb24 -f caca -
    
  • Pipe the X11 window output into the dummy device (ffmpeg titles the ASCII window after its output file, here “pipe:” since the output goes to stdout):
    gst-launch-1.0 ximagesrc xname="pipe:" ! video/x-raw,framerate=5/1 ! videoconvert ! videoscale ! "video/x-raw,format=YUY2,width=320,height=240" ! v4l2sink device=/dev/video1
    
  • Open Zoom. You can now select the dummy video device: Video icon > Select a camera > VirtualCam #0, and enjoy.
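
Put together, the whole thing can be wrapped in a small shell sketch. It assumes the real webcam is /dev/video0, pins the loopback device to /dev/video1, and relies on the ffmpeg ASCII window keeping its default “pipe:” title:

#!/bin/sh
# Create the dummy device as /dev/video1 (exclusive_caps helps Zoom detect it)
sudo modprobe v4l2loopback video_nr=1 exclusive_caps=1 card_label="VirtualCam #0"

# Render the real webcam as colored ASCII art in an X11 window titled "pipe:"
# (-nostdin keeps the backgrounded ffmpeg from grabbing the terminal)
ffmpeg -nostdin -i /dev/video0 -c:v rawvideo -pix_fmt rgb24 -f caca - &
FFMPEG_PID=$!
sleep 2  # give the ASCII window time to appear

# Capture that window and feed it to the dummy device Zoom will use
gst-launch-1.0 ximagesrc xname="pipe:" ! video/x-raw,framerate=5/1 ! videoconvert ! videoscale ! "video/x-raw,format=YUY2,width=320,height=240" ! v4l2sink device=/dev/video1

# Stop the ASCII rendering once the capture pipeline exits
kill "$FFMPEG_PID"

If the ASCII output shows up in the terminal instead of an X11 window, forcing libcaca’s X11 driver (e.g. with the CACA_DRIVER=x11 environment variable) should help.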

A better solution, one that involves some coding, would have been to:

  • Acquire the video signal from the real device, via the V4L2 API
  • Convert the stream via libcaca, with optional image tuning
  • Convert the rgb24 output back to YUV420
  • Output the video signal to the dummy device, via the V4L2 API

For now, it’s good enough to play around with.