How does Pencil2D use FFmpeg as a system call?

Hello, I’m trying to understand how Pencil2D actually integrates with FFmpeg for exporting movies. From what I’ve observed so far, I believe ffmpeg is being called via system calls, but I still don’t understand what procedure I would need to follow to use ffmpeg in my own project in a similar way. Can anyone give me a brief conceptual outline of what I need to know about FFmpeg or Qt/FFmpeg integration, and how only a single ffmpeg .exe file (and nothing else) ends up in the “plugins” folder of the final build?

I would actually not recommend using ffmpeg in the way that we do. FFmpeg has a C library (libav) which should be used instead. You can look at our incomplete implementation of this for reference here.

If you insist on the binary approach however, you should look at core_lib/movieexporter.cpp; that is where you will find most of the exporting work we do. In particular, we get the location with ffmpegLocation(), which returns a result depending on the operating system but is usually a plugins folder somewhere in or around the executable. We then use that path to run ffmpeg with the desired arguments using QProcess. Frames can be fed in through stdin, or saved to files and added as arguments (this is how we used to do it). As far as I know, the ffmpeg binary is not added by our build process; it must either be added manually or added during the deploy process, which you can find in the .travis.yml file. The exact details of this depend on the operating system, but it always involves downloading an ffmpeg binary from an online source and then copying or moving it to the plugins folder.
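To illustrate the general shape of that approach, here is a minimal sketch, not Pencil2D’s actual code: exportMovie and ffmpegPath are made-up names, and the ffmpeg arguments are just one reasonable choice. It launches an ffmpeg binary with QProcess and pipes PNG frames through stdin.

```cpp
#include <QProcess>
#include <QImage>
#include <QList>
#include <QString>
#include <QStringList>

// Sketch: run an ffmpeg binary at ffmpegPath and feed it frames via stdin.
bool exportMovie(const QString& ffmpegPath,
                 const QList<QImage>& frames,
                 const QString& outputFile,
                 int fps)
{
    QStringList args;
    args << "-y"                         // overwrite the output file without asking
         << "-f" << "image2pipe"         // input is a stream of images on stdin
         << "-framerate" << QString::number(fps)
         << "-vcodec" << "png"           // tell ffmpeg the piped images are PNGs
         << "-i" << "-"                  // "-" means read input from stdin
         << "-c:v" << "libx264"          // encode the video with H.264
         << "-pix_fmt" << "yuv420p"      // widely compatible pixel format
         << outputFile;

    QProcess ffmpeg;
    ffmpeg.start(ffmpegPath, args);
    if (!ffmpeg.waitForStarted()) {
        return false;
    }

    // QProcess is a QIODevice, so each frame can be written straight to stdin.
    for (const QImage& frame : frames) {
        frame.save(&ffmpeg, "PNG");
    }

    ffmpeg.closeWriteChannel();          // signal end of input to ffmpeg
    return ffmpeg.waitForFinished(-1) && ffmpeg.exitCode() == 0;
}
```

One reason to prefer piping over the file-list variant mentioned above is that it avoids writing temporary image files to disk before encoding.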


scribblemaniac is right that FFmpeg’s libav libraries are the way to go if you want to cleanly make programmatic use of its functionality in your program. However, I wouldn’t necessarily take the WIP implementation they linked to as a starting point, since I haven’t updated the version on GitHub with my latest progress in quite a while, and the version that’s currently there had a whole host of problems IIRC. I would only use it for “inspiration” at most.

Also, you should know that there is a fork of FFmpeg, quite popular a few years ago, which was confusingly also named Libav. However, most people have since switched back to FFmpeg, and when we talk about libav we generally mean the libraries included in the original FFmpeg project rather than the fork.

Lastly, in case you’d still like to use the executable (“system call”) approach, you might also have some luck scavenging the source code of screencasting programs for Linux – afaik many of those use that approach.


Thank you, just one last thing. I see FFmpeg has a lot of other libraries, but my main issue is a lack of knowledge about the names and their meanings, and the related frameworks. For example, I’m not sure what exactly is responsible for generating the final .mp4 or video file. Is that the responsibility of the video codec? In other words, what should I call a component that takes multiple images and other parameters as input and gives me a video as output? I hope I’m clear.

In general, video files are made up of two main parts: 1) the actual audio and video data, which is encoded through the use of codecs (most commonly H.264, VP8/VP9, Theora, AAC, Vorbis or Opus, I think), and 2) some “packaging” called a container format, which is generally responsible for organising the video and audio into a single file as well as providing some metadata (common examples: MP4, AVI, WebM, Matroska). Accordingly, the main libraries to use when working with video files through FFmpeg are libavcodec and libavformat, which deal with those two parts respectively (a small usage sketch follows the list below). These are the only two libraries you absolutely need to deal with directly when working with common video files. The other libraries perform various related functions:

  • libavutil is a general utility library containing some math functions and other common stuff
  • libavdevice is for working with devices such as audio capture and playback devices. I haven’t really looked into it, though
  • libavfilter provides a filter graph for A/V data (think ffmpeg -filter / -filter_complex options)
  • libswscale is for software-side rescaling and color space conversion of video data
  • libswresample is for software-side audio resampling, mixing and sample format conversion
  • Apparently there’s also libpostproc but it’s practically undocumented and I have no idea what exactly it does
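To give a concrete flavour of working with libavformat and libavcodec directly, here is a rough sketch (not taken from Pencil2D; "input.mp4" is just a placeholder path) that opens a container with libavformat and prints the codec name of each stream:

```cpp
// The libav headers are C, so wrap them in extern "C" when compiling as C++.
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}
#include <cstdio>

int main()
{
    AVFormatContext* fmt = nullptr;

    // libavformat parses the container (MP4, WebM, ...) and exposes its streams.
    if (avformat_open_input(&fmt, "input.mp4", nullptr, nullptr) < 0) {
        std::fprintf(stderr, "could not open input\n");
        return 1;
    }
    if (avformat_find_stream_info(fmt, nullptr) < 0) {
        std::fprintf(stderr, "could not read stream info\n");
        avformat_close_input(&fmt);
        return 1;
    }

    // Each stream's codec parameters tell us which libavcodec decoder
    // would be needed (H.264, AAC, ...).
    for (unsigned i = 0; i < fmt->nb_streams; ++i) {
        const AVCodecParameters* par = fmt->streams[i]->codecpar;
        std::printf("stream %u: %s\n", i, avcodec_get_name(par->codec_id));
    }

    avformat_close_input(&fmt);
    return 0;
}
```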

Some fundamental terms you will come across (the decoding sketch after this list shows how they fit together):

  • stream: audio and video data in container formats are organised into those. A typical video file has one video and one audio stream
  • packet: contains one or more frames of a stream
  • frame: pixel data of a single picture (in video streams) or a bunch of samples (in audio streams)
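Here is a rough decoding sketch showing those terms in action, assuming fmt was opened as in the previous example and videoIndex is the index of the video stream. Packets are read from the container one at a time, and each packet decodes into zero or more frames (error handling is mostly omitted):

```cpp
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}

// Sketch: decode every packet of one video stream into frames.
static void decodeVideoStream(AVFormatContext* fmt, int videoIndex)
{
    const AVCodecParameters* par = fmt->streams[videoIndex]->codecpar;
    const AVCodec* codec = avcodec_find_decoder(par->codec_id);
    AVCodecContext* ctx = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(ctx, par);
    avcodec_open2(ctx, codec, nullptr);

    AVPacket* packet = av_packet_alloc();
    AVFrame* frame = av_frame_alloc();

    // av_read_frame() actually returns one *packet* at a time.
    while (av_read_frame(fmt, packet) >= 0) {
        if (packet->stream_index == videoIndex) {
            avcodec_send_packet(ctx, packet);            // packet in...
            while (avcodec_receive_frame(ctx, frame) == 0) {
                // ...zero or more frames out; frame->data now holds the pixels.
            }
        }
        av_packet_unref(packet);
    }

    av_frame_free(&frame);
    av_packet_free(&packet);
    avcodec_free_context(&ctx);
}
```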

Most other fundamental terms (framerate, channels, samples, etc.) should be familiar to you if you know a little about digital media. Lastly, have a look at the API documentation and the examples for more orientation. I hope you find this useful for getting started.


Thanks, yes, it was very helpful indeed, as I did study some of these terms on the web. The only issue is that resources on digital media are a bit scattered, and I’m often left with no choice but to come up with my own interpretation that I’m not sure about myself. Thanks for the concrete explanation and for helping me out :slight_smile: