FFME: The Advanced WPF MediaElement Alternative
⭐ Please star this project if you like it and show your appreciation via PayPal.Me
Current NuGet Release Status
- If you would like to support this project, you can show your appreciation via PayPal.Me
- Current Status: (2019-04-19) Release 4.1.300 is now available (see the Releases)
- NuGet Package available here: https://www.nuget.org/packages/FFME.Windows/
- FFmpeg Version: 4.1.1 32-bit or 64-bit
Please note the current NuGet release might require a different version of the FFmpeg binaries than the current state of the source code does.
Quick Usage Guide for WPF Apps
Here is a quick guide on how to get started.
- Open Visual Studio (v2019 recommended), and create a new WPF Application. Target Framework must be 4.6.1 or above.
- Install the NuGet Package from your Package Manager Console:
PM> Install-Package FFME.Windows
- You need FFmpeg shared binaries (64 or 32 bit, depending on your app's target architecture). Build your own or download a compatible build from Zeranoe FFmpeg Builds site.
- Your FFmpeg build should have a `bin` folder with 3 exe files and some dll files. Copy all those files to a folder such as `c:\ffmpeg`.
- Within your application's startup code (e.g. the `App.xaml.cs` constructor), set `Unosquare.FFME.Library.FFmpegDirectory = @"c:\ffmpeg";`.
- Use the FFME `MediaElement` control as any other WPF control. For example: in your `MainWindow.xaml`, add the namespace `xmlns:ffme="clr-namespace:Unosquare.FFME;assembly=ffme.win"` and then add the FFME control to your window's XAML:
<ffme:MediaElement x:Name="Media" Background="Gray" LoadedBehavior="Play" UnloadedBehavior="Manual" />
- To play files or streams, simply set the `Source` property: `Media.Source = new Uri(@"c:\your-file-here");`. Since `Source` is a dependency property, it needs to be set from the GUI thread.
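Putting the steps above together, here is a minimal sketch (assuming the default WPF project templates and the `Media` control name from the XAML above; adjust the paths to your setup):

```csharp
// App.xaml.cs -- point FFME to the FFmpeg binaries before any MediaElement loads.
public partial class App : Application
{
    public App()
    {
        Unosquare.FFME.Library.FFmpegDirectory = @"c:\ffmpeg";
    }
}

// MainWindow.xaml.cs -- Source is a dependency property, so set it on the GUI thread.
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        Loaded += (s, e) => Media.Source = new Uri(@"c:\your-file-here");
    }
}
```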
Note: To build your own FFmpeg binaries, I recommend the Media Autobuild Suite but please don't ask for help on it here.
Additional Usage Notes
- Remember: the `Unosquare.FFME.Windows.Sample` project provides usage examples for plenty of features. Use it as your main reference.
- The generated API documentation is available here.
FFME is an advanced and close drop-in replacement for Microsoft's WPF MediaElement Control. While the standard MediaElement uses DirectX (DirectShow) for media playback, FFME uses FFmpeg to read and decode audio and video. This means that for those of you who want to support stuff like HLS playback, or just don't want to go through the hassle of installing codecs on client machines, using FFME might just be the answer.
FFME provides multiple improvements over the standard MediaElement such as:
- Fast media seeking and frame-by-frame seeking
- Properties such as Position, Balance, SpeedRatio, IsMuted, and Volume are all Dependency Properties.
- Additional and extended media events. Extracting (and modifying) video, audio and subtitle frames is very easy.
- Easily apply FFmpeg video and audio filtergraphs.
- Extract media metadata and tech specs of a media stream (title, album, bit rate, codecs, FPS, etc).
- Apply volume, balance and speed ratio to media playback.
- MediaState actually works on this control. The standard WPF MediaElement severely lacks in this area.
- Ability to pick media streams contained in a file or a URL.
- Specify input and codec parameters.
- Opt-in hardware decoding acceleration via devices or via codecs.
- Capture stream packets, audio, video and subtitle frames
- Perform custom stream reading and stream recording
... all in a single MediaElement control
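As a sketch of what the extended events enable (the event names below are taken to match the sample application; they are assumptions here, so verify the exact event-argument members against the generated API documentation):

```csharp
// Subscribe before opening media; these hooks are where stream selection,
// input/codec parameters, and frame inspection happen.
Media.MediaOpening += (s, e) =>
{
    // Pick streams and set input/codec parameters via the event arguments here.
};

Media.RenderingVideo += (s, e) =>
{
    // Called for every video frame right before it is rendered;
    // the frame data can be inspected or modified here.
};
```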
FFME also supports opening capture devices. See example Source URLs below and issue #48
- device://dshow/?audio=Microphone (Vengeance 2100):video=MS Webcam 4000
- device://gdigrab?title=Command Prompt
- device://gdigrab?desktop
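Opening a capture device is no different from opening a file; for example (assuming the `Media` control name used earlier):

```csharp
// Grab the entire desktop as a live video source via gdigrab.
Media.Source = new Uri("device://gdigrab?desktop");
```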
If you'd like audio to not change pitch while changing the `SpeedRatio` property, you'll need the `SoundTouch.dll` library v2.1.1 available in the same directory as the FFmpeg binaries. You can get the SoundTouch library here.
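For example, with `SoundTouch.dll` in place, changing the playback rate keeps the audio pitch constant:

```csharp
// SpeedRatio is a dependency property; set it from the GUI thread.
Media.SpeedRatio = 1.5; // play 1.5x faster without raising the audio pitch
```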
About how it works
First off, let's review a few concepts. A `packet` is a group of bytes read from the input. All `packets` are of a specific `MediaType` (Audio, Video, Subtitle, Data), and contain some timing information and, most importantly, compressed data. Packets are sent to a `Codec` and, in turn, the codec produces `Frames`. Please note that producing 1 `frame` does not always take exactly 1 `packet`: a `packet` may contain many `frames`, but a `frame` may also require several `packets` for the decoder to build it. `Frames` contain timing information and the raw, uncompressed data. Now, you may think you can use `frames` to show pixels on the screen or send samples to the sound card. We are close, but we still need to do some additional processing. It turns out different `Codecs` will produce different uncompressed data formats. For example, some video codecs will output pixel data in ARGB, some others in RGB, and some others in YUV420. Therefore, we will need to convert these `frames` into something all hardware can use natively. I call these converted frames `MediaBlocks`. These `MediaBlocks` will contain uncompressed data in standard audio and video formats that all hardware is able to receive.
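The flow just described can be summarized in pseudocode (illustrative only; these are not FFME's actual types or method signatures):

```csharp
// packets (compressed) -> codec -> frames (raw) -> converter -> media blocks (renderable)
foreach (var frame in codec.Decode(packet))      // one packet may yield zero or more frames
{
    MediaBlock block = converter.Convert(frame); // e.g. YUV420 pixels -> a format the hardware displays natively
    renderQueue.Enqueue(block);
}
```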
The process described above is implemented in 3 different layers:
- The `MediaContainer` wraps an input stream. This layer keeps track of a `MediaComponentSet`, which is nothing more than a collection of media components (audio, video, subtitle, data), each holding its own packet queue and frame-to-`block` conversion logic. It provides the following important functionality:
  - We call `Open` to open the input stream and detect the different stream components. This also determines the codecs to use.
  - We call `Read` to read the next available packet and store it in its corresponding component (audio, video, subtitle, data, etc.).
  - We call `Decode` to read the following packet from the queue that each of the components holds, and return a set of frames.
  - Finally, we call `Convert` to turn a given `frame` into a `MediaBlock`.
- The `MediaEngine` wraps a `MediaContainer` and is responsible for executing commands to control the input stream (Play, Pause, Stop, Seek, etc.) while keeping 3 background workers:
  - The `PacketReadingWorker` is designed to continuously read packets from the `MediaContainer`. It will read packets when it needs them and pause when it does not. This is determined by how much data is in the cache: it will try to keep approximately 1 second of media packets at all times.
  - The `FrameDecodingWorker` gets the packets that the `PacketReadingWorker` writes and decodes them into frames. It then converts those frames into `blocks` and writes them to a `MediaBlockBuffer`. This block buffer can then be read by something else (the next worker described here) so its contents can be rendered.
  - Finally, the `BlockRenderingWorker` reads blocks from the `MediaBlockBuffer`s and sends those blocks to a platform-specific renderer.
- At the highest level, we have a `MediaElement`. It wraps a `MediaEngine` and contains platform-specific implementations of methods to perform audio rendering, video rendering, subtitle rendering, and property synchronization between the `MediaEngine` and itself.
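An extremely simplified view of how the three workers cooperate (pseudocode; the real implementation synchronizes these loops and its type and member names differ):

```csharp
// PacketReadingWorker: keep roughly 1 second of packets buffered at all times.
while (running)
    if (container.BufferedDuration < TimeSpan.FromSeconds(1))
        container.Read();

// FrameDecodingWorker: turn queued packets into frames, then frames into blocks.
while (running)
    foreach (var frame in container.Decode())
        blockBuffer.Write(container.Convert(frame));

// BlockRenderingWorker: hand the block matching the playback clock to the renderer.
while (running)
    renderer.Render(blockBuffer.GetBlockAt(clock.Position));
```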
A high-level diagram is provided as additional reference below.
Some Work In Progress
Your help is welcome!
- I am planning the next version of this control, `Floyd`. See the Issues section.
Windows: Compiling, Running and Testing
Please note that I am unable to distribute FFmpeg's binaries because I don't know if I am allowed to do so. Follow the instructions below to compile, run and test FFME.
- Clone this repository.
- Download the FFmpeg shared binaries for your target architecture: 32-bit or 64-bit.
- Extract the contents of the `zip` file you just downloaded and go to the `bin` folder that got extracted. You should see 3 `exe` files and multiple `dll` files. Select and copy all of them.
- Now paste all files from the prior step into a well-known folder. Take note of the full path. (I used `c:\ffmpeg`.)
- Open the solution and set the `Unosquare.FFME.Windows.Sample` project as the startup project. You can do this by right-clicking on the project and selecting `Set as startup project`. Please note that you will need Visual Studio 2019 with the .NET Core 3.0 SDK for your target architecture installed.
- Under the `Unosquare.FFME.Windows.Sample` project, find the file `App.xaml.cs` and, in the constructor, locate the line `Library.FFmpegDirectory = @"c:\ffmpeg";` and replace the path so that it points to the folder where you extracted your FFmpeg binaries (dll files).
- Click on `Start` to run the project.
- You should see a sample media player. Click on the `Open` icon located at the bottom right and enter a URL or path to a media file.
- The file or URL should play immediately, and all the properties should display to the right of the media display.
- You can use the resulting compiled assemblies in your project without further dependencies. Look for `ffme.win.dll`.
Thanks (in no particular order)
- To the FFmpeg team for making the Swiss Army Knife of media. I encourage you to donate to them.
- To Kyle Schwarz for creating and making Zeranoe FFmpeg builds available to everyone.
- To the NAudio team for making the best audio library out there for .NET -- one day I will contribute some improvements I have noticed they need.
- To Ruslan Balanukhin for his FFmpeg interop bindings generator tool: FFmpeg.AutoGen.
- To Martin Bohme for his tutorial on creating a video player with FFmpeg.
- To Barry Mieny for his beautiful FFmpeg logo
- Please refer to the LICENSE file for more information.