FFMpegCore
Setup
NuGet:
Install-Package FFMpegCore
A .NET Standard FFMpeg/FFProbe wrapper for easily integrating media analysis and conversion into your C# applications. Supports both synchronous and asynchronous use.
API
FFProbe
FFProbe is used to gather media information:
var mediaInfo = FFProbe.Analyse(inputFile);
or
var mediaInfo = await FFProbe.AnalyseAsync(inputFile);
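The returned analysis exposes the probed metadata. A minimal sketch of reading a few common properties (names as found on FFMpegCore's analysis and stream types):

```csharp
using System;
using FFMpegCore;

var mediaInfo = FFProbe.Analyse(inputFile);

// Overall container duration
Console.WriteLine($"Duration: {mediaInfo.Duration}");

// First video stream, if any (null for audio-only files)
var video = mediaInfo.PrimaryVideoStream;
if (video != null)
    Console.WriteLine($"Video: {video.Width}x{video.Height}, codec {video.CodecName}");

// First audio stream, if any
var audio = mediaInfo.PrimaryAudioStream;
if (audio != null)
    Console.WriteLine($"Audio: {audio.Channels} channel(s), codec {audio.CodecName}");
```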
FFMpeg
FFMpeg is used for converting your media files to web ready formats. Easily build your FFMpeg arguments using the fluent argument builder:
Convert input file to h264/aac scaled to 720p w/ faststart, for web playback
FFMpegArguments
.FromFileInput(inputPath)
.OutputToFile(outputPath, false, options => options
.WithVideoCodec(VideoCodec.LibX264)
.WithConstantRateFactor(21)
.WithAudioCodec(AudioCodec.Aac)
.WithVariableBitrate(4)
.WithFastStart()
.Scale(VideoSize.Hd))
.ProcessSynchronously();
Easily capture screens from your videos:
var mediaFileAnalysis = FFProbe.Analyse(inputPath);
// process the snapshot in-memory and use the Bitmap directly
var bitmap = FFMpeg.Snapshot(mediaFileAnalysis, new Size(200, 400), TimeSpan.FromMinutes(1));
// or persist the image to disk
FFMpeg.Snapshot(mediaFileAnalysis, outputPath, new Size(200, 400), TimeSpan.FromMinutes(1));
Convert to and/or from streams
await FFMpegArguments
.FromPipeInput(new StreamPipeSource(inputStream))
.OutputToPipe(new StreamPipeSink(outputStream), options => options
.WithVideoCodec("vp9")
.ForceFormat("webm"))
.ProcessAsynchronously();
Join video parts into one single file:
FFMpeg.Join(@"..\joined_video.mp4",
@"..\part1.mp4",
@"..\part2.mp4",
@"..\part3.mp4"
);
Join images into a video:
FFMpeg.JoinImageSequence(@"..\joined_video.mp4", frameRate: 1,
ImageInfo.FromPath(@"..\1.png"),
ImageInfo.FromPath(@"..\2.png"),
ImageInfo.FromPath(@"..\3.png")
);
Mute videos:
FFMpeg.Mute(inputFilePath, outputFilePath);
Save audio track from video:
FFMpeg.ExtractAudio(inputVideoFilePath, outputAudioFilePath);
Add or replace audio track on video:
FFMpeg.ReplaceAudio(inputVideoFilePath, inputAudioFilePath, outputVideoFilePath);
Add poster image to audio file (good for youtube videos):
FFMpeg.PosterWithAudio(inputImageFilePath, inputAudioFilePath, outputVideoFilePath);
// or
var image = Image.FromFile(inputImageFile);
image.AddAudio(inputAudioFilePath, outputVideoFilePath);
Other available arguments can be found in the FFMpegCore.Arguments
namespace.
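As a sketch of combining a few of those arguments to trim a clip (Seek and WithDuration are argument methods from that namespace; check the exact names available in your version):

```csharp
// Cut a 30-second clip starting 10 seconds into the input
FFMpegArguments
    .FromFileInput(inputPath, verifyExists: true, inputOptions => inputOptions
        .Seek(TimeSpan.FromSeconds(10)))        // seek before decoding the input
    .OutputToFile(outputPath, false, options => options
        .WithDuration(TimeSpan.FromSeconds(30)) // limit the output duration
        .WithVideoCodec(VideoCodec.LibX264))
    .ProcessSynchronously();
```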
Input piping
Input piping makes it possible to write video frames directly from program memory, without first saving them as jpeg or png files and passing their paths to ffmpeg. It also allows converting video on-the-fly while frames are being generated or received.
The IPipeSource
interface is used as the source of data. The data can be an already-encoded video stream (StreamPipeSource, as shown above) or a stream of raw frames, for which the RawVideoPipeSource
implementation is used.
For example, a method that generates bitmap frames:
IEnumerable<IVideoFrame> CreateFrames(int count)
{
for(int i = 0; i < count; i++)
{
yield return GetNextFrame(); //method of generating new frames
}
}
Then wrap the frames in a RawVideoPipeSource
and pass it to FFMpegArguments.FromPipeInput:
var videoFramesSource = new RawVideoPipeSource(CreateFrames(64)) //pass IEnumerable<IVideoFrame> or IEnumerator<IVideoFrame> to constructor of RawVideoPipeSource
{
FrameRate = 30 //set source frame rate
};
FFMpegArguments
.FromPipeInput(videoFramesSource, <input_stream_options>)
.OutputToFile("temporary.mp4", false, <output_options>)
.ProcessSynchronously();
If you want to use System.Drawing.Bitmap
as IVideoFrame
, there is a BitmapVideoFrameWrapper
wrapper class.
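A minimal sketch of a frame generator built on that wrapper (the drawing code is illustrative; BitmapVideoFrameWrapper lives in the FFMpegCore.Extend namespace):

```csharp
using System.Collections.Generic;
using System.Drawing;
using FFMpegCore.Extend;
using FFMpegCore.Pipes;

IEnumerable<IVideoFrame> CreateFrames(int count)
{
    for (var i = 0; i < count; i++)
    {
        // illustrative: produce a blank 640x480 frame each iteration
        var bitmap = new Bitmap(640, 480);
        using (var g = Graphics.FromImage(bitmap))
            g.FillRectangle(Brushes.Black, 0, 0, 640, 480);

        // wrap the bitmap so it can be consumed by RawVideoPipeSource
        yield return new BitmapVideoFrameWrapper(bitmap);
    }
}
```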
Binaries
If you prefer to download the binaries manually, visit ffbinaries or the zeranoe Windows builds.
Windows
command: choco install ffmpeg -Y
location: C:\ProgramData\chocolatey\lib\ffmpeg\tools\ffmpeg\bin
Mac OSX
command: brew install ffmpeg mono-libgdiplus
location: /usr/local/bin
Ubuntu
command: sudo apt-get install -y ffmpeg libgdiplus
location: /usr/bin
Path Configuration
Behavior
If you wish to support multiple client processor architectures, you can do so by creating an x64
and an x86
folder in the root
directory.
Both folders should contain the binaries (ffmpeg.exe
and ffprobe.exe
) built for the respective architecture.
The library will then attempt to use /root/{ARCH}/(ffmpeg|ffprobe).exe
.
If these folders are not present, it will try to find the binaries in /root/(ffmpeg|ffprobe).exe
Option 1
The default value (\\FFMPEG\\bin
) can be overridden via the GlobalFFOptions
class:
public Startup()
{
GlobalFFOptions.Configure(new FFOptions { BinaryFolder = "./bin", TemporaryFilesFolder = "/tmp" });
}
Option 2
The binary folder and the temporary files folder can alternatively be configured via the ffmpeg.config.json
file.
{
"BinaryFolder": "./bin",
"TemporaryFilesFolder": "/tmp"
}
Compatibility
Some versions of FFMPEG may not share the same argument schema. The library has been tested with versions 3.3
through 4.2
Contributors
License
Copyright © 2020
Released under the MIT license