ffmpeg -i input -acodec libfaac -ab 128kb -vcodec h264 -b 1200kb -ar 48000 -mbd 2 -coder 1 -cmp 2 -subcmp 2 -s 368x192 -r 30000/1001 -title X -f psp -flags loop -trellis 2 -partitions parti4x4+parti8x8+partp4x4+partp8x8+partb8x8 output.mp4
@end table

@section What are good parameters for encoding high-quality MPEG-4?

'-mbd rd -flags +4mv+trell+aic -cmp 2 -subcmp 2 -g 300 -pass 1/2',
things to try: '-bf 2', '-flags qprd', '-flags mv0', '-flags skiprd'.
A sketch that combines these options into full two-pass command lines
appears after the question on joining video files below.

@section What are good parameters for encoding high-quality MPEG-1/MPEG-2?

'-mbd rd -flags +trell -cmp 2 -subcmp 2 -g 100 -pass 1/2',
but beware that '-g 100' might cause problems with some decoders.
Things to try: '-bf 2', '-flags qprd', '-flags mv0', '-flags skiprd'.

@section Interlaced video looks very bad when encoded with ffmpeg, what's wrong?

You should use '-flags +ilme+ildct' and maybe '-flags +alt' for interlaced
material, and try '-top 0/1' if the result looks really messed up.
The sketch after the joining question below shows these flags in a
complete MPEG-2 command line.

@section How can I read DirectShow files?

If you have built FFmpeg with @code{./configure --enable-avisynth}
(only possible on MinGW/Cygwin platforms),
then you may use any file that DirectShow can read as input.
(Be aware that this feature has been added recently,
so you may need to help yourself in case of problems.)

Just create an "input.avs" text file with this single line ...
@example
DirectShowSource("C:\path to your file\yourfile.asf")
@end example
... and then feed that text file to FFmpeg:
@example
ffmpeg -i input.avs
@end example

For ANY other help on Avisynth, please visit @url{http://www.avisynth.org/}.

@section How can I join video files?

A few multimedia containers (MPEG-1, MPEG-2 PS, DV) allow video files to be
joined by merely concatenating them.

Hence you may concatenate your multimedia files by first transcoding them to
these privileged formats, then using the humble @code{cat} command (or the
equally humble @code{copy} under Windows), and finally transcoding back to your
format of choice.

@example
ffmpeg -i input1.avi -sameq intermediate1.mpg
ffmpeg -i input2.avi -sameq intermediate2.mpg
cat intermediate1.mpg intermediate2.mpg > intermediate_all.mpg
ffmpeg -i intermediate_all.mpg -sameq output.avi
@end example

Notice that you should either use @code{-sameq} or set a reasonably high
bitrate for your intermediate and output files, if you want to preserve
video quality.

Also notice that you may avoid the huge intermediate files by taking advantage
of named pipes, should your platform support it:

@example
mkfifo intermediate1.mpg
mkfifo intermediate2.mpg
ffmpeg -i input1.avi -sameq -y intermediate1.mpg < /dev/null &
ffmpeg -i input2.avi -sameq -y intermediate2.mpg < /dev/null &
cat intermediate1.mpg intermediate2.mpg |\
ffmpeg -f mpeg -i - -sameq -vcodec mpeg4 -acodec libmp3lame output.avi
@end example

Similarly, the yuv4mpegpipe format and the raw video and raw audio codecs also
allow concatenation, and the transcoding step is almost lossless.

For example, let's say we want to join two FLV files into an output.flv file:

@example
mkfifo temp1.a
mkfifo temp1.v
mkfifo temp2.a
mkfifo temp2.v
mkfifo all.a
mkfifo all.v
ffmpeg -i input1.flv -vn -f u16le -acodec pcm_s16le -ac 2 -ar 44100 - > temp1.a < /dev/null &
ffmpeg -i input2.flv -vn -f u16le -acodec pcm_s16le -ac 2 -ar 44100 - > temp2.a < /dev/null &
ffmpeg -i input1.flv -an -f yuv4mpegpipe - > temp1.v < /dev/null &
ffmpeg -i input2.flv -an -f yuv4mpegpipe - > temp2.v < /dev/null &
cat temp1.a temp2.a > all.a &
cat temp1.v temp2.v > all.v &
ffmpeg -f u16le -acodec pcm_s16le -ac 2 -ar 44100 -i all.a \
 -f yuv4mpegpipe -i all.v \
 -sameq -y output.flv
rm temp[12].[av] all.[av]
@end example
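To follow up on the forward references in the MPEG-4 and interlaced-video
questions above, here is a sketch of how those option lists combine into full
command lines. It keeps the old-style option syntax used throughout this FAQ;
the input name @code{input.avi}, the output names, the bitrates and the audio
settings are placeholders to adapt to your material, not recommendations.

@example
# two-pass high-quality MPEG-4 (placeholder file names and bitrate)
ffmpeg -i input.avi -vcodec mpeg4 -b 1200kb -mbd rd -flags +4mv+trell+aic -cmp 2 -subcmp 2 -g 300 -an -pass 1 -y output.avi
ffmpeg -i input.avi -vcodec mpeg4 -b 1200kb -mbd rd -flags +4mv+trell+aic -cmp 2 -subcmp 2 -g 300 -acodec libmp3lame -ab 128kb -pass 2 -y output.avi

# interlaced MPEG-2, combining the rate-distortion and interlacing flags
ffmpeg -i input.avi -vcodec mpeg2video -b 5000kb -mbd rd -flags +trell+ilme+ildct -cmp 2 -subcmp 2 -g 100 output.mpg
@end example

The first pass (note @code{-an}) only gathers rate-control statistics into a
log file, and its output is overwritten by the second pass, which produces the
file you keep; two-pass encoding lets the encoder allocate the chosen average
bitrate according to scene complexity instead of guessing it on the fly.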
@section FFmpeg does not adhere to the -maxrate setting; some frames are bigger than maxrate/fps.

Read the MPEG spec about the video buffer verifier (VBV).

@section I want CBR, but no matter what I do, frame sizes differ.

You do not understand what CBR is; please read the MPEG spec.
Read about the video buffer verifier and constant bitrate.
The one-sentence summary is that there is a buffer; the input rate is
constant, while the output can vary as needed.

@section How do I check if a stream is CBR?

To quote the MPEG-2 spec:
"There is no way to tell that a bitstream is constant bitrate without
examining all of the vbv_delay values and making complicated computations."

@chapter Development

@section Are there examples illustrating how to use the FFmpeg libraries, particularly libavcodec and libavformat?

Yes. Read the Developers Guide of the FFmpeg documentation. Alternatively,
examine the source code for one of the many open source projects that
already incorporate FFmpeg (@url{projects.html}).

@section Can you support my C compiler XXX?

It depends. If your compiler is C99-compliant, then patches to support
it are likely to be welcome if they do not pollute the source code
with @code{#ifdef}s related to the compiler.

@section Is Microsoft Visual C++ supported?

No. Microsoft Visual C++ is not compliant with the C99 standard and does
not, among other things, support the inline assembly used in FFmpeg.
If you wish to use MSVC++ for your project, you can link the MSVC++ code
with libav* as long as you compile the latter with a working C compiler.
For more information, see the @emph{Microsoft Visual C++ compatibility}
section in the FFmpeg documentation.

There have been efforts to make FFmpeg compatible with MSVC++ in the
past. However, they have all been rejected as too intrusive, especially
since MinGW does the job adequately. None of the core developers
work with MSVC++ and thus this item is low priority. Should you find
the silver bullet that solves this problem, feel free to shoot it at us.

We strongly recommend that you move from MSVC++ to the MinGW tools.

@section Can I use FFmpeg or libavcodec under Windows?

Yes, but the Cygwin or MinGW tools @emph{must} be used to compile FFmpeg.
Read the @emph{Windows} section in the FFmpeg documentation to find more
information.

To get help and instructions for building FFmpeg under Windows, check out
the FFmpeg Windows Help Forum at
@url{http://arrozcru.no-ip.org/ffmpeg/}.

@section Can you add automake, libtool or autoconf support?

No. These tools are too bloated and they complicate the build.

@section Why not rewrite ffmpeg in object-oriented C++?

FFmpeg is already organized in a highly modular manner and does not need to
be rewritten in a formal object language. Further, many of the developers
favor straight C; it works for them. For more arguments on this matter,
read "Programming Religion" at (@url{http://www.tux.org/lkml/#s15}).

@section Why are the ffmpeg programs devoid of debugging symbols?

The build process creates ffmpeg_g, ffplay_g, etc., which contain full debug
information. Those binaries are stripped to create ffmpeg, ffplay, etc. If
you need the debug information, use the *_g versions.

@section I do not like the LGPL, can I contribute code under the GPL instead?

Yes, as long as the code is optional and can easily and cleanly be placed
under @code{#ifdef CONFIG_GPL} without breaking anything.
So, for example, a new codec or filter would be OK under the GPL, while a
bugfix to LGPL code would not.

@section I want to compile xyz.c alone, but my compiler produced many errors.

Common code is in its own files in libav* and is used by the individual
codecs. They will not work without the common parts; you have to compile
the whole libav*. If you wish, disable some parts with configure switches.
You can also try to hack it and remove more, but if you have trouble fixing
the compilation failures, then you are probably not qualified for this.

@section I'm using libavcodec from within my C++ application but the linker complains about missing symbols which seem to be available.

FFmpeg is a pure C project, so to use the libraries within your C++ application
you need to explicitly state that you are using a C library. You can do this by
wrapping your FFmpeg includes in @code{extern "C"}.
See @url{http://www.parashift.com/c++-faq-lite/mixing-c-and-cpp.html#faq-32.3}.

@section I have a file in memory / an API different from *open/*read/libc, how do I use it with libavformat?

You have to implement a URLProtocol; see libavformat/file.c in the FFmpeg
sources and libmpdemux/demux_lavf.c in the MPlayer sources.

@section I get "No compatible shell script interpreter found." in MSys.

The standard MSys bash (2.04) is broken. You need to install 2.05 or later.

@section I get "./configure: line <xxx>: pr: command not found" in MSys.

The standard MSys install doesn't come with pr. You need to get it from the
coreutils package.

@section I tried to pass RTP packets into a decoder, but it doesn't work.

RTP is a container format like any other; you must first depacketize the
codec frames/samples stored in RTP and then feed them to the decoder.

@section Where can I find libav* headers for Pascal/Delphi?

See @url{http://www.iversenit.dk/dev/ffmpeg-headers/}.

@section Where is the documentation about ffv1, msmpeg4, asv1, 4xm?

See @url{http://svn.mplayerhq.hu/michael/trunk/docs/}.

@section How do I feed H.263-RTP (and other codecs in RTP) to libavcodec?

Even if it is peculiar because it is network-oriented, RTP is a container like
any other. You have to @emph{demux} RTP before feeding the payload to
libavcodec. In this specific case please look at RFC 4629 to see how it should
be done.

@section AVStream.r_frame_rate is wrong, it is much larger than the framerate.

r_frame_rate is NOT the average framerate; it is the smallest framerate
that can accurately represent all timestamps. So no, it is not
wrong if it is larger than the average!
For example, if you have mixed 25 and 30 fps content, then r_frame_rate
will be 150 (the least common multiple of 25 and 30: a tick of 1/150 of a
second evenly divides both 1/25 and 1/30).

@bye