MP4Player internals
December, 2002
Bill May
Cisco Systems

This document attempts to describe the internal workings of mp4player for those who are interested.

mp4player was intended as an MPEG-4 streaming-only player, but since we were going to handle multiple audio codecs, I made an early decision to support both local and streaming playback, as well as multiple audio and video codecs.

mp4player (and its derivative applications gmp4player and wmp4player) is broken into two parts: libmp4player, which contains everything for playback, and the playback control code (for example, in gmp4player, this would be the GTK GUI code).

There are a few major concepts that need to be understood: the session, the media, the bytestream, and the decode plugin.

Session
-------
A session is something that the user wants to play. This could be audio only, video only, both, multiple videos with audio, whatever. A session is represented by the CPlayerSession class, which is the major structure/API of libmp4player. The CPlayerSession class is also responsible for the synchronization of the audio and video classes.

Media
-----
Each media stream is represented by a CPlayerMedia class. This class is responsible for decoding data from the bytestream and passing it to a sync class for buffering and rendering. The media class doesn't care much whether it is audio or video; the majority of the code works for both.

Bytestream
----------
A bytestream is the mechanism that gives the media a frame (or a number of bytes) to decode, along with a timestamp for that frame. For example, an mp4 file bytestream reads each audio or video frame from an mp4 container file. mp4player supports bytestreams for avi files, mp4 files, mpeg files, .mov files, some raw files, and RTP.

Some bytestreams need to be media aware, meaning they have to know something about the structure of the media inside of them.
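The bytestream concept described above can be sketched as a small C++ interface. This is a hedged sketch, not the real code: the class and member names here (CByteStreamSketch, CMemoryByteStream, FrameSketch) are hypothetical stand-ins for the actual base class in our_bytestream.h, and the in-memory implementation stands in for the real mp4/avi/RTP bytestreams.

```cpp
#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

// One encoded frame plus its timestamp, as handed to the decoder.
struct FrameSketch {
    std::vector<uint8_t> data;
    uint64_t timestamp_msec;
};

// Hypothetical sketch of the bytestream interface: hand out the next
// encoded frame and its timestamp, and report end-of-stream.
class CByteStreamSketch {
public:
    virtual ~CByteStreamSketch() {}
    // Point *frame/*len at the next encoded frame and fill in its
    // timestamp; return false when no frames remain.
    virtual bool get_next_frame(const uint8_t **frame, size_t *len,
                                uint64_t *timestamp_msec) = 0;
    virtual bool eof() const = 0;
};

// Minimal in-memory implementation, standing in for the file and RTP
// bytestreams the player actually ships.
class CMemoryByteStream : public CByteStreamSketch {
public:
    explicit CMemoryByteStream(std::vector<FrameSketch> frames)
        : m_frames(std::move(frames)), m_index(0) {}

    bool get_next_frame(const uint8_t **frame, size_t *len,
                        uint64_t *timestamp_msec) override {
        if (m_index >= m_frames.size())
            return false;
        const FrameSketch &f = m_frames[m_index++];
        *frame = f.data.data();
        *len = f.data.size();
        *timestamp_msec = f.timestamp_msec;
        return true;
    }

    bool eof() const override { return m_index >= m_frames.size(); }

private:
    std::vector<FrameSketch> m_frames;
    size_t m_index;
};
```

The key design point the source describes is that the decoder only ever sees this narrow interface, which is what lets the same media code play mp4 files, mpeg files, and RTP streams interchangeably.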
The mpeg file and some of the RTP bytestreams are media aware; the mp4 container file bytestream is not. Each media will have its own bytestream. The bytestream base class resides in our_bytestream.h.

Decode Plugin
-------------
Each media must have a way to translate the encoded data into data that can be rendered (in the case of video, this is YUV data; for audio, it is PCM - if you need to know what YUV or PCM is, do a web search). With this in mind, we have the concept of a decode plugin that the media can use to take data from the bytestream and pass it to the sync routines. At startup, decode plugins are detected (rather than hard linked at compile time), and the proper decode plugin is selected as each media is created. The decode plugin takes the encoded frame from the bytestream, decodes it, and passes the raw data to the rendering buffer.

Media Data Flow
---------------
The media data flow is fairly simple:

bytestream -> decoder -> render buffer -> rendering

Threading
---------
libmp4player uses multiple threads. Whatever control is used will have its own thread, or multiple threads. Each media has a thread for decoding, and if it uses RTP as a bytestream, a thread for RTP packet reception (except for RTP over RTSP, which has one thread for all media). SDL will start a thread for audio rendering. Finally, there is a thread for synchronization of audio and video.

Rendering and Synchronization
-----------------------------
We use the open source package SDL for rendering - it is a multi-platform rendering engine for games. We have modified it slightly to get the audio latency for better synchronization. The decode plugins have an interface to a sync class - there are both audio and video sync classes, with different interfaces. audio.cpp and video.cpp contain the base classes for rendering; video_sdl.cpp and audio_sdl.cpp contain the SDL rendering functions. These classes provide buffering for the audio and video.
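The bytestream -> decoder -> render buffer flow above can be sketched as one pass of a media decode thread. This is a minimal sketch under assumed names: EncodedFrame, DecodePluginSketch, RenderBuffer, and decode_pass are all hypothetical, and the "plugin" here just copies bytes where a real decode plugin would run a codec and emit YUV or PCM.

```cpp
#include <cstddef>
#include <cstdint>
#include <deque>
#include <vector>

// Encoded input from the bytestream, and raw (YUV/PCM) output for the
// sync class's render buffer.
struct EncodedFrame {
    std::vector<uint8_t> bytes;
    uint64_t timestamp_msec;
};
struct DecodedFrame {
    std::vector<uint8_t> raw;
    uint64_t timestamp_msec;
};

// Stand-in for a decode plugin: a real one would run a codec here;
// this sketch just passes the bytes through and keeps the timestamp.
struct DecodePluginSketch {
    DecodedFrame decode(const EncodedFrame &in) const {
        return DecodedFrame{in.bytes, in.timestamp_msec};
    }
};

// Stand-in for the buffering done by the audio/video sync classes.
using RenderBuffer = std::deque<DecodedFrame>;

// One pass of the decode thread: drain frames from the bytestream,
// decode them, and queue raw frames for the sync thread to render.
// Returns the number of frames decoded.
size_t decode_pass(std::deque<EncodedFrame> &bytestream,
                   const DecodePluginSketch &plugin,
                   RenderBuffer &render_buffer) {
    size_t decoded = 0;
    while (!bytestream.empty()) {
        render_buffer.push_back(plugin.decode(bytestream.front()));
        bytestream.pop_front();
        ++decoded;
    }
    return decoded;
}
```

In the real player this loop runs on each media's own decode thread, with the render buffer consumed asynchronously by the CPlayerSession sync thread, which is why the buffer sits between the two stages.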
If someone wanted, they could replace these classes with other rendering engines.

Rendering is done through the CPlayerSession sync thread. This is a fairly complex state machine when both audio and video are being displayed. It works by starting the audio rendering, then using the audio latency to feed back the display time to the sync task for video rendering. The SDL audio driver uses a callback when it requires more data. We synchronize the "display" times passed by the bytestream with the time of day from the gettimeofday function.

Finally
-------
To see the call flow to start and stop a session, look at main.cpp, function start_session().
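The sync calculation described above (anchor the stream clock to wall-clock time when audio starts, then schedule video frames relative to that anchor, compensated by the reported audio latency) can be sketched as follows. This is a hedged illustration: SyncClockSketch and its fields are hypothetical names, not the actual state machine in CPlayerSession.

```cpp
#include <cstdint>

// Hypothetical sketch of the A/V sync arithmetic: stream timestamps
// are mapped onto wall-clock time (as from gettimeofday), shifted by
// the audio output latency reported by the modified SDL driver.
struct SyncClockSketch {
    uint64_t wall_start_usec;    // wall-clock time when audio playback began
    uint64_t stream_start_msec;  // stream timestamp of the first audio frame
    uint64_t audio_latency_usec; // latency reported by the audio driver

    // Wall-clock time at which a frame carrying this stream timestamp
    // should actually be rendered.
    uint64_t display_time_usec(uint64_t stream_timestamp_msec) const {
        return wall_start_usec +
               (stream_timestamp_msec - stream_start_msec) * 1000 +
               audio_latency_usec;
    }
};
```

The point of the latency term is that audio the driver has already accepted is not yet audible; delaying video by the same amount keeps the picture aligned with what is actually coming out of the speaker.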
