The presentation description may be obtained by the client via HTTP or other means such as email and may not necessarily be stored on the media server.

For the purposes of this specification, a presentation description is assumed to describe one or more presentations, each of which maintains a common time axis. For simplicity of exposition and without loss of generality, it is assumed that the presentation description contains exactly one such presentation. A presentation may contain several media streams.

The presentation description file contains a description of the media streams making up the presentation, including their encodings, language, and other parameters that enable the client to choose the most appropriate combination of media. In this presentation description, each media stream that is individually controllable by RTSP is identified by an RTSP URL, which points to the media server handling that particular media stream and names the stream stored on that server. Several media streams can be located on different servers; for example, audio and video streams can be split across servers for load sharing. The description also enumerates which transport methods the server is capable of.

Besides the media parameters, the network destination address and port need to be determined. Several modes of operation can be distinguished:

Unicast: The media is transmitted to the source of the RTSP request, with the port number chosen by the client. Alternatively, the media is transmitted on the same reliable stream as RTSP.

Multicast, server chooses address: The media server picks the multicast address and port. This is the typical case for a live or near-media-on-demand transmission.

Multicast, client chooses address: If the server is to participate in an existing multicast conference, the multicast address, port and encryption key are given by the conference description, established by means outside the scope of this specification.

1.7 RTSP States

RTSP controls a stream which may be sent via a separate protocol, independent of the control channel. For example, RTSP control may occur on a TCP connection while the data flows via UDP. Thus, data delivery continues even if no RTSP requests are received by the media server. Also, during its lifetime, a single media stream may be controlled by RTSP requests issued sequentially on different TCP connections. Therefore, the server needs to maintain "session state" to be able to correlate RTSP requests with a stream. The state transitions are described in Appendix A.

Many methods in RTSP do not contribute to state. However, the following play a central role in defining the allocation and usage of stream resources on the server: SETUP, PLAY, RECORD, PAUSE, and TEARDOWN.

SETUP: Causes the server to allocate resources for a stream and start an RTSP session.

PLAY and RECORD: Start data transmission on a stream allocated via SETUP.

PAUSE: Temporarily halts a stream without freeing server resources.

TEARDOWN: Frees resources associated with the stream. The RTSP session ceases to exist on the server.

RTSP methods that contribute to state use the Session header field (Section 12.37) to identify the RTSP session whose state is being manipulated. The server generates session identifiers in response to SETUP requests (Section 10.4).
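As a non-normative illustration of the state handling described above, the following Python fragment models per-session state driven by the state-bearing methods. The state names and the transition table are simplified from the state machines in Appendix A, and the class and function names are purely illustrative; they are not part of this specification.

   # Minimal sketch of per-session RTSP state driven by the state-bearing
   # methods SETUP, PLAY, RECORD, PAUSE and TEARDOWN.
   # The transitions are a simplification of Appendix A.

   INIT, READY, PLAYING, RECORDING = "Init", "Ready", "Playing", "Recording"

   # (current state, method) -> next state
   TRANSITIONS = {
       (INIT,      "SETUP"):    READY,
       (READY,     "PLAY"):     PLAYING,
       (READY,     "RECORD"):   RECORDING,
       (PLAYING,   "PAUSE"):    READY,
       (RECORDING, "PAUSE"):    READY,
       (READY,     "TEARDOWN"): INIT,
       (PLAYING,   "TEARDOWN"): INIT,
       (RECORDING, "TEARDOWN"): INIT,
   }

   class RtspSession:
       """Tracks the state of one RTSP session, named by its Session header."""

       def __init__(self, session_id):
           self.session_id = session_id
           self.state = INIT

       def handle(self, method):
           """Apply a state-bearing method; unknown or non-state-bearing
           methods leave the state unchanged."""
           self.state = TRANSITIONS.get((self.state, method), self.state)
           return self.state

   if __name__ == "__main__":
       session = RtspSession("12345678")
       for method in ("SETUP", "PLAY", "PAUSE", "TEARDOWN"):
           print(method, "->", session.handle(method))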
1.8 Relationship with Other Protocols

RTSP has some overlap in functionality with HTTP. It also may interact with HTTP in that the initial contact with streaming content is often made through a web page. The current protocol specification aims to allow different hand-off points between a web server and the media server implementing RTSP. For example, the presentation description can be retrieved using HTTP or RTSP, which reduces roundtrips in web-browser-based scenarios, yet also allows for standalone RTSP servers and clients which do not rely on HTTP at all.

However, RTSP differs fundamentally from HTTP in that data delivery takes place out-of-band, in a different protocol. HTTP is an asymmetric protocol where the client issues requests and the server responds. In RTSP, both the media client and media server can issue requests. RTSP requests are also not stateless; they may set parameters and continue to control a media stream long after the request has been acknowledged.

Re-using HTTP functionality has advantages in at least two areas, namely security and proxies. The requirements are very similar, so having the ability to adopt HTTP work on caches, proxies and authentication is valuable.

While most real-time media will use RTP as a transport protocol, RTSP is not tied to RTP. RTSP assumes the existence of a presentation description format that can express both static and temporal properties of a presentation containing several media streams.

2 Notational Conventions

Since many of the definitions and syntax are identical to HTTP/1.1, this specification only points to the section where they are defined rather than copying it. For brevity, [HX.Y] is to be taken to refer to Section X.Y of the current HTTP/1.1 specification (RFC 2068 [2]).

All the mechanisms specified in this document are described in both prose and an augmented Backus-Naur form (BNF) similar to that used in [H2.1]. It is described in detail in RFC 2234 [17], with the difference that this RTSP specification maintains the "1#" notation for comma-separated lists.

In this memo, we use indented and smaller-type paragraphs to provide background and motivation. This is intended to give readers who were not involved with the formulation of the specification an understanding of why things are the way they are in RTSP.

3 Protocol Parameters

3.1 RTSP Version

[H3.1] applies, with HTTP replaced by RTSP.

3.2 RTSP URL

The "rtsp" and "rtspu" schemes are used to refer to network resources via the RTSP protocol. This section defines the scheme-specific syntax and semantics for RTSP URLs.

   rtsp_URL  =  ( "rtsp:" | "rtspu:" )
                "//" host [ ":" port ] [ abs_path ]
   host      =  <A legal Internet host domain name or IP address
                 (in dotted decimal form), as defined by Section 2.1
                 of RFC 1123>
   port      =  *DIGIT

abs_path is defined in [H3.2.1].

Note that fragment and query identifiers do not have a well-defined meaning at this time, with the interpretation left to the RTSP server.

The scheme rtsp requires that commands are issued via a reliable protocol (within the Internet, TCP), while the scheme rtspu identifies an unreliable protocol (within the Internet, UDP).

If the port is empty or not given, port 554 is assumed. The semantics are that the identified resource can be controlled by RTSP at the server listening for TCP (scheme "rtsp") connections or UDP (scheme "rtspu") packets on that port of host, and the Request-URI for the resource is rtsp_URL.
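The following is a minimal, non-normative sketch of how a client might split an RTSP URL into the components defined above and apply the default port of 554. It relies only on Python's standard urllib.parse module; the function name parse_rtsp_url is illustrative and not defined by this specification.

   # Minimal sketch: split an RTSP URL into scheme, host, port and abs_path,
   # applying the default port 554 when none is given in the URL.
   from urllib.parse import urlsplit

   RTSP_DEFAULT_PORT = 554

   def parse_rtsp_url(url):
       parts = urlsplit(url)
       if parts.scheme not in ("rtsp", "rtspu"):
           raise ValueError("not an RTSP URL: %r" % url)
       return {
           "scheme": parts.scheme,                  # "rtsp" (TCP) or "rtspu" (UDP)
           "host": parts.hostname,
           "port": parts.port or RTSP_DEFAULT_PORT,
           "abs_path": parts.path or "/",
       }

   if __name__ == "__main__":
       print(parse_rtsp_url("rtsp://media.example.com:554/twister/audiotrack"))
       print(parse_rtsp_url("rtsp://media.example.com/twister"))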
The use of IP addresses in URLs SHOULD be avoided whenever possible (see RFC 1924 [19]).

A presentation or a stream is identified by a textual media identifier, using the character set and escape conventions [H3.2] of URLs (RFC 1738 [20]). URLs may refer to a stream or an aggregate of streams, i.e., a presentation. Accordingly, requests described in Section 10 can apply to either the whole presentation or an individual stream within the presentation. Note that some request methods can only be applied to streams, not presentations, and vice versa.

For example, the RTSP URL:

   rtsp://media.example.com:554/twister/audiotrack

identifies the audio stream within the presentation "twister", which can be controlled via RTSP requests issued over a TCP connection to port 554 of host media.example.com.

Also, the RTSP URL:

   rtsp://media.example.com:554/twister

identifies the presentation "twister", which may be composed of audio and video streams.

This does not imply a standard way to reference streams in URLs. The presentation description defines the hierarchical relationships in the presentation and the URLs for the individual streams. A presentation description may name a stream "a.mov" and the whole presentation "b.mov".

The path components of the RTSP URL are opaque to the client and do not imply any particular file system structure for the server. This decoupling also allows presentation descriptions to be used with non-RTSP media control protocols simply by replacing the scheme in the URL.

3.3 Conference Identifiers

Conference identifiers are opaque to RTSP and are encoded using standard URI encoding methods (i.e., LWS is escaped with %). They can contain any octet value. The conference identifier MUST be globally unique. For H.323, the conferenceID value is to be used.

   conference-id  =  1*xchar

Conference identifiers are used to allow RTSP sessions to obtain parameters from multimedia conferences the media server is participating in. These conferences are created by protocols outside the scope of this specification, e.g., H.323 [13] or SIP [12]. For example, instead of the RTSP client explicitly providing transport information, it asks the media server to use the values in the conference description.

3.4 Session Identifiers

Session identifiers are opaque strings of arbitrary length. Linear white space must be URL-escaped. A session identifier MUST be chosen randomly and MUST be at least eight octets long to make guessing it more difficult. (See Section 16.)

   session-id  =  1*( ALPHA | DIGIT | safe )
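As a non-normative illustration of these requirements, the following Python sketch generates a session identifier that is random and longer than the eight-octet minimum, drawing characters from ALPHA, DIGIT, and the "safe" set. The alphabet constant, the assumed "safe" characters ("$", "-", "_", ".", "+"), and the default length of 16 are choices made for this example only.

   # Minimal sketch: generate a random session identifier of at least
   # eight octets, drawn from ALPHA / DIGIT / "safe" characters.
   # The "safe" set is assumed here to be "$", "-", "_", "." and "+".
   import secrets
   import string

   SESSION_ID_ALPHABET = string.ascii_letters + string.digits + "$-_.+"

   def make_session_id(length=16):
       if length < 8:
           raise ValueError("session identifiers MUST be at least eight octets long")
       return "".join(secrets.choice(SESSION_ID_ALPHABET) for _ in range(length))

   if __name__ == "__main__":
       print(make_session_id())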
3.5 SMPTE Relative Timestamps

A SMPTE relative timestamp expresses time relative to the start of the clip. Relative timestamps are expressed as SMPTE time codes for frame-level access accuracy. The time code has the format hours:minutes:seconds:frames.subframes, with the origin at the start of the clip. The default smpte format is "SMPTE 30 drop" format, with a frame rate of 29.97 frames per second. Other SMPTE codes MAY be supported (such as "SMPTE 25") through the use of alternative values of "smpte-type". The "frames" field in the time value can assume the values 0 through 29. The difference between 30 and 29.97 frames per second is handled by dropping the first two frame indices (values 00 and 01) of every minute, except every tenth minute. If the frame value is zero, it may be omitted. Subframes are measured in one-hundredth of a frame.

   smpte-range  =  smpte-type "=" smpte-time "-" [ smpte-time ]
   smpte-type   =  "smpte" | "smpte-30-drop" | "smpte-25"
                   ; other timecodes may be added
   smpte-time   =  1*2DIGIT ":" 1*2DIGIT ":" 1*2DIGIT
                   [ ":" 1*2DIGIT ] [ "." 1*2DIGIT ]

Examples:

   smpte=10:12:33:20-
   smpte=10:07:33-
   smpte=10:07:00-10:07:33:05.01
   smpte-25=10:07:00-10:07:33:05.01

3.6 Normal Play Time

Normal play time (NPT) indicates the stream absolute position relative to the beginning of the presentation. The timestamp consists of a decimal fraction. The part left of the decimal may be expressed in either seconds or hours, minutes, and seconds. The part right of the decimal point measures fractions of a second. The beginning of a presentation corresponds to 0.0 seconds. Negative values are not defined. The special constant now is defined as the current instant of a live event.
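The drop-frame rule in Section 3.5 can be illustrated with a short, non-normative Python sketch that converts an hours:minutes:seconds:frames value in "SMPTE 30 drop" format into an absolute frame number and an elapsed time in seconds. The function names are illustrative only, and subframes are ignored for brevity.

   # Minimal sketch: convert an hours:minutes:seconds:frames value in
   # "SMPTE 30 drop" format to an absolute frame number and elapsed seconds.
   # Frame numbers 00 and 01 are skipped at the start of every minute except
   # every tenth minute, reconciling the 30-frame numbering with the actual
   # rate of 30000/1001 (approximately 29.97) frames per second.

   def smpte_30_drop_to_frames(hours, minutes, seconds, frames):
       if not 0 <= frames <= 29:
           raise ValueError("the frames field can assume the values 0 through 29")
       total_minutes = 60 * hours + minutes
       if seconds == 0 and frames in (0, 1) and total_minutes % 10 != 0:
           raise ValueError("frames 00 and 01 are dropped at this minute")
       nominal = 108000 * hours + 1800 * minutes + 30 * seconds + frames
       dropped = 2 * (total_minutes - total_minutes // 10)
       return nominal - dropped

   def smpte_30_drop_to_seconds(hours, minutes, seconds, frames):
       # Elapsed time is the frame count divided by 30000/1001 frames per second.
       return smpte_30_drop_to_frames(hours, minutes, seconds, frames) * 1001 / 30000

   if __name__ == "__main__":
       # Offset 10:07:33 (frames omitted, i.e. zero) from the examples above.
       print(smpte_30_drop_to_frames(10, 7, 33, 0))
       print(round(smpte_30_drop_to_seconds(10, 7, 33, 0), 3))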