k=base64:<encoded encryption key>

   The encryption key (as described in [3] for RTP media streams under the
   AV profile) is included in this key field but has been base64 encoded
   because it includes characters that are prohibited in SDP.

k=uri:<URI to obtain key>

   A Universal Resource Identifier as used by WWW clients is included in
   this key field.  The URI refers to the data containing the key, and may
   require additional authentication before the key can be returned.  When
   a request is made to the given URI, the MIME content-type of the reply
   specifies the encoding for the key in the reply.  The key should not be
   obtained until the user wishes to join the session, to reduce
   synchronisation of requests to the WWW server(s).

k=prompt

   No key is included in this SDP description, but the session or media
   stream referred to by this key field is encrypted.  The user should be
   prompted for the key when attempting to join the session, and this
   user-supplied key should then be used to decrypt the media streams.

Attributes

a=<attribute>
a=<attribute>:<value>

   Attributes are the primary means for extending SDP.  Attributes may be
   defined to be used as "session-level" attributes, "media-level"
   attributes, or both.

   A media description may have any number of attributes ("a=" fields)
   which are media specific.  These are referred to as "media-level"
   attributes and add information about the media stream.  Attribute
   fields can also be added before the first media field; these
   "session-level" attributes convey additional information that applies
   to the conference as a whole rather than to individual media; an
   example might be the conference's floor control policy.

   Attribute fields may be of two forms:

   o property attributes.  A property attribute is simply of the form
     "a=<flag>".  These are binary attributes, and the presence of the
     attribute conveys that the attribute is a property of the session.
     An example might be "a=recvonly".

   o value attributes.  A value attribute is of the form
     "a=<attribute>:<value>".  An example might be that a whiteboard could
     have the value attribute "a=orient:landscape".

   Attribute interpretation depends on the media tool being invoked.  Thus
   receivers of session descriptions should be configurable in their
   interpretation of announcements in general and of attributes in
   particular.

   Attribute names must be in the US-ASCII subset of ISO-10646/UTF-8.

   Attribute values are byte strings, and MAY use any byte value except
   0x00 (Nul), 0x0A (LF), and 0x0D (CR).  By default, attribute values are
   to be interpreted as in the ISO-10646 character set with UTF-8
   encoding.  Unlike other text fields, attribute values are NOT normally
   affected by the `charset' attribute as this would make comparisons
   against known values problematic.  However, when an attribute is
   defined, it can be defined to be charset-dependent, in which case its
   value should be interpreted in the session charset rather than in
   ISO-10646.

   Attributes that will be commonly used can be registered with IANA (see
   Appendix B).  Unregistered attributes should begin with "X-" to prevent
   inadvertent collision with registered attributes.  In either case, if
   an attribute is received that is not understood, it should simply be
   ignored by the receiver.
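   As a rough illustration, and not part of this specification, a receiver
   might separate the two attribute forms as follows; the Python function
   name parse_attribute and its return convention are illustrative
   assumptions only:

      def parse_attribute(line):
          # Expects a full SDP line such as "a=recvonly" or
          # "a=orient:landscape".
          if not line.startswith("a="):
              raise ValueError("not an attribute field")
          body = line[2:]
          # A value attribute has the form <attribute>:<value>; split on
          # the first ':' only, since the value itself may contain colons.
          if ":" in body:
              name, value = body.split(":", 1)
              return name, value        # value attribute
          return body, None             # property attribute (binary flag)

      # parse_attribute("a=recvonly")              -> ("recvonly", None)
      # parse_attribute("a=orient:landscape")      -> ("orient", "landscape")
      # parse_attribute("a=rtpmap:98 L16/16000/2") -> ("rtpmap", "98 L16/16000/2")

   After parsing, an attribute whose name is not understood, including
   unregistered "X-" names, would simply be ignored, as required above.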
Media Announcements

m=<media> <port> <transport> <fmt list>

   A session description may contain a number of media descriptions.  Each
   media description starts with an "m=" field, and is terminated by
   either the next "m=" field or by the end of the session description.  A
   media field also has several sub-fields:

   o The first sub-field is the media type.  Currently defined media are
     "audio", "video", "application", "data" and "control", though this
     list may be extended as new communication modalities emerge (e.g.,
     telepresence).  The difference between "application" and "data" is
     that the former is a media flow such as whiteboard information, and
     the latter is bulk-data transfer such as multicasting of program
     executables which will not typically be displayed to the user.
     "control" is used to specify an additional conference control channel
     for the session.

   o The second sub-field is the transport port to which the media stream
     will be sent.  The meaning of the transport port depends on the
     network being used as specified in the relevant "c=" field and on the
     transport protocol defined in the third sub-field.  Other ports used
     by the media application (such as the RTCP port, see [2]) should be
     derived algorithmically from the base media port.

     Note: For transports based on UDP, the value should be in the range
     1024 to 65535 inclusive.  For RTP compliance it should be an even
     number.

     For applications where hierarchically encoded streams are being sent
     to a unicast address, it may be necessary to specify multiple
     transport ports.  This is done using a similar notation to that used
     for IP multicast addresses in the "c=" field:

        m=<media> <port>/<number of ports> <transport> <fmt list>

     In such a case, the ports used depend on the transport protocol.  For
     RTP, only the even ports are used for data and the corresponding
     one-higher odd port is used for RTCP.  For example:

        m=video 49170/2 RTP/AVP 31

     would specify that ports 49170 and 49171 form one RTP/RTCP pair and
     49172 and 49173 form the second RTP/RTCP pair.  RTP/AVP is the
     transport protocol and 31 is the format (see below).

     It is illegal for both multiple addresses to be specified in the "c="
     field and for multiple ports to be specified in the "m=" field in the
     same session description.
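   As a rough sketch, not part of this specification, of how the
   <port>/<number of ports> notation above maps to RTP/RTCP port pairs;
   the Python function name rtp_port_pairs is illustrative only, and the
   sketch assumes the RTP convention of an even data port paired with the
   next-higher odd RTCP port:

      def rtp_port_pairs(base_port, number_of_ports=1):
          # For "m=<media> <port>/<number of ports> ..." with RTP, each
          # layer uses an even data port and the next-higher odd port for
          # the corresponding RTCP stream.
          if base_port % 2 != 0:
              raise ValueError("RTP data port should be even")
          return [(base_port + 2 * i, base_port + 2 * i + 1)
                  for i in range(number_of_ports)]

      # "m=video 49170/2 RTP/AVP 31" ->
      #   rtp_port_pairs(49170, 2) == [(49170, 49171), (49172, 49173)]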
   o The third sub-field is the transport protocol.  The transport
     protocol values are dependent on the address-type field in the "c="
     fields.  Thus a "c=" field of IP4 defines that the transport protocol
     runs over IP4.  For IP4, it is normally expected that most media
     traffic will be carried as RTP over UDP.  The following transport
     protocols are preliminarily defined, but may be extended through
     registration of new protocols with IANA:

     - RTP/AVP - the IETF's Realtime Transport Protocol using the
       Audio/Video profile carried over UDP.

     - udp - User Datagram Protocol

     If an application uses a single combined proprietary media format and
     transport protocol over UDP, then simply specifying the transport
     protocol as udp and using the format field to distinguish the
     combined protocol is recommended.  If a transport protocol is used
     over UDP to carry several distinct media types that need to be
     distinguished by a session directory, then specifying the transport
     protocol and media format separately is necessary.  RTP is an example
     of a transport protocol that carries multiple payload formats that
     must be distinguished by the session directory for it to know how to
     start appropriate tools, relays, mixers or recorders.

     The main reason to specify the transport protocol in addition to the
     media format is that the same standard media formats may be carried
     over different transport protocols even when the network protocol is
     the same - a historical example is vat PCM audio and RTP PCM audio.
     In addition, relays and monitoring tools that are
     transport-protocol-specific but format-independent are possible.

     For RTP media streams operating under the RTP Audio/Video Profile
     [3], the protocol field is "RTP/AVP".  Should other RTP profiles be
     defined in the future, their profiles will be specified in the same
     way.  For example, the protocol field "RTP/XYZ" would specify RTP
     operating under a profile whose short name is "XYZ".

   o The fourth and subsequent sub-fields are media formats.  For audio
     and video, these will normally be a media payload type as defined in
     the RTP Audio/Video Profile.

     When a list of payload formats is given, this implies that all of
     these formats may be used in the session, but the first of these
     formats is the default format for the session.

     For media whose transport protocol is not RTP or UDP the format field
     is protocol specific.  Such formats should be defined in an
     additional specification document.

     For media whose transport protocol is RTP, SDP can be used to provide
     a dynamic binding of media encoding to RTP payload type.  The
     encoding names in the RTP AV Profile do not specify unique audio
     encodings (in terms of clock rate and number of audio channels), and
     so they are not used directly in SDP format fields.  Instead, the
     payload type number should be used to specify the format for static
     payload types, and the payload type number along with additional
     encoding information should be used for dynamically allocated payload
     types.

     An example of a static payload type is u-law PCM coded single channel
     audio sampled at 8 KHz.  This is completely defined in the RTP
     Audio/Video profile as payload type 0, so the media field for such a
     stream sent to UDP port 49232 is:

        m=audio 49232 RTP/AVP 0

     An example of a dynamic payload type is 16 bit linear encoded stereo
     audio sampled at 16 KHz.  If we wish to use dynamic RTP/AVP payload
     type 98 for such a stream, additional information is required to
     decode it:

        m=audio 49232 RTP/AVP 98
        a=rtpmap:98 L16/16000/2

     The general form of an rtpmap attribute is:

        a=rtpmap:<payload type> <encoding name>/<clock rate>[/<encoding parameters>]

     For audio streams, <encoding parameters> may specify the number of
     audio channels.  This parameter may be omitted if the number of
     channels is one, provided no additional parameters are needed.  For
     video streams, no encoding parameters are currently specified.

     Additional parameters may be defined in the future, but
     codec-specific parameters should not be added.  Parameters added to
     an rtpmap attribute should only be those required for a session
     directory to make the choice of appropriate media tool to participate
     in a session.  Codec-specific parameters should be added in other
     attributes.

     Up to one rtpmap attribute can be defined for each media format
     specified.  Thus we might have:

        m=audio 49230 RTP/AVP 96 97 98
        a=rtpmap:96 L8/8000
        a=rtpmap:97 L16/8000
        a=rtpmap:98 L16/11025/2
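     The dynamic binding shown above is regular enough that a tool might
     generate it mechanically.  The following sketch is not part of this
     specification; the Python function name rtpmap_lines and the tuple
     layout it accepts are illustrative assumptions.  It emits an "m="
     line and a matching "a=rtpmap:" line for each dynamically bound
     format:

        def rtpmap_lines(media, port, formats):
            # formats: list of (payload_type, encoding_name, clock_rate,
            # channels), where channels is None for mono audio or video.
            fmt_list = " ".join(str(pt) for pt, _, _, _ in formats)
            lines = [f"m={media} {port} RTP/AVP {fmt_list}"]
            for pt, name, rate, channels in formats:
                enc = f"{name}/{rate}"
                if channels:            # omit "/1" for single-channel audio
                    enc += f"/{channels}"
                lines.append(f"a=rtpmap:{pt} {enc}")
            return lines

        # rtpmap_lines("audio", 49230,
        #              [(96, "L8", 8000, None),
        #               (97, "L16", 8000, None),
        #               (98, "L16", 11025, 2)])
        # reproduces the example above.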
     RTP profiles that specify the use of dynamic payload types must
     define the set of valid encoding names and/or a means to register
     encoding names if that profile is to be used with SDP.

     Experimental encoding formats can also be specified using rtpmap.
     RTP formats that are not registered as standard format names must be
     preceded by "X-".  Thus a new experimental redundant audio stream
     called GSMLPC using dynamic payload type 99 could be specified as:

        m=audio 49232 RTP/AVP 99
        a=rtpmap:99 X-GSMLPC/8000

     Such an experimental encoding requires that any site wishing to
     receive the media stream has relevant configured state in its session
     directory to know which tools are appropriate.

     Note that RTP audio formats typically do not include information
     about the number of samples per packet.  If a non-default (as defined
     in the RTP Audio/Video Profile) packetisation is required, the
     "ptime" attribute is used as given below.

     For more details on RTP audio and video formats, see [3].

   o Formats for non-RTP media should be registered as MIME content types
     as described in Appendix B.  For example, the LBL whiteboard
     application might be registered as MIME content-type application/wb
     with encoding considerations specifying that it operates over UDP,
     with no appropriate file format.  In SDP this would then be expressed
     using a combination of the "media" field and the "fmt" field, as
     follows:

        m=application 32416 udp wb

Suggested Attributes