/*
** Distributed with 'doc' Version 2.0 from University of Southern
** California Information Sciences Institute (USC-ISI).  8/22/90
*/

Network Working Group                                        Steve Hotz
Request for Comments: XXXX                             Paul Mockapetris
                                                               XX/YY/ZZ

       PRELIMINARY DRAFT & NOTES: NOT YET RELEASED/REVIEWED

                       Automated Domain Testing

1. Status:

This RFC suggests ideas for the automated testing of DNS domains for
correct configuration. The purpose of this RFC is to elicit further
discussion regarding system requirements and issues of design and
implementation.

Distribution of this memo is unlimited.

2. Introduction:

The Domain Name System, currently used by a majority of the Internet's
component networks, is a large and complex, autonomously administered,
distributed database which provides a network-wide name service. Its
size, and the autonomously administered nature of the system, make it
an ideal breeding ground for problems caused by misconfigured domains.
An automated tool for checking that domains are consistent with the
procedures specified in the official RFCs would prove useful in a
couple of ways.

It would be most useful as a means for domain administrators to verify
(to a certain degree) that their domains are configured correctly.
This would allow new domains to become operational fairly quickly,
while ensuring that the domain is not causing far-reaching problems
during the period when the administrator is still gaining hands-on
experience.

The development of such a tool would also codify the interpretation
of what is considered correct or incorrect, and serve as a yardstick
against which strangely behaving domains may be measured.

Additionally, by examining misconfiguration trends of the Internet
as a whole, one might identify areas in which the DNS, or various
implementations, may need to evolve in future versions (e.g., perhaps
administrators often configure their domains in a certain "incorrect"
manner to gain some particular behavior or characteristic not
achievable with standard "correct" configuration).

3.1. On what scope should a question be asked?

There are a few variants on the question that one would like such a
tool to answer. The simplest query would be whether or not a specific
domain is correctly configured. In light of the primary intended use
of this tool, this question is probably sufficient.

However, one might like to ask about the configuration of the entire
domain tree. A generalization of this would be to ask about a
specified subtree (which could be a leaf domain). However, pursuing
this question requires walking the domain tree by searching through
all zone information for zone delegations. This would cause a
considerable amount of network traffic, and you may be thwarted by
servers unwilling to give up entire zone information.

Although the latter may be more convenient when checking a group of
domains, and possibly more efficient with respect to network and
nameserver utilization (by not repeating queries to common parent
domain servers), we will assume the former model throughout this RFC.

3.2. What should be tested? General Discussion.

The most interesting task in designing such a tool is determining what
constitutes a misconfigured domain, which problems we should look for,
and finally, which misconfigurations are possible and reasonable to
detect with an automated procedure.

Because the point at which a zone delegation is made is prone to
errors, and the "delegation" and "acceptance" of authority span two
distinct domains, a test of a single domain should probably look at
information from multiple domains.

One could, in addition to looking at the particular domain in
question, attempt to look at children (delegated) domains. However,
this again raises the issue of determining which domains (if any) are
below a given domain. On the other hand, one can test the parent
(delegating) domain to verify that its configuration is consistent
with respect to the domain under scrutiny.

We will assume the latter approach for now.
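Testing a domain via its parent first requires deriving the parent
zone's name from the domain name itself. A minimal Python sketch of
that step (the function name and the fully-qualified-name convention
are ours for illustration; 'doc' itself drives 'dig' rather than
implementing this in Python):

```python
def parent_domain(domain: str) -> str:
    """Return the parent of a dot-separated, fully qualified domain
    name. By DNS convention the root domain is written ".".
    """
    labels = domain.rstrip(".").split(".")
    if len(labels) <= 1:
        return "."          # parent of a top-level domain is the root
    return ".".join(labels[1:]) + "."
```

With the parent name in hand, the tool can query for the parent
zone's NS records and direct its delegation checks at those servers.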
Again, whether one approach or the other is more useful is a topic
for discussion. I believe that a tool based on either model could be
used to discover the same set of problems in any arbitrary subtree,
by choosing the appropriate set of domains to examine.

3.4. What claims should be made?

Once we have decided on a set of conditions we wish to test, we need
to specify what one should say about anomalies that are detected.
Should there be an attempt to classify them in some manner? If so, a
classification might be similar to the following:

ERRORS:   Something is definitely wrong, and should be fixed.
          Example: Authoritative nameservers for a domain disagree on
          information about the domain.

WARNINGS: Used for less serious problems, or strong indicators of
          problems that cannot be tested conclusively.
          Example: Parent domain servers have differing SOA records.
          A sign of a change to domain information which hasn't
          propagated yet. May explain some of the other errors.

NOTE:     One may be able to determine some potential problems by
          looking at information returned as a matter of course while
          investigating some separate problem. If such potential
          problems don't occur as often as the indicator, you may
          choose to note them rather than incur additional overhead
          investigating a likely dead end.
          Example: Glue records were not returned with an NS query to
          a non-authoritative server. This may not be an error, since
          returning them in the "ADDITIONAL" section is optional.

3.5. Procedure Variants

When subsequent queries are based on the results of a query, there
are often several possible ways to proceed.

Example: Assume that we are querying the parent domain servers to
determine a set of authoritative servers for the domain. The response
to these queries will be used to form a list of the appropriate
servers to which another query will be posed. In some instances,
among the set of servers for the parent domain, some may claim to
also be authoritative for the domain being tested.
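That situation can be represented by partitioning the replies
according to whether the responding parent server also answered
authoritatively for the domain under test. A sketch, assuming the
replies have already been collected (the tuple shape and function
name are ours, not part of any tool described here):

```python
def partition_replies(replies):
    """Split parent-server replies by whether the responder also
    claimed authority for the domain under test.

    `replies` is an iterable of (server, is_authoritative, ns_names)
    tuples; returns (authoritative_ns, other_ns) as sets of
    lower-cased nameserver names, since DNS names compare
    case-insensitively.
    """
    auth, other = set(), set()
    for server, is_auth, ns_names in replies:
        (auth if is_auth else other).update(n.lower() for n in ns_names)
    return auth, other
```

The two resulting sets can then be treated the same or differently,
depending on which of the alternatives below is adopted.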
Should you treat these replies differently than replies from parent
servers that are not authoritative for the domain? A few alternatives
might be:

(1) Ignore the non-authoritative replies, but only if authoritative
    responses were received.

(2) Classify nameservers for the domain based on whether the parent
    nameservers holding NS records for them were authoritative or
    not, and further, treat results from nameservers of different
    classes differently.

(3) Most simply, you might consider any nameserver for the domain
    specified in an NS record from any parent domain server to be
    equivalent.

Similar issues will exist for other parts of the procedure.

3.6. What should be tested? Specific examples.

The following is a partial set of anomalies to consider for automated
domain testing.

3.6.1. Queries to parent domain servers about the parent domain.

a. Server claims to be non-authoritative for the parent domain.
   An initial query will probably be made to discover the nameservers
   for the parent domain. If one of these nameservers is not
   authoritative, it probably should not have been listed among the
   NS referrals.

b. Server has no SOA record for the parent domain.
   As above (a), this server should not have been listed as a
   nameserver.

c. Server returns multiple SOAs for the parent domain.

d. SOA records from different servers have different serial numbers.
   All servers should have consistent data, except during a short
   transition interval between updates.

e. SOA records from different servers differ (same serial number).
   Important information is being changed, but the version number is
   not being updated; this may prevent distribution of the changes.

3.6.2. Queries to parent domain servers about the domain.

f. NS records from a single server have different TTLs.
   The server has information, possibly inconsistent, from incorrect
   sources or out-of-date versions.

g. Glue records present (or not present).
   The parent server should have the addresses of the authoritative
   servers; this is particularly important if it is non-authoritative.

h. Lists of NS records for the domain differ among servers.
   Parent servers should have consistent delegation information for
   the domain.

i. Server claims to also be authoritative for the domain.
   This is certainly not infrequent; however, we might want to test
   whether other authoritative domain servers concur. Also, the NS
   records from domain servers and from parent domain servers often
   disagree, even if both are authoritative. We may want to note
   whether inconsistent NS lists are of this special case.

3.6.3. Queries to domain servers about the domain.

j. Server has no SOA record for the domain.
   Parent servers have incorrect data and are referring queries about
   the domain to an unsuspecting server.

k. Server claims to be non-authoritative.
   As above (j), the parent server believes the server to be
   authoritative and is referring others to it.

l. Server returns multiple SOAs for the domain. See above (c).

m. SOA records from different servers have different serial numbers.
   See above (d).

n. SOA records from different servers differ (but have the same
   serial number). See above (e).

o. NS records from a single server have different TTLs. See above (f).

p. Lists of NS records for the domain differ among servers.
   All authoritative servers for a domain should agree on information
   about the domain, particularly the important delegation records.

q. NS list from parent servers does not match the list from
   authoritative servers. Correctly operating domains will have
   identical NS records at all domain servers and parent domain
   servers.

r. Server claims to be authoritative, but authoritative servers hold
   no NS record for it. Strangely enough, it often happens that a
   server claims to be authoritative and answers questions about the
   domain, but does not have an NS record for itself.

s. Reverse mapping (in-addr.arpa.) for the server address not found.
   A frequent error in newly established domains.

4. Implementation:

An initial implementation attempt is available via anonymous ftp
from venera.isi.edu, file: pub/doc.tar.Z

Doc requires the latest release of 'dig' (version 2.0) to perform
nameserver queries; dig is also available as pub/dig.2.0.tar.Z.

This software is intended to run on Berkeley UNIX (and variants).
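Many of the checks in section 3.6 reduce to simple set comparisons
once the dig output has been parsed. A sketch of checks (d)/(m) and
(q) over pre-fetched data, reporting in the ERROR/WARNING classes of
section 3.4 (function names and input shapes are our own, not those
of 'doc'):

```python
def check_soa_serials(serial_by_server):
    """Checks (d)/(m): servers should agree on the SOA serial number.

    `serial_by_server` maps a server name to the serial it returned.
    Returns a ("WARNING", message) pair, or None if all agree; a
    disagreement is only a warning, since it may reflect an update
    that is still propagating (section 3.4).
    """
    serials = set(serial_by_server.values())
    if len(serials) > 1:
        return ("WARNING", f"differing SOA serials: {sorted(serials)}")
    return None


def check_ns_agreement(parent_ns, domain_ns):
    """Check (q): NS lists from parent servers and from the domain's
    own authoritative servers should be identical.

    Both arguments are iterables of nameserver names; comparison is
    case-insensitive, as DNS names are. Returns an ("ERROR", message)
    pair, or None if the lists match.
    """
    p = {n.lower() for n in parent_ns}
    d = {n.lower() for n in domain_ns}
    if p != d:
        return ("ERROR",
                f"NS lists disagree: parent-only={sorted(p - d)}, "
                f"domain-only={sorted(d - p)}")
    return None
```

A driver would run such checks over each tested domain and collect
the non-None results into the final report.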