   measures including statistical and spectral.  These tests check
   things like autocorrelation between different parts of a "random"
   sequence or distribution of its values.  They could be met by a
   constant stored random sequence, such as the "random" sequence
   printed in the CRC Standard Mathematical Tables [CRC].

   A typical pseudo-random number generation technique, known as a
   linear congruence pseudo-random number generator, is modular
   arithmetic where the N+1th value is calculated from the Nth value by

        V    = ( V  * a + b )(Mod c)
         N+1      N

   The above technique has a strong relationship to linear shift
   register pseudo-random number generators, which are well understood
   cryptographically [SHIFT*].  In such generators bits are introduced
   at one end of a shift register as the Exclusive Or (binary sum
   without carry) of bits from selected fixed taps into the register.
   For example:

      +----+     +----+     +----+                      +----+
      | B  | <-- | B  | <-- | B  | <--  . . . . . . <-- | B  | <-+
      |  0 |     |  1 |     |  2 |                      |  n |   |
      +----+     +----+     +----+                      +----+   |
        |                     |            |                     |
        |                     |            V                  +-----+
        |                     V            +----------------> |     |
        V                     +-----------------------------> | XOR |
        +---------------------------------------------------> |     |
                                                              +-----+

       V    = ( ( V  * 2 ) + B .xor. B ... )(Mod 2^n)
        N+1         N         0       2

   The goodness of traditional pseudo-random number generator algorithms
   is measured by statistical tests on such sequences.
   Carefully chosen values of the initial V and a, b, and c or the
   placement of shift register taps in the above simple processes can
   produce excellent statistics.

Eastlake, Crocker & Schiller                                    [Page 6]

RFC 1750        Randomness Recommendations for Security    December 1994

   These sequences may be adequate in simulations (Monte Carlo
   experiments) as long as the sequence is orthogonal to the structure
   of the space being explored.  Even there, subtle patterns may cause
   problems.  However, such sequences are clearly bad for use in
   security applications.  They are fully predictable if the initial
   state is known.  Depending on the form of the pseudo-random number
   generator, the sequence may be determinable from observation of a
   short portion of the sequence [CRYPTO*, STERN].  For example, with
   the generators above, one can determine V(n+1) given knowledge of
   V(n).  In fact, it has been shown that with these techniques, even if
   only one bit of the pseudo-random values is released, the seed can be
   determined from short sequences.

   Not only have linear congruent generators been broken, but techniques
   are now known for breaking all polynomial congruent generators
   [KRAWCZYK].

4. Unpredictability

   Randomness in the traditional sense described in Section 3 is NOT the
   same as the unpredictability required for security use.

   For example, use of a widely available constant sequence, such as
   that from the CRC tables, is very weak against an adversary.  Once
   they learn of or guess it, they can easily break all security, future
   and past, based on the sequence [CRC].  Yet the statistical
   properties of these tables are good.
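   The generators discussed above can be sketched in a few lines of
   Python (the RFC itself gives no code; the constants a, b, c, the tap
   positions, and the register width below are purely illustrative and
   are not recommended for any real use):

```python
# Sketch of the linear congruence recurrence,
#     V(N+1) = (V(N) * a + b) mod c,
# with illustrative constants only.
def lcg_next(v, a=1103515245, b=12345, c=2**31):
    return (v * a + b) % c

# Predictability: an adversary who learns one output and the parameters
# can reproduce every later output, i.e. V(n+1) from V(n).
seed = 42
v1 = lcg_next(seed)
v2 = lcg_next(v1)         # the generator's next value...
predicted = lcg_next(v1)  # ...is exactly what an observer of v1 computes
assert predicted == v2

# Sketch of the shift-register recurrence,
#     V(N+1) = (V(N)*2 + (B0 xor B2 ...)) mod 2^n,
# with taps at bit positions 0 and 2 as in the example formula.
def lfsr_next(v, taps=(0, 2), nbits=16):
    fb = 0
    for t in taps:
        fb ^= (v >> t) & 1           # XOR of the tapped bits
    return (v * 2 + fb) % (2 ** nbits)

assert lfsr_next(0b0001, nbits=4) == 0b0011  # feedback bit enters at the bottom
```

   Both recurrences are deterministic: good statistics, zero
   unpredictability once the state is known, which is exactly the point
   of Section 4.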
   The following sections describe the limitations of some randomness
   generation techniques and sources.

4.1 Problems with Clocks and Serial Numbers

   Computer clocks, or similar operating system or hardware values,
   provide significantly fewer real bits of unpredictability than might
   appear from their specifications.

   Tests have been done on clocks on numerous systems and it was found
   that their behavior can vary widely and in unexpected ways.  One
   version of an operating system running on one set of hardware may
   actually provide, say, microsecond resolution in a clock while a
   different configuration of the "same" system may always provide the
   same lower bits and only count in the upper bits at much lower
   resolution.  This means that successive reads of the clock may
   produce identical values even if enough time has passed that the
   value "should" change based on the nominal clock resolution.  There
   are also cases where frequently reading a clock can produce
   artificial sequential values because of extra code that checks for
   the clock being unchanged between two reads and increases it by one!

   Designing portable application code to generate unpredictable numbers
   based on such system clocks is particularly challenging because the
   system designer does not always know the properties of the system
   clocks that the code will execute on.

   Use of a hardware serial number such as an Ethernet address may also
   provide fewer bits of uniqueness than one would guess.  Such
   quantities are usually heavily structured and subfields may have only
   a limited range of possible values or values easily guessable based
   on approximate date of manufacture or other data.
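   The structure argument for Ethernet addresses can be made concrete
   with some back-of-the-envelope arithmetic.  The 24-bit vendor prefix
   is a real property of 48-bit MAC addresses; the production-run figure
   below is an assumed number chosen purely for illustration:

```python
import math

# Sketch: why a 48-bit Ethernet address is weaker than 48 bits suggest.
# The first three octets are a vendor prefix, so a known or guessable
# vendor leaves at most the 24-bit device portion unknown.
total_bits = 48
vendor_prefix_bits = 24
remaining = total_bits - vendor_prefix_bits
assert remaining == 24

# If the device portion is further known to come from a production run
# of, say, 100,000 units (an assumed figure, not from the RFC), the
# real unguessability shrinks to:
run_size = 100_000
bits = math.log2(run_size)
assert 16 < bits < 17   # roughly 16.6 bits, far below 48
```

   The same kind of estimate applies to any heavily structured serial
   number: count only the bits the adversary cannot narrow down.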
   For example, it is likely that most of the Ethernet cards installed
   on Digital Equipment Corporation (DEC) hardware within DEC were
   manufactured by DEC itself, which significantly limits the range of
   built-in addresses.

   Problems such as those described above related to clocks and serial
   numbers make it difficult to write code that produces unpredictable
   quantities if the code is to be ported across a variety of computer
   platforms and systems.

4.2 Timing and Content of External Events

   It is possible to measure the timing and content of mouse movement,
   key strokes, and similar user events.  This is a reasonable source of
   unguessable data with some qualifications.  On some machines, inputs
   such as key strokes are buffered.  Even though the user's
   inter-keystroke timing may have sufficient variation and
   unpredictability, there might not be an easy way to access that
   variation.  Another problem is that no standard method exists to
   sample timing details.  This makes it hard to build standard software
   intended for distribution to a large range of machines based on this
   technique.

   The amount of mouse movement or the keys actually hit are usually
   easier to access than timings but may yield less unpredictability as
   the user may provide highly repetitive input.

   Other external events, such as network packet arrival times, can also
   be used with care.  In particular, the possibility of manipulation of
   such times by an adversary must be considered.

4.3 The Fallacy of Complex Manipulation

   One strategy which may give a misleading appearance of
   unpredictability is to take a very complex algorithm (or an excellent
   traditional pseudo-random number generator with good statistical
   properties) and calculate a cryptographic key by starting with the
   current value of a computer system clock as the seed.
   An adversary who knew roughly when the generator was started would
   have a relatively small number of seed values to test as they would
   know likely values of the system clock.  Large numbers of
   pseudo-random bits could be generated but the search space an
   adversary would need to check could be quite small.

   Thus very strong and/or complex manipulation of data will not help if
   the adversary can learn what the manipulation is and there is not
   enough unpredictability in the starting seed value.  Even if they
   cannot learn what the manipulation is, they may be able to use the
   limited number of results stemming from a limited number of seed
   values to defeat security.

   Another serious strategic error is to assume that a very complex
   pseudo-random number generation algorithm will produce strong random
   numbers when there has been no theory behind or analysis of the
   algorithm.  There is an excellent example of this fallacy right near
   the beginning of Chapter 3 in [KNUTH], where the author describes a
   complex algorithm.  It was intended that the machine language program
   corresponding to the algorithm would be so complicated that a person
   trying to read the code without comments wouldn't know what the
   program was doing.  Unfortunately, actual use of this algorithm
   showed that it almost immediately converged to a single repeated
   value in one case and a small cycle of values in another case.
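   The seed-search attack on a clock-seeded generator described above
   can be sketched as follows.  The generator, its constants, the "true"
   seed, and the size of the adversary's search window are all
   illustrative assumptions, not values from the RFC:

```python
# Sketch: a generator seeded from a clock the adversary knows to within
# a small window is broken by trying every candidate seed.
def keystream(seed, length, a=1103515245, b=12345, c=2**31):
    out, v = [], seed
    for _ in range(length):
        v = (v * a + b) % c
        out.append(v)
    return out

true_seed = 777_000_123              # stands in for a clock reading
observed = keystream(true_seed, 4)   # output the adversary sees

# The adversary knows the start time to within 1000 clock ticks, so the
# "search space... could be quite small" -- here, 1000 candidates.
candidates = range(777_000_000, 777_001_000)
recovered = [s for s in candidates if keystream(s, 4) == observed]
assert recovered == [true_seed]      # exactly one seed explains the output
```

   No amount of extra manipulation of the stream widens that 1000-seed
   search space; only more unpredictability in the seed itself does.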
   Not only does complex manipulation not help you if you have a limited
   range of seeds, but blindly chosen complex manipulation can destroy
   the randomness in a good seed!

4.4 The Fallacy of Selection from a Large Database

   Another strategy that can give a misleading appearance of
   unpredictability is selecting a quantity randomly from a database and
   assuming that its strength is related to the total number of bits in
   the database.  For example, typical USENET servers as of this date
   process over 35 megabytes of information per day.  Assume a random
   quantity was selected by fetching 32 bytes of data from a random
   starting point in this data.  This does not yield 32*8 = 256 bits
   worth of unguessability.  Even after allowing that much of the data
   is human language and probably has more like 2 or 3 bits of
   information per byte, it doesn't yield 32*2.5 = 80 bits of
   unguessability.  For an adversary with access to the same 35
   megabytes, the unguessability rests only on the starting point of the
   selection.  That is, at best, about 25 bits of unguessability in this
   case.

   The same argument applies to selecting sequences from the data on a
   CD ROM or Audio CD recording or any other large public database.  If
   the adversary has access to the same database, this "selection from a
   large volume of data" step buys very little.  However, if a selection
   can be made from data to which the adversary has no access, such as
   system buffers on an active multi-user system, it may be of some
   help.

5. Hardware for Randomness

   Is there any hope for strong portable randomness in the future?
   There might be.  All that's needed is a physical source of
   unpredictable numbers.

   A thermal noise or radioactive decay source and a fast, free-running
   oscillator would do the trick directly [GIFFORD].
   This is a trivial amount of hardware, and could easily be included as
   a standard part of a computer system's architecture.  Furthermore,
   any system with a spinning disk or the like has an adequate source of
   randomness [DAVIS].  All that's needed is the common perception among
   computer vendors that this small additional hardware and the software
   to access it is necessary and useful.

5.1 Volume Required

   How much unpredictability is needed?  Is it possible to quantify the
   requirement in, say, number of random bits per second?

   The answer is that not very much is needed.  For DES, the key is 56
   bits and, as we show in an example in Section 8, even the highest
   security system is unlikely to require keying material of over 200
   bits.  If a series of keys is needed, they can be generated from a
   strong random seed using a cryptographically strong sequence as
   explained in Section 6.3.  A few hundred random bits generated once a
   day would be enough using such techniques.  Even if the random bits
   are generated as slowly as one per second and it is not possible to
   overlap the generation process, it should be tolerable in high
   security applications to wait 200 seconds occasionally.

   These numbers are trivial to achieve.  It could be done by a person
   repeatedly tossing a coin.  Almost any hardware process is likely to
   be much faster.

5.2 Sensitivity to Skew

   Is there any specific requirement on the shape of the distribution of
   the random numbers?  The good news is the distribution need not be
   uniform.  All that is needed is a conservative estimate of how
   non-uniform it is to bound performance.
   Two simple techniques to de-skew the bit stream are given below, and
   stronger techniques are mentioned in Section 6.1.2 below.

5.2.1 Using Stream Parity to De-Skew

   Consider taking a sufficiently long string of bits and mapping the
   string to "zero" or "one".  The mapping will not yield a perfectly
   uniform distribution, but it can be as close as desired.  One mapping
   that serves the purpose is to take the parity of the string.  This
   has the advantages that it is robust across all degrees of skew up to
   the estimated maximum skew and is absolutely trivial to implement in
   hardware.

   The following analysis gives the number of bits that must be sampled:

   Suppose the ratio of ones to zeros is 0.5 + e : 0.5 - e, where e is
   between 0 and 0.5 and is a measure of the "eccentricity" of the
