RFC 1607              A View from the 21st Century          1 April 1994

"Collaboratories," which was intended to convey the idea that people
and computers could carry out various kinds of collaborative work if
they had the right kinds of networks to link their computer systems
and the right kinds of applications to deal with distributed
applications. Of course, we take that sort of thing for granted now,
but it was new and often complicated 30 years ago.

I am going to try to find out how they dealt with the problem of
explosive growth.

Louis and I will be leaving shortly for a three-day excursion to the
new vari-grav habitat, but I will let you know what I find out about
the 1990s period in Internet history when we get back.

Therese

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

To: "Troisema" <rm1023@geosync.hyatt.com>
CC: "David Kenter" <dkenter@xob.isea.mr>
CC: "Therese Troisema" <ttroisema@inria.fr>
From: "Jonathan Bradel" <jbradel@astro.luna.edu>
Date: September 13, 2023 10:34:05 LT
Subject: Re: Internet History

Therese,

I sent a few Knowbot programs out looking for Internet background
and found an interesting archive at the Postel Historical Institute
in Pacific Palisades, California. These folks have an incredible
collection of old documents, some of them actually still on paper,
dating as far back as 1962! This stuff gets addictive after a while.

Postel apparently edited a series of reports called "Request for
Comments," or "RFC" for short. These seem to be one of the principal
means by which the technology of the Internet has been documented,
and also, as nearly as I can tell, a lot of its culture. The
Institute also has a phenomenal archive of electronic mail going
back to about 1970 (do you believe it? Email from over 50 years
ago!). I don't have time to set up a really good automatic analysis
of the contents, but I did leave a couple of Knowbots running to
find things related to growth, scaling, and increased capacity of
the Internet.

It turns out that the technical committee called the Internet
Engineering Task Force was very preoccupied in the 1991-1994 period
with the whole problem of accommodating exponential growth in the
size of the Internet. They had a bunch of different options for
replacing the then-existing IP layer with something that could
support a larger address space. There were a lot of arguments about
how soon they would run out of addresses, and a lot of uncertainty
about how much functionality to add on while solving the primary
growth problem. Some folks thought the scaling problem was so
critical that it should take priority, while others thought there
was still some time and that new functionality would help motivate
the massive effort needed to replace the then-current version 4 IP.

As it happens, they were able to achieve multiple objectives, as we
now know. They found a way to increase the space for identifying
logical end-points in the system as well as increasing the address
space needed to identify physical end-points. That gave them a hook
on which to base the mobile, dynamic addressing capability that we
now rely on so heavily in the Internet.
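For a rough sense of the scale argument behind that debate, here is
a back-of-the-envelope sketch (an editor's illustration, not
anything from the letters; the 32- and 128-bit widths are the real
IPv4 and IPv6 address sizes, and the population figure is an
assumption):

   # Raw address counts for a few candidate widths from the IPng
   # debate. The population figure is an assumption.

   POPULATION = 10_000_000_000   # assumed global population

   for bits in (32, 64, 128):
       total = 2 ** bits
       print(f"{bits:>3}-bit addresses: {total:.3e} "
             f"(~{total / POPULATION:.3e} per person)")

   # Output:
   #  32-bit addresses: 4.295e+09 (~4.295e-01 per person)
   #  64-bit addresses: 1.845e+19 (~1.845e+09 per person)
   # 128-bit addresses: 3.403e+38 (~3.403e+28 per person)

On these assumptions a 32-bit space offers less than one address per
person, which is the crunch the letters describe.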
According to the notes I have seen, they were also experimenting
with new kinds of applications that required different kinds of
service than the usual "best efforts" they were able to obtain from
the conventional router systems.

I found an absolutely hilarious "packet video clip" in one of the
archives. It's a black-and-white, 6-frame-per-second shot of some
guy taking off his coat, shirt and tie at one of the engineering
committee meetings. His T-shirt says "IP on everything," which must
have been some kind of slogan for Internet expansion back then.
Right at the end, some big bearded guy comes up and stuffs some
paper money in the other guy's waistband. Apparently, there are
quite a few other archives of the early packet video squirreled away
at the PHI. I can't believe how primitive all this stuff looks. I
have attached a sample for you to enjoy. They didn't have TDV back
then, so you can't move the point of view around the room or
anything. You just have to watch the figures move jerkily across the
screen.

You can dig into this stuff if you send a Knowbot program to
concierge@phi.pacpal.ca.us. This Postel character must have never
thrown anything away!!

Jon

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

To: "Jonathan Bradel" <jbradel@astro.luna.edu>
CC: "David Kenter" <dkenter@xob.isea.mr>
CC: "Troisema" <rm1023@geosync.hyatt.com>
From: "Therese Troisema" <ttroisema@inria.fr>
Date: September 15, 2023 07:55:45 UT
Subject: Re: Internet History

Jon,

thanks for the pointer. I pulled up a lot of very useful material
from PHI. You're right, they did manage to solve a lot of problems
at once with the new IP. Once they got the bugs out of the prototype
implementations, it spread very quickly from the transit service
companies outward towards all the host computers in the system. I
also discovered that they were doing research on primitive
gigabit-per-second networks at that same general time. They had been
relying on unbelievably slow transmission systems around 100
megabits-per-second and below. Can you imagine how long it would
take to send a typical 3DV image at those glacial speeds?
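To put those "glacial speeds" in perspective, a quick calculation
(the 3DV image size below is a hypothetical figure; the letters
never give one):

   # Transfer time for a hypothetical 100 GB "3DV image" at the two
   # line rates Therese mentions. The image size is an assumption.

   IMAGE_BITS = 100 * 10**9 * 8            # 100 gigabytes, in bits

   rates_bps = {
       "100 megabit/s (early 1990s)":  100 * 10**6,
       "1 gigabyte/s (end of decade)": 8 * 10**9,  # 1 GB/s = 8 Gbit/s
   }

   for label, bps in rates_bps.items():
       minutes = IMAGE_BITS / bps / 60
       print(f"{label}: {minutes:.1f} minutes")

   # Output:
   # 100 megabit/s (early 1990s): 133.3 minutes
   # 1 gigabyte/s (end of decade): 1.7 minutes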
According to the notes I found, a lot of the wide-area system was
moved over to operate on top of something they called Asynchronous
Transfer Mode Cell Switching, or ATM for short. Towards the end of
the decade, they managed to get end-to-end transfer rates on the
order of a gigabyte per second, which was fairly respectable given
the technology they had at the time. Of course, the
telecommunications business had been turned totally upside down in
the process of getting to that point.

It used to be the case that broadcast and cable television,
telephone and publishing were different businesses. In some
countries, television and telephone were monopolies operated by the
government or operated in the private sector with government
regulation. That started changing drastically as the 1990s unfolded,
especially in the United States, where telephone companies bought
cable companies, publishers owned various communication companies,
and it got to be very hard to figure out just what kind of company
it was that should or could be regulated. There grew up an amazing
number of competing ways to deliver information in digital form. The
same company might offer a variety of information and communication
services.

With regard to the Internet, it was possible to reach it through
mobile digital radio, satellite, conventional wire-line access
(quaintly called "dial-up") using Integrated Services Digital
Networking, specially-designed modems, special data services on
television cable, and new fiber-based services that eventually made
it even into residential settings. All the bulletin board systems
got connected to the Internet and surprised everyone, including
themselves, when the linkage created a new kind of publishing
environment in which authors took direct responsibility for making
their work accessible.

Interestingly, this didn't do away with the need for traditional
publishers, who filter and evaluate material prior to publication,
or with a continuing interest in paper and CD-ROM. As display
technology got better and more portable, though, paper became much
more of a specialty item. Most documents were published on-line or
on high-density digital storage media. The basic publishing process
retained a heavy emphasis on editorial selection, but the mechanics
shifted largely in the direction of the author - with help from
experts in layout and accessibility. Of course, it helped to have a
universal reference numbering plan which allowed authors to register
documents in permanent archives. References could be made to these
from any other on-line context and the documents retrieved readily,
possibly at some cost for copying rights.

By the end of the decade, "multimedia" was no longer a buzz-word but
a normal way of preparing and presenting information. One unexpected
angle: multimedia had been thought to be confined to presentation in
visual and audible forms for human consumption, but it turned out
that including computers as senders and recipients of these messages
allowed them to use the digital email medium as an enabling
technology for deferred, inter-computer interaction.

Just based on what I have been reading, one of the toughest
technical problems was finding good standards to represent all these
different modalities. Copyright questions, which had been thought to
be what they called "show-stoppers," turned out to be susceptible to
largely-established case law. Abusing access to digital information
was impeded in large degree by wrapping publications in software
shields, but in the end abuses were still possible, and abusers were
prosecuted.
On the policy side, there was a strong need to apply cryptography
for authentication and for privacy. This was a big struggle for many
governments, including ours here in France, where there are very
strong views and laws on this subject, but ultimately, the need for
commonality on a global basis outweighed many of the considerations
that inhibited the use of this valuable technology.

Well, that takes us up to about 20 years ago, which still seems a
far cry from our current state of technology. With over a billion
computers in the system and most of the populations of
information-intensive countries fully linked, some of the more
technically-astute back at the turn of the millennium may have had
some inkling of what was in store for the next two decades.

Therese

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

To: "Therese Troisema" <ttroisema@inria.fr>
CC: "Jonathan Bradel" <jbradel@astro.luna.edu>
From: "David Kenter" <dkenter@xob.isea.mr>
Date: September 17, 2023 06:43:13 MT
Subject: Re: Internet History

Therese and Jon,

This is really fascinating! I found some more material, thanks to
the Internet Society, which summarizes the technical developments
over the last 20 years. Apparently one of the key events was the
development of all-optical transmission, switching and computing in
a cost-effective way. For a long time, this technology involved
rather bulky equipment - some of the early 3DV clips from 2000-2005
showed rooms full of gear required to steer beams around. A very
interesting combination of fiber optics and three-dimensional
electro-optical integrated circuits collapsed a lot of this to sizes
more like what we are accustomed to today. Using pico- and
femto-molecular fabrication methods, it has been possible to build
very compact, extremely high speed computing and communication
devices.

I guess those guys at Xerox PARC who imagined that there might be
hundreds of millions of computers in the world, hundreds or even
thousands of them for each person, would be pleased to see how clear
their vision was. The only really bad thing, as I see it, is that
those guys who were trying to figure out how to deal with Internet
expansion really blew it when they picked a measly 64 bit address
space. I hear we are running really tight again. I wonder why they
didn't have enough sense just to allocate at least 1024 bits to make
sure we'd have enough room for the obvious applications we can see
we want, now?

David

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
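For scale, David's "measly" 64 bits works out as follows (a sketch
with assumed figures; historically, address space has been consumed
by hierarchical allocation long before the raw count runs out, which
is presumably the "running really tight" he refers to):

   # How far a 64-bit identifier space stretches under a "thousands
   # of devices per person" scenario. All figures are assumptions.

   SPACE = 2 ** 64             # total 64-bit identifiers
   PEOPLE = 10 * 10**9         # assumed population
   PER_PERSON = 10_000         # assumed devices per person

   used = PEOPLE * PER_PERSON
   print(f"{used:.2e} identifiers used of {SPACE:.2e} "
         f"(fraction {used / SPACE:.2e})")

   # Output:
   # 1.00e+14 identifiers used of 1.84e+19 (fraction 5.42e-06)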
Final Comments

The letters end here, so we are left to speculate about many of the
loose ends not tied up in this informal exchange. Obviously, our
current struggles ultimately will be resolved and a very different,
information-intensive world will evolve from the present. There are
a great many policy, technical and economic questions that remain to
be answered to guide our progress towards the environment described
in part in these messages. It will be an interesting two or three
decades ahead!

Security Considerations

Security issues are not discussed in this memo.

Author's Address

Vinton Cerf
President, Internet Society
12020 Sunrise Valley Drive, Suite 270
Reston, VA 22091

Phone: +1 703 648 9888
Fax: +1 703 648 9887
EMail: vcerf@isoc.org

or

Vinton Cerf
Sr. VP Data Architecture
MCI Data Services Division
2100 Reston Parkway, Room 6001
Reston, VA 22091

Phone: +1 703 715 7432
Fax: +1 703 715 7436
EMail: vinton_cerf@mcimail.com
