mrighele · a year ago
I don't think that the Internet was created to survive a nuclear strike, but I think we can say that it was _designed_ to survive one: that was one of the reasons packet switching was invented (as opposed to circuit switching, which was the traditional approach at the time).

The idea of packet switching as a way to make a communication network more robust came to Paul Baran a few years earlier, and while it was not the basis of Arpanet, it probably influenced it. Wikipedia [1] is not the best of sources, but it sums it up nicely:

"After joining the RAND Corporation in 1959, Baran took on the task of designing a "survivable" communications system that could maintain communication between end points in the face of damage from nuclear weapons during the Cold War. Then, most American military communications used high-frequency connections, which could be put out of action for many hours by a nuclear attack.

[...]

After proving survivability, Baran and his team needed to show proof of concept for that design so that it could be built. [...] The result was one of the first store-and-forward data layer switching protocols, a link-state/distance vector routing protocol, and an unproved connection-oriented transport protocol. Explicit detail of the designs can be found in the complete series of reports On Distributed Communications, published by RAND in 1964.

[...]

The Distributed Network that Baran introduced was intended to route around damage.

[...]

In 1969, when the US Advanced Research Projects Agency (ARPA) started developing the idea of an internetworked set of terminals to share computing resources, the reference materials that they considered included Baran and the RAND Corporation's "On Distributed Communications" volumes. The resiliency of a packet-switched network that uses link-state routing protocols, which are used on the Internet, stems in some part from the research to develop a network that could survive a nuclear attack."

[1] https://en.wikipedia.org/wiki/Paul_Baran

dboreham · a year ago
I've noticed that the closer I am to historical events, the more wrong reporting of those events tends to be. I have a neighbor who worked at RAND in the 60s and will ask him about this next time we meet.

Worth noting perhaps that many (all?) technical innovations are the result of some underlying technology maturing to the point that it can be applied to a problem. In this case, I bet that nobody liked the fragility and brittleness of circuit switched networking, but in order to make a packet switched network you need small fast computers that are cheap enough to deploy as network nodes. These appeared : minicomputers. The first ARPANet nodes were minicomputers running routing software. In fact the Internet used regular computers as routers into near modern history (IBM RISC machines iirc were deployed at the DS3 upgrade). So PSN is the result of a) people sitting around wishing they could have a PSN, and b) the technology to actually realize that becoming practical. There's no eureka moment.

Retric · a year ago
It’s impossible to know everything that’s happening at the same time. So while ARPANET was the second packet-switched network, behind NPL, a journalist wouldn’t have a clue.

The original conception of Message Blocks was routing around a damaged network, but the term "packet switching" actually referred to a means for multiple users to share a single connection. ARPANET included many ideas from NPL but was also its own thing.

So who invented Packet Switching depends on what parts of it you consider critical.

RcouF1uZ4gsC · a year ago
> I've noticed that the closer I am to historical events, the more wrong reporting of those events tends to be.

That is not just for historical events, but for anything you are familiar with.

Michael Crichton coined the term Gell-Mann Amnesia Effect

https://theportal.wiki/wiki/The_Gell-Mann_Amnesia_Effect

marcus0x62 · a year ago
> IBM RISC machines iirc were deployed at the DS3 upgrade

You recall correctly. Info/pictures here[0].

0 - https://www.rcsri.org/collection/nsfnet-t3/

dredmorbius · a year ago
For those who are interested in that sort of thing, Baran's full 11 publications outlining the design goals and principles of packet-switched networks in January of 1964 are available from RAND as downloadable PDFs:

<https://www.rand.org/about/history/baran.html>

The first document introduces the basic problem:

Let us consider the synthesis of a communication network which will allow several hundred major communications stations to talk with one another after an enemy attack. As a criterion of survivability we elect to use the percentage of stations both surviving the physical attack and remaining in electrical connection with the largest single group of surviving stations. This criterion is chosen as a conservative measure of the ability of the surviving stations to operate together as a coherent entity after the attack. This means that small groups of stations isolated from the single largest group are considered to be ineffective.

<https://www.rand.org/pubs/research_memoranda/RM3420.html>

"Attack" isn't defined, but it's clear that resilience against broad assaults was a key consideration from the very beginning.
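Baran's criterion is concrete enough to simulate. Here is a minimal sketch (the toy grid topology and parameters are my own, not from the RAND reports): knock out random stations, then measure what fraction of all stations both survive and remain connected to the largest surviving group.

```python
import random
from collections import deque

def largest_component_fraction(nodes, edges, kill_prob, rng):
    """Baran's criterion: fraction of ALL stations that both survive
    a random 'attack' and stay connected to the largest surviving group."""
    survivors = {n for n in nodes if rng.random() > kill_prob}
    adj = {n: [] for n in survivors}
    for a, b in edges:
        if a in survivors and b in survivors:
            adj[a].append(b)
            adj[b].append(a)
    # Breadth-first search to find the largest connected component
    seen, best = set(), 0
    for start in survivors:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        best = max(best, size)
    return best / len(nodes)

# Toy example: a 4x4 grid network with redundant links, Baran-style
nodes = [(x, y) for x in range(4) for y in range(4)]
edges = [((x, y), (x + 1, y)) for x in range(3) for y in range(4)] + \
        [((x, y), (x, y + 1)) for x in range(4) for y in range(3)]
rng = random.Random(0)
avg = sum(largest_component_fraction(nodes, edges, 0.3, rng)
          for _ in range(1000)) / 1000
print(f"mean surviving-and-connected fraction: {avg:.2f}")
```

Even with 30% of stations destroyed at random, the redundant grid keeps most survivors in one coherent group, which is exactly the "conservative measure" the memo describes.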

throw0101a · a year ago
> "Attack" isn't defined, but it's clear that resilience against broad assaults was a key consideration from the very beginning.

It was a consideration for Baran's work, but not for ARPAnet:

From chapter two of Wizards:

> Roberts also learned from Scantlebury, for the first time, of the work that had been done by Paul Baran at RAND a few years earlier. When Roberts returned to Washington, he found the RAND reports, which had actually been collecting dust in the Information Processing Techniques Office for months, and studied them. Roberts was designing this experimental network not with survivable communications as his main—or even secondary—concern. Nuclear war scenarios, and command and control issues, weren’t high on Roberts’s agenda. But Baran’s insights into data communications intrigued him nonetheless, and in early 1968 he met with Baran. After that, Baran became something of an informal consultant to the group Roberts assembled to design the network. […]

* https://www.goodreads.com/book/show/281818.Where_Wizards_Sta...

* https://en.wikipedia.org/wiki/Larry_Roberts_(computer_scient...

KineticLensman · a year ago
> but I think we can say that it was _designed_ to survive a nuclear strike

There is a lot more to the design of 'the internet' than the selection of a packet-switched protocol. The early boxes were not hardened in any way and were intended to support computer timesharing amongst academic researchers. The DoD's C2 providers themselves rejected the concept of a decentralised packet-switched network because 'it would never work', which is why the relevant theoretical papers were sitting on the shelf and available when ARPANET was designed.

cfmcdonald · a year ago
> but I think we can say that it was _designed_ to survive a nuclear strike

On what basis? What is the distinction between being "created" to survive a nuclear strike, and being "designed" to do so?

> that was one of the reason that packet switching was invented (compared to the traditional, at the time, circuit switching).

Yes, but I don't think it's a relevant one. Baran's papers kinda-sorta-maybe had some influence on ARPANET, but ARPANET mostly got packet switching (and certainly the term "packet") from Donald Davies. If you look at the actual layout of ARPANET, it wasn't very survivable (not much redundancy in the links) [0] compared to Baran's proposal [1]. Internetworking and "the Internet" as we know it came much later, well past the point where Baran had any influence.

[0]: https://commons.wikimedia.org/wiki/Category:ARPANET_maps

[1]: https://personalpages.manchester.ac.uk/staff/m.dodge/cyberge...

throw0101a · a year ago
> I don't think that the Internet was created to survive a nuclear strike, but I think we can say that it was _designed_ to survive a nuclear strike, that was one of the reason that packet switching was invented (compared to the traditional, at the time, circuit switching).

That cannot be said unless you can cite sources saying so. It's been a little while since I read Wizards, but I don't remember any mention of nuclear war until Baran's work at RAND is mentioned, which is a couple of chapters in (IIRC).

For Licklider et al it was all about research, collaboration, and resource sharing.

> The idea of packet switching as a way make a communication network more robust came to Paul Baran a few years earlier, and while it was not the basis of Arpanet, it probably influenced it.

Baran's work was used in things like queuing theory, but work was already underway on ARPAnet before Baran (and Davies in the UK) were roped in. In fact it was Davies who pointed out Baran's work to the ARPAnet folks:

* https://en.wikipedia.org/wiki/Donald_Davies

This is all covered in Where Wizards Stay Up Late.

AlbertCory · a year ago
1) Robustness in the face of nuclear disaster was one of the drivers of packet switching.

2) Not the only one, though.

3) It was such a good idea that it grew a life of its own, and no one talked about nuclear war anymore.

walrus01 · a year ago
I think it's also worth noting that the physical infrastructure of certain portions of the circuit-switched AT&T Long Lines network (which also carried the AUTOVON network for the DoD) were designed, or at least attempted to be designed, to survive a nuclear strike. Theoretically traffic would have been manually re-routed around destroyed areas in a post-strike scenario.

There's certain underground bunker sites on the historical L4 transcontinental coaxial cable routes which were designed with equipment mounted on springs, massively thick concrete walls, decontamination showers, and so on. Other mountain top and flat land area microwave point to point relay sites were designed to take a certain amount of overpressure and possibly would have survived nearby smaller nuclear weapon strikes (though obviously not a direct hit).

Much of this pre-dates 1969 and the concept of the ARPANET. You can see this if you dig into what telecom services/links supported the NIKE missile batteries built around major cities, and what medium-to-long-distance links carried the data feeds from regional radar sites into the SAGE direction centers.

Here's a fairly typical example of the more costly type of site; you can see photos of it under construction and how much extremely reinforced underground concrete was involved.

https://coldwar-ct.com/Home_Page_S1DO.html

pjmorris · a year ago
Perhaps corroborating your point, excerpted from Waldrop's 'The Dream Machine':

"Why did ARPA build the network?” Lukasik asks. “There were actually two reasons. One was that the network would be good for computer science."

...

"But there was also another side to the story, which was that ARPA was a Defense Department agency. And after Eb [Rechtin] came in, defense relevance became the dominant notion. Everybody was writing relevance statements."

...

"So in that environment, I would have been hard pressed to plow a lot of money into the network just to improve the productivity of the researchers. The rationale just wouldn’t have been strong enough. What was strong enough was this idea that packet switching would be more survivable, more robust under damage to the network. If a plane got shot down, or an artillery barrage hit the command center, the messages would still get through. And in a strategic situation—meaning a nuclear attack—the president could still communicate to the missile fields. So I can assure you, to the extent that I was signing the checks, which I was from nineteen sixty-seven on, I was signing them because that was the need I was convinced of.”

Waldrop, M. Mitchell. The Dream Machine (p. 273). Stripe Press. Kindle Edition.

TheOtherHobbes · a year ago
Yes, quite. This is a very silly article which ignores a lot of the real history in favour of a cutesy top up of early 1990s nostalgia - which was a good 20-30 years after the events that really matter.

Saying "The Internet isn't ARPANET" is ridiculous. Of course it isn't. ARPANET was an academic research project with a mix of defence and open R&D requirements. The Internet is a collection of extra layers of commercial development on top of some of that R&D.

Academic research projects are rarely hardened because the point of the project is to investigate possibilities, not to spend hundreds of billions building a physically bomb-proof network that's useless because the core tech doesn't work.

When the Berlin Wall came down the goals changed, but the core concept of distributed scalable robustness is still very much there today. Of course now we have too many choke points, so it's not as robust as it could be. But if someone cuts a cable packets will still find a longer, slower way around as long as the bandwidth is there.

phaedrus · a year ago
In my opinion packet switching is an idea that would have been inevitable once the technology to support it was in place. The really important idea of the Internet is the insight that the envelope should be independent of the data it contains.

I say this because I work with specialized equipment that uses RS-232 serial protocols for communication, and despite many decades of available examples of "the right" way to do it (e.g. the OSI model), engineers continue(d) to miss this lesson and design ad hoc protocols that don't respect this division of concerns, and which suffer for it. Even in IP protocols designed to modernize this by wrapping the RS-232 serial packets in internet packets, they repeat the same mistake(s). That is, in the midst of dealing with a more complicated serial-to-IP conversion problem, created because the original format didn't clearly distinguish envelope from data, they compound the problem by mixing levels in the new protocol.

For example, writing IPv4 into the standard for a proprietary Application- or Session-level protocol, even as our network is banning IPv4 and requiring IPv6, while the protocol is not even defined for IPv6 addresses. It should be agnostic to IP-level concerns entirely. (The engineers said, while designing the system, "well, .1 is going to be this piece of equipment, .2 is going to be the other side, if it's blah blah blah it's .3, etc.")
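The envelope/payload split argued for above can be sketched in a few lines. This is a hypothetical frame format, not any real protocol: the header carries only addressing, length, and a checksum, and never inspects the payload bytes, so it can carry anything, including a legacy serial message.

```python
import struct
import zlib

# Hypothetical envelope: src, dst, payload length, CRC32 of payload.
# The header never looks inside the payload, so any bytes can be carried.
HEADER = struct.Struct("!BBHI")

def wrap(src: int, dst: int, payload: bytes) -> bytes:
    """Put an opaque payload inside the envelope."""
    return HEADER.pack(src, dst, len(payload), zlib.crc32(payload)) + payload

def unwrap(frame: bytes) -> tuple[int, int, bytes]:
    """Recover addressing and payload; verify integrity without parsing it."""
    src, dst, length, crc = HEADER.unpack_from(frame)
    payload = frame[HEADER.size:HEADER.size + length]
    if zlib.crc32(payload) != crc:
        raise ValueError("corrupt payload")
    return src, dst, payload

# The envelope is agnostic to what it carries -- here, a legacy
# STX/ETX-framed RS-232 message rides through untouched:
frame = wrap(1, 2, b"\x02legacy RS-232 message\x03")
src, dst, payload = unwrap(frame)
```

The point is that `wrap`/`unwrap` would not change if the payload switched to a different inner protocol, which is exactly the agnosticism the ad hoc designs lose.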

treyd · a year ago
People keep repeating this mistake at all layers of the stack because they fail to see different layers as managing different sets of concerns, and instead treat the stack as a monolith.

A very recent and high level example of this is how the Matrix protocol was architected. By defining all the message formats in terms of JSON-over-HTTP it makes it difficult to only use part of the protocol and not all of it, and makes it difficult to use it over alternative transports since the assumptions of HTTP idioms are baked all the way down.
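The alternative can be illustrated with a small sketch (hypothetical types, not the actual Matrix API): define the message format once, independent of any transport, so the same encoded bytes can ride over HTTP, a length-prefixed stream, or anything else.

```python
import json
from dataclasses import dataclass

@dataclass
class ChatEvent:
    """A message format defined with no knowledge of its transport."""
    room: str
    sender: str
    body: str

    def encode(self) -> bytes:
        # Transport-agnostic wire form
        return json.dumps(self.__dict__).encode()

    @classmethod
    def decode(cls, raw: bytes) -> "ChatEvent":
        return cls(**json.loads(raw))

def send_over_http(raw: bytes) -> bytes:
    """One possible transport: an HTTP-style request body."""
    return b"POST /send\r\n\r\n" + raw

def send_over_stream(raw: bytes) -> bytes:
    """Another transport: length-prefixed bytes on a raw socket."""
    return len(raw).to_bytes(4, "big") + raw

event = ChatEvent("#history", "alice", "hello")
wire = event.encode()
# The same bytes work over either transport, and round-trip intact:
assert ChatEvent.decode(wire) == event
```

When the format is baked into HTTP idioms instead, neither the partial reuse nor the alternative transport is possible without re-specifying the messages.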

m463 · a year ago
> The idea of packet switching as a way make a communication network more robust came to Paul Baran a few years earlier,

I remember back in the days when the internet was first starting to happen, seeing comments about packet switching. The gist of what I recall was that there was lots of dark fiber, and folks like AT&T were trying to prevent it from being used and to force slow/expensive packet switching on us, billing us per-packet or something.

It is hard to dig up any article about this, however.

Rattled · a year ago
When was this? I would have thought the internet was well established before fiber optic cabling was widely used.
euroderf · a year ago
AUTODIN preceded it, but ARPANET won out over AUTODIN II.

teleforce · a year ago
For all intents and purposes, the Internet was created by and for the US military, not for public utility, and one of the main reasons for its creation, if not the sole reason, was to be able to withstand a large-scale nuclear attack. It was not created for remote-feeding the cats with IoT at home while you are away. It's easily one of the top ten inventions of the 20th century.

It's not only the Internet: the FFT was also created to detect illegal nuclear testing, and it's included in the top ten algorithms of the 20th century [1],[2]. There's no shame in admitting the facts.

If not for military purposes, the Internet would probably never have seen the light of day. The Internet's packet-switched, end-to-end network precursor in France was proposed and implemented by Louis Pouzin, famously called the 4th man of the Internet, who coined the word "datagram", the very fundamental concept in packet switching; he lost his proposal to other communication technology for France's nationwide network implementation [3].

The original inventors of the Internet proposed the idea to AT&T; the US communications behemoth and monopoly at the time laughed at the packet-switching idea, saying that it would not work and did not make any economic sense. The only reason the Internet was created and survived is its military applications, and as with any military application, the main objective is to maintain and sustain command and control, which the Internet is more than capable of. I once read that someone simulated 99% infrastructure demolition (as in a nuclear attack) and Internet communication managed to survive intact; I cannot recall the article now (perhaps this is where ChatGPT becomes handy). The fact that the US military maintained a separate military network (similar to the Internet), apart from the public Internet, even after the Internet went public and its popularity increased, provides a hint of why it's so important for the US military.

[1] Great algorithms are the poetry of computation:

https://www.andrew.cmu.edu/course/15-355/misc/Top%20Ten%20Al...

[2] The Algorithm That Almost Stopped The Development of Nuclear Weapons:

https://www.iflscience.com/the-algorithm-that-almost-stopped...

[3] Louis Pouzin:

https://en.wikipedia.org/wiki/Louis_Pouzin

throw0101a · a year ago
> For all intents and purposes the Internet was created by and for US military not for public utility, and one of the main reasons for its creation, if not the sole reason, is to be able to withstand large scale nuclear attack.

And yet Charles M. Herzfeld (who was ARPA director at the time) and Robert Taylor (who was in charge of getting ARPAnet going) say otherwise: it was to link computing centres across the country so that resources could be shared and collaboration would be easier.

If Herzfeld and Taylor didn't know why ARPA created ARPAnet then who else would?

The entire linked article is about tracing the source of the "able to withstand large-scale nuclear attack" origin myth:

> However, the documentation is voluminous and the people who were in the room have all given a consistent story about how it was to build a network for time-sharing of expensive computers and better collaboration.

user3939382 · a year ago
Packet switching was invented by an American and a Brit independently, IIRC.
mrighele · a year ago
Yes, the linked article also talks about Donald Davies, who influenced Arpanet more directly.

But the point about Paul Baran is that for him packet switching was a way to make communications more resilient in the face of nuclear bombing (and other things), and he had at least some influence on the birth of the Internet. So if you ask me "what is the relation between the Internet and a nuclear strike", my first answer is "Paul Baran".

dboreham · a year ago
For a couple of decades there was a totally independent network stack in the UK (JANET) with its own equivalent of RFCs and so on. History gets written by the victors.
soheil · a year ago
A nuclear attack could be natural selection for the internet's evolution, honestly.
ot1138 · a year ago
Interesting story, but I have a bit of anecdotal evidence to share. Back when I was a freshman at UIUC in 1989, I was given a campus tour and told that one of the buildings there was designed to collapse outward in order to protect the equipment in the basement. That equipment was part of a national computer network (not yet called the internet!).

So at the very least, the origin of this story predates 1991 by at least two years.

I don't recall the name of the building but here it is on Google maps.

https://www.google.com/maps/@40.106201,-88.2268272,3a,75y,91...

Edit: It's not clear from my original comment but the reason for collapse would presumably be a nuclear strike. I remember this because this was a time when we grew up with a constant fear of a Russian nuclear strike and I couldn't help but wonder why anyone on earth would want to nuke Champaign.

Edit: Ah, here we go! It is the Foreign Languages Building (FLB), later renamed. I remember having to trudge there at 7 a.m. on snowy winter days to listen to Japanese language cassettes.

https://uihistories.library.illinois.edu/virtualtour/maincam...

Edit: And here's a contemporary article about the FLB, which also cited some of the crazy rumors about this building.

https://imgur.com/HXenjnt.png

jcrash · a year ago
I was going to share this story but you beat me to it. They're still claiming this in tours ~2017.

The building was called the Foreign Languages Building until very recently and is now called the Literatures, Cultures & Linguistics Building.

Relevant info from the UIHistory site:

"Located on the site of the former Old Entomology Building, ground was broken on the Foreign Language Building (FLB) on December 18, 1968.

A popular myth is that the building's distinctive architecture was a result of its being designed to house a supercomputer on campus called Plato. The building was supposedly designed so that if it was bombed, the building's shell would fall outwards, protecting the supercomputer on the inside. It is also rumored that the building's interior layout was a result of trying to confuse Soviet spies and prevent them from stealing secrets from the supercomputer.

In reality, the building's architecture is not actually all that unique and was a popular style of the day. In fact, just a few blocks to the west, one may find the Speech and Hearing Sciences Building, which is a 2-story clone of the building. Plato itself was real, but referred not to a secret government program, but rather to the first "modern" electronic learning system, the forebear of course software like WebCT and Mallard. The mainframe computer that ran the Plato system was located in north campus, in a building which used to reside on the west side of the Bardeen Quad." [0]

[0] https://uihistories.library.illinois.edu/virtualtour/maincam...

Hilarious that the myth extends to the interior design - the basement really is a maze the first few times you visit.

ot1138 · a year ago
Plato was in fact real... I used it many times! Looking back, it was pretty impressive technology for its day but was quickly becoming obsolete. I hated having to walk all the way to campus to get some physics units in that I missed.

I vaguely seem to recall that sometime around the Gulf war, I was able to modem in and connect remotely. Shortly after, I stopped getting Plato assignments!

marshray · a year ago
My parents worked at UIUC in the early 1970's.

Plato was an early interactive learning system, the supercomputer was called the Illiac-IV.

The building was called the "Center for Advanced Computation". I don't know if the computer was in that building, but I don't think they were exactly hiding it from the Soviets.

conductr · a year ago
I wonder how long that equipment would survive being exposed to the elements after the collapse
ot1138 · a year ago
My guess is that something that important was protected by reinforced ceilings/floors.
liotier · a year ago
The Internet ? No. The Arpanet ? Also no. But SAGE was - its prototype (the Cape Cod air defense project) was a packet-switching network and SAGE itself was as well... And staying up on doomsday for a bit of nuclear combat was the essence of SAGE's functional specification. SAGE pioneered most of the concepts and technologies of the Arpanet, whose purpose was absolutely not combat, so it is easy to imagine how the nuclear strike resilient Internet urban legend evolved.
palisade · a year ago
Found it!!! The proof that ARPAnet was based on the idea of command and control! Command & Control means a network capable of surviving a nuclear attack.

Licklider, who wrote the Intergalactic Computer Network memo that started it all, and who was heavily involved in ARPAnet and in bringing in all the people involved, including Baran and Davies, specifically mentions ARPA's Command and Control interests in his paper:

"It is necessary to bring this opus to a close because I have to go catch an airplane. I had intended to review ARPA’s Command-and-Control interests in improved mancomputer interaction, in time-sharing and in computer networks. I think, however, that you all understnad [sic.] the reasons for ARPA’s basic interest in these matters, and I can, if need be, review them briefly at the meeting. The fact is, as I see it, that the military greatly needs solutions to many or most of the problems that will arise if we tried to make good use of the facilities that are coming into existence."

https://worrydream.com/refs/Licklider_1963_-_Members_and_Aff...

throw0101a · a year ago
Instead of quoting from the end of that document, perhaps quote from the beginning:

> In the first place, it is evident that we have among us a collection of individual (personal and/or organizational) aspirations, efforts, activities, and projects. These have in common, I think, the characteristics that they are in some way connected with advancement of the art or technology of information processing, the advancement of intellectual capability (man, man-machine, or machine), and the approach to a theory of science.

The word "military" only exists in the last few paragraphs of the document. Most of it is about workflows and resource sharing:

> When the computer operated the programs for me, I suppose that the activity took place in the computer at SDC, which is where we have been assuming I was. However, I would just as soon leave that on the level of inference. With a sophisticated network-control system, I would not decide whether to send the data and have them worked on by programs somewhere else, or bring in programs and have them work on my data. I have no great objection to making that decision, for a while at any rate, but, in principle, it seems better for the computer, or the network, somehow, to do that. At the end of my work, I filed some things away, and tried to do it in such a way that they would be useful to others. That called into play, presumably, some kind of a convention-monitoring system that, in its early stages, must almost surely involve a human criterion as well asmaching [sic.] processing.

palisade · a year ago
Listen, mr throwaway account, if we're starting at the top of the document:

"The ARPA Command & Control Research office has just been assigned a new task that must be activated immediately, and I must devote the whole of the coming week to it."

You know, the line you had to skip a few paragraphs ahead of to find your quotes.

ThinkBeat · a year ago
Norway was the first nation outside of the US to get connected to the Arpanet¹ ² ³.

This was done for the US military to have "real-time" access to seismic data from sensors in Norway that would detect nuclear activity of various forms, primarily in the Soviet Union.

I can at least say ARPANET was expanded to Norway for military purposes.

A family member of mine was the first foreigner to "chat" with folks in the US via ARPANET.

And later given access to compute resources in the US, for work that was a bit hush hush.

¹ https://www.nb.no/en/story/da-norge-fikk-internett/#:~:text=....

² https://www.norsar.no/about-us/history/arpanet

³ https://www.ffi.no/en/news/from-the-us-to-ffi-kjeller-was-fi...

nostrademons · a year ago
There's a bunch of good stuff on YouTube from the folks who actually developed the Internet. Here are a couple from Leonard Kleinrock, whose MIT thesis laid out much of the math behind packet switching and whose lab sent the first Internet message:

https://www.youtube.com/watch?v=vuiBTJZfeo8

https://www.youtube.com/watch?v=rHHpwcZiEW4

And from Bob Kahn, who designed the router of that early Internet (interviewed by Vint Cerf, who co-invented TCP/IP with him):

https://www.youtube.com/watch?v=EKxNMTVnBzM

https://www.youtube.com/watch?v=hKZ6tJcQpcI

A key innovation of the Internet was packet switching. Previous networks, like AT&T's telephone system, were circuit-switched: the configuration of the network, and the route between source and destination, is an inherent property of the network, and once a connection is established it can't easily be reconfigured. Packet switching makes the source and destination a property of the message, and the network is then responsible for figuring out a route between them. Notably, because all the information needed to specify the destination is included in the message, it can be retried or take a totally different route.
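That property can be shown with a toy model (illustrative topology and names, not a real routing protocol): because the destination travels with the packet, any node can recompute a route when a link fails.

```python
# Minimal packet-switching sketch. Four nodes with redundant links:
topology = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C"],
}

def route(src, dst, down_links=frozenset()):
    """Breadth-first search for a path, skipping failed links."""
    frontier, paths = [src], {src: [src]}
    while frontier:
        node = frontier.pop(0)
        if node == dst:
            return paths[node]
        for nbr in topology[node]:
            link = frozenset((node, nbr))
            if link in down_links or nbr in paths:
                continue
            paths[nbr] = paths[node] + [nbr]
            frontier.append(nbr)
    return None  # destination unreachable

# The destination is carried in the packet itself, not wired into
# the network, so routing is recomputed per packet:
packet = {"src": "A", "dst": "D", "data": b"hello"}
primary = route(packet["src"], packet["dst"])
# If the B-D link is cut, the same packet still reaches D another way:
backup = route(packet["src"], packet["dst"], {frozenset(("B", "D"))})
```

Here `primary` comes out as A-B-D and `backup` as A-C-D: the packet's contents never change, only the path the network chooses for it.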

Most things have multiple causes, and the Internet is no exception. The scalability and distribution properties were certainly among them: a centralized system like the telephone network cannot scale to new uses and many new endpoints the way a distributed system like the Internet can. According to Kleinrock, management's need to keep an eye on all the research they were funding was apparently another. But given that it was funded by DARPA, the resilience of a packet-switched network to scenarios where individual circuits might go down was probably a major reason for the interest in the technology. It doesn't necessarily have to be a nuclear strike; there were a number of scenarios of interest to RAND and DARPA in which a portion of the nation's communication network could be disabled and messages would still need to get through.

This is also a good lesson for designers of future networks and computing systems. The end-to-end principle remains as valuable to system designers today as it was in the 1960s.

jcrawfordor · a year ago
I'm an enthusiast and writer on the history of military communications technology, particularly during the Cold War. The internet is very much part of this story, and I am asked about this controversy from time to time. The problem is, I find that people who argue for either position become much too fixated on the idea that there was some single set of influences on a complex project. There isn't really any answer to "was the internet designed to withstand nuclear war", for the same reason that there isn't any single answer to any question about the historical motivations of complex undertakings. That's just not how history works.

There are some facts which we know to be true:

1) Various components of the defense complex were actively researching survivable C2 communications, particularly beginning in the 1950s although there were earlier precedents. Many of these efforts involved ideas that were similar to those used in modern computer networking, and they sometimes culminated in built systems with meaningful similarities to the internet, like AUTODIN.

2) A diverse cast of academics, contractors, and government entities were involved in these projects. Sometimes the same people worked on multiple projects. Even when they didn't, there were often communications between these entities, but sometimes, due to security concerns, there wasn't. Much of this communication was informal, so in retrospect it is hard to tell who knew what. There can be surprises both directions.

3) Communications technologies often emerge naturally from innovations in other fields, technical advances, etc., so while many similar communications technologies have a shared intellectual heritage, it is also not that unusual for totally independent efforts to arrive at roughly the same point. Radio is a classic example.

I think that, in consideration of these facts, we can reach two conclusions:

A) It seems likely that some of the people involved in ARPANET were familiar with survivable C2 research and applied those ideas to their efforts. After all, lots of people and lots of organizations worked on these programs, and some of the research was widely distributed within the defense-industrial-academic complex during the '50s and '60s.

B) It also seems likely that ARPANET independently arrived at similar endpoints. After all, it had some similar constraints and objectives, and its creators were working with mostly the same underlying technology.

These two do not contradict each other. In fact, I think it is by far most likely that both are true in the cases of different individual people and different individual aspects of the design. That's just how these things are.

Before considering The Internet specifically, let's consider a couple of similar situations in the development of technology:

1) People sometimes do a great deal of hand-wringing over the assignment of labels like "the first programmable computer." I have always been very wary of giving these sorts of titles without a fair number of weasel words. Consider, for example, the ENIAC, usually called the "first programmable computer." And yet, there is a compelling argument that a number of the substantial design elements of ENIAC, including its programmability, are derived from work done for an earlier codebreaking machine called Colossus. This connection remained unknown for many years because of the secrecy surrounding Colossus... a level of secrecy that means that, while a number of people who worked on Colossus and later worked on ENIAC almost certainly carried over ideas, they wouldn't have admitted to having done so as Colossus was officially unknown to the ENIAC project. The particular climate of wartime and military technological development means that ideas often move around in subtle ways, and knowledge of where an idea came from is intentionally obscured. The history of military technology can be a very difficult field for this reason.

2) Almost at the opposite ends of the spectrum, information often flows very freely in academic and industrial laboratory environments, and so ideas spread without clear documentation. I am reminded of a piece I wrote years ago, on the fact that several early internet protocols use a similar set of three-digit status codes in similar ways (HTTP, for example). Oddly enough, these pseudo-standard status codes appear almost simultaneously in RJE and FTP, but neither mentions the other. Over time I have been lucky enough to get in touch with several of the authors of both RFCs, and while none of them can recall the origin of the codes, they agree with my general theory: the two separate groups, both at MIT, had just shared notes during the development of the protocols and one of them informally adopted the status code scheme from the other. People talk to each other, and ideas often move between projects without formal documentation.
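The convention those early protocols converged on, and which HTTP still uses, is that the first digit of a three-digit reply code alone carries the coarse meaning, so a simple client can branch on it without knowing every code. A minimal sketch of that idea, using the class names from the later RFC 959 (FTP) and the HTTP specifications for illustration (these labels are not from the RJE/FTP RFCs discussed above):

```python
# The first digit of a three-digit reply code gives its coarse class;
# the remaining digits only refine it. FTP and HTTP share this scheme.

FTP_CLASSES = {  # per RFC 959
    "1": "positive preliminary",
    "2": "positive completion",
    "3": "positive intermediate",
    "4": "transient negative",
    "5": "permanent negative",
}

HTTP_CLASSES = {  # per the HTTP specification
    "1": "informational",
    "2": "success",
    "3": "redirection",
    "4": "client error",
    "5": "server error",
}

def classify(code: int, classes: dict) -> str:
    """Return the coarse class of a three-digit reply code,
    determined by its first digit alone."""
    return classes.get(str(code)[0], "unknown")

print(classify(226, FTP_CLASSES))   # FTP "Closing data connection" -> positive completion
print(classify(404, HTTP_CLASSES))  # HTTP "Not Found" -> client error
```

A client written this way treats any unrecognized 2xx as success and any 5xx as a permanent failure, which is exactly why the shared first-digit convention made these protocols forward-compatible with codes defined later.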

So, with those two examples of subtle cross-project influence in mind, can we say anything suggestive about ARPANET? Well, there are certain suggestive details. For example, by the time ARPANET's first IMPs were built, at least one researcher (Howard Frank) was engaged in ARPANET research who had previously consulted on survivable C2 networks. But the ARPANET project had already settled certain design details like packet-switching by that point... which raises the question of whether packet-switching is even the important part. Howard Frank wasn't working on packet switching itself; he was working on performance and reliability modeling of topologies for packet switching, an area where military C2 research was probably generally ahead of ARPANET research at that time. So, if we take the face-value assumption that aspects of ARPANET topology research were probably based on survivable C2 research, does that mean that ARPANET was "created to survive a nuclear strike?" Or did it merely end up that way? It ends up coming down to splitting hairs about what "created" means, an exercise that sort of ignores the fact that technological developments always combine established ideas and new ones.

ARPANET was not built for military C2. It was used for military C2 later on, but during the early days of ARPANET the military had more wide-area networking initiatives than you could shake a stick at and ARPANET was not one of the ones contracted for C2 purposes.

Was ARPANET designed for nuclear survivability? The most obvious answer is "no," because ARPANET's early topology lacked the redundancy that actual survivable networks of the era had. But this seems to have been a consequence of funding and resource availability more than intentions, because ARPANET researchers had done plenty of work on performance and reliability, using basically the same methods as the survivable networks.

So maybe, at the end of the day, the "best" answer to this is sort of a boring one: meh. Nuclear survivability was obviously not a goal of ARPANET because ARPANET did not build out a survivable network. That said, ARPANET incorporated most of the technical ideas from survivable networks of the era. It is a virtual certainty that ARPANET got some of those ideas from earlier and simultaneous research into survivable networks, but it is also a virtual certainty that ARPANET arrived at some of them independently. If you consider "packet switching" to be the main technical advancement of ARPANET, it's probably not an idea that ARPANET got from survivable C2 research, because the historical record looks pretty confident that multiple people independently arrived at packet switching. That ought not to be surprising to anyone, because packet switching is a fairly direct evolution of practices established in radio and telegraph networks almost fifty years before. But, I also think it's an unnecessarily restrictive view of ARPANET's technical contributions, and other aspects of ARPANET like routing policy were definitely influenced by survivable communications research and, to some extent, directly based on survivability work.

What all this means about why ARPANET was "created" or what it was "designed" for is strictly a matter of how you interpret those words. Yes, articles, books, and so on should not repeat the claim that "the Internet was created to survive a nuclear strike," because the truth or falsity of that statement requires a lengthy and nuanced explanation. When we express history as simple facts we should try to stick to the ones that are, like, 90% true, instead of like 50% true. But "facts" about history are rarely 100% true; we're just not that lucky. It all happened a long time ago, there were a lot of people involved, different people were doing different things, it's a tangled mess of motivations and influences. That's why we study it.

Thanks for coming to my TED talk.

Postscript: Also, packet-switching is not at all intrinsic to survivable networks, although certainly survivability led to a lot of advancements in packet-switching. But there were also a lot of circuit-switched survivable networks, and for a good span of the Cold War, I would say that circuit-switching outnumbered packet-switching for hardened C2. You'll notice that AT&T, the military's #1 choice for hardened communications, was firmly not on the packet switching side of things. But the military also contracted C2 projects to Western Union, who were basically arriving at modern packet switching by their own route (automated telegraph routing). This schism, between packet-switching and circuit-switching, remains a critical part of the data communications story today.

walrus01 · a year ago
This is a good amount of info for people who are not familiar with the subject. To illustrate just one part of the pre-ARPANET hardened communications networks, I'll focus on a few typical usage scenarios.

The DOD for the most part did not build their own domestic communications networks spanning multiple states and across the country. A lot of it was carried by the AT&T Long Lines network, which had a mix of barely-hardened, not hardened, and some extremely hardened sites (L4 coaxial, project offices, etc).

If you find, for instance, the locations of all of the SAGE direction centers and all of the telecommunications links that fed the remote radar sites' data to them, you'll also find a ton of AT&T Long Lines mountaintop and hilltop sites, mixed in with the ordinary central offices located in mid-sized to large cities.

In other places in the world the US DOD built its own bespoke communications networks for very specific command and control purposes. At one point in time there was a massive troposcatter network spanning nearly the entire Mediterranean Sea, with important radar sites in Italy and Cyprus, linked to air bases and the DOD's backbone communications links from Europe to the USA.

Similar massive troposcatter data links were built through the Aleutian islands.

Most of these were replaced by C-band geostationary satellite links by the mid-1970s, as that became a higher-performance and more viable technology than the extremely massive, power-hungry, and expensive-to-maintain troposcatter equipment.

justsomehnguy · a year ago
TL; DR: osmosis

Thanks, JB!

hypeatei · a year ago
Probably not. Most government agencies (e.g. FEMA) aren't prepared to handle the aftermath of one, let alone backbone infrastructure.

In the US, only high ranking government officials and STRATCOM are operating during a nuclear attack. Everyone else is on their own.