Do you have time to be more curious about the physical structure of our digital text information universe?

        Go to the "Worldometers" site to watch the "Exaflood" in the Society & Media section. 
         Do you want to know where to find the actual, physical "archeology" of the Internet?  Click here for a brief summary of a Northern Virginia community's surrender to the Cloud's requirements by selling their houses and farms so that massive server farms can be built.
   
Rich Miller's now-ancient Data Center Frontier survey of 2015's top ten cloud server farms has the advantage of providing statistics for their then-current and projected size and, more importantly, their power consumption and cooling requirements.  Photographs of the interiors of some data centers show a clear distinction between the blue "cold" aisles, where the memory and computing reside, and the red "hot" aisles, which draw off the otherwise fatal heat generated by electricity demands that can reach 1,500 watts per square foot in spaces extending for acres.

For example, the NSA's Bluffdale, Utah computing center, the seventh largest in 2015, was said to have computing and storage capacity drawing up to 65 megawatts (roughly the electricity consumed by 26,000-56,000 average homes in a year) and to require, at peak load, up to 1.7 million gallons of water per day to keep it from malfunctioning and harming its operators.  By 2021, it was reported to be capable of yottabyte-scale information storage and processing (Gillis).

Old server farms were designed to be air-cooled, but increasing their size and squeezing more microprocessors and memory units into each square foot required more efficient water chillers to cool the air.  By 2022, Data Center Frontier was reporting that fully immersive cooling, the direct submersion of computing elements in non-conducting coolants, was a major new technological change.  Heat from the coolant can be harvested and used to generate power, reducing the facility's net carbon footprint.  (In a 1/20/22 letter to the Washington Post, a resident of the affected Prince William County neighborhood pointed out that homes there are required to use wells for their drinking water, so the projected server farms will be competing with those wells for cooling water, a competition the wells probably won't survive for long.)
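        For the curious, the 65-megawatt figure above can be sanity-checked with a minimal back-of-the-envelope sketch in Python.  The household consumption figures are my own assumptions (roughly 10,000-12,000 kWh per year for an average U.S. home), not numbers from Miller's survey, and the sketch assumes the center runs continuously at its reported peak; the wide range quoted above presumably reflects different assumptions on both counts.

    # Rough conversion of a data center's power draw into "average homes."
    # ASSUMPTION: an average U.S. home uses ~10,000-12,000 kWh per year;
    # this is my estimate, not a figure from Miller or Gillis.
    facility_mw = 65                    # Bluffdale's reported peak draw
    hours_per_year = 24 * 365           # 8,760 hours
    facility_kwh = facility_mw * 1_000 * hours_per_year   # kWh per year

    for home_kwh in (10_000, 12_000):
        homes = facility_kwh / home_kwh
        print(f"At {home_kwh:,} kWh per home per year: ~{homes:,.0f} homes")

    # Output:
    # At 10,000 kWh per home per year: ~56,940 homes
    # At 12,000 kWh per home per year: ~47,450 homes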

         By August 2020, TeleGeography reported that Asia was home to the largest number of the Cloud's server farms.  To see a map of the world's physical Cloud locations, go to: https://www.cloudinfrastructuremap.com/ 

        Information is also transported by submarine cables, which are not as susceptible to damage or disruption as satellites.  To see TeleGeography's Submarine Cable Map of the big physical connections that link servers to each other and to you, go to: https://www.submarinecablemap.com/ 

         James Yoder (an incoming freshman [!] at UT Austin) has created an interactive satellite map called "Things in Space" on GitHub: https://ajmas.github.io/ThingsInSpace/  Of course, not all of these satellites transmit digital text, but the entire system is crucial to our current Internet structure, down to your access to ebooks in the Goucher Library's collection or your social media accounts.  Roll over "Help" and "About" in the upper right corner for a very brief explanation of what you are seeing.  Obviously most of the "stuff in space" is not related to the Internet and Cloud, but hovering over an individual dot will show its orbit and identify it as an actual satellite, a spent rocket component, or another type of object.  (Red dots are actual satellites, blue dots labeled "R/B" are spent rocket bodies, and grey dots labeled "DEB" are debris.  Satellites labeled "COSMOS" are Soviet or Russian in origin and mostly military.  Starlink comprises an ever-increasing fraction of all orbiting material.)  Zooming in or out with the + and - buttons at lower right gives you a view of the various layers of our orbital "family," and you can click and drag up or down to see Earth from the North or South Pole.  From a polar view, the large ring you can detect in an upper orbit is composed of geostationary satellites, a highly competitive orbital slot. 

        To track individual orbital objects, including the International Space Station (for which I can conceive of no application for a 341 paper), see: https://www.n2yo.com/satellite/?s=40899


Additional articles, if you are really into this topic!
     Alexander Gillis, "Yottabyte (YB)," TechTarget, updated 9/20/21.  Note that this article is now years old, and its information, other than the basic facts about prefixes and their numerical meanings (a yottabyte is 10^24 bytes, or a trillion terabytes), is suspect.  Especially consider the assertion, in September 2021, that "Currently, there is nothing that can be measured on a yottabyte scale."  In that same year (see above), Gillis was reporting that the NSA's Bluffdale, Utah server farm was estimated to have yottabyte capacity.
    Julie Kendrick, "The 10 Largest Server Farms Ever Built," History-Computer.com, 26 July 2023.  This article measures server farm size by the millions of square feet of horizontal space they enclose.  Other relevant measures might be computing power, electrical consumption in megawatts, and cooling water requirements; Rich Miller's 2015 article (below) was more helpful in providing some of those comparative data.  Note especially, though, that Kendrick's #1 biggest server farm in 2023, "The Citadel" in Tahoe Reno, Nevada, is said to be powered entirely by renewable energy sources, for both computing electricity and cooling capacity.  Also note that Miller's top 10 were predominantly in Northern Virginia, especially Loudoun County, whereas Kendrick's include installations in Europe and China.  What does that tell us about how the material infrastructure reflects the "nationality" of the Internet?
     Rich Miller, "The Top 10 Cloud Campuses," Data Center Frontier, 23 November 2015, https://datacenterfrontier.com/top-10-cloud-campuses/  Miller's list includes many photographs of the interiors of cloud server units, and some of their cooling devices, a useful addition to Kendrick's article (above).  Note that Miller's #1 largest center, Switch SUPERNAP in Las Vegas, took up 1.4 million square feet, whereas Kendrick's, eight years later, is "The Citadel" in Nevada at 7.2 million square feet.  Try finding comparative land forms you are familiar with to get some idea of how big these installations are, and are becoming (see the rough conversion sketch after this list).
     Yevgeniy Sverdlik, "TeleGeography Maps the World's Cloud Data Centers: The firm behind Submarine Cable Map outlines geography of the Cloud," Data Center Knowledge, 4 August 2020, https://www.datacenterknowledge.com/cloud/telegeography-maps-world-s-cloud-data-centers
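
        As a footnote to the size comparison in the Miller and Kendrick entries above, here is a minimal Python sketch converting their square-footage figures into more familiar units.  The acre and football-field constants are standard definitions; treating floor area as a single flat expanse is a simplifying assumption, since some of these buildings are multi-story.

    # Convert the two "largest server farm" figures into acres and
    # U.S. football fields (360 ft x 160 ft, end zones included).
    SQFT_PER_ACRE = 43_560
    SQFT_PER_FIELD = 360 * 160          # 57,600 sq ft

    campuses = {
        "Switch SUPERNAP, 2015 (Miller's #1)": 1_400_000,
        "The Citadel, 2023 (Kendrick's #1)": 7_200_000,
    }

    for name, sqft in campuses.items():
        print(f"{name}: ~{sqft / SQFT_PER_ACRE:.0f} acres, "
              f"~{sqft / SQFT_PER_FIELD:.0f} football fields")

    # Output:
    # Switch SUPERNAP, 2015 (Miller's #1): ~32 acres, ~24 football fields
    # The Citadel, 2023 (Kendrick's #1): ~165 acres, ~125 football fields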