- Dropping Giant Rolls of Film From Satellites to Spy from Space
By Allison Kubo More Cold War science, in case you enjoyed our last Cold War science article: the atomic-powered-nuclear-weapon-silo-ice-sculpture. In 1958, the Central Intelligence Agency started Project Corona, a top-secret mission to perform photographic surveillance of the Soviet Union. Of course, this was before digital cameras. Modern digital cameras use charge-coupled devices (CCDs), arrays of capacitors that convert the photons hitting them into electrical signals. Although development of the CCD began only a year after Project Corona started, it wasn't until the 1970s that the military employed it for imaging. Before digital cameras, however, photographic film was used. It works in a loosely similar way, except that the light strikes silver halide crystals suspended in the film's emulsion and changes them chemically. After the film is exposed to light, it must be developed in various chemical baths to "fix" the image so that it can be examined. The Corona Project consisted of a series of eight satellites launched from 1959 to 1972, each carrying up to 3 miles' worth of 70 mm film. After flying over the sites of interest, the film canisters were dropped from 60,000 ft (18 km). The film buckets had to have heat shielding to survive the stress of reentry. The early satellites returned 16 pounds of film and proved to be safer and more effective than U-2 flights over the Soviet Union. The early missions were cloaked in secrecy and misdirection, using "science" as a cover story: the returned film canisters were described as "biological specimens." To deepen the deception and play to public enthusiasm for space, the Corona satellites were built to carry a monkey passenger. Many of the intrepid test subjects were killed in the experiments. After many monkeys were sacrificed and some film was finally retrieved successfully, the project was classified TOP SECRET by President Kennedy in 1962. After an unfortunate incident involving an accidental landing on a Venezuelan farm, engineers on the project designed a mid-air retrieval.
They caught the film mid-air using a claw on the underside of an airplane. They also employed a salt plug that, if the capsule landed in the ocean and was not retrieved in time, would dissolve and cause it to sink. In 1995, more than 800,000 images were declassified under an executive order signed by President Bill Clinton. These photos have provided an important record of climate change and social change since their capture in the 1960s. Many areas, such as the Middle East, have since undergone significant urban development and industrialization, and the CORONA images are now freely available for study. Several patterns emerge when we examine these missions: military operations were shrouded in the name of "science," and science later genuinely benefited from those operations.
- Are Large-Scale Data Breaches the New Normal?
By: Hannah Pell Image credit: Wikimedia Commons. In early May 2021, a ransomware attack on the Colonial Pipeline caused massive disruption to the East Coast's fuel supply. Pictures of cars lined up at gas stations and warnings not to "panic buy" gasoline evoked memories of the 1973 oil crisis. Colonial Pipeline Co. paid the $4.4 million ransom demanded by the hackers (much of which the Federal Bureau of Investigation has since recovered) and chose to shut down the pipeline for the first time in its 57-year history, avoiding the possibility of the hackers gaining direct control over infrastructure transporting 2.5 million barrels of gasoline, diesel, heating oil, and jet fuel per day. "We were in a harrowing situation and had to make difficult choices that no company ever wants to face, but I am proud of the fact that our people reacted quickly to get the pipeline back up and running safely," Colonial Pipeline Co. CEO Joseph Blount said in his testimony to the Senate Committee on Homeland Security and Governmental Affairs. Over the course of the COVID-19 pandemic, we've seen one large-scale data breach after another; in fact, cybercrime has increased 600% since the pandemic started. Unfortunately, the ransomware attack on the Colonial Pipeline is not the first directed at energy systems, and it likely will not be the last. Healthcare systems, too, are constantly at risk. Even McDonald's has been targeted. With our increasing reliance on digital technology, cybersecurity is of critical importance, even prompting a recent executive order from the Biden administration. I can't help but wonder: are such large-scale data breaches the new normal? Are we reacting when necessary, rather than taking proactive measures to ensure adequate cyber protections? If so, how are we modernizing our infrastructure accordingly? Cryptography, the science of making and breaking codes, traces back to antiquity, with evidence of ciphers and non-standard hieroglyphics.
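Those ancient shift ciphers are simple enough to sketch in a few lines. Below is a minimal Python illustration of ours (not from the article) of a Caesar cipher: each letter is shifted a fixed distance through the alphabet, and shifting back by the same amount decrypts.

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by `shift` places, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation unchanged
    return "".join(result)

ciphertext = caesar("attack at dawn", 3)
print(ciphertext)               # dwwdfn dw gdzq
print(caesar(ciphertext, -3))   # attack at dawn
```

Trivial to break by trying all 25 shifts, which is exactly why cryptography has had to evolve ever since.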
Codes were widely used during World War II to protect military intelligence, and cryptographers, such as British mathematician and "father of modern computing" Alan Turing, were recruited to break the ciphers produced by the German Enigma machines. The World Wide Web was launched in 1990 (invented by CERN computer scientist Tim Berners-Lee), accelerating the growth of tech companies such as Google and Facebook. (Google engineers discovered a significant software bug in 2018 that exposed the data of hundreds of thousands of Google+ users. That same year, the revelation that Cambridge Analytica had harvested Facebook users' data without their consent unfolded as a major scandal.) Cyberattacks can vary in approach. Ransomware, which is predicted to remain the number-one cybersecurity threat, is a type of malware that allows hackers to block access to files or personal data until a ransom is paid. In the Colonial Pipeline case, the DarkSide-affiliated hackers gained access through the company's virtual private network (VPN) with a single compromised password. Other cyberattack techniques include phishing, Trojan horse viruses, and spam. Fortunately, there are a number of countermeasures, including "security by design," automated theorem proving, audit trails, code reviews, and "defense in depth." Many have stood up to meet the challenge of improving and strengthening cybersecurity. The Cybersecurity and Infrastructure Security Agency Act of 2018 established a new federal agency of the same name to serve as our "Nation's risk advisor." Private-sector ransomware negotiation has opened an entirely new line of work. Additionally, scientific progress in quantum key distribution enhances data encryption, although the technology could also enable more sophisticated cyberattacks in the wrong hands. Setting up multi-factor authentication is an effective strategy for securing sensitive data on our personal devices.
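To illustrate how one common second factor works, here is a minimal Python sketch of a time-based one-time password (TOTP, standardized in RFC 6238), the rolling six-digit codes produced by authenticator apps. It uses only the standard library; the base32 secret in the example is the RFC's published test value, not a real credential.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in base32); at t = 59 seconds
# the 8-digit reference value is 94287082.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59, digits=8))
```

Because the code changes every 30 seconds and is derived from a shared secret, a stolen password alone (as in the Colonial Pipeline VPN breach) is not enough to log in.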
According to the 2020 Internet Crime Report published by the Federal Bureau of Investigation's Internet Crime Complaint Center (IC3), losses incurred by victims of cybercrime amounted to $4.2 billion in 2020 alone. IC3 received an average of 2,000 complaints per day and 2,211,296 in total over the past five years. "In 2020, while the American public was focused on protecting our families from a global pandemic and helping others in need, cybercriminals took advantage of an opportunity to profit from our dependence on technology to go on an Internet crime spree," FBI Deputy Director Paul Abbate wrote. At a recent press conference on the Colonial Pipeline ransomware attack, Abbate emphasized the following: "With continued cooperation and support from victims, private industry, and our U.S. and international partners, we will bring to bear the full weight and strength of our combined efforts and resources against those actors who think nothing of threatening public safety and our national security for profit." It's clear that cooperation and coordination between the public and private sectors, as well as increased transparency and openness about the extent of such large-scale cyberattacks, will be necessary to effectively tackle this issue.
- How the Film Tenet Explores Entropy, Information, and Maxwell's Demon
By: Hannah Pell "If we conceive a being whose faculties are so sharpened that he can follow every molecule in its course, such a being, whose attributes are still as essentially finite as our own, would be able to do what is impossible to us," wrote James Clerk Maxwell in his Theory of Heat (1871). With this sentence, Maxwell cast considerable doubt on the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease. (Think of how heat always flows from hot to cold, eventually reaching thermal equilibrium.) Such a "being" was later characterized as a "demon" by William Thomson in an 1874 article published in the journal Nature, because of its "far-reaching subversive effects on the natural order of things." Thus Maxwell's demon was born, outlining a paradox that remained unresolved for 115 years. Maxwell envisioned the thought experiment as follows: Picture two adjacent rooms, A and B, containing gases at equal temperatures and pressures, separated by a wall with a microscopic hole in it. The demon controls whether the hole is open or closed, and it only allows fast-moving (hotter) molecules to pass from room A to B, not vice versa; slow-moving molecules pass only from B to A, thereby cooling room A. In principle, this arrangement would violate the second law, as the work done by the demon would be negligible (although physicist Leo Szilard later pointed out that the demon's acquisition of information about the molecules' speeds would itself cost energy). The demon is effectively instigating a decrease in entropy, a physical implausibility. Image Credit: Plenio & Vitelli, "The physics of forgetting: Landauer's erasure principle and information theory" (2010). Further developments linked thermodynamic principles to information. In 1948, mathematician Claude Shannon interpreted the entropy of a random variable as the average level of information, or uncertainty, inherent in its potential values.
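Shannon's measure is easy to compute. As a small illustration of ours (not from the article), the entropy H = -Σ p log₂ p of a probability distribution gives the average number of bits of information per outcome, largest when every outcome is equally likely and zero when the outcome is certain.

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum(p * log2(p)): average information, in bits, per outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # a fair coin flip carries 1 bit
print(shannon_entropy([0.9, 0.1]))    # a biased coin surprises us less
print(shannon_entropy([0.25] * 4))    # four equally likely outcomes: 2 bits
```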
Later, in 1961, physicist Rolf Landauer showed that erasing information corresponds to an increase in entropy, which physicists now interpret as a measure of hidden information about the system in question. This idea is referred to as Landauer's principle. Such physical concepts are woven into the universe of the science fiction movie Tenet. Before its release in August 2020, director Christopher Nolan told EW that the film is not about time travel but rather deals with different ways that time can function: "Not to get into a physics lesson, but inversion is this idea of the material that has had its entropy inverted, so it's running backward through time, relative to us." Nolan hired physics Nobel laureate Kip Thorne to consult on the script, but cautioned that the producers were "not going to make any case for this being scientifically accurate." So is the main character, referred to simply as "Protagonist," a modern Maxwell's demon? Let's watch to find out. (Spoilers ahead!) We learn early on that a strange class of materials is being amassed in what appears to be a top-secret storage facility. Described as "inverted," they've supposedly been manufactured in the future and behave as if they move backward in time from the characters' present perspective; the Protagonist catches a bullet, and "Entropy runs backward," the scientist explains. "Don't try to understand it." (If you look closely, Maxwell's demon appears on the whiteboard in the background.) Later, chaos unfolds at the Freeport. "What happened here?" Neil asks. "It hasn't happened yet," the Protagonist replies. And we watch the scene unfold as if on rewind. Dropped weapons, the "inverted materials," fly off the ground. Even the "antagonists" are inverted, fighting in reverse. Cars drive backward on the highway, and Estonian is heard backward on the radio.
A negotiation with Andrei Sator, the main antagonist, becomes an exercise in discerning what information has been shared in some version of the past. Along the way we hear about a "positron moving backward in time," the grandfather paradox, plutonium, and theories of parallel worlds. We are finally told of the "temporal pincer," a time-bending mission technique in which half the team of good guys travels forward in time and the other half travels backward (ten minutes exactly, a symmetry reflected in the palindromic title "Tenet"). The time reversal is governed by an algorithm, which Sator aims to control in order to rewrite history. "Does being here now mean it never happened?" Protagonist asks. Such nonlinear temporality is reflected in how the film unfolds. As the audience, we're given very few clues as to the dynamics of the conflict, creating a perception of heightened disorder, yet characters make decisions as if they already know what will happen. It turns out, however, that Protagonist did know the whole time, or at least a version of him did. The "tenet" was devised by a Protagonist of the future; we eventually see him literally fighting himself. The physics throughout Tenet indeed constructs a "twilight world." Although the film is not particularly scientific, who knows? Maybe the future looks different; only time will tell. "What's happened has happened, which is an expression of faith in the mechanics of the world," Neil proclaims.
- Physicists’ Early Dreams of Nuclear Powered Spaceflight
By: Hannah Pell Considering how much space junk is in orbit, the need to maintain and monitor cislunar space (the region between Earth and the Moon) is becoming an increasingly important issue. Doing so effectively may require spacecraft that can sustain propulsion for longer durations than is currently possible, and nuclear reactors may offer a solution. Recent news of progress in using nuclear technology to power extended spaceflight, from the Demonstration Rocket for Agile Cislunar Operations (DRACO) program, SpaceNukes, and others, is an opportunity to reexamine the history of this technology and pinpoint the origins of nuclear propulsion: Project Orion. The Beginnings of Nuclear Propulsion At the end of World War II, after witnessing the catastrophic destruction that nuclear weapons made possible, physicists actively sought peaceful applications of such nuclear capabilities. Nuclear power, once touted as "too cheap to meter," is a well-known example of these efforts, but some saw another opportunity: space travel. Polish mathematician Stanislaw Ulam, who worked on the Manhattan Project, undertook preliminary calculations as early as 1946. More than a decade of work at Los Alamos National Laboratory resulted in a co-authored 1955 report (and several reports thereafter) titled "On A Method of Propulsion of Projectiles By Means of External Nuclear Explosions: Part I." Ulam's idea for a spacecraft propelled by thousands of nuclear bombs was taking shape. Soon after, Ted Taylor, America's leading atomic bomb designer at the time (though staunchly against nuclear weapons), sold the idea of a nuclear-propelled spacecraft to General Atomic (jokingly referred to as "Generous Atomics" by some physicists because of its ample financial resources), and Project Orion began. Taylor knew Ulam through collaborative work at Los Alamos and described some of their conversations on fissile explosives in a 1995 oral history interview.
Taylor recruited theoretical physicist and mathematician Freeman Dyson, whose presence effectively bolstered the credibility of Orion. "If you just talked about the project, said you were going to propel a ship with nuclear bombs, the immediate reaction was that this was crazy. … They needed people with a solid reputation in order to have a chance to get the thing approved," Dyson explained in the BBC documentary To Mars by A Bomb. Dyson, who dreamed of interstellar travel, handled the rigorous calculations, from a proof of concept published in Physics Today showing that nuclear propulsion was indeed viable, to derivations of the potential radiation exposure per launch. (His son, historian of science George Dyson, authored a detailed account of Project Orion.) What kind of science fiction is this? (In fact, Stanley Kubrick considered using nuclear propulsion technology in the making of 2001: A Space Odyssey.) Let's see if we can convince ourselves otherwise. Or, for fun, just go give it a try in Kerbal Space Program. I'll wait. Project Orion Physics 101 Project Orion engineers envisioned its design in a fundamentally different way than other approaches at the time; rather than focusing on meeting the minimum of what was physically permissible, why not go bigger? "Midrange" Orion would be several thousand tons, approximately the size of an ocean liner, and would hold a crew of 50 people. (The mass of "Super" Orion was estimated at 8 million tons, the size of a city!) Orion would be designed for round-trip missions to Mars, and even one-way trips all the way to Saturn. Propelling an Orion vehicle demanded the systematic, controlled release of successive nuclear explosions.
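The momentum bookkeeping behind those successive explosions can be sketched with a toy model: each detonation throws a slug of propellant against the ship, and conservation of momentum gives the ship a small velocity kick per pulse. All numbers below are illustrative assumptions of ours, not Project Orion design figures, and the model ignores the ship's changing mass, gravity, and drag.

```python
# Toy model of nuclear pulse propulsion. Each pulse unit vaporizes a
# propellant mass m_p and drives it toward the ship at an effective speed
# v_e; momentum conservation gives the ship dv = m_p * v_e / M per pulse.
M = 4_000_000.0    # ship mass, kg ("midrange" Orion was several thousand tons)
m_p = 1_000.0      # propellant mass per pulse unit, kg (assumed)
v_e = 120_000.0    # effective propellant speed toward the plate, m/s (assumed)

dv = m_p * v_e / M                # velocity gain per pulse, m/s
pulses_to_orbit = 9_400 / dv      # pulses to reach ~orbital speed (~9.4 km/s)
print(f"{dv:.0f} m/s per pulse, ~{pulses_to_orbit:.0f} pulses to orbital speed")
```

Under these assumed numbers, each pulse adds a modest, survivable kick, which is exactly why the pusher plate and shock absorbers discussed below mattered so much.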
You could imagine the "nuclear pulse units" being ejected one by one, as if on an assembly line; in fact, Project Orion scientists consulted with the Coca-Cola Company, thinking that the soft drink corporation's machinery could be easily scaled up to handle the units, which resembled soda cans (pictured below). Diagram of an Orion nuclear pulse unit. Image Credit: NASA. You might (quite reasonably) be wondering: if explosions occur so close to the ship, wouldn't they cause damage? The Orion design incorporated a 1,000-ton steel pusher plate mounted on shock absorbers, smoothing the acceleration to levels that humans could withstand, between 2 and 4 g. However, there were two critical problems with the pusher plate: calculations predicted that the plate would ablate (erode) if unprotected from the repeated nuclear exposure, and shockwaves from the blasts could cause spalling, shards of metal breaking off. Declassified footage from Project Orion testing. Video credit: U.S. National Archives. Secrecy and Militarization The successful launch of Sputnik 1 in 1957 further amplified competition between the United States and the Soviet Union amidst the ongoing space race. This urgency meant that the federal government was eager to find a fast and effective means of space travel. The newly formed National Aeronautics and Space Administration, structured as a civilian space program, did not support Project Orion because of its secrecy. The Air Force, however, agreed to contribute funding, but at a price. "Officially it had to be justified to the budgeteers as a military program, so they had to invent fake military requirements for it," Dyson explained. The Air Force's involvement with Project Orion, initially "a translation of a sword into a plowshare" and inspired by hopes of disentangling nuclear technology from its reliance on militarization, may have marked the beginning of its downfall.
"Military influences were inevitably at work upon it." Eventually a car-sized model of the Orion spacecraft was constructed, and then-president John F. Kennedy visited the California site to see it in person. Managers had hoped that the presidential visit would help secure additional funding and political support, but Kennedy felt that the last thing the world needed was a nuclear weapons race in space, especially after the Cuban Missile Crisis. In August 1963, the international Limited Nuclear Test Ban Treaty was signed, effectively ending the Orion program. "Death of A Project" In 1965, Dyson published an essay in Science titled "Death of a Project," attributing the demise of Project Orion to the Defense Department, leaders of NASA, advocates of the Nuclear Test Ban Treaty, and "the scientific community as a whole." "The story of Orion is significant because this is the first time in modern history that a major expansion of human technology has been suppressed for political reasons," he wrote. Despite the soundness of the science, Project Orion was morally difficult for many to get behind. "The idea isn't crazy; the idea that we might do it is crazy," physicist and author Arthur C. Clarke said of Project Orion. The confluence of tight-lipped secrecy and increasing anti-nuclear sentiment did not exactly rally widespread support. Johndale Solem, a former Los Alamos theoretical physicist, offered a succinct summary: "Generally, people recoil from the notion of using nuclear explosives. I do; I recoil from that notion. Because I know we don't have that kind of world. And I know that having nuclear weapons in space is inviting someone to misuse them." Indeed, Project Orion is an important reminder that scientific feasibility does not by itself constitute sufficient justification; what can be done might not be done.
Nevertheless, physicists dreamed of expanding humanity's reach toward the cosmos and sought worlds beyond our own, which seemed limited and clouded by destruction. In some ways, this sentiment still rings true today. Artist's conception of a Project Orion spaceship. Image credit: NASA
- What It's Like to Be Eaten by a Baby T. Rex
Allison Kubo Hutchison We've already covered some important questions, like whether trilobites bite (spoiler: they don't), but recent research has given insight into another important question: what is it like to be eaten by a baby T. rex? The answer: somewhere between being eaten by a hyena and being eaten by a crocodile. To get this result, paleontologists first uncovered a fossil with bite marks thought to be from a young T. rex. The spacing and dimensions of punctures on the fossilized vertebrae of an Edmontosaurus, a type of duck-billed dinosaur, were compared to T. rex fossils of different ages and found to match those of individuals 11 to 12 years old. After identifying the biter as a young T. rex, scientists attempted to duplicate the depth and shape of the wounds today. Researchers mounted a tooth made of dental-grade cobalt-chromium alloy on an "electromechanical testing system" (a biting machine) and "bit" a cow bone. After comparing the wounds on the cow bone to those on the Edmontosaurus, they found that the young T. rex must have had a significant bite force of up to 5,600 N, compared to the measly bite force of 300 N in humans. Fully grown, T. rex had bite forces of up to 35,000 N, enough to pulverize bones, as seen in T. rex coprolites, and potentially enough to crush a car. This amazing bite force, roughly six times that of the juveniles, develops during an important "puberty," if you will, that occurs in specimens around 14 years of age. The growth rate increases sharply and then tapers off at 16 to 18 years. In just the few years between ages 12 and 18, a T. rex could grow to five times its size, gaining thousands of kilograms a year depending on food availability. This amazing growth rate has been questioned by some scientists, who suggest instead that the juvenile specimens represent a different species entirely: Nanotyrannus.
The existence (or not) of Nanotyrannus has caused significant controversy for decades. Research on the juveniles' impressive (though less astronomical than the adults') bite force reinforces that they were indeed juvenile T. rex specimens rather than a different species. It may also be important for understanding how feeding habits changed throughout the animals' lives. Instances of "experimental paleontology," not to be confused with Jurassic Park-like studies, can provide important information on species that have been dead for 65 million years. Image Attribution: KoprX, Tyrannosaurus specimens, Added "adult" and "juvenile" labels, CC BY-SA 4.0
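As a quick sanity check on the bite-force figures reported above, here is a trivial calculation of ours using only the numbers from the text:

```python
# Bite forces in newtons, as reported in the article.
human    = 300      # typical human bite force
juvenile = 5_600    # estimated for an 11-to-12-year-old T. rex
adult    = 35_000   # estimated for a fully grown T. rex

print(f"juvenile vs human: {juvenile / human:.1f}x")   # ~18.7x stronger than us
print(f"adult vs juvenile: {adult / juvenile:.1f}x")   # the roughly sixfold jump
```

So even the "baby" bites nearly 19 times harder than a human, and the adult's jump past the juvenile matches the sixfold increase tied to that teenage growth spurt.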