You are reading from a free online e-book titled 'Deception, Cover-up and Murder in the Nuclear Age.'
Chapter 3 - Global fallout from nuclear testing
In the book A Thousand Days: John F. Kennedy in the White House, author Arthur M. Schlesinger, Jr. tells of a fascinating episode in JFK's presidency. On page 455 of Schlesinger's voluminous book, he writes:
"Jerome Wiesner, [JFK's] Science Adviser, reminded him one drizzling day how rain washed radioactive debris from the clouds and brought it down to the earth. Kennedy, looking out the window, said, 'You mean that stuff is in the rain out there?' Wiesner said, 'Yes.' The President continued gazing out the window, deep sadness on his face, and did not say a word for several minutes."
Whenever it rained, anywhere, throughout the decade of the 1960s, there was strontium-90 in it. The entire globe was the repeated scene of what had happened in and around Hiroshima and Nagasaki in 1945.
But this fact wasn't on the minds of most people. The Limited Test Ban Treaty had been signed, no more radioactivity was being injected into the atmosphere, and the levels of radiation in Wheaties and milk were dropping - or so people thought. Scientists and laypersons alike made a lethal error in accepting the U.S. Atomic Energy Commission's (AEC's) assertion that the fallout from the 1961-1963 testing would remain, and largely dissipate, in the stratosphere. The stratosphere is the layer of our atmosphere whose lower boundary sits roughly four to twelve miles above the surface, varying with season and latitude, and which extends upward to about 31 miles. (The stratosphere is basically defined as the region of the atmosphere where temperatures defy common sense - if you were in a rocket ship soaring from Earth into outer space, you would know you had hit the lower edge of the stratosphere when the outside air temperature began increasing, and that you had left it when the temperature started decreasing again.)
The big hydrogen bomb tests of 1961 and 1962 created mushroom clouds that pierced the stratosphere and injected about 99% of their radioactive load there. Not much was known about the stratosphere in the 1950s and 1960s, but it was understood to be a place unto itself - what goes there pretty much stays there. The AEC claimed that fallout lodged in the stratosphere would drift back down to Earth slowly - about half of the radioactive load every 15 years. But scientists soon found the AEC was lying: it had understated the rate of fallout by a factor of about 10 to 15. In fact, strong weather systems brought much of that radioactivity down almost immediately, in the springs of 1963 and 1964, and the worst-hit area was the 'breadbasket' of North America, where the strongest weather phenomena occur. For the radioactive air of the stratosphere to 'mix' with the lower atmosphere, a storm needs to grow tens of thousands of feet tall.
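The gap between the AEC's claim and the rate scientists later measured can be sketched with simple exponential-removal arithmetic. This is an illustration only: it assumes stratospheric fallout is removed exponentially, so that 'half the load every 15 years' corresponds to a removal half-time of 15 years, and a rate roughly 10 times faster corresponds to a half-time of about 1.5 years.

```python
# Sketch comparing the AEC's claimed stratospheric residence half-time
# with the roughly 10x faster removal later observed. Assumes simple
# exponential removal; the 15-year and ~1.5-year half-times are the only
# inputs taken from the text.

def fraction_remaining(years, half_time_years):
    """Fraction of the stratospheric fallout load still aloft after `years`."""
    return 0.5 ** (years / half_time_years)

aec_claim = fraction_remaining(2, 15.0)   # AEC: half the load every 15 years
observed = fraction_remaining(2, 1.5)     # ~10x faster, per later measurements

print(f"AEC claim, still aloft after 2 years: {aec_claim:.0%}")   # about 91%
print(f"Observed, still aloft after 2 years:  {observed:.0%}")    # about 40%
```

Under the AEC's figure, almost all of the 1961-1962 load should still have been aloft in 1963-1964; under the observed rate, more than half had already come down - consistent with the heavy spring deposition the chapter describes.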
We have all seen pictures of thunderstorm clouds in the Plains States with the classic 'anvil' cloud shape. The anvil is actually the cloud - water vapor - bumping up against the stratosphere, which is depleted of moisture. The anvil forms in much the same way as applesauce dropped onto a countertop spreads flat: just as the applesauce cannot penetrate the countertop, tall clouds cannot penetrate the stratosphere. These storms are capable of pulling air from the stratosphere down toward Earth, but only in certain circumstances. In 1965, University of Michigan Professor A. Nelson Dingle proposed in a journal article3 that rare, highly organized storms topped with miles-high, multiple-mile-diameter cloud-free vortexes - mimicking a hurricane's structure - are capable of drawing stratospheric air down into the troposphere; whatever dust or particles are pulled from the stratosphere become mixed with precipitation, creating concentrated areas of radioactivity. These powerful storms are called 'convection storms' because of their ability to circulate air across large parts of Earth's atmosphere - sort of like a large-scale convection oven.
As the storm's rain-filled clouds empty their contents, the air - and its contents - from the stratosphere is brought down to the surface of the Earth. This is what happened in the months and years after the hydrogen bomb tests of 1961 and 1962: the radioactivity circulated down to Earth during the most extreme weather systems. One need only go to the 'breadbasket' of the U.S. and Canada to find the strongest of these systems - and also the greatest radioactive fallout. Dingle asserted that 'direct tapping of the lower stratosphere by highly organized and intense convective storms' had created radioactive 'hot spots' across parts of the globe, and in particular the Plains States, which he said 'should be uniquely subject to the radioactive "hot spots."' Dingle asserted in the 1960s that the tornado-spawning, hurricane-like storm type in America's mid-section 'constitutes the greatest health hazard, and is peculiar to a limited but agriculturally important part of the United States.' Although strong storms - and tornadoes - occur across Earth's mid-latitudes, the most tornado-impacted areas are in Oklahoma, Arkansas, Kansas, Nebraska and Iowa, which also might be home to the greatest 'hot spots.'
In the weather map above, from May 1963, two symbols illustrate a common springtime occurrence in the Plains States. The big filled (black) dots designate rain, and the funny, squashed-looking 'eight' symbol below some dots (look below Bismarck, Fargo and Swift Current) is the weather symbol for cumulonimbus clouds - the anvil-shaped clouds synonymous with stratosphere-touching storms.
Radioactive rain means radioactive....?
All of the radioactive debris in the Plains States in early 1963 landed on fields and rivers, but mostly saturated the ground. Of course, most of the acreage in the Plains States is used for farming. In North Dakota, the total surface acreage is about 45 million acres and farming acreage comprises 39 million acres.
As scientists have known since the 1940s, radioactivity in the soil - and deposited on leaves - leaches into plant matter, including crops and grasses. As you probably are thinking, all vegetation in the Plains States and 'grain belts' in Canada in 1963 probably was extremely radioactive.
Since before World War II, North Dakota and its immediate neighbors - including provinces in Canada - have produced wheat, milk products, soybeans, cattle, corn, sugar beets, sunflowers, barley, various beans, canola, flax, oats, peas, honey, lentils, potatoes, hay, and more.
One of the main crops of the Plains States is wheat. In 1964, Montana, Minnesota, and North and South Dakota produced about 87% of the national spring wheat crop, which is used in whole wheat bread, artisan bread products and whole grain cereals. From 1958 to the late 1960s, whole wheat ('spring wheat') was one of the most radioactive foods in the food supply, year after year surpassing permissible levels set by international radiation standards.4
Milk was also impacted. The town of Mandan, North Dakota, mystified scientists after milk produced there registered the highest level of strontium-90 ever recorded in milk, in the early 1960s. Whereas most milk in the U.S. had 10 to 40 'units' of strontium-90, Mandan-made milk peaked at 105 in May 1963. One article in the publication 'Nuclear Information' in September 1965 claimed that 'some farms [were] producing milk with....as much as 199 S.U. [Strontium Units]'
The blame, however, doesn't fall exclusively on Mandan. Mandan was simply a dairy-processing town, which received milk shipments from dairy farms as far away as "east of Jamestown to as far west as Billings, Mont., and from the northern parts of South Dakota to the northern border of North Dakota." The high strontium-90 measurements in Mandan milk, which greatly alarmed scientists, could just as well have been attributable to Minnesota, South Dakota or Montana milk. All of that milk was processed in Mandan into powdered milk, butter and cream products. So, across Montana, North and South Dakota and Minnesota, the milk was becoming highly radioactive as cows ate grass tainted with 'stratospheric fallout' - and regional milk supplies and nationally distributed butter, cream and other dairy products reached millions of consumers.
We know from data and maps produced by the Public Health Service that milk contamination was a real problem in these areas. From early 1963 to early 1965, the 'strontium units' in milk from these north-central states climbed beyond suggested maximum permissible levels - into the 50s and 60s and higher - whereas most of the continental U.S. had milk with 'strontium units' in the 10s, 20s or 30s. Montana, North and South Dakota and Minnesota were usually uniformly hit with fallout from strong 'convection storms,' but the hardest-hit areas were North Dakota and eastern Montana and/or western Minnesota.
Much of the milk produced in the north-central states was consumed regionally, but that didn't let the rest of America off the hook. Nearly all of the dairy cattle areas of the U.S. were hit - at one time or another in the early 1960s - by convection storms, which caused strontium-90 levels to rise in local milk to 10, 20 or 30 'strontium units' and sometimes above 60 strontium units outside of the Plains States (see chart at bottom).
Strontium-90 behaves just like calcium: where calcium is channeled by the body to help grow 'healthy bones and teeth,' so goes the strontium-90. It is truly astonishing how much calcium is absorbed by a newborn baby and a young child. A developing fetus draws about 30 grams of calcium for its skeleton from its mother. By the first birthday, the infant body takes in another 75 grams of calcium; by the second year, another 50 grams. If you take a spice container from your kitchen, you'll find its contents usually weigh between 50 and 75 grams. Imagine: in the first two years of life, an infant absorbs the weight equivalent, in calcium, of 2 to 3 spice containers! By their 8th or 9th birthday, children will have gained nearly 50 percent of the bone mass of their future adult bodies, or about 500 grams of calcium. Depending on dietary levels of strontium-90, a fraction of this calcium intake is substituted by strontium-90 - the body can't tell the difference between the two. Because of the high concentrations of strontium-90 in the environment in the 1960s, children growing up at that time absorbed strontium-90 into their bodies at concentrations - relative to their skeletons' calcium load - greater than at any other time in history. To make matters worse, although milk tends to contain lower concentrations of strontium-90 than wheat following 'nuclear fallout' events, calcium (and also strontium-90) in milk is more efficiently absorbed into our bodies than calcium (and strontium-90) found in wheat. Also, milk is generally consumed daily in larger quantities by children than wheat products, in the U.S. (and in Russia). Biochemistry, culture and fallout-deposition patterns combined in a perfect storm to target the age group most vulnerable to radiation injury: children.
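The calcium arithmetic above is simple enough to tabulate. A minimal sketch, using only the uptake figures quoted in the text; the substitution fraction at the end is a made-up number, purely to illustrate how dietary substitution translates into mass built into bone:

```python
# Calcium uptake figures quoted in the text (grams of elemental calcium).
calcium_uptake_g = {
    "prenatal": 30,      # drawn from the mother for the fetal skeleton
    "first_year": 75,    # absorbed by the first birthday
    "second_year": 50,   # absorbed during the second year
}

# First two years of life: 75 + 50 = 125 g, i.e. roughly the contents of
# 2 to 3 kitchen spice containers (50-75 g each).
first_two_years = calcium_uptake_g["first_year"] + calcium_uptake_g["second_year"]
print(first_two_years)  # 125

# Because the body cannot distinguish strontium-90 from calcium, any
# substitution fraction in the diet translates directly into strontium-90
# mass built into growing bone. The fraction below is hypothetical,
# chosen only for illustration.
substitution_fraction = 1e-7
sr90_built_into_bone_g = first_two_years * substitution_fraction
```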
The health of children across the globe in the 1960s - and the survival of the human race - was truly in peril; had it not been for the 1958 moratorium and the 1963 test ban treaty, humans would have become mostly infertile and diseased by the late 20th century - a worst-case health catastrophe.
That strontium-90 was lurking in children's foods fortified with calcium. More than 50% of a child's calcium comes from milk, and about 20% from wheat.
Did We Avert a Health Catastrophe? - Infants and Strontium-90
The body places no restriction or upper limit on strontium-90 concentrations in bone, so while a fatal dose exists for strontium-90 - as for all radiochemicals - catastrophic health results can also occur from lower doses, through chronic internal exposure within the bone. The more we ingest, the more gets built into bones and teeth, and consequently the higher the internal dose. The impacts are magnified in the young: the rate of absorption of calcium or strontium into bone is greatest for those whose bones are growing fastest. So, when in the mid-1960s strontium intake via diet in both the young and mature populations reached an all-time peak in U.S. and world history, strontium-90 concentrations in infant teeth rose the quickest, and the exponential rise alarmed scientists (see graph titled 'Strontium concentrations in infant incisors'). By 1963 or 1964, young children who put their baby teeth under the pillow for the 'Tooth Fairy' were carrying a fraction of 0.01 milligrams of strontium-90 in their skeletons (bones and teeth). A fraction of a milligram of strontium-90 might seem inconsequential, but mixed into one liter of water it would be 210 times the maximum contaminant level (MCL) the EPA allows nowadays in a liter of drinking water - it would have to be diluted 210-fold before it was safe to drink. Some say that even the EPA's standards are too relaxed, and the MCL should be lower.
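The dilution arithmetic above can be made explicit. A trivial sketch - the 210x figure is the chapter's own, and the EPA limit is normalized to 1 unit per liter, since only the ratio matters here:

```python
# The chapter's dilution arithmetic. A sample at N times the maximum
# contaminant level (MCL) must be diluted N-fold to meet the standard.

def dilution_factor(concentration, mcl):
    """How many fold a sample must be diluted to fall to the regulatory limit."""
    return concentration / mcl

# Per the text: a child's strontium-90 burden, mixed into one liter of water,
# would sit at 210 times the modern MCL.
print(dilution_factor(210.0, 1.0))  # 210.0
```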
What does strontium-90 do in the body?
A growing body of research suggests that strontium-90 in the bone may do more than just cause bone cancer and leukemia. Strontium-90 emits a type of ionizing radiation called 'beta radiation' that alters the electron structures of atoms in nearby cells. When you change the nature of the atoms in cells, you change the nature of the cells. The result, according to the authors of a 2011 study, is that strontium-90...
"has multiple pathogenic effects in the body. Along with other radioactive strontium isotopes, it is a genotoxic carcinogen. Experiments on animals exposed to radioactive strontium (57, 58) have identified elevated levels of a variety of cancers, including sarcomas, soft tissue cancers, respiratory cancers, leukemia, oral/nasal/periodontal cancers, lymphoma, basal/squamous cell cancers, pituitary adenoma, and tubular adenoma of the ovary. When consumed orally or inhaled, the most serious immediate consequences of Sr-90 exposure are hematological: reductions in white blood cell count that adversely affect the ability to resist infectious disease. Along with cancer and immune suppression, radioactive strontium has been shown to adversely affect the musculoskeletal, respiratory, cardiovascular, gastrointestinal, ocular, and neurological systems, and to cause chromosomal defects and other reproductive disorders (59)."5
When in the bone, strontium-90 attacks the bone marrow, a blood-forming organ with rapidly reproducing cells; that is where red and white blood cells are formed. Under the influence of these beta rays, critical groups of white blood cells can become mutated. 'T-lymphocytes' (or T-cells) are our body's soldiers on the 'front lines,' boosting the body's cellular immune response. Stem cells also originate in the bone marrow (at about 12 weeks in prenatal growth) and give rise to 'B-lymphocytes,' which produce 'humoral' antibodies. Aberrations in these cells cause them to act differently - differently from their perfectly designed role of fighting off cancer and infectious disease.6
In the catastrophe that was the profusion of radioactive chemicals injected into the environment in the 1960s, one would expect to notice die-offs of birds, fish and other species, including humans. But where was this happening in the human populations of Earth? Where would one look for the proverbial 'canary in the coal mine'? Which population would show health injuries the earliest? Obviously, the answer is children: we would expect a higher incidence of disease and death among young children. As it turns out, such data was collected. It was part of a body-snatching project carried out by the U.S. Atomic Energy Commission. The American government, in concert with scientists in other countries, clandestinely collected bones of deceased children (and adults) from mortuaries, hospitals and cemeteries in the 1960s. Their goal was to study the cause of death of adults and children and compare this to strontium-unit levels.
However, as Paul Langley notes in his book "Medicine and the Bomb: Deceptions from Trinity to Maralinga," when the bone data from the snatching project was later declassified, several crucial pieces of data were missing. Langley notes that although the strontium-90 load (or total strontium units) was reported, the 'cause of stillbirths or other infant death' was withheld. The omission of cause of death from the dataset's sanitized public version makes it impossible for scientists to establish further links between strontium-90 body burdens and stillbirth rates, childhood leukemia, or other radiation-linked health disorders. This missing piece of data is crucial to making sense of the study for the benefit of public health.
The only completed study that could help shed light on the global public health problems of the 1960s and beyond is classified and buried in a building of the U.S. federal government.
The closest this author found in terms of scientific evidence depicting the horrors of 1960s fallout-related health effects was a 1992 paper in the British Medical Journal by Dr. R.K. Whyte, a Canadian.7 The paper was summarized by statistician and anti-nuclear advocate Jay Gould in his 1996 book 'The Enemy Within':
'In an effort to reexamine the hypothesis that the increase in early infant mortality observed in the 1950s and 1960s was attributable to the early practice of restricting oxygen for sick newborns, Whyte analyzed annual neonatal, first-day infant mortality, and stillbirths in the United States, England, and Wales since 1935...Whyte discerned an obviously anomalous upsurge in neonatal and first-day neonatal mortality rates in the peak years of superpower bomb tests... He found that falling rates of neonatal mortality in both countries were interrupted in the early 1950's, reaching a maximum upward deviation by the mid-1960s.'
Whyte commented in his 1992 paper: 'After 1951 in England and Wales mortality increased, reaching a maximum deviation from the previous exponential fall in 1967...The same general pattern occurred in the United States...[which] reached a maximum deviation from the previous exponential fall in 1966...' Was oxygen deprivation the cause? Gould writes that: 'By 1980 rates had returned to the level that would have been expected from the rate of improvement observed in the earlier 1935-1950 period, which was consistent with the end of atmospheric tests in 1980 and continuing advances in neonatal care. Because a similar pattern was discerned for stillborn rates, [Whyte] concluded that oxygen restriction for the newborn could not explain the phenomenon, nor could any apparent change in the nutritional management of pregnant women.' (p.41)
Whyte concluded his short paper with a review of the possible culprits: 'These observations indicate a common maternal-fetal cause such as an economic or environmental factor...[yet] there are no clear economic correlates...[and] no universal change in nutritional management of pregnant women is apparent...Among environmental factors is the rise in exposure to strontium-90 resulting from atmospheric weapons testing, which has been closely correlated, both geographically and temporally, with excess fetal and infant deaths from 1950 to 1964....This and other hypotheses should be developed and tested to determine the cause of this important loss of life.'
During the 20th century, Americans were exposed to a range of disease-causing environmental factors. The ozone layer was depleting, letting in harmful UV rays. Cavities were filled with toxic mercury 'amalgam' fillings, which leak toxic vapor that is breathed into the lungs and incorporated into human tissue. DDT and other pesticides were sprayed on vegetation, 'drifted' onto homes and stuck to foods. Hospitals, dentists' offices and poorly calibrated T.V. sets overdosed Americans with X-rays. Dioxins were sprayed on roads in several states for 'dust mitigation.' Radioactive phosphogypsum - a byproduct of the phosphate industry - was built into wallboard, building materials and two-lane blacktop in several states. Satellites carrying nuclear reactors and plutonium batteries burnt up in the atmosphere and spread radioactivity far and wide.
The amounts and types of environmental poisons released from industry and nuclear activities in the 20th century, and incorporated into bio-systems, make the old Europeans who drank from lead goblets look like environmentalists. Worse, trying to parse out which disease is linked to which toxin is a nightmarish exercise. Consider, too, that radioactive isotopes often decay into other radioactive isotopes, which cause entirely different injuries than their 'parents.' Strontium-90's 'daughter,' yttrium-90, can cause malfunctions in lipid and hormone secretion that can adversely impact fetal and infant development.8
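The parent/daughter relationship just described can be sketched with the standard two-member decay (Bateman) equation. The half-lives below are textbook values (strontium-90 about 28.8 years; yttrium-90 about 64 hours), not figures from this chapter. Because the daughter's half-life is so much shorter, its activity grows to match the parent's within weeks ('secular equilibrium') - so wherever strontium-90 lodges in the body, yttrium-90 is continuously generated in place:

```python
import math

# Textbook half-lives (assumption: not taken from this chapter).
SR90_HALF_LIFE_H = 28.8 * 365.25 * 24   # strontium-90, in hours
Y90_HALF_LIFE_H = 64.0                  # yttrium-90, in hours

def decay_constant(half_life):
    """Decay constant lambda = ln(2) / half-life."""
    return math.log(2) / half_life

def daughter_activity_ratio(t_hours):
    """Y-90 activity relative to the initial Sr-90 activity (Bateman equation)."""
    lp = decay_constant(SR90_HALF_LIFE_H)   # parent (Sr-90)
    ld = decay_constant(Y90_HALF_LIFE_H)    # daughter (Y-90)
    return (ld / (ld - lp)) * (math.exp(-lp * t_hours) - math.exp(-ld * t_hours))

# After about a month the daughter's activity has caught up with the parent's.
print(f"{daughter_activity_ratio(30 * 24):.2f}")  # 1.00
```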
Now billions of human beings use cell phones to talk and surf the internet, yet those phones emit electromagnetic radiation in a frequency range that is categorically 'microwave' radiation. Yes, microwave. Cell phones are cooking the brains of the young, the leaders of tomorrow. How a slightly cooked cerebral cortex functions differently from a 'rare' one is an experiment that only Nazis would have performed - yet that experiment is happening on a massive scale, on billions of young and not-so-young techno-guinea-pigs.
Medicine will never advance enough to link diseases to environmental 'insults' - firstly, as long as those insults keep diversifying in type and quantity, and secondly, as long as those 'insults' keep mixing with each other in new combinations.
What happens to a child who uses a cell phone and has high strontium-90 in his bones? Or who doesn't use a cell phone but has mercury fillings? Or who doesn't have strontium-90, but plutonium in his bones? Or who also lives next to an army installation where depleted uranium shells are tested?