Propulsion Disadvantages GDI 2011
Gemini
Index

Index ..................................................................... 1
Ozone Link Solid-Chemical Propellant .................... 48
Ozone Link Liquid-Chemical Propellant ................... 49
Ozone Link/! ............................................. 50
Ozone ! Econ/Disease/Species ............................. 51
Ozone ! Cancer ........................................... 52
Ozone ! Climate Change ................................... 53
Ozone ! Disease/Food/Environment ......................... 54
Ozone ! Warming/Cancer ................................... 55
**AFF .................................................... 56
Aff- Environment Friendly Propulsion Coming .............. 57
Aff A2 Ozone ! ........................................... 58
Aff A2 UV Radiation ! .................................... 59
Aff CFCs Good ............................................ 60
Aff A2 Ozone Climate Change .............................. 61
Aff Alt Cause CFCs ....................................... 62
Aff SPS Link Turn ........................................ 63
Aff Link Turn ............................................ 64
**NUCLEAR** .............................................. 93
Mars- Nuclear Link ....................................... 94
Mars/Asteroids- Nuclear Link ............................. 95
**Weaponization .......................................... 96
Weaponization Link ....................................... 97
Weaponization Link ....................................... 98
Weaponization Link ....................................... 99
***MISCELLANEOUS*** ...................................... 183
Xenon Propulsion Bad ..................................... 184
Ionic Propulsion Bad ..................................... 185
Water Coach Bad .......................................... 186
Water Coach Bad .......................................... 187
Water Coach Chemical Launch Link ......................... 188
Antimatter Bad- Cost ..................................... 189
Antimatter Bad- Weaponization ............................ 190
Antimatter Bad- Solvency ................................. 191
Space Elevator Bad- Radiation ............................ 192
Space Elevator Bad- Radiation ............................ 193
Space Elevator Bad- Solvency ............................. 194
Nuclear Pulse/Project Orion- EMP ......................... 195
Nuclear Pulse/Project Orion EMP Impact extensions ........ 196
Nuclear Pulse/Project Orion Bad- EMP extensions .......... 197
Nuclear Pulse/Project Orion Bad- Satellites .............. 198
Bifrost Bridge Bad ....................................... 199
***CHEMICAL***
**Perchlorate D/A
Perchlorate 1NC
Rocket fuel emits perchlorate, polluting water and putting babies at risk of developmental issues
Madsen and Jahagirdar 6 (Travis Madsen, Policy Analyst at Frontier Group, and Sujatha
Jahagirdar, Political Director at Student PIRGs, spring 2006, The Politics of Rocket Fuel Pollution, http://www.environmentcalifornia.org/reports/clean-water/clean-water-program-reports/the-politics-ofrocket-fuel-pollution) JPG The main ingredient in solid rocket fuel, perchlorate, pollutes drinking water sources in more than 20 states. Tests also reveal perchlorate in grocery store food supplies and in breast milk from women across the country. A 2005 study by researchers at Texas Tech University suggests that breastfed babies ingest levels of perchlorate that exceed the safe dose recently established by the National Academy of Sciences, putting children at risk for developmental damage. California state agencies have discovered perchlorate in more than 400 water sources since 1997, including the Colorado River and hundreds of
municipal wells.
Perchlorate contamination causes massive water shortages
Waldman 2 (Peter, reporter @ WSJ, 12/27/02,
http://www.kuratrading.com/HTMLArticles/perchlorate.htm) JPG Several of the nation's fastest-growing areas -- including Las Vegas, Texas and Southern California -- could face debilitating water shortages because of groundwater contamination by perchlorate, the main ingredient of solid rocket fuel. The chemical, dumped widely during the Cold War at military bases and defense-industry sites, has seeped into water supplies in 22 states. The U.S. Environmental Protection Agency and the
Department of Defense are embroiled in a bitter dispute over perchlorate's health effects, with the EPA recommending a strict drinking-water limit that the Pentagon opposes as too costly. Yet even without a national standard, state regulators and water purveyors are taking no chances: Dozens of perchlorate-tainted wells have been shuttered nationwide, casting a pall on growth plans in several parched areas. Perchlorate is what scientists call an endocrine disrupter, a chemical that can alter hormonal balances -- thyroid hormones, in this case -- and thus impede metabolism and brain development, particularly among newborns. The chemical isn't believed to enter the body through the skin, so bathing in contaminated water isn't considered dangerous. The real debate is over how much ingested perchlorate causes harm. The outcome of that argument will ultimately determine how much the Pentagon and its defense contractors will have to spend to cleanse the chemical from the nation's drinking supplies. The EPA has urged the Pentagon to undertake widespread testing for perchlorate in groundwater, but the Defense Department has resisted. Its official policy, issued last month, allows testing only where a "reasonable basis" exists to suspect perchlorate contamination is both present and "could threaten public health." One major problem is that perchlorate is turning up in many unexpected places, including at military training and test ranges where rockets and missiles -- with their large quantities of solid propellants -- aren't believed to have been used. Some scientists believe other types of munitions that used tiny amounts of perchlorate may be the culprits. Many of the ordinary military ranges with perchlorate pollution lie on the outskirts of growing cities, in places that were once distant from civilian neighborhoods but now serve as watersheds and open space for sprawling suburban communities.
Perchlorate ! General
Perchlorate contaminates food and water, causing health risks for children and pregnant women
ENS 8 (Environmental News Service, 10/6/8, http://www.ens-newswire.com/ens/oct2008/2008-10-0601.html) JPG Ammonium perchlorate is widely used throughout the aerospace, munitions, and pyrotechnics industries as a primary ingredient in solid rocket and missile propellants, fireworks, and explosive charges. It is a component of more than 350 types of munitions, according to the Department of Defense. The chemical has been found not only in drinking water but also in lettuce and milk. In 2004, the Food and Drug Administration reported finding perchlorate in 217 of 232 samples of milk and lettuce in 15 states. Perchlorate affects the ability of the thyroid gland to take up iodine, which is needed to make thyroid hormones that regulate many body functions. Children and pregnant women are especially susceptible.
Perchlorate causes cancer, birth defects and other problems Anderson 3 (Adrienne, professor of Environmental & Ethnic Studies @ Boulder U, December 2003,
http://www.zcommunications.org/planetary-casualties-an-interview-with-adrienne-anderson-by-davidbarsamian) JPG A horrific wave of infant defects, cancers, and other problems followed. There are significant similar cases in California with Aerojet and other military contractors. Suburban communities are being built in these subdivisions outside of urban populations. Their water supplies are being contaminated by rocket fuel: cancer-causing propellants that contaminate water supply after water supply, forcing
shutdowns of well water all throughout Southern California and the Sacramento area. Lockheed Martin contaminated Burbank's water supply. You have to wonder, with all the people coming down with Parkinson's disease and all sorts of neurological problems, what's the association with that? Are studies being done? No, they're not. In California, Lockheed Martin was actually paying people to eat their pollution, giving them $1,000 if they would eat perchlorates, a solid rocket fuel contaminant that was contaminating public water supplies throughout California.
Perchlorate causes health issues in mothers and newborns and multiplies the effects of other harmful chemicals
Szabo 11 (Liz, writer @ USA TODAY, 1/17/11, http://www.usatoday.com/yourlife/parentingfamily/pregnancy/2011-01-14-chemicals14_st_N.htm) JPG In spite of these efforts, a new study shows the typical pregnant woman has dozens of potentially toxic or even cancer-causing chemicals in her body, including ingredients found in flame retardants and rocket fuel. Almost all 268 women studied had detectable levels of eight types of chemicals in their
blood or urine, finds the study, published in today's Environmental Health Perspectives. It analyzed data from the Centers for Disease Control and Prevention (CDC). These chemicals include certain pesticides, flame retardants, PFCs used in non-stick cookware, phthalates (in many fragrances and plastics), pollution from car exhaust, perchlorate (in rocket fuel) and PCBs, toxic industrial chemicals banned in 1979 that persist in the environment. Many of these chemicals pass through the placenta and can concentrate in the fetus, says lead author Tracey Woodruff, director of the University of California-San Francisco Program on Reproductive Health and Environment. Other researchers have discovered some of these chemicals in babies' umbilical cords, Woodruff says. Some of the chemicals detected in the study have been linked to health problems in other studies. For example, the Food and Drug Administration has expressed "some concern" that BPA, an estrogen-like ingredient in plastic found in 96% of pregnant women, affects the development of the brain, prostate and behavior in children exposed both before and after birth. Lead and mercury are known to cause brain damage. The study tested for 163 chemicals. So, as disturbing as the findings are, the study may actually underestimate the number of chemicals circulating through women's bodies, says Sarah Janssen, a senior scientist with the Natural Resources Defense Council, an advocacy group. She's concerned that some of these chemicals may act together to cause more damage than they would alone.
Perchlorate ! Water/Cancer
Perchlorate contaminates water supplies and causes cancer
Madsen and Jahagirdar 6 (Travis Madsen, Policy Analyst at Frontier Group, and Sujatha
Jahagirdar, Political Director at Student PIRGs, spring 2006, The Politics of Rocket Fuel Pollution, http://www.environmentcalifornia.org/reports/clean-water/clean-water-program-reports/the-politics-ofrocket-fuel-pollution) JPG Lockheed Martin, the world's largest defense contractor, polluted water supplies in the Redlands area of San Bernardino County, California, near where it made missiles from 1961 to 1974.42 Officials discovered trichloroethylene, an industrial solvent, in water wells near the former Lockheed site in 1980. The chemical was gradually polluting Redlands' water supply. The pollution plume, one of the largest in California, spanned more than 14 miles.43 In 1997, officials discovered widespread perchlorate contamination in the same area.44 Contamination from the Lockheed facility created a perchlorate plume measuring approximately seven square miles. Forty-seven drinking water wells have been affected to date, and concentrations as
high as 70 ppb have led to the shutdown of five wells.45 Analysis showed that 63 percent of water delivered to residents in Loma Linda and 18 percent of the water supply in Redlands came from perchlorate-tainted wells.46 A group of nearly 800 people filed lawsuits against Lockheed Martin, seeking damages for health problems that could have been caused by exposure to pollution from the site, including cancer.47 Since 1998, Lockheed has spent $80 million cleaning and replacing contaminated municipal water systems around Redlands and Riverside, California. The company expects to pay $180 million more over the next 20 years cleaning up perchlorate and other chemicals that seeped into underground water supplies near this facility.48 Regulatory standards for cleanup of perchlorate contamination will affect the amount of liability Lockheed will face for cleanup and affect any pending legal cases. Regarding these standards, Lockheed spokeswoman Gail Rymer said, "Those levels determine how much treatment is necessary. It's a cost issue."49
Perchlorate ! Food
Perchlorate kills food supplies and disproportionately affects the Latino population
Newman 7 (Penny, Exec director @ Center for Community Action and Environmental Justice, 4/10/7,
naturalresources.house.gov/uploadedfiles/newmantestimony04.10.07.pdf) JPG The impact of perchlorate is not limited to drinking water. Perchlorate also concentrates in leafy vegetables like
lettuce, which creates a concern for consumers of Imperial Valley crops irrigated with Colorado River water. Tests by scientists and advocacy organizations like the Environmental Working Group have confirmed that plants, especially broadleaf varieties, concentrate perchlorate from the environment. Scientists have found perchlorate in plant tissues at levels up to 100 times higher than in nearby water sources. In 2004, The Food and Drug Administration released a study finding perchlorate in 90 percent of 128 lettuce samples and in all but three of the 104 milk samples, with average levels ranging from six parts per billion in milk to 12 parts per billion in Romaine lettuce. These results raise the possibility that perchlorate contamination is much more widespread than regulators currently know, and that exposure is widespread across the country. Perchlorate is highly mobile in water and can persist for decades under typical ground and surface water conditions. Research has also shown that perchlorate can concentrate in crops such as wheat, lettuce, alfalfa, and cucumbers, thereby resulting in much greater exposures than might be predicted by water or fertilizer concentrations. Newer data have shown perchlorate contamination to be widespread in store-bought fruit, vegetables, cow's milk, beer and wine. Perchlorate has been found in human breast milk at levels up to 92 ppb, and was found in every one of 2820 urine samples the Centers for Disease Control recently tested for perchlorate. Nopales, a staple in the Latino communities, has similar characteristics of those vegetables found to uptake perchlorate easily such as lettuce. A concern for low income Latino communities that rely on the tasty succulent as a major food source is that
Community makes up 65% of the city's population and African Americans contribute 17% to Rialto's population. (US Census 2000) President Bush didn't tolerate the presence of Perchlorate from the McGregor Naval Weapons Station south of Waco in the water supply at the Presidential Ranch at Crawford. Congress appropriated money so that Bush's water for his animals would be safe to drink, so what about the rest of us? The California Water Code provides the State Water Resources Control Board
(Board) with the authority to require the cleanup and abatement of Perchlorate contamination throughout the state. In order to fully exercise their authority and restore aquifers throughout the state to health, I urge the State Water Resources Board to adopt cleanup and abatement orders for Perchlorate cleanup that require the following: 1) Cleanup of Perchlorate pollution to the fullest extent that is technically feasible and uses best available technology; 2) Provision of safe, alternative water supplies until full cleanup is complete; 3) Full reimbursement by responsible dischargers to community members and public utilities that have paid for stopgap cleanup measures; 4) Implementation of strict enforcement measures in the event of a failure to meet cleanup requirements and timelines. Perchlorate does not belong in California's drinking water supplies. By including the measures outlined above in cleanup and abatement orders, the State Water Resources Control Board will take the steps necessary to restore vital groundwater resources across the state to health.
Rocket fuel disproportionately affects minority populations, specifically women
EWG 9 (Environmental Working Group, EWG research team: Senior researchers Anila Jacob, MD,
MPH; Sonya Lunder, MPH, May 2009, http://www.ewg.org/report/Pollution-in-5-Extraordinary-Women) JPG An unprecedented two-year study commissioned by the Environmental Working Group and conducted by four independent research laboratories in the United States, Canada and the Netherlands has documented up to 481 toxic chemicals in the blood of five minority women leaders in the environmental justice movement. The
women leaders, from New Orleans, Green Bay, Corpus Christi and Oakland, have spent years deeply engaged in battles to rid their communities of air and water pollution from local manufacturing plants, hazardous waste dumps, oil refineries and conventional agriculture. 75 chemicals tested The study, sponsored by EWG in conjunction with Rachel's Network, a nationwide organization of women environmental leaders, tested the five women last year for 75 chemical contaminants. Testing was targeted toward compounds that are heavily used in everyday consumer products but that have escaped effective regulation under the antiquated Toxic Substances Control Act (TSCA). The results underscore the widespread and systemic failure of current law to protect the public from chemicals, many of which persist in the environment for decades or far longer, that are associated in animal studies with cancer, reproductive problems and behavioral effects. All of the women were contaminated with flame retardants, Teflon chemicals, synthetic fragrances, the plastics ingredient bisphenol A and the rocket fuel component perchlorate. Conclusion Though they live thousands of miles apart, come from distinctive cultural traditions and confront different environmental hazards outside their homes, the women's differences are only skin deep. Their body burdens of environmental pollutants, a mix of industrial chemicals, synthetic cosmetics ingredients and chemicals used to treat consumer products, are strikingly similar - and roughly equivalent to the body burdens of other Americans surveyed by governmental and independent research organizations. Every woman: Tested positive for 35 to 60 percent of the 75 chemicals on the search list. Had a high body burden of at least one controversial chemical whose lack of regulation and widespread presence in American life is fueling debate over reform of the nation's toxic chemical policies. The laboratory analyses, which offer a snapshot of the toxic body burdens of women on the front lines of the environmental health and environmental justice movements, set the stage for larger, population-scale research projects that could determine how exposure to chemicals in water, food and consumer products may vary across minority populations; what other industrial compounds may also be present in Americans' bodies; and any health risks those pollutants may pose, alone or in combination.
ecological landscape of this new nation. Environmental racism buttressed the exploitation of land, people, and the natural environment. It operates as an intra-nation power arrangement--especially where
ethnic or racial groups form a political and or numerical minority. For example, blacks in the U.S. form both a political and numerical racial minority. On the other hand, blacks in South Africa, under apartheid, constituted a political minority and numerical majority. American and South African apartheid had devastating environmental impacts on blacks.
Racism must be rejected
Barndt 91 (Joseph, Pastor and Co-director of Crossroads Ministry, working to dismantle racism,
Dismantling Racism: The Continuing Challenge to White America, p. 155-6)
To study racism is to study walls. We have looked at barriers and fences, restraints and limitations, ghettos and prisons. The prison of racism confines us all, people of color and white people alike. It shackles the victimizer as well as the victim. The walls forcibly keep people of color and white people separate from each other; in our separate prisons we are all prevented from achieving the human potential that God intends for us. The limitations imposed on people of color by poverty, subservience, and powerlessness are cruel, inhuman, and unjust; the effects of uncontrolled power, privilege,
and greed, which are the marks of our white prison, will inevitably destroy us as well. But we have also seen that the walls of racism can be dismantled. We are not condemned to an inexorable fate, but are offered the vision and the possibility of freedom. Brick by brick, stone by stone, the prison of individual, institutional, and cultural racism can be destroyed. You and I are urgently called to join the efforts of those who know it is time to tear down, once and for all, the walls of racism. The danger point of self-destruction seems to be drawing ever more near. The results of centuries of national and worldwide conquest and colonialism, of military buildups and violent aggression, of overconsumption and environmental destruction may be reaching a point of no return. A small and predominantly white minority of the global population derives its power and privilege from the sufferings of the vast majority of peoples of color. For the sake of the world and ourselves, we dare not allow it to continue.
favorable to industry.
innovators have an opportunity to create cutting edge solutions that will strengthen health protections and spark economic growth," Jackson said in a statement.
Aff A2 Health !
Their impacts are exaggerated; perchlorate has minimal effects on health
Holt 11 (Jim, senior writer @ The Signal, 4/5/11, http://www.the-signal.com/section/36/article/42903/)
JPG
"There were fundamental flaws we found in the scientific conclusions that were reached," said Bill Romanelli, spokesman for the corporate interests. The group, calling itself the Perchlorate Study Group and representing a handful of aerospace and chemical corporations that make or use perchlorate, wants the state study withdrawn and re-evaluated to focus on what it considers reliable and widely accepted scientific information. The statement came in reaction to a study released in January by the Office of Environmental Health Hazard Assessment. The study recommends that California re-set the limit for perchlorate in drinking water to 1 part per billion. The current health safety limit, called a public health goal, calls for no more than 6 parts per billion of perchlorate in drinking water, which is six micrograms in a liter. The state study is separate from yet another study, released by the Boston University School of Medicine in February, that found extremely low levels of perchlorate won't adversely affect pregnant women or their unborn children.
Aff A2 Health !
Perchlorate doesn't affect health; overwhelming scientific consensus
Weaver 4/11 (Lindsay, writer @ Morgan Hill Times, 2011,
http://www.morganhilltimes.com/printer/article.asp?c=274593) JPG The reliable and widely accepted science already exists, the Perchlorate Information Bureau insists, and it's past due that California recognizes at low levels the natural toxin isn't hazardous to your health. The residents affected by
San Martin's distressful rocket fuel groundwater contamination of 2003 are still seeing the remnants of the spill by Olin Corp.'s shuttered road flare factory on Tennant Avenue - fewer than 20 wells are still closely monitored and millions of dollars have been spent by local government agencies to clean and monitor South County's drinking water. A few residents are even still dependent on bottled water. But, according to a new study by Boston University School of Medicine's scientist Dr. Elizabeth Pearce, perchlorate isn't harmful to thyroid function even in pregnant women in their first trimester. The findings depict what the National Academy of Sciences has said for 50 years, the Perchlorate Information Bureau's spokesman Bill Romanelli said; "It doesn't come as a surprise. It's the same as the bulk of overwhelmingly scientific conclusions that say it simply doesn't impact human health," he said Monday. The Perchlorate Information Bureau represents the interests of the aerospace industry that uses and produces rocket fuel, but Romanelli said the science speaks for itself.
residue from ground testing of solid rockets and waste burning of their fuel reacts with water to form acid rain and acid fog. In the dry foothills above San Jose, California, the Chemical Systems Division of United Technologies Corporation (UTC) burns toxic solid rocket fuel in open pits, while
citizens living nearby are not allowed to burn household garbage or even yard cuttings. The Bay Area Air Quality Board, as well as agencies permitting solid-rocket-fuel burning in other parts of California, Colorado, Utah, and Mississippi, allows the practice because there is no proven, safe alternative. Rocket-fuel pollution is footloose: when activists and officials stopped some UTC fuel burning in San Jose, the company started burning it openly at the Sierra Army Depot, in Herlong, California, north of Lake Tahoe. Even if production of new rockets and fuel stops, the cold-war missile buildup may come back to haunt, and poison. The INF treaty with the Soviet Union actually specified that the United States destroy Pershing II solid rocket motors through open burning or explosive demolition, which was carried out in the Pueblo Army Depot in Colorado and the Longhorn Army Ammunition Plant in Texas. The army's Redstone Arsenal, in Huntsville, Alabama, has developed an experimental method to recycle rather than burn off toxic elements of rocket fuel, but to make it practical would require a good deal more support from army and government funders. Meanwhile, along the wetlands of the Mississippi Gulf Coast, a group called Citizens for a Healthy Environment is distributing skull-and-crossbones bumper stickers as part of its campaign to block NASA's Advanced Solid Rocket Motor program. NASA plans to test the new rockets at the Stennis Space Center, near Bay St. Louis. Stennis has conducted tests of liquid-fueled rockets since the mid-1960s, but the higher levels
of pollution from solid rockets have people worried about health risks as well as damage to the wetlands. NASA initially plans four 2.25-minute tests of the new motors each year, which are expected to release about 1.7 million pounds of aluminum oxide, 123,000 pounds of chlorine, and more than a million pounds of hydrogen chloride at the site annually. When the hydrogen chloride mixes with water, it will
form up to 3.2 million pounds of hydrochloric acid each year. NASA proposes to protect the environment by deflecting the exhaust plume upward to dilute its impact, and by testing only under optimum weather conditions.
tanks.
NOx ! Warming
Nitrous oxide causes warming and creates a feedback loop
Upham 10 (Ben, writer @ New York Times, 4/8/10, http://www.enn.com/sci-tech/article/41195) JPG
Nitrous oxide, also known as "laughing gas," is ranked third behind carbon dioxide and methane in contributing to global warming, and is regulated under the Kyoto Protocols. According to the EPA, the gas is 310 times more effective in trapping heat than carbon dioxide. Sixty percent of the nitrous in the atmosphere is produced naturally. Global warming "wild card" Twenty-five percent of the land surface in the Northern Hemisphere is underlain by permafrost, and as it thaws it could create a feedback loop that accelerates global warming, because it releases greenhouse gases, like methane and carbon dioxide, which in turn increase warming, spurring more thawing. Scientists had thought only a little nitrous oxide was released during this
process, but the journal study suggests otherwise.
ecosystems are degraded. Biodiversity is lost and once vibrant communities disappear as fish die and ecosystems degrade. However, attention has moved away from that ongoing environmental disaster and many
seem to believe the problem has gone away because the media rarely seems to report on it anymore.
Impact is extinction Santo 99 (Miguel Santo, professor of Ecology and Environmental Science at Baruch College, 1999,
Environmental Crisis, p. 35-36)
In addition, natural forests provide recreation and unique scientific beauty while at the same time serving as the basis for natural communities that provide life support to organisms (including people). As mentioned, one vital by-product of plant photosynthetic activity is oxygen, which is essential to human existence. In addition, forests remove pollutants and odors from the atmosphere. The wilderness is highly effective in metabolizing many toxic substances. The atmospheric concentration of pollutants over the forest, such as particulates and sulfur dioxide, are measurably below that of adjacent areas (see Figure 2.3). In view of their ecological role in ecosystems, the impact of species extinction may be devastating. The rich diversity of species and the ecosystems that support them are intimately connected to the long-term survival of humankind. As the historic conservationist Aldo Leopold stated in 1949, The outstanding scientific discovery of the twentieth century is not television or radio, but the complexity of the land organisms. . . To keep every cog and wheel is the first precaution of indifferent tinkering. An endangered species may have a significant role in its community. Such an organism may control the structure and functioning of the community through its activities. The sea otter, for example, in relation to its size, is perhaps the most voracious of all marine mammals. The otter feeds on sea mollusks, sea urchins, crabs, and fish. It needs to eat more than 20 percent of its weight every day to provide the necessary energy to maintain its body temperature in a cold marine habitat. The extinction of such keystone or controller species from the
ecosystem would cause great damage. Its extinction could have cascading effects on many species even causing secondary extinction. Traditionally, species have always evolved along with their changing
environment. As disease organisms evolve, other organisms may evolve chemical defense mechanisms that confer disease resistance. As the weather becomes drier, for example, plants may develop smaller, thicker leaves, which lose water slowly. The environment, however, is now developing and changing rapidly but evolution is slow, requiring hundreds of thousands of years. If species are allowed to become extinct the total biological diversity
on Earth will be greatly reduced; therefore, the potential for natural adaptation and change also will be reduced, thus endangering the diversity of future human life-support systems.
Loss of ocean ecosystems collapses the economy and causes extinction Heinberg 8 (Richard, Senior Fellow-in-Residence @ Post Carbon Institute, 11/25/8,
http://www.postcarbon.org/top_food_chain) JPG Today comes the startling news of a British government report showing a drop in oceanic zooplankton of 73 percent since 1960. For many people, this may seem relatively inconsequential as compared to daily cataclysmic
revelations about the state of the national and global economy. This reaction is understandable: we care first and foremost about our own immediate survival prospects, and a new and greater Depression will mean millions losing their homes, millions more their jobs. It's nothing to look forward to. It takes some scientific literacy to appreciate the implications of the catastrophic loss of microscopic sea animals. We need to understand that these are food for crustaceans
and fish, which are food for sea birds and mammals. We need to appreciate the importance of the oceanic food web in the planetary biosphere. At the top of the global food chain sits a species that we really do care about: Homo sapiens. The ongoing disappearance of zooplankton, amphibians, butterflies, and bees is tied directly or indirectly to the continuing growth of our own species, both in population (there are nearly seven billion of us large-bodied omnivores, more
than any other mammal) and in consumptive voracity (water, food, minerals, energy, forests, you name it). It's at this point in the discussion that some of us start feeling guilty for being human, and others of us tune the conversation out because there's apparently not much we can do to fundamentally change the demographic and economic growth trends our species has been pursuing for hundreds, if not thousands of years. But the current economic Armageddon (that we care about) is related to human-induced biodiversity loss (that many of us don't notice) in systemic ways. Both
result from pyramid schemes: borrowing and leveraging money on one hand; on the other, using temporary fossil energy to capture ever more biosphere services so as to grow human population and consumption to unsustainable levels. Our economic pyramid is built out of great hewn blocks of renewable and non-renewable resources that are being made unavailable to other organisms as well as to future generations of humans. The financial meltdown tells us these trends can't go on forever. How the mighty have fallen! Masters of the Universe reduced to begging for billion-dollar handouts in front of a television audience. Next will come a human demographic collapse (resulting from the economic crisis, with poor folks unable to afford food or shelter), as mortality begins to exceed fertility. In all of this it's important to remember that the species on the lower levels of the biodiversity pyramid have been paying the price for our exuberance all along. The pyramid appears to collapse from the top, while in fact its base has been crumbling for some time.
safeguard these marine creatures and the livelihoods that depend on them.
in acid rain may have benefits, limiting global warming by counteracting the natural production of methane gases by microbes in wetland areas. Methane is thought to account for 22 percent of the human-enhanced greenhouse effect. And microbes in wetland areas are its biggest producers. They feed off substrates such as hydrogen and acetate in peat and emit methane into the atmosphere. Global warming itself will only fuel the production of methane as heating up the microbes causes them to produce even more methane. But the new model suggests that sulphur pollution from industry mitigates this. This is because sulphur-eating bacteria also found in wetland regions outcompete the methane-emitting microbes for substrates. Experiments have shown that sulphur deposits can reduce methane production in
small regions by up to 30 per cent by activating the sulphur-eating bacteria.
Warming creates a feedback loop with methane microbes, causing the impacts faster
New Scientist 4 (Proceedings of the National Academy of Sciences, 8/3/4,
http://www.newscientist.com/article/dn6231-acid-rain-limits-global-warming.html) JPG Furthermore, the model suggests that sulphur pollution will continue to suppress methane emissions despite the feedback effect that global warming has on the process. While sulphur emissions reduce methane emissions by about eight per cent currently, the figure should rise to 15 per cent by about 2030, predicts the model. "All our projections show that, if you don't include acid rain, methane pollution is going to increase," Gauci adds. Sulphur pollution is already estimated to have cut methane emissions from wetlands from about
175 to 160 million tonnes per year in 2004. By 2030, this is predicted to fall to 155 million tonnes per year with the help of sulphur-eating bacteria.
more carefully.
More ev: alternatives provide logistical problems
Coffey 10 (Jerry, writer @ universetoday.com, 9/29/10, http://www.universetoday.com/74655/capecanaveral/) JPG Cape Canaveral was chosen for rocket launches because the linear velocity of the Earth's surface is greatest towards the equator. The location of the cape allows rockets to take advantage of this by being launched eastward to match the Earth's rotation. Also, since it is best to launch a rocket downrange towards an unpopulated area, the ocean is a great area to launch towards. Although the United States has sites closer to the equator with expanses of ocean to the east of them, and there are better sites in the United States (Hawaii, for instance), they present significant logistical obstacles.
Florida scrub jays are a keystone species; protecting their habitat is key to preserving the Florida ecosystem; every animal counts for biodiversity
SUSF 2009 (State University System of Florida, Publication of Archival Library & Museum Materials,
http://palmm.fcla.edu/lfnh/currmat/Biodiversityinfo.shtml) JPG
Most efforts to restore endangered species populations are targeted at this level. The Florida panther, manatee, red-cockaded woodpecker, and Florida scrub jay are all species that represent important biodiversity in Florida . Often, by protecting their habitats, we also protect an ecosystem. Ecosystem level: This level is the variety of different kinds of ecosystems within a region that enable the regions to cope with changes or disturbances. For example, migrating birds need two different kinds of ecosystems in two different parts of the world as well as in healthy ecosystem rest stops along their route. As we lose many different types of ecosystems to development and consumption of natural resources, this level of global diversity decreases. For example, filling in mangrove swamps to build high-rise hotels on the coast or cutting down rain forests for grazing land and the sale of prized natural resources such as mahogany and rubber is dramatically decreasing the resiliency of ecosystem diversity. Ecosystem, Species, and Genetic Resiliency All three levels of diversity are essential to maintain life on earth as we know it today. Each level must be protected because they all depend on one another and must be resilient in order to survive. It is the variety of genes, species, or ecosystems that makes all three levels resilient. The Importance of Biodiversity Here is an analogy to help you understand what biodiversity does: Let's pretend I'm giving you a free ticket for a flight to Hawaii. You'll take it, right? You might ask, "What's the catch?" Well, the catch is that the plane...is losing rivets...not too many....just a few every hour. You might want to know a few things about rivets and airplanes; like...how many rivets does a plane have? How many does a plane need to fly? Are some more critical than others? Can rivets function alone - or do they only work in sets? That's a lot like our ecosystems and species diversity. We know plants and animals help our ecosystems provide ecological services - like the photosynthetic plants that give you food energy, and the decomposers that enrich your soil so trees are able to grow in order to provide shade you. Suppose we asked these questions: "How many species do we have on this eco-ship?" "How many do we need?" "Are some more important than others?" "What is the minimum number we need to function?" We don't know the answers. Now think about it, wouldn't you like to keep as many of these rivets as possible? How Biodiversity Benefits Humans So, when we lose biodiversity, we lose access to many different plants and animals that we might need. Here are some specific ways that biodiversity helps us: Scientists use the genetic diversity in our food crops so we can continue to grow plants that are resistant to pests and disease. Many of our prescription drugs were first made from plants and animals. If we continue to lose species, especially those not well researched such as those in tropical forests, will lose out on potentially valuable medicinal cures. Biodiversity stabilizes the ecosystem. It keeps our options open for the future. There may be resources out there that we don't yet understand their potentials, and we don't want to destroy them before we even know about them. Biodiversity also increases the beauty of the planet. What would Florida be without herons, eagles, or zebra long-wing butterflies?
Loss of biodiversity leads to extinction Diner, Major, 94 (Major David N.; Instructor, Administrative and Civil Law Division, The Judge
Advocate General's School, United States Army) "The Army and the Endangered Species Act: Who's Endangering Whom?" 143 Mil. L. Rev. 161l/n WBW Biologically diverse ecosystems are characterized by a large number of specialist species, filling narrow ecological niches. These ecosystems inherently are more stable than less diverse systems. "The more complex the ecosystem, the more successfully it can resist a stress. . . . [l]ike a net, in which each knot is
connected to others by several strands, such a fabric can resist collapse better than a simple, unbranched circle of threads -- which if cut anywhere breaks down as a whole." 79 By causing widespread extinctions, humans have artificially
simplified many ecosystems. As biologic simplicity increases, so does the risk of ecosystem failure. The spreading Sahara Desert in Africa, and the dustbowl conditions of the 1930s in the United States are relatively mild examples of what might be expected if this trend continues. Theoretically, each new animal or plant extinction, with all its dimly perceived and intertwined affects, could cause total ecosystem collapse and human extinction. Each new extinction increases the risk of disaster. Like a mechanic removing, one by one, the rivets from an aircraft's wings, 80 mankind may be edging closer to the abyss.
Alternate causes and preservation solves the impact
Baker 10 (Richard, president of Pelican Island Audubon Society, 12/18/10,
http://www.pelicanislandaudubon.org/hootnovember06.htm) JPG Florida scrub-jays, the only bird species unique to Florida and keystone species of fire-dependent xeric oak scrub, have been in a steady decline; 90% of the original populations are gone due to the loss of habitat for agriculture and urban development and also due to degrading of habitat from the suppression of natural fires.
Like our bald eagle, scrub-jays are listed as a threatened species by both federal and state agencies. To rectify this decline, some scrub habitats have been preserved and managed (although not enough) to protect this species and the other species found only in scrub. Large areas are needed as each scrub-jay family group, two breeders and up to six helpers, defends approximately 13-25 acres of land.
**Ozone D/A
Ozone 1NC
Current launch rates won't affect the ozone; the plan pushes emissions over the brink and kills the ozone
Ross et al. 9 (Martin Ross, Environmental Systems Directorate @ Aerospace, Ph.D. @ UCLA in Earth and
planetary sciences, Darin Toohey, Manfred Peinemann and Patrick Ross, Volume 7, Issue 1, January 2009, pages 50-82, informaworld) JPG Combustion emissions from rocket launches change the composition of the atmosphere. The changes can be divided into transient changes near the launch site that affect air quality in the lowermost troposphere and long-term global changes in the composition of the stratosphere. In this paper, we are concerned with the long-term impact of rocket
emissions on the global ozone layer. Ozone depletion has been a critical concern of nations across the globe for many decades, and large-scale industrial processes that alter stratospheric composition are assessed with respect to the amount of ozone depletion they would cause. When an assessment suggests unacceptably large ozone loss for a particular process, regulatory actions to limit or modify that process might be enacted to protect the ozone layer.1 In this paper, we consider rocket combustion emissions in the context of ozone layer protection over the next several decades. Our calculations are not a formal assessment, but are a preliminary evaluation to identify the main areas of concern for the space industry. These concerns include risks associated with overly conservative regulation and a suggestion for new research in order to reduce the likelihood of such regulation. Cicerone and Stedman2 first considered rocket emissions as a source of ozone depletion. Subsequent studies have shown consistently that, at current launch rates, ozone depletion from rocket exhaust is insignificant compared to other sources of ozone loss.3 If launch rates and ozone depletion from other sources remain at current levels, this assessment will not change. The potential exists that the demand for launch services could increase significantly in the future.4 Large (factors of ten or more) increases in launch demand could come about for a variety of reasons, including national decisions regarding security, enhanced space exploration, market forces associated with significant reductions in launch costs, or the emergence of new markets such as space tourism, manufacturing, or solar power. Analysts generally assume that if the cost of access to orbit is reduced sufficiently, then large, new markets will emerge for space industry and the launch market. This development would be considered revolutionary, and it is not clear when, or if, this might occur. Nevertheless, if space transport follows the normal development path of transportation technology and enters a period of continual expansion, it would be necessary to reconsider the environmental consequences of large rockets, launched often. In this paper, we consider the implication of such a significant increase in demand for orbital launches on the global ozone layer. We do not consider greenhouse gas emissions from rockets. Climate change is to some extent a separable problem from ozone depletion. While rocket engines emit gases identified as contributing to climate change, the amount emitted globally is trivial compared to other sources and is likely to remain so. Annual CO2 emissions from rockets, for example, are about several kilotons (kt) compared to emissions of several hundred kt from aircraft which, in turn, is only a few percent from all CO2 sources.5 Space launch emissions, even for the large growth scenarios discussed here, will not likely be significant in future greenhouse gas regulatory schemes. As a cautionary tale, we point out that even though aircraft are responsible for a few percent of all CO2 emissions, the airline industry must contend with considerable attention and likely regulation or carbon taxation.6 The message to the space industry should be clear: policy and media attention on high visibility propulsion emissions are often framed in ways that overemphasize the relative contribution.7 If rockets are a minuscule contributor to the problem of climate change, they do have
a significant potential to become a significant contributor to the problem of stratospheric ozone depletion. This follows from three unique characteristics of rocket emissions: Rocket combustion products are the only human-produced source of ozone-destroying compounds injected directly into the middle and upper stratosphere. The stratosphere is relatively isolated from the troposphere so that emissions from individual launches accumulate in the stratosphere.8 Ozone loss caused by rockets should be considered as the cumulative effect of several years of all launches, from all
space organizations across the planet. Stratospheric ozone levels are controlled by catalytic chemical reactions driven by only trace amounts of reactive gases and particles.9 Stratospheric concentrations of these reactive compounds are typically about one-thousandth that of ozone. Deposition of relatively small absolute amounts of these reactive compounds can significantly modify ozone levels. Rocket engines are known to emit many of the reactive gases and particles that drive ozone destroying catalytic reactions.10 This is true for all propellant types. Even water vapor emissions, widely considered inert, contribute to ozone depletion. Rocket engines cause more or less ozone loss according to propellant type, but every type of rocket engine causes some loss; no rocket engine is perfectly green in this sense.
Ozone depletion means extinction Festive Earth Society, 8 (February 26, The Ozone Layer, http://festiveearth.com/content/view/96/54/index.html, JM)
The ozone layer is essential for human life. It is able to absorb much harmful ultraviolet radiation, preventing
penetration to the earth's surface. Ultraviolet radiation (UV) is defined as radiation with wavelengths between 290-320 nanometers, which are harmful to life because this radiation can enter cells and destroy the deoxyribonucleic acid (DNA) of many life forms on planet earth. In a sense, the ozone layer can be thought of as a UV filter or our planet's built-in sunscreen (Geocities.com, 1998). Without the ozone layer, UV radiation would not be filtered as it reached the surface of the earth. If this happened, cancer would break out and all of the living civilizations, and all species on earth would be in jeopardy (Geocities.com, 1998). Thus, the ozone layer essentially allows life, as we know it, to exist.
Ozone Uq I/L
Increased rocket launches crush the ozone; the status quo is sustainable
Ross et al. 9 (Martin Ross, Environmental Systems Directorate @ Aerospace, Ph.D. @ UCLA in Earth and
planetary sciences, Darin Toohey, Manfred Peinemann and Patrick Ross, January, http://www.makealeap.org/node/41204) JPG The global market for rocket launches may require more stringent regulation in order to prevent significant damage to Earth's stratospheric ozone layer in the decades to come, according to a new study by researchers in California and Colorado. "Future ozone losses from unregulated rocket launches will eventually exceed ozone losses due to chlorofluorocarbons, or CFCs, which stimulated the 1987 Montreal Protocol banning ozone-depleting chemicals," said Martin Ross, chief study author from The Aerospace Corporation in Los Angeles. The study, which includes the University of Colorado at Boulder and Embry-Riddle Aeronautical University, provides a market analysis for estimating future ozone layer depletion based on the expected growth of the space industry and known impacts of rocket launches. The paper by Ross, Manfred Peinemann of The Aerospace
Corporation, CU-Boulder's Darin Toohey and Embry-Riddle Aeronautical University's Patrick Ross appeared online in March in the journal Astropolitics. "If left unregulated, rocket launches by the year 2050 could result in more ozone destruction than was ever realized by CFCs," Darin Toohey said. Solid rocket motors (SRMs) and liquid rocket engines (LREs) deplete the global ozone layer in various capacities. Highly reactive trace-gas molecules known as radicals dominate stratospheric ozone destruction, and a single radical in the stratosphere can destroy up to 10,000 ozone molecules before being deactivated and removed from the stratosphere. Microscopic particles, including soot and aluminum oxide particles emitted by rocket engines, provide chemically active surface areas that increase the rate such radicals leak from their reservoirs and contribute to ozone destruction. In addition, every type of rocket engine causes some ozone loss, and rocket
combustion products are the only human sources of ozone-destroying compounds injected directly into the middle and upper stratosphere where the ozone layer resides. The authors estimated global ozone
depletion from rockets as a function of payload launch rate and relative mix of SRM and LRE rocket emissions. Currently, global rocket launches deplete the ozone layer ~0.03%, an insignificant fraction of the depletion caused by other ozone depletion substances (ODSs). However, they note, as the space industry grows and ODSs fade from the stratosphere,
greater than the concentrations found in the undisturbed stratosphere, and the ozone loss is dramatic. Long-term effects occur as gas and particulate emissions from individual launches become dispersed throughout the global stratosphere and accumulate over time . The
concentrations of emitted compounds reach an approximate global steady state as exhaust from recent launches replaces exhaust removed from the stratosphere by natural atmospheric circulation.
depletion based on the expected growth of the space industry and known impacts of rocket launches. Future ozone losses from unregulated rocket launches will eventually exceed ozone losses due to
chlorofluorocarbons, or CFCs, which stimulated the 1987 Montreal Protocol banning ozone-depleting chemicals, according to Martin Ross, chief study author from The Aerospace Corporation in Los Angeles. "As the rocket launch market grows, so will ozone-destroying rocket emissions," said Professor Darin Toohey of CU-Boulder's atmospheric and oceanic sciences department. "If left unregulated, rocket launches by the year 2050 could result in more ozone destruction than was ever realized by CFCs," he added. Since some proposed space efforts would require frequent launches of large rockets over extended periods, the new study was designed to bring attention to the issue in hopes of sparking additional research, explained Ross. "In the policy world, uncertainty often leads to unnecessary regulation," he said. "We are suggesting this could be avoided with a more robust understanding of how rockets affect the ozone layer," he added. According to Toohey, current global rocket launches deplete the ozone layer by no more than a few hundredths of 1 percent annually. But, as the space industry grows and other ozone-depleting
chemicals decline in the Earth's stratosphere, the issue of ozone depletion from rocket launches is expected to move to the forefront. Highly reactive trace-gas molecules known as radicals
dominate stratospheric ozone destruction, and a single radical in the stratosphere can destroy up to 10,000 ozone molecules before being deactivated and removed from the stratosphere. "Microscopic particles, including soot and aluminum oxide particles emitted by rocket engines, provide chemically active surface areas that increase the rate such radicals "leak" from their reservoirs and contribute to ozone destruction," said Toohey. "Today, just a handful of NASA space shuttle launches release more ozone-depleting substances in the stratosphere than the entire annual use of CFC-based medical inhalers used to treat asthma and other diseases in the United States and which are now banned," said Toohey. "The Montreal Protocol has left out the space industry, which could have been included," he added. (ANI)
Plumes from rocket launches could be the world's next worrisome emissions, according to a new study that says solid-fuel rockets damage the ozone layer, allowing more harmful solar rays to reach
Earth. Thanks to international laws, ozone-depleting chemicals such as chlorofluorocarbons (CFCs) and methyl bromide have been slowly fading from the atmosphere. Increased international space launches and the potential
commercial space travel boom could mean that rockets will soon emerge as the worst offenders in terms of ozone depletion, according to the study, published in the March issue of the journal Astropolitics. If the space tourism industry alone follows market projections, rocket launches are "going to run up
against Montreal Protocol," said study co-author Darin Toohey of the University of Colorado at Boulder. The Montreal Protocol on Substances that Deplete the Ozone Layer, an international treaty, prescribes measures intended to hasten the recovery of Earth's depleted ozone layer. "This isn't urgent," Toohey said. "But if we wait 30 years, it
will be."
46 Propulsion Disads
this may appear to be a relatively inconsequential amount, the amount of chlorine injected into the stratosphere by each Shuttle launch is more damaging to the ozone supply than the aggregate annual chlorofluorocarbon emissions of most of the world's factories. This occurs, in part, because the chlorine (Cl) is injected directly into the stratosphere and immediately begins participating in ozone destruction.62 Rocket propellant effluents also have a uniquely dangerous depleting effect on the ozone layer in that the depletion is highly concentrated.
Measurements of ozone loss in the launch trail of a Titan II booster rocket thirteen minutes after launch, at an altitude of eighteen kilometers, have shown that ozone is reduced by more than forty percent within the trail.63 Other studies have shown that within one kilometer of the exhaust trail of the Space Shuttle and Energia vehicles ozone may be reduced up to eighty percent between one and three hours after launch.64
47 Propulsion Disads
interaction between combustion gas and stratospheric ozone is an urgent problem of practical astronautics. Analytical estimates of the atmospheric ozone destroyed in a rocket plume have been made for more
than 20 years. The results differ widely even for the same rocket types, and prognoses range from extremely pessimistic to guardedly optimistic. Such divergence results, first of all, from the high sensitivity of chemical kinetics calculations to rate-constant values that vary by factors of several for the numerous reactions taken into account, to the initial data for a rocket plume, to the initial data for the state of the atmosphere, etc. Widely known comparisons of the calculated results with the real change in total column ozone (TCO) above space-vehicle launch sites have been absent until now, even though regular space-based TCO monitoring has been conducted since 1978. This article presents an analysis of spline-interpolated total ozone mapping spectrometer (TOMS) measurements [1,2]. We examined 773 launches of space rockets of the ARIAN, CZ, DELTA, PROTON, SHUTTLE, TITAN, and ZENIT families for the period from 1978 to 2001. For every launch, ozone
level maps were built for regions spanning 10° of latitude by 20° of longitude over the 7 days of elapsed time. For ~30% of launches we found areas in which TCO decreased by 15-20 Dobson units. The areas are shaped either as "spots" 200-300 km in diameter or as "stripes" 200-300 km wide parallel to the plume. Such local ozone "holes" appear 1-2 days after launch and, as a rule, disappear in 5-7 days. For comparison, we also made ozone level maps for regions at latitudes similar to the launch sites. Since the probability (frequency) of appearance of natural ozone "holes" of the same size and shape above the launch sites is less than that of the observed ones, it may reasonably be assumed that in some cases the "holes" result from stratospheric ozone depletion by propulsion gas components. The rocket families were compared by the frequency of appearance of ozone "holes"; the worst result was given by the SHUTTLE family (~50%). Using these results, it is proposed to make a prognosis for every launch of "dirty" rockets and to choose the launch time most favorable for minimizing stratospheric ozone depletion.
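The detection procedure described in this card (gridded TCO maps around a launch site, flagged where the post-launch drop reaches roughly 15-20 Dobson units) can be sketched in a few lines. This is a minimal illustration under assumed array shapes and a hypothetical threshold; it is not the authors' code.

import numpy as np

def flag_ozone_holes(tco_before, tco_after, drop_threshold_du=15.0):
    """Flag grid cells whose total column ozone (TCO) fell by at least the
    threshold (Dobson units) between a pre-launch and a post-launch map."""
    drop = tco_before - tco_after            # positive values mean ozone loss
    mask = drop >= drop_threshold_du         # candidate local "hole" cells
    mean_drop = float(drop[mask].mean()) if mask.any() else 0.0
    return mask, mean_drop

# Hypothetical example: 300 DU background with a localized 18 DU dip
before = np.full((20, 40), 300.0)            # ~10 deg lat x 20 deg lon grid
after = before.copy()
after[8:12, 15:25] -= 18.0                   # simulated post-launch depletion
mask, mean_drop = flag_ozone_holes(before, after)
print(mask.sum(), "cells flagged; mean drop =", mean_drop, "DU")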
48 Propulsion Disads
present an acute environmental danger to the ozone since their effluents are disseminated below fifty kilometers, directly into the area of highest ozone concentration.45 Solid propellants are also very dangerous, as compared to liquid propellants, since HCl is a by-product of the combustion and the chlorine atom is known to deplete the ozone.46
49 Propulsion Disads
depleting exhaust products are emitted that will eventually find their way into the stratosphere.55
50 Propulsion Disads
Ozone Link/!
Expansion of chemical propellant use causes extinction through ozone depletion. Johnson 9 (John, Staff, Los Angeles Times April 4, http://www.greenchange.org/article.php?id=4228, JM)
Some atmospheric researchers are suggesting that rocket launches may ultimately have to be restricted in number to avoid serious damage to the Earth's protective ozone layer. Future ozone losses from the increasing number of rocket launches could eventually exceed the damage caused by chlorofluorocarbons, or CFCs, the chemical compounds banned from use in aerosols, freezers and air conditioners, they conclude in a new study. "As the rocket launch market grows, so will ozone-destroying rocket emissions," said Darin Toohey, a professor in the atmospheric and oceanic sciences department at the
University of Colorado at Boulder. "If left unregulated, rocket launches by the year 2050 could result in more ozone destruction than was ever realized by CFCs." Toohey's research, based on measurements of pollutants emitted by current rocket launches and projections of future launches, in conjunction with authors from the Aerospace Corp. and Embry-Riddle Aeronautical University, appeared online in March in the journal Astropolitics. Without Earth's ozone layer, exposure from the sun's harmful radiation would make life on the planet's surface impossible. Several decades ago, scientists began to notice the ozone layer was being eaten away, most famously over Antarctica, due to chemical reactions eventually traced to chlorofluorocarbons. In 1987, CFCs were banned from industrial uses, leading to predictions that the ozone layer would recover by 2040. Global rocket launches, currently at more than 100
per year, deplete the ozone layer by less than 1% annually, Toohey said. But as the number of launches increases with plans by some nations, including the U.S., to colonize the moon and venture to Mars, the problem could become serious, he said. Rockets use a variety of propellants -- solids, liquids and
hybrids. Little is known about how each affects the ozone layer. "I am optimistic that we are going to solve this problem, but we are not going to solve it by doing nothing," Toohey said.
51 Propulsion Disads
Ozone ! Econ/Disease/Species
Ozone loss hurts the economy, causes disease, and disrupts wildlife. Shapiro 95 (Lynn Anne, Contributor, Southern California Interdisciplinary Law Journal 1994-1995, p. 744, JM)
Some of the predicted effects of a thinning ozone layer include: (1) for every 1% drop in the amount of ozone (a) a 2% to 5% increase in squamous cell skin cancer, (b) a 1% to 3% increase in basal cell skin cancer, (c) a
1% to 2% increase in the incidence of and a 0.8% to 1.5% increase in mortality from melanoma skin cancer, and (d) a 0.3% to 0.6% increase in cataracts; (2) suppression of the human immune system resulting in an increase in the numbers and severity of various diseases; (3) changes in the delicate competitive balance among plant species, with a resulting reduction in crop yields, often on a one-to-one ratio to the percentage drop in ozone levels; (4) changes in marine ecosystems with resulting, potentially devastating, effects on aquatic food chains; (5) degradation of certain polymer compounds used by industry with costly economic consequences from necessary countermeasures; (6) potential for increased acid rain; and (7) contributions to the so called "greenhouse effect" and global warming, due to an increase in atmospheric carbon dioxide levels resulting from the depletion of the ozone layer.33
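The per-1% figures in this card scale straightforwardly to larger ozone losses. The short sketch below applies them to a hypothetical 5% ozone drop, assuming the linear scaling the card implies; the 5% figure is an assumption for illustration.

# Apply the card's per-1%-ozone-loss ranges to a hypothetical 5% drop.
ozone_drop_pct = 5.0
effects_per_1pct = {
    "squamous cell skin cancer": (2.0, 5.0),
    "basal cell skin cancer": (1.0, 3.0),
    "melanoma incidence": (1.0, 2.0),
    "melanoma mortality": (0.8, 1.5),
    "cataracts": (0.3, 0.6),
}
for effect, (low, high) in effects_per_1pct.items():
    print(f"{effect}: +{low * ozone_drop_pct:.1f}% to +{high * ozone_drop_pct:.1f}%")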
52 Propulsion Disads
Ozone ! Cancer
Ozone depletion causes disproportionate increases in cancer rates Martens 98 (WJM, Mathematics Department @ Maastricht University, Environmental Health Perspectives 106, 1, February,
JM) Analysis of what happens with the tumor incidences in the course of time after the ozone layer changes is more complex than the previous example, due in particular to the relatively long incubation time between initial UV exposure and the first appearance of cancer. Although there is a large body of data, both experimental and epidemiologic, that confirms a causal relationship between accumulated UV dose and squamous cell carcinoma (33,34), the UV dose dependencies of basal cell carcinoma and melanoma skin cancer (MSC) (except for lentigo maligna melanoma) are less certain. Earlier skin cancer assessments were based on comparison of two stationary situations (35) and did not include the delay between exposure and tumor development (36). The assessment model used here integrates
dynamic aspects of the full source-risk chain: from production and emission of ozone-depleting substances, global stratospheric chlorine concentrations, local depletions of stratospheric ozone, resulting increases in UV-B levels, and finally, the effects on skin cancer rates (37-39). Figure 5 clearly shows the delay mechanisms in the effect of ozone depletion on skin cancer rates. Full
compliance with the Copenhagen Amendments to the Montreal Protocol would lead to a peak in the atmospheric chlorine concentration around 1995, a peak in stratospheric chlorine concentration and ozone depletion around 2000, and a peak in skin cancer by about 2050 (50 years after the peak in ozone depletion). The latter delay is mainly due to the fact that skin cancer incidences depend on cumulative UV-B exposure. An important aspect in this modeling experiment is that skin cancer rates are very sensitive with respect to lifestyle (i.e., sun exposure habits ). Changing lifestyles such as the trend toward sun worshipping during the last half century contribute greatly to the increases in the incidence of skin cancer. This has been identified as a serious public health problem in several western countries, and campaigns have been launched to curb excessive exposure to the sun. An increase in UV exposure of 50% would increase the excess number of skin cancer cases to 135% (Figure 6). Another factor contributing to a steady increase in the number of skin cancers is the aging of the population. Because older people build up a high cumulative UV dose during their lives, skin cancer occurs more frequently among the elderly. Figure 6 also shows that if a
population is aging, the same level of UV exposure would lead to higher incidences of skin cancer than in a younger population, perhaps a 50 to 60% increase in the overall incidence . So it appears that in view of the several delay mechanisms involved in cancer onset and, additionally, the aging of the population, increases in incidences of skin cancer are likely to occur.
53 Propulsion Disads
by
influence tropical circulation and increase rainfall at low latitudes in the entire Southern Hemisphere. This is the first time that ozone depletion, an upper atmospheric phenomenon confined to the polar regions, has been linked to climate change from the Pole to the equator. "The ozone hole is not
even mentioned in the summary for policymakers issued with the last IPCC report," says Lorenzo Polvani, co-author of the paper. "We show in this study that it has large and far-reaching impacts. The ozone hole is a big player in the climate system." Lead author Sarah Kang says: "It's really amazing that the ozone hole, located
so high up in the atmosphere over Antarctica, can have an impact all the way to the tropics and affect rainfall there; it's just like a domino effect." The ozone hole is now widely believed to have
been the dominant agent of atmospheric circulation changes in the Southern Hemisphere in the last half century. This means, according to Polvani and Kang, that international agreements about mitigating climate change cannot be confined to dealing with carbon alone; ozone needs to be considered, too. "This could be a real game-changer," says Polvani. Over the past decade ozone depletion has largely halted. Scientists now expect it to fully reverse, with the ozone hole closing by mid-century. "While the ozone hole has been considered as a solved
problem, we're now finding it has caused a great deal of the climate change that's been observed," says Polvani. Together with colleagues at the Canadian Centre for Climate Modelling and Analysis, Kang and
Polvani used two different state-of-the-art climate models to show the ozone hole effect. They first calculated the atmospheric changes in the models produced by creating an ozone hole. They then compared these changes with the ones that have been observed in the last few decades: the close agreement between the models and the observations shows that ozone is likely to have been responsible for the observed changes in the Southern Hemisphere . Kang and Polvani plan next to study extreme precipitation events, which are associated with major floods, mudslides, etc. "We really want to know," says Kang, "if and how the closing of the ozone hole will affect these."
54 Propulsion Disads
Ozone ! Disease/Food/Environment
Ozone depletion causes disease, crop failure, and ecosystem disruption Mishra 10 (M. P., Chief Editor of ECOSOC, international environmental newsletter, Ozone Layer: Its depletion,
consequences, and protection, September 12, http://www.ecosensorium.org/2010/09/ozone-layer-its-depletion-consequences.html, JM)
Ozone absorbs ultraviolet radiation so that much of it never reaches the earth's surface. The protective umbrella of the ozone layer in the stratosphere protects the earth from harmful ultraviolet radiations.
Ozone plays an important role in the biology and climatology of the earth's environment. It filters out all the radiations that remain below 3000 Å. Radiations below this wavelength are biologically harmful. Hence any depletion of the ozone layer is sure to exert catastrophic impacts on life in the biosphere. Ultraviolet radiation is one of the most harmful radiations contained in sunlight. The ozone layer in the stratosphere absorbs these radiations and does not allow them to reach the earth. The depletion of the ozone layer may lead to UV exposures that may cause a
number of biological consequences like skin cancer, damage to vegetation, and even the reduction of the population of planktons (in the oceanic photic zone). Some of the remarkable effects of the UV radiations, or the effects of depletion of the ozone layer, are mentioned below. (1) UV radiation causes eye diseases (cataract), skin diseases, skin cancer and damage to immune systems in our body. (2) It damages plants and causes reduction in crop productivity. (3) It damages embryos of fish, shrimps, crabs and amphibians. The population of salamanders is shrinking due to UV radiations reaching the earth. (4) UV radiations damage fabrics, pipes, paints, and other non-living materials on this earth. (5) It contributes to global warming. If the ozone depletion continues, the temperature around the world may rise by up to 5.5 degrees Celsius. II. Specific Effects: The specific effects of depletion of the ozone layer have been observed on Human Society, Agriculture, Plants
and Animals etc. These effects have been summarized as below- A. Effects of Ozone Depletion on Human Society (i).The flux of ultra violet radiation in the biosphere is increased due to ozone depletion. It has seriously harmful effects on human societies like formation of patches on skin and weakening of the human immune system. (ii). It may cause three types of skin cancer like basal cell carcinoma, squamous cell carcinoma and melanoma. A 10 per cent decrease in
stratospheric ozone has been reported to cause a 20 to 30 per cent increase in cancer in human society. About 7,000 people die of such diseases each year in the USA. About a 10 percent increase in
skin cancer has been reported in Australia and New Zealand. (iii). Exposure to UV radiations damages the skin of sunbathing people by damaging melanocyte cells or by causing sunburns due to faster flow of blood in the capillaries of exposed areas. (iv). Exposure to UV radiations due to ozone depletion may cause leukemia and breast cancer. (v). Exposure of UV radiation to the human eye damages the cornea and lens, leading to photokeratitis, cataract and even blindness. (vi). Ambient ozone exposure may cause emphysema, bronchitis, asthma and even obstruction of the lungs in human beings. (vii). Exposure to radiations due to ozone depletion has been reported to cause
DNA breakage, inhibition and alteration of DNA replication and premature ageing in human beings. B. Effect of Ozone Depletion on Agriculture (i). Radiations reaching to the earth due to ozone depletion cause severe damage to plants including crops. As per reports, ultra violet radiations reaching to
the earth cause losses up to 50 per cent in European countries. (ii).The radiation reaching to the earth due to the depletion of the ozone layer cause visible damages in plants. They adversely affect the rate of photosynthesis that finally results into decrease in the agricultural production. (iv).The UV radiation enhances the rate of evaporation through stomata and decreases the moisture content of the soil. This condition adversely affects the growth and development of crop plants and reduces the crop yield. (v). The ozone reduction adversely affects the
weather pattern which in turn affects the crop production by encouraging plant injuries and disease development. (vi). The UV radiation reaching to the earth surface alters the global balance between radiation
and energy. This condition of imbalance causes seasonal variations that further reduce the crop production. (vii). A number of economically important plant species, such as rice, depend on cyanobacteria residing in their roots for the retention of nitrogen. These bacteria are sensitive to UV light and are hence killed instantly. C. Effects of Ozone Depletion on other Plants and Animals (i). The ozone layer depletion causes climatic alterations that cause physiological changes in plants and animals. The change in the energy balance and radiation may affect the survival and stability of living organisms. (ii). The depletion of the ozone layer may cause changes in thermal conditions of the biosphere. It may affect the type, density and stability of vegetation, which in turn may affect different bio-geo-chemical cycles operating in nature. Interruption in these cycles damages important processes of the ecosystem, leading to dangerous conditions for plants and animals. (iii). The depletion of the ozone layer causes death of plankton populations in fresh as well as marine waters. This condition seriously affects the transfer of materials in ecosystems. Recent research has analyzed a widespread extinction of planktons 2 million years ago that coincided with a nearby supernova. Planktons are
particularly susceptible to effects of UV light and are vitally important to the marine food webs.
55 Propulsion Disads
Ozone ! Warming/Cancer
Ozone depletion causes warming and skin cancer, and disrupts ecosystems EPA 1-13 (Health and Environmental Effects of Ozone Layer Depletion,
http://www.epa.gov/ozone/science/effects/index.html, JM)
Reductions in stratospheric ozone levels will lead to higher levels of UVB reaching the Earth's surface. The sun's output of UVB does not change; rather, less ozone means less protection, and hence more
UVB reaches the Earth. Studies have shown that in the Antarctic, the amount of UVB measured at the surface can double during the annual ozone hole. Another study confirmed the relationship between reduced ozone and increased UVB levels in Canada during the past several years. Effects on Human Health Laboratory and epidemiological studies demonstrate that
UVB causes nonmelanoma skin cancer and plays a major role in malignant melanoma development. In addition, UVB has been linked to cataracts -- a clouding of the eye's lens. All
sunlight contains some UVB, even with normal stratospheric ozone levels. It is always important to protect your skin and eyes from the sun. Ozone layer depletion increases the amount of UVB and the risk of health effects. EPA uses the Atmospheric and Health Effects Framework (AHEF) model, developed in the mid 1980s, to estimate the health benefits of stronger ozone layer protection policies under the Montreal Protocol. EPA estimates avoided skin cancer cases, skin cancer deaths, and cataract cases in the United States. Effects on Plants Physiological and developmental processes of plants are affected by UVB radiation, even by the amount of UVB in present-day sunlight. Despite mechanisms to reduce or
repair these effects and a limited ability to adapt to increased levels of UVB, plant growth can be directly affected by UVB radiation. Indirect changes caused by UVB (such as changes in plant form, how
nutrients are distributed within the plant, timing of developmental phases and secondary metabolism) may be equally, or sometimes more, important than damaging effects of UVB. These changes can have important implications for plant competitive balance, herbivory, plant diseases, and biogeochemical cycles . Effects on Marine Ecosystems Phytoplankton form the foundation of aquatic food webs. Phytoplankton productivity is limited to the euphotic zone, the upper layer of the water column in which there is sufficient sunlight to support net productivity. The position of the organisms in the euphotic zone is influenced by the action of wind and waves. In addition, many phytoplankton are capable of active movements that enhance their productivity and, therefore, their survival. Exposure to
solar UVB radiation has been shown to affect both orientation mechanisms and motility in phytoplankton, resulting in reduced survival rates for these organisms. Scientists have demonstrated a direct reduction in phytoplankton production due to ozone depletion -related increases in UVB. One study has indicated a 6-12% reduction in the marginal ice zone. Solar UVB radiation has been found to cause damage to early developmental stages of fish, shrimp, crab, amphibians and other animals. The most severe effects are decreased reproductive capacity and impaired larval development. Even at current levels, solar UVB radiation is a limiting factor, and small increases in UVB exposure could result in significant reduction in the size of the population of animals that eat these smaller creatures. Effects on Biogeochemical Cycles Increases in solar UV radiation could affect terrestrial and aquatic biogeochemical cycles, thus altering both sources and sinks of greenhouse and chemically-important trace gases e.g., carbon dioxide (CO2), carbon monoxide (CO), carbonyl sulfide (COS) and possibly other gases, including ozone. These potential changes would contribute to biosphere-atmosphere feedbacks that attenuate or reinforce the atmospheric buildup of these gases. Effects on Materials Synthetic polymers, naturally occurring biopolymers, as well as some other materials of commercial interest are adversely affected by solar UV radiation. Today's materials are somewhat protected from UVB by special additives. Therefore, any increase in solar UVB levels will therefore accelerate their breakdown, limiting the length of time for which they are
useful outdoors.
56 Propulsion Disads
**AFF
57 Propulsion Disads
under consideration are environmentally friendly, have a higher density, and have better thermal characteristics than hydrazine. The near-term goal is to improve mission performance and greatly reduce ground operations costs. For the far-term, a very high performance (high specific impulse) system is being sought. The key to this goal is the development of a high-temperature catalyst; research in this area is underway. (Schneider, 1997) For small spacecraft, several chemical propulsion technologies are being explored. Examples include: 1. A warm gas propulsion system that uses a mixture of hydrogen, oxygen, and an inert gas (nitrogen or helium) and that offers a high specific impulse alternative to cold gas systems with a minimal increase in complexity 2. Exothermic decomposing solid and hybrid systems, which offer the high density and simplicity of solid propellants for low-thrust, quick-response applications 3. A water electrolysis concept that can provide dual use as a combined propulsion/power system 4. A "microturbomachinery"-based bipropellant system for very high-performance applications which uses microelectronic mechanical system (MEMS) fabrication technology to
provide propulsion systems "on-a-chip" similar to computer chips. (Schneider, 1997)
58 Propulsion Disads
Aff A2 Ozone !
Ozone depletion isn't caused by humans Maduro 2 (Rogelio, Co-author, The Holes in the Ozone Scare, January, http://www.mitosyfraudes.org/Ingles/Crista.html, JM)
They discovered that changes
in the ozone layer were directly caused by the horizontal and vertical movement of air masses (that is, wind dynamics). A close analysis of the data also demonstrated that chemistry played no role in the thickness of the ozone layer over these stations . The authors discuss the implications of their work in detail: Intensive investigations on irregular variations of the total ozone during the last years point out many phenomena as possible sources. Influences related to homogeneous and heterogeneous chemistry, volcanic activity, solar proton events, and other forms of solar activity are documented ... The main cause, however, may be influences from meteorological conditions, and these relations have got much less attention . The role of horizontal
advection and vertical motion as a significant source for ozone column variations has been studied more than 40 years ...
Recently Rabbe and Larsen (3) have indicated dynamic processes in the atmosphere as a main reason of ozone variations and ozone "miniholes." They show that ascending motion of the air is accompanied by dilution of the ozone layer, and vice versa, descending motion of the air causes enhanced density of the ozone layer. The causes of ascending and descending motions are often winds
blowing across mountain ranges. Such vertical air movements will cause adiabatic expansion and compression with cooling and warming in time scales down to a few hours. Chemical processes can also contribute to ozone variations, but here the time scales are days. On the other hand, ozone variations with periods in the order of 10 days, and seasonal variations as well can also be explained by dynamic meteorological reasoning. After a detailed analysis of the Russian data, Henriksen and Roldugin conclude with a sharp reminder to the promoters of the ozone depletion fraud that they cannot arbitrarily exclude factors other than chemistry from their models : The question of so-called
"ozone depletion" has to be investigated from the point of view of long-term variation of general circulation in the atmosphere. Models of "the depletion," as summarized in [the World Meteorological Organization's] WMO Report, must realize that the meteorological conditions have significant effects on the ozone layer, being the main cause of seasonal as well as most of the shorter and apparently arbitrary density and thermal variations.
Chemical rockets pose a minimal threat to ozone Ross and Zittel 2k (Martin N. and Paul F., PhD in Earth & Planetary Sciences, UCLA, PhD in Physical Chemistry,
UC Berkeley, Aerospace 1, 2, Summer, JM) Space transportation, once dominated by government, has become an important part of our commercial economy, and the business of launching payloads into orbit is expected to nearly double in the next decade. Each time a rocket is
launched, combustion products are emitted into the stratosphere. CFCs and other chemicals banned by international agreement are thought to have reduced the total amount of stratospheric ozone by about 4 percent. In comparison, recent predictions about the effect on the ozone layer of solid rocket motor (SRM) emissions suggest that they reduce the total amount of stratospheric ozone by only about 0.04 percent. Even though emissions from liquid-fueled rocket engines were not included in these predictions, it is likely that rockets do not constitute a serious threat to global stratospheric ozone at the present time. Even so, further research and testing needs to be done on emissions from
rockets of all sizes and fuel system combinations to more completely understand how space transportation activities are affecting the ozone layer today and to predict how they will affect it in the future.
59 Propulsion Disads
Aff A2 UV Radiation !
UV radiation is harmless, unrelated to ozone, and not increasing; no impact. Maduro 2 (Rogelio, Co-author, The Holes in the Ozone Scare, January, http://www.mitosyfraudes.org/Ingles/Crista.html, JM)
How has such a technical matter as stratospheric chemistry come to dominate headlines around the world and mobilize politicians to impose a ban that will cost their nations over $5 trillion over the next few years? The answer is fear of increased numbers of deaths from skin cancer as more ultraviolet radiation hits the Earth, supposedly the result of ozone depletion. If
it were not for the mass hysteria that has been created over the alleged dangers of an increase in skin cancer rates, there would be no ban on CFCs today , and newspapers would not even bother to cover the issue. For example, during the same four- to six-week period that the so-called ozone hole appears over Antarctica, a nitrogen oxide (NOx) hole also develops over the same area. Both the socalled ozone hole and nitrogen oxide hole are created in Antarctica by the same natural phenomena, but mentioning this and other unusual phenomena over Antarctica would raise too many questions in people's minds about the extraordinary chemistry that takes place at the end of the polar winter in Antarctica, and would lead people to question the ozone scare. So, the NOx hole is never mentioned. Let's look at the UV/cancer theory. First, the scare stories about UV and ozone depletion are based on increases in UV that are minuscule, compared with the natural variations in UV-B that are determined by one's altitude and distance from the Equator. Second, there is no evidence that levels of UV-B have increased at the surface of the Earth, despite the claims of worldwide ozone depletion. And third, biological research now indicates that it is not UV-B that causes the malignant types of skin cancer, but UV-A, which is not screened out by the ozone layer. The ozone depletion theory predicts that there will be a 10 to 20 percent increase in the level of UV-B radiation at the surface as a result of ozone depletion. This might seem like a large increase, unless one knows something about the geometry of the Sun and the Earth. UV-B varies by 5,000 percent from the Equator to the Poles . It also varies with altitude. This is the result of simple geometry: There is more sunlight exposure at the Equator and the atmosphere is thinner in the mountains, so more UV-B gets through. In midlatitudes such as that of the United States, a 1 percent increase in UV-B is the equivalent of moving 6 miles south (closer to the Equator). Thus, the alleged increase in UV radiation, according to the theory,
would be the equivalent of what a person would receive if he were to move 60 to 120 miles south, the equivalent of moving from New York City to Philadelphia. Actual instrumental measurements of ultraviolet radiation at the
surface show that there has been no increase in UV levels, despite widespread claims of ozone depletion in northern latitudes. Just as with the ozone layer, the levels of UV radiation go through tremendous seasonal fluctuations. The amount of incoming UV radiation is modulated by several factors, including the angle of the Sun at that particular time of the year (lowest in winter), incoming solar radiation, sun spots, thickness of the ozone layer, meteorological conditions (cloud cover, and so on) and pollution. Accurately determining the amount of UV radiation requires long-term readings over an extensive network. Curiously enough, while tens of billions of dollars have been spent on "ozone research" almost no money has been spent on UV readings at the surface. The most extensive study to date of UV-B radiation at the surface is that conducted by Joseph Scotto and his collaborators at the National Cancer Institute. The study, published in the Feb. 12, 1988, issue of Science,(6) presented evidence that the amount of UV-B reaching ground level stations across the United States had not increased, but in fact, had decreased between 1974 and 1985. Instead of rejoicing at the results, the promoters of the ozone depletion scare saw to it that the network of observing
stations was shut down, by cutting its funding (less than $500,000 out of more than $1.75 billion in research funds to study "climate change"). One of the recent attempts to contradict the Scotto study was an article by J.B. Kerr and C.T. McElroy, published in Science magazine in 1993, claiming an upward trend in UV radiation over Toronto. (7) The results were frontpage news internationally, but when it was soon demonstrated by other scientists that the so-called trend was based on faulty statistical manipulation (8) this reverse got little publicity. The entire "rise" in UV was based on
readings taken during the last 3 days of five years of measurements! A correct statistical analysis showed that the trend in UV was zero (that is to say, the amount of UV had neither increased nor
decreased over the five-year period). Interestingly enough, the Canadian study had been rejected for publication by Nature. At the time the Canadian paper was submitted to Science, F. Sherwood Rowland was the president of the American Association for the Advancement of Science, publisher of Science. According to knowledgeable sources, Rowland rammed through the publication of the paper despite its obvious errors.
60 Propulsion Disads
countries sounded the death knell for an important part of the international chemical industry, with implications for billions of dollars of investments and hundreds of jobs in related sectors. The protocol did not simply prescribe limits on these chemicals based on "best available technology,"
which had been a traditional way of reconciling environmental goals with economic interests. Rather, the negotiators established target dates for replacing products that had become synonymous with modern standards of living, even though the requisite technologies did not yet exist. At the time of the negotiations and signing, no measurable evidence of damage existed. Thus, unlike environmental agreements of the past, the treaty was not a response to harmful developments or events, but rather a preventive action on a global scale". What Benedick knew, but did not say, is that the ban on CFCs would directly and indirectly cause millions of deaths per year, and that he supports this mass murder. Seven years after the Montreal Protocol banning CFCs, the "evidence of damage" still does not exist, and the Montreal Protocol has served as the shining example for new international environmental treaties. The Climate Treaty, the Biodiversity Treaty, and others, have been signed despite the lack of scientific evidence, the argument being that the delegates are just following the example of the Montreal Protocol. The ban on the production of CFCs took effect on Jan. 1, 1996, in the United States. This event, which most people may not even notice until their car air conditioners break down, is earthshaking. The production of one of the most useful chemicals invented by man, literally the life-
blood of the world's food refrigeration system, is ending. By preserving the food supply and keeping it wholesome, refrigeration is one of the major factors in the dramatic increase in human life expectancy in the past half-century. By removing these inexpensive, benign, and efficient coolants, the Montreal Protocol measures put at risk the poorest populations in the world, those for whom the more expensive refrigerant replacements will make the cost of refrigeration prohibitive. The entire worldwide food chain depends on CFCs. CFCs are used in refrigeration systems at the time crops are harvested
and during transportation, storage, and distribution. This refrigeration "cold chain" depends on a steady supply of CFCs and HCFCs. (13) There are no drop-in substitutes for CFCs and HCFCs for most refrigerators, freezers, and refrigerated transports, which means that as supplies disappear, existing equipment shuts down or is scrapped. Most nations of the world cannot afford to replace this equipment. As a result of the ban on CFCs, the cold chain is already collapsing in the poorer areas of the world, particularly Africa and Eastern Europe. Public health also suffers
from this cold chain collapse, because most vaccines and many medicines need to be refrigerated. In addition, a ban on the agricultural pesticide and fumigant methyl bromide, for which there is no available
chemical substitute, means that many countries will lose the ability to export their crops, and that dangerous pests will spread to other areas of the world to destroy crops and attack people. Methyl bromide is crucial to preserve food in storage, particularly grains. More than one third of the world's grain supply will be lost if methyl bromide is banned. In 1992,
international refrigeration experts estimated that the ban on CFCs was going to kill between 20 to 40 million people every year by the end of the decade, through hunger, starvation, and foodborne diseases. This is now an underestimate, given the addition of methyl bromide to the list of chemicals to be banned, and given the emergence of new and old diseases.
61 Propulsion Disads
Stratospheric and tropospheric ozone both absorb infrared radiation emitted by Earth's surface, trapping heat in the atmosphere. Stratospheric ozone also significantly absorbs solar radiation. As a result, increases or decreases in stratospheric or tropospheric ozone induce a climate forcing and, therefore, represent direct links between ozone and climate. In recent decades, global stratospheric
ozone has decreased due to rising reactive chlorine and bromine amounts in the atmosphere, while global tropospheric ozone in the Industrial Era has increased due to pollution from human activities (see Q3). Stratospheric ozone depletion has caused a small negative radiative forcing since preindustrial times, while increases in tropospheric ozone have caused a positive radiative forcing (see Figure Q18- 1). Summing the positive forcing due to tropospheric ozone increases with the smaller negative forcing from stratospheric ozone depletion yields a net positive radiative forcing. The large uncertainty in tropospheric ozone forcing reflects the difficulty in quantifying tropospheric ozone trends and in modeling the complex production and loss processes that control its abundance. The negative radiative forcing from stratospheric ozone depletion will diminish in the coming decades as ODSs are gradually removed from the atmosphere.
Stratospheric ozone depletion cannot be a principal cause of present-day global climate change for two reasons: first, the climate forcing from ozone depletion is negative, which leads to surface cooling. Second, the total forcing from other long-lived and short-lived gases in Figure Q18-1 is positive and far larger. The total forcing from these other gases is the principal cause of observed and projected climate change.
62 Propulsion Disads
not fully repair itself until at least the middle of the 21st century.
Alt causes outweigh the internal link O'Neill 9 (Ian, writer @ AstroEngine, 1/13/9, http://www.astroengine.com/2009/01/oh-no-rocketlaunches-are-bad-for-the-environment-wed-better-stay-at-home-then/) JPG Of course, there are other space agencies, and now we have a growing number of private rocket companies, but compared with the daily carbon emissions we individuals and industry are responsible for, rocket launches aren't exactly the Spawn of Satan.
63 Propulsion Disads
64 Propulsion Disads
65 Propulsion Disads
66 Propulsion Disads
Soyuz Bad
Soyuz fails its unsafe and explodes upon reentry Messier 8 (Doug, editor @ Parabolic Arc, co-owner of spacejobs.com, Intl space U, 9/21/8,
http://www.parabolicarc.com/2008/09/21/is-soyuz-unsafe/) JPG
The Rocketsandsuch blog has an interesting post about what might be causing re-entry problems with Soyuz spacecraft returning from the International Space Station. The last two missions to return from orbit experienced rough, ballistic re-entries because the pyrotechnic charges designed to separate the crew return module from the rest of the ship failed to fire properly. The space station has grown in size considerably since those first early long duration flights that the Soyuz so flawlessly serviced. It is a bit larger now with all the new modules the Emperor has sent aloft for our friends. As such it makes quite a target for training gangly military officers on ground-based radars around the world. It has also become quite a source of electromagnetic energy itself, with all the radios and such from all the international partners blasting their messages back to the homelands, the blogger writes. Did you hear the recent news about cell phones in your pocket causing your little reproductive agents to slow down or become ineffective? The same thing
may be at work when the cacophony of EMI on the space station envelops the Soyuz separation pyros and causes them to become inert. If this report is true, then the space station program is in serious trouble. The current crew could be at risk if their Soyuz is similarly affected; the last crew to return were lucky to escape with their lives, according to some reports. Their Soyuz vehicle began to re-enter the atmosphere backwards until it broke away from the orbital module and righted itself. This problem also raises questions
about NASA's plan to rely on the Soyuz as the primary transportation vehicle after the agency retires the space shuttle in 2010. NASA's successor vehicle, Orion, might not be ready to fly crews to the ISS until 2015. Soyuz is unsafe and we are subjecting our astronauts to an unnecessary risk by putting them in vehicles that have been on orbit for more than a couple of weeks, Rocketsandsuch concludes.
67 Propulsion Disads
68 Propulsion Disads
Soyuz Bad $
Soyuz costs 63 million dollars a seat Dillow 3/15 (Clay, writer @ popular science, 3/15/11, http://www.popsci.com/technology/article/201103/hitchhiking-iss-soyuz-gets-more-expensive-nasa-signs-new-deal-russia) JPG The Russians are teaching the Americans an important lesson in capitalism: where there's high demand for a scarce commodity, costs will rise. NASA and its Russian counterpart inked a new $753 million modification to its current International Space Station transportation deal Monday, securing seats on the Russian Soyuz spacecraft from 2014 to 2016 at a price of almost $63 million per seat. The old contract, which runs until
2014, reserves seats on the Soyuz for just $56 million. The new deal is a bridge between the end of the old contract in 2014 and the expected emergence of a homegrown commercial manned space transportation system sometime in the middle of the decade. It secures a place for six crew members for launch in 2014 and six more the following year along with the return of both crews, with the second crew returning in 2016 after a six-month stint on the ISS. With NASA's retiring of the space shuttle fleet later this year, the Russian Soyuz has the market cornered as far as manned transportation between the Earth and the ISS is concerned. Whether or not that has anything to do with the uptick in per-seat price is pure speculation, but NASA chief Charles Bolden took the opportunity Monday to remind Americans and American companies of the importance of developing a space transit option that's made in the U.S.A.
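The per-seat figure in this card is consistent with simply dividing the contract modification by the twelve round-trip seats it covers; the allocation below is an assumption for illustration, not NASA's accounting.

# Back-of-the-envelope check of the quoted ~$63 million per seat.
contract_value_usd = 753e6   # $753 million modification
seats = 6 + 6                # six crew launched in 2014, six in 2015
print(f"~${contract_value_usd / seats / 1e6:.1f} million per seat")  # ~62.8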
69 Propulsion Disads
Soyuz Good
Soyuz is safe Halvorson and Karash 8 (Todd Halvorson and Yuri Karash, writers @ Florida today, 6/30/8,
http://www.space.com/5574-rides-soyuz-spacecraft-rocky-risky.html) JPG
The crew of the International Space Station will get a go-ahead next week to perform spacewalking inspections as part of a probe into back-to-back ballistic re-entries by Russian Soyuz spacecraft. Two veteran cosmonauts, meanwhile, say the type of steep trajectories flown by consecutive Soyuz crews are safe-but-rocky rides back to Earth. "Imagine you drive a luxury car with fine shock absorbers, not feeling the road at all," said Pavel Vinogradov, who served on Russia's Mir space station and commanded an expedition to the new outpost. "And then suddenly, one of the shock absorbers breaks and you start feeling all the dents and unevenness of the road," he said. "It
doesn't mean that your life is in danger. You can still safely drive the car."
70 Propulsion Disads
71 Propulsion Disads
SpaceX's new first-stage engine causes launch failures Spencer 8 (Henry, computer programmer & spacecraft engineer, 8/7/8,
http://www.newscientist.com/blog/space/2008/08/spacex-rocket-failure-due-to-new-engine.html) JPG SpaceX has now announced what caused the failure of its Falcon 1 rocket last weekend: a new engine on its first stage. As I wrote in my previous post, the new engine's walls were cooled by the incoming fuel rather than just having a thick layer of expendable insulation. That left more fuel inside the engine at cut-off time, so its thrust died out slowly as the extra fuel escaped out the nozzle. This
"residual" thrust pushed the first stage forward gently, enough that it caught up with the second stage before the second stage had moved far enough away. That caused the first stage to collide with the second just after the two separated at an altitude of 217 kilometres.
It was three for three on launch failures Musil 8 (Steven, editor at CNET, 8/3/8, http://news.cnet.com/8301-11386_3-10005481-76.html) JPG
A privately funded rocket suffered a launch failure Saturday night, the third launch failure in as many attempts for an Internet entrepreneur who is hoping to develop private space delivery and transportation. The failure occurred about two minutes after the launch of the two-stage Falcon 1 rocket, which was manufactured by Space
Exploration Technologies, also known as SpaceX. A failure prevented the two stages from separating after the launch from a central Pacific atoll, SpaceX CEO Elon Musk said in a company blog. The rocket was carrying three satellites for NASA and the Department of Defense. Musk said an investigation into the cause of the failure is under way, but he called the launch itself "picture perfect."
The engine catches on fire, causing mission failure Bergin 6 (Chris, writer @ nasaspaceflight.com, 3/26/6, http://www.nasaspaceflight.com/2006/03/spacexcome-hell-or-high-water/) JPG SpaceX's initial analysis indicates that there was a fuel leak just above the main engine, which caused a highly visible fire. The fire cut into the first stage helium pneumatic system, causing a decrease of pneumatic pressure at T+25s. Once the pneumatic pressure decreased below a critical value, the spring return safety function of the pre-valves forced them to close, shutting down the main engine at T+29s, Elon Musk stated in a statement about the incident.
72 Propulsion Disads
73 Propulsion Disads
74 Propulsion Disads
75 Propulsion Disads
76 Propulsion Disads
77 Propulsion Disads
industry has a long history of over-estimating demand, under-estimating technical challenges, and then experiencing cost increases and schedule delays leading to recriminations. The same pattern prevails
in the weapons industry, which probably means that companies selling to the government operate within a structure of incentives that rewards such behavior.
78 Propulsion Disads
79 Propulsion Disads
80 Propulsion Disads
81 Propulsion Disads
SpaceX Good
No risk of disaster with current technology SpaceX 11 (Developers, Falcon 9, Falcon 9, no specific date, http://www.spacex.com/falcon9.php#engine_reliability, JM)
Falcon 9 has nine Merlin engines clustered together. This vehicle will be capable of sustaining an engine failure at any point in flight and still successfully completing its mission. This actually results in an even higher level of reliability than a single engine stage. The SpaceX nine-engine architecture is an
improved version of the architecture employed by the Saturn V and Saturn I rockets of the Apollo Program, which had flawless flight records despite losing engines on a number of missions. Another notable point is the SpaceX hold-before-release system, a capability required by commercial airplanes, but not implemented on many launch vehicles. After first stage engine start, the Falcon is held down and not released for flight until
all propulsion and vehicle systems are confirmed to be operating normally. An automatic safe shut-down and unloading of propellant occurs if any off nominal conditions are detected.
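The reliability claim in this card, that a nine-engine cluster able to tolerate one engine loss can beat a single-engine stage, can be checked with a simple binomial calculation. The per-engine reliability below is a hypothetical number chosen for illustration, not a SpaceX figure, and independent engine failures are assumed.

from math import comb

def stage_reliability(n_engines, per_engine_reliability, tolerated_failures):
    """Probability that no more than `tolerated_failures` of `n_engines` fail,
    assuming independent failures (a simplifying assumption)."""
    p_fail = 1.0 - per_engine_reliability
    return sum(
        comb(n_engines, k) * p_fail**k * per_engine_reliability**(n_engines - k)
        for k in range(tolerated_failures + 1)
    )

r = 0.99  # hypothetical per-engine reliability
print(f"single engine stage:        {stage_reliability(1, r, 0):.5f}")  # 0.99000
print(f"nine engines, one-out okay: {stage_reliability(9, r, 1):.5f}")  # ~0.99656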
SpaceX is avoiding chemical propulsion drawbacks Moskowitz 11 (Clara, Senior Writer, space.com, April 6, http://www.space.com/11311-spacex-huge-private-rocket-moonmars.html, JM) A massive new private rocket envisioned by the commercial spaceflight company SpaceX could do more than just ferry big satellites and spacecraft into orbit. It could even help return astronauts to the moon, the rocket's builder says. SpaceX announced plans to build the huge rocket, called the Falcon Heavy, yesterday (April 5). To make
the new booster, SpaceX will upgrade its Falcon 9 rockets with twin strap-on boosters and other systems to make them capable of launching larger payloads into space than any other rocket operating today. But the Falcon Heavy's increased power could also be put toward traveling beyond low-Earth orbit and out into the solar system, said SpaceX's founder and CEO Elon Musk during a Tuesday press conference. [Video: How SpaceX's Falcon Heavy Rocket Flies] " It certainly opens up a wide range of possibilities, such as returning to the moon and conceivably going to Mars," Musk said. Traveling that far requires more lift than most rockets flying today, including NASA's space shuttle. But the Falcon Heavy, which
is designed to generate 3.8 million pounds (1,700 metric tons) of thrust, would be able to do the job, Musk said. The Falcon Heavy booster is designed to have more lifting capability than any other rocket in service today, and about half the capability of the most powerful rocket ever built, NASA's towering Saturn 5 booster, which sent the Apollo astronauts to the moon in the late 1960s and early 1970s. The Falcon Heavy may not be able to carry everything needed for a mission to the moon in a single go, but it could potentially launch various components separately. For example, the astronauts and moon lander could be launched in one trip, with another liftoff following to deliver the vehicle that
would ferry the crew back home, Musk said. SpaceX's Falcon 9 rocket has so far made two successful test launches, one of which carried SpaceX's Dragon capsule to orbit for the first time. Both rockets will initially fly unmanned, but have been created with flying people in mind. [World's Tallest Rockets] "As far as human standards are concerned, they are designed to meet all of the published human standards," Musk said.
SpaceX's commercial plans The Hawthorne, Calif.-based company hopes the Falcon rockets will be used to ferry astronauts to the International Space Station, and possibly beyond, after the space agency's space shuttles retire this year. SpaceX already has a $1.6 billion contract to haul cargo to the space station aboard the Falcon 9. In addition to NASA missions, the Falcon Heavy could prove useful for other commercial space ventures. For example, the Las Vegas-based Bigelow Aerospace is designing a commercial space station, and eyeing establishing a private moon base. Such a destination would require a vehicle to help build it, as well as a rocket to ferry space tourists and other clients to and from the base. Even farther destinations like Mars are not out of the question with the Falcon Heavy, Musk said, though such a trip would probably require multiple launches. He brought up the possibility of a mission to collect samples of Martian dirt and return
"The Falcon Heavy has so much more capability than any other vehicle, I think we can start to realistically contemplate missions like a Mars sample return," Musk said. And the company isn't content to stop at the Falcon Heavy. SpaceX is also considering building an even more powerful rocket called a "super heavy-lift" vehicle that would have about three times the capability of a Falcon Heavy , or about 50 percent more
them to Earth for studying an endeavor that has so far proven prohibitively complicated . power than the Saturn 5. Such a vehicle would likely have no trouble reaching the moon, Mars or beyond. Musk said SpaceX has a small contract with NASA right now to explore the possibility of building the super heavy-lift rocket.
82 Propulsion Disads
SpaceX Good
SpaceX tech solves cost and safety issues DefenseNews 10 (Reliability, Cost Put Falcon 9 Rocket on Top, August 2,
http://www.defensenews.com/story.php?i=4731243&c=FEA&s=SPE, JM)
SpaceX has contracts for 30 flights of its Falcon 9 rocket over the next seven years and is angling to add to that docket by capturing launch deals for the Pentagon's Evolved Expendable Launch Vehicle (EELV) program, company officials say. "SpaceX is working toward being onramped for the U.S. national security space missions as an EELV provider," SpaceX officials said in a July 27 response to questions. "This means we'll be eligible to compete for EELV missions as part of the approved EELV acquisition process." The Falcon 9 is a "two-stage, liquid-oxygen and rocket-grade-kerosene-powered launch vehicle," according to a SpaceX fact sheet. It stands 180 feet tall and 12 feet wide. At
takeoff, its thrust is 1.1 million pounds of force. The "heavy" version can carry 32 tons to low-Earth orbit. SpaceX touts reliability as the rocket's top selling point and operational characteristic. So what makes the Falcon 9 so much more reliable than other launch vehicles? "Propulsion and separation events are the primary causes of failures in launch vehicles. SpaceX designed Falcon 9 with boost stage propulsion redundancy," company officials said. "SpaceX also minimized the number of stages [two] to minimize separation events. "In addition, as a part of SpaceX's launch operations, the first stage is held down after ignition to watch engine trends. This capability is required for commercial airplanes, but not
implemented on many launch vehicles," the officials said. "If an off-nominal condition exists, then an autonomous abort is conducted. This helps prevent an engine performance issue from causing a failure in flight." The company began developing the rocket in 2002 with an idea in mind: "Reliability and low-cost can go hand-in-hand," SpaceX officials said. It was the launch vehicle's low cost that led judges for the inaugural Defense News Technology and Innovation contest to pick the Falcon 9 as a winner in the new platform category. The company said the outlook for more U.S. government and commercial sales "is increasingly bright." The Falcon 9's buyer list includes, according to the company: "NASA, Iridium, Bigelow Aerospace, Space Systems Loral, MDA Corp. (Canada), Astrium (Europe), CONAE (Argentina) and Spacecom (Israel), to name a few." Company officials said a recent $492 million deal with Iridium is "believed to be the single largest commercial deal ever." The Falcon 9 will lift Iridium's NEXT satellite into orbit. Over the long term, the Falcon 9 will be the "workhorse vehicle for SpaceX and its customers," the company officials said.
83 Propulsion Disads
goals of greater efficiency and performance are magnified in the realm of rocket propulsion, as placing a spacecraft in orbit is quite expensive. Any increase in the efficiency or performance of a propulsion
system should allow the payload or mission mass to increase as well. The goal of the Integrated High Payoff Rocket Propulsion Technology Program (IHPRPT), which began its execution phase in 1996, has been to improve U.S. rocket technology, doubling its performance by 2010. [1] The goals of the IHPRPT Program include booster and orbit transfer applications as well as spacecraft propulsion applications. Booster applications are exclusively in the realm
of chemical propulsion and while alternative propulsion technologies are being evaluated for orbit transfer applications, they are still primarily affected by chemical rocket technologies. In-space propulsion, however, allows us to venture beyond the realm of the chemical rocket.
Satellite fleet operator SES of Luxembourg, whose culture of risk aversion is widely known in the industry, has given a ringing endorsement of Space Exploration Technologies (SpaceX), saying the startup launch service provider's twice-flown Falcon 9 rocket is above any of the other launch vehicles. The positive review of a supplier it has never used is all the more striking considering that it came not
from the SES marketing department (the company has purchased a Falcon 9 launch) but from SES Chief Technology Officer Martin Halliwell. In a May 24 presentation to SES investors, Halliwell said SES's decision to launch its
SES-8 satellite aboard an upgraded version of the current Falcon 9 rocket in March 2013 is "a major step forward, not only for us, but for the industry in general." SES is the first major operator of
geostationary orbiting telecommunications satellites to order a Falcon 9. Hawthorne, Calif.-based SpaceX is obliged to demonstrate the flight worthiness of an upgraded main-stage engine, a larger propellant tank and a wider payload fairing before proceeding with the SES-8 launch. But it is not required to demonstrate a flight to geostationary transfer orbit, where most telecommunications satellites are dropped off in orbit. The satellites then use their own power to climb to final geostationary position 36,000 kilometers over the equator. In return for giving SpaceX a blue-chip name to
add to its manifest, the SES contract was concluded for a price that industry officials said is unbeatable: well under $60 million for a satellite weighing a bit more than 3,000 kilograms. Halliwell said only that SpaceX's current Falcon 9 pricing is less than 60 percent of the price of other operators. In his presentation to investors, Halliwell stressed SES's policy of seeking a broader range of
rockets to choose from to maintain and expand its fleet of 44 satellites. The company has signed multilaunch agreements with Arianespace of Evry, France, for Europe's Ariane 5 rocket, and with International Launch Services (ILS) of Reston, Va., which markets Russia's Proton heavy-lift rocket. Halliwell said if SpaceX falls behind schedule, SES will transfer the SES-8 launch contract to ILS or Arianespace. SES also is willing to launch its satellites with Sea Launch Co. of Long Beach, Calif., which is returning to flight this year following Chapter 11 bankruptcy reorganization. SES's evaluation of launch
vehicles is important because the company is one of the few commercial satellite operators that have the resources to conduct in-depth technical reviews of its satellite and rocket suppliers. Halliwell said the Falcon 9 rocket is human-rated, which puts it above any of the other launch vehicles.
Most of SES's satellites are too big to be launched by the Falcon 9 version to be launched in 2013. But SES is working with Princeton University in the United States on a new-generation electric propulsion system that could permit heavy satellites to become much lighter in the future. SES is funding the Princeton work as part of a two-year project to bring electric propulsion firmly into the commercial market. Several satellite operators have used one or another
version of electric propulsion for years to reduce their satellites' weight. But these are usually satellites that would have trouble finding a launch among today's main commercial-launch vehicles, and they also carry chemical propulsion. Halliwell said SES's work with Princeton on electric
propulsion is designed to permit a large satellite to reduce its weight by up to 50 percent. That weight savings could be used to add more payload or to move from a heavy-lift to a less-expensive medium-lift rocket such as Falcon 9. In addition to what some operators view as its still-untested nature (despite being used on Russian telecommunications satellites for more than two decades), the technology requires more time for a satellite to reach its final destination, typically a month or two instead of just a few days. Halliwell said the Princeton technology should be ready for a flight demonstration within three or four years.
same. Because there is a limited energy release in chemical reactions and because a thermodynamic nozzle is being used to accelerate the combustion gases that do not have the minimum possible molecular weight, there is a limit on the exhaust velocity that can be achieved. The maximum Isp that can be achieved with chemical engines is in the range of 400 to 500 s. So, for example, if
we have an Isp of 450 s, and a mission delta-V of 10 km/s (typical for launching into low earth orbit (LEO)), then the mass ratio will be 9.63. The problem here is that most of the vehicle mass is propellant, and due to
limitations of the strength of materials, it may be impossible to build such a vehicle to just ascend into orbit. Early rocket scientists got around this problem by building a rocket in stages, throwing away the
structural mass of the lower stages once the propellant was consumed. This effectively allowed higher mass ratios to be achieved, and hence a space mission could be achieved with low-Isp engines. This is what all rockets do today, even the Space Shuttle. In spite of the relatively low Isp, chemical engines do have a relatively high thrust-to-weight ratio (T/W). A high T/W (50-75) is necessary for a rocket vehicle to overcome the force of gravity on Earth and accelerate into space. The thrust of the rocket engines must compensate for the weight of the rocket engines, the propellant, the structural mass, and the payload. Although it is not always necessary, a high T/W engine will allow orbital and interplanetary space vehicles to accelerate quickly and reach their destinations in shorter time periods. Nuclear propulsion systems have the
ability to overcome the Isp limitations of chemical rockets because the source of energy and the propellant are independent of each other. The energy comes from a critical nuclear reactor in which
neutrons split fissile isotopes, such as 92-U-235 (Uranium) or 94-Pu-239 (Plutonium), and release energetic fission products, gamma rays, and enough extra neutrons to keep the reactor operating. The energy density of nuclear fuel is enormous. For example, 1 gram of fissile uranium has enough energy to provide approximately one megawatt (MW) of thermal power for a day.
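The 9.63 mass ratio quoted in this card can be checked directly with the ideal (Tsiolkovsky) rocket equation, and the uranium energy-density claim can be sanity-checked the same way. A minimal sketch, assuming standard gravity of 9.81 m/s^2 and roughly 8.2e10 J released per gram of U-235 fissioned (both assumptions, not figures from the card):

```python
import math

# Ideal (Tsiolkovsky) rocket equation: mass ratio = exp(delta_v / (Isp * g0))
g0 = 9.81           # standard gravity, m/s^2 (assumed)
isp = 450.0         # specific impulse of a good chemical engine, s (from the card)
delta_v = 10_000.0  # mission delta-V for launch to LEO, m/s (from the card)

mass_ratio = math.exp(delta_v / (isp * g0))
print(f"mass ratio = {mass_ratio:.2f}")  # ~9.63, matching the card

# Energy-density check: fissioning 1 g of U-235 releases roughly 8.2e10 J (assumed),
# which is close to one megawatt of thermal power sustained for one day.
energy_per_gram_j = 8.2e10
one_megawatt_day_j = 1e6 * 86_400
print(f"megawatt-days per gram of U-235 = {energy_per_gram_j / one_megawatt_day_j:.2f}")
```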
increasing. Many of these missions demand mass-efficient propulsion systems powerful enough to both maintain orbits and propel interplanetary satellites. Additionally, with the increasing amount of Earth satellites on orbit, the need for more precise station keeping is becoming dangerously apparent. This is particularly true in the case of geosynchronous satellites, which are not
perfectly stable. Geosynchronous satellites are being packed in closer to each other. The small amount of drift inherent in nearly all geosynchronous orbits must be precisely countered to prevent these satellites from colliding. To do so requires constant updates to orbital velocity. Also, these satellites are built to endure longer than a typical low Earth orbiting satellite. This is because there is no notable air drag at this altitude, and the satellites are enormously more expensive to put on orbit. Therefore, the orbital maneuvers needed to correct position and velocity are not only frequent
but are required over a long period of time. Such updates add up to a substantial load on the propulsion system. Electric propulsion systems answer this demand with high performance and high efficiency. Chemical propulsion, even at its theoretical maximum, is inadequate for the future of space propulsion. Humble et al. comment, "Of the various methods for generating high-speed reaction mass, electromagnetic techniques offer the only way that, in principle, is not limited by the bond strengths of matter" [1]. Electric propulsion provides more reasonable solutions to the space propulsion
missions. Humble et al. go on to describe electric propulsion theory: "In electric propulsion systems, electromagnetic forces directly accelerate the reaction-mass, so we are theoretically limited only by our ability to apply these forces at the desired total power levels" [1]. Mass efficiency is highly increased in electric propulsion systems due to this method of acceleration, the degree of which is determined by the type of electric system being used.
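To put a number on the mass-efficiency claim, here is a hedged comparison using the same rocket-equation logic. The 2,000 m/s lifetime station-keeping budget and the two Isp values are illustrative assumptions, not figures from the card:

```python
import math

def propellant_fraction(delta_v, isp, g0=9.81):
    """Fraction of initial mass that must be propellant for a given delta-V (rocket equation)."""
    return 1.0 - math.exp(-delta_v / (isp * g0))

delta_v = 2_000.0  # m/s, illustrative lifetime station-keeping budget (assumed)
for label, isp in [("chemical bipropellant", 300.0), ("electric (ion) thruster", 3_000.0)]:
    frac = propellant_fraction(delta_v, isp)
    print(f"{label:24s} Isp = {isp:5.0f} s -> propellant fraction {frac:.1%}")
```

With these assumptions the chemical thruster would need roughly half the spacecraft's initial mass to be propellant, while the electric thruster needs only a few percent, which is the sense in which electric systems are far more mass-efficient for long station-keeping campaigns.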
low specific impulse but enabling engines with very large thrusts, falls short for deep-space and interstellar missions. Although the near interstellar space can be reached using chemical propulsion, aided by gravitational assist, no mission in interstellar space can be performed in a reasonable time without improvements in propulsion.
are liquid at normal temperatures. Some of the bipropellants are hypergolic, meaning that they ignite spontaneously when mixed, requiring no source of ignition. Cryogenic propellants are gases that are liquids only when super-cooled. Cryogenic propellants are, for instance, liquid oxygen and liquid hydrogen. Both oxygen and hydrogen are liquids only at very low temperatures (Oxygen: -360 F; Hydrogen: -422 F). Since these propellants are liquids only at such low temperatures, they are difficult, costly, and dangerous to use. There are other liquid propellants available that are sometimes used as rocket propellants, for instance, Red Fuming
Nitric Acid (oxygen source) and kerosene, or RP-1 (fuel, a hydrocarbon like gasoline) and H2O2 (oxidizer, hydrogen peroxide). But these propellants do not produce enough thrust to be used in large rockets. Other "liquid" propellants are, for example, Dinitrogen Tetroxide (oxygen source) and Hydrazine (fuel, like liquid ammonia). This liquid propellant blend also lacks the power of a cryogenic propellant, and both these liquids must be maintained under high pressure. Hydrazine is a highly toxic carcinogen that is dangerous to handle. Nearly all large liquid-fueled rockets that are currently used for commercial space operations use cryogenic liquid hydrogen and liquid oxygen propellants due to the power these liquid propellants produce.
Liquid propellants fail- they are dangerous, face technical barriers, and interfere with space missions. SPS 5 (Space Propulsion Systems, The MFC Propulsion Program, November 23,
http://www.sps.aero/Propulsion_Program/MFC_Intro.htm, JM) The only new liquid propellant engine to be developed since the early eighties (the Space Shuttle Main engines) is the Aerospike engine, currently under development for use in the Lockheed-Martin Venturestar X-33 Spaceplane. In most cases, the main engines of the new generation Spaceplanes will use cryogenic liquid hydrogen and liquid oxygen as propellant. Liquid propellant motors, particularly those using these cryogenic propellants, will principally be used since they produce a great deal of power and can be throttled, meaning the power produced by the rocket motor can be increased, decreased, or the motor shut off and restarted a number of times. This ability to throttle is critically important for space operations, both during launch and for maneuvering in space, and is the primary reason for using this type of engine. However, all liquid
propellant motors suffer from a number of problems that limit their usefulness for commercial space applications: Liquid, non-cryogenic, bipropellants do not produce the power needed to launch large payloads into space, unless at least one of the two components is a cryogenic propellant such as liquid oxygen; Cryogenic propellants are difficult and dangerous to both handle, as in spaceport-based refueling of spacecraft, and in storage, since they must be supercooled and maintained at extremely low temperatures, and stored under high pressure; Many noncryogenic liquid bipropellants must also be stored and used under high pressures, and some of the best, such as hydrazine, are extremely deadly carcinogenic toxins; Both liquid hydrogen and liquid oxygen are extremely combustible gases, posing severe fire/explosion safety hazards; Liquid propellants require fuel tanks two or more times larger than those used for solid propellants, and due to limitations in spacecraft size/cost, this reduces the amount of cargo and personnel carrying capacity of the launch vehicle; Liquid propellant rocket motors, particularly those using cryogenic propellants, are extremely complex. Liquid propellant feed systems consist of pumps, valves, and both liquid feed and recycle piping. All these components must be lightweight and designed into compact packages. The high degree of complexity increases maintenance requirements and costs, and poses a significant threat to launch safety. With the planned routine, almost daily, operations of space launch vehicles, the likelihood of catastrophic failure of these motors increases; Space-based manufacture of liquid propellants and in-space re-fueling of spacecraft with liquid propellants will both be extremely difficult, dangerous, and costly. Although it is the goal of the commercial space industry to reduce launch costs from the current average cost of $10,000 per pound to low earth orbit to $1,000 per pound within the next 10 years, if commercial launch vehicles were to rely on liquid propellant propulsion systems for either expendable launch vehicles or manned spaceplanes, the likelihood of their achieving this launch cost goal within the projected time frame is small.
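The claim that liquid-propellant tanks must be two or more times larger than solid-propellant casings can be illustrated with approximate bulk densities. The densities and the 6:1 LOX/LH2 mixture ratio below are rough assumptions chosen for this sketch, not values taken from the card:

```python
# Rough tank-volume comparison per tonne of propellant (illustrative densities only)
rho_lox = 1140.0    # kg/m^3, liquid oxygen (assumed)
rho_lh2 = 71.0      # kg/m^3, liquid hydrogen (assumed)
rho_solid = 1800.0  # kg/m^3, typical composite solid propellant (assumed)

mixture_ratio = 6.0  # kg of LOX per kg of LH2, typical for a hydrogen engine (assumed)
mass = 1000.0        # kg of propellant to store

lox_mass = mass * mixture_ratio / (mixture_ratio + 1.0)
lh2_mass = mass - lox_mass
cryo_volume = lox_mass / rho_lox + lh2_mass / rho_lh2
solid_volume = mass / rho_solid

print(f"cryogenic LOX/LH2 volume: {cryo_volume:.2f} m^3 per tonne")
print(f"solid propellant volume:  {solid_volume:.2f} m^3 per tonne")
print(f"volume ratio: {cryo_volume / solid_volume:.1f}x")
```

Under these assumptions the cryogenic combination needs several times the tank volume of an equal mass of solid propellant, mostly because liquid hydrogen is so light for its bulk.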
propellants are exposed to each other. This exposure prevents manufacturing of solid propellants with the best fuels and oxidizers, since the fuel and oxidizer would spontaneously ignite on contact, or produce a highly unstable and dangerous propellant; Many desirable oxidizers are sensitive to the presence of moisture (high humidity). If propellants are made using moisture sensitive materials, they can become unstable, leading to misfires, erratic performance of the motor, or even spontaneous combustion of the propellant within the motor during storage or handling; The manufacture of solid propellants is very dangerous, and must be done at a remote location. The solid rocket motors must be made in pieces (booster segments) and transported over long distances to the launch site for final assembly into a launch vehicle. This process is both extremely costly and dangerous. NO CURRENT SOLID ROCKET MOTOR CAN BE STARTED, STOPPED, AND THROTTLED USING TODAY'S SOLID ROCKET PROPELLANT TECHNOLOGY. NO SOLID ROCKET MOTOR CAN BE DESIGNED TO PERFORM ANY OF THESE FUNCTIONS USING CURRENT TECHNOLOGY. Once ignited, they must burn to completion. This is a very serious limitation for their widespread use as main propulsion systems in expendable launch vehicles and spaceplanes.
agency rooted out their causes and dealt out new safety plans before again launching astronauts into space. It took more than two years following both the Challenger and Columbia accident before NASA launched another shuttle - most recently with last year's STS-114 flight aboard Discovery on a test flight which proved that still more work was needed to prevent fuel tank debris at liftoff. STS-121 Commander Looks Toward Launch "The anniversaries remind
us that we can never be complacent about anything," astronaut Steven Lindsey, commander of NASA's next shuttle flight STS-121, told SPACE.com. "[They] help us remind each other, each year, to refocus...because the
next several years, that's all we're going to think about, but what about 10 years from now? If we've been successful for 10 years and haven't had an accident, that's what you worry about.
"We've got to pay attention to the past so that we don't repeat it," Lindsey said. Lindsey's STS-121 mission, currently set to launch no early than May 3, will mark NASA's second shuttle flight since the Columbia disaster and complete a series tests designed to increase shuttle safety.
Until failsafe launch mechanisms are developed, spaceflight will inevitably lead to malfunctions. Malik 6 (Tariq, Staff Writer, space.com, January 27, http://www.space.com/1990-remembering-challenger-shuttle-disasterrefocus-nasa.html, JM) The very public loss of Challenger and Columbia were vivid reminders of the risks inherent to human spaceflight, astronauts said. "There's been a perception for as long as I've been in the program until this recent accident that spaceflight's routine, that's the public perception," said Lindsey, who joined NASA's astronaut corps in 1995. " It wasn't until I came here and started getting involved that I realized how close to the edge we always are when we fly this, and recognize the inherent danger in what we do. It's not routine ." But the results, including scientific research, unexpected spin-offs and pushing the boundaries of human exploration are worth the risk, the astronaut added. "I think that you could wake up in the morning, and until you go to bed at night, and even while you sleep, wherever you are, you could look at multiple things that came out of the space program," Lindsey said. "It impacts everything that we do." Some space experts believe that, statistically, another spaceflight accident will occur
in the future, forcing NASA or another space agency to once again take a close look at the processes and the risks involved in human spaceflight. NASA's chief also said that the progress of human
spaceflight will likely suffer painful setbacks, much like the early air industry, adding that the lessons learned from each experience will lead to safer craft. "I know that in the course of this, there will be other opportunities to learn, and they will be sober opportunities surrounded with black crepe," Griffin said. "But we will learn in the same way that the nation and the world learned how to do air transport, and it will be difficult." Risk will always go hand-in-hand with human spaceflight, Lindsey added. "If we want a completely safe program, then we shouldn't fly at all," the shuttle commander said. "Because there's no such thing."
major malfunction or mishap could affect support for President Barack Obama's plan to shift responsibility for ferrying astronauts to the International Space Station to the commercial space sector. "If they blow up on the pad, Obama's lost it," space policy expert Roger Handberg, a political scientist at the University of Central Florida, said of the administration's chances of getting the proposal through Congress.
**NUCLEAR**
propelled rockets would shorten voyages in space. "Project Prometheus will develop the means to efficiently increase power for spacecraft, thereby fundamentally increasing our capability for solar system exploration," says NASA.
Unique link- only a commitment to go back to Mars will cause nuclear rocket use. Madrigal 9 (Alexis, Staff @ Wired Science, 11/3, http://www.wired.com/wiredscience/2009/11/nuclear-propulsion-inspace/)
There were several attempts to resurrect nuclear propulsion of various types, most recently the mothballed Project Prometheus. None, though, have garnered much support. One major reason is that NASA
picks its propulsion systems based on its targets, and true exploration of the solar system and beyond hasn't really been a serious goal, the Constellation plans for a return to the moon aside. "The destinations dictate the power system," said Rao Surampudi, a Jet Propulsion Laboratory engineer who works on the development of power systems. By and large, it's
cheaper and easier to go with solar power or very low-power radioisotope generators like the one that powers the Cassini mission. McDaniel agreed that the targets drive things, citing the general decline of pure technology development research at NASA. "Until we commit to going back to Mars, we're not going to have a nuclear rocket," McDaniel said.
**Weaponization
Weaponization Link
The Pentagon will use nuclear rockets for space weaponization GNAWNPS 5 (Global Network Against Weapons & Nuclear Power in Space, 5/31/5,
http://www.thepowerhour.com/news/space_statement.htm) JPG The Pentagon has long maintained they need nuclear reactors in order to provide the enormous power required for weapons in space. In a Congressional study entitled Military Space Forces: The Next 50 Years it was reported that "Nuclear reactors thus remain the only known long-lived, compact source able to supply military forces with electric power...Larger versions could meet multimegawatt needs of space-based lasers....Nuclear reactors must support major bases on the moon..." In an article printed in the Idaho Statesman on April 20, 1992 military officials stated "The Air Force is not developing [the nuclear rocket] for space exploration. They're looking at it to deliver payloads to space." Considering that NASA says all of their space missions will now be "dual use," meaning every mission will be both military and civilian at the same time, it is important to
ask what the military application of the Project Prometheus will be.
Weaponization Link
NASA nuclear space exploration is a Trojan horse for space militarization Gagnon 3 (Bruce, Coordinator of the Global Network Against Weapons & Nuclear Power in Space
group, 1/27/3, http://www.spacedaily.com/news/nuclearspace-03b.html) JPG
Critics of NASA have long stated that in addition to potential health concerns from radiation exposure, the NASA space nukes initiative represents the Bush administration's covert move to develop power systems for space-based weapons such as lasers on satellites. The military has often stated that their planned lasers in space will require enormous power projection capability and that nuclear reactors in orbit are the only practical way of providing such power. The Global Network Against Weapons & Nuclear Power in Space maintains that just like missile defense is a Trojan horse for the Pentagon's real agenda for control and domination of space, NASA's nuclear rocket is a Trojan horse for the militarization of space. NASA's new chief, former Navy Secretary Sean O'Keefe said soon after Bush appointed him to head the space agency that, "I don't think we have a choice, I think it's imperative that we have a more direct association between the Defense Department and NASA. Technology has taken us to a point where you really can't differentiate between that which is purely military in application and those capabilities which are civil and commercial in nature."
Weaponization Link
Nuclear propulsion guarantees space weaponization- civilian sector will be co-opted by the military Grossman 4 (Karl, Journalism prof @ the State U of NY and author of "Cover Up: What You Are Not
Supposed To Know About Nuclear Power, Earth First Journal, March-April 2004, http://westgatehouse.com/art154.html) JPG Space nuclear power also has boosters among the military, which has been considering space-based weapons--devices that need substantial amounts of power. Additionally, the military has been interested in nuclear-powered rockets. In the late 1980s, an earlier series of nuclear rocket projects was first revived with Project Timberwind, a program to build atomic rockets to loft heavy Star Wars equipment and also for trips to Mars. This kind of "dual use" now runs through all NASA operations, says Bruce Gagnon,
coordinator of the Global Network Against Weapons and Nuclear Power in Space. "Right after Bush swore the new NASA chief into office, O'Keefe told the nation that from now on every mission would be dual use. By that he meant that every mission would carry military and civilian payloads at the same time . This is further evidence that the space program has been taken over by the Pentagon." "Space is viewed today," says Gagnon, "as open territory to be seized for eventual corporate profit" and for US military control. Gagnon speaks of proposals to "mine the sky"--to extract minerals from celestial bodies, with the moon considered a prime source for rare Helium-3. This elemental substance would be brought back to Earth to fuel supposedly cleaner fusion-power reactors. Gagnon says that the US military wants to establish bases in space, including on the moon, to protect these operations and to control the "shipping lanes of the future." "The Bush space plan will be
enormously expensive, dangerous and will create unnecessary conflict as it expands nuclear power and weapons into space," notes Gagnon, "all disguised as the noble effort to hunt for the 'origins of life'."
Challenger exploded.
NASA and the DoD are intertwined
Broad 91 (William, writer @ NYT, 4/3/91, http://www.nytimes.com/1991/04/03/us/secret-nuclearpowered-rocket-being-developed-for-star-wars.html) JPG
While currently run by the Defense Department, the effort is being quietly evaluated by the National Aeronautics and Space Administration, which is considering nuclear reactors to power a manned mission to Mars.
**Accidents
Accidents
There is a ten percent chance of nuclear rocket accidents- the plan only increases that number Gagnon 3 (Bruce, Coordinator of the Global Network Against Weapons & Nuclear Power in Space
group, 1/27/3, http://www.spacedaily.com/news/nuclearspace-03b.html) JPG Included in NASA plans are the nuclear rocket to Mars; a new generation of Radioisotope Thermoelectric
Generators (RTGs) for interplanetary missions; nuclear-powered robotic Mars rovers to be launched in 2003 and 2009; and the nuclear powered mission called Pluto-Kuiper Belt scheduled for January, 2006. Ultimately NASA envisions mining colonies on the Moon, Mars, and asteroids that would be powered by nuclear reactors. All of the above missions would be launched from the Kennedy Space Center in Florida on rockets with a historic 10% failure rate. By dramatically increasing the numbers of nuclear launches NASA also dramatically increases the chances of accident. During the 1950s and 1960s NASA spent over $10 billion to build the nuclear rocket program which was cancelled in the end because of the fear that a launch accident would contaminate major portions of Florida and beyond.
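The card's point that more nuclear launches mean more cumulative accident risk follows from basic probability. A minimal sketch, assuming the quoted 10% per-launch failure rate and treating launches as independent (a simplification, not a claim from the card):

```python
# Cumulative chance of at least one launch failure across a campaign of launches
failure_rate = 0.10  # per-launch failure probability cited in the card

for n in (1, 5, 10, 20):
    p_at_least_one = 1.0 - (1.0 - failure_rate) ** n
    print(f"{n:2d} launches -> P(at least one failure) = {p_at_least_one:.0%}")
```

Under these assumptions, ten launches already carry roughly a two-in-three chance of at least one failure, which is the arithmetic behind "dramatically increasing the numbers of nuclear launches ... dramatically increases the chances of accident."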
Accidents
Nuclear propulsion inevitably causes accidents- empirically proven Grossman 97 (Karl, Journalism prof @ the State U of NY and author of "Cover Up: What You Are Not
Supposed To Know About Nuclear Power, 2/3/97, http://www.flybynews.com/archives/karl/kg9105we.htm) JPG The record of nuclear power in space is poor. The United States has launched 24 nuclear-fueled space devices, including a navigational satellite with plutonium aboard that disintegrated in the atmosphere as it plunged to Earth in 1964. The U.S. failure rate for nuclear-powered space devices has been about 15 percent. The Soviet Union has the same failure rate. The Soviets have sent up more than 30 nuclear-fueled devices, including the Kosmos 954, which littered a broad swath of Canada with radioactive debris when it crashed in 1978. The United States spent some $2 billion of taxpayer money on developing nuclear-powered rockets from 1955 to 1973, but none ever got off the ground. That effort was finally canceled because of the concern that a rocket might crash to Earth. Now we're turning to nuclear power in space -- with its inevitable mishaps -- again. Last year the United States launched the Ulysses plutonium-fueled probe to survey the sun. A December Associated Press dispatch noted, "The Ulysses spacecraft is wobbling like an off-balance washing machine, threatening to cripple the $760-million mission." Fortunately, the probe is not coming
back for an Earth flyby.
Accidents
Other types of propulsion are comparatively better- captures solvency with no risk of accidents Grossman 3 (Karl, Journalism prof @ the State U of NY and author of "Cover Up: What You Are Not
Supposed To Know About Nuclear Power, February 2003, http://www.envirovideo.com/nuclearspacestory.html) JPG "NASA hasn't learned its lesson from its history involving space nuclear power," says Kaku, "and a
hallmark of science is that you learn from previous mistakes. NASA doggedly pursues its fantasy of nuclear power in space. We have to save NASA from itself." He cites "alternatives" to space nuclear power. "Some of these alternatives may delay the space program a bit. But the planets are not going to go away. What's the rush? I'd rather explore the universe slower than not at all if there is a nuclear disaster." Dr. Ross McCluney, a former NASA scientist now principal research scientist at the Florida Solar Energy Center, says NASA's push for the use of nuclear power in space is "an example of tunnel vision, focusing too narrowly on what appears to be a good engineering solution but not on the longer-term human and environmental risks and the law of unintended consequences. You think you're in control of everything and then things happen beyond your control. If your project is inherently benign, an unexpected error can be tolerated. But when you have at your project's core something inherently dangerous, then the consequences of unexpected failures can be great."
Accidents ! Helper
The plan creates a Chernobyl in the sky- spreads radiation across the globe Grossman 4 (Karl, Journalism prof @ the State U of NY and author of "Cover Up: What You Are Not
Supposed To Know About Nuclear Power, Earth First Journal, March-April 2004, http://westgatehouse.com/art154.html) JPG Opponents of using nuclear power in space warn of serious accidents from Project Prometheus. And it's not a matter of the sky falling--accidents have already happened in the use of nuclear power in space. In 1964, there was an accident in which a SNAP-9A, plutonium-powered US satellite fell back to Earth, disintegrating and spreading plutonium over every continent at every latitude. Dr. John Gofman, professor emeritus of medical physics at the University of California-Berkeley, has long linked the SNAP-9A accident to an increased level of lung cancer. Warning of a "Chernobyl in the sky," Dr. Michio Kaku, professor of nuclear physics at the City University of New York, points to alternatives to atomic power in space--among them solar power and long-lived fuel cells. "Some of these alternatives may delay the
space program a bit. But the planets are not going to go away." Indeed, as a result of the SNAP-9A accident, NASA intensified its work on solar energy systems, and its satellites are now powered by solar energy, as is the International Space Station. NASA has a division working on the additional uses of space solar power.
More ev Grossman 3 (Karl, Journalism prof @ the State U of NY and author of "Cover Up: What You Are Not
Supposed To Know About Nuclear Power, February 2003, http://www.envirovideo.com/nuclearspacestory.html) JPG
The Transit 4A's plutonium system was manufactured by General Electric. Then, in 1964, there was a serious accident involving a plutonium-energized satellite. On April 24, 1964, the GE-built Transit 5BN with the SNAP-9A (SNAP for Systems Nuclear Auxiliary Power) on board failed to achieve orbit and fell from the sky, disintegrating as it burned in the atmosphere. The 2.1 pounds of Plutonium-238 (an isotope of plutonium 280 times "hotter" with radioactivity than the Plutonium-239 which is used in atomic and hydrogen bombs) in the SNAP-9A dispersed widely over the Earth. A study titled Emergency Preparedness for Nuclear-Powered Satellites done by a grouping of European health and radiation protection agencies later reported that "a worldwide soil sampling program carried out in 1970 showed SNAP-9A debris present at all continents and at all latitudes." Long connecting the SNAP-9A accident and an increase of lung cancer on Earth has been Dr. John Gofman, professor emeritus of medical physics at the University of California at Berkeley, an M.D. and Ph.D. who was involved in isolating plutonium for the Manhattan Project and co-discovered several radioisotopes.
Accidents Link/Impact
The plan causes nuclear accidents to be inevitable- isn't necessary and kills thousands Grossman 97 (Karl, Journalism prof @ the State U of NY and author of "Cover Up: What You Are Not
Supposed To Know About Nuclear Power, 2/3/97, http://www.flybynews.com/archives/karl/kg9105we.htm) JPG
While getting into position to make a low-level (186-mile high), high-speed (33,000 miles an hour) "flyby" of the Earth, the Galileo plutonium-fueled space probe has gone out of whack. The probe, which is supposed to send us information about Jupiter and its moons, unexpectedly shut down all but essential functions in late March. It took the National Aeronautics and Space Administration 13 days to fix that. Then NASA ordered the probe to unfurl its main communications antenna. But the antenna wouldn't unfurl. Next, on May 2, all but essential functions were lost again. NASA blames the March and May malfunctions on a "stray electronic signal." It still can't figure out why the antenna isn't working. Galileo, with its 50 pounds of plutonium aboard - theoretically enough to give a lethal dose of lung cancer to everyone on Earth - will be buzzing our planet in December, 1992. This "slingshot maneuver" is designed to use the Earth's gravity to give Galileo the velocity to get to Jupiter. It is to be hoped that there will be no foul-ups in Galileo's functioning then, causing it to make what is called an "Earth-impacting trajectory." With the probe just above the Earth's atmosphere on the flyby, it would take
only a small malfunction to cause it to drop and disintegrate, showering plutonium down on Earth. The United States is proceeding rapidly with the nuclearization of space, and the threat we face from Galileo is the kind of danger we will be undergoing constantly if we allow the government to continue to send nuclear hardware into space. If we tolerate Chernobyls in the sky, deadly accidents will be inevitable. Yet this risk is unnecessary. The potential catastrophes are avoidable. After Galileo was launched in 1989, I received, under the Freedom of Information Act, NASA-funded studies declaring that nuclear power was not necessary to generate electricity on the Galileo mission; solar energy would do. The
plutonium on board Galileo is being used not for propulsion but as fuel in generators providing a mere 560 watts of electricity for the probe's instruments - electricity that could be produced instead by solar energy. A decade ago NASA's Jet Propulsion Laboratory concluded: "A Galileo Jupiter-orbiting mission could be performed with a concentrated photovoltaic solar array [panels converting sunlight to electricity] power source without changing the mission sequence or impacting science objectives." Five years ago, another JPL study said that it would take only two to three years to build the alternative solar-power source. Still another JPL report stressed that using the sun for power would cost less than using plutonium. It is humanity's destiny to explore the heavens, but what a folly it will be if in doing this, we needlessly
cause the deaths of tens of thousands of people and contaminate the Earth with deadly plutonium.
renewed emphasis on nuclear power in space "is not only dangerous but politically unwise," says Dr. Michio Kaku, professor of theoretical physics at the City University of New York. "The only thing that can kill the U.S. space program is a nuclear disaster. The American people will not tolerate a Chernobyl in the sky. That would doom the space program." "NASA hasn't learned its
lesson from its history involving space nuclear power," says Kaku, "and a hallmark of science is that you learn from previous mistakes. NASA doggedly pursues its fantasy of nuclear power in space. We have to save NASA from itself." He cites alternatives to space nuclear power. "Some of these alternatives may delay the space program a bit. But the planets are not going to go away. What's the rush? I'd rather explore the universe slower than not at all if there is a nuclear disaster."
**Production D/As
NASA is running out of nuclear fuel needed for its deep space exploration. The end of the Cold War's nuclear weapons buildup means that the U.S. space agency does not have enough plutonium for future faraway space probes, except for a few missions already scheduled, according to a
new study released Thursday by the National Academy of Sciences. Deep space probes beyond Jupiter can't use solar power because they're too far from the sun. So they rely on a certain type of plutonium, plutonium-238. It powers these spacecraft with the heat of its natural decay. But plutonium-238 isn't found in nature; it's a byproduct of nuclear
weaponry. The United States stopped making it about 20 years ago and NASA has been relying on the Russians. But now the Russian supply is running dry because they stopped making it, too.
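The physics behind the shortage is that plutonium-238 supplies heat through its natural decay, so a fixed inventory both depletes and slowly loses output over time. A rough sketch, assuming a specific thermal power of about 0.56 W per gram of fresh Pu-238, an 87.7-year half-life, and an illustrative 4.8 kg heat-source load (all assumptions for this example, not figures from the card):

```python
# Thermal output of a Pu-238 heat source as it decays (all figures assumed)
half_life_years = 87.7          # Pu-238 half-life, years
specific_power_w_per_g = 0.56   # approximate thermal power of fresh Pu-238, W/g

def thermal_watts(grams, years):
    """Decay heat remaining after the given number of years."""
    return grams * specific_power_w_per_g * 0.5 ** (years / half_life_years)

load_g = 4800.0  # illustrative heat-source load, grams (assumed, not from the card)
for years in (0, 10, 30):
    print(f"after {years:2d} years: {thermal_watts(load_g, years):,.0f} W thermal")
```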
U.S. and Russia plutonium 238 supplies low now Berger 8 (Brian, Staff Space News, 3/6, http://www.space.com/5054-plutonium-shortage-thwart-future-nasa-missions-outerplanets.html, accessed 6/23, AL)
"In the future, in some future year not too far from now, we will have used the last U.S. kilogram of plutonium-238," Griffin said. "And if we want more plutonium-238 we will have to buy it from Russia." Griffin, who has said many times that he finds it unseemly that the United States may have to depend
entirely on Russia to access the space station between the space shuttle's retirement in 2010 and the introduction several years later of the Orion Crew Exploration Vehicle or a commercial alternative, made clear he was no more pleased with the prospect of relying entirely on Russia for flying space missions requiring nuclear power sources. "I think it's appalling," he said. But even the Russian supply might not last for much longer. When the hearing resumed March 6, Griffin told lawmakers Russia has advised the United States that they are down to their last 10 kilograms of plutonium. "We are now foreseeing the end of that Russian line," he said. Griffin also clarified that NASA has
been assured of enough plutonium-238 to do the MSL, a 2013 or 2014 Discovery-class mission and an outer-planets flagship mission targeted for 2016 or 2017. "When those missions are allocated, we have no more," he said. Griffin said absent a national decision to restart production, NASA's planetary
science program would be severely hampered. John Logsdon, executive director of the Space Policy Institute at George Washington University here, said not restarting plutonium-238 production puts the U.S. space program in an undesirable position of vulnerability. "The major risk is political," Logsdon said. "It begs the question whether Russia is a reliable enough source, under plausible future political scenarios, that we can count on it." Logsdon said the United States also could find itself paying dearly for Russia's remaining supply. "Any monopoly supplier can name their own price," he said.
Contamination D/A
Plutonium production process necessary for fuel contaminates workers Gagnon 3 (Bruce, Coordinator of the Global Network Against Weapons & Nuclear Power in Space
group, 1/27/3, http://www.spacedaily.com/news/nuclearspace-03b.html) JPG Beyond accidents impacting the planet, the space nuclear production process at the DoE labs will lead to significant numbers of workers and communities being contaminated. Historically DoE has a bad track record when it comes to protecting workers and local water systems from radioactive contaminants. During the Cassini RTG fabrication process at Los Alamos 244 cases of worker contamination were reported to the DoE. Serious questions need to be asked: How will workers be protected?
Where will they test the nuclear rocket? How much will it cost? What would be the impacts of a launch accident?
Arsenic and Alpha Emitters are cancer-causing to humans. Arsenic and Alpha Emitters are pulled out of the ground during the mining process, entering the groundwater; people drink the groundwater and
become contaminated. There can be a 5, 10, or 20-year latency period of exposure to Arsenic and Alpha Emitters before cancer develops. CBR proposes 20 more years of Uranium Mining near Crawford, Nebraska. The Cameco, Inc. website states they have a proven reserve of 60 million pounds of Uranium to extract. How much water is that at 9,000 gallons per minute? 24 hours per day, 365 days per year, for 20 more years? What will the number of gallons increase to once the two new Uranium Mines are developed and running? There are about 321 people diagnosed with Diabetes each year on Pine Ridge. Currently, of our 25,000 residents, 10% of our Tribal Members have Diabetes. What will that number be after 20 more years of mining which has the potential of contamination of our groundwater? Our people who are Diabetic patients seem to move to the Dialysis stage of the disease quickly; can this be a result of kidney damage sustained over many, many years of contamination of ingesting even low doses of Arsenic and Alpha Emitters? The homes across the Pine Ridge whose test results revealed an illegal MCL of Arsenic now have filters provided by the Indian Health Service to filter Arsenic out of the water as it comes out of our kitchen faucet to purify the water we drink and cook with, but the water we bathe our children in, wash our clothes with, water our lawns with, and shower with is not filtered. The Arsenic is still pouring into our homes. According to the Indian Health Service official at the Aug 15, 2007 Environmental Health Tech Team meeting, this shouldn't be a concern because you have to drink it to be affected by it. I wonder what scientists from other parts of the world say about that? Western Science is not the only science that studies such matters; a German scientist states he has proof that a low dose over time can have a more dramatic result than previously understood. With the Crow Butte Resources existing mine and two new proposed mines 38 miles to the southeast of Pine Ridge, and the proposed Powertech Uranium Mine 60 miles to the Northwest of Pine Ridge, In Situ Leach Mining
for Uranium has the potential to contaminate all of the groundwater our people depend on for drinking water.
Testing = Radiation
Nuclear rocket testing causes radiation and cancer Rutschman 6 (Avi, writer @ The Acorn, 10/16/6, http://www.calisafe.org/_disc1/40000016.htm) JPG
The Santa Susana Field Laboratory Panel, an independent team of researchers and health experts, released a report last week concluding that toxins and radiation released from the Rocketdyne research facility near Simi Valley could be responsible for hundreds of cancers in the surrounding areas. The Santa Susana Field Laboratory was built in 1948 by North American Aviation and consists of 2,850 acres in eastern Ventura County. Over the years, it has been used as a test site for experiments involving nuclear reactors, high-powered lasers and rockets. The report was completed by experts in the fields of reactor accident analysis, atmospheric transport of contaminants, hydrology and geology. The study took five years to complete and was funded by the California Environmental Protection Agency. "We want to thank the many legislatures that have attended meetings, provided funds and pressured public agencies into action," said Marie Mason, a community activist and longtime resident of the Santa Susana Knolls area in Simi Valley, who helped to form the advisory panel. The panel originally formed 15 years ago after a 1959 nuclear meltdown that occurred at the Santa Susana Field Laboratory was made public. Concerned about the possibility of facing adverse health effects due to the meltdown, area residents pressured legislators into funding a panel to study the impact of the incident. "We were fearful of what our families and communities may have been exposed to," said Holly Huff, another community member who pushed for the formation of the panel. The first study conducted by the panel was performed by UCLA researchers and focused on the adverse health effects the meltdown had on Rocketdyne employees. Completed in 1997, that report indicated workers did indeed suffer a higher rate of lymph system and lung cancers. Boeing, the current owner of the Santa Susana Field Laboratory, has challenged the validity of the studies, calling into question the scientific methods used by researchers. "We received a summary of the report Thursday, and we were not given an advance copy to look through and prepare with," said Blythe Jameson, a Boeing spokesperson. "Based on our preliminary assessment," Jameson said, "we found that the report has significant flaws and that the claims are baseless, without scientific merit, and a grave disservice to our employees and the community." After the UCLA study concluding that laboratory workers had faced adverse health effects because of the meltdown, the panel was given federal and state funds to conduct another study of potential impacts on neighboring communities and their residents. According to the panel, Boeing was unwilling to disclose a large amount of data concerning the accident and certain operations. This forced the researchers to base some of their studies on models of similar accidents. "One simply does not know with confidence what accidents and releases have not been disclosed, nor what information about the ones we do know of also has not been revealed," the panel stated in its report. After five years of research, the panel concluded that between 260 and 1,800 cancer cases were caused by the field laboratory's contamination of surrounding communities. The incident released levels of cesium-137 and iodine-131, radio nucleotides that act as carcinogens that surpass the amount of contaminants released during the Three Mile Island incident.
The report also stated that other contaminants have escaped, and still could, from the Boeing-owned laboratory through groundwater and surface runoff.
Politics Links
Bipartisan consensus against nuclear power- Japan meltdowns Broder 11 (John, writer @ NYT, 3/13/11,
http://www.nytimes.com/2011/03/14/science/earth/14politics.html) JPG The fragile bipartisan consensus that nuclear power offers a big piece of the answer to America's energy and global warming challenges may have evaporated as quickly as confidence in Japan's crippled nuclear reactors. President Obama is seeking tens of billions of dollars in government insurance for new nuclear reactor construction. Senator Joseph I. Lieberman wants to put the brakes on nuclear construction for now while studying what happened in Japan. Until this weekend, President Obama, mainstream environmental groups and large numbers of Republicans and Democrats in Congress agreed that nuclear power offered a steady energy source and part of the solution to climate change, even as they disagreed on virtually every
other aspect of energy policy. Mr. Obama is seeking tens of billions of dollars in government insurance for new nuclear construction, and the nuclear industry in the United States, all but paralyzed for decades after the Three Mile Island accident in 1979, was poised for a comeback. Now, that is all in question as the world watches the unfolding crisis in Japan's nuclear reactors and the widespread terror it has spawned.
Politics Links
Nuclear power in space is politically and publicly unpopular Gagnon 3 (Bruce, Coordinator of the Global Network Against Weapons & Nuclear Power in Space
group, 1/27/3, http://www.spacedaily.com/news/nuclearspace-03b.html) JPG NASA's expanded focus on nuclear power in space "is not only dangerous but politically unwise," says Dr. Michio Kaku, professor of nuclear physics at the City University of New York. "The only thing that can kill the U.S. space program is a nuclear disaster. The American people will not tolerate a Chernobyl in the sky." "NASA hasn't learned its lesson from its history involving space nuclear power," says Kaku, "and a hallmark of science is that you learn from previous mistakes. NASA doggedly pursues its fantasy of nuclear power in space." Since the 1960s there have been eight space nuclear power accidents by the U.S. and the former Soviet Union, several of which released deadly plutonium into the Earth's
atmosphere. In April, 1964 a U.S. military satellite with 2.1 pounds of plutonium-238 on-board fell back to Earth and burned up as it hit the atmosphere spreading the toxic plutonium globally as dust to be ingested by the people of the planet. In 1997 NASA launched the Cassini space probe carrying 72 pounds of plutonium that fortunately did not experience failure. If it had, hundreds of thousands of people around the world could have been contaminated.
Nuclear propulsion unpopular with public- Cassini proves Lemos 7 (Robert, Staff @ Wired Science, 9/20, http://www.wired.com/science/space/news/2007/09/space_nukes
Yet, concerns
that an accident at launch would expose people to radioactivity have caused some citizens to staunchly oppose the technology. In 1997, public outcry over the use of 73 pounds of plutonium almost scrapped the Cassini mission, a probe which is now delivering stunning vistas and scientific data from Saturn. In 2006, NASA launched the New Horizons mission to Pluto and the outer solar system, but the radioactive material required to power the probe resulted in a lot of political hand-wringing, said Todd May, deputy associate administrator for NASA's Science Mission Directorate, who worked
on the New Horizons mission. "The stack of documents that it took to launch that small amount of plutonium on the New Horizons mission was enormous," May said.
Appropriations committee specifically doesn't want more production Smith 9 (Marcia, writer @ Space Policy Online, 8/8/9,
http://www.spacepolicyonline.com/pages/index.php?option=com_content&view=article&id=305:houseand-senate-cut-plutonium-production-funding-imperling-nasa-space-science-missionplans&catid=67:news&Itemid=27) JPG The Senate Appropriations Committee report (S. Rept. 111-45) expressed similar reservations. "The Committee recommends no funding for this program at this time. The Committee understands the importance of this mission and the capability provided to other Federal agencies. However, the Department's proposed plutonium reprocessing program is poorly defined and lacks an overall mission justification as well as a credible project cost
estimate. Sustaining the plutonium mission is a costly but an important responsibility. The Committee expects the Department to work with other Federal agency customers to develop an equitable and appropriate cost sharing strategy to sustain this mission into the future."
Appropriations committee controls Congress- specifically discretionary spending like the plan Alarkon 10 (Walter, writer @ The Hill, 5/14/10, http://thehill.com/homenews/house/97995-cbc-couldsee-its-spending-clout-increase-next-congress) JPG By having clout on the Appropriations Committee, the CBC would have a greater voice to be
able to push their priorities, said CBC Chairwoman Barbara Lee (D-Calif.). "It's about equity in our federal resources," she told The Hill. Seniority on the Appropriations Committee is a sought-after commodity because of the power the panel wields over the federal budget. Discretionary spending measures -- including those funding wars and each government agency -- are typically considered by the House and Senate Appropriations Committees before they come up for full votes on either chamber. Each federal agency's budget request is first considered by a subcommittee, making the subcommittee chairmen -- known on Capitol Hill as "cardinals" -- far more powerful than junior appropriators. Federal discretionary spending for 2010, excluding the $33 billion in Iraq and Afghanistan war funding expected to pass this month, is expected to be $1.4 trillion. The influence of appropriations can be seen by looking at the list of congressional leaders; Speaker Nancy Pelosi (D-Calif.), Senate Majority Leader Harry Reid (D-Nev.), Senate Majority Whip Dick Durbin (D-Ill.) and Senate Minority Leader Mitch McConnell (R-Ky.) have all been appropriators.
Producing more is politically unpalatable and the public hates it
O'Neill 9 (Ian, writer @ Universe Today, 5/8/9, http://www.universetoday.com/30610/nasa-is-runningout-of-plutonium/) JPG
So the options are stark: Either manufacture more plutonium or find a whole new way of powering our spacecraft without radioisotope thermal generators (RTGs). The first option is bound to cause some serious political fallout (after all, when there are long-standing policies in place to restrict the production of plutonium, NASA may not get a fair hearing for its more peaceful applications) and the second option doesn't exist yet. Although plutonium-238 cannot be used for nuclear weapons, launching missions with any kind of radioactive material on board always causes a public outcry (despite the most stringent safeguards against
contamination should the mission fail on launch), and hopelessly flawed conspiracy theories are inevitable. RTGs are not nuclear reactors; they simply contain a number of tiny plutonium-238 pellets that slowly decay, emitting alpha particles and generating heat. The heat is harnessed by thermocouples and converted into electricity for on board systems and robotic experiments.
**Nuclear Bad**
A nuclear thermal propulsion system can only operate for minutes Iannotta 2 (Ben, Aerospace America, http://www.aiaa.org/aerospace/Article.cfm?issuetocid=244&ArchiveIssueID=29,
accessed 6-27, JG)
A nuclear-thermal propulsion system would be more powerful, but its specific impulse would be 600-700 sec. Although the thermal propulsion system would provide a sudden burst of acceleration, it would operate for minutes compared to years for the nuclear-electric system. "Think of the nuclear thermal system as a gas-guzzling V8 engine and the nuclear-electric system as a V4 economy car
that could run for 10 years on a tank of gas," Taylor says.
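The "minutes versus years" contrast can be made concrete with a burn-time estimate: burn time equals propellant mass divided by mass flow rate, and mass flow rate equals thrust divided by Isp times g0. Every number in the sketch below (thrust levels, Isp values, propellant loads) is an illustrative assumption, not data from the card:

```python
# Burn-time comparison: burn time = propellant mass / (thrust / (Isp * g0))
g0 = 9.81
SECONDS_PER_YEAR = 3.15e7

systems = [
    # (label, thrust in N, Isp in s, propellant mass in kg) -- all assumed
    ("nuclear-thermal ", 300_000.0, 650.0, 30_000.0),
    ("nuclear-electric", 1.0, 5_000.0, 3_000.0),
]

for label, thrust, isp, prop_mass in systems:
    mdot = thrust / (isp * g0)      # propellant mass flow rate, kg/s
    burn_time_s = prop_mass / mdot  # total time the engine can thrust
    print(f"{label}: {burn_time_s / 60:,.0f} minutes "
          f"(about {burn_time_s / SECONDS_PER_YEAR:.2f} years) of thrust")
```

With these assumed figures the high-thrust nuclear-thermal engine exhausts its propellant in roughly ten minutes, while the low-thrust nuclear-electric system can keep thrusting for several years, which is the gas-guzzler versus economy-car comparison in the card.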
Too many barriers to solving Sharma 7 (Rahul, Lethbridge Journalism, http://www.lurj.org/article.php/vol1n2/chariots.xml, accessed 6-27, JG)
From the point of view of mass and flight time, Nuclear Thermal Propulsion (NTP) may well represent the best technology for human exploration beyond the Earth-Moon system. However, although it is well understood in concept, there is no program currently developing NTP flight systems (in contrast to chemical, SEP, and NEP). Thus NTP is a technology for which the entire burden of investment and advocacy would need to be borne by the human exploration program. In addition, there are serious environmental issues and infrastructure
investments that would need to be addressed to enable development and testing of NTP technology. Ground tests of NTP rockets would produce effluent gases for which new handling and cleaning facilities would be required. These investments and political concerns are a significant hurdle, and so we assert that the preferred solution is to establish workable first-generation human
exploration architecture without relying on NTP.
are simple but the execution can be complicated. NTP works on the same concept as a hydrogen rocket. The material that makes thrust is heated by a heat source. In this case it is a nuclear reactor. The sheer energy this system can produce when properly managed can exceed that of normal rocket systems. Unfortunately this type of propulsion is highly inefficient as the temperatures needed to make it truly effective would actually melt any known material now used to make rockets. To prevent this, the engine would have to lose 40% of its efficiency.
Lack of testing ensures mission failure and tanks solvency Powell et al. 4 (James, Plus Ultra Technologies, January,
http://www.aiaa.org/aerospace/Article.cfm?issuetocid=445&ArchiveIssueID=46, accessed 6-27, JG) Technical risk is another factor to consider in assessing nuclear propulsion systems. Unlike sensors and electronics, nuclear propulsion does not allow use of a redundant or back-up nuclear propulsion system. There is only one reactor, and it, together with the associated hardware, must function reliably during the entire mission. Going ahead with missions without having fully demonstrated propulsion system reliability, and without extensive long-term testing, risks mission failure.
Howe & O'Brien 10 (Steven and Robert, U.S. Department of Energy, September,
http://www.inl.gov/technicalpublications/Documents/4731768.pdf, accessed 6-27, JG) The second issue has been studied by several review groups and by NASA for the past two decades. In
2008, a NASA-supported team of government and industry participants spent several months designing a Fission Surface Power (FSP) system for the moon and estimating the cost of development [21]. The study estimated that the FSP would cost under $2B. This estimate encompassed three main categories: 1) reactor system development, 2) qualification of the system for space, and 3) alteration of facilities and security at the Kennedy Space Center to handle the system. The FSP estimate did not include any costs for ground-based testing of a full power system nor any fuel development costs. The most recent estimate by NASA is that development of a NTR would cost around $3-3.5 B. This is consistent with the FSP estimate in that fuel development and ground testing of the NTR will increase the costs. In all, the costs are modest compared to the savings in launch costs, the improvement in mission performance, and the reduction in mission risk.
determining the price of nuclear testing and development, especially NEP systems, is whether or not reactors and thrusters have to be tested for the full operational lifetime. If so, NEP testing will require enormous facilities capable of monitoring a reactor for a decade or more of operation, and ensure safety in case of mishap. Funding must be approved by Congress, and because of the long time scale of the project, it is repeatedly subject to being cut as new politicians are elected and administrations change.
Testing failure tanks solvency Powell et al. 4 (James, Plus Ultra Technologies, January,
http://www.aiaa.org/aerospace/Article.cfm?issuetocid=445&ArchiveIssueID=46, accessed 6-27, JG) Technical risk is another factor to consider in assessing nuclear propulsion systems. Unlike sensors and electronics, nuclear propulsion does not allow use of a redundant or back-up nuclear propulsion system. There is only one reactor, and it, together with the associated hardware, must function reliably during the entire mission. Going ahead with missions without having fully demonstrated propulsion system reliability, and without extensive long-term testing, risks mission failure.
Here's more testing evidence: long-term problems without testing Powell et al. 4 (James, Plus Ultra Technologies, January,
http://www.aiaa.org/aerospace/Article.cfm?issuetocid=445&ArchiveIssueID=46, accessed 6-27, JG) Development time for NEP is likely to be considerably longer than for NTP, for two reasons. First, to ensure reliability, systems must be tested for periods comparable to their anticipated operating time. Testing an integrated NEP system for two or three years will not prove that the system can operate reliably in space for 12 years. Questions about long-term behavior of high-burnup nuclear fuel at elevated temperatures, corrosion effects, fatigue and mechanical failure, and coolant leaks from piping and radiators, for example, cannot be resolved without long-term testing.
Electric Propulsion. This works on the concept of using electrical power to heat the rocket propellant. The main design concept now in use for this type of propulsion is the Radioisotope Thermoelectric Generator. The generator is powered by the decay of radioactive isotopes. The heat generated by the isotopes is captured by thermocouples which convert this
heat to the electricity needed to heat rocket propellants.
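Because an RTG's output is set entirely by radioactive decay, its power falls off exponentially. The sketch below assumes Pu-238's published half-life of about 87.7 years plus an illustrative launch-time thermal output and thermocouple efficiency (neither figure is from the card), and it ignores thermocouple degradation, which in practice makes electrical output fall faster than the bare decay curve.

# Minimal sketch of RTG power decay, assuming plutonium-238 fuel.
PU238_HALF_LIFE_YEARS = 87.7      # published Pu-238 half-life
THERMAL_WATTS_AT_LAUNCH = 2400.0  # assumed illustrative value, not from the card
THERMOCOUPLE_EFFICIENCY = 0.06    # assumed ~6% heat-to-electricity conversion

def rtg_electrical_watts(years_after_launch):
    """Electrical output after a given number of years, decay only."""
    remaining = 0.5 ** (years_after_launch / PU238_HALF_LIFE_YEARS)
    return THERMAL_WATTS_AT_LAUNCH * remaining * THERMOCOUPLE_EFFICIENCY

# A Voyager-like timeline: launch in 1977, still operating in 2020.
for year in (0, 43):
    print(f"{year:2d} years after launch: ~{rtg_electrical_watts(year):5.1f} W electrical")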
Nuclear space exploration leads to an international space race Smith 3 (Wayne Smith, CIP Senior Fellow, 1-28, http://www.spacedaily.com/news/nuclearspace-03d.html, accessed 6-27, JG)
Little response was generated overseas as nuclear power in the form of RTG's (Radioisotope Thermionic Generators) for space probes and satellites is nothing new. However, the latest announcement places
nuclear power at the forefront of future space development. Spacefaring nations such as the European Union and Russia cannot ignore this challenge. In particular the newest emerging superpower, China, will closely watch how events unfurl. In just over three years, China has gone from satellite
launches to planning a human spaceflight in October of this year. This remarkably rapid advancement was spurred by the realization of the strategic importance of space. Space will be central to tomorrow's world order and national security dictates that a space presence is a sign of strength. Huang Chunping, commander-in-chief of the Chinese Shenzhou space launch program has said, "Just imagine, there are outer space facilities of another country at the place very, very high above your head, and so others clearly see what you are doing, and what you are feeling. That's why we also need to develop space technology." Clearly the Chinese have more on their minds than national prestige in attempting to become the third nation to ever have launched a man into space. Manned aerospace is the epitome of space technology. National prestige is clearly an important consideration, and one which westerners can easily relate to as they fondly reminisce about the moon landings. However, the military implications are just as important, if not greater, a consideration. China has already invested too much money into developing a space launch capability to consider pulling back now. In past interviews, they have announced the intention to build space stations, reach the moon and build bases there, and even boasted they will beat the United States with a manned mission to Mars.
Space racing leads to global perpetual wars Krepon and Clary 3 (Michael, CEO of the Henry L. Stimson Center, Christopher, Research Assistant for the
Weaponization of Space Project @ Stimson Center, http://www.stimson.org/wos/pdf/space3.pdf, 5-22, accessed 6-28, JG)
U.S. initiatives to seize the high ground of space are likely to be countered by asymmetric and unconventional warfare strategies carried out by far weaker states, in space and to a greater extent on Earth. In addition, U.S. initiatives associated with space dominance would likely alienate longstanding allies, as well as China and Russia, whose assistance is required to effectively counter terrorism and proliferation, the two most pressing national security concerns of this decade. No U.S. ally has expressed support for space warfare initiatives. To the contrary, U.S. initiatives to weaponize space would likely corrode
bilateral relations and coalition-building efforts. Instead, the initiation of preemptive or preventive warfare in space by the United States based on assertions of an imminent threat, or a threat that cannot be ameliorated in other ways, is likely to be met with deep and widespread skepticism abroad. The international community has long been aware of latent threats to satellites residing in military capabilities designed for other purposes. Common knowledge of such military capabilities designed for other means has not generated additional instability in crisis or escalation in wartime. The flight-testing and deployment of dedicated space weaponry would add new instability in crisis and new impulses toward escalation. It would be folly to invite these consequences unless it is absolutely necessary to do so. Space warfare, far more than terrestrial combat, does not lend itself to the formation of coalitions of the willing. U.S. initiatives to weaponize space could therefore result in a lonely journey that leads to war without end and to war without friends. The burdens and risks placed upon the shoulders of U.S. expeditionary forces would be exceedingly great. In addition, the quest for space dominance would undoubtedly accentuate domestic political
Nuclear-electric propulsion costs billions and takes 20 years Moomaw 3 (Bruce, writer @ SpaceDaily, 1/21/3, http://www.spacedaily.com/news/rocketscience03a1.html) JPG Nuclear-electric exploration of the Solar System has tremendous scientific potential in the middle-range future -- and such reactors would use uranium-235, which is far more expensive than plutonium but also
thousands of times less radioactive when a reactor is shut down, thus being virtually totally safe to launch into orbit. But developing such miniature spacegoing reactors, as mentioned, will still be a difficult task, costing one or two billion dollars -- and there is simply no unmanned Solar System scientific mission planned for flight within the next 15 to 20 years that needs such a powerful propulsion system badly enough to be worth that expense in such a short time.
Small outages have a cascading effect throughout the grid Glauthier 3 (T. J., President & CEO of the Electricity Innovation Institute, 9/21/3, "LIGHTING UP THE
BLACKOUT: TECHNOLOGY UPGRADES TO AMERICA'S ELECTRICAL SYSTEM" lexis/nexis) JPG
I sincerely appreciate the opportunity to address this distinguished Committee on a subject about which we are all concerned.
The electric power system represents the fundamental national infrastructure, upon which all other infrastructures depend for their daily operations. As we learned from the recent Northeast blackout, without
electricity, municipal water pumps don't work, vehicular traffic grinds to a halt at intersections, subway trains stop between stations, and elevators stop between floors. The August 14th blackout also illustrated how vulnerable a regional power network can be to cascading outages caused by initially small--and still not fully understood--local problems. In response to the Committee's request, my testimony today provides some of EPRI's and E2I's views on technology issues that require further attention to improve the effectiveness and reliability of the nation's interconnected power systems. This testimony will be supplemented with a matrix table as requested by the Committee. Context for power reliability Power system reliability is the product of many activities--planning, maintenance, operations, regulatory and reliability standards--all of which must be considered as the nation makes the transition over the longer term to a more efficient and effective power delivery system. While there are specific technologies that can be more widely applied to improve reliability both in the near- and intermediate-term, the inescapable reality is that there must be more
than simply sufficient capacity in both generation and transmission in order for the system to operate reliably. The emergence of a competitive market in wholesale power transactions over the past decade has
consumed much of the operating margin in transmission capacity that traditionally existed and helped to avert outages. Moreover, a lack of incentives for continuing investment in both new generating capacity and power
delivery infrastructure has left the overall system much more vulnerable to the weakening effects of what would normally be low-level, isolated events and disturbances.
Blackouts cost the economy 30 Billion Dollars PER DAY. Just a few days of outage bring economic growth down to ZERO Bryan 3 (Jay, writer @ The Gazette, Power grids vital in information age: "Just a few days could
theoretically take economic growth ... right down to zero", lexis/nexis) This worsened the already-anemic state of a U.S. economy that had been hammered by a massive stock-market meltdown and a series of confidence-sapping corporate scandals. It hurt Canada, too, weakening our biggest market. So now, just when there are signs of healthy growth in both countries, is the last time you'd want to see a large part of the continent's electric-power network collapse. We can be grateful that the immediate impacts look modest. David Rosenberg, chief North American economist with Merrill Lynch, estimates that the U.S. impact could amount to as much as $30 billion for each day of interrupted activity. That's roughly one percentage point of quarterly economic growth, which means that just a few days could theoretically take economic growth in the third quarter right down to zero. But this is just the first step in his analysis. In reality, most activity was returning to something close to normal by yesterday. More important, Rosenberg says, any losses in August are likely to be recouped in September, much as economic activity rebounds to wipe out most losses after a severe winter storm. But even if we do look back on the great blackout of '03 as a mere hiccup for the economy, there will be little reason for complacency. As Royal Bank economist John Anania notes, the reliability of the power grid is
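Rosenberg's figure can be sanity-checked with simple arithmetic. The short sketch below assumes a 2003 U.S. quarterly GDP of roughly $2.75 trillion (an approximation, not a number from the card) and shows that $30 billion per day of lost activity is on the order of one percent of a quarter's output, which is why a few days of outage could erase a quarter's growth.

# Back-of-envelope check of the "$30 billion per day" blackout estimate.
DAILY_LOSS_USD = 30e9            # from the card
QUARTERLY_GDP_USD = 2.75e12      # assumed approximate U.S. quarterly GDP in 2003

for days in (1, 3):
    share = days * DAILY_LOSS_USD / QUARTERLY_GDP_USD
    print(f"{days} day(s) of outage ~= {share:.1%} of quarterly GDP")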
Economic collapse causes nuclear war - extinction Broward 9 (Member of Triond, http://newsflavor.com/opinions/will-an-economic-collapse-kill-you/
AD: 7-7-09) ET Now it's time to look at the consequences of a failing world economy. With five official nations having nuclear weapons, and four more likely to have them, there could be major consequences of another world war. The first thing that will happen after an economic collapse will be war over resources. The United States currency will become useless and will have no way of securing reserves. The United States has little to no capacity to produce oil, it is totally dependent on foreign oil. If the United States stopped getting foreign oil, the government would go to no ends to secure more, if there were a war with any other major power over oil, like Russia or China, these wars would most likely involve nuclear weapons. Once one nation launches a nuclear weapon, there would of course be retaliation, and with five or more countries with nuclear weapons there would most likely be a world nuclear war. The risk is so high that acting to save the economy is the most important issue facing us in the 21st century.
Uq No Plutonium
NASA is out of plutonium - it's out of alternatives O'Neill 9 (Ian, writer @ Universe Today, 5/8/9, http://www.universetoday.com/30610/nasa-is-runningout-of-plutonium/) JPG
Decommissioning nuclear weapons is a good thing. But when our boldest space missions depend on surplus nuclear isotopes derived from weapons built at the height of the Cold War, there is an obvious problem. If we're not manufacturing any more nuclear bombs, and we are slowly decommissioning the ones we do have, where will NASA's supply of plutonium-238 come from? Unfortunately, the answer isn't easy to arrive at; to start producing this isotope, we need to restart plutonium production. And buying plutonium-238 from Russia isn't an option, NASA has already been doing that and they're running out too. This situation has the potential of being a serious limiting factor for the future of spaceflight beyond the orbit of Mars. Exploration of the inner-Solar System should be OK, as the strength of sunlight is substantial, easily powering our robotic orbiters, probes and rovers. However, missions further afield will be struggling to collect the meagre sunlight with their solar arrays. Historic missions such as Pioneer, Voyager, Galileo, Cassini and New Horizons would not be possible without the plutonium-238 pellets. So the options are stark: Either manufacture more plutonium or find a whole new way of powering our spacecraft without radioisotope thermal generators (RTGs). The first option is bound to cause some serious political fallout (after all, when there are long-standing policies in place to restrict the production of plutonium, NASA may not get a fair hearing for its more peaceful applications) and the second option doesn't exist yet. Although plutonium-238 cannot be used for nuclear weapons, launching missions with any kind of radioactive material on board always causes a public outcry (despite the most stringent safeguards against contamination should the mission fail on launch), and hopelessly flawed conspiracy theories are inevitable. RTGs are not nuclear reactors, they simply contain a number of tiny plutonium-238 pellets that slowly decay, emitting alpha particles and generating heat. The heat is harnessed by thermocouples and converted into electricity for on board systems and robotic experiments. RTGs also have astonishingly long lifespans. The Voyager probes for example were launched in 1977 and their fuel is predicted to keep them powered-up until 2020 at least. Next, the over-budget and delayed Mars Science Laboratory will be powered by plutonium-238, as will the future Europa orbiter mission. But that is about as far as NASA's supply will stretch. After Europa, there will be no fuel left.
More ev Dillow 9 (Clay, writer @ Popular Science, 9/29/9, http://www.popsci.com/military-aviation-ampspace/article/2009-09/nasas-plutonium-shortage-threatens-deep-space-exploration) JPG Imagine you're driving across the Mojave Desert, and somewhere in the middle of absolutely nowhere you realize that the next gas station is further away than your car can travel on its current supply of gasoline. What next? That's the problem NASA mission planners are facing as the agency's supply of plutonium-238, the fuel used to power deep space probes like Cassini and surface scouts like the upcoming Mars Science Laboratory, are dwindling. Unfortunately, that leaves NASA in a pretty tight spot: we've depleted our reserves of plutonium-238, and there isn't anywhere to refuel ahead on the horizon either. Plutonium-238
powers spacecraft via heat given off by its radioactive decay. A small pellet, smaller than one's fist, glows red from its own heat and can power equipment in extremely hostile environments like the vacuum of space, where temperatures vary greatly. For missions to the outer planets or the Kuiper belt, where sunlight is a thousand times lower and the temperature near absolute zero, plutonium-238 is the only option, as solar power is too weak to provide an effective charge. But this special brand of plutonium was a byproduct of Cold War activities and hasn't been produced by the U.S. since the '80s (plutonium-239 goes in nuclear warheads, so naturally we keep plenty of that laying around). NASA has launched nearly two dozen missions over the past four decades that were powered by plutonium-238, including the Voyager probes, the Galileo probe that studied Jupiter and its moons, and the Cassini that is currently doing laps around Saturn. Those missions ran on either U.S. reserves of plutonium-238 or excess stock purchased from Russia. But now neither nation is producing the stuff, and even if we started again today, it would take eight years to build up production to the volumes necessary for annual deep space missions.
Timeframe 10 years
Development of the necessary tech takes at least 10 years Moomaw 3 (Bruce, writer @ SpaceDaily, 1/21/3, http://www.spacedaily.com/news/rocketscience03a1.html) JPG
However, this -- to put it mildly -- is not the same thing as saying that NASA plans to try to develop a very large nuclear rocket engine capable of launching a manned ship to Mars within a decade. Pae quotes O'Keefe as saying: "We're talking about doing something on a very aggressive schedule to not only develop the capabilities for nuclear propulsion and power generation but to have a mission using the new technology within a decade." But O'Keefe has spent the past year talking constantly about his hopes for a deep space mission using nuclear-powered propulsion within a decade or so -- while making it clear that he is talking about an unmanned, relatively small probe. NASA's Nuclear Electric Propulsion program -- for which it included $46.5 million in its FY 2003 budget request -- would have been just such a system.
$ Link
A manned nuclear rocket costs tens of billions and takes decades Moomaw 3 (Bruce, writer @ SpaceDaily, 1/21/3, http://www.spacedaily.com/news/rocketscience03a1.html) JPG Such a huge nuclear-powered manned ship would certainly take tens of billions of dollars to develop, and it is utterly ridiculous to say that there is any chance that NASA could develop a manned Mars ship (nuclear-powered or not) quickly enough to launch it within a decade.
**Fusion**
reaction used by fusion research today. Unfortunately, if this method was used to fuel a starship -- such as the Icarus interstellar vehicle -- the deuterium-tritium (D-T) reaction produces high-energy neutrons that transfer heat from the reaction directly to the engine's structure. About 80 percent of the fusion energy released is in the form of those neutrons, so the reaction isn't very healthy (or useful) for a starship. Pure deuterium reactions also produce neutrons, though only about 1/3 of the fusion energy is released as such. That's better than the D-T reaction, but when we're talking about engine powers in the hundreds of gigawatts to terawatts, then such percentages mean gigawatts of heat that must be gotten rid of, adding to the mass of the engines and degrading the overall performance.
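The heat-load problem described above is simple arithmetic once the neutron fractions are fixed. The sketch below uses only the percentages in the card (roughly 80% of D-T fusion energy and about one third of pure-deuterium energy carried by neutrons) and an assumed 100 GW engine power, chosen only to show the scale of waste heat the structure would have to absorb.

# Neutron heat load for a fusion engine, using the fractions quoted in the card.
ENGINE_POWER_GW = 100.0   # assumed illustrative engine power
NEUTRON_FRACTION = {
    "D-T": 0.80,    # ~80% of fusion energy released as neutrons
    "D-D": 1 / 3,   # ~one third released as neutrons
}

for reaction, fraction in NEUTRON_FRACTION.items():
    heat_gw = ENGINE_POWER_GW * fraction
    print(f"{reaction}: ~{heat_gw:.0f} GW deposited in the structure as neutron heat")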
One of the first issues posed by the D-T fusion reaction was how to supply sufficient tritium. Tritium is radioactive, with a relatively short half-life of 12.4 years, and therefore it exists only in minute quantities in nature. Luckily, the neutron emitted in D-T fusion can react with an isotope of lithium to
produce tritium and even release additional energy in the process. Though nothing compares with the vast store of deuterium in seawater, the world's lithium resources are enough for several thousand years of energy production. The lithium-neutron reaction resolves the tritium-supply problem. However, it introduces additional engineering difficulties. The severity of the technical problems associated with the D-T reaction was not fully understood in the early years of the fusion program. But
these difficulties have gradually been revealed by the extraordinarily detailed series of conceptual reactor designs produced under Department of Energy (DOE) funding over the last decade. The object of these studies is to describe a plausible fusion reactor based on the underlying physics and reasonable extrapolations of the technology. Of course, no one can be certain exactly what a D-T fusion reactor will look like. Nevertheless, several difficult questions that might seem to depend on this knowledge can already be answered. In particular: will a fusion reactor be simpler or more complex, cheaper or more expensive, safer or more dangerous, than a
fission reactor? The answers depend only on the broad outlines of future reactors.
The main fusion reaction will take place in a gaslike plasma in which deuterium and tritium atoms are so energetic, so hot, that the nuclei have lost their electrons. The temperature of this gas will probably exceed 150,000,000 C. This plasma cannot be contained by physical walls, not only because no material could withstand the heat, but also because walls would contaminate the plasma. Instead, the plasma will be bottled within a vacuum by magnetic forces,
Four-fifths of the energy from the D-T reaction is released in the form of fast-moving neutrons. These neutrons are 15 to 30 times more energetic than those released in fission reactions. The first wall surrounding the plasma and
vacuum region will take the brunt of both the neutron bombardment and the electromagnetic radiation from the hot plasma. This first wall is expected to be made of stainless steel or, better, one of the refractory metals such as molybdenum or vanadium that retain their strength at very high temperatures. In colliding with this wall, the neutrons will give up some of their energy as heat. This heat must be removed by rapidly circulating coolant to prevent the wall from melting. After being piped out of the reactor, the
heated coolant is used to produce steam and generate electricity. The fusion of deuterium (D) with tritium (T) is 100 to 1,000 times more reactive than the fusion of combinations involving helium 3 (He3), protons (p), or boron 11 (B11). In other words, a D-T based power plant would yield 100 to 1,000 times more energy than an identical plant using the other fuels. That is why almost all research has focused on D-T fusion. However, the energetic neutrons it releases would damage and induce radioactivity in the reactor structure. Many of the collisions between neutrons and
atoms in the first wall actually knock the atoms forming the metal out of their original positions. Each atom in the first wall will, on average, be dislodged from its lattice position about 30 times per year. Obviously, this causes the structure of the metal to deteriorate. A few of the neutrons colliding
with atoms in the first wall will have the beneficial effect of dislodging some neutrons from the atomic nuclei. These dislodged neutrons, plus the original ones generated by the fusion, pass through the wall and into the so called blanket, which contains lithium in some form. Here, the bulk of their energy is used to produce heat, which also is used to create steam for generating electricity, and eventually the neutrons are absorbed by the lithium to breed tritium. Lithium itself poses
serious engineering problems. It is an extremely reactive chemical: it burns violently when it comes in contact with either air or water and is even capable of undergoing combustion with the water contained in concrete. The
lithium may be either in liquid form or in a solid compound. Liquid lithium blankets produce substantially more tritium and allow it to be more easily removed. However, the need to handle large amounts of this metal in liquid form leads to technical complexity and poses safety hazards. The tritium-breeding region has other engineering requirements. It must be designed in such a way that the structural materials, as contrasted with the actual lithium, capture a minimum of neutrons. Also, the operating temperature must be high enough so that the coolant, when piped outside the reactor, can generate steam efficiently. Outside the blanket, powerful magnets must provide the magnetic fields to contain the plasma. These fields will exert enormous forces on the magnets themselves, equivalent to pressures of hundreds of atmospheres. If made from copper wire, these magnets
would consume more power than produced by the reactor, so they will have to be superconducting. Superconducting magnets, cooled by liquid helium to within a few degrees of absolute zero, will be extremely sensitive to heat and radiation damage. Thus, they must be effectively shielded from the heat and radiation of the plasma and blanket. Temperatures within the fusion reactor will range from the highest produced on earth (within the plasma) to practically the lowest possible (within the magnets). The entire structure will be bombarded with neutrons that induce radiation and cause serious damage to materials. Problems associated with the inflammable lithium must be managed. Advanced materials will have to endure tremendous stress from temperature extremes and damaging neutrons. The magnetic fields will exert forces equivalent to those seen only in very high pressure chemical reactors and specialized laboratory equipment. All in all, the engineering will be extremely complex.
A working fusion reactor would also have to be very large. This conclusion is based on fundamental principles of plasma physics and fusion technology. To begin with, because of the properties of magnetic fields, a fusion reactor must be tubular. There is still dispute as to whether this tube should be bent into a toroidal (doughnut) shape, as in the device known as the tokamak, or kept as a long, straight tube with end plugs, as in the device known as the tandem mirror. However, the main conclusions as to the size and complexity of a D-T reactor are independent of this choice. The first wall of the reactor encloses the plasma. The best theories available
Even if a breakthrough in physics were to allow a smaller plasma, separate engineering requirements would prevent the radius of the first wall from being appreciably less than three meters. These requirements arise from
the need to avoid excessive differences in power density.
surrounding the plasma, and the relatively small surface area of this wall cannot be increased without further increasing the size of the reactor. In fact, bigger reactors need larger heat-transfer rates. Thus, the actual heat-transfer rate per square inch must be extremely large and cannot simply be reduced by a design change.
Fusion is unstable, expensive, and nearly impossible to improve Lidsky 83 (Lawrence M., MIT Technology Review, October, p. 7-8,
http://www.askmar.com/Robert%20Bussard/The%20Trouble%20With%20Fusion.pdf, JM) On these counts, a comparison between current LWR fission reactors and the somewhat optimistic fusion designs produced by the DOE studies yields a devastating critique of fusion. For equal heat-transfer rates, the critical inner wall of the fusion reactor is subject to ten times greater neutron flux than the fuel in a fission reactor. Worse, the neutrons striking the first wall of the fusion reactor are far more energetic and thus more damaging than those encountered by components of fission reactors. Even in fission reactors, the lifetimes of both the replaceable fuel rods and the reactor structure itself are limited because of neutron damage. And the fuel rods in a fission reactor are far easier to replace than the first wall of the fusion reactor, a major structural component. The drawbacks of the existing
fusion program will weaken the prospects for other fusion programs, no matter how wisely redirected. But even though radiation damage rates and heat transfer requirements are much more severe in a fusion reactor, the power density is only one-tenth as large. This is a strong indication that fusion would be substantially more expensive than fission because, to put it simply, greater effort would be required to produce less power.
A leading cause for several of the known failure modes is erosion and eventual removal of the discharge-cathode-keeper plate. Keeper erosion is a well-documented wear process for the NSTAR thruster and was tracked photographically during the course of the ELT. Ion-bombardment-sputter-erosion of the keeper plate by the discharge plasma led to the complete removal of the plate after 30,000 hrs of operation. The primary function of the keeper plate is to protect the cathode from discharge-plasma ion-bombardment. As the keeper erodes, it exposes the cathode-orifice plate, heater, and radiation shield to discharge-plasma ion-bombardment. Excessive erosion of the heater may lead to breach of the heater sheath, and therefore heater failure. Heater failure causes cathode failure because without a functioning heater, the cathode cannot be ignited. Keeper erosion can result in orifice-plate removal. During the ELT, following the removal of the keeper, the cathode-orifice-plate-to-tube weld was eroded by discharge-plasma ion-bombardment. If the orifice plate had fallen off, the cathode operation would have ceased. Cathode inability to start due to a cathode-common-keeper short is also a result of wear of the
cathode assembly. The source of the shorting material is likely erosion of the discharge-keeper plate. Although not a failure for the cathode itself, erosion of the radiation shield due to discharge-plasma ion-sputtering led to the formation of tantalum (Ta) flakes large enough that they could lodge themselves between the grids or defocus individual beamlets, causing rogue hole formation. The other primary cathode wear mechanism, not related to excessive keeper erosion, is performance degradation or cathode failure-to-start due to insert depletion. Removal of impregnate material is a temperature- and runtime-dependent process. When impregnate material is removed or is not readily available for diffusion to the surface, electron emission cannot occur and the cathode cannot operate.
are several practical problems which need to be overcome before these devices can become viable. Although these issues appear to present technical obstacles to a working
system, none of them seems to be insoluble in the light of current knowledge. This section outlines two particular areas which need further work in order to make IEC systems an engineering reality. The first of these is the achievement of high density in the active area and the second is good energy reclamation. These two areas are now considered in turn. The importance of a high particle density has already been described in some detail in the preceding sections. There are also some related issues which need some further consideration. One is the isolation of high and low densities in the machine. For the ion
source and acceleration system to work effectively they need to operate in a good vacuum. Stray particles cause unwanted collisions, scattering the beam and resulting in an increase of waste heat. This means that good containment of the target particles is important, and this is the principal reason why ions and plasmas are the main focus of research - both can be effectively contained. As well as being difficult to physically separate from the beam, neutral atoms also contain bound electrons and some of the incident beam energy is used up ionising these. A plasma can be contained, but although the electrons are now separate from their parent nuclei, they are still present and are scattering sites for the beam; they can spill easily into the main vacuum with the deleterious effects already described, and can also carry away heat energy (although some of the reclaim systems mentioned in the section above may ameliorate the problem by recovering
these). Ionic systems therefore have several advantages over neutral ones. The target particle cloud needs to be dense and well contained for the reasons already mentioned. It also needs to be shaped appropriately. If the target is too large
or the wrong shape, then scattered and fused products will undergo further secondary scattering in the cloud with several undesirable consequences. These include the transfer of heat to the cloud,
raising its entropy and removing recoverable energy from the system and the spillage of scattered particles out of the trap and into the main chamber with the results already discussed. This is why the long, thin, sausage-shaped topology described in the sections above is useful. Although a system based on ionic entrapment is in some respects ideal, there are two problems associated with it. The first of these is overcoming the natural coulombic repulsion of the ions in order to gain a dense enough target. This issue has already been discussed. The second is the form of the ion trap necessary to contain a high enough density. Paul and Penning traps tend to enclose the ions in metal structures and this stops the scattered and fusion products escaping freely as required by the energy capture systems. However, as previously discussed, novel trap topologies are available and there is still research to be done, ideally to produce a trap with the field topology shown in Fig. 20. Such a structure should ideally have no physical protrusions into its active region. Although this might not be possible as a static system, dynamic approaches such as standing waves and collapsing field profiles have still to be explored. Other nonlinear field phenomena, for example field arrangements similar to those which cause charge bunching in Gunn diodes, might also be explored. The use of neutral beams was discussed in a previous section as a possible solution to the density issue. However, as already noted, neutral particles are difficult to contain and also use energy in ionisation (although this is small, only 13 eV for a Deuterium atom). Any neutral beam system would probably therefore be pulsed, the individual pulses being sent to reach the reaction area at exactly the same time as the accelerated beam. Scattered and fused components would then be expelled from the centre quickly, due to their inherent velocity, and captured by the retrieval system. The remaining neutral particles would need to be evacuated before a new pulse was initiated. In such a system, timing would be critical. Consider now the practical problems associated with the accelerated ion beam. The technology of ion acceleration is fairly simple; however, if the device is to operate in a pulsed mode there are some added complications. These mostly involve ensuring that the ion pulses arrive at the target with optimal timing - this is critical for cavity efficiency. Ideally the density profile of the beam should be a sinusoidal variation. However, in practice this may be difficult to achieve due to different initial ion velocities. Reducing the variability of ion velocity is a significant way of improving the beam profile, and this can be achieved by sorting the ions before acceleration into a narrow velocity band. This is often done in Ion Scattering Spectroscopy [50] for the same reason. The ions are injected into a curved duct or tube under the influence of a constant magnetic field, only those with exactly the energy required to emerge from the other end without hitting the tube walls are accelerated. The idea is shown in Fig. 21. This is one of several useful techniques which can be adapted from this field. Another practical issue concerns power-management in the system. The energy inputs and outputs can be divided up into two classes - internal and external. This classification differentiates the power produced and consumed within the device from the power delivered-to or drawn-from external sources. In the steady-state the
system is a net source of power; however, in the start-up phase, external power is probably required. Figure 22 illustrates the broad input and output groupings and details some of the internal sources and loads. The
key to running the system efficiently will be the intelligent handling of these by the management system.
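The velocity-sorting step described in this card (injecting ions into a curved duct in a constant magnetic field so only a narrow velocity band gets through) follows from the gyroradius relation r = mv/(qB). The sketch below is a hedged illustration: the deuteron injection energy and the duct bend radius are assumed values chosen for the example, not parameters from the text.

import math

# Field needed so a deuteron of a chosen energy follows a chosen bend radius.
DEUTERON_MASS_KG = 3.344e-27     # deuteron rest mass
ELEMENTARY_CHARGE_C = 1.602e-19
ION_ENERGY_EV = 50e3             # assumed 50 keV injection energy (illustrative)
BEND_RADIUS_M = 0.5              # assumed duct bend radius (illustrative)

energy_joules = ION_ENERGY_EV * ELEMENTARY_CHARGE_C
speed = math.sqrt(2.0 * energy_joules / DEUTERON_MASS_KG)
# Gyroradius r = m*v/(q*B)  ->  B = m*v/(q*r); ions slower or faster than this
# selected speed follow a different radius and hit the duct walls instead.
field_tesla = DEUTERON_MASS_KG * speed / (ELEMENTARY_CHARGE_C * BEND_RADIUS_M)
print(f"selected speed ~{speed:.2e} m/s needs B ~{field_tesla:.3f} T")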
is that the scientific feasibility of such weapons could be established using the same devices that are being promoted as essential for the ratification of the Comprehensive Test Ban Treaty
(the treaty has been signed by about 150 countries, with the notable exceptions of India and Pakistan, since September 1996). The nuclear weapons powers, notably the United States and France, have programs for the "stewardship" of their existing stockpiles of nuclear weapons. As part of their stewardship programs they are building or operating facilities that will be used to maintain the skills of nuclear weapons designers, and which could be used to develop a qualitatively different class of nuclear weapons. ICF facilities and research are an important part of these programs. Since its May 11 nuclear tests, India has also announced its own stockpile stewardship program. The stated goals of the US stockpile stewardship program are to maintain the safety and reliability of existing weapons. We have shown in a previous report that most of the US program of SBSS is marginal or irrelevant to nuclear safety.1 We have also argued that fusion facilities such as NIF and the proposed X-1, are not relevant to maintaining the reliability of current nuclear weapons, particularly if the United States were to adopt a nuclear policy based upon deterrence rather than first-strike. The evidence for this conclusion is summarized below. Pursuit of programs with explicit potential for designing new nuclear weapons is counter to Article VI of the NPT and to the CTBT. This applies whether the new weapons follow on current generation fission-triggered weapons or are part of an entirely new class of weapons, such as pure fusion weapons. In this context, it is worthwhile to recall that Article VI of the NPT relates, among other things, to the "cessation of the nuclear arms race at an early date." ICF researchers claim that their research could also lead to
commercial power production from fuels that are widely available and plentiful. However, the energy applications of any explosive fusion research should be justified on their own merits and in comparison to other energy projects. Many environmentally sound energy technologies are much further ahead than ECF and yet receive far fewer resources. Further, ECF approaches will take decades to develop into economical energy sources, if they prove feasible at all. The fact that large resources have been spent over decades on fusion power research without even establishing scientific feasibility needs to be more carefully considered, given the urgency of reducing greenhouse gas emissions. Military rationalizations and the relatively great pull of nuclear bureaucracies on governmental energy programs seem to be the forces driving ECF programs rather than serious evaluations of the world's energy and environment needs.
an avenue for limiting proliferation. But tritium can be produced in commercial reactors
(through the use of lithium target rods in light water reactors or by the extraction of the tritium produced in heavy water reactors, like CANDUs, due to the conversion of deuterium to tritium).15 Separation facilities are also needed to extract the tritium for the target rods, but these are less complex than those for extracting plutonium from irradiated reactor fuel and could be more readily developed and operated. Tritium is hard to detect if it is properly shielded and put into appropriate containers, making development of effective radioactive and monitoring systems very difficult, although not impossible. Further, tritium is currently not under international safeguards and there are no official plans for such safeguards. In fact, the US is in the process of greatly loosening restraints. It has initiated a program to produce test quantities of tritium for its weapons program in commercial nuclear reactors and may initiate a large-scale program for military tritium production in commercial reactors owned by the Tennessee Valley Authority. Even more troubling, however, is the possible future use of lithium and deuterium in either fusion research or in potential fusion weapons programs. While this is speculative at present so far as pure fusion weapons are concerned, it is important to note here that the thermonuclear component of fission-triggered nuclear weapons consists of a combination of these two elements in the form of lithium-deuteride. Both lithium and deuterium are
non-radioactive and are readily available. There will be essentially no way to control their production or to keep track of it. In the short term it is necessary to bring tritium stockpiles under international safeguards. This would provide a small but not sufficient measure of restraint.
Perhaps more importantly, tritium production for weapons should be halted as it is inconsistent with nonproliferation and disarmament goals. (Commercial requirements are far smaller than weapons and can be met from current stockpiles and byproducts from Canadian heavy water reactors).16 Certainly, the program in the United States to develop a new tritium production source should be halted since it is unnecessary. Current tritium
supplies are more than adequate to meet US stockpile needs if further efforts towards reducing the number of nuclear weapons are made.17
scale funding of such activities more likely. Even without the construction of actual weapons, these activities could put the CTBT in serious jeopardy from forces both internal and external to the United States. Internally, those same pressures, which could lead to the resumption of testing of current generation weapons, could also lead to the testing of new weapons (to replace older, less safe or less reliable weapons). Externally, the knowledge that the United States or other weapons states were engaging in new fusion weapons design activities could lead other states to view this as a reversal of their treaty commitments. Comparable pressures to develop pure fusion weapons would be likely to mount in several countries. This would have severe negative repercussions for both non-proliferation and complete nuclear disarmament. The time to stop this dangerous
thermonuclear quest for explosive ignition is now, before its scientific feasibility is established.
CTBT is key to prevent proliferation and nuclear war Lalanne 2 (Dominique, Speaker @ NPT Review Conference Preparatory Committee, April,
http://www.reachingcriticalwill.org/legal/npt/NGOpres02/5.pdf, JM) The Comprehensive Nuclear-Test-Ban Treaty (CTBT) is an integral part of our global efforts to reduce the dangers of weapons of mass destruction. All states should recognise that action on the CTBT is all the more important in light of heightened awareness regarding the dangers of terrorism generally and nuclear terrorism in particular. The states presently resisting the CTBT are undermining their own security as well as the security of the entire world. The CTBT was brought about through the hard work and determination of NGOs and millions of ordinary people around the world. In all these years, the NGO community has not faltered in its advocacy for a test ban treaty. People throughout the world understood that ending nuclear
testing was essential for two powerful reasons: to halt the spiraling arms race once and forever; and to prevent further devastation of human health and the global environment, already contaminated from decades of atmospheric and underground explosions. We are
profoundly disappointed with the countries that failed to attend the Second Conference on Facilitating the Entry into Force of the Comprehensive Nuclear-Test-Ban Treaty, in November 2001, especially those states whose signature or ratification is essential for entry into force. We are pleased, however, at the support for the CTBT demonstrated by three nuclear-weapon states (France, Russia, and the UK) and we call on them to maintain and strengthen their support. Entry into force of the CTBT is crucial to the stability and future of the non-proliferation regime, as all NPT states parties confirmed at the 2000 Review Conference. Among the 13 practical steps for systematic and progressive nuclear disarmament identified in the final document of that conference, two are devoted to the CTBT and nuclear testing. The first of these steps stressed the importance and urgency of signatures and ratifications, without delay and without conditions and in accordance with constitutional processes, to achieve the early entry into force of the Comprehensive Nuclear-Test-Ban Treaty. The second step called for a moratorium on nuclear-weapon-test explosions or any other nuclear explosions pending entry into force of that Treaty. Both of these goals are in serious danger today. This conference should take stock on progress towards these goals and make practical recommendations on how to achieve early entry into force of the test ban treaty. A ban on testing is an essential step towards nuclear disarmament because it helps block dangerous nuclear competition and new nuclear threats from emerging. However, technological advances in nuclear weapons research and development mean that a ban on nuclear test explosions by itself cannot prevent some qualitative improvements of nuclear arsenals.
Ramjets Bad
Ramjet propulsion won't work Froning 81 (H.D., Senior Staff Engineer, AIAA, Investigation of a quantum ramjet,
http://www.greenstone.org/greenstone3/nzdl?a=d&d=HASH8a0afea69cce9a9819cf36.4&c=envl&sib=1&dt=&ec=&et=&p.a=b&p.s= ClassifierBrowse&p.sa=, JM) This investigation indicates that so-called "quantum interstellar ramjets" would have to accomplish
propulsive interactions with the invisible, illusive and ever-changing quantum fluctuation energies of the vacuum over scales of time and distance that are many orders of magnitude less than those required for accomplishing any known ramjet combustion processes. As such, this investigation indicates that ramjet-like quantum propulsion systems are far beyond even the boldest and most optimistic extrapolations of our ramjet art. But this investigation has also revealed that a quantum
propulsion system need only extract an infinitesimal fraction of the enormous vacuum fluctuation energies which may exist over submicroscopic scales of distance along a starship's entire interstellar route. Therefore, perhaps some hope remains that some type of quantum propulsion system might eventually exploit the stupendous fluctuation energies of cosmic space for propulsive purposes. And although such a system would surely be bizarre and beyond our current scientific understanding, perhaps it would be but one exciting example of what may unfold during the next century of flight as mankind delves deeper into the mystery of matter, time and space.
Ramjets Bad
Ramjet intake is limited - can't reach high speeds Martin 73 (A.R., Freelance physicist, Astronautica Acta 18, 1, abst.,
February, JM) Description of a physical model of a magnetic intake for an interstellar ramjet. The particle collection properties of the intake are formulated by applying two criteria. First, all particles with more than a certain amount of their initial momentum in the transverse component are reflected by the magnetic field peak near the vehicle. Second, all particles with an initial gyration radius larger than a certain value are not injected into the vehicle power plant, even if the particles are not reflected. A brief discussion of proton reactors allows a numerical value of the necessary particle collection rate to be derived, and this value is used in conjunction with the intake fractions to determine the minimum possible intake dimensions. It is found that, while intakes of radius about 1 km are possible at large fractions of the speed of light, the intake radii at low velocities during the initial part of the journey are of the order of 10,000,000 km. This places a severe restriction upon the feasibility of the magnetic ramjet concept.
Ramjets lack propellant and fusion technology Jones 6 (Antonia J., Department of Computing
@ Imperial College, London, July 23, http://users.cs.cf.ac.uk/O.F.Rana/Antonia.J.Jones/Lectures/Specials/SelfReplicatingAutomata.pdf, JM) Consideration of mass ratios for various propulsion systems show that it would be highly desirable to acquire propulsion mass from interstellar space. The only available propellant is the interstellar matter. This is extremely rarefied with typically 1 - 2 atoms/cm3, or even less. Since these atoms are primarily hydrogen this translates to a density of about 2 X 10^-24 g/cm3. In 1960 [Bussard 1960] published his classic paper on the interstellar ramjet. He proposed collecting the matter by ionising it and using an electromagnetic field to concentrate the matter in order to initiate fusion. Because concentrations of interstellar hydrogen are so dilute
Bussard found that the scoop intake would have a diameter of around 100 kms. Controlled thermonuclear fusion was achieved for several seconds on the 9 November 1991 at the Joint European Torus near Oxford, although the temperature reached was insufficient to make the reaction self-sustaining. It will probably take 20-30 years to get
the first fusion based electricity generating stations on line but the important point is that it has been proved to be
feasible. This, combined with recent advances in high temperature superconducting technology, makes the design and construction of a Bussard Ramjet at least a possibility in principle. Fishback [Fishback 1969] examined in more detail the collection by a magnetic field and developed equations limiting the speed of the ramjet relative to the plasma, in [Martin 1971] some corrections were made to the numerical results. The limitation is not severe in regions with densities of 1-2 protons/cm3 but at ten times this density the speed for an aluminium
structure is limited to 0.94c and at 100 times to only 0.075c. Obviously
the scoop can be considerably smaller in high density regions, but we shall not be able to take advantage from this. Moreover, such regions are not
uncommon in the Galaxy [Martin 1972]. Other limitations of the ramjet are studied in detail in [Martin 1973], [Matloff 1977]. In addition the proton-proton reaction is difficult to sustain because of the small cross section [Whitmire 1975]. It may be possible to add a nuclear catalyst, such as carbon-12, which could be recovered, to speed up the reaction [Whitmire 1975] by a factor of 1018 which would make the ramjet project more feasible. The success at JET suggest that practical fusion is a possibility but it seems likely that the demonstration of such catalytic
reactions will not be accomplished in the near future. A lot of invention and research is needed, however, before the Bussard ramjet becomes a reality. The fusion reactor must be light-weight and long-lived, it must be able to fuse protons, not the easier to ignite mixture of deuterium and tritium. The reactor must be able to fuse incoming protons without slowing them down, or the frictional loss of bringing the fuel to a halt, fusing it, and reaccelerating the reaction products will put an undesirable upper limit on the maximum velocity obtainable.
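The scoop-size problem follows from a mass-flow estimate. The sketch below assumes the interstellar density quoted in the card (about one hydrogen atom per cubic centimetre), an illustrative fuel demand of one gram per second, and treats the scoop as a purely geometric collector; the magnetic-reflection losses Martin analyzes make the real intake far larger at low speed, so these radii are optimistic lower bounds.

import math

# Geometric lower bound on scoop radius for a Bussard-type ramjet.
HYDROGEN_MASS_KG = 1.67e-27
DENSITY_ATOMS_PER_M3 = 1e6        # ~1 atom/cm^3, as quoted in the card
FUEL_DEMAND_KG_PER_S = 1e-3       # assumed 1 g/s of collected hydrogen (illustrative)

def scoop_radius_m(ship_speed_m_s):
    """Radius whose swept column collects the assumed fuel demand."""
    mass_flux = DENSITY_ATOMS_PER_M3 * HYDROGEN_MASS_KG * ship_speed_m_s  # kg/(m^2 s)
    return math.sqrt(FUEL_DEMAND_KG_PER_S / (math.pi * mass_flux))

for speed in (3.0e4, 3.0e7):      # ~30 km/s early in the journey vs ~0.1c
    print(f"v = {speed:.1e} m/s  ->  scoop radius >= {scoop_radius_m(speed) / 1e3:.0f} km")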
***SOLAR***
solar sail-powered spacecraft is most likely to have resides in its controls. The probe will not be able to steer its sails like a boat does in the ocean, and will have to be set on the correct trajectory from the start. The smallest variations could lead, in a 2,500 AU-long journey, to errors of up to a million miles, which are naturally to be avoided. Experts believe that, with it being launched so close to the
Sun, the spacecraft's orbital planners will have to keep in mind the theory of general relativity, as well as the precession of the perihelion of orbiting objects.
Hard to design solar sails for space travel Alhorn 11 (Alhorn, NASA Engineer, 02-03, http://www.pbs.org/wgbh/nova/space/alhorn-solarsails-au.html,
accessed 6-24-11, JG)
It's not something that... it's basically on the cusp of being accepted in general. You
have to design a system, which is very hard to test here on the ground, because these structures are very, very lightweight. There's a term called "gossamer" - they're very flimsy, they're very lightweight, and they're very hard to test. That's one of the main challenges. Another challenge is to design it such that it will go off and do what you want it to do, because these materials are so thin and flimsy that it takes special mechanisms to make them behave the way you want them to. So it's a challenging mechanical design.
Solar sails one of the hardest things to build without failure ITOD 6 (ITOD News, 6-19, http://itotd.com/articles/573/solar-sails/, accessed 6-24-11)
Even so, if
a solar sail is going to push a spacecraft of any significant mass, it'll have to be enormous. And therein lies a problem: with greater size comes greater mass, not so much from the sail itself but from the support structure that's needed to keep it rigid and connect it to the craft's payload. The greater the mass to be pushed, the greater the size of the sail that's needed, and so on. Thus, in solar sail design, thinner and lighter materials are almost always better. Sail thickness is measured in micrometres (µm),
millionths of a meter, with some being as thin as 2 µm. (By comparison, the average human hair is about 80 µm thick.)
This brings up a second problem: fragility. You've got to fold or roll up a huge sheet of material that's a zillionth of an inch thick, get it into space, and then unfurl it perfectly without ripping or mutilating it, and without creating a support structure so massive that it'll cancel out the sail's
low mass.
Vulpetti 8 (Giovanni, Physicist, Solar sails: a novel approach to interplanetary travel, pg. 103, accessed 6-24-11,
are not yet up to the support of human exploration of the solar system. These sails are too small to carry the tens of thousands of kilograms necessary to support humans between the planets and exploration gear.
Also, sail-implemented missions to Mars (for example) using today's sail technology would be of longer duration than rocket-propelled interplanetary ventures.
Can't test on Earth - ensures problems in space Orzulik 6 (Ryan, Science and Engineering @ York, 10-25,
http://www.students.yorku.ca/~orszulik/Specification%20and%20Milestones%20Report.pdf, accessed 6-24-11, JG) The sail will also include a steering and control algorithm based on control vanes at the tips of the sail axes. The largest obstacle that must be overcome in this project is finding a way to actually test the solar sail. The first main requirement is that a light bright enough to generate a flux large enough to move the solar sail be used. The other main problem with testing is that the solar sail is designed to be used in the microgravity of space, but it will be tested in the gravity environment of the Earth. Thus a test setup that
supports the weight of the solar sail must be constructed. However, the test setup must be mounted on rollers with very small static and kinetic friction coefficients so that the small acceleration exerted on the sail will allow it to move.
Won't work - too many engineering challenges Leipold 99 (M., Institute of Space Sensor Technology and Planetary Exploration,
http://www.esa.int/esapub/bulletin/bullet98/LEIPOLD.pdf, accessed 6-25-11, JG) Technological challenges of solar-sail deployment Although the basic idea behind solar sailing appears simple, challenging engineering problems have to be solved to exploit photonic propulsion for orbit
transfer. Since the spiral orbit-raising efficiency depends basically on the overall spacecraft mass to solar sail area ratio, lightweight technological solutions for large in-orbit deployed sail surfaces are required. The technical challenges are: to fabricate the sails using ultra-
thin films and lightweight deployable booms able to carry the in-orbit loads; to package the sails and booms into a small volume; to deploy these lightweight structures successfully in space; and to control the large but low-mass structure
Even at Earth's comfortable distance from the Sun, flares can affect weather and disrupt communications. Close up, they would likely be fatal to a sundiving sail.
Current tech doesn't solve tears in the solar sail Vulpetti 8 (Giovanni, Physicist, Solar sails: a novel approach to interplanetary travel, pg. 103, accessed 6-24-11,
JG)
A solar sail must be lightweight enough to move itself and a payload (in space) when sunlight reflects from it. To meet the design requirements for many of the missions discussed in this book (see Chapters 9 and 17), even the first solar sails must be gossamer-like; hence they will be very fragile. Unfortunately, they must also be large. The sail must be large to reflect enough light to produce thrust and propel itself and its payload to a destination elsewhere in the solar system. First-generation solar sails will have areal densities of
10 g/m2 or less and be tens of meters in diameter. (This is the loading of the bare sail, not that of the whole sailcraft we denoted by a in Chapter 16.) At first glance, these sails will resemble common aluminum foil found in many kitchens. Who hasn't attempted to pull aluminum foil off a roll, only to have it hopelessly torn to shreds, forcing you to start over with another piece? However, appearances are misleading. Aluminum foil used in the kitchen is typically 0.013 mm thick, about 10 times thicker than the first-generation solar sails. Now imagine fabricating a sail 100-m by 100-m square out of something ten times thinner than
aluminum foil. Not only must the sail be this large, but it has to be strong enough to sustain its own weight under gravity during testing. Even our best materials are too fragile (by themselves) under these
conditions and require bracing with cords embedded in them to provide additional strength and to reduce the effects of the inevitable tears. This cord serves the same ripstop function as those found in parachutes. If a tear starts, it will spread until it encounters the cord, where it will be stopped. The edges of the sails are reinforced and securely fastened to the booms during operation.
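As a quick check on the figures quoted above (a 100 m by 100 m sail at 10 g/m² or less, and kitchen foil at 0.013 mm), the arithmetic below only combines the card's own numbers.

    # Combining the card's figures: 100 m x 100 m sail, <= 10 g/m^2, foil ~0.013 mm thick.
    AREA_M2 = 100 * 100              # sail area
    AREAL_DENSITY_G_M2 = 10.0        # first-generation loading quoted in the card
    FOIL_THICKNESS_MM = 0.013        # typical kitchen foil

    sail_mass_kg = AREAL_DENSITY_G_M2 * AREA_M2 / 1000.0   # -> about 100 kg of bare film
    film_thickness_mm = FOIL_THICKNESS_MM / 10.0            # "ten times thinner" -> ~1.3 micrometers

    print(f"bare sail mass: about {sail_mass_kg:.0f} kg")
    print(f"approximate film thickness: {film_thickness_mm * 1000:.1f} micrometers")

A roughly 100 kg sheet of micrometer-scale film is exactly the kind of structure that needs the ripstop cords described above.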
Solar heat leaks from storage tanks - solvency Battat 10 (John, Engineering,
http://ardent.mit.edu/real_options/Real_opts_portfolio%20applications/2010%20APs/Battat_report.pdf, accessed 6-26-11, JG) No reason to use solar propulsion - no benefits over other types of propulsion Solar Thermal Propulsion (STP) offers no unique mission capabilities not available through alternate propulsion technologies. State of the art chemical propulsion can perform all the missions for which STP is a candidate, albeit at a performance disadvantage in many cases. STP could provide better payload mass performance than alternate propulsion technologies in many cases, but as noted next, STPs with this performance don't fit in the fairings. Solar propulsion leaks solar energy - increases mission times If the STP operates with intermittent burns near periapse, gravity losses are minimized and the STP can approach the delta V of a high-thrust system. The price for this is increased trip time. The question, clearly, is how much of a trip time increase must be incurred. This, in fact, was the motivation for the energy storage STP concept: one could collect solar energy all around the orbit and deliver it quickly near periapse. Also, if solar energy collection is discontinued during thrusting, simultaneous pointing to the Sun and of the thrust vector is not required, and the STP overall configuration is simplified. However, the very poor demonstrated efficiency of the storage concept (due to heat leak out of the storage system) in early tests led us to doubt its viability.
Solar energy production uses toxic cadmium Flux Energy 11 (Flux Energy, Solar Industry, 06-08, http://sundial365.com/pdfs/EnvironmentalMyths_final.pdf, accessed 6-26-11, JG) Operators of solar installations are currently under fire to find ways to reverse the negative environmental impact their systems deliver. One issue of great concern is the production of PV panels utilizing the newer thin-film
technology. Thin-film technology reduces the amount of material required in creating a solar cell. Thus, it is quickly becoming a preferred manufacturing process due to cost, flexibility, lighter weight and ease of integration compared to wafer silicon cells. The thin-layer production of panels, however, involves the mining of rare earth minerals such as cadmium and selenium. These minerals are so rare that the yield per truckload of ore is very small, implying that many truckloads are required to feed the global need for these elements. As more and more solar installation operators elect to center their production on thin-layer PV elements, the industry will respond. As with many rare elements, when demand goes up, price goes up. These minerals also possess a level of toxicity that can be dangerous to the environment as well as to humans. They are considered hazardous materials. Assuming a 30- to 40-year life for most PV panels, there
are grave concerns over the proper disposal of thin-film panels to keep these minerals from leaking into waste and water streams. Additionally, the mining processes for these elements are very invasive and pollutive. China is the primary global producer due to the lower standards for invasive mining. The mining of cadmium and other toxic elements is allowed in the U.S. as a by-product of other mining efforts such as the extraction of zinc. However, following in the standards set by the European Union to ban the use of some of these elements from all products, regulations and cleanup mandates continue to limit the production of cadmium and other minerals in the U.S. The manufacturers of solar panels and other energy industry lobbyists continue to push for more relaxed regulations. While the production and disposal of thin-film PV panels is certainly one issue attracting a lot of environmentalist opposition in the industry, there are many others.
Cadmium has a direct link to global warming Pinkham 93 (Sandra, Doctor @ Columbus,
http://www.bodycenteredtherapies.com/pinkhammedical/documents/Scan_2a.pdf, accessed 6-26, JG)
According to ''the precautionary principle,'' it is better to accept as true what cannot be perfectly proved, even though it might be wrong, if doing so can lead to actions which will protect our ecosystem. This paper uses this guideline to assess the effects of cadmium exposure and its toxicity. This highly toxic metal is apparently used by the cell in the stress response to get rid of damaged, virus-infected, and cancerous cells. Indiscriminant exposure to
global cadmium air pollution alters the cellular content of free cadmium ions and the minerals that antagonize its effects, affecting the response of cells, organs, and individuals to all other stimuli. Cadmium's effects at low dose are thus influenced by many factors, not just dose. These factors include age,
gender, species, genetic factors, prior nutritional history and exposure to cadmium and other stressors, and current nutritional history and exposure to other stressors. Other toxic metals, organic compounds, biological pathogens and emotional stresses interact with cadmium to produce effects. Stress effects at a cellular level
appear linked with current global problems affecting the environment, such as global warming, and human health effects, like the increase in disabling fatigue and infectious disease.
Warming causes extinction Henderson 6 (Bill, Environmental Scientist, 8-19-06, http://www.countercurrents.org/cchenderson190806.htm, accessed 6-25-11, JG)
The scientific debate about human induced global warming is over but policy makers - let alone the happily shopping general public - still seem to not understand the scope of the impending tragedy. Global warming isn't just warmer temperatures, heat waves, melting ice and threatened polar bears. Scientific understanding increasingly points to runaway global warming leading to human extinction. If impossibly Draconian security measures are not immediately put in place to keep further emissions of greenhouse gases out of the atmosphere we are looking at the death of billions, the end of civilization as we know it and in all probability the end of man's several million year old existence , along with the extinction of most flora and fauna beloved to man in the world we share.
Deep cuts happened already - more to come Schow 11 (Ashe, Heritage Action, 6-15, http://heritageaction.com/2011/06/2012-energy-and-water-appropriations-billapproved/, accessed 6-26, JG) Earlier today, the House Appropriations Committee approved the 2012 Energy and Water appropriations bill, which will be voted on in the full House of Representatives after the July 4th recess. This is the fifth of 12 spending bills approved by the committee. The bill, which totals $30.6 billion, actually cuts $1 billion compared to current spending levels and is $5.9 billion below what President Obama's budget proposed. It contains a 42% cut to President Obama's clean energy priorities like fuel-efficient vehicles, research and solar energy. Committee Chairman Hal Rogers (R-KY) boasted that the bill shows that conservatives in Congress are committed to: Restoring restraint and responsibility to the appropriations process in a time when we cannot spend as we used to.
GOP cutting millions in solar energy Nouvea 11 (Trent, Staff Writer, 6-3, http://www.tgdaily.com/business-and-law-features/56382-gop-slashes-clean-energyfunding, accessed 6-26, JG) To be sure, the above-mentioned legislation - which the Appropriations Energy and Water panel moved to full committee on Thursday - cuts $97 million in solar energy funding, fuel-efficient vehicle technologies by $46 million and vehicle technology deployment by $200 million. Although Obama's Office of Management and Budget said it does not have a position on the controversial bill at this stage, Democrats harshly criticized the cuts. "Renewable energy programs in this bill are drastically reduced. We can debate whether renewable energy is an environmental program, and whether it is a market problem," stated Rep. Ed Pastor (D-Ariz). "In either case, it is a national security problem." But Energy and Water subcommittee Chairman Rodney Frelinghuysen (R-N.J.) defended the reduction in funds. "The highest
priorities are protected by supporting the Department of Energy's national defense programs, and by preserving activities that directly support American competitiveness , such as water
infrastructure and basic science research."
Chinese solar company cuts now - revenues lacking CSIS 8 (Chinese Stock Information, 10-31, http://www.chinesestock.org/show.aspx?id=25386&cid=11, accessed 6-26, JG)
China's solar power firms are facing cuts of around 25-26 pct in average selling prices next year as a result of contract renegotiations and the depreciation of the euro against the dollar, Credit Suisse said. "While solar firms have not yet revised down 2009 output guidance, the pace of new contract additions has slowed and we believe contract renegotiations are likely across the whole industry chain," it said. The bulk of China's solar power companies are listed in the US but most of their sales in Europe, making them vulnerable to currency revaluations. New York-listed Suntech, China's biggest photovoltaic
panel producer, has more direct exposure to the European market, insulating it from the fall in the euro.
Link - Demand
Solar industries will respond to demand for energy Flux Energy 11 (Flux Energy, Solar Industry, 06-08, http://sundial365.com/pdfs/EnvironmentalMyths_final.pdf, accessed 6-26-11, JG) These minerals are so rare that the yield per truckload of ore is very small, implying that many truckloads are required to feed the global need for these elements. As more and more solar installation operators elect to center their
production on thin-layer PV elements, the industry will respond. As with many rare elements, when demand goes up, price goes up.
Impact - Disease
Cadmium can spread its toxins globally UN 8 (United Nations Environment Program,
http://www.chem.unep.ch/pb_and_cd/SR/Files/2008/UNEP_Cadmium_review_Interim-APPENDIXMAR2008.pdf, accessed 6-26, JG) These measurements indicate the origin of dust particles transported by air masses, and provide evidence that aerosols are transported intercontinentally, as well as from industrialized regions to remote regions with few local emission sources such as the Arctic. As cadmium is transported in the atmosphere adhered to aerosol particles, these studies indicate that cadmium has a potential to be transported intercontinentally.
Cadmium is empirically linked to diseases - it makes chances of infection one hundred percent likely if spread globally Pinkham 93 (Sandra, Doctor @ Columbus,
http://www.bodycenteredtherapies.com/pinkhammedical/documents/Scan_2a.pdf, accessed 6-26, JG) With the HIV epidemic arriving in the time period of falling lead pollution and rising cadmium pollution, it would be most helpful to know whether cadmium played a role in the progression of HIV to AIDS. There is a body of circumstantial evidence that suggests this to be true, in that many of the
substances that block cadmium toxic effects, or enhance its excretion, also block the replication of HIV (Stewart-Pinkham, 1991b). Although no studies have been conducted to test this hypothesis directly in a laboratory setting, studies have been done on Herpes simplex virus, a chronic virus that is activated in a variety of stressful circumstances. Cadmium is the only metal that activates Herpes simplex from a latent state (Pawl 1993). Continued administration of cadmium increases the yield of infectious virus by 10 to 100 fold, an effect unmatched by any other activator studied. It also prolongs the recovery of infectious virus from 6 to 11 days. Zinc, nickel and manganese, on the other hand, block the cadmium-induced infectious virus. Likewise, lithium blocks Herpes activation.
Uncontrolled disease causes extinction Steinbrunner 97 (John, Senior Fellow at Brookings, Biological Weapons: A Plague Upon all Houses, JSTOR,
accessed 6-22-11, JG) The use of a pathogen, by contrast, is an extended process whose scope and timing cannot be precisely controlled. For most potential biological agents, the predominant drawback is that they would not act swiftly or decisively enough to be an effective weapon. But for a few pathogens - ones most likely to have a decisive effect and therefore the ones most likely to be contemplated for deliberately hostile use - the risk runs in the other direction. A lethal pathogen
that could efficiently spread from one victim to another would be capable of initiating an intensifying cascade of disease that might ultimately threaten the entire world population . The
1918 influenza epidemic demonstrated the potential for a global contagion of this sort but not necessarily its outer limit.
Cadmium Bad
Majority of solar cells are made of toxic cadmium GGR 10 (Go Green Resources, 2-8, http://www.gogreenresources.org/tag/solar-cells, accessed 6-26, JG)
Even though solar power systems boast of a prolonged life, the issue of getting rid of the parts implemented to acquire and store the power has not already been addressed. The majority of solar cells are partially composed of cadmium, a hugely poisonous substance . The later removal of this toxic material may result in a severe environmental hazard if dealt with ineptly . Sufficient access to
sunlight for a decent portion of a typical day is another factor to weigh.
Cadmium is toxic - airborne and water pollution Hope 4 (L. California Energy Commission, http://www.energy.ca.gov/reports/500-04-053.PDF, accessed 6-26, JG)
Cadmium is potentially of concern with the thin-film technologies. Cadmium compounds are used in
CdTe, CIS, and CIGS (copper indium gallium selenide) cells, although in very small quantities in the latter two types of cells. (Cadmium compounds are not used by amorphous silicon and crystalline silicon cells or GaAs cells.) CIS and CIGS cells can be made with or without a top CdS layer. Use of cadmium can generate cadmium-containing wastewater, and possibly cadmium fumes and dusts. Tests using standard leaching protocols show that cadmium could be leached out of crushed CdTe modules, although these tests overestimate leaching from intact cells. Recent tests on CdTe and CIS modules show that Cd concentrations were below the TCLP limit.
Aside from the budgetary concerns, last week also saw developments aimed at improving the financing conditions for renewable energy projects. First, the US Department of Energy issued more than $US2.2 billion loan guarantees to solar plants, clearly signalling that the Obama administration remains keen on building up the country's renewable power capacity. NextEra Energy Resources and the solar energy unit of Abengoa received conditional loan guarantee offers to develop solar-thermal projects of 250MW each, in Southern California, while Sempra Energy, California's third-largest utility, received a $US359.1 million guarantee to build a 150MW PV plant in Arizona.
Democrats will balance the solar cuts - debate suppresses GOP cuts York 11 (Anthony, LA Times, 6-18, http://www.latimes.com/news/local/la-me-state-budget-20110617,0,1532584.story,
accessed 6-26, JG) Reporting from Blythe, Calif. -- Gov. Jerry Brown on Friday warned Republican lawmakers that if they failed to negotiate a budget compromise with Democrats, he would seek to go around them. That could include signing a budget that has only Democratic support , and having initiatives put before voters on the tax questions that have brought bipartisan talks to a standstill. He has been frustrated by the inability to win four Republican votes needed for the Legislature to put the tax issue on the ballot. " I may be in initiative circulation in the next few months," he said, after attending a groundbreaking ceremony for what is scheduled to be the largest solar-energy project in the world. "I'm going to solve the problem. I'd like to solve it in a week or two, but if I can't I can take actions of many kinds, including going to the people themselves through the direct initiative process." Under that scenario, Brown said, lawmakers would have to make deeper cuts to schools and other state programs until voters have a chance to vote on higher taxes. "It's more time consuming, more devastating to our schools and more expensive, but I am going to stop at nothing to get this budget done in a sustainable, balanced way," he said. Brown also implied that he could work for an even larger Democratic legislative majority in the 2012 elections that could relegate GOP lawmakers to virtual obscurity. He accused Republicans of "undermining the state and thumbing their nose at the people and their democratic rights.
Even if there are cuts, there is bipartisan support for a new solar panel bill
has launched an aggressive program to change all that, with a slew of ambitious plans to convert the oil-hungry U.S. military to alternative-energy sources--and, at the same time, spur creation of a commercial industry capable of producing enough renewable energy at affordable prices for civilians. The hope is that demand from a massive consumer like the armed forces could affect supply--scaling up energy production, driving down cost, and leading to technological breakthroughs for biofuels, solar panels, hybrid vehicles,
and similar products. That would reduce the need for oil throughout the U.S. economy and spare the armed forces from future missions in war-torn, oil-exporting states. It's not the military's job to fight climate change, but many senior Defense officials contend that there is a clear national-security reason to do so, because government studies show that the fossil-fuel emissions behind global warming will induce food shortages, drought, and rising sea levels--inviting a world of political volatility.
Obama just increased solar incentives CalFinder 11 (Solar Power Contractors, http://solar.calfinder.com/blog/news/obama-cuts-solar-costs-newfunding/, accessed 6-26-11, JG)
President Obama
is on a mission to make solar cost-competitive with coal, and today the Department of Energy Secretary Steven Chu announced another milestone in that objective: $27 million in new funding for the solar SunShot Initiative. The program is designed to cut the fees you pay upfront to go solar, which account for almost half the costs of most residential installations. Essentially, Obama aims to streamline the expensive and cumbersome hurdles in your way, including permitting processes, zoning laws and regulations, interconnection, net metering standards, and access to financing.
about the same as our electricity fees right now. That will be an era whereby solar energy is used on a large scale. "
Japan's nuclear accident refueled the Chinese solar industry Bayani 11 (Oliver, Staff Writer, 5-31, http://www.ecoseed.org/business/renewable-energy/article/95renewable-energy/10026-china-most-attractive-for-renewable-energy-projects-besting-u-s-once-again%E2%80%93-ernst-young, accessed 6-26, JG) With regard to solar, the United States was on top of the rankings while China trailed behind India at third place. Its market grew 67 percent from $3.6 billion in 2009 to $6 billion in 2010. The growth was
primarily driven by loan guarantees provided by Department of Energy, supporting solar panel makers with $1.13 billion and solar generation projects with $6.95 billion worth of loans. The second largest of these loan guarantees was $2.1 billion awarded to Solar Trust of America in March for a 484-megawatt solar thermal plant in Blythe, California. Of the seven solar generation loan guarantees the department has given, 4 of them were solar thermal projects.
Meanwhile, the March 11 tsunami that triggered a nuclear disaster in Japan has driven renewable energy interest in China, particularly solar power in the past quarter, according to the report. The National Development and Reform Commission, the country's main development agency, called for an increase in China's solar capacity target from 20 GW to 50 GW by 2020 this month. There is pressure on China to develop its own solar market and reduce reliance on the export of components, amid
concerns that cuts to European feed in tariff schemes and a growing supply chain in the United States could lead to an oversupply of panels, the report noted. China seems to be focusing more in concentrating solar thermal in a bid to diversify its energy mix.
African buyers ensure no cuts AP News 11 (Associated Press, 3-4, http://m.news24.com/fin24/Economy/Africa/China-solar-producers-woo-African-business20110303, accessed 6-26, JG) Johannesburg - In a show
of commercial muscle that highlights China's growing investment in Africa, Chinese solar power producers dominated exhibits on Thursday at an energy conference on a continent where nearly two-thirds of the population lives off the electric grid. "Wow! It's like an
invasion!" exclaimed a South African exhibitor at the African Energy Indaba, where 60 of 80 stands were Chinese vendors, according to event organisers. Chinese producers are working hard to maximise their impact among African clients. Those could include governments that want to power health centres and schools in remote areas, rural farmers who want electricity for water pumps and cellphones. They also could include villagers who walk long distances to find wood for cooking, and middle-class families fed up with soaring power prices and urban power cuts. Only Chinese producers offered solar powered technology at the conference ending on Thursday in Johannesburg.
Link will be inevitably triggered - solar energy will eventually be in high demand Lior 1 (Noam, Energy @ Philadelphia Uni.,
http://www.seas.upenn.edu/~lior/lior%20papers/Power%20from%20Space.pdf, accessed 6-26, JG) Power can be produced in space for terrestrial use by using a number of energy sources, including solar, nuclear, and chemical. On the one hand, in view of the rising demand for energy, the diminishing fuel and available terrestrial area for power plant siting, and the alarmingly increasing environmental effects of power generation, the use of space for power generation seems to be inevitable: (1) it allows highest energy conversion efficiency, provides the best heat sink, allows maximal source use if solar energy is the source, and relieves the Earth from the penalties of power generation, and (2) it is technologically feasible, and both the costs of launching payloads into space and those of energy transmission are declining because of other uses for space transportation, dominantly communications.
assume our decision makers kill incentives at the federal level as part of their budget cutting efforts. They also cut other programs that support budgets in each of the fifty states. Since the states, who are today in a financial crisis, cannot afford more federal spending cuts, local solar energy incentives may need to be cut to avoid severe cuts in essential services. Now let's assume the solar industry eventually withers and dies (like renewables did after its false start in the 1970s) and Middle Eastern oil stops flowing. Assuming domestic oil production won't have time to offset our demand for foreign oil, we must then either tap our strategic oil reserves, risking our ability to defend ourselves, or ration energy nationwide. I truly believe, and again this is my personal opinion, that we shouldn't be choosing between oil production and renewables. What we should be focusing on is energy, regardless of the source. Put another way, our energy independence will only come from aggressive support for all types of domestic energy, so rather than cutting incentives to any sector of the energy industry, we should be spending more.
Tapping strategic oil reserves causes worldwide panic - buoys prices to all-time high Levi 11 (Michael, Council on Foreign Relations, 3-4, http://blogs.cfr.org/levi/2011/03/04/its-too-early-touse-the-strategic-petroleum-reserve/, accessed 6-26, JG) Indeed policymakers should be concerned that it would do precisely the opposite. Tapping the reserves right now could validate fears in the market; after all, it would signal that the United States government was worried. That could simply induce more precautionary buying, thus buoying prices, rather than depressing them. Such an outcome would be doubly dangerous, since it would
undermine the psychological value of the reserves.
Worldwide panic and buoyed prices ignite resource wars Gleason Report 10 (Investing Company, April,
http://www.gleasonreport.com/documents/The%20Real%20Gold%20Standard.pdf, accessed 6-26, JG) That price level implies gold will be $1500 (10x) to $2400 (16x) and possibly higher by 2015. The market is not psychologically ready for a higher multiple but it could happen during war or political upheaval. High oil prices will incite resource wars. The Iraq/Afghan wars are about energy. War with Iran is likely. Iran has 9% of the world's remaining oil reserves and borders the coveted Caspian Sea reserves. The West wants Iran's oil on the market and Europe wants pipeline alternatives to those owned by Russia. This is an economic survival issue for the western economies. The American economy and dollar dominance depend on affordable oil. The political stakes are high and the forward risks are ominous.
Resource Wars cause extinction Wooldridge 9 (Frosty Wooldridge, Freelance writer @ Cornell University, 2009,
http://www.australia.to/index.php?option=com_content&view=article&id=10042:humanity-galloping-toward-its-greatest-crisis-inthe-21st-century&catid=125:frosty-wooldridge&Itemid=244, accessed 6-26-11, JG)
Without transitioning away from use of fossil fuels, humanity will move further into an era of resource wars (remember, Africom has been added to the Pentagon's structure -- and China has noticed), clearly with intent to protect US interests in petroleum reserves. The consequences of more resource wars, many likely triggered over water supplies stressed by climate disruption, are likely to include increased unrest in poor nations, a proliferation of weapons of mass destruction, widening inequity within and between nations, and in the worst (and not unlikely) case, a nuclear war ending civilization.
Impact - Warming
Solar energy does not lead to global warming - it stops it Chow 11 (Darren, Environmental Activist, 6-20, http://www.contour2002.org/article/help-stop-globalwarming-with-a-solar-power-system, accessed 6-25, JG)
But if you are seeking to be more active in this endeavor you can take a step farther and install a solar power system in your home. It runs the electricity in a more efficient and eco-friendly way. Sounds impossible? Even if you think that this is not doable, I know that it has caught your attention. Read the rest of the article and become familiar with this technology. The thing is that the environment-friendly feature is not the only striking benefit that solar panels can promise you and your family. It is also developed to help people like you save a lot of money. A simple solar panel has the ability to harness the energy of the sun so that it can be used to power certain machines. In this sense, the sun's energy can create electricity when you have developed a consolidated solar power system. Science recognizes this technology, but you will not find a lot of solar panels out in the market. While some people are doubtful against its effectiveness, companies do produce them because of reduced returns. A technology as good as this must not kept from the public, especially when it can help alleviate the global warming phenomenon. With this, I advice you to consult a useful manual that can help you create a solar power system in your home.
Warming causes extinction Henderson 6 (Bill, Environmental Scientist, 8-19-06, http://www.countercurrents.org/cchenderson190806.htm, accessed 6-25-11, JG)
The scientific debate about human induced global warming is over but policy makers - let alone the happily shopping general public - still seem to not understand the scope of the impending tragedy. Global warming isn't just warmer temperatures, heat waves, melting ice and threatened polar bears. Scientific understanding increasingly points to runaway global warming leading to human extinction. If impossibly Draconian security measures are not immediately put in place to keep further emissions of greenhouse gases out of the atmosphere we are looking at the death of billions, the end of civilization as we know it and in all probability the end of man's several million year old existence , along with the extinction of most flora and fauna beloved to man in the world we share.
AT Cadmium
Cadmium isn't key to solar energy - rarely used at all Solar Power 10 (Solar Industry Website, 12-29, http://www.solarpower.org/News/800316207-chinasexport-cuts-of-rare-earths-should-have-minimal-short-term-effects-on-solar.aspx, accessed 6-26, JG) Recently, China's commerce ministry announced that it would cut its exports of rare earth metals in the early months of next year as the country moves to increase its domestic stockpiles of the metals, used in the production of many consumer products, including automobiles and electronics. However, while the export cut rattled the nerves of executives in many industries that rely on rare earths, the solar power industry is not as concerned. While some reports claim that rare earths are vital to the production of solar panels, they are actually not, according to the Solar Home & Business Journal. In fact, solar panels are mostly constructed using crystalline silicon, one of the most widely available substances on the planet. Currently, the U.S. produces myriad amounts of crystalline silicon, and many new factories are being built to process it. Moreover, thin-film solar panels are not made with rare earths, but with
tellurium, indium and gallium - substances the U.S. does not rely on China to produce. Even so, the U.S. is increasing its production capacity of certain elements as it moves to wean itself from its dependence on Chinese exports. Some mines in the U.S. - including one in Colorado - are expanding manufacturing capacity to meet the demands of solar panel producers. For now, however, "the basic availability" of the metals "appears more than adequate ," the U.S. Department of Energy asserts.
Cadmium doesn't spread globally - their author concedes UN 8 (United Nations Environment Program,
http://www.chem.unep.ch/pb_and_cd/SR/Files/2008/UNEP_Cadmium_review_Interim-APPENDIXMAR2008.pdf, accessed 6-26, JG) Specific evidence of cadmium intercontinental transport is very scarce. Due to the relatively short residence time of cadmium in the atmosphere (days or weeks), the airborne dispersion of cadmium has a pronounced local or regional character. However, data from ice core measurements in Greenland
indicate that cadmium can be transported over distances of up to thousands of kilometres. Analysis of cadmium in aerosols in a few regions also illustrates long-range transport. Some small portion of anthropogenic cadmium from North America has been noted in the Russian Arctic. Further, aerosol measurements in Taiwan show that a portion of airborne cadmium can be transported over a thousand kilometres from developing areas of China
The cost of solar cells has roughly been dropping by a factor of 2 every 5 years. This has occurred in an environment where support for photovoltaic research has not been very aggressive and demand has been limited. Federal funding of photovoltaic research has been on the order of 60 million dollars a year. This is significantly less than what has been spent on more exotic forms of power such as fusion. The cost of photovoltaic power has been above $3 per peak watt, so demand for solar cells has been limited to remote applications and other exotic uses. Figure 8 below is a schedule that gives the potential demand for photovoltaic electricity. It is not a demand curve in the traditional sense, but rather is a schedule of the demand for electricity in various applications and various prices based on Table 4 in Ogden and Williams. A pictorial representation of this table is very illustrative. At very high prices the market is for exotic uses such as space satellites, buoys, corrosion protection.
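The "factor of 2 every 5 years" claim implies a simple exponential cost curve; the sketch below only projects that stated trend forward from the quoted $3 per peak watt, as an illustration of the card's own numbers rather than a forecast.

    # Projecting the stated trend: cost halves every 5 years, starting near $3 per peak watt.
    def projected_cost(start_cost_per_watt, years, halving_period_years=5.0):
        return start_cost_per_watt * 0.5 ** (years / halving_period_years)

    for years in (0, 5, 10, 15, 20):
        cost = projected_cost(3.00, years)
        print(f"after {years:2d} years: ${cost:.2f} per peak watt")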
Huge demand for solar cell production inflates prices of rare toxic materials - makes them unavailable Flux Energy 11 (Flux Energy, Solar Industry, 06-08, http://sundial365.com/pdfs/EnvironmentalMyths_final.pdf, accessed 6-25-11, JG) Operators of solar installations are currently under fire to find ways to reverse the negative environmental impact their systems deliver. One issue of great concern is the production of PV panels utilizing the newer thin-film
technology. Thin-film technology reduces the amount of material required in creating a solar cell. Thus, it is quickly becoming a preferred manufacturing process due to cost, flexibility, lighter weight and ease of integration compared to wafer silicon cells. The thin-layer production of panels, however, involves the mining of rare earth minerals such as cadmium and selenium. These minerals are so rare that the yield per truckload of ore is very small, implying that many truckloads are required to feed the global need for these elements. As more and more solar installation operators elect to center their production on thin-layer PV elements, the industry will respond. As with many rare elements, when demand goes up, price goes up. These minerals also possess a level of toxicity that can be dangerous to the environment as well as to humans. They are considered hazardous materials. Assuming a 30- to 40-year life for
most PV panels, there are grave concerns over the proper disposal of thin-film panels to keep these minerals from leaking into waste and water streams. Additionally, the mining processes for these elements are very invasive and pollutive. China is the primary global producer due to the lower standards for invasive mining. The mining of cadmium and other toxic elements is allowed in the U.S. as a by-product of other mining efforts such as the extraction of zinc. However, following in the standards set by the European Union to ban the use of some of these elements from all products, regulations and cleanup mandates continue to limit the production of cadmium and other minerals in the U.S. The manufacturers of solar panels and other energy industry lobbyists continue to push for more relaxed regulations. While the production and disposal of thin-film PV panels is certainly one issue attracting a lot of environmentalist opposition in the industry, there are many others.
Solar panels key to stop global warming Chow 11 (Darren, Environmental Activist, 6-20, http://www.contour2002.org/article/help-stop-globalwarming-with-a-solar-power-system, accessed 6-25, JG)
But if you are seeking to be more active in this endeavor you can take a step farther and install a solar power system in your home. It runs the electricity in a more efficient and eco-friendly way. Sounds impossible? Even if you think that this is not doable, I know that it has caught your attention. Read the rest of the article and become familiar with this technology. The thing is that the environment-friendly feature is not the only striking benefit that solar panels can promise you and your family. It is also developed to help people like you save a lot of money. A simple solar panel has the ability to harness the energy of the sun so that it can be used to power certain machines. In this sense, the sun's energy can create electricity when you have developed a consolidated solar power system. Science recognizes this technology, but you will not find a lot of solar panels out in the market. While some people are doubtful against its effectiveness, companies do produce them because of reduced returns. A technology as good as this must not kept from the public, especially when it can help alleviate the global warming phenomenon. With this, I advice you to consult a useful manual that can help you create a solar power system in your home.
in the coming years photovoltaics will become economically attractive in more and more regions of the world without support programmes, particularly in those locations where high levels of solar radiation and high energy costs come together. Therefore, in the mid-term an increased demand for photovoltaics is to be expected, especially when prices decrease significantly.
Current solar cell production sustainable Iles 94 (P.A., Applied Solar Energy Corporation, Technology Challenges For Space Solar Cells, pg.
1961, accessed 6-25-11, JG) Current space cell production can accommodate most of the missions. The major exceptions are the high
radiation orbits and these may be addressed by extensions of the III-V Cell technology for InP-based cells. Thin film cells must be adapted for space use, to realize their advantages of lighter weight, easier stowability and deployment, and lower costs. A business problem is maintaining a balanced Production base, to supply required quantities of Si cells, GaAs/Ge cells and cascade cells. Despite the advantages of higher efficiency cells, there will be continued demand for Si cells, with possible use of advanced Si cells in specific missions. The GaAs/Ge and cascade cells will coexist, and will be used in larger quantities.
The hardware of a Solar Sail is faced with tremendous challenges. High residual velocities can only be reached with this propulsion concept if the specific ratio of spacecraft weight and sail size (sail area density) is very small. In most proposed Solar Sail concepts [4], this results in the need for very large sail surfaces, which can easily reach the size of 100,000 m², see Figure 1. The result is the requirement that the sail must be automatically deployable in orbit. A look at the required sail area density of 1 to 10 g/m² shows that this challenge is hard to meet with the sail material and stiffening structures that are commercially available today. Figure 2 shows the boom area density that can be attained by means of a diagonally-stiffened DLR / ESA Solar Sail concept. In an initial deployment experiment with a boom weight of approx. 14 g/m², specific densities of approx. 2.5 g/m² can be reached depending on the sail size and new boom concept.
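Multiplying the cited densities by the example sail area shows why the requirement is called hard to meet; the arithmetic below only combines the figures given in the card.

    # Mass implied by the card's figures for a 100,000 m^2 sail at various area densities.
    AREA_M2 = 100_000   # example sail size from the card

    figures_g_m2 = [
        ("required density, lower end", 1.0),
        ("required density, upper end", 10.0),
        ("early boom experiment", 14.0),
        ("new boom concept", 2.5),
    ]

    for label, density in figures_g_m2:
        mass_kg = density * AREA_M2 / 1000.0
        print(f"{label:28s}: {density:4.1f} g/m^2 -> {mass_kg:6.0f} kg over the full sail area")

On these numbers the early boom structure alone (14 g/m²) would weigh more than the entire 1 to 10 g/m² budget allows, which is the gap between commercially available structures and the requirement.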
assume our decision makers kill incentives at the federal level as part of their budget cutting efforts. They also cut other programs that support budgets in each of the fifty states. Since the states, who are today in a financial crisis, cannot afford more federal spending cuts, local solar energy incentives may need to be cut to avoid severe cuts in essential services. Now let's assume the solar industry eventually withers and dies (like renewables did after its false start in the 1970s) and Middle Eastern oil stops flowing. Assuming domestic oil production won't have time to offset our demand for foreign oil, we must then either tap our strategic oil reserves, risking our ability to defend ourselves, or ration energy nationwide. I truly believe, and again this is my personal opinion, that we shouldn't be choosing between oil production and renewables. What we should be focusing on is energy, regardless of the source. Put another way, our energy independence will only come from aggressive support for all types of domestic energy, so rather than cutting incentives to any sector of the energy industry, we should be spending more.
Tapping strategic oil reserves causes worldwide panic - buoys prices to all-time high Levi 11 (Michael, Council on Foreign Relations, 3-4, http://blogs.cfr.org/levi/2011/03/04/its-too-early-touse-the-strategic-petroleum-reserve/, accessed 6-26, JG) Indeed policymakers should be concerned that it would do precisely the opposite. Tapping the reserves right now could validate fears in the market; after all, it would signal that the United States government was worried. That could simply induce more precautionary buying, thus buoying prices, rather than depressing them. Such an outcome would be doubly dangerous, since it would
undermine the psychological value of the reserves.
Worldwide panic and buoyed prices ignite resource wars Gleason Report 10 (Investing Company, April,
http://www.gleasonreport.com/documents/The%20Real%20Gold%20Standard.pdf, accessed 6-26, JG) That price level implies gold will be $1500 (10x) to $2400 (16x) and possibly higher by 2015. The market is not psychologically ready for a higher multiple but it could happen during war or political upheaval. High oil prices will incite resource wars. The Iraq/Afghan wars are about energy. War with Iran is likely. Iran has 9% of the world's remaining oil reserves and borders the coveted Caspian Sea reserves. The West wants Iran's oil on the market and Europe wants pipeline alternatives to those owned by Russia. This is an economic survival issue for the western economies. The American economy and dollar dominance depend on affordable oil. The political stakes are high and the forward risks are ominous.
Resource Wars cause extinction Wooldridge 9 (Frosty Wooldridge, Freelance writer @ Cornell University, 2009,
http://www.australia.to/index.php?option=com_content&view=article&id=10042:humanity-galloping-toward-its-greatest-crisis-inthe-21st-century&catid=125:frosty-wooldridge&Itemid=244, accessed 6-26-11, JG)
Without transitioning away from use of fossil fuels, humanity will move further into an era of resource wars (remember, Africom has been added to the Pentagon's structure -- and China has noticed), clearly with intent to protect US interests in petroleum reserves. The consequences of more resource wars, many likely triggered over water supplies stressed by climate disruption, are likely to include increased unrest in poor nations, a proliferation of weapons of mass destruction, widening inequity within and between nations, and in the worst (and not unlikely) case, a nuclear war ending civilization.
Production costs high now CalFinder 11 (Solar Power Contractors, http://solar.calfinder.com/ask/why-are-solar-2, accessed 6-26-11, JG)
Yet solar cells are still expensive, and that has much to do with the production costs . Silicon is expensive to process from raw material to solar cell semiconductor. For that reason, streamlining the production
process has become a top goal for solar industrialists, and while advancements have certainly been made, production costs remain high.
Link will be inevitably triggered - solar energy will eventually be in high demand Lior 1 (Noam, Energy @ Philadelphia Uni.,
http://www.seas.upenn.edu/~lior/lior%20papers/Power%20from%20Space.pdf, accessed 6-26, JG) Power can be produced in space for terrestrial use by using a number of energy sources, including solar, nuclear, and chemical. On the one hand, in view of the rising demand for energy, the diminishing fuel and available terrestrial area for power plant siting, and the alarmingly increasing environmental effects of power generation, the use of space for power generation seems to be inevitable: (1) it allows highest energy conversion efficiency, provides the best heat sink, allows maximal source use if solar energy is the source, and relieves the Earth from the penalties of power generation, and (2) it is technologically feasible, and both the costs of launching payloads into space and those of energy transmission are declining because of other uses for space transportation, dominantly communications.
***MISCELLANEOUS***
high molecular weight in order to increase both thrust and electrical efficiency. This leaves a short list of propellant choices, usually resulting in one of the heavier noble gases like xenon. However, non-gaseous elements will typically ionize more readily than the noble gases . The
ionization potential is a measure of how easily a species ionizes. Not limiting choices to the stereotypical noble gases results in a myriad of interesting propellant choices, many of which can outperform xenon in nearly every category. Iodine is a particularly interesting choice as a propellant since it is almost as heavy as xenon, the heaviest of the stable noble gases, and it is easier to ionize than xenon. Other alternatives to the noble gases are metals. When these metals are ionized
and ejected from a Hall thruster, they sometimes return to the spacecraft. This can result in plating of important hardware. Iodine, a non-metal, does not introduce the plating problems present in a metal propellant fueled Hall thruster. However, iodine introduces an oxidation issue. Iodine, being more fairly electronegative, may oxidize certain materials. This can be more or less of a problem than plating, depending
on the hardware of the spacecraft. Iodine can be vaporized using less power than most other propellants. Several prospective propellants have significantly high melting points, which would need to be overcome before it can be flowed through a feed system and converted to plasma. The melting point of iodine is one of the lowest of the alternate propellant options. Finally, cost is another important factor. Xenon is a very expensive substance and is not
produced in high quantity. As of 2001, roughly ten tons per year of xenon was produced [3]. This could not sustain terrestrial industry and future space systems. The future of space depends on a more readily available propellant and one that is not going to dominate costs. Iodine is much
cheaper than its noble gas counterpart xenon. This is mainly because iodine is 25,000 times more abundant in the Earth's crust [4]. Iodine is a good alternative, but using it has its own technical barriers. One major barrier is its state of matter at room temperature. Iodine, being a solid at room temperature, cannot be simply
injected into the combustion chamber. The complexity in designing thrusters to use Iodine as a fuel acts as a barrier to this technology moving forward. The technology has been developed to adequately vaporize and pump solid propellants. However, integration is not fully tested to the point of confidently operating iodine propellant Hall thrusters while maintaining the proper storage temperature. This is an engineering issue for operating these thrusters. Although iodine may produce favorable theoretical results, the amount of power needed to sustain the gaseous iodine must be considered . Including
this in the power to thrust calculations more accurately relates current systems with possible iodine replacements [4]. An additional technical barrier is that iodine is not ideal in terms of ionization. Other species have better ionization characteristics. Metals typically will be influenced by a colliding electron to release an electron of their own and become positively charged ions. Because iodine is more commonly a negative ion, at least in the monatomic state, it
is possible that an interaction with an electron could influence the diatomic iodine to disassociate and create a negative ion. This would hurt the performance and efficiency of the thruster as the electric field is meant to accelerate positive charges to produce thrust [4]. Since iodine injected thrusters are more difficult to build, operate, and maintain, the profiles of these thrusters is not yet well
understood. The exhaust profile needs to be well characterized before an iodine fueled system can be flown in actual missions. More data must be collected to prove iodine as a viable option as a fuel. Unfortunately, there are
complications to testing the iodine propellant. The iodine plasma is more difficult to measure as it is more likely to corrode intrusive measuring devices. Many intrusive probes would be preferable to
get the best experimental data possible, but their degradation can skew results and ruin equipment. Other types of instrumentation must be used in order to adequately characterize the entire exhaust profile [4].
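Because xenon and iodine have nearly the same atomic mass, their ideal exhaust velocities at a given discharge voltage are nearly identical, which is why the comparison above turns on ionization behavior, corrosion, and cost instead. The sketch below applies the standard relation v = sqrt(2qV/m) for singly charged ions; the 300 V discharge voltage and the single-ionization simplification are assumptions for illustration, not figures from the evidence.

    # Ideal exhaust velocity of singly charged ions accelerated through a potential V:
    # v = sqrt(2 * q * V / m), compared for xenon and atomic iodine.
    import math

    Q = 1.602e-19        # elementary charge, C
    AMU = 1.66054e-27    # atomic mass unit, kg
    VOLTAGE = 300.0      # assumed discharge voltage, V (illustrative only)

    masses_amu = {"xenon": 131.29, "iodine": 126.90}

    for name, m_amu in masses_amu.items():
        v = math.sqrt(2 * Q * VOLTAGE / (m_amu * AMU))
        print(f"{name}: about {v / 1000:.1f} km/s at {VOLTAGE:.0f} V")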
a tungsten-based impregnated emitter which is highly susceptible to poisoning. Poisoning from water vapor or oxygen can seriously degrade the performance of these types of emitters . Another limiting factor for impregnated emitters is their lifetime, governed by the rate at which the impregnate evaporates. In a
high-current application, the lifetime of an emitter is considerably shorter than in low-current applications. Both the SPT-100 and BHT-200 mentioned above utilize an external cathode for electron production. Since both of these thrusters are relatively compact, this generally poses no major integration issues. As Hall thrusters increase in size, however, the
overall form factor of the thruster becomes increasingly important. Any real estate on a satellite bus utilized by a Hall thruster system cannot be used for other necessary subsystems such as thermal, power, or communications. The Hall thruster's plume divergence is another issue that must be carefully evaluated when a Hall thruster system is being integrated into a satellite. It is highly undesirable to have high energy particles from the thruster's exhaust plume striking solar arrays, communications antennas, or the satellite bus itself. Thruster plume symmetry and divergence need to be evaluated as the scale of the thruster increases.
continues to work. Otherwise, the arc restarting fails, and the discharge outside the thruster occurs. It can be solved through increasing the threshold to 150 V by adjusting the design point of power. However, it is difficult for water arcjet to remove the oscillation of plume and arc voltage . There are two possible reasons for the unstable plume and voltage. One is inappropriate working parameters. In general,
arcjet has a stable range for working parameters. When argon is used as a propellant at the same specific power as water in the identical arcjet, a stable plume appears at the outlet [13]. Thus, the design of thruster seems reasonable. Another is the oscillation of liquid propellant flow rate. There is no low-frequency fluctuation for gas propellant in the arcjet. However, in instability research for a liquid rocket, it is found that the low-frequency vibration results from the oscillation of propellant supply system [14]. In the previous design of water arcjet thruster, the vaporization is realized by the heat absorbed from the radiation of anode in the steel pipe, which twists outside the hightemperature region of anode. It is shown in Fig. 3 that the twisted steel pipe forms a coil. There is a small gap between the anode and the water tube coil. The phase change is unstable because of the uneven heating, which
results in the variation of water vapor's flow rate at the exit of the steel pipe.
The use of water as a propellant for the engineering model resistojet poses problems not encountered with the other propellants tested. Water is most conveniently stored as a liquid and requires significant input of energy to vaporize. This raises questions about the best manner in which to vaporize the liquid so that it may be
superheated and expelled through the nozzle. One could envision two systems: the first of these employs a separate water vaporizer upstream of the thruster with the thruster serving only to superheat the steam (ref. 13); the second would combine the vaporizing and superheating functions into a single unit (ref. 16). Since the engineering model thruster
was originally envisioned to operate on steam in combination with a separate steam generator, it was never optimized to perform as a water vaporizer . However, it was decided during the performance testing
program to make minor modifications to the engineering model to facilitate operation as a liquid water-fed thruster. The decoupled system (i.e., the system employing the separate boiler) appeared to operate in a manner very similar to the seven gaseous propellant systems discussed earlier, since the fluid entering the heat exchanger was already a vapor and required only superheating. Since the range of operating capabilities of the boiler was limited, only one propellant inlet pressure setting was tested (0.21 MPa), although four total power levels ranging from 780 to 1160 W were examined. The decoupled system demonstrated a maximum specific impulse of 184 sec at a thrust level of 230 mN while consuming 466 W in the water vaporizer and 692 W in the thruster. The heater temperature near the nozzle under these conditions was measured to be about 1140 °C. The coupled system required the thruster to act as a boiler and superheater. Therefore the thruster operated at high temperatures to perform the superheating, causing a large temperature difference between the incoming liquid and the heat exchanger walls. This was an undesirable condition according to traditional boiler design practice, which calls for a thin layer of liquid in contact with the heat exchanger wall. Such a condition would require a liquid-to-wall temperature difference on the order of 50 °C. However, the room-temperature liquid fed directly into the thruster encountered wall temperatures as high as 700 °C, which would cause the incoming liquid stream to flash to a mixture of superheated vapor and liquid droplets. The range of stable operation was narrower for the coupled system than for the decoupled system, so the power levels and thrust levels tested were highly interdependent. Four power levels ranging from 200 to 500 W, each at a unique thrust level, were tested. Figure 6 shows the relation between specific impulse and the ratio of input electric power to propellant mass flow rate for all of the operating conditions tested using water propellant. The coupled system demonstrated a maximum specific impulse of 159 sec at a thrust level of 84 mN while consuming 289 W. The heater temperature near the nozzle under these conditions was approximately 600 °C. The large variations in the data obtained from the coupled system as compared to the
data from the decoupled system are due to the relatively low flow rates experienced with the coupled system. These were typically only one-third the flow rates of the decoupled system, so the resulting uncertainty in mass flow rate was much larger for the coupled system. Figure 6 shows that no significant performance advantages exist for either water-feeding scheme over its alternative.
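One way to read the reported figures is to convert them into an overall thrust efficiency, eta = T * g0 * Isp / (2 * P); the calculation below only plugs in the thrust, specific impulse, and power numbers reported above, as a consistency check rather than new data.

    # Thrust efficiency implied by the reported water resistojet figures:
    # eta = jet power / electrical power = (T * g0 * Isp) / (2 * P).
    G0 = 9.81  # m/s^2

    cases = {
        # thrust (N), specific impulse (s), total electrical power (W)
        "decoupled (separate boiler + thruster)": (0.230, 184.0, 466.0 + 692.0),
        "coupled (thruster acting as boiler)":    (0.084, 159.0, 289.0),
    }

    for name, (thrust_n, isp_s, power_w) in cases.items():
        eta = thrust_n * G0 * isp_s / (2.0 * power_w)
        print(f"{name}: roughly {eta * 100:.0f}% thrust efficiency")

Both configurations land in the same low-efficiency range, consistent with the conclusion that neither feeding scheme offers a significant advantage.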
Water coaches won't be launched from Earth, can't land, and don't travel fast McConnell and Tolley 10 (Brian S. and Alex, Freelance Scientists, published in Journal of the British Interplanetary
Society, http://spacecoach.org/for-ship-designers/, JM)
Spacecoaches never enter a planetary atmosphere. Any craft that do, such as landers, are separate vehicles that will be designed independently. Electrothermal propulsion systems produce small amounts of thrust for a long period of time . The craft is not subjected to extreme forces or vibration as an Earth launched craft would be. Solar power, provided by large,
lightweight solar arrays, is the primary power source for the ship and its engines. Nuclear powerplants, such as those developed in the 1960s NERVA program, are not necessary, except perhaps for future missions beyond Jupiter. Water is reused extensively. Water may be initially frozen into pykrete shells to shield part of the ship, then melted off to irrigate crops and recycled many times before being sent to the engines for use as propellant later in the flight. Any ship fueled for a long duration trip (e.g. Martian moons) will be fueled with many tons of water. The ship can be viewed as an organic structure, almost like a cell, where the design goal is to minimize the amount of non-aqueous or non-organic mass. The
ship itself is mostly an empty membrane or shell that protects the plants, animals and people within, and may be home to extensive plant growth (for life support, food production, and to create a comfortable, natural
environment for its inhabitants). Large, complex structures can be built simply by interconnecting smaller units, with no upward limit on their size or form.
be used to supply high thrust steam and/or hydrogen peroxide thrusters that will be useful in ascent and landing maneuvers to higher gravity sites, such as the Martian moon Phobos, where electric propulsion systems will not be able to produce enough thrust. Steam rockets are the simplest type of rocket motor available. A steam rocket works by
superheating water above its normal boiling point in a pressure vessel, and then venting the exhaust out through a standard rocket nozzle. While steam rockets are not very efficient, with a specific impulse of only 45 seconds, they can be used to land on and ascend from low-gravity sites such as the Martian moons, Ceres and small moons of Jupiter and Saturn, as well as large asteroids and comets. Hydrogen peroxide is also an interesting propellant which can be generated from water and oxygen using an electrolytic process that has already been developed for wastewater treatment. Hydrogen peroxide rockets work by pumping concentrated hydrogen peroxide through a catalyst that decomposes it into superheated steam and oxygen. When used as a monopropellant, hydrogen peroxide motors run at a specific impulse of 160 seconds, and when the exhaust is mixed with kerosene or alcohol, specific impulse can be boosted to 265 seconds. In either case, this is sufficient to land and ascend from larger moons, and even
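To see what those specific impulse figures mean in practice, the ideal rocket (Tsiolkovsky) equation gives the propellant fraction needed for a given maneuver. The sketch below assumes a hypothetical 500 m/s round-trip delta-v budget for a low-gravity target; that budget is an illustrative assumption, not a figure from the source.

```python
# Sketch: propellant fractions implied by the quoted specific impulses.
# The 500 m/s delta-v budget is an assumed, illustrative value.
import math

G0 = 9.80665       # m/s^2
DELTA_V = 500.0    # m/s, assumed maneuver budget

def propellant_fraction(delta_v, isp_s):
    """Fraction of initial vehicle mass that must be propellant."""
    return 1.0 - math.exp(-delta_v / (isp_s * G0))

for name, isp in [("steam", 45), ("H2O2 monopropellant", 160), ("H2O2 + kerosene", 265)]:
    print(f"{name:20s} Isp = {isp:3d} s -> {propellant_fraction(DELTA_V, isp):.0%} propellant")
# steam ~68%, H2O2 monopropellant ~27%, H2O2/kerosene ~18% of initial mass
```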
would cost $640 billion, much too high for practical considerations. In addition, the extremely low production rate would require an unreasonably long fill time on the order of hundreds of years. The situation looks discouraging until we account for the anticipated improvements to the current production
capacity. In this case the costs would go down by at least 2 orders of magnitude to $64 million per µg, or $6.4 billion for a 100 µg mission. This is too expensive to support even occasional missions, and is certainly prohibitive for anything above the 10 µg level. However, this cost certainly permits ground-based testing and demonstration of antimatter-assisted fusion/fission propulsion technology, which would require quantities of only 1 µg or less.
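The cost arithmetic in this card can be reproduced directly; the sketch below assumes only that production cost scales linearly with the quantity of antiprotons, with all dollar figures taken from the card itself.

```python
# Sketch of the antimatter cost arithmetic quoted above.
cost_100ug_today = 640e9                      # $640 billion for a 100 ug mission
cost_per_ug_today = cost_100ug_today / 100    # $6.4 billion per microgram

improvement = 100                             # "at least 2 orders of magnitude"
cost_per_ug_future = cost_per_ug_today / improvement

print(cost_per_ug_future)        # 6.4e7  -> $64 million per microgram
print(cost_per_ug_future * 100)  # 6.4e9  -> $6.4 billion for a 100 ug mission
print(cost_per_ug_future * 1)    # 6.4e7  -> ~$64 million for the 1 ug or less
                                 #           needed for ground-based testing
```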
present the point of view that the only realistic applications for annihilation energy were in the military domain [13]. To everyone's surprise, the Americans didn't come. Ten days before the conference, they announced their withdrawal without giving any convincing explanations. The participants quickly realized that the American authorities had undoubtedly reevaluated the military importance of antimatter, and had probably prevented the Los Alamos scientists from coming to Madrid [14]. This exposed that scientists working at CERN, and coming from a non-European weapons laboratory, had interests other than fundamental research, interests that were obviously militarily sensitive.
Antimatter weapon development guarantees H-bomb proliferation Gsponer and Hurni 87 (Andre and Jean-Pierre, Independent Sci. Res. Inst., Oxford & Author, The Physical Principles
of Thermonuclear Explosives, Bulletin of Peace Proposals 19, pp.444-450, JM) Whether antimatter triggered thermonuclear weapons are realizable or not, or whether other weapons using annihilation energy are feasible or not, the fact that a relatively small quantity of antimatter can set off a very
powerful thermonuclear explosion creates serious problems for the future of the strategic balance. In fact, the arms control treaties presently in force deal only with fission related devices
and materials [16]: atomic bombs, nuclear reactors and fissile materials. By removing the fission fuse from thermonuclear weapons, antimatter-triggered H-bombs and neutron bombs could be constructed freely by any country possessing the capacity, and be placed anywhere, including outer space. Then again, even if technical obstacles prevented, for example, the actual construction of battle-field antimatter weapons, antimatter-triggered microexplosions would still allow small and middle-sized thermonuclear explosions to be
made in the laboratory. This possibility would considerably reduce the need for underground nuclear explosions, thus rendering ineffective any attempt to slow the arms race by an eventual
comprehensive nuclear test-ban treaty [16]. A nuclear test laboratory of this type could be based around a large heavy-ion accelerator [16], which would provide a means of massive antimatter production, as well as a driver to study the compression and explosion of thermonuclear fuel pellets.
100 µg of antiprotons were to suddenly annihilate with its confinement structure there would be a nearly instantaneous flux of 4.2 × 10^20 gamma rays. 57% of those gamma rays would have an energy of
approximately 200 MeV, while the others would be 0.511 MeV gammas. The gamma rays account for only 43% of the energy released from the annihilation. The rest is released in the form of neutrinos. Neutrinos interact very little with matter, therefore the gamma flux accounts for the vast majority of the radiation risk. This potential radiation hazard adds
much unwanted mass to the system design and may offset the lightweight advantages originally assumed. Another risk involves the regulation and supervision of antimatter fuel for fear that one would use
antiprotons as a weapon of mass destruction. However, the fact that all end products of proton-antiproton annihilation are neutral forms of radiation makes it a poor choice of weaponry.
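The quoted gamma-ray flux is consistent with a simple order-of-magnitude check. The sketch below assumes every antiproton annihilates with a proton and infers the average photon yield per annihilation from the card's 4.2 × 10^20 figure rather than modeling the pion decay chain in detail.

```python
# Order-of-magnitude check of the quoted gamma-ray flux from 100 ug of antiprotons.
M_PROTON_KG = 1.6726e-27
antiproton_mass_kg = 100e-9                            # 100 micrograms

n_annihilations = antiproton_mass_kg / M_PROTON_KG     # ~6.0e19
quoted_gammas = 4.2e20
print(quoted_gammas / n_annihilations)
# ~7 photons per annihilation, consistent with a few neutral pions (two
# ~200 MeV gammas each) plus 0.511 MeV photons from positron annihilation.
```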
Space elevators are touted as a novel and cheap way to get cargo, and possibly people, into space one day. So far, they have barely left the drawing board, but ultimately robots could climb a cable stretching 100,000 kilometres from Earth's surface into space. But there is a hitch: humans might not survive thanks to the whopping dose of ionising radiation they would receive travelling through the core of the Van Allen radiation belts around Earth. These are two concentric rings of charged particles trapped by Earth's magnetic fields. "They would die on the way through the radiation belts if they were unshielded," says Anders Jorgensen, author of a new study on the subject and a technical staff member at Los Alamos National Laboratory, New
Mexico, US. Space elevators had been planned to be anchored on an ocean platform near the equator, with the other end tied to a counterweight in space. At the equator, the most dangerous part of the radiation belts extends from about 1000 to 20,000 kilometres in altitude. The region did not hurt the Apollo astronauts in the 1960s and 1970s because their rockets delivered them swiftly through it. For a space elevator travelling at the
current proposed speed of 200 kilometres per hour, however, passengers might spend half a week in the belts. That would hit them with 200 times the radiation experienced by the Apollo astronauts.
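The "half a week" and "200 times" figures follow from straightforward arithmetic. The sketch below assumes a constant 200 km/h climb straight through the belt region quoted for the equator; the ~1 rem Apollo transit dose is taken from the Jorgensen study excerpted later in this file.

```python
# Sketch of the transit-time and dose arithmetic behind this card.
belt_bottom_km, belt_top_km = 1_000, 20_000
climber_speed_kmh = 200

hours_in_belts = (belt_top_km - belt_bottom_km) / climber_speed_kmh
print(hours_in_belts, hours_in_belts / 24)   # 95 h, ~4 days -> about half a week

apollo_dose_rem = 1
print(apollo_dose_rem * 200)                 # ~200 rem if dose scales with the
                                             # ~200x longer time spent in the belts
```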
Space elevator transportation is lethal and harmful to equipment Hoffman 6 (Michael, Contributor, DailyTech, November 15,
http://www.dailytech.com/Radiation+May+Kill+Travelers+on+Space+Elevator/article4953.htm, JM)
Although technology designated for the creation and implementation of space elevators has been increasing in popularity (even at NASA), a recent article in New Scientist claims that space elevators may inadvertently kill travelers due to high levels of ionising radiation. The Van Allen radiation belts, two rings of charged particles trapped in the Earth's magnetic field, would ultimately kill any humans on the space elevator. Astronauts who traveled through the belt in a spacecraft went relatively unharmed because they did so at a fast pace -- people on the elevator, however, may spend a half week in the belts. Even at short intervals, the
Van Allen radiation belts have been responsible for damaging shuttle and satellite integrated circuits and sensors. Researchers are looking into several different ways they would be able to protect space travelers
from the high level of radiation. The first way is to move the elevator away from the equator so that the more intense parts of the belts can be avoided. Some scientists have been quick to point out that even a relocation to the north or south might not be enough to reduce the amount of radiation exposure. Another idea being discussed is to create some sort of radiation shield to help block radiation when the travelers enter the Van Allen belts. But the shield would ultimately weigh down the elevator line enough to disrupt the motion of the cable and/or add unwanted stress on the line. The idea of creating a 62,000-mile elevator to carry supplies and humans into space has been met with a bit of interest and optimism from researchers. Even though there was no winner in the recent Space Elevator Games competition held in a New Mexico desert, contest organizers believe someone has the ability to win next year. The University of Saskatchewan Space Design Team, however, was close but ended up being disqualified after going over the time limit by two seconds. Needless to say, humans might still be using the good old rocket ship for some time to come even when, or if, a space elevator is built.
sickness, and perhaps death, for humans spending more than a brief period of time in the belts without shielding. The exact dose and the level of the related hazard depend on the type of radiation, the intensity of the radiation, the length of exposure, and on any shielding introduced. For the space elevator the radiation concern is particularly critical since it passes through the most intense regions of the radiation belts. The only humans who have ever traveled through the radiation belts have
been the Apollo astronauts. They received radiation doses up to approximately 1 rem over a time interval of less than an hour. A vehicle climbing the space elevator travels approximately 200 times slower than the moon rockets did, which would result in an extremely high dose of up to approximately 200 rem under similar conditions, in a timespan of a few days. Technological systems on the space elevator, which spend prolonged periods of time in the radiation belts, may also be affected by the high radiation levels. In this paper we will give an overview of the radiation belts in terms relevant to space elevator studies. We will then compute the expected radiation doses, and evaluate the required level of shielding. We concentrate on passive shielding using aluminum, but also look briefly at active shielding using magnetic fields. We also look at the effect of moving the space elevator anchor point and increasing the speed of the climber. Each of these mitigation mechanisms will result in a performance decrease, cost increase,
Efforts to reduce radiation from the space elevator moot its solvency Young 6 (Kelly, Staff, New Scientist, Space elevators:First floor, deadly radiation! November 13,
http://www.newscientist.com/article/dn10520, JM)
There are several possibilities for dealing with the radiation - all of which come with drawbacks. One option would be to move the elevator off the equator. By shifting the elevator north or south, the most intense part of the radiation belts could be avoided. "Basically what we found was that by moving off the equator by the largest amount you can, you reduce the radiation by a small factor - but probably not enough," says study co-author Blaise Gassend of MIT in the US. In addition, if the elevator was located at a latitude of 45° north, roughly the same latitude as MIT, the cable would veer south, pulled towards the equator by centrifugal forces. So it would run nearly horizontally through Earth's atmosphere for thousands of kilometres, putting weather-related stresses on the cable that could weaken it. Another option would be to have some sort of radiation shield stationed along the cable so the elevator could pick it up when it is about to reach the belts. But such a shield would weigh down the whole apparatus, disrupting the natural motion of the cable.
has been stuck on the ground floor for decades, not least because constructing a tether strong enough for the job is beyond current technology. Nanotubes might be up to the task, but they would have to be made longer and with fewer defects than any that can be fabricated today. A
new study makes the prospects appear even gloomier. Even if a space elevator could be built, it will need thrusters attached to it to prevent potentially dangerous amounts of wobbling, says Lubos Perek of the Czech Academy of Sciences' Astronomical Institute in Prague. The addition would increase the difficulty and cost of building and maintaining the elevator. Previous studies have noted that gravitational tugs from the Moon and Sun, as well as
pressure from gusts of solar wind, would shake the tether. That could potentially make it veer into space traffic, including satellites and bits of space debris. A collision could cut the tether and wreck the space elevator.
400 kilometers above Kansas would destroy most of the electronics that were not protected in the entire Continental United States. That is a large area. However, Electromagnetic Pulse remains almost
untested for small nuclear bombs. Electromagnetic Pulse was a theory of nuclear weapons that was not tested until the early 1960s. This was the same time period that Orion was under development, so little research was done on the Electromagnetic Pulse effects of Orion. In fact, George Dyson informed me that he read 6000 pages worth of information on Project Orion to write his book, largely in the form of declassified reports, and that Electromagnetic Pulse was not mentioned on a single page of these papers. I asked about this question in many message boards on the Internet. However, the best response I could get was that EMP is not significant for bombs of less than 1 megaton, from the Yahoo Project Orion club. I believed it was significant, but I was largely unable to find a way to test my hypothesis. However, I did get some information, mostly on using a very commonly used and applied physics formula known as the inverse-square law, which states that the intensity of a field or charge (or many other things) varies inversely with the square of the distance.
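A minimal illustration of the inverse-square scaling the author appeals to is sketched below. The 400 km reference altitude comes from the card; the intensities are normalized relative values, not measured EMP field strengths.

```python
# Minimal illustration of inverse-square scaling with distance.
def relative_intensity(distance_km, reference_km=400.0):
    """Intensity at distance_km relative to the intensity at reference_km."""
    return (reference_km / distance_km) ** 2

for d in (400, 800, 1_600, 4_000):
    print(f"{d:5d} km -> {relative_intensity(d):.4f} of the 400 km intensity")
# Doubling the distance cuts the received intensity to one quarter.
```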
EMP crushes US leadership, infrastructure, and readiness Foster et al 4 (John S., Special Commission to Assess the Threat to the US as a Result of EMP,
http://www.globalsecurity.org/wmd/library/congress/2004_r/04-07-22emp.pdf, JM) Several potential adversaries have or can acquire the capability to attack the United States with a high-altitude nuclear weapon-generated electromagnetic pulse (EMP). A determined adversary can achieve an EMP attack capability without having a high level of sophistication. EMP is one of a small number of threats that can hold our society at risk of catastrophic consequences. EMP will cover the wide geographic region within line of sight to the nuclear weapon. It has the capability to produce significant damage to critical infrastructures and thus to the very fabric of US society, as well as to the ability of the United States and Western nations to project influence and military power. The common element that can produce such an impact from EMP is primarily electronics, so pervasive in all aspects of our
society and military, coupled through critical infrastructures. Our vulnerability is increasing daily as our use of and dependence on electronics continues to grow. The impact of EMP is asymmetric
in relation to potential protagonists who are not as dependent on modern electronics. The current vulnerability of our critical infrastructures can both invite and reward attack if not corrected. Correction is feasible and well within the Nation's means and resources to accomplish.
biological and chemical warfare agents, and cyber attacks that might cause damage that could reach large-scale, long-term levels. The first order of business is to prevent any of these attacks from occurring. The US must establish a global environment that will profoundly discourage such attacks. We
must persuade nations to forgo obtaining nuclear weapons or to provide acceptable assurance that these weapons will neither threaten the vital interests of the United States nor fall into threatening hands.
EMP destroys the economy Foster et al 4 (John S., Special Commission to Assess the Threat to the US as a Result of EMP,
http://www.globalsecurity.org/wmd/library/congress/2004_r/04-07-22emp.pdf, JM) The financial services industry comprises a network of organizations and attendant systems that process instruments of monetary value in the form of deposits, loans, funds transfers, savings, and other financial transactions. It includes banks and other depository institutions, including the Federal Reserve System; investment-related companies such as underwriters, brokerages, and mutual funds; industry utilities such as the New York Stock Exchange, the Automated Clearing House, and the Society for Worldwide Interbank Financial Telecommunications; and third party processors that provide
electronic processing services to financial institutions, including data and network management and check processing. Virtually all American economic activity depends upon the functioning of the financial services industry. Today, most financial transactions that express national wealth are performed and recorded electronically. Virtually all transactions involving banks and other financial institutions happen electronically. Essentially all record-keeping of financial transactions involves information stored electronically. The financial services industry has evolved to the point that it would be impossible to operate without the efficiencies, speeds, and processing and storage capabilities of electronic information technology. The terrorist attacks of September 11, 2001, demonstrated the vulnerabilities arising from the significant interdependencies of the Nation's critical infrastructures. The attacks disrupted all critical infrastructures in New York City, including power, transportation, and telecommunications. Consequently, operations in key financial markets were interrupted, increasing liquidity risks for the United States financial system.
EMP disrupts the food supply system Foster et al 4 (John S., Special Commission to Assess the Threat to the US as a Result of EMP,
http://www.globalsecurity.org/wmd/library/congress/2004_r/04-07-22emp.pdf, JM)
EMP can damage or disrupt the infrastructure that supplies food to the population of the United States. Recent federal efforts to better protect the food infrastructure from terrorist attack tend to focus on
preventing small-scale disruption of the food infrastructure, such as would result from terrorists poisoning some food. Yet an EMP attack could potentially disrupt the food infrastructure over a large region encompassing many cities for a protracted period of weeks to months. Technology has made possible a dramatic revolution in US agricultural productivity. The transformation of the United States from a nation of farmers to a nation where less than 2 percent of the population is able to feed the other 98 percent and supply export markets is made possible only by technological advancements that, since 1900, have increased the productivity of the modern farmer by more than 50-fold. Technology, in the form of knowledge, machines, modern fertilizers and pesticides, high-yield crops and feeds, is the key to this revolution in food production. Much of the technology for food production directly or indirectly depends upon electricity, transportation, and other infrastructures. The distribution system is a chokepoint in the US
food infrastructure. Supermarkets typically carry only enough food to provision the local population for 1 to 3 days. Supermarkets replenish their stocks on virtually a daily basis from regional warehouses that usually carry enough food to supply a multi-county area for about one month. The large quantities of food kept in regional warehouses will do little to alleviate a crisis if it cannot be distributed to the population in a timely manner. Distribution depends largely on a functioning
transportation system.
services for the Federal government, including communications, remote sensing, weather forecasting, and imaging. The national security and homeland security communities use commercial satellites for critical activities, including direct and backup communications, emergency response services, and continuity of operations during emergencies. Satellite services are
important for national security and emergency preparedness telecommunications because of their ubiquity and separation from other communications infrastructures. The Commission to Assess United States National Security Space Management and Organization conducted an assessment of space activities that support US national security interests, and concluded that
space systems are vulnerable to a range of attacks due to their political, economic, and military value.19 Satellites in low Earth orbit generally are at very considerable risk of severe lifetime degradation or outright failure from collateral radiation effects arising from an EMP attack on ground targets.
important if not critical in low Earth orbit, not the least of which is the International Space Station. There are several thousand satellites that would be affected by this devastating effect of nuclear weapons. To do some kind of testing on this, I tested for bombs blowing up in the Van Allen belts.
To my surprise, none did. So while space Electromagnetic Pulse may damage something, it would not be a critical blow to the space industry.
this sort of scale cannot become reality with the existing economics of spaceflight, in which a space shuttle launch costs $450 million. Space activity needs economy of scale: It can be a lot cheaper if a lot
more of it is done. This is necessary if humanity is to expand into the solar system in any significant way.
Mass driver operations like bifrost bridge dont avoid massive costs Combs 10 (Mike, Freelance Writer, The Space Settlement FAQ, January, http://space.mike-combs.com/spacsetl.htm, JM)
Zubrin says even
assuming a lunar mass-driver could deliver ores to GEO at 1/10,000th of current launch prices, launching the raw material for building an O'Neill habitat would cost $4 trillion. He then considers it a "reasonable guess" that factoring in the costs of refining, processing, manufacturing, and construction would justify multiplying this price tag 10 times over. But this
latter calculation may gain unfair leverage from the current high costs of rocket launch into orbit, when the issues are the costs of refining ores in a region where solar energy is continuously available, and of construction in a region with access to zero gravity. Launch costs at 1/10,000th of current prices certainly sound like a generous assumption. But is there in fact any basis for comparison between rocketry and launch via electromagnetic forces? An M.I.T. study concluded that a lunar mass-driver could launch ore into space for a cost of around 10 cents/kilogram. For Zubrin to successfully dispute this, he must identify the calculation errors in these previous studies, and not merely throw out a number of his own, no matter how generous-sounding. Zubrin says that, "...the size and complexity of the O'Neill operation...boggles the mind". Certainly all can agree on this point. But it seems inescapable that building self-sufficient settlements on the surface of Mars of comparable capacity would require at least an equivalent amount of infrastructure not only be launched into Earth orbit, but propelled the additional distance to Mars, and then soft-landed on the surface. Zubrin is well known as an advocate of the position that this is within our capabilities.
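The gap between the two per-kilogram figures at issue can be illustrated with a short sketch. The ~$20,000/kg current chemical-launch cost used below is an assumed ballpark for illustration only; the 1/10,000 factor and the 10 cents/kg mass-driver estimate come from the card.

```python
# Sketch comparing Zubrin's assumed cost with the cited mass-driver estimate.
current_launch_cost_per_kg = 20_000.0                                # assumed ballpark, $/kg
zubrin_assumed_cost_per_kg = current_launch_cost_per_kg / 10_000     # $2/kg
mit_mass_driver_cost_per_kg = 0.10                                   # ~10 cents/kg

print(zubrin_assumed_cost_per_kg / mit_mass_driver_cost_per_kg)
# ~20x: even the "generous" 1/10,000 assumption prices lunar ore delivery well
# above the mass-driver estimate it is meant to stand in for.
```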