Aug 14, 2008

World - Experts ponder hazards of using technology to save the world

Last year, a private company proposed "fertilizing" parts of the ocean with iron, in hopes of encouraging carbon-absorbing blooms of plankton. Meanwhile, researchers elsewhere are talking about injecting chemicals into the atmosphere, launching sun-reflecting mirrors into stationary orbit above the earth or taking other steps to reset the thermostat of a warming planet.
This technology might be useful, even life-saving. But it would inevitably produce environmental effects impossible to predict and impossible to undo. So a growing number of experts say it is time for broad discussion of how and by whom it should be used, or if it should be tried at all.
Similar questions are being raised about nanotechnology, robotics and other powerful emerging technologies. There are even those who suggest humanity should collectively decide to turn away from some new technologies as inherently dangerous.
"The complexity of newly engineered systems coupled with their potential impact on lives, the environment, etc., raise a set of ethical issues that engineers had not been thinking about," said William Wulf, a computer scientist who until last year headed the National Academy of Engineering. As one of his official last acts, he established the Center for Engineering, Ethics and Society there.
Rachelle Hollander, a philosopher who directs the center, said the new technologies were so powerful that "our saving grace, our inability to affect things at a planetary level, is being lost to us," as human-induced climate change is demonstrating.
Engineers, scientists, philosophers, ethicists and lawyers are taking up the issue in scholarly journals, online discussions and conferences around the world.
"It's a hot topic," said Ronald Arkin, a computer scientist at Georgia Tech who advises the U.S. Army on robot weapons. "We need at least to think about what we are doing while we are doing it, to be aware of the consequences of our research."
So far, though, most scholarly conversation about these issues has been "piecemeal," said Andrew Maynard, chief science adviser for the Project on Emerging Nanotechnologies at the Woodrow Wilson Center in Washington. "It leaves the door open for people to do something that is going to cause long-term problems."
That's what some environmentalists said they feared when Planktos, a California company, announced it would embark on a private effort to fertilize part of the South Atlantic with iron in hopes of producing carbon-absorbing plankton blooms that the company could market as carbon offsets. Countries bound by the London Convention, an international treaty governing dumping at sea, issued a "statement of concern" about the work, and a U.N. group called for a moratorium, but it is not clear what would have happened had Planktos not abandoned the effort for lack of money.
"There is no one to say 'thou shalt not,"' said Jane Lubchenco, an environmental scientist at Oregon State University and a former president of the American Association for the Advancement of Science.
When scientists and engineers discuss geoengineering, it is obvious they are talking about technologies with the potential to change the planet. But the issue of engineering ethics applies as well to technologies whose planet-altering potential may not emerge until it is too late.
Arkin said robotics researchers should consider not just how to make robots more capable, but also who must bear responsibility for their actions and how much human operators should remain "in the loop," particularly with machines to aid soldiers on the battlefield or the disabled in their homes.
But he added that progress in robotics was so "insidious" that people might not realize they had ventured into ethically challenging territory until too late.
Ethical and philosophical issues have long been part of biotechnology, where institutional review boards commonly rule on proposed experiments and advisory committees must approve the use of gene-splicing and related techniques. When the U.S. government initiated its effort to decipher the human genome, a percentage of the budget went to consideration of ethics issues like genetic discrimination.
But such questions are relatively new for scientists and engineers in other fields. Some are calling for the same kind of discussion that microbiologists organized in 1975 when the immense power of their emerging knowledge of gene-splicing or recombinant DNA began to dawn on them. The meeting, at the Asilomar conference center in California, gave rise to an ethical framework that still prevails in biotechnology.
"Something like Asilomar might be very important," said Andrew Light, director of the Center for Global Ethics at George Mason University, one of the organizers of a conference in Charlotte, North Carolina, in April on the ethics of emerging technologies. "The question now is how best to begin that discussion among the scientists, to encourage them to do something like this, then figure out what would be the right mechanism, who would fund it, what form would recommendations take, all those details."
But an engineering Asilomar might be hard to bring off.
"So many people have their nose to the bench," Arkin said, "historically a pitfall of many scientists."
Paul Thompson, a philosopher at Michigan State and former secretary of the International Society for Environmental Ethics, said many scientists were trained to limit themselves to questions that can be answered empirically, in the belief that "scientists and engineers should not be involved in these kinds of ethical questions."
Researchers working in geoengineering say they worry that if people realize there are possible technical fixes for global warming, they will feel less urgency about reducing greenhouse gas emissions.
"Even beginning the discussion, putting geoengineering on the table and beginning the scientific work, could in itself make us less concerned about all the things that we need to start doing now," Light said.
On the other hand, some climate scientists argue that if people realized such drastic measures were on the horizon, they would be frightened enough to reduce their collective carbon footprint. Still others say that, given the threat global warming poses to the planet, it would be unethical not to embark on the work needed to engineer possible remedies - and to let policy makers know of its potential.
But when to begin this kind of discussion? "It's a really hard question," Thompson said. "I don't think anyone has an answer to it."
Many scientists do not like talking about their research before it has taken shape, for fear of losing control over it, according to David Goldston, former chief of staff at the House Science Committee and a columnist for the journal Nature. This mind-set is "generally healthy," he wrote in a recent column, but it is "maladapted for situations that call for focused research to resolve societal issues that need to be faced with some urgency."
And then there is the longstanding fear held by scientists that if they engage with the public for any reason, their work will be misunderstood or portrayed in inaccurate or sensationalized terms.
Francis Collins, who is stepping down as head of the government human genome project, said he had often heard researchers say "it's better if people don't know about it." But he said he was proud that the National Human Genome Research Institute had from the beginning devoted substantial financing to research on privacy, discrimination and other ethical issues raised by progress in genetics. If scientific research has serious potential implications in the real world, "the sooner there is an opportunity for public discussion the better," he said in a recent interview.
In part, that is because some emerging technologies will require political adjustments. For example, if the planet came to depend on chemicals in the atmosphere, orbiting mirrors or regular oceanic infusions of iron, system failure could mean catastrophic - and immediate - climate change. But maintaining the systems would require a political establishment with guaranteed indefinite stability.
As Collins put it, the political process these days is "not well designed to handle issues that are not already in a crisis."
Or as Goldston put it, "with no grand debate over first principles and no accusations of acting in bad faith, nanotechnology has received only fitful attention."
Meanwhile, there is growing recognition that climate engineering, nanotechnology and other emerging technologies are full of "unknown unknowns," factors that will not become obvious until they are put into widespread use at a scale impossible to turn back, as happened, in a sense, with the atomic bomb. Before its first test, some of its developers worried that the blast might set the atmosphere on fire. They did not anticipate that the bombs would generate electromagnetic pulses intense enough to paralyze electrical systems across a continent.
