Thursday, March 24, 2011

At the Edge of Invasion, Possible New Rules for Evolution



Just as Galapagos finches are icons of evolution by natural selection, Australia’s cane toads may someday be icons of “spatial sorting” — a dynamic that seems to exist at the edges of invasion, altering the standard rules of evolution.

Cane toads have evolved in odd ways Down Under. Adaptations that drove their dramatic spread made individual toads less reproductively fit. Evolution through natural selection of hereditary mutations still exists, but no longer appears driven by reproductive imperatives alone. It’s also shaped by speed.

“The possibility that some traits have evolved by ‘mating betwixt the quickest’ rather than ‘survival of the fittest’ warrants further attention,” wrote biologists led by the University of Sydney’s Richard Shine in the March 21 Proceedings of the National Academy of Sciences.

Introduced to northeast Australia 75 years ago in an ill-advised attempt at beetle control, cane toads spread like fire, their range expanding at an ever-increasing rate. When the toads first reached his study area, Shine noticed something strange: As expected, they displayed myriad adaptations — longer legs, greater endurance, a tendency to move faster, farther and straighter — that improved their ability to disperse. But the benefits of all that dispersal were unclear.




The fastest-spreading cane toads also had the highest mortality rates: Longer, stronger toad legs led to spinal injuries. “Most obviously, why did the toads just sprint through our magnificent, food-rich flood plain in a frenetic rush to keep on going?” said Shine. After all, if the toads’ evolution were driven solely by reproduction, they would have stopped to enjoy the spoils of invasion.

“Much of what they did seemed hard to reconcile with the idea of natural selection enhancing individual fitness,” said Shine. “We started thinking about what other kinds of processes could have caused them to become such driven little robotic dispersal machines.”

In the new study, Shine describes those processes, which fall under the rubric of “spatial sorting” and are most easily understood by analogy: Imagine a race between rowboats crewed by randomly distributed oarsmen. If the race is stopped intermittently, and oarsmen randomly redistributed between boats nearest each other, boats in the lead will accumulate ever-higher proportions of skilled rowers.

Those are the dynamics of spatial sorting. Boats are organisms, rowers are genes and the crew swap is reproduction. Each newly-crewed boat is offspring. Generation by generation, organisms in the lead get faster and faster. Classical natural selection still operates — if a mutation causes an organism’s offspring to go sterile, the lineage soon ends — but it’s no longer the only driver of evolution.

Now space matters, too. The physical proximity produced by dispersal continues to shape that dispersal: Whatever drives creatures to spread farther and faster clusters at the front. If an adaptation improves dispersal but hurts survival, it matters less than usual, because an organism’s pool of potential mates is determined by their shared ability to cover ground.
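The rowboat dynamic is easy to demonstrate in a toy simulation (a minimal illustrative sketch, not the model from Shine's paper): give individuals a heritable "dispersal speed" that confers no survival or reproductive advantage, let them mate only with spatial neighbors, and the invasion front still accumulates speed.

```python
import random

# Toy model of spatial sorting. Speed is heritable but neutral: every
# individual survives and reproduces. Mates are chosen only by proximity.
random.seed(1)

N = 500
pop = [{"x": 0.0, "speed": random.uniform(0.5, 1.5)} for _ in range(N)]

for generation in range(20):
    # Disperse: each individual moves a distance set by its genes.
    for ind in pop:
        ind["x"] += ind["speed"]
    # Reproduce locally: sorting by position means fast genes at the
    # front pair with other fast genes -- the "crew swap" of the analogy.
    pop.sort(key=lambda ind: ind["x"])
    offspring = []
    for i in range(N):
        j = max(0, min(N - 1, i + random.randint(-5, 5)))  # a nearby mate
        child_speed = (pop[i]["speed"] + pop[j]["speed"]) / 2
        child_speed += random.gauss(0, 0.05)               # mutation noise
        offspring.append({"x": (pop[i]["x"] + pop[j]["x"]) / 2,
                          "speed": max(0.0, child_speed)})
    pop = offspring

pop.sort(key=lambda ind: ind["x"])
front_speed = sum(ind["speed"] for ind in pop[-50:]) / 50  # leading 10%
core_speed = sum(ind["speed"] for ind in pop[:50]) / 50    # trailing 10%
print(f"trailing-edge mean speed: {core_speed:.2f}")
print(f"front-edge mean speed:    {front_speed:.2f}")
```

Run it and the front edge ends up markedly faster than the trailing edge, generation after generation, with no fitness differences anywhere in the model.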

A key challenge in studying spatial sorting is disentangling the effects of natural selection and spatial sorting. In many cases, better dispersal is a good, old-fashioned adaptation: It might help organisms find new sources of food, or relieve overcrowding.

Such disentanglement is presently hard to do, wrote Shine. Cane toads are the best-studied candidate for spatial sorting, though gaps in data still exist.

But spatial sorting might help explain instances of a phenomenon called preadaptation, in which complex traits emerge through the combination of many smaller adaptations, each of which provides no survival advantage on its own. It would seem unlikely for such adaptations to persist long enough to collect in one place — unless, that is, survival advantages are no longer so important. And in a world full of biological invasions, anything that helps explain their dynamics deserves further study.

“Spatial sorting may prove to be classical natural selection’s shy younger sibling, not as important as Darwinian processes but nonetheless capable of shaping biological diversity by a process so-far largely neglected,” wrote Shine’s team.

Image: Cane toad (Sam Fraser-Smith/Flickr)

Citation: “An evolutionary process that assembles phenotypes through space rather than through time.” By Richard Shine, Gregory P. Brown, and Benjamin L. Phillips. Proceedings of the National Academy of Sciences, March 21, 2011.
"

Tuesday, March 22, 2011

Post-Japan, Is a New Type of Nuclear Reactor in the Future?

Image: Traveling wave reactor.



As the ongoing nuclear saga in Japan plays out, a spotlight is being thrown on the reactor technology at the heart of it. With an alternative, innovative reactor design, much of the disaster could've been averted.
The Fukushima Daiichi nuclear plant at the heart of Japan's radiation woes at the moment is a 40-year-old General Electric design of boiling water reactor (BWR). The particular sequence of natural disasters that befell Northern Japan, combined with the particular design flaws of the BWRs has resulted in all that fear and headline-grabbing worry. But could an alternative reactor design have withstood the assault of one of the largest recorded earthquakes and a horrific tsunami? Turns out the answer is yes.
The alternative, pointed out by Freakonomics and Nathan Myhrvold's blog, is a kind of nuclear plant called a traveling wave reactor (TWR), designed by TerraPower (which counts Bill Gates as a key investor). And it's unique.
The Japanese BWR system relies on a hot reactor core of enriched uranium super-heating low-pressure water into steam, which then drives turbines and generators to produce electricity. Due to quirks of the GE engineering, the reaction-quenching control rods are thrust into the system from underneath, while spent fuel rods are hoisted out from the top of the reactor and stored in a large water tank (whose eerie blue glow you may have seen in older sci-fi movies) suspended many stories up in the building. By design, the reactor heats up unless it is actively cooled, and in Japan the earthquake, then the tsunami, knocked out power and with it the engineers' ability to keep that cooling going. It was the overheating, followed by hydrogen explosions that damaged the reactor buildings and the cooling tank, exposing fuel to air and letting it heat up still more, that made the situation so much worse.
But in the traveling wave reactor proposed by TerraPower, everything is different. The company touts it as a 'proliferation-resistant energy that produces significantly smaller amounts of nuclear waste than conventional nuclear reactors,' implying that in normal operation it's better anyway. All it takes is an 'initial start-up with a small amount of low-enriched material' and then the reactor 'can run for decades on depleted uranium--currently a waste byproduct of the enrichment process' that's needed to make fuel for conventional reactors. According to TerraPower 'an established fleet of TWRs could operate without enrichment for millennia.'
The reactor works like this: A pile of depleted uranium is given a kick-start by a small chunk of enriched uranium, a fissile material that spits out neutrons. These neutrons travel into the depleted uranium, converting it into active, heat-generating fuel (which heats water into steam, conventionally), and this region then activates the neighboring depleted uranium. That is the traveling wave of energy generation and new fuel creation that gives the process its name, although the 'burning region' actually stays still; the fuel is slowly slid through the active zone.
Everything happens in an enclosed reactor that needn't be opened for many decades, and there's no need for a spent-fuel pool: the spent fuel (now containing plutonium-239) is simply left in the reactor core behind the front of the traveling wave, where it cools by itself. There's no need for active cooling, and if there were an interruption in the plant's operation, the reactor would quickly cool by itself. If a disastrous earthquake and tsunami hit a TWR in action, it's unlikely a super-hot reactor emergency and potential meltdown would occur.
Source: Fast Company.

"

The triumph of coal marketing

Do you have an opinion about nuclear power? About the relative safety of one form of power over another? How did you come to this opinion?

Here are the stats, and here's the image. A non-exaggerated but simple version of his data:

Image: Death rate per watt, by energy source.


For every person killed by nuclear power generation, 4,000 die due to coal, adjusted for the same amount of power produced... You might very well have excellent reasons to argue for one form over another. Not the point of this post. The question is: did you know about this chart? How does it resonate with you?
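That 4,000-to-1 figure is a ratio of deaths per unit of energy produced. As a rough check, here are commonly cited deaths-per-terawatt-hour estimates consistent with the chart's ratio; the exact values below are assumptions that vary by source, not figures from this post:

```python
# Approximate deaths per terawatt-hour of electricity produced
# (world averages; values vary by source and are assumptions here).
deaths_per_twh = {
    "coal (world avg)": 161.0,  # mining accidents plus air pollution
    "oil": 36.0,
    "natural gas": 4.0,
    "nuclear": 0.04,            # including Chernobyl death estimates
}

ratio = deaths_per_twh["coal (world avg)"] / deaths_per_twh["nuclear"]
print(f"coal kills roughly {ratio:.0f}x more people than nuclear per TWh")
```

The ratio works out to roughly 4,000, which is where the chart's headline number comes from.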

Vivid is not the same as true. It's far easier to amplify sudden and horrible outcomes than it is to talk about the slow, grinding reality of day to day strife. That's just human nature. Not included in this chart are deaths due to global political instability involving oil fields, deaths from coastal flooding and deaths due to environmental impacts yet unmeasured, all of which skew it even more if you think about it.

This chart unsettles a lot of people, who assume there must be something wrong with it. Further proof of how easy it is to fear the unknown and accept what we've got.

I think that any time reality doesn't match your expectations, it means that marketing was involved. Perhaps it was advertising, or perhaps deliberate story telling by an industry. Or perhaps it was just the stories we tell one another in our daily lives. It's sort of amazing, even to me, how much marketing colors the way we see the world--our reaction (either way) to this chart is proof of it.

Saturday, March 19, 2011

Books Sculpted To Look Like Their Authors


For the Dutch book week, several books were hacked into and carved to look like their authors’ faces. There’s Anne Frank and Kader Abdolah up above, looking all wooden-headed. Which author would you choose to carve? [Behance via Selectism]

Friday, March 18, 2011

Understanding Japan’s Nuclear Crisis


By John Timmer, Ars Technica

Following the events at the Fukushima Daiichi nuclear reactors in Japan has been challenging. At best, even those present at the site have a limited view of what’s going on inside the reactors themselves, and the situation has changed rapidly over the last several days. Meanwhile, the terminology involved is somewhat confusing—some fuel rods have almost certainly melted, but we have not seen a meltdown; radioactive material has been released from the reactors, but the radioactive fuel currently remains contained.

Over time, the situation has become a bit less confused, as cooler heads have explained more about the reactor and the events that have occurred within it. What we’ll attempt to do here is aggregate the most reliable information we can find, using material provided by multiple credible sources. We’ve attempted to confirm some of this information with groups like the Nuclear Regulatory Commission and the Department of Energy but, so far, these organizations are not making their staff available to talk to the press.

Inside a Nuclear Reactor


Nuclear reactors are powered by the fission of a radioactive element, typically uranium. There are a number of products of this reaction, but the one that produces the power is heat, which the fission process gives off in abundance. There are different ways to extract electricity from that heat, but the most common way of doing so shares some features with the first steam engines: use it to boil water, and use the resulting pressure to drive a generator.

Radioactivity makes things both simpler and more complex. On the simpler side, fission will readily occur underwater, so it’s easy to transfer the heat to water simply by dunking the nuclear fuel directly into it.



In the reactor design used in Japan, the fuel is immersed in water, which boils off to generate power, is cooled, and then returns to the reactor. The pressure vessel and primary containment keep radioactivity inside. (Ars Technica)

Unfortunately, the radioactivity complicates things. Even though the fuel is sealed into rods, it’s inevitable that this water will pick up some radioactive isotopes. As a result, you can’t just do whatever you’d like with the liquid that’s been exposed to the fuel rods. Instead, the rods and water remain sealed in a high-pressure container and linked pipes, with the hot water or steam circulated out to drive machinery, but then reinjected back into the core after it has cooled, keeping a closed cycle.

The water recirculation doesn’t just let us get power out of the reactor; it’s essential to keeping the reactor core cool. Unless the heat of decay is carried away from the core, its temperature will rise rapidly, and the fuel and its structural support will melt.

The Fission Reaction



Uranium fission is sustained by neutrons: each split nucleus releases neutrons that trigger further splits, and neutron-absorbing control rods can choke off that chain reaction. Completely inserting control rods to limit uranium’s fission, however, doesn’t affect what’s happened to the products of previous reactions. Many of the elements that are produced following uranium’s split are themselves radioactive, and will decay without needing any encouragement from a neutron. Some of the neutrons from the reactor will also be absorbed by atoms in the equipment or cooling water, converting those to radioactive isotopes. Most of this additional radioactive material decays within the span of a few days, so it’s not a long-term issue. But it ensures that, even after a reactor is shut down by control rods, there’s enough radioactive decay around to keep things hot for a while.
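The scale of this leftover heat can be sketched with the Way-Wigner approximation, a standard rule of thumb for fission-product decay heat (the one-year operating time below is an illustrative assumption, not a figure from this article):

```python
def decay_heat_fraction(t_seconds, operating_seconds=365 * 24 * 3600):
    """Way-Wigner approximation: fraction of full thermal power still
    produced by fission-product decay t_seconds after shutdown, for a
    core run at steady power for operating_seconds beforehand."""
    return 0.0622 * (t_seconds ** -0.2
                     - (t_seconds + operating_seconds) ** -0.2)

for label, t in [("1 minute", 60), ("1 hour", 3600),
                 ("1 day", 86400), ("1 week", 7 * 86400)]:
    print(f"{label:>8}: {100 * decay_heat_fraction(t):.2f}% of full power")
```

The fraction drops from a few percent to a fraction of a percent over days, but even half a percent of a gigawatt-scale thermal output is megawatts of heat, which is why cooling must continue long after shutdown.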

All of which makes the continued operation of the plant’s cooling system essential. Unfortunately, cooling system failures have struck several of the reactors at Fukushima Daiichi.

Surviving the Quake, But Not the Tsunami


Because cooling is so essential to a plant’s operation, there are a few layers of backups to keep the pumps running. For starters, even if the reactors themselves are taken offline, the coolant pumps can receive power from offsite; this option was eliminated by the earthquake itself, which apparently cut off the external power to Fukushima. The earthquake also triggered a shutdown of the reactors, removing the obvious local source of power to the pumps. At this point, the first backup system kicked in: a set of on-site generators that burn fossil fuels to keep the equipment running.

Those generators lasted only a short while before the tsunami arrived and swamped them, flooding parts of the plant’s electrical system in the process. Batteries are in place to allow a short-term backup for these generators; it’s not clear whether these failed due to the problems with the electrical system, or were simply drained. In any case, additional generators were slow to arrive due to the widespread destruction, and didn’t manage to get the pumps running again when they did.

As a result, the plants have been operating without a cooling system since shortly after the earthquake. Even though the primary uranium reaction was shut down promptly, the reactor cores have continued to heat up due to secondary decay products.

Ugly Possibilities


Without cooling, there are a number of distinctly ugly possibilities. As water continues to be heated, more steam will be generated within the reactor vessel, increasing the pressure there, possibly to the point where the vessel would fail. The reactor vessel would burst into a primary containment vessel, which would limit the immediate spread of radioactive materials. However, the rupture of the reactor vessel would completely eliminate any possibility of restoring the coolant system, and might ultimately leave the reactor core exposed to the air.

And that would be a problem, since air doesn’t carry heat away nearly as efficiently as water, making it more likely that the temperatures would rise sufficiently to start melting the fuel rods. The other problem with exposing the fuel rods to air is that the primary covering of the rods, zirconium, can react with steam, reducing the integrity of the rods and generating hydrogen.

To respond to this threat, the plant’s operators took two actions, done on different days with the different reactors. To begin with, they attempted to pump cold sea water directly into the reactors to replace the boiled-off coolant water. This was not a decision made lightly; sea water is very corrosive and will undoubtedly damage the metal parts of the reactor, and its complex mixture of contents will also complicate the cleanup. This action committed the plant operators to never running it again without a complete replacement of its hardware. As an added precaution, the seawater was spiked with a boron compound in order to increase the absorption of neutrons within the reactor.

The second action involved the bleeding off of some pressure from the reactor vessel in order to lower the risk of a catastrophic failure. This was also an unappealing option, given that the steam would necessarily contain some radioactivity. Still, it was considered a better option than allowing the container to burst.

This decision to bleed off pressure ultimately led to the first indications of radioactivity having escaped the reactor core and its containment structure. Unfortunately, it also blew the roof off the reactor building.

Hard Choices Lead to Bad Results


As seen in some rather dramatic video footage, shortly after the pressure was released, the buildings housing the reactors began to explode. The culprit: hydrogen, created by the reaction of the fuel casing with steam. The initial explosions occurred without damaging the reactor containment vessel, meaning that more significantly radioactive materials, like the fuel, remained in place. Larger increases in radioactivity, however, followed one of the explosions, indicating possible damage to the containment vessel, although levels have since fluctuated.

However, the mere presence of so much hydrogen indicated a potentially serious issue: it should only form if the fuel rods have been exposed to the air, which indicates that coolant levels within the reactor have dropped significantly. This also means that the structural integrity of the fuel rods is very questionable; they’ve probably partially melted.

Part of the confusion in the coverage of these events has been generated by the use of the term “meltdown.” In a worst-case scenario, the entire fuel rod melts, allowing it to collect on the reactor floor, away from the moderating effect of any control rods. Its temperature would soar, raising the prospect that the material will become so hot that it will melt through the reactor floor, or reach a source of water and produce an explosive release of steam laced with radioactive fuel. There is no indication that any of this is happening in Japan at the moment.

Still, the partial melting of some fuel does increase the chances that some highly radioactive material will be released. We’re nowhere near the worst case, but we’re not anywhere good, either.

An additional threat has recently become apparent, as one of the inactive reactors at the site suffered from an explosion and fire in the area where its fuel is being stored. There is almost no information available about how the tsunami affected the stored fuel. Hydrogen is again suspected to be the source of the explosion, which again suggests that some of the fuel rods have been exposed to the air and could be melting. It’s possible that problems with the stored fuel contributed to the recent radiation releases, since there isn’t nearly as much containment hardware between the storage area and the environment.

Again, plans have been made to add sea water to the storage area, both by helicopter drops attempted earlier today, and through standard firefighting equipment.

Where We Stand


So far, the most long-lived radioactive materials at the site appear to remain contained within the reactor buildings. Radioisotopes have escaped, and continue to escape, containment, but there’s no indication yet that these are anything beyond secondary decay products with short half-lives.

Although radiation above background levels has been detected far from the reactor site, most of this has been low-level and produced by short-lived isotopes. Prevailing winds have also sent a lot of the radioactive material out over the Pacific. As a result, most of the problems with radioactive exposure have been in the immediate vicinity of the Fukushima Daiichi reactors themselves, where radiation has sometimes reached threatening levels; it’s been possible to hit a yearly safe exposure limit within a matter of hours at times. Areas around the reactors have been evacuated or subject to restrictions, but it’s not clear how far out the areas of significant exposure extend, and they may change rapidly.

All of this is severely complicating efforts to get the temperatures under control. Personnel simply can’t spend much time at the reactor site without getting exposed to dangerous levels of radioactivity. As a result, all of the efforts to get fresh coolant into place have been limited and subject to interruption whenever radiation levels spike. The technicians who continue to work at the site are putting their future health at risk.

There is some good news here, as each day without a critical failure allows more of the secondary radioactive materials to decay, lowering the overall risk of a catastrophic event. In the meantime, however, there’s little we can do to influence the probability of a major release of radioactive material. Getting seawater into the reactors has proven to be hit-or-miss, and we don’t have a strong sense of the structural integrity of a lot of the containment buildings at this point; what’s happening in the fuel storage areas is even less certain. In short, our only real option is to try to get more water in and hope for the best.

Future of Nuclear Energy


Nuclear power plays a big role in most plans to limit the use of fossil fuels, and the Department of Energy has been working to encourage the building of the first plants in decades within the US. The protracted events in Japan will undoubtedly play a prominent role in the public debate; in fact, they may single-handedly ignite discussion on a topic that the public was largely ignoring. The take-home message, however, is a bit tough to discern at this point.

In some ways, the Japanese plants, even though they are an old design, performed admirably. They withstood the fifth-largest earthquake ever recorded, and the safety systems, including the automatic shutdown and backup power supplies, went into action without a problem. The containment systems have largely survived several hydrogen explosions and, so far, the only radioactive materials that have been released are short-lived isotopes that are concentrated in the plant’s vicinity. If things end where they are now, the plants themselves will have done very well under the circumstances.

But, as mentioned above, ending where we are now is completely beyond our control, and that highlights some reasons why this can’t be considered a triumph. Some of the issues are in the design. Although the plant was ready for an extreme event, it clearly wasn’t designed with a tsunami in mind—it is simply impossible to plan for every eventuality. However, this seems to be a major omission given the plant’s location. It also appears that the fuel storage areas weren’t nearly as robustly designed as the reactors.

Once the cooling crisis started, a set of predictable issues cropped up. We can never send humans inside many of the reactor areas, leaving us dependent upon monitoring equipment that may not be working or reliable during a crisis. And, once radiation starts to leak, we can’t send people to many areas that were once safe, meaning we’ve got even less of an idea of what’s going on inside, and fewer points to intervene at. Hardware that wasn’t designed for some purposes, like pumping sea water into the reactor vessel, hasn’t worked especially well for the emergency measures.

On balance, the safety systems of this reactor performed reasonably well, but were pushed up against a mixture of unexpected events and design limits. And, once anything starts to go wrong with a nuclear reactor, it places the entire infrastructure under stress, and intervening becomes a very, very difficult thing to do.

This latter set of issues means that the surest way to build a safe nuclear plant is to ensure that nothing goes wrong in the first place. There are ways to reduce the risk by adding more safety and monitoring features while tailoring the design to some of the most extreme local events. But these will add to the cost of a nuclear plant, and won’t ever be able to ensure that nothing goes wrong. So, deciding if and how to pursue expanded nuclear power will require a careful risk analysis, something the public is generally ill-equipped for.

Top image: Ars Technica.

Source: Ars Technica.


Thursday, March 17, 2011

Let's Stop Arguing About Market Pricing of Carbon Credits

Image: Professor Ross Garnaut (file photo: Melissa Adams).

Yesterday I heard Professor Ross Garnaut address the National Press Club in Canberra on climate change. The comments below represent my 'rant', which generally supports Garnaut's proposals. There is little doubt that a tax on carbon will make a significant contribution to government tax revenue. It is 'a great big tax on everything'... but the aggregate amount of tax collected by the Government need not necessarily increase. We can reduce the tax take on personal incomes and business incomes while balancing our budget through the new revenue from a tax on carbon. The Henry Tax Review provides just one possibility to re-set the tax mix in a way that makes collection more efficient, provides better equity of treatment between taxpayers, and presents a tax regime with less drag on our efforts to improve productivity.

I hope the Coalition continues to believe in the importance of achieving a 5% reduction in greenhouse emissions by 2020 (as supported in Parliament).

To meet our international commitments on carbon emission reductions, we need to make carbon emission credits a scarce resource. Arguably, the current Coalition policies rely on central management of carbon emissions to achieve target reductions, rather than placing reliance on market pricing to allocate a scarce resource. While central management of carbon emissions cloaks the taxing of carbon, it will eventually be seen as a tax (in the same way that Australians eventually saw tariffs as a form of taxation). Surely the Coalition should prefer to focus their political argument on the areas where they have historically had an advantage... economic management! Why would the Coalition choose to push policies that are deficient in economic logic? Market pricing is almost universally accepted as the most effective way of distributing a scarce resource. Even the Chinese Government has acknowledged the superiority of distribution through market pricing.

Where should the Coalition focus their policy initiatives? If the major political parties took a bi-partisan position on the mechanisms of allocating scarce carbon credits and the mechanisms for collecting the carbon tax, they could then focus on the more important areas of:
  • the rate of carbon tax to be applied
  • the change in the aggregate tax to be collected 
  • the tax reductions applied to other tax sources to improve efficiency, equity and productivity
These are the areas where the Coalition has a 'home ground' advantage. 


Exposure to Carcinogens from Coal-Fired Power Plants

The nuclear debate is too complicated for me. The crisis in Japan seems to have stopped in its tracks the renaissance of the nuclear industry. Much of the crisis in Japan arises from storing spent nuclear rods on-site... not very clever if the power plant is in an earthquake-prone region. Another issue seems to be the age of the plants and the brittle nature of the old steel and concrete casings of nuclear material. There appear to be valuable lessons to take from the extreme conditions the Japanese plants endured.


How do you measure the risks of extreme conditions applied to both nuclear and coal-fired power plants? Below are comments from a contributor named 'gunslinger' to an article published in Scientific American entitled 'Should Japan's Reactor Crisis Kill the Nuclear Renaissance?'
I cannot vouch for the accuracy of his/her assertions... but some interesting issues are raised.


I honestly question which would be an immediately worse disaster, a nuclear meltdown or a terrorist attack on a coal site. A nuclear meltdown has the potential to expose plant personnel to unhealthy levels of radiation, possibly leading to cancer. It also has the potential to send a radioactive cloud a few tens of miles away from the plant (remember, exposure time in the cloud will be short since the cloud travels with the wind); the danger is really when the cloud settles on houses, cars and people's clothing. Realistically, under a worst-case scenario, people within a 30-mile radius could potentially be exposed to unhealthy levels. Conversely, at a coal site, if a terrorist were to explode the ammonia tanks, a cloud of ammonia could form and travel roughly the same distance, I suspect. The difference is that when people breathe in ammonia gas, it pretty much melts their lungs.

Carcinogens are what we are afraid of, rightfully. Radiation and pollution are both carcinogens. Extremely high doses of radiation are more dangerous than pollution, but to see immediate acute effects from radiation, you need to be handling weapons-grade plutonium, not the stuff in reactors. When you really sit down and determine which is worse, coal is worse. Just imagine the mass of coal used by these facilities.

Let's just do the math, real quick and dirty. One site I was at used 3 TONS of coal per SECOND! 0.02% of the weight of this coal is uranium, thorium and other radioactive materials. So that's 0.06 tons of radioactive material spewed directly into the air every second (not to mention the carcinogenic combustion byproducts); though that sounds like a lot, it isn't that far off.

A typical nuclear site uses about 1,500 tons of uranium in the reactor core. Doing the math using the values from the previous paragraph, every 8 hours this coal site puts as much radioactive material directly into the air we breathe as a nuclear plant uses and stores in 2 YEARS... every 8 hours!! Granted, the coal uranium isn't as radioactive, but remember, what's important is amplitude, type and TIME OF EXPOSURE. We are in contact with coal byproducts 24 hours a day. I do question which is worse.
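The commenter's arithmetic is worth rerunning with the inputs exactly as stated (the inputs themselves, 3 tons of coal per second, 0.02% radioactive by weight, a 1,500-ton core, are unverified). Note that 0.02% of 3 tons is 0.0006 tons per second, not the 0.06 quoted, which stretches the "every 8 hours" comparison to roughly a month:

```python
# Inputs taken at face value from the comment above; unverified.
coal_tons_per_second = 3.0
radioactive_fraction = 0.0002   # 0.02% by weight, as claimed
core_uranium_tons = 1500.0      # claimed uranium in a reactor core

radioactive_tons_per_second = coal_tons_per_second * radioactive_fraction
print(f"{radioactive_tons_per_second:.4f} tons/s")  # 0.0006, not 0.06

seconds_to_match_core = core_uranium_tons / radioactive_tons_per_second
print(f"{seconds_to_match_core / 3600:.0f} hours "
      f"(~{seconds_to_match_core / 86400:.0f} days), not 8 hours")
```

Even at roughly a month rather than 8 hours, the qualitative point about continuous low-level coal emissions stands, but the headline figure in the comment rests on a slipped decimal.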