Saturday, January 29, 2011

9 Steps To Perfect Health – #1: Don’t Eat Toxins

doughnut  


Imagine a world where:
  • diabetes, heart disease, autoimmunity and other modern diseases are rare or don’t exist at all
  • we are naturally lean and fit
  • we are fertile throughout our childbearing years
  • we sleep peacefully and deeply
  • we age gracefully without degenerative diseases like Alzheimer’s and osteoporosis

While this might sound like pure fantasy today, anthropological evidence suggests that this is exactly how human beings lived for the vast majority of our evolutionary history.

Today, most people accept diseases like obesity, diabetes, infertility and Alzheimer’s as “normal”. But while these diseases may now be common, they’re anything but normal. Humans evolved roughly 2.5 million years ago, and for roughly 84,000 generations we were naturally free of the modern diseases which kill millions of people each year and make countless others miserable. In fact, the world I asked you to imagine above – which may seem preposterous and unattainable today – was the natural human state for our entire history on this planet up until a couple hundred years ago.

What was responsible for the change? What transformed us from naturally healthy and vital people free of degenerative disease into a world of sick, fat, infertile and unhappy people?

In a word? The modern lifestyle. And though there are several aspects of our current lifestyle that contribute to disease, the widespread consumption of food toxins is by far the greatest offender. Specifically, the following four dietary toxins are to blame:

  • Cereal grains (especially refined flour)
  • Omega-6 industrial seed oils (corn, cottonseed, safflower, soybean, etc.)
  • Sugar (especially high-fructose corn syrup)
  • Processed soy (soy milk, soy protein, soy flour, etc.)

What is a toxin?


At the simplest level, a toxin is something capable of causing disease or damaging tissue when it enters the body. When most people hear the word “toxin”, they think of chemicals like pesticides, heavy metals or other industrial pollutants. But even substances necessary to sustain life, like water, are toxic at high doses.

In their book The Perfect Health Diet, Paul & Shou-Ching Jaminet apply the economic principle of declining marginal benefits to toxins:

It implies that the first bit eaten of any toxin has low toxicity. Each additional bit is slightly more toxic than the bit before. At higher doses, the toxicity of each bit continues to increase, so that the toxin is increasingly poisonous.

This is important to understand as we discuss the role of dietary toxins in contributing to modern disease. Most of us won’t get sick from eating a small amount of sugar, cereal grain, soy and industrial seed oil. But if we eat those nutrients (or rather anti-nutrients) in excessive quantities, our risk of developing modern diseases rises significantly.

That’s exactly what’s happening today. These four food toxins – refined cereal grains, industrial seed oils, sugar and processed soy – comprise the bulk of the modern diet. Bread, pastries, muffins, crackers, cookies, soda, fruit juice, fast food and other convenience foods are all loaded with these toxins. And when the majority of what most people eat on a daily basis is toxic, it’s not hard to understand why our health is failing.

Let’s look at each of these food toxins in more detail.

Cereal grains: the unhealthiest “health food” on the planet?


The major cereal grains – wheat, corn, rice, barley, sorghum, oats, rye and millet – have become the staple crops of the modern human diet. They’ve also become the “poster children” of the low-fat, high-carbohydrate diet promoted by organizations like the American Heart Association (AHA) and American Diabetes Association (ADA). If you say the phrase “whole grains” to most people, the first word that probably comes to their mind is “healthy”.

But the fact is that most animals, including our closest relative (the chimpanzee), aren’t adapted to eating cereal grains and don’t eat them in large quantities. And humans have only been eating them for the past 10,000 years (a tiny blip of time on the scale of evolution). Why?

Because plants like cereal grains are always competing against predators (like us) for survival. Unlike animals, plants can’t run away from us when we decide to eat them. They had to evolve other mechanisms for protecting themselves. These include:

  • producing toxins that damage the lining of the gut;
  • producing toxins that bind essential minerals, making them unavailable to the body; and,
  • producing toxins that inhibit digestion and absorption of other essential nutrients, including protein.

One of these toxic compounds is the protein gluten, which is present in wheat and many of the other most commonly eaten cereal grains. In short, gluten damages the intestine and makes it leaky. And researchers now believe that a leaky gut is one of the major predisposing factors for conditions like obesity, diabetes and autoimmune disease.

Celiac disease (CD) – a condition of severe gluten intolerance – has been well known for decades. Celiacs have a dramatic and, in some cases, potentially fatal immune response to even the smallest amounts of gluten.

But celiac disease is just the tip of the iceberg when it comes to intolerance to wheat and other gluten-containing grains. Celiac disease is characterized by antibodies to two components of the gluten compound: alpha-gliadin and transglutaminase. But we now know that people can and do react to several other components of wheat and gluten. The diagram below shows how wheat and gluten are broken down in the body:

diagram of components of wheat

Current laboratory testing for gluten intolerance screens only for alpha-gliadin and transglutaminase, the two components of gluten implicated in celiac disease (highlighted in red in the diagram). But as you can see, wheat contains several other problematic components: lectins like wheat germ agglutinin (WGA); other epitopes of the gliadin protein, such as beta-gliadin, gamma-gliadin and omega-gliadin; another protein called glutenin; an opioid peptide called gluteomorphin; and deamidated gliadin, a compound produced by the industrial processing or digestion of gluten.

So here’s the thing. Studies now clearly show that people can react negatively to all of these components of wheat – not just the alpha-gliadin and transglutaminase that celiacs react to. And the worst part of this is that up until about 2 weeks ago, no commercial labs were testing for sensitivity to these other subfractions of wheat.

This means, of course, that it’s extremely likely that far more people are intolerant to wheat and gluten than conventional wisdom would tell us. In fact, that’s exactly what the latest research shows. Dr. Kenneth Fine, a pioneer in gluten intolerance research, has demonstrated that 1 in 3 Americans are gluten intolerant, and that 8 in 10 have the genes that predispose them to developing gluten intolerance.

This is nothing short of a public health catastrophe in a nation where the #1 source of calories is refined flour. But while most are at least aware of the dangers of sugar, trans-fat and other unhealthy foods, fewer than 1 in 8 people with celiac disease are aware of their condition. A 1999 paper in the British Medical Journal illustrated this well:

Graphic depicting incidence of undiagnosed celiac disease

Patients with clinically obvious celiac disease (observable inflammation and destruction of the gut tissue) comprise only 12.5% of the total population of people with CD; the remaining 87.5% have no obvious gut symptoms. In other words, for every symptomatic patient with CD, there are seven more with CD and no gastrointestinal symptoms.
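For the numerically inclined, the iceberg proportions above reduce to simple arithmetic. Here’s a quick sketch (the 12.5% figure is taken from the 1999 BMJ paper cited above):

```python
# Fraction of celiac disease (CD) patients with clinically obvious
# gut symptoms, per the 1999 BMJ paper.
symptomatic = 0.125

# The remainder have CD but no obvious gastrointestinal symptoms.
asymptomatic = 1 - symptomatic  # 0.875

# How many silent patients exist for every symptomatic one?
ratio = asymptomatic / symptomatic
print(ratio)  # → 7.0
```

Put another way: only one celiac patient in eight shows obvious gut symptoms; the other seven are silent.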

But does that mean patients with CD without gut symptoms are healthy? Not at all. It was long believed that the pathological manifestations of CD were limited to the gastrointestinal tract. But research over the past few decades has revealed that gluten intolerance can affect almost every other tissue and system in the body, including:

  • brain;
  • endocrine system;
  • stomach and liver;
  • nucleus of cells;
  • blood vessels; and,
  • smooth muscle,

just to name a few!

This explains why CD and gluten intolerance are associated with several different diseases, including type 1 diabetes, thyroid disorders, osteoporosis, neurodegenerative conditions like Alzheimer’s, Parkinson’s and dementia, psychiatric illness, ADHD, rheumatoid arthritis, migraine, obesity and more. The table below from the same 1999 BMJ paper depicts the increased incidence of other diseases in patients with CD:

table showing associations of other diseases with celiac disease

As you can see, up to 17% of people with CD have an “undefined neurological disorder”. But even that alarmingly high statistic only accounts for people with diagnosed CD. We know that only 1 in 8 people with CD are diagnosed. We also know that those with CD represent only a small fraction of the population of people with gluten intolerance. With this in mind, it’s not hard to imagine that the number of people with gluten intolerance that have “undefined neurological disorders” (and other associated conditions on the list above) could be significantly higher than current research suggests.

Finally, we also now know that when you are gluten intolerant – which 33% (if not more) of you are – you will also “cross-react” with other foods that have a similar “molecular signature” to gluten and its components. Unfortunately, the list of these foods (shown below) contains all grains, which is why some medical practitioners (myself included) recommend not just a gluten-free diet, but an entirely grain-free diet. As you can see, it also contains other foods like dairy (alpha & beta casein, casomorphin, milk butyrophilin) and coffee (which is a very common cross-reactant).

  • alpha-casein
  • beta-casein
  • casomorphin
  • milk butyrophilin
  • cow’s milk
  • american cheese
  • chocolate
  • coffee
  • all cereal grains
  • quinoa
  • amaranth
  • buckwheat
  • tapioca
  • rice
  • potato
  • corn
  • sesame

Industrial seed oils: unnatural and unfit for human consumption


Industrial seed oils (corn, cottonseed, soybean, safflower, sunflower, etc.) have not been a part of the human diet up until relatively recently, when misguided groups like the AHA and the ADA started promoting them as “heart-healthy” alternatives to saturated fat.

The graph below shows how dramatically seed oil consumption has risen over the past several decades:

graph of seed oil consumption over time

Throughout 4-5 million years of hominid evolution, diets were abundant in seafood and other sources of omega-3 long chain fatty acids (EPA & DHA), but relatively low in omega-6 seed oils.

Anthropological research suggests that our hunter-gatherer ancestors consumed omega-6 and omega-3 fats in a ratio of roughly 1:1. It also indicates that both ancient and modern hunter-gatherers were free of the modern inflammatory diseases, like heart disease, cancer, and diabetes, that are the primary causes of death and morbidity today.

At the onset of the industrial revolution (about 140 years ago), there was a marked shift in the ratio of n-6 to n-3 fatty acids in the diet. Consumption of n-6 fats increased at the expense of n-3 fats. This change was due to both the advent of the modern vegetable oil industry and the increased use of cereal grains as feed for domestic livestock (which in turn altered the fatty acid profile of meat that humans consumed).

The following chart lists the omega-6 and omega-3 content of various vegetable oils and foods:

efa content of oils

Vegetable oil consumption rose dramatically between the beginning and end of the 20th century, and this had an entirely predictable effect on the ratio of omega-6 to omega-3 fats in the American diet. Between 1935 and 1939, the ratio of n-6 to n-3 fatty acids was reported to be 8.4:1. From 1935 to 1985, this ratio increased to 10.3:1 (a 23% increase). Other calculations put the ratio as high as 12.4:1 in 1985. Today, estimates of the ratio range from an average of 10:1 to 20:1, with a ratio as high as 25:1 in some individuals.
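The percentage and the “10 to 25 times evolutionary norms” multiple quoted in this section follow from straightforward arithmetic. A quick sketch, using the figures reported above (the ~1:1 baseline is the hunter-gatherer estimate mentioned earlier):

```python
# n-6:n-3 ratios reported in the text
ratio_1939 = 8.4     # 1935–1939 estimate
ratio_1985 = 10.3    # 1985 estimate
evolutionary = 1.0   # ~1:1 hunter-gatherer baseline
modern_low, modern_high = 10.0, 25.0  # range of modern estimates

# Percent increase from the late 1930s to 1985
increase = (ratio_1985 - ratio_1939) / ratio_1939 * 100
print(round(increase))  # → 23

# How many times higher modern ratios are than the evolutionary baseline
print(modern_low / evolutionary, modern_high / evolutionary)  # → 10.0 25.0
```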

In fact, Americans now get almost 20% of their calories from a single food source – soybean oil – with almost 9% of all calories from the omega-6 fat linoleic acid (LA) alone!

This reveals that our average intake of n-6 fatty acids is between 10 and 25 times higher than evolutionary norms. The consequences of this dramatic shift cannot be overstated.

So what are the consequences to human health of an n-6:n-3 ratio that is up to 25 times higher than it should be?

The short answer is that elevated n-6 intakes are associated with an increase in all inflammatory diseases – which is to say virtually all diseases. The list includes (but isn’t limited to):

  • cardiovascular disease
  • type 2 diabetes
  • obesity
  • metabolic syndrome
  • irritable bowel syndrome & inflammatory bowel disease
  • macular degeneration
  • rheumatoid arthritis
  • asthma
  • cancer
  • psychiatric disorders
  • autoimmune diseases

The relationship between intake of n-6 fats and cardiovascular mortality is particularly striking. The following chart, from an article entitled “Eicosanoids and Ischemic Heart Disease” by Stephan Guyenet, clearly illustrates the correlation between a rising intake of n-6 fats and increased mortality from heart disease:

graph of n-6 HUFA intake and heart disease mortality by country

As you can see, the USA is right up there at the top with the highest intake of n-6 fat and the greatest risk of death from heart disease.

On the other hand, several clinical studies have shown that decreasing the n-6:n-3 ratio protects against chronic, degenerative diseases. One study showed that replacing corn oil with olive oil and canola oil to reach an n-6:n-3 ratio of 4:1 led to a 70% decrease in total mortality. That is no small difference.

Joseph Hibbeln, a researcher at the National Institutes of Health (NIH) who has published several papers on n-3 and n-6 intakes, didn’t mince words when he commented on the rising intake of n-6 in a recent paper:

The increases in world LA consumption over the past century may be considered a very large uncontrolled experiment that may have contributed to increased societal burdens of aggression, depression and cardiovascular mortality.

And those are just the conditions we have the strongest evidence for. It’s likely that the increase in n-6 consumption has played an equally significant role in the rise of nearly every inflammatory disease. Since it is now known that inflammation is involved in nearly all diseases, including obesity and metabolic syndrome, it’s hard to overstate the negative effects of too much omega-6 fat.

Sugar: the sweetest way to wreck your health


About 20 years ago, Nancy Appleton, PhD, began researching all of the ways in which sugar destroys our health. Over the years the list has continuously expanded, and now includes 141 points. Here’s just a small sampling (the entire list can be found on her blog).

  • Sugar feeds cancer cells and has been connected with the development of cancer of the breast, ovaries, prostate, rectum, pancreas, lung, gallbladder and stomach.
  • Sugar can increase fasting levels of glucose and can cause reactive hypoglycemia.
  • Sugar can cause many problems with the gastrointestinal tract, including an acidic digestive tract, indigestion, malabsorption in patients with functional bowel disease, and an increased risk of Crohn’s disease and ulcerative colitis.
  • Sugar can interfere with your absorption of protein.
  • Sugar can cause food allergies.
  • Sugar contributes to obesity.

But not all sugar is created equal. White table sugar (sucrose) is composed of two simpler sugars: glucose and fructose. Glucose is an important nutrient in our bodies and is healthy, as long as it’s consumed in moderation. Fructose is a different story.

Fructose is found primarily in fruits and vegetables, and sweeteners like sugar and high-fructose corn syrup (HFCS). A recent USDA report found that the average American eats 152 pounds of sugar each year, including almost 64 pounds of HFCS.

Unlike glucose, which is rapidly absorbed into the bloodstream and taken up by the cells, fructose is shunted directly to the liver where it is converted to fat. Excess fructose consumption causes a condition called non-alcoholic fatty liver disease (NAFLD), which is directly linked to both diabetes and obesity.

A 2009 study showed that shifting 25% of dietary calories from glucose to fructose caused a 4-fold increase in abdominal fat. Abdominal fat is an independent predictor of insulin resistance, impaired glucose tolerance, high blood pressure, high cholesterol, high triglycerides and several other metabolic diseases.

In a widely popular talk on YouTube, Dr. Robert H. Lustig explains that fructose has all of the qualities of a poison. It causes damage, provides no benefit and is sent directly to the liver to be detoxified so that it doesn’t harm the body.

For more on the toxic effects of fructose, see The Perfect Health Diet and Robert Lustig’s YouTube talk: Sugar, The Bitter Truth.

Soy: another toxin promoted as a health food


Like cereal grains, soy is another toxin often promoted as a health food. It’s now ubiquitous in the modern diet, present in just about every packaged and processed food in the form of soy protein isolate, soy flour, soy lecithin and soybean oil.

For this reason, most people are unaware of how much soy they consume. You don’t have to be a tofu-loving hippie to eat a lot of soy. In fact, the average American – who is most definitely not a tofu-loving hippie – gets up to 9% of total calories from soybean oil alone.

Whenever I mention the dangers of soy in my public talks, someone always protests that soy can’t be unhealthy because it’s been consumed safely in Asia for thousands of years. There are several reasons why this isn’t a valid argument.

First, the soy products consumed traditionally in Asia were typically fermented and unprocessed – including tempeh, miso, natto and tamari. This is important because the fermentation process partially neutralizes the toxins in soybeans.

Second, Asians consumed soy foods as a condiment, not as a replacement for animal foods. The average consumption of soy foods in China is 10 grams (about 2 teaspoons) per day, and 30 to 60 grams per day in Japan. These are not large amounts of soy.

Contrast this with the U.S. and other western countries, where almost all of the soy consumed is highly processed and unfermented, and eaten in much larger amounts than in Asia.

How does soy impact our health? The following is just a partial list:

  • Soy contains trypsin inhibitors that inhibit protein digestion and affect pancreatic function;
  • Soy contains phytic acid, which reduces absorption of minerals like calcium, magnesium, copper, iron and zinc;
  • Soy increases our requirement for vitamin D, which 50% of Americans are already deficient in;
  • Soy phytoestrogens disrupt endocrine function and have the potential to cause infertility and to promote breast cancer in adult women;
  • Vitamin B12 analogs in soy are not absorbed and actually increase the body’s requirement for B12;
  • Processing of soy protein results in the formation of toxic lysinoalanine and highly carcinogenic nitrosamines;
  • Free glutamic acid or MSG, a potent neurotoxin, is formed during soy food processing and additional amounts are added to many soy foods to mask soy’s unpleasant taste; and,
  • Soy can stimulate the growth of estrogen-dependent tumors and cause thyroid problems, especially in women.

Perhaps most alarmingly, a 2008 study at the Harvard School of Public Health found that men who consumed the equivalent of one cup of soy milk per day had a 50% lower sperm count than men who didn’t eat soy.

In 1992, the Swiss Health Service estimated that consuming the equivalent of two cups of soy milk per day provides the estrogenic equivalent of one birth control pill. That means women eating cereal with soy milk and drinking a soy latte each day are effectively getting the same estrogen dose as if they were taking a birth control pill.

This effect is even more dramatic in infants fed soy formula. Babies fed soy-based formula have 13,000 to 22,000 times more estrogen compounds in their blood than babies fed milk-based formula. Infants exclusively fed soy formula receive the estrogenic equivalent (based on body weight) of at least five birth control pills per day.




Related posts:
  1. 9 Steps to Perfect Health: Introduction
  2. How to Save Your Family’s Life: 30 Ways to Prevent Modern Disease
  3. Ten steps to preventing heart disease naturally

Thursday, January 27, 2011

Broccoli Fights Cancer By Clearing Bad Tumor Suppressors


Cruciferous vegetables contain compounds that preferentially destroy ineffective mutant p53 tumor suppressor proteins, but leave the good ones alone. Steve Mirsky reports.

Listen to this Podcast

Generations of American children have been told, “Eat your broccoli!” And for decades, researchers have known that broccoli and related vegetables like cauliflower and watercress appeared to lower the risk of some cancers. And that compounds in the vegetables could kill cancer cells. But how the cruciferous veggies worked their medical magic was a mystery. Until now. Because researchers have figured out just what broccoli does that helps keep cancer in check. The work appears in the Journal of Medicinal Chemistry. [Xiantao Wang et al, Selective Depletion of Mutant p53 by Cancer Chemoprevention Isothiocyanates and Their Structure-Activity Relationships]
Proteins coded by the gene p53 help keep cancer from starting to grow. But when the p53 gene is mutated, the protection is gone. Mutated p53 is implicated in about half of all human cancers.
Broccoli and its relatives are rich in compounds called isothiocyanates, or ITCs. And these ITCs apparently destroy the products of the mutant p53 gene, but leave the healthy p53 proteins alone and free to suppress tumor development.
The researchers write that “depletion of mutant p53 may reduce drug resistance and lead to new strategies for treating cancer in the clinic.” In the meantime, eat your broccoli!
—Steve Mirsky

100% Renewable Energy in 40 Years Not Limited to Our Wildest Dreams: Study

windmill

New research suggests the whole world could switch to renewable energy sources using current tech in just 20 to 40 years. It would cost no more than current energy, and would have big economic and eco payoffs. The only barriers are down to social, business, and political inertia.
We all know about renewable energy--it's been around for years, and is key to solving the global warming (and end-of-oil) crisis. Nowadays it's good to be green, and research into the millions of different aspects of the tech is skyrocketing. But a Stanford research team has just compiled an innovative, lateral-thinking study that says even using current available technology the entire world could switch 100% of its energy needs to renewable sources in just a handful of decades. How is this possible?
Current tech is good enough
The research from Mark Z. Jacobson and team involves making all new energy production plants use renewable energy by 2030, and then converting older existing plants by 2050. In the new world order, almost everything would run off electricity. Ninety percent of the production would come from windmills and solar energy plants (already very well established technologies) and the remaining 10% would come from hydroelectric power, geothermal, and wave/tidal power. Mobile things--cars, trains, ships and such--would run on hydrogen-powered fuel cells, and aircraft would burn hydrogen fuel. The hydrogen itself would come from green-electric generation processes.
All of this plan requires no more than a dedicated push to exploit existing technology and to network it all together in an intelligent way--because demand varies from place to place, throughout the day, and as seasons change, and the sun, wind, and waves don't necessarily give power all the time, everywhere. 'If you combine them as one commodity and use hydroelectric to fill in gaps [as it's a reliable battery-like resource], it is a lot easier to match demand,' Jacobson notes. A supergrid, with long-distance links, international cooperation and really smart energy management is needed. (Good job we're already building one).
Will it cost more?
Nope. Making the changes will take time, effort and money--because you have to build a lot of new equipment, and link up power grids across the world. Spinning up green-power industries to build devices at a global scale will also cost money, as will winding down and deconstructing the infrastructure in place to support coal, oil, gas and even nuclear electricity generation.
But 'when you actually account for all the costs to society--including medical costs--of the current fuel structure, the costs of our plan are relatively similar to what we have today,' according to Jacobson. That medical reference is to the health benefits of reducing pollution on a global scale, as well as side-effects like deaths from warming-induced natural disasters.
Will it cost more in the long run?
Nope, it may cost less. Due to the incontrovertible laws of thermodynamics and other bits of physics, 'heat engines' like the non-renewable power stations and car engines we run today are way less energy efficient than an all-electric process. The Stanford plan suggests global energy needs would drop by 30% due to this efficiency boost, meaning we'd actually need less power--and if the business models evolve to support this norm, individuals may pay less for their energy.
Won't we pepper the Earth with windmills and solar farms and hydroelectric dams?
Nope, Stanford's plan would require 0.4% of the world's land (mainly for solar power) and the spacing between windmills accounts for another 0.6%--although you can use this area for farming and catering for other needs. One percent of the windmills are already in place, and Jacobson notes 'the actual footprint required by wind turbines to power half the world's energy is less than the area of Manhattan.'
Considering how much space is taken up by power stations and coal mines--facilities that would be closed in the plan--this isn't too much of a sacrifice. And a significant share of the wind farms could be offshore, to satisfy NIMBYism.
Why don't we do it then?
Inertia. We're all used to the current way of things, and rethinking everything from how your car works to looking at a landscape where power windmills go from rare to the norm involves a big effort--a 'large scale transformation' on a global scale. Governments are notoriously slow-footed when it comes to this sort of change, and the Stanford plan involves so many innovations and international cooperation that the complexity is almost beyond imagination.
Existing businesses who rely on coal, oil, gas (and their byproducts, like the airline industry's need for aviation fuel) will be reluctant too.
But we've done similar things, as Jacobson notes--it's an effort 'comparable to the Apollo moon project or constructing the interstate highway system,' just compressed into a short timescale and requiring action from a majority of nations.
To read more news on this, and similar stuff, keep up with my updates by following me, Kit Eaton, on Twitter.
[Image: Shooter's Bottom wind turbine at dusk (Sharon Loxton) / CC BY-SA 2.0]



Genetically Modified Mosquitoes Hit the Forests of Malaysia

close-up of mosquito

Modern technology is dengue fever's new enemy.

Malaysia released 6,000 genetically modified mosquitoes into a local forest this month in an effort to curb dengue fever. The disease, transmitted by the female Aedes aegypti mosquito, is often found in tropical and subtropical environments and causes headaches, muscular pain and nausea, and sometimes death. There is no cure or vaccine, and it's not uncommon for travelers to pick up the disease, especially in the popular tourist countries of Southeast Asia.


The goal of the experiment, a first in all of Asia, is to reduce the number of offspring and the life expectancy of the mosquitoes, so that the mosquito population is ultimately reduced. The mosquitoes released were male Aedes aegypti, genetically engineered so that the females they mate with either produce no offspring or produce offspring that don't live as long as others.


The experiment is far from accepted in local circles--several environmental groups protested the act and ultimately appealed to the government to not allow it to happen. But the government went ahead anyway and just announced this week that the experiment is now complete.


'I am surprised that they did this without prior announcement given the high level of concerns raised not just from the NGOs but also scientists and the local residents,' said Third World Network researcher, Lim Li Ching. 'We don't agree with this trial that has been conducted in such an untransparent way. There are many questions and not enough research has been done on the full consequences of this experiment.'


Concerns have been raised that the experiment could lead to an uncontrollable mosquito population, but the modified mosquitoes are in fact designed to live for only a few days.


Follow me, Jenara Nerenberg, on Twitter.


Wednesday, January 26, 2011

Obama's State of the Union: The facts about clean energy and broadband access

It's debatable that the U.S. is feeling the same sense of unity and resolve toward technology that it did more than 50 years ago when the Soviet Union launched its Sputnik satellite and won the race to space. Regardless, as President Obama pointed out during last night's State of the Union Address, a Sputnik-like response is in order if the U.S. is to develop the technology needed to address a number of significant challenges the nation faces in the coming years—in particular clean energy and ubiquitous broadband communications.

Understandably, given the breadth of topics he needed to cover, the President mentioned but did not provide much detail about several key technology initiatives underway. Scientific American fills in some of the blanks related to key statements Obama made last night.

"At the California Institute of Technology, they're developing a way to turn sunlight and water into fuel for our cars."
The U.S. Department of Energy (DOE) in July announced an award of up to $122 million over five years to establish an Energy Innovation Hub directed by Caltech chemistry professor Nathan Lewis. The organization will include a multidisciplinary team of scientists aimed at developing new methods to generate fuels directly from sunlight. Caltech is leading the Joint Center for Artificial Photosynthesis (JCAP) in partnership with the DOE's Lawrence Berkeley National Laboratory to develop an integrated solar energy-to-chemical fuel conversion system and move this system from the bench-top discovery phase to a scale where it can be commercialized.

Also at Caltech, researchers are developing a new reactor to capture solar energy and use it as a catalyst to convert carbon dioxide and water into fuel. Led by Sossina Haile, a professor of materials science and chemical engineering, Caltech scientists have built a 61-centimeter tall prototype reactor with a quartz window that acts as a magnifying glass to focus sunlight coming into the reactor, whose inner cavity is lined with ceria, a metal oxide commonly found in self-cleaning ovens. When the cavity absorbs the concentrated sunlight and is heated the ceria acts as a catalyst, releasing oxygen from its crystalline framework. When the cavity is cooled a chemical reaction produces carbon monoxide and/or hydrogen gas. The hydrogen gas can be used to fuel hydrogen fuel cells, whereas the carbon monoxide, combined with the hydrogen gas, can be used to create synthetic gas.

"At Oak Ridge National Laboratory, they're using supercomputers to get a lot more power out of our nuclear facilities."
Oak Ridge researchers are using the DOE's largest supercomputer—the XT5 Jaguar—to build a 3-D virtual reactor that they can use to figure out how to generate energy more efficiently and with less waste.

"Just recently, China became the home to the world's largest private solar research facility, and the world's fastest computer."
In March, Santa Clara, Calif.–based Applied Materials opened its Solar Technology Center in Xi'an, China. At 400,000 square feet, this facility is indeed the world's largest non-governmental solar energy research facility, with laboratory and office buildings for research and development, engineering, product demonstration, testing and training for crystalline silicon and thin-film solar module manufacturing equipment and processes.

China's Tianhe-1A supercomputer at the National Supercomputer Center in Tianjin has achieved a performance level of 2.57 petaflops (a petaflop is one quadrillion floating-point calculations per second). This ranks the Tianhe-1A ahead of the former number one system, Oak Ridge's Jaguar, which has achieved a top performance level of 1.75 petaflops.

"Our infrastructure used to be the best, but our lead has slipped. South Korean homes now have greater Internet access than we do."
It's true that two studies last year—the first by the U.S. Government Accountability Office (GAO) and the second by the University of Oxford's Saïd Business School and Cisco Systems—ranked the U.S. 15th among developed nations in terms of universal broadband access. However, the U.S.'s performance is the result of a number of factors, not the least of which is the country's physical size. The U.S. has more broadband subscriber lines than any other country, but it also has a lot more territory to cover than, say, Japan, which is number two in terms of broadband subscriber lines, according to the GAO report. Japan is about the size of California. Likewise, top-ranked South Korea's infrastructure needs to cover a landmass only slightly larger than Indiana.

"Within the next five years, we'll make it possible for businesses to deploy the next generation of high-speed wireless coverage to 98 percent of all Americans."
About 75 percent of households have a broadband connection today, and those connections have average download speeds of about 9.6 megabits per second and upload speeds of about two megabits per second, according to the Saïd–Cisco study. The GAO study estimated that more than 90 percent of U.S. households have broadband access.

Images courtesy of Switzerland's ETH Zurich science and technology university, which has collaborated with Caltech on the solar reactor

Sunday, January 23, 2011

Australia’s Top 10 Inventions: Refrigeration



To celebrate Australia Day this week, we’re looking at some of the best inventions to ever come out of our sunburnt country. Today, we pay homage to James Harrison, whose technological advances in mechanical refrigeration meant our ancestors could get cold beer.
While people had been using iceboxes to keep stuff cold for thousands of years before James Harrison was even conceived, the Scottish-born Australian was the first to invent and patent a mechanical system to create ice for refrigeration. In 1854, Harrison built a commercial ice-making machine in Geelong, which he then developed into a vapour-compression refrigeration system, patented in 1855.
What made this refrigeration system unique was the use of a compressor to force vapourised ether into a condenser for cooling, where it turned back into liquid. This liquid then made its way through the refrigeration coils, where it evaporated back into gas, absorbing heat and cooling the insides of the system. According to Wikipedia, the machine used a 5-metre flywheel and could produce 3000kg of ice a day.
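That evaporate-then-recompress loop is still how fridges are scored today: the coefficient of performance (COP), or heat moved per unit of work put in. A quick back-of-the-envelope sketch of the theoretical ceiling (the example temperatures are illustrative, not Harrison's actual machine specs):

```python
def carnot_cop(t_cold_c: float, t_hot_c: float) -> float:
    """Theoretical upper bound (Carnot) on refrigeration COP
    between a cold chamber and the warm outside air, in Celsius."""
    t_cold = t_cold_c + 273.15  # convert to kelvin
    t_hot = t_hot_c + 273.15
    return t_cold / (t_hot - t_cold)

# e.g. holding an ice chamber at -5 C while dumping heat into 30 C Geelong air
print(round(carnot_cop(-5.0, 30.0), 2))
```

Real machines, Harrison's included, fall well short of this ideal, but the wider the temperature gap, the harder any refrigerator has to work.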
Harrison continued his innovations in refrigeration, alternating between that work and his career as a journalist and editor at The Age. He was one of the pioneers of shipping meat between Britain and Australia, although his first experiment was an unmitigated failure thanks to a lack of ice to keep the meat cold enough.
Harrison’s method of refrigeration is still used by fridges today, although the process has been refined significantly and ether is no longer the refrigerant of choice. But what makes his invention especially brilliant, and what signifies Aussie ingenuity, is that the first company to use his system was a Bendigo-based brewery, Glasgow & Co. That’s right – an Aussie invented the fridge, and its first real use was making beer. You have to love this country…

Saturday, January 22, 2011

Quantum Entanglement Could Stretch Across Time



In the weird world of quantum physics, two linked particles can share a single fate, even when they’re miles apart.

Now, two physicists have mathematically described how this spooky effect, called entanglement, could also bind particles across time.

If their proposal can be tested, it could help process information in quantum computers and test physicists’ basic understanding of the universe.

“You can send your quantum state into the future without traversing the middle time,” said quantum physicist S. Jay Olson of Australia’s University of Queensland, lead author of the new study.

In ordinary entanglement, two particles (usually electrons or photons) are so intimately bound that they share one quantum state — spin, momentum and a host of other variables — between them. One particle always “knows” what the other is doing. Make a measurement on one member of an entangled pair, and the other changes immediately.
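As a toy illustration of that shared state, the simplest entangled pair — a "Bell state" — can be written out as a four-entry vector and its measurement statistics computed directly (a minimal sketch for intuition, not the formalism from Olson and Ralph's paper):

```python
import numpy as np

# Bell state |Φ+> = (|00> + |11>) / sqrt(2), written in the basis
# |00>, |01>, |10>, |11>: two particles sharing one quantum state.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)

# Born rule: the probability of each outcome is the squared amplitude.
probs = np.abs(bell) ** 2

# Each particle alone looks 50/50 random, but the pair is perfectly
# correlated: the outcomes "01" and "10" simply never occur.
for outcome, p in zip(["00", "01", "10", "11"], probs):
    print(outcome, round(float(p), 2))
```

Measuring one particle as 0 tells you instantly that the other is 0 too — the "knowing" the article describes is baked into the joint state.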

Physicists have figured out how to use entanglement to encrypt messages in uncrackable codes and build ultrafast computers. Entanglement can also help transmit encyclopedias’ worth of information from one place to another using only a few atoms, a protocol called quantum teleportation.

In a new paper posted on the physics preprint website arXiv.org, Olson and Queensland colleague Timothy Ralph perform the math to show how these same tricks can send quantum messages not only from place to place, but from the past to the future.



The equations involved defy simple mathematical explanation, but are intuitive: If it’s impossible to describe one particle without including the other, this logically extends to time as well as space.

“If you use our timelike entanglement, you find that [a quantum message] moves in time, while skipping over the intermediate points,” Olson said. “There really is no difference mathematically. Whatever you can do with ordinary entanglement, you should be able to do with timelike entanglement.”

Olson explained the idea with a Star Trek analogy. In one episode, “beam me up” teleportation expert Scotty is stranded on a distant planet with limited air supply. To survive, Scotty freezes himself in the transporter, awaiting rescue. When the Enterprise arrives decades later, Scotty steps out of the machine without having aged a day.

“It’s not time travel as you would ordinarily think of it, where it’s like, poof! You’re in the future,” Olson said. “But you get to skip the intervening time.”

According to quantum physicist Ivette Fuentes of the University of Nottingham, who saw Olson and Ralph present the work at a conference, it’s “one of the most interesting results” published in the last year.

“It stimulated our imaginations,” said Fuentes. “We know entanglement is a resource and we can do very interesting things with it, like quantum teleportation and quantum cryptography. We might be able to exploit this new entanglement to do interesting things.”

One such interesting thing could involve storing information in black holes, said physicist Jorma Louko, also of the University of Nottingham.

“They show that you can use the vacuum, that no-particle state, to store a lot of information in just a couple of atoms, and recover that info from other atoms later on,” Louko said. “The details of that have not been worked out, but I can foresee that the ideas that these authors use could be adapted to the black hole context.”

Entanglement in time could also be used to investigate as-yet-untested fundamentals of particle physics. In the 1970s, physicist Bill Unruh predicted that, if a spaceship accelerates through the empty space of a vacuum, particles should appear to pop out of the void. Particles carry energy, so they would be, in effect, a warm bath. Wave a thermometer outside, and it would record a positive temperature.
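The temperature Unruh predicted is given by a textbook formula (not quoted in the article, but standard in quantum field theory), which makes clear why it is so hard to detect:

```latex
T = \frac{\hbar\, a}{2\pi\, c\, k_B}
```

Here $a$ is the acceleration, $\hbar$ the reduced Planck constant, $c$ the speed of light, and $k_B$ Boltzmann's constant. Plugging in the constants, an acceleration of roughly $2.5 \times 10^{20}\,\mathrm{m/s^2}$ is needed to produce a bath of even one kelvin.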

Called the Unruh effect, this is a solid prediction of quantum field theory. It has never been observed, however, because a spaceship would have to accelerate at as-yet-unachievable rates to generate an effect large enough to measure. But because timelike entanglement also involves particles emerging from the vacuum, it could be used to conduct more convenient searches, relying on time rather than space.

Finding the Unruh effect would provide support for quantum field theory. But it might be even more exciting not to see the effect, Olson said.

“It would be more of a shocking result,” Olson said. “If you didn’t see it, something would be very wrong with our understanding.”

Image: flickr/Darren Tunnicliff
