Why Can't We Fall Asleep?

...But going to sleep isn’t always a simple process, and it seems to have grown more problematic in recent years, as I learned through a series of conversations this May, when some of the world’s leading sleep experts met with me to share their ongoing research into the nature of sleeping. (The meetings were facilitated by a Harvard Medical School Media Fellowship.) According to Charles Czeisler, the chief of the Division of Sleep and Circadian Disorders at Brigham and Women’s Hospital, over the past five decades our average sleep duration on work nights has decreased by an hour and a half, down from eight and a half to just under seven. Thirty-one per cent of us sleep fewer than six hours a night, and sixty-nine per cent report insufficient sleep. When Lisa Matricciani, a sleep researcher at the University of South Australia, looked at available sleep data for children from 1905 to 2008, she found that they’d lost nearly a minute of sleep a year. It’s not just a trend for the adult world. We are, as a population, sleeping less now than we ever have.
The problem, on the whole, isn’t that we’re waking up earlier. Much of the change has to do with when we choose to go to bed—and with how we decide to do so...

The Myth of Big, Bad Gluten

As many as one in three Americans tries to avoid gluten, a protein found in wheat, barley and rye. Gluten-free menus, gluten-free labels and gluten-free guests at summer dinners have proliferated.
Some of the anti-glutenists argue that we haven’t eaten wheat for long enough to adapt to it as a species. Agriculture began just 12,000 years ago, not enough time for our bodies, which evolved over millions of years, primarily in Africa, to adjust. According to this theory, we’re intrinsically hunter-gatherers, not bread-eaters. If exposed to gluten, some of us will develop celiac disease or gluten intolerance, or we’ll simply feel lousy.
Most of these assertions, however, are contradicted by significant evidence, and distract us from our actual problem: an immune system that has become overly sensitive.
...
Here’s the lesson: Adaptation to a new foodstuff can occur quickly, in a few millenniums in this case. So if it happened with milk, why not with wheat?
“If eating wheat was so bad for us, it’s hard to imagine that populations that ate it would have tolerated it for 10,000 years,” Sarah A. Tishkoff, a geneticist at the University of Pennsylvania who studies lactase persistence, told me.
For Dr. Bana Jabri, director of research at the University of Chicago Celiac Disease Center, it’s the genetics of celiac disease that contradict the argument that wheat is intrinsically toxic.
...
So the real mystery of celiac disease is what breaks that tolerance, and why, whatever that agent is, it has become more common in recent decades.
An important clue comes from the fact that other disorders of immune dysfunction have also increased. We’re more sensitive to pollens (hay fever), our own microbes (inflammatory bowel disease) and our own tissues (multiple sclerosis).
Perhaps the sugary, greasy Western diet — increasingly recognized as pro-inflammatory — is partly responsible. Maybe shifts in our intestinal microbial communities, driven by antibiotics and hygiene, have contributed. Whatever the eventual answer, just-so stories about what we evolved eating, and what that means, blind us to this bigger, and really much more worrisome, problem: The modern immune system appears to have gone on the fritz.
Maybe we should stop asking what’s wrong with wheat, and begin asking what’s wrong with us.

Why do people believe myths about the Confederacy? Because our textbooks and monuments are wrong.

Super fascinating look at the way history was rewritten over a few generations, turning reality on its head.

History is the polemics of the victor, William F. Buckley once said. Not so in the United States, at least not regarding the Civil War. As soon as the Confederates laid down their arms, some picked up their pens and began to distort what they had done and why. The resulting mythology took hold of the nation a generation later and persists — which is why a presidential candidate can suggest, as Michele Bachmann did in 2011, that slavery was somehow pro-family and why the public, per the Pew Research Center, believes that the war was fought mainly over states’ rights.
The Confederates won with the pen (and the noose) what they could not win on the battlefield: the cause of white supremacy and the dominant understanding of what the war was all about. We are still digging ourselves out from under the misinformation they spread, which has manifested in our public monuments and our history books.
Take Kentucky, where the legislature voted not to secede. Early in the war, Confederate Gen. Albert Sidney Johnston ventured through the western part of the state and found “no enthusiasm, as we imagined and hoped, but hostility.” Eventually, 90,000 Kentuckians would fight for the United States, while 35,000 fought for the Confederate States. Nevertheless, according to historian Thomas Clark, the state now has 72 Confederate monuments and only two Union ones.

How to Disrupt the Military-Industrial-Congressional Complex

President Obama is said to be considering an executive order requiring federal contractors to disclose their political spending. He should sign it immediately.
But he should go further and ban all political spending by federal contractors that receive more than half their revenues from government.
Ever since the Supreme Court’s shameful Citizens United decision, big corporations have been funneling large amounts of cash into American politics, often secretly. 
Bad enough. But when big government contractors do the funneling, American taxpayers foot the bill twice over: We pay their lobbying and campaign expenses. And when those efforts nab another contract, we pay for stuff we often don’t need.
This is especially true for defense contractors – the biggest federal contractors of all. 
A study by St. Louis University political scientist Christopher Witko reveals a direct relationship between what a corporation spends on campaign contributions and the amount it receives back in government contracts. 

The Iran I Saw

This is a tale of two Irans. This is, specifically, the tale of the other Iran.
The tale we hear most often focuses on natural resources like oil as Iran’s greatest asset or nuclear power as its greatest threat—a narrative frozen in time, stretching back decades with remembered pain on both sides. For many Americans, the reference point for Iran is still the hostage crisis at the U.S. embassy in Tehran over 35 years ago; for others, it is Iranian support for destabilizing regional actors, which works against our interests and costs lives.
At the same time, of course, Iranians have their own version of this tale: Many remember well U.S. support for a coup against their elected leadership, our support for a dictatorial regime and our later encouragement of a war with Iraq that cost nearly a half-million Iranian lives.
Politics, power, mistrust: This is one version of how the media frames discussion of Iran. It’s very real, and there is ample evidence to support that caution.
But there’s another tale, one I saw repeatedly in my trip there last month. It was my second visit within the year, traveling with a group of senior global business executives to explore this remarkable and controversial nation.
This tale focuses on Iran’s next generation, one that came of age well after the Islamic Revolution, and on human capital, the greatest asset a country can have. It’s about technology as a driver for breaking down barriers despite internal controls and external sanctions. People under age 35 represent nearly two-thirds of Iran’s population at this point: Many were engaged in the Green Movement protests that followed the 2009 Iranian presidential election. Most are utterly wired and see the world outside of Iran every day—often in the form of global news, TV shows, movies, music, blogs, and startups—on their mobile phones.
This is a tale we rarely hear about.

A Scientific Ethical Divide Between China and West

China is spending hundreds of billions of dollars annually in an effort to become a leader in biomedical research, building scores of laboratories and training thousands of scientists.
But the rush to the front ranks of science may come at a price: Some experts worry that medical researchers in China are stepping over ethical boundaries long accepted in the West.
Scientists around the world were shocked in April when a team led by Huang Junjiu, 34, at Sun Yat-sen University in Guangzhou, published the results of an experiment in editing the genes of human embryos.
The technology, called Crispr-Cas9, may one day be used to eradicate inheritable illnesses. But in theory, it also could be used to change such traits as eye color or intelligence, and to ensure that the changes are passed on to future generations.

Letter to My Son

Very long, but incredibly moving piece on what it's like, for many, to be black in America.

“Here is what I would like for you to know: In America, it is traditional to destroy the black body—it is heritage.”

Colorado’s Effort Against Teenage Pregnancies Is a Startling Success

Over the past six years, Colorado has conducted one of the largest experiments with long-acting birth control. If teenagers and poor women were offered free intrauterine devices and implants that prevent pregnancy for years, state officials asked, would those women choose them?
They did in a big way, and the results were startling. The birthrate among teenagers across the state plunged by 40 percent from 2009 to 2013, while their rate of abortions fell by 42 percent, according to the Colorado Department of Public Health and Environment. There was a similar decline in births for another group particularly vulnerable to unplanned pregnancies: unmarried women under 25 who have not finished high school.