Each day, an average of nine people are killed in the United States and more than 1,000 injured by drivers doing something other than driving. The totals—3,300 U.S. deaths and 387,000 injuries in 2011—show that laws in many states banning texting and hand-held cellphone use while driving aren’t getting the job done.

Jay Winsten, the Frank Stanton Director of the School of Public Health’s Center for Health Communication and associate dean for health communication, thinks it’s time to turn to a higher power: social norms. Winsten and the center hope to reduce distracted driving by enlisting the Hollywood creative community in a campaign against it. Texting and cellphone use will be the main targets, but the campaign will also seek to raise awareness of how anything that takes a driver’s attention from the road—whether programming a GPS device or resetting a child’s entertainment center—is potentially hazardous.

One recent study monitored truckers over more than 3 million miles, using sensors and cameras to see when they took their eyes off the road. Reading or writing a text message distracted the driver for an average of 4.6 seconds, increasing the odds of a crash 23-fold. “That’s the equivalent of driving blindfolded the length of a football field at 55 miles an hour,” Winsten said. “Imagine a texting driver heading toward you in the opposing lane of traffic. The threat comes from a combination of visual and cognitive distraction.” Dialing a handheld cellphone took an average of 3.8 seconds and increased the crash risk 3.8-fold.
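The football-field comparison is easy to verify with plain arithmetic using only the figures quoted above (4.6 seconds of distraction at 55 mph):

```python
# At 55 mph, how far does a car travel during the 4.6 seconds a texting
# driver's eyes are off the road, on average?

MPH_TO_FTPS = 5280 / 3600  # feet per second, per mile per hour

distance_ft = 55 * MPH_TO_FTPS * 4.6
print(f"{distance_ft:.0f} feet")  # ~371 ft, just over a football field (360 ft including end zones)
```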
A 2012 survey by the AAA Foundation found that “despite the near-universal disapproval of texting and e-mailing behind the wheel, more than one in four licensed drivers, 26.6 percent, reported typing or sending a text or e-mail at least once in the past 30 days, and more than one in three, 34.6 percent, said they read a text or e-mail while driving during this time.”

With legislation outlawing distracted driving tough to enforce and awareness campaigns showing limited results, Winsten and colleagues are planning an initiative to change social norms by enlisting Hollywood as a partner. Such a strategy has worked before. In the late 1980s and early 1990s, Winsten spearheaded a campaign that made “designated driver” a household term, helping to shape a new social norm that the driver doesn’t drink. The campaign, which is credited with contributing to a 25 percent drop in drunk-driving deaths and with creating enduring social change, was accomplished with the help of many partners, including law enforcement, Madison Avenue advertising firms, advocacy groups such as Mothers Against Drunk Driving and Students Against Drunk Driving, and a big push from Hollywood and the television networks. Winsten won the support of television producers and writers to incorporate a designated-driver message into 160 episodes of some of the most popular television shows of the day, including “Cheers,” “L.A. Law,” and “The Cosby Show.” Winsten also recruited President George H.W. Bush and later President Bill Clinton as the campaign’s lead spokespersons. Surveys found that a majority of the American public supported the practice of choosing a designated driver.

Winsten hopes to use a similar strategy against distracted driving. A lot is already being done to fight it: many states have passed laws regulating texting and cellphone use while driving, and in the business world, companies such as AT&T have been very active with advertising efforts, Winsten said.
In an announcement last week, Verizon, Sprint, and T-Mobile agreed to become partners in AT&T’s “It Can Wait” anti-texting campaign. And the Department of Transportation, under outgoing Secretary Ray LaHood, has made reducing distracted-driving deaths a top priority.

“We’ve gone through the initial phase of response to the problem, with greatly increased awareness, and it’s a good time to review gains and consolidate knowledge,” Winsten said. To do so, Winsten’s team is planning a conference this fall that will bring interested groups to HSPH for working sessions to examine existing efforts on distracted driving, extract lessons from past social-change campaigns, and plan strategy. “It’s all about digging deeper to understand cultural change,” Winsten said. “What are the stages in the evolution of particular social norms, and what are the levers for influencing change?”

While borrowing lessons from past campaigns is important, the new effort, operating in a dramatically different media environment, will have to forge a new path. In the late 1980s and early 1990s, one could be assured of reaching a significant portion of the public with messages placed in a handful of key shows. Today, cable and satellite channels give television viewers hundreds of options, and television itself competes for viewers’ screen time with video gaming, social media, and news and entertainment websites. “For sure the mission is harder now,” Winsten said. “There wasn’t even a fourth network then. Fox was only on two nights a week. [But] it’s not only the fragmented media marketplace, it’s the public’s very short attention span.” But Hollywood will still provide part of the answer, he said. “People connect to fictional characters, and become engaged in the story lines,” Winsten said.
“A substantial body of research on social learning has demonstrated that the modeling of behavior through entertainment programming can strongly influence social norms and behavior.”

The traditional top-down model of a public health campaign is outdated, he said, because today knowledge is disseminated laterally among the public as much as vertically from traditional knowledge sources. “Today, a large part of the conversation — for example, on the safety of vaccines — is going on without us,” Winsten said. “The challenge for public health is to re-insert ourselves into the conversation.”
They didn’t always speak the same language, but climate scientists and disaster relief workers wrapped up a meeting Tuesday in agreement about the importance of leveraging climate insights into improved disaster preparedness as the planet warms. The two-day conference was a rare convergence of two communities that, if the direst predictions come true, may get to know each other much better in the coming decades.

“We’re going to rely on you to deal with the mess that’s coming,” Daniel Schrag, director of the Harvard University Center for the Environment (HUCE), Sturgis Hooper Professor of Geology, and professor of environmental science and engineering, told the humanitarian relief workers at the event. “You’re going to be critical and you’re going to have your hands full.”

Topics at the event, “2013 Humanitarian Action Summit: Climate and Crisis,” included an overview of climate change as well as talks on climate change and food security, conflict and migration, humanitarian aid, climate predictions, and related initiatives in humanitarian organizations. It was co-sponsored by the Harvard Humanitarian Initiative (HHI) and the Harvard University Center for the Environment, and held at the American Academy of Arts and Sciences in Cambridge. Attendees included representatives from a variety of academic and nonprofit organizations, including Oxford University, the University of Liberal Arts in Bangladesh, Oxfam, the World Food Program, AmeriCares, MIT, Stanford, the Brookings Institution, the World Bank, and the Conrad Hilton Foundation.

A major hurdle remains the translation of long-term climate trends into predictions about regional weather events.
Although there seemed to be little doubt of the growing relationship between human-induced climate change and extreme weather, pinpointing trends precisely enough to be useful to relief organizations will be difficult, speakers said. Even where long-term trends are clear, the natural day-to-day variability in the weather makes it difficult to predict very far into the future.

That’s not to say predictions can’t help. Speakers pointed out that weather forecasting has improved significantly in recent decades, so that a 10-day forecast — nonexistent not so many years ago — is reasonably accurate today. Accuracy degrades rapidly beyond that, however, and predictability from a few months up to a decade away is very poor, after which longer-term trends can be discerned. Improved computing power should continue to improve forecasts, but a certain amount of unpredictability will remain, said Mark Cane, a professor of Earth and climate sciences at Columbia University.

Climate scientists do understand some specific drivers of regional weather, and there are useful steps that can be taken even based on long-term outlooks. For example, the relationship between weather patterns and El Niño and La Niña events in the equatorial Pacific is relatively well understood. Though predicting the magnitude of an El Niño or La Niña remains difficult, scientists know that an El Niño is linked to heavy rains in western South America, drought in southern Africa, cool and wet weather in the southern U.S., and dry weather in Australia, which might shed light on the likelihood of crop failures, water shortages, and flooding. Also, knowing that record-setting heat waves will become more frequent allows officials to take preparedness steps in cities with vulnerable elderly populations, even if a particular heat wave cannot be predicted far in advance.
El Niño and La Niña are such powerful drivers of local conditions in the regions they affect that a study of 93 tropical nations found the annual risk of conflict decreases significantly from El Niño years to La Niña years, an effect comparable to raising a nation’s annual per capita income from $1,000 to $10,000, Cane said.

The U.S. Agency for International Development has already begun to use such data in its Famine Early Warning Systems Network, which uses weather and other information to give advance warning to places at risk of food shortages and famine. “Any increase in predictability will be useful,” said Michael Delaney of Oxfam. “I don’t think we have to wait until it’s perfect to use it.”

HHI co-founders Jennifer Leaning, Bagnoud Professor of the Practice of Health and Human Rights, and Michael VanRooyen, HHI director, professor of medicine, and professor of global health and population, said that follow-up between climate scientists and disaster relief specialists will be key. VanRooyen said a small working group will likely be organized to continue the conversation and develop real-world applications. Also on the “2013 Humanitarian Action Summit: Climate and Crisis” panel was Noah Diffenbaugh, assistant professor in the School of Earth Sciences at Stanford University.
Researchers from Harvard-affiliated Brigham and Women’s Hospital (BWH) are the first to report that synthetic silicate nanoplatelets (also known as layered clay) can induce stem cells to become bone cells without the need for additional bone-inducing factors. Synthetic silicates are made up of simple or complex salts of silicic acids, and have been used extensively in commercial and industrial applications such as food additives, glass and ceramic fillers, and anti-caking agents. The research was published online Monday in Advanced Materials.

“With an aging population in the U.S., injuries and degenerative conditions are subsequently on the rise,” said Harvard Medical School Associate Professor of Medicine Ali Khademhosseini of the BWH Division of Biomedical Engineering, the senior author of the study. “As a result, there is an increased demand for therapies that can repair damaged tissues. In particular, there is a great need for new materials that can direct stem cell differentiation and facilitate functional tissue formation. Silicate nanoplatelets have the potential to address this need in medicine and biotechnology.”

“Based on the strong preliminary studies, we believe that these highly bioactive nanoplatelets may be utilized to develop devices such as injectable tissue repair matrixes, bioactive fillers, or therapeutic agents for stimulating specific cellular responses in bone-related tissue engineering,” said Akhilesh Gaharwar of the BWH Division of Biomedical Engineering, the study’s first author and a visiting fellow at the Wyss Institute for Biologically Inspired Engineering at Harvard University.
“Future mechanistic studies will be performed to better understand the underlying pathways that govern favorable responses, leading to a better understanding of how materials strategies can be leveraged to further improve construct performance and ultimately shorten patient recovery time.”

This research was supported by the National Institutes of Health, the U.S. Army Engineer Research and Development Center, the Institute for Soldier Nanotechnologies, and the National Science Foundation.
The breakthrough technique that allowed scientists to obtain one-of-a-kind, colorful images of the myriad connections in the brain and nervous system is about to get a significant upgrade. A group of Harvard researchers, led by Joshua Sanes, the Jeff C. Tarr Professor of Molecular and Cellular Biology and Paul J. Finnegan Family Director of the Center for Brain Science, and Jeff Lichtman, the Jeremy R. Knowles Professor of Molecular and Cellular Biology and Santiago Ramón y Cajal Professor of Arts and Sciences, has made a host of technical improvements in the “Brainbow” imaging technique. Their work is described in a May 5 paper in Nature Methods.

First described in 2007, the system combines three fluorescent proteins — one red, one blue, and one green — to label different cells with as many as 90 colors. By studying the resulting images, researchers were able to begin to understand how the millions of neurons in the brain are connected. In one image, the cells in the cerebellum of a mouse show up in an array of colors, including red, pink, yellow, green, cyan, blue, and brown.

“‘Brainbow’ generated beautiful images of a kind we had never been able to obtain before, but it was difficult in some ways,” Sanes said. “These modifications aim to overcome some of the more problematic features of the original genetic constructs,” Lichtman said. “Lead author Dawen Cai, a research associate in our labs, worked hard and creatively to find ways to make the ‘Brainbow’ colors brighter, more variable, and usable in situations where the original gene constructs were hard to implement. Our first look at these animals suggests that these improvements are fantastic.”

Among the challenges faced by researchers using the original method, Sanes said, was the chance that certain colored proteins would bleach out faster than others.
“If one color bleaches faster than the others, you start with a ‘Brainbow,’ but by the time you’re done imaging, you might just have a ‘blue-bow,’ because the red and yellow bleach too fast,” he said. Sanes said that some colors also were too dim, causing problems in the imaging process, while in other cases the protein didn’t fill the whole neuron evenly enough, or there was an overabundance of a certain color in an image. “What we decided to do was to make the next generation of ‘Brainbow,’” Sanes said. “We systematically set out to look at these problems. We looked at a whole range of fluorescent proteins to find the ones that were brightest and wouldn’t bleach as much, and we developed new transgenic methods to avoid the predominance of a particular color.” The researchers also explored new ways to create “Brainbow” images, including using viruses to introduce fluorescent proteins into cells. The advantage of the new technique, Sanes said, is it offers researchers the chance to target certain parts of the brain and better understand how neurons radiate out to connect with other brain regions. Ultimately, he said, he hopes that other researchers are able to apply the techniques outlined in the paper in the same way that they expanded on the first “Brainbow” method. “People adapted the method to study a number of interesting questions in other tissues to examine cellular relationships and cell lineages in kidney and skin cells,” he said. “It was also used to examine the nervous system in animals like zebrafish and C. elegans. With these new tools, I think we’ve taken the next step.”
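The claim that just three fluorescent proteins can yield roughly 90 distinguishable colors comes from combinatorics. As a hedged back-of-the-envelope sketch (the copy number below is an illustrative assumption, not a figure from the paper): if each cell carries N independent copies of the construct and each copy stochastically commits to one of the three colors, the number of distinguishable red:green:blue ratios is the number of ways to split N among three colors, C(N+2, 2).

```python
# Back-of-the-envelope count of distinguishable Brainbow hues.
# Assumption (illustrative only): n_copies independent construct copies,
# each stochastically expressing one of 3 fluorescent proteins.
from math import comb

def color_combinations(n_copies: int) -> int:
    """Ways to distribute n_copies among 3 fluorescent proteins: C(n+2, 2)."""
    return comb(n_copies + 2, 2)

print(color_combinations(12))  # 91 -- on the order of the ~90 colors described
```

With about a dozen copies per cell, the count lands near the "as many as 90 colors" the article mentions; in practice not all ratio combinations are optically separable, so the usable palette is smaller.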
Detecting alien worlds presents a significant challenge, as they are small, faint, and close to their stars. The two most prolific techniques for finding exoplanets are radial velocity (looking for wobbling stars) and transits (looking for dimming stars). A team at Tel Aviv University in Israel and the Harvard-Smithsonian Center for Astrophysics (CfA) has just discovered an exoplanet using a new method that relies on Einstein’s special theory of relativity.

“We are looking for very subtle effects. We needed high-quality measurements of stellar brightnesses, accurate to a few parts per million,” said team member David Latham of the CfA. “This was only possible because of the exquisite data NASA is collecting with the Kepler spacecraft,” added Simchon Faigler of Tel Aviv University, lead author of the paper announcing the discovery.

Although Kepler was designed to find transiting planets, this planet was not identified using the transit method. Instead, it was discovered using a technique first proposed in 2003 by Avi Loeb of the CfA and his colleague Scott Gaudi, now at Ohio State University. (Coincidentally, Loeb and Gaudi developed their theory while visiting the Institute for Advanced Study in Princeton, where Einstein once worked.)

The new method looks for three small effects that occur simultaneously as a planet orbits a star. Einstein’s “beaming” effect causes the star to brighten as it moves toward us, tugged by the planet, and dim as it moves away. The brightening results from photons “piling up” in energy, as well as from light getting focused in the direction of the star’s motion due to relativistic effects.
“This is the first time that this aspect of Einstein’s theory of relativity has been used to discover a planet,” said co-author Tsevi Mazeh of Tel Aviv University.

The team also looked for signs that the star was stretched into a football shape by gravitational tides from the orbiting planet. The star would appear brighter when the “football” was observed from the side, due to more visible surface area, and fainter when viewed end-on. The third small effect was due to starlight reflected by the planet itself.

Once the new planet was identified, it was confirmed by Latham using radial velocity observations gathered by the Tillinghast Reflector Echelle Spectrograph (TRES) at Whipple Observatory in Arizona, and by Lev Tal-Or
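The size of the beaming signal described above can be estimated with a minimal sketch, assuming a hot-Jupiter-like companion that induces a stellar radial-velocity semi-amplitude of roughly 300 m/s (an illustrative assumption, not a figure from the article) and the simple bolometric approximation that the fractional flux change is about four times the line-of-sight velocity divided by the speed of light:

```python
# Rough estimate of the relativistic Doppler-beaming signal amplitude.
# Assumption: RV semi-amplitude K ~ 300 m/s (illustrative, hot-Jupiter-like),
# bolometric approximation dF/F ~= 4 * v / c.

C = 299_792_458.0  # speed of light, m/s

def beaming_amplitude(k_ms: float) -> float:
    """Fractional flux variation from Doppler beaming for RV semi-amplitude k_ms."""
    return 4.0 * k_ms / C

amp = beaming_amplitude(300.0)
print(f"{amp * 1e6:.1f} ppm")  # ~4 ppm
```

The result, a few parts per million, matches the measurement precision Latham says Kepler had to deliver for the method to work at all.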