Sunday 29 September 2019

Machu Picchu: Ancient Incan sanctuary intentionally built on faults

Machu Picchu, Peru
The ancient Incan sanctuary of Machu Picchu is considered one of humanity's greatest architectural achievements. Built in a remote Andean setting atop a narrow ridge high above a precipitous river canyon, the site is renowned for its perfect integration with the spectacular landscape. But the sanctuary's location has long puzzled scientists: Why did the Incas build their masterpiece in such an inaccessible place? Research suggests the answer may be related to the geological faults that lie beneath the site.
On Monday, 23 Sept. 2019, at the GSA Annual Meeting in Phoenix, Rualdo Menegat, a geologist at Brazil's Federal University of Rio Grande do Sul, will present the results of a detailed geoarchaeological analysis that suggests the Incas intentionally built Machu Picchu -- as well as some of their cities -- in locations where tectonic faults meet. "Machu Picchu's location is not a coincidence," says Menegat. "It would be impossible to build such a site in the high mountains if the substrate was not fractured."
Using a combination of satellite imagery and field measurements, Menegat mapped a dense web of intersecting fractures and faults beneath the UNESCO World Heritage Site. His analysis indicates these features vary widely in scale, from tiny fractures visible in individual stones to major, 175-kilometer-long lineaments that control the orientation of some of the region's river valleys.
Menegat found that these faults and fractures occur in several sets, some of which correspond to the major fault zones responsible for uplifting the Central Andes Mountains during the past eight million years. Because some of these faults are oriented northeast-southwest and others trend northwest-southeast, they collectively create an "X" shape where they intersect beneath Machu Picchu.
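To make the idea concrete, here is a minimal sketch in Python (with illustrative azimuth values, not Menegat's data) of how mapped lineament strikes can be separated into the NE-SW and NW-SE sets whose intersection forms that "X":
```python
import numpy as np

# Illustrative lineament strike azimuths in degrees (made up for this
# sketch; not data from Menegat's study).
azimuths = np.array([42, 48, 45, 51, 39, 137, 130, 141, 135, 128, 44, 133])

# Strikes are bidirectional, so fold everything into the 0-180 degree range.
strikes = azimuths % 180

# Two sets like those described at Machu Picchu: NE-SW trends cluster
# near 45 degrees, NW-SE trends near 135 degrees.
ne_sw = strikes[strikes < 90]
nw_se = strikes[strikes >= 90]

print(f"NE-SW set: n={len(ne_sw)}, mean strike ~{ne_sw.mean():.0f} deg")
print(f"NW-SE set: n={len(nw_se)}, mean strike ~{nw_se.mean():.0f} deg")
# Where traces from the two sets cross, they intersect in the "X"
# pattern described beneath the sanctuary.
```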
Menegat's mapping suggests that the sanctuary's urban sectors and the surrounding agricultural fields, as well as individual buildings and stairs, are all oriented along the trends of these major faults. "The layout clearly reflects the fracture matrix underlying the site," says Menegat. Other ancient Incan cities, including Ollantaytambo, Pisac, and Cusco, are also located at the intersection of faults, says Menegat. "Each is precisely the expression of the main directions of the site's geological faults."
Menegat's results indicate the underlying fault-and-fracture network is as integral to Machu Picchu's construction as its legendary stonework. This mortar-free masonry features stones so perfectly fitted together that it's impossible to slide a credit card between them. As master stoneworkers, the Incas took advantage of the abundant building materials in the fault zone, says Menegat. "The intense fracturing there predisposed the rocks to breaking along these same planes of weakness, which greatly reduced the energy needed to carve them."
In addition to helping shape individual stones, the fault network at Machu Picchu likely offered the Incas other advantages, according to Menegat. Chief among these was a ready source of water. "The area's tectonic faults channeled meltwater and rainwater straight to the site," he says. Construction of the sanctuary in such a high perch also had the benefit of isolating the site from avalanches and landslides, all-too-common hazards in this alpine environment, Menegat explains.
The faults and fractures underlying Machu Picchu also helped drain the site during the intense rainstorms prevalent in the region. "About two-thirds of the effort to build the sanctuary involved constructing subsurface drainages," says Menegat. "The preexisting fractures aided this process and help account for its remarkable preservation," he says. "Machu Picchu clearly shows us that the Incan civilization was an empire of fractured rocks."

How the eyes might be windows to the risk of Alzheimer's disease

Closeup of senior woman's eye
Alzheimer's disease (AD) begins to alter and damage the brain years -- even decades -- before symptoms appear, making early identification of AD risk paramount to slowing its progression.
In a new study published online September 9, 2019 in Neurobiology of Aging, scientists at the University of California San Diego School of Medicine say that, with further development, measuring how quickly a person's pupils dilate while they take cognitive tests may be a low-cost, minimally invasive method to help screen individuals at increased genetic risk for AD before cognitive decline begins.
In recent years, researchers investigating the pathology of AD have primarily directed their attention at two causative or contributory factors: the accumulation in the brain of plaques of a protein called amyloid-beta and of tangles of a protein called tau. Both have been linked to the damage and death of neurons, resulting in progressive cognitive dysfunction.
The new study focuses on pupillary responses, which are driven by the locus coeruleus (LC), a cluster of neurons in the brainstem involved in regulating arousal and modulating cognitive function. Tau is the earliest known biomarker for AD; it first appears in the LC, and it is more strongly associated with cognition than amyloid-beta. The study was led by first author William S. Kremen, PhD, and senior author Carol E. Franz, PhD, both professors of psychiatry and co-directors of the Center for Behavior Genetics of Aging at UC San Diego School of Medicine.
The LC drives pupillary response -- the changing diameter of the eyes' pupils -- during cognitive tasks. (Pupils get bigger the more difficult the brain task.) In previously published work, the researchers had reported that adults with mild cognitive impairment, often a precursor to AD, displayed greater pupil dilation and cognitive effort than cognitively normal individuals, even when both groups produced equivalent results. Critically, in the latest paper, the scientists link pupillary dilation responses with identified AD risk genes.
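To see what such a measurement involves, here is a minimal sketch of how a task-evoked pupillary response is typically quantified -- pupil diameter during a task compared against a pre-task baseline. The trace below is synthetic; the sampling rate and diameters are assumptions for illustration, not the study's recordings.
```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic pupil-diameter trace (mm) sampled at 60 Hz: a 1-second
# pre-task baseline, then a 4-second task window with a slow dilation.
fs = 60
baseline = 3.5 + 0.02 * rng.standard_normal(fs)
t = np.arange(fs * 4) / fs
task = 3.5 + 0.4 * (1 - np.exp(-t / 0.8)) + 0.02 * rng.standard_normal(t.size)

# Task-evoked pupillary response: dilation relative to the baseline mean.
dilation = task - baseline.mean()
print(f"baseline diameter: {baseline.mean():.2f} mm")
print(f"peak task-evoked dilation: {dilation.max():.2f} mm")
```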
"Given the evidence linking pupillary responses, LC and tau and the association between pupillary response and AD polygenic risk scores (an aggregate accounting of factors to determine an individual's inherited AD risk), these results are proof-of-concept that measuring pupillary response during cognitive tasks could be another screening tool to detect Alzheimer's before symptom appear," said Kremen.

Scientists detect the ringing of a newborn black hole for the first time

If Albert Einstein's theory of general relativity holds true, then a black hole, born from the cosmically quaking collisions of two massive black holes, should itself "ring" in the aftermath, producing gravitational waves much like a struck bell reverberates sound waves. Einstein predicted that the particular pitch and decay of these gravitational waves should be a direct signature of the newly formed black hole's mass and spin.
Now, physicists from MIT and elsewhere have "heard" the ringing of an infant black hole for the first time, and found that the pattern of this ringing does, in fact, predict the black hole's mass and spin -- more evidence that Einstein was right all along.
The findings, published today in Physical Review Letters, also favor the idea that black holes lack any sort of "hair" -- a metaphor referring to the idea that black holes, according to Einstein's theory, should exhibit just three observable properties: mass, spin, and electric charge. All other characteristics, which the physicist John Wheeler termed "hair," should be swallowed up by the black hole itself, and would therefore be unobservable.
The team's findings today support the idea that black holes are, in fact, hairless. The researchers were able to identify the pattern of a black hole's ringing, and, using Einstein's equations, calculated the mass and spin that the black hole should have, given its ringing pattern. These calculations matched measurements of the black hole's mass and spin made previously by others.
If the team's calculations deviated significantly from the measurements, it would have suggested that the black hole's ringing encodes properties other than mass, spin, and electric charge -- tantalizing evidence of physics beyond what Einstein's theory can explain. But as it turns out, the black hole's ringing pattern is a direct signature of its mass and spin, giving support to the notion that black holes are bald-faced giants, lacking any extraneous, hair-like properties.
"We all expect general relativity to be correct, but this is the first time we have confirmed it in this way," says the study's lead author, Maximiliano Isi, a NASA Einstein Fellow in MIT's Kavli Institute for Astrophysics and Space Research. "This is the first experimental measurement that succeeds in directly testing the no-hair theorem. It doesn't mean black holes couldn't have hair. It means the picture of black holes with no hair lives for one more day."
A chirp, decoded
On Sept. 14, 2015, scientists made the first-ever detection of gravitational waves -- infinitesimal ripples in space-time, emanating from distant, violent cosmic phenomena. The detection, named GW150914, was made by LIGO, the Laser Interferometer Gravitational-wave Observatory. Once scientists cleared away the noise and zoomed in on the signal, they observed a waveform that quickly crescendoed before fading away. When they translated the signal into sound, they heard something resembling a "chirp."
Scientists determined that the gravitational waves were set off by the rapid inspiraling of two massive black holes. The peak of the signal -- the loudest part of the chirp -- linked to the very moment when the black holes collided, merging into a single, new black hole. While this infant black hole likely gave off gravitational waves of its own, its signature ringing, physicists assumed, would be too faint to decipher amid the clamor of the initial collision.
Isi and his colleagues, however, found a way to extract the black hole's reverberation from the moments immediately after the signal's peak. In previous work led by Isi's co-author, Matthew Giesler, the team showed through simulations that such a signal, and particularly the portion right after the peak, contains "overtones" -- a family of loud, short-lived tones. When they reanalyzed the signal, taking overtones into account, the researchers discovered that they could successfully isolate a ringing pattern that was specific to a newly formed black hole.
In the team's new paper, the researchers applied this technique to actual data from the GW150914 detection, concentrating on the last few milliseconds of the signal, immediately following the chirp's peak. Taking into account the signal's overtones, they were able to discern a ringing coming from the new, infant black hole. Specifically, they identified two distinct tones, each with a pitch and decay rate that they were able to measure.
"We detect an overall gravitational wave signal that's made up of multiple frequencies, which fade away at different rates, like the different pitches that make up a sound," Isi says. "Each frequency or tone corresponds to a vibrational frequency of the new black hole."
Listening beyond Einstein
Einstein's theory of general relativity predicts that the pitch and decay of a black hole's gravitational waves should be a direct product of its mass and spin. That is, a black hole of a given mass and spin can only produce tones of a certain pitch and decay. As a test of Einstein's theory, the team used the equations of general relativity to calculate the newly formed black hole's mass and spin, given the pitch and decay of the two tones they detected.
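As a rough illustration of that mapping, the sketch below uses a widely cited analytic fit to the fundamental quasinormal mode of a spinning black hole (Berti, Cardoso & Will, 2006). The mass and spin plugged in are illustrative values close to those reported for GW150914's remnant, about 62 solar masses and a dimensionless spin of roughly 0.67; this is a back-of-envelope check, not the paper's own analysis.
```python
import numpy as np

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_SUN = 1.989e30   # kg

def fundamental_tone(mass_msun, spin):
    """Frequency (Hz) and damping time (s) of the fundamental l=m=2
    quasinormal mode of a Kerr black hole, via the Berti-Cardoso-Will
    (2006) analytic fit."""
    t_m = G * mass_msun * M_SUN / c**3                    # geometric time unit
    omega = (1.5251 - 1.1568 * (1 - spin)**0.1292) / t_m  # angular freq, rad/s
    quality = 0.7000 + 1.4187 * (1 - spin)**-0.4990       # quality factor
    return omega / (2 * np.pi), 2 * quality / omega

f, tau = fundamental_tone(62.0, 0.67)   # illustrative GW150914-like remnant
print(f"fundamental tone: ~{f:.0f} Hz, damping time ~{1e3 * tau:.1f} ms")

# The ringdown strain is then a sum of damped sinusoids, one per tone:
t = np.linspace(0, 0.02, 1000)
h = np.exp(-t / tau) * np.cos(2 * np.pi * f * t)
```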
They found their calculations matched with measurements of the black hole's mass and spin previously made by others. Isi says the results demonstrate that researchers can, in fact, use the very loudest, most detectable parts of a gravitational wave signal to discern a new black hole's ringing, where before, scientists assumed that this ringing could only be detected within the much fainter end of the gravitational wave signal, and only with much more sensitive instruments than what currently exist.
"This is exciting for the community because it shows these kinds of studies are possible now, not in 20 years," Isi says.
As LIGO improves its resolution, and more sensitive instruments come online in the future, researchers will be able to use the group's methods to "hear" the ringing of other newly born black holes. And if they happen to pick up tones that don't quite match up with Einstein's predictions, that could be an even more exciting prospect.
"In the future, we'll have better detectors on Earth and in space, and will be able to see not just two, but tens of modes, and pin down their properties precisely," Isi says. "If these are not black holes as Einstein predicts, if they are more exotic objects like wormholes or boson stars, they may not ring in the same way, and we'll have a chance of seeing them."
This research was supported, in part, by NASA, the Sherman Fairchild Foundation, the Simons Foundation, and the National Science Foundation.

Engineers develop 'blackest black' material to date

With apologies to "Spinal Tap," it appears that black can, indeed, get more black.
MIT engineers report today that they have cooked up a material that is 10 times blacker than anything that has previously been reported. The material is made from vertically aligned carbon nanotubes, or CNTs -- microscopic filaments of carbon, like a fuzzy forest of tiny trees, that the team grew on a surface of chlorine-etched aluminum foil. The foil captures more than 99.995 percent of any incoming light, making it the blackest material on record.
The researchers have published their findings today in the journal ACS Applied Materials & Interfaces. They are also showcasing the cloak-like material as part of a new exhibit today at the New York Stock Exchange, titled "The Redemption of Vanity."
The artwork, a collaboration between Brian Wardle, professor of aeronautics and astronautics at MIT, and his group, and MIT artist-in-residence Diemut Strebe, features a 16.78-carat natural yellow diamond, estimated to be worth $2 million, which the team coated with the new, ultrablack CNT material. The effect is arresting: The gem, normally brilliantly faceted, appears as a flat, black void.
Wardle says the CNT material, aside from making an artistic statement, may also be of practical use, for instance in optical blinders that reduce unwanted glare, to help space telescopes spot orbiting exoplanets.
"There are optical and space science applications for very black materials, and of course, artists have been interested in black, going back well before the Renaissance," Wardle says. "Our material is 10 times blacker than anything that's ever been reported, but I think the blackest black is a constantly moving target. Someone will find a blacker material, and eventually we'll understand all the underlying mechanisms, and will be able to properly engineer the ultimate black."
Wardle's co-author on the paper is former MIT postdoc Kehang Cui, now a professor at Shanghai Jiao Tong University.
Into the void
Wardle and Cui didn't intend to engineer an ultrablack material. Instead, they were experimenting with ways to grow carbon nanotubes on electrically conducting materials such as aluminum, to boost their electrical and thermal properties.
But in attempting to grow CNTs on aluminum, Cui ran up against a barrier, literally: an ever-present layer of oxide that coats aluminum when it is exposed to air. This oxide layer acts as an insulator, blocking rather than conducting electricity and heat. As he cast about for ways to remove aluminum's oxide layer, Cui found a solution in salt, or sodium chloride.
At the time, Wardle's group was using salt and other pantry products, such as baking soda and detergent, to grow carbon nanotubes. In their tests with salt, Cui noticed that chloride ions were eating away at aluminum's surface and dissolving its oxide layer.
"This etching process is common for many metals," Cui says. "For instance, ships suffer from corrosion of chlorine-based ocean water. Now we're using this process to our advantage."
Cui found that if he soaked aluminum foil in saltwater, he could remove the oxide layer. He then transferred the foil to an oxygen-free environment to prevent reoxidation, and finally, placed the etched aluminum in an oven, where the group carried out techniques to grow carbon nanotubes via a process called chemical vapor deposition.
By removing the oxide layer, the researchers were able to grow carbon nanotubes on aluminum, at much lower temperatures than they otherwise would, by about 100 degrees Celsius. They also saw that the combination of CNTs on aluminum significantly enhanced the material's thermal and electrical properties -- a finding that they expected.
What surprised them was the material's color.
"I remember noticing how black it was before growing carbon nanotubes on it, and then after growth, it looked even darker," Cui recalls. "So I thought I should measure the optical reflectance of the sample.
"Our group does not usually focus on optical properties of materials, but this work was going on at the same time as our art-science collaborations with Diemut, so art influenced science in this case," says Wardle.
Wardle and Cui, who have applied for a patent on the technology, are making the new CNT process freely available to any artist to use for a noncommercial art project.
"Built to take abuse"
Cui measured the amount of light reflected by the material, not just from directly overhead, but also from every other possible angle. The results showed that the material absorbed greater than 99.995 percent of incoming light, from every angle. In essence, if the material contained bumps or ridges, or features of any kind, no matter what angle it was viewed from, these features would be invisible, obscured in a void of black.
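A quick back-of-envelope calculation shows what those figures mean in terms of reflected light, assuming "10 times blacker" is compared by the fraction of light a surface reflects:
```python
# Absorbing 99.995 percent means reflecting 0.005 percent of incoming
# light; a material "10 times less black" reflects ten times as much.
R_new = 1 - 0.99995    # fraction reflected by the new CNT material
R_prev = 10 * R_new    # implied reflectance of a previous record holder
print(f"new material reflects {R_new:.3%} of incoming light")
print(f"a 10x-less-black material reflects {R_prev:.2%} (99.95% absorbed)")
```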
The researchers aren't entirely sure of the mechanism contributing to the material's opacity, but they suspect that it may have something to do with the combination of etched aluminum, which is somewhat blackened, with the carbon nanotubes. Scientists believe that forests of carbon nanotubes can trap and convert most incoming light to heat, reflecting very little of it back out as light, thereby giving CNTs a particularly black shade.

Are humans preventing flies from eavesdropping?

Today's world is filled with background noise, whether it be from a roaring river or a well-trafficked highway. Elevated noise levels from both human-made and natural sources may interfere with animals' listening ability and alter how they interact with other animals. A group of researchers at California Polytechnic State University investigated how background sounds affect a parasitoid fly's eavesdropping capabilities.
Ormia flies listen for cricket calls to locate hosts for their young. When a cricket is found, the flies deposit their eggs on or near it. Larvae hatch and burrow inside the cricket, eventually bursting through and killing the host. The researchers therefore hypothesized that noise could interfere with the flies' eavesdropping, making it harder for them to find their hosts.
The research, published in Royal Society Open Science, used sticky fly traps placed near speakers broadcasting cricket calls across a gradient of noise levels. The results show that fewer parasitoid flies were caught near speakers in noisier locations. Because parasitoids end up killing their hosts, the results suggest that crickets may benefit from calling in noisy areas.
The study also found that both traffic noise and natural ocean noise inhibit fly orientation to sound, suggesting crickets could use sound as a parasite shield across different soundscapes. These results suggest that soundscapes may influence the evolution of tightly co-evolved host-parasitoid relationships.
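As an illustration of the kind of analysis behind such a result, here is a minimal sketch of a Poisson regression relating trap catches to background noise level; the counts below are synthetic, not the study's data, and the effect size is invented for the example.
```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Synthetic data: fly counts at sticky traps next to speakers broadcasting
# cricket song, across a gradient of background noise levels.
noise_db = np.linspace(40, 80, 30)               # background noise (dB)
expected = np.exp(3.0 - 0.04 * (noise_db - 40))  # fewer flies when noisier
counts = rng.poisson(expected)

# Poisson regression: log(expected count) ~ intercept + slope * noise.
X = sm.add_constant(noise_db)
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(f"each +1 dB of noise multiplies the expected catch by "
      f"{np.exp(fit.params[1]):.3f}")
```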
Some questions still remain for the authors -- if the parasitoid flies are less abundant in noise, might female crickets have the same problem localizing male crickets' calls? Author Jennifer N. Phillips stated, "Future work could investigate whether female crickets have trouble hearing and moving toward calling males in noise, which would be a fitness cost to males trying to find a mate. This could balance out the reduced risk from fly parasitism. However, if crickets are able to adjust their call or ability to find each other and the parasitoid fly does not, there may still be some benefit of noise to crickets."

Atlantic Ocean may get a jump-start from the other side of the world

Ocean waves crashing on beach
A key question for climate scientists in recent years has been whether the Atlantic Ocean's main circulation system is slowing down, a development that could have dramatic consequences for Europe and other parts of the Atlantic rim. But a new study suggests help may be on the way from an unexpected source -- the Indian Ocean.
Think of it as ocean-to-ocean altruism in the age of climate change.
The new study, from Shineng Hu of the Scripps Institution of Oceanography at the University of California-San Diego and Alexey Fedorov of Yale University, appears Sept. 16 in the journal Nature Climate Change. It is the latest in a growing body of research that explores how global warming may alter global climate components such as the Atlantic meridional overturning circulation (AMOC).
AMOC is one of the planet's largest water circulation systems. It operates like a liquid escalator, delivering warm water to the North Atlantic via an upper limb and sending colder water south via a deeper limb.
Although AMOC has been stable for thousands of years, data from the past 15 years, as well as computer model projections, have given some scientists cause for concern. AMOC has shown signs of slowing during that period, but whether it is a result of global warming or only a short-term anomaly related to natural ocean variability is not known.
"There is no consensus yet," Fedorov said, "but I think the issue of AMOC stability should not be ignored. The mere possibility that the AMOC could collapse should be a strong reason for concern in an era when human activity is forcing significant changes to the Earth's systems.
"We know that the last time AMOC weakened substantially was 15,000 to 17,000 years ago, and it had global impacts," Fedorov added. "We would be talking about harsh winters in Europe, with more storms or a drier Sahel in Africa due to the downward shift of the tropical rain belt, for example."
Much of Fedorov and Hu's work focuses on specific climate mechanisms and features that may be shifting due to global warming. Using a combination of observational data and sophisticated computer modeling, they plot out what effects such shifts might have over time. For example, Fedorov has looked previously at the role melting Arctic sea ice might have on AMOC.
For the new study, they looked at warming in the Indian Ocean.
"The Indian Ocean is one of the fingerprints of global warming," said Hu, who is first author of the new work. "Warming of the Indian Ocean is considered one of the most robust aspects of global warming."
The researchers said their modeling indicates a series of cascading effects that stretch from the Indian Ocean all the way to the Atlantic: As the Indian Ocean warms faster and faster, it generates additional precipitation. This, in turn, draws more air from other parts of the world, including the Atlantic, to the Indian Ocean.
With so much precipitation in the Indian Ocean, there will be less precipitation in the Atlantic Ocean, the researchers said. Less precipitation will lead to higher salinity in the waters of the tropical portion of the Atlantic -- because there won't be as much rainwater to dilute it. This saltier water in the Atlantic, as it comes north via AMOC, will get cold much quicker than usual and sink faster.
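The density side of that chain of effects can be illustrated with a linearized equation of state for seawater, a textbook approximation; the coefficients below are typical values, not the study's model.
```python
# Linearized seawater equation of state (textbook approximation):
#   rho = rho0 * (1 - alpha*(T - T0) + beta*(S - S0))
rho0, T0, S0 = 1027.0, 10.0, 35.0   # kg/m^3, deg C, psu
alpha, beta = 1.7e-4, 7.6e-4        # thermal expansion, haline contraction

def density(temp_c, salinity_psu):
    return rho0 * (1 - alpha * (temp_c - T0) + beta * (salinity_psu - S0))

# At the same temperature, saltier tropical Atlantic water is denser,
# so as it moves north and cools it reaches sinking density sooner.
print(f"reference water: {density(10.0, 35.0):.2f} kg/m^3")
print(f"+0.5 psu water:  {density(10.0, 35.5):.2f} kg/m^3")
```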
"This would act as a jump-start for AMOC, intensifying the circulation," Fedorov said. "On the other hand, we don't know how long this enhanced Indian Ocean warming will continue. If other tropical oceans' warming, especially the Pacific, catches up with the Indian Ocean, the advantage for AMOC will stop."
The researchers said this latest finding illustrates the intricate, interconnected nature of global climate. As scientists try to understand the unfolding effects of climate change, they must attempt to identify all of the climate variables and mechanisms that are likely to play a role, they added.
"There are undoubtedly many other connections that we don't know about yet," Fedorov said. "Which mechanisms are most dominant? We're interested in that interplay."

Carp aquaculture in Neolithic China dating back 8,000 years

Carp
In a recent study, an international team of researchers analyzed fish bones excavated from the Early Neolithic Jiahu site in Henan Province, China. By comparing the body-length distributions and species-composition ratios of the bones with findings from East Asian sites with present-day aquaculture, the researchers provide evidence of managed carp aquaculture at Jiahu dating back to 6200-5700 BC.
Despite the growing importance of farmed fish for economies and diets around the world, the origins of aquaculture remain unknown. The Shijing, the oldest surviving collection of ancient Chinese poetry, mentions carp being reared in a pond circa 1140 BC, and historical records describe carp being raised in artificial ponds and paddy fields in East Asia by the first millennium BC. But considering rice paddy fields in China date all the way back to the fifth millennium BC, researchers from Lake Biwa Museum in Kusatsu, Japan, the Max Planck Institute for the Science of Human History in Jena, Germany, the Sainsbury Institute for the Study of Japanese Arts and Cultures in Norwich, U.K., and an international team of colleagues set out to discover whether carp aquaculture in China was practiced earlier than previously thought.
Carp farming goes way back in Early Neolithic Jiahu
Jiahu, located in Henan, China, is known for the early domestication of rice and pigs, as well as the early development of fermented beverages, bone flutes, and possibly writing. This history of early development, combined with archaeological findings suggesting the presence of large expanses of water, made Jiahu an ideal location for the present study.
Researchers measured 588 pharyngeal carp teeth extracted from fish remains at Jiahu corresponding to three separate Neolithic periods, and compared the body-length distributions with findings from other sites and a modern sample of carp raised in Matsukawa Village, Japan. While the remains from the first two periods revealed unimodal patterns of body-length distribution peaking at or near carp maturity, the remains of Period III (6200-5700 BC) displayed a bimodal distribution, with one peak at 350-400 mm corresponding with sexual maturity, and another at 150-200 mm.
This bimodal distribution identified by researchers was similar to that documented at the Iron Age Asahi site in Japan (circa 400 BC -- AD 100), and is indicative of a managed system of carp aquaculture that until now was unidentified in Neolithic China. "In such fisheries," the study notes, "a large number of cyprinids were caught during the spawning season and processed as preserved food. At the same time, some carp were kept alive and released into confined, human regulated waters where they spawned naturally and their offspring grew by feeding on available resources. In autumn, water was drained from the ponds and the fish harvested, with body-length distributions showing two peaks due to the presence of both immature and mature individuals."
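A minimal sketch of how such bimodality can be quantified is to fit one- and two-component Gaussian mixtures to the body lengths and compare the fits; the lengths below are synthetic values mimicking the Period III pattern, not the Jiahu measurements.
```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Synthetic carp body lengths (mm): one mode of immature fish near
# 175 mm and one near sexual maturity at ~375 mm.
lengths = np.concatenate([rng.normal(175, 25, 250),
                          rng.normal(375, 30, 330)]).reshape(-1, 1)

gm2 = GaussianMixture(n_components=2, random_state=0).fit(lengths)
gm1 = GaussianMixture(n_components=1, random_state=0).fit(lengths)

print("recovered modes (mm):", sorted(round(m) for m in gm2.means_.ravel()))
# A lower BIC for the two-component model is a simple check that the
# distribution really has two peaks rather than one.
print(f"BIC, 2 components: {gm2.bic(lengths):.0f}")
print(f"BIC, 1 component:  {gm1.bic(lengths):.0f}")
```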
Species-composition ratios support findings, indicate cultural preferences
The size of the fish wasn't the only piece of evidence researchers found supporting carp management at Jiahu. In East Asian lakes and rivers, crucian carp are typically more abundant than common carp, but common carp comprised roughly 75% of cyprinid remains found at Jiahu. This high proportion of less-prevalent fish indicates a cultural preference for common carp and the presence of aquaculture sophisticated enough to provide it.
Based on the analysis of carp remains from Jiahu and data from previous studies, researchers hypothesize three stages of aquaculture development in prehistoric East Asia. In Stage 1, humans fished the marshy areas where carp gather during spawning season. In Stage 2, these marshy ecotones were managed by digging channels and controlling water levels and circulation so the carp could spawn and the juveniles could later be harvested. Stage 3 involved constant human management, including using spawning beds to control reproduction and fish ponds or paddy fields to manage adolescents.
Although rice paddy fields have not yet been identified at Jiahu, the evolution of carp aquaculture appears to be connected with that of wet rice agriculture, and the coevolution of the two is an important topic for future research.

Why is Earth so biologically diverse? Mountains hold the answer

Mount Chimborazo volcano, Ecuador
What determines global patterns of biodiversity has been a puzzle for scientists since the days of von Humboldt, Darwin, and Wallace. Yet, despite two centuries of research, this question remains unanswered. The global pattern of mountain biodiversity, and the extraordinarily high richness in tropical mountains in particular, is documented in two companion Science review papers this week. The papers focus on the fact that the high level of biodiversity found on mountains is far beyond what would be expected from prevailing hypotheses.
"The challenge is that, although it is evident that much of the global variation in biodiversity is so clearly driven by the extraordinary richness of tropical mountain regions, it is this very richness that current biodiversity models, based on contemporary climate, cannot explain: mountains are simply too rich in species, and we are falling short of explaining global hotspots of biodiversity," says Professor Carsten Rahbek, lead author of both review papers published in Science.
To confront the question of why mountains are so biologically diverse, scientists at the Center for Macroecology, Evolution and Climate (CMEC) at the GLOBE Institute of the University of Copenhagen work to synthesize understanding and data from the disparate fields of macroecology, evolutionary biology, earth sciences, and geology. The CMEC scientists are joined by individual collaborators from Oxford University, Kew Gardens, and University of Connecticut.
Part of the answer, these studies find, lies in understanding that the climate of rugged tropical mountain regions is fundamentally different in complexity and diversity compared to adjacent lowland regions. Uniquely heterogeneous mountain climates likely play a key role in generating and maintaining high diversity.
"People often think of mountain climates as bleak and harsh," says study co-leader Michael K. Borregaard. "But the most species-rich mountain region in the world, the Northern Andes, captures, for example, roughly half of the world's climate types in a relatively small region -- much more than is captured in nearby Amazon, a region that is more than 12 times larger."
Stressing another unique feature of mountain climate, Michael explains, "Tropical mountains, based in fertile and wet equatorial lowlands and extending into climatic conditions superficially similar to those found in the Arctic, span a gradient of annual mean temperatures over just a few km as large as that found over the 10,000 km from the tropical lowlands at the Equator to the Arctic regions at the poles. It's pretty amazing if you think about it."
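The arithmetic behind that comparison is simple, assuming the standard environmental lapse rate of roughly 6.5 degrees Celsius per kilometer of elevation (the elevation figure below is illustrative):
```python
# Temperature change over a tropical mountain's relief vs. over latitude.
lapse_rate_c_per_km = 6.5     # standard environmental lapse rate
relief_km = 6.0               # lowland rainforest to a high Andean summit
mountain_gradient = lapse_rate_c_per_km * relief_km
print(f"~{mountain_gradient:.0f} deg C across a few tens of km of terrain")
# The equator-to-pole difference in annual mean temperature is roughly
# comparable (~30-45 deg C), but spread over some 10,000 km.
```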
Another part of the explanation of the high biodiversity of certain mountains is linked to the geological dynamics of mountain building. These geological processes, interacting with complex climate changes through time, provide ample opportunities for evolutionary processes to act.
"The global pattern of biodiversity shows that mountain biodiversity exhibits a visible signature of past evolutionary processes. Mountains, with their uniquely complex environments and geology, have allowed the continued persistence of ancient species deeply rooted in the tree of life, as well as being cradles where new species have arisen at a much higher rate than in lowland areas, even in areas as amazingly biodiverse as the Amazonian rainforest," says Professor Carsten Rahbek.
From ocean crust, volcanism and bedrock to mountain biodiversity
Another explanation of mountain richness, says the study, may lie in the interaction between geology and biology. The scientists report a novel and surprising finding: in most tropical mountains, the high diversity is tightly linked to bedrock geology -- especially in mountain regions with obducted, ancient oceanic crust. To explain this relationship between geology and biodiversity, the scientists propose, as a working hypothesis, that mountains in the tropics with soil originating from oceanic bedrock provide exceptional environmental conditions that drive localized adaptive change in plants. Special adaptations that allow plants to tolerate these unusual soils, in turn, may drive speciation cascades (the speciation of one group leading to speciation in other groups), all the way to animals, and ultimately contribute to the shape of global patterns of biodiversity.
The legacy of von Humboldt -- his 250th anniversary
The two papers are part of Science's celebration of Alexander von Humboldt's 250th birth anniversary. In 1799, Alexander von Humboldt set sail on a 5-year, 8000-km voyage of scientific discovery through Latin America. His journey through the Andes Mountains, captured by his famous vegetation zonation figure featuring Mount Chimborazo, canonized the place of mountains in understanding Earth's biodiversity.
Acknowledging von Humboldt's contribution to our understanding of the living world, Professor Carsten Rahbek, one of the founding scientists of the newly established interdisciplinary GLOBE Institute at the University of Copenhagen says:
"Our papers in Science are a testimony to the work of von Humboldt, which truly revolutionized our thinking about the processes that determine the distribution of life. Our work today stands on the shoulders of his work, done centuries ago, and follows his approach of integrating data and knowledge of different scientific disciplines into a more holistic understanding of the natural world. It is our small contribution of respect to the legacy of von Humboldt."

Natural selection alters genes that control roundworms' sense of smell

Charles Darwin was right.
In his 1859 book, "On the Origin of Species," the famed scientist hypothesized that artificial selection (or domestication) and natural selection work in the same ways.
Now an international team, led by Northwestern University, has produced some of the first evidence that Darwin's speculation was correct.
This time, the study's subjects are not exotic birds in the Galapagos, but instead a roundworm, which relies on its sense of smell to assess the availability of food and nearby competition. In the Northwestern-led work, researchers found that natural selection acts on the same genes controlling wild roundworms' sense of smell that artificial selection was previously found to act on in domesticated worms in the lab.
"The evolution of traits if rarely connected to exact genes and processes," said Northwestern's Erik Andersen, who led the study. "We offer a clear example of how evolution works."
The scientists used a combination of laboratory experiments, computational genomic analysis and field work. Their research also shows that natural selection acts on signal-sensing receptors rather than the downstream parts of the genetic process.
The study was published this week (Sept. 23) in the journal Nature Ecology & Evolution. Andersen is an associate professor of molecular biosciences in Northwestern's Weinberg College of Arts and Sciences.
A keystone model organism, C. elegans is a one-millimeter-long roundworm that lives in decaying organic matter -- particularly rotten fruits -- and feeds on bacteria. These roundworms are typically found in gardens and compost piles.
For C. elegans, having a keen sense of smell can be the difference between life and death. If they smell enough food in their environment, then they will stay, grow and reproduce. If they sense a shortage of food and/or too much competition from other worms, then they will undertake a long and potentially fatal journey in search of a more favorable environment. This process, called "dauer," delays growth and reproduction.
In other words, dauer decreases reproductive success in the short term in order to ensure survival in the long run.
"At some point in their lives, these worms must make a gamble," Andersen said. "In the time it takes for a worm to come out of dauer and start growing again, the worm that stayed behind has already been multiplying. If the food runs out, then the dauer worm made the right decision and wins. If the food doesn't run out, then the dauer worm loses."
Andersen and his collaborators found that evolution plays a significant role in a worm's decision to stay or enter dauer. Some roundworms have one genetic receptor to process scents; other roundworms have two. The roundworms with two receptors have a heightened sense of smell, which allows them to better assess the availability of resources in their environment and make a better gamble.
"If worms can smell large numbers of worms around them, that gives them an advantage," Andersen said. "This was discovered in a previous study of artificial selection in worms. Now we also found that result in natural populations. We can see specific evidence in these two genes that artificial and natural selection act similarly."

Friday 27 September 2019

A little kindness goes a long way for worker performance and health

Small gestures of kindness by employers can have big impacts on employees' health and work performance, according to an international team of researchers. The team specifically examined the effects of employers enhancing the lunches of bus drivers in China with fresh fruit and found that it reduced depression among the drivers and increased their confidence in their own work performance.
"An ultimate solution to improve worker performance and health could be big pay raises or reduced workloads, but when those solutions aren't feasible, we found that even small offerings can make a big difference," said Bu Zhong, associate professor of journalism at Penn State.
According to Zhong, bus drivers are vulnerable to specific health problems due in large part to their stressful working environment, which often includes irregular shift schedules, unpredictable traffic conditions and random meal times. In addition, the sedentary nature of driving and continuous whole-body vibration contribute to fatigue, musculoskeletal problems such as lower-back pain, cardiovascular diseases and gastrointestinal issues.
Zhong and his colleagues conducted an experiment with 86 Shenzhen bus drivers. During the experiment, on-duty bus drivers were given, in addition to their typical box lunch, which includes no fruit, a serving of fresh fruit -- either an apple or a banana -- for three weeks. The cost of the fruit was 73 cents per meal.
The team distributed surveys to the bus drivers at three time intervals -- one week before the experiment began, once in the middle of the three-week-long experiment and one week following the end of the experiment. The findings appear today in the International Journal of Occupational Safety and Ergonomics.
The researchers assessed depression with a personal health questionnaire that is recommended by the U.S. Centers for Disease Control and Prevention. The scale consisted of eight items, asking the participants to rate, for example, how often during the past two weeks they felt down, depressed or hopeless, and had trouble falling or staying asleep.
"Bus drivers reported significantly decreased depression levels one week after the experiments ended compared to one week before it began," said Zhong.
The team measured self-efficacy -- perceived confidence and ability to implement the necessary actions and tasks so as to achieve specific goals -- using the 10-item General Self-Efficacy Scale. Items on this scale included, "I can always manage to solve difficult problems if I try hard enough" and "I can usually handle whatever comes my way."
"We found that self-efficacy was significantly higher in the middle of the experiment week than in the week after the experiment ended," said Zhong.
Zhong concluded that while eating an extra apple at lunchtime may seem trivial, its impact can be large.
"This research suggests that employees can be sensitive to any improvement at the workplace," he said. "Before an ultimate solution is possible, some small steps can make a difference -- one apple at a time."

Virtual reality training could improve employee safety

A new study suggests employee safety could be improved through use of Virtual Reality (VR) in Health and Safety training, such as fire evacuation drills.
The Human Factors Research Group at the University of Nottingham developed an immersive VR system to stimulate participants' perception of temperature and senses of smell, sight and hearing, to explore how they behaved during two health and safety training scenarios: an emergency evacuation in the event of a fire and a fuel leak.
In one scenario, participants had to evacuate from a virtual fire in an office, seeing and hearing the scene through a VR headset while also feeling heat from three 2kW heaters and smelling smoke from a scent diffuser, creating a multisensory virtual environment. This group was compared against another group who experienced the same scenario using only the audio-visual elements of VR.
Observing real life behaviours
Previous research on human behaviour during real-world fire incidents has shown that a lack of understanding of the spread and movement of fire often means that occupants are unprepared and misjudge appropriate actions. Immersive health and safety training enables employers to train people about hazards and hazardous environments without putting anyone at risk.
The Nottingham research, funded by the Institution of Occupational Safety and Health (IOSH), found contrasts between the groups in the way participants reacted to the scenario. Those in the multi-sensory group had a greater sense of urgency, reflecting a real-life scenario, and were more likely to avoid the virtual fires. Evidence from the audio-visual participants suggested that they were treating the experience more like a game and behaviours were less consistent with those expected in a real world situation.
Dr Glyn Lawson, Associate Professor in the Faculty of Engineering, University of Nottingham, said: "Health and safety training can fail to motivate and engage employees and can lack relevance to real-life contexts. Our research, which has been funded by the Institution of Occupational Safety and Health, suggests that virtual environments can help address these issues, by increasing trainees' engagement and willingness to participate in further training. There are also business benefits associated with the use of virtual environment training, such as the ability to deliver training at or near the workplace and at a time that is convenient to the employee."
Virtual Reality vs. PowerPoint
A further test was done, as part of the study, to measure the effectiveness of VR training versus traditional PowerPoint training. Participants took questionnaires, testing their knowledge on either fire safety or safe vehicle disassembly procedure, before and after training as well as one week later.
While those trained via PowerPoint appeared to have gained more knowledge when tested directly after training, their knowledge scores dropped significantly more when participants were retested one week later. In comparison, the VR group showed better long-term retention and reported higher levels of engagement, more positive attitudes toward occupational safety and health, and greater willingness to undertake training in the future.
The research suggests that the increased cognitive engagement of learning in a virtual environment creates more established and comprehensive mental models, which can improve recall. It also implies that testing an employee's knowledge immediately after health and safety training may not be an effective means of gauging long-term knowledge of health and safety.
Applications to the work place
Mary Ogungbeje, Research Manager at IOSH, said: "The wheels are turning so that virtual and smart learning is increasingly ingrained in the workplace and everyday life.
"Technology is continuously advancing and in many cases becoming more affordable, so this study gives us a taste of what's to come. By improving training strategies with the use of technology and stimulated sensory experiences, we are heading in a direction where the workforce will not just enjoy a more immersive and interesting training course but participate in an effective learning experience, so they are better prepared and equipped to stay safe, healthy and well at work."
The researchers conducted meetings, discussions, and visits with partners including Rolls-Royce, for expert advice around fire safety and safe handling of hazardous chemicals. The University of Nottingham's Health and Safety advisors also contributed to help the researchers better understand how the training may be implemented in industry.
The study aims to produce evidence-based guidance for the development and use of virtual environments in engaging and effective training using cost-effective and accessible solutions. The full study features in a report, titled 'Immersive virtual worlds: Multisensory virtual environments for health and safety training', to be released at the IOSH's annual conference on Tuesday 17 September.

The future of 'extremely' energy-efficient circuits

Data centers are processing data and dispensing results at astonishing rates, and such systems require a significant amount of energy -- so much energy, in fact, that information communication technology is projected to account for 20% of total energy consumption in the United States by 2020.
To answer this demand, a team of researchers from Japan and the United States have developed a framework to reduce energy consumption while improving efficiency.
They published their results on July 19 in Scientific Reports, a Nature journal.
"The significant amount of energy consumption has become a critical problem in modern society," said Olivia Chen, corresponding author of the paper and assistant professor in the Institute of Advanced Sciences at Yokohama National University. "There is an urgent requirement for extremely energy-efficient computing technologies."
The research team used a digital logic process called Adiabatic Quantum-Flux-Parametron (AQFP). The idea behind the logic is that direct current should be replaced with alternating current. The alternating current acts as both the clock signal and the power supply -- as the current switches directions, it signals the next time phase for computing.
The logic, according to Chen, could improve conventional communication technologies with currently available fabrication processes.
"However, there lacks a systematic, automatic synthesis framework to translate from high-level logic description to Adiabatic Quantum-Flux-Parametron circuit netlist structures," Chen said, referring to the individual processors within the circuit. "In this paper, we mitigate that gap by presenting an automatic flow. We also demonstrate that AQFP can achieve a reduction in energy use by several orders of magnitude compared to traditional technologies."
The researchers proposed a top-down framework for computing decisions that can also analyze its own performance. To do this, they used logic synthesis, a process by which they direct the passage of information through logic gates within the processing unit. Logic gates can take in a little bit of information and output a yes or no answer. The answer can trigger other gates to respond and move the process forward, or stop it completely.
On this basis, the researchers developed a synthesis flow that takes a high-level description of the computation, along with how much energy the system uses and dissipates, and turns it into an optimized mapping for each gate within the circuit model. From this, Chen and the research team can balance the estimated power needed to drive the system against the energy the system dissipates.
According to Chen, even after accounting for the cooling energy needed by superconducting technologies, this approach reduces energy dissipation by two orders of magnitude.
"These results demonstrate the potential of AQFP technology and applications for large-scale, high-performance and energy-efficient computations," Chen said.
Ultimately, the researchers plan to develop a fully automated framework to generate the most efficient AQFP circuit layout.
"The synthesis results of AQFP circuits are highly promising in terms of energy-efficient and high-performance computing," Chen said. "With the future advancing and maturity of AQFP fabrication technology, we anticipate broader applications ranging from space applications and large-scale computing facilities such as data centers."

Investments to address climate change are good business

An internationally respected group of scientists have urgently called on world leaders to accelerate efforts to tackle climate change. Almost every aspect of the planet's environment and ecology is undergoing changes in response to climate change, some of which will be profound if not catastrophic in the future.
According to their study published in Science today, reducing the magnitude of climate change is also a good investment. Over the next few decades, acting to reduce climate change is expected to cost much less than the damage otherwise inflicted by climate change on people, infrastructure and ecosystems.
"Acting on climate change" said lead author, Prof Ove Hoegh-Guldberg from the ARC Centre for Excellence in Coral Reef Studies at the University of Queensland in Australia "has a good return on investment when one considers the damages avoided by acting."
The investment is even more compelling given the wealth of evidence that the impacts of climate change are happening faster and more extensively than projected even just a few years ago, which makes the case for rapidly reducing greenhouse gas emissions all the more urgent.
Prof Hoegh-Guldberg explained the mismatch. "First, we have underestimated the sensitivity of natural and human systems to climate change, and the speed at which these changes are happening. Second, we have underappreciated the synergistic nature of climate threats -- with the outcomes tending to be worse than the sum of the parts. This is resulting in rapid and comprehensive climate impacts, with growing damage to people, ecosystems, and livelihoods."
For example, sea-level rise can lead to higher water levels during storm events. This can create more damage. For deprived areas, this may exacerbate poverty, creating further disadvantage. Each risk may be small on its own, but a small change in a number of risks can lead to large impacts.
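A worked example makes the compounding concrete (the numbers here are illustrative, not taken from the paper):
```python
# Three interacting hazards -- say, sea level, storm surge and rainfall
# intensity -- each grow by a modest 20%, but because they compound,
# the combined effect is much larger than any single change.
increase_per_risk = 1.20
n_risks = 3
combined = increase_per_risk ** n_risks
print(f"combined impact factor: {combined:.2f} "
      f"(a {100 * (combined - 1):.0f}% increase)")
```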
Prof Daniela Jacob, co-author and Director of Climate Services Centre (GERICS) in Germany is concerned about these rapid changes -- especially about unprecedented weather extremes.
"We are already in new territory" said Prof Jacob, "The 'novelty' of the weather is making our ability to forecast and respond to weather-related phenomena very difficult."
These changes are having major consequences. The paper updates a database of climate-related changes and finds that there are significant benefits from avoiding 2°C of warming and aiming to restrict the increase to 1.5°C above pre-industrial global temperatures.
Prof Rachel Warren from the Tyndall Centre at the University of East Anglia in the UK assessed projections of risk for forests, biodiversity, food, crops and other critical systems, and found very significant benefits from limiting global warming to 1.5°C rather than 2°C.
"The scientific community has quantified these risks in order to inform policy makers about the benefits of avoiding them," Prof Warren stated.
Since the Paris Agreement came into force, there has been a race to quantify the benefits of limiting warming to 1.5°C so that policy makers have the best possible information for developing the policy required for doing it.
Prof Warren continued. "If such policy is not implemented, we will continue on the current upward trajectory of burning fossil fuels and continuing deforestation, which will expand the already large-scale degradation of ecosystems. To be honest, the overall picture is very grim unless we act."
A recent report from the United Nations projected that as many as a million species may be at risk of extinction over the coming decades and centuries. Climate change is not the only factor but is one of the most important ones.
The urgency of responding to climate change is front of mind for Prof Michael Taylor, co-author and Dean of Science at the University of the West Indies. "This is not an academic issue, it is a matter of life and death for people everywhere. That said, people from small island States and low-lying countries are in the immediate cross-hairs of climate change."
"I am very concerned about the future for these people," said Professor Taylor.
This urgency to act is further emphasized by the vulnerability of developing countries to climate change impacts as pointed out by Francois Engelbrecht, co-author and Professor of Climatology at the Global Change Institute of the University of the Witwatersrand in South Africa.
"The developing African countries are amongst those to be affected most in terms of impacts on economic growth in the absence of strong climate change mitigation," Prof Engelbrecht explains.
Prof Hoegh-Guldberg reiterated the importance of the coming year (2020) in terms of climate action and the opportunity to strengthen emission reduction pledges in line with the Paris Agreement of 2015.
"Current emission reduction commitments are inadequate and risk throwing many nations into chaos and harm, with a particular vulnerability of poor peoples. To avoid this, we must accelerate action and tighten emission reduction targets so that they fall in line with the Paris Agreement. As we show, this is much less costly than suffering the impacts of 2oC or more of climate change."
"Tackling climate change is a tall order. However, there is no alternative from the perspective of human well-being -- and too much at stake not to act urgently on this issue."

Nonverbal signals can create bias against larger groups

If children are exposed to bias against one person, will they develop a bias against that person's entire group? The answer is yes, according to new research from University of Georgia social psychologist Allison Skinner. The study's results are the first to demonstrate that nonverbal signals can produce new biases that generalize to entire groups and classes of people.
"Our findings indicate that the process of acquiring bias based on nonverbal signals -- and extending that bias to a larger group -- is already in operation in early childhood, prior to the start of first grade," said Skinner, first author and assistant professor of psychology in the Franklin College of Arts and Sciences. "Exposure to biased nonverbal signals may be an important process through which group biases are rapidly and unintentionally transmitted within our culture."
Her study, published in the Journal of Personality and Social Psychology, explores bias generalization in preschoolers aged 4 and 5.
With co-authors Kristina R. Olson and Andrew N. Meltzoff (both University of Washington), Skinner tested whether preschool children seeing one individual receive more positive nonverbal signals than another would lead them to develop bias in favor of that individual's group -- and whether such biases would be generalized to large classes of people, for example, those of the same nationality.
In the experiments, children watched a video in which an adult actor displayed positive nonverbal signals -- appearing warmer and friendlier -- toward an unknown adult from one fictitious place and negative nonverbal signals toward an unknown adult from another place. The preschoolers were then asked questions to assess their biases toward the adults in the videos and toward other people of their "nationality."
"Children's biases went beyond simply preferring people from one place relative to another," Skinner said. "They were more likely to imitate the words and actions demonstrated by the target of positive nonverbal signals, and they preferred to interact with members of that individual's group."
This study follows on the heels of her previously published work on the role of nonverbal signals in spreading attitudes and biases among adults. In a study published in Personality and Social Psychology Bulletin, Skinner found that adults formed conscious attitudes toward an individual based on witnessing positive or negative nonverbal signals displayed toward that person. They also formed unconscious attitudes, but they were likely to misattribute the cause, according to Skinner.
"People were more likely to attribute their attitude to how the individual behaved, rather than how the individual was treated by others," she said. "It didn't matter if the individual responded neutrally. If people treated him as if he was behaving like a jerk, then that was their inference."

Impostor syndrome is more common than you think; Study finds best way to cope with it

Impostor syndrome, a phenomenon that manifests when people feel like frauds even though they are actually capable and well-qualified, affects people both in the workplace and in the classroom. A new study reveals that perceptions of impostorism are quite common and uncovers one of the best -- and worst -- ways to cope with such feelings.
Findings of the study, co-authored by Brigham Young University professors Jeff Bednar, Bryan Stewart, and James Oldroyd, revealed that 20 percent of the college students in their sample suffered from very strong feelings of impostorism. The researchers conducted interviews with students in an elite academic program to understand the various coping mechanisms students used to escape these feelings, but one particular method stood out above the rest: seeking social support from those outside their academic program.
The findings of their interview study suggest that if students "reached in" to other students within their major, they felt worse more often than they felt better. However, if the student "reached out" to family, friends outside their major, or even professors, perceptions of impostorism were reduced.
"Those outside the social group seem to be able to help students see the big picture and recalibrate their reference groups," said Bednar, a BYU management professor and co-author on the study. "After reaching outside their social group for support, students are able to understand themselves more holistically rather than being so focused on what they felt they lacked in just one area."
Along with seeking social support, the study also uncovered negative ways students coped with impostorism. Some students tried to get their mind off schoolwork through escapes such as video games but ended up spending more time gaming than studying. Other students tried to hide how they really felt around their classmates, pretending they were confident and excited about their performance when deep down they questioned if they actually belonged.
In a second study, the researchers surveyed 213 students to confirm what was revealed in their interview study about seeking social support: reaching out to individuals outside the major proved to be more effective than reaching in to individuals within the major.
Surprisingly, the study also reveals that perceptions of impostorism have no significant relationship with performance. This means that individuals who suffer from impostor syndrome are still capable of doing their jobs well; they just don't believe in themselves. The researchers also found that social factors influence impostorism more than an individual's actual ability or competence.
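The "no significant relationship" result is the kind of claim a simple correlation test makes concrete. As a minimal sketch in Python (with invented numbers, not the study's data, and an assumed 1-5 impostorism scale), the analysis might look like this:

    # Hypothetical illustration: correlate impostorism scores with a
    # performance measure (e.g., GPA). All values are invented.
    from scipy.stats import pearsonr

    impostorism = [4.1, 2.3, 3.8, 1.9, 4.5, 2.7, 3.2, 1.5, 4.0, 2.9]
    performance = [3.4, 3.6, 3.1, 3.5, 3.3, 3.7, 3.2, 3.4, 3.6, 3.0]

    r, p = pearsonr(impostorism, performance)
    # A small |r| with a large p-value is what "no significant
    # relationship with performance" means statistically.
    print(f"r = {r:.2f}, p = {p:.3f}")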
"The root of impostorism is thinking that people don't see you as you really are," said Stewart, an accounting professor at BYU and co-author on the study. "We think people like us for something that isn't real and that they won't like us if they find out who we really are."
Beyond the classroom, the researchers believe the study's implications can and should be applied in the workplace as well. "It's important to create cultures where people talk about failure and mistakes," Bednar said. "When we create those cultures, someone who is feeling strong feelings of impostorism will be more likely to get the help they need within the organization."

Brain anatomy changes with maturation to adolescence

In a first-of-its-kind study, Children's Hospital Los Angeles researchers piece together a road map of typical brain development in children during a critical window of maturation. The study shows how a "wave of brain maturation" directly underlies important social and behavioral changes children develop during the transition from childhood to adolescence.
As children mature, many aspects of their lives shift in preparation for adulthood. Academic and social environments intensify during this time, requiring increasing mastery of thoughts, emotions, and behavioral control. Yet very little is known about what is going on neurologically during this important transition. A group of researchers at CHLA examined anatomical and behavioral changes during this finite window of neurological development in a group of 9- to 12-year-old children. A more detailed understanding of typical brain development could give scientists and clinicians a better framework for caring for children who may be developing atypically or facing developmental challenges.
"We know that children are growing substantially in their ability to self-regulate during this time," says Mary Baron Nelson, PhD, RN, the first author on this publication. "Among many other changes, their attention spans are expanding and they are learning social norms such as gauging appropriate responses or behaviors." Because these are cognitive processes, the research group hypothesized that measurable changes could be occurring in brain structure and function. This is precisely what they found.
The team of scientists, led by Bradley Peterson, MD, of the Institute for the Developing Mind at CHLA, examined anatomical, chemical, and neuropsychological measures to determine what changes could be occurring in a group of 234 healthy children, aged 9-12 years. "We used brain imaging, measured multiple chemicals and metabolites, and took cognitive and neuropsychological scores," says Dr. Baron Nelson.
Using imaging and measuring brain metabolites, the group observed what Dr. Baron Nelson refers to as a "wave of maturation" sweeping through the brain. White matter tracts -- the pathways in the brain that transmit information -- showed increasing maturation with age from the back to the front of the brain. This is expected, as the frontal lobes are not fully formed until an individual is in his or her late twenties. The frontal lobes mediate executive function -- the planning of complex decisions and actions. What is perhaps less expected is that so many of these changes begin so early: the study's findings show that this maturation is largely underway between ages 9 and 12. This brain maturation coincides with a critical and formative period, as children undergo rapid neurological change at the same time that they are facing difficult social and academic decisions.
As a child grows, he or she becomes better able to control impulses and process complex concepts. In support of this observation, the group found increasing scores on tasks that measured these skills. But how were these children able to gain more impulse control and make more complex decisions? Analyzing the data, the group determined that the anatomical and metabolic changes occurring during this window of development underlie this increase in abilities.
"We've learned that this is not a wait-and-see period of time," says Dr. Baron Nelson. "Dynamic changes are happening here and this gives us a real opportunity for intervention. We can help shape these kids as they grow."

Better way to teach physics to university students

Courses in introductory physics are required for nearly all university STEM degree programs not only because physics is considered foundational to those disciplines, but also because it provides students practical experience in applied mathematics. The latter is especially true for calculus-based physics courses, which typically provide students their first exposure to using calculus outside of their math classes.
Now, a team of physicists and educators at the University of Kansas has developed a curriculum for college-level students that shows promise in helping students in introductory physics classes further practice and develop their calculus skills, especially those who test lower in core math abilities. They term the approach "energy-first."
Their findings appear in the peer-reviewed journal Physical Review Physics Education Research.
"It's almost always the case that in introductory physics courses students are first taught mechanics in the context of forces. Later in the course, they are shown that they can also apply the concept of energy to solve most of the problems they already learned to solve with forces," said co-author Christopher Fischer, engineering physics director and associate chair of physics & astronomy at KU. "We decided instead we want to teach energy first -- because, number one, we think it's a more generally applicable way of thinking about physics. Number two, it also allows us to achieve our secondary goal of providing the students with more opportunities to use and practice their calculus skills."
From fall 2015 to spring of this year, Fischer and his KU colleagues monitored students and performed testing in two introductory physics classes at KU taken mostly by students pursuing degrees in the physical sciences and engineering. For one, they devised an "energy-first" curriculum. For the other, they kept to a more traditional approach that taught students about forces before teaching them about energy.
The presence of two different physics courses using different curricula naturally provided an opportunity for the researchers to compare the outcomes of students in the two courses.
"We sought to compare, as best we could, apples to apples," Fischer said. "In other words, we compared students who had the same ACT math scores but who took different physics courses to determine what effect our new physics curriculum had on student outcomes."
The researchers worked with the KU Center for Teaching Excellence and the KU Office of Institutional Research & Planning to obtain the students' ACT math scores.
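In code, that apples-to-apples comparison amounts to grouping students by ACT math score and comparing mean downstream outcomes between the two curricula within each score band. A minimal sketch, with invented records and hypothetical field names (the study's actual data and statistical methods are more involved):

    # Each record: (ACT math score, curriculum, grade in a downstream
    # engineering course). Values are illustrative only.
    from collections import defaultdict
    from statistics import mean

    students = [
        (24, "energy_first", 3.1), (24, "traditional", 2.8),
        (28, "energy_first", 3.4), (28, "traditional", 3.3),
        (32, "energy_first", 3.7), (32, "traditional", 3.6),
    ]

    grades = defaultdict(list)
    for act, curriculum, gpa in students:
        grades[(act, curriculum)].append(gpa)

    # Compare curricula only among students with the same ACT math score.
    for act in sorted({a for a, _ in grades}):
        ef = grades.get((act, "energy_first"), [])
        tr = grades.get((act, "traditional"), [])
        if ef and tr:
            print(f"ACT {act}: energy-first {mean(ef):.2f} "
                  f"vs traditional {mean(tr):.2f}")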
Fischer and his colleagues found engineering students taking the new "energy-first" physics curriculum tended to earn higher grades in subsequent engineering classes (for instance, in a mechanical engineering class for which either of the two introductory physics classes was a prerequisite).
"What we saw was the engineering students who were taking our new physics curriculum did better in their engineering classes," he said.
Furthermore, the biggest benefits to student performance in downstream engineering classes were seen in students who had lower math ACT scores but took the "energy-first" physics class.
"The benefits were even larger the lower your initial math abilities were," Fischer said. "So, engineering students who had lower ACT math scores had larger benefits from taking this new curriculum, which got us thinking maybe tasking students with solving more problems using calculus in this physics class is helping them with their applied math skills in general, as well as their physics skills."
Fischer's KU colleagues on the project from the KU physics & astronomy department were lead author Sarah LeGresley, assistant teaching professor of physics & astronomy; Jennifer Delgado, associate teaching professor; Christopher Bruner, a doctoral student; and Michael Murray, professor of physics & astronomy.
The KU researchers examined how well students had picked up on the physics content by performing more assessments, again finding those in the "energy-first" cohort had the edge over those in the old-style introductory physics class.
"Separately, we did a side-by-side comparison of student performance on a standardized physics conceptual test that many different universities use," Fischer said. "And we saw that all the students in the new physics curriculum are doing better than the students from the traditional physics curriculum."
While these results certainly argue for the adoption of an "energy-first" approach, Fischer stressed that, because of the small sample size and the limited demographics of students at a single large Midwestern university, the "energy-first" curriculum would need to be tested more broadly before concluding it is a superior method for teaching introductory physics to college-age students.
"We didn't have tens of thousands of students in our study," Fischer said. "We looked at only a few thousand. Thus, it's important that other universities try this new curriculum to see if our results can be replicated. Indeed, we would happily welcome other institutions to collaborate with us to test if our results are robust -- that's essential."
Additionally, the KU researchers hope to develop and implement an assessment to use in physics classes to understand math transference better.
"Is this new way of teaching physics helping students improve their applied calculus skills?" Fischer said. "To our knowledge, no one has built an assessment targeting that specific question. So, we're trying to figure out how to design such an instrument."
Finally, Fischer said the team would like to build off the lessons learned from the implementation of the "energy-first" physics approach to modify the curriculum of other classes in the department.
"Is there a way we could design something similar, or at least take advantage of this sort of design methodology for our department's algebra-based physics classes?" he said. "This naturally also motivates us to reach out to high schools to find collaborators to help us develop new and improved ways of teaching physics in a way that would be more useful for high school students."

Novel C. diff structures are required for infection, offer new therapeutic targets

Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...