Sunday, 8 December 2019

High amounts of screen time begin as early as infancy

Child using tablet device
Children's average daily time spent watching television or using a computer or mobile device increased from 53 minutes at age 12 months to more than 150 minutes at 3 years, according to an analysis by researchers at the National Institutes of Health, the University at Albany and the New York University Langone Medical Center. By age 8, children were more likely to log the highest amount of screen time if they had been in home-based childcare or were born to first-time mothers. The study appears in JAMA Pediatrics.
"Our results indicate that screen habits begin early," said Edwina Yeung, Ph.D., the study's senior author and an investigator in the Epidemiology Branch of NIH's Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). "This finding suggests that interventions to reduce screen time could have a better chance of success if introduced early."
NICHD researchers and their colleagues analyzed data from the Upstate KIDS Study, originally undertaken to follow the development of children conceived after infertility treatments and born in New York State from 2008 to 2010. Mothers of nearly 4,000 children who took part in the study responded to questions on their kids' media habits when they were 12, 18, 24, 30, and 36 months of age. They also responded to similar questions when the children were 7 and 8 years old. The study compiled additional demographic information on the mothers and children from birth records and other surveys.
The American Academy of Pediatrics recommends avoiding digital media exposure for children under 18 months of age, introducing children 18 to 24 months of age to screen media slowly, and limiting screen time to an hour a day for children from 2 to 5 years of age. In the current study, researchers found that 87% of the children had screen time exceeding these recommendations. However, while screen time increased throughout toddlerhood, by ages 7 and 8 it had fallen to under 1.5 hours per day. The researchers believe this decrease relates to time consumed by school-related activities.
The study authors classified the children into two groups based on how much their average daily screen time increased from age 1 to age 3. The first group, 73% of the total, had the lowest increase, from an average of nearly 51 minutes a day to nearly an hour and 47 minutes a day. The second group, 27% of the total, had the highest increase, from nearly 37 minutes of screen time a day to about 4 hours a day. Higher levels of parental education were associated with lower odds of inclusion in the second group. In addition, girls were slightly less likely to be in the second group, compared to boys, while children of first-time mothers were more likely to be in the high-increase group.
The researchers also classified the children into percentiles based on their total daily screen time. Children were more likely to be in the highest decile (the top 10%) of screen time if their parents had only a high school diploma or equivalent (more than twice as likely) or if they were children of first-time mothers (almost twice as likely). Similarly, compared to single-born children, twins were more likely to belong to the highest screen time group. Compared to children in center-based care, children in home-based care, whether provided by a parent, babysitter or relative, were more than twice as likely to have high screen time.
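As a rough illustration of how "almost twice as likely" maps onto an odds ratio, the short Python sketch below builds a 2x2 table for a hypothetical sample and compares the odds of high screen time for children of first-time mothers against all other children. The counts are invented purely for illustration; they are not the study's data.

    # Hedged sketch: odds ratio for "high screen time" among children of
    # first-time mothers vs. other children. Counts are invented, not real data.
    import numpy as np

    #                              high screen time, not high
    first_time_mothers = np.array([180, 820])
    other_mothers      = np.array([150, 1350])

    odds_first = first_time_mothers[0] / first_time_mothers[1]   # ~0.22
    odds_other = other_mothers[0] / other_mothers[1]              # ~0.11
    odds_ratio = odds_first / odds_other                          # ~1.98

    print(f"odds ratio = {odds_ratio:.2f}")   # "almost twice as likely"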

What keeps cells in shape? New research points to two types of motion

Illustration of human cell 
The health of cells is maintained, in part, by two types of movement of their nucleoli, a team of scientists has found. This dual motion within surrounding fluid, it reports, adds to our understanding of what contributes to healthy cellular function and points to how its disruption could affect human health.
"Nucleolar malfunction can lead to disease, including cancer," explains Alexandra Zidovska, an assistant professor in New York University's Department of Physics and the senior author of the study, which appears in the journal eLife. "Thus, understanding the processes responsible for the maintenance of nucleolar shape and motion might help in the creation of new diagnostics and therapies for certain human afflictions."
Recent discoveries have shown that some cellular compartments don't have membranes, which were previously seen as necessary to hold a cell together. Researchers have since sought to understand the forces that maintain the integrity of these building blocks of life absent these membranes.
What has been observed is the nature of these compartments' behavior: they act as liquid droplets made of a material that does not mix with the fluid around them -- similar to oil and water. This process, known as liquid-liquid phase separation, has now been established as one of the key cellular organizing principles.
In their study, the researchers focused on the best-known example of such a cellular liquid droplet: the nucleolus, which resides inside the cell nucleus and is vital to the cell's protein synthesis.
"While the liquid-like nature of the nucleolus has been studied before, its relationship with the surrounding liquid is not known," explains Zidovska, who co-authored the study with Christina Caragine, an NYU doctoral student, and Shannon Haley, an undergraduate in NYU's College of Arts and Science at the time of the work and now a doctoral student at the University of California at Berkeley. "This relationship is particularly intriguing considering the surrounding liquid -- the nucleoplasm -- contains the entire human genome."
Yet how the two fluids interact with each other has remained unclear.
To better understand this dynamic, the scientists examined the motion and fusion of human nucleoli in live human cells, while monitoring their shape, size, and surface smoothness. The method for studying the fusion of the nucleolar droplets was created by the team in 2018 and reported in the journal Physical Review Letters.
Their latest study showed two types of nucleolar pair movements or "dances": an unexpected correlated motion prior to their fusion and separate independent motion. Moreover, they found that the smoothness of the nucleolar interface is susceptible to both changes in gene expression and the packing state of the genome, which surrounds the nucleoli.
"Nucleolus, the biggest droplet found inside the cell nucleus, serves a very important role in human aging, stress response, and general protein synthesis while existing in this special state," observes Zidovska. "Because nucleoli are surrounded by fluid that contains our genome, their movement stirs genes around them. Consequently, because the genome in the surrounding fluid and nucleoli exist in a sensitive balance, a change in one can influence the other. Disrupting this state can potentially lead to disease."

How does language emerge?

The word 'welcome' in different languages 
How did the almost 6000 languages of the world come into being? Researchers from the Leipzig Research Centre for Early Childhood Development at Leipzig University and the Max Planck Institute for Evolutionary Anthropology have tried to simulate the process of developing a new communication system in an experiment -- with surprising results: even preschool children can spontaneously develop communication systems that exhibit core properties of natural language.
How the languages of the world emerged is largely a mystery. Considering that it might have taken millennia, it is intriguing to see how deaf people can create novel sign languages spontaneously. Observations have shown that when deaf strangers are brought together in a community, they come up with their own sign language in a remarkably short amount of time. The most famous example of this is Nicaraguan Sign Language, which emerged in the 1980s. Interestingly, children played an important role in the development of these novel languages. However, how exactly this happened has not been documented, as Manuel Bohn describes: "We know relatively little about how social interaction becomes language. This is where our new study comes in."
In a series of studies, researchers at the Leipzig Research Centre for Early Childhood Development and the Max Planck Institute for Evolutionary Anthropology attempted to recreate exactly this process. The idea had been around for quite some time, says Gregor Kachel. But there was a problem: how could children be made to communicate with each other without reverting to speech? The solution emerged in Skype conversations between the two researchers from Germany and their colleague Michael Tomasello in the US. In the study, children were invited to stay in two different rooms and a Skype connection was established between them. After a brief familiarization with the set-up, the researchers surreptitiously turned off the sound and watched as the children found new ways of communicating that went beyond spoken language.
The children's task was to describe an image with different motifs in a coordination game. With concrete things -- like a hammer or a fork -- children quickly found a solution by imitating the corresponding action (e.g. eating) in a gesture. But the researchers repeatedly challenged the children with new, more abstract pictures. For example, they introduced a white sheet of paper as a picture. The depicted "nothing" is difficult to imitate. Kachel describes how two children nevertheless mastered this task: "The sender first tried all sorts of different gestures, but her partner let her know that she did not know what was meant. Suddenly our sender pulled her T-shirt to the side and pointed to a white dot on her coloured T-shirt. The two had a real breakthrough: of course! White! Like the white paper! Later, when the roles were switched, the recipient didn't have a white spot on her T-shirt, but she nevertheless took the same approach: she pulled her T-shirt to the side and pointed to it. Immediately her partner knew what to do." Within a very short time, the two had established a sign for an abstract concept. In the course of the study, the images to be depicted became more and more complex, which was also reflected in the gestures that the children produced. In order to communicate, for example, an interaction between two animals, children invented separate gestures for actors and actions and began to combine them -- thus creating a kind of small local grammar.
How does a language emerge? Based on the present study, the following steps appear plausible: first, people create reference to actions and objects via signs that resemble things. The prerequisite for this is a common ground of experience between interaction partners. Partners also coordinate by imitating each other such that they use the same signs for the same things. The signs thus gain interpersonal and eventually conventional meaning. Over time, the relationships between the signs and things become more abstract and the meaning of the individual signs more specific. Grammatical structures are gradually introduced when there is a need to communicate more complex facts. However, the most remarkable aspect of the current studies is that these processes can be observed under controlled circumstances and within 30 minutes.
The studies demonstrate that communication cannot be reduced to words alone. When there is no way to use conventional spoken language, people find other ways to get their message across. This phenomenon forms the basis for the development of new languages. The study by Manuel Bohn, Gregor Kachel and Michael Tomasello shows what the first steps in the development of a new language could look like. According to Bohn, however, numerous new questions arise at this point: "It would be very interesting to see how the newly invented communication systems change over time, for example when they are passed on to new 'generations' of users. There is evidence that language becomes more systematic when passed on."

Sunday, 29 September 2019

Machu Picchu: Ancient Incan sanctuary intentionally built on faults

Machu Picchu, Peru
The ancient Incan sanctuary of Machu Picchu is considered one of humanity's greatest architectural achievements. Built in a remote Andean setting atop a narrow ridge high above a precipitous river canyon, the site is renowned for its perfect integration with the spectacular landscape. But the sanctuary's location has long puzzled scientists: Why did the Incas build their masterpiece in such an inaccessible place? Research suggests the answer may be related to the geological faults that lie beneath the site.
On Monday, 23 Sept. 2019, at the GSA Annual Meeting in Phoenix, Rualdo Menegat, a geologist at Brazil's Federal University of Rio Grande do Sul, will present the results of a detailed geoarchaeological analysis that suggests the Incas intentionally built Machu Picchu -- as well as some of their cities -- in locations where tectonic faults meet. "Machu Picchu's location is not a coincidence," says Menegat. "It would be impossible to build such a site in the high mountains if the substrate was not fractured."
Using a combination of satellite imagery and field measurements, Menegat mapped a dense web of intersecting fractures and faults beneath the UNESCO World Heritage Site. His analysis indicates these features vary widely in scale, from tiny fractures visible in individual stones to major, 175-kilometer-long lineaments that control the orientation of some of the region's river valleys.
Menegat found that these faults and fractures occur in several sets, some of which correspond to the major fault zones responsible for uplifting the Central Andes Mountains during the past eight million years. Because some of these faults are oriented northeast-southwest and others trend northwest-southeast, they collectively create an "X" shape where they intersect beneath Machu Picchu.
Menegat's mapping suggests that the sanctuary's urban sectors and the surrounding agricultural fields, as well as individual buildings and stairs, are all oriented along the trends of these major faults. "The layout clearly reflects the fracture matrix underlying the site," says Menegat. Other ancient Incan cities, including Ollantaytambo, Pisac, and Cusco, are also located at the intersection of faults, says Menegat. "Each is precisely the expression of the main directions of the site's geological faults."
Menegat's results indicate the underlying fault-and-fracture network is as integral to Machu Picchu's construction as its legendary stonework. This mortar-free masonry features stones so perfectly fitted together that it's impossible to slide a credit card between them. As master stoneworkers, the Incas took advantage of the abundant building materials in the fault zone, says Menegat. "The intense fracturing there predisposed the rocks to breaking along these same planes of weakness, which greatly reduced the energy needed to carve them."
In addition to helping shape individual stones, the fault network at Machu Picchu likely offered the Incas other advantages, according to Menegat. Chief among these was a ready source of water. "The area's tectonic faults channeled meltwater and rainwater straight to the site," he says. Construction of the sanctuary in such a high perch also had the benefit of isolating the site from avalanches and landslides, all-too-common hazards in this alpine environment, Menegat explains.
The faults and fractures underlying Machu Picchu also helped drain the site during the intense rainstorms prevalent in the region. "About two-thirds of the effort to build the sanctuary involved constructing subsurface drainages," says Menegat. "The preexisting fractures aided this process and help account for its remarkable preservation," he says. "Machu Picchu clearly shows us that the Incan civilization was an empire of fractured rocks."

How the eyes might be windows to the risk of Alzheimer's disease

Closeup of senior woman's eye
Alzheimer's disease (AD) begins to alter and damage the brain years -- even decades -- before symptoms appear, making early identification of AD risk paramount to slowing its progression.
In a new study published online in the September 9, 2019 issue of Neurobiology of Aging, scientists at University of California San Diego School of Medicine say that, with further development, measuring how quickly a person's pupil dilates while they are taking cognitive tests may be a low-cost, minimally invasive method to aid in screening individuals at increased genetic risk for AD before cognitive decline begins.
In recent years, researchers investigating the pathology of AD have primarily directed their attention at two causative or contributory factors: the accumulation of protein plaques in the brain called amyloid-beta and tangles of a protein called tau. Both have been linked to damaging and killing neurons, resulting in progressive cognitive dysfunction.
The new study focuses on pupillary responses which are driven by the locus coeruleus (LC), a cluster of neurons in the brainstem involved in regulating arousal and also modulating cognitive function. Tau is the earliest occurring known biomarker for AD; it first appears in the LC; and it is more strongly associated with cognition than amyloid-beta. The study was led by first author William S. Kremen, PhD, and senior author Carol E. Franz, PhD, both professors of psychiatry and co-directors of the Center for Behavior Genetics of Aging at UC San Diego School of Medicine.
The LC drives pupillary response -- the changing diameter of the eyes' pupils -- during cognitive tasks. (Pupils get bigger the more difficult the brain task.) In previously published work, the researchers had reported that adults with mild cognitive impairment, often a precursor to AD, displayed greater pupil dilation and cognitive effort than cognitively normal individuals, even if both groups produced equivalent results. Critically, in the latest paper, the scientists link pupillary dilation responses with identified AD risk genes.
"Given the evidence linking pupillary responses, LC and tau and the association between pupillary response and AD polygenic risk scores (an aggregate accounting of factors to determine an individual's inherited AD risk), these results are proof-of-concept that measuring pupillary response during cognitive tasks could be another screening tool to detect Alzheimer's before symptom appear," said Kremen.

Scientists detect the ringing of a newborn black hole for the first time

If Albert Einstein's theory of general relativity holds true, then a black hole, born from the cosmically quaking collisions of two massive black holes, should itself "ring" in the aftermath, producing gravitational waves much like a struck bell reverberates sound waves. Einstein predicted that the particular pitch and decay of these gravitational waves should be a direct signature of the newly formed black hole's mass and spin.
Now, physicists from MIT and elsewhere have "heard" the ringing of an infant black hole for the first time, and found that the pattern of this ringing does, in fact, predict the black hole's mass and spin -- more evidence that Einstein was right all along.
The findings, published today in Physical Review Letters, also favor the idea that black holes lack any sort of "hair" -- a metaphor referring to the idea that black holes, according to Einstein's theory, should exhibit just three observable properties: mass, spin, and electric charge. All other characteristics, which the physicist John Wheeler termed "hair," should be swallowed up by the black hole itself, and would therefore be unobservable.
The team's findings today support the idea that black holes are, in fact, hairless. The researchers were able to identify the pattern of a black hole's ringing, and, using Einstein's equations, calculated the mass and spin that the black hole should have, given its ringing pattern. These calculations matched measurements of the black hole's mass and spin made previously by others.
If the team's calculations deviated significantly from the measurements, it would have suggested that the black hole's ringing encodes properties other than mass, spin, and electric charge -- tantalizing evidence of physics beyond what Einstein's theory can explain. But as it turns out, the black hole's ringing pattern is a direct signature of its mass and spin, giving support to the notion that black holes are bald-faced giants, lacking any extraneous, hair-like properties.
"We all expect general relativity to be correct, but this is the first time we have confirmed it in this way," says the study's lead author, Maximiliano Isi, a NASA Einstein Fellow in MIT's Kavli Institute for Astrophysics and Space Research. "This is the first experimental measurement that succeeds in directly testing the no-hair theorem. It doesn't mean black holes couldn't have hair. It means the picture of black holes with no hair lives for one more day."
A chirp, decoded
On Sept. 14, 2015, scientists made the first-ever detection of gravitational waves -- infinitesimal ripples in space-time, emanating from distant, violent cosmic phenomena. The detection, named GW150914, was made by LIGO, the Laser Interferometer Gravitational-wave Observatory. Once scientists cleared away the noise and zoomed in on the signal, they observed a waveform that quickly crescendoed before fading away. When they translated the signal into sound, they heard something resembling a "chirp."
Scientists determined that the gravitational waves were set off by the rapid inspiraling of two massive black holes. The peak of the signal -- the loudest part of the chirp -- corresponded to the very moment when the black holes collided, merging into a single, new black hole. While this infant black hole likely gave off gravitational waves of its own, its signature ringing, physicists assumed, would be too faint to decipher amid the clamor of the initial collision.
Isi and his colleagues, however, found a way to extract the black hole's reverberation from the moments immediately after the signal's peak. In previous work led by Isi's co-author, Matthew Giesler, the team showed through simulations that such a signal, and particularly the portion right after the peak, contains "overtones" -- a family of loud, short-lived tones. When they reanalyzed the signal, taking overtones into account, the researchers discovered that they could successfully isolate a ringing pattern that was specific to a newly formed black hole.
In the team's new paper, the researchers applied this technique to actual data from the GW150914 detection, concentrating on the last few milliseconds of the signal, immediately following the chirp's peak. Taking into account the signal's overtones, they were able to discern a ringing coming from the new, infant black hole. Specifically, they identified two distinct tones, each with a pitch and decay rate that they were able to measure.
"We detect an overall gravitational wave signal that's made up of multiple frequencies, which fade away at different rates, like the different pitches that make up a sound," Isi says. "Each frequency or tone corresponds to a vibrational frequency of the new black hole."
Listening beyond Einstein
Einstein's theory of general relativity predicts that the pitch and decay of a black hole's gravitational waves should be a direct product of its mass and spin. That is, a black hole of a given mass and spin can only produce tones of a certain pitch and decay. As a test of Einstein's theory, the team used the equations of general relativity to calculate the newly formed black hole's mass and spin, given the pitch and decay of the two tones they detected.
They found their calculations matched with measurements of the black hole's mass and spin previously made by others. Isi says the results demonstrate that researchers can, in fact, use the very loudest, most detectable parts of a gravitational wave signal to discern a new black hole's ringing, where before, scientists assumed that this ringing could only be detected within the much fainter end of the gravitational wave signal, and only with much more sensitive instruments than what currently exist.
"This is exciting for the community because it shows these kinds of studies are possible now, not in 20 years," Isi says.
As LIGO improves its resolution, and more sensitive instruments come online in the future, researchers will be able to use the group's methods to "hear" the ringing of other newly born black holes. And if they happen to pick up tones that don't quite match up with Einstein's predictions, that could be an even more exciting prospect.
"In the future, we'll have better detectors on Earth and in space, and will be able to see not just two, but tens of modes, and pin down their properties precisely," Isi says. "If these are not black holes as Einstein predicts, if they are more exotic objects like wormholes or boson stars, they may not ring in the same way, and we'll have a chance of seeing them."
This research was supported, in part, by NASA, the Sherman Fairchild Foundation, the Simons Foundation, and the National Science Foundation.

Engineers develop 'blackest black' material to date

With apologies to "Spinal Tap," it appears that black can, indeed, get more black.
MIT engineers report today that they have cooked up a material that is 10 times blacker than anything that has previously been reported. The material is made from vertically aligned carbon nanotubes, or CNTs -- microscopic filaments of carbon, like a fuzzy forest of tiny trees, that the team grew on a surface of chlorine-etched aluminum foil. The foil captures more than 99.96 percent of any incoming light, making it the blackest material on record.
The researchers have published their findings today in the journal ACS Applied Materials & Interfaces. They are also showcasing the cloak-like material as part of a new exhibit today at the New York Stock Exchange, titled "The Redemption of Vanity."
The artwork, a collaboration between Brian Wardle, professor of aeronautics and astronautics at MIT, and his group, and MIT artist-in-residence Diemut Strebe, features a 16.78-carat natural yellow diamond, estimated to be worth $2 million, which the team coated with the new, ultrablack CNT material. The effect is arresting: The gem, normally brilliantly faceted, appears as a flat, black void.
Wardle says the CNT material, aside from making an artistic statement, may also be of practical use, for instance in optical blinders that reduce unwanted glare, to help space telescopes spot orbiting exoplanets.
"There are optical and space science applications for very black materials, and of course, artists have been interested in black, going back well before the Renaissance," Wardle says. "Our material is 10 times blacker than anything that's ever been reported, but I think the blackest black is a constantly moving target. Someone will find a blacker material, and eventually we'll understand all the underlying mechanisms, and will be able to properly engineer the ultimate black."
Wardle's co-author on the paper is former MIT postdoc Kehang Cui, now a professor at Shanghai Jiao Tong University.
Into the void
Wardle and Cui didn't intend to engineer an ultrablack material. Instead, they were experimenting with ways to grow carbon nanotubes on electrically conducting materials such as aluminum, to boost their electrical and thermal properties.
But in attempting to grow CNTs on aluminum, Cui ran up against a barrier, literally: an ever-present layer of oxide that coats aluminum when it is exposed to air. This oxide layer acts as an insulator, blocking rather than conducting electricity and heat. As he cast about for ways to remove aluminum's oxide layer, Cui found a solution in salt, or sodium chloride.
At the time, Wardle's group was using salt and other pantry products, such as baking soda and detergent, to grow carbon nanotubes. In their tests with salt, Cui noticed that chloride ions were eating away at aluminum's surface and dissolving its oxide layer.
"This etching process is common for many metals," Cui says. "For instance, ships suffer from corrosion of chlorine-based ocean water. Now we're using this process to our advantage."
Cui found that if he soaked aluminum foil in saltwater, he could remove the oxide layer. He then transferred the foil to an oxygen-free environment to prevent reoxidation, and finally, placed the etched aluminum in an oven, where the group carried out techniques to grow carbon nanotubes via a process called chemical vapor deposition.
By removing the oxide layer, the researchers were able to grow carbon nanotubes on aluminum, at much lower temperatures than they otherwise would, by about 100 degrees Celsius. They also saw that the combination of CNTs on aluminum significantly enhanced the material's thermal and electrical properties -- a finding that they expected.
What surprised them was the material's color.
"I remember noticing how black it was before growing carbon nanotubes on it, and then after growth, it looked even darker," Cui recalls. "So I thought I should measure the optical reflectance of the sample.
"Our group does not usually focus on optical properties of materials, but this work was going on at the same time as our art-science collaborations with Diemut, so art influenced science in this case," says Wardle.
Wardle and Cui, who have applied for a patent on the technology, are making the new CNT process freely available to any artist to use for a noncommercial art project.
"Built to take abuse"
Cui measured the amount of light reflected by the material, not just from directly overhead, but also from every other possible angle. The results showed that the material absorbed greater than 99.995 percent of incoming light, from every angle. In essence, if the material contained bumps or ridges, or features of any kind, no matter what angle it was viewed from, these features would be invisible, obscured in a void of black.
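To put that figure in perspective, a quick back-of-the-envelope conversion (not from the paper itself):

    reflected fraction:  R <= 1 - 0.99995 = 5 x 10^-5
    optical density:     OD = -log10(R) ~= 4.3, i.e. roughly 43 dB of attenuation

In other words, at most about one part in 20,000 of the incoming light makes it back out.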
The researchers aren't entirely sure of the mechanism contributing to the material's opacity, but they suspect that it may have something to do with the combination of etched aluminum, which is somewhat blackened, with the carbon nanotubes. Scientists believe that forests of carbon nanotubes can trap and convert most incoming light to heat, reflecting very little of it back out as light, thereby giving CNTs a particularly black shade.

Are humans preventing flies from eavesdropping?

Today's world is filled with background noise, whether it be from a roaring river or a well-trafficked highway. Elevated noise levels from both human-made and natural sources may interfere with animals' listening ability and alter how they interact with other animals. A group of researchers at California Polytechnic State University investigated how background sounds affect a parasitoid fly's eavesdropping capabilities.
Ormia flies listen for cricket calls to find hosts for their young. When they locate a calling cricket, the flies deposit their eggs on or near it. Larvae hatch and burrow inside the cricket, eventually bursting out and killing the host. The researchers therefore hypothesized that noise could interfere with the flies' eavesdropping, making it harder for them to find their hosts.
The research, published in Royal Society Open Science, used sticky fly traps placed near speakers broadcasting cricket calls across a gradient of noise levels. The results show that fewer parasitoid flies were caught near speakers in noisier locations. Because the parasitoids kill their hosts, the results suggest that crickets may benefit from calling in noisy areas.
The study also found that both traffic noise and natural ocean noise inhibit fly orientation to sound, suggesting crickets could use sound as a parasite shield across different soundscapes. These results suggest that soundscapes may influence the evolution of tightly co-evolved host-parasitoid relationships.
Some questions still remain for the authors -- if the parasitoid flies are less abundant in noise, might female flies have the same problem localizing male crickets' calls? Author Jennifer N. Phillips stated, "Future work could investigate whether female crickets have trouble hearing and moving toward calling males in noise, which would be a fitness cost to males trying to find a mate. This could balance out the reduced risk from fly parasitism. However, if crickets are able to adjust their call or ability to find each other and the parasitoid fly does not, there may still be some benefit of noise to crickets."

Atlantic Ocean may get a jump-start from the other side of the world

Ocean waves crashing on beach
A key question for climate scientists in recent years has been whether the Atlantic Ocean's main circulation system is slowing down, a development that could have dramatic consequences for Europe and other parts of the Atlantic rim. But a new study suggests help may be on the way from an unexpected source -- the Indian Ocean.
Think of it as ocean-to-ocean altruism in the age of climate change.
The new study, from Shineng Hu of the Scripps Institution of Oceanography at the University of California-San Diego and Alexey Fedorov of Yale University, appears Sept. 16 in the journal Nature Climate Change. It is the latest in a growing body of research that explores how global warming may alter global climate components such as the Atlantic meridional overturning circulation (AMOC).
AMOC is one of the planet's largest water circulation systems. It operates like a liquid escalator, delivering warm water to the North Atlantic via an upper limb and sending colder water south via a deeper limb.
Although AMOC has been stable for thousands of years, data from the past 15 years, as well as computer model projections, have given some scientists cause for concern. AMOC has shown signs of slowing during that period, but whether it is a result of global warming or only a short-term anomaly related to natural ocean variability is not known.
"There is no consensus yet," Fedorov said, "but I think the issue of AMOC stability should not be ignored. The mere possibility that the AMOC could collapse should be a strong reason for concern in an era when human activity is forcing significant changes to the Earth's systems.
"We know that the last time AMOC weakened substantially was 15,000 to 17,000 years ago, and it had global impacts," Fedorov added. "We would be talking about harsh winters in Europe, with more storms or a drier Sahel in Africa due to the downward shift of the tropical rain belt, for example."
Much of Fedorov and Hu's work focuses on specific climate mechanisms and features that may be shifting due to global warming. Using a combination of observational data and sophisticated computer modeling, they plot out what effects such shifts might have over time. For example, Fedorov has looked previously at the role melting Arctic sea ice might have on AMOC.
For the new study, they looked at warming in the Indian Ocean.
"The Indian Ocean is one of the fingerprints of global warming," said Hu, who is first author of the new work. "Warming of the Indian Ocean is considered one of the most robust aspects of global warming."
The researchers said their modeling indicates a series of cascading effects that stretch from the Indian Ocean all the way over to the Atlantic: As the Indian Ocean warms faster and faster, it generates additional precipitation. This, in turn, draws more air from other parts of the world, including the Atlantic, to the Indian Ocean.
With so much precipitation in the Indian Ocean, there will be less precipitation in the Atlantic Ocean, the researchers said. Less precipitation will lead to higher salinity in the waters of the tropical portion of the Atlantic -- because there won't be as much rainwater to dilute it. This saltier water in the Atlantic, as it comes north via AMOC, will get cold much quicker than usual and sink faster.
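The "sink faster" step comes down to density: at a given temperature, saltier water is denser. A minimal sketch, using a linearized equation of state with rough textbook-scale coefficients (the study itself relies on full climate models), illustrates the effect:

    # Linearized equation of state for seawater (illustrative coefficients):
    #   rho ~= rho0 * (1 - alpha*(T - T0) + beta*(S - S0))
    # alpha: thermal expansion (~2e-4 per deg C), beta: haline contraction (~7.6e-4 per psu)
    RHO0, T0, S0 = 1027.0, 10.0, 35.0      # kg/m^3, deg C, psu (reference values)
    ALPHA, BETA = 2.0e-4, 7.6e-4

    def density(T, S):
        return RHO0 * (1 - ALPHA * (T - T0) + BETA * (S - S0))

    # Same temperature, slightly saltier tropical Atlantic water:
    print(density(T=20.0, S=35.0))   # ~1024.9 kg/m^3
    print(density(T=20.0, S=35.5))   # ~1025.3 kg/m^3 -- denser, so it sinks more readily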
"This would act as a jump-start for AMOC, intensifying the circulation," Fedorov said. "On the other hand, we don't know how long this enhanced Indian Ocean warming will continue. If other tropical oceans' warming, especially the Pacific, catches up with the Indian Ocean, the advantage for AMOC will stop."
The researchers said this latest finding illustrates the intricate, interconnected nature of global climate. As scientists try to understand the unfolding effects of climate change, they must attempt to identify all of the climate variables and mechanisms that are likely to play a role, they added.
"There are undoubtedly many other connections that we don't know about yet," Fedorov said. "Which mechanisms are most dominant? We're interested in that interplay."

Carp aquaculture in Neolithic China dating back 8,000 years

Carp
In a recent study, an international team of researchers analyzed fish bones excavated from the Early Neolithic Jiahu site in Henan Province, China. By comparing the body-length distributions and species-composition ratios of the bones with findings from East Asian sites with present aquaculture, the researchers provide evidence of managed carp aquaculture at Jiahu dating back to 6200-5700 BC.
Despite the growing importance of farmed fish for economies and diets around the world, the origins of aquaculture remain unknown. The Shijing, the oldest surviving collection of ancient Chinese poetry, mentions carp being reared in a pond circa 1140 BC, and historical records describe carp being raised in artificial ponds and paddy fields in East Asia by the first millennium BC. But considering rice paddy fields in China date all the way back to the fifth millennium BC, researchers from Lake Biwa Museum in Kusatsu, Japan, the Max Planck Institute for the Science of Human History in Jena, Germany, the Sainsbury Institute for the Study of Japanese Arts and Cultures in Norwich, U.K., and an international team of colleagues set out to discover whether carp aquaculture in China was practiced earlier than previously thought.
Carp farming goes way back in Early Neolithic Jiahu
Jiahu, located in Henan, China, is known for the early domestication of rice and pigs, as well as the early development of fermented beverages, bone flutes, and possibly writing. This history of early development, combined with archaeological findings suggesting the presence of large expanses of water, made Jiahu an ideal location for the present study.
Researchers measured 588 pharyngeal carp teeth extracted from fish remains in Jiahu corresponding with three separate Neolithic periods, and compared the body-length distributions with findings from other sites and a modern sample of carp raised in Matsukawa Village, Japan. While the remains from the first two periods revealed unimodal patterns of body-length distribution peaking at or near carp maturity, the remains of Period III (6200-5700 BC) displayed bimodal distribution, with one peak at 350-400 mm corresponding with sexual maturity, and another at 150-200 mm.
This bimodal distribution identified by researchers was similar to that documented at the Iron Age Asahi site in Japan (circa 400 BC -- AD 100), and is indicative of a managed system of carp aquaculture that until now was unidentified in Neolithic China. "In such fisheries," the study notes, "a large number of cyprinids were caught during the spawning season and processed as preserved food. At the same time, some carp were kept alive and released into confined, human regulated waters where they spawned naturally and their offspring grew by feeding on available resources. In autumn, water was drained from the ponds and the fish harvested, with body-length distributions showing two peaks due to the presence of both immature and mature individuals."
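One way to make the unimodal-versus-bimodal distinction concrete is to fit one- and two-component mixture models to the body-length measurements and compare them. The sketch below does this for simulated lengths; the numbers stand in for, and are not, the Jiahu measurements (which were reconstructed from pharyngeal tooth sizes).

    # Hedged sketch: detecting bimodality in carp body lengths (mm) with a
    # Gaussian mixture model. The lengths below are simulated, not the study's data.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    juveniles = rng.normal(175, 20, 300)    # peak near 150-200 mm
    adults    = rng.normal(375, 30, 300)    # peak near 350-400 mm (around sexual maturity)
    lengths   = np.concatenate([juveniles, adults]).reshape(-1, 1)

    gm1 = GaussianMixture(n_components=1, random_state=0).fit(lengths)
    gm2 = GaussianMixture(n_components=2, random_state=0).fit(lengths)

    # A clearly lower BIC for the two-component model supports bimodality --
    # consistent with a managed stock holding both immature and mature fish.
    print("1-component BIC:", gm1.bic(lengths))
    print("2-component BIC:", gm2.bic(lengths))
    print("component means (mm):", gm2.means_.ravel())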
Species-composition ratios support findings, indicate cultural preferences
The size of the fish wasn't the only piece of evidence researchers found supporting carp management at Jiahu. In East Asian lakes and rivers, crucian carp are typically more abundant than common carp, but common carp comprised roughly 75% of cyprinid remains found at Jiahu. This high proportion of less-prevalent fish indicates a cultural preference for common carp and the presence of aquaculture sophisticated enough to provide it.
Based on the analysis of carp remains from Jiahu and data from previous studies, researchers hypothesize three stages of aquaculture development in prehistoric East Asia. In Stage 1, humans fished the marshy areas where carp gather during spawning season. In Stage 2, these marshy ecotones were managed by digging channels and controlling water levels and circulation so that the carp could spawn and the juveniles could later be harvested. Stage 3 involved constant human management, including using spawning beds to control reproduction and fish ponds or paddy fields to manage juveniles.
Although rice paddy fields have not yet been identified at Jiahu, carp aquaculture and wet rice agriculture appear to have developed in tandem, and their coevolution is an important topic for future research.

Why is Earth so biologically diverse? Mountains hold the answer

Mount Chimborazo volcano, Ecuador
What determines global patterns of biodiversity has been a puzzle for scientists since the days of von Humboldt, Darwin, and Wallace. Yet, despite two centuries of research, this question remains unanswered. The global pattern of mountain biodiversity, and the extraordinarily high richness in tropical mountains in particular, is documented in two companion Science review papers this week. The papers focus on the fact that the high level of biodiversity found on mountains is far beyond what would be expected from prevailing hypotheses.
"The challenge is that, although it is evident that much of the global variation in biodiversity is so clearly driven by the extraordinary richness of tropical mountain regions, it is this very richness that current biodiversity models, based on contemporary climate, cannot explain: mountains are simply too rich in species, and we are falling short of explaining global hotspots of biodiversity," says Professor Carsten Rahbek, lead author of both review papers published in Science.
To confront the question of why mountains are so biologically diverse, scientists at the Center for Macroecology, Evolution and Climate (CMEC) at the GLOBE Institute of the University of Copenhagen work to synthesize understanding and data from the disparate fields of macroecology, evolutionary biology, earth sciences, and geology. The CMEC scientists are joined by individual collaborators from Oxford University, Kew Gardens, and University of Connecticut.
Part of the answer, these studies find, lies in understanding that the climate of rugged tropical mountain regions is fundamentally different in complexity and diversity compared to adjacent lowland regions. Uniquely heterogeneous mountain climates likely play a key role in generating and maintaining high diversity.
"People often think of mountain climates as bleak and harsh," says study co-leader Michael K. Borregaard. "But the most species-rich mountain region in the world, the Northern Andes, captures, for example, roughly half of the world's climate types in a relatively small region -- much more than is captured in nearby Amazon, a region that is more than 12 times larger."
Stressing another unique feature of mountain climate, Michael explains, "Tropical mountains, based in fertile and wet equatorial lowlands and extending into climatic conditions superficially similar to those found in the Arctic, span a gradient of annual mean temperatures over just a few km as large as that found over the 10,000 km from the tropical lowlands at the Equator to the Arctic regions at the poles. It's pretty amazing if you think about it."
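A rough back-of-the-envelope check makes the comparison concrete (the lapse rate and temperature figures below are approximate textbook values, not from the papers):

    cooling with elevation:  ~6.5 deg C per km  x  ~5 km of relief  ~=  33 deg C
    equatorial lowlands vs. high Arctic, annual mean:  ~27 deg C vs. roughly -10 to -20 deg C,
        a difference of ~35-45 deg C spread over ~10,000 km

So a single tropical mountainside compresses, into a few tens of kilometers of walking distance, much of the thermal range that otherwise spans a hemisphere.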
Another part of the explanation of the high biodiversity of certain mountains is linked to the geological dynamics of mountain building. These geological processes, interacting with complex climate changes through time, provide ample opportunities for evolutionary processes to act.
"The global pattern of biodiversity shows that mountain biodiversity exhibits a visible signature of past evolutionary processes. Mountains, with their uniquely complex environments and geology, have allowed the continued persistence of ancient species deeply rooted in the tree of life, as well as being cradles where new species have arisen at a much higher rate than in lowland areas, even in areas as amazingly biodiverse as the Amazonian rainforest," says Professor Carsten Rahbek.
From ocean crust, volcanism and bedrock to mountain biodiversity
Another explanation of mountain richness, says the study, may lie in the interaction between geology and biology. The scientists report a novel and surprising finding: in most tropical mountains, the high diversity is tightly linked to bedrock geology -- especially mountain regions with obducted, ancient oceanic crust. To explain this relationship between geology and biodiversity, the scientists propose, as a working hypothesis, that mountains in the tropics with soil originating from oceanic bedrock provide exceptional environmental conditions that drive localized adaptive change in plants. Special adaptations that allow plants to tolerate these unusual soils, in turn, may drive speciation cascades (the speciation of one group leading to speciation in other groups), all the way to animals, and ultimately contribute to the shape of global patterns of biodiversity.
The legacy of von Humboldt -- his 250th anniversary
The two papers are part of Science's celebration of Alexander von Humboldt's 250th birth anniversary. In 1799, Alexander von Humboldt set sail on a 5-year, 8000-km voyage of scientific discovery through Latin America. His journey through the Andes Mountains, captured by his famous vegetation zonation figure featuring Mount Chimborazo, canonized the place of mountains in understanding Earth's biodiversity.
Acknowledging von Humboldt's contribution to our understanding of the living world, Professor Carsten Rahbek, one of the founding scientists of the newly established interdisciplinary GLOBE Institute at the University of Copenhagen, says:
"Our papers in Science are a testimony to the work of von Humboldt, which truly revolutionized our thinking about the processes that determine the distribution of life. Our work today stands on the shoulders of his work, done centuries ago, and follows his approach of integrating data and knowledge of different scientific disciplines into a more holistic understanding of the natural world. It is our small contribution of respect to the legacy of von Humboldt."

Natural selection alters genes that control roundworms' sense of smell

Charles Darwin was right.
In his 1859 book, "On the Origin of Species," the famed scientist hypothesized that artificial selection (or domestication) and natural selection work in the same ways.
Now an international team, led by Northwestern University, has produced some of the first evidence that Darwin's speculation was correct.
This time, the study's subjects are not exotic birds in the Galapagos, but instead a roundworm, which relies on its sense of smell to assess the availability of food and nearby competition. In the Northwestern-led work, researchers found that natural selection acts on the same genes that control wild roundworms' sense of smell as were previously found in domesticated worms in the lab.
"The evolution of traits if rarely connected to exact genes and processes," said Northwestern's Erik Andersen, who led the study. "We offer a clear example of how evolution works."
The scientists used a combination of laboratory experiments, computational genomic analysis and field work. Their research also shows that natural selection acts on signal-sensing receptors rather than the downstream parts of the genetic process.
The study was published this week (Sept. 23) in the journal Nature Ecology & Evolution. Andersen is an associate professor of molecular biosciences in Northwestern's Weinberg College of Arts and Sciences.
A keystone model organism, C. elegans is a one-millimeter-long roundworm that lives in decaying organic matter -- particularly rotten fruits -- and feeds on bacteria. These roundworms are typically found in gardens and compost piles.
For C. elegans, having a keen sense of smell can be the difference between life and death. If they smell enough food in their environment, then they will stay, grow and reproduce. If they sense a shortage of food and/or too much competition from other worms, then they will undertake a long and potentially fatal journey in search of a more favorable environment. This process, called "dauer," delays growth and reproduction.
In other words, dauer decreases reproductive success in the short term in order to ensure survival in the long run.
"At some point in their lives, these worms must make a gamble," Andersen said. "In the time it takes for a worm to come out of dauer and start growing again, the worm that stayed behind has already been multiplying. If the food runs out, then the dauer worm made the right decision and wins. If the food doesn't run out, then the dauer worm loses."
Andersen and his collaborators found that evolution plays a significant role in a worm's decision to stay or enter dauer. Some roundworms have one genetic receptor to process scents; other roundworms have two. The roundworms with two receptors have a heightened sense of smell, which allows them to better assess the availability of resources in their environment and make a better gamble.
"If worms can smell large numbers of worms around them, that gives them an advantage," Andersen said. "This was discovered in a previous study of artificial selection in worms. Now we also found that result in natural populations. We can see specific evidence in these two genes that artificial and natural selection act similarly."

Friday, 27 September 2019

A little kindness goes a long way for worker performance and health

Small gestures of kindness by employers can have big impacts on employees' health and work performance, according to an international team of researchers. The team specifically examined the effects of employers enhancing the lunches of bus drivers in China with fresh fruit and found that it reduced depression among the drivers and increased their confidence in their own work performance.
"An ultimate solution to improve worker performance and health could be big pay raises or reduced workloads, but when those solutions aren't feasible, we found that even small offerings can make a big difference," said Bu Zhong, associate professor of journalism at Penn State.
According to Zhong, bus drivers are vulnerable to specific health problems due in large part to their stressful working environment, which often includes irregular shift schedules, unpredictable traffic conditions and random meal times. In addition, the sedentary nature of driving and continuous whole-body vibration contributes to fatigue, musculoskeletal problems such as lower-back pain, cardiovascular diseases and gastrointestinal issues.
Zhong and his colleagues conducted an experiment with 86 Shenzhen bus drivers. During the experiment, on-duty bus drivers were given, in addition to their typical box lunch, which includes no fruit, a serving of fresh fruit -- either an apple or a banana -- for three weeks. The cost of the fruit was 73 cents per meal.
The team distributed surveys to the bus drivers at three time intervals -- one week before the experiment began, once in the middle of the three-week-long experiment and one week following the end of the experiment. The findings appear today in the International Journal of Occupational Safety and Ergonomics.
The researchers assessed depression with a personal health questionnaire that is recommended by the U.S. Centers for Disease Control and Prevention. The scale consisted of eight items, asking the participants to rate, for example, how often during the past two weeks they felt down, depressed or hopeless, and had trouble falling or staying asleep.
"Bus drivers reported significantly decreased depression levels one week after the experiments ended compared to one week before it began," said Zhong.
The team measured self-efficacy -- perceived confidence and ability to implement the necessary actions and tasks so as to achieve specific goals -- using the 10-item General Self-Efficacy Scale. Items on this scale included, "I can always manage to solve difficult problems if I try hard enough" and "I can usually handle whatever comes my way."
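For context, the sketch below shows how such scales are typically scored. The item responses are invented; the scoring conventions (PHQ-style depression items rated 0-3 and summed to 0-24, General Self-Efficacy items rated 1-4 and summed to 10-40) are standard for these instruments, though the study's exact procedures may differ.

    # Hedged sketch of questionnaire scoring; responses are invented.
    phq8_responses = [1, 0, 2, 1, 0, 1, 2, 0]        # 8 depression items, each 0-3
    gse_responses  = [3, 4, 3, 3, 2, 4, 3, 3, 4, 3]  # 10 self-efficacy items, each 1-4

    phq8_score = sum(phq8_responses)   # 7  -> below the commonly used cutoff of 10
    gse_score  = sum(gse_responses)    # 32 -> relatively high self-efficacy

    print(f"depression score: {phq8_score}/24, self-efficacy score: {gse_score}/40")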
"We found that self-efficacy was significantly higher in the middle of the experiment week than in the week after the experiment ended," said Zhong.
Zhong concluded that while eating an extra apple at lunchtime may seem trivial, its impact can be large.
"This research suggests that employees can be sensitive to any improvement at the workplace," he said. "Before an ultimate solution is possible, some small steps can make a difference -- one apple at a time."

Virtual reality training could improve employee safety

A new study suggests employee safety could be improved through use of Virtual Reality (VR) in Health and Safety training, such as fire evacuation drills.
The Human Factors Research Group at the University of Nottingham developed an immersive VR system to stimulate participants' perception of temperature and their senses of smell, sight and hearing, in order to explore how they behaved during two health and safety training scenarios: an emergency evacuation in the event of a fire and a fuel leak.
In one scenario, participants had to evacuate from a virtual fire in an office: they saw and heard the scene through a VR headset, but could also feel heat from three 2kW heaters and smell smoke from a scent diffuser, creating a multisensory virtual environment. This group was compared against another group who experienced the same scenario using only the audio-visual elements of VR.
Observing real life behaviours
Previous research on human behaviour during real-world fire incidents has shown that a lack of understanding of the spread and movement of fire often means that occupants are unprepared and misjudge appropriate actions. Immersive health and safety training enables employers to train people about hazards and hazardous environments without putting anyone at risk.
The Nottingham research, funded by the Institution of Occupational Safety and Health (IOSH), found contrasts between the groups in the way participants reacted to the scenario. Those in the multi-sensory group had a greater sense of urgency, reflecting a real-life scenario, and were more likely to avoid the virtual fires. Evidence from the audio-visual participants suggested that they were treating the experience more like a game, and their behaviours were less consistent with those expected in a real-world situation.
Dr Glyn Lawson, Associate Professor in the Faculty of Engineering, University of Nottingham, said: "Health and safety training can fail to motivate and engage employees and can lack relevance to real-life contexts. Our research, which has been funded by the Institution of Occupational Safety and Health, suggests that virtual environments can help address these issues, by increasing trainees' engagement and willingness to participate in further training. There are also business benefits associated with the use of virtual environment training, such as the ability to deliver training at or near the workplace and at a time that is convenient to the employee."
Virtual Reality vs. PowerPoint
A further test was done, as part of the study, to measure the effectiveness of VR training versus traditional PowerPoint training. Participants took questionnaires, testing their knowledge on either fire safety or safe vehicle disassembly procedure, before and after training as well as one week later.
While those trained via PowerPoint appeared to have gained more knowledge when tested directly after training, there was a significantly larger decrease in knowledge scores when participants were retested one week later. In comparison, the VR group's long-term retention was better, and its members reported higher levels of engagement, more positive attitudes toward occupational safety and health, and greater willingness to undertake training in the future.
The research suggests that the increased cognitive engagement of learning in the virtual environment creates more established and comprehensive mental models which can improve recall, and implies that testing an employee's knowledge immediately following health and safety training may not be an effective means of gauging long-term knowledge of health and safety.
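To illustrate the retention comparison, the sketch below computes each participant's drop in knowledge score from the immediate post-test to the one-week retest for two hypothetical groups and compares the drops with an independent-samples t-test. The scores are invented and are not the study's data.

    # Hedged sketch: knowledge retention, immediate post-test vs. one week later.
    import numpy as np
    from scipy import stats

    powerpoint_post    = np.array([18, 17, 19, 16, 18, 17, 19, 18])
    powerpoint_delayed = np.array([13, 12, 15, 11, 14, 12, 14, 13])
    vr_post            = np.array([16, 15, 17, 15, 16, 14, 17, 16])
    vr_delayed         = np.array([15, 14, 16, 13, 15, 13, 16, 15])

    pp_drop = powerpoint_post - powerpoint_delayed   # larger forgetting
    vr_drop = vr_post - vr_delayed                   # smaller forgetting

    t, p = stats.ttest_ind(pp_drop, vr_drop)
    print(f"mean drop: PowerPoint {pp_drop.mean():.2f}, VR {vr_drop.mean():.2f}, p = {p:.4f}")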
Applications to the workplace
Mary Ogungbeje, Research Manager at IOSH, said: "The wheels are turning so that virtual and smart learning is increasingly ingrained in the workplace and everyday life.
"Technology is continuously advancing and in many cases becoming more affordable, so this study gives us a taste of what's to come. By improving training strategies with the use of technology and stimulated sensory experiences, we are heading in a direction where the workforce will not just enjoy a more immersive and interesting training course but participate in an effective learning experience, so they are better prepared and equipped to stay safe, healthy and well at work."
The researchers held meetings, discussions and visits with partners, including Rolls-Royce, to obtain expert advice on fire safety and the safe handling of hazardous chemicals. The University of Nottingham's Health and Safety advisors also contributed, helping the researchers better understand how the training might be implemented in industry.
The study aims to produce evidence-based guidance on developing and using virtual environments for engaging and effective training with cost-effective, accessible solutions. The full findings feature in a report, titled 'Immersive virtual worlds: Multisensory virtual environments for health and safety training', to be released at IOSH's annual conference on Tuesday 17 September.

The future of 'extremely' energy-efficient circuits

Data centers process data and dispense results at astonishing rates, and such systems require a significant amount of energy -- so much energy, in fact, that information and communication technology is projected to account for 20% of total energy consumption in the United States by 2020.
To address this demand, a team of researchers from Japan and the United States has developed a framework that reduces energy consumption while improving efficiency.
They published their results on July 19 in Scientific Reports, a Nature journal.
"The significant amount of energy consumption has become a critical problem in modern society," said Olivia Chen, corresponding author of the paper and assistant professor in the Institute of Advanced Sciences at Yokohama National University. "There is an urgent requirement for extremely energy-efficient computing technologies."
The research team used a digital logic family called the Adiabatic Quantum-Flux-Parametron (AQFP). The idea behind the logic is to replace direct current with alternating current: the alternating current acts as both the clock signal and the power supply, and each time the current switches direction, it signals the next phase of computation.
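One way to picture how a single alternating current can serve as both power source and clock -- a simplified, hypothetical model, not the team's circuit design -- is to imagine gates grouped into stages, with each stage activated on a different phase of the AC cycle:

    # Simplified, hypothetical sketch of phase-based clocking. Real AQFP gates
    # are superconducting devices; here each "stage" is just a Python function
    # that is allowed to switch only during its assigned phase of the AC cycle.

    NUM_PHASES = 4  # assume a four-phase excitation scheme, for illustration

    # A toy three-stage pipeline: (assigned phase, logic function)
    stages = [
        (0, lambda a, b: a and b),  # stage driven on phase 0: AND
        (1, lambda x: not x),       # stage driven on phase 1: NOT
        (2, lambda x: x),           # stage driven on phase 2: buffer
    ]

    def run(a, b):
        value = None
        for phase in range(NUM_PHASES):      # one full cycle of the AC excitation
            for stage_phase, gate in stages:
                if stage_phase == phase:     # this phase powers and clocks this stage
                    value = gate(a, b) if stage_phase == 0 else gate(value)
        return value

    print(run(True, True))   # False: NOT(True AND True), after one full cycle
    print(run(True, False))  # True:  NOT(True AND False)

In this toy model, data advances one stage per phase, so the waveform that powers the gates also paces the computation.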
The logic, according to Chen, could improve conventional communication technologies with currently available fabrication processes.
"However, there lacks a systematic, automatic synthesis framework to translate from high-level logic description to Adiabatic Quantum-Flux-Parametron circuit netlist structures," Chen said, referring to the individual processors within the circuit. "In this paper, we mitigate that gap by presenting an automatic flow. We also demonstrate that AQFP can achieve a reduction in energy use by several orders of magnitude compared to traditional technologies."
The researchers proposed a top-down framework for making these computing decisions that can also analyze its own performance. To do this, they used logic synthesis, the process of directing the passage of information through logic gates within the processing unit. A logic gate takes in small pieces of binary information and outputs a yes-or-no answer; that answer can trigger other gates to respond and move the computation forward, or stop it completely.
On this basis, the researchers developed a computation flow that takes a high-level description of the processing task, together with how much energy the system uses and dissipates, and turns it into an optimized mapping for each gate within the circuit model. From this, Chen and the research team can balance the estimated power needed to push a computation through the system against the energy the system dissipates.
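As a loose illustration of what a gate-level mapping paired with per-gate energy estimates might look like -- the gate library, netlist and energy figures below are invented for the example, not taken from the paper:

    # Hypothetical gate library and netlist for out = (a AND b) OR (NOT c).
    # Each gate type carries an invented switching-energy figure; summing them
    # while evaluating the netlist pairs a logic result with an energy estimate.

    GATE_LIBRARY = {
        "AND": (lambda x, y: x & y, 2.0),  # (function, energy in arbitrary units)
        "OR":  (lambda x, y: x | y, 2.0),
        "NOT": (lambda x: 1 - x,    1.0),
    }

    # Netlist entries: (output net, gate type, input nets)
    NETLIST = [
        ("n1",  "AND", ("a", "b")),
        ("n2",  "NOT", ("c",)),
        ("out", "OR",  ("n1", "n2")),
    ]

    def evaluate(inputs):
        """Evaluate the netlist; return the output value and total estimated energy."""
        nets = dict(inputs)
        energy = 0.0
        for out_net, gate_type, in_nets in NETLIST:
            func, gate_energy = GATE_LIBRARY[gate_type]
            nets[out_net] = func(*(nets[n] for n in in_nets))
            energy += gate_energy
        return nets["out"], energy

    print(evaluate({"a": 1, "b": 1, "c": 1}))  # (1, 5.0)

A real synthesis flow optimizes which gates are used and how they are connected; the point here is only that each mapping decision carries an energy cost that can be estimated alongside the logic itself.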
According to Chen, the approach still reduces energy dissipation by two orders of magnitude even after accounting for the cooling energy that superconducting technologies require.
"These results demonstrate the potential of AQFP technology and applications for large-scale, high-performance and energy-efficient computations," Chen said.
Ultimately, the researchers plan to develop a fully automated framework to generate the most efficient AQFP circuit layout.
"The synthesis results of AQFP circuits are highly promising in terms of energy-efficient and high-performance computing," Chen said. "With the future advancing and maturity of AQFP fabrication technology, we anticipate broader applications ranging from space applications and large-scale computing facilities such as data centers."

Investments to address climate change are good business

An internationally respected group of scientists have urgently called on world leaders to accelerate efforts to tackle climate change. Almost every aspect of the planet's environment and ecology is undergoing changes in response to climate change, some of which will be profound if not catastrophic in the future.
According to their study published in Science today, reducing the magnitude of climate change is also a good investment. Over the next few decades, acting to reduce climate change is expected to cost much less than the damage otherwise inflicted by climate change on people, infrastructure and ecosystems.
"Acting on climate change" said lead author, Prof Ove Hoegh-Guldberg from the ARC Centre for Excellence in Coral Reef Studies at the University of Queensland in Australia "has a good return on investment when one considers the damages avoided by acting."
The case for investment is strengthened by the wealth of evidence that the impacts of climate change are happening faster and more extensively than was projected even just a few years ago, making rapid reductions in greenhouse gas emissions all the more compelling and urgent.
Prof Hoegh-Guldberg explained the mismatch. "First, we have underestimated the sensitivity of natural and human systems to climate change, and the speed at which these changes are happening. Second, we have underappreciated the synergistic nature of climate threats -- with the outcomes tending to be worse than the sum of the parts. This is resulting in rapid and comprehensive climate impacts, with growing damage to people, ecosystems, and livelihoods."
For example, sea-level rise can lead to higher water levels during storm events, which can create more damage. In deprived areas, this may exacerbate poverty, creating further disadvantage. Each risk may be small on its own, but small changes across a number of risks can combine to produce large impacts.
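A simple worked example with invented numbers shows how modest individual increases can compound:

    # Invented illustration: three interacting risk factors (say, storm-surge
    # height, rainfall intensity and exposed population) each grow by 15%.
    # If damages scale roughly with their product, the combined increase is
    # far larger than any single factor suggests.

    factors = [1.15, 1.15, 1.15]
    combined = 1.0
    for f in factors:
        combined *= f
    print(f"Combined increase: {(combined - 1) * 100:.0f}%")  # about 52%, not 15%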
Prof Daniela Jacob, co-author and Director of the Climate Services Centre (GERICS) in Germany, is concerned about these rapid changes -- especially about unprecedented weather extremes.
"We are already in new territory" said Prof Jacob, "The 'novelty' of the weather is making our ability to forecast and respond to weather-related phenomena very difficult."
These changes are having major consequences. The paper updates a database of climate-related changes and finds significant benefits in avoiding 2°C of warming and aiming to restrict the increase to 1.5°C above pre-industrial global temperatures.
Prof Rachel Warren from the Tyndall Centre at the University of East Anglia in the UK assessed projections of risk for forests, biodiversity, food, crops and other critical systems, and found very significant benefits for limiting global warming to 1.5°C rather than 2°C.
"The scientific community has quantified these risks in order to inform policy makers about the benefits of avoiding them," Prof Warren stated.
Since the Paris Agreement came into force, there has been a race to quantify the benefits of limiting warming to 1.5°C so that policy makers have the best possible information for developing the policy required to do so.
Prof Warren continued. "If such policy is not implemented, we will continue on the current upward trajectory of burning fossil fuels and continuing deforestation, which will expand the already large-scale degradation of ecosystems. To be honest, the overall picture is very grim unless we act."
A recent report from the United Nations projected that as many as a million species may be at risk of extinction over the coming decades and centuries. Climate change is not the only factor but is one of the most important ones.
The urgency of responding to climate change is front of mind for Prof Michael Taylor, co-author and Dean of Science at the University of the West Indies. "This is not an academic issue, it is a matter of life and death for people everywhere. That said, people from small island States and low-lying countries are in the immediate cross-hairs of climate change."
"I am very concerned about the future for these people," said Professor Taylor.
This urgency to act is further emphasized by the vulnerability of developing countries to climate change impacts as pointed out by Francois Engelbrecht, co-author and Professor of Climatology at the Global Change Institute of the University of the Witwatersrand in South Africa.
"The developing African countries are amongst those to be affected most in terms of impacts on economic growth in the absence of strong climate change mitigation," Prof Engelbrecht explains.
Prof Hoegh-Guldberg reiterated the importance of the coming year (2020) in terms of climate action and the opportunity to strengthen emission reduction pledges in line with the Paris Agreement of 2015.
"Current emission reduction commitments are inadequate and risk throwing many nations into chaos and harm, with a particular vulnerability of poor peoples. To avoid this, we must accelerate action and tighten emission reduction targets so that they fall in line with the Paris Agreement. As we show, this is much less costly than suffering the impacts of 2oC or more of climate change."
"Tackling climate change is a tall order. However, there is no alternative from the perspective of human well-being -- and too much at stake not to act urgently on this issue."

Novel C. diff structures are required for infection, offer new therapeutic targets

  Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...