Saturday 30 September 2023

Upgrading iron and steel plants could save equivalent of two years of global carbon emissions

Upgrading, or retrofitting, the world's iron and steel processing plants early could reduce carbon emissions by up to 70 gigatonnes by 2050, roughly equivalent to two years' worth of net global carbon emissions, according to a new study led by UCL researchers.

In the study, published in the journal Nature, the researchers found that upgrading the world's iron and steel production facilities could reduce carbon emissions by 58.7 gigatonnes between 2020 and 2050, roughly equivalent to two years' worth of net global carbon emissions. They also found that moving emissions-reducing retrofits forward, five years ahead of when they would typically be scheduled, would cut emissions by 69.6 gigatonnes over the same period. Iron and steel production contributes about 7% of total global carbon emissions.

To develop this retrofit schedule, the team created a comprehensive database of 19,678 individual processing units located in 4,883 iron and steel plants around the world, inventoried by their technical characteristics, including location, processing technology, operating details, status and age.

Iron and steel production is a carbon-intensive process. The researchers found that as of 2019, the most recent year for which data are available, 74.5% of the world's steel was produced in coal-powered plants that release considerable carbon emissions. Technologies exist to reduce these emissions, but upgrades are expensive and time-consuming, so they are usually only undertaken at the end of a processing unit's operational lifetime.

Refining is also hard on the equipment, and the individual processing units within each plant need to be retrofitted periodically to prolong their operational lifetimes. Overall, 43.2% of global iron and steel plants have been retrofitted with new technologies or have otherwise enhanced their processes to extend their operating lifetime. The frequency of their retrofits depends on the technique they employ and how old they are, but typically they occur after 15 to 27 years of use.

The researchers found that if all currently operating processing units were upgraded to incorporate low-emissions technology at their predicted time of refit, total emissions from the iron and steel sector could be reduced by 58.7 gigatonnes between 2020 and 2050. If all the refits and upgrades were instead brought forward and completed five years early, the total carbon savings would be 16% greater, at 69.6 gigatonnes.
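To see why earlier retrofits compound into larger savings, it helps to reduce the logic to a single hypothetical unit. The Python sketch below uses invented emission rates and dates; the study's fleet-wide 16% figure emerges from aggregating thousands of units on different schedules.

```python
# Sketch: cumulative CO2 avoided by retrofitting one processing unit to a
# low-emissions technology, on schedule vs. five years early.
# All rates and years are illustrative assumptions, not values from the study.

def savings_by_2050(retrofit_year, baseline_mt_yr=2.0, retrofit_mt_yr=0.4):
    """CO2 avoided (Mt) between retrofit_year and 2050, assuming the unit's
    emissions drop from baseline_mt_yr to retrofit_mt_yr once retrofitted."""
    return (baseline_mt_yr - retrofit_mt_yr) * max(0, 2050 - retrofit_year)

on_schedule = savings_by_2050(2035)  # retrofit at the predicted refit time
early = savings_by_2050(2030)        # same retrofit, five years earlier
print(f"on schedule: {on_schedule:.0f} Mt; five years early: {early:.0f} Mt "
      f"(+{100 * (early / on_schedule - 1):.0f}%)")
# on schedule: 24 Mt; five years early: 32 Mt (+33%)
```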

But the team also emphasises that mitigation efforts will have to take place at the individual facility level, and that the decarbonisation of the entire iron and steel industry depends on the efforts undertaken by every single plant. Because of the complexity and variety of methods involved in steel production around the world, there's no one-size-fits-all decarbonisation technology or solution for the entire sector, and each processing unit should be upgraded individually according to its technical specification.

Senior author Professor Dabo Guan (UCL Bartlett School of Sustainable Construction) said: "Our results lend vivid background to the possibility of achieving net-zero carbon emissions in iron and steel production in the future. By retrofitting existing plants with low-carbon technologies, and improving scrap collecting and recycling, the iron and steel sector can dramatically reduce its carbon emissions. This study sheds light on the specific emissions reductions that are possible within the iron and steel industry."

About 63% of the world's steel is produced using some form of the blast furnace-basic oxygen furnace route, while most of the remaining capacity comes from electric arc furnaces. Upgrading the global inventory of blast furnace-basic oxygen furnace units would yield the greatest net carbon savings, about 74% of the total projected. Upgrades to electric arc furnaces would account for the second-highest net carbon savings, at about 16% of the projected whole, though this may be limited by the total amount of scrap steel available worldwide, as the technique depends on recycling existing metal.

The researchers hope these data can be used to identify better ways to update ageing steel plants with emissions-reduction technologies in order to reach net-zero carbon emissions more quickly. Compiling this publicly available global database of iron and steel plants, and tracking all their ages and technologies, has significantly improved the detail of data on the carbon emissions of global iron and steel production.

The researchers emphasise that because of the wide range of production methods and plant designs, upgrades and mitigation efforts will have to be tailored to each individual processing unit. Their research will help policymakers create a roadmap of when and how to upgrade iron and steel plants to meet emissions reduction targets.

The first author PhD student Tianyang Lei of Tsinghua University said: "Our study presents various CO2 emissions mitigation pathways at the plant level, optimizing when and how to retrofit each plant based on processing routes, latest retrofitting year, and operating lifetime, stressing the importance of early retrofitting with deep decarbonisation technologies for achieving net-zero carbon emissions by 2050."

The database reveals other insights into the iron and steel industry. Different geographic regions tend to use different technologies and techniques, depending on the technologies and raw materials available locally. Some of the most carbon-intensive, coal-based production plants are concentrated in China, Japan, and India, while plants in the Middle East and North America, which have greater access to natural gas, use techniques that emit relatively less carbon dioxide.

The top five carbon emitting iron and steel plants contribute 7% of the total CO2 emissions from the global iron and steel industry but only make up 0.1% of the total 4,883 plants. They are: Anshan Iron & Steel (China), Posco -- Pohang Iron & Steel (South Korea), Shanghai Baosteel (China), Jiangsu Shagang (China), Maanshan Iron & Steel Group (China). The researchers say that retrofitting these plants to lower their carbon emissions would demonstrate the feasibility for other, similar plants.

Pollen analysis suggests peopling of Siberia and Europe by modern humans occurred during a major Pleistocene warming spell

It's an Ice Age mystery that's been debated for decades among anthropologists: Exactly when and how did the flow of Homo sapiens into Eurasia happen? Did a cold snap or a warming spell drive early human movement from Africa into Europe and Asia?

A new study appearing in Science Advances compares Pleistocene vegetation communities around Lake Baikal in Siberia, Russia, to the oldest archaeological traces of Homo sapiens in the region. The researchers use the "remarkable evidence" to tell a compelling story from 45,000-50,000 years ago with new detail: how the first humans migrated across Europe and Asia.

The new pollen data suggest warming temperatures supported forests that expanded into Siberia and facilitated early human migration there, at roughly the same time as in more westerly areas of Eurasia.

"This research addresses long-standing debates regarding the environmental conditions that early Homo sapiens faced during their migration into Europe and Asia around 40,000 to 50,000 years ago," said co-author Ted Goebel, professor of anthropology at the University of Kansas. "It provides critical insights into environmental conditions at Lake Baikal, using pollen records to reveal surprising warmth during this period."

Indeed, the pollen data suggest that the dispersal of people occurred during some of the highest temperatures in the late Pleistocene, which also would have featured higher humidity. The ancient pollen record shows coniferous forests and grasslands characterized the region, able to support foraging and hunting by humans. Goebel said the environmental data, combined with archeological evidence, tell a new story.

"This contradicts some recent archaeological perspectives in Europe," said the KU researcher. "The key factor here is accurate dating, not just of human fossils and animal bones associated with the archaeology of these people, but also of environmental records, including from pollen. What we have presented is a robust chronology of environmental changes in Lake Baikal during this time period, complemented by a well-dated archaeological record of Homo sapiens' presence in the region."

Goebel's collaborators were lead author Koji Shichi of the Forestry and Forest Products Research Institute in Kochi, Japan; Masami Izuho of Tokyo Metropolitan University, Hachioji, Japan; and Kenji Kashiwaya of Kanazawa University, Kanazawa, Japan.

While the pollen analysis was carried out in Japan, Goebel and Izuho tied the pollen data to important evidence in the archaeological record of early human migration. Goebel said the emergence of full-fledged Homo sapiens in the archaeological record corresponds to changes in culture and behavior. Early modern humans of this period were making stone tools on long, slender blades and working bone, antler and ivory to craft tools -- including some of the first bone needles with carved eyelets for sewing, and early bone and antler spear points.

"Some of us argue that as the anatomical changes were occurring, as evidenced by the fossil record, there was a simultaneous shift in behavior and cognition," Goebel said. "These early humans were becoming more creative, innovative and adaptable. This is when we start to observe significant changes in the archaeological record, such as cave paintings. We also find mobile art, like the early carvings known as Venus figurines. In Central Europe, there's even an ivory sculpture dating back to this early period, depicting a lion-headed man. It's not just replicating nature; it's about creative expression, inventing new things, exploring new places."

At least one human bone has been found in the region that dates to the era, according to the KU researcher.

"There is one human fossil from Siberia, although not from Lake Baikal but farther west, at a place called Ust'-Ishim," Goebel said. "Morphologically, it is human, but more importantly, it's exceptionally well-preserved. It has been directly radiocarbon-dated and has yielded ancient DNA, confirming it as a representative of modern Homo sapiens, distinct from Neanderthals or Denisovans, or other pre-modern archaic humans."

Goebel said the earliest human inhabitants of the area likely would have lived in extended nuclear families or small bands, as they seem to have done in other areas of Eurasia. But because so much archaeological evidence is degraded, it's difficult to know with certainty.

"At Ust'-Ishim in Siberia, we have evidence of a fully modern human co-existing with the sites we've been discussing," he said. "However, Ust'-Ishim was an isolated discovery, found by geologists eroding from a riverbank. We lack information about its archaeological context, whether it was part of a settlement or simply a solitary bone washed downstream. Consequently, linking that single individual to the archaeological sites in the Baikal region is tenuous -- do they represent the same population? We think so, but definitely need more evidence."

New study definitively confirms Gulf Stream weakening; understanding the changes could help predict future trends in extreme events

The Gulf Stream transport of water through the Florida Straits has slowed by 4% over the past four decades, with 99% certainty that this weakening is more than expected from random chance, according to a new study.

The Gulf Stream -- which is a major ocean current off the U.S. East Coast and a part of the North Atlantic Ocean circulation -- plays an important role in weather and climate, and a weakening could have significant implications.

"We conclude with a high degree of confidence that Gulf Stream transport has indeed slowed by about 4% in the past 40 years, the first conclusive, unambiguous observational evidence that this ocean current has undergone significant change in the recent past," states the journal article, "Robust weakening of the Gulf Stream during the past four decades observed in the Florida Straits," published in Geophysical Research Letters. The Florida Straits, located between the Florida Keys, Cuba, and The Bahamas, has been the site of many ocean observation campaigns dating to the 1980s and earlier. "This significant trend has emerged from the dataset only over the past ten years, the first unequivocal evidence for a recent multidecadal decline in this climate-relevant component of ocean circulation."

The Gulf Stream affects regional weather, climate, and coastal conditions, including European surface air temperature and precipitation, coastal sea level along the Southeastern U.S., and North Atlantic hurricane activity. "Understanding past Gulf Stream changes is important for interpreting observed changes and predicting future trends in extreme events including droughts, floods, heatwaves, and storms," according to the article. "Determining trends in Gulf Stream transport is also relevant for clarifying whether elements of the large-scale North Atlantic circulation have changed, and determining how the ocean is feeding back on climate."

"This is the strongest, most definitive evidence we have of the weakening of this climatically-relevant ocean current," said Chris Piecuch, a physical oceanographer with the Woods Hole Oceanographic Institution, who is lead author of this study.

The paper does not conclude whether the Gulf Stream weakening is due to climate change or to natural factors, stating that future studies should try to identify the cause of the weakening.

"While we can definitively say this weakening is happening, we are unable to say to what extent it is related to climate change or whether it is a natural variation," Piecuch said. "We can see similar weakening indicated in climate models, but for this paper we were not able to put together the observational evidence that would really allow us to pinpoint the cause of the observed decline."

For the study, researchers applied what's known as Bayesian modeling to combine thousands of data points from three independent data sets -- undersea cables, satellite altimetry, and in situ observations -- to determine the transport of water through the Florida Straits since 1982. Bayesian modeling uses probability to represent uncertainty within a model. The Bayesian model results provided clear evidence of significant long-term change. In addition, the researchers found that omitting any one of the data sets from the analysis still indicated weakening. The paper states this demonstrates that transport weakening is a common signal that is not dependent on any one data set.
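The flavor of that calculation can be sketched in a few lines of Python. This is a minimal stand-in rather than the study's model: it pools three synthetic records with assumed noise levels, fits a single shared linear trend under a flat prior, and asks how probable a negative trend is.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Synthetic stand-ins for three independent transport records (in sverdrups),
# sharing a weak negative trend buried under different noise levels.
# The ~32 Sv mean and -0.032 Sv/yr trend (~4% over 40 years) are assumptions.
years = np.arange(1982, 2022)
t = years - years.mean()
noise_sd = (1.0, 1.5, 2.0)
datasets = [32.0 - 0.032 * t + rng.normal(0.0, s, t.size) for s in noise_sd]

# Pool the records. With Gaussian noise of known size and a flat prior, the
# posterior for the shared trend is Gaussian, centred on the
# precision-weighted least-squares slope.
x = np.concatenate([t] * len(noise_sd))
y = np.concatenate(datasets)
w = np.concatenate([np.full(t.size, 1.0 / s**2) for s in noise_sd])

slope = np.sum(w * x * y) / np.sum(w * x**2)
slope_sd = 1.0 / np.sqrt(np.sum(w * x**2))

# Posterior probability that the trend is negative, i.e. that transport weakened.
p_weakening = norm.cdf(0.0, loc=slope, scale=slope_sd)
print(f"trend = {slope:.3f} +/- {slope_sd:.3f} Sv/yr, P(weakening) = {p_weakening:.3f}")
```

Dropping any one synthetic record and re-running mimics the paper's leave-one-out check: if the weakening survives, it is not an artifact of a single data source.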

Piecuch explained the importance of the methodology by using a courtroom analogy. "When you are making your case, you need more than one witness, and you ideally want a collection of independent witnesses whose statements -- when taken together -- paint a consistent and coherent story," he said. "We brought all the witnesses to the stand that we could technically involve to bring all these data sets together. Once we synthesized the testimony from all the different witnesses, they painted a very clear picture that, indeed, over the past 40 years the Gulf Stream has weakened by about 4%, which is significant. It's more than you would expect if the current was stable; so, it's an important change."

"I have been studying western boundary currents -- primarily the Agulhas Current off South Africa -- for 30 years and it is only now that we are able to observe a robust trend in one of these extraordinarily dynamic systems. I believe this is a profound benchmark," said article co-author Lisa Beal, professor of Ocean Sciences at the Rosenstiel School of Marine, Atmospheric, and Earth Science at the University of Miami, Florida. "Central to this paper is the Bayesian model developed by Chris, which provides a framework to carefully assimilate ocean observations of disparate quality and resolution. I think there is potential for this technique to extract other climate change signals from among the scattered observations we have in the ocean."

The study builds on many earlier studies to quantify long-term change in Gulf Stream transport. While the weakening found in the current study is consistent with hypotheses from many previous studies, Piecuch noted that the current study is "water-tight" and is "the first unequivocal evidence of a decline."

Piecuch said that the finding of definitive evidence of the weakening of the Gulf Stream transport of water "is a testament to long-term ocean observing and the importance of sustaining long ocean records."

The current study, which is part of a bigger six-year project funded by the National Science Foundation to make new measurements of the Gulf Stream at the Florida Straits, emphasizes the importance of having long-term observations, he said. "The more subtle that the change is that you are looking at, the longer is the observational record that you need to be able to tease that subtle change out of an observational time series."

"This paper explicitly demonstrates the value of these long observing systems to tease out very subtle signals. In this case, we showed that we needed more than 30 years of data," Piecuch said. "Hopefully, this paper and our wider project will drive home the importance of funding and sustaining long-term ocean observing."

Beal added, "The Gulf Stream is a vital artery of the ocean's circulation, and so the ramifications of its weakening are global. I used to think of the ocean as our last remaining frontier, wild, pristine, and indomitable. It saddens me to acknowledge, from our study and so many others, and from recent record-breaking headlines, that even the remotest parts of the ocean are now in the grip of our addiction to fossil fuels."

Decreasing biodiversity may promote spread of viruses

How are environmental changes, loss of biodiversity, and the spread of pathogens connected? The answer is a puzzle. Researchers from Charité -- Universitätsmedizin Berlin have now described one piece of that puzzle in the journal eLife, showing that the destruction of tropical rainforests harms the diversity of mosquito species. At the same time, more resilient species of mosquitoes become more prevalent -- which also means the viruses they carry are more abundant. If there are many individuals of a given species, those viruses can spread quickly.

For their study, researchers from Charité teamed up with the Leibniz Institute for Zoo and Wildlife Research (IZW) to investigate how clearing rainforests to make way for coffee or cacao plantations or human settlements affects the prevalence and biodiversity of mosquitoes and the viruses they carry. The study, which brings together the fields of virology and biodiversity research, was led by Prof. Sandra Junglen, head of the Ecology and Evolution of Arboviruses research group at the Institute of Virology at Charité.

For their research work, the team first caught mosquitoes around Taï National Park in the West African country of Côte d'Ivoire. There is a broad range of land uses there, from pristine rainforest to secondary forest, cacao and coffee plantations, and villages. "We identified the species of mosquitoes we had caught and tested them for viral infections," explains Kyra Hermanns of the Institute of Virology at Charité, the first author of the study. "Then we looked at how the composition of mosquito species differs across the different land use types, where certain viruses are present, and how prevalent they are."

Resilient mosquito species prevail over others

There are many different viruses in a healthy ecosystem such as a pristine rainforest. The main reason is that a broad range of animal species lives there that can carry the viruses, acting as hosts -- and viruses are always tied to their hosts.

If there is a change in the ecosystem, it affects the viruses as well, Junglen explains: "We discovered 49 virus species, with the greatest diversity of hosts and viruses observed in untouched or minimally disturbed habitats." Most of the 49 different virus species were relatively rare in the areas studied. However, nine of them were commonly found in multiple habitats, with the prevalence of five virus species increasing in habitats that had been disturbed and reaching the highest figures in human settlements.

"This means that the clearing of tropical rainforests causes a decrease in biodiversity across mosquito species, which changes the composition of host types. Some resilient mosquito species have multiplied very successfully in the cleared areas, bringing their viruses with them," Junglen explains. The composition of a given community of species thus has a direct effect on the prevalence of viruses: "If one host species is very abundant, it is easier for viruses to spread," the virologist notes. "All of the viruses we found to be more common were demonstrated to be present in a certain mosquito species. The viruses belong to different families and have different properties. That means we were able to show for the first time that the spread of the viruses is attributable not to a close genetic relationship, but to the characteristics of their hosts -- especially those mosquito species that adapt well to changing environmental conditions in habitats that have been disturbed."

New insight into the dynamics of infectious disease

The viruses the researchers found only infect mosquitoes and, as things currently stand, cannot be transmitted to humans. Still, they are a valuable model for understanding how changes in the diversity of a community of species affect the presence and prevalence of viruses. "Our study makes clear just how important biodiversity is, and that decreasing biodiversity makes it easier for certain viruses to thrive because it causes their hosts to become more abundant," Junglen notes.

"Previously, these kinds of processes were studied almost exclusively using individual pathogens and individual hosts. Now we have a more complete picture that we can use for further research," she explains. As their next step, the researchers plan to study additional habitats in other countries, with one goal being to pinpoint the exact factors that affect the diversity of mosquito species under land-use change, and the characteristics that viruses need to have in order to spread with their hosts.

Thursday 28 September 2023

New insights into the atmosphere and star of an exoplanet

A team of astronomers led by researchers at Université de Montréal has made important progress in understanding the intriguing TRAPPIST-1 exoplanetary system, which was first discovered in 2016 amid speculation it could someday provide a place for humans to live.

Not only does the new research shed light on the nature of TRAPPIST-1 b, the exoplanet orbiting closest to the system's star, it has also shown the importance of parent stars when studying exoplanets.

Published in Astrophysical Journal Letters, the findings by astronomers at UdeM's Trottier Institute for Research on Exoplanets (iREx) and colleagues in Canada, the U.K. and U.S. shed light on the complex interplay between stellar activity and exoplanet characteristics.

Captured the attention

TRAPPIST-1, a star much smaller and cooler than our sun located approximately 40 light-years away from Earth, has captured the attention of scientists and space enthusiasts alike since the discovery of its seven Earth-sized exoplanets seven years ago. These worlds, tightly packed around their star with three of them within its habitable zone, have fueled hopes of finding potentially habitable environments beyond our solar system.

Lead author Olivia Lim, a doctoral student at iREx, and her colleagues used the technique of transmission spectroscopy to peer deeper into the distant world. By analysing the central star's light after it has passed through the exoplanet's atmosphere during a transit, astronomers can see the unique fingerprint left behind by the molecules and atoms found within that atmosphere.
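The raw transit signal here is comparatively large because the star is so small. A back-of-envelope estimate of the transit depth, using approximate published radii for TRAPPIST-1 and planet b:

```python
# Transit depth: the fraction of starlight blocked is (Rp / Rs)**2.
# The radii below are approximate literature values, rounded.
R_EARTH_KM = 6_371.0
R_SUN_KM = 695_700.0

r_planet = 1.1 * R_EARTH_KM   # TRAPPIST-1 b: ~1.1 Earth radii
r_star = 0.12 * R_SUN_KM      # TRAPPIST-1: ~0.12 solar radii

depth = (r_planet / r_star) ** 2
print(f"transit depth ~ {depth:.4f} ({100 * depth:.2f}% of starlight)")  # ~0.70%
```

Atmospheric absorption modulates that roughly 0.7% depth by only tens to hundreds of parts per million across wavelength, which is why percent-level spots and faculae on the star can masquerade as atmospheric features.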

'Just a small subset'

"This is just a small subset of many more observations of this unique planetary system yet to come and to be analysed," adds René Doyon, Principal Investigator of the NIRISS instrument and co-author on the study. "These first observations highlight the power of NIRISS and the JWST in general to probe the thin aroun

The astronomers' key finding was just how significant stellar activity and contamination are when trying to determine the nature of an exoplanet. Stellar contamination refers to the influence of the star's own features, such as dark spots and bright faculae, on the measurements of the exoplanet's atmosphere.

The team found compelling evidence that stellar contamination plays a crucial role in shaping the transmission spectra of TRAPPIST-1 b and, likely, the other planets in the system. The central star's activity can create "ghost signals" that may fool the observer into thinking they have detected a particular molecule in the exoplanet's atmosphere.

This result underscores the importance of considering stellar contamination when planning future observations of all exoplanetary systems, the scientists say. This is especially true for systems like TRAPPIST-1, which is centred around a red dwarf star that can be particularly active, with starspots and frequent flare events.

"In addition to the contamination from stellar spots and faculae, we saw a stellar flare, an unpredictable event during which the star looks brighter for several minutes or hours," said Lim. "This flare affected our measurement of the amount of light blocked by the planet. Such signatures of stellar activity are difficult to model but we need to account for them to ensure that we interpret the data correctly."

A range of models explored

Based on their collected JWST observations, Lim and her team explored a range of atmospheric models for TRAPPIST-1 b, examining various possible compositions and scenarios.

They found they could confidently rule out the existence of cloud-free, hydrogen-rich atmospheres -- in other words, there appears to be no clear, extended atmosphere around TRAPPIST-1 b. However, the data could not confidently exclude thinner atmospheres, such as those composed of pure water, carbon dioxide, or methane, nor an atmosphere similar to that of Titan, a moon of Saturn and the only moon in the Solar System with a substantial atmosphere.

These results are generally consistent with previous (photometric, rather than spectroscopic) JWST observations of TRAPPIST-1 b with the MIRI instrument. The new study also shows that Canada's NIRISS instrument is a high-performing, sensitive tool able to probe for atmospheres on Earth-sized exoplanets at impressive levels of precision.

Hidden supermassive black holes reveal their secrets through radio signals

Astronomers have found a striking link between the amount of dust surrounding a supermassive black hole and the strength of the radio emission produced in extremely bright galaxies. The findings are published in the Monthly Notices of the Royal Astronomical Society.

The team of international astronomers, led by Newcastle University and Durham University, UK, used new data from the Dark Energy Spectroscopic Instrument (DESI), which is conducting a five-year survey of large-scale structure in the universe that will include optical spectra for ~3 million quasars: extremely bright galaxies powered by supermassive black holes. They found that quasars containing more dust, and therefore appearing redder, were more likely to have stronger radio emission than quasars with little to no dust, which appear very blue.

Almost every known galaxy, including our own Milky Way, contains a supermassive black hole at its centre: a black hole with a mass millions to billions of times that of our Sun. In some galaxies there is a lot of material in the centre, feeding and growing this supermassive black hole, making it very energetic and "active." The most powerful of these active galaxies are called "quasars," some of the brightest objects in the Universe. Most quasars appear very blue, owing to the bright disc of matter that orbits and feeds the central supermassive black hole, which shines brightly at optical and ultraviolet wavelengths. However, astronomers have found that a significant fraction of these quasars appear very red, and the nature of these objects is still not well understood.

In order to understand the physics of these red quasars, "spectroscopic" measurements are required, which can be used to analyse the quasar light at different wavelengths. The "shape" of the quasar's spectrum can indicate the amount of dust present surrounding the central region. Observing the radio emission from quasars can also reveal the energetics of the central supermassive black hole: whether it is launching powerful "winds" or "jets" that might shape the surrounding galaxy.

This new study, led by Dr Victoria Fawcett of Newcastle University (previously of Durham University), uses spectroscopic observations from DESI to measure the amount of dust (reddening) in a sample of ~35,000 quasars and link it to the observed radio emission. The researchers find that DESI is capable of observing much more extreme red (dusty) quasars than similar previous spectroscopic surveys, such as the Sloan Digital Sky Survey (SDSS). They also find that redder quasars are much more likely to have strong radio emission than typical blue quasars.
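In spirit, establishing such a link comes down to binning quasars by measured reddening and comparing radio-detection rates across the bins. The sketch below does this with synthetic data; the E(B-V) distribution and detection probabilities are invented, whereas the real analysis works from DESI spectra cross-matched with radio survey data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic quasar sample: reddening values E(B-V) and a radio-detection
# probability that rises with dust content (both distributions invented).
n = 35_000
ebv = rng.exponential(0.05, n)
p_radio = 0.05 + 0.40 * np.clip(ebv / 0.3, 0.0, 1.0)
radio_detected = rng.random(n) < p_radio

# Compare radio-detection fractions for blue (dust-poor) through red
# (dusty) quasars.
bins = [("blue", 0.00, 0.05), ("moderate", 0.05, 0.15), ("red", 0.15, np.inf)]
for label, lo, hi in bins:
    sel = (ebv >= lo) & (ebv < hi)
    print(f"{label:8s}: radio-detected fraction = {radio_detected[sel].mean():.2f}")
```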

Dr Fawcett says: "It was really exciting to see the amazing quality of the DESI data and to discover thousands of these, previously rare, red quasars. I feel like this study puts lots of the puzzle pieces together in our understanding of red quasars and definitively links the dust in a quasar to its radio emission. I think this is the strongest evidence so far that red quasars are a key element in how galaxies evolve."

This reddening-radio connection is likely due to powerful outflows of gas driven away from the supermassive black hole, which slam into the surrounding dust, causing shocks and radio emission. These outflows will eventually blow away all the dust and gas in the central region of the galaxy, revealing a blue quasar and resulting in weaker radio emission. This is consistent with the emerging picture that red quasars are a younger, "blow-out" phase in the evolution of galaxies. Red quasars may therefore be extremely important for understanding how galaxies evolve over time.

Dr Fawcett continues: "There are still many unanswered questions surrounding red quasars, such as whether black hole winds or radio jets are ultimately responsible for this enhanced radio emission. However, with the sample of DESI red quasars continuing to grow over the next few years of the survey, I am confident that we are on the brink of fully understanding the nature of these red quasars."

Scientists regenerate neurons that restore walking in mice after paralysis from spinal cord injury

In a new study in mice, a team of researchers from UCLA, the Swiss Federal Institute of Technology, and Harvard University has uncovered a crucial component for restoring functional activity after spinal cord injury. The neuroscientists have shown that re-growing specific neurons back to their natural target regions led to recovery, while random regrowth was not effective.

In a 2018 study published in Nature, the team identified a treatment approach that triggers axons -- the tiny fibers that link nerve cells and enable them to communicate -- to regrow after spinal cord injury in rodents. But even as that approach successfully led to the regeneration of axons across severe spinal cord lesions, achieving functional recovery remained a significant challenge.

For the new study, published this week in Science, the team aimed to determine whether directing the regeneration of axons from specific neuronal subpopulations to their natural target regions could lead to meaningful functional restoration after spinal cord injury in mice. They first used advanced genetic analysis to identify nerve cell groups that enable walking improvement after a partial spinal cord injury.

The researchers then found that merely regenerating axons from these nerve cells across the spinal cord lesion without specific guidance had no impact on functional recovery. However, when the strategy was refined to include using chemical signals to attract and guide the regeneration of these axons to their natural target region in the lumbar spinal cord, significant improvements in walking ability were observed in a mouse model of complete spinal cord injury.

"Our study provides crucial insights into the intricacies of axon regeneration and requirements for functional recovery after spinal cord injuries," said Michael Sofroniew, MD, PhD, professor of neurobiology at the David Geffen School of Medicine at UCLA and a senior author of the new study. "It highlights the necessity of not only regenerating axons across lesions but also of actively guiding them to reach their natural target regions to achieve meaningful neurological restoration."

The authors say this understanding -- that the projections of specific neuronal subpopulations must be re-established to their natural target regions -- holds significant promise for the development of therapies aimed at restoring neurological functions in larger animals and humans. However, the researchers also acknowledge the complexity of promoting regeneration over longer distances in non-rodents, necessitating strategies with intricate spatial and temporal features. Still, they conclude that applying the principles laid out in their work "will unlock the framework to achieve meaningful repair of the injured spinal cord and may expedite repair after other forms of central nervous system injury and disease."

The research team included scientists from NeuroX Institute, School of Life Sciences, Swiss Federal Institute of Technology (EPFL); the Department of Neurosurgery, Lausanne University Hospital (CHUV) and University of Lausanne (UNIL), Center for Interventional Neurotherapies (NeuroRestore); Wyss Center for Bio and Neuroengineering; Department of Clinical Neuroscience, Lausanne University Hospital (CHUV) and University of Lausanne; Departments of Bioengineering, Chemistry and Biochemistry, University of California, Los Angeles; Bertarelli Platform for Gene Therapy, Swiss Federal Institute of Technology; Brain Mind Institute, School of Life Sciences, Swiss Federal Institute of Technology; M. Kirby Neurobiology Center, Department of Neurology, Boston Children's Hospital, Harvard Medical School, Boston; Department of Neurobiology, David Geffen School of Medicine, University of California, Los Angeles.

This work was supported by the Defitech Foundation, Wings for Life, Riders4Riders, Wyss Center for Bio and Neuroengineering, Swiss National Science Foundation (PZ00P3_185728 to M.A.A. and PZ00P3_208988 to J.W.S.); the Morton Cure Paralysis Foundation (to M.A.A); the ALARME Foundation (to M.A.A. and G.C); the Dr. Miriam and Sheldon G. Adelson Medical Foundation (to M.V.S., Z.H., and T.J.D.); Wings for Life (M.A.A., M.V.S., M.A.S., and M.M); Holcim-Stiftung Foundation (to J.W.S.); and the Canadian Institutes for Health Research (to J.W.S.). We are grateful to J. Ravier and M. Burri for the illustrations and to L. Batti and I. Gantar from the Advanced Lightsheet Imaging Center (ALICe) at the Wyss Center for Bio and Neuroengineering, Geneva. Funding: This work was supported in part using the resources and services of the Gene Expression Core Facility, and the Bertarelli Platform for Gene Therapy at the School of Life Sciences of EPFL.

Why are you better at recognizing upright faces? Clues from a person who sees the world differently

When you see a familiar face upright, you'll recognize it right away. But if you saw that same face upside down, it's much harder to place. Now researchers who've studied Claudio, a 42-year-old man whose head is rotated back almost 180 degrees such that it sits between his shoulder blades, suggest that the reason people are so good at processing upright faces has arisen through a combination of evolution and experience. The findings appear September 22 in the journal iScience.

"Nearly everyone has far more experience with upright faces and ancestors whose reproduction was influenced by their ability to process upright faces, so it's not easy to pull apart the influence of experience and evolved mechanisms tailored for upright faces in typical participants," says first author Brad Duchaine, a psychologist at Dartmouth College. "However, because Claudio's head orientation is reversed to most faces that he has looked at, he provides an opportunity to examine what happens when the faces viewed most often have a different orientation than the viewer's face."

Researchers have long known from earlier studies that our ability to process faces drops or even plummets when a face is rotated 180 degrees. But it had been hard to determine if the reason for that comes from evolutionary mechanisms that shaped our brains' facial processing abilities gradually over time or simply because most of us primarily interact with people and see them with their face in an upright position.

The question was: How does Claudio's altered viewpoint relative to the faces of others change how he is able to detect and match them up? Or does it? The answer, they realized, would offer clues about the nature of face perception in people more generally.

To find out, the researchers tested Claudio's face-detection and identity-matching abilities in 2015 and 2019. They also tested his recognition of Thatcherized faces, in which some of the features, such as the eyes and mouth, are flipped upside down relative to the rest of the face. Across all three types of tests, people with typical face perception are much better at these judgements when faces are upright than when they are inverted.

Their studies found that Claudio was more accurate than controls with inverted detection and Thatcher judgments but scored similarly to controls with face identity matching. The researchers say the findings suggest that our ability with upright faces arises from a combination of evolutionary mechanisms and experience.

"Because Claudio appears to have had more experience with upright faces than upside-down faces and he has viewed faces from an upside-down vantage point, it is revealing that he does not do better with upright faces than inverted faces for face detection and face identity matching," Duchaine said. "The absence of an advantage for the face orientation that he's had more experience with suggests that our great sensitivity to upright faces results from both our greater experience with them and an evolved component that makes our visual system better tuned to upright faces than inverted faces."

Duchaine was surprised to get a different result when Claudio saw Thatcherized faces. In that case, Claudio performed better when those manipulated faces appeared upright. While the researchers say they don't know why this happened, they suspect that the Thatcher effect arises from different visual mechanisms than face detection and identity matching -- and that those different mechanisms must have different developmental trajectories.

In future studies, the researchers want to learn more about this difference as well as other kinds of judgments people make when they see faces, including facial expressions, age, sex, attractiveness, eye gaze direction, trustworthiness, and more. Using measures of what's happening in Claudio's brain when he sees faces, they also want to "see whether his current face processing depends on typical mechanisms." 

An implantable device could enable injection-free control of diabetes

One promising approach to treating Type 1 diabetes is implanting pancreatic islet cells that can produce insulin when needed, which can free patients from giving themselves frequent insulin injections. However, one major obstacle to this approach is that once the cells are implanted, they eventually run out of oxygen and stop producing insulin.

To overcome that hurdle, MIT engineers have designed a new implantable device that not only carries hundreds of thousands of insulin-producing islet cells, but also has its own on-board oxygen factory, which generates oxygen by splitting water vapor found in the body.

The researchers showed that when implanted into diabetic mice, this device could keep the mice's blood glucose levels stable for at least a month. The researchers now hope to create a larger version of the device, about the size of a stick of chewing gum, that could eventually be tested in people with Type 1 diabetes.

"You can think of this as a living medical device that is made from human cells that secrete insulin, along with an electronic life support-system. We're excited by the progress so far, and we really are optimistic that this technology could end up helping patients," says Daniel Anderson, a professor in MIT's Department of Chemical Engineering, a member of MIT's Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science (IMES), and the senior author of the study.

While the researchers' main focus is on diabetes treatment, they say that this kind of device could also be adapted to treat other diseases that require repeated delivery of therapeutic proteins.

MIT Research Scientist Siddharth Krishnan is the lead author of the paper, which appears today in the Proceedings of the National Academy of Sciences. The research team also includes several other researchers from MIT, including Robert Langer, the David H. Koch Institute Professor at MIT and a member of the Koch Institute, as well as researchers from Boston Children's Hospital.

Replacing injections

Most patients with Type 1 diabetes have to monitor their blood glucose levels carefully and inject themselves with insulin at least once a day. However, this process doesn't replicate the body's natural ability to control blood glucose levels.

"The vast majority of diabetics that are insulin-dependent are injecting themselves with insulin, and doing their very best, but they do not have healthy blood sugar levels," Anderson says. "If you look at their blood sugar levels, even for people that are very dedicated to being careful, they just can't match what a living pancreas can do."

A better alternative would be to transplant cells that produce insulin whenever they detect surges in the patient's blood glucose levels. Some diabetes patients have received transplanted islet cells from human cadavers, which can achieve long-term control of diabetes; however, these patients have to take immunosuppressive drugs to prevent their body from rejecting the implanted cells.

More recently, researchers have shown similar success with islet cells derived from stem cells, but patients who receive those cells also need to take immunosuppressive drugs.

Another possibility, which could prevent the need for immunosuppressive drugs, is to encapsulate the transplanted cells within a flexible device that protects the cells from the immune system. However, finding a reliable oxygen supply for these encapsulated cells has proven challenging.

Some experimental devices, including one that has been tested in clinical trials, feature an oxygen chamber that can supply the cells, but this chamber needs to be reloaded periodically. Other researchers have developed implants that include chemical reagents that can generate oxygen, but these also run out eventually.

The MIT team took a different approach that could potentially generate oxygen indefinitely, by splitting water. This is done using a proton-exchange membrane -- a technology originally deployed to generate hydrogen in fuel cells -- located within the device. This membrane can split water vapor (found abundantly in the body) into hydrogen, which diffuses harmlessly away, and oxygen, which goes into a storage chamber that feeds the islet cells through a thin, oxygen-permeable membrane.

A significant advantage of this approach is that it does not require any wires or batteries. Splitting this water vapor requires a small voltage (about 2 volts), which is generated using a phenomenon known as resonant inductive coupling. A tuned magnetic coil located outside the body transmits power to a small, flexible antenna within the device, allowing for wireless power transfer. It does require an external coil, which the researchers anticipate could be worn as a patch on the patient's skin.
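For a sense of scale, Faraday's law of electrolysis links the drive current to the oxygen supply. In this back-of-envelope sketch, the 10 mA current is an invented figure; only the roughly 2-volt operating point comes from the article.

```python
# Faraday's law: splitting water (2 H2O -> 2 H2 + O2) takes 4 electrons
# per O2 molecule, so the oxygen generation rate is I / (4 * F) in mol/s.
FARADAY_C_PER_MOL = 96_485.0

current_a = 0.010  # assumed electrolysis current: 10 mA (illustrative)
o2_mol_per_s = current_a / (4 * FARADAY_C_PER_MOL)
o2_umol_per_hr = o2_mol_per_s * 3_600 * 1e6
print(f"O2 supply ~ {o2_umol_per_hr:.0f} umol/hr at {1000 * current_a:.0f} mA")
# O2 supply ~ 93 umol/hr at 10 mA
```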

Drugs on demand

After building their device, which is about the size of a U.S. quarter, the researchers tested it in diabetic mice. One group of mice received the device with the oxygen-generating, water-splitting membrane, while the other received a device that contained islet cells without any supplemental oxygen. The devices were implanted just under the skin, in mice with fully functional immune systems.

The researchers found that mice implanted with the oxygen-generating device were able to maintain normal blood glucose levels, comparable to healthy animals. However, mice that received the nonoxygenated device became hyperglycemic (with elevated blood sugar) within about two weeks.

Typically, when any kind of medical device is implanted in the body, attack by the immune system leads to a buildup of scar tissue called fibrosis, which can reduce the device's effectiveness. This kind of scar tissue did form around the implants used in this study, but the device's success in controlling blood glucose levels suggests that insulin was still able to diffuse out of the device, and glucose into it.

This approach could also be used to deliver cells that produce other types of therapeutic proteins that need to be given over long periods of time. In this study, the researchers showed that the device could also keep alive cells that produce erythropoietin, a protein that stimulates red blood cell production.

"We're optimistic that it will be possible to make living medical devices that can reside in the body and produce drugs as needed," Anderson says. "There are a variety of diseases where patients need to take proteins exogenously, sometimes very frequently. If we can replace the need for infusions every other week with a single implant that can act for a long time, I think that could really help a lot of patients."

The researchers now plan to adapt the device for testing in larger animals and eventually humans. For human use, they hope to develop an implant that would be about the size of a stick of chewing gum. They also plan to test whether the device can remain in the body for longer periods of time.

"The materials we've used are inherently stable and long-lived, so I think that kind of long-term operation is within the realm of possibility, and that's what we're working on," Krishnan says.

"We are very excited about these findings, which we believe could provide a whole new way of someday treating diabetes and possibly other diseases," Langer adds.

Wednesday 27 September 2023

Caribbean parrots thought to be endemic are actually relicts of millennial-scale extinction

In a new study published in PNAS, researchers have extracted the first ancient DNA from Caribbean parrots, which they compared with genetic sequences from modern birds. Working with fossils and archaeological specimens, they showed that two species thought to be endemic to particular islands were once more widespread and diverse. The results help explain how parrots rapidly became the world's most endangered group of birds, with 28% of all species considered to be threatened. This is especially true for parrots that inhabit islands.

On his first voyage to the Caribbean in 1492, Christopher Columbus noted that flocks of parrots were so abundant they "obscured the sun." Today, more than half of parrot species in the Caribbean have gone extinct, from large particolored macaws to a parrotlet the size of a sparrow.

Biologists attempting to conserve the remaining parrot species are stymied by how little is known of their former distributions. This is due, primarily, to their complicated history with humans.

"People have always been obsessed with parrots," said lead author Jessica Oswald, a senior biologist with the U.S. Fish and Wildlife Service Forensics Lab. "Indigenous peoples have moved parrots across continents and between islands for thousands of years. Later, European colonists continued that practice, and we're still moving them around today."

Centuries of exchange and trade have made it difficult to know how parrots wound up where they are now. Half of the 24 parrot species that currently live in the Caribbean were introduced from other areas, and it's unclear whether native parrots evolved on the islands they inhabit or were similarly transported there.

Fortunately, their popularity with humans means parrots are occasionally found in archaeological sites as well. Their bones have been recovered from refuse piles -- called middens -- alongside shells, fish bones and other scraps from previous meals.

"There are records of parrots being kept in homes, where they were valued for their feathers and, in some cases, potentially as a source of food," said senior author Michelle LeFebvre, curator of South Florida archaeology and ethnography at the Florida Museum of Natural History.

Parrots also have an uncharacteristically good fossil record in the Caribbean, compared with other tropical regions. However, specimens are rarely found intact. More often, their bones are broken or isolated, and it's not always possible to determine which species they belonged to.

DNA can provide unequivocal answers where physical comparisons fall short, and co-author David Steadman was eager to see if they could extract any residual genetic material preserved in bone tissue. Oswald -- who worked as a graduate student and postdoctoral associate at the Florida Museum -- had recently completed a proof of concept, in which she successfully sequenced the first DNA from an extinct Caribbean bird that had been preserved in a blue hole for 2,500 years. Using the same methods, she later discovered that an extinct flightless bird from the Caribbean was most closely related to similarly bygone, ground-dwelling birds from Africa and New Zealand.

"For me, the single most satisfying thing about this project is we can use fossils in ways that weren't even imaginable when they came out of the ground," said Steadman, a retired curator of ornithology at the Florida Museum.

The authors pieced together the long history of parrots in the genus Amazona, focusing on two species -- the Cuban (A. leucocephala) and Hispaniolan (A. ventralis) parrots -- for which they could obtain ancient DNA samples.

Of the two, Cuban parrots are currently the most widespread, with isolated populations in Cuba and on a few islands in the Bahamas and Turks and Caicos. They're one of the only native parrots in the region not in imminent danger of extinction.

The Hispaniolan parrot has had a harder time adapting to human-wrought changes. It's listed as vulnerable to extinction on the International Union for Conservation of Nature's Red List and is entirely endemic to its eponymous island.

Most of the fragmentary fossils collected outside of Hispaniola and Puerto Rico had consequently been identified as belonging to the more common Cuban parrot. But when the DNA results came back, they told a different story. The fossils from the Bahamian paleontological sites were actually from Hispaniolan parrots, indicating that this species formerly had a range that extended up through the Bahamas, before humans arrived on the islands.

Similarly, the results indicate that Cuban parrots once inhabited the largest island in the Turks and Caicos, from which they are now absent.

"One of the striking things about this study is the discovery of what could be considered dark extinctions," LeFebvre said. "We're learning about diversity we didn't even know existed until we took a closer look at museum specimens."

Bones from archaeological sites in the Turks and Caicos and from Montserrat -- an island far to the south in the Lesser Antilles -- were also determined to be from Hispaniolan parrots. These had likely been transported there by humans, and the species is no longer present on the islands.

According to Oswald, knowing where species once thrived -- both naturally by their own devices and artificially with the aid of humans -- is the first step to conserving what's left of their diversity.

"We have to think about what we consider to be natural," she said. "People have been altering the natural world for thousands of years, and species that we think are endemic to certain areas might be the product of recent range loss due to humans. It takes paleontologists, archaeologists, evolutionary biologists and museum scientists all working together to really understand the long-term role of humans on diversity change."

Brian Smith of the American Museum of Natural History, Julie Allen of Virginia Tech and Robert Guralnick of the Florida Museum of Natural History are also co-authors on the study.

Theories about the natural world may need to change to reflect human impact

New research reported in Nature Ecology & Evolution (25 September 2023) has, for the first time, validated at scale one of the theories that have underpinned ecology for over half a century. In doing so, the findings raise further questions about whether models should be revised to capture human impacts on natural systems.

Scientists working in the 1950s and '60s developed theories to predict the ecological distribution of species. These theories could be applied across a broad range of environments and variables, such as food supply or temperature, and when tested on a small scale they were found to be accurate. Amongst the earliest examples of these theories is the coral reef zonation theory, which explains how different types of fish or corals, for example, are found on coral reefs at different depths.

Modern computing capabilities have now made it possible to test these theories at a larger scale, to see whether they 'hold water'.

To validate the depth zonation model on coral reefs, scientists at Bangor University and the US National Oceanic and Atmospheric Administration (NOAA), led by Dr Laura Richardson of Bangor University, collected data from 5,525 surveys at 35 Pacific Ocean islands. Their work has revealed that the model is correct and can predict the distribution of different fish species according to depth, but only at uninhabited islands where there is not, and has never been, any local human interference.

At islands and reefs with human habitation, the pattern was not as marked or predictable.

The findings therefore suggest that our old 'models' of the natural world may no longer be valid in the face of mounting local human impacts.

As lead author, Dr Laura Richardson of Bangor University's School of Ocean Sciences, suggests, "Science is cumulative, building on past work. Now that we have greater computing capabilities, we should be testing these widely accepted but spatially under-validated theories at scale. Moreover the intervening years have seen human impacts on the environment increase to such an extent that these models may no longer predict the ecological distribution patterns we see today.

"This leads to more questions, both about the usefulness of models which represented a world less impacted by human activity, and about how to quantify or model our impact on the natural environment."

"The results show that now is the time to consider whether and how to include human impacts into our understanding of the natural world today."

Scientists reveal what fuels wildfires in Sierra Nevada Mountains

Wildfires in California, exacerbated by human-driven climate change, are getting more severe. To better manage them, there's a growing need to know exactly what fuels the blazes after they ignite. In a study published in Environmental Research Letters, Earth system scientists at the University of California, Irvine report that one of the chief fuels of wildfires in California's Sierra Nevada mountains is the decades-old remains of large trees.

"Our findings support the idea that large-diameter fuel build-up is a strong contributor to fire severity," said Audrey Odwuor, a Ph.D. candidate in the UCI Department of Earth System Science and the lead author of the new study.

Researchers have known for decades that increasing numbers of trees and a growing abundance of dead plant matter on forest floors are making California wildfires more severe -- but until now it was unclear which kinds of plant debris contribute most to a fire.

To tackle the question, Odwuor and two of the study's co-authors -- James Randerson, professor of Earth system science at UCI, and Alondra Moreno from the California Air Resources Board -- drove a mobile lab, owned and operated by the lab of study co-author and UCI alumna Francesca Hopkins at UC Riverside, to the southern Sierra Nevada mountains during 2021's KNP Complex Fire.

The KNP Complex Fire burned almost 90,000 acres in California's Sequoia and Kings Canyon National Parks. In the fire's smoke, the team took samples of particulate matter-laden air and analyzed the samples for their radiocarbon content at UCI's W.M. Keck Accelerator Mass Spectrometer facility with co-author and UCI Earth system science professor Claudia Czimczik.

Different fuel types, explained Czimczik, have different radiocarbon signatures, such that when the team analyzed the smoke, the radiocarbon values pointed to large fuel sources like fallen tree logs.
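
The attribution rests on simple isotope mixing arithmetic: if old, large-diameter wood and fine, recently grown fuels carry distinct radiocarbon (Delta-14C) signatures, the signature measured in the smoke pins down their relative contributions. The sketch below is a minimal two-end-member illustration; the function name and the numbers are hypothetical placeholders, not values from the UCI study.

    # Two-end-member radiocarbon mixing model (illustrative values only,
    # not measurements from the KNP Complex Fire study).
    def old_wood_fraction(d14c_smoke, d14c_fine, d14c_old):
        """Fraction of smoke carbon derived from old wood, by linear mixing.

        d14c_smoke: Delta-14C of the sampled smoke (per mil)
        d14c_fine:  Delta-14C of fine, recently grown fuels (per mil)
        d14c_old:   Delta-14C of decades-old, large-diameter wood (per mil)
        """
        return (d14c_smoke - d14c_fine) / (d14c_old - d14c_fine)

    # Hypothetical example: recent foliage near 0 per mil; old logs enriched
    # by mid-20th-century "bomb" carbon at roughly +150 per mil.
    f_old = old_wood_fraction(d14c_smoke=90.0, d14c_fine=0.0, d14c_old=150.0)
    print(f"Estimated old-wood fraction: {f_old:.0%}")  # -> 60%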

"What we did was pretty distinctive, as we were able to identify fuel sources by measuring the wildfire smoke," said Czimczik. "Our approach provides what we think of as an integrated picture of the fire because we're sampling smoke produced over the course of the fire that has been transported downwind."

The team also saw elevated levels of particulate matter 2.5 microns in diameter or less, which includes particles that, if inhaled, are small enough to be absorbed into the bloodstream.

The preponderance of large-diameter fuels is new in western forests. "We're really in a situation that's a consequence of both management strategies and climate warming since European-American settlement began in California," Odwuor said. "These fuels are building up on the forest floor over periods of decades, which is not typically how these forests were maintained."

It's information that, according to Odwuor, could help California better manage its wildfires.

"The knowledge that large-diameter fuels drive fires and fire emissions -- at least in the KNP Complex Fire -- can be useful for knowing which fuels to target with fuel treatments and what might end up in the smoke from both wildfires and prescribed fire," said Odwuor. "The idea is that because we can't control the climate, we can only do our best to manage the fuels, which will theoretically have an impact on fire severity and the composition of the smoke."

But the solution isn't as straightforward as removing trees from forest floors, because, among other things, they provide habitat for wildlife. That, and "once you get them out, where do you send them? There are only so many mills in California that can handle all the wood," Odwuor said.

Where the new knowledge could be helpful is with prescribed burns, wherein teams burn tracts of forest in a planned fashion with the aim of reducing the amount of fuel available for future wildfires.

"We're hoping to build some urgency for these management strategies," said Odwuor.

By air, rain and land: How microbes return after a wildfire

 The disruption brought by wildfires reaches everything that lives in or near a burning field or forest -- including microbes. A better understanding of how microbial communities change and grow after a fire could help researchers predict how bacteria and fungi will respond to major environmental changes.

A study published this week in mSystems suggests that dispersal -- through air or rain, for example -- plays a major role in microbial succession after a destructive fire. Researchers at the University of California, Irvine, spent a year tracking how bacterial and fungal communities returned to the leaf litter in a burned field. They found that the emerging microbial communities in the soil surface changed with the seasons and the reappearance of plants, and that the assembly of those communities was largely driven by dispersal.

The risk and extent of big ecological disturbances like wildfires have been increasing in the last few decades.

"We know with climate change and human activity we're disturbing our ecosystems more and more," said Kristin Barbour, lead author on the new study and a Ph.D. student at the University of California, Irvine. "Microbes, especially those in the surface soil, perform a number of really key ecosystem processes, like carbon and nitrogen cycling." Bacteria and fungi, she said, break down the dead and decaying plant matter on the floor of a field or forest.

Barbour originally set out to study microbial dispersal in the context of droughts, but her plans changed after an unplanned wildfire burned a field site at Loma Ridge, near Irvine. What seemed like a setback became an opportunity. "We wanted to take advantage of this disturbance, especially since wildfire is becoming more frequent in many parts of the world," Barbour said.

The intense heat produced during a wildfire alters the chemical composition of the leaf litter, where microbes reside, and can shift the microbial communities in an ecosystem.

The researchers looked at two ecosystems that had been affected by the fire: a semi-arid grassland and a coastal sage scrub. To study the movement of microbes, they used four configurations of dispersal bags. For the first, they used burned leaf litter to fill small porous pouches that allowed microbes to pass in and out. For the second, a control group, they sealed leaf litter in bags that did not allow movement in or out. The third configuration was a porous bag filled with glass slides, to collect microbes as they moved through, and the fourth, another control group, comprised closed bags with glass slides.

At five points during the year after the fire, Barbour and her colleagues collected dispersal bags from both sites and identified the bacteria and fungi on the leaf litter. They found that the effect of dispersal differed in the two environments, suggesting that microbial responses depend on the local environment. "Which hurts our ability to make generalized statements," Barbour said.

They did see some recurring patterns. Overall, dispersal from the air contributed most significantly to the microbes entering the soil surface -- 34% of the bacteria and 42% of the fungi. They also found that in the first few months after the fire, before plants had re-emerged, bulk soil (the soil beneath the leaf litter) explained the largest share of immigrating bacteria.

The study of how microbes move through the environment is an emerging area of research, Barbour said, but one that's intimately connected to larger issues of how big disturbances change the environment.

"There's a lot of exciting work being done right now, looking at dispersal and at microbial communities out in the environment," she said.

Monday 25 September 2023

Effective visual communication of climate change

 The consequences of a warming climate frequently dominated the news this summer, from devastating wildfires and floods to deadly heat waves across the globe. Reducing harm from climate change is a challenging endeavor, and it requires comprehensive public education. Thus, the question arises: How can climate change science be made most accessible to the general population, as well as decision-makers and educators?

In a new paper published in the journal Geosphere, Steph Courtney and Karen McNeal explore the effects of improved data visualizations on user perception of climate change evidence.

With a geoscience background and a specialization in science education and communication, Courtney is passionate about improving the public's understanding of climate change. "We get excited and carried away as scientists but that's not going to work for a lot of audiences," says Courtney. "Your communication goal is more important than how cool you think your graph is."

In this study, the team redesigned three graphs from the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report. They assessed the impact of the redesign on graph usability (i.e., individuals' ability to extract information from a graph accurately and quickly) and on user perception of scientist trustworthiness and climate change risk, which correlate with both knowledge and intention to take action. The process was iterative, including two rounds of user testing and successive redesign and re-testing. Methods included a combination of surveys, eye-tracking, ranking activities, and interviews.

While the overall usability of the redesigned charts was found to be equivalent to that of the originals, study participants perceived all of the redesigned figures to be more trustworthy. Participants also reported one of the redesigns made them more concerned about climate change than the original.

An interesting question explored in this study was whether more simplistic figures would look "less scientific" and thus less trustworthy. Surprisingly, this did not turn out to be the case. "Pretty is fine," explains Courtney. "If it looks nicer, it looks like you put time into it, so you care about it and you know what you are doing. Understandable and attractive graphs can be trustworthy. It is a win-win!"

The authors found that familiar figure formats were most useful; even minor changes to standard charts confused the audience. Intentional use of color-coding was very important to participants, increasing both their understanding and their perception of credibility. Likewise, minimal use of additional explanatory text on axes and fields greatly helped in those respects, although visual clutter is a potential downside.
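
As a concrete illustration of those principles, the sketch below draws a deliberately plain time-series chart: a familiar format, one purposeful color, brief labels, and little else. It is a minimal example with synthetic data and our own choices of names and values, not a reproduction of the study's redesigned IPCC figures.

    # Illustrative only: a generic time series styled along the lines the
    # study recommends (familiar format, deliberate color, minimal text).
    # The data are synthetic, not taken from any IPCC figure.
    import numpy as np
    import matplotlib.pyplot as plt

    years = np.arange(1950, 2021)
    rng = np.random.default_rng(0)
    anomaly = 0.012 * (years - 1950) + rng.normal(0, 0.08, years.size)

    fig, ax = plt.subplots(figsize=(7, 4))
    ax.plot(years, anomaly, color="firebrick", linewidth=1.5)  # one warm, purposeful color
    ax.axhline(0, color="gray", linewidth=0.8)                 # simple reference line
    ax.set_xlabel("Year")                                      # short, plain labels
    ax.set_ylabel("Temperature anomaly (°C)")
    ax.set_title("Synthetic temperature anomaly, 1950-2020")
    ax.spines[["top", "right"]].set_visible(False)             # trim visual clutter
    fig.tight_layout()
    plt.show()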

Graph redesign in this study was subtle, in order to be able to confidently parse out which change resulted in what improvement. Yet some of these changes were quite impactful. "Little changes that take just a bit of effort can make the science more accessible and meaningful to people. It is worth that little bit of effort. Communication is not the only barrier in addressing climate change, but it is an area we can stand to improve -- and it is worthwhile," emphasizes Courtney.

Courtney sees the future of this research field focusing on practical examples of climate scientists' top priorities for public understanding of and actions to address climate change, perhaps attempting more dramatic graph edits. She will share her results with the IPCC and is looking forward to seeing their next assessment report.

Plant and forest researchers do not 'anthropomorphize' plants

 Plants are often attributed with abilities similar to those known in the animal or human world. Trees are said to have feelings, and therefore to care for their offspring, like mothers. In an article in the review journal Trends in Plant Science, 32 international plant and forest researchers followed up on such assertions. Led by Prof. David G. Robinson, professor emeritus for cell biology at the Centre for Organismal Studies (COS) of Heidelberg University, the researchers analysed the claims in two popular publications on forests and reached the conclusion that conjecture is equated with fact. They warn against "anthropomorphising" plants.

The article scrutinised the assertions in two widely read books about the hidden life of trees and the search for the so-called "mother tree." The researchers report that in those works, trees are attributed with human characteristics and behaviours, including the ability to feel pain and pleasure, to communicate with one another, and to act altruistically. Based on existing research literature, Prof. Robinson and his co-authors provide detailed evidence that the main assertions are scientifically untenable. The Heidelberg researcher points out that numerous research papers on the significance of intraspecies competition clearly contradict the contention that trees of a single species support one another and keep each other alive.

According to Prof. Robinson and his colleagues, newer studies also render the "mother tree concept" untenable. Many publications based on this concept that presume to substantiate a targeted transfer of carbon from older to younger trees via networked fungi -- the mycorrhizae -- are flawed due to a lack of control variants. "And where the data does actually suggest such a transfer, the quantity of carbon transferred is so small that it is physiologically completely irrelevant for the recipient tree," states Prof. Robinson. The researchers also criticise that both books cite evidentiary sources that were not peer-reviewed.

Finally, the authors point out the fatal consequences such claims could have for the adaptation of forests to climate change if political decisions are "based on pleasant-sounding but false messages" rather than scientific fact, adds Robinson. The article's authors included researchers from the University of Göttingen as well as from Austria, Canada, Chile, Great Britain, Ireland, Israel, Spain, Sweden, Switzerland, and the USA. They represent the fields of biology, forestry, and plant science.

Study finds two antibiotics for children with sinusitis equally effective, but one had fewer side effects

 Brigham researchers found that patients prescribed amoxicillin-clavulanate had higher rates of gastrointestinal symptoms and yeast infections than those prescribed amoxicillin.

Acute sinusitis is one of the most common causes for children to be put on antibiotic medications, with patients in the United States filling nearly 5 million antibiotic prescriptions every year to treat the condition. The drugs amoxicillin and amoxicillin-clavulanate make up most of those prescriptions, but there is a lack of consensus on which should be the first-line choice for children.

In a new study published today in JAMA and led by researchers at Brigham and Women's Hospital, a founding member of the Mass General Brigham healthcare system, scientists analyzed the treatment outcomes of over 300,000 children who were prescribed either of the two drugs. They found no difference in the rates of treatment failure -- that is, having to go on a new course of antibiotics or seek additional treatment for sinusitis or its complications -- between patients prescribed amoxicillin and those prescribed amoxicillin-clavulanate. Treatment failure was so rare, in fact, that the study's authors say physicians should be confident that either medication will clear a case of acute sinusitis that requires antibiotics. But the risk of adverse events, especially gastrointestinal symptoms and yeast infections, was higher among those prescribed amoxicillin-clavulanate.

"This study adds recent, actionable data and evidence to inform what antibiotic a clinician should choose to treat a child with acute bacterial sinusitis," said lead author Timothy Savage, MD, MPH, MSc, an associate epidemiologist in the Brigham's Division of Pharmacoepidemiology and Pharmacoeconomics. "As seen from this study, there's no difference in the treatment failure rate regardless of which of these two antibiotics you choose."

Amoxicillin-clavulanate is believed to treat a wider range of bacteria than amoxicillin, but it is also associated with more gastrointestinal side effects. Scientists also worry that in the long-term, overprescribing amoxicillin-clavulanate may accelerate the rate at which infectious bacteria develop antimicrobial resistance. Doctors have therefore wondered whether the benefits of prescribing amoxicillin-clavulanate to children with acute sinusitis outweigh the short- and long-term risks.

The researchers pulled data from 320,141 clinical cases of children diagnosed with acute sinusitis and compared whether children on amoxicillin or amoxicillin-clavulanate were more likely to undergo treatment failure. They discovered that there was no difference in the rates of treatment failures associated with either medication. Treatment failure in general was exceedingly rare; less than two percent of prescriptions failed, most of which were corrected by an outpatient medication change. Only 0.1% of children had failures so severe that they required a visit to the emergency room or hospitalization.

The clinical data showed that adverse events were somewhat rare but more frequent among patients treated with amoxicillin-clavulanate, occurring in 2.3% of patients treated with amoxicillin-clavulanate versus 2% of those treated with amoxicillin. Patients treated with amoxicillin-clavulanate had a 15% higher risk of gastrointestinal side effects and a 33% higher risk of yeast infections compared to patients treated with amoxicillin. The study's authors conclude that the narrower-spectrum amoxicillin may be the best first-line choice to combat acute sinusitis.
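
Relative-risk figures of this kind follow directly from the event rates. As a minimal worked example (the helper function is ours, for illustration), the overall adverse-event rates quoted above give:

    # Relative risk from the reported overall adverse-event rates.
    def relative_risk(rate_treated, rate_comparator):
        """Ratio of event rates between two treatment groups."""
        return rate_treated / rate_comparator

    rr = relative_risk(0.023, 0.020)  # 2.3% vs 2.0% adverse events
    print(f"Relative risk: {rr:.2f} (~{rr - 1:.0%} higher risk)")
    # -> Relative risk: 1.15 (~15% higher risk)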

"Our study shows that there are more adverse events when amoxicillin-clavulanate is used," Savage said. "Based on these data, physicians should seriously consider prescribing amoxicillin as a first line of defense against acute sinusitis."

Not all acute sinusitis cases are caused by bacterial infections; a previous study found that viruses may be responsible for up to 32% of instances. Still, because the symptoms of bacterial and viral sinusitis can be nearly indistinguishable, many doctors opt to treat a patient with antibiotics first and monitor whether the infection clears. Around 85% of children who present with acute sinusitis receive an antibiotic, with amoxicillin and amoxicillin-clavulanate accounting for 65% of those prescriptions. The current study did not include microbiologic data, so the authors could not discern whether acute sinusitis diagnoses were due to viral or bacterial infections. As this was not a randomized clinical trial, the study authors also acknowledge the possibility that residual bias could have influenced the results, although they re-analyzed the data several different ways to try to mitigate this, with no difference in the results.

Two previous studies that compared clinical outcomes of the two drugs were conducted more than 20 years ago. Those analyses showed that both medications alleviated symptoms at similar rates, but both studies were limited by a combined sample size of under 300 patients. Bacterial species have evolved significantly in the last twenty years, a fact that convinced Savage and his team to launch a new, larger study comparing treatment failure rates of both drugs.

"If a physician is trying to decide between these two drugs, they can look at these results and see that 98% of kids got better regardless of whether they were prescribed amoxicillin or amoxicillin-clavulanate," Savage said. "The chance that a child will end up in the hospital after using these drugs is less than one in a thousand. That should provide some reassurance that a child is going to do pretty well regardless of the antibiotic."

Disclosures: Savage reports an institutional contract from UCB outside the submitted work, and co-author Krista Huybrechts reports institutional contracts from UCB and Takeda outside the submitted work.

This study was supported by the Eunice Kennedy Shriver National Institute of Child Health & Human Development (T32HD040128 and K08HD110600). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Dr. Savage had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.

Novel C. diff structures are required for infection, offer new therapeutic targets

  Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...