Sunday, 26 April 2020

Researchers achieve remote control of hormone release

Abnormal levels of stress hormones such as adrenaline and cortisol are linked to a variety of mental health disorders, including depression and posttraumatic stress disorder (PTSD). MIT researchers have now devised a way to remotely control the release of these hormones from the adrenal gland, using magnetic nanoparticles.
This approach could help scientists to learn more about how hormone release influences mental health, and could eventually offer a new way to treat hormone-linked disorders, the researchers say.
"We're looking how can we study and eventually treat stress disorders by modulating peripheral organ function, rather than doing something highly invasive in the central nervous system," says Polina Anikeeva, an MIT professor of materials science and engineering and of brain and cognitive sciences.
To achieve control over hormone release, Dekel Rosenfeld, an MIT-Technion postdoc in Anikeeva's group, has developed specialized magnetic nanoparticles that can be injected into the adrenal gland. When exposed to a weak magnetic field, the particles heat up slightly, activating heat-responsive channels that trigger hormone release. This technique can be used to stimulate an organ deep in the body with minimal invasiveness.
Anikeeva and Alik Widge, an assistant professor of psychiatry at the University of Minnesota and a former research fellow at MIT's Picower Institute for Learning and Memory, are the senior authors of the study. Rosenfeld is the lead author of the paper, which appears today in Science Advances.
Controlling hormones
Anikeeva's lab has previously devised several novel magnetic nanomaterials, including particles that can release drugs at precise times in specific locations in the body.
In the new study, the research team wanted to explore the idea of treating disorders of the brain by manipulating organs that are outside the central nervous system but influence it through hormone release. One well-known example is the hypothalamic-pituitary-adrenal (HPA) axis, which regulates stress response in mammals. Hormones secreted by the adrenal gland, including cortisol and adrenaline, play important roles in depression, stress, and anxiety.
"Some disorders that we consider neurological may be treatable from the periphery, if we can learn to modulate those local circuits rather than going back to the global circuits in the central nervous system," says Anikeeva, who is a member of MIT's Research Laboratory of Electronics and McGovern Institute for Brain Research.
As a target to stimulate hormone release, the researchers decided on ion channels that control the flow of calcium into adrenal cells. Those ion channels can be activated by a variety of stimuli, including heat. When calcium flows through the open channels into adrenal cells, the cells begin pumping out hormones. "If we want to modulate the release of those hormones, we need to be able to essentially modulate the influx of calcium into adrenal cells," Rosenfeld says.
Unlike previous work from Anikeeva's group, this study applied magnetothermal stimulation to modulate cell function without artificially introducing any genes.
To stimulate these heat-sensitive channels, which naturally occur in adrenal cells, the researchers designed nanoparticles made of magnetite, a type of iron oxide that forms tiny magnetic crystals about 1/5000 the thickness of a human hair. In rats, they found these particles could be injected directly into the adrenal glands and remain there for at least six months. When the rats were exposed to a weak magnetic field -- about 50 millitesla, 100 times weaker than the fields used for magnetic resonance imaging (MRI) -- the particles heated up by about 6 degrees Celsius, enough to trigger the calcium channels to open without damaging any surrounding tissue.
The heat-sensitive channel that they targeted, known as TRPV1, is found in many sensory neurons throughout the body, including pain receptors. TRPV1 channels can be activated by capsaicin, the organic compound that gives chili peppers their heat, as well as by temperature. They are found across mammalian species, and belong to a family of many other channels that are also sensitive to heat.
This stimulation triggered a hormone rush -- doubling cortisol production and boosting noradrenaline by about 25 percent. That led to a measurable increase in the animals' heart rates.
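The physics here is simple enough to sanity-check. Below is a minimal back-of-envelope sketch; the 37 C body temperature and the roughly 43 C TRPV1 activation threshold are textbook approximations, not values taken from the paper.

```python
# Back-of-envelope check of the magnetothermal stimulus described above.
# Body temperature and the TRPV1 threshold are textbook approximations,
# not values from the paper.

BODY_TEMP_C = 37.0        # approximate mammalian core temperature
TRPV1_THRESHOLD_C = 43.0  # commonly cited TRPV1 heat-activation threshold
PARTICLE_HEATING_C = 6.0  # local heating reported for the magnetite particles

local_temp = BODY_TEMP_C + PARTICLE_HEATING_C
print(f"local tissue temperature: {local_temp:.0f} C")
print("TRPV1 opens" if local_temp >= TRPV1_THRESHOLD_C else "below threshold")

# Field-strength comparison from the article: 50 mT, said to be ~100x weaker
# than MRI fields, implies an MRI reference of about 5 T.
print(f"implied MRI reference field: {50e-3 * 100:.0f} T")
```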
Treating stress and pain
The researchers now plan to use this approach to study how hormone release affects PTSD and other disorders, and they say it could eventually be adapted to treat such disorders. It would offer a much less invasive alternative to potential treatments that electrically stimulate hormone release through an implanted medical device, which is not feasible, the researchers say, in organs such as the adrenal glands that are soft and highly vascularized.
Another area where this strategy could hold promise is in the treatment of pain, because heat-sensitive ion channels are often found in pain receptors.
"Being able to modulate pain receptors with this technique potentially will allow us to study pain, control pain, and have some clinical applications in the future, which hopefully may offer an alternative to medications or implants for chronic pain," Anikeeva says. With further investigation of the existence of TRPV1 in other organs, the technique can potentially be extended to other peripheral organs such as the digestive system and the pancreas.

Milky Way could be catapulting stars into its outer halo

Though mighty, the Milky Way and galaxies of similar mass are not without scars chronicling turbulent histories. University of California, Irvine astronomers and others have shown that clusters of supernovas can cause the birth of scattered, eccentrically orbiting suns in outer stellar halos, upending commonly held notions of how star systems have formed and evolved over billions of years.
Hyper-realistic, cosmologically self-consistent computer simulations from the Feedback in Realistic Environments 2 (FIRE-2) project enabled the scientists to model the disruptions in otherwise orderly galactic rotations. The team's work is the subject of a study published today in the Monthly Notices of the Royal Astronomical Society.
"These highly accurate numerical simulations have shown us that it's likely the Milky Way has been launching stars in circumgalactic space in outflows triggered by supernova explosions," said senior author James Bullock, dean of UCI's School of Physical Sciences and a professor of physics & astronomy. "It's fascinating, because when multiple big stars die, the resulting energy can expel gas from the galaxy, which in turn cools, causing new stars to be born."
Bullock said the diffuse distribution of stars in the stellar halo that extends far outside the classical disk of a galaxy is where the "archeological record" of the system exists. Astronomers have long assumed that galaxies are assembled over lengthy periods of time as smaller star groupings come in and are dismembered by the larger body, a process that ejects some stars into distant orbits. But the UCI team is proposing "supernova feedback" as a different source for as many as 40 percent of these outer-halo stars.
Lead author Sijie Yu, a UCI Ph.D. candidate in physics, said the findings were made possible partly by the availability of a powerful new set of tools.
"The FIRE-2 simulations allow us to generate movies that make it seem as though you're observing a real galaxy," she noted. "They show us that as the galaxy center is rotating, a bubble driven by supernova feedback is developing with stars forming at its edge. It looks as though the stars are being kicked out from the center."
Bullock said he did not expect to see such an arrangement because stars are such tight, incredibly dense balls that are generally not subject to being moved relative to the background of space. "Instead, what we're witnessing is gas being pushed around," he said, "and that gas subsequently cools and makes stars on its way out."
The researchers said that while their conclusions have been drawn from simulations of galaxies forming, growing and evolving to the present day, there is actually a fair amount of observational evidence that stars are forming in outflows from galactic centers to their halos.
"In plots that compare data from the European Space Agency's Gaia mission -- which provides a 3D velocity chart of stars in the Milky Way -- with other maps that show stellar density and metallicity, we can see structures similar to those produced by outflow stars in our simulations," Yu said.
Bullock added that mature, heavier, metal-rich stars like our sun rotate around the center of the galaxy at a predictable speed and trajectory. But the low-metallicity stars, which have been subjected to fewer generations of fusion than our sun, can be seen rotating in the opposite direction.
He said that over the lifespan of a galaxy, the fraction of stars produced in supernova bubble outflows is small, around 2 percent. But during the parts of galaxies' histories when starburst events are booming, as many as 20 percent of stars are being formed this way.
"There are some current projects looking at galaxies that are considered to be very 'starbursting' right now," Yu said. "Some of the stars in these observations also look suspiciously like they're getting ejected from the center."

Faster-degrading plastic could promise cleaner seas

To address plastic pollution plaguing the world's seas and waterways, Cornell University chemists have developed a new polymer that degrades under ultraviolet radiation, according to research published in the Journal of the American Chemical Society.
"We have created a new plastic that has the mechanical properties required by commercial fishing gear. If it eventually gets lost in the aquatic environment, this material can degrade on a realistic time scale," said lead researcher Bryce Lipinski, a doctoral candidate in the laboratory of Geoff Coates, professor of chemistry and chemical biology at Cornell University. "This material could reduce persistent plastic accumulation in the environment."
Commercial fishing contributes to about half of all floating plastic waste that ends up in the oceans, Lipinski said. Fishing nets and ropes are primarily made from three kinds of polymers: isotactic polypropylene, high-density polyethylene, and nylon-6,6, none of which readily degrade.
"While research of degradable plastics has received much attention in recent years," he said, "obtaining a material with the mechanical strength comparable to commercial plastic remains a difficult challenge."
Coates and his research team have spent the past 15 years developing this plastic, called isotactic polypropylene oxide, or iPPO. Although it was first discovered in 1949, the mechanical strength and photodegradation of this material were unknown before this recent work. The high isotacticity (enchainment regularity) and polymer chain length of their material make it distinct from its historic predecessor and provide its mechanical strength.
Lipinski noted that while iPPO is stable in ordinary use, it eventually breaks down when exposed to UV light. The change in the plastic's composition is evident in the laboratory, but "visually, it may not appear to have changed much during the process," he said.
The rate of degradation is light intensity-dependent, but under their laboratory conditions, he said, the polymer chain lengths degraded to a quarter of their original length after 30 days of exposure.
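To put a number on that observation, one can fit a simple random-scission model, in which each backbone bond breaks at a constant rate k so that 1/DP(t) = 1/DP0 + kt for the number-average degree of polymerization DP. This is an illustrative calculation under an assumed starting chain length, not the authors' analysis.

```python
# Illustrative random-scission fit to "chains at 1/4 length after 30 days".
# DP0 is an assumed starting degree of polymerization, not a value from the paper.

DP0 = 1000.0   # hypothetical initial chain length (repeat units)
t_obs = 30.0   # days of UV exposure reported in the article
ratio = 0.25   # observed fraction of the original chain length

# Random scission: 1/DP(t) = 1/DP0 + k*t  ->  solve for k.
k = (1.0 / (ratio * DP0) - 1.0 / DP0) / t_obs
print(f"scission rate: {k:.1e} breaks per bond per day")

# Projected chain length under continued exposure at the same intensity.
for t in (30, 60, 90, 120):
    dp = 1.0 / (1.0 / DP0 + k * t)
    print(f"day {t:3d}: DP ~ {dp:5.0f} ({dp / DP0:.0%} of original)")
```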
Ultimately, Lipinski and other scientists want to leave no trace of the polymer in the environment. He notes there is literature precedent for the biodegradation of small chains of iPPO, which could effectively make the polymer disappear, but ongoing efforts aim to prove this.

Exoplanet apparently disappears in latest Hubble observations

Now you see it, now you don't.
What astronomers thought was a planet beyond our solar system has now seemingly vanished from sight. Vanishing planets are a staple of science fiction, such as Superman's home planet Krypton exploding, but astronomers are looking for a plausible real-world explanation.
One interpretation is that the object, first photographed in 2004, was never a full-sized planet at all, but a vast, expanding cloud of dust produced in a collision between two large bodies orbiting the bright nearby star Fomalhaut. Follow-up observations could confirm this extraordinary conclusion.
"These collisions are exceedingly rare and so this is a big deal that we actually get to see one," said András Gáspár of the University of Arizona, Tucson. "We believe that we were at the right place at the right time to have witnessed such an unlikely event with NASA's Hubble Space Telescope."
"The Fomalhaut system is the ultimate test lab for all of our ideas about how exoplanets and star systems evolve," added George Rieke of the University of Arizona's Steward Observatory. "We do have evidence of such collisions in other systems, but none of this magnitude has been observed in our solar system. This is a blueprint of how planets destroy each other."
The object, called Fomalhaut b, was first announced in 2008, based on data taken in 2004 and 2006. It was clearly visible in several years of Hubble observations that revealed it was a moving dot. Until then, evidence for exoplanets had mostly been inferred through indirect detection methods, such as subtle back-and-forth stellar wobbles, and shadows from planets passing in front of their stars.
Unlike other directly imaged exoplanets, however, nagging puzzles arose with Fomalhaut b early on. The object was unusually bright in visible light, but did not have any detectable infrared heat signature. Astronomers conjectured that the added brightness came from a huge shell or ring of dust encircling the planet that may possibly have been collision-related. The orbit of Fomalhaut b also appeared unusual, possibly very eccentric.
"Our study, which analyzed all available archival Hubble data on Fomalhaut revealed several characteristics that together paint a picture that the planet-sized object may never have existed in the first place," said Gáspár.
The team says the final nail in the coffin came when their analysis of Hubble images taken in 2014 showed, to their disbelief, that the object had vanished. Adding to the mystery, earlier images showed the object continuously fading over time. "Clearly, Fomalhaut b was doing things a bona fide planet should not be doing," said Gáspár.
The interpretation is that Fomalhaut b is not a planet but a slowly expanding debris cloud from a smashup that blasted dust into space. Taking into account all available data, Gáspár and Rieke think the collision occurred not long before the first observations in 2004. By now the debris cloud, consisting of dust particles around 1 micron across (1/50th the diameter of a human hair), is below Hubble's detection limit, and is estimated to have expanded to a size larger than the orbit of Earth around our Sun.
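Those figures imply a brisk but sub-kilometre-per-second expansion. A rough estimate, assuming the cloud reached a radius of about 1 AU in the decade between the 2004 images and the 2014 disappearance (a simplified reading of the numbers above, not the paper's fitted expansion rate):

```python
# Rough expansion-speed estimate implied by the article's figures.
AU_M = 1.496e11           # astronomical unit in metres
YEAR_S = 3.156e7          # year in seconds

radius_au = 1.0           # "larger than Earth's orbit" -> radius of roughly 1 AU
elapsed_yr = 2014 - 2004  # first images to disappearance in Hubble data

v = radius_au * AU_M / (elapsed_yr * YEAR_S)
print(f"mean expansion speed: ~{v:.0f} m/s ({v * 3.6:.0f} km/h)")
```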
Equally confounding is that the team reports that the object is more likely on an escape path, rather than on an elliptical orbit, as expected for planets. This is based on the researchers adding later observations to the trajectory plots from earlier data. "A recently created massive dust cloud, experiencing considerable radiative forces from the central star Fomalhaut, would be placed on such a trajectory," said Gáspár. "Our model is naturally able to explain all independent observable parameters of the system: its expansion rate, its fading, and its trajectory."
Because Fomalhaut b is presently inside a vast ring of icy debris encircling the star, colliding bodies would likely be a mixture of ice and dust, like the comets that exist in the Kuiper belt on the outer fringe of our solar system. Gáspár and Rieke estimate that each of these comet-like bodies measured about 125 miles (200 kilometers) across (roughly half the size of the asteroid Vesta).
According to the authors, their model explains all the observed characteristics of Fomalhaut b, and sophisticated dust dynamical modeling done on a cluster of computers at the University of Arizona shows that the model fits all the observations quantitatively. According to the authors' calculations, the Fomalhaut system, located about 25 light-years from Earth, may experience one of these events only every 200,000 years.
Gáspár and Rieke -- along with other members of an extended team -- will also be observing the Fomalhaut system with NASA's upcoming James Webb Space Telescope in its first year of science operations. The team will be directly imaging the inner warm regions of the system, spatially resolving for the first time the elusive asteroid-belt component of an extrasolar planetary system. The team will also search for bona fide planets orbiting Fomalhaut that might be gravitationally sculpting the outer disk. They will also analyze the chemical composition of the disk.

Stabilizing brain-computer interfaces

Researchers from Carnegie Mellon University (CMU) and the University of Pittsburgh (Pitt) have published research in Nature Biomedical Engineering that could drastically improve the ability of brain-computer interfaces to remain stable during use, greatly reducing or potentially eliminating the need to recalibrate these devices during or between experiments.
Brain-computer interfaces (BCIs) are devices that enable individuals with motor disabilities such as paralysis to control prosthetic limbs, computer cursors, and other interfaces using only their minds. One of the biggest problems facing BCIs in a clinical setting is instability in the neural recordings themselves. Over time, the signals picked up by a BCI can vary, and as a result of this variation an individual can lose the ability to control their BCI.
As a result of this loss of control, researchers ask the user to go through a recalibration session, which requires them to stop what they're doing and reset the connection between their mental commands and the tasks being performed. Typically, another human technician is involved just to get the system to work.
"Imagine if every time we wanted to use our cell phone, to get it to work correctly, we had to somehow calibrate the screen so it knew what part of the screen we were pointing at," says William Bishop, who was previously a PhD student and postdoctoral fellow in the Department of Machine Learning at CMU and is now a fellow at Janelia Farm Research Campus. "The current state of the art in BCI technology is sort of like that. Just to get these BCI devices to work, users have to do this frequent recalibration. So that's extremely inconvenient for the users, as well as the technicians maintaining the devices."
The paper, "A stabilized brain-computer interface based on neural manifold alignment," presents a machine learning algorithm that accounts for these varying signals and allows the individual to continue controlling the BCI in the presence of these instabilities. By leveraging the finding that neural population activity resides in a low-dimensional "neural manifold," the researchers can stabilize neural activity to maintain good BCI performance in the presence of recording instabilities.
"When we say 'stabilization,' what we mean is that our neural signals are unstable, possibly because we're recording from different neurons across time," explains Alan Degenhart, a postdoctoral researcher in electrical and computer engineering at CMU. "We have figured out a way to take different populations of neurons across time and use their information to essentially reveal a common picture of the computation that's going on in the brain, thereby keeping the BCI calibrated despite neural instabilities."
The researchers aren't the first to propose a method for self-recalibration; the problem of unstable neural recordings has been a long-standing one. A few studies have proposed self-recalibration procedures, but those struggle when the instabilities are severe. The method presented in this paper can recover from catastrophic instabilities because it doesn't rely on the subject performing well during the recalibration.
"Let's say that the instability were so large such that the subject were no longer able to control the BCI," explains Byron Yu, a professor of electrical and computer engineering and biomedical engineering at CMU. "Existing self-recalibration procedures are likely to struggle in that scenario, whereas in our method, we've demonstrated it can in many cases recover from those catastrophic instabilities."
"Neural recording instabilities are not well characterized, but it's a very large problem," says Emily Oby, a postdoctoral researcher in neurobiology at Pitt. "There's not a lot of literature we can point to, but anecdotally, a lot of the labs that do clinical research with BCI have to deal with this issue quite frequently. This work has the potential to greatly improve the clinical viability of BCIs, and to help stabilize other neural interfaces."

Origins of human language pathway in the brain at least 25 million years old

Scientists have discovered an earlier origin to the human language pathway in the brain, pushing back its evolutionary origin by at least 20 million years.
Previously, a precursor of the language pathway was thought by many scientists to have emerged more recently, about 5 million years ago, with a common ancestor of both apes and humans.
For neuroscientists, this is comparable to finding a fossil that illuminates evolutionary history. Unlike bones, however, brains do not fossilize. Instead, neuroscientists must infer what the brains of common ancestors may have been like by studying brain scans of living primates and comparing them to humans.
Professor Chris Petkov of the Faculty of Medical Sciences, Newcastle University, UK, the study lead, said: "It is like finding a new fossil of a long-lost ancestor. It is also exciting that there may be an older origin yet to be discovered still."
The international team of European and US scientists carried out the brain imaging study and analysis of auditory regions and brain pathways in humans, apes and monkeys, which is published in Nature Neuroscience.
They discovered a segment of this language pathway in the human brain that interconnects the auditory cortex with frontal lobe regions, important for processing speech and language. Although speech and language are unique to humans, the link via the auditory pathway in other primates suggests an evolutionary basis in auditory cognition and vocal communication.
Professor Petkov added: "We predicted but could not know for sure whether the human language pathway may have had an evolutionary basis in the auditory system of nonhuman primates. I admit we were astounded to see a similar pathway hiding in plain sight within the auditory system of nonhuman primates."
Remarkable transformation
The study also illuminates the remarkable transformation of the human language pathway. A key difference unique to humans was found: the left side of this brain pathway was stronger, and the right side appears to have diverged from the auditory evolutionary prototype to involve non-auditory parts of the brain.
The study relied on brain scans openly shared by the global scientific community. It also generated original new brain scans that are globally shared to inspire further discovery. And because the authors predict that the auditory precursor to the human language pathway may be even older, the work inspires the neurobiological search for its earliest evolutionary origin -- the next brain 'fossil' -- in animals more distantly related to humans.

Rising carbon dioxide causes more than a climate crisis -- it may directly harm our ability to think

As the 21st century progresses, rising atmospheric carbon dioxide (CO2) concentrations will cause urban and indoor levels of the gas to increase, and that may significantly reduce our basic decision-making ability and complex strategic thinking, according to a new CU Boulder-led study. By the end of the century, people could be exposed to indoor CO2 levels up to 1400 parts per million -- more than three times today's outdoor levels, and well beyond what humans have ever experienced.
"It's amazing how high CO2 levels get in enclosed spaces," said Kris Karnauskas, CIRES Fellow, associate professor at CU Boulder and lead author of the new study published today in the AGU journal GeoHealth. "It affects everybody -- from little kids packed into classrooms to scientists, business people and decision makers to regular folks in their houses and apartments."
Shelly Miller, professor in CU Boulder's school of engineering and coauthor adds that "building ventilation typically modulates CO2 levels in buildings, but there are situations when there are too many people and not enough fresh air to dilute the CO2." CO2 can also build up in poorly ventilated spaces over longer periods of time, such as overnight while sleeping in bedrooms, she said.
Put simply, when we breathe air with high CO2 levels, the CO2 levels in our blood rise, reducing the amount of oxygen that reaches our brains. Studies show that this can increase sleepiness and anxiety, and impair cognitive function.
We all know the feeling: Sit too long in a stuffy, crowded lecture hall or conference room and many of us begin to feel drowsy or dull. In general, CO2 concentrations are higher indoors than outdoors, the authors wrote. And outdoor CO2 in urban areas is higher than in pristine locations. The CO2 concentration in a building reflects both the gas exchanged with the outdoor air and the CO2 generated by the building's occupants as they exhale.
Atmospheric CO2 levels have been rising since the Industrial Revolution, reaching a 414 ppm peak at NOAA's Mauna Loa Observatory in Hawaii in 2019. In a scenario in which people on Earth do not reduce greenhouse gas emissions, the Intergovernmental Panel on Climate Change predicts outdoor CO2 levels could climb to 930 ppm by 2100. And urban areas typically run around 100 ppm higher than this background.
Karnauskas and his colleagues developed a comprehensive approach that combines predicted future outdoor CO2 concentrations, the impact of localized urban emissions, a model of the relationship between indoor and outdoor CO2 levels, and the impact on human cognition. They found that if outdoor CO2 concentrations do rise to 930 ppm, that would nudge indoor concentrations to a harmful level of 1400 ppm.
"At this level, some studies have demonstrated compelling evidence for significant cognitive impairment," said Anna Schapiro, assistant professor of psychology at the University of Pennsylvania and a coauthor on the study. "Though the literature contains some conflicting findings and much more research is needed, it appears that high level cognitive domains like decision-making and planning are especially susceptible to increasing CO2 concentrations."
In fact, at 1400 ppm, CO2 concentrations may cut our basic decision-making ability by 25 percent, and complex strategic thinking by around 50 percent, the authors found.
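The indoor-outdoor relationship at the heart of that projection can be captured by a one-line steady-state mass balance: indoor CO2 equals outdoor CO2 plus occupant generation divided by outdoor-air ventilation. The sketch below uses typical generation and ventilation figures of my own choosing, not the study's model.

```python
# Steady-state box model of indoor CO2 (illustrative parameters, not the study's).

def indoor_co2_ppm(outdoor_ppm, vent_lps_per_person, gen_lps_per_person=0.0046):
    """indoor = outdoor + generation/ventilation.

    gen_lps_per_person: CO2 exhaled per occupant (~0.0046 L/s, seated adult).
    vent_lps_per_person: outdoor-air supply per occupant (L/s).
    """
    return outdoor_ppm + (gen_lps_per_person / vent_lps_per_person) * 1e6

# Urban outdoor levels: today's ~414 ppm background vs the 930 ppm projection,
# each plus the ~100 ppm urban increment mentioned above.
for label, outdoor in (("today", 414 + 100), ("2100 projection", 930 + 100)):
    print(f"{label:15s}: indoor ~{indoor_co2_ppm(outdoor, 10):.0f} ppm")
print("study's harmful benchmark: 1400 ppm")
```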
The cognitive impacts of rising CO2 levels represent what scientists call a "direct" effect of the gas' concentration, much like ocean acidification. In both cases, elevated CO2 itself -- not the subsequent warming it also causes -- is what triggers harm.
The team says there may be ways to adapt to higher indoor CO2 levels, but the best way to prevent indoor CO2 from reaching harmful levels is to reduce fossil fuel emissions. This would require globally adopted mitigation strategies such as those set forth by the Paris Agreement of the United Nations Framework Convention on Climate Change.

Key nose cells identified as likely COVID-19 virus entry points

Two specific cell types in the nose have been identified as likely initial infection points for COVID-19 coronavirus. Scientists discovered that goblet and ciliated cells in the nose have high levels of the entry proteins that the COVID-19 virus uses to get into our cells. The identification of these cells by researchers from the Wellcome Sanger Institute, University Medical Centre Groningen, University Cote d'Azur and CNRS, Nice and their collaborators, as part of the Human Cell Atlas Lung Biological Network, could help explain the high transmission rate of COVID-19.
Reported today (23rd April) in Nature Medicine, this first publication with the Lung Biological Network is part of an ongoing international effort to use Human Cell Atlas data to understand infection and disease. It further shows that cells in the eye and some other organs also contain the viral-entry proteins. The study also predicts how a key entry protein is regulated with other immune system genes and reveals potential targets for the development of treatments to reduce transmission.
Novel coronavirus disease -- COVID-19 -- affects the lungs and airways. Patients' symptoms can be flu-like, including fever, coughing and sore throat, while some people may not experience symptoms but still carry transmissible virus. In the worst cases, the virus causes pneumonia that can ultimately lead to death. The virus is thought to be spread through respiratory droplets produced when an infected person coughs or sneezes, and appears to be easily transmitted within affected areas. So far the virus has spread to more than 184 countries and claimed more than 180,000 lives.
Scientists around the world are trying to understand exactly how the virus spreads, to help prevent transmission and develop a vaccine. While it is known that the virus that causes COVID-19 disease, known as SARS-CoV-2, uses a similar mechanism to infect our cells as a related coronavirus that caused the 2003 SARS epidemic, the exact cell types involved in the nose had not previously been pinpointed.
To discover which cells could be involved in COVID-19 transmission, researchers analysed multiple Human Cell Atlas (HCA) consortium datasets of single cell RNA sequencing, from more than 20 different tissues of non-infected people. These included cells from the lung, nasal cavity, eye, gut, heart, kidney and liver. The researchers looked for which individual cells expressed both of two key entry proteins that are used by the COVID-19 virus to infect our cells.
Dr Waradon Sungnak, the first author on the paper from Wellcome Sanger Institute, said: "We found that the receptor protein -- ACE2 -- and the TMPRSS2 protease that can activate SARS-CoV-2 entry are expressed in cells in different organs, including the cells on the inner lining of the nose. We then revealed that mucus-producing goblet cells and ciliated cells in the nose had the highest levels of both these COVID-19 virus proteins, of all cells in the airways. This makes these cells the most likely initial infection route for the virus."
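In essence, the screen asks, for every annotated cell type, what fraction of cells have nonzero counts for both ACE2 and TMPRSS2. The toy below reproduces that logic on simulated counts; the per-cell-type expression rates are invented stand-ins for the real Human Cell Atlas data.

```python
import numpy as np
import pandas as pd

# Toy co-expression screen on simulated single-cell counts (illustrative only).
rng = np.random.default_rng(1)
rates = {  # invented mean UMI counts per cell type: (ACE2, TMPRSS2)
    "goblet": (0.6, 1.5), "ciliated": (0.5, 1.2),
    "basal": (0.1, 0.4), "club": (0.2, 0.8),
}
frames = [
    pd.DataFrame({
        "cell_type": ct,
        "ACE2": rng.poisson(r1, 500),
        "TMPRSS2": rng.poisson(r2, 500),
    })
    for ct, (r1, r2) in rates.items()
]
cells = pd.concat(frames, ignore_index=True)

# Fraction of each cell type expressing BOTH entry factors.
cells["double_positive"] = (cells["ACE2"] > 0) & (cells["TMPRSS2"] > 0)
print(cells.groupby("cell_type")["double_positive"].mean()
           .sort_values(ascending=False).round(3))
```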
Dr Martijn Nawijn, from the University Medical Center Groningen in the Netherlands, said, on behalf of the HCA Lung Biological Network: "This is the first time these particular cells in the nose have been associated with COVID-19. While there are many factors that contribute to virus transmissibility, our findings are consistent with the rapid infection rates of the virus seen so far. The location of these cells on the surface of the inside of the nose makes them highly accessible to the virus, and may also assist with transmission to other people."
The two key entry proteins ACE2 and TMPRSS2 were also found in cells in the cornea of the eye and in the lining of the intestine. This suggests another possible route of infection via the eye and tear ducts, and also points to a potential for fecal-oral transmission.
When cells are damaged or fighting an infection, various immune genes are activated. The study showed that ACE2 receptor production in the nose cells is probably switched on at the same time as these other immune genes.
The work was carried out as part of the global Human Cell Atlas consortium which aims to create reference maps of all human cells to understand health and disease. More than 1,600 people across 70 countries are involved in the HCA community, and the data is openly available to scientists worldwide.
Dr Sarah Teichmann, a senior author from the Wellcome Sanger Institute and co-chair of the HCA Organising Committee, said: "As we're building the Human Cell Atlas it is already being used to understand COVID-19 and identify which of our cells are critical for initial infection and transmission. This information can be used to better understand how coronavirus spreads. Knowing which exact cell types are important for virus transmission also provides a basis for developing potential treatments to reduce the spread of the virus."
The global HCA Lung Biological Network continues to analyse the data in order to provide further insights into the cells and targets likely to be involved in COVID-19, and to relate them to patient characteristics.

Saturday, 18 April 2020

Strongest evidence yet that neutrinos explain how the universe exists

New data lend more support to the theory that neutrinos are the reason the universe is dominated by matter.
The current laws of physics do not explain why matter persists over antimatter -- why the universe is made of 'stuff'. Scientists believe equal amounts of matter and antimatter were created at the beginning of the universe, but this would mean they should have wiped each other out, annihilating the universe as it began.
Instead, physicists suggest there must be differences in the way matter and antimatter behave that explain why matter persisted and now dominates the universe. Each particle of matter has an antimatter equivalent, and neutrinos are no different, with an antimatter equivalent called antineutrinos.
They should be exact opposites in their properties and behaviour, which is what makes them annihilate each other on contact.
Now, an international team of researchers that make up the T2K Collaboration, including Imperial College London scientists, have found the strongest evidence yet that neutrinos and antineutrinos behave differently, and therefore may not wipe each other out.
Dr Patrick Dunne, from the Department of Physics at Imperial, said: "This result brings us closer than ever before to answering the fundamental question of why the matter in our universe exists. If confirmed -- at the moment we're over 95 per cent sure -- it will have profound implications for physics and should point the way to a better understanding of how our universe evolved."
Previously, scientists have found some differences in behaviour between matter and antimatter versions of subatomic particles called quarks, but the differences observed so far do not seem to be large enough to account for the dominance of matter in the universe.
However, T2K's new result indicates that the differences in the behaviour of neutrinos and antineutrinos appear to be quite large. Neutrinos are fundamental particles but interact with normal matter only very weakly, such that around 50 trillion neutrinos from the Sun pass through your body every second.
Neutrinos and antineutrinos can come in three 'flavours', known as muon, electron and tau. As they travel, they can 'oscillate' -- changing into a different flavour. The fact that muon neutrinos oscillate into electron neutrinos was first discovered by the T2K experiment in 2013.
To get the new result, the team fired beams of muon neutrinos and antineutrinos from the J-PARC facility at Tokai, Japan, and detected how many electron neutrinos and antineutrinos arrived at the Super-Kamiokande detector 295km away.
They looked for differences in how the neutrinos or antineutrinos changed flavour, finding neutrinos appear to be much more likely to change than antineutrinos.
The available data also strongly discount the possibility that neutrinos and antineutrinos are just as likely as each other to change flavour. Dr Dunne said: "What our result shows is that we're more than 95 per cent sure that matter neutrinos and antineutrinos behave differently. This is big news in itself; however we do already know of other particles that have matter-antimatter differences that are too small to explain our matter-dominated universe.
"Therefore, measuring the size of the difference is what matters for determining whether neutrinos can answer this fundamental question. Our result today finds that unlike for other particles, the result in neutrinos is compatible with many of the theories explaining the origin of the universe's matter dominance."
While the result is the strongest evidence yet that neutrinos and antineutrinos behave differently, the T2K Collaboration is working to reduce any uncertainties and gather more data by upgrading the detectors and beamlines, including the new Hyper-Kamiokande detector to replace the Super-Kamiokande. A new experiment, called DUNE, is also under construction in the US. Imperial is involved in both.
Imperial researchers have been involved in the T2K Collaboration since 2004, starting with conceptual designs on whiteboards and research and development on novel particle detector components that were key to building this experiment, which was finally completed and turned on in 2010.
For the latest result, the team contributed to the statistical analysis of the results, to ensuring that the observed signal is real, and to modelling how neutrinos interact with matter, which is one of the largest uncertainties in the analysis.

ESO telescope sees star dance around supermassive black hole, proves Einstein right

Observations made with ESO's Very Large Telescope (VLT) have revealed for the first time that a star orbiting the supermassive black hole at the centre of the Milky Way moves just as predicted by Einstein's general theory of relativity. Its orbit is shaped like a rosette and not like an ellipse as predicted by Newton's theory of gravity. This long-sought-after result was made possible by increasingly precise measurements over nearly 30 years, which have enabled scientists to unlock the mysteries of the behemoth lurking at the heart of our galaxy.
"Einstein's General Relativity predicts that bound orbits of one object around another are not closed, as in Newtonian Gravity, but precess forwards in the plane of motion. This famous effect -- first seen in the orbit of the planet Mercury around the Sun -- was the first evidence in favour of General Relativity. One hundred years later we have now detected the same effect in the motion of a star orbiting the compact radio source Sagittarius A* at the centre of the Milky Way. This observational breakthrough strengthens the evidence that Sagittarius A* must be a supermassive black hole of 4 million times the mass of the Sun," says Reinhard Genzel, Director at the Max Planck Institute for Extraterrestrial Physics (MPE) in Garching, Germany and the architect of the 30-year-long programme that led to this result.
Located 26,000 light-years from the Sun, Sagittarius A* and the dense cluster of stars around it provide a unique laboratory for testing physics in an otherwise unexplored and extreme regime of gravity. One of these stars, S2, sweeps in towards the supermassive black hole to a closest distance less than 20 billion kilometres (one hundred and twenty times the distance between the Sun and Earth), making it one of the closest stars ever found in orbit around the massive giant. At its closest approach to the black hole, S2 is hurtling through space at almost three percent of the speed of light, completing an orbit once every 16 years. "After following the star in its orbit for over two and a half decades, our exquisite measurements robustly detect S2's Schwarzschild precession in its path around Sagittarius A*," says Stefan Gillessen of the MPE, who led the analysis of the measurements published today in the journal Astronomy & Astrophysics.
Most stars and planets have a non-circular orbit and therefore move closer to and further away from the object they orbit. S2's orbit precesses, meaning that the location of its closest point to the supermassive black hole changes with each turn, such that the next orbit is rotated with regard to the previous one, creating a rosette shape. General Relativity provides a precise prediction of how much its orbit changes, and the latest measurements from this research exactly match the theory. This effect, known as Schwarzschild precession, had never before been measured for a star around a supermassive black hole.
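For a bound orbit, the Schwarzschild precession per revolution is Delta-phi = 6*pi*G*M / (c^2 * a * (1 - e^2)). Plugging in approximate published values for S2 (the ~4-million-solar-mass figure quoted above, a semi-major axis near 970 AU and eccentricity near 0.88) reproduces the roughly 12-arcminute-per-orbit shift; treat these as ballpark inputs rather than the paper's fitted parameters.

```python
import numpy as np

# Schwarzschild precession per orbit: 6*pi*G*M / (c^2 * a * (1 - e^2)).
# S2 parameters below are approximate published values, not the paper's fit.
G, c = 6.674e-11, 2.998e8   # SI units
M = 4.0e6 * 1.989e30        # Sagittarius A*: ~4 million solar masses
a = 970 * 1.496e11          # S2 semi-major axis, ~970 AU, in metres
e = 0.88                    # S2 orbital eccentricity

dphi_rad = 6 * np.pi * G * M / (c**2 * a * (1 - e**2))
print(f"precession: ~{np.degrees(dphi_rad) * 60:.0f} arcmin per 16-year orbit")
```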
The study with ESO's VLT also helps scientists learn more about the vicinity of the supermassive black hole at the centre of our galaxy. "Because the S2 measurements follow General Relativity so well, we can set stringent limits on how much invisible material, such as distributed dark matter or possible smaller black holes, is present around Sagittarius A*. This is of great interest for understanding the formation and evolution of supermassive black holes," say Guy Perrin and Karine Perraut, the French lead scientists of the project.
This result is the culmination of 27 years of observations of the S2 star using, for the best part of this time, a fleet of instruments at ESO's VLT, located in the Atacama Desert in Chile. The number of data points marking the star's position and velocity attests to the thoroughness and accuracy of the new research: the team made over 330 measurements in total, using the GRAVITY, SINFONI and NACO instruments. Because S2 takes years to orbit the supermassive black hole, it was crucial to follow the star for close to three decades, to unravel the intricacies of its orbital movement.
The research was conducted by an international team led by Frank Eisenhauer of the MPE with collaborators from France, Portugal, Germany and ESO. The team make up the GRAVITY collaboration, named after the instrument they developed for the VLT Interferometer, which combines the light of all four 8-metre VLT telescopes into a super-telescope (with a resolution equivalent to that of a telescope 130 metres in diameter). The same team reported in 2018 another effect predicted by General Relativity: they saw the light received from S2 being stretched to longer wavelengths as the star passed close to Sagittarius A*. "Our previous result has shown that the light emitted from the star experiences General Relativity. Now we have shown that the star itself senses the effects of General Relativity," says Paulo Garcia, a researcher at Portugal's Centre for Astrophysics and Gravitation and one of the lead scientists of the GRAVITY project.
With ESO's upcoming Extremely Large Telescope, the team believes that they would be able to see much fainter stars orbiting even closer to the supermassive black hole. "If we are lucky, we might capture stars close enough that they actually feel the rotation, the spin, of the black hole," says Andreas Eckart from Cologne University, another of the lead scientists of the project. This would mean astronomers would be able to measure the two quantities, spin and mass, that characterise Sagittarius A* and define space and time around it. "That would be again a completely different level of testing relativity," says Eckart.

Large-scale analysis links glucose metabolism proteins to Alzheimer's disease biology

In the largest study to date of proteins related to Alzheimer's disease, a team of researchers has identified disease-specific proteins and biological processes that could be developed into both new treatment targets and fluid biomarkers. The findings suggest that sets of proteins that regulate glucose metabolism, together with proteins related to a protective role of astrocytes and microglia -- the brain's support cells -- are strongly associated with Alzheimer's pathology and cognitive impairment.
The study, part of the Accelerating Medicines Partnership for Alzheimer's Disease (AMP-AD), involved measuring the levels and analyzing the expression patterns of more than 3,000 proteins in a large number of brain and cerebrospinal fluid samples collected at multiple research centers across the United States. This research was funded by the National Institutes of Health's National Institute on Aging (NIA) and published April 13 in Nature Medicine.
"This is an example of how the collaborative, open science platform of AMP-AD is creating a pipeline of discovery for new approaches to diagnosis, treatment and prevention of Alzheimer's disease," said NIA Director Richard J. Hodes, M.D. "This study exemplifies how research can be accelerated when multiple research groups share their biological samples and data resources."
The research team, led by Erik C.B. Johnson, M.D., Ph.D., Nicholas T. Seyfried, Ph.D., and Allan Levey, M.D., Ph.D., all at the Emory School of Medicine, Atlanta, analyzed patterns of protein expression in more than 2,000 human brain and nearly 400 cerebrospinal fluid samples from both healthy people and those with Alzheimer's disease. The paper's authors, who included Madhav Thambisetty, M.D., Ph.D., investigator and chief of the Clinical and Translational Neuroscience Section in the NIA's Laboratory of Behavioral Neuroscience, identified groups (or modules) of proteins that reflect biological processes in the brain.
The researchers then analyzed how the protein modules relate to various pathologic and clinical features of Alzheimer's and other neurodegenerative disorders. They saw changes in proteins related to glucose metabolism and an anti-inflammatory response in glial cells both in brain samples from people with Alzheimer's and in samples from individuals with documented brain pathology who were cognitively normal. This suggests, the researchers noted, that the anti-inflammatory processes that protect nerve cells may have been activated in response to the disease.
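The module logic is essentially co-expression clustering: group proteins whose levels rise and fall together across samples, summarize each group by an "eigenprotein", and ask which summaries track diagnosis. Below is a simulated miniature of that workflow; the real study used far richer methods on thousands of proteins, and all data here are synthetic.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Simulated miniature of co-expression module analysis (illustrative only).
rng = np.random.default_rng(2)
n_samples, n_proteins = 100, 60
diagnosis = rng.integers(0, 2, n_samples)   # 0 = control, 1 = AD (simulated)

# One disease-linked module (first 20 proteins) on top of background noise.
X = rng.standard_normal((n_samples, n_proteins))
X[:, :20] += 1.5 * diagnosis[:, None] + rng.standard_normal((n_samples, 1))

# Cluster proteins on correlation distance to recover modules.
dist = squareform(1 - np.corrcoef(X.T), checks=False)
modules = fcluster(linkage(dist, method="average"), t=2, criterion="maxclust")

# Summarize each module by its first principal component ("eigenprotein")
# and correlate that summary with diagnosis.
for m in np.unique(modules):
    sub = X[:, modules == m]
    sub = (sub - sub.mean(0)) / sub.std(0)
    eigenprotein = np.linalg.svd(sub, full_matrices=False)[0][:, 0]
    r = np.corrcoef(eigenprotein, diagnosis)[0, 1]
    print(f"module {m}: {sub.shape[1]:2d} proteins, |r| vs diagnosis = {abs(r):.2f}")
```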
The researchers also set out to reproduce the findings in cerebrospinal fluid. The team found that, just like with brain tissue, the proteins involved in the way cells extract energy from glucose are increased in the spinal fluid from people with Alzheimer's. Many of these proteins were also elevated in people with preclinical Alzheimer's, i.e., individuals with brain pathology but without symptoms of cognitive decline. Importantly, the glucose metabolism/glial protein module was populated with proteins known to be genetic risk factors for Alzheimer's, suggesting that the biological processes reflected by these protein families are involved in the actual disease process.
"We've been studying the possible links between abnormalities in the way the brain metabolizes glucose and Alzheimer's-related changes for a while now," Thambisetty said. "The latest analysis suggests that these proteins may also have potential as fluid biomarkers to detect the presence of early disease."
In a previous study, Thambisetty and colleagues, in collaboration with the Emory researchers, found a connection between abnormalities in how the brain breaks down glucose and the amount of the signature amyloid plaques and tangles in the brain, as well as the onset of symptoms such as problems with memory.
"This large, comparative proteomic study points to massive changes across many biological processes in Alzheimer's and offers new insights into the role of brain energy metabolism and neuroinflammation in the disease process," said Suzana Petanceska, Ph.D., program director at NIA overseeing the AMP-AD Target Discovery Program. "The data and analyses from this study has already been made available to the research community and can be used as a rich source of new targets for the treatment and prevention of Alzheimer's or serve as the foundation for developing fluid biomarkers."
Brain tissue samples came from autopsy of participants in Alzheimer's disease research centers and several epidemiologic studies across the country, including the Baltimore Longitudinal Study of Aging (BLSA), Religious Orders Study (ROS) and Memory and Aging Project (MAP), and Adult Changes in Thought (ACT) initiatives. The brain collections also contained samples from individuals with six other neurodegenerative disorders as well as samples representing normal aging, which enabled the discovery of molecular signatures specific for Alzheimer's. Cerebrospinal fluid samples were collected from study participants at the Emory Goizueta Alzheimer's Disease Research Center. These and other datasets are available to the research community through the AD Knowledge Portal, the data repository for the AMP-AD Target Discovery Program, and other NIA supported team-science projects operating under open science principles.

'Directing' evolution to identify potential drugs earlier in discovery

Scientists have developed a technique that could significantly reduce the time of discovering potential new antibody-based drugs to treat disease.
Antibodies are produced by the body in response to the presence of a disease-causing agent. They can also be synthesised in the laboratory to mimic natural antibodies and are used to treat a number of diseases.
Antibody therapies can be highly effective, but challenges can arise when promising candidate antibodies are produced at a large scale. Stresses encountered during manufacturing can disrupt the structure of these fragile proteins leading to aggregation and loss of activity. This in turn prevents them from being made into a therapeutic.
New research from an eight-year collaboration between scientists at the University of Leeds and the biopharmaceutical company AstraZeneca has resulted in a technique that allows fragments of antibodies to be screened for susceptibility to aggregation caused by structure disruption much earlier in the drug discovery process.
The approach is described in a paper published today in the journal Nature Communications.
Dr David Brockwell, Associate Professor in the Astbury Centre for Structural Molecular Biology at the University of Leeds, led the research. He said: "Antibody therapeutics have revolutionised medicine. They can be designed to bind to almost any target and are highly specific.
"But a significant problem has been the failure rate of candidates upon manufacturing at industrial scale. This often only emerges at a very late stage in the development process -- these drugs are failing at the last hurdle.
"But our research is turning that problem on its head."
When it comes to developing an antibody drug, scientists are not restricted to a single protein sequence. Fortunately, there is often an array of similar antibodies with the same ability to lock or bind tightly onto a disease-causing agent. That gives researchers a range of proteins to screen, to determine which are more likely to progress through the development process.
Professor Sheena Radford, FRS, Director of the Astbury Centre for Structural Molecular Biology, said: "The collaboration that has existed between the team of scientists within the University of Leeds and AstraZeneca demonstrates the power of industry and academia working together to tackle what has been one of the major roadblocks preventing the efficient and rapid development of these powerful therapeutic molecules."
How the target proteins are screened
The target proteins are cloned into the centre of an enzyme that breaks down antibiotics in the bacterium E. coli. This enables the scientists to directly link the antibiotic resistance of the bacteria to how aggregation-prone the antibody fragment is. A simple readout -- bacterial growth on an agar plate containing antibiotic -- gives an indication of whether the target protein will survive the manufacturing process. If the antibody proteins are susceptible to stress, they unfold or clump together and become inactive, and the antibiotic kills the bacteria. But if the protein chain is more stable, the bacteria remain resistant and thrive in the presence of the antibiotic.
The scientists harvest the bacteria that have survived and identify the cloned protein sequence. That indicates which protein sequences to take forward in the development pipeline. The whole cycle takes about a month and increases the chances of success.
Directed evolution
But the process can go a step further, using the idea of directed evolution.
Scientists use the idea of natural selection, in which mutations or changes take place in the proteins, sometimes making them more stable. Directed evolution could generate new, better-performing sequences that, at the current time, cannot even be imagined, let alone designed and manufactured. How does this method work? As in Darwin's natural selection, evolutionary pressure is applied here by the antibiotic, which selects for the survival of bacteria that produce the protein variants that do not aggregate.
The protein sequences hosted in the bacterial cells that have shown resistance are harvested and their genes sequenced and scored, to select the best performing sequences. After a quick check to ensure that the new antibody sequences still retain their excellent binding capability for the original disease-causing target, they can be taken forward for further development.
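The mutate-select-sequence cycle described above can be caricatured in a few lines of code. In the toy loop below, the survival "score" is a crude hypothetical surrogate for growth on antibiotic plates, not a real aggregation predictor, and the mutation and selection scheme is invented for illustration.

```python
import numpy as np

# Toy directed-evolution loop (illustrative only).
rng = np.random.default_rng(3)
AA = np.array(list("ACDEFGHIKLMNPQRSTVWY"))   # the 20 amino acids

def aggregation_score(seq):
    """Fraction of 'sticky' hydrophobic residues (hypothetical surrogate)."""
    return np.isin(seq, list("FILVWY")).mean()

population = [rng.choice(AA, size=50) for _ in range(200)]

for generation in range(1, 6):
    # Mutagenesis: one random substitution per sequence.
    for seq in population:
        seq[rng.integers(seq.size)] = rng.choice(AA)
    # Selection: bacteria producing aggregation-prone variants die on the
    # antibiotic plate; the least aggregation-prone half repopulates the pool.
    population.sort(key=aggregation_score)
    population = population[:100] + [s.copy() for s in population[:100]]
    print(f"round {generation}: best score = {aggregation_score(population[0]):.2f}")
```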
Professor Radford said: "There is tremendous excitement about this approach. We are letting evolutionary selection change the sequence of the proteins for us and that might make some of them more useful as drug therapies. Importantly for industry, nature does the hard work, obviating the need for so-called rational engineering which is time- and resource-intensive.
"As we do this, we will be putting the sequence information we gather into a database. As the database gets bigger, it may well be possible with artificial intelligence and machine learning to be able to identify the patterns in protein sequences that tell us that a protein can be scaled up for pharmaceutical production without needing any experiments. That is our next challenge and one we are tackling right now."
Dr David Lowe, who led the work at AstraZeneca, said: "The screening system that we have developed here is a great example of industry and academia working together to solve important challenges in the development of potential new medicines.
"By combining AstraZeneca's antibody discovery and screening expertise, together with the Astbury Centre's world-leading knowledge of protein structure and aggregation, we have produced a high throughput method for rapidly evolving proteins with better biophysical properties that has the potential for wide scientific applicability."

Whole genome sequencing reveals genetic structural secrets of schizophrenia

Most research about the genetics of schizophrenia has sought to understand the role that genes play in the development and heritability of schizophrenia. Many discoveries have been made, but there have been many missing pieces. Now, UNC School of Medicine scientists have conducted the largest-ever whole genome sequencing study of schizophrenia to provide a more complete picture of the role the human genome plays in this disease.
Published in Nature Communications, the study co-led by senior author Jin Szatkiewicz, PhD, associate professor in the UNC Department of Genetics, suggests that rare structural genetic variants could play a role in schizophrenia.
"Our results suggest that ultra-rare structural variants that affect the boundaries of a specific genome structure increase risk for schizophrenia," Szatkiewicz said. "Alterations in these boundaries may lead to dysregulation of gene expression, and we think future mechanistic studies could determine the precise functional effects these variants have on biology."
Previous studies on the genetics of schizophrenia have primarily involved common genetic variations known as SNPs (alterations in common genetic sequences, each affecting a single nucleotide), rare variations in the parts of DNA that provide instructions for making proteins, or very large structural variations (alterations affecting a few hundred thousand nucleotides). These studies give snapshots of the genome, leaving a large portion of it a mystery as it potentially relates to schizophrenia.
In the Nature Communications study, Szatkiewicz and colleagues examined the entire genome, using a method called whole genome sequencing (WGS). The primary reason WGS hasn't been more widely used is that it is very expensive. For this study, an international collaboration pooled funding from National Institute of Mental Health grants and matching funds from Sweden's SciLife Labs to conduct deep whole genome sequencing on 1,165 people with schizophrenia and 1,000 controls -- the largest known WGS study of schizophrenia ever.
As a result, new discoveries were made: the team found previously undetectable DNA mutations that had never before been seen in schizophrenia.
In particular, this study highlighted the role that a three-dimensional genome structure known as topologically associated domains (TADs) could play in the development of schizophrenia. TADs are distinct regions of the genome with strict boundaries between them that keep the domains from interacting with genetic material in neighboring TADs. Shifting or breaking these boundaries allows interactions between genes and regulatory elements that normally would not interact.
When these interactions occur, gene expression may be changed in undesirable ways that could result in congenital defects, formation of cancers, and developmental disorders. This study found that extremely rare structural variants affecting TAD boundaries in the brain occur significantly more often in people with schizophrenia than in those without it. Structural variants are large mutations that may involve missing or duplicated genetic sequences, or sequences that are not in the typical genome. This finding suggests that misplaced or missing TAD boundaries may also contribute to the development of schizophrenia. This study was the first to discover the connection between anomalies in TADs and the development of schizophrenia.
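The core statistical question is a burden comparison: do cases' structural variants overlap TAD-boundary windows more often than controls'? The sketch below runs that comparison on simulated coordinates with a Fisher's exact test; genome size, boundary positions, window width and enrichment are all invented, not values from the study.

```python
import numpy as np
from scipy.stats import fisher_exact

# Sketch of a TAD-boundary burden comparison on simulated coordinates.
rng = np.random.default_rng(4)
GENOME = 3_000_000_000
boundaries = np.sort(rng.integers(0, GENOME, size=2000))  # toy boundary positions
HALF_WIN = 20_000                                         # boundary window, bp

def hits_boundary(start, end):
    """True if [start, end] overlaps any boundary +/- HALF_WIN."""
    i = np.searchsorted(boundaries, start - HALF_WIN)
    j = np.searchsorted(boundaries, end + HALF_WIN, side="right")
    return j > i

def simulate_hits(n_svs, p_on_boundary=0.0):
    """Random SVs; a fraction are deliberately placed on boundaries (cases)."""
    hits = 0
    for _ in range(n_svs):
        if rng.random() < p_on_boundary:
            start = int(rng.choice(boundaries)) - 5_000
        else:
            start = int(rng.integers(0, GENOME))
        if hits_boundary(start, start + int(rng.integers(1_000, 200_000))):
            hits += 1
    return hits

case_hits, ctrl_hits, n = simulate_hits(500, 0.05), simulate_hits(500), 500
odds, p = fisher_exact([[case_hits, n - case_hits],
                        [ctrl_hits, n - ctrl_hits]], alternative="greater")
print(f"cases {case_hits}/{n}, controls {ctrl_hits}/{n}: OR={odds:.2f}, p={p:.3g}")
```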
This work has highlighted TAD-affecting structural variants as prime candidates for future mechanistic studies of the biology of schizophrenia.
"A possible future investigation would be to work with patient-derived cells with these TADs-affecting mutations and figure out what exactly happened at the molecular level," said Szatkiewicz, an adjunct assistant professor of psychiatry at UNC. "In the future, we could use this information about the TAD effects to help develop drugs or precision medicine treatments that could repair disrupted TADs or affected gene expressions which may improve patient outcomes."
This study will be combined with other WGS studies in order to increase the sample size to further confirm these results. This research will also help the scientific community build on the unfolding genetic mysteries of schizophrenia.

Novel C. diff structures are required for infection, offer new therapeutic targets

  Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...