Tuesday 19 February 2019

Light-based production of drug-discovery molecules

Photoelectrochemical (PEC) cells are widely studied for the conversion of solar energy into chemical fuels. They use photocathodes and photoanodes to "split" water into hydrogen and oxygen, respectively. Because PEC cells work under mild conditions with light, they are also suitable for catalyzing other reactions that turn organic molecules into high added-value chemicals, such as those used to develop drugs.
However, PEC cells have so far rarely been used in organic synthesis, apart from a few recent conceptual attempts that tested only a handful of simple substrates. Overall, PEC cells remain largely unexplored for the broad-scope synthesis of functional organic molecules.
They could nevertheless prove most helpful in one of the most appealing synthetic methods for pharmaceuticals and agrochemicals, called "direct amination." It involves adding an amine group to an organic molecule without pre-activating the molecule by an additional processing step.
Direct amination normally requires high temperatures and what is known as a "directing group" -- a chemical unit that fixes the reaction site but serves no other function, and which often has to be removed before the new compound can be used in applications.
Now, the labs of Xile Hu and Michael Grätzel at EPFL's Institute of Chemical Sciences and Engineering (ISIC) have developed a new method for aminating arenes -- hydrocarbons with a ring in their structure -- without the need for a directing group.
"Our method is operationally simple and can be used to synthesize a broad range of nitrogen-containing heterocycles relevant to drug discovery," writes Lei Zhang, the lead author of the study. Proving the point, the researchers used their method to make several pharmaceutical molecules, including derivatives of the muscle relaxant metaxalone and the antimicrobial benzethonium chloride.
Based on a PEC cell, the method catalyzes the reaction with light and the low-cost, Earth-abundant semiconductor hematite. "Pioneering studies in Michael Grätzel's lab have yielded robust hematite samples that are efficient for water splitting, but hematite has never been used to catalyze organic synthesis," says Hu.
In the current study, hematite was found to work well for direct amination under visible light, while its high stability promises a long lifetime as a working catalyst. And because it harvests light, the photoelectrocatalysis used here consumes less energy than direct electrocatalysis.
"This is an important demonstration of principle for using PEC cells for the production of high added-value chemicals and pharmaceuticals," says Hu. "The work merges two traditionally separated fields, namely photoelectrochemistry and organic synthesis. There are plenty of untapped opportunities for this approach, and we are excited to further explore these opportunities."

Researchers discover anti-laser masquerading as perfect absorber

The width, height and spacing of the cylinders depicted here dictate how the metamaterial described in the new paper absorbs electromagnetic energy.
Researchers at Duke University have discovered that a perfect absorber of electromagnetic waves they described in a 2017 paper can easily be tweaked into a sort of "time-reversed laser" known as a coherent perfect absorber (CPA).
The research appeared online on January 28 in the journal Advanced Optical Materials.
A laser is a device that transforms energy into coherent light, meaning the light waves are perfectly aligned with one another. Reversing the process, a CPA -- sometimes called a time-reversed laser -- is a device that absorbs all of the energy from two identical electromagnetic waves hitting it from either side in perfect synchrony. That is, the crests and troughs of their waves enter the material from either side at precisely the same time.
In 2017, Willie Padilla, professor of electrical and computer engineering at Duke, built the first material capable of absorbing nearly 100 percent of an electromagnetic wave's energy without containing even an atom of metal. The device was a metamaterial -- synthetic materials composed of many individual, engineered features that together produce properties not found in nature.
This particular metamaterial featured zirconia ceramic constructed into a surface dimpled with cylinders like the face of a Lego brick. After computationally modeling the device's properties by altering the cylinders' size and spacing, the researchers realized that they had actually created a more fundamental kind of CPA.
"We've studied this system before as a perfect absorber, but now we've figured out that this device can be configured to be a CPA as well," said Padilla. "This study has shown that these seemingly different fields are actually one and the same."
The CPAs currently described in the literature all have only one mode. They work when the incoming electromagnetic waves are either perfectly aligned or perfectly out of sync. Padilla and Kebin Fan, a research assistant professor in Padilla's laboratory, have discovered that their perfect absorber is actually a CPA with two overlapping modes: it can absorb both aligned and misaligned waves.
By changing the material's parameters so that the two modes no longer overlap, Padilla and Fan were able to show it could easily become just like the CPAs currently in the literature, but with much more versatility.
"Typical CPAs have only one variable, the material's thickness," said Fan. "We have three: the cylinders' radius, height and periodicity. This gives us a lot more room to tailor these modes and put them in the frequency spectrum where we want them, giving us a lot of flexibility for tailoring the CPAs."
In the paper, the researchers show that their device can switch between absorbing all phases of electromagnetic waves and only those in sync with one another merely by increasing the height of the cylinders from 1.1 to 1.4 millimeters. With this ease of transition, they believe it should be possible to engineer a material that can dynamically switch between the two modes.
"We haven't done that yet," said Padilla. "It is challenging, but it's on our agenda."
While there aren't currently any devices that make use of the abilities of CPAs, Padilla and Fan have a few in mind. In principle, researchers could engineer a device that measures not just the intensity of incoming light like a normal camera, but also its phase.
"If you're trying to figure out the properties of a material, the more measurements you have, the more you can understand about the material," said Padilla. "And while coherent detectors do exist -- we have one in our own lab, actually -- they're extremely expensive to build through other technologies."

Solid-state catalysis: Fluctuations clear the way

The use of efficient catalytic agents is what makes many technical procedures feasible in the first place. Indeed, synthesis of more than 80% of the products generated in the chemical industry requires the input of specific catalysts. Most of these are solid-state catalysts, and the reactions they make possible take place between molecules that adsorb to their surfaces. The specific properties of the catalyst permit the starting molecules to interact and accelerate the reaction between them, without consuming or altering the catalyst itself.
However, efficient catalysis also requires efficient mixing, so reactants must be able to diffuse laterally on the surface of the catalyst to maximize the chance of undergoing the desired reaction. Under the conditions employed in industrial processes, however, the surface of the catalyst is generally so densely packed with adsorbed particles that it has been unclear how molecules could effectively diffuse at all.
Researchers led by Professor Joost Wintterlin at the Department of Chemistry at Ludwig-Maximilians-Universitaet (LMU) have now shown that, although reactants indeed spend time virtually trapped on the surface of the catalyst, local fluctuations in occupancy frequently provide opportunities to change positions. The new findings appear in the leading journal Science.
In order to gain insight into the molecular processes that take place on a solid-state catalyst, Wintterlin and colleagues used scanning tunneling microscopy (STM) to monitor the mobility of individual oxygen atoms on a ruthenium (Ru) catalyst that was densely packed with adsorbed carbon monoxide (CO) molecules. "We chose this system because the oxidation of CO to CO2 on metals belonging to the platinum group is a well-studied model for solid-state catalysis generally," Wintterlin explains. Conventional scanning tunneling microscopy, however, would have been too slow to capture the surface dynamics of this reaction system. The team therefore enhanced the rate of data acquisition, finally attaining rates of up to 50 images per second -- high enough to make videos of the dynamics of the particles on the catalyst.
The STM images revealed that the oxygen atoms are completely hemmed in by triangular cages formed by CO molecules adsorbed to the surface of the Ru catalyst. Analysis of the videos showed that single oxygen atoms can only hop between three positions formed by the interstices of the Ru atoms. "But, to our surprise, we also observed that an atom can escape from its cage, and suddenly begins to diffuse through the carbon monoxide matrix at a rate that is almost as high as if it were on a completely empty surface," says Ann-Kathrin Henß, first author of the research paper. In collaboration with Professor Axel Groß of the Institute of Theoretical Chemistry at Ulm University, the Munich researchers were able to link this phenomenon with fluctuations in the local density of the CO on the surface, which give rise to regions in which the molecules are more or less closely packed together. When such a fluctuation occurs in the vicinity of an oxygen atom, the latter can escape from its cage, and make its way to a new position. In fact, this 'door-opening mechanism' opens up diffusion pathways so rapidly that the movement of the oxygen atoms through the matrix is not significantly impeded. This explains why they can almost always find a new binding partner for the reaction facilitated by the catalyst.

Can we trust scientific discoveries made using machine learning?

Rice University statistician Genevera Allen says scientists must keep questioning the accuracy and reproducibility of scientific discoveries made by machine-learning techniques until researchers develop new computational systems that can critique themselves.
Allen, associate professor of statistics, computer science and electrical and computer engineering at Rice and of pediatrics-neurology at Baylor College of Medicine, will address the topic in both a press briefing and a general session today at the 2019 Annual Meeting of the American Association for the Advancement of Science (AAAS).
"The question is, 'Can we really trust the discoveries that are currently being made using machine-learning techniques applied to large data sets?'" Allen said. "The answer in many situations is probably, 'Not without checking,' but work is underway on next-generation machine-learning systems that will assess the uncertainty and reproducibility of their predictions."
Machine learning (ML) is a branch of statistics and computer science concerned with building computational systems that learn from data rather than following explicit instructions. Allen said much attention in the ML field has focused on developing predictive models that allow ML to make predictions about future data based on its understanding of data it has studied.
"A lot of these techniques are designed to always make a prediction," she said. "They never come back with 'I don't know,' or 'I didn't discover anything,' because they aren't made to."
She said uncorroborated data-driven discoveries from recently published ML studies of cancer data are a good example.
"In precision medicine, it's important to find groups of patients that have genomically similar profiles so you can develop drug therapies that are targeted to the specific genome for their disease," Allen said. "People have applied machine learning to genomic data from clinical cohorts to find groups, or clusters, of patients with similar genomic profiles.
"But there are cases where discoveries aren't reproducible; the clusters discovered in one study are completely different than the clusters found in another," she said. "Why? Because most machine-learning techniques today always say, 'I found a group.' Sometimes, it would be far more useful if they said, 'I think some of these are really grouped together, but I'm uncertain about these others.'"
Allen will discuss uncertainty and reproducibility of ML techniques for data-driven discoveries at a 10 a.m. press briefing today, and she will discuss case studies and research aimed at addressing uncertainty and reproducibility in the 3:30 p.m. general session, "Machine Learning and Statistics: Applications in Genomics and Computer Vision." Both sessions are at the Marriott Wardman Park Hotel.

Diet drinks may be associated with strokes among post-menopausal women

Among post-menopausal women, drinking multiple diet drinks daily was associated with an increase in the risk of having a stroke caused by a blocked artery, especially small arteries, according to research published in Stroke, a journal of the American Heart Association.
This is one of the first studies to look at the association between drinking artificially sweetened beverages and the risk of specific types of stroke in a large, racially diverse group of post-menopausal women. While this study identifies an association between diet drinks and stroke, it does not prove cause and effect because it was an observational study based on self-reported information about diet drink consumption.
Compared with women who consumed diet drinks less than once a week or not at all, women who consumed two or more artificially sweetened beverages per day were:
  • 23 percent more likely to have a stroke;
  • 31 percent more likely to have a clot-caused (ischemic) stroke;
  • 29 percent more likely to develop heart disease (fatal or non-fatal heart attack); and
  • 16 percent more likely to die from any cause.
Researchers found risks were higher for certain women. Heavy intake of diet drinks, defined as two or more times daily, more than doubled stroke risk in:
  • women without previous heart disease or diabetes, who were 2.44 times as likely to have a common type of stroke caused by blockage of one of the very small arteries within the brain;
  • obese women without previous heart disease or diabetes, who were 2.03 times as likely to have a clot-caused stroke; and
  • African-American women without previous heart disease or diabetes, who were 3.93 times as likely to have a clot-caused stroke.
"Many well-meaning people, especially those who are overweight or obese, drink low-calorie sweetened drinks to cut calories in their diet. Our research and other observational studies have shown that artificially sweetened beverages may not be harmless and high consumption is associated with a higher risk of stroke and heart disease," said Yasmin Mossavar-Rahmani, Ph.D., lead author of the study and associate professor of clinical epidemiology and population health at the Albert Einstein College of Medicine in the Bronx, New York.
Researchers analyzed data on 81,714 postmenopausal women (age 50-79 years at the start) participating in the Women's Health Initiative study that tracked health outcomes for an average of 11.9 years after they enrolled between 1993 and 1998. At their three-year evaluation, the women reported how often in the previous three months they had consumed diet drinks such as low calorie, artificially sweetened colas, sodas and fruit drinks. The data collected did not include information about the specific artificial sweetener the drinks contained.
The results were obtained after adjusting for various stroke risk factors such as age, high blood pressure, and smoking. These results in postmenopausal women may not be generalizable to men or younger women. The study is also limited by having only the women's self-report of diet drink intake.
"We don't know specifically what types of artificially sweetened beverages they were consuming, so we don't know which artificial sweeteners may be harmful and which may be harmless," Mossavar-Rahmani said.
The American Heart Association recently published a science advisory that found there was inadequate scientific research to conclude that low-calorie sweetened beverages do -- or do not -- alter risk factors for heart disease and stroke in young children, teens or adults. The Association recognizes diet drinks may help replace high calorie, sugary beverages, but recommends water (plain, carbonated and unsweetened flavored) as the best choice for a no calorie drink.
"Unfortunately, current research simply does not provide enough evidence to distinguish between the effects of different low-calorie sweeteners on heart and brain health. This study adds to the evidence that limiting use of diet beverages is the most prudent thing to do for your health," said Rachel K. Johnson, Ph.D., R.D., professor of nutrition emeritus, University of Vermont and the chair of the writing group for the American Heart Association's science advisory, Low-Calorie Sweetened Beverages and Cardiometabolic Health.
"The American Heart Association suggests water as the best choice for a no-calorie beverage. However, for some adults, diet drinks with low calorie sweeteners may be helpful as they transition to adopting water as their primary drink. Since long-term clinical trial data are not available on the effects of low-calorie sweetened drinks and cardiovascular health, given their lack of nutritional value, it may be prudent to limit their prolonged use" said Johnson.

Brain pathways of aversion identified

What happens in the brain when we feel discomfort? Researchers at Karolinska Institutet in Sweden are now one step closer to finding the answer. In a new study published in the journal Molecular Psychiatry they identify which pathways in the mouse brain control behaviour associated with aversion.
Scientists have long been interested in how the brain creates signals associated with negative emotions in order to better understand how imbalances in the same system can lead to affective disorders such as depression and anxiety.
The amygdala has long been the most commonly studied brain structure for understanding fear, whereas for rewards and positive signals the focus has been on the neurotransmitter dopamine. But when it comes to areas of the brain that control feelings of discomfort and aversion, much less is known.
In the past few years, research in animal models has indicated that a brain structure called the habenula controls positive and negative emotions. Moreover, clinical studies have shown that deep brain stimulation of the habenula can be beneficial in patients suffering from depression. The habenula regulates both dopamine and the neurotransmitter serotonin, which is thought to play a significant part in the sense of wellbeing. However, it has not been known how the habenula itself is regulated.
Researchers at Karolinska Institutet have now mapped which networks in the mouse brain control the habenula, and what role they play in aversion.
"We've discovered a specific pathway that goes between the hypothalamus and the habenula, and that can be modulated using optogenetics to control the feeling of aversion," says study leader docent Dinos Meletis at the Department of Neuroscience. "Our hope is that this can lead to the development of new treatments that can rebalance the brain's networks in for example depression or anxiety disorders."
Using optogenetics and other advanced methods, the group identified the nerve cells involved and mapped their interconnections. Optogenetics is a method that uses light to activate specific neurons in order to study how the activation of different networks affects behaviour.
"This methodological revolution in brain research has made it possible to functionally study how different types of nerve cell and pathways actually control different types of behaviour, something that was impossible to do only a decade ago," says Dr Meletis.

Live better with attainable goals

Those who set realistic goals can hope for a higher level of well-being. The key for later satisfaction is whether the life goals are seen as attainable and what they mean to the person, as psychologists from the University of Basel report in a study with over 970 participants.
Wealth, community, health, meaningful work: life goals express a person's character, shaping behavior and serving as the compass by which people are guided. It can therefore be assumed that goals contribute substantially to how satisfied people are in life -- or how dissatisfied, if important goals are blocked and cannot be achieved.
A team of psychologists from the University of Basel conducted a detailed examination of how life goals are embedded in people's lives across adulthood; the results are now published in the European Journal of Personality. The researchers used data from 973 people between 18 and 92 years old living in German-speaking parts of Switzerland; more than half of the participants were surveyed again after two and four years. The participants had to assess the importance and the perceived attainability of life goals in ten areas -- health, community, personal growth, social relationships, fame, image, wealth, family, responsibility/care for younger generations, and work -- using a four-point scale.
Life goals with predictive power
The findings of the study revealed that perceiving one's personal goals as attainable is an indicator for later cognitive and affective well-being. This implies that people are most satisfied if they have a feeling of control and attainability. Interestingly, the importance of the goal was less relevant for later well-being than expected.
Life goals also hold predictive power for specific domains: Participants who set social-relation goals or health goals were more satisfied with their social relationships or their own health. The link between life goals and subsequent well-being appeared to be relatively independent of the age of the participants.
Younger people want status, older people want social engagement
What are the goals that people value most at a given age? The goals that people value in a particular life stage depend on the developmental tasks present at that stage: the younger the participants were, the more they rated personal-growth, status, work and social-relation goals as important. The older the participants were, the more they rated social engagement and health as important.
"Many of our results confirmed theoretical assumptions from developmental psychology," says lead author and PhD student Janina Bühler from the University of Basel's Faculty of Psychology. Life goals were strongly determined by age: "If we examine, however, whether these goals contribute to well-being, age appears less relevant." Hence, adults, whether old or young, are able to balance the importance and attainability of their goals.

Depression reversed in male mice by activating gene that helps excite neurons

Directly activating a gene important to exciting our excitatory neurons and associated with major depression may help turn around classic symptoms like social isolation and loss of interest, at least for males, scientists report.
They looked in the prefrontal cortex, a brain area involved in complex behaviors like planning, personality and social behavior and known to have an important role in the pathogenesis of major depression, and found that making the SIRT1 gene inactive in excitatory neurons there created symptoms of depression in male mice, they report in the journal Molecular Psychiatry.
When, as in real life, stress rather than direct gene manipulation caused depression, a drug that activated SIRT1 reversed the symptoms in the males, says molecular behavioral neuroscientist Dr. Xin-Yun Lu.
"It has an antidepressant-like effect," says Lu, the study's corresponding author, a professor in the Department of Neuroscience and Regenerative Medicine at the Medical College of Georgia at Augusta University and Georgia Research Alliance Eminent Scholar in Translational Neuroscience.
That means drugs that activate SIRT1 and enable the usual high level of activity of these excitatory neurons might one day be an effective therapy for some with major depression, says Lu. Major depression is one of the most common mental disorders in the United States, affecting nearly 7 percent of adults, according to the National Institute of Mental Health.
The firing of excitatory neurons is definitely decreased in depression, and neurons are not communicating as they should. "It's like they are disconnected," says Lu. Problems like manic behavior and seizures, on the other hand, indicate excessive firing.
It's hard to get excited without energy, and another of SIRT1's known roles in brain cells is regulating cell powerhouses, called mitochondria. The scientists found that at least part of the way knocking out SIRT1 in males impacted the excitability of these normally excited neurons was by reducing the number of cell powerhouses and the expression of genes involved in powerhouse production.
The depressed behaviors they saw as a result are another indicator of SIRT1's importance in that region to mood regulation and how without it, there is insufficient excitation of neurons. So was the resolution of stress-induced depression in male mice when they activated SIRT1 that had been deactivated by stress, the scientists say.
They note surprise at the lack of impact in female mice since the SIRT1 variant was first identified in a large gene study of depressed women. They suspect physical differences in this front region of the brain, like differences in the numbers of neurons and synapses between males and females, could help explain the sex differences they found. Lu is already looking to see if she finds similar sex disparities in the hippocampus, another brain region important in depression as well as other conditions like Alzheimer's.
Still, depressed mice and humans act similarly, Lu says, which includes an impaired ability to feel pleasure called anhedonia. So the researchers used the mice's usual high preference for a sweet sucrose solution as one way to gauge their depression.
"You give them a choice and they will drink that," she says. "But if you stress them, they won't lose their preference necessarily but it will reduce their interest." Males also forego their normal social nature and instead become loners. They even lose their interest in sex and in sniffing the females' pheromones.
Lu, who is also a pharmacologist, plans to look at existing drugs, including some older drugs never used for depression, to see if any have an impact on SIRT1 like the research drug they used.
Depression is generally considered caused by a combination of genetic and environmental factors. Lu says some individuals likely are born with the SIRT1 variant identified in genome-wide association studies, which predisposes them to depression, although environmental factors must also come into play for depression to happen. She notes that the SIRT1 variant is likely rare and only associated with depression rather than considered causative.
The prefrontal cortex is known to have a role in emotional responses and involved in controlling neurotransmitters, like serotonin, which are key to mood regulation. The severity of depression correlates with the degree of inactivity of that brain region, Lu and her colleagues write.
A 2015 study in the journal Nature reported that genome-wide studies of 5,303 Chinese women with major depressive disorder and 5,337 controls identified a variant of the SIRT1 gene as one of two variants associated with the disorder. Those scientists later replicated the finding in males.
Lu notes that the variant was likely detectable in the females in the large human study because of the consistent severity of the disease and the fact that all the women came from a similar region.

People who cunningly use cooperation and egoism are 'unbeatable'

Cooperating with other people makes many things easier. However, competition is also a characteristic aspect of our society. In their struggle for contracts and positions, people have to be more successful than their competitors and colleagues. When will cooperation lead to success, and when is egoism more effective? Scientists from the Max Planck Institute for Evolutionary Biology in Ploen have developed an experiment that enables them to examine the success rate of cooperative and egoistic behaviour strategies. A strategy referred to as "extortion" is particularly successful, according to the researchers. This strategy, which alternates between cooperation and egoism, is difficult for the co-player to resist. The extortion strategy is especially effective when there is strong competitive pressure -- that is, when there can be only one winner.
"Extortioners often come across as friendly colleagues. They reciprocate friendliness with friendliness, making their competitors feel as though it must be a misunderstanding, if they are taken advantage of again and again. They are forced to play along to avoid losing even more. This seemingly friendly yet extremely tough exploitation strategy is rewarded with additional gain," explains Manfred Milinski from the Max Planck Institute for Evolutionary Biology in Ploen. Together with Lutz Becks, who has meanwhile taken up research work at the University of Konstanz, he examined the willingness of human beings to cooperate and exploit under varying conditions.
Calculations drawn up by scientists show that mutual support can easily turn into extortion. Theorists use the so-called prisoner's dilemma to explore this issue of social interaction among human beings. In this game, two participants benefit more if they cooperate than they would if both behaved egoistically. However, if one player is egoistic while the other cooperates, the egoistic player receives the largest prize, while the cooperating player goes away empty-handed.
This means that cooperating is only worthwhile if you keep encountering the same player and are thus able to "punish" previous egoism and reward cooperative behaviour. For a long time, scientists considered this type of "tit for tat" strategy to be the most effective behavioural strategy and a recipe for mutual cooperation.
Extortion is unbeatable
In reality, however, many people tend to cooperate less frequently than is predicted theoretically for the prisoner's dilemma. This discrepancy can be explained by the "extortion" strategy, which has been referred to as unbeatable and was first described by two US researchers in 2012. The extortioner takes advantage of the other player systematically by forcing them to cooperate constantly. In 60 percent of cases, an "extortioner" will react to their counterpart's cooperation by cooperating themselves. In 40 percent of cases they will behave egoistically and collect the maximum prize.
The co-player has to comply with the extortioner because it is the only behaviour that pays off for them. They can only increase their small gain by cooperating more and more frequently, so as to benefit as often as possible from the extortioner's 60 percent rate of cooperation. Their own gain increases steadily as a result, but in doing so they hand the extortioner a much greater prize.
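This payoff logic can be checked with a short simulation. The sketch below uses standard prisoner's dilemma payoffs (3/3 for mutual cooperation, 1/1 for mutual defection, 5/0 for unilateral defection -- assumed values, not taken from the study) and a simplified extortioner who, as described above, answers cooperation with cooperation only 60 percent of the time. Genuine zero-determinant extortion strategies condition on both players' previous moves, so this is an illustration rather than the exact strategy from the 2012 paper:

```python
import random

# Payoff matrix keyed by (co-player move, extortioner move): assumed standard values.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def play(p_coop, rounds=100_000, seed=1):
    """Iterated game: a co-player who cooperates with probability p_coop,
    against a simplified extortioner who answers the co-player's previous
    cooperation with cooperation only 60% of the time and otherwise defects."""
    rng = random.Random(seed)
    coop_total = ext_total = 0
    last_co_move = 'C'
    for _ in range(rounds):
        co = 'C' if rng.random() < p_coop else 'D'
        ext = 'C' if (last_co_move == 'C' and rng.random() < 0.6) else 'D'
        co_pay, ext_pay = PAYOFF[(co, ext)]
        coop_total += co_pay
        ext_total += ext_pay
        last_co_move = co
    return coop_total / rounds, ext_total / rounds

half_co, half_ext = play(p_coop=0.5)   # co-player cooperates half the time
full_co, full_ext = play(p_coop=1.0)   # co-player complies fully

# Complying raises the co-player's own average payoff per round ...
print(f"co-player:   {half_co:.2f} -> {full_co:.2f}")
# ... but raises the extortioner's payoff by even more.
print(f"extortioner: {half_ext:.2f} -> {full_ext:.2f}")
```

Under these assumed payoffs, full compliance is indeed the co-player's best option, yet every extra point they earn buys the extortioner a larger one -- the dynamic the article describes.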
Experiments conducted by Manfred Milinski and his team in Ploen showed that human beings do indeed tend to accede to extortion and cooperate when playing against a computer that employs the extortion strategy. A computer, however, is unimpressed if its human co-players become increasingly unwilling and refuse to cooperate over the second half of the experiment. The experiments could therefore not demonstrate whether a human extortioner would eventually yield to their competitors' attempts to discipline them and return to more cooperative behaviour.
A bonus as incentive for extortion
Manfred Milinski and Lutz Becks examined, in more than 100 students, whether and under which conditions extortioners can be disciplined. In 49 consecutive rounds of the prisoner's dilemma, pairs of students played for real sums of money.
The scientists introduced a bonus to increase competitive pressure between the players. In the first experiment, one player from each pair was randomly chosen to receive an additional ten-euro bonus at the end if they managed to earn at least ten percent more than their co-player. In the second experiment, the bonus went to whichever player earned ten percent more than their competitor. No bonus was available in a control experiment.
Where there was no prospect of a bonus, the players quickly began to cooperate and usually obtained a high profit. They used a recently described cooperative strategy known as "generous." The extortion strategy did not occur in this scenario.
However, if one of the players was additionally enticed with a bonus, that player would in many cases turn into an extortioner. Even though the other player kept trying to discipline them by refusing to cooperate, the extortioner resisted and cooperated even less, rather than more, frequently over the course of the experiment. Extortioners also proved most successful in the long term, even in the experiment in which the potential bonus player was not predetermined.
The bonus enabled extortioners to earn even more than cooperative players using a "generous" strategy, who had no prospect of receiving a bonus. "Willingness to cooperate is not a recipe for success when competitive pressure is strong. Our results show why human beings frequently prove to be less cooperative in real life than has been predicted in the past," Becks explains.

Hormone therapy may increase cardiovascular risk during gender transition

Patients receiving hormone therapy as part of their gender-transition treatment had an elevated risk for cardiovascular events, including strokes, heart attacks and blood clots, according to a study published in the American Heart Association's journal Circulation.
The results are based on analysis of medical records of 3,875 Dutch individuals who received hormone treatment between 1972 and 2015 as part of their gender transition.
"In light of our results, we urge both physicians and transgender individuals to be aware of this increased cardiovascular risk," said study author Nienke Nota, M.D., a researcher in the department of endocrinology at the Amsterdam University Medical Center. "It may be helpful to reduce risk factors by stopping smoking, exercising, eating a healthy diet and losing weight, if needed before starting therapy, and clinicians should continue to evaluate patients on an ongoing basis thereafter."
Past research has shown that hormone therapy increases cardiovascular risk among people receiving it to alleviate symptoms of menopause, yet research evidence remains scarce on the effects of hormone treatment in people undergoing gender transition. Even though such individuals tend to be younger than menopausal patients receiving hormone-replacement therapy, transgender people may have more psychosocial stressors and other factors that increase cardiovascular risk, the researchers said.
The analysis involved 2,517 transgender women, median age 30, who received estrogen, with or without androgen-suppressors, and 1,358 transgender men, median age 23, who received testosterone as part of their transition.
To gauge risk, the researchers determined the incidence of acute cardiovascular events -- heart attacks, strokes and deep vein thromboses (blood clots). They compared the incidence of such cases in the transgender population to that reported in the general population. Transwomen were followed for an average of 9 years from the start of hormone therapy, while transmen were followed for an average of 8 years after starting hormones.
The analysis showed that transwomen -- individuals assigned male sex at birth but with female gender identity who were receiving hormones as part of their transition -- had more than twice as many strokes as women (29 versus 12) and nearly twice as many strokes as men (29 versus 16). There were five times as many deep-vein clots among transwomen (73) as among women (13), and 4.5 times as many as among men (73 versus 16). Heart attacks occurred at more than twice the rate among transwomen (30) as among women (13). Transmen -- those assigned female sex at birth but who had male gender identity and received hormones -- had a more than three-fold elevation in heart-attack risk compared with women (11 versus 3).
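The multiples quoted above follow directly from the reported event counts: each is simply the ratio of events observed in transwomen to the corresponding reference count. A quick check (the counts are those in the text; the comparison-group labels are shorthand):

```python
# Event counts as reported: (observed in transwomen, reference count)
ratios = {
    "strokes vs women":         (29, 12),
    "strokes vs men":           (29, 16),
    "deep-vein clots vs women": (73, 13),
    "deep-vein clots vs men":   (73, 16),
    "heart attacks vs women":   (30, 13),
}

for outcome, (observed, reference) in ratios.items():
    # e.g. "strokes vs women: 2.4x"
    print(f"{outcome}: {observed / reference:.1f}x")
```

The ratios come out at roughly 2.4, 1.8, 5.6, 4.6 and 2.3, matching the "more than twice", "five times" and "4.5 times" figures in the text up to rounding.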
The study was not designed to tease out the mechanism behind the increased risk. The researchers caution that their study was based solely on a review of medical records and could not account for risk factors such as smoking, psychosocial stressors, dietary and exercise habits. While those risk factors probably contribute to the increased cardiovascular risk, the researchers suggest that the hormone therapy may contribute to increased risk as well.
Previous studies have shown that estrogen therapy increases triglyceride and insulin levels, for example, both of which are known to promote clogging and inflammation of the blood vessels. Additionally, estrogen therapy can render the blood more prone to clotting, which may explain the higher rate of strokes and blood clots observed in transwomen, the authors said. The rise in heart-attack risk observed in transmen receiving testosterone may be explained partly by the hormone's tendency to make the blood stickier by increasing the concentration of red blood cells, along with lowering the level of good cholesterol and raising the level of bad cholesterol, the research team said.

Parents: Keep medical marijuana dispensaries away from kids

With medical marijuana now legal in about two-thirds of U.S. states, there's growing concern about how dispensaries may impact surrounding neighborhoods and communities.
And parents in a new national poll overwhelmingly agree on one place dispensaries should not be allowed: anywhere near children.
Seven in 10 parents think they should have a say in whether dispensaries are located near their child's school or daycare and most say they should be banned within a certain distance of those facilities, according to the C.S. Mott Children's Hospital National Poll on Children's Health at the University of Michigan.
Highest on the list of concerns was the risk impaired drivers may pose to children -- with nearly half of parents saying this was a significant worry. A recent study found that more than half of people taking cannabis for chronic pain report driving while high.
"Medical marijuana has become legal in the majority of states but there is wide variation in state and local policies that regulate the location and operation of dispensaries," says poll co-director Sarah Clark, M.P.H.
"The majority of parents feel strongly that they should give local input on decisions regarding where dispensaries may open and also support limitations on how close dispensaries could be to children's areas."
Aside from the top concern involving drivers under the influence, some parents also worried about the possibility of a child finding and ingesting edible marijuana inadvertently left behind by a dispensary customer (48 percent), and about teens having easier opportunities to access marijuana (49 percent). Other dispensary concerns included setting a bad example for kids (45 percent) and bringing violent crime to the area (35 percent).
Three quarters of parents indicated general support for legal medical marijuana, including one third of parents who support the option for children. Just 26 percent of parents opposed medical marijuana.
At the same time, most parents agreed that dispensaries should be banned within a certain distance of elementary schools, middle and high schools, and daycare centers. Forty-four percent of parents also believed dispensaries should not be close to places of worship. Support for such bans was equally strong among both mothers and fathers, younger and older parents, and parents of higher and lower income.
"Most parents seem to understand that marijuana can have legitimate medical benefits, but parents also have major concerns about the risks that medical marijuana dispensaries might pose to children," Clark says. "When it comes to where dispensaries are located, many parents feel that any area near children is too close for comfort."
Most parents (77 percent) agreed that medical marijuana dispensaries should have the same regulations as liquor stores for where they can be located. Meanwhile, 52 percent of parents said dispensaries should have the same rights as other businesses. Nearly all parents (90 percent) felt dispensaries should undergo inspections to ensure they are following all regulations.
Nearly half of parents (45 percent) said that medical marijuana is legal in their state, and 24 percent knew there was at least one medical marijuana dispensary in their community. Only 20 percent reported that their state or community has regulations about where dispensaries can be located, while 59 percent did not know if such regulations exist.
While most parents wanted to be consulted about locating a dispensary near their child's school or daycare, this may prove difficult, Clark says. There is no consistent state or local framework to regulate the location and operations of dispensaries. Some states may have added legal complexities differentiating the sale of medical versus recreational marijuana.
It may also be unclear whether parents need to contact elected officials or commissions, and whether they should focus on the state or local level, when an application is filed for a new dispensary. Decisions about the location of new dispensaries could be made through a state law, a local zoning regulation, or other action.
"Parents who want to share their views about dispensaries before any open in their school's neighborhood may have limited opportunities to do so. They may not even be aware that a specific dispensary location is under consideration until the decision has already been made," Clark says.
"The lack of established standards may lead officials to enact policies that may not address parents' concerns," Clark adds. "Parents who want to provide input about local dispensaries may need to take the initiative to learn about the rules for opening a dispensary in their community and what steps they should follow to be involved in these decisions."

Blood clot discovery could pave way for treatment of blood diseases

Scientists have discovered new ways in which the body regulates blood clots, a finding that could one day lead to better treatments to help prevent and treat conditions including heart disease, stroke and vascular dementia.
Led by the University of Exeter and funded by the British Heart Foundation, the team has developed a new technique that allows them to simultaneously measure blood clotting and the formation of free radicals.
Free radicals are unstable molecules containing unpaired electrons seeking to pair up. This makes these molecules highly reactive and able to modify proteins, lipids and DNA. Amongst other unwanted effects, free radicals play a role in the build-up of blood clots, which in turn are considered a key driver in the development of a range of conditions, including heart disease, stroke, dementia, and inflammation-related conditions such as arthritis.
The new technique is outlined in research published in Haematologica. The technique combines electron paramagnetic resonance, a cutting-edge method for detecting free radicals, with blood cell aggregometry, an established technique for measuring blood clotting. The team has successfully used the technique in mice and in human cells. They aim to better understand how blood cells function, which will help to develop new drugs against blood clotting diseases or to test the risk of clotting diseases in patients.
Dr Giordano Pula, of the University of Exeter Medical School, led the study. He said: "We're really excited to discover this new technique and its potential to understand how blood vessel diseases develop. For the first time, we can now simultaneously measure blood clotting and the formation of free radicals. We know they play a key role in blood vessel damage caused by ageing, diabetes, obesity and chronic inflammation. We're currently using this technique in our efforts to develop a new treatment to protect the blood vessels in diseases such as heart diseases, stroke, obesity, and vascular dementia."
The research team, which includes collaborators in the laboratory of Professor Patrick Pagano at the University of Pittsburgh (US), discovered that the enzymes NADPH oxidases are critically important for the generation of free radicals, the stimulation of blood clotting and the promotion of blood vessel damage in patients.
Professor Jeremy Pearson, Associate Medical Director at the British Heart Foundation, said: "With BHF funding, Dr Pula has developed an improved method to investigate part of the blood clotting process which focuses on the ways in which platelets from blood samples clump together.
"This method may be useful for future studies looking into new anti-platelet treatments for diseases, such as diabetes, where clotting is disturbed and increases the risk of heart attack or stroke."

Human cells can change job to fight diabetes

Traditional cell biology textbooks say that most specialized cells keep the same cell type, and the same function, for life.
It seems that some of these textbooks need to be rewritten, thanks to the new results by researchers at the University of Bergen and their international partners at Université de Genève (UNIGE), Harvard Medical School, Universiteit Leiden and the Oregon Stem Cell Center (OHSU).
The team's latest study shows that the cells in the human body are much more able to change into different cell types than previously assumed. The researchers are the first to have managed to influence the signals in human cells so that the cells change their original function.
"By influencing the glucagon-producing cells in the pancreas, we made them be able to produce insulin instead. This may lead to new treatments for diabetes," says Professor Helge Ræder, leader of the Diabetes Stem Cell Group, Department of Clinical Science, UiB.
The researchers observed that mice recovered from diabetes after manipulated human cells were transplanted into their pancreas, and became sick again as soon as these cells were removed.
Resistant cells
In addition to making the glucagon-producing cells produce insulin, the study showed that these cells were also more resistant to the immune system, which usually attacks insulin-producing cells in diabetes patients.
"This means that we probably can use the patient's own cells in this diabetes treatment, without being afraid that the manipulated cells will eventually be destroyed by the immune system," Ræder explains.
"Today, it is possible to transplant insulin producing cells from dead donors to diabetes patients. The big challenge is that we are only able to treat a very small fraction of the patients with this method."
A step toward new gene therapy
Ræder believes that the new method is not limited to only changing the function of the cells in the pancreas. He thinks that this cell flexibility will be found in many other types of cells in the human body, and may contribute to new treatments for many different diseases.
"The ability of cells to change their function may be important in the treatment of other diseases caused by cell death, including neurological diseases, heart attacks and cancer," says Helge Ræder.
Facts: Diabetes and pancreas
There are three different types of cells in the pancreas: alpha-cells, beta-cells and delta-cells. The cells form clusters and produce different kinds of hormones for blood sugar regulation.
Alpha-cells produce glucagon, which increases blood sugar levels. Beta-cells produce insulin, which decreases blood sugar levels. Delta-cells produce somatostatin, which regulates the alpha- and beta-cells.
People with type 1 diabetes have impaired beta-cell function, and therefore constantly high blood sugar levels. They need to take insulin by injection.

Helping patients breathe during dangerous procedure prevents complications

Thousands of Americans die each year during a dangerous two-minute procedure to insert a breathing tube.
Now a Vanderbilt University Medical Center (VUMC) study in the New England Journal of Medicine (NEJM) shows that using bag-mask ventilation -- squeezing air from a bag into the mouth for 60 seconds to help patients breathe -- improves outcomes and could potentially save lives.
"When you place a breathing tube, you have to give patients medications to make them relaxed and sleepy. And those medications take about a minute to kick in," said first author Jonathan D. Casey, MD, a Pulmonary and Critical Care Fellow at VUMC.
"After you give those medications, there is a big divide among doctors about whether to just wait and watch while their breathing slows and stops, or to provide ventilation (breath for the patient) with a bag-mask device. We found that providing ventilation with a bag-mask device is safe and very effective. Most importantly, it cut the rate of severely low oxygen levels in half."
Tracheal intubation, the process of placing a breathing tube, may be required to perform surgery or to support breathing during a serious illness. During tracheal intubation for illness, about 40 percent of people suffer low oxygen levels, which may cause damage to the brain and heart, and 2 percent of people suffer cardiac arrest, a sudden failure of heart function that is frequently fatal.
The PreVent trial (Preventing Hypoxemia with Manual Ventilation during Endotracheal Intubation) is a multicenter trial of bag-mask ventilation during tracheal intubation; the results, released today in the NEJM, have the potential to change practice across the nation, as more than 1.5 million patients undergo tracheal intubation each year in the U.S.
"Doctors have been performing this procedure for 50 years and there has always been controversy about the safest way to do it," Casey said. "Some doctors believe that when you squeeze the bag and force air into the lungs that will also put air into the stomach and put the patient at risk for vomiting of stomach contents into the lungs.
"That is not what we found. Our study found that bag-mask ventilation didn't cause the vomiting that people were worried about, and it was very effective at preventing low oxygen levels."
The multicenter trial was conducted in seven ICUs across the U.S., with adult patients undergoing the procedure receiving either ventilation with a bag-mask device or no ventilation between induction and laryngoscopy.
Among the 401 patients enrolled, the lowest median oxygen saturation was 96 percent in the bag-mask ventilation group as compared to 93 percent in the no-ventilation group.
A total of 21 patients in the bag-mask ventilation group had severely low oxygen levels, as compared with 45 patients in the no-ventilation group.
Vomiting of stomach contents into the lungs occurred during 2.5 percent of intubations in the bag-mask ventilation group and during 4 percent in the no-ventilation group.
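Putting the reported counts together recovers the "cut the rate of severely low oxygen levels in half" figure quoted earlier. This is a rough check, and the exact per-group sizes are not given in the text, so a roughly even split of the 401 enrolled patients is assumed here:

```python
# Severe hypoxemia counts reported in the trial summary.
events_bagmask = 21
events_noventilation = 45

# Assumed: ~equal randomization of the 401 enrolled patients
# (the text does not state the exact per-group sizes).
n_bagmask = 200
n_noventilation = 201

risk_bagmask = events_bagmask / n_bagmask                    # ~10.5%
risk_noventilation = events_noventilation / n_noventilation  # ~22.4%
relative_risk = risk_bagmask / risk_noventilation            # ~0.47

print(f"relative risk of severe hypoxemia: {relative_risk:.2f}")
```

A relative risk near 0.5 is consistent with the quoted halving; the exact value would shift slightly with the true group sizes.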
The Medical ICU at Vanderbilt now routinely uses bag-mask ventilation during placement of a breathing tube.
"It is important to act on what we learn. Not only did we immediately apply these important results to our practice, but we have started follow-up trials of other ways to improve the safety of tracheal intubation -- and those new trials require that bag-mask ventilation be provided for every patient receiving a breathing tube," said Matthew W. Semler, MD MSc, Associate Director of the Medical ICU at Vanderbilt and senior author on the study.
"The best thing about this intervention is that it is free," concluded David R. Janz, MD, MSc, Assistant Professor of Medicine at Louisiana State University and a co-author on the trial. "This is a device that is already always available when you are placing a breathing tube. In the past, we only used the bag-mask device to assist patients' breathing if we had difficulty placing a breathing tube. Now we know that it should be used in every procedure even before we make our first attempt to place a breathing tube."
The study was funded by the Vanderbilt Institute for Clinical and Translational Research; PreVent ClinicalTrials.gov number NCT03026322.

What happens to magnetic nanoparticles once in cells?

Although magnetic nanoparticles are being used more and more in cell imaging and tissue bioengineering, what happens to them within stem cells in the long term remained undocumented. Researchers from CNRS, the Sorbonne Université, and universities Paris Diderot and Paris 13, have shown substantial degradation of these nanoparticles, followed in certain cases by the cells "re-magnetizing." This phenomenon is the sign of biosynthesis of new magnetic nanoparticles from iron released in the intracellular medium by the degradation of the first nanoparticles. Published in PNAS on February 11, 2019, this work may explain the presence of "natural" magnetism in human cells, and help us to envisage new tools for nanomedicine, thanks to this magnetism produced by the cells themselves.
Magnetic nanoparticles are at the core of today's nanomedicine: they serve as imaging diagnosis agents, thermal anti-cancer agents, drug targeting agents, and tissue engineering agents. The question of their fate in cells, after they have accomplished their therapeutic mission, had remained largely unanswered.
To follow the journey of these nanoparticles in cells, researchers at the Laboratoire Matière et Systèmes Complexes (CNRS/Université Paris Diderot) and the Laboratoire de Recherche Vasculaire Translationnelle (INSERM/Université Paris Diderot/Université Paris 13), in collaboration with scientists from Sorbonne Université[1] have developed an original approach to nanomagnetism in living systems: first they incorporated magnetic nanoparticles in vitro in human stem cells. They then left them to differentiate and develop for one month, to observe them long term in the intracellular environment and to monitor their transformations.
By following the "magnetic fingerprint" of these nanoparticles in the cells, the researchers have shown that they were first being destroyed (cell magnetization falls) and releasing iron into the intracellular environment. Next, this "free" iron was stored in non-magnetic form in ferritin, the protein responsible for storing iron, or served as a base for the biosynthesis of new magnetic nanoparticles within the cell.
This phenomenon is known to occur in some bacteria, but a biosynthesis like this had never been shown in mammalian cells. This could explain the presence of magnetic crystals in humans, observed in the cells of diverse organs, particularly the brain. What is more, this iron storage in magnetic form could also be a way for the cell to "detoxify" over the long term to counter excess iron. From the point of view of nanomedicine, this biosynthesis opens up a new path to the possibility of purely biological magnetic marking in cells.

Penis development needs more than just testes and testosterone

An immunofluorescence image of a human fetal adrenal gland showing the steroidogenic cells.
Proper development of the fetal penis requires not just testosterone from the testes, but a second hormone produced by other tissues, including the placenta, according to a new study publishing February 14 in the open-access journal PLOS Biology from Paul Fowler of the University of Aberdeen, Michelle Bellingham of the University of Glasgow, and colleagues in the UK, France and Sweden. The results reveal a previously unknown pathway of masculinization of the external genitals, and may explain why placental dysfunction is associated with disorders of male genital development.
During development of the male fetus, the testes release testosterone, a steroid hormone which is converted to 5α-dihydrotestosterone (DHT) by the genital tubercle, helping to ensure that this primordial structure develops into a penis, rather than into the female clitoris. Recently, penis development was shown to also depend on a second process, called the alternative or "backdoor" pathway, which also ends in the production of DHT but doesn't depend on the production of testosterone by the testes. However, the details of this backdoor pathway, including the source of the DHT precursor, have been unclear.
To learn more about this pathway, the authors used mass-spectrometry to measure levels of different steroids in fetal plasma and tissue during the second trimester, when the most critical steps in penis development occur. They also analyzed gene expression levels in various tissues of key enzymes known to be involved in hormone synthesis.
They found that androsterone, a steroid from the backdoor pathway, which can be converted to DHT, was the principal androgen in the male fetal circulation, and that levels of both androsterone and testosterone were lower in the female fetal circulation. They also found that enzymes needed for the backdoor pathway were present primarily in non-gonadal tissue, including the liver and the placenta.
Since androsterone can be made from progesterone, the authors suggest that placental progesterone or related compounds are the likely source of androsterone in the backdoor pathway. While it remains unclear why a sex difference in fetal androsterone levels exists, the authors found high expression in the male genital tubercle of enzymes required to convert androsterone to DHT. The male genital tubercle appears, therefore, to be able to convert both testosterone and androsterone into DHT.
"Our results demonstrate that masculinization of the male fetus depends not only on the testes, but also on other tissues, especially the placenta" Fowler and Bellingham said. "They also suggest an explanation for why disorders of placental insufficiency can lead to hypospadias and other abnormalities of growth of the male external genitalia."

Climate change makes summer weather stormier yet more stagnant

Summer thunderstorm in the city.
Climate change is shifting the energy in the atmosphere that fuels summertime weather, which may lead to stronger thunderstorms and more stagnant conditions for midlatitude regions of the Northern Hemisphere, including North America, Europe, and Asia, a new MIT study finds.
Scientists report that rising global temperatures, particularly in the Arctic, are redistributing the energy in the atmosphere: More energy is available to fuel thunderstorms and other local, convective processes, while less energy is going toward summertime extratropical cyclones -- larger, milder weather systems that circulate across thousands of kilometers. These systems are normally associated with winds and fronts that generate rain.
"Extratropical cyclones ventilate air and air pollution, so with weaker extratropical cyclones in the summer, you're looking at the potential for more poor air-quality days in urban areas," says study author Charles Gertler, a graduate student in MIT's Department of Earth, Atmospheric and Planetary Sciences (EAPS). "Moving beyond air quality in cities, you have the potential for more destructive thunderstorms and more stagnant days with perhaps longer-lasting heat waves."
Gertler and his co-author, Associate Professor Paul O'Gorman of EAPS, are publishing their results in the Proceedings of the National Academy of Sciences.
A shrinking gradient
In contrast to more violent tropical cyclones such as hurricanes, extratropical cyclones are large weather systems that occur poleward of the Earth's tropical zone. These storm systems generate rapid changes in temperature and humidity along fronts that sweep across large swaths of the United States. In the winter, extratropical cyclones can whip up into Nor'easters; in the summer, they can bring everything from general cloudiness and light showers to heavy gusts and thunderstorms.
Extratropical cyclones feed off the atmosphere's horizontal temperature gradient -- the difference in average temperatures between northern and southern latitudes. This temperature gradient, together with the moisture in the atmosphere, produces a certain amount of energy in the atmosphere that can fuel weather events. The greater the gradient between, say, the Arctic and the equator, the stronger an extratropical cyclone is likely to be.
In recent decades, the Arctic has warmed faster than the rest of the Earth, in effect shrinking the atmosphere's horizontal temperature gradient. Gertler and O'Gorman wondered whether and how this warming trend has affected the energy available in the atmosphere for extratropical cyclones and other summertime weather phenomena.
They began by looking at a global reanalysis of recorded climate observations, known as the ERA-Interim Reanalysis, a project that has been collecting available satellite and weather balloon measurements of temperature and humidity around the world since the 1970s. From these measurements, the project produces a fine-grained global grid of estimated temperature and humidity, at various altitudes in the atmosphere.
From this grid of estimates, the team focused on the Northern Hemisphere regions between 20 and 80 degrees latitude. They took the average summertime temperature and humidity in these regions over June, July, and August for each year from 1979 to 2017. They then fed each yearly summertime average into an algorithm, developed at MIT, that estimates the amount of energy that would be available in the atmosphere given the corresponding temperature and humidity conditions.
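The data-handling step described here -- averaging June-August fields over the 20-80°N band for each year -- can be sketched with NumPy. Everything below is illustrative: the array shapes, variable names and synthetic data are assumptions, not the MIT energy algorithm or the actual ERA-Interim layout.

```python
import numpy as np

def jja_mean(field, lat):
    """Area-weighted June-August mean over 20-80N, one value per year.

    field: array of shape (years, 12, nlat, nlon) -- monthly means
    lat:   array of shape (nlat,) in degrees north
    """
    jja = field[:, 5:8]                      # months June, July, August
    band = (lat >= 20) & (lat <= 80)         # midlatitude belt
    weights = np.cos(np.deg2rad(lat[band]))  # grid cells shrink toward the pole
    # Average over months and longitude, then weight by latitude.
    zonal = jja[:, :, band, :].mean(axis=(1, 3))  # shape (years, nlat_band)
    return (zonal * weights).sum(axis=1) / weights.sum()

# Toy example: 3 years of synthetic "temperature" data on a coarse grid.
rng = np.random.default_rng(0)
lat = np.linspace(-90, 90, 19)
temperature = 280 + rng.normal(0, 5, size=(3, 12, lat.size, 36))
yearly = jja_mean(temperature, lat)
print(yearly.shape)  # one summertime average per year
```

In the study, a pair of such yearly averages (temperature and humidity) would then be passed to the energy-estimation algorithm; the cosine weighting simply accounts for the smaller area of grid cells at high latitudes.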
"We can see how this energy goes up and down over the years, and we can also separate how much energy is available for convection, which would manifest itself as thunderstorms for example, versus larger-scale circulations like extratropical cyclones," O'Gorman says.
Seeing changes now
Since 1979, they found the energy available for large-scale extratropical cyclones has decreased by 6 percent, whereas the energy that could fuel smaller, more local thunderstorms has gone up by 13 percent.
Their results mirror some recent evidence in the Northern Hemisphere, suggesting that summer winds associated with extratropical cyclones have decreased with global warming. Observations from Europe and Asia have also shown a strengthening of convective rainfall, such as from thunderstorms.
"Researchers are finding these trends in winds and rainfall that are probably related to climate change," Gertler says. "But this is the first time anyone has robustly connected the average change in the atmosphere, to these subdaily timescale events. So we're presenting a unified framework that connects climate change to this changing weather that we're seeing."
The researchers' results estimate the average impact of global warming on summertime energy of the atmosphere over the Northern Hemisphere. Going forward, they hope to be able to resolve this further, to see how climate change may affect weather in more specific regions of the world.
"We'd like to work out what's happening to the available energy in the atmosphere, and put the trends on a map to see if it's, say, going up in North America, versus Asia and oceanic regions," O'Gorman says. "That's something that needs to be studied more."

Ultra-lightweight ceramic material withstands extreme temperatures

The new ceramic aerogel is so lightweight that it can rest on a flower without damaging it.
UCLA researchers and collaborators at eight other research institutions have created an extremely light, very durable ceramic aerogel. The material could be used for applications like insulating spacecraft because it can withstand the intense heat and severe temperature changes that space missions endure.
Ceramic aerogels have been used to insulate industrial equipment since the 1990s, and they have been used to insulate scientific equipment on NASA's Mars rover missions. But the new version is much more durable after exposure to extreme heat and repeated temperature spikes, and much lighter. Its unique atomic composition and microscopic structure also make it unusually elastic.
When heated, the material contracts rather than expands, unlike other ceramics. It also contracts perpendicularly to the direction in which it's compressed -- imagine pressing a tennis ball on a table and having the sides of the ball move inward rather than bulging out -- the opposite of how most materials react when compressed. As a result, the material is far more flexible and less brittle than current state-of-the-art ceramic aerogels: It can be compressed to 5 percent of its original volume and fully recover, while other existing aerogels can be compressed to only about 20 percent of their original volume and then fully recover.
The research, which was published today in Science, was led by Xiangfeng Duan, a UCLA professor of chemistry and biochemistry; Yu Huang, a UCLA professor of materials science and engineering; and Hui Li of Harbin Institute of Technology, China. The study's first authors are Xiang Xu, a visiting postdoctoral fellow in chemistry at UCLA from Harbin Institute of Technology; Qiangqiang Zhang of Lanzhou University; and Menglong Hao of UC Berkeley and Southeast University.
Other members of the research team were from UC Berkeley; Purdue University; Lawrence Berkeley National Laboratory; Hunan University, China; Lanzhou University, China; and King Saud University, Saudi Arabia.
Although more than 99 percent of their volume is air, aerogels are solid and structurally very strong for their weight. They can be made from many types of materials, including ceramics, carbon or metal oxides. Compared with other insulators, ceramic-based aerogels are superior at blocking extreme temperatures, and they have ultralow density and are highly resistant to fire and corrosion -- all qualities that lend themselves well to reusable spacecraft.
But current ceramic aerogels are highly brittle and tend to fracture after repeated exposure to extreme heat and dramatic temperature swings, both of which are common in space travel.
The new material is made of thin layers of boron nitride, a ceramic, with atoms that are connected in hexagon patterns, like chicken wire.
In the UCLA-led research, it withstood conditions that would typically fracture other aerogels. It stood up to hundreds of exposures to sudden and extreme temperature spikes as the engineers cycled the temperature in a testing container between minus 198 degrees Celsius and 900 degrees Celsius within just a few seconds. In another test, it lost less than 1 percent of its mechanical strength after being stored for one week at 1,400 degrees Celsius.
"The key to the durability of our new ceramic aerogel is its unique architecture," Duan said. "Its innate flexibility helps it take the pounding from extreme heat and temperature shocks that would cause other ceramic aerogels to fail."
Ordinary ceramic materials usually expand when heated and contract when they are cooled. Over time, those repeated temperature changes can lead those materials to fracture and ultimately fail. The new aerogel was designed to be more durable by doing just the opposite -- it contracts rather than expanding when heated.
In addition, the aerogel's ability to contract perpendicularly to the direction in which it's being compressed -- like the tennis ball example -- helps it survive repeated and rapid temperature changes. (That property is known as a negative Poisson's ratio.) It also has interior "walls" that are reinforced with a double-pane structure, which cuts down the material's weight while increasing its insulating abilities.
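The negative Poisson's ratio mentioned above is simply a sign convention on strains: the ratio is defined as the negative of transverse strain over axial strain, so a material that shrinks sideways while being squeezed lengthwise yields a negative value. The strain numbers below are hypothetical, chosen only to illustrate the definition, and do not come from the UCLA measurements.

```python
def poisson_ratio(axial_strain, transverse_strain):
    """Poisson's ratio: the negative ratio of transverse strain to
    axial strain. Positive for ordinary materials that bulge outward
    under compression; negative for auxetic materials that contract."""
    return -transverse_strain / axial_strain

# Ordinary material: compressed 10% axially (negative strain),
# bulges 3% outward (positive transverse strain).
print(poisson_ratio(-0.10, 0.03))    # ~0.3, typical positive value

# Auxetic aerogel: compressed 10% axially, also contracts 2.5%
# in the transverse direction.
print(poisson_ratio(-0.10, -0.025))  # ~-0.25, negative Poisson's ratio
```

The sign of the result is what matters physically: a negative ratio means compression in one direction densifies the material in all directions, which is part of why the structure tolerates thermal shock.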
Duan said the process researchers developed to make the new aerogel also could be adapted to make other ultra-lightweight materials.
"Those materials could be useful for thermal insulation in spacecraft, automobiles or other specialized equipment," he said. "They could also be useful for thermal energy storage, catalysis or filtration."
The research was partly supported by grants from the National Science Foundation.

Novel C. diff structures are required for infection, offer new therapeutic targets

Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...