Thursday, 15 June 2023

Shining potential of missing atoms

 Single photons have applications in quantum computation, information networks, and sensors, and these can be emitted by defects in the atomically thin insulator hexagonal boron nitride (hBN). Missing nitrogen atoms have been suggested to be the atomic structure responsible for this activity, but it is difficult to controllably remove them. A team at the Faculty of Physics of the University of Vienna has now shown that single atoms can be kicked out using a scanning transmission electron microscope under ultra-high vacuum. The results are published in the journal Small.

Transmission electron microscopy allows us to see the atomic structure of materials, and it is particularly well suited to directly reveal any defects in the lattice of the specimen, which may be detrimental or useful depending on the application. However, the energetic electron beam may also damage the structure, either due to elastic collisions or electronic excitations, or a combination of both. Further, any gases left in the vacuum of the instrument can contribute to damage, whereby dissociated gas molecules can etch away atoms of the lattice. Until now, transmission electron microscopy measurements of hBN have been conducted at relatively poor vacuum conditions, leading to rapid damage. Due to this limitation, it has not been clear whether vacancies -- single missing atoms -- can be controllably created.

At the University of Vienna, the creation of single atomic vacancies has now been achieved using aberration-corrected scanning transmission electron microscopy in near ultra-high vacuum. The material was irradiated over a range of electron-beam energies, which strongly influence the measured damage rate. At low energies, damage is dramatically slower than previously measured under poorer residual vacuum conditions. Single boron and nitrogen vacancies can be created at intermediate electron energies, and boron is twice as likely to be ejected due to its lower mass. Although atomically precise measurements are not feasible at the higher energies previously used to make hBN emit single photons, the results predict that nitrogen in turn becomes easier to eject -- allowing these shining vacancies to be preferentially created.
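
The role of atomic mass can be made concrete with the kinematics of a single elastic collision: the maximum energy a beam electron can transfer to a nucleus scales inversely with the nuclear mass, so the lighter boron atom always receives more energy than nitrogen. Below is a minimal sketch of that calculation using the standard relativistic expression for the maximum transferred energy; the beam energies and isotope masses are illustrative assumptions, not values taken from the paper.

```python
# Maximum kinetic energy a beam electron can transfer to a lattice nucleus in a
# single elastic (knock-on) collision, using the standard relativistic
# approximation E_max ~ 2E(E + 2*m_e*c^2) / (M*c^2). Beam energies and isotope
# masses below are illustrative assumptions, not values from the paper.

ELECTRON_REST_EV = 511.0e3     # electron rest energy m_e*c^2 in eV
AMU_REST_EV = 931.494e6        # rest energy of one atomic mass unit in eV

def max_transfer_ev(beam_energy_ev, mass_amu):
    """Maximum energy (eV) transferred to a nucleus of the given mass."""
    e = beam_energy_ev
    return 2.0 * e * (e + 2.0 * ELECTRON_REST_EV) / (mass_amu * AMU_REST_EV)

for beam_kev in (50, 60, 80, 90):
    e_boron = max_transfer_ev(beam_kev * 1e3, 11.009)     # boron-11
    e_nitrogen = max_transfer_ev(beam_kev * 1e3, 14.003)  # nitrogen-14
    print(f"{beam_kev} keV beam: B-11 max {e_boron:.1f} eV, N-14 max {e_nitrogen:.1f} eV")
```

At any given beam energy the boron-11 nucleus can receive roughly a quarter more energy than nitrogen-14, which helps explain why boron vacancies appear first at intermediate energies; the predicted switch toward nitrogen ejection at higher energies comes from the element-specific thresholds and ionization effects treated in the authors' full model.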

Robust statistics collected by painstaking experimental work combined with new theoretical models were vital for reaching these conclusions. Lead author Thuy An Bui has worked on the project since her Master's thesis: "At each electron energy, I needed to spend many days at the microscope carefully collecting one series of data after another," she says. "Once the data was collected, we used machine learning to help analyse it accurately, though even this took a great deal of work." Senior author Toma Susi adds: "To understand the damage mechanism, we created an approximate model that combines ionization with knock-on damage. This allowed us to extrapolate to higher energies and shed new light on defect creation."

Saturday, 10 June 2023

Diet tracking: How much is enough to lose weight?

 Keeping track of everything you eat and drink in a day is a tedious task that is tough to keep up with over time. Unfortunately, dutiful tracking is a vital component of successful weight loss. However, a new study in Obesity finds that perfect tracking is not needed to achieve significant weight loss.

Researchers from UConn, the University of Florida, and the University of Pennsylvania followed 153 participants in a commercial digital weight loss program for six months as they self-reported their food intake. The researchers wanted to determine the optimal diet-tracking thresholds for predicting 3%, 5%, and 10% weight loss after six months.

"We partnered with WeightWatchers, who was planning on releasing a new Personal Points program, and they wanted to get empirical data via our clinical trial," says co-author and Department of Allied Health Sciences Professor Sherry Pagoto.

Pagoto explains that the new program takes a personalized approach to assigning points, including a list of zero-point foods that eliminates the need to calculate calories for everything:

"Dietary tracking is a cornerstone of all weight loss interventions, and it tends to be the biggest predictor of outcomes. This program lowers the burden of that task by allowing zero-point foods, which do not need to be tracked."

Researchers and developers are seeking ways to make the tracking process less burdensome because, as Pagoto says, users of many programs may feel like they need to count calories for the rest of their lives: "That's just not sustainable. Do users need to track everything every single day, or not necessarily?"

With six months of data, Ran Xu, an assistant professor in the Department of Allied Health Sciences, was interested to see if there was a way to predict outcomes based on how much diet tracking participants did. Xu and Allied Health Sciences Ph.D. student Richard Bannor analyzed the data from a data science perspective to see if there were patterns associated with weight loss success. Using a method called receiver operating characteristic (ROC) curve analysis, they determined how many days people need to track their food to reach clinically significant weight loss.

"It turns out, you don't need to track 100% each day to be successful," says Xu. "Specifically in this trial, we find that people only need to track around 30% of the days to lose more than 3% weight and 40% of the days to lose more than 5% weight, or almost 70% of days to lose more than 10% weight. The key point here is that you don't need to track every day to lose a clinically significant amount of weight."

This is promising since, as Pagoto points out, the goal for a six-month weight loss program is typically 5% to 10% weight loss, a range where health benefits have been seen in clinical trials.

"A lot of times people feel like they need to lose 50 pounds to get healthier, but actually we start to see changes in things like blood pressure, lipids, cardiovascular disease risk, and diabetes risk when people lose about 5-to-10% of their weight," says Pagoto. "That can be accomplished if participants lose about one to two pounds a week, which is considered a healthy pace of weight loss."

Xu then looked at trajectories of diet tracking over the six months of the program.

The researchers found three distinct trajectories. One group, which they call high trackers or super users, tracked food on most days of the week throughout the six months and on average lost around 10% of their weight.

However, many participants belonged to a second group that started out tracking regularly but gradually declined, reaching only about one day per week by the four-month mark. They still lost about 5% of their weight.

A third group, called the low trackers, started tracking only three days a week, and dropped to zero by three months, where they stayed for the rest of the intervention. On average this group lost only 2% of their weight.
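
Trajectory groups like these are usually identified with group-based trajectory or latent class growth modelling. Purely as an illustration of the idea -- not the authors' method or data -- the sketch below clusters simulated weekly tracking counts into three groups with k-means.

```python
# Rough illustration only: cluster simulated weekly tracking counts into three
# trajectory groups with k-means (the study itself would use a formal
# trajectory/latent class model; the group sizes and shapes here are invented).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
weeks = 26                                     # six months of weekly counts

def simulate(start, end, n):
    """Days tracked per week drifting linearly from `start` to `end` days."""
    base = np.linspace(start, end, weeks)
    return np.clip(base + rng.normal(0, 0.7, (n, weeks)), 0, 7)

profiles = np.vstack([                         # invented split of 153 participants
    simulate(6, 6, 50),   # high trackers: steady ~6 days/week
    simulate(6, 1, 70),   # decliners: taper from ~6 to ~1 day/week
    simulate(3, 0, 33),   # low trackers: ~3 days/week, dropping to zero
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(profiles)
for k in range(3):
    mean_traj = profiles[labels == k].mean(axis=0)
    print(f"group {k}: week 1 ~ {mean_traj[0]:.1f} days, week 26 ~ {mean_traj[-1]:.1f} days")
```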

"One thing that is interesting about this data is, oftentimes in the literature, researchers just look at whether there is a correlation between tracking and overall weight loss outcomes. Ran took a data science approach to the data and found there is more to the story," Pagoto says. "Now we're seeing different patterns of tracking. This will help us identify when to provide extra assistance and who will need it the most."

These patterns could inform future programs, which could be tailored to improve user tracking based on which group a participant falls into. Future studies will dig deeper into these patterns to understand why they arise and, hopefully, to develop interventions that improve outcomes.

"For me, what's exciting about these digital programs is that we have a digital footprint of participant behavior," says Xu. "We can drill down to the nitty-gritty of what people do during these programs. The data can inform precision medicine approaches, where we can take this data science perspective, identify patterns of behavior, and design a targeted approach."

Digitally delivered health programs give researchers a wealth of data they never had before, which can yield new insights, but this science requires a multidisciplinary approach.

"Before, it felt like we were flying in the dark or just going by anecdotes or self-reported measures, but it's different now that we have so much user data. We need data science to make sense of all these data. This is where team science is so important because clinical and data scientists think about the problem from very different perspectives, but together, we can produce insights that neither of us could do on our own. This must be the future of this work," says Pagoto.

Xu agrees: "From a data science perspective, machine learning is exciting but if we just have machine learning, we only know what people do, but we don't know why or what to do with this information. That's where we need clinical scientists like Sherry to make sense of these results. That's why team science is so important."

No longer flying in the dark, these multi-disciplinary teams of researchers now have the tools needed to start tailoring programs even further to help people achieve their desired outcomes. For now, users of these apps can be assured that they can still get significant results, even if they miss some entries.

Taurine may be a key to longer and healthier life

 A deficiency of taurine -- a nutrient produced in the body and found in many foods -- is a driver of aging in animals, according to a new study led by Columbia researchers and involving dozens of aging researchers around the world.

The same study also found that taurine supplements can slow down the aging process in worms, mice, and monkeys and can even extend the healthy lifespans of middle-aged mice by up to 12%.

The study was published June 8 in Science.

"For the last 25 years, scientists have been trying to find factors that not only let us live longer, but also increase healthspan, the time we remain healthy in our old age," says the study's leader, Vijay Yadav, PhD, assistant professor of genetics & development at Columbia University Vagelos College of Physicians and Surgeons.

"This study suggests that taurine could be an elixir of life within us that helps us live longer and healthier lives."

Anti-aging molecules within us

Over the past two decades, efforts to identify interventions that improve health in old age have intensified as people are living longer and scientists have learned that the aging process can be manipulated.

Many studies have found that various molecules carried through the bloodstream are associated with aging. Less certain is whether these molecules actively direct the aging process or are just passengers going along for the ride. If a molecule is a driver of aging, then restoring its youthful levels would delay aging and increase healthspan, the years we spend in good health.

Taurine first came into Yadav's view during his previous research into osteoporosis that uncovered taurine's role in building bone. Around the same time, other researchers were finding that taurine levels correlated with immune function, obesity, and nervous system functions.

"We realized that if taurine is regulating all these processes that decline with age, maybe taurine levels in the bloodstream affect overall health and lifespan," Yadav says.

Taurine declines with age, supplementation increases lifespan in mice

First, Yadav's team looked at levels of taurine in the bloodstream of mice, monkeys, and people and found that the taurine abundance decreases substantially with age. In people, taurine levels in 60-year-old individuals were only about one-third of those found in 5-year-olds.

"That's when we started to ask if taurine deficiency is a driver of the aging process, and we set up a large experiment with mice," Yadav says.

The researchers started with close to 250 female and male mice, each 14 months old (about 45 years old in human terms). Every day, the researchers fed half of them a bolus of taurine and the other half a control solution. At the end of the experiment, Yadav and his team found that taurine increased average lifespan by 12% in female mice and 10% in males. For the mice, that meant three to four extra months, equivalent to about seven or eight human years.

Taurine supplements in middle age improve health in old age

To learn how taurine impacted health, Yadav brought in other aging researchers who investigated the effect of taurine supplementation on health and lifespan in several species.

These experts measured various health parameters in mice and found that at age 2 (60 in human years), animals supplemented with taurine for one year were healthier in almost every way than their untreated counterparts.

The researchers found that taurine suppressed age-associated weight gain in female mice (even in "menopausal" mice), increased energy expenditure, increased bone mass, improved muscle endurance and strength, reduced depression-like and anxious behaviors, reduced insulin resistance, and promoted a younger-looking immune system, among other benefits.

"Not only did we find that the animals lived longer, we also found that they're living healthier lives," Yadav says.

At a cellular level, taurine improved many functions that usually decline with age: The supplement decreased the number of "zombie cells" (old cells that should die but instead linger and release harmful substances), increased survival after telomerase deficiency, increased the number of stem cells present in some tissues (which can help tissues heal after injury), improved the performance of mitochondria, reduced DNA damage, and improved the cells' ability to sense nutrients.

Similar health effects of taurine supplements were seen in middle-aged rhesus monkeys, which were given daily taurine supplements for six months. Taurine prevented weight gain, reduced fasting blood glucose and markers of liver damage, increased bone density in the spine and legs, and improved the health of their immune systems.

Randomized clinical trial needed

The researchers do not know yet if taurine supplements will improve health or increase longevity in humans, but two experiments they conducted suggest taurine has potential.

In the first, Yadav and his team looked at the relationship between taurine levels and approximately 50 health parameters in 12,000 European adults aged 60 and over. Overall, people with higher taurine levels were healthier, with fewer cases of type 2 diabetes, lower obesity levels, reduced hypertension, and lower levels of inflammation. "These are associations, which do not establish causation," Yadav says, "but the results are consistent with the possibility that taurine deficiency contributes to human aging."

The second study tested whether taurine levels respond to an intervention known to improve health: exercise. The researchers measured taurine levels in a variety of male athletes (sprinters, endurance runners, and natural bodybuilders) and in sedentary individuals before and after a strenuous cycling workout, and found a significant increase in taurine in all groups.

"No matter the individual, all had increased taurine levels after exercise, which suggests that some of the health benefits of exercise may come from an increase in taurine," Yadav says.

Only a randomized clinical trial in people will determine if taurine truly has health benefits, Yadav adds. Taurine trials are currently underway for obesity, but none are designed to measure a wide range of health parameters.

Other potential anti-aging drugs -- including metformin, rapamycin, and NAD analogs -- are being considered for testing in clinical trials.

"I think taurine should also be considered," Yadav says. "And it has some advantages: Taurine is naturally produced in our bodies, it can be obtained naturally in the diet, it has no known toxic effects (although it's rarely used in concentrations used ), and it can be boosted by exercise.

"Taurine abundance goes down with age, so restoring taurine to a youthful level in old age may be a promising anti-aging strategy."

Lingering effects of Neanderthal DNA found in modern humans

 Recent scientific discoveries have shown that Neanderthal genes comprise some 1 to 4% of the genome of present-day humans whose ancestors migrated out of Africa, but how much those genes still actively influence human traits has remained an open question -- until now.

A multi-institution research team including Cornell University has developed a new suite of computational genetic tools to address the genetic effects of interbreeding between humans of non-African ancestry and Neanderthals that took place some 50,000 years ago. (The study applies only to descendants of those who migrated from Africa before Neanderthals died out, and in particular, those of European ancestry.)

In a study published in eLife, the researchers reported that some Neanderthal genes are responsible for certain traits in modern humans, including several with a significant influence on the immune system. Overall, however, the study shows that modern human genes are winning out over successive generations.

"Interestingly, we found that several of the identified genes involved in modern human immune, metabolic and developmental systems might have influenced human evolution after the ancestors' migration out of Africa," said study co-lead author April (Xinzhu) Wei, an assistant professor of computational biology in the College of Arts and Sciences. "We have made our custom software available for free download and use by anyone interested in further research."

Using a vast dataset from the UK Biobank consisting of genetic and trait information of nearly 300,000 Brits of non-African ancestry, the researchers analyzed more than 235,000 genetic variants likely to have originated from Neanderthals. They found that 4,303 of those differences in DNA are playing a substantial role in modern humans and influencing 47 distinct genetic traits, such as how fast someone can burn calories or a person's natural immune resistance to certain diseases.

Unlike previous studies, which could not fully exclude the effects of modern human gene variants, the new study leveraged more precise statistical methods to focus on the variants attributable to Neanderthal genes.

While the study used a dataset of almost exclusively white individuals living in the United Kingdom, the new computational methods developed by the team could offer a path forward in gleaning evolutionary insights from other large databases to delve deeper into archaic humans' genetic influences on modern humans.

"For scientists studying human evolution interested in understanding how interbreeding with archaic humans tens of thousands of years ago still shapes the biology of many present-day humans, this study can fill in some of those blanks," said senior investigator Sriram Sankararaman, an associate professor at the University of California, Los Angeles. "More broadly, our findings can also provide new insights for evolutionary biologists looking at how the echoes of these types of events may have both beneficial and detrimental consequences."

How chronic stress drives the brain to crave comfort food

 When you're stressed, a high-calorie snack may seem like a comforting go-to. But this combination has an unhealthy downside. According to Sydney scientists, stress combined with calorie-dense 'comfort' food creates changes in the brain that drive more eating, boost cravings for sweet, highly palatable food and lead to excess weight gain.

A team from the Garvan Institute of Medical Research found that stress overrode the brain's natural response to satiety, leading to non-stop reward signals that promote eating more highly palatable food. This occurred in a part of the brain called the lateral habenula, which when activated usually dampens these reward signals.

"Our findings reveal stress can override a natural brain response that diminishes the pleasure gained from eating -- meaning the brain is continuously rewarded to eat," says Professor Herzog, senior author of the study and Visiting Scientist at the Garvan Institute.

"We showed that chronic stress, combined with a high-calorie diet, can drive more and more food intake as well as a preference for sweet, highly palatable food, thereby promoting weight gain and obesity. This research highlights how crucial a healthy diet is during times of stress."

The research was published in the journal Neuron.

From stressed brain to weight gain

While some people eat less during times of stress, most will eat more than usual and choose calorie-rich options high in sugar and fat.

To understand what drives these eating habits, the team investigated in mouse models how different areas in the brain responded to chronic stress under various diets.

"We discovered that an area known as the lateral habenula, which is normally involved in switching off the brain's reward response, was active in mice on a short-term, high-fat diet to protect the animal from overeating. However, when mice were chronically stressed, this part of the brain remained silent -- allowing the reward signals to stay active and encourage feeding for pleasure, no longer responding to satiety regulatory signals," explains first author Dr Kenny Chi Kin Ip from the Garvan Institute.

"We found that stressed mice on a high-fat diet gained twice as much weight as mice on the same diet that were not stressed."

The researchers discovered that at the centre of the weight gain was the molecule NPY, which the brain produces naturally in response to stress. When the researchers blocked NPY from activating brain cells in the lateral habenula in stressed mice on a high-fat diet, the mice consumed less comfort food, resulting in less weight gain.

Driving comfort eating

The researchers next performed a 'sucralose preference test' -- allowing mice to choose to drink either water or water that had been artificially sweetened.

"Stressed mice on a high-fat diet consumed three times more sucralose than mice that were on a high-fat diet alone, suggesting that stress not only activates more reward when eating but specifically drives a craving for sweet, palatable food," says Professor Herzog.

"Crucially, we did not see this preference for sweetened water in stressed mice that were on a regular diet."

Stress overrides healthy energy balance

Novel C. diff structures are required for infection, offer new therapeutic targets

  Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...