Friday, 3 November 2023

Gulf War Illness significantly reduces white blood cells' ability to make energy

A new Duke University-led study finds that Gulf War Illness (GWI), which affects approximately 250,000 U.S. veterans, significantly reduces their white blood cells' ability to make energy and creates a measurable biochemical difference in veterans who have the disease.

"Historically, GWI has been diagnosed based on a veteran's self-reported symptoms, such as exercise-induced fatigue, indigestion, dizziness, insomnia, or memory problems. There's been no objective biochemical or molecular measurements doctors could use to diagnose it," said Joel Meyer, professor of environmental genomics at Duke's Nicholas School of the Environment, who led the new study.

The new study provides measurements accessible in blood samples. Though not sufficient to serve as a stand-alone diagnostic test, these measurements could help improve treatment for veterans suffering from Gulf War Illness by giving doctors a new way to assess whether a prescribed treatment is helping, Meyer said.

"Knowing this is an energetic deficiency can help us zero in on more effective ways to relieve the symptoms," Meyer said. "Blood tests, repeated over the course of the treatment, would show if a veteran's white blood cells are responding to a treatment and producing more energy."

He and his coauthors from Duke, the U.S. Department of Veterans Affairs' War-Related Illness and Injury Study Center, and the New Jersey Medical School published the new peer-reviewed paper Nov. 1 in the open-access journal PLOS ONE.

Their research reveals that Gulf War Illness inhibits white blood cells' energy production by impairing the workings of the cells' mitochondria, structures within the cell that extract energy from food and convert it into the chemical power needed to fuel growth, movement and other bodily processes. Mitochondria are often referred to as the 'power plants' of the cell.

"The idea to investigate the role mitochondria might be playing in GWI came from Mike Falvo, one of my coauthors from Veterans Affairs and the New Jersey Medical School, who had noticed that a lot of GWI symptoms were similar to those associated with mitochondrial diseases," said Meyer. "So, we analyzed mitochondrial respiration and extracellular acidification, which are proxies for energy generation, in the white blood cells of 114 Gulf War veterans, 80 of whom had been diagnosed with GWI. We also looked for evidence of mitochondrial DNA damage and nuclear DNA damage."

The analyses revealed no evidence of DNA damage, but they did show significantly lower levels of extracellular acidification and oxygen consumption in the white blood cells from veterans with GWI -- signs that their mitochondria were generating less energy.

Follow-up blood tests on about a third of the veterans showed that some of these levels could vary over time, but the general pattern remained: the cells of veterans with GWI produced less energy.

The cause of Gulf War Illness is still unknown. To determine if environmental factors might play a role, Meyer and his colleagues turned to the veterans' surveys of self-reported symptoms and their written recollections of their deployments.

"We found veterans who recalled being exposed to pesticides and pyridostigmine bromide, a drug used during the Gulf War as a pretreatment to protect troops from the harmful effects of nerve agents, were more likely to get GWI after deployment," Meyer said. "An interesting question is how these effects have persisted so long after the exposures."

Coauthors on the new paper were William Pan and Ian Ryde of Duke; Thomas Alexander, Jacquelyn Klein-Adams and Duncan Ndirangu of the U.S. Department of Veterans Affairs' War-Related Illness and Injury Study Center (WRIISC); and Michael Falvo of WRIISC and the New Jersey Medical School at Rutgers Biomedical and Health Sciences.

Strawberry consumption may reduce dementia risk for middle-aged individuals

New research from the University of Cincinnati found that daily strawberry consumption could help reduce the risk of dementia for certain middle-aged populations.

Research background

In 2022, UC's Robert Krikorian, PhD, and his team published research that found adding blueberries to the daily diets of certain middle-aged populations may lower the chances of developing late-life dementia. He said the current research into strawberries is an extension of the blueberry research.

"Both strawberries and blueberries contain antioxidants called anthocyanins, which have been implicated in a variety of berry health benefits such as metabolic and cognitive enhancements," said Krikorian, professor emeritus in the UC College of Medicine's Department of Psychiatry and Behavioral Neuroscience. "There is epidemiological data suggesting that people who consume strawberries or blueberries regularly have a slower rate of cognitive decline with aging."

In addition to containing anthocyanins, Krikorian said strawberries contain additional micronutrients called ellagitannins and ellagic acid that have been associated with health benefits.

About 50% of individuals in the U.S. develop insulin resistance, commonly referred to as prediabetes, around middle age; insulin resistance has been shown to be a factor in chronic diseases. Krikorian said the metabolic and cardiovascular benefits of strawberry consumption have been studied previously, but there have been relatively few studies of its cognitive effects.

"This study assessed whether strawberry consumption might improve cognitive performance and metabolic health in this population and, if so, whether there might be an association between cognitive enhancement and reduced metabolic disturbance," he said.

Research methodology

A total of 30 overweight patients between 50 and 65 years old with complaints of mild cognitive decline were enrolled in and completed the study. Krikorian said this population has an increased risk for late-life dementia and other common conditions.

Over a period of 12 weeks, the participants were asked to abstain from berry fruit consumption of any kind except for a daily packet of supplement powder to be mixed with water and consumed with breakfast. Half of the participants received powders that contained the equivalent of one cup of whole strawberries (the standard serving size), while the other half received a placebo.

The participants were given tests that measured certain cognitive abilities like long-term memory. The researchers also tracked their mood, intensity of depressive symptoms and metabolic data over the course of the study.

Those in the strawberry powder group had diminished memory interference, which is consistent with an overall improvement in executive ability.

"Reduced memory interference refers to less confusion of semantically related terms on a word-list learning test," Krikorian said. "This phenomenon generally is thought to reflect better executive control in terms of resisting intrusion of non-target words during the memory testing."

The strawberry-treated participants also had a significant reduction in depressive symptoms, which Krikorian said can be understood as resulting from "enhanced executive ability that would provide better emotional control and coping and perhaps better problem-solving."

Other strawberry studies have found improvements in metabolic measures, including lower insulin levels, but this study found no effect on the patients' metabolic health.

"Those studies generally used higher dosages of strawberry powder than in our research, and this could have been a factor," Krikorian said.

Next steps

While more research is needed, Krikorian said the strawberry treatment may have improved cognitive function by reducing inflammation in the brain.

"Executive abilities begin to decline in midlife and excess abdominal fat, as in insulin resistance and obesity, will tend to increase inflammation, including in the brain," he said. "So, one might consider that our middle-aged, overweight, prediabetic sample had higher levels of inflammation that contributed to at least mild impairment of executive abilities. Accordingly, the beneficial effects we observed might be related to moderation of inflammation in the strawberry group."

Moving forward, Krikorian said future research trials should include larger samples of participants and differing dosages of strawberry supplementation.

Practicing mindfulness can help people make heart-healthy eating choices

Practicing mindfulness focused on healthy eating can be good for the heart, a new study shows, because it improves self-awareness and helps people stick to a heart-healthy diet.

When people who had elevated blood pressure participated in an eight-week mindfulness-based blood pressure reduction program for the study, they significantly improved their scores on measures of self-awareness and adherence to a heart-healthy diet compared to a control group. The results were published in JAMA Network Open.

"Participants in the program showed significant improvement in adherence to a heart-healthy diet, which is one of the biggest drivers of blood pressure, as well as significant improvements in self-awareness, which appears to influence healthy eating habits," said lead study author Eric B. Loucks, an associate professor of epidemiology, behavioral and social sciences, and director of the Mindfulness Center at Brown University.

Loucks said the study helps explain the mechanism by which a customized mindfulness training program adapted toward improving diet can affect blood pressure.

"Improvements in our self-awareness, of how different foods make us feel, of how our body feels in general, as well as our thoughts, emotions and physical sensations around eating healthy as well as unhealthy food, can influence people's dietary choices," he said.

High blood pressure, a major cause of cardiovascular disease, is the single most important risk factor for early death worldwide, according to a recent report by the World Health Organization, leading to an estimated 10.8 million avoidable deaths every year. The important thing to note about those avoidable deaths, Loucks said, is that there is ample research supporting effective strategies to control and prevent hypertension.

"Almost everyone has the power to control blood pressure through changes in diet and physical activity, adherence to antihypertensive medications, minimizing alcohol intake and monitoring stress reactivity," he said.

A heart-focused mindfulness program

The mindfulness-based blood pressure reduction program used in the study, which Loucks developed in 2014, trains participants in skills such as meditation, yoga, self-awareness, attention control and emotion regulation. What makes the program unique, he said, is that participants learn how to direct those skills toward behaviors known to lower blood pressure.

The MB-BP plan consisted of a group orientation session, eight 2.5-hour weekly group sessions and one day-long retreat, as well as recommended home practice for 45 minutes, six days a week. The program was led by trained instructors with expertise in cardiovascular disease etiology, treatment and prevention. Classes were held in Providence, R.I., at Brown University and at a health center in a lower-income, urban neighborhood.

The study compared two groups, totaling 201 participants. The 101 people in the test group were part of the 8-week MB-BP program, which included personalized feedback and education about hypertension risk factors; mindfulness training in relation to hypertension risk factors (including mindful eating); and behavior change support. The "usual care" control group received educational brochures on controlling high blood pressure. Both groups received a home blood-pressure monitoring device with usage training, and options for referral to primary care physicians.

The researchers focused on participant adherence to the DASH (Dietary Approaches to Stop Hypertension) program, a balanced eating plan rich in fruits, vegetables, whole grains and low-fat dairy, intended to create a heart-healthy eating style for life. Despite its effectiveness, adherence to the DASH diet is typically low.

After six months, the mindfulness group showed a 0.34-point improvement in the DASH diet score, while the control group's score declined by 0.04 points. Loucks explained that an effect of this size is equivalent to a participant shifting from a vegetable intake approaching recommended levels (2-3 servings) to recommended levels (at least 4 servings), or making a similar shift in another component of the DASH score.

The mindfulness group also showed a 0.71-point improvement over six months in its average score for interoceptive awareness (the process of sensing and interpreting signals from one's own body), outperforming the control group by a significant 0.54 points.

The authors said the trial results offer evidence that an adapted mindfulness training program for participants with high blood pressure that targets diet and self-awareness significantly improves both.

"The program gives participants the tools to make heart-healthy diet changes that can lower their blood pressure and decrease their risk of cardiovascular disease," Loucks said.

The researchers are studying different "doses" of the program (for example, shorter program lengths, fewer sessions), as well as factors influencing the implementation of the MB-BP plan in a real-world setting -- including eligibility for health insurance coverage, accessibility for different patient groups and flexibility for physicians.

Additional contributors from Brown University included Frances Saadeh, Matthew Scarpaci, Jeffrey Proulx, Roee Gutman and Willoughby Britton. The study was supported by the National Institutes of Health Science of Behavior Change Common Fund Program through an award administered by the National Center for Complementary and Integrative Health (UH2AT009145, UH3AT009145).

One sleepless night can rapidly reverse depression for several days

Most people who have pulled an all-nighter are all too familiar with that "tired and wired" feeling. Although the body is physically exhausted, the brain feels slap-happy, loopy and almost giddy.

Now, Northwestern University neurobiologists are the first to uncover what produces this punch-drunk effect. In a new study, researchers induced mild, acute sleep deprivation in mice and then examined their behaviors and brain activity. Not only did dopamine release increase during the acute sleep loss period, but synaptic plasticity was also enhanced -- literally rewiring the brain to maintain the bubbly mood for the next few days.

These new findings could help researchers better understand how mood states transition naturally. They could also lead to a more complete understanding of how fast-acting antidepressants (like ketamine) work and help researchers identify previously unknown targets for new antidepressant medications.

The research will be published online on Thursday (Nov. 2) in the journal Neuron. Northwestern postdoctoral fellow Mingzheng Wu is the paper's first author, and Professor Yevgenia Kozorovitskiy is the corresponding author.

"Chronic sleep loss is well studied, and its uniformly detrimental effects are widely documented," Kozorovitskiy said. "But brief sleep loss -- like the equivalent of a student pulling an all-nighter before an exam -- is less understood. We found that sleep loss induces a potent antidepressant effect and rewires the brain. This is an important reminder of how our casual activities, such as a sleepless night, can fundamentally alter the brain in as little as a few hours."

An expert in neuroplasticity, Kozorovitskiy is an associate professor of neurobiology and the Irving M. Klotz Professor at Northwestern's Weinberg College of Arts and Sciences.

Signs of sleep loss

Scientists long have known that acute perturbations in sleep are associated with altered mental states and behaviors. Alterations of sleep and circadian rhythms in patients, for example, can trigger mania or occasionally reverse depressive episodes.

"Interestingly, changes in mood state after acute sleep loss feel so real, even in healthy subjects, as experienced by myself and many others," Wu said. "But the exact mechanisms in the brain that lead to these effects have remained poorly understood."

To explore these mechanisms, Kozorovitskiy and her team developed a new experiment to induce acute sleep loss in mice that did not have genetic predispositions related to human mood disorders. The experimental setup needed to be gentle enough to avoid causing substantial stress for the animals but just uncomfortable enough to prevent the animals from falling asleep. After a sleepless night, the animals' behavior shifted to become more aggressive, hyperactive and hypersexual, compared to controls that experienced a typical night's sleep.

Using optical and genetically encoded tools, the researchers measured the activity of dopamine neurons, which are responsible for the brain's reward response. And they found activity was higher in animals during the brief sleep loss period.

"We were curious which specific regions of the brain were responsible for the behavioral changes," Kozorovitskiy said. "We wanted to know if it was a large, broadcast signal that affected the entire brain or if it was something more specialized."

Specialized signal

Kozorovitskiy and her team examined four regions of the brain responsible for dopamine release: the prefrontal cortex, nucleus accumbens, hypothalamus and dorsal striatum. After monitoring these areas for dopamine release following acute sleep loss, the researchers discovered that three of the four areas (the prefrontal cortex, nucleus accumbens and hypothalamus) were involved.

But the team wanted to narrow down the results even further, so they systematically silenced the dopamine reactions. The antidepressant effect disappeared only when researchers silenced the dopamine response in the medial prefrontal cortex. By contrast, the nucleus accumbens and hypothalamus appeared to be most involved in the hyperactivity behaviors but were less connected to the antidepressant effect.

"The antidepressant effect persisted except when we silenced dopamine inputs in the prefrontal cortex," Kozorovitskiy said. "That means the prefrontal cortex is a clinically relevant area when searching for therapeutic targets. But it also reinforces the idea that has been building in the field recently: Dopamine neurons play very important but very different roles in the brain. They are not just this monolithic population that simply predicts rewards."

Heightened neuroplasticity

While most of the behaviors (such as hyperactivity and increased sexuality) disappeared within a few hours following acute sleep loss, the antidepressant effect lingered for a few days. This suggested that synaptic plasticity in the prefrontal cortex might be enhanced.

When Kozorovitskiy and her team examined individual neurons, they discovered just that. The neurons in the prefrontal cortex formed tiny protrusions called dendritic spines, highly plastic features that change in response to brain activity. When the researchers used a genetically encoded tool to disassemble the synapses, it reversed the antidepressant effect.

Evolving to avoid predators?

While researchers do not fully understand why sleep loss causes this effect in the brain, Kozorovitskiy suspects evolution is at play.

"It's clear that acute sleep deprivation is somehow activating to an organism," Kozorovitskiy said. "You can imagine certain situations where there is a predator or some sort of danger where you need a combination of relatively high function with an ability to delay sleep. I think this could be something that we're seeing here. If you are losing sleep routinely, then different chronic effects set in that will be uniformly detrimental. But in a transient way, you can imagine situations where it's beneficial to be intensely alert for a period of time."

Kozorovitskiy also cautions people not to start pulling all-nighters in order to brighten a blue mood.

"The antidepressant effect is transient, and we know the importance of a good night's sleep," she said. "I would say you are better off hitting the gym or going for a nice walk. This new knowledge is more important when it comes to matching a person with the right antidepressant." 

Wednesday, 25 October 2023

Smartphone attachment could increase racial fairness in neurological screening

Engineers at the University of California San Diego have developed a smartphone attachment that could enable people to screen for a variety of neurological conditions, such as Alzheimer's disease and traumatic brain injury, at low cost -- and do so accurately regardless of their skin tone.

The technology, published in Scientific Reports, has the potential to improve the equity and accessibility of neurological screening procedures while making them widely available on all smartphone models.

The attachment fits over a smartphone's camera and improves its ability to capture clear video recordings and measurements of the pupil, which is the dark center of the eye. Recent research has shown that tracking pupil size changes during certain tasks can provide valuable insight into an individual's neurological functions. For example, the pupil tends to dilate during complex cognitive tasks or in response to unexpected stimuli.

However, tracking pupil size can be difficult in individuals with dark eye colors, such as those with darker skin tones, because conventional color cameras struggle to distinguish the pupil from the surrounding dark iris.

To enhance the visibility of the pupil, UC San Diego engineers equipped their smartphone attachment with a specialized filter that selectively permits a certain range of light into the camera. That range is called far-red light -- the extreme red end of the visible spectrum located just before infrared light. Melanin, the dark pigment in the iris, absorbs most visible wavelengths of light but reflects longer wavelengths, including far-red light. By imaging the eye with far-red light while blocking out other wavelengths, the iris appears significantly lighter, making it easier to see the pupil with a regular camera.

"There has been a large issue with medical device design that depends on optical measurements ultimately working only for those with light skin and eye colors, while failing to perform well for those with dark skin and eyes," said study senior author Edward Wang, an electrical and computer engineering professor in The Design Lab at UC San Diego, where he is the director of the Digital Health Technologies Lab. "By focusing on how we can make this work for all people while keeping the solution simple and low cost, we aim to pave the way to a future of fair access to remote, affordable healthcare."

Another feature of this technology that makes it more accessible is that it is designed to work on all smartphones. Traditionally, pupil measurements have been performed using infrared cameras, which are only available in high-end smartphone models. Since regular cameras cannot detect infrared light, this traditional approach limits accessibility to those who can afford more expensive smartphones. By using far-red light, which is still part of the visible spectrum and can be captured by regular smartphone cameras, this technology levels the playing field.

"The issue with relying on specialized sensors like an infrared camera is that not all phones have it," said study first author Colin Barry, an electrical and computer engineering Ph.D. student in Wang's lab. "We created an inexpensive and fair solution to provide these kinds of emerging neurological screenings regardless of the smartphone price, make or model."

To use the attachment, a person clips it over a smartphone's camera and places it over their eye. Then, the smartphone administers a pupil response test by providing a flash of bright light and recording video of the eye during the test. A machine learning model uses the recorded video of the eye to track pupil size.

The researchers tested their smartphone attachment on a diverse group of 12 volunteers with a wide range of eye colors, from light blue to dark brown. The smartphone measurements were validated against a pupillometer, the gold standard device used in the clinic for measuring pupil size.

The next phase of this project involves taking steps toward deploying the technology for large-scale neurological screenings in at-home environments. To reach that stage, the researchers are working on optimizing the design for mass manufacturing. They are also making the technology more user-friendly, especially for older adults, given their elevated risk for developing neurological conditions.

Wang and Barry have co-founded a company, Billion Labs Inc., to refine and commercialize the technology.

Paper: "Racially fair pupillometry measurements for RGB smartphone cameras using the far red spectrum."

This work is supported by the National Institute on Aging.

Disclosures: Edward Wang and Colin Barry are co-founders of and have a financial interest in Billion Labs Inc. Wang is also the CEO of Billion Labs. The terms of this arrangement have been reviewed and approved by the University of California San Diego in accordance with its conflict-of-interest policies.

Certain per- and polyfluoroalkyl 'forever chemicals' identified as potential risk factor for thyroid cancer

Mount Sinai researchers have discovered a link between certain per- and polyfluoroalkyl substances (PFAS) and an increased risk for thyroid cancer, according to a study published in eBioMedicine today.

PFAS, also known as "forever chemicals," are a large, complex group of synthetic chemicals that can migrate into the soil, water, and air. Due to their strong carbon-fluorine bonds, these chemicals do not degrade easily in the environment. Forever chemicals have been used in consumer products around the world since the 1940s, including nonstick cookware, water-repellent clothing, stain-resistant fabrics, and other products that resist grease, water, and oil.

Multiple national and international institutions, including the European Parliament and the U.S. Environmental Protection Agency (EPA), have declared PFAS exposure a health crisis. This study supports the actions needed to regulate and remove these chemicals from potential exposure routes. Although PFAS exposure has been identified as a potential contributor to recent increases in thyroid cancer, limited studies have investigated the association between PFAS exposure and thyroid cancer in human populations.

"With the substantial increase of thyroid cancer worldwide over recent decades, we wanted to dive into the potential environmental factors that could be the cause for this rise. This led us to the finding that PFAS, 'forever chemicals,' may at least partially explain the rise of thyroid cancer and are an area we should continue to study further," said co-corresponding author Maaike van Gerwen, MD, PhD, Assistant Professor and Director of Research for the Department of Otolaryngology -- Head and Neck Surgery, Icahn School of Medicine at Mount Sinai. "Thyroid cancer risk from PFAS exposure is a global concern given the prevalence of PFAS exposure in our world. This study provides critical evidence to support large-scale studies further exploring the effect of PFAS exposure on the thyroid gland."

The researchers investigated associations between plasma PFAS levels and thyroid cancer diagnosis using BioMe, a medical record-linked biobank at Icahn Mount Sinai. They studied 88 thyroid cancer patients with plasma samples collected either at or before cancer diagnosis and 88 non-cancer controls -- people who did not develop any form of cancer -- matched on sex, race/ethnicity, age (within five years), body mass index, smoking status, and year of sample collection. The researchers measured levels of eight PFAS in blood samples from the BioMe participants using untargeted metabolomics. The levels of individual PFAS were then compared between the participants who developed thyroid cancer and the healthy participants, using different statistical models to estimate the strength of the associations.

The results showed that exposure to linear perfluorooctanesulfonic acid (n-PFOS, one of the chemicals under the PFAS umbrella) was associated with a 56 percent increased risk of thyroid cancer diagnosis. Additionally, the researchers repeated the analysis in a subgroup of 31 patients who had at least a year between their enrollment in BioMe and their thyroid cancer diagnosis, to account for the time lag between exposure to PFAS chemicals and the development of disease. This second analysis also showed a positive association between n-PFOS exposure and thyroid cancer risk, as well as positive associations with a few additional PFAS chemicals, including branched perfluorooctanesulfonic acid, perfluorononanoic acid, perfluorooctylphosphonic acid, and linear perfluorohexanesulfonic acid.

"The results of this study provide further confirmation for the PFAS health crisis and underline the need to reduce, and hopefully one day eliminate, PFAS exposure," said co-corresponding author Lauren Petrick, PhD, Associate Professor of Environmental Medicine and Public Health, Icahn Mount Sinai. "Today, it's nearly impossible to avoid PFAS in our daily activities. We hope these findings bring awareness of the severity of these forever chemicals. Everyone should discuss their PFAS exposure with their treating physician to determine their risk and get screened if appropriate. In addition, we need continued industry changes to eliminate PFAS altogether."

This study was funded with pilot funding through the Department of Environmental Medicine and Public Health and the Institute for Exposomic Research's National Institute of Environmental Health Sciences-funded Center on Health and Environment Across the LifeSpan (HEALS), which supports research on environmental exposures and their effects on health across the life course.

Monday, 23 October 2023

Dietary supplement modifies gut microbiome -- potential implications for bone marrow transplant patients

Researchers at Baylor College of Medicine and the University of Michigan conducted a phase I pilot study to assess the feasibility of using potato starch as a dietary intervention to modify the gut microbiome in bone marrow transplant patients. The study, which appears in the journal Nature Medicine, is the first part of a two-phase ongoing clinical trial evaluating the effect of modifying the microbiome on the incidence of graft-versus-host disease (GVHD), a major complication that develops in up to half the patients who receive a bone marrow transplant and can lead to injury and death.

"The gut microbiome is a community of microbes we all carry inside the body, and its composition and products affect our health," said senior and co-corresponding author Dr. Pavan Reddy, currently professor and director of the Dan L Duncan Comprehensive Cancer Center at Baylor and previously at the University of Michigan. "Early research from our lab and others showed that a normal gut microbiome and its products change after a bone marrow transplant and that this change contributes to GVHD aggravation. Can we alter the progression of GVHD by modifying the microbiome?"

Previous preclinical data from the Reddy lab demonstrated that butyrate, a compound produced by healthy intestinal bacteria when they digest resistant potato starch (a form of starch that people cannot digest), was significantly decreased in the gut of mice experiencing GVHD. Restoring butyrate levels by increasing intestinal butyrate-producing bacteria reduced experimental acute GVHD severity and mortality.

"Having more butyrate in the gut is helpful for healing the intestine, and therefore beneficial for alleviating GVHD," Reddy said. "Previous studies have also shown that in healthy people, potato starch also promotes an increase in butyrate-producing bacteria and intestinal levels of butyrate."

These findings led the researchers to investigate whether increasing intestinal butyrate-producing bacteria and intestinal butyrate levels in bone marrow transplant patients would reduce or prevent the progression of GVHD.

"We began by assessing in the current study whether it was safe and practical for 10 patients undergoing a bone marrow transplant at the University of Michigan to take a food supplement made from resistant potato starch for more than 100 days and whether this would change the products of the gut bacteria that live in the stomach and intestine in a way that could possibly prevent GVHD after transplant," said first and co-corresponding author Dr. Mary Riwes, clinical assistant professor in medical oncology, internal medicine and hematology at the University of Michigan's Rogel Cancer Center.

The goal was that 60% or more of the patients would take 70% or more of the potato starch doses. "Our findings surpassed our expectations," Reddy said. "We found that more than 80% of the patients took 84% of the doses with no negative side effects, suggesting that it is safe and feasible for most patients to regularly take the food supplement. We also found that butyrate levels in the stools were significantly higher in the participants that took potato starch than in those that did not, as we had seen both in mice and healthy adults."
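The feasibility criterion described above (at least 60% of patients taking at least 70% of their doses) is simple to express as a calculation. The sketch below illustrates it with hypothetical dose counts, not the trial's actual per-patient data.

```python
# Feasibility check matching the adherence goal described above:
# the trial is feasible if >= 60% of patients take >= 70% of doses.
# The dose counts below are hypothetical, for illustration only.

def meets_feasibility_goal(doses_taken, doses_prescribed,
                           patient_threshold=0.60, dose_threshold=0.70):
    """Return True if enough patients were sufficiently adherent."""
    adherent = [t / p >= dose_threshold
                for t, p in zip(doses_taken, doses_prescribed)]
    return sum(adherent) / len(adherent) >= patient_threshold

# Ten hypothetical patients, each prescribed 100 doses of resistant starch.
taken = [95, 88, 91, 72, 40, 84, 90, 77, 86, 93]
prescribed = [100] * 10

print(meets_feasibility_goal(taken, prescribed))  # 9 of 10 adherent -> True
```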

"In the next part of the study, currently open at the University of Michigan and soon to open at a second site at Baylor College of Medicine, we will determine whether taking potato starch will indeed result in less GVHD after transplant," Riwes said.

"It's exciting that this is the first demonstration that a simple, safe dietary intervention has an effect on the gut microbiome and metabolites in these patients," Reddy said. "Of the 10 patients in this study, only one developed GVHD, while typically about half of bone marrow transplant patients develop the condition. We have enrolled more patients in our ongoing phase 2 clinical trial to evaluate the value of potato starch in reducing the incidence of GVHD in transplant patients."

"We could potentially be able to use a food substance in patients undergoing a bone marrow transplant, as a simple-to-administer, low-cost and relatively safe approach to prevent GVHD, which is a major limitation to the life-saving capability of a bone-marrow transplant," Riwes said.

Jonathan L. Golob, John Magenau, Mengrou Shan, Gregory Dick, Thomas Braun, Thomas M. Schmidt, Attaphol Pawarode, Sarah Anand, Monalisa Ghosh, John Maciejewski, Darren King, Sung Choi, Gregory Yanik, Marcus Geer, Ethan Hillman, Costas A. Lyssiotis and Muneesh Tewari also contributed to this work.

Potential for injectable 'chemical vaccine' for malaria using atovaquone

 Johns Hopkins researchers looking to develop a long-acting, injectable malaria preventive using atovaquone have shown in a new study that resistance may not be the challenge scientists thought it was, particularly when atovaquone is used as a preventive. Malaria parasites in infected patients being treated with atovaquone tend to develop resistance to the drug. Because of this, atovaquone by itself is not used as a malaria treatment, nor has it been seen as a strong candidate for use as a preventive.

The study, led by a team of researchers at the Johns Hopkins Malaria Research Institute and the Johns Hopkins University School of Medicine, in conjunction with colleagues at the University of Liverpool, was published online October 12 in Nature Communications. The Malaria Research Institute is based at the Johns Hopkins Bloomberg School of Public Health.

In their study, the researchers found that the same genetic mutation that renders malaria parasites resistant to atovaquone in patients also destroys the parasite's ability to live within mosquito hosts -- meaning atovaquone-resistant malaria parasites would not be transmissible. The researchers concluded that atovaquone, despite concerns over resistance, holds promise as a long-acting, injectable "chemical vaccine" that could prevent infection in malaria-endemic areas.

"These findings should reduce concerns about the transmission of atovaquone resistance with atovaquone therapy, particularly when it is used as a chemical vaccine," says study senior author Theresa Shapiro, MD, PhD, professor of Clinical Pharmacology in the Johns Hopkins University School of Medicine and professor in the W. Harry Feinstone Department of Molecular Microbiology and Immunology at the Bloomberg School.

Malaria continues to be a major global health burden. According to the World Health Organization, the mosquito-borne parasitic disease afflicted nearly a quarter of a billion people in 2021, killing more than 600,000. Researchers generally agree that, despite the impact of insecticides and other malaria control measures, and the recent development of a malaria vaccine, new approaches against this deadly parasitic pathogen are needed.

One new approach, described by Shapiro and colleagues at the University of Liverpool in a 2018 preclinical study, would use an injectable, slow-release formulation of atovaquone to provide vaccine-like protection for weeks at a time. Atovaquone is generally considered safe for long-term use even at higher doses, and has the further advantage that it interrupts the malaria life-cycle in human hosts even at the pre-symptomatic stage, when the parasite is developing in liver cells.

However, when atovaquone is used not as a preventive but as a treatment for symptomatic malaria infection, it often fails due to the emergence of genetically acquired resistance. Shapiro notes that by the time an infection is symptomatic, it involves billions of individual malaria organisms, and in this vast population it is likely that a resistance mutation will appear, if only by random genetic variation. Under atovaquone treatment, parasites with this mutation will come to dominate the infection. Because of the resistance problem, atovaquone is used to treat malaria only in combination with another antimalarial called proguanil.

Resistance should be much less likely when using atovaquone as a preventive in people who are malaria-free, Shapiro says. The drug in such cases would be acting against a far smaller number of individual parasites that are only in the early, liver-infection stage.

"In fact, there are no reported cases of atovaquone resistance when the drug has been given prophylactically," she says.

Nevertheless, fear of resistance has left a cloud over the drug's use even as a preventive. Indeed, there have been concerns that the mutation, once it emerged -- for example, in a large population treated prophylactically with atovaquone -- could spread via human-to-mosquito-to-human transmission.

In the study, Shapiro's team examined the resistance problem, focusing on a key resistance mutation, cytochrome-b Y268S, that has been found in clinical investigations involving the major malaria parasite of concern, Plasmodium falciparum. The researchers confirmed that P. falciparum parasites carrying this mutation are thousands of times less susceptible to atovaquone, compared to unmutated parasites.

However, the scientists also found that the Y268S mutation, while it enables P. falciparum to survive in human hosts being treated with atovaquone, essentially destroys its ability to live within its Anopheles mosquito hosts. This means that atovaquone-resistant mutant parasites cannot spread via transmission from humans to mosquitoes and back again -- as the researchers demonstrated using mosquitoes and a P. falciparum-infectable mouse model. For the study, the mice were engrafted with human liver cells and human red blood cells.

"Testing the mutant parasites for their ability to infect humanized mice is the best in vivo assay we have short of using humans, and strongly supports the inability of drug-resistant parasites to be transmitted by mosquitoes," says Photini Sinnis, MD, deputy director at the Johns Hopkins Malaria Research Institute and one of the paper's senior authors.

The findings suggest that a "chemical vaccine" strategy for protecting people from malaria with atovaquone remains viable and should continue to be investigated. Shapiro and colleagues are collaborating with Andrew Owen, PhD, a professor at the University of Liverpool, and his team to complete preclinical studies and launch a Phase I trial. Owen is principal investigator for LONGEVITY, an international project funded by Unitaid that aims to translate long-acting medicines for malaria and other diseases that disproportionately affect people in low- and middle-income countries.

"Many advances in malaria medicines that have started at small scale for the protection of travelers, later see wider use in endemic areas where they are most needed -- and this may be the path atovaquone takes as a chemical vaccine," Shapiro says.

The study's first author was Victoria Balta, PhD, a graduate student working with coauthor David Sullivan, MD, a professor in the Bloomberg School's Department of Molecular Microbiology and Immunology.

Pupil response may shed light on who responds best to transcranial magnetic stimulation for depression

 New findings from researchers at UCLA Health suggest that measuring changes in how pupils react to light could help predict recovery from depression and personalize transcranial magnetic stimulation (TMS) treatment of major depressive disorder.

TMS is a safe, non-invasive therapy that uses magnetic fields to stimulate parts of the brain involved in mood regulation. While TMS is proven effective, not all patients respond equally well to the therapy. The ability to predict who will benefit most could allow doctors to better customize and target treatments.

In two recent studies, UCLA scientists found that the pupil's response to light before treatment correlated with improvements in depression symptoms over the course of therapy. Pupil size reflects activation of the autonomic nervous system, which controls involuntary functions and is negatively impacted in people with depression.

The first study, appearing in the Journal of Affective Disorders, reports outcomes for 51 patients who underwent daily TMS sessions. Before treatment, researchers measured each patient's baseline pupillary constriction amplitude (CA): how much the pupil shrinks when exposed to light, an indicator of parasympathetic nervous system function. The researchers found a significant association between baseline CA and symptom improvement: patients with larger pupil constriction in response to light at baseline showed greater symptom improvement over the full course of treatment.

The second study, appearing in Brain Stimulation, went further and compared patients who were treated for depression with one of two common TMS protocols: 10 Hz stimulation and intermittent theta burst stimulation (iTBS). In 10 Hz stimulation, magnetic pulses are delivered continuously at a fixed rate of 10 pulses per second (10 Hz), a relatively high-frequency protocol. iTBS is a faster form of stimulation that delivers bursts of three pulses at 50 Hz, repeated with short breaks between bursts, a pattern thought to mimic the natural rhythm of certain brain activities.
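The difference between the two pulse patterns can be sketched numerically. The timing parameters below (train length, burst count, burst rate) are typical published values for these protocols, not figures taken from the UCLA studies themselves.

```python
# Sketch of pulse timing in the two TMS protocols described above.
# Parameters are typical published values, used here only to illustrate
# how the two patterns differ; they are not from the UCLA studies.

def pulses_10hz(train_s=4.0, rate_hz=10.0):
    """Pulse times (s) within one 10 Hz train: evenly spaced pulses."""
    return [i / rate_hz for i in range(int(train_s * rate_hz))]

def pulses_itbs(n_bursts=10, burst_rate_hz=5.0, intra_burst_hz=50.0):
    """Pulse times (s) for iTBS: 3-pulse bursts at 50 Hz, bursts at 5 Hz."""
    times = []
    for b in range(n_bursts):
        burst_start = b / burst_rate_hz
        times.extend(burst_start + k / intra_burst_hz for k in range(3))
    return times

print(len(pulses_10hz()))  # 40 pulses in one 4-second 10 Hz train
print(len(pulses_itbs()))  # 30 pulses in one 2-second iTBS "on" period
```

The sketch makes the contrast concrete: iTBS packs its pulses into brief 50 Hz bursts with gaps between them, while 10 Hz stimulation spaces pulses evenly.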

The researchers found that people with slower pupillary constriction had significantly greater improvement in depression after 10 sessions if they received iTBS rather than 10 Hz treatment.

"These results suggest we may be able to use a simple test of the pupil to identify who is most likely to respond to electromagnetic stimulation of the brain to treat their depression," said researcher Cole Citrenbaum, lead author of both studies and a researcher with the TMS Clinical and Research Program at the Semel Institute for Neuroscience and Human Behavior at UCLA.

The researchers propose that measuring pupillary reactivity before starting TMS could eventually help guide treatment selection on an individual basis. "Additionally, we may be able to tailor the frequency of stimulation to the individual patient to maximize their benefit from treatment," Citrenbaum said. This personalized approach could lead to better outcomes for patients.

"At the present time, about 65% of patients treated with TMS have a substantial improvement in their depression," said Dr. Andrew F. Leuchter, senior author of both studies and Distinguished Professor of Psychiatry at the Jane and Jerry Semel Institute for Neuroscience and Human Behavior at UCLA. "Our goal is to have more than 85% of patients fully recover from depression. As we better understand the complex brain activity underlying depression, we move closer to matching patients with the treatments that ensure their full recovery. Pupil testing may be one useful tool in reaching this goal."

The studies add to growing evidence on the benefits of biologically-based personalization in treating major depression. UCLA researchers plan further trials to confirm the value of pupillometry in optimizing transcranial magnetic stimulation.

Sunday, 22 October 2023

NUS scientists develop innovative magnetic gel that heals diabetic wounds three times faster

 Diabetic patients, whose natural wound-healing capabilities are compromised, often develop chronic wounds that are slow to heal. Such non-healing wounds could cause serious infections resulting in painful outcomes such as limb amputation. To address this global healthcare challenge, a team of researchers from the National University of Singapore (NUS) engineered an innovative magnetic wound-healing gel that promises to accelerate the healing of diabetic wounds, reduce the rates of recurrence, and in turn, lower the incidents of limb amputations.

Each treatment involves the application of a bandage pre-loaded with a hydrogel containing skin cells for healing and magnetic particles. To maximise therapeutic results, a wireless external magnetic device is used to activate skin cells and accelerate the wound healing process. The ideal duration of magnetic stimulation is about one to two hours.

Lab tests showed the treatment coupled with magnetic stimulation healed diabetic wounds about three times faster than current conventional approaches. Furthermore, while the research has focussed on healing diabetic foot ulcers, the technology has potential for treating a wide range of complex wounds such as burns.

"Conventional dressings do not play an active role in healing wounds," said Assistant Professor Andy Tay, who leads the team comprising researchers from the Department of Biomedical Engineering at NUS College of Design and Engineering as well as the NUS Institute for Health Innovation & Technology. "They merely prevent the wound from worsening and patients need to be scheduled for dressing change every two or three days. It is a huge cost to our healthcare system and an inconvenience to patients."

In contrast, the unique NUS invention takes a comprehensive 'all-in-one' approach to wound healing, accelerating the process on several fronts.

"Our technology addresses multiple critical factors associated with diabetic wounds, simultaneously managing elevated glucose levels in the wound area, activating dormant skin cells near the wound, restoring damaged blood vessels, and repairing the disrupted vascular network within the wound," explained Asst Prof Tay.

The NUS team described their innovation in a paper published in the scientific journal, Advanced Materials, on 8 September 2023. The research was conducted in collaboration with scientists from the Agency for Science, Technology and Research, Nanyang Technological University, Sun Yat-sen University and Wuhan University of Technology.

Physical theory improves protein folding prediction

 Proteins are important molecules that perform a variety of functions essential to life. To function properly, many proteins must fold into specific structures. However, the way proteins fold into specific structures is still largely unknown. Researchers from the University of Tokyo developed a novel physical theory that can accurately predict how proteins fold. Their model can predict things previous models cannot. Improved knowledge of protein folding could offer huge benefits to medical research, as well as to various industrial processes.

You are literally made of proteins. These chainlike molecules, made from tens to thousands of smaller molecules called amino acids, form things like hair, bones, muscles, enzymes for digestion, antibodies to fight diseases, and more. Proteins make these things by folding into various structures that in turn build up these larger tissues and biological components. And by knowing more about this folding process, researchers can better understand more about the processes that constitute life itself. Such knowledge is also essential to medicine, not only for the development of new treatments and industrial processes to produce medicines, but also for knowledge of how certain diseases work, as some are examples of protein folding gone wrong. So, to say proteins are important is putting it mildly. Proteins are the stuff of life.

Encouraged by the importance of protein folding, Project Assistant Professor Koji Ooka from the College of Arts and Sciences and Professor Munehito Arai from the Department of Life Sciences and Department of Physics embarked on the hard task of improving prediction methods for protein folding. The task is formidable for many reasons; in particular, simulating the dynamics of molecules requires a powerful supercomputer. Recently, the artificial intelligence-based program AlphaFold 2 has been able to accurately predict the structure that results from a given amino acid sequence, but it cannot give details of the way proteins fold, making it a black box. This is problematic, as the forms and behaviors of proteins vary such that two similar ones may fold in radically different ways. So, instead of AI, the duo needed a different approach: statistical mechanics, a branch of physical theory.

"For over 20 years, a theory called the Wako-Saitô-Muñoz-Eaton (WSME) model has successfully predicted the folding processes for proteins comprising around 100 amino acids or fewer, based on the native protein structures," said Arai. "WSME can only evaluate small sections of proteins at a time, missing potential connections between sections farther apart. To overcome this issue, we produced a new model, WSME-L, where the L stands for 'linker.' Our linkers correspond to these nonlocal interactions and allow WSME-L to elucidate the folding process without the limitations of protein size and shape, which AlphaFold 2 cannot."

But it doesn't end there. There are other limitations of existing protein folding models that Ooka and Arai set their sights on. Proteins can exist inside or outside of living cells; those within are in some ways protected by the cell, but those outside cells, such as antibodies, require additional bonds during folding, called disulfide bonds, which help to stabilize them. Conventional models cannot factor in these bonds, but an extension of WSME-L called WSME-L(SS), where SS stands for the disulfide bond, can. To further complicate things, some proteins have disulfide bonds before folding starts, so the researchers made a further enhancement, WSME-L(SSintact), which factors in that situation at the expense of extra computation time.

"Our theory allows us to draw a kind of map of protein folding pathways in a relatively short time; mere seconds on a desktop computer for short proteins, and about an hour on a supercomputer for large proteins, assuming the native protein structure is available by experiments or AlphaFold 2 prediction," said Arai. "The resulting landscape allows a comprehensive understanding of multiple potential folding pathways a long protein might take. And crucially, we can scrutinize structures of transient states. This might be helpful for those researching diseases like Alzheimer's and Parkinson's -- both are caused by proteins which fail to fold correctly. Also, our method may be useful for designing novel proteins and enzymes which can efficiently fold into stable functional structures, for medical and industrial use."

While the models produced here accurately reflect experimental observations, Ooka and Arai hope they can be used to elucidate the folding processes of many proteins that have not yet been studied experimentally. Humans have about 20,000 different proteins, but only around 100 have had their folding processes thoroughly studied.

Keeping a human in the loop: Managing the ethics of AI in medicine

 Artificial intelligence (AI) -- of ChatGPT fame -- is increasingly used in medicine to improve diagnosis and treatment of diseases, and to avoid unnecessary screening for patients. But AI medical devices could also harm patients and worsen health inequities if they are not designed, tested, and used with care, according to an international task force that included a University of Rochester Medical Center bioethicist.

Jonathan Herington, PhD, was a member of the AI Task Force of the Society for Nuclear Medicine and Medical Imaging, which laid out recommendations on how to ethically develop and use AI medical devices in two papers published in the Journal of Nuclear Medicine. In short, the task force called for increased transparency about the accuracy and limits of AI and outlined ways to ensure all people have access to AI medical devices that work for them -- regardless of their race, ethnicity, gender, or wealth.

While the burden of proper design and testing falls to AI developers, health care providers are ultimately responsible for properly using AI and shouldn't rely too heavily on AI predictions when making patient care decisions.

"There should always be a human in the loop," said Herington, who is assistant professor of Health Humanities and Bioethics at URMC and was one of three bioethicists added to the task force in 2021. "Clinicians should use AI as an input into their own decision making, rather than replacing their decision making."

This requires that doctors truly understand how a given AI medical device is intended to be used, how well it performs at that task, and any limitations -- and they must pass that knowledge on to their patients. Doctors must weigh the relative risks of false positives versus false negatives for a given situation, all while taking structural inequities into account.

When using an AI system to identify probable tumors in PET scans, for example, health care providers must know how well the system performs at identifying this specific type of tumor in patients of the same sex, race, ethnicity, etc., as the patient in question.

"What that means for the developers of these systems is that they need to be very transparent," said Herington.

According to the task force, it's up to the AI developers to make accurate information about their medical device's intended use, clinical performance, and limitations readily available to users. One way they recommend doing that is to build alerts right into the device or system that informs users about the degree of uncertainty of the AI's predictions. That might look like heat maps on cancer scans that show whether areas are more or less likely to be cancerous.

To minimize that uncertainty, developers must carefully define the data they use to train and test their AI models, and should use clinically relevant criteria to evaluate the model's performance. It's not enough to simply validate algorithms used by a device or system. AI medical devices should be tested in so-called "silent trials," meaning their performance would be evaluated by researchers on real patients in real time, but their predictions would not be available to the health care provider or applied to clinical decision making.

Developers should also design AI models to be useful and accurate in all contexts in which they will be deployed.

"A concern is that these high-tech, expensive systems would be deployed in really high-resource hospitals, and improve outcomes for relatively well-advantaged patients, while patients in under-resourced or rural hospitals wouldn't have access to them -- or would have access to systems that make their care worse because they weren't designed for them," said Herington.

Currently, AI medical devices are being trained on datasets in which Latino and Black patients are underrepresented, meaning the devices are less likely to make accurate predictions for patients from these groups. In order to avoid deepening health inequities, developers must ensure their AI models are calibrated for all racial and gender groups by training them with datasets that represent all of the populations the medical device or system will ultimately serve.
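A minimal sketch of the kind of subgroup check the task force calls for: evaluate a model's performance separately for each demographic group, rather than reporting a single pooled number that can mask gaps. The data and group labels below are synthetic, for illustration only.

```python
# Per-group performance check, as recommended above: a pooled accuracy
# figure can hide a large gap between demographic groups.
# All records below are synthetic.

from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, y_true, y_pred in records:
        total[group] += 1
        correct[group] += int(y_true == y_pred)
    return {g: correct[g] / total[g] for g in total}

# Synthetic predictions: pooled accuracy is 75%, but it splits unevenly.
data = (
    [("group_a", 1, 1)] * 9 + [("group_a", 1, 0)] * 1 +   # 90% for group A
    [("group_b", 1, 1)] * 6 + [("group_b", 1, 0)] * 4     # 60% for group B
)
print(accuracy_by_group(data))  # {'group_a': 0.9, 'group_b': 0.6}
```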

Though these recommendations were developed with a focus on nuclear medicine and medical imaging, Herington believes they can and should be applied to AI medical devices broadly.

"The systems are becoming ever more powerful all the time and the landscape is shifting really quickly," said Herington. "We have a rapidly closing window to solidify our ethical and regulatory framework around these things."

Tuesday, 17 October 2023

Link between seasons and eating habits

 You might imagine that you're healthier in the summer. The sun is shining, we get plenty of vitamin D, and the days are long.

However, recent research from the University of Copenhagen suggests that eating habits in winter may be better for our metabolic health than eating habits in summer, at least if you're a mouse. Researchers have examined the metabolism and weight of mice exposed to both 'winter light' and 'summer light'.

"We found that even in non-seasonal animals, differences in light hours between summer and winter do cause differences in energy metabolism. In this case, body weight, fat mass and liver fat content," says Lewin Small, who carried out the research while a postdoc at Novo Nordisk Foundation Center for Basic Metabolic Research at the University of Copenhagen. He adds:

"We found this mostly in mice exposed to winter light hours. These mice had less body weight gain and adiposity. They have more rhythmicity in the way they eat over a 24-hour period. And this then led to benefits in metabolic health."

The study is the first of its kind to examine the influence of light hours on metabolism in mice, which are not considered seasonal animals because, like humans, they do not breed only in specific seasons. Animals that breed in specific seasons gain weight before the breeding season to build up energy reserves.

Light hours affect the metabolism

The researchers' inspiration for initiating the study stemmed from the significant variation in daylight hours across different regions of the world.


"We study the influence of the time-of-day on aspects of metabolism such as exercise, obesity and diabetes. However, most studies that investigate this link do so assuming an equal length of day and night all year round," says Lewin Small.

Therefore, they wanted to find out what these seasonal light differences mean for metabolism. Most people in the world live with at least a two-hour difference in daylight between summer and winter.

"I come from Australia, and when I first moved to Denmark, I was not used to the huge difference in light between summer and winter and I was interested in how this might affect both circadian rhythms and metabolism," says Lewin Small and adds:

"Therefore, we exposed laboratory mice to different light hours representing different seasons and measured markers of metabolic health and the circadian rhythms of these animals."

Because the research was conducted in mice, the findings cannot be assumed to apply directly to humans.

"This is a proof of principle. Do differences in light hours affect energy metabolism? Yes, it does. Further studies in humans may find that altering our exposure to artificial light at night or natural light exposure over the year could be used to improve our metabolic health," says Juleen Zierath, Professor at the Novo Nordisk Center for Basic Metabolism Research (CBMR) and senior author of the study.

Lewin Small adds that the new knowledge is important for understanding how eating patterns are affected by light and the seasons, which might help explain why some people gain more weight than others, or whether people gain more weight at specific times of year.

"Differences in light between summer and winter could affect our hunger pathways and when we get hungry during the day," he says.

New 3D-printed tumor model enables faster, less expensive and less painful cancer treatment

 An international team of interdisciplinary researchers has successfully created a method for better 3D modelling of complex cancers.

The University of Waterloo-based team combined cutting-edge bioprinting techniques with synthetic structures or microfluidic chips. The method will help lab researchers more accurately understand heterogeneous tumours: tumours with more than one kind of cancer cell, often dispersed in unpredictable patterns.

Traditionally, medical practitioners would biopsy a patient's tumour, extract cells, and then grow them in flat petri dishes in a lab. "For fifty years, this was how biologists understood tumours," said Nafiseh Moghimi, an applied mathematics post-doctoral researcher and the lead author of the study. "But a decade ago, repeated treatment failures in human trials made scientists realize that a 2D model does not capture the real tumour structure inside the body."

The team's research addresses this problem by creating a 3D model that not only reflects the complexity of a tumour but also simulates its surrounding environment.

The research, which took place in the Mathematical Medicine Lab under the supervision of applied mathematics professor Mohammad Kohandel, united advancements from several disciplines. "We are creating something that is very, very new in Canada. Maybe just a couple of labs are doing something even close to this research," Moghimi said.

First, the team created polymer "microfluidic chips": tiny structures etched with channels that mimic blood flow and other fluids surrounding a patient's tumour.

Next, the team grew multiple types of cancer cells and suspended these cell cultures in their own customized bioink: a cocktail of gelatine, alginate, and other nutrients designed to keep the cells cultures alive.

Finally, they used an extrusion bioprinter -- a device that resembles a 3D printer but for organic material -- to layer the different types of cancer cells onto the prepared microfluidic chips.

The result is a living, three-dimensional model of complex cancers that scientists can then use to test different modes of treatment, such as various chemotherapy drugs.

Moghimi and her team are particularly interested in creating complex models of breast cancer. After skin cancer, breast cancer is the most common cancer diagnosed in women.

Breast cancer is especially challenging to treat because it appears as complex tumours containing multiple types of cells when it metastasizes. Relying on the cells from one or two biopsies to accurately represent an entire tumour can lead to ineffective treatment plans and poor outcomes.

The 3D-printed tumour models exemplify how new technology enables faster, less expensive and less painful treatments for serious conditions like late-stage breast cancer.

Fungal infection in the brain produces changes like those seen in Alzheimer's disease

Previous research has implicated fungi in chronic neurodegenerative conditions such as Alzheimer's disease, but there is limited understanding of how these common microbes could be involved in the development of these conditions.

Working with animal models, researchers at Baylor College of Medicine and collaborating institutions discovered how the fungus Candida albicans enters the brain, activates two separate mechanisms in brain cells that promote its clearance, and, importantly for the understanding of Alzheimer's disease development, generates amyloid beta (Aβ)-like peptides, toxic protein fragments of the amyloid precursor protein that are considered central to the development of Alzheimer's disease. The study appears in the journal Cell Reports.

"Our lab has years of experience studying fungi, so we embarked on the study of the connection between C. albicans and Alzheimer's disease in animal models," said corresponding author Dr. David Corry, Fulbright Endowed Chair in Pathology and professor of pathology and immunology and medicine at Baylor. He also is a member of Baylor's Dan L Duncan Comprehensive Cancer Center. "In 2019, we reported that C. albicans does get into the brain where it produces changes that are very similar to what is seen in Alzheimer's disease. The current study extends that work to understand the molecular mechanisms."

"Our first question was, how does C. albicans enter the brain? We found that C. albicans produces enzymes called secreted aspartic proteases (Saps) that break down the blood-brain barrier, giving the fungus access to the brain where it causes damage," said first author Dr. Yifan Wu, a postdoctoral scientist in pediatrics working in the Corry lab.

Next, the researchers asked, how is the fungus effectively cleared from the brain? Corry and his colleagues had previously shown that a C. albicans brain infection is fully resolved in otherwise healthy mice after 10 days. In this study, they reported that this occurred thanks to two mechanisms triggered by the fungus in brain cells called microglia.

"The same Saps that the fungus uses to break down the blood-brain barrier also break down the amyloid precursor protein into Aβ-like peptides," Wu said. "These peptides activate microglial brain cells via a cell surface receptor called Toll-like receptor 4, which keeps the fungal load in the brain low but does not clear the infection."

C. albicans also produces a protein called candidalysin that binds to microglia via a different receptor, CD11b. "Candidalysin-mediated activation of microglia is essential for clearance of Candida in the brain," Wu said. "If we take away this pathway, fungi are no longer effectively cleared in the brain."

"This work potentially contributes an important new piece of the puzzle regarding the development of Alzheimer's disease," Corry said. "The current explanation for this condition is that it is mostly the result of the accumulation of toxic Aβ-like peptides in the brain that leads to neurodegeneration. The dominant thinking is that these peptides are produced endogenously: our own brain proteases break down the amyloid precursor protein, generating the toxic Aβ peptides."

Here, the researchers show that Aβ-like peptides also can be generated from a different source -- C. albicans. This common fungus, which has been detected in the brains of people with Alzheimer's disease and other chronic neurodegenerative disorders, has its own set of proteases that can generate the same Aβ-like peptides the brain generates endogenously.

"We propose that the brain Aβ-peptide aggregates that characterize multiple Candida-associated neurodegenerative conditions, including Alzheimer's disease, Parkinson's disease and others, may be generated both intrinsically by the brain and by C. albicans," Corry said. "These findings in animal models support conducting further studies to evaluate the role of C. albicans in the development of Alzheimer's disease in people, which could potentially lead to innovative therapeutic strategies."

Peering inside cells to see how they respond to stress

Imagine the life of a yeast cell, floating around the kitchen in a spore that eventually lands on a bowl of grapes. Life is good: food for days, at least until someone notices the rotting fruit and throws it out. But then the sun shines through a window, the section of the counter where the bowl is sitting heats up, and suddenly life gets uncomfortable for the humble yeast. When temperatures get too high, the cells shut down their normal processes to ride out the stressful conditions and live to feast on grapes another, cooler day.

This "heat shock response" of cells is a classic model of biological adaptation, part of the fundamental processes of life -- conserved in creatures from single-celled yeast to humans -- that allow our cells to adjust to changing conditions in their environment. For years, scientists have focused on how different genes respond to heat stress to understand this survival technique. Now, thanks to the innovative use of advanced imaging techniques, researchers at the University of Chicago are getting an unprecedented look at the inner machinery of cells to see how they respond to heat stress.

"Adaptation is a hidden superpower of the cells," said Asif Ali, PhD, a postdoctoral researcher at UChicago who specializes in capturing images of cellular processes. "They don't have to use this superpower all the time, but once they're stuck in a harsh condition, suddenly, there's no way out. So, they employ this as a survival strategy."

Ali works in the lab of David Pincus, PhD, Assistant Professor of Molecular Genetics and Cell Biology at UChicago, where the team studies how cells adapt to stressful and complex environments, including through the heat shock response. In the new study, published October 16, 2023, in Nature Cell Biology, they combined several new imaging techniques to show that in response to heat shock, cells employ a protective mechanism for their orphaned ribosomal proteins -- critical proteins for growth that are highly vulnerable to aggregation when normal cell processing shuts down -- by preserving them within liquid-like condensates.

Once the heat shock subsides, these condensates get dispersed with the help of molecular chaperone proteins, facilitating integration of the orphaned proteins into functional mature ribosomes that can start churning out proteins again. This rapid restart of ribosome production allows the cell to pick back up where it left off without wasting energy. The study also shows that cells unable to maintain the liquid state of these condensates don't recover as quickly, falling behind by ten generations while they try to reproduce the lost proteins.

"Asif developed an entirely new cell biological technique that lets us visualize orphaned ribosomal proteins in cells in real time, for the first time," Pincus said. "Like many innovations, it took a technological breakthrough to enable us to see a whole new biology that was invisible to us before but has always been going on in cells that we've been studying for years."

Loosely affiliated biomolecular goo

Ribosomes are crucial machines inside the cytoplasm of all cells that read the genetic instructions on messenger RNA and build chains of amino acids that fold into proteins. Producing ribosomes to perform this process is energy intensive, so under conditions of stress like heat shock, it's one of the first things a cell shuts down to conserve energy. At any given time though, 50% of newly synthesized proteins inside a cell are ribosomal proteins that haven't been completely translated yet. Up to a million ribosomal proteins are produced per minute in a cell, so if ribosome production shuts down, these millions of proteins could be left floating around unattended, prone to clumping together or folding improperly, which can cause problems down the line.

Instead of focusing on how genes behave during heat shock, Ali and Pincus wanted to look inside the machinery of cells to see what happens to these "orphaned" ribosomal proteins. For this, Ali turned to a new microscopy tool called lattice light sheet 4D imaging that uses multiple sheets of laser light to create fully dimensional images of components inside living cells.

Since he wanted to focus on what was happening to just the orphaned proteins during heat shock, Ali also used a classic technique called "pulse labeling" with a modern twist: a special dye called a "HaloTag" to flag the newly synthesized orphan proteins. Often when scientists want to track the activity of a protein inside a cell, they use a green fluorescent protein (GFP) tag that glows bright green under a microscope. But because there are so many mature ribosomal proteins in a cell, using GFP would simply light up the whole cell. Instead, pulse labeling with the HaloTag dye allows researchers to light up just the newly created ribosomal proteins and leave the mature ones dark.

Using these combined imaging tools, the researchers saw that the orphaned proteins were collected into liquid-like droplets of material near the nucleolus (Pincus used the scientific term "loosely affiliated biomolecular goo"). These blobs were accompanied by molecular chaperones, proteins that usually assist the ribosomal production process by helping fold new proteins. In this case, the chaperones seemed to be "stirring" the collected proteins, keeping them in a liquid state and preventing them from clumping together.

This finding is intriguing, Pincus said, because many human diseases, including cancer and neurodegenerative disorders, are linked to misfolded or aggregated clumps of proteins. Once proteins get tangled together, they tend to stay that way, so this "stirring" mechanism appears to be yet another adaptation.

"I think a very plausible general definition for cellular health and disease is that if things are liquid and moving around, you are in a healthy state; once things start to clog up and form these aggregates, that's pathology," Pincus said. "We really think we're uncovering the fundamental mechanisms that might be clinically relevant, or at least at the mechanistic heart of so many human diseases."

Finding structure at an atomic scale

In the future, Ali hopes to employ another imaging technique called cryo-electron tomography, in which cell samples are frozen and imaged with an electron microscope to capture their interior components at atomic resolution. Another advantage of this technique is that it allows researchers to capture 3D images inside the cell itself, rather than separating and preparing proteins for imaging.

Novel C. diff structures are required for infection, offer new therapeutic targets

  Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...