Friday 10 December 2021

Securing data transfers with relativity

Data encryption concept photo illustration

The volume of data transferred is constantly increasing, but the absolute security of these exchanges cannot be guaranteed, as shown by cases of hacking frequently reported in the news. To counter hacking, a team from the University of Geneva (UNIGE), Switzerland, has developed a new system based on the concept of "zero-knowledge proofs," the security of which is based on the physical principle of relativity: information cannot travel faster than the speed of light. Thus, one of the fundamental principles of modern physics allows for secure data transfer. This system allows users to identify themselves in complete confidentiality without disclosing any personal information, promising applications in the field of cryptocurrencies and blockchain. These results can be read in the journal Nature.

When a person -- the so-called 'prover' -- wants to confirm their identity, for example when they want to withdraw money from an ATM, they must provide their personal data to the verifier, in our example the bank, which processes this information (e.g. the identification number and the PIN code). As long as only the prover and the verifier know this data, confidentiality is guaranteed. If others get hold of this information, for example by hacking into the bank's server, security is compromised.

Zero-knowledge proof as a solution

To counter this problem, the prover should ideally be able to confirm their identity without revealing any information at all about their personal data. But is this even possible? Surprisingly, the answer is yes, via the concept of a zero-knowledge proof. "Imagine I want to prove a mathematical theorem to a colleague. If I show them the steps of the proof, they will be convinced, but then have access to all the information and could easily reproduce the proof," explains Nicolas Brunner, a professor in the Department of Applied Physics at the UNIGE Faculty of Science. "On the contrary, with a zero-knowledge proof, I will be able to convince them that I know the proof, without giving away any information about it, thus preventing any possible data recovery."

The principle of zero-knowledge proof, invented in the mid-1980s, has been put into practice in recent years, notably for cryptocurrencies. However, these implementations suffer from a weakness, as they are based on a mathematical assumption (that a specific encoding function is difficult to decode). If this assumption is disproved -- which cannot be ruled out today -- security is compromised because the data would become accessible. Today, the Geneva team is demonstrating a radically different system in practice: a relativistic zero-knowledge proof. Security is based here on a physics concept, the principle of relativity, rather than on a mathematical hypothesis. The principle of relativity -- that information does not travel faster than light -- is a pillar of modern physics, unlikely to be ever challenged. The Geneva researchers' protocol therefore offers perfect security and is guaranteed over the long term.

Dual verification based on a three-colorability problem

Implementing a relativistic zero-knowledge proof involves two distant verifier/prover pairs and a challenging mathematical problem. "We use a three-colorability problem. This type of problem consists of a graph made up of a set of nodes, some of which are connected by links," explains Hugo Zbinden, professor in the Department of Applied Physics at the UNIGE. Each node is given one out of three possible colours -- green, blue or red -- and two nodes that are linked together must be of different colours. These three-colouring problems, here featuring 5,000 nodes and 10,000 links, are in practice impossible to solve, as every possibility would have to be tried. So why do we need two verifier/prover pairs?
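The asymmetry at the heart of the scheme is that *checking* a candidate three-colouring is trivial, while *finding* one for a large graph is computationally hard. A minimal sketch (an illustration, not code from the study; the graph and colours here are made up):

```python
# Toy illustration: verifying a candidate three-colouring is easy,
# even though finding one for a large graph is computationally hard.

def is_proper_3_colouring(edges, colouring):
    """Return True if every pair of linked nodes has different colours."""
    return all(colouring[u] != colouring[v] for u, v in edges)

# A tiny 5-node example graph (the study's graph had ~5,000 nodes
# and ~10,000 links).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
colouring = {0: "green", 1: "blue", 2: "red", 3: "green", 4: "blue"}

print(is_proper_3_colouring(edges, colouring))  # True: no linked pair shares a colour
```

For 5,000 nodes, brute-force search would have to sift through up to 3^5000 assignments, which is why knowledge of a valid colouring can serve as a secret.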

"To confirm their identity, the provers will no longer have to provide a code, but demonstrate to the verifier that they know a way to three-colour a certain graph," continues Nicolas Brunner. To check this, the verifiers randomly choose a large number of pairs of nodes on the graph connected by a link, then ask their respective prover what colour each node is. Since this verification is done almost simultaneously, the provers cannot communicate with each other during the test, and therefore cannot cheat. Thus, if the two colours announced are always different, the verifiers are convinced of the identity of the provers, because they actually know a three-colouring of this graph. "It's like when the police interrogate two criminals at the same time in separate offices: it's a matter of checking that their answers match, without allowing them to communicate with each other," says Hugo Zbinden. Here, the questions are almost simultaneous, so for the provers to coordinate their answers, information would have to travel between them faster than light, which is impossible. Finally, to prevent the verifiers from reconstructing the graph's colouring, the two provers constantly change the colour code in a correlated manner: what was green becomes blue, blue becomes red, and so on. "In this way, the proof is made and verified, without revealing any information about it," says the Geneva-based physicist.
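A single challenge round of this protocol can be sketched as follows. This is a schematic simulation, not the researchers' implementation: both provers share the secret colouring and a fresh random relabelling of the colours each round, so the verifiers learn only that the two endpoint colours differ.

```python
import random

COLOURS = ["green", "blue", "red"]

def run_round(edges, secret_colouring, rng):
    """One challenge round: both provers apply the same fresh random
    permutation of the colour names, so the verifiers learn only that
    the two endpoint colours differ -- never the colouring itself."""
    perm = dict(zip(COLOURS, rng.sample(COLOURS, 3)))  # shared relabelling
    u, v = rng.choice(edges)                           # challenged link
    answer_1 = perm[secret_colouring[u]]               # prover 1's reply
    answer_2 = perm[secret_colouring[v]]               # prover 2's reply
    return answer_1 != answer_2                        # verifiers' check

edges = [(0, 1), (1, 2), (2, 0)]  # a triangle needs all three colours
secret = {0: "green", 1: "blue", 2: "red"}
rng = random.Random(42)

# Honest provers pass every round; provers without a valid colouring
# would be caught on some fraction of rounds, so many rounds make
# cheating overwhelmingly unlikely.
print(all(run_round(edges, secret, rng) for _ in range(1000)))  # True
```

In the real protocol the near-simultaneity of the two verifiers' questions, guaranteed by the finite speed of light, is what prevents the provers from coordinating their answers on the fly.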

 

Keep Walking

The calf muscle in your legs is your second heart.

Everyone knows that the heart pumps blood, right? But did you know that your body has a second blood pump? It’s your calf muscles! That’s right, the calf muscles in your legs are your second heart! 


The human body is engineered such that when you walk, the calf muscles pump venous blood back toward your heart.


The veins in your calf act like a reservoir for blood your body does not need in circulation at any given time. These reservoir veins are called muscle venous sinuses. When the calf muscle contracts, blood is squeezed out of the veins and pushed up along the venous system. These veins have one-way valves which keep the blood flowing in the correct direction toward the heart, and also prevent gravity from pulling blood back down your legs.


Walking or running enables your foot to play a major role in the pumping mechanism of the calves. The foot itself also has its own (smaller) venous reservoir. During the first motion of taking a step, as you put weight on your foot, the foot venous reservoir blood is squeezed out and 'primes' the calf reservoir. Then, in the later stages of a step, the calf muscle contracts and pumps the blood up the leg, against gravity. The valves keep the blood flowing in the right direction and prevent gravity from pulling the blood right back down.


Thus, when you are immobile for long periods of time (sitting in an airplane, car seat, or chair for hours) your calf muscle is not contracting much and the blood stagnates.


That’s why walking or running is so good for overall blood circulation. It prevents blood pooling and helps prevent potentially dangerous blood clots called deep vein thrombosis (DVT).


Another condition, called venous insufficiency or venous reflux, can cause blood to pool in your legs due to the failure of the valves to work properly. In this condition, the valves fail to prevent the backflow of blood down your legs. Symptoms of venous insufficiency can include heavy, tired, throbbing, painful legs, ankle swelling, bulging varicose veins, cramps, itching, restless legs, skin discolouration and even skin ulceration. Venous insufficiency is a very common disorder, affecting over 40 million people in the U.S.


In cases when a person is even more immobile, such as lying in a hospital bed, the pooled blood can become stagnant and develop into a blood clot. This is called a deep vein thrombosis (DVT). DVT can cause leg pain and swelling and is dangerous because a blood clot can break off, travel in your blood stream and get lodged in your lungs.


- By Louis Prevosti, MD

Wednesday 8 December 2021

Researchers boost human mental function with brain stimulation

Brain illustration


 In a pilot human study, researchers from the University of Minnesota Medical School and Massachusetts General Hospital show it is possible to improve specific human brain functions related to self-control and mental flexibility by merging artificial intelligence with targeted electrical brain stimulation.

Alik Widge, MD, PhD, an assistant professor of psychiatry and member of the Medical Discovery Team on Addiction at the U of M Medical School, is the senior author of the research published in Nature Biomedical Engineering. The findings come from a human study conducted at Massachusetts General Hospital in Boston among 12 patients undergoing brain surgery for epilepsy -- a procedure that places hundreds of tiny electrodes throughout the brain to record its activity and identify where seizures originate.

In this study, Widge collaborated with Massachusetts General Hospital's Sydney Cash, MD, PhD, an expert in epilepsy research; and Darin Dougherty, MD, an expert in clinical brain stimulation. Together, they identified a brain region -- the internal capsule -- that improved patients' mental function when stimulated with small amounts of electrical energy. That part of the brain is responsible for cognitive control -- the process of shifting from one thought pattern or behavior to another, which is impaired in most mental illnesses.

"An example might include a person with depression who just can't get out of a 'stuck' negative thought. Because it is so central to mental illness, finding a way to improve it could be a powerful new way to treat those illnesses," Widge said.

The team developed algorithms so that, after stimulation, they could track patients' cognitive control abilities, both from their actions and directly from their brain activity. The controller method provided boosts of stimulation whenever the patients were doing worse on a laboratory test of cognitive control.

"This system can read brain activity, 'decode' from that when a patient is having difficulty, and apply a small burst of electrical stimulation to the brain to boost them past that difficulty," Widge said. "The analogy I often use is an electric bike. When someone's pedaling but having difficulty, the bike senses it and augments it. We've made the equivalent of that for human mental function."

The study is the first to show that:

  • A specific human mental function linked to mental illness can be reliably enhanced using precisely targeted electrical stimulation;
  • There are specific sub-parts of the internal capsule brain structure that are particularly effective for cognitive enhancement; and
  • A closed-loop algorithm used as a controller was twice as effective as stimulating at random times.

Some of the patients had significant anxiety in addition to their epilepsy. When given the cognitive-enhancing stimulation, they reported that their anxiety got better, because they were more able to shift their thoughts away from their distress and focus on what they wanted. Widge says that this suggests this method could be used to treat patients with severe and medication-resistant anxiety, depression or other disorders.

"This could be a totally new approach in treating mental illness. Instead of trying to suppress symptoms, we could give patients a tool that lets them take control of their own minds," Widge said. "We could put them back in the driver's seat and let them feel a new sense of agency."

The research team is now preparing for clinical trials. Because the target for improving cognitive control is already approved by the Food and Drug Administration for deep brain stimulation, Widge says this research can be done with existing tools and devices -- once a trial is formally approved -- and the translation of this care to current medical practice could be rapid.

‘Dancing molecules’ successfully repair severe spinal cord injuries

Spine illustration in human body


 Northwestern University researchers have developed a new injectable therapy that harnesses "dancing molecules" to reverse paralysis and repair tissue after severe spinal cord injuries.

In a new study, researchers administered a single injection to tissues surrounding the spinal cords of paralyzed mice. Just four weeks later, the animals regained the ability to walk.

The research will be published in the Nov. 12 issue of the journal Science.

By sending bioactive signals to trigger cells to repair and regenerate, the breakthrough therapy dramatically improved severely injured spinal cords in five key ways: (1) The severed extensions of neurons, called axons, regenerated; (2) scar tissue, which can create a physical barrier to regeneration and repair, significantly diminished; (3) myelin, the insulating layer of axons that is important in transmitting electrical signals efficiently, reformed around cells; (4) functional blood vessels formed to deliver nutrients to cells at the injury site; and (5) more motor neurons survived.

After the therapy performs its function, the materials biodegrade into nutrients for the cells within 12 weeks and then completely disappear from the body without noticeable side effects. This is the first study in which researchers controlled the collective motion of molecules through changes in chemical structure to increase a therapeutic's efficacy.

"Our research aims to find a therapy that can prevent individuals from becoming paralyzed after major trauma or disease," said Northwestern's Samuel I. Stupp, who led the study. "For decades, this has remained a major challenge for scientists because our body's central nervous system, which includes the brain and spinal cord, does not have any significant capacity to repair itself after injury or after the onset of a degenerative disease. We are going straight to the FDA to start the process of getting this new therapy approved for use in human patients, who currently have very few treatment options."

Stupp is Board of Trustees Professor of Materials Science and Engineering, Chemistry, Medicine and Biomedical Engineering at Northwestern, where he is founding director of the Simpson Querrey Institute for BioNanotechnology (SQI) and its affiliated research center, the Center for Regenerative Nanomedicine. He has appointments in the McCormick School of Engineering, Weinberg College of Arts and Sciences and Feinberg School of Medicine.

Life expectancy has not improved since the 1980s

According to the National Spinal Cord Injury Statistical Center, nearly 300,000 people are currently living with a spinal cord injury in the United States. Life for these patients can be extraordinarily difficult. Less than 3% of people with complete injury ever recover basic physical functions. And approximately 30% are re-hospitalized at least once during any given year after the initial injury, costing millions of dollars in average lifetime health care costs per patient. Life expectancy for people with spinal cord injuries is significantly lower than that of people without spinal cord injuries and has not improved since the 1980s.

"Currently, there are no therapeutics that trigger spinal cord regeneration," said Stupp, an expert in regenerative medicine. "I wanted to make a difference on the outcomes of spinal cord injury and to tackle this problem, given the tremendous impact it could have on the lives of patients. Also, new science to address spinal cord injury could have impact on strategies for neurodegenerative diseases and stroke."

'Dancing molecules' hit moving targets

The secret behind Stupp's new breakthrough therapeutic is tuning the motion of molecules, so they can find and properly engage constantly moving cellular receptors. Injected as a liquid, the therapy immediately gels into a complex network of nanofibers that mimic the extracellular matrix of the spinal cord. By matching the matrix's structure, mimicking the motion of biological molecules and incorporating signals for receptors, the synthetic materials are able to communicate with cells.

"Receptors in neurons and other cells constantly move around," Stupp said. "The key innovation in our research, which has never been done before, is to control the collective motion of more than 100,000 molecules within our nanofibers. By making the molecules move, 'dance' or even leap temporarily out of these structures, known as supramolecular polymers, they are able to connect more effectively with receptors."

Stupp and his team found that fine-tuning the molecules' motion within the nanofiber network to make them more agile resulted in greater therapeutic efficacy in paralyzed mice. They also confirmed that formulations of their therapy with enhanced molecular motion performed better during in vitro tests with human cells, indicating increased bioactivity and cellular signaling.

"Given that cells themselves and their receptors are in constant motion, you can imagine that molecules moving more rapidly would encounter these receptors more often," Stupp said. "If the molecules are sluggish and not as 'social,' they may never come into contact with the cells."

One injection, two signals

Once connected to the receptors, the moving molecules trigger two cascading signals, both of which are critical to spinal cord repair. One signal prompts the long tails of neurons in the spinal cord, called axons, to regenerate. Similar to electrical cables, axons send signals between the brain and the rest of the body. Severing or damaging axons can result in the loss of feeling in the body or even paralysis. Repairing axons, on the other hand, increases communication between the body and brain.

The second signal helps neurons survive after injury because it causes other cell types to proliferate, promoting the regrowth of lost blood vessels that feed neurons and critical cells for tissue repair. The therapy also induces myelin to rebuild around axons and reduces glial scarring, which acts as a physical barrier that prevents the spinal cord from healing.

"The signals used in the study mimic the natural proteins that are needed to induce the desired biological responses. However, proteins have extremely short half-lives and are expensive to produce," said Zaida Álvarez, the study's first author and former research assistant professor in Stupp's laboratory. "Our synthetic signals are short, modified peptides that -- when bonded together by the thousands -- will survive for weeks to deliver bioactivity. The end result is a therapy that is less expensive to produce and lasts much longer."

Universal application

While the new therapy could be used to prevent paralysis after major trauma (automobile accidents, falls, sports accidents and gunshot wounds) as well as from diseases, Stupp believes the underlying discovery -- that "supramolecular motion" is a key factor in bioactivity -- can be applied to other therapies and targets.

"The central nervous system tissues we have successfully regenerated in the injured spinal cord are similar to those in the brain affected by stroke and neurodegenerative diseases, such as ALS, Parkinson's disease and Alzheimer's disease," Stupp said. "Beyond that, our fundamental discovery about controlling the motion of molecular assemblies to enhance cell signaling could be applied universally across biomedical targets."

Other Northwestern study authors include Evangelos Kiskinis, assistant professor of neurology and neuroscience in Feinberg; research technician Feng Chen; postdoctoral researchers Ivan Sasselli, Alberto Ortega and Zois Syrgiannis; and graduate students Alexandra Kolberg-Edelbrock, Ruomeng Qiu and Stacey Chin. Peter Mirau of the Air Force Research Laboratories and Steven Weigand of Argonne National Laboratory also are co-authors.


Saturday 18 September 2021

The first cells might have used temperature to divide

Cell division illustration

 A simple mechanism could underlie the growth and self-replication of protocells -- putative ancestors of modern living cells -- suggests a study publishing September 3 in Biophysical Journal. Protocells are vesicles bounded by a membrane bilayer and are potentially similar to the first unicellular common ancestor (FUCA). On the basis of relatively simple mathematical principles, the proposed model suggests that the main force driving protocell growth and reproduction is the temperature difference that occurs between the inside and outside of the cylindrical protocell as a result of inner chemical activity.

"The initial motivation of our study was to identify the main forces driving cell division," says study author Romain Attal of Universcience. "This is important because cancer is characterized by uncontrolled cell division. It is also important for understanding the origin of life."

The splitting of a cell to form two daughter cells requires the synchronization of numerous biochemical and mechanical processes involving cytoskeletal structures inside the cell. But in the history of life, such complex structures are a high-tech luxury and must have appeared much later than the ability to split. Protocells must have used a simple splitting mechanism to ensure their reproduction, before the appearance of genes, RNA, enzymes, and all the complex organelles present today, even in the most rudimentary forms of autonomous life.

In the new study, Attal proposed a model based on the idea that the early forms of life were simple vesicles containing a particular network of chemical reactions -- a precursor of modern cellular metabolism. The main hypothesis is that molecules composing the membrane bilayer are synthesized inside the protocell through globally exothermic, or energy-releasing, chemical reactions.

The slow increase of the inner temperature forces the hottest molecules to move from the inner leaflet to the outer leaflet of the bilayer. This asymmetric movement makes the outer leaflet grow faster than the inner leaflet. This differential growth increases the mean curvature and amplifies any local shrinking of the protocell until it splits in two. The cut occurs near the hottest zone, around the middle.
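The feedback loop described above can be caricatured numerically. This is a toy illustration of the qualitative mechanism only, not the paper's mathematical model: all parameters and units are invented, and the real model works with curvature and membrane mechanics rather than bare areas.

```python
# Toy numerical caricature of the proposed mechanism (illustrative only):
# exothermic inner chemistry slowly warms the interior, the hottest lipids
# flip from the inner to the outer leaflet, and the outer leaflet therefore
# grows faster. All parameter values here are made up.

inner_area, outer_area = 1.0, 1.0   # leaflet areas, arbitrary units
heat_rate, flip_rate, dt = 0.02, 0.5, 0.1
temp_gap = 0.0                      # inside-minus-outside temperature

for _ in range(100):
    temp_gap += heat_rate * dt          # interior slowly warms up
    flux = flip_rate * temp_gap * dt    # hot lipids flip inner -> outer
    inner_area -= flux
    outer_area += flux

# A growing outer/inner asymmetry raises the mean curvature, amplifying
# any local constriction until the vesicle splits near its hottest zone.
print(outer_area > inner_area)  # True
```

The essential point the sketch captures is that a purely thermal asymmetry, with no cytoskeleton, is enough to drive differential leaflet growth.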

"The scenario described can be viewed as the ancestor of mitosis," Attal says. "Having no biological archives as old as 4 billion years, we don't know exactly what FUCA contained, but it was probably a vesicle bounded by a lipid bilayer encapsulating some exothermic chemical reactions."

Although purely theoretical, the model could be tested experimentally. For example, one could use fluorescent molecules to measure temperature variations inside eukaryotic cells, in which mitochondria are the main source of heat. These fluctuations could be correlated with the onset of mitosis and with the shape of the mitochondrial network.

If borne out by future investigations, the model would have several important implications, Attal says. "An important message is that the forces driving the development of life are fundamentally simple," he explains. "A second lesson is that temperature gradients matter in biochemical processes and cells can function like thermal machines."

Friday 17 September 2021

Cold planets exist throughout our galaxy, even in the galactic bulge, research suggests

Spiral galaxy

 Although thousands of planets have been discovered in the Milky Way, most reside less than a few thousand light years from Earth. Yet our Galaxy is more than 100,000 light years across, making it difficult to investigate the Galactic distribution of planets. But now, a research team has found a way to overcome this hurdle.

In a study published in The Astrophysical Journal Letters, researchers led by Osaka University and NASA have used a combination of observations and modeling to determine how the planet-hosting probability varies with the distance from the Galactic center.

The observations were based on a phenomenon called gravitational microlensing, whereby objects such as planets act as lenses, bending and magnifying the light from distant stars. This effect can be used to detect cold planets similar to Jupiter and Neptune throughout the Milky Way, from the Galactic disk to the Galactic bulge -- the central region of our Galaxy.

"Gravitational microlensing currently provides the only way to investigate the distribution of planets in the Milky Way," says Daisuke Suzuki, co-author of the study. "But until now, little was known, mainly because of the difficulty of measuring the distance to planets that are more than 10,000 light years from the Sun."

To solve this problem, the researchers instead considered the distribution of a quantity that describes the relative motion of the lens and distant light source in planetary microlensing. By comparing the distribution observed in microlensing events with that predicted by a Galactic model, the research team could infer the Galactic distribution of planets.

The results show that the planetary distribution is not strongly dependent on the distance from the Galactic center. Instead, cold planets orbiting far from their stars seem to exist universally in the Milky Way. This includes the Galactic bulge, which has a very different environment to the solar neighborhood, and where the presence of planets has long been uncertain.

"Stars in the bulge region are older and are located much closer to each other than stars in the solar neighborhood," explains lead author of the study Naoki Koshimoto. "Our finding that planets reside in both these stellar environments could lead to an improved understanding of how planets form and the history of planet formation in the Milky Way."

According to the researchers, the next step should be to combine these results with measurements of microlens parallax or lens brightness -- two other important quantities associated with planetary microlensing.

Will it be safe for humans to fly to Mars?


Spaceship on Mars illustration

Sending human travelers to Mars would require scientists and engineers to overcome a range of technological and safety obstacles. One of them is the grave risk posed by particle radiation from the sun, distant stars and galaxies.

Answering two key questions would go a long way toward overcoming that hurdle: Would particle radiation pose too grave a threat to human life throughout a round trip to the red planet? And, could the very timing of a mission to Mars help shield astronauts and the spacecraft from the radiation?

In a new article published in the peer-reviewed journal Space Weather, an international team of space scientists, including researchers from UCLA, answers those two questions with a "no" and a "yes."

That is, humans should be able to safely travel to and from Mars, provided that the spacecraft has sufficient shielding and the round trip is shorter than approximately four years. And the timing of a human mission to Mars would indeed make a difference: The scientists determined that the best time for a flight to leave Earth would be when solar activity is at its peak, known as the solar maximum.

The scientists' calculations demonstrate that it would be possible to shield a Mars-bound spacecraft from energetic particles from the sun because, during solar maximum, the most dangerous and energetic particles from distant galaxies are deflected by the enhanced solar activity.

A trip of that length would be conceivable. The average flight to Mars takes about nine months, so depending on the timing of launch and available fuel, it is plausible that a human mission could reach the planet and return to Earth in less than two years, according to Yuri Shprits, a UCLA research geophysicist and co-author of the paper.

"This study shows that while space radiation imposes strict limitations on how heavy the spacecraft can be and the time of launch, and it presents technological difficulties for human missions to Mars, such a mission is viable," said Shprits, who also is head of space physics and space weather at GFZ Research Centre for Geosciences in Potsdam, Germany.

The researchers recommend a mission not longer than four years because a longer journey would expose astronauts to a dangerously high amount of radiation during the round trip -- even assuming they went when it was relatively safer than at other times. They also report that the main danger to such a flight would be particles from outside of our solar system.

Shprits and colleagues from UCLA, MIT, Moscow's Skolkovo Institute of Science and Technology and GFZ Potsdam combined geophysical models of particle radiation for a solar cycle with models for how radiation would affect both human passengers -- including its varying effects on different bodily organs -- and a spacecraft. The modeling determined that having a spacecraft's shell built out of a relatively thick material could help protect astronauts from radiation, but that if the shielding is too thick, it could actually increase the amount of secondary radiation to which they are exposed.

The two main types of hazardous radiation in space are solar energetic particles and galactic cosmic rays; the intensity of each depends on solar activity. Galactic cosmic ray activity is lowest within the six to 12 months after the peak of solar activity, while solar energetic particles' intensity is greatest during solar maximum, Shprits said.


Thursday 16 September 2021

Brain refreshing: Why the dreaming phase matters

Dreaming concept illustration

 Scientists have long wondered why almost all animals sleep, despite the disadvantages to survival of being unconscious. Now, researchers led by a team from the University of Tsukuba have found new evidence of brain refreshing that takes place during a specific phase of sleep: rapid eye movement (REM) sleep, which is when you tend to dream a lot.

Previous studies have measured differences in blood flow in the brain between REM sleep, non-REM sleep, and wakefulness using various methods, with conflicting results. In their latest work, the Tsukuba-led team used a technique to directly visualize the movement of red blood cells in the brain capillaries (where nutrients and waste products are exchanged between brain cells and blood) of mice during awake and asleep states.

"We used a dye to make the brain blood vessels visible under fluorescent light, using a technique known as two-photon microscopy," says senior author of the study Professor Yu Hayashi. "In this way, we could directly observe the red blood cells in capillaries of the neocortex in non-anesthetized mice."

The researchers also measured electrical activity in the brain to identify REM sleep, non-REM sleep, and wakefulness, and looked for differences in blood flow between these phases.

"We were surprised by the results," explains Professor Hayashi. "There was a massive flow of red blood cells through the brain capillaries during REM sleep, but no difference between non-REM sleep and the awake state, showing that REM sleep is a unique state."

The research team then disrupted the mice's sleep, resulting in "rebound" REM sleep -- a stronger form of REM sleep to compensate for the earlier disruption. Blood flow in the brain was further increased during rebound REM sleep, suggesting an association between blood flow and REM sleep strength. However, when the researchers repeated the same experiments in mice without adenosine A2a receptors (the receptors whose blockade makes you feel more awake after drinking coffee), there was less of an increase in blood flow during REM sleep, even during rebound REM sleep.

"These results suggest that adenosine A2a receptors may be responsible for at least some of the changes in blood flow in the brain during REM sleep," says Professor Hayashi.

Given that reduced blood flow in the brain and decreased REM sleep are correlated with the development of Alzheimer's disease, which involves the buildup of waste products in the brain, it may be interesting to address whether increased blood flow in the brain capillaries during REM sleep is important for waste removal from the brain. This study lays preliminary groundwork for future investigations into the role of adenosine A2a receptors in this process, which could ultimately lead to the development of new treatments for conditions such as Alzheimer's disease.

Reducing sugar in packaged foods can prevent disease in millions

Grocery shopping

Cutting 20% of sugar from packaged foods and 40% from beverages could prevent 2.48 million cardiovascular disease events (such as strokes, heart attacks, and cardiac arrests), 490,000 cardiovascular deaths, and 750,000 diabetes cases in the U.S. over the lifetime of the adult population, according to a micro-simulation study published in Circulation.

A team of researchers from Massachusetts General Hospital (MGH), the Friedman School of Nutrition Science & Policy at Tufts University, Harvard T.H. Chan School of Public Health and New York City Department of Health and Mental Hygiene (NYC DOH) created a model to simulate and quantify the health, economic, and equity impacts of a pragmatic sugar-reduction policy proposed by the U.S. National Salt and Sugar Reduction Initiative (NSSRI). A partnership of more than 100 local, state and national health organizations convened by the NYC DOH, the NSSRI released draft sugar-reduction targets for packaged foods and beverages in 15 categories in 2018. This February, NSSRI finalized the policy with the goal of industry voluntarily committing to gradually reformulate their sugary products.
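The study's actual model is far more detailed, but the basic logic of a micro-simulation -- simulate many individuals over time and count disease events with and without the intervention -- can be sketched in a few lines. Every number below (population size, annual risk, effect size) is a hypothetical placeholder, not an input from the study:

```python
import random

def simulate(n_people, years, annual_cvd_risk, risk_reduction, seed=0):
    """Toy micro-simulation: count CVD events with and without a
    sugar-reduction policy. All parameters are illustrative only."""
    rng = random.Random(seed)
    events_baseline = events_policy = 0
    for _ in range(n_people):
        for _ in range(years):
            r = rng.random()
            # Same random draw for both arms, so the counters differ
            # only by the policy's assumed effect on annual risk.
            if r < annual_cvd_risk:
                events_baseline += 1
            if r < annual_cvd_risk * (1 - risk_reduction):
                events_policy += 1
    return events_baseline, events_policy

base, policy = simulate(n_people=10_000, years=40,
                        annual_cvd_risk=0.005, risk_reduction=0.1)
print(f"events averted per 10,000 people: {base - policy}")
```

Pairing the same random draw across both arms is a common variance-reduction trick: the difference between the counters then reflects only the policy effect, not sampling noise.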

Implementing a national policy, however, will require government support to monitor companies as they work toward the targets and to publicly report on their progress. The researchers hope their model will build consensus on the need for a national sugar-reformulation policy in the U.S. "We hope that this study will help push the reformulation initiative forward in the next few years," says Siyi Shangguan, MD, MPH, lead author and attending physician at MGH. "Reducing the sugar content of commercially prepared foods and beverages will have a larger impact on the health of Americans than other initiatives to cut sugar, such as imposing a sugar tax, labeling added sugar content, or banning sugary drinks in schools."

Ten years after the NSSRI policy goes into effect, the U.S. could expect to save $4.28 billion in total net healthcare costs, and $118.04 billion over the lifetime of the current adult population (ages 35 to 79), according to the model. Adding the societal costs of lost productivity of Americans developing diseases from excessive sugar consumption, the total cost savings of the NSSRI policy rises to $160.88 billion over the adult population's lifetime. These benefits are likely an underestimate, since the calculations were conservative. The study also demonstrated that even partial industry compliance with the policy could generate significant health and economic gains.
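The reported figures decompose straightforwardly: the lifetime societal total is the healthcare savings plus a lost-productivity component, which the article's numbers imply is roughly $42.8 billion (this component is an inference from the figures above, not a number the study states):

```python
healthcare_savings = 118.04  # lifetime net healthcare savings, $B (from the article)
total_societal = 160.88      # lifetime total incl. lost productivity, $B (from the article)

# Implied lost-productivity component (inferred, not stated in the study):
productivity = round(total_societal - healthcare_savings, 2)
print(productivity)  # 42.84
```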

The researchers found that the NSSRI policy became cost-effective at six years and cost-saving at nine years. The policy could also reduce disparities, with the greatest estimated health gains among Black and Hispanic adults, and Americans with lower income and less education -- populations that consume the most sugar as a historical consequence of inequitable systems.

Product reformulation efforts have been shown to be successful in reducing other harmful nutrients, such as trans fats and sodium. The U.S., however, lags behind other countries in implementing strong sugar-reduction policies, with countries such as the UK, Norway, and Singapore taking the lead on sugar-reformulation efforts. The U.S. may yet become a leader in protecting its people from the dangers of excessive sugar consumption if the NSSRI's proposed sugar-reduction targets are achieved. "The NSSRI policy is by far the most carefully designed and comprehensive, yet achievable, sugar-reformulation initiative in the world," says Shangguan.

Consuming sugary foods and beverages is strongly linked to obesity and diseases such as type 2 diabetes and cardiovascular disease, the leading cause of mortality in the U.S. More than two in five American adults are obese, one in two have diabetes or prediabetes, and nearly one in two have cardiovascular disease, with those from lower-income groups being disproportionately burdened.

"Sugar is one of the most obvious additives in the food supply to reduce to reasonable amounts," says Dariush Mozaffarian, MD, DrPH, co-senior author and dean of the Friedman School of Nutrition Science and Policy at Tufts University. "Our findings suggest it's time to implement a national program with voluntary sugar reduction targets, which can generate major improvements in health, health disparities, and healthcare spending in less than a decade."

Major funding for this study was provided by the National Institutes of Health.

Shangguan is an attending at MGH and an instructor of Medicine at Harvard Medical School. Mozaffarian is dean of the Friedman School of Nutrition Science and Policy at Tufts University. Thomas Gaziano, MD, MSc, is associate professor at Brigham and Women's Hospital and assistant professor of Medicine at HMS. Renata Micha, PhD, is research associate professor at the Friedman School of Nutrition Science and Policy at Tufts University and associate professor at the University of Thessaly in Greece.


 

Tuesday 17 August 2021

Eating more plant foods may lower heart disease risk in young adults, older women

Woman holding crate of fresh vegetable

Eating more nutritious, plant-based foods is heart-healthy at any age, according to two research studies published today in the Journal of the American Heart Association, an open access journal of the American Heart Association.

In two separate studies analyzing different measures of healthy plant food consumption, researchers found that both young adults and postmenopausal women had fewer heart attacks and were less likely to develop cardiovascular disease when they ate more healthy plant foods.

The American Heart Association Diet and Lifestyle Recommendations suggest an overall healthy dietary pattern that emphasizes a variety of fruits and vegetables, whole grains, low-fat dairy products, skinless poultry and fish, nuts and legumes and non-tropical vegetable oils. It also advises limited consumption of saturated fat, trans fat, sodium, red meat, sweets and sugary drinks.

One study, titled "A Plant-Centered Diet and Risk of Incident Cardiovascular Disease during Young to Middle Adulthood," evaluated whether long-term consumption of a plant-centered diet and a shift toward a plant-centered diet starting in young adulthood are associated with a lower risk of cardiovascular disease in midlife.

"Earlier research was focused on single nutrients or single foods, yet there is little data about a plant-centered diet and the long-term risk of cardiovascular disease," said Yuni Choi, Ph.D., lead author of the young adult study and a postdoctoral researcher in the division of epidemiology and community health at the University of Minnesota School of Public Health in Minneapolis.

Choi and colleagues examined diet and the occurrence of heart disease in 4,946 adults enrolled in the Coronary Artery Risk Development in Young Adults (CARDIA) study. Participants were 18 to 30 years old at the time of enrollment (1985-1986) and were free of cardiovascular disease at that time. Participants included 2,509 Black adults and 2,437 white adults (54.9% women overall) who were also analyzed by education level (equivalent to more than high school vs. high school or less). Participants had eight follow-up exams from 1987-88 to 2015-16 that included lab tests, physical measurements, medical histories and assessment of lifestyle factors. Unlike randomized controlled trials, participants were not instructed to eat certain things and were not told their scores on the diet measures, so the researchers could collect unbiased, long-term habitual diet data.

After detailed diet history interviews, the quality of the participants' diets was scored based on the A Priori Diet Quality Score (APDQS), composed of 46 food groups, at years 0, 7 and 20 of the study. The food groups were classified into beneficial foods (such as fruits, vegetables, beans, nuts and whole grains); adverse foods (such as fried potatoes, high-fat red meat, salty snacks, pastries and soft drinks); and neutral foods (such as potatoes, refined grains, lean meats and shellfish) based on their known association with cardiovascular disease.

Participants who received higher scores ate a variety of beneficial foods, while people who had lower scores ate more adverse foods. Overall, higher values correspond to a nutritionally rich, plant-centered diet.

"As opposed to existing diet quality scores that are usually based on small numbers of food groups, APDQS is explicit in capturing the overall quality of diet using 46 individual food groups, describing the whole diet that the general population commonly consumes. Our scoring is very comprehensive, and it has many similarities with diets like the Dietary Guidelines for Americans Healthy Eating Index (from the U.S. Department of Agriculture's Food and Nutrition Service), the DASH (Dietary Approaches to Stop Hypertension) diet and the Mediterranean diet," said David E. Jacobs Jr., Ph.D., senior author of the study and Mayo Professor of Public Health in the division of epidemiology and community health at the University of Minnesota School of Public Health in Minneapolis.

Researchers found:

  • During 32 years of follow-up, 289 of the participants developed cardiovascular disease (including heart attack, stroke, heart failure, heart-related chest pain or clogged arteries anywhere in the body).
  • People who scored in the top 20% on the long-term diet quality score (meaning they ate the most nutritionally rich plant foods and fewer adversely rated animal products) were 52% less likely to develop cardiovascular disease, after considering several factors (including age, sex, race, average caloric consumption, education, parental history of heart disease, smoking and average physical activity).
  • In addition, between year 7 and 20 of the study, when participants' ages ranged from 25 to 50, those who improved their diet quality the most (eating more beneficial plant foods and fewer adversely rated animal products) were 61% less likely to develop subsequent cardiovascular disease, in comparison to the participants whose diet quality declined the most during that time.
  • There were few vegetarians among the participants, so the study was not able to assess the possible benefits of a strict vegetarian diet, which excludes all animal products, including meat, dairy and eggs.
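Figures like "52% less likely" correspond to a relative risk (or hazard ratio) below 1. The conversion is simple arithmetic; the relative risks below (0.48, 0.39) are inferred from the article's reported percentages, not values quoted from the study:

```python
def percent_reduction(relative_risk):
    """Convert a relative risk (or hazard ratio) to a '% less likely' figure."""
    return round((1 - relative_risk) * 100)

print(percent_reduction(0.48))  # 52 -- top-20% diet scorers vs. the rest
print(percent_reduction(0.39))  # 61 -- most-improved vs. most-declined diets
```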

"A nutritionally rich, plant-centered diet is beneficial for cardiovascular health. A plant-centered diet is not necessarily vegetarian," Choi said. "People can choose among plant foods that are as close to natural as possible, not highly processed. We think that individuals can include animal products in moderation from time to time, such as non-fried poultry, non-fried fish, eggs and low-fat dairy."

Because this study is observational, it cannot prove a cause-and-effect relationship between diet and heart disease.


 

Saturday 15 May 2021

Lightning and subvisible discharges produce molecules that clean the atmosphere

Lightning strike

Lightning bolts break apart nitrogen and oxygen molecules in the atmosphere and create reactive chemicals that affect greenhouse gases. Now, a team of atmospheric chemists and lightning scientists has found that lightning bolts and, surprisingly, subvisible discharges that cannot be seen by cameras or the naked eye produce extreme amounts of the hydroxyl radical -- OH -- and the hydroperoxyl radical -- HO2.

The hydroxyl radical is important in the atmosphere because it initiates chemical reactions and breaks down molecules like the greenhouse gas methane. OH is the main driver of many compositional changes in the atmosphere.

"Initially, we looked at these huge OH and HO2 signals found in the clouds and asked, what is wrong with our instrument?" said William H. Brune, distinguished professor of meteorology at Penn State. "We assumed there was noise in the instrument, so we removed the huge signals from the dataset and shelved them for later study."

The data came from an instrument aboard a plane flown above Colorado and Oklahoma in 2012 to study the chemical changes that thunderstorms and lightning make to the atmosphere.

But a few years ago, Brune took the data off the shelf, saw that the signals were really hydroxyl and hydroperoxyl, and then worked with a graduate student and research associate to see if these signals could be produced by sparks and subvisible discharges in the laboratory. Then they reanalyzed the thunderstorm and lightning dataset.

"With the help of a great undergraduate intern," said Brune, "we were able to link the huge signals seen by our instrument flying through the thunderstorm clouds to the lightning measurements made from the ground."

The researchers report their results online today (April 29) in Science First Release and the Journal of Geophysical Research -- Atmospheres.

Brune notes that airplanes avoid flying through the rapidly rising cores of thunderstorms because it is dangerous, but can sample the anvil, the top portion of the cloud that spreads outward in the direction of the wind. Visible lightning happens in the part of the anvil near the thunderstorm core.

"Through history, people were only interested in lightning bolts because of what they could do on the ground," said Brune. "Now there is increasing interest in the weaker electrical discharges in thunderstorms that lead to lightning bolts."

Most lightning never strikes the ground, and the lightning that stays in the clouds is particularly important for affecting ozone, an important greenhouse gas, in the upper atmosphere. It was known that lightning can split water to form hydroxyl and hydroperoxyl, but this process had never before been observed in thunderstorms.

What confused Brune's team initially was that their instrument recorded high levels of hydroxyl and hydroperoxyl in areas of the cloud where there was no lightning visible from the aircraft or the ground. Experiments in the lab showed that weak electrical current, much less energetic than that of visible lightning, could produce these same components.

While the researchers found hydroxyl and hydroperoxyl in areas with subvisible lightning, they found little evidence of ozone and no evidence of nitric oxide, which requires visible lightning to form. If subvisible lightning occurs routinely, then the hydroxyl and hydroperoxyl these electrical events create need to be included in atmospheric models. Currently, they are not.

According to the researchers, "Lightning-generated OH (hydroxyl) in all storms happening globally can be responsible for a highly uncertain but substantial 2% to 16% of global atmospheric OH oxidation."

"These results are highly uncertain, partly because we do not know how these measurements apply to the rest of the globe," said Brune. "We only flew over Colorado and Oklahoma. Most thunderstorms are in the tropics. The whole structure of high plains storms is different than those in the tropics. Clearly we need more aircraft measurements to reduce this uncertainty."

Other researchers at Penn State include Patrick J. McFarland, undergraduate; David O. Miller, doctoral recipient; and Jena M. Jenkins, doctoral candidate, all in meteorology and atmospheric science.

Saturday 8 May 2021

Mars has right ingredients for present-day microbial life beneath its surface, study finds

Mars illustration

 As NASA's Perseverance rover begins its search for ancient life on the surface of Mars, a new study suggests that the Martian subsurface might be a good place to look for possible present-day life on the Red Planet.

The study, published in the journal Astrobiology, looked at the chemical composition of Martian meteorites -- rocks blasted off the surface of Mars that eventually landed on Earth. The analysis determined that those rocks, if in consistent contact with water, would produce the chemical energy needed to support microbial communities similar to those that survive in the unlit depths of the Earth. Because these meteorites may be representative of vast swaths of the Martian crust, the findings suggest that much of the Martian subsurface could be habitable.

"The big implication here for subsurface exploration science is that wherever you have groundwater on Mars, there's a good chance that you have enough chemical energy to support subsurface microbial life," said Jesse Tarnas, a postdoctoral researcher at NASA's Jet Propulsion Laboratory who led the study while completing his Ph.D. at Brown University. "We don't know whether life ever got started beneath the surface of Mars, but if it did, we think there would be ample energy there to sustain it right up to today."

In recent decades, scientists have discovered that Earth's depths are home to a vast biome that exists largely separated from the world above. Lacking sunlight, these creatures survive using the byproducts of chemical reactions produced when rocks come into contact with water.

One of those reactions is radiolysis, which occurs when radioactive elements within rocks react with water trapped in pore and fracture space. The reaction breaks water molecules into their constituent elements, hydrogen and oxygen. The liberated hydrogen is dissolved in the remaining groundwater, while minerals like pyrite (fool's gold) soak up free oxygen to form sulfate minerals. Microbes can ingest the dissolved hydrogen as fuel and use the oxygen preserved in the sulfates to "burn" that fuel.
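The chain described above can be written as textbook reactions (standard geochemical stoichiometry, not equations taken from the study itself):

```latex
% Net radiolysis of groundwater by ionizing radiation from the rock:
2\,\mathrm{H_2O} \xrightarrow{\;\text{radiation}\;} \mathrm{H_2} + \mathrm{H_2O_2}

% Pyrite soaks up the liberated oxygen, forming sulfate minerals:
\mathrm{FeS_2} + \tfrac{15}{4}\,\mathrm{O_2} + \tfrac{7}{2}\,\mathrm{H_2O}
  \longrightarrow \mathrm{Fe(OH)_3} + 2\,\mathrm{SO_4^{2-}} + 4\,\mathrm{H^+}

% Sulfate-reducing microbes "burn" the dissolved hydrogen as fuel:
\mathrm{SO_4^{2-}} + 4\,\mathrm{H_2} + 2\,\mathrm{H^+}
  \longrightarrow \mathrm{H_2S} + 4\,\mathrm{H_2O}
```

Each reaction is mass- and charge-balanced; the key point is that the first supplies the fuel (H2), the second stores the oxidant (sulfate), and the third is the microbial metabolism that links them.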

In places like Canada's Kidd Creek Mine, these "sulfate-reducing" microbes have been found living more than a mile underground, in water that hasn't seen the light of day in more than a billion years. Tarnas has been working with a team co-led by Brown University professor Jack Mustard and Professor Barbara Sherwood Lollar of the University of Toronto to better understand these underground systems, with an eye toward looking for similar habitats on Mars and elsewhere in the solar system. The project, called Earth 4-D: Subsurface Science and Exploration, is supported by the Canadian Institute for Advanced Research.

For this new study, the researchers wanted to see if the ingredients for radiolysis-driven habitats could exist on Mars. They drew on data from NASA's Curiosity rover and other orbiting spacecraft, as well as compositional data from a suite of Martian meteorites, which are representative of different parts of the planet's crust.

The researchers were looking for the ingredients for radiolysis: radioactive elements like thorium, uranium and potassium; sulfide minerals that could be converted to sulfate; and rock units with adequate pore space to trap water. The study found that in several different types of Martian meteorites, all the ingredients are present in adequate abundances to support Earth-like habitats. This was particularly true for regolith breccias -- meteorites sourced from crustal rocks more than 3.6 billion years old -- which were found to have the highest potential for life support. Unlike Earth, Mars lacks a plate tectonics system that constantly recycles crustal rocks, so these ancient terrains remain largely undisturbed.

The researchers say the findings help make the case for an exploration program that looks for signs of present-day life in the Martian subsurface. Prior research has found evidence of an active groundwater system on Mars in the past, the researchers say, and there's reason to believe that groundwater exists today. One recent study, for example, raised the possibility of an underground lake lurking under the planet's southern ice cap. This new research suggests that wherever there's groundwater, there's energy for life.

The cerebellum may have played an important role in the evolution of the human brain

Illustration of human brain, cerebellum highlighted

 The cerebellum -- a part of the brain once recognized mainly for its role in coordinating movement -- underwent evolutionary changes that may have contributed to human culture, language and tool use. This new finding appears in a study by Elaine Guevara of Duke University and colleagues, published May 6th in the journal PLOS Genetics.

Scientists studying how humans evolved their remarkable capacity to think and learn have frequently focused on the prefrontal cortex, a part of the brain vital for executive functions, like moral reasoning and decision making. But recently, the cerebellum has begun receiving more attention for its role in human cognition. Guevara and her team investigated the evolution of the cerebellum and the prefrontal cortex by looking for molecular differences between humans, chimpanzees, and rhesus macaque monkeys. Specifically, they examined genomes from the two types of brain tissue in the three species to find epigenetic differences. These are modifications that do not change the DNA sequence but can affect which genes are turned on and off and can be inherited by future generations.

Compared to chimpanzees and rhesus macaques, humans showed greater epigenetic differences in the cerebellum than the prefrontal cortex, highlighting the importance of the cerebellum in human brain evolution. The epigenetic differences were especially apparent on genes involved in brain development, brain inflammation, fat metabolism and synaptic plasticity -- the strengthening or weakening of connections between neurons depending on how often they are used.

The epigenetic differences identified in the new study are relevant for understanding how the human brain functions and its ability to adapt and make new connections. These epigenetic differences may also be involved in aging and disease. Previous studies have shown that epigenetic differences between humans and chimpanzees in the prefrontal cortex are associated with genes involved in psychiatric conditions and neurodegeneration. Overall, the new study affirms the importance of including the cerebellum when studying how the human brain evolved.

Tuesday 4 May 2021

Among COVID-19 survivors, an increased risk of death, serious illness

COVID-19 test concept

As the COVID-19 pandemic has progressed, it has become clear that many survivors -- even those who had mild cases -- continue to manage a variety of health problems long after the initial infection should have resolved. In what is believed to be the largest comprehensive study of long COVID-19 to date, researchers at Washington University School of Medicine in St. Louis showed that COVID-19 survivors -- including those not sick enough to be hospitalized -- have an increased risk of death in the six months following diagnosis with the virus.

The researchers also have catalogued the numerous diseases associated with COVID-19, providing a big-picture overview of the long-term complications of COVID-19 and revealing the massive burden this disease is likely to place on the world's population in the coming years.

The study, involving more than 87,000 COVID-19 patients and nearly 5 million control patients in a federal database, appears online April 22 in the journal Nature.

"Our study demonstrates that up to six months after diagnosis, the risk of death following even a mild case of COVID-19 is not trivial and increases with disease severity," said senior author Ziyad Al-Aly, MD, an assistant professor of medicine. "It is not an exaggeration to say that long COVID-19 -- the long-term health consequences of COVID-19 -- is America's next big health crisis. Given that more than 30 million Americans have been infected with this virus, and given that the burden of long COVID-19 is substantial, the lingering effects of this disease will reverberate for many years and even decades. Physicians must be vigilant in evaluating people who have had COVID-19. These patients will need integrated, multidisciplinary care."

In the new study, the researchers were able to calculate the potential scale of the problems first glimpsed from anecdotal accounts and smaller studies that hinted at the wide-ranging side effects of surviving COVID-19, from breathing problems and irregular heart rhythms to mental health issues and hair loss.

"This study differs from others that have looked at long COVID-19 because, rather than focusing on just the neurologic or cardiovascular complications, for example, we took a broad view and used the vast databases of the Veterans Health Administration (VHA) to comprehensively catalog all diseases that may be attributable to COVID-19," said Al-Aly, also director of the Clinical Epidemiology Center and chief of the Research and Education Service at the Veterans Affairs St. Louis Health Care System.

The investigators showed that, after surviving the initial infection (beyond the first 30 days of illness), COVID-19 survivors had an almost 60% increased risk of death over the following six months compared with the general population. At the six-month mark, excess deaths among all COVID-19 survivors were estimated at eight people per 1,000 patients. Among patients who were ill enough to be hospitalized with COVID-19 and who survived beyond the first 30 days of illness, there were 29 excess deaths per 1,000 patients over the following six months.
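An "excess deaths per 1,000" figure is simply the difference between two groups' cumulative death rates, scaled to 1,000 people. A minimal illustration -- the two per-group rates below are hypothetical, chosen only so their difference matches the article's figure of about 8 per 1,000 survivors:

```python
def excess_per_1000(rate_exposed, rate_control):
    """Difference in cumulative death rates, expressed per 1,000 people."""
    return round((rate_exposed - rate_control) * 1000, 1)

# Hypothetical 6-month death rates whose difference matches the
# article's ~8 excess deaths per 1,000 COVID-19 survivors:
print(excess_per_1000(0.021, 0.013))  # 8.0
```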

"These later deaths due to long-term complications of the infection are not necessarily recorded as deaths due to COVID-19," Al-Aly said. "As far as total pandemic death toll, these numbers suggest that the deaths we're counting due to the immediate viral infection are only the tip of the iceberg."

The researchers analyzed data from the national health-care databases of the U.S. Department of Veterans Affairs. The dataset included 73,435 VHA patients with confirmed COVID-19 but who were not hospitalized and, for comparison, almost 5 million VHA patients who did not have a COVID-19 diagnosis and were not hospitalized during this time frame. The veterans in the study were primarily men (almost 88%), but the large sample size meant that the study still included 8,880 women with confirmed cases.

To help understand the long-term effects of more severe COVID-19, the researchers harnessed VHA data to conduct a separate analysis of 13,654 patients hospitalized with COVID-19 compared with 13,997 patients hospitalized with seasonal flu. All patients survived at least 30 days after hospital admission, and the analysis included six months of follow-up data.

The researchers confirmed that, although COVID-19 is initially a respiratory disease, long COVID-19 can affect nearly every organ system in the body. Evaluating 379 diagnoses of diseases possibly related to COVID-19, 380 classes of medications prescribed and 62 laboratory tests administered, the researchers identified newly diagnosed major health issues that persisted in COVID-19 patients over at least six months and that affected nearly every organ and regulatory system in the body, including:

  • Respiratory system: persistent cough, shortness of breath and low oxygen levels in the blood.
  • Nervous system: stroke, headaches, memory problems and problems with senses of taste and smell.
  • Mental health: anxiety, depression, sleep problems and substance abuse.
  • Metabolism: new onset of diabetes, obesity and high cholesterol.
  • Cardiovascular system: acute coronary disease, heart failure, heart palpitations and irregular heart rhythms.
  • Gastrointestinal system: constipation, diarrhea and acid reflux.
  • Kidney: acute kidney injury and chronic kidney disease that can, in severe cases, require dialysis.
  • Coagulation regulation: blood clots in the legs and lungs.
  • Skin: rash and hair loss.
  • Musculoskeletal system: joint pain and muscle weakness.
  • General health: malaise, fatigue and anemia.

While no survivor suffered from all of these problems, many developed a cluster of several issues that have a significant impact on health and quality of life.

Among hospitalized patients, those who had COVID-19 fared considerably worse than those who had influenza, according to the analysis. COVID-19 survivors had a 50% increased risk of death compared with flu survivors, with about 29 excess deaths per 1,000 patients at six months. Survivors of COVID-19 also had a substantially higher risk of long-term medical problems.

"Compared with flu, COVID-19 showed remarkably higher burden of disease, both in the magnitude of risk and the breadth of organ system involvement," Al-Aly said. "Long COVID-19 is more than a typical postviral syndrome. The size of the risk of disease and death and the extent of organ system involvement is far higher than what we see with other respiratory viruses, such as influenza."

In addition, the researchers found that the health risks from surviving COVID-19 increased with the severity of disease, with hospitalized patients who required intensive care being at highest risk of long COVID-19 complications and death.

"Some of these problems may improve with time -- for example, shortness of breath and cough may get better -- and some problems may get worse," Al-Aly added. "We will continue following these patients to help us understand the ongoing impacts of the virus beyond the first six months after infection. We're only a little over a year into this pandemic, so there may be consequences of long COVID-19 that are not yet visible."


 

Genetic effects of Chernobyl radiation

View of Chernobyl nuclear power plant 

In two landmark studies, researchers have used cutting-edge genomic tools to investigate the potential health effects of exposure to ionizing radiation, a known carcinogen, from the 1986 accident at the Chernobyl nuclear power plant in northern Ukraine. One study found no evidence that radiation exposure to parents resulted in new genetic changes being passed from parent to child. The second study documented the genetic changes in the tumors of people who developed thyroid cancer after being exposed as children or fetuses to the radiation released by the accident.

The findings, published around the 35th anniversary of the disaster, are from international teams of investigators led by researchers at the National Cancer Institute (NCI), part of the National Institutes of Health. The studies were published online in Science on April 22.

"Scientific questions about the effects of radiation on human health have been investigated since the atomic bombings of Hiroshima and Nagasaki and have been raised again by Chernobyl and by the nuclear accident that followed the tsunami in Fukushima, Japan," said Stephen J. Chanock, M.D., director of NCI's Division of Cancer Epidemiology and Genetics (DCEG). "In recent years, advances in DNA sequencing technology have enabled us to begin to address some of the important questions, in part through comprehensive genomic analyses carried out in well-designed epidemiological studies."

The Chernobyl accident exposed millions of people in the surrounding region to radioactive contaminants. Studies have provided much of today's knowledge about cancers caused by radiation exposures from nuclear power plant accidents. The new research builds on this foundation using next-generation DNA sequencing and other genomic characterization tools to analyze biospecimens from people in Ukraine who were affected by the disaster.

The first study investigated the long-standing question of whether radiation exposure results in genetic changes that can be passed from parent to offspring, as has been suggested by some studies in animals. To answer this question, Dr. Chanock and his colleagues analyzed the complete genomes of 130 people born between 1987 and 2002 and their 105 mother-father pairs.

One or both of the parents had been workers who helped clean up from the accident or had been evacuated because they lived in close proximity to the accident site. Each parent was evaluated for protracted exposure to ionizing radiation, which may have occurred through the consumption of contaminated milk (that is, milk from cows that grazed on pastures that had been contaminated by radioactive fallout). The mothers and fathers experienced a range of radiation doses.

The researchers analyzed the genomes of the adult children for an increase in a particular type of genetic change known as de novo mutations. De novo mutations are genetic changes that arise randomly in a person's gametes (sperm and eggs) and can be transmitted to their offspring but are not observed in the parents.

For the range of radiation exposures experienced by the parents in the study, the whole-genome sequencing data showed no evidence of an increase in the number or types of de novo mutations in their children born between 46 weeks and 15 years after the accident. The number of de novo mutations observed in these children was highly similar to that of the general population with comparable characteristics. The findings therefore suggest that the ionizing radiation exposure from the accident had a minimal, if any, impact on the health of the subsequent generation.

"We view these results as very reassuring for people who were living in Fukushima at the time of the accident in 2011," said Dr. Chanock. "The radiation doses in Japan are known to have been lower than those recorded at Chernobyl."

In the second study, researchers used next-generation sequencing to profile the genetic changes in thyroid cancers that developed in 359 people exposed as children or in utero to ionizing radiation from radioactive iodine (I-131) released by the Chernobyl nuclear accident and in 81 unexposed individuals born more than nine months after the accident. Increased risk of thyroid cancer has been one of the most important adverse health effects observed after the accident.

The energy from ionizing radiation breaks the chemical bonds in DNA, resulting in a number of different types of damage. The new study highlights the importance of a particular kind of DNA damage that involves breaks in both DNA strands in the thyroid tumors. The association between DNA double-strand breaks and radiation exposure was stronger for children exposed at younger ages.

Next, the researchers identified the candidate "drivers" of the cancer in each tumor -- the key genes in which alterations enabled the cancers to grow and survive. They identified the drivers in more than 95% of the tumors. Nearly all the alterations involved genes in the same signaling pathway, called the mitogen-activated protein kinase (MAPK) pathway, including the genes BRAF, RAS, and RET.

The set of affected genes is similar to what has been reported in previous studies of thyroid cancer. However, the researchers observed a shift in the distribution of the types of mutations in the genes. Specifically, in the Chernobyl study, thyroid cancers that occurred in people exposed to higher radiation doses as children were more likely to result from gene fusions (when both strands of DNA are broken and then the wrong pieces are joined back together), whereas those in unexposed people or those exposed to low levels of radiation were more likely to result from point mutations (single base-pair changes in a key part of a gene).

The results suggest that DNA double-strand breaks may be an early genetic change following exposure to radiation in the environment that subsequently enables the growth of thyroid cancers. The findings provide a foundation for further studies of radiation-induced cancers, particularly those that involve differences in risk as a function of both dose and age, the researchers added.

"An exciting aspect of this research was the opportunity to link the genomic characteristics of the tumor with information about the radiation dose -- the risk factor that potentially caused the cancer," said Lindsay M. Morton, Ph.D., deputy chief of the Radiation Epidemiology Branch in DCEG, who led the study.

"The Cancer Genome Atlas set the standard for how to comprehensively profile tumor characteristics," Dr. Morton continued. "We extended that approach to complete the first large genomic landscape study in which the potential carcinogenic exposure was well-characterized, enabling us to investigate the relationship between specific tumor characteristics and radiation dose."

She noted that the study was made possible by the creation of the Chernobyl Tissue Bank about two decades ago -- long before the technology had been developed to conduct the kind of genomic and molecular studies that are common today.
