Saturday 26 November 2022

Astronomers discover closest black hole to Earth

Black hole illustration

Astronomers using the International Gemini Observatory, operated by NSF's NOIRLab, have discovered the closest-known black hole to Earth. This is the first unambiguous detection of a dormant stellar-mass black hole in the Milky Way. Its close proximity to Earth, a mere 1600 light-years away, offers an intriguing target of study to advance our understanding of the evolution of binary systems.

Black holes are the most extreme objects in the Universe. Supermassive versions of these unimaginably dense objects likely reside at the centers of all large galaxies. Stellar-mass black holes -- which weigh approximately five to 100 times the mass of the Sun -- are much more common, with an estimated 100 million in the Milky Way alone. Only a handful have been confirmed to date, however, and nearly all of these are 'active' -- meaning they shine brightly in X-rays as they consume material from a nearby stellar companion -- whereas dormant black holes do not.

Astronomers using the Gemini North telescope in Hawai'i, one of the twin telescopes of the International Gemini Observatory, operated by NSF's NOIRLab, have discovered the closest black hole to Earth, which the researchers have dubbed Gaia BH1. This dormant black hole is about 10 times more massive than the Sun and is located about 1600 light-years away in the constellation Ophiuchus, making it three times closer to Earth than the previous record holder, an X-ray binary in the constellation of Monoceros. The new discovery was made possible by exquisite observations of the motion of the black hole's companion, a Sun-like star that orbits the black hole at about the same distance as the Earth orbits the Sun.

"Take the Solar System, put a black hole where the Sun is, and the Sun where the Earth is, and you get this system," explained Kareem El-Badry, an astrophysicist at the Center for Astrophysics | Harvard & Smithsonianand the Max Planck Institute for Astronomy, and the lead author of the paper describing this discovery. "While there have been many claimed detections of systems like this, almost all these discoveries have subsequently been refuted. This is the first unambiguous detection of a Sun-like star in a wide orbit around a stellar-mass black hole in our Galaxy."

Though there are likely millions of stellar-mass black holes roaming the Milky Way Galaxy, those few that have been detected were uncovered by their energetic interactions with a companion star. As material from a nearby star spirals in toward the black hole, it becomes superheated and generates powerful X-rays and jets of material. If a black hole is not actively feeding (i.e., it is dormant) it simply blends in with its surroundings.

"I've been searching for dormant black holes for the last four years using a wide range of datasets and methods," said El-Badry. "My previous attempts -- as well as those of others -- turned up a menagerie of binary systems that masquerade as black holes, but this is the first time the search has borne fruit."

The team originally identified the system as potentially hosting a black hole by analyzing data from the European Space Agency's Gaia spacecraft. Gaia captured the minute irregularities in the star's motion caused by the gravity of an unseen massive object. To explore the system in more detail, El-Badry and his team turned to the Gemini Multi-Object Spectrograph instrument on Gemini North, which measured the velocity of the companion star as it orbited the black hole and provided a precise measurement of its orbital period. The Gemini follow-up observations were crucial to constraining the orbital motion, and hence the masses, of the two components in the binary system, allowing the team to identify the central body as a black hole roughly 10 times as massive as our Sun.
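As a back-of-the-envelope illustration of how an orbit yields a mass, Kepler's third law ties a binary's period and separation to its total mass. The values below are illustrative stand-ins consistent with the article's description, not the team's fitted parameters:

```python
# Kepler's third law in solar units: with the semi-major axis a in AU and
# the period P in years, the total system mass in solar masses is a**3 / P**2.
a_au = 1.4                # assumed semi-major axis of the relative orbit
period_yr = 186 / 365.25  # assumed orbital period of roughly half a year

m_total = a_au**3 / period_yr**2   # total binary mass, ~10.6 M_sun
m_star = 1.0                       # the Sun-like companion
m_dark = m_total - m_star          # the rest belongs to the unseen object

print(f"total: {m_total:.1f} M_sun; unseen companion: {m_dark:.1f} M_sun")
```

A dark companion of roughly ten solar masses that emits no light leaves a black hole as the only plausible explanation, as the team notes below.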

"Our Gemini follow-up observations confirmed beyond reasonable doubt that the binary contains a normal star and at least one dormant black hole," elaborated El-Badry. "We could find no plausible astrophysical scenario that can explain the observed orbit of the system that doesn't involve at least one black hole."

The team relied not only on Gemini North's superb observational capabilities but also on Gemini's ability to provide data on a tight deadline, as the team had only a short window in which to perform their follow-up observations.

"When we had the first indications that the system contained a black hole, we only had one week before the two objects were at the closest separation in their orbits. Measurements at this point are essential to make accurate mass estimates in a binary system," said El-Badry. "Gemini's ability to provide observations on a short timescale was critical to the project's success. If we'd missed that narrow window, we would have had to wait another year."

Astronomers' current models of the evolution of binary systems are hard-pressed to explain how the peculiar configuration of the Gaia BH1 system could have arisen. Specifically, the progenitor star that later turned into the newly detected black hole would have been at least 20 times as massive as our Sun. This means it would have lived only a few million years. If both stars formed at the same time, this massive star would have quickly turned into a supergiant, puffing up and engulfing the other star before it had time to become a proper, hydrogen-burning, main-sequence star like our Sun.

It is not at all clear how the solar-mass star could have survived that episode, ending up as an apparently normal star, as the observations of the black hole binary indicate. Theoretical models that do allow for survival all predict that the solar-mass star should have ended up on a much tighter orbit than what is actually observed.

This could indicate that there are important gaps in our understanding of how black holes form and evolve in binary systems, and also suggests the existence of an as-yet-unexplored population of dormant black holes in binaries.

"It is interesting that this system is not easily accommodated by standard binary evolution models," concluded El-Badry. "It poses many questions about how this binary system was formed, as well as how many of these dormant black holes there are out there."

"As part of a network of space- and ground-based observatories, Gemini North has not only provided strong evidence for the nearest black hole to date but also the first pristine black hole system, uncluttered by the usual hot gas interacting with the black hole," said NSF Gemini Program Officer Martin Still. "While this potentially augurs future discoveries of the predicted dormant black hole population in our Galaxy, the observations also leave a mystery to be solved -- despite a shared history with its exotic neighbor, why is the companion star in this binary system so normal?"

 

Differences between brains of primates are small but significant, study shows

Model of brain 

 While the physical differences between humans and non-human primates are quite distinct, a new study reveals their brains may be remarkably similar. And yet, the smallest changes may make big differences in developmental and psychiatric disorders.

Understanding the molecular differences that make the human brain distinct can help researchers study disruptions in its development. A new study, published recently in the journal Science by a team including University of Wisconsin-Madison neuroscience professor Andre Sousa, investigates the differences and similarities of cells in the prefrontal cortex -- the frontmost region of the brain, an area that plays a central role in higher cognitive functions -- between humans and non-human primates such as chimpanzees, Rhesus macaques and marmosets.

The cellular differences between these species may illuminate steps in their evolution and how those differences can be implicated in disorders, such as autism and intellectual disabilities, seen in humans. Sousa, who studies the developmental biology of the brain at UW-Madison's Waisman Center, decided to start by studying and categorizing the cells in the prefrontal cortex in partnership with the Yale University lab where he worked as a postdoctoral researcher.

"We are profiling the dorsolateral prefrontal cortex because it is particularly interesting. This cortical area only exists in primates. It doesn't exist in other species," Sousa says. "It has been associated with several relevant functions in terms of high cognition, like working memory. It has also been implicated in several neuropsychiatric disorders. So, we decided to do this study to understand what is unique about humans in this brain region."

Sousa and his lab collected genetic information from more than 600,000 prefrontal cortex cells in tissue samples taken from humans, chimpanzees, macaques and marmosets. They analyzed that data to categorize the cells into types and to determine the differences in similar cells across species. Unsurprisingly, the vast majority of the cells were fairly comparable.
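The article doesn't spell out the team's analysis pipeline; as a generic sketch of how single-cell data of this kind are typically clustered into cell types, here is a minimal Scanpy workflow (the file name and parameter choices are hypothetical):

```python
# Generic single-cell clustering sketch -- not the study's actual pipeline.
import scanpy as sc

adata = sc.read_h5ad("prefrontal_cortex_cells.h5ad")   # cells x genes matrix

sc.pp.normalize_total(adata, target_sum=1e4)   # library-size normalization
sc.pp.log1p(adata)                             # log-transform the counts
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
adata = adata[:, adata.var.highly_variable]    # keep the informative genes

sc.pp.pca(adata, n_comps=50)       # reduce dimensionality
sc.pp.neighbors(adata)             # build a k-nearest-neighbor graph
sc.tl.leiden(adata)                # graph-based clusters (needs leidenalg)

print(adata.obs["leiden"].value_counts())   # cells per putative cell type
```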

"Most of the cells are actually very similar because these species are relatively close evolutionarily," Sousa says.

Sousa and his collaborators found five cell types in the prefrontal cortex that were not present in all four of the species. They also found differences in the abundances of certain cell types, as well as diversity among similar cell populations across species. When comparing a chimpanzee to a human, the differences seem huge -- from their physical appearances down to the capabilities of their brains. But at the cellular and genetic level, at least in the prefrontal cortex, the similarities are many and the dissimilarities few.

"Our lab really wants to know what is unique about the human brain. Obviously from this study and our previous work, most of it is actually the same, at least among primates," Sousa says.

The slight differences the researchers found may be the beginning of determining some of those unique factors, and that information could lead to revelations about development and developmental disorders at a molecular level.

"We want to know what happened after the evolutionary split between humans and other primates," Sousa says. "The idea is you have a mutation in a gene or in several genes and those genes now have slightly different functions. But if these genes are relevant for brain development, for example, how many of a certain cell is produced, or how cells are connecting to other cells, how is it affecting the neuronal circuitry and their physiological properties? We want to understand how these differences lead to differences in the brain and then lead to differences we can observe in adults."

The study's observations were made in the brains of adults, after much of their development was complete, which suggests that the differences may first emerge while the brain is still developing. So, the researchers' next step is to study samples from developing brains and to extend their area of investigation beyond the prefrontal cortex, to potentially find where and when these differences originate. The hope is that this information will provide a more robust foundation for research into developmental disorders.

"We are able to do extraordinary things, right? We are studying life itself, the universe, and so much more. And this is really unique when you look around," says Sousa, whose team included graduate students Ryan Risgaards and Zachary Gomez-Sanchez, research intern Danielle Schmidt, and undergraduate students Ashwin Debnath and Cade Hottman. "If we have these unique abilities, it has to be something in the brain, right? There is something in the brain that allows us to do all of that and we are really interested in knowing what it is."


Honey bee life spans are 50 percent shorter today than they were 50 years ago

Honey bees

A new study by University of Maryland entomologists shows that the lifespan of individual honey bees kept in a controlled, laboratory environment is 50% shorter than it was in the 1970s. When scientists modeled the effect of today's shorter lifespans, the results corresponded with the increased colony loss and reduced honey production trends seen by U.S. beekeepers in recent decades.

Colony turnover is an accepted factor in the beekeeping business, as bee colonies naturally age and die off. But over the past decade, U.S. beekeepers have reported high loss rates, which has meant having to replace more colonies to keep operations viable. In an effort to understand why, researchers have focused on environmental stressors, diseases, parasites, pesticide exposure and nutrition.

This is the first study to show an overall decline in honey bee lifespan potentially independent of environmental stressors, hinting that genetics may be influencing the broader trends seen in the beekeeping industry. The study was published November 14, 2022, in the journal Scientific Reports.

"We're isolating bees from the colony life just before they emerge as adults, so whatever is reducing their lifespan is happening before that point," said Anthony Nearman, a Ph.D. student in the Department of Entomology and lead author of the study. "This introduces the idea of a genetic component. If this hypothesis is right, it also points to a possible solution. If we can isolate some genetic factors, then maybe we can breed for longer-lived honey bees."

Nearman first noticed the decline in lifespan while conducting a study with entomology associate professor Dennis vanEngelsdorp on standardized protocols for rearing adult bees in the laboratory. Replicating earlier studies, the researchers collected bee pupae from honey bee hives when the pupae were within 24 hours of emerging from the wax cells they are reared in. The collected bees finished growing in an incubator and were then kept as adults in special cages.

Nearman was evaluating the effect of supplementing the caged bees' sugar water diet with plain water to better mimic natural conditions when he noticed that, regardless of diet, the median lifespan of his caged bees was half that of caged bees in similar experiments in the 1970s (17.7 days today versus 34.3 days then). This prompted a deeper review of published laboratory studies over the past 50 years.
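The headline figure follows directly from those two medians:

```python
# "50% shorter" from the two median lifespans quoted above:
median_1970s = 34.3   # days, caged bees in 1970s studies
median_today = 17.7   # days, caged bees in this study

print(f"reduction: {1 - median_today / median_1970s:.0%}")  # ~48%, about half
```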

"When I plotted the lifespans over time, I realized, wow, there's actually this huge time effect going on," Nearman said. "Standardized protocols for rearing honey bees in the lab weren't really formalized until the 2000s, so you would think that lifespans would be longer or unchanged, because we're getting better at this, right? Instead, we saw a doubling of mortality rate."

Although a laboratory environment is very different from a colony, historical records of lab-kept bees suggest a similar lifespan to colony bees, and scientists generally assume that isolated factors that reduce lifespan in one environment will also reduce it in another. Previous studies had also shown that in the real world, shorter honey bee lifespans corresponded to less foraging time and lower honey production. This is the first study to connect those factors to colony turnover rates.

When the team modeled the effect of a 50% reduction in lifespan on a beekeeping operation, where lost colonies are replaced annually, the resulting loss rates were around 33%. This is very similar to the average overwinter and annual loss rates of 30% and 40% reported by beekeepers over the past 14 years.

Nearman and vanEngelsdorp noted that their lab-kept bees could be experiencing some sort of low-level viral contamination or pesticide exposure during their larval stage, when they're brooding in the hive and worker bees are feeding them. But the bees have not shown overt symptoms of those exposures and a genetic component to longevity has been shown in other insects such as fruit flies.

The next steps for the researchers will be to compare trends in honey bee lifespans across the U.S. and in other countries. If they find differences in longevity, they can isolate and compare potential contributing factors such as genetics, pesticide use and presence of viruses in the local bee stocks.


Oldest evidence of the controlled use of fire to cook food, researchers report

Wood fire burning

A remarkable scientific discovery has been made by researchers from the Hebrew University of Jerusalem (HU), Tel Aviv University (TAU), and Bar-Ilan University (BIU), in collaboration with the Steinhardt Museum of Natural History, Oranim Academic College, the Israel Oceanographic and Limnological Research (IOLR) institution, the Natural History Museum in London, and the Johannes Gutenberg University in Mainz. A close analysis of the remains of a carp-like fish found at the Gesher Benot Ya'aqov (GBY) archaeological site in Israel shows that the fish were cooked roughly 780,000 years ago. Cooking is defined as the ability to process food by controlling the temperature at which it is heated and includes a wide range of methods. Until now, the earliest evidence of cooking dated to approximately 170,000 years ago. The question of when early man began using fire to cook food has been the subject of much scientific discussion for over a century. These findings, which shed new light on the matter, were published in Nature Ecology and Evolution.

The study was led by a team of researchers: Dr. Irit Zohar, a researcher at TAU's Steinhardt Museum of Natural History and curator of the Beit Margolin Biological Collections at Oranim Academic College, and HU Professor Naama Goren-Inbar, director of the excavation site. The research team also included Dr. Marion Prevost at HU's Institute of Archaeology; Prof. Nira Alperson-Afil at BIU's Department for Israel Studies and Archaeology; Dr. Jens Najorka of the Natural History Museum in London; Dr. Guy Sisma-Ventura of the Israel Oceanographic and Limnological Research Institute; Prof. Thomas Tütken of the Johannes Gutenberg University in Mainz and Prof. Israel Hershkovitz at TAU's Faculty of Medicine.

Dr. Zohar and Dr. Prevost: "This study demonstrates the huge importance of fish in the life of prehistoric humans, for their diet and economic stability. Further, by studying the fish remains found at Gesher Benot Ya'aqov we were able to reconstruct, for the first time, the fish population of the ancient Hula Lake and to show that the lake held fish species that became extinct over time. These species included giant barbs (carp-like fish) that reached up to 2 meters in length. The large quantity of fish remains found at the site proves their frequent consumption by early humans, who developed special cooking techniques. These new findings demonstrate not only the importance of freshwater habitats and the fish they contained for the sustenance of prehistoric man, but also illustrate prehistoric humans' ability to control fire in order to cook food, and their understanding of the benefits of cooking fish before eating it."

In the study, the researchers focused on pharyngeal teeth (used to grind up hard food such as shells) belonging to fish from the carp family. These teeth were found in large quantities at different archaeological strata at the site. By studying the structure of the crystals that form the tooth enamel (whose size increases through exposure to heat), the researchers were able to prove that the fish caught at the ancient Hula Lake, adjacent to the site, were exposed to temperatures suitable for cooking, and were not simply burned by a spontaneous fire.

Until now, evidence of the use of fire for cooking had been limited to sites that came into use much later than the GBY site -- by some 600,000 years -- most of which are associated with the emergence of our own species, Homo sapiens.

Prof. Goren-Inbar added: "The fact that the cooking of fish is evident over such a long and unbroken period of settlement at the site indicates a continuous tradition of cooking food. This is another in a series of discoveries relating to the high cognitive capabilities of the Acheulian hunter-gatherers who were active in the ancient Hula Valley region. These groups were deeply familiar with their environment and the various resources it offered them. Further, it shows they had extensive knowledge of the life cycles of different plant and animal species. Gaining the skill required to cook food marks a significant evolutionary advance, as it provided an additional means for making optimal use of available food resources. It is even possible that cooking was not limited to fish, but also included various types of animals and plants."

Prof. Hershkovitz and Dr. Zohar note that the transition from eating raw food to eating cooked food had dramatic implications for human development and behavior. Eating cooked food reduces the bodily energy required to break down and digest food, allowing other physical systems to develop. It also leads to changes in the structure of the human jaw and skull. This change freed humans from the daily, intensive work of searching for and digesting raw food, providing them free time in which to develop new social and behavioral systems. Some scientists view eating fish as a milestone in the quantum leap in human cognitive evolution, providing a central catalyst for the development of the human brain. They claim that eating fish is what made us human. Even today, it is widely known that the contents of fish flesh, such as omega-3 fatty acids, zinc, iodine and more, contribute greatly to brain development.

The research team believe that the location of freshwater areas, some of them in regions that have long since dried up and become arid deserts, determined the route of the migration of early man from Africa to the Levant and beyond. Not only did these habitats provide drinking water and attract animals to the area, but catching fish in shallow water is also a relatively simple and safe task with a very high nutritional reward.

The team posits that exploiting fish in freshwater habitats was the first step on prehistoric humans' route out of Africa. Early man began to eat fish around 2 million years ago but cooking fish -- as found in this study -- represented a real revolution in the Acheulian diet and is an important foundation for understanding the relationship between man, the environment, climate, and migration when attempting to reconstruct the history of early humans.

It should be noted that evidence of the use of fire at the site -- the oldest such evidence in Eurasia -- was identified first by BIU's Prof. Nira Alperson-Afil. "The use of fire is a behavior that characterizes the entire continuum of settlement at the site," she explained. "This affected the spatial organization of the site and the activity conducted there, which revolved around fireplaces." Alperson-Afil's research of fire at the site was revolutionary for its time and showed that the use of fire began hundreds of thousands of years before previously thought.

HU's Goren-Inbar added that the archaeological site of GBY documents a continuum of repeated settlement by groups of hunter-gatherers on the shores of the ancient Hula Lake, lasting tens of thousands of years. "These groups made use of the rich array of resources provided by the ancient Hula Valley and left behind a long settlement continuum with over 20 settlement strata," Goren-Inbar explained. The excavations at the site have uncovered the material culture of these ancient hominins, including flint, basalt, and limestone tools, as well as their food sources, which were characterized by a rich diversity of plant species from the lake and its shores (including fruit, nuts, and seeds) and by many species of land mammals, both medium-sized and large.

Dr. Jens Najorka of the Natural History Museum in London explained: "In this study, we used geochemical methods to identify changes in the size of the tooth enamel crystals, as a result of exposure to different cooking temperatures. When they are burnt by fire, it is easy to identify the dramatic change in the size of the enamel crystals, but it is more difficult to identify the changes caused by cooking at temperatures between 200 and 500 degrees Celsius. The experiments I conducted with Dr. Zohar allowed us to identify the changes caused by cooking at low temperatures. We do not know exactly how the fish were cooked but given the lack of evidence of exposure to high temperatures, it is clear that they were not cooked directly in fire, and were not thrown into a fire as waste or as material for burning."


 

Wednesday 16 November 2022

New study reveals that exposure to outdoor artificial light at night is associated with an increased risk of diabetes

 A new study published in Diabetologia (the journal of the European Association for the Study of Diabetes [EASD]) finds that outdoor artificial light at night (LAN) is associated with impaired blood glucose control and an increased risk of diabetes, with more than 9 million cases of the disease in Chinese adults being attributed to LAN exposure. The study is by Dr Yu Xu and colleagues at the Shanghai Institute of Endocrine and Metabolic Diseases, Ruijin Hospital, Shanghai Jiaotong University School of Medicine, Shanghai, China.

Exposure to artificial LAN is a ubiquitous environmental risk factor in modern societies. The intensity of urban light pollution has increased to the point that it affects not only residents of big cities, but also those in distant areas such as suburbs and forest parks that may be hundreds of kilometres from the light source. The authors note: "Despite over 80% of the world's population being exposed to light pollution at night, this problem has gained limited attention from scientists until recent years."

Earth's 24-hour day-night cycle has resulted in most organisms, including mammals, having an inbuilt circadian (roughly 24-hour) timing system which is adapted to the natural sequence of light and dark periods. Light pollution has been found to alter the circadian rhythm of insects, birds and other animals, resulting in premature death and loss of biodiversity.

Artificial LAN has also been implicated as a potential cause of metabolic dysregulation through altering the timing of food intake. Rats exposed to artificial LAN developed glucose intolerance, exhibiting elevated blood sugar and insulin. Another study found that mice exposed to nocturnal dim white light of minimal brightness for 4 weeks had increased body mass and reduced glucose tolerance compared to animals whose environment was completely dark at night, despite having roughly equivalent energy consumption and expenditure.

Associations have also been found between artificial LAN and health problems in humans. A study of night-shift workers found that those exposed to brighter LAN were more likely to have disrupted circadian rhythms, as well as a greater risk of coronary heart disease. Other research found that higher LAN exposure was associated with a 13% and 22% increase in the likelihood of being overweight and obese, respectively, while exposure to LAN in the bedroom was reported to be positively associated with the development of diabetes in elderly people.

The potential impact of outdoor artificial LAN was revealed by a study in South India which used satellite images to map light pollution and compared this with data on general health markers among adults across the region. With increasing LAN intensity, there were corresponding rises in average body mass index (BMI), systolic blood pressure and 'bad' (LDL) cholesterol levels in the exposed population.

Diabetes is a critical public health problem in China, and the onset and progression of the disease is largely governed by behavioural and environmental risk factors. The nation's rapid urbanisation and economic growth has resulted in a dramatic increase in urban lighting, and the number of people exposed to it. Those living in cities are prone to being shifted away from a natural 24-hour day-night cycle, to one of round-the-clock working and leisure time, often staying out late and being exposed to artificial LAN.

The study used data from the China Noncommunicable Disease Surveillance Study, a representative sample of the general population in China taken in 2010 at 162 sites across the country. A total of 98,658 adults participated, undergoing interviews to collect demographic, medical, household income, lifestyle, education and family history information. The mean age of participants was 42.7 years, and approximately half were women.

Body weight and height of participants were measured to calculate BMI, and blood samples were taken to obtain levels of both fasting and postprandial (after-meal) serum glucose, as well as glycated haemoglobin (HbA1c). HbA1c is haemoglobin with glucose bound to it, and it acts as a moving average of blood sugar over the previous 8 to 12 weeks.

Participants at each study site were assigned an average artificial outdoor LAN exposure level for that location using night-time low-light image data of the Earth's surface from the US Defense Meteorological Satellite Program (DMSP). Exposure levels were ordered from lowest to highest and grouped into quintiles (five groups of 20% each), with the median light intensity in the highest quintile being 69 times greater than in the lowest.
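As a rough illustration of that binning step, here is how site-level exposures could be split into quintiles; the data and column names are made up to keep the sketch self-contained:

```python
# Hypothetical sketch of the site-level quintile binning described above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
sites = pd.DataFrame({
    "site_id": np.arange(162),   # one row per study site
    "lan_intensity": rng.lognormal(mean=0.0, sigma=1.5, size=162),
})

# Split sites into five equal-sized groups by night-light intensity.
sites["quintile"] = pd.qcut(sites["lan_intensity"], q=5,
                            labels=["Q1", "Q2", "Q3", "Q4", "Q5"])

medians = sites.groupby("quintile", observed=True)["lan_intensity"].median()
print(medians["Q5"] / medians["Q1"])   # the study reports a ~69x ratio
```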

The intensity of outdoor LAN varied substantially across China, with most areas being exposed to low intensity light, while higher intensities converged on the Eastern coastal cities. Participants living in areas in the higher quintiles of outdoor LAN were more likely to be older, have a higher BMI and household income, and live in an urban area. In contrast, those in the lower quintile areas reported higher levels of physical activity but fewer years of education.

The study found that the highest quintile of LAN exposure was associated with a 28% relative increase in the prevalence of diabetes compared with the lowest quintile areas. Chronic exposure to residential outdoor LAN was positively associated with blood glucose levels, insulin resistance and diabetes prevalence, and inversely associated with beta cell function, even after adjusting for many important diabetes risk factors. On average, for every 42 people living in regions in the highest quintile of LAN exposure, there is one more case of diabetes that would not have occurred if those individuals had been living in areas in the lowest quintile. While the association between LAN exposure and diabetes may not be as strong as with better-known risk factors, the ubiquity of outdoor artificial light means that the scale of population exposure is vast.
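One way to see how the 28% relative increase and the one-in-42 figure fit together is with simple unadjusted algebra; the study's adjusted models are more involved, so treat this purely as an illustration:

```python
# If the top quintile has 1.28x the baseline prevalence p0, and that excess
# amounts to one extra case per 42 people, then 0.28 * p0 = 1/42.
p0 = 1 / (42 * 0.28)
print(f"implied baseline prevalence: {p0:.1%}")             # ~8.5%
print(f"implied top-quintile prevalence: {1.28 * p0:.1%}")  # ~10.9%
```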

The researchers estimated that more than 9 million cases of diabetes in Chinese adults aged 18 years and over could be attributed to outdoor LAN exposure, a figure expected to increase with accelerating urbanisation and the growing number of people migrating from China's countryside to its cities. The global nature and scale of this problem is illustrated by the fact that an estimated 83% of the world's population, and more than 99% of those in the US and Europe, live under light-polluted skies.

These findings contribute to a growing body of evidence suggesting that LAN is detrimental to health and demonstrate that it may be a potential novel risk factor for diabetes. The authors conclude that "further studies involving the direct measurement of individual exposure to LAN are needed to confirm whether its relationship with diabetes is a causal one."

Ancient disease has potential to regenerate livers

Leprosy is one of the world's oldest and most persistent diseases, but the bacteria that cause it may also have the surprising ability to grow and regenerate a vital organ.

Scientists have discovered that parasites associated with leprosy can reprogramme cells to increase the size of a liver in adult animals without causing damage, scarring or tumours.

The findings suggest the possibility of adapting this natural process to renew ageing livers and increase healthspan -- the length of time living disease-free -- in humans.

Experts say it could also help regrow damaged livers, thereby reducing the need for transplantation, which is currently the only curative option for people with end-stage scarred livers.

Previous studies promoted the regrowth of mouse livers by generating stem cells and progenitor cells -- the step after a stem cell that can become any type of cell for a specific organ -- via an invasive technique that often resulted in scarring and tumour growth.

To overcome these harmful side-effects, Edinburgh researchers built on their previous discovery of the partial cellular reprogramming ability of the leprosy-causing bacteria, Mycobacterium leprae.

Working with the US Department of Health and Human Services in Baton Rouge, Louisiana, the team infected 57 armadillos -- a natural host of leprosy bacteria -- with the parasite and compared their livers with those of uninfected armadillos and those that were found to be resistant to infection.

They found that the infected animals developed enlarged -- yet healthy and unharmed -- livers with the same vital components, such as blood vessels, bile ducts and functional units known as lobules, as the uninfected and resistant armadillos.

The team believe the bacteria 'hijacked' the inherent regenerative ability of the liver to increase the organ's size, thereby providing themselves with more cells in which to multiply.

They also discovered several indicators that the main kind of liver cell -- known as hepatocytes -- had reached a "rejuvenated" state in the infected armadillos.

Livers of the infected armadillos also contained gene expression patterns -- the blueprint for building a cell -- similar to those in younger animals and human fetal livers.

Genes related to metabolism, growth and cell proliferation were activated and those linked with aging were downregulated, or suppressed.

Scientists think this is because the bacteria reprogrammed the liver cells, returning them to the earlier stage of progenitor cells, which in turn became new hepatocytes and grew new liver tissue.

The team are hopeful that the discovery has the potential to help develop interventions for aging and damaged livers in humans. Liver diseases currently result in two million deaths a year worldwide.

The findings have been published in the journal Cell Reports Medicine. This work has been funded by the UK's Medical Research Council and the US National Institutes of Health and National Institute of Allergy and Infectious Diseases.

Sunday 13 November 2022

Clonazepam uses


Clonazepam is used to control seizures or fits due to epilepsy, involuntary muscle spasms, panic disorder and sometimes restless legs syndrome.

Clonazepam is available on prescription only. It comes as tablets and as a liquid that you swallow.

Clonazepam works by increasing levels of a calming chemical in your brain. This can relieve anxiety, stop seizures and fits or relax tense muscles.
The most common side effect is feeling sleepy (drowsy) during the daytime.
Clonazepam is not likely to be addictive if you take it for a short time (2 to 4 weeks).
If you take clonazepam for more than 2 to 4 weeks, your dose will need to be reduced gradually before you stop taking it.
Do not drink alcohol while taking clonazepam. There's a risk you can sleep very deeply and you may have trouble waking up.
Who can and cannot take clonazepam
Clonazepam tablets and liquid can be taken by adults aged 18 years and over.

It can also be taken by children from 1 month old for epilepsy.

It's not suitable for everyone.

To make sure it's safe for you, tell your doctor before starting clonazepam if you:

have had an allergic reaction to clonazepam or any other medicine in the past
have myasthenia gravis, a condition that causes muscle weakness
have sleep apnoea, a condition that causes breathing problems when you're asleep
have lung, liver or kidney problems
have spinal or cerebellar ataxia (where you may become shaky and unsteady and have slurred speech)
have (or have had) problems with alcohol or drugs
have recently had a loss or bereavement, depression or thoughts of harming yourself or suicide
have been diagnosed with a personality disorder
are trying to get pregnant, are already pregnant or breastfeeding
are going to have a general anaesthetic for an operation or dental treatment
How and when to take it
It's important to take clonazepam exactly as your doctor tells you to.

You'll usually start on a low dose and gradually increase it over 2 to 4 weeks until your doctor thinks the dose is right for you.

Your doctor will tell you if you need to take clonazepam in 1 dose or split your dose so you take it up to 3 times each day. Ask a doctor or pharmacist if you're not sure how to take it.

The usual dose for:

epilepsy in adults – the starting dose is 1mg taken at night (increasing to 4mg to 8mg over 2 to 4 weeks)
epilepsy in children – the dose varies depending on their age. It will be increased gradually over 2 to 4 weeks
involuntary muscle spasms (adults) – the starting dose is 1mg taken at night (increasing to 4mg to 8mg over 2 to 4 weeks)
panic disorder – 1mg to 2mg each day
restless legs syndrome – 500 micrograms to 2mg each day
If you're older than 65 or have kidney, liver or severe breathing problems, your doctor may recommend a lower dose.

Take clonazepam tablets with a drink of water. You can take the tablets or liquid with or without food.

What if I forget to take it?
If you forget to take your clonazepam, take it as soon as you remember, unless it's nearly time for your next dose.

In this case, just leave out the missed dose and take your next dose as usual.

Never take 2 doses at the same time. Never take an extra dose to make up for a forgotten one.

If you forget doses often, it may help to set an alarm to remind you.

You could also ask a pharmacist for advice on other ways to remember your medicines.

What if I take too much?
The amount of clonazepam that can lead to an overdose varies from person to person.

If you take too much clonazepam, you may get symptoms including:

poor coordination or trouble speaking
feeling sleepy
a slow or irregular heartbeat
uncontrolled eye movements
muscle weakness
feeling overexcited

Watch: https://youtu.be/cclJ8fLxgGw

Earth-sun distance dramatically alters seasons in the equatorial Pacific in a 22,000-year cycle

Weather and climate modelers understand pretty well how seasonal winds and ocean currents affect El Niño patterns in the eastern equatorial Pacific Ocean, impacting weather across the United States and sometimes worldwide.

But new computer simulations show that one driver of annual weather cycles in that region -- in particular, a cold tongue of surface waters stretching westward along the equator from the coast of South America -- has gone unrecognized: the changing distance between Earth and the sun.

The cold tongue, in turn, influences the El Niño-Southern Oscillation (ENSO), which impacts weather in California, much of North America, and often globally.

The Earth-sun distance slowly varies over the course of the year because Earth's orbit is slightly elliptical. Currently, at its closest approach -- perihelion -- Earth is about 3 million miles closer to the sun than at its farthest point, or aphelion. As a result, sunlight is about 7% more intense at perihelion than at aphelion.
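That 7% figure is a direct consequence of the inverse-square law; a one-line check, using Earth's orbital eccentricity of about 0.0167:

```python
# Sunlight intensity falls off with the square of distance. Perihelion and
# aphelion sit at roughly (1 - e) and (1 + e) astronomical units.
e = 0.0167   # Earth's orbital eccentricity
ratio = ((1 + e) / (1 - e)) ** 2
print(f"perihelion sunlight is {ratio - 1:.1%} more intense than at aphelion")
```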

Research led by the University of California, Berkeley, demonstrates that the slight yearly change in our distance from the sun can have a large effect on the annual cycle of the cold tongue. This is distinct from the effect of Earth's axial tilt on the seasons, which is currently understood to cause the annual cycle of the cold tongue.

Because the periods of the annual cycles arising from the tilt and distance effects are slightly different, their combined effects vary over time, said lead researcher John Chiang, UC Berkeley professor of geography.

"The curious thing is that the annual cycle from the distance effect is slightly longer than that for tilt -- around 25 minutes, currently -- so over a span of about 11,000 years, the two annual cycles go from being in phase to out of phase, and the net seasonality undergoes a remarkable change, as a result," Chiang said.

Chiang noted that the distance effect is already incorporated into climate models -- though its effect on the equatorial Pacific was not recognized until now -- and his findings will not alter weather predictions or climate projections. But the 22,000-year phase cycle may have had long-term, historical effects. Earth's orbital precession is known to have affected the timing of the ice ages, for example.

The distance effect -- and its 22,000-year variation -- also may affect other weather systems on Earth. The ENSO, which also originates in the equatorial Pacific, is likely affected because its workings are closely tied to the seasonal cycle of the cold tongue.

"Theory tells us that the seasonal cycle of the cold tongue plays a key role in the development and termination of ENSO events," said Alyssa Atwood, a former UC Berkeley postdoctoral fellow who is now an assistant professor at Florida State University in Tallahassee. "Because of this, many of ENSO's key characteristics are synced to the seasonal cycle."

For example, ENSO events tend to peak during Northern Hemisphere winters, she said, and they don't typically persist beyond northern or boreal spring months, which scientists refer to as the "spring predictability barrier." Because of these linkages, it is reasonable to expect that the distance effect could also have a major impact on ENSO -- something that should be examined in future studies.

"Very little attention has been paid to the cold tongue seasonal cycle because most people think it's solved. There's nothing interesting there," Chiang said. "What this research shows is that it's not solved. There's still a mystery there. Our result also begs the question whether other regions on Earth may also have a significant distance effect contribution to their seasonal cycle."

"We learn in science classes as early as grade school that the seasons are caused by the tilt of Earth's axis," added co-author Anthony Broccoli of Rutgers University. "This is certainly true and has been well understood for centuries. Although the effect of the Earth-sun distance has also been recognized, our study indicates that this 'distance effect' may be a more important effect on climate than had been recognized previously."

Chiang, Atwood and Broccoli and their colleagues reported their findings today in the journal Nature.

Two distinct yearly cycles affect Pacific cold tongue

The main driver of global weather changes is seasonal change. Earth's equator is tilted relative to its orbit around the sun, so the Northern and Southern hemispheres are illuminated differently. When the sun shines directly overhead in the north, it's warmer in the north and colder in the south, and vice versa.

These yearly changes have major effects on the Pacific equatorial trade winds, which blow from southeast to northwest across the south and equatorial Pacific and push surface waters westward, causing upwelling of cold water along the equator that creates a tongue of cold surface water that stretches from Ecuador across the Pacific -- almost one-quarter the circumference of the planet.

The yearly hemispheric changes in seasonal temperature alter the strength of the trades, and thus cause a yearly cycle in the temperature of the cold tongue. This, in turn, has a major influence on ENSO, which typically peaks during Northern Hemisphere winter.

The occurrence of El Niño -- or its opposite, La Niña -- helps determine not only whether California and the West Coast will have a wet or dry winter, but also whether the Midwest and parts of Asia will have rain or drought.

"In studying past climates, much effort has been dedicated to trying to understand if variability in the tropical Pacific Ocean -- that is, the El Niño/La Niña cycle -- has changed in the past," Broccoli said. "We chose to focus instead on the yearly cycle of ocean temperatures in the eastern Pacific cold tongue. Our study found that the timing of perihelion -- that is, the point at which the earth is closest to the sun -- has an important influence on climate in the tropical Pacific."

In 2015, Broccoli, co-director of the Rutgers Climate Institute, along with his then-graduate student Michael Erb, employed a computer climate model to show that the distance changes caused by Earth's elliptical orbit dramatically altered the cold tongue yearly cycle. But climate modelers mostly ignored the result, Chiang said.

"Our field is focused on El Niño, and we thought that the seasonal cycle was solved. But then we realized that the result by Erb and Broccoli challenged this assumption," he said.

Chiang and his colleagues, including Broccoli and Atwood, examined similar simulations using four different climate models and confirmed the result. But the team went further to show how the distance effect works.

Earth's 'marine' and 'continental' hemispheres

The key distinction is that changes in the sun's distance from Earth don't affect the Northern and Southern hemispheres differently, which is what gives rise to the seasonal effect of Earth's axial tilt. Instead, they warm the eastern 'continental hemisphere' -- dominated by the North and South American, African and Eurasian landmasses -- more than the western hemisphere, which Chiang calls the marine hemisphere because it is dominated by the Pacific Ocean.

"The traditional way of thinking about monsoons is that the Northern Hemisphere warms up relative to the Southern Hemisphere, generating winds onto land that bring monsoon rains," Chiang said. "But here, we're actually talking about east-west, not north-south, temperature differences that cause the winds. The distance effect is operating through the same mechanism as the seasonal monsoon rains, but the wind changes are coming from this east-west monsoon."

The winds generated by this differential heating of the marine and continental hemispheres alter the yearly variation of the easterly trades in the western equatorial Pacific, and thereby the cold tongue.

"When Earth is closest to the sun, these winds are strong. In the offseason, when the sun is at its furthest, these winds become weak," Chiang said. "Those wind changes are then propagated to the Eastern Pacific through the thermocline, and basically it drives an annual cycle of the cold tongue, as a result."

Today, Chiang said, the distance effect on the cold tongue is about one-third the strength of the tilt effect, and they enhance one another, leading to a strong annual cycle of the cold tongue. About 6,000 years ago, they canceled one another, yielding a muted annual cycle of the cold tongue. In the past, when Earth's orbit was more elliptical, the distance effect on the cold tongue would have been larger and could have led to a more complete cancellation when out of phase.
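Treating the two effects as sinusoids with a one-to-three amplitude ratio reproduces the enhanced and muted cases schematically; this is a toy model, not the paper's simulation:

```python
# Toy superposition of the tilt- and distance-driven annual cycles.
import numpy as np

t = np.linspace(0.0, 1.0, 365)        # one year
tilt = np.cos(2 * np.pi * t)          # tilt effect, amplitude 1
dist = np.cos(2 * np.pi * t) / 3      # distance effect, one-third as strong

print((tilt + dist).max())   # in phase (today): amplitude ~1.33, enhanced
print((tilt - dist).max())   # out of phase (~6,000 years ago): ~0.67, muted
```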

Though Chiang and his colleagues did not examine the effect of such a cancellation, this would potentially have had a worldwide effect on weather patterns.

Chiang emphasized that the distance effect on climate, while clear in climate model simulations, would not be evident from observations because it cannot be readily distinguished from the tilt effect.

"This study is purely model based. So, it is a prediction," he said. "But this behavior is reproduced by a number of different models, at least four. And what we did in this study is to explain why this happens. And in the process, we've discovered another annual cycle of the cold tongue that's driven by Earth's eccentricity."

Atwood noted that, unlike the robust changes to the cold tongue seasonal cycle, changes to ENSO tend to be model-dependent.

"While ENSO remains a challenge for climate models, we can look beyond climate model simulations to the paleoclimate record to investigate the connection between changes in the annual cycle of the cold tongue and ENSO in the past," she said. "To date, paleoclimate records from the tropical Pacific have largely been interpreted in terms of past changes in ENSO, but our study underscores the need to separate changes in the cold tongue annual cycle from changes in ENSO."

No sign of decrease in global CO2 emissions

 Global carbon emissions in 2022 remain at record levels -- with no sign of the decrease that is urgently needed to limit warming to 1.5°C, according to the Global Carbon Project science team.

If current emissions levels persist, there is now a 50% chance that global warming of 1.5°C will be exceeded in nine years.

The new report projects total global CO2 emissions of 40.6 billion tonnes (GtCO2) in 2022. This is fuelled by fossil CO2 emissions which are projected to rise 1.0% compared to 2021, reaching 36.6 GtCO2 -- slightly above the 2019 pre-COVID-19 levels[1]. Emissions from land-use change (such as deforestation) are projected to be 3.9 GtCO2 in 2022.

Projected emissions from coal and oil are above their 2021 levels, with oil being the largest contributor to total emissions growth. The growth in oil emissions can be largely explained by the delayed rebound of international aviation following COVID-19 pandemic restrictions.

The 2022 picture among major emitters is mixed: emissions are projected to fall in China (0.9%) and the EU (0.8%), and increase in the USA (1.5%) and India (6%), with a 1.7% rise in the rest of the world combined.

The remaining carbon budget for a 50% likelihood to limit global warming to 1.5°C has reduced to 380 GtCO2 (exceeded after nine years if emissions remain at 2022 levels) and 1230 GtCO2 to limit to 2°C (30 years at 2022 emissions levels).

To reach zero CO2 emissions by 2050 would now require a decrease of about 1.4 GtCO2 each year, comparable to the observed fall in 2020 emissions resulting from COVID-19 lockdowns, highlighting the scale of the action required.
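Both the budget horizons and the required rate of decline reduce to simple division over the report's numbers:

```python
# The report's headline figures as simple division.
emissions_2022 = 40.6          # GtCO2 per year, projected 2022 total

print(380 / emissions_2022)    # 1.5 C budget: ~9.4 years at current rates
print(1230 / emissions_2022)   # 2 C budget: ~30 years

# A linear path from 40.6 GtCO2 in 2022 to zero in 2050:
print(emissions_2022 / (2050 - 2022))   # ~1.45 GtCO2 less each year
```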

Land and ocean, which absorb and store carbon, continue to take up around half of the CO2 emissions. The ocean and land CO2 sinks are still increasing in response to the atmospheric CO2 increase, although climate change reduced this growth by an estimated 4% (ocean sink) and 17% (land sink) over the 2012-2021 decade.

This year's carbon budget shows that the long-term rate of increasing fossil emissions has slowed. The average rise peaked at +3% per year during the 2000s, while growth in the last decade has been about +0.5% per year.

The research team -- including the University of Exeter, the University of East Anglia (UEA), CICERO and Ludwig-Maximilian-University Munich -- welcomed this slow-down, but said it was "far from the emissions decrease we need."

The findings come as world leaders meet at COP27 in Egypt to discuss the climate crisis.

"This year we see yet another rise in global fossil CO2 emissions, when we need a rapid decline," said Professor Pierre Friedlingstein, of Exeter's Global Systems Institute, who led the study.

"There are some positive signs, but leaders meeting at COP27 will have to take meaningful action if we are to have any chance of limiting global warming close to 1.5°C. The Global Carbon Budget numbers monitor the progress on climate action and right now we are not seeing the action required."

Professor Corinne Le Quéré, Royal Society Research Professor at UEA's School of Environmental Sciences, said: "Our findings reveal turbulence in emissions patterns this year resulting from the pandemic and global energy crises.

"If governments respond by turbo charging clean energy investments and planting, not cutting, trees, global emissions could rapidly start to fall.

"We are at a turning point and must not allow world events to distract us from the urgent and sustained need to cut our emissions to stabilise the global climate and reduce cascading risks."

Land-use changes, especially deforestation, are a significant source of CO2 emissions (about a tenth of the amount from fossil emissions). Indonesia, Brazil and the Democratic Republic of the Congo contribute 58% of global land-use change emissions.

Carbon removal via reforestation or new forests counterbalances about half of the deforestation emissions, and the researchers say that stopping deforestation and increasing efforts to restore and expand forests constitute a large opportunity to reduce emissions and increase removals in forests.

Saturday 12 November 2022

Previously unknown monumental temple discovered near the Tempio Grande in Vulci

An interdisciplinary team headed by archeologists Dr. Mariachiara Franceschini of the University of Freiburg and Paul P. Pasieka of the University of Mainz has discovered a previously unknown Etruscan temple in the ancient city of Vulci, which lies in the Italian region of Latium. The building, measuring 45 meters by 35 meters, is situated west of the Tempio Grande, a sacred building excavated back in the 1950s. Initial examination of the strata of the foundation at the temple's northeast corner, and of the objects found there, led the researchers to date the construction of the temple to the end of the sixth or beginning of the fifth century BCE.

"The new temple is roughly the same size and on a similar alignment as the neighboring Tempio Grande, and was built at roughly the same Archaic time," explains Franceschini. "This duplication of monumental buildings in an Etruscan city is rare, and indicates an exceptional finding," adds Pasieka. The team discovered the temple when working on the Vulci Cityscape project, which was launched in 2020 and aimed to research the settlement strategies and urbanistic structures of the city of Vulci. Vulci was one of the twelve cities of the Etruscan federation and in pre-Roman times was one of the most important urban centers in what is now Italy.

New discoveries about city design and development

"We studied the entire northern area of Vulci, that's 22.5 hectares, using geophysical prospecting and Ground Penetrating Radar," explains Pasieka. "We discovered remains from the city's origins that had previously been overlooked in Vulci and are now better able to understand the dynamics of settlement and the road system, besides identifying different functional areas in the city." The researchers were able in 2021 to uncover the first sections of wall, made of solid tuff. "Our knowledge about the appearance and organization of Etruscan cities has been limited until now," says Franceschini. "The intact strata of the temple are offering us insights into more than a thousand years of development of one of the most important Etruscan cities."

Over the coming years the scientists want to study the different phases of use and the precise architectural appearance of the temple in more depth, in order to learn more about the religion of the Etruscans, the social structures in Vulci and what the lives of the city's inhabitants were really like.

The excavation is funded by the Fritz Thyssen Foundation and the Gerda Henkel Foundation.

Rats bop to the beat

 Accurately moving to a musical beat was thought to be a skill innately unique to humans. However, new research now shows that rats also have this ability. The optimal tempo for nodding along was found to depend on the time constant in the brain (the speed at which our brains can respond to something), which is similar across all species. This means that the ability of our auditory and motor systems to interact and move to music may be more widespread among species than previously thought. This new discovery offers not only further insight into the animal mind, but also into the origins of our own music and dance.

Can you move to the beat, or do you have two left feet? Apparently, how well we can time our movement to music depends somewhat on our innate genetic ability, and this skill was previously thought to be a uniquely human trait. While animals also react to hearing noise, or might make rhythmic sounds, or be trained to respond to music, this isn't the same as the complex neural and motor processes that work together to enable us to naturally recognize the beat in a song, respond to it or even predict it. This is referred to as beat synchronicity.

Only relatively recently, research studies (and home videos) have shown that some animals seem to share our urge to move to the groove. A new paper by a team at the University of Tokyo provides evidence that rats are one of them. "Rats displayed innate -- that is, without any training or prior exposure to music -- beat synchronization most distinctly within 120-140 bpm (beats per minute), to which humans also exhibit the clearest beat synchronization," explained Associate Professor Hirokazu Takahashi from the Graduate School of Information Science and Technology. "The auditory cortex, the region of our brain that processes sound, was also tuned to 120-140 bpm, which we were able to explain using our mathematical model of brain adaptation."

But why play music to rats in the first place? "Music exerts a strong appeal to the brain and has profound effects on emotion and cognition. To utilize music effectively, we need to reveal the neural mechanism underlying this empirical fact," said Takahashi. "I am also a specialist in electrophysiology, which is concerned with electrical activity in the brain, and have been studying the auditory cortex of rats for many years."

The team had two competing hypotheses: the first was that the optimal music tempo for beat synchronization would be determined by the time constant of the body, which differs between species and is much faster for small animals than for humans (think of how quickly a rat can scuttle). The second was that the optimal tempo would instead be determined by the time constant of the brain, which is surprisingly similar across species. "After conducting our research with 20 human participants and 10 rats, our results suggest that the optimal tempo for beat synchronization depends on the time constant in the brain," said Takahashi. "This demonstrates that the animal brain can be useful in elucidating the perceptual mechanisms of music."

The rats were fitted with wireless miniature accelerometers that could measure the slightest head movements; human participants wore accelerometers on headphones. Both were then played one-minute excerpts from Mozart's Sonata for Two Pianos in D Major, K. 448, at four different tempos: 75%, 100%, 200% and 400% of the original speed. The original tempo is 132 bpm, and the results showed that the rats' beat synchronization was clearest within the 120-140 bpm range. The team also found that both rats and humans jerked their heads to the beat in a similar rhythm, and that the level of head jerking decreased as the music was sped up.
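
To put those conditions in concrete terms, here is a quick back-of-the-envelope calculation in Python (an illustration based on the figures in this article, not the paper's analysis code): only the original playback speed lands in the reported 120-140 bpm optimal band.

    # Tempo conditions from the study, per the article (illustrative sketch).
    ORIGINAL_BPM = 132  # Mozart K. 448 at its original tempo

    for speed in (0.75, 1.0, 2.0, 4.0):   # playback speeds used
        bpm = ORIGINAL_BPM * speed        # resulting tempo
        interbeat_ms = 60_000 / bpm       # time between beats, in milliseconds
        mark = " <- optimal band" if 120 <= bpm <= 140 else ""
        print(f"{speed:.0%} speed -> {bpm:.0f} bpm ({interbeat_ms:.0f} ms/beat){mark}")

Running it shows the interbeat interval shrinking from about 606 ms at 75% speed to about 114 ms at 400%, with only the 132 bpm original falling inside the optimal band.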

"To the best of our knowledge, this is the first report on innate beat synchronization in animals that was not achieved through training or musical exposure," said Takahashi. "We also hypothesized that short-term adaptation in the brain was involved in beat tuning in the auditory cortex. We were able to explain this by fitting our neural activity data to a mathematical model of the adaptation. Furthermore, our adaptation model showed that in response to random click sequences, the highest beat prediction performance occurred when the mean interstimulus interval (the time between the end of one stimulus and the start of another) was around 200 milliseconds (one-thousandth of a second). This matched the statistics of internote intervals in classical music, suggesting that the adaptation property in the brain underlies the perception and creation of music."

The work offers a fascinating insight into the animal mind and the development of our own beat synchronization, and the researchers also see it as a window into the creation of music itself. "Next, I would like to reveal how other musical properties such as melody and harmony relate to the dynamics of the brain. I am also interested in how, why and what mechanisms of the brain create human cultural fields such as fine art, music, science, technology and religion," said Takahashi. "I believe that this question is the key to understanding how the brain works and to developing the next-generation AI (artificial intelligence). Also, as an engineer, I am interested in the use of music for a happy life."


Red-supergiant supernova: Secrets of an earlier Universe

An international research team led by the University of Minnesota Twin Cities has measured the size of a star that existed just 2 billion years after the Big Bang, more than 11 billion years ago. Detailed images show the exploding star cooling and could help scientists learn more about the stars and galaxies of the early Universe.

The paper is published in Nature, the world's leading peer-reviewed, multidisciplinary science journal.

"This is the first detailed look at a supernova at a much earlier epoch of the Universe's evolution," said Patrick Kelly, a lead author of the paper and an associate professor in the University of Minnesota School of Physics and Astronomy. "It's very exciting because we can learn in detail about an individual star when the Universe was less than a fifth of its current age, and begin to understand if the stars that existed many billions of years ago are different from the ones nearby."

The red supergiant in question was about 500 times larger than the Sun, and it is located at redshift three, about 60 times farther away than any other supernova observed in such detail.
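
Those quoted times can be loosely cross-checked against a standard cosmology. The short Python sketch below uses astropy's Planck 2018 parameters purely as an illustration; the paper's own cosmological assumptions may differ.

    # Rough cross-check of the quoted ages using a standard cosmology
    # (illustration only; not the paper's own calculation).
    from astropy.cosmology import Planck18

    z = 3.0                                    # redshift of the lensed supernova
    print(Planck18.lookback_time(z))           # ~11.5 Gyr: "more than 11 billion years ago"
    print(Planck18.age(z))                     # ~2.1 Gyr after the Big Bang
    print(Planck18.age(z) / Planck18.age(0))   # ~0.16: under a fifth of the current age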

Using data from the Hubble Space Telescope and follow-up spectroscopy obtained through the University of Minnesota's access to the Large Binocular Telescope, the researchers identified multiple detailed images of the red supergiant thanks to a phenomenon called gravitational lensing, in which mass, such as that in a galaxy, bends and magnifies the light emitted from the star.

"The gravitational lens acts as a natural magnifying glass and multiplies Hubble's power by a factor of eight," Kelly said. "Here, we see three images. Even though they can be seen at the same time, they show the supernova as it was at different ages separated by several days. We see the supernova rapidly cooling, which allows us to basically reconstruct what happened and study how the supernova cooled in its first few days with just one set of images. It enables us to see a rerun of a supernova."

The researchers combined this discovery with another one of Kelly's supernova discoveries from 2014 to estimate how many stars were exploding when the Universe was a small fraction of its current age. They found that there were likely many more supernovae than previously thought.

"Core-collapse supernovae mark the deaths of massive, short-lived stars. The number of core-collapse supernovae we detect can be used to understand how many massive stars were formed in galaxies when the Universe was much younger," said Wenlei Chen, first author of the paper and a postdoctoral researcher in the University of Minnesota School of Physics and Astronomy.

En route to human-environment interaction technology with soft microfingers

Humans have always been fascinated by scales different from their own, from giant objects such as stars, planets and galaxies to the world of the tiny: insects, bacteria, viruses and other microscopic objects. While the microscope allows us to observe this microscopic world, it is still difficult to interact with it directly.

Human-robot interaction technology might change all that. Microrobots, for instance, can interact with the environment at much smaller scales than we can, and microsensors have been used to measure the forces exerted by insects during activities such as flight or walking. Most studies so far, however, have focused on measuring insect behavior rather than on direct insect-microsensor interaction.

Against this backdrop, researchers from Ritsumeikan University in Japan have now developed a soft micro-robotic finger that enables a more direct interaction with the microworld. The study, led by Professor Satoshi Konishi, was published in Scientific Reports on 10 October 2022. "A tactile microfinger is achieved by using a liquid metal flexible strain sensor. A soft pneumatic balloon actuator acts as an artificial muscle, allowing control and finger-like movement of the sensor. With a robotic glove, a human user can directly control the microfingers. This kind of system allows for a safe interaction with insects and other microscopic objects," explains Prof. Konishi.

Using their newly developed microrobot setup, the research team investigated the reaction force of a pill bug as a representative sample of an insect. The pill bug was fixed in place using a suction tool and the microfinger was used to apply a force and measure the reaction force of the bug's legs.

The reaction force measured from the legs of the pill bug was approximately 10 mN (millinewtons), in agreement with previously estimated values. Though a proof of concept on a single representative insect, this result shows great promise toward realizing direct human interaction with the microworld. It could even find applications in augmented reality (AR) technology: with robotized gloves and micro-sensing tools such as the microfinger, many AR technologies for human-environment interaction at the microscale could be realized.
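
For readers curious how a resistive strain sensor reading becomes a force estimate, the following is a minimal sketch assuming a simple linear gauge model with a one-point calibration; the constants (baseline resistance, gauge factor, calibration stiffness) are hypothetical placeholders, not values reported by the study.

    # Hypothetical conversion from a strain-gauge resistance reading to force.
    # All constants are assumed for illustration, not taken from the paper.
    R0 = 10.0           # unstrained sensor resistance, ohms (assumed)
    GAUGE_FACTOR = 2.0  # (dR/R0) per unit strain, typical for metal gauges (assumed)
    K_CAL = 5.0         # newtons per unit strain, from a hypothetical calibration

    def force_from_resistance(r_measured: float) -> float:
        """Estimate fingertip force in newtons from measured resistance in ohms."""
        strain = (r_measured - R0) / (R0 * GAUGE_FACTOR)
        return K_CAL * strain

    # Example: a 0.04-ohm rise over baseline -> strain of 0.002 -> 10 mN,
    # the same order as the pill bug's measured leg reaction force.
    print(f"{force_from_resistance(10.04) * 1e3:.1f} mN")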

"With our strain-sensing microfinger, we were able to directly measure the pushing motion and force of the legs and torso of a pill bug -- something that has been impossible to achieve previously! We anticipate that our results will lead to further technological development for microfinger-insect interactions, leading to human-environment interactions at much smaller scales," remarks Prof. Konishi.

Death of a star reveals midsize black hole lurking in a dwarf galaxy

An intermediate-mass black hole lurking undetected in a dwarf galaxy revealed itself to astronomers when it gobbled up an unlucky star that strayed too close. The shredding of the star, known as a "tidal disruption event" or TDE, produced a flare of radiation that briefly outshone the combined stellar light of the host dwarf galaxy and could help scientists better understand the relationships between black holes and galaxies.

The flare was captured by astronomers with the Young Supernova Experiment (YSE), a survey designed to detect cosmic explosions and transient astrophysical events. An international team led by scientists at UC Santa Cruz, the Niels Bohr Institute at the University of Copenhagen, and Washington State University reported the discovery in a paper published November 10 in Nature Astronomy.

"This discovery has created widespread excitement because we can use tidal disruption events not only to find more intermediate-mass black holes in quiet dwarf galaxies, but also to measure their masses," said coauthor Ryan Foley, an assistant professor of astronomy and astrophysics at UC Santa Cruz who helped plan the YSE survey.

First author Charlotte Angus at the Niels Bohr Institute said the team's findings provide a baseline for future studies of midsize black holes.

"The fact that we were able to capture this midsize black hole whilst it devoured a star offered us a remarkable opportunity to detect what otherwise would have been hidden from us," Angus said. "What is more, we can use the properties of the flare itself to better understand this elusive group of middle-weight black holes, which could account for the majority of black holes in the centers of galaxies."

Supermassive black holes are found at the centers of all massive galaxies, including our own Milky Way. Astronomers conjecture that these massive beasts, with millions or billions of times the mass of the Sun, could have grown from smaller "intermediate-mass" black holes with thousands to hundreds of thousands of solar masses.

One theory for how such massive black holes were assembled is that the early universe was teeming with small dwarf galaxies hosting intermediate-mass black holes. Over time, these dwarf galaxies would have merged with or been gobbled up by more massive galaxies, their cores combining each time to build up the mass at the center of the growing galaxy. This merger process would eventually create the supermassive black holes seen today.

"If we can understand the population of intermediate-mass black holes out there -- how many there are and where they are located -- we can help determine if our theories of supermassive black hole formation are correct," said coauthor Enrico Ramirez-Ruiz, professor of astronomy and astrophysics at UCSC and Niels Bohr Professor at the University of Copenhagen.

But do all dwarf galaxies have midsize black holes?

"That's difficult to assert, because detecting intermediate-mass black holes is extremely challenging," Ramirez-Ruiz said.

Classic black hole hunting techniques, which look for actively feeding black holes, are often not sensitive enough to uncover black holes in the centers of dwarf galaxies. As a result, only a minuscule fraction of dwarf galaxies is known to host intermediate-mass black holes. Finding more midsize black holes with tidal disruption events could help to settle the debate about how supermassive black holes form.

"One of the biggest open questions in astronomy is currently how supermassive black holes form," said coauthor Vivienne Baldassare, professor of physics and astronomy at Washington State University.

Data from the Young Supernova Experiment enabled the team to detect the first signs of light as the black hole began to eat the star. Capturing this initial moment was pivotal to unlocking how big the black hole was, because the duration of these events can be used to measure the mass of the central black hole. This method, which until now had only been shown to work well for supermassive black holes, was first proposed by Ramirez-Ruiz and coauthor Brenna Mockler at UC Santa Cruz.

"This flare was incredibly fast, but because our YSE data gave us so much early information about the event, we were really able to pin down the mass of the black hole using it," Angus said.


Novel C. diff structures are required for infection, offer new therapeutic targets

Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...