Saturday, 31 August 2019

Lab-on-a-chip may help identify new treatments for liver disease

Non-alcoholic fatty liver disease (NAFLD) -- the accumulation of liver fat in people who drink little or no alcohol -- is increasingly common around the world, and in the United States, it affects between 30 and 40 percent of adults. Currently, there are no approved drugs for the treatment of NAFLD, which is predicted to soon become the main cause of chronic liver problems and the need for liver transplantation.
Now a team led by investigators at Massachusetts General Hospital (MGH) has developed a "lab on a chip" technology that can simulate different levels of NAFLD progression in cells across a single continuous tissue.
For the research, which is described in an article published in the journal Lab on a Chip, the scientists used their new platform to evaluate the effects of different drivers of NAFLD -- such as fat and oxygen concentrations -- on liver cells. In this way, the platform allows for detailed studies of NAFLD progression. Other influences, such as inflammatory cues, can also be superimposed onto the platform to examine their impact.
In addition, the lab-on-a-chip platform can be used to assess investigational drugs' effects on NAFLD progression, thereby revealing their potential for further testing in clinical trials.
"This platform is unique in that in one continuous liver tissue on a single chip, we are able to look at different severities of the disease and to study how liver tissue might respond to both triggers of NAFLD as well as different therapeutic approaches," said senior author O. Berk Usta, PhD, an investigator in the Center for Engineering in Medicine at MGH and assistant professor of Surgery at Harvard Medical School. "While further studies into more complex pathologies of NAFLD and its progressive forms are needed to establish a more complete recapitulation, the current platform establishes a basis for lab-based drug efficacy screening for NAFLD," noted Beyza Bulutoglu PhD, the lead author of the manuscript.
Usta suggested that such a strategy may help accelerate the search for effective drugs for NAFLD conditions that range from benign fat accumulation to more serious complications including fibrosis, cirrhosis, and liver cancer.
Source :- Science Daily

Entanglement sent over 50 km of optical fiber

For the first time, a team has sent a light particle entangled with matter over 50 km of optical fiber. This paves the way for the practical use of quantum networks and sets a milestone for a future quantum internet.
The quantum internet promises absolutely tap-proof communication and powerful distributed sensor networks for new science and technology. However, because quantum information cannot be copied, it is not possible to send it over a classical network. Quantum information must be transmitted by quantum particles, and special interfaces are required for this. The Innsbruck-based experimental physicist Ben Lanyon, who was awarded the Austrian START Prize in 2015 for his research, is studying these important interfaces for a future quantum internet. Now his team at the Department of Experimental Physics at the University of Innsbruck and at the Institute of Quantum Optics and Quantum Information of the Austrian Academy of Sciences has achieved a record for the transfer of quantum entanglement between matter and light. For the first time, a distance of 50 kilometers was covered using fiber optic cables. "This is two orders of magnitude further than was previously possible and is a practical distance to start building inter-city quantum networks," says Ben Lanyon.
Converted photon for transmission
Lanyon's team started the experiment with a calcium atom trapped in an ion trap. Using laser beams, the researchers write a quantum state onto the ion and simultaneously excite it to emit a photon in which quantum information is stored. As a result, the quantum states of the atom and the light particle are entangled. The challenge is to transmit the photon over fiber optic cables. "The photon emitted by the calcium ion has a wavelength of 854 nanometers and is quickly absorbed by the optical fiber," says Ben Lanyon. His team therefore first sends the light particle through a nonlinear crystal illuminated by a strong laser, where the photon's wavelength is converted to the optimal value for long-distance travel: the current telecommunications standard of 1550 nanometers. The researchers from Innsbruck then send this photon through a 50-kilometer-long optical fiber line. Their measurements show that the atom and the light particle are still entangled even after the wavelength conversion and this long journey.
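Energy conservation fixes the pump wavelength needed for such a frequency conversion, and standard fiber-loss figures show why the conversion pays off. Here is a minimal sketch of both calculations; the attenuation values (~3 dB/km near 854 nm, ~0.2 dB/km at 1550 nm) are typical textbook numbers for telecom fiber, not figures from the paper:

```python
# Difference-frequency conversion conserves photon energy:
# 1/lambda_in = 1/lambda_out + 1/lambda_pump
lambda_in, lambda_out = 854e-9, 1550e-9          # meters
lambda_pump = 1 / (1 / lambda_in - 1 / lambda_out)
print(f"Required pump wavelength: {lambda_pump * 1e9:.0f} nm")  # ~1902 nm

# Photon survival probability over fiber: T = 10^(-alpha * L / 10)
def transmission(alpha_db_per_km: float, length_km: float) -> float:
    return 10 ** (-alpha_db_per_km * length_km / 10)

L = 50  # km
print(f"854 nm  (~3 dB/km):   {transmission(3.0, L):.1e}")   # effectively zero
print(f"1550 nm (~0.2 dB/km): {transmission(0.2, L):.2f}")   # about 10% arrive
```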
Even greater distances in sight
As a next step, Lanyon and his team show that their methods would enable entanglement to be generated between ions 100 kilometers apart or more. In this scheme, each of two nodes sends an entangled photon over a distance of 50 kilometers to an intermediate station, where the two light particles are measured in such a way that they lose their entanglement with the ions -- which in turn become entangled with each other. With 100-kilometer node spacing now a possibility, one could therefore envisage building the world's first intercity light-matter quantum network in the coming years: only a handful of trapped-ion systems would be required along the way to establish a quantum internet between Innsbruck and Vienna, for example.
Source :- Science Daily

Psychosensory electronic skin technology for future AI and humanoid development

DGIST has announced that Professor Jae Eun Jang's team in the Department of Information and Communication Engineering has developed electronic skin technology that can detect "prick" and "hot" pain sensations the way humans do. This result is expected to be applied to the development of humanoid robots and of prosthetic hands for patients in the future.
Attempts to mimic the five human senses have led to innovative electronic devices such as the camera and the TV, inventions that dramatically changed human life. Many scientists are therefore working to imitate the tactile, olfactory, and taste senses as well, and tactile sensing is expected to be the next mimetic technology for various reasons. Currently, most tactile-sensor research focuses on physical mimetic technologies that measure the pressure a robot uses to grab an object; psychosensory tactile research on how to mimic human tactile feelings such as softness, smoothness, or roughness still has a long way to go.
To close this gap, Professor Jae Eun Jang's team developed a tactile sensor that can feel pain and temperature as humans do, through joint research with Professor Cheil Moon's team in the Department of Brain and Cognitive Science, Professor Ji-woong Choi's team in the Department of Information and Communication Engineering, and Professor Hongsoo Choi's team in the Department of Robotics Engineering. Its key strengths are a simplified sensor structure, the ability to measure pressure and temperature at the same time, and applicability to various tactile systems regardless of the sensor's measurement principle.
For this, the research team focused on zinc oxide (ZnO) nanowire technology, which serves as a self-powered tactile sensor that needs no battery thanks to the piezoelectric effect, whereby pressure generates electrical signals. A temperature sensor based on the Seebeck effect(1) was integrated at the same time, so that one sensor does two jobs. The team arranged electrodes on a flexible polyimide substrate, grew the ZnO nanowires, and could measure the piezoelectric response to pressure and the Seebeck response to temperature change simultaneously. The researchers also succeeded in developing a signal-processing technique that judges whether a pain signal should be generated, taking into account the pressure level, the stimulated area, and the temperature.
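The article does not give the team's actual decision rule, but the idea of fusing a piezoelectric pressure reading with a Seebeck-effect temperature reading can be sketched as a simple threshold classifier. All thresholds and the Seebeck coefficient below are illustrative assumptions, not values from the study:

```python
# Illustrative "pain" decision fusing pressure and temperature readings.
# Every constant here is a made-up placeholder for demonstration only.

SEEBECK_COEFF_UV_PER_K = 300.0   # assumed thermopower, microvolts per kelvin

def temperature_rise_kelvin(thermo_voltage_uv: float) -> float:
    """Invert V = S * dT to estimate the local temperature rise."""
    return thermo_voltage_uv / SEEBECK_COEFF_UV_PER_K

def pain_signal(pressure_kpa: float, area_mm2: float, thermo_voltage_uv: float) -> bool:
    """Flag pain for a strong, concentrated press (a 'prick') or a large
    temperature rise (a 'hot' stimulus)."""
    sharp_prick = pressure_kpa > 50.0 and area_mm2 < 1.0
    hot_contact = temperature_rise_kelvin(thermo_voltage_uv) > 20.0
    return sharp_prick or hot_contact

print(pain_signal(pressure_kpa=80.0, area_mm2=0.5, thermo_voltage_uv=900.0))  # True: prick
print(pain_signal(pressure_kpa=10.0, area_mm2=5.0, thermo_voltage_uv=600.0))  # False: gentle, warm
```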
Professor Jang of the Department of Information and Communication Engineering said, "We have developed a core base technology that can effectively detect pain, which is necessary for developing future tactile sensors. As an achievement of convergence research by experts in nano engineering, electronic engineering, robotics engineering, and brain science, it will be widely applied to electronic skin that feels various sensations, as well as to new human-machine interactions. If robots can also feel pain, our research will expand further into technology to control robots' aggressive tendencies, one of the risk factors of AI development."
(1) Seebeck effect: when two different conductors are joined to form a circuit and the two junctions are held at different temperatures, a thermoelectromotive force is generated in the circuit.
Source :- Science Daily

Researchers develop low-power, low-cost network for 5G connectivity

Researchers at the University of Waterloo have developed a cheaper and more efficient method for Internet-of-Things devices to receive high-speed wireless connectivity.
With 75 billion Internet of Things (IoT) devices expected to be in place by 2025, a growing strain will be placed on wireless networks. Contemporary WiFi and cellular networks won't be enough to support the influx of IoT devices, the researchers note in their new study.
Millimeter wave (mmWave), a band that offers multiple gigahertz of unlicensed spectrum -- more than 200 times the bandwidth allocated to today's WiFi and cellular networks -- can be used to address the looming issue. In fact, 5G networks are going to be powered by mmWave technology. However, the hardware required to use mmWave is expensive and power-hungry, which is a significant deterrent to deploying it in many IoT applications.
"To address the existing challenges in exploiting mmWave for IoT applications we created a novel mmWave network called mmX," said Omid Abari, an assistant professor in Waterloo's David R. Cheriton School of Computer Science. "mmX significantly reduces cost and power consumption of a mmWave network enabling its use in all IoT applications."
In comparison to WiFi and Bluetooth, which are too slow for many IoT applications, mmX provides a much higher bitrate.
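The bandwidth argument can be made concrete with Shannon's capacity formula, C = B log2(1 + SNR). The sketch below compares an 80 MHz WiFi channel against the roughly 7 GHz of unlicensed spectrum available around 60 GHz; the SNR value is an arbitrary assumption, chosen only to keep the comparison fair:

```python
import math

def shannon_capacity_gbps(bandwidth_hz: float, snr_db: float) -> float:
    """Upper bound on bitrate for a given bandwidth and signal-to-noise ratio."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e9

SNR_DB = 20  # assumed link quality, identical for both bands
print(f"WiFi, 80 MHz channel:  {shannon_capacity_gbps(80e6, SNR_DB):.2f} Gbps")
print(f"mmWave, 7 GHz channel: {shannon_capacity_gbps(7e9, SNR_DB):.1f} Gbps")
```

At equal link quality, capacity scales linearly with bandwidth, which is where mmWave's roughly hundredfold advantage comes from.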
"mmX will not only improve our WiFi and wireless experience, as we will receive much faster internet connectivity for all IoT devices, but it can also be used in applications, such as, virtual reality, autonomous cars, data centers and wireless cellular networks," said Ali Abedi, a postdoctoral fellow at the Cheriton School of Computer Science. "Any sensor you have in your home, which traditionally used WiFi and lower frequency can now communicate using high-speed millimeter wave networks.
"Autonomous cars are also going to use a huge number of sensors in them which will be connected through wire; now you can make all of them wireless and more reliable."
The study, A Millimeter Wave Network for Billions of Things, authored by Waterloo's Faculty of Mathematics researchers Abari, Abedi, and research assistants Mohammed Mazaheri and Soroush Ameli, was recently presented at the ACM SIGCOMM 2019 conference.
Source :- Science Daily

Newly discovered giant planet slingshots around its star

Astronomers have discovered a planet three times the mass of Jupiter that travels on a long, egg-shaped path around its star. If this planet were somehow placed into our own solar system, it would swing from within our asteroid belt to out beyond Neptune. Other giant planets with highly elliptical orbits have been found around other stars, but none of those worlds were located at the very outer reaches of their star systems like this one.
"This planet is unlike the planets in our solar system, but more than that, it is unlike any other exoplanets we have discovered so far," says Sarah Blunt, a Caltech graduate student and first author on the new study publishing in The Astronomical Journal. "Other planets detected far away from their stars tend to have very low eccentricities, meaning that their orbits are more circular. The fact that this planet has such a high eccentricity speaks to some difference in the way that it either formed or evolved relative to the other planets."
The planet was discovered using the radial velocity method, a workhorse of exoplanet discovery that detects new worlds by tracking how their parent stars "wobble" in response to gravitational tugs from those planets.
However, analyses of these data usually require observations taken over a planet's entire orbital period. For planets orbiting far from their stars, this can be difficult: a full orbit can take tens or even hundreds of years.
The California Planet Search, led by Caltech Professor of Astronomy Andrew W. Howard, is one of the few groups that watches stars over the decades-long timescales necessary to detect long-period exoplanets using radial velocity.
The data needed to make the discovery of the new planet were first provided by the W. M. Keck Observatory in Hawaii. In 1997, the team began using the High-Resolution Echelle Spectrometer (HIRES) on the Keck I telescope to take measurements of the planet's star, called HR 5183.
"The key was persistence," said Howard. "Our team followed this star with Keck Observatory for more than two decades and only saw evidence for the planet in the past couple years! Without that long-term effort, we never would have found this planet."
In addition to Keck Observatory, the California Planet Search also used the Lick Observatory in Northern California and the McDonald Observatory in Texas.
The astronomers have been watching HR 5183 since the 1990s, but do not have data corresponding to one full orbit of the planet, called HR 5183 b, because it circles its star roughly every 45 to 100 years. The team instead found the planet because of its strange orbit.
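Even without a complete orbit, the period estimate pins down the orbit's scale through Kepler's third law, which is why the "asteroid belt to beyond Neptune" comparison works. A minimal sketch, assuming a roughly Sun-like host star and an illustrative eccentricity (neither number is the paper's measurement):

```python
def semi_major_axis_au(period_years: float, stellar_mass_msun: float = 1.0) -> float:
    """Kepler's third law: a^3 = M * P^2 (a in AU, P in years, M in solar masses)."""
    return (stellar_mass_msun * period_years ** 2) ** (1 / 3)

ECC = 0.8  # assumed high eccentricity, for illustration only
for P in (45, 100):  # the reported range of possible orbital periods
    a = semi_major_axis_au(P)
    print(f"P = {P:3d} yr: a = {a:4.1f} AU, "
          f"closest approach = {a * (1 - ECC):4.1f} AU, "
          f"farthest point = {a * (1 + ECC):5.1f} AU")
```

For a 45-to-100-year period this gives closest approaches of a few AU (asteroid-belt territory) and farthest points of roughly 20 to 40 AU (around and beyond Neptune's orbit).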
"This planet spends most of its time loitering in the outer part of its star's planetary system in this highly eccentric orbit, then it starts to accelerate in and does a slingshot around its star," explains Howard. "We detected this slingshot motion. We saw the planet come in and now it's on its way out. That creates such a distinctive signature that we can be sure that this is a real planet, even though we haven't seen a complete orbit."
The new findings show that it is possible to use the radial velocity method to make detections of other far-flung planets without waiting decades. And, the researchers suggest, looking for more planets like this one could illuminate the role of giant planets in shaping their solar systems.
Planets take shape out of disks of material left over after stars form. That means that planets should start off in flat, circular orbits. For the newly detected planet to be on such an eccentric orbit, it must have gotten a gravitational kick from some other object.
The most plausible scenario, the researchers propose, is that the planet once had a neighbor of similar size. When the two planets got close enough to each other, one pushed the other out of the system entirely, forcing HR 5183 b into its highly eccentric orbit.
"This newfound planet basically would have come in like a wrecking ball," says Howard, "knocking anything in its way out of the system."
This discovery demonstrates that our understanding of planets beyond our solar system is still evolving. Researchers continue to find worlds that are unlike anything in our solar system or in solar systems we have already discovered.
"Copernicus taught us that Earth is not the center of the solar system, and as we expanded into discovering other solar systems of exoplanets, we expected them to be carbon copies of our own solar system," Howard explains, "But it's just been one surprise after another in this field. This newfound planet is another example of a system that is not the image of our solar system but has remarkable features that make our universe incredibly rich in its diversity."
Source :- Science Daily

Earth's fingerprint hints at finding habitable planets beyond the solar system

Two McGill University astronomers have assembled a "fingerprint" for Earth, which could be used to identify a planet beyond our Solar System capable of supporting life.
McGill Physics student Evelyn Macdonald and her supervisor Prof. Nicolas Cowan used over a decade of observations of Earth's atmosphere taken by the SCISAT satellite to construct a transit spectrum of Earth, a sort of fingerprint for Earth's atmosphere in infrared light, which shows the presence of key molecules in the search for habitable worlds. This includes the simultaneous presence of ozone and methane, which scientists expect to see only when there is an organic source of these compounds on the planet. Such a detection is called a "biosignature."
"A handful of researchers have tried to simulate Earth's transit spectrum, but this is the first empirical infrared transit spectrum of Earth," says Prof. Cowan. "This is what alien astronomers would see if they observed a transit of Earth."
The findings, published Aug. 28 in the journal Monthly Notices of the Royal Astronomical Society, could help scientists determine what kind of signal to look for in their quest to find Earth-like exoplanets (planets orbiting a star other than our Sun). Developed by the Canadian Space Agency, SCISAT was created to help scientists understand the depletion of Earth's ozone layer by studying particles in the atmosphere as sunlight passes through it. In general, astronomers can tell what molecules are found in a planet's atmosphere by looking at how starlight changes as it shines through the atmosphere. Instruments must wait for a planet to pass -- or transit -- over the star to make this observation. With sensitive enough telescopes, astronomers could potentially identify molecules such as carbon dioxide, oxygen or water vapour that might indicate if a planet is habitable or even inhabited.
Cowan was explaining transit spectroscopy of exoplanets at a group lunch meeting at the McGill Space Institute (MSI) when Prof. Yi Huang, an atmospheric scientist and fellow member of the MSI, noted that the technique was similar to solar occultation studies of Earth's atmosphere, as done by SCISAT.
Since the first discovery of an exoplanet in the 1990s, astronomers have confirmed the existence of 4,000 exoplanets. The holy grail in this relatively new field of astronomy is to find planets that could potentially host life -- an Earth 2.0.
A very promising system that might hold such planets, called TRAPPIST-1, will be a target for the upcoming James Webb Space Telescope, set to launch in 2021. Macdonald and Cowan built a simulated signal of what an Earth-like planet's atmosphere would look like through the eyes of this future telescope, which is a collaboration between NASA, the Canadian Space Agency and the European Space Agency.
The TRAPPIST-1 system, located 40 light years away, contains seven planets, three or four of which are in the so-called "habitable zone" where liquid water could exist. The McGill astronomers say this system might be a promising place to search for a signal similar to their Earth fingerprint, since the planets orbit an M-dwarf star, a type of star that is smaller and colder than our Sun.
"TRAPPIST-1 is a nearby red dwarf star, which makes its planets excellent targets for transit spectroscopy. This is because the star is much smaller than the Sun, so its planets are relatively easy to observe," explains Macdonald. "Also, these planets orbit close to the star, so they transit every few days. Of course, even if one of the planets harbours life, we don't expect its atmosphere to be identical to Earth's since the star is so different from the Sun."
According to their analysis, Macdonald and Cowan affirm that the Webb Telescope will be sensitive enough to detect carbon dioxide and water vapour using its instruments. It may even be able to detect the biosignature of methane and ozone if enough time is spent observing the target planet.
Prof. Cowan and his colleagues at the Montreal-based Institute for Research on Exoplanets are hoping to be some of the first to detect signs of life beyond our home planet. The fingerprint of Earth assembled by Macdonald for her senior undergraduate thesis could tell other astronomers what to look for in this search. She will be starting her Ph.D. in the field of exoplanets at the University of Toronto in the Fall.

Busy older stars outpace stellar youngsters

The oldest stars in our Galaxy are also the busiest, moving more rapidly than their younger counterparts in and out of the disc of the Milky Way, according to new analysis carried out at the University of Birmingham.
The findings provide fresh insights into the history of our Galaxy and increase our understanding of how stars form and evolve.
Researchers calculate that the old stars are moving more quickly in and out of the disc -- the pancake-shaped mass at the heart of the Galaxy where most stars are located.
A number of theories could explain this movement -- it all depends where the star is in the disc. Stars towards the outskirts could be knocked by gravitational interactions with smaller galaxies passing by. Towards the inner parts of the disc, the stars could be disturbed by massive gas clouds which move along with the stars inside the disc. They could also be thrown out of the disc by the movement of its spiral structure.
Dr Ted Mackereth, a galactic archaeologist at the University of Birmingham, is lead author on the paper. He explains: "The specific way that the stars move tells us which of these processes has been dominant in forming the disc we see today. We think older stars are more active because they have been around the longest, and because they were formed during a period when the Galaxy was a bit more violent, with lots of star formation happening and lots of disturbance from gases and smaller satellite galaxies. There are lots of different processes at work, and untangling all these helps us to build up a picture of the history of our Galaxy."
The study uses data from the Gaia satellite, currently working to chart the movements of around 1 billion stars in the Milky Way. It also takes information from APOGEE, an experiment run by the Sloan Digital Sky Survey that uses spectroscopy to measure the distribution of elements in stars, as well as images from the recently-retired Kepler space telescope.
Measurements provided by Kepler show how the brightness of stars varies over time, which gives insights into how they vibrate. In turn, that yields information about their interior structure, which enables scientists to calculate their age.
The Birmingham team, working with colleagues at the University of Toronto and teams involved with the Sloan Digital Sky Survey, were able to take these different data strands and calculate the differences in velocity between different sets of stars grouped by age.
They found that the older stars were moving in many different directions, with some moving very quickly out of the galactic disc. Younger stars move closely together at much slower speeds out of the disc, although they rotate around the Galaxy within the disc faster than the older stars do.
The eventual goal of the research is to link what is known about the Milky Way with information about how other galaxies in the universe formed, ultimately being able to place our Galaxy within the very earliest signatures of the universe.
The research is published in the Monthly Notices of the Royal Astronomical Society and funded by the Science and Technology Facilities Council, the Royal Astronomical Society and the European Research Council.
Source :- Science Daily

Hints of a volcanically active exo-moon

Jupiter's moon Io is the most volcanically active body in our solar system. Today, there are indications that an active moon outside our solar system -- an exo-Io -- could be hidden in the exoplanet system WASP-49. "It would be a dangerous volcanic world with a molten surface of lava, a lunar version of close-in Super-Earths like 55 Cancri e," says Apurva Oza, postdoctoral fellow at the Physics Institute of the University of Bern and associate of the NCCR PlanetS, "a place where Jedis go to die, perilously familiar to Anakin Skywalker." But the object that Oza and his colleagues describe in their work seems to be even more exotic than Star Wars science fiction: the possible exomoon would orbit a hot giant planet, which in turn races around its host star in less than three days -- a scenario 550 light years away in the inconspicuous constellation of Lepus, below the bright constellation of Orion.
Sodium gas as circumstantial evidence
Astronomers have not yet discovered a rocky moon beyond our solar system, and it is on the basis of circumstantial evidence that the researchers in Bern conclude that the exo-Io exists: sodium gas was detected around the planet WASP-49b at an anomalously high altitude. "The neutral sodium gas is so far away from the planet that it is unlikely to be emitted solely by a planetary wind," says Oza. Observations of Jupiter and Io in our solar system by the international team, along with mass-loss calculations, show that an exo-Io could be a very plausible source of sodium at WASP-49b. "The sodium is right where it should be," says the astrophysicist.
Tides keep the system stable
As early as 2006, Bob Johnson of the University of Virginia and the late Patrick Huggins of New York University had shown that large amounts of sodium at an exoplanet could point to a hidden moon or ring of material, and ten years ago researchers at Virginia calculated that such a compact three-body system -- star, close-in giant planet and moon -- can be stable over billions of years. Apurva Oza was then a student at Virginia and, after his PhD on moon atmospheres in Paris, decided to pick up these researchers' theoretical calculations. He has now published the results of his work, together with Johnson and colleagues, in the Astrophysical Journal.
"The enormous tidal forces in such a system are the key to everything," explains the astrophysicist. The energy released by the tides to the planet and its moon keeps the moon's orbit stable, simultaneously heating it up and making it volcanically active. In their work, the researchers were able to show that a small rocky moon can eject more sodium and potassium into space through this extreme volcanism than a large gas planet, especially at high altitudes. "Sodium and potassium lines are quantum treasures to us astronomers because they are extremely bright," says Oza, "the vintage street lamps that light up our streets with yellow haze, is akin to the gas we are now detecting in the spectra of a dozen exoplanets."
"We need to find more clues"
The researchers compared their calculations with these observations and found five candidate systems where a hidden exomoon could survive destructive thermal evaporation. For WASP-49b, the observed data are best explained by the existence of an exo-Io. There are other options, however: the exoplanet could, for example, be surrounded by a ring of ionized gas, or the sodium could stem from non-thermal processes. "We need to find more clues," Oza admits. The researchers are therefore counting on further observations with ground-based and space-based instruments.
"While the current wave of research is going towards habitability and biosignatures, our signature is a signature of destruction," says the astrophysicist. A few of these worlds could be destroyed in a few billion years due to the extreme mass loss. "The exciting part is that we can monitor these destructive processes in real time, like fireworks," says Oza.
Source :- Science Daily

Ultra-white beetle scales hold secret to creating sustainable paint from recycled plastic

The structure of ultra-white beetle scales could hold the key to making bright-white sustainable paint using recycled plastic waste, scientists at the University of Sheffield have discovered.
Cyphochilus beetle scales are one of the brightest whites in nature and their ultra-white appearance is created by the nanostructure in their tiny scales, as opposed to the use of pigment or dyes.
Experts have now been able to recreate and improve on this structure in the lab using low-cost materials, via a technique that could provide a sustainable alternative to titanium dioxide in white paint.
Dr Andrew Parnell, from the University of Sheffield's Department of Physics and Astronomy, who led the research, said: "In the natural world, whiteness is usually created by a foamy, Swiss cheese-like structure made of a solid interconnected network and air. Until now, how these structures form and develop and how they have evolved light-scattering properties has remained a mystery.
"Having understood these structures we were able to take plastic and structure it in the same way. Ideally, we could recycle plastic waste that would normally be burnt or sent to landfill, structure it just like the beetle scale and then use it to make super white paint. This would make paint with a much lower carbon footprint and help tackle the challenge of recycling single-use plastics."
The findings show that the foamy structure of the beetles' scales had the right proportion of empty spaces, which optimise the scattering of light -- creating the ultra-white colouring.
Conventional white paint contains nanoparticles of titanium dioxide, which scatter light very strongly. However, the use of titanium dioxide is harmful to the environment: it accounts for nearly 75 per cent of the carbon footprint of each tin of paint that is produced.
To measure the tiny individual beetle scales, researchers used a technique called X-ray tomography, which is similar to a CT scan but on a minuscule scale. The scientists used the X-ray imaging facilities at the instrument ID16B at the European Synchrotron Radiation Facility (ESRF) in Grenoble, France.
The intense X-ray source at the ESRF meant whole intact scales could be measured, which was pivotal to understanding them and modelling how they scatter light. To follow how the synthetic material formed, they again used the ESRF to confirm the formation mechanism as the layer dried and became structured.
Dr Stephanie Burg, a PhD researcher at the University of Sheffield, said: "This research answers long-standing questions about how the structure inside these scales actually forms, and we hope these lessons from nature will help inform the future of sustainable manufacturing for paint."
The team also used the instrument Larmor at the ISIS Spallation Neutron Source, which measured the nanostructure of the synthetic whites they made. This was at the Rutherford Appleton Laboratory in Oxfordshire -- part of the Science and Technologies Facilities Council.

Source :- Science Daily

Storage and release of mechanical waves without energy loss

Light and sound waves are at the basis of energy and signal transport and fundamental to some of our most basic technologies -- from cell phones to engines. Scientists, however, have yet to devise a method that allows them to store a wave intact for an indefinite period of time and then direct it toward a desired location on demand. Such a development would greatly facilitate the ability to manipulate waves for a variety of desired uses, including energy harvesting, quantum computing, structural-integrity monitoring, information storage, and more.
In a newly published paper in Science Advances, a group of researchers led by Andrea Alù, founding director of the Photonics Initiative at the Advanced Science Research Center (ASRC) at The Graduate Center, CUNY, and by Massimo Ruzzene, professor of Aeronautics Engineering at Georgia Tech, have experimentally shown that it is possible to efficiently capture and store a wave intact then guide it towards a specific location.
"Our experiment proves that unconventional forms of excitation open new opportunities to gain control over wave propagation and scattering," said Alù. "By carefully tailoring the time dependence of the excitation, it is possible to trick the wave to be efficiently stored in a cavity, and then release it on demand towards the desired direction."
Methodology
To achieve their goal, the scientists had to devise a way of changing the basic interaction between waves and materials. When a light or sound wave hits an obstacle, it is either partially absorbed or reflected and scattered. The absorption process entails immediately converting the wave into heat or other forms of energy. Materials that can't absorb waves only reflect and scatter them. The researchers' goal was to find a way to mimic the absorption process without converting the wave into other forms of energy, instead storing it in the material. This concept, introduced theoretically by the ASRC group two years ago, is known as coherent virtual absorption.
To prove their theory, the researchers reasoned that they needed to tailor the waves' time evolution so that, when the waves came in contact with non-absorbing materials, they would be neither reflected, scattered, nor transmitted. The wave impinging on the structure would then be unable to escape, remaining efficiently trapped inside as if it were being absorbed. The stored wave could then be released on demand.
During their experiment, researchers propagated two mechanical waves traveling in opposite directions along a carbon steel waveguide bar that contained a cavity. The time variations of each wave were carefully controlled to ensure that the cavity would retain all of the impinging energy. Then, by stopping the excitation or detuning one of the waves, they were able to control the release of the stored energy and send it towards a desired direction on demand.
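The trick can be illustrated with textbook temporal coupled-mode theory: if a lossless single-port cavity with decay time tau is driven by an input wave that grows as exp(t/tau), the directly reflected wave and the cavity leakage cancel exactly, so nothing escapes until the drive is switched off. This is a minimal numerical sketch of that idea, not the authors' elastic-waveguide model:

```python
import numpy as np

tau = 1.0                # cavity decay time (arbitrary units)
dt = 1e-4
t = np.arange(0.0, 10.0, dt)
t_stop = 6.0             # the drive is switched off here

# Input grows as exp(t/tau), mirroring the cavity's natural decay in reverse.
s_in = np.where(t < t_stop, np.exp((t - t_stop) / tau), 0.0)

# Coupled-mode equations: da/dt = -a/tau + sqrt(2/tau) * s_in,
# with outgoing wave s_out = -s_in + sqrt(2/tau) * a.
a = np.zeros_like(t)
for i in range(len(t) - 1):
    a[i + 1] = a[i] + dt * (-a[i] / tau + np.sqrt(2 / tau) * s_in[i])
s_out = -s_in + np.sqrt(2 / tau) * a

loading = t < t_stop
print(f"max |s_out| while loading: {np.max(np.abs(s_out[loading])):.1e}")  # tiny startup transient only
print(f"max |s_out| after release: {np.max(np.abs(s_out[~loading])):.2f}")  # ~1: the stored wave exits
```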
"While we ran our proof-of-concept experiment using elastic waves traveling in a solid material, our findings are also applicable to radiowaves and light, offering exciting prospects for efficient energy harvesting, wireless power transfer, low-energy photonics, and generally enhanced control over wave propagation," said Ruzzene.
Source :- Science Daily

Defrosting surfaces in seconds

In the future, a delayed flight due to ice will be no cause for a meltdown.
A group of researchers at the University of Illinois at Urbana-Champaign (UIUC) and Kyushu University has developed a way to remove ice and frost from surfaces extremely efficiently, using less than 1% of the energy and less than 0.01% of the time needed for traditional defrosting methods.
The group describes the method in Applied Physics Letters, from AIP Publishing. Instead of conventional defrosting, which melts all the ice or frost from the top layer down, the researchers established a technique that melts the ice where the surface and the ice meet, so the ice can simply slide off.
"The work was motivated by the large energy efficiency losses of building energy systems and refrigeration systems due to the need to do intermittent defrosting. The systems must be shut down, the working fluid is heated up, then it needs to be cooled down again," said author Nenad Miljkovic, at UIUC. "This eats up a lot of energy when you think of the yearly operational costs of running intermittent defrosting cycles."
According to the authors, the biggest source of inefficiency in conventional systems is that much of the energy used for de-icing goes into heating other components of the system rather than directly heating the frost or ice. This increases energy consumption and system downtime.
Instead, the researchers proposed delivering a pulse of very high current where the ice and the surface meet to create a layer of water. To ensure the pulse reaches the intended space rather than melting the exposed ice, the researchers apply a thin coating of indium tin oxide (ITO) -- a conductive film often used for defrosting -- to the surface of the material. Then, they leave the rest to gravity.
To test this, the scientists defrosted a small glass surface cooled to minus 15.1 degrees Celsius -- about as cold as the warmest parts of Antarctica -- and to minus 71 degrees Celsius -- colder than the coldest parts of Antarctica. These temperatures were chosen to model heating, ventilation, air conditioning and refrigeration applications and aerospace applications, respectively. In all tests, the ice was removed with a pulse lasting less than one second.
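A back-of-the-envelope calculation shows why melting only the interface is so much cheaper: just a micrometers-thin film of ice has to change phase instead of the whole layer (and conventional systems waste further energy heating other components). The sketch below uses standard material constants for ice; both layer thicknesses are illustrative assumptions:

```python
# Energy to melt an ice film of a given thickness over 1 cm^2,
# via the latent heat of fusion (ice: ~334 J/g, density ~0.917 g/cm^3).
LATENT_HEAT_J_PER_G = 334.0
ICE_DENSITY_G_PER_CM3 = 0.917
AREA_CM2 = 1.0

def melt_energy_joules(thickness_um: float) -> float:
    volume_cm3 = AREA_CM2 * thickness_um * 1e-4   # 1 um = 1e-4 cm
    return volume_cm3 * ICE_DENSITY_G_PER_CM3 * LATENT_HEAT_J_PER_G

interfacial = melt_energy_joules(10)     # assumed ~10 um lubricating film
full_layer = melt_energy_joules(2000)    # assumed 2 mm of ice melted top-down
print(f"Interfacial film: {interfacial:5.2f} J")
print(f"Entire ice layer: {full_layer:5.1f} J "
      f"(film needs {interfacial / full_layer:.1%} of this)")
```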
In a real, 3D system, gravity would be assisted by air flow. "At scale, it all depends on the geometry," Miljkovic said. "However, the efficiency of this approach should definitely still be much better than conventional approaches."
The group hasn't studied more complicated surfaces like airplanes yet, but they think it's an obvious future step.
"They are a natural extension as they travel fast, so the shear forces on the ice are large, meaning only a very thin layer at the interface needs to be melted in order to remove the ice," Miljkovic said. "Work would be needed to figure out how we can coat curved components conformally with the ITO and to figure out how much energy we would need."
The researchers hope to work with external companies on scaling up their approach for commercialization.
Source :- Science Daily

Biochar: A better start to rain forest restoration

An indigenous farming technique that's been around for thousands of years provides the basis for restoring rain forests stripped clear of trees by gold mining and other threats.
A carbon-based soil amendment called biochar is a cheap and effective way to support tree seedling survival during reforestation efforts in the Amazon rain forest, according to new research from Wake Forest University's Center for Amazonian Scientific Innovation (CINCIA).
Restoring and recovering rain forests has become increasingly important for combatting climate change, since these wide swaths of trees can absorb billions of tons of carbon dioxide each year. The problem is particularly acute in areas mined for alluvial gold deposits, which devastate not only rain forest trees but also soils. High costs can be a huge barrier to replanting, fertilizing and nurturing trees to replace those lost in the rain forest.
The scientists found that using biochar combined with fertilizer significantly improved the height and diameter growth of tree seedlings while also increasing the number of leaves the seedlings developed. The experiment, based in Madre de Dios, a region of the Peruvian Amazon that is the heart of the illegal gold-mining trade in that country, used two tropical tree species: the fast-growing Guazuma crinita and Terminalia amazonia, a late successional tree often used as timber.
"The most difficult period in a tree seedling's life is the first few months after transplanting," said Miles Silman, CINCIA associate director of science and Wake Forest's Andrew Sabin Presidential Chair of Conservation Biology. "But just a little bit of biochar does wonderful things to the soil, and it really shines when you add organic fertilizer."
The CINCIA scientists make biochar out of Brazil nut husks discarded by large-scale processors in Peru. They burn the husks slowly in 55-gallon barrels, a low-tech, inexpensive and easily scalable method for producing biochar.
The study, "Biochar effects on two native tropical tree species and its potential as a tool for reforestation," appears online this month in the peer-reviewed journal Forests. Until this study, little was known about whether biochar could benefit tree growth in tropical tree seedlings.
"We show that while both biochar and fertilizer can improve tree seedling growth, combining them makes seedlings thrive beyond either amendment alone," said Silman.
The native peoples of the Amazon created "dark earths" using biochar thousands of years ago, and those soils are still productive today.
Biochar's benefits are many:
  • It improves the soil's ability to hold water and makes it less acidic.
  • It provides a welcoming habitat for microbes, which support plant growth.
  • It holds onto fertilizer and releases it over time, decreasing the need for repeat applications of fertilizer, which cuts labor and supply costs.
The scientists used soils recovered from the San Jacinto native community, where gold mining has ravaged the land. Silman explained that the dirt that comes out of the mining sluice is devoid of the organic matter and microbes that support plant life.
"These are the kinds of landscapes we have to recover, and we are still trying to determine how to grow plants in them," he said. "This soil is extremely limiting for natural regrowth, but treating them with biochar turns it into something that plants can grow in. That's good for biodiversity and good for the people that have to make a living from the land."

Tuesday, 27 August 2019

Big brains or big guts: Choose one

[Image: Ptarmigan]
Big brains can help an animal mount quick, flexible behavioral responses to frequent or unexpected environmental changes. But some birds just don't need 'em.
A global study comparing 2,062 birds finds that, in highly variable environments, birds tend to have either larger or smaller brains relative to their body size. Birds with smaller brains tend to use ecological strategies that are not available to big-brained counterparts. Instead of relying on grey matter to survive, these birds tend to have large bodies, eat readily available food and make lots of babies.
The new research from biologists at Washington University in St. Louis appears Aug. 23 in the journal Nature Communications.
"The fact is that there are a great many species that do quite well with small brains," said Trevor Fristoe, formerly a postdoctoral researcher at Washington University, now at the University of Konstanz in Germany.
"What's really interesting is that we don't see any middle ground here," Fristoe said. "The resident species with intermediate brain size are almost completely absent from high latitude (colder and more climatically variable) environments. The species that don't go all in on either of the extreme strategies are forced to migrate to more benign climates during the winter."
"Having a large brain is typically associated with strong energetic demands and a slower life-history," said Carlos Botero, assistant professor of biology in Arts & Sciences and co-author of the paper. "Free from these constraints, species with small brains can exhibit traits and lifestyles that are never seen in larger-brained ones.
"What we found is that alternative ecological strategies that either increase or decrease investments in brain tissue are equally capable of coping with the challenges of living in high-latitude environments," he said.
Because the brain is such a costly organ to develop and maintain, biologists have long been interested in understanding how large brain size -- in all species -- could have evolved.
One hypothesis is based around the idea that one of the main advantages of possessing a big brain is that it allows for a high degree of behavioral flexibility. With flexibility comes the ability to respond to different conditions -- such as wide swings in temperature, or changes in food availability.
The so-called cognitive buffer hypothesis is not the only possible explanation for the evolution of brain size -- but it is an important and influential one.
Relative brain size is a measure of the size of the brain as compared to the body -- think: an ostrich's brain might be much bigger than a chickadee's brain, but so is the ostrich's body. Predictably, the global distribution of relative brain size of birds follows a bell curve, with most species landing squarely in the middle, and only a handful of outliers with relatively large or relatively small brains.
Previous studies had found general trends towards larger relative brain sizes in higher latitudes, where conditions are more variable -- consistent with the cognitive buffer hypothesis. Fristoe and Botero's new study is different because it looks at the full distribution of brain sizes across environments, allowing them to test whether different sizes are over- or under-represented.
Excluding contributions from migrants -- the birds that live in polar or temperate environments only during more favorable times of the year -- the researchers found that at high latitudes, bird brain size appears to be bimodal. This morphological pattern means that bird brains are significantly more likely to be relatively large, or relatively small, compared to body size.
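A standard way to test for this kind of bimodality is to fit one- and two-component mixture models to the relative brain-size distribution and ask which the data prefer. Here is a toy sketch of that approach on synthetic data; it is not the authors' actual analysis:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic relative brain sizes for a "high-latitude" sample: two clusters,
# mimicking the reported split into small- and large-brained residents.
sample = np.concatenate([rng.normal(-1.0, 0.3, 150),
                         rng.normal(+1.0, 0.3, 150)])
X = sample.reshape(-1, 1)

for k in (1, 2):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)
    print(f"{k} component(s): BIC = {gmm.bic(X):8.1f}")  # lower BIC wins
# The two-component fit wins decisively, i.e. the distribution is bimodal.
```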
What was going on here? Fristoe, born in Alaska, had a few ideas.
In fact, Fristoe suggests that the Alaska state bird, the ptarmigan, might be a good poster child for the small-brained species. Endearing though she is -- with her plushy bosom, feathered feet and unusual chuckling call -- she's not exactly known for her smarts. The ptarmigan can, however, chow down on twigs and willow leaves with the best of them.
"In our paper, we find that small-brained species in these environments employ strategies that are unachievable with a large brain," Fristoe said. "First, these species are able to persist by foraging on readily available but difficult to digest resources such as dormant plant buds, the needles of conifers, or even twigs.
"These foods can be found even during harsh winter conditions, but they are fibrous and require a large gut to digest," he said. "Gut tissue, like brain tissue, is energetically demanding, and limited budgets mean that it is challenging to maintain a lot of both.
"We also found that these species have high reproductive rates, producing many offspring every year," Fristoe said. "This would allow their populations to recover from high mortality during particularly challenging conditions. Because big-brained species tend to invest more time in raising fewer offspring, this is a strategy that is not available to them."
In other words, maybe big brains are not all that.
"Brains are not evolving in isolation -- they are part of a broader suite of adaptations that help organisms be successful in their lives," Botero said. "Because of trade-offs between different aspects of that total phenotype, we find that two different lineages may respond to selection from environmental oscillations in completely different ways.
"Given that our own species uses its brain to cope with these changes, it is not really surprising that biologists, ourselves included, have historically exhibited a bias toward thinking about environmental variability as a force that drives the expansion of brain size," Botero said. "But the interesting thing that we find here is that when we take a broader view, we realize that other strategies also work -- and remarkably, the alternative here involves making a brain actually smaller!"
Source :- Science Daily

Storms on Jupiter are disturbing the planet's colorful belts

[Image: Jupiter]
Storm clouds rooted deep in Jupiter's atmosphere are affecting the planet's white zones and colorful belts, creating disturbances in their flow and even changing their color.
Thanks to coordinated observations of the planet in January 2017 by six ground-based optical and radio telescopes and NASA's Hubble Space Telescope, a University of California, Berkeley, astronomer and her colleagues have been able to track the effects of these storms -- visible as bright plumes above the planet's ammonia ice clouds -- on the belts in which they appear.
The observations will ultimately help planetary scientists understand the complex atmospheric dynamics on Jupiter, which, with its Great Red Spot and colorful, layer cake-like bands, is one of the most beautiful and changeable of the giant gas planets in the solar system.
One such plume was noticed by amateur astronomer Phil Miles in Australia a few days before the first observations by the Atacama Large Millimeter/Submillimeter Array (ALMA) in Chile, and photos captured a week later by Hubble showed that the plume had spawned a second plume and left a downstream disturbance in the band of clouds, the South Equatorial Belt. The rising plumes then interacted with Jupiter's powerful winds, which stretched the clouds east and west from their point of origin.
Three months earlier, four bright spots were seen slightly north of the North Equatorial Belt. Though those plumes had disappeared by 2017, the belt had since widened northward, and its northern edge had changed color from white to orangish brown.
"If these plumes are vigorous and continue to have convective events, they may disturb one of these entire bands over time, though it may take a few months," said study leader Imke de Pater, a UC Berkeley professor emerita of astronomy. "With these observations, we see one plume in progress and the aftereffects of the others."
The analysis of the plumes supports the theory that they originate about 80 kilometers below the cloud tops at a place dominated by clouds of liquid water. A paper describing the results has been accepted for publication in the Astronomical Journal and is now online.
Into the stratosphere
Jupiter's atmosphere is mostly hydrogen and helium, with trace amounts of methane, ammonia, hydrogen sulfide and water. The top-most cloud layer is made up of ammonia ice and comprises the brown belts and white zones we see with the naked eye. Below this outer cloud layer sits a layer of solid ammonium hydrosulfide particles. Deeper still, at around 80 kilometers below the upper cloud deck, is a layer of liquid water droplets.
The storm clouds de Pater and her team studied appear in the belts and zones as bright plumes and behave much like the cumulonimbus clouds that precede thunderstorms on Earth. Jupiter's storm clouds, like those on Earth, are often accompanied by lightning.
Optical observations cannot see below the ammonia clouds, however, so de Pater and her team have been probing deeper with radio telescopes, including ALMA and also the Very Large Array (VLA) in New Mexico, which is operated by the National Science Foundation-funded National Radio Astronomy Observatory.
The ALMA array's first observations of Jupiter were made between Jan. 3 and 5, 2017, a few days after one of these bright plumes was seen by amateur astronomers in the planet's South Equatorial Belt. A week later, Hubble, the VLA, the Gemini, Keck and Subaru observatories in Hawaii and the Very Large Telescope (VLT) in Chile captured images in the visible, radio and mid-infrared ranges.
De Pater combined the ALMA radio observations with the other data, focusing specifically on the newly brewed storm as it punched through the upper deck of ammonia ice clouds.
The data showed that these storm clouds reached as high as the tropopause -- the coldest part of the atmosphere -- where they spread out much like the anvil-shaped cumulonimbus clouds that generate lightning and thunder on Earth.
"Our ALMA observations are the first to show that high concentrations of ammonia gas are brought up during an energetic eruption," de Pater said.
The observations are consistent with one theory, called moist convection, about how these plumes form. According to this theory, convection brings a mix of ammonia and water vapor high enough -- about 80 kilometers below the cloud tops -- for the water to condense into liquid droplets. The condensing water releases heat that expands the cloud and buoys it quickly upward through other cloud layers, ultimately breaking through the ammonia ice clouds at the top of the atmosphere.
The plume's momentum carries the supercooled ammonia cloud above the existing ammonia-ice clouds until the ammonia freezes, creating a bright, white plume that stands out against the colorful bands encircling Jupiter.
"We were really lucky with these data, because they were taken just a few days after amateur astronomers found a bright plume in the South Equatorial Belt," said de Pater. "With ALMA, we observed the whole planet and saw that plume, and since ALMA probes below the cloud layers, we could actually see what was going on below the ammonia clouds."
Hubble took images a week after ALMA and captured two separate bright spots, which suggests that the plumes originate from the same source and are carried eastward by the high altitude jet stream, leading to the large disturbances seen in the belt.
Coincidentally, three months before, bright plumes had been observed north of the North Equatorial Belt. The January 2017 observations showed that that belt had expanded in width, and the band where the plumes had first been seen had turned from white to orange. De Pater suspects that the northward expansion of the North Equatorial Belt is a result of gas from the ammonia-depleted plumes falling back into the deeper atmosphere.
De Pater's colleague and co-author Robert Sault of the University of Melbourne in Australia used special computer software to analyze the ALMA data and obtain radio maps of the planet that are comparable to visible-light photos taken by Hubble.
"Jupiter's rotation once every 10 hours usually blurs radio maps, because these maps take many hours to observe," Sault said. "In addition, because of Jupiter's large size, we had to 'scan' the planet, so we could make a large mosaic in the end. We developed a technique to construct a full map of the planet."
Source :- Science Daily

Heat shield just 10 atoms thick to protect electronic devices

[Image: Overheated electronics concept illustration]
Excess heat given off by smartphones, laptops and other electronic devices can be annoying, but beyond that it contributes to malfunctions and, in extreme cases, can even cause lithium batteries to explode.
To guard against such ills, engineers often insert glass, plastic or even layers of air as insulation to prevent heat-generating components like microprocessors from causing damage or discomforting users.
Now, Stanford researchers have shown that a few layers of atomically thin materials, stacked like sheets of paper atop hot spots, can provide the same insulation as a sheet of glass 100 times thicker. In the near term, thinner heat shields will enable engineers to make electronic devices even more compact than those we have today, said Eric Pop, professor of electrical engineering and senior author of a paper published Aug. 16 in Science Advances.
"We're looking at the heat in electronic devices in an entirely new way," Pop said.
Detecting sound as heat
The heat we feel from smartphones or laptops is actually an inaudible form of high-frequency sound. If that seems crazy, consider the underlying physics. Electricity flows through wires as a stream of electrons. As these electrons move, they collide with the atoms of the materials through which they pass. With each such collision an electron causes an atom to vibrate, and the more current flows, the more collisions occur, until electrons are beating on atoms like so many hammers on so many bells -- except that this cacophony of vibrations moves through the solid material at frequencies far above the threshold of hearing, generating energy that we feel as heat.
Thinking about heat as a form of sound inspired the Stanford researchers to borrow some principles from the physical world. From his days as a radio DJ at Stanford's KZSU 90.1 FM, Pop knew that music recording studios are quiet thanks to thick glass windows that block the exterior sound. A similar principle applies to the heat shields in today's electronics. If better insulation were their only concern, the researchers could simply borrow the music studio principle and thicken their heat barriers. But that would frustrate efforts to make electronics thinner. Their solution was to borrow a trick from homeowners, who install multi-paned windows -- usually, layers of air between sheets of glass with varying thickness -- to make interiors warmer and quieter.
"We adapted that idea by creating an insulator that used several layers of atomically thin materials instead of a thick mass of glass," said postdoctoral scholar Sam Vaziri, the lead author on the paper.
Atomically thin materials are a relatively recent discovery. It was only 15 years ago that scientists were able to isolate some materials into such thin layers. The first example discovered was graphene, a single layer of carbon atoms; ever since it was found, scientists have been looking for, and experimenting with, other sheet-like materials. The Stanford team used a layer of graphene and three other sheet-like materials -- each three atoms thick -- to create a four-layered insulator just 10 atoms deep. Despite its thinness, the insulator is effective because the atomic heat vibrations are dampened and lose much of their energy as they pass through each layer.
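One way to see why stacking works is that every junction between dissimilar sheets adds a thermal boundary (Kapitza) resistance, and resistances in series add. The sketch below turns that into numbers; the per-interface resistance is an order-of-magnitude assumption for van der Waals interfaces, not a value from the paper:

```python
# Series thermal resistance of a layered heterostructure vs. bulk glass.
R_INTERFACE = 1e-7    # m^2*K/W per interface; assumed (~10 MW/m^2*K conductance)
N_INTERFACES = 4      # junctions between the four stacked sheets and the substrate
GLASS_K = 1.0         # W/(m*K), typical thermal conductivity of glass

# Conduction through the ~3 nm stack itself is negligible; interfaces dominate.
stack_resistance = N_INTERFACES * R_INTERFACE

# Thickness of glass with the same resistance: R = t / k  =>  t = R * k
equivalent_glass_nm = stack_resistance * GLASS_K * 1e9
print(f"Stack resistance:           {stack_resistance:.0e} m^2*K/W")
print(f"Equivalent glass thickness: {equivalent_glass_nm:.0f} nm, vs ~3 nm of stack")
```

With these assumed numbers, the 3 nm stack insulates like roughly 400 nm of glass, about the hundredfold ratio the paper reports.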
To make nanoscale heat shields practical, the researchers will have to find some mass production technique to spray or otherwise deposit atom-thin layers of materials onto electronic components during manufacturing. But behind the immediate goal of developing thinner insulators looms a larger ambition: Scientists hope to one day control the vibrational energy inside materials the way they now control electricity and light. As they come to understand the heat in solid objects as a form of sound, a new field of phononics is emerging, a name taken from the Greek root word behind telephone, phonograph and phonetics.
"As engineers, we know quite a lot about how to control electricity, and we're getting better with light, but we're just starting to understand how to manipulate the high-frequency sound that manifests itself as heat at the atomic scale," Pop said.
Eric Pop is an affiliate of the Precourt Institute for Energy. Stanford authors include former postdoctoral scholars Eilam Yalon and Miguel Muñoz Rojo, and graduate students Connor McClellan, Connor Bailey, Kirby Smithe, Alexander Gabourie, Victoria Chen, Sanchit Deshmukh and Saurabh Suryavanshi. Other authors are from Theiss Research and the National Institute of Standards and Technology.
This research was supported by the Stanford Nanofabrication Facility, the Stanford Nano Shared Facilities, the National Science Foundation, the Semiconductor Research Corporation, the Defense Advanced Research Projects Agency, the Air Force Office of Scientific Research, the Stanford SystemX Alliance, the Knut and Alice Wallenberg Foundation, the Stanford Graduate Fellowship program and the National Institute of Standards and Technology.
Source :- Science Daily


Australian men's life expectancy tops other men's

Australian men are now living longer than any other group of males in the world, according to new research from The Australian National University (ANU).
The study introduces a new way of measuring life expectancy, accounting for the historical mortality conditions that today's older generations lived through.
By this measure, Australian men live to 74.1 years on average.
The news is good for Australian women too; the study shows they're ranked second, behind their Swiss counterparts.
Dr Collin Payne co-led the study, which used data from 15 high-life-expectancy countries across Europe, North America and Asia.
"Popular belief has it that Japan and the Nordic countries are doing really well in terms of health, wellbeing, and longevity. But Australia is right there," Dr Payne said.
"The results have a lot to do with long term stability and the fact Australia's had a high standard of living for a really, really long time. Simple things like having enough to eat, and not seeing a lot of major conflict play a part."
Dr Payne's study grouped people by year of birth, separating 'early' deaths from 'late' deaths, to come up with the age at which someone can be considered an 'above-average' survivor.
"Most measures of life expectancy are just based on mortality rates at a given time," Dr Payne said.
"It's basically saying if you took a hypothetical group of people and put them through the mortality rates that a country experienced in 2018, for example, they would live to an average age of 80.
"But that doesn't tell you anything about the life courses of people, as they've lived through to old age.
"Our measure takes the life course into account, including mortality rates from 50, 60, or 70 years ago.
"What matters is we're comparing a group of people who were born in the same year, and so have experienced similar conditions throughout their life."
Dr Payne says this method allows us to clearly see whether someone is reaching their cohort's life expectancy.
"For example, any Australian man who's above age 74 we know with 100 per cent certainty has outlived half of his cohort -- he's an above average survivor compared to his peers born in the same year," he said.
"And those figures are higher here than anywhere else that we've measured life expectancy.
"On the other hand, any man who's died before age 74 is not living up to their cohort's life expectancy."
Dr Payne says a number of factors might have contributed to Australia jumping ahead in these new rankings.
"Mortality was really high in Japan in the 30s, 40s and 50s. In Australia, mortality was really low during that time," Dr Payne said.
"French males, for example, drop out because a lot of them died during WW2, some from direct conflict, others from childhood conditions."
Source :- Science Daily


Scorpion toxin that targets 'wasabi receptor' may help solve mystery of chronic pain

Researchers at UC San Francisco and the University of Queensland have discovered a scorpion toxin that targets the "wasabi receptor," a chemical-sensing protein found in nerve cells that's responsible for the sinus-jolting sting of wasabi and the flood of tears associated with chopping onions. Because the toxin triggers a pain response through a previously unknown mechanism, scientists think it can be used as a tool for studying chronic pain and inflammation, and may eventually lead to the development of new kinds of non-opioid pain relievers.
The scientists isolated the toxin, a short protein (or peptide) that they dubbed the "wasabi receptor toxin" (WaTx), from the venom of the Australian Black Rock scorpion. The discovery came as the researchers were conducting a systematic search for compounds in animal venom that could activate, and therefore be used to probe and study, the wasabi receptor -- a sensory protein officially named TRPA1 (pronounced "trip A1") that's embedded in sensory nerve endings throughout the body. When activated, TRPA1 opens to reveal a channel that allows sodium and calcium ions to flow into the cell, which can induce pain and inflammation.
"Think of TRPA1 as the body's 'fire alarm' for chemical irritants in the environment," said John Lin King, a doctoral student in UCSF's Neuroscience Graduate Program and lead author of a study published August 22, 2019 in Cell, which describes the toxin and its surprising mode of action. "When this receptor encounters a potentially harmful compound -- specifically, a class of chemicals known as 'reactive electrophiles,' which can cause significant damage to cells -- it is activated to let you know you're being exposed to something dangerous that you need to remove yourself from."
Cigarette smoke and environmental pollutants, for example, are rich in reactive electrophiles that can trigger TRPA1 in the cells lining the body's airways, inducing coughing fits and sustained airway inflammation. The receptor can also be activated by chemicals in pungent foods like wasabi, onions, mustard, ginger and garlic -- compounds that, according to Lin King, may have evolved to discourage animals from eating these plants. WaTx appears to have evolved for the same reason.
Though many animals use venom to paralyze or kill their prey, WaTx seems to serve a purely defensive purpose. Virtually all animals, from worms to humans, have some form of TRPA1. But the researchers found that WaTx can only activate the version found in mammals, which aren't on the menu for Black Rock scorpions, suggesting that the toxin is mainly used to ward off mammalian predators.
"Our results provide a beautiful and striking example of convergent evolution, whereby distantly related life forms -- plants and animals -- have developed defensive strategies that target the same mammalian receptor through completely distinct strategies," said David Julius, PhD, professor and chair of UCSF's Department of Physiology, and senior author of the new study.
But what the researchers found most interesting about WaTx was its mode of action. Though it triggers TRPA1, just as the compounds found in pungent plants do -- and even targets the very same site on that receptor -- the way it activates the receptor is novel and unexpected.
First, WaTx forces its way into the cell, circumventing the standard routes that place strict limits on what's allowed in and out. Most compounds, from tiny ions to large molecules, are either ingested by the cell through a complex process known as "endocytosis," or they gain entry by passing through one of the many protein channels that stud the cell's surface and act as gatekeepers.
But WaTx contains an unusual sequence of amino acids that allows it to simply penetrate the cell's membrane and pass right through to the cell's interior. Few other proteins are capable of the same feat. The most famous example is an HIV protein called Tat, but surprisingly, WaTx contains no sequences similar to those found in Tat or in any other protein that can pass through the cell's membrane.
"It was surprising to find a toxin that can pass directly through membranes. This is unusual for peptide toxins," Lin King said. "But it's also exciting because if you understand how these peptides get across the membrane, you might be able to use them to carry things -- drugs, for example -- into the cell that can't normally get across membranes."
Once inside the cell, WaTx attaches itself to a site on TRPA1 known as the "allosteric nexus," the very same site targeted by pungent plant compounds and environmental irritants like smoke. But that's where the similarities end.
Plant and environmental irritants alter the chemistry of the allosteric nexus, which causes the TRPA1 channel to rapidly flutter open and closed. This allows positively charged sodium and calcium ions to flow into the cell, triggering pain. Though both ions are able to enter when TRPA1 is activated by these irritants, the channel exhibits a strong preference for calcium and lets much more of it into the cell, which leads to inflammation. By contrast, WaTx wedges itself into the allosteric nexus and props the channel open. This abolishes its preference for calcium. As a result, overall ion levels are high enough to trigger a pain response, but calcium levels remain too low to initiate inflammation.
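A toy calculation makes the decoupling concrete. Suppose -- purely for illustration; these thresholds and numbers are invented, not taken from the Cell paper -- that pain fires when total cation influx crosses one threshold, while inflammation requires the calcium share alone to cross another:

    # Toy model of the two gating modes described above.
    # All numbers are invented for illustration.
    PAIN_THRESHOLD = 30.0          # total cation influx that signals pain
    INFLAMMATION_THRESHOLD = 25.0  # Ca2+ influx that drives inflammation

    def channel_response(open_fraction, ca_fraction):
        """Classify the response from how often the channel is open and
        what share of the incoming current is carried by Ca2+."""
        total = open_fraction * 100.0   # total cation influx (arb. units)
        calcium = total * ca_fraction   # Ca2+ share of that influx
        return (total > PAIN_THRESHOLD, calcium > INFLAMMATION_THRESHOLD)

    # Plant irritant: channel flutters open/closed but prefers Ca2+.
    print("irritant -> pain, inflammation:", channel_response(0.4, 0.8))
    # WaTx: channel propped open, Ca2+ preference abolished.
    print("WaTx     -> pain, inflammation:", channel_response(0.9, 0.2))

In this toy model both scenarios cross the pain threshold, but only the irritant crosses the inflammation threshold -- the same dissociation the researchers then tested in mice.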
To demonstrate this, the researchers injected either mustard oil, a plant irritant known to activate the wasabi receptor, or WaTx into the paws of mice. With mustard oil, they observed acute pain, hypersensitivity to temperature and touch -- key hallmarks of chronic pain -- and inflammation, as evidenced by significant swelling. But with WaTx, they observed acute pain and pain hypersensitivities, but no swelling.
"When triggered by calcium, nerve cells can release pro-inflammatory signals that tell the immune system that something's wrong and needs to be repaired," Lin King said. "This 'neurogenic inflammation' is one of the key processes that becomes dysregulated in chronic pain. Our results suggest that you can decouple the protective acute pain response from the inflammation that establishes chronic pain. Achieving this goal, if only in principle, has been a longstanding aim in the field."
The researchers believe their findings will lead to a better understanding of acute pain, as well as of the link between chronic pain and inflammation -- processes that had previously been experimentally indistinguishable. The findings may even lay the groundwork for the development of new pain drugs.
"The discovery of this toxin provides scientists with a new tool that can be used to probe the molecular mechanisms of pain, in particular, to selectively probe the processes that lead to pain hypersensitivity," Lin King said. "And for those interested in drug discovery, our findings underscore the promise of TRPA1 as a target for new classes of non-opioid analgesics to treat chronic pain."
Source :- Science Daily

Novel C. diff structures are required for infection, offer new therapeutic targets

  Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...