Friday, 28 June 2019

Cyanide compounds discovered in meteorites may hold clues to the origin of life

Cyanide and carbon monoxide are both deadly poisons to humans, but compounds containing iron, cyanide, and carbon monoxide discovered in carbon-rich meteorites by a team of scientists at Boise State University and NASA may have helped power life on early Earth. The extraterrestrial compounds found in meteorites resemble the active site of hydrogenases, which are enzymes that provide energy to bacteria and archaea by breaking down hydrogen gas (H2). Their results suggest that these compounds were also present on early Earth, before life began, during a period of time when Earth was constantly bombarded by meteorites and the atmosphere was likely more hydrogen-rich.
"When most people think of cyanide, they think of spy movies -- a guy swallowing a pill, foaming at the mouth and dying, but cyanide was probably an essential compound for building molecules necessary for life," explained Dr. Karen Smith, senior research scientist at Boise State University, Boise, Idaho. Cyanide, a carbon atom bound to a nitrogen atom, is thought to be crucial for the origin of life, as it is involved in the non-biological synthesis of organic compounds like amino acids and nucleobases, which are the building blocks of proteins and nucleic acids used by all known forms of life.
Smith is lead author of a paper on this research published June 25 in Nature Communications. Smith, along with Boise State assistant professor Mike Callahan, a co-author on the paper, developed new analytical methods to extract and measure ancient traces of cyanide in meteorites. They found that the meteorites containing cyanide belong to a group of carbon-rich meteorites called CM chondrites. Other types of meteorites tested, including a Martian meteorite, contained no cyanide.
"Data collected by NASA's OSIRIS-REx spacecraft of asteroid Bennu indicate that it is related to CM chondrites," said co-author Jason Dworkin of NASA's Goddard Space Flight Center in Greenbelt, Maryland. "OSIRIS-REx will deliver a sample from Bennu to study on Earth in 2023. We will search for these very compounds to try to connect Bennu to known meteorites and to understand the potential delivery of prebiotic compounds such as cyanide, which may have helped start life on the early Earth or other bodies in the solar system."
Cyanide has been found in meteorites before. However, in the new work, Smith and Callahan were surprised to discover that cyanide and carbon monoxide (CO) were binding with iron to form stable compounds in the meteorites. They identified two different iron cyano-carbonyl complexes in the meteorites using high-resolution liquid chromatography-mass spectrometry. "One of the most interesting observations from our study is that these iron cyano-carbonyl complexes resemble portions of the active sites of hydrogenases, which have a very distinct structure," Callahan said.
Hydrogenases are present in almost all modern bacteria and archaea and are widely believed to be ancient in origin. Hydrogenases are large proteins, but the active site -- the region where chemical reactions take place -- happens to be a much smaller metal-organic compound contained within the protein, according to Callahan. It is this compound that resembles the cyanide-bearing compounds the team discovered in meteorites.
An enduring mystery regarding the origin of life is how biology could have arisen from non-biological chemical processes. The similarities between the active sites in hydrogenase enzymes and the cyanide compounds the team found in meteorites suggests that non-biological processes in the parent asteroids of meteorites and on ancient Earth could have made molecules useful to emerging life.
"Cyanide and carbon monoxide attached to a metal are unusual and rare in enzymes. Hydrogenases are the exception. When you compare the structure of these iron cyano-carbonyl complexes in meteorites to these active sites in hydrogenases, it makes you wonder if there was a link between the two," Smith added. "It's possible that iron cyano-carbonyl complexes may have been a precursor to these active sites and later incorporated into proteins billions of years ago. These complexes probably acted as sources of cyanide on early Earth as well."

Algorithm designed to map universe, solve mysteries

Cornell University researchers have developed an algorithm designed to visualize models of the universe in order to solve some of physics' greatest mysteries.
The algorithm was developed by applying scientific principles used to create models for understanding cell biology and physics to the challenges of cosmology and big data.
"Science works because things behave much more simply than they have any right to," said professor of physics James Sethna. "Very complicated things end up doing rather simple collective behavior."
Sethna is the senior author of "Visualizing Probabilistic Models With Intensive Principal Component Analysis," published in the Proceedings of the National Academy of Sciences.
The algorithm, designed by first author Katherine Quinn, allows researchers to image a large set of probabilities to look for patterns or other information that might be useful, and provides them with better intuition for understanding complex models and data.
"A person can't just sit down and do it," Quinn said. "We need better algorithms that can extract what we're interested in, without being told what to look for. We can't just say, 'Look for interesting universes.' This algorithm is a way of untangling information in a way that can reveal the interesting structure of the data."
Further complicating the researchers' task was the fact that the data consists of ranges of probabilities, rather than raw images or numbers.
Their solution takes advantage of different properties of probability distributions to visualize a collection of things that could happen. In addition to cosmology, their model has applications to machine learning and statistical physics, which also work in terms of predictions.
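The flavor of this approach can be shown with a toy sketch in Python. This is not the authors' InPCA implementation; it simply embeds a family of probability distributions (here, binomials with varying bias) as vectors of square-root probabilities -- a standard trick that makes Euclidean distance approximate the Hellinger distance between distributions -- and then runs ordinary PCA to get a 2D map of the model family.

    import numpy as np
    from scipy.stats import binom

    # Toy sketch (not the paper's InPCA): map a family of probability
    # distributions to points in a plane so similar distributions land
    # near each other.
    thetas = np.linspace(0.05, 0.95, 40)      # model parameters
    n = 20
    ks = np.arange(n + 1)
    P = np.array([binom.pmf(ks, n, t) for t in thetas])  # one row per model

    # Square-root embedding: Euclidean distance between rows of X then
    # approximates the Hellinger distance between distributions.
    X = np.sqrt(P)
    X = X - X.mean(axis=0)                    # center before PCA

    # PCA via SVD: the first two components give a 2D "map" in which
    # each point is one candidate model (here, one binomial bias).
    U, S, _ = np.linalg.svd(X, full_matrices=False)
    coords = U[:, :2] * S[:2]
    print(coords.shape)                       # (40, 2)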
To test the algorithm, the researchers used data from the European Space Agency's Planck satellite, and studied it with co-author Michael Niemack, associate professor of physics. They applied the model to data on the cosmic microwave background -- radiation left over from the universe's early days.
The model produced a map depicting possible characteristics of different universes, of which our own universe is one point.
This new method of visualizing the qualities of our universe highlights the hierarchical structure of the dark energy and dark matter dominated model that fits the cosmic microwave background data so well. These visualizations present a promising approach for optimizing cosmological measurements in the future, Niemack said.
Next, researchers will try to expand this approach to allow for more parameters for each data point. Mapping such data could reveal new information about our universe, other possible universes or dark energy -- which appears to be the dominant form of energy in our universe but about which physicists still know little.

The first AI universe sim is fast and accurate and its creators don't know how it works

For the first time, astrophysicists have used artificial intelligence techniques to generate complex 3D simulations of the universe. The results are so fast, accurate and robust that even the creators aren't sure how it all works.
"We can run these simulations in a few milliseconds, while other 'fast' simulations take a couple of minutes," says study co-author Shirley Ho, a group leader at the Flatiron Institute's Center for Computational Astrophysics in New York City and an adjunct professor at Carnegie Mellon University. "Not only that, but we're much more accurate."
The speed and accuracy of the project, called the Deep Density Displacement Model, or D3M for short, weren't the biggest surprise to the researchers. The real shock was that D3M could accurately simulate how the universe would look if certain parameters were tweaked -- such as how much of the cosmos is dark matter -- even though the model had never received any training data where those parameters varied.
"It's like teaching image recognition software with lots of pictures of cats and dogs, but then it's able to recognize elephants," Ho explains. "Nobody knows how it does this, and it's a great mystery to be solved."
Ho and her colleagues present D3M June 24 in the Proceedings of the National Academy of Sciences. The study was led by Siyu He, a Flatiron Institute research analyst.
Ho and He worked in collaboration with Yin Li of the Berkeley Center for Cosmological Physics at the University of California, Berkeley, and the Kavli Institute for the Physics and Mathematics of the Universe near Tokyo; Yu Feng of the Berkeley Center for Cosmological Physics; Wei Chen of the Flatiron Institute; Siamak Ravanbakhsh of the University of British Columbia in Vancouver; and Barnabás Póczos of Carnegie Mellon University.
Computer simulations like those made by D3M have become essential to theoretical astrophysics. Scientists want to know how the cosmos might evolve under various scenarios, such as if the dark energy pulling the universe apart varied over time. Such studies require running thousands of simulations, making a lightning-fast and highly accurate computer model one of the major objectives of modern astrophysics.
D3M models how gravity shapes the universe. The researchers opted to focus on gravity alone because it is by far the most important force when it comes to the large-scale evolution of the cosmos.
The most accurate universe simulations calculate how gravity shifts each of billions of individual particles over the entire age of the universe. That level of accuracy takes time, requiring around 300 computation hours for one simulation. Faster methods can finish the same simulations in about two minutes, but the shortcuts required result in lower accuracy.
Ho, He and their colleagues honed the deep neural network that powers D3M by feeding it 8,000 different simulations from one of the highest-accuracy models available. Neural networks take training data and run calculations on the information; researchers then compare the resulting outcome with the expected outcome. With further training, neural networks adapt over time to yield faster and more accurate results.
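The train-compare-adapt cycle described here is the standard supervised-learning loop. Below is a minimal PyTorch sketch of that cycle; the tiny stand-in network and the random training pairs are placeholders for illustration, not the actual D3M architecture or its simulation data.

    import torch
    import torch.nn as nn

    # Stand-in network; the real D3M is a deep model over 3D particle data.
    model = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 3))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Hypothetical training pairs standing in for the 8,000 simulations:
    # inputs are initial conditions, targets the high-accuracy outcomes.
    inputs = torch.randn(8000, 3)
    targets = torch.randn(8000, 3)

    for epoch in range(10):
        predictions = model(inputs)           # run calculations on the data
        loss = loss_fn(predictions, targets)  # compare with expected outcome
        optimizer.zero_grad()
        loss.backward()                       # adapt weights to shrink error
        optimizer.step()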
After training D3M, the researchers ran simulations of a box-shaped universe 600 million light-years across and compared the results to those of the slow and fast models. Whereas the slow-but-accurate approach took hundreds of hours of computation time per simulation and the existing fast method took a couple of minutes, D3M could complete a simulation in just 30 milliseconds.
D3M also churned out accurate results. When compared with the high-accuracy model, D3M had a relative error of 2.8 percent. Using the same comparison, the existing fast model had a relative error of 9.3 percent.
D3M's remarkable ability to handle parameter variations not found in its training data makes it an especially useful and flexible tool, Ho says. In addition to modeling other forces, such as hydrodynamics, Ho's team hopes to learn more about how the model works under the hood. Doing so could yield benefits for the advancement of artificial intelligence and machine learning, Ho says.
"We can be an interesting playground for a machine learner to use to see why this model extrapolates so well, why it extrapolates to elephants instead of just recognizing cats and dogs," she says. "It's a two-way street between science and deep learning."

Cosmic cat and mouse: Astronomers capture and tag a fleeting radio burst

An Australian-led team of astronomers using the Gemini South telescope in Chile have successfully confirmed the distance to a galaxy hosting an intense radio burst that flashed only once and lasted but a thousandth of a second. The team made the initial discovery of the fast radio burst (FRB) using the Australian Square Kilometre Array Pathfinder (ASKAP) radio telescope.
The Gemini observations were critical in verifying that the burst left its host galaxy some 4 billion years ago.
Since the first FRB discovery in 2007, these mysterious objects have played a game of cosmic cat-and-mouse with astronomers -- with astronomers as the sharp-eyed cats! Fleeting radio outbursts, lasting about a millisecond (one-thousandth of one second), are difficult to detect, and even more difficult to locate precisely. In this case, the FRB, known as FRB 180924, was a single burst, unlike others that can flash multiple times over an extended period.
"It is especially challenging to pinpoint FRBs that only flash once and are gone," said Keith Bannister of Australia's Commonwealth Science and Industrial Research Organisation (CSIRO), who led the Australian team in the search effort. However, Bannister and his team did just that, which is a first.
The result is published in the June 27th issue of the journal Science.
The momentary pulse was first spotted in September 2018 during a dedicated search for FRBs using ASKAP -- a 36-antenna array of radio telescopes working together as a single instrument in Western Australia -- which also pinpointed the signal's location in the sky.
The researchers used the minuscule differences in the amount of time it takes for the light to reach different antennas in the array to zoom in on the host galaxy's location. "From these tiny time differences -- just a fraction of a billionth of a second -- we identified the burst's home galaxy," said team member Adam Deller, of Swinburne University of Technology.
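The geometry behind that sentence can be sketched in a few lines. Under a plane-wave assumption, the arrival-time difference at each antenna is the projection of its baseline onto the source direction divided by the speed of light, so a least-squares fit over the measured delays recovers the direction. The array layout below is invented for illustration; it is not ASKAP's actual pipeline.

    import numpy as np

    c = 299_792_458.0                             # speed of light, m/s
    rng = np.random.default_rng(0)

    # Invented flat array of 36 dishes (illustrative, not ASKAP's layout).
    antennas = rng.uniform(-3000, 3000, size=(36, 3))
    antennas[:, 2] = 0.0

    # A unit vector pointing at the (unknown) burst.
    true_dir = np.array([0.3, 0.2, np.sqrt(1 - 0.3**2 - 0.2**2)])

    # Plane wave: the delay at each antenna relative to the first is the
    # baseline projected onto the source direction, divided by c.
    baselines = antennas - antennas[0]
    delays = -(baselines @ true_dir) / c          # arrival-time differences, s

    # Least-squares fit for the sky-plane components, then use |s| = 1
    # to recover the vertical component.
    sxy, *_ = np.linalg.lstsq(-baselines[:, :2] / c, delays, rcond=None)
    s = np.array([sxy[0], sxy[1], np.sqrt(1 - sxy @ sxy)])
    print(np.degrees(np.arccos(np.clip(s @ true_dir, -1, 1))))  # ~0 degrees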
Once pinpointed, the team enlisted the Gemini South telescope, along with the W.M. Keck Observatory and European Southern Observatory's Very Large Telescope (VLT) to determine the FRB's distance and other characteristics by carefully observing the galaxy that hosted the outburst. "The Gemini South data absolutely confirmed that the light left the galaxy about 4 billion years ago," said Nicolas Tejos of Pontificia Universidad Católica de Valparaíso, who led the Gemini observations.
"ASKAP gave us the two-dimensional position in the sky, but the Gemini, Keck, and VLT observations locked down the distance, which completes the three-dimensional picture," said Tejos.
"When we managed to get a position for FRB 180924 that was good to 0.1 arcsecond, we knew that it would tell us not just which object was the host galaxy, but also where within the host galaxy it occurred," said Deller. "We found that the FRB was located away from the galaxy's core, out in the 'galactic suburbs.'"
"The Gemini telescopes were designed with observations like this in mind," said Ralph Gaume, Deputy Division Director of the US National Science Foundation (NSF) Division of Astronomical Sciences, which provides funding for the US portion of the Gemini Observatory international partnership. Knowing where an FRB occurs in a galaxy of this type is important because it enables astronomers to get some hint of what the FRB progenitor might have been. "And for that," Gaume continues, "we need images and spectroscopy with superior image quality and depth, which is why Gemini and the optical and infrared observatory observations in this study were so important."
Localizing FRBs is critical to understanding what causes the flashes, which is still uncertain: to explain the high energies and short timescales, most theories invoke the presence of a massive yet very compact object such as a black hole or a highly magnetic neutron star. Finding where the bursts occur would tell us whether it is the formation, evolution, or collision and destruction of these objects that generates the radio bursts.
"Much like gamma-ray bursts two decades ago, or the more recent detection of gravitational wave events, we stand on the cusp of an exciting new era where we are about to learn where fast radio bursts take place," said team member Stuart Ryder of Macquarie University, Australia. Ryder also noted that by knowing where within a galaxy FRBs occur, astronomers hope to learn more about what causes them, or at least rule out some of the many models. "Ultimately though," Ryder continued, "our goal is to use FRBs as cosmological probes, in much the same way that we use gamma ray bursts, quasars, and supernovae." According to Ryder, such a map could pinpoint the location of the 'missing baryons,' (baryons are the subatomic building blocks of matter) which standard models predict must be out there, but which don't show up using other probes.
By pinpointing the bursts and how far their light has traveled, astronomers can also obtain "core samples" of the intervening material between us and the flashes. With a large sample of FRB host galaxies, astronomers could conduct "cosmic tomography" to build the first 3D map of where baryons are located between galaxies. On that note, Tejos added: "Once we have a large sample of FRBs with known distances, we will also have a revolutionary new method for measuring the amount of matter in the cosmic web!"
To date, only one other fast radio burst (FRB 121102) has been localized, and it had a repeating signal that flashed more than 150 times. While both single and multiple-flash FRBs are relatively rare, single FRBs are more common than repeating ones. The discovery of FRB 180924, then, could lead the way for future methods of localization.
"Fast turnaround follow-up contributions from Gemini Observatory will be especially significant in the future of time-domain astronomy," Tejos said, "as it promises not only to help astronomers perfect the study of transient phenomena, but perhaps alter our perceptions of the Universe."

New AI tool captures top players' strategies in RNA video game

A new artificial-intelligence tool captures strategies used by top players of an internet-based videogame to design new RNA molecules. Rohan Koodli and colleagues at the Eterna massive open laboratory present the tool, called EternaBrain, in PLOS Computational Biology. Eterna is directed by the lab of Prof. Rhiju Das at the Stanford University School of Medicine in California.
Found naturally in all living cells, RNA molecules perform essential biological functions. Recent years have seen strong interest in designing new RNA structures for use in cancer treatment, CRISPR gene editing, and more. However, every RNA structure consists of a long sequence of four building blocks, and determining the precise sequence needed to build a given structure can be computationally difficult.
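To get a feel for why this inverse-folding problem is hard: a folding engine maps a sequence to a structure cheaply, but inverting that map means searching a space of 4^N sequences. The sketch below does naive hill climbing against a tiny target structure, assuming the ViennaRNA Python bindings (import RNA) are installed; it illustrates the search problem Eterna players solve by hand, and is not the EternaBrain method.

    import random
    import RNA  # ViennaRNA Python bindings (assumed installed)

    target = "((((....))))"            # tiny hairpin in dot-bracket notation
    bases = "ACGU"

    def mismatches(seq):
        structure, _energy = RNA.fold(seq)   # predicted secondary structure
        return sum(a != b for a, b in zip(structure, target))

    seq = [random.choice(bases) for _ in target]
    best = mismatches("".join(seq))

    for _ in range(5000):
        if best == 0:
            break                       # sequence now folds into the target
        i = random.randrange(len(seq))
        old, seq[i] = seq[i], random.choice(bases)
        d = mismatches("".join(seq))
        if d <= best:
            best = d                    # keep the non-worsening mutation
        else:
            seq[i] = old                # revert

    print("".join(seq), "mismatches:", best)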
In the new study, Koodli, Das, and colleagues carried out research through the Eterna internet-based videogame, a citizen-science initiative to tackle the computational challenges of RNA design. Eterna presents each player with a target RNA structure, and the player attempts to discover an RNA sequence that allows the finished molecule to fold into the desired shape. Some players outperform the best computer-automated methods in solving these challenges.
Using a dataset of 1.8 million design choices made by Eterna players, the researchers developed an artificial neural network that captures some of the predilections and strategies of these experts. Called EternaBrain, this approach can predict the choices of the best players with significantly better accuracy than random guessing. An extended EternaBrain algorithm performs as well as or better than previously developed algorithms in solving Eterna challenges.
"Our findings suggest that it should be possible to create automatic algorithms for computer RNA design that emulate or outperform human RNA designers," Das says. "But we're not there yet; we still have a lot to learn from both gamers and AI researchers."
Next, the researchers will see if they can outperform top players by integrating EternaBrain with other computational approaches to RNA design. "We also hope to apply EternaBrain to more complicated problems being tackled by Eterna players, including the design of RNA computers and 3D machines, and the learning of design rules from actual wet-lab data," Das says.

Experiment reverses the direction of heat flow

Heat flows from hot to cold objects. When a hot and a cold body are in thermal contact, they exchange heat energy until they reach thermal equilibrium, with the hot body cooling down and the cold body warming up. This is a natural phenomenon we experience all the time.
It is explained by the second law of thermodynamics, which states that the total entropy of an isolated system always tends to increase over time until it reaches a maximum. Entropy is a quantitative measure of the disorder in a system. Isolated systems evolve spontaneously toward increasingly disordered states and lack of differentiation.
An experiment conducted by researchers at the Brazilian Center for Research in Physics (CBPF) and the Federal University of the ABC (UFABC), as well as collaborators at other institutions in Brazil and elsewhere, has shown that quantum correlations affect the way entropy is distributed among parts in thermal contact, reversing the direction of the so-called "thermodynamic arrow of time."
In other words, heat can flow spontaneously from a cold object to a hot object without the need to invest energy in the process, as is required by a domestic fridge. An article describing the experiment with theoretical considerations has just been published in Nature Communications.
The first author of the article, Kaonan Micadei, completed his PhD under the supervision of Professor Roberto Serra and is now doing postdoctoral research in Germany. Serra, also one of the authors of the article, was supported by São Paulo Research Foundation -- FAPESP via Brazil's National Institute of Science and Technology in Quantum Information. FAPESP also awarded two research grants linked to the project to another coauthor, Gabriel Teixeira Landi, a professor at the University of São Paulo's Physics Institute (IF-USP).
"Correlations can be said to represent information shared among different systems. In the macroscopic world described by classical physics, the addition of energy from outside can reverse the flow of heat in a system so that it flows from cold to hot. This is what happens in an ordinary refrigerator, for example," Serra told.
"It's possible to say that in our nanoscopic experiment, the quantum correlations produced an analogous effect to that of added energy. The direction of flow was reversed without violating the second law of thermodynamics. On the contrary, if we take into account elements of information theory in describing the transfer of heat, we find a generalized form of the second law and demonstrate the role of quantum correlations in the process."
The experiment was performed with a sample of chloroform molecules (a hydrogen atom, a carbon atom and three chlorine atoms) marked with a carbon-13 isotope. The sample was diluted in solution and studied using a nuclear magnetic resonance spectrometer, similar to the MRI scanners used in hospitals but with a much stronger magnetic field.
"We investigated temperature changes in the spins of the nuclei of the hydrogen and carbon atoms. The chlorine atoms had no material role in the experiment. We used radio frequency pulses to place the spin of each nucleus at a different temperature, one cooler, another warmer. The temperature differences were small, on the order of tens of billionths of 1 Kelvin, but we now have techniques that enable us to manipulate and measure quantum systems with extreme precision. In this case, we measured the radio frequency fluctuations produced by the atomic nuclei," Serra said.
The researchers explored two situations: in one, the hydrogen and carbon nuclei began the process uncorrelated, and in the other, they were initially quantum-correlated.
"In the first case, with the nuclei uncorrelated, we observed heat flowing in the usual direction, from hot to cold, until both nuclei were at the same temperature. In the second, with the nuclei initially correlated, we observed heat flowing in the opposite direction, from cold to hot. The effect lasted a few thousandths of a second, until the initial correlation was consumed," Serra explained.
The most noteworthy aspect of this result is that it suggests a process of quantum refrigeration in which the addition of external energy (as is done in refrigerators and air conditioners to cool a specific environment) can be replaced by correlations, i.e., an exchange of information between objects.
Maxwell's demon
The idea that information can be used to reverse the direction of heat flow -- in other words, to bring about a local decrease in entropy -- arose in classical physics in the mid-nineteenth century, long before information theory was invented.
It was a thought experiment proposed in 1867 by James Clerk Maxwell (1831-1879), who, among other things, created the famous classical electromagnetism equations. In this thought experiment, which sparked a heated controversy at the time, the great Scottish physicist said that if there were a being capable of knowing the speed of each molecule of a gas and of manipulating all the molecules at the microscopic scale, this being could separate them into two containers, placing faster-than-average molecules in one to create a hot compartment and slower-than-average molecules in the other to create a cold compartment. In this manner, a gas initially in thermal equilibrium owing to a mixture of faster and slower molecules would evolve to a differentiated state with less entropy.
Maxwell intended the thought experiment to prove that the second law of thermodynamics was merely statistical.
"The being he proposed, which was capable of intervening in the material world at the molecular or atomic scale, became known as 'Maxwell's demon'. It was a fiction invented by Maxwell to present his point of view. However, we're now actually able to operate at the atomic or even smaller scales, so that usual expectations are modified," Serra said.
The experiment conducted by Serra and collaborators and described in the article just published is a demonstration of this. It did not reproduce Maxwell's thought experiment, of course, but it produced an analogous result.
"When we talk about information, we're not referring to something intangible. Information requires a physical substrate, a memory. If you want to erase 1 bit of memory from a flash drive, you have to expend 10,000 times a minimum amount of energy consisting of the Boltzmann constant times the absolute temperature. This minimum of energy necessary to erase information is known as Landauer's principle. This explains why erasing information generates heat. Notebook batteries are consumed by heat more than anything else," Serra said.
What the researchers observed was that the information present in the quantum correlations can be used to perform work, in this case the transfer of heat from a colder to a hotter object, without consuming external energy.
"We can quantify the correlation of two systems by means of bits. Connections between quantum mechanics and information theory are creating what is known as quantum information science. From the practical standpoint, the effect we studied could one day be used to cool part of a quantum computer's processor," Serra said.

New technology gives insight into how nanomaterials form and grow

A new form of electron microscopy allows researchers to examine nanoscale tubular materials while they are "alive" and forming in liquid -- a first in the field.
Developed by a multidisciplinary team at Northwestern University and the University of Tennessee, the new technique, called variable temperature liquid-phase transmission electron microscopy (VT-LPTEM), allows researchers to investigate these dynamic, sensitive materials with high resolution. With this information, researchers can better understand how nanomaterials grow, form and evolve.
"Until now, we could only look at 'dead,' static materials," said Northwestern's Nathan Gianneschi, who co-led the study. "This new technique allows us to examine dynamics directly -- something that could not be done before."
The paper was published online this week in the Journal of the American Chemical Society.
Gianneschi is the Jacob and Rosaline Cohn Professor of Chemistry in Northwestern's Weinberg College of Arts and Sciences, professor of materials science and engineering and biomedical engineering in the McCormick School of Engineering, and associate director of the International Institute for Nanotechnology. He co-led the study with David Jenkins, associate professor of chemistry at University of Tennessee, Knoxville.
After live-cell imaging became possible in the early 20th century, it revolutionized the field of biology. For the first time, scientists could watch living cells as they actively developed, migrated and performed vital functions. Before, researchers could only study dead, fixed cells. The technological leap provided critical insight into the nature and behavior of cells and tissues.
"We think LPTEM could do for nanoscience what live-cell light microscopy has done for biology," Gianneschi said.
LPTEM allows researchers to mix components and perform chemical reactions while watching them unfold beneath a transmission electron microscope.
In this work, Gianneschi, Jenkins and their teams studied metal-organic nanotubes (MONTs). A subclass of metal-organic frameworks, MONTs have high potential for use as nanowires in miniature electronic devices, nanoscale lasers, semiconductors and sensors for detecting cancer biomarkers and virus particles. MONTs, however, are little explored because the key to unlocking their potential lies in understanding how they are formed.
For the first time, the Northwestern and University of Tennessee team watched MONTs form with LPTEM and made the first measurements of finite bundles of MONTs on the nanometer scale.

Finding the stiffest pattern of carbon fibers

Additive manufacturing built an early following with 3D printers using polymers to create a solid object from a Computer-Aided Design model. The materials used were neat polymers -- perfect for a rapid prototype, but not commonly used as structural materials.
A new wave of additive manufacturing uses polymer composites that are extruded from a nozzle as an epoxy resin reinforced with short, chopped carbon fibers. The fibers make the material stronger, much like rebar in a cement sidewalk. The resulting object is much stiffer and stronger than resin on its own.
The question a recent University of Illinois at Urbana-Champaign study set out to answer concerns which configuration or pattern of carbon fibers in the layers of extruded resin will result in the stiffest material.
John Lambros, Willett Professor in the Department of Aerospace Engineering and director of the Advanced Materials Testing and Evaluation Laboratory at U of I, was approached by an additive manufacturing research group at Lawrence Livermore National Laboratory to test composite parts that they had created using a direct ink writing technique.
"The carbon fibers are small, about seven microns in diameter and 500 microns in length," Lambros said. "It's easier with a microscope but you can certainly see a bundle with the naked eye. The fibers are mostly aligned in the extruded resin, which is like a glue that holds the fibers in place. The Lawrence Livermore group provided the parts, created with several different configurations and one made without any embedded fibers as a control. One of the parts had been theoretically optimized for maximum stiffness, but the group wanted definitive experimental corroboration of the optimization process."
While waiting for the actual additively manufactured composite samples, Lambros and his student made their own "dummy" samples out of Plexiglas so they could begin testing.
In this case, the shape being tested was a clevis joint -- a small, oval-shaped plate with two holes used to connect two other surfaces. For each different sample shape, Lambros' lab must create a unique loading fixture to test it.
"We create the stands, the grips, and everything -- how they'll be painted, how the cameras will record the tests, and so on," Lambros said. "When we got the real samples, they weren't exactly the same shape. The thickness was a bit different than our Plexiglas ones, so we made new spacers and worked it out in the end. From the mechanics side, we must be very cautious. It's necessary to use precision so as to be confident that any eventual certification of additively manufactured parts is done properly."
"We created an experimental framework to validate the optimal pattern of the short-fiber reinforced composite material," Lambros said. "As the loading machine strained the clevis joint plates, we used a digital image correlation technique to measure the displacement field across the surface of each sample by tracking the motion in the pixel intensity values of a series of digital images taken as the sample deforms. A random speckle pattern is applied to the sample surface and serves to identify subsets of the digital images in a unique fashion so they can be tracked during deformation."
They tested one control sample and four different configurations, including the one believed to be optimized for stiffness, which had a wavy fiber pattern rather than one oriented along horizontal or vertical lines.
"Each sample clevis joint plate had 12 layers in a stack. The optimized one had curved deposition lines and gaps between them," Lambros said. "According to the Livermore group's predictions, the gaps are there by design, because you don't need more material than this to provide the optimal stiffness. That's what we tested. We passed loading pins through the holes, then pulled each sample to the point of breaking, recording the amount of load and the displacement.
"The configuration that they predicted would be optimal, was indeed optimal. The least optimal was the control sample, which is just resin -- as you would expect because there are no fibers in it."
Lambros said that there is a premise in the analysis that this is a global optimum -- meaning that this is the absolute best possible sample built for stiffness, and no other build pattern is better than this one.
"Although of course we only tested four configurations, it does look like the optimized configuration may be the absolute best in practice because the configurations that would most commonly be used in design, such as 0°-90° or ±45° alignments, were more compliant or less stiff than what this one was," Lambros said. "The interesting thing that we found is that the sample optimized to be the stiffest also turned out to be the strongest. So, if you look at where they break, this one is at the highest load. This was somewhat unexpected in the sense that they had not optimized for this feature. In fact, the optimized sample was also a bit lighter than the others, so if you look at specific load, the failure load per unit weight, it's a lot higher. It's quite a bit stronger than the other ones. And why that is the case is something that we're going to investigate next."
Lambros said there may be more testing done in the future, but for now, his team successfully demonstrated that they could provide a validation for the optimized additive composite build.

Wednesday, 26 June 2019

Why money cannot 'buy' housework

If a man is handy with the vacuum cleaner, isn't averse to rustling up a lush family meal most nights after he's put on the washing machine, having popped into the supermarket on his way home, then it's more than likely his partner will have her own bank account.
A new study by Lancaster University reveals the way in which couples manage their money tells 'a tale of two marriages' in the UK today.
The research shows the management of household finances and control of financial decisions are linked to the time spent by women and men on routine housework such as cooking, cleaning, laundry and grocery shopping.
The study, 'What about money? Earnings, household financial organization, and housework', conducted by Lancaster lecturer in sociology Dr Yang Hu, is published today in the Journal of Marriage and Family.
The study analysed data from more than 6,000 heterosexual couples, aged 20 to 59, from the UK Household Longitudinal Survey (Understanding Society).
This is the first study to examine how the organisation of household finances mediates the link between couples' pay packets and the housework done at home, taking previous studies on housework a step further.
"Housework provides a window into the 'checks and balances' of power and gender in couple relationships," said Dr Hu.
To negotiate their housework participation, men take one of two routes: they hand over their income to their partners, who manage the money, using it to 'exchange' their way out of housework; or they hold on to their income to 'bargain' their way out of housework.
"Men get away with not doing housework through both channels," explains Dr Hu. "It puts women in a very compromising position as they are left to do the lion's share of housework."
Given wage penalties and a glass ceiling in the labour market, women are unlikely to win the 'war' over housework by 'exchanging' and 'bargaining' with men.
They are seen to negotiate in a different way. Going 'solo', some working women are seen to opt out of housework by taking control of their own earnings and developing a sense of autonomy.
Indeed, women's income only reduces their housework time when they can access their own earnings and have a say in household financial decisions. But the study finds that in the UK, less than 12% of working-age women kept separate purses, another 23% managed household finances, and only around 15% controlled financial decisions.
"Our research provides further evidence to show that despite women's participation in education and the labour market, this still has not yet translated into gender equality in housework at home," said Dr Hu.
"If men still monopolise the management of household finances and financial decisions, then things are unlikely to change," said Dr Hu. "It's therefore important for everyone to be able to access their own earnings.
"Educating and employing more women and settling the gender pay gap with gender equality flowing neatly into place at home as a result is certainly not the story this analysis is revealing."

Performance-enhancing bacteria found in the microbiomes of elite athletes

New research has identified a type of bacteria found in the microbiomes of elite athletes that contributes to improved capacity for exercise. These bacteria, members of the genus Veillonella, are not found in the guts of sedentary people.
By taking a closer look at the bacteria, the researchers from Joslin Diabetes Center determined Veillonella metabolizes lactic acid produced by exercise and converts it into propionate, a short chain fatty acid. The human body then utilizes that propionate to improve exercise capacity. The results were reported today in Nature Medicine.
"Having increased exercise capacity is a strong predictor of overall health and protection against cardiovascular disease, diabetes, and overall longevity," says Aleksandar D. Kostic PhD, TITLE., a co-author on the paper. "What we envision is a probiotic supplement that people can take that will increase their ability to do meaningful exercise and therefore protect them against chronic diseases including diabetes."
The work began in 2015 with fecal samples from Boston Marathon runners. Jonathan Scheiman, PhD, then a researcher in the lab of George Church, PhD, at Harvard Medical School, collected samples during a time span of one week before the Marathon to one week after the Marathon. He also collected samples from sedentary individuals. Dr. Scheiman then brought the samples to Dr. Kostic, who analyzed them to determine the species of bacteria in both cohorts.
"One of the things that immediately caught our attention was this single organism, Veillonella, that was clearly enriched in abundance immediately after the marathon in the runners. Veillonella is also at higher abundance in the marathon runners [in general] than it is in sedentary individuals." says Dr. Kostic.
They confirmed the link to improved exercise capacity in mouse models, where they saw a marked increase in running ability after supplementation with Veillonella. Next, they wanted to figure out how it worked.
"As we dug into the details of Veillonella, what we found was that it is relatively unique in the human microbiome in that it uses lactate or lactic acid as its sole carbon source," he says.
Lactic acid is produced by the muscles during strenuous exercise. The Veillonella bacteria are able to use this exercise by-product as their main food source.
"Our immediate hypothesis was that it worked as a metabolic sink to remove lactate from the system, the idea being that lactate build-up in the muscles creates fatigue," he says. "But talking to people like Sarah Lessard, [a clinical researcher at Joslin] and other people in the exercise physiology field, apparently this idea that lactate build-up causes fatigue is not accepted to be true. So, it caused us to rethink the mechanism of how this is happening."
Dr. Kostic and his team returned to the lab to figure out what could be causing the increase in exercise capacity. They ran a metagenomic analysis, meaning they tracked the genetics of all the organisms in the microbiome community, to determine what events were triggered by Veillonella's metabolism of lactic acid. They noted that the enzymes associated with conversion of lactic acid into the short chain fatty acid propionate were at much higher abundance after exercise.
"Then the question was maybe it's not removal of lactic acid, but the generation of propionate," says Dr. Kostic. "We did some experiments to introduce propionate into mice [via enema] and test whether that was sufficient for this increased running ability phenotype. And it was."
Dr. Kostic and his team plan to investigate the mechanisms of how propionate affects exercise capacity in a collaboration with Dr. Lessard.
Colonies of bacteria residing in our guts have a powerful impact on our health. Exercise is an important component of a healthy lifestyle meant to ward off diseases such as type 2 diabetes. Many people with metabolic disorders are not able to exercise at the level needed to see such benefits. Supplementing their microbiome using a probiotic capsule containing Veillonella could give them the boost they need for effective exercise. (Direct dosing with a propionate pill would not work, as the short chain fatty acid would be broken down by digestive juices before it could take effect.) Dr. Scheiman has since spun this idea off into a company targeted at athletes.
"The microbiome is such a powerful metabolic engine," says Dr. Kostic. This is one of the first studies to directly show a strong example of symbiosis between microbes and their human host.
"It's very clear. It creates this positive feedback loop. The host is producing something that this particular microbe favors. Then in return, the microbe is creating something that benefits the host," he says. "This is a really important example of how the microbiome has evolved ways to become this symbiotic presence in the human host."

Babies can learn link between language and ethnicity, study suggests

Eleven-month-old infants can learn to associate the language they hear with ethnicity, recent research from the University of British Columbia suggests.
The study, published April 22 in Developmental Psychobiology, found that 11-month-old infants looked more at the faces of people of Asian descent versus those of Caucasian descent when hearing Cantonese versus English -- but not when hearing Spanish.
"Our findings suggest that by 11 months, infants are making connections between languages and ethnicities based on the individuals they encounter in their environments. In learning about language, infants are doing more than picking up sounds and sentences -- they also learn about the speakers of language," said Lillian May, a psychology lecturer at UBC who was lead author of the study.
The research was done in Vancouver, where approximately nine per cent of the population can speak Cantonese.
The researchers played sentences in both English and Cantonese to English-learning infants of Caucasian ancestry while showing them pictures of people of Caucasian descent and of Asian descent. When the infants heard Cantonese, they looked more at the Asian faces than when they were hearing English. When they heard English, they looked equally at Asian and Caucasian faces.
"This indicates that they have already learned that in Vancouver, both Caucasians and Asians are likely to speak English, but only Asians are likely to speak Cantonese," noted UBC psychology professor Janet Werker, the study's senior author.
The researchers showed the same pictures to the infants while playing Spanish, to see whether they were inclined to associate any unfamiliar language with any unfamiliar ethnicity. However, in that test the infants looked equally at Asian and Caucasian faces. This suggests young infants pick up on specific language-ethnicity pairings based on the faces and languages they encounter.
"Babies are learning so much about language -- even about its social use -- long before they produce the first word," said Werker. "The link between speaker characteristics and language is something no one has to teach babies. They learn it all on their own."
The researchers are now probing how babies' ability to link language and ethnicity might help them with language acquisition.

Air pollution found to affect marker of female fertility in real-life study

Ovarian reserve -- a term widely adopted to reflect the number of resting follicles in the ovary, and thus a marker of potential female fertility -- has been found in a large-scale study to be adversely affected by high levels of air pollution.
Results from the Ovarian Reserve and Exposure to Environmental Pollutants (ORExPo) study, a 'real-world data' study using hormone measurements taken from more than 1300 Italian women, are presented today at the Annual Meeting of ESHRE by first investigator Professor Antonio La Marca from the University of Modena and Reggio Emilia, Italy.
Behind the study lay emerging evidence that many environmental chemicals, as well as natural and artificial components of everyday diet, have the potential to disturb the physiological role of hormones, interfering with their biosynthesis, signaling or metabolism. The hormone in this case, anti-Müllerian hormone or AMH, is secreted by cells in the ovary and is now widely recognised as a reliable circulating marker of ovarian reserve.(1)
'The influence of age and smoking on AMH serum levels is now largely accepted,' explains Professor La Marca, 'but a clear effect of environmental factors has not been demonstrated so far.'
The ORExPo study was in effect an analysis of all AMH measurements taken from women living in the Modena area between 2007 and 2017 and assembled in a large database. These measurements were loaded into a data warehouse in which AMH levels were linked to each patient's age and residential address. The analysis was completed with environmental data and a 'geo-localisation' estimate based on each patient's residence. The assessment of environmental exposure considered daily particulate matter (PM) and values of nitrogen dioxide (NO2), a polluting gas which gets into the air from burning fuel.
Results from the 1463 AMH measurements collected from 1318 women firstly showed -- as expected -- that serum AMH levels after the age of 25 were inversely and significantly related to the women's age. However, it was also found that AMH levels were inversely and significantly related to environmental pollutants defined as PM10, PM2.5 and NO2. This association was age-independent.
These results were determined by dividing the full dataset into quartiles reflecting PM10, PM2.5 and NO2 concentrations. The analysis found significantly lower levels of AMH in the fourth quartile than in the lowest quartiles, which, said Professor La Marca, 'again confirms that independently of age the higher the level of particulate matter and NO2, the lower the serum concentration of AMH'. The lowest concentration of AMH -- reflecting 'severe ovarian reserve reduction' -- was measured in subjects who were exposed to levels of PM10, PM2.5 and NO2 above 29.5, 22 and 26 mcg/m3 respectively. Nevertheless, these were values well below the upper limits recommended by the EU and local authorities (ie, 40, 25 and 40 mcg/m3 respectively).
Severe ovarian reserve reduction, as reflected in a serum AMH concentration below 1 ng/ml, was significantly more frequent in the fourth quartile than in the first three quartiles for PM10 (62% vs 38%), for PM2.5, and for NO2. 'This means by our calculations,' said Professor La Marca, 'exposure to high levels of PM10, PM2.5 and NO2 increases the risk of having a severely reduced ovarian reserve by a factor between 2 and 3.'
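The quartile analysis described above is straightforward to reproduce in outline. The sketch below uses synthetic numbers and assumed column names, not the ORExPo data: it splits exposure into quartiles, then compares both median AMH and the frequency of AMH below 1 ng/ml across them.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    n = 1463                                       # one row per measurement
    pm10 = rng.uniform(10, 45, n)                  # toy exposure, mcg/m3
    amh = np.clip(rng.normal(2.5 - 0.03 * pm10, 1.0), 0.01, None)  # toy AMH

    df = pd.DataFrame({"amh": amh, "pm10": pm10})
    df["pm10_quartile"] = pd.qcut(df["pm10"], 4,
                                  labels=["Q1", "Q2", "Q3", "Q4"])

    # Median AMH per exposure quartile, and how often AMH < 1 ng/ml
    # ("severe ovarian reserve reduction") occurs in each quartile.
    print(df.groupby("pm10_quartile")["amh"].median())
    print((df["amh"] < 1).groupby(df["pm10_quartile"]).mean())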
While noting that this study again confirms that age is the most important determinant of AMH concentration in women, Professor La Marca emphasised that other factors such as smoking, body weight and long-term hormonal contraception are already recognised as having an impact on AMH. Similarly, he said, environmental pollutants may also have a significant effect in determining circulating levels of AMH. 'Living in an area associated with high levels of air pollutants in our study increased the risk of severely reduced ovarian reserve by a factor of 2 or 3,' he said.

Puppy love: Choosing the perfect pooch poses challenges similar to dating

Psychologists at Indiana University who study how people pick their spouses have turned their attention to another important relationship: choosing a canine companion.
Their work, published in the journal Behavior Research Methods, recently found that, when it comes to puppy love, the heart doesn't always know what it wants.
The results are based upon data from a working animal shelter and could help improve the pet adoption process.
"What we show in this study is that what people say they want in a dog isn't always in line with what they choose," said Samantha Cohen, who led the study as a Ph.D. student in the IU Bloomington College of Arts and Sciences' Department of Psychological and Brain Sciences. "By focusing on a subset of desired traits, rather than everything a visitor says, I believe we can make animal adoption more efficient and successful."
As a member of the lab of IU Provost Professor Peter Todd, Cohen conducted the study while also volunteering as an adoption counselor at an animal shelter. Todd is co-author on the study.
"It was my responsibility to match dogs to people based on their preferences, but I often noticed that visitors would ultimately adopt some other dog than my original suggestion," Cohen said. "This study provides a reason: Only some desired traits tend to be fulfilled above chance, which means they may have a larger impact on dog selection."
The researchers categorized dogs based upon 13 traits: age, sex, color, size, purebred status, previous training, nervousness, protectiveness, intelligence, excitability, energy level, playfulness and friendliness. They surveyed the preferences of 1,229 people who visited dogs at an animal shelter, including 145 who decided to make an adoption.
A similar disconnect has been found in research on speed dating led by Todd, who has shown that people's stated romantic preferences tend not to match the partners they choose.
Although most participants in the dog adoption study listed many traits they preferred -- with "friendliness" as the most popular -- they ultimately selected dogs most consistent with just a few preferences, like age and playfulness, suggesting that others, like color or purebred status, exerted less influence on decision-making.
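One way to make "fulfilled above chance" concrete is a permutation test: compare how often adopters' stated preference matches their chosen dog against the match rate when dogs are shuffled among adopters. The sketch below runs on invented data, not the study's.

    import numpy as np

    rng = np.random.default_rng(2)
    n_adopters = 145

    # Invented data: each adopter states a preferred age class (0, 1, 2)
    # and adopts a dog with some actual age class.
    stated = rng.integers(0, 3, n_adopters)
    adopted = stated.copy()
    switched = rng.random(n_adopters) < 0.4       # 40% pick something else
    adopted[switched] = rng.integers(0, 3, switched.sum())

    observed = (stated == adopted).mean()         # real match rate

    # Null distribution: match rates when dogs are assigned at random.
    null = np.array([(stated == rng.permutation(adopted)).mean()
                     for _ in range(10_000)])
    p_value = (np.sum(null >= observed) + 1) / (null.size + 1)
    print(observed, p_value)                      # matched above chance?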
There was also another parallel to the world of dating. In short: Looks matter.
"As multiple psychologists have shown in speed-dating experiments, physical attractiveness is very important," Cohen said. "Most people think they've got a handsome or good-looking dog."
In the article, Cohen outlines some challenges facing aspiring dog-owners:
  • Focusing on "the one": Although adopters often came to the shelter with a vision of the perfect pet, Cohen said many risked missing a good match due to overemphasis on specific physical and personality traits. For example, an adopter who wants an Irish wolfhound because they're large, loyal and light shedders might fail to consider a non-purebred with the same qualities.
  • Mismatched perceptions: Surprisingly, adopters and shelters often used different traits to describe the same dog. These included subjective traits, such as obedience and playfulness, as well as seemingly objective traits, such as color.
  • Missed signals: People who have never had a dog may not grasp the implications of certain behaviors. A dog seen as "playful" at the shelter may come across as "destructive" in a small home, for example.
  • Performance anxiety: Shelters are high-stress environments for dogs, whose personalities may shift when they're more relaxed at home. Picking a dog based upon personality at the shelter is akin to choosing a date based on how well they perform while public speaking, Cohen said.
To improve pet adoptions, Cohen said animal shelters need to know that people tend to rely on certain traits more strongly when choosing a dog, which might make it easier to match adopters to dogs. She also suggested shelters consider interventions, such as temporary placement in a calmer environment, to help stressed or under-socialized dogs put their best paw forward, showing their typical level of desirable traits, such as friendliness.
Finally, Cohen advises caution about online adoption, since adopters are dependent upon someone else's description of the dogs. She suggests users limit their search criteria to their most desired traits to avoid filtering out a good match based upon less important preferences.
The study was supported in part by an IU Graduate and Professional Student Government Research Award.

Physical evidence in the brain for types of schizophrenia

In a study using brain tissue from deceased human donors, Johns Hopkins Medicine researchers say they found new evidence that schizophrenia can be marked by the buildup of abnormal proteins similar to those found in the brains of people with such neurodegenerative disorders as Alzheimer's or Huntington's diseases.
Schizophrenia -- the specific cause of which remains generally unknown, but is believed to be a combination of genes and environment -- is a disabling mental disorder marked by jumbled thinking, feeling and behavior, as well as delusions or hallucinations. Striking an estimated 200,000 people in the United States each year, its symptoms may be eased with anti-psychotic medications, but the drugs don't work for everyone. Rather than rely on categorizing by symptoms, researchers have long sought to better classify types of schizophrenia -- such as those in which abnormal proteins appear to accumulate -- as a potential way to improve and tailor therapies as precision medicine. The researchers aren't sure how common this variation of the disorder is, although they did find it in about half of the brain samples analyzed.
The new findings were published online May 6 in the American Journal of Psychiatry.
"The brain only has so many ways to handle abnormal proteins," says Frederick Nucifora Jr., Ph.D., D.O., M.H.S., the leader of the study and an assistant professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine. "With schizophrenia, the end process is mental and behavioral, and doesn't cause the pronounced physical neural cell death we see with neurodegenerative diseases, but there are clearly some overall biological similarities."
Based on their experience with schizophrenia and neurodegenerative disorders, Nucifora and his team wanted to determine whether the brains of people with schizophrenia share features with the brains of patients with Alzheimer's disease or other illnesses. In these neurodegenerative disorders, certain abnormal proteins are churned out but don't assemble into properly functioning molecules, instead ending up misfolded, clumping up and leading to disease.
Using brain tissue samples from the Harvard Brain Tissue Resource Center and brain banks at the University of Pittsburgh and the University of Texas Southwestern, the researchers studied 42 samples from brains of people with schizophrenia and a comparison set of samples from 41 brains from healthy controls. About three-quarters of the brains came from men, and 80% were from white people. The donor tissues were from people with an average age of about 49.
The team broke open the cells from the brain tissue samples and analyzed their contents by measuring how much of the cells' contents could be dissolved in a specific detergent. The more that dissolved, the more "normal," or healthy, the cell's contents; less dissolved material indicates that the cell contains a high volume of abnormal, misfolded proteins, as found in other brain diseases. The researchers found that 20 of the brains from people with schizophrenia had a greater proportion of proteins that couldn't be dissolved in detergent, compared with the amount found in the healthy samples. These same 20 samples also showed elevated levels of ubiquitin, a small protein that is a marker for protein aggregation in neurodegenerative disorders. Elevated ubiquitin levels weren't seen in the healthy brain tissue samples.
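For readers who want the logic of that readout spelled out, here is a minimal sketch of the comparison, using invented numbers rather than study data (the actual sample processing and statistics are far more involved). The fraction of total protein that stays detergent-insoluble serves as the aggregation signal:

```python
# Illustrative only: hypothetical protein measurements, not study data.
def insoluble_fraction(total_mg, soluble_mg):
    """Fraction of extracted protein that did not dissolve in detergent."""
    return (total_mg - soluble_mg) / total_mg

# (total protein, detergent-soluble protein) per hypothetical sample, in mg
schizophrenia_samples = [(10.0, 7.1), (10.0, 6.8), (10.0, 8.9)]
control_samples = [(10.0, 9.2), (10.0, 9.0), (10.0, 9.3)]

for label, samples in [("schizophrenia", schizophrenia_samples),
                       ("control", control_samples)]:
    fractions = [insoluble_fraction(t, s) for t, s in samples]
    print(f"{label}: mean insoluble fraction = {sum(fractions) / len(fractions):.2f}")

# A higher insoluble fraction -- seen here in the invented schizophrenia
# samples -- is the signal that misfolded, aggregated protein has accumulated.
```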
The researchers also needed to rule out the possibility that the antipsychotic medications the patients were taking before they died caused the accumulation of abnormal proteins. To clarify whether the disease or the treatment caused the buildup, the team examined proteins in the brains of rats treated for 4.5 months with the antipsychotic drugs haloperidol or risperidone and compared them with control rats given plain water. Treatment with the antipsychotics didn't cause an accumulation of undissolvable proteins or extra ubiquitin tags, suggesting that the disease, not the medication, caused abnormal proteins to build up in some of the schizophrenia brains.
Next, the researchers used mass spectrometry to identify these undissolvable proteins. They found that many of the abnormal proteins were involved in nervous system development, specifically in generating new neurons and the connections that neurons use to communicate with one another.
Nucifora says this main finding -- that the abnormal proteins are involved in these processes -- is consistent with theories of schizophrenia that trace its origins to brain development and to problems with neural communication.
"Researchers have been so focused on the genetics of schizophrenia that they've not paid as much attention to what is going on at the protein level and especially the possibility of protein aggregation," says Nucifora. "This may be a whole new way to look at the disorder and develop more effective therapies."
Nucifora says Johns Hopkins researchers have pioneered a way to use samples of neurons taken from the nose of living patients as stand-ins for brain biopsies in their studies of schizophrenia and other brain disorders. They hope to use this technique to track changes in these abnormal proteins over time in people with schizophrenia. They also want to see whether the substantial variety in the disorder's symptoms is linked to particular levels of excess abnormal proteins, and how this leads to disease, and they are also investigating whether other psychiatric illnesses show similar irregularities.
This research was supported by the Brain and Behavior Research Foundation.
Margolis receives support from Teva Pharmaceutical Industries and Tamminga received support from Sunovion and travel funds from Autiphony Therapeutics, and has served as a consultant for Astellas, Eli Lilly and Co., Kaye Scholer for Pfizer and Lundbeck.

People prefer to donate time -- even when charities lose out

Each year during the holiday season, soup kitchens and charities alike are flooded with offers to volunteer. But is a donation of your time most beneficial to the charity, or would a financial contribution provide more value?
Researchers from Portland State University and Texas A&M University wondered what drives volunteering -- especially when a monetary donation would have more impact. Their study, "Why Do People Volunteer? An Experimental Analysis of Preferences for Time Donations," was published this spring in the journal Management Science.
PSU Assistant Professor of Economics J. Forrest Williams worked with Texas A&M economics professors Alexander Brown and Jonathan Meer to consider what drives volunteers. They found, time and again, that people preferred to donate their time even when doing so was less efficient.
Why do people make that choice? Williams said the "warm glow" effect appears to be responsible. Warm glow is described simply as the positive feeling associated with volunteering or donating to charity.
"If you have really high wages, why would you give up a few hundred dollars an hour, when you could just work an hour and donate that amount and then provide significantly more value than your time would be worth to the charity?" Williams said.
Past research assumes, he added, that when given the choice between donating $10 worth of someone's time and donating $10 in cash, the options are equivalent. Their research suggests otherwise.
"The warm glow you get from volunteering appears to be significantly higher than that of giving money. And that went against what a lot of the field believed," Williams said.
Their findings should inform future economic studies and conversations considering gifts of time and money, he added.
"If we are only concerned about the impact of our donations, it makes no economic sense," Williams said of people's preferences to donate time. The warm glow effect is just that persuasive.
The study controlled for a variety of factors that may influence someone's choices: recognition, networking, reciprocity, happiness gained from working with others, etc.
Participants still chose to donate their time even when their choice essentially lost money for the charity and there was no personal benefit.
"If you make $100 an hour, then you go work in a soup kitchen and don't provide $100 worth of service in an hour, they would be better off if you just gave them $100," he said.
It's unclear how charities should respond to this finding. It's conceivable they could be better off insisting on monetary donations and paying for labor to replace volunteers, but they might also lose potential donors who gain greater benefit from the volunteering experience; those donors could end up volunteering at a competing charity instead. Further, volunteers might increase the visibility of the charity, boosting future monetary donations -- something this study could not address.
"Even if you are hyper-effective at what you do, it's just so unlikely that the value of your time there is greater than the value of the money you would give," Williams said.

Helping physics teachers who don't know physics

A shortage of high school physics teachers has led to teachers with little-to-no physics training taking over physics classrooms, causing additional stress and job dissatisfaction for those teachers -- and a difficult learning experience for their students.
But new research indicates that focused physics professional development for teachers -- even those who have no prior physics training -- can lead to better experiences for both students and teachers, and can improve students' understanding of physics concepts.
The study, published last month in the Journal of Science Teacher Education, followed two groups of advanced-placement science teachers as they went through three years of training. The program was designed to improve their understanding of physics concepts and to assist them in developing teaching strategies to help their students better retain what they learn about physics.
Justina Ogodo, the study's author and postdoctoral researcher at The Ohio State University's Department of Teaching and Learning, said that when she launched this project, she remembered being a physics student in high school, and being uninspired by the education she received.
"I truly hated physics, because my teacher would speak to the board -- he would teach to the board," she said. "I imagined students were having the same experience I had, because the teachers don't have the content knowledge or pedagogical skills to teach physics."
Ogodo wanted to understand how a teacher's subject-matter knowledge could affect a student's ability to learn and understand. She followed a group of advanced-placement physics teachers through intensive physics professional development funded by the National Science Foundation, then compared their teaching practices and student outcomes with AP teachers who did not attend the courses.
To evaluate the teachers, Ogodo used the Reformed Teaching Observation Protocol (RTOP), an instrument that has been in use as a teacher-evaluation tool since 2000. She used it to measure each teacher's effectiveness in five categories: lesson design and implementation, content, classroom culture, communicative interactions and student/teacher relationships. She found that teachers who completed the training earned scores about 40 percent higher than teachers who did not participate in the professional development.
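As context for those scores, an RTOP-style composite is built by rating observed classroom items on a 0-4 scale within each category and summing them. The sketch below uses the article's five category names, but the item ratings and the number of items per category are invented for illustration, not drawn from Ogodo's data:

```python
# Sketch of an RTOP-style composite score. Category names follow the article;
# the 0-4 item ratings and item counts are invented, not real observation data.
ratings = {
    "lesson design and implementation": [3, 2, 4, 3, 2],
    "content": [4, 3, 3, 2, 3],
    "classroom culture": [2, 3, 3, 4, 3],
    "communicative interactions": [3, 3, 2, 3, 4],
    "student/teacher relationships": [4, 4, 3, 3, 3],
}

total = max_total = 0
for category, items in ratings.items():
    total += sum(items)
    max_total += len(items) * 4
    print(f"{category}: {sum(items)}/{len(items) * 4}")

print(f"overall score: {total}/{max_total}")
# Comparing mean composites between trained and untrained groups is how a
# difference like the "about 40 percent higher" result would be expressed.
```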
Prior to the training, Ogodo found, most teachers used "traditional, teacher-centered methods" to teach. Those methods include lectures, note-taking and problem-solving activities -- methods designed to complete the AP curriculum and focused on the AP exam.
Ogodo observed that teachers who completed the course were more likely to use conceptual learning techniques and the Socratic method -- an approach driven by inquiry-based teaching and learning -- along with hands-on labs that help students see the real-world applications of the theories they learn.
The teachers who did not complete the training, Ogodo found, continued to fall back on lectures and standardized labs.
The shortage of physics teachers is severe. Across the United States, just 47 percent of physics teachers have a degree in physics or physics education, according to the National Science Foundation.
And in Alabama, where this study was conducted, the problem is worse: Just 9 percent of physics teachers there have physics degrees or certification in physics education.
"They are just thrown into the physics classrooms to teach," Ogodo said. "That means they are not equipped to teach physics, and that can be frustrating for both teachers and students."
The results can be harmful, Ogodo found. Some teachers in her study reported a lack of confidence in their abilities, especially when teaching physics concepts they did not understand, and suggested that these feelings could lead to teacher burnout. Ogodo also found that teachers' lack of knowledge can diminish students' interest in physics.
But in classrooms led by teachers who participated in the intensive physics education training, teachers reported feeling greater satisfaction in teaching physics and greater trust in their abilities.
Previous studies about science and education have shown that students' ability to achieve in any subject is directly connected to the quality and effectiveness of their teachers.
Ogodo said this study shows that increasing training for teachers will likely lead to better outcomes for students and to greater numbers of students seeking futures in the sciences.
"One student told me she likes to write, and that she wanted to be a creative writer, but that after taking this physics class with her teacher who had learned these better techniques, she wants to be a physics teacher," Ogodo said. "That just made my day."
This work was supported by the Alliance for Physics Excellence project, which was funded by the National Science Foundation through the University of Alabama, Alabama A&M University and the University of Alabama in Huntsville.

Novel C. diff structures are required for infection, offer new therapeutic targets

  Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...