Showing posts with label BUSINESS & INDUSTRY. Show all posts

Wednesday, 4 October 2023

Shape-changing smart speaker lets users mute different areas of a room

 In virtual meetings, it's easy to keep people from talking over each other. Someone just hits mute. But for the most part, this ability doesn't translate easily to recording in-person gatherings. In a bustling cafe, there are no buttons to silence the table beside you.

The ability to locate and control sound -- isolating one person talking from a specific location in a crowded room, for instance -- has challenged researchers, especially without visual cues from cameras.

A team led by researchers at the University of Washington has developed a shape-changing smart speaker, which uses self-deploying microphones to divide rooms into speech zones and track the positions of individual speakers. With the help of the team's deep-learning algorithms, the system lets users mute certain areas or separate simultaneous conversations, even if two adjacent people have similar voices. Like a fleet of Roombas, each about an inch in diameter, the microphones automatically deploy from, and then return to, a charging station. This allows the system to be moved between environments and set up automatically. In a conference room meeting, for instance, such a system might be deployed instead of a central microphone, allowing better control of in-room audio.

The team published its findings Sept. 21 in Nature Communications.

"If I close my eyes and there are 10 people talking in a room, I have no idea who's saying what and where they are in the room exactly. That's extremely hard for the human brain to process. Until now, it's also been difficult for technology," said co-lead author Malek Itani, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. "For the first time, using what we're calling a robotic 'acoustic swarm,' we're able to track the positions of multiple people talking in a room and separate their speech."

Previous research on robot swarms has required using overhead or on-device cameras, projectors or special surfaces. The UW team's system is the first to accurately distribute a robot swarm using only sound.

The team's prototype consists of seven small robots that spread themselves across tables of various sizes. As they move from their charger, each robot emits a high-frequency sound, like a bat navigating, using this frequency and other sensors to avoid obstacles and move around without falling off the table. The automatic deployment allows the robots to place themselves for maximum accuracy, permitting greater sound control than if a person set them. The robots disperse as far from each other as possible, since greater distances make it easier to differentiate and locate people speaking. Today's consumer smart speakers have multiple microphones, but because they are clustered on the same device, they're too close to allow for this system's mute and active zones.


"If I have one microphone a foot away from me, and another microphone two feet away, my voice will arrive at the microphone that's a foot away first. If someone else is closer to the microphone that's two feet away, their voice will arrive there first," said co-lead author Tuochao Chen, a UW doctoral student in the Allen School. "We developed neural networks that use these time-delayed signals to separate what each person is saying and track their positions in a space. So you can have four people having two conversations and isolate any of the four voices and locate each of the voices in a room."
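The arrival-time idea Chen describes can be made concrete with a small numerical sketch. The team's actual system uses neural networks, but the classic way to recover the delay between two microphone recordings is cross-correlation: the lag at which the two signals line up best, divided by the sample rate, gives the time difference of arrival, and multiplying by the speed of sound gives the path difference. The code below is an illustration of that classic approach, not the published method; the signal lengths and delay are made up.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in room-temperature air


def tdoa_by_cross_correlation(sig_a, sig_b, sample_rate):
    """Estimate the time difference of arrival (seconds) of the same
    sound at two microphones via cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    # Lag (in samples) by which sig_a trails sig_b at the correlation peak.
    lag = np.argmax(corr) - (len(sig_b) - 1)
    return lag / sample_rate


# Toy example: one "voice" burst reaching mic B 20 samples after mic A.
rate = 16_000  # 16 kHz sampling
rng = np.random.default_rng(0)
burst = rng.standard_normal(256)  # stand-in for a short speech burst
mic_a = np.concatenate([burst, np.zeros(120)])
mic_b = np.concatenate([np.zeros(20), burst, np.zeros(100)])

delay = tdoa_by_cross_correlation(mic_b, mic_a, rate)
print(f"estimated delay: {delay * 1e3:.2f} ms")                   # 1.25 ms
print(f"implied path difference: {delay * SPEED_OF_SOUND:.2f} m")  # ~0.43 m
```

A one-foot difference in distance corresponds to roughly a millisecond of delay, which is why microphones spread across a table can tell speakers apart when microphones packed into one device cannot.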

The team tested the robots in offices, living rooms and kitchens with groups of three to five people speaking. Across all these environments, the system could discern different voices within 1.6 feet (50 centimeters) of each other 90% of the time, without prior information about the number of speakers. The system was able to process three seconds of audio in 1.82 seconds on average -- fast enough for live streaming, though a bit too long for real-time communications such as video calls.
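The timing figures above can be read through the standard "real-time factor" lens: processing time divided by audio duration. A value below 1 means the system keeps up with incoming audio, which is why 1.82 seconds per 3-second chunk works for live streaming even though the added latency is too long for conversational video calls. This is just the arithmetic implied by the reported numbers:

```python
# Real-time factor (RTF): processing time divided by audio duration.
# RTF < 1 means the system processes audio faster than it arrives.
audio_seconds = 3.0
processing_seconds = 1.82

rtf = processing_seconds / audio_seconds
print(f"real-time factor: {rtf:.2f}")          # 0.61
print(f"keeps up with live audio: {rtf < 1}")  # True
# But a listener still waits ~3 s (buffering) + ~1.82 s (processing),
# which is too much lag for a two-way video call.
```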

As the technology progresses, researchers say, acoustic swarms might be deployed in smart homes to better differentiate people talking with smart speakers. That could potentially allow only people sitting on a couch, in an "active zone," to vocally control a TV, for example.

Researchers plan to eventually make microphone robots that can move around rooms, instead of being limited to tables. The team is also investigating whether the speakers can emit sounds that allow for real-world mute and active zones, so people in different parts of a room can hear different audio. The current study is another step toward science fiction technologies, such as the "cone of silence" in "Get Smart" and "Dune," the authors write.

Of course, any technology that evokes comparison to fictional spy tools will raise questions of privacy. Researchers acknowledge the potential for misuse, so they have included guards against this: The microphones navigate with sound, not an onboard camera like other similar systems. The robots are easily visible and their lights blink when they're active. Instead of processing the audio in the cloud, as most smart speakers do, the acoustic swarms process all the audio locally, as a privacy constraint. And even though some people's first thoughts may be about surveillance, the system can be used for the opposite, the team says.

"It has the potential to actually benefit privacy, beyond what current smart speakers allow," Itani said. "I can say, 'Don't record anything around my desk,' and our system will create a bubble 3 feet around me. Nothing in this bubble would be recorded. Or if two groups are speaking beside each other and one group is having a private conversation, while the other group is recording, one conversation can be in a mute zone, and it will remain private."

Remote work can slash your carbon footprint -- if done right

 Remote workers can have a 54% lower carbon footprint compared with onsite workers, according to a new study by Cornell University and Microsoft, with lifestyle choices and work arrangements playing an essential role in determining the environmental benefits of remote and hybrid work.

The study, published in the Proceedings of the National Academy of Sciences, also finds that hybrid workers who work from home two to four days per week can reduce their carbon footprint by 11% to 29%, but the benefit of working from home only one day per week is negligible, cutting carbon footprint by just 2%.

"Remote work is not zero carbon, and the benefits of hybrid work are not perfectly linear," said study senior author Fengqi You, professor in energy systems engineering at Cornell. "Everybody knows without commuting you save on transportation energy, but there's always lifestyle effects and many other factors."

The main contributors to carbon footprint for onsite and hybrid workers, according to the study, are travel and office energy use. That's no surprise to researchers quantifying the impact of remote work on the environment, but Cornell and Microsoft used survey data and modeling to incorporate factors sometimes overlooked when calculating carbon footprint, including residential energy use based on time-use allocation, non-commute distance and mode of transportation, communications device usage, number of household members and office configuration, such as seat sharing and building size.

Notable findings and observations include:

  • Non-commute travel, such as trips to social and recreational activities, becomes more significant as the number of remote workdays increases.
  • Seat sharing among hybrid workers under full-building attendance can reduce carbon footprint by 28%.
  • Hybrid workers tend to commute farther than onsite workers due to differences in housing choices.
  • The effects of remote and hybrid work on communications technologies such as computer, phone and internet usage have negligible impacts on overall carbon footprint.

"Remote and hybrid work shows great potential for reducing carbon footprint, but what behaviors should these companies and other policy makers be encouraging to maximize the benefits?" said Longqi Yang, principal applied research manager at Microsoft and corresponding author of the study. "The findings suggest organizations should prioritize lifestyle and workplace improvements."

Fengqi You said the study finds that companies and policymakers should also focus on incentivizing public transportation over driving, eliminating office space for remote workers and improving the energy efficiency of office buildings.

"Globally, every person, every country and every sector have these kinds of opportunities with remote work. How could the combined benefits change the whole world? That's something we really want to advance our understanding of," said Yanqiu Tao, a doctoral student and the study's first author.

The study was based upon work supported by the National Science Foundation, and leveraged survey data from Microsoft, the American Time Use Survey, the National Household Travel Survey and the Residential Energy Consumption Survey.

What the French Revolution can teach us about inflation

 More than 200 years later, historians are still gleaning some unexpected insights from the French Revolution -- not about tyranny or liberty, but about inflation.

"Revolutionary France experienced the first modern hyperinflation," said Louis Rouanet, Ph.D., assistant professor at The University of Texas at El Paso. "Although it happened more than two centuries ago, it offers relevant lessons for today."

Rouanet is the lead author of the new study, "Assignats or death: The politics and dynamics of hyperinflation in revolutionary France," published recently in the European Economic Review. A faculty member of the UTEP Department of Economics and Finance, Rouanet is an expert in economic history, specializing in revolutionary France, and a Frenchman himself. The study advances a new framework for understanding the monetary phenomenon of hyperinflation, a period of rapid and extreme price increases.

Rouanet's analysis found that political instability and shifting public expectations were key in explaining the scenario that unfolded between May 1794 and May 1796, when the French revolutionary governments' decision to issue a paper currency called the assignat led to extreme inflation. Price levels increased more than 50% per month, complicating an already volatile economic situation. The currency was primarily supported by a political group known as the Jacobins, a party whose power waned throughout the revolution.

The French Revolution began at the end of the 18th century when extreme popular discontent with feudal institutions erupted into revolution, Rouanet said. The conflict reshaped the French government and led to the end of the feudal system, a hierarchical system of government that placed the king at the top, nobility and clergy below him, and peasants below all.

During the revolution, the government was bankrupt and expropriated substantial amounts of land and assets held by the Catholic Church in order to sell them. However, they were unable to sell the land fast enough to pay back creditors. To stimulate purchases, the government began issuing a paper currency called assignat. In order to prevent inflation, revolutionary officials promised to retire the assignat from circulation and burn the notes once they were used to buy property, but this commitment was not always honored, prompting public mistrust.

At the same time, the Jacobin party was weakening. Between failed insurrections in Paris and the establishment of a new regime known as the Directory, the key political backers of the assignat were on their way out.

The political instability, coupled with public mistrust, prompted a rush to spend the assignat, which led to hyperinflation, according to Rouanet. The research concludes that changing expectations and the public's anticipation of inflation can work together to create actual inflation.

"My research points to the importance of sound fiscal housekeeping and a strong political commitment to stable prices," Rouanet said. "As it occurred during the French Revolution, politicizing the money supply increases the instability of the demand for money and prices, thus making economic life less predictable."

Rouanet co-authored the paper with Brian P. Custinger, Ph.D. of Angelo State University and Texas Tech University and Joshua S. Ingber, Ph.D., of Northern Michigan University. Rouanet is a member of the newly-established Center for Free Enterprise, which aims to provide economic education to the community through reading groups, speaker series and curricula developed for high schools.

"This excellent research gives important insight into how public opinion can shape monetary policy and outcomes," said John Hadjimarcou, Ph.D., professor of marketing and interim dean of the Woody L. Hunt College of Business. "We are delighted to welcome Louis Rouanet and several new faculty members to the Hunt College of Business this year. Each new faculty member brings a trove of interesting research and practical experience that will greatly enrich the learning experience for our students and contribute to our local community."

Job loss is linked to increased risk of miscarriage and stillbirth

 Researchers have found a link between a pregnant woman or her partner losing their job and an increased risk of miscarriage or stillbirth.

The study, which is published today (Thursday) in Human Reproduction, one of the world's leading reproductive medicine journals, found a doubling in the chances of a pregnancy miscarrying or resulting in a stillbirth following a job loss.

The researchers, led by Dr Selin Köksal from the Institute for Social and Economic Research at the University of Essex, UK, emphasise that their findings highlight an association between job loss and an increased probability of miscarriage or stillbirth and that the study cannot show that losing a job causes the pregnancy loss.

"Further research would need to be carried out to understand if losing one's job actually causes the increased risk of pregnancy loss," she said. "I would like to analyse socioeconomic factors influencing pregnancy loss in contexts where data for the entire population are available through administrative records. These data can help clarify whether there are solid causal links between job loss and pregnancy loss, and whether there are certain socioeconomic groups in the population that are particularly at risk, such as economically precarious employees.

"Being able to examine the association between job loss and pregnancy loss among different socioeconomic groups could help us to understand how exactly a job loss is related to higher risk of a miscarriage or a stillbirth. Is it because of economic hardship, or an experience of an unexpected event or is it due to loss of social status? These are the questions that I am hoping to answer in the future."

The study is based on data from the "Understanding Society" survey of 40,000 households in the UK between 2009 and 2022. It includes 8,142 pregnancies for which there was complete information on the date of conception and pregnancy outcome.

Out of these pregnancies, 11.6% miscarried (947), which may be an underestimate because many pregnancies do not survive beyond the first month and pregnancy loss can go undetected. There were 38 stillbirths, representing 0.5% of conceptions, which is in line with the UK's official statistics for stillbirths.

Out of 136 women who were affected by their own or their partner's job loss, 32 (23.5%) miscarried and one (0.7%) had a stillbirth. Among the 8,006 women who were not affected by their own or their partner's job loss, 915 (10.4%) miscarried and 37 (0.5%) had a stillbirth.
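The headline "doubling" can be recovered directly from these raw counts. Note this is the unadjusted ratio only; the published analysis controls for confounders, which this arithmetic does not.

```python
# Miscarriage counts as reported in the study (unadjusted).
exposed_miscarriages, exposed_total = 32, 136        # job loss in household
unexposed_miscarriages, unexposed_total = 915, 8006  # no job loss

risk_exposed = exposed_miscarriages / exposed_total        # 23.5%
risk_unexposed = unexposed_miscarriages / unexposed_total  # 11.4%
relative_risk = risk_exposed / risk_unexposed

print(f"miscarriage risk after a job loss:  {risk_exposed:.1%}")
print(f"miscarriage risk with no job loss:  {risk_unexposed:.1%}")
print(f"unadjusted relative risk:           {relative_risk:.2f}")  # 2.06
```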

Co-author of the paper, Dr Alessandro Di Nallo, from the Dondena Centre for Research on Social Dynamics and Public Policy at Bocconi University, Milan, Italy, said: "The reasons for these associations may be related to stress, reduced access to prenatal care, or changes in lifestyle.

"My previous research indicates that job loss reduces the likelihood of having children. This might be because people postpone their plans to have children under conditions of economic uncertainty, but it could also be due to other reasons. Stress results in a physiological response, releasing hormones that are known to increase the risk of miscarriage or premature delivery. The reduction in income following a job loss could restrict access and compliance with prenatal care, so that at-risk pregnancies are discovered late or are undetected. In addition, the emotional discomfort of job loss could prompt unhealthy behaviours, such as alcohol consumption, smoking or unhealthy eating."

Dr Köksal said: "Our findings are important as we uncover a potential socioeconomic, hence preventable, factor behind pregnancy losses that can be addressed through effective policymaking.

"It is important to raise awareness of women's legal rights and protection in the workplace during pregnancy, so that women can feel safer and more empowered to communicate their pregnancy with their employer. Moreover, stress during pregnancy can have negative effects on both maternal and foetal health. So, provision of psychological support during pregnancy through the public health system is important regardless of women's and their partner's job status.

"In the UK, pregnancy is a period that is protected fairly well by labour market legislation. However, there is no job loss protection for the partners of pregnant women who are dismissed without notice. Policymakers, for instance, could consider extending job protection to workers whose partners are pregnant, as our results show that a partner's job stability is equally as important as the woman's job stability for the course of pregnancy. Additionally, it makes sense to increase economic support for individuals -- and their partners -- who lose their jobs, because the lack of economic support is shown to be one of the main causes of stress and personal distress, which can eventually increase the risk of pregnancy loss."

Limitations of the study include the fact that pregnancy and job loss were self-reported and may be affected by recall and a bias towards what is socially desirable; other factors might also be correlated with both job loss and pregnancy loss; and finally the researchers do not know if the findings hold true for different socioeconomic groups.

Tuesday, 15 August 2023

Successful cooperation depends on good 'mindreading' abilities

 A person's 'mindreading ability' can predict how well they are able to cooperate, even with people they have never met before.

Researchers at the University of Birmingham found that people with strong mindreading abilities -- the ability to understand and take the perspective of another person's feelings and intentions -- are more successful in cooperating to complete tasks than people with weaker mindreading abilities.

These qualities, also called 'theory of mind', are not necessarily related to intelligence and could be improved through training programmes to foster improved cooperation, for example in the workplace or in schools and colleges.

Lead researcher Roksana Markiewicz explained: "As a psychology researcher, I often get asked if I can read minds and while this is often said to me as a joke, humans do have mindreading abilities. Our study shows that these qualities are clearly important in activities that require cooperation."

In the study published in the Journal of Experimental Psychology: Learning, Memory, and Cognition, the team measured theory of mind in over 400 participants. Participants were then sorted into pairs and joined a researcher on a Zoom call, where they played a series of communication games. Each player had a set of visual clues on their screen, which could not be viewed by their partner. They had to communicate about the different sets of clues and use them together to solve a puzzle.

Players who had high theory of mind (ToM) abilities and who were matched with partners with similarly high ToM scores cooperated more effectively than pairs with low ToM abilities. The researchers suggest that this is because of a heightened ability to align in the same mental space and to recover rapidly when misalignment occurs.

Similarly, the researchers found that failures in cooperation were more common among participants with low ToM abilities. They suggest this is because these participants found it harder to find ways to align their thinking, leading to more frequent mistakes, and poorer recovery from mistakes.

"We show for the first time that cooperation is not all about you," says Roksana. "Even if you have excellent mindreading abilities yourself, it will still be advantageous to cooperate with someone with similar abilities, so choose your cooperation partner wisely!"

Thursday, 27 July 2023

AI tests into top 1% for original creative thinking

 New research from the University of Montana and its partners suggests artificial intelligence can match the top 1% of human thinkers on a standard test for creativity.

The study was directed by Dr. Erik Guzik, an assistant clinical professor in UM's College of Business. He and his partners used the Torrance Tests of Creative Thinking, a well-known tool used for decades to assess human creativity.

The researchers submitted eight responses generated by ChatGPT, the application powered by the GPT-4 artificial intelligence engine. They also submitted answers from a control group of 24 UM students taking Guzik's entrepreneurship and personal finance classes. These scores were compared with 2,700 college students nationally who took the TTCT in 2016. All submissions were scored by Scholastic Testing Service, which didn't know AI was involved.

The results placed ChatGPT in elite company for creativity. The AI application was in the top percentile for fluency -- the ability to generate a large volume of ideas -- and for originality -- the ability to come up with new ideas. The AI slipped a bit -- to the 97th percentile -- for flexibility, the ability to generate different types and categories of ideas.

"For ChatGPT and GPT-4, we showed for the first time that it performs in the top 1% for originality," Guzik said. "That was new."

He was gratified to note that some of his UM students also performed in the top 1%. However, ChatGPT outperformed the vast majority of college students nationally.

Guzik tested the AI and his students during spring semester. He was assisted in the work by Christian Gilde of UM Western and Christian Byrge of Vilnius University. The researchers presented their work in May at the Southern Oregon University Creativity Conference.

"We were very careful at the conference to not interpret the data very much," Guzik said. "We just presented the results. But we shared strong evidence that AI seems to be developing creative ability on par with or even exceeding human ability."

Guzik said he asked ChatGPT what it would indicate if it performed well on the TTCT. The AI gave a strong answer, which they shared at the conference:

"ChatGPT told us we may not fully understand human creativity, which I believe is correct," he said. "It also suggested we may need more sophisticated assessment tools that can differentiate between human and AI-generated ideas."

He said the TTCT is protected proprietary material, so ChatGPT couldn't "cheat" by accessing information about the test on the internet or in a public database.

Guzik has long been interested in creativity. As a seventh grader growing up in the small town of Palmer, Massachusetts, he was in a program for talented-and-gifted students. That experience introduced him to the Future Problem Solving process developed by Ellis Paul Torrance, the pioneering psychologist who also created the TTCT. Guzik said he fell in love with brainstorming at that time and how it taps into human imagination, and he remains active with the Future Problem Solving organization -- even meeting his wife at one of its conferences.

Guzik and his team decided to test the creativity of ChatGPT after playing around with it during the past year.

"We had all been exploring with ChatGPT, and we noticed it had been doing some interesting things that we didn't expect," he said. "Some of the responses were novel and surprising. That's when we decided to put it to the test to see how creative it really is."

Guzik said the TTCT test uses prompts that mimic real-life creative tasks. For instance: can you think of new uses for a product, or ways to improve it?

"Let's say it's a basketball," he said. "Think of as many uses of a basketball as you can. You can shoot it in a hoop and use it in a display. If you force yourself to think of new uses, maybe you cut it up and use it as a planter. Or with a brick you can build things, or it can be used as a paperweight. But maybe you grind it up and reform it into something completely new."

Guzik had some expectation that ChatGPT would be good at creating a lot of ideas (fluency), because that's what generative AI does. And it excelled at responding to the prompt with many ideas that were relevant, useful and valuable in the eyes of the evaluators.

He was more surprised at how well it did generating original ideas, which is a hallmark of human imagination. The test evaluators are given lists of common responses for a prompt -- ones that are almost expected to be submitted. However, the AI landed in the top percentile for coming up with fresh responses.

"At the conference, we learned of previous research on GPT-3 that was done a year ago," Guzik said. "At that time, ChatGPT did not score as well as humans on tasks that involved original thinking. Now with the more advanced GPT-4, it's in the top 1% of all human responses."

With AI advances speeding up, he expects it to become a key tool for the world of business going forward and a significant new driver of regional and national innovation.

"For me, creativity is about doing things differently," Guzik said. "One of the definitions of entrepreneurship I love is that to be an entrepreneur is to think differently. So AI may help us apply the world of creative thinking to business and the process of innovation, and that's just fascinating to me."

He said the UM College of Business is open to teaching about AI and incorporating it into coursework.

"I think we know the future is going to include AI in some fashion," Guzik said. "We have to be careful about how it's used and consider needed rules and regulations. But businesses already are using it for many creative tasks. In terms of entrepreneurship and regional innovation, this is a game changer."

Researchers calculate economic value of temporary carbon reduction with 'Social Value of Offsets' formula

 A new study identifies how to calculate the economic value of temporarily reducing carbon emissions through carbon offsetting.

The Social Value of Offsets (SVO) is an economic framework that will help policymakers calculate how much carbon should be stored in temporary offsets to make it equivalent to a permanent CO2 emission.

Using the SVO metric, the researchers estimate that an offset sequestering one ton of carbon for 50 years is equivalent to between 0.3 and 0.5 tons permanently locked away, taking into account a range of factors for different risks, permanence and climate scenarios.

Offsets are a key part of Paris-compliant net zero strategies, but many offsetting projects fail and there is never a guarantee on how long an offset will sequester carbon for -- making it difficult to measure the economic damage avoided.

The study, published in Nature, sets out the risks and uncertainties of offsetting, which occur due to the unregulated nature of the global offsets market.

Risk factors to projects in tropical forests, for example, can include the lack of strong institutions on the ground to monitor, enforce and account for emissions sequestered, as well as the possibility of fires and disease.

There are also risks in how emissions reductions are reported, as well as the risk of 'non-additionality' -- when emissions reductions would have happened irrespective of the offsetting.

Other frameworks count the physical units of carbon but SVO is unique in that it is an economic framework where the value of temporary emissions reductions is measured as the value of the damages avoided to the economy during the length of the offsetting project.

The researchers say this will potentially make it easier to compare offsetting schemes, allowing anyone offsetting their carbon emissions to be able to weigh up the risks involved and decide how much carbon they would need to offset in temporary schemes to make up for a permanent carbon emission.

Professor Ben Groom, Dragon Capital Chair in Environmental Economics at the University of Exeter Business School, said: "Our analysis shows that a carbon emission today which is offset by a temporary project can be thought of as a postponed emission with the same warming effect when the project ends, but with less warming during the project.

"The Social Value of Offsets (SVO) stems from the value of delaying emissions and damages, and this depends on how impermanent, risky or additional they are. Valuing offsets using the SVO then provides a means of comparing offsets with different qualities in terms of the economic damages avoided."

Professor Groom explains why delaying emissions is important, both in an economic and physical sense. "With a project that stores carbon and releases it 50 years later, the net carbon reduction is always going to be zero, so some may say it's as if it never happened."

"But what that ignores is the flow of damages that you've avoided in the meantime, which could be important, because certain responses to climate change, like the melting of the ice caps, are responsive to how long temperatures have been at a particular level.

"Delaying emissions is also important because economic processes could be happening in the background that make carbon removal cheaper in the future so offsetting could act as a temporary solution allowing the action point to be delayed until a time when it is cheaper to act.

"The question we're answering with SVO is how valuable this temporary period in which you avoid damages is."

The IPCC has previously noted that meeting the objectives of the Paris Agreement will require some offsetting, though some organisations suggest that offsetting should be largely avoided due to the unregulated, impermanent and risky nature of the offset market.

However, this study illustrates that in principle delaying emissions, even when offsetting projects are temporary and risky, is valuable in economic terms.

The economists believe the SVO metric can play an important role in appraising net-zero climate policy and harmonising the offset market, and has policy applications beyond the valuation of offsets.

These include calculating the benefits-to-cost ratio of an offset or any temporary carbon storage solution allowing for comparison to alternative technologies for mitigating climate change.

The SVO formula can also be applied to Life-Cycle Analysis of biofuels as well as used to calculate the price of carbon debt, using the rule of thumb that a company that emits a ton of carbon today and commits to a permanent removal in 50 years' time will pay 33% of the carbon price today to cover the damages of temporary atmospheric storage.
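The flavor of that 33% rule of thumb can be illustrated with simple exponential discounting, though this is only a sketch: the published SVO framework additionally prices project risk, additionality and different climate scenarios. If the damages avoided by postponing an emission are valued at a constant net discount rate r, then deferring one ton by T years is worth a fraction 1 - e^(-rT) of today's carbon price. The 0.8%-per-year rate below is an assumed value chosen to reproduce the 33% figure, not a parameter taken from the paper.

```python
import math


def deferred_emission_value_share(net_discount_rate, delay_years):
    """Share of today's carbon price attributable to postponing one ton
    of emissions by `delay_years`, under constant exponential discounting
    of the avoided damages. Illustration only -- the published SVO
    framework also accounts for project risk, additionality and
    climate scenarios."""
    return 1.0 - math.exp(-net_discount_rate * delay_years)


# An assumed net discount rate of ~0.8%/year reproduces the paper's
# 33%-of-carbon-price rule of thumb for a 50-year deferral.
share = deferred_emission_value_share(0.008, 50)
print(f"value share for a 50-year deferral: {share:.0%}")  # 33%
```

The same function also shows why short deferrals are worth little: at the same assumed rate, a 5-year delay is worth only about 4% of the carbon price.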

The Social Value of Offsets, by Professor Ben Groom, Dragon Capital Chair in Environmental Economics at the University of Exeter Business School and Professor Frank Venmans from the Grantham Research Institute on Climate Change and the Environment at LSE, is published in Nature.

GPT detectors can be biased against non-native English writers

In a peer-reviewed opinion paper published July 10 in the journal Patterns, researchers show that computer programs commonly used to determine if a text was written by artificial intelligence tend to falsely label articles written by non-native English speakers as AI-generated. The researchers caution against the use of such AI text detectors because of their unreliability, which could have negative impacts on individuals including students and job applicants.

"Our current recommendation is that we should be extremely careful about and maybe try to avoid using these detectors as much as possible," says senior author James Zou, of Stanford University. "It can have significant consequences if these detectors are used to review things like job applications, college entrance essays or high school assignments."

AI tools like OpenAI's ChatGPT chatbot can compose essays, solve science and math problems, and produce computer code. Educators across the U.S. are increasingly concerned about the use of AI in students' work, and many of them have started using GPT detectors to screen students' assignments. These detectors are platforms that claim to be able to identify whether a text is generated by AI, but their reliability and effectiveness remain untested.

Zou and his team put seven popular GPT detectors to the test. They ran 91 English essays written by non-native English speakers for a widely recognized English proficiency test, called Test of English as a Foreign Language, or TOEFL, through the detectors. These platforms incorrectly labeled more than half of the essays as AI-generated, with one detector flagging nearly 98% of these essays as written by AI. In comparison, the detectors were able to correctly classify more than 90% of essays written by eighth-grade students from the U.S. as human-generated.

Zou explains that the algorithms of these detectors work by evaluating text perplexity, which is how surprising the word choice is in an essay. "If you use common English words, the detectors will give a low perplexity score, meaning my essay is likely to be flagged as AI-generated. If you use complex and fancier words, then it's more likely to be classified as human written by the algorithms," he says. This is because large language models like ChatGPT are trained to generate text with low perplexity to better simulate how an average human talks, Zou adds.

As a result, simpler word choices adopted by non-native English writers would make them more vulnerable to being tagged as using AI.
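The perplexity heuristic Zou describes can be sketched with a toy unigram model (a deliberate simplification; real detectors score text with large language models): text built from common words gets a low perplexity score and would be flagged, while rarer vocabulary drives perplexity up.

```python
import math

# Hypothetical unigram probabilities: common words are far more likely.
word_prob = {"the": 0.05, "is": 0.03, "good": 0.01, "essay": 0.001,
             "salubrious": 0.00001, "perspicacious": 0.00001}

def perplexity(words, probs, floor=1e-6):
    """Perplexity = exp of the average negative log-probability per word."""
    nll = [-math.log(probs.get(w, floor)) for w in words]
    return math.exp(sum(nll) / len(nll))

simple = ["the", "essay", "is", "good"]
fancy = ["the", "essay", "is", "salubrious"]

# Rarer word choices yield higher perplexity, so a low-perplexity
# threshold would flag the simpler text as "AI-generated."
assert perplexity(simple, word_prob) < perplexity(fancy, word_prob)
```

This is exactly the failure mode the study documents: writers with a more limited English vocabulary produce statistically "unsurprising" text and land on the wrong side of the threshold.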

The team then put the human-written TOEFL essays into ChatGPT and prompted it to edit the text using more sophisticated language, including substituting simple words with complex vocabulary. The GPT detectors tagged these AI-edited essays as human-written.

"We should be very cautious about using any of these detectors in classroom settings, because there's still a lot of biases, and they're easy to fool with just the minimum amount of prompt design," Zou says. Using GPT detectors could also have implications beyond the education sector. For example, search engines like Google devalue AI-generated content, which may inadvertently silence non-native English writers.

While AI tools can have positive impacts on student learning, GPT detectors should be further enhanced and evaluated before being put into use. Zou says that training these algorithms with more diverse types of writing could be one way to improve them.


Efficient discovery of improved energy materials by a new AI-guided workflow

Scientists of the NOMAD Laboratory at the Fritz Haber Institute of the Max Planck Society recently proposed a workflow that can dramatically accelerate the search for novel materials with improved properties. They demonstrated the power of the approach by identifying more than 50 strongly thermally insulating materials. These can help alleviate the ongoing energy crisis by allowing for more efficient thermoelectric elements, i.e., devices able to convert otherwise wasted heat into useful electrical voltage.

Discovering new and reliable thermoelectric materials is paramount for making use of the more than 40% of energy given off as waste heat globally and for helping mitigate the growing challenges of climate change. One way to increase the thermoelectric efficiency of a material is to reduce its thermal conductivity, κ, thereby maintaining the temperature gradient needed to generate electricity. However, the cost associated with studying these properties has limited computational and experimental investigations of κ to only a minute subset of all possible materials. A team at the NOMAD Laboratory recently made efforts to reduce these costs by creating an AI-guided workflow that hierarchically screens out materials to efficiently find new and better thermal insulators.

The work recently published in npj Computational Materials proposes a new way of using Artificial Intelligence (AI) to guide the high-throughput search for new materials. Instead of using physical/chemical intuition to screen out materials based on general, known or suspected trends, the new procedure learns the conditions that lead to the desired outcome with advanced AI methods. This work has the potential to quantify the search for new energy materials and increase the efficiency of these searches.

The first step in designing these workflows is to use advanced statistical and AI methods to approximate the target property of interest, κ in this case. To this end, the sure-independence screening and sparsifying operator (SISSO) approach is used. SISSO is a machine learning method that reveals the fundamental dependencies between different material properties from a set of billions of possible expressions. Compared to other "black-box" AI models, this approach is similarly accurate, but additionally yields analytic relationships between different material properties. This allows modern feature-importance metrics to be applied to shed light on which material properties are the most important. In the case of κ, these are the molar volume, Vm; the high-temperature limit of the Debye temperature, θD,∞; and the anharmonicity metric, σA.

Furthermore, the described statistical analysis allows rules of thumb to be distilled for the individual features, enabling an a priori estimate of a material's potential to be a thermal insulator. Working with the three most important primary features hence made it possible to create AI-guided computational workflows for discovering new thermal insulators. These workflows use state-of-the-art electronic structure programs to calculate each of the selected features. During each step, materials that are unlikely to be good insulators are screened out based on their values of Vm, θD,∞, and σA. With this, it is possible to reduce the number of calculations needed to find thermally insulating materials by over two orders of magnitude. In this work, this is demonstrated by identifying 96 thermal insulators (κ < 10 W m⁻¹K⁻¹) in an initial set of 732 materials. The reliability of this approach was further verified by calculating κ for 4 of these predictions with the highest possible accuracy.
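The hierarchical screening idea can be sketched as a funnel in which cheap descriptors are computed first and the expensive κ calculation is reserved for the survivors. The candidate data and thresholds below are hypothetical placeholders, not the paper's values:

```python
import random

random.seed(0)

# Hypothetical candidate pool: each material gets three descriptors,
# molar volume V_m, Debye temperature theta_D, and anharmonicity sigma_A.
candidates = [{"V_m": random.uniform(5, 60),
               "theta_D": random.uniform(100, 900),
               "sigma_A": random.uniform(0.1, 1.0)} for _ in range(732)]

# Screening stages, cheapest first; thresholds are illustrative only.
stages = [("V_m", lambda m: m["V_m"] > 20),          # large unit cells
          ("theta_D", lambda m: m["theta_D"] < 400), # soft lattices
          ("sigma_A", lambda m: m["sigma_A"] > 0.4)] # strong anharmonicity

pool = candidates
for name, keep in stages:
    pool = [m for m in pool if keep(m)]
    print(f"after {name} filter: {len(pool)} candidates remain")

# Only the final pool would get an expensive, high-accuracy kappa calculation.
```

Each stage discards most of the remaining pool before the next, more expensive descriptor is computed, which is how the workflow achieves its two-orders-of-magnitude reduction in required calculations.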

Besides facilitating the active search for new thermoelectric materials, the formalisms proposed by the NOMAD team can also be applied to solve other urgent materials science problems.

Sunday, 31 May 2020

Not all psychopaths are violent; a new study may explain why some are 'successful' instead

Psychopathy is widely recognized as a risk factor for violent behavior, but many psychopathic individuals refrain from antisocial or criminal acts. Understanding what leads these psychopaths to be "successful" has been a mystery.

A new study conducted by researchers at Virginia Commonwealth University sheds light on the mechanisms underlying the formation of this "successful" phenotype.

"Psychopathic individuals are very prone to engaging in antisocial behaviors but what our findings suggest is that some may actually be better able to inhibit these impulses than others," said lead author Emily Lasko, a doctoral candidate in the Department of Psychology in the College of Humanities and Sciences. "Although we don't know exactly what precipitates this increase in conscientious impulse control over time, we do know that this does occur for individuals high in certain psychopathy traits who have been relatively more 'successful' than their peers."

The study, "What Makes a 'Successful' Psychopath? Longitudinal Trajectories of Offenders' Antisocial Behavior and Impulse Control as a Function of Psychopathy," will be published in a forthcoming issue of the journal Personality Disorders: Theory, Research, and Treatment.

When describing certain psychopathic individuals as "successful" versus "unsuccessful," the researchers are referring to life trajectories or outcomes. A "successful" psychopath, for example, might be a CEO or lawyer high in psychopathic traits, whereas an "unsuccessful" psychopath might have those same traits but is incarcerated.

The study tests a compensatory model of "successful" psychopathy, which theorizes that relatively "successful" psychopathic individuals develop greater conscientious traits that serve to inhibit their heightened antisocial impulses.

"The compensatory model posits that people higher in certain psychopathic traits (such as grandiosity and manipulation) are able to compensate for and overcome, to some extent, their antisocial impulses via increases in trait conscientiousness, specifically impulse control," Lasko said.

To test this model, the researchers studied data collected about 1,354 serious juvenile offenders who were adjudicated in court systems in Arizona and Pennsylvania.

"Although these participants are not objectively 'successful,' this was an ideal sample to test our hypotheses for two main reasons," the researchers write. "First, adolescents are in a prime developmental phase for the improvement of impulse control, allowing us the longitudinal variability we would need to test our compensatory model. Second, offenders are prone to antisocial acts, by definition, and their rates of recidivism provided a real-world index of 'successful' versus 'unsuccessful' psychopathy phenotypes."

The study found that higher initial psychopathy was associated with steeper increases in general inhibitory control and the inhibition of aggression over time. That effect was magnified among "successful" offenders, or those who reoffended less.

Its findings lend support to the compensatory model of "successful" psychopathy, Lasko said.

"Our findings support a novel model of psychopathy that we propose, which runs contradictory to the other existing models of psychopathy in that it focuses more on the strengths or 'surpluses' associated with psychopathy rather than just deficits," she said. "Psychopathy is not a personality trait simply composed of deficits -- there are many forms that it can take."

Lasko is a researcher in VCU's Social Psychology and Neuroscience Lab, which seeks to understand why people try to harm one another. David Chester, Ph.D., director of the lab and an assistant professor of psychology, is co-author of the study.

The study's findings could be useful in clinical and forensic settings, Lasko said, particularly for developing effective prevention and early intervention strategies, in that they could help identify strengths psychopathic individuals possess that could deter future antisocial behavior.

Friday, 27 September 2019

The future of 'extremely' energy-efficient circuits

Data centers are processing data and dispensing the results at astonishing rates, and such robust systems require a significant amount of energy -- so much energy, in fact, that information communication technology is projected to account for 20% of total energy consumption in the United States by 2020.
To meet this demand, a team of researchers from Japan and the United States has developed a framework to reduce energy consumption while improving efficiency.
They published their results on July 19 in Scientific Reports, a Nature journal.
"The significant amount of energy consumption has become a critical problem in modern society," said Olivia Chen, corresponding author of the paper and assistant professor in the Institute of Advanced Sciences at Yokohama National University. "There is an urgent requirement for extremely energy-efficient computing technologies."
The research team used a digital logic process called Adiabatic Quantum-Flux-Parametron (AQFP). The idea behind the logic is that direct current should be replaced with alternating current. The alternating current acts as both the clock signal and the power supply -- as the current switches directions, it signals the next time phase for computing.
The logic, according to Chen, could improve conventional communication technologies with currently available fabrication processes.
"However, there lacks a systematic, automatic synthesis framework to translate from high-level logic description to Adiabatic Quantum-Flux-Parametron circuit netlist structures," Chen said, referring to the individual processors within the circuit. "In this paper, we mitigate that gap by presenting an automatic flow. We also demonstrate that AQFP can achieve a reduction in energy use by several orders of magnitude compared to traditional technologies."
The researchers proposed a top-down framework for computing decisions that can also analyze its own performance. To do this, they used logic synthesis, a process by which they direct the passage of information through logic gates within the processing unit. Logic gates can take in a little bit of information and output a yes or no answer. The answer can trigger other gates to respond and move the process forward, or stop it completely.
With this basis, the researchers developed a computation logic that takes the high-level understanding of processing and how much energy a system uses and dissipates and describes it as an optimized map for each gate within the circuit model. From this, Chen and the research team can balance the estimation of power needed to process through the system and the energy that the system dissipates.
According to Chen, this approach also compensates for the cooling energy needed for superconducting technologies and reduces the energy dissipation by two orders of magnitude.
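The cooling-overhead argument can be made concrete with an order-of-magnitude comparison. All numbers below are illustrative assumptions for the sake of the arithmetic, not figures from the paper: even after multiplying a cryogenic switching energy by a ~1000× refrigeration penalty at liquid-helium temperatures, the superconducting logic can remain roughly two orders of magnitude below a CMOS-like switching energy.

```python
# Illustrative, assumed values (order-of-magnitude only):
cmos_switch_energy = 1e-15  # J per switching event, ~fJ scale for CMOS
aqfp_switch_energy = 1e-20  # J per switching event for AQFP at 4.2 K
cooling_overhead = 1000     # watts of wall power per watt removed at 4.2 K

# Charge the cryocooler's inefficiency against every AQFP switching event.
effective_aqfp = aqfp_switch_energy * cooling_overhead
advantage = cmos_switch_energy / effective_aqfp

print(f"effective AQFP energy: {effective_aqfp:.0e} J")
print(f"advantage over CMOS: ~{advantage:.0f}x")  # ~two orders of magnitude
```

The design choice this illustrates is why the cooling penalty does not erase the benefit: the raw switching-energy gap is so large that even a three-orders-of-magnitude refrigeration cost still leaves a net two-orders-of-magnitude advantage, matching the reduction Chen describes.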
"These results demonstrate the potential of AQFP technology and applications for large-scale, high-performance and energy-efficient computations," Chen said.
Ultimately, the researchers plan to develop a fully automated framework to generate the most efficient AQFP circuit layout.
"The synthesis results of AQFP circuits are highly promising in terms of energy-efficient and high-performance computing," Chen said. "With the future advancement and maturation of AQFP fabrication technology, we anticipate broader applications, ranging from space applications to large-scale computing facilities such as data centers."

Investments to address climate change are good business

An internationally respected group of scientists have urgently called on world leaders to accelerate efforts to tackle climate change. Almost every aspect of the planet's environment and ecology is undergoing changes in response to climate change, some of which will be profound if not catastrophic in the future.
According to their study published in Science today, reducing the magnitude of climate change is also a good investment. Over the next few decades, acting to reduce climate change is expected to cost much less than the damage otherwise inflicted by climate change on people, infrastructure and ecosystems.
"Acting on climate change," said lead author Prof Ove Hoegh-Guldberg from the ARC Centre of Excellence for Coral Reef Studies at the University of Queensland in Australia, "has a good return on investment when one considers the damages avoided by acting."
The investment is even more compelling given the wealth of evidence that the impacts of climate change are happening faster and more extensively than projected, even just a few years ago. This makes the case for rapidly reducing greenhouse gas emissions even more compelling and urgent.
Prof Hoegh-Guldberg explained the mismatch. "First, we have underestimated the sensitivity of natural and human systems to climate change, and the speed at which these changes are happening. Second, we have underappreciated the synergistic nature of climate threats -- with the outcomes tending to be worse than the sum of the parts. This is resulting in rapid and comprehensive climate impacts, with growing damage to people, ecosystems, and livelihoods."
For example, sea-level rise can lead to higher water levels during storm events. This can create more damage. For deprived areas, this may exacerbate poverty creating further disadvantage. Each risk may be small on its own, but a small change in a number of risks can lead to large impacts.
Prof Daniela Jacob, co-author and Director of Climate Services Centre (GERICS) in Germany is concerned about these rapid changes -- especially about unprecedented weather extremes.
"We are already in new territory" said Prof Jacob, "The 'novelty' of the weather is making our ability to forecast and respond to weather-related phenomena very difficult."
These changes are having major consequences. The paper updates a database of climate-related changes and finds that there are significant benefits from avoiding 2°C and aiming to restrict the increase to 1.5°C above pre-industrial global temperatures.
Prof Rachel Warren from the Tyndall Centre at the University of East Anglia in the UK assessed projections of risk for forests, biodiversity, food, crops and other critical systems, and found very significant benefits for limiting global warming to 1.5°C rather than 2°C.
"The scientific community has quantified these risks in order to inform policy makers about the benefits of avoiding them," Prof Warren stated.
Since the Paris Agreement came into force, there has been a race to quantify the benefits of limiting warming to 1.5°C so that policy makers have the best possible information for developing the policy required to achieve it.
Prof Warren continued. "If such policy is not implemented, we will continue on the current upward trajectory of burning fossil fuels and continuing deforestation, which will expand the already large-scale degradation of ecosystems. To be honest, the overall picture is very grim unless we act."
A recent report from the United Nations projected that as many as a million species may be at risk of extinction over the coming decades and centuries. Climate change is not the only factor but is one of the most important ones.
The urgency of responding to climate change is front of mind for Prof Michael Taylor, co-author and Dean of Science at the University of the West Indies. "This is not an academic issue; it is a matter of life and death for people everywhere. That said, people from small island States and low-lying countries are in the immediate cross-hairs of climate change."
"I am very concerned about the future for these people," said Professor Taylor.
This urgency to act is further emphasized by the vulnerability of developing countries to climate change impacts as pointed out by Francois Engelbrecht, co-author and Professor of Climatology at the Global Change Institute of the University of the Witwatersrand in South Africa.
"The developing African countries are amongst those to be affected most in terms of impacts on economic growth in the absence of strong climate change mitigation," Prof Engelbrecht explains.
Prof Hoegh-Guldberg reiterated the importance of the coming year (2020) in terms of climate action and the opportunity to strengthen emission reduction pledges in line with the Paris Agreement of 2015.
"Current emission reduction commitments are inadequate and risk throwing many nations into chaos and harm, with a particular vulnerability of poor peoples. To avoid this, we must accelerate action and tighten emission reduction targets so that they fall in line with the Paris Agreement. As we show, this is much less costly than suffering the impacts of 2°C or more of climate change."
"Tackling climate change is a tall order. However, there is no alternative from the perspective of human well-being -- and too much at stake not to act urgently on this issue."

Wednesday, 10 July 2019

New study maps how ocean currents connect the world's fisheries

A new study published in the journal Science finds that the world's marine fisheries form a single network, with over $10 billion worth of fish each year being caught in a country other than the one in which it spawned.
While fisheries are traditionally managed at the national level, the study reveals the degree to which each country's fishing economy relies on the health of its neighbors' spawning grounds, highlighting the need for greater international cooperation.
Led by researchers at the University of California, Berkeley, the London School of Economics, and the University of Delaware, the study used a particle tracking computer simulation to map the flow of fish larvae across national boundaries. It is the first to estimate the extent of larval transport globally, putting fishery management in a new perspective by identifying hotspots of regional interdependence where cooperative management is needed most.
"Now we have a map of how the world's fisheries are interconnected, and where international cooperation is needed most urgently to conserve a natural resource that hundreds of millions of people rely on," said co-author Kimberly Oremus, assistant professor at the University of Delaware's School of Marine Science and Policy.
The vast majority of the world's wild-caught marine fish, an estimated 90%, are caught within 200 miles of shore, within national jurisdictions. Yet even these fish can be carried far from their spawning grounds by currents in their larval stage, before they're able to swim. This means that while countries have set national maritime boundaries, the ocean is made up of highly interconnected networks where most countries depend on their neighbors to properly manage their own fisheries. Understanding the nature of this network is an important step toward more effective fishery management, and is essential for countries whose economies and food security are reliant on fish born elsewhere.
The authors brought together their expertise in oceanography, fish biology, and economics to make progress on this complex problem.
"Data from a wide range of scientific fields needed to come together to make this study possible," said lead author Nandini Ramesh, a post-doctoral researcher in the Department of Earth and Planetary Science at the University of California, Berkeley. "We needed to look at patterns of fish spawning, the life cycles of different species, ocean currents, and how these vary with the seasons in order to begin to understand this system." The study combined data from satellites, ocean moorings, ecological field observations, and marine catch records, to build a computer model of how eggs and larvae of over 700 species of fish all over the world are transported by ocean currents.
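The particle-tracking idea can be sketched in miniature. The toy below is a one-dimensional drift model with made-up zones and currents, not the study's global simulation: release virtual larvae in each country's spawning zone, advect them with a mean current plus random dispersion, and tally a connectivity matrix recording where they settle.

```python
import random

random.seed(42)
zones = ["A", "B", "C"]  # hypothetical national waters along a coastline
width = 10.0             # each zone spans 10 km of coastline
drift, spread = 4.0, 3.0 # mean current displacement and dispersion (km)

def settle_zone(x):
    """Map a final position to the zone it lands in (clamped to the domain)."""
    i = min(max(int(x // width), 0), len(zones) - 1)
    return zones[i]

# connectivity[src][dst] = larvae spawned in src that settled in dst
connectivity = {s: {d: 0 for d in zones} for s in zones}
for i, src in enumerate(zones):
    for _ in range(1000):
        x0 = random.uniform(i * width, (i + 1) * width)  # spawning site
        x1 = x0 + random.gauss(drift, spread)            # advection + dispersion
        connectivity[src][settle_zone(x1)] += 1

for src in zones:
    print(src, connectivity[src])
```

Even in this toy version, a persistent current pushes a substantial share of each zone's larvae into the neighboring zone's waters, which is the mechanism behind the cross-border dependencies the study maps at global scale.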
The research shows that ocean regions are connected to each other in what's known as a "small world network," the same phenomenon that allows strangers to be linked by six degrees of separation. That adds a potential new risk: threats in one part of the world could result in a cascade of stresses, affecting one region after another.
"We are all dependent on the oceans," said co-author James Rising, assistant professorial research fellow at the Grantham Research Institute in the London School of Economics. "When fisheries are mismanaged or breeding grounds are not protected, it could affect food security half a world away."
A surprising finding of the study was how interconnected national fisheries are, across the globe. "This is something of a double-edged sword," explained lead author Ramesh, "On one hand, it implies that mismanagement of a fishery can have negative effects that easily propagate to other countries; on the other hand, it implies that multiple countries can benefit by targeting conservation and/or management efforts in just a few regions."
"By modeling dispersal by species, we could connect this ecosystem service to the value of catch, marine fishing jobs, food security and gross domestic product," Oremus added. "This allowed us to talk about how vulnerable a nation is to the management of fisheries in neighboring countries."
They found that the tropics are especially vulnerable to this larval movement -- particularly when it comes to food security and jobs.
"Our hope is that this study will be a stepping stone for policy makers to study their own regions more closely to determine their interdependencies," said Ramesh. "This is an important first step. This is not something people have examined before at this scale."

'Female leadership trust advantage' gives women edge in some crisis situations

Certain crises require certain female leaders. Researchers at Lehigh University and Queen's University Belfast have found that trust established by female leaders practicing strong interpersonal skills results in better crisis resolution in cases when outcomes are predictable.
They describe this "female leadership trust advantage" in a paper published in this month's print issue of Psychology of Women Quarterly. Their research is the first to examine why and when a female leadership trust advantage emerges for leaders during organizational crises.
"People trust female leaders more than male leaders in times of crisis, but only under specific conditions," said paper co-author Corinne Post, professor of management at Lehigh University. "We showed that when a crisis hits an organization, people trust leaders who behave in relational ways, and especially so when the leaders are women and when there is a predictable path out of the crisis."
Relational behaviors are shown by those who think of themselves in relation to others. Such skills help build and restore trust, and, on average, are adopted more by women than men. The researchers specifically looked at the relational behavior of interpersonal emotion management (IEM), which alleviates feelings of threat during a crisis by anticipating and managing the emotions of others. IEM behaviors include removing or altering a problem to reduce emotional impact; directing attention to something more pleasant; reappraising a situation as more positive; and modulating or suppressing one's emotional response. IEM is central to establishing or repairing trust, often eroded when a crisis occurs.
Researchers defined crisis as a common, though often unexpected, time-sensitive, high-impact event that may disrupt organizational functioning and pose relational threats. For a company, this could be a product safety concern, consumer data breach, oil spill, corruption allegation or widespread harassment.
"Crises are fraught with relational issues, which, unless handled properly, threaten not only organizational performance but also the allocation of organizational resources and even organizational survival," they said. "Organizational crises, therefore, require a great deal of relational and emotional work to build or restore trust among those affected and may influence such trusting behaviors as provision of resources to the organization," including economic resources and investment in the firm, as well as inspiring employee cooperation.
To examine differences in trust for men and women leaders during an organizational crisis, researchers created a set of crisis scenarios. In some scenarios, the CEO (at times a male and at other times a female) anticipated and managed the emotions of others as the crisis unfolded -- and in others the CEO did not attend to others' emotions at all. Scenarios were varied to depict crises with predictable or uncertain outcomes.
"We found that this female leadership trust advantage was not just attitudinal, but that -- when the consequences of the crisis were foreseeable -- people were actually ready to invest much more in the firms led by relational women," Post said. "Our finding also suggests that, in an organizational crisis, female (relative to male) leaders may generate more goodwill and resources for their organization by using relational behaviors when the crisis fallout is predictable, but may not benefit from the same advantage in crises with uncertain consequences."
Demonstrating superior relational skills may help female leaders gain a trust advantage in crises that focus primarily on relationships within an organization -- for example, when there is certainty around the resolution and the fallout from the crisis is more controllable. It may be less valuable, however, when crisis outcomes are uncertain or hard to control and both agentic leadership (making decisions and acting quickly) and relational leadership (such as maintaining high levels of communication) are required.
Authors on the paper, "A Female Leadership Trust Advantage in Times of Crisis: Under What Conditions?" include Post; Ioana Latu, lecturer in experimental social psychology at Queen's University Belfast; and Liuba Belkin, associate professor of management at Lehigh University.
The study is unique in basing its scenarios on production and food safety crises, when most studies of female leaders and organizational crisis look at financial performance crises. It also is different from most other research that "simply assumes female leaders behave more relationally," Post said. "We were able to determine how leader gender and leader relational behaviors (interpersonal emotional management) influenced trust both independently from each other and in combination."
The findings have important implications for leadership and gender research, as well as business professionals.
"Identifying what crisis management behaviors enhance trust in female leaders, and under what conditions such trust is enhanced may, for example, help to mitigate the documented higher risk for women (compared to men) of being replaced during drawn-out crises," the researchers said.
The results also suggest that to realize their leadership advantage potential, women may need to embrace relational leadership behaviors, at least under some circumstances. "Female leaders may find it helpful to know that, when uncertainty around a crisis is low, using relational leadership behaviors may help them elicit more trust from others," they said.
The research also holds implications for human resource professionals and organizational leaders.
"Because our findings reveal the importance of relational skills in eliciting trust during a crisis, we would encourage firms to consider hiring for, training and rewarding relational skills in their leaders, especially in jobs with high potential for crises," Post said.

Social context influences decision-makers' willingness to take risks

Do differences in performance affect decision-makers' appetite for risk? Economists at the University of Göttingen have addressed this question. Their study finds that people's willingness to take risks increases as soon as they earn a lower return than the peers with whom they compare themselves. Conversely, decision-makers take lower risks if they earn a higher return than their peers. The study was published in the journal Games and Economic Behavior.
Dr Stephan Müller and Professor Holger Rau from the Faculty of Business and Economics at the University of Göttingen investigated the risk preferences of 236 participants in computer laboratory experiments. Risk preferences play an important role in financial and product markets because they determine investors' behaviour and the associated profits and losses. Different choices mean that investors end up with different returns: in financial markets, for example, investors achieve higher long-term returns if they take on higher investment risks. Pronounced risk-taking therefore tends to widen the income gap between investors.
"If, for example, a fund manager generates higher profits than his colleague, this can lead to a significant increase in his colleague's willingness to take risks in order to close the existing gap," says Rau. Conversely, decision-makers are less keen on taking risks if they find that their social peer has a lower income. "Interestingly, the results of this change in risk-taking behaviour depend on how strong the test subjects' aversion to inequality is," Müller adds. The results of this study provide valuable insights into the design of employment contracts in order to control the risk-taking behaviour of employees through organisational structures and information policies. Furthermore, the study provides insights into insurance strategies for social projects in order to minimise the existing financial risk for sponsors.

Storing data in music

Manuel Eichelberger and Simon Tanner, two ETH doctoral students, store data in music. This means, for example, that background music can contain the access data for the local Wi-Fi network, and a mobile phone's built-in microphone can receive this data. "That would be handy in a hotel room," Tanner says, "since guests would get access to the hotel Wi-Fi without having to enter a password on their device."
To store the data, the two doctoral students and their colleague, Master's student Gabriel Voirol, make minimal changes to the music. In contrast to other scientists' attempts in recent years, the researchers state that their new approach allows higher data transfer rates with no audible effect on the music. "Our goal was to ensure that there was no impact on listening pleasure," Eichelberger says.
Tests the researchers have conducted show that in ideal conditions, their technique can transfer up to 400 bits per second without the average listener noticing the difference between the source music and the modified version (see also the audio sample). Given that under realistic conditions a degree of redundancy is necessary to guarantee transmission quality, the transfer rate will more likely be some 200 bits -- or around 25 letters -- per second. "In theory, it would be possible to transmit data much faster. But the higher the transfer rate, the sooner the data becomes perceptible as interfering sound, or data quality suffers," Tanner adds.
Dominant notes hide information
The researchers from ETH Zurich's Computer Engineering and Networks Laboratory use the dominant notes in a piece of music, overlaying each of them with two marginally lower and two marginally higher notes that are quieter than the dominant note. They also make use of the harmonics (one or more octaves higher) of the strongest note, inserting slightly lower and higher notes here, too. It is all these additional notes that carry the data. While a smartphone can receive and analyse this data via its built-in microphone, the human ear does not perceive these additional notes.
"When we hear a loud note, we don't notice quieter notes with a slightly higher or lower frequency," Eichelberger says. "That means we can use the dominant, loud notes in a piece of music to hide the acoustic data transfer." It follows that the best music for this kind of data transfer has lots of dominant notes -- pop songs, for instance. Quiet music is less suitable.
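The masking principle can be sketched in a few lines of Python. This is only a toy illustration, not the ETH encoder: it hides one bit per short segment by placing a single quiet side tone just above or below a loud pure carrier, and the decoder recovers each bit by comparing spectral energy at the two candidate frequencies. The carrier frequency, offset, segment length and amplitude below are arbitrary choices made for the sketch; real music would require the full multi-note scheme described above.

```python
import numpy as np

FS = 44_100        # sample rate (Hz)
CARRIER = 440.0    # the "dominant note" (here just a pure tone)
OFFSET = 10.0      # distance of the hidden side tone from the carrier (Hz)
SIDE_LEVEL = 0.02  # side tone is ~34 dB quieter than the carrier

def embed(bits, duration=0.1):
    """Hide one bit per segment: bit 1 puts a quiet tone just above
    the loud carrier, bit 0 puts it just below."""
    t = np.arange(int(FS * duration)) / FS
    out = []
    for b in bits:
        seg = np.sin(2 * np.pi * CARRIER * t)                # loud dominant note
        f = CARRIER + OFFSET if b else CARRIER - OFFSET
        seg = seg + SIDE_LEVEL * np.sin(2 * np.pi * f * t)   # masked data tone
        out.append(seg)
    return np.concatenate(out)

def decode(signal, n_bits, duration=0.1):
    """Recover bits by comparing spectral energy just above vs. just
    below the known carrier frequency."""
    n = int(FS * duration)
    freqs = np.fft.rfftfreq(n, 1 / FS)
    hi_bin = np.argmin(np.abs(freqs - (CARRIER + OFFSET)))
    lo_bin = np.argmin(np.abs(freqs - (CARRIER - OFFSET)))
    bits = []
    for i in range(n_bits):
        spec = np.abs(np.fft.rfft(signal[i * n:(i + 1) * n]))
        bits.append(1 if spec[hi_bin] > spec[lo_bin] else 0)
    return bits

message = [1, 0, 1, 1, 0, 0, 1]
audio = embed(message)
assert decode(audio, len(message)) == message
```

At 0.1 s per bit this toy channel carries only 10 bits per second; the published system reaches far higher rates by using many dominant notes and their harmonics in parallel.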
To tell the decoder algorithm in the smartphone where it needs to look for data, the scientists use very high notes that the human ear can barely register: they replace the music in the frequency range 9.8-10 kHz with an acoustic data stream that carries the information on when and where across the rest of the music's frequency spectrum to find the data being transmitted.
From the loudspeaker to the mic
The transmission principle behind this technique is fundamentally different from the well-known RDS system as used in car radios to transmit the radio station's name and details of the music that is playing. "With RDS, the data is transmitted using FM radio waves. In other words, data is sent from the FM transmitter to the radio device," Tanner explains. "What we're doing is embedding the data in the music itself -- transmitting data from the loudspeaker to the mic."

Novel C. diff structures are required for infection, offer new therapeutic targets

  Iron storage "spheres" inside the bacterium C. diff -- the leading cause of hospital-acquired infections -- could offer new targ...