—Frances Houle, Kate Kirby, Laura Greene & Michael Marder
In April 2024, Nature released detailed information about investigations into claims made by Ranga Dias, a physicist at the University of Rochester, in two high-profile papers the journal had published on the discovery of room-temperature superconductivity. Both papers, found to contain evidence of fabricated data, were eventually retracted, along with other papers by Dias.
This work made it into top journals because reviewers are accustomed to trusting that data have not been wholesale manipulated, and because Dias’s experiments required very high pressures that other labs could not easily replicate. One natural reaction from the physics community would be “How could we ever have let this happen?” But another should be “Here we go again!”
Alas, a pattern of similar behavior has been known for at least two decades, and improved education alone is not enough to sustain a culture of ethics in physics. Here’s what else we need to do.
OpenAI has released a new ChatGPT bot that you can talk to
The news: OpenAI is rolling out an advanced AI chatbot that you can talk to. It’s available now, at least for some users. The new ChatGPT voice bot can tell what different tones of voice convey, respond to interruptions, and reply to queries in real time. It has also been trained to sound more natural and to use its voice to convey a wide range of emotions.
Why it matters: The new chatbot represents OpenAI’s push into a new generation of AI-powered voice assistants in the vein of Siri and Alexa, but with far greater ability to hold natural, fluent conversations. It is a step in the march toward more fully capable AI agents. Read the full story.