Interesting Engineering
INNOVATION
A computer cooling breakthrough uses a common material to boost power 740 percent
Holy cow. The engineers used copper instead of diamond.
By Loukia Papadopoulos
May 22, 2022 (Updated: May 24, 2022 11:29 EDT)
Fire in the microchip. Credit: Birdlkportfolio/iStock
We have all had the experience of an electronic device overheating. Needless to say, when that happens, it becomes dangerous both for the device and its surroundings. But given the speeds at which devices now operate, is overheating avoidable?
A 740 percent increase in power per unit volume
Researchers at the University of Illinois at Urbana-Champaign (UIUC) and the University of California, Berkeley (UC Berkeley) have recently devised an invention that could cool down electronics more efficiently than other alternative solutions and enable a 740 percent increase in power per unit, according to a press release by the institutions published Thursday.
Tarek Gebrael, the lead author of the new research and a UIUC Ph.D. student in mechanical engineering, explained that current cooling solutions have three specific problems. “First, they can be expensive and difficult to scale up,” he said.
He brought up the example of heat spreaders made of diamond, which is obviously very expensive. Second, he described how conventional heat-spreading approaches generally place the heat spreader and a heat sink (a device for dissipating heat efficiently) on top of the electronic device. Unfortunately, “in many cases, most of the heat is generated underneath the electronic device,” meaning that the cooling mechanism isn’t where it is needed most.
Third, explained Gebrael, heat spreaders can’t be installed directly on the surface of the electronics. They require a layer of “thermal interface material” between the spreader and the device to ensure good contact. This material, however, has poor heat-transfer characteristics, which drags down thermal performance.
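To see why that interface layer hurts, treat the stack as thermal resistances in series: the chip's temperature rise equals the dissipated power times the sum of the resistances, so the interface resistance adds directly to the junction temperature. A minimal Python sketch, with every resistance value assumed purely for illustration:

```python
# Toy model: heat flowing from a chip through a thermal interface material
# (TIM) and a heat spreader to ambient. Series resistances simply add.
# All numbers below are illustrative assumptions, not measured values.

P_watts = 50.0      # power dissipated by the chip (assumed)
R_tim = 0.40        # K/W, thermal interface material (assumed)
R_spreader = 0.15   # K/W, spreader + sink to ambient (assumed)
T_ambient = 25.0    # deg C

def junction_temp(power, resistances, t_amb):
    """Junction temperature for series thermal resistances: T = T_amb + P * sum(R)."""
    return t_amb + power * sum(resistances)

with_tim = junction_temp(P_watts, [R_tim, R_spreader], T_ambient)
without_tim = junction_temp(P_watts, [R_spreader], T_ambient)

print(f"With TIM:    {with_tim:.1f} C")    # 52.5 C
print(f"Without TIM: {without_tim:.1f} C") # 32.5 C
```

With these made-up numbers, eliminating the interface layer saves 20 degrees at the junction for the same 50 W, which is the kind of headroom the coating approach is after.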
A solution to all conventional problems
Luckily, the researchers have come up with a new solution that addresses all three of those problems.
They began by using copper as the main material, which is obviously inexpensive. Then they made the copper coating entirely “engulf” the device, said Gebrael—”covering the top, the bottom, and the sides… a conformal coating that covers all the exposed surfaces” ensuring that no heat-producing regions were left unprotected. Finally, the new solution removes the need for a thermal interface material and a heat sink. How innovative!
“In our study, we compared our coatings to standard heat sinking methods,” Gebrael said. “What we showed is that you can get very similar thermal performance, or even better performance, with the coatings compared to the heat sinks.”
Removing the heat sink and the thermal interface material also means that a device using the new solution can be dramatically smaller than its conventional counterparts. “And this translates to much higher power per unit volume. We were able to demonstrate a 740 percent increase in the power per unit volume,” added Gebrael.
Using copper instead of diamond
IE reached out to Gebrael to find out why he chose copper as a replacement material. The engineer explained that copper is much cheaper than diamond, has a relatively high thermal conductivity, and that the processes the team used to deposit the copper coating, such as electroless plating and electroplating, are well known to the electronics industry.
“We knew the copper would dissipate the heat effectively because it is already widely used in standard heat spreaders and heat sinks (due to its high thermal conductivity). The challenge was to electrically isolate it from the electronics to prevent short-circuits. We did that by depositing on the electronics a thin conformal polymer coating first and then adding the conformal copper coating on top of [the polymer],” concluded Gebrael.
The study is published in Nature Electronics.
Abstract:
Electrification is critical to decarbonizing society, but managing increasing power densification in electrical systems will require the development of new thermal management technologies. One approach is to use monolithic-metal-based heat spreaders that reduce thermal resistance and temperature fluctuation in electronic devices. However, their electrical conductivity makes them challenging to implement. Here we report co-designed electronic systems that monolithically integrate copper directly on electronic devices for heat spreading and temperature stabilization. The approach first coats the devices with an electrical insulating layer of poly(2-chloro-p-xylylene) (parylene C) and then a conformal coating of copper. This allows the copper to be in close proximity to the heat-generating elements, eliminating the need for thermal interface materials and providing improved cooling performance compared with existing technologies. We test the approach with gallium nitride power transistors, and show that it can be used in systems operating at up to 600 V and provides a low junction-to-ambient specific thermal resistance of 2.3 cm2 K W–1 in quiescent air and 0.7 cm2 K W–1 in quiescent water.
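Those specific thermal resistances map directly onto sustainable power density: for an allowed junction-to-ambient temperature rise ΔT, the heat flux is q = ΔT / R″. A quick sketch using the resistances reported in the abstract and an assumed 60 K temperature budget (the budget is an assumption, not a figure from the paper):

```python
# Allowed heat flux from junction-to-ambient specific thermal resistance:
# q'' = delta_T / R''. The R'' values are from the paper's abstract; the
# 60 K allowable temperature rise is an assumption for illustration.

R_air = 2.3     # cm^2 K / W, quiescent air (reported)
R_water = 0.7   # cm^2 K / W, quiescent water (reported)
delta_T = 60.0  # K, assumed allowable junction-to-ambient rise

for name, R in [("air", R_air), ("water", R_water)]:
    q = delta_T / R  # W per cm^2 of device area
    print(f"Quiescent {name}: up to {q:.0f} W/cm^2")
# air:   ~26 W/cm^2
# water: ~86 W/cm^2
```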
This story has been updated to include commentary from the researcher.

NEWS FEATURE · 25 May 2022
How the revamped Large Hadron Collider will hunt for new physics
The particle-smashing machine has fired up again — sparking fresh hope it can find unusual results.
Elizabeth Gibney
Detectors at the ALICE experiment were revamped during the Large Hadron Collider’s 2018–22 shutdown. Credit: Maximilien Brice, Julien Marius Ordan/CERN
The hunt for new physics is back on. The world’s most powerful machine for smashing high-energy particles together, the Large Hadron Collider (LHC), has fired up after a shutdown of more than three years. Beams of protons are once again whizzing around its 27-kilometre loop at CERN, Europe’s particle-physics laboratory near Geneva. By July, physicists will be able to switch on their experiments and watch bunches of particles collide.
In its first two stints, in 2009–13 and 2015–18, the LHC explored the known physics world. All of that work — including the triumphant 2012 discovery of the Higgs boson — reaffirmed physicists’ current best description of the particles and forces that make up the Universe: the standard model. But scientists sifting through the detritus of quadrillions of high-energy collisions have yet to find proof of any surprising new particles or anything else completely unknown.
This time could be different. The LHC has so far cost US$9.2 billion to build, including the latest upgrades: version three comes with more data, better detectors and innovative ways to search for new physics. What’s more, scientists start with a tantalizing shopping list of anomalous results — many more than at the start of the last run — that hint at where to look for particles outside the standard model.
“We’re really starting with adrenaline up,” says Isabel Pedraza, a particle physicist at the Meritorious Autonomous University of Puebla (BUAP) in Mexico. “I’m sure we will see something in run 3.”
Higher energy and more data
After renovations to its particle accelerators, the third version of the LHC will collide protons at 13.6 trillion electron volts (TeV) — slightly higher than in run 2, which reached 13 TeV. The more-energetic smashes should increase the chances that collisions will create particles in high-energy regions where some theories suggest new physics could lie, says Rende Steerenberg, who leads beam operations at CERN. The machine’s beams will also deliver more-compact bunches of particles, increasing the probability of collisions. This will allow the LHC to maintain its peak rate of collisions for longer, ultimately allowing experiments to record as many data as in the first two runs combined.
To deal with the flood, the machine’s detectors — layers of sensors that capture particles that spray from collisions and measure their energy, momentum and other properties — have been upgraded to make them more efficient and precise (see ‘Data boost’).
[Infographic: ‘Data boost’. Nik Spencer/Nature; Source: CERN]
A major challenge for LHC researchers has always been that so little of the collision data can be stored. The machine collides bunches 40 million times per second, and each proton–proton collision, or ‘event’, can spew out hundreds of particles. ‘Trigger’ systems must weed out the most interesting of these events and throw the bulk of the data away. For example, at CMS — one of the LHC’s four main experiments — a trigger built into the hardware makes a rough cut of around 100,000 events per second on the basis of assessments of properties such as the particles’ energies, before software picks out around 1,000 to reconstruct in full for analysis.
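In rough numbers, the hardware stage keeps about 1 event in 400 (40 million down to roughly 100,000 per second) and the software stage about 1 in 100 of those (down to roughly 1,000 per second). A toy two-stage trigger over synthetic events, with cut thresholds tuned only to reproduce those reduction factors; nothing here reflects real CMS trigger logic:

```python
import numpy as np

# Toy two-stage trigger. Event "energies" are random stand-ins; the cuts
# are chosen only to reproduce the approximate reduction factors quoted
# in the text (40 MHz -> ~100 kHz in hardware, then -> ~1 kHz in software).
rng = np.random.default_rng(0)
n = 4_000_000                   # a tenth of a second of bunch crossings
energy = rng.exponential(1.0, n)

hw_cut = np.quantile(energy, 1 - 1/400)    # hardware keeps ~1 in 400
after_hw = energy[energy > hw_cut]

sw_cut = np.quantile(after_hw, 1 - 1/100)  # software keeps ~1 in 100 of those
after_sw = after_hw[after_hw > sw_cut]

print(f"hardware stage: ~{after_hw.size * 10:,} events/s")  # ~100,000
print(f"software stage: ~{after_sw.size * 10:,} events/s")  # ~1,000
```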
With more data, the trigger systems must triage even more events. One improvement comes from a trial of chips originally designed for video games, called GPUs (graphics processing units). These can reconstruct particle histories more quickly than conventional processors can, so the software will be able to scan faster and across more criteria each second. That will allow it to potentially spot strange collisions that might previously have been missed.
In particular, the LHCb experiment has revamped its detector electronics so that it will use only software to scan events for interesting physics. Improvements across the experiment mean that it should collect four times more data in run 3 than it did in run 2. It is “almost like a brand new detector”, says Yasmine Amhis, a physicist at the Irène Joliot-Curie Laboratory of the Physics of the Two Infinities (IJCLab) in Orsay, France, and a member of the LHCb collaboration.
The LHCb’s ‘vertex locator’, placed close to the LHC’s beamline to see short-lived particles. Credit: Maximilien Brice, Julien Marius Ordan/CERN
Spotting anomalies
Run 3 will also give physicists more precision in their measurements of known particles, such as the Higgs boson, says Ludovico Pontecorvo, a physicist with the ATLAS experiment. This alone could produce results that conflict with known physics — for instance, if a more precise measurement shrinks the error bars enough to put a value outside the standard model’s predictions.
But physicists also want to know whether a host of odd recent results are genuine anomalies, which might help to fill some gaps in understanding about the Universe. The standard model is incomplete: it cannot account for phenomena such as dark matter, for instance. And findings that jar with the model — but are not firm enough to claim as a definite discrepancy — have popped up many times in the past two years (see ‘Hints of new physics?’).
[Infographic: ‘Hints of new physics?’. Nik Spencer/Nature; Source: CERN]
The most recent is from the Tevatron collider at the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, which shut down in 2011. Researchers have spent the past decade poring through data from the Tevatron’s CDF experiment. In April, they reported1 that the mass of the W boson, a fundamental particle that carries the weak nuclear force involved in radioactive decay, is significantly higher than the standard model predicts.
That doesn’t chime with LHC data: measurements at ATLAS and LHCb disagree with the CDF data, although they are less precise. Physicists at CMS are now working on their own measurement, using data from the machine’s second run. Data from run 3 could provide a definitive answer, although not immediately, because the mass of the W boson is notoriously difficult to measure.
B-meson confusion
The LHC’s data have hinted at other anomalies. In particular, evidence has been building for almost a decade of odd behaviour in particles called B mesons. These transient particles, which quickly decay into others, are so named because they contain pairs of fundamental particles that include a ‘bottom’ or ‘beauty’ quark. LHCb analyses suggest that B-meson decays tend to produce electrons more often than they produce their heavier cousins, muons2. The standard model predicts that nature should not prefer one over the other, says Tara Shears, a particle physicist at the University of Liverpool, UK, and a member of the LHCb collaboration. “Muons are being produced about 15% less often than electrons, and it’s utterly bizarre,” she says.
The result differs from the predictions of the standard model with a significance of around 3 sigma, or 3 standard deviations from what’s expected — which translates to a 3 in 1,000 chance that random noise could have produced the apparent bias. Only more data can confirm whether the effect is real or a statistical fluke. Experimentalists might have misunderstood something in their data or machine, but now that many of the relevant LHCb detectors have been replaced, the next phase of data-gathering should provide a cross-check, Shears says. “We will be crushed if [the anomaly] goes away. But that’s life as a scientist, that can happen.”
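That conversion from sigmas to odds is just the tail probability of a normal distribution; physicists usually require 5 sigma, roughly a 1-in-3.5-million one-sided chance, before claiming a discovery. A short check using only Python's standard library:

```python
import math

def two_sided_p(sigma):
    """Two-sided tail probability of a standard normal at `sigma` deviations."""
    return math.erfc(sigma / math.sqrt(2))

print(f"3 sigma: p = {two_sided_p(3):.4f}")  # ~0.0027, i.e. ~3 in 1,000
print(f"5 sigma: p = {two_sided_p(5):.2e}")  # ~5.7e-07 two-sided

# The discovery convention is usually quoted one-sided:
print(f"5 sigma one-sided: about 1 in {2 / two_sided_p(5):,.0f}")  # ~1 in 3.5 million
```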
The anomaly is backed up by similar subtle discrepancies that LHCb has seen in other decays involving bottom quarks; experiments at colliders in Japan and the United States have also seen hints of this odd result. This kind of work is LHCb’s métier: its detectors were designed to study in detail the decays of particles that contain heavy quarks, allowing the experiment to gather indirect hints of phenomena that might influence these particles’ behaviour. CMS and ATLAS are more general-purpose experiments, but experimenters there are now checking to see whether they can spot more of the events that are sensitive to the anomalies, says Florencia Canelli, an experimental particle physicist at the University of Zurich in Switzerland and member of the CMS collaboration.
Hunt for the leptoquark
CMS and ATLAS will also do what LHCb cannot: comb collision data to look directly for the exotic particles that theorists suggest could be causing the still-unconfirmed anomalies. One such hypothetical particle has been dubbed the leptoquark, because it would, at high energies, take on properties of two otherwise distinct families of particles — leptons, such as electrons and muons, and quarks (see ‘Decoding decays’). This hybrid particle comes from theories that seek to unite the electromagnetic, weak and strong fundamental forces as aspects of the same force, and could explain the LHCb results. The leptoquark — or a complex version of it — also fits with another tantalizing anomaly: a measurement last year3, from the Muon g − 2 experiment at Fermilab, showing that muons are more magnetic than expected.
[Infographic: ‘Decoding decays’. Nik Spencer/Nature]
At the Moriond particle-physics conference in La Thuile, Italy, in March, CMS researchers presented results of a search that found intriguing hints of a beyond-standard-model lepton. This particle would interact with leptoquarks and is predicted by some leptoquark theories. Physicists saw a slight excess of the particles that the proposed lepton could decay into, bottom quarks and taus (heavier cousins of the muon), but the finding’s significance is only 2.8 sigma. “Those are very exciting results, as LHCb is also seeing something similar,” says Pedraza. CMS physicists presented hints of other new phenomena at the conference: two possible particles that might decay into two taus, and a potential high-energy particle that, through a theorized but unproven decay route, would turn into distinctive particle cascades termed jets.
Another intriguing result comes from ATLAS, where Ismet Siral at the University of Oregon in Eugene and his colleagues looked for hypothetical heavy, long-lived charged particles. In trillions of collisions from three years of data, they found seven candidates at around 1.4 TeV, around eight times the mass of the heaviest known particle4. Those results have a significance of 3.3 sigma, and the identity of the candidate particles remains a mystery. “We don’t know if this is real, we need more data. That’s where run 3 comes in,” says Siral.
CERN’s 86-metre-long Linac4 accelerator, which produces proton beams for the Large Hadron Collider. Credit: Robert Hradil, Monika Majer/ProStudio22.ch/CERN
Another LHC experiment, ALICE, will explore its own surprising finding: that the extreme conditions created in collisions between lead ions (which the LHC smashes together when not working with protons) might crop up elsewhere. ALICE is designed to study quark–gluon plasma, a hot, dense soup of fundamental particles created in collisions of heavy ions that is thought to have existed just after the Big Bang. Analyses of the first two runs found that particles in proton–proton and proton–lead ion collisions show some traits of this state of matter, such as paths that are correlated rather than random. “It’s an extremely interesting, unexpected phenomenon,” says Barbara Erazmus, deputy spokesperson for ALICE at CERN.
Like LHCb, ALICE has had a major upgrade, including updated electronics to provide it with a faster software-only trigger system. The experiment, which will probe the temperature of the plasma as well as precisely measuring particles that contain charm and beauty quarks, will be able to collect 100 times more events this time than in its previous two runs, thanks to improvements across its detectors.
Machine learning aids the search
Run 3 will also see entirely new experiments. FASER, half a kilometre from ATLAS, will hunt for light and weakly interacting particles, including neutrinos and new phenomena that could explain dark matter. (These particles can’t be spotted by ATLAS, because they would fly out of collisions on a trajectory that hugs close to the LHC’s beamline and evades the detectors.) Meanwhile, the ATLAS and CMS experiments now have improved detectors but will not receive major hardware upgrades until the next long shutdown, in 2026. At that point, the LHC will be overhauled to create more focused ‘high-luminosity’ beams, which will start up in 2029 (see ‘LHC timeline’). This will allow scientists in the following runs to collect 10 times more collision data than in runs 1 to 3 combined. For now, CMS and ATLAS have prototype technology to help them prepare.
[Infographic: ‘LHC timeline’. Nik Spencer/Nature; Source: CERN]
As well as collecting more events, physicists such as Siral are keen to change the way in which LHC experiments hunt for particles. So far, much of the LHC’s research has involved testing specific predictions (such as searching for the Higgs where physicists expected to see it) or hunting for particular hypotheses of new physics.
Scientists thought this would be a fruitful strategy, because they had a good steer on where to look. Many expected to find new heavy particles, such as those predicted by a group of theories known as supersymmetry, soon after the LHC started. That they have seen none rules out all but the most convoluted versions of supersymmetry. Today, few theoretical extensions of the standard model seem any more likely to be true than others.
Experimentalists are now shifting to search strategies that are less constrained by expectations. Both ATLAS and CMS are going to search for long-lived particles that could linger across two collisions, for instance. New search strategies often mean writing analysis software that rejects the usual assumptions, says Siral.
Machine learning is likely to help, too. Many LHC experiments already use this technique to distinguish particular sought-for collisions from the background noise. This is ‘supervised’ learning: the algorithm is given a pattern to hunt for. But researchers are increasingly using ‘unsupervised’ machine-learning algorithms that can scan widely for anomalies, without expectations. For example, a neural network can compare events against a learned simulation of the standard model. If the simulation can’t recreate the event, that’s an anomaly. Although this kind of approach is not yet used systematically, “I do think this is the direction people will go in,” says Sascha Caron of Radboud University Nijmegen in the Netherlands, who works on applying these techniques to ATLAS data.
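As a rough illustration of that idea (a sketch, not the actual ATLAS or CMS software): train an autoencoder only on standard-model-like events, then score new events by how badly the network reconstructs them. Here scikit-learn's MLPRegressor stands in for the neural network, and the four event features are invented for the demonstration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Invented stand-ins: 4 features per event (e.g. energies, momenta).
# "Standard model" events cluster near the origin; anomalies sit far away.
sm_events = rng.normal(0.0, 1.0, size=(5000, 4))
anomalies = rng.normal(5.0, 1.0, size=(20, 4))

# Autoencoder: train the network to reproduce its own input through a
# narrow bottleneck, using ONLY standard-model-like events.
autoencoder = MLPRegressor(hidden_layer_sizes=(8, 2, 8), max_iter=2000,
                           random_state=0)
autoencoder.fit(sm_events, sm_events)

def anomaly_score(events):
    """Reconstruction error; large values mean the event looks non-standard."""
    recon = autoencoder.predict(events)
    return np.mean((events - recon) ** 2, axis=1)

print("typical SM scores:     ", anomaly_score(sm_events[:5]).round(2))
print("typical anomaly scores:", anomaly_score(anomalies[:5]).round(2))
```

Events the learned model cannot reconstruct get large scores and would be flagged for a closer look, which is the essence of the unsupervised strategy described above.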
In making searches less biased, the triggers that decide which events are interesting to look at are crucial, so it helps that the new GPUs will be able to scour candidate events with wider criteria. CMS will also use an approach called ‘scouting’: analysing rough reconstructions of all the 100,000 or so events initially selected but not saved in full detail. “It’s the equivalent of 10 years more of running your detector, but in one year,” says Andrea Massironi, a physicist with the CMS experiment.
The detector at the Large Hadron Collider’s CMS experiment, pictured during the machine’s shutdown. Credit: Samuel Joseph Hertzog, Julien Marius Ordan/CERN
The triggers themselves could also soon rely on machine learning to make their choices. Katya Govorkova, a particle physicist at CERN, and her colleagues have come up with a high-speed proof-of-principle algorithm that uses machine learning to select which of the collider’s 40 million events per second to save, according to their fit with the standard model5. In run 3, researchers plan to train and test the algorithm on CMS collisions, alongside the experiment’s conventional trigger. A challenge will be knowing how to analyse events that the algorithm labels as anomalous, because it cannot yet point to exactly why an event is anomalous, says Govorkova.
Physicists must keep an open mind about where they might find the thread that will lead them to a theory beyond the standard model, says Amhis. Although the current crop of anomalies is exciting, even previous oddities seen by multiple experiments turned out to be statistical flukes that faded away when more data were gathered. “It’s important that we continue to push all of the physics programme,” she says. “It’s a matter of not putting all your eggs in one basket.”
Nature 605, 604-607 (2022)
doi: https://doi.org/10.1038/d41586-022-01388-6
References
1. CDF Collaboration. Science 376, 170–176 (2022).
2. LHCb Collaboration. Nature Phys. 18, 277–282 (2022).
3. Abi, B. et al. Phys. Rev. Lett. 126, 141801 (2021).
4. ATLAS Collaboration. Preprint at arXiv https://doi.org/10.48550/arXiv.2205.06013 (2022).
5. Govorkova, E. et al. Nature Mach. Intell. 4, 154–161 (2022).
Scientists Turn Nuclear Waste Into Diamond Batteries Lasting Thousands of Years
[May 24, 2022: Maia Mulko, University of Bristol]
A prototype of Arkenlight’s gammavoltaic battery that will convert gamma rays from nuclear waste repositories into electricity. (CREDIT: University of Bristol)
Nuclear power is considered a clean energy source because it has zero carbon dioxide emissions; yet, at the same time, it produces massive amounts of hazardous, radioactive waste that piles up as more and more reactors are built around the world.
Experts have proposed different solutions to this problem in order to better protect the environment and people’s health. Because safe storage space for nuclear waste is in short supply, most of these ideas focus on reusing the material.
Radioactive diamond batteries were first developed in 2016 and were immediately acclaimed because they promised a new, cost-effective way of recycling nuclear waste. In this context, it’s natural to ask whether they are the ultimate solution to these toxic, lethal residues.
What Are Radioactive Diamond Batteries?
Radioactive diamond batteries were first developed by a team of physicists and chemists from the Cabot Institute for the Environment of the University of Bristol. The invention was presented as a betavoltaic device, which means that it’s powered by the beta decay of nuclear waste.
Beta decay is a type of radioactive decay that occurs when an atom’s nucleus has an unstable ratio of protons to neutrons: a neutron converts into a proton (or vice versa), releasing a kind of ionizing radiation called beta radiation, which consists of high-speed, high-energy electrons or positrons known as beta particles.
Beta particles carry kinetic energy that can be converted into electrical energy by a semiconductor.
Diagram of beta decay, in which a nucleus with an unstable proton-to-neutron ratio emits a beta particle. (CREDIT: MikeRun/Wikimedia Commons)
A typical betavoltaic cell consists of thin layers of radioactive material placed between semiconductors. As the nuclear material decays, it emits beta particles that knock electrons loose in the semiconductor, creating an electric current.
However, the power density of the radioactive source is lower the further it is from the semiconductor. On top of this, because beta particles are randomly emitted in all directions, only a small number of them will hit the semiconductor, and only a small number of those will be converted into electricity. This means that nuclear batteries are much less efficient than other types of batteries. This is where the polycrystalline diamond (PCD) comes in.
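The geometric loss is easy to estimate: a point in the source emitting isotropically sends particles toward the semiconductor only within a limited solid angle. A sketch using the on-axis solid-angle formula for a disk-shaped collector; the gap sizes and collector radius are made-up illustrative values, not dimensions from the article:

```python
import math

def capture_fraction(h_mm, r_mm):
    """Fraction of isotropically emitted particles from a point source on the
    axis, a height h above a disk of radius r, that head toward the disk.
    On-axis solid angle of a disk: Omega = 2*pi*(1 - h/sqrt(h^2 + r^2))."""
    omega = 2 * math.pi * (1 - h_mm / math.sqrt(h_mm**2 + r_mm**2))
    return omega / (4 * math.pi)

# Illustrative geometries (assumed):
for h in (0.01, 0.1, 1.0):  # source-to-semiconductor gap, mm
    print(f"gap = {h:5.2f} mm -> {capture_fraction(h, r_mm=5.0):.1%} captured")
# The closer the source sits to the semiconductor, the nearer the fraction
# gets to the 50% upper bound for a one-sided planar collector.
```

Merging source and semiconductor into a single diamond removes that gap entirely, which is why the material matters so much here.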
Radioactive diamond batteries are made using a process called chemical vapor deposition (CVD), which is widely used to manufacture artificial diamonds. It uses a hydrogen and methane plasma to grow diamond films at very high temperatures. The researchers modified the CVD process to grow radioactive diamond, using methane that contains the radioactive isotope carbon-14, recovered from irradiated reactor graphite blocks.
Diamond is one of the hardest materials known, even harder than silicon carbide, and it can act as both the radioactive source and the semiconductor. The result is a long-duration battery that never needs recharging: the nuclear waste in its interior keeps generating current for as long as the isotope keeps decaying.
However, the Bristol team warned that their radioactive diamond batteries wouldn’t be suitable for laptops or smartphones. Each contains only 1 g of carbon-14 and delivers very low power, only a few microwatts, far less than a typical AA battery. For now, their applications are limited to small devices that must sit unattended for a long time, such as sensors and pacemakers.
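The microwatt scale follows from decay physics: one gram of carbon-14 produces roughly 1.6 × 10^11 decays per second at a mean beta energy near 49 keV, about a milliwatt of raw decay power, of which a betavoltaic converts only a small fraction. A back-of-envelope sketch in which the 1% conversion efficiency is an assumption, not a figure from the Bristol team:

```python
import math

AVOGADRO = 6.022e23
EV_TO_J = 1.602e-19
HALF_LIFE_S = 5730 * 365.25 * 24 * 3600  # carbon-14 half-life in seconds

grams = 1.0
atoms = grams / 14.0 * AVOGADRO               # atoms of C-14 in 1 g
activity = atoms * math.log(2) / HALF_LIFE_S  # decays per second
mean_beta_ev = 49e3                           # mean C-14 beta energy, ~49 keV

decay_power = activity * mean_beta_ev * EV_TO_J  # watts of raw decay power
efficiency = 0.01                                # assumed conversion efficiency

print(f"activity:    {activity:.2e} Bq")                    # ~1.6e11 Bq
print(f"decay power: {decay_power * 1e3:.2f} mW")           # ~1.3 mW
print(f"electrical:  {decay_power * efficiency * 1e6:.0f} uW")  # ~13 uW
```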
Nano Diamond Radioactive Batteries
The origins of nuclear batteries can be traced back to 1913, when the English physicist Henry Moseley showed that particle radiation could generate an electric current. In the 1950s and 1960s, the aerospace industry took a strong interest in Moseley’s discovery, which could potentially power spacecraft on long-duration missions. The RCA Corporation also researched applications for nuclear batteries in radio receivers and hearing aids.
Nano diamond crystals. (CREDIT: D. Mukherjee/Wikimedia Commons)
But other technologies were needed to turn the idea into a practical device. In this regard, the use of synthetic diamond is seen as revolutionary, since it lends the radioactive battery both safety and conductivity. With the addition of nanotechnology, an American company has built a higher-power nano-diamond battery.
A prototype of Arkenlight’s carbon-14 diamond betavoltaic battery. (CREDIT: University of Bristol)
Based in San Francisco, California, NDB Inc. was founded in 2012 with the objective of creating a cleaner and greener alternative to conventional batteries. The startup introduced its version of diamond-based batteries in 2016 and announced two proof-of-concept tests in 2020. It’s one of the firms that is attempting to commercialize radioactive diamond batteries.
Nano-diamond batteries from NDB are described as alpha, beta, and neutron voltaic devices and, according to the company’s website, have several notable features.
Durability. The firm calculates that the batteries could last up to 28,000 years (see the decay sketch after this list), meaning they could reliably power space vehicles on long-duration missions, space stations, and satellites. Drones, electric cars, and aircraft on Earth would never need to stop to recharge.
Safety. Diamond is not only one of the hardest substances but also one of the most thermally conductive materials in the world, which helps it shed the heat produced by the radioisotopes the battery is built around and keeps the cell at safe temperatures while decay energy is converted into electric current.
Market-friendliness. Thin-film layers of PCD let the battery take on different shapes and form factors. This is why nano-diamond batteries could be multipurpose and enter different markets, from the aforementioned space applications to consumer electronics. The consumer version would not last more than a decade, though.
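Taking carbon-14 as the fuel (an assumption here; NDB describes alpha, beta, and neutron voltaic designs), the 28,000-year figure can be sanity-checked against the isotope's 5,730-year half-life. Twenty-eight thousand years is nearly five half-lives, after which only a few percent of the original output remains, so the quoted lifetime presumes a very low power draw:

```python
# Remaining fraction of carbon-14 activity (and hence power) after t years:
# N(t)/N0 = 2 ** (-t / half_life)
HALF_LIFE_YEARS = 5730  # carbon-14

for t in (10, 100, 5730, 28_000):
    frac = 2 ** (-t / HALF_LIFE_YEARS)
    print(f"after {t:>6} years: {frac:7.2%} of initial output")
# After 28,000 years (~4.9 half-lives) only about 3% of the original power
# remains, so "lasting" depends on how little power the application needs.
```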
Nano-diamond batteries are scheduled to come onto the market in 2023.
Arkenlight, the English firm commercializing Bristol’s radioactive diamond battery, plans on releasing their first product, a microbattery, to the market in the latter part of 2023.
The Future of Radioactive Diamond-Based Batteries
The portability of modern electronic devices, the increasing popularity of electric vehicles, and the 21st-century race to take humanity on long space missions to Mars have triggered growing interest in battery research over the last few years.
Some types of batteries are more appropriate for certain applications and not as useful for others. But we can say that the conventional lithium-ion batteries that we are familiar with won’t be replaced with radioactive diamond batteries any time soon.
Conventional batteries are much cheaper to manufacture, but their relatively short lifespan (about five years) is problematic: worn-out cells generate a great deal of electronic waste, which is not easy to recycle.
Radioactive diamond batteries are more convenient because they have a much longer lifespan. If they can be developed into a universal battery, as NDB Inc. proposes, we could end up with smartphone batteries that last much longer than the life of the smartphone, and we could simply move the battery from one phone to the next, much as we now transfer the SIM card.
However, the diamond betavoltaics developed by Arkenlight won’t go that far. The company is working on designs that stack many of its carbon-14 betabatteries into cells. To provide high-power discharge, each cell could be paired with a small supercapacitor, offering an excellent quick-discharge capability.
However, carbon-14 stays radioactive for a very long time: its half-life is about 5,730 years. If that radiation source were to leak out of the device in gaseous form, it could be a problem. That’s where the diamond comes in: locked in the diamond lattice, the C-14 is a solid, so it can’t escape and be absorbed by a living being.
The United Kingdom Atomic Energy Authority (UKAEA) calculated that 100 pounds (approximately 45 kg) of carbon-14 could allow the fabrication of millions of long-duration diamond-based batteries. These batteries could also reduce the costs of nuclear waste storage.
University of Bristol researcher Professor Tom Scott told Nuclear Energy Insider that, “By removing the Carbon-14 from irradiated graphite directly from the reactor, this would make the remaining waste products less radioactive and therefore easier to manage and dispose of. Cost estimates for disposing of the graphite waste are 46,000 pounds ($60,000) per cubic meter for Intermediate Level Waste [ILW] and 3,000 pounds ($4,000) per cubic meter for Low-Level Waste [LLW].”
Don’t all these features make them one of the best options for the sustainable future we need? We’ll have to wait and see whether the manufacturers can rein in production costs, work within the low energy output, and get their diamond-based batteries onto the market at accessible prices.
Note: Materials provided above by University of Bristol. Content may be edited for style and length.
