News

AI in Action: AI's early proving ground: the hunt for new particles


Science  07 Jul 2017:
Vol. 357, Issue 6346, pp. 20
DOI: 10.1126/science.357.6346.20

High-energy physicists use machine learning to sift through the debris of particle collisions

Particle physicists began fiddling with artificial intelligence (AI) in the late 1980s, just as the term “neural network” captured the public's imagination. Their field lends itself to AI and machine-learning algorithms because nearly every experiment centers on finding subtle spatial patterns in the countless, similar readouts of complex particle detectors—just the sort of thing at which AI excels. “It took us several years to convince people that this is not just some magic, hocus-pocus, black box stuff,” says Boaz Klima, of Fermi National Accelerator Laboratory (Fermilab) in Batavia, Illinois, one of the first physicists to embrace the techniques. Now, AI techniques number among physicists' standard tools.

Particle physicists strive to understand the inner workings of the universe by smashing subatomic particles together with enormous energies to blast out exotic new bits of matter. In 2012, for example, teams working with the world's largest proton collider, the Large Hadron Collider (LHC) in Switzerland, discovered the long-predicted Higgs boson, the fleeting particle that is the linchpin to physicists' explanation of how all other fundamental particles get their mass.

Such exotic particles don't come with labels, however. At the LHC, a Higgs boson emerges from roughly one out of every 1 billion proton collisions, and within a billionth of a picosecond it decays into other particles, such as a pair of photons or a quartet of particles called muons. To “reconstruct” the Higgs, physicists must spot all those more-common particles and see whether they fit together in a way that's consistent with them coming from the same parent—a job made far harder by the hordes of extraneous particles in a typical collision.
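
To make the "fit together" test concrete, here is a minimal sketch of how decay products can be checked for consistency with a common parent: combine the candidates' four-momenta and compute their invariant mass, which for a true Higgs-to-two-photon decay clusters near 125 GeV. The function names and the photon kinematics below are illustrative assumptions, not real detector data or any experiment's actual reconstruction code.

```python
import numpy as np

def photon_four_momentum(pt, eta, phi):
    """Build a massless photon four-vector (E, px, py, pz) from detector coordinates."""
    px = pt * np.cos(phi)
    py = pt * np.sin(phi)
    pz = pt * np.sinh(eta)
    e = pt * np.cosh(eta)  # E equals |p| for a massless photon
    return np.array([e, px, py, pz])

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system: m^2 = E^2 - |p|^2."""
    e, px, py, pz = p1 + p2
    return np.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

# Illustrative photon kinematics in GeV (made up for this sketch).
gamma1 = photon_four_momentum(pt=63.0, eta=0.4, phi=1.2)
gamma2 = photon_four_momentum(pt=55.0, eta=-0.3, phi=-1.7)
print(f"diphoton mass: {invariant_mass(gamma1, gamma2):.1f} GeV")
```

Run on these made-up values, the script prints a diphoton mass of roughly 124 GeV, close to the Higgs mass; random, unrelated photon pairs would scatter broadly in mass rather than piling up near one value.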

Algorithms such as neural networks excel in sifting signal from background, says Pushpalatha Bhat, a physicist at Fermilab. In a particle detector—usually a huge barrel-shaped assemblage of various sensors—a photon typically creates a spray of particles or “shower” in a subsystem called an electromagnetic calorimeter. So do electrons and particles called hadrons, but their showers differ subtly from those of photons. Machine-learning algorithms can tell the difference by sniffing out correlations among the multiple variables that describe the showers. Such algorithms can also, for example, help distinguish the pairs of photons that originate from a Higgs decay from random pairs. “This is the proverbial needle-in-the-haystack problem,” Bhat says. “That's why it's so important to extract the most information we can from the data.”
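
The sketch below illustrates the kind of multivariate classification Bhat describes: a small neural network trained to separate photon-like showers from hadron-like ones using several correlated shower-shape variables. Everything here is an assumption made for illustration: the variables (width, core energy fraction, depth), the synthetic data, and the network size do not come from any real calorimeter or experiment.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Synthetic shower-shape variables (purely illustrative, not real calorimeter data):
# photon-like showers are generated narrower, with more energy in the central cells.
def make_showers(width_mean, core_frac_mean, label):
    width = rng.normal(width_mean, 0.05, n)          # lateral shower width
    core_frac = rng.normal(core_frac_mean, 0.08, n)  # energy fraction in central cells
    depth = rng.normal(0.5, 0.1, n) + 0.2 * width    # longitudinal depth, correlated with width
    X = np.column_stack([width, core_frac, depth])
    y = np.full(n, label)
    return X, y

X_photon, y_photon = make_showers(width_mean=0.20, core_frac_mean=0.85, label=1)
X_hadron, y_hadron = make_showers(width_mean=0.35, core_frac_mean=0.60, label=0)
X = np.vstack([X_photon, X_hadron])
y = np.concatenate([y_photon, y_hadron])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# A small feed-forward network learns correlations among the variables
# that a simple cut on any single variable would miss.
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print(f"photon vs. hadron test accuracy: {clf.score(X_test, y_test):.2f}")
```

The same pattern, with more variables and far larger networks, underlies the signal-versus-background separations used in real analyses: the algorithm's advantage comes from exploiting correlations among inputs jointly rather than cutting on each one in isolation.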

Neural networks search for fingerprints of new particles in the debris of collisions at the LHC.

PHOTO: © 2012 CERN, FOR THE BENEFIT OF THE ALICE COLLABORATION

Machine learning hasn't taken over the field. Physicists still rely mainly on their understanding of the underlying physics to figure out how to search data for signs of new particles and phenomena. But AI is likely to become more important, says Paolo Calafiura, a computer scientist at Lawrence Berkeley National Laboratory in Berkeley, California. In 2024, researchers plan to upgrade the LHC to increase its collision rate by a factor of 10. At that point, Calafiura says, machine learning will be vital for keeping up with the torrent of data.
