CERN eggheads burn AI into silicon to stem data deluge
Briefly

"CERN burns custom nanosecond-speed AI into the silicon itself just to eliminate excess data. Each year the LHC produces 40,000 EBs of unfiltered sensor data alone, or about a fourth of the size of the entire Internet."
"Algorithms processing this data must be extremely fast. So fast that decisions must be burned into the chip design itself. The LHC detector systems process data at speeds up to hundreds of terabytes per second."
"Contained in a 27-kilometer ring located a hundred meters underground, the LHC smashes subatomic particles together at near-light speeds. The resulting collisions are expected to produce new types of matter that fill out our understanding of the Standard Model of particle physics."
CERN uses machine learning to cope with the flood of data from the Large Hadron Collider, which generates 40,000 EB of unfiltered sensor data a year. Because only a small fraction can be stored, the data must be reduced in real time, and the filtering algorithms must run so fast that their decisions are built directly into the detector chips themselves. The LHC, housed in a 27-kilometer ring roughly 100 meters underground, collides particles at near-light speed to probe new forms of matter and refine the Standard Model of particle physics.
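
To make the speed constraint concrete: a decision that must complete in nanoseconds cannot go through conventional software, so quantized models are compiled into FPGA or ASIC logic at the detector. The sketch below is not CERN's actual code; the channel count, weights, and threshold are all invented for illustration. It shows the kind of fixed-point, single-pass classifier that maps naturally onto such hardware, where the loop unrolls into parallel multiply-accumulate units and the keep/drop decision completes in a fixed number of clock cycles:

// Hypothetical sketch of a trigger-style filter: a tiny fixed-point
// perceptron that decides, per event, whether raw detector data is kept.
// All names and values are illustrative; real LHC trigger logic is
// synthesized to FPGAs/ASICs rather than run as host C++.

#include <array>
#include <cstdint>
#include <iostream>

constexpr int N_CHANNELS = 8;     // sensor channels per event (illustrative)
constexpr int FRAC_BITS  = 8;     // Q8 fixed point: value = raw / 256

// Illustrative weights, quantized to integers as a hardware design would be.
constexpr std::array<int32_t, N_CHANNELS> WEIGHTS = {
    30, -12, 55, 7, -40, 90, 22, -5};
constexpr int32_t BIAS      = -64;
constexpr int32_t THRESHOLD = 0;  // keep the event if activation > 0

// One multiply-accumulate pass over the channels; in hardware this loop
// unrolls into parallel multipliers so the whole decision fits in a
// fixed nanosecond-scale latency budget.
bool keep_event(const std::array<int32_t, N_CHANNELS>& adc) {
    int64_t acc = BIAS;
    for (int i = 0; i < N_CHANNELS; ++i) {
        acc += static_cast<int64_t>(WEIGHTS[i]) * adc[i];
    }
    return (acc >> FRAC_BITS) > THRESHOLD;
}

int main() {
    std::array<int32_t, N_CHANNELS> quiet = {1, 2, 1, 0, 3, 1, 2, 1};
    std::array<int32_t, N_CHANNELS> burst = {90, 10, 120, 40, 5, 200, 60, 8};
    std::cout << "quiet event kept: " << keep_event(quiet) << "\n";  // 0 (dropped)
    std::cout << "burst event kept: " << keep_event(burst) << "\n";  // 1 (kept)
}

The design choice this illustrates is why "burned into the chip" matters: integer weights and a fixed-length accumulation give a deterministic latency, which a software branch predictor or garbage collector never could. In practice the models are trained offline and then converted to hardware description logic before deployment.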
Read at The Register