Wired 12.04: "The God Particle and the Grid" is an article by Richard Martin on the Large Hadron Collider at CERN, which will produce terabytes of data on every run, all to be processed by a global computation network. Isn't this what the ARPANET was originally imagined for? How can we adapt this model to arts computing?
A quote:
Every eight-hour run of the LHC will produce around 10 terabytes. At full power, the LHC could produce 10 petabytes of useful data each year. That's 10^16 bytes – 2 million DVDs' worth of binary numbers encoding energy levels, momentum, charge – all in search of the one in 10 trillion anomalies that could mark the passage of a Higgs.
Discovering the Higgs might seem an esoteric goal. But the search will have a powerful real-world spinoff: to process all that data, scientists are building a worldwide meta-network of PCs, organized into large clusters and linked by ultra high-speed connections into a global, virtual computing service. It’s called the LHC Computing Grid, and it could mark the evolution of the Internet from a pervasive communications network into a powerful, global computation network.
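The numbers in the quote hold up as a back-of-envelope check. A minimal sketch, assuming decimal units, single-layer 4.7 GB DVDs, and the quoted 10 TB per eight-hour run:

```python
# Back-of-envelope check of the data volumes quoted above.
# Assumptions (not from the article): decimal units, 4.7 GB single-layer DVDs.

TB = 10**12  # terabyte
PB = 10**15  # petabyte
GB = 10**9   # gigabyte

per_run = 10 * TB    # data per eight-hour run, as quoted
per_year = 10 * PB   # quoted annual output at full power
dvd = 4.7 * GB       # capacity of a single-layer DVD

print(f"Annual output: {per_year:.0e} bytes")                 # 1e+16 bytes
print(f"Implied runs per year: {per_year / per_run:,.0f}")    # 1,000 runs
print(f"DVD equivalent: {per_year / dvd / 1e6:.1f} million")  # ~2.1 million
```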
Canada is also developing a Grid strategy through the NRC, CANARIE, and C3.ca (the Canadian High Performance Computing Collaboratory). Via Matt Patey.