Tuesday 23 August 2016

Opinion: The end of computing’s steam age

August 23, 2016

Steam once powered the world. If you wanted to build a factory, or a scientific laboratory, you needed a steam engine and a supply of coal. Today, for most of us, power comes out of the wall in the form of electricity.

The modern-day analogue is computing: if you want to run a large laboratory such as CERN, you need a dedicated computer centre. The time, however, is ripe for change.


Image above: CERN Data Centre (Image: Sophia Bennett/CERN).

For LHC physicists, this change has already happened. We call it the Worldwide LHC Computing Grid (WLCG), which is maintained by the global particle-physics community. As physicists move towards the High-Luminosity LHC (HL-LHC), however, we need a new solution for our increasingly demanding computing and data-storage needs. That solution could look very much like the Cloud, the term used in broader society for distributed computing and data storage.

There are clear differences between the Cloud and the Grid. When developing the WLCG, CERN was able to factor in technology that was years in the future by banking on Moore’s law, which states that processing capacity doubles roughly every 18 months. After more than 50 years, however, Moore’s law is coming up against a hard technology limit. Cloud technology, by contrast, shows no sign of slowing down: more bandwidth simply means more fibre or colour-multiplexing on the same fibre.
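To see why banking on that doubling allowed WLCG planners to factor in technology that did not yet exist, here is a back-of-the-envelope sketch in Python; the planning horizons chosen are illustrative and not taken from the article.

    # A rough illustration (the horizons below are hypothetical, not from the
    # article): compounding one doubling of processing capacity every
    # 18 months, as Moore's law assumes.
    doubling_period_years = 1.5  # "roughly every 18 months"

    for horizon_years in (5, 10, 15):
        growth = 2 ** (horizon_years / doubling_period_years)
        print(f"{horizon_years:2d} years -> roughly {growth:,.0f}x today's capacity")

Over a ten-year horizon that compounding amounts to roughly a hundredfold increase, which is why a hard limit on it changes the planning picture so drastically.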

Cloud computing is already at an advanced stage. While CERN was building the WLCG, the Googles and Amazons of the world were building huge data warehouses to host commercial Clouds. Although we could turn to them to satisfy our computing needs, it is doubtful that such firms could guarantee the preservation of our data for the decades that it would be needed. We therefore need a dedicated “Science Cloud” instead.


Image above: A server at the CERN Data Centre (Image: Sophia Bennett/CERN).

CERN has already started to think about the parameters for such a facility. Zenodo, for example, is a future-proof and non-proprietary data repository that has been adopted by other big-data communities. The virtual nature of the technology allows various scientific disciplines to coexist on a given infrastructure, making it very attractive to providers. The next step requires co-operation with governments to develop computing and data warehouses for a Science Cloud.
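As a small illustration of what a non-proprietary repository means in practice, the sketch below (Python, using the requests library) queries Zenodo's public REST records interface over plain HTTP; the search term is arbitrary and the exact response fields may differ between API versions, so treat it as a sketch rather than a definitive recipe.

    # Minimal sketch: query Zenodo's public records endpoint for a few
    # records matching a keyword (search term and fields are illustrative).
    import requests

    response = requests.get(
        "https://zenodo.org/api/records",
        params={"q": "particle physics", "size": 5},
        timeout=30,
    )
    response.raise_for_status()

    # Each hit carries a persistent identifier (DOI) and descriptive metadata,
    # which is what makes the repository open and future-proof.
    for record in response.json()["hits"]["hits"]:
        print(record.get("doi", "no DOI"), "-", record["metadata"]["title"])

Because the data are reachable through an open interface like this, any scientific discipline can build on the same infrastructure without being tied to a single vendor.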

CERN and the broader particle-physics community have much to bring to this effort. Just as CERN played a pioneering role in developing Grid computing to meet the needs of the LHC, we can contribute to the development of the Science Cloud to meet the demands of the HL-LHC. Not only will this machine deliver a luminosity five times greater than that of the LHC, but data are increasingly coming straight from the sensors in the LHC detectors to our computer centre with minimal processing and reduction along the way. Add to that CERN’s open-access ethos, which began in open-access publishing and is now moving towards “open data”, and you have a powerful combination of know-how relevant to designing future computing and data facilities. Particle physics can therefore help develop Cloud computing for the benefit of science as a whole.

In the future, scientific computing will be accessed much as electrical power is today: we will tap into resources simply by plugging in, without worrying about where our computing cycles and data storage are physically located. Rather than relying on our own large computer centre, there will be a Science Cloud composed of computing and data centres serving the scientific endeavour as a whole, guaranteeing data preservation for as long as it is needed. Its location should be determined primarily by its efficiency of operation.

CERN has been in the vanguard of scientific computing for decades, from the computerised control system of the Super Proton Synchrotron in the 1970s, to CERNET, TCP/IP, the World Wide Web and the WLCG. It is in that vanguard that we need to remain, to deliver the best science possible. Working with governments and other data-intensive fields of science, it’s time for particle physics to play its part in developing a world in which the computing socket sits right next to the power socket. It’s time to move beyond computing’s golden age of steam.

This article was originally published in the CERN Courier: http://cerncourier.com/cws/article/cern/65819

Note:

CERN, the European Organization for Nuclear Research, is one of the world’s largest and most respected centres for scientific research. Its business is fundamental physics, finding out what the Universe is made of and how it works. At CERN, the world’s largest and most complex scientific instruments are used to study the basic constituents of matter — the fundamental particles. By studying what happens when these particles collide, physicists learn about the laws of Nature.

The instruments used at CERN are particle accelerators and detectors. Accelerators boost beams of particles to high energies before they are made to collide with each other or with stationary targets. Detectors observe and record the results of these collisions.

Founded in 1954, the CERN Laboratory sits astride the Franco–Swiss border near Geneva. It was one of Europe’s first joint ventures and now has 22 Member States.

Related link:

Large Hadron Collider (LHC): http://home.cern/topics/large-hadron-collider


For more information about the European Organization for Nuclear Research (CERN), visit: http://home.web.cern.ch/

Images (mentioned), Text, Credits: CERN/Eckhard Elsen.

Greetings, Orbiter.ch