Tuesday, August 23, 2016

Opinion: The end of computing’s steam age

CERN - European Organization for Nuclear Research logo.

August 23, 2016

Steam once powered the world. If you wanted to build a factory, or a scientific laboratory, you needed a steam engine and a supply of coal. Today, for most of us, power comes out of the wall in the form of electricity.

The modern-day analogue is computing: if you want to run a large laboratory such as CERN, you need a dedicated computer centre. The time, however, is ripe for change.

Image above: CERN Data Centre (Image: Sophia Bennett/CERN).

For LHC physicists, this change has already happened. We call it the Worldwide LHC Computing Grid (WLCG), which is maintained by the global particle-physics community. As physicists move towards the High Luminosity LHC (HL-LHC), however, we need a new solution for our increasingly demanding computing and data-storage needs. That solution could look very much like the Cloud, which is the general term for distributed computing and data storage in broader society.

There are clear differences between the Cloud and the Grid. When developing the WLCG, CERN was able to factor in technology that was years in the future by banking on Moore’s law, which states that processing capacity doubles roughly every 18 months. After more than 50 years, however, Moore’s law is coming up against a hard technology limit. Cloud technology, by contrast, shows no sign of slowing down: more bandwidth simply means more fibre or colour-multiplexing on the same fibre.
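The doubling claim is easy to put in numbers. Here is a minimal sketch (purely illustrative; the 15-year horizon is an assumption for the example, not a figure from the article):

```python
# Illustrative sketch: capacity growth under Moore's law, assuming
# processing capacity doubles every 18 months.
def moores_law_factor(years, doubling_months=18):
    """Return the capacity multiplier after `years` years."""
    return 2 ** (years * 12 / doubling_months)

# Over a hypothetical 15-year planning horizon, the law predicts
# 15 * 12 / 18 = 10 doublings, i.e. a factor of 2**10 = 1024.
print(round(moores_law_factor(15)))  # → 1024
```

It is exactly this kind of compounding that made it possible to design the WLCG around hardware that did not yet exist, and exactly what is now running out.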

Cloud computing is already at an advanced stage. While CERN was building the WLCG, the Googles and Amazons of the world were building huge data warehouses to host commercial Clouds. Although we could turn to them to satisfy our computing needs, it is doubtful that such firms could guarantee the preservation of our data for the decades that it would be needed. We therefore need a dedicated “Science Cloud” instead.

Image above: A server at the CERN Data Centre (Image: Sophia Bennett/CERN).

CERN has already started to think about the parameters for such a facility. Zenodo, for example, is a future-proof and non-proprietary data repository that has been adopted by other big-data communities. The virtual nature of the technology allows various scientific disciplines to coexist on a given infrastructure, making it very attractive to providers. The next step requires co-operation with governments to develop computing and data warehouses for a Science Cloud.

CERN and the broader particle-physics community have much to bring to this effort. Just as CERN played a pioneering role in developing Grid computing to meet the needs of the LHC, we can contribute to the development of the Science Cloud to meet the demands of the HL-LHC. Not only will this machine produce a luminosity five times greater than the LHC, but data are increasingly coming straight from the sensors in the LHC detectors to our computer centre with minimal processing and reduction along the way. Add to that CERN’s open-access ethos, which began in open-access publishing and is now moving towards “open data”, and you have a powerful combination of know-how relevant to designing future computing and data facilities. Particle physics can therefore help develop Cloud computing for the benefit of science as a whole.

In the future, scientific computing will be accessed much as electrical power is today: we will tap into resources simply by plugging in, without worrying about where our computing cycles and data storage are physically located. Rather than relying on our own large computer centre, there will be a Science Cloud composed of computing and data centres serving the scientific endeavour as a whole, guaranteeing data preservation for as long as it is needed. Its location should be determined primarily by its efficiency of operation.

CERN has been in the vanguard of scientific computing for decades, from the computerised control system of the Super Proton Synchrotron in the 1970s, to CERNET, TCP/IP, the World Wide Web and the WLCG. It is in that vanguard that we need to remain, to deliver the best science possible. Working with governments and other data-intensive fields of science, it’s time for particle physics to play its part in developing a world in which the computing socket sits right next to the power socket. It’s time to move beyond computing’s golden age of steam.

This article was originally published in the CERN Courier:


CERN, the European Organization for Nuclear Research, is one of the world’s largest and most respected centres for scientific research. Its business is fundamental physics, finding out what the Universe is made of and how it works. At CERN, the world’s largest and most complex scientific instruments are used to study the basic constituents of matter — the fundamental particles. By studying what happens when these particles collide, physicists learn about the laws of Nature.

The instruments used at CERN are particle accelerators and detectors. Accelerators boost beams of particles to high energies before they are made to collide with each other or with stationary targets. Detectors observe and record the results of these collisions.

Founded in 1954, the CERN Laboratory sits astride the Franco–Swiss border near Geneva. It was one of Europe’s first joint ventures and now has 22 Member States.

Related link: 

Large Hadron Collider (LHC):

For more information about the European Organization for Nuclear Research (CERN), visit:

Images (mentioned), Text, Credits: CERN/Eckhard Elsen.


Monday, August 22, 2016

LHC pushes limits of performance

CERN - European Organization for Nuclear Research logo.

22 Aug 2016

The Large Hadron Collider (LHC) continues to surpass performance expectations: this week it achieved 2220 proton bunches in each of its counter-rotating beams – the most it will achieve this year.

This is not the maximum the machine is capable of holding (at full intensity the beam will have nearly 2800 bunches) but it is currently limited by a technical issue in the Super Proton Synchrotron (SPS).

“Performance is excellent, given this limitation,” says Mike Lamont, head of the Operations team. “We’re 10% above design luminosity (which we surpassed in June), we have these really long fills (where the beam is circulating for up to 20 hours or so) and very good collision rates. 2220 bunches is just us squeezing as much in as we can, given the restrictions, to maximize delivery to the experiments.”

As an example of the machine’s brilliant performance, with almost two months left in this year’s run it has already reached an integrated luminosity of 22 fb-1 – very close to the 2016 goal of 25 fb-1 (up from 4 fb-1 last year).

Image above: The LHC tunnel. The machine is surpassing all performance expectations (Image: Jacques Fichet/ CERN).

Luminosity is an essential indicator of the performance of an accelerator, measuring the potential number of collisions that can occur in a given amount of time, and integrated luminosity (measured in inverse femtobarns, fb-1) is the accumulated number of potential collisions. At its peak, the LHC’s proton-proton collision rate reaches about 1 billion collisions per second, giving even the rarest, highest-energy processes a chance to occur.
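The relation behind those figures is simply N = σ × ∫L dt: the expected number of events of a given process is its cross-section times the integrated luminosity. A brief sketch (the ~50 pb Higgs production cross-section used here is an assumed round number for illustration, not a figure from the article):

```python
# Sketch of the relation N = sigma * integrated_luminosity.
def expected_events(cross_section_pb, integrated_lumi_fb):
    # 1 fb^-1 = 1000 pb^-1, so convert the cross-section to fb first.
    return cross_section_pb * 1000 * integrated_lumi_fb

# With the 22 fb^-1 quoted in the article and an assumed ~50 pb
# cross-section, roughly a million such events would be produced.
print(f"{expected_events(50, 22):.0f}")  # → 1100000
```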

The SPS is currently experiencing a small fault that could be exacerbated by high beam intensity – hence the number of proton bunches sent to the LHC per injection is limited to 96, compared to the normal 288.

“Once this issue is fixed in the coming year-end technical stop, we’ll be able to push up the number of bunches even further. Next year we should be able to go to new record levels,” says Lamont with a wry grin.



Related links:

Large Hadron Collider (LHC):

Super Proton Synchrotron (SPS):

For more information about the European Organization for Nuclear Research (CERN), visit:

Image (mentioned), Text, Credits: CERN/Harriet Jarlett.

Best regards,

Astronauts Relaxing Before Pair of Spaceships Leave

ISS - Expedition 48 Mission patch.

August 22, 2016

Three astronauts are relaxing today after a spacewalk on Friday and weekend cleanup work. Meanwhile, a pair of spacecraft will be departing the International Space Station over the next two weeks.

NASA astronauts Jeff Williams and Kate Rubins successfully installed a new international docking adapter Friday morning during a five-hour, 58-minute spacewalk. Japanese astronaut Takuya Onishi assisted the duo from inside the station, and afterward all three cleaned up the Quest airlock, where they stowed their spacesuits and tools.

Image above: An astronaut works to install an international docking adapter during a spacewalk on Friday. Image credit: NASA.

Williams is scheduled to return to Earth on Sept. 6 with cosmonauts Oleg Skripochka and Alexey Ovchinin ending Expedition 48. The two cosmonauts began their departure preparations today to get the Soyuz TMA-20M spacecraft ready for undocking and landing in Kazakhstan.

Before Expedition 48 returns home in two weeks, the SpaceX Dragon spacecraft will leave the station this Friday at 6:10 a.m. EDT. The crew is loading the space freighter with gear and science for analysis by NASA engineers on the ground. Dragon will splash down in the Pacific Ocean a few hours after its release Friday and be retrieved by SpaceX personnel.

Related article:

Spacewalk Concludes After Commercial Crew Port Installation

Keep up with the International Space Station, and its research and crews, at:

International Space Station (ISS):

Space Station Research and Technology:

Image (mentioned), Text, Credits: NASA/Mark Garcia.


A Moon's Contrasts

NASA - Cassini Mission to Saturn patch.

Aug. 22, 2016

Dione reveals its past via contrasts in this view from NASA's Cassini spacecraft. The features visible here are a mixture of tectonics -- the bright, linear features -- and impact cratering -- the round features, which are spread across the entire surface.

Tectonic features tell the story of how Dione (698 miles or 1,123 kilometers across) has been heated and cooled since its formation, and scientists use those clues to piece together the moon's past. Impact craters are evidence of external debris striking the surface, and thus they tell about the environment in which the moon has existed over its history.

This view looks toward the trailing hemisphere of Dione. North on Dione is up. The image was taken in visible light with the Cassini narrow-angle camera on April 11, 2015.

The view was obtained at a distance of approximately 68,000 miles (110,000 kilometers) from Dione and at a Sun-Dione-spacecraft, or phase, angle of 28 degrees. Image scale is 2,165 feet (660 meters) per pixel.
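The quoted image scale follows directly from the range and the camera's per-pixel angular resolution (IFOV). A short sketch (the ~6 microradian IFOV for Cassini's narrow-angle camera is an approximate figure, not stated in the article):

```python
# Sketch: image scale from range and per-pixel angular size (IFOV).
# Cassini's narrow-angle camera has an IFOV of roughly 6 microradians
# per pixel (approximate value, assumed for this illustration).
def image_scale_m(range_km, ifov_urad=6.0):
    """Metres covered by one pixel at the given range."""
    return range_km * 1000 * ifov_urad * 1e-6

# At the 110,000 km quoted above this gives about 660 m per pixel,
# matching the caption.
print(round(image_scale_m(110_000)))  # → 660
```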

The Cassini mission is a cooperative project of NASA, ESA (the European Space Agency) and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colorado.

For more information about the Cassini-Huygens mission, visit:

The Cassini imaging team homepage:

ESA's Cassini website:

Image, Text, Credits: NASA/JPL-Caltech/Space Science Institute/Tony Greicius.


NASA Establishes Contact With STEREO Mission

NASA - STEREO Mission logo.

Aug. 22, 2016

STEREO spacecraft. Image credit: NASA

On Aug. 21, 2016, contact was reestablished with one of NASA's Solar Terrestrial Relations Observatories, known as the STEREO-B spacecraft, after communications were lost on Oct. 1, 2014. For over 22 months, the STEREO team worked to regain contact with the spacecraft, most recently attempting a monthly recovery operation using NASA's Deep Space Network, or DSN, which tracks and communicates with missions throughout space.

The DSN established a lock on the STEREO-B downlink carrier at 6:27 p.m. EDT. The downlink signal was monitored by the Mission Operations team over several hours to characterize the attitude of the spacecraft, and then the transmitter's high voltage was powered down to save battery power. The STEREO Missions Operations team plans further recovery processes to assess observatory health, re-establish attitude control, and evaluate all subsystems and instruments.

Image above: On Aug. 21, 2016, NASA reestablished contact with the sun-watching STEREO-B spacecraft, after communications were lost in October 2014. STEREO-B is one of two spacecraft of the Solar Terrestrial Relations Observatory mission, which over the course of their lifetime have viewed the sun from vantage points such as the ones shown here, on the other side of the sun from Earth. This graphic shows the positions of the two STEREO spacecraft and their orbits in relation to Earth, Venus, Mercury and the sun. Image Credit: NASA.

Communications with STEREO-B were lost during a test of the spacecraft’s command loss timer, a hard reset that is triggered after the spacecraft goes without communications from Earth for 72 hours. The STEREO team was testing this function in preparation for something known as solar conjunction, when STEREO-B’s line of sight to Earth – and therefore all communication – was blocked by the sun.
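The command-loss timer described above is essentially a watchdog: if no command arrives within a fixed window, the spacecraft performs a hard reset on the assumption that its receiver has failed. A minimal illustrative sketch (names and the injectable clock are hypothetical, not STEREO flight software):

```python
import time

# Minimal sketch of a command-loss timer: if no command arrives within
# the timeout, trigger a hard reset. An injectable clock makes the
# 72-hour behaviour easy to exercise on the ground.
class CommandLossTimer:
    def __init__(self, timeout_s, now=time.monotonic):
        self.timeout_s = timeout_s
        self.now = now
        self.last_command = self.now()
        self.resets = 0

    def command_received(self):
        """Any valid uplinked command restarts the countdown."""
        self.last_command = self.now()

    def tick(self):
        """Called periodically; fires the reset once the timeout elapses."""
        if self.now() - self.last_command >= self.timeout_s:
            self.resets += 1           # stand-in for the hard reset
            self.last_command = self.now()
```

With a simulated clock, one can step time past the 72-hour mark and confirm the reset fires exactly once per elapsed timeout – the behaviour the STEREO team was verifying when contact was lost.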

STEREO-A continues to work normally.

For more on STEREO:

Images (mentioned), Text, Credits: NASA's Goddard Space Flight Center, by Karen C. Fox/Rob Garner.


Proba-3: seeing through shadow to view Sun’s corona

ESA - European Space Agency patch.

22 August 2016

Every 18 months or so, scientists and sensation-seekers gather at set points on Earth’s surface to await awe-inspiring solar eclipses. The Moon briefly blocks the Sun, revealing its mysterious outer atmosphere, the corona. But what if researchers could induce such eclipses at will?

That’s the scientific vision behind ESA’s double-satellite Proba-3, the world’s first precision formation-flying mission, planned for launch in 2019.

 Proba-3 formation flying satellites

An ‘occulter’ satellite will fly 150 m ahead of a second ‘coronagraph’ satellite, casting a precise shadow to reveal the ghostly tendrils of the solar corona, down to 1.2 solar radii, for hours on end.

“We have two scientific instruments aboard,” explains Damien Galano, Proba-3 Payload Manager. “The primary payload is ASPIICS, a coronagraph to observe the corona in visible light while the DARA radiometer on the occulter measures the total solar irradiance coming from the Sun – a scientific parameter about which there is still some uncertainty.

Proba-3 revealing corona

“The corona is a million times fainter than the Sun itself, so the light from the solar disk needs to be blocked in order to see it. The coronagraph idea was conceived by astronomer Bernard Lyot in the 1930s – and has since been refined and incorporated into both Earth-based and space telescopes.

“But because of the wave nature of light, even within the cone of shadow cast by the occulter, some light still spills around the occulter edges, a phenomenon called ‘diffraction’.

“To minimise this unwanted light, the coronagraph can be positioned closer to the occulter – and therefore deeper into the shadow cone. However the deeper it is, the more the solar corona will also be occulted by the occulter. 

Coronagraph on single satellite

“Hence the advantage of a larger occulter and the maximum possible distance between the occulter and the coronagraph. Obviously a 150-m-long satellite is not a practical proposition, but our formation flying approach should provide us with equivalent performance.

“Furthermore, the ASPIICS coronagraph itself contains a smaller, secondary occulter disk, to cut down on diffracted light still further.

Coronagraph across two satellites

“Precision is all – the aperture of the ASPIICS instrument measures 50 mm in diameter, and to preserve corona-observing performance it should remain as close as possible to the centre of the shadow, which is about 70 mm across at 150 m.

“So we’ll need to achieve millimetre-scale positioning control between the two spacecraft, effectively forming a single giant instrument across space.”
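The geometry behind those figures is straightforward: the umbra behind an occulting disc shrinks linearly with distance, closing at the rate of the Sun's angular diameter. A rough sketch (the ~1.5 m occulter diameter is an assumption for illustration; the article gives only the 150 m separation and the ~70 mm shadow width):

```python
import math

# Geometric sketch of the umbra behind an occulting disc. The Sun
# subtends about 0.533 degrees, so the umbra narrows by that angle.
SUN_ANGULAR_DIAMETER = math.radians(0.533)  # radians, as seen from ~1 AU

def umbra_diameter_m(occulter_diameter_m, separation_m):
    """Geometric umbra width at `separation_m` behind the disc."""
    return occulter_diameter_m - separation_m * SUN_ANGULAR_DIAMETER

# An assumed ~1.5 m disc at 150 m leaves a geometric umbra of order
# 0.1 m; diffraction narrows the usable dark core further, toward the
# ~70 mm quoted in the text.
print(f"{umbra_diameter_m(1.5, 150.0):.3f} m")  # → 0.105 m
```

The same formula also shows why a single 150-m-long spacecraft would be impractical: the separation, not the disc alone, sets how deep and wide the shadow can be.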

ASPIICS (Association of Spacecraft for Polarimetry and Imaging of the Corona of the Sun) is being developed for ESA by a consortium led by Centre Spatial de Liège in Belgium, made up of 15 companies and institutes from five ESA Member States.

Diffraction of light

“Many of these companies are new to ESA, and they’ve proved to be very motivated and eager to show their capabilities,” remarks Damien. “We’ve produced various prototypes of instrument elements, and our first full ‘structural and thermal model’ should be complete in the autumn, ahead of our end-of-year Critical Design Review.

“We’re also looking into various optical aspects, such as the best occulter edge shape to minimise diffraction.”

Proba-3: Dancing with the stars

There’s a lot of broader interest in this external occulter approach – especially for the imaging of Earth-like exoplanets, which would require blocking out their parent stars.

“It’s a similar challenge, the main difference being that the star in question is a point source of light rather than the extended source that our Sun is.

“So it could be that formation-flown external occulters become versatile scientific tools, opening many new vistas in astronomy.”


About Proba-3:


Images, Video, Text, Credits: ESA/P. Carril/D. Galano/Thomas Bauer at Wellesley.

Best regards,

Saturday, August 20, 2016

United Launch Alliance Successfully Launches AFSPC-6 Mission for the U.S. Air Force

ULA - Delta IV / AFSPC-6 Mission poster.

August 20, 2016

Twin GSSAP Satellites Enhance Space Based Situational Awareness

Image above: ULA's Delta IV rocket lifts off with the AFSPC-6 mission for the United States Air Force.

A United Launch Alliance (ULA) Delta IV rocket carrying the AFSPC-6 mission for the United States Air Force lifted off from Space Launch Complex-37 Aug. 19 at 12:52 a.m. EDT. This is ULA’s seventh launch in 2016 and the 110th successful launch since the company was formed in December 2006.

“Thank you to the ULA, Air Force and industry partners for the outstanding teamwork and flawless execution that made today’s mission a success,” said Laura Maginnis, ULA vice president of Custom Services. “This morning’s AFSPC-6 launch is a prime example of why our customers continue to place their trust in us to launch our nation’s crucial space capabilities.”

Delta IV AFSPC-6 Launch Highlights

This mission was launched aboard a Delta IV Medium+ (4,2) configuration Evolved Expendable Launch Vehicle (EELV) powered by one common booster core. The common booster core was powered by an RS-68A liquid hydrogen/liquid oxygen engine producing 702,000 pounds of thrust. A single RL10B liquid hydrogen/liquid oxygen engine powered the second stage. The booster and upper stage engines are both built by Aerojet Rocketdyne. ULA constructed the Delta IV Medium+ (4,2) launch vehicle in Decatur, Alabama.

The AFSPC-6 mission consists of twin Geosynchronous Space Situational Awareness Program (GSSAP) spacecraft, built by Orbital ATK. The new satellites will join the first two GSSAP spacecraft launched approximately two years ago aboard a Delta IV launch vehicle. GSSAP is a space-based capability that collects space situational awareness data, allowing for more accurate tracking and characterization of man-made orbiting objects. From near-geosynchronous orbit it has a clear, unobstructed and distinct vantage point for viewing resident space objects orbiting Earth, without the weather or atmospheric disruptions that limit ground-based observations. The data from GSSAP greatly improve our ability to rapidly detect, warn of, characterize and attribute disturbances to space systems in the geosynchronous environment.

 GSSAP USAF 1 & 2 satellites

ULA's next launch is the Atlas V OSIRIS-REx spacecraft for NASA. The launch is scheduled for Sept. 8 from Space Launch Complex-41 at Cape Canaveral Air Force Station, Florida.

The EELV program was established by the U.S. Air Force to provide assured access to space for Department of Defense and other government payloads. The commercially developed EELV program supports the full range of government mission requirements, while delivering on schedule and providing significant cost savings over the heritage launch systems.

With more than a century of combined heritage, United Launch Alliance is the nation’s most experienced and reliable launch service provider. ULA has successfully delivered more than 100 satellites to orbit that provide critical capabilities for troops in the field, aid meteorologists in tracking severe weather, enable personal device-based GPS navigation and unlock the mysteries of our solar system.

For more information on ULA, visit the ULA website. Join the conversation at:

Images, Video, Text, Credits: United Launch Alliance (ULA)/Günter Space Page.