The COSMOS facility, which is located in the Stephen Hawking Centre for Theoretical Cosmology (CTC) at the University of Cambridge, is dedicated to research in cosmology, astrophysics and particle physics. It was switched on in 2012.
To date, the facility has been used to simulate the dynamics of the early Universe and to run pipelines analysing the statistics of Planck satellite maps of the cosmic microwave sky. The COSMOS supercomputer was the first very large (over 10 terabytes) single-image shared-memory system to incorporate Intel Xeon Phi coprocessors, which power some of the most energy-efficient computers in the world.
Intel Parallel Computing Centres (IPCC) are universities, institutions, and labs that are leaders in their field. The centres are focusing on modernising applications to increase parallelism and scalability through optimisations that leverage cores, caches, threads, and vector capabilities of microprocessors and coprocessors.
As an IPCC, the COSMOS research facility will receive enhanced Intel support from its applications and engineering teams, as well as early access to future Intel Xeon Phi and other Intel products aimed at high-performance computing. IPCC status will allow COSMOS to better focus on delivering computing advances to the scientific community it serves and also highlight the efforts Intel has put into advancing high-performance computing.
When operating at peak performance, the COSMOS supercomputer can perform 38.6 trillion floating-point operations per second (38.6 TFLOPS). It is based on SGI UV2000 systems with 1,856 Intel Xeon E5-2600 processor cores, 14.8 TB of RAM and 31 Intel Xeon Phi coprocessors.
The research centre has already developed code for the Intel Xeon Phi for use in Planck satellite analysis of the cosmic microwave sky and in simulations of the very early Universe. These capabilities will become even more important in the near future with the arrival of new generations of Intel Xeon Phi coprocessors and associated technologies.
“I am very pleased that the COSMOS supercomputer centre has been selected among the vanguard of Intel Parallel Computing Centres worldwide,” said Professor Stephen Hawking, founder of the COSMOS Consortium. “These are exciting times for cosmology as we use COSMOS to directly test our mathematical theories against the latest observational data. Intel’s new technology and this additional support will accelerate our scientific research.”
“Building on COSMOS success to date with Intel’s Many Integrated Core-based technology, our new IPCC status will ensure we remain at the forefront of those exploiting many-core architectures for cosmological research,” said COSMOS director, Professor Paul Shellard. “With the SGI UV2 built around Intel Xeon processors E5-2600 family and Intel Xeon Phi processors, we have a flexible HPC platform on which we can explore Xeon Phi acceleration using distributed, offload and shared-memory programming models. Intel support will ensure fast code development timescales using MICs, enhancing COSMOS competitiveness and discovery potential.”
“Intel Parallel Computing Centres are collaborations to modernise key applications to unlock performance gains that come through parallelism, paving the way for the next leap in discovery. We are delighted to be working with the COSMOS team in this endeavour as they strive to understand the origins of the universe,” said Stephan Gillich, Director Technical Computing, Intel EMEA.
COSMOS is part of the Distributed Research utilising Advanced Computing (DiRAC) facility, funded by the Science & Technology Facilities Council and the Department for Business, Innovation and Skills.
Cambridge’s COSMOS supercomputer, the largest shared-memory computer in Europe, has been named by computer giant Intel as one of its Parallel Computing Centres, building on a long-standing collaboration between Intel and the University of Cambridge.
Precision measurement of the Newtonian gravitational constant using cold atoms
Nature 510, 7506 (2014). doi:10.1038/nature13433
Authors: G. Rosi, F. Sorrentino, L. Cacciapuoti, M. Prevedelli & G. M. Tino
About 300 experiments have tried to determine the value of the Newtonian gravitational constant, G, so far, but large discrepancies in the results have made it impossible to know its value precisely. The weakness of the gravitational interaction and the impossibility of shielding the effects of gravity make it very difficult to measure G while keeping systematic effects under control. Most previous experiments performed were based on the torsion pendulum or torsion balance scheme as in the experiment by Cavendish in 1798, and in all cases macroscopic masses were used. Here we report the precise determination of G using laser-cooled atoms and quantum interferometry. We obtain the value G = 6.67191(99) × 10⁻¹¹ m³ kg⁻¹ s⁻² with a relative uncertainty of 150 parts per million (the combined standard uncertainty is given in parentheses). Our value differs by 1.5 combined standard deviations from the current recommended value of the Committee on Data for Science and Technology. A conceptually different experiment such as ours helps to identify the systematic errors that have proved elusive in previous experiments, thus improving the confidence in the value of G. There is no definitive relationship between G and the other fundamental constants, and there is no theoretical prediction for its value, against which to test experimental results. Improving the precision with which we know G has not only a pure metrological interest, but is also important because of the key role that G has in theories of gravitation, cosmology, particle physics and astrophysics and in geophysical models.
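As a quick consistency check on the quoted precision (this arithmetic is ours, not part of the paper), the 150 ppm figure follows directly from the value and its combined standard uncertainty:

```latex
\frac{u(G)}{G} = \frac{0.00099 \times 10^{-11}}{6.67191 \times 10^{-11}}
\approx 1.48 \times 10^{-4} \approx 150~\text{ppm}
```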
Fundamental constants: A cool way to measure big G
Nature 510, 7506 (2014). doi:10.1038/nature13507
Authors: Stephan Schlamminger
Published results of the gravitational constant, a measure of the strength of gravity, have failed to converge. An approach that uses cold atoms provides a new data point in the quest to determine this fundamental constant. See Letter p.518
The Kuiper Belt is the final frontier of our solar system, and also its vastest region. Stretching from 3 to 5 billion miles from the Sun, it contains myriad primitive icy bodies left over from the birth of our solar system 4.6 billion years ago. After passing the dwarf planet Pluto in July 2015, NASA's New Horizons space probe will hurtle deep into the Kuiper Belt at nearly 35,000 miles per hour. The Hubble Space Telescope is being used to search for a suitable Kuiper Belt object that New Horizons could pay a visit to. It would be our first and perhaps last look at such a remote relic from the distant past. The search is very challenging even for Hubble's sharp vision. It has to find something the size of Manhattan Island, as black as charcoal, and embedded against a snowstorm of background stars.
Update from the Gaia Project Team
A series of exhaustive tests has been conducted over the past few months to characterise some anomalies revealed during the commissioning of Gaia following its successful launch in December 2013, as discussed in previous blog posts.
Annotated diagram of the Gaia payload module.
Key among these are an increased background seen in Gaia’s focal plane assembly due to stray light entering the satellite and reduced transmission of the telescope optics. In an effort to understand both problems, much of the diagnostic work has been focussed on contamination due to small amounts of water trapped in the spacecraft before launch that has been “outgassing” now that Gaia is in a vacuum.
The water vapour freezes out as ice on cold surfaces and since Gaia’s payload sits at temperatures between –100 and –150°C in the dark behind the big sunshield, that is where it ends up, including on the telescope mirrors. The ice initially led to a significant decrease in the overall transmission of the optics, but this problem was successfully dealt with by using heaters on Gaia’s mirrors and focal plane to remove the ice, before letting them cool down to operational temperatures again.
Some ice on the mirrors was expected – that is why the mirrors are equipped with heaters – but the amount detected was higher than expected. As the spacecraft continues to outgas for a while, future ‘decontamination’ campaigns are foreseen to keep the transmission issue in check using a much lighter heating procedure to minimise any disturbing effect on the thermal stability of the spacecraft.
With regards to the stray light, our analysis of the test data indicates that it is a mixture of sunlight diffracting over the edge of the sunshield and brighter sources in the ‘night sky’ on the payload side, both being scattered into the focal plane. A model has been developed which goes some way to explaining the stray light seen in the focal plane, but not all aspects are yet understood.
One key working hypothesis was that ice deposits have built up on the ceiling of the thermal tent structure surrounding the payload, and that scattering off this ice might enhance the stray light. Although there is no way to directly confirm that this is indeed the situation, the Gaia project team nevertheless considered ways of removing any such ice.
Unlike the mirrors and focal plane, the thermal tent does not have any heaters, so alternative solutions had to be explored. One option analysed in detail would involve altering the attitude of the spacecraft to allow sunlight to directly enter the thermal tent in order to remove any ice that might be there. The risks associated with this concept were assessed, and software and procedures developed to carry it out safely, but there is currently no plan to do so.
The reason is that we have also been conducting tests in our laboratories at ESTEC to try to replicate and better understand the situation. We have added layers of ice of varying thickness to representative samples of the same black paint that covers the inside of the Gaia thermal tent, to assess the ways it might be affecting the stray light. There is no evidence to suggest that thin layers of ice would in fact enhance the stray light and thus no evidence that an attempt to remove the hypothesised ice contamination in the tent would yield any benefit, hence the decision not to carry out this procedure.
Under the assumption that the stray light cannot be completely eliminated, we are investigating a variety of modified observing strategies to help reduce its impact over the course of the mission, along with modified on-board and ground software to make the best use of the data that we will collect. As stated in earlier posts, even if we do have to work with the stray light, we already know that it will only affect the quality of the data collected for the faintest of Gaia’s one billion stars.
Stray light increases the background detected by Gaia and thus the associated noise. The impact is largest for the faintest stars, where the noise associated with the stellar light itself is comparable to that from the background, but there is minimal impact on brighter ones, for which the background is an insignificant fraction of the total flux.
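The scaling described above can be sketched with a toy photon-noise model (the counts below are illustrative placeholders, not Gaia's actual numbers): measurement accuracy scales with the signal-to-noise ratio, so an added background matters only when it is comparable to the stellar signal itself.

```python
import math

def snr(signal, background):
    """Photon-limited signal-to-noise ratio: shot noise from the star
    plus shot noise from the (stray-light) background."""
    return signal / math.sqrt(signal + background)

# Hypothetical photon counts per observation window (illustrative only).
background = 100.0      # stray-light background
faint = 200.0           # faint star: background comparable to the signal
bright = 200000.0       # bright star: background is a tiny fraction

# Factor by which the noise (and hence the accuracy floor) grows
# once the background is added.
degradation_faint = snr(faint, 0) / snr(faint, background)
degradation_bright = snr(bright, 0) / snr(bright, background)

print(f"faint-star noise grows by factor {degradation_faint:.2f}")
print(f"bright-star noise grows by factor {degradation_bright:.4f}")
```

With these toy numbers the faint star's noise grows by a factor of √1.5 ≈ 1.22, while the bright star's changes by a fraction of a percent, matching the qualitative behaviour described above.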
The stray light is variable across Gaia’s focal plane and variable with time, and has a different effect on each of Gaia’s science instruments and the corresponding science goals. Thus, it is not easy to characterise its impact in a simple way.
Broadly speaking, however, our current analysis is that if the stray light remains as it is today, its impact will be to degrade the astrometric accuracy of a solar-type star at magnitude 20, the faint limit of Gaia, by roughly 50%, from 290 microarcsec to 430 microarcsec by the end of the mission. Things improve as you move to progressively brighter stars, and by magnitude 15, the accuracy will remain unaltered at approximately 25 microarcsec.
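The "roughly 50%" figure follows directly from the two end-of-mission accuracies quoted:

```latex
\frac{430~\mu\text{as} - 290~\mu\text{as}}{290~\mu\text{as}} \approx 0.48 \approx 50\%
```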
Credit: ESA/ATG medialab; background: ESO/S. Brunier
It is important to realise that for many of Gaia’s science goals, it is these relatively brighter stars and their much higher accuracy positions that are critical, and so it is good to see that they are essentially unaffected. Also, the total number of stars detected and measured will remain unchanged.
For brightness and low-resolution spectroscopic measurements made by Gaia’s photometric instruments, current indications are that the faintest stars at magnitude 20 will have been measured to roughly the 6–8% level by the end of the mission, rather than a nominal 4%, while brighter stars will remain more accurate at about 0.4%.
The radial velocity spectrometer is the instrument most affected by the stray light, and about 1.5 magnitudes of sensitivity could be lost, although how many stars that translates into will not be known until the on-going data analysis is complete.
Finally, Gaia also contains a laser interferometer called the ‘basic angle monitor’, designed to measure the angle of separation between Gaia’s two telescopes to an accuracy of 5 microarcseconds every few minutes. This is necessary in order to correct for variations in the separation angle caused by ‘normal’ thermal changes in the payload as Gaia spins. The system is working as planned, but is seeing larger-than-expected variations in the basic angle. We are currently examining these data to discover if this issue will have any impact.
A comprehensive understanding of these issues will be given when a thorough analysis of all engineering tests is complete. Gaia has nearly completed its performance verification data taking, and is about to start a month-long dedicated science observation run. Once the data have been fully analysed, we will be able to provide a detailed quantitative assessment of the scientific performance of Gaia.
While there will likely be some loss relative to Gaia’s pre-launch performance predictions, we already know that the scientific return from the mission will still be immense, revolutionising our understanding of the formation and evolution of our Milky Way galaxy and much else.
Planetary science: Early planet helped make Moon
Nature 510, 7504 (2014). doi:10.1038/510190c
Small chemical differences between Earth and the Moon support the idea that the Moon formed from remnants of a large early planet, or protoplanet, that smashed into Earth 4.5 billion years ago. Previous studies did not find differences in isotopic chemistry between Earth and the Moon.
Two γ-ray bursts from dusty regions with little molecular gas
Nature 510, 7504 (2014). doi:10.1038/nature13325
Authors: B. Hatsukade, K. Ohta, A. Endo, K. Nakanishi, Y. Tamura, T. Hashimoto & K. Kohno
Long-duration γ-ray bursts are associated with the explosions of massive stars and are accordingly expected to reside in star-forming regions with molecular gas (the fuel for star formation). Previous searches for carbon monoxide (CO), a tracer of molecular gas, in burst host galaxies did not detect any emission. Molecules have been detected as absorption in the spectra of γ-ray burst afterglows, and the molecular gas is similar to the translucent or diffuse molecular clouds of the Milky Way. Absorption lines probe the interstellar medium only along the line of sight, so it is not clear whether the molecular gas represents the general properties of the regions where the bursts occur. Here we report spatially resolved observations of CO line emission and millimetre-wavelength continuum emission in two galaxies hosting γ-ray bursts. The bursts happened in regions rich in dust, but not particularly rich in molecular gas. The ratio of molecular gas to dust (<9–14) is significantly lower than in star-forming regions of the Milky Way and nearby star-forming galaxies, suggesting that much of the dense gas where stars form has been dissipated by other massive stars.