Tuesday, 19 February 2019



Captioned Image Spotlight (19 Feb 2019): A Recent Impact Site in Noachis Terra


This image shows a recent impact in Noachis Terra in the southern mid-latitudes of Mars. The impact occurred in dark-toned ejecta material from a degraded, 60-kilometer crater to the south.


Rather than a single impact crater, we see multiple impacts like a shotgun blast. This suggests that the impactor broke up in the atmosphere on entry. Although the atmosphere of Mars is thinner than Earth’s, it still has the capacity to break up small impactors, especially ones composed of weaker materials, such as a stony meteoroid versus an iron-nickel one.


Our image depicts 21 distinct craters ranging in size from 1 to 7 meters in diameter. They are distributed over an area that spans about 305 meters. Most observed recent impacts expose darker-toned materials underlying bright dusty surfaces. However, this impact does the opposite, showing us lighter-toned materials that lie beneath a darker-colored surface.


The impact was initially discovered in a 2016 Context Camera image, and was not seen in a 2009 picture. This implies that the impact may be only two years old, but certainly no more than nine years.


NASA/JPL/University of Arizona




Sluggish Movements


With its large neurons and relatively simple circuits, the sea slug Aplysia californica (pictured) is a valuable model system in neurobiology, famous for Nobel prize-winning work on learning and memory. Most recently, researchers used Aplysia neurons to investigate how cells control the movement of mitochondria. These vital organelles, producing energy in the form of ATP, are transported within cells to areas where this energy is most needed. When two neurons form a connection, or synapse, mitochondria fuel the signal transmission between them. Scientists found that, in Aplysia, synapse formation boosts mitochondrial movement, and triggers changes in the activity of around 4000 genes, causing a long-term shift in the pre-synaptic neuron’s makeup. These new insights could help find ways to address problems with mitochondrial transport, thought to be involved in neurodegenerative diseases like Alzheimer’s.


Written by Emmanuelle Briolat



You can also follow BPoD on Instagram, Twitter and Facebook


Archive link




2019 February 19


Comet Iwamoto Before Spiral Galaxy NGC 2903
Video Credit & Copyright: Norbert Span


Explanation: It isn’t every night that a comet passes a galaxy. Last Thursday, though, binocular comet C/2018 Y1 (Iwamoto) moved nearly in front of a spiral galaxy of approximately the same brightness: NGC 2903. Comet Iwamoto was discovered late last year and orbits the Sun in a long ellipse. It last visited the inner Solar System during the Middle Ages, around the year 648. The comet reached its closest point to the Sun – between Earth and Mars – on February 6, and its closest point to Earth a few days ago, on February 13. The featured time-lapse video condenses almost three hours into about ten seconds, and was captured last week from Switzerland. At that time Comet Iwamoto, sporting a green coma, was about 10 light minutes distant, while spiral galaxy NGC 2903 remained about 30 million light years away. Two satellites zip diagonally through the field about a third of the way through the video. Typically, a few comets each year become as bright as Comet Iwamoto.


∞ Source: apod.nasa.gov/apod/ap190219.html



Dunino Den, a sacred Pictish and Early Christian grove and well, St. Andrews, Scotland, 19.2.19.












Source link




Azurite | #Geology #GeologyPage #Mineral


Locality: Mas Dieu, Mercoirol, Gard, Languedoc-Roussillon, France


Size: 1 × 2 × 1.5 cm


Largest Crystal: 0.40 cm


Photo Copyright © ROCKS-STORE / e-rocks.com


Geology Page

www.geologypage.com

https://www.instagram.com/p/BuD8TLXlZUi/?utm_source=ig_tumblr_share&igshid=1qsgpa3j73lyu




Debris Flow Dynamics http://www.geologypage.com/2019/02/debris-flow-dynamics.html


Hundreds of Thousands of New Galaxies


Galaxy cluster Abell 1314 in the constellation Ursa Major, at a distance of approximately 460 million light years. The LOFAR observations reveal radio emission from high-speed cosmic electrons (marked in red) resulting from collisions with other galaxy clusters. The overlay onto an optical image also shows hot X-ray gas (marked in grey) from observations with the Chandra satellite. © Amanda Wilber/LOFAR Surveys Team


Astronomers publish new sky map detecting a vast number of previously unknown galaxies


An international team of more than 200 astronomers from 18 countries including scientists from the Max Planck Institute for Radio Astronomy in Bonn, Germany, has published the first phase of a major new radio sky survey at unprecedented sensitivity using the Low Frequency Array (LOFAR) telescope. The survey reveals hundreds of thousands of previously undetected galaxies, shedding new light on many research areas including the physics of black holes and how clusters of galaxies evolve.


A special issue of the scientific journal Astronomy & Astrophysics is dedicated to the first twenty-six research papers describing the survey and its first results.


Radio astronomy reveals processes in the Universe that we cannot see with optical instruments. In this first part of the sky survey, LOFAR observed a quarter of the northern hemisphere at low radio frequencies. At this point, approximately ten percent of that data has been made public. It maps three hundred thousand sources, almost all of which are galaxies in the distant Universe; their radio signals have travelled billions of light years before reaching Earth.

Black holes


Huub Röttgering, Leiden University (The Netherlands): “If we take a radio telescope and we look up at the sky, we see mainly emission from the immediate environment of massive black holes. With LOFAR we hope to answer the fascinating question: where do those black holes come from?” What we do know is that black holes are pretty messy eaters. When gas falls onto them they emit jets of material that can be seen at radio wavelengths.


Philip Best, University of Edinburgh (UK), adds: “LOFAR has a remarkable sensitivity and that allows us to see that these jets are present in all of the most massive galaxies, which means that their black holes never stop eating.”

Clusters of galaxies


Clusters of galaxies are ensembles of hundreds to thousands of galaxies and it has been known for decades that when two clusters of galaxies merge, they can produce radio emission spanning millions of light years. This emission is thought to come from particles that are accelerated during the merger process. Amanda Wilber, University of Hamburg (Germany), elaborates: “With radio observations we can detect radiation from the tenuous medium that exists between galaxies. This radiation is generated by energetic shocks and turbulence. LOFAR allows us to detect many more of these sources and understand what is powering them.”


Annalisa Bonafede, University of Bologna and INAF (Italy), adds: “What we are beginning to see with LOFAR is that, in some cases, clusters of galaxies that are not merging can also show this emission, albeit at a very low level that was previously undetectable. This discovery tells us that, besides merger events, there are other phenomena that can trigger particle acceleration over huge scales.”

Magnetic fields


The unprecedented accuracy of the LOFAR measurements allows us to measure the effect of cosmic magnetic fields on radio waves. Researchers from Germany investigated magnetic fields in the halos of galaxies and were able to show that enormous magnetic structures also exist between galaxies. “The LOFAR data are providing hints that the space between galaxies could be completely magnetic,” says Rainer Beck from MPIfR Bonn, Germany.


High-quality images


Creating low-frequency radio sky maps takes both significant telescope and computational time and requires large teams to analyse the data. “LOFAR produces enormous amounts of data – we have to process the equivalent of ten million DVDs of data. The LOFAR surveys were recently made possible by a mathematical breakthrough in the way we understand interferometry”, says Cyril Tasse, Observatoire de Paris – Station de radioastronomie à Nançay (France).
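For a rough sense of scale, the back-of-envelope conversion below (our illustration, assuming a standard 4.7 GB single-layer DVD, not a figure quoted by the LOFAR team) puts “ten million DVDs” at roughly 47 petabytes:

```python
# Rough conversion of "ten million DVDs" into petabytes.
# Assumes a standard single-layer DVD capacity of 4.7 GB (an assumption,
# not a figure from the LOFAR team).
dvds = 10_000_000
gb_per_dvd = 4.7
total_gb = dvds * gb_per_dvd        # 47,000,000 GB
total_pb = total_gb / 1_000_000     # decimal units: 1 PB = 1,000,000 GB
print(f"~{total_pb:.0f} PB of data to process")   # ~47 PB
```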


“We have been working together with SURF in the Netherlands to efficiently transform the massive amounts of data into high-quality images. These images are now public and will allow astronomers to study the evolution of galaxies in unprecedented detail”, says Timothy Shimwell, Netherlands Institute for Radio Astronomy (ASTRON) and Leiden University.


SURF’s compute and data centre located at SURFsara in Amsterdam runs on 100 percent renewable energy and hosts over 20 petabytes of LOFAR data. “This is more than half of all data collected by the LOFAR telescope to date. It is the largest astronomical data collection in the world. Processing the enormous data sets is a huge challenge for scientists. What normally would have taken centuries on a regular computer was processed in less than one year using the high throughput compute cluster (Grid) and expertise”, says Raymond Oonk (SURFsara).

LOFAR


The LOFAR telescope, the Low Frequency Array, is unique in its capabilities to map the sky in fine detail at metre wavelengths. LOFAR is operated by ASTRON in The Netherlands and is considered to be the world’s leading telescope of its type. “This sky map will be a wonderful scientific legacy for the future. It is a testimony to the designers of LOFAR that this telescope performs so well”, says Carole Jackson, Director General of ASTRON.

The next step


The 26 research papers in the special issue of Astronomy & Astrophysics are based on only the first two percent of the sky survey. The team aims to make sensitive high-resolution images of the whole northern sky, which will reveal 15 million radio sources in total. “Just imagine some of the discoveries we may make along the way. I certainly look forward to it”, says Jackson. “And among these there will be the first massive black holes that formed when the Universe was only a ‘baby’, with an age of just a few percent of its present age”, adds Röttgering.



LOFAR station Effelsberg, shown from 50 m above ground. In front: LOFAR lowband antennas (LBA) for 10-80 MHz, in the back: LOFAR highband antennas (HBA) for 110-240 MHz. © W. Reich/MPIfR






Local Contact:

Dr. Rainer Beck
Phone:+49 228 525-323
Email: rbeck@mpifr-bonn.mpg.de
Max-Planck-Institut für Radioastronomie, Bonn

Prof. Dr. Michael Kramer
Director and Head of “Fundamental Physics in Radio Astronomy” Research Dept.
Phone:+49 228 525-278
Email: mkramer@mpifr-bonn.mpg.de
Max-Planck-Institut für Radioastronomie, Bonn

Dr. Norbert Junkes
Press and Public Outreach
Phone:+49 228 525-399
Email: njunkes@mpifr-bonn.mpg.de
Max-Planck-Institut für Radioastronomie, Bonn



Original Papers:


LOFAR Surveys: 26 papers in the special issue of “Astronomy and Astrophysics”, 2019.



Links:


Radioastronomische Fundamentalphysik
Research Department “Fundamental Physics in Radio Astronomy” at MPIfR, Bonn, Germany


Images and Videos – Additional images and video clips


Image Gallery LOFAR Surveys – Images from LOFAR surveys


LOFAR – International LOFAR Telescope (ILT)

LOFAR MPIfR – LOFAR website at Max Planck Institute for Radio Astronomy (MPIfR)


GLOW – German Long Wavelength Consortium (GLOW)




LOFAR: The international LOFAR telescope (ILT) consists of a European network of radio antennas, connected by a high-speed fibre optic network spanning seven countries. LOFAR was designed, built and is now operated by ASTRON (Netherlands Institute for Radio Astronomy), with its core located in Exloo in the Netherlands. LOFAR works by combining the signals from more than 100,000 individual antenna dipoles, using powerful computers to process the radio signals as if they formed a single ‘dish’ 1900 kilometres in diameter. LOFAR is unparalleled in its sensitivity and its ability to image at high resolution (i.e. its ability to make highly detailed images), and the LOFAR data archive is the largest astronomical data collection in the world; it is hosted at SURFsara (The Netherlands), Forschungszentrum Juelich (Germany) and the Poznan Super Computing Center (Poland). LOFAR is a pathfinder of the Square Kilometre Array (SKA), which will be the largest and most sensitive radio telescope in the world.
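To illustrate why such a long baseline matters, the diffraction-limited angular resolution of an interferometer scales roughly as wavelength divided by baseline. The sketch below uses an assumed observing frequency of 150 MHz and the 1900-kilometre figure quoted above; the numbers are illustrative, not official LOFAR specifications:

```python
import math

# Illustrative diffraction-limited resolution: theta ~ wavelength / baseline.
# The 150 MHz observing frequency is an assumed example value.
c = 3.0e8                        # speed of light, m/s
freq = 150e6                     # observing frequency, Hz
wavelength = c / freq            # 2 m
baseline = 1_900_000             # effective 'dish' diameter, m (1900 km)
theta_rad = wavelength / baseline
theta_arcsec = math.degrees(theta_rad) * 3600
print(f"~{theta_arcsec:.2f} arcsec")   # ~0.22 arcsec
```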


Institutes publishing the results:

Australia: CSIRO

Canada: University of Montreal, University of Calgary, Queen’s University

Denmark: University of Copenhagen


France: Observatoire de Paris PSL, Station de radioastronomie de Nançay, Université Côte d’Azur, Université de Strasbourg


Germany: Hamburg University, Ruhr-University Bochum, Karl Schwarzschild Observatory Tautenburg, European Southern Observatory, University of Bonn, Max Planck Institut für Extraterrestrische Physik, Garching, Bielefeld University, Max Planck Institute for Radio Astronomy, Bonn


Iceland: University of Iceland

India: Savitribai Phule Pune University

Ireland: University College Dublin

Italy: National Institute for Astrophysics (INAF), University of Bologna

Mexico: Universidad de Guanajuato


The Netherlands: ASTRON, the NOVA (Netherlands Research School for Astronomy) institutes at Leiden University, Groningen University, University of Amsterdam and Radboud University Nijmegen, SURFsara, SRON, Ampyx Power B.V, JIVE


Poland: Jagiellonian University, Nicolaus Copernicus University Toruń

South Africa: University of Western Cape, Rhodes University, SKA South Africa

Spain: Universidad de La Laguna

Sweden: Chalmers University

Uganda: Mbarara University of Science & Technology


United Kingdom: University of Hertfordshire, University of Edinburgh, Open University, University of Oxford, University of Southampton, University of Bristol, University of Manchester, The Rutherford Appleton Laboratory, University of Portsmouth, University of Nottingham


USA: Harvard University, Naval Research Laboratory, University of Massachusetts




Archive link




Penrhos Feilw Standing Stones, Anglesey, North Wales, 17.2.19.









Source link


CMS gets first result using largest ever LHC data sample


CERN – European Organization for Nuclear Research logo.


18 February, 2019


The CMS collaboration at CERN has submitted its first paper based on the full LHC dataset collected in 2018 and data collected in 2016 and 2017



Image above: A proton–proton collision event recorded by CMS in 2018 (Image: CMS collaboration).


Just under three months after the final proton–proton collisions of the Large Hadron Collider’s (LHC) second run (Run 2), the CMS collaboration has submitted its first paper based on the full LHC dataset collected in 2018 – the largest sample ever collected at the LHC – together with data collected in 2016 and 2017. The result reflects an immense achievement: a complex chain of data reconstruction and calibration had to be completed before the data could be used in an analysis suitable for a scientific result.


“It is truly a sign of effective scientific collaboration and the high quality of the detector, software and the CMS collaboration as a whole. I am proud and extremely impressed that the understanding of the so recently collected data is sufficiently advanced to produce this very competitive and exciting result,” said CMS spokesperson Roberto Carlin.


Quantum chromodynamics (QCD) is one of the pillars of the Standard Model of elementary particles and describes how quarks and gluons are confined within composite particles called hadrons, of which protons and neutrons are examples. However, the QCD processes behind this confinement are not yet well understood, despite much progress in the last two decades. One way to understand these processes is to study the little known Bc particle family, which consists of hadrons composed of a beauty quark and a charm antiquark (or vice-versa).


The high collision energies and rates provided by the Large Hadron Collider opened the path for the exploration of the Bc family. The first studies were published in 2014 by the ATLAS collaboration, using data collected during the LHC’s first run. At the time, ATLAS reported the observation of a Bc particle called Bc(2S). On the other hand, the LHCb collaboration reported in 2017 that their data showed no evidence of Bc(2S) at all. Analysing the large LHC Run 2 data sample, collected in 2016, 2017 and 2018, CMS has now observed Bc(2S) as well as another Bc particle known as Bc*(2S). The collaboration has also been able to measure the mass of Bc(2S) with good precision. These measurements provide a rich source of information on the QCD processes that bind heavy quarks into hadrons. For more information about the results, visit the CMS webpage.


The results were submitted to Physical Review Letters and presented at CERN this week.


Note:


CERN, the European Organization for Nuclear Research, is one of the world’s largest and most respected centres for scientific research. Its business is fundamental physics, finding out what the Universe is made of and how it works. At CERN, the world’s largest and most complex scientific instruments are used to study the basic constituents of matter — the fundamental particles. By studying what happens when these particles collide, physicists learn about the laws of Nature.


The instruments used at CERN are particle accelerators and detectors. Accelerators boost beams of particles to high energies before they are made to collide with each other or with stationary targets. Detectors observe and record the results of these collisions.


Related links:


CMS collaboration first paper: https://arxiv.org/abs/1902.00571


Standard Model: https://home.cern/science/physics/standard-model


Physical Review Letters: https://arxiv.org/abs/1902.00571


CMS webpage: https://cms.cern/news/first-measurement-lhc-run-2-pp-data-collected-2016-2017-and-2018


CMS webpage: https://cms.cern/


For more information about the European Organization for Nuclear Research (CERN), visit: https://home.cern/


Image (mentioned), Text, Credit: CERN.


Best regards, Orbiter.ch

Archive link


CERN – LS2 report: The Proton Synchrotron’s magnets prepare for higher energies


CERN – European Organization for Nuclear Research logo.


18 February, 2019


Following our article on the PS Booster, we take a look at the next link in CERN’s accelerator chain: the venerable Proton Synchrotron and its magnet system 



Image above: One of the magnets being driven on a locomotive to the workshop (right) after being extracted from the PS itself (left) (Image: Julien Marius Ordan/Maximilien Brice/CERN).


The Proton Synchrotron (PS), which was CERN’s first synchrotron and which turns 60 this year, once held the record for the particle accelerator with the highest energy. Today, it forms a key link in CERN’s accelerator complex, mainly accelerating protons to 26 GeV before sending them to the Super Proton Synchrotron (SPS), but also delivering particles to several experimental areas such as the Antiproton Decelerator (AD). Over the course of Long Shutdown 2 (LS2), the PS will undergo a major overhaul to prepare it for the higher injection and beam intensities of the LHC’s Run 3 as well as for the High-Luminosity LHC.


One major component of the PS that will be consolidated is the magnet system. The synchrotron has a total of 100 main magnets within it (plus one reference magnet unit outside the ring), which bend and focus the particle beams as they whizz around it gaining energy. “During the last long shutdown (LS1) and at the beginning of LS2, the TE-MSC team performed various tests to identify weak points in the magnets,” explains Fernando Pedrosa, who is coordinating the LS2 work on the PS. The team identified 50 magnets needing refurbishment, of which seven were repaired during LS1 itself. “The remaining 43 magnets that need attention will be refurbished this year.”


Specifically, one of the elements, known as the pole-face windings, which is located between the beam pipe and the magnet yoke, needs replacing. In order to reach into the magnet innards to replace these elements, the magnet units have to be transferred to a workshop in building 151. Once disconnected, each magnet is placed onto a small locomotive system that drives it to the workshop. The locomotives themselves are over 50 years old, and their movement must be delicately managed. It takes ten hours to extract one magnet. So far, six magnets have been taken to the workshop, and this work will last until 18 October 2019.


The workshop where the magnets are being treated is divided into two sections. In the first room, the vacuum chamber of the magnets is cut so as to access the pole-face windings. The magnet units are then taken to the second room, where prefabricated replacements are installed.


As mentioned in the previous LS2 Report, the PS Booster will see an increase in the energy it imparts to accelerating protons, from 1.4 GeV to 2 GeV. A new set of quadrupole magnets will be installed along the Booster-to-PS injection line, to increase the focusing strength required for the higher-energy beams. Higher-energy beams require higher-energy injection elements; therefore some elements will be replaced in the PS injection region as part of the LHC Injectors Upgrade (LIU) project, namely septum 42, kicker 45 and five bumper magnets.


Other improvements as part of the LIU project include the new cooling systems being installed to increase the cooling capacity of the PS. A new cooling station is being built at building 355, while one cooling tower in building 255 is being upgraded. The TT2 line, which is involved in the transfer from the PS to the SPS, will have its cooling system decoupled from the Booster’s, to allow the PS to operate independently of the Booster schedule. “The internal dumps of the PS, which are used in case the beam needs to be stopped, are also being changed, as are some other intercepting devices,” explains Pedrosa.


“The LS2 operations are on a tight schedule,” notes Pedrosa, pointing out that work being performed on several interconnected systems creates constraints on what can be done concurrently. As LS2 proceeds, we will bring you more news about the PS, including the installation of new instrumentation in wire scanners that help with beam-size measurement, an upgraded transverse-feedback system to stabilise the beam, and more.


More pictures of the PS magnets are available on CDS: https://cds.cern.ch/record/2657869

Note:


CERN, the European Organization for Nuclear Research, is one of the world’s largest and most respected centres for scientific research. Its business is fundamental physics, finding out what the Universe is made of and how it works. At CERN, the world’s largest and most complex scientific instruments are used to study the basic constituents of matter — the fundamental particles. By studying what happens when these particles collide, physicists learn about the laws of Nature.


The instruments used at CERN are particle accelerators and detectors. Accelerators boost beams of particles to high energies before they are made to collide with each other or with stationary targets. Detectors observe and record the results of these collisions.


Related links:


Proton Synchrotron (PS): https://home.cern/science/accelerators/proton-synchrotron


CERN’s accelerator complex: https://home.cern/science/accelerators/accelerator-complex


Super Proton Synchrotron (SPS): https://home.cern/science/accelerators/super-proton-synchrotron


Antiproton Decelerator (AD): https://home.cern/science/accelerators/antiproton-decelerator


High-Luminosity LHC: https://home.cern/science/accelerators/high-luminosity-lhc


Previous LS2 Report: https://home.cern/news/news/accelerators/ls2-report-metamorphosis-booster


PS Booster: https://home.cern/science/accelerators/proton-synchrotron-booster


For more information about the European Organization for Nuclear Research (CERN), visit: https://home.cern/


Image (mentioned), Text, Credits: CERN/Achintya Rao.


Best regards, Orbiter.ch

Archive link


What exactly is a black hole?

A black hole is conventionally thought of as an astronomical object that irrevocably consumes all matter and radiation which comes within its sphere of influence. Physically, a black hole is defined by the presence of a singularity, i.e., a region of space, bounded by an ‘event horizon’, within which the mass/energy density becomes infinite, and the normally well-behaved laws of physics no longer apply.











What exactly is a black hole?
Simulation of Material Orbiting close to a Black Hole [Credit: ESO/Gravity Consortium/L. Calçada]

However, as an article published in the journal Nature Astronomy demonstrates, a precise and agreed definition of this ‘singular’ state proves to be frustratingly elusive. Its author, Dr. Erik Curiel of the Munich Center for Mathematical Philosophy at Ludwig-Maximilians-Universitaet, summarizes the problem as follows: “The properties of black holes are the subject of investigations in a range of subdisciplines of physics – in optical physics, in quantum physics and of course in astrophysics. But each of these specialties approaches the problem with its own specific set of theoretical concepts.”


Erik Curiel studied Philosophy as well as Theoretical Physics at Harvard University and the University of Chicago, and the primary aim of his current DFG-funded research project is to develop a precise philosophical description of certain puzzling aspects of modern physics.


“Phenomena such as black holes belong to a realm that is inaccessible to observation and experiment. Work based on the assumption that black holes exist therefore involves a level of speculation that is unusual even for the field of theoretical physics.”


However, this difficulty is what makes the physical approach to the nature of black holes so interesting from the philosophical point of view. “The physical perspective on black holes is itself inextricably bound up with philosophical issues relating to ontological, metaphysical and methodological considerations,” says Curiel.


“Surprising” and “eye-opening” insights


During the preparation of his philosophical analysis of the concept of black holes for Nature Astronomy, the author spoke to physicists involved in a wide range of research fields. In the course of these conversations, he was given quite different definitions of a black hole. Importantly, however, each was used in a self-consistent way within the bounds of the specialist discipline concerned. Curiel himself describes these discussions as “surprising” and “eye-opening”.


For astrophysicist Avi Loeb, “a black hole is the ultimate prison: once you check in, you can never get out.” On the other hand, theoretical physicist Domenico Giulini regards it as “conceptually problematical to think of black holes as objects in space, things that can move and be pushed around.”


Curiel’s own take-home message is that the very diversity of definitions of black holes is a positive sign, as it enables physicists to approach the phenomenon from a variety of physical perspectives. However, in order to make productive use of this diversity of viewpoints, it will be important to cultivate a greater awareness of the differences in emphasis between them.


Source: Ludwig-Maximilians-Universität München [February 14, 2019]



TANN



Archive


Gravitational waves will settle cosmic conundrum

Measurements of gravitational waves from ~50 binary neutron stars over the next decade will definitively resolve an intense debate over how fast our universe is expanding, finds an international team including UCL and Flatiron Institute cosmologists.











Gravitational waves will settle cosmic conundrum
Simulation of merging neutron stars calculated with supercomputers. Different colours show the mass density and the temperature some time after the merger has taken place and shortly before the object collapses to a black hole. Quarks are expected to form where the temperature and density are higher [Credit: C. Breu, L. Rezzolla]

The cosmos has been expanding for 13.8 billion years and its present rate of expansion, known as the Hubble constant, gives the time elapsed since the Big Bang.
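As a rough illustration of that statement (not a result from the study), the inverse of the Hubble constant sets the characteristic expansion timescale; for an assumed value of 70 km/s/Mpc, 1/H0 comes out near 14 billion years:

```python
# Hubble time 1/H0 as a rough age scale for the Universe.
# H0 = 70 km/s/Mpc is an illustrative value, not a figure from the paper.
km_per_mpc = 3.086e19            # kilometres in one megaparsec
H0 = 70.0                        # km/s/Mpc
H0_per_s = H0 / km_per_mpc       # convert to s^-1
hubble_time_s = 1.0 / H0_per_s
seconds_per_gyr = 3.156e7 * 1e9  # seconds in a billion years
print(f"~{hubble_time_s / seconds_per_gyr:.1f} billion years")   # ~14.0
```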


However, the two best methods used to measure the Hubble constant do not agree, suggesting our understanding of the structure and history of the universe – called the ‘standard cosmological model’ – may be wrong.


The study, published in Physical Review Letters, shows how new independent data from gravitational waves emitted by binary neutron stars called ‘standard sirens’ will break the deadlock between the measurements once and for all.


“The Hubble Constant is one of the most important numbers in cosmology because it is essential for estimating the curvature of space and the age of the universe, as well as exploring its fate,” said Professor Hiranya Peiris (UCL Physics & Astronomy).


“We can measure the Hubble Constant by using two methods – one observing Cepheid stars and supernovae in the local universe, and a second using measurements of cosmic background radiation from the early universe – but these methods don’t give the same values, which means our standard cosmological model might be flawed.”


The team developed a universally applicable technique which calculates how gravitational wave data will resolve the issue.


Gravitational waves are emitted when binary neutron stars spiral towards each other before colliding in a bright flash of light that can be detected by telescopes. Indeed, UCL researchers were involved in detecting the first light from a gravitational wave event in August 2017.


Binary neutron star events are rare but invaluable in providing another route to track how the universe is expanding.


This is because the gravitational waves they emit cause ripples in space-time which can be detected by the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Virgo experiments, giving a precise measurement of the system’s distance from Earth.


By additionally detecting the light from the accompanying explosion, astronomers can determine the system’s velocity, and hence calculate the Hubble constant using Hubble’s Law.
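A minimal sketch of that last step, assuming the gravitational-wave luminosity distance and an electromagnetically measured recession velocity are already in hand (the numbers below are placeholders loosely modelled on the 2017 GW170817 event, not values from this study):

```python
# Hubble's Law: v = H0 * d, so H0 = v / d.
# Placeholder inputs loosely modelled on a GW170817-like event.
velocity_km_s = 3000.0   # recession velocity of the host galaxy, km/s
distance_mpc = 43.0      # luminosity distance from the gravitational-wave signal, Mpc
H0 = velocity_km_s / distance_mpc
print(f"H0 ~ {H0:.0f} km/s/Mpc")   # ~70 km/s/Mpc
```

Combining many such events narrows the uncertainty on the measurement, which is why the authors estimate that roughly 50 binary neutron star observations are needed.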


For this study, the researchers modelled how many such observations would be needed to resolve the issue in measuring the Hubble constant accurately.


“We’ve calculated that by observing 50 binary neutron stars over the next decade, we will have sufficient gravitational wave data to independently determine the best measurement of the Hubble constant. We should be able to detect enough mergers to answer this question within 5-10 years,” said lead author Dr Stephen Feeney of the Center for Computational Astrophysics at the Flatiron Institute in New York City.


“This in turn will lead to the most accurate picture of how the universe is expanding and help us improve the standard cosmological model,” concluded Professor Peiris.


Source: University College London [February 14, 2019]



TANN



Archive


Earth First Origins Project seeks to replicate the cradle of life

The evolution of planet Earth and the emergence of life during its first half-billion years are inextricably linked, with a series of planetwide transformations – formation of the ocean, evolution of the atmosphere, and the growth of crust and continents – underpinning the environmental stepping stones to life. But how, and in what order, were the ingredients for life on Earth manufactured and assembled?











Earth First Origins Project seeks to replicate the cradle of life
NASA’s new Prebiotic Chemistry and Early Earth Environments (PCE3) Consortium will identify planetary conditions that might give rise to life’s chemistry [Credit: Rensselaer]

NASA’s Astrobiology Program has awarded a $9 million grant to tackle the question through the Earth First Origins project, led by Rensselaer Polytechnic Institute Assistant Professor Karyn Rogers. The five-year project seeks to uncover the conditions on early Earth that gave rise to life by identifying, replicating, and exploring how prebiotic molecules and chemical pathways could have formed under realistic early Earth conditions.


“Planet Earth and the chemistry of life share the same road,” said Rogers. “Because of that co-evolution, we can use our understanding of the fundamental planetary processes that set the Earth system in motion to sketch the physical, chemical, and environmental map to life.”


Earth First Origins serves as the catalyst for launching the Rensselaer Astrobiology Research and Education (RARE) Center. The newly established RARE Center builds on the expertise established through more than three decades of astrobiology research at Rensselaer, and supersedes its predecessor, the New York Center for Astrobiology. In addition to conducting fundamental research into life’s origins and the potential for life throughout the universe, the RARE Center will support a range of education and public engagement activities. These include a seminar series, a curricular minor in astrobiology, the upcoming Gateway to Early Earth Summer School, and a core undergraduate and graduate education program.


“Rensselaer has an extensive history of significant contributions to the field of astrobiology, and the Earth First Origins project and the Rensselaer Astrobiology Research and Education Center will be tremendous additions to our legacy of discovery,” said Rensselaer President Shirley Ann Jackson. “The interdisciplinary global collaboration involved in these initiatives epitomizes the visionary work we engage in as The New Polytechnic.”



Earth First Origins and the RARE Center unite a diverse team of experts in planetary evolution, early Earth geochemistry, prebiotic and experimental astrobiology, and analytical chemistry.  Complemented by a team of molecular biologists, geochemical modelers, and data and visualization experts, the research team brings a wealth of experience poised to launch a new research paradigm for studying life’s origins.


“Various types of environments existed on early Earth and many of them could have been the starting place of life, or life could have emerged via processes that connected several environmental niches,” said Rogers. “We want to establish the range of possible conditions in different early Earth environments, replicate them in the lab, and understand the particular factors that contribute to the sequence of chemical syntheses that lead to life.” 


The Earth First Origins project will establish the Gateway to Early Earth, which consists of both a physical lab space and a virtual environment, the early Earth Lab (eEL) and the Virtual early Earth Portal (VeEP), both housed at Rensselaer.  The Gateway will be a resource for the Earth First Origins team, as well as the larger origins of life community, to access realistic early Earth environments, both experimentally and through models, and explore their potential to give rise to life’s chemistry.


The early Earth Laboratory will house a suite of experimental equipment used to replicate early Earth environments. The eEL will not only target the temperature, pressure, and geochemical conditions of the early Earth, but will also employ novel experimental techniques to represent the dynamic connections between different systems.


“Early Earth hosted a wide range of distinct environments. By accurately representing water-rock-atmosphere interactions, or the flow and mixing of fluids along thermal and chemical gradients, the eEL will provide a much better way of exploring the chemical pathways that emerged during Earth’s earliest times,” said Bruce Watson, co-investigator and a geochemist and Institute Professor at Rensselaer.


The VeEP provides applications and tools for integrating geochemical and geophysical models, and applying data visualization techniques to explore the range of possibilities in various early Earth environments. Additionally, the VeEP will allow researchers to record data from experiments, models, and analyses in “virtual notebooks” that are ingested into a larger structured data warehouse and accessed through the portal.


Source: Rensselaer Polytechnic Institute [February 14, 2019]



TANN



Archive


Delays in banning wildlife trade put hundreds of species at risk

From parrots to lizards, hundreds of animal species could be at risk of extinction because of a policy process that responds slowly to scientific knowledge, according to a new study in Science.











Delays in banning wildlife trade put hundreds of species at risk
Some species have taken almost 20 years to be added to the protected list, while others are still waiting 24 years after being first considered [Credit: Getty Images]

International wildlife authorities will gather in May to vote on wildlife trade restrictions at the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) Conference of the Parties (CoP). The study suggests concrete steps policymakers can take to speed up a wildlife protection process that can take more than two decades.


“New trends in wildlife trade can develop quickly, with some species going from common to near extinction in just a few years,” said Eyal Frank, a co-author of the study and an assistant professor at the University of Chicago’s Harris School of Public Policy. “A policymaking process needs to respond quickly to new information in order to prevent extinction for hundreds of animals and plants. That’s why it’s absolutely critical that policymakers allow science to inform a speedy protection process.”


Frank and his co-author, David Wilcove from Princeton University, analyzed 958 species on the International Union for the Conservation of Nature’s (IUCN) Red List that are endangered by international trade. Of those, they discovered that 28 percent are not protected by CITES, the primary international framework for preventing species extinction due to international wildlife trade.


When studying how quickly species from the Red List became protected under CITES, they found that 62 percent either waited as long as 19 years for protection under CITES or are still waiting to be listed, up to 24 years after first being considered. These patterns are the same even for the most threatened species.


At the same time, the study points out that 36 percent of species studied were protected by CITES before making it on the Red List. This could be because the CITES authorities had information not available to the IUCN, or it could be due to staffing and other resource constraints at the IUCN.


“CITES and the Red List are two of the most important tools we have to save wildlife threatened by international trade. It’s vital that these two institutions work together closely and quickly to stop the killing,” said Wilcove, who is based at the Princeton Environmental Institute and Princeton’s Woodrow Wilson School of Public and International Affairs.


Frank and Wilcove recommend that any nation that is part of CITES advocate that Red List species threatened by international trade be quickly protected under the treaty in order to clear the backlog, with the goal being that any threatened species on the Red List that is threatened by trade receive a prompt vote for immediate protection under CITES. Independently from CITES, all countries can use the Red List as a source of information and take measures to protect threatened species found within their borders.


Source: Princeton University [February 14, 2019]



TANN



Archive


The smallest skeletons in the marine world observed in 3D by synchrotron techniques

Coccolithophores are microscopic marine algae that use carbon dioxide to grow and release carbon dioxide when they create their miniature calcite shells. These tiny but very abundant planktonic microorganisms could therefore be seriously impacted by current increasing carbon dioxide emissions. Scientists from the CNRS, Le Mans Université, Sorbonne Université, Aix-Marseille Université and the ESRF, the European Synchrotron, have revealed the nano-level 3D structure of their calcite shells, providing new perspectives for assessment of the role of these tiny microorganisms in the global carbon cycle. A study, published in Nature Communications, shows new correlations between their mass and the size of the organic template around which the calcite nucleation and growth take place.











The smallest skeletons in the marine world observed in 3D by synchrotron techniques
From coccospheres (left) to coccoliths (right). Coccolithophores are microscopic marine algae that use carbon dioxide to grow but release carbon dioxide when they create their miniature calcite shells called ‘coccoliths’ [Credit: Alain Gibaud, IMMM, CNRS UMR 6283, Le Mans Université]

You have probably never heard of them, but you may have inadvertently noticed coccolithophores in satellite images of the sea when a magnificent milky-turquoise coloured patch shows up in surface waters, indicating that trillions of these single-celled calcified phytoplankton are present.


About one-third of the carbon dioxide released into the atmosphere as a result of human activity is absorbed by the oceans, where it reacts chemically and makes the water more acidic. This, in turn, makes it difficult for certain calcifying marine organisms, such as sea stars, sea urchins, corals, and coccolithophores, to build their shells or skeletons.


When tiny organisms impact the global carbon cycle


Coccolithophores, single-celled organisms much smaller than the pixels on your computer screen, are active players in the carbon cycle. They live in surface layers of the sea, where they use light to photosynthesize, fixing CO2 into organic matter leading to a decrease in dissolved CO2 in the ocean. Unlike other photosynthetic phytoplankton, coccolithophores produce calcite (i.e. CaCO3) in the form of minute platelets called “coccoliths”.











The smallest skeletons in the marine world observed in 3D by synchrotron techniques
Structures of coccospheres by three-dimensional X-ray coherent diffraction imaging carried out at the ESRF, the European Synchrotron. (A) SEM image of G. oceanica RCC1314. (B) 3D-CXDI view of G. oceanica RCC1314. (C) 3D-CXDI views of six other coccospheres. Scale bar = 1 μm [Credit: Thomas Beuvier, ESRF, IMMM, CNRS UMR 6283, Le Mans Université]

Coccolithophore calcification uses bicarbonate (HCO3) from seawater and releases CO2. When coccolithophore cells die, coccoliths and associated organic matter slowly sink to the seabed, thus contributing to the storage of carbon in the deep ocean reservoir. Although they are tiny organisms, the coccolithophores play a key role in the global carbon cycle because of the fact that they are very abundant in the oceans.
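For reference, the net calcification reaction usually written for this process (standard carbonate chemistry, not spelled out in the article) shows why building calcite from bicarbonate releases carbon dioxide:

```latex
% Net calcification reaction: calcium and bicarbonate ions combine to give
% calcite, carbon dioxide and water.
\[
\mathrm{Ca^{2+} + 2\,HCO_3^{-} \;\longrightarrow\; CaCO_3 + CO_2 + H_2O}
\]
```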
Several recent laboratory and field studies indicate that ocean acidification is likely to hamper coccolithophore calcification. However, some studies have reported an increase of coccolithophore calcification in more acidic conditions.


Unveiling the mass of coccoliths


Understanding how environmental factors influence the degree of calcification of coccoliths is therefore of significant interest. The crucial issue is to be able to accurately estimate the mass of the calcite shell of these microorganisms. “We have developed a method to estimate the mass of individual coccoliths using automated optical microscopy”, says CNRS scientist Luc Beaufort. “Although this technique is very useful for measuring the mass of a large quantity of coccoliths in a short period of time, it was crucial to assess the accuracy of these measurements by comparing with another very precise method.”











The smallest skeletons in the marine world observed in 3D by synchrotron techniques
Bloom of coccolithophores visible from space [Credit: NASA]

Scientists Alain Gibaud and Thomas Beuvier, regular users of the ESRF, put Yuriy Chushkin and Federico Zontone, scientists at the ESRF, in touch with the palaeontologists Luc Beaufort and Baptiste Suchéras-Marx and the marine biologist Ian Probert. The coherent X-ray diffraction imaging technique on ESRF beamline ID10 was used to generate incredibly detailed information on the 3D structure (and therefore mass) of shells and individual coccoliths of several species of coccolithophore.
The team were able to calibrate the optical microscopy method and found that each coccolith in the shell has different characteristics, despite all being created in the same environmental conditions. To explain the variations in coccolith size and mass within single coccolithophores, they found that the mass of coccoliths is proportional to the size of the organic scale around which calcite nucleation occurs every 110-120 nm.
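A minimal sketch of how a mass follows from an imaged 3D structure, assuming the reconstruction yields the calcite volume and using the tabulated density of calcite (about 2.71 g/cm³); this is an illustration, not the team’s actual analysis pipeline:

```python
# Mass of a coccolith from its reconstructed calcite volume.
# The volume below is a made-up example value; 2.71 g/cm^3 is the standard
# density of calcite (conveniently, 1 g/cm^3 equals 1 pg/um^3).
volume_um3 = 5.0                      # hypothetical reconstructed volume, um^3
calcite_density_pg_per_um3 = 2.71     # 2.71 g/cm^3 expressed as pg/um^3
mass_pg = volume_um3 * calcite_density_pg_per_um3
print(f"~{mass_pg:.1f} pg per coccolith")   # ~13.6 pg
```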


“The experiment at the ESRF was challenging because the samples, at 5 to 7 microns, were almost too big for us to study. With coherent diffraction imaging, we managed to get information in 3D and reconstruct the individual calcite crystals of the coccoliths”, says Yuriy Chushkin, scientist at the ESRF. “In fact, the largest samples scattered the beam so well that in one hour we had the full 3D data set that we needed”, he concludes.


The next step for the team is to use the 3D computed images of these coccoliths to get a deeper understanding of how calcification is controlled by these extraordinary phytoplanktons and of the mechanical properties of these tiny but very intricate calcite structures.


Source: European Synchrotron Radiation Facility [February 14, 2019]



TANN



Archive


Massive Bolivian earthquake reveals mountains 660 kilometers below our feet

Most schoolchildren learn that the Earth has three (or four) layers: a crust, mantle and core, which is sometimes subdivided into an inner and outer core. That’s not wrong, but it does leave out several other layers that scientists have identified within the Earth, including the transition zone within the mantle.











Massive Bolivian earthquake reveals mountains 660 kilometers below our feet
Princeton seismologist Jessica Irving worked with then-graduate student Wenbo Wu and another collaborator to determine the roughness at the top and bottom of the transition zone, a layer within the mantle, using scattered earthquake waves. They found that the top of the transition zone, a layer located 410 kilometers down, is mostly smooth, but the base of the transition zone, 660 km down, in some places is much rougher than the global surface average. “In other words, stronger topography than the Rocky Mountains or the Appalachians is present at the 660-km boundary,” said Wu. Note: This graphic is not to scale. [Credit: Kyle McKernan/Princeton University]

In a study published in Science, Princeton geophysicists Jessica Irving and Wenbo Wu, in collaboration with Sidao Ni from the Institute of Geodesy and Geophysics in China, used data from an enormous earthquake in Bolivia to find mountains and other topography on the base of the transition zone, a layer 660 kilometers (410 miles) straight down that separates the upper and lower mantle. (Lacking a formal name for this layer, the researchers simply call it “the 660-km boundary.”)


To peer deep into the Earth, scientists use the most powerful waves on the planet, which are generated by massive earthquakes. “You want a big, deep earthquake to get the whole planet to shake,” said Irving, an assistant professor of geosciences.


Big earthquakes are vastly more powerful than small ones — energy increases 30-fold with every step up the Richter scale — and deep earthquakes, “instead of frittering away their energy in the crust, can get the whole mantle going,” Irving said. She gets her best data from earthquakes that are magnitude 7.0 or higher, she said, as the shockwaves they send out in all directions can travel through the core to the other side of the planet — and back again. For this study, the key data came from waves picked up after a magnitude 8.2 earthquake — the second-largest deep earthquake ever recorded — that shook Bolivia in 1994.
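A quick check of that scaling (the standard magnitude–energy relation, log10 E ≈ 1.5 M + constant, gives a factor of about 31.6 per whole magnitude step, close to the “30-fold” quoted above):

```python
# Radiated energy scales as 10^(1.5 * M), so each whole magnitude step
# multiplies the energy by 10^1.5, roughly 31.6.
per_step = 10 ** 1.5
print(f"one magnitude step: ~{per_step:.1f}x more energy")   # ~31.6x

# The 1994 Bolivia event (magnitude 8.2) versus a modest magnitude 6.0 quake:
ratio = 10 ** (1.5 * (8.2 - 6.0))
print(f"M8.2 vs M6.0: ~{ratio:.0f}x more energy")            # ~1995x
```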


“Earthquakes this big don’t come along very often,” she said. “We’re lucky now that we have so many more seismometers than we did even 20 years ago. Seismology is a different field than it was 20 years ago, between instruments and computational resources.”


Seismologists and data scientists use powerful computers, including Princeton’s Tiger supercomputer cluster, to simulate the complicated behavior of scattering waves in the deep Earth.


The technology depends on a fundamental property of waves: their ability to bend and bounce. Just as light waves can bounce (reflect) off a mirror or bend (refract) when passing through a prism, earthquake waves travel straight through homogenous rocks but reflect or refract when they encounter any boundary or roughness.


“We know that almost all objects have surface roughness and therefore scatter light,” said Wu, the lead author on the new paper, who just completed his geosciences Ph.D. and is now a postdoctoral researcher at the California Institute of Technology. “That’s why we can see these objects — the scattering waves carry the information about the surface’s roughness. In this study, we investigated scattered seismic waves traveling inside the Earth to constrain the roughness of the Earth’s 660-km boundary.”


The researchers were surprised by just how rough that boundary is — rougher than the surface layer that we all live on. “In other words, stronger topography than the Rocky Mountains or the Appalachians is present at the 660-km boundary,” said Wu. Their statistical model didn’t allow for precise height determinations, but there’s a chance that these mountains are bigger than anything on the surface of the Earth. The roughness wasn’t equally distributed, either; just as the crust’s surface has smooth ocean floors and massive mountains, the 660-km boundary has rough areas and smooth patches. The researchers also examined a layer 410 kilometers (255 miles) down, at the top of the mid-mantle “transition zone,” and they did not find similar roughness.


“They find that Earth’s deep layers are just as complicated as what we observe at the surface,” said seismologist Christine Houser, an assistant professor at the Tokyo Institute of Technology who was not involved in this research. “To find 2-mile (1-3 km) elevation changes on a boundary that is over 400 miles (660 km) deep using waves that travel through the entire Earth and back is an inspiring feat. … Their findings suggest that as earthquakes occur and seismic instruments become more sophisticated and expand into new areas, we will continue to detect new small-scale signals which reveal new properties of Earth’s layers.”


The presence of roughness on the 660-km boundary has significant implications for understanding how our planet formed and continues to function. That layer divides the mantle, which makes up about 84 percent of the Earth’s volume, into its upper and lower sections. For years, geoscientists have debated just how important that boundary is. In particular, they have investigated how heat travels through the mantle — whether hot rocks are carried smoothly from the core-mantle boundary (almost 2,000 miles down) all the way up to the top of the mantle, or whether that transfer is interrupted at this layer. Some geochemical and mineralogical evidence suggests that the upper and lower mantle are chemically different, which supports the idea that the two sections don’t mix thermally or physically. Other observations suggest no chemical difference between the upper and lower mantle, leading some to argue for what’s called a “well-mixed mantle,” with both the upper and lower mantle participating in the same heat-transfer cycle.


“Our findings provide insight into this question,” said Wu. Their data suggests that both groups might be partially right. The smoother areas of the 660-km boundary could result from more thorough vertical mixing, while the rougher, mountainous areas may have formed where the upper and lower mantle don’t mix as well.


In addition, the roughness the researchers found, which existed at large, moderate and small scales, could theoretically be caused by heat anomalies or chemical heterogeneities. But because of how heat is transported within the mantle, Wu explained, any small-scale thermal anomaly would be smoothed out within a few million years. That leaves only chemical differences to explain the small-scale roughness they found.


What could cause significant chemical differences? The introduction of rocks that used to belong to the crust, now resting quietly in the mantle. Scientists have long debated the fate of the slabs of sea floor that get pushed into the mantle at subduction zones, the collisions found all around the Pacific Ocean and elsewhere around the world. Wu and Irving suggest that remnants of these slabs may now be just above or just below the 660-km boundary.


“It’s easy to assume, given we can only detect seismic waves traveling through the Earth in its current state, that seismologists can’t help understand how Earth’s interior has changed over the past 4.5 billion years,” said Irving. “What’s exciting about these results is that they give us new information to understand the fate of ancient tectonic plates which have descended into the mantle, and where ancient mantle material might still reside.”


She added: “Seismology is most exciting when it lets us better understand our planet’s interior in both space and time.”


Source: Princeton University [February 14, 2019]



TANN



Archive


Rosetta’s comet sculpted by stress


ESA – Rosetta Mission patch.


18 February 2019


Feeling stressed? You’re not alone. ESA’s Rosetta mission has revealed that geological stress arising from the shape of Comet 67P/Churyumov–Gerasimenko has been a key process in sculpting the comet’s surface and interior following its formation.



Rosetta’s distinctive dual-lobe comet

Small, icy comets with two distinct lobes seem to be commonplace in the Solar System, with one possible mode of formation being a slow collision of two primordial objects in the early stages of formation, some 4.5 billion years ago. A new study using data collected by Rosetta during its two years at Comet 67P/C-G has illuminated the mechanisms that contributed to shaping the comet over the following billions of years.


The researchers used stress modelling and three-dimensional analyses of images taken by Rosetta’s high resolution OSIRIS camera to probe the comet’s surface and interior.



Stress-formed fractures and terraces on Rosetta’s comet

 “We found networks of faults and fractures penetrating 500 metres underground, and stretching out for hundreds of metres,” says lead author Christophe Matonti of Aix-Marseille University, France.


“These geological features were created by shear stress, a mechanical force often seen at play in earthquakes or glaciers on Earth and other terrestrial planets, when two bodies or blocks push and move along one another in different directions. This is hugely exciting: it reveals much about the comet’s shape, internal structure, and how it has changed and evolved over time.”


The model developed by the researchers found shear stress to peak at the centre of the comet’s ‘neck’, the thinnest part of the comet connecting the two lobes.


“It’s as if the material in each hemisphere is pulling and moving apart, contorting the middle part – the neck – and thinning it via the resulting mechanical erosion,” explains co-author Olivier Groussin, also of Aix-Marseille University, France.


“We think this effect originally came about because of the comet’s rotation combined with its initial asymmetric shape. A torque formed where the neck and ‘head’ meet as these protruding elements twist around the comet’s centre of gravity.”


The observations suggest that the shear stress acted globally over the comet and, crucially, around its neck. The fact that fractures could propagate so deeply into 67P/C-G also confirms that the material making up the interior of the comet is brittle, something that was previously unclear.


“None of our observations can be explained by thermal processes,” adds co-author Nick Attree of the University of Stirling, UK. “They only make sense when we consider a shear stress acting over the entire comet and especially around its neck, deforming and damaging and fracturing it over billions of years.”


Sublimation, the process of ices turning to vapour and resulting in comet dust being dragged out into space, is another well-known process that can influence a comet’s appearance over time. In particular, when a comet passes closer to the Sun, it warms up and loses its ices more rapidly – perhaps best visualised in some of the dramatic outbursts captured by Rosetta during its time at Comet 67P/C–G.


The new results shed light on how dual-lobe comets have evolved over time.



Evolution of Rosetta’s comet over 4.5 billion years

Comets are thought to have formed in the earliest days of the Solar System, and are stored in vast clouds at its outer edges before beginning their journey inwards. It would have been during this initial ‘building’ phase of the Solar System that 67P/C-G got its initial shape.


The new study indicates that, even at large distances from the Sun, shear stress would then act over a timescale of billions of years following formation, while sublimation erosion takes over on shorter million-year timescales to continue shaping the comet’s structure – especially in the neck region that was already weakened by shear stress.



Ultima Thule vs Comet 67P/C-G

Excitingly, NASA’s New Horizons probe recently returned images from its flyby of Ultima Thule, a trans-Neptunian object located in the Kuiper belt, a reservoir of comets and other minor bodies at the outskirts of the Solar System.


The data revealed that this object also has a dual-lobed shape, although somewhat flattened with respect to Rosetta’s comet.


“The similarities in shape are promising, but the same stress structures don’t seem to be apparent in Ultima Thule,” comments Christophe.


As more detailed images are returned and analysed, time will tell if it has experienced a similar history to 67P/C-G or not.


“Comets are crucial tools for learning more about the formation and evolution of the Solar System,” says Matt Taylor, ESA’s Rosetta Project Scientist.


“We’ve only explored a handful of comets with spacecraft, and 67P is by far the one we’ve seen in most detail. Rosetta is revealing so much about these mysterious icy visitors and with the latest result we can study the outer edges and earliest days of the Solar System in a way we’ve never been able to do before.”


Notes for editors:


“Bilobate comet morphology and internal structure controlled by shear deformation” by C. Matonti et al. is published in Nature Geoscience: https://www.nature.com/articles/s41561-019-0307-9


Rosetta is an ESA mission. It launched in 2004 and rendezvoused with Comet 67P/Churyumov–Gerasimenko in 2014 to study it close up as it orbited around the Sun. It also deployed the lander Philae onto the comet’s surface. Rosetta completed its mission by descending to the comet on 30 September 2016.


Related links:


Rosetta: http://www.esa.int/Our_Activities/Space_Science/Rosetta


OSIRIS camera: http://sci.esa.int/rosetta/35061-instruments/?fbodylongid=1642


Images, Text, Credits: ESA/Rosetta/NavCam – CC BY-SA IGO 3.0/MPS for OSIRIS Team MPS/UPD/LAM/IAA/SSO/INTA/UPM/DASP/IDA; C. Matonti et al. (2019)/NASA/Johns Hopkins University Applied Physics Laboratory/Southwest Research Institute.


Greetings, Orbiter.ch


Carbonaceous chondrites provide clues about the delivery of water to Earth

An international study led by researchers from the Institute of Space Sciences, of the Spanish National Research Council (CSIC) and the Institut d’Estudis Espacials de Catalunya, has discovered that carbonaceous chondrites, a class of meteorites, incorporated hydrated minerals along with organic material from the protoplanetary disk before the formation of planets. The scientists, whose study is published in the journal Space Science Reviews, note that these meteorites played “an important role in the primordial Earth’s water enrichment” because they transported volatile elements that had accumulated in the outer regions of the protoplanetary disk from which the planets formed more than 4,500 million years ago. Earth itself formed in an environment close to the Sun, chemically reduced owing to the relative lack of oxygen.


Carbonaceous chondrites provide clues about the delivery of water to Earth
Sample collecting of meteorites in Antarctica [Credit: Katherine Joy/ANSMET]

Carbonaceous chondrites come from asteroidal bodies that, because of their size (generally less than a hundred kilometres), never melted and never underwent the internal chemical differentiation that planets did. The study therefore gives clues about the initial accretion phases of the first bodies that went on to form the planets. The meteorites analysed in this work belong to NASA’s Antarctic collection, for which CSIC’s Institute of Space Sciences is the only Spanish repository centre, together with the meteorites that fell at Murchison (Australia) in 1969 and at Renazzo (Italy) in 1824. Representative samples of the two most hydrated groups of carbonaceous chondrites (CM and CR) were studied.


“Chondrites constitute a fossil legacy of the creation of the planetesimals: they provide information about the accretion of the first building blocks of planets, and also about everything that happened inside them shortly after their formation. In this study we wanted to go a step further and identify processes of water incorporation that occurred in the protoplanetary disk itself,” explains CSIC researcher Josep M. Trigo-Rodríguez, who works at the Institute of Space Sciences and led the study.


The CSIC researcher adds: “There is a great debate about the origin of water on Earth, and our study shows that carbonaceous chondrites were able to transport water very efficiently in their matrices. That water seems to come from two types of objects formed at different distances from the Sun: hydrated asteroids and comets. To understand the origin of Earth’s water we must therefore study not only comets but also the carbonaceous chondrites that come from a so-called transitional asteroid population. These bodies were far more numerous 4,000 million years ago, but were gravitationally destabilised during Jupiter and Saturn’s migration to their current locations. Those that did not end up being swallowed by Jupiter and Saturn were scattered towards the terrestrial planets and other regions of the Solar System, carrying water and organic material inside them.”
The study also points to direct implications for the origin of water on Earth. “Our calculations indicate that, coinciding with the so-called ‘Heavy Bombardment’ produced by the gravitational destabilisation of the main asteroid belt, billions of tonnes of carbonaceous chondrites reached Earth about 3,800 million years ago. And they did so carrying water and other volatile elements, in the form of hydrated minerals, in their fine-grained matrices,” says Trigo.
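

As a rough illustration of the scale involved (a hedged sketch, not a calculation from the study), the snippet below converts an assumed mass of delivered chondritic material into a water mass, using a water content typical of hydrated CM chondrites, and compares it with the mass of Earth’s present oceans. The delivered mass and water fraction are placeholders, not figures from the paper.

# Illustrative sketch only: how much water an assumed mass of carbonaceous
# chondrite material could deliver. Input values are assumptions for
# illustration, not figures from the CSIC study.
M_chondrites = 1e18      # assumed delivered mass of chondritic material, kg (placeholder)
water_fraction = 0.10    # hydrated CM chondrites hold very roughly ~10% water by mass
M_oceans = 1.4e21        # approximate mass of Earth's present oceans, kg

M_water = M_chondrites * water_fraction
print(f"delivered water ~ {M_water:.1e} kg "
      f"= {100 * M_water / M_oceans:.4f}% of the present oceans")
# Even this crude ratio shows why the total delivered mass matters: accounting
# for a substantial fraction of Earth's water requires a very large influx of
# hydrated material, which is why the abundance of 'transitional' asteroids
# during the Heavy Bombardment is central to the argument.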


Aims for future missions


Currently, there are two ongoing missions for sample return from primitive asteroids: NASA’s OSIRIS-REx and JAXA’s (Japan Aerospace Exploration Agency) Hayabusa2. The micro- and nanoscale analyses of carbonaceous chondrites published in this new study reveal the importance of sample-return missions, which can bring back to Earth rocks less altered by collisions than the meteorites that land on the terrestrial surface.


Source: Spanish National Research Council (CSIC) [February 14, 2019]




