Baylor University
Department of Physics
College of Arts and Sciences

Physics News


News Categories
•  Baylor
•  Colloquium
•  Faculty Meetings
•  Graduate
•  Outreach
•  Research Seminars
•  Social Events
•  SPS


Top News
•  Scientists Create Rare Fifth Form of Matter in Space for the First Time Ever
•  Reimagining of Schrödinger’s cat breaks quantum mechanics — and stumps physicists
•  RESOLUTIONS PRESENTED TO THE XXXth GENERAL ASSEMBLY OF THE INTERNATIONAL ASTRONOMICAL UNION
•  Black Holes Reignite Dead Stars and Turn Them Into Zombies
•  First-Ever Evidence of Higgs Boson Decay Opens New Doors for Particle Physics
•  NASA Created a Rare, Exotic State of Matter in Space
•  The Milky Way Had a Big Sibling Long Ago — And Andromeda Ate It
•  String Theory May Create Far Fewer Universes Than Thought
•  Large Hadron Collider Just Spat Electron-ified Atoms to Almost the Speed of Light
•  SETI Researchers Want to End the Alien-Detection Hype
•  Life Needs Sunlight — and That Could Change Where We Look for Aliens
•  Stars (Including 1 Daredevil) Circle the Milky Way's Monster Black Hole (Time-Lapse)
•  A New NASA-Led Project Means the Search for Aliens Is Heating Up
•  Probing Exoplanet Obliquity
•  The 10 most educated countries in the world
•  Designer diamonds could one day help build a quantum internet
•  Can We Ever Trust Man-Made AI?
•  High-Energy 'Ghost Particle' Traced to Distant Galaxy in Astronomy Breakthrough
•  The Peculiar Math That Could Underlie the Laws of Nature
•  World's fastest man-made spinning object could help study quantum mechanics
•  A City-Sized 'Telescope' Could Watch Space-Time Ripple 1 Million Times a Year
•  Physicists Find a Way to See the ‘Grin’ of Quantum Gravity
•  How Einstein Lost His Bearings, and With Them, General Relativity
•  Quantum vacuum may allow stars to exist in unconventional configurations
•  Friends and colleagues from the University of Cambridge have paid tribute to Professor Stephen Hawking, who died on 14 March 2018 at the age of 76.
•  A possible experiment to prove that gravity and quantum mechanics can be reconciled
•  SpaceX Aims to Begin BFR Spaceship Flight Tests as Soon as Next Year
•  Galaxies Rotate in Sync, Raising Dark Matter Questions
•  Does general relativity violate determinism inside charged black holes?
•  Speed of universe’s expansion remains elusive
•  Top 7 Breakthroughs of 2017 That Prove We’re Living in the Future
•  PHYSICS BREAKTHROUGH: NEW FORM OF MATTER, EXCITONIUM, FINALLY PROVED TO EXIST AFTER 50-YEAR SEARCH
•  In Just 4 Hours, Google’s AI Mastered All The Chess Knowledge in History
•  Farthest monster black hole found
•  A Swarm Intelligence Correctly Predicted TIME’s Person of the Year
•  Mathematicians Awarded $3 Million for Cracking Century-Old Problem
•  Scientists Experimentally Demonstrate the “Reversal of the Arrow of Time”
•  Physicists Take Steps Towards Measuring Unmeasurable Berry Curvature
•  'Holy Grail' Hadron: Scientists Are Close to Detecting the Elusive Tetraquark Particle
•  New Map of Dark Matter Puts the Big Bang Theory on Trial (Kavli Roundtable)
•  There and Back Again: Scientists Beam Photons to Space to Test Quantum Theory
•  New definitions of scientific units are on the horizon
•  IBM Has Used Its Quantum Computer to Simulate a Molecule—Here’s Why That’s Big News
•  Gravity may be created by strange flashes in the quantum realm: A model of how wave forms of quantum systems collapse reveals a way they could create gravitational fields, and perhaps even reconcile two pillars of physics
•  Scientists discover strange form of black hole at the heart of Milky Way
•  3,700-year-old Babylonian tablet rewrites the history of maths - and shows the Greeks did not develop trigonometry
•  Dark Energy Survey reveals most accurate measurement of universe's dark matter
•  World's Fastest-Swirling Vortex Simulates the Big Bang
•  UCI celestial census indicates that black holes pervade the universe
•  Cosmic map reveals a not-so-lumpy Universe
•  High-Precision Measurement of the Proton’s Atomic Mass
•  Strange Noise in Gravitational-Wave Data Sparks Debate
•  STARSHOT: INSIDE THE PLAN TO SEND A SPACECRAFT TO OUR NEIGHBOR STAR: Hundreds of engineers and scientists have come together to shoot for the stars, literally.
•  Two Students Just Broke a Quantum Computing World Record
•  An easy-to-build desktop muon detector
•  Groundbreaking discovery confirms existence of orbiting supermassive black holes
•  NASA's Kepler Space Telescope Finds Hundreds of New Exoplanets, Boosts Total to 4,034
•  China’s quantum satellite achieves ‘spooky action’ at record distance
•  Scientists make waves with black hole research
•  We Live in a Cosmic Void, Another Study Confirms
•  Scientists Finally Witnessed a Phenomenon That Einstein Thought “Impossible”
•  Charmed Existence: Mysterious Particles Could Reveal Mysteries of the Big Bang
•  A New State of Matter is Discovered – And It’s Strange
•  A Theory of Reality as More Than the Sum of Its Parts
•  Dark Energy May Lurk in the Nothingness of Space
•  What Happens When You Mix Thermodynamics and the Quantum World? A Revolution
•  Alien Civilizations May Number In The Trillions, New Study Says
•  New blackbody force depends on spacetime geometry and topology
•  Gravitational Waves Could Help Us Detect the Universe’s Hidden Dimensions
•  We could detect alien life by finding complex molecules
•  We May Have Uncovered the First Ever Evidence of the Multiverse
•  Scientists Just Discovered an Alien Planet That’s The Best Candidate for Life As We Know It
•  Physicists detect whiff of new particle at the Large Hadron Collider
•  Physicists Discover Hidden Aspects of Electrodynamics
•  A dark matter 'bridge' holding galaxies together has been captured for the first time
•  No, Dark Energy Isn't An Illusion
•  Satellite galaxies at edge of Milky Way coexist with dark matter
•  Magnetic hard drives go atomic
•  Could Mysterious Cosmic Light Flashes Be Powering Alien Spacecraft?
•  NASA is Going to Create The Coldest Spot in the Known Universe
•  Testing theories of modified gravity
•  First Solid Sign that Matter Doesn't Behave Like Antimatter
•  Physicists investigate erasing information at zero energy cost
•  NASA Just Found A Solar System With 7 Earth-Like Planets
•  Nearby Star Has 7 Earth-Sized Worlds - Most In Habitable Zone
•  Data About 2 Distant Asteroids: Clues to the Possible Planet Nine
•  Tune Your Radio: Galaxies Sing When Forming Stars
•  Coders Race to Save NASA's Climate Data
•  You Can Help Scientists Find the Next Earth-Like Planet
•  Scientists Discover Over 100 New Exoplanets
•  Why These Scientists Fear Contact With Space Aliens
•  Scientists May Have Solved the Biggest Mystery of the Big Bang
•  New Research Shows the Universe May Have Once Been a Hologram
•  Dark energy emerges when energy conservation is violated
•  Physicists measure the loss of dark matter since the birth of the universe
•  This star has a secret – even better than 'alien megastructures'
•  A simple explanation of mysterious space-stretching ‘dark energy?’
•  Physicists detect exotic looped trajectories of light in three-slit experiment
•  Actual footage shows what it was like to land on Saturn's moon Titan

Scientists Create Rare Fifth Form of Matter in Space for the First Time Ever
[11/5/2018]
For a few minutes on Jan. 23, 2017, the coldest spot in the known universe was a tiny microchip hovering 150 miles over Kiruna, Sweden.

The chip was small — about the size of a postage stamp — and loaded with thousands of tightly-packed rubidium-87 atoms. Scientists launched that chip into space aboard an unpiloted, 40-foot-long (12 meters) rocket, then bombarded it with lasers until the atoms inside it cooled to within a minuscule fraction of a degree of absolute zero, the coldest possible temperature in nature (about minus 459.67 degrees Fahrenheit, or minus 273.15 degrees Celsius).
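For reference, the temperature conversion behind those figures is

T[\mathrm{K}] = \tfrac{5}{9}\left(T[^{\circ}\mathrm{F}] + 459.67\right),

so minus 459.67 degrees Fahrenheit is 0 kelvin (absolute zero) to the precision quoted; the atoms on the chip ended up only billionths of a degree warmer than that limit.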

While the rocket bobbed in low gravity for the following 6 minutes, scientists were given a rare opportunity to study in-depth the weirdest, least-understood state of matter in the universe — the Bose-Einstein condensate. For the first time ever, scientists had created one in space.




Unlike the other four states of matter (solids, liquids, gases and plasmas), Bose-Einstein condensates can form only when clouds of gassy atoms cool to within a few billionths of a degree above absolute zero. When groups of atoms are cooled to such unfathomably low temperatures, they stop moving as individuals and meld into one big "super atom." Tens of thousands of atoms suddenly become indistinguishable from one another, slowly vibrating on a uniform wavelength that can, theoretically, pick up the tiniest gravitational disturbances around them.
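To put a number on "unfathomably low": for an ideal gas of bosons, the textbook condensation temperature (a standard estimate, not a figure from this study) is

T_c = \frac{2\pi\hbar^{2}}{m\,k_{B}}\left(\frac{n}{\zeta(3/2)}\right)^{2/3},

where m is the atomic mass, n the number density of the cloud, k_B is Boltzmann's constant and ζ(3/2) ≈ 2.612. For the dilute rubidium-87 clouds used in such experiments, this works out to at most a few hundred nanokelvin, which is why cooling to billionths of a degree above absolute zero is required before the atoms merge into a condensate.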

That hyper-sensitivity makes Bose-Einstein condensates promising tools for detecting gravitational waves — disturbances in the curvature of space-time created by collisions between supermassive objects like black holes and neutron stars. The trouble is, when scientists create Bose-Einstein condensates in terrestrial labs, they have just a few seconds to study them before the blob of homogenous matter falls to the bottom of its container and breaks apart.

Researchers sometimes try to buy themselves a few extra seconds by dropping Bose-Einstein condensates from tall towers, but this method is not sustainable for long-term study. Studying Bose-Einstein condensates in low or no gravity would be much more effective. (NASA recently set up a Cold Atom Laboratory on the International Space Station for just this purpose.)

This tiny microchip became the coldest spot in the known universe for 6 minutes on Jan. 23, 2017, as it hovered over Kiruna, Sweden.
Credit: DLR Aerospace Center
That brings us back to our rocket, and our very cold chip. When the chip-full-of-atoms was launched into space last January as part of the Matter-Wave Interferometry in Microgravity (MAIUS 1) experiment, scientists on the ground knew they had a few precious minutes to study it once the atoms inside froze. Using a compact laboratory built into the rocket, the team ran 110 lickety-split experiments on the chip to better understand how gravity affects atom trapping and cooling, and how Bose-Einstein condensates behave in free fall.

Among their results published in the Oct. 17 edition of the journal Nature, the researchers found that slicing up and reassembling Bose-Einstein condensates could be a key tool in detecting elusive gravitational waves. In one experiment, the team sliced their condensate cloud in half with a laser, then watched the halves recombine. Because both halves of the cloud share the exact same quantum state and move as a continuous wave, any differences in the two halves after recombination could indicate that an external influence altered that state. According to the researchers, the presence of gravitational waves could be one such influence.
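The payoff of longer free fall can be seen in the standard atom-interferometer relation (a textbook expression, not one quoted in the Nature paper): for an acceleration a, an effective laser wave number k_eff and a free-evolution time T between the splitting and recombination pulses, the phase difference between the two halves of the cloud grows as

\Delta\phi \approx k_{\mathrm{eff}}\, a\, T^{2}.

Because the sensitivity scales with the square of T, stretching the interrogation time from a fraction of a second in a ground-based drop to several seconds in microgravity makes the recombined cloud far more responsive to tiny accelerations — the property that makes space-borne condensates attractive as gravitational-wave detectors.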

If all this talk of chips and groundbreaking science is making you hungry for more, the good news is there's a lot more Bose-Einstein condensate research to be done, on Earth and above it. The researchers behind the MAIUS 1 mission currently have two sequels in the works. Stay tuned (and bundle up).

Originally published on Live Science.
(FULL STORY)

Reimagining of Schrödinger’s cat breaks quantum mechanics — and stumps physicists
[9/19/2018]
In a multi-‘cat’ experiment, the textbook interpretation of quantum theory seems to lead to contradictory pictures of reality, physicists claim.
(FULL STORY)

RESOLUTIONS PRESENTED TO THE XXXth GENERAL ASSEMBLY OF THE INTERNATIONAL ASTRONOMICAL UNION
[9/1/2018]
Transactions IAU, Volume XXXB
Proc. XXX IAU General Assembly, August 2018 © 2019 International Astronomical Union, Teresa Lago, ed. DOI: 00.0000/X000000000000000X
THIRTIETH GENERAL ASSEMBLY
RESOLUTIONS PRESENTED TO THE XXXth GENERAL ASSEMBLY
RESOLUTION B4
on a suggested renaming of the Hubble Law
Proposed by the IAU Executive Committee
The XXX General Assembly of the International Astronomical Union,
considering
1. that the discovery of the apparent recession of the galaxies, which is usually referred to as the “Hubble law”, is one of the major milestones in the development of the science of Astronomy during the last 100 years and can be considered one of the founding pillars of modern Cosmology;
2. that the Belgian astronomer Georges Lemaître, in 1927 published (in French) the paper entitled “Un Univers homogène de masse constante et de rayon croissant rendant compte de la vitesse radiale des nébuleuses extra-galactiques” [1]. In this he first rediscovers Friedman’s dynamic solution to Einstein’s general relativity equations that describes an expanding universe. He also derives that the expansion of the universe implies the spectra of distant galaxies are redshifted by an amount proportional to their distance. Finally he uses published data on the velocities and photometric distances of galaxies to derive the rate of expansion of the universe (assuming the linear relation he had found on theoretical grounds);
3. that, at the time of publication, the limited popularity of the Journal in which Lemaître’s paper appeared and the language used made his remarkable discovery largely unperceived by the astronomical community;
4. that both Georges Lemaˆıtre and the American astronomer Edwin Hubble attended the 3rd IAU General Assembly in Leiden in July 1928 and exchanged views [2] about the relevance of the redshift vs distance observational data of the extragalactic nebulae to the emerging evolutionary model of the universe;
5. that Edwin Hubble, in 1929 published the paper entitled “A Relation between Distance and Radial Velocity among Extra-Galactic Nebulae” [3] in which he proposed and derived the linear distance-velocity relation for galaxies, ultimately including new velocity data
in his 1931 paper with Humason [4]. Soon after the publication of his papers, the cosmic expansion became universally known as the “Hubble law”;
6. that, in 1931, on invitation by the Journal Monthly Notices of the Royal Astronomical Society, G. Lemaître translated in English his original 1927 paper [5], deliberately omitting the section in which he derived the rate of expansion because he “did not find advisable to reprint the [his] provisional discussion of radial velocities which is clearly of no actual interest, and also the geometrical note, which could be replaced by a small bibliography of ancient and new papers on the subject” [6];
desiring
7. to pay tribute to both Georges Lemaître and Edwin Hubble for their fundamental contributions to the development of modern cosmology;
8. to honour the intellectual integrity of Georges Lemaître that made him value more the progress of science rather than his own visibility;
9. to highlight the role of the IAU General Assemblies in fostering exchanges of views and international discussions;
10. to inform the future scientific discourses with historical facts;
resolves
11. to recommend that from now on the expansion of the universe be referred to as the “Hubble-Lemaître law”.
[1] Annales de la Société Scientifique de Bruxelles, A47, p. 49-59 (1927)
[2] Humason (https://www.aip.org/history-programs/niels-bohr-library/oral-histories/4686), as reported by Sidney van den Bergh, 2011, JRASC, Vol. 105, p. 197
[3] Proceedings of the National Academy of Science, USA, 15, 168 (1929)
[4] ”The velocity-distance relation among extra-galactic nebulae”, Astrophysical Journal, Vol 74, pages 43-80 (1931)
[5] Monthly Notices of the Royal Astronomical Society, Vol. 91, pages 483-490 (1931)
[6] Letter by G. Lemaître to Dr. Smart quoted by Mario Livio, Nature, Vol 479, pages 171-173 (2011)
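For reference, the relation being renamed is the linear law

v = H_0\, D,

where v is a galaxy's recession velocity, D its distance and H_0 the present-day expansion rate (the Hubble constant) — the proportionality Lemaître derived from theory and published data in 1927 and Hubble established observationally in 1929.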
(FULL STORY)

Black Holes Reignite Dead Stars and Turn Them Into Zombies
[8/30/2018]
When a dead star meets a giant black hole, something weird can happen. The astronomical meeting can create a zombie.

Upcoming research in the Astrophysical Journal outlines what might happen if a white dwarf encounters an intermediate-mass black hole. Its conclusion: The violent pull of the black hole could, in theory, reignite fusion inside the dead star.

The intermediate-mass black hole in this study contains between 1,000 and 1 million times the mass of the Sun, but this class is little understood because, until now, such black holes have eluded detection. Like supermassive black holes—the types of black holes at the centers of galaxies—they may be able to pull in vast amounts of material from an unlucky passing star. When this happens, the star gets ripped apart in what's called a tidal disruption event.

In this case, College of Charleston professor Chris Fragile hypothesized what might happen if a type of dead star called a white dwarf were in the crosshairs. While super-large stars become dense objects like neutron stars or black holes when they die and collapse, ordinary Sun-like stars don't. Instead, they blow out their gas layers, exposing a dense, Earth-sized core made out of degenerate electron matter instead of helium and hydrogen. That leftover is called a white dwarf.

Under most circumstances, there's no way to make a white dwarf a "normal" helium-fusing star again. But if the dwarf encounters an intermediate-mass black hole, weird physics come into play. The white dwarf is stretched and compressed in such a way that nuclear fusion resumes again—before the stellar remnant is jostled back to the land of the dead seconds later.

There are some caveats here. The white dwarf needs to pass close, and an intermediate-mass black hole more massive than 10,000 suns would probably just devour the white dwarf. With a black hole less massive than 1,000 suns, nothing so extreme will happen.

Such an event has never been witnessed, but future sky surveys could capture such an event in action, so the paper—which relied on computer simulations—serves as a sort of guideline to what to look for. The weird physics afoot show that some dead stars don't necessarily stay that way forever.
(FULL STORY)

First-Ever Evidence of Higgs Boson Decay Opens New Doors for Particle Physics
[8/29/2018]
If you’ve been a science fan for the last few years, you’re aware of the exciting results to emerge from the Large Hadron Collider (LHC), which in 2012 found the Higgs boson, the subatomic particle responsible for giving mass to fundamental subatomic particles.

Today, physicists have another exciting announcement to add to the Higgs saga: They have made the first unambiguous observation of Higgs bosons decaying into a matter-antimatter pair of bottom quarks. Surprisingly, the Higgs bosons decay most often in this way.

The new announcement shows a strong agreement between the theoretical predictions and the experimental data, which could in turn set strict constraints on ideas of more fundamental physics that strive to explain why the Higgs boson even exists.
Field of dreams

In the 1960s, researchers were investigating linkages between the force of electromagnetism and the weak nuclear force, which is responsible for some types of radioactive decays. Although the two forces seemed distinct, it turned out that they both arose from a common and more fundamental force, now called the electroweak force.

However, there was a problem. The simplest manifestation of the theory predicted that all particles had zero mass. Even in the 1960s, physicists knew that subatomic particles had mass, so that was potentially a fatal flaw.

Several groups of scientists proposed a solution to this problem: A field permeates the universe, and it's called the Higgs field. Fundamental subatomic particles interacted with this field, and this interaction gave them their mass. [6 Implications of Finding the Higgs Boson]

The existence of the field also implied the existence of a subatomic particle, called the Higgs boson, which was finally discovered in 2012 by researchers working at the European Organization for Nuclear Research (CERN) laboratory in Switzerland. (Disclosure: I am a collaborator on one of the research groups that made the initial discovery as well as today’s announcement.) For their predictions of the Higgs field, British physicist Peter Higgs and Belgian physicist François Englert shared the 2013 Nobel Prize in physics.

Finding the bottom quarks

Higgs bosons are made in high-energy collisions between pairs of particles that have been accelerated to nearly the speed of light. These bosons don’t live for very long — only about 10^-22 seconds. A particle with that lifetime, traveling at the speed of light, will decay long before it travels a distance the size of an atom. Thus, it is impossible to directly observe Higgs bosons. It is only possible to observe their decay products and use them to infer the properties of the parent boson.
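That flight-distance claim follows from a one-line estimate: even moving at essentially the speed of light, a particle that lives about 10^-22 seconds covers only

d \approx c\,\tau \approx \left(3\times10^{8}\ \mathrm{m/s}\right)\left(10^{-22}\ \mathrm{s}\right) \approx 3\times10^{-14}\ \mathrm{m},

a few thousand times smaller than a typical atom (roughly 10^-10 meters across), which is why only the decay products ever register in the detectors.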

Higgs bosons have a mass of 125 gigaelectron volts (GeV), or about 133 times the mass of a proton. Calculations from well-established theory predict that Higgs bosons decay into pairs of the following particles in the following percentages: bottom quarks (58 percent), W bosons (21 percent), Z bosons (6 percent), tau leptons (2.6 percent) and photons (0.2 percent). More exotic configurations make up the remainder. One of the key results of today’s announcement was to verify that the prediction was correct for bottom quarks. [Strange Quarks and Muons, Oh My! Nature's Tiniest Particles Dissected]
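Those percentages are easy to tally; the short sketch below (using the rounded values quoted in this article, not the official Standard Model calculation) shows how much of the total decay rate the listed channels account for.

# Approximate Higgs branching fractions quoted above (125 GeV Higgs boson).
branching_fractions = {
    "bottom quark pair": 0.58,
    "W boson pair": 0.21,
    "Z boson pair": 0.06,
    "tau lepton pair": 0.026,
    "photon pair": 0.002,
}

listed = sum(branching_fractions.values())
print(f"Listed channels account for {listed:.1%} of Higgs decays")  # about 87.8%
print(f"Remaining channels: {1 - listed:.1%}")                      # about 12.2%

Running it shows the five listed channels cover roughly 88 percent of all decays, with the bottom-quark channel alone responsible for more than half — which is why pinning down that mode matters so much.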

When physicists announced the discovery of the Higgs boson in 2012, they relied on its decay into Z bosons, W bosons and photons, but not bottom quarks. The reason is actually extremely simple: Those particular decays are far easier to identify.

At the collision energies available at the LHC, Higgs bosons are made in only one collision in every 1 billion. The vast number of collisions at the LHC occur through the interaction of the strong nuclear force, which is (by far) the strongest of the subatomic forces and is responsible for holding the nucleus of atoms together.

The problem is that in interactions involving the strong force, production of a matter-antimatter pair of bottom quarks is really quite common. Thus, the production of bottom quarks by Higgs bosons decaying into bottom quarks is totally swamped by pairs of bottom quarks made by more ordinary processes. Accordingly, it is essentially impossible to identify those events in which bottom quarks are produced through the decay of Higgs bosons. It's like trying to find a single diamond in a 50-gallon drum full of cubic zirconia.

Because it is difficult or impossible to isolate collisions in which Higgs bosons decay into bottom quarks, scientists needed another approach. So, researchers looked for a different class of events — collisions in which a Higgs boson was produced at the same time as a W or Z boson. Researchers call this class of collisions "associated production."

W and Z bosons are responsible for causing the weak nuclear force and they can decay in distinct and easily identifiable ways. Associated production occurs less often than nonassociated Higgs production, but the presence of W or Z bosons greatly enhances the ability of researchers to identify events containing a Higgs boson. The technique of associated production of a Higgs boson was pioneered at the Fermi National Accelerator Laboratory, located just outside Chicago. Because of the facility's lower-energy particle accelerator, the laboratory was never able to claim that it had discovered the Higgs boson, but its researchers' knowledge played a significant role in today’s announcement.

The LHC accelerator hosts two large particle-physics detectors capable of observing Higgs bosons — the Compact Muon Solenoid (CMS) and A Toroidal LHC Apparatus (ATLAS). Today, both experimental collaborations announced the observation of the associated production of Higgs bosons, with the specific decay of Higgs bosons into a matter-antimatter pair of bottom quarks.

Theoretical Band-Aid

While the simple observation of this decay mode is a significant advance in scientific knowledge, it has a much more important result. It turns out that the Higgs field, proposed back in 1964, is not motivated by a more fundamental idea. It was simply added on to the Standard Model, which describes the behavior of subatomic particles, as something of a Band-Aid. (Before the Higgs field was proposed, the Standard Model predicted massless particles. After the Higgs field was included as an ad hoc addition to the Standard Model, particles now have mass.) Thus, it is very important to explore the predictions of decay probabilities to search for hints of a connection to an underlying theory. And there have been more recent and comprehensive theories developed since the 1960s, which predict that perhaps more than one type of Higgs boson exists.

Thus, it is crucial to understand the rate at which Higgs bosons decay into other particles and compare it with the predicted decay rates. The easiest way to illustrate agreement is to report the observed rate of decay, divided by the predicted rate. Better agreement between the two will yield a ratio close to 1. The CMS experiment finds excellent agreement in today’s announcement, with a ratio of observed-to-predicted rates of 1.04 plus or minus 0.20, and the ATLAS measurement is similar (1.01 plus or minus 0.20). This impressive agreement is a triumph of current theory, although it does not indicate a direction toward a more fundamental origin for the Higgs phenomena.
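In the standard notation, that comparison is a "signal strength" (the usual definition, implied rather than written out in the article):

\mu = \frac{(\sigma \times \mathrm{BR})_{\mathrm{observed}}}{(\sigma \times \mathrm{BR})_{\mathrm{predicted}}},

where σ is the Higgs production rate and BR is the branching ratio into bottom quarks. The CMS value of 1.04 plus or minus 0.20 and the ATLAS value of 1.01 plus or minus 0.20 both sit well within one standard deviation of the Standard Model expectation of μ = 1.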

The LHC will continue to operate through early December. Then it will pause operations for two years for refurbishing and upgrades. In the Spring of 2021, it will resume operations with considerably enhanced capabilities. The accelerator and detectors are expected to continue to take data through the mid-2030s and to record over 30 times more data than what's been recorded so far. With that increase of data and improved capabilities, it is quite possible that the Higgs boson still has stories to tell.

Originally published on Live Science.

Don Lincoln contributed this article to Live Science's Expert Voices: Op-Ed & Insights.
(FULL STORY)

NASA Created a Rare, Exotic State of Matter in Space
[8/1/2018]
NASA has cooled a cloud of rubidium atoms to a ten-millionth of a degree above absolute zero, producing the exotic fifth state of matter in space. The experiment also now holds the record for the coldest object we know of in space, though it isn't yet the coldest thing humanity has ever created. (That record still belongs to a laboratory at MIT.)

The Cold Atom Lab (CAL) is a compact quantum physics machine, a device built to work in the confines of the International Space Station (ISS) that launched into space in May. Now, according to a statement from NASA, the device has produced its first Bose-Einstein condensates, the strange conglomerations of atoms that scientists use to see quantum effects play out at large scales.

"Typically, BEC experiments involve enough equipment to fill a room and require near-constant monitoring by scientists, whereas CAL is about the size of a small refrigerator and can be operated remotely from Earth," Robert Shotwell, who leads the experiment from the Jet Propulsion Laboratory, said in the statement.

Despite that difficulty, NASA said, the project was worth the effort. A Bose-Einstein condensate on Earth is already a fascinating object; at super-low temperatures, atoms' boundaries blend together, and usually-invisible quantum effects play out in ways scientists can directly observe. But cooling clouds of atoms to ultra-low temperatures requires suspending them using magnets or lasers. And once those magnets or lasers are shut off for observations, the condensates fall to the floor of the experiment and dissipate.

In the microgravity of the ISS, however, things work a bit differently. The CAL can form a Bose-Einstein condensate, set it free, then have a significantly longer time to observe it before it drifts off, NASA wrote — as long as 5 or 10 seconds. And that advantage, as Live Science previously reported, should eventually allow NASA to create condensates far colder than any on Earth. As the condensates expand outside their container, they cool further. And the longer they have to cool, the colder they get.

Originally published on Live Science.
(FULL STORY)

The Milky Way Had a Big Sibling Long Ago — And Andromeda Ate It
[7/23/2018]
Four billion years from now, the Milky Way galaxy as we know it will cease to exist.

Our Milky Way is bound for a head-on collision with the similar-sized Andromeda galaxy, researchers announced today (May 31). Over time, the huge galactic smashup will create an entirely new hybrid galaxy, one likely bearing an elliptical shape rather than the Milky Way's trademark spiral-armed disk.

"We do know of other galaxies in the local universe around us that are in the process of colliding and merging," Roeland van der Marel, of the Space Telescope Science Institute in Baltimore, told reporters today. "However, what makes the future merger of the Andromeda galaxy and the Milky Way so special is that it will happen to us."

Astronomers have long known that the Milky Way and Andromeda, which is also known as M31, are barrelling toward one another at a speed of about 250,000 mph (400,000 kph). They have also long suspected that the two galaxies may slam into each other billions of years down the road. [Milky Way Slams Into Andromeda (Artist Images)]

However, such discussions of the future galactic crash have always remained somewhat speculative, because no one had managed to measure Andromeda's sideways motion — a key component of that galaxy's path through space.

But that's no longer the case.

Van der Marel and his colleagues used NASA's Hubble Space Telescope to repeatedly observe select regions of Andromeda over a seven-year period. They were able to measure the galaxy's sideways (or tangential) motion, and they found that Andromeda and the Milky Way are indeed bound for a direct hit.

"The Andromeda galaxy is heading straight in our direction," van der Marel said. "The galaxies will collide, and they will merge together to form one new galaxy." He and his colleagues also created a video simulation of the Milky Way crash into Andromeda.

That merger, van der Marel added, begins in 4 billion years and will be complete by about 6 billion years from now.

A future cosmic crash

Such a dramatic event has never occurred in the long history of our Milky Way, which likely began taking shape about 13.5 billion years ago.

"The Milky Way has had, probably, quite a lot of small, minor mergers," said Rosemary Wyse of Johns Hopkins University in Baltimore, who was not affiliated with the new study. "But this major merger will be unprecedented."

The merger poses no real danger of destroying Earth or our solar system, researchers said. The stretches of empty space separating the stars in the two galaxies will remain vast, making any collisions or serious perturbations unlikely.

However, our solar system will likely get booted out to a different position in the new galaxy, which some astronomers have dubbed the "Milkomeda galaxy." Simulations show that we'll probably occupy a spot much farther from the galactic core than we do today, researchers said.

During the second close passage, the cores of the Milky Way and Andromeda appear as a pair of bright lobes. Star-forming nebulae are much less prominent because the interstellar gas and dust has been significantly decreased by previous bursts of star formation. Image released May 31, 2012.
Credit: NASA, ESA, Z. Levay and R. van der Marel (STScI), T. Hallas, and A. Mellinger
A new night sky

And the collision will change our night sky dramatically. If any humans are still around 3.75 billion years from now, they'll see Andromeda fill their field of view as it sidles up next to our own Milky Way. For the next few billion years after that, stargazers will be spellbound by the merger, which will trigger intense bouts of star formation.

Finally, by about 7 billion years from now, the bright core of the elliptical Milkomeda galaxy will dominate the night sky, researchers said. (The odds of viewing this sight, at least from Earth, are pretty slim, since the sun is predicted to bloat into a huge red giant 5 or 6 billion years from now.)

In its 22-year history, Hubble has revolutionized the way humanity views the cosmos. The new finding is another step in that process, researchers said.

"What's really exciting about the current measurements is, it's not about historical astronomy; it's not about looking back in time, understanding the expansion of the universe," said John Grunsfeld, associate administrator for NASA's Science Mission Directorate and a former astronaut who flew on three space shuttle missions that repaired Hubble .

"It's looking forward in time, which is another very human story," Grunsfeld added. "We like to know about our past — where did we come from? We very much like to know where we're going."

You can follow SPACE.com senior writer Mike Wall on Twitter: @michaeldwall. Follow SPACE.com for the latest in space science and exploration news on Twitter @Spacedotcom and on Facebook.
(FULL STORY)

String Theory May Create Far Fewer Universes Than Thought
[7/30/2018]
The problem with string theory, according to some physicists, is that it makes too many universes. It predicts not one but some 10^500 versions of spacetime, each with its own laws of physics. But with so many universes on the table, how can the theory explain why ours has the features it does?

Now some theorists suggest most—if not all—of those universes are actually forbidden, at least if we want them to have stable dark energy, the supposed force accelerating the expansion of the cosmos. To some, eliminating so many possible universes is not a drawback but a major step forward for string theory, offering new hope of making testable predictions. But others say the multiverse is here to stay, and the proposed problem with all those universes is not a problem at all.

The debate was a hot topic at the end of June in Japan, where string theorists convened for the conference Strings 2018. "This is really something new and it's led to a controversy within the field," says Ulf Danielsson, a physicist at Uppsala University in Sweden. The conversation centers on a pair of papers posted on the preprint server arXiv last month taking aim at the so-called “landscape” of string theory—the incomprehensible number of potential universes that result from the many different solutions to string theory's equations that produce the ingredients of our own cosmos, including dark energy. But the vast majority of the solutions found so far are mathematically inconsistent, the papers contend, putting them not in the landscape but in the so-called "swampland" of universes that cannot actually exist. Scientists have known many solutions must fall in this swampland for years, but the idea that most, or maybe all, of the landscape solutions might live there would be a major change. In fact, it may be theoretically impossible to find a valid solution to string theory that includes stable dark energy, says Cumrun Vafa, a Harvard University physicist who led the work on the two papers.
Lost in the Multiverse

String theory is an attempt to describe the whole universe under a single "theory of everything" by adding extra dimensions of spacetime and thinking of particles as miniscule vibrating loops. Many string theorists contend it is still the most promising direction for pursuing Albert Einstein's dream of uniting his general theory of relativity with the conflicting microscopic world of quantum mechanics. Yet the notion of a string theory landscape that predicts not just one universe but many has put some physicists off. "If it's really the landscape, in my view it's death for the theory because it loses all predictive value," says Princeton University physicist Paul Steinhardt, who collaborated on one of the recent papers. "Literally anything is possible." To Steinhardt and others, the newfound problems with dark energy offer string theory a way out. "This picture with a big multiverse could be mathematically wrong," Danielsson says. "Paradoxically this makes things much more interesting because that means string theory is much more predictive than we thought it was."

Some string theorists such as Savdeep Sethi of the University of Chicago welcome the reevaluation that is happening now. "I think this is exciting," he says. "I've been a skeptic of the landscape for a long time. I'm really happy to see the paradigm shift away from this belief that we have this proven set of solutions." But not everyone buys the argument that the landscape actually belongs in the swampland—especially the research team that established one of the earliest versions of the landscape in the first place back in 2003, which goes by the acronym KKLT after the scientists' last names. "I think it's very healthy to make these conjectures and check what other things could be going on but I don't see either theoretical or experimental reasons to take such a conjecture very seriously," says KKLT member Shamit Kachru of Stanford University. And Eva Silverstein, a Stanford physicist who also helped build the early landscape models, likewise doubts Vafa and his colleagues' argument. "I think the ingredients KKLT use and the way they put them together is perfectly valid," she says. Juan Maldacena, a theorist at the Institute for Advanced Study, says he also still supports the idea of string theory universes with stable dark energy.

And many theorists are perfectly happy with the string theory multiverse. "It is true that if this landscape picture is correct, the bit of the universe we're in compared to the multiverse will be like our solar system within the universe," Kachru says. And that is a good thing, he adds. Johannes Kepler originally sought a fundamental reason for why Earth lies the distance it does from the sun. But now we know the sun is just one of billions of stars in the galaxy, each with its own planets, and the Earth–sun distance is simply a random number rather than a result of some deep mathematical principle. Likewise, if the universe is one of trillions within the multiverse, the particular parameters of our cosmos are similarly random. The fact these numbers seem perfectly fine-tuned to create a habitable universe is a selection effect—humans will of course find themselves in one of the rare corners of the multiverse where it is possible for them to have evolved.

The Accelerating Universe

If it is true string theory cannot accommodate stable dark energy, that may be a reason to doubt string theory. But to Vafa it is a reason to doubt dark energy—that is, dark energy in its most popular form, called a cosmological constant. The idea originated in 1917 with Einstein and was revived in 1998 when astronomers discovered that not only is spacetime expanding—the rate of that expansion is picking up. The cosmological constant would be a form of energy in the vacuum of space that never changes and counteracts the inward pull of gravity. But it is not the only possible explanation for the accelerating universe. An alternative is "quintessence," a field pervading spacetime that can evolve. "Regardless of whether one can realize a stable dark energy in string theory or not, it turns out that the idea of having dark energy changing over time is actually more natural in string theory," Vafa says. "If this is the case, then one can measure this sliding of dark energy by astrophysical observations currently taking place."

So far all astrophysical evidence supports the cosmological constant idea, but there is some wiggle room in the measurements. Upcoming experiments such as Europe's Euclid space telescope, NASA's Wide-Field Infrared Survey Telescope (WFIRST) and the Simons Observatory being built in Chile's desert will look for signs dark energy was stronger or weaker in the past than the present. "The interesting thing is that we're already at a sensitivity level to begin to put pressure on [the cosmological constant theory]," Steinhardt says. "We don't have to wait for new technology to be in the game. We're in the game now." And even skeptics of Vafa's proposal support the idea of considering alternatives to the cosmological constant. "I actually agree that [a changing dark energy field] is a simplifying method for constructing accelerated expansion," Silverstein says. "But I don't think there's any justification for making observational predictions about the dark energy at this point."
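A convenient way to frame what those surveys measure (standard cosmology notation, not drawn from this article) is the dark-energy equation-of-state parameter

w = \frac{p}{\rho c^{2}},

the ratio of the dark-energy pressure p to its energy density ρc². A true cosmological constant has w = -1 at all times, while quintessence-like fields generally drift away from -1 as the universe evolves, so detecting any time variation in w would favor the changing-dark-energy picture Vafa describes.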

Quintessence is not the only other option. In the wake of Vafa's papers, Danielsson and colleagues proposed another way of fitting dark energy into string theory. In their vision our universe is the three-dimensional surface of a bubble expanding within a larger-dimensional space. "The physics within this surface can mimic the physics of a cosmological constant," Danielsson says. "This is a different way of realizing dark energy compared to what we've been thinking so far."

A Beautiful Theory

Ultimately the debate going on in string theory centers on a deep question: What is the point of physics? Should a good theory be able to explain the particular characteristics of the universe around us or is that asking too much? And when a theory conflicts with the way we think our universe works, do we abandon the theory or the things we think we know?

String theory is incredibly appealing to many scientists because it is "beautiful"—its equations are satisfying and its proposed explanations elegant. But so far it lacks any experimental evidence supporting it—and even worse, any reasonable prospects for gathering such evidence. Yet even the suggestion string theory may not be able to accommodate the kind of dark energy we see in the cosmos around us does not dissuade some. "String theory is so rich and beautiful and so correct in almost all the things that it's taught us that it's hard to believe that the mistake is in string theory and not in us," Sethi says. But perhaps chasing after beauty is not a good way to find the right theory of the universe. "Mathematics is full of amazing and beautiful things, and most of them do not describe the world," physicist Sabine Hossenfelder of the Frankfurt Institute for Advanced Studies wrote in her recent book, Lost in Math: How Beauty Leads Physics Astray (Basic Books, 2018).

Despite the divergence of opinions, physicists are a friendly bunch, and are united by their common goal of understanding the universe. Kachru, one of the founders of the landscape idea, worked with Vafa, the landscape's critic, as his undergraduate advisor—and the two are still friends. "He asked me once if I'd bet my life these [landscape solutions] exist," Kachru says. "My answer was, 'I wouldn't bet my life but I'd bet his!'"

Additional reporting by Lee Billings.

This story was provided by Astrobiology Magazine, a web-based publication sponsored by the NASA astrobiology program.
(FULL STORY)

Large Hadron Collider Just Spat Electron-ified Atoms to Almost the Speed of Light
[7/30/2018]
Scientists working on the Large Hadron Collider (LHC) achieved yet another first Wednesday (July 25), revving full-blown atoms (with electrons orbiting them) up to near the speed of light.

The question of whether these were truly the first "atoms" that humans have accelerated to these speeds is a bit semantic; The LHC accelerates atomic nuclei of one sort or another all the time. (That's why folks sometimes call the giant machine, run by the European Center for Nuclear Research, or CERN, an "atom smasher.") But this is the first time those nuclei have had electrons orbiting them. In this case, CERN explained in a press release, the researchers accelerated lead nuclei, each orbited by a single electron, in a relatively low-energy beam for "about an hour."

Then they "ramped the LHC up to its full power and maintained the beam for about two minutes before it was ejected." [Photos: The World's Largest Atom Smasher (LHC)]


In a follow-up test, they maintained the full-power beam for two hours with a smaller group of atoms.

Michaela Schaumann, an LHC physicist, said in a statement that accelerating atoms with electrons is challenging because, "it's really easy to accidentally strip off the electron. ... When that happens, the nucleus crashes into the wall of the beam pipe because its charge is no longer synchronised with the LHC's magnetic field."

The multi-billion-Euro experiment has safeguards to protect itself, she said, so if a beam becomes unstable it automatically gets dumped in order to protect the LHC.

However, CERN said, the complex atom beams turned out to be more stable than expected. That's good news, Schaumann said, because it opens the door to a host of new experiments. The most interesting? Using the complex atoms as gamma-ray sources. When the electrons move from high- to low-energy states, they emit photons (light particles). And at the LHC's speeds, those photons would have the wavelengths and energies of gamma rays, which can be difficult to produce in a lab.

Originally published on Live Science.
(FULL STORY)

SETI Researchers Want to End the Alien-Detection Hype
[8/3/2018]
Researchers looking for signals from technologically advanced aliens pick up countless strange pings — but so far, nothing has convinced them that a message really came from aliens.

But that doesn't mean there aren't plenty of overblown media headlines about potential alien detections. So a team of researchers pursuing the search for extraterrestrial intelligence, or SETI, has decided to revive a scale meant to ground these detections in reality. They shared their scale, called Rio 2.0, in a new paper that takes aim at SETI researchers and the media for irresponsible coverage of potential detections.

"It's absolutely crucial that when we talk about something so hugely significant as the discovery of intelligent life beyond the Earth, we do it clearly and carefully," lead author Duncan Forgan, a SETI scientist at the University of St Andrews in the U.K., said in a statement from the SETI Institute. "Having Rio 2.0 allows us to rank a signal quickly in a way that the general public can easily understand, and helps us keep their trust in a world filled with fake news." [13 Ways to Hunt Intelligent Aliens]

The new study builds on a similar effort, called the Rio Scale, which was developed in 2000 and was presented at the 51st International Astronautical Congress held in Rio de Janeiro the next year. But since the original scale was developed, SETI scientists have decided that the evaluation needed some updating, particularly given the breakneck pace of online media.

The new version of the scale creates the same output: a score ranging from 0 to 10 meant to convey the importance of a signal detection, with 0 representing a detection of no importance and 10 indicating one of extraordinary importance.

But the new research proposes tweaking the way that score is calculated to try to make it a better representation of the factors that determine a detection's true significance and to make the tool easier for scientists to use to evaluate their own signals and those of their colleagues. That initial score is then meant to be revised as additional data are gathered, the researchers explained.

The co-authors of the new research, who include some of the same people behind the 2001 scale, hope that reviving the scale will help the public and the media evaluate the importance of a signal — sort of like the alien equivalent of the Richter scale, co-author Jill Tarter, a SETI researcher based at the SETI Institute, said in the statement.

"The SETI community is attempting to create a scale that can accompany reports of any claims of the detection of extraterrestrial intelligence and be refined over time as more data become available," Tarter said. "This scale should convey both the significance and credibility of the claimed detection."

The research was described in a paper published July 24 in the International Journal of Astrobiology, and the researchers have set up an online calculator to produce the scores.

So far, the scientists wrote, the majority of the detections they've run through the calculator have come back with a score of 0, which means it's still much too early to get excited about alien communications.

Email Meghan Bartels at mbartels@space.com or follow her @meghanbartels. Follow us @Spacedotcom, Facebook and Google+. Original article on Space.com.
(FULL STORY)

Life Needs Sunlight — and That Could Change Where We Look for Aliens
[8/1/2018]
With every new exoplanet discovered, the same question arises: Could this world host life?

The default way scientists first approach that question is to check if the planet lies in the so-called habitable zone, the range of distances from a star in which a planet can hold liquid water on its surface. But water alone doesn't make life, so in a new paper, a team of scientists looked at another aspect of habitability: whether a planet receives enough ultraviolet radiation to create life's building blocks.

"The thing that you know best about any exoplanet system is the star," Paul Rimmer, lead author on the new study and an astrochemist at the University of Cambridge in the United Kingdom, told Space.com. "So, that seemed like a natural thing to start with." [9 Strange, Scientific Excuses for Why We Haven't Found Aliens Yet]

Building life in the lab

Most scientists think that life began with ribonucleic acid (RNA). Like DNA, this molecule can transmit information, but unlike DNA, it can also help other molecules react with each other, potentially allowing RNA to replicate itself. But getting that RNA in the first place is tricky. This feat is so tricky, in fact, that the problem of creating RNA has haunted chemists interested in the origins of life for almost half a century, Sukrit Ranjan, a planetary scientist at the Massachusetts Institute of Technology, told Space.com. Ranjan has collaborated with the researchers in the recent study but was not involved in the new work.

He said that scientists know how to create each of the three building blocks that make up a molecule of RNA. In previous work, chemists have also figured out how to piece those building blocks together into two of the four flavors of RNA by focusing on a specific tricky chemical bond first. "The thing that jumped out to planetary scientists was that this mechanism requires UV [ultraviolet] light to function," Ranjan said.

So, Rimmer asked questions like what type of lights the chemists were using in their experiments and how closely those setups mimic the light produced by stars. For the new paper, Rimmer and his colleagues watched that mechanism work on two different chemical mixes meant to imitate a sulfur-rich young world and under a range of ultraviolet conditions. Those experiments let them calculate a minimum amount of ultraviolet light required for RNA formation.

This was Rimmer's first time doing formal chemistry lab research, and he said he appreciated taking the new approach. "I really enjoyed that aspect, because I think that experimentation is really the way that you can ground yourself in reality," he said. "It's like observation. It's something that you can very much see."

Others may not be so convinced by the new experiments: Frances Westall, an astrobiologist at the National Center for Scientific Research in France who was not involved with the study, called the paper more of an "interesting thought experiment" in an email to Space.com. She said she's particularly concerned that one of the two initial sulfur mixes the team used didn't create RNA under Earth-like conditions — and, after all, we're positive life started here somehow.

"One of my problems with many prebiotic chemistry experiments run by chemists is that they do not consider what the early Earth really was like," she wrote, mentioning that the team used what she considers an outdated recipe of gases to represent our planet's early atmosphere. "[Chemists] use spurious concepts simply because they can get good results under certain physicochemical conditions," Westall wrote. [13 Ways to Hunt Intelligent Aliens]

Bringing it to the stars

Once Rimmer and his colleagues had that minimum requirement for ultraviolet light, they pored over exoplanets, selecting which worlds to include in their analysis. The researchers wanted planets that scientists are confident are rocky and so focused on planets less than 1.4 Earth radiuses in size. The scientists also wanted planets that previous studies had shown were the right distances from their sun to be able to hold liquid water on their surfaces.

Those criteria narrowed the study's focus down to a dozen exoplanets, a list that includes some of the most astrobiologically intriguing worlds we know of, like TRAPPIST-1e, f and g; Kepler-452b; and LHS 1140b. (The team eliminated another popular contender, Proxima b, because astronomers don't have a firm enough measure of that world's size.)

Then, they turned to ultraviolet radiation, calculating how much light these planets receive from their stars today. That left just one firm contender, Kepler-452b, which was discovered in 2015 and which NASA billed at the time as "the first near-Earth-size planet [identified] in the 'habitable zone' around a sun-like star."

Similar calculations for yet-to-be discovered planets could help scientists prioritize where they look for life, Rimmer said. That could be particularly helpful given how expensive the necessary observations of these planets' atmospheres will be — once such work is even technologically possible. These measures will happen via instruments like the long-delayed James Webb Space Telescope. "You want to make sure you're looking at the places where you have the best chances," he said.

That said, the team's ultraviolet radiation calculations are not the last word on habitability. Their analyses leave out two key factors: the impact of solar flares, which can cause dramatic fluctuations in the ultraviolet radiation the star releases, and the changes a star undergoes as it ages, becoming calmer and less active.

The second factor could be particularly important, Rimmer said. That's because, right now, astronomers' best bet for finding exoplanets where they can identify life is to search around small, faint M dwarf stars — which currently produce much less light than they did when they were younger. That means that while these stars may not currently foster the conditions RNA needs to form, they may have done so long ago. And life that arose in the past could still be hiding out on the surface of planets surrounding these stars.

Rimmer already has plans to build on the new research: He said he wants to use xenon lamps, which more closely mimic the ultraviolet light coming from stars, providing better estimates of where RNA formation can occur.

The research is described in a paper published today (Aug. 1) in the journal Science Advances.

Email Meghan Bartels at mbartels@space.com or follow her @meghanbartels. Follow us @Spacedotcom, Facebook and Google+. Original article on Space.com.
(FULL STORY)

Stars (Including 1 Daredevil) Circle the Milky Way's Monster Black Hole (Time-Lapse)
[8/2/2018]
Spectacular time-lapse footage from the European Southern Observatory's Very Large Telescope (VLT) in Chile captures stars orbiting the black hole at the center of the Milky Way, including one daring star that circles incredibly close.
A monster black hole called Sagittarius A* lies at the heart of the Milky Way, about 26,000 light-years away from Earth. The footage, taken during a 26-year-long observation campaign, revealed that a small group of stars orbit this gravitational monster at high speed.

Using the VLT, astronomers followed one of these stars, known as S2, as it passed close to the black hole during May 2018. Their observations showed that, at its closest approach, this star was less than 12 billion miles (20 billion kilometers) from the black hole and moving at a speed of 15.5 million mph (25 million km/h). [Images: Black Holes of the Universe]

Traveling at nearly 3 percent of the speed of light, S2 is an ideal test object for studying very strong gravitational fields and testing Einstein's theory of general relativity, ESO officials said in a statement.
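A quick back-of-the-envelope check of that figure, using only the speed quoted above and the standard value for the speed of light:

    # How does 25 million km/h compare with the speed of light?
    speed_kmh = 25e6                  # S2's speed at closest approach (from the article)
    c_kms = 299_792.458               # speed of light in km/s
    speed_kms = speed_kmh / 3600.0    # convert km/h to km/s
    print(f"{speed_kms:.0f} km/s, {speed_kms / c_kms:.1%} of c")
    # -> about 6,944 km/s, roughly 2.3% of c, consistent with the rounded
    #    "nearly 3 percent" quoted in the release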

"This is the second time that we have observed the close passage of S2 around the black hole in our galactic center. But this time, because of much improved instrumentation, we were able to observe the star with unprecedented resolution," Reinhard Genzel, lead researcher from the Max Planck Institute for Extraterrestrial Physics (MPE) in Garching, Germany, said in the statement. "We have been preparing intensely for this event over several years, as we wanted to make the most of this unique opportunity to observe general relativistic effects."

The last time S2 made its closest approach to the black hole was 16 years ago, and the resolution of the measurements taken using other instruments wasn't good enough to pick up the effects of relativity, according to the statement.

The new infrared observations were made using the GRAVITY, SINFONI and NACO instruments on ESO's VLT. Astronomers compiled the recent observations to make a time-lapse video of their findings, as well as an animation that simulates the orbits of the tight group of stars circling the Milky Way's black hole and S2's close approach.

"ESO has worked with Reinhard Genzel and his team and collaborators in the ESO Member States for over a quarter of a century," Xavier Barcons, ESO's director general, said in the statement. "It was a huge challenge to develop the uniquely powerful instruments needed to make these very delicate measurements and to deploy them at the VLT in Paranal."

(FULL STORY)

A New NASA-Led Project Means the Search for Aliens Is Heating Up
[6/26/2018]
To date, scientists have catalogued more than 3,500 exoplanets, some of which may even be capable of fostering life. But we simply don’t know. The ability to detect life on distant worlds still eludes us, but a new project coordinated by NASA now takes us a significant step closer to achieving that goal.

Six new papers published in the science journal Astrobiology are providing a launching point for scientists on the hunt for signs of life on planets outside our Solar System. The new papers outline various ways in which extraterrestrial “biosignatures” could be detected using current and future technologies, and what scientists should be looking for in the data. Encouragingly, the scientists say it’s entirely possible that we’ll detect atmospheric biosignatures of potentially habitable planets by the year 2030, while cautioning that definitive proof of alien life will likely only come later, after more rigorous analysis and more powerful telescopic techniques.

This project, called Nexus for Exoplanet Systems Science, or NExSS, is the culmination of two years of work, and it began with online discussions and a workshop held in Seattle in July 2016. NExSS is coordinated by NASA, but it's global in reach, involving scientists from the University of Washington, the University of California-Riverside, the Tokyo Institute of Technology, the University of Glasgow, and many other institutions. The project is rooted in astrobiology, but it involves an impressive cast of interdisciplinary experts, including planetary scientists, Earth scientists, heliophysicists (who study the effects of stars on their planetary systems), astrophysicists, chemists, and biologists. Indeed, if we're ever going to find life on other planets, it's going to take a village.


Artist’s conception of Kepler-62f, a small exoplanet located within a habitable zone.
Illustration: NASA Ames/JPL-Caltech/T. Pyle
The point of the project and the six new papers is to provide a comprehensive overview of what we know so far about life and how it gets started in the Universe, as well as how we might be able to detect biosignatures from Earth using current and future technologies. The researchers argue that an integrated, multidisciplinary approach is required, along with open-mindedness to a variety of ideas and perspectives. Because of the distances involved, however, and considering the limited state of our telescopic technologies, these biosignatures will have to be quite conspicuous.

“For life to be detectable on a distant world it needs to strongly modify its planet in a way that we can detect,” University of Washington astronomer Victoria Meadows, who co-authored two of the six new papers, said in a statement. “But for us to correctly recognize life’s impact, we also need to understand the planet and star—that environmental context is key.”

In the first paper, lead author Nancy Kiang from NASA’s Goddard Institute for Space Studies (GISS) and colleagues consider the types of biosignatures scientists should be looking for. A key conclusion of the paper is the recognition of two major signal types—atmospheric gases that are produced by life (e.g. oxygen that’s produced by plants or photosynthetic microbes), or light reflected by life (e.g. the color of leaves or pigments). The second paper, led by Meadows, is a discussion of possible false positives and false negatives, and the various ways in which astrobiologists might be fooled into thinking they’ve detected a planet with a discernible biosignature, or vice versa. Accordingly, the paper considers ways in which a planet could produce a biosignature like oxygen without the presence of life, and how planets with life might produce biosignatures far removed from what we witness here on Earth.

“There are lots of things in the universe that could potentially put two oxygen atoms together, not just photosynthesis—let’s try to figure out what they are,” said Meadows. “Under what conditions are they more likely to happen, and how can we avoid getting fooled?”

The third paper, led by Edward W. Schwieterman from the Department of Earth Sciences at the University of California, Riverside, considers what we’ve learned about our own planet and how life emerged on Earth, and how these and related processes might exist elsewhere. In the fourth paper, University of Washington astronomer David C. Catling and colleagues provide a framework for evaluating exoplanetary biosignatures, including variables like chemicals in the planet’s atmosphere, the presence of oceans and continents, and a planet’s overall climate. The authors also provide a systematic way of assessing a potential biosignature, allowing researchers to assign confidence levels for a planet to host life, from “very likely” (90 to 100 percent) to “very unlikely” (less than 10 percent). The fifth and sixth papers evaluate our current observational prospects, while proposing future directions, such as designing powerful space-based telescopes capable of detecting biosignatures of distant exoplanets.

Collectively, these papers, along with NExSS, show that the field of astrobiology is maturing; it’s becoming a rigorous scientific endeavor in its own right. Scientists are finally formalizing the search for extraterrestrial life, while providing entry points for scientists from different fields to come together. The strategies proposed in these papers require rigorous due process and sound science, but not at the expense of allowing scientists to think creatively about what other kinds of life might exist elsewhere.
(FULL STORY)

Probing Exoplanet Obliquity
[7/2/2018]
It’s always a shock for me when the soft air and fecund smells of spring slam into a parched and baked July, but seasonal change is inevitable. At least it is on Earth. We get such seasonal changes because of Earth’s obliquity, the angle of its spin axis relative to the plane of its orbit. For Earth, the angle has stayed pretty close to 23 degrees for a long time, although the tilt’s direction wobbles over cycles of thousands of years. And this very constancy of obliquity turns up in exoplanet discussions at times because it affects conditions on a planetary surface.

Some have argued that without the gravitational effects of the Moon, the tilt of the Earth would be changed by the gravitational pull of the Sun and planets, producing a potentially high degree of obliquity. Contrast our situation with that of Uranus, where we find a 90-degree tilt that leaves one pole in sunlight for half the Uranian year as the other remains in darkness. Without knowing how long the Moon has been able to stabilize Earth’s axial tilt, we can’t say how apparent equatorial ice sheets some 800 million years ago fit into this view of the Moon’s effect.

But obliquity as a factor in habitability continues to energize exoplanetary researchers. At Georgia Tech, a team led by Gongjie Li, working with graduate student Yutong Shan (Harvard-Smithsonian Center for Astrophysics), has developed computer simulations to analyze the spin axis dynamics of two exoplanets, Kepler 186f and Kepler 62f, two planets considered to be in or close to the habitable zone of their stars. The paper argues that without our Moon, Earth’s obliquity variation would range from 0 to 45 degrees over billion-year timescales.

Thus obliquity is an interesting data point. Bear in mind that so far, we have no reliable values for exoplanet obliquity, although ways to infer it from light curves and from high-contrast direct imaging have been proposed in the literature. The authors make the assumption that in both exoplanet systems studied, all planets have been identified. They then go on to study the evolution of the two five-planet systems. The ‘secular analytical framework’ they arrive at allows them to factor in planetary rotation rates, additional planets and satellites, and regions where resonant interactions within the system can produce large obliquity variations. For various realizations of planetary systems, the paper thus describes an ‘obliquity evolution.’

We know that Mars and Earth interact strongly with each other, as do Mercury and Venus; other than Earth, none of these worlds has a large moon. The authors point out that the orientation angle of a planet’s orbit around its host star can be made to oscillate through gravitational interactions. If the orbit oscillates at the same pace as the precession of the planet’s spin axis, large obliquity variations can be induced, the kind of thing our Moon dampens out.
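Written in the standard notation of secular spin dynamics (the symbols here are mine, not the paper's), the danger zone is where the spin-axis precession rate, set by the stellar torque on the planet's equatorial bulge, matches the rate at which the orbital plane itself precesses:

    \dot{\psi}_{\mathrm{spin}} = \alpha \cos\varepsilon \;\approx\; \left|\dot{\Omega}_{\mathrm{orbit}}\right|

where α is the precession constant, ε the obliquity, and Ω the longitude of the orbit's ascending node. A large moon raises α well away from the orbital frequencies, which is, in essence, the stabilizing effect described above.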



Image: An artist’s depiction of Kepler-62f. Credit: NASA Ames/JPL-Caltech/T.Pyle.

For these two exoplanet systems, we get an interesting result, for even without a stabilizing moon (if none is present), these two planets could be experiencing relatively low changes in their axial tilt:

“It appears that both exoplanets are very different from Mars and the Earth because they have a weaker connection with their sibling planets,” said Li. “We don’t know whether they possess moons, but our calculations show that even without satellites, the spin axes of Kepler-186f and 62f would have remained constant over tens of millions of years. That’s not to say either exoplanet has water, let alone life. But both are relatively good candidates. Our study is among the first to investigate climate stability of exoplanets and adds to the growing understanding of these potentially habitable nearby worlds.”

As Li has just pointed out, we have no knowledge of surface conditions on either of these planets, making the lovely image above nothing more than a guess, and an optimistic one at that. The ‘super Earth’ Kepler 62f, about 40 percent larger and with a mass 2.8 times that of our planet, is in the constellation of Lyra, the outermost of the five planets orbiting a K2-class star some 1200 light years from Earth. Kepler-186f orbits a red dwarf about 550 light years out, part of a five-planet system in the constellation Cygnus. A stable axial tilt would make it likely that both worlds experience regular seasons and thus a stable climate.

But are large obliquity values necessarily inimical to life? Some recent work, considered by the authors, shows that variability in obliquity can keep a planet’s global temperature higher than it would otherwise have been, extending the outer edge of the habitable zone. But it does appear that obliquity variations can produce sharp transitions between climate states. From the paper:

Recently, Kilic et al. (2017) mapped out the various equilibrium climate states reached by an Earth-like planet as a function of stellar irradiance and obliquity. They find that, in this parameter space, the state boundaries (e.g. between cryo- and aqua-planets) are sharp and very sensitive to the climate history of the planet. This suggests that a variable obliquity can easily move the planet across state divisions, as well as alter the boundaries themselves, which would translate into a dramatic impact on instantaneous surface conditions and long-term climate evolution.

Planets with highly irregular seasons aren’t necessarily destined to be lifeless, but if we become capable of determining planetary obliquity, such a value could help us narrow the target list for future space telescopes. The authors also suggest that their framework can provide input parameters for existing global climate models as we analyze habitability in multi-planet systems.

The paper is Shan and Li, “Obliquity Variations of Habitable Zone Planets Kepler-62f and Kepler-186f,” Astronomical Journal Vol. 155, No. 6 (17 May 2018). Abstract / preprint.


Michael Lorrey July 2, 2018, 12:37
I would argue that without a large moon orbiting off the orbital plane of the planet to its primary, like ours does, we would not see much variation at all. We can look at all the other planets in our solar system other than Uranus as examples. Statistically, if planets were prone to wandering obliquity, there should be a much wider range of it among our planets in our solar system. Since there is not, there isn’t a benefit to having a large moon, and here is why: when a planet rotates, the planetary mass at the equator is drawn outward due to centrifugal forces. This creates an oblateness to the planetary sphere, such that the diameter pole to pole is significantly less than the diameter equator to equator. It is this difference, combined with the gyroscopic behavior of the rotating planet, and the overwhelming gravitational attraction of its star, that forces a spinning planet without moons to orient itself north/south along with its star. The greater mass at the equator of the oblate spheroidal planet due to planetary bulge imposes a torque that forces the planet to face its equator edge on to its star. It is only when there is a greater gravitational influence, like a large moon, or a nearby jovian, that the suns influence is overridden, and the planet trends toward obliquity.
This also explains why Uranus orients its axis toward the sun: the greater gravitational influences upon it are not the Sun but its own moons, which inhabit an accretion disk that is likewise consistent with the Uranian equator (face on, rather than edge on, to the Sun) and which resulted from some cataclysmic impact early in Uranian planetary formation. The combined mass of the Uranian moons rivals the total mass of our own Moon, and while Uranus masses 14 times that of Earth, this is apparently sufficient for the moons of Uranus to exert enough gravitational influence upon the planet to maintain its obliquity.
(FULL STORY)

The 10 most educated countries in the world
[2/7/2018]
Every year, institutions in the United States dominate rankings of the best colleges in the world.

Of the top 10 best universities in the world, eight are located in the U.S. But despite having some of the best educational institutions on earth, the Organisation for Economic Co-operation and Development (OECD) ranks the U.S. sixth for adult education level.

The OECD defined a country's adult education level as the percentage of people between the ages of 25 and 64 who have completed some kind of tertiary education in the form of a two-year degree, four-year degree or vocational program.

Here are the 10 most educated countries:


10. Luxembourg: 42.86 percent

9. Norway: 43.02 percent

8. Finland: 43.60 percent

7. Australia: 43.74 percent

6. United States: 45.67 percent

5. United Kingdom: 45.96 percent

4. Korea: 46.86 percent

3. Israel: 49.90 percent

2. Japan: 50.50 percent

1. Canada: 56.27 percent

Canada tops the list as the most educated country in the world. According to the OECD, over 56 percent of adults in the Great White North have earned some kind of education after high school.

During the 2016 World Economic Forum in Davos, Canadian Prime Minister Justin Trudeau suggested that Canadians' education was the nation's greatest resource.

"We need education to enable people to learn, think, and adapt," he said. "Our natural resources are important, and they always will be. But Canadians know that what it takes to grow and prosper isn't just what's under our feet, it's what between our ears."

Canada is followed by highly-educated countries like Japan, Israel and Korea.

The United States ranked sixth on the OECD's list. According to the OECD, 45.7 percent of American adults between the ages of 25 and 64 have completed some kind of tertiary education in the form of a two-year degree, four-year degree or vocational program. The U.S. Census estimates that about 33 percent of American adults possess a bachelor's degree or more.
(FULL STORY)

Designer diamonds could one day help build a quantum internet
[7/6/2018]
A new kind of artificial diamond is a cut above the rest for quantum memory.

Unlike other synthetic diamonds, which could either store quantum information for a long time or transmit it clearly, the new diamond can do both. This designer crystal, described in the July 6 Science, could be a key building block in a quantum internet. Such a futuristic communications network would allow people to send supersecure messages and connect quantum computers around the world (SN: 10/15/16, p. 13).

Synthetic diamond can serve as quantum storage thanks to a type of flaw in its carbon lattice, where two neighboring carbon atoms are replaced with one noncarbon atom and an empty space (SN: 4/5/08, p. 216). This pairing exhibits a quantum property known as spin, which can be in an “up” state, a “down” state or both at once. Each of those states reflects a bit of quantum data, or qubit, that may be 1, 0 or both at once. A diamond transmits qubits by encoding them in light particles, or photons, that travel through fiber-optic cables.
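As a toy illustration of that "up, down, or both at once" language (a generic two-level quantum state, nothing specific to diamond defects):

    import numpy as np

    up   = np.array([1, 0], dtype=complex)   # spin "up"   -> qubit value 1
    down = np.array([0, 1], dtype=complex)   # spin "down" -> qubit value 0

    # An equal superposition: the spin carries "both at once"
    both = (up + down) / np.sqrt(2)

    # Born-rule probabilities of reading out each value
    print(abs(np.vdot(up, both))**2, abs(np.vdot(down, both))**2)   # -> 0.5 0.5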

Qubit-storing diamond defects are typically made with nitrogen atoms, which can store quantum data for milliseconds — a relatively long time in the quantum realm (SN: 4/23/11, p. 14). But nitrogen defects can’t communicate that data clearly. They emit light particles at a broad range of frequencies, which muddles the quantum information written into the photons.

Defects made with silicon atoms emit light more precisely, but until now haven’t been able to store qubits for longer than several nanoseconds due to their electrical interactions with nearby particles, explains Nathalie de Leon, an electrical engineer at Princeton University.

De Leon and colleagues got around this problem by forging silicon defects in a diamond infused with boron. This extra chemical ingredient shielded the delicate silicon defects from electrical interactions with nearby particles, extending the defects’ quantum memory. The boron-infused crystal nearly rivaled the long-term quantum memory of nitrogen defects, storing qubits for about a millisecond. And it gave a clean photon readout, emitting about 90 percent of its photons at the exact same frequency — compared to just 3 percent of photons spat out by nitrogen defects.

Tweaking the environment of the silicon defects was “an extremely creative way” to help keep a better grip on qubits, says Evelyn Hu, an applied physicist and electrical engineer at Harvard University not involved in the work.

This new artificial diamond could be used to construct devices called quantum repeaters for long-distance quantum communications, says David Awschalom, a physicist and quantum engineer at the University of Chicago who wasn’t involved in the work. Qubit-carrying photons can travel only up to about 100 kilometers through optical fiber before their signal gets scrambled (SN: 9/30/17, p. 8). Quantum repeaters that catch, store and re-emit photons could serve as stepping stones between fiber-optic cables to extend the reach of future networks.
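The 100-kilometer figure reflects exponential photon loss in fiber. Here is a rough sketch, assuming a typical telecom-fiber attenuation of about 0.2 dB per kilometer (a standard textbook value, not one taken from the article):

    # Rough photon-survival estimate in low-loss telecom fiber
    atten_db_per_km = 0.2   # assumed attenuation near 1550 nm

    def transmission(length_km):
        """Fraction of photons expected to survive the given fiber length."""
        return 10 ** (-atten_db_per_km * length_km / 10)

    for km in (50, 100, 200):
        print(f"{km:3d} km: {transmission(km):.1%} of photons arrive")
    # 100 km -> about 1 percent, which is why repeaters are needed to go farther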
(FULL STORY)

Can We Ever Trust Man-Made AI?
[5/21/2018]
Some of the most intelligent people at the most highly-funded companies in the world can’t seem to answer this simple question: what is the danger in creating something smarter than you? They’ve created AI so smart, through “deep learning,” that it’s outsmarting the people who made it.

The reason is the “black box” style of code the AI is built on—it’s built solely to become smarter, and we have no way to regulate that knowledge. That might not seem like a terrible thing if you want to build superintelligence. But we’ve all experienced something minor going wrong, or a bug, in our current electronics. Imagine that, but in a Robojudge that can sentence you to 10 years in prison without explanation other than “I’ve been fed data and this is what I compute”… or a bug in the AI of a busy airport. We need regulation now, before we create something we can’t control. Max Tegmark’s book Life 3.0: Being Human in the Age of Artificial Intelligence is being heralded as one of the best books on AI, period, and is a must-read if you’re interested in the subject.
(FULL STORY)

High-Energy 'Ghost Particle' Traced to Distant Galaxy in Astronomy Breakthrough
[7/12/2018]
Astronomers have traced a high-energy neutrino to its cosmic source for the first time ever, solving a century-old mystery in the process.

Neutrinos are nearly massless subatomic particles that have no electric charge and therefore interact rarely with their surroundings. Indeed, trillions of these "ghost particles" stream through your body unnoticed and unhindered every second.

Most of these neutrinos come from the sun. But a small percentage, which boast extremely high energies, have rocketed to our neck of the woods from very deep space. The inherent elusiveness of neutrinos has prevented astronomers from pinning down the origin of such cosmic wanderers — until now. [Tracing a Neutrino to Its Source: The Discovery in Pictures]

Observations by the IceCube Neutrino Observatory at the South Pole and a host of other instruments allowed researchers to track one cosmic neutrino to a distant blazar, a huge elliptical galaxy with a fast-spinning supermassive black hole at its heart.

And there's more. Cosmic neutrinos go hand in hand with cosmic rays, highly energetic charged particles that slam into our planet continuously. So, the new find pegs blazars as accelerators of at least some of the fastest-moving cosmic rays as well.

Astronomers have wondered about this since cosmic rays were first discovered, way back in 1912. But they've been thwarted by the particles' charged nature, which dictates that cosmic rays get tugged this way and that by various objects as they zoom through space. Success finally came from using the straight-line journey of a fellow-traveler ghost particle.

"We have been looking for the sources of cosmic rays for more than a century, and we finally found one," Francis Halzen, lead scientist at the IceCube Neutrino Observatory and a professor of physics at the University of Wisconsin-Madison, told Space.com. [Wacky Physics: The Coolest Little Particles in Nature]

In this artist's illustration, based on a real image of the IceCube lab at the South Pole, a distant source emits neutrinos that are detected below the ice by IceCube sensors.
Credit: IceCube/NSF
A team effort

IceCube, which is managed by the U.S. National Science Foundation (NSF), is a dedicated neutrino hunter. The facility consists of 86 cables, which nestle within boreholes that extend about 1.5 miles (2.5 kilometers) into the Antarctic ice. Every cable, in turn, holds 60 basketball-size "digital optical modules," which are outfitted with sensitive light detectors.

These detectors are designed to pick up the characteristic blue light emitted after a neutrino interacts with an atomic nucleus. (This light is thrown off by a secondary particle created by the interaction. And in case you were wondering: All that overlying ice prevents particles other than neutrinos from reaching the detectors and dirtying up the data.) These are rare events; IceCube spots just a couple of hundred neutrinos per year, Halzen said.

The facility has already made big contributions to astronomy. In 2013, for example, IceCube made the first-ever confirmed detection of neutrinos from beyond the Milky Way galaxy. Researchers weren't able to pin down the source of those high-energy ghost particles at the time.

On Sept. 22, 2017, however, IceCube picked up another cosmic neutrino. It was extremely energetic, packing about 300 teraelectron volts — nearly 50 times greater than the energy of the protons cycling through Earth's most powerful particle accelerator, the Large Hadron Collider.
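The "nearly 50 times" comparison is straightforward arithmetic against the LHC's 6.5-TeV proton beams of that period (the 6.5 TeV figure is general knowledge about LHC Run 2, not stated in the article):

    # Compare the detected neutrino's energy with an LHC beam proton
    neutrino_tev   = 300    # approximate neutrino energy, from the article
    lhc_proton_tev = 6.5    # per-beam proton energy during LHC Run 2 (assumed)
    print(f"ratio = {neutrino_tev / lhc_proton_tev:.0f}")   # -> about 46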

Within 1 minute of the detection, the facility sent out an automatic notification, alerting other astronomers to the find and relaying coordinates to the patch of sky that seemed to house the particle's source.

The community responded: Nearly 20 telescopes on the ground and in space scoured that patch across the electromagnetic spectrum, from low-energy radio waves to high-energy gamma-rays. The combined observations traced the neutrino's origin to an already-known blazar called TXS 0506+056, which lies about 4 billion light-years from Earth.

For example, follow-up observations by several different instruments — including NASA's Earth-orbiting Fermi Gamma-ray Space Telescope and the Major Atmospheric Gamma Imaging Cherenkov Telescope (MAGIC) in the Canary Islands — revealed a powerful burst of gamma-ray light flaring from TXS 0506+056. [Gamma-Ray Universe: Photos by NASA's Fermi Space Telescope]

The IceCube team also went through its archival data and found more than a dozen other cosmic neutrinos that seemed to be coming from the same blazar. These additional particles were picked up by the detectors from late 2014 through early 2015.

"All the pieces fit together," Albrecht Karle, a senior IceCube scientist and UW-Madison physics professor, said in a statement. "The neutrino flare in our archival data became independent confirmation. Together with observations from the other observatories, it is compelling evidence for this blazar to be a source of extremely energetic neutrinos, and thus high-energy cosmic rays."

The findings are reported in two new studies published online today (July 12) in the journal Science.

Multimessenger astrophysics on the rise

Blazars are a special type of superluminous active galaxy that blast out twin jets of light and particles, one of which is aimed directly at Earth. (That's partly why blazars appear so bright to us — because we're in the line of jet fire.)

Astronomers have identified several thousand blazars throughout the universe, but none of the others has yet been found to be slinging neutrinos at us the way TXS 0506+056 is.

"There's something special about this source, and we have to figure out what it is," Halzen told Space.com.

That's just one of many questions raised by the new results. For example, Halzen would also like to know the acceleration mechanism: How, exactly, do blazars get neutrinos and cosmic rays up to such tremendous speeds?

Halzen expressed optimism about answering such questions in the relatively near future, citing the power of "multimessenger astrophysics" — the use of at least two different types of signals to interrogate the cosmos — on display in the two new studies.

The neutrino discovery follows closely on the heels of another multimessenger landmark: In October 2017, researchers announced that they had analyzed a collision between two superdense neutron stars by observing both the electromagnetic radiation and gravitational waves emitted during the dramatic event.

"The era of multimessenger astrophysics is here," NSF Director France Cordova said in the same statement. "Each messenger — from electromagnetic radiation, gravitational waves and now neutrinos — gives us a more complete understanding of the universe and important new insights into the most powerful objects and events in the sky."
(FULL STORY)

The Peculiar Math That Could Underlie the Laws of Nature
[7/20/2018]
New findings are fueling an old suspicion that fundamental particles and forces spring from strange eight-part numbers called “octonions.” Cohl Furey, a mathematical physicist at the University of Cambridge, is finding links between the Standard Model of particle physics and the octonions, numbers whose multiplication rules are encoded in a triangular diagram called the Fano plane.
(FULL STORY)

World's fastest man-made spinning object could help study quantum mechanics
[7/21/2018]
Researchers have created the fastest man-made rotor in the world, which they believe will help them study quantum mechanics. At more than 60 billion revolutions per minute, this machine is more than 100,000 times faster than a high-speed dental drill. "This study has many applications, including material science," said Tongcang Li, an assistant professor of physics and astronomy, and electrical and computer engineering, at Purdue University. "We can study the extreme conditions different materials can survive in."
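That rotation rate is the same quantity quoted in gigahertz in the paper reference below; the conversion is a one-liner:

    # Convert the quoted rotation rate to revolutions per second
    rpm = 60e9              # from the article: more than 60 billion rpm
    hz = rpm / 60           # 60 seconds per minute
    print(f"{hz:.1e} Hz")   # -> 1.0e+09 Hz, i.e. a roughly 1 GHz mechanical rotor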
Li's team synthesized a tiny dumbbell from silica and levitated it in high vacuum using a laser. The laser can work in a straight line or in a circle—when it's linear, the dumbbell vibrates, and when it's circular, the dumbbell spins.
A spinning dumbbell functions as a rotor, and a vibrating dumbbell functions like an instrument for measuring tiny forces and torques, known as a torsion balance. These devices were used to discover things like the gravitational constant and density of Earth, but Li hopes that as they become more advanced, they'll be able to study things like quantum mechanics and the properties of vacuum.
Researchers have created the fastest man-made rotor in the world by spinning a nanodumbbell with a circularly polarized laser.
"People say that there is nothing in vacuum, but in physics, we know it's not really empty," Li said. "There are a lot of virtual particles which may stay for a short time and then disappear. We want to figure out what's really going on there, and that's why we want to make the most sensitive torsion balance."
By observing this tiny dumbbell spin faster than anything before it, Li's team may also be able to learn things about vacuum friction and gravity. Understanding these mechanisms is an essential goal for the modern generation of physics, Li said.
A nanodumbbell levitated by an optical tweezer in vacuum can vibrate or spin, depending on the polarization of the incoming laser. Credit: Purdue University photo/Tongcang Li
More information: Jonghoon Ahn et al. Optically Levitated Nanodumbbell Torsion Balance and GHz Nanomechanical Rotor, Physical Review Letters (2018). DOI: 10.1103/PhysRevLett.121.033603 , https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.121.033603
Journal reference: Physical Review Letters
Provided by: Purdue University
(FULL STORY)

A City-Sized 'Telescope' Could Watch Space-Time Ripple 1 Million Times a Year
[4/15/2018]

(FULL STORY)

Physicists Find a Way to See the ‘Grin’ of Quantum Gravity
[3/26/2018]
A recently proposed experiment would confirm that gravity is a quantum force.
(FULL STORY)

How Einstein Lost His Bearings, and With Them, General Relativity
[3/26/2018]
By 1913, Albert Einstein had nearly completed general relativity. But a simple mistake set him on a tortured, two-year reconsideration of his theory. Today, mathematicians still grapple with the issues he confronted.
(FULL STORY)

Quantum vacuum may allow stars to exist in unconventional configurations
[3/21/2018]
A new kind of star emerges from a study by SISSA postdoctoral researcher Raul Carballo-Rubio. In a piece of research recently published in Physical Review Letters, Carballo-Rubio has developed a novel mathematical model that combines general relativity with the repulsive effect of quantum vacuum polarization. The inclusion of this repulsive force makes it possible to describe ultracompact configurations of stars, which scientists previously considered unable to exist in equilibrium.

"As a consequence of the attractive and repulsive forces at play, a massive star can either become a neutron star, or turn into a black hole" says Carballo-Rubio.

In neutron stars, stellar equilibrium is the result of the "fight" between gravity, which is an attractive force, and a repulsive force called degeneracy pressure, of quantum mechanical origin. "But if the star's mass becomes higher than a certain threshold, about 3 times the solar mass, the equilibrium would be broken and the star collapses due to the overwhelming pull of the gravitational force".

In this study, the researcher has investigated the possibility that additional quantum mechanical forces that are largely expected to be present in nature, permit new equilibrium configurations for stars above this threshold. The additional force that has been taken into account is a manifestation of the effect known as "quantum vacuum polarization", which is a robust consequence of mixing gravity and quantum mechanics in a semiclassical framework.

"The novelty in this analysis is that, for the first time, all these ingredients have been assembled together in a fully consistent model. Moreover, it has been shown that there exist new stellar configurations, and that these can be described in a surprisingly simple manner".

There are still several important issues that remain to be studied, including the observational applications of these results. "It is not clear yet whether these configurations can be dynamically realized in astrophysical scenarios, or how long would they last if this is the case".

From an observational perspective, these "semiclassical relativistic stars" would be very similar to black holes. However, even minute differences would be perceptible in the next generation of gravitational wave observatories: "If there are very dense and ultracompact stars in the Universe, similar to black holes but with no horizons, it should be possible to detect them in the next decades".
(FULL STORY)

Friends and colleagues from the University of Cambridge have paid tribute to Professor Stephen Hawking, who died on 14 March 2018 at the age of 76.
[3/20/2018]
Widely regarded as one of the world’s most brilliant minds, he was known throughout the world for his contributions to science, his books, his television appearances, his lectures and through biographical films. He leaves three children and three grandchildren.

Professor Hawking broke new ground on the basic laws which govern the universe, including the revelation that black holes have a temperature and produce radiation, now known as Hawking radiation. At the same time, he also sought to explain many of these complex scientific ideas to a wider audience through popular books, most notably his bestseller A Brief History of Time.

He was awarded the CBE in 1982, was made a Companion of Honour in 1989, and was awarded the US Presidential Medal of Freedom in 2009. He was the recipient of numerous awards, medals and prizes, including the Copley Medal of the Royal Society, the Albert Einstein Award, the Gold Medal of the Royal Astronomical Society, the Fundamental Physics Prize, and the BBVA Foundation Frontiers of Knowledge Award for Basic Sciences. He was a Fellow of The Royal Society, a Member of the Pontifical Academy of Sciences, and a Member of the US National Academy of Sciences.

He achieved all this despite a decades-long battle with motor neurone disease, with which he was diagnosed while a student and which eventually led to him being confined to a wheelchair and to communicating via his instantly recognisable computerised voice. His determination in battling with his condition made him a champion for those with a disability around the world.

Professor Hawking came to Cambridge in 1962 as a PhD student and rose to become the Lucasian Professor of Mathematics, a position once held by Isaac Newton, in 1979. In 2009, he retired from this position and was the Dennis Stanton Avery and Sally Tsui Wong-Avery Director of Research in the Department of Applied Mathematics and Theoretical Physics until his death - he was also a member of the University's Centre for Theoretical Cosmology, which he founded in 2007. He was active scientifically and in the media until the end of his life.

Professor Stephen Toope, Vice-Chancellor of the University of Cambridge, paid tribute, saying, “Professor Hawking was a unique individual who will be remembered with warmth and affection not only in Cambridge but all over the world. His exceptional contributions to scientific knowledge and the popularisation of science and mathematics have left an indelible legacy. His character was an inspiration to millions. He will be much missed.”

Stephen William Hawking was born on January 8, 1942 in Oxford although his family was living in north London at the time. In 1959, the family moved to St Albans where he attended St Albans School. Despite the fact that he was always ranked at the lower end of his class by teachers, his school friends nicknamed him ‘Einstein’ and seemed to have encouraged his interest in science. In his own words, “physics and astronomy offered the hope of understanding where we came from and why we are here. I wanted to fathom the depths of the Universe.”

His ambition brought him a scholarship to University College Oxford to read Natural Science. There he studied physics and graduated with a first class honours degree.

He then moved to Trinity Hall Cambridge and was supervised by Dennis Sciama at the Department of Applied Mathematics and Theoretical Physics for his PhD; his thesis was titled ‘Properties of Expanding Universes.’ In 2017, he made his PhD thesis freely available online via the University of Cambridge’s Open Access repository. There have been over a million attempts to download the thesis, demonstrating the enduring popularity of Professor Hawking and his academic legacy.

On completion of his PhD, he became a research fellow at Gonville and Caius College where he remained a fellow for the rest of his life. During his early years at Cambridge, he was influenced by Roger Penrose and developed the singularity theorems which show that the Universe began with the Big Bang.

An interest in singularities naturally led to an interest in black holes and his subsequent work in this area laid the foundations for the modern understanding of black holes. He proved that when black holes merge, the surface area of the final black hole must exceed the sum of the areas of the initial black holes, and he showed that this places limits on the amount of energy that can be carried away by gravitational waves in such a merger. He found that there were parallels to be drawn between the laws of thermodynamics and the behaviour of black holes. This eventually led, in 1974, to the revelation that black holes have a temperature and produce radiation, now known as Hawking radiation, a discovery which revolutionised theoretical physics.

He also realised that black holes must have an entropy – often described as a measure of how much disorder is present in a given system – equal to one quarter of the area of their event horizon, the ‘point of no return’, where the gravitational pull of a black hole becomes so strong that escape is impossible. Some forty-odd years later, the precise nature of this entropy is still a puzzle. However, these discoveries led to Hawking formulating the ‘information paradox’ which illustrates a fundamental conflict between quantum mechanics and our understanding of gravitational physics. This is probably the greatest mystery facing theoretical physicists today.
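Spelled out, the 'one quarter of the area' statement is the Bekenstein-Hawking entropy formula (a standard result, quoted here with the constants restored rather than taken from the tribute itself):

    S_{\mathrm{BH}} \;=\; \frac{k_B\, c^3 A}{4\, G \hbar} \;=\; \frac{k_B\, A}{4\, \ell_P^{2}}

where A is the horizon area and ℓ_P = √(Għ/c³) is the Planck length, so the entropy counts the horizon area in units of the Planck area.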

To understand black holes and cosmology requires one to develop a theory of quantum gravity. Quantum gravity is an unfinished project which is attempting to unify general relativity, the theory of gravitation and of space and time with the ideas of quantum mechanics. Hawking’s work on black holes started a new chapter in this quest and most of his subsequent achievements centred on these ideas. Hawking recognised that quantum mechanical effects in the very early universe might provide the primordial gravitational seeds around which galaxies and other large-scale structures could later form. This theory of inflationary fluctuations, developed along with others in the early 1980’s, is now supported by strong experimental evidence from the COBE, WMAP and Planck satellite observations of the cosmic microwave sky. Another influential idea was Hawking’s ‘no boundary’ proposal which resulted from the application of quantum mechanics to the entire universe. This idea allows one to explain the creation of the universe in a way that is compatible with laws of physics as we currently understand them.

Professor Hawking’s influential books included The Large Scale Structure of Spacetime, with G F R Ellis; General Relativity: an Einstein centenary survey, with W Israel; Superspace and Supergravity, with M Rocek (1981); The Very Early Universe, with G Gibbons and S Siklos, and 300 Years of Gravitation, with W Israel.

However, it was his popular science books which took Professor Hawking beyond the academic world and made him a household name. The first of these, A Brief History of Time, was published in 1988 and became a surprise bestseller, remaining on the Sunday Times best-seller list for a record-breaking 237 weeks. Later popular books included Black Holes and Baby Universes, The Universe in a Nutshell, A Briefer History of Time, and My Brief History. He also collaborated with his daughter Lucy on a series of books for children about a character named George who has adventures in space.

In 2014, a film of his life, The Theory of Everything, was released. Based on the book by his first wife Jane, the film follows the story of their life together, from first meeting in Cambridge in 1964, with his subsequent academic successes and his increasing disability. The film was met with worldwide acclaim and Eddie Redmayne, who played Stephen Hawking, won the Academy Award for Best Actor at the 2015 ceremony.

Travel was one of Professor Hawking’s pastimes. One of his first adventures was to be caught up in the 7.1 magnitude Bou-in-Zahra earthquake in Iran in 1962. In 1997 he visited the Antarctic. He has plumbed the depths in a submarine and in 2007 he experienced weightlessness during a zero-gravity flight, routine training for astronauts. On his return, he quipped “Space, here I come.”

Writing years later on his website, Professor Hawking said: “I have had motor neurone disease for practically all my adult life. Yet it has not prevented me from having a very attractive family and being successful in my work. I have been lucky that my condition has progressed more slowly than is often the case. But it shows that one need not lose hope.”

At a conference in Cambridge held in celebration of his 75th birthday in 2017, Professor Hawking said “It has been a glorious time to be alive and doing research into theoretical physics. Our picture of the Universe has changed a great deal in the last 50 years, and I’m happy if I’ve made a small contribution.”

And he said he wanted others to feel the passion he has for understanding the universal laws that govern us all. “I want to share my excitement and enthusiasm about this quest. So remember to look up at the stars and not down at your feet. Try to make sense of what you see and wonder about what makes the universe exist. Be curious, and however difficult life may seem, there is always something you can do, and succeed at. It matters that you don’t just give up.”
(FULL STORY)

A possible experiment to prove that gravity and quantum mechanics can be reconciled
[3/12/2018]
Two teams of researchers working independently of one another have come up with an experiment designed to prove that gravity and quantum mechanics can be reconciled. The first team is a pairing of Chiara Marletto of the University of Oxford and Vlatko Vedral of National University of Singapore. The second is an international collaboration. In the papers, both published in Physical Review Letters, the teams describe their experiment and how it might be carried out.

Gravity is a tough nut to crack; there is just no doubt about it. In comparison, the strong, weak and electromagnetic forces are a walk in the park. Scientists still can't explain the nature of gravity, though how it works is rather well understood. The current best theory regarding gravity goes all the way back to Einstein's general theory of relativity, but there has been no way to reconcile it with quantum mechanics. Some physicists suggest gravity could be mediated by a particle called the graviton. But proving that such a particle exists has been frustrating, because its force would be so weak that it would be very nearly impossible to measure. In this new effort, neither team is suggesting that their experiment could reconcile gravity and quantum mechanics. Instead, they are claiming that if such an experiment is successful, it would very nearly prove that it should be possible to do it.
The experiment essentially involves attempting to entangle two particles using their gravitational attraction as a means of confirming quantum gravity. In practice, it would consist of levitating two tiny diamonds a small distance from one another and putting each of them into a superposition of two spin directions. After that, a magnetic field would be applied to separate the spin components. At this point, a test would be made to see whether the two masses had become entangled through their gravitational attraction. If they have, the researchers contend, that will show that gravity is quantum; if they have not, then it will not. The experiment would have to run many times to get an accurate assessment. And while a first look might suggest such an experiment could be conducted very soon, the opposite is actually true. The researchers suggest it will likely be a decade before such an experiment could be carried out, due to the improvements in scale and sensitivity that it requires.
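For a sense of why the experiment is so demanding, here is a rough order-of-magnitude sketch of the gravitationally induced phase the two masses must accumulate; the mass, separation and hold time below are illustrative values of the scale discussed in the Bose et al. proposal, not exact figures from either paper.

    # Order-of-magnitude estimate of the gravitationally induced phase
    G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
    hbar = 1.055e-34    # reduced Planck constant, J s

    m = 1e-14           # mass of each levitated crystal, kg (illustrative)
    d = 200e-6          # separation of the nearest superposition branches, m (illustrative)
    t = 2.5             # time the superposition is held, s (illustrative)

    phase = G * m**2 * t / (hbar * d)   # interaction energy x time / hbar
    print(f"phase = {phase:.2f} rad")   # -> of order one radian under these assumptions

A phase of order one radian is in principle measurable through spin correlations, but only if decoherence is kept under control for the full hold time, which is the kind of sensitivity challenge the researchers cite.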
More information: C. Marletto et al. Gravitationally Induced Entanglement between Two Massive Particles is Sufficient Evidence of Quantum Effects in Gravity, Physical Review Letters (2017). DOI: 10.1103/PhysRevLett.119.240402 , https://arxiv.org/abs/1707.06036
ABSTRACT
All existing quantum-gravity proposals are extremely hard to test in practice. Quantum effects in the gravitational field are exceptionally small, unlike those in the electromagnetic field. The fundamental reason is that the gravitational coupling constant is about 43 orders of magnitude smaller than the fine structure constant, which governs light-matter interactions. For example, detecting gravitons—the hypothetical quanta of the gravitational field predicted by certain quantum-gravity proposals—is deemed to be practically impossible. Here we adopt a radically different, quantum-information-theoretic approach to testing quantum gravity. We propose witnessing quantumlike features in the gravitational field, by probing it with two masses each in a superposition of two locations. First, we prove that any system (e.g., a field) mediating entanglement between two quantum systems must be quantum. This argument is general and does not rely on any specific dynamics. Then, we propose an experiment to detect the entanglement generated between two masses via gravitational interaction. By our argument, the degree of entanglement between the masses is a witness of the field quantization. This experiment does not require any quantum control over gravity. It is also closer to realization than detecting gravitons or detecting quantum gravitational vacuum fluctuations.
Sougato Bose et al. Spin Entanglement Witness for Quantum Gravity, Physical Review Letters (2017). DOI: 10.1103/PhysRevLett.119.240401 , https://arxiv.org/abs/1707.06050
ABSTRACT
Understanding gravity in the framework of quantum mechanics is one of the great challenges in modern physics. However, the lack of empirical evidence has led to a debate on whether gravity is a quantum entity. Despite varied proposed probes for quantum gravity, it is fair to say that there are no feasible ideas yet to test its quantum coherent behavior directly in a laboratory experiment. Here, we introduce an idea for such a test based on the principle that two objects cannot be entangled without a quantum mediator. We show that despite the weakness of gravity, the phase evolution induced by the gravitational interaction of two micron size test masses in adjacent matter-wave interferometers can detectably entangle them even when they are placed far apart enough to keep Casimir-Polder forces at bay. We provide a prescription for witnessing this entanglement, which certifies gravity as a quantum coherent mediator, through simple spin correlation measurements.


(FULL STORY)

SpaceX Aims to Begin BFR Spaceship Flight Tests as Soon as Next Year
[2/7/2018]
Only a few hours after the world beheld the launch of the Falcon Heavy, Elon Musk had already decided the monster rocket was too small.

“I finished looking at the side boosters, and they’re pretty big—you know, 16 stories tall, 60-foot leg span," Musk said at a press conference following the launch. "But really we need to be way bigger than that."

"NOW THAT WE'RE ALMOST DONE WITH FALCON... MOST OF OUR ENGINEERING RESOURCES WILL GO TOWARD BFR."
The Falcon 9 and Falcon Heavy are both closing in on their final designs. The so-called "Block 5" update will be the last major upgrade to the rockets, one that will increase the rockets' thrust and reusability and also allow SpaceX to certify the Falcon 9 to carry crews of astronauts on the Dragon 2 spacecraft. To go "way bigger than that" will require something new.

SpaceX is banking its future on the Big Falcon Rocket, or BFR—although the original name for the rocket is a bit more colorful. And this week's big success was a boon for Musk's ever bigger dreams. “It’s given me a lot of confidence that we can make the BFR design work,” Musk says about the Falcon Heavy launch and landing of the two side boosters. In fact, Musk says SpaceX could begin short flight tests of the second stage of BFR next year.

The Big Falcon Rocket will consist of a massive first-stage booster with 31 Raptor engines—a new rocket engine SpaceX began test-firing in September 2016. The second stage, also known as the Interplanetary Transport System, is a 48-meter long, 9-meter diameter spaceship that, on paper, could carry up to 100 people on flights to other planets. It's a rocket that could fulfill Musk's ultimate goal: colonizing Mars.


“I think we might be able to do short hopper flights with the spaceship part of BFR, maybe next year,” says Musk. "By hopper tests, I mean kind of like the beginning of the Grasshopper program for Falcon 9... it will go up several miles and come down."

The Grasshopper was a small experimental rocket SpaceX built to test vertical takeoffs and landings, a program that paved the way for Musk's company to land full-scale boosters like it did with two of the three from this week's Falcon Heavy test flight. The first flights of the BFR spaceship will be similar tests, and Musk said those flights probably would take place at a new commercial spaceport that SpaceX is building on the beaches of south Texas, just outside of Brownsville. However, Musk said it is possible the BFR hopper tests are conducted "ship to ship," potentially using two drone ships and flying the spaceship from one to the other.

Musk said the tests need to take place somewhere remote, "so if it blows up it's cool." He also said the spaceship should be capable of flying to Earth orbit itself, a requirement for the long-term plan of having the Interplanetary Transport System fly back to Earth from the moon or Mars.


Raptor engine test fire, April 12, 2017. Credit: SpaceX
These initial tests need to make sure the spaceship can hold up to intense reentry heat. "The ship part is by far the hardest. That's gonna come in from super-orbital velocities," Musk says, meaning the spaceship will be returning at incredibly high speeds from beyond Earth's orbit. He explained that some of the heating will scale with the eighth power of velocity.
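Taking that quoted eighth-power scaling at face value, a short illustration of why super-orbital return is so much harsher (the two velocities are generic textbook values for low-Earth-orbit and lunar-return entries, not numbers from the article):

    # Illustrative only: reentry heating at LEO vs lunar-return speed,
    # assuming the eighth-power-of-velocity scaling quoted above.
    v_leo   = 7.8    # km/s, typical low-Earth-orbit reentry speed (assumed)
    v_lunar = 11.0   # km/s, typical return speed from the Moon (assumed)

    print(f"heating ratio = {(v_lunar / v_leo) ** 8:.0f}x")   # -> roughly 16x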

"[We] really want to test the heat shield material, so we're going to fly up, turn around, [and] accelerate back real hard."

In addition to the spaceport in Brownsville, SpaceX is considering expanding its facilities in San Pedro, California, to accelerate work on the interplanetary spaceship, as reported by Wired. Musk hopes the Big Falcon Rocket will ultimately be capable enough to replace Falcon 9 and Falcon Heavy.

"Now that we're almost done with Falcon 9 and Falcon Heavy... most of our engineering resources will be dedicated to BFR," he says. "I think it's conceivable that we do our first test flight in three or four years—full on orbital test flight of the new booster... going to the moon shortly thereafter."


That timeframe is ambitious to say the least. Falcon Heavy was announced in 2011 and took seven years to realize. The BFR is even bigger and more complex, designed to fly with four more engines than Falcon Heavy, and those engines are still in development. If SpaceX is going to fly the BFR booster and spaceship by the early 2020s, then getting started on flight tests for the Interplanetary Transport System is absolutely crucial.

"Testing the ship out is the whole tricky part," says Musk. "The booster I think—I don't want to get complacent—but I think we understand reusable boosters. Reusable spaceships that land propulsively, that's harder."


BFR is the most arduous spaceflight project SpaceX has ever attempted—an enormous booster with a second-stage spaceship that can fly a hundred people to the moon or Mars, launch commercial payloads, and even fly passengers from a city on Earth through space to land in another city as an alternative to airliners. SpaceX is going to learn a lot building BFR, probably producing some epic explosions along the way.

Whether BFR will fly in its complete configuration within three or four years is anyone's guess at this point. It's safe to say there will be unforeseen challenges and likely delays, but after witnessing Falcon Heavy, it seems that if anyone can pull off a large reusable interplanetary spaceship, it's SpaceX.
(FULL STORY)

Galaxies Rotate in Sync, Raising Dark Matter Questions
[2/6/2018]
The universe is filled with galaxies, and often a large galaxy like our own will have several smaller ones orbiting it. Astronomers looked at one particular group of galaxies and noticed their circling was a bit too orderly for current models to explain.

Most scientists' understanding of our universe includes a substance called dark matter, which accounts for 80 percent of the matter in the universe. Dark matter was first hypothesized in order to account for the rotation of galaxies, which didn't seem to have enough conventional matter to keep them from flying apart like a smoothie in a lidless blender. Dark matter provides the extra stuff needed to keep galaxies together and was likely involved in galaxy formation. Dark matter appears to be cobwebbed across the universe. Scientists suspect that dwarf galaxies form along these dark matter threads and converge where they meet, merging into larger galaxies.
Under this framework, satellite galaxies should be distributed randomly around their host, following elongated orbits in arbitrary directions. This assumption was challenged when scientists found that satellites of our own Milky Way and the Andromeda galaxy, our closest major neighbor, don't follow this prediction. The small companion galaxies appear to rotate in sync with each other, following fairly circular paths in a disk-shaped plane around their host galaxy.

"These two distributions could not be more different," Stacy McGaugh, an expert in cosmologic modeling at Case Western Reserve University in Ohio, told Space.com.

Now scientists have found a third example of a highly ordered satellite system. In 2015, a group of astronomers found that most of the dwarf galaxies orbiting the large galaxy Centaurus A did so in a plane perpendicular to the galaxy's disk. After hearing of this, a team of astronomers led by the University of Basel looked at the individual satellite galaxies circling Centaurus A, which is the richest assembly of galaxies within 30 million light-years of the Milky Way, according to the study. By tracking the positions and velocities of the satellites, the team discovered that 14 of the 16 companions orbit Centaurus A in the same direction. The latest findings were detailed Feb. 1 in the journal Science.

"A statistical outlier will happen once or twice," the University of Basel's Oliver Müller, lead author on the new work, told Space.com. "So we would expect that we find stuff by pure chance such as this. But if we find three of those systems close to each other, and every galaxy group has a .1 percent chance to exist [in such a well-ordered state] … then what is the probability of that?"

McGaugh, who was not involved with the study, agrees that the observation is a cause for concern for the current model. "It's not just a quibble," he said. "It's the third of three, [and] we haven't seen one that behaves right."

When it comes to satellite galaxies, "you can drop them in from afar or spin them out," McGaugh said. One way to produce these organized systems in the current model is to assume that the dwarf galaxies all formed elsewhere in space and fell into orbit around the host galaxy at the same time. This is, however, unlikely, McGaugh said.

Alternatively, they might have formed more recently, from interactions between nearby galaxies tugging on each other like the moon tugs on the Earth, raising the ocean to create tides. If this were the case, material might swirl off a galaxy, coalesce into a dwarf galaxy and begin to orbit its host. These tidal dwarf galaxies would naturally orbit in the plane of interaction between the two larger galaxies, and would likely circle in the same direction, Müller said.

Unfortunately, such a scenario is highly unlikely in the prevailing model, McGaugh said. It is likely under a competing model of the universe, but this rival has other drawbacks. "Sometimes, the best current answer is, 'We don't know,'" McGaugh said.

Müller hopes the observations from Centaurus A broaden the conversation about proposals like this one. "It would be really cool to see all the different explanations of such structures, how they can be formed," he said.

(FULL STORY)

Does general relativity violate determinism inside charged black holes?
[2/2/2018]
Under certain extreme conditions Einstein's general theory of relativity seems to violate determinism, according to an international team of physicists. The group has shown that in a universe expanding under the influence of the cosmological constant, black holes generated by the collapse of highly charged stars should contain a region where physical conditions are not fixed by the stars' initial state. At odds with a 40-year-old idea known as cosmic censorship, the researchers say that signs of this indeterminism might show up in detections of gravitational waves.
Newton’s mechanics allow us in principle to calculate the exact state of a physical system at any point in the future, provided that we know its initial state perfectly. So too with general relativity: a precise knowledge of space’s geometry and its rate of change in the present enables us in theory to predict exactly how space-time will evolve. As such, Einstein’s theory is considered by most physicists to be entirely deterministic.
Charged black holes, however, challenge this deterministic picture. The “Reissner-Nordström” solution of general relativity describes a black hole created when a star that is electrically charged and spherical collapses in on itself under the force of gravity. Hidden from view inside such a black hole’s event horizon lies a second boundary known as the Cauchy horizon, beyond which space-time is smooth but indeterminate. In other words, the future can no longer be predicted.
Strong cosmic censorship
An idea put forward by British physicist Roger Penrose in the 1970s had appeared to forbid such non-deterministic behaviour. His "strong cosmic censorship" conjecture states that there is some mechanism within general relativity – a censor – that prohibits the appearance of Cauchy horizons. In the case of a charged black hole, he calculated that even the slightest perturbation in the initial conditions of the imploding star destroys the Cauchy horizon and yields a singularity in its place. At this point of infinite space-time curvature the relativistic field equations break down and determinism, or its absence, ceases to be an issue.
But in the new work, described in Physical Review Letters, Lisbon University’s Vitor Cardoso and colleagues find that there should be some circumstances when the singularity imposed by cosmic censorship does not form. They considered the net effect of two opposing influences on the Cauchy horizon – the amplification of any tiny perturbation by the immense gravity of a black hole on the one hand, and the damping effect of the black hole’s external environment on the other. Specifically, they worked out what would happen for a highly charged star collapsing in a universe whose expansion is being accelerated by a cosmological constant – as ours appears to be.
Quasinormal modes
To do so, Cardoso and team studied damped oscillations known as quasinormal modes. They have shown that when the collapsing star has enough charge, damping wins out over amplification and the oscillations die away quickly. As Cardoso explains, the charge and cosmological constant essentially provide repulsive forces that counteract the pull of gravity and so diminish its amplifying effects. The upshot, he and his colleagues say, is that the Cauchy horizon is damaged but not completely destroyed. As such, they conclude, there is indeed a region within the black hole where the relativistic field equations work but where determinism breaks down.
Group member João Costa says that this would be a more fundamental breakdown of determinism than that inherent to quantum phenomena. As he points out, although we can’t predict the outcome of any particular quantum measurement we can still work out the probability distribution for an ensemble of measurements. But, he says, beyond the Cauchy horizon such overall predictability would be impossible.
Falling Schrödinger’s cat
“Thinking about Schrödinger’s cat, we know we can assign probabilities to the cat being alive and dead,” says Cardoso. “But if the cat were to fall inside the Cauchy horizon we could not even compute these probabilities.” Although, he adds, “for the cat that is probably irrelevant, since it would be dead anyway.”
As Cardoso and colleagues point out in their paper, charged black holes are not expected to exist in nature. But they say that a close analogy between charge and angular momentum means they expect “very similar results” for neutral, rotating black holes. “Given the non-zero cosmological constant and the existence of rapidly rotating black holes in our universe,” they write, “these results cannot be taken lightly.”
Gary Horowitz of the University of California, Santa Barbara, who was not involved in the work, says the research provides "the best evidence I know for a violation of strong cosmic censorship in a theory of gravity and electromagnetism." The next step, he adds, would be to fully model gravitational and charge effects, since the present analysis relies on simplified massless scalar fields.
Unavoidable presence
In fact, in a paper recently posted to the arXiv server, Shahar Hod of the Ruppin Academic Center and the Hadassah Academic College in Israel claims to have done something similar. Hod has found that charged fields close to Reissner-Nordström black holes decay slowly enough to guarantee unstable Cauchy horizons. Given “the unavoidable presence” of these fields – since the collapsing stars themselves are charged – he concludes that such black holes must respect strong cosmic censorship.
Cardoso’s colleague Aron Jansen describes Hod’s counter-proposal as “an intriguing possible resolution” to the dispute but says more work needs to be done. One complicating factor, he says, is that actual charged matter would consist of fermions not the scalar particles investigated by Hod.
It is possible, adds Cardoso, that the issue could be settled by observing gravitational waves. He says that the idea of indeterminism would be bolstered by the existence of black holes that either have lots of charge or spin very quickly. Alternatively, he speculates, the fading of gravitational wave signals – due to the presence of a black hole’s event horizon – might be modulated by the presence of a Cauchy horizon. He points out, however, that such a signal might also mean dark energy cannot be explained in terms of the cosmological constant.
(FULL STORY)

Speed of universe’s expansion remains elusive
[1/17/2018]
Discrepancy between measures of Hubble constant suggests influence of some astronomical unknown
(FULL STORY)

Top 7 Breakthroughs of 2017 That Prove We’re Living in the Future
[12/21/2017]
2017 has been a year of extraordinary breakthroughs in science and technology. Before it comes to an end, let's remind ourselves of the advances that will reshape humanity's future.
(FULL STORY)

PHYSICS BREAKTHROUGH: NEW FORM OF MATTER, EXCITONIUM, FINALLY PROVED TO EXIST AFTER 50-YEAR SEARCH
[12/9/2017]
After 50 years of theories and thwarted attempts, scientists have finally proved the existence of a new form of matter. The never-before-detected condensate is called excitonium, a name first coined in the 1960s by Harvard theoretical physicist Bert Halperin, who is now 76. Peter Abbamonte, the physicist responsible for the discovery, recently saw him at a party; Halperin was, apparently, excited.

“It’s as close to ‘proved’ as you’re ever going to get in science,” Abbamonte, a physics professor at the University of Illinois at Urbana-Champaign, told Newsweek. “You can never really ‘prove’ anything, but, well, people find it convincing.”

Professor of physics Peter Abbamonte (center) works with graduate students Anshul Kogar (right) and Mindy Rak (left) in his laboratory at the Frederick Seitz Materials Research Laboratory.
L. BRIAN STAUFFER, UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN


Excitonium is a condensate, meaning what the researchers detected was a solid. Excitonium is made up of particles called excitons, in the same way that, say, solid aluminum is made up of aluminum particles. The exciton particles themselves, though, aren’t created through quite as intuitive a process.

Let’s start with something a little more conventional to compare to, like hydrogen. Hydrogen particles are made up of an electron and a proton. Exciton particles, then, are made up of an electron that’s escaped and the negative space it left behind when it did so. The hole actually acts like a particle, attracting the escaped electron and bonding with it; they orbit each other the same way an electron and a proton would.

Scientists had long suspected that excitonium existed, but they never had a good enough way of proving it. What Abbamonte and his colleagues did was invent an electron-scattering technique to detect the exciton particles' final result, excitonium. They started with a clean surface of the material in a vacuum—no air or anything else—and then scattered the electrons from its surface to make waves, like hitting the middle of a trampoline.

The particular way the waves spread allowed them to detect those escaped electrons in their final form, excitonium. It's not unlike the way the fabled Higgs boson was detected. They call the technique momentum-resolved electron energy-loss spectroscopy, or M-EELS. A paper explaining the discovery was published in the journal Science.

Artist's depiction of the collective excitons of an excitonic solid. These excitations can be thought of as propagating domain walls (yellow) in an otherwise ordered solid exciton background (blue).
PETER ABBAMONTE, U. OF I. DEPARTMENT OF PHYSICS AND FREDERICK SEITZ MATERIALS RESEARCH LABORATORY

Abbamonte and his colleagues started working on their scattering technique about seven years ago, but they weren’t designing it to detect excitonium. They initially wanted to study high-temperature superconductors. In early 2015, through “total serendipity” as Abbamonte put it, they realized their work had the potential to prove the existence of a whole new kind of matter.

Excitonium is such uncharted territory that scientists don’t yet know what its properties are. Some, said Abbamonte, think it will be an insulator, meaning it can’t carry any energy or momentum. Others think it will be a superfluid, meaning it can carry both energy and momentum with no dissipation—so, the exact opposite.

If it does turn out to be a superfluid, it could be used to conduct electricity and energy. The next step, according to Abbamonte, is to figure out what exactly excitonium does, but for now it’s too soon to speculate about its applications.

“The most important thing is that it exists,” Abbamonte said. “It’s one of those things that just ought to be there, you know? And it didn’t make sense that it wasn’t."
(FULL STORY)

In Just 4 Hours, Google’s AI Mastered All The Chess Knowledge in History
[12/7/2017]
Chess isn’t an easy game, by human standards. But for an artificial intelligence powered by a formidable, almost alien mindset, the trivial diversion can be mastered in a few spare hours.

In a new paper, Google researchers detail how their latest AI evolution, AlphaZero, developed "superhuman performance" in chess, taking just four hours of self-training, starting from nothing but the rules, before obliterating the world champion chess program, Stockfish.

In other words, all of humanity’s chess knowledge – and beyond – was absorbed and surpassed by an AI in about as long as it takes to drive from New York City to Washington, DC.

After being programmed with only the rules of chess (no strategies), in just four hours AlphaZero had mastered the game to the extent it was able to best the highest-rated chess-playing program Stockfish.

In a series of 100 games against Stockfish, AlphaZero won 25 games while playing as white (with first mover advantage), and picked up three games playing as black. The rest of the contests were draws, with Stockfish recording no wins and AlphaZero no losses.


“We now know who our new overlord is,” said chess researcher David Kramaley, the CEO of chess science website Chessable.

“It will no doubt revolutionise the game, but think about how this could be applied outside chess. This algorithm could run cities, continents, universes.”


Developed by Google’s DeepMind AI lab, AlphaZero is a tweaked, more generic version of AlphaGo Zero, which specialises in playing the Chinese board game, Go.

DeepMind has been refining this AI for years, in the process besting a series of human champions who fell like dominoes before the indomitable, “Godlike” neural network.

That victory streak culminated in a startling success in October, in which a new fully autonomous version of the AI – which only learns by playing itself, never facing humans – bested all its former incarnations.

By contrast, AlphaGo Zero’s predecessors partly learned how to play the game by watching moves made by human players.

That effort was intended to assist the fledgling AI in learning strategy, but it seems it may have actually been a handicap, since AlphaGo Zero’s fully self-reliant learning proved devastatingly more effective in one-on-one competition.

“It’s like an alien civilisation inventing its own mathematics,” computer scientist Nick Hynes from MIT told Gizmodo in October.


“What we’re seeing here is a model free from human bias and presuppositions. It can learn whatever it determines is optimal, which may indeed be more nuanced than our own conceptions of the same.”

But things are moving so fast in this field that already the October accomplishment may have been outmoded.

In their new paper, the team outlines how the very latest AlphaZero AI takes the same self-play approach – a technique called reinforcement learning – and applies it in a much more generalised way, giving it a broader focus for problem solving.

That broader focus means AlphaZero doesn't just play chess. It also plays shogi (Japanese chess) and Go – and, perhaps unsurprisingly, it took only two and eight hours respectively to master those games as well.
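
To make the self-play idea concrete, here is a minimal toy sketch in Python. The game (a "take 1 or 2 stones" variant of Nim), the tabular value update and every parameter are assumptions chosen for brevity; AlphaZero itself uses deep neural networks and Monte Carlo tree search rather than anything this simple.

```python
# Illustrative toy only: learning a game purely from self-play, in the spirit of
# (but vastly simpler than) the reinforcement-learning scheme described above.
import random

N_START = 12                 # stones at the start of the toy "take 1 or 2" game
EPISODES = 20_000            # number of self-play games
ALPHA, EPSILON = 0.1, 0.1    # learning rate and exploration rate (assumed values)

# value[n] = estimated chance that the player to move wins with n stones left
value = {n: 0.5 for n in range(N_START + 1)}
value[0] = 0.0               # no stones left: the player to move has already lost

def pick_move(n):
    """Epsilon-greedy move selection from the current value table."""
    moves = [m for m in (1, 2) if m <= n]
    if random.random() < EPSILON:
        return random.choice(moves)
    # A good move leaves the opponent in a low-value state.
    return min(moves, key=lambda m: value[n - m])

for _ in range(EPISODES):
    n = N_START
    while n > 0:
        m = pick_move(n)
        # The mover's winning chance is one minus the opponent's from the next state.
        target = 1.0 - value[n - m]
        value[n] += ALPHA * (target - value[n])
        n -= m

# After training, positions where n is a multiple of 3 should look close to lost
# (value near 0) for the player to move, and every other position close to won.
for n in range(1, N_START + 1):
    print(n, round(value[n], 2))
```

The point of the sketch is only that the program never sees an expert game: it improves by playing against its own current estimates, which is the core of the self-play approach the article describes.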

For now, Google and DeepMind’s computer scientists aren’t commenting publicly on the new research, which hasn’t as yet been peer-reviewed.

But from what we can tell so far, this algorithm’s dizzying ascent to the pinnacle of artificial intelligence is far from over, and even chess grandmasters are bewildered by the spectacle before them.

“I always wondered how it would be if a superior species landed on Earth and showed us how they played chess,” grandmaster Peter Heine Nielsen told the BBC.

“Now I know.”

The findings are available at preprint website arXiv.
(FULL STORY)

Farthest monster black hole found
[12/6/2017]
Astronomers have discovered the most distant "supermassive" black hole known to science.
(FULL STORY)

A Swarm Intelligence Correctly Predicted TIME’s Person of the Year
[12/6/2017]
For the second year in a row, Unanimous AI's artificial swarm intelligence has predicted TIME's Person of the Year. With a 100 percent success rate so far, swarm AI shows that it's capable of taking in human insight in real time to reach an optimized choice.
(FULL STORY)

Mathematicians Awarded $3 Million for Cracking Century-Old Problem
[12/5/2017]
Two mathematicians have each earned the (massive but countable) sum of $3 million for a proof that could one day help scientists understand extra dimensions.

Christopher Hacon, a mathematician at the University of Utah, and James McKernan, a mathematician at the University of California, San Diego, won this year's Breakthrough Prize in Mathematics for proving a long-standing conjecture about how many types of solutions a polynomial equation can have. Polynomial equations are mainstays of high-school algebra — expressions like x^2 + 5x + 6 = 1 — in which variables are raised to whole-number exponents and added, subtracted and multiplied. The mathematicians showed that even very complicated polynomials have just a finite number of solution shapes. [Images: The World's Most Beautiful Equations]

The Breakthrough Prize, which is the largest individual monetary prize given in the sciences, is sponsored by Sergey Brin, co-founder of Google; Facebook founder Mark Zuckerberg; Chan Zuckerberg Initiative co-founder Priscilla Chan; Anne Wojcicki, the founder of 23andme; and tech entrepreneurs Yuri and Julia Milner and Pony Ma. The awards go to researchers in the fields of life sciences, fundamental physics and mathematics. This year's winners received a total of $22 million in prize money.


Simple question, hard answer

Like many of the most important math conjectures, the basic question that Hacon and McKernan cracked can be understood by anyone who studied quadratic equations in 10th-grade algebra. But the solution, a devilishly technical math proof that spans hundreds of pages of computer-like text, is comprehensible only to a tiny circle of experts around the world, Hacon said.

The basic question is: Given a certain type of polynomial equation — for instance, x^2 + y^2 = r^2 (where x and y are the variables) — how many different shapes of solutions exist?

Polynomials of different types represent different shapes: for instance, the equation above defines a circle, whereas other well-known classes of polynomials define spheres, donuts or football shapes. The more variables, the more dimensions the polynomial describes, and the more possible shapes the solutions may take.
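
For concreteness, these are standard textbook examples of the polynomial equations being described (they are illustrative and not taken from the prize-winning work; R and r denote the usual major and minor radii of the torus):

```latex
\begin{aligned}
x^2 + y^2 &= r^2 && \text{a circle in the plane} \\
x^2 + y^2 + z^2 &= r^2 && \text{a sphere in three dimensions} \\
\left(x^2 + y^2 + z^2 + R^2 - r^2\right)^2 &= 4R^2\left(x^2 + y^2\right) && \text{a torus (a ``donut'')}
\end{aligned}
```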

For decades, mathematicians have had an inkling that polynomials with many dimensions still had a finite number of solution shapes. But proving that idea, called the "minimal model program in all dimensions," had eluded the brightest minds in the field.

The new proof shows that this mathematical intuition is indeed correct, at least for a certain class of shapes (those, such as a donut, that have at least one hole).

To solve this proof, the researchers used a highly technical "lemma," or an argument based on a much less interesting problem. When they realized that this lemma could crack the longstanding minimal model problem wide open, their discovery came "surprisingly quick" — in just a few years, Hacon said. Interestingly, the new proof doesn't reveal how many types of solutions to a polynomial of given dimension exist or even what those solutions might look like; it only reveals that the number of possible shapes the solution takes isn't infinite.

Window into extra dimensions

Right now, Hacon and McKernan's proof has absolutely no practical application. But ultimately, it could provide a theoretical window into extra dimensions, Hacon said.

"There's this string theory that suggests there should be an extra sixth dimension of the universe that we can't perceive," Hacon told Live Science. So one question researchers have asked is, "How may possible shapes can these extra six dimensions have and how do those shapes affect the universe we see?" (The newest proof only applies to shapes with holes, while popular string theories imagine rolled-up dimensions with no holes, but future work could wind up being more directly applicable, Hacon said.)

How exactly do you visualize a six-dimensional solution in a 3D world?

"You cheat," Hacon said. "You've seen abstract paintings, Picasso and whatnot. The drawing is nothing like a real person but nevertheless you can recognize the main features and it does convey something to you."

In the same way, a six-dimensional space can't be truly depicted on a 2D piece of paper, but its essence can be captured using mathematical tools, Hacon said.

Originally published on Live Science.
(FULL STORY)

Scientists Experimentally Demonstrate the “Reversal of the Arrow of Time”
[12/4/2017]
Through a remarkable experiment, an international team of scientists has found that it is possible to reverse the arrow of time without violating the second law of thermodynamics. Their research confirms that we still have much to learn about the world around us.
(FULL STORY)

Physicists Take Steps Towards Measuring Unmeasurable Berry Curvature
[12/4/2017]
Berry curvature, a property of quantum mechanics, has never been directly observed. Thanks to new experimentation, observation may one day be possible.
(FULL STORY)

'Holy Grail' Hadron: Scientists Are Close to Detecting the Elusive Tetraquark Particle
[11/9/2017]
Flit, zip, jitter, boom. Quarks, the tiny particles that make up everything tangible in the universe, remain deeply mysterious to physicists even 53 years after scientists first began to suspect these particles exist. They bop around at the edge of scientific instruments' sensitivities, are squirreled away inside larger particles, and decay from their higher forms into their simplest in half the time it takes a beam of light to cross a grain of salt. The little buggers don't give up their secrets easily.

That's why it took more than five decades for physicists to confirm the existence of an exotic particle they've been hunting since the beginning of quark science: the massive (at least in subatomic particle terms), elusive tetraquark.

Physicists Marek Karliner of Tel Aviv University and Jonathan Rosner of the University of Chicago have confirmed that the strange, massive tetraquark can exist in its purest, truest form: four particles, all interacting with one another inside a single, larger particle, with no barriers keeping them apart. It's stable, they found, and can likely be generated at the Large Hadron Collider, a particle smasher at the CERN particle physics laboratory in Switzerland, they report in a paper to be published in a forthcoming issue of the journal Physical Review Letters. [Beyond Higgs: 5 Elusive Particles That May Lurk in the Universe]


Hold up — what the quark is a quark?

If you know a little about particle physics, you probably know that everything with mass is made up of atoms. Diving a little deeper into particle physics would reveal that those atoms are made up of subatomic particles — protons, neutrons and electrons. An even deeper look would reveal quarks.

Neutrons and protons are the most common examples of a class of particles known as hadrons. If you could peer into a hadron, you'd find it's made up of even more basic particles, clinging tightly together. Those are quarks.

A diagram shows how quarks usually fit into our understanding of tiny particles.
Credit: udaix/Shutterstock
Like atoms, which adopt different properties depending on the combinations of protons and neutrons in their nuclei, hadrons derive their properties from combinations of their resident quarks. A proton? That's two "up" quarks and one "down" quark. Neutrons? Those are made up of two "down" quarks and one "up" quark. [Wacky Physics: The Coolest Little Particles in Nature]

(Electrons aren't made up of quarks because they aren't hadrons — they're leptons, part of a class of distant cousins of quarks.)

"Up" and "down" are the most common flavors of quark, but they're just two out of six. The other four — "charm," "top," "strange" and "bottom" quarks — existed in the moments after the Big Bang, and they appear in extreme situations, such as during high-velocity collisions in particle colliders. But they're much heavier than up and down quarks, and they tend to decay into their lighter siblings within moments of their creation.

But those heavier quarks can last long enough to bind together into strange hadrons with unusual properties that are stable for the very short lifetimes of the quarks zipping around inside them. Some good examples: the "doubly charmed baryon," or a hadron made up of two charm quarks and a lighter quark; and its cousin, formed when a hadron made up of two bulky bottom quarks and one lighter quark fuse together in a flash more powerful than the individual fusion reactions inside hydrogen bombs. (Of note, the bottom quark fusion is militarily useless thanks to heavy quarks' short lifetimes.)

Playing with colors

"The suspicion had been for many years that [the tetraquark] is impossible," Karliner told Live Science.

That's because physical laws suggested four quarks couldn't actually bind together into a stable hadron. Here's why: Just like in atoms, where the attraction between positively charged protons and negatively charged electrons is what holds them together, hadrons are held together by forces as well. In atoms, positive and negative particles constantly try to neutralize their charges to zero, so protons and electrons stick together, canceling each other out. [7 Strange Facts About Quarks]

Quarks have positive and negative electrodynamic charges, but they also interact with one another via the much more powerful "strong" force. And the strong force also has charges, called color charges: red, green and blue.

Any quark can have any color charge. And when they bind together to form hadrons, all those charges have to cancel out. So a red quark, for example, has to hook up with either a green quark and a blue quark, or its antimatter twin — an "antiquark" with a color charge of "antired." (This is your brain on quantum mechanics.) Any combination of a color and its anticolor, or all three colors, sticking together has a neutral color charge. Physicists call these particles "white."

The tetraquark: It's like a relationship (in that it doesn't always work)

So, Karliner said, it's not hard to imagine a four-quark hadron: Just stick two quarks to two matching antiquarks. But just because you stick four matching quarks together, he said, doesn't mean they'll be stable enough to form an actual hadron — they could fly apart.

"Just because you move two men and two women into an apartment," Karliner said, "doesn't mean they'll settle down and form a nuclear family."

Quarks have mass, which physicists measure in units of energy: megaelectron volts, or MeV. When they bind together, some of that mass converts into the binding energy holding them together, also measured in MeV. (Remember Einstein's E=mc^2? That's energy equals mass-times-the-speed-of-light-squared, the equation governing that conversion.)

If the mass is too high compared with the binding force, the energy of the quarks careening around inside the hadron will tear the particle apart. If it's low enough, the particle will live long enough for the quarks to settle down and develop group properties before they decay. A big, happy quark-foursome family needs to have a mass lower than two mesons (or quark-antiquark pairs) stuck together, according to Karliner.
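
As a rough illustration of that bookkeeping, here is a minimal sketch in Python. The stability test simply compares masses as described above; the numbers are hypothetical placeholders, not values from Karliner and Rosner's paper.

```python
# Toy stability check: a tetraquark can only hold together if its mass sits
# below the combined mass of the lightest pair of mesons it could split into.
def is_stable(tetraquark_mass_mev, meson_a_mev, meson_b_mev):
    return tetraquark_mass_mev < meson_a_mev + meson_b_mev

# Hypothetical example masses in MeV (placeholders, not the paper's numbers):
# a candidate 40 MeV below the two-meson threshold holds together, while one
# 40 MeV above it simply flies apart into the two mesons.
print(is_stable(10_560, 5_280, 5_320))   # True  -> below threshold, stable
print(is_stable(10_640, 5_280, 5_320))   # False -> above threshold, falls apart
```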

Unfortunately, the mass of a quark family after some of its bulk is converted into binding force is incredibly difficult to calculate, which makes it hard to figure out whether a given theoretical particle is stable.

Scientists have known for about a decade that mesons can bind to other mesons to form ad-hoc tetraquarks, which is why you might have seen reports touting the existence of tetraquarks before. But in those tetraquarks, each quark interacts primarily with its pair. In a true tetraquark, all four would mix with one another equally.

"It's charming and interesting, but not the same," Karliner said. "It's very different to have two couples in different rooms sharing an apartment, and two men and two women all together with everyone … interacting with everyone else."

But those double-meson tetraquarks provide the mass threshold that true tetraquarks must cross to be stable, he said.

A needle in a haystack of haystacks

In theory, Karliner said, it would be possible to predict the existence of a stable tetraquark from pure calculation. But the quantum mechanics involved were just too difficult to make work with any reasonable degree of confidence.

Karliner and Rosner's key insight was that you could start to figure out the mass and binding energy of rare hadrons by analogy to more common hadrons that had already been measured.

Remember that doubly charmed baryon from earlier? And its explosive cousin with the two bottom quarks? In 2013, Karliner and Rosner began to suspect they could calculate its mass, after thinking carefully about the binding energy inside mesons made up of charm quarks and anticharm quarks.

Quantum mechanics suggests that two different-colored charm quarks — say, a red charm and a green charm — should bind together with exactly half the energy of a charm quark and its antimatter twin — say, a red charm quark and an antired charm antiquark. And scientists have already measured the energy of that bond, so the energy of a charm-charm bond should be half of that.

So Karliner and Rosner worked with those numbers, and they found that the doubly charmed baryon should have a mass of 3,627 MeV, plus or minus 12 MeV. They published their papers and pushed the experimentalists at CERN (European Organization for Nuclear Research) to start hunting, Karliner said.

The LHCb detector at CERN.
Credit: CERN


"The experimentalists were quite skeptical at first" that it would be possible to find the doubly charmed baryons in the real world, Karliner said. "It's like looking for a needle not in a haystack, but in a haystack of haystacks."

But Karliner and Rosner offered CERN a road map, and eventually, the CERN scientists acceded. In July 2017, the first definite doubly charmed baryons turned up in the Large Hadron Collider (LHC). [Photos: The World's Largest Atom Smasher (LHC)]

"We predicted in 2014 that the mass of this doubly charmed baryon was going to be 3,627 MeV, give or take 12 MeV," Karliner said. "The LHC measured 3,621 MeV, give or take 1 MeV."

In other words, they nailed it.

And because their calculation turned out to be correct, Karliner and Rosner had a road map to the true stable tetraquark.

One big, fat, happy family

In quantum mechanics, Karliner explained, there's a general rule that heavier quarks tend to bind much more tightly to each other than lighter quarks do. So if you're going to find a stable tetraquark, it's probably going to involve some quarks from the heavier end of the flavor spectrum.

Karliner and Rosner got to work as soon as the doubly charmed baryon measurement was announced. First, they calculated the mass of a tetraquark made up of two charm quarks and two lighter antiquarks; charm quarks, after all, are pretty chunky, at about 1.5 times the mass of a proton. The result? A doubly-charmed tetraquark turns out to be right on the edge of stable and unstable, with room for error on both sides — in other words, too uncertain to call a discovery.

But charm quarks aren't the heaviest quarks around. Enter the bottom quark, a true monster of an elementary particle at about 3.5 times the mass of its charmed sibling, with an accompanying leap in binding energy.

Fuse two of those together, Karliner and Rosner calculated, along with an up antiquark and a down antiquark, and you'll end up with a stable foursome — converting so much of their bulk into binding energy that they end up 215 MeV under the maximum mass threshold, with a margin of error of just 12 MeV.

"The upshot of all this is that we now have a robust prediction for the mass of this object which had been the holy grail of this branch of theoretical physics," Karliner said.

This kind of tetraquark won't live very long once it's created; it winks out after just one-tenth of a picosecond, or the length of time it takes a beam of light to cross a single microscopic skin cell. It then will decay into simpler combinations of up and down quarks. But that 0.1 picoseconds (one ten-trillionth of a second) is plenty long enough on the quantum mechanical scale to be considered a stable particle.
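
A quick back-of-the-envelope check of that comparison, using rounded constants:

```python
# How far light travels during the quoted tetraquark lifetime of 0.1 picosecond.
c = 3.0e8                # speed of light in m/s (rounded)
lifetime_s = 0.1e-12     # one-tenth of a picosecond, in seconds

distance_um = c * lifetime_s * 1e6
print(distance_um, "micrometres")  # ~30 micrometres, roughly the width of a skin cell
```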

"It's like if you compared a human lifetime to [the movement of continents]," Karliner said. "If you have some creatures living on the scale of fractions of seconds, a human lifetime would seem almost infinite."

Onward to Switzerland

The next step, once a particle has been predicted by theorists, is for the experimentalists at CERN to try to create it in the miles-long tubes of their particle smasher, the LHC.

That can be a grueling process, especially because of the specific properties of bottom quarks.

The LHC works by slamming protons together at large fractions of the speed of light, releasing enough energy into the collider that some of it turns back into mass. And some tiny fraction of that mass will condense into rare forms of matter — like that doubly charmed baryon.

But the heavier a particle is, the lower the odds it will pop into being in the LHC. And bottom quarks are exceptionally unlikely creations.

In order to build a tetraquark, Karliner said, the LHC has to generate two bottom quarks in close enough proximity to each other that they bind, and then "decorate" them with two light antiquarks. And then it has to do it again, and again — until it's happened enough times that the researchers can be sure of their results.

But that's not as unlikely as it may sound.

"It turns out that, if you consider how you would make such things in a lab," Karliner said, "the probability of making them is only slightly less likely than finding that baryon with two bottom quarks and one light quark."

And that hunt is already underway.

Once the two-bottom-quark baryon is discovered, Karliner said — a result he expects within the next few years — "the clock starts ticking" on the appearance of the tetraquark.

Somewhere out there in the ether is a hadron that physicists have been hunting for 53 years. But now they've caught its scent.

Editor's Note: This article was updated to correct the mass of the researchers' earlier doubly charmed baryon prediction. It was 3,627 MeV, not 4,627 MeV.

Originally published on Live Science.

(FULL STORY)

New Map of Dark Matter Puts the Big Bang Theory on Trial (Kavli Roundtable)
[11/4/2017]
The prevailing view of the universe has just passed a rigorous new test, but the mysteries of dark matter and dark energy remain frustratingly unsolved.
(FULL STORY)

There and Back Again: Scientists Beam Photons to Space to Test Quantum Theory
[10/25/2017]
Researchers have taken a famous quantum-physics experiment to new heights by sending light, in the form of photons, to space and back, demonstrating the dual-particle-wave nature of light over much greater distances than scientists can achieve on Earth.

In the quantum theory of reality, particles like electrons and photons behave like waves as well, depending on how scientists measure them. Physicists call this phenomenon wave-particle duality, and it leads to many counterintuitive effects, like single particles traveling along two paths simultaneously.

In 1803, long before the conception of quantum theory, physicist Thomas Young conducted a famous experiment to demonstrate that light behaves like a wave. Young sent sunlight through two slits toward a blank paper card. When he observed the light on the card, it revealed a pattern of bright and dark bands that faded toward the edge. Rather than going through one slit or the other, the light had behaved like a wave, passing through both slits and interacting with itself to form a pattern, like ripples in a pond.


The Italian team used this instrument, called an interferometer, to split and recombine light. Here it's seen with an alignment laser beam.
Credit: QuantumFuture Research Group/University of Padova - DEI
In the 20th century, scientists placed detectors on such slits to determine which path the light actually took. When they did this, they always detected the photon in one slit or the other. What's more, the film developed two bright bands opposite the gaps instead of the ripples — the photons were going through one slit or the other instead of interacting like a wave. It's almost as if the light knew how the scientists wanted it to behave.

Scientists were baffled as to how the light determined what to do and, more importantly, when it "decided" to behave as a particle or a wave. Does light commit to one behavior at the beginning of an experiment, when it's produced; at the end, when it's detected; or some time in between?

In the late 1970s and early 1980s, theoretical physicist John Wheeler proposed some tests to answer this question. Some of these involved changing the experimental setup after the light had already entered the apparatus. This would delay when the light is able to choose its behavior until near the end of the test. It was one of Wheeler's delayed-choice experiments that the team at the University of Padova, in Italy, conducted and detailed Oct. 25 in the journal Science Advances.

Wheeler's experiment had been done before, but not at this scale. Using a reflector on an orbiting satellite allowed the team to test the predictions of quantum theory over greater distances than ever before.

"The law of quantum mechanics … should be valid for any distance, right?" Giuseppe Vallone, a researcher at the University of Padova and co-author of the study, told Space.com. "But of course, if we don't test it, we cannot be sure."

Testing quantum physics in space

The experimental apparatus on Earth sent out one photon at a time. That light was then split into two waves by a device called a beam splitter. The team sent one beam on a slightly longer path, so it ended up slightly behind its counterpart, Vallone explained.

The key was that the scientists split the light in such a way that the earlier wave had horizontal polarization and the latter one had vertical polarization. In other words, the waves were oriented in two different directions.

Then, the light beams were prepped and were ready to be sent to space. Vallone's team directed the light at a satellite, where a reflector sent it back toward the apparatus in Italy. At that point, two light waves were headed back toward Earth, one slightly ahead of the other.

A beam of light (top left) is split in two and heads down separate paths. If the paths are recombined the two waves create an interference pattern. If not, a particle is detected along only one path. The actual experiment conducted by the Italian researchers started and ended on Earth, traveling to an orbiting satellite on the way.
Credit: Vedovato et al., Sci. Adv. 2017;3: e1701180
This is when the "delayed choice" part of the experiment came in. After the light was reflected, a computer sent a random signal to a liquid crystal. Depending on the signal, the device either swapped both light beams' polarizations, or left them the same. At that point, the light passed through the beam splitter again. If the polarizations were left unchanged, the splitter simply recombined the light, making it act as a single wave. If the polarizations were swapped, it separated them even more, creating a distinct delay between the two pulses so the light would act as an individual particle.

The switch was decided only after the light was headed back to Earth, more than halfway through its 10-millisecond round-trip. This meant there was no way for the light to "know" what the scientists were expecting until the very end, when it hit the detector. If Vallone's group still saw the same behaviors — an interference pattern when the light was recombined, and single flashes when it wasn't — they would know that the light had been both a particle and a wave simultaneously, until their device made it choose one or the other at the very end.

And that's exactly what happened. The light split into two beams, like a wave and, at the same time, stayed together as a single photon, until the end, when the liquid crystal device forced it to behave as one or the other right before hitting the detector. The predictions of quantum theory were vindicated, Vallone said — and the surreal nature of quantum mechanics was reaffirmed.
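
The logic of that measurement can be sketched numerically. Below is a minimal, idealised Mach-Zehnder-style calculation in Python; the beam-splitter matrix and the phase values are illustrative assumptions, not a model of the Padova group's actual optics.

```python
# Toy delayed-choice calculation: one photon, two paths, and a late decision
# about whether the paths are recombined before detection.
import numpy as np

BS = np.array([[1, 1],
               [1, -1]]) / np.sqrt(2)   # ideal 50/50 beam splitter

def detection_probs(phase, recombine):
    """Probabilities at the two detectors for a single photon entering mode 0."""
    psi = BS @ np.array([1.0, 0.0])                   # split onto the two paths
    psi = np.diag([1.0, np.exp(1j * phase)]) @ psi    # relative phase between paths
    if recombine:                                     # "wave" choice: second beam splitter
        psi = BS @ psi
    return np.abs(psi) ** 2

for phase in (0.0, np.pi / 2, np.pi):
    print("recombined:", detection_probs(phase, True),
          " separate:", detection_probs(phase, False))
# Recombined paths show interference (the two probabilities swing with the phase);
# kept-separate paths always give 50/50, the particle-like behaviour.
```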

Even though the Italian team's work focused on confirming previous experiments, the test was still worthwhile, according to Thomas Jennewein, a quantum physicist at the University of Waterloo, in Ontario, who is unaffiliated with the paper. The experiment Vallone's team conducted is closer to Wheeler's original proposal, Jennewein told Space.com, which relied on the distance the light traveled to keep it separated for a long time.

"It is out in space, and it is far away, and so we are getting closer to the original scheme," Jennewein said.

Wheeler's original thought experiment envisioned this test conducted on light from a distant galaxy, bent toward Earth along two possible paths by a massive object in between. In this situation, a single photon could have traveled along both paths simultaneously, only being forced to choose its behavior millions or billions of years after it began its journey. Vallone's group didn't replicate this aspect of the experiment, but they were able to keep the light in its bizarre double state, called a superposition, for 10 milliseconds — an impressively long amount of time compared to what was demonstrated in previous trials, according to Jennewein.


So, what does it mean?

The results of Wheeler's experiment can be unsettling for those who like to believe in a definitive, physical reality. The new findings suggest that the behavior of objects in the universe is fundamentally undetermined until something forces them to behave a certain way. Particles propagate like waves, waves coalesce into particles and nothing can be predicted with certainty, only a probability.

Physicists often set these qualms aside to focus on their work. There's a saying, "Shut up and calculate," said Jennewein, who attributed it to Cornell University professor David Mermin. The idea is that scientists should work on figuring out the mathematics behind how quantum theory works rather than attempting to understand its implications.

Neither Jennewein nor Vallone completely adhere to this mantra. "People spend lifetimes, almost, trying to get their heads around these questions," Jennewein said.

"My personal belief is that we just cannot maintain our classical view when we look at quantum particles," he added. "It's kind of a new type of concept for us which has no representation in our everyday life." And our daily life is where we derive our intuitions from, he noted.

Vallone approaches the concept in a similar manner. "When we think of a photon as a particle, as a little ball, we are [making a] mistake. When we think of a photon like a water wave, we are [also making] a mistake," he said. "The photon, in some cases, seems to behave like a wave or seems to behave like a particle. But actually, it's neither."

Vallone's team used the Italian Space Agency's Matera Laser Ranging Observatory in Matera, Italy to send their light beam skyward.
Credit: QuantumFuture Research Group/University of Padova - DEI
More quantum physics in space

The experiment conducted by Vallone's team joins a new trend of space-based quantum research. In August 2016, China launched the first satellite designed specifically to test quantum theory and its applications in quantum computing. A team in Shanghai used the satellite to set a record for the farthest quantum teleportation, sending the state of one photon about 1,000 to 1,500 miles (1,600 to 2,400 kilometers) away.

These feats may find applications in computing, according to Vallone. Quantum objects can be in two states at once, like the light in Vallone's experiment, so quantum computers can encode more information than traditional electronics, he said. Also, because quantum states change when they are observed, quantum communication promises greater security than conventional channels: you can tell when someone has tried to eavesdrop.

Jennewein foresees more experiments like the one Vallone's group conducted. Space-based experiments enable researchers to explore the limits of quantum mechanics. "This experiment is a first step toward it," he said, and "I'm hoping to see more foundational quantum physics tests in space."

(FULL STORY)

New definitions of scientific units are on the horizon
[10/18/2017]
Revamped definitions of scientific units are on their way. In the biggest overhaul of the international system of units (SI) since its inception in 1960, a committee is set to redefine four basic units — the ampere, the kilogram, the kelvin and the mole — using relationships to fundamental constants, rather than abstract or arbitrary definitions. The International Bureau of Weights and Measures is reviewing the plans at a meeting near Paris from 16 to 20 October. Its recommendations will then go before the General Conference on Weights and Measures, which oversees the SI system, in November 2018. The changes would take effect in May 2019.

The kilogram is currently defined as the mass of a chunk of metal in a vault in Paris. And an imaginary experiment involving the force between two infinite wires defines the ampere, the unit of electrical current. The mole, meanwhile, is the amount of substance in a system with as many elementary entities as there are atoms in 0.012 kilograms of carbon-12, while the kelvin relates to the temperature and pressure at which water, ice and water vapour co-exist in equilibrium, known as the triple point of water. In the future, these units will be calculated in relation to constants — for example, the ampere will be based on the charge of an electron.
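
For a sense of scale, here is what an electron-charge-based ampere amounts to (a quick sketch; the charge used below is the exact value that was later fixed for the revised SI):

```python
# One ampere is one coulomb of charge flowing past per second.
e = 1.602176634e-19           # coulombs per elementary charge (exact in the revised SI)

charges_per_second = 1.0 / e
print(f"{charges_per_second:.3e} elementary charges per second")   # ~6.241e+18
```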

Redefinition might not affect everyday measurements, but it will enable scientists working at the highest level of precision to make such measurements in multiple ways, at any place or time and on any scale, without losing accuracy.
(FULL STORY)

IBM Has Used Its Quantum Computer to Simulate a Molecule—Here’s Why That’s Big News
[9/13/2017]
We just got a little closer to building a computer that can disrupt a large chunk of the chemistry world, and many other fields besides. A team of researchers at IBM have successfully used their quantum computer, IBM Q, to precisely simulate the molecular structure of beryllium hydride (BeH2). It's the most complex molecule ever given the full quantum simulation treatment.
Molecular simulation is all about finding a compound's ground state—its most stable configuration. Sounds easy enough, especially for a little-old three-atom molecule like BeH2. But in order to really know a molecule's ground state, you have to simulate how each electron in each atom will interact with all of the other atoms' nuclei, including the strange quantum effects that occur on such small scales. This is a problem that becomes exponentially harder as the size of the molecule increases.
While today's supercomputers can simulate BeH2 and other simple molecules, they quickly become overwhelmed, and chemical modellers—who attempt to come up with new compounds for things like better batteries and life-saving drugs—are forced to approximate how an unknown molecule might behave, then test it in the real world to see if it works as expected.
The promise of quantum computing is to vastly simplify that process by exactly predicting the structure of a new molecule, and how it will interact with other compounds. In work published today in Nature (paywall)—and also available on the Arxiv (PDF)—the IBM team have shown that they can use a new algorithm to calculate the ground state of BeH2 on their seven-qubit chip.
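
The variational idea behind such simulations can be illustrated with a purely classical toy calculation. The sketch below uses plain NumPy, a made-up two-level Hamiltonian and a one-parameter trial state; it only demonstrates the principle of minimising an energy expectation value and is not IBM's algorithm or a model of BeH2.

```python
# Toy "variational" ground-state search for a 2x2 Hamiltonian (arbitrary units).
import numpy as np

H = np.array([[-1.0, 0.5],
              [ 0.5, 1.0]])

def energy(theta):
    """Expectation value <psi(theta)|H|psi(theta)> for a simple trial state."""
    psi = np.array([np.cos(theta), np.sin(theta)])
    return psi @ H @ psi

thetas = np.linspace(0.0, np.pi, 1000)
variational_minimum = min(energy(t) for t in thetas)

print("variational minimum:", variational_minimum)
print("exact ground energy:", np.linalg.eigvalsh(H)[0])   # the two should agree closely
```
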
In some ways, it's a small advance. But it's an important step on the path of ever-greater complexity in molecular simulation using quantum computers that will ultimately lead to commercially important breakthroughs.
Even now, as the research team notes in their blog post on the work, IBM offers access to a 16-qubit quantum computer as a free cloud service. The more qubits a chip has—that is, quantum bits that can be used to encode data in multiple states at once—the greater the complexity of calculations it should be able to handle. At least in theory. As we pointed out when we made practical quantum computers one of our Breakthrough Technologies of 2017, one of the big challenges in designing quantum computers is making sure qubits remain in their delicate quantum state long enough to perform calculations. The more qubits a chip has, though, the harder that has been for researchers to do.
Still, the day when quantum computers surpass classical machines—an inflection point known as quantum supremacy—is rapidly approaching. Some observers think a chip with 50 qubits would be enough to get there. And while the chemistry world stands to benefit immensely from such advances, it isn't the only field. Quantum computers are expected to be superstars at any kind of optimization problem, which should help propel big advances in everything from artificial intelligence to how companies deliver packages to customers.
(FULL STORY)

Gravity may be created by strange flashes in the quantum realm A model of how wave forms of quantum systems collapse reveals a way they could create gravitational fields, and perhaps even reconcile two pillars of physics
[9/20/2017]
By Anil Ananthaswamy

HOW do you reconcile the two pillars of modern physics: quantum theory and gravity? One or both will have to give way. A new approach says gravity could emerge from random fluctuations at the quantum level, making quantum mechanics the more fundamental of the two theories.

Of our two main explanations of reality, quantum theory governs the interactions between the smallest bits of matter. And general relativity deals with gravity and the largest structures in the universe. Ever since Einstein, physicists have been trying to bridge the gap between the two, with little success.

Part of the problem is knowing which strands of each theory are fundamental to our understanding of reality.

One approach towards reconciling gravity with quantum mechanics has been to show that gravity at its most fundamental comes in indivisible parcels called quanta, much like the electromagnetic force comes in quanta called photons. But this road to a theory of quantum gravity has so far proved impassable.

Now Antoine Tilloy at the Max Planck Institute of Quantum Optics in Garching, Germany, has attempted to get at gravity by tweaking standard quantum mechanics.

In quantum theory, the state of a particle is described by its wave function. The wave function lets you calculate, for example, the probability of finding the particle in one place or another on measurement. Before the measurement, it is unclear whether the particle exists and if so, where. Reality, it seems, is created by the act of measurement, which “collapses” the wave function.

But quantum mechanics doesn’t really define what a measurement is. For instance, does it need a conscious human? The measurement problem leads to paradoxes like Schrödinger’s cat, in which a cat can be simultaneously dead and alive inside a box, until someone opens the box to look.

One solution to such paradoxes is a so-called GRW model that was developed in the late 1980s. It incorporates “flashes”, which are spontaneous random collapses of the wave function of quantum systems. The outcome is exactly as if there were measurements being made, but without explicit observers.

Tilloy has modified this model to show how it can lead to a theory of gravity. In his model, when a flash collapses a wave function and causes a particle to be in one place, it creates a gravitational field at that instant in space-time. A massive quantum system with a large number of particles is subject to numerous flashes, and the result is a fluctuating gravitational field.

“A spontaneous collapse in a quantum system creates a gravitational field at that instant in space-time”
It turns out that the average of these fluctuations is a gravitational field that one expects from Newton’s theory of gravity (arxiv.org/abs/1709.03809). This approach to unifying gravity with quantum mechanics is called semiclassical: gravity arises from quantum processes but remains a classical force. “There is no real reason to ignore this semiclassical approach, to having gravity being classical at the fundamental level,” says Tilloy.
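A schematic way to picture that averaging claim (an illustrative sketch, not the formalism of Tilloy's paper): treat each flash as depositing the particle's mass at a definite point, let that point mass source an ordinary Newtonian potential, and then average over the random flashes:

    \Phi_{\text{flash}}(\mathbf{x}) = -\,G \sum_i \frac{m_i}{|\mathbf{x} - \mathbf{x}_i|},
    \qquad
    \langle \Phi_{\text{flash}}(\mathbf{x}) \rangle \approx -\,G \int \frac{\langle \rho(\mathbf{x}') \rangle}{|\mathbf{x} - \mathbf{x}'|}\, d^3x' = \Phi_{\text{Newton}}(\mathbf{x}).

Here the x_i are the flash locations and the m_i the associated masses; the field jitters from flash to flash, but its average is the familiar Newtonian potential of the mean mass density.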

“I like this idea in principle,” says Klaus Hornberger at the University of Duisburg-Essen in Germany. But he points out that other problems need to be tackled before this approach can be a serious contender for unifying all the fundamental forces underpinning the laws of physics on scales large and small. For example, Tilloy’s model can be used to get gravity as described by Newton’s theory, but the maths still has to be worked out to see if it is effective in describing gravity as governed by Einstein’s general relativity.

Tilloy agrees. “This is very hard to generalise to relativistic settings,” he says. He also cautions that no one knows which of the many tweaks to quantum mechanics is the correct one.

Nonetheless, his model makes predictions that can be tested. For example, it predicts that gravity will behave differently at the scale of atoms from how it does on larger scales. Should those tests find that Tilloy’s model reflects reality and gravity does indeed originate from collapsing quantum fluctuations, it would be a big clue that the path to a theory of everything would involve semiclassical gravity.

This article appeared in print under the headline “Quantum collapse spawns gravity?”
(FULL STORY)

Scientists discover strange form of black hole at the heart of Milky Way
[9/4/2017]
A strange form of black hole has been detected for the first time at the heart of the Milky Way.

It's a "mini-me" version of its neighbouring supermassive "cousin" - shedding light on how it formed.

Looming in the middle of every galaxy, supermassive black holes weigh as much as ten billion suns - fuelling the birth of stars and deforming the fabric of space-time itself.
But the mass of the newly identified black hole is only about 100,000 times that of our sun - placing it in the "intermediate sized" class.

These were believed to exist but none had ever actually been identified - until now.

Lying about 25,000 light years from Earth, it could help answer one of the really big questions - how did the Milky Way evolve?
It was found hiding in a cloud of molecular gas by Japanese astronomers using the Alma (Atacama Large Millimeter/submillimeter Array) 16,400 feet above sea level in the Andes in northern Chile.

The radio telescope's high sensitivity and resolution enabled them to observe the cloud 195 light years from the Milky Way's centre spot.

It sheds fresh light on the most mysterious objects in the universe. Uncovering their secrets is the 'Holy Grail' of astronomy.
Recent research has shown supermassive black holes are essential to the creation of galaxies, stars - and even life itself.

Each one is about half a per cent of the host galaxy's size - which indicates they are the driving force behind their evolution.

The finding published in Nature Astronomy provides important insights into how supermassive black holes like the one at the very centre of our galaxy were created.

Although it is well established that they reside in seemingly all galaxies, we do not know how they get so enormous.
This is despite them appearing to have been in place when the universe was comparatively young - only a few hundred million years old.

Now the mystery could be solved by the identification of the intermediate-type black hole - something astrophysicists suspected were around but for which there have been only tentative candidates in the past.

It's believed they could be the seeds of their more massive counterparts - merging together to form a gigantic one. Intermediate black holes might simply turn out to be their progenitors.

It's difficult to find black holes - because they are completely black. But in some cases they cause effects which can be seen.

A black hole is a region of space that has such an extremely powerful gravitational field that it absorbs all the light that passes near it and reflects none.

Professor Tomoharu Oka and colleagues used computer simulations to show the high-velocity motion, or kinematics, of the gas could only be explained by an intermediate black hole concealed in its midst.

They also found the emission from this cloud closely resembles a scaled-down version of the Milky Way's quiescent supermassive black hole.

Astrophysicists have suspected an intermediate class of black hole might exist - with masses between a hundred and several hundred thousand times that of the Sun.

But such black holes had not previously been reliably detected and their existence has been fiercely debated among the astronomical community.

Prof Oka, of Keio University in Japan, said it is widely accepted black holes with masses greater than a million solar masses lurk at the centres of massive galaxies, but their origins remain unknown.

He said: "One possible scenario is intermediate-mass black holes (IMBHs) - which are formed by the runaway coalescence of stars in young compact star clusters - merge at the centre of a galaxy to form a supermassive black hole.

"Although many candidates for IMBHs have been proposed none is accepted as definitive. Recently we discovered a peculiar molecular cloud near the centre of our Milky Way galaxy.

"Based on the careful analysis of gas kinematics we concluded a compact object with a mass of about 100,000 solar masses is lurking in this cloud."

Prof Oka said it suggests "this massive object is an inactive IMBH which is not currently accreting matter."

Theoretical studies have predicted 100 million to one billion black holes should exist in the Milky Way - but only 60 or so have been identified through observations so far.

Despite their popularity both in real science and science fiction the concept of a black hole has only been around for a hundred years - as predicted by Albert Einstein. The term itself did not come into use until 1967, and it was just 46 years ago that the first one was identified.

Prof Oka said: "Further detection of such compact high-velocity features in various environments may increase the number of non-luminous black hole candid ate and thereby increase targets to search for evidential proof of general relativity.

"This would make a considerable contribution to the progress of modern physics."
(FULL STORY)

3,700-year-old Babylonian tablet rewrites the history of maths - and shows the Greeks did not develop trigonometry
[8/24/2017]
A 3,700-year-old clay tablet has proven that the Babylonians developed trigonometry 1,500 years before the Greeks and were using a sophisticated method of mathematics which could change how we calculate today.

The tablet, known as Plimpton 322, was discovered in the early 1900s in Southern Iraq by the American archaeologist and diplomat Edgar Banks, who was the inspiration for Indiana Jones.

The true meaning of the tablet has eluded experts until now but new research by the University of New South Wales, Australia, has shown it is the world’s oldest and most accurate trigonometric table, which was probably used by ancient architects to construct temples, palaces and canals.

However, unlike today's trigonometry, Babylonian mathematics used a base-60, or sexagesimal, system rather than the base-10 system used today. Because 60 is far easier to divide by three, the experts studying the tablet found that its calculations are far more accurate.
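The arithmetic advantage is easy to see with a small illustration (a sketch for this article, not taken from the study itself): a fraction such as 1/3 terminates after a single place in base 60 but recurs forever in base 10.

    # Illustrative sketch: expand 1/3 in base 10 and in base 60 (sexagesimal).
    from fractions import Fraction

    def expand(frac, base, places):
        """First `places` fractional digits of `frac` in `base`, plus the
        leftover remainder (a zero remainder means the expansion is exact)."""
        digits, remainder = [], frac
        for _ in range(places):
            remainder *= base
            digit = int(remainder)
            digits.append(digit)
            remainder -= digit
        return digits, remainder

    print(expand(Fraction(1, 3), 10, 4))   # ([3, 3, 3, 3], Fraction(1, 3)): never terminates
    print(expand(Fraction(1, 3), 60, 4))   # ([20, 0, 0, 0], Fraction(0, 1)): exactly 0;20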

Dr Daniel Mansfield with the 3,700-year-old trigonometric table
CREDIT: UNSW
“Our research reveals that Plimpton 322 describes the shapes of right-angle triangles using a novel kind of trigonometry based on ratios, not angles and circles,” said Dr Daniel Mansfield of the School of Mathematics and Statistics in the UNSW Faculty of Science.

“It is a fascinating mathematical work that demonstrates undoubted genius. The tablet not only contains the world’s oldest trigonometric table; it is also the only completely accurate trigonometric table, because of the very different Babylonian approach to arithmetic and geometry.

“This means it has great relevance for our modern world. Babylonian mathematics may have been out of fashion for more than 3000 years, but it has possible practical applications in surveying, computer graphics and education.

“This is a rare example of the ancient world teaching us something new.”

The Greek astronomer Hipparchus, who lived around 120BC, has long been regarded as the father of trigonometry, with his ‘table of chords’ on a circle considered the oldest trigonometric table.

A trigonometric table allows a user to determine two unknown ratios of a right-angled triangle using just one known ratio. But the tablet is far older than Hipparchus, demonstrating that the Babylonians were already well advanced in complex mathematics far earlier.
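In ratio terms the idea can be sketched as follows (a hypothetical illustration; the tablet itself works with exact whole-number sexagesimal ratios rather than floating point): once one ratio of a right-angled triangle is known, Pythagoras' theorem fixes the other two, with no angles anywhere.

    # Hypothetical sketch of a ratio-based lookup: from one ratio of a
    # right-angled triangle, recover the other two without using angles.
    import math

    def ratios_from_one(short_over_long):
        diag_over_long = math.sqrt(1.0 + short_over_long ** 2)
        short_over_diag = short_over_long / diag_over_long
        return diag_over_long, short_over_diag

    print(ratios_from_one(3 / 4))   # (1.25, 0.6) for a 3-4-5 triangle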

Babylon, which was in modern day Iraq, was once one of the most advanced cultures in the world
The tablet, which is thought to have come from the ancient Sumerian city of Larsa, has been dated to between 1822 and 1762 BC. It is now in the Rare Book and Manuscript Library at Columbia University in New York.

“Plimpton 322 predates Hipparchus by more than 1000 years,” says study co-author Dr Norman Wildberger of UNSW.

“It opens up new possibilities not just for modern mathematics research, but also for mathematics education. With Plimpton 322 we see a simpler, more accurate trigonometry that has clear advantages over our own.

“A treasure-trove of Babylonian tablets exists, but only a fraction of them have been studied yet. The mathematical world is only waking up to the fact that this ancient but very sophisticated mathematical culture has much to teach us.”

The 15 rows on the tablet describe a sequence of 15 right-angle triangles, which are steadily decreasing in inclination.

The left-hand edge of the tablet is broken but the researchers believe there were originally six columns and that the tablet was meant to be completed with 38 rows.

“Plimpton 322 was a powerful tool that could have been used for surveying fields or making architectural calculations to build palaces, temples or step pyramids,” added Dr Mansfield.

The new study is published in Historia Mathematica, the official journal of the International Commission on the History of Mathematics.
(FULL STORY)

Dark Energy Survey reveals most accurate measurement of universe's dark matter
[8/4/2017]
Imagine planting a single seed and, with great precision, being able to predict the exact height of the tree that grows from it. Now imagine traveling to the future and snapping photographic proof that you were right.

If you think of the seed as the early universe, and the tree as the universe the way it looks now, you have an idea of what the Dark Energy Survey (DES) collaboration has just done. In a presentation at the American Physical Society Division of Particles and Fields meeting at the U.S. Department of Energy's (DOE) Fermi National Accelerator Laboratory, DES scientists will unveil the most accurate measurement ever made of the present large-scale structure of the universe.

These measurements of the amount and "clumpiness" (or distribution) of dark matter in the present-day cosmos were made with a precision that, for the first time, rivals that of inferences from the early universe by the European Space Agency's orbiting Planck observatory. The new DES result (the tree, in the above metaphor) is close to "forecasts" made from the Planck measurements of the distant past (the seed), allowing scientists to understand more about the ways the universe has evolved over 14 billion years.

"This result is beyond exciting," said Scott Dodelson of Fermilab, one of the lead scientists on this result. "For the first time, we're able to see the current structure of the universe with the same clarity that we can see its infancy, and we can follow the threads from one to the other, confirming many predictions along the way."

Most notably, this result supports the theory that 26 percent of the universe is in the form of mysterious dark matter and that space is filled with an also-unseen dark energy, which is causing the accelerating expansion of the universe and makes up 70 percent of the cosmos.

Paradoxically, it is easier to measure the large-scale clumpiness of the universe in the distant past than it is to measure it today. In the first 400,000 years following the Big Bang, the universe was filled with a glowing gas, the light from which survives to this day. Planck's map of this cosmic microwave background radiation gives us a snapshot of the universe at that very early time.

Since then, the gravity of dark matter has pulled mass together and made the universe clumpier over time. But dark energy has been fighting back, pushing matter apart. Using the Planck map as a start, cosmologists can calculate precisely how this battle plays out over 14 billion years.

"The DES measurements, when compared with the Planck map, support the simplest version of the dark matter/dark energy theory," said Joe Zuntz, of the University of Edinburgh, who worked on the analysis. "The moment we realized that our measurement matched the Planck result within 7 percent was thrilling for the entire collaboration."

The primary instrument for DES is the 570-megapixel Dark Energy Camera, one of the most powerful in existence, able to capture digital images of light from galaxies eight billion light-years from Earth.

The camera was built and tested at Fermilab, the lead laboratory on the Dark Energy Survey, and is mounted on the National Science Foundation's 4-meter Blanco telescope, part of the Cerro Tololo Inter-American Observatory in Chile, a division of the National Optical Astronomy Observatory. The DES data are processed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

Scientists on DES are using the camera to map an eighth of the sky in unprecedented detail over five years. The fifth year of observation will begin in August. The new results released today draw from data collected only during the survey's first year, which covers 1/30th of the sky.

"It is amazing that the team has managed to achieve such precision from only the first year of their survey," said National Science Foundation Program Director Nigel Sharp. "Now that their analysis techniques are developed and tested, we look forward with eager anticipation to breakthrough results as the survey continues."

DES scientists used two methods to measure dark matter. First, they created maps of galaxy positions as tracers, and second, they precisely measured the shapes of 26 million galaxies to directly map the patterns of dark matter over billions of light-years, using a technique called gravitational lensing.

To make these ultraprecise measurements, the DES team developed new ways to detect the tiny lensing distortions of galaxy images, an effect not even visible to the eye, enabling revolutionary advances in understanding these cosmic signals. In the process, they created the largest guide to spotting dark matter in the cosmos ever drawn (see image). The new dark matter map is 10 times the size of the one DES released in 2015 and will eventually be three times larger than it is now.

"It's an enormous team effort and the culmination of years of focused work," said Erin Sheldon, a physicist at the DOE's Brookhaven National Laboratory, who co-developed the new method for detecting lensing distortions.

These results and others from the first year of the Dark Energy Survey will be released online and announced during a talk by Daniel Gruen, NASA Einstein fellow at the Kavli Institute for Particle Astrophysics and Cosmology at DOE's SLAC National Accelerator Laboratory, at 5 p.m. Central time. The talk is part of the APS Division of Particles and Fields meeting at Fermilab and will be streamed live.

The results will also be presented by Kavli fellow Elisabeth Krause of the Kavli Institute for Particle Astrophysics and Cosmology at SLAC at the TeV Particle Astrophysics Conference in Columbus, Ohio, on Aug. 9; and by Michael Troxel, postdoctoral fellow at the Center for Cosmology and AstroParticle Physics at Ohio State University, at the International Symposium on Lepton Photon Interactions at High Energies in Guangzhou, China, on Aug. 10. All three of these speakers are coordinators of DES science working groups and made key contributions to the analysis.

"The Dark Energy Survey has already delivered some remarkable discoveries and measurements, and they have barely scratched the surface of their data," said Fermilab Director Nigel Lockyer. "Today's world-leading results point forward to the great strides DES will make toward understanding dark energy in the coming years."
(FULL STORY)

World's Fastest-Swirling Vortex Simulates the Big Bang
[8/8/2017]
Faster than a tornado, speedier than the giant storm swirling on Jupiter — it's the world's fastest-swirling vortex, which scientists have created in a primordial soup of gluey particles meant to re-create the Big Bang.

The swirling particle soup rotates at head-snapping speeds — many times faster than the closest contenders.

However, don't expect this fast-spinning fluid to turn heads anytime soon, as the vortices occur in a material called a quark-gluon plasma that is so small that the signature of this whirling can be detected only by the particles it produces.


An illustration of the quark-gluon plasma created in the Relativistic Heavy Ion Collider at Brookhaven National Laboratory
Credit: Brookhaven National Laboratory

"We can't look at the quark-gluon plasma; it's on the scale of an atomic nucleus," said Michael Lisa, a physicist at The Ohio State University who works on the Relativistic Heavy Ion Collider (RHIC) collaboration, which produced the new results. [The Big Bang to Civilization: 10 Amazing Origin Events]

Hot soup

Right after the Big Bang, a hot primordial stew of elementary particles called quarks and gluons permeated the baby universe. These elementary particles are the building blocks of better-known particles such as protons and neutrons. This quark-gluon plasma has several unique properties. First, at a blazing 7 trillion to 10 trillion degrees Fahrenheit (3.9 trillion to 5.6 trillion degrees Celsius), it's the hottest known fluid. It is also the densest fluid and "nearly perfect" in that it experiences almost no friction, meaning it flows very easily.

To understand exactly what happened in those moments after the Big Bang, scientists have re-created this primordial particle soup in an atom smasher at the RHIC, at Brookhaven National Laboratory in Upton, New York. The RHIC smashes the nuclei of gold atoms together at nearly the speed of light and then uses ultrasensitive detectors to measure the particles that fly off the collision.

Whirling fluid

In the new study, the team analyzed the quark-gluon plasma's vorticity — essentially a measure of its angular momentum or, in colloquial terms, how fast it spins.

Of course, they had a unique obstacle: The RHIC can produce just a teensy amount of the material, and it lives very fleetingly, or about 10^-23 seconds. So there is no way to actually "observe" this fluid in the traditional sense.

Instead, scientists look for signatures of its whirling, based on the particles emitted from the soup, Lisa told Live Science. On average, particles inside a spinning fluid should have spins that roughly align with the angular momentum of the fluid. By measuring how much the particles coming off this whirling soup are deflected from their expected path, the team could calculate a rough estimate for the fluid's vorticity — which roughly measures the local spinning motion. In particular, particles known as lambda baryons tend to decay more slowly than other particles, such as protons and neutrons, meaning the RHIC detectors could more easily track their paths before they vanished.
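A rough way to see how a spin measurement turns into a vorticity estimate (the leading-order relation below is a standard local-thermal-equilibrium approximation, not a formula quoted in the article): for spin-1/2 particles emitted from a fluid at temperature T,

    P_\Lambda \approx \frac{\hbar\,\omega}{2 k_B T}
    \quad\Longrightarrow\quad
    \omega \approx \frac{2 k_B T\, P_\Lambda}{\hbar},

so a lambda polarization of a few percent at a temperature of order a hundred MeV corresponds to a vorticity of order 10^21 to 10^22 per second.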

It turns out, the vorticity in the quark-gluon plasma makes the whirling motion inside a tornado seem like a calm day in the park. The vorticity is the fastest ever recorded — much more rapid than that of Jupiter's Great Red Spot, a swirling storm of gas. It's also faster than the previous record holder, a supercooled type of helium nanodroplet, the researchers reported Aug. 2 in the journal Nature.

Understanding the structure of fluid flow in the plasma could reveal insight into the strong nuclear force, which binds atoms together, the researchers said. Several competing particle theories make predictions about vorticity that could eventually be compared against these experimental results. However, scientists still know too little about the plasma's swirling properties to make definitive conclusions.

"It's too early to say whether it teaches us something fundamental," Lisa said.

Originally published on Live Science.
(FULL STORY)

UCI celestial census indicates that black holes pervade the universe
[8/13/2017]
After conducting a cosmic inventory of sorts to calculate and categorize stellar-remnant black holes, astronomers from the University of California, Irvine have concluded that there are probably tens of millions of the enigmatic, dark objects in the Milky Way - far more than expected.

"We think we've shown that there are as many as 100 million black holes in our galaxy," said UCI chair and professor of physics and astronomy James Bullock, co-author of a research paper on the subject in the current issue of Monthly Notices of the Royal Astronomical Society.

UCI's celestial census began more than a year and a half ago, shortly after the news that the Laser Interferometer Gravitational-Wave Observatory, or LIGO, had detected ripples in the space-time continuum created by the distant collision of two black holes, each the size of 30 suns.

"Fundamentally, the detection of gravitational waves was a huge deal, as it was a confirmation of a key prediction of Einstein's general theory of relativity," Bullock said. "But then we looked closer at the astrophysics of the actual result, a merger of two 30-solar-mass black holes. That was simply astounding and had us asking, 'How common are black holes of this size, and how often do they merge?'"

He said that scientists assume most stellar-remnant black holes - which result from the collapse of massive stars at the end of their lives - will be about the same mass as our sun. To see evidence of two black holes of such epic proportions finally coming together in a cataclysmic collision had some astronomers scratching their heads.

UCI's work was a theoretical investigation into the "weirdness of the LIGO discovery," Bullock said. The research, led by doctoral candidate Oliver Elbert, was an attempt to interpret the gravitational wave detections through the lens of what is known about galaxy formation and to form a framework for understanding future occurrences.

"Based on what we know about star formation in galaxies of different types, we can infer when and how many black holes formed in each galaxy," Elbert said. "Big galaxies are home to older stars, and they host older black holes too."

According to co-author Manoj Kaplinghat, UCI professor of physics and astronomy, the number of black holes of a given mass per galaxy will depend on the size of the galaxy.

The reason is that larger galaxies have many metal-rich stars, and smaller dwarf galaxies are dominated by big stars of low metallicity. Stars that contain a lot of heavier elements, like our sun, shed a lot of that mass over their lives.

When it comes time for one to end it all in a supernova, there isn't as much matter left to collapse in on itself, resulting in a lower-mass black hole. Big stars with low metal content don't shed as much of their mass over time, so when one of them dies, almost all of its mass will wind up in the black hole.

"We have a pretty good understanding of the overall population of stars in the universe and their mass distribution as they're born, so we can tell how many black holes should have formed with 100 solar masses versus 10 solar masses," Bullock said. "We were able to work out how many big black holes should exist, and it ended up being in the millions - way more than I anticipated."

In addition, to shed light on subsequent phenomena, the UCI researchers sought to determine how often black holes occur in pairs, how often they merge, and how long it takes. They wondered whether the 30-solar-mass black holes detected by LIGO were born billions of years ago and took a long time to merge or came into being more recently (within the past 100 million years) and merged soon after.

"We show that only 0.1 to 1 percent of the black holes formed have to merge to explain what LIGO saw," Kaplinghat said. "Of course, the black holes have to get close enough to merge in a reasonable time, which is an open problem."

Elbert said he expects many more gravitational-wave detections so that he and other astronomers can determine if black holes collide mostly in giant galaxies. That, he said, would tell them something important about the physics that drive them to coalesce.

According to Kaplinghat, they may not have to wait too long, relatively speaking. "If the current ideas about stellar evolution are right, then our calculations indicate that mergers of even 50-solar-mass black holes will be detected in a few years," he said.
(FULL STORY)

Cosmic map reveals a not-so-lumpy Universe
[8/3/2017]
Odd results could still be consistent with the 'standard model' of cosmology.
(FULL STORY)

High-Precision Measurement of the Proton’s Atomic Mass
[7/18/2017]
We report on the precise measurement of the atomic mass of a single proton with a purpose-built Penning-trap system. With a precision of 32 parts per trillion our result not only improves on the current CODATA literature value by a factor of 3, but also disagrees with it at a level of about 3 standard deviations.
(FULL STORY)

Strange Noise in Gravitational-Wave Data Sparks Debate
[6/30/2017]
The team that discovered gravitational waves put their data online. Now an independent group of researchers claims that they’ve found what might be a serious problem.
(FULL STORY)

STARSHOT: INSIDE THE PLAN TO SEND A SPACECRAFT TO OUR NEIGHBOR STAR: Hundreds of engineers and scientists have come together to shoot for the stars, literally.
[7/11/2017]
By Shannon Stirone
As a species, we have made magnificent strides in robotic space exploration in the past decade. From exploring Pluto close-up for the first time to discovering our solar system is rife with underground liquid oceans, we now understand our little neighborhood of planets and moons better than ever before. It's time to start talking about how we are going to explore the stars.

The Breakthrough Initiatives, created by Russian billionaire physicist Yuri Milner, is one of the most forward-thinking space exploration groups in the world. Among Breakthrough's many ambitious projects is Breakthrough Starshot. The goal is to send hundreds of gram-sized spacecraft to the nearest star—Proxima Centauri, some 4.2 light-years away—and have them arrive within our lifetimes. The craft would then attempt to communicate with Earth and transmit photos of Proxima Centauri and its orbiting planet, Proxima b, back to us.


The Breakthrough Initiatives recently held an international conference called Breakthrough Discuss at Stanford University. Hundreds of researchers and engineers met to flesh out Breakthrough's many ambitious space exploration goals. Starshot attracted perhaps the most interest due to its thrilling prospects and many technical challenges to overcome.

The verdict? "It looks feasible," according to Harvard science professor Avi Loeb who chairs the advisory committee for Breakthrough Starshot.


This artist's impression shows a view of the surface of the planet Proxima b orbiting the red dwarf star Proxima Centauri, the closest star to the solar system.
ESO/M. Kornmesser
Even though the target star system is closer to us than any other, it's still mind-bogglingly far away: 25 trillion miles. Voyager 1, the spacecraft that has traveled farthest from Earth, has been flying at 38,000 mph for forty years, and it's only a tiny fraction closer to Proxima Centauri than it was when it launched. At Voyager's rate, it would take tens of thousands of years for the spacecraft to get anywhere close to Proxima Centauri, even if it were headed in the right direction.
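A quick back-of-the-envelope check, using the article's own round numbers, makes the point:

    # Scale of the problem, using the figures quoted above.
    voyager_speed_mph = 38_000
    hours_per_year = 24 * 365.25
    trip_miles = 25e12                          # ~25 trillion miles to Proxima Centauri

    miles_per_year = voyager_speed_mph * hours_per_year
    print(miles_per_year * 40 / 1e9)            # ~13 billion miles covered in 40 years
    print(miles_per_year * 40 / trip_miles)     # ~0.05 percent of the way there
    print(trip_miles / miles_per_year)          # ~75,000 years at Voyager's pace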

Conventional rocket launches and gravity assist maneuvers just won't take us anywhere near the stars. We need a new plan.

Laser Beams and Light Sails


Concept image of a spherical light sail being accelerated by laser propulsion from Earth.
Michael Stillwell
Spaceflight generally evokes visions of giant rockets with fiery tails erupting off the pad at Cape Canaveral and flying out beyond the atmosphere. To maneuver to a destination after launch, spacecraft often use a liquid rocket fuel called hydrazine. This potent propellant, however, is much too heavy to launch in large quantities. It would be incredibly inefficient just to launch enough fuel to Mars for a return flight, let alone enough for an interstellar voyage. Fortunately, there's a much more efficient way to zip around the stars, and it uses nothing more than energy from beams of electromagnetic radiation.

Light sails are reflective surfaces resembling tin foil that use photons from a source of light, such as a laser beam or the sun, to propel a spacecraft. When the photons of light bounce off the reflective surface, they transfer momentum, giving the craft a small push that accelerates it in the near-vacuum of space.


The technology isn't just theoretical. In 2010, the Japanese Aerospace Exploration Agency (JAXA) launched a craft called IKAROS—the first successful interplanetary probe to use light sailing as a means of propulsion. The Planetary Society also launched a light sail back in June 2015, and the institution is working on a new sail, the LightSail 2, slated for launch later this year.


Breakthrough Starshot wants to take light sail technology even farther out to space—all the way to Proxima Centauri. Last year, the organization announced a plan to use light sailing and laser propulsion to accelerate dozens or even hundreds of nano-spacecraft fast enough to reach Proxima Centauri in a matter of decades. We're talking about relativistic speeds, roughly 20 percent of the speed of light, or somewhere around 100 million mph. Only at such a ludicrous speed could a probe reach Proxima Centauri in a reasonable two or three decades. Then it will take another four years or so for the radio signals to get back to Earth, traveling at the full-bore speed of light.
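The timeline follows directly from those speeds (a simple check using the article's round numbers; relativistic corrections at 20 percent of light speed are small enough to ignore here):

    # Cruise and reply times at the quoted 20 percent of light speed.
    distance_ly = 4.2                  # Proxima Centauri, in light-years
    cruise_fraction_of_c = 0.20

    cruise_years = distance_ly / cruise_fraction_of_c   # ~21 years in transit
    reply_years = distance_ly                           # radio signals travel at c
    print(cruise_years, reply_years)                    # ~21 years out, ~4.2 years back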

"IN DOING THESE CALCULATIONS, NOTHING HAS COME UP YET THAT SEEMS LIKE IT'S NOT POSSIBLE."
The probes themselves would be little more than small computer chips with a smartphone-like camera, a radio transmitter, and a few other basic electronics. Cornell University is currently working on a project called KickSat to develop just this type of tiny spacecraft, which the KickSat team calls "chipsats." These chipsats are to be deployed from a CubeSat after launch, and in the future, little spacecraft with light sails could be released in orbit the same way. Yuri Milner has met with the KickSat team to discuss their chipsats, also known as Sprites, and the possibility of adapting them for a trip to Proxima Centauri.



To send light sail probes on a journey of this scale, the energy from the sun isn't going to cut it. The spacecraft's sails would need to be propelled by the light of a powerful, concentrated laser beam. This is one of the most challenging parts of sending nanoprobes on a journey to the stars: building enough laser infrastructure on Earth to propel the small craft.

Dozens of large lasers constructed around the globe would need to work together, forming an array that coalesces into one powerful beam of light. According to Breakthrough, an enormous, global network of lasers would need to continuously hit the light sails for only about two minutes to get the little probes up to 20 percent the speed of light.


Convincing the space agencies of the world to contribute to a global laser system comes with its own set of logistical and engineering challenges, but it is certainly possible with large-scale cooperation. The Starshot team's problems don't end there, though. There's also the small problem of making sure everything doesn't get shredded to bits as it flies through space at a million miles per hour.

A Spherical Sail

While the laser-propulsion plan has remained unchanged since the initial announcement, the Breakthrough Starshot team is just starting to dig deep into the engineering challenges. The Breakthrough Discuss conference at Stanford was the first large-scale meeting to develop a plan for Starshot, and those championing the mission have no short supply of problems to overcome.

Zach Manchester, an aerospace engineer and creator of the KickSat project, is working with the Starshot team to develop a concept for the solar sail, the main mechanism for getting to Proxima Centauri. Initially he thought a traditional, flat, kite-like sail—similar to the one used by IKAROS—would be the best way to go. But after a year of study, Manchester suggested the Starshot solar sail would probably need to be spherical instead of flat, making it look something like a small disco ball once deployed. Building a spherical sail also introduces the possibility of putting the probe itself inside the sail, rather than having it attached to the middle or towed along behind.


Concept image of a spherical light sail being accelerated with laser propulsion.
Michael Stillwell

The thin sail would need to reflect about 99.999 percent of the powerful laser light or it would burn up almost instantly. The rapid acceleration to one-fifth the speed of light in about two minutes would subject the sail to around 60,000 g's. The material will not only need to be highly reflective, but also sturdy enough to stand up to the forces from 60,000 times the acceleration of Earth's gravity.
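That figure is roughly what constant acceleration over a two-minute burn implies (a back-of-the-envelope check; the exact number depends on the assumed burn time):

    # Rough check of the acceleration on the sail.
    c = 3.0e8               # speed of light, m/s
    g = 9.81                # standard gravity, m/s^2
    burn_seconds = 120.0    # "about two minutes" of laser illumination

    acceleration = 0.20 * c / burn_seconds
    print(acceleration / g)   # ~51,000 g, the same order as the quoted 60,000 g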

Developing the sail could prove harder than building enormous lasers all over the world. And then there's the issue of communicating with a spacecraft that's 25 trillion miles away.

How Will We Know If It Worked?

"We've identified 20 of the biggest challenges, and one of the biggest is the communication delay between the spacecraft and the star, which is 4 light-years away," says Avi Loeb, Professor of Science at Harvard University and chair of the advisory committee for Breakthrough Starshot. "We have to be able to send the photographic data that's being recorded, but you can't focus the beam of the laser at that distance. When we went to look over the numbers it looks feasible—it'll just be very challenging."

Currently, it takes about twenty minutes to receive 250 megabits of data from spacecraft orbiting Mars. Data from Voyager 1 takes more than a day and a half to phone home from 10 billion miles away. Even if the Starshot team gets a spacecraft to Proxima Centauri in a few decades, any photos of the enticing planet Proxima b will take over four years to reach Earth, and the more data we transmit, the longer it will take.


The Atacama Large Millimeter/submillimeter Array (ALMA), the largest radio telescope array in the world, located in the high desert of Chile. Radio telescope arrays such as this will be crucial to detecting a signal from a probe at Proxima Centauri.
ESO/C. Malin
As it stands today, Proxima b is the only planet we know of in the entire Alpha Centauri system, which includes the small red dwarf star Proxima Centauri and two larger stars, Alpha Centauri A and Alpha Centauri B. However, there is a good chance that other planets lurk in the system, and we simply have not spotted them because their orbits do not take them directly in front of their host stars from our perspective. A probe could potentially reveal undiscovered planets in the Alpha Centauri system.

To get all that photographic data back—data that could very well lead to the discovery of new worlds around our closest neighbor stars—we will need to improve our ground-based receivers and radio telescopes. It is possible that a global array of radio dishes could distinguish the signals of the probes. China's new Five-hundred-meter Aperture Spherical radio Telescope (FAST), the largest single-dish radio telescope in the world, is already being used by Milner's Breakthrough Listen mission to search for signals from intelligent life. The enormous dish could be crucial for helping us detect a signal from a nanoprobe at Proxima Centauri.

"We Haven't Found a Deal Breaker Yet"

The Starshot initiative is ambitious and daring to say the least, but it's not the first time humans have set out to test the limits of engineering. Fortunately, both Loeb and Manchester felt great after the two-day discussion. "I came out of it with a lot more hope and a mindset that everyone on board thinks this is doable. We haven't found a deal breaker yet, basically. In doing these calculations nothing has come up yet that seems like it's not possible," says Manchester.


The two bright stars are Alpha Centauri A (left) and Alpha Centauri B (right). The faint red star in the center of the red circle is Proxima Centauri.
Skatebiker
The success of the Starshot project has huge implications not just for interstellar travel, but for the ease of exploring and studying our own solar system. If we can develop a system to launch small probes at relativistic speeds, a spacecraft that would normally take two years to get to Mars could get there in only two hours. If Starshot technology is developed, and we wanted to photograph something in the outer solar system, we could simply launch a nanoprobe to arrive in days or weeks rather than years. The speed of planetary science studies would accelerate tremendously.

While Starshot is still a nascent project, the hundreds of scientists and engineers who attended the conference were in good spirits about the possibilities. They all trust that together they can work out the engineering kinks required to make something of this magnitude work. Surely if the team from Starshot succeeds, whether it's 30 years from now or 100, they will have single-handedly revolutionized the way we explore the cosmos.

We are on the verge of not just interplanetary exploration, but interplanetary infrastructure and industry as well. If Breakthrough can pull off its Starshot, we will be well on our way to a new era of interstellar exploration. It's time to start building some big ol' lasers.
(FULL STORY)

Two Students Just Broke a Quantum Computing World Record
[7/5/2017]
Researchers from Switzerland have successfully simulated a 45-qubit quantum circuit, breaking the record for the greatest number of qubits to be simulated. This important milestone puts humanity one step closer to "quantum supremacy," the point at which quantum computers could outperform any traditional computer.
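To see why 45 qubits is such a milestone for simulation, consider the memory a brute-force simulator would need if it stored one double-precision complex amplitude per basis state (an assumption for illustration; record-setting runs also lean on compression and other tricks):

    # Memory for a full quantum state vector: 2**n complex amplitudes.
    bytes_per_amplitude = 16      # one complex number in double precision

    for qubits in (30, 40, 45, 50):
        terabytes = (2 ** qubits) * bytes_per_amplitude / 1e12
        print(qubits, round(terabytes, 2), "TB")   # 45 qubits -> roughly 560 TB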
(FULL STORY)

An easy-to-build desktop muon detector
[6/14/2017]
On airplanes I am often asked about the blinking metallic device connected to my laptop’s USB port. To assuage any suspicions, I explain that I’m a third-year physics graduate student at MIT and that the little device is actually a cosmic-ray-muon detector.

Over the past few years that detector has evolved from an instrument for a multimillion-dollar experiment to a device that high school and college physics students can construct themselves. The goal of a new program called CosmicWatch is to encourage students to build the detectors, which weigh in at less than 100 g and cost less than $100, and explore the effects of the particles that are constantly raining down on Earth’s surface.

My foray into muon-detector construction began when my supervisor, Janet Conrad, and I were tasked with assisting in an upgrade of the IceCube Neutrino Observatory, a cubic-kilometer particle detector built deep in the Antarctic glacier near the South Pole. IceCube has the ability to detect the occasional astrophysical neutrino from phenomena such as gamma-ray bursts, supernovae, and black holes (see Physics Today, June 2014, page 30). On a far more regular basis, the observatory sees a drizzle of cosmic-ray muons. The charged particles are a decay product of the particles that form when high-energy cosmic rays collide with molecules in Earth’s atmosphere. Muons are extremely penetrating, which enables a small fraction of them to travel the more than 1.5 km through the Antarctic ice to the IceCube detector.

As part of IceCube’s low-energy upgrade, called PINGU, Conrad and I planned to build optically isolated scintillator targets and place them throughout the detector. If a charged particle passed through the plastic scintillator, it would emit light that we could collect using a silicon photomultiplier. Whenever the photomultiplier registered enough light at the same time as a triggered event in IceCube, we would know that the particle that triggered IceCube also passed through our target; we could use that information to help determine the particle’s location and trajectory. Conrad and I called the targets muon-tagging optical modules.

The first detector prototype was very simple. I filled a small PVC pipe with liquid scintillator and inserted some circuitry and a silicon photomultiplier. Two wires penetrated the PVC cap: one for biasing the photomultiplier and one for outputting data to an oscilloscope. It was not a great design. The scintillator leaked around the cap threads, and the device looked more like a homemade bomb from a cheap movie than a particle detector. But hey, it worked. We could immediately see the signals produced from cosmic-ray muons passing through the scintillator.

The next iteration of the detector did away with the liquid scintillator and PVC piping. We found some centimeter-thick plastic scintillator panels from an old cosmic-ray experiment and built a proper light-tight enclosure from some scrap aluminum found in the machine shop. I also came across an Arduino and high-speed operational amplifiers in the MIT electronics recycling pile. Those parts, along with some pulse-shaping circuitry, resulted in a simple data acquisition system. We were able to record data directly to a computer as well as on the oscilloscope. The cost of the whole device was less than $100, with the photomultiplier accounting for the bulk of the expense.

In a June 2016 paper, we described exactly how we built the detector and provided a website link that contained all the information about our circuit boards, computer-aided design drawings, and Arduino software. Within a few days after submission to the arXiv, emails began pouring in. I was stunned to see that many of them came not from particle astrophysicists but from high school students with their own ideas for measurements or improvements. An MIT student, Mgcini Keith Phuthi, read the paper and modified our design so that his detector would communicate with his laptop through Bluetooth.

Phuthi and several other undergraduate students joined our little group to set up a small production facility. Once we started working with the new students, it was obvious that building the detector touched on several important skills. The students learned about shop practices, working with printed circuit boards, and programming microcontrollers.

We set out to see if our device would be suitable for MIT’s Junior Lab course, a class on physics lab work for undergrads. In the process, we stumbled on another use for the detector. We approached a cabinet in the corner of one lab, and as soon as we were within a meter of it, the count rate exploded; there was obviously something radioactive in there. We had a pretty good idea that it must be coming from some active gamma-ray source. One by one we took each radioactive isotope out of the cabinet and brought it close to the detector. We each had our own guess (I was thinking it would be a new cobalt-60 source), but it turned out the culprit was a large jar partially filled with dark gray powder: uranium salts. Not something I thought you could store in an undergraduate lab.

We also found something interesting in Conrad’s office. On the wall, next to negatives from a bubble chamber and a lead-glass calorimeter, was a bright orange ceramic plate. It turns out that decades ago, Fiesta dinnerware was glazed with a depleted uranium–based coating. Uranium has a very long half-life, and many of the decay daughters emit radiation in the form of gamma rays. I was surprised to see so much radiation coming from dinnerware!

Over the next few weeks, we received many emails from students who wanted to build detectors for high-altitude balloon missions. The appeal of our detector stemmed from the fact that it was small and could be battery (or USB) powered, with data stored locally in a Raspberry Pi. To help with such projects, we decided to redesign the detector one more time to make it lighter and easier to build.

Our latest detector weighs 68 g (the model in our 2016 paper was about 10 times as heavy), draws less than a watt of power, and has an improved low-signal response. The design is so simple that it should take students just a few hours to build a full detector from scratch.

The detector is starting to gain international interest. Recently I started working with Katarzyna Frankiewicz, a PhD student from the National Center for Nuclear Research (NCBJ) in Poland. She and a colleague, Paweł Przewłocki, are working on improving the software side of the detector; they created a website for project information and data acquisition. And in collaboration with NCBJ’s education and training division, Frankiewicz and Przewłocki are about to start a new educational program for high school students using 20 detectors that NCBJ and MIT built together.

Now that we have a unique detector, an international group of enthusiastic scientists, and lots of experience helping students build desktop muon detectors, we are ready to launch the CosmicWatch program. This summer our goal is to produce the first set of 100 kits, which we will use to teach a class on particle detection and astrophysics for incoming students at the Wisconsin IceCube Particle Astrophysics Center and NCBJ. Some of those detectors will be sent to local high schools for teachers to use in demonstrations. Instructors could measure the angular dependence of the cosmic-ray-muon flux, demonstrate relativistic effects with a high-altitude measurement, and conduct muon tomography. Over the winter we will move to the next generation of detectors, which will have single-photon detection and hardware-coincidence capabilities, an SD card reader, and environmental sensors.
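For the angular-dependence exercise, the usual textbook expectation (not a figure from the article) is that the sea-level muon rate falls off roughly as the square of the cosine of the zenith angle, so a classroom measurement can be compared against a sketch like this:

    # Expected relative muon rate versus zenith angle (textbook cos^2 approximation).
    import math

    def relative_rate(zenith_degrees):
        """Count rate relative to a detector pointing straight up."""
        return math.cos(math.radians(zenith_degrees)) ** 2

    for angle in (0, 30, 60, 90):
        print(angle, round(relative_rate(angle), 2))   # 1.0, 0.75, 0.25, 0.0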

We are not alone in the community of cosmic-ray-muon programs. Upon developing the detector, we discovered that several other groups are working toward a similar goal. We are hoping to collaborate with them to expand on what we’ve designed. As the project grows, we hope to be able to use the detectors for useful physics measurements. One idea is to install the detectors on planes and ships to map out cosmic-ray fluxes throughout the world. Of course, that would require further R&D and therefore more funding.

The airplane conversations regarding my strange little USB device typically end here. But I’m able to capture my questioners’ attention at least one last time when I show them the measurement of the cosmic-ray-muon rate, shown in the graph below. The beauty of a good muon detector—even a small, cheap one—is that it transforms a fundamental but invisible aspect of nature into something we can see.


I used one of the detectors to measure the absolute rate of muons on a flight from Boston to Chicago. (The x-axis shows seconds.) As the airplane climbed to a cruising altitude of 9144 m, the drizzle of muons turned into a downpour.
Spencer N. Axani is a graduate student at MIT working with Janet Conrad. He earned an undergraduate degree in physics from the University of Alberta.
(FULL STORY)

Groundbreaking discovery confirms existence of orbiting supermassive black holes
[6/28/2017]
For the first time ever, astronomers at The University of New Mexico say they’ve been able to observe and measure the orbital motion between two supermassive black holes hundreds of millions of light years from Earth – a discovery more than a decade in the making.

UNM Department of Physics & Astronomy graduate student Karishma Bansal is the first-author on the paper, ‘Constraining the Orbit of the Supermassive Black Hole Binary 0402+379’, recently published in The Astrophysical Journal. She, along with UNM Professor Greg Taylor and colleagues at Stanford, the U.S. Naval Observatory and the Gemini Observatory, have been studying the interaction between these black holes for 12 years.

“For a long time, we’ve been looking into space to try and find a pair of these supermassive black holes orbiting as a result of two galaxies merging,” said Taylor. “Even though we’ve theorized that this should be happening, nobody had ever seen it until now.”

In early 2016, an international team of researchers, including a UNM alumnus, working on the LIGO project detected the existence of gravitational waves, confirming Albert Einstein’s 100-year-old prediction and astonishing the scientific community. These gravitational waves were the result of two stellar-mass black holes (~30 solar masses each) colliding in space within the Hubble time. Now, thanks to this latest research, scientists will be able to start to understand what leads up to the merger of supermassive black holes that creates ripples in the fabric of space-time and begin to learn more about the evolution of galaxies and the role these black holes play in it.

“Even though we’ve theorized that this should be happening, nobody had ever seen it until now.” – Professor Greg Taylor, UNM Department of Physics & Astronomy
Using the Very Long Baseline Array (VLBA), a system made up of 10 radio telescopes across the U.S. and operated in Socorro, N.M., researchers have been able to observe several frequencies of radio signals emitted by these supermassive black holes (SMBH). Over time, astronomers have essentially been able to plot their trajectory and confirm them as a visual binary system. In other words, they’ve observed these black holes in orbit with one another.

“When Dr. Taylor gave me this data I was at the very beginning of learning how to image and understand it,” said Bansal. “And, as I learned there was data going back to 2003, we plotted it and determined they are orbiting one another. It’s very exciting.”

For Taylor, the discovery is the result of more than 20 years of work and an incredible feat given the precision required to pull off these measurements. At roughly 750 million light years from Earth, the galaxy named 0402+379 and the supermassive black holes within it are incredibly far away, but they are also at the perfect distance from Earth and from each other to be observed.

Bansal says these supermassive black holes have a combined mass of 15 billion times that of our sun, or 15 billion solar masses. The unbelievable size of these black holes means their orbital period is around 24,000 years, so while the team has been observing them for over a decade, they’ve yet to see even the slightest curvature in their orbit.

“If you imagine a snail on the recently-discovered Earth-like planet orbiting Proxima Centauri – 4.243 light years away – moving at 1 cm a second, that's the angular motion we're resolving here,” said Roger W. Romani, professor of physics at Stanford University and member of the research team.
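Putting a number on that comparison (the conversion to arcseconds per year is an illustrative addition, using the figures quoted above) shows why microarcsecond-level astrometry is required:

    # Angular rate of a snail crawling at 1 cm/s seen from 4.243 light-years away.
    light_year_m = 9.461e15
    distance_m = 4.243 * light_year_m
    snail_speed_m_per_s = 0.01

    radians_per_second = snail_speed_m_per_s / distance_m
    arcsec_per_year = radians_per_second * 206265 * 3.156e7
    print(arcsec_per_year)    # ~1.6e-6 arcseconds per year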

“What we’ve been able to do is a true technical achievement over this 12-year period using the VLBA to achieve sufficient resolution and precision in the astrometry to actually see the orbit happening,” said Taylor. “It’s a bit of triumph in technology to have been able to do this.”

While the technical accomplishment of this discovery is truly amazing, Bansal and Taylor say the research could also teach us a lot about the universe, where galaxies come from and where they’re going.

"The orbits of binary stars provided tremendous insights about stars,” said Bob Zavala, an astronomer with the U.S. Naval Observatory. “Now we'll be able to use similar techniques to understand super-massive black holes and the galaxies they reside within."

Continuing to observe the orbit and interaction of these two supermassive black holes could also help us gain a better understanding of what the future of our own galaxy might look like. Right now, the Andromeda galaxy, which also has an SMBH at its center, is on a path to collide with our Milky Way, meaning the event Bansal and Taylor are currently observing might occur in our galaxy in a few billion years.

“Supermassive black holes have a lot of influence on the stars around them and the growth and evolution of the galaxy,” explained Taylor. “So, understanding more about them and what happens when they merge with one another could be important for our understanding for the universe.”

Bansal says the research team will take another observation of this system in three or four years to confirm the motion and obtain a precise orbit. In the meantime, the team hopes that this discovery will encourage related work from astronomers around the world.
(FULL STORY)

NASA's Kepler Space Telescope Finds Hundreds of New Exoplanets, Boosts Total to 4,034
[6/19/2017]
NASA has unveiled the complete set of data from the first four years of the agency's Kepler Space Telescope mission, which stared at a single patch of the sky in the search for alien planets. The result: Kepler has discovered 219 new candidates since NASA's last data unveiling, including 10 near-Earth-size planet candidates in the so-called habitable zone around their stars where the conditions are just right for liquid water to exist on a planet's surface — a key feature in the search for habitable worlds.

The new discoveries boost Kepler's total to 4,034 candidate planets during its mission, 2,335 of which were later confirmed by follow-up observations, NASA officials said in a statement. The 10 newfound potentially Earth-size worlds bring Kepler's total up to 50 of that type of exoplanet, with more than 30 of those being confirmed, NASA officials said during a briefing today (June 19).

The researchers also revealed a surprising divide between small, Earth-like planets and mini-Neptunes gleaned from the data. [From the Exoplanet Archive: How NASA Keeps Track of Alien Worlds]

The planets characterized by NASA's Kepler mission (yellow dots) and other surveys split into several different broad planet types. Future exoplanet surveys will reveal small planets orbiting further from their stars in the corner marked "frontier".
Credit: NASA/Ames Research Center/Natalie Batalha/Wendy Stenzel

"With this catalog we're able to extend [our analysis of planets' demographics] out to the longest periods, those periods that are most similar to our Earth," said Susan Thompson, a Kepler research scientist for the SETI Institute in California and lead author on the new catalog study.

"As a result, this survey catalog will be the foundation for directly answering one of astronomy's most compelling questions: How many planets like our Earth are actually in the galaxy?"

According to the researchers, Kepler discovered more than 80 percent of all planet candidates and confirmed exoplanets ever found. This catalog is the final release of data from Kepler's four-year primary mission, which examined a narrow patch of sky in the Cygnus constellation. Kepler launched in 2009, and completed its primary mission in 2013. Now, it's in an extended mission known as K2.

To find planets, Kepler used the transit method: The space telescope tracked stars over a long period of time so scientists could identify when a star dimmed briefly, which could indicate a planet crossing between the star and Earth.

That process discovered potential planets like the newly found KOI 7711 (KOI is short for Kepler object of interest), an exoplanet that appears very much like Earth — just 1.3 times Earth's radius, in an orbit that lets the planet receive about as much radiation as Earth gets from the sun. For KOI 7711 and the other planets, the fraction by which the star dimmed let researchers determine the planet's size, and the timing of the repeated dimmings revealed its orbit.
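
A rough numerical sketch of that arithmetic, in Python; the inputs are made-up values chosen only to mimic a KOI-7711-like signal, and this is not the Kepler pipeline itself:

    import math

    # Minimal sketch of transit-method arithmetic (not the actual Kepler pipeline).
    # Assumed inputs: a fractional transit depth and the time between dimmings.
    SUN_RADIUS_KM = 696_000
    EARTH_RADIUS_KM = 6_371

    def planet_radius_from_depth(depth_fraction, star_radius_km=SUN_RADIUS_KM):
        """Transit depth ~ (R_planet / R_star)^2, so R_planet ~ R_star * sqrt(depth)."""
        return star_radius_km * math.sqrt(depth_fraction)

    # Hypothetical numbers chosen to resemble a planet of ~1.3 Earth radii.
    depth = (1.3 * EARTH_RADIUS_KM / SUN_RADIUS_KM) ** 2   # fraction of starlight blocked
    period_days = 300.0                                    # time between successive dimmings

    radius_km = planet_radius_from_depth(depth)
    print(f"Transit depth: {depth:.2e}")
    print(f"Inferred planet radius: {radius_km / EARTH_RADIUS_KM:.2f} Earth radii")
    print(f"Orbital period (from dimming interval): {period_days} days")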

To determine which dimmings of the 200,000 stars observed by Kepler were likely to be planets, the data went through an intensive vetting process. As Thompson described, about 34,000 signals were found — both transiting planets and noise that could have come from the camera or star itself. After vetting, the total came down to about 4,000 candidates, 50 of which were Earth-size and in the habitable zone.

The researchers then put simulated transits into the data and recorded how many were actually picked up by the software — determining how many transits the process might have missed. And they put noise through the process, too, checking how many were marked as transiting planets — so they knew how many planets were likely to be false alarms. [NASA's Planet-Hunting Kepler Explained (Infographic)]
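
The completeness-and-reliability bookkeeping described above can be illustrated with a toy calculation; every count below is a placeholder, not a value from the Kepler catalog:

    # Toy illustration of the injection/recovery bookkeeping described above.
    # All counts are made-up placeholders, not values from the Kepler catalog.

    injected_transits = 10_000        # simulated transits inserted into the data
    injected_recovered = 9_000        # how many the vetting software flagged

    noise_signals_injected = 10_000   # pure-noise signals pushed through the same vetting
    noise_flagged_as_planets = 200    # false alarms that survived vetting

    completeness = injected_recovered / injected_transits      # fraction of real transits found
    false_alarm_rate = noise_flagged_as_planets / noise_signals_injected

    candidates_observed = 4_000
    # Rough correction: scale up for missed planets, scale down for likely false alarms.
    estimated_true_planets = candidates_observed * (1 - false_alarm_rate) / completeness

    print(f"Completeness: {completeness:.1%}")
    print(f"False-alarm rate: {false_alarm_rate:.1%}")
    print(f"Corrected planet estimate: {estimated_true_planets:.0f}")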

The eighth Kepler planet catalog includes 10 new planet candidates that are less than twice the size of Earth and orbit in their stars' habitable zones. Here, 49 such planets from the full catalog are graphed.
Credit: NASA/Ames Research Center/Wendy Stenzel
During the briefing, researchers also discussed a surprising distinction they found between super-Earths, which are rocky planets with thin atmospheres, up to about 1.75 times Earth's size, and mini-Neptunes that form dense gas balls 2 to 3.5 times the size of Earth.

A research group used the Keck Observatory in Hawaii to gauge the size of 1,300 stars measured by Kepler, which allowed them to more precisely pinpoint the stars' sizes — and therefore the size of their potential planets. They found that while researchers had thought there was a smooth population containing the whole range of sizes between 1 and 4 times that of Earth, there was a much sharper divide.

"This is a major new division in the family tree of exoplanets, somewhat analogous to the discovery that mammals and lizards are separate branches on the tree of life," said Benjamin Fulton, a researcher at the University of Hawaii in Manoa and the California Institute of Technology and lead author on the Keck study.

Researchers combining data from the Keck telescope in Hawaii and the Kepler space telescope found that there's a sharp divide between super-Earths and mini-Neptunes.
Credit: NASA/Ames Research Center/JPL-Caltech/R. Hurt
That sharp divide likely comes from the planet formation process, Fulton said: Planets' rocky cores form from smaller pieces, and then the protoplanet's gravity attracts hydrogen and helium gas. A little bit of gas makes the planet much bigger, putting it on the mini-Neptune side of things. Planets in the middle, Fulton said, can suffer a setback that puts them back on the rocky super-Earth side of things: The newfound atmosphere can be baked away if the star is too close by or there's not enough to start with.

While the Kepler data set provides the best-ever glimpse of exoplanet demographics for one slice of the sky, future telescopes — like NASA's Transiting Exoplanet Survey Satellite set to launch in 2018 — will allow researchers to follow up on these Kepler finds to characterize the planets even more. They may someday even take direct images of exoplanets with tools like Hubble Space Telescope's successor, the James Webb Space Telescope (also set to launch in 2018). Plus, additional data from Kepler's current K2 mission will give researchers a glimpse into what things look like in other parts of the sky, revealing planets around star clusters of different ages, with different iron contents, and many more low-mass stars than Kepler saw the first time around, the researchers said.

"It feels a bit like the end of an era, but actually I see it as a new beginning," Thompson said. "It's amazing the things that Kepler has found. It has shown us these terrestrial worlds, and we still have all this work to do to really understand how common Earths are in the galaxy."

"I'm really excited to see what people are going to do with this catalog, because this is the first time we have a population that is really well-characterized and we can now do these statistical studies and really start to understand the Earth analogues out there," she added.


- See more at: https://www.space.com/37242-nasa-kepler-alien-planets-habitable-worlds-catalog.html#sthash.hQwV1yUi.dpuf
(FULL STORY)

China’s quantum satellite achieves ‘spooky action’ at record distance
[6/15/2017]
Quantum entanglement—physics at its strangest—has moved out of this world and into space. In a study that shows China's growing mastery of both the quantum world and space science, a team of physicists reports that it sent eerily intertwined quantum particles from a satellite to ground stations separated by 1200 kilometers, smashing the previous world record. The result is a stepping stone to ultrasecure communication networks and, eventually, a space-based quantum internet.

"It's a huge, major achievement," says Thomas Jennewein, a physicist at the University of Waterloo in Canada. "They started with this bold idea and managed to do it."

Entanglement involves putting objects in the peculiar limbo of quantum superposition, in which an object's quantum properties occupy multiple states at once: like Schrödinger's cat, dead and alive at the same time. Then those quantum states are shared among multiple objects. Physicists have entangled particles such as electrons and photons, as well as larger objects such as superconducting electric circuits.

Theoretically, even if entangled objects are separated, their precarious quantum states should remain linked until one of them is measured or disturbed. That measurement instantly determines the state of the other object, no matter how far away. The idea is so counterintuitive that Albert Einstein mocked it as "spooky action at a distance."

Starting in the 1970s, however, physicists began testing the effect over increasing distances. In 2015, the most sophisticated of these tests, which involved measuring entangled electrons 1.3 kilometers apart, showed once again that spooky action is real.

Beyond the fundamental result, such experiments also point to the possibility of hack-proof communications. Long strings of entangled photons, shared between distant locations, can be "quantum keys" that secure communications. Anyone trying to eavesdrop on a quantum-encrypted message would disrupt the shared key, alerting everyone to a compromised channel.

But entangled photons degrade rapidly as they pass through the air or optical fibers. So far, the farthest anyone has sent a quantum key is a few hundred kilometers. "Quantum repeaters" that rebroadcast quantum information could extend a network's reach, but they aren't yet mature. Many physicists have dreamed instead of using satellites to send quantum information through the near-vacuum of space. "Once you have satellites distributing your quantum signals throughout the globe, you've done it," says Verónica Fernández Mármol, a physicist at the Spanish National Research Council in Madrid. "You've leapfrogged all the problems you have with losses in fibers."


Jian-Wei Pan, a physicist at the University of Science and Technology of China in Shanghai, got the chance to test the idea when the Micius satellite, named after an ancient Chinese philosopher, was launched in August 2016. The satellite is the foundation of the $100 million Quantum Experiments at Space Scale program, one of several missions that China hopes will make it a space science power on par with the United States and Europe.

In their first experiment, the team sent a laser beam into a light-altering crystal on the satellite. The crystal emitted pairs of photons entangled so that their polarization states would be opposite when one was measured. The pairs were split, with photons sent to separate receiving stations in Delingha and Lijiang, 1200 kilometers apart. Both stations are in the mountains of Tibet, reducing the amount of air the fragile photons had to traverse. This week in Science, the team reports simultaneously measuring more than 1000 photon pairs. They found the photons had opposite polarizations far more often than would be expected by chance, thus confirming spooky action over a record distance (though the 2015 test over a shorter distance was more stringent).
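
For readers who want the flavor of such a polarization-correlation test, here is a toy CHSH-style Bell estimate in Python; the coincidence counts are invented, and the Micius team's actual statistic and analysis differ in detail:

    # Toy CHSH-style correlation estimate from coincidence counts.
    # The counts below are invented; the published analysis differs in detail.

    def correlation(n_pp, n_pm, n_mp, n_mm):
        """E = (N++ + N-- - N+- - N-+) / N_total for one pair of analyzer settings."""
        total = n_pp + n_pm + n_mp + n_mm
        return (n_pp + n_mm - n_pm - n_mp) / total

    # Hypothetical coincidence counts for the four setting pairs (a,b), (a,b'), (a',b), (a',b').
    E_ab   = correlation(250, 40, 35, 245)
    E_abp  = correlation(240, 45, 50, 235)
    E_apb  = correlation(245, 50, 40, 240)
    E_apbp = correlation(45, 240, 250, 40)

    # CHSH combination: |S| <= 2 for any local classical model, up to ~2.83 for entangled photons.
    S = abs(E_ab + E_abp + E_apb - E_apbp)
    print(f"CHSH S = {S:.2f}  (classical limit 2, quantum limit ~2.83)")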

The team had to overcome many hurdles, including keeping the beams of photons focused on the ground stations as the satellite hurtled through space at nearly 8 kilometers per second. "Showing and demonstrating it is quite a challenging task," says Alexander Ling, a physicist at the National University of Singapore. "It's very encouraging." However, Ling notes that Pan's team recovered only about one photon out of every 6 million sent from the satellite—far better than ground-based experiments but still far too few for practical quantum communication.

Pan expects China's National Space Science Center to launch additional satellites with stronger and cleaner beams that could be detected even when the sun is shining. (Micius operates only at night.) "In the next 5 years we plan to launch some really practical quantum satellites," he says. In the meantime, he plans to use Micius to distribute quantum keys to Chinese ground stations, which will require longer strings of photons and additional steps. Then he wants to demonstrate intercontinental quantum key distribution between stations in China and Austria, which will require holding one half of an entangled photon pair on board until the Austrian ground station appears within view of the satellite. He also plans to teleport a quantum state—a technique for transferring quantum-encoded information without moving an actual object—from a third Tibetan observatory to the satellite.

Other countries are inching toward quantum space experiments of their own. Ling is teaming up with physicists in Australia to send quantum information between two satellites, and the Canadian Space Agency recently announced funding for a small quantum satellite. European and U.S. teams are also proposing putting quantum instruments on the International Space Station. One goal is to test whether entanglement is affected by a changing gravitational field, by comparing a photon that stays in the weaker gravitational environment of orbit with an entangled partner sent to Earth, says Anton Zeilinger, a physicist at the Austrian Academy of Sciences in Vienna. "There are not many experiments which test links between gravity and quantum physics."

The implications go beyond record-setting demonstrations: A network of satellites could someday connect the quantum computers being designed in labs worldwide. Pan's paper "shows that China is making the right decisions," says Zeilinger, who has pushed the European Space Agency to launch its own quantum satellite. "I'm personally convinced that the internet of the future will be based on these quantum principles."

DOI: 10.1126/science.aan6972
(FULL STORY)

Scientists make waves with black hole research
[6/14/2017]
Scientists at the University of Nottingham have made a significant leap forward in understanding the workings of one of the mysteries of the universe. They have successfully simulated the conditions around black holes using a specially designed water bath.

Their findings shed new light on the physics of black holes with the first laboratory evidence of the phenomenon known as superradiance, achieved using water and a wave generator.
The research - Rotational superradiant scattering in a vortex flow - has been published in Nature Physics. It was undertaken by a team in the Quantum Gravity Laboratory in the School of Physics and Astronomy.
The work was led by Silke Weinfurtner from the School of Mathematical Sciences. In collaboration with an interdisciplinary team, she designed and built the black hole 'bath' and the measurement system used to simulate black hole conditions.
Dr Weinfurtner said: "This research has been particularly exciting to work on as it has brought together the expertise of physicists, engineers and technicians to achieve our common aim of simulating the conditions of a black hole and proving that superradiance exists. We believe our results will motivate further research on the observation of superradiance in astrophysics."
What is superradiance?
The Nottingham experiment was based on the theory that the region immediately outside the event horizon of a rotating black hole - the gravitational point of no return - is dragged around by the rotation. Any wave that enters this region, but does not stray past the event horizon, should be deflected and emerge with more energy than it carried on the way in - an effect known as superradiance.
Superradiance - the extraction of energy from a rotating black hole - is also known as the Penrose mechanism; Hawking radiation is a quantum version of black-hole superradiance.
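One way to state the criterion numerically: a co-rotating wave mode with angular frequency ω and azimuthal number m is amplified whenever ω < mΩ, where Ω is the angular velocity of the rotating flow (or horizon). A minimal check in Python, with illustrative frequencies rather than values from the Nottingham experiment:

    import math

    # Minimal check of the rotational superradiance criterion omega < m * Omega.
    # The frequencies below are illustrative, not values from the experiment.
    def amplified(wave_freq_hz, m, rotation_freq_hz):
        omega = 2 * math.pi * wave_freq_hz
        Omega = 2 * math.pi * rotation_freq_hz
        return omega < m * Omega

    for f_wave in (2.0, 3.5, 5.0):
        print(f"{f_wave} Hz wave, m=2, flow rotating at 2 Hz -> superradiant: {amplified(f_wave, 2, 2.0)}")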
What's in the Black Hole Lab?
Dr Weinfurtner said: "Some of the bizarre black hole phenomena are hard, if not impossible, to study directly. This means there are very limited experimental possibilities. So this research is quite an achievement."
The 'flume' is a specially designed bath, 3m long, 1.5m wide and 50cm deep, with a hole in the centre. Water is pumped in a closed circuit to establish a rotating, draining flow. Once the water reached the desired depth, waves were generated at varied frequencies until the superradiant scattering effect was created, and the result was recorded using a specially designed 3D air-fluid interface sensor.
Tiny dots of white paper punched out by a specially adapted sewing machine were used to measure the flow field - the speed of the fluid flow around the analogue black hole.
It all started from humble beginnings
This research has been many years in the making. The initial idea for creating a superradiant effect with water started with a bucket and bidet. Dr Weinfurtner said: "This research has grown from humble beginnings. I had the initial idea for a water-based experiment when I was at the International School for Advanced Studies (SISSA) in Italy and I set up an experiment with a bucket and a bidet. However, when it caused a flood I was quickly found a lab to work in!"
After her postdoc, Dr Weinfurtner went on to work with Bill Unruh, the Canadian-born physicist who has made seminal contributions to our understanding of gravity, black holes, cosmology, quantum fields in curved spaces, and the foundations of quantum mechanics, including the discovery of the Unruh effect.
Her move to the University of Nottingham accelerated her research as she was able to set up her own research group with support from the machine shop in the School of Physics and Astronomy.
More information: Theo Torres et al, Rotational superradiant scattering in a vortex flow, Nature Physics (2017). DOI: 10.1038/nphys4151


Read more at: https://phys.org/news/2017-06-scientists-black-hole.html#jCp
(FULL STORY)

We Live in a Cosmic Void, Another Study Confirms
[6/14/2017]
Earth and its parent galaxy are living in a cosmic desert — a region of space largely devoid of other galaxies, stars and planets, according to a new study.

The findings confirm the results of a previous study based on observations taken in 2013. That previous study showed that Earth's galaxy, the Milky Way, is part of a so-called cosmic void. These voids are part of the large-scale structure of the universe, which looks sort of like a block of Swiss cheese, made up of dense filaments containing huge collections of galaxies surrounding relatively empty regions.

The KBC void

The cosmic void that contains the Milky Way is dubbed the Keenan, Barger and Cowie (KBC) void, after the three astronomers who identified it in the 2013 study. It is the largest cosmic void ever observed — about seven times larger than the average void, with a radius of about 1 billion light-years, according to the study.

The KBC void is shaped like a sphere, and is surrounded by a shell of galaxies, stars and other matter. The new study shows this model of the KBC void is not ruled out based on additional observational data, Amy Barger, an observational cosmologist at the University of Wisconsin-Madison who was involved with both studies, said in a statement from the university.

Barger's undergraduate student who led the study, Benjamin Hoscheit, spoke about their work at the American Astronomical Society meeting in Austin, Texas, on June 6.

Hoscheit sought an efficient way to verify the results of the 2013 study in a shorter time span. That earlier work was led by Ryan Keenan, who was Barger's doctoral student at the University of Hawaii at the time.

Whereas Keenan's work measured the density of different areas of the universe using galaxy catalogs, Hoscheit verified the work using a measurement called the kinematic Sunyaev-Zel'dovich (kSZ) effect, which measures the motions of galaxy clusters within the cosmic web.

The kSZ effect looks at photons coming from the cosmic microwave background (CMB), or light left over from an early stage in the universe's evolution. As the distant CMB photons pass through galaxy clusters, the photons shift in energy. This shift in energy shows how the galaxy clusters are moving, Hoscheit said.

Galaxy clusters that exist in a cosmic void should be attracted to regions with stronger gravity. That would be revealed in how fast these galaxy clusters move through space, Hoscheit said. But if the clusters were moving more slowly than expected, then perhaps the conclusions of the previous study would need to be rethought, he said. However, the kSZ effect on the clusters was consistent with that in the 2013 study, Hoscheit added.
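
In rough form, the kinematic Sunyaev-Zel'dovich signal is a fractional temperature shift ΔT/T ≈ -τ(v/c), where τ is the cluster's optical depth to electron scattering and v is its line-of-sight velocity. A small Python sketch with assumed values, not numbers from Hoscheit's analysis:

    # Rough kSZ arithmetic: delta_T ~ -tau * (v_los / c) * T_cmb.
    # tau and v_los below are illustrative values, not measurements from the study.

    C_M_PER_S = 299_792_458.0
    T_CMB_K = 2.725

    def ksz_temperature_shift(tau, v_los_m_per_s):
        """Absolute CMB temperature shift (kelvin) from the kinematic SZ effect."""
        return -tau * (v_los_m_per_s / C_M_PER_S) * T_CMB_K

    tau = 5e-3               # optical depth through a massive cluster (assumed)
    v_los = 300_000.0        # 300 km/s line-of-sight velocity, receding (assumed)

    dT = ksz_temperature_shift(tau, v_los)
    print(f"kSZ shift: {dT * 1e6:.1f} microkelvin")   # of order a few microkelvin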

Original article on Space.com.
(FULL STORY)

Scientists Finally Witnessed a Phenomenon That Einstein Thought “Impossible”
[6/11/2017]
Astronomers have observed gravitational microlensing by a star other than the sun for the first time. Predicted by Einstein as part of his theory of general relativity, the effect could help astronomers measure the mass of distant stars from the gravitational deflection of background starlight.
(FULL STORY)

Charmed Existence: Mysterious Particles Could Reveal Mysteries of the Big Bang
[6/9/2017]
A mysterious particle created in a blazing fireball at an atom smasher is misbehaving, a new experiment shows.

The particle, called a charm quark, revealed surprising interactions with its neighboring subatomic particles, measurements show. That discovery could improve scientists' understanding of the conditions that existed soon after the Big Bang, when the universe was permeated by a primordial soup of elementary particles, and possibly show hints of physics beyond what scientists know today. [Wacky Physics: The Coolest Little Particles in Nature]

Back to the beginning

The surprising charm-quark behavior was first spotted at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC) in Upton, New York, which aims to recreate conditions in the trillionths of a second after the Big Bang. The key to the new observation is the Heavy Flavor Tracker (HFT), a set of recently installed ultrasensitive photodetectors similar to those in digital cameras. Using the HFT, for the first time, researchers directly measured the behavior of charm quarks as they emerged from the trillion-degree fireball meant to recreate the universe's first moments.

To recreate these primeval conditions, the RHIC fires gold atoms at one another at nearly the speed of light. As they collide, the atoms break up into a soup of elementary, free-flowing particles known as a quark-gluon plasma. Quarks make up more familiar particles, like protons and neutrons, while gluons are the carriers of the strong nuclear force that holds the quarks together.

The measurements tell the physicists whether their models of fields that bind together quarks and gluons, based on a theory called quantum chromodynamics, are correct, according to a new study detailing the findings.

"You can study how nuclear medium behaves and functions at these high temperatures," Brookhaven National Laboratory physicist Flemming Videbaek, a coauthor of the study, told Live Science.

Heavy interactions

Quarks and their antimatter counterparts come in six varieties, known to physicists as "flavors": up, down, top, bottom, strange and charm. They have different masses; the up and down quarks that make up protons and neutrons are the lightest. Charm quarks are the third heaviest, behind the top and bottom quarks. They never form under ordinary conditions on Earth; a particle accelerator is necessary to make them. [7 Strange Facts About Quarks]

Albert Einstein's famous E = mc² equation says energy and mass are the same thing, and when the atomic nuclei collide in the RHIC, the energy is so great that it creates heavier, exotic particles, such as charm quarks.

One of the particles formed by this fiery collision is the D-zero, made up of a charm quark and an anti-up quark. The D-zeros travel for a fraction of a millimeter before they decay and become two other particles: kaons and pions. It's the kaons and pions that the experimenters actually "see" with the HFT.
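
As a rough check on the numbers in the two preceding paragraphs, the sketch below converts an approximate charm-quark mass into joules and turns the D-zero's mean lifetime into a decay length; both inputs are textbook approximations, not results from this experiment:

    # Rough check of the preceding paragraphs. The charm-quark mass (~1.27 GeV/c^2) and
    # D-zero mean lifetime (~0.41 picoseconds) are approximate textbook values,
    # not numbers taken from the RHIC analysis.
    GEV_TO_J = 1.602176634e-10      # one GeV expressed in joules
    C = 299_792_458.0               # speed of light, m/s

    charm_rest_energy_j = 1.27 * GEV_TO_J          # E = m c^2 for one charm quark
    d0_lifetime_s = 0.41e-12
    d0_decay_length_mm = C * d0_lifetime_s * 1e3   # c * tau, ignoring time dilation

    print(f"Charm-quark rest energy: {charm_rest_energy_j:.2e} J (~1.27 GeV)")
    print(f"D-zero decay length c*tau: {d0_decay_length_mm:.3f} mm (a fraction of a millimeter)")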

What surprised the researchers was that the flow of quark-gluon plasma caught the heavy D-zero particles. The football-shaped fireball emitted more D-zeros from the wider part than from the ends, rather than in an evenly distributed way. Previous models predicted that the D-zero, which contains the heavy charm quark, was too massive to interact with the quarks and gluons in the plasma. According to those models, its mass would mean the D-zero barreled out too quickly, before the plasma's forces could act on it, and the plasma would not last long enough to produce much interaction.

Instead, the quark-gluon plasma has a low viscosity; compared with an ordinary fluid, it flows almost freely, Videbaek said.

"The fact that it has a low viscosity means that it interacts [with the particles] quite a bit," Videbaek said. That means "some of the models were quite far off."

In addition to helping scientists refine their models, the charm quarks revealed more details about how the quark-gluon plasma behaves. Knowing more about what such plasmas actually do helps scientists understand what to look for if they seek out new physical laws, and helps them understand the implications of the ones they know already.

In future experiments, the team hopes to gain insight into the behavior of other heavy and rare particles made up of quarks, such as the B (or "beauty") meson, which is made of a bottom quark and one of its lighter cousins, Videbaek said.

The study was published May 26 in the journal Physical Review Letters.

Originally published on Live Science.
(FULL STORY)

A New State of Matter is Discovered – And It’s Strange
[6/8/2017]
A researcher has shown that a new state of matter, reached through a phase transition, is possible in our 3D universe at low temperatures in "disordered" materials like glass. This discovery will shape future research on these materials.
(FULL STORY)

A Theory of Reality as More Than the Sum of Its Parts
[6/1/2017]
New math shows how, contrary to conventional scientific wisdom, conscious beings and other macroscopic entities might have greater influence over the future than does the sum of their microscopic components.
(FULL STORY)

Dark Energy May Lurk in the Nothingness of Space
[5/26/2017]
A new study may help reveal the nature of dark energy, the mysterious substance that is pushing the universe to expand outward. Dark energy may emerge from fluctuations in the nothingness of empty space, a new hypothesis suggests.

That idea, in turn, could also explain why the cosmological constant, a mathematical constant that Albert Einstein conjured up yet famously called "the biggest blunder of his life," takes the value it does. [8 Ways You Can See Einstein's Theory of Relativity in Real Life]

The new study proposed that the expansion is driven by fluctuations in the energy carried by the vacuum, or regions of space devoid of matter. The fluctuations create pressure that forces space itself to expand, making matter and energy less dense as the universe ages, said study co-author Qingdi Wang, a doctoral student at the University of British Columbia (UBC) in Canada.

Accelerating universe

Scientists call the force that pushes the universe to expand a cosmological constant (though it isn't a "force" in the strict sense). This constant is the energy density of space itself. If it is greater than zero, then Einstein's equations of relativity, which describe the structure of space-time, imply an expanding universe. In the late 1990s, measurements of distant supernovas showed that the universe was accelerating, not just expanding. Cosmologists call the energy that drives that acceleration dark energy. Whatever dark energy is, it dissipates more slowly than matter or dark matter, and doesn't clump together the way either of them do under the influence of gravity.

This acceleration has been a big quandary for physicists, because it contradicts the predictions of quantum field theories, the theoretical frameworks that describe the interactions of the tiniest subatomic particles. Quantum field theories predict vacuum energies that are so large that the universe shouldn't exist at all, said Lucas Lombriser, postdoctoral fellow at the Royal Observatory, Edinburgh, in Scotland, who was not involved in the new study. This discrepancy is called the "old" cosmological constant problem, and physicists generally thought that once new physics was discovered, the cosmological constant would disappear; expansion would be explained in some other way.

However, when scientists discovered the accelerated expansion, a new problem arose. According to theoretical calculations, the cosmological constant should be 50 to 120 orders of magnitude larger than it is, with a correspondingly large rate of expansion, Lombriser said.

Essentially, the energy density of the universe (how much energy there is per unit volume) should be gigantic, and it clearly isn't.
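
The mismatch can be made concrete with order-of-magnitude numbers. The sketch below compares a naive quantum-field-theory estimate (one Planck energy per Planck-length cube) with the observed dark-energy density; both inputs are rough values chosen for illustration:

    import math

    # Rough version of the "cosmological constant problem" arithmetic.
    # All inputs are order-of-magnitude values, not precise measurements.
    PLANCK_LENGTH_M = 1.62e-35
    PLANCK_ENERGY_J = 1.96e9          # ~1.22e19 GeV expressed in joules

    # Naive quantum-field-theory estimate: one Planck energy per Planck-length cube.
    rho_qft = PLANCK_ENERGY_J / PLANCK_LENGTH_M**3

    # Observed dark-energy density, roughly 70% of the critical density (approximate).
    rho_observed = 6e-10              # joules per cubic metre

    gap = math.log10(rho_qft / rho_observed)
    print(f"Naive QFT vacuum energy density: {rho_qft:.1e} J/m^3")
    print(f"Observed dark-energy density:    {rho_observed:.1e} J/m^3")
    print(f"Discrepancy: ~{gap:.0f} orders of magnitude")

With a Planck-scale cutoff this lands near the upper end of the 50-to-120-orders range quoted above; lower cutoffs give smaller, but still enormous, gaps.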

Fluctuations in empty space

The new work addresses not only what dark energy is but why the rate of universal expansion has the value it does.

"Everybody wants to know what dark energy is," Wang told Live Science. "I reconsidered this question more carefully," from the perspective of the universe's energy density.

Wang and his colleagues assumed that modern quantum field theory was correct about the energy density being very large, but that the vacuum fluctuations, or the movements of empty space, were very large on tiny scales, near what is called the Planck length, or 1.62 × 10^-35 meters. That's so small that a proton is 100 million trillion times bigger.

"Every point in space is going through expansion and contraction," he said. "But it looks smooth just like a table looks smooth from far away."

The vacuum fluctuations, in Wang's formulation, are like children on a swing pumping their legs. Even though nobody is pushing them, they manage to impart extra energy to the swing, making the swing go up higher than it would otherwise. This phenomenon is called parametric resonance, which basically means that some piece of the system — the expansion and contraction, or the swinging of the child's legs — changes with time. In this case, the density of a very tiny portion of the universe is changing, Wang said.
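
A toy parametric oscillator shows the same effect: modulate an oscillator's frequency at twice its natural frequency and the amplitude grows even though nothing pushes it directly. The sketch below is purely illustrative and is not Wang's cosmological model:

    import math

    # Toy parametric oscillator: x'' + omega(t)^2 x = 0, with omega^2 modulated at
    # twice the natural frequency ("pumping the swing"). Illustrative only.
    omega0 = 1.0          # natural frequency
    epsilon = 0.2         # modulation strength
    dt = 0.001
    x, v = 0.01, 0.0      # small initial displacement, no initial velocity

    peak = abs(x)
    t = 0.0
    while t < 60.0:
        omega_sq = omega0**2 * (1.0 + epsilon * math.cos(2.0 * omega0 * t))
        a = -omega_sq * x
        v += a * dt       # semi-implicit Euler step
        x += v * dt
        peak = max(peak, abs(x))
        t += dt

    print(f"Initial amplitude: 0.01, peak amplitude after 60 time units: {peak:.3f}")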

Since the fluctuations are little bits of the universe expanding and contracting, this tiny resonance adds up on cosmological scales, he said. So the universe expands. (Expansion and contraction of space doesn't violate conservation laws, because space itself is doing the expanding).

As a result of Wang's approach, there's no need for any new fields, as in some dark energy models. Instead the expansion of the universe is roughly the same as that already predicted by quantum field theory.

Observations needed

While Wang's idea is a good one, that doesn't mean it's the end of the story, Lombriser said. The question is whether observations of the universe bear the theory out, he said.

"So far, they can argue that the vacuum contribution is in the right ballpark for what is being observed (which, if it holds up, is already a huge success)," Lombriser said in an email. "They have not yet made an accurate prediction for the exact observed value, but this is something they intend to further investigate in their future work."

Other physicists are more skeptical.

"On these high-energy scales, classical general relativity doesn't work any longer, but that's what they use. So, their approximation is interesting, but it's not well-justified, because in this limit, one should be using quantum gravity (a theory which we don't have)," Sabine Hossenfelder, a research fellow at the Frankfurt Institute for Advanced Studies in Germany, told Live Science via email.

"This paper is simply a first step in the process," said study co-author William Unruh, a physicist at UBC. "But I think the path is worth pursuing, as our results are suggestive."

The study is published in the May 15 issue of the journal Physical Review D.
(FULL STORY)

What Happens When You Mix Thermodynamics and the Quantum World? A Revolution
[5/6/2017]
In his 1824 book, Reflections on the Motive Power of Fire, the 28-year-old French engineer Sadi Carnot worked out a formula for how efficiently steam engines can convert heat—now known to be a random, diffuse kind of energy—into work, an orderly kind of energy that might push a piston or turn a wheel. To Carnot’s surprise, he discovered that a perfect engine’s efficiency depends only on the difference in temperature between the engine’s heat source (typically a fire) and its heat sink (typically the outside air). Work is a byproduct, Carnot realized, of heat naturally passing to a colder body from a warmer one.
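
In modern notation, Carnot's bound is η = 1 - T_cold / T_hot, with temperatures measured in kelvin. A quick numerical sketch with assumed reservoir temperatures:

    # Carnot's bound: eta = 1 - T_cold / T_hot, temperatures in kelvin.
    # The example temperatures are assumptions chosen to resemble a steam engine.

    def carnot_efficiency(t_hot_k, t_cold_k):
        return 1.0 - t_cold_k / t_hot_k

    t_hot = 450.0    # boiler steam, ~177 C (assumed)
    t_cold = 300.0   # outside air, ~27 C (assumed)

    eta = carnot_efficiency(t_hot, t_cold)
    print(f"Maximum (Carnot) efficiency: {eta:.1%}")   # ~33% for these temperatures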



Carnot died of cholera eight years later, before he could see his efficiency formula develop over the 19th century into the theory of thermodynamics: a set of universal laws dictating the interplay among temperature, heat, work, energy and entropy—a measure of energy’s incessant spreading from more- to less-energetic bodies. The laws of thermodynamics apply not only to steam engines but also to everything else: the sun, black holes, living beings and the entire universe. The theory is so simple and general that Albert Einstein deemed it likely to “never be overthrown.”

Yet since the beginning, thermodynamics has held a singularly strange status among the theories of nature.

“If physical theories were people, thermodynamics would be the village witch,” the physicist Lídia del Rio and co-authors wrote last year in Journal of Physics A. “The other theories find her somewhat odd, somehow different in nature from the rest, yet everyone comes to her for advice, and no one dares to contradict her.”

Unlike, say, the Standard Model of particle physics, which tries to get at what exists, the laws of thermodynamics only say what can and can’t be done. But one of the strangest things about the theory is that these rules seem subjective. A gas made of particles that in aggregate all appear to be the same temperature—and therefore unable to do work—might, upon closer inspection, have microscopic temperature differences that could be exploited after all. As the 19th-century physicist James Clerk Maxwell put it, “The idea of dissipation of energy depends on the extent of our knowledge.”

In recent years, a revolutionary understanding of thermodynamics has emerged that explains this subjectivity using quantum information theory—“a toddler among physical theories,” as del Rio and co-authors put it, that describes the spread of information through quantum systems. Just as thermodynamics initially grew out of trying to improve steam engines, today’s thermodynamicists are mulling over the workings of quantum machines. Shrinking technology—a single-ion engine and three-atom fridge were both experimentally realized for the first time within the past year—is forcing them to extend thermodynamics to the quantum realm, where notions like temperature and work lose their usual meanings, and the classical laws don’t necessarily apply.

They’ve found new, quantum versions of the laws that scale up to the originals. Rewriting the theory from the bottom up has led experts to recast its basic concepts in terms of its subjective nature, and to unravel the deep and often surprising relationship between energy and information—the abstract 1s and 0s by which physical states are distinguished and knowledge is measured. “Quantum thermodynamics” is a field in the making, marked by a typical mix of exuberance and confusion.

“We are entering a brave new world of thermodynamics,” said Sandu Popescu, a physicist at the University of Bristol who is one of the leaders of the research effort. “Although it was very good as it started,” he said, referring to classical thermodynamics, “by now we are looking at it in a completely new way.”
Entropy as Uncertainty
In an 1867 letter to his fellow Scotsman Peter Tait, Maxwell described his now-famous paradox hinting at the connection between thermodynamics and information. The paradox concerned the second law of thermodynamics—the rule that entropy always increases—which Sir Arthur Eddington would later say “holds the supreme position among the laws of nature.” According to the second law, energy becomes ever more disordered and less useful as it spreads to colder bodies from hotter ones and differences in temperature diminish. (Recall Carnot’s discovery that you need a hot body and a cold body to do work.) Fires die out, cups of coffee cool and the universe rushes toward a state of uniform temperature known as “heat death,” after which no more work can be done.

The great Austrian physicist Ludwig Boltzmann showed that energy disperses, and entropy increases, as a simple matter of statistics: There are many more ways for energy to be spread among the particles in a system than concentrated in a few, so as particles move around and interact, they naturally tend toward states in which their energy is increasingly shared.

But Maxwell’s letter described a thought experiment in which an enlightened being—later called Maxwell’s demon—uses its knowledge to lower entropy and violate the second law. The demon knows the positions and velocities of every molecule in a container of gas. By partitioning the container and opening and closing a small door between the two chambers, the demon lets only fast-moving molecules enter one side, while allowing only slow molecules to go the other way. The demon’s actions divide the gas into hot and cold, concentrating its energy and lowering its overall entropy. The once useless gas can now be put to work.


Maxwell and others wondered how a law of nature could depend on one’s knowledge—or ignorance—of the positions and velocities of molecules. If the second law of thermodynamics depends subjectively on one’s information, in what sense is it true?

A century later, the American physicist Charles Bennett, building on work by Leo Szilard and Rolf Landauer, resolved the paradox by formally linking thermodynamics to the young science of information. Bennett argued that the demon’s knowledge is stored in its memory, and memory has to be cleaned, which takes work. (In 1961, Landauer calculated that at room temperature, it takes at least 2.9 zeptojoules of energy for a computer to erase one bit of stored information.) In other words, as the demon organizes the gas into hot and cold and lowers the gas’s entropy, its brain burns energy and generates more than enough entropy to compensate. The overall entropy of the gas-demon system increases, satisfying the second law of thermodynamics.
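
The 2.9-zeptojoule figure is just Landauer's bound, k_B T ln 2, evaluated near room temperature; a one-line check:

    import math

    # Landauer's bound: erasing one bit costs at least k_B * T * ln(2) of energy.
    K_B = 1.380649e-23          # Boltzmann constant, J/K
    T_ROOM = 300.0              # roughly room temperature, K

    e_min = K_B * T_ROOM * math.log(2)
    print(f"Minimum energy to erase one bit at {T_ROOM} K: {e_min:.2e} J (~2.9 zeptojoules)")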

The findings revealed that, as Landauer put it, “Information is physical.” The more information you have, the more work you can extract. Maxwell’s demon can wring work out of a single-temperature gas because it has far more information than the average user.

But it took another half century and the rise of quantum information theory, a field born in pursuit of the quantum computer, for physicists to fully explore the startling implications.

Over the past decade, Popescu and his Bristol colleagues, along with other groups, have argued that energy spreads to cold objects from hot ones because of the way information spreads between particles. According to quantum theory, the physical properties of particles are probabilistic; instead of being representable as 1 or 0, they can have some probability of being 1 and some probability of being 0 at the same time. When particles interact, they can also become entangled, joining together the probability distributions that describe both of their states. A central pillar of quantum theory is that the information—the probabilistic 1s and 0s representing particles’ states—is never lost. (The present state of the universe preserves all information about the past.)

Over time, however, as particles interact and become increasingly entangled, information about their individual states spreads and becomes shuffled and shared among more and more particles. Popescu and his colleagues believe that the arrow of increasing quantum entanglement underlies the expected rise in entropy—the thermodynamic arrow of time. A cup of coffee cools to room temperature, they explain, because as coffee molecules collide with air molecules, the information that encodes their energy leaks out and is shared by the surrounding air.

Understanding entropy as a subjective measure allows the universe as a whole to evolve without ever losing information. Even as parts of the universe, such as coffee, engines and people, experience rising entropy as their quantum information dilutes, the global entropy of the universe stays forever zero.

Renato Renner, a professor at ETH Zurich in Switzerland, described this as a radical shift in perspective. Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”

Moreover, the idea that energy has two forms, useless heat and useful work, “made sense for steam engines,” Renner said. “In the new way, there is a whole spectrum in between—energy about which we have partial information.”

Entropy and thermodynamics are “much less of a mystery in this new view,” he said. “That’s why people like the new view better than the old one.”
Thermodynamics From Symmetry
The relationship among information, energy and other “conserved quantities,” which can change hands but never be destroyed, took a new turn in two papers published simultaneously last July in Nature Communications, one by the Bristol team and another by a team that included Jonathan Oppenheim at University College London. Both groups conceived of a hypothetical quantum system that uses information as a sort of currency for trading between the other, more material resources.

Imagine a vast container, or reservoir, of particles that possess both energy and angular momentum (they’re both moving around and spinning). This reservoir is connected to both a weight, which takes energy to lift, and a turning turntable, which takes angular momentum to speed up or slow down. Normally, a single reservoir can’t do any work—this goes back to Carnot’s discovery about the need for hot and cold reservoirs. But the researchers found that a reservoir containing multiple conserved quantities follows different rules. “If you have two different physical quantities that are conserved, like energy and angular momentum,” Popescu said, “as long as you have a bath that contains both of them, then you can trade one for another.”

In the hypothetical weight-reservoir-turntable system, the weight can be lifted as the turntable slows down, or, conversely, lowering the weight causes the turntable to spin faster. The researchers found that the quantum information describing the particles’ energy and spin states can act as a kind of currency that enables trading between the reservoir’s energy and angular momentum supplies. The notion that conserved quantities can be traded for one another in quantum systems is brand new. It may suggest the need for a more complete thermodynamic theory that would describe not only the flow of energy, but also the interplay between all the conserved quantities in the universe.


The fact that energy has dominated the thermodynamics story up to now might be circumstantial rather than profound, Oppenheim said. Carnot and his successors might have developed a thermodynamic theory governing the flow of, say, angular momentum to go with their engine theory, if only there had been a need. “We have energy sources all around us that we want to extract and use,” Oppenheim said. “It happens to be the case that we don’t have big angular momentum heat baths around us. We don’t come across huge gyroscopes.”

Popescu, who won a Dirac Medal last year for his insights in quantum information theory and quantum foundations, said he and his collaborators work by “pushing quantum mechanics into a corner,” gathering at a blackboard and reasoning their way to a new insight, after which it’s easy to derive the associated equations. Some realizations are in the process of crystallizing. In one of several phone conversations in March, Popescu discussed a new thought experiment that illustrates a distinction between information and other conserved quantities—and indicates how symmetries in nature might set them apart.

“Suppose that you and I are living on different planets in remote galaxies,” he said, and suppose that he, Popescu, wants to communicate where you should look to find his planet. The only problem is, this is physically impossible: “I can send you the story of Hamlet. But I cannot indicate for you a direction.”

There’s no way to express in a string of pure, directionless 1s and 0s which way to look to find each other’s galaxies because “nature doesn’t provide us with [a reference frame] that is universal,” Popescu said. If it did—if, for instance, tiny arrows were sewn everywhere in the fabric of the universe, indicating its direction of motion—this would violate “rotational invariance,” a symmetry of the universe. Turntables would start turning faster when aligned with the universe’s motion, and angular momentum would not appear to be conserved. The early-20th-century mathematician Emmy Noether showed that every symmetry comes with a conservation law: The rotational symmetry of the universe reflects the preservation of a quantity we call angular momentum. Popescu’s thought experiment suggests that the impossibility of expressing spatial direction with information “may be related to the conservation law,” he said.

The seeming inability to express everything about the universe in terms of information could be relevant to the search for a more fundamental description of nature. In recent years, many theorists have come to believe that space-time, the bendy fabric of the universe, and the matter and energy within it might be a hologram that arises from a network of entangled quantum information. “One has to be careful,” Oppenheim said, “because information does behave differently than other physical properties, like space-time.”

Knowing the logical links between the concepts could also help physicists reason their way inside black holes, mysterious space-time swallowing objects that are known to have temperatures and entropies, and which somehow radiate information. “One of the most important aspects of the black hole is its thermodynamics,” Popescu said. “But the type of thermodynamics that they discuss in the black holes, because it’s such a complicated subject, is still more of a traditional type. We are developing a completely novel view on thermodynamics.” It’s “inevitable,” he said, “that these new tools that we are developing will then come back and be used in the black hole.”

Janet Anders (lower right) at a 160-person conference on quantum thermodynamics held at the University of Oxford in March.
Credit: Luis Correa
What to Tell Technologists
Janet Anders, a quantum information scientist at the University of Exeter, takes a technology-driven approach to understanding quantum thermodynamics. “If we go further and further down [in scale], we’re going to hit a region that we don’t have a good theory for,” Anders said. “And the question is, what do we need to know about this region to tell technologists?”

In 2012, Anders conceived of and co-founded a European research network devoted to quantum thermodynamics that now has 300 members. With her colleagues in the network, she hopes to discover the rules governing the quantum transitions of quantum engines and fridges, which could someday drive or cool computers or be used in solar panels, bioengineering and other applications. Already, researchers are getting a better sense of what quantum engines might be capable of. In 2015, Raam Uzdin and colleagues at the Hebrew University of Jerusalem calculated that quantum engines can outpower classical engines. These probabilistic engines still follow Carnot’s efficiency formula in terms of how much work they can derive from energy passing between hot and cold bodies. But they’re sometimes able to extract the work much more quickly, giving them more power. An engine made of a single ion was experimentally demonstrated and reported in Science in April 2016, though it didn’t harness the power-enhancing quantum effect.

Popescu, Oppenheim, Renner and their cohorts are also pursuing more concrete discoveries. In March, Oppenheim and his former student, Lluis Masanes, published a paper deriving the third law of thermodynamics—a historically confusing statement about the impossibility of reaching absolute-zero temperature—using quantum information theory. They showed that the “cooling speed limit” preventing you from reaching absolute zero arises from the limit on how fast information can be pumped out of the particles in a finite-size object. The speed limit might be relevant to the cooling abilities of quantum fridges, like the one reported in a preprint in February. In 2015, Oppenheim and other collaborators showed that the second law of thermodynamics is replaced, on quantum scales, by a panoply of second “laws”—constraints on how the probability distributions defining the physical states of particles evolve, including in quantum engines.


As the field of quantum thermodynamics grows quickly, spawning a range of approaches and findings, some traditional thermodynamicists see a mess. Peter Hänggi, a vocal critic at the University of Augsburg in Germany, thinks the importance of information is being oversold by ex-practitioners of quantum computing, who he says mistake the universe for a giant quantum information processor instead of a physical thing. He accuses quantum information theorists of confusing different kinds of entropy—the thermodynamic and information-theoretic kinds—and using the latter in domains where it doesn’t apply. Maxwell’s demon “gets on my nerves,” Hänggi said. When asked about Oppenheim and company’s second “laws” of thermodynamics, he said, “You see why my blood pressure rises.”

While Hänggi is seen as too old-fashioned in his critique (quantum-information theorists do study the connections between thermodynamic and information-theoretic entropy), other thermodynamicists said he makes some valid points. For instance, when quantum information theorists conjure up abstract quantum machines and see if they can get work out of them, they sometimes sidestep the question of how, exactly, you extract work from a quantum system, given that measuring it destroys its simultaneous quantum probabilities. Anders and her collaborators have recently begun addressing this issue with new ideas about quantum work extraction and storage. But the theoretical literature is all over the place.

“Many exciting things have been thrown on the table, a bit in disorder; we need to put them in order,” said Valerio Scarani, a quantum information theorist and thermodynamicist at the National University of Singapore who was part of the team that reported the quantum fridge. “We need a bit of synthesis. We need to understand your idea fits there; mine fits here. We have eight definitions of work; maybe we should try to figure out which one is correct in which situation, not just come up with a ninth definition of work.”

Oppenheim and Popescu fully agree with Hänggi that there’s a risk of downplaying the universe’s physicality. “I’m wary of information theorists who believe everything is information,” Oppenheim said. “When the steam engine was being developed and thermodynamics was in full swing, there were people positing that the universe was just a big steam engine.” In reality, he said, “it’s much messier than that.” What he likes about quantum thermodynamics is that “you have these two fundamental quantities—energy and quantum information—and these two things meet together. That to me is what makes it such a beautiful theory.”

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.
(FULL STORY)

Alien Civilizations May Number In The Trillions, New Study Says
[5/19/2017]
The possibility that we earthlings are not truly alone in the universe has gained some added credibility, thanks to a new study that coincides with NASA’s recent planetary discoveries. The research, published in the journal Astrobiology last week, suggests that more planets in the Milky Way galaxy may harbor advanced civilizations than we previously imagined.

Study co-authors Adam Frank and Woodruff Sullivan looked at recent discoveries of potentially habitable exoplanets and considered the odds of whether sophisticated civilizations existed on them in the past or present.

“What we showed was the ‘floor’ on the probability for a civilization to form on any randomly chosen planet,” Frank, a University of Rochester physics and astronomy professor, told The Huffington Post in an email. “If we are the only civilization in cosmic history, then what we calculated is the actual probability nature has set. But if the actual probability is higher than that floor, then civilizations have happened before.”

Frank says the potential number of planets orbiting their parent stars within a habitable distance is staggering.

“Even if you are pretty pessimistic and think that you’d have to search through 100 billion (habitable zone) planets before you found one where a civilization developed, then there have still been a trillion civilizations over cosmic history!” Frank wrote. “When I think about that, my mind reels — even if there is just a one in a 100 billion chance of evolution creating exo-civilizations, the universe still has made so many of them that we are swamped by histories other than our own.”


An artist’s depiction of planetary discoveries by NASA’s Kepler spacecraft, which searches for Earth-like planets. The Kepler telescope has discovered thousands of verified planets since it launched in 2009.
Credit: NASA/W. Stenzel
In 1961, astronomer Frank Drake — founder of the SETI Institute (SETI stands for “Search for Extraterrestrial Intelligence”) — devised what is now known as the “Drake equation” to estimate the number of planets that may be home to civilizations with the ability to communicate beyond their world.

Frank and Sullivan created a new equation, which appears at the bottom of the illustration below. While the Drake equation calculates the number of advanced alien civilizations that could exist in the Milky Way galaxy, Frank and Sullivan’s equation expands the question to calculate the number of advanced civilizations that have existed in our galaxy throughout the whole history of the universe.


Two equations consider the possibilities of technological alien civilizations in the Milky Way galaxy: at top, the 1961 Drake equation; at bottom, a more recent equation by Adam Frank and Woodruff Sullivan.
Credit: University of Rochester
The variable factors that Drake and others consider when attempting to come up with figures about ET-inhabited worlds include the following (a rough numerical sketch follows the list):

The rate of formation of stars with planets suitable for intelligent life.
The number of those stars that have planetary systems.
The number of those planets which may have life-sustaining environments.
The number of those planets where life develops.
How many of those planets produce intelligent life.
How many of those intelligent life forms could produce technology, such as radio signals.
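
For concreteness, the Drake equation multiplies factors of this kind together. The Python sketch below uses placeholder values for every factor; none are taken from Drake's original estimate or from Frank and Sullivan's paper:

    # Drake-style product of factors. Every number below is a placeholder, not a value
    # from Drake's 1961 estimate or from Frank and Sullivan's Astrobiology paper.

    R_star   = 1.5     # star-formation rate suitable for planets (stars per year)
    f_planet = 1.0     # fraction of those stars with planetary systems
    n_hab    = 0.2     # habitable-zone planets per such system
    f_life   = 0.1     # fraction of habitable planets where life appears
    f_intel  = 0.01    # fraction of those where intelligence evolves
    f_tech   = 0.1     # fraction of intelligent species that build detectable technology
    lifetime = 10_000  # years a technological civilization remains detectable

    N = R_star * f_planet * n_hab * f_life * f_intel * f_tech * lifetime
    print(f"Estimated communicating civilizations in the galaxy right now: {N:.2f}")
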
In their Astrobiology paper, Frank and Sullivan write:

“Recent advances in exoplanet studies provide strong constraints on all astrophysical terms in the Drake equation. We set a firm lower bound on the probability that one or more technological species have evolved anywhere and at any time in the history of the observable universe.”

The two scientists address what they refer to as “the cosmic frequency of technological species.”

“The universe is more than 13 billion years old,” Sullivan, of the astronomy department and astrobiology program at the University of Washington, said in a statement. “That means that even if there have been 1,000 civilizations in our own galaxy, if they live only as long as we have been around — roughly 10,000 years — then all of them are likely already extinct. And others won’t evolve until we are long gone.

“For us to have much chance in finding another ‘contemporary’ active technological civilization, on average they must last much longer than our present lifetime,” Sullivan said.

The search for extraterrestrial signals has been ongoing for decades.

“With so many stars and planets filling the cosmos, it boggles the mind to think that we’re the only clever life to have made an appearance,” SETI Institute senior astronomer Seth Shostak told HuffPost in an email. “Frank and Sullivan use new research indicating that roughly one in five stars is orbited by a planet that could nurture biology. After that, it’s just a matter of counting up the tally of stars in the visible universe, and saying that — with all the suitable real estate that’s out there, if we’re the only place with intelligent life, then we’ve really won the mother of all lotteries.”

Shostak cautions against being overly optimistic or pessimistic about the SETI Institute’s searches for intelligent signals from possible outer space neighbors.

“The odds that no one is out there are very, very small. It’s a bit like an ant coming out of its hive, seeing the enormous amount of real estate stretching in all directions and deciding that, if its home is the only ant hill, then its existence is a near-miracle. Or, put another way, the calculation by Frank and Sullivan quantifies Jodie Foster’s statement in [the movie] ‘Contact’ that, if there’s nobody out there, it would be a ‘waste of space,’” said Shostak.

Scientists searching for extraterrestrial beings — and, yes, to those beings, we would be aliens — are like archaeologists combing a vast space for treasures and information to learn more about the history of our species.

“I love the notion of a cosmic archaeological question. I think this puts an important new spin on the question about the rise of technological communicating intelligence,” Penelope J. Boston, incoming director of NASA’s Astrobiology Institute at Ames Research Center, told HuffPost.

“We have only been looking for other intelligences for a few decades in a galaxy of unfathomable proportions,” Boston said. “Of course we haven’t found anybody yet. I think it is childish to imagine that we should somehow have started looking, and bingo, there they are! I have trouble finding my dropped contact lens in the grass. Should I then disbelieve in the reality of my contact lens?”

While scientists had long wondered if there were other planets orbiting stars in the Milky Way galaxy and elsewhere, it wasn’t until the early 1990s that the first extrasolar world was confirmed.

“The existence of planets orbiting stars other than the sun is a 2,500-year-old question that has been entirely answered over the last 20 years,” said Frank. “We now know that every star in the night sky has at least one planet orbiting it, and many of those are in the right place for life to form.

“Ten thousand years from now, no one will remember anything about our era except it was when we discovered this single profound fact: We live in a cosmos of planets.”
(FULL STORY)

New blackbody force depends on spacetime geometry and topology
[5/23/2017]
In 2013, a group of physicists from Austria proposed the existence of a new and unusual force called the "blackbody force." Blackbodies—objects that absorb all incoming light and therefore appear black at room temperature—have long been known to emit blackbody radiation, which repels small nearby objects such as atoms and molecules. But the physicists showed that blackbodies theoretically also exert an attractive force on these objects. They called this force the "blackbody force," and showed that it can be stronger than blackbody radiation, and—for very small particles—even stronger than gravity.

Now in a new study published in EPL, a different team of physicists, C.R. Muniz et al., at Ceará State University and the Federal University of Ceará, Brazil, have theoretically demonstrated that the blackbody force depends not only on the geometry of the bodies themselves, but also on both the surrounding spacetime geometry and topology. In some cases, accounting for these latter factors significantly increases the strength of the blackbody force. The results have implications for a variety of astrophysics scenarios, such as planet and star formation, and possibly lab-based experiments.
"This work puts the blackbody force discovered in 2013 in a wider context, which involves strong gravitational sources and exotic objects like cosmic strings as well as the more prosaic ones found in condensed matter," Muniz told Phys.org.
As the scientists showed in 2013, the blackbody force arises when the heat absorbed by a blackbody causes the blackbody to emit electromagnetic waves that shift the atomic energy levels of nearby atoms and molecules. These shifts cause the atoms and molecules to be attracted to the blackbodies due to their high radiation intensity, pulling them together.
In the new study, the physicists investigated spherical blackbodies and cylindrical blackbodies, and showed how the topology and the local curvature of the spacetime influences their blackbody forces. They showed that ultradense spherical blackbodies like a neutron star (around which spacetime is highly curved) generate a stronger blackbody force due to the curvature compared to blackbodies in flat spacetime. They explain that this is because gravity modifies both the temperature of the blackbody and the solid angle at which the nearby atoms and molecules "see" the blackbody. On the other hand, a less dense blackbody such as our Sun (where spacetime is less curved) generates a blackbody force that is very similar to that of the flat case.
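One ingredient of that statement, that gravity changes the temperature a distant observer attributes to the blackbody, is just the standard gravitational redshift, and its size for a neutron star versus the Sun can be sketched in a few lines (the masses, radii and surface temperatures below are illustrative textbook values, not figures from the paper, and the full blackbody-force calculation involves more than this one factor):

    import math

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s
    M_SUN = 1.989e30   # solar mass, kg

    def redshifted_temperature(T_surface, mass_kg, radius_m):
        """Temperature of a blackbody's radiation as measured far away,
        reduced by the Schwarzschild redshift factor sqrt(1 - r_s/R)."""
        r_s = 2 * G * mass_kg / c**2          # Schwarzschild radius
        return T_surface * math.sqrt(1 - r_s / radius_m)

    # Illustrative numbers: a 1.4-solar-mass neutron star of radius 12 km
    # versus the Sun (radius ~6.96e8 m). Surface temperatures are placeholders.
    print(redshifted_temperature(1.0e6, 1.4 * M_SUN, 12e3))    # ~8.1e5 K: roughly 20% lower
    print(redshifted_temperature(5772.0, 1.0 * M_SUN, 6.96e8)) # ~5772 K: essentially unchanged

The neutron-star temperature seen from far away drops by roughly 20 percent, while the Sun's shifts by only a few parts per million, which is the sense in which the Sun behaves almost like the flat-spacetime case.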
The researchers then considered the case of a global monopole, a spherical object that modifies the global properties of space, and found a different kind of influence. Whereas for other spherical blackbodies, the spacetime influence is gravitational and decreases with the distance to the blackbody, for the global monopole the influence is of a topological nature, decreasing with the distance but eventually reaching a constant value.
Finally, when investigating the blackbody force of cylindrical blackbodies around which spacetime is locally flat, the scientists found no gravitational correction to the temperature, but, surprisingly, an effect on the angles with nearby objects. And when a cylindrical blackbody becomes infinitely thin, turning into a hypothetical cosmic string, the blackbody force vanishes completely. Overall, the scientists expect that these newly discovered geometrical and topological influences on the blackbody force will help elucidate the role of this unusual force on objects throughout the universe.
"We think that the intensification of the blackbody force due to the ultradense sources can influence in a detectable way the phenomena associated with them, such as the emission of very energetic particles, and the formation of accretion discs around black holes," Muniz said. "That force can also help to detect the Hawking radiation emitted by these latter objects, since we know that such radiation obeys the blackbody spectrum. In the future, we would like to investigate the behavior of that force in other spacetimes, as well as the influence of extra dimensions on it."
Explore further: Blackbody radiation induces attractive force stronger than gravity
More information: C. R. Muniz et al. "Dependence of the black-body force on spacetime geometry and topology." EPL. DOI: 10.1209/0295-5075/117/60001


Read more at: https://phys.org/news/2017-05-blackbody-spacetime-geometry-topology.html
(FULL STORY)

Gravitational Waves Could Help Us Detect the Universe’s Hidden Dimensions
[5/5/2017]
Gravitational waves might be used to uncover hidden dimensions in the universe. By looking at these ripples in spacetime, researchers at the Max Planck Institute for Gravitational Physics in Germany say we could work out what impact hidden dimensions would have on them, and use this information to find these effects.

The discovery of gravitational waves was announced in February 2016. Scientists used the Laser Interferometer Gravitational-wave Observatory (LIGO) detectors to find fluctuations in spacetime created by a pair of colliding black holes. Scientists can now use this information to see the universe in a whole new way—potentially even one day tracing waves that came from the Big Bang.

At present, our models of the universe are incomplete. They cannot explain many of the things we observe in the universe, so many physicists believe we are missing something—and that something could be the presence of extra dimensions.

If scientists were to find evidence of extra dimensions, they could start answering some of the most fundamental unknowns of the universe, like what dark matter is and why the universe is expanding at an accelerating rate.

Gravitational waves are ripples in spacetime caused by extremely energetic events. These events, like merging black holes, would release so much energy they would disrupt the way spacetime moves, creating ‘waves’ that would propagate out from the source—similar to the way a pebble thrown into a pond creates ripples moving outwards.

Gravitational waves were first predicted by Albert Einstein over 100 years ago, but until the LIGO detection we had been unable to find them. By the time the ripples reach us, they are so tiny that detecting them requires hugely sensitive equipment. This is what LIGO was able to do.

In the latest study, which appears on the preprint server arxiv.org, David Andriot and Gustavo Lucena Gómez look at how gravitational waves move through the known dimensions—three representing space and another for time. They then investigate what effects extra dimensions might have on the four dimensional waves we see.

“If there are extra dimensions in the universe, then gravitational waves can walk along any dimension, even the extra dimensions,” Lucena Gómez told New Scientist.

They found extra dimensions could have two effects on gravitational waves—firstly, they would have what they call a “breathing mode.” This provides another way for the gravitational waves to move in space.

“The breathing mode deforms the space in a specific manner, giving a distinct signature,” they wrote. To observe this change, they would need three detectors like LIGO all working to observe the same thing at the same time—something that “should be available in a near future,” they wrote.

The second effect is a “massive tower” of extra gravitational waves. These waves would appear at frequencies far too high for our current detectors to pick up. To detect changes at the frequencies they propose, LIGO would need to be thousands of times more sensitive.

The scientists are clear that such apparatus does not exist, but note: “If such a detector were available, however, one could hope for a very clean signal, since there is no known astrophysical process emitting gravitational waves with frequencies much greater than 10^3 Hz. Such high frequencies may thus be clear symptoms of new physics.”

However, Bobby Acharya, professor of Theoretical Physics at King’s College London, U.K., who was not involved in the study, is not convinced by the findings. In an interview with Newsweek, he says that while he firmly believes in the existence of extra dimensions, models suggest these dimensions would be extremely small: “That means that in order to excite them and create waves in those extra dimensions you require a lot of energy,” he says.

“And if you did produce the gravitational wave that propagated in the extra dimensions, the fact that extra dimensions are so small it means the frequency of this gravitational wave will be very high—much higher than the LIGO gravitational wave detectors can detect.”

He said you would need a “very optimistic point of view” to try to detect gravitational waves propagating in extra dimensions: “[The extra dimensions] would have to be rather large and then it would be difficult to make the model consistent with other observations. I’m not so positive about the result.”
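Acharya's frequency argument can be made concrete with an order-of-magnitude estimate: the lightest excitation of a compact extra dimension of size L carries energy of order hc/L, hence a characteristic frequency of order c/L (a rough sketch; the sizes below are purely illustrative, not values from the paper):

    c = 2.998e8  # speed of light, m/s

    def lowest_kk_frequency(size_m):
        """Order-of-magnitude frequency of the lightest Kaluza-Klein-type
        excitation of a compact extra dimension of size `size_m`:
        E ~ h*c/L, so f = E/h ~ c/L."""
        return c / size_m

    LIGO_BAND_MAX = 1e4  # Hz, rough upper edge of ground-based detectors

    for size in (1e-6, 1e-9, 1e-18):   # a micrometre, a nanometre, sub-nuclear
        f = lowest_kk_frequency(size)
        print(f"L = {size:g} m  ->  f ~ {f:.1e} Hz "
              f"({f / LIGO_BAND_MAX:.0e} x above the LIGO band)")

Even a micrometre-sized extra dimension corresponds to frequencies more than ten orders of magnitude above the kilohertz band where LIGO is most sensitive.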
(FULL STORY)

We could detect alien life by finding complex molecules
[4/27/2017]
By Bob Holmes in Mesa, Arizona

How can we search for life on other planets when we don’t know what it might look like? One chemist thinks he has found an easy answer: just look for sophisticated molecular structures, no matter what they’re made of. The strategy could provide a simple way for upcoming space missions to broaden the hunt.

Until now, the search for traces of life, or biosignatures, on other planets has tended to focus mostly on molecules like those used by earthly life. Thus, Mars missions look for organic molecules, and future missions to Europa may look for amino acids, unequal proportions of mirror-image molecules, and unusual ratios of carbon isotopes, all of which are signatures of life here on Earth.

But if alien life is very different, it may not show any of these. “I think there’s a real possibility we could miss life if [resembling Earth life is] the only criterion,” says Mary Voytek, who heads NASA’s astrobiology programme.

Now Lee Cronin, a chemist at the University of Glasgow, UK, argues that complexity could be a biosignature that doesn’t depend on any assumptions about the life forms that produce it. “Biology has one signature: the ability to produce complex things that could not arise in the natural environment,” Cronin says.

Obviously, an aircraft or a mobile phone could not assemble spontaneously, so their existence points to a living – and even intelligent – being that built them. But simpler things like proteins, DNA molecules or steroid hormones are also highly unlikely to occur without being assembled by a living organism, Cronin says.

Step by step
Cronin has developed a way to measure the complexity of a molecule by counting the number of unique steps – adding chemical side groups or ring structures, for example – needed for its formation, without double-counting repeated steps. To draw an analogy, his metric would score the words “bana” and “banana” as equally complex, since once you can make one “na” it is trivial to add a second one.
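The same bookkeeping can be sketched for strings instead of molecules (a toy illustration of the reuse idea, not Cronin's actual chemical metric): count the fewest concatenation steps needed to build a target when every fragment already built stays available for free.

    def assembly_steps(target: str) -> int:
        """Fewest concatenation ('assembly') steps needed to build `target`,
        starting from its individual characters, when every fragment built
        along the way stays available for reuse at no extra cost."""
        start = frozenset(target)                 # single characters are free
        best = {start: 0}                         # pool of fragments -> steps used
        frontier = [start]
        while frontier:
            next_frontier = []
            for pool in frontier:
                steps = best[pool]
                if target in pool:                # target already built; stop expanding
                    continue
                for a in pool:                    # try joining any two fragments,
                    for b in pool:                # including a fragment with itself
                        joined = a + b
                        if joined not in target:  # keep only substrings of the target
                            continue
                        new_pool = pool | {joined}
                        if new_pool not in best:
                            best[new_pool] = steps + 1
                            next_frontier.append(new_pool)
            frontier = next_frontier
        return min(s for pool, s in best.items() if target in pool)

    print(assembly_steps("bana"))    # 3 joins: e.g. b+a, ba+n, ban+a
    print(assembly_steps("banana"))  # 4 joins: na, na+na, b+a, ba+nana -- 'na' is reused

This toy count is not identical to the scoring described above (it gives "banana" one more step than "bana"), but it captures the key point: the repeated "na" is built once and then reused for free, and the chemical version applies the same idea to bonds and substructures rather than characters.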

Any structure requiring more than about 15 steps is so complex it must be biological in origin, he said this week at the Astrobiology Science Conference in Mesa, Arizona.

Cronin thinks he may be able to make that criterion simpler still, by specifying a maximum molecular weight for compounds that can assemble spontaneously.

Astrobiologists welcome Cronin’s suggestion. “I appreciate Lee for developing a biosignature that has minimal assumptions about the biology,” says Voytek.

In practice, though, Voytek notes that a detector compact enough to travel on an interplanetary mission would probably need to be designed to look for carbon-based life.

And even if Cronin’s method works, no scientist would risk claiming to have found extraterrestrial life on the basis of just one line of evidence, says Kevin Hand of NASA’s Jet Propulsion Laboratory and project scientist for the Europa Lander mission now being developed by NASA. That means that future missions will still need to look for multiple biosignatures.
(FULL STORY)

We May Have Uncovered the First Ever Evidence of the Multiverse
[4/27/2017]
TOO COLD

For years, scientists have been baffled by a weird anomaly far away in space: a mysterious “Cold Spot” about 1.8 billion light-years across. It is cooler than its surroundings by around 0.00015 degrees Celsius (0.00027 degrees Fahrenheit), a fact astronomers discovered by measuring background radiation throughout the universe.

Previously, astronomers believed that this space could be cooler simply because it had less matter in it than most sections of space. They dubbed it a massive supervoid and estimated that it had 10,000 galaxies fewer than other comparable sections of space.

But now, in a recently published survey of galaxies, astronomers from the Royal Astronomical Society (RAS) say they have discovered that this supervoid does not exist. They now believe that the galaxies in the cold spot are just clustered around smaller voids that populate the cold spot like bubbles. These small voids, however, cannot explain the temperature difference observed.

MULTIVERSE?

To link the temperature differences to the smaller voids, the researchers say a non-standard cosmological model would be required. “But our data place powerful constraints on any attempt to do that,” explained researcher Ruari Mackenzie in an RAS press release. While the study had a large margin of error, the simulations suggest there is only a two percent probability that the Cold Spot formed randomly.

“This means we can’t entirely rule out that the Spot is caused by an unlikely fluctuation explained by the standard model. But if that isn’t the answer, then there are more exotic explanations,” said researcher Tom Shanks in the press release. “Perhaps the most exciting of these is that the Cold Spot was caused by a collision between our universe and another bubble universe.”

If more detailed studies support the findings of this research, the Cold Spot might turn out to be the first evidence for the multiverse, though far more evidence would be needed to confirm our universe is indeed one of many.
(FULL STORY)

Scientists Just Discovered an Alien Planet That’s The Best Candidate for Life As We Know It
[4/18/2017]
SUPER-EARTH LHS 1140B

Only a few decades ago, the existence of planets beyond our own solar system was purely hypothetical. Now, we know of thousands of such planets – and today, scientists may have discovered the best candidate yet for alien life.

That candidate is an exoplanet orbiting a red dwarf star 40 light-years from Earth—what the international team of astronomers who discovered it have deemed a “super-Earth.” Using ESO’s HARPS instrument and a range of telescopes around the world, the astronomers located the exoplanet orbiting the dim star – LHS 1140 – within its habitable zone. This world passes in front of its parent star as it orbits, has likely retained most of its atmosphere, and is a little larger and much more massive than the Earth. In short, super-Earth LHS 1140b is among the most exciting known subjects for atmospheric studies.

Although LHS 1140b orbits ten times closer to its faint red dwarf star than the Earth does to the Sun, red dwarfs are much smaller and cooler than the Sun, so the super-Earth lies in the middle of the habitable zone and receives around half as much sunlight from its star as the Earth does.
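Those two figures, ten times closer but only about half the sunlight, are consistent through the inverse-square law, which also gives a feel for how dim the star must be (a back-of-envelope sketch; the implied luminosity is illustrative, not a measured value):

    def relative_flux(luminosity_solar, distance_au):
        """Stellar flux at the planet, relative to what Earth gets from the Sun:
        F/F_earth = (L/L_sun) / (d/AU)^2."""
        return luminosity_solar / distance_au**2

    # The article's figures: LHS 1140b sits ~10x closer than Earth (d ~ 0.1 AU)
    # and receives ~half of Earth's insolation. That implies the star shines at
    # roughly 0.5 * (0.1)^2 = 0.005 of the Sun's luminosity -- the dimness
    # expected of a red dwarf.
    print(relative_flux(0.005, 0.1))   # ~0.5, i.e. about half of Earth's sunlight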

“This is the most exciting exoplanet I’ve seen in the past decade,” lead author Jason Dittmann of the Harvard-Smithsonian Center for Astrophysics said in an ESO science release. “We could hardly hope for a better target to perform one of the biggest quests in science — searching for evidence of life beyond Earth.”

Artist’s impression of the super-Earth exoplanet LHS 1140b. Credit: ESO
LIFE AS WE KNOW IT

To support life as we know it, a planet must retain an atmosphere and have liquid surface water. When red dwarf stars are young, they emit radiation that can damage the atmospheres of planets around them. This planet’s large size indicates that a magma ocean may have existed on its surface for eons, feeding steam into the atmosphere and replenishing the planet with water until well within the time the star had cooled to its current, steady glow. The astronomers estimate the planet is at least five billion years old, and deduce that it has a diameter of almost 18,000 kilometers (11,185 mi)— 1.4 times larger than that of the Earth. Its greater mass and density imply that it is probably made of rock with a dense iron core.

Two of the European members of the team, Xavier Delfosse and Xavier Bonfils, stated in the release: “The LHS 1140 system might prove to be an even more important target for the future characterization of planets in the habitable zone than Proxima b or TRAPPIST-1. This has been a remarkable year for exoplanet discoveries!”

Scientists expect observations with the Hubble Space Telescope will soon allow them to assess how much high-energy radiation the exoplanet receives, and further into the future — with the help of new telescopes like ESO’s Extremely Large Telescope and the James Webb Space Telescope — detailed observations of the atmospheres of exoplanets will be possible.
(FULL STORY)

Physicists detect whiff of new particle at the Large Hadron Collider
[4/18/2017]
For decades, particle physicists have yearned for physics beyond their tried-and-true standard model. Now, they are finding signs of something unexpected at the Large Hadron Collider (LHC), the world’s biggest atom smasher at CERN, the European particle physics laboratory near Geneva, Switzerland. The hints come not from the LHC’s two large detectors, which have yielded no new particles since they bagged the last missing piece of the standard model, the Higgs boson, in 2012, but from a smaller detector, called LHCb, that precisely measures the decays of familiar particles.

The latest signal involves deviations in the decays of particles called B mesons—weak evidence on its own. But together with other hints, it could point to new particles lying on the high-energy horizon. “This has never happened before, to observe a set of coherent deviations that could be explained in a very economical way with one single new physics contribution,” says Joaquim Matias, a theorist at the Autonomous University of Barcelona in Spain. Matias says the evidence is strong enough for a discovery claim, but others urge caution.

The LHC smashes protons together at unprecedented energy to try to blast into existence massive new particles, which its two big detectors, ATLAS and CMS, would spot. LHCb focuses on familiar particles, in particular B mesons, using an exquisitely sensitive tracking detector to sniff out the tiny explosive decays.

B mesons are made of fundamental particles called quarks. Familiar protons and neutrons are made of two flavors of quarks, up and down, bound in trios. Heavier quark flavors—charm, strange, top, and bottom—can be created, along with their antimatter counterparts, in high-energy particle collisions; they pair with antiquarks to form mesons.

Lasting only a thousandth of a nanosecond, B mesons potentially provide a window onto new physics. Thanks to quantum uncertainty, their interiors roil with particles that flit in and out of existence and can affect how they decay. Any new particles tickling the innards of B mesons—even ones too massive for the LHC to create—could cause the rates and details of those decays to deviate from predictions in the standard model. It’s an indirect method of hunting new particles with a proven track record. In the 1970s, when only the up, down, and strange quarks were known, physicists predicted the existence of the charm quark by discovering oddities in the decays of K mesons (a family of mesons all containing a strange quark bound to an antiquark).

In their latest result, reported today in a talk at CERN, LHCb physicists find that when one type of B meson decays into a K meson, its byproducts are skewed: The decay produces a muon (a cousin of the electron) and an antimuon less often than it makes an electron and a positron. In the standard model, those rates should be equal, says Guy Wilkinson, a physicist at the University of Oxford in the United Kingdom and spokesperson for the 770-member LHCb team. “This measurement is of particular interest because theoretically it’s very, very clean,” he says.
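The quantity behind that comparison is conventionally expressed as a ratio of branching fractions, which is why it is regarded as theoretically clean: most of the messy hadronic uncertainties cancel between numerator and denominator (a textbook sketch of the observable, not a number from the talk):

    R_K = \frac{\mathcal{B}(B^{+} \to K^{+}\mu^{+}\mu^{-})}{\mathcal{B}(B^{+} \to K^{+}e^{+}e^{-})} \simeq 1 \ \text{in the standard model}

so any significant departure from unity would signal physics beyond the standard model.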

Strangely familiar
A new process appears to be modifying one of the standard ways a B meson decays to a K meson. It may involve a new force-carrying particle called a Z' that avoids creating a short-lived top quark.
[Graphic: the standard-model decay, in which the bottom quark becomes a strange quark through a loop involving a top quark and the W and Z weak force bosons, compared with a possible new decay in which a hypothetical Z' boson produces the muon pair directly. Credit: V. ALTOUNIAN/SCIENCE]
The result is just one of half a dozen faint clues LHCb physicists have found that all seem to jibe. For example, in 2013, they examined the angles at which particles emerge in such B meson decays and found that they didn’t quite agree with predictions.

What all those anomalies point to is less certain. Within the standard model, a B meson decays to a K meson only through a complicated “loop” process in which the bottom quark briefly turns into a top quark before becoming a strange quark. To do that, it has to emit and reabsorb a W boson, a “force particle” that conveys the weak force (see graphic above).

The new data suggest the bottom quark might morph directly into a strange quark—a change the standard model forbids—by spitting out a new particle called a Z′ boson. That hypothetical cousin of the Z boson would be the first particle beyond the standard model and would add a new force to the theory. The extra decay process would lower production of muons, explaining the anomaly. “It’s sort of an ad hoc construct, but it fits the data beautifully,” says Wolfgang Altmannshofer, a theorist at the University of Cincinnati in Ohio. Others have proposed that a quark–lepton hybrid called a leptoquark might briefly materialize in the loop process and provide another way to explain the discrepancies.

Of course, the case for new physics could be a mirage of statistical fluctuations. Physicists with ATLAS and CMS 18 months ago reported hints of a hugely massive new particle only to see them fade away with more data. The current signs are about as strong as those were, Altmannshofer says.

The fact that physicists are using LHCb to search in the weeds for signs of something new underscores the fact that the LHC hasn’t yet lived up to its promise. “ATLAS and CMS were the detectors that were going to discover new things, and LHCb was going to be more complementary,” Matias says. “But things go as they go.”

If the Z′ or leptoquarks exist, then the LHC might have a chance to blast them into bona fide, albeit fleeting, existence, Matias says. The LHC is now revving up after its winter shutdown. Next month, the particle hunters will return to their quest.
(FULL STORY)

Physicists Discover Hidden Aspects of Electrodynamics
[4/11/2017]
BATON ROUGE – Radio waves, microwaves and even light itself are all made of electric and magnetic fields. The classical theory of electromagnetism was completed in the 1860s by James Clerk Maxwell. At the time, Maxwell’s theory was revolutionary, and provided a unified framework to understand electricity, magnetism and optics. Now, new research led by LSU Department of Physics & Astronomy Assistant Professor Ivan Agullo, with colleagues from the Universidad de Valencia, Spain, advances knowledge of this theory. Their recent discoveries have been published in Physical Review Letters.

Maxwell’s theory displays a remarkable feature: it remains unaltered under the interchange of the electric and magnetic fields, when charges and currents are not present. This symmetry is called the electric-magnetic duality.
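Written out, the duality is the statement that the source-free Maxwell equations (in units with c = 1),

    \nabla \cdot \mathbf{E} = 0, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\,\partial_t \mathbf{B}, \qquad \nabla \times \mathbf{B} = \partial_t \mathbf{E},

keep exactly the same form under the swap \mathbf{E} \to \mathbf{B},\ \mathbf{B} \to -\mathbf{E} (or, more generally, any rotation of the two fields into each other). Introducing electric charges without matching magnetic charges spoils the symmetry, which is why the hunt for magnetic monopoles, discussed next, bears directly on it.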

However, while electric charges exist, magnetic charges have never been observed in nature. If magnetic charges do not exist, the symmetry also cannot exist. This mystery has motivated physicists to search for magnetic charges, or magnetic monopoles. However, no one has been successful. Agullo and his colleagues may have discovered why.

“Gravity spoils the symmetry regardless of whether magnetic monopoles exist or not. This is shocking. The bottom line is that the symmetry cannot exist in our universe at the fundamental level because gravity is everywhere,” Agullo said.

Gravity, together with quantum effects, disrupts the electric-magnetic duality or symmetry of the electromagnetic field.

Agullo and his colleagues discovered this by looking at previous theories that illustrate this phenomenon among other types of particles in the universe, called fermions, and applied it to photons in electromagnetic fields.

“We have been able to write the theory of the electromagnetic field in a way that very much resembles the theory of fermions, and prove this absence of symmetry by using powerful techniques that were developed for fermions,” he said.

This new discovery challenges assumptions that could impact other research including the study of the birth of the universe.



The Big Bang

Satellites collect data from the radiation emitted from the Big Bang, which is called the Cosmic Microwave Background, or CMB. This radiation contains valuable information about the history of the universe.

“By measuring the CMB, we get precise information on how the Big Bang happened,” Agullo said.

Scientists analyzing this data have assumed that the polarization of photons in the CMB is not affected by the gravitational field in the universe, which is true only if electromagnetic symmetry exists. However, since this new finding suggests that the symmetry does not exist at the fundamental level, the polarization of the CMB can change throughout cosmic evolution. Scientists may need to take this into consideration when analyzing the data. The focus of Agullo’s current research is on how large this new effect is.

This research is supported by the National Science Foundation grants PHY-1403943 and PHY-1552603.
(FULL STORY)

A dark matter 'bridge' holding galaxies together has been captured for the first time
[4/12/2017]
The first image of a dark matter "bridge", believed to form the links between galaxies, has been captured by astrophysicists in Canada.

Researchers at the University of Waterloo used a technique known as weak gravitational lensing to create a composite image of the bridge. Gravitational lensing is an effect that causes the images of distant galaxies to warp slightly under the influence of an unseen mass, such as a planet, a black hole, or in this case, dark matter.

Their composite image was built by combining lensing measurements of more than 23,000 galaxy pairs, located about 4.5 billion light-years away. This effect was measured from a multi-year sky survey at the Canada-France-Hawaii Telescope.

These results show that the dark matter filament bridge is strongest between systems less than 40 million light years apart, and they confirm predictions that galaxies across the Universe are tied together through a cosmic web of the elusive substance.

Dark matter is a mysterious element said to make up around 84 per cent of the Universe. It's known as "dark" because it doesn't shine, absorb or reflect light, which has traditionally made it largely undetectable, except through gravity and gravitational lensing. Evidence for the existence of this form of matter comes, among other things, from the astrophysical observation of galaxies, which rotate far too rapidly to be held together only by the gravitational pull of the visible matter.

Astrophysicists have long proposed that the Universe's web of stars and galaxies is supported by a "cosmic scaffolding" made up of fine threads of this invisible dark matter. These threadlike strands formed just after the Big Bang, when denser portions of the Universe drew in dark matter until it collapsed and formed flat disks, which featured fine filaments of dark matter at their joins. Where these filaments cross, galaxies formed.


University of Waterloo
"For decades, researchers have been predicting the existence of dark matter filaments between galaxies that act like a web-like superstructure connecting galaxies together," said Mike Hudson, a professor of astronomy at the University of Waterloo in the journal Monthly Notices of the Royal Astronomical Society. "This image moves us beyond predictions to something we can see and measure."

"By using this technique, we're not only able to see that these dark matter filaments in the Universe exist, we're able to see the extent to which these filaments connect galaxies together," said co-author Seth Epps.

WHAT IS DARK MATTER?
Dark matter is an invisible form of matter which, until now, has only revealed itself through its gravitational effects.
Evidence for the existence of this form of matter comes, among other things, from the astrophysical observation of galaxies, which rotate far too rapidly to be held together only by the gravitational pull of the visible matter.
High-precision measurements using the European satellite Planck show that almost 85 percent of the entire mass of the universe consists of dark matter.
All the stars, planets, nebulae and other objects in space that are made of conventional matter account for no more than 15 percent of the mass of the universe.
The unknown form of matter can either consist of comparatively few, but very heavy particles, or of a large number of light ones.
One of the possible candidates for dark matter is a particle called the axion, first proposed in 1977. It appears in some extensions of the Standard Model of particle physics. Astronomers believe that if axions make up dark matter, they could be detected through gravitational waves. This is because axions accelerated by a black hole would give off gravitational waves, just as electrons give off electromagnetic waves.

As a result, instruments like Ligo – and its upgrade, Advanced Ligo – may be able to see gravitational waves from thousands of black hole mergers, which would mark the beginning of a new precision tool for physics.

Physicists have a general idea about what the dark matter particle looks like but are struggling to build a clear picture. They can track the distribution of dark matter throughout the galaxy by examining how galaxies move, but can't pinpoint its exact location or design.

Earlier this year, Priyamvada Natarajan, a professor of astrophysics at Yale University, and her team brought the search for dark matter a step forward by creating the most detailed map of dark matter ever created. The map looks like an alien landscape, with uneven peaks and troughs scattered throughout. There are gentle mounds, on top of which sharp peaks arise, like the inside of a cave covered in stalactites.
(FULL STORY)

No, Dark Energy Isn't An Illusion
[4/10/2017]
In 1998, two teams of scientists announced a shocking discovery: the expansion of the Universe was accelerating. Distant galaxies weren't just receding from us, but their recession speed was increasing over time. Over the next few years, precision measurements of three independent quantities -- distant galaxies containing type Ia supernovae, the fluctuation pattern in the cosmic microwave background, and large-scale correlations between galaxies at a variety of distances -- all supported and confirmed this picture. The leading explanation? That there's a new form of energy inherent to space itself: dark energy. The case is so strong that no one reasonably doubts the evidence, but many teams have made alternative cases for the explanation, claiming that dark energy itself could be an illusion.
To understand whether this could be the case, we need to walk through four straightforward steps:

1. What a Universe without dark energy would look like.
2. What our Universe actually looks like.
3. What alternative explanations have been offered up.
4. Whether any of them could legitimately work.

In science, as in all things, it's pretty easy to offer a "what if..." alternative scenario to the leading idea. But can it stand up to scientific rigor? That's the crucial test.
Well before we conceived of dark energy, all the way back in the 1920s and 1930s, scientists derived how the entire Universe could have evolved within General Relativity. If you assumed that space, on the largest scales, was uniform -- with the same density and temperature everywhere -- there were only three viable scenarios to describe a Universe that was expanding today. If you fill a Universe with matter and radiation, like ours appears to be, gravity will fight the expansion, and the Universe can:

expand up to a point, reach a maximum size, and then begin contracting, eventually leading to a total recollapse.
expand and slow down somewhat, but gravitation is insufficient to ever stop or reverse it, and so it will eternally expand into the great cosmic abyss.
expand, with gravitation and the expansion balancing each other perfectly, so the expansion rate and the recession speed of everything asymptotes to zero, but never reverses. Those were the three classic fates of the Universe: big crunch, big freeze, or a critical Universe, which was right on the border between the two.
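In equations, the three fates correspond to the sign of the curvature term in the Friedmann equation that governs the expansion rate H (a textbook sketch; the Λ term is the cosmological constant that enters the dark-energy discussion below):

    H^2 = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}

With Λ = 0, a universe of matter and radiation recollapses if its density exceeds the critical value \rho_c = 3H^2/(8\pi G) (k > 0), expands forever if it falls short (k < 0), and coasts on the boundary if the two match exactly (k = 0); a positive Λ eventually dominates the right-hand side and drives the acceleration described next.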
But then the crucial observations came in, and it turns out the Universe did none of those three things. For the first six billion years or so after the Big Bang, it appeared we lived in a critical Universe, with the initial expansion and the effects of gravitational attraction balancing one another almost perfectly. But when the density of the Universe dropped below a certain amount, a surprise emerged: distant galaxies began speeding up, away from us and one another. This cosmic acceleration was unexpected, but robust, and has continued at the same rate ever since, for the past 7.8 billion years.
Why was this happening? The current, known forms of energy in the Universe -- particles, radiation and fields -- can't account for it. So scientists hypothesized a new form of energy, dark energy, that could cause the Universe's expansion to accelerate. There could be a new field that permeates all of space causing it; it could be the zero-point energy of the quantum vacuum; it could be Einstein's cosmological constant from General Relativity. Current and planned observatories and experiments are looking for possible signatures that would distinguish or search for departures from any of these potential explanations, but so far all are consistent with being the true nature of dark energy.
But alternatives have been proposed as well. Adding a new type of energy to the Universe should be a last resort to explain a new observation, or even a new suite of observations. A lot of people were skeptical of its existence, so scientists began asking the question of what else could be occurring? What could mimic these effects? A number of possibilities immediately emerged:

Perhaps the distant supernovae weren't the same as nearby ones, and were inherently fainter?
Perhaps there was something about the environments in which the supernovae occurred that changed?
Perhaps the distant light, en-route, was undergoing an interaction that caused it to fail to reach our eyes?
Perhaps a new type of dust existed, making these distant objects appear systematically fainter?
Or could it be that the assumption on which these models are founded -- that the Universe is, on the largest scales, perfectly uniform -- is flawed enough that what appears to be dark energy is simply the "correct" prediction of Einstein's theory?
The light-blocking, light-losing, or systematic light-differences scenarios have all been ruled out by multiple approaches, as even if supernovae were removed from the equation entirely, the evidence for dark energy would still be overwhelming. With precision measurements of the cosmic microwave background, baryon acoustic oscillations, and the large-scale structures that form and fail-to-form in our Universe, the case that the Universe's expansion is accelerating, and that dark energy is the best explanation for it, stands on its own.
(FULL STORY)

Satellite galaxies at edge of Milky Way coexist with dark matter
[4/3/2017]
Research conducted by scientists at Rochester Institute of Technology rules out a challenge to the accepted standard model of the universe and theory of how galaxies form by shedding new light on a problematic structure.

The vast polar structure - a plane of satellite galaxies at the poles of the Milky Way - is at the center of a tug-of-war between scientists who disagree about the existence of mysterious dark matter, the invisible substance that, according to some scientists, comprises 85 percent of the mass of the universe.

A paper accepted for publication in the Monthly Notices of the Royal Astronomical Society bolsters the standard cosmological model, or the Cold Dark Matter paradigm, by showing that the vast polar structure formed well after the Milky Way and is an unstable structure.

The study, "Is the Vast Polar Structure of Dwarf Galaxies a Serious Problem for CDM?" - available online at https:/?/?arxiv.?org/?abs/?1612.?07325 - was co-authored by Andrew Lipnicky, a Ph.D. candidate in RIT's astrophysical sciences and technology program, and Sukanya Chakrabarti, assistant professor in RIT's School of Physics and Astronomy, whose grant from the National Science Foundation supported the research.

Lipnicky and Chakrabarti analyze the distribution of the classical Milky Way dwarf galaxies that form the vast polar structure and compare it to simulations of the "missing" or subhalo dwarf galaxies thought to be cloaked in dark matter.

Using motion measurements, the authors traced the orbits of the classical Milky Way satellites backward in time. Their simulations showed the vast polar structure breaking up and dispersing, indicating that the plane is not as old as originally thought and formed later in the evolution of the galaxy. This means that the vast polar structure of satellite galaxies may be a transient feature, Chakrabarti noted.

"If the planar structure lasted for a long time, it would be a different story," Chakrabarti said. "The fact that it disperses so quickly indicates that the structure is not dynamically stable. There is really no inconsistency between the planar structure of dwarf galaxies and the current cosmological paradigm."

The authors removed the classical Milky Way satellites Leo I and Leo II from the study when orbital analyses determined that the dwarf galaxies were not part of the original vast polar structure but later additions likely snatched from the Milky Way. A comparison excluding Leo I and II reveals a similar plane shared by classical galaxies and their cloaked counterparts.

"We tried many different combinations of the dwarf galaxies, including distributions of dwarfs that share similar orbits, but in the end found that the plane always dispersed very quickly," Lipnicky said.

Opposing scientific thought rejects the existence of dark matter. This camp calls into question the standard cosmological paradigm that accepts both a vast polar structure of satellite galaxies and a hidden plane of dark-matter cloaked galaxies. Lipnicky and Chakrabarti's study supports the co-existence of these structures and refutes the challenge to the accepted standard model of the universe.

Their research concurs with a 2016 study led by Nuwanthika Fernando, from the University of Sydney, which found that certain Milky Way planes are unstable in general. That paper was also published in the Monthly Notices of the Royal Astronomical Society.
(FULL STORY)

Magnetic hard drives go atomic
[3/11/2017]
Chop a magnet in two, and it becomes two smaller magnets. Slice again to make four. But the smaller magnets get, the more unstable they become; their magnetic fields tend to flip polarity from one moment to the next. Now, however, physicists have managed to create a stable magnet from a single atom.

The team, who published their work in Nature on 8 March, used their single-atom magnets to make an atomic hard drive. The rewritable device, made from 2 such magnets, is able to store just 2 bits of data, but scaled-up systems could increase hard-drive storage density by 1,000 times, says Fabian Natterer, a physicist at the Swiss Federal Institute of Technology (EPFL) in Lausanne, and author of the paper.

“It’s a landmark achievement,” says Sander Otte, a physicist at Delft University of Technology in the Netherlands. “Finally, magnetic stability has been demonstrated undeniably in a single atom.”

Inside a regular hard drive is a disk split up into magnetized areas — each like a tiny bar magnet — the fields of which can point either up or down. Each direction represents a 1 or 0 — a unit of data known as a bit. The smaller the magnetized areas, the more densely data can be stored. But the magnetized regions must be stable, so that ‘1’s and ‘0’s inside the hard disk do not unintentionally switch.

Current commercial bits comprise around 1 million atoms. But in experiments physicists have radically shrunk the number of atoms needed to store 1 bit — moving from 12 atoms in 2012 to now just one. Natterer and his team used atoms of holmium, a rare-earth metal, sitting on a sheet of magnesium oxide, at a temperature below 5 kelvin.

Holmium is particularly suitable for single-atom storage because it has many unpaired electrons that create a strong magnetic field, and they sit in an orbit close to the atom's centre where they are shielded from the environment. This gives holmium both a large and stable field, says Natterer. But the shielding has a drawback: it makes the holmium notoriously difficult to interact with. And until now, many physicists doubted whether it was possible to reliably determine the atom’s state.

Bits of data
To write the data onto a single holmium atom, the team used a pulse of electric current from the magnetized tip of a scanning tunnelling microscope, which could flip the orientation of the atom's field between 0 and 1. In tests the magnets proved stable, each retaining their data for several hours, with the team never seeing one flip unintentionally. They used the same microscope to read out the bit — with different flows of current revealing the atom’s magnetic state.

To further prove that the tip could reliably read the bit, the team — which included researchers from the technology company IBM — devised a second, indirect, read-out method. They used a neighbouring iron atom as a magnetic sensor, tuning it so that its electronic properties depended on the orientation of the two holmium atomic magnets in the 2-bit system. The method also allows the team to read out multiple bits at the same time, says Otte, making it more practical and less invasive than the microscope technique.

Using individual atoms as magnetic bits would radically increase the density of data storage, and Natterer says that his EPFL colleagues are working on ways to make large arrays of single-atom magnets. But the 2-bit system is still far from practical applications and well behind another kind of single-atom storage, which encodes data in atoms’ positions rather than in their magnetization, and which has already been used to build a 1-kilobyte (8,192-bit) rewritable data-storage device.

One advantage of the magnetic system, however, is that it could be compatible with spintronics, says Otte. This emerging technology uses magnetic states not just to store data, but to move information around a computer in place of electric current, and would make for much more energy-efficient systems.

In the near term, physicists are more excited about studying the single-atom magnets. Natterer, for example, plans to observe three mini-magnets that are oriented so their fields are in competition with each other — so they continually flip. “You can now play around with these single-atom magnets, using them like Legos, to build up magnetic structures from scratch,” he says.

Nature doi:10.1038/nature.2017.21599
Read the related News & Views article: 'Single-atom data storage'
(FULL STORY)

Could Mysterious Cosmic Light Flashes Be Powering Alien Spacecraft?
[3/10/2017]

Bizarre flashes of cosmic light may actually be generated by advanced alien civilizations, as a way to accelerate interstellar spacecraft to tremendous speeds, a new study suggests.

Astronomers have catalogued just 20 or so of these brief, superbright flashes, which are known as fast radio bursts (FRBs), since the first one was detected in 2007. FRBs seem to be coming from galaxies billions of light-years away, but what's causing them remains a mystery.

"Fast radio bursts are exceedingly bright given their short duration and origin at great distances, and we haven't identified a possible natural source with any confidence," study co-author Avi Loeb, a theorist at the Harvard-Smithsonian Center for Astrophysics, said in a statement Thursday (March 9). "An artificial origin is worth contemplating and checking." [5 Bold Claims of Alien Life]


One potential artificial origin, according to the new study, might be a gigantic radio transmitter built by intelligent aliens. So Loeb and lead author Manasvi Lingam, of Harvard University, investigated the feasibility of this possible explanation.

Artist's illustration of a light sail powered by a radio beam (red) generated on the surface of a planet. The leakage from such beams as they sweep across the sky would appear as superbright light flashes known as fast radio bursts, according to a new study.
Credit: M. Weiss/CfA
The duo calculated that a solar-powered transmitter could indeed beam FRB-like signals across the cosmos — but it would require a sunlight-collecting area twice the size of Earth to generate the necessary power.

And the huge amounts of energy involved wouldn't necessarily melt the structure, as long as it was water-cooled. So, Lingam and Loeb determined, such a gigantic transmitter is technologically feasible (though beyond humanity's current capabilities).

Why would aliens build such a structure? The most plausible explanation, according to the study team, is to blast interstellar spacecraft to incredible speeds. These craft would be equipped with light sails, which harness the momentum imparted by photons, much as regular ships' sails harness the wind. (Humanity has demonstrated light sails in space, and the technology is the backbone of Breakthrough Starshot, a project that aims to send tiny robotic probes to nearby star systems.)

Indeed, a transmitter capable of generating FRB-like signals could drive an interstellar spacecraft weighing 1 million tons or so, Lingam and Loeb calculated.

"That's big enough to carry living passengers across interstellar or even intergalactic distances," Lingam said in the same statement.

Humanity would catch only fleeting glimpses of the "leakage" from these powerful beams (which would be trained on the spacecraft's sail at all times), because the light source would be moving constantly with respect to Earth, the researchers pointed out.



The duo took things a bit further. Assuming that ET is responsible for most FRBs, and taking into account the estimated number of potentially habitable planets in the Milky Way (about 10 billion), Lingam and Loeb calculated an upper limit for the number of advanced alien civilizations in a galaxy like our own: 10,000.

Lingam and Loeb acknowledge the speculative nature of the study. They aren't claiming that FRBs are indeed caused by aliens; rather, they're saying that this hypothesis is worthy of consideration.

"Science isn't a matter of belief; it's a matter of evidence," Loeb said. "Deciding what’s likely ahead of time limits the possibilities. It's worth putting ideas out there and letting the data be the judge."

The new study has been accepted for publication in The Astrophysical Journal Letters. You can read it for free on the online preprint site arXiv.org.
(FULL STORY)

NASA is Going to Create The Coldest Spot in the Known Universe
[3/8/2017]
Creating Cold Atom Lab

This summer, a box the size of an ice chest will journey to the International Space Station (ISS). Once there, it will become the coldest spot in the universe—more than 100 million times colder than deep space itself. The instruments inside the box — an electromagnetic “knife,” lasers, and a vacuum chamber — will slow down gas particles until they are almost motionless, bringing them just a billionth of a degree above absolute zero.

This box and its instruments are called the Cold Atom Laboratory (CAL). CAL was developed by the Jet Propulsion Laboratory (JPL), which is funded by NASA. Right now at JPL, CAL is in the final assembly stages, and getting ready for its trip to space which is set for August 2017. CAL will be hitching a ride on SpaceX CRS-12.

Once in space on the ISS, five scientific teams plan to use CAL to conduct experiments. Among them is the team headed by Eric Cornell, one of the scientists who won the Nobel Prize for creating Bose-Einstein condensates in a lab setting in 1995.

Seeing the Other 95%

Atoms that are cooled to extreme temperatures can form a unique state of matter: a Bose-Einstein condensate. This state is important scientifically because in it, the laws of quantum physics take over and we can observe matter behaving more like waves and less like particles. However, these rows of atoms, which move together like waves, can only be observed for fractions of a second on Earth because gravity causes atoms to move towards the ground. CAL achieves new low temperatures for longer observation of these mysterious waveforms.

Although NASA has never observed or created Bose-Einstein condensates in space, ultra-cold atoms can hold their wave-like forms longer while in freefall on the International Space Station. JPL Project Scientist Robert Thompson believes CAL will render Bose-Einstein condensates observable for up to five to 10 seconds. He also believes that improvements to CAL’s technologies could allow for hundreds of seconds of observation time.

“Studying these hyper-cold atoms could reshape our understanding of matter and the fundamental nature of gravity,” said Thompson. “The experiments we’ll do with the Cold Atom Lab will give us insight into gravity and dark energy—some of the most pervasive forces in the universe.”

These experiments could potentially lead to improved technologies, including quantum computers, sensors, and atomic clocks for navigation on spacecraft. CAL deputy project manager Kamal Oudrhiri of JPL cites dark energy detection applications as “especially exciting.” Current physics models indicate that the universe is about 68 percent dark energy, 27 percent dark matter, and 5 percent ordinary matter.

“This means that even with all of our current technologies, we are still blind to 95 percent of the universe,” Oudrhiri said. “Like a new lens in Galileo’s first telescope, the ultra-sensitive cold atoms in the Cold Atom Lab have the potential to unlock many mysteries beyond the frontiers of known physics.”
(FULL STORY)

Testing theories of modified gravity
[3/1/2017]
Physics Today 70, 3, 21 (2017); doi: http://dx.doi.org/10.1063/PT.3.3485
The accelerated expansion of the universe is usually attributed to a mysterious dark energy, but there’s another conceivable explanation: modified gravity. Unmodified gravity—that is, Einstein’s general relativity— satisfactorily accounts for the dynamics of the solar system, where precision measurements can be made without the confounding influence of dark matter. Nor have any violations been detected in one of general relativity’s principal ingredients, the strong equivalence principle, which posits that inertial mass and gravitational mass are identical.
But those observational constraints are not ineluctable. In particular, a class of gravitational theories called Galileon models can also pass them. In 2012 Lam Hui and Alberto Nicolis of Columbia University devised a cosmic test that could refute or confirm the models. Their test hinges on the models’ central feature: an additional scalar field that couples to mass. The coupling can be characterized by a charge-like parameter, Q. For most cosmic objects, Q has the same value as the inertial mass. But for a black hole, whose mass arises entirely from its gravitational binding energy, Q is zero; the strong equivalence principle is violated.
Galaxies fall through space away from low concentrations of mass and toward high concentrations. The supermassive black holes at the centers of some galaxies are carried along with the flow. But if gravity has a Galileon component, the black hole feels less of a tug than do the galaxy’s stars, interstellar medium, and dark-matter particles. The upshot, Hui and Nicolis realized, is that the black hole will lag the rest of the galaxy and slip away from its center. The displacement is arrested when the black hole reaches the point where the lag is offset by the presence of more of the galaxy’s gravitational mass on one side of the black hole than on the other. Given the right circumstances, the displacement can be measured.
Hui and Nicolis’s proposal has now itself been put to the test. Asha Asvathaman and Jeremy Heyl of the University of British Columbia, together with Hui, have applied it to two galaxies: M32, which is being pulled toward its larger neighbor, the Andromeda galaxy, and M87, which is being pulled through the Virgo cluster of galaxies. Both M32 and M87 are elliptical galaxies. Because of their simple shapes, their centroids can be determined from optical observations. The locations of their respective black holes can be determined from radio observations. Although the limit on Galileon gravity that Asvathaman, Heyl, and Hui derived was too loose to refute or confirm the theory, they nevertheless validated the test itself. More precise astrometric observations could make it decisive. (A. Asvathaman, J. S. Heyl, L. Hui, Mon. Not. R. Astron. Soc. 465, 3261, 2017, doi:10.1093/mnras/stw2905.)
(FULL STORY)

First Solid Sign that Matter Doesn't Behave Like Antimatter
[2/27/2017]
One of the biggest mysteries in physics is why there's matter in the universe at all. This week, a group of physicists at the world's largest atom smasher, the Large Hadron Collider, might be closer to an answer: They found that particles in the same family as the protons and neutrons that make up familiar objects behave in a slightly different way from their antimatter counterparts.

While matter and antimatter have all of the same properties, antimatter particles carry charges that are the opposite of those in matter. In a block of iron, for example, the protons are positively charged and the electrons are negatively charged. A block of antimatter iron would have negatively charged antiprotons and positively charged antielectrons (known as positrons). If matter and antimatter come in contact, they annihilate each other and turn into photons (or occasionally, a few lightweight particles such as neutrinos). Other than that, a piece of matter and antimatter should behave in the same way, and even look the same — a phenomenon called charge-parity (CP) symmetry. [The 18 Biggest Unsolved Mysteries in Physics]

Besides the identical behavior, CP symmetry also implies that the amount of matter and antimatter that was formed at the Big Bang, some 13.7 billion years ago, should have been equal. Clearly it was not, because if that were the case, then all the matter and antimatter in the universe would have been annihilated at the start, and even humans wouldn't be here.

But if there were a violation to this symmetry — meaning some bit of antimatter were to behave in a way that was different from its matter counterpart — perhaps that difference could explain why matter exists today.

To look for this violation, physicists at the Large Hadron Collider, a 17-mile-long (27 kilometers) ring beneath Switzerland and France, observed a particle called a lambda-b baryon. Baryons are the class of particles that makes up ordinary matter; protons and neutrons are baryons. Baryons are made of quarks, and antimatter baryons are made of antiquarks. Both quarks and antiquarks come in six "flavors": up, down, top, bottom (or beauty), strange and charm, as scientists call the different varieties. A lambda-b is made of one up, one down and one bottom quark. (A proton is made of two up quarks and one down quark, while a neutron consists of two down quarks and one up quark.)

If the lambda and its antimatter sibling show CP symmetry, then they would be expected to decay in the same way. Instead, the team found that the lambda-b and antilambda-b particles decayed differently. Lambdas decay in two ways: into a proton and two charged particles called pi mesons (or pions), or into a proton and two K mesons (or kaons). When particles decay, they throw off their daughter particles at a certain set of angles. The matter and antimatter lambdas did that, but the angles were different. [7 Strange Facts About Quarks]

This is not the first time matter and antimatter have behaved differently. In the 1960s, scientists studied kaons themselves, which also decayed in a way that was different from their antimatter counterparts. B mesons — which consist of a bottom quark and an up, down, strange or charm quark — have also shown similar "violating" behavior.

Mesons, though, are not quite like baryons. Mesons are pairs of quarks and antiquarks. Baryons are made of ordinary quarks only, and antibaryons are made of antiquarks only. Discrepancies between baryon and antibaryon decays had never been observed before.

"Now we have something for baryons," Marcin Kucharczyk, an associate professor at the Institute of Nuclear Physics of the Polish Academy of Sciences, which collaborated on the LHC experiment, told Live Science. "When you'd observed mesons, it was not obvious that for baryons it was the same."

While tantalizing, the results were not quite solid enough to count as a discovery. For physicists, the threshold of statistical significance needed to claim a discovery, which is a way of checking whether one's data could happen by chance, is 5 sigma. Sigma refers to standard deviations, and 5 sigma means that there is only about a 1 in 3.5 million chance that the results would occur by chance. This experiment got to 3.3 sigma — good, but not quite there yet. (That is, 3.3 sigma means there's roughly a 1 in 4,200 chance that the observation would have occurred randomly, a confidence level better than 99.9 percent.)
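For readers who want to convert sigma values into odds themselves, here is a minimal sketch using the one-sided Gaussian tail convention common in particle physics; the exact figures quoted in news stories vary with rounding and with the one- versus two-sided choice, and this example is not part of the original article:

```python
from scipy.stats import norm

def tail_probability(sigma: float) -> float:
    """One-sided Gaussian tail probability for a given significance in sigma."""
    return norm.sf(sigma)   # survival function, 1 - CDF

for sigma in (3.3, 5.0):
    p = tail_probability(sigma)
    print(f"{sigma} sigma -> p = {p:.2e}  (about 1 in {1 / p:,.0f})")
```

The 5-sigma line works out to roughly one chance in 3.5 million that a pure fluctuation would look this significant, which is why it serves as the discovery threshold.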

The findings are not a complete answer to the mystery of why matter dominates the universe, Kucharczyk said.

"It cannot explain the asymmetry fully," he said. "In the future, we will have more statistics, and maybe for other baryons."

The findings are detailed in the Jan. 30 issue of the journal Nature Physics.
(FULL STORY)

Physicists investigate erasing information at zero energy cost
[2/22/2017]
(Phys.org)—A few years ago, physicists showed that it's possible to erase information without using any energy, in contrast to the assumption at the time that erasing information must require energy. Instead, the scientists showed that the cost of erasure could be paid in terms of an arbitrary physical quantity such as spin angular momentum—suggesting that heat energy is not the only conserved quantity in thermodynamics.
Investigating this idea further, physicists Toshio Croucher, Salil Bedkihal, and Joan A. Vaccaro at the Centre for Quantum Dynamics, Griffith University, Brisbane, Queensland, Australia, have now discovered some interesting results about the tiny fluctuations in the spin cost of erasing information. The work could lead to the development of new types of heat engines and information processing devices.
As the scientists explain in a new paper published in Physical Review Letters, the possibility that information can be erased at zero energy cost is surprising at first due to the fact that energy and entropy are so closely related in thermodynamics. In the context of information, information erasure corresponds to entropy erasure (or a decrease in entropy) and therefore requires a minimum amount of energy, which is determined by Landauer's erasure principle.
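Landauer's principle itself is easy to put a number on. The sketch below (not from the paper) evaluates the conventional minimum energy cost of erasing one bit, k_B T ln 2, which the spin-based scheme discussed here replaces with a cost paid in spin angular momentum:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K

def landauer_energy_per_bit(temperature_kelvin: float) -> float:
    """Minimum heat dissipated when erasing one bit, per Landauer's principle."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature the bound is tiny compared with what real electronics dissipate.
print(f"{landauer_energy_per_bit(300.0):.2e} J per bit at 300 K")  # roughly 2.9e-21 J
```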
Since Landauer's erasure principle is equivalent to the second law of thermodynamics, the zero-energy erasure scheme using arbitrary conserved quantities can be thought of as a generalized second law of thermodynamics. This idea dates back to at least 1957, when E. T. Jaynes proposed an alternative to the second law in which heat energy is thought of in a more general way than usual, so that heat incorporates other kinds of conserved quantities.
Applying this framework to information erasure, in 2011 Vaccaro and Stephen Barnett showed that the energy cost of information erasure can be substituted with one or more different conserved quantities—specifically, spin angular momentum.
One important difference between heat energy and spin angular momentum is that, while heat may or may not be quantized, spin angular momentum is an intrinsically quantum mechanical property, and so it is always quantized. This has implications when it comes to accounting for tiny fluctuations in these quantities that become significant when designing systems at the nanoscale.

Scientists have only recently investigated these fluctuations in the context of the Landauer principle, where they found that the fluctuations are strongly suppressed, a result captured by something called the Jarzynski equality. This means that heat energy fluctuations have only a very tiny probability of violating the Landauer principle.
In the new study, the scientists have for the first time investigated the corresponding discrete fluctuations that arise when erasing information using spin.
Among their results, the researchers found that the discrete fluctuations are suppressed even more quickly than predicted by the corresponding Jarzynski equality for "spinlabor"—a new term the scientists devised that means the spin equivalent of work. This is the first evidence of beating this bound in an information erasure context. The quick suppression means that the fluctuations have an extremely low probability of using less than the minimal cost required to erase information using spin, as given by the Vaccaro-Barnett bound, which is the spin equivalent of the Landauer principle.
"Our work generalizes fluctuation relations for erasure using arbitrary conserved quantities and exposes the role of discreteness in the context of erasure," Bedkihal told Phys.org. "We also obtained a probability of violation bound that is tighter than the corresponding Jarzynski bound. This is a statistically significant result."
The scientists also point out that this process of erasing information with spin has already been experimentally demonstrated, although it appears to have gone unnoticed. In spin-exchange optical pumping, light is used to excite electrons in an atom to a higher energy level. For the electrons to return to their lower energy level during the relaxation process, atoms and nuclei collide with each other and exchange spins. This entropy-decreasing process can be considered analogous to erasing information at a cost of spin exchange.
Overall, the new results reveal insight into the thermodynamics of spin and could also guide the development of future applications. These could include new kinds of heat engines and information processing devices based on erasure that use inexpensive, locally available resources such as spin angular momentum. The researchers plan to further pursue these possibilities in the future.
"The erasure mechanism can be used to design generalized heat engines operating under the reservoirs of multiple conserved quantities such as a thermal reservoir and a spin reservoir," Bedkihal said. "For example, one may design heat engines using semiconductor quantum dot systems where lattice vibrations constitute a thermal reservoir and nuclear spins constitute a polarized spin reservoir. Such heat engines go beyond the traditional Carnot heat engine that operates under two thermal reservoirs."
More information: Toshio Croucher, Salil Bedkihal, and Joan A. Vaccaro. "Discrete Fluctuations in Memory Erasure without Energy Cost." Physical Review Letters. DOI: 10.1103/PhysRevLett.118.060602. Also at arXiv:1604.05795 [quant-ph]
(FULL STORY)

NASA Just Found A Solar System With 7 Earth-Like Planets
[2/22/2017]
AN OCEAN OF WORLDS

Today, scientists working with telescopes at the European Southern Observatory and NASA announced a remarkable new discovery: an entire system of Earth-sized planets. If that’s not enough, the team asserts that the density measurements of the planets indicate that the six innermost are Earth-like rocky worlds.

And that’s just the beginning.

Three of the planets lie in the star’s habitable zone. If you aren’t familiar with the term, the habitable zone (also known as the “goldilocks zone”) is the region surrounding a star in which liquid water could theoretically exist. This means that all three of these alien worlds may have entire oceans of water, dramatically increasing the possibility of life. The other planets are less likely to host oceans of water, but the team states that liquid water is still a possibility on each of these worlds.

Summing up the work, lead author Michaël Gillon notes that this solar system has the largest number of Earth-sized planets yet found and the largest number of worlds that could support liquid water: “This is an amazing planetary system — not only because we have found so many planets, but because they are all surprisingly similar in size to the Earth!”

Co-author Amaury Triaud notes that the star in this system is an “ultracool dwarf,” and he clarifies what this means in relation to the planets: “The energy output from dwarf stars like TRAPPIST-1 is much weaker than that of our Sun. Planets would need to be in far closer orbits than we see in the Solar System if there is to be surface water. Fortunately, it seems that this kind of compact configuration is just what we see around TRAPPIST-1.”

REACHING ANOTHER WORLD

The system is just 40 light-years away. On a cosmic scale, that’s right next door. Of course, practically speaking, it would still take us hundreds of thousands of years to get there with today’s technology – but the find still speaks volumes about the potential for life-as-we-know-it beyond Earth.

Moreover, the technology of tomorrow could get us to this system a lot faster.

These new discoveries ultimately mean that TRAPPIST-1 is of monumental importance for future study. The Hubble Space Telescope is already being used to search for atmospheres around the planets, and Emmanuël Jehin, a scientist who also worked on the research, asserts that future telescopes could allow us to truly see into the heart of this system: “With the upcoming generation of telescopes, such as ESO’s European Extremely Large Telescope and the NASA/ESA/CSA James Webb Space Telescope, we will soon be able to search for water and perhaps even evidence of life on these worlds.”
(FULL STORY)

Nearby Star Has 7 Earth-Sized Worlds - Most In Habitable Zone
[2/21/2017]
It will be announced tomorrow by NASA that Michaël Gillon and colleagues have confirmed four more Earth-sized planets circling TRAPPIST-1, in addition to the three already discovered.

It is possible that most of the planets confirmed thus far circling TRAPPIST-1 are in the star's habitable zone. The inner six planets are probably rocky in composition and may be just the right temperature for liquid water to exist (between 0 and 100 degrees C) - if they have any water, that is. The outermost, seventh planet still needs some more observations to nail down its orbit and composition.
(FULL STORY)

Data About 2 Distant Asteroids: Clues to the Possible Planet Nine
[2/22/2017]
In the year 2000, the first of a new class of distant solar system objects was discovered, orbiting the Sun at a distance greater than that of Neptune: the "extreme trans-Neptunian objects" (ETNOs).

Their orbits are very far from the Sun compared with that of the Earth. We orbit the Sun at a mean distance of one astronomical unit (1 AU which is 150 million kilometres) but the ETNOs orbit at more than 150 AU. To give an idea of how far away they are, Pluto's orbit is at around 40 AU and its closest approach to the Sun (perihelion) is at 30 AU. This discovery marked a turning point in Solar System studies, and up to now, a total of 21 ETNOs have been identified.

Recently, a number of studies have suggested that the dynamical parameters of the ETNOs could be better explained if there were one or more planets with masses several times that of the Earth orbiting the Sun at distances of hundreds of AU. In particular, in 2016 the researchers Brown and Batygin used the orbits of seven ETNOs to predict the existence of a "super-Earth" orbiting the Sun at some 700 AU; this mass range is termed sub-Neptunian. This idea is referred to as the Planet Nine hypothesis and is one of the current subjects of interest in planetary science. However, because the objects are so far away, the light we receive from them is very weak, and until now the only one of the 21 extreme trans-Neptunian objects observed spectroscopically was Sedna.

Now, a team of researchers led by the Instituto de Astrofísica de Canarias (IAC) in collaboration with the Complutense University of Madrid has taken a step towards the physical characterization of these bodies, and towards confirming or refuting the Planet Nine hypothesis by studying them. The scientists have made the first spectroscopic observations of 2004 VN112 and 2013 RF98, both of them particularly interesting dynamically because their orbits are almost identical and the poles of the orbits are separated by a very small angle. This suggests a common origin, and their present-day orbits could be the result of a past interaction with the hypothetical Planet Nine. This study, recently published in Monthly Notices of the Royal Astronomical Society, suggests that this pair of ETNOs was a binary asteroid which separated after an encounter with a planet beyond the orbit of Pluto.

To reach these conclusions, they made the first spectroscopic observations of 2004 VN112 and 2013 RF98 in the visible range. These were performed in collaboration with the support astronomers Gianluca Lombardi and Ricardo Scarpa, using the OSIRIS spectrograph on the Gran Telescopio CANARIAS (GTC), situated in the Roque de los Muchachos Observatory (Garafía, La Palma). It was hard work to identify these asteroids because their great distance means that their apparent movement on the sky is very slow. The team then measured their apparent magnitudes (their brightness as seen from Earth) and also recalculated the orbit of 2013 RF98, which had been poorly determined. They found this object at a distance of more than an arcminute away from the position predicted from the ephemerides. These observations have helped to improve the computed orbit, and have been published by the Minor Planet Center (MPEC 2016-U18: 2013 RF98), the organization responsible for the identification of comets and minor planets (asteroids) as well as for measurements of their parameters and orbital positions.

The visible spectrum can also give some information about their composition. By measuring the slope of the spectrum, it can be determined whether their surfaces carry pure ices, as is the case for Pluto, or highly processed carbon compounds. The spectrum can also indicate the possible presence of amorphous silicates, as in the Trojan asteroids associated with Jupiter. The values obtained for 2004 VN112 and 2013 RF98 are almost identical and similar to those observed photometrically for two other ETNOs, 2000 CR105 and 2012 VP113. Sedna, however, the only one of these objects which had been previously observed spectroscopically, shows very different values from the others. These five objects are part of the group of seven used to test the hypothesis of Planet Nine, which suggests that all of them should have a common origin, except for Sedna, which is thought to have come from the inner part of the Oort cloud.
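The "slope" referred to here is usually quantified as a spectral gradient: how steeply the object's reflectance rises toward the red, expressed in percent per 1000 Å. Below is a minimal sketch of one common convention, with made-up reflectance values used purely for illustration (the actual measurements for 2004 VN112 and 2013 RF98 are in the paper):

```python
import numpy as np

def spectral_gradient(wavelengths_nm, reflectance, ref_nm=550.0):
    """Spectral gradient S' in percent per 1000 Angstroms (100 nm),
    from a straight-line fit to reflectance normalized at ref_nm."""
    norm = np.interp(ref_nm, wavelengths_nm, reflectance)
    slope, _ = np.polyfit(wavelengths_nm, np.asarray(reflectance) / norm, 1)  # per nm
    return slope * 100.0 * 100.0   # per 100 nm, expressed in percent

# Illustrative (made-up) visible reflectance, rising toward the red:
wl = [500.0, 600.0, 700.0, 800.0]
refl = [0.95, 1.05, 1.15, 1.25]
print(f"S' = {spectral_gradient(wl, refl):.1f} %/1000 A")
```

A markedly red gradient of this sort is typical of processed organic-rich surfaces, while a flat gradient points to fresher, icier material.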

"The similar spectral gradients observed for the pair 2004 VN112 - 2013 RF98 suggests a common physical origin", explains Julia de León, the first author of the paper, an astrophysicist at the IAC. "We are proposing the possibility that they were previously a binary asteroid which became unbound during an encounter with a more massive object". To validate this hypothesis, the team performed thousands of numerical simulations to see how the poles of the orbits would separate as time went on. The results of these simulations suggest that a possible Planet Nine, with a mass of between 10 and 20 Earth masses orbiting the Sun at a distance between 300 and 600 AU could have deviated the pair 2004 VN112 - 2013 RF98 around 5 and 10 million years ago. This could explain, in principle, how these two asteroids, starting as a pair orbiting one another, became gradually separated in their orbits because they made an approach to a much more massive object at a particular moment in time.

(FULL STORY)

Tune Your Radio: Galaxies Sing When Forming Stars
[2/22/2017]
Almost all the light we see in the universe comes from stars which form inside dense clouds of gas in the interstellar medium.

The rate at which they form (referred to as the star formation rate, or SFR) depends on the reserves of gas in the galaxies and the physical conditions in the interstellar medium, which vary as the stars themselves evolve. Measuring the star formation rate is hence key to understanding the formation and evolution of galaxies.

Until now, a variety of observations at different wavelengths have been performed to calculate the SFR, each with its advantages and disadvantages. As the most commonly used SFR tracers, the visible and the ultraviolet emission can be partly absorbed by interstellar dust. This has motivated the use of hybrid tracers, which combine two or more different emissions, including the infrared, which can help to correct this dust absorption. However, the use of these tracers is often uncertain because other sources or mechanisms which are not related to the formation of massive stars can intervene and lead to confusion.

Now, an international research team led by the IAC astrophysicist Fatemeh Tabatabaei has made a detailed analysis of the spectral energy distribution of a sample of galaxies, and has been able to measure, for the first time, the energy they emit within the frequency range of 1-10 Gigahertz which can be used to know their star formation rates. "We have used" explains this researcher "the radio emission because, in previous studies, a tight correlation was detected between the radio and the infrared emission, covering a range of more than four orders of magnitude". In order to explain this correlation, more detailed studies were needed to understand the energy sources and processes which produce the radio emission observed in the galaxies.

"We decided within the research group to make studies of galaxies from the KINGFISH sample (Key Insights on Nearby Galaxies: a Far-Infrared Survey with Herschel) at a series of radio frequencies", recalls Eva Schinnerer from the Max-Planck-Institut für Astronomie (MPIA) in Heidelberg, Germany. The final sample consists of 52 galaxies with very diverse properties. "As a single dish, the 100-m Effelsberg telescope with its high sensitivity is the ideal instrument to receive reliable radio fluxes of weak extended objects like galaxies", explains Marita Krause from the Max-Planck-Institut für Radioastronomie (MPIfR) in Bonn, Germany, who was in charge of the radio observations of those galaxies with the Effelsberg radio telescope. "We named it the KINGFISHER project, meaning KINGFISH galaxies Emitting in Radio."

The results of this project, published today in The Astrophysical Journal, show that the 1-10 Gigahertz radio emission used is an ideal star formation tracer for several reasons. Firstly, the interstellar dust does not attenuate or absorb radiation at these frequencies; secondly, it is emitted by massive stars during several phases of their formation, from young stellar objects to HII regions (zones of ionized gas) and supernova remnants, and finally, there is no need to combine it with any other tracer. For these reasons, measurements in the chosen range are a more rigorous way to estimate the formation rate of massive stars than the tracers traditionally used.

This study also clarifies the nature of the feedback processes occurring due to star formation activity, which are key to the evolution of galaxies. "By differentiating the origins of the radio continuum, we could infer that the cosmic ray electrons (a component of the interstellar medium) are younger and more energetic in galaxies with higher star formation rates, which can cause powerful winds and outflows and have important consequences for the regulation of star formation", explains Fatemeh Tabatabaei.

Article: "The radio spectral energy distribution and star formation rate calibration in galaxies", by F. Tabatabaei et al. The Astrophysical Journal. Volume 836, Number 2. (DOI: 10.3847/1538-4357/836/2/185)

http://iopscience.iop.org/article/10.3847/1538-4357/836/2/185
(FULL STORY)

Coders Race to Save NASA's Climate Data
[2/14/2017]
A group of coders is racing to save the government's climate science data.

On Saturday (Feb. 11), 200 programmers crammed themselves into the Doe Library at the University of California, Berkeley, furiously downloading NASA's Earth science data in a hackathon, Wired reported. The group's goal: rescue data that may be deleted or hidden under President Donald Trump's administration.

The process involves developing web-crawler scripts to trawl the internet, finding federal data and patching it together into coherent data sets. The hackers are also keeping track of data as it disappears; for instance, the Global Data Center's reports and one of NASA's atmospheric carbon dioxide (CO2) data sets have already been removed from the web.
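Such web-crawler scripts are conceptually simple: fetch a list of target pages, save the raw responses, and log what was retrieved so that gaps can be detected later. Here is a minimal sketch, assuming the widely used requests library and a hypothetical list of target URLs; it is an illustration of the idea, not the hackathon's actual code:

```python
import hashlib
import pathlib
import requests

# Hypothetical targets; a real rescue effort would crawl index pages to discover these.
URLS = [
    "https://data.giss.nasa.gov/gistemp/",
    "https://climate.nasa.gov/vital-signs/carbon-dioxide/",
]

ARCHIVE = pathlib.Path("archive")
ARCHIVE.mkdir(exist_ok=True)

for url in URLS:
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # Name the file by a hash of the URL so re-runs overwrite rather than duplicate.
    name = hashlib.sha256(url.encode()).hexdigest()[:16] + ".html"
    (ARCHIVE / name).write_bytes(response.content)
    print(f"saved {url} -> {name} ({len(response.content)} bytes)")
```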

By the end of Saturday, when the hackathon concluded, the coders had successfully downloaded thousands of pages — essentially all of NASA's climate data — onto the Internet Archive, a digital library.

But there is still more to be done. While the climate data may be safe for now, many other data sets out there could be lost, such as National Parks Service data on GPS boundaries and species tallies, Wired reported.

"Climate change data is just the tip of the iceberg," Eric Kansa, an anthropologist who manages archaeological data archiving for the nonprofit group Open Context, told Wired. "There are a huge number of other data sets being threatened [that are rich] with cultural, historical, sociological information."

Originally published on Live Science.

(FULL STORY)

You Can Help Scientists Find the Next Earth-Like Planet
[2/14/2017]
GRAVITATIONAL WOBBLES

NASA’s Kepler space telescope holds the record when it comes to candidate and confirmed exoplanets — to date, it has identified more than 5,000. To scan the universe for these alien planets, Kepler uses what’s called the “transit method.” Basically, Kepler watches out for the brightness dips that occur when a planet crosses the face of the star it orbits.
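To first order, the brightness dip reveals the planet's size: the fractional dimming is roughly the square of the planet-to-star radius ratio. A minimal sketch (illustrative only, not part of the original article):

```python
R_SUN_KM = 695_700
R_EARTH_KM = 6_371
R_JUPITER_KM = 69_911

def transit_depth(planet_radius_km: float, star_radius_km: float = R_SUN_KM) -> float:
    """Fractional drop in starlight when the planet crosses the star's face."""
    return (planet_radius_km / star_radius_km) ** 2

print(f"Earth-size planet, Sun-like star:   {transit_depth(R_EARTH_KM) * 1e6:.0f} parts per million")
print(f"Jupiter-size planet, Sun-like star: {transit_depth(R_JUPITER_KM) * 100:.1f} percent")
```

The tiny 84-parts-per-million dip for an Earth analogue is why Kepler needed space-based photometry to find small planets.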

This isn’t the only method to catch exoplanets. The High Resolution Echelle Spectrometer (HIRES) instrument at the Keck Observatory in Hawaii detects radial velocity instead of brightness dips. This radial velocity method searches stars for signs of gravitational wobbles induced by orbiting planets. HIRES was part of a two-decade-long radial velocity planet-hunting program, and it has compiled almost 61,000 individual measurements of more than 1,600 stars.

“HIRES was not specifically optimized to do this type of exoplanet detective work, but has turned out to be a workhorse instrument of the field,” said Steve Vogt, from the University of California Santa Cruz, who built the instrument. “I am very happy to contribute to science that is fundamentally changing how we view ourselves in the universe.”

From this huge amount of data, a team of researchers led by Paul Butler of the Carnegie Institution for Science in Washington, D.C., identified more than 100 possible exoplanets. Specifically, the researchers identified 60 candidate planets, plus 54 more that require further examination. They published their study in The Astronomical Journal.

“We were very conservative in this paper about what counts as an exoplanet candidate and what does not,” researcher Mikko Tuomi explained, “and even with our stringent criteria, we found over 100 new likely planet candidates.” Among the candidate exoplanets, one could be orbiting the fourth-closest star (GJ 411) to our Sun, just about 8.3 light years away. It’s not an Earth twin, however, as this potential planet has an orbital period of just 10 days.

COLLABORATIVE EXPLORATION

There’s still a considerable amount of data to comb through. So, together with their findings, Butler’s team made the HIRES data set available to the public. “One of our key goals in this paper is to democratize the search for planets,” explained team member Greg Laughlin of Yale. “Anyone can download the velocities published on our website and use the open source Systemic software package and try fitting planets from the data.”

It’s certainly a noble idea and a timely one. “I think this paper sets a precedent for how the community can collaborate on exoplanet detection and follow-up”, said team-member Johanna Teske. “With NASA’s TESS mission on the horizon, which is expected to detect 1000+ planets orbiting bright, nearby stars, exoplanet scientists will soon have a whole new pool of planets to follow up.”

Other tools that can facilitate this search for exoplanets, and potentially habitable ones, include the recently completed James Webb Space Telescope (JWST). Its powerful array of mirrors and instruments will give our ability to scan the universe a much-appreciated boost. Technological advances like the JWST, NASA’s TESS, and a couple of other interstellar eyes will allow us to see the universe like never before.
(FULL STORY)

Scientists Discover Over 100 New Exoplanets
[2/14/2017]
An international team of astronomers has announced the discovery of over a hundred new exoplanet candidates. These exoplanets were found using two decades' worth of data from the Keck Observatory in Hawaii. Their results were recently published in a paper in the Astronomical Journal, and among the discoveries is a planet orbiting the fourth-closest star to our own, only 8 light-years away.

Finding exoplanets isn't easy. Planets beyond our solar system are tiny and dark when compared to their host stars, so some advanced techniques have to be used to pinpoint them. The Kepler space telescope, for instance, finds exoplanets by looking for stars that regularly dim slightly. This dimming is caused by an exoplanet blocking some of the star's light when it passes in front, and the change in brightness can tell us a lot about the size of the planet and how fast it orbits.

However, there are additional ways to spot an exoplanet. The Keck Observatory uses a different method, called the radial velocity method, that looks at how the star moves. When a planet orbits a star, the planet's gravity causes the star to wobble a little bit. For instance, our own planet causes the sun to move a few inches per second, while Jupiter causes the sun to move about 40 feet per second. This wobbling is detectable by very sensitive telescopes, like the HIRES spectrometer at the Keck Observatory.
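Those wobble speeds can be checked with the standard radial-velocity semi-amplitude formula for a circular orbit viewed edge-on. A minimal sketch, using textbook constants rather than anything from the article, reproduces the figures quoted above:

```python
import math

G = 6.674e-11            # m^3 kg^-1 s^-2
M_SUN = 1.989e30         # kg
M_JUPITER = 1.898e27     # kg
M_EARTH = 5.972e24       # kg
YEAR = 3.156e7           # seconds

def rv_semi_amplitude(planet_mass_kg: float, period_s: float, star_mass_kg: float = M_SUN) -> float:
    """Stellar reflex velocity K (m/s) for a circular orbit seen edge-on."""
    return (2 * math.pi * G / period_s) ** (1 / 3) * planet_mass_kg / (star_mass_kg + planet_mass_kg) ** (2 / 3)

print(f"Jupiter's tug on the Sun: {rv_semi_amplitude(M_JUPITER, 11.86 * YEAR):.1f} m/s (~40 ft/s)")
print(f"Earth's tug on the Sun:   {rv_semi_amplitude(M_EARTH, 1.0 * YEAR) * 100:.1f} cm/s (a few inches/s)")
```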

Before Kepler, the radial velocity method was the best way to find new exoplanets. Scientists using this method have found hundreds of worlds over the past 25 years, and the very first known exoplanet around a Sun-like star was found using this method. However, studying a star's radial velocity typically requires a lot of observing time in order to separate a planet's signal from other sources of variability.

The Keck data spans two decades, which is more than enough time to separate out the signals, and the data covers so many star systems that it could potentially contain evidence for thousands of new exoplanets. In fact, the dataset is so massive that one group of people could never get through all of it. To solve this problem, the team is releasing their data to the public, in the hopes that people will use that data to find even more exoplanets. If you're interested in discovering your very own alien planet, you can find the data and instructions on the team's website.
(FULL STORY)

Why These Scientists Fear Contact With Space Aliens
[2/8/2017]
The more we learn about the cosmos, the more it seems possible that we are not alone. The entire galaxy is teeming with worlds, and we're getting better at listening — so the question, "Is there anybody out there?" is one we may be able to answer soon.

But do we really want to know? If aliens are indeed out there, would they be friendly explorers, or destroyers of worlds? This is a serious question no longer confined to science fiction, because a growing group of astronomers has taken it upon themselves to do more than just listen. Some are advocating for a beacon swept across the galaxy, letting E.T. know we're home, to see if anyone comes calling. Others argue we would be wise to keep Earth to ourselves.

"There's a possibility that if we actively message, with the intention of getting the attention of an intelligent civilization, that the civilization we contact would not necessarily have our best interests in mind," says Lucianne Walkowicz, an astrophysicist at the Adler Planetarium in Chicago. "On the other hand, there might be great benefits. It could be something that ends life on Earth, and it might be something that accelerates the ability to live quality lives on Earth. We have no way of knowing." Like many other astronomers, Walkowicz isn't convinced one way or the other — but she said the global scientific community needs to talk about it.

Image: Internet investor and science philanthropist Yuri Milner shows the Starchip, a microelectronic component spacecraft. The $100 million project is aimed at establishing the feasibility of sending a swarm of tiny spacecraft, each weighing far less than an ounce, to the Alpha Centauri star system.

That conversation is likely to heat up soon thanks to the Breakthrough Initiatives, a philanthropic organization dedicated to interstellar outreach that's funded by billionaire Russian tech mogul Yuri Milner. Its Breakthrough Message program would solicit ideas from around the world to compose a message to aliens and figure out how to send it. Outreach for the program may launch as soon as next year, according to Pete Worden, the Breakthrough Initiatives' director.

"We're well aware of the argument, 'Do you send things or not?' There's pretty vigorous opinion on both sides of our advisory panel," Worden says. "But it's a very useful exercise to start thinking about what to respond. What's the context? What best represents the people on Earth? This is an exercise for humanity, not necessarily just about what we would send." Members of the advisory panel have argued that a picture (and the thousand words it may be worth) would be the best message.

Next comes "more of a technical expertise question," Worden says. "Given that you have an image or images, how do you best encrypt it so it can be received?"

Breakthrough Message will work on those details, including how to transmit the pictures, whether through radio or laser transmitters; how to send it with high fidelity, so it's not rendered unreadable because of interference from the interstellar medium; which wavelengths of light to use, or whether to spread a message across a wide spectrum; how many times to send it, and how often; and myriad other technical concerns.

The scientific community continues to debate these questions. For instance, Philip Lubin of the University of California, Santa Barbara, has published research describing a laser array that could conceivably broadcast a signal through the observable universe.

Breakthrough is also working on where to send such a message, Worden adds. The $100 million Breakthrough Listen project is searching for any evidence of life in nearby star systems, which includes exoplanets out to a few hundred light years away.

"If six months from now, we start to see some interesting signals, we'll probably accelerate the Message program," he says.

The fact that there have been no signals yet does pose a conundrum. In a galaxy chock full of worlds, why isn't Earth crawling with alien visitors? The silence amid the presence of such plentiful planets is called the Fermi Paradox, named for the physicist Enrico Fermi, who first asked "Where is everybody?" in 1950.

In the decades since, astronomers have come up with possible explanations ranging from sociology to biological complexity. Aliens might be afraid of us, or consider us unworthy of attention, for instance. Or it may be that aliens communicate in ways that we can't comprehend, so we're just not hearing them. Or maybe aliens lack communication capability of any kind. Of course there's also the possibility that there are no aliens.

Image: Stephen Hawking announces the "Breakthrough Starshot" initiative in New York in 2016. Dennis Van Tine / Star Max/IPx via AP
But those questions don't address the larger one: Whether it's a good idea to find out. Some scientists, most notably Stephen Hawking, are convinced the answer is a firm "No."

"We only have to look at ourselves to see how intelligent life might develop into something we wouldn't want to meet," Hawking said in 2010. He has compared meeting aliens to Christopher Columbus meeting Native Americans: "That didn't turn out so well," he said.

Others have warned of catastrophic consequences ripped from the pages of science fiction: Marauding aliens that could follow our message like a homing beacon, and come here to exploit Earth's resources, exploit humans, or even to destroy all life as we know it.

"Any civilization detecting our presence is likely to be technologically very advanced, and may not be disposed to treat us nicely. At the very least, the idea seems morally questionable," physicist Mark Buchanan argued in the journal Nature Physics last fall.

Other astronomers think it's worth the risk — and they add, somewhat darkly, that it's too late anyway. We are a loud species, and our messages have been making their way through the cosmos since the dawn of radio.

"If we are in danger of an alien invasion, it's too late," wrote Douglas Vakoch, the director of Messaging Extraterrestrial Intelligence (METI) International, in a rebuttal last fall in Nature Physics. Vakoch, the most prominent METI proponent, argues that if we don't tell anyone we're here, we could miss out on new technology that could help humanity, or even protect us from other, less friendly aliens.

David Grinspoon, an author and astrobiologist at the Planetary Science Institute in Tucson, says he first thought, "'Oh, come on, you've got to be kidding me.' It seems kind of absurd aliens are going to come invade us, steal our precious bodily fluids, breed us like cattle, 'To Serve Man,' " a reference to a 1962 episode of "The Twilight Zone" in which aliens hatch a plan to use humans as a food source.

Originally, Grinspoon thought there would be no harm in setting up a cosmic lighthouse. "But I've listened to the other side, and I think they have a point," he adds. "If you live in a jungle that might be full of hungry lions, do you jump down from your tree and go, 'Yoo-hoo?'"

Many have already tried, albeit some more seriously than others.

In 2008, NASA broadcast the Beatles tune "Across the Universe" toward Polaris, the North Star, commemorating the space agency's 50th birthday, the 45th anniversary of the Deep Space Network, and the 40th anniversary of that song.

Later that year, a tech startup working with Ukraine's space agency beamed pictures and messages to the exoplanet Gliese 581 c. Other, sillier messages to the stars have included a Doritos commercial and a bunch of Craigslist ads.

Last October, the European Space Agency broadcast 3,775 text messages toward Polaris. It's not known to harbor any exoplanets, and even if it did, those messages would take some 425 years to arrive; yet the exercise, conceived by an artist, raised alarm among astronomers. Several prominent scientists, including Walkowicz, signed on to a statement guarding against any future METI efforts until some sort of international consortium could reach agreement.

Even if we don't send a carefully crafted message, we're already reaching for the stars. The Voyager probe is beyond the solar system in interstellar space, speeding toward a star 17.6 light-years from Earth. Soon, if Milner has his way, we may be sending even more robotic emissaries.

Milner's $100 million Breakthrough Starshot aims to send a fleet of paper-thin space chips to the Alpha Centauri system within a generation's time. Just last fall, astronomers revealed that a potentially rocky, Earth-sized planet orbits Proxima Centauri, a small red dwarf star in that system and the nearest star to our own, just over four light years away. The chips would use a powerful laser to accelerate to a significant fraction of the speed of light, covering the distance between the stars in roughly 20 years. A team of scientists and engineers is working on how to build the chips and the laser, according to Worden.

"If we find something interesting, obviously we're going to get a lot more detail if we can visit, and fly by," he says. "Who knows what's possible in 50 years?"

But some time sooner than that, we will need to decide whether to say anything at all. Ultimately, those discussions are important for humanity, Worden, Walkowicz and Grinspoon all say.

"Maybe it's more important that we get our act together on Earth," Grinspoon says. "We are struggling to find a kind of global identity on this planet that will allow us to survive the problems we've created for ourselves. Why not treat this as something that allows us to practice that kind of thinking and action?"
(FULL STORY)

Scientists May Have Solved the Biggest Mystery of the Big Bang
[2/4/2017]
THE UNANSWERED QUESTION

The European Organization for Nuclear Research (CERN) works to help us better understand what comprises the fabric of our universe. At this laboratory, which straddles the French-Swiss border near Geneva, engineers and physicists use particle accelerators and detectors to gain insight into the fundamental properties of matter and the laws of nature. Now, CERN scientists may have found an answer to one of the most pressing mysteries in the Standard Model of physics, and their research can be found in Nature Physics.

According to the Big Bang Theory, the universe began with the production of equal amounts of matter and antimatter. Since matter and antimatter cancel each other out, releasing light as they destroy each other, only a minuscule number of particles (mostly just radiation) should exist in the universe. But, clearly, we have more than just a few particles in our universe. So, what is the missing piece? Why is the amount of matter and the amount of antimatter so unbalanced?


The Standard Model of particle physics does account for a small percentage of this asymmetry, but the majority of the matter produced during the Big Bang remains unexplained. Noticing this serious gap in information, scientists theorized that the laws of physics are not the same for matter and antimatter (or particles and antiparticles). But how do they differ? Where do these laws separate?

This separation, known as charge-parity (CP) violation, has previously been seen in mesons (hadronic subatomic particles made of a quark and an antiquark), but the particles in question here are baryons. Finding evidence of CP violation in baryons would help scientists account for the amount of matter in the universe, and answer the question of why we have an asymmetric universe. After decades of effort, the scientists at CERN think they’ve done just that.

Using the LHCb detector at the Large Hadron Collider (LHC), CERN scientists were able to witness CP violation in baryon particles. The matter (Λb0) and antimatter (Λb0-bar) versions of the particle, produced in the collider’s proton collisions, decayed into different components, with a significant difference in the quantities of the matter and antimatter baryons. According to the team’s report, “The LHCb data revealed a significant level of asymmetries in those CP-violation-sensitive quantities for the Λb0 and Λb0-bar baryon decays, with differences in some cases as large as 20 percent.”
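A simplified way to picture what "decaying differently" means in numbers is to count how often a given decay configuration occurs for the baryon and for its antibaryon and form an asymmetry; zero means perfect CP symmetry. The sketch below uses made-up counts purely for illustration, whereas the real LHCb analysis compares more subtle angular (triple-product) asymmetries:

```python
def asymmetry(n_matter: int, n_antimatter: int) -> float:
    """Raw asymmetry between matter and antimatter counts; zero means perfect CP symmetry."""
    return (n_matter - n_antimatter) / (n_matter + n_antimatter)

# Made-up event counts for one decay configuration:
n_lambda_b = 1200       # Lambda-b decays observed with a given angular configuration
n_lambda_b_bar = 1000   # anti-Lambda-b decays with the mirrored configuration

print(f"Asymmetry: {asymmetry(n_lambda_b, n_lambda_b_bar):+.1%}")  # about +9%, nonzero -> CP violation
```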

WHAT DOES THIS MEAN?

This discovery isn’t yet statistically significant enough to claim that it is definitive proof of CP violation, but most believe that it is only a matter of time. “Particle physics results are dragged, kicking and screaming, out of the noise via careful statistical analysis; no discovery is complete until the chance of it being a fluke is below one in a million. This result isn’t there yet (it’s at about the one-in-a-thousand level),” says scientist Chris Lee. “The asymmetry will either be quickly strengthened or it will disappear entirely. However, given that the result for mesons is well and truly confirmed, it would be really strange for this result to turn out to be wrong.”

This borderline discovery is one huge leap toward fully understanding what happened before, during, and after the Big Bang. While developments in physics like this may seem, from the outside, to be technical achievements exciting only to scientists, this new information could be the key to unlocking one of the biggest mysteries in modern physics. If the scientists at CERN are able to prove that matter and antimatter do, in fact, obey separate laws of physics, science as we know it would change, and we would need to reevaluate our understanding of our physical world.
(FULL STORY)

New Research Shows the Universe May Have Once Been a Hologram
[1/31/2017]
A TWO-DIMENSIONAL BOUNDARY

New research suggests that the universe may have been a hologram at one point in time, specifically a few hundred thousand years after the Big Bang. The study, published in the journal Physical Review Letters, is the latest research on the “holographic principle,” which suggests that the laws of physics can apply to the universe as a two-dimensional plane.

“We are proposing using this holographic universe, which is a very different model of the Big Bang than the popularly accepted one that relies on gravity and inflation,” said lead author Niayesh Afshordi, professor of physics and astronomy at the University of Waterloo and Perimeter Institute. “Each of these models makes distinct predictions that we can test as we refine our data and improve our theoretical understanding – all within the next five years.”

The theory suggests that the volume of space appears three-dimensional, but is actually encoded on a two-dimensional boundary, or an observer-dependent horizon, that requires one less dimension than it appears to have. In short, we see it as three-dimensional, but it is projected from a two-dimensional source, similar to how a hologram works.

“The idea is similar to that of ordinary holograms, where a three-dimensional image is encoded in a two-dimensional surface, such as in the hologram on a credit card,” explained researcher Kostas Skenderis from the University of Southampton. “However, this time, the entire universe is encoded.”

MAKING SENSE OF COSMIC INFLATION

The researchers arrived at this conclusion after observing irregularities in the cosmic microwave background — the Big Bang’s remnant. The team used a model with one time dimension and two space dimensions. Actual data from the universe, including cosmic microwave background observations, were then plugged into the model. The researchers saw that the two fit well, but only for patches of sky no more than about 10 degrees across.

“I would say you don’t live in a hologram, but you could have come out of a hologram,” Afshordi told Gizmodo. “[In 2017], there are definitely three dimensions.”

While many accept the cosmic inflation that came after the Big Bang, our understanding of physics – including current general relativity and quantum mechanics theories – doesn’t work with what we observe. The fundamental laws of physics are incapable of explaining how the universe as we know it, with all its contents, could’ve fit in a small package that exponentially expanded.

This is where Afshordi’s research and the holographic model come in. These could lead to new theories about the Big Bang and a functioning theory of quantum gravity — a theory that meshes quantum mechanics with Einstein’s theory of gravity. “The key to understanding quantum gravity is understanding field theory in one lower dimension,” Afshordi says. “Holography is like a Rosetta Stone, translating between known theories of quantum fields without gravity and the uncharted territory of quantum gravity itself.”

The question remains, though: how did the universe transition from 2D to 3D? Further study is needed to explain this.
(FULL STORY)

Dark energy emerges when energy conservation is violated
[1/18/2017]
The conservation of energy is one of physicists' most cherished principles, but its violation could resolve a major scientific mystery: why is the expansion of the universe accelerating? That is the eye-catching claim of a group of theorists in France and Mexico, who have worked out that dark energy can take the form of Albert Einstein's cosmological constant by effectively sucking energy out of the cosmos as it expands.

The cosmological constant is a mathematical term describing an anti-gravitational force that Einstein had inserted into his equations of general relativity in order to counteract the mutual attraction of matter within a static universe. It was then described by Einstein as his "biggest blunder", after it was discovered that the universe is in fact expanding. But then the constant returned to favour in the late 1990s following the discovery that the universe's expansion is accelerating.

For many physicists, the cosmological constant is a natural candidate to explain dark energy. Since it is a property of space–time itself, the constant could represent the energy generated by the virtual particles that quantum mechanics dictates continually flit into and out of existence. Unfortunately the theoretical value of this "vacuum energy" is up to a staggering 120 orders of magnitude larger than observations of the universe's expansion imply.

The latest work, carried out by Alejandro Perez and Thibaut Josset of Aix Marseille University together with Daniel Sudarsky of the National Autonomous University of Mexico, proposes that the cosmological constant is instead the running total of all the non-conserved energy in the history of the universe. The "constant" in fact would vary – increasing when energy flows out of the universe and decreasing when it returns. However, the constant would appear unchanging in our current (low-density) epoch because its rate of change would be proportional to the universe's mass density. In this scheme, vacuum energy does not contribute to the cosmological constant.

The researchers had to look beyond general relativity because, like Newtonian mechanics, it requires energy to be conserved. Strictly speaking, relativity requires the conservation of a multi-component "energy-momentum tensor". That conservation is manifest in the fact that, on very small scales, space–time is flat, even though Einstein's theory tells us that mass distorts the geometry of space–time.


In contrast, most attempts to devise a theory of quantum gravity require space–time to come in discrete grains at the smallest (Planck-length) scales. That graininess opens the door to energy non-conservation. Unfortunately, no fully formed quantum-gravity theory exists yet, and so the trio instead turned to a variant of general relativity known as unimodular gravity, which allows some violation of energy conservation. They found that when they constrained the amount of energy that can be lost from (or gained by) the universe to be consistent with the cosmological principle – on very large scales the process must be both homogeneous and isotropic – the unimodular equations generated a cosmological-constant-like entity.

In the absence of a proper understanding of Planck-scale space–time graininess, the researchers were unable to calculate the exact size of the cosmological constant. Instead, they incorporated the unimodular equations into a couple of phenomenological models that exhibit energy non-conservation. One of these describes how matter might propagate in granular space–time, while the other modifies quantum mechanics to account for the disappearance of superposition states at macroscopic scales.

These models both contain two free parameters, which were adjusted to make the models consistent with null results from experiments that have looked for energy non-conservation in our local universe. Despite this severe constraint, the researchers found that the models generated a cosmological constant of the same order of magnitude as that observed. "We are saying that even though each individual violation of energy conservation is tiny, the accumulated effect of these violations over the very long history of the universe can lead to dark energy and accelerated expansion," Perez says.

In future, he says it might be possible to subject the new idea to more direct tests, such as observing supernovae very precisely to try to work out whether the universe's accelerating expansion is driven by a constant or varying force. The model could also be improved so that it captures dark-energy's evolution from just after the Big Bang – and then comparing the results with observations of the cosmic microwave background.

If the trio are ultimately proved right, it would not mean physicists having to throw their long-established conservation principles completely out of the window. A variation in the cosmological constant, Perez says, could point to a deeper, more abstract kind of conservation law. "Just as heat is energy stored in the chaotic motion of molecules, the cosmological constant would be 'energy' stored in the dynamics of atoms of space–time," he explains. "This energy would only appear to be lost if space–time is assumed to be smooth."

Other physicists are cautiously supportive of the new work. George Ellis of the University of Cape Town in South Africa describes the research as "no more fanciful than many other ideas being explored in theoretical physics at present". The fact that the models predict energy to be "effectively conserved on solar-system scales" – a crucial check, he says – makes the proposal "viable" in his view.

Lee Smolin of the Perimeter Institute for Theoretical Physics in Canada, meanwhile, praises the researchers for their "fresh new idea", which he describes as "speculative, but in the best way". He says that the proposal is "probably wrong", but that if it's right "it is revolutionary".
The research is described in Physical Review Letters.
(FULL STORY)

Physicists measure the loss of dark matter since the birth of the universe
[12/28/2016]
Russian scientists have discovered that the proportion of unstable particles in the composition of dark matter in the days immediately following the Big Bang was no more than 2 percent to 5 percent. Their study has been published in Physical Review D.

"The discrepancy between the cosmological parameters in the modern universe and the universe shortly after the Big Bang can be explained by the fact that the proportion of dark matter has decreased. We have now, for the first time, been able to calculate how much dark matter could have been lost, and what the corresponding size of the unstable component would be," says co-author Igor Tkachev of the Department of Experimental Physics at INR.
Astronomers first suspected that there was a large proportion of hidden mass in the universe back in the 1930s, when Fritz Zwicky discovered "peculiarities" in a cluster of galaxies in the constellation Coma Berenices—the galaxies moved as if they were under the effect of gravity from an unseen source. This hidden mass, which is only deduced from its gravitational effect, was given the name dark matter. According to data from the Planck space telescope, the proportion of dark matter in the universe is 26.8 percent; the rest is "ordinary" matter (4.9 percent) and dark energy (68.3 percent).
The nature of dark matter remains unknown. However, its properties could potentially help scientists to solve a problem that arose after studying observations from the Planck telescope. This device accurately measured the fluctuations in the temperature of the cosmic microwave background radiation—the "echo" of the Big Bang. By measuring these fluctuations, the researchers were able to calculate key cosmological parameters using observations of the universe in the recombination era—approximately 300,000 years after the Big Bang.
However, when researchers directly measured the speed of the expansion of galaxies in the modern universe, it turned out that some of these parameters varied significantly—namely the Hubble parameter, which describes the rate of expansion of the universe, and also the parameter associated with the number of galaxies in clusters. "This variance was significantly more than margins of error and systematic errors known to us. Therefore, we are either dealing with some kind of unknown error, or the composition of the ancient universe is considerably different to the modern universe," says Tkachev.
The concentration of the unstable component of dark matter, F, against the speed of expansion of non-gravitationally bound objects (proportional to the age of the universe), for various combinations of Planck data on several different cosmological phenomena.

The discrepancy can be explained by the decaying dark matter (DDM) hypothesis, which states that in the early universe, there was more dark matter, but then part of it decayed.


"Let us imagine that dark matter consists of several components, as in ordinary matter (protons, electrons, neutrons, neutrinos, photons). And one component consists of unstable particles with a rather long lifespan. In the era of the formation of hydrogen, hundreds of thousands of years after the Big Bang, they are still in the universe, but by now (billions of years later), they have disappeared, having decayed into neutrinos or hypothetical relativistic particles. In that case, the amount of dark matter in the era of hydrogen formation and today will be different," says lead author Dmitry Gorbunov, a professor at MIPT and staff member at INR.
The authors of the study analyzed Planck data and compared them with the DDM model and the standard ΛCDM (Lambda-cold dark matter) model with stable dark matter. The comparison showed that the DDM model is more consistent with the observational data. However, the researchers found that the effect of gravitational lensing (the distortion of cosmic microwave background radiation by a gravitational field) greatly limits the proportion of decaying dark matter in the DDM model.
Using data from observations of various cosmological effects, the researchers were able to give an estimate of the relative concentration of the decaying components of dark matter in the region of 2 percent to 5 percent.
"This means that in today's universe, there is 5 percent less dark matter than in the recombination era. We are not currently able to say how quickly this unstable part decayed; dark matter may still be disintegrating even now, although that would be a different and considerably more complex model," says Tkachev.

More information: A. Chudaykin et al, Dark matter component decaying after recombination: Lensing constraints with Planck data, Physical Review D (2016). DOI: 10.1103/PhysR
(FULL STORY)

This star has a secret – even better than 'alien megastructures'
[1/13/2017]
When Yale researcher Tabetha Boyajian first focused on the star KIC 8462852 via the Kepler Space Telescope in September 2015, she didn't know what to make of it.

The star's light was mysterious – it was far too dim for a star of its age and type, and it dipped in brightness intermittently. Theories around Tabby’s star, as it was nicknamed, quickly piled up, with some scientists attributing the atypical light curve to surrounding cosmic dust or nearby comets. But more excitable space enthusiasts predicted alien activity, arguing that only orbiting alien structures could block a star’s light so effectively.

The so-called alien megastructure hypothesis persisted longer than most extra-terrestrial-based theories, simply because scientists had few alternative ideas to explain the star's peculiar blinking – until now. And the latest theory is almost as intriguing as the alien hypothesis.

Dr. Boyajian and her team weren't the first to spot the star: it was actually discovered in 1890. But their questions about the star's light pattern – and the subsequent alien-related theories – made the star, well, something of a star.

“We’d never seen anything like this star,” Boyajian told the Atlantic in October 2015. “It was really weird. We thought it might be bad data or movement on the spacecraft, but everything checked out.”


KIC 8462852's story became more intriguing in January 2016, New Scientist reports, when a comparison of the first image taken of Tabby's star, in 1890, with one taken in 1989 revealed that the star had dimmed 14 percent in the interim 100 years. And over one particularly confusing two-day period, the star dipped in brightness by 22 percent.

Tabby’s star kept scientists scratching their heads all last year. Volatility in light patterns is typical for young stars, but KIC 8462852 is mature.

“The steady brightness change in KIC 8462852 is pretty astounding,” Ben Montet, a scientist at the California Institute of Technology, said in an October statement. “It is unprecedented for this type of star to slowly fade for years, and we don’t see anything else like it in the Kepler data.”

Now, a team of scientists from Columbia University and the University of California, Berkeley, say they have found a reasonable explanation to KIC 8462852’s strange lighting.

“Following an initial suggestion by Wright & Sigurdsson, we propose that the secular dimming behavior is the result of the inspiral of a planetary body or bodies into KIC 8462852, which took place ~10–10^4 years ago (depending on the planet mass),” the three authors write in a study to be published Monday in the Monthly Notices of the Royal Astronomical Society.

“Gravitational energy released as the body inspirals into the outer layers of the star caused a temporary and unobserved brightening, from which the stellar flux is now returning to the quiescent state.”

In other words, KIC 8462852 ate a planet sometime in the past 10,000 years.

The theory goes like this:

If KIC 8462852 did eat a planet – an extremely rare event, unless a collision pushed the planet out of its orbit – the star’s brightness would increase for a period of 200 to 10,000 years as it burned up the planet (a short interval in stellar terms). But once the burning was complete, the star would go back to around its original level of brightness.
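
A back-of-the-envelope estimate shows why the brightening would last roughly that long. The sketch below uses assumed, round-number stellar and planetary parameters (roughly a Jupiter-mass planet falling onto an F-type star similar to KIC 8462852); it is not the authors' calculation.

G = 6.674e-11                               # m^3 kg^-1 s^-2
M_sun, R_sun, L_sun = 1.989e30, 6.96e8, 3.828e26

M_star = 1.4 * M_sun                        # assumed stellar mass
R_star = 1.6 * R_sun                        # assumed stellar radius
L_star = 4.7 * L_sun                        # assumed stellar luminosity
m_planet = 1.9e27                           # kg, roughly one Jupiter mass (assumed)

energy = G * M_star * m_planet / R_star     # gravitational energy released, ~3e38 J
years = energy / L_star / 3.15e7            # time to re-radiate it at the star's luminosity
print(f"{energy:.1e} J, radiated over roughly {years:.0f} years")   # a few thousand years
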

So we could be looking at KIC 8462852 during its post-planet digestion, as it dims back to normal, write the authors.

And KIC 8462852 could have been a messy eater, leaving crumbs – aka orbiting planet debris – that periodically block its light. “This paper puts a merger scenario on the table in a credible way,” Jason Wright, an astronomer at Penn State University, tells New Scientist. “I think this moves it into the top tier of explanations.”
(FULL STORY)

Testing theories of modified gravity
[1/12/2017]
The accelerated expansion of the universe is usually attributed to a mysterious dark energy, but there’s another conceivable explanation: modified gravity. Unmodified gravity—that is, Einstein’s general relativity—satisfactorily accounts for the dynamics of the solar system, where precision measurements can be made without the confounding influence of dark matter. Nor have any violations been detected in one of general relativity’s principal ingredients, the strong equivalence principle, which posits that inertial mass and gravitational mass are identical even when part of an object’s mass comes from its own gravitational binding energy.

But those observational constraints are not ineluctable. In particular, a class of gravitational theories called Galileon models can also pass them. In 2012 Lam Hui and Alberto Nicolis of Columbia University devised a cosmic test that could refute or confirm the models. Their test hinges on the models’ central feature: an additional scalar field that couples to mass. The coupling can be characterized by a charge-like parameter, Q. For most cosmic objects, Q has the same value as the inertial mass. But for a black hole, whose mass arises entirely from its gravitational binding energy, Q is zero; the strong equivalence principle is violated.

Galaxies fall through space away from low concentrations of mass and toward high concentrations. The supermassive black holes at the centers of some galaxies are carried along with the flow. But if gravity has a Galileon component, the black hole feels less of a tug than do the galaxy’s stars, interstellar medium, and dark-matter particles. The upshot, Hui and Nicolis realized, is that the black hole will lag the rest of the galaxy and slip away from its center. The displacement is arrested when the black hole reaches the point where the lag is offset by the presence of more of the galaxy’s gravitational mass on one side of the black hole than on the other. Given the right circumstances, the displacement can be measured.
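
To make that balance concrete, here is a toy estimate under stated assumptions (a roughly constant-density galactic core, an assumed external acceleration, and an assumed fifth-force strength; none of these numbers come from the papers): the black hole settles at the offset where the galaxy's restoring gravity equals the scalar-force pull that it, unlike the stars, does not feel.

import math

G = 6.674e-11        # m^3 kg^-1 s^-2
rho_core = 1e-18     # kg/m^3, assumed central stellar density (~15 solar masses per cubic parsec)
g_ext = 1e-10        # m/s^2, assumed external gravitational pull on the galaxy
eps = 0.1            # assumed fractional strength of the Galileon (fifth) force

# In a roughly constant-density core the restoring acceleration grows as
# (4*pi/3)*G*rho_core*r; setting it equal to the un-felt fifth-force
# acceleration eps*g_ext gives the equilibrium offset.
offset_m = 3 * eps * g_ext / (4 * math.pi * G * rho_core)
print(offset_m / 3.086e16, "parsecs")        # of order a parsec for these assumptions
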

Hui and Nicolis’s proposal has now itself been put to the test. Asha Asvathaman and Jeremy Heyl of the University of British Columbia, together with Hui, have applied it to two galaxies: M32, which is being pulled toward its larger neighbor, the Andromeda galaxy, and M87 (shown here), which is being pulled through the Virgo cluster of galaxies. Both M32 and M87 are elliptical galaxies. Because of their simple shapes, their centroids can be determined from optical observations. The locations of their respective black holes can be determined from radio observations. Although the limit on Galileon gravity that Asvathaman, Heyl, and Hui derived was too loose to refute or confirm the theory, they nevertheless validated the test itself. More precise astrometric observations could make it decisive. (A. Asvathaman, J. S. Heyl, L. Hui, Mon. Not. R. Astron. Soc., in press.)
(FULL STORY)

A simple explanation of mysterious space-stretching ‘dark energy?’
[1/10/2017]
For nearly 2 decades, cosmologists have known that the expansion of the universe is accelerating, as if some mysterious "dark energy" is blowing it up like a balloon. Just what dark energy is remains one of the biggest mysteries in physics. Now, a trio of theorists argues that dark energy could spring from a surprising source. Weirdly, they say, dark energy could come about because—contrary to what you learned in your high school physics class—the total amount of energy in the universe isn't fixed, or "conserved," but may gradually disappear.

"It's a great direction to explore," says George Ellis, a theorist at the University of Cape Town in South Africa, who was not involved in the work. But Antonio Padilla, a theorist at the University of Nottingham in the United Kingdom, says, "I don't necessarily buy what they've done."

Dark energy could be a new field, a bit like an electric field, that fills space. Or it could be part of space itself—a pressure inherent in the vacuum—called a cosmological constant. The second scenario jibes well with Einstein's theory of general relativity, which posits that gravity arises when mass and energy warp space and time. In fact, Einstein invented the cosmological constant—literally by adding a constant to his famous differential equations—to explain how the universe resisted collapsing under its own gravity. But he gave up on the idea as unnecessary when in the 1920s astronomers discovered that the universe isn't static, but is expanding as if born in an explosion.

With the observation that the expansion of the universe is accelerating, the cosmological constant has made a comeback. Bring in quantum mechanics and the case for the cosmological constant gets tricky, however. Quantum mechanics suggests the vacuum itself should fluctuate imperceptibly. In general relativity, those tiny quantum fluctuations produce an energy that would serve as the cosmological constant. Yet, it should be 120 orders of magnitude too big—big enough to obliterate the universe. So explaining why there is a cosmological constant, but just a little bitty one, poses a major conceptual puzzle for physicists. (When there was no need for a cosmological constant, theorists assumed that some as-yet-unknown effect simply nailed it to zero.)
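
The mismatch is easy to reproduce with a back-of-the-envelope calculation; the sketch below compares the naive Planck-scale vacuum energy density with the observed dark-energy density (standard physical constants, with an assumed Hubble constant of 70 km/s/Mpc).

import math

hbar, c, G = 1.055e-34, 2.998e8, 6.674e-11          # SI units
H0 = 70 * 1000 / 3.086e22                            # assumed Hubble constant, in s^-1

planck_mass = math.sqrt(hbar * c / G)                # kg
planck_length = math.sqrt(hbar * G / c**3)           # m
rho_planck = planck_mass / planck_length**3          # naive vacuum energy density, kg/m^3

rho_crit = 3 * H0**2 / (8 * math.pi * G)             # critical density, kg/m^3
rho_lambda = 0.68 * rho_crit                         # observed dark-energy density

print(math.log10(rho_planck / rho_lambda))           # roughly 123, i.e. about 120 orders of magnitude
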

Now, Thibault Josset and Alejandro Perez of Aix-Marseille University in France and Daniel Sudarsky of the National Autonomous University of Mexico in Mexico City say they have found a way to get a reasonable value for the cosmological constant. They begin with a variant of general relativity that Einstein himself invented called unimodular gravity. General relativity assumes a mathematical symmetry called general covariance, which says that no matter how you label or map spacetime coordinates—i.e. positions and times of events—the predictions of the theory must be the same. That symmetry immediately requires that energy and momentum are conserved. Unimodular gravity possesses a more limited version of that mathematical symmetry.

Unimodular gravity reproduces most of the predictions of general relativity. However, in unimodular gravity, quantum fluctuations of the vacuum do not gravitate or add to the cosmological constant, which is once again just a constant that can be set to the desired value. There is a cost, though: unimodular gravity doesn't require energy to be conserved, so theorists have to impose that constraint by hand.

Now, however, Josset, Perez, and Sudarsky show that in unimodular gravity, if they simply allow energy and momentum conservation to be violated, the violation itself sets the value of the cosmological constant. The argument is mathematical, but essentially the tiny bit of energy that disappears from the universe leaves its trace by gradually changing the cosmological constant. "In the model, dark energy is something that keeps track of how much energy and momentum has been lost over the history of the universe," Perez says.

To show that the theory gives reasonable results, the theorists consider two scenarios of how the violation of energy conservation might come about in theories that address foundational issues in quantum mechanics. For example, a theory called continuous spontaneous localization (CSL) tries to explain why a subatomic particle like an electron can literally be in two places at once, but a big object like a car cannot. CSL assumes that such two-places-at-once states spontaneously collapse to one place or the other with a probability that increases with an object's size, making it impossible for a large object to stay in the two-place state. The knock against CSL is that it doesn't conserve energy. But the theorists show that the amount that energy conservation is violated would be roughly enough to give a cosmological constant of the right size.

The work's novelty lies in using the violation of conservation of energy to tie dark energy to possible extensions of quantum theory, says Lee Smolin, a theorist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada. "It's in no way definitive," he says. "But it's an interesting hypothesis that unites these two things, which to my knowledge nobody has tried to connect before."

However, Padilla says the theorists are playing mathematical sleight-of-hand. They still have to assume that the cosmological constant starts with some small value that they don't explain, he says. But Ellis notes that physics abounds with unexplained constants such as the charge of the electron or the speed of light. "This just adds one more constant to the long list."

Padilla also argues that the work runs contrary to the idea that phenomena on the biggest scales should not depend on those at the smallest scales. "You're trying to describe something on the scale of the universe," he says. "Do you really expect it to be sensitive to the details of quantum mechanics?" But Smolin argues that the cosmological constant problem already links the cosmic and quantum realms. So, he says, "It's a new idea that could possibly be right and thus is worth getting interested in."
(FULL STORY)

Physicists detect exotic looped trajectories of light in three-slit experiment
[1/6/2017]
Physicists have performed a variation of the famous 200-year-old double-slit experiment that, for the first time, involves "exotic looped trajectories" of photons. These photons travel forward through one slit, then loop around and travel back through another slit, and then sometimes loop around again and travel forward through a third slit.

Interestingly, the contribution of these looped trajectories to the overall interference pattern leads to an apparent deviation from the usual form of the superposition principle. This apparent deviation can be understood as an incorrect application of the superposition principle—once the additional interference between looped and straight trajectories is accounted for, the superposition can be correctly applied.

The team of physicists, led by Omar S. Magaña-Loaiza and Israel De Leon, has published a paper on the new experiment in a recent issue of Nature Communications.

Loops of light

"Our work is the first experimental observation of looped trajectories," De Leon told Phys.org. "Looped trajectories are extremely difficult to detect because of their low probability of occurrence. Previously, researchers had suggested that these exotic trajectories could exist but failed to observe them."

To increase the probability of the occurrence of looped trajectories, the researchers designed a three-slit structure that supports surface plasmons, which the scientists describe as "strongly confined electromagnetic fields that can exist at the surface of metals." The presence of these electromagnetic fields near the three slits increases the contribution of looped trajectories to the overall interference pattern by almost two orders of magnitude.

"We provided a physical explanation that links the probability of these exotic trajectories to the near fields around the slits," De Leon said. "As such, one can increase the strength of near fields around the slits to increase the probability of photons following looped trajectories."

Superposition principle accounting for looped trajectories

The new three-slit experiment with looped trajectories is just one of many variations of the original double-slit experiment, first performed by Thomas Young in 1801. Since then, researchers have been performing versions that use electrons, atoms, or molecules instead of photons.

One of the reasons why the double-slit experiment has attracted so much attention is that it represents a physical manifestation of the principle of quantum superposition. The observation that individual particles can create an interference pattern implies that the particles must travel through both slits at the same time. This ability to occupy two places, or states, at once is the defining feature of quantum superposition.

Straight trajectories (green) and exotic looped trajectories (red, dashed, dotted) of light, where the red cloud near the surface depicts the near fields, which increase the probability of photons following looped trajectories. The graphs at left show simulations (top) and experimental results (bottom) of the large difference between the interference pattern predicted when the illuminated slit is treated independently (gray line) and that of the actual coupled system (blue line). The remarkable difference between the gray and blue lines is caused by the looped trajectories. Credit: Magaña-Loaiza et al. Nature Communications
So far, all previous versions of the experiment have produced results that appear to be accurately described by the principle of superposition. This is because looped trajectories are so rare under normal conditions that their contribution to the overall interference pattern is typically negligible, and so applying the superposition principle to those cases results in a very good approximation.

It is when the contribution of the looped trajectories becomes non-negligible that it becomes apparent that the total interference is not simply the superposition of individual wavefunctions of photons with straight trajectories, and so the interference pattern is not correctly described by the usual form of the superposition principle.
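
The effect can be illustrated with a minimal one-dimensional toy model; the geometry, wavelength, and looped-path amplitude below are assumptions for illustration, not the experiment's parameters. With only one slit illuminated, a small amplitude for a looped path that re-emerges from a neighbouring slit is enough to imprint weak fringes on an otherwise smooth single-slit pattern.

import numpy as np

wavelength = 810e-9                       # m, assumed photon wavelength
k = 2 * np.pi / wavelength
slit_separation = 4.6e-6                  # m, assumed
screen_distance = 0.1                     # m, assumed
x = np.linspace(-2e-2, 2e-2, 4000)        # position on the screen, m

# Path lengths from the illuminated slit (straight trajectory) and from the
# neighbouring slit through which a looped trajectory re-emerges.
r_straight = np.sqrt(screen_distance**2 + x**2)
r_looped = np.sqrt(screen_distance**2 + (x - slit_separation)**2)

psi_straight = np.exp(1j * k * r_straight)
psi_looped = 0.05 * np.exp(1j * k * r_looped)       # small assumed looped-path amplitude

no_loops = np.abs(psi_straight)**2                  # flat: no fringes from the straight path alone
with_loops = np.abs(psi_straight + psi_looped)**2   # weak fringes appear
print(with_loops.max() / with_loops.min())          # contrast ratio above 1 signals interference
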

Magaña-Loaiza explained this apparent deviation in more detail:

"The superposition principle is always valid—what is not valid is the inaccurate application of the superposition principle to a system with two or three slits," he said.

"For the past two centuries, scientists have assumed that one cannot observe interference if only one slit is illuminated in a two- or three-slit interferometer, and this is because this scenario represents the usual or typical case.

"However, in our paper we demonstrate that this is true only if the probability of photons to follow looped trajectories is negligible. Surprisingly, interference fringes are formed when photons following looped trajectories interfere with photons following straight (direct) trajectories, even when only one of the three slits is illuminated.

"The superposition principle can be applied to this surprising scenario by using the sum or 'superposition' of two wavefunctions; one describing a straight trajectory and the other describing looped trajectories. Not taking into account looped trajectories would represent an incorrect application of the superposition principle.

"To some extent, this effect is strange because scientists know that Thomas Young observed interference when he illuminated both slits and not only one. This is true only if the probability of photons following looped trajectories is negligible."

In addition to impacting physicists' understanding of the superposition principle as it is applied to these experiments, the results also reveal new properties of light that could have applications for quantum simulators and other technologies that rely on interference effects.

"We believe that exotic looped paths can have important implications in the study of decoherence mechanisms in interferometry or to increase the complexity of certain protocols for quantum random walks, quantum simulators, and other algorithms used in quantum computation," De Leon said.
(FULL STORY)

Actual footage shows what it was like to land on Saturn's moon Titan
[1/12/2017]
In 2005, an alien probe flew through the hazy and cold atmosphere of Titan, the largest moon of Saturn, and landed on the world's surface.

That spacecraft — named the Huygens probe — was sent from Earth by the European Space Agency along with the Cassini spacecraft to help humanity learn more about Saturn and its 53 known moons.

Thanks to a new video released by NASA, you can relive Huygens' descent to Titan's surface 12 years after the probe landed.

The video shows actual footage from the spacecraft's point of view as it passed through the hazy layers of Titan's atmosphere, spotted "drainage canals" that suggest rivers of liquid methane run on the moon, and gently set down on the surface, NASA said.
(FULL STORY)


News Archives