Baylor University
Department of Physics
College of Arts and Sciences

Physics News

Top News
•  Gravitational Waves Could Help Us Detect the Universe’s Hidden Dimensions
•  We could detect alien life by finding complex molecules
•  We May Have Uncovered the First Ever Evidence of the Multiverse
•  Scientists Just Discovered an Alien Planet That’s The Best Candidate for Life As We Know It
•  Physicists detect whiff of new particle at the Large Hadron Collider
•  Physicists Discover Hidden Aspects of Electrodynamics
•  A dark matter 'bridge' holding galaxies together has been captured for the first time
•  No, Dark Energy Isn't An Illusion
•  Satellite galaxies at edge of Milky Way coexist with dark matter
•  Magnetic hard drives go atomic
•  Could Mysterious Cosmic Light Flashes Be Powering Alien Spacecraft?
•  NASA is Going to Create The Coldest Spot in the Known Universe
•  Testing theories of modified gravity
•  First Solid Sign that Matter Doesn't Behave Like Antimatter
•  Physicists investigate erasing information at zero energy cost
•  NASA Just Found A Solar System With 7 Earth-Like Planets
•  Nearby Star Has 7 Earth-Sized Worlds - Most In Habitable Zone
•  Data About 2 Distant Asteroids: Clues to the Possible Planet Nine
•  Tune Your Radio: Galaxies Sing When Forming Stars
•  Coders Race to Save NASA's Climate Data
•  You Can Help Scientists Find the Next Earth-Like Planet
•  Scientists Discover Over 100 New Exoplanets
•  Why These Scientists Fear Contact With Space Aliens
•  Scientists May Have Solved the Biggest Mystery of the Big Bang
•  New Research Shows the Universe May Have Once Been a Hologram
•  Dark energy emerges when energy conservation is violated
•  Physicists measure the loss of dark matter since the birth of the universe
•  This star has a secret – even better than 'alien megastructures'
•  A simple explanation of mysterious space-stretching ‘dark energy?’
•  Physicists detect exotic looped trajectories of light in three-slit experiment
•  Actual footage shows what it was like to land on Saturn's moon Titan
•  Quaternions are introduced, October 16, 1843
•  The Sound Of Quantum Vacuum
•  Multiple copies of the Standard Model could solve the hierarchy problem
•  Universe May Have Lost 'Unstable' Dark Matter
•  Vera Rubin, Astronomer Who Did Pioneering Work on Dark Matter, Dies at 88
•  China's Hunt for Signals From the Dark Universe
•  Baylor Physics Ph.D. Graduate Quoted in "How Realistic Is the Interstellar Ship from 'Passengers'?"
•  Shutting a new door on locality
•  Unexpected interaction between dark matter and ordinary matter in mini-spiral galaxies
•  Thermodynamics constrains interpretations of quantum mechanics
•  Billions of Stars and Galaxies to Be Discovered in the Largest Cosmic Map Ever
•  Scientists Measure Antimatter for the First Time
•  Europe's Bold Plan for a Moon Base Is Coming Together
•  Einstein's Theory Just Put the Brakes on the Sun's Spin
•  Dying Star Offers Glimpse of Earth's Doomsday in 5B Years
•  Dark Matter Not So Clumpy After All
•  Scientists Catch "Virtual Particles" Hopping In and Out of Existence
•  New theory of gravity might explain dark matter
•  Supersolids produced in exotic state of quantum matter
•  You Can 3D Print Your Own Mini Universe
•  Creating Antimatter Via Lasers?
•  No, Astronomers Haven't Decided Dark Energy Is Nonexistent
•  Behind This Plant's Blue Leaves Lies a Weird Trick of Quantum Mechanics
•  Small entropy changes allow quantum measurements to be nearly reversed
•  Did the Mysterious 'Planet Nine' Tilt the Solar System?
•  Cosmological mystery solved by largest ever map of voids and superclusters
•  The Universe Has 10 Times More Galaxies Than Scientists Thought
•  Correlation between galaxy rotation and visible matter puzzles astronomers
•  The Spooky Secret Behind Artificial Intelligence's Incredible Power
•  Science of Disbelief: When Did Climate Change Become All About Politics?
•  Eyeballing Proxima b: Probably Not a Second Earth
•  Does the Drake Equation Confirm There Is Intelligent Alien Life in the Galaxy?
•  Scientists build world's smallest transistor
•  'Alien Megastructure' Star Keeps Getting Stranger
•  What's Out There? 'Star Men' Doc Tackles Life Questions Through Science
•  Evidence for new form of matter-antimatter asymmetry observed
•  Giant hidden Jupiters may explain lonely planet systems
•  Rarest nucleus reluctant to decay
•  Weird Science: 3 Win Nobel for Unusual States of Matter
•  Methane didn’t warm ancient Earth, new simulations suggest
•  New 'Artificial Synapses' Pave Way for Brain-Like Computers
•  Stephen Hawking Is Still Afraid of Aliens
•  The Ig Nobel Prize Winners of 2016
•  Teleported Laser Pulses? Quantum Teleportation Approaches Sci-Fi Level
•  China Claims It Developed "Quantum" Radar To See Stealth Planes
•  Earth Wobbles May Have Driven Ancient Humans Out of Africa
•  Alien Planet Has 2 Suns Instead of 1, Hubble Telescope Reveals
•  Glider Will Attempt Record-Breaking Flight to Edge of Space
•  Entangled Particles Reveal Even Spookier Action Than Thought
•  Dark Matter Just Got Murkier
•  New 'Gel' May Be Step Toward Clothing That Computes
•  3.7-Billion-Year-Old Rock May Hold Earth's Oldest Fossils
•  Planck: First Stars Formed Later Than We Thought
•  Galaxy Cluster 11.1 Billion Light-Years from Earth Is Most Distant Ever Seen
•  What Earth's Oldest Fossils Mean for Finding Life on Mars
•  Earth Just Narrowly Missed Getting Hit by an Asteroid
•  Astrobiology Primer v2.0 Released
•  A new class of galaxy has been discovered, one made almost entirely of dark matter
•  How We Could Visit the Possibly Earth-Like Planet Proxima b
•  'Virtual' Particles Are Just 'Wiggles' in the Electromagnetic Field
•  Are tiny BLACK HOLES hitting Earth once every 1,000 years? Experts claim primordial phenomenon could explain dark matter
•  ‘Largest structure in the universe’ undermines fundamental cosmic principles
•  "Kitchen Smoke" in nebula offer clues to the building blocks of life
•  Brian Krill: Evolution of the 21st-Century Scientist
•  Simulated black hole experiment backs Hawking prediction
•  Deuteron joins proton as smaller than expected
•  Scientists Identify 20 Alien Worlds Most Likely to Be Like Earth
•  ‘Alien Megastructure’ Star Mystery Deepens After Fresh Kepler Data Confirms Erratic Dimming

Gravitational Waves Could Help Us Detect the Universe’s Hidden Dimensions
Gravitational waves might be used to uncover hidden dimensions in the universe. Researchers at the Max Planck Institute for Gravitational Physics in Germany say we could work out what imprint extra dimensions would leave on these ripples in spacetime, and then search the observed signals for those effects.

The discovery of gravitational waves was announced in February 2016. Scientists used the Laser Interferometer Gravitational-wave Observatory (LIGO) detectors to find fluctuations in spacetime created by a pair of colliding black holes. Scientists can now use this information to see the universe in a whole new way—potentially even one day tracing waves that came from the Big Bang.

At present, our models of the universe are incomplete. They cannot explain many of the things we observe in the universe, so many physicists believe we are missing something—and that something could be the presence of extra dimensions.

If scientists were to find evidence of extra dimensions, they could start answering some of the most fundamental unknowns of the universe, like what dark matter is and why the universe is expanding at an accelerating rate.

Gravitational waves are ripples in spacetime caused by extremely energetic events. These events, like merging black holes, would release so much energy they would disrupt the way spacetime moves, creating ‘waves’ that would propagate out from the source—similar to the way a pebble thrown into a pond creates ripples moving outwards.

Gravitational waves were first predicted by Albert Einstein over 100 years ago, but until now we have not been able to find them. By the time the ripples reach us, they are so tiny that detecting them requires hugely sensitive equipment. This is what LIGO was able to do.
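The phrase "hugely sensitive" can be made concrete with a back-of-the-envelope strain calculation. The numbers below are commonly quoted ballpark figures (not values from this article):

```python
# Gravitational-wave strain h is a fractional length change: h = dL / L.
# Ballpark figures, assumed for illustration:
h = 1e-21          # strain of a strong black-hole-merger signal at Earth
arm_length = 4e3   # LIGO arm length in meters (4 km)

dL = h * arm_length          # absolute arm-length change, in meters
proton_diameter = 1.7e-15    # meters, approximate

print(f"arm stretch: {dL:.1e} m")                            # ~4e-18 m
print(f"fraction of a proton: {dL / proton_diameter:.1e}")   # a few thousandths
```

In other words, LIGO registers arm-length changes thousands of times smaller than a proton.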

In the latest study, posted to a preprint server, David Andriot and Gustavo Lucena Gómez look at how gravitational waves move through the known dimensions—three representing space and another for time. They then investigate what effects extra dimensions might have on the four-dimensional waves we see.

“If there are extra dimensions in the universe, then gravitational waves can walk along any dimension, even the extra dimensions,” Lucena Gómez told New Scientist.

They found extra dimensions could have two effects on gravitational waves—firstly, they would have what they call a “breathing mode.” This provides another way for the gravitational waves to move in space.

“The breathing mode deforms the space in a specific manner, giving a distinct signature,” they wrote. To observe this change, they would need three detectors like LIGO all working to observe the same thing at the same time—something that “should be available in a near future,” they wrote.

The second effect is a “massive tower” of extra gravitational waves. These would appear at frequencies far higher than current detectors can reach; to observe them, LIGO would need to be thousands of times more sensitive.

The scientists are clear that such apparatus does not exist, but note: “If such a detector were available, however, one could hope for a very clean signal, since there is no known astrophysical process emitting gravitational waves with frequencies much greater than 10³ Hz. Such high frequencies may thus be clear symptoms of new physics.”

However, Bobby Acharya, professor of Theoretical Physics at King’s College London, U.K., who was not involved in the study, is not convinced by the findings. In an interview with Newsweek, he says that while he firmly believes in the existence of extra dimensions, models suggest these dimensions would be extremely small: “That means that in order to excite them and create waves in those extra dimensions you require a lot of energy,” he says.

“And if you did produce the gravitational wave that propagated in the extra dimensions, the fact that extra dimensions are so small it means the frequency of this gravitational wave will be very high—much higher than the LIGO gravitational wave detectors can detect.”

He said you would need a “very optimistic point of view” to try to detect gravitational waves propagating in extra dimensions: “[The extra dimensions] would have to be rather large and then it would be difficult to make the model consistent with other observations. I’m not so positive about the result.”

We could detect alien life by finding complex molecules
By Bob Holmes in Mesa, Arizona

How can we search for life on other planets when we don’t know what it might look like? One chemist thinks he has found an easy answer: just look for sophisticated molecular structures, no matter what they’re made of. The strategy could provide a simple way for upcoming space missions to broaden the hunt.

Until now, the search for traces of life, or biosignatures, on other planets has tended to focus mostly on molecules like those used by earthly life. Thus, Mars missions look for organic molecules, and future missions to Europa may look for amino acids, unequal proportions of mirror-image molecules, and unusual ratios of carbon isotopes, all of which are signatures of life here on Earth.

But if alien life is very different, it may not show any of these. “I think there’s a real possibility we could miss life if [resembling Earth life is] the only criterion,” says Mary Voytek, who heads NASA’s astrobiology programme.

Now Lee Cronin, a chemist at the University of Glasgow, UK, argues that complexity could be a biosignature that doesn’t depend on any assumptions about the life forms that produce it. “Biology has one signature: the ability to produce complex things that could not arise in the natural environment,” Cronin says.

Obviously, an aircraft or a mobile phone could not assemble spontaneously, so their existence points to a living – and even intelligent – being that built them. But simpler things like proteins, DNA molecules or steroid hormones are also highly unlikely to occur without being assembled by a living organism, Cronin says.

Step by step
Cronin has developed a way to measure the complexity of a molecule by counting the number of unique steps – adding chemical side groups or ring structures, for example – needed for its formation, without double-counting repeated steps. To draw an analogy, his metric would score the words “bana” and “banana” as equally complex, since once you can make one “na” it is trivial to add a second one.
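Cronin's actual metric counts bond-forming steps in molecules. Purely as a toy illustration of the "count steps, but reuse repeats" idea on the article's own example, here is a hypothetical brute-force sketch on strings (my construction, not his algorithm). In this toy count, the second "na" in "banana" costs only a single extra join:

```python
from itertools import product

def assembly_steps(target):
    """Minimum number of join steps needed to build `target`, where each
    step concatenates two already-available strings and every product can
    be reused. Single characters are free starting material. Brute-force
    breadth-first search: only practical for very short strings."""
    if len(target) <= 1:
        return 0
    start = frozenset(target)        # the unique characters, free
    frontier = {start}
    steps = 0
    while frontier:
        steps += 1
        next_frontier = set()
        for avail in frontier:
            for x, y in product(avail, repeat=2):
                z = x + y
                if z == target:
                    return steps
                # prune: only keep intermediates that appear in the target
                if z in target and z not in avail:
                    next_frontier.add(avail | {z})
        frontier = next_frontier

print(assembly_steps("bana"))    # 3: na, a+na, b+ana
print(assembly_steps("banana"))  # 4: the extra "na" adds just one cheap join
```

The point of the toy model is the reuse: once "na" has been built, appending another copy is one trivial step, so repetition adds almost no complexity.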

Any structure requiring more than about 15 steps is so complex it must be biological in origin, he said this week at the Astrobiology Science Conference in Mesa, Arizona.

Cronin thinks he may be able to make that criterion simpler still, by specifying a maximum molecular weight for compounds that can assemble spontaneously.

Astrobiologists welcome Cronin’s suggestion. “I appreciate Lee for developing a biosignature that has minimal assumptions about the biology,” says Voytek.

In practice, though, Voytek notes that a detector compact enough to travel on an interplanetary mission would probably need to be designed to look for carbon-based life.

And even if Cronin’s method works, no scientist would risk claiming to have found extraterrestrial life on the basis of just one line of evidence, says Kevin Hand of NASA’s Jet Propulsion Laboratory and project scientist for the Europa Lander mission now being developed by NASA. That means that future missions will still need to look for multiple biosignatures.

We May Have Uncovered the First Ever Evidence of the Multiverse

For years, scientists have been baffled by a weird anomaly far away in space: a mysterious “Cold Spot” about 1.8 billion light-years across. It is cooler than its surroundings by around 0.00015 degrees Celsius (0.00027 degrees Fahrenheit), a fact astronomers discovered by measuring background radiation throughout the universe.
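For scale: the cosmic microwave background's mean temperature is about 2.725 K (a standard value, assumed here, not stated in the article), so the quoted dip is a few parts in 100,000:

```python
t_cmb = 2.725    # mean CMB temperature in kelvin (standard value, assumed)
dip = 0.00015    # Cold Spot temperature deficit from the article
                 # (a difference of 0.00015 C equals 0.00015 K)

fractional_depth = dip / t_cmb
print(f"fractional depth: {fractional_depth:.1e}")   # ~5.5e-5
```

Anomalies of that size are detectable only because CMB maps are themselves precise to the microkelvin level.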

Previously, astronomers believed that this space could be cooler simply because it had less matter in it than most sections of space. They dubbed it a massive supervoid and estimated that it had 10,000 galaxies fewer than other comparable sections of space.

But now, in a recently published survey of galaxies, astronomers from the Royal Astronomical Society (RAS) say they have discovered that this supervoid does not exist. They now believe that the galaxies in the cold spot are just clustered around smaller voids that populate the cold spot like bubbles. These small voids, however, cannot explain the temperature difference observed.


To link the temperature differences to the smaller voids, the researchers say a non-standard cosmological model would be required. “But our data place powerful constraints on any attempt to do that,” explained researcher Ruari Mackenzie in an RAS press release. While the study had a large margin of error, the simulations suggest there is only a two percent probability that the Cold Spot formed randomly.

“This means we can’t entirely rule out that the Spot is caused by an unlikely fluctuation explained by the standard model. But if that isn’t the answer, then there are more exotic explanations,” said researcher Tom Shanks in the press release. “Perhaps the most exciting of these is that the Cold Spot was caused by a collision between our universe and another bubble universe.”

If more detailed studies support the findings of this research, the Cold Spot might turn out to be the first evidence for the multiverse, though far more evidence would be needed to confirm our universe is indeed one of many.

Scientists Just Discovered an Alien Planet That’s The Best Candidate for Life As We Know It

Only a few decades ago, the existence of planets beyond our solar system was purely hypothetical. Now, we know of thousands of such planets – and today, scientists may have discovered the best candidate yet for alien life.

That candidate is an exoplanet orbiting a red dwarf star 40 light-years from Earth—what the international team of astronomers who discovered it have deemed a “super-Earth.” Using ESO’s HARPS instrument and a range of telescopes around the world, the astronomers located the exoplanet orbiting the dim star – LHS 1140 – within its habitable zone. This world passes in front of its parent star as it orbits, has likely retained most of its atmosphere, and is a little larger and much more massive than the Earth. In short, super-Earth LHS 1140b is among the most exciting known targets for atmospheric studies.

Although LHS 1140b orbits ten times closer to its faint red dwarf star than the Earth does to the Sun, red dwarfs are much smaller and cooler than the Sun, so the super-Earth still lies in the middle of the habitable zone and receives around half as much sunlight from its star as the Earth does.
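The two quoted facts, ten times closer and half the sunlight, pin down the star's brightness via the inverse-square law. A quick consistency check, treating the article's round numbers as given:

```python
# Insolation relative to Earth: S = (L / L_sun) / (d / AU)^2,
# the standard inverse-square scaling for stellar flux.
d = 0.1      # orbital distance in AU ("ten times closer than Earth")
s = 0.5      # insolation in Earth units ("half as much sunlight")

# Solve for the star's luminosity in solar units:
luminosity = s * d**2
print(f"implied luminosity: {luminosity:.3f} L_sun")  # 0.005 L_sun: a very dim star
```

A star shining at half a percent of the Sun's output is exactly what one expects of a faint red dwarf, so the article's numbers hang together.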

“This is the most exciting exoplanet I’ve seen in the past decade,” lead author Jason Dittmann of the Harvard-Smithsonian Center for Astrophysics said in an ESO science release. “We could hardly hope for a better target to perform one of the biggest quests in science — searching for evidence of life beyond Earth.”

Artist’s impression of the super-Earth exoplanet LHS 1140b. Credit: ESO

To support life as we know it, a planet must retain an atmosphere and have liquid surface water. When red dwarf stars are young, they emit radiation that can damage the atmospheres of planets around them. This planet’s large size indicates that a magma ocean may have existed on its surface for eons, feeding steam into the atmosphere and replenishing the planet with water until well after the star had settled into its current, steady glow. The astronomers estimate the planet is at least five billion years old, and deduce that it has a diameter of almost 18,000 kilometers (11,185 mi)—1.4 times larger than that of the Earth. Its greater mass and density imply that it is probably made of rock with a dense iron core.

Two of the European members of the team, Xavier Delfosse and Xavier Bonfils, stated in the release: “The LHS 1140 system might prove to be an even more important target for the future characterization of planets in the habitable zone than Proxima b or TRAPPIST-1. This has been a remarkable year for exoplanet discoveries!”

Scientists expect observations with the Hubble Space Telescope will soon allow them to assess how much high-energy radiation the exoplanet receives. Further into the future, with the help of new instruments like ESO’s Extremely Large Telescope and the James Webb Space Telescope, detailed observations of the atmospheres of exoplanets will be possible.

Physicists detect whiff of new particle at the Large Hadron Collider
For decades, particle physicists have yearned for physics beyond their tried-and-true standard model. Now, they are finding signs of something unexpected at the Large Hadron Collider (LHC), the world’s biggest atom smasher at CERN, the European particle physics laboratory near Geneva, Switzerland. The hints come not from the LHC’s two large detectors, which have yielded no new particles since they bagged the last missing piece of the standard model, the Higgs boson, in 2012, but from a smaller detector, called LHCb, that precisely measures the decays of familiar particles.

The latest signal involves deviations in the decays of particles called B mesons—weak evidence on its own. But together with other hints, it could point to new particles lying on the high-energy horizon. “This has never happened before, to observe a set of coherent deviations that could be explained in a very economical way with one single new physics contribution,” says Joaquim Matias, a theorist at the Autonomous University of Barcelona in Spain. Matias says the evidence is strong enough for a discovery claim, but others urge caution.

The LHC smashes protons together at unprecedented energy to try to blast into existence massive new particles, which its two big detectors, ATLAS and CMS, would spot. LHCb focuses on familiar particles, in particular B mesons, using an exquisitely sensitive tracking detector to sniff out the tiny explosive decays.


B mesons are made of fundamental particles called quarks. Familiar protons and neutrons are made of two flavors of quarks, up and down, bound in trios. Heavier quark flavors—charm, strange, top, and bottom—can be created, along with their antimatter counterparts, in high-energy particle collisions; they pair with antiquarks to form mesons.

Lasting only a thousandth of a nanosecond, B mesons potentially provide a window onto new physics. Thanks to quantum uncertainty, their interiors roil with particles that flit in and out of existence and can affect how they decay. Any new particles tickling the innards of B mesons—even ones too massive for the LHC to create—could cause the rates and details of those decays to deviate from predictions in the standard model. It’s an indirect method of hunting new particles with a proven track record. In the 1970s, when only the up, down, and strange quarks were known, physicists predicted the existence of the charm quark by discovering oddities in the decays of K mesons (a family of mesons all containing a strange quark bound to an antiquark).

In their latest result, reported today in a talk at CERN, LHCb physicists find that when one type of B meson decays into a K meson, its byproducts are skewed: The decay produces a muon (a cousin of the electron) and an antimuon less often than it makes an electron and a positron. In the standard model, those rates should be equal, says Guy Wilkinson, a physicist at the University of Oxford in the United Kingdom and spokesperson for the 770-member LHCb team. “This measurement is of particular interest because theoretically it’s very, very clean,” he says.
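The muon-versus-electron comparison described above is conventionally expressed as a ratio of branching fractions, often written R_K, which the standard model predicts to be very close to 1 (the "equal rates" the article mentions). A sketch of how a deviation's significance is estimated, using illustrative numbers that are assumptions rather than LHCb's published values:

```python
# Lepton-universality ratio: R_K = BR(B -> K mu mu) / BR(B -> K e e).
# The standard model predicts R_K ~ 1, i.e. equal muon and electron rates.
# The values below are illustrative assumptions, not LHCb's measurement.
r_k_measured = 0.75   # hypothetical measured ratio
uncertainty = 0.09    # hypothetical 1-sigma uncertainty
r_k_sm = 1.0          # standard-model expectation

# Naive significance of the deviation, in standard deviations:
n_sigma = (r_k_sm - r_k_measured) / uncertainty
print(f"deviation: {n_sigma:.1f} sigma")   # about 2.8 sigma: a hint, not a discovery
```

Particle physics reserves "discovery" for 5 sigma, which is why individually weak anomalies like these only become interesting when several of them point the same way.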

[Graphic: "Strangely familiar": A new process appears to be modifying one of the standard ways a B meson decays to a K meson. It may involve a new force-carrying particle called a Z′ that avoids creating a short-lived top quark. The diagram contrasts the standard model decay (bottom quark turning into a strange quark via a top quark, W– boson, and Z boson, yielding a K meson plus a muon and an antimuon) with a possible new decay in which a Z′ couples the bottom quark directly to the strange quark, the anti-down quark acting as a spectator.]
The result is just one of half a dozen faint clues LHCb physicists have found that all seem to jibe. For example, in 2013, they examined the angles at which particles emerge in such B meson decays and found that they didn’t quite agree with predictions.

What all those anomalies point to is less certain. Within the standard model, a B meson decays to a K meson only through a complicated “loop” process in which the bottom quark briefly turns into a top quark before becoming a strange quark. To do that, it has to emit and reabsorb a W boson, a “force particle” that conveys the weak force (see graphic above).

The new data suggest the bottom quark might morph directly into a strange quark—a change the standard model forbids—by spitting out a new particle called a Z′ boson. That hypothetical cousin of the Z boson would be the first particle beyond the standard model and would add a new force to the theory. The extra decay process would lower production of muons, explaining the anomaly. “It’s sort of an ad hoc construct, but it fits the data beautifully,” says Wolfgang Altmannshofer, a theorist at the University of Cincinnati in Ohio. Others have proposed that a quark–electron hybrid called a leptoquark might briefly materialize in the loop process and provide another way to explain the discrepancies.

Of course, the case for new physics could be a mirage of statistical fluctuations. Physicists with ATLAS and CMS 18 months ago reported hints of a hugely massive new particle only to see them fade away with more data. The current signs are about as strong as those were, Altmannshofer says.

The fact that physicists are using LHCb to search in the weeds for signs of something new underscores the fact that the LHC hasn’t yet lived up to its promise. “ATLAS and CMS were the detectors that were going to discover new things, and LHCb was going to be more complementary,” Matias says. “But things go as they go.”

If the Z′ or leptoquarks exist, then the LHC might have a chance to blast them into bona fide, albeit fleeting, existence, Matias says. The LHC is now revving up after its winter shutdown. Next month, the particle hunters will return to their quest.

Physicists Discover Hidden Aspects of Electrodynamics
BATON ROUGE – Radio waves, microwaves and even light itself are all made of electric and magnetic fields. The classical theory of electromagnetism was completed in the 1860s by James Clerk Maxwell. At the time, Maxwell’s theory was revolutionary, and provided a unified framework to understand electricity, magnetism and optics. Now, new research led by LSU Department of Physics & Astronomy Assistant Professor Ivan Agullo, with colleagues from the Universidad de Valencia, Spain, advances knowledge of this theory. Their recent discoveries have been published in Physical Review Letters.

Maxwell’s theory displays a remarkable feature: it remains unaltered under the interchange of the electric and magnetic fields, when charges and currents are not present. This symmetry is called the electric-magnetic duality.
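The duality the article describes can be stated concretely. In vacuum (no charges or currents), and in units where c = 1, Maxwell's equations read

```latex
\nabla \cdot \mathbf{E} = 0, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\,\partial_t \mathbf{B}, \qquad
\nabla \times \mathbf{B} = \partial_t \mathbf{E},

\text{and the duality swap} \quad
\mathbf{E} \to \mathbf{B}, \qquad \mathbf{B} \to -\,\mathbf{E}
```

maps the set of equations into itself: the two divergence equations trade places, and each curl equation becomes the other. With electric charges present but no magnetic ones, the swap already fails classically, since there is no magnetic-charge term for the electric one to trade with; the new result is that gravity plus quantum effects spoil the symmetry even in the charge-free case.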

However, while electric charges exist, magnetic charges have never been observed in nature. If magnetic charges do not exist, the symmetry also cannot exist. This mystery has motivated physicists to search for magnetic charges, or magnetic monopoles. However, no one has been successful. Agullo and his colleagues may have discovered why.

“Gravity spoils the symmetry regardless of whether magnetic monopoles exist or not. This is shocking. The bottom line is that the symmetry cannot exist in our universe at the fundamental level because gravity is everywhere,” Agullo said.

Gravity, together with quantum effects, disrupts the electric-magnetic duality or symmetry of the electromagnetic field.

Agullo and his colleagues discovered this by taking previous theories that illustrate this phenomenon for other types of particles in the universe, called fermions, and applying them to photons in electromagnetic fields.

“We have been able to write the theory of the electromagnetic field in a way that very much resembles the theory of fermions, and prove this absence of symmetry by using powerful techniques that were developed for fermions,” he said.

This new discovery challenges assumptions that could impact other research including the study of the birth of the universe.

The Big Bang

Satellites collect data from the radiation emitted from the Big Bang, which is called the Cosmic Microwave Background, or CMB. This radiation contains valuable information about the history of the universe.

“By measuring the CMB, we get precise information on how the Big Bang happened,” Agullo said.

Scientists analyzing this data have assumed that the polarization of photons in the CMB is not affected by the gravitational field in the universe, which is true only if electromagnetic symmetry exists. However, since this new finding suggests that the symmetry does not exist at the fundamental level, the polarization of the CMB can change throughout cosmic evolution. Scientists may need to take this into consideration when analyzing the data. The focus of Agullo’s current research is on how large this new effect is.

This research is supported by the National Science Foundation grants PHY-1403943 and PHY-1552603.

A dark matter 'bridge' holding galaxies together has been captured for the first time
The first image of a dark matter "bridge", believed to form the links between galaxies, has been captured by astrophysicists in Canada.

Researchers at the University of Waterloo used a technique known as weak gravitational lensing to create a composite image of the bridge. Gravitational lensing is an effect that causes the images of distant galaxies to warp slightly under the influence of an unseen mass, such as a planet, a black hole, or in this case, dark matter.

Their composite image was built from combined lensing images of more than 23,000 galaxy pairs, located 4.5 billion light-years away. The effect was measured using data from a multi-year sky survey at the Canada-France-Hawaii Telescope.

These results show that the dark matter filament bridge is strongest between systems less than 40 million light-years apart, and they confirm predictions that galaxies across the Universe are tied together through a cosmic web of the elusive substance.

Dark matter is a mysterious substance said to make up around 84 per cent of the matter in the Universe. It's known as "dark" because it doesn't shine, absorb or reflect light, which has traditionally made it largely undetectable, except through gravity and gravitational lensing. Evidence for the existence of this form of matter comes, among other things, from the astrophysical observation of galaxies, which rotate far too rapidly to be held together only by the gravitational pull of the visible matter.
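The "rotating too rapidly" argument can be made quantitative with Newtonian dynamics: a circular orbit at speed v and radius r requires an enclosed mass M(r) = v²r/G. The sketch below uses illustrative Milky-Way-like numbers (assumptions, not figures from this article). A flat rotation curve, with v roughly constant out to large radii, forces the enclosed mass to keep growing linearly even where the visible starlight has run out:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass in kg
KPC = 3.086e19     # one kiloparsec in meters

def enclosed_mass(v, r):
    """Mass (kg) needed inside radius r (m) to hold a circular orbit at speed v (m/s)."""
    return v**2 * r / G

v = 220e3                             # flat rotation speed, m/s (illustrative)
m_inner = enclosed_mass(v, 8 * KPC)   # mass implied inside 8 kpc
m_outer = enclosed_mass(v, 16 * KPC)  # mass implied inside 16 kpc

print(f"inside  8 kpc: {m_inner / M_SUN:.1e} solar masses")
print(f"inside 16 kpc: {m_outer / M_SUN:.1e} solar masses")
# Doubling the radius at constant v doubles the required mass,
# but the starlight does not double: something unseen must supply it.
```

That mismatch between the mass the orbits demand and the mass the light reveals is the classic rotation-curve evidence for dark matter.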

Astrophysicists have long proposed that the Universe's web of stars and galaxies is supported by a "cosmic scaffolding" made up of fine threads of this invisible dark matter. These threadlike strands formed just after the Big Bang when denser portions of the Universe drew in dark matter until it collapsed and formed flat disks, which featured fine filaments of dark matter at their joins. At the cross-section of these filaments, galaxies formed.

"For decades, researchers have been predicting the existence of dark matter filaments between galaxies that act like a web-like superstructure connecting galaxies together," said Mike Hudson, a professor of astronomy at the University of Waterloo, whose results appear in the journal Monthly Notices of the Royal Astronomical Society. "This image moves us beyond predictions to something we can see and measure."

"By using this technique, we're not only able to see that these dark matter filaments in the Universe exist, we're able to see the extent to which these filaments connect galaxies together," said co-author Seth Epps.

Dark matter is an invisible form of matter which, until now, has only revealed itself through its gravitational effects.
High-precision measurements using the European satellite Planck show that almost 85 percent of the entire mass of the universe consists of dark matter.
All the stars, planets, nebulae and other objects in space that are made of conventional matter account for no more than 15 percent of the mass of the universe.
The unknown form of matter can either consist of comparatively few, but very heavy particles, or of a large number of light ones.
One of the possible candidates for dark matter is a particle called the axion, first proposed in 1977. It appears in some extensions of the Standard Model of particle physics. Astronomers believe that if axions make up dark matter, they could be detected through gravitational waves. This is because axions accelerated by a black hole would give off gravitational waves, just as electrons give off electromagnetic waves.

As a result, instruments like Ligo – and its upgrade, Advanced Ligo –
may be able to see gravitational waves (GWs) from thousands of black hole (BH) mergers, which would mark the beginning of a new precision tool for physics.

Physicists have a general idea about what the dark matter particle looks like but are struggling to build a clear picture. They can track the distribution of dark matter throughout the galaxy by examining how galaxies move, but can't pinpoint its exact location or design.

Earlier this year, Priyamvada Natarajan, a professor of astrophysics at Yale University, and her team brought the search for dark matter a step forward by creating the most detailed map of dark matter ever created. The map looks like an alien landscape, with uneven peaks and troughs scattered throughout. There are gentle mounds, on top of which sharp peaks arise, like the inside of a cave covered in stalactites.

No, Dark Energy Isn't An Illusion
In 1998, two teams of scientists announced a shocking discovery: the expansion of the Universe was accelerating. Distant galaxies weren't just receding from us, but their recession speed was increasing over time. Over the next few years, precision measurements of three independent quantities -- distant galaxies containing type Ia supernovae, the fluctuation pattern in the cosmic microwave background, and large-scale correlations between galaxies at a variety of distances -- all supported and confirmed this picture. The leading explanation? That there's a new form of energy inherent to space itself: dark energy. The case is so strong that no one reasonably doubts the evidence, but many teams have made alternative cases for the explanation, claiming that dark energy itself could be an illusion.
To understand whether this could be the case, we need to walk through four straightforward steps:

What a Universe without dark energy would look like.
What our Universe actually looks like.
What alternative explanations have been offered up.
Whether any of them could legitimately work.

In science, as in all things, it's pretty easy to offer a "what if..." alternative scenario to the leading idea. But can it stand up to scientific rigor? That's the crucial test.
Well before we conceived of dark energy, all the way back in the 1920s and 1930s, scientists derived how the entire Universe could have evolved within General Relativity. If you assumed that space, on the largest scales, was uniform -- with the same density and temperature everywhere -- there were only three viable scenarios to describe a Universe that was expanding today. If you fill a Universe with matter and radiation, like ours appears to be, gravity will fight the expansion, and the Universe can:

expand up to a point, reach a maximum size, and then begin contracting, eventually leading to a total recollapse.
expand and slow down somewhat, but gravitation is insufficient to ever stop or reverse it, and so it will eternally expand into the great cosmic abyss.
expand, with gravitation and the expansion balancing each other perfectly, so the expansion rate and the recession speed of everything asymptote to zero, but never reverse.

Those were the three classic fates of the Universe: big crunch, big freeze, or a critical Universe, right on the border between the two.
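These three fates follow directly from the Friedmann equation. For a matter-only universe, (da/dt)^2 = H0^2 * (Omega_m / a + 1 - Omega_m), where a is the scale factor and Omega_m is the density parameter. A minimal sketch of the classification (the function names and the choice of units with H0 = 1 are mine, for illustration):

```python
import math

def adot_squared(a, omega_m, h0=1.0):
    """Matter-only Friedmann equation with curvature:
    (da/dt)^2 = H0^2 * (omega_m / a + 1 - omega_m)."""
    return h0**2 * (omega_m / a + 1.0 - omega_m)

def fate(omega_m):
    """Classify the fate of a matter-only universe by its density parameter."""
    if omega_m > 1.0:
        # Expansion halts where adot_squared() crosses zero:
        # omega_m / a_max + 1 - omega_m = 0  =>  a_max = omega_m / (omega_m - 1)
        a_max = omega_m / (omega_m - 1.0)
        return "big crunch", a_max
    if omega_m == 1.0:
        return "critical: expansion asymptotes to zero", math.inf
    return "big freeze: eternal expansion", math.inf

for om in (2.0, 1.0, 0.3):
    print(om, fate(om))
```

A denser-than-critical universe (Omega_m > 1) reaches a maximum size and recollapses; at or below the critical density it expands forever.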
But then the crucial observations came in, and it turns out the Universe did none of those three things. For the first six billion years or so after the Big Bang, it appeared we lived in a critical Universe, with the initial expansion and the effects of gravitational attraction balancing one another almost perfectly. But when the density of the Universe dropped below a certain amount, a surprise emerged: distant galaxies began speeding up, away from us and one another. This cosmic acceleration was unexpected, but robust, and has continued at the same rate ever since, for the past 7.8 billion years.
Why was this happening? The current, known forms of energy in the Universe -- particles, radiation and fields -- can't account for it. So scientists hypothesized a new form of energy, dark energy, that could cause the Universe's expansion to accelerate. There could be a new field that permeates all of space causing it; it could be the zero-point energy of the quantum vacuum; it could be Einstein's cosmological constant from General Relativity. Current and planned observatories and experiments are searching for signatures that would distinguish among these potential explanations or reveal departures from them; so far, all remain consistent with the data.
But alternatives have been proposed as well. Adding a new type of energy to the Universe should be a last resort to explain a new observation, or even a new suite of observations. A lot of people were skeptical of its existence, so scientists began asking: what else could be occurring? What could mimic these effects? A number of possibilities immediately emerged:

Perhaps the distant supernovae weren't the same as nearby ones, and were inherently fainter?
Perhaps there was something about the environments in which the supernovae occurred that changed?
Perhaps the distant light, en-route, was undergoing an interaction that caused it to fail to reach our eyes?
Perhaps a new type of dust existed, making these distant objects appear systematically fainter?
Or could it be that the assumption on which these models are founded -- that the Universe is, on the largest scales, perfectly uniform -- is flawed enough that what appears to be dark energy is simply the "correct" prediction of Einstein's theory?
The light-blocking, light-losing, and systematic light-difference scenarios have all been ruled out by multiple approaches: even if supernovae were removed from the equation entirely, the evidence for dark energy would still be overwhelming. With precision measurements of the cosmic microwave background, baryon acoustic oscillations, and the large-scale structures that form (and fail to form) in our Universe, the case that the Universe's expansion is accelerating stands on its own.

Satellite galaxies at edge of Milky Way coexist with dark matter
Research conducted by scientists at Rochester Institute of Technology rules out a challenge to the accepted standard model of the universe and theory of how galaxies form by shedding new light on a problematic structure.

The vast polar structure - a plane of satellite galaxies at the poles of the Milky Way - is at the center of a tug-of-war between scientists who disagree about the existence of mysterious dark matter, the invisible substance that, according to some scientists, comprises 85 percent of the mass of the universe.

A paper accepted for publication in the Monthly Notices of the Royal Astronomical Society bolsters the standard cosmological model, or the Cold Dark Matter paradigm, by showing that the vast polar structure formed well after the Milky Way and is an unstable structure.

The study, "Is the Vast Polar Structure of Dwarf Galaxies a Serious Problem for CDM?" - available online at https://arxiv.org/abs/1612.07325 - was co-authored by Andrew Lipnicky, a Ph.D. candidate in RIT's astrophysical sciences and technology program, and Sukanya Chakrabarti, assistant professor in RIT's School of Physics and Astronomy, whose grant from the National Science Foundation supported the research.

Lipnicky and Chakrabarti analyzed the distribution of the classical Milky Way dwarf galaxies that form the vast polar structure and compared it to simulations of the "missing" or subhalo dwarf galaxies thought to be cloaked in dark matter.

Using motion measurements, the authors traced the orbits of the classical Milky Way satellites backward in time. Their simulations showed the vast polar structure breaking up and dispersing, indicating that the plane is not as old as originally thought and formed later in the evolution of the galaxy. This means that the vast polar structure of satellite galaxies may be a transient feature, Chakrabarti noted.

"If the planar structure lasted for a long time, it would be a different story," Chakrabarti said. "The fact that it disperses so quickly indicates that the structure is not dynamically stable. There is really no inconsistency between the planar structure of dwarf galaxies and the current cosmological paradigm."

The authors removed the classical Milky Way satellites Leo I and Leo II from the study when orbital analyses determined that the dwarf galaxies were not part of the original vast polar structure but later additions likely snatched from the Milky Way. A comparison excluding Leo I and II reveals a similar plane shared by classical galaxies and their cloaked counterparts.

"We tried many different combinations of the dwarf galaxies, including distributions of dwarfs that share similar orbits, but in the end found that the plane always dispersed very quickly," Lipnicky said.

Opposing scientific thought rejects the existence of dark matter. This camp calls into question the standard cosmological paradigm that accepts both a vast polar structure of satellite galaxies and a hidden plane of dark-matter cloaked galaxies. Lipnicky and Chakrabarti's study supports the co-existence of these structures and refutes the challenge to the accepted standard model of the universe.

Their research concurs with a 2016 study led by Nuwanthika Fernando, from the University of Sydney, which found that certain Milky Way planes are unstable in general. That paper was also published in the Monthly Notices of the Royal Astronomical Society.

Magnetic hard drives go atomic
Chop a magnet in two, and it becomes two smaller magnets. Slice again to make four. But the smaller magnets get, the more unstable they become; their magnetic fields tend to flip polarity from one moment to the next. Now, however, physicists have managed to create a stable magnet from a single atom.

The team, who published their work in Nature on 8 March, used their single-atom magnets to make an atomic hard drive. The rewritable device, made from 2 such magnets, is able to store just 2 bits of data, but scaled-up systems could increase hard-drive storage density by 1,000 times, says Fabian Natterer, a physicist at the Swiss Federal Institute of Technology (EPFL) in Lausanne, and an author of the paper.

“It’s a landmark achievement,” says Sander Otte, a physicist at Delft University of Technology in the Netherlands. “Finally, magnetic stability has been demonstrated undeniably in a single atom.”

Inside a regular hard drive is a disk split up into magnetized areas — each like a tiny bar magnet — the fields of which can point either up or down. Each direction represents a 1 or 0 — a unit of data known as a bit. The smaller the magnetized areas, the more densely data can be stored. But the magnetized regions must be stable, so that ‘1’s and ‘0’s inside the hard disk do not unintentionally switch.

Current commercial bits comprise around 1 million atoms. But in experiments physicists have radically shrunk the number of atoms needed to store 1 bit — moving from 12 atoms in 2012 to now just one. Natterer and his team used atoms of holmium, a rare-earth metal, sitting on a sheet of magnesium oxide, at a temperature below 5 kelvin.

Holmium is particularly suitable for single-atom storage because it has many unpaired electrons that create a strong magnetic field, and they sit in an orbit close to the atom's centre where they are shielded from the environment. This gives holmium both a large and stable field, says Natterer. But the shielding has a drawback: it makes the holmium notoriously difficult to interact with. And until now, many physicists doubted whether it was possible to reliably determine the atom’s state.

Bits of data
To write the data onto a single holmium atom, the team used a pulse of electric current from the magnetized tip of a scanning tunnelling microscope, which could flip the orientation of the atom's field between a 0 and a 1. In tests the magnets proved stable, each retaining its data for several hours, with the team never seeing one flip unintentionally. They used the same microscope to read out the bit — with different flows of current revealing the atom’s magnetic state.

To further prove that the tip could reliably read the bit, the team — which included researchers from the technology company IBM — devised a second, indirect, read-out method. They used a neighbouring iron atom as a magnetic sensor, tuning it so that its electronic properties depended on the orientation of the two holmium atomic magnets in the 2-bit system. The method also allows the team to read out multiple bits at the same time, says Otte, making it more practical and less invasive than the microscope technique.

Using individual atoms as magnetic bits would radically increase the density of data storage, and Natterer says that his EPFL colleagues are working on ways to make large arrays of single-atom magnets. But the 2-bit system is still far from practical applications, and well behind another kind of single-atom storage that encodes data in atoms’ positions rather than in their magnetization, whose developers have already built a 1-kilobyte (8,192-bit) rewritable data-storage device.

One advantage of the magnetic system, however, is that it could be compatible with spintronics, says Otte. This emerging technology uses magnetic states not just to store data, but to move information around a computer in place of electric current, and would make for much more energy-efficient systems.

In the near term, physicists are more excited about studying the single-atom magnets. Natterer, for example, plans to observe three mini-magnets that are oriented so their fields are in competition with each other — so they continually flip. “You can now play around with these single-atom magnets, using them like Legos, to build up magnetic structures from scratch,” he says.

Nature doi:10.1038/nature.2017.21599
Read the related News & Views article: 'Single-atom data storage'

Could Mysterious Cosmic Light Flashes Be Powering Alien Spacecraft?

Bizarre flashes of cosmic light may actually be generated by advanced alien civilizations, as a way to accelerate interstellar spacecraft to tremendous speeds, a new study suggests.

Astronomers have catalogued just 20 or so of these brief, superbright flashes, which are known as fast radio bursts (FRBs), since the first one was detected in 2007. FRBs seem to be coming from galaxies billions of light-years away, but what's causing them remains a mystery.

"Fast radio bursts are exceedingly bright given their short duration and origin at great distances, and we haven't identified a possible natural source with any confidence," study co-author Avi Loeb, a theorist at the Harvard-Smithsonian Center for Astrophysics, said in a statement Thursday (March 9). "An artificial origin is worth contemplating and checking." [5 Bold Claims of Alien Life]


One potential artificial origin, according to the new study, might be a gigantic radio transmitter built by intelligent aliens. So Loeb and lead author Manasvi Lingam, of Harvard University, investigated the feasibility of this possible explanation.

Artist's illustration of a light sail powered by a radio beam (red) generated on the surface of a planet. The leakage from such beams as they sweep across the sky would appear as superbright light flashes known as fast radio bursts, according to a new study.
Credit: M. Weiss/CfA
The duo calculated that a solar-powered transmitter could indeed beam FRB-like signals across the cosmos — but it would require a sunlight-collecting area twice the size of Earth to generate the necessary power.

And the huge amounts of energy involved wouldn't necessarily melt the structure, as long as it was water-cooled. So, Lingam and Loeb determined, such a gigantic transmitter is technologically feasible (though beyond humanity's current capabilities).
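As a rough check on the scale involved, one can estimate the power such a collector would intercept. This back-of-envelope sketch assumes an Earth-like solar flux (the solar constant, about 1361 W/m^2) and reads "twice the size of Earth" as twice Earth's cross-sectional area; both readings are my assumptions, not figures from the study:

```python
import math

SOLAR_CONSTANT = 1361.0   # W/m^2, mean sunlight flux at Earth's orbit
R_EARTH = 6.371e6         # Earth's mean radius, m

# "Twice the size of Earth" interpreted as twice Earth's cross-sectional area
collector_area = 2.0 * math.pi * R_EARTH**2          # ~2.6e14 m^2
collected_power = SOLAR_CONSTANT * collector_area    # ~3.5e17 W

print(f"Collector area:  {collector_area:.2e} m^2")
print(f"Collected power: {collected_power:.2e} W")
```

For comparison, that is tens of thousands of times humanity's total power consumption, which gives a sense of why the authors call the project feasible only for a far more advanced civilization.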

Why would aliens build such a structure? The most plausible explanation, according to the study team, is to blast interstellar spacecraft to incredible speeds. These craft would be equipped with light sails, which harness the momentum imparted by photons, much as regular ships' sails harness the wind. (Humanity has demonstrated light sails in space, and the technology is the backbone of Breakthrough Starshot, a project that aims to send tiny robotic probes to nearby star systems.)

Indeed, a transmitter capable of generating FRB-like signals could drive an interstellar spacecraft weighing 1 million tons or so, Lingam and Loeb calculated.

"That's big enough to carry living passengers across interstellar or even intergalactic distances," Lingam said in the same statement.

Humanity would catch only fleeting glimpses of the "leakage" from these powerful beams (which would be trained on the spacecraft's sail at all times), because the light source would be moving constantly with respect to Earth, the researchers pointed out.

The duo took things a bit further. Assuming that ET is responsible for most FRBs, and taking into account the estimated number of potentially habitable planets in the Milky Way (about 10 billion), Lingam and Loeb calculated an upper limit for the number of advanced alien civilizations in a galaxy like our own: 10,000.

Lingam and Loeb acknowledge the speculative nature of the study. They aren't claiming that FRBs are indeed caused by aliens; rather, they're saying that this hypothesis is worthy of consideration.

"Science isn't a matter of belief; it's a matter of evidence," Loeb said. "Deciding what’s likely ahead of time limits the possibilities. It's worth putting ideas out there and letting the data be the judge."

The new study has been accepted for publication in The Astrophysical Journal Letters. You can read it for free on the online preprint site

NASA is Going to Create The Coldest Spot in the Known Universe
Creating Cold Atom Lab

This summer, a box the size of an ice chest will journey to the International Space Station (ISS). Once there, it will become the coldest spot in the universe—more than 100 million times colder than deep space itself. The instruments inside the box — an electromagnetic “knife,” lasers, and a vacuum chamber — will slow down gas particles until they are almost motionless, bringing them just a billionth of a degree above absolute zero.

This box and its instruments are called the Cold Atom Laboratory (CAL). CAL was developed by the Jet Propulsion Laboratory (JPL), which is funded by NASA. Right now at JPL, CAL is in the final assembly stages, getting ready for its trip to space, which is set for August 2017. CAL will be hitching a ride on SpaceX CRS-12.

Once in space on the ISS, five scientific teams plan to use CAL to conduct experiments. Among them is the team headed by Eric Cornell, one of the scientists who first created a Bose-Einstein condensate in the lab in 1995 and shared a Nobel Prize for the work.

Seeing the Other 95%

Atoms that are cooled to extreme temperatures can form a unique state of matter: a Bose-Einstein condensate. This state is important scientifically because in it, the laws of quantum physics take over and we can observe matter behaving more like waves and less like particles. However, these rows of atoms, which move together like waves, can only be observed for fractions of a second on Earth because gravity causes atoms to move towards the ground. CAL achieves new low temperatures for longer observation of these mysterious waveforms.

Although NASA has never observed or created Bose-Einstein condensates in space, ultra-cold atoms can hold their wave-like forms longer while in freefall on the International Space Station. JPL Project Scientist Robert Thompson believes CAL will render Bose-Einstein condensates observable for five to 10 seconds. He also believes that improvements to CAL’s technologies could allow for hundreds of seconds of observation time.

“Studying these hyper-cold atoms could reshape our understanding of matter and the fundamental nature of gravity,” said Thompson. “The experiments we’ll do with the Cold Atom Lab will give us insight into gravity and dark energy—some of the most pervasive forces in the universe.”

These experiments could potentially lead to improved technologies, including quantum computers, sensors, and atomic clocks for navigation on spacecraft. CAL deputy project manager Kamal Oudrhiri of JPL cites dark energy detection applications as “especially exciting.” Current physics models indicate that the universe is about 68 percent dark energy, 27 percent dark matter, and 5 percent ordinary matter.

“This means that even with all of our current technologies, we are still blind to 95 percent of the universe,” Oudrhiri said. “Like a new lens in Galileo’s first telescope, the ultra-sensitive cold atoms in the Cold Atom Lab have the potential to unlock many mysteries beyond the frontiers of known physics.”

Testing theories of modified gravity
Physics Today 70, 3, 21 (2017)
The accelerated expansion of the universe is usually attributed to a mysterious dark energy, but there’s another conceivable explanation: modified gravity. Unmodified gravity—that is, Einstein’s general relativity—satisfactorily accounts for the dynamics of the solar system, where precision measurements can be made without the confounding influence of dark matter. Nor have any violations been detected of one of general relativity’s principal ingredients, the strong equivalence principle, which posits that inertial mass and gravitational mass are identical even for bodies whose mass comes partly from gravitational binding energy.
But those observational constraints are not ineluctable. In particular, a class of gravitational theories called Galileon models can also pass them. In 2012 Lam Hui and Alberto Nicolis of Columbia University devised a cosmic test that could refute or confirm the models. Their test hinges on the models’ central feature: an additional scalar field that couples to mass. The coupling can be characterized by a charge-like parameter, Q. For most cosmic objects, Q has the same value as the inertial mass. But for a black hole, whose mass arises entirely from its gravitational binding energy, Q is zero; the strong equivalence principle is violated.
Galaxies fall through space away from low concentrations of mass and toward high concentrations. The supermassive black holes at the centers of some galaxies are carried along with the flow. But if gravity has a Galileon component, the black hole feels less of a tug than do the galaxy’s stars, interstellar medium, and dark-matter particles. The upshot, Hui and Nicolis realized, is that the black hole will lag the rest of the galaxy and slip away from its center. The displacement is arrested when the black hole reaches the point where the lag is offset by the presence of more of the galaxy’s gravitational mass on one side of the black hole than on the other. Given the right circumstances, the displacement can be measured.
Hui and Nicolis’s proposal has now itself been put to the test. Asha Asvathaman and Jeremy Heyl of the University of British Columbia, together with Hui, have applied it to two galaxies: M32, which is being pulled toward its larger neighbor, the Andromeda galaxy, and M87, which is being pulled through the Virgo cluster of galaxies. Both M32 and M87 are elliptical galaxies. Because of their simple shapes, their centroids can be determined from optical observations. The locations of their respective black holes can be determined from radio observations. Although the limit on Galileon gravity that Asvathaman, Heyl, and Hui derived was too loose to refute or confirm the theory, they nevertheless validated the test itself. More precise astrometric observations could make it decisive. (A. Asvathaman, J. S. Heyl, L. Hui, Mon. Not. R. Astron. Soc. 465, 3261, 2017, doi:10.1093/mnras/stw2905.)

First Solid Sign that Matter Doesn't Behave Like Antimatter
One of the biggest mysteries in physics is why there's matter in the universe at all. This week, a group of physicists at the world's largest atom smasher, the Large Hadron Collider, might be closer to an answer: They found that particles in the same family as the protons and neutrons that make up familiar objects behave in a slightly different way from their antimatter counterparts.

While matter and antimatter have all of the same properties, antimatter particles carry charges that are the opposite of those in matter. In a block of iron, for example, the protons are positively charged and the electrons are negatively charged. A block of antimatter iron would have negatively charged antiprotons and positively charged antielectrons (known as positrons). If matter and antimatter come in contact, they annihilate each other and turn into photons (or occasionally, a few lightweight particles such as neutrinos). Other than that, a piece of matter and antimatter should behave in the same way, and even look the same — a phenomenon called charge-parity (CP) symmetry. [The 18 Biggest Unsolved Mysteries in Physics]
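To put a number on annihilation, consider the lightest case: an electron and a positron meeting convert their entire rest mass into two photons, E = 2 m_e c^2. A quick check of that arithmetic:

```python
# Energy released when an electron and a positron annihilate: E = 2 * m_e * c^2
M_E = 9.1093837e-31   # electron mass, kg
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electronvolt

energy_joules = 2 * M_E * C**2
energy_mev = energy_joules / EV / 1e6
print(f"{energy_mev:.3f} MeV")  # two 511 keV photons, about 1.022 MeV in total
```

That 511 keV line is in fact what gamma-ray telescopes look for as the signature of positron annihilation.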

Besides the identical behavior, CP symmetry also implies that the amount of matter and antimatter that was formed at the Big Bang, some 13.7 billion years ago, should have been equal. Clearly it was not, because if that were the case, then all the matter and antimatter in the universe would have been annihilated at the start, and even humans wouldn't be here.

But if there were a violation to this symmetry — meaning some bit of antimatter were to behave in a way that was different from its matter counterpart — perhaps that difference could explain why matter exists today.

To look for this violation, physicists at the Large Hadron Collider, a 17-mile-long (27 kilometers) ring beneath Switzerland and France, observed a particle called a lambda-b baryon. Baryons include the class of particles that make up ordinary matter; protons and neutrons are baryons. Baryons are made of quarks, and antimatter baryons are made of antiquarks. Both quarks and antiquarks come in six "flavors": up, down, top, bottom (or beauty), strange and charm, as scientists call the different varieties. A lambda-b is made of one up, one down and one bottom quark. (A proton is made of two up and one down, while a neutron consists of two down and one up quark.)

If the lambda and its antimatter sibling show CP symmetry, then they would be expected to decay in the same way. Instead, the team found that the lambda-b and antilambda-b particles decayed differently. Lambdas decay in two ways: into a proton and two charged particles called pi mesons (or pions), or into a proton and two K mesons (or kaons). When particles decay, they throw off their daughter particles at a certain set of angles. The matter and antimatter lambdas did that, but the angles were different. [7 Strange Facts About Quarks]

This is not the first time matter and antimatter have behaved differently. In the 1960s, scientists studied kaons themselves, which also decayed in a way that was different from their antimatter counterparts. B mesons — which consist of a bottom quark and an up, down, strange or charm quark — have also shown similar "violating" behavior.

Mesons, though, are not quite like baryons. Mesons are pairs of quarks and antiquarks. Baryons are made of ordinary quarks only, and antibaryons are made of antiquarks only. Discrepancies between baryon and antibaryon decays had never been observed before.

"Now we have something for baryons," Marcin Kucharczyk, an associate professor at the Institute of Nuclear Physics of the Polish Academy of Sciences, which collaborated on the LHC experiment, told Live Science. "When you'd observed mesons, it was not obvious that for baryons it was the same."

While tantalizing, the results were not quite solid enough to count as a discovery. For physicists, the measure of statistical significance, which is a way of checking whether one's data could happen by chance, is 5 sigma. Sigma refers to standard deviations, and 5 sigma means that there is only a 1 in 3.5 million chance that the results would occur by chance. This experiment got to 3.3 sigma — good, but not quite there yet. (That is, 3.3 sigma means that there's about a 1 in 4,200 chance that the observation would have occurred randomly, a confidence level above 99.9 percent.)
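The sigma-to-probability conversion described above can be reproduced with the complementary error function. The sketch below uses the one-sided convention that particle physicists typically apply to discovery claims, under which 5 sigma corresponds to about 1 in 3.5 million; the exact odds quoted for a given sigma depend on the convention used.

```python
import math

def one_sided_p(sigma):
    """One-sided tail probability of a normal distribution beyond `sigma`
    standard deviations: p = 0.5 * erfc(sigma / sqrt(2))."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for s in (3.3, 5.0):
    p = one_sided_p(s)
    print(f"{s} sigma -> p = {p:.2e} (about 1 in {1 / p:,.0f})")
```

Running this confirms the familiar 5-sigma benchmark of roughly 1 in 3.5 million.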

The findings are not a complete answer to the mystery of why matter dominates the universe, Kucharczyk said.

"It cannot explain the asymmetry fully," he said. "In the future, we will have more statistics, and maybe for other baryons."

The findings are detailed in the Jan. 30 issue of the journal Nature Physics.

Physicists investigate erasing information at zero energy cost
A few years ago, physicists showed that it's possible to erase information without using any energy, in contrast to the assumption at the time that erasing information must require energy. Instead, the scientists showed that the cost of erasure could be paid in terms of an arbitrary physical quantity such as spin angular momentum—suggesting that heat energy is not the only conserved quantity in thermodynamics.
Investigating this idea further, physicists Toshio Croucher, Salil Bedkihal, and Joan A. Vaccaro at the Centre for Quantum Dynamics, Griffith University, Brisbane, Queensland, Australia, have now discovered some interesting results about the tiny fluctuations in the spin cost of erasing information. The work could lead to the development of new types of heat engines and information processing devices.
As the scientists explain in a new paper published in Physical Review Letters, the possibility that information can be erased at zero energy cost is surprising at first due to the fact that energy and entropy are so closely related in thermodynamics. In the context of information, information erasure corresponds to entropy erasure (or a decrease in entropy) and therefore requires a minimum amount of energy, which is determined by Landauer's erasure principle.
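Landauer's principle sets that minimum at E = k_B * T * ln(2) per bit erased. A quick calculation at room temperature (the 300 K figure is illustrative, not from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum energy in joules to erase one bit: E = k_B * T * ln(2)."""
    return K_B * temperature_kelvin * math.log(2)

e_room = landauer_limit(300.0)
print(f"{e_room:.2e} J per bit at 300 K")  # roughly 3e-21 J
```

The number is tiny, which is why the bound only matters for devices operating near the thermodynamic limit — exactly the nanoscale regime the fluctuation results below address.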
Since Landauer's erasure principle is equivalent to the second law of thermodynamics, the zero-energy erasure scheme using arbitrary conserved quantities can be thought of as a generalized second law of thermodynamics. This idea dates back to at least 1957, when E. T. Jaynes proposed an alternative to the second law in which heat energy is thought of in a more general way than usual, so that heat incorporates other kinds of conserved quantities.
Applying this framework to information erasure, in 2011 Vaccaro and Stephen Barnett showed that the energy cost of information erasure can be substituted with one or more different conserved quantities—specifically, spin angular momentum.
One important difference between heat energy and spin angular momentum is that, while heat may or may not be quantized, spin angular momentum is an intrinsically quantum mechanical property, and so it is always quantized. This has implications when it comes to accounting for tiny fluctuations in these quantities that become significant when designing systems at the nanoscale.

Scientists have only recently investigated these fluctuations in the context of the Landauer principle, where they found that these fluctuations are quickly suppressed by something called the Jarzynski equality. This means that heat energy fluctuations have only a very tiny probability of violating the Landauer principle.
In the new study, the scientists have for the first time investigated the corresponding discrete fluctuations that arise when erasing information using spin.
Among their results, the researchers found that the discrete fluctuations are suppressed even more quickly than predicted by the corresponding Jarzynski equality for "spinlabor"—a new term the scientists devised that means the spin equivalent of work. This is the first evidence of beating this bound in an information erasure context. The quick suppression means that the fluctuations have an extremely low probability of using less than the minimal cost required to erase information using spin, as given by the Vaccaro-Barnett bound, which is the spin equivalent of the Landauer principle.
"Our work generalizes fluctuation relations for erasure using arbitrary conserved quantities and exposes the role of discreteness in the context of erasure," Bedkihal said. "We also obtained a probability of violation bound that is tighter than the corresponding Jarzynski bound. This is a statistically significant result."
The scientists also point out that this process of erasing information with spin has already been experimentally demonstrated, although it appears to have gone unnoticed. In spin-exchange optical pumping, light is used to excite electrons in an atom to a higher energy level. For the electrons to return to their lower energy level during the relaxation process, atoms and nuclei collide with each other and exchange spins. This entropy-decreasing process can be considered analogous to erasing information at a cost of spin exchange.
Overall, the new results reveal insight into the thermodynamics of spin and could also guide the development of future applications. These could include new kinds of heat engines and information processing devices based on erasure that use inexpensive, locally available resources such as spin angular momentum. The researchers plan to further pursue these possibilities in the future.
"The erasure mechanism can be used to design generalized heat engines operating under the reservoirs of multiple conserved quantities such as a thermal reservoir and a spin reservoir," Bedkihal said. "For example, one may design heat engines using semiconductor quantum dot systems where lattice vibrations constitute a thermal reservoir and nuclear spins constitute a polarized spin reservoir. Such heat engines go beyond the traditional Carnot heat engine that operates under two thermal reservoirs."
More information: Toshio Croucher, Salil Bedkihal, and Joan A. Vaccaro. "Discrete Fluctuations in Memory Erasure without Energy Cost." Physical Review Letters. DOI: 10.1103/PhysRevLett.118.060602. Also at arXiv:1604.05795 [quant-ph]

NASA Just Found A Solar System With 7 Earth-Like Planets

Today, scientists working with telescopes at the European Southern Observatory and NASA announced a remarkable new discovery: an entire system of Earth-sized planets. If that’s not enough, the team asserts that the density measurements of the planets indicate that the six innermost are Earth-like rocky worlds.

And that’s just the beginning.

Three of the planets lie in the star’s habitable zone. If you aren’t familiar with the term, the habitable zone (also known as the “goldilocks zone”) is the region surrounding a star in which liquid water could theoretically exist. This means that all three of these alien worlds may have entire oceans of water, dramatically increasing the possibility of life. The other planets are less likely to host oceans of water, but the team states that liquid water is still a possibility on each of these worlds.

Summarizing the work, lead author Michaël Gillon notes that this solar system has the largest number of Earth-sized planets yet found and the largest number of worlds that could support liquid water: “This is an amazing planetary system — not only because we have found so many planets, but because they are all surprisingly similar in size to the Earth!”

Co-author Amaury Triaud notes that the star in this system is an “ultracool dwarf,” and he clarifies what this means in relation to the planets: “The energy output from dwarf stars like TRAPPIST-1 is much weaker than that of our Sun. Planets would need to be in far closer orbits than we see in the Solar System if there is to be surface water. Fortunately, it seems that this kind of compact configuration is just what we see around TRAPPIST-1.”
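Triaud's point about compact orbits follows from the inverse-square law: a planet receives Earth-like stellar flux at a distance that scales as the square root of the star's luminosity. A rough Python sketch (the figure of roughly 0.05% of the Sun's luminosity for TRAPPIST-1 is an approximate value assumed here for illustration):

```python
import math

def earthlike_flux_distance_au(luminosity_solar):
    """Distance in AU at which a planet receives the same stellar flux as Earth.
    From the inverse-square law, flux ~ L / d**2, so d = sqrt(L / L_sun) AU."""
    return math.sqrt(luminosity_solar)

# TRAPPIST-1 shines at roughly 0.05% of the Sun's luminosity (assumed value).
d = earthlike_flux_distance_au(0.0005)
print(f"Earth-like flux at ~{d:.3f} AU")  # far inside Mercury's 0.39 AU orbit
```

This is why a habitable planet around an ultracool dwarf must sit dozens of times closer to its star than Earth does to the Sun.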


The system is just 40 light-years away. On a cosmic scale, that’s right next door. Of course, practically speaking, it would still take us hundreds of thousands of years to get there with today’s technology, but the find nonetheless speaks volumes about the potential for life-as-we-know-it beyond Earth.

Moreover, the technology of tomorrow could get us to this system a lot faster.

These new discoveries ultimately mean that TRAPPIST-1 is of monumental importance for future study. The Hubble Space Telescope is already being used to search for atmospheres around the planets, and Emmanuël Jehin, a scientist who also worked on the research, asserts that future telescopes could allow us to truly see into the heart of this system: “With the upcoming generation of telescopes, such as ESO’s European Extremely Large Telescope and the NASA/ESA/CSA James Webb Space Telescope, we will soon be able to search for water and perhaps even evidence of life on these worlds.”

Nearby Star Has 7 Earth-Sized Worlds - Most In Habitable Zone
NASA will announce tomorrow that Michaël Gillon et al. have confirmed four more Earth-sized planets circling TRAPPIST-1, in addition to the three already discovered.

It is possible that most of the planets confirmed thus far circling TRAPPIST-1 could be in the star's habitable zone. The inner six planets are probably rocky in composition and may be just the right temperature for liquid water to exist (between 0 and 100 degrees C) - if they have any water, that is. The outermost seventh planet still needs some more observations to nail down its orbit and composition.

Data About 2 Distant Asteroids: Clues to the Possible Planet Nine
In the year 2000 the first of a new class of distant solar system objects was discovered, orbiting the Sun at a distance greater than that of Neptune: the "extreme trans-Neptunian objects" (ETNOs).

Their orbits are very far from the Sun compared with that of the Earth. We orbit the Sun at a mean distance of one astronomical unit (1 AU which is 150 million kilometres) but the ETNOs orbit at more than 150 AU. To give an idea of how far away they are, Pluto's orbit is at around 40 AU and its closest approach to the Sun (perihelion) is at 30 AU. This discovery marked a turning point in Solar System studies, and up to now, a total of 21 ETNOs have been identified.

Recently, a number of studies have suggested that the dynamical parameters of the ETNOs could be better explained if there were one or more planets with masses several times that of the Earth orbiting the Sun at distances of hundreds of AU. In particular, in 2016 the researchers Brown and Batygin used the orbits of seven ETNOs to predict the existence of a "superearth," with a mass in the sub-Neptunian range, orbiting the Sun at some 700 AU. This idea is referred to as the Planet Nine hypothesis and is one of the current subjects of interest in planetary science. However, because the objects are so far away, the light we receive from them is very weak, and until now the only one of the 21 ETNOs observed spectroscopically was Sedna.

Now, a team of researchers led by the Instituto de Astrofísica de Canarias (IAC) in collaboration with the Complutense University of Madrid has taken a step towards the physical characterization of these bodies, and towards confirming or refuting the hypothesis of Planet Nine by studying them. The scientists have made the first spectroscopic observations of 2004 VN112 and 2013 RF98, both of them particularly interesting dynamically because their orbits are almost identical and the poles of the orbits are separated by a very small angle. This suggests a common origin, and their present-day orbits could be the result of a past interaction with the hypothetical Planet Nine. This study, recently published in Monthly Notices of the Royal Astronomical Society, suggests that this pair of ETNOs was a binary asteroid which separated after an encounter with a planet beyond the orbit of Pluto.

To reach these conclusions, they made the first spectroscopic observations of 2004 VN112 and 2013 RF98 in the visible range. These were performed in collaboration with the support astronomers Gianluca Lombardi and Ricardo Scarpa, using the OSIRIS spectrograph on the Gran Telescopio CANARIAS (GTC), situated at the Roque de los Muchachos Observatory (Garafía, La Palma). It was hard work to identify these asteroids because their great distance means that their apparent movement on the sky is very slow. The team then measured their apparent magnitudes (their brightness as seen from Earth) and also recalculated the orbit of 2013 RF98, which had been poorly determined; they found this object more than an arcminute away from the position predicted by the ephemerides. These observations have helped to improve the computed orbit and have been published by the Minor Planet Center (MPEC 2016-U18: 2013 RF98), the organization responsible for the identification of comets and minor planets (asteroids) as well as for measurements of their parameters and orbital positions.

The visible spectrum can also give some information about their composition. By measuring the slope of the spectrum, it can be determined whether they have pure ices on their surfaces, as is the case for Pluto, or highly processed carbon compounds. The spectrum can also indicate the possible presence of amorphous silicates, as in the Trojan asteroids associated with Jupiter. The values obtained for 2004 VN112 and 2013 RF98 are almost identical and similar to those observed photometrically for two other ETNOs, 2000 CR105 and 2012 VP113. Sedna, however, the only one of these objects which had been previously observed spectroscopically, shows very different values from the others. These five objects are part of the group of seven used to test the hypothesis of Planet Nine, which suggests that all of them should have a common origin, except for Sedna, which is thought to have come from the inner part of the Oort cloud.

"The similar spectral gradients observed for the pair 2004 VN112 - 2013 RF98 suggest a common physical origin", explains Julia de León, the first author of the paper, an astrophysicist at the IAC. "We are proposing the possibility that they were previously a binary asteroid which became unbound during an encounter with a more massive object". To validate this hypothesis, the team performed thousands of numerical simulations to see how the poles of the orbits would separate as time went on. The results of these simulations suggest that a possible Planet Nine, with a mass of between 10 and 20 Earth masses, orbiting the Sun at a distance of between 300 and 600 AU, could have deflected the pair 2004 VN112 - 2013 RF98 between 5 and 10 million years ago. This could explain, in principle, how these two asteroids, starting as a pair orbiting one another, became gradually separated in their orbits after an approach to a much more massive object at a particular moment in time.


Tune Your Radio: Galaxies Sing When Forming Stars
Almost all the light we see in the universe comes from stars which form inside dense clouds of gas in the interstellar medium.

The rate at which they form (referred to as the star formation rate, or SFR) depends on the reserves of gas in the galaxies and the physical conditions in the interstellar medium, which vary as the stars themselves evolve. Measuring the star formation rate is hence key to understanding the formation and evolution of galaxies.

Until now, a variety of observations at different wavelengths have been performed to calculate the SFR, each with its advantages and disadvantages. The most commonly used SFR tracers, visible and ultraviolet emission, can be partly absorbed by interstellar dust. This has motivated the use of hybrid tracers, which combine two or more different emissions, including the infrared, which can help to correct for this dust absorption. However, the use of these tracers is often uncertain because other sources or mechanisms not related to the formation of massive stars can intervene and lead to confusion.

Now, an international research team led by the IAC astrophysicist Fatemeh Tabatabaei has made a detailed analysis of the spectral energy distribution of a sample of galaxies, and has been able to measure, for the first time, the energy they emit within the frequency range of 1-10 Gigahertz which can be used to know their star formation rates. "We have used" explains this researcher "the radio emission because, in previous studies, a tight correlation was detected between the radio and the infrared emission, covering a range of more than four orders of magnitude". In order to explain this correlation, more detailed studies were needed to understand the energy sources and processes which produce the radio emission observed in the galaxies.

"We decided within the research group to make studies of galaxies from the KINGFISH sample (Key Insights on Nearby Galaxies: a Far-Infrared Survey with Herschel) at a series of radio frequencies", recalls Eva Schinnerer from the Max-Planck-Institut für Astronomie (MPIA) in Heidelberg, Germany. The final sample consists of 52 galaxies with very diverse properties. "As a single dish, the 100-m Effelsberg telescope with its high sensitivity is the ideal instrument to receive reliable radio fluxes of weak extended objects like galaxies", explains Marita Krause from the Max-Planck-Institut für Radioastronomie (MPIfR) in Bonn, Germany, who was in charge of the radio observations of those galaxies with the Effelsberg radio telescope. "We named it the KINGFISHER project, meaning KINGFISH galaxies Emitting in Radio."

The results of this project, published today in The Astrophysical Journal, show that the 1-10 Gigahertz radio emission used is an ideal star formation tracer for several reasons. Firstly, the interstellar dust does not attenuate or absorb radiation at these frequencies; secondly, it is emitted by massive stars during several phases of their formation, from young stellar objects to HII regions (zones of ionized gas) and supernova remnants, and finally, there is no need to combine it with any other tracer. For these reasons, measurements in the chosen range are a more rigorous way to estimate the formation rate of massive stars than the tracers traditionally used.

This study also clarifies the nature of the feedback processes occurring due to star formation activity, which are key in the evolution of galaxies. "By differentiating the origins of the radio continuum, we could infer that the cosmic ray electrons (a component of the interstellar medium) are younger and more energetic in galaxies with higher star formation rates, which can cause powerful winds and outflows and have important consequences in regulation of star formation", explains Fatemeh Tabatabaei.

Article: "The radio spectral energy distribution and star formation rate calibration in galaxies", by F. Tabatabaei et al. The Astrophysical Journal. Volume 836, Number 2. (DOI: 10.3847/1538-4357/836/2/185)

Coders Race to Save NASA's Climate Data
A group of coders is racing to save the government's climate science data.

On Saturday (Feb. 11), 200 programmers crammed themselves into the Doe Library at the University of California, Berkeley, furiously downloading NASA's Earth science data in a hackathon, Wired reported. The group's goal: rescue data that may be deleted or hidden under President Donald Trump's administration.

The process involves developing web-crawler scripts to trawl the internet, finding federal data and patching it together into coherent data sets. The hackers are also keeping track of data as it disappears; for instance, the Global Data Center's reports and one of NASA's atmospheric carbon dioxide (CO2) data sets have already been removed from the web.
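The crawler scripts described above boil down to fetching a page, extracting its links, and following them. Here is a minimal, hypothetical sketch of the link-extraction step using only Python's standard library; the page snippet and the data.example.gov base URL are invented for illustration, and a real crawler would fetch pages with urllib.request and recurse over the collected links:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Gathers absolute URLs from <a href="..."> tags so a crawler can follow them."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical snippet of a federal data index page.
page = '<a href="/data/co2_monthly.csv">CO2 data</a> <a href="reports/2016.pdf">Report</a>'
collector = LinkCollector("https://data.example.gov/index.html")
collector.feed(page)
print(collector.links)
```

Each discovered link would then be downloaded and archived, which is essentially what the hackathon teams automated at scale.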

By the end of Saturday, when the hackathon concluded, the coders had successfully downloaded thousands of pages — essentially all of NASA's climate data — onto the Internet Archive, a digital library.

But there is still more to be done. While the climate data may be safe for now, many other data sets out there could be lost, such as National Parks Service data on GPS boundaries and species tallies, Wired reported.

"Climate change data is just the tip of the iceberg," Eric Kansa, an anthropologist who manages archaeological data archiving for the nonprofit group Open Context, told Wired. "There are a huge number of other data sets being threatened [that are rich] with cultural, historical, sociological information."

Originally published on Live Science.


You Can Help Scientists Find the Next Earth-Like Planet

NASA’s Kepler space telescope holds the record when it comes to candidate and confirmed exoplanets — to date, it has identified more than 5,000. To scan the universe for these alien planets, Kepler uses what’s called the “transit method.” Basically, Kepler watches out for the brightness dips that occur when a planet crosses the face of the star it orbits.

This isn’t the only method to catch exoplanets. The High Resolution Echelle Spectrometer (HIRES) instrument at the Keck Observatory in Hawaii detects radial velocity instead of brightness dips. This radial velocity method searches stars for signs of gravitational wobbles induced by orbiting planets. HIRES was part of a two-decade-long radial velocity planet-hunting program, and it has compiled almost 61,000 individual measurements of more than 1,600 stars.

“HIRES was not specifically optimized to do this type of exoplanet detective work, but has turned out to be a workhorse instrument of the field,” said Steve Vogt, from the University of California Santa Cruz, who built the instrument. “I am very happy to contribute to science that is fundamentally changing how we view ourselves in the universe.”

From this huge amount of data, a team of researchers led by Paul Butler of the Carnegie Institution for Science in Washington, D.C., identified more than 100 possible exoplanets. Specifically, the researchers identified 60 candidate planets, plus 54 more that require further examination. They published their study in The Astronomical Journal.

“We were very conservative in this paper about what counts as an exoplanet candidate and what does not,” researcher Mikko Tuomi explained, “and even with our stringent criteria, we found over 100 new likely planet candidates.” Among the candidate exoplanets, one could be orbiting GJ 411, the fourth-closest star to our Sun, just about 8.3 light-years away. It’s not an Earth twin, however, as this potential planet has an orbital period of just 10 days.


There’s still a considerable amount of data to comb through. So, together with their findings, Butler’s team made the HIRES data set available to the public. “One of our key goals in this paper is to democratize the search for planets,” explained team member Greg Laughlin of Yale. “Anyone can download the velocities published on our website and use the open source Systemic software package and try fitting planets from the data.”

It’s certainly a noble idea and a timely one. “I think this paper sets a precedent for how the community can collaborate on exoplanet detection and follow-up”, said team-member Johanna Teske. “With NASA’s TESS mission on the horizon, which is expected to detect 1000+ planets orbiting bright, nearby stars, exoplanet scientists will soon have a whole new pool of planets to follow up.”

Other tools that can facilitate this search for exoplanets, and potentially habitable ones, include the recently completed James Webb Space Telescope (JWST). Its powerful array of mirrors and instruments will give our ability to scan the universe a much-appreciated boost. Technological advances like the JWST, NASA’s TESS, and a couple of other interstellar eyes will allow us to see the universe like never before.

Scientists Discover Over 100 New Exoplanets
An international team of astronomers has announced the discovery of over a hundred new exoplanet candidates. These exoplanets were found using two decades' worth of data from the Keck Observatory in Hawaii. Their results were recently published in a paper in the Astronomical Journal, and among the discoveries is a planet orbiting the fourth-closest star to our own, only 8 light-years away.

Finding exoplanets isn't easy. Planets beyond our solar system are tiny and dark when compared to their host stars, so some advanced techniques have to be used to pinpoint them. The Kepler space telescope, for instance, finds exoplanets by looking for stars that regularly dim slightly. This dimming is caused by an exoplanet blocking some of the star's light when it passes in front, and the change in brightness can tell us a lot about the size of the planet and how fast it orbits.
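The dimming described above is directly quantifiable: the fractional drop in brightness equals the ratio of the planet's disk area to the star's. A quick Python sketch using approximate Earth and Sun radii:

```python
def transit_depth(r_planet_km, r_star_km):
    """Fractional dimming during a transit: the planet blocks a disk of area
    pi * r_p**2 out of the star's pi * r_s**2, so depth = (r_p / r_s)**2."""
    return (r_planet_km / r_star_km) ** 2

# Earth (radius ~6,371 km) crossing the Sun (radius ~696,000 km).
depth = transit_depth(6_371, 696_000)
print(f"{depth * 1e6:.0f} parts per million")
```

A dip of well under one ten-thousandth of the star's light is why transit surveys need very precise, long-baseline photometry.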

However, there are additional ways to spot an exoplanet. The Keck Observatory uses a different method, called the radial velocity method, that looks at how the star moves. When a planet orbits a star, the planet's gravity causes the star to wobble a little bit. For instance, our own planet causes the sun to move a few inches per second, while Jupiter causes the sun to move about 40 feet per second. This wobbling is detectable by very sensitive telescopes, like the HIRES spectrometer at the Keck Observatory.
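The wobble speeds quoted above follow from momentum balance: star and planet orbit their common center of mass, so for a circular orbit M_star * v_star ≈ M_planet * v_planet. A small Python check of the article's figures:

```python
M_SUN = 1.989e30      # kg
M_EARTH = 5.972e24    # kg
M_JUPITER = 1.898e27  # kg

def reflex_speed_ms(m_planet_kg, v_planet_ms, m_star_kg=M_SUN):
    """Star's wobble speed in m/s induced by a planet on a circular orbit,
    from momentum balance about the center of mass."""
    return m_planet_kg * v_planet_ms / m_star_kg

# Earth orbits at ~29.78 km/s; Jupiter at ~13.07 km/s.
print(f"Earth:   {reflex_speed_ms(M_EARTH, 29_780):.2f} m/s")   # a few inches per second
print(f"Jupiter: {reflex_speed_ms(M_JUPITER, 13_070):.1f} m/s") # about 40 feet per second
```

Detecting the Earth-induced signal thus requires roughly centimeter-per-second velocity precision, which is why Jupiter-like planets were the first radial velocity discoveries.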

Before Kepler, the radial velocity method was the best way to find new exoplanets. Scientists using this method have found hundreds of worlds over the past 25 years, and we found the very first known exoplanet using this method. However, studying a star's radial velocity typically requires a lot of time for observation in order to separate the signal and any interferences.

The Keck data spans two decades, which is more than enough time to separate out the signal, and the data covers so many star systems that it could potentially contain evidence for thousands of new exoplanets. In fact, the dataset is so massive that one group of people could never get through all of it. To solve this problem, the team is releasing their data to the public, in the hopes that people will use that data to find even more exoplanets. If you're interested in discovering your very own alien planet, you can find the data and instructions on the team's website.

Why These Scientists Fear Contact With Space Aliens
The more we learn about the cosmos, the more it seems possible that we are not alone. The entire galaxy is teeming with worlds, and we're getting better at listening — so the question, "Is there anybody out there?" is one we may be able to answer soon.

But do we really want to know? If aliens are indeed out there, would they be friendly explorers, or destroyers of worlds? This is a serious question no longer confined to science fiction, because a growing group of astronomers has taken it upon themselves to do more than just listen. Some are advocating for a beacon swept across the galaxy, letting E.T. know we're home, to see if anyone comes calling. Others argue we would be wise to keep Earth to ourselves.

"There's a possibility that if we actively message, with the intention of getting the attention of an intelligent civilization, that the civilization we contact would not necessarily have our best interests in mind," says Lucianne Walkowicz, an astrophysicist at the Adler Planetarium in Chicago. "On the other hand, there might be great benefits. It could be something that ends life on Earth, and it might be something that accelerates the ability to live quality lives on Earth. We have no way of knowing." Like many other astronomers, Walkowicz isn't convinced one way or the other — but she said the global scientific community needs to talk about it.

Internet investor and science philanthropist Yuri Milner shows the Starchip, a microelectronic component spacecraft. The $100 million project is aimed at establishing the feasibility of sending a swarm of tiny spacecraft, each weighing far less than an ounce, to the Alpha Centauri star system.

That conversation is likely to heat up soon thanks to the Breakthrough Initiatives, a philanthropic organization dedicated to interstellar outreach that's funded by billionaire Russian tech mogul Yuri Milner. Its Breakthrough Message program would solicit ideas from around the world to compose a message to aliens and figure out how to send it. Outreach for the program may launch as soon as next year, according to Pete Worden, the Breakthrough Initiatives' director.

"We're well aware of the argument, 'Do you send things or not?' There's pretty vigorous opinion on both sides of our advisory panel," Worden says. "But it's a very useful exercise to start thinking about what to respond. What's the context? What best represents the people on Earth? This is an exercise for humanity, not necessarily just about what we would send." Members of the advisory panel have argued that a picture (and the thousand words it may be worth) would be the best message.

Next comes "more of a technical expertise question," Worden says. "Given that you have an image or images, how do you best encrypt it so it can be received?"

Breakthrough Message will work on those details, including how to transmit the pictures, whether through radio or laser transmitters; how to send it with high fidelity, so it's not rendered unreadable because of interference from the interstellar medium; which wavelengths of light to use, or whether to spread a message across a wide spectrum; how many times to send it, and how often; and myriad other technical concerns.

The scientific community continues to debate these questions. For instance, Philip Lubin of the University of California, Santa Barbara, has published research describing a laser array that could conceivably broadcast a signal through the observable universe.

Breakthrough is also working on where to send such a message, Worden adds. The $100 million Breakthrough Listen project is searching for any evidence of life in nearby star systems, which includes exoplanets out to a few hundred light years away.

"If six months from now, we start to see some interesting signals, we'll probably accelerate the Message program," he says.

The fact that there have been no signals yet does pose a conundrum. In a galaxy chock full of worlds, why isn't Earth crawling with alien visitors? The silence amid the presence of such plentiful planets is called the Fermi Paradox, named for the physicist Enrico Fermi, who first asked "Where is everybody?" in 1950.

In the decades since, astronomers have come up with possible explanations ranging from sociology to biological complexity. Aliens might be afraid of us, or consider us unworthy of attention, for instance. Or it may be that aliens communicate in ways that we can't comprehend, so we're just not hearing them. Or maybe aliens lack communication capability of any kind. Of course there's also the possibility that there are no aliens.

Stephen Hawking announces the "Breakthrough Starshot" initiative in New York in 2016. Dennis Van Tine / Star Max/IPx via AP
But those questions don't address the larger one: Whether it's a good idea to find out. Some scientists, most notably Stephen Hawking, are convinced the answer is a firm "No."

"We only have to look at ourselves to see how intelligent life might develop into something we wouldn't want to meet," Hawking said in 2010. He has compared meeting aliens to Christopher Columbus meeting Native Americans: "That didn't turn out so well," he said.

Others have warned of catastrophic consequences ripped from the pages of science fiction: Marauding aliens that could follow our message like a homing beacon, and come here to exploit Earth's resources, exploit humans, or even to destroy all life as we know it.

"Any civilization detecting our presence is likely to be technologically very advanced, and may not be disposed to treat us nicely. At the very least, the idea seems morally questionable," physicist Mark Buchanan argued in the journal Nature Physics last fall.

Other astronomers think it's worth the risk — and they add, somewhat darkly, that it's too late anyway. We are a loud species, and our messages have been making their way through the cosmos since the dawn of radio.

"If we are in danger of an alien invasion, it's too late," wrote Douglas Vakoch, the director of Messaging Extraterrestrial Intelligence (METI) International, in a rebuttal last fall in Nature Physics. Vakoch, the most prominent METI proponent, argues that if we don't tell anyone we're here, we could miss out on new technology that could help humanity, or even protect us from other, less friendly aliens.

David Grinspoon, an author and astrobiologist at the Planetary Science Institute in Tucson, says he first thought, "'Oh, come on, you've got to be kidding me.' It seems kind of absurd aliens are going to come invade us, steal our precious bodily fluids, breed us like cattle, 'To Serve Man,' " a reference to a 1962 episode of "The Twilight Zone" in which aliens hatch a plan to use humans as a food source.

Originally, Grinspoon thought there would be no harm in setting up a cosmic lighthouse. "But I've listened to the other side, and I think they have a point," he adds. "If you live in a jungle that might be full of hungry lions, do you jump down from your tree and go, 'Yoo-hoo?'"

Many have already tried, albeit some more seriously than others.

In 2008, NASA broadcast the Beatles tune "Across the Universe" toward Polaris, the North Star, commemorating the space agency's 50th birthday, the 45th anniversary of the Deep Space Network, and the 40th anniversary of that song.

Later that year, a tech startup working with Ukraine's space agency beamed pictures and messages to the exoplanet Gliese 581 c. Other, sillier messages to the stars have included a Doritos commercial and a bunch of Craigslist ads.

Last October, the European Space Agency broadcast 3,775 text messages toward Polaris. It's not known to harbor any exoplanets, and even if it did, those messages would take some 425 years to arrive; yet the exercise, conceived by an artist, raised alarm among astronomers. Several prominent scientists, including Walkowicz, signed on to a statement guarding against any future METI efforts until some sort of international consortium could reach agreement.

Video: Is an Alien Megastructure Causing this Distant Star's Strange Behavior? (1:58)
Even if we don't send a carefully crafted message, we're already reaching for the stars. The Voyager probe is beyond the solar system in interstellar space, speeding toward a star 17.6 light-years from Earth. Soon, if Milner has his way, we may be sending even more robotic emissaries.

Milner's $100 million Breakthrough Starshot aims to send a fleet of paper-thin space chips to the Alpha Centauri system within a generation's time. Just last fall, astronomers revealed that a potentially rocky, Earth-sized planet orbits Proxima Centauri, a small red dwarf star in that system and the nearest to our own, just four light years away. The chips would use a powerful laser to accelerate to near the speed of light, to cover the distance between the stars in just a few years. A team of scientists and engineers is working on how to build the chips and the laser, according to Worden.

"If we find something interesting, obviously we're going to get a lot more detail if we can visit, and fly by," he says. "Who knows what's possible in 50 years?"

But some time sooner than that, we will need to decide whether to say anything at all. Ultimately, those discussions are important for humanity, Worden, Walkowicz and Grinspoon all say.

"Maybe it's more important that we get our act together on Earth," Grinspoon says. "We are struggling to find a kind of global identity on this planet that will allow us to survive the problems we've created for ourselves. Why not treat this as something that allows us to practice that kind of thinking and action?"

Scientists May Have Solved the Biggest Mystery of the Big Bang

The European Organization for Nuclear Research (CERN) works to help us better understand what comprises the fabric of our universe. At this laboratory on the Franco-Swiss border near Geneva, engineers and physicists use particle accelerators and detectors to gain insight into the fundamental properties of matter and the laws of nature. Now, CERN scientists may have found an answer to one of the most pressing mysteries in the Standard Model of physics, and their research can be found in Nature Physics.

According to the Big Bang Theory, the universe began with the production of equal amounts of matter and antimatter. Since matter and antimatter cancel each other out, releasing light as they destroy each other, only a minuscule number of particles (mostly just radiation) should exist in the universe. But, clearly, we have more than just a few particles in our universe. So, what is the missing piece? Why is the amount of matter and the amount of antimatter so unbalanced?

The Standard Model of particle physics does account for a small percentage of this asymmetry, but the majority of the matter produced during the Big Bang remains unexplained. Noticing this serious gap in information, scientists theorized that the laws of physics are not the same for matter and antimatter (or particles and antiparticles). But how do they differ? Where do these laws separate?

This separation, known as charge-parity (CP) violation, has previously been observed in mesons (quark–antiquark particles), but never in baryons, the three-quark particles that make up most ordinary matter. Finding evidence of CP violation in baryons would help scientists account for the amount of matter in the universe and answer the question of why the universe is so asymmetric. After decades of effort, the scientists at CERN think they’ve done just that.

Using the LHCb detector at the Large Hadron Collider (LHC), CERN scientists were able to witness CP violation in baryons. Produced in the collider's proton–proton collisions, the matter (Λb0) and antimatter (Λb0-bar) versions of the particle were observed to decay in measurably different ways. According to the team’s report, “The LHCb data revealed a significant level of asymmetries in those CP-violation-sensitive quantities for the Λb0 and Λb0-bar baryon decays, with differences in some cases as large as 20 percent.”
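The size of such matter–antimatter differences is conventionally expressed as a decay asymmetry. A minimal sketch of that bookkeeping, with hypothetical event counts (the actual LHCb analysis works with CP-violation-sensitive angular observables, not raw counts):

```python
def cp_asymmetry(n_matter: int, n_antimatter: int) -> float:
    """Raw asymmetry A = (N - Nbar) / (N + Nbar).
    A = 0 means matter and antimatter behave identically;
    a nonzero value (with enough statistics) signals CP violation."""
    return (n_matter - n_antimatter) / (n_matter + n_antimatter)

# Hypothetical counts chosen to illustrate a 20% asymmetry, the
# largest size quoted for the Λb0 decays.
print(cp_asymmetry(600, 400))
```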


This discovery isn’t yet statistically significant enough to claim that it is definitive proof of a CP variation, but most believe that it is only a matter of time. “Particle physics results are dragged, kicking and screaming, out of the noise via careful statistical analysis; no discovery is complete until the chance of it being a fluke is below one in a million. This result isn’t there yet (it’s at about the one-in-a-thousand level),” says scientist Chris Lee. “The asymmetry will either be quickly strengthened or it will disappear entirely. However, given that the result for mesons is well and truly confirmed, it would be really strange for this result to turn out to be wrong.”
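The "one in a thousand" versus "one in a million" language maps onto the sigma levels particle physicists use: roughly 3σ for "evidence" and 5σ for "discovery". A sketch of the conversion, using the common one-sided Gaussian-tail convention and only the standard library:

```python
import math

def fluke_probability(sigma: float) -> float:
    """One-sided Gaussian tail probability: the chance that a pure
    statistical fluctuation mimics a signal of this significance."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

print(f"3 sigma: about 1 in {1 / fluke_probability(3):,.0f}")
print(f"5 sigma: about 1 in {1 / fluke_probability(5):,.0f}")
```

This reproduces the article's numbers: about 1 in 740 at 3σ (the "one-in-a-thousand level") and about 1 in 3.5 million at 5σ (safely below one in a million).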

This borderline discovery is a huge step toward fully understanding what happened before, during, and after the Big Bang. While developments in physics like this may seem, from the outside, to be technical achievements exciting only to scientists, this new information could be the key to unlocking one of the biggest mysteries in modern physics. If the scientists at CERN are able to prove that matter and antimatter do, in fact, obey different laws of physics, science as we know it would change, and we would need to reevaluate our understanding of the physical world.
Reference: Science Alert

New Research Shows the Universe May Have Once Been a Hologram

New research suggests that the universe may have been a hologram at one point in time, specifically a few hundred thousand years after the Big Bang. The study, published in the journal Physical Review Letters, is the latest research on the “holographic principle,” which suggests that the laws of physics can apply to the universe as a two-dimensional plane.

“We are proposing using this holographic universe, which is a very different model of the Big Bang than the popularly accepted one that relies on gravity and inflation,” said lead author Niayesh Afshordi, professor of physics and astronomy at the University of Waterloo and Perimeter Institute. “Each of these models makes distinct predictions that we can test as we refine our data and improve our theoretical understanding – all within the next five years.”

The theory suggests that the volume of space appears three-dimensional but is actually encoded on a two-dimensional boundary, or observer-dependent horizon, that requires one less dimension than it appears to. In short, we see space as three-dimensional, but it is projected from a two-dimensional source, much as a hologram is.

“The idea is similar to that of ordinary holograms, where a three-dimensional image is encoded in a two-dimensional surface, such as in the hologram on a credit card,” explained researcher Kostas Skenderis from the University of Southampton. “However, this time, the entire universe is encoded.”


The researchers arrived at this conclusion after examining irregularities in the cosmic microwave background — the Big Bang’s afterglow. The team used a model with one time dimension and two space dimensions, then plugged in actual observations of the universe, including cosmic microwave background data. The two fit remarkably well, but only for features of the sky no more than about 10 degrees across.

“I would say you don’t live in a hologram, but you could have come out of a hologram,” Afshordi told Gizmodo. “[In 2017], there are definitely three dimensions.”

While many accept the cosmic inflation that came after the Big Bang, our understanding of physics – including current general relativity and quantum mechanics theories – doesn’t work with what we observe. The fundamental laws of physics are incapable of explaining how the universe as we know it, with all its contents, could’ve fit in a small package that exponentially expanded.

This is where Afshordi’s research and the holographic model come in. These could lead to new theories about the Big Bang and a functioning theory of quantum gravity — a theory that meshes quantum mechanics with Einstein’s theory of gravity. “The key to understanding quantum gravity is understanding field theory in one lower dimension,” Afshordi says. “Holography is like a Rosetta Stone, translating between known theories of quantum fields without gravity and the uncharted territory of quantum gravity itself.”

The question remains, though: how did the universe transition from 2D to 3D? Further study is needed to explain this.

Dark energy emerges when energy conservation is violated
The conservation of energy is one of physicists' most cherished principles, but its violation could resolve a major scientific mystery: why is the expansion of the universe accelerating? That is the eye-catching claim of a group of theorists in France and Mexico, who have worked out that dark energy can take the form of Albert Einstein's cosmological constant by effectively sucking energy out of the cosmos as it expands.

The cosmological constant is a mathematical term describing an anti-gravitational force that Einstein had inserted into his equations of general relativity in order to counteract the mutual attraction of matter within a static universe. It was then described by Einstein as his "biggest blunder", after it was discovered that the universe is in fact expanding. But then the constant returned to favour in the late 1990s following the discovery that the universe's expansion is accelerating.

For many physicists, the cosmological constant is a natural candidate to explain dark energy. Since it is a property of space–time itself, the constant could represent the energy generated by the virtual particles that quantum mechanics dictates continually flit into and out of existence. Unfortunately the theoretical value of this "vacuum energy" is up to a staggering 120 orders of magnitude larger than observations of the universe's expansion imply.
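That mismatch can be reproduced with a naive dimensional estimate: take one Planck energy per Planck volume as the quantum-field-theory vacuum energy density and compare it with the observed dark-energy density. The observed value below is an approximate Planck-satellite-era figure, and the crude Planck-scale cutoff gives the most extreme version of the discrepancy (softer cutoffs give smaller, but still enormous, numbers):

```python
import math

# Physical constants (SI units)
hbar = 1.055e-34  # reduced Planck constant, J*s
G    = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8    # speed of light, m/s

# Naive vacuum energy density: one Planck energy per Planck volume,
# which works out to c^7 / (hbar * G^2) ~ 1e113 J/m^3
rho_vacuum = c**7 / (hbar * G**2)

# Observed dark-energy density (approximate), J/m^3
rho_observed = 6e-10

orders = math.log10(rho_vacuum / rho_observed)
print(f"discrepancy: about 10^{orders:.0f}")
```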

The latest work, carried out by Alejandro Perez and Thibaut Josset of Aix Marseille University together with Daniel Sudarsky of the National Autonomous University of Mexico, proposes that the cosmological constant is instead the running total of all the non-conserved energy in the history of the universe. The "constant" in fact would vary – increasing when energy flows out of the universe and decreasing when it returns. However, the constant would appear unchanging in our current (low-density) epoch because its rate of change would be proportional to the universe's mass density. In this scheme, vacuum energy does not contribute to the cosmological constant.

The researchers had to look beyond general relativity because, like Newtonian mechanics, it requires energy to be conserved. Strictly speaking, relativity requires the conservation of a multi-component "energy-momentum tensor". That conservation is manifest in the fact that, on very small scales, space–time is flat, even though Einstein's theory tells us that mass distorts the geometry of space–time.

In contrast, most attempts to devise a theory of quantum gravity require space–time to come in discrete grains at the smallest (Planck-length) scales. That graininess opens the door to energy non-conservation. Unfortunately, no fully formed quantum-gravity theory exists yet, and so the trio instead turned to a variant of general relativity known as unimodular gravity, which allows some violation of energy conservation. They found that when they constrained the amount of energy that can be lost from (or gained by) the universe to be consistent with the cosmological principle – on very large scales the process must be both homogeneous and isotropic – the unimodular equations generated a cosmological-constant-like entity.

In the absence of a proper understanding of Planck-scale space–time graininess, the researchers were unable to calculate the exact size of the cosmological constant. Instead, they incorporated the unimodular equations into a couple of phenomenological models that exhibit energy non-conservation. One of these describes how matter might propagate in granular space–time, while the other modifies quantum mechanics to account for the disappearance of superposition states at macroscopic scales.

These models both contain two free parameters, which were adjusted to make the models consistent with null results from experiments that have looked for energy non-conservation in our local universe. Despite this severe constraint, the researchers found that the models generated a cosmological constant of the same order of magnitude as that observed. "We are saying that even though each individual violation of energy conservation is tiny, the accumulated effect of these violations over the very long history of the universe can lead to dark energy and accelerated expansion," Perez says.

In future, he says it might be possible to subject the new idea to more direct tests, such as observing supernovae very precisely to try to work out whether the universe's accelerating expansion is driven by a constant or varying force. The model could also be improved so that it captures dark-energy's evolution from just after the Big Bang – and then comparing the results with observations of the cosmic microwave background.

If the trio are ultimately proved right, it would not mean physicists having to throw their long-established conservation principles completely out of the window. A variation in the cosmological constant, Perez says, could point to a deeper, more abstract kind of conservation law. "Just as heat is energy stored in the chaotic motion of molecules, the cosmological constant would be 'energy' stored in the dynamics of atoms of space–time," he explains. "This energy would only appear to be lost if space–time is assumed to be smooth."

Other physicists are cautiously supportive of the new work. George Ellis of the University of Cape Town in South Africa describes the research as "no more fanciful than many other ideas being explored in theoretical physics at present". The fact that the models predict energy to be "effectively conserved on solar-system scales" – a crucial check, he says – makes the proposal "viable" in his view.

Lee Smolin of the Perimeter Institute for Theoretical Physics in Canada, meanwhile, praises the researchers for their "fresh new idea", which he describes as "speculative, but in the best way". He says that the proposal is "probably wrong", but that if it's right "it is revolutionary".
The research is described in Physical Review Letters.

Physicists measure the loss of dark matter since the birth of the universe
Russian scientists have calculated that unstable particles could have made up no more than 2 percent to 5 percent of dark matter in the period immediately following the Big Bang. Their study has been published in Physical Review D.

"The discrepancy between the cosmological parameters in the modern universe and the universe shortly after the Big Bang can be explained by the fact that the proportion of dark matter has decreased. We have now, for the first time, been able to calculate how much dark matter could have been lost, and what the corresponding size of the unstable component would be," says co-author Igor Tkachev of the Department of Experimental Physics at INR.
Astronomers first suspected that there was a large proportion of hidden mass in the universe back in the 1930s, when Fritz Zwicky discovered "peculiarities" in a cluster of galaxies in the constellation Coma Berenices—the galaxies moved as if they were under the effect of gravity from an unseen source. This hidden mass, which is only deduced from its gravitational effect, was given the name dark matter. According to data from the Planck space telescope, the proportion of dark matter in the universe is 26.8 percent; the rest is "ordinary" matter (4.9 percent) and dark energy (68.3 percent).
The nature of dark matter remains unknown. However, its properties could potentially help scientists to solve a problem that arose after studying observations from the Planck telescope. This device accurately measured the fluctuations in the temperature of the cosmic microwave background radiation—the "echo" of the Big Bang. By measuring these fluctuations, the researchers were able to calculate key cosmological parameters using observations of the universe in the recombination era—approximately 300,000 years after the Big Bang.
However, when researchers directly measured the speed of the expansion of galaxies in the modern universe, it turned out that some of these parameters varied significantly—namely the Hubble parameter, which describes the rate of expansion of the universe, and also the parameter associated with the number of galaxies in clusters. "This variance was significantly more than margins of error and systematic errors known to us. Therefore, we are either dealing with some kind of unknown error, or the composition of the ancient universe is considerably different to the modern universe," says Tkachev.
The figure accompanying the study shows the concentration of the unstable component of dark matter, F, against the expansion rate of non-gravitationally bound objects (proportional to the age of the universe), for various combinations of Planck data on several different cosmological phenomena.

The discrepancy can be explained by the decaying dark matter (DDM) hypothesis, which states that in the early universe, there was more dark matter, but then part of it decayed.

"Let us imagine that dark matter consists of several components, as in ordinary matter (protons, electrons, neutrons, neutrinos, photons). And one component consists of unstable particles with a rather long lifespan. In the era of the formation of hydrogen, hundreds of thousands of years after the Big Bang, they are still in the universe, but by now (billions of years later), they have disappeared, having decayed into neutrinos or hypothetical relativistic particles. In that case, the amount of dark matter in the era of hydrogen formation and today will be different," says lead author Dmitry Gorbunov, a professor at MIPT and staff member at INR.
The authors of the study analyzed Planck data and compared them with the DDM model and the standard ΛCDM (Lambda-cold dark matter) model with stable dark matter. The comparison showed that the DDM model is more consistent with the observational data. However, the researchers found that the effect of gravitational lensing (the distortion of cosmic microwave background radiation by a gravitational field) greatly limits the proportion of decaying dark matter in the DDM model.
Using data from observations of various cosmological effects, the researchers were able to give an estimate of the relative concentration of the decaying components of dark matter in the region of 2 percent to 5 percent.
"This means that in today's universe, there is 5 percent less dark matter than in the recombination era. We are not currently able to say how quickly this unstable part decayed; dark matter may still be disintegrating even now, although that would be a different and considerably more complex model," says Tkachev.

More information: A. Chudaykin et al, Dark matter component decaying after recombination: Lensing constraints with Planck data, Physical Review D (2016). DOI: 10.1103/PhysR

This star has a secret – even better than 'alien megastructures'
When Yale researcher Tabetha Boyajian first focused on the star KIC 8462852 via the Kepler Space Telescope in September 2015, she didn't know what to make of it.

The lighting of the star was mysterious – it was far too dim for a star of its age and type, intermittently dipping in brightness. Theories around Tabby’s star, as it was nicknamed, quickly piled up, with some scientists attributing the atypical lighting to surrounding cosmic dust or nearby comets. But more excitable space enthusiasts predicted alien activity, arguing that only orbiting alien structures could block a star’s light so effectively.

The so-called alien megastructure hypothesis persisted longer than most extra-terrestrial-based theories, simply because scientists had few alternative ideas to explain the star's peculiar blinking – until now. And the latest theory is almost as intriguing as the alien hypothesis.

Dr. Boyajian and her team weren't the first to spot the star: it was actually discovered in 1890. But their questions about the star's light pattern – and the subsequent alien-related theories – made the star, well, something of a star.

“We’d never seen anything like this star,” Boyajian told the Atlantic in October 2015. “It was really weird. We thought it might be bad data or movement on the spacecraft, but everything checked out.”

KIC 8462852's story became more intriguing in January 2016, New Scientist reports, when a comparison of the first image taken of Tabby's star, in 1890, with one taken in 1989 revealed that the star had dimmed 14 percent in the interim 100 years. And over one particularly confusing two-day period, the star dipped in brightness by 22 percent.
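Astronomers usually express such fading in magnitudes rather than percentages; the conversion is a one-line logarithm. A sketch using the percentage figures quoted above:

```python
import math

def dimming_in_magnitudes(fractional_drop: float) -> float:
    """Convert a fractional flux drop (0.14 for a 14% fade) into a
    magnitude change: dm = -2.5 * log10(remaining flux fraction)."""
    return -2.5 * math.log10(1.0 - fractional_drop)

print(f"14% century-long fade: {dimming_in_magnitudes(0.14):.2f} mag")
print(f"22% two-day dip:      {dimming_in_magnitudes(0.22):.2f} mag")
```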

Tabby’s star kept scientists scratching their heads all last year. Volatility in light patterns is typical for young stars, but KIC 8462852 is mature.

“The steady brightness change in KIC 8462852 is pretty astounding,” Ben Montet, a scientist at the California Institute of Technology, said in an October statement. “It is unprecedented for this type of star to slowly fade for years, and we don’t see anything else like it in the Kepler data.”

Now, a team of scientists from Columbia University and the University of California, Berkeley, say they have found a reasonable explanation to KIC 8462852’s strange lighting.

“Following an initial suggestion by Wright & Sigurdsson, we propose that the secular dimming behavior is the result of the inspiral of a planetary body or bodies into KIC 8462852, which took place ~10–10⁴ years ago (depending on the planet mass),” the three authors write in a study to be published Monday in the Monthly Notices of the Royal Astronomical Society.

“Gravitational energy released as the body inspirals into the outer layers of the star caused a temporary and unobserved brightening, from which the stellar flux is now returning to the quiescent state.”

In other words, KIC 8462852 ate a planet sometime in the past 10,000 years.

The theory goes like this:

If KIC 8462852 did eat a planet – an extremely rare event, unless a collision pushed the planet out of its orbit – the star’s brightness would increase for a period of 200 to 10,000 years as it consumed the planet (a short time, by stellar standards). But once the burning was complete, the star would settle back to around its original brightness.

So we could be looking at KIC 8462852 during its post-planet digestion, as it dims back to normal, write the authors.

And KIC 8462852 could have been a messy eater, leaving crumbs – orbiting planetary debris – that periodically block its light. “This paper puts a merger scenario on the table in a credible way,” Jason Wright, an astronomer at Penn State University, tells New Scientist. “I think this moves it into the top tier of explanations.”

Testing theories of modified gravity
The accelerated expansion of the universe is usually attributed to a mysterious dark energy, but there’s another conceivable explanation: modified gravity. Unmodified gravity—that is, Einstein’s general relativity—satisfactorily accounts for the dynamics of the solar system, where precision measurements can be made without the confounding influence of dark matter. Nor have any violations been detected of one of general relativity’s principal ingredients, the strong equivalence principle, which posits that a body’s inertial and gravitational masses are identical even when part of that mass comes from its own gravitational binding energy.

But those observational constraints are not ineluctable. In particular, a class of gravitational theories called Galileon models can also pass them. In 2012 Lam Hui and Alberto Nicolis of Columbia University devised a cosmic test that could refute or confirm the models. Their test hinges on the models’ central feature: an additional scalar field that couples to mass. The coupling can be characterized by a charge-like parameter, Q. For most cosmic objects, Q has the same value as the inertial mass. But for a black hole, whose mass arises entirely from its gravitational binding energy, Q is zero; the strong equivalence principle is violated.

Galaxies fall through space away from low concentrations of mass and toward high concentrations. The supermassive black holes at the centers of some galaxies are carried along with the flow. But if gravity has a Galileon component, the black hole feels less of a tug than do the galaxy’s stars, interstellar medium, and dark-matter particles. The upshot, Hui and Nicolis realized, is that the black hole will lag the rest of the galaxy and slip away from its center. The displacement is arrested when the black hole reaches the point where the lag is offset by the presence of more of the galaxy’s gravitational mass on one side of the black hole than on the other. Given the right circumstances, the displacement can be measured.

Hui and Nicolis’s proposal has now itself been put to the test. Asha Asvathaman and Jeremy Heyl of the University of British Columbia, together with Hui, have applied it to two galaxies: M32, which is being pulled toward its larger neighbor, the Andromeda galaxy, and M87 (shown here), which is being pulled through the Virgo cluster of galaxies. Both M32 and M87 are elliptical galaxies. Because of their simple shapes, their centroids can be determined from optical observations. The locations of their respective black holes can be determined from radio observations. Although the limit on Galileon gravity that Asvathaman, Heyl, and Hui derived was too loose to refute or confirm the theory, they nevertheless validated the test itself. More precise astrometric observations could make it decisive. (A. Asvathaman, J. S. Heyl, L. Hui, Mon. Not. R. Astron. Soc., in press.)

A simple explanation of mysterious space-stretching ‘dark energy?’
For nearly 2 decades, cosmologists have known that the expansion of the universe is accelerating, as if some mysterious "dark energy" is blowing it up like a balloon. Just what dark energy is remains one of the biggest mysteries in physics. Now, a trio of theorists argues that dark energy could spring from a surprising source. Weirdly, they say, dark energy could come about because—contrary to what you learned in your high school physics class—the total amount of energy in the universe isn't fixed, or "conserved," but may gradually disappear.

"It's a great direction to explore," says George Ellis, a theorist at the University of Cape Town in South Africa, who was not involved in the work. But Antonio Padilla, a theorist at the University of Nottingham in the United Kingdom, says, "I don't necessarily buy what they've done."

Dark energy could be a new field, a bit like an electric field, that fills space. Or it could be part of space itself—a pressure inherent in the vacuum—called a cosmological constant. The second scenario jibes well with Einstein's theory of general relativity, which posits that gravity arises when mass and energy warp space and time. In fact, Einstein invented the cosmological constant—literally by adding a constant to his famous differential equations—to explain how the universe resisted collapsing under its own gravity. But he gave up on the idea as unnecessary when in the 1920s astronomers discovered that the universe isn't static, but is expanding as if born in an explosion.

With the observation that the expansion of the universe is accelerating, the cosmological constant has made a comeback. Bring in quantum mechanics and the case for the cosmological constant gets tricky, however. Quantum mechanics suggests the vacuum itself should fluctuate imperceptibly. In general relativity, those tiny quantum fluctuations produce an energy that would serve as the cosmological constant. Yet, it should be 120 orders of magnitude too big—big enough to obliterate the universe. So explaining why there is a cosmological constant, but just a little bitty one, poses a major conceptual puzzle for physicists. (When there was no need for a cosmological constant theorists assumed that some as-yet-unknown effect simply nailed it to zero.)

Now, Thibault Josset and Alejandro Perez of Aix-Marseille University in France and Daniel Sudarsky of the National Autonomous University of Mexico in Mexico City say they have found a way to get a reasonable value for the cosmological constant. They begin with a variant of general relativity that Einstein himself invented called unimodular gravity. General relativity assumes a mathematical symmetry called general covariance, which says that no matter how you label or map spacetime coordinates—i.e. positions and times of events—the predictions of the theory must be the same. That symmetry immediately requires that energy and momentum are conserved. Unimodular gravity possesses a more limited version of that mathematical symmetry.

Unimodular gravity reproduces most of the predictions of general relativity. However, in it quantum fluctuations of the vacuum do not produce gravity or add to the cosmological constant, which is once again just a constant that can be set to the desired value. There's a cost, however. Unimodular gravity doesn't require energy to be conserved, so theorists have to impose that constraint arbitrarily.

Now, however, Josset, Perez, and Sudarsky show that in unimodular gravity, if they just go with it and allow the violation of the conservation of energy and momentum, it actually sets the value of the cosmological constant. The argument is mathematical, but essentially the tiny bit of energy that disappears in the universe leaves its trace by gradually changing the cosmological constant. "In the model, dark energy is something that keeps track of how much energy and momentum has been lost over the history of the universe," Perez says.

To show that the theory gives reasonable results, the theorists consider two scenarios of how the violation of energy conservation might come about in theories that address foundational issues in quantum mechanics. For example, a theory called continuous spontaneous localization (CSL) tries to explain why a subatomic particle like an electron can literally be in two places at once, but a big object like a car cannot. CSL assumes that such two-places-at-once states spontaneously collapse to one place or the other with a probability that increases with an object's size, making it impossible for a large object to stay in the two-place state. The knock against CSL is that it doesn't conserve energy. But the theorists show that the amount that energy conservation is violated would be roughly enough to give a cosmological constant of the right size.

The work's novelty lies in using the violation of conservation of energy to tie dark energy to possible extensions of quantum theory, says Lee Smolin, a theorist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada. "It's in no way definitive," he says. "But it's an interesting hypothesis that unites these two things, which to my knowledge nobody has tried to connect before."

However, Padilla says the theorists are playing mathematical sleight-of-hand. They still have to assume that the cosmological constant starts with some small value that they don't explain, he says. But Ellis notes that physics abounds with unexplained constants such as the charge of the electron or the speed of light. "This just adds one more constant to the long list."

Padilla also argues that the work runs contrary to the idea that phenomena on the biggest scales should not depend on those at the smallest scales. "You're trying to describe something on the scale of the universe," he says. "Do you really expect it to be sensitive to the details of quantum mechanics?" But Smolin argues that the cosmological constant problem already links the cosmic and quantum realms. So, he says, "It's a new idea that could possibly be right and thus is worth getting interested in."

Physicists detect exotic looped trajectories of light in three-slit experiment
Physicists have performed a variation of the famous 200-year-old double-slit experiment that, for the first time, involves "exotic looped trajectories" of photons. These photons travel forward through one slit, then loop around and travel back through another slit, and then sometimes loop around again and travel forward through a third slit.

Interestingly, the contribution of these looped trajectories to the overall interference pattern leads to an apparent deviation from the usual form of the superposition principle. This apparent deviation can be understood as an incorrect application of the superposition principle—once the additional interference between looped and straight trajectories is accounted for, the superposition can be correctly applied.
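One standard way to quantify a deviation from naive superposition in three-slit experiments is the Sorkin combination of single-, double-, and triple-slit probabilities: if probabilities come from squaring summed amplitudes, the combination vanishes identically, so a nonzero measured value flags extra physics such as the looped paths. A minimal numeric check with arbitrary complex slit amplitudes (illustrative values, not data from the experiment):

```python
def born_probability(*amplitudes: complex) -> float:
    """Born-rule probability for a set of open slits:
    squared magnitude of the summed amplitudes."""
    return abs(sum(amplitudes)) ** 2

# Arbitrary complex amplitudes for slits A, B, C
a, b, c = 0.7 + 0.2j, -0.3 + 0.5j, 0.1 - 0.6j

P = born_probability
# Sorkin combination: identically zero under naive superposition
epsilon = (P(a, b, c)
           - P(a, b) - P(b, c) - P(a, c)
           + P(a) + P(b) + P(c))
print(epsilon)  # zero up to floating-point rounding
```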

The team of physicists, led by Omar S. Magaña-Loaiza and Israel De Leon, has published a paper on the new experiment in a recent issue of Nature Communications.

Loops of light

"Our work is the first experimental observation of looped trajectories," De Leon said. "Looped trajectories are extremely difficult to detect because of their low probability of occurrence. Previously, researchers had suggested that these exotic trajectories could exist but failed to observe them."

To increase the probability of the occurrence of looped trajectories, the researchers designed a three-slit structure that supports surface plasmons, which the scientists describe as "strongly confined electromagnetic fields that can exist at the surface of metals." The presence of these electromagnetic fields near the three slits increases the contribution of looped trajectories to the overall interference pattern by almost two orders of magnitude.

"We provided a physical explanation that links the probability of these exotic trajectories to the near fields around the slits," De Leon said. "As such, one can increase the strength of near fields around the slits to increase the probability of photons following looped trajectories."

Superposition principle accounting for looped trajectories

The new three-slit experiment with looped trajectories is just one of many variations of the original double-slit experiment, first performed by Thomas Young in 1801. Since then, researchers have been performing versions that use electrons, atoms, or molecules instead of photons.

One of the reasons why the double-slit experiment has attracted so much attention is that it represents a physical manifestation of the principle of quantum superposition. The observation that individual particles can create an interference pattern implies that the particles must travel through both slits at the same time. This ability to occupy two places, or states, at once, is the defining feature of quantum superposition.

Straight trajectories (green) and exotic looped trajectories (red, dashed, dotted) of light; the red cloud near the surface depicts the near fields, which increase the probability that photons follow looped trajectories. The graphs at left show simulations (top) and experimental results (bottom) of the large difference between the interference pattern predicted when the illuminated slit is treated independently (gray line) and that of the actual coupled system (blue line). The remarkable difference between the gray and blue lines is caused by the looped trajectories. Credit: Magaña-Loaiza et al. Nature Communications
So far, all previous versions of the experiment have produced results that appear to be accurately described by the principle of superposition. This is because looped trajectories are so rare under normal conditions that their contribution to the overall interference pattern is typically negligible, and so applying the superposition principle to those cases results in a very good approximation.

It is when the contribution of the looped trajectories becomes non-negligible that it becomes apparent that the total interference is not simply the superposition of individual wavefunctions of photons with straight trajectories, and so the interference pattern is not correctly described by the usual form of the superposition principle.

Magaña-Loaiza explained this apparent deviation in more detail:

"The superposition principle is always valid—what is not valid is the inaccurate application of the superposition principle to a system with two or three slits," he said.

"For the past two centuries, scientists have assumed that one cannot observe interference if only one slit is illuminated in a two- or three-slit interferometer, and this is because this scenario represents the usual or typical case.

"However, in our paper we demonstrate that this is true only if the probability of photons to follow looped trajectories is negligible. Surprisingly, interference fringes are formed when photons following looped trajectories interfere with photons following straight (direct) trajectories, even when only one of the three slits is illuminated.

"The superposition principle can be applied to this surprising scenario by using the sum or 'superposition' of two wavefunctions; one describing a straight trajectory and the other describing looped trajectories. Not taking into account looped trajectories would represent an incorrect application of the superposition principle.

"To some extent, this effect is strange because scientists know that Thomas Young observed interference when he illuminated both slits and not only one. This is true only if the probability of photons following looped trajectories is negligible."
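The accounting Magaña-Loaiza describes can be sketched numerically. The toy model below is an illustration, not the authors' calculation: the wavelength, slit spacing, and looped-path strength `eps` are all assumed values, and the looped trajectory is reduced to a single extra amplitude carrying a fixed propagation phase.

```python
import numpy as np

# Toy model of the three-slit accounting (an illustration, not the authors'
# calculation).  Each slit n contributes a straight-path amplitude
# exp(i k n d sin(theta)) at detection angle theta.
wavelength = 810e-9                 # assumed near-infrared wavelength
k = 2 * np.pi / wavelength
d = 4.6e-6                          # assumed slit separation
theta = np.linspace(-0.2, 0.2, 2001)
eps = 0.05                          # assumed relative strength of looped paths

def straight(n):
    """Amplitude of the direct path through slit n."""
    return np.exp(1j * k * n * d * np.sin(theta))

# Illuminate only slit 0.  The naive superposition (straight path only)
# predicts a featureless pattern:
I_naive = np.abs(straight(0))**2

# A looped trajectory enters slit 0, loops back through slit 1, and exits
# through slit 2, so it carries slit 2's straight-path phase plus a fixed
# propagation phase from the extra travel between the slits.
looped = eps * np.exp(1j * k * 2 * d) * straight(2)
I_full = np.abs(straight(0) + looped)**2

print(I_naive.std())    # ~0: no fringes without looped paths
print(I_full.std())     # clearly nonzero: fringes appear
```

Even a few-percent looped-path amplitude produces visible fringes from a single illuminated slit, which is the qualitative effect the experiment amplified with surface plasmons.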

In addition to impacting physicists' understanding of the superposition principle as it is applied to these experiments, the results also reveal new properties of light that could have applications for quantum simulators and other technologies that rely on interference effects.

"We believe that exotic looped paths can have important implications in the study of decoherence mechanisms in interferometry or to increase the complexity of certain protocols for quantum random walks, quantum simulators, and other algorithms used in quantum computation," De Leon said.

Actual footage shows what it was like to land on Saturn's moon Titan
In 2005, an alien probe flew through the hazy and cold atmosphere of Titan, the largest moon of Saturn, and landed on the world's surface.

That spacecraft — named the Huygens probe — was sent from Earth by the European Space Agency along with the Cassini spacecraft to help humanity learn more about Saturn and its 53 known moons.

SEE ALSO: These photos of a hexagon on Saturn are totally real

Thanks to a new video released by NASA, you can relive the Huygens probe's descent to Titan's surface, 12 years after it actually landed.

The video shows actual footage from the spacecraft's point of view as it passed through the hazy layers of Titan's atmosphere, spotted "drainage canals" suggesting that rivers of liquid methane run on the moon, and gently set down on the surface, NASA said.

Quaternions are introduced, October 16, 1843
Irish physicist, astronomer, and mathematician Sir William Rowan Hamilton introduced quaternions, a non-commutative extension of complex numbers, on October 16, 1843.

In fact, Benjamin Olinde Rodrigues had already reached a result in 1840 that amounted to the discovery of quaternions in all but name, though his work on the subject went largely unnoticed at the time.

As the story goes, Hamilton knew that complex numbers could be interpreted as points in a plane, and he was looking for a way to do the same for points in three-dimensional space. It had been established that points in space can be represented by their coordinates, which are triples of numbers, and for many years Hamilton had known how to add and subtract triples of numbers. However, Hamilton had been stuck on the problem of multiplication and division for a long time. He could not figure out how to calculate the quotient of the coordinates of two points in space.

On Monday, October 16, 1843, Hamilton was walking with his wife to the Royal Irish Academy where he was going to preside at a council meeting. The concepts behind quaternions began forming in his mind. When the answer came to him, Hamilton carved the formula for the quaternions into the stone of Dublin’s Brougham Bridge (Broom Bridge).

A plaque on the bridge commemorates the event: "Here as he walked by on the 16th of October 1843 Sir William Rowan Hamilton in a flash of genius discovered the fundamental formula for quaternion multiplication i² = j² = k² = ijk = −1 & cut it on a stone of this bridge."

The next day, Hamilton wrote a letter to his friend and fellow mathematician John T. Graves describing the train of thought that led to his discovery. The letter was published in the London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science in 1844, and states:

And here there dawned on me the notion that we must admit, in some sense, a fourth dimension of space for the purpose of calculating with triples. ... An electric circuit seemed to close, and a spark flashed forth.

Hamilton called a quadruple with these rules of multiplication a quaternion, and he devoted most of the remainder of his life to studying and teaching them.

Although the use of quaternions grew controversial in the late 19th century as vector algebra and vector calculus rose in popularity, they were made a mandatory examination topic in Dublin and for a while were the only advanced mathematics taught in some American universities.

Quaternions experienced a revival in the late 20th century, primarily due to their utility in describing spatial rotations.

Today, the quaternions are used in computer graphics, control theory, signal processing, and orbital mechanics, mainly for representing rotations/orientations. It is common for spacecraft attitude-control systems to be commanded in terms of quaternions, which are also used to telemeter their current attitude.
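Hamilton's multiplication rules and their modern use for rotation can be shown in a short sketch. This is a minimal illustration; real attitude-control code would use a vetted library rather than hand-rolled arithmetic.

```python
import math

# Minimal sketch of Hamilton's rules (i^2 = j^2 = k^2 = ijk = -1) and their
# modern use for spatial rotation.  Quaternions are tuples (w, x, y, z).
def qmul(a, b):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q via q v q*."""
    qconj = (q[0], -q[1], -q[2], -q[3])
    return qmul(qmul(q, (0.0,) + tuple(v)), qconj)[1:]

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
print(qmul(i, j))   # (0, 0, 0, 1): ij = k
print(qmul(j, i))   # (0, 0, 0, -1): ji = -k, so multiplication is non-commutative

# A 90-degree rotation about the z-axis maps the x-axis onto the y-axis.
half = math.radians(90) / 2
q = (math.cos(half), 0.0, 0.0, math.sin(half))
print(rotate(q, (1.0, 0.0, 0.0)))   # ~(0, 1, 0)
```

The non-commutativity that startled Hamilton's contemporaries is exactly what rotations require: rotating about x then z differs from z then x.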

The Sound Of Quantum Vacuum
Quantum mechanics dictates sensitivity limits in the measurements of displacement, velocity and acceleration. A recent experiment at the Niels Bohr Institute probes these limits, analyzing how quantum fluctuations set a sensor membrane into motion in the process of a measurement. The membrane is an accurate model for future ultraprecise quantum sensors, whose complex nature may even hold the key to overcoming fundamental quantum limits. The results are published in the prestigious scientific journal Proceedings of the National Academy of Sciences of the USA.

Vibrating strings and membranes are at the heart of many musical instruments. Plucking a string excites it to vibrations, at a frequency determined by its length and tension. Apart from the fundamental frequency - corresponding to the musical note - the string also vibrates at higher frequencies. These overtones influence how we perceive the 'sound' of the instrument, and allow us to tell a guitar from a violin. Similarly, beating a drumhead excites vibrations at a number of frequencies simultaneously.

These matters are no different when scaling down, from the half-meter bass drum in a classical orchestra to the half-millimeter-sized membrane studied recently at the Niels Bohr Institute. And yet, some things are not the same at all: using sophisticated optical measurement techniques, a team led by Professor Albert Schliesser could show that the membrane's vibrations, including all its overtones, follow the strange laws of quantum mechanics. In their experiment, these quantum laws implied that the mere attempt to precisely measure the membrane's vibrations sets it into motion. As if looking at a drum already made it hum!

A 'drum' with many tones
Although the membrane investigated by the Niels Bohr Institute team can be seen with the naked eye, the researchers used a laser to accurately track its motion. This indeed reveals a number of vibration resonances, all of which are measured simultaneously. Their frequencies are in the megahertz range, about a thousand times higher than the sound waves we hear, essentially because the membrane is much smaller than a musical instrument. But the analogies carry on: just like a violin sounds different depending on where the string is struck (sul tasto vs. sul ponticello), the researchers could tell from the spectrum of overtones at which location their membrane was excited by the laser beam.
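The overtone structure of such a membrane can be sketched with the textbook formula for an ideal square membrane under tension. The side length below matches the article's half-millimeter figure, while the wave speed is an assumed value chosen only to land in the megahertz range the article quotes.

```python
import math

# Illustrative estimate (not the NBI team's numbers): an ideal square membrane
# of side L vibrates in mode (m, n) at f_mn = (c / 2L) * sqrt(m^2 + n^2),
# where c is the transverse wave speed.
L = 0.5e-3     # 0.5 mm side length, as in the article
c = 700.0      # assumed wave speed in m/s (plausible for a tensioned SiN film)

def f(m, n):
    """Frequency of membrane mode (m, n) in Hz."""
    return (c / (2 * L)) * math.sqrt(m*m + n*n)

for m, n in [(1, 1), (1, 2), (2, 2), (1, 3)]:
    print(f"mode ({m},{n}): {f(m, n)/1e6:.2f} MHz")

# Unlike a string, the overtones are not integer multiples of the fundamental,
# and degenerate pairs such as (1,2) and (2,1) share one frequency.
```

This inharmonic overtone spectrum is what lets the researchers infer, drum-like, where the membrane was excited.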

Yet, observing the subtle quantum effects that the researchers were most interested in required a few more tricks. Albert Schliesser explains: "For one, there is the problem of vibrational energy loss, leading to what we call quantum decoherence. Think of it this way: in a violin, you provide a resonance body, which picks up the string vibrations and transforms them to sound waves carried away by the air. That's what you hear. We had to achieve exactly the opposite: confine the vibrations to the membrane only, so that we can follow its undisturbed quantum motion for as long as possible. For that we had to develop a special 'body' that cannot vibrate at the membrane's frequencies."

This was achieved by a so-called phononic crystal, a regular pattern of holes that exhibits a phononic bandgap, that is, a band of frequencies at which the structure cannot vibrate. Yeghishe Tsaturyan, a PhD student on the team, realized a membrane with such a special body at the Danchip nanofabrication facilities in Lyngby.

A second challenge consists in making sufficiently precise measurements. Using techniques from the field of optomechanics, which is Schliesser's expertise, the team created a dedicated experiment at the Niels Bohr Institute, based on a custom-built laser and a pair of highly reflecting mirrors between which the membrane is arranged. This allowed them to resolve vibrations with amplitudes much smaller than a proton's radius (about 1 femtometer).

"Making measurements this sensitive is not easy, in particular since pumps and other lab equipment vibrate with much larger amplitudes. So we have to make sure this doesn't show up in our measurement record," adds PhD student William Nielsen.

Vacuum beats the 'drum'
Yet it is exactly in the regime of ultra-precision measurements that things get interesting. There, it starts to matter that, according to quantum mechanics, the process of measuring the motion also influences it. In the experiment, this 'quantum measurement backaction' is caused by the inevitable quantum fluctuations of the laser light. In the framework of quantum optics, these are caused by quantum fluctuations of the electromagnetic field in empty space (vacuum). Odd as it sounds, this effect left clear signatures in the Niels Bohr Institute experiment's data, namely strong correlations between the quantum fluctuations of the light and the mechanical motion as measured by light.

"Observing and quantifying these quantum fluctuations is important to better understand how they can affect ultraprecision mechanical measurements - that is, measurements of displacement, velocity or acceleration. And here, the multi-mode nature of the membrane comes into play: not only is it a more accurate representation of real-world sensors, it may also contain the key to overcoming some of the traditional quantum limits to measurement precision with more sophisticated schemes exploiting quantum correlations," Albert Schliesser says, adding that, in the long run, quantum experiments with ever more complex mechanical objects may also provide an answer to the question of why we don't ever observe a bass drum in a quantum superposition (or will we?).

SOURCE: University of Copenhagen

Multiple copies of the Standard Model could solve the hierarchy problem
One of the unanswered questions in particle physics is the hierarchy problem, which has implications for understanding why some of the fundamental forces are so much stronger than others. The strengths of the forces are determined by the masses of their corresponding force-carrying particles (bosons), and these masses in turn are determined by the Higgs field, as measured by the Higgs vacuum expectation value.

So the hierarchy problem is often stated as a problem with the Higgs field: specifically, why is the Higgs vacuum expectation value so much smaller than the largest energy scales in the universe, in particular the scale at which gravity (by far the weakest of the forces) becomes strong? Reconciling this apparent discrepancy would impact physicists' understanding of particle physics at the most fundamental level.

"The hierarchy problem is one of the deepest questions in particle physics, and almost every one of its known solutions corresponds to a different vision of the universe," said Raffaele Tito D'Agnolo, a physicist at Princeton. "Identifying the correct answer will not just solve a conceptual puzzle, but will change the way we think about particle physics."

In a new paper published in Physical Review Letters, D'Agnolo and his coauthors have proposed a solution to the hierarchy problem that involves multiple (up to 10^16) copies of the Standard Model, each with a different Higgs vacuum expectation value. In this model, the universe consists of many sectors, each of which is governed by its own version of the Standard Model with its own Higgs vacuum expectation value. Our sector is the one with the smallest nonzero value.
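A toy simulation conveys the statistical heart of the idea. This is an illustration of the scanning argument, not the authors' model: if the Higgs mass-squared parameters of N sectors are scanned roughly uniformly up to a cutoff Λ², the smallest nonzero value typically sits near Λ²/N, so a large N pushes one sector's weak scale far below the cutoff without tuning.

```python
import random

# Toy sketch of N-naturalness scanning (my illustration, not the authors'
# model): draw N Higgs mass-squared parameters uniformly in [-Lambda^2,
# Lambda^2] and record the smallest |m_H^2|, averaged over many trials.
random.seed(0)

def smallest_vev_scale(N, Lambda2=1.0, trials=200):
    """Average smallest |m_H^2| among N uniformly scanned sectors."""
    total = 0.0
    for _ in range(trials):
        m2 = [random.uniform(-Lambda2, Lambda2) for _ in range(N)]
        total += min(abs(x) for x in m2)
    return total / trials

for N in (10, 100, 1000):
    print(N, smallest_vev_scale(N))   # shrinks roughly as Lambda^2 / N
```

The 1/N scaling is why the proposal needs such a large number of copies to span the gap between the weak scale and the highest energy scales.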

If, in the very early universe, all sectors had comparable temperatures and seemingly equal chances of dominating, why did our sector, with the smallest nonzero Higgs vacuum expectation value, come to dominate? The physicists explain this by introducing a new field called the "reheaton," which reheats the universe as it decays. They show that there are several ways in which the reheaton could have preferentially decayed into, and deposited the majority of its energy into, the sector with the smallest Higgs vacuum expectation value, causing this sector to eventually dominate and become our observable universe.

Compared to other proposed solutions to the hierarchy problem, such as supersymmetry and extra dimensions, the new proposal—which the physicists call "N-naturalness"—is different in that it does not rely solely on new particles. Although it shares some features with both supersymmetry and extra dimensions, one of its unique characteristics is that cosmological dynamics, not just new particles, are central to the solution.

"N-naturalness is qualitatively different from the solutions to the hierarchy problem proposed in the past, and it predicts signals in cosmic microwave background (CMB) experiments and large-scale structure surveys, two probes of nature that were thought to be unrelated to the problem," D'Agnolo said.

As the physicists explain, it should be possible to detect signatures of N-naturalness by searching for signs of the existence of other sectors. For instance, future CMB experiments might detect extra radiation and changes in neutrino cosmology, since neutrinos in nearby sectors are expected to be slightly heavier and less abundant than those in our sector. This approach is interesting for another reason: the neutrinos in the other sectors are also a viable dark matter candidate, which the researchers plan to study in more detail. Future experiments might also find signatures of N-naturalness in the form of a larger-than-expected mass of axion particles, as well as supersymmetric signatures due to possible connections to supersymmetry.

"If new relativistic species are not detected by the next generation of CMB experiments (Stage 4), then I will stop thinking of N-naturalness as a possible solution to the hierarchy problem," D'Agnolo said. "According to the current timeline, these experiments should start taking data around 2020 and reach their physics goals in approximately five years."

Universe May Have Lost 'Unstable' Dark Matter
The early universe may have contained more dark matter than there is today, new research suggests. The findings could help scientists better understand what the universe was like just after the Big Bang, researchers said.

Most of the matter in the universe seems to be invisible and largely intangible; it holds galaxies together and only interacts with the more familiar matter through its gravitational pull. Researchers call the strange stuff dark matter, and one of the biggest questions for astrophysicists is what it actually is and how it might evolve or decay. [Twisted Physics: 7 Mind-Blowing Findings]

New work by a team of Russian scientists may offer insight into that question. Dmitry Gorbunov, of the Moscow Institute of Physics and Technology; Igor Tkachev, head of the Department of Experimental Physics at the Institute for Nuclear Research in Russia; and Anton Chudaykin, of Novosibirsk State University in Russia considered whether some unstable dark matter might have decayed since the universe's early days, turning from whatever type of particle or particles make up dark matter — that's still unknown — into lighter particles.

"We have now, for the first time, been able to calculate how much dark matter could have been lost and what the corresponding size of the unstable component would be," Tkachev said in a statement.

Their new calculations suggest that no more than 5 percent of the current amount of dark matter in the universe could have been lost since the Big Bang.

Besides suggesting new properties for the elusive dark matter, the work could be important in helping scientists understand how the universe has changed over time, the researchers said. For example, the findings may show how the universe's rate of expansion has varied and what happened in the universe's first few hundred thousand years, when matter as we know it started to form into atoms.

Mysterious matter

Dark matter is a kind of matter that has mass, so it exerts a gravitational pull. However, it doesn't interact through electromagnetism with ordinary matter, so it is invisible. That is, it doesn't reflect or absorb light. The lack of electrical charge also makes dark matter intangible. Physicists are still debating what kind of particles make up dark matter, but most researchers agree that the substance accounts for some four-fifths of the matter in the universe.

Researchers have said Planck telescope data shows only about 4.9 percent of the universe is ordinary matter, about 26.8 percent is dark matter, and the remaining 68.3 percent is dark energy, which accelerates universal expansion.
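The "four-fifths" figure follows directly from the Planck numbers; a quick arithmetic check (not part of the study):

```python
# Planck values quoted above: 4.9% ordinary matter, 26.8% dark matter,
# 68.3% dark energy.  Dark matter's share of all *matter* is then:
ordinary, dark, dark_energy = 4.9, 26.8, 68.3
assert abs(ordinary + dark + dark_energy - 100) < 0.1   # shares sum to 100%

frac = dark / (dark + ordinary)
print(f"{frac:.2f}")   # 0.85, i.e. roughly four-fifths of the matter
```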

Unstable universe

In their study, the team looked at data from the Planck space telescope, which studies the cosmic microwave background from a vantage point about 932,000 miles (1.5 million kilometers) from Earth. The cosmic microwave background is an "echo" of the Big Bang; it's the radiation from photons (light) that first started moving freely through the universe. By studying fluctuations in that radiation, it's possible to calculate the value of different parameters, such as how fast the universe was expanding, at the time the radiation was emitted.

What they found was that the universe in its early days — about 300,000 years after it formed — behaved a bit differently than it does now. That conclusion comes from measuring the rate of expansion, as well as the number of galaxies in clusters, which are easier to explain if the amount of dark matter was anywhere from 2-5 percent greater than it is today.

To get that figure, the researchers compared the real universe with two models: one that assumed dark matter is stable and one that assumed the total amount of dark matter could change. The latter model did a better job of producing something like the universe seen today. So the early universe might have had two kinds of dark matter, the researchers said in a statement: one kind that decays into other particles and another that remains stable over billions of years.
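The scenario can be summarized in a schematic two-component model. The parameter values below are illustrative choices, not the authors' fit: a stable component plus an unstable one whose decay removes a few percent of today's total.

```python
import math

# Schematic two-component dark matter model in the spirit of the study
# (parameter values are illustrative, not the authors'): a stable component
# plus an unstable component with lifetime tau that decays away over time.
def dm_density(t, stable=0.95, unstable_0=0.05, tau=5.0):
    """Relative dark matter density at time t (arbitrary units)."""
    return stable + unstable_0 * math.exp(-t / tau)

early = dm_density(0.0)      # just after the Big Bang
today = dm_density(100.0)    # unstable part has essentially fully decayed

lost_fraction = (early - today) / today
print(f"fraction lost relative to today: {lost_fraction:.3f}")   # ~0.053
```

With a 5% unstable component, the amount lost relative to today's density sits right at the study's upper bound.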

"We are not currently able to say how quickly this unstable part decayed; dark matter may still be disintegrating even now," Tkachev said in a statement.

In addition, by looking at gravitational lensing – the bending of light by massive objects – of the background radiation, the researchers found an upper limit for how much of that dark matter had to decay, the scientists said. The study appears in the journal Physical Review D.

Vera Rubin, Astronomer Who Did Pioneering Work on Dark Matter, Dies at 88
Vera Rubin, a pioneering astronomer who helped find powerful evidence of dark matter, has died, her son said Monday.

She was 88.

Allan Rubin, a professor of geosciences at Princeton University, said his mother died Sunday night of natural causes. He said the Philadelphia native had been living in the Princeton area.

Vera Rubin found that galaxies don't quite rotate the way they were predicted, and that lent support to the theory that some other force was at work, namely dark matter.

Dark matter, which still hasn't been directly observed, makes up about 27 percent of the universe — as opposed to the 5 percent that is normal matter. Scientists better understand what dark matter isn't than what it is.

Rubin's scientific achievements earned her numerous awards and honors, including a National Medal of Science presented by President Bill Clinton in 1993 "for her pioneering research programs in observational cosmology." She also became the second female astronomer to be elected to the National Academy of Sciences.

"It goes without saying that, as a woman scientist, Vera Rubin had to overcome a number of barriers along the way," California Institute of Technology physicist Sean Carroll tweeted Monday.

Rubin's interest in astronomy began as a young girl and grew with the involvement of her father, Philip Cooper, an electrical engineer who helped her build a telescope and took her to meetings of amateur astronomers.

Although Rubin said her parents were extremely supportive of her career choice, she said in a 1995 interview with the American Institute of Physics that her father had suggested she become a mathematician, concerned that it would be difficult for her to make a living as an astronomer.

She was the only astronomy major to graduate from Vassar College in 1948. When she sought to enroll as a graduate student at Princeton, she learned women were not allowed in the university's graduate astronomy program, so she instead earned her master's degree from Cornell University.

Rubin earned her doctorate from Georgetown University, where she later worked as a faculty member for several years before working at the Carnegie Institution in Washington, a nonprofit scientific research center.

During her career, Rubin examined more than 200 galaxies.

"Vera Rubin was a national treasure as an accomplished astronomer and a wonderful role model for young scientists," said Matthew Scott, president of the Carnegie Institution. "We are very saddened by this loss."

China's Hunt for Signals From the Dark Universe
“So far we have collected about 1.8 billion cosmic rays, among them more than 1 million particles are high energy electrons,” Professor Fan Yizhong, a member of the mission team at the Purple Mountain Observatory in Nanjing, under the Chinese Academy of Sciences (CAS), told gbtimes. China’s dark matter-hunting satellite DAMPE celebrated its one year anniversary in space over the weekend, with the team now looking for unexpected results among collected data.

Launched on December 17, 2015, the 1,900kg DArk Matter Particle Explorer (DAMPE) has spent the year measuring the spectra of extremely energetic gamma-rays and cosmic rays with the aim of identifying possible dark matter signatures. DAMPE, which is also known as Wukong, after the Monkey King in the classic Chinese novel Journey to the West, was carried on a Long March 2D booster and placed in a 500km-altitude orbit.
Scientists reported on Monday, Dec. 21, 2015 that China's ground stations had received the first data from DAMPE. A ground station in Kashgar, Xinjiang tracked "Wukong," taking around seven minutes to receive and record the data, which was then transmitted to the National Space Science Center, the Chinese Academy of Sciences (CAS) reported in a statement.

DAMPE boasts a massive surface area, capable not only of observing high volumes of cosmic rays but also of surveying the sky at high energies. It uses four instruments to capture high-energy particles and trace them back to their origin: a BGO calorimeter, a plastic scintillator detector, a neutron detector and a silicon-tungsten tracker. Some of these particles are believed to come from dark matter collisions, and new insight into dark matter could in turn support a wealth of scientific pursuits, from studying ocean depths on icy moons to mapping out the layers of celestial bodies.

"[It's] an exciting mission," Princeton University's David Spergel said of DAMPE. A recent study in the Astrophysical Journal proposed that the solar system might be growing dark matter "hairs" sprouting from Earth.

"When gravity interacts with the cold dark matter gas during galaxy formation, all particles within a stream continue traveling at the same velocity," explained Gary Prézeau of NASA's Jet Propulsion Laboratory, Pasadena, California, who proposes the existence of long filaments of dark matter, or "hairs."

Based on many observations of its gravitational pull in action, scientists are certain that dark matter exists, and have measured how much of it there is in the universe to an accuracy of better than one percent. The leading theory is that dark matter is "cold," meaning it doesn't move around much, and it is "dark" insofar as it doesn't produce or interact with light.

Galaxies, which contain stars made of ordinary matter, form because of fluctuations in the density of dark matter. Gravity acts as the glue that holds both the ordinary and dark matter together in galaxies.

According to calculations done in the 1990s and simulations performed in the last decade, dark matter forms "fine-grained streams" of particles that move at the same velocity and orbit galaxies such as ours. "A stream can be much larger than the solar system itself, and there are many different streams crisscrossing our galactic neighborhood," Prézeau said.

Prézeau likens the formation of fine-grained streams of dark matter to mixing chocolate and vanilla ice cream. Swirl a scoop of each together a few times and you get a mixed pattern, but you can still see the individual colors.

But what happens when one of these streams approaches a planet such as Earth? Prézeau used computer simulations to find out. His analysis finds that when a dark matter stream goes through a planet, the stream particles focus into an ultra-dense filament, or "hair," of dark matter. In fact, there should be many such hairs sprouting from Earth.

A stream of ordinary matter would not go through Earth and out the other side. But from the point of view of dark matter, Earth is no obstacle. According to Prézeau's simulations, Earth's gravity would focus and bend the stream of dark matter particles into a narrow, dense hair.

Hairs emerging from planets have both "roots," the densest concentration of dark matter particles in the hair, and "tips," where the hair ends. When particles of a dark matter stream pass through Earth’s core, they focus at the "root" of a hair, where the density of the particles is about a billion times more than average. The root of such a hair should be around 600,000 miles (1 million kilometers) away from the surface, or twice as far as the moon. The stream particles that graze Earth's surface will form the tip of the hair, about twice as far from Earth as the hair’s root.
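The million-kilometer scale of a hair's root can be roughly checked with a back-of-the-envelope gravitational-focusing estimate. This is an order-of-magnitude calculation of my own, not Prézeau's simulation; it lands within a factor of a few of the quoted figure, since particles crossing the denser core focus closer than the grazing particles estimated here.

```python
# Order-of-magnitude estimate: a slow dark matter stream passing Earth at
# impact parameter b is gravitationally deflected by roughly
# theta ~ 2GM/(b v^2), so the rays converge at a focal distance f ~ b/theta.
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24     # mass of Earth, kg
R = 6.371e6      # radius of Earth, m (impact parameter for grazing particles)
v = 2.2e5        # typical galactic dark matter speed, ~220 km/s

focal = R**2 * v**2 / (2 * G * M)
print(f"{focal/1e9:.1f} million km")
# A few million km for grazing particles; core-crossing particles focus
# closer, consistent with the ~1 million km root (twice the lunar distance)
# described above.
```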

"If we could pinpoint the location of the root of these hairs, we could potentially send a probe there and get a bonanza of data about dark matter," Prézeau said.

A stream passing through Jupiter's core would produce even denser roots: almost 1 trillion times denser than the original stream, according to Prézeau's simulations.

"Dark matter has eluded all attempts at direct detection for over 30 years. The roots of dark matter hairs would be an attractive place to look, given how dense they are thought to be,” said Charles Lawrence, chief scientist for JPL’s astronomy, physics and technology directorate.

Another fascinating finding from these computer simulations is that the changes in density found inside our planet – from the inner core, to the outer core, to the mantle, to the crust – would be reflected in the hairs. The hairs would have "kinks" in them that correspond to the transitions between the different layers of Earth.

Theoretically, if it were possible to obtain this information, scientists could use hairs of cold dark matter to map out the layers of any planetary body, and even infer the depths of oceans on icy moons.

DAMPE is testing the theory that dark matter particles may annihilate or decay and then produce high-energy gamma rays or cosmic rays – in particular, electron/positron pairs. With the widest observation spectrum and highest energy resolution of any dark matter probe in the world, DAMPE will collect the evidence.

The data analysis of the DAMPE collaboration, which includes institutions from across China and international partners from Italy and Switzerland, has concentrated on high-energy cosmic rays, in particular electrons.

"We are looking forward to finding something 'unexpected' in the cosmic ray and gamma-ray spectra," Fan says. The team is looking to publish their first results in early 2017.

DAMPE meanwhile will continue to scan in all directions for the second year of its three-year mission, before switching in the third to focus on areas where dark matter is most likely to be observed. The spacecraft carries four science payloads in total and has the potential to advance the understanding of the origin and propagation mechanism of high-energy cosmic rays, as well as to enable new discoveries in high-energy gamma-ray astronomy.

The Daily Galaxy

Baylor Physics Ph.D. Graduate Quoted in "How Realistic Is the Interstellar Ship from 'Passengers'?"
The movie "Passengers," which opened yesterday (Dec. 21), explores the fascinations and perils of interstellar travel, but could the kind of starship portrayed in the movie ever exist in real life?

The film begins on board the Starship Avalon, which is carrying more than 5,000 passengers to a distant, habitable planet known as Homestead II.

With the ship travelling at half the speed of light, the crew and passengers are expected to hibernate for 120 years before arriving. That is, until somebody accidentally wakes up 90 years early.
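
The quoted numbers can be checked with textbook special relativity. This sketch assumes the film's "120 years" is measured in Earth's frame, which the movie does not actually specify:

```python
import math

# A quick special-relativity sanity check on the Avalon's itinerary.
# Numbers are from the film; whether "120 years" is ship time or Earth
# time is left unstated, so this treats it as the Earth-frame duration.
beta = 0.5                          # v/c
gamma = 1 / math.sqrt(1 - beta**2)  # time-dilation factor

earth_years = 120.0
distance_ly = beta * earth_years    # light-years covered in the Earth frame
ship_years = earth_years / gamma    # proper time elapsed on board

print(f"gamma = {gamma:.3f}")
print(f"distance covered: {distance_ly:.0f} ly")
print(f"onboard (proper) time: {ship_years:.1f} yr")
```

At only half the speed of light, time dilation is mild (about 15%), which is why the film needs hibernation rather than relativity to get its passengers across the gulf.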

Is there anything remotely realistic about this spaceship? That question was posed to several space travel experts, as well as to Guy Hendrix Dyas, the film's production designer. Dyas looked at the history of movie spaceships (including the vehicles from the "Star Trek" and "Star Wars" universes) in his quest to come up with something unique for the new film.

The Avalon has three long, thin modules that wrap around a common center and spin (sort of like stripes on a barbershop pole). Dyas said he based that design on sycamore seeds. It appears that the spin also provides the ship with artificial gravity, similar to fictional ships in the movies "Interstellar" and "2001: A Space Odyssey." The ship is powered by eight nuclear fusion reactors, Dyas said, and can run autonomously, healing most systems even with the crew asleep (as seen in the film).

The ship's immense structure is about 1 kilometer (0.62 miles) in length, and Dyas said he imagines that it was assembled in space over decades. The film takes place at an indeterminate point in the future, Dyas said, but he assumed that by the time the ship was being built, humans would have the ability to mine some of the materials from nearby asteroids or the moon to save on transportation costs.

"My approach to the [ship] design was that I tried to go about it as though I was a cruise liner ship designer," Dyas said. "I wanted to put myself in the shoes of somebody who had been designing a craft that had a portion of it dedicated to entertainment, and of course that led to the array of colors and textural changes in the ship."

This approach led Dyas to design the more functional areas (such as the mess hall) in stainless steel, while a classy passenger pub was decorated in rich oranges, golds and reds, for example.

Banks of hibernation pods occupy huge halls in the ship. The crew slumbers in separate quarters, inaccessible to the passengers. The pods are clustered into small groups, perhaps (Dyas suggests) so that if one cluster fails, the passengers in the other clusters are theoretically unaffected.

The hibernation procedure is not really described in the film, but what's clear to moviegoers is what happens afterward: passengers are soothed by a holographic figure explaining where they are. They are escorted to an elevator, then guided to their individual cabin, where they can relax for the last four months of the journey.

In between resting in their quarters, passengers can also get to know the rest of the 5,000 people in common areas, such as the mess hall, the grand concourse, the pool or the bar.

While "Passengers" shows people placed in a hibernation state for decades at a time, that kind of technology does not exist today. There are situations, however, where patients can be put into induced comas with cooled saline solutions for a few days to allow traumatic injuries to heal.

In 2015, a company called SpaceWorks received a NASA Innovative Advanced Concepts grant to investigate the possibility of extending the timeframe of an induced stasis in humans even further than what is currently possible. Aerospace engineer John Bradford, the company's COO, said that induced stasis should be possible given that some mammals can hibernate for months. (NIAC grants are for early-stage work in far-off technologies.)

"We're not trying to extend the human lifetime," Bradford said, so the technology that SpaceWorks is pursuing is different from what is shown in "Passengers." But in other respects, the movie shows essentially the same thing his company strives for.

"We're trying to put people in a small container to minimize the mass and power requirements, and the consumables [during spaceflight]," he added, saying that during a long Mars journey of perhaps six months, putting astronauts into stasis would cut down on the amount of food required for the mission, not to mention the possibility of crew boredom.

And what about exercise? Bradford said it would be possible to keep up an astronaut's muscle mass using neuromuscular electric stimulation; there have been some positive results in comatose patients using that technique, he said.

Bradford said he had been lucky enough to see "Passengers" before its release, and that he was really pleased to see an emphasis on hibernation, and what happens in the moments after waking up, when the passengers are disoriented and extremely tired (since hibernation or stasis is not the same as sleep).

"That part of the storyline is usually jumped over," he said.

Nuclear fusion is a possible source of propulsion for interstellar ships, but the problem is the size of the reactors that would need to be assembled in space, or launched there, according to some scientists we talked to. So other methods are being considered to get spacecraft going at interstellar speeds.

One idea under consideration by Philip Lubin, a physics professor at the University of California, Santa Barbara, uses lasers. Under another NIAC grant, he is developing a concept known as Directed Energy Propulsion for Interstellar Exploration, which would generate propulsion from laser photons reflected in a mirror. The long-term goal is to create a spacecraft that can, like in "Passengers," move at a significant fraction of the speed of light.

Antimatter engines are another possibility for fueling interstellar ships, said Andreas Tziolas, the co-founder and president of Icarus Interstellar. Antimatter particles are naturally occurring particles that are "opposites" to regular matter particles — so the positron is the antimatter equivalent to the electron; the particles have the same mass but are different in other ways, including electric charge (the electron is negative, the positron is positive). When matter and antimatter collide, they annihilate, and release energy.
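
A quick worked example shows why annihilation is so attractive energetically. This is standard E = mc² arithmetic, not anything specific to Icarus Interstellar's designs:

```python
# Order-of-magnitude sketch of why antimatter is attractive as fuel:
# annihilation converts ALL of the rest mass involved into energy
# (E = m c^2), versus roughly 0.7% for hydrogen fusion and ~0.1% for
# fission.

C = 2.998e8  # speed of light, m/s

def annihilation_energy_joules(antimatter_kg):
    # Both the antimatter and an equal mass of normal matter annihilate,
    # so the total converted mass is twice the antimatter mass.
    return 2 * antimatter_kg * C**2

e = annihilation_energy_joules(0.001)  # one gram of antimatter
print(f"{e:.2e} J  (~{e / 4.184e12:.0f} kilotons of TNT)")
```

A single gram of antimatter thus yields tens of kilotons of TNT equivalent, which is also why Tziolas's caveat matters: the energy comes out as photons that are hard to capture and direct.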

"The energy [an antimatter engine] generates is very pure in that it generates a lot of photons when matter reacts with antimatter," he said. "All of the matter is annihilated and it turns into pure photonic energy. However, the photons themselves are hard to capture."

Though it's not stated directly in the film, it's possible the "Passengers" ship is being fueled by the interstellar medium — the tenuous collection of hydrogen particles that populates much of the universe. This concept was proposed in a 1960 thought experiment by American physicist Robert Bussard, who argued it would allow a ship to travel without having to haul fuel along for the ride.

But there's a problem with that idea, according to Geoffrey Landis, a science fiction author and NASA physicist. Since 1960, scientists have discovered that the medium is too sparse to allow fusion to happen, Landis said.

"The idea was, if you don't carry your fuel with you, you might be able to avoid having a simply enormous fuel tank," he said. But with that theory debunked, the problem remains about how to get to such an incredible speed while still hauling fuel with you.

From a practical standpoint, Landis also agreed that a ship that size would likely have to be built largely in space, and that will probably require asteroid mining.

While asteroid mining is still in the future, there are a couple of companies that are getting started on prospecting. Both Deep Space Industries and Planetary Resources have plans to scout out nearby asteroids to learn about their composition, and the possibilities for getting spacecraft out there. Asteroid-mining technology is in an early stage, but both companies are generating other products (such as Earth observation) that have received some support from customers.

Building a business case would take some time, but Landis said it would be very possible to create a spacecraft from extraterrestrial resources.

"In the long term, if we're going to build these enormous habitats, we are going to have to build them from material in space," he said. "That's a very feasible idea. There's literally millions of asteroids out there from which we could harvest materials without having to drag it out of the gravity well of the Earth."

Ship design

Landis also seemed to think that the Avalon creates gravity by rotating.

"I'm getting a little tired of artificial gravity in 'Star Trek' and 'Star Wars,'" he said, referring to the ability of the ships in these long-standing franchises to generate gravity by more theoretical means.

Experts interviewed for the story agreed that, in general, the ship also appears to take into account human factors, which means designing an environment so that it can best accommodate how humans operate.

An example is how the environment is decorated. Even on the International Space Station, the sterile gray interior is populated with pictures, signs and other mementoes from past crews. Individual astronauts can decorate their quarters to their liking, so that they have family pictures to look at during their six-month missions. So the décor choices that Dyas made are important in real-world spaceflight as well.

Looking at previews for the movie, Tziolas said he thinks the Starship Avalon is similar to the concept that Icarus Interstellar has proposed for an interstellar spaceship. Called Project Hyperion, this craft also has cruise ship-like amenities, room for 5,000 passengers and a spinning design for artificial gravity.

Tziolas added that he is pleased that Hollywood is getting more realistic with its ship designs in general.

So could the ship from "Passengers" really exist? Our experts seemed to agree that there are some aspects that reflect real-world science, but some key questions remain about how such a massive vessel would make an interstellar trek.

Shutting a new door on locality
The classic mystery of quantum mechanics concerns the passage of a particle through two slits simultaneously. We know from our understanding of the double-slit experiment that the photon must have traveled through both slits, yet we can never actually observe the photon passing through both slits at the same time. If we find the photon in one place, its probability amplitude to be anywhere else immediately vanishes.

At least, that is the familiar story. In a new experiment described in Scientific Reports, Ryo Okamoto and Shigeki Takeuchi of Kyoto University in Japan have shown that under the right conditions, a single photon can have observable physical effects in two places at once.

Understanding the interaction of light passing through two slits seemed much simpler back when Thomas Young made this sketch (of water wave interference) in the early 1800s.
The power of postselection

The heightened sort of nonlocality demonstrated in Kyoto occurs only in the presence of postselection. Consider that identically prepared photons impinging on, for instance, a double slit will land at different points on the screen some time later. The question posed by Yakir Aharonov, Peter Bergmann, and Joel Lebowitz (ABL) in 1964, and revisited in even more dramatic form by Aharonov, David Albert, and Lev Vaidman (AAV) in 1988, was the following: If we look only at the subensemble of photons that land at some particular point on the screen, does that observation give us new information about those photons? Might it allow us to draw more conclusions about what the photons were doing between the time they were prepared and the time we selected a particular region on the screen?

The suggestion that we can chart the progress of photons on their journey may sound heretical at first. The standard lore is that a wave function is a complete description of the quantum state, and that the uncertainty principle prevents one from knowing both where a particle came from and where it will go. Yet ABL and AAV showed rigorously—using only the standard rules of quantum mechanics—that if some measuring apparatus had been interacting with the photons, its final state (the “pointer position” on the meter, as measurement theorists put it) would depend on which subset of photons had been considered. (See the article by Aharonov, Sandu Popescu, and Jeff Tollaksen, Physics Today, November 2010, page 27.)

In the ABL case, the interaction with the meter can be thought of as collapsing the photons into one state or another. Depending on which measurement outcome occurs, the photons may subsequently be more likely to reach one point or another on the screen: The pointer position becomes correlated with the final position of the photons. Classical statistics is sufficient for working out what the average pointer position should be, conditioned on a particular final state. (AAV extended this work to consider postselected “weak” measurements, a scenario that yields purely quantum results one could not obtain classically. Weak measurements raise even more thorny questions about the foundations of quantum mechanics that go beyond the scope of this article.)

Many experiments have confirmed these predictions about the outcomes of conditional measurements. Experimenters have applied postselection to studying interpretational issues in quantum theory and to the practical purpose of amplifying small effects to improve measurement sensitivity. Postselection has also become an important element in the toolbox of quantum information, with applications in linear-optical approaches to quantum computing and the closely related measurement-based quantum computing.

Head-scratching consequences

Although the formulas put forward by ABL and AAV are unavoidable consequences of quantum theory, they cry out for interpretation—especially since they expose new counterintuitive effects. Suppose an experimenter prepares a photon in a symmetric superposition of three states, |ψi⟩ = (|A⟩ + |B⟩ + |C⟩) / √3, but later finds the particle in a different superposition of those three states, |ψf⟩ = (|A⟩ + |B⟩ – |C⟩) / √3. What can the experimenter say about where the particle was between preparation and postselection? In 1991 Aharonov and Vaidman showed that a measurement of the particle number in state A would yield a value of 1 whenever the postselection succeeded; yet by symmetry, so would a measurement of the particle number in state B. In other words, the postselected particle would be certain to influence a measurement apparatus looking for it at A, and equally certain to influence a measurement apparatus looking for it at B. It’s akin to saying that the conditional probability is 100% to be at A and 100% to be at B.

Obviously, that situation never arises classically. But in the quantum world, every measurement disturbs the system, and the two conditional probabilities correspond to different physical situations: one in which a measurement interaction occurred at A, and one in which it occurred at B. Suppose the experimenter looks for the particle at A but doesn’t find it. That projects the original state onto (|B⟩ + |C⟩) / √2, which is orthogonal to |ψf⟩. Therefore, if the experimenter does not find the particle at A, the postselection never succeeds. It follows trivially that whenever the postselection does succeed, the particle must have been found at A. My group carried out a verification of that prediction in 2004.
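
The "certain at A, and equally certain at B" claim can be checked numerically with the standard ABL conditional-probability rule. This sketch is an illustration of the arithmetic, not a reproduction of the original papers:

```python
from math import sqrt

# Numerical check of the three-box claim, using the ABL rule for a
# projective measurement made between pre- and postselection:
#   P(found | post) = |<f|P|i>|^2 / ( |<f|P|i>|^2 + |<f|(1-P)|i>|^2 )

psi_i = [1 / sqrt(3), 1 / sqrt(3), 1 / sqrt(3)]   # (|A> + |B> + |C>)/sqrt(3)
psi_f = [1 / sqrt(3), 1 / sqrt(3), -1 / sqrt(3)]  # (|A> + |B> - |C>)/sqrt(3)

def abl_probability(box):
    """ABL probability that a projective look in `box` (0=A, 1=B, 2=C)
    finds the particle, given that the postselection succeeds."""
    # <f|P|i>: only the `box` component survives the projector.
    amp_yes = psi_f[box] * psi_i[box]
    # <f|(1-P)|i>: the other two components.
    amp_no = sum(psi_f[k] * psi_i[k] for k in range(3) if k != box)
    return amp_yes**2 / (amp_yes**2 + amp_no**2)

print(abl_probability(0))  # box A: certain to be found there
print(abl_probability(1))  # box B: also certain
```

The trick is visible in `amp_no`: for box A, the B and C amplitudes cancel exactly (⟨ψf|ψi⟩ − 1/3 = 0), so a failed look at A makes the postselection impossible, and by symmetry the same holds for B.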

If the results of that quantum shell game weren’t baffling enough, Aharonov and Vaidman proposed an even stranger thought experiment in 2003. In “How one shutter can close N slits,” they imagined a series of two or more slits (let us consider the simplest example, N=2, although the argument applies for any N). They then considered an experimenter who possessed a single shutter that could block one of the slits. Instead of placing the shutter to block one slit or the other, the experimenter prepares the shutter in a superposition of both positions, along with a third position that does not block a slit.

Aharonov and Vaidman showed that if the experimenter postselected in a different, properly chosen superposition (the |ψf⟩ of the original three-box problem), then any measurement of whether the shutter was blocking a particular one of the two slits would be guaranteed to yield an affirmative answer. If a single photon was sent toward slit A, the shutter would block its path. If the photon was sent toward slit B, the same shutter would still be guaranteed to block it. Most remarkably, the photon would also be stopped by the shutter if it was sent along any coherent superposition of the paths leading to the N slits. In other words, the experimenter can block the photon whether it goes through slit A or slit B—and also block the photon if it doesn't go through one particular slit or the other, with no need to determine which slit it hit. In this very real sense, the shutter acts as though it is in two places at once.

In a recent head-scratching experiment, researchers managed to get one shutter to simultaneously blockade two slits. Credit: Shigeki Takeuchi
One might be tempted to cry foul. “The shutter is affected by the choice to look for it in one place or another,” that person might say. “It wouldn’t be able to block two photons arriving at both slits.” Yet in a weak-measurement version of the proposal, in which the disturbance due to measurements is reduced to near zero, the predictions hold: The unperturbed shutter acts as though it is fully at A and also fully at B.

Building a quantum shutter

Of course, macroscopic shutters can no more easily be placed in superpositions than can macroscopic felines. Enter Okamoto and Takeuchi. Using ideas from linear-optical quantum computing, they replaced the slits and shutters with quantum routers—logic gates in which the presence of one photon (let’s call it a shutter photon) determines whether another photon (the probe photon) is transmitted or reflected.

The researchers prepared a shutter photon, which was generated via spontaneous parametric down-conversion, in a superposition of three paths. If the shutter photon took path 1, it should block a probe photon reaching router 1; if it took path 2, it should block a probe photon reaching router 2; and if it took path 3, it wouldn't block any photons at all. The three paths were then recombined with a phase shift, so that the firing of a certain detector signaled a measurement of the shutter photon in state |ψf⟩. Looking only at cases when the detector fired, the researchers could study the behavior of a postselected photon and see which slit or slits it blocked.

Lo and behold, when the postselection succeeded, Okamoto and Takeuchi found that the probe photons had a high probability of being reflected, regardless of which router they were sent toward. Because of subtle technical issues involved in the concatenation of the linear-optical quantum gates, the probability was limited to a maximum of 67%. The researchers observed about 61% experimentally, clearly exceeding the 50% threshold that could be achieved by a shutter constrained to be in one place at a time. Furthermore, the scientists confirmed the remarkable prediction that the same shutter was capable of blocking probe photons prepared in arbitrary superpositions of the two paths. The experiment demonstrates that any “shutter” (in this case a shutter photon) that is prepared in a given initial state and then measured in the appropriate final state must have been in front of multiple slits at the same time.

So, what does this experiment teach us about nonlocality? It’s hard to find a more down-to-earth approach to that question than my mother’s insightful proposal: “I hope this means I can shop in two different places at the same time.” Well, quantum mechanics may offer us fascinating new phenomena, but they always come at a price. In the case of quantum shopping, the postselection step means that my mother may get no shopping done at all. However, she may be in luck if she happens to know that only one of her N favorite stores still has the gift she wants, but she doesn’t know which store. An application of Okamoto and Takeuchi's result would mean that in the time it takes her to visit just a single store, she would have a finite probability of being certain to find the gift.

I’ll leave the calculation of the exact value of that probability as an exercise for holiday shoppers. But from the perspective of research into the foundations of quantum mechanics, we now have experimental confirmation that postselected systems exhibit a form of nonlocality even more striking than the ones we were familiar with before.

Aephraim Steinberg is a professor of physics at the University of Toronto, where he is a founding member of the Centre for Quantum Information and Quantum Control and a fellow of the Canadian Institute for Advanced Research.

Unexpected interaction between dark matter and ordinary matter in mini-spiral galaxies
Statistical analysis of mini-spiral galaxies shows an unexpected interaction between dark matter and ordinary matter. According to the SISSA study recently published in Monthly Notices of the Royal Astronomical Society, in which the relationship is obvious and cannot be explained in a trivial way within the context of the Standard Model, these objects may serve as "portals" to a completely new form of Physics which could explain phenomena like dark matter and dark energy.

They resemble a spiral galaxy like ours, only ten thousand times smaller: the mini-spiral galaxies studied by Professor Paolo Salucci of the International School for Advanced Studies (SISSA) in Trieste, and Ekaterina Karukes, who recently earned her PhD at SISSA, may prove to be "the portal that leads us to a whole new Physics, going beyond the standard model of particles to explain dark matter and dark energy," says Salucci. It is the first time these objects have been studied statistically, a method that can erase the "individual" variability of each object, thus revealing the general characteristics of the class. "We studied 36 galaxies, which was a sufficient number for statistical study. By doing this, we found a link between the structure of ordinary, or luminous matter like stars, dust and gas, with dark matter."
Dark matter is one of the great mysteries of Physics: since it does not emit electromagnetic radiation we cannot see it, even with the most sophisticated instruments. It was only discovered through its gravitational effects. Many believe it makes up about 90% of the matter in our Universe. "Most dark matter, according to the most credible hypotheses, would be non-baryonic, made of weakly interacting massive particles (WIMPs). It would not interact with ordinary matter except through gravitational force," continues Karukes. "Our observations, however, disagree with this notion."
Salucci and Karukes showed that, in the objects they observed, the structure of dark matter mimics visible matter in its own way. "If, for a given mass, the luminous matter in a galaxy is closely compacted, so is the dark matter. Similarly, if the former is more widespread than in other galaxies, so is the latter."
The "tip of the iceberg"
"It is a very strong effect that cannot be explained trivially using the Standard Model of particles." The Standard Model is the most widely accepted theory of particle physics in the scientific community. It explains the fundamental forces (and particles of matter), but it leaves some questions open, most notably the fact that it does not include gravitational force. Phenomena such as the existence of dark matter and dark energy make it clear to scientists that there is another sort of physics yet to be discovered and explored.
"From our observations, the phenomenon, and thus the necessity of new physics, is incredibly obvious. At the same time, this can be a starting point for exploring this new kind of physics," continues Salucci. "Even in the largest spiral galaxies we find effects similar to the ones we observed, but there they are signals that we can try to explain within the framework of the Standard Model through astrophysical processes inside galaxies. With mini-spirals, however, there is no simple explanation. These 36 galaxies are the tip of the iceberg of a phenomenon that we will probably find everywhere and that will help us discover what we cannot yet see."
More information: E.V. Karukes et al. The universal rotation curve of dwarf disk galaxies, Monthly Notices of the Royal Astronomical Society (2016). DOI: 10.1093/mnras/stw3055


Thermodynamics constrains interpretations of quantum mechanics
John Stewart Bell’s famous theorem is a statement about the nature of any theory whose predictions are compatible with those of quantum mechanics: If the theory is governed by hidden variables, unknown parameters that determine the results of measurements, it must also admit action at a distance. Now an international collaboration led by Adán Cabello has invoked a fundamental thermodynamics result, the Landauer erasure principle, to show that systems in hidden-variable theories must have an infinite memory to be compatible with quantum mechanics.

In quantum mechanics, measurements made at an experimenter’s whim cause a system to change its state; for a two-state electron system, for example, that change can be from spin up in the z-direction to spin down in the x-direction. Because of those changes, a system with hidden variables has to have a memory so that it knows how to respond to a series of measurements; if that memory is finite, it can serve only for a limited time. As an experimenter keeps making observations, the system must eventually update its memory, and according to the Landauer principle, the erasure of information associated with that update generates heat. (See the article by Eric Lutz and Sergio Ciliberto, Physics Today, September 2015, page 30.) In the electron example, if all spin measurements must be made along the x– or z-axis, each measurement dissipates a minimum amount of heat roughly equal to Boltzmann’s constant times the temperature. Cabello and colleagues show, however, that if an experimenter is free to make spin measurements anywhere in the xz-plane, the heat generated per measurement is unbounded—obviously, an unphysical result.
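
For scale, the Landauer bound invoked above is easy to evaluate. This is just the textbook per-bit formula, not the Cabello et al. analysis itself:

```python
import math

# The Landauer limit: erasing one bit of information at temperature T
# dissipates at least k_B * T * ln(2) of heat.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat_joules(temperature_k, bits=1):
    return bits * K_B * temperature_k * math.log(2)

# At room temperature the per-bit cost is tiny but nonzero. The thrust
# of the argument above is that a finite-memory hidden-variable system
# would have to pay this cost on every memory update, without bound.
print(f"{landauer_heat_joules(300):.2e} J per bit at 300 K")
```

About 3 × 10⁻²¹ J per bit: negligible for any one measurement, but unbounded in total if the experimenter can choose measurement axes freely, which is exactly the tension the paper exploits.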

Heat need not be produced in a hidden-variables theory if a system could store unlimited information. Such is the case, for example, for David Bohm’s version of quantum mechanics, in which a continuous pilot wave serves as the information repository. And in formulations of quantum mechanics without hidden variables, such as in the Copenhagen interpretation, heat is not generated because there is no deterministic register to update. (A. Cabello et al., Phys. Rev. A 94, 052127, 2016.)

Billions of Stars and Galaxies to Be Discovered in the Largest Cosmic Map Ever
The Pan-STARRS telescope in Hawaii spent four years scanning the skies to produce two petabytes of publicly available data. Now it's up to us to study it.

Need precision observations of a nearby star? Want to measure the light-years to a distant galaxy? Or do you just want to stare into the deep unknown and discover something no one has ever seen before? No problem! The Panoramic Survey Telescope & Rapid Response System (Pan-STARRS) has you covered: it has just released to the world the biggest digital sky survey ever carried out.

"The Pan-STARRS1 Surveys allow anyone to access millions of images and use the database and catalogs containing precision measurements of billions of stars and galaxies," said Ken Chambers, Director of the Pan-STARRS Observatories, in a statement. "Pan-STARRS has made discoveries from Near Earth Objects and Kuiper Belt Objects in the Solar System to lonely planets between the stars; it has mapped the dust in three dimensions in our galaxy and found new streams of stars; and it has found new kinds of exploding stars and distant quasars in the early universe."

The Pan-STARRS project is managed by the University of Hawaii's Institute for Astronomy (IfA), and the vast database is now hosted by the Space Telescope Science Institute (STScI) in Baltimore, Md. To say this survey is "big" is actually a disservice to just how gargantuan a data management task it is. According to the IfA, the entire survey takes up two petabytes of data, which, as the university playfully puts it, "is equivalent to one billion selfies, or one hundred times the total content of Wikipedia."
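
The university's comparisons check out arithmetically under reasonable assumptions. The per-selfie and Wikipedia sizes below are assumed figures chosen to match the analogy, not numbers from the release:

```python
# Sanity-checking the IfA's size comparisons for the 2 PB Pan-STARRS
# release. The per-photo and Wikipedia figures are assumptions here.

SURVEY_BYTES = 2 * 10**15     # 2 petabytes (decimal petabytes assumed)
SELFIE_BYTES = 2 * 10**6      # assume ~2 MB per smartphone photo
WIKIPEDIA_BYTES = 2 * 10**13  # assume ~20 TB for Wikipedia with media

print(SURVEY_BYTES // SELFIE_BYTES)     # about a billion selfies
print(SURVEY_BYTES // WIKIPEDIA_BYTES)  # about a hundred Wikipedias
```
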

Scientists Measure Antimatter for the First Time
Using a laser to excite the antiparticles (positrons and antiprotons), scientists can begin to measure the atomic structure of some of the most mysterious material in the universe.
Antimatter isn't the absurd theoretical substance it sounds like—it's just material composed of particles that have the same mass as conventional particles, but opposite charges. An electron with a positive charge is called a positron, and a proton with a negative charge is called an antiproton. Material composed of these antiparticles is called antimatter.

In a groundbreaking experiment published today in Nature, physicists at CERN, the European Organization for Nuclear Research's particle physics lab outside Geneva, have measured the energy levels of antihydrogen for the first time. Hydrogen is the simplest element on the periodic table, with just one electron and one proton, so making antihydrogen is easier than any other type of antimatter.

It's incredibly difficult to create antimatter and retain it for any extended period of time. This is because antimatter and matter annihilate each other when they come into contact. An electron and positron will zap each other out of existence, releasing energy in the form of light, and protons and antiprotons do the same. Considering conventional matter is floating all around our world, it can be quite difficult to keep antimatter from coming into contact with it.

"What you hear about in science fiction—that antimatter gets annihilated by normal matter—is 100 percent true," Jeffrey Hangst, a physicist at Denmark's Aarhus University who founded the ALPHA group at CERN to study antimatter, told NPR. "[It] is the greatest challenge in my everyday life."

To create antihydrogen, ALPHA physicists combined positrons and antiprotons in a vacuum tube and used extremely powerful magnetic fields to keep the resulting antihydrogen from colliding with the walls of the container. Using this technique, the team has successfully maintained antihydrogen atoms for about 15 minutes.

Electrons orbit the nucleus of conventional atoms at different energy levels, and when they move from one energy level to another, they let off energy that can be measured as light on the electromagnetic spectrum. The same is true for antihydrogen atoms.

The ALPHA researchers used a laser to excite the antihydrogen so the positrons would jump from a lower energy level to a higher one. Then, when the positrons return to the lower energy level, scientists can measure the light released. What they found is that positrons in antihydrogen move from one energy level to another in the same way that electrons do in normal hydrogen.

This seemingly benign finding has actually captured the attention of theoretical physicists around the world because it is central to one of the great mysteries of the universe: Why does anything exist at all? Models of the Big Bang suggest that an equal amount of matter and antimatter should have been created. As far as we can tell, all of the antimatter and matter of the universe should have been annihilated shortly after it was created. But here we are.

"Something happened," Hangst told NPR, "some small asymmetry that led some of the matter to survive, and we simply have no good idea that explains that right now."

It could be that antimatter doesn't obey the same laws of physics as matter. It could be that our models of the Big Bang are flawed. But whatever the true answer to our existence is, experiments with antimatter like this one could unlock the secrets. This first experiment suggests that antihydrogen behaves just like boring old hydrogen—but this is merely the beginning of a new kind of scientific study.

Europe's Bold Plan for a Moon Base Is Coming Together
Imagine an international research station on the moon, where astronauts and cosmonauts and taikonauts and any other-nauts from around the world conduct science experiments, gather resources, build infrastructure, study our home planet from afar, and erect a new radio telescope to probe the mysteries of the ancient cosmos. This is the vision of Jan Woerner, the German civil engineer who serves as the Director General of the European Space Agency. He calls it "Moon Village."

Moon Village isn't so much a literal village as it is a vision of worldwide cooperation in space. It is part of Woerner's larger concept of "Space 4.0."

Woerner, you see, breaks down the history of space exploration into four periods. All of ancient and classical astronomy is lumped into Space 1.0, the space race from Sputnik to Apollo is Space 2.0, and the establishment of the International Space Station defines the period of Space 3.0. As the largest space station—which holds the record for longest continuous human habitation, 16 years and counting—the ISS soars as a shining example of successful, long-term, peacetime international cooperation like no other program in the history of humankind.

Space 4.0 is a continuation of that spirit of global cooperation, and it represents the entry of private companies, academic institutions, and individual citizens into the exploration of the cosmos. Moon Village, part of Space 4.0, is a worldwide community of people who share the dream of becoming an interplanetary species.

"Somebody was asking me, 'When do you do it, and how much money do you need?' I said it's already progressing, as a village on Earth. The village starts with the first actor, and we have several actors right now, so it's already on its way," Woerner said to the Space Transportation Association (STA) at a Capitol Hill luncheon on December 9, as reported by Aviation Week.

Of course, all this sentiment is nice, but where are we in terms of building a physical moon base? Closer than you might think.

The rest of Europe has united behind Woerner's idea, as the science ministers of each ESA member state have endorsed Space 4.0. To that end, the European Space Agency began developing its first Lunar Lander, though the program was postponed in 2012 because Germany, which was covering 45 percent of the costs, couldn't convince the other member nations to put up the remaining 55 percent. Renewed interest in lunar exploration, with a German at the helm of ESA, could be enough to jumpstart the program again.

Meanwhile, ESA is investing in technologies to develop 3D printing methods that would work using lunar soil. The research could pave the way for constructing tools and even habitats on the moon. The British architecture firm Foster + Partners has gone so far as to design a catenary dome with a cellular structure that could guard an inflatable lunar habitat against both small pieces of debris and space radiation.

Other nations have their eyes set on the moon as well. India and Japan both have lunar rovers under development that they plan to launch before 2020. China has two sample return missions in the works and a plan to land on the far side of the moon for the very first time, all before the decade is out. The space agencies of Europe, Japan, Russia and China have all proposed missions to put astronauts on the moon in the coming decades.

Einstein's Theory Just Put the Brakes on the Sun's Spin
Although the sun is our nearest star, it still hides many secrets. But it seems that one solar conundrum may have been solved and a theory originally proposed in 1905 by Albert Einstein could be at the root of it all.

Twenty years ago, solar astronomers realized that the uppermost layer of the sun rotates slower than the rest of the sun's interior. This is odd. It is well known the sun rotates faster at its equator than at its poles — a phenomenon known as "differential rotation" that drives the sun's 11-year solar cycle — but the fact that the sun has a sluggish upper layer has been hard to understand. It's as if there's some kind of force trying to hold it in place while the lower layers churn below it.

Now, researchers from the University of Hawaii Institute for Astronomy (IfA), Brazil, and Stanford University may have stumbled on an answer, and it could all come down to fundamental physics. It seems that the light our sun generates has a braking effect on the sun's surface layers.

Dying Star Offers Glimpse of Earth's Doomsday in 5B Years
Five billion years from now, our sun will die. After running out of hydrogen fuel, it will start burning heavier and heavier elements in its fusion core, causing its body to bloat, shedding huge quantities of material into space via violent stellar winds. During this time, our star will expand to around 100 times its current size, becoming what is known as a "red giant." This dramatic expansion will engulf Mercury and Venus, the two closest planets to the sun.

But what is less clear is what will happen to Earth — will our planet go the way of Mercury and Venus and succumb to an ocean of superheated plasma? Or will our planet escape the worst of the sun's death throes to continue orbiting the tiny white dwarf star that will be left behind?

"We already know that our sun will be bigger and brighter [when entering the red giant phase], so that it will probably destroy any form of life on our planet," said Leen Decin, of the KU Leuven Institute of Astronomy, in a statement. "But will the Earth's rocky core survive the red giant phase and continue orbiting the white dwarf?"

With the help of the most powerful radio observatory on the planet, astronomers could soon have a clue by looking at a nearby star system that resembles how our solar system will look when the sun begins to die.

RELATED: Enjoy Earth Day While You Can, There Are Only 5 Billion Left

L2 Puppis is an evolved star located over 200 light-years from Earth. Though this seems far away, it's pretty much on our cosmic doorstep and well within the resolving power of the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile. Through precise measurements of the star, astronomers have deduced its mass and age, realizing that it is (or was) a sun-like star that's now 10 billion years old. It's also a prime example of a planetary nebula in the making.

Like our sun five billion years in the future, L2 Puppis is ripping itself apart, blasting huge quantities of gas into space. This process creates a massive glowing cloud, and this particular planetary nebula resembles a beautiful cosmic butterfly.

But that's not all. According to the new study published in the journal Astronomy & Astrophysics, L2 Puppis also appears to have a planet in tow, roughly 300 million kilometers from the star. Though this is roughly twice the distance at which Earth orbits the sun, it provides a very privileged view of a world orbiting a dying sun-like star. It's also an ominous preview of what's in store for Earth in a few billion years, and the researchers hope to study this unfortunate planet as it experiences the wrath of L2 Puppis.

"We discovered that L2 Puppis is about 10 billion years old," said Ward Homan, also from KU Leuven. "Five billion years ago, the star was an almost perfect twin of our sun as it is today, with the same mass. One third of this mass was lost during the evolution of the star. The same will happen with our sun in the very distant future."

RELATED: Real Doomsday: Earth Dead in 2.8 Billion Years

"Five billion years from now, the sun will have grown into a red giant star, more than a hundred times larger than its current size," said Decin. "It will also experience an intense mass loss through a very strong stellar wind. The end product of its evolution, 7 billion years from now, will be a tiny white dwarf star. This will be about the size of the Earth, but much heavier: one tea spoon of white dwarf material weighs about 5 tons."
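Decin's teaspoon figure can be reproduced with rough numbers. The sketch below assumes a white dwarf of about 0.6 solar masses squeezed into an Earth-sized sphere; both values are illustrative assumptions, not figures from the article:

```python
import math

M_SUN = 1.989e30    # solar mass, kg
R_EARTH = 6.371e6   # Earth radius, m

mass = 0.6 * M_SUN                          # assumed white dwarf mass
volume = 4 / 3 * math.pi * R_EARTH**3       # Earth-sized sphere
density = mass / volume                     # kg per cubic metre

teaspoon = 5e-6                             # ~5 mL, in cubic metres
print(f"density: {density:.1e} kg/m^3")
print(f"one teaspoon: {density * teaspoon / 1000:.1f} metric tons")
```

With these inputs the density comes out near 1.1 billion kilograms per cubic metre, and a teaspoonful works out to roughly five and a half metric tons, consistent with the "about 5 tons" quoted above.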

Astronomers often look to the stars to better understand our own place in the galaxy. In this case, they've glimpsed the future and seen a key part of the life cycle of a sun-like star. They've also seen a true doomsday, an event so final that it wrecks our sun, taking the nearest planets with it. And though Earth may or may not be swallowed whole by the swelling stellar inferno, it will be sterilized of life — on our planet's roasted surface at least.

Dark Matter Not So Clumpy After All
Dark matter, the mysteriously invisible substance that makes up about 27 percent of the mass in the universe, may not be as clumpy as scientists previously thought.

In 2013, researchers with Europe's Planck mission, which studied the oldest light in the universe, found that dark matter has lumped together over time through gravitational attraction. What started out as a smooth and even distribution of dark matter slowly formed dense chunks over time.

But new research at the European Southern Observatory's (ESO) Very Large Telescope (VLT) at the Paranal Observatory in Chile suggests that dark matter is not quite as clumpy as the Planck mission previously found.

"This latest result indicates that dark matter in the cosmic web, which accounts for about one-quarter of the content of the universe, is less clumpy than we previously believed," Massimo Viola, a researcher at the Leiden Observatory in the Netherlands who co-led the study, said in a statement.

To see how dark matter is distributed in the universe, the international team of researchers used data from the Kilo Degree Survey (KiDS) at the VLT Survey Telescope. This deep-sky survey looked at about 15 million galaxies in five patches of the southern sky, covering an area as big as 2,200 full moons (or 450 square degrees).

Because dark matter's gravity can bend light — a process called gravitational lensing — the light coming from these 15 million galaxies could reveal information about the structure and distribution of dark matter, the researchers suggest. In this study, they looked for a variation of this phenomenon known as weak gravitational lensing, or cosmic shear.

Weak gravitational lensing is a subtle effect that has to be measured with precision. When large-scale structures like galaxy clusters cause weak gravitational lensing, the light-warping effect is subtler and more difficult to detect than gravitational lensing around smaller objects like stars. But with high-resolution images taken by the VLT Survey Telescope, the researchers were able to detect this subtle effect. This study is the first to use this imaging method on such a large portion of the sky to map the invisible matter in the universe, the authors wrote.

When the researchers then used this data to calculate how clumpy dark matter is, they discovered that it is significantly smoother than the Planck satellite data had previously determined. This means that dark matter may be more evenly distributed than scientists have thought.

How dark matter has spread and clumped together since the Big Bang, 13.8 billion years ago, can provide insights into the evolution of the universe, according to co-author Hendrik Hildebrandt of the Argelander Institute for Astronomy in Bonn, Germany. "Our findings will help to refine our theoretical models of how the universe has grown from its inception up to the present day," Hildebrandt said in the same statement.

"We see an intriguing discrepancy with Planck cosmology at the moment," co-author Konrad Kuijken of the Leiden Observatory in the Netherlands, who is principal investigator of the KiDS survey, said in the statement. "Future missions such as the Euclid satellite and the Large Synoptic Survey Telescope will allow us to repeat these measurements and better understand what the universe is really telling us."

Scientists Catch "Virtual Particles" Hopping In and Out of Existence
About 400 light-years from here, in the area surrounding a neutron star, the electromagnetic field of this unbelievably dense object appears to be creating an area where matter spontaneously appears and then vanishes.

Quantum electrodynamics (QED) describes the relationships between particles of light, or photons, and electrically charged particles such as electrons and protons. The theories of QED suggest that the universe is full of "virtual particles," which are not really particles at all. They are fluctuations in quantum fields that have most of the same properties as particles, except they appear and vanish all the time. Scientists predicted the existence of virtual particles some 80 years ago, but we have never had experimental evidence of this process until now.


How can we possibly see such a thing? One of the properties virtual particles have in common with actual particles is that they both affect light. In addition, intense magnetic fields are thought to excite the activity of virtual particles, affecting any light that passes through that space more dramatically.

So a team of astronomers pointed our most advanced ground-based telescope, the European Southern Observatory's Very Large Telescope (VLT), at one of the densest objects we know of: a neutron star.

Neutron stars have magnetic fields that are billions of times stronger than our sun's. Using the VLT, Roberto Mignani from the Italian National Institute for Astrophysics (INAF) and his team observed visible light around the neutron star RX J1856.5-3754 and detected linear polarization—or the alignment of light waves according to external electromagnetic influences—in the empty space around the star. This is rather odd, because conventional relativity says that light should pass freely through a vacuum, such as space, without being altered. The degree of linear polarization was so high (around 16 percent) that the only known explanation lies in QED and the influence of virtual particles.

"According to QED, a highly magnetized vacuum behaves as a prism for the propagation of light, an effect known as vacuum birefringence," Mignani says. "The high linear polarization that we measured with the VLT can't be easily explained by our models unless the vacuum birefringence effects predicted by QED are included."


Vacuum birefringence was first predicted in the 1930s by Werner Heisenberg and Hans Heinrich Euler. It was an exciting time for the development of quantum mechanics, when many of the advanced theories still studied today were developed.

In the quantum realm, matter behaves very strangely to say the least. It violates both Newton's classical laws of physics and Einstein's theories of relativity and gravity. Matter can exist in two separate places at once. Entangled particles, separated by miles, can influence each other instantaneously. As far as we can tell, the smallest building blocks of matter exist with multiple, or even infinite properties, known as quantum states, until they are observed or measured.

Fortunately, we can model and even predict some quantum phenomena, and we do this using wave functions. A wave, such as a sine curve, is described by an equation that many different values satisfy. The same basic principle can be applied to physical models of particles that exist in different locations, with different properties, or sometimes don't exist at all. When a particle is measured, the wave function collapses, and the matter exists with only one set of properties, as you would expect. The researchers were able to measure the virtual particles around a neutron star indirectly, by measuring the light that passes through them.
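The collapse picture in the paragraph above can be caricatured in a few lines of Python. This is a toy illustration of the Born rule, not a physical simulation: a state is a table of amplitudes, and a measurement picks one outcome with probability equal to the squared amplitude, after which only that outcome remains.

```python
import random

def measure(amplitudes: dict[str, complex]) -> str:
    """Sample one outcome with probability |amplitude|^2 (the Born rule)."""
    outcomes = list(amplitudes)
    probs = [abs(a) ** 2 for a in amplitudes.values()]
    return random.choices(outcomes, weights=probs)[0]

# An equal superposition of two locations: over many "measurements",
# each outcome turns up about half the time.
state = {"here": 2**-0.5, "there": 2**-0.5}
print(measure(state))
```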

These concepts are so profound that Einstein and Niels Bohr famously debated, at length, whether the universe even exists as a tangible smattering of matter across the void, or if it is a fluid conglomerate of infinite possible realities until we observe it. The first experimental evidence of vacuum birefringence—absurdly strong electromagnetic forces tugging at the very foundations of matter—reminds us that this is still an open-ended question.

New theory of gravity might explain dark matter
A new theory of gravity might explain the curious motions of stars in galaxies. Emergent gravity, as the new theory is called, predicts the exact same deviation of motions that is usually explained by invoking dark matter. Prof. Erik Verlinde, renowned expert in string theory at the University of Amsterdam and the Delta Institute for Theoretical Physics, published a new research paper today in which he expands his groundbreaking views on the nature of gravity.

In 2010, Erik Verlinde surprised the world with a completely new theory of gravity. According to Verlinde, gravity is not a fundamental force of nature, but an emergent phenomenon. In the same way that temperature arises from the movement of microscopic particles, gravity emerges from the changes of fundamental bits of information, stored in the very structure of spacetime.

Newton's law from information

In his 2010 article (On the origin of gravity and the laws of Newton), Verlinde showed how Newton's famous second law, which describes how apples fall from trees and satellites stay in orbit, can be derived from these underlying microscopic building blocks. Extending his previous work and work done by others, Verlinde now shows how to understand the curious behaviour of stars in galaxies without adding the puzzling dark matter.

The outer regions of galaxies, like our own Milky Way, rotate much faster around the centre than can be accounted for by the quantity of ordinary matter like stars, planets and interstellar gases. Something else has to produce the required amount of gravitational force, so physicists proposed the existence of dark matter. Dark matter seems to dominate our universe, comprising more than 80 percent of all matter. Hitherto, the alleged dark matter particles have never been observed, despite many efforts to detect them.
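The mismatch is easy to see with rough numbers. Assuming a Milky Way-like galaxy holds about 10^11 solar masses of visible matter concentrated well inside the radii sampled (illustrative values, not figures from the article), the orbital speed that visible mass alone can support falls off with distance:

```python
import math

G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30         # solar mass, kg
KPC = 3.086e19           # one kiloparsec in metres

M_visible = 1e11 * M_SUN  # assumed visible mass of the galaxy

for r_kpc in (5, 10, 20, 40):
    # Keplerian speed for an orbit outside the enclosed mass: v = sqrt(GM/r)
    v = math.sqrt(G * M_visible / (r_kpc * KPC)) / 1000  # km/s
    print(f"r = {r_kpc:2d} kpc: predicted v = {v:.0f} km/s")
```

The predicted speed drops as one over the square root of radius, yet measured rotation curves stay roughly flat out to large radii: the discrepancy that dark matter, or Verlinde's emergent gravity, is invoked to explain.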

No need for dark matter

According to Erik Verlinde, there is no need to add a mysterious dark matter particle to the theory. In a new paper, which appeared today on the arXiv preprint server, Verlinde shows how his theory of gravity accurately predicts the velocities by which the stars rotate around the center of the Milky Way, as well as the motion of stars inside other galaxies.

"We have evidence that this new view of gravity actually agrees with the observations," says Verlinde. "At large scales, it seems, gravity just doesn't behave the way Einstein's theory predicts."

At first glance, Verlinde's theory presents features similar to modified theories of gravity like MOND (Modified Newtonian Dynamics, proposed by Mordehai Milgrom in 1983). However, where MOND tunes the theory to match the observations, Verlinde's theory starts from first principles. "A totally different starting point," according to Verlinde.

Adapting the holographic principle

One of the ingredients in Verlinde's theory is an adaptation of the holographic principle, introduced by his tutor Gerard 't Hooft (Nobel Prize 1999, Utrecht University) and Leonard Susskind (Stanford University). According to the holographic principle, all the information in the entire universe can be described on a giant imaginary sphere around it. Verlinde now shows that this idea is not quite correct—part of the information in our universe is contained in space itself.

This extra information is required to describe that other dark component of the universe: Dark energy, which is believed to be responsible for the accelerated expansion of the universe. Investigating the effects of this additional information on ordinary matter, Verlinde comes to a stunning conclusion. Whereas ordinary gravity can be encoded using the information on the imaginary sphere around the universe, as he showed in his 2010 work, the result of the additional information in the bulk of space is a force that nicely matches that attributed to dark matter.

On the brink of a scientific revolution

Gravity is in dire need of new approaches like the one by Verlinde, since it doesn't combine well with quantum physics. Both theories, crown jewels of 20th century physics, cannot be true at the same time. The problems arise in extreme conditions: near black holes, or during the Big Bang. Verlinde says, "Many theoretical physicists like me are working on a revision of the theory, and some major advancements have been made. We might be standing on the brink of a new scientific revolution that will radically change our views on the very nature of space, time and gravity."

Supersolids produced in exotic state of quantum matter
A mind-bogglingly strange state of matter may have finally made its appearance. Two teams of scientists report the creation of supersolids, which are both liquid and solid at the same time. Supersolids have a crystalline structure like a solid, but can simultaneously flow like a superfluid, a liquid that flows without friction.

Research teams from MIT and ETH Zurich both produced supersolids in an exotic form of matter known as a Bose-Einstein condensate. Reports of the work were published online on October 26 (by the MIT group) and September 28 (by the Zurich group).

Bose-Einstein condensates are created when a group of atoms, chilled to near absolute zero, huddle up into the same quantum state and begin behaving like a single entity. The scientists’ trick for creating a supersolid was to nudge the condensate, which is already a superfluid, into simultaneously behaving like a solid. To do so, the MIT and Zurich teams created regular density variations in the atoms — like the repeating crystal structure of a more typical solid — in the system. That density variation stays put, even though the fluid can still flow.

The new results may be the first supersolids ever created — at least by some definitions. “It’s certainly the first case where you can unambiguously look at a system and say this is both a superfluid and a solid,” says Sarang Gopalakrishnan of the College of Staten Island of the City University of New York. But the systems are far from what physicists predicted when they first dreamt up the strange materials.

Scientists originally expected supersolids to appear in helium-4 — an isotope of the element helium and the same gas that fills balloons at children’s birthday parties. Helium-4 can be chilled and pressurized to produce a superfluid or a solid. Supersolid helium would have been a mixture of these two states.

Previous claims of detecting supersolid helium-4, however, didn’t hold up to scrutiny (SN Online: 10/12/2012). So, says Nikolay Prokof’ev of the University of Massachusetts Amherst, “now we have to go to the artificial quantum matter.” Unlike helium-4, Bose-Einstein condensates can be precisely controlled with lasers, and tuned to behave as scientists wish.

The two groups of scientists formed their supersolids in different ways. By zapping their condensate with lasers, the MIT group induced an interaction that gave some of the atoms a shove. This motion caused an interference between the pushed and the motionless atoms that’s similar to the complex patterns of ripples that can occur when waves of water meet. As a result, zebralike stripes — alternating high- and low-density regions — formed in the material, indicating that it was a solid.

Applying a different method, the ETH Zurich team used two optical cavities — sets of mirrors between which light bounces back and forth repeatedly. The light waves inside the cavities caused atoms to interact and thereby arrange themselves into a crystalline pattern, with atoms separated by an integer number of wavelengths of light.

Authors of the two studies declined to comment on the research, as the papers have been submitted to embargoed journals.

“Experimentally, of course, these are absolutely fantastic achievements,” says Anatoly Kuklov of the College of Staten Island. But, he notes, the particles in the supersolid Bose-Einstein condensates do not interact as strongly as particles would in supersolid helium-4. The idea of a supersolid is so strange because superfluid and solid states compete, and in most materials atoms are forced to choose one or the other. But in Bose-Einstein condensates these two states can more easily live together in harmony, making the weird materials less counterintuitive than supersolid helium-4 would be.

Additionally, says Prokof’ev, “some people will say ‘OK, well, this does not qualify exactly for supersolid state,’” because the spacing of the density variations was set externally, rather than arising naturally as it would have in helium.

Still, such supersolids are interesting for their status as a strange and new type of material. “These are great works,” says Kuklov. “Wide attention is now being paid to supersolidity.”

You Can 3D Print Your Own Mini Universe
Have you ever wondered what the universe looks like in all of its entirety, or how it would feel to hold the universe in the palm of your hand? Good news: It is now possible to do both of these things — all you need is a 3D printer.

Researchers at the Imperial College London have created the blueprints for 3D printing the universe, and have provided the instructions online so anyone with access to a 3D printer can print their own miniature universe. You can see a video on the science behind the 3D-printed universe here.

The researchers' representation of the universe specifically depicts the cosmic microwave background (CMB), or a glowing light throughout the universe that is thought to be leftover radiation from the Big Bang, when the universe was born about 13.8 billion years ago.

Creating Antimatter Via Lasers?
Russian researchers develop calculations to explain the production and dynamics of positrons in the hole-boring regime of ultrahigh-intensity laser-matter interactions.
Dramatic advances in laser technologies are enabling novel studies to explore laser-matter interactions at ultrahigh intensity. By focusing high-power laser pulses, researchers routinely produce electric fields orders of magnitude greater than those found within atoms, and these fields may soon be intense enough to create matter from light.

Now, intriguing calculations from a research team at the Institute of Applied Physics of the Russian Academy of Sciences (IAP RAS), and reported this week in Physics of Plasmas, from AIP Publishing, explain the production and dynamics of electrons and positrons from ultrahigh-intensity laser-matter interactions. In other words: They’ve calculated how to create matter and antimatter via lasers.

Strong electric fields cause electrons to undergo huge radiation losses because a significant amount of their energy is converted into gamma rays -- high-energy photons, which are the particles that make up light. The high-energy photons produced by this process interact with the strong laser field and create electron-positron pairs. As a result, a new state of matter emerges: strongly interacting particles, optical fields, and gamma radiation, whose dynamics are governed by the interplay between classical physics phenomena and quantum processes.

A key concept behind the team’s work is based on the quantum electrodynamics (QED) prediction that “a strong electric field can, generally speaking, ‘boil the vacuum,’ which is full of ‘virtual particles,’ such as electron-positron pairs,” explained Igor Kostyukov of IAP RAS. “The field can convert these types of particles from a virtual state, in which the particles aren’t directly observable, to a real one.”

One impressive manifestation of this type of QED phenomenon is a self-sustained laser-driven QED cascade, which is a grand challenge yet to be observed in a laboratory.

But, what’s a QED cascade?

“Think of it as a chain reaction in which each chain link consists of sequential processes,” Kostyukov said. “It begins with acceleration of electrons and positrons within the laser field. This is followed by emission of high-energy photons by the accelerated electrons and positrons. Then, the decay of high-energy photons produces electron-positron pairs, which go on to new generations of cascade particles. A QED cascade leads to an avalanche-like production of electron-positron high-energy photon plasmas.”
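Kostyukov's chain-reaction description can be caricatured with a simple generational counting model. The multiplication factor and number of generations below are arbitrary illustrations, not values from the study:

```python
def cascade(pairs: int, new_pairs_per_pair: int, generations: int) -> list[int]:
    """Count electron-positron pairs per generation of a toy QED cascade."""
    counts = [pairs]
    for _ in range(generations):
        # existing pairs survive; photons emitted by each pair seed new pairs
        counts.append(counts[-1] * (1 + new_pairs_per_pair))
    return counts

# Starting from one seed pair, with 2 new pairs spawned per pair per
# generation, the growth is geometric, i.e. avalanche-like:
print(cascade(1, 2, 6))
```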

For this work, the researchers explored the interaction of a very intense laser pulse with a foil via numerical simulations.

“We expected to produce a large number of high-energy photons, and that some portion of them would decay and produce electron-positron pairs,” Kostyukov continued. “Our first surprise was that the number of high-energy photons produced by the positrons is much greater than that produced by the electrons of the foil. This led to an exponential -- very sharp -- growth of the number of positrons, which means that if we detect a larger number of positrons in a corresponding experiment we can conclude that most of them are generated in a QED cascade.”

They were also able to observe a distinct structure of the positron distribution in the simulations -- despite some randomness of the processes of photon emission and decay.

“By analyzing the positron motion in the electromagnetic fields in front of the foil analytically, we discovered that some characteristics of the motion regulate the positron distribution and lead to the helical-like structures observed in the simulations,” he added.

The team’s discoveries are of fundamental importance because the phenomenon they explored can accompany the laser-matter interaction at extreme intensities within a wider range of parameters. “It offers new insights into the properties of these types of interactions,” Kostyukov said. “More practical applications may include the development of advanced ideas for the laser-plasma sources of high-energy photons and positrons whose brilliance significantly exceeds that of the modern sources.”

So far, the researchers have focused on the initial stage of interaction when the electron-positron pairs they produced don’t significantly affect the laser-target interaction.

“Next, we’re exploring the nonlinear stage when the self-generated electron-positron plasma strongly modifies the interaction,” he said. “And we’ll also try to expand our results to more general configurations of the laser–matter interactions and other regimes of interactions -- taking a wider range of parameters into consideration.”


The article, "Production and dynamics of positrons in ultrahigh intensity laser-foil interactions," is authored by I. Yu. Kostyukov and E. N. Nerush. The article will appear in the journal Physics of Plasmas on September 27, 2016 (DOI: 10.1063/1.4962567). After that date, it can be accessed at

No, Astronomers Haven't Decided Dark Energy Is Nonexistent
This week, a number of media outlets have put out headlines like "The universe is expanding at an accelerating rate, or is it?” and “The Universe Is Expanding But Not At An Accelerating Rate New Research Debunks Nobel Prize Theory.” This excitement is due to a paper just published in Nature’s Scientific Reports called "Marginal evidence for cosmic acceleration from Type Ia supernovae,” by Nielsen, Guffanti and Sarkar.
Once you read the article, however, it’s safe to say there is no need to revise our present understanding of the universe. All the paper does is slightly reduce our certainty in what we know—and then only by discarding most of the cosmological data on which our understanding is based. It also ignores important details in the data it does consider. And even if you leave aside these issues, the headlines are wrong anyway. The study concluded that we’re now only 99.7 percent sure that the universe is accelerating, which is hardly the same as “it’s not accelerating.”
The initial discovery that the universe is expanding at an accelerating rate was made by two teams of astronomers in 1998 using Type Ia supernovae as cosmic measuring tools. Supernovae—exploding stars—are some of the most powerful blasts in the entire cosmos, roughly equivalent to a billion-billion-billion atomic bombs exploding at once. Type Ia’s are a special kind of supernova: unlike other supernovae, they all explode with just about the same luminosity every time, likely because they detonate at a critical mass limit. This similarity means that differences in their observed brightness are almost entirely due to how far away they are, which makes them ideal for measuring cosmic distances. Furthermore, these objects are relatively common, and they are so bright that we can see them billions of light-years away. This shows us how the universe appeared billions of years ago, which we can compare to how it looks today.
These supernovae are often called “standard candles” for their consistency, but they’re more accurately “standardizable candles,” because in practice their precision and accuracy can be improved still further by accounting for small differences between explosions: observing how long each explosion takes to unfold and how much the supernova’s light is reddened by dust between it and us. Finding a way to make these corrections robustly was what led to the discovery of the accelerating universe.
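The brightness-to-distance step rests on the standard distance-modulus relation. A minimal sketch (the Type Ia absolute magnitude of about -19.3 after standardization is a typical textbook value, not a number from this article):

```python
def luminosity_distance_pc(m_apparent, M_absolute):
    """Distance in parsecs from the distance modulus: m - M = 5 * log10(d / 10 pc)."""
    return 10 ** ((m_apparent - M_absolute + 5) / 5)

# An object whose apparent magnitude equals its absolute magnitude sits at 10 pc:
assert luminosity_distance_pc(-19.3, -19.3) == 10.0

# A standardized Type Ia (M ~ -19.3) observed at apparent magnitude 24 lies at
# roughly 4.6 billion parsecs -- the billions-of-light-years scale in the text.
d = luminosity_distance_pc(24.0, -19.3)
assert 4e9 < d < 5e9
```

The 1998 discovery amounted to finding that distant supernovae computed this way were slightly farther (fainter) than a non-accelerating universe predicts.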
The recent paper that has generated headlines used a catalog of Type Ia supernovae collected by the community (including us) which has been analyzed numerous times before. But the authors used a different method of implementing the corrections—and we believe this undercuts the accuracy of their results. They assume that the mean properties of supernovae from each of the samples used to measure the expansion history are the same, even though they have been shown to be different and past analyses have accounted for these differences. However, even ignoring these differences, the authors still find that there is roughly a 99.7 percent chance that the universe is accelerating—very different from what the headlines suggest.
Furthermore, the overwhelming confidence astronomers have that the universe is expanding faster now than it was billions of years ago is based on much more than just supernova measurements. These include tiny fluctuations in the pattern of relic heat left over from the Big Bang (the cosmic microwave background) and the modern-day imprint of those fluctuations in the distribution of galaxies around us (called baryon acoustic oscillations). The present study also ignores the presence of a substantial amount of matter in the universe, confirmed numerous times and in numerous ways since the 1970s, which further weakens its conclusions. These other data show the universe to be accelerating independently of the supernovae. If we combine the other observations with the supernova data, we go from 99.99 percent sure to 99.99999 percent sure. That’s pretty sure!
We now know that dark energy, which is what we believe causes the expansion of the universe to accelerate, makes up 70 percent of the universe, with matter constituting the rest. The nature of dark energy is still one of the largest mysteries of all of astrophysics. But there has been no active debate about whether dark energy exists and none about whether the universe is accelerating since this picture was cemented a decade ago.
There are now many new large surveys, both on the ground and in space, whose top priority over the next two decades is to figure out exactly what this dark energy could be. For now, we have to continue to improve our measurements and question our assumptions. While this recent paper does not disprove any theories, it is still good for everyone to pause for a second and remember how big the questions are that we are asking, how we reached the conclusions we have to date and how seriously we need to test each building block of our understanding.

Behind This Plant's Blue Leaves Lies a Weird Trick of Quantum Mechanics
In the fading twilight on the rainforest floor, a plant's leaves glimmer iridescent blue. And now scientists know why. These exotic blue leaves pull more energy out of dim light than ordinary leaves because of an odd trick of quantum mechanics.

A team of plant scientists led by Heather Whitney of the University of Bristol in the U.K. has just discovered the remarkable origin and purpose of the shiny cobalt leaves on the Malaysian tropical plant Begonia pavonina. The plant owes its glimmer to its peculiar machinery for photosynthesis, the process plants use to turn light into chemical energy. Strangely enough, these blue leaves can squeeze more energy out of the red-green light that reaches the eternally dim rainforest floor. Whitney and her colleagues describe the blue leaves today in the journal Nature Plants.

"It's actually quite brilliant. Plants have to cope with every obstacle that's thrown at them without running away. Here we see evidence of a plant that's actually evolved to physically manipulate the little light it receives," says Whitney. "It's quite amazing, and it was an absolutely surprising discovery."

Small entropy changes allow quantum measurements to be nearly reversed
In 1975, Swedish physicist Göran Lindblad developed a theorem that describes the change in entropy that occurs during a quantum measurement. Today, this theorem is a foundational component of quantum information theory, underlying such important concepts as the uncertainty principle, the second law of thermodynamics, and data transmission in quantum communication systems.

Now, 40 years later, physicist Mark M. Wilde, Assistant Professor at Louisiana State University, has improved this theorem in a way that allows for understanding how quantum measurements can be approximately reversed under certain circumstances. The new results allow for understanding how quantum information that has been lost during a measurement can be nearly recovered, which has potential implications for a variety of quantum technologies.
Quantum relative entropy never increases
Most people are familiar with entropy as a measure of disorder and the law that "entropy never decreases"—it either increases or stays the same during a thermodynamic process, according to the second law of thermodynamics. However, here the focus is on "quantum relative entropy," which in some sense is the negative of entropy, so the reverse is true: quantum relative entropy never increases, but instead only decreases or stays the same.
In fact, this was the entropy inequality theorem that Lindblad proved in 1975: that the quantum relative entropy cannot increase after a measurement. In this context, quantum relative entropy is interpreted as a measure of how well one can distinguish between two quantum states, so it's this distinguishability that can never increase. (Wilde describes a proof of Lindblad's result in greater detail in his textbook Quantum Information Theory, published by Cambridge University Press.)
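Lindblad's inequality can be checked numerically on a toy example. This is an illustrative sketch only: the qubit states and the depolarizing channel are arbitrary choices for demonstration, not anything from Wilde's paper.

```python
import numpy as np

def mlog(A):
    # Matrix logarithm of a positive-definite Hermitian matrix via eigendecomposition
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.log(w)) @ V.conj().T

def rel_entropy(rho, sigma):
    # Quantum relative entropy D(rho || sigma) = Tr[rho (log rho - log sigma)]
    return float(np.real(np.trace(rho @ (mlog(rho) - mlog(sigma)))))

def depolarize(rho, p):
    # A simple quantum channel: mix the state with the maximally mixed state
    d = rho.shape[0]
    return (1 - p) * rho + p * np.eye(d) / d

rho = np.array([[0.8, 0.1], [0.1, 0.2]])   # an arbitrary qubit state
sigma = np.eye(2) / 2                      # the maximally mixed state
before = rel_entropy(rho, sigma)
after = rel_entropy(depolarize(rho, 0.3), depolarize(sigma, 0.3))
# Lindblad (1975): distinguishability never increases under a quantum evolution
assert 0 <= after <= before
```

The channel blurs both states toward the same maximally mixed state, so they become harder to tell apart, which is exactly the drop in relative entropy the theorem guarantees.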
One thing that Lindblad's proof doesn't address, however, is whether it makes any difference if the quantum relative entropy decreases by a little or by a lot after a measurement.
In the new paper, Wilde has shown that, if the quantum relative entropy decreases by only a little, then the quantum measurement (or any other type of so-called "quantum physical evolution") can be approximately reversed.
"When looking at Lindblad's entropy inequality, a natural question is to wonder what we could say if the quantum relative entropy goes down only by a little when the quantum physical evolution is applied," Wilde said. "It is quite reasonable to suspect that we might be able to approximately reverse the evolution. This was arguably open since the work of Lindblad in 1975, addressed in an important way by Denes Petz in the late 1980s (for the case in which the quantum relative entropy stays the same under the action of the evolution), and finally formulated as a conjecture around 2008 by Andreas Winter. What my work did was to prove this result as a theorem: if the quantum relative entropy goes down only by a little under a quantum physical evolution, then we can approximately reverse its action."

Wilde's improvements to Lindblad's theorem have a variety of implications, but the main one that Wilde discusses in his paper is how the new results allow for recovering quantum information.
"If the decrease in quantum relative entropy between two quantum states after a quantum physical evolution is relatively small," he said, "then it is possible to perform a recovery operation, such that one can perfectly recover one state while approximately recovering the other. This can be interpreted as quantifying how well one can reverse a quantum physical evolution." So the smaller the relative entropy decrease, the better the reversal process.
The ability to recover quantum information could prove useful for quantum error correction, which aims to protect quantum information from damaging external effects. Wilde plans to address this application more in the future with his colleagues.
As Wilde explained, Lindblad's original theorem can also be used to prove the uncertainty principle of quantum mechanics in terms of entropies, as well as the second law of thermodynamics for quantum systems, so the new results have implications in these areas, as well.
"Lindblad's entropy inequality underlies many limiting statements, in some cases said to be physical laws or principles," Wilde said. "Examples are the uncertainty principle and the second law of thermodynamics. Another example is that this entropy inequality is the core step in determining limitations on how much data we can communicate over quantum communication channels. We could go as far as to say that the above entropy inequality constitutes a fundamental law of quantum information theory, which is a direct mathematical consequence of the postulates of quantum mechanics."
Regarding the uncertainty principle, Wilde and two coauthors, Mario Berta and Stephanie Wehner, discuss this angle in a forthcoming paper. They explain that the uncertainty principle involves quantum measurements, which are a type of quantum physical evolution and therefore subject to Lindblad's theorem. In one formulation of the uncertainty principle, two experiments are performed on different copies of the same quantum state, with both experimental outcomes having some uncertainty.
"The uncertainty principle is the statement that you cannot generally make the uncertainties of both experiments arbitrarily small, i.e., there is generally a limitation," Wilde said. "It is now known that a statement of the uncertainty principle in terms of entropies can be proved by using the 'decrease of quantum relative entropy inequality.' So what the new theorem allows for doing is relating the uncertainties of the measurement outcomes to how well we could try to reverse the action of one of the measurements. That is, there is now a single mathematical inequality which captures all of these notions."
In terms of the second law of thermodynamics, Wilde explains how the new results have implications for reversing thermodynamic processes in both classical and quantum systems.
"The new theorem allows for quantifying how well we can approximately reverse a thermodynamic transition from one state to another without using any energy at all," he said.
He explained that this is possible due to the connection between entropy, energy, and work. According to the second law of thermodynamics, a thermodynamic transition from one quantum state to another is allowed only if the free energy decreases from the original state to the final state. During this process, one can gain work and store energy. This law can be rewritten as a statement involving relative entropies and can be proved as a consequence of the decrease of quantum relative entropy.
"What my new work with Stephanie Wehner and Mischa Woods allows for is a refinement of this statement," Wilde said. "We can say that if the free energy does not go down by very much under a thermodynamic transition (i.e., if there is not too much work gained in the process), then it is possible to go back approximately to the original state from the final state, without investing any work at all. The key word here is that you can go back only approximately, so we are not in violation of the second law, only providing a refinement of it."
In addition to these implications, the new theorem can also be applied to other research topics in quantum information theory, including the Holevo bound, quantum discord, and multipartite information measures.
Wilde's work was funded in part by The DARPA Quiness program (ending now), which focused on quantum key distribution, or using quantum mechanics to ensure secret communication between two parties. He describes more about this application, in particular how Alice and Bob might use a quantum state to share secrets that can be kept private from an eavesdropper Eve (and help them survive being attacked by a bear), in a recent blog post.

Did the Mysterious 'Planet Nine' Tilt the Solar System?
The putative "Planet Nine" may have tilted the entire solar system, researchers say.

In January, astronomers revealed evidence for the potential existence of another planet in the solar system. Researchers suggest that if this world — dubbed Planet Nine — exists, it could be about 10 times Earth's mass and orbit the sun at a distance about 500 times the distance from the Earth to the sun.

Previous research suggested that Planet Nine would possess a highly tilted orbit compared with the relatively thin, flat zone in which the eight official planets circle the sun. This led scientists to investigate whether Planet Nine's slant might help explain other tilting seen elsewhere in the solar system.
Now, researchers suggest that Planet Nine's influence might have tilted the entire solar system except the sun.

"Planet Nine may have tilted the other planets over the lifetime of the solar system," said study lead author Elizabeth Bailey, an astrophysicist and planetary scientist at the California Institute of Technology in Pasadena.

Prior work found that the zone in which the eight major planets orbit the sun is tilted by about 6 degrees compared to the sun's equator. This discrepancy has long been a mystery in astronomy.
Bailey and her colleagues ran computer simulations that suggest that the tilt of the eight official planets can be explained by the gravitational influence of Planet Nine "over the 4.5-billion-years-ish lifetime of the solar system," Bailey said.

Bailey did note that there are other potential explanations for the tilt of the solar system. One alternative is that electrically charged particles influenced by the young sun's magnetic field could have interacted with the disk of gas and dust that gave rise to the planets in ways that tilted the solar system. Another possibility is that there might have been an imbalance in the mass of the nascent sun's core.

"However, all these other ways to explain why the solar system is tilted are really hard to test — they all invoke processes that were possibly present really early in the solar system," Bailey said. "Planet Nine is the first thing that has been proposed to tilt the solar system that doesn't depend on early conditions, so if we find Planet Nine, we will be able to see if it's the only thing responsible for the tilt, or if anything else may have played a role."

The scientists detailed their findings yesterday (Oct. 18) at a joint meeting of the American Astronomical Society's Division for Planetary Sciences and European Planetary Science Congress in Pasadena, California.

Cosmological mystery solved by largest ever map of voids and superclusters
A team of astrophysicists at the University of Portsmouth have created the largest ever map of voids and superclusters in the universe, which helps solve a long-standing cosmological mystery. The map of the positions of cosmic voids – large empty spaces which contain relatively few galaxies – and superclusters – huge regions with many more galaxies than normal – can be used to measure the effect of dark energy 'stretching' the universe.

The results confirm the predictions of Einstein's theory of gravity.
Lead author Dr Seshadri Nadathur from the University's Institute of Cosmology and Gravitation said: "We used a new technique to make a very precise measurement of the effect that these structures have on photons from the cosmic microwave background (CMB) – light left over from shortly after the Big Bang – passing through them.

"Light from the CMB travels through such voids and superclusters on its way to us. According to Einstein's General Theory of Relativity, the stretching effect of dark energy causes a tiny change in the temperature of CMB light depending on where it came from. Photons of light travelling through voids should appear slightly colder than normal and those arriving from superclusters should appear slightly hotter. This is known as the integrated Sachs-Wolfe (ISW) effect.

"When this effect was studied by astronomers at the University of Hawai'i in 2008 using an older catalogue of voids and superclusters, the effect seemed to be five times bigger than predicted. This has been puzzling scientists for a long time, so we looked at it again with new data."
To create the map of voids and superclusters, the Portsmouth team used more than three-quarters of a million galaxies identified by the Sloan Digital Sky Survey. This gave them a catalogue of structures more than 300 times bigger than the one previously used.
The scientists then used large computer simulations of the universe to predict the size of the ISW effect. Because the effect is so small, the team had to develop a powerful new statistical technique to be able to measure the CMB data.

They applied this technique to CMB data from the Planck satellite, and were able to make a very precise measurement of the ISW effect of the voids and superclusters. Unlike in the previous work, they found that the new result agreed extremely well with predictions using Einstein's gravity.
Dr Nadathur said: "Our results resolve one long-standing cosmological puzzle, but doing so has deepened the mystery of a very unusual 'Cold Spot' in the CMB.
"It has been suggested that the Cold Spot could be due to the ISW effect of a gigantic 'supervoid' which has been seen in that region of the sky. But if Einstein's gravity is correct, the supervoid isn't big enough to explain the Cold Spot.
"It was thought that there was some exotic gravitational effect contradicting Einstein which would simultaneously explain both the Cold Spot and the unusual ISW results from Hawai'i. But this possibility has been set aside by our new measurement – and so the Cold Spot mystery remains unexplained."

The Universe Has 10 Times More Galaxies Than Scientists Thought
More than a trillion galaxies are lurking in the depths of space, a new census of galaxies in the observable universe has found — 10 times more galaxies than were previously thought to exist.

An international team of astronomers used deep-space images and other data from the Hubble Space Telescope to create a 3D map of the known universe, which contains about 100 to 200 billion galaxies. In particular, they relied on Hubble's Deep Field images, which revealed the most distant galaxies ever seen with a telescope. [Video: Our Universe Has Trillions of Galaxies, Hubble Study]

Then, the researchers incorporated new mathematical models to calculate where other galaxies that have not yet been imaged by a telescope might exist. For the numbers to add up, the universe needs at least 10 times more galaxies than those already known to exist. But these unknown galaxies are likely either too faint or too far away to be seen with today's telescopes.
"It boggles the mind that over 90 percent of the galaxies in the universe have yet to be studied," Christopher Conselice, a professor of astrophysics at the University of Nottingham in the U.K., who led the study, said in a statement. "Who knows what interesting properties we will find when we observe these galaxies with the next generation of telescopes.”

Looking far out into deep space also means looking back in time, because light takes a long time to travel across cosmic distances. During the study, Conselice and his team looked at parts of the universe up to 13 billion light-years away. Looking this far allowed the researchers to see partial snapshots of the evolution of the universe from 13 billion years ago, roughly 700 million years after the Big Bang.

They discovered that the early universe contained even more galaxies than it does today. Those distant galaxies were small and faint dwarf galaxies, they found. As the universe evolves, such galaxies merge together to form larger galaxies.
In a separate statement, Conselice said that the results are "very surprising as we know that, over the 13.7 billion years of cosmic evolution since the Big Bang, galaxies have been growing through star formation and mergers with other galaxies. Finding more galaxies in the past implies that significant evolution must have occurred to reduce their number through extensive merging of systems."

The results of the study are detailed in The Astrophysical Journal.

Correlation between galaxy rotation and visible matter puzzles astronomers
A new study of the rotational velocities of stars in galaxies has revealed a strong correlation between the motion of the stars and the amount of visible mass in the galaxies. This result comes as a surprise because it is not predicted by conventional models of dark matter.
Stars on the outskirts of rotating galaxies orbit just as fast as those nearer the centre. This appears to be in violation of Newton's laws, which predict that these outer stars would be flung away from their galaxies. The extra gravitational glue provided by dark matter is the conventional explanation for why these galaxies stay together. Today, our most cherished models of galaxy formation and cosmology rely entirely on the presence of dark matter, even though the substance has never been detected directly.
These new findings, from Stacy McGaugh and Federico Lelli of Case Western Reserve University, and James Schombert of the University of Oregon, threaten to shake things up. They measured the gravitational acceleration of stars in 153 galaxies with varying sizes, rotations and brightness, and found that the measured accelerations can be expressed as a relatively simple function of the visible matter within the galaxies. Such a correlation does not emerge from conventional dark-matter models.
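The "relatively simple function" is a radial acceleration relation with a single acceleration scale. A sketch of the form reported by McGaugh, Lelli and Schombert (the scale g† ≈ 1.2 × 10⁻¹⁰ m/s² is their fitted value; treat the exact functional form here as an assumption for illustration):

```python
import math

G_DAGGER = 1.2e-10  # m/s^2, the fitted acceleration scale (assumed value)

def g_observed(g_baryonic):
    """Observed acceleration as a function of the visible-matter acceleration."""
    return g_baryonic / (1.0 - math.exp(-math.sqrt(g_baryonic / G_DAGGER)))

# High accelerations (inner galaxy): Newtonian behaviour, g_obs ~ g_bar
assert abs(g_observed(1e-8) / 1e-8 - 1.0) < 0.01

# Low accelerations (galactic outskirts): g_obs ~ sqrt(g_bar * g_dagger),
# far more than the visible matter alone would give -- flat rotation curves
assert g_observed(1e-13) > 10 * 1e-13
```

The puzzle for dark-matter models is that a single one-parameter function of the visible mass, like this one, fits 153 very different galaxies.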

The Spooky Secret Behind Artificial Intelligence's Incredible Power
Spookily powerful artificial intelligence (AI) systems may work so well because their structure exploits the fundamental laws of the universe, new research suggests.

The new findings may help answer a longstanding mystery about a class of artificial intelligence programs that employ a strategy called deep learning. These deep learning or deep neural network programs, as they're called, are algorithms that have many layers in which lower-level calculations feed into higher ones. Deep neural networks often perform astonishingly well at solving problems as complex as beating the world's best player of the strategy board game Go or classifying cat photos, yet no one fully understands why.

It turns out, one reason may be that they are tapping into the very special properties of the physical world, said Max Tegmark, a physicist at the Massachusetts Institute of Technology (MIT) and a co-author of the new research.
The laws of physics present only this "very special class of problems" — the problems that AI shines at solving, Tegmark told Live Science. "This tiny fraction of the problems that physics makes us care about and the tiny fraction of problems that neural networks can solve are more or less the same," he said. [Super-Intelligent Machines: 7 Robotic Futures]

Deep learning

Last year, AI accomplished a task many people thought impossible: AlphaGo, Google DeepMind's deep-learning system, defeated the world's best Go player after trouncing the European Go champion. The feat stunned the world because the number of potential Go moves exceeds the number of atoms in the universe, and past Go-playing programs performed only as well as a mediocre human player.

But even more astonishing than AlphaGo's utter rout of its opponents was how it accomplished the task.

"The big mystery behind neural networks is why they work so well," said study co-author Henry Lin, a physicist at Harvard University. "Almost every problem we throw at them, they crack."

For instance, AlphaGo was not explicitly taught Go strategy and was not trained to recognize classic sequences of moves. Instead, it simply "watched" millions of games, and then played many, many more against itself and other players.

Like newborn babies, these deep-learning algorithms start out "clueless," yet typically outperform other AI algorithms that are given some of the rules of the game in advance, Tegmark said.

Another long-held mystery is why these deep networks are so much better than so-called shallow ones, which contain as few as one layer, Tegmark said. Deep networks have a hierarchy and look a bit like connections between neurons in the brain, with lower-level data from many neurons feeding into another, "higher" group of neurons, repeated over many layers. In a similar way, deeper layers of these neural networks make some calculations and then feed those results to a higher layer of the program, and so on, he said.
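That layered, feed-forward structure is easy to sketch. The following is a generic toy network in NumPy, not any specific system from the article; the sizes and random weights are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w, b):
    # One layer: a linear map followed by a simple nonlinearity
    return np.tanh(x @ w + b)

# A "deep" network is just repeated composition: each layer's output
# feeds the next, "higher" layer, exactly as described above.
x = rng.normal(size=(4, 8))                                        # batch of 4 inputs
params = [(rng.normal(size=(8, 8)) * 0.5, np.zeros(8)) for _ in range(3)]
for w, b in params:
    x = layer(x, w, b)

assert x.shape == (4, 8)
assert np.all(np.abs(x) <= 1.0)  # tanh keeps activations bounded
```

The open question Tegmark and Lin study is why stacking such layers captures real-world data so much more efficiently than a single wide layer would.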

Magical keys or magical locks?

To understand why this process works, Tegmark and Lin decided to flip the question on its head.

"Suppose somebody gave you a key. Every lock you try, it seems to open. One might assume that the key has some magic properties. But another possibility is that all the locks are magical. In the case of neural nets, I suspect it's a bit of both," Lin said.

One possibility could be that the "real world" problems have special properties because the real world is very special, Tegmark said.

Take one of the biggest neural-network mysteries: These networks often take what seem to be computationally hairy problems, like the Go game, and somehow find solutions using far fewer calculations than expected.

It turns out that the math employed by neural networks is simplified thanks to a few special properties of the universe. The first is that the equations that govern many laws of physics, from quantum mechanics to gravity to special relativity, are essentially simple math problems, Tegmark said. The equations involve variables raised to a low power (for instance, 4 or less). [The 11 Most Beautiful Equations]

What's more, objects in the universe are governed by locality, meaning they are limited by the speed of light. Practically speaking, that means neighboring objects in the universe are more likely to influence each other than things that are far from each other, Tegmark said.

Many things in the universe also obey what's called a normal or Gaussian distribution. This is the classic "bell curve" that governs everything from traits such as human height to the speed of gas molecules zooming around in the atmosphere.

Finally, symmetry is woven into the fabric of physics. Think of the veiny pattern on a leaf, or the two arms, eyes and ears of the average human. At the galactic scale, if one travels a light-year to the left or right, or waits a year, the laws of physics are the same, Tegmark said.

Tougher problems to crack

All of these special traits of the universe mean that the problems facing neural networks are actually special math problems that can be radically simplified.

"If you look at the class of data sets that we actually come across in nature, they're way simpler than the sort of worst-case scenario you might imagine," Tegmark said.

There are also problems that would be much tougher for neural networks to crack, including encryption schemes that secure information on the web; such schemes just look like random noise.

"If you feed that into a neural network, it's going to fail just as badly as I am; it's not going to find any patterns," Tegmark said.

While the subatomic laws of nature are simple, the equations describing the flight of a bumblebee are incredibly complicated, whereas those governing gas molecules remain simple, Lin added. It's not yet clear whether deep learning will perform just as well describing those complicated bumblebee flights as it does describing gas molecules, he said.

"The point is that some 'emergent' laws of physics, like those governing an ideal gas, remain quite simple, whereas some become quite complicated. So there is a lot of additional work that needs to be done if one is going to answer in detail why deep learning works so well," Lin said. "I think the paper raises a lot more questions than it answers!"

Science of Disbelief: When Did Climate Change Become All About Politics?
Barely over a quarter of Americans know that almost all climate scientists agree that climate change is happening and that humans are to blame, a new Pew Research Center survey finds.

The survey also reveals a strong split between political liberals and political conservatives on the issue. While 55 percent of liberal Democrats say climate scientists are trustworthy, only 15 percent of conservative Republicans say the same.

The findings are in line with the results of other surveys of the politics of climate change, said Anthony Leiserowitz, director of the Yale Program on Climate Change Communication. Leiserowitz was not involved in the Pew study, but he and his colleagues conduct their own surveys on climate attitudes.

"The overwhelming finding that they see here is that there's a strong partisan divide on climate change, and that is a pattern we first saw emerge in 1997," Leiserowitz told Live Science.

The partisan gap isn't necessarily set in stone, however, Leiserowitz said. It's actually been narrowing recently — but it remains to be seen how the result of this year's presidential election may affect the divide.

Prior to 1997, the two major parties held similar beliefs on the occurrence of human-caused climate change, Leiserowitz said. Right around that time, then-President Bill Clinton and then-Vice President Al Gore took on the issue and pushed for the Kyoto Protocol, an international climate treaty meant to reduce greenhouse gas emissions.

"That's the moment when they come back and say, 'This is a global problem, and the U.S. needs to be part of the solution,' that the two parties begin to diverge," Leiserowitz said.

Since then, the American public's belief that climate change is real has fluctuated. Belief that climate change exists and that it's human-caused began to rise around 2004 and hit a peak around 2007, driven by media coverage of California's climate initiatives under Republican Gov. Arnold Schwarzenegger and the Hollywood film "The Day After Tomorrow," released in 2004. (Really: Leiserowitz's research found that Americans who saw the blockbuster were moved to think climate change is a problem. Al Gore's film "An Inconvenient Truth" was released in 2006 but was seen by far fewer people, Leiserowitz said.)

These numbers waned during the 2008 recession, when the media abruptly stopped talking about climate change and the conservative tea-party wing of the Republican Party gained more power, Leiserowitz said. Belief in man-made climate change bottomed out in 2010 and 2011 but has been creeping upward since then, he said. [6 Unexpected Effects of Climate Change]

"That uptick is not coming from Democrats," he said. "Democrats have not really changed much at all. Independents — their belief that global warming is happening — has increased. But the real shift is happening among Republicans, and most interesting, the biggest shift — 19 percentage points — is among conservative Republicans."

But because the percentage of conservative Americans who believed in man-made climate change was so small to begin with, even with those increases the overall proportion remains small. The new Pew survey, conducted between May 10 and June 6, 2016, found that 48 percent of Americans overall believe that the Earth is warming mostly because of human activity. Seventy-nine percent of liberal Democrats held that belief, compared with 63 percent of moderate Democrats, 34 percent of moderate Republicans and 15 percent of conservative Republicans.

Climate scientists have the trust of far more people on the left side of the political spectrum than the right. Only 9 percent of conservative Republicans believe that climate scientists' findings are usually influenced by the best available evidence, compared with 55 percent of liberal Democrats. Only 7 percent of conservative Republicans and 15 percent of moderate Republicans think climate scientists are motivated by concern for the public's best interest, compared with 31 percent of moderate Democrats and 41 percent of liberal Democrats.

Still, up until last spring, the trends were "moving in a more science-aligned direction," Leiserowitz said. Even members of the Republican establishment had been willing to discuss climate change as a problem, Leiserowitz said, citing former presidential candidate John McCain, who had sponsored and supported climate legislation in the U.S. Senate.

"Then, along comes Donald Trump, and he basically flips over all the card tables," Leiserowitz said. The candidate has called climate change a hoax on multiple occasions and once tweeted that "the concept of global warming was created by and for the Chinese in order to make U.S. manufacturing non-competitive." Trump has also been consistent in calling for less regulation of fossil fuel emissions. [Election Day 2016: A Guide to the When, Why, What and How]

"It's not clear where he has taken the Republican base," Leiserowitz said. The outcome of the election alone won't be enough to determine what kind of collateral damage climate opinion will accrue. Should Trump lose, Leiserowitz said, the Republican Party will have to decide whether to move even more rightward or whether to take a more centrist tack.

However, Americans' views aren't quite as extreme as the political class would make it seem, Leiserowitz said. Yale's surveys found that about 17 percent of Americans are alarmed about climate change and 10 percent are entirely dismissive. About 63 percent fall in between, believing in, and worrying about, climate change to differing degrees.

"Most Americans are actually in the middle, and more of those people in the middle are leaning pretty well toward the scientific consensus," Leiserowitz said.

Eyeballing Proxima b: Probably Not a Second Earth
In our profound quest to discover strange new worlds, we've inevitably been trying to find alien planets that possess any Earth-like similarities. Now, with the incredible find of an Earth-mass exoplanet orbiting a neighboring star at just the right distance for liquid water to persist on its surface, hopes are high that we may have discovered an "Earth 2.0" right on our galactic doorstep.

But in our rush to assign terrestrial likeness to this small exoplanet, we often forget that just because it's in the right place and is (apparently) the right mass, that doesn't mean it resembles Earth. And even if it does possess water, it could still be a very strange world indeed.

In a new study headed by scientists at the French National Center for Scientific Research (CNRS) and Cornell University, computer simulations have been run to figure out the possible characteristics of the small rocky world discovered orbiting the red dwarf star Proxima Centauri. Located only 4.2 light-years from Earth, Proxima b was discovered in August, to much excitement, by astronomers of the Pale Red Dot campaign using the ESO's La Silla observatory in Chile.

By measuring the slight wobbles of Proxima Centauri, astronomers were able not only to estimate the exoplanet's mass but also to calculate its orbital period. With this information, the researchers realized that the world orbits the red dwarf within the star's "habitable zone": the range of distances at which an orbiting planet is neither too hot nor too cold for liquid water to persist on its surface.

The implications are clear: on Earth, where there's liquid water, there's life. If there's liquid water on Proxima b, perhaps there's life there too. And, far enough into the future, we might one day become an interstellar species and set up home there.

But it's worth remembering that we currently have very little information about Proxima b. We know that it has an orbital period of a little over 11 days (yes, a "year" on Proxima b is only 11 days). We know it orbits within the star's habitable zone. We also know its approximate mass. However, we don't know whether it has an atmosphere, and we don't know Proxima b's physical size. Without its physical size, we can't calculate its average density, so there's ambiguity about what materials it contains. So, in an effort to confront this ambiguity, the researchers ran simulations of a 1.3 Earth-mass world (the approximate mass of Proxima b) in orbit around a red dwarf star to see what form it might take.

Assuming the rocky world has the smallest physical size allowed for its mass (94 percent of Earth's diameter), planetary formation models suggest it would consist of a metal core making up 65 percent of the planet's mass. The outer layers would consist of a rocky mantle and very little water (if any). In this scenario, Proxima b would be a rocky, barren and dry world, resembling a massive Mercury. Last time we checked in on Mercury, it didn't appear very "habitable."

But this is just one possibility. The researchers then shifted the scale to the other extreme. What would happen if the physical size of the planet was pushed to the maximum? Well, the mass of Proxima b could support a world that is 40 percent bigger than Earth. Now things get interesting.

In this scenario, Proxima b would be a lot less dense, meaning there would be less rock and metal. A huge proportion of the planet's mass would consist of water. In fact, 50 percent of the entire planet's mass would be water. This would be a "water world" in the strongest possible sense.
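
The two end-member scenarios can be checked with a back-of-the-envelope density calculation. The masses and radii below are the approximate figures quoted above; the Earth constants are round textbook values:

```python
import math

# Bulk density for the two end-member scenarios of a 1.3 Earth-mass planet.
M_EARTH = 5.97e24   # kg (round value)
R_EARTH = 6.371e6   # m  (round value)

def bulk_density(mass_earths, radius_earths):
    """Average density in kg/m^3 for a sphere of the given mass and radius."""
    m = mass_earths * M_EARTH
    r = radius_earths * R_EARTH
    return m / ((4.0 / 3.0) * math.pi * r**3)

dense = bulk_density(1.3, 0.94)  # the "massive Mercury" scenario
puffy = bulk_density(1.3, 1.40)  # the "water world" scenario
earth = bulk_density(1.0, 1.00)  # Earth, for comparison (~5,500 kg/m^3)
```

The small-radius scenario comes out considerably denser than Earth (consistent with an oversized metal core), while the large-radius scenario comes out much less dense (consistent with a huge water fraction).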

Somewhere between these two scenarios — either a dense and barren rock or bloated water world — is the highly sought-after "Earth 2.0"; basically a world with a small metal core, rocky mantle and plentiful oceans flooding the surface. It's this exoplanetary compromise that you regularly see in artistic impressions of Proxima b, the temperate alien world that looks like Earth:

Alas, this version of Proxima b is just one possibility over a huge range of scenarios. So, yeah, from this study alone, Proxima b is probably not very Earth-like. But wait, there's more.

Just because a planet orbits its star in the habitable zone, it doesn't mean it has the same life-giving qualities as Earth (keep in mind that both Mars and Venus also orbit the sun within our solar system's habitable zone).

Proxima b orbits very close to its star. It's the nature of the beast; red dwarf stars are small and therefore cooler than sun-like stars. Proxima Centauri's habitable zone is therefore one hell of a lot more compact than our sun's. The Proxima Centauri habitable zone is well within the orbit of Mercury. If a planet got that close to our hot sun, it would be burnt to a crisp; for a planet in orbit around Proxima Centauri, this location is an oasis.
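
The compactness of that habitable zone follows from a standard scaling: a planet receives Earth-like warmth at roughly sqrt(L / L_sun) astronomical units from its star. The luminosity used below for Proxima Centauri (~0.0017 L_sun) is an assumed round value, not a figure from this article:

```python
import math

# Habitable-zone distance scales as the square root of stellar luminosity:
# a planet at d = sqrt(L / L_sun) AU receives roughly Earth-like irradiation.
def hz_distance_au(luminosity_solar):
    return math.sqrt(luminosity_solar)

d_proxima = hz_distance_au(0.0017)  # Proxima Centauri, assumed ~0.0017 L_sun
mercury_orbit_au = 0.39             # Mercury's average distance from the sun
```

The result, roughly 0.04 AU, sits well inside Mercury's orbit, matching the point above.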

But when a planet orbits so close to a red dwarf, it starts to succumb to some tidal difficulties. The planet's spin comes to match its orbital period, so one face constantly points at the star. One hemisphere of the planet is in constant light while the other hemisphere is in constant darkness — a situation called "tidal locking."

So, in this case, let's imagine the orbiting exoplanet really is a textbook "Earth-like" world with just the right composition. A world with an iron core, rocky mantle and enough water on the surface to create liquid water oceans that could support life. But this world is tidally locked with its star — that's got to cause some problems, right?

Let's assume that this planet somehow possesses an atmosphere (more on that later). Having one hemisphere constantly heated while the other is constantly frozen certainly doesn't sound like a good time. Many simulations have been run in an attempt to model the complexities of the atmospheric conditions in this situation, and most outcomes aren't good. Some scenarios predict planet-wide hurricanes that act like a blast oven; others predict a dry wasteland on the star-facing hemisphere and a frozen-solid dark hemisphere.

There are, however, some planetary models that could save the day for these unfortunate wannabe "second Earths." One fun prediction is the possible existence of "Eyeball Earths." These peculiar planets would still be tidally locked to their star, with one hemisphere a constantly baked desert and the other hemisphere in deep freeze, but there would be a region between day and night where the conditions are just right for a liquid water ocean to circle the world between the darkness and light. Oh, and it would look like an eyeball, seriously:

Other research on the atmospheric dynamics of tidally locked exoplanets suggests a world could have efficient "air conditioning": hot air from the day-side hemisphere is distributed around the planet in a way that balances global temperatures. But this assumes a high degree of friction between the lower atmosphere and a craggy, rocky surface, plus efficient high-altitude air flow.

But the ultimate kicker when considering "Earth-like" exoplanets around red dwarf stars is that just because red dwarfs are small, it doesn't mean they are docile. In fact, red dwarf stars can be downright violent, frequently erupting with powerful flares, flooding any nearby planets with ionizing radiation. This radiation, plus inevitably powerful stellar winds, would likely blow any atmosphere away from our hypothetical burgeoning Earth 2.0. Without an atmosphere, the only vaguely habitable location on that planet would be under the surface, perhaps in a sub-surface ocean protected by an icy crust like Jupiter's moon Europa.

But if these planets, like Earth, have a powerful global magnetosphere, perhaps the worst of the stellar storms could be deflected and an atmosphere could form. Who knows?

Though there are many challenges facing our search for "Earth 2.0," we are only just beginning our quest to seek out alien worlds orbiting other stars. Yes, it is an incredible stroke of luck to find a small world orbiting a neighboring star, but as red dwarfs are the most populous type of star in our galaxy, the odds are that a handful may well have just the right ingredients to support a habitable atmosphere. But is Proxima b one of those diamonds in the rough?

For now, with the tools at our disposal, we simply do not know. NASA's James Webb Space Telescope, launching in 2018, might be able to tease out the spectroscopic fingerprint of an atmosphere, though even that could prove beyond its capabilities. We might just have to send an interstellar probe there to find out whether Proxima b is really the habitable exoplanet everyone hopes it will be.

Does the Drake Equation Confirm There Is Intelligent Alien Life in the Galaxy?
The Drake Equation, written by astrophysicist Frank Drake in 1961, is a probabilistic formula for estimating the number of intelligent, technological civilizations that should exist in the Milky Way, and by extension, the universe. It is the foundation for a number of statistical models suggesting that intelligent alien life should be widespread throughout the galaxy. In 1961, Drake's original estimate for the number of intelligent civilizations in our galaxy was between 20 and 50,000,000. As a new episode of PBS's Space Time points out, we have significantly refined our estimates of the number of potentially habitable planets in the Milky Way thanks to the Kepler planet-hunting mission. (We think there are around 40 billion rocky planets orbiting within the habitable zones of their parent stars.)

What we still struggle with is pinning down the probability that life will spring from organic compounds, a process known as abiogenesis, and the probability that basic microbial life will eventually evolve into an intelligent species. To help dial in this estimate a bit more, astrophysicists Adam Frank and Woodruff Sullivan asked how small the intelligent life probability would need to be if we are in fact the only technologically advanced species in the entire universe.
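
For readers who want to experiment, the Drake Equation is simply a product of seven factors, N = R* x fp x ne x fl x fi x fc x L. The parameter values in this sketch are illustrative assumptions in the spirit of Drake's original guesses, not measurements:

```python
# A minimal sketch of the Drake equation. Every input below is an
# illustrative assumption; none of these values is settled science.
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Expected number of detectable technological civilizations in the galaxy."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

n = drake(r_star=1.0,       # star formation rate, stars per year
          f_p=0.5,          # fraction of stars with planets
          n_e=2.0,          # habitable planets per planetary system
          f_l=1.0,          # fraction of habitable planets developing life
          f_i=0.01,         # fraction of those developing intelligence
          f_c=0.01,         # fraction developing detectable technology
          lifetime=10_000)  # years a civilization remains detectable
```

With these particular guesses the product works out to exactly one civilization; nudging any factor up or down shows why historical estimates span many orders of magnitude.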

They concluded that if only one intelligent civilization ever existed in the history of the known universe (humans, and nothing else ever before), then the probability that a habitable planet produces intelligent life would have to be less than 1 in 400 billion trillion, or 2.5 x 10^-24—and, even if this were the case, there would still only be a 1 percent chance that no technological civilization ever existed other than humans. This is such a ludicrously small probability that astrophysicists are forced to conclude that we are not the only intelligent civilization to ever exist.

If we narrow the focus to just the Milky Way, the probability that a habitable planet produces an advanced civilization would have to be below about 1 in 60 billion for us to be the only such civilization ever to arise in the galaxy. Most people therefore conclude that there must be other intelligent civilizations in our galaxy, if not now then at some point in the past. But we have never detected any, and therein lies the Fermi paradox.
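
The arithmetic behind these uniqueness arguments can be sketched as follows. The function and numbers here are illustrative, not the authors' actual model:

```python
import math

# Hedged sketch of the uniqueness arithmetic.
# p = probability that a given habitable-zone planet ever produces a
# technological civilization; n = number of habitable-zone planets considered.
def prob_no_others(p, n):
    """P(no other civilization among n planets), computed stably via log1p."""
    return math.exp(n * math.log1p(-p))

# The article's threshold: "1 in 400 billion trillion" is 1 / 4e23 = 2.5e-24.
p_threshold = 1 / 400e21
```

For probabilities this tiny, (1 - p)^n is numerically fragile, which is why the sketch uses `log1p`; the qualitative point is that even an absurdly small per-planet probability, multiplied by enough planets, leaves a non-negligible chance of company.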

Check out the Space Time video above to learn more about the likelihood that we are not alone in our galaxy, and be sure to stay tuned for the bonus question at the end of the episode. If you get it correct, PBS will send you a Space Time t-shirt free of charge.

Scientists build world's smallest transistor
Silicon transistors have been getting smaller and smaller, packing more computing power into smaller dimensions all while using less energy. But silicon transistors can't get much smaller.

To keep the trend going, scientists have turned to silicon alternatives. Recently, a team of scientists set a new record for the world's smallest transistor using a pair of novel materials: carbon nanotubes and molybdenum disulfide, a member of a class of materials called transition metal dichalcogenides, or TMDs.

Molybdenum disulfide, or MoS2, is an engine lubricant that scientists believe has tremendous potential in the field of electronics. Like silicon, MoS2 boasts a crystalline lattice structure. But electrons don't move as easily through MoS2 as they do through silicon.

Transistors rely on a gate to control the flow of electricity through its terminals. But because silicon allows for such a free flow of electrons, the particles barge through the doors when the gate becomes too small.

"This means we can't turn off the transistors," Sujay Desai, a graduate student at the Department of Energy's Lawrence Berkeley National Laboratory, explained in a news release. "The electrons are out of control."

When electrons are out of control, transistors leak energy.

With MoS2, scientists were able to make the gate -- and the transistor -- much smaller without making it susceptible to gate-crashing electrons. In fact, Desai and his research partners built a transistor with a 1-nanometer gate. A single strand of human hair measures roughly 50,000 nanometers across.

While the feat is impressive, and the technology promising, researchers say there is much work to do.

"This work demonstrated the shortest transistor ever," said Ali Javey, a professor of electrical engineering and computer sciences at the University of California, Berkeley. "However, it's a proof of concept. We have not yet packed these transistors onto a chip, and we haven't done this billions of times over."

If the technology is going to make it in the electronics industry, researchers will need to find new ways to produce the materials at scale.

"Large-scale processing and manufacturing of TMD devices down to such small gate lengths will require future innovations," said Moon Kim, professor of materials science and engineering at the University of Texas, Dallas.

Still, researchers are hopeful the breakthrough will translate to smaller, more efficient computer chips and, ultimately, smaller, more efficient electronics.

"A cellphone with this technology built in would not have to be recharged as often," Kim said.

'Alien Megastructure' Star Keeps Getting Stranger
The more scientists learn about "Tabby's Star," the more mysterious the bizarre object gets.

Newly analyzed observations by NASA's planet-hunting Kepler space telescope show that the star KIC 8462852 — whose occasional, dramatic dips in brightness still have astronomers scratching their heads — has also dimmed overall during the last few years.

"The steady brightness change in KIC 8462852 is pretty astounding," study lead author Ben Montet, of the California Institute of Technology in Pasadena, said in a statement.

"Our highly accurate measurements over four years demonstrate that the star really is getting fainter with time," Montet added. "It is unprecedented for this type of star to slowly fade for years, and we don't see anything else like it in the Kepler data."

KIC 8462852 hit the headlines last September, when a team of astronomers led by Tabetha Boyajian of Yale University announced that the star had dimmed dramatically several times over the past few years — in one case, by a whopping 22 percent.

These brightness dips are too significant to be caused by an orbiting planet, so scientists began suggesting alternative explanations. Perhaps a planet or a family of orbiting comets broke up, for example, and the ensuing cloud of dust and fragments periodically blocks the star's light. Or maybe some unknown object in the depths of space between the star and Earth is causing the dimming.

The brightness dips are even consistent with a gigantic energy-collecting structure built by an intelligent civilization — though researchers have been keen to stress that this "alien megastructure" scenario is quite unlikely.

The weirdness increased in January 2016, when astronomer Bradley Schaefer of Louisiana State University reported that KIC 8462852 also seems to have dimmed overall by 14 percent between 1890 and 1989.

This conclusion is based on Schaefer's analysis of photographic plates of the night sky that managed to capture Tabby's Star, which lies about 1,500 light-years from Earth. Some other astronomers questioned this interpretation, however, suggesting that differences in the instruments used to photograph the sky over that time span may be responsible for the apparent long-term dimming.

So Montet and co-author Joshua Simon, of the Observatories of the Carnegie Institution of Washington, decided to scour the Kepler data for any hint of the trend Schaefer spotted. And they found more than just a hint.

Kepler observed KIC 8462852, along with about 150,000 other stars, from 2009 through 2013. During the first three years of that time span, KIC 8462852 got nearly 1 percent dimmer, Montet and Simon found. The star's brightness dropped by a surprising 2 percent over the next six months, and stayed level for the final six months of the observation period. (Kepler has since moved on to a new mission called K2, during which the telescope is hunting for exoplanets on a more limited basis and performing a variety of other observations.)
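
Converting those figures into rates makes the contrast plain. This is simple arithmetic on the percentages quoted above:

```python
# Average dimming rates for KIC 8462852 from the Kepler figures quoted above:
# ~1 percent over the first three years, then ~2 percent in six months.
slow_rate = 1.0 / 3.0  # percent per year during the slow phase
fast_rate = 2.0 / 0.5  # percent per year during the rapid six-month drop
speedup = fast_rate / slow_rate
```

The rapid phase dims the star about twelve times faster than the slow phase, which is why the two behaviors look so distinct in the data.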

"This star was already completely unique because of its sporadic dimming episodes," Simon said in the same statement. "But now we see that it has other features that are just as strange, both slowly dimming for almost three years and then suddenly getting fainter much more rapidly."

Montet and Simon said they don't know what's behind the weird behavior of Tabby's Star, but they hope their results, which have been accepted for publication in The Astrophysical Journal, help crack the case eventually.

"It's a big challenge to come up with a good explanation for a star doing three different things that have never been seen before," Montet said. "But these observations will provide an important clue to solving the mystery of KIC 8462852."

What's Out There? 'Star Men' Doc Tackles Life Questions Through Science
The documentary "Star Men," which has just begun to play in select theatres in the United States, uses the life stories of four prominent astronomers to take a compassionate look at aging, death and humanity's search for meaning.

Following a screening of "Star Men" at the California Institute of Technology (Caltech) in Pasadena last month, one of the film's subjects, astronomer Neville (Nick) Woolf, said that when the project began he thought it would be a science documentary set against the backdrop of the American Southwest.

Instead, he was surprised to see that the film is actually centered on the 50-year friendship among himself and three colleagues — Roger Griffin, Donald Lynden-Bell and Wallace (Wal) Sargent — who worked together at Caltech in the early 1960s.

Evidence for new form of matter-antimatter asymmetry observed
Like two siblings with divergent personalities, a type of particle has shown signs of behaving differently than its antimatter partner. It’s the first time evidence of matter-antimatter differences have been detected in decays of a baryon — a category of particle that includes protons and neutrons. Such matter-antimatter discrepancies are key to explaining how the universe came to be made mostly of matter, scientists believe.

The result is “the first measurement of its kind,” says theoretical physicist Yuval Grossman of Cornell University. “Wow, we can actually see something that we’ve never seen before.”

Evidence of matter-antimatter differences in decays of baryons — particles which are composed of three smaller particles known as quarks — has eluded scientists until now. Previous experiments have found differences between matter and antimatter varieties of mesons, which are made up of one quark and one antiquark, but never in baryons.

For most processes, the laws of physics would be the same if matter were swapped with antimatter and the universe’s directions were flipped, as if reflected in a mirror. But when this principle, known as CP symmetry (for “charge parity”), is violated, matter and antimatter act differently. Now, scientists have found hints of CP violation in the decays of a particle known as a lambda-b baryon.

Scientists with the LHCb experiment, located at the Large Hadron Collider near Geneva, reported the result online September 16. They found that when the lambda-b baryon decays, the particles produced by the decay speed away at different angles and momenta for matter and antimatter versions of the baryon. (LHCb scientists declined to comment for this article, citing the embargo policy of Nature Physics, the journal to which the paper was submitted.)

After the Big Bang, the universe initially held equal parts antimatter and matter. But as the universe evolved, the laws of physics favored matter through CP violation, and antimatter became a rarity. Scientists’ well-tested theory of particle physics, the standard model, includes some CP violation, but not enough to explain the current imbalance. So physicists are searching for additional sources of the discrepancy.

It’s not surprising that differences in matter and antimatter appeared in baryons as well as mesons, says theoretical physicist David London of the University of Montreal. But precise measurements of baryons might eventually reveal deviations from the predictions of the standard model. Such a result could point the way to additional asymmetry that allowed the universe as we know it to form. “It's just the first step, and hopefully there will be more such measurements,” says London.

Giant hidden Jupiters may explain lonely planet systems
Lonely planets can blame big, pushy bullies. Giant planets may bump off most of their smaller brethren, partly explaining why the Kepler space telescope has seen so many single-planet systems.

Of the thousands of planetary systems Kepler has discovered, about 80 per cent appear as single planets passing in front of their stars. The rest feature as many as seven planets – a distinction dubbed the Kepler dichotomy.

Recent studies suggest even starker differences. While multiple-planet systems tend to have circular orbits that all lie in the same plane – like our solar system – the orbits of singletons tend to be more elliptical and are often misaligned with the spins of their stars.

Now, a pair of computer simulations suggest that hidden giants may lurk in these single systems. We wouldn’t be able to see them; big, Jupiter-like planets in wide orbits would take too long for Kepler to catch, and they may not have orbits that cause them to pass in front of their stars in our line of sight. But if these unseen bullies are there, they may have removed many of the smaller planets in closer orbits, leaving behind the solitary worlds that Kepler sees.

The simulations show that gravitational interactions involving giants in outer orbits can eject smaller planets from the system, nudge them into their stars or send them crashing into each other.

Pushy planets
“There are bigger things out there trying to pull you around,” says Chelsea Huang at the University of Toronto, Canada. She and her team also showed the giants pull the few remaining inner planets into more elliptical and inclined orbits – the same kind seen in many of the single systems Kepler has spotted.

Alex Mustill at Lund Observatory in Sweden and his colleagues mimicked more general scenarios, including planets orbiting a binary star system, and got similar results. The studies complement each other, say Huang and Mustill.

“We know these configurations have to occur in some fraction of exoplanet systems,” Mustill says.

But that doesn’t mean they’re universal. “They don’t occur all the time, and this is one reason why you can’t explain the large number of single planets purely through this mechanism,” Mustill says. According to his analysis, bullying giants can only account for about 18 per cent of Kepler’s singles.

To confirm their proposed mechanism, the researchers must wait until next year for the launch of the Transiting Exoplanet Survey Satellite (TESS), which will target closer and brighter systems, making it easier for follow-up observations to uncover the bully planets.

Rarest nucleus reluctant to decay
Nature’s rarest type of atomic nucleus is not giving up its secrets easily.

Scientists looking for the decay of an unusual form of the element tantalum, known as tantalum-180m, have come up empty-handed. Tantalum-180m’s hesitance to decay indicates that it has a half-life of at least 45 million billion years, Bjoern Lehnert and colleagues reported online September 13. “The half-life is longer than a million times the age of the universe,” says Lehnert, a nuclear physicist at Carleton University in Ottawa. (Scientists estimate the universe’s age at 13.8 billion years.)
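As a quick sanity check, the ratio of the quoted half-life bound to the quoted age of the universe does indeed come out at a few million:

```python
# Quick arithmetic check on the figures quoted above: a 45-million-billion-year
# half-life lower bound versus a 13.8-billion-year-old universe.
half_life_years = 45e15          # lower bound on the Ta-180m half-life
age_of_universe_years = 13.8e9   # estimated age of the universe

ratio = half_life_years / age_of_universe_years
print(f"at least {ratio:.1e} universe-ages")  # ~3.3e6: "a million times" and then some
```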

Making up less than two ten-thousandths of a percent of the mass of the Earth’s crust, the metal tantalum is uncommon. And tantalum-180m is even harder to find. Only 0.01 percent of tantalum is found in this state, making it the rarest known long-lived nuclide, or variety of atom.

Tantalum-180m is a bit of an oddball. It is what’s known as an isomer — its nucleus exists in an “excited,” or high-energy, configuration. Normally, an excited nucleus would quickly drop to a lower energy state, emitting a photon — a particle of light — in the process. But tantalum-180m is “metastable” (hence the “m” in its name), meaning that it gets stuck in its high-energy state.
Tantalum-180m is thought to decay by emitting or capturing an electron, morphing into another element — either tungsten or hafnium — in the process. But this decay has never been observed. Other unusual nuclides, such as those that decay by emitting two electrons simultaneously, can have even longer half-lives than tantalum-180m. But tantalum-180m is unique — it is the longest-lived isomer found in nature.

“It’s a very interesting nucleus,” says nuclear physicist Eric Norman of the University of California, Berkeley, who was not involved with the study. Scientists don’t have a good understanding of such unusual decays, and a measurement of the half-life would help scientists pin down the details of the process and the nucleus’ structure.

Lehnert and colleagues observed a sample of tantalum with a detector designed to catch photons emitted in the decay process. After running the experiment for 176 days, and adding in data from previous incarnations of the experiment, the team saw no evidence of decay. The half-life couldn’t be shorter than 45 million billion years, the scientists determined, or they would have seen some hint of the process. “They did a state-of-the-art measurement,” says Norman. “It's a very difficult thing to see.”

The presence of tantalum-180m in nature is itself a bit of a mystery, too. The element-forging processes that occur in stars and supernovas seem to bypass the nuclide. “People don’t really understand how it is created at all,” says Lehnert.

Tantalum-180m is interesting as a potential energy source, says Norman, although “it’s kind of a crazy idea.” If scientists could find a way to tap the energy stored in the excited nucleus by causing it to decay, it might be useful for applications like nuclear lasers, he says.

Weird Science: 3 Win Nobel for Unusual States of Matter
How is a doughnut like a coffee cup? The answer helped three British-born scientists win the Nobel prize in physics Tuesday.

Their work could help lead to more powerful computers and improved materials for electronics.

David Thouless, Duncan Haldane and Michael Kosterlitz, who are now affiliated with universities in the United States, were honored for work in the 1970s and '80s that shed light on strange states of matter.

"Their discoveries have brought about breakthroughs in the theoretical understanding of matter's mysteries and created new perspectives on the development of innovative materials," the Royal Swedish Academy of Sciences said.

Thouless, 82, is a professor emeritus at the University of Washington. Haldane, 65, is a physics professor at Princeton University in New Jersey. Kosterlitz, 73, is a physics professor at Brown University in Providence, Rhode Island, and currently a visiting lecturer at Aalto University in Helsinki.

The 8 million kronor ($930,000) award was divided with one half going to Thouless and the other to Haldane and Kosterlitz.

They investigated strange states of matter like superconductivity, the ability of a material to conduct electricity without resistance.

Their work called on an abstract mathematical field called topology, which presents a particular way to describe some properties of matter. In this realm, a doughnut and a coffee cup are basically the same thing because each contains precisely one hole. Topology describes properties that can only change in full steps; you can't have half a hole.

"Using topology as a tool, they were able to astound the experts," the academy said.

For example, in the 1970s, Kosterlitz and Thouless showed that very thin layers of material — essentially containing only two dimensions rather than three — could undergo fundamental changes known as phase transitions. One example is when a material is chilled enough that it can start showing superconductivity.

Scientists had thought phase changes were impossible in just two dimensions, but the two men showed that changes do occur and that they were rooted in topology.

"This was a radically new way of looking at phases of matter," said Sankar Das Sarma, a physicist at the University of Maryland in College Park.

"Now everywhere we look we find that topology affects the physical world," he said.

Haldane was cited for theoretical studies of chains of magnetic atoms that appear in some materials. He said he found out about the prize through an early morning telephone call.

"My first thought was someone had died," he told The Associated Press. "But then a lady with a Swedish accent was on the line. It was pretty unexpected."

Kosterlitz, a dual U.K.-U.S. citizen, said he got the news in a parking garage while heading to lunch in Helsinki.

"I'm a little bit dazzled. I'm still trying to take it in," he told AP.

Nobel committee member David Haviland said this year's prize was more about theoretical discoveries even though they may result in practical applications.

"These theoreticians have come up with a description of these materials using topological ideas, which have proven very fruitful and have led to a lot of ongoing research about material properties," he said.

Haldane said the award-winning research is just starting to have practical applications.

"The big hope is that some of these new materials could lead to quantum computers and other new technology," he said.

Quantum computers could be powerful tools, but Kosterlitz was not so sure about the prospects for developing them.

"I've been waiting for my desktop quantum computer for years, but it's still showing no signs of appearing," he said. "At the risk of making a bad mistake, I would say that this quantum computation stuff is a long way from being practical."

This year's Nobel Prize announcements started Monday with the medicine award going to Japanese biologist Yoshinori Ohsumi for discoveries on autophagy, the process by which a cell breaks down and recycles content.

The chemistry prize will be announced on Wednesday and the Nobel Peace Prize on Friday. The economics and literature awards will be announced next week.

Besides the prize money, the winners get a medal and a diploma at the award ceremonies on Dec. 10, the anniversary of prize founder Alfred Nobel's death in 1896.

Methane didn’t warm ancient Earth, new simulations suggest
Methane wasn’t the cozy blanket that kept Earth warm hundreds of millions of years ago when the sun was dim, new research suggests.

By simulating the ancient environment, researchers found that abundant sulfate and scant oxygen created conditions that kept down levels of methane — a potent greenhouse gas — around 1.8 billion to 800 million years ago (SN: 11/14/15, p. 18). So something other than methane kept Earth from becoming a snowball during this dim phase in the sun’s life. Researchers report on this new wrinkle in the so-called faint young sun paradox (SN: 5/4/13, p. 30) the week of September 26 in the Proceedings of the National Academy of Sciences.

Limited oxygen increases the production of microbe-made methane in the oceans. With low oxygen early in Earth’s history, many scientists suspected that methane was abundant enough to keep temperatures toasty. Oxygen may have been too sparse, though. Recent work suggests that oxygen concentrations at the time were as low as a thousandth their present-day levels (SN: 11/28/14, p. 14).

Stephanie Olson of the University of California, Riverside and colleagues propose that such low oxygen concentrations thinned the ozone layer that blocks methane-destroying ultraviolet rays. They also estimate that high concentrations of sulfate in seawater at the time helped sustain methane-eating microbes. Together, these processes severely limited methane to levels similar to those seen today — far too low to keep Earth defrosted.

New 'Artificial Synapses' Pave Way for Brain-Like Computers
A brain-inspired computing component provides the most faithful emulation yet of connections among neurons in the human brain, researchers say.

The so-called memristor, an electrical component whose resistance relies on how much charge has passed through it in the past, mimics the way calcium ions behave at the junction between two neurons in the human brain, the study said. That junction is known as a synapse. The researchers said the new device could lead to significant advances in brain-inspired — or neuromorphic — computers, which could be much better at perceptual and learning tasks than traditional computers, as well as far more energy efficient.
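The idea that resistance tracks past charge flow can be sketched with the simple "linear drift" memristor model (after the 2008 HP Labs formulation, not the specific device in this study); all parameter values below are illustrative placeholders:

```python
# A minimal "linear drift" memristor model: the device's resistance depends
# on how much charge has flowed through it, loosely analogous to a synapse
# whose strength changes with use. Parameters are illustrative, not from
# the study described above.
R_on, R_off = 100.0, 16e3   # ohms: fully doped / undoped resistance
D = 10e-9                   # m: device thickness
mu = 1e-14                  # m^2 s^-1 V^-1: dopant mobility
dt = 1e-4                   # s: integration time step

def step(w, v):
    """Advance the doped-region width w under an applied voltage v."""
    R = R_on * (w / D) + R_off * (1 - w / D)   # state-dependent resistance
    i = v / R                                  # current through the device
    w = w + mu * (R_on / D) * i * dt           # linear dopant drift
    return min(max(w, 0.0), D), R

# Repeated voltage pulses "strengthen" the connection: each pulse pushes
# more charge through the device and lowers its resistance.
w = 0.1 * D
resistances = []
for _ in range(2000):
    w, R = step(w, 1.0)
    resistances.append(R)

print(f"R before: {resistances[0]:.0f} ohm, after: {resistances[-1]:.0f} ohm")
```

Because the resistance only falls as charge accumulates, the device "remembers" its stimulation history, which is the property that makes it attractive as an artificial synapse.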

Stephen Hawking Is Still Afraid of Aliens
Humanity should be wary of seeking out contact with alien civilizations, Stephen Hawking has warned once again.

In 2010, the famed astrophysicist said that intelligent aliens may be rapacious marauders, roaming the cosmos in search of resources to plunder and planets to conquer and colonize. He reiterates that basic concern in "Stephen Hawking's Favorite Places," a new documentary streaming now on the CuriosityStream video service.

"One day, we might receive a signal from a planet like this," Hawking says in the documentary, referring to a potentially habitable alien world known as Gliese 832c. "But we should be wary of answering back. Meeting an advanced civilization could be like Native Americans encountering Columbus. That didn't turn out so well."

The Ig Nobel Prize Winners of 2016
The 2016 Ig Nobel Prizes were announced on Sept. 22, revealing the honorees who were deemed to have made achievements that make people laugh and then make them think. In the 26th year of the ceremony, those honored did not disappoint. From rats wearing polyester pants and rock personalities to the science of BS and the satisfaction of mirror scratching, here's a look at this year's winners.

Teleported Laser Pulses? Quantum Teleportation Approaches Sci-Fi Level
Crewmembers aboard the starship Enterprise on the iconic TV series "Star Trek" could "beam up" from planets to starships, making travel between great distances look easy. While these capabilities are clearly fictional, researchers have now performed "quantum teleportation" of laser pulses over several miles within two city networks of fiber optics.

Although the method described in the research will not replace city subways or buses with transporter booths, it could help lead to hack-proof telecommunications networks, as well as a "quantum internet" to help extraordinarily powerful quantum computers talk to one another.

Teleporting an object from one point in the universe to another without it moving through the space in between may sound like science fiction, but quantum physicists have actually been experimenting with quantum teleportation since 1998. The current distance record for quantum teleportation — a feat announced in 2012 — is about 89 miles (143 kilometers), between the two Canary Islands of La Palma and Tenerife, off the northwest coast of Africa.

Quantum teleportation relies on the bizarre nature of quantum physics, which finds that the fundamental building blocks of the universe, such as subatomic particles, can essentially exist in two or more places at once. Specifically, quantum teleportation depends on a strange phenomenon known as "quantum entanglement," in which objects can become linked and influence each other instantaneously, no matter how far apart they are.

Currently, researchers cannot teleport matter (say, a human) across space, but they can use quantum teleportation to beam information from one place to another. The quantum teleportation of an electron, for example, would first involve entangling a pair of electrons. Next, one of the two electrons — the one to be teleported — would stay in one place while the other electron would be physically transported to whatever destination is desired.

Then, the fundamental details or "quantum state" of the electron to be teleported are analyzed — an act that also destroys its quantum state. Finally, that data is sent to the destination, where it can be used on the other electron to recreate the first one, so that it is indistinguishable from the original. For all intents and purposes, that electron has teleported. (Because the data is sent using regular signals such as light pulses or electrons, quantum teleportation can proceed no faster than the speed of light.)
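The protocol described in the last two paragraphs can be sketched numerically. Below is a minimal state-vector simulation of standard qubit teleportation, an idealized illustration rather than the fiber-optic experiments themselves; for brevity it fixes one of the four possible measurement outcomes:

```python
import numpy as np

# Single-qubit gates
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def kron_all(mats):
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def op(gate, target, n=3):
    """Embed a single-qubit gate acting on `target` in an n-qubit register."""
    return kron_all([gate if q == target else I for q in range(n)])

def cnot(control, target, n=3):
    """CNOT built from projectors on the control qubit."""
    P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
    branch0 = [P0 if q == control else I for q in range(n)]
    branch1 = [P1 if q == control else (X if q == target else I) for q in range(n)]
    return kron_all(branch0) + kron_all(branch1)

# Qubit 0 holds the state to teleport; qubits 1 and 2 share a Bell pair.
alpha, beta = 0.6, 0.8j
psi = np.array([alpha, beta])
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)     # (|00> + |11>)/sqrt(2)
state = np.kron(psi, bell)

# Sender entangles the input qubit with her half of the pair, then measures.
state = op(H, 0) @ cnot(0, 1) @ state

# Fix the measurement outcome m0 = m1 = 0, project, and renormalise.
m0, m1 = 0, 0
keep = np.array([1.0 if (((k >> 2) & 1) == m0 and ((k >> 1) & 1) == m1) else 0.0
                 for k in range(8)])
state = state * keep
state = state / np.linalg.norm(state)

# Receiver would apply X**m1 then Z**m0 to qubit 2 (identity for this outcome).
out = np.array([state[(m0 << 2) | (m1 << 1)], state[(m0 << 2) | (m1 << 1) | 1]])
print(np.allclose(out, psi))  # True: qubit 2 now carries the original state
```

Note that the two classical measurement bits must still be sent to the receiver by ordinary means, which is why teleportation cannot outrun light.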

Now, two research groups independently report quantum teleportation over several miles of fiber-optic networks in the cities of Hefei, China, and Calgary, Alberta. The scientists detailed their findings online Sept. 19 in two independent papers in the journal Nature Photonics.

China Claims It Developed "Quantum" Radar To See Stealth Planes
Beijing's state media has made the bold claim that a Chinese defense contractor successfully developed the world's first quantum radar system. The radar can allegedly detect objects at a range of up to 62 miles. If true, this would greatly diminish the value of so-called "stealth" aircraft, including the B-2 bomber and the F-22 Raptor fighter. But it's a pretty far-out claim.

Quantum radar is based on the theory of quantum entanglement and the idea that two different particles can share a relationship with one another to the point that, by studying one particle, you can learn things about the other particle—which could be miles away. These two particles are said to be "entangled".

In quantum radars, a photon is split by a crystal into two entangled photons, a process known as "parametric down-conversion." The radar splits multiple photons into entangled pairs—an A and a B, so to speak. The radar system sends one half of each pair—the As—via microwave beam into the air. The other set, the Bs, remains at the radar base. By studying the photons retained at the radar base, the radar operators can tell what happens to the photons broadcast outward. Did they run into an object? How large was it? How fast was it traveling and in what direction? What does it look like?

Quantum radars defeat stealth by using subatomic particles, not radio waves. Subatomic particles don't care if an object's shape was designed to reduce a traditional, radio wave-based radar signature. Quantum radar would also ignore traditional radar jamming and spoofing methods such as radio-wave radar jammers and chaff.

According to Global Times, the 14th Institute of China Electronics Technology Group Corporation (CETC) developed the radar system last month. The subdivision website describes the "14th Institute" as "the birthplace of Radar industry (sic) in China", employing 9,000 workers on a 2,000-acre research campus.

China isn't the only country working on quantum radar: Lockheed Martin was granted a patent on a theoretical design in 2008. Lockheed's plans were more far-reaching, including the ability to "visualize useful target details through background and/or camouflaging clutter, through plasma shrouds around hypersonic air vehicles, through the layers of concealment hiding underground facilities, IEDs, mines, and other threats." In many ways, Lockheed's concept of quantum radar resembles the spaceship and handheld sensors on "Star Trek."

Since the 2008 patent, Lockheed's been silent on the subject of quantum radars. Given what a technological leap such a system would be, it's quite possible the research has gone "black"—highly classified and subject to a high level of secrecy.

Earth Wobbles May Have Driven Ancient Humans Out of Africa
Ancient human migrations out of Africa may have been driven by wobbles in Earth's orbit and tilt that led to dramatic swings in climate, a new study finds.

Modern humans first appeared in Africa about 150,000 to 200,000 years ago. It remains a mystery as to why it then took many millennia for people to disperse across the globe. Recent archaeological and genetic findings suggest that migrations of modern humans out of Africa began at least 100,000 years ago, but most humans outside of Africa most likely descended from groups who left the continent more recently — between 40,000 and 70,000 years ago.

Previous research suggested that shifts in climate might help explain why modern human migrations out of Africa happened when they did. For instance, about every 21,000 years, Earth experiences slight changes to its orbit and tilt. These series of wobbles, known as Milankovitch cycles, alter how much sunlight hits different parts of the planet, which in turn influences rainfall levels and the number of people any given region can support.

Alien Planet Has 2 Suns Instead of 1, Hubble Telescope Reveals
Imagine looking up and seeing more than one sun in the sky. Astronomers have done just that, announcing today (Sept. 22) that they have spotted a planet orbiting two stars instead of one, as previously thought, using the Hubble Space Telescope.

Several planets that revolve around two, three or more stars are known to exist. But this is the first time astronomers have confirmed such a discovery of a so-called "circumbinary planet" by observing a natural phenomenon called gravitational microlensing, or the bending of light caused by strong gravity around objects in space. You can see how researchers found the planet in this video.

In binary-star systems, the two stars orbit a common center of mass. When one star passes in front of the other from our perspective on Earth, gravity from the closer star bends and magnifies the light coming from the star in the background. Astronomers can study this distorted light to find clues about the star in the foreground and any potential planets orbiting the star system.

Glider Will Attempt Record-Breaking Flight to Edge of Space
In a spot in South America known for its powerful winds, scientists and engineers are gearing up to attempt a record-breaking feat: to fly a human-carrying glider to the edge of space.

The expedition, known as Perlan Mission II, aims to take the glider up to an elevation of 90,000 feet (27,000 meters). The project is more than an attempt at aviation history; it's designed to study the layers of Earth's atmosphere. The researchers plan to fly the glider on a series of flights to measure electromagnetic fields, pressure, ozone and methane levels, and more.

To reach such great heights, the glider was built to take advantage of an atmospheric phenomenon called stratospheric mountain waves. Normal mountain waves form between cold and warm air masses as they move across mountain ranges and create high-altitude winds. Stratospheric mountain waves, which the researchers plan to ride, form when the polar vortex — a large, low-pressure and cold air system — reaches peak strength, giving the high-altitude winds more energy.

"The strong winds will be perpendicular to the Andes, and as they come over the mountains, they cause a wave in the air that's invisible unless there are clouds present," Jim Payne, chief pilot for the Perlan Mission II project, told Avionics. "We fly in the area where the air is rising and propagates all the way up to 90,000 feet, although meteorologists say it may go up to 130,000 feet [40,000 m]."

Stratospheric mountain waves occur at peak strength in the Southern Hemisphere's winter months [summer in the Northern Hemisphere], so the Perlan Project team members recently traveled to Patagonia, in South America, where they will await ideal conditions for their first attempt at flying to the edge of space.

"Typically, the polar vortex, which causes the high-altitude wave, is best in August and September," Payne said. "So far, August has been disappointing; we haven't had the high-altitude winds. The one downside of this is that we're totally at the mercy of the weather."

If conditions are right and the flight is successful, Perlan would surpass the world altitude record for a fixed-wing aircraft. The current record of 85,068 feet (25,929 m) was set 50 years ago by the SR-71 Blackbird, a jet-powered spy plane, National Geographic reported. Unlike the Blackbird, the Perlan glider would achieve the feat without a drop of fuel.

Earlier this year, another aviation record was set without consuming any fuel. The Solar Impulse 2, a plane powered entirely by the sun, completed a journey around the world, becoming the first solar-powered aircraft to circumnavigate the globe without using any fuel.

Entangled Particles Reveal Even Spookier Action Than Thought
Sorry, Einstein: It looks like the world is spooky — even when your most famous theory is tossed out.

This finding comes from a close look at quantum entanglement, in which two particles that are "entangled" affect each other even when separated by a large distance. Einstein found that his theory of special relativity meant that this weird behavior was impossible, calling it "spooky."

Now, researchers have found that even if they were to scrap this theory, allowing entangled particles to communicate with each other faster than the speed of light or even instantaneously, that couldn't explain the odd behavior. The findings rule out certain "realist" interpretations of spooky quantum behavior.

"What that tells us is that we have to look a little bit deeper," said study co-author Martin Ringbauer, a doctoral candidate in physics at the University of Queensland in Australia. "This kind of action-at-a-distance is not enough to explain quantum correlations" seen between entangled particles, Ringbauer said.

Most of the time, the world seems — if not precisely orderly — then at least governed by fixed rules. At the macroscale, cause and effect rule the behavior of the universe, time always marches forward, and objects in the universe have objective, measurable properties.

But zoom in enough, and those common-sense notions seem to evaporate. At the subatomic scale, particles can become entangled, meaning their fates are bizarrely linked. For instance, if two photons are sent from a laser through a crystal, after they fly off in separate directions, their spin will be linked the moment one of the particles is measured. Several studies have now confirmed that, no matter how far apart entangled particles are, how fast one particle is measured, or how many times particles are measured, their states become inextricably linked once they are measured.

For nearly a century, physicists have tried to understand what this means about the universe. The dominant interpretation was that entangled particles have no fixed position or orientation until they are measured. Instead, both particles travel as the sum of the probability of all their potential positions, and both only "choose" one state at the moment of measurement. This behavior seems to defy notions of Einstein's theory of special relativity, which argues that no information can be transmitted faster than the speed of light. It was so frustrating to Einstein that he famously called it "spooky action at a distance."

To get around this notion, in 1935, Einstein and colleagues Boris Podolsky and Nathan Rosen laid out a paradox that could test the alternate hypothesis that some hidden variable affected the fate of both objects as they traveled. If the hidden variable model were true, that would mean "there's some description of reality which is objective," Ringbauer told Live Science. [Spooky! The Top 10 Unexplained Phenomena]

Then in 1964, Irish physicist John Stewart Bell came up with a mathematical expression, now known as Bell's Inequality, that could experimentally prove Einstein wrong by proving the act of measuring a particle affects its state.

In hundreds of tests since, Einstein's basic explanation for entanglement has failed: Hidden variables can't seem to explain the correlations between entangled particles. But there was still some wiggle room: Bell's Inequality didn't address the situation in which two entangled photons travel faster than light.
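To see what such a test measures, here is a minimal numerical sketch of the standard CHSH form of Bell's Inequality: for a maximally entangled pair, quantum mechanics predicts a correlation value of 2√2, beyond the bound of 2 that any local hidden-variable model must obey.

```python
import numpy as np

# Pauli matrices for spin measurements
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def meas(theta):
    """Spin measurement along angle theta in the x-z plane (outcomes +-1)."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet (maximally entangled) two-qubit state: (|01> - |10>)/sqrt(2)
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

def E(ta, tb):
    """Quantum expectation of the product of the two +-1 outcomes."""
    return singlet.conj() @ np.kron(meas(ta), meas(tb)) @ singlet

# Standard CHSH angle choices for the two observers
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, -np.pi / 4

S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(f"|S| = {abs(S):.3f}")   # 2.828 = 2*sqrt(2): above the classical bound of 2
```

Hidden-variable models cap |S| at 2, so experiments that measure values near 2√2 are the "failures" of Einstein's explanation described above.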
In the new study, however, Ringbauer and his colleagues took a little bit more of that wiggle room away. In a combination of experiments and theoretical calculations, they show that even if a hidden variable were to travel from entangled photon "A" to entangled photon "B" instantaneously, that would not explain the correlations found between the two particles.

The findings may bolster the traditional interpretation of quantum mechanics, but that leaves physicists with other headaches, Ringbauer said. For one, it lays waste to our conventional notions of cause and effect, he said.

For another, it means that measurements and observations are subjective, Ognyan Oreshkov, a theoretical physicist at the Free University of Brussels in Belgium, told Live Science.

If the state of a particle depends on being measured or observed, then who or what is the observer when, for instance, subatomic particles in a distant supernova interact? What is the measurement? Who is "inside" the entangled system and who is on the outside observing it? Depending on how the system is defined, for instance, to include more and more objects and things, the "state" of any given particle may then be different, Ringbauer said.

"You can always draw a bigger box," Ringbauer said.

Still, realists should take heart. The new findings are not a complete death knell for faster-than-light interpretations of entanglement, said Oreshkov, who was not involved in the current study.

The new study "rules out only one specific model where the influence goes from the outcome of one measurement to the outcome of the other measurement," Oreshkov said. In other words, that photon A is talking to photon B at faster-than-light speeds.

Another possibility, however, is that the influence starts earlier, with the correlation in states somehow going from the point at which the photons became entangled (or at some point earlier in the experiment) to the measured photons at the end of the experiment, Oreshkov added. That, however, wasn't tested in the current research, he said. [10 Effects of Faster-Than-Light Travel]

Most physicists who were holding out for a nonlocal interpretation, meaning one not constrained by the speed of light, believe this latter scenario is more likely, said Jacques Pienaar, a physicist who was recently at the University of Vienna in Austria.

"There won't be anybody reading this paper saying, 'Oh, my God, I've been wrong my whole life,'" Pienaar, who was not involved in the current study, told Live Science. "Everybody is going to find it maybe surprising but not challenging; they'll very easily incorporate it into their theories."

The new study suggests it may be time to retire Bell's Inequality, Pienaar said.

"I think that people are too focused on, too obsessed with Bell Inequalities," Pienaar said. "I think it's an idea which was really amazing and changed the whole field, but it's run its course."

Instead, a tangential idea laid out in the paper may be more intriguing – the development of a definition of causality on the quantum scale, he said. If people focus on cracking quantum entanglement from these new perspectives, "I think lots of cool discoveries could be made," Pienaar said.

Dark Matter Just Got Murkier
They say that love makes the world go around, and that may well be true. But when you look at things on a much larger scale — say, the size of galaxies — love just isn't enough. And, for that matter, neither are the stars of the galaxies themselves. In fact, what makes galaxies go around is a kind of matter that has never been directly observed. That undiscovered "stuff" is called dark matter, and an amazing new measurement was recently announced that is causing the scientific world to rethink long-held ideas.

New 'Gel' May Be Step Toward Clothing That Computes
A gel-like material that can carry out pattern recognition could be a major step toward "materials that compute," with possible applications for "smart" clothing or sensing skins for robots, according to a new study.

Recent advances in both materials and computer science have prompted researchers to look beyond standard silicon-based electronics and exploit the inherent properties of materials to create systems where the material itself is the computer.

Now, a team from the University of Pittsburgh has designed a material that can solve pattern-recognition problems using changes in the oscillations of a chemically powered gel that pulsates like a heart.

3.7-Billion-Year-Old Rock May Hold Earth's Oldest Fossils
Tiny ripples of sediment on ancient seafloor, captured inside a 3.7-billion-year-old rock in Greenland, may be the oldest fossils of living organisms ever found on Earth, according to a new study.

The research, led by Allen Nutman, head of the School of Earth and Environmental Sciences at the University of Wollongong in Australia, described the discovery of what look like tiny waves, 0.4 to 1.5 inches (1 to 4 centimeters) high, frozen in a cross section of the surface of an outcrop of rock in the Isua Greenstone Belt in southwestern Greenland, a formation made up of what geologists regard as the oldest rocks on the Earth's surface.

The researchers said the ripples are the fossilized remains of cone-shaped stromatolites, layered mounds of sediment and carbonates that build up around colonies of microbes that grow on the floor of shallow seas or lakes.

Planck: First Stars Formed Later Than We Thought
ESA's Planck satellite has revealed that the first stars in the Universe started forming later than previous observations of the Cosmic Microwave Background indicated. This new analysis also shows that these stars were the only sources needed to account for reionising atoms in the cosmos, having completed half of this process when the Universe had reached an age of 700 million years.

Galaxy Cluster 11.1 Billion Light-Years from Earth Is Most Distant Ever Seen
NASA has just discovered a group of galaxies far, far away — so far, in fact, that it set a new record for the most distant ever discovered. The cluster of galaxies, named CL J1001+0220 (or CL J1001 for short), resides a whopping 11.1 billion light-years from Earth. Astronomers found the distant cluster of galaxies using a combination of observations from NASA's Chandra X-ray Observatory and several other space telescopes.

Of the 11 galaxies in the cluster, nine appear to be experiencing a firestorm of new star births. "This galaxy cluster isn't just remarkable for its distance, it's also going through an amazing growth spurt unlike any we've ever seen," Tao Wang of the French Alternative Energies and Atomic Energy Commission (CEA) and lead investigator in the discovery, said in a statement.

What Earth's Oldest Fossils Mean for Finding Life on Mars
If recent findings on Earth are any guide, the oldest rocks on Mars may have signs of ancient life locked up inside.

In a new study, a team of geologists led by Allen Nutman, of the University of Wollongong in Australia, discovered 3.7-billion-year-old rocks that may contain the oldest fossils of living organisms yet found on Earth, beating the previous record by 220 million years. The discovery suggests that life on Earth appeared relatively quickly, less than 1 billion years after the planet formed, according to the new research, published online today (Aug. 31) in the journal Nature.
If that's the case, then it's possible that Martian rocks of the same age could also have evidence of microbial life in them, said Abigail Allwood, a research scientist at NASA's Jet Propulsion Laboratory in Pasadena, California. Allwood was not involved with the new study but authored an opinion piece about the discovery, which was also published today in Nature.

Earth Just Narrowly Missed Getting Hit by an Asteroid
On Saturday, astronomers discovered a new asteroid, just a few hours before it almost hit us. The asteroid is called 2016 QA2, and it missed the Earth by less than a quarter of the distance to the moon. That puts it about three times as far away from Earth as our farthest satellites. And we never saw it coming.

Astrobiology Primer v2.0 Released
The long-awaited second edition of the Astrobiology Primer is now published in the journal Astrobiology.
This version is an update of the Primer originally published in 2006, written by graduate students and postdoctoral researchers to provide a comprehensive introduction to the field. Redone from scratch, the 2016 version contains updated content that addresses the definition of life in scientific research, the origins of planets and planetary systems, the evolution and interactions of life on Earth, habitability on worlds beyond Earth, the search for life, and the overall implications of the research.

The Primer is intended to be a resource for early-career scientists, especially graduate students, who are new to astrobiology.

A new class of galaxy has been discovered, one made almost entirely of dark matter
Much of the universe is made of dark matter, the mysterious, as-yet-undetected stuff that barely interacts with the "normal" matter around it. In the Milky Way, dark matter outweighs regular matter by about 5 to 1, and very tiny dwarf galaxies are known to contain even more of the stuff.

But now scientists have found something entirely new: a galaxy with the same mass as the Milky Way but with only 1 percent of our galaxy's star power. About 99.99 percent of this other galaxy is made up of dark matter, and scientists believe it may be one of many.

The galaxy Dragonfly 44, described in a study published Thursday in the Astrophysical Journal Letters, is 300 million light years away. If scientists can track down a similar galaxy closer to home, however, they may be able to use it to make the first direct detection of dark matter.

How We Could Visit the Possibly Earth-Like Planet Proxima b
A potentially Earth-like planet has been discovered orbiting a star located right next door to the sun. Should humanity try to send a probe there as soon as possible?

The newly discovered planet, known as Proxima b, orbits the star Proxima Centauri, the closest star to the sun. Proxima Centauri is about 4.22 light-years — or 25 trillion miles (40 trillion kilometers) — from Earth.

That's a daunting distance. But an initiative announced earlier this year aims to send superfast miniature probes to Proxima Centauri, on a journey that would take about 20 years. With the discovery of Proxima b, the founders of that initiative are even more eager to get going.
In 2015, NASA's New Horizons probe completed its 3-billion-mile (4.8 billion km) journey to Pluto after traveling for about 9.5 years. The spacecraft traveled at speeds topping 52,000 mph (84,000 km/h). At that rate, it would take New Horizons about 54,400 years to reach Proxima Centauri.

Last month, NASA's Juno probe reached speeds of about 165,000 mph (265,000 km/h) as it entered into orbit around Jupiter. At that rate, a probe could reach Proxima Centauri in about 17,157 years. (It should also be noted that there is currently no feasible way to accelerate a craft large enough to carry humans to those speeds.)

In other words, sending a probe to the nearest star system would not be easy.

The founders of the Breakthrough Starshot initiative want to send wafer-thin probes to Proxima Centauri at very high speeds. The plan calls for equipping these probes with thin sails, which would capture the energy imparted by a powerful Earth-based laser.

This laser would accelerate the probes to 20 percent the speed of light (about 134.12 million mph, or 215.85 million km/h), according to the program scientists. At that rate, the probes could reach Proxima Centauri in 20 to 25 years.
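The travel times quoted in this story all follow from simple distance-over-speed arithmetic. A minimal sketch that reproduces them approximately (the constants and rounding here are illustrative, not taken from the article):

```python
# Back-of-the-envelope travel times to Proxima Centauri at the speeds
# quoted in the article. Illustrative arithmetic, not mission analysis.

LY_KM = 9.4607e12          # kilometers in one light-year
DIST_KM = 4.22 * LY_KM     # distance to Proxima Centauri (~40 trillion km)
HOURS_PER_YEAR = 8766      # 365.25 days

def years_at(speed_kmh):
    """Coasting time in years at a constant speed in km/h."""
    return DIST_KM / speed_kmh / HOURS_PER_YEAR

new_horizons = years_at(84_000)                 # roughly 54,000 years
juno = years_at(265_000)                        # roughly 17,000 years
starshot = years_at(0.2 * 299_792.458 * 3600)   # 20% of light speed: ~21 years

print(f"New Horizons: {new_horizons:,.0f} yr")
print(f"Juno:         {juno:,.0f} yr")
print(f"Starshot:     {starshot:.1f} yr")
```

The small differences from the article's figures (54,400 and 17,157 years) come from rounding the distance to 40 trillion km versus using 4.22 light-years exactly.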

But first, scientists and engineers have to build the apparatus that will launch the tiny probes on their journey. In a news conference today (Aug. 24), Pete Worden, chairman of the Breakthrough Prize Foundation, said that a group of experts had convened earlier this week and discussed plans to build a prototype of the Starshot system. However, he added that the full-scale apparatus is at least 20 years off.

"We certainly hope that, within a generation, we can launch these nanoprobes," Worden said. "And so perhaps 20, 25 years from now, we could begin to launch them, and then they would travel for 25 years to get there."

He added that building the full-scale apparatus would likely cost about the same as building the Large Hadron Collider, the largest particle accelerator in the world; that project is estimated to have cost about $10 billion.

"Over the next decade, we will work with experts here at ESO [the European Southern Observatory] and elsewhere to get as much information as possible about the Proxima Centauri planet … even including whether it might bear life, prior to launching mankind's first probe towards the star," Worden said.

Worden said the Breakthrough Prize Foundation also hopes to "obtain similar data about the other nearby stars, Alpha Centauri A and B." (The two Alpha Centauri stars lie about 4.37 light-years from Earth; some astronomers think Proxima Centauri and the Alpha Centauri stars are part of the same system.)
The New Horizons mission to Pluto was a good demonstration of the benefits of sending a probe to study a planet (or dwarf planet). Images of Pluto captured by the world's most powerful telescopes could barely resolve any surface features on the icy world. During its 2015 flyby, New Horizons provided an incredibly detailed view of Pluto's surface and a boatload of new information about its history.

Could a wafer-thin probe sent to Proxima Centauri b reveal similar details about the planet, or perhaps even reveal the presence of life?

There would be some significant limitations to how much information the probes proposed by Breakthrough Starshot would be able to send back to Earth. First and foremost, the data would take 4.22 years to travel back to Earth, on top of the 20 to 25 years it would take the probe to get to Proxima Centauri.

Seth Shostak, a senior astronomer at the SETI Institute (SETI stands for "search for extraterrestrial intelligence"), said that the prospect of sending a miniature probe to Proxima Centauri is "even more interesting now than it was ... six months ago because now we know there is a planet there."

"I think [the discovery of Proxima b] has real implications for sending something physical to the star system because now there's a target of interest," Shostak said.

But he also brought up some of the unknown variables that people will have to consider when investing in Breakthrough Starshot, including what kind of information the probes could send back from the planet. Those wafer-thin probes would have to carry very small instruments, and thus might be able to do only a very rudimentary study of a planet or star.

It's difficult to predict the exact technology that would be on board, because electrical components and other technical gear will likely continue to shrink in size over the next 20 years. Scientists and engineers would have to consider whether, in the time it would take for information to come back from a probe sent to Proxima Centauri, they could build a telescope capable of gathering the same information.

Penelope Boston, director of NASA's Astrobiology Institute, thinks the continuing trend of hardware miniaturization will make it possible to equip a wafer-thin probe with instrumentation that would make a trip to Proxima Centauri well worth the investment. Boston said the intricate details of a planet's surface can create a huge variety of specific habitats, and resolving the details of those environments on a planet outside Earth's solar system is "certainly beyond the resolution of any conceivable telescope."

"I see the trends in all different kinds of instrumentation going in a kind of ['Star Trek'] tricorder direction, where you have more and more capability packaged into an ever-smaller physical space," Boston said.

'Virtual' Particles Are Just 'Wiggles' in the Electromagnetic Field
There are a few physics terms floating around in the world that are deceptive little buggers. These jargon phrases seem to succinctly describe a topic, encapsulating a complex process or interaction into a tidy, easily digestible nugget of information. But they're liars. The concepts they're intended to communicate are actually radically different from what the jargon would suggest.

Take, for example, "virtual particles." The term is supposed to answer a very old question: How, exactly, do particles interact? Let's say we have two charged particles, and let's call them Charles and Charlene. Let's continue to say that both Charles and Charlene are negatively charged. Maybe they're electrons; maybe they're muons. Doesn't matter. What matters is that if Charlene comes racing toward Charles, they bounce off each other and end up going their separate ways.

How did that bounce happen? What made it possible for Charles and Charlene to communicate with each other so that they knew to head in a new direction when the collision was all said and done? This is a fantastically basic question, so it seems that if we could satisfactorily answer it, we could unlock Deep and Important Mysteries of the Universe.

The modern perspective of quantum field theory recognizes photons — bits of light — as the carriers of the electromagnetic force. Charles and Charlene are charged particles, so they interact with light. But obviously, Charles and Charlene aren't shooting lasers at each other, so the trite explanation for their brief dalliance is that "they exchange virtual photons."
What in the name of Feynman's ghost does that mean?

Let's take a step back. In the olden-days view of physics (i.e., the 19th century), each charged particle generates an electric field, which is basically an instruction sheet for how other particles can interact with it. This field is strong near the particle and weaker farther out, and it points outward in every direction away from the particle.

So our Charles particle produces a field that permeates all of space. Other particles, like Charlene, can read this field and move accordingly. If Charlene is super-duper far away from Charles, the field she reads has very, very small numbers, so she barely notices any effect from Charles. But when she gets close, her field reader goes off the charts. Charles' electric field is very clearly saying "GO AWAY," and she obliges.
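The "field reader" picture above is a paraphrase of Coulomb's law: the field strength Charlene reads falls off as the inverse square of her distance from Charles. A minimal sketch of that classical picture (the distances are made-up illustrative values, not from the text):

```python
# The electric field of a point charge falls off as 1/r^2 (Coulomb's law).
# Illustrative values only.

K = 8.9875517923e9          # Coulomb constant, N*m^2/C^2
E_CHARGE = 1.602176634e-19  # elementary charge, C

def field_magnitude(q_coulombs, r_meters):
    """Field strength a test particle 'reads' at distance r from charge q."""
    return K * abs(q_coulombs) / r_meters**2

# Charlene reads Charles' (an electron's) field at two distances:
far = field_magnitude(E_CHARGE, 1e-6)    # a micron away: a very small number
near = field_magnitude(E_CHARGE, 1e-10)  # an angstrom away: off the charts

# Getting 10,000x closer makes the reading 100,000,000x stronger.
print(near / far)
```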

In this view, the field is just as real and important as the particle. The universe is full of stuff, and the fields tell that stuff how to interact with other stuff.

In the early to mid-20th century, physicists realized that the universe is a much, much stranger place than we had imagined. Marrying special relativity with quantum mechanics, they developed quantum field theory, and let's just say the results weren't what anybody expected.

As the name suggests, the field got a promotion. Instead of just being the bookkeeping device that showed how one particle should interact with another, it became — and here come some italics for emphasis — the primary physical object. In this modern, sophisticated view of the universe, the electron isn't just a lonely particle. Oh no. Instead, there's an electron field, permeating all of space and time like milk in French toast.

This field is it — it's the thing. Particles? They're just pinched-off bits of that field. Or, more accurately, they're excitations (like, wiggles) of the field that can travel freely. That's important, and I'll get back to it soon.

Here's where things start to get fuzzy. A particle traveling from one spot to another doesn't exactly stay a particle, or at least not the same kind of particle.
Let's go back to Charles, the charged particle. Since he's charged, by definition he interacts with light, which is the electromagnetic field. So wiggles in the electron field (the field whose wiggles are electrons) can affect wiggles in the electromagnetic field. So, literally, as Charles zips around, he spends some of his time as an electron-field wiggle and some of his time as an electromagnetic-field wiggle. Sometimes he's an electron, and sometimes he's a photon — a bit of the electromagnetic (EM) field!

It gets worse. Way worse. Charles-turned-EM-wiggle can become other wiggles, like muon wiggles. For every fundamental particle in the universe, there's a corresponding field, and they all talk to one another and wiggle back and forth constantly.

All the wiggles and sub-wiggles and sub-sub-wiggles add up to what we call "an electron traveling from one spot to another." It all becomes really nasty mathematically very quickly, but folks like physicist Richard Feynman came up with handy tricks to get some science work done.
Now, after tons of backstory, we can get to the main question. The fields wiggle to and fro (and sometimes fro and to). If the wiggles persist and travel, we call them "particles." If they die off quickly, we call them "virtual particles." But fundamentally, they're both wiggles of fields.

When Charles encounters Charlene, they're not like two little bullets ready to slam into each other. Instead, they're complicated sets of wiggles in all sorts of fields, phasing in and out from one type of field to another.

When they do get close enough to interact, it's … messy. Very messy. Wiggles and counter-wiggles, a frenzied mishmash of intermingling. The machinery of quantum field theory — after many tedious calculations — does indeed provide the correct answer (Charles and Charlene bounce off each other), but the details are headache-inducing.

So, the shorthand — "they exchange virtual particles" — rolls off the tongue quite easily, a little slip of jargon to package up a very complicated process. But, unfortunately, it's not very accurate.

Are tiny BLACK HOLES hitting Earth once every 1,000 years? Experts claim primordial phenomenon could explain dark matter
Earlier this year, experts predicted that dark matter may be made of black holes formed during the first second of our universe's existence.
Known as primordial black holes, they could be hitting our own planet every 1,000 years, the professor behind the theory has now revealed.
The NASA study claimed this interpretation aligns with our knowledge of cosmic infrared and X-ray background glows and may explain the unexpectedly high masses of merging black holes.

‘Largest structure in the universe’ undermines fundamental cosmic principles
Just in time for the hype surrounding No Man’s Sky, the game that takes cosmic scale to the extreme, a team of astronomers say they’ve discovered what might be the largest structure in the observable universe. The tremendous feature consists of nine gamma-ray bursts (GRBs), forming a ring that stretches some 5 billion light years across, according to a paper published in Monthly Notices of the Royal Astronomical Society.

The ring spans more than 70 times the diameter of the full moon as seen from Earth. And, as the GRBs each appear to be about 7 billion light years away, the probability that these features are positioned in this way by chance is just one in 20,000, according to lead author Professor Lajos Balazs from the Konkoly Observatory in Budapest.

Amazingly, the team of astronomers discovered the cosmic ring by accident. “Originally, we studied the space distribution of gamma ray bursts,” Balazs told Digital Trends. “GRBs are the most energetic transients in the universe and the only observed objects sampling the observable universe as a whole. In general, we were interested to conclude whether the universe is homogeneous and isotropic on large scale.”
“We were totally surprised,” he added, “because we did not expect to find it.”
However, there are reasons to step back and reconsider the discovery — it seems to undermine our established understanding of how the universe developed.

According to the cosmological principle, the structure of the universe is uniform at its largest scale and its largest structures are theoretically limited to 1.2 billion light years across. This new discovery pushes that limit nearly five-fold.

Balazs and his team used telescopes in space and observatories on Earth to identify the structure. They will now investigate whether the cosmological principle and other processes of galaxy formation can account for the ring structure. If not, theories about the formation of the cosmos may need to be rewritten.

“If we are right,” Balazs commented in a press release, “this structure contradicts the current models of the universe. It was a huge surprise to find something this big – and we still don’t quite understand how it came to exist at all.”

"Kitchen Smoke" in nebula offers clues to the building blocks of life
Using data collected by NASA's Stratospheric Observatory for Infrared Astronomy (SOFIA) and other observatories, an international team of researchers has studied how a particular type of organic molecule, a raw material for life, could develop in space. This information could help scientists better understand how life could have developed on Earth.

Bavo Croiset of Leiden University in the Netherlands and his collaborators focused on a type of molecule called polycyclic aromatic hydrocarbons (PAHs), which are flat molecules consisting of carbon atoms arranged in a honeycomb pattern, surrounded by hydrogen. PAHs make up about 10 percent of the carbon in the universe, and are found on Earth, where they are released when organic material such as meat, sugarcane, or wood is burned.
Croiset's team determined that when PAHs in the nebula NGC 7023, also known as the Iris Nebula, are hit by ultraviolet radiation from the nebula's central star, they evolve into larger, more complex molecules. Scientists hypothesize that the growth of complex organic molecules like PAHs is one of the steps leading to the emergence of life.

Some existing models predicted that the radiation from a newborn, nearby massive star would tend to break down large organic molecules into smaller ones, rather than build them up. To test these models, researchers wanted to estimate the size of the molecules at various locations relative to the central star.

Croiset's team used SOFIA to observe Nebula NGC 7023 with two instruments, the FLITECAM near-infrared camera and the FORCAST mid-infrared camera. SOFIA's instruments are sensitive to two wavelengths that are produced by these particular molecules, which can be used to estimate their size.

The team analyzed the SOFIA images in combination with data previously obtained by the Spitzer infrared space observatory, the Hubble Space Telescope and the Canada-France-Hawaii Telescope on the Big Island of Hawaii.

The analysis indicates that the size of the PAH molecules in this nebula varies with location in a clear pattern. The average size of the molecules in the nebula's central cavity, surrounding the illuminating star, is larger than on the surface of the cloud at the outer edge of the cavity.

In a paper published in Astronomy and Astrophysics, the team concluded that this molecular size variation is due both to some of the smallest molecules being destroyed by the harsh ultraviolet radiation field of the star, and to medium-sized molecules being irradiated so they combine into larger molecules. Researchers were surprised to find that the radiation resulted in net growth, rather than destruction.

"The success of these observations depended on both SOFIA's ability to observe wavelengths inaccessible from the ground, and the large size of its telescope, which provided a more detailed map than would have been possible with smaller telescopes," said Olivier Berne at CNRS, the National Center for Scientific Research in Toulouse, France, one of the published paper's co-authors.

Brian Krill: Evolution of the 21st-Century Scientist
Throughout the past year, I've struggled with how best to define myself as a scientist. At times, I even questioned whether it's appropriate to refer to myself as a scientist.

The thing is, even though I have a PhD in experimental psychology and multiple scientific publications to my name, I no longer work in a traditional scientific setting. That is to say, I don't teach or carry out my own research at a college or university, like many scientists. Indeed, according to a recent survey conducted by the National Science Foundation (NSF), roughly 45 percent of PhD recipients in science work at four-year educational institutions.
I freely chose to leave academia about a year and a half ago because, at the time, doing so was the best thing for my family. Nonetheless, the transition was difficult, especially after so much time devoted to preparation for what I thought would be my lifelong career—four years of undergraduate education followed by five years of graduate training and another two years of postdoctoral training.

After 18 months, I'm at last fully adjusted to life on the outside. But to get to this point, I’ve had to challenge my own heavily ingrained assumptions about what it means to be a scientist. For years, I had clung to the belief, likely held by so many others, that being a scientist necessarily means being an academic and a scholar. This view is wrong—more so now than ever—because the economic landscape for scientists is changing. As such, it might well be time for the entire scientific community to rethink what it means to be a scientist.

Simulated black hole experiment backs Hawking prediction
Prof Jeff Steinhauer simulated a black hole in a super-cooled state of matter called a Bose-Einstein condensate. In the journal Nature Physics, he describes having observed the equivalent of a phenomenon called Hawking radiation - predicted to be released by black holes. Prof Hawking first argued for its existence in 1974. "Classical" physics dictates that the gravity of a black hole is so strong that nothing, not even light, can escape. So Hawking's idea relies on quantum mechanics - the realm of physics which takes hold at very small scales. These quantum effects allow black holes to radiate particles in a process which, over vast stretches of time, would ultimately cause the black hole to evaporate.

But the amount of radiation emitted is small, so the phenomenon has never actually been observed in an astrophysical black hole. Prof Steinhauer, from the Technion - Israel Institute of Technology in Haifa, uncovered evidence that particles were spontaneously escaping his replica black hole. Furthermore, these were "entangled" (or linked) with partner particles being pulled into the hole - a key signature of Hawking radiation. The Bose-Einstein condensate used in the experiment is created when matter, in this case a cloud of rubidium atoms inside a tube, is cooled to near the temperature known as absolute zero, -273C.
In this environment, sound travels at just half a millimetre per second. By speeding up the atoms partway along the tube, to faster than that speed, Prof Steinhauer created a sort of "event horizon" for sound waves. It was packets of sound waves, called "phonons", that played the part of entangled particles on the fringe of a black hole.

The findings do not help answer one of the trickiest puzzles about black hole physics: the Information Paradox. One of the implications of Hawking's theory is that physical information - for example, about properties of a sub-atomic particle - is destroyed when black holes emit Hawking radiation.
But this violates one of the rules of quantum theory. Toby Wiseman, a theoretical physicist at Imperial College London, told BBC News: "Analogues are very interesting from an experimental and technological point of view. But I don't think we're ever going to learn anything about actual black holes [from these simulations]. What it is doing is confirming the ideas of Hawking, but in this analogue setting."
Dr Wiseman, who was not involved with the research, compared the idea of the Bose-Einstein condensate simulation to water in a bathtub. "It relies on the fact that there's a precise mathematical analogue between the physics of particles near black holes and ripples in flowing fluids... It's an elegant idea that goes back some way. "If you pull the plug in a bath, you create a flow down the plug, and the ripples on the water get dragged down the plughole. The flow gets quicker as it gets toward the plughole and if you have a system where the flow is going faster than the speed of the ripples, when those ripples flow past some point near the plughole, they can never come back out." Dr Wiseman said this point was equivalent to the event horizon - the point of no return for matter being drawn in by the gravity of a black hole.

Deuteron joins proton as smaller than expected
According to the international Committee on Data for Science and Technology (CODATA), the charge radius of the proton is 0.8768(69) fm. Few researchers would give that number much thought if not for measurements in 2010 and 2013 that yielded a radius 4% smaller than and 7.2 standard deviations distant from the CODATA value. Randolf Pohl of the Max Planck Institute of Quantum Optics in Garching, Germany, and colleagues obtained the curiously low radius after analyzing the energy-level shifts of muons orbiting hydrogen nuclei. With a mass 207 times that of the electron, a muon has a tighter orbital that more closely overlaps the nuclear charge distribution, which makes the negatively charged particle a useful tool for probing nuclear dimensions. The discrepancy between the results of muon-based and other experimental investigations has come to be known as the proton radius puzzle.

Now Pohl and his colleagues have used the same technique to measure the radius of the deuteron, a nucleus of one proton and one neutron. The researchers shot a beam of muons at a target of D2 gas. Lasers excited some of the atoms whose electrons were replaced by muons and probed the muons’ energy-level transitions. By combining the measurements with theory, the researchers came up with a deuteron charge radius of 2.12562(78) fm. That’s 7.5 σ smaller than the CODATA value (see graph below; the new result is in red). In addition, both the proton and deuteron sizes are in tension with the values obtained by applying the same technique to atoms with electrons rather than muons.
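The σ figures quoted above are, by one common convention, the difference between two central values divided by their combined uncertainty. A minimal sketch, using the 2010 CODATA proton radius 0.8775(51) fm and the 2013 muonic-hydrogen result 0.84087(39) fm (both supplied here as assumptions; the text quotes only the current CODATA number):

```python
# Quoting the disagreement between two measurements in combined standard
# deviations (sigma). The specific radii below are assumptions supplied
# for illustration, not taken from this article.
import math

def tension_sigma(v1, s1, v2, s2):
    """Discrepancy between two values, in units of combined uncertainty."""
    return abs(v1 - v2) / math.sqrt(s1**2 + s2**2)

# Proton: CODATA-2010 vs. muonic hydrogen, roughly the 7.2 sigma above
sigma = tension_sigma(0.8775, 0.0051, 0.84087, 0.00039)
print(f"{sigma:.1f} sigma")
```

Note that the quoted tension depends on which CODATA release and uncertainties are used, which is why slightly different σ values appear in different accounts of the puzzle.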

Scientists Identify 20 Alien Worlds Most Likely to Be Like Earth
Astronomers are narrowing the field in their search for a "second Earth."

An international team of researchers has identified the 20 most Earth-like worlds among the more than 4,000 exoplanet candidates that NASA's Kepler space telescope has detected to date, scientists report in a new study.

All 20 potential "second Earths" lie within the habitable zones of their sun-like stars — meaning they should be able to harbor liquid water on their surfaces — and are likely rocky, the researchers said.

Identifying these Earth-like planets is important in the hunt for alien life, said study lead author Stephen Kane, an associate professor of physics and astronomy at San Francisco State University (SFSU).

"[It] means we can focus in on the planets in this paper and perform follow-up studies to learn more about them, including if they are indeed habitable," Kane said in a statement.

Kane and his team sorted through the 216 habitable-zone Kepler planets and candidates found so far. (A "candidate" is a world that has yet to be confirmed by follow-up observations or analysis. Kepler has found about 4,700 candidates to date, more than 2,300 of which have been confirmed; about 90 percent of all candidates should eventually turn out to be the real deal, mission team members have said.)

Second-Earth candidates had to be safely within the habitable zone. If a planet is too close to the inner edge, it could experience a runaway greenhouse effect like the one that occurred on Venus. And if it's too close to the outer edge, the planet could end up being a frigid world like Mars, the researchers said.

‘Alien Megastructure’ Star Mystery Deepens After Fresh Kepler Data Confirms Erratic Dimming
The case of Tabby’s star keeps getting curiouser and curiouser. The star — formally known by its somewhat clunky name KIC 8462852 — has been a source of endless intrigue for astronomers since September, when its bizarre behavior triggered speculations over the existence of an alien civilization around it.

The WTF (Where’s the Flux) star is now back in the news, and it's more mysterious than ever.

Before we delve into the latest findings, released online through the preprint server arXiv, here is a quick lowdown of the story so far:

Last fall, a team of scientists led by Tabetha Boyajian from Yale University, who lends the object its informal name, “Tabby’s star,” reported that the star was not behaving as it should. Based on observations conducted using NASA's Kepler Space Telescope between 2009 and 2013, the team witnessed two unusual incidents, in 2011 and 2013, when the star's light dimmed in dramatic, never-before-seen ways.

This dimming indicated that something had passed in front of the star — located between the constellations Cygnus and Lyra. At the time, a swarm of comets was proposed as the most likely explanation.

However, this is not when Tabby’s star captured the public’s imagination. That happened a month later, in October, when Jason Wright, an astronomer from Penn State University, put forth the idea that the swarm of objects around the star is “something you would expect an alien civilization to build.”

In other words, he suggested that the swarm may be an “alien megastructure,” or a giant Dyson sphere, built by a technologically advanced species to harness the star’s energy.

Unfortunately, two subsequent independent searches, specially tailored to detect alien radio signals and laser pulses, drew a blank, and, earlier this year, a study based on analysis of photographic plates of the sky dating back to the late 19th century argued that even the comet swarm idea, which was the best of the remaining proposals, cannot explain the star’s erratic dimming — although the study’s findings were widely disputed.

In the new paper — which is yet to be peer-reviewed — Caltech astronomer Ben Montet and Joshua Simon of the Carnegie Institution detail their analysis of photometric data of the star gathered by the Kepler space telescope.

Their findings — the star was definitely dimming at a rate that defies explanation over the four years Kepler monitored it. For instance, in the first 1,000 days of Kepler’s observations, the star’s luminosity dipped by roughly 0.34 percent per year, before dropping dramatically by 2.5 percent in a span of just 200 days — something that suggests that the long-term dimming hypothesis may very well be true.

“We note these results are apparent in data from each individual detector, not just the combined light curve, suggesting that the decline in flux is an astrophysical effect rather than an instrumental one,” the researchers wrote in the paper. “We offer no definitive explanation that could explain the observed light curve in this work. The effect could be stellar in nature, although there are no known mechanisms that would cause a main-sequence F star to dim in brightness by 2.5 percent over a few months. The effect could also be caused by a passing dust cloud in orbit around KIC 8462852.”

Although the study seems to rule out all the possible explanations that have been put forward so far — a swarm of comets, planetary fragments, or a distorted star — it does not mean the dimming is caused by an alien megastructure. It just means we still don’t have an explanation for what is causing the star’s light pattern to dip erratically.

“The new paper states, and I agree, that we don’t have any really good models for this sort of behavior,” Wright told Gizmodo. “That’s exciting!”

Those looking for a satisfactory explanation would now pin their hopes on a team of researchers led by Boyajian, which recently ran a successful crowdfunding campaign to secure observing time at the Las Cumbres Observatory Global Telescope Network — a privately run network of telescopes set up around the globe to ensure continuous monitoring of an object.

For now, though, “the most mysterious star in our galaxy” continues to live up to its fame.

News Archives