
How Did We Actually Take A Picture Of A Black Hole?

Four years ago, the first image of the supermassive black hole (SMBH) at the heart of the Messier 87 galaxy (M87) proved the Internet could be broken by something other than a celebrity’s thirst trap. The excitement was followed up last year by the imaging of the much smaller, but still unimaginably vast, SMBH in our own galaxy, Sagittarius A*. The work continues, with the original data processed by AI this year to make the image much sharper. More examples are coming, so perhaps now is a good time to consider how it is done.

The first thing to note is that none of these images are actually of a black hole itself. The defining feature of a black hole is gravity so intense that even light cannot escape, so we can't see one directly no matter what instruments are used. However, black holes, particularly SMBHs, are often surrounded by accretion disks that radiate from just outside the event horizon – the point of no return. These disks can be very bright, and if the orientation is right, the dark shadow of the hole stands out against them.


Despite this brightness, SMBHs' accretion disks aren't easy to see. There's a reason (actually quite a few) why these images required some of the largest collaborations in the history of astronomy, even if not quite on the scale of the study of the first kilonova.

For one thing, M87* (the asterisk differentiates the SMBH from its galaxy) is a very long way away. Fifty-four million light-years, to be as precise as we currently can be. Although the accretion disk is vast by the standards of our solar system – a few light-days across – it's still very hard to resolve at that distance. Sagittarius A* is 2,000 times closer, but a great deal of dust, along with intervening stars, blocks our view.
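To put a number on "very hard to resolve", here's a quick back-of-the-envelope calculation in Python (the three-light-day size is an illustrative stand-in for "a few light days", not a measured value):

```python
# Rough apparent size of M87*'s bright region as seen from Earth.
# Assumption: an emitting region ~3 light-days across (illustrative).
DISTANCE_LY = 54e6        # distance to M87* in light-years
SIZE_LIGHT_DAYS = 3       # assumed size of the emitting region
DAYS_PER_YEAR = 365.25

size_ly = SIZE_LIGHT_DAYS / DAYS_PER_YEAR   # size in light-years
angle_rad = size_ly / DISTANCE_LY           # small-angle approximation
RAD_TO_MICROARCSEC = 180 / 3.141592653589793 * 3600 * 1e6

print(f"Apparent size: ~{angle_rad * RAD_TO_MICROARCSEC:.0f} microarcseconds")
# ~31 microarcseconds – roughly the apparent size of an orange on the Moon.
```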

In order to get some resolution out of something that far away, it would be ideal to have an immensely large telescope – say, one the size of the Earth. That would be a tad pricey, even if no one mistook it for a Death Star and bombed its garbage chute.

Instead, astronomers got eight radio telescopes scattered across the planet to work together in what they called the Event Horizon Telescope (EHT). Just as the distance between your eyes gives you depth perception, the separation between these telescopes provides a baseline that makes far higher resolution possible.
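For scale, the diffraction limit θ ≈ λ/D sets the finest detail a telescope – or an array acting as one – can resolve. The EHT observes radio waves about 1.3 millimeters long, so an Earth-sized baseline gives:

```python
# Diffraction-limited resolution: theta ≈ wavelength / diameter (radians).
WAVELENGTH_M = 1.3e-3        # EHT observing wavelength, ~1.3 mm (230 GHz)
EARTH_DIAMETER_M = 12.742e6  # an Earth-sized baseline

theta_rad = WAVELENGTH_M / EARTH_DIAMETER_M
theta_microarcsec = theta_rad * (180 / 3.141592653589793) * 3600 * 1e6

print(f"Resolution: ~{theta_microarcsec:.0f} microarcseconds")
# ~21 microarcseconds – fine enough to resolve M87*'s ~40-microarcsecond ring.
```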


Your brain has a few hundred million years of evolution behind it when it comes to merging the images produced by your two eyes. Telescopes do the equivalent through interferometry, which relies on the way the peaks and troughs of electromagnetic waves affect each other, creating an intensity pattern based on the differences in phase between the waves. The technique was pioneered with instruments like the Very Large Array, which uses 27 antennae on rails in the New Mexico desert. The radio waves each dish collects are brought together so precisely that their peaks combine to produce detail far beyond the capacity of any one dish individually.
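As a toy illustration of the principle (nothing like a real correlator's signal chain), here is how the combined intensity of two equal-amplitude waves depends on their phase difference:

```python
import numpy as np

# Toy two-element interferometer: summing two equal-amplitude waves.
# The combined intensity depends on the phase difference between them,
# which in turn encodes the geometry of the source and the baseline.
phase_difference = np.linspace(0, 4 * np.pi, 9)
amp1, amp2 = 1.0, 1.0   # signal amplitudes at the two dishes

# I = I1 + I2 + 2*sqrt(I1*I2)*cos(delta_phi)
intensity = amp1**2 + amp2**2 + 2 * amp1 * amp2 * np.cos(phase_difference)

for phi, i in zip(phase_difference, intensity):
    print(f"phase diff {phi:5.2f} rad -> combined intensity {i:.2f}")
# The alternating peaks (constructive) and nulls (destructive) form the
# fringe pattern from which fine detail is reconstructed.
```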

Today, very-long-baseline interferometry allows us to combine telescopes half a world apart. It takes phenomenal computing power to produce images from signals recorded so far apart, but as that power has become more available, astronomers have been able to perform the feat across ever more widely separated locations.
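Much of that computing goes into correlation: sliding the recorded signals past each other to find the delay at which they line up. A minimal sketch with synthetic data (real correlators chew through enormous voltage streams, not toy arrays):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "sky signal" seen by two stations, the second delayed by
# 25 samples; each station also adds its own noise.
n = 5_000
signal = rng.normal(size=n)
delay_samples = 25
station_a = signal + 0.5 * rng.normal(size=n)
station_b = np.roll(signal, delay_samples) + 0.5 * rng.normal(size=n)

# Cross-correlate and find the lag of maximum correlation.
correlation = np.correlate(station_b, station_a, mode="full")
lags = np.arange(-n + 1, n)
best_lag = lags[np.argmax(correlation)]

print(f"Station B lags station A by {best_lag} samples")  # ~25
```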

In the case of the EHT, that meant instruments in Hawaii, California, Arizona, Mexico, Chile, Greenland, Spain, and France. Radio telescopes aren’t as susceptible to clouds as optical instruments are, but storms or even high winds can certainly interfere. Since the observations needed to be performed simultaneously, the project had to wait for calm conditions at every site at once.

Interferometry allows us to create images of objects that would otherwise appear too small to see, but it's a complex process.

Image Credit: National Radio Astronomy Observatory (CC BY 3.0)

Transmitting the data between the telescopes would have far exceeded the capacity of intercontinental networks, so it was stored on sets of hard drives that had to be physically brought together in one place. Each observation was timestamped to the nanosecond by atomic clocks. When the recordings were merged, allowance was made for the different times the radio waves, traveling at the speed of light, took to reach each instrument.
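In the simplest geometric picture, that extra travel time is just the projection of the baseline between two stations onto the direction of the source, divided by the speed of light. An illustrative sketch (all values made up):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Geometric delay: tau = (baseline . source_direction) / c
# Baseline vector from station 1 to station 2, in meters (illustrative).
baseline = np.array([4.0e6, 2.0e6, 1.0e6])

# Unit vector toward the source (illustrative direction).
source_dir = np.array([0.3, 0.5, 0.81])
source_dir = source_dir / np.linalg.norm(source_dir)

tau = baseline @ source_dir / C
print(f"Geometric delay: {tau * 1e3:.3f} milliseconds")
# The same wavefront arrives at the two stations milliseconds apart; the
# nanosecond timestamps give the correlator its starting point for
# aligning the recordings precisely.
```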


Even with all this observational capacity, astronomers couldn't simply combine the radio waves the telescopes collected and convert them into an image accessible to our eyes. The raw product was far too unclear for that.

Interference created by everything from our own atmosphere to material around M87's crowded galactic center had to be identified and removed. Even differences in atmospheric pressure between the sites at the time of observation had to be allowed for. All of this was even harder for Sagittarius A*, since there is far more intervening material in the way.

Finally, the EHT team compared the observations with computer models built on decades of trying to understand how black holes warp the space around them and the expected behavior of material in the accretion disk. This relied on what we know, or think we know, about the way matter that hot behaves under a combination of powerful gravitational and magnetic fields.

This level of uncertainty is why AI could make the same image so much clearer after learning from 30,000 simulated images of event horizons to find common patterns.
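The sharpening algorithm reportedly worked by using principal-component analysis to distill those simulated images into a compact set of building-block patterns, then representing the observation using only those patterns. A toy sketch of that idea, with random arrays standing in for the simulation library:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder "simulation library": in reality, ~30,000 physically
# simulated images of event horizons; here, random stand-in images.
n_images, side = 500, 32
library = rng.normal(size=(n_images, side * side))

# Learn the dominant patterns via principal-component analysis (SVD).
mean_image = library.mean(axis=0)
_, _, components = np.linalg.svd(library - mean_image, full_matrices=False)
basis = components[:20]          # keep the 20 strongest patterns

# "Sharpen" a noisy observation by projecting it onto the learned basis:
# whatever the patterns can't represent (much of the noise) is discarded.
observation = library[0] + rng.normal(scale=2.0, size=side * side)
coeffs = basis @ (observation - mean_image)
reconstruction = mean_image + coeffs @ basis

print("Error before:", np.linalg.norm(observation - library[0]).round(1))
print("Error after: ", np.linalg.norm(reconstruction - library[0]).round(1))
```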


The same interferometric approach has since allowed astronomers to turn these telescopes back to M87* and reveal the jets produced as the SMBH feeds on dismembered stars.

