Check Out The Imaging Technology That Made The Black Hole Picture Possible
Until now, the existence of black holes remained a matter of theory. Black hole history dates back to the era of Albert Einstein, whose theory of relativity predicted them, though even he doubted that something so strange could manifest in real life. The idea of matter collapsing and then reaching a state of effectively infinite density, trapping even light, defines a black hole. For years scientists have been gathering facts and shreds of evidence to back the theory and cut the cord between black holes and myth.
Even before this result, NASA’s Hubble Telescope had produced an image suggesting the presence of a high-energy mass in space, as matter shattered and fell beyond an event horizon. Astronomers who kept a continuous eye on the phenomenon witnessed pulsating ultraviolet light fading out from extensive bundles of hot gas spinning around Cygnus XR-1, light that scientists believed was disappearing into a black hole.
— Heino Falcke (@hfalcke) April 11, 2019
The black hole image itself, however, was captured not by Hubble but by the Event Horizon Telescope (EHT), an array of eight radio telescopes spread across the planet, functioning as a single receiver tuned to very high-frequency radio waves. Teams of scientists took up the charge of transforming the captured radio signals into an image. True, the first image of a black hole has startled the world, yet the technology that made this mission possible deserves equal credit.
Coordinating the telescopes and then stitching their data together with an algorithm was a major challenge for the scientists. Led by an electrical engineer, the team used off-the-shelf products, along with the existing telescopes, to build the final model. Up to this point things may sound simple, but they were not. To convert the radio signals into an image, the scientists needed strong software tools and an efficient algorithm.
It meant capturing slices of data, small and sparse, and converting them into something more meaningful: an image of the black hole.
Decoding The Algorithm
The telescopes recorded photon signals, essentially shreds of light arriving from space. Even with a chain of telescopes connected, the data collected was still far too sparse to fill in a complete picture, so converting it into a live image was tough. The team worked on this problem and devised an algorithm that would not only transform the electrical signals into images but also single out the most credible image from the pool of reconstructions.
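The core difficulty described above can be illustrated with a toy sketch. This is not the EHT's actual pipeline (the real algorithm, CHIRP, is far more sophisticated); it is a minimal, invented example of the underlying problem: an array of telescopes effectively samples the 2-D Fourier transform of the sky at only a few points, and a naive reconstruction from those sparse samples is badly incomplete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 32x32 "true" sky: a bright ring, loosely evoking a
# black hole's photon ring (purely illustrative).
n = 32
y, x = np.mgrid[:n, :n]
r = np.hypot(x - n / 2, y - n / 2)
sky = np.exp(-((r - 8) ** 2) / 4.0)

# The full Fourier plane of the sky.
vis_full = np.fft.fft2(sky)

# Sparse sampling mask: keep only ~10% of Fourier points, mimicking
# the sparse coverage of a handful of telescopes on one planet.
mask = rng.random((n, n)) < 0.10
vis_sparse = np.where(mask, vis_full, 0)

# Naive reconstruction ("dirty image"): inverse FFT with the missing
# data treated as zero. Real pipelines must do much better than this.
dirty = np.fft.ifft2(vis_sparse).real
print(dirty.shape)
```

The point of the sketch is that `dirty` is a heavily degraded version of `sky`: most of the Fourier information was never measured, which is exactly the gap the reconstruction algorithm has to fill with careful assumptions.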
This is like crafting an image with almost nothing in hand. All you can do is make assumptions, and when multiple people are working, you are bound to have multiple assumptions, so finding the most appropriate one is yet another challenge. Thanks to the talented team on the project, each candidate image was tested to see which was most plausible; identifying the features shared by the reconstructions made it easier to weed out the irrelevant ones.
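One simple way to compare candidate images, sketched below under invented names and toy data (this is an illustration of the idea, not the EHT's actual validation procedure), is to score each candidate by how well its Fourier transform matches the measured samples at the points that were actually observed, and keep the best fit.

```python
import numpy as np

def fit_error(candidate, measured_vis, mask):
    """Mean squared mismatch at the sampled Fourier points only."""
    model_vis = np.fft.fft2(candidate)
    diff = (model_vis - measured_vis)[mask]
    return np.mean(np.abs(diff) ** 2)

# Toy 16x16 "source": a small disc (purely illustrative).
n = 16
y, x = np.mgrid[:n, :n]
truth = (np.hypot(x - n / 2, y - n / 2) < 4).astype(float)

# Simulated sparse measurements of the true source.
rng = np.random.default_rng(1)
mask = rng.random((n, n)) < 0.25
measured = np.where(mask, np.fft.fft2(truth), 0)

# Three hypothetical candidate reconstructions.
candidates = {
    "disc": truth,               # consistent with the data
    "blank": np.zeros((n, n)),   # no structure at all
    "noise": rng.random((n, n)), # a random guess
}
scores = {name: fit_error(img, measured, mask)
          for name, img in candidates.items()}
best = min(scores, key=scores.get)
print(best)  # the disc fits its own measurements best
```

In this toy setup the candidate that generated the data scores a perfect fit, while structureless or random candidates score worse, which is the basic logic of ruling out implausible reconstructions.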
No doubt this is a massive success for all the astronomers and scientists involved in the study of black holes. And it is a success not just in the black hole domain but also for the scientists and the technology that pulled the images out of the data. You need to get the tools right in order to get the result right. The team on the EHT project did an excellent job, not just by giving the world an image of a black hole but also by building technology that can efficiently convert signals into images, opening doors for new inventions in imaging technology.
Image Source – Washington Post