FROGDAB Week 2 – The Big Bois – 07/02/2022

Hello again friends!

April here, back again to tell you all about what the Frog Dabbers have been up to this week. We've got some good stuff to report, so I hope you're all buckled in. First off we're going to look in on Fenn, who's found some interesting methods of calculating the masses of black holes:

When the speed separates idk – Velocity Dispersion

Whilst I was doing some research into methods of calculating black hole masses I came across this very interesting method of relating the velocity dispersion to the mass of a black hole.

The velocity dispersion? What’s that??

It's the dispersion (the spread of values) of the velocities around the mean velocity for an astronomical object! In our case it's for the galaxy surrounding the black hole.

Interesting stuff! So how can we use that to calculate the masses of black holes?

So by plugging the values of sigma (the velocity dispersion) into [Equation 1 or 2] we can calculate the mass!

Thanks Fenn! So the equations we can use are as follows:

This is a super convenient equation because as it happens we have a whole column in our data set full of the sigma values for the black holes. All we have to do is plug in the values and we get the masses of all the black holes which have available velocity dispersion data!
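Since the post doesn't reproduce the equations themselves, here's a rough sketch of what that "plug in sigma, get a mass" step looks like in Python. M–σ relations are generally power laws; the coefficients below are one published calibration (Gültekin et al. 2009) used purely as an illustration, and the actual equation the team used may differ:

```python
import numpy as np

def msigma_mass(sigma_kms, alpha=8.13, beta=4.02):
    """Black-hole mass (in solar masses) from stellar velocity dispersion.

    Assumes an M-sigma power law of the form
        log10(M_BH) = alpha + beta * log10(sigma / 200 km/s).
    alpha and beta here are illustrative values from one published fit;
    swap in whichever calibration the project's [Equation 1 or 2] uses.
    """
    sigma_kms = np.asarray(sigma_kms, dtype=float)
    return 10 ** (alpha + beta * np.log10(sigma_kms / 200.0))

# Vectorised, so a whole catalogue column of sigma values goes in at once:
masses = msigma_mass([150.0, 200.0, 250.0])
```

Because the function is vectorised with NumPy, the whole sigma column from the data set can be passed in one call, which is exactly the "plug in the values and get all the masses" convenience described above.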

Meanwhile – Luminosity Masses

Finding the masses of black holes from their luminosity in specific bands, however, is not as easy. Our first step is to actually find the different flux (and thereby luminosity) values of the spectra from the black holes. Once again, by pure coincidence, someone has already done half of the work for us:

This is the ESO Archive website (https://archive.eso.org/scienceportal/home?data_collection=LEGA-C):

It's a very convenient tool for matching the LEGA-C data we already have to actual spots in the sky. All we do is input the coordinates of a datapoint from our data set into the search bar at the top. ESO then finds available spectral data from near that point (usually within hundredths or even thousandths of an arcsecond), which means that we can obtain the flux values for all the datapoints we have information on.

Thanks to the excellent work of Adrien and Jacob we were able to attach the names of the spectrum data files to the coordinates of the data points. This is very useful in coding terms as it means we can easily iterate through all the data points and find the corresponding spectral data. Speaking of which:
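To show why attaching filenames to coordinates is so handy in code, here's a minimal sketch of the iteration pattern. The column names and filenames are made up for illustration; the real crossmatch table from Adrien and Jacob will look different:

```python
import pandas as pd

# Hypothetical crossmatch table: each data point's sky coordinates
# alongside the name of its matching spectrum file.
catalogue = pd.DataFrame({
    "ra":        [150.1, 150.3],      # right ascension (degrees)
    "dec":       [2.2, 2.4],          # declination (degrees)
    "spec_file": ["spec_0001.fits", "spec_0002.fits"],
})

# With the filename attached to each row, visiting every data point's
# spectrum is a simple loop:
for row in catalogue.itertuples():
    print(f"({row.ra}, {row.dec}) -> {row.spec_file}")
    # spectrum = fits.open(row.spec_file)  # e.g. via astropy.io.fits
```

The `fits.open` line is commented out since it needs the actual files on disk, but that's the shape of the "iterate through all the data points and find the corresponding spectral data" step.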

Figure 3 shows what one of the spectra looks like: it's a big ol' mess of different wavelengths having different fluxes (and this is one of the spectra with a high signal-to-noise ratio; imagine what a low one looks like). It doesn't mean much to us in its current state, as the redshift hasn't been taken into account. As astronomical objects move towards or away from us, their emitted wavelengths get compressed or stretched respectively. This means the spectra we collect with our telescopes sit at the wrong wavelengths compared to what's actually being emitted. We can fix this, though, through equation 3:

So if we divide all of the observed wavelengths by z+1 and then zoom in on whichever peak we want to look at (in this case it's Magnesium), we get this:
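The de-redshifting step above is a one-liner in Python. Here's a sketch; the Mg II wavelength (~2799 Å) and the redshift z = 0.8 are illustrative numbers, not values from the actual data set:

```python
import numpy as np

def to_rest_frame(wavelengths_obs, z):
    """De-redshift observed wavelengths: lambda_rest = lambda_obs / (1 + z)."""
    return np.asarray(wavelengths_obs, dtype=float) / (1.0 + z)

# Illustrative example: the Mg II feature is emitted at ~2799 Angstroms.
# For a galaxy at z = 0.8 it would be observed near 5038 Angstroms;
# dividing by 1 + z recovers the rest-frame wavelength.
obs = np.array([5038.2])
rest = to_rest_frame(obs, z=0.8)
```

Applied to the whole wavelength axis of a spectrum, this shifts every peak back to its rest-frame position, after which zooming in on the Magnesium feature is just a matter of slicing the array around the right wavelength range.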

But what do we do with this information? You’ll have to tune in next week to find out!

Auf Wiedersehen ma dudes 🙂