# FROGDAB Week 1 – Astronomical – 31/01/2022

When we first began our task, I do not think we understood the astronomical task in front of us. I mean, we understood the astronomy behind it, but once we opened the data files in TOPCAT, hundreds of thousands of data points stared back at us, astronomical in number. Only a supercomputer would be able to analyze all of this data in the time we had, and my laptop, with its difficulty opening the file explorer, wasn't going to cut it. We needed a solution; conveniently, we are geniuses B).

There are many different ways of calculating the values of different attributes of black holes. But rather than list them all here only for you to forget them, I'll just show you which equation we were using at the time and how we went about calculating its variables.

## Overcomplicated – Luminosity Calculations

First is the following equation to calculate the mass of a black hole from its luminosity and the full width at half maximum (FWHM) of a peak in its spectrum:
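A commonly used single-epoch virial estimator of this kind (this is the general form, not necessarily the exact calibration we used) looks like:

$$
\log\left(\frac{M_{\rm BH}}{M_\odot}\right) = a + b\,\log\left(\frac{\lambda L_\lambda}{10^{44}\,\mathrm{erg\,s^{-1}}}\right) + 2\,\log\left(\frac{\mathrm{FWHM}}{\mathrm{km\,s^{-1}}}\right)
$$

where $a$ and $b$ (with $b \approx 0.5$) are calibration constants that depend on which emission line the FWHM is measured from.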

Okay, seems easy enough: we just find those values from our data sets and calculate us some masses, easy peasy.

Hang on a second. But wh- th- there’s how many?!?!

Our first job was clear: reduce this data set down to the usable data points, creating a much easier path towards testing our overall hypothesis (whilst also preventing a few fires starting in the Astro Lab at the same time). There are a few ways we found to do this:

1. Match up data from the Lega-C and COSMOS data sets, so that we only keep points for which we have all the data. This can be done by matching up the Right Ascension (RA) and Declination (DEC) coordinates from the different data sets; TOPCAT can do this with its match function, and the resulting file can be saved. This reduced our data from over 195,000 points to just over 3,700. Whilst this is significantly less data, it isn't all reliable yet, so…
2. Further cut down the data via their signal-to-noise ratio (SNR). The useful data coming from an astronomical object is called the signal, whilst the not-so-useful data is called noise; the ratio of the two gives an idea of how reliable the data being analyzed is. Typically we want an SNR greater than 1, as this means there is more signal than noise. By running a quick Python script to remove any data points with an SNR less than 1, we can cut the data down to about 2,400 points. Again, this is great, but the data could still be more useful to us…
3. Remove any values which have no flux in the F band. For our equations we need the spectra in the X-ray range, which corresponds to Chandra's F-band filter. The data is much more useful to us if it has a significant flux in the F band, hence we can use this to remove any data points with little to no flux. This finally cuts our data down to just 143 points, which is a much more manageable and also more reliable data set.
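Steps 2 and 3 can be sketched as a short Python script. The column names `SNR` and `flux_F` are assumptions for illustration; the real columns in the matched Lega-C/COSMOS table may be named differently.

```python
# Sketch of the data-reduction cuts (steps 2 and 3 above).
import pandas as pd

def reduce_catalogue(df: pd.DataFrame, min_snr: float = 1.0) -> pd.DataFrame:
    """Keep only rows with more signal than noise and some F-band flux."""
    df = df[df["SNR"] > min_snr]   # step 2: signal-to-noise cut
    df = df[df["flux_F"] > 0.0]    # step 3: must have flux in Chandra's F band
    return df

# Typical use, after saving the TOPCAT match result as a CSV:
# matched = pd.read_csv("legac_cosmos_matched.csv")
# reduced = reduce_catalogue(matched)
```

Doing the cuts as simple boolean filters like this makes it easy to tweak the thresholds later and re-run the whole reduction in seconds.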

And so those were our tasks for the week; Python scripts take a while to write (and April and Jonathan had to learn an entire new module of Python). Conveniently, there are six people in our group, so this isn't all we got up to…

## Get outta here – Accretion Rates

Another important quality of black holes is their ability to disperse material in their environments. Whilst their high density leads to most of the material being pulled in, it's also possible for black holes to perform a gravity assist (aka a gravity slingshot) and propel material away in other directions.

Gravity assists are very convenient things in astrophysics; they are used all the time in astronautics to give objects more speed and a new direction. Famously, one was used at Jupiter to send Voyager 1 on towards Saturn.

The same can happen to matter around black holes: it accelerates via the black hole's gravity and is flung back out into the surrounding environment. This is something we want to investigate as well: whether the amount of matter being flung back out is having an effect on the star formation rate within the galaxy. Conveniently, there are equations we can use to calculate the accretion rates of black holes, such as this one:
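A standard relation of this kind (again, the general form rather than necessarily our exact version) links the accretion rate to the bolometric luminosity through the radiative efficiency $\eta$:

$$
\dot{M} = \frac{L_{\rm bol}}{\eta c^2}
$$

where $\eta$ is the fraction of the accreted rest-mass energy that is radiated away, typically taken to be around 0.1.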

The bolometric luminosity is related to the luminosity in the X-ray band, a value we can find in our combined data set, and as such we can calculate the accretion rates:
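In code, the calculation might look something like the sketch below. The bolometric correction `K_X` and radiative efficiency `ETA` here are assumed round numbers for illustration, not the exact values from our analysis.

```python
# Sketch of the accretion-rate calculation, in cgs units.
C = 2.998e10    # speed of light, cm/s
ETA = 0.1       # assumed radiative efficiency
K_X = 22.4      # assumed X-ray bolometric correction, L_bol = K_X * L_X

SECONDS_PER_YEAR = 3.154e7
GRAMS_PER_MSUN = 1.989e33

def accretion_rate(l_x: float) -> float:
    """Accretion rate in g/s from an X-ray luminosity in erg/s."""
    l_bol = K_X * l_x            # scale X-ray up to bolometric luminosity
    return l_bol / (ETA * C**2)  # mass-energy conversion at efficiency ETA

def to_msun_per_year(mdot_g_s: float) -> float:
    """Convert an accretion rate from g/s to solar masses per year."""
    return mdot_g_s * SECONDS_PER_YEAR / GRAMS_PER_MSUN
```

For an X-ray luminosity of about $10^{44}\,\mathrm{erg\,s^{-1}}$, this gives an accretion rate of a few tenths of a solar mass per year, which is a sensible order of magnitude for an active galactic nucleus.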

You're probably thinking: "Wow guys, those correlate so well," and you are right, but that's because one is calculated from the other, with the scatter away from a perfect line coming from the changes in redshift.

So there we go, that's what the frog dabbers got up to this week! Join us next week, where we'll write some code to calculate the masses of black holes from both the H-beta and magnesium lines. Also, Fenn will drop in to tell us all about their method of calculating masses.

Farewell ’till then

April