With the successful reduction of images in week 1, we thought this group project might be a breeze! As it turned out, we had hit a mountain of coding errors and problems. Emma and I had adapted a script to put our reduced images through SExtractor. This software detects sources in our images and, with the help of star catalogues, converts their counts to magnitudes. However, the table it produced gave results of zero… oh dear! As it turned out, the Isaac Newton Telescope (INT), where our images come from, assigns incorrect RA and Dec co-ordinates to our images.
A quick aside on FITS files: each image is contained in a format known as a FITS file. It stores the images from each CCD (the ‘camera’ of a telescope) of the INT’s Wide Field Camera. Each CCD has a ‘header’ which includes information such as the properties of its pixels. There is also the main header, which includes the very important RA and Dec co-ordinates for where the telescope was pointing when it took the images. SExtractor can read these values from the header, work out which part of the sky the images are from, and therefore know where the stars in the image are.
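To make the aside concrete, here is a toy parser (not real pipeline code) showing how keyword–value pairs like the pointing co-ordinates are stored in a FITS header as 80-character ASCII ‘cards’. The keyword names and values below are invented for illustration; in practice a library such as astropy.io.fits handles all of this for you:

```python
# A FITS header is a series of 80-character ASCII 'cards' of the form
# "KEYWORD = value / comment". This toy parser shows how software like
# SExtractor can pull pointing co-ordinates out of a header.
# (Keywords and values here are made up; real WFC headers differ.)

def parse_card(card):
    """Split one 80-character FITS header card into (keyword, value)."""
    keyword = card[:8].strip()        # keyword lives in the first 8 columns
    rest = card[8:]
    if not rest.startswith("="):
        return keyword, None          # comment-style card, no value
    value = rest[1:].split("/")[0].strip()  # drop the inline comment
    return keyword, value

# Two example cards, padded to the mandatory 80 characters:
cards = [
    "RA      =            161.26475 / [deg] telescope pointing".ljust(80),
    "DEC     =             59.68677 / [deg] telescope pointing".ljust(80),
]

header = dict(parse_card(c) for c in cards)
print(header["RA"], header["DEC"])
```

If the values written into these cards are wrong, everything downstream that trusts them (like SExtractor) is wrong too, which is exactly the problem we ran into.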
So without corrected RA/Dec values, our reduced images weren’t much use, as the main objective of this project is to use them to calculate magnitudes and thus plot HR diagrams of clusters so we can determine their properties. As such, we had to implement new code, provided to us by Dr Sobral, into our reduce.py script to fix these headers. Doing so required yet more code to split the 4 CCD images into separate FITS files. Some of this code was also deprecated, so it had to be rewritten using functions that were still supported. When all this was finally done and the errors sorted, it produced reduced images for each of the 4 CCDs! Science had been done! Giddy with the prospect of finally fixing the code, we opened the new images to find:
Unfortunately, as you can see, it has managed to completely ruin the images. Closer inspection showed that the flats seemed to be superimposed onto the original images. This seemed perplexing, as this new version of our reduction script used the same code as before to actually reduce the image. With our patience wearing dangerously thin, we desperately searched for a solution. By reading the actual count values of the images at each stage of the reduction, we found the cause: a correction being applied when the images are split into 4. This correction accounts for how the images were produced at the INT, but it should be applied to the unreduced images, the flats and the biases alike… which I had foolishly not done. By removing this correction for the time being, we re-ran the script:
I’d believe that when I saw it; the code had already lied to us once this week. We opened up the images to find:
Science had been done this time! However, this is not the end of our coding nightmare. The code will now have to be rewritten again to implement this correction properly. Then we will also have to add code that combines the 4 images into one using software known as SCAMP. Only then can we send these on to stage 2: SExtractor.
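The superimposed-flats bug can be illustrated with a toy reduction. This is not our actual reduce.py; the numbers are invented and the split-time correction is modelled as a simple gain factor, but it shows why a correction applied to the science frames alone leaves calibration structure behind in the reduced image:

```python
# Why applying a correction to the science frames but not to the flats and
# biases spoils the reduction. All numbers here are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
true_sky = np.full((4, 4), 1000.0)                  # what we want to recover
flat_response = rng.uniform(0.8, 1.2, size=(4, 4))  # pixel-to-pixel sensitivity
bias = 300.0

raw = true_sky * flat_response + bias               # what the CCD records
master_bias = np.full((4, 4), bias)
master_flat = flat_response                         # normalised flat field

# Correct reduction: the bias subtracts out and the flat divides out exactly.
reduced = (raw - master_bias) / master_flat

# Apply a (hypothetical) correction factor to the science frame only:
gain = 2.0
bad = (raw * gain - master_bias) / master_flat
# Now the bias no longer cancels, so a flat-shaped residual is
# superimposed on the image -- just like what we saw on screen.
```

Algebraically, `bad / gain` equals `true_sky + bias * (gain - 1) / (gain * master_flat)`: the leftover term carries the flat’s structure, which is why the flats appeared printed onto our images.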
Luckily, Emma had been grappling with SExtractor during the week and had managed to come up with a script that would process our reduced images and produce tables of data to plot on an HR diagram, including colour magnitudes! She has also created what are known as bad pixel maps. These are FITS files that define parts of an image that are too noisy and should not be analysed by SExtractor. Such noise can occur in the gaps between CCDs that have been joined together to make one image.
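A bad pixel map boils down to a boolean mask over the image. The sketch below is not Emma’s actual script; the image and the threshold are made up, but it shows the idea of flagging noisy pixels so they are excluded from analysis:

```python
# A bad pixel map in miniature: a boolean array the same shape as the image,
# True wherever a pixel should be ignored. The noisy column below stands in
# for the gap between two CCDs; the threshold value is invented.
import numpy as np

image = np.full((5, 5), 100.0)
image[:, 2] = 5000.0                  # noisy strip, e.g. a chip gap

bad_pixel_map = image > 1000.0        # True = do not analyse this pixel

# Statistics computed only over the good pixels:
good = image[~bad_pixel_map]
print(good.mean())                    # the noisy strip no longer biases the mean
```

In the real pipeline the mask is stored as a FITS file and handed to SExtractor, but the principle is the same: flagged pixels simply never enter the measurement.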
Armed with this, Emma used the third step in our image-processing code: a script which takes the tables made by SExtractor and produces HR diagrams such as our Result of the Week, made using reduced images given to us and shown below:
Coding was not all that occurred this week. The rest of the group has been tackling manual photometry in earnest. Essentially, this is doing by hand what the SExtractor script does, so that we understand the principles. Hamish and Tom spent the week sifting through the teething problems. These include finding appropriate catalogues with the software Topcat, which are then loaded into Gaia, the software that displays the image of the cluster in question. The zero-point must be found first. Then, by setting apertures around stars, the counts can be read off from the software. Care must be taken as to how large the aperture is set: too small and most of the flux of the star is missed; too large and the noise of the CCD and stray light from other stars will affect the measured counts.
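The by-hand recipe can be sketched in a few lines. The star profile and zero-point below are invented, but the counts-to-magnitude step, m = ZP − 2.5·log10(counts), is the standard one, and the final comparison shows the too-small-aperture problem described above:

```python
# Toy aperture photometry: sum the counts in a circular aperture around a
# star and convert to a magnitude with a zero-point. The Gaussian star and
# the zero-point value are invented for illustration.
import numpy as np

size = 51
y, x = np.mgrid[:size, :size]
cx = cy = size // 2
star = 2000.0 * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * 3.0 ** 2))

def aperture_counts(img, cx, cy, radius):
    """Sum the counts inside a circular aperture of the given radius."""
    yy, xx = np.indices(img.shape)
    r2 = (xx - cx) ** 2 + (yy - cy) ** 2
    return img[r2 <= radius ** 2].sum()

zero_point = 25.0                      # hypothetical calibration constant
counts = aperture_counts(star, cx, cy, radius=10)
magnitude = zero_point - 2.5 * np.log10(counts)

# Too small an aperture misses most of the star's flux,
# so the star looks fainter than it really is:
assert aperture_counts(star, cx, cy, radius=2) < counts
```

Real images add the complications the group wrestled with, background noise and neighbouring stars leaking into large apertures, which is why choosing the aperture size is a balancing act.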
We prepare to enter the third week of the lab stage hopeful that science will soon be done!