Monday, October 21, 2013

Unexpected problems with image alignment

Continuing the development of pyAstroStack (I have to come up with a better name for it...), I ran into problems where I didn't quite expect them.

I'm also starting to find out that even though I thought I knew what happens during image registration and stacking, I'm actually missing quite a lot. I knew about the Bayer filter in DSLRs, but it seems I misunderstood how it's used. I thought a 12Mpix sensor had 12M red pixels, 12M blue pixels and 24M green pixels, when there actually are 3M red, 3M blue and 6M green. I also didn't realize that converting a raw photo into FITS or TIFF with DCRaw's (or Rawtran's) default settings doesn't give me the real raw data. So the first version of my program used interpolated colour data from the beginning. I started to fix this.

I had difficulties understanding Rawtran's switches, but I knew what I had to do with DCRaw in order to get debayered data from raws. Rawtran gave me FITS, but DCRaw gives PPM or TIFF. I wanted my program to output TIFF, so I decided to replace AstroPy.Fits with something that handles TIFF. I found Pillow. It can also import images into numpy.arrays, so I should be able to make the transition easily...

I made a lot of changes before running proper tests. I tried to include dark, bias and flat calibrations at the same time, and when I finally ran tests, all the resulting images were mostly black. I removed the calibration functions from my test program, but to no effect. I stretched the images to the extreme and found this:


It seems like registration fails. I really couldn't understand this since I hadn't touched anything related to registration; all the changes were in image loading and stacking. The weirdest thing to me was why alignment failed only for the Y-coordinates while X was OK. It took me a while to figure this out. I reverted to an older, working version of the code and started making the changes to it one by one, running tests after each change. Pillow and TIFF were the cause. I still couldn't understand why until I ran everything using FITS but wrote the output as TIFF. The result was a perfectly aligned image, upside down! Either TIFF orders coordinates differently than FITS, or the Pillow library builds the numpy.array with Y flipped, but now that I knew the cause it was simple to fix.

Star coordinates were fetched with SExtractor using FITS even when everything else used TIFF, so the Y-coordinate was always reversed. I simply changed the Y that SExtractor gave into Ymax - Y and everything worked.
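The fix itself is a one-liner; a sketch of both ways to do it (function names are hypothetical):

```python
import numpy as np

def flip_y(y, ymax):
    """Convert a bottom-origin (FITS-style) Y coordinate to the
    top-origin convention Pillow's arrays use: Y' = Ymax - Y."""
    return ymax - y

def flip_image(image):
    """Alternatively, reverse the row order of the array itself so the
    image data matches the FITS coordinates."""
    return np.flipud(image)
```

Flipping either the coordinates or the pixel rows works; the point is just to do one of them consistently.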


My plan was to have calibration done by now, but this mix-up of coordinates took more time than it should have. Reminds me how much of an amateur I still am...

What next?

Maybe now I can work on the calibration. From what I've understood, the procedure is
  • masterbias = stack(bias)
  • masterdark = stack(dark - masterbias)
  • masterflat = stack(flat - masterdark - masterbias)
  • stack((light - masterdark - masterbias)/masterflat)
It feels like the masterflat should be normalized. It doesn't make any sense to me to divide by values as big as (or bigger than) the ones found in the light images. Dividing by a flat normalized to [0, 1] feels like a better idea.
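The steps above, with the flat normalized as I'm guessing it should be, might look something like this (median as the stacking function and normalization by the flat's maximum are my assumptions, not anything I've verified yet):

```python
import numpy as np

def stack(frames):
    """Median-combine a list of frames (one simple choice of stacking)."""
    return np.median(frames, axis=0)

def calibrate(lights, biases, darks, flats):
    """Sketch of the calibration procedure described above."""
    masterbias = stack(biases)
    masterdark = stack([d - masterbias for d in darks])
    masterflat = stack([f - masterdark - masterbias for f in flats])
    masterflat = masterflat / masterflat.max()  # normalize to [0, 1]
    return stack([(l - masterdark - masterbias) / masterflat
                  for l in lights])
```

With the flat scaled to [0, 1], dividing brightens the vignetted corners instead of crushing the whole frame toward black.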

Also, colouring the images would be nice. As I said, the first images were made from interpolated raws, and now I'm using properly debayered data (I think). After calibration I should interpolate the monochromes into colour images with the correct Bayer mask. If I'm right about how it's done, it doesn't sound like a fast operation in Python. Perhaps PyCUDA here? An introduction to PyCUDA I read said it's at its best at calculations on numpy.arrays.
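I haven't worked out the interpolation itself yet, but at least separating a raw mosaic into its colour channels is cheap numpy slicing. A sketch, assuming an RGGB pattern (the actual pattern depends on the camera):

```python
import numpy as np

def split_bayer_rggb(raw):
    """Split a raw mosaic into R, G1, G2, B quarter-resolution planes,
    assuming an RGGB Bayer pattern:
        R G
        G B
    """
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    return r, g1, g2, b
```

The slicing is free; it's the interpolation back up to full resolution that might want PyCUDA.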