Posted: Tue Jun 15, 2010 7:16 pm Post subject: Overscanning
General question on overscan fitting and bias frame subtraction: why do people do both? It seems that simply subtracting a bias frame from one's program frames would accomplish the same thing, as the bias frames ought to have the same bias level as the program frames.
I also wondered how an overscan correction is (mathematically) applied to a science frame.
Just one note from me: the bias level is not the same throughout an entire run. Normally one takes bias frames before and after a run; just check the mean level across all the frames and you should see that these values vary with time.
For our CCD (SBIG STL-6303E) this variation can be as large as ~4 ADU when you compare frames from before and after the observation. So the mean level in an overscan region could serve as an initial guess or a per-frame zero-point for the reduction, but I'm sure someone knows the correct answer here.
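To illustrate the drift described above, here is a minimal sketch (with made-up levels and a simulated read-noise of 2 ADU, not real STL-6303E data) showing how the mean of each bias frame tracks a zero point that wanders by a few ADU over a run:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated bias frames: identical read-noise structure, but a zero
# point that drifts by a few ADU between the start and end of the run
# (the levels below are invented for illustration only).
levels = [1000.0, 1001.5, 1003.0, 1004.0]
bias_frames = [lvl + rng.normal(0.0, 2.0, (64, 64)) for lvl in levels]

# The per-frame mean averages down the read noise, so it follows the
# drifting zero point closely.
mean_levels = [frame.mean() for frame in bias_frames]
print(mean_levels)
```

Comparing the first and last means recovers the ~4 ADU drift, which is exactly what checking bias frames from before and after an observation would show.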
Right, as Antares says, the overscan accounts for any drift in the overall mean level with time. The bias frames capture the pixel-to-pixel structure, whilst the overscan supplies the overall additive zero-point correction for each frame.
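On the "how is it applied" question: a common recipe is to take a robust level (or a low-order fit) from the overscan columns of each frame, subtract that as an additive zero point, and then subtract the structured master bias (itself overscan-corrected the same way). A minimal sketch, assuming the overscan runs along columns and using a hypothetical function name:

```python
import numpy as np

def overscan_and_bias_correct(science, overscan, master_bias):
    """Sketch only: subtract a per-row overscan zero point, then the
    master bias structure. 'science' and 'master_bias' share a shape;
    'overscan' holds the overscan columns for the same rows."""
    # Robust per-row level from the overscan columns
    # (median resists cosmic-ray hits in the overscan)
    row_levels = np.median(overscan, axis=1)

    # Smooth the row levels with a low-order polynomial fit, so
    # overscan noise is not imprinted onto the science frame
    rows = np.arange(row_levels.size)
    coeffs = np.polyfit(rows, row_levels, deg=1)
    fitted = np.polyval(coeffs, rows)

    # Additive corrections: fitted zero point row by row,
    # then the pixel-to-pixel bias structure
    return science - fitted[:, np.newaxis] - master_bias
```

The key point is that both corrections are purely additive: the overscan term is re-measured on every frame (so it follows the drift), while the master bias contributes only the fixed spatial pattern.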