Ugly Hedgehog - Photography Forum
Astronomical Photography Forum
Work Flow capturing and processing data
Feb 4, 2024 12:48:24   #
alberio Loc: Casa Grande AZ
 
Post processing:
I'm struggling with my post processing and looking for others' recommendations on what to do after you get a series of images/subs. What is your workflow, especially when using a DSLR or mirrorless camera as opposed to a dedicated astro camera?
Gathering Data:
Is letting the camera apply high-ISO and long-exposure noise reduction for each sub useful, or is taking dedicated dark frames with a cover over the scope a better approach? Do you take a dark for each sub or just one at the end of the series?
I know these are out of order, but...

Reply
Feb 5, 2024 16:40:09   #
Ballard Loc: Grass Valley, California
 
alberio wrote:
Post processing: I'm struggling with my post pr... (show quote)


Hi alberio

For my DSLR shots I note the temperature at the time I take the light frames and take a series of dark frames after the light frames. I also keep a library of previous dark frames, noting the temperature at which they were taken; if the library contains a set within 5 degrees of the light frames, I usually just reuse those rather than take new ones. After a year or so I've heard it may be prudent to retake dark frames, since the sensor can change as it ages (so far I haven't found that I need to do that). I similarly keep a library of flats based on the camera and optics, including filters used, and bias frames based on the camera used. (Note: I have occasionally had to take a separate set of flats when dust bunnies get in there, and I use that as an indicator that it is time to clean the filters, optics, and/or sensor.)
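As a rough illustration of that dark-library lookup, here is a minimal Python sketch; the DarkSet record, the file paths, and the tolerance parameter are my own assumptions for the example, not part of any particular program.

```python
from dataclasses import dataclass

@dataclass
class DarkSet:
    """One set of dark frames in the library (hypothetical record)."""
    path: str          # directory holding the dark frames
    temp_f: float      # temperature noted when the darks were taken
    exposure_s: float  # exposure length of each dark
    iso: int           # ISO/gain setting

def find_reusable_darks(library, temp_f, exposure_s, iso, tolerance_f=5.0):
    """Return the closest-temperature matching dark set within tolerance, else None."""
    candidates = [d for d in library
                  if d.exposure_s == exposure_s and d.iso == iso
                  and abs(d.temp_f - temp_f) <= tolerance_f]
    if not candidates:
        return None  # no usable set: shoot fresh darks after the lights
    return min(candidates, key=lambda d: abs(d.temp_f - temp_f))

# Example: lights shot at 71 F, 120 s, ISO 800
library = [DarkSet("darks/2023-10-14_68F", 68.0, 120.0, 800),
           DarkSet("darks/2023-07-02_85F", 85.0, 120.0, 800)]
best = find_reusable_darks(library, temp_f=71.0, exposure_s=120.0, iso=800)
print(best.path if best else "take new darks")
```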

Note: I organize the images by deep sky vs. planetary and then have subdirectories for the object (e.g. M101).
I then typically have a subdirectory for the optical system used (e.g. DSLR_500mm, ZWO_meade16, etc.).
This is then followed by the raw DSLR images (or another set of directories for different exposure lengths and/or the individual filters used, containing either DSLR RAW frames or FITS images taken with an astro camera). I also keep the stacked and processed images at this level of the directory structure.
Under that are directories for the good frames, calibrated frames, debayered frames (for the DSLR), and registered frames.
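To make that layout concrete, here is a small Python sketch that creates one plausible version of the tree; the exact folder names are my own guesses, not a prescription.

```python
from pathlib import Path

# Hypothetical example: a DSLR run on M101 with the 500 mm lens
root = Path("deep_sky") / "M101" / "DSLR_500mm"

# Raw subs plus the per-stage working directories described above;
# stacked and processed results live at the optical-system level (root)
for sub in ["raw",
            "good",
            "good/calibrated",
            "good/calibrated/debayered",
            "good/calibrated/debayered/registered"]:
    (root / sub).mkdir(parents=True, exist_ok=True)

print(sorted(str(p.relative_to(root)) for p in root.rglob("*")))
```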

Below is the cheat sheet I use for processing images in PixInsight, so not all of it may be relevant to your process.

Integrate Bias frames (Use process ImageIntegration)
Image integration
Combination: Average
Normalization: No normalization
Weights: Don’t Care (all weights=1)
Scale evaluation: Median absolute deviation from the median (MAD) (note: this setting has been removed in newer versions of PixInsight)
Uncheck Evaluate noise; leave the other settings at their defaults

Pixel Rejection 1
Rejection algorithm: Sigma Clipping
Normalization: No normalization

Pixel Rejection 2
Sigma low : 3.0
Sigma high : 3.0

Notes:
(** Warning: Inconsistent Instrument:Telescope:FocalLength (FOCALLEN keyword) value(s) - metadata not generated.
** Warning: Inconsistent Instrument:Telescope:Aperture (APTDIA keyword) value(s) - metadata not generated)

*** Error: Parsing OBJCTRA FITS keyword: Parsing sexagesimal expression: empty string
*** Error: Parsing OBJCTDEC FITS keyword: Parsing sexagesimal expression: empty string

These errors are not fatal, so you can still integrate the frames. However, metadata for geodetic observer coordinates will be lost in the integrated image (which is not really important for Bias frames).


Add RAW files - hit go button (circle on bottom of process box)
Stretch rejection_high and rejection_low to check for any big issues.

Save- integrated image as master_bias.xisf
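To make the average combination with 3-sigma rejection above concrete, here is a minimal numpy sketch of what that kind of integration does per pixel. It is a single-pass simplification for illustration only, not PixInsight's actual ImageIntegration implementation.

```python
import numpy as np

def sigma_clipped_average(frames, sigma_low=3.0, sigma_high=3.0):
    """Average a stack of frames pixel by pixel, rejecting values more than
    sigma_low/sigma_high standard deviations from the per-pixel median."""
    stack = np.stack(frames).astype(np.float64)        # shape (n_frames, H, W)
    center = np.median(stack, axis=0)
    spread = np.std(stack, axis=0)
    keep = ((stack >= center - sigma_low * spread) &
            (stack <= center + sigma_high * spread))
    # Average only the values that survived rejection at each pixel
    return np.sum(stack * keep, axis=0) / np.maximum(keep.sum(axis=0), 1)

# Synthetic example: ten bias-like frames, one with a wild outlier pixel
rng = np.random.default_rng(1)
frames = [np.full((4, 4), 100.0) + rng.normal(0, 2, (4, 4)) for _ in range(10)]
frames[0][1, 1] = 5000.0                               # e.g. a cosmic-ray hit
master_bias = sigma_clipped_average(frames)
print(round(master_bias[1, 1], 1))                     # close to 100: outlier rejected
```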

Integrate Dark frames (same process and settings as Bias frames)
Image integration
Combination: Average
Normalization: No normalization
Weights: Don’t Care (all weights=1)
Scale evaluation: Median absolute deviation from the median (MAD) (note: this setting has been removed in newer versions of PixInsight)
Uncheck Evaluate noise; leave the other settings at their defaults

Pixel Rejection 1
Rejection algorithm: Sigma Clipping
Normalization: No normalization

Pixel Rejection 2
Sigma low : 3.0
Sigma high : 3.0

Notes:
Add RAW files, then hit the go button (circle on bottom of process box)
Stretch rejection_high and rejection_low to check for any big issues.
Save- integrated image as master_dark.xisf


*** Error: Parsing OBJCTRA FITS keyword: Parsing sexagesimal expression: empty string
*** Error: Parsing OBJCTDEC FITS keyword: Parsing sexagesimal expression: empty string

These errors are not fatal, so you can still integrate the frames. However, metadata for geodetic observer coordinates will be lost in the integrated image (which is not really important for Dark frames).
Calibrate Flats (Use process ImageCalibration)
Output Files
Output Directory: example J:/deep sky/M31 2min 71F/Good/Calibrated/flats
Sample format: 32bit floating point
Master Bias (put check in box)
Set location of master bias
Don’t set Calibrate, as this is only used with overscan on special CCDs.


Master Dark (do not check as there is no correlation to darks)
Master Flat (do not check as there is not a master flat to calibrate with)

Notes: Hit the go button (circle on bottom of process box). This fills the output directory with calibrated flats.

Integrate Calibrated flat frames (Use process ImageIntegration)

Image integration
Combination: Average
Normalization: Multiplicative
Weights: Don’t Care (all weights=1)
Scale evaluation: Median absolute deviation from the median (MAD)
Uncheck Evaluate noise; leave the other settings at their defaults

Pixel Rejection 1
Rejection algorithm: Sigma Clipping
Normalization: Equalize fluxes

Pixel Rejection 2
Sigma low : 3.0
Sigma high : 3.0

Notes: We use Equalize fluxes in this case.
Add RAW files, then hit the go button (circle on bottom of process box)
Stretch rejection_high and rejection_low to check for any big issues.

Save- integrated image as master_flat.xisf
Find best Light frames (Use process SubframeSelector)
Add the linear files (DSLR RAW light frames or mono .fits files) to SubframeSelector.
System Parameters
Subframe scale: (2.21 for 500mm lens, 1.106 with 2X teleconverter added, 0.396 for C11, 0.27 for LX200)
ZWO with 500mm: 1.55 arcsec/pixel
ZWO with 4096mm: 0.19 arcsec/pixel
[Arcseconds per pixel] = (206.2648 x [pixel size in μm]) / [focal length in mm] (worked example below)
Note: FOV 247.65 x 165.10 arcminutes for 500mm lens
123.83 x 82.55 arcminutes with teleconverter added
44.32 x 29.55 arcminutes for C11
EOS 5D Mark IV pixel size: 5.36 microns (6720 x 4480 pixels)
EOS 5D Mark IV camera resolution: 14 bit
SBIG 11002 camera resolution: 16 bit
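As a check on those numbers, here is a small Python sketch of that plate-scale formula and the resulting field of view. The pixel size and sensor dimensions are the ones quoted above; the 2800 mm focal length for the C11 is my assumption of its nominal value.

```python
def arcsec_per_pixel(pixel_um, focal_mm):
    """Plate scale: (206.2648 * pixel size in microns) / focal length in mm."""
    return 206.2648 * pixel_um / focal_mm

def fov_arcmin(pixels, pixel_um, focal_mm):
    """Field of view along one axis, in arcminutes."""
    return pixels * arcsec_per_pixel(pixel_um, focal_mm) / 60.0

# EOS 5D Mark IV: 5.36 micron pixels, 6720 x 4480 sensor
for name, focal in [("500mm lens", 500), ("500mm + 2x TC", 1000), ("C11", 2800)]:
    scale = arcsec_per_pixel(5.36, focal)
    print(f"{name}: {scale:.3f} arcsec/px, "
          f"FOV {fov_arcmin(6720, 5.36, focal):.1f} x {fov_arcmin(4480, 5.36, focal):.1f} arcmin")
```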


You can use FWHM or the weighting to decide on the best frames to use. (Also check the number of detected stars, to remove frames where clouds went by or the FWHM is high/bad.)
Typically use the default star detection parameters.
Use the “measure subframes” routine to measure the frames, then click on the frames to remove in the graph.
Use the “output subframes” routine to copy the accepted frames to the directory specified.
Rename the best sub as the reference frame to be used later in image registration.
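Here is a rough Python sketch of that kind of FWHM / star-count filtering; the measurement values are made up for illustration (SubframeSelector produces the real ones), and the cutoff values are arbitrary assumptions.

```python
# Hypothetical per-frame measurements: (filename, FWHM in arcsec, detected stars)
measurements = [
    ("light_001.cr2", 3.1, 1450),
    ("light_002.cr2", 3.0, 1510),
    ("light_003.cr2", 6.8, 1490),   # bloated stars: poor seeing or focus drift
    ("light_004.cr2", 3.2, 310),    # star count collapsed: clouds went by
]

FWHM_LIMIT = 4.0   # arcsec; reject frames with larger (worse) FWHM
MIN_STARS = 1000   # reject frames where clouds wiped out most stars

accepted = [(name, fwhm, stars) for name, fwhm, stars in measurements
            if fwhm <= FWHM_LIMIT and stars >= MIN_STARS]

# The sharpest accepted frame becomes the registration reference
reference = min(accepted, key=lambda m: m[1])[0]

print("accepted:", [name for name, _, _ in accepted])
print("reference frame for registration:", reference)
```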
Calibrate Light frames (Use process ImageCalibration)
Output Files
Output Directory: example J:/deep sky/M31 2min 71F/Good/Calibrated/lights
Sample format: 32bit floating point
Noise evaluation: Multiresolution Support

Uncheck CFA for monochrome frames; leave it checked for color DSLR RAW.
Master Bias (put check in box)
Set location of master bias
Don’t set Calibrate, as this is only used with overscan on special CCDs.
Master Dark (put check in box)
Set location of master dark
Set Calibrate to calibrate lights with dark data
Optimization threshold: 0
Don’t set Optimize.
Master Flat (put check in box)
Set location of master flat (don’t check calibrate as the flat already has this in it).

Notes: This will subtract the master bias from the master dark, and then that dark is subtracted from the lights (sketched below).
Hit the go button (circle on bottom of process box). This fills the output directory with calibrated light frames. Calibrate the light frames that were accepted by SubframeSelector.
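In equation form, the calibration above amounts to roughly the following. This is a simplified numpy sketch for illustration, not PixInsight's exact implementation (which also handles dark optimization, pedestals, and output scaling).

```python
import numpy as np

def calibrate_light(light, master_bias, master_dark, master_flat):
    """Simplified calibration: remove bias and dark signal, then divide by the
    normalized flat to correct vignetting and dust shadows."""
    dark_current = master_dark - master_bias            # dark with the bias removed
    light_cal = light - master_bias - dark_current      # equivalent to light - master_dark
    flat_norm = master_flat / np.mean(master_flat)      # flat already had bias removed
    return light_cal / flat_norm

# Tiny synthetic example, just to show the arithmetic
bias = np.full((4, 4), 100.0)
dark = bias + 5.0                                        # 5 ADU of dark current
flat = np.full((4, 4), 20000.0); flat[0, 0] = 15000.0    # a dust shadow in the flat
light = np.full((4, 4), 500.0) * flat / 20000.0 + dark   # sky signal shadowed by dust
print(calibrate_light(light, bias, dark, flat))          # uniform field, shadow corrected
```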


Debayer light frames (use process Debayer)
Don’t do this for monochrome images
Bayer/mosaic pattern: RGGB
Demosaicing method : VNG
Evaluate noise is checked
Noise evaluation: Iterative K-Sigma Clipping
Output directory set location to put debayered frames
Add the calibrated files from the calibrated-files directory; debayer after calibration.
Hit the go button (circle on bottom of process box).
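To show what the RGGB pattern means, here is a deliberately simple Python sketch that turns each 2x2 Bayer cell into one RGB pixel (a "superpixel" debayer at half resolution). The VNG method selected above interpolates to full resolution instead; this is only to illustrate where the R, G, and B samples sit in the mosaic.

```python
import numpy as np

def debayer_superpixel_rggb(cfa):
    """Collapse each 2x2 RGGB cell (R G / G B) into a single RGB pixel."""
    r  = cfa[0::2, 0::2]            # top-left of each cell
    g1 = cfa[0::2, 1::2]            # top-right
    g2 = cfa[1::2, 0::2]            # bottom-left
    b  = cfa[1::2, 1::2]            # bottom-right
    return np.dstack([r, (g1 + g2) / 2.0, b])

cfa = np.arange(16, dtype=float).reshape(4, 4)   # fake 4x4 RGGB mosaic
rgb = debayer_superpixel_rggb(cfa)
print(rgb.shape)                                  # (2, 2, 3)
```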


Register light frames (use process StarAlignment)
Reference image : Set reference image to best image found with subframe selector
Registration model: Projective Transformation
Working model : Register/Match Images
Check drizzle to generate drizzle data
Target image
Add all images that were accepted, calibrated, and (for color images) debayered.
Output Directory (set to location to save registered images).
Default other settings

Hit the go button (circle on bottom of process box).


Integrate Registered light frames (Use process ImageIntegration)
Image integration
Combination: Average
Normalization: Additive with scaling
Weights: Noise evaluation (PSF signal)

Pixel Rejection 1
Rejection algorithm: Sigma Clipping (use Winsorized Sigma Clipping for >40 images)
Normalization: Scale + zero offset

Pixel Rejection 2
Sigma low : 3.0
Sigma high : 3.0 (lower this to remove satellites at the expense of added noise)

Add registered files and Drizzle files if drizzle was set in registration
Hit the go button (circle on bottom of process box)
Stretch rejection high and rejection low to check for any big issues



For HDR, use StarAlignment on the unstretched images. Then use the HDR composition process to combine the images taken with different exposure times.

Noise Reduction with Multiscale Linear Transform (use process MultiscaleLinearTransform before stretching)

Use Dyadic with 4 layers
Scaling function: Linear interpolation
Set layer 1 (3.000,0.33,3)
Set layer 2 (3.000,0.33,3)
Set layer 3 (3.000,0.26,3)
Set layer 4 (3.500,0.20,3)

Reply
Feb 5, 2024 18:00:12   #
alberio Loc: Casa Grande AZ
 
Ballard wrote:
Hi alberio ... For my DSLR shots I would note the temperature at the time I took the light frames... (show quote)


Wow, I'm going to print this for future reference if that's OK. Thanks for taking the time.

Reply
 
 
Feb 5, 2024 21:06:42   #
Ballard Loc: Grass Valley, California
 
alberio wrote:
Wow, I'm going to print this for future reference if that's OK. Thanks for taking the time.


Hi alberio
You are very welcome. Feel free to print it out and give it to anyone that you think it would benefit.

Reply
Feb 7, 2024 10:46:03   #
Marc G Loc: East Grinstead, West Sussex, England
 
Ballard wrote:
Hi alberio ... For my DSLR shots I would note the temperature at the time I took the light frames... (show quote)


Great explanation for PixInsight users.

Reply
Feb 7, 2024 12:17:46   #
alberio Loc: Casa Grande AZ
 
Marc G wrote:
Great explanation for PixInsight users.


Hi Marc, do you use another? I just looked at Siril, and don't understand half of what they are talking about. What is a bias sub? I think I understand flats and darks. I get lucky with my simple techniques, but would like to step it up a bit. I'm only using a mirrorless camera, Canon R6, so I know I'm limited in the camera department. Also registering and calibrating?

Reply
Feb 7, 2024 12:35:59   #
Ballard Loc: Grass Valley, California
 
alberio wrote:
Hi Marc, do you use another? I just looked at Siril, and don't understand half of what they are talking about. What is a bias sub? I think I understand flats and darks. I get lucky with my simple techniques, but would like to step it up a bit. I'm only using a mirrorless camera, Canon R6, so I know I'm limited in the camera department. Also registering and calibrating?


Hi Alberio
My understanding is that bias frames are used to filter out the read noise generated when the data is read off the sensor. To take these, set your camera to the fastest shutter speed possible and put the lens cap on so it is completely dark. Then take a bunch of shots and stack them to make a master bias. Since the frames are so short they should have little to no temperature sensitivity and should reflect the background electrical noise from reading the sensor. I make separate bias frames for each camera. Like darks, it is recommended to take new ones after a year or so as the sensor ages (I haven't found that necessary for either my DSLR or my cooled camera yet). (Note: in my PixInsight instructions the image integration for bias frames uses different settings than for light frames; I haven't looked into how to do this with other programs.)

Reply
 
 
Feb 7, 2024 12:46:12   #
Marc G Loc: East Grinstead, West Sussex, England
 
alberio wrote:
Hi Marc, do you use another? I just looked at Siril, and don't understand half of what they are talking about. What is a bias sub? I think I understand flats and darks. I get lucky with my simple techniques, but would like to step it up a bit. I'm only using a mirrorless camera, Canon R6, so I know I'm limited in the camera department. Also registering and calibrating?


Yes, I use Astro Pixel Processor for stacking, calibration & initial processing.
Then switch to Photoshop.

I no longer use a DSLR, but to my understanding bias frames are used to calibrate read noise out.
Dithering frames can also assist with read noise.

Thankfully I now only have to shoot flats & dark flats, as well as the lights obviously.

Reply
Feb 7, 2024 12:58:51   #
alberio Loc: Casa Grande AZ
 
Ballard wrote:
Hi Alberio ... My understanding is that Bias frames are used to filter out read noise... (show quote)


Thanks Ballard, I'll get a master bias saved for this camera; it might help with some background patterns/artifacts the sensor might be integrating into the stack. Thanks for the help. Charlie (alberio)

Reply
Feb 7, 2024 13:02:58   #
alberio Loc: Casa Grande AZ
 
Thanks Marc, You and Ballard have given me some great info. You'd think as long as I've been taking astro images, I'd already know these things. I've just taken the easy way out.

Reply
Feb 7, 2024 13:36:48   #
Marc G Loc: East Grinstead, West Sussex, England
 
alberio wrote:
Thanks Ballard, I'll get a master bias saved for this camera, it might help with some background patterns/artifacts the sensor might be integrating into the stack. Thanks for the help. Charlie (alberio)


I'm not sure that you can build a bias library, cos read noise differs from frame to frame?
Dithering will certainly assist, cos the dither will shift the frame a few pixels, so the read noise will be different.
This is one of the reasons I switched to a dedicated astro cam.
Walking & bias noise is virtually non-existent.

Saying that, I still use a dark library even though my 269 doesn't produce amp glow, so darks aren't strictly needed.
But to me, temperature-controlled darks produce a better stacked image.

That leads me on to dark flats.
Dark flats are similar to bias frames, but instead of shooting at the fastest shutter speed, the exposure matches the flats (which are shot to the optimum ADU / full-well value).

Thankfully APT works out my flats / dark flats & shoots them. Typically a 2s exposure.

Reply
 
 
Feb 8, 2024 11:06:14   #
alberio Loc: Casa Grande AZ
 
Marc G wrote:
I'm not sure that you can build a bias library cos... (show quote)


Hi Marc,
I just downloaded APT. Just starting to learn a bit about it. Can a person load a series of previously captured fits files into APT to stack?

Reply
Feb 8, 2024 16:11:38   #
Marc G Loc: East Grinstead, West Sussex, England
 
alberio wrote:
Hi Marc,
I just downloaded APT. Just starting to learn a bit about it. Can a person load a series of previously captured fits files into APT to stack?


You mean APP?
Yes you can.
Recommend that you have associated calibration frames to stack too.
Saying that...
APP has brilliant, user-friendly linear processing tools:
light pollution removal, background extraction (gradients), star colour calibration, star removal.

Reply
Feb 8, 2024 19:56:11   #
alberio Loc: Casa Grande AZ
 
Marc G wrote:
You mean APP?
Yes you can.
Recommend that you have associated calibration frames to stack too.
Saying that..
APP has brilliant user friendly linear processing tools.
Light pollution, background extraction (gradients), star colour calibration, star removal.


No, I was referring to APT. So, APP is used for post processing and APT is for capturing?

Reply
Feb 9, 2024 05:39:04   #
Marc G Loc: East Grinstead, West Sussex, England
 
alberio wrote:
No, I was referring to APT. So, APP is used for post processing and APT is for capturing?


yes correct mate

Astrophotography Tool APT = capture etc
Astro Pixel Processor = post processing

Reply
Copyright 2011-2024 Ugly Hedgehog, Inc.