The Solar-Stellar Spectrograph

[ Site by Jeffrey Hall | Research funded by NSF ]


Overview of the data reduction procedure

Introduction

The descriptions in this article and in the detailed procedure descriptions assume familiarity with spectroscopic data reduction. However, I have also tried to include layman-level descriptions of what we're doing to our data, highlighted in green text. If you work with astronomical data regularly, you can probably skip these descriptions; if not, I hope they help clarify our methods.

 

Format of the "old" data (1993-2007)

Raw SSS data frames taken with the old CCDs are 512 x 400 arrays of 14-bit integer data. An individual frame is a grafted composite of the 19 good orders imaged by the echelle, plus the single Ca II H&K order. We reduce these frames to spectra suitable for analysis using object-oriented, in-house software written in IDL; no "black box" methods are applied to the data at any point. Analysis and generation of plotted time series are also done with in-house IDL routines. In this part of the SSS Web site, we describe the reduction and analysis procedure. Below we provide an overview of the procedure, which follows fairly typical practice for multiaperture spectroscopic data.

 

Format of the "new" data (2008+)

Raw SSS data frames taken with the new CCDs are 512 x 400 arrays of 16-bit floating-point data. An individual frame is a grafted composite of the 15 good orders imaged by the echelle, plus the single Ca II H&K order. We reduce these frames to spectra suitable for analysis using the same IDL procedures that are used for the old data. The only things that have changed are the order map and the wavelength solution; all other extraction and analysis methods are identical, to preserve continuity of the data.

 

Summary of the reduction procedure

Every data frame -- solar and stellar -- is reduced to a continuum-normalized 20-order or 16-order spectrum in exactly the same way. In some steps (e.g., removal of scattered light) the red orders and the single blue order must be treated separately, since although they are all part of a single data frame, they originate from two different CCDs.

1. Preprocessing: Integer data frames are converted to floats, and FITS headers are prepended using information from the electronic logs.

2. Debiasing: We remove the bias by subtracting a scalar value from the red and blue parts of the frame. The value subtracted is obtained from an average ADU value in a non-illuminated part of the frame.
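In layman's terms: the camera adds a constant electronic offset to every pixel, and we measure that offset where no light falls and subtract it. A minimal sketch of the idea (our pipeline is in IDL; this illustration is Python with numpy, and the frame values and the location of the non-illuminated strip are invented, not the real SSS geometry):

```python
import numpy as np

# Hypothetical 512 x 400 frame: an illuminated region plus a
# non-illuminated strip at the right edge (columns 500+ here; the
# real strip location depends on the detector).
rng = np.random.default_rng(0)
frame = rng.normal(1000.0, 5.0, size=(400, 512))
frame[:, 500:] = rng.normal(100.0, 2.0, size=(400, 12))

# Bias level = average ADU value in the non-illuminated strip,
# subtracted from the whole frame as a single scalar.
bias = frame[:, 500:].mean()
debiased = frame - bias
```

In practice the red and blue parts of the frame get their own scalar, since they come from different chips.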

3. Order map creation: We employ a three-pass tracing procedure to locate orders on a flat field frame. The order map thus created is used for any solar or stellar data for which the generating flat field is applicable.
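The core idea of a tracing pass, reduced to a single order and a single pass (the real procedure is three passes over many orders, in IDL; the synthetic frame, order width, and window size below are all invented for illustration): centroid the cross-dispersion profile column by column, using the previous column's center to place the search window.

```python
import numpy as np

# Synthetic flat with one bright order drifting slowly across the chip,
# a stand-in for the real multi-order echelle flat.
ny, nx = 60, 512
rows = np.arange(ny)[:, None]
true_center = 30.0 + 0.01 * np.arange(nx)      # order center per column
flat = 5000.0 * np.exp(-0.5 * ((rows - true_center) / 2.0) ** 2)

# One tracing pass: centroid each column's profile within a window
# around the previous column's center; the pipeline iterates this.
trace = np.empty(nx)
center = true_center[0]
for x in range(nx):
    lo, hi = int(center) - 5, int(center) + 6
    w = flat[lo:hi, x]
    trace[x] = np.sum(np.arange(lo, hi) * w) / np.sum(w)
    center = trace[x]
```

The resulting per-column centers, fit with a smooth function, become the order map used for extraction.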

4. Flat field creation: Flat field spectra are created by extracting the spectrum of the flat field (quartz lamp) image. We also determine and save the shape of the flat field spectrum by fitting a spline to the extracted flats.

5. Flat fielding of target frames: We remove pixel-to-pixel gain variations from the raw target frames by dividing each order by the analogous order from the flat field spectrum.

6. Removal of scattered light: We have adopted a method of creating order base traces to remove the scattered light from the frame. This is reasonably straightforward for the "blue" data, comprising the single H&K order, and much less so for the "red" echelle data, which have low-level ghost orders between the main orders that complicate the process.
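The essence of the base-trace approach, in a heavily simplified numpy sketch (the frame geometry, gap positions, and the per-column linear fit below are all invented; the real red-order case is complicated by the ghost orders mentioned above): sample the frame between the orders, where only scattered light contributes, fit a smooth function, and subtract it.

```python
import numpy as np

# Synthetic frame: three bright orders on a smooth scattered-light ramp.
ny, nx = 80, 200
y = np.arange(ny)[:, None]
scattered = 50.0 + 0.2 * np.arange(nx)
orders = sum(3000.0 * np.exp(-0.5 * ((y - c) / 2.0) ** 2)
             for c in (20, 40, 60))
frame = orders + scattered

# "Base trace" idea: rows between the orders see only scattered light;
# fit a smooth (here linear) function to them in each column and subtract.
gap_rows = [10, 30, 50, 70]
base = np.array([np.polyval(np.polyfit(gap_rows, frame[gap_rows, x], 1),
                            np.arange(ny).astype(float))
                 for x in range(nx)]).T
descattered = frame - base
```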

7. Spectrum extraction: The spectra are extracted from the "descattered" frames using a profile-weighted extraction algorithm over order apertures nine pixels wide.
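In layman's terms: instead of simply summing the nine pixels across each order, each pixel is weighted by how much of the order's light it should contain, which down-weights the noisy aperture edges. A noiseless numpy sketch of profile-weighted extraction (the Gaussian profile and flux values are invented; with noise the weighting is what improves the signal-to-noise over a straight sum):

```python
import numpy as np

# One synthetic order cutout: 9-pixel-wide aperture, Gaussian
# cross-dispersion profile, flux varying along the dispersion direction.
ny, nx = 9, 100
y = np.arange(ny)[:, None]
profile = np.exp(-0.5 * ((y - 4.0) / 1.5) ** 2)
profile /= profile.sum(axis=0)                 # unit-normalized profile
true_flux = 1000.0 + 500.0 * np.sin(np.linspace(0.0, 4.0, nx))
order = profile * true_flux                    # noiseless order image

# Profile-weighted extraction: weight by the profile, then normalize.
weights = profile
flux = np.sum(weights * order, axis=0) / np.sum(weights * profile, axis=0)
```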

8. Wavelength calibration: Velocity information is not critical to the stellar cycles program. We set the spectra of Sun-like stars to zero velocity for easy measurement of the line time series, though we preserve velocity information wherever needed (e.g., for binaries). The wavelength calibration itself is derived from exposures of a thorium-argon hollow cathode and is applied to every spectrum after extraction.
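Setting a spectrum to zero velocity is a simple Doppler correction of the wavelength scale: if the measured radial velocity is v, divide the calibrated wavelengths by (1 + v/c). A sketch with invented numbers (the wavelength range and velocity below are illustrative only):

```python
import numpy as np

c = 299792.458                                  # speed of light, km/s

# Hypothetical extracted order with a Th-Ar-derived wavelength solution.
wave = np.linspace(6540.0, 6580.0, 512)         # Angstroms

# Shift to zero velocity for a star with measured radial velocity v.
v = 25.0                                        # km/s, illustrative
wave_rest = wave / (1.0 + v / c)
```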

9. Continuum normalization: We normalize the echelle spectra using cubic spline fits to predefined continuum points. The heavily blanketed HK spectra are much trickier to normalize, and our procedure involves making a linear fit to two reference points on either side of the HK lines whose absolute intensities are known, followed by a check of the consistency of non-variable parts of the spectrum over long time intervals.
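In layman's terms: we fit a smooth curve through parts of the spectrum known to be free of absorption lines, then divide by it, so the continuum sits at 1 and line depths can be compared between spectra. A minimal numpy sketch of the echelle case (the continuum shape, line, and continuum-point positions are invented, and a low-order polynomial stands in for the cubic spline we actually use):

```python
import numpy as np

npix = 500
x = np.arange(npix)
continuum = 2000.0 * (1.0 + 0.0005 * x)         # smooth true continuum
spec = continuum.copy()
spec[200:205] *= 0.4                            # one absorption line

# Predefined continuum points: pixels known to be line-free.
pts = np.array([10, 100, 300, 480])
fit = np.polyval(np.polyfit(pts, spec[pts], 3), x)  # spline stand-in
normalized = spec / fit
```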

The data product obtained from extracted, normalized spectra is the residual intensity in a specified bandpass (typically one Ångstrom). This can be left as is (as is the case for the NSO-like HK indices), or converted to flux, the Mount Wilson S index, or to the fractional chromospheric emission R'(HK). Time series of these data are analyzed to examine cycle characteristics, periods, and low-level variability in the stellar and solar atmospheres.
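The basic band measurement is straightforward once the spectrum is normalized and on a rest wavelength scale: average the residual intensity over the chosen bandpass. A sketch for a 1 Å band at the Ca II K line core (the line shape and depth below are invented, and conversions to flux, S, or R'(HK) involve further calibration not shown):

```python
import numpy as np

# Normalized spectrum around Ca II K (3933.66 A), illustrative profile.
wave = np.linspace(3928.0, 3940.0, 1200)
spec = 1.0 - 0.8 * np.exp(-0.5 * ((wave - 3933.66) / 0.6) ** 2)

# Mean residual intensity in a 1-Angstrom bandpass on the line core.
center, width = 3933.66, 1.0
band = (wave >= center - width / 2) & (wave <= center + width / 2)
hk_index = spec[band].mean()
```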



Supported by grants from the National Science Foundation.
Email: jch [at] lowell [dot] edu
The SSS is publicly funded. Unless explicitly noted otherwise, everything on this site is in the public domain.
If you use or quote our results or images, we appreciate an acknowledgment.