===== Preliminaries =====
==== Downloading the Pipeline ====

The PAVO data reduction pipeline is available through a git repository.
To download the PAVO software:
<code>
git pull
</code>
==== Setting up IDL ====
In addition to the PAVO code, you will need to download the Goddard astrolib library from http://

  - Put the astrolib library in ~/
  - Assuming that you've done the checkout above into ~/, use the following for your idl_startup.pro:

<code>
defsysv, '
!path = expand_path('
!path = expand_path('
</code>

\\ Then everything should work.
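The truncated startup lines above would, in a typical setup, look something like the following sketch. The directory names and the system-variable name are assumptions (the originals were cut off in the wiki rendering), so adjust them to your own checkout locations:

<code>
; Hypothetical idl_startup.pro -- paths and variable name are placeholders.
defsysv, '!PAVO_DIR', '~/pavo/'                        ; assumed system variable for the pipeline
!path = expand_path('+~/pavo') + ':' + !path           ; add the PAVO code to the IDL path
!path = expand_path('+~/astrolib/pro') + ':' + !path   ; add the Goddard astrolib routines
</code>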
==== Data Analysis Strategy ====
PAVO data is not labelled by distinct dark, on-fringe and off-fringe files. Instead, a program
==== Fourier-Based Analysis ====
In this analysis, the first thing that happens is that each frame is turned into a data cube, with wavelength and 2D pupil position. Then each wavelength is analysed using a 2D Fourier transform, where the power is chopped out, the data inverse-transformed and the analysis done on the demodulated fringes in the pupil-plane.
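As an illustration only (not pipeline code), the per-wavelength demodulation step can be sketched in IDL roughly as follows; the array size and the mask region are placeholders, since the real location of the fringe power depends on the baseline:

<code>
; Illustrative sketch of Fourier demodulation for one wavelength slice.
slice = randomn(seed, 64, 64)        ; stand-in for one pupil-plane slice of the data cube
ft    = fft(slice)                   ; 2D Fourier transform
mask  = fltarr(64, 64)               ; window that keeps only the fringe power...
mask[40:50, 0:5] = 1.0               ; ...(placeholder indices for the power peak)
demod = fft(ft * mask, /inverse)     ; inverse transform: demodulated fringes
</code>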
==== Demodulation (model-based) Analysis ====
This is the method used to get group delays in get_gd.pro. There is some potential for this method to deliver improved results for the very faintest 3-telescope data. Consider it experimental only. If this is sped up in a later version of the code, the output quantities from processv2.pro will not have to be recalculated. The latest ideas (1 April 2010) are to use Graphics Processing Units (GPUs) in a collaboration with the School of IT at Sydney, with a timescale of late 2010, or sparse matrices. The prototype will be sparse matrices in IDL. If GPUs were to be used (1000 times faster than a PC), then we estimate 2 weeks to analyze all PAVO data taken thus far!
===== Programs for Analysis =====
==== headstrip.pro ====
Use this program like headstrip, /
==== processor.pro ====
This, or something you name yourself, is an example of how to script multiple data analysis runs, e.g. over a weekend. It is an excellent way to keep a diary of what was run before. NB: You either have to run processor.pro,
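A minimal sketch of what such a batch script might look like — purely hypothetical, since the actual argument list of processv2.pro is not reproduced here; the file names and keywords below are placeholders based on the options documented in the next section:

<code>
; Hypothetical batch script in the style of processor.pro.
; Each call is one analysis run; keeping old versions serves as a diary of past runs.
processv2, out_file='night1_run1.txt', /individual, /plot
processv2, out_file='night2_run1.txt', /individual, /plot
end
</code>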
==== processv2.pro ====
This is the main program for analysis. Originally, it was only for V2 analysis. These options are all inputs to processv2.pro.
  * nohann Usually, a window is applied in the Fourier domain. Setting this keyword means that a window isn't used.
  * individual Essential for rejecting bad data.
  * plot Always use this option when you are running a new data analysis. The plots will look like the following for a strong fringe on baseline 1, which uses beams 1 and 2:
The outputs from processv2.pro are:
  * The out_file, which is the text file output, containing V2 values, estimated errors, closure phases etc.
  * The .pav files from individual analysis. These are IDL variable files that contain finely sampled V2 and closure-phase.
  * The log file, which contains the settings used for the out file.
  * If /plot is set, a screenshot of the output of the wavelength calibration will be created. A faulty wavelength calibration has strong potential to screw things up, and it should therefore be checked before proceeding in the analysis. Offsets and rotation are listed in the log file: offsets of +/- 2 pixels are ok; anything larger should be treated as suspicious. An example plot for an ok wavelength calibration with an offset of 1.2 can be found
==== l0_l1_gui.pro ====
This program takes the .pav files, performs an automated outlier rejection and lets the user manually reject bad sections of data. It can only be run if /individual is set when running processv2.pro (i.e. .pav files have been created), which should really be the default option. \\ \\ Walkthrough: \\
4) Outlier rejection is based on 3 criteria: S/N, seconds after lock on fringes is lost (NSEC) and deviation from the mean in sigma (SIGMA LIMIT). The default values should be fine in most cases; the S/N cuts are probably the most sensible to adjust if data is particularly bad/good. \\
5) The bottom row shows a range of diagnostics that can be used to determine the quality of the data, such as group delays, cart positions, V2C/V2 (a measure of t0, good if high) and histograms. An example histogram display for 3-baseline data can be found \\
6) Once you're satisfied with the rejection settings for the scan, press OUTPUT FILE, which will show V2 vs lambda using all frames that survived the outlier rejection. This step will create an entry in the output file
==== l1_l2.pro ====
This program takes as input the out file from processv2.pro and the telescopes in use, and calibrates the visibility data. Some key options for this are:
  * diamsfile This is an optional text file with two columns. The first is the star name (exactly as in processv2.pro) and the second is the angular diameter of the star. This feature hasn't really been tested; it still needs diameter errors and correct error propagation (anyone?).
  * exp If you turn this feature on with /exp, the photometry is used to correct the visibilities for photometric fluctuations. Definitely at least try it for 3-telescope data. Sometimes, 2-telescope data seems to be more stable than the photometric measurements themselves: hence this is an option rather than the default.
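For illustration, a diamsfile might look like the following: the first column is the star name exactly as given to processv2.pro, the second its angular diameter. The names and values here are made up, and the unit (milliarcseconds) is an assumption:

<code>
HD_11111   0.512
HD_22222   0.844
</code>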
==== l1_l2_gui.pro ====
A GUI extension of l1_l2.pro to calibrate data, including multi-bracket calibration and proper uncertainty calculations using Monte-Carlo simulations. This program can only be run with the output of l0_l1_gui.pro (see above). \\ \\ Walkthrough: \\
4) Once you have identified the brackets that should be used for calibration, \\
5) Load your config file using the LOAD CONFIG FILE button and press CALIBRATE to perform the multi-bracket calibration. A LD/UD fit will be performed automatically. Press OUTPUT FILES to output the data of this graph. \\
6) Pressing RUN MC will start the Monte-Carlo simulations to calculate uncertainties of the UD fit to the data. The simulations include uncertainties in the wavelength scale, calibrator diameters, measurement errors, the limb-darkening coefficient and correlations between wavelength channels. \\
\\
Note: /exp is the default in this program as I've found it to improve the calibration in almost all cases.
===== Nice features to add =====
processv2: Knowledge of the telescopes from the start, i.e. finding whether 2 or 3 telescopes were used based on the shutter sequences. \\ l1_l2_gui: Add more complex models, such as binaries etc., in the fitting routine.