Methodology to create a new Total Solar Irradiance record: Making a composite out of multiple data records [SSA]

Many observational records critically rely on our ability to merge different (and not necessarily overlapping) observations into a single composite. We provide a novel and fully traceable approach for doing so, which relies on a multi-scale maximum likelihood estimator. This approach overcomes the problem of data gaps in a natural way and uses data-driven estimates of the uncertainties. We apply it to the total solar irradiance (TSI) composite, which is currently being revised and is critical to our understanding of solar radiative forcing. While the final composite is pending decisions on what corrections to apply to the original observations, we find that the new composite is in closest agreement with the PMOD composite and the NRLTSI2 model. In addition, we evaluate long-term uncertainties in the TSI, which reveal a 1/f scaling.
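The estimator itself is not spelled out in the abstract; as a minimal sketch of the underlying idea, reduced here to single-scale inverse-variance maximum-likelihood weighting on a common time grid, with gaps encoded as NaN (all function names and numbers are illustrative, not the paper's actual multi-scale method):

```python
import numpy as np

def ml_composite(records, variances):
    """Maximum-likelihood merge of several records sampled on a common
    time grid. Gaps are encoded as NaN and simply receive zero weight,
    so non-overlapping records are handled naturally.

    records, variances: arrays of shape (n_records, n_times).
    Returns the composite and its variance at every time step.
    """
    records = np.asarray(records, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = np.where(np.isnan(records), 0.0, 1.0 / variances)  # gap -> zero weight
    data = np.nan_to_num(records)                          # NaNs contribute nothing
    wsum = w.sum(axis=0)
    composite = (w * data).sum(axis=0) / wsum
    return composite, 1.0 / wsum

# Two toy instruments, each with a gap, measuring TSI near 1361 W/m^2
r1 = np.array([1361.0, 1361.2, np.nan, 1360.9])
r2 = np.array([1360.8, np.nan, 1361.1, 1361.0])
tsi, var = ml_composite([r1, r2], [[0.04] * 4, [0.09] * 4])
```

Where only one instrument reports, the composite simply follows it with that instrument's variance; where both report, the more precise one dominates.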

Read this paper on arXiv…

T. Dudok de Wit, G. Kopp, C. Fröhlich, et al.
Thu, 9 Feb 17

Comments: slightly expanded version of a manuscript to appear in Geophysical Research Letters (2017)

Corral Framework: Trustworthy and Fully Functional Data Intensive Parallel Astronomical Pipelines [IMA]

Data processing pipelines are among the most common kinds of astronomical software. These programs are chains of processes that transform raw data into valuable information. In this work a Python framework for astronomical pipeline generation is presented. It features a design pattern (Model-View-Controller) on top of a SQL relational database, capable of handling custom data models, processing stages, and result communication alerts, as well as producing automatic quality and structural measurements. This pattern provides separation of concerns between the user logic, the data models, and the processing flow inside the pipeline, delivering multiprocessing and distributed computing capabilities for free. For the astronomical community this means an improvement over previous data processing pipelines: the programmer no longer needs to deal with the processing flow and parallelization issues, and can focus solely on the algorithms involved in the successive data transformations. This software, as well as working examples of pipelines, is available to the community at
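The abstract describes the separation of concerns but not Corral's actual API; the following toy sketch shows the general shape of that pattern (a data model plus a processing stage, with the framework rather than the user driving the flow). All class and method names here are invented for illustration and are not Corral's real interface:

```python
# Illustrative only: these classes mimic the Model/Step separation the
# abstract describes; they are NOT Corral's actual API.

class Model:
    """Data model: what a record looks like (would map to a SQL table)."""
    def __init__(self, raw):
        self.raw = raw
        self.calibrated = None

class Step:
    """Processing stage: the user supplies only the transformation."""
    def process(self, model):
        raise NotImplementedError

class DarkSubtraction(Step):
    """Example user stage: subtract a toy constant 'dark frame'."""
    def process(self, model):
        model.calibrated = model.raw - 100
        return model

def run_pipeline(models, steps):
    """The framework owns the flow; this loop is where a real framework
    would transparently add multiprocessing or distributed execution."""
    for step in steps:
        models = [step.process(m) for m in models]
    return models

frames = run_pipeline([Model(1100), Model(1250)], [DarkSubtraction()])
```

The user writes only `DarkSubtraction.process`; everything about ordering and parallelism stays inside `run_pipeline`.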

Read this paper on arXiv…

J. Cabral, B. Sanchez, M. Beroiz, et al.
Mon, 23 Jan 17

Comments: 8 pages, 2 figures, submitted for consideration at Astronomy and Computing. Code available at this https URL

From Blackbirds to Black Holes: Investigating Capture-Recapture Methods for Time Domain Astronomy [HEAP]

In time domain astronomy, recurrent transients present a special problem: how to infer total populations from limited observations. Monitoring observations may give a biased view of the underlying population due to limitations on observing time, visibility and instrumental sensitivity. A similar problem exists in the life sciences, where animal populations (such as migratory birds) or disease prevalence must be estimated from sparse and incomplete data. The class of methods termed Capture-Recapture is used to reconstruct population estimates from time-series records of encounters with the study population. This paper investigates the performance of Capture-Recapture methods in astronomy via a series of numerical simulations. The Blackbirds code simulates monitoring of populations of transients, in this case accreting binary stars (neutron star or black hole accreting from a stellar companion), under a range of observing strategies. We first generate realistic light-curves for populations of binaries with contrasting orbital period distributions. These models are then randomly sampled at observing cadences typical of existing and planned monitoring surveys. The classical capture-recapture methods (the Lincoln-Petersen and Schnabel estimators and related techniques) are compared with newer methods implemented in the Rcapture package. A general exponential model based on the radioactive decay law is introduced, and demonstrated to recover (at 95% confidence) the underlying population abundance and duty cycle, in a fraction of the observing visits (10-50%) required to discover all the sources in the simulation. Capture-Recapture is a promising addition to the toolbox of time domain astronomy, and methods implemented in R by the biostats community can be readily called from within Python.
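The simplest of the classical estimators mentioned above can be written down directly; a minimal sketch of the Chapman bias-corrected Lincoln-Petersen estimator for two observing epochs (the paper relies on the Rcapture package; this standalone Python version with made-up survey numbers is illustrative only):

```python
def lincoln_petersen(n1, n2, m):
    """Chapman's bias-corrected Lincoln-Petersen estimate of total
    population size from two sampling occasions.

    n1: sources detected in the first epoch ("marked")
    n2: sources detected in the second epoch
    m:  sources detected in both epochs ("recaptures")
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Toy survey: 30 transients seen in epoch 1, 25 in epoch 2, 10 in both
n_hat = lincoln_petersen(30, 25, 10)
```

With these numbers the estimate is about 72 underlying sources, i.e. roughly 27 sources never detected in either epoch; the intuition is that the recapture fraction m/n2 estimates what fraction of the whole population the first epoch sampled.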

Read this paper on arXiv…

S. Laycock
Tue, 17 Jan 17

Comments: Accepted to New Astronomy. 11 pages, 8 figures (refereed version prior to editorial process)

Quasi-oscillatory dynamics observed in the ascending phase of the flare on March 6, 2012 [SSA]

Context. The dynamics of the flaring loops in active region (AR) 11429 are studied. The observed dynamics consist of several evolutionary stages of the flaring loop system during both the ascending and descending phases of the registered M-class flare. The dynamical properties can also be classified by different types of magnetic reconnection, related plasma ejection and aperiodic flows, quasi-periodic oscillatory motions, and rapid temperature and density changes, among others. The focus of the present paper is on a specific time interval during the ascending (pre-flare) phase.
Aims. The goal is to understand the quasi-periodic behavior, in both space and time, of the magnetic loop structures during the considered time interval.
Methods. We studied the characteristic location, motion, and periodicity properties of the flaring loops by examining space-time diagrams and intensity variations along the coronal magnetic loops, using AIA intensity and HMI magnetogram images from the Solar Dynamics Observatory (SDO).
Results. We detected bright plasma blobs along the coronal loop during the ascending phase of the solar flare, the intensity variations of which clearly show quasi-periodic behavior. We also determined the periods of these oscillations.
Conclusions. Two different interpretations are presented for the observed dynamics. First, the oscillations are interpreted as the manifestation of non-fundamental harmonics of longitudinal standing acoustic oscillations driven by the thermodynamically nonequilibrium background (with time-variable density and temperature). Second, the observed bright blobs could be a signature of a strongly twisted coronal loop that is kink unstable.
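For context on the first interpretation: the period of the n-th harmonic of a longitudinal standing acoustic (slow) mode in a loop of length L with sound speed c_s follows the standard relation (background knowledge, not quoted from the paper):

```latex
P_n = \frac{2L}{n\, c_s}, \qquad n = 1, 2, 3, \dots,
\qquad c_s = \sqrt{\frac{\gamma p}{\rho}}
```

In a thermodynamically nonequilibrium background, the time-variable density and temperature make c_s, and hence each P_n, drift in time, consistent with quasi-periodic rather than strictly periodic behavior.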

Read this paper on arXiv…

E. Philishvili, B. Shergelashvili, T. Zaqarashvili, et al.
Mon, 2 Jan 17

Comments: 12 pages, 10 figures, A&A, in press

Method of frequency dependent correlations: investigating the variability of total solar irradiance [SSA]

This paper contributes to the field of modeling and hindcasting of the total solar irradiance (TSI) based on different proxy data that extend further back in time than the TSI that is measured from satellites.
We introduce a simple method to analyze persistent frequency-dependent correlations (FDCs) between time series and use these correlations to hindcast missing historical TSI values. We try to avoid arbitrary choices of the free parameters of the model by computing them using an optimization procedure. The method can be regarded as a general tool for pairs of data sets in which correlating and anticorrelating components can be separated into non-overlapping regions in the frequency domain.
Our method is based on low-pass and band-pass filtering with a Gaussian transfer function combined with de-trending and computation of envelope curves.
We find a major discrepancy between the historical proxies and the satellite-measured targets: a large variance is detected between the low-frequency parts of the targets, while the low-frequency behavior of the different proxy series is mutually consistent to high precision. We also show that even though the rotational signal is not strongly manifested in the targets and proxies, it becomes clearly visible in the FDC spectrum.
The application of the new method to solar data allows us to obtain important insights into the different TSI modeling procedures and their capabilities for hindcasting based on the directly observed time intervals.
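A minimal sketch of the filtering building block described above, low-pass and band-pass filtering with a Gaussian transfer function applied in the frequency domain (cutoff frequencies and the toy series are illustrative; the paper's de-trending and envelope-curve steps are omitted):

```python
import numpy as np

def gaussian_lowpass(x, dt, f_cut):
    """Low-pass filter a uniformly sampled series x (sampling step dt)
    by multiplying its spectrum with the Gaussian transfer function
    exp(-f^2 / (2 f_cut^2))."""
    freqs = np.fft.rfftfreq(len(x), d=dt)
    transfer = np.exp(-freqs**2 / (2.0 * f_cut**2))
    return np.fft.irfft(np.fft.rfft(x) * transfer, n=len(x))

def gaussian_bandpass(x, dt, f_lo, f_hi):
    """A band-pass component as the difference of two low-passes."""
    return gaussian_lowpass(x, dt, f_hi) - gaussian_lowpass(x, dt, f_lo)

# Toy series: a slow cycle-like trend plus a weak 27-day "rotational" signal
t = np.arange(0, 2000.0)                              # days
x = np.sin(2 * np.pi * t / 1000) + 0.1 * np.sin(2 * np.pi * t / 27)
slow = gaussian_lowpass(x, 1.0, 1.0 / 200)            # keep periods >> 27 d
```

The low-passed series retains the 1000-day trend almost unchanged while suppressing the rotational component by many orders of magnitude; separating such components is the precondition for the FDC analysis described above.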

Read this paper on arXiv…

J. Pelt, M. Kapyla and N. Olspert
Fri, 23 Dec 16

Comments: 19 pages, 5 figures, accepted for publication in Astronomy & Astrophysics

When "Optimal Filtering" Isn't [CL]

The so-called “optimal filter” analysis of a microcalorimeter’s x-ray pulses is statistically optimal only if all pulses have the same shape, regardless of energy. The shapes of pulses from a nonlinear detector can and do depend on the pulse energy, however. A pulse-fitting procedure that we call “tangent filtering” accounts for the energy dependence of the shape and should therefore achieve superior energy resolution. We take a geometric view of the pulse-fitting problem and give expressions to predict how much the energy resolution stands to benefit from such a procedure. We also demonstrate the method with a case study of K-line fluorescence from several 3d transition metals. The method improves the resolution from 4.9 eV to 4.2 eV at the Cu K$\alpha$ line (8.0 keV).
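For reference, the fixed-template “optimal filter” that tangent filtering improves upon amounts to a least-squares amplitude fit of one pulse template; a white-noise sketch (real analyses whiten by the measured noise power spectrum; the template shape and numbers here are made up):

```python
import numpy as np

def optimal_filter_amplitude(pulse, template):
    """Classic 'optimal filter' estimate: best-fit amplitude of a fixed
    pulse template. This is statistically optimal only when the pulse
    shape is energy-independent (white noise assumed in this sketch)."""
    return np.dot(template, pulse) / np.dot(template, template)

# Toy record: double-exponential template, pulse = 2.5 x template + noise
t = np.arange(200)
template = np.exp(-t / 30.0) - np.exp(-t / 5.0)
rng = np.random.default_rng(0)
pulse = 2.5 * template + rng.normal(0, 0.01, t.size)
amp = optimal_filter_amplitude(pulse, template)
```

When the true pulse shape drifts with energy, this fixed-template fit acquires a shape-mismatch error; the paper's tangent filtering addresses exactly that regime.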

Read this paper on arXiv…

J. Fowler, B. Alpert, W. Doriese, et al.
Thu, 24 Nov 16

Comments: Submitted to the Proceedings of the 2016 Applied Superconductivity Conference

Filling the gaps: Gaussian mixture models from noisy, truncated or incomplete samples [IMA]

We extend the common mixtures-of-Gaussians density estimation approach to account for a known sample incompleteness by simultaneous imputation from the current model. The method, called GMMis, generalizes existing Expectation-Maximization techniques for truncated data to arbitrary truncation geometries and probabilistic rejection. It can incorporate a uniform background distribution as well as independent multivariate normal measurement errors for each of the observed samples, and recovers an estimate of the error-free distribution from which both observed and unobserved samples are drawn. We compare GMMis to the standard Gaussian mixture model for simple test cases with different types of incompleteness, and apply it to observational data from the NASA Chandra X-ray telescope. The Python code is capable of performing density estimation with millions of samples and thousands of model components and is released as an open-source package at
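The GMMis code itself is linked from the paper; for orientation, the incompleteness-blind baseline it extends is plain Expectation-Maximization for a Gaussian mixture, sketched here in 1-D (GMMis adds the imputation of samples drawn from the current model into the truncated regions, which this sketch does not do):

```python
import numpy as np

def gmm_em_1d(x, k=2, n_iter=200):
    """Plain EM for a 1-D Gaussian mixture -- the standard estimator
    that GMMis extends with imputation for truncated/incomplete data."""
    mu = np.percentile(x, np.linspace(25, 75, k))  # deterministic init
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample
        d = (x[:, None] - mu) / sigma
        p = pi * np.exp(-0.5 * d**2) / (sigma * np.sqrt(2 * np.pi))
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and widths
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu)**2).sum(axis=0) / nk)
    return pi, mu, sigma

# Two well-separated components; EM recovers means near -2 and 3
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(3, 1.0, 500)])
pi, mu, sigma = gmm_em_1d(x)
```

If part of the sample space were unobservable (e.g. a flux limit), these estimates would be biased toward the observed region; that bias is what the imputation step in GMMis removes.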

Read this paper on arXiv…

P. Melchior and A. Goulding
Fri, 18 Nov 16

Comments: 12 pages, 6 figures, submitted to Computational Statistics & Data Analysis