Arxiv.org
Propensity score weighting is a tool for causal inference to adjust for measured confounders in observational studies. In practice, data often present complex structures, such as clustering, which make propensity score modeling and estimation challenging. In addition, for clustered data, there may be unmeasured cluster-specific variables that are related to both the treatment assignment and the outcome. When such unmeasured cluster-specific confounders exist and are omitted in the propensity...
Topics: Statistics, Methodology
Source: http://arxiv.org/abs/1703.06086
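As a point of reference for the weighting described above, here is a minimal inverse probability weighting (IPW) sketch in Python, assuming a binary treatment t, outcome y and covariate matrix X as NumPy arrays; it is purely illustrative and does not address the clustering or unmeasured cluster-level confounding the paper is concerned with.

import numpy as np
from sklearn.linear_model import LogisticRegression

def ipw_ate(X, t, y):
    """Inverse-probability-weighted estimate of the average treatment effect.

    X : (n, p) covariates, t : (n,) binary treatment, y : (n,) outcome.
    """
    # Propensity score model P(T = 1 | X), here a plain logistic regression.
    e = LogisticRegression().fit(X, t).predict_proba(X)[:, 1]
    # Horvitz-Thompson style weighting of treated and control outcomes.
    return np.mean(t * y / e) - np.mean((1 - t) * y / (1 - e))

# Toy data with one confounder affecting both treatment and outcome.
rng = np.random.default_rng(0)
x = rng.normal(size=(2000, 1))
t = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))
y = 1.0 * t + 2.0 * x[:, 0] + rng.normal(size=2000)   # true effect = 1
print(ipw_ate(x, t, y))
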
Arxiv.org
by Dan Crisan; Jeremie Houssineau; Ajay Jasra
We introduce a new class of Monte Carlo based approximations of expectations of random variables whose laws are not available directly, but only through certain discretizations. Sampling from the discretized versions of these laws can typically introduce a bias. In this paper, we show how to remove that bias, by introducing a new version of multi-index Monte Carlo (MIMC) that has the added advantage of reducing the computational effort, relative to i.i.d. sampling from the most...
Topics: Computation, Statistics
Source: http://arxiv.org/abs/1702.03057
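As a rough illustration of how a hierarchy of biased approximations can be turned into an unbiased estimator, the following toy uses a single-term randomized-level estimator (a Rhee-Glynn style debiasing device, not the paper's MIMC construction) to estimate E[exp(Z)] for standard normal Z, with Taylor polynomials of exp playing the role of the "discretized" approximations.

import numpy as np
from scipy.special import factorial

# Level-l approximation of exp(z): Taylor polynomial of degree l.
# The differences Delta_l(z) = z**l / l! telescope to exp(z) as l grows.
rng = np.random.default_rng(1)
n, p = 200_000, 0.5                         # samples, geometric level parameter
z = rng.normal(size=n)
levels = rng.geometric(p, size=n) - 1       # P(L = l) = p * (1 - p)**l
probs = p * (1.0 - p) ** levels             # probability of the drawn level
estimates = z ** levels / factorial(levels) / probs   # Delta_L(z) / P(L)
print(estimates.mean(), np.exp(0.5))        # unbiased for E[exp(Z)] = exp(1/2)
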
Arxiv.org
Causal inference with observational studies often relies on the assumptions of unconfoundedness and overlap of covariate distributions in different treatment groups. The overlap assumption is violated when some units have propensity scores close to zero or one, and therefore both theoretical and practical researchers suggest dropping units with extreme estimated propensity scores. We advance the literature in three directions. First, we clarify a conceptual issue of sample trimming by defining...
Topics: Statistics, Methodology
Source: http://arxiv.org/abs/1704.00666
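For concreteness, trimming on the estimated propensity score is often implemented with a symmetric cutoff such as [0.1, 0.9]; the cutoff used below is an illustrative convention, not a recommendation taken from the paper.

import numpy as np

def trim_by_propensity(e_hat, alpha=0.1):
    """Return a boolean mask keeping units with alpha < e_hat < 1 - alpha."""
    e_hat = np.asarray(e_hat)
    return (e_hat > alpha) & (e_hat < 1.0 - alpha)

e_hat = np.array([0.02, 0.35, 0.50, 0.88, 0.97])
print(trim_by_propensity(e_hat))   # [False  True  True  True False]
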
Arxiv.org
by Ajay Jasra; Kody Law; Carina Suciu
This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able...
Topics: Computation, Statistics, Numerical Analysis, Methodology, Mathematics
Source: http://arxiv.org/abs/1704.07272
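A minimal sketch of the telescoping idea for a toy problem, estimating E[X_T] under geometric Brownian motion discretized with Euler-Maruyama; the example and parameter values are assumptions chosen for illustration, not drawn from the review.

import numpy as np

rng = np.random.default_rng(2)
mu, sigma, T, x0 = 0.05, 0.2, 1.0, 1.0   # GBM parameters; E[X_T] = exp(mu*T)

def euler_paths(n, steps, dW):
    """Euler-Maruyama for dX = mu*X dt + sigma*X dW, given increments dW."""
    h = T / steps
    x = np.full(n, x0)
    for k in range(steps):
        x = x + mu * x * h + sigma * x * dW[:, k]
    return x

def level_difference(level, n):
    """Coupled estimate of E[P_level - P_{level-1}] (or of E[P_0] at level 0)."""
    fine_steps = 2 ** level
    dW = rng.normal(0.0, np.sqrt(T / fine_steps), size=(n, fine_steps))
    fine = euler_paths(n, fine_steps, dW)
    if level == 0:
        return fine.mean()
    # The coarse path reuses the same Brownian motion: sum increments pairwise.
    coarse = euler_paths(n, fine_steps // 2, dW[:, 0::2] + dW[:, 1::2])
    return (fine - coarse).mean()

# Telescoping sum: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}].
estimate = sum(level_difference(l, 50_000) for l in range(6))
print(estimate, np.exp(mu * T))
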
Arxiv.org
by Ajay Jasra; Kengo Kamatani; Kody Law; Yan Zhou
In this article we consider computing expectations w.r.t. probability laws associated to a certain class of stochastic systems. In order to achieve such a task, one must not only resort to numerical approximation of the expectation, but also to a biased discretization of the associated probability. We are concerned with the situation for which the discretization is required in multiple dimensions, for instance in space and time. In such contexts, it is known that the multi-index Monte Carlo...
Topics: Computation, Statistics
Source: http://arxiv.org/abs/1704.00117
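The multi-index construction referred to here generalizes the single MLMC level to a multi-index $\alpha = (\alpha_1, \dots, \alpha_d)$, one coordinate per discretization dimension (for instance space and time), and telescopes with mixed differences: writing $\Delta_i P_\alpha = P_\alpha - P_{\alpha - e_i}$ for $\alpha_i > 0$ (and $\Delta_i P_\alpha = P_\alpha$ when $\alpha_i = 0$), the identity used is, informally, $$\mathbb{E}[P] \approx \sum_{\alpha \in \mathcal{I}} \mathbb{E}\big[(\Delta_1 \circ \cdots \circ \Delta_d) P_\alpha\big],$$ for a suitably chosen index set $\mathcal{I}$. This is the standard MIMC decomposition rather than anything specific to the present paper.
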
Arxiv.org
by Shu Yang; Jae Kwang Kim
Predictive mean matching imputation is popular for handling item nonresponse in survey sampling. In this article, we study the asymptotic properties of the predictive mean matching estimator of the population mean. For variance estimation, the conventional bootstrap inference for matching estimators with fixed matches has been shown to be invalid due to the nonsmooth nature of the matching estimator. We propose asymptotically valid replication variance estimation. The key strategy is to...
Topics: Statistics, Methodology
Source: http://arxiv.org/abs/1703.10256
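The mechanics of single-donor predictive mean matching can be sketched as follows, assuming an outcome y observed only where a response indicator r equals 1 and a single auxiliary variable x; this is a generic illustration, not the paper's estimator or its replication variance procedure.

import numpy as np

def pmm_impute(x, y, r):
    """Predictive mean matching with one nearest donor per nonrespondent."""
    x, y, r = np.asarray(x, float), np.asarray(y, float), np.asarray(r, bool)
    # Step 1: fit a working regression of y on x among respondents.
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X[r], y[r], rcond=None)
    yhat = X @ beta                          # predicted means for all units
    # Step 2: match each nonrespondent to the respondent with the closest
    # predicted mean and impute that donor's *observed* outcome.
    donors = np.flatnonzero(r)
    y_imp = y.copy()
    for i in np.flatnonzero(~r):
        y_imp[i] = y[donors[np.argmin(np.abs(yhat[donors] - yhat[i]))]]
    return y_imp

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
y = 2.0 + x + rng.normal(scale=0.5, size=1000)
r = rng.random(1000) < 1 / (1 + np.exp(-x))      # response depends on x only
print(pmm_impute(x, y, r).mean(), y.mean())      # imputed vs full-data mean
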
Arxiv.org
by Alexandros Beskos; Ajay Jasra; Kody Law; Youssef Marzouk; Yan Zhou
In this article we develop a new sequential Monte Carlo (SMC) method for multilevel (ML) Monte Carlo estimation. In particular, the method can be used to estimate expectations with respect to a target probability distribution over an infinite-dimensional and non-compact space as given, for example, by a Bayesian inverse problem with Gaussian random field prior. Under suitable assumptions the MLSMC method has the optimal $O(\epsilon^{-2})$ bound on the cost to obtain a mean-square error of...
Topics: Computation, Statistics
Source: http://arxiv.org/abs/1703.04866
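For context, the $O(\epsilon^{-2})$ figure is the benchmark from the standard MLMC complexity analysis, stated informally: if the bias, the variance of the level differences, and the per-sample cost satisfy $|\mathbb{E}[P_l - P]| = O(2^{-\alpha l})$, $\mathrm{Var}[P_l - P_{l-1}] = O(2^{-\beta l})$ and $C_l = O(2^{\gamma l})$ with $\alpha \ge \tfrac{1}{2}\min(\beta,\gamma)$, then a mean-square error of $\epsilon^2$ can be achieved at cost $O(\epsilon^{-2})$ when $\beta > \gamma$, $O(\epsilon^{-2}(\log \epsilon)^2)$ when $\beta = \gamma$, and $O(\epsilon^{-2-(\gamma-\beta)/\alpha})$ when $\beta < \gamma$. This is a Giles-type result; the MLSMC paper establishes its bound in the sequential Monte Carlo setting.
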
Arxiv.org
by Ajay Jasra; Seongil Jo; David Nott; Christine Shoemaker; Raul Tempone
In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed, and it is shown, under some assumptions, that for a given level of mean square error this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
Topics: Statistics, Methodology
Source: http://arxiv.org/abs/1702.03628
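As a reference point for what an ABC approximation is, here is plain rejection ABC for a toy model (a normal mean, with the sample mean as summary statistic); the MLMC and SMC machinery of the paper is built on top of approximations of this kind and is not shown here.

import numpy as np

rng = np.random.default_rng(4)
y_obs = rng.normal(loc=1.5, scale=1.0, size=100)   # observed data, unknown mean
s_obs = y_obs.mean()                               # summary statistic

def rejection_abc(n_prop, eps):
    """Keep prior draws whose simulated summary lies within eps of s_obs."""
    theta = rng.normal(0.0, 10.0, size=n_prop)            # wide normal prior
    data = rng.normal(theta, 1.0, size=(100, n_prop))     # one dataset per draw
    s_sim = data.mean(axis=0)
    return theta[np.abs(s_sim - s_obs) < eps]

for eps in (1.0, 0.1, 0.01):          # smaller eps gives a sharper ABC target
    post = rejection_abc(100_000, eps)
    print(eps, post.size, post.mean())
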
Arxiv.org
We consider causal inference from observational studies when confounders have missing values. When the confounders are missing not at random, causal effects are generally not identifiable. In this article, we propose a novel framework for nonparametric identification of causal effects with confounders missing not at random, but subject to instrumental missingness, that is, the missing data mechanism is independent of the outcome, given the treatment and possibly missing confounder values. We...
Topics: Statistics, Methodology
Source: http://arxiv.org/abs/1702.03951
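In the usual notation, with $X$ the possibly missing confounder, $R$ its missingness indicator, $T$ the treatment and $Y$ the outcome, the instrumental missingness condition described above can be written as $$R \perp Y \mid (T, X),$$ that is, the missingness may depend on the treatment and on the confounder value itself, even when that value is unobserved, but carries no further information about the outcome. The notation is assumed here; the abstract states the condition only in words.
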
Arxiv.org
by Daniel Paulin; Ajay Jasra; Dan Crisan; Alexandros Beskos
In this paper we consider filtering and smoothing of partially observed chaotic dynamical systems that are discretely observed, with additive Gaussian noise in the observations. These models are found in a wide variety of real applications and include the Lorenz 96 model. In the context of a fixed observation interval $T$, observation frequency $h$ and Gaussian observation variance $\sigma_Z^2$, we show under assumptions that the filter and smoother are well approximated by a Gaussian when...
Topics: Optimization and Control, Statistics, Dynamical Systems, Methodology, Mathematics
Source: http://arxiv.org/abs/1702.02484
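For reference, the Lorenz 96 model mentioned above is the cyclic system of ODEs $\dot{x}_i = (x_{i+1} - x_{i-2})\,x_{i-1} - x_i + F$; the dimension $d = 40$ and forcing $F = 8$ used below are conventional choices, not values taken from the paper.

import numpy as np

def lorenz96(x, F=8.0):
    """Right-hand side of the Lorenz 96 ODE with cyclic indexing."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

# One explicit Euler step from a slightly perturbed rest state.
x = 8.0 * np.ones(40)
x[0] += 0.01
print(x + 0.01 * lorenz96(x))
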
Arxiv.org
by Ajay Jasra; Kengo Kamatani; Kody J. H. Law; Yan Zhou
In this article we consider static Bayesian parameter estimation for partially observed diffusions that are discretely observed. We work under the assumption that one must resort to discretizing the underlying diffusion process, for instance using the Euler-Maruyama method. Given this assumption, we show how one can use Markov chain Monte Carlo (MCMC) and particularly particle MCMC [Andrieu, C., Doucet, A. and Holenstein, R. (2010). Particle Markov chain Monte Carlo methods (with discussion)....
Topics: Probability, Computation, Statistics, Mathematics
Source: http://arxiv.org/abs/1701.05892
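The Euler-Maruyama scheme invoked above discretizes a diffusion $dX_t = a(X_t)\,dt + b(X_t)\,dW_t$ as $X_{k+1} = X_k + a(X_k)h + b(X_k)\sqrt{h}\,\xi_k$ with $\xi_k \sim N(0,1)$. A generic sketch follows; the drift and diffusion coefficients are placeholders rather than the model studied in the paper, and within particle MCMC the paths simulated this way would feed a particle filter whose likelihood estimate stands in for the intractable exact likelihood.

import numpy as np

def euler_maruyama(a, b, x0, h, n_steps, rng):
    """Simulate one path of dX = a(X) dt + b(X) dW on a grid with step h."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    noise = rng.normal(scale=np.sqrt(h), size=n_steps)
    for k in range(n_steps):
        x[k + 1] = x[k] + a(x[k]) * h + b(x[k]) * noise[k]
    return x

# Placeholder Ornstein-Uhlenbeck-type coefficients for illustration.
rng = np.random.default_rng(5)
path = euler_maruyama(a=lambda x: -0.5 * x, b=lambda x: 0.3,
                      x0=1.0, h=0.01, n_steps=1000, rng=rng)
print(path[-1])
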