The overall goal of my research is to develop robust and highly scalable computational algorithms and software tools for solving large- to extreme-scale inverse problems and for the optimal design of experiments. Specifically, I am interested in creating novel methodologies for performing data assimilation with highly nonlinear models and in the presence of non-Gaussian probabilistic uncertainties. I am also interested in developing efficient goal-oriented algorithmic approaches for sensor placement, optimal data acquisition, and optimal design of experiments.

In the past, I have developed a new family of ensemble-based sampling algorithms, based on the Hamiltonian (Hybrid) Monte Carlo (HMC) sampling strategy, to avoid the shortcomings of current state-of-the-art algorithms.
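
To make the idea concrete, the snippet below is a minimal sketch (Python/NumPy) of an HMC-based analysis step for a toy two-state problem: the forecast information and the observation define an unnormalized posterior density, and Hamiltonian dynamics with a Metropolis correction produce a posterior ensemble without any Gaussian assumption. The observation operator, noise levels, and all other settings here are illustrative placeholders, not the configurations used in my algorithms.

    import numpy as np

    def hmc_sample(log_post, log_post_grad, x0, n_samples=500,
                   step_size=0.05, n_leapfrog=20, rng=None):
        """Sample an (unnormalized) posterior density with Hamiltonian Monte Carlo."""
        rng = np.random.default_rng() if rng is None else rng
        x, samples = np.asarray(x0, dtype=float), []
        for _ in range(n_samples):
            p = rng.standard_normal(x.shape)              # auxiliary momentum
            x_new, p_new = x.copy(), p.copy()
            # Leapfrog integration of the Hamiltonian dynamics
            p_new += 0.5 * step_size * log_post_grad(x_new)
            for _ in range(n_leapfrog - 1):
                x_new += step_size * p_new
                p_new += step_size * log_post_grad(x_new)
            x_new += step_size * p_new
            p_new += 0.5 * step_size * log_post_grad(x_new)
            # Metropolis accept/reject step keeps the correct target density
            h_old = -log_post(x) + 0.5 * p @ p
            h_new = -log_post(x_new) + 0.5 * p_new @ p_new
            if np.log(rng.uniform()) < h_old - h_new:
                x = x_new
            samples.append(x.copy())
        return np.array(samples)

    # Toy analysis step: Gaussian prior, nonlinear observation operator H(x) = x1*x2
    prior_mean, prior_var = np.zeros(2), 1.0
    obs, obs_var = np.array([1.2]), 0.5 ** 2
    H = lambda x: np.array([x[0] * x[1]])

    def log_post(x):
        return (-0.5 * np.sum((x - prior_mean) ** 2) / prior_var
                - 0.5 * np.sum((obs - H(x)) ** 2) / obs_var)

    def log_post_grad(x):
        dH = np.array([x[1], x[0]])                       # Jacobian of H (as a vector)
        return -(x - prior_mean) / prior_var + (obs - H(x))[0] * dH / obs_var

    posterior_ensemble = hmc_sample(log_post, log_post_grad, x0=np.ones(2))
    print(posterior_ensemble.mean(axis=0))                # posterior mean estimate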

I have also developed a novel stochastic learning framework, based on reinforcement learning, for solving otherwise computationally infeasible mixed-integer programs arising in optimal experimental design.
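
The core idea can be illustrated with a small policy-gradient sketch in the same spirit as the example above: the binary sensor-activation variables are replaced by Bernoulli activation probabilities, and a REINFORCE-style stochastic gradient on those probabilities maximizes the expected design utility, so the 2^n candidate designs never need to be enumerated. The utility function, penalty weight, and all other specifics below are hypothetical placeholders, not the criteria used in my work.

    import numpy as np

    def policy_gradient_binary_design(utility, n_candidates, n_iters=300,
                                      batch=64, lr=0.1, rng=None):
        """Maximize the expected utility of a binary design via a Bernoulli policy."""
        rng = np.random.default_rng() if rng is None else rng
        theta = np.zeros(n_candidates)                    # logits of activation probabilities
        for _ in range(n_iters):
            p = 1.0 / (1.0 + np.exp(-theta))
            designs = (rng.uniform(size=(batch, n_candidates)) < p).astype(float)
            scores = np.array([utility(d) for d in designs])
            baseline = scores.mean()                      # simple variance-reduction baseline
            # REINFORCE estimator: grad of log Bernoulli(d | sigmoid(theta)) is (d - p)
            grad = ((scores - baseline)[:, None] * (designs - p)).mean(axis=0)
            theta += lr * grad                            # stochastic gradient ascent
        return (1.0 / (1.0 + np.exp(-theta)) > 0.5).astype(int)

    # Hypothetical utility: reward informative candidate locations, penalize sensor count
    informativeness = np.array([0.9, 0.1, 0.8, 0.2, 0.1, 0.7, 0.2, 0.1])
    utility = lambda d: float(d @ informativeness - 0.4 * d.sum())
    print(policy_gradient_binary_design(utility, n_candidates=informativeness.size))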

Currently, among other active research projects, I am developing efficient machine learning approaches for designing optimal data acquisition schemes that are robust to misspecification of the elements of Bayesian inversion, such as the prior and the observation-noise models.
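
One common way to pose such robustness, shown here only as an illustrative worst-case (max-min) formulation with generic notation rather than the specific formulation of this project, is to maximize the smallest expected utility over a set of plausible prior and noise specifications:

    \zeta^{*} = \arg\max_{\zeta \in \{0,1\}^{n_s}} \; \min_{(\Sigma_{\mathrm{pr}},\, \Gamma_{\mathrm{noise}}) \in \mathcal{U}} \; U(\zeta;\, \Sigma_{\mathrm{pr}}, \Gamma_{\mathrm{noise}}),

where \zeta is a binary sensor-activation design over n_s candidate sensors, U is a design utility (for example, an A- or D-optimality criterion of the posterior), and \mathcal{U} collects the candidate prior covariances and observation-noise models considered plausible.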

Active Projects:

  • Randomization for accelerating Bayesian inversion and optimal design of experiments.
  • Robustification of optimal design in sensor placement for Bayesian inversion.
  • Goal-oriented optimal experimental design for large-scale non-linear inverse problems.

Previous Projects:

  • Stochastic learning (neuro-dynamic programming) approaches for binary optimization and optimal design of experiments.
  • Goal-oriented optimal experimental design for large-scale linear inverse problems.
  • Highly extensible parallel ultra-high-resolution data assimilation software package for Doppler Lidar data.
  • DATeS: OOP-based extensible data assimilation testing suite.
  • Cluster sampling filters: a fully non-Gaussian filtering methodology with a GMM approximation of the prior (see the sketch after this list).
  • Reduced-order sampling: solving the non-Gaussian data assimilation problem in a reduced-order-model subspace.
  • HMC sampling smoother: a Hybrid Monte Carlo sampling smoother as an alternative to 4D-Var.
  • HMC sampling filter: a new Hybrid Monte Carlo sampling algorithm as an alternative to the EnKF.
  • Nonlinear/non-Gaussian data assimilation in imperfect-model settings.
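
For the cluster sampling filters above, the role of the GMM prior can be illustrated in the simplest possible setting: with a linear observation operator and Gaussian noise, each mixture component admits a closed-form Kalman-type update, and the component weights are re-weighted by how well each component explains the observation. This Python/NumPy/scikit-learn sketch covers only that conjugate special case and is meant to illustrate the GMM-prior idea, not the actual, fully non-Gaussian, sampling-based cluster filtering methodology.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def gmm_prior_analysis(forecast_ensemble, y, H, R, n_components=3, rng=None):
        """Analysis step with a GMM fit to the forecast ensemble (linear-Gaussian case)."""
        rng = np.random.default_rng() if rng is None else rng
        n_ens, n_state = forecast_ensemble.shape
        gmm = GaussianMixture(n_components=n_components, covariance_type="full",
                              random_state=0).fit(forecast_ensemble)
        means, covs, log_w = [], [], []
        for k in range(n_components):
            m, P = gmm.means_[k], gmm.covariances_[k]
            S = H @ P @ H.T + R                           # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)                # per-component Kalman gain
            means.append(m + K @ (y - H @ m))
            covs.append((np.eye(n_state) - K @ H) @ P)
            innov = y - H @ m                             # re-weight by marginal likelihood of y
            log_w.append(np.log(gmm.weights_[k])
                         - 0.5 * innov @ np.linalg.solve(S, innov)
                         - 0.5 * np.linalg.slogdet(S)[1])
        log_w = np.asarray(log_w)
        w = np.exp(log_w - log_w.max()); w /= w.sum()
        comps = rng.choice(n_components, size=n_ens, p=w)  # sample the posterior mixture
        return np.array([rng.multivariate_normal(means[k], covs[k]) for k in comps])

    # Bimodal toy forecast ensemble, one noisy observation of the first state variable
    rng = np.random.default_rng(0)
    ensemble = np.vstack([rng.normal([-2.0, 0.0], 0.3, size=(50, 2)),
                          rng.normal([2.0, 1.0], 0.3, size=(50, 2))])
    H, R, y = np.array([[1.0, 0.0]]), np.array([[0.05]]), np.array([1.8])
    print(gmm_prior_analysis(ensemble, y, H, R).mean(axis=0))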

Other Projects:

  • Downscaling and stochastic parametrization.
  • Automated parameter tuning of the Hybrid Monte Carlo filter and smoother via optimization.
  • A new optimal experimental design (OED) framework for covariance inflation and localization in ensemble filters.