Keynote Abstracts

Professor Qiang Du

Title: Asymptotically compatible discretizations of nonlocal models and their local limits

Abstract: Nonlocality is ubiquitous in nature and is a generic feature of model reduction of complex systems. Nonlocal models and nonlocal balance laws are attractive alternatives for treating anomalous processes and singular behavior. We present the framework of asymptotically compatible discretizations for parameterized variational problems, which provide convergent approximations both in the nonlocal setting and in the local limit. These methods allow consistent and robust simulations of problems involving multiple scales and are useful for the validation of nonlocal models and simulations in physical applications.
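
As a back-of-the-envelope illustration (our notation and normalization, not necessarily the talk's), a prototypical nonlocal diffusion operator with horizon delta acts as

    \mathcal{L}_\delta u(x) = \int_{B_\delta(x)} \gamma_\delta(|y - x|)\,\bigl(u(y) - u(x)\bigr)\,dy,

and, when the kernel is normalized so that \int_{B_\delta(0)} \gamma_\delta(|z|)\,|z|^2\,dz = 2d, this operator converges to the Laplacian \Delta u as the horizon delta tends to zero. An asymptotically compatible family of discretizations is then one whose numerical solutions converge to the nonlocal solution as the mesh size h tends to zero for fixed delta, and to the solution of the local limit as delta and h tend to zero together, regardless of how the two parameters are driven to zero.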

Professor Liliana Borcea

Title: Imaging with waves in complex environments

Abstract: The talk is concerned with the application of sensor array imaging in complex environments. The goal of imaging is to estimate the support of remote sources or strong reflectors using time-resolved measurements of waves at a collection of sensors (the array). This is a challenging problem when the imaging environment is complex, due to numerous small-scale inhomogeneities and/or rough boundaries that scatter the waves. Mathematically, we model such complexity (which is necessarily uncertain in applications) using random processes, and thus study imaging in random media. I will focus on imaging in random waveguides, which exhibits all the challenges of imaging in random media. I will present a quantitative study of cumulative scattering effects in such waveguides and then explain how such a study can be used to design high-fidelity imaging methods.
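
To make the array-imaging setup concrete, the following is a minimal sketch of classical travel-time (Kirchhoff-style) migration for locating a remote source with an array. It shows only the generic idea of backpropagating array data along predicted travel times; it is not the high-fidelity methods for random waveguides developed in the talk, and the function names, homogeneous-medium travel-time model, and reference speed are illustrative assumptions.

    # Minimal travel-time migration sketch (illustrative assumptions throughout):
    # stack the recorded traces along the travel times predicted for each trial
    # source location, under a homogeneous-medium model with reference speed c.
    import numpy as np

    def migrate(data, t, receivers, grid_points, c=1500.0):
        """Sum array traces along predicted travel times for each trial source point.

        data        : (n_receivers, n_times) recorded traces
        t           : (n_times,) sample times
        receivers   : (n_receivers, d) sensor positions
        grid_points : (n_points, d) trial source locations (the imaging grid)
        c           : reference wave speed (homogeneous-medium assumption)
        """
        image = np.zeros(len(grid_points))
        for k, y in enumerate(grid_points):
            # travel time from the trial source location y to each receiver
            tau = np.linalg.norm(receivers - y, axis=1) / c
            for r, trace in enumerate(data):
                # evaluate each trace at its predicted arrival time and stack
                image[k] += np.interp(tau[r], t, trace, left=0.0, right=0.0)
        return image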

Professor J. Tinsley Oden

Title: The Emergence of Predictive Computational Science

Abstract: The idea of predictive science has arisen in recent years in reference to the body of scientific knowledge that determines the predictability of mathematical and computational models of physical events. It embraces the processes of model selection, calibration, validation, and verification, and their use in forecasting specific features of physical events with quantified uncertainty. In no area of computational science is the issue of predictivity more complex than in the analysis of multiscale models of atomistic and molecular systems. This area is among those discussed in this lecture: the validity and predictability of coarse-grained models of atomistic systems. We begin by tracing the foundations of scientific prediction from the classical notions of deductive and inductive logic to the generalizations of probability theory in the writings of R.T. Cox and others, which argue that the natural extension of Aristotelian logic that accounts for uncertainty is Bayesian. We discuss general procedures for model selection based on the notion of model plausibilities for given observational data, together with statistical calibration and validation of models, all demonstrated in applications to coarse-grained models of atomistic systems. Beyond the Bayesian framework, maximum entropy methods are presented as approaches for computing priors and designing validation experiments. We discuss, among several special topics, the idea of adaptive modeling based on model plausibilities, model sensitivity to variations in model parameters, and issues of model inadequacy. We propose an algorithm motivated by Occam's Razor, in which the simplest model among a class of parametric models is selected adaptively according to validation criteria.
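
As a toy illustration of how model plausibilities can drive an Occam-style choice among competing models, here is a minimal sketch of our own; it is not the specific adaptive algorithm proposed in the lecture, and the evidence values, tolerance, and model names are assumptions.

    # Bayesian model plausibilities plus an Occam-style rule: among models that
    # are nearly as plausible as the best one, prefer the model with the fewest
    # parameters. Illustrative sketch only; all numbers below are made up.
    import numpy as np

    def plausibilities(log_evidences, log_priors=None):
        """Posterior model plausibilities pi(M_j | D) from log-evidences log p(D | M_j)."""
        log_ev = np.asarray(log_evidences, dtype=float)
        if log_priors is None:
            log_priors = np.zeros_like(log_ev)   # uniform prior over the model class
        log_post = log_ev + log_priors
        log_post -= log_post.max()               # stabilize the exponentials
        weights = np.exp(log_post)
        return weights / weights.sum()

    def select_simplest(models, log_evidences, tol=0.05):
        """Pick the fewest-parameter model among those within tol of the top plausibility."""
        p = plausibilities(log_evidences)
        candidates = [m for m, pj in zip(models, p) if pj >= p.max() - tol]
        return min(candidates, key=lambda m: m["n_params"])

    # Three nested models with made-up log-evidences: M2 and M3 are nearly
    # equally plausible, so the simpler M2 is selected.
    models = [{"name": "M1", "n_params": 2},
              {"name": "M2", "n_params": 5},
              {"name": "M3", "n_params": 9}]
    print(select_simplest(models, log_evidences=[-110.0, -102.15, -102.1]))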

Professor Douglas Arnold

Title: The Fundamental Theorem of Numerical Analysis

Abstract: The accuracy of a numerical simulation depends on the consistency and the stability of the discretization. The paradigm that consistency and stability together lead to convergence recurs throughout numerical analysis, and is especially important in the numerical solution of partial differential equations. However, consistency, and especially stability, can be subtle and elusive. Even relatively simple examples can yield unexpected, and sometimes catastrophic, results. Traditionally, numerical analysis has relied on elementary tools such as Taylor expansions, Fourier series, and matrix analysis to explore convergence and stability. In response to ever more challenging problems, numerical analysts are bringing a new array of techniques to bear, including tools from differential geometry and algebraic topology that have enabled recent breakthroughs.
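
As a small self-contained illustration of how a simple computation can fail catastrophically without stability (the example and its parameters are ours, not taken from the lecture): the explicit finite-difference scheme for the heat equation u_t = u_xx is consistent for any step sizes, yet it is stable, and hence convergent, only when the mesh ratio dt/dx^2 is at most 1/2.

    # Explicit finite differences for u_t = u_xx on [0,1] with zero boundary
    # values. The scheme is consistent for any mesh ratio r = dt/dx**2, but
    # stable only for r <= 1/2; above that threshold roundoff-seeded modes grow.
    import numpy as np

    def heat_explicit(r, nx=50, steps=400):
        """March the explicit heat-equation scheme with mesh ratio r and return max|u|."""
        x = np.linspace(0.0, 1.0, nx + 1)
        u = np.sin(np.pi * x)                                  # smooth initial data
        for _ in range(steps):
            u[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])    # update interior nodes
        return np.max(np.abs(u))

    print(heat_explicit(r=0.4))   # stable: the solution decays smoothly
    print(heat_explicit(r=0.6))   # unstable: high-frequency modes grow catastrophically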

Professor Erika Tatiana Camacho

Title: My research, passion, and story: the intersection of modeling photoreceptor degeneration, diversifying the mathematical sciences, and contributing to a strong scientific workforce

Abstract: Finding a solution to blindness requires mathematically modeling photoreceptor degeneration, exploring potential treatments via in silico experiments, talking across disciplines, and much more. In the face of rapidly accelerating social, environmental, and medical/health challenges, there is an urgent need to create a strong quantitative workforce. Addressing these challenges requires intense, aggressive, and innovative efforts at every level. As educators and members of a larger community, we need to create an environment where our students can take part in the process of “becoming scientists”. We need to make students’ personal experiences, cultural perspectives, interests, and passions the starting point of their STEM-C education. These are key to enticing, actively recruiting, and successfully retaining students, especially underrepresented minorities and women, in scientific disciplines and careers.

In this talk I will provide insights into potential solutions to these challenges through my story, research on modeling photoreceptor degeneration, and expertise in using a global perspective to effect change in STEM-C education.

 

National Public Radio interviewed Professor Camacho in March 2017.


