Implicit neural representations for model reduction

New papers present an alternative view of kinematic approximation

Excited to share a few of our recent research papers (with Maurizio Chiaramonte and our academic collaborators) that aim to unlock immersive Metaverse experiences via model reduction applied to large-scale computational-physics models. These papers show that leveraging implicit neural representations—which have been popularized in geometric computer vision (e.g., NeRF, DeepSDF)—brings many benefits when used to develop kinematic approximations for model reduction, such as facilitating analytic computation of spatiotemporal gradients and enabling simulation super-resolution.
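To make the gradient benefit concrete, here is a minimal sketch of the general idea: a coordinate-based MLP represents a continuous field over space and time, so its spatiotemporal gradient can be computed exactly via automatic differentiation rather than by numerical differencing on a mesh. The architecture, field, and all names below are illustrative assumptions, not the models from the papers.

```python
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Initialize a small fully connected network (hypothetical architecture)."""
    params = []
    for d_in, d_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in)
        params.append((w, jnp.zeros(d_out)))
    return params

def field(params, xt):
    """Scalar field u(x, y, t) represented implicitly by the network."""
    h = xt
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]

# Input coordinates are (x, y, t); output is a scalar field value.
params = init_mlp(jax.random.PRNGKey(0), [3, 32, 32, 1])

# Exact spatiotemporal gradient (du/dx, du/dy, du/dt) via autodiff.
grad_field = jax.grad(field, argnums=1)

xt = jnp.array([0.1, 0.2, 0.5])
g = grad_field(params, xt)
```

Because the representation is continuous in the coordinates, the same network can also be queried at arbitrary resolution, which is the mechanism behind the super-resolution capability mentioned above.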

Kevin T. Carlberg
Director of AI Research Science

My research combines machine learning, computational physics, and high-performance computing. The objective is to discover structure in data to drastically reduce the cost of simulating nonlinear dynamical systems at extreme scale. I also work on technologies that enable the future of virtual and augmented reality.