Multiple Physics Pretraining for Physical Surrogate Models

By M. M. McCabe et al.
Table of Contents

ABSTRACT
INTRODUCTION
BACKGROUND
RELATED WORK
SCALABLE MULTIPLE PHYSICS PRETRAINING
COMPOSITIONALITY AND PRETRAINING
ARCHITECTURE
BALANCING OBJECTIVES DURING TRAINING

Summary

The document discusses Multiple Physics Pretraining (MPP), an approach for task-agnostic pretraining of physical surrogate models. MPP embeds multiple heterogeneous physical systems into a shared space for autoregressive prediction of the next timestep. The paper reports that MPP-pretrained models surpass task-specific baselines without finetuning, and that they transfer well to low-data systems and to downstream tasks such as inferring simulation parameters. The architecture uses axial attention in a ViT backbone, together with field embedding and normalization strategies, and training balances the objectives of the different systems through diverse task sampling.
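The axial attention mentioned above factorizes full 2D self-attention into two cheaper passes, one along each spatial axis, reducing the cost from O((HW)^2) to O(HW·(H+W)). A minimal sketch of that idea, using identity projections in place of learned query/key/value weights and omitting multi-head structure (assumptions for brevity, not the paper's exact implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Standard scaled dot-product attention over the second-to-last axis.
    # q, k, v: (..., seq_len, dim)
    d = q.shape[-1]
    scores = q @ np.swapaxes(k, -1, -2) / np.sqrt(d)
    return softmax(scores, axis=-1) @ v

def axial_attention(x):
    # x: (H, W, C) field snapshot. Attend along the width axis for each
    # row, then along the height axis for each column. Each pass mixes
    # information along one spatial dimension only.
    x = attention(x, x, x)            # row-wise: sequences of length W
    xt = np.swapaxes(x, 0, 1)         # (W, H, C)
    xt = attention(xt, xt, xt)        # column-wise: sequences of length H
    return np.swapaxes(xt, 0, 1)      # back to (H, W, C)

out = axial_attention(np.random.randn(8, 16, 4))
print(out.shape)  # (8, 16, 4)
```

Two such passes let every position influence every other position (through a shared row and column), which is why axial attention is a common efficiency trick for vision transformers on gridded data.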