EGU23-2944
https://doi.org/10.5194/egusphere-egu23-2944
EGU General Assembly 2023
© Author(s) 2023. This work is distributed under
the Creative Commons Attribution 4.0 License.

Foundation AI Models for Science

Manil Maskey, Rahul Ramachandran, Tsengdar Lee, and Raghu Ganti
  • Madison, United States of America (manil.maskey@nasa.gov)

Foundation Models (FMs) are AI models designed to replace task- or application-specific models. A single FM can be applied to many different downstream applications. FMs are trained using self-supervised techniques and can be built on any type of sequence data. The use of self-supervised learning removes the hurdle of developing a large labeled dataset for training. Most FMs use the transformer architecture, which relies on the notion of self-attention and allows the network to model the influence of distant data points on each other in both space and time. FMs exhibit emergent properties that are induced from the data.
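
As a minimal sketch of the self-attention mechanism mentioned above (illustrative only, not drawn from the authors' work; all names are placeholders), scaled dot-product attention lets every position in a sequence attend directly to every other position, however distant:

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each output row is a weighted mix of all value vectors, so
    distant positions in the sequence influence one another directly."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # pairwise similarity, (n, n)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # (n, d_v) context vectors

# Toy sequence of 4 tokens with 8-dimensional embeddings
x = np.random.randn(4, 8)
out = scaled_dot_product_attention(x, x, x)         # self-attention: Q = K = V
print(out.shape)                                    # (4, 8)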

FMs can be an important tool for science. The scale of these models yields better performance across different downstream applications, which show better accuracy than models built from scratch. FMs drastically reduce the cost of entry, in both time and effort, for building downstream applications. An FM trained on selected science datasets, such as optical satellite data, can accelerate applications ranging from data quality monitoring to feature detection and prediction. FMs can make it easier to infuse AI into scientific research by removing the training-data bottleneck and increasing the use of science data.
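
As a hedged sketch of this reduced cost of entry (the encoder, head, and data here are hypothetical stand-ins, not the authors' models or datasets), a downstream application can reuse a pretrained FM encoder and train only a small task head on a modest labeled set:

import torch
import torch.nn as nn

# Stand-in for an FM encoder pretrained with self-supervision
# on, e.g., optical satellite data (hypothetical architecture).
encoder = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 256))
for p in encoder.parameters():
    p.requires_grad = False          # reuse the FM's representations as-is

# Small task-specific head, e.g., a feature-detection classifier.
head = nn.Linear(256, 10)
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Toy labeled batch standing in for a (much smaller) downstream dataset.
x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
for _ in range(5):
    logits = head(encoder(x))        # frozen encoder, trainable head
    loss = loss_fn(logits, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Because only the head's parameters are updated, the labeled-data and compute requirements are a fraction of what training an equivalent model from scratch would demand.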

How to cite: Maskey, M., Ramachandran, R., Lee, T., and Ganti, R.: Foundation AI Models for Science, EGU General Assembly 2023, Vienna, Austria, 24–28 Apr 2023, EGU23-2944, https://doi.org/10.5194/egusphere-egu23-2944, 2023.