The Growing Momentum for AI Foundation Models in Biotech and 12 Notable Companies
As artificial intelligence (AI) foundation models grow increasingly capable, they are finding applications across a wide range of industries, including biotech.
The most prominent examples of general-purpose foundation models are GPT-3 and GPT-4, which form the basis of ChatGPT, and BERT (Bidirectional Encoder Representations from Transformers).
These are gigantic models trained on enormous volumes of data, often in a self-supervised or unsupervised manner (without the need for labeled data).
Thanks to design choices such as the transformer architecture and attention mechanisms, foundation models are inherently generalizable: they can be adapted to a diverse array of downstream tasks, unlike traditional AI models that excel at a single task, such as predicting molecule–target interactions.
The "foundation" aspect comes from this generalizability: once pre-trained, a model can be fine-tuned with smaller, domain-specific datasets to excel at specific tasks, reducing the need to train new models from scratch. This lets a single pre-trained model serve as a versatile base for a multitude of applications, from natural language processing to bioinformatics, adapting to the nuances of particular challenges through additional training.
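The pre-train-then-fine-tune workflow described above can be sketched with a deliberately tiny toy model. This is not from the article and not how real foundation models are built (those use transformer networks and billions of parameters); it is a minimal logistic-regression analogy, with synthetic data standing in for a generic corpus and a small domain dataset, showing why starting from pre-trained weights beats training from scratch on scarce domain data. All function and variable names here are illustrative inventions.

```python
import math
import random

random.seed(0)

def predict(w, x):
    # Sigmoid of the dot product: a one-layer logistic model
    return 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))

def sgd(w, data, lr, epochs):
    """Plain stochastic gradient descent on the log loss."""
    w = list(w)
    for _ in range(epochs):
        for x, y in data:
            g = predict(w, x) - y  # gradient of log loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
    return w

def log_loss(w, data):
    eps = 1e-12  # guard against log(0)
    return -sum(
        y * math.log(predict(w, x) + eps)
        + (1 - y) * math.log(1.0 - predict(w, x) + eps)
        for x, y in data
    ) / len(data)

def sample(n, dim=5):
    # Synthetic stand-in for training data: label is 1 when the feature sum is positive
    out = []
    for _ in range(n):
        x = [random.gauss(0, 1) for _ in range(dim)]
        out.append((x, 1 if sum(x) > 0 else 0))
    return out

# "Pre-training": a large generic dataset shapes the foundation weights
foundation_w = sgd([0.0] * 5, sample(2000), lr=0.1, epochs=3)

# "Fine-tuning": a small domain-specific dataset, short training budget
domain_data = sample(20)
fine_tuned_w = sgd(foundation_w, domain_data, lr=0.01, epochs=2)

# Baseline: same small dataset and budget, but starting from scratch
scratch_w = sgd([0.0] * 5, domain_data, lr=0.01, epochs=2)

ft_loss = log_loss(fine_tuned_w, domain_data)
scratch_loss = log_loss(scratch_w, domain_data)
```

Under this setup the fine-tuned model reaches a much lower domain loss than the from-scratch baseline, because the pre-trained weights already encode the shared structure of the task; the small domain dataset only needs to nudge them, which is the economic argument for foundation models in data-scarce fields like biotech.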