- As noted in the Artificial Intelligence Index Report 2021, the number of AI journal publications grew by 34.5% last year.
- The advent of Transformers led to the development of GPT-3 (Generative Pre-trained Transformer 3), which boasts 175 billion parameters, was trained on 45 TB of text data, and reportedly cost upward of $12 million to build.
- Transformers: More than meets the eye
- Generative adversarial networks
- Machine learning meets molecular synthesis
- Neuro-symbolic A.I.