- Analysts estimate that by 2025, 30% of all generated data will be real-time data – roughly 52 zettabytes (ZB) per year, about as much as the total data produced in 2020.
- Over the last decade, technologies such as Apache Kafka, Redpanda, Materialize, and Deephaven have been developed to work with these streams of real-time data.
- But to really make such enormous volumes of data useful, artificial intelligence (AI) must be employed.
- To make real-time AI ubiquitous, supporting software must be developed. This software needs to provide:
- An easy path to transition from static to dynamic data
- An easy path for cleaning static and dynamic data
- An easy path for going from model creation and validation to production
- An easy path for managing the software as requirements – and the outside world – change
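The first requirement – an easy transition from static to dynamic data – can be illustrated with a minimal sketch. The idea (an assumption here, not code from any of the libraries named above) is that the same computation should work unchanged whether it consumes a fixed batch or an unbounded stream; the `running_mean` function and `ticking_source` generator below are hypothetical names invented for this example.

```python
from typing import Iterable, Iterator

def running_mean(values: Iterable[float]) -> Iterator[float]:
    """Yield the mean of all values seen so far.

    Works identically on a static list and a live stream,
    because it only ever consumes one value at a time."""
    total, count = 0.0, 0
    for v in values:
        total += v
        count += 1
        yield total / count

# Static case: a fixed dataset processed in one pass.
static_data = [3.0, 1.0, 4.0, 1.0, 5.0]
batch_result = list(running_mean(static_data))[-1]

# Dynamic case: the identical function consumes a generator
# that, in a real system, would deliver values as they arrive.
def ticking_source() -> Iterator[float]:
    for v in [3.0, 1.0, 4.0, 1.0, 5.0]:
        yield v

stream_result = None
for m in running_mean(ticking_source()):
    stream_result = m  # updates incrementally with each new value
```

Because the aggregation logic never assumes the input is finite, moving from a backtest on historical data to a production stream requires swapping the source, not rewriting the computation – which is the kind of "easy path" the list above calls for.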