From the time I began work on AI in 2010 to now, the amount of compute that goes into training frontier AI models has grown by a staggering 1 trillion times, from roughly 10¹⁴ FLOPs for early systems to over 10²⁶ FLOPs for today's largest models.
Remarkably, training compute has grown a trillion-fold in just 16 years, an exponential increase that is genuinely hard to fathom. This explosion in computing power far outstrips human intuition, which helps explain why AI progress has been so rapid and so hard to predict. Most people cannot truly grasp what exponential growth of this magnitude means, and that is why so many expert forecasts of AI's pace of development have failed.
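To make the exponential concrete, here is a minimal back-of-envelope sketch of the arithmetic implied by the figures above (10¹⁴ → 10²⁶ FLOPs over 16 years); the derived annual multiplier and doubling time are illustrative consequences of those two endpoints, not figures from the text:

```python
import math

# Endpoints from the text: frontier training compute, ~2010 vs. today.
start_flops = 1e14   # early systems
end_flops = 1e26     # today's largest models
years = 16

total_growth = end_flops / start_flops          # trillion-fold (1e12)
annual_growth = total_growth ** (1 / years)     # implied yearly multiplier
doubling_months = 12 * math.log(2) / math.log(annual_growth)

print(f"total growth:  {total_growth:.0e}x")
print(f"annual growth: ~{annual_growth:.1f}x per year")
print(f"doubling time: ~{doubling_months:.1f} months")
```

A trillion-fold increase over 16 years works out to roughly a 5.6x multiplier every year, i.e. compute doubling about every five months, which is the kind of compounding that linear human intuition consistently underestimates.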