3 Matching Annotations
  1. Last 7 days
    1. When models go wrong, we will want to know why. What led the drone to abandon its intended target and detonate in a field hospital? Why is the healthcare model less likely to accurately diagnose Black people?

      These questions about AI failure scenarios point to a core challenge for future society. As AI systems are deployed in more critical domains, we will need new accountability mechanisms and explanatory frameworks. The proposed role of "entrail diviner" (内脏占卜师) hints that we may need entirely new methodologies to understand and explain the behavior of complex systems, which could give rise to new interdisciplinary fields of research.

  2. Jul 2021
    1. An “attention map” of each prediction shows the important data points considered by the models as they make that prediction.

      This gets us closer to explainable AI, in that the model is showing the clinician which variables were important in informing the prediction.
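      The idea sketched above — surfacing which input variables carried the most weight in a prediction — can be illustrated with a minimal, hypothetical example. The feature names and raw scores below are invented for illustration; the only assumption is that a model exposes per-feature attention scores, which are softmax-normalized and ranked for a clinician to inspect:

      ```python
      import numpy as np

      # Hypothetical per-feature attention scores for one prediction.
      # These names and values are made up for illustration only.
      feature_names = ["age", "heart_rate", "lactate", "creatinine"]
      raw_scores = np.array([0.2, 1.5, 2.3, 0.4])

      # Softmax-normalize so the weights sum to 1 and are comparable.
      weights = np.exp(raw_scores - raw_scores.max())
      weights /= weights.sum()

      # Rank variables by attention weight, most influential first.
      ranking = sorted(zip(feature_names, weights), key=lambda kv: -kv[1])
      for name, w in ranking:
          print(f"{name:12s} {w:.3f}")
      ```

      In a real deployment the scores would come from the model's attention layer rather than being hand-supplied, but the normalize-and-rank step is the same.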

  3. Feb 2021