Autonomous cognitive agents must adapt to changing environments to remain effective and robust. Expectations about the world enable an agent to identify an anomalous situation and thus provide the foundation for an appropriate response. While significant work has addressed anomalies or discrepancies in the world, few examples exist that address discrepancies within an agent's own cognition by explicitly declaring and reasoning about high-level expectations. Such high-level expectations allow an agent to identify anomalous mental situations, which in turn enables the agent to adapt its own knowledge or cognitive processes when solving complex problems. This paper introduces a new class of expectations called metacognitive expectations. Our contributions are the following: (1) a general formalism for these expectations; (2) an implementation of metacognitive expectations in an agent embodied within a cognitive architecture; and (3) experimental results indicating that an agent equipped with metacognitive expectations can outperform agents operating without such structures.