The Logic of Hebbian Learning
DOI: https://doi.org/10.32473/flairs.v35i.130735

Keywords: Neurosymbolic AI, Hebbian Learning, Dynamic Logics, Knowledge Representation and Reasoning, Nonmonotonic Reasoning, Preference Upgrade

Abstract
We present the logic of Hebbian learning, a dynamic logic whose semantics are expressed in terms of a layered neural network learning via Hebb's associative learning rule. Its language consists of a modality Tφ (read "typically φ," formalized as forward propagation), conditionals φ ⇒ ψ (read "typically φ are ψ"), and dynamic modalities [φ+]ψ (read "evaluate ψ after performing Hebbian update on φ"). We give axioms and inference rules that are sound with respect to the neural semantics; these axioms characterize Hebbian learning and its interaction with propagation. The upshot is that this logic describes a neuro-symbolic agent that both learns from experience and reasons about what it has learned.
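The two operations the abstract pairs, forward propagation (the T modality) and Hebbian update (the [φ+] modality), can be sketched in code. The following is a minimal, generic illustration of the classic Hebb rule (Δw = η · pre · post) on a single weighted layer; the network shape, threshold activation, and learning rate are all assumptions for illustration, not the paper's exact neural semantics.

```python
def forward(weights, x, threshold=0.5):
    """Propagate input x through one layer with a binary threshold activation.

    weights[j] holds the incoming weights of output unit j.
    """
    return [1.0 if sum(w_ij * x_i for w_ij, x_i in zip(col, x)) >= threshold
            else 0.0
            for col in weights]

def hebbian_update(weights, x, eta=0.1):
    """Hebb's rule: strengthen each weight by eta * (pre-act) * (post-act).

    Units that fire together wire together; inactive pairs are unchanged.
    """
    y = forward(weights, x)
    return [[w_ij + eta * x_i * y_j for w_ij, x_i in zip(col, x)]
            for col, y_j in zip(weights, y)]

# Example: two input units feeding one output unit.
w = [[0.3, 0.3]]          # weak incoming weights
x = [1.0, 1.0]            # both input units active
print(forward(w, x))      # unit fires, since 0.3 + 0.3 >= 0.5
w = hebbian_update(w, x)
print(w)                  # both weights strengthened toward 0.4
```

After the update, the association between the co-active input pattern and the output unit is stronger, which is the dynamic that [φ+]ψ reasons about: what holds after the network has been trained on φ.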
License
Copyright (c) 2022 Caleb Kisby, Saúl Blanco, Lawrence Moss
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.