The Logic of Hebbian Learning

Authors

  • Caleb Kisby Department of Computer Science, Indiana University Bloomington
  • Saúl Blanco Department of Computer Science, Indiana University Bloomington
  • Lawrence Moss Department of Mathematics, Indiana University Bloomington

DOI:

https://doi.org/10.32473/flairs.v35i.130735

Keywords:

Neurosymbolic AI, Hebbian Learning, Dynamic Logics, Knowledge Representation and Reasoning, Nonmonotonic Reasoning, Preference Upgrade

Abstract

We present the logic of Hebbian learning, a dynamic logic
whose semantics are expressed in terms of a layered neural
network learning via Hebb's associative learning rule. Its
language consists of a modality Tφ (read "typically φ,"
formalized as forward propagation), conditionals φ ⇒ ψ (read
"typically φ are ψ"), as well as dynamic modalities [φ+]ψ
(read "evaluate ψ after performing Hebbian update on φ"). We
give axioms and inference rules that are sound with respect
to the neural semantics; these axioms characterize Hebbian
learning and its interaction with propagation. The upshot is
that this logic describes a neuro-symbolic agent that both
learns from experience and reasons about what it has learned.
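The Hebbian update described in the abstract can be illustrated with a minimal sketch. This is not the paper's formal neural semantics; the learning rate `eta`, the threshold activation, and the network shapes are illustrative assumptions.

```python
import numpy as np

def hebbian_update(W, x, y, eta=0.1):
    """Hebb's associative rule: strengthen each connection in
    proportion to the co-activation of its pre- and postsynaptic
    units, i.e. delta W[i, j] = eta * y[i] * x[j]."""
    return W + eta * np.outer(y, x)

def propagate(W, x, threshold=0.5):
    """One forward-propagation step through a single layer,
    using a simple threshold activation (an illustrative choice)."""
    return (W @ x > threshold).astype(float)

# Present a pattern x together with downstream activation y,
# then update the weights; repeated co-activation strengthens
# the association between the two patterns.
W = np.zeros((2, 3))
x = np.array([1.0, 0.0, 1.0])   # presented input pattern
y = np.array([1.0, 1.0])        # co-active downstream units
W = hebbian_update(W, x, y)
```

After the update, the weight matrix carries nonzero entries exactly where the input and output units were jointly active.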

Published

04-05-2022

How to Cite

Kisby, C., Blanco, S., & Moss, L. (2022). The Logic of Hebbian Learning. The International FLAIRS Conference Proceedings, 35. https://doi.org/10.32473/flairs.v35i.130735

Section

Special Track: Semantic, Logics, Information Extraction and AI