Probabilistic Language of Thought by Fausto Carcassi
Some of our most developed and computationally expensive models of cognitive processes, artificial neural networks, struggle to match human abilities in several respects: few-shot and zero-shot learning (inferring a rule from few or no examples) and manipulating compositionally structured representations and symbols with rich logical structure. This course focuses on a promising framework for modelling these capacities: the probabilistic Language of Thought (LoT). We will start with the philosophical underpinnings of the program in the work of Jerry Fodor. Then we will discuss recent developments that combine the LoT with probabilistic approaches to learning, yielding models of category acquisition across a variety of conceptual domains. This will require a discussion of several technical tools (formal grammars, compositional semantics, Bayesian inference). Finally, we will look at recent promising directions in the field (neurosymbolic learning, the child as a hacker).
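To give a flavour of how the technical tools mentioned above fit together, here is a minimal sketch (not course material, and with an invented toy domain) of a probabilistic LoT model of category learning: hypotheses are boolean formulas generated from a small symbolic vocabulary, a simplicity prior favours shorter formulas, and Bayesian inference scores each formula against labelled examples.

```python
import itertools
import math

# Hypothetical toy domain: objects are boolean feature vectors (is_red, is_round).
FEATURES = ["red", "round"]

def make_hypotheses():
    """Enumerate a tiny LoT: feature tests, their negations, and
    two-way conjunctions/disjunctions. Each hypothesis is
    (name, function, size); size drives the simplicity prior."""
    hyps = []
    for i, f in enumerate(FEATURES):
        hyps.append((f, lambda x, i=i: x[i], 1))
        hyps.append((f"not {f}", lambda x, i=i: not x[i], 2))
    base = list(hyps)
    for (n1, f1, s1), (n2, f2, s2) in itertools.combinations(base, 2):
        hyps.append((f"{n1} and {n2}",
                     lambda x, f1=f1, f2=f2: f1(x) and f2(x), s1 + s2 + 1))
        hyps.append((f"{n1} or {n2}",
                     lambda x, f1=f1, f2=f2: f1(x) or f2(x), s1 + s2 + 1))
    return hyps

def posterior(data, hyps, noise=0.05):
    """Posterior over formulas: prior ∝ exp(-size) (simpler is likelier),
    likelihood assumes each label agrees with the formula w.p. 1 - noise."""
    scores = []
    for name, f, size in hyps:
        logp = -size
        for x, label in data:
            logp += math.log(1 - noise if f(x) == label else noise)
        scores.append((name, logp))
    m = max(lp for _, lp in scores)
    total = sum(math.exp(lp - m) for _, lp in scores)
    return {name: math.exp(lp - m) / total for name, lp in scores}

# Two positive examples of red round objects, one negative non-red object.
data = [((True, True), True), ((True, True), True), ((False, True), False)]
post = posterior(data, make_hypotheses())
best = max(post, key=post.get)
```

On this data the maximum a posteriori hypothesis is the single-feature formula "red": it fits all three examples, and the simplicity prior prefers it to the equally accurate but longer "red and round". This size-penalised Bayesian scoring over a grammar of symbolic hypotheses is the core move of probabilistic LoT models of category acquisition.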