Faculty of Computer Science

Research Group Theoretical Computer Science


Oberseminar: Heterogene formale Methoden


Date: June 29, 2021
Time: 10:00 a.m.
Place: Online
Author: Mossakowski, Till
Title: Differentiable Logic Machines

Abstract:

This talk presents Differentiable Logic Machines, a form of neural-symbolic integration that builds on the Neural Logic Machines presented earlier in this seminar. They can extract a logic program from noisy data, outperforming both conventional neural networks and inductive logic programming baselines. The talk is based on the paper whose abstract follows.

Abstract of the paper:
The integration of reasoning, learning, and decision-making is key to building more general AI systems. As a step in this direction, we propose a novel neural-logic architecture that can solve both inductive logic programming (ILP) and deep reinforcement learning (RL) problems. Our architecture defines a restricted but expressive continuous space of first-order logic programs by assigning weights to predicates instead of rules. It is therefore fully differentiable and can be efficiently trained with gradient descent. In addition, for the deep RL setting with actor-critic algorithms, we propose a novel efficient critic architecture. Compared to state-of-the-art methods on both ILP and RL problems, our approach achieves excellent performance while providing a fully interpretable solution and scaling much better, especially during the testing phase.
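The core idea of weighting predicates rather than rules can be illustrated in miniature. The following is a hypothetical sketch (not the authors' code): a rule body selects among candidate predicates via a softmax over learnable weights, so the selection is continuous and trainable by gradient descent; after training, the weights concentrate on one predicate and can be read off as a discrete, interpretable choice.

```python
import numpy as np

def softmax(w):
    e = np.exp(w - w.max())
    return e / e.sum()

def soft_rule(weights, predicate_values):
    """Soft predicate selection: softmax-weighted mixture of fuzzy truth values."""
    return softmax(weights) @ predicate_values

# Three candidate predicates evaluated on one example (fuzzy truth values);
# the supervision signal says the second predicate is the right choice.
preds = np.array([0.1, 0.9, 0.2])
target = 0.9
w = np.zeros(3)          # start with a uniform (uninformed) selection

lr, eps = 5.0, 1e-6
for _ in range(200):
    # numerical gradient of the squared error w.r.t. the predicate weights
    grad = np.zeros_like(w)
    for i in range(3):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        grad[i] = ((soft_rule(wp, preds) - target) ** 2
                   - (soft_rule(wm, preds) - target) ** 2) / (2 * eps)
    w -= lr * grad

# The softmax now concentrates on the matching predicate (index 1),
# which is the step that makes the learned program interpretable.
print(softmax(w).argmax())
```

In the actual architecture this selection happens per rule slot over many predicates and is trained end to end with backpropagation rather than numerical gradients; the sketch only shows why weighting predicates yields a differentiable yet extractable program.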

