Deep Learning for Symbolic Mathematics

This model solves integrals and ODEs by treating them as a seq2seq translation task!

Abstract:
Neural networks have a reputation for being better at solving statistical or approximate problems than at performing calculations or working with symbolic data. In this paper, we show that they can be surprisingly good at more elaborate tasks in mathematics, such as symbolic integration and solving differential equations. We propose a syntax for representing mathematical problems, and methods for generating large datasets that can be used to train sequence-to-sequence models. We achieve results that outperform commercial Computer Algebra Systems such as Matlab or Mathematica.
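
To make the abstract concrete: the paper serializes expression trees into token sequences and builds training data "backward" (differentiating a random function is easy, so the pair (f', f) becomes an integration example). The sketch below illustrates that idea in Python using sympy; the toy generator and the to_prefix helper are illustrative assumptions, not the authors' actual pipeline.

# Minimal sketch of backward data generation for integration,
# using sympy (an assumption; the authors use their own generator).
import random
import sympy as sp

x = sp.Symbol("x")

def random_function():
    # Toy sampler: combine a few atoms with + and *.
    atoms = [x, x**2, sp.sin(x), sp.cos(x), sp.exp(x), sp.log(x + 1)]
    f = random.choice(atoms)
    for _ in range(random.randint(1, 2)):
        g = random.choice(atoms)
        f = f + g if random.random() < 0.5 else f * g
    return f

def to_prefix(expr):
    # Serialize an expression tree to prefix (Polish) tokens.
    # Note: the paper binarizes n-ary + and *; this sketch does not.
    if expr.is_Symbol or expr.is_Number:
        return [str(expr)]
    tokens = [type(expr).__name__.lower()]  # 'add', 'mul', 'sin', ...
    for arg in expr.args:
        tokens.extend(to_prefix(arg))
    return tokens

# Differentiating f is easy; integrating f' is hard.
# So (f', f) is a cheap (input, output) training pair.
f = random_function()
f_prime = sp.diff(f, x)
print("input :", to_prefix(f_prime))
print("output:", to_prefix(f))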

Authors: Guillaume Lample, François Charton

Comments

herp_derpingson:

Read the paper called Grammar Variational Autoencoder (Matt J. Kusner et al.). I think it has a much better approach than just letting an RNN seq2seq memorize the dataset.

In that paper, the neural network predicts the sequence of productions in the context-free grammar. It has a nice, smooth, interpretable latent space, which I think is a good sign that the network is learning something of value.
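
For readers unfamiliar with the Grammar VAE idea mentioned above, here is a rough sketch (a toy, not Kusner et al.'s actual code): the decoder predicts indices into a fixed list of context-free production rules, and the string is rebuilt by always expanding the left-most nonterminal, so every output is syntactically valid by construction. The grammar and rule sequence below are made up for illustration.

# Toy decoding of a production-rule sequence, in the spirit of the
# Grammar VAE (Kusner et al.). The grammar here is invented.
GRAMMAR = {
    0: ("E", ["E", "+", "E"]),
    1: ("E", ["E", "*", "E"]),
    2: ("E", ["sin(", "E", ")"]),
    3: ("E", ["x"]),
}

def decode(rule_ids):
    # Expand the left-most nonterminal "E" with each chosen rule.
    seq = ["E"]
    for rid in rule_ids:
        lhs, rhs = GRAMMAR[rid]
        i = seq.index(lhs)
        seq[i:i + 1] = rhs
    return "".join(seq)

print(decode([0, 3, 2, 3]))  # -> x+sin(x), valid by construction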

glennkroegel:

It seems the eureka moment was realizing that Polish notation is a thing. It makes you wonder whether similarly simple representation changes exist in other domains that could improve things.
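
The reason prefix (Polish) notation fits seq2seq so well is that it encodes the expression tree unambiguously with no parentheses or precedence rules, so a flat token stream can be parsed back deterministically. A minimal reader, as a sketch (the operator table is illustrative):

# Prefix notation needs no parentheses: "+ x * 2 x" encodes x + 2*x.
ARITY = {"+": 2, "*": 2, "sin": 1}  # toy operator table

def parse_prefix(tokens):
    # Consume one expression; return (tree, remaining tokens).
    head, rest = tokens[0], tokens[1:]
    if head not in ARITY:
        return head, rest  # leaf: a variable or number
    args = []
    for _ in range(ARITY[head]):
        sub, rest = parse_prefix(rest)
        args.append(sub)
    return (head, *args), rest

tree, rest = parse_prefix("+ x * 2 x".split())
print(tree)  # ('+', 'x', ('*', '2', 'x'))
assert rest == []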

Collinoeight:

I was just thinking about this earlier and thought, "I wonder if Yannic knows?"

He knows indeed.

blanamaxima:

Why the heck would this work without any inference? Small changes in an equation can change the result dramatically. If solutions were linguistic in nature, we could study math as a foreign language :)
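
The commenter's point is easy to demonstrate with a computer algebra system (a sympy sketch; the model itself uses no such engine at inference time): inputs one token apart can have structurally very different integrals, which is exactly why the result is surprising.

# One-token changes to the integrand give very different answers.
import sympy as sp

x = sp.Symbol("x")
print(sp.integrate(x, x))             # x**2/2             (polynomial)
print(sp.integrate(1 / x, x))         # log(x)             (logarithmic)
print(sp.integrate(sp.exp(x**2), x))  # sqrt(pi)*erfi(x)/2 (non-elementary)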

theadameubanks:

Hello Yannic! I run a YouTube channel focused on AI and computer science, like you do. Would you be interested in collaborating?