
Normal and Triangular Determinantal Representations of Multivariate Polynomials
In this paper we give a new and simple algorithm to put any multivariate...

Improved Polynomial Remainder Sequences for Ore Polynomials
Polynomial remainder sequences contain the intermediate results of the E...

On the Computational Power of Transformers and Its Implications in Sequence Modeling
Transformers are being used extensively across several sequence modeling...

On Explicit Branching Programs for the Rectangular Determinant and Permanent Polynomials
We study the arithmetic circuit complexity of some well-known family of ...

Extending Equational Monadic Reasoning with Monad Transformers
There is recent interest in the verification of monadic programs usin...

Spectrogram Inpainting for Interactive Generation of Instrument Sounds
Modern approaches to sound synthesis using deep neural networks are hard...

Investigating transformers in the decomposition of polygonal shapes as point collections
Transformers can generate predictions in two approaches: 1. autoregress...
Analyzing the Nuances of Transformers' Polynomial Simplification Abilities
Symbolic mathematical tasks such as integration often require multiple well-defined steps and an understanding of subtasks to reach a solution. To understand Transformers' abilities on such tasks in a fine-grained manner, we deviate from traditional end-to-end settings and explore a stepwise polynomial simplification task. Polynomials can be written in a simple normal form as a sum of monomials ordered lexicographically. For a polynomial that is not necessarily in this normal form, a sequence of simplification steps is applied to reach the fully simplified (i.e., normal-form) polynomial. We propose a synthetic polynomial dataset generation algorithm that generates polynomials with unique proof steps. Through varying coefficient configurations, input representations, proof granularity, and extensive hyperparameter tuning, we observe that Transformers consistently struggle with numeric multiplication. We explore two ways to mitigate this: curriculum learning and a symbolic calculator approach (where numeric operations are offloaded to a calculator). Both approaches provide significant gains over the vanilla Transformer-based baseline.
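To make the target of the simplification task concrete, here is a minimal sketch (not the paper's implementation) of the normal form the abstract describes: like monomials are combined, zero terms are dropped, and the surviving monomials are ordered lexicographically by their exponent tuples. The `(coefficient, exponents)` representation is an assumption for illustration.

```python
def normal_form(terms):
    """Reduce a list of (coefficient, exponents) pairs to normal form:
    combine like monomials, drop cancelled terms, and sort monomials
    in descending lexicographic order of their exponent tuples.
    E.g. the term 3*x^2*y is encoded as (3, (2, 1))."""
    combined = {}
    for coeff, exps in terms:
        combined[exps] = combined.get(exps, 0) + coeff
    # Sorting exponent tuples in reverse gives the lexicographic order.
    return [(c, e) for e, c in sorted(combined.items(), reverse=True) if c != 0]

# (x + y)*(x - y) expanded but not simplified: x^2 - x*y + x*y - y^2
poly = [(1, (2, 0)), (-1, (1, 1)), (1, (1, 1)), (-1, (0, 2))]
print(normal_form(poly))  # x^2 - y^2 -> [(1, (2, 0)), (-1, (0, 2))]
```

A stepwise proof in the paper's setting would apply one such combination or reordering per step rather than collapsing everything at once, which is what makes the task decomposable into subtasks.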