Deep Learning and Formal Languages: Building Bridges

Event Notification Type: Call for Papers
Abbreviated Title: delfol
Location: Florence, Italy
Contact: Matthias Gallé
Submission Deadline: Friday, 26 April 2019

Deep Learning and Formal Languages: Building Bridges -- ACL 2019 Workshop

Florence, Italy

Website: https://sites.google.com/view/delfol-workshop-acl19

SUBMISSION DEADLINE: 26 April 2019

While deep learning and neural networks have revolutionized the field of natural language processing, changed the habits of its practitioners, and opened up new research directions, many aspects of the inner workings of deep neural networks remain poorly understood.

At the same time, we have access to many decades of accumulated knowledge on formal languages, grammars, and transductions, both weighted and unweighted, over strings as well as trees: closure properties, the computational complexity of various operations, the relationships between their various classes, and many empirical and theoretical results on their learnability.

The goal of this workshop is to bring together researchers interested in how our understanding of formal languages can contribute to the understanding and design of neural network architectures for natural language processing. For example, fundamental work on neural networks has examined whether they could learn different classes of formal languages, and, reciprocally, whether formal grammars or automata could closely approximate neural networks. Recently we have seen new research directions on what each formalism can bring to understanding or improving the other. Topics which fall within the purview of the workshop include, but are not limited to:

Learnability of formal languages with neural nets (both strong and weak learning)

Relationship between deep learning models and linguistically inspired formalisms

Connections between neural network architectures and classical computational models

Traditional formal grammars augmented through non-linearity

Hybrid models combining neural networks and finite state machines

The use of formal grammars to analyze and interpret the behavior of neural networks

Approximating neural networks with weighted automata and grammars (see the illustrative sketch after this list)

Including formal grammar constraints as symbolic priors in neural networks
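
As a concrete illustration of the kind of object several of these topics concern, the following is a minimal sketch of a weighted finite automaton (WFA) scoring a string as a product of per-symbol transition matrices; the weights are made up for illustration, and this matrix-product view of the computation is what lines of work such as "Rational Recurrences" (see the reading list below) relate to restricted RNN architectures:

    import numpy as np

    # Minimal illustrative WFA over the alphabet {a, b}:
    #   score(x) = alpha . A[x_1] . A[x_2] ... A[x_n] . beta
    # All weights below are arbitrary example values.
    alpha = np.array([1.0, 0.0])          # initial weight vector
    beta = np.array([0.0, 1.0])           # final weight vector
    A = {                                 # one transition matrix per symbol
        "a": np.array([[0.5, 0.5],
                       [0.0, 1.0]]),
        "b": np.array([[1.0, 0.0],
                       [0.2, 0.8]]),
    }

    def wfa_score(string):
        """Fold the string's transition matrices into the state vector."""
        state = alpha
        for symbol in string:
            state = state @ A[symbol]     # linear state update, one per symbol
        return float(state @ beta)

    print(wfa_score("ab"))    # 0.4
    print(wfa_score("abab"))  # ~0.56

Extraction and approximation methods, several of which appear in the reading list below, run this construction in the other direction, seeking a WFA whose scores track a trained network's outputs.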

We call for three types of papers:

(1) Regular workshop papers

(2) Extended abstracts

(3) Cross-submissions

Only papers of type (1) will be included in the workshop proceedings.

Some recent work that falls within the scope of this call includes:

Bridging CNNs, RNNs, and Weighted Finite-State Machines. Roy Schwartz, Sam Thomson, and Noah A. Smith. (ACL 2018)

Rational Recurrences. Hao Peng, Roy Schwartz, Sam Thomson, and Noah A. Smith. (EMNLP 2018)

Recurrent Neural Networks as Weighted Language Recognizers. Yining Chen, Sorcha Gilroy, Andreas Maletti, Jonathan May, and Kevin Knight. (NAACL 2018)

Using Regular Languages to Explore the Representational Capacity of Recurrent Neural Architectures. Abhijit Mahalunkar and John D. Kelleher. (ICANN 2018)

Explaining black boxes on sequential data using weighted automata. Stéphane Ayache, Rémi Eyraud and Noé Goudian. (ICGI 2018)

Extracting Automata from Recurrent Neural Networks Using Queries and Counterexamples. Gail Weiss, Yoav Goldberg, and Eran Yahav. (ICML 2018)

Generalized Earley Parser: Bridging Symbolic Grammars and Sequence Data for Future Prediction. Siyuan Qi, Baoxiong Jia, and Song-Chun Zhu. (ICML 2018)

Efficient Gradient Computation for Structured Output Learning with Rational and Tropical Losses. Corinna Cortes, Vitaly Kuznetsov, Mehryar Mohri, Dmitry Storcheus, and Scott Yang. (NIPS 2018)

Composing RNNs and FSTs for Small Data: Recovering Missing Characters in Old Hawaiian Text. Oiwi Parker Jones and Brendan Shillingford. (IRASL workshop at NIPS 2018)

Verification of Recurrent Neural Networks Through Rule Extraction. Qinglong Wang, Kaixuan Zhang, Xue Liu, and C. Lee Giles. (arXiv 2018)

A Comparison of Rule Extraction for Different Recurrent Neural Network Models and Grammatical Complexity. Qinglong Wang, Kaixuan Zhang, Alexander G. Ororbia II, Xinyu Xing, Xue Liu, and C. Lee Giles. (arXiv 2018)

Grammar Variational Autoencoder. Matt J. Kusner, Brooks Paige, José Miguel Hernández-Lobato. (ICML 2017)

Subregular Complexity and Deep Learning. Enes Avcu, Chihiro Shibata, and Jeffrey Heinz. (LAML 2017)

Recurrent Neural Network Grammars. Chris Dyer, Adhiguna Kuncoro, Miguel Ballesteros, and Noah A. Smith. (NAACL 2016)

Weighting finite-state transductions with neural context. Pushpendre Rastogi, Ryan Cotterell, and Jason Eisner. (NAACL 2016)

Programme Committee
Borja Balle, Amazon

Xavier Carreras, dMetrics

Shay B. Cohen, University of Edinburgh

Alex Clark, University of London

Ewan Dunbar, Université Paris Diderot

Marc Dymetman, Naver Labs Europe

Kyle Gorman, City University of New York

Hadrien Glaude, Amazon

John Hale, University of Georgia

Mans Hulden, University of Colorado

Franco Luque, University of Córdoba

Chihiro Shibata, Tokyo University of Technology

Adina Williams, FAIR

Organizers
Jason Eisner, Johns Hopkins University

Matthias Gallé, Naver Labs Europe

Jeffrey Heinz, Stony Brook University

Ariadna Quattoni, dMetrics

Guillaume Rabusseau, Université de Montréal / Mila