Baseline: A Library for Rapid Modeling, Experimentation and Development of Deep Learning Algorithms targeting NLP

Daniel Pressel, Sagnik Ray Choudhury, Brian Lester, Yanjie Zhao, Matt Barta


Abstract
We introduce Baseline: a library for reproducible deep learning research and fast model development for NLP. The library provides easily extensible abstractions and implementations for data loading, model development, training, and export of deep learning architectures. It also provides implementations of simple, high-performance deep learning models for various NLP tasks, against which newly developed models can be compared. Because deep learning experiments are hard to reproduce, Baseline provides functionality to track them. The goal is to allow a researcher to focus on model development, delegating repetitive tasks to the library.
Anthology ID:
W18-2506
Volume:
Proceedings of Workshop for NLP Open Source Software (NLP-OSS)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Eunjeong L. Park, Masato Hagiwara, Dmitrijs Milajevs, Liling Tan
Venue:
NLPOSS
Publisher:
Association for Computational Linguistics
Pages:
34–40
URL:
https://aclanthology.org/W18-2506
DOI:
10.18653/v1/W18-2506
Cite (ACL):
Daniel Pressel, Sagnik Ray Choudhury, Brian Lester, Yanjie Zhao, and Matt Barta. 2018. Baseline: A Library for Rapid Modeling, Experimentation and Development of Deep Learning Algorithms targeting NLP. In Proceedings of Workshop for NLP Open Source Software (NLP-OSS), pages 34–40, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Baseline: A Library for Rapid Modeling, Experimentation and Development of Deep Learning Algorithms targeting NLP (Pressel et al., NLPOSS 2018)
PDF:
https://aclanthology.org/W18-2506.pdf
Code:
dpressel/baseline
Data:
CoNLL 2003, SST, SST-2, WNUT 2017