Copy That! Editing Sequences by Copying Spans (2021)
Sheena Panthaplackel, Miltiadis Allamanis, Marc Brockschmidt
Neural sequence-to-sequence models are increasingly used for editing documents, for example to correct a text document or repair source code. In this paper, we argue that common seq2seq models (with a facility to copy single tokens) are not a natural fit for such tasks, as they have to explicitly copy each unchanged token. We present an extension of seq2seq models capable of copying entire spans of the input to the output in one step, greatly reducing the number of decisions required during inference. This extension means that there are now many ways of generating the same output, which we handle by deriving a new objective for training and a variation of beam search for inference that explicitly handles this problem. In our experiments on a range of editing tasks over natural language and source code, we show that our new model consistently outperforms simpler baselines.
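Because the span-copying decoder can produce the same output through many different action sequences, the training objective marginalizes over all of them. The sketch below is a minimal illustration of that idea, not the authors' implementation: it computes the marginal likelihood with a dynamic program over target prefixes, and the probability functions p_gen and p_copy are hypothetical uniform placeholders standing in for a trained decoder's per-step distributions.

# Illustrative sketch (not the paper's code): marginal likelihood of a target
# sequence under a decoder that, at each step, either GENERATES one token or
# COPIES a contiguous span of the input. Many action sequences yield the same
# output, so we sum over all of them with a dynamic program over prefixes.

def p_gen(prefix, token, vocab_size=100):
    # Placeholder: uniform probability of generating any single token.
    return 0.5 / vocab_size

def p_copy(prefix, i, j, num_spans):
    # Placeholder: uniform probability of copying source span [i:j).
    return 0.5 / num_spans

def marginal_likelihood(source, target):
    """Sum the probabilities of every generate/copy action sequence
    that produces `target` from `source`."""
    n, m = len(source), len(target)
    num_spans = n * (n + 1) // 2  # number of contiguous source spans
    # alpha[t] = total probability mass of having produced target[:t]
    alpha = [0.0] * (m + 1)
    alpha[0] = 1.0
    for t in range(1, m + 1):
        # Action 1: generate the single token target[t-1].
        alpha[t] += alpha[t - 1] * p_gen(target[:t - 1], target[t - 1])
        # Action 2: copy any source span that matches a suffix of target[:t].
        for i in range(n):
            for j in range(i + 1, n + 1):
                span = source[i:j]
                start = t - len(span)
                if start >= 0 and target[start:t] == span:
                    alpha[t] += alpha[start] * p_copy(target[:start], i, j, num_spans)
    return alpha[m]

if __name__ == "__main__":
    src = ["def", "foo", "(", "x", ")", ":"]
    tgt = ["def", "bar", "(", "x", ")", ":"]
    print(marginal_likelihood(src, tgt))

At inference time, the paper's beam-search variant addresses the same ambiguity by handling hypotheses that spell out the same output prefix through different action sequences.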
View: PDF, arXiv
Citation: In The AAAI Conference on Artificial Intelligence (AAAI), February 2021.
Bibtex:
@inproceedings{panthaplackel:aaai21b,
  title={Copy That! Editing Sequences by Copying Spans},
  author={Sheena Panthaplackel and Miltiadis Allamanis and Marc Brockschmidt},
  booktitle={The AAAI Conference on Artificial Intelligence (AAAI)},
  month={February},
  year={2021},
  url={http://www.cs.utexas.edu/users/ai-labpub-view.php?PubID=127887}
}
Presentation:
Slides (PDF)
Slides (PPT)
Poster
People
Sheena Panthaplackel
Ph.D. Alumni
spantha [at] cs utexas edu
Areas of Interest
Natural Language for Software Engineering
Labs
Machine Learning