Publications: Transfer Learning
Traditional machine learning algorithms operate under the assumption that learning for each new task starts from scratch, thus disregarding any knowledge they may have gained while learning in previous domains. Naturally, if the domains encountered during learning are related, this tabula rasa approach wastes both data and computation time developing hypotheses that could have been recovered by simply examining, and possibly slightly modifying, previously acquired knowledge. Moreover, the knowledge learned in earlier domains can capture generally valid rules that are not easily recoverable from small amounts of data, allowing the algorithm to achieve even higher accuracy than it would if it started from scratch.
The field of transfer learning, which has grown greatly in popularity in recent years, addresses the problem of how to leverage previously acquired knowledge to improve the efficiency and accuracy of learning in a new domain that is in some way related to the original one. In particular, our current research focuses on developing transfer learning techniques for Markov Logic Networks (MLNs), a recently developed approach to statistical relational learning.
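As background, the standard MLN formalism (due to Richardson and Domingos, and assumed rather than introduced by the papers below) defines a model as a set of weighted first-order formulas $(F_i, w_i)$, which together specify a log-linear distribution over possible worlds $x$:

$$P(X = x) = \frac{1}{Z} \exp\Big(\sum_i w_i \, n_i(x)\Big),$$

where $n_i(x)$ is the number of true groundings of formula $F_i$ in world $x$ and $Z$ is a normalizing constant. Transfer between relational domains then amounts to mapping the predicates of source-domain formulas onto target-domain predicates and revising the mapped structure and weights, the approach pursued in several of the publications listed below.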
Our research in the area is currently sponsored by the Defense Advanced
Research Projects Agency (DARPA) and managed by the Air Force Research
Laboratory (AFRL) under contract FA8750-05-2-0283.
- Zero-shot Video Moment Retrieval With Off-the-Shelf Models
[Details] [PDF] [Poster]
Anuj Diwan, Puyuan Peng, Raymond J. Mooney
In Workshop on Transfer Learning for Natural Language Processing at NeurIPS 2022, December 2022.
- Knowledge Transfer Using Latent Variable Models
[Details] [PDF] [Slides (PDF)]
Ayan Acharya
PhD Thesis, Department of Electrical and Computer Engineering, The University of Texas at Austin, August 2015.
- Active Multitask Learning Using Both Latent and Supervised Shared Topics
[Details] [PDF] [Slides (PDF)]
Ayan Acharya, Raymond J. Mooney, Joydeep Ghosh
In Proceedings of the 2014 SIAM International Conference on Data Mining (SDM14), Philadelphia, Pennsylvania, April 2014.
- Using Both Latent and Supervised Shared Topics for Multitask Learning
[Details] [PDF] [Slides (PDF)]
Ayan Acharya, Aditya Rawal, Raymond J. Mooney, Eduardo R. Hruschka
In Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML/PKDD), 369--384, Prague, Czech Republic, September 2013.
- Learning with Markov Logic Networks: Transfer Learning, Structure Learning, and an Application to Web Query Disambiguation
[Details] [PDF]
Lilyana Mihalkova
PhD Thesis, Department of Computer Sciences, University of Texas at Austin, Austin, TX, August 2009. 176 pages.
- Transfer Learning from Minimal Target Data by Mapping across Relational Domains
[Details] [PDF]
Lilyana Mihalkova and Raymond Mooney
In Proceedings of the 21st International Joint Conference on Artificial Intelligence (IJCAI-09), 1163--1168, Pasadena, CA, July 2009.
- Transfer Learning by Mapping with Minimal Target Data
[Details] [PDF]
Lilyana Mihalkova and Raymond J. Mooney
In Proceedings of the AAAI-08 Workshop on Transfer Learning for Complex Tasks, Chicago, IL, July 2008.
- Improving Learning of Markov Logic Networks using Transfer and Bottom-Up Induction
[Details] [PDF]
Lilyana Mihalkova
Technical Report UT-AI-TR-07-341, Artificial Intelligence Lab, University of Texas at Austin, Austin, TX, May 2007.
- Mapping and Revising Markov Logic Networks for Transfer Learning
[Details] [PDF]
Lilyana Mihalkova, Tuyen N. Huynh, Raymond J. Mooney
In Proceedings of the Twenty-Second Conference on Artificial Intelligence (AAAI-07), 608--614, Vancouver, BC, July 2007.
- Transfer Learning with Markov Logic Networks
[Details] [PDF]
Lilyana Mihalkova and Raymond Mooney
In Proceedings of the ICML-06 Workshop on Structural Knowledge Transfer for Machine Learning, Pittsburgh, PA, June 2006.