Downloads
Meta Self-training for Few-shot Neural Sequence Labeling [Code]
October 2021
This is the implementation of the paper Meta Self-training for Few-shot Neural Sequence Labeling. MetaST is short for meta-learning for self-training.
Meta Representation Transformation for Low-resource Cross-Lingual Learning [Code]
May 2021
This is a source code release for research published at NAACL 2021. Paper Title: MetaXL: Meta Representation Transformation for Low-resource Cross-Lingual Learning Paper Abstract: The combination of multilingual pre-trained representations and cross-lingual transfer learning is one of the most…
Self-training with Weak Supervision [Code]
April 2021
State-of-the-art deep neural networks require large-scale labeled training data that is often either expensive to obtain or unavailable for many tasks. Weak supervision in the form of domain-specific rules has been shown to be useful in such settings to…
Microsoft Icecaps: An Open-Source Toolkit for Conversation Modeling
August 2019
Microsoft Icecaps is a new open-source NLP toolkit featuring pre-trained models and an emphasis on conversational scenarios.