Meta Self-training for Few-shot Neural Sequence Labeling [Code]
October 2021
This is the implementation of the paper Meta Self-training for Few-shot Neural Sequence Labeling. MetaST is short for meta-learning for self-training.
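To illustrate the self-training idea the toolkit builds on, here is a minimal sketch of a pseudo-labeling loop for token tagging. This is an illustration only, not the MetaST algorithm (which additionally meta-learns how to weight and select pseudo-labels); the toy "model" and confidence rule are assumptions for the example.

```python
# Minimal self-training sketch (illustration only; not MetaST itself).

def train(labeled):
    """Toy 'model': memorize the most frequent tag seen for each token."""
    counts = {}
    for token, tag in labeled:
        counts.setdefault(token, {}).setdefault(tag, 0)
        counts[token][tag] += 1
    return {tok: max(tags, key=tags.get) for tok, tags in counts.items()}

def self_train(labeled, unlabeled, rounds=2):
    """Alternate between training and pseudo-labeling unlabeled tokens."""
    data = list(labeled)
    for _ in range(rounds):
        model = train(data)
        # Keep only pseudo-labels the model is "confident" about
        # (here, trivially: tokens it has already seen).
        pseudo = [(tok, model[tok]) for tok in unlabeled if tok in model]
        data = list(labeled) + pseudo
    return train(data)

labeled = [("Paris", "LOC"), ("Microsoft", "ORG"), ("Paris", "LOC")]
unlabeled = ["Paris", "Berlin", "Microsoft"]
model = self_train(labeled, unlabeled)
print(model["Paris"])      # LOC
print(model["Microsoft"])  # ORG
```

A real implementation would replace the frequency table with a neural tagger and filter pseudo-labels by predicted probability; the loop structure is the same.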
Meta Representation Transformation for Low-resource Cross-Lingual Learning [Code]
May 2021
This is the source code release for research published at NAACL 2021. Paper Title: MetaXL: Meta Representation Transformation for Low-resource Cross-Lingual Learning Paper Abstract: The combination of multilingual pre-trained representations and cross-lingual transfer learning is one of the most…
Self-training with Weak Supervision [Code]
April 2021
State-of-the-art deep neural networks require large-scale labeled training data that is often either expensive to obtain or not available for many tasks. Weak supervision in the form of domain-specific rules has been shown to be useful in such settings to…
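As a hedged sketch of what "weak supervision in the form of domain-specific rules" can look like (not the paper's actual method), the example below has several heuristic labeling rules vote on each input, with unmatched inputs left unlabeled. The rules and label names are invented for illustration.

```python
# Weak supervision sketch: heuristic rules vote on a label; inputs that
# match no rule stay unlabeled (ABSTAIN). Rules here are hypothetical.

ABSTAIN = None

def rule_company_suffix(text):
    """Heuristic: strings ending in 'Inc.' are organizations."""
    return "ORG" if text.endswith("Inc.") else ABSTAIN

def rule_known_cities(text):
    """Heuristic: a small gazetteer of city names."""
    return "LOC" if text in {"Paris", "Seattle"} else ABSTAIN

RULES = [rule_company_suffix, rule_known_cities]

def weak_label(text):
    """Apply all rules and return the majority-vote label, or ABSTAIN."""
    votes = [label for rule in RULES
             if (label := rule(text)) is not ABSTAIN]
    if not votes:
        return ABSTAIN
    return max(set(votes), key=votes.count)

print(weak_label("Contoso Inc."))  # ORG
print(weak_label("Seattle"))       # LOC
print(weak_label("banana"))        # None
```

The weakly labeled set produced this way is noisy and incomplete, which is exactly the setting the self-training work above targets.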
Microsoft Icecaps: An Open-Source Toolkit for Conversation Modeling
August 2019
Microsoft Icecaps is an open-source NLP toolkit featuring pre-trained models, with an emphasis on conversational scenarios.