Masked and Permuted Pre-training for Language Understanding

About MPNet

MPNet is a pre-training method for language models that unifies masked language modeling (MLM) and permuted language modeling (PLM) in a single view. Through PLM it models the dependency among the predicted tokens, avoiding the independence assumption that BERT's MLM makes, and it conditions on position information for the full sentence, reducing the position discrepancy seen in pure permuted language modeling.
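The idea can be illustrated with a small sketch of the input construction. This is not the actual MPNet implementation (which also uses two-stream attention); it is a simplified, assumed illustration: positions are permuted, the last few permuted tokens become prediction targets and are replaced by mask placeholders, while the position ids for the whole sentence remain visible to the model.

```python
import random

MASK = "[MASK]"

def mpnet_input(tokens, pred_ratio=0.15, seed=0):
    """Illustrative MPNet-style input construction (not the real
    implementation): permute positions, treat the tail of the
    permutation as prediction targets, and append [MASK] placeholders
    that keep the targets' position ids, so full-sentence position
    information is still available."""
    rng = random.Random(seed)
    positions = list(range(len(tokens)))
    rng.shuffle(positions)
    n_pred = max(1, int(len(tokens) * pred_ratio))
    non_pred, pred = positions[:-n_pred], positions[-n_pred:]
    # Content stream: visible tokens followed by mask placeholders.
    input_tokens = [tokens[p] for p in non_pred] + [MASK] * n_pred
    # Position ids cover every original position, masked or not.
    position_ids = non_pred + pred
    # Targets to predict, in permuted order (dependency is preserved).
    targets = [tokens[p] for p in pred]
    return input_tokens, position_ids, targets
```

Because the targets are predicted in permuted order, each prediction can condition on previously predicted tokens (the PLM part), while the complete set of position ids removes the mismatch between pre-training and fine-tuning inputs (the MLM part).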

