
UniLM

A new unified pre-trained Language Model (UniLM) for natural language understanding and generation tasks.


About UniLM

The paper presents a new Unified pre-trained Language Model (UniLM) that can be fine-tuned for both natural language understanding and generation tasks. The model is pre-trained using three types of language modeling tasks: unidirectional, bidirectional, and sequence-to-sequence prediction. The unified modeling is achieved by employing a shared Transformer network and utilizing specific self-attention masks to control what context each prediction conditions on. UniLM achieves state-of-the-art results on several natural language generation benchmarks.
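The three pre-training objectives share one Transformer; what differs is only the self-attention mask. As a rough sketch (the function name and layout are illustrative, not from the paper), the three masking patterns can be written as:

```python
import numpy as np

def lm_attention_mask(src_len, tgt_len=0, mode="bidirectional"):
    """Return a 0/1 self-attention mask (1 = position may attend).

    Illustrative helper for UniLM's three masking patterns:
      - "unidirectional": each token attends to itself and the left context only
      - "bidirectional":  every token attends to every token
      - "seq2seq":        source tokens attend within the source segment;
                          target tokens attend to the source plus left target context
    """
    n = src_len + tgt_len
    if mode == "bidirectional":
        return np.ones((n, n), dtype=int)
    if mode == "unidirectional":
        # Lower-triangular matrix: row i may attend to columns 0..i
        return np.tril(np.ones((n, n), dtype=int))
    if mode == "seq2seq":
        mask = np.zeros((n, n), dtype=int)
        mask[:src_len, :src_len] = 1  # source: full attention within the source
        mask[src_len:, :src_len] = 1  # target: sees the entire source
        # target: causal attention over the target segment
        mask[src_len:, src_len:] = np.tril(np.ones((tgt_len, tgt_len), dtype=int))
        return mask
    raise ValueError(f"unknown mode: {mode}")
```

Because only the mask changes, the same Transformer parameters are shared across all three objectives; during fine-tuning, the mask matching the downstream task (bidirectional for understanding, seq2seq for generation) is selected.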

Key Features

4 features
  • Unified pre-trained Language Model.
  • Fine-tuning for natural language understanding and generation tasks.
  • Pre-training using unidirectional, bidirectional, and sequence-to-sequence prediction.
  • Shared Transformer network.

Use Cases

2 use cases
  • Natural language understanding.
  • Natural language generation.
Added April 13, 2024