
Headless Language Models: Learning Without Predicting with Contrastive Weight Tying

ICLR 2024

Key words: representation learning, NLP, language modeling, pretraining, contrastive