Tianyu Gao will present his General Exam, "SimCSE: Simple Contrastive Learning of Sentence Embeddings," on Wednesday, April 13, 2022, at 3:00 PM in CS 402 and via Zoom.


Zoom link: https://princeton.zoom.us/j/95387462065


Committee Members: Danqi Chen (advisor), Karthik Narasimhan, Sanjeev Arora


Abstract:

We present SimCSE, a simple contrastive learning framework that greatly advances the state of the art in sentence embeddings. We first describe an unsupervised approach, which takes an input sentence and predicts itself in a contrastive objective, with only standard dropout used as noise. This simple method works surprisingly well, performing on par with previous supervised counterparts. We find that dropout acts as minimal data augmentation and that removing it leads to representation collapse. We then propose a supervised approach, which incorporates annotated pairs from natural language inference datasets into our contrastive learning framework by using "entailment" pairs as positives and "contradiction" pairs as hard negatives. We evaluate SimCSE on standard semantic textual similarity (STS) tasks, where our unsupervised and supervised models using BERT-base achieve an average Spearman's correlation of 76.3% and 81.6%, respectively, improvements of 4.2% and 2.2% over the previous best results. We also show, both theoretically and empirically, that the contrastive learning objective regularizes pre-trained embeddings' anisotropic space to be more uniform, and that it better aligns positive pairs when supervised signals are available.
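For readers unfamiliar with the objective, the sketch below illustrates the unsupervised variant described in the abstract: each sentence is encoded twice with dropout enabled, the two resulting embeddings form a positive pair, and the other sentences in the batch act as in-batch negatives. This is a minimal, illustrative sketch only, not the speaker's released implementation; the choice of a Hugging Face BERT-base encoder, [CLS] pooling, and the temperature value are assumptions made for the example.

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")
encoder.train()  # keep dropout active; it is the only "augmentation"

def embed(sentences):
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    # Use the [CLS] token representation as the sentence embedding (one common pooling choice).
    return encoder(**batch).last_hidden_state[:, 0]

def unsup_simcse_loss(sentences, temperature=0.05):
    z1 = embed(sentences)  # first pass: one random dropout mask
    z2 = embed(sentences)  # second pass: an independent dropout mask
    # Pairwise cosine similarities between the two views, scaled by temperature.
    sim = F.cosine_similarity(z1.unsqueeze(1), z2.unsqueeze(0), dim=-1) / temperature
    # Diagonal entries (same sentence, different dropout) are the positives;
    # every other sentence in the batch is an in-batch negative.
    labels = torch.arange(sim.size(0))
    return F.cross_entropy(sim, labels)

loss = unsup_simcse_loss(["A man is playing guitar.", "The weather is nice today."])
loss.backward()

The supervised variant follows the same pattern, except that the second view of each premise is its annotated "entailment" hypothesis, and the "contradiction" hypothesis is appended as an additional hard negative column in the similarity matrix.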


Reading List:

https://docs.google.com/document/d/1fSGNfvOSkPggmMZUv_rAp4MK0fo05EeL9enjoyep0Og/edit?usp=sharing


Everyone is invited to attend the talk, and faculty wishing to remain for the oral exam that follows are welcome to do so.