Information-Theoretic Probing with Minimum Description Length

When:
Thursday, December 3, 2020, 10:00 am - 11:00 am PST
Where:
VIRTUAL: https://usc.zoom.us/j/96662265166
This event is open to the public.
Type:
NL Seminar
Speaker:
Elena (Lena) Voita-University of Edinburgh
Video Recording:
https://usc.zoom.us/j/96662265166
Description:

Abstract:
How can you know whether a model has learned to encode a linguistic property? The most popular approach to measure how well pretrained representations encode a linguistic property is to use the accuracy of a probing classifier (probe). However, such probes often fail to adequately reflect differences in representations, and they can show different results depending on probe hyperparameters. As an alternative to standard probing, we propose information-theoretic probing which measures minimum description length (MDL) of labels given representations. In addition to probe quality, the description length evaluates “the amount of effort” needed to achieve this quality. We show that (i) MDL can be easily evaluated on top of standard probe-training pipelines, and (ii) compared to standard probes, the results of MDL probing are more informative, stable, and sensible.

Bio:

Elena (Lena) Voita is a Ph.D. student at the University of Edinburgh and the University of Amsterdam, supervised by Ivan Titov and Rico Sennrich, and is currently a Facebook PhD Fellow. Her research focuses on document-level neural machine translation and on understanding what and how neural models learn. Previously, she was a research scientist at Yandex Research and worked closely with the Yandex Translate team. She also teaches NLP at the Yandex School of Data Analysis; an extended public version of part of this course is available as "NLP Course For You".

https://lena-voita.github.io/nlp_course.html
