Deep embeddings and few-shot learning
We propose and evaluate a novel loss function for discovering deep embeddings that make explicit the categorical and semantic structure of a domain. The loss function is based on the F statistic, which describes the separation of two or more distributions. This loss has several key advantages over previous approaches: it does not require a margin or other arbitrary parameters for deciding when distributions are sufficiently well separated; it is expressed as a probability, which facilitates combining it with other training objectives; and it seems particularly well suited to disentangling semantic features of a domain, leading to more interpretable and manipulable representations.
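As a rough illustration of the quantity the loss is built on, the sketch below computes the classical one-way ANOVA F statistic (between-class variance over within-class variance) for scalar embedding values grouped by class. This is a hedged, hypothetical sketch, not the authors' implementation: the function name and the toy data are invented, and the actual loss further converts this statistic into a probability of separation.

```python
# Hypothetical sketch: one-way ANOVA F statistic on 1-D embedding values.
# All names and data here are illustrative, not the project's actual code.

def f_statistic(groups):
    """Ratio of between-class to within-class variance (one-way ANOVA F).

    groups: list of lists of scalar embedding values, one inner list per class.
    """
    k = len(groups)                      # number of classes
    n = sum(len(g) for g in groups)      # total number of points
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-class mean square: how far each class mean sits from the grand mean.
    ms_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    ) / (k - 1)
    # Within-class mean square: spread of points around their own class mean.
    ms_within = sum(
        sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups
    ) / (n - k)
    return ms_between / ms_within

# Well-separated classes yield a large F; overlapping classes a small one.
separated = f_statistic([[0.0, 0.1, -0.1], [5.0, 5.1, 4.9]])
overlapping = f_statistic([[0.0, 1.0, -1.0], [0.5, -0.5, 0.2]])
```

A large F indicates embedding distributions that are well separated relative to their internal spread, which is exactly the property the loss rewards, without a hand-tuned margin.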
We are also comparing and evaluating a range of algorithms for few-shot learning, including schemes explicitly designed for few-shot learning (e.g., Prototypical Networks), schemes designed for obtaining semantic embeddings (e.g., the histogram loss), and schemes based on transferring hidden representations from one neural network to another.
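To make the first family of methods concrete, the sketch below shows the classification rule at the heart of Prototypical Networks: each class is summarized by the mean of its support-set embeddings, and a query is assigned to the class with the nearest prototype. This is a minimal illustrative sketch; the function names, toy 2-D "embeddings", and labels are invented, and a real system would embed raw inputs with a learned network first.

```python
# Minimal sketch of prototype-based few-shot classification
# (the rule used by Prototypical Networks). Data and names are illustrative.
import math

def prototype(support):
    """Mean of a class's support embeddings (each a list of floats)."""
    dim = len(support[0])
    return [sum(vec[i] for vec in support) / len(support) for i in range(dim)]

def classify(query, support_by_class):
    """Assign query to the class whose prototype is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    protos = {label: prototype(vecs) for label, vecs in support_by_class.items()}
    return min(protos, key=lambda label: dist(query, protos[label]))

# Two-way, two-shot toy episode with made-up 2-D embeddings.
support = {"cat": [[0.0, 0.0], [0.2, 0.1]], "dog": [[1.0, 1.0], [0.9, 1.1]]}
label = classify([0.1, 0.2], support)  # nearest prototype is "cat"
```

Because the prototype is just a mean, the method needs no extra parameters per new class, which is what makes it attractive in the few-shot regime.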
Students
Karl Ridgeway (Computer Science, Colorado)
Tyler Scott (Computer Science, Colorado)
Collaborators
Rich Zemel (Computer Science, Toronto)