
Conference item

Understanding in-context learning in transformers and LLMs by learning to learn discrete functions

Abstract:
In order to understand the in-context learning phenomenon, recent works have adopted a stylized experimental framework and demonstrated that Transformers can match the performance of gradient-based learning algorithms for various classes of real-valued functions. However, the limitations of Transformers in implementing learning algorithms, and their ability to learn other forms of algorithms are not well understood. Additionally, the degree to which these capabilities are confined to attention-based models is unclear. Furthermore, it remains to be seen whether the insights derived from these stylized settings can be extrapolated to pretrained Large Language Models (LLMs). In this work, we take a step towards answering these questions by demonstrating the following: (a) On a test-bed with a variety of Boolean function classes, we find that Transformers can nearly match the optimal learning algorithm for ‘simpler’ tasks, while their performance deteriorates on more ‘complex’ tasks. Additionally, we find that certain attention-free models perform (almost) identically to Transformers on a range of tasks. (b) When provided a teaching sequence, i.e. a set of examples that uniquely identifies a function in a class, we show that Transformers learn more sample-efficiently. Interestingly, our results show that Transformers can learn to implement two distinct algorithms to solve a single task, and can adaptively select the more sample-efficient algorithm depending on the sequence of in-context examples. (c) Lastly, we show that extant LLMs, e.g. LLaMA-2, GPT-4, can compete with nearest-neighbor baselines on prediction tasks that are guaranteed to not be in their training set.
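The abstract refers to a stylized in-context learning setup over Boolean function classes and to nearest-neighbor baselines for LLMs. The sketch below is a rough illustration of one such setup, not the authors' code: it assumes conjunctions over binary inputs as the function class and Hamming-distance nearest neighbor as the baseline, with all names and parameter values chosen for illustration only.

```python
# Minimal sketch (illustrative assumptions, not the paper's implementation) of an
# in-context learning task over a Boolean function class: sample a conjunction,
# build a prompt of (x, f(x)) pairs, and apply a nearest-neighbor baseline.
import numpy as np

rng = np.random.default_rng(0)
d, n_examples = 20, 30            # input dimension and prompt length (assumed values)

def sample_conjunction(k=3):
    """Sample a conjunction over k of the d input variables."""
    relevant = rng.choice(d, size=k, replace=False)
    return lambda x: int(np.all(x[relevant] == 1))

f = sample_conjunction()
X = rng.integers(0, 2, size=(n_examples, d))   # in-context example inputs
y = np.array([f(x) for x in X])                # their labels
x_query = rng.integers(0, 2, size=d)           # point the model must label next

# Nearest-neighbor baseline: predict the label of the in-context example
# closest to the query in Hamming distance.
nearest = np.argmin(np.abs(X - x_query).sum(axis=1))
print("NN prediction:", y[nearest], " true label:", f(x_query))
```

In the framework described in the abstract, a model would receive the (x, f(x)) pairs as its in-context prompt and be evaluated on its prediction for the query point; a nearest-neighbor rule of this kind is the comparison point mentioned in part (c).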
Publication status:
Published
Peer review status:
Peer reviewed


Publication website:
https://openreview.net/forum?id=ekeyCgeRfC

Authors


Institution:
University of Oxford
Division:
MPLS
Department:
Computer Science
Role:
Author

Institution:
University of Oxford
Division:
MPLS
Department:
Computer Science
Role:
Author
ORCID:
0000-0003-4558-2457

Institution:
University of Oxford
Division:
MPLS
Department:
Computer Science
Role:
Author
ORCID:
0000-0002-2300-4819


Publisher:
OpenReview
Host title:
Proceedings of the 12th International Conference on Learning Representations (ICLR 2024)
Publication date:
2024-03-15
Acceptance date:
2024-01-16
Event title:
12th International Conference on Learning Representations (ICLR 2024)
Event location:
Vienna, Austria
Event website:
https://iclr.cc/
Event start date:
2024-05-07
Event end date:
2024-05-11


Language:
English
Pubs id:
2001852
Local pid:
pubs:2001852
Deposit date:
2024-05-30
