Locality, relation, and meaning construction in language, via tokenization in LLMs

Presenter's Name(s)

Julia Zimmerman

Abstract

Large Language Models (LLMs) like ChatGPT reflect profound changes in the field of Artificial Intelligence, achieving a linguistic fluency that is impressively, even shockingly, human-like. The extent of their current and potential capabilities is an active area of investigation by no means limited to scientific researchers. We examine the details of their gnogeography. We present high-level takeaways on cognition, language, and the future of learning machines, as well as the science around them, arguing that relation is the fundamental mechanism of linguistic meaning construction, and that problem-solving is achieved by subverting locality through relation.

Primary Faculty Mentor Name

Chris Danforth

Status

Graduate

Student College

College of Engineering and Mathematical Sciences

Program/Major

Data Science

Primary Research Category

Social Science

Abstract only.
