BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Yiannis Vlassopoulos (Athena RC)
DTSTART:20230313T140000Z
DTEND:20230313T150000Z
DTSTAMP:20260423T021324Z
UID:GANT/27
DESCRIPTION:Title: <a href="https://researchseminars.org/talk/GANT/27/">La
 nguage modelling with enriched categories\, the Yoneda embedding and the I
 sbell completion</a>\nby Yiannis Vlassopoulos (Athena RC) as part of Greek
  Algebra & Number Theory Seminar\n\n\nAbstract\nNeural networks (like Chat
 GPT) trained to probabilistically predict the next word in a text have rec
 ently achieved human-like capabilities in language understanding and use.
 \n\nWhat\, then\, is the underlying mathematical structure that these mode
 ls learn\, and how is semantic information encoded in the statistics of wo
 rd co-occurrences?\n\nTo propose a partial answer to these questions\, we
  will introduce a category L whose objects are texts in the language\, wit
 h a morphism from text x to text y given by the probability of extension f
 rom x to y. The category is enriched over the monoidal closed category who
 se set of objects is [0\, 1] and whose monoidal structure is multiplicatio
 n. The Yoneda embedding of L into its category of presheaves naturally enc
 odes co-occurrence information. Applying −log to morphisms\, we obtain
  an equivalent category which is also a Lawvere metric space and a tropica
 l linear space. We will then explain the Isbell completion\, which relates
  completion by op co-presheaves (probabilities of extending a text) to com
 pletion by presheaves (probabilities of extending to a text). This is base
 d on joint work with T.D. Bradley\, J. Terilla and S. Gaubert.\n
LOCATION:https://researchseminars.org/talk/GANT/27/
END:VEVENT
END:VCALENDAR
