BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Jason Eisner (Johns Hopkins University)
DTSTART:20200820T190000Z
DTEND:20200820T203000Z
DTSTAMP:20260423T003301Z
UID:IASML/21
DESCRIPTION:Title: Event Sequence Modeling with the Neural Hawkes Process
  (https://researchseminars.org/talk/IASML/21/)\nby Jason Eisner (Johns
  Hopkins University) as part of the IAS Seminar Series on Theoretical
  Machine Learning\n\nAbstract\nSuppose you are monitoring discrete events
  in real time.  Can you predict what events will happen in the future\,
  and when?  Can you fill in past events that you may have missed?  A
  probability model that supports such reasoning is the neural Hawkes
  process (NHP)\, in which the Poisson intensities of K event types at time
  t depend on the history of past events.  This autoregressive architecture
  can capture complex dependencies.  It resembles an LSTM language model
  over K word types\, but allows the LSTM state to evolve in continuous
  time.\n\nThis talk will present the NHP model along with methods for
  estimating parameters (MLE and NCE)\, sampling predictions of the future
  (thinning)\, and imputing missing events (particle smoothing).  I'll then
  show how to scale the NHP or the LSTM language model to large K\,
  beginning with a temporal deductive database for a real-world domain\,
  which can track how possible event types and other facts change over
  time.  We take the system state to be a collection of vector-space
  embeddings of these facts\, and derive a deep recurrent architecture from
  the temporal Datalog program that specifies the database.  We call this
  method "neural Datalog through time."\n\nThis work was done with Hongyuan
  Mei and other collaborators\, including Guanghui Qin\, Minjie Xu\, and
  Tom Wan.\n
LOCATION:https://researchseminars.org/talk/IASML/21/
END:VEVENT
END:VCALENDAR
