Event Sequence Modeling with the Neural Hawkes Process

Jason Eisner (Johns Hopkins University)

20-Aug-2020, 19:00-20:30

Abstract: Suppose you are monitoring discrete events in real time. Can you predict what events will happen in the future, and when? Can you fill in past events that you may have missed? A probability model that supports such reasoning is the neural Hawkes process (NHP), in which the Poisson intensities of K event types at time t depend on the history of past events. This autoregressive architecture can capture complex dependencies. It resembles an LSTM language model over K word types, but allows the LSTM state to evolve in continuous time.
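For concreteness, here is a minimal sketch (not the speaker's code) of how the published NHP parameterizes those intensities: between events, the continuous-time LSTM cell decays exponentially toward an asymptote, and each event type's intensity is a scaled softplus of a linear readout of the hidden state, which keeps it positive. Variable names and array shapes below are illustrative assumptions.

```python
import numpy as np

def scaled_softplus(x, s):
    # f_k(x) = s_k * log(1 + exp(x / s_k)); logaddexp form avoids overflow
    return s * np.logaddexp(0.0, x / s)

def intensities(t, t_last, c, c_bar, delta, o, W, s):
    """lambda_k(t) for all K event types, given the continuous-time LSTM
    state written at the last event (time t_last).

    c, c_bar, delta, o: length-D vectors (cell value, its asymptote,
    decay rates, output gate); W: K x D readout; s: length-K scales.
    """
    c_t = c_bar + (c - c_bar) * np.exp(-delta * (t - t_last))  # cell decays between events
    h_t = o * np.tanh(c_t)                                     # hidden state at time t
    return scaled_softplus(W @ h_t, s)                         # positive intensities
```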

This talk will present the NHP model along with methods for estimating parameters (MLE and NCE), sampling predictions of the future (thinning), and imputing missing events (particle smoothing). I'll then show how to scale the NHP or the LSTM language model to large K, beginning with a temporal deductive database for a real-world domain, which can track how possible event types and other facts change over time. We take the system state to be a collection of vector-space embeddings of these facts, and derive a deep recurrent architecture from the temporal Datalog program that specifies the database. We call this method "neural Datalog through time."
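As a pointer to what "thinning" refers to here: sampling the next event from an intensity-based model can be done, in its simplest form, by Ogata-style rejection sampling against a dominating constant-rate Poisson process. The sketch below is a generic version, assuming `lambda_max` upper-bounds the total intensity on the horizon of interest; `intensity_fn` stands in for the model's intensity computation (e.g., the function sketched above).

```python
import numpy as np

def sample_next_event(intensity_fn, t0, lambda_max, rng):
    """Draw the next (time, type) after t0 by thinning.

    intensity_fn(t) -> length-K array of intensities lambda_k(t);
    lambda_max must satisfy sum_k lambda_k(t) <= lambda_max for t >= t0.
    """
    t = t0
    while True:
        t += rng.exponential(1.0 / lambda_max)       # candidate from dominating process
        lam = intensity_fn(t)                        # true intensities at candidate time
        total = lam.sum()
        if rng.uniform() <= total / lambda_max:      # accept with prob total / lambda_max
            k = rng.choice(len(lam), p=lam / total)  # type k w.p. lambda_k / total
            return t, k
```

A tighter bound `lambda_max` means fewer rejected candidates; repeated calls yield a full sampled continuation of the event sequence.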

This work was done with Hongyuan Mei and other collaborators including Guanghui Qin, Minjie Xu, and Tom Wan.

Topics: bioinformatics, game theory, information theory, machine learning, neural and evolutionary computing, classical analysis and ODEs, optimization and control, statistics theory

Audience: researchers in the topic


IAS Seminar Series on Theoretical Machine Learning

Series comments: Seminar series focusing on machine learning. Open to all.

Register in advance at forms.gle/KRz8hexzxa5P4USr7 to receive the Zoom link and password. Recordings of past seminars can be found at www.ias.edu/video-tags/seminar-theoretical-machine-learning

Organizers: Ke Li*, Sanjeev Arora
*contact for this listing
