Testing the manifold hypothesis and fitting a manifold of large reach to noisy data

Hariharan Narayanan (Tata Institute for Fundamental Research)

24-Feb-2021, 16:00-17:00

Abstract: The hypothesis that high-dimensional data tend to lie in the vicinity of a low-dimensional manifold is the basis of manifold learning. We will discuss joint work with Charles Fefferman and Sanjoy Mitter on testing the manifold hypothesis. We will outline an algorithm (with accompanying complexity guarantees) for fitting a manifold to an unknown probability distribution supported in a separable Hilbert space, using only i.i.d. samples from that distribution. We also give a solution, based on joint work with Charles Fefferman, Sergei Ivanov, and Matti Lassas, to the following question from manifold learning. Suppose data belonging to a high-dimensional Euclidean space are sampled independently and identically at random from a measure supported on a d-dimensional twice-differentiable embedded manifold M, and corrupted by Gaussian noise with small standard deviation sigma. How can we produce a manifold M_o whose Hausdorff distance to M is small and whose reach (normal injectivity radius) is not much smaller than the reach of M? We show how to produce a manifold within O(sigma^2) of M in Hausdorff distance, whose reach is smaller than that of M by a factor of at most O(d^6).
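
To make the data model in the second result concrete, here is a minimal Python sketch (not the authors' algorithm): it assumes a toy one-dimensional manifold, the unit circle embedded in R^3, and illustrative parameter values n and sigma, draws i.i.d. samples from the manifold, corrupts them with isotropic Gaussian noise of standard deviation sigma, and computes the Hausdorff distance between the clean and noisy point clouds.

```python
# Illustrative sketch of the noisy-sampling setup in the abstract, NOT the
# manifold-fitting algorithm from the talk. The manifold, n, and sigma are
# assumed toy choices for demonstration only.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)
n, sigma = 2000, 0.01  # sample size and noise level (assumed values)

# i.i.d. samples from the uniform measure on the unit circle in R^3
# (a d = 1 twice-differentiable embedded manifold with reach 1)
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
M = np.column_stack([np.cos(theta), np.sin(theta), np.zeros(n)])

# corrupt each sample by ambient Gaussian noise of standard deviation sigma
X = M + sigma * rng.standard_normal(M.shape)

# symmetric Hausdorff distance between the two finite point clouds; for small
# sigma this is on the order of the noise scale, whereas the talk's guarantee
# concerns a fitted manifold M_o lying within O(sigma^2) of M
d = max(directed_hausdorff(M, X)[0], directed_hausdorff(X, M)[0])
print(f"Hausdorff distance between clean and noisy samples: {d:.4f}")
```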

algebraic geometry, differential geometry, quantum algebra, symplectic geometry

Audience: researchers in the topic


Boston University Geometry/Physics Seminar

Series comments: Please email Yu-Shen Lin (yslin0221@gmail.com) for password or adding to the email list.

Organizer: Yu-Shen Lin*
*contact for this listing
