Redundancy of information: lowering effective dimension
Joe Miller (University of Wisconsin)
Abstract: A natural way to measure the similarity between two infinite binary sequences X and Y is to take the (upper) density of their symmetric difference. This is the Besicovitch distance on Cantor space. If d(X,Y) = 0, then we say that X and Y are coarsely equivalent. Greenberg, M., Shen, and Westrick (2018) proved that a binary sequence has effective (Hausdorff) dimension 1 if and only if it is coarsely equivalent to a Martin-Löf random sequence. They went on to determine the best and worst cases for the distance from a dimension t sequence to the nearest dimension s>t sequence. Thus, the difficulty of increasing dimension is understood.
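Spelled out, the distance referred to above can be written as follows, where X(i) denotes the i-th bit of X (a standard formulation of upper density, not taken verbatim from the talk):

```latex
d(X, Y) \;=\; \limsup_{n \to \infty} \frac{\bigl|\{\, i < n : X(i) \neq Y(i) \,\}\bigr|}{n}
```

So X and Y are coarsely equivalent exactly when they differ on a set of positions of (upper) density zero.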
Greenberg, et al. also determined the distance from a dimension 1 sequence to the nearest dimension t sequence. But they left open the general problem of reducing dimension, which is made difficult by the fact that the information in a dimension s sequence can be coded (at least somewhat) redundantly. Goh, M., Soskova, and Westrick recently gave a complete solution.
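As a toy illustration of how information can be coded redundantly (a sketch of my own; repetition coding stands in for the actual coding theory in the papers, which is not specified here): repeating each bit of a sequence k times heuristically divides its effective dimension by k, yet the repeated sequence still determines the original even after a fair amount of corruption.

```python
import random

random.seed(1)
n = 3000
x = [random.randint(0, 1) for _ in range(n)]

# Repetition coding: each bit of x is written three times. Heuristically,
# the resulting sequence has effective dimension about 1/3 of that of x,
# and the information in it is stored redundantly.
y = [b for b in x for _ in range(3)]

# Corrupt roughly 10% of the positions of the coded sequence.
p = 0.1
z = [b ^ 1 if random.random() < p else b for b in y]

# Majority decoding still recovers almost all of x: a decoded bit is wrong
# only if at least 2 of its 3 copies were flipped, which happens with
# probability 3*p**2*(1-p) + p**3 = 0.028 per bit when p = 0.1.
decoded = [1 if z[3*i] + z[3*i+1] + z[3*i+2] >= 2 else 0 for i in range(n)]
err = sum(decoded[i] != x[i] for i in range(n)) / n
print(err)
```

This redundancy is exactly what makes *lowering* dimension subtle: one cannot destroy the information in such a sequence just by changing a small-density set of bits.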
I will talk about both the results in the 2018 paper and the more recent work. In particular, I will discuss some of the coding theory behind these results. No previous knowledge of coding theory is assumed.
logic
Audience: researchers in the field
Computability theory and applications
Series comments: Computability theory, logic
The goal of this endeavor is to run a weekly seminar on Zoom, perhaps with alternating time slots, each chosen to be workable for at least three of the four regions: Europe, North America, Asia, and New Zealand/Australia. Meetings are always scheduled for Tuesdays, but the time zone varies, so please refer to the calendar on the website for details about individual seminars.
Organizers: Damir Dzhafarov*, Vasco Brattka*, Ekaterina Fokina*, Ludovic Patey*, Takayuki Kihara, Noam Greenberg, Arno Pauly, Linda Brown Westrick
*contact for this listing