
MSc thesis of Ioannis Psaros

Low-quality dimension reduction and high-dimensional Approximate Nearest Neighbor

Supervisor: Ioannis Emiris

The approximate nearest neighbor problem (ANN) in Euclidean settings is a fundamental question, which has been addressed by two main approaches. Data-dependent space-partitioning techniques perform well when the dimension is relatively low, but are affected by the curse of dimensionality. On the other hand, locality-sensitive hashing has polynomial dependence on the dimension, sublinear query time with an exponent inversely proportional to the error factor 𝜖, and subquadratic space requirements. We generalize the Johnson-Lindenstrauss lemma to define “low-quality” mappings to a Euclidean space of significantly lower dimension, which satisfy a requirement weaker than approximately preserving all distances or even preserving the nearest neighbor. Such a mapping guarantees, with arbitrarily high probability, that an approximate nearest neighbor lies among the 𝑘 approximate nearest neighbors in the projected space. This leads to a randomized tree-based data structure that avoids the curse of dimensionality for (1 + 𝜖)-ANN. Given 𝑛 points in dimension 𝑑, our algorithm uses 𝑂(𝑑𝑛) space, 𝑂(𝑑𝑛 log 𝑛) preprocessing time, and 𝑂(𝑑𝑛^𝜌 log 𝑛) query time, where 𝜌 is proportional to 1 − 1/ln ln 𝑛, for fixed 𝜖 ∈ (0, 1). It employs a data structure, such as BBD-trees, that efficiently finds 𝑘 approximate nearest neighbors in the projected space. The dimension reduction is stronger, i.e. the target dimension is lower, if one assumes that the pointset possesses some structure, namely bounded expansion rate.
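The query pipeline described above can be sketched as follows: project the points once into a low-dimensional space, retrieve the 𝑘 approximate nearest neighbors of the projected query there, and return the best of those candidates as measured in the original dimension. The Python snippet below is a minimal illustration under simplifying assumptions: a plain Gaussian random projection stands in for the thesis's low-quality mapping, a brute-force scan in the projected space stands in for a BBD-tree, and all names and parameter values (d_low, k, etc.) are illustrative rather than those derived in the thesis.

```python
import numpy as np

def build_projection(d, d_low, rng):
    # Gaussian random projection matrix: a stand-in for the thesis's
    # low-quality Johnson-Lindenstrauss-style mapping.
    return rng.normal(size=(d, d_low)) / np.sqrt(d_low)

def query_ann(points, proj_points, proj, q, k):
    # Project the query into the low-dimensional space.
    q_low = q @ proj
    # Find the k nearest neighbors of the projected query; a brute-force
    # scan is used here instead of a BBD-tree for simplicity.
    dist_low = np.linalg.norm(proj_points - q_low, axis=1)
    candidates = np.argpartition(dist_low, k)[:k]
    # Refine: among the k candidates, return the point closest to the
    # query in the original d-dimensional space.
    dist_orig = np.linalg.norm(points[candidates] - q, axis=1)
    return candidates[np.argmin(dist_orig)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d, d_low, k = 10_000, 200, 20, 50   # illustrative sizes only
    points = rng.normal(size=(n, d))
    proj = build_projection(d, d_low, rng)
    proj_points = points @ proj            # projected once at preprocessing
    q = rng.normal(size=d)
    approx = query_ann(points, proj_points, proj, q, k)
    exact = np.argmin(np.linalg.norm(points - q, axis=1))
    print("approximate NN:", approx, "exact NN:", exact)
```

In this sketch the guarantee of the abstract corresponds to the event that the true approximate nearest neighbor appears among the 𝑘 candidates returned in the projected space, so the final refinement step in the original dimension recovers it with high probability.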

Defended: April 23, 2015.

Scientific committee


