Information
- Publication Type: Master Thesis
- Workgroup(s)/Project(s):
- Date: February 2011
- Date (Start): 23. July 2007
- First Supervisor:
- Keywords: Out-of-Core algorithms, Point-based rendering, Normal estimation
Abstract
This diploma thesis introduces methods for external sorting and fast k-nearest-neighbor searching for very large point clouds. Very large point clouds are datasets that cannot be processed in main memory. This leads to certain requirements for any reconstruction technique used, the most important ones being out-of-core memory management and sequential data handling algorithms. The paper “Stream-Processing Points” by Renato Pajarola shows a way to design a framework that allows processing a subset of a very large point cloud, where a subset is defined as a spatially continuous region holding a predefined number of points. This diploma thesis is based on the aforementioned paper and improves on the stream-processing pipeline presented therein. The proposed algorithm for searching the k nearest neighbors has a complexity of O(N log(M)), where N is the number of points in the point cloud and M is the number of points held in main memory, which is comparable to current state-of-the-art algorithms for in-core data sets.
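The idea summarized in the abstract can be illustrated with a small sketch: the point cloud is externally sorted along one axis, streamed through main memory, and the k nearest neighbors of each point are searched only within the in-core window of at most M points. The code below is illustrative and not taken from the thesis; all names (Point, knnInWindow, streamKnn, the radius parameter) are hypothetical, the in-window search is written as a linear scan for brevity (the O(N log(M)) bound of the thesis would require a logarithmic-time spatial index over the window, e.g. a kd-tree), and it assumes that a point's k nearest neighbors lie within a fixed radius along the sort axis.

#include <cstddef>
#include <deque>
#include <iostream>
#include <queue>
#include <vector>

struct Point { float x, y, z; };

static float sqDist(const Point& a, const Point& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// Squared distances to the k nearest neighbors of `query` among the points
// currently held in the in-core window (linear scan, for brevity only).
std::vector<float> knnInWindow(const std::deque<Point>& window,
                               const Point& query, std::size_t k) {
    std::priority_queue<float> worst;  // max-heap of the k best squared distances
    for (const Point& p : window) {
        const float d = sqDist(p, query);
        if (d == 0.0f) continue;       // skip the query point itself
        if (worst.size() < k) {
            worst.push(d);
        } else if (d < worst.top()) {
            worst.pop();
            worst.push(d);
        }
    }
    std::vector<float> result;
    while (!worst.empty()) { result.push_back(worst.top()); worst.pop(); }
    return result;
}

// Streams points that arrive sorted by x and keeps at most maxInCore of them
// in main memory. A point is processed (its neighborhood queried) once the
// sweep has advanced more than `radius` past it, or when the window is full.
template <typename PointSource>
std::size_t streamKnn(PointSource& source, std::size_t maxInCore,
                      float radius, std::size_t k) {
    std::deque<Point> window;
    std::size_t processed = 0;
    auto emit = [&](const Point& q) {
        // In a real pipeline the neighbor set would feed normal estimation;
        // here it is only computed and counted.
        std::vector<float> dists = knnInWindow(window, q, k);
        (void)dists;
        ++processed;
    };
    Point p;
    while (source(p)) {
        window.push_back(p);
        while (!window.empty() &&
               (window.size() > maxInCore ||
                window.back().x - window.front().x > radius)) {
            emit(window.front());
            window.pop_front();
        }
    }
    while (!window.empty()) {          // flush the remaining window
        emit(window.front());
        window.pop_front();
    }
    return processed;
}

int main() {
    // Toy source: 1000 points already sorted by x.
    std::size_t i = 0;
    auto source = [&i](Point& p) {
        if (i >= 1000) return false;
        p = Point{static_cast<float>(i) * 0.01f,
                  static_cast<float>(i % 7), static_cast<float>(i % 3)};
        ++i;
        return true;
    };
    std::cout << "points processed: "
              << streamKnn(source, /*maxInCore=*/128, /*radius=*/0.5f, /*k=*/8)
              << '\n';
    return 0;
}

Only the window ever resides in main memory; the rest of the data set stays on disk before and after it passes through the stream, which is the essential out-of-core property described above.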
Additional Files and Images
Additional images and videos
image:
Normals estimated in the Domitilla Catacomb model with approximately 2 billion points.
Additional files
thesis:
The thesis
Weblinks
No further information available.
BibTeX
@mastersthesis{marek-2011-normalest,
  title    = "Normal Estimation of Very Large Point Clouds",
  author   = "Stefan Marek",
  year     = "2011",
  abstract = "This diploma thesis introduces methods for external sorting and fast
              k-nearest-neighbor searching for very large point clouds. Very large
              point clouds are datasets that cannot be processed in main memory.
              This leads to certain requirements for any reconstruction technique
              used, the most important ones being out-of-core memory management and
              sequential data handling algorithms. The paper “Stream-Processing
              Points” by Renato Pajarola shows a way to design a framework that
              allows processing a subset of a very large point cloud, where a subset
              is defined as a spatially continuous region holding a predefined
              number of points. This diploma thesis is based on the aforementioned
              paper and improves on the stream-processing pipeline presented
              therein. The proposed algorithm for searching the k nearest neighbors
              has a complexity of O(N log(M)), where N is the number of points in
              the point cloud and M is the number of points held in main memory,
              which is comparable to current state-of-the-art algorithms for
              in-core data sets.",
  month    = feb,
  address  = "Favoritenstrasse 9-11/E193-02, A-1040 Vienna, Austria",
  school   = "Institute of Computer Graphics and Algorithms, Vienna University of Technology",
  keywords = "Out-of-Core algorithms, Point-based rendering, Normal estimation",
  URL      = "https://www.cg.tuwien.ac.at/research/publications/2011/marek-2011-normalest/",
}