Description
Two typical concerns with elevation data, and with Geographic Information Systems
(GIS) data in general, are storage and query costs. The former is typically addressed by
integrating standard compression schemes into existing storage mechanisms, such
as GZIP in HDF5. Space-Filling Curves (SFCs) have already been used to reduce access
time for spatial operations on point and polygon data. In this research, we evaluate the effect
of using SFCs as a pre-processing step for standard compression schemes on elevation
data. We break common compression tools down into their base algorithms and identify
canonical SFCs from the literature (for example, the Hilbert curve).
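To make the pre-processing step concrete, the following is a minimal Python sketch, not the exact pipeline used in the study: it reorders a small synthetic elevation tile along a Hilbert curve (the classic bit-manipulation mapping) and feeds both the row-major and the Hilbert-ordered bytes to zlib, which stands in here for the DEFLATE family of compressors.

    import zlib
    import numpy as np

    def xy2d(n, x, y):
        """Map (x, y) on an n x n grid (n a power of two) to its distance
        along the Hilbert curve, using the standard bit-manipulation form."""
        d = 0
        s = n // 2
        while s > 0:
            rx = 1 if (x & s) else 0
            ry = 1 if (y & s) else 0
            d += s * s * ((3 * rx) ^ ry)
            if ry == 0:                      # rotate/flip the quadrant
                if rx == 1:
                    x, y = n - 1 - x, n - 1 - y
                x, y = y, x
            s //= 2
        return d

    def hilbert_reorder(tile):
        """Flatten a square 2^k x 2^k tile along the Hilbert curve
        instead of the usual row-major order."""
        n = tile.shape[0]
        order = sorted((xy2d(n, x, y), y, x) for y in range(n) for x in range(n))
        return np.array([tile[y, x] for _, y, x in order], dtype=tile.dtype)

    # Synthetic smooth "elevation" tile; real SRTM tiles would be read from file.
    yy, xx = np.mgrid[0:256, 0:256]
    tile = (200.0 + 50.0 * np.sin(xx / 17.0) * np.cos(yy / 23.0)).astype(np.int16)

    row_major = zlib.compress(tile.tobytes(), 9)
    hilbert = zlib.compress(hilbert_reorder(tile).tobytes(), 9)
    print(f"row-major: {len(row_major)} bytes, Hilbert-ordered: {len(hilbert)} bytes")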
We use 1-arcsecond-resolution elevation maps from the Shuttle Radar Topography Mission
(SRTM) as the comparative dataset, to which we apply all combinations of SFCs and
compression schemes. In most cases, the SFCs neither significantly improve nor worsen
compression ratios compared with the non-preprocessed results. However, we show that
certain pre-processing steps improve the compression performance of otherwise ineffective
compression techniques. This research shows the potential for future work on compression
schemes that allow in-place search and modification without loss of compression
performance. Another application is to apply these techniques to astronomical data from the
Square Kilometre Array (SKA), a major scientific and engineering project in South Africa, for
which some preliminary results have been obtained.
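As for the comparative setup itself, the sketch below runs the same payload through a small grid of orderings and compressors and reports compression ratios. It reuses the tile and hilbert_reorder helper from the sketch above, and the Python standard-library compressors (zlib, bz2, lzma) are only stand-ins for the base algorithms actually benchmarked.

    import bz2
    import lzma
    import zlib

    # Reuses `tile` and `hilbert_reorder` from the sketch above.
    orderings = {
        "row-major": tile.tobytes(),
        "hilbert": hilbert_reorder(tile).tobytes(),
    }

    # Stand-ins for the base algorithms extracted from common compression tools.
    compressors = {
        "deflate": lambda b: zlib.compress(b, 9),
        "bzip2": lambda b: bz2.compress(b, 9),
        "lzma": lambda b: lzma.compress(b),
    }

    for oname, payload in orderings.items():
        for cname, compress in compressors.items():
            ratio = len(payload) / len(compress(payload))   # original / compressed
            print(f"{oname:>9} + {cname:<7}: ratio {ratio:.2f}")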