Natural Earth Vector Map in Marble
The current "Atlas map" is based on the MWDB2 dataset. The data is quite old, and the implementation behind the Atlas map still carries a few traces of "historic" code. A new proposal would introduce the high-quality Natural Earth data, which would require several changes to Marble's code and its data. See http://techbase.kde.org/Projects/Marble/NaturalEarth
Motivation for Proposal / Goal:
Marble is a world atlas and virtual globe that can be used to acquire information about the Earth, gather knowledge about different places with just a click, measure distances between locations, or even watch the current cloud cover. As part of the KDE Educational Project, it promotes the use of open-source maps (www.openstreetmap.org, MWDB2, etc.). The current "Atlas" map of Marble is based on the MWDB2 dataset, which has some major problems: it is very old, with its last update dating from 20 years ago; the map is built from a mix of vectors (for borders and coastlines) and JPG images (for the terrain); and it does not contain much detail, so zooming is strongly limited. The current vector layer also has the disadvantage that it cannot be manipulated either programmatically or by the user, so selecting or manipulating a geographic entity is impossible.
This proposal is about creating a next generation “Atlas” map, based on the Natural Earth Data project. The Natural Earth Data set is a "public domain map dataset available at 1:10m, 1:50m, and 1:110 million scales. Featuring tightly integrated vector and raster data, with Natural Earth you can make a variety of visually pleasing, well-crafted maps with cartography or GIS software." This data set seems ideal as a replacement for the old MWDBII.
This replacement would bring multiple advantages, among them: many more features than the current format; an entirely vector-based map; regular updates; larger scales, so the zoom limitations will improve significantly; and data attributes attached to map features, such as country codes, population, etc.
The Natural Earth data is available in the ESRI Shapefile format, which Marble now supports (using libshp). The disadvantage of this format is its space efficiency: it is roughly six times larger than the previous MWDBII data. During the project I will research converting the data to a more efficient format. In the end the full Natural Earth dataset will be usable, but only a minimal required subset will be shipped with Marble, with the remainder of the data downloaded later.
The ultimate goal of this project is to provide the new Atlas map in a way that is space efficient, that shows different kinds of topographic features on the map (filtered by zoom, so insignificant details are hidden at small zoom levels), that ships a basic dataset with the application, and that allows further vector data to be loaded on demand online, with no further user interaction.
The main change required is creating a new map based on Shapefile format files, using the recently implemented parser. One major performance question arises here: whether a zoom-level attribute is needed for fast drawing, and if so, how to implement it. Possibilities include using quad tiles; pre-computing a zoom level for each point, which requires more storage space; or improving the vector drawing layer to calculate the zoom level on the fly (using the Douglas-Peucker algorithm), which may be too slow. Further approaches need to be tested.
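As a rough illustration of the quad-tile idea: a point's tile indices at a given zoom level can be derived directly from its longitude/latitude, so only points whose tiles are visible need to be drawn. This is a sketch using the standard slippy-map formula; the helper name `lonlat_to_tile` is hypothetical, and Marble's actual tiling scheme may differ.

```python
import math

def lonlat_to_tile(lon_deg, lat_deg, zoom):
    """Map lon/lat (degrees) to quad-tile (x, y) indices at the given zoom.
    At zoom z the world is split into 2^z x 2^z tiles (Web Mercator style)."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# A point on the equator at the prime meridian at zoom 2:
print(lonlat_to_tile(0.0, 0.0, 2))  # → (2, 2)
```

Grouping polygon points by tile index would make it cheap to skip whole tiles that fall outside the viewport.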
In order to have a nice view of the map at each zoom level, a filtering algorithm should be applied while zooming in and out. Filtering the data raises the problem of priorities: "What should and should not be visible at the current level?" "Is this road more important than that one?" "The name tag of this lake covers a mountain; should it be removed?" During the project I am going to research ways to prioritize map elements so that each zoom level shows the relevant information. Whatever method is used, the Douglas-Peucker algorithm will be part of it, so I am going to implement it in the GeoDataLineString file.
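The Douglas-Peucker simplification mentioned above can be sketched as follows (in Python rather than the C++ of GeoDataLineString, and using a planar distance approximation for brevity; the real implementation would work on geodetic coordinates):

```python
def perpendicular_distance(p, a, b):
    """Distance from point p to the line through a and b (planar approximation)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, epsilon):
    """Drop points that lie closer than epsilon to the chord of their span."""
    if len(points) < 3:
        return points
    # Find the point farthest from the chord between the two endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    # If every intermediate point is within tolerance, keep only the endpoints;
    # otherwise recurse on both halves around the farthest point.
    if dmax <= epsilon:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:index + 1], epsilon)
    right = douglas_peucker(points[index:], epsilon)
    return left[:-1] + right

# Near-collinear noise collapses, but a sharp corner survives:
print(douglas_peucker([(0, 0), (1, 0.05), (2, 0)], 0.1))        # → [(0, 0), (2, 0)]
print(douglas_peucker([(0, 0), (5, 0.1), (10, 10)], 1.0))       # corner kept
```

Raising epsilon as the user zooms out is what would let the same line string render with progressively fewer points.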
In order to optimize memory usage, a new Marble file format (PNT) could be created from the Shapefile maps. Points could be stored as tuples of 32-bit integers (2 integers, 64 bits, for normal points; 3 integers, 96 bits, for the point at the start of each polygon). Using this method, better described here, the size of the file containing the country borders for the 1:10m scale map (highest quality) would reach ~4 MB (this does not include coastlines and internal borders). If that size is too large to ship with Marble, we could ship the 1:50m scale files by default and download the higher-quality ones once Marble connects to the internet. To achieve this, I will implement a script that converts Shapefiles to PNT.
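A minimal sketch of this kind of packing, under assumed details: the exact PNT layout is still to be decided, so here coordinates are simply scaled to signed 32-bit integers, and a hypothetical third integer at each polygon start stores the point count.

```python
import struct

SCALE = 10_000_000  # degrees * 1e7 fit comfortably in a signed 32-bit int

def pack_polygon(points):
    """Pack one polygon: a 3-integer (96-bit) start record holding the point
    count plus the first point, then 2 integers (64 bits) per remaining point."""
    data = struct.pack(">iii", len(points),
                       int(points[0][0] * SCALE), int(points[0][1] * SCALE))
    for lon, lat in points[1:]:
        data += struct.pack(">ii", int(lon * SCALE), int(lat * SCALE))
    return data

poly = [(12.5, 41.9), (12.6, 41.8), (12.4, 41.7)]
blob = pack_polygon(poly)
print(len(blob))  # 3*4 + 2*2*4 = 28 bytes for a 3-point polygon
```

Compared with Shapefile's 64-bit doubles, halving each coordinate to 32 bits is where most of the space saving would come from.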
The metadata files (basically name tags) are currently stored in the .dbf format, which needs to be converted either into Marble's own format or into CSV/XML files.
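The CSV half of that conversion is straightforward; this sketch assumes the .dbf records have already been read into dictionaries (e.g. via libshp's DBF functions on the C++ side, or any DBF reader). The helper name `records_to_csv` and the sample records are illustrative only.

```python
import csv
import io

def records_to_csv(records, fieldnames):
    """Serialize attribute records (e.g. name tags from a .dbf) to CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

# Hypothetical attribute records of the kind Natural Earth's .dbf files carry:
records = [{"NAME": "Italy", "ISO_A2": "IT"},
           {"NAME": "France", "ISO_A2": "FR"}]
print(records_to_csv(records, ["NAME", "ISO_A2"]))
```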
After having a working prototype of the map, I will select which geographical elements are delivered with Marble and which are downloaded on the fly, according to their importance.
| File name | Size | Date submitted |
| --- | --- | --- |
| Cezar_Mocan.tar.gz | 67.2 KB | September 02 2012 15:14 UTC |