Preface

Geoinformatik 2011 is the third in a series of conferences that brings together researchers from academia and industry. It provides a forum to exchange ideas and advance the state of the art in GI science. Geoinformatik 2011 covers a number of exciting topics, including established fields in GI science, current and novel GI applications, and new and emerging research directions. This year's conference theme, "Geochange", reflects the increasing importance that Geoinformatics methodology and applications have gained in monitoring, modelling and analysing global change. A substantial number of submissions discuss the role of new GI technologies and approaches in climate, environmental, and energy issues. In addition, the contributions published in these proceedings cover many further well-established and emerging topics in Geoinformatics, including mobile technologies, real-time data processing, spatial information systems and geographical data infrastructures, spatio-temporal modelling, geosensor networks, volunteered geographic information, and open source software development.

For the first time, the Geoinformatik conference hosts four workshops and two tutorials that cover a range of relevant topics such as climate/environment, mobile technologies, legal aspects of geographical data, and geoinformatics in education. The conference offers two tracks with oral presentations: a scientific track with academic presentations, workshops, and tutorials, and a track with invited presentations that discuss innovative applications. In parallel, companies from the GI sector present their products during an industrial exhibition. In total, 30 full research papers and 19 short papers on innovative applications were submitted to the conference, of which 34 submissions were accepted for oral presentation at the conference and a further eight as posters.

These conference proceedings contain the accepted, peer-reviewed manuscripts and demonstrate the strongly international character of Geoinformatics research in Germany: 36 submissions were made in English, and we received submissions from seven different countries. Many people have contributed to bringing this conference to life. We would particularly like to thank all submitting authors for contributing their high-quality papers to Geoinformatik 2011. We also wish to extend our thanks to the programme committee, who critically evaluated the papers and provided valuable comments that were essential to the quality of the conference proceedings. Finally, we would like to express our gratitude to the organizing team and the sponsors, without whom this conference could not have been realized.

Angela Schwering, Edzer Pebesma, Kai Behncke
Conference committee

Angela Schwering, University of Münster, Germany
Edzer Pebesma, University of Münster, Germany

Programme committee

Lars Bernard, TU Dresden, Germany
Michela Bertolotto, University College Dublin, Ireland
Thomas Blaschke, University of Salzburg, Austria
Arnold Bregt, Wageningen University, Netherlands
Martin Breunig, Karlsruhe Institute of Technology, Germany
Thomas Brinkhoff, Jade University of Applied Sciences, Germany
Tiago Carneiro, Federal University of Ouro Preto, Brazil
Christophe Claramunt, Naval Academy Research Institute, France
Volker Coors, HFT Stuttgart, Germany
Tobias Dahinden, Leibniz University Hannover, Germany
Clodoveu Davis, Universidade Federal de Minas Gerais, Brazil
Norbert de Lange, University of Osnabrück, Germany
Wolfgang Deiters, Fraunhofer Institute of Software and Systems Engineering, Germany
Doris Dransch, Deutsches GeoForschungsZentrum, Germany
Sara Irina Fabrikant, University of Zurich, Switzerland
Theodor Foerster, University of Münster, Germany
Christian Freksa, University of Bremen, Germany
Gerard Heuvelink, Wageningen University, Netherlands
Uwe Jasnoch, Intergraph SG&I Deutschland GmbH, Germany
Tomi Kauppinen, University of Münster, Germany
Carsten Keßler, University of Münster, Germany
Eva Klien, Fraunhofer IGD, Germany
Christian Kray, University of Newcastle, United Kingdom
Werner Kuhn, University of Münster, Germany
Michael Lutz, European Commission Joint Research Centre, Italy
Hartmut Müller, University of Applied Sciences Mainz, Germany
Volker Paelke, Institut de Geomatica Barcelona, Spain
Torsten Prinz, University of Münster, Germany
Florian Probst, SAP Research, Germany
Ross Purves, University of Zurich, Switzerland
Martin Raubal, University of California, United States
Tumasch Reichenbacher, University of Zurich, Switzerland
Albert Remke, 52 North, Germany
Sven Schade, European Commission Joint Research Centre, Italy
Jochen Schiewe, HafenCity University Hamburg, Germany
Johannes Schöning, DFKI GmbH, Germany
Bettina Speckmann, TU Eindhoven, Netherlands
John Stell, University of Leeds, United Kingdom
Karl-Peter Traub, HafenCity University Hamburg, Germany
Antonio Miguel V. Monteiro, INPE, Brazil
Marc van Kreveld, Utrecht University, Netherlands
Lubia Vinhas, INPE, Brazil
Organizing committee

Geonetzwerk Münsterland
GiN e.V.
GfGI - Gesellschaft für Geoinformatik
Institute for Geoinformatics, University of Münster, Germany

Main sponsors

ESRI Deutschland GmbH
con terra GmbH
GEOCOM Informatik GmbH

Sponsors

abcverlag GmbH, Heidelberg
Akademie für Geowissenschaften und Geotechnologien e.V.
Anwenderverband für integrierte Rauminformationen und Technologien e.V. (AIR)
Bernhard Harzer Verlag GmbH
BM Vektor e.K.
Deutscher Dachverband für Geoinformation e.V. (DDGI)
Deutsche Gesellschaft für Kartographie e.V. (DGfK)
disy Informationssysteme GmbH
European Society for eGovernment e.V.
Geodatenportal Niedersachsen; Landesamt für Geoinformation und Landentwicklung Niedersachsen
GeoData+ GmbH
GEOkomm e.V.
GEOPLEX-3D & Solar GmbH
GEOSYSTEMS GmbH
Hansa Luftbild AG - German Air Survey
Institut für Geoinformatik und Fernerkundung, Universität Osnabrück
Landwirtschaftskammer Niedersachsen
melezo GbR
Mensch und Maschine Systemhaus GmbH
PROGIS Software GmbH
UNIGIS Salzburg & Research Studio ispace
Verband Deutscher Vermessungsingenieure e.V. (VDV)
52 North Initiative for Geospatial Open Source Software GmbH
Table of Contents

Geoinformation for the assessment and management of urban water consumption in Mediterranean contexts (Angela HOF)
An Event Driven Architecture for Decision Support (Thomas EVERDING; Theodor FOERSTER)
Handling of spatial data for complex geo-scientific modelling and 3D landfill applications with DB4GeO (Martin BREUNIG; Edgar BUTWILOWSKI; Paul Vincent KUPER; Norbert PAUL; Andreas THOMSEN; Sabine SCHMIDT; Hans-Jürgen GÖTZE)
Design With Nature 2.0: A Geodata Infrastructure Approach to Map Overlay (Claus RINNER; Martin DÜREN)
Track-based OSM Print Maps (Holger FRITZE; Dustin DEMUTH; Kristina KNOPPE; Klaus DRERUP)
LOSM - A lightweight approach to integrate OpenStreetMap into the Web of Data (Johannes TRAME; Philippe RIEFFEL; Umut TAS; Alkyoni BAGLATZI; Volker VON NATHUSIUS)
OpenFloorMap Implementation (A. WESTERMANN; G. TSCHORN; P. WEISS; O. OGUNDELE; D. LASNIA)
On the Integration of Geospatial Data and Functionality into Business Process Models (Andre MÜLLER; Matthias MÜLLER)
Enabling User-friendly Query Interfaces for Environmental Geodata through Semantic Technologies (Andreas ABECKER; Wassilios KAZAKOS; Gabor NAGYPAL; Aleksei VALIKOV)
Integration of Qualitative Spatial Reasoning into GIS - An Example with SparQ (Sahib JAN; Malumbo CHIPOFYA)
Matching-Based Map Generalization by Transferring Geometric Representations (Hendrik WARNEKE; Michael SCHÄFERS; Udo W. LIPECK; Joachim BOBRICH)
Interoperable integration of high precision 3D laser data and large scale geoanalysis in a SDI for Sutra inscriptions in Sichuan (China) (Sandra LANIG; Arne SCHILLING; Michael AUER; Bernhard HÖFLE; Nicolas BILLEN; Alexander ZIPF)
Integrating Marine Modeling Data into a Spatial Data Infrastructure (Christoph WOSNIOK; Michael BAUER; Rainer LEHFELDT)
Leveraging standardized near real-time in-situ sensor measurements in nature conservation areas (Manfred MITTLBOECK; Bernd RESCH; Thomas BLASCHKE; Helmut FRANZ)
GeoURI/GeoURL: A protocol convention to connect mobile Apps and isolated Wifi cells via a distributed indoor-information-infrastructure (Roland M. WAGNER)
A reference schema for interoperability between geo data and 3D models (Kerstin FALKOWSKI; Jürgen EBERT)
Disambiguating Resilience (Desiree DANIEL; Jens ORTMANN)
Mobile In-Situ Sensor Platforms in Environmental Research and Monitoring (Juliane BRINK; Timo JANSEN)
Building Tracking Applications with Sensor Web Technology (Simon JIRKA; Henning BREDEL)
Web-Based Near Real-Time Geo-Analyses of Environmental Sensor Measurements (Günther SAGL; Michael LIPPAUTZ; Manfred MITTLBÖCK; Bernd RESCH; Thomas BLASCHKE)
Towards Highly Parallel Geostatistics with R (Katharina HENNEBÖHL; Marius APPEL)
Agricultural land use dynamics in Tenerife (Canary Islands): The development of fallow land as resettlement area for adjacent natural ecosystems (Sebastian GÜNTHERT; Alexander SIEGMUND; Simone NAUMANN)
Towards Standards-Based, Interoperable Geo Image Processing Services (Peter BAUMANN; Roger BRACKIN; Michael OWONIBI; Tim CUTSWORTH)
Extracting the Evolution of Land Cover Objects from Remote Sensing Image Series (Lúbia VINHAS; Olga BITTENCOURT; Gilberto CÂMARA; Sergio COSTA)
Ontology-Based Modeling of Land Change Trajectories in the Brazilian Amazon (Tomi KAUPPINEN; Giovana MIRA DE ESPINDOLA)
Integration of dynamic environmental data in the process of travel planning (Thomas SPANGENBERG; Hardy PUNDT)
TransitDroid: Delivering real-time bus tracking information on mobile devices (Bashir SHALAIK; Ricky JACOB; Adam WISTANLEY)
DGPS- and INS-Based Orthophotogrammetry on Micro UAV Platforms for Precision Farming Services (Jakob GEIPEL; Christian KNOTH; Olga ELSÄSSER; Torsten PRINZ)
RM-ODP for WPS Process Descriptions (Theodor FOERSTER; Bastian SCHÄFFER)
Towards Linking the Digital and Real World with OpenThingMap (Damian LASNIA; Theodor FOERSTER; Arne BRÖRING)
Konzeption von akustisch unterstützten animierten Karten zur Präsentation raumzeitlicher Informationen (Jochen SCHIEWE; Beate WENINGER)
WebGIS-Technologien im Einsatz für den ehrenamtlichen Naturschutz (Astrid LIPSKI; Roland HACHMANN)
XErleben - Datenmodell für ein kommunales Freizeitkataster (Christine ANDRAE; Jens HINRICHS; Friedhelm KRUTH; Katja NIENSTEDT; Birgit PIEKE; Axel ZOLPER)
Risikobewertung von Sichtbehinderungen durch niedrige Sonnenstände für das Verkehrswegenetz (M. RICHTER; M.-O. LÖWNER)

Poster Abstracts

Unveiling the design framework behind transactional map symbols (Martin LOIDL; Florian FISCHER; Christoph TRAUN)
Geographical information systems for research biological resource of the World Ocean in climate fluctuation conditions (Pavel P. CHERNYSHKOV; Stanislav G. GLUSHCHENKO)
Individual Geographic Stream Processing for Driver Assistance (Christian KUKA; Susanne BOLL)
Virtuell kuren mit einem WebGIS (Peter WOLFF; Matthias BINDING; Viviane WOLFF)
Flex-I-Geo-Web - ein dienstebasierter Softwarebaukasten zur Standortentwicklung (Robert KULAWIK)
Challenges and Advantages of using GPS Data in Outdoor Advertisement (Dirk HECKER; Christine KÖRNER; Michael MAY)
Dynamische 3D-Zeitreihenvisualisierung in interaktiven Webmapping-Applikationen (Martin GEMEINHOLZER; Andre SCHÜCKER)
WebGIS für Kommunales Informationsmanagement (Sascha TEGTMEYER; Dirk ROHRMOSER)
Creating the "new new": Facilitating the growth of neo-geographers in the Global South using emergent Internet technologies (Michael MARTIN; Jon CORBETT)
Geoinformation for the assessment and management of urban water consumption in Mediterranean contexts

Angela HOF
Postdoctoral Researcher, Geography Department, Ruhr-Universität Bochum, Germany

Abstract. Climate change is expected to intensify water supply problems in the Mediterranean. The challenge of managing decreasing water resources more efficiently coincides with the emergence of dispersed settlement patterns. Geoinformation plays an important role in establishing indicators for the permanent water demands that are the direct consequence of the proliferation of swimming pools and irrigated landscaping in many urbanized seaside municipalities and holiday destinations in the Mediterranean. The paper illustrates the application of the methodology for a case study area which exemplifies the nexus of new urban landscapes and permanent water demand on the island of Mallorca, an environment where the water supply situation is already critical.

Keywords. Urban form, residential water budget, indicators, stakeholder-oriented geoinformation

Introduction

At a time when the already critical water supply situation in the Mediterranean is expected to be exacerbated by climate change, urban sprawl and low-density residential areas characterize an increasing number of municipalities on Mediterranean coasts [1, 2, 3, 4]. Spain in particular is experiencing a tourist and second-home boom, and residential tourism and leisure structures (golf courses, spas, aquatic parks, swimming pools and irrigated gardens) can have significant impacts on water resources through high levels of consumption. These development pressures coincide with the necessity to manage decreasing water resources more efficiently [5, 6, 7, 8]. The present paper discusses a concept for deriving residential water budgets with the use of geoinformation, and illustrates the extraction of the relevant information for a case study area.
Other studies in similar climates and contexts have demonstrated the interrelation of urban form and water consumption. The low density urban form produces higher per capita and per area water consumption than the high density urban form. The sensitivity to climate change is positively correlated with irrigated landscaping, a high percentage of single residential houses with private swimming pools, and high income levels [1, 6, 9]. More complete information on this type of water demand is needed by state and local authorities to define priorities for water conservation or demand management programmes. The present methodology is tailored to contexts where extensive meter and/or household survey data is difficult to
access or collect, or where the available per capita water consumption data mask the magnitude of permanent water demand and the variability of consumption levels caused by different land use patterns. The focus is on geoinformation-based indicators to improve statistical coverage of the subject [8].

1. Conceptual issues and the role of geoinformation in water consumption analyses

The present analysis focuses on the relevant variables for residential water consumption models: per capita indoor consumption (dependent on household size, seasonality of inhabitancy and capacity utilization of tourist accommodation), and outdoor consumption for swimming pool and garden maintenance [1, 3, 6, 7, 9]. Water consumption data can be combined with a detailed land use and population geodatabase to allocate water consumption to different types of usage (indoor and outdoor). The relevant input data can be inventoried from cadastral data and remote sensing imagery to circumvent the reliance on water meter data, which are rarely available. The result is a model of urban water demand based on statistics and consumption ratios that is useful as a short-term tool for decision-making, defining conservation strategies, and assessing the impact of future urban development [4, 8]. The model requires water consumption data at municipal or sub-municipal level, average consumption estimates for different types of residential units (houses, apartments, hotels), and an inventory of the facilities (gardens and pools) that cause permanent outdoor water demand irrespective of occupancy. The land use inventory is pivotal for disaggregating water consumption data in the domestic residential sector into different types of use, as several studies have shown that gardens and pools are major drivers of water consumption [1, 4, 6, 7, 8, 9].
The model proposed here is tailored to water consumption analyses in the urbanized seaside municipalities that are evolving along Mediterranean coasts, where water demand management is becoming a major challenge [2, 8]. Such municipalities are typically found in Spain on the Alicante coast, the Costa del Sol or the Balearic and Canary islands, where residential houses, second homes, and tourist hotels and apartments are mixed spatially, and where water demand data do not discriminate between these users [4, 8, 10]. Land use information at sub-parcel scale can be used to calculate and visualize water consumption by outdoor uses in order to identify urban neighborhoods with high potential water demand that should be targeted for water conservation campaigns and that are potentially sensitive to climate change. Visualized in maps, the results support, and are a powerful communication means for, water conservation campaigns that already tentatively address gardens and swimming pools and encourage the use of more Mediterranean species in gardening, adequate irrigation technology, and pool covers [13].

2. Case study application

2.1. Motivation of the study

The concept of using geoinformation for the assessment and management of urban water consumption was applied to the domestic residential sector in Nova Santa Ponsa, a sub-municipal census district of the municipality of Calvià in southwest Mallorca,
Balearic islands, Spain. Nova Santa Ponsa is a residential area where more than 80% of all parcels are occupied by single residential houses. It represents the proliferation of irrigated gardens and private swimming pools as residential resort features and positional goods that are characteristic of the new urban natures evolving in many Spanish municipalities. Located next to a golf course and marina, Nova Santa Ponsa has experienced a real estate boom, mainly through the active development of second homes with high housing standards. By 2008, the number of private swimming pools had grown 18-fold and the number of houses 3.4-fold. Through this type of development, the official residential population of Calvià municipality grew by 71.5%, and per capita water consumption rose from 300 liters per person per day (lpd) to 700 lpd, exceeding the predicted demand [11, 12]. The Mallorcan water supply situation is already critical, and the island government's action program for the Hydrological Plan of the Balearics [13] sets aside over 1 million Euro for stakeholder-oriented communication of water saving measures. Understanding the share of private irrigated gardens with swimming pools in residential water budgets is a first step toward designing water conservation policies focused on reducing outdoor water use and targeting neighborhoods where outdoor use is high. The results presented here could support such water conservation campaigns.

2.2. Assembling the relevant geoinformation

A land use database was built from digital cadastre data and by visual interpretation of high resolution digital color orthophotos (year 2006, geometric resolution 40 cm/pixel) with on-screen digitizing [14, 15]. Pool area, garden area, and built-up area (sealed surfaces and buildings) were mapped in the geographic information system ArcGIS 9 by subdividing land parcels into these land use types at a scale of 1:600 (Fig. 1).
For every parcel with tourist use declared in the online cadastre, the number of official tourist beds in hotels and apartments, as listed by the Calvià municipal government for the year 2007 (unpublished data), was recorded in the database. For all residential parcels, the number of single houses was recorded, and the number of flats in multi-residential houses was queried from [14] and recorded in the database. With this geodatabase structure and content, the relevant outdoor land uses and the relevant data for indoor water consumption (number of inhabitants and tourist beds) can be determined per parcel and for different tourist and residential housing types (Fig. 1).

Figure 1. Data model for assembling geoinformation for the water consumption analysis. Orthophoto (year 2002, resolution 40 cm/pixel) reprinted with permission from Sitibsa S.A., Palma de Mallorca.
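The parcel-level data model described above can be sketched as a simple record structure. This is a minimal illustration only, not the actual ArcGIS geodatabase schema used in the paper; all field and function names are assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Parcel:
    """One cadastral parcel, subdivided into mapped land-use types (areas in m2)."""
    parcel_id: str
    pool_area: float    # mapped swimming-pool surface
    garden_area: float  # irrigated garden surface
    built_area: float   # sealed surfaces and buildings
    houses: int         # single residential houses on the parcel
    flats: int          # flats in multi-residential houses
    tourist_beds: int   # official hotel/apartment beds (2007 listing)

def outdoor_area(p: Parcel) -> float:
    """Land uses that cause permanent outdoor water demand irrespective of occupancy."""
    return p.pool_area + p.garden_area

# hypothetical parcel record
p = Parcel("07-123", pool_area=40.0, garden_area=650.0, built_area=310.0,
           houses=1, flats=0, tourist_beds=0)
print(outdoor_area(p))  # 690.0
```

Keeping pool, garden, and built-up areas as separate attributes per parcel is what later allows the metered consumption to be disaggregated into indoor and outdoor uses.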
2.3. Establishing monthly residential water consumption budgets

Monthly water consumption data (2007) for the domestic residential sector were collected from the private water company ATERCA S.A. (Aguas del Término de Calvià, unpublished data). The data exclude losses in the water supply network and the water consumption of commercial users and golf courses. Monthly indoor water consumption was calculated for the official residents and tourists, taking into account the monthly percentage of open hotels and tourist apartments and their capacity utilization [16]. The average consumption per occupied bed across different hotel categories is taken from [4]. Indoor consumption per apartment bed is set equal to the average per capita consumption of single residential houses without garden or swimming pool (142 l/p/d). The mapped swimming pool surface area is used to calculate water loss from evaporation. The average quantity evaporated over the year from a swimming pool in Mallorca is estimated at 5 liters of water/m²/day, which amounts to 122% of the average pool water volume [17]. The amount of water available for garden irrigation is derived by deducting the indoor water use and the pool water use from the total water consumed in the respective month. The resulting amount is divided by the total garden area, yielding an estimate of the water consumed for garden irrigation per square meter.

3. Results and discussion

The monthly water use profile shows that outdoor water use induces substantial consumption increases in the summer season (Fig. 2). The residential character of the study area and the minor influx of tourists to the few hundred hotel beds in Nova Santa Ponsa are reflected in the low variability of indoor consumption. The estimated contribution of pool water loss by evaporation follows climatic patterns and is highest in July, the month of maximum water consumption and evapotranspiration.
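The monthly budget derivation described in Section 2.3 can be expressed as a short calculation. The ratios (142 l per person per day indoors, 5 l/m²/day pool evaporation) come from the text; the function name, the 30-day month, and the sample inputs are illustrative assumptions.

```python
INDOOR_LPD = 142.0  # l per person per day, houses without garden/pool (from the text)
POOL_EVAP = 5.0     # l per m2 of pool surface per day, Mallorca estimate (from the text)

def monthly_garden_budget(total_m3, persons, pool_m2, garden_m2, days=30):
    """Split one month's metered consumption into indoor use and pool
    evaporation; the residual is garden irrigation, returned in l per m2
    of garden area (the indicator used in the paper's budget)."""
    indoor_l = persons * INDOOR_LPD * days      # per capita indoor use
    pool_l = pool_m2 * POOL_EVAP * days         # evaporation from pool surfaces
    garden_l = total_m3 * 1000.0 - indoor_l - pool_l
    return garden_l / garden_m2

# hypothetical month: 2,000 m3 metered, 100 residents,
# 1,000 m2 of pools, 20,000 m2 of gardens
print(round(monthly_garden_budget(2000.0, 100, 1000.0, 20000.0), 2))  # 71.2
```

As a plausibility note on the 122% figure: 5 l/m²/day over a year is about 1,825 l/m², which matches 122% of the average pool volume if the average pool depth is roughly 1.5 m (an assumption, not a figure from the text).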
On an annual average basis, indoor water consumption accounts for 18.3% of the residential water budget, while 77.1% of the water is allocated to garden irrigation and 4.6% of the water consumed in the domestic residential sector is lost to evaporation from swimming pools. Currently, there is a lack of reliable, household-level data on water consumption in urban contexts like the one described here, so the results can only be gauged against the findings of other studies in similar climates and contexts. The results on the contribution of garden irrigation and pools to residential water budgets (Fig. 2) are consistent with the findings of studies based on water meter data and interviews, in which water consumed outdoors was observed to reach between 50 and 75% of total household water consumption [1, 6, 9]. Compared to water consumption in residential areas of Mallorca and similar urban contexts with lower per capita garden area and lower levels of swimming pool ownership [4, 10], per capita water consumption in areas like Nova Santa Ponsa is 2.9 to 3.5 times higher because of substantial outdoor uses of water. These findings support the evidence from other studies showing a positive relationship between the presence of irrigated gardens and swimming pools and higher water consumption levels, with the absence of a pool and garden resulting in a two to three times lower mean consumption per household, per capita, and in the month of maximum water consumption [1, 4, 6, 9]. The results presented here are statistical estimates of the magnitude of the contribution of water-intensive new urban landscapes to urban water demand. These estimates relate directly to a range of indicators suggested by [8] for monitoring the water demand of tourism and its different subsectors (second homes, facilities, activities etc.) that elude statistical observation. The proposed geodatabase structure and the water consumption model are operational for calculating such estimates in order to obtain plausible residential water consumption budgets based on geoinformation, statistics and consumption ratios.

Figure 2. Monthly residential water budget for the case study area (year 2007). Data source: ATERCA S.A.; own calculations.

4. Conclusions and outlook

The results presented here show that gardens and swimming pools are important issues for water management on Mallorca, and conservation measures should address this more explicitly. For gardens, the use of more Mediterranean species and adequate irrigation technology should be encouraged; for swimming pools, the use of pool covers and more extensive reuse of water are strongly advisable. The results do not necessarily reflect a close estimate of actual individual household-level water consumption, but they form the basis for further analysis and could support stakeholder-oriented communication and water-related environmental campaigning. The geodatabase can be queried to select parcels with above-average garden size that include a swimming pool, in order to visualize the estimated outdoor water consumption as a function of pool and garden area. The geodatabase content could equally serve as the basis for a stratified household-level questionnaire survey on domestic water consumption and irrigation practices, or for targeting conservation campaigns at neighborhoods where outdoor use is high. For analyses over larger spatial units, the
methodology could incorporate object-oriented and multispectral analysis of very high-resolution satellite imagery for the mapping of the vegetated garden area and the discrimination of garden types (e.g. tree- or turf-grass-dominated) [18]. The combination of parcel-level land use data and a sample of household consumption data could be used to refine the residential water budget model and to compare water consumption patterns across different urban forms. Ultimately, the approach is expected to provide a more refined model of water consumption in low-density urban areas in the Mediterranean and similar climates and contexts.

List of literature

[1] E. Domene, D. Saurí, Urbanisation and water consumption: influencing factors in the metropolitan region of Barcelona, Urban Studies 43 (2006).
[2] A. Iglesias, L. Garrote, F. Flores, M. Moneo, Challenges to Manage the Risk of Water Scarcity and Climate Change in the Mediterranean, Water Resources Research 21 (2007).
[3] C.H. March, D. Saurí, What lies behind domestic water use? A review essay on the drivers of domestic water consumption, Boletín de la A.G.E. 50 (2009).
[4] A.M. Rico-Amoros, J. Olcina-Cantos, D. Saurí, Tourist land use patterns and water demand: Evidence from the Western Mediterranean, Land Use Policy 26 (2009).
[5] M. Barke, Second Homes in Spain: An analysis of change at the provincial level, Geography 92 (2007).
[6] E. Domene, D. Saurí, M. Parés, Urbanization and sustainable resource use: The case of garden watering in the metropolitan region of Barcelona, Urban Geography 26 (2005).
[7] M. Vidal, E. Domene, D. Saurí, Changing geographies of water-related consumption: residential swimming pools in suburban Barcelona, Area.
[8] European Commission (Ed.), MEDSTAT II: Water and Tourism pilot study, Luxembourg.
[9] R.C. Balling, P. Gober, N. Jones, Sensitivity of residential water consumption to variations in climate: An intraurban analysis of Phoenix, Arizona, Water Resources Research 44 (2008).
[10] A. Hof, T. Schmitt, Urban and tourist land use patterns and water consumption: evidence from Mallorca, Balearic islands, Land Use Policy (2011).
[11] M. Blázquez, I. Murray, J.M. Garau, El tercer boom. Indicadors de sostenibilitat del turisme de les Illes Balears, CITTIB, Palma de Mallorca.
[12] OST (Observatori de Sostenibilitat i Territori, Grup d'Investigació sobre Sostenibilitat i Territori, Universitat de les Illes Balears) (Ed.), Els Indicadors de Sostenibilitat Socioecològica de les Illes Balears, Versió Extensa, OST, Palma de Mallorca.
[13] Conselleria de Medi Ambient (Ed.), Proposta del pla hidrològic de la demarcació de les Illes Balears. Programes d'actuació i infrastructures, Palma de Mallorca.
[14] Dirección General de Catastro (Ed.), Cartografía Catastral de Urbana 1:500 y 1:1.000. Web Map Service [Accessed 20 January 2011].
[15] IDEIB (Infraestructura de Dades Espacials de les Illes Balears) (Ed.), 2011, Orthophoto year 2006, on ideib.caib.es, Resolution 50 cm/pixel. Web Map Service [Accessed 20 January 2011].
[16] CITTIB (Balearic Tourism Research and Technologies Centre) (Ed.), Dades informatives. El turisme a les Illes Balears, Conselleria de Turisme, Palma de Mallorca.
[17] A. Hof, D. Böhlein, A. Kilzer, T. Schmitt, W. Leiner, Estimation of water losses by evaporation from open air swimming pools on Mallorca: A physical model to be incorporated into a domestic water consumption analysis (in preparation, manuscript to be resubmitted to the journal ERDKUNDE).
[18] R. Mathieu, C. Freeman, J. Aryal, Mapping private gardens in urban areas using object-oriented techniques and very high-resolution satellite imagery, Landscape and Urban Planning 81 (2007).
14 Thomas EVERDING; Theodor FOERSTER 7 An Event Driven Architecture for Decision Support Thomas EVERDING; Theodor FOERSTER Institute for Geoinformatics, University of Münster, Germany Abstract. This paper presents an Event Driven Architecture for environmental monitoring and live decision support. Multiple OGC web services are integrated into this architecture, including the Web Processing Service and the Sensor Event Service. The system is demonstrated using a pilot from the EC funded GENESIS project. The architecture, the processing steps and the benefits of the system are described in detail. Keywords. Event Driven Architecture, Spatial Data Infrastructure, OGC Web Services, Decision Support 1. Introduction Environmental measurements are crucial for decision making and assessing health risks. The variety of measurements is accessed via Spatial Data Infrastructures (SDIs). However, the pure access to the information is not sufficient as the measurements have to be available in real-time. Especially phenomena with a strong effect on health and a fast spread have to be detected and processed as soon as possible. This paper presents a near real-time system based on an Event Driven Architecture (EDA) and existing web services such as the OGC Web Processing Service (WPS) and OGC s suite of sensor services. The EDA was developed as part of the FP-7 project GENESIS (GENeric European Sustainable Information Space for environment) [1] which builds collaborative information networks for environment management and health actors. It provides a Service Oriented Architecture (SOA) applying standards from various organizations like ISO, the OGC, the W3C and OASIS. Section 2 describes basic concepts of OGC web service standards as applied in this paper. In Section 3 the use case applied to the presented system is described. The fourth section presents the system including its architecture and the utilized techniques and processes. 
Finally, the benefits of the system are summarized and an outlook for future developments is given.

2. Related Work

This section provides an overview of the related work on EDAs and the sensor web as applied in this work. For this paper we follow the event definition developed in [2]: "An event is anything that happens or is contemplated as happening at an instant or over an interval of time." Such an event can represent a real world phenomenon or its digital representation. For this paper we understand an event as the latter.

An Event Driven Architecture (EDA) is an architectural style in which most of the components execute their actions based on incoming events and communicate via events [3]. To extract specific information from events, Event Stream Processing (ESP) is applied. Such processing is performed using so-called data views, which provide access to a sub-set of the available events, e.g. all events received during the last hour or the newest 100 events. The processing rules are defined as event patterns [4,5].

The Event Driven Architecture is based on standards for service interfaces and data encodings such as those provided by the OGC for geographic applications [6]. The Web Processing Service (WPS) allows clients to perform configurable remote processes over the web [7]. Web-based processing is regarded as the next step in SDIs and allows clients to build flexible processing chains [8,9]. The Web Coverage Service (WCS) provides means to query coverage datasets over the web [10]. Coverages are, for instance, remote imagery or any n-dimensional raster dataset.

Besides the services for raster data and processing, the Sensor Web Enablement (SWE) initiative at the OGC develops standards specifically for sensors, sensor systems and sensor networks. In recent years the SWE initiative has released several service specifications as well as encodings for sensor metadata and sensor measurements. Within the SWE initiative, services using publish/subscribe communication were also specified, namely the Sensor Alert Service (SAS) [11] and the Sensor Event Service (SES) [12]. The SAS allows clients to subscribe for sensor measurements using filter criteria like a bounding box or a threshold. The SES is a successor of the SAS, also based on the publish/subscribe messaging pattern.
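The data views mentioned above, i.e. length- and time-bounded windows over an event stream, can be sketched in a few lines. The class below is an illustrative Python sketch with names of our own choosing; it is not part of any OGC specification.

```python
from collections import deque
import time

class DataView:
    """Simple event data view: exposes the newest `max_events` events
    that arrived within the last `max_age_seconds` (illustrative sketch)."""

    def __init__(self, max_events=100, max_age_seconds=3600):
        self.max_events = max_events
        self.max_age_seconds = max_age_seconds
        self._events = deque()  # (timestamp, payload) pairs, oldest first

    def push(self, payload, timestamp=None):
        ts = time.time() if timestamp is None else timestamp
        self._events.append((ts, payload))
        self._evict(ts)

    def events(self, now=None):
        self._evict(time.time() if now is None else now)
        return [payload for _, payload in self._events]

    def _evict(self, now):
        # Drop events older than the time window, then cap the count.
        while self._events and now - self._events[0][0] > self.max_age_seconds:
            self._events.popleft()
        while len(self._events) > self.max_events:
            self._events.popleft()
```

An event pattern would then be a rule evaluated against `events()`, e.g. a threshold check over the values currently in the view.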
It makes use of existing standards such as the Web Services Notification (WS-N) suite from OASIS [13,14,15], Observations & Measurements [16] and the OGC Filter Encoding specification (FES) [17] for the definition of subscription filters. Additionally, it applies Event Stream Processing. The event patterns of the SES, e.g. for Event Stream Processing, are described through the Event Pattern Markup Language (EML) [18].

3. Use Case

The EDA presented in the following section is based on a use case of the GENESIS project concerning cyanobacterial bloom in the artificial Villerest reservoir. The reservoir is located north-west of Lyon, impounding the Loire River. In the summer months the reservoir is often affected by cyanobacterial blooming. During such a bloom phase the concentration of blue-green algae and natural toxins increases to a degree that it even threatens humans. Consequently, drinking the reservoir's water as well as skin contact has to be avoided. The detection model for cyanobacterial bloom is based on in-situ measurements and remote imagery data [19,20].

4. Event Driven Architecture

Based on the use case, the presented EDA needs to perform processing of in-situ measurements (i.e. ph values) as well as of remote imagery. In the following, the EDA with its components is described. In particular, the Enterprise Service Bus, the topic concept, the processing by the SES and the necessary extensions of the components are presented.

The components and data flows of the EDA are shown in Figure 1. The sensors provide ph value measurements. To encapsulate the sensors and to provide a sensor gateway, these measurements are sent to a Sensor Alert Service (SAS). The SAS feeds them into the Enterprise Service Bus (ESB), a central middleware component which serves as the communication platform. All events (e.g. sensor measurements or other notifications) in the EDA are sent to the ESB and disseminated to the subscribers. The Sensor Event Service (SES), subscribed for ph value measurements, calculates the daily ph value variation based on the sensor measurements. In case of high variations, notifications are produced that trigger the Web Processing Service (WPS).

Figure 1. Overview of the components and data flows.

The WPS accesses remote imagery data for the area in which a high variation was detected and scans the image for high concentrations of Chlorophyll A. Details on algorithms for Chlorophyll detection can be found in [21]. The processing results are made available for querying through a Web Coverage Service (WCS). In addition, clients (e.g. a GIS or a web portal) are notified that new results are available. Based on the results, the user decides about restrictions in the affected water body.

4.1. Enterprise Service Bus and Topics

The Enterprise Service Bus handles all event communication. It provides a Web Services Notification (WS-N) interface for notifications as well as for subscriptions to specific notifications, which are grouped by topics. Topics allow users to subscribe for a type of notification without the need to know details like "measurements taken by sensor XY". They are abstract filters for types of notifications and are structured as a tree, where a topic may be a node containing further sub-topics or a leaf on which notifications are published.
Clients can only subscribe for leaf topics. Figure 2 shows the topics that were implemented for the use case. The SES, for instance, is subscribed for notifications on the PH topic to receive every ph value measurement. Critical variations are published to the CriticalPHVariation topic, on which the WPS is subscribed. The distinction between Measurements and DerivedInformation allows separating the base data from information that is generated. This helps to identify the topics of interest, especially when taking further phenomena into account (e.g. oxygen saturation measurements and information derived thereof).
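The leaf-only topic tree can be sketched as follows. This is an illustrative Python sketch (the actual ESB speaks WS-Notification, not Python); the topic names are taken from the use case, while the class design is our own.

```python
class Topic:
    """Node in a WS-Notification-style topic tree. Notifications are
    published on leaf topics only (illustrative sketch)."""

    def __init__(self, name):
        self.name = name
        self.children = {}
        self.subscribers = []

    def add_child(self, name):
        child = Topic(name)
        self.children[name] = child
        return child

    @property
    def is_leaf(self):
        return not self.children

    def subscribe(self, callback):
        if not self.is_leaf:
            raise ValueError("clients can only subscribe for leaf topics")
        self.subscribers.append(callback)

    def publish(self, notification):
        if not self.is_leaf:
            raise ValueError("notifications are published on leaf topics only")
        for cb in self.subscribers:
            cb(notification)

# Rebuild a fragment of the topic tree from the use case.
root = Topic("Notifications")
measurements = root.add_child("Measurements")
derived = root.add_child("DerivedInformation")
ph = measurements.add_child("PH")
critical = derived.add_child("CriticalPHVariation")
```

Subscribing a callback to `ph` and publishing a measurement on it delivers the value to the callback, while attempting to subscribe to the non-leaf `measurements` node raises an error.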
Figure 2. Overview of the topics.

4.2. Sensor Event Service Processing

The SES calculates the daily ph value variation and checks if it exceeds a critical threshold. According to [20], a critical ph value variation is one indicator for a cyanobacterial bloom. Table 1 gives the definition of the algorithm.

Table 1. Definition of the critical ph value variation detection algorithm [20].

Name          Data Type   Unit             Value Range        Threshold for decision process
ph            Float       ph Units         0 to 14            see below
ph variation  Float       ph Units / day   -14 to 14          abs(ph variation) >= 0.5
ph Status     Logical     -                Normal / Abnormal  if abs(ph variation) >= 0.5 then ph Status = Abnormal, else ph Status = Normal

This algorithm is translated into Event Stream Processing (ESP) rules that can be executed on-the-fly in the SES (encoded as EML). For identifying cyanobacterial bloom, five event patterns are created (see Figure 3).

Figure 3. EML patterns for the critical ph value variation detection.

The ph values for a given location are passed to the first pattern. It uses a data view to provide access to the measurements from the last 24 hours. From this view the maximum and the minimum values are selected and pushed to the next pattern. Here the difference is calculated, which is the absolute value of the maximum ph value variation during the last 24 hours (see Table 1, thresholds). Next, this difference is forwarded to a pattern checking if the threshold of 0.5 ph units is exceeded. In theory one could stop here and generate a notification to trigger the WPS process. But imagine a situation in which the ph value changed by more than 0.5 during the last 20 hours. In that case the three patterns described above generate an output for every new measurement received during the next four hours. This leads to needless triggering of the WPS, as the critical situation happened earlier and was already recognized.
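The detection logic can be condensed into a small sketch: a 24-hour data view, the max-min variation, the threshold check, and a state flag so that only the first exceedance produces a notification, avoiding the needless re-triggering just described. This is illustrative Python of our own design; the real SES evaluates EML patterns, and only the 24-hour window and 0.5 threshold are taken from the text.

```python
from collections import deque

WINDOW_SECONDS = 24 * 3600
THRESHOLD = 0.5  # ph units, after [20]

class PhVariationDetector:
    """Condenses the five EML patterns into one sketch: 24 h data view,
    max-min variation, threshold check, and edge detection so that only
    the first exceedance fires (illustrative, not the actual SES)."""

    def __init__(self):
        self._window = deque()      # (timestamp, ph) pairs, oldest first
        self._was_critical = False  # state for the low->high edge pattern

    def on_measurement(self, timestamp, ph):
        self._window.append((timestamp, ph))
        while self._window and timestamp - self._window[0][0] > WINDOW_SECONDS:
            self._window.popleft()
        values = [v for _, v in self._window]
        variation = max(values) - min(values)        # max-min over 24 h
        critical = variation >= THRESHOLD
        fire = critical and not self._was_critical   # first exceedance only
        self._was_critical = critical
        return fire  # True would trigger the WPS notification
```

Once the variation drops back below the threshold, the state flag resets, so a later critical change fires again, mirroring the behaviour of the fourth and fifth pattern.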
To avoid this behavior of the system, two more patterns are needed. The first looks for ph value changes that are within an uncritical range (< 0.5). The results of the two threshold patterns are then given to the last pattern, which checks if an uncritical variation (low) is followed by a critical one (high). If this criterion is matched, the threshold was exceeded for the first time; later critical changes are ignored. In order to generate an output again, the ph value variation has to drop back to a normal state.

4.3. Extensions of Components

Several adaptations to the components were required to execute the workflow described above. First, the Sensor Alert Service had to be connected to the Enterprise Service Bus. There were two options: either to change the SAS to provide WS-Notification compliant notifications, or to extend the ESB to also accept notifications sent via XMPP, the messaging protocol used by the SAS. The second option was implemented as it allows an easier integration of external sensors provided via regular SAS instances. For this aim the Sensor Event Service also had to be extended to be able to handle SAS notifications. A larger modification was made to the Web Processing Service as it had to be connected via WS-Notification to the ESB. This connector is able to receive notifications from the SES, extract the location and further necessary information, and build the request to execute the Chlorophyll A detection algorithm. The notifications sent after the process finishes are also built by this connector and sent to the ESB.

5. Conclusion and Outlook

This article describes an Event Driven Architecture for environmental monitoring and health assessment. The presented architecture is based on standards for web services (e.g. WS-N, SES and WPS). The architecture has been applied to a use case based on the fresh water quality pilot C from the FP-7 project GENESIS.
The presented architecture shows several benefits of an Event Driven Architecture compared to a classic Service Oriented Architecture. First, the system works on-the-fly, which means that processes are started as soon as possible, namely as soon as the input data (e.g. in the case of the SES) or the trigger (e.g. in the case of the WPS) is available. This way, the decision maker gets notified when a new decision is needed instead of guessing when new information for the decision has to be requested. Furthermore, there is no need to frequently request these inputs, which may lead to unnecessary requests (if the frequency is too high) or to missed information (if the frequency is too low).

The architecture is also very flexible in terms of the application area and deployment context. Due to the loose coupling of the components, data sources can be exchanged easily. New sensors can be added or other remote imagery stores can be used at run time. Service instances can also be added or exchanged without stopping the system. This allows migrating services to more powerful machines, adding redundant backup machines to avoid downtimes, or adding new services to extend the workflow, e.g. decision support services that automatically provide recommendations on necessary actions based on the WPS processing results. Furthermore, the processing rules and algorithms of the WPS and SES can be changed during run time by altering the subscribe (SES) or execute (WPS)
requests. This might, for instance, be necessary due to a change of the remote imagery source. From all this it also follows that the system needs to be set up and initialized only once. From that point on it will automatically execute the whole workflow. Using an Enterprise Service Bus as the communication infrastructure also provides a single point of access to data. This might be the final WPS processing results but also the intermediate results of the SES or the SAS notifications.

To make use of these benefits, it is envisioned to integrate event based communication mechanisms into existing Spatial Data Infrastructures (SDIs) rather than to replace the current request-response based technologies. This was also done in the EDA presented in this paper: the remote imagery was not sent via the ESB but stored in a Web Coverage Service; only notifications referring to new processing results were distributed this way. This approach is also followed by the newly formed Pub/Sub Standards Working Group (SWG) of the OGC, which develops a standard for publish/subscribe usable by existing OGC web services.

Acknowledgement

This work was supported by the European Commission through the GENESIS project, an Integrated Project (contract number ). Special thanks go to the project partners that helped to implement the presented EDA.

References

[1] GENESIS Consortium, Project overview, 2010, last accessed on [Online]. Available:
[2] T. Everding and J. Echterhoff, OGC OWS-6 SWE Event Architecture Engineering Report, Open Geospatial Consortium, Tech. Rep., OGC document number .
[3] D. Luckham and R. Schulte, Event processing glossary - version 1.1, Event Processing Technical Society, Tech. Rep.
[4] D. Luckham, The Power of Events. Addison-Wesley.
[5] O. Etzion and P. Niblett, Event Processing in Action. Manning Publications Co.
[6] Open Geospatial Consortium, About OGC, 2010, last accessed on [Online]. Available:
[7] P. Schut, OpenGIS Web Processing Service, OGC document number r7, Open Geospatial Consortium Std.
[8] C. Kiehle, K. Greve, and C. Heier, Standardized geoprocessing - taking spatial data infrastructures one step further, in 9th AGILE International Conference on Geographic Information Science, 2006.
[9] B. Schaeffer and T. Foerster, A client for distributed geo-processing and workflow design, Journal for Location Based Services, vol. 2(3).
[10] A. Whiteside and J. D. Evans, Web Coverage Service (WCS) Implementation Standard, OGC document number r5, Open Geospatial Consortium Std.
[11] I. Simonis and J. Echterhoff, OGC Sensor Alert Service Implementation Specification, OGC document number r3, Open Geospatial Consortium, Tech. Rep.
[12] J. Echterhoff and T. Everding, OpenGIS Sensor Event Service Interface Specification (proposed), Open Geospatial Consortium, Tech. Rep., OGC document number .
[13] S. Graham, D. Hull, and B. Murray, Web Services Base Notification 1.3, OASIS Std.
[14] D. Chapell and L. Liu, Web Services Brokered Notification 1.3, OASIS Std.
[15] W. Vamberge, S. Graham, and P. Niblett, Web Services Topics 1.3, OASIS Std.
[16] S. Cox, Observations and Measurements - Part 1 - Observation schema, OGC document number r1, Open Geospatial Consortium Std.
[17] P. A. Vretanos, OpenGIS Filter Encoding Implementation Specification Version 1.1.0, OGC document number , Open Geospatial Consortium Std., 2005.
[18] T. Everding and J. Echterhoff, Event Pattern Markup Language (EML), Open Geospatial Consortium, Tech. Rep., OGC document number .
[19] M. Gerbaux, Fresh water quality state of the art report, GENESIS, Tech. Rep. 3.0.
[20] Fresh water quality thematic pilot specification document, GENESIS, Tech. Rep. 1.0.
[21] M. M. Islam and C. W. Tat, A comparison of empirical algorithms for chlorophyll concentration in Singapore regional waters, in Proceedings of the 22nd Asian Conference on Remote Sensing, Singapore, November 2001.
Handling of spatial data for complex geoscientific modelling and 3D landfill applications with DB4GeO

Martin BREUNIG, Edgar BUTWILOWSKI, Paul Vincent KUPER, Norbert PAUL a; Andreas THOMSEN, Sabine SCHMIDT, Hans-Jürgen GÖTZE b
a Geodetic Institute, Karlsruhe Institute of Technology, Germany {martin.breunig, edgar.butwilowski, paul.kuper,
b Institute of Geoscience, Christian-Albrechts-Universität Kiel, Germany {athomsen, sabine,

Abstract. The handling of geometric and topological data is a central task for many geo-scientific applications. However, the management of complex 3D data is not well supported by current geo-database management systems. Some ideas and results concerning the management of geometric and topological data in the context of a specific geo-scientific scenario and a landfill scenario are presented.

Keywords: 3D database, geometric data, topological data, meta-information, landfill data, DB4GeO.

Introduction

Decision making on competing and sometimes mutually exclusive possible uses of the underground, e.g. for geothermal energy or intermittent storage of excess renewable energy, must be supported by a comprehensive information basis, despite the sparseness and cost of available basic subsurface data. Advanced geoscientific methods are therefore required for the assessment of underground use potential and risk estimation. Thus 3D subsurface models play a central part, integrating a variety of information from different sources with geological expert knowledge. For the management of such models in a distributed multi-user system, spatial 3D database technology is required. In the following we present two different scenarios and current work on the development of DB4GeO [1] [2], a 3D database kernel supporting the handling of complex 3D geo-scientific models [3] [4] [5].

1. Complex geo-scientific applications using a 3D database

1.1.
Meta-Information, 3D databases and visualization tools as a basis for synoptic interpretation

Data integration in the geosciences may cover diverse fields such as mineralogy, geochemistry, hydrology, geomechanics, geophysics, structural geology, sedimentology etc., each employing a different set of numerical and statistical methods.
A special case is the interactive construction of geological 3D underground models, as it is based on a variety of information sources and, in particular, comprehensive expert knowledge. As shown in figure 1, WWW technology is helpful to build, instead of a single monolithic system, a loose structure consisting of a network of individual sites, with an XML-based data representation and transformation wherever the incurred redundancy is tolerable. The integration is enhanced by a common metadata and information database that provides metadata and background information for processes. Semantic Web technology shall be used to document relationships between different information items, and to represent the flow of information between the different processes leading to a particular result. For this purpose we currently investigate the use of Topic Map [6] and RDF [7] techniques.

Figure 1: Outline of the heterogeneous software environment of a geoscientific project comprising DB4GeO for spatial data and eXist for meta-information

Sampling, experiment and simulation results provide the information basis for a synoptical representation and interpretation. As a central part of the software architecture, the 3D database management system for geological 3D models handles the spatial basis for information integration and numerical modeling. In combination with 3D stereo-visualization [8] and the textual and graphic display of numerical results and meta-information, it provides the basis for a synoptic interpretation. Our geo-database DB4GeO is integrated into a network of different geo-scientific projects in order to manage comprehensive geological 3D models built with GOCAD 1 [9] at the regional geological service of Schleswig-Holstein 2 [3]. For 3D visualization, software such as GOCAD, IGMAS+ [4], ParaView [10], Geocando [11] and GeoWall hardware [8] can be employed to get a better insight into complex 3D models and their interrelations.
Additional background information on the employed methods and documentation of the interaction between the different projects will enhance the synoptic interpretation of the combined models.

1 GOCAD software is distributed by Paradigm.
2 Landesamt für Landwirtschaft, Umwelt und ländliche Räume SH, Abt. Geologie, Flintbek, Germany

1.2. Piesberg landfill application

At the so-called Piesberg, near Osnabrück, Germany, the upper Carboniferous of the Ibbenbüren coal mining district reaches the surface. Here coal mining formerly took place underground, while at the surface a quarry was situated, part of which later served as a waste deposit site. Finally, the site has been transformed into a culture and nature park with a mining museum [5]. In the Piesberg landfill application, spatial data was collected during a period of eighteen years. Today this data is used for a new project examining 3D data management in the geosciences 3. Figure 2a shows a single time step of these geological data representing a Triangulated Irregular Network (TIN) of a Piesberg stratum surface. The Piesberg data set [5] combines a spatially selected part of the Digital Elevation Model (DEM) of Osnabrück City in North Germany, a SICAD drawing showing the breaking edges around the dumpsite, and several cross-sections. The 2D drawing has been combined and interpolated with the manually digitized cross-sections to generate several 2.5D TINs of the geomodel. The DEM has been used to expand the geomodel at its boundaries. At a single time step the model consists of about 30K triangles.

Figure 2. a) Geological application Piesberg region showing a single time step of a stratum surface represented as TIN; b) Same region aggregated into three faces according to some thematic criterion

2. Handling of 3D spatial data with DB4GeO

DB4GeO [1], [2] builds on many years of experience with earlier spatial database developments, in particular the development of GeoToolkit [12] at Bonn University within the Collaborative Research Centre 350 funded by the German Research Foundation.
This former geo-database prototype was already used by geological and geophysical research groups [13]. DB4GeO is an object-oriented database software completely written in Java, based on the OODBMS software db4o [14]. It differs in some aspects from known commercial or open-source geo-DBMS like ArcGIS/ArcSDE (ESRI), ORACLE Spatial or PostGIS (Refractions), cf. e.g. [15]:

3 Project 3D data and model management for the geosciences, funded by the German Research Foundation (DFG)
- DB4GeO was designed from the outset for fully 3D geoscientific applications.
- Different from the usual object-relational approach, it is object-oriented.
- The use of 0- to 3-dimensional simplicial complexes as the underlying mathematical model provides a sound basis for geometric and topological operations on spatial objects. This feature should be maintained during future extensions of the range of supported spatial structures.
- It was designed with a service-based architecture right from the beginning [2]. Currently the HTTP protocol in REST style [16] is used for internet access.

2.1. Handling of geometry

The present implementation of DB4GeO supports a number of geometric operations, which can be combined into sequences of operations as complex geometric services [2]. For example, profile sections may be computed from a geological 3D subsurface model. DB4GeO is accessed exclusively via its geometric and topological services. The kernel of DB4GeO consists of its 3D geo-database, which is based on a geometry library and R-tree based spatial access methods. Data are organized in projects that comprise one or more spaces. Each space has a Spatial Reference System Identifier (SRID) and a scale, and contains all associated spatial and thematic objects. Hitherto DB4GeO only supports simplicial complexes (point sets, polylines, triangulated surfaces, tetrahedron meshes) as spatial representations. However, future work will also deal with the extension to other spatial data types such as grids.

2.2. Handling of topology

With data as shown in figure 2a, it can be examined how units comprising many triangles can be topologically defined to represent thematically homogeneous parts such as geologically defined surfaces (see figure 2b) and solids with given densities. 3D simplicial complexes such as polylines, triangulated surfaces and tetrahedron nets are well suited for the modeling of irregular geological objects.
With such spatial structures, topological and geometric operations go hand in hand, and there is not much need for a clear separation of the two. For the modeling of complex solids, however, boundary representation (BREP) is more frequently used. In addition, for some geoscientific applications, e.g. gravity and geomagnetic modeling with IGMAS+ [4], BREP is better suited for the employed algorithm. Also, for applications such as finite element (FE) models [17], a restriction to simplicial complexes would exclude a number of FE classes. Yet another problem is the scaling of 3D models [18]: the aggregation of a given subset of 3D simplices according to some thematic criterion generally does not yield a simplex, but rather an irregular shape, albeit with a well defined topology obtained by the suppression of internal boundaries. Such a shape lends itself to some sort of boundary representation. Basic spatial operations defined for simplices, such as point in simplex, intersection of (0-, 1-, 2-, 3-) simplices, boundary of a simplex, and size (length, area, volume), can be extended to simplicial complexes (e.g. in DB4GeO). The same holds for some basic consistency checks based on these operations. However, the extension of these operations to arbitrary 3D bodies in boundary representation is not always obvious, and the developer may be left with a great number of special case problems to handle.
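The size operation for simplices and its extension to simplicial complexes by summation can be illustrated with the Gram determinant. This is a self-contained Python sketch of our own, not DB4GeO's Java implementation.

```python
import math

def det(m):
    """Determinant by cofactor expansion (fine for matrices up to 3x3)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] *
               det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def simplex_size(vertices):
    """Size (length, area, volume) of a k-simplex embedded in 3D, given as
    k+1 vertices, via the Gram determinant:
        size = sqrt(det(E E^T)) / k!
    where the rows of E are the edge vectors v_i - v_0."""
    v0 = vertices[0]
    edges = [[vi[d] - v0[d] for d in range(3)] for vi in vertices[1:]]
    k = len(edges)
    # Gram matrix G[i][j] = <e_i, e_j>
    g = [[sum(a * b for a, b in zip(ei, ej)) for ej in edges] for ei in edges]
    return math.sqrt(abs(det(g))) / math.factorial(k)

def complex_size(simplices):
    """Extend 'size' from simplices to a simplicial complex by summation."""
    return sum(simplex_size(s) for s in simplices)
```

For example, the segment from (0,0,0) to (3,4,0) has size 5, the unit right triangle has area 0.5, and the unit right tetrahedron has volume 1/6.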
For topological modeling, the Generalized Map (GMap) [19] and the closely related Cell-Tuple Structure [20] provide a unifying approach: at all levels of detail/aggregation, regardless of the dimension of the model, the same type of formal structure can be used for the representation of topology, and, at the cost of some redundancy, simplicial complexes can be mapped onto GMaps as well. A set of elementary and complex topological operations and their RDBMS implementation has been presented in [21][22]; these operations are now being integrated into the DB4GeO kernel. GMaps and Cell-Tuple Structures have been employed for large 3D building models and their visualization [18]; a current application is the modeling of the development of the Piesberg (see above).

3. Outlook

In our future work we will also focus on the handling of spatio-temporal data with DB4GeO, revising the temporal data model implemented in DB4GeO. We intend to provide a more flexible spatio-temporal model allowing, firstly, the insertion of new versions of geo-objects within a scene and, secondly, a reduction of the storage space needed for spatio-temporal 4D models. Furthermore, we will try to develop suitable DB4GeO geometric extensions for regular and for deformable grids, and possibly for other spatial meshes which are not modeled by simplicial complexes. We are also planning to enable complex spatial and thematic queries in DB4GeO using query scripts. Last, but not least, a different topology approach called Relational Chain Complexes is examined in our current research.

Acknowledgements

We thank the Geological Survey of Schleswig-Holstein for providing example data from the regional subsurface model, as well as Jürgen Berlekamp from USF, Osnabrück University and the Survey Office of Osnabrück city for the grants to use the Piesberg data for scientific purposes.
Last, but not least, we acknowledge the support by the GEOTECHNOLOGIEN research program, BMBF grant 03G0644A, and DFG grant BR2128/11-1.

References

[1] Bär W. (2007): Verwaltung geowissenschaftlicher 3D Daten in mobilen Datenbanksystemen. Ph.D. Thesis, University of Osnabrück, Germany, 166p.
[2] Breunig M., Schilberg B., Thomsen A., Kuper P.V., Jahn M., Butwilowski E. (2010): DB4GeO, a 3D/4D Geodatabase and its Application for the Analysis of Landslides. In: Konecny M., Zlatanova S., Bandrova T.L. (Eds.): Geographic Information and Cartography for Risk and Crisis Management. Springer LNG&C, 2010.
[3] Hese F., Liebsch-Dörschner T., Offermann P., Rheinländer J. (2010): Geologische 3D Modelle des tiefen Untergrundes Schleswig-Holsteins. Jahrestagung der Deutschen Gesellschaft für Geowissenschaften und der Geologischen Vereinigung, Darmstadt.
[4] Schmidt S., Götze H.-J., Fichler Ch., Ebbing J., Alvers M.R. (2007): 3D Gravity, FTG and Magnetic Modeling: the new IGMAS+ Software. Int. Workshop Innovation in EM, Grav., Mag. Methods, Capri, Italy.
[5] Berlekamp J., Lautenbach S. (2002): Data set for visualization of central landfill Piesberg at Osnabrück. Institute for Environmental System Research, University of Osnabrück, Germany.
[6] Pepper S. (2002): The TAO of Topic Maps, retrieved
[7] Berners-Lee T. (1998): Semantic Web Road map, retrieved
[8] GeoWall Consortium, retrieved
[9] Mallet J.L. (1992): GOCAD: A computer aided design programme for geological applications. In: Turner A.K. (Ed.): Three-Dimensional Modelling with Geoscientific Information Systems, proc. NATO ASI 354, Kluwer Academic Publishers.
[10] ParaView, Kitware, retrieved
[11] Geocando (acc. )
[12] Balovnev O., Bode Th., Breunig M., Cremers A.B., Müller W., Pogodaev G., Shumilov S., Siebeck J., Siehl A., Thomsen A. (2004): The Story of GeoToolKit - an Object-oriented Geo-Database Kernel System. Geoinformatica 8:1, Kluwer Academic Publishers.
[13] Alms R., Balovnev O., Breunig M., Cremers A.B., Jentzsch T., Siehl A. (1998): Space-time modelling of the Lower Rhine Basin supported by an object-oriented database. Physics and Chemistry of the Earth, Vol. 23, No. 3, Elsevier Science Ltd.
[14] db4objects by Versant, retrieved
[15] Schön B., Laefer D.F., Morrish S.W., Bertolotto M. (2009): Three-Dimensional Spatial Information Systems: State of the Art Review. Recent Patents on Computer Science, 2.
[16] Fielding R.Th. (2000): Architectural styles and the design of network-based software architectures. Ph.D. thesis, University of California, Irvine, 162p.
[17] Wang W., Kosakowski G., Kolditz O. (2009): A parallel finite element scheme for thermo-hydro-mechanical (THM) coupled problems in porous media. Comp. Geosci. 35.
[18] Fradin D., Meneveaux D., Lienhardt P. (2002): Partition de l'espace et hiérarchie de cartes généralisées. AFIG 2002, Lyon, 12p.
[19] Lienhardt P. (1989): Subdivisions of n-dimensional spaces and n-dimensional generalized maps. In: Association for Computing Machinery (Ed.): Proceedings of the fifth annual symposium on Computational Geometry. Washington: ACM Press.
[20] Brisson E. (1989): Representing Geometric Structures in d Dimensions: Topology and Order. In: Association for Computing Machinery (Ed.): Proceedings of the 5th ACM Symposium on Computational Geometry. Washington: ACM Press.
[21] Thomsen A., Breunig M. (2007): Some remarks on topological abstraction in multi representation databases. In: Popovich V., Schrenk M., Korolenko K. (Eds.): 3rd workshop Inf. Fusion & GIS, Springer, Berlin, 2007.
[22] Thomsen A., Breunig M., Butwilowski E., Broscheit B. (2007): Modelling and Managing Topology in 3D Geoinformation Systems. Proceedings of 3D Geoinformation 2007, Delft, Lecture Notes in Geoinformation and Cartography, Springer, Heidelberg.
Design With Nature 2.0 - A Geodata Infrastructure Approach to Map Overlay

Claus RINNER a,1; Martin DÜREN b
a Department of Geography, Ryerson University, Toronto, Canada
b Institute for Geoinformatics, University of Münster, Münster, Germany

Abstract. McHarg's Design With Nature was a precursor of the layer model in modern geographic information systems. We review the layer overlay approach from a geospatial data infrastructure (GDI) perspective and experiment with weighted map overlay using Web map services. A case study on natural hazard risk assessment for Southern Quebec illustrates this visual approach to multi-criteria analysis using online mapping. We conclude with a call for more research on thematic mapping and its use for decision support within GDIs.

Keywords. Geospatial data infrastructure, map overlay, multi-criteria analysis

1. Introduction

Ian McHarg is widely credited with inventing the layer model that was adopted in geographic information systems (GIS) (e.g. Goodchild 1992). In his seminal book, Design With Nature, McHarg (1992) used semi-transparent map overlays for suitability selection processes, including highway route selection. His printed map transparencies represented social and environmental values around potential routes by degrees of grey shading. The overlay resulted in sensitive areas being masked through multiple overlaid maps while more suitable areas remained translucent. Subsequently, the computational overlay of maps has been formalized in map algebra and cartographic modeling (Berry 1987, Tomlin 1990). Map algebra conceptualizes a set of mathematical operators applied to raster data to derive new map layers. It includes various overlay operators and is implemented in commercial GIS software. However, the visual overlay via regular map layer management is the most immediate implementation of McHarg's (1992) approach in today's desktop GIS.
Map overlay is also a key feature of geospatial data infrastructures (GDI). A basic tenet of GDIs is to leave datasets under the custody of an authoritative provider; access them through distributed Web services; and overlay them on the end-user s screen. The OpenGIS Web Map Service (WMS) (de La Beaujardiere 2006) is among the services that support visual overlay à la McHarg, while recent developments aim to support computational overlay, e.g. Holzmeier and Ostländer s (2005) Web Map Algebra Service. While WMS is extensively used, it is usually employed for general reference maps, e.g. road maps. There are few examples of thematic maps within GDIs in general, despite the capability of maps to visualize spatial patterns of natural, 1 Corresponding Author.
demographic, and socio-economic phenomena, and to support associated planning and decision-making. In this paper, we replicate McHarg's Design With Nature in a GDI environment in order to illustrate the decision support capabilities of simple, visual overlay of online thematic maps. Section 2 provides a brief overview of weighted overlay and multi-criteria analysis as they apply to this work. In Section 3, we summarize the pertinent aspects of the WMS specification, the OpenLayers development platform, the alpha blending approach for the rendering of semi-transparent map overlays, and the use of styled layer descriptors. Section 4 illustrates the user interface and functionality of the prototype client. Section 5 describes a case study of using online map overlay for natural hazard assessment in the context of global environmental change. Finally, Section 6 concludes the paper with a summary of results and an outlook on future work.
2. Weighted Map Overlay and Multi-Criteria Analysis
Weighted overlay of map layers in GIS is an extension of the general idea of map overlay described above. Eastman et al. (1995) were among the first to systematically describe raster data overlay and provide an implementation in the Idrisi GIS. The authors distinguish between constraints and factors. A constraint limits the set of feasible alternatives in a decision situation by establishing strict boundaries or threshold values to be met. In contrast, a factor enhances or detracts from the suitability of a specific alternative for the activity under consideration (Eastman et al. 1995, p. 539). In other words, higher or lower measurements on a factor affect the suitability of locations without setting specific minimum or maximum thresholds. This terminology links weighted map overlay to the broader field of GIS-based multi-criteria analysis (Malczewski 1999). Each map layer represents a criterion that measures the outcome of a decision.
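In code, the factor/constraint logic behind weighted overlay can be sketched as follows (an illustrative sketch with made-up layer names, not Eastman et al.'s Idrisi implementation):

```python
# Weighted linear combination of raster criterion layers (factors),
# masked by a boolean constraint grid -- an illustrative sketch.
def weighted_overlay(factors, weights, constraint):
    """factors: grids with values scaled to 0..1; weights should sum to 1;
    constraint: 0/1 grid marking feasible cells."""
    rows, cols = len(constraint), len(constraint[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            score = sum(w * f[i][j] for w, f in zip(weights, factors))
            out[i][j] = score * constraint[i][j]  # constraint masks infeasible cells
    return out

slope_suitability = [[0.2, 0.8]]    # hypothetical factor layers
landuse_suitability = [[1.0, 0.5]]
feasible = [[1, 0]]                 # second cell excluded by a constraint
result = weighted_overlay([slope_suitability, landuse_suitability], [0.4, 0.6], feasible)
print([[round(v, 2) for v in row] for row in result])  # [[0.68, 0.0]]
```

The factor values raise or lower a cell's score continuously, while the constraint simply zeroes out infeasible cells, mirroring the distinction drawn by Eastman et al.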
Weights are used to represent the different levels of importance attributed to the criteria. Weighted criterion layers can be summed using map algebra tools to create a weighted average of criterion values across a study area. The weighted average, also known as weighted linear combination, is among the most common multi-criteria methods (Malczewski 2000). This process results in an arithmetic combination of input layers into a single output layer, the cell values of which represent the suitability or desirability of each location for the decision to be made. McHarg's (1969) approach can be seen as a purely visual implementation of geospatial multi-criteria analysis, since the grey shades resulting from semi-transparent map overlay represent site suitability in the same way as a GIS-based weighted overlay. We propose to use weighted map overlay as a decision support approach within GDIs.
3. Geospatial Data Infrastructures and Web Mapping Concepts
3.1. Web Map Servers: Use of GeoServer
A GDI is an online platform that provides analysts, decision-makers, and the general public with access to geographic information in the form of maps and geospatial data. Web map servers in a broad sense are a key element of any GDI, and the OpenGIS Web Map Service (WMS) specification provides a widely accepted foundation for mapping services. A WMS must respond to two basic requests: GetCapabilities, which
provides information about available themes and map layers, their geographic extent, and the available graphic output formats; and GetMap, which retrieves an image of a map that has been prepared on the server using parameters set by the client, including data layer, map extent, and output format. A styled layer descriptor (SLD) is an XML schema that extends the WMS standard and allows user-defined symbolization and colouring of geographic data (Lupp 2007). The SLD implementation specification defines an XML styling language that allows users to define rules to control how data are rendered. A layer-specific symbolization can be included in a WMS request by adding an SLD parameter to the request URL. In this project, GeoServer was used to provide access to the case study data. GeoServer is an open source Java software package for sharing geospatial data. It implements several OpenGIS service standards, including the WMS with SLD.
3.2. Web Map Client: Use of OpenLayers
The client application in this project was built using OpenLayers, an Open Source Geospatial Foundation (OSGeo) project, which provides a JavaScript library for integrating maps from distributed sources into the HTML code of a Web page. The map overlay client also uses the jQuery JavaScript library, which helped to define control elements such as the menu tab used in this application.
3.3. Alpha Blending
Alpha blending is an algorithm used in computer graphics to combine overlapping, partially transparent picture elements. It adds an alpha channel to an image's colour information. The alpha channel contains information about the opacity of pixels, with 0.0 being transparent and 1.0 being opaque. For example, a 50% black colour in an RGBA colour model would be represented by the tuple (0.0, 0.0, 0.0, 0.5), where (0.0, 0.0, 0.0) is the RGB encoding for black and the additional alpha value is set to 0.5. Figure 1.
Logical overlay of partially transparent (translucent) picture elements with additive combination of opacities (left), compared with graphical overlay using multiplicative alpha blending (right).
The use of alpha blending as the default image overlay method in Web clients creates an inconsistency in the visual overlay of weighted map layers. In a greyscale scenario, the three layers shown in Fig. 1 have opacities of 50%, 30%, and 20%. If
opacities are considered to represent criterion weights, their weights should add up to 100%, as shown in Fig. 1 (left). However, alpha blending does not use additive opacity. For example, the triple overlay in the centre of Fig. 1 (right) does not receive full black opacity. Instead, the graphical overlay results in a grey shade with an opacity of 1 - (1-0.5)*(1-0.3)*(1-0.2) = 1 - 0.5*0.7*0.8 = 1 - 0.28 = 0.72. The RGB tuple (71, 71, 71) corresponds to a 72% black.
4. Prototype User Interface and Functionality
A prototype implementation of the map overlay tool includes the following functions:
- Import layers from Web Map Services and overlay them in the Web client
- Set up with a default collection of WMS URLs suitable for demonstration
- Switch layers on and off, and change the order of layers
- Basic functions of a Web mapping client such as panning and zooming
- Change layer opacity by distributing 100% over the selected layers
The user interface (Fig. 2) consists of two main elements: map and menu. The map includes controls for panning and zooming as well as control elements for choosing base layers and turning thematic layers on and off. The menu is organized into functional areas. The Add Data area provides text fields for adding layers from OpenGIS-compliant Web services, along with sample WMS and WFS layers for direct access. The Change Opacities area contains the sliders for modifying the opacity of each layer. When a layer is added to the map, a new opacity slider is added automatically. Similarly, in the Layer Order area of the menu, buttons for controlling the layer arrangement are added automatically for each layer. The current layer order can also be seen in the control elements within the map.
Figure 2. Screenshot of the prototype's user interface.
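The multiplicative compositing described above is easy to verify; a minimal sketch (not part of the prototype):

```python
# Multiplicative alpha blending: the combined opacity of stacked
# semi-transparent layers, as composited by graphics libraries and browsers.
def blended_opacity(opacities):
    """Opacity resulting from overlaying layers with the given opacities (0..1)."""
    transmitted = 1.0
    for a in opacities:
        transmitted *= (1.0 - a)  # fraction of light passing through all layers
    return 1.0 - transmitted

combined = blended_opacity([0.5, 0.3, 0.2])  # the three layers from the example
print(round(combined, 2))             # 0.72, not the additive 1.0
print(round(255 * (1 - combined)))    # 71: the grey value on a white background
```

This is why criterion weights that sum to 100% do not produce a fully opaque result under standard Web-client compositing.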
5. Case Study: Natural Disaster Risk Assessment
The global risk data platform (Giuliani and Peduzzi 2011) is an international effort to collect and share information about natural hazards. Access to the data is provided as maps on the site itself or through Web services. Their WMS offers different maps, including those of risk levels classified between very low and very high, with associated colour coding (e.g. dark red for highest risk). This fixed colour scheme and legend were not suitable for map overlay. Therefore, copies of the data were served from a custom WMS with SLD support. For illustration purposes, the colour black was given to all pixels that were encoded with any risk level above zero.
Figure 3. Flood risk areas in Southern Quebec: overlay of UNEP WMS with Google Maps base layer.
Three layers representing the risk of cyclones, floods, and earthquakes were included in this case study. The map extent was set to Southern Quebec, Canada. This area borders the Saint Lawrence River, a major shipping route that connects the Great Lakes with the Atlantic Ocean. Quebec City is one of the oldest settlements in North America and its historic district is a UNESCO world heritage site. The City lies at a narrowing of the Saint Lawrence River and can be seen slightly east of the centre of the maps in Fig. 3 and Fig. 4. Fig. 3 shows black pixels (given full visual weight) that represent areas identified as prone to flood risk by Giuliani and Peduzzi (2011). These areas include the shore of the Saint Lawrence River upstream from Quebec City as well as many of its tributaries.
Figure 4. Overlay of three thematic WMS layers using transparency to represent a hypothetical importance weighting: cyclone risk (20%), flood risk (30%), and earthquake risk (50%).
In Fig. 4, darker grey shades visually indicate the overlap of multiple natural hazard risks. The risks of cyclones, floods, and earthquakes are given hypothetical weights (opacities) of 20%, 30%, and 50%, respectively. The study area shows generally low levels of risk, with a concentration of higher risk near Quebec City and along the Saint Lawrence River. An inspection of the three individual layers confirms the similarity in geospatial patterns and the considerable spatial coincidence of natural hazard risk in Southern Quebec.
6. Summary and Outlook
The goal of this research was to illustrate the concept of weighted map overlay in a GDI environment and to spark discussion about the use of WMS for spatial decision support. An interesting advantage of the purely visual, non-computational overlay used here is that distributed data sources of different qualities (e.g. spatial resolution) can be jointly explored by the end-user without compatibility requirements. The prototypical implementation of a map overlay client was fast and cost-effective, in part due to the use of open source software tools. The current tool needs improvements in a few functions, in particular a control that keeps layer weights adding up to 100% in accordance with multi-criteria analysis methods. In addition, we plan to expand the testing of the tool to a regional or local scenario with more decision criteria and the involvement of stakeholders who would provide feedback on the weighting approach. More broadly, we argue that thematic mapping merits more attention from GDI researchers and developers. While specifications and tools are in progress that support the computational combination of data (e.g. the Web Processing Service), the basic overlay of thematic map layers in GDIs has not been studied and promoted enough. Geospatial Web services are still a far cry from being able to replace professional GIS mapping.
Acknowledgements
Partial funding for this research was provided by the Canadian GEOIDE Network of Centres of Excellence.
References
[1] J.K. Berry (1987) A Mathematical Structure for Analyzing Maps. Env. Management 11(3)
[2] J. de La Beaujardiere (2006) OpenGIS Web Map Server Implementation Specification. Open Geospatial Consortium
[3] J.R. Eastman, W. Jin, P.A.K. Kyem, J. Toledano (1995) Raster Procedures for Multi-Criteria/Multi-Objective Decisions. Photogrammetric Engineering and Remote Sensing 61(5)
[4] G. Giuliani, P. Peduzzi (2011) The PREVIEW Global Risk Data Platform: a geoportal to serve and share global data on risk to natural hazards. Natural Hazards and Earth System Science 11(1)
[5] M.F. Goodchild (1992) Geographical information science. International Journal of Geographical Information Systems 6(1)
[6] R. Holzmeier, N. Ostlaender (2005) Map-algebra goes online: Introducing the WMAS. In I. Simonis (ed.) Sensors and Geographic Networks, GI-Days, May 2005, Münster, Germany
[7] M. Lupp (2007, ed.) Styled Layer Descriptor Profile of the Web Map Service. Implementation Specification, Open Geospatial Consortium, reference number OGC r4
[8] J. Malczewski (1999) Spatial Multicriteria Decision Analysis. In J.-C. Thill (ed.) Spatial Multicriteria Decision Making and Analysis. Chapter 2. Aldershot: Ashgate
[9] J. Malczewski (2000) On the Use of Weighted Linear Combination Method in GIS: Common and Best Practice Approaches. Transactions in GIS 4(1): 5-22
[10] I.L. McHarg (1992) Design With Nature. 25th Anniversary edition. John Wiley & Sons, New York
[11] C.D. Tomlin (1990) Geographic Information Systems and Cartographic Modeling. Prentice Hall
Track-based OSM Print Maps
Holger FRITZE 1; Dustin DEMUTH; Kristina KNOPPE; Klaus DRERUP
Institute for Geoinformatics, University of Muenster, Weseler Str. 253, Münster
Abstract. Mobile devices have become more and more powerful in recent years and can be used for locating and navigation. But are they always a better choice than a paper map? In many situations, e.g. on a long hike, a paper map is still useful. This project aims at providing map booklets based on OpenStreetMap data. In contrast to similar projects that create print maps for an area of interest, this project creates a map booklet along a given GPS track with customizable rendering options and overlapping map sections.
Keywords. OpenStreetMap, print map, GPS track, tiling, rendering
1. Introduction
GPS devices or smartphones with specific applications can help you to easily locate yourself in and navigate through unknown surroundings. But traditional paper maps still offer advantages over mobile devices. For example, on long hiking trips it might be unforeseeable when it will be possible to recharge your electronic device. In such a situation a printed map is a valuable alternative. Also, it is much easier to get an overview of an area on a high-resolution A4 paper map than on a 3-inch display. Searching the web, there are many solutions to export and print map views. However, there is no service that allows users to create customized OpenStreetMap paper maps with integrated tracks or user-defined visualization styles. Creating such a map with GIS software is much too difficult for a user who is not familiar with this kind of software. This paper presents an architecture to create customized OpenStreetMap booklets on the basis of a GPS track that might be downloaded from hiking websites. This track can represent the mentioned hiking trail, or any other intended route, e.g. the navigation instructions from OpenRouteService.
The booklet supports a multi-page mode in which the map sections overlap each other. A step-by-step interface offers everyone the possibility to create map booklets that fit their individual needs.
2. Related Work
There are some similar projects that offer functionality to export and print OpenStreetMap data. The Walking Papers project by Michal Migurski encourages users to
1 Corresponding Author: Holger Fritze,
print OpenStreetMaps, annotate and add information, scan the map, and then digitize the additions and changes. The main advantage of this method is that no mobile device is necessary for the mapping part. Its most famous application was after the earthquake in Haiti [5]. MapOSMatic is a free web service that allows users to create printable city plans. It contains the map and a street index, ready to print as a PNG, PDF or SVG file [1]. The TownGuide is a Python program that, like MapOSMatic, renders a map to a PDF file including a street index, but also adds an index of user-selectable POIs. It can run either as a web service or as a stand-alone program if the source code is installed [3]. There are also application-based projects that assist in creating a printable map from OpenStreetMap data: The Osmbook turns OpenStreetMap data into a printed book. It generates a high-level gridded overview page and multiple other pages with cross referencing and additional information [2]. All these projects have in common that they create paper maps based on OpenStreetMap data for a given area of interest defined by a bounding box. But none of these projects is capable of using a track as an input and rendering maps along this track.
3. Architecture
The architecture is based on a simple client-server interaction. As depicted in Figure 1, the business logic splits the data processing on the server side into four single tasks: validation, tiling, rendering and generating the booklet.
Figure 1. The service architecture. On the left side is the server with the four steps of the business logic. The right side denotes the interaction with the user via the web interface.
The user interaction is realized through a web interface. The user uploads a GPS track, which is validated against an XML schema and modified if necessary. An algorithm has been developed to calculate overlapping map tiles along the track.
It computes map tiles, represented by rectangles centered on points along the track, with a predefined overlap. The tile size is defined by the paper format and the scale of the map. The renderer creates a graphical representation for each tile. This project uses the Kogutowicz renderer [4], which is implemented in Java and offers many adjustable settings. The renderer fetches the raw OpenStreetMap data for each tile using the OSM API. Apart from the map in the background, the renderer also visualizes the extent of the neighboring, overlapping tiles and the track itself. Figure 2 shows an example of
a rendered map section.
Figure 2. An example of a tile along a track. The track is in blue; the extent of the neighboring tiles is in black.
In the final step the map sections are linked with additional information, the resulting map booklet is generated and sent to the user via e-mail. The user gets feedback for each step through the web interface, which can be accessed at giv-osm.uni-muenster.de. The validated track is visualized on a map, as is a preview of the tiles. The user can adjust the page settings if necessary and specify the rendering options. The whole architecture is implemented in Java. Using OpenStreetMap has several advantages, especially regarding copyright and licensing. OpenStreetMap data is licensed under the Creative Commons Attribution-ShareAlike 2.0 License. That means that you are free to copy, distribute, transmit and adapt the maps and data, as long as you credit OpenStreetMap and its contributors.
4. Conclusion
The project demonstrates an architecture to create individual map booklets based on the integration of GPS tracks. Similar projects offering printing functionality for OpenStreetMap data have already been reported, but none of them offers a web-based tool to print maps along a path. This project enables a user to create a map booklet for a specific purpose. The map booklet shows the local environment around a track and covers only the areas the user is interested in. For this purpose, an algorithm has been developed to calculate overlapping map tiles along a track; existing path splitting algorithms are not capable of performing this tiling. Furthermore, this project integrated a renderer that fetches raw OpenStreetMap data, which enables the user to adapt the map to their personal needs.
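The overlapping-tile computation at the heart of this architecture can be sketched as follows (a hypothetical reconstruction in Python; the project itself is implemented in Java and its exact geometry is not given in the paper):

```python
import math

# Hypothetical sketch: place fixed-size, overlapping rectangular tiles
# centred on points sampled at regular intervals along a track.
def tile_track(track, tile_size, overlap):
    """track: list of (x, y) points; tile_size: tile edge length in map units;
    overlap: fraction (0..1) by which neighbouring tiles overlap."""
    step = tile_size * (1.0 - overlap)  # distance between tile centres
    tiles, dist_to_next = [], 0.0
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if seg == 0:
            continue  # skip duplicate points
        pos = dist_to_next
        while pos <= seg:
            t = pos / seg
            cx, cy = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            half = tile_size / 2.0
            tiles.append((cx - half, cy - half, cx + half, cy + half))
            pos += step
        dist_to_next = pos - seg  # carry remainder into the next segment
    return tiles

# A straight 10-unit track, 4-unit tiles, 50% overlap -> centres every 2 units
tiles = tile_track([(0, 0), (10, 0)], tile_size=4, overlap=0.5)
print(len(tiles))  # 6
```

Carrying the leftover distance between segments keeps the tile spacing uniform across vertices of the polyline.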
References
[1] Decotigny, David, Frédéric Lehobey, Pierre Mauduit, David Mentré, Maxime Petazzoni, Thomas Petazzoni, and Gaël Utard: MapOSMatic
[2] Hardaker, Wes: Osmbook
[3] Jones, Graham: Free town guide generator
[4] Márton, Elek: Kogutowicz - an extensible map renderer application in Java
[5] Migurski, Michal: Walking papers
LOSM - A lightweight approach to integrate OpenStreetMap into the Web of Data
Johannes TRAME a,1; Philippe RIEFFEL a; Umut TAS a; Alkyoni BAGLATZI a; Volker VON NATHUSIUS a
a Institute for Geoinformatics, University of Muenster, Germany, Weseler Str. 253, Muenster
Abstract. The idea of interlinking spatial data from OpenStreetMap with the large repositories of the Linked Open Data Cloud is already acknowledged as a valid contribution to the semantic web. Since the nature of OSM data is volatile, this work proposes a dynamic approach to make OpenStreetMap available as linked data. Instead of serializing the whole OpenStreetMap database into RDF and linking upon these triples, we serve resources on the fly and link the original data. Integrating the linkage of the data cloud into the original data prevents redundancies, preserves actuality and creates additional value.
Keywords. OpenStreetMap, Linked Data, RDF, XSLT, Web of Data
1. Introduction
The principle of linked data refers to a set of methods for publishing and interlinking data from different resources in a structured and meaningful way. Following the principles of linked data enables simple integration and fusion of data pieces from different data providers. For example, the spatial representation of the 'Statue of Liberty' in OpenStreetMap (OSM) could easily be enriched by thematic data available through the Wikipedia project. Bizer [1] summarized four basic possibilities for publishing linked data: static files (like FOAF 2 profiles), conversion tools (for non-XML data formats), mappings from relational databases, and wrapping services encapsulating Web APIs (XML- and microformats). The idea of making OSM data available as linked data has already been successfully implemented within the LinkedGeoData (LGD) project [2]. A whole dump of the OSM dataset was first parsed into a relational database and then mapped as triples into a store using Triplify [3].
However, the sheer amount of data and the nature of the transformation process cause multiple side effects, e.g. redundancy and limited actuality (sometimes several weeks behind the official OSM dataset). Furthermore, LGD provides only a limited number of links, exclusively to the DBpedia 3 dataset.
1 Corresponding Author: Johannes Trame;
Another approach, instead of mapping from a relational database, is to use a dynamic wrapping service that translates small amounts of data on the fly. Bizer et al. [4] wrapped several book Web APIs like Amazon or Google Base into an RDF Book Mashup. They demonstrated how this data could be integrated into the Web of Data following the linked data principles. While this transformation was more or less hard-coded, frameworks like Swignition 4 and Krextor [5] provide more generic approaches. Swignition is implemented in Perl, with the ability to also call an XSLT processor, whereas Krextor is purely based on XSLT transformations between XML and RDF. A different attempt to overcome the barriers between XML and RDF is XSPARQL, a combination of XQuery and SPARQL which can be used to perform queries directly on XML and RDF data [6]. With regard to successively interlinking the OSM database with the Linked Open Data Cloud (LOD), on-the-fly transformations around existing Web APIs can be advantageous: there is no requirement for local database maintenance, and thus no need for dedicated hardware. This also avoids data redundancies and ensures up-to-date data, since data is always requested directly from the original database upon the user's request. This work describes the conceptual considerations as well as the technical implementation of a lightweight on-the-fly wrapping service for OSM. After reviewing related ideas and work that has already been done, the following sections give an overview of the different parts of the project and its individual implementations before summing up in a conclusive discussion. 2.
Towards Linked OpenStreetMap
This section introduces the technical implementation as well as the vocabulary developed to foster such a process and to identify and structure corresponding items.
2.1. Vocabulary
Sharing a common vocabulary contributes to interoperability among people and systems in the general concept of the semantic web. As linked data is part of it, it is affected by the idea of sharing common understanding by means of well accepted vocabularies. The basic data model of OSM is a topology-centric one, consisting of connected nodes and ways (lists of nodes) with an unlimited number of tag elements assigned. Object classes that could be used for mapping and linking OSM entities to concepts do not exist a priori. Moreover, there is also no well-structured and specific vocabulary users have to follow. According to the OpenStreetMap wiki 5, users are free to use any terms that seem appropriate to annotate any features of interest. However, the community tries to agree on a common set of tags that should be used in a certain way. Nevertheless, there are no formal restrictions, because this is done bottom-up rather than by following ontology engineering methods and rules. Following the linked data principles [7], we tried to reuse terms and properties from existing vocabularies like dc:creator (Dublin Core Vocabulary), rdfs:label (RDF
Schema), xmlns:datetime (W3C XML Schema) or geo:lat/long (Basic Geo WGS84 lat/long Vocabulary). While examining existing vocabularies, we tried to map the OSM terms into the PROTON Ontology 6, which was also used in [8]. An advantage of this approach would be a well-structured vocabulary from an ontology engineering perspective. However, terms used in OSM are not necessarily used with the same syntax in the PROTON ontology (naming heterogeneity). This led to the question of how to state in RDF that they have the same meaning or are similar, since owl:sameAs should only be applied on the instance level [9]. Additionally, the PROTON ontology does not seem to be used frequently in the context of linked data. Thus, some compromises were made and the final decision was to adopt all terms from the OSM wiki. More specifically, the top-level categorization (keys) of the OSM map features was adopted as is; keys were translated to classes and the values to subclasses. This categorization was enriched by adding structural layers between the fairly general and the detailed terms of the implicitly given set of map feature types. For example, in OSM the term amenity refers to a super class of many heterogeneous small-scale terms like bank, restaurant, or bar. By adding intermediate levels such as EatAndDrink, terms of the same genus are grouped together. On the one hand we respect the bottom-up way people are used to tagging their creations (words, expressions, semantics), while on the other hand more structure is added. This can facilitate basic reasoning tasks like subsumption reasoning and enables more structured browsing and querying.
2.2. Converting OpenStreetMap to RDF
The core of our Osm2Rdf RESTful web service is built upon the Krextor framework [5], an extensible XSLT-based framework for extracting RDF from XML. Krextor serves as a generic interface for mapping different XML input languages to several RDF output notations.
We decided to use Krextor since it supports the definition of new XSLT input modules independently of the intended output. We extended Krextor by defining a new input module for the OSM XML format (API 0.6 7). Krextor comes with several templates like create-resource, add-literal-property and add-uri-property, simplifying the extraction and mapping task. For this prototype we restricted the mapping to OSM node elements. Using a static rule, we extract basic attributes like id, lon, lat, userid and timestamp that exist for all OSM entities. We link them using common terms and properties (section 2.1). Additionally, each <tag> element carries a key and a value attribute. We consider only nodes having more than one <tag> child element (usually all other nodes are referenced by a way, meaning they represent more complex geometries). If the key attribute of a tag element matches name, its value is mapped to a literal using the property rdfs:label. Values where the key contains the string sameas are treated as resource URIs pointing to resources in other data hubs (section 2.3). We use the owl:sameAs property (Web Ontology Language) to link them, since they describe the same entity. For all other tag elements it is rather cumbersome to determine appropriate mappings if one does not want to define several hundred mappings by hand.
However, the OSM community informally agreed on a set of feature types and corresponding tags in order to create, interpret and display OSM entities in a common way across different applications. This allows us to state some more or less generic mapping rules by aggregating over common keys taken from the OSM wiki. In case the key matches one of the key groups, we extract the corresponding value of the tag element and link it as an instance to our ontology. Of course this generic rule conceals several risks, for example assigning more than one concept to the same resource or linking to concepts that cannot be dereferenced in the ontology (e.g. someone used a common key like k=amenity but as value v=tiergarten, which does not correspond to a feature on the OSM wiki and thereby also not to our vocabulary). However, the rules are easily extendable to more constrained ones in the future. The following example shows the RDF serialization of the OSM node representing the new museum ('Neues Museum') in Berlin 8:
<rdf:rdf xmlns:rdf=' xmlns:ns1=' xmlns:ns2=' xmlns:ns3=' xmlns:ns4=' xmlns:ns5=' xmlns:ns6='
  <rdf:description rdf:about='
    <ns1:creator rdf:resource=' />
    <ns2:lat rdf:datatype='
    <ns2:lon rdf:datatype='
    <ns3:point> </ns3:point>
    <ns4:datetime rdf:datatype=' T13:18:42Z </ns4:datetime>
    <ns5:label rdf:datatype=' Museum</ns5:label>
    <ns5:type rdf:resource=' />
    <ns6:sameas rdf:resource=' />
    <ns6:sameas rdf:resource=' />
  </rdf:description>
</rdf:rdf>
As can be seen, the node is already linked to other resources of remote datasets with the owl:sameAs property. The RDF representation is generated on the fly and served through our RESTful server interface. At the moment we serve RDF representations of single nodes. The basic URI pattern for these information resources is: The id corresponds to the original OSM node id. Depending on the Accept header of the request, the response will be returned encoded in RDF or HTML.
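The tag-mapping rules just described (name to rdfs:label, sameas keys to owl:sameAs, known map-feature keys to a vocabulary class) might be sketched as follows; the project's real implementation is an XSLT input module for Krextor, and the losm: prefix and key list here are illustrative assumptions:

```python
# Illustrative sketch of the node-mapping rules; the actual implementation
# is XSLT inside Krextor, and the losm: prefix is a made-up namespace.
KNOWN_KEYS = {"amenity", "tourism", "highway"}  # assumed subset of OSM map-feature keys

def node_to_triples(node_uri, tags):
    """Map an OSM node's tags to (subject, predicate, object) triples."""
    triples = []
    for key, value in tags.items():
        if key == "name":
            triples.append((node_uri, "rdfs:label", value))
        elif "sameas" in key.lower():
            triples.append((node_uri, "owl:sameAs", value))
        elif key in KNOWN_KEYS:
            # link the tagged value as an instance of a vocabulary class
            triples.append((node_uri, "rdf:type", "losm:" + value.capitalize()))
    return triples

tags = {"name": "Neues Museum", "tourism": "museum",
        "sameas:dbpedia": "http://dbpedia.org/resource/Neues_Museum"}
triples = node_to_triples("losm:node42", tags)
print(len(triples))  # 3
```

Tags with unknown keys simply produce no triple, which mirrors the risk noted above: free-form values outside the agreed key groups cannot be mapped generically.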
In addition to single resources, whole geographic bounding boxes up to a limited size can be queried: The bounding box left,bottom,right,top corresponds to the lower-left and upper-right corners of the BBOX, encoded in WGS84 coordinates. Using an HTTP GET request, all OSM nodes of the bounding box are returned encoded in RDF. This is used as a shortcut, since the on-the-fly nature of the service and the amount of the data do not
allow the creation of an ad-hoc queryable graph for the whole dataset. Therefore, the service accepts POST requests with a plain-text SPARQL query embedded in the request body, limited to a certain BBOX. The results are returned encoded in JSON.

Semi-Automatic Linking to the Linked Open Data Cloud

The idea of the semi-automatic annotation tool is to recommend links to other entities in the LOD that describe the same feature as the node in the OSM dataset. To this end, we built a set of SPARQL queries to search through the different RDF data dumps. The information used for these queries is taken from the OSM nodes, such as the name, the coordinates or the type. Furthermore, a reverse geocoder is used to retrieve information such as the country or city of the corresponding node. The queries are run against the LOD cache SPARQL endpoint 9, which contains the datasets from the LOD. This allows us to address different data dumps within one query, instead of running several queries against various sources. Nevertheless, trying to address several data dumps also has a downside, since these dumps use different vocabularies in different ways. For example, if we want to search the labels of resources, we need to search for triples with the predicate rdfs:label, which will deliver results from several datasets, since this property is widely used. Adding a restriction using the coordinates provided by the OSM data is a different matter: inspecting some of the bigger data dumps in the LOD, one will recognize that all of them use different representations for geographical coordinates. For example, DBpedia or GeoNames 10 use the W3C WGS84 Geo Positioning vocabulary 11, while Freebase 12 deployed a definition in its own vocabulary. So for now we decided to optimize the queries for DBpedia entries, for example when making restrictions for coordinates or the category.
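The shape of such a link-candidate query can be sketched as follows. The prefixes and the coordinate tolerance are illustrative assumptions; the project's actual queries are not reproduced in the text, but the combination of an rdfs:label match with a WGS84 coordinate restriction follows the description above.

```python
# Hypothetical builder for a DBpedia link-candidate query: match the OSM
# node's name against rdfs:label and restrict the W3C WGS84 coordinates
# to a small tolerance. Tolerance value and prefixes are assumptions.
def link_candidate_query(name, lat, lon, tol=0.01):
    return f"""
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX geo:  <http://www.w3.org/2003/01/geo/wgs84_pos#>
SELECT ?res WHERE {{
  ?res rdfs:label ?label ;
       geo:lat ?lat ;
       geo:long ?long .
  FILTER (str(?label) = "{name}")
  FILTER (abs(?lat - {lat}) < {tol} && abs(?long - {lon}) < {tol})
}}
"""

q = link_candidate_query("Neues Museum", 52.52, 13.397)
print("geo:lat" in q and "Neues Museum" in q)  # True
```

A query like this would be posted to the LOD cache endpoint, and the returned resources would become sameAs suggestions for the user.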
We chose DBpedia since it has the most consistent and well-described dataset, connected to several other sources that provide us with additional sameAs links.

Interface

To demonstrate the capabilities and benefits of combining spatial data from OSM with additional sources of the LOD, a small web interface was developed that integrates the different components 13. The OSM nodes for the current bounding box are requested from our service, displayed on a map and can then be investigated by the user. If a resource is already linked with a sameAs property, this information is fetched from the remote data hub and displayed in a structured way. At the same time, once the user clicks on an existing feature, the link recommendation service queries for new or further possible outgoing links. To ensure quality and integrity of the data, the suggestions for sameAs links have to be confirmed manually by the user. This should avoid misleading links. If the user

13 The prototype and source code are available online at
has selected one or more suggested annotation links as appropriate, these links are added to the node in the original OSM database using the tag owl:sameas=[link]. An additional command line allows the direct input of SPARQL queries. The queries are posted to our server interface, where the graph representing the current bounding box is built on the fly.

3. Conclusion

To conclude, our vocabulary structures the OpenStreetMap data, establishing new connections between entities that were not available before. The on-the-fly wrapper is able to fetch up-to-date information from the OSM API and convert it according to our vocabulary. Instead of taking a snapshot of the whole OSM dataset and parsing it into a relational database for triplification, our prototype serves just as a facade in front of the OpenStreetMap API. Therefore, RDF representations of the OSM data can be offered without any modifications to the OSM infrastructure and with minimal requirements for software and hardware maintenance. Furthermore, the XSLT nature of Krextor provides a flexible way of defining and changing mapping rules. Our semi-automatic recommendation service suggests links to already existing datasets in the LOD. In this way the data can be enriched with information that is not part of the original OSM dataset. Once approved, the links are stored back into the original OSM database. Of course this approach also has some limitations. Since all the data has to be loaded from the OSM servers upon request, the response time depends heavily on the extent of the area queried. We tried to mitigate this with server-side caching mechanisms. The performance also depends on the amount of data to be transformed, even though the XSLT transformation process itself is quite fast. Furthermore, the wrapper could be used to stream the transformed triples into a triple store, also incorporating frequent update notifications, e.g. streaming OSM changes as RSS feeds.
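The server-side caching mentioned above is not described in detail; a generic sketch of the idea is a small time-based cache keyed on the requested bounding box, placed in front of the on-the-fly wrapper. Class and parameter names below are invented for illustration.

```python
# Hypothetical TTL cache for bounding-box responses, sketching the kind
# of server-side caching a facade in front of the OSM API might use.
import time

class BBoxCache:
    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}

    def get(self, bbox, now=None):
        """Return the cached RDF for bbox, or None if absent or expired."""
        now = time.time() if now is None else now
        entry = self._store.get(bbox)
        if entry and now - entry[0] < self.ttl:
            return entry[1]
        return None

    def put(self, bbox, rdf, now=None):
        now = time.time() if now is None else now
        self._store[bbox] = (now, rdf)

cache = BBoxCache(ttl_seconds=60)
cache.put((13.0, 51.0, 13.5, 51.2), "<rdf:RDF>...</rdf:RDF>", now=0)
print(cache.get((13.0, 51.0, 13.5, 51.2), now=30))   # <rdf:RDF>...</rdf:RDF>
print(cache.get((13.0, 51.0, 13.5, 51.2), now=120))  # None
```

Such a cache trades freshness against the response-time problem noted above: within the TTL, repeated requests for the same area skip the OSM round trip and the XSLT transformation.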
The mapping of tag annotations to our vocabulary is not as straightforward as one might expect. Consider the 'Berlin TV tower' 14, for example, which is tagged as both attraction and restaurant in the OSM data. After the conversion, we end up with a node of two different types. Describing the data in an unambiguous way was one of the main issues in developing the service and will remain so in future work. Further investigations are also needed to transform and represent more complex geometries like lines or polygons in an efficient way. Linking to more and different datasets would further improve the service, as currently most of the links point to DBpedia. GeoNames (one of the biggest providers of free geodata) could be a valuable further option for that. The linkage could also be improved by adding more link types, for example connecting people via their place of birth or a node to its creator's FOAF page.
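The ambiguity described above can be made concrete with a minimal mapping sketch. The key set, the vocabulary entries and the sample tags are simplified assumptions, not the project's actual rule set; the point is only that a generic rule keyed on common OSM keys can yield several concepts for one node, or none at all.

```python
# Minimal illustration of the tag-mapping issue: a generic rule keyed on
# common OSM keys may assign several concepts to one node (TV tower case)
# or no concept at all (value missing from the vocabulary).
FEATURE_KEYS = {"amenity", "tourism", "historic"}  # simplified assumption

def map_tags_to_concepts(tags, vocabulary):
    """Return vocabulary concepts for tags whose key is a known feature key."""
    concepts = []
    for key, value in tags.items():
        if key in FEATURE_KEYS:
            if value in vocabulary:
                concepts.append(vocabulary[value])
            # else: value (e.g. 'tiergarten') is not in the vocabulary,
            # so no concept can be linked -- the risk noted in the text
    return concepts

vocab = {"attraction": "osm:Attraction", "restaurant": "osm:Restaurant"}
# A node tagged like the Berlin TV tower ends up with two types:
tv_tower = {"tourism": "attraction", "amenity": "restaurant", "name": "Fernsehturm"}
print(map_tags_to_concepts(tv_tower, vocab))  # ['osm:Attraction', 'osm:Restaurant']
```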
Acknowledgements

We thank Christoph Lange for his Krextor framework 15, published under the GNU Lesser General Public License, as well as for his support. Also, thanks to Carsten Kessler for food for thought.

References

[1] C. Bizer, R. Cyganiak, and T. Heath, How to publish linked data on the web. Linked Data Tutorial.
[2] S. Auer, J. Lehmann, and S. Hellmann, LinkedGeoData: Adding a spatial dimension to the Web of Data, The Semantic Web - ISWC 2009.
[3] S. Auer, S. Dietzold, J. Lehmann, S. Hellmann, and D. Aumueller, Triplify: light-weight linked data publication from relational databases, in Proceedings of the 18th International Conference on World Wide Web, ACM.
[4] C. Bizer, R. Cyganiak, and T. Gauß, The RDF Book Mashup: from Web APIs to a Web of Data, in The 3rd Workshop on Scripting for the Semantic Web (SFSW 2007), Innsbruck, Austria.
[5] C. Lange, Krextor - an extensible XML-to-RDF extraction framework, Scripting and Development for the Semantic Web (SFSW 2009).
[6] W. Akhtar, J. Kopecký, T. Krennwallner, and A. Polleres, XSPARQL: Traveling between the XML and RDF worlds - and avoiding the XSLT pilgrimage, in Proceedings of the 5th European Semantic Web Conference, Springer-Verlag.
[7] T. Berners-Lee, Design issues: Linked Data.
[8] M. Damova, A. Kiryakov, K. Simov, and S. Petrov, Mapping the Central LOD Ontologies to PROTON Upper-Level Ontology, Ontology Matching, p. 61.
[9] H. Halpin and P. Hayes, When owl:sameAs isn't the same: An analysis of identity links on the Semantic Web, in Proceedings of the WWW2010 Workshop on Linked Data on the Web (LDOW 2010).
45 Andres WESTERMANN et al. 38

OpenFloorMap Implementation

A. WESTERMANN a; G. TSCHORN a; P. WEISS a; O. OGUNDELE a; D. LASNIA a,1
a Institute for Geoinformatics, University of Muenster, Germany

Abstract. OpenFloorMap (OFM) is an approach to enhance the level of detail in open spatial data repositories. OpenStreetMap only provides information down to the level of buildings, represented as polygons. OFM enables users and developers to explore building layouts. On account of this, a repository providing data for buildings and rooms establishes a new kind of platform for open spatial data. Our prototype includes a mobile application to capture room extents and a web application to manage floor plans.

Keywords. Volunteered Geographic Information, Android, OpenStreetMap, Floor plan

Introduction

OpenFloorMap (OFM) is an approach towards an open web repository for floor plans and 3-D room models inside buildings. Our approach includes a web service, a web application and a mobile application to contribute volunteered geographic information to a repository and to make it available on the web. Three-dimensional room models are measured within a few seconds using the mobile application. The rearrangement of measured rooms is implemented in the OFM web application. A RESTful web service integrates these components and makes the data available on the web.

1. Architecture

The backbone of the OFM prototype consists of spatial data repositories such as Google Maps and OpenStreetMap, together with free software packages for spatial applications such as OpenLayers in connection with GeoServer. OFM is set up as a web service with an Android mobile application for data capturing and uploading, and a web client for editing room information.
Furthermore, the web application uses the Web Feature Service provided by GeoServer in combination with OpenLayers to give users the possibility to locate a building of interest and obtain the reference from the database.

1.1. Data Model

The data model consists of two Extensible Markup Language (XML) schemas. The first one (room XML file) represents a single room and is used as an exchange format between the Android application and the OFM web server. It consists of the name, floor and description of the room and a human-readable georeference (Geopath) of the building

1 Corresponding Author.
(country/city/zip code/street/street number). The results of the measurement are stored as nodes, one for each corner, including the values for width, length and height, which refer to a local spatial reference system. The second schema (building XML file) serves for the representation of a whole building. In this structure a building has a unique identifier, a name and a Geopath. The data set is divided into different floors with a given level. Floor elements contain room elements, whose structure is basically the same as in the room XML. When room information is sent to the OFM server, the Geopath (e.g. Germany/Muenster/48151/Weseler_Strasse/253) is checked by a web service. If it already exists in the database, the new room information is added to the corresponding building data file. Otherwise the application creates a new building XML and inserts the committed room data. As the real coordinates of a room sent by the mobile application are unknown, a new room has a local spatial reference system. Once it has been moved to its correct location by the web application, the spatial reference system changes to WGS 84. The current spatial reference system of a room is represented by an additional tag.

1.2. Web Service

The RESTful web service architecture provides web connectivity based on state-of-the-art web-oriented architecture axioms, enabling direct resource accessibility and thus great mashup capabilities. This characteristic underlines our approach of an open architecture for floor maps, minimizing the accessibility threshold.

2. Applications

2.1. Android App

The Android application was developed with the intent to create a simple tool for capturing room layout data within a building. The application guides the user using Google Maps, whose features are already supported by the Android API. For the next version of OFM it is possible to use an equivalent OSM tool instead.
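The room XML exchange format described above can be sketched as follows. The element and attribute names are assumptions derived from the prose (name, floor, description, Geopath, corner nodes with width, length and height in a local reference system); the actual OFM schemas are not reproduced in the text.

```python
# Hypothetical builder for the room XML exchange document described in
# the data model section. Tag names are illustrative assumptions.
import xml.etree.ElementTree as ET

def build_room_xml(name, floor, description, geopath, corners):
    room = ET.Element("room")
    ET.SubElement(room, "name").text = name
    ET.SubElement(room, "floor").text = str(floor)
    ET.SubElement(room, "description").text = description
    ET.SubElement(room, "geopath").text = geopath
    nodes = ET.SubElement(room, "nodes")
    for width, length, height in corners:  # local spatial reference system
        node = ET.SubElement(nodes, "node")
        node.set("width", str(width))
        node.set("length", str(length))
        node.set("height", str(height))
    return ET.tostring(room, encoding="unicode")

xml = build_room_xml("Seminar room 1", 2, "lecture room",
                     "Germany/Muenster/48151/Weseler_Strasse/253",
                     [(0, 0, 2.5), (4, 0, 2.5), (4, 6, 2.5), (0, 6, 2.5)])
print(xml.startswith("<room>"))  # True
```

A document like this would be posted to the OFM server, which checks the Geopath and either appends the room to an existing building file or creates a new one.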
With a click on the building, the geocoded address will be loaded (Figure 1). Providing the floor level and room name is mandatory as well, while the description is optional.

Figure 1. Android application interface (left), web application interface (right)
The next interface helps the user capture the room dimensions. This can either be done manually or automatically using the included measuring capabilities. This tool uses the camera functionality and the device orientation to calculate the distance between all corners of the room. The measurement yields the length, width and height of the room. This information can be corrected manually. The room information is sent in an XML-encoded HTTP POST request to the OFM web service.

2.2. Web App

Arrangement and rearrangement of room plans is performed via the web application. Figure 1 shows the interface of the prototypical implementation. Additionally, OpenLayers together with a Transactional Web Feature Service (WFS-T) helps users select available room plans at specific locations. At these locations it is possible to view floor plans for each floor of the building. At the top of the web page users can switch between available floors. In the middle the room plan of the selected floor is visualized. At the bottom of the page two different lists of uploaded rooms are shown. The first category consists of unordered rooms, i.e. rooms that have been captured by the Android application but are not yet arranged within the whole floor plan. The second category, named ordered rooms, includes all rooms that have already been placed at the correct location by users of the web application. The arrangement and rearrangement of rooms is possible by dragging and dropping an object from the bottom of the page to the room plan panel above. After this step, users are able to move the room to the correct location and save the changes. Subsequently, the room plan is updated and the new arrangement of rooms is visible to all users.

3. Future Work

The prototypical OpenFloorMap implementation provides all basic functionalities for a volunteered building layout repository.
Upcoming developments have to address versioning mechanisms as part of an overall open community infrastructure. Moreover, a 3-D rendering engine is needed to provide comfortable access and integration capabilities for floor map data. These features are the most essential next steps towards the establishment of OpenFloorMap. Further research should include approaches for indoor positioning based on OpenFloorMap data and smartphone-embedded sensors, in order to provide indoor navigation solutions.

References

[1] R. M. Wagner, O. Wessler, Indoor Information Infrastructure (2009), Presentation, retrieved on 2011/02/02.
48 Andre MÜLLER; Matthias MÜLLER 41

On the Integration of Geospatial Data and Functionality into Business Process Models

Andre MÜLLER a; Matthias MÜLLER b
a Faculty of Spatial Information, University of Applied Sciences Dresden, Germany
b Geoinformation Systems, Technische Universität Dresden, Germany

Abstract. Business process modelling is a common instrument to formalise, execute and document complex workflows at an enterprise level. A large part of the spatial data and functionality residing in Spatial Data Infrastructures (SDI) and Geospatial Information Systems (GIS) has the potential to be integrated into common business process models. This paper presents an approach to provide geospatial data and functions in business process models. The integration of geospatial tasks into business processes is realised with the common modelling language BPMN. Furthermore, the graphical modelling capabilities of BPMN are investigated to foster an integrated methodology for process execution and documentation.

Keywords. Business Process Modelling, Notation, Workflow Management, Spatial Data Infrastructures

Introduction

The portrayal of business processes in a graphical form is a common instrument for representing complex workflows at an enterprise level. Originally it was a means to document the business processes of an enterprise in order to preserve knowledge of the processes and to be able to pass this knowledge on. The scopes of application have evolved continuously. Today the goal is the integrated execution of graphically modelled business processes by Workflow Management Systems or Business Process Management Systems (BPMS), in addition to an attractive visualisation.
The major scopes of application of business process models according to [1, 2, 3, 6, 10] are:

1) Process documentation and process visualisation
- Process management, process analysis, process reengineering
- Requirements development for IT development, software selection
- Organisational development, staff and task planning, responsibilities, safety constraints, process cost accounting
- Certification
- Training

2) Process implementation and process engineering
- Software development, Model Driven Software Development
- Process control, Enterprise Application Integration Management
- Process validation
- Process definition for BPMS
- Process optimisation with process monitoring

The goal of (1) is to prioritise the formal, abstract description of the processes from the perspective of business users and enterprise management. The goal of (2) is the translation of business processes into software. For both scopes of application, different modelling standards, architectures and notations as well as different IT systems to support them have been developed. At the intersection of Spatial Data Infrastructures (SDI), Geospatial Information Systems (GIS) and Business Process Modelling (BPM), the following research questions can be identified: Can methodical and technical approaches of business process modelling be applied in SDI and GIS? Is an adoption possible by purely relying on existing standards? What are possible limitations? Are extensions necessary?

1. Modelling of business processes in SDI and GIS

Spatial Data Infrastructures (SDI) are being built up at all levels of state administration. This development is particularly stimulated by the European INSPIRE Directive (Infrastructure for Spatial Information in the European Community) [11], which demands a service-oriented architecture relying on standardised spatial web services. SDI provide both geodata and geoprocessing capabilities across the boundaries of a specific GIS or database. Thus SDI and GIS support processes in E-Government and E-Business [8]. Existing GIS and spatial ETL (Extract, Transform, Load) software already offer graphical modelling capabilities to chain specific tools and data sets and to compose higher-level processing steps (e.g. ArcGIS Model Builder and Workflow Manager, or the Feature Manipulation Engine FME) to automate recurring processes.
The technology is usually proprietary and generic IT standards for process interoperability are not used; thus the specified workflows and models can hardly be used with third-party products. Geospatial processes are also increasingly assembled in SDI. Concepts for web-service-based compositions of individual services are discussed by Kiehle et al. [12] and Friis-Christensen et al. [13]. The service chaining is usually implemented using the Business Process Execution Language (BPEL) or some proprietary, hard-wired execution logic. In any of these cases the modelling process is not standardised and rather tool-driven than task-driven. However, geospatial web services with standardised interfaces have the potential to become integrated parts of other service-based business processes.
2. Requirements for a process modelling language

To achieve comprehensible modelling of executable business processes in SDI and GIS, and the chance of integration with E-Government and E-Business, the following requirements are imposed on the modelling language:

The modelling language is standardised. To reach a common understanding and to allow the exchange of process models among users from different domains or different levels and departments, the modelling language shall be standardised. In order to achieve technical interoperability, it is necessary to standardise the execution logic. The European Interoperability Framework (EIF) for European public services [14] recommends using open standards in E-Government.

The modelling language is neutral to the implementation. The process description and execution are not bound to any specific platform. Process models may be operated on different Workflow Management Systems, BPMS or execution engines. Established standards like web services should be provided as a default implementation.

The notation of the modelling language is standardised. A common understanding of visualised models between different people is only possible if there is a fairly simple and uniform graphical notation. The definition of uniform symbols enables fast visual detection and analysis of a business process's meaning.

The modelling language defines a corresponding encoding. For the exchange of models between different tools or engines a common encoding shall be used. The encoding shall represent both the graphical representation and the execution semantics. XML languages have been established as a general means for message encoding in a web service environment. The standard should therefore use an XML-based encoding, which also provides the opportunity for automatic validation (e.g. by XML Schema).

The process model is executable. A sole description and documentation of the process is not sufficient.
The model must be interpretable and executable by process execution engines.

The modelling language supports automatic and manual tasks. Not all process steps can be run fully automatically. Discretionary decisions play an important role in state administration. The consultation of analogue documents, the application of plausibility checks or the necessity to make a telephone call are manual tasks and require user interaction.

The modelling language is extensible. The modelling language should provide a basic set of symbols that can be further refined and specialised to support new domains. The derived visualisation has to be
adapted for fast perception by users from other domains, e.g. through a generalised visual symbol.

The modelling language should be able to support organisational aspects. The description of business processes also serves as a documentation and planning instrument for an organisation's management. Thus the language shall provide means to represent different responsibilities, e.g. by allowing roles to be specified.

The modelling language should offer different forms of visualisation. Based on the wide application range, process models can be used for different purposes at different levels and in different departments. Recognising those different aspects simplifies the different representations of a business process.

The modelling language should support quality management, process analysis and process model validation. To monitor, analyse and optimise the defined processes, it must support the extraction of quality indicators for reporting and monitoring.

The standard is public and implemented by common software products. To be able to exchange and run the modelled processes across the boundaries of enterprises or administrations, wide support by software manufacturers is necessary.

3. Standards for business process modelling

Standards for business process modelling have evolved over time 1. The most important modelling languages will be introduced briefly. Languages that are notation-only (e.g. VISIO) or execution-only (e.g. BPEL) are omitted. The languages are classified according to Gadatsch 2005 [6], who distinguishes script-based, diagram-based and object-oriented approaches.

<serviceTask implementation='##WebService' operationRef='tns:getMap'>
  <ioSpecification>
    <dataInput itemSubjectRef='tns:getMapRequestItem' />
    <inputSet>
      <dataInputRefs>dataInputOfServiceTask</dataInputRefs>
    </inputSet>
    <outputSet />
  </ioSpecification>
  <dataInputAssociation>
    <sourceRef>
      <LAYERS>ortho_rgb06</LAYERS>
      <SRS>EPSG:4326</SRS>
      <BBOX>13.0,51.00,13.5,51.20</BBOX>
      ...
    </sourceRef>
    ...
  </dataInputAssociation>
</serviceTask>
<interface name='OGC WMS 1.1.0' implementationRef='ogc:wms:1.1.0'>
  <operation name='Get Map' implementationRef='ogc:wms:1.1.0:getMap'>
    ...
  </operation>
</interface>

Listing 1. A WMS query as a BPMN Service Task

A general feature of business process modelling languages is branching to represent conditions in the process flow. In BPMN, process flows are branched with so-called gateways. Figure 5 shows a WPS invocation for a relationship check between spatial features with the 9-Intersection Model [18]. An evaluation of the property "contains" is done with a so-called exclusive gateway. The WPS element is highlighted in yellow to distinguish it from data services.

Figure 5. Relationship check of spatial features

Figure 6 shows a simple process in which the images of two WMS are overlaid. The overlay is produced by a web service (Image Processing Service, IPS) that provides general image processing capabilities. The resulting image is sent as a message to the requestor.
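The service task in Listing 1 carries typical WMS GetMap parameters (LAYERS, SRS, BBOX). For comparison, the same invocation can be written as a plain HTTP GET request; the server URL and the remaining parameters below are illustrative assumptions, not part of the listing.

```python
# Sketch of the WMS GetMap call behind Listing 1 as a key-value GET
# request. The base URL, image size and format are placeholder values.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, srs, bbox, width=800, height=600,
                   fmt="image/png", version="1.1.0"):
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": layers,
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return f"{base_url}?{urlencode(params)}"

url = wms_getmap_url("http://example.org/wms", "ortho_rgb06", "EPSG:4326",
                     (13.0, 51.00, 13.5, 51.20))
print("REQUEST=GetMap" in url and "LAYERS=ortho_rgb06" in url)  # True
```

In the BPMN model, these key-value pairs appear as the dataInputAssociation of the service task, so a process engine can assemble exactly this kind of request at run time.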
Figure 6. Image overlay example

For some GIS-specific tasks a geographic extension of BPMN is not required. This is usually the case for manual or very generic tasks. Figure 7 shows a BPMN user task in which a human has to digitise spatial features. A BPMN script task which determines the surface area of a given input feature is shown in Figure 8.

Figure 7. A BPMN 2.0 User Task
Figure 8. A BPMN 2.0 Script Task

Of course both elements can also be defined as geo-specific elements. An obvious advantage would be the possibility to standardise the underlying procedures and enforce stronger typing of the input data. But a high degree of automation and complex typing requires a solid foundation, e.g. in terms of a common geoprocessing algebra (e.g. based on [19, 20]). As there is no such algebra for geoprocessing in general, practical reasons suggest the application of custom procedures and code to accomplish certain geoprocessing tasks [21]. To provide interoperable task elements that stem from SDI and GIS in common business process models, it is required to define the described extensions as profiles of BPMN. Such a Geo-BPMN might harmonise the invocation of spatial data and geoprocessing functionality. Thus general business processes can draw on the capabilities of spatial services. A first step is the recognition and adoption of standardised service interfaces to enable late binding to specific services at run time. To avoid the specification of hosts or endpoints of web service instances, a lookup procedure in a registry is required. Such a registry would allow inquiries for web services by stating their required features and subsequently invoking a web service that satisfies them.
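The registry lookup idea described above can be sketched as follows. The class, the interface identifiers and the capability names are invented for illustration; the point is only that a process engine binds to a concrete endpoint at run time by stating an interface and the required features.

```python
# Toy sketch of late binding via a service registry: services register
# with an interface identifier and a capability set, and a lookup returns
# matching endpoints at run time. All names are illustrative assumptions.
class ServiceRegistry:
    def __init__(self):
        self._services = []

    def register(self, endpoint, interface, capabilities):
        self._services.append(
            {"endpoint": endpoint, "interface": interface,
             "capabilities": set(capabilities)})

    def lookup(self, interface, required):
        """Return endpoints implementing `interface` with all `required` features."""
        required = set(required)
        return [s["endpoint"] for s in self._services
                if s["interface"] == interface
                and required <= s["capabilities"]]

registry = ServiceRegistry()
registry.register("http://a.example/wms", "OGC:WMS:1.1.0", {"GetMap", "GetCapabilities"})
registry.register("http://b.example/wps", "OGC:WPS:1.0.0", {"Execute"})
print(registry.lookup("OGC:WMS:1.1.0", {"GetMap"}))  # ['http://a.example/wms']
```

With such a lookup in place, a Geo-BPMN model would reference the standardised interface rather than a fixed host, and the engine would pick any satisfying instance.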
4. Conclusion

The paper presented an approach to leverage SDI and GIS for business process modelling and execution. The upcoming BPMN 2.0 standard was found to fulfil most of the requirements imposed on a modelling language. Some examples of extending the core elements of BPMN were given to illustrate the potential benefits of integrating spatial data and functions into general business processes in E-Government and E-Business. As a future task, the extension of the BPMN core elements to form a Geo-BPMN profile was suggested. Such a profile is expected to be mutually beneficial, as it helps to introduce SDI and GIS at an enterprise level in E-Government and E-Business and also justifies the service-oriented provision of spatial data and processing logic. The integrated process modelling approach, which focuses on users' and process management's requirements, has the potential to lower the thresholds in spatial data handling and thus broaden the community of users of spatial data and functionality.

List of literature

[1] Gadatsch, Management von Geschäftsprozessen: Methoden und Werkzeuge für die IT-Praxis, Vieweg, Braunschweig.
[2] Becker, Prozessmanagement: Ein Leitfaden zur prozessorientierten Organisationsgestaltung, Springer, Berlin Heidelberg.
[3] Allweyer, Geschäftsprozessmanagement: Strategie Entwurf Implementierung Controlling, W3L-Verlag, Bochum.
[4] Freund, Praxishandbuch BPMN 2.0, Hanser, München Wien.
[5] Kruczynski, Prozessmodellierung im Wettbewerb: EPK vs. BPMN, IS-Report 6 (2008).
[6] Gadatsch, Grundkurs Geschäftsprozess-Management: Methoden und Werkzeuge für die IT-Praxis, Vieweg, Wiesbaden.
[7] Allweyer, BPMN Business Process Modeling Notation, Books on Demand GmbH, Norderstedt.
[8] EABPM, Business Process Management Common Body of Knowledge BPM CBOK: Leitfaden für das Prozessmanagement, Verlag Dr.
Götz Schmidt, Wettenberg.
[9] Beydeda, Model-Driven Software Development, Springer, Berlin Heidelberg.
[10] Stahl, Modellgetriebene Softwareentwicklung: Techniken, Engineering, Management, dpunkt.verlag, Heidelberg.
[11] European Parliament (2007): Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE), Official Journal of the European Union 50 (2007), L 108/1-14.
[12] Kiehle, Requirements for Next Generation Spatial Data Infrastructures - Standardized Web Based Geoprocessing and Web Service Orchestration, Transactions in GIS 11(6) (2007).
[13] Friis-Christensen, Designing Service Architectures for Distributed Geoprocessing: Challenges and Future Directions, Transactions in GIS 11(6) (2007).
[14] European Commission (2010): IP/10/1734, Communication "Towards interoperability for European public services", COM(2010) 744 final, Annex 2, European Interoperability Framework (EIF) for European public services.
[15] OGC (2007): OpenGIS Web Processing Service, Version
[16] OGC (2006): OpenGIS Web Map Service Implementation Specification, Version 1.3.0, OGC
[17] OMG (2011): Business Process Model and Notation (BPMN), Version 2.0, formal/
[18] Egenhofer, A Mathematical Framework for the Definition of Topological Relationships, Fourth International Symposium on Spatial Data Handling, Zurich (1990).
[19] Egenhofer, Spatial SQL: A Query and Presentation Language, IEEE Transactions on Knowledge and Data Engineering 6(1) (1994).
[20] Tomlin, Geographic Information Systems and Cartographic Modelling, Prentice-Hall, Englewood Cliffs.
[21] Müller, Matthias, Moving Code in Spatial Data Infrastructures: Web Service Based Deployment of Geoprocessing Algorithms, Transactions in GIS 14 (2010).
57 Andreas ABECKER et al. 50

Enabling User-friendly Query Interfaces for Environmental Geodata through Semantic Technologies

Andreas ABECKER a, b, 1; Wassilios KAZAKOS b; Gabor NAGYPAL b; Aleksei VALIKOV b
a FZI Forschungszentrum Informatik an der Universität Karlsruhe, Haid-und-Neu-Str , D Karlsruhe
b disy Informationssysteme GmbH, Erbprinzenstr. 4-12, D Karlsruhe

Abstract. We sketch the functionality and architecture of the HIPPOLYTOS prototype, which uses ontology-based metadata to allow user-friendly text queries over structured data in the area of environmental information systems.

Keywords. Spatial Reporting, Semantic Search, Ontology, Environmental IS.

1. Introduction

There is a huge amount of environmental geodata available, and with the increasing uptake of INSPIRE, it will grow dramatically in the near future. Up to now, the commercial and societal impact of open governmental spatial data has nevertheless been very limited, due to a number of reasons, among others the tremendous lack of transparency of geodata offerings. For a normal citizen, an interested company, or even for employees of public authorities in a different domain, the heterogeneity and distribution of environmental geodata are often overwhelming. Hence, user-friendly and powerful search interfaces are a must-have in this area. On the other hand, Semantic Web technologies promise advanced functionalities for intelligent search, integration and processing of Web-accessible data, documents and services [1]. Based on the annotation of Web-based information resources with complex machine-readable metadata, these resources can be found and interpreted more easily. Semantic metadata instantiate and refer to ontologies: powerful conceptual domain-knowledge models, agreed within a certain user community and represented in expressive, logic-based languages, which are standardized by the World Wide Web Consortium (W3C) [2].
The power of semantic technologies rests on several factors, whose relative importance depends on the specific use case: standardization of representation languages, metadata, and knowledge models; automation of certain reasoning services, thanks to the logic-based semantics; and the empowerment of users and systems through the domain-specific background knowledge represented in ontologies. Furthermore, ontologies for knowledge organization and
1 Corresponding Author.
human search and navigation often comprise a so-called lexical layer, which describes how concepts are referred to by natural-language expressions, often covering the variability of human language, addressing phenomena like synonymy, and enabling multilingual knowledge access. Although comprehensive metadata and extensive background knowledge for knowledge organization (in the form of thesauri) are widespread in environmental information systems, there are as yet few industrial-strength applications of semantic technologies in this area. This paper presents some results and design decisions of the HIPPOLYTOS project, which aims at a practicable combination of (i) semantic technologies and (ii) a commercial tool for geodata management and spatial reporting (disy Cadenza). Stated very generally, the goals of HIPPOLYTOS were: to map an intuitive, text-based search interface at the front-end to complex data structures and relationships in the back-end (environmental information system / data warehouse), in order to better exploit existing expert knowledge (in domain ontologies and in predefined selectors and selector metadata), while also taking into account real-world constraints and requirements. The paper is structured as follows: in Section 2, we sketch the look-and-feel of the HIPPOLYTOS prototype. In Section 3, we elaborate on its realization, covering some basic design principles and a rough architectural sketch of the system. In Section 4, we summarize and discuss the current status of the system and some future work.
2. Look-and-Feel of the HIPPOLYTOS Query Interface
In contrast to other semantic search projects, HIPPOLYTOS does not focus primarily on text, documents, or multimedia information (as we do in complementary research [3,4]), but on structured data in databases, which is continuously collected and provided to the citizen by many public authorities such as statistics offices or environmental agencies 2.
HIPPOLYTOS develops a search layer on top of such data repositories realized, e.g., by disy GmbH's Cadenza software. 3 Fig. 1 illustrates the current prototype of the HIPPOLYTOS system: assume the user types Eisenschrott Ballungsraum Stuttgart ("iron junk city region Stuttgart") into a Google-like, easy-to-use query interface. The system now reasons as follows:
2 For instance, the Landesanstalt für Umwelt, Messungen und Naturschutz Baden-Württemberg (LUBW) offers on its Web portal up-to-date metered values for air quality, ozone, particulate matter, radioactivity, high and low river stages, several water quality parameters, and detailed meteorological data, for all of Baden-Württemberg.
3 Cadenza ( is a system for building search, analysis, and visualization solutions for spatial data. At its core stands a repository system, which manages the back-end data sources. An important Cadenza concept is that of selectors: pre-defined query templates for the back-end systems, designed by domain experts for specific analysis tasks. Selectors can be described with text metadata in the repository. They stand at the heart of many special applications that disy has built for environmental agencies and other public authorities.
Iron junk is not a technical term in environmental information systems, but recyclable fraction FE scrap is, and the latter is represented in the ontology with iron junk as a synonymous wording.
Figure 1. Query for Eisenschrott Ballungsraum Stuttgart.
The ontology also contains the taxonomic knowledge that potential recyclables is a super-concept of FE scrap, that metal is a super-concept of iron/FE, and that waste is a super-concept of scrap. Its taxonomy also records that recyclable fraction aluminium scrap and recyclable fraction glass may be siblings of recyclable fraction FE scrap. Following the relationships in the ontology, we will also find that iron as well as recyclable materials have something to do with cars (passenger cars and trucks, PKW and LKW in German). Furthermore, the lexical part of the ontology knows that city region is a synonym for metropolitan region or urban agglomeration, an informal term that can be mapped to several spatial interpretations, such as the city of Stuttgart, the Stuttgart region constituted by six neighboring administrative districts, or the geographic area within a certain radius around Stuttgart city center. Using this lexical and conceptual background knowledge, the system can identify a number of stored and semantically indexed selectors, i.e., parameterized, pre-defined queries to the structured data sources in the back-end. The match between query concepts and the semantic-annotation concepts of stored selectors can be based on: The subject matter of the selector: e.g., there may be a selector querying for the amount of certain recyclable materials [which is a parameter of this
selector] in sorted waste of a given region [2nd parameter] in a given timeframe [3rd parameter] (here, a proximity match could be made with the potential recyclables concept in the set of super-concepts of the query concepts). The co-domain of the selector parameters: e.g., FE could be a value for the 1st parameter of the example selector above, and Stuttgart for the 2nd parameter. The visualization or presentation type for the results: e.g., data value, data series, data table, map-based visualization, or a specific diagram type. For instance, if the query contained terms like comparison, trend, or distribution, this could hint at the expected kind of presentation. 4 For the given query, the most appropriate selectors and parameter settings can then be identified and sent to the back-end system. The result screen in Fig. 1 shows a ranked list of potential result selectors as well as previews of the visualized results of the two top-ranked ones.
3. Technical Realization
Some basic design decisions for the system seem notable to us: We settle upon industrial-strength full-text retrieval (Lucene) in order to guarantee performance and scalability. As we search not only within the semantic annotations but also in the free-text annotations already available when defining Cadenza selectors, our results can never become worse (in terms of recall) than those of a pure full-text search in the Cadenza repository. In order to guarantee run-time performance, we aim at a reasonable balance between off-line and on-line processing. In particular, we do not query the ontology at query time, but exploit the background knowledge for creating extensive semantic annotations in the full-text index. In that way, we materialize the possibly useful inferences during pre-processing. In order to avoid the well-known knowledge-acquisition bottleneck of semantic technologies, we aim at fully automatic approaches wherever possible.
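As an illustration of this kind of lexical and taxonomic expansion, the following sketch uses a hypothetical synonym table and super-concept chain; the names below are toy stand-ins, not the project's GEMET-derived ontology.

```java
import java.util.*;

// Toy sketch of lexical + taxonomic query expansion. The synonym and
// super-concept tables are hypothetical stand-ins for the real ontology.
public class QueryExpansion {

    static final Map<String, String> SYNONYMS = new HashMap<>();
    static final Map<String, String> SUPER_CONCEPT = new HashMap<>();
    static {
        // lexical layer: colloquial wording -> technical concept
        SYNONYMS.put("iron junk", "recyclable fraction FE scrap");
        SYNONYMS.put("city region", "metropolitan region");
        // taxonomic layer: concept -> super-concept
        SUPER_CONCEPT.put("recyclable fraction FE scrap", "potential recyclables");
        SUPER_CONCEPT.put("potential recyclables", "waste");
    }

    /** Resolve a query term to its technical concept and collect all super-concepts. */
    public static List<String> expand(String term) {
        List<String> concepts = new ArrayList<>();
        String concept = SYNONYMS.getOrDefault(term, term);
        concepts.add(concept);
        while (SUPER_CONCEPT.containsKey(concept)) {
            concept = SUPER_CONCEPT.get(concept);
            concepts.add(concept);
        }
        return concepts;
    }
}
```

The expanded concept set would then be matched against the semantic annotations of the stored selectors, e.g. a proximity match on a selector annotated with potential recyclables.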
In particular, this means ontology creation from existing sources (import of the SKOS version of GEMET) and fully automatic semantic annotation of selectors with a heuristic algorithm from [5]. Further, we settle upon existing expert knowledge wherever possible. Concretely, this means that in the current system version we do not directly search the back-end data sources, but first retrieve pre-defined selectors with their existing descriptions. Further, we build on the existing GEMET thesaurus for ontology creation. Currently, we employ lightweight semantic technologies (a SKOS ontology to provide background knowledge, no highly sophisticated logic-based representations or inferences), as this is sufficient to meet our project goals. The future
4 This kind of query hint for the visualization type is not yet implemented.
work has to show whether more complex techniques are necessary or useful for future usage scenarios. Fig. 2 below gives an impression of the overall system components as well as the off-line and on-line processing steps.
Figure 2. Sketch of Overall Architecture.
4. Summary and Future Work
Summary. We have sketched the functionality and realization of the HIPPOLYTOS prototype for semantic search over geo-referenced environmental data. The goal is a third kind of information access for disy's Spatial Reporting products, which currently offer map-based and form-based access to geodata. This third kind of access shall be a Google-like, simple text-query interface, which automatically finds and instantiates available selectors and thus automatically configures appropriate structured queries to the back-end data sources. To this end, repository elements and their textual descriptions are semantically annotated with ontology concepts. The ontological background knowledge about taxonomic and non-taxonomic relationships between concepts, and about lexical variations of concept references, allows us to enrich a conventional full-text index for selectors such that even vague, too abstract, too specific, or wrongly expressed queries might be resolved. In order to provide a solution suitable for real-world usage, we aim at largely automated ontology creation and annotation processes.
Status. The current system must be seen as a proof of concept that still has some hardwired aspects and works on a very small example scenario. It already shows that, also for realistic data volumes and ontology sizes, the approach is able to deliver reasonable results with acceptable performance. Nevertheless, the retrieval quality still has to be evaluated in systematic, large-scale, long-term experiments. Obviously, the quality of the retrieval depends on the ontologies and annotations used. Here, the practicability of fully automated ontology
creation and semantic annotation still has to be verified in practice, and, probably, user-friendly editors and work-embedded processes for manual corrections must be implemented. Approaches for fostering data curation in the linked open data (LOD) community may provide helpful ideas.
Future work. There are still many areas for future work: (1) In the SUI and SUI II projects [3,4], more usage and design studies for ontology-based access to environmental information have been performed, including unstructured information and the links between information sources, as well as navigational support for end users through ontological knowledge. A combination with HIPPOLYTOS makes sense. (2) Besides facilitating human search and navigation through background knowledge, another major reason for using semantic technologies is to facilitate the automatic integration of different information sources and vocabularies. In the SUI II project [4], initial steps in this direction are being investigated. In the long term, such ontology-mapping techniques would have to be integrated into HIPPOLYTOS. (3) The current approach mainly employs background knowledge about the domain of geo-referenced environmental information. It does not yet go very deeply into the semantic analysis of the spatial concepts themselves in the query. Though the use of ontologies is a longstanding research topic in GIS (see, e.g., [6,7]), it has not yet found its way very far into OGC or W3C standardization. Pragmatic steps in this direction may be a thrilling long-term goal. (4) Currently, we only retrieve pre-defined selectors and do not take into account the actually existing data values in the back-end data sources when interpreting a user query. This exploits the expert knowledge put into selector definition and annotation, but it is useless if a completely new question arises.
So, we are also experimenting with so-called schema-agnostic search techniques from semantic search, which exploit DB schemata and the distributional semantics of data values in the back-end systems to create SQL queries from scratch, out of text-based user queries.
Acknowledgment. This work has been partially funded by the German Federal Ministry of Economics and Technology (BMWi) in the project HIPPOLYTOS within the Mittelstandsprogramm of the THESEUS research programme. It has been supported by the Ministry of Environment, Nature Conservation and Transport of the Federal State of Baden-Württemberg and by the Landesanstalt für Umwelt, Messungen und Naturschutz Baden-Württemberg (LUBW).
References
[1] J. Domingue, D. Fensel, J.A. Hendler (eds), Handbook of Semantic Technologies, Springer-Verlag, Berlin/Heidelberg.
[2] S. Staab, R. Studer (eds), Handbook on Ontologies (2nd ed), Springer-Verlag, Berlin/Heidelberg.
[3] A. Abecker et al., SUI: Ein Demonstrator zur semantischen Suche im Umweltportal Baden-Württemberg. In: R. Mayer-Föll, A. Keitel, W. Geiger (eds), Kooperative Entwicklung wirtschaftlicher Anwendungen für Umwelt, Verkehr und benachbarte Bereiche in neuen Verwaltungsstrukturen, Phase IV 2008/09, Forschungszentrum Karlsruhe, Wissenschaftliche Berichte, FZKA 7500, 2009.
[4] U. Bügel et al., SUI II: Weiterentwicklung der diensteorientierten Infrastruktur des Umweltinformationssystems Baden-Württemberg für die semantische Suche nach Umweltinformationen. In: R. Mayer-Föll, R. Ebel, W. Geiger (eds), Kooperative Entwicklung wirtschaftlicher Anwendungen für Umwelt, Verkehr und benachbarte Bereiche in neuen Verwaltungsstrukturen, Phase V 2009/10, Karlsruher Institut für Technologie, KIT Science Reports, FZKA 7544, 2010.
[5] G. Nagypál, Possibly Imperfect Ontologies for Effective Information Retrieval, Dissertation, Fakultät für Informatik, Universität Karlsruhe (TH).
[6] F. Fonseca, M. Egenhofer, P. Agouris, G. Camara, Using Ontologies for Integrated Geographic Information Systems, Transactions in GIS 6.
[7] T. Bittner, M. Donnelly, B. Smith, A Spatio-Temporal Ontology for Geographic Information Integration, International Journal of Geographical Information Science 23(6), 2009.
Integration of Qualitative Spatial Reasoning into GIS: An Example with SparQ
Sahib JAN; Malumbo CHIPOFYA
Institute for Geoinformatics, University of Münster {s_jan
1. Introduction
Qualitative reasoning is an approach for dealing with commonsense knowledge without numerical computation. Instead, one tries to represent knowledge using a limited vocabulary, such as qualitative relationships and qualitative categories, for representing real values (J. Renz, B. Nebel, 2007). The qualitative approach is used to deal with incomplete knowledge, and it is considered close to how humans represent and reason about commonsense knowledge. This point, among others, motivates the integration of QSR into Geographic Information Systems (GIS). During the last two decades a multitude of formal calculi for spatial relations has been proposed, focusing on different aspects of space like topology, orientation, and distance (Freksa and Röhrig, 1993). However, the application of these calculi in GIS remains sparse. We approach this problem by building an appropriate Application Programming Interface (API) that encapsulates the functionalities of the qualitative spatial reasoner SparQ1 to make them available to GIS applications. Our API, which is written in Java, provides a set of Extensible Markup Language (XML) data structures for specifying queries to SparQ and returning the results. The API itself resides on the client side and accepts XML-structured queries, which it then passes on to SparQ in the latter's own syntax. Results from SparQ are converted back into XML and returned to the user application. In this paper we first describe the API we developed for SparQ and how it has been tested with the open-source, Java-based GIS software OpenJUMP2.
Then we note some shortcomings of our work, especially with respect to the applicability of the API in a broader context, for example considering restrictions on the types of entities to which qualitative reasoning can be applied. Finally, we discuss some future directions for our work and the challenges we envisage.
1 SparQ was developed and is maintained at the University of Bremen under the project (R3-[Q-Shape], etc.). Go to for more details.
2 OpenJUMP is an open-source GIS application developed and maintained by the Canadian companies Vivid Solutions and Refractions Research. Go to for more details.
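To make the notion of qualitative abstraction concrete before turning to the API: a cardinal-direction calculus, for example, replaces exact coordinates by a small vocabulary of direction symbols. The sketch below is our own minimal illustration of this idea, not SparQ code.

```java
// Minimal illustration of qualitative abstraction: the cardinal-direction
// relation of point B as seen from point A, one of N, NE, E, SE, S, SW, W, NW, EQ.
public class CardinalDirection {

    public static String relation(double ax, double ay, double bx, double by) {
        double dx = bx - ax, dy = by - ay;
        if (dx == 0 && dy == 0) return "EQ"; // coincident points
        String ns = dy > 0 ? "N" : (dy < 0 ? "S" : "");
        String ew = dx > 0 ? "E" : (dx < 0 ? "W" : "");
        return ns + ew;
    }
}
```

All further reasoning then operates on such symbols (e.g. composing N with NE) rather than on the coordinates themselves, which is what makes the approach robust against incomplete numerical knowledge.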
2. Connecting GIS to SparQ
2.1. Qualitative Spatial Reasoning Using SparQ
SparQ (Spatial reasoning done Qualitatively) is a toolbox that makes available a set of binary and ternary calculi and reasoning techniques developed in the QSR community (Wallgrün, 2009). As a toolbox, SparQ was designed to be used directly by other applications over a TCP/IP connection or as a standalone console application. It is a modular software program with four main modules. The Compute-relation module allows computation of the operations defined in a specific calculus. It takes as parameters the name of an operation together with a set of variables representing entities from the appropriate domain and constraints between the given set of entities, each labeled by its corresponding relation. The Qualify module implements a single operation (qualify), which takes a quantitative scene description (coordinates of entities in the scene) and returns a qualitative description in terms of the possible constraints between the given entities for a particular calculus. The Constraint-reasoning module reads a description of a constraint network and performs operations to check network consistency. Finally, the Algebraic-reasoning module is used for reasoning about real-valued domains using techniques of algebraic geometry.
2.2. QSR API Design
Our API comprises a set of Java classes and XML files. It contains a set of rules and specifications that a software program can follow to access the services provided by the API. As previously stated, the purpose of designing an API in this study is to integrate qualitative spatial reasoning engines with GIS applications, particularly to integrate the reasoning engine SparQ with the GIS application OpenJUMP. The API serves as an interface between these two applications. Initially, the API establishes a TCP/IP connection with the reasoning engine. Figure 1 shows the global architecture of the GIS-API-SparQ configuration.
The API allows a user to send a query in XML format from the client (GIS) application and to retrieve results in XML format as well. It parses the given XML file, transforms the query into the syntax and encoding format of the reasoner, forwards the query to the reasoner, and waits for the results. Upon receiving the results, the API transforms them back into the defined XML structure and returns them to the GIS application.
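The forwarding step can be realized as a plain line-oriented TCP exchange. The sketch below assumes a newline-delimited request and a single-line reply; this is a simplification of SparQ's actual interactive protocol, whose replies may span several lines.

```java
import java.io.*;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Simplified sketch of the forwarding step: one command line out, one reply
// line back. Real SparQ replies need the module-specific handling described
// in this section.
public class SparqClient {

    public static String sendCommand(String host, int port, String command) throws IOException {
        try (Socket socket = new Socket(host, port);
             BufferedWriter out = new BufferedWriter(
                     new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8));
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8))) {
            out.write(command);
            out.newLine();
            out.flush();
            return in.readLine(); // first reply line
        }
    }
}
```

Keeping this network code behind one method means the GIS side only ever deals with XML strings, never with sockets or the reasoner's wire format.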
Figure 1: API workflow overview
The XML files are defined based on an analysis of SparQ's syntax. Qualitative calculi were analyzed using all modules and their specific operations in SparQ to identify syntax commonalities. Each module in SparQ takes a command with a sequence of module-specific parameters. The general syntax of a SparQ query, as it would be given at the command prompt ($>) of a terminal, is as follows:
$> ./sparq <module-name> <calculus-name> <module-specific parameters>
The module-specific parameters must be consistent with the module and calculus specified at the beginning of the command. They include the set of operations, relations, and constraint networks to be used for the reasoning task. We categorized the possible input queries and designed an XML structure for each module and each module-specific operation. Every XML query structure is a set of tags that include the module name, calculus name, operation, relations, and other module-specific parameters as shown, for example, in Figure 2 below.
<?xml version='1.0' encoding='utf-8'?>
<module name='qualify'>
<calculus type='binary' name='dra-24'>
<controlmode>all</controlmode>
<entity type=' '>A</entity>
<entity type=' '>B</entity>
</calculus>
</module>
Figure 2: XML structure for input queries
The XML structure given above is specific to the Qualify module. It contains the module and calculus names and the calculus type (binary or ternary) as attribute values. The controlmode tag takes one of two values (all/first2all), used to return all possible constraints between the given entities or to
return possible constraints between the first two entities. Each XML query structure varies with respect to the module used.
2.3. From XML to SparQ Syntax
Queries in XML format are converted into SparQ syntax by reorganizing the data in the query and augmenting it with the required non-functional syntactic elements like parentheses, blank spaces, etc. We use the DOM parser to generate the document tree for a given XML file, which is then mapped into a SparQ-syntax formatted query. The resulting query is forwarded to SparQ as a simple text string. Figure 3 shows an XML query with the Constraint-reasoning module's algebraic-closure operation.
<?xml version='1.0' encoding='utf-8'?>
<module name='constraint-reasoning'>
<calculus type='binary' name='cardir'>
<operation type='constraint-reasoning'>algebraic-closure</operation>
<entity>A</entity>
<relation>N</relation>
<entity>B</entity>
<entity>B</entity>
<relation>N</relation>
<entity>C</entity>
</calculus>
</module>
Figure 3: XML query for constraint-reasoning on a given constraint network
After processing the query above, the resulting string looks as follows:
constraint-reasoning cardir algebraic-closure ((A N B) (B N C))
2.4. SparQ Result Analysis
SparQ results are analyzed to identify the possible output patterns for given input queries. We used all possible module-specific queries to find commonalities between the results given by the reasoner and the types of errors generated. The purpose of the result analysis is to design a suitable mapping of the results into common XML data structures. Based on the analysis, we generalized SparQ outputs into five categories: simple text, simple text with constraint network, constraint network, syntax errors, and simple relations. The standard XML structure for a constraint-reasoning result is shown in the example presented in Figure 4.
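The DOM-based translation of Section 2.3 might look roughly like the following; the grouping of entity and relation tags into (entity relation entity) triples is our simplified reading of the Figure 3 structure, not the project's actual code.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;
import org.xml.sax.InputSource;
import java.io.StringReader;

// Sketch: walk the DOM tree of an XML query (Figure 3 style) and emit the
// corresponding SparQ command string. Entities and relations are grouped
// into (entity relation entity) triples in document order.
public class XmlToSparq {

    public static String toSparq(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new InputSource(new StringReader(xml)));
            Element module = doc.getDocumentElement();
            Element calculus = (Element) module.getElementsByTagName("calculus").item(0);
            String operation = calculus.getElementsByTagName("operation").item(0).getTextContent();
            StringBuilder network = new StringBuilder("(");
            StringBuilder triple = new StringBuilder();
            int parts = 0;
            NodeList children = calculus.getChildNodes();
            for (int i = 0; i < children.getLength(); i++) {
                Node n = children.item(i);
                if (n.getNodeType() != Node.ELEMENT_NODE) continue;
                String tag = n.getNodeName();
                if (!tag.equals("entity") && !tag.equals("relation")) continue;
                triple.append(parts == 0 ? "" : " ").append(n.getTextContent());
                if (++parts == 3) { // one (entity relation entity) constraint completed
                    network.append('(').append(triple).append(')');
                    triple.setLength(0);
                    parts = 0;
                }
            }
            network.append(')');
            return module.getAttribute("name") + " " + calculus.getAttribute("name")
                    + " " + operation + " " + network;
        } catch (Exception e) {
            throw new RuntimeException("malformed query", e);
        }
    }
}
```
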
<result type='constraint-reasoning'>
<operation name='algebraic-closure' type='constraint-reasoning'>
<comments>Modified Network</comments>
<entity>A</entity>
<relation>RLLR</relation>
<entity>B</entity>
<entity>A</entity>
<relation>ELLS</relation>
<entity>C</entity>
</operation>
</result>
Figure 4: SparQ result in XML format
The results extracted from SparQ are generalized into five categories, and the API converts each received result into the defined output XML structure. The result shown above is a composition of text and a constraint network. The result tag represents the module used for reasoning. The operation tag carries an operation type and name to identify the module and the module-specific operation used. The comments tag passes a received comment on to the end user, and the received constraint network is split into entity and relation tags that represent the possible relationships between the given entities.
2.5. Converting Results from SparQ Syntax into XML
The developed API extracts results from SparQ and stores them as a string array. During conversion these results are split into sub-results based on the type of result received from the reasoner. The API contains a set of module-specific methods and conditions to extract and process the received results. For example, queries using algebraic-reasoning and scenario-consistency generate results as compositions of simple text and constraint networks. Such results are extracted, stored in a string array, and then split into two string arrays based on predefined numeric values between 0 and 9 and the "." character. The array that contains simple text is forwarded as a comment in the comments tags. The second array, which contains the constraint network, is further processed and split based on SparQ syntax (punctuation) rules.
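This splitting can be sketched as follows; the input string is a made-up example in the spirit of Figure 4, and real SparQ output needs the module-specific handling just described.

```java
import java.util.*;

// Sketch of result post-processing: separate the free-text comment from the
// constraint network, then split the network into (entity relation entity) triples.
public class ResultParser {

    /** Everything before the first parenthesis is treated as comment text. */
    public static String comment(String result) {
        int p = result.indexOf('(');
        return (p < 0 ? result : result.substring(0, p)).trim();
    }

    /** Tokens between the parentheses, grouped into triples. */
    public static List<String[]> triples(String result) {
        List<String[]> out = new ArrayList<>();
        int p = result.indexOf('(');
        if (p < 0) return out;
        String[] tokens = result.substring(p).replaceAll("[()]", " ").trim().split("\\s+");
        for (int i = 0; i + 2 < tokens.length; i += 3)
            out.add(new String[]{tokens[i], tokens[i + 1], tokens[i + 2]});
        return out;
    }
}
```

Each extracted triple then maps directly onto one entity/relation/entity tag group of the output XML structure.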
During this process the data elements of the substrings are mapped back into XML as attribute values or text data in the defined standard tags, resulting in a structure similar to the one shown in Figure 4.
2.6. Connecting OpenJUMP with the API: A Specific Example
OpenJUMP supports reading and writing shapefiles and the simple GML file format, as other GIS applications do. It supports different data formats including GML, SHP, DXF, JML, MIF, TIFF, and PostGIS (Dalluege, 2006). OpenJUMP provides functionality to extend the application by writing one's own plug-ins, cursor tools, renderers, and other such facilities with the help of the extension class. To test our API we implemented a Java-based OpenJUMP plug-in and used it to pass qualitative reasoning tasks to SparQ.
Presently, the extension consists of an input screen for selecting an XML file (using the Input text field in Figure 5) containing the query, displaying the XML data for the input query as well as its results (the Output text area in Figure 5), connecting to and disconnecting from SparQ via the API, and sending the specified query to the reasoner.
Figure 5: The OpenJUMP plug-in provides an interface for reasoning
The OpenJUMP extensions API provides a broad array of functions that allow developers not only to write code that can access data loaded by the GIS but also to modify and enhance the behavior of standard functionality such as rendering, processing, and editing of the data. The simplicity of the extension model and the above-stated advantages are what led us to select OpenJUMP as a testing platform for our study.
3. Concluding Remarks
3.1. Shortcomings and Future Challenges
The developed framework (API) is limited in several respects, such as the selection of qualitative reasoning engines, automating spatial queries, and supporting visual representation of results on the client side. We developed the API with a particular reasoner, SparQ, in mind. It is possible to extend the API to interact with other reasoners, but this would involve substantial rewriting of the XML-to-reasoner-syntax translation code. The approach we intend to take here is to improve the current design by making our XML schemas more generic and integrating support for extensible style sheets (XSLT) on which query and result transformations can be based.
Task chaining, in which a composite reasoning task is computed in a single query request, is an essential feature for GIS applications. One of our targets is to support such chained tasks by allowing the output of one or more tasks to be used as input for other tasks, possibly with additional information supplied by the GIS between subtasks. To achieve this level of functionality we believe it is necessary to understand the sorts of GIS tasks that may require QSR methods. As such, one aspect that needs to be clarified is the utility of QSR in GIS. This leads us to analyze the scenarios in which qualitative, as opposed to quantitative, processing would be most beneficial or at least desirable for GISs, and to pinpoint those calculi that are most useful in those situations. A final shortcoming worth noting here relates to the restrictions that formal definitions of qualitative spatial calculi place on the admissible primitive entities that can be reasoned over. For example, SparQ does not directly support reasoning over entities of type polygon (in Euclidean 2-space), which would be useful for applying RCC reasoning to vector data with geometries of that type. Integrating multiple reasoning engines with different types of reasoning capabilities may constitute part of the solution to this problem.
3.2. Summary
The developed platform-independent API allows GIS users to integrate the reasoning engine with spatial applications like ArcGIS and OpenJUMP. The API provides a set of functions for establishing a connection with the reasoner through a GIS application and for sending queries to it and receiving results from it. The main advantage of the API is that it supports a machine- and human-understandable format (XML) and can easily be integrated with any Java-based application, accessed over a network, or converted into a web-based API.
Further reasoning tasks can be performed on the results of any previous reasoning task, enabling complex tasks to be performed without requiring the reasoning mechanisms to be implemented in the GIS. The qualify facility provided by SparQ will allow developers of GIS extensions that use QSR to focus on the core functionality they intend to provide instead of the details of deriving qualitative descriptions from quantitative data or of qualitative reasoning algorithms.
Acknowledgements
This work has been supported in part by an Erasmus Mundus Program scholarship granted by the North-Rhine-Westphalian Ministry for Innovation, Science, Research and Technology (MIWFT) under object-number and by the Deutsche Forschungsgemeinschaft (German Research Foundation) under grant GRK.
References
[1] L. Frommberger, F. Dylla, D. Wolter, J. O. Wallgrün et al. (2009). SparQ User Manual V0.7. Accessed at V0.7.4.pdf on 1st Feb.
[2] C. Freksa and R. Röhrig (1993). Dimensions of qualitative spatial reasoning. In N. P. Carrete and M. G. Singh, editors, Qualitative Reasoning and Decision Technologies, Proc. QUARDET'93, Barcelona.
[3] C. Freksa (1991). Qualitative Spatial Reasoning, Institute for Informatics, Technical University of Munich.
[4] U. Dalluege (2006). OpenJUMP Tutorial, Department of Geomatics, HafenCity University, Hamburg.
[5] J. Renz and B. Nebel (2007). Qualitative spatial reasoning using constraint calculi. In: Handbook of Spatial Logics, M. Aiello, I. E. Pratt-Hartmann and J. F. A. K. van Benthem (eds), Springer.
Matching-Based Map Generalization by Transferring Geometric Representations
Hendrik WARNEKE a; Michael SCHÄFERS a; Udo W. LIPECK a; Joachim BOBRICH b
a Fachgebiet Datenbanken und Informationssysteme, Institut für Praktische Informatik, Leibniz Universität Hannover
b Bundesamt für Kartographie und Geodäsie, Frankfurt a. M.
Abstract. We present an approach that integrates a generalized dataset of road networks into a digital landscape model to create a smaller-scale map representation. In an automatic matching process, corresponding objects from the different datasets are identified and linked. These links are used to transfer the geometric representation from the generalized dataset to the ungeneralized landscape model. All objects not linked, e.g. vegetation areas, are adjusted to the transferred geometries. This is done using a rubber sheeting transformation that propagates the displacements created by the geometry transfer to other objects in the vicinity. Our implementation uses an object-relational database and spatial partitioning so that it becomes capable of handling large datasets. We show exemplary results from processing authoritative data for Germany.
Keywords. map generalization, object matching, spatial data integration, map adjustment, rubber sheeting
1. Introduction
Matching spatial datasets can serve many purposes. Often, thematic information about objects of one dataset shall be transferred to matching objects of another dataset. We have investigated a practical scenario where geometric information is transferred between matching objects, and the geometries of non-matching objects are adjusted. On the one hand, a thematically rich large-scale dataset (the German authoritative base digital landscape model) is given, but it needs cartographic generalization for presentation purposes.
On the other hand, another smaller-scale dataset is available that carries the line skeleton structure in an already generalized representation. Actually, the second dataset (a commercial product) shows road networks that have already been generalized for smaller-scale road map production purposes. Thus, it contains only a few other objects and much less thematic information. In order to achieve an efficient generalization of the first dataset, both datasets are first processed by an automatic matching process to identify and link corresponding objects between the datasets. Then these links are used to transfer the geometric representations of road objects from the generalized dataset to the ungeneralized landscape model. Finally, all objects not linked, e.g. vegetation areas, are adjusted to the transferred geometries. This is done using a rubber sheeting transformation that
propagates the displacements created by the geometry transfer to other objects in the vicinity. In order to handle large datasets stored in a spatial database, we utilize a spatial partitioning of the datasets such that the results of processing partitions can be merged without loss of precision. After surveying related work, sections 3-5 will describe input data, object matching, and geometry adjustment including geometry transfer and partitioning. Section 6 will report on the implementation and typical results.

2. Related Work

Spatial data integration has been investigated by many researchers, producing a large number of solutions to different problems regarding the combined usage of heterogeneous datasets. For brevity, we only mention some of the approaches containing object matching techniques to link instances from different datasets. [1,2,3] use matching algorithms based on the buffer growing principle to generate candidate matchings that are filtered afterwards by some kind of selection algorithm. In [4,5], methods that match spatially embedded graphs are introduced and applied to geographic data. [6,7] describe techniques for transferring postal data (i.e. points along lines with address information) and routing information between different datasets that are also based on matching and rubber sheeting. Rubber sheeting originates from the area of image processing. In [8] this technique was used to integrate spatial vector data. [9,10] describe rubber sheeting techniques for vector data that are not based on displacement vectors but directly on corresponding linear features from different datasets. They also employ sophisticated filtering techniques to restrict the input for the transformation to locally related objects.

3. Input Data Description

For our data integration process, spatial data from two sources are used: BDLM and KVD.
While the first dataset provides a semantically and geometrically rich landscape model, the latter mainly supplies road networks manually generalized for map production. The German federal mapping agency (Bundesamt für Kartographie und Geodäsie) collects topographic digital landscape models from all federal states to make up an information system (ATKIS¹) for the whole country. This includes the ATKIS Base Digital Landscape Model (BDLM), which is Germany's large-scale topographic landscape model. The ATKIS Object Catalogue lists a variety of available object types together with their specific attributes and object modelling rules. KVD (Kartographische Vektor-Daten) is an object-based vector-formatted dataset, too. As it is primarily used for creating printed road maps, it has been edited in a manual generalization process. Contained objects were shaped by visual aspects and do not follow a strict modelling scheme. There are fewer and coarser object types than in BDLM.

¹ Authoritative Topographic-Cartographic Information System (Amtliches Topographisch-Kartographisches Informationssystem)
In a preprocessing step, a topological data model (TDM) is computed for each dataset. The resulting spatially embedded planar graphs represent geographic objects as composed of nodes, edges, and faces.

4. Object Matching

As a central part of the data integration process, an object matching algorithm is used to identify BDLM and KVD objects representing the same real-world entity. Object similarity (or equality) is determined by spatial criteria like the object geometry or its topological position, and also by thematic attributes like object types. Most common matching algorithms identify 1:1-matchings only, i.e. one object from dataset A is matched with exactly one object in dataset B. (Non-matched objects can be understood as 0:1- and 1:0-matchings.) Due to different object modelling rules in the input datasets, 1:1-matchings are not always possible to find. This raises a need for more complex n:m-matchings, meaning that an aggregation of n (semantically and geometrically connected) objects from dataset A is matched with an aggregation of m objects in dataset B. Figure 1 illustrates a 3:2-matching between line objects.

Figure 1. 3:2-matching of single line objects from dataset A (blue) and B (red). A 1:1-matching of the resulting aggregated objects is shown with dashed lines.

As an optimized solution for matching object-based vector data with all constraints mentioned above, we have developed a graph matching algorithm [4]. The algorithm operates on topological relations between objects, and uses a product graph construction as well as filtering rules to find a well-fitting inexact graph matching. This matching is represented by a set of links between aggregated edges as illustrated above.

5. Geometry Adjustment

The links resulting from the matching process are used to adjust the geometries of BDLM-objects to the geometric representation of KVD-objects.
For all links, the geometries from the KVD-edges are directly transferred to their matching partners. Also based on the links, displacements are calculated that are used in a rubber sheeting transformation to adjust the BDLM-edges without matching partners. Since all transformations are done on edges from the topology of the BDLM, common geometric parts of different objects are adjusted uniquely. The last step consists of the inverse transformation of the adjusted BDLM topology to a feature object model.
5.1. Transferring Geometries

Since the matching does not provide correspondences between single edges but rather between groups of edges (n:m), we determine the appropriate part of the aggregated KVD geometry for each BDLM-edge using linear referencing. For the start- and endpoints of the BDLM-edges, a measure is calculated that corresponds to the relative distance from the start of the aggregated BDLM geometry. We compute the points that have the same measures on the aggregated KVD geometry and use them to subdivide it into smaller parts. These are used to replace the geometries of the corresponding BDLM-edges (see Fig. 2).

Figure 2. Group of BDLM-edges linked to group of KVD-edges (left). Measures calculated for start- and endpoints of BDLM-edges and corresponding points on the aggregated KVD geometry (middle). Transfer of subdivided KVD geometries to BDLM-edges (right).

5.2. Generating Displacement Vectors

We use the links computed in the matching process to calculate vectors that represent the displacements induced by transferring the geometric representations in section 5.1. We want these vectors to closely resemble this displacement, so we first project all shape points of both aggregated geometries onto the other geometry, using linear referencing again. Each pair of corresponding points on the BDLM and KVD geometry represents one displacement vector (see Figure 3). As we also want the vectors uniformly distributed over the length of the linked geometries, we interpolate straight lines between shape points to get additional points for projection and vector generation.

Figure 3. Displacement vectors generated at shape points of aggregated geometries (left). Additional vectors at interpolated points between shape points (right).
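The linear-referencing step of section 5.1 can be illustrated with a minimal sketch (not the authors' code; toy coordinates): endpoint measures are computed on the aggregated BDLM geometry and the points with the same relative measures are interpolated on the aggregated KVD geometry.

```python
import math

def measures(line):
    """Cumulative relative distances (0..1) of the vertices along a polyline."""
    d = [0.0]
    for (x0, y0), (x1, y1) in zip(line, line[1:]):
        d.append(d[-1] + math.hypot(x1 - x0, y1 - y0))
    return [m / d[-1] for m in d]

def point_at(line, t):
    """Point at relative measure t (0 <= t <= 1) along the polyline."""
    ms = measures(line)
    for i in range(len(line) - 1):
        if ms[i] <= t <= ms[i + 1]:
            seg = ms[i + 1] - ms[i]
            f = 0.0 if seg == 0 else (t - ms[i]) / seg
            (x0, y0), (x1, y1) = line[i], line[i + 1]
            return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
    return line[-1]

# For each BDLM-edge endpoint: take its measure on the aggregated BDLM
# geometry and compute the point with the same measure on the aggregated
# KVD geometry; those points subdivide the KVD geometry.
bdlm = [(0, 0), (2, 0), (4, 0)]   # aggregated BDLM geometry (toy data)
kvd = [(0, 1), (1, 2), (4, 1)]    # aggregated KVD geometry (toy data)
split = point_at(kvd, measures(bdlm)[1])   # KVD point for the middle node
```

The interpolated point then serves as a subdivision vertex of the KVD geometry, so each BDLM-edge receives exactly its corresponding KVD part.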
5.3. Adjusting Geometries by Rubber Sheeting

The displacement vectors define the rubber sheeting transformation [8] executed on the non-matched BDLM-edges in order to adjust them to the transferred representation of section 5.1. For each shape point p of a BDLM-edge, its displacement v(p) is calculated as the weighted average of the vectors v_1, ..., v_r from section 5.2. The weight w_k(p) of the k-th vector decreases with increasing distance of its start point s_k from p, as displacements in the vicinity of an edge should have a stronger influence than those far away.

Figure 4. Filtering displacement vectors using a face (left) or a window (right).

Because the weights fall off with the square of the distance, displacement vectors far away from p have very little influence on the transformation. Therefore the majority of vectors can be discarded from the transformation of one shape point without significant changes to the (generalization) result, but with a much smaller computation effort [11]. We have developed two filters that are used to reduce the set of vectors for the transformation of p. For the face filter, we compute faces constituted from linked BDLM-edges before the rubber sheeting process. To calculate the displacement for p, the mesh surrounding it is identified and only vectors with a start point inside or on the border of this mesh are used (see Figure 4). The window filter selects only those vectors with a start point inside a fixed-size window centered around p; it is used when faces are missing or too large.

5.4. Partitioning of Datasets

Since the influence of displacements generated in section 5.2 is restricted to the close vicinity in the adjustment step of section 5.3, it is possible to partition the input data into smaller datasets that are processed independently. Partitions are defined on a rectangular grid where each rectangle represents the interior of a partition.
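The weighted-average displacement and the window filter can be sketched as follows. This is a minimal illustration assuming inverse-square distance weighting, as suggested by the text; the authors' exact weighting function is not given here.

```python
def displace(p, vectors, eps=1e-9):
    """Weighted average of displacement vectors at shape point p.
    vectors: list of (start_point, (dx, dy)) pairs."""
    wx = wy = wsum = 0.0
    for (sx, sy), (dx, dy) in vectors:
        d2 = (p[0] - sx) ** 2 + (p[1] - sy) ** 2
        w = 1.0 / (d2 + eps)      # influence falls off with squared distance
        wx += w * dx
        wy += w * dy
        wsum += w
    return (p[0] + wx / wsum, p[1] + wy / wsum)

def window_filter(p, vectors, half):
    """Keep only vectors whose start point lies in a (2*half)-sized window
    centered on p (the paper's window filter; the face filter uses the
    surrounding mesh instead)."""
    return [v for v in vectors
            if abs(v[0][0] - p[0]) <= half and abs(v[0][1] - p[1]) <= half]

vecs = [((0, 0), (0.0, 1.0)), ((10, 0), (0.0, -1.0))]
moved = displace((1, 0), vecs)          # dominated by the nearby vector
near = window_filter((1, 0), vecs, 5)   # drops the distant vector
```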
Before selecting the objects from the input data that intersect the rectangle, we enlarge it in all directions by a fixed-width border area that overlaps neighbouring partitions. Objects located in this border area need to be included to compute good matchings and adjustments for objects located at grid lines.
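The partition scheme can be sketched in a few lines (toy data, not the authors' code): each grid cell is enlarged by a border for selecting input objects, and after processing only objects inside the original cell are kept, so that merging the per-partition results produces no duplicates.

```python
def enlarge(rect, border):
    """Grow a (xmin, ymin, xmax, ymax) rectangle by `border` in all directions."""
    xmin, ymin, xmax, ymax = rect
    return (xmin - border, ymin - border, xmax + border, ymax + border)

def interior(rect, pt):
    """Half-open containment test, so grid-line points belong to one cell only."""
    xmin, ymin, xmax, ymax = rect
    return xmin <= pt[0] < xmax and ymin <= pt[1] < ymax

cell = (0, 0, 100, 100)            # one grid partition
selection = enlarge(cell, 10)      # area used to select input objects
# after processing, only interior objects enter the merged result
processed = [(50, 50), (105, 50)]  # second point lies in the border area
merged = [p for p in processed if interior(cell, p)]
```

The half-open test assigns each grid-line object to exactly one partition, which is one simple way to keep the merged result duplicate-free.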
After processing, the partitioned data have to be composed into one result dataset. As partitions are non-disjoint, we eliminate duplicates by discarding objects from the border region of a partition; only objects located in the interior are added to the result. Since these objects are far from the outside of the partition, enough spatial context was provided to generate results of equal quality to unpartitioned processing.

6. Implementation and Results

We have implemented our algorithms in Java using an Oracle 11g Database Server for intermediate storage and spatial filtering operations. A graphical user interface for configuring process parameters and viewing results has been developed, too. As none of the known matching algorithms is able to match all objects of our datasets correctly, our implementation additionally provides a visual interface for manually correcting matching results in exceptional situations. Thereby generalization results can be improved where the computed results are not satisfying. We have tested our implementation on datasets from different areas of Germany. Since matching is done with roads only, the most notable effects concern objects of this type. In the BDLM dataset, the distance between different road objects is proportional to the real distance of the modelled roads. When visualized by rendering line objects with bitmap signatures, small roads are often covered by the signature of a bigger road located next to them. In KVD, extra space is inserted so that the signatures of neighbouring roads do not intersect. After this representation is transferred, previously invisible roads are correctly visualized (see Figure 5).

Figure 5. Visualization of a crossing in original BDLM (left) and after adjustment (right).

7. Conclusions

We have presented an approach for generalizing large digital maps that is applicable whenever an already generalized map is available that can be matched.
As manual generalization usually requires high effort and expertise, we believe that the techniques introduced here are a valuable contribution to simplifying work and reducing costs in map processing. As the quality of our generalization results depends highly on the correctness of the computed matchings, an interesting area of further research is the improvement of existing or the development of new matching algorithms. Our methodology could also be improved to avoid or detect induced topological errors that might rarely occur. Finally, we plan to further research scalability issues and hope that the performance of
processing large datasets can be improved by developing sophisticated partitioning strategies.

References

[1] V. Walter and D. Fritsch, Matching spatial data sets: a statistical approach, Int. Journal of Geographical Information Science 13(5) (1999).
[2] D. Mantel and U. Lipeck, Matching cartographic objects in spatial databases, Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences XXXV(B4) (2004).
[3] M. Zhang, W. Shi and L. Meng, A generic matching algorithm for line networks of different resolutions, Proc. of 8th ICA Workshop on Generalisation and Multiple Representation, A Coruña, Spain (2005).
[4] M. Tiedge and U. Lipeck, Graphbasiertes Matching in räumlichen Datenbanken, Proc. of the 19. GI-Workshop on Foundations of Databases (2007).
[5] S. Mustiere and T. Devogele, Matching Networks with Different Levels of Detail, GeoInformatica 12(4) (2008).
[6] M. Zhang and L. Meng, An iterative road-matching approach for the integration of postal data, Computers, Environment and Urban Systems 31(5) (2007).
[7] M. Zhang, L. Liu, H. Gong and L. Meng, An Automatic Approach to Integrate Routing-Relevant Information from Different Resources, Proc. of the 16th ACM GIS, New York, USA (2008).
[8] R. Laurini, Spatial multi-database topological continuity and indexing: a step towards seamless GIS data interoperability, Int. Journal of Geographical Information Science 12(4) (1998).
[9] Y. Doytsher, S. Filin and E. Ezra, Transformation of Datasets in a Linear-based Map Conflation Framework, Surveying and Land Information Systems 61(3) (2001).
[10] J.-H. Haunert, Link based Conflation of Geographic Datasets, Proc. of the 8th ICA Workshop on Generalisation and Multiple Representation, A Coruña, Spain (2005).
[11] D. Wirries, Optimierung von Rubber-Sheeting-Verfahren in räumlichen Datenbanken, Bachelor Thesis, Leibniz Universität Hannover (2007).
Interoperable integration of high precision 3D laser data and large scale geoanalysis in an SDI for Sutra inscriptions in Sichuan (China)

Sandra LANIG a,1; Arne SCHILLING a; Michael AUER a; Bernhard HÖFLE a; Nicolas BILLEN b; Alexander ZIPF a
a GIScience, Department of Geography, University of Heidelberg, Germany
b Department of Computer Science, University of Bonn, Germany

Abstract. Across the entire province of Sichuan (China) there exist Buddhist stone inscriptions, so-called Stone Sutras, dating from the 8th to the 12th century. So far, the documentation and reproduction of the surface texture of these historic inscriptions took place via simple manual tracing on paper (rubbing). Innovative Terrestrial Laser Scanning (TLS) methods make it possible to capture these artifacts both digitally and in 3D and to derive high-resolution 3D models. This paper presents a concept for the integration of the Buddhist inscriptions into a Spatial Data Infrastructure (SDI) using Open Geospatial Consortium (OGC) Web Service (OWS) standards in archaeological, art-historical and linguistic contexts. The aim is to link existing humanistic data to an interdisciplinary Web-based Geographical Information System (GIS) with appropriate time and space reference. Special emphasis is put on SOA-based geo-processing (OGC WPS) and 3D visualization (OGC W3DS). The whole SDI is enriched with additional historic metadata of the inscription sites and finally joined in a Web Atlas for Stone Sutras in China.

Keywords. SDI, Geo-processing, 3D Visualization, Stone Sutras, Web Atlas

Introduction

On the Asian continent, Buddhism was originally based both on the inscribed words (Sutras) and on statues or pictures of the Buddha. Chinese Buddhist monks started writing the holy scriptures into rock faces in the second half of the 6th century [10]. Over the entire province of Sichuan (China), around 80 Sutras dating from the 8th to the 12th century can be found [8].
In the Chinese area in particular, this throws a new light on the history of Chinese Buddhism and on its adjustment to Chinese culture. For more than 1,500 years, the preservation, documentation and reproduction of the surface texture of the Buddhist inscriptions were made via so-called rubbings [6]. A copy of the original text was made by simply pressing thin, wet pieces of paper onto the inscriptions written in stone and carefully dyeing the paper by manually tracing the surface. By means of these rubbings, the teachings of the Buddha could be easily transported and distributed.

1 Corresponding Author.
However, due to the progressive weathering of the rock inscriptions and the abrasion caused by the production of rubbings, this archiving method can only be used conditionally for the protection of this information treasure. Hence, the objective of the research project 3D-Sutras is to investigate and document the stone inscriptions with different scientific approaches and from different viewpoints, including archaeological, art-historical, linguistic and geographical ones. New innovative capturing methods are necessary for a permanent preservation of the Stone Sutras. A possibility for contactless archiving of the Sutra texts is offered by Terrestrial Laser Scanning (TLS) techniques, which record the data digitally as 3D point clouds. In the course of several measuring campaigns, the stone inscriptions were scanned with precise measuring procedures and processed to 3D models by our project partners at the University of Applied Sciences Mainz (i3mainz, Germany). The point density of the laser scan data depends on the size of the Sutra characters. A typical character covers about 1 cm² and a complete Buddhist stone inscription encloses approximately 3 x 4 meters. Therefore, the scan has been carried out with a lateral resolution of 0.25 mm, which results in a data set of about 1,500 to 2,500 points per character. The size of the laser data set of an original 3D model of 4 m² is about 4.32 GB [8]. Sharing all the historic and spatial information as Web services is a fundamental key aspect of our study. In order to make all historic geographical information available at different scales in a sustainable way, all data is integrated into a Spatial Data Infrastructure (SDI).
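The stated scan figures are roughly self-consistent; a quick check (our arithmetic, not from the paper):

```python
# At 0.25 mm lateral resolution, one 1 cm² character is sampled with
# 40 x 40 points (10 mm / 0.25 mm per direction):
points_per_cm2 = (10 / 0.25) ** 2
# This gives 1600 points, in line with the reported 1,500-2,500 points
# per character (the exact count varies with character size).
```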
The visualization component, based on standardized spatial Web services, includes an interactive 2D map to geographically browse the available information about the archaeological inscription sites, the historical infrastructure which connected the sites, time-dependent information about the development of the area of power of the Tang Dynasty from 669 to 820 AD, and time-dependent information about the itineraries of Buddhist monks involved with the Sutras and the province at the specified time [2]. In the research project, a conventional SDI was enriched with additional services. Besides the classical SDI for the management and visualization of the spatio-temporal datasets acquired by historical text investigation and interpretation, the proposed Web Atlas component provides a sophisticated spatial information infrastructure. Important implemented functionality includes, e.g., the textual description of all inscription sites, an inscription catalog with metadata about the texts, a reading tool to explore the inscriptions, and a search module to query the inscription database. Furthermore, a multimedia map combines geographic 2D/3D visualization with 360° panoramas, annotated photographic pictures and GIS functionality for measuring, searching and analyzing. This study goes even one step further and investigates (3D) geo-processing functionality based on the OGC Web Processing Service (WPS) interface specification [9] and a 3D visualization component based on OGC Portrayal Services like the OGC Web 3D Service (W3DS) [7] in order to enable realistic 3D exploration of the archaeological sites by means of a virtual 3D model derived by laser scanning and other modeling techniques. This paper focuses on the geo-processing aspects at different scales, from region-wide spatial analysis down to Sutra inscription characters only a few millimeters in size based on TLS data, as well as on 3D visualization techniques in a spatio-temporal and art-historic context.
The objectives of the research work are: first, to preserve the stone inscriptions with innovative capturing methods based on TLS and to improve their readability; second, to detect spatial relationships associated with historical information of the Buddhist inscriptions between sites; third, to introduce a new concept for a Web-based SDI covering different scientific approaches from different domains and scales for this purpose.
1. Geospatial Infrastructure for Art History

Within the context of art history, geographical data has been considered only negligibly and not consistently, usually as scanned ancient maps or rubbings, not georeferenced in open formats. In order to make these precious assets accessible in a sustainable and flexible way and usable in a geospatial context, we pursued their integration into an SDI. From a technical viewpoint, an SDI is usually based on standardized Web services as specified by the OGC. The Web Atlas utilizes several standardized OGC Web services for distributing and maintaining vector (Web Feature Service, WFS) and raster data (Web Coverage Service, WCS), for data analysis (Web Processing Service, WPS), and for the portrayal of 2D (Web Map Service, WMS) and 3D data (Web 3D Service, W3DS).

Figure 1. Overview of the 3D Sutra SDI architecture.

A challenge is to develop a sustainable and interoperable concept for the integration of existing digital data on historical Buddhist inscriptions, which are held in a previously set up XML database, into an SDI. This XML database originated from a previous research project and is designed for electronic encoding, structuring, and exchange of documents within the domains of social science and art, using metadata standards. The historic data is stored in an open source XML database called eXist (exist-db.org). Industry standards such as XQuery, XPath, and XSLT are used for complex queries and processing of the content. Import of data can be easily accomplished by using a WebDAV-compliant editor or an online form [1]. The content of this database comprises textual scientific documents, transcriptions, catalog metadata of inscriptions, context data about inscriptions of sites and caves, etc. Each XML document is referenced with a geographic coordinate, which makes it possible to utilize geospatial query and analysis methods as well as to directly integrate content in a map application.
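A connector between such geo-referenced XML documents and a PostGIS feature table could look roughly like this sketch. The table and column names are hypothetical, not the project's; `ST_SetSRID`/`ST_MakePoint` are standard PostGIS functions and `ON CONFLICT` is a PostgreSQL upsert, but the project's actual synchronization logic is not documented here.

```python
def to_sql(event):
    """Map a change event from the XML database to a parameterized
    PostGIS statement (table/column names are hypothetical)."""
    if event["type"] == "delete":
        return "DELETE FROM inscription WHERE doc_id = %(doc_id)s;"
    # create/update: upsert a point feature in WGS84 (SRID 4326)
    return (
        "INSERT INTO inscription (doc_id, title, geom) "
        "VALUES (%(doc_id)s, %(title)s, "
        "ST_SetSRID(ST_MakePoint(%(lon)s, %(lat)s), 4326)) "
        "ON CONFLICT (doc_id) DO UPDATE "
        "SET title = EXCLUDED.title, geom = EXCLUDED.geom;"
    )

sql = to_sql({"type": "update", "doc_id": 7,
              "title": "Cave inscription", "lon": 105.6, "lat": 29.7})
```

A real connector would execute these statements through a database driver whenever the XML database raises a change event.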
However, the technological gap must be bridged. Since these documents are not available in a common geo-standard, they cannot be used by OGC services directly. This has been solved by implementing a connector which synchronizes the XML database with the geodatabase, which is implemented in PostgreSQL/PostGIS. As soon as changes in the XML database occur, events are triggered which synchronize the geospatial database by creating, deleting or updating Simple Feature geo-objects with respective attributes (Fig. 1). The Web client can then easily access
the combined geospatial data and information on Buddhist inscriptions through a WFS. Alternatively, map overlays can be created using a WMS/WCS based on the same content. This geo-visualization pipeline has been implemented using the open source software GeoServer and OpenLayers.

2. Web-based Geo-processing Toolbox for Historians

The WPS interface standard as specified by the OGC [9] describes an interface to provide Web-enabled distributed processing and analysis capabilities for geodata. A WPS process defines the implemented algorithmic logic that runs the calculation. The rather generic WPS specification supports spatial and non-spatial processes of arbitrary complexity. Thus, it is possible to develop services covering a wide range of complexity. In order to analyze both high-resolution laser scanning data and large-scale spatial relationships between historic sites and monasteries, WPS processes were implemented. Multiple WPS frameworks were used for this task. By utilizing terrestrial laser scanners for capturing Sutras, a very high volume of data (point clouds) is produced in a short time. Standalone or classical desktop GIS software can hardly cope with this massive amount of 3D raw data. It is therefore preferable to move the CPU-intensive task of processing TLS data to a high performance server accessible through a standardized WPS. The purpose of TLS processing of Buddhist inscriptions is to improve the readability and thus interpretability by applying morphological, geometrical, and image-based pattern analysis. As an example of image-based processing, Phong normals and relief shading maps have been computed from the 3D stone inscriptions. They allow a better recognition of the heavily weathered carved characters. The process has been developed using the WPS interface implementation based on the Java deegree framework [4]. Until now, inscriptions have not been examined for spatial relationships.
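A client invokes such a server-side process via a WPS Execute request; the WPS 1.0.0 specification also allows a key-value-pair encoding, sketched below. The endpoint URL and process identifier are placeholders, not the project's actual service.

```python
from urllib.parse import urlencode

def wps_execute_url(endpoint, identifier, inputs):
    """Build a WPS 1.0.0 Execute request in key-value-pair encoding
    (endpoint and identifier are hypothetical examples)."""
    datainputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {"service": "WPS", "version": "1.0.0", "request": "Execute",
              "identifier": identifier, "datainputs": datainputs}
    return endpoint + "?" + urlencode(params)

url = wps_execute_url("https://example.org/wps", "reliefShading",
                      {"azimuth": "315", "resolution": "0.25"})
```

For complex inputs such as point clouds, a POSTed XML Execute document would be used instead of this KVP form.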
In a first step, we evaluate the available spatial data for potential archaeological analyses in a historic-geographical context. Classical questions for historians are, for example: Which monasteries are visible from a certain point of view? How far did a monk walk by foot within one hour? Which routes might the monks have selected, depending on the historical route network and the geographical conditions (e.g. slope inclination, river barriers, etc.)? Which path would be a cost-effective path, and which paths were actually selected? What distance and how many meters of altitude were covered? Which further monasteries lie within a periphery of 300 km? From this, typical geographical analyses are identified. An interactive analysis toolbox for historians is implemented based on the historical information on the migration of a monk. Apart from simple measuring functions on the 2D map, regional analyses like the creation of a buffer or a surface profile, the investigation of sight relationships (line-of-sight and viewshed analysis), and accessibility and distance cost analyses are implemented, based on the PyWPS implementation. This WPS implementation offers the possibility to use all GRASS GIS analysis functionalities through a Python-implemented WPS interface [3].

3. Interactive Web 3D Service

In contrast to other comparable portals within the domains of art history or archeology, 3D content is not just added as a multimedia component for displaying single artifacts
within this project. Instead, terrain and object information within the extent of the site is embedded as an independent component (OGC Web 3D Service, W3DS) into the SDI. The advantage is, on the one hand, that efficient streaming of complex landscapes can be utilized; on the other hand, the persistent geo-referencing allows a straightforward overlay with other geodata. The W3DS is designed as a 3D Portrayal Service, meaning that spatial subsets of 3D data sets are served for display in Web GIS or online portals (as in this case). Several information layers can be switched on and off separately. The data exchange with the 3D Web client relies on industry standards (X3D/VRML). Similar to the WMS, the W3DS also provides the capability to retrieve further information on selected objects. Upon a position query (mouse click) and transmission of the coordinates, the server generates a list of available attributes which can then be displayed by the client. Furthermore, the geo-referencing enables superimposing arbitrary 3D models with aerial and satellite imagery. For this project a satellite image has been integrated and is served by a WMS. By spatial tiling as described earlier, the satellite image can be mapped as a color texture on the 3D terrain. For each tile served by the W3DS, a WMS GetMap request with matching coordinates is generated and attached as a texture URL. A 3D scene viewer is embedded in the atlas system, visualizing both the inscriptions in the caves and the 3D surface model.

Figure 2. Parts of the Web-Atlas Interface. Left: Spatial access to meta data and site information; right: detailed 3D view of one site.

4. Conclusion & Outlook

The research collaboration between social science, humanities and geoscience creates new views on technical aspects of SDIs as well as on spatial questions related to archeological and historical research fields.
An important aspect of the interdisciplinary research is the integration of existing non-spatial data repositories (e.g. the Stone Sutra XML database) into a standardized SDI using Web services. A sustainable and interoperable integration of systems from different domains has been accomplished by locating and geo-referencing the Buddhist stone inscriptions. Data on cultural heritage can be collected and maintained using already established user interfaces without requiring GIS expert knowledge. At the same time, all this data is automatically synchronized and made accessible to map clients through standardized geo services. This study developed a combined 2D and 3D map framework as an entry point for all available data. Detailed models of stone Buddhas created from TLS data, along with multimedia content such as 360° panoramas, provide realistic impressions of the
84 Sandra LANIG et al. 77 historically significant sites. Furthermore, the implemented geoprocessing toolbox can be used, in order to offer different analysis functionalities over the Web client. Altogether the Atlas platform offers an integration of GIS components into classical text-based analysis techniques of the traditional historic art science. Concluding, this approach has a large potential for future social and historical studies with spatial context. However, there is a need to extend OGC service based SDIs beyond the access and visualization of geo data and to include analytical tasks that support archeological studies. For example, it is possible to develop domain specific OGC WPS Application Profiles taking semantic descriptions and even ontologies into account. Another important research area is processing high volume raw 3D spatial data such as point clouds from laser scans within an SDI [5]. Due to limited processing power and bandwidth, it is preferable to use dedicated powerful servers providing WPS interfaces. Further processing performance could be gained by connecting cloud computing and grid computer clusters to OGC components to achieve (near) real time workflows. It is important to agree on generic concepts that can be applied to a wide range of application domains, and to develop WPS processing profiles, otherwise the flexibility of SDI will get lost. Acknowledgements This research project 3D Sutras is funded by the German Ministry of Education and Research (BMBF) within a program on interdisciplinary research between natural and social sciences. References [1] Arnold, M. (2008), Buddhist Stone Scriptures From Shandong, China. EVA Electronic Visualisation and the Arts. London, UK. [2] Auer, M., Höfle, B., Lanig, S., Schilling, A., Zipf, A. (2011 submitted), 3D-Sutras: A web based atlas of laser scanned Buddhist stone inscriptions in China. AGILE The 14th AGILE International Conference on Geographic Information Science. 
Association of Geographic Information Laboratories in Europe (AGILE). Utrecht, Netherlands. [3] Cepicky, J. & Becchi, L. (2007), Geospatial Processing via Internet on Remote Servers PyWPS. In: OSGeo Journal, No. 1 (2007). [4] Fitzke, J., Greve, K., Müller, M. & Poth, M. (2004), Building SDIs with Free Software - the deegree Project. In: Proceedings of GSDI-7. Bangalore, India. [5] Lanig, S. & Zipf, A. (2009), Towards generalization processes of LiDAR data based on GRID and OGC Web Processing Services (WPS). Geoinformatik, Osnabrück, Germany. [6] Ledderose, L. (1981), Rubbings in Art History. In: Walravens, H. (Ed.), Catalogue of Chinese Rubbings from Field Museum, Field Museum of Natural History, Chicago, USA. Fieldiana Anthropology New Series, vol. 3, pp. XXVIII-XXXVI (1981). [7] Schilling, A. & Kolbe, Th. H. (2010), OpenGIS Web 3D Service, Draft for Candidate. OGC Discussion Paper, OGC r1, Open Geospatial Consortium. [8] Schmidt, N., Schütze, R. & Boochs, F. (2010), 3D-Sutra Interactive analysis tool for a web-atlas of scanned sutra inscriptions in China. ISPRS International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. [9] Schut, P. (2007), OpenGIS Web Processing Service (WPS). OGC Implementation Specification OGC r7, Open Geospatial Consortium. [10] Wenzel, C. (2007): Anikonik im chinesischen Mahāyāna-Buddhismus: Die Wahren Merkmale des Buddha. In: Weltbild Bildwelt. Ergebnisse und Beiträge des Internationalen Symposiums der Hermann und Marianne Straniak-Stiftung, Weingarten. Ed. by Walter Schweidler. Academia Verlag: Sankt Augustin 2007.
85 Christoph WOSNIOK et al. 78

Integrating Marine Modeling Data into a Spatial Data Infrastructure

Christoph WOSNIOK a; Michael BAUER a; Rainer LEHFELDT b
a Research Assistants, Federal Waterways Engineering and Research Institute, Hamburg, Germany {christoph.wosniok
b Senior Scientist, Federal Waterways Engineering and Research Institute, Hamburg, Germany

Abstract. Spatial Data Infrastructures (SDIs) have become widespread within the last few years, in Europe largely driven by the implementation of the INSPIRE directive of the European Union. Search processes for metadata within spatial portals usually result in single data sets spread across several topics, without the possibility of getting an encompassing overview of a geographic complex. The German Marine Data Infrastructure (MDI-DE) addresses this issue and aims at providing an integrated view on semantically close topics. The Elbe estuary serves as a prototypical example, which is described in detail on several levels. Along with the usual data such as maps, aerial pictures, or gauges, a section of the MDI-DE under the label Elbe will collect results from numerical models of the estuary. This raises additional questions on the technical level, i.e. how to integrate the diverse numerical data into a classic SDI and how to visualize aggregated data. In the following, a concept and a first prototypical implementation of ways to integrate modeling data into SDIs are presented.

Keywords. Spatial Data Infrastructures, Marine Data, Model Data, Information Infrastructures, INSPIRE, MSFD

1. Motivation

The European directive Infrastructure for Spatial Information in the European Community (INSPIRE) 1 requires the member states to provide an overview of the present metadata within the countries in a standardized manner. To ensure compatibility on the national and international level, implementing rules for metadata, data specifications, network services and others have to be adopted by data providers.
The Marine Strategy Framework Directive (MSFD) 2, which is to ensure protection, conservation and, if possible, restoration of European sea habitats, demands the provision of data from the maritime domain by a fixed deadline. Other directives like the Habitats Directive 3 already require status reports and data. In Germany, with more than 2,000 km of coastline including the world natural heritage Wadden Sea and several other nature reserves on different legal levels, the
production and maintenance of spatial data for the coastal zone is distributed between different federal state authorities. Additionally, national park administrations conduct monitoring and collect data. Consequently, there are several marine data collectors, providers, and archives, distributed spatially and affiliated with diverse agencies in northern Germany. With the implementation of INSPIRE and the MSFD, data collecting agencies are ultimately required to publish their data according to the directives. Mandatory data encompass, e.g., water networks, protected areas, digital terrain models, and oceanographic parameters. Numerical modeling is another area handling large amounts of marine data concerning the North Sea. Agencies and companies run models to analyze the impact of dredging in the German estuaries Elbe, Weser, Jade, and Ems. An example which regularly attracts public interest is the Elbe deepening [1][2]. As deeper rivers commonly have higher currents, which increase sediment transport and may lead to further bathymetric changes, dredging is pre-assessed in numerical models with regard to risks for the environment and dykes. Federal agencies like the Federal Waterways Engineering and Research Institute (BAW) 4 or the Federal Institute of Hydrology (BfG) 5 regularly prepare status reports of the estuaries to back up political decisions. A central platform for the publication of such reports or related data is currently not available. In the following, we propose a concept for the integration of such data into a Spatial Data Infrastructure (SDI). This concept will be implemented in the course of the Marine Data Infrastructure Germany project (MDI-DE) [3], which started in 2010. This paper continues with a demonstrative scenario of how model data in SDIs assist in decision making, followed by a section on how the marine data is prepared for inclusion in the MDI-DE. Subsequently, the structure of MDI-DE is illustrated.
Finally, we summarize our findings and outline some ideas for future work.

2. Scenario

For our scenario, an engineer from a coastal authority wants to check the state of the coastal and riverbank protection in the Elbe estuary and to identify potential needs for additional groins after a deepening of the river. Currents are strong in the lower reaches of the Elbe, so groins were built to prevent erosion processes and possible danger for inhabitants during storm surges. Changes to the riverbed can have significant effects on the characteristics of the current, erosion and sedimentation, local water levels and many other parameters of the estuarine system. To get an overview, the engineer chooses to consult an SDI, having in mind specific questions such as "Which parts of the banks along the Elbe could be affected by the deepening of the riverbed?". Such a query could bring up data sets with groins along the Elbe river. However, the condition of the bank structures can only be approximated from the spatial metadata sets. Without engaging in deeper research of her own, our engineer could, with a little luck, also find a ready-made solution. The Elbe river, as one of Germany's largest waterways, is an economic factor [4] for the region and is regularly the subject of research
projects 6 and status reports. Results from these studies provide information based on geographic and model data, and can help users in decision making.

3. Aggregating Marine Model Data

A common thematic map consists of topography and the thematic data one wants to show, along with properties for a common reference system and a defined legend [5]. So far, in SDIs one can usually find topographic and thematic data, which enable the user to combine both into one single map. Products in the marine domain are often more complex. For instance, the bathymetry of the Elbe estuary needs regular updates, as conditions are changing continuously. Unlike land-based topography, the wet zone is under the constant impact of tides and currents. Shipping channels in particular, with their high freighter traffic volumes, undergo constant change and are of both economic and environmental interest. As daily surveying of an entire estuary is not feasible, numerical models are applied to estimate the bathymetric changes. Consequently, there is not one single bathymetric map, but several, which reflect the different assumptions made in the numerical modeling scenarios. Model data have the advantage of giving multidimensional overviews at different scales of an area of interest, in our scenario the Elbe river. Several parameters can be calculated within models: currents, sediment transport, and/or changes of bathymetry. Results from numerical models could show the effectiveness of bank fortifications, i.e. does a groin help to stabilize a dike by reducing stress from high tides, are structures redundant, or are there places where new obstacles could direct the currents better than in the present state. Additionally, models are not only constructed to show the present state but to forecast the behavior of a system, in this case the Elbe, which can support our engineer with her decisions on possible further constructions.
Consequently, model data can provide more information than simple spatial data sets, but the connections between geographic data and models must not be underestimated: the more accurate the underlying geographic data, the more exact are the models. The Marine Data Infrastructure Germany (MDI-DE) is an attempt to aggregate relevant data for marine data products. The collaborative research is carried out with joint project management of the Federal Waterways Engineering and Research Institute and the Federal Maritime and Hydrographic Agency (BSH) 7 of Germany. It is based on the previously conducted NOKIS project 8 [6] and on the GeoSeaPortal [7] of BSH. NOKIS aimed at establishing a metadata information system for marine data, with a metadata editor as the core element. Several data collecting organizations participated in this research and development project and added several thousand geographical metadata sets, which led for the first time to a publicly available overview of marine data in Germany. Driven by INSPIRE and the MSFD to publish spatial data within a tight schedule, public data collectors and providers along the German North Sea and Baltic Sea coast have gathered to fulfill the technical and political conditions to provide not only the metadata but also services to access the data properly. The aim is to set up a portal where available marine data is aggregated, searchable and, unless there are legal restrictions, downloadable.
The original NOKIS project focused on establishing an initial basic infrastructure for coastal data [8]. MDI-DE will use this basic metadata catalogue and extend the functionality to that of a complete SDI. NOKIS focused on metadata sets from coastal geographic data, so-called geo metadata sets, which are created with the NOKIS metadata editor or directly imported using a NOKIS XML profile. This NOKIS coastal metadata profile is derived from ISO 19115 [9] and includes the INSPIRE profile. Additional metadata profiles regarding research projects and literature have been derived as well and are being used in the Web sites of KFKI 9 and NOKIS 10. As the ISO standard for spatial data is formulated in an abstract way, it is also possible to describe results from numerical models by specifying an appropriate profile. The editor is being adjusted to represent these model-specific metadata elements, i.e. a description of the model grid, model input parameters and possible pre- and postprocessing. The integrated metadata catalogue therefore constitutes a universal platform for different types of data, and within the framework of MDI-DE we aim at ensuring interoperability between the existing data types.

4. MDI-DE System

The MDI-DE is a distributed system (figure 1). Several NOKIS Editor instances (so-called nodes) are connected via a CS-W interface to a NOKIS core instance, where the combined metadata is stored. External portals like the German Spatial Data Infrastructure GDI-DE and the German environmental portal PortalU 11 access this core instance via CS-W requests. A new portal, MDI-DE, will be established to serve as the main access point for users searching for marine data. Figure 1. The general structure of MDI-DE. It shows how all information is exposed via CS-W. Relevant elements are marked in dark grey.
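The CS-W requests by which external portals such as GDI-DE or PortalU query the core instance can be sketched as follows. This is a much-reduced GetRecords request built with Python's standard library; the full-text search term is illustrative, and the omission of elements such as ElementSetName is a simplification of the CSW 2.0.2 specification.

```python
# Sketch of a CSW 2.0.2 GetRecords request such as an external portal
# would send to the NOKIS core instance; the search term is illustrative.
import xml.etree.ElementTree as ET

CSW = "http://www.opengis.net/cat/csw/2.0.2"
OGC = "http://www.opengis.net/ogc"

def build_getrecords(literal):
    """Build a GetRecords request with a full-text PropertyIsLike filter."""
    ET.register_namespace("csw", CSW)
    ET.register_namespace("ogc", OGC)
    root = ET.Element(f"{{{CSW}}}GetRecords", service="CSW",
                      version="2.0.2", resultType="results")
    query = ET.SubElement(root, f"{{{CSW}}}Query", typeNames="csw:Record")
    constraint = ET.SubElement(query, f"{{{CSW}}}Constraint", version="1.1.0")
    flt = ET.SubElement(constraint, f"{{{OGC}}}Filter")
    like = ET.SubElement(flt, f"{{{OGC}}}PropertyIsLike",
                         wildCard="*", singleChar="?", escapeChar="\\")
    ET.SubElement(like, f"{{{OGC}}}PropertyName").text = "AnyText"
    ET.SubElement(like, f"{{{OGC}}}Literal").text = literal
    return ET.tostring(root, encoding="unicode")

# A portal searching the catalogue for Elbe-related metadata records:
request_xml = build_getrecords("*Elbe*")
```

A POST of this document to the catalogue endpoint would return matching metadata records from the merged NOKIS store.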
It makes use of the metadata concerning data from numerical modeling, describing projects and referencing literature, which are merged for an integrated information search by NOKIS. All metadata is produced before becoming searchable within the portal; we discuss further options for handling model data in the future work section. The new MDI-DE portal will enhance the existing metadata search with visualization and download methods. In the case of model data, additional methods will be developed in order to cope with three-dimensional and time-dependent data sets. In order to make model data available through an SDI, the data first has to find its way into an appropriate service. In a first prototype, this is accomplished by converting the binary export file format of the modeling software into an exchange format, for which we chose ESRI shape files, as they are a de facto standard and several tools for data transformation and handling are available. The shape files are transferred into a database, which can be accessed from different OGC web service implementations [10] to generate diverse services. These steps are wrapped in a shell script for easy batch processing of large model data sets. To guarantee a consistent visualization across platforms according to the conventions of the marine engineering community, the data is classified and visualized via the Styled Layer Descriptor (SLD) technology [11].

5. Conclusions and Future Work

The recently started project German Marine Data Infrastructure aims at pulling all the aforementioned data strings together and at establishing an integrated view on complex questions within the coastal community. While spatial data in SDIs only show a small part of the production chain of geographic products, MDI-DE provides a framework for different types of spatial data.
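The shell-wrapped conversion chain described in Section 4 can be sketched as follows. This is a Python rendering of the batch step, not the project's actual script; the file names, the database name, and the use of the common ogr2ogr tool are assumptions.

```python
# Sketch of the batch ingestion step: each shape file converted from a
# model export is loaded into the database that the OGC services read.
# Table and database names are illustrative.
import os

def build_ingest_commands(shapefiles, dbname):
    """Return one ogr2ogr command line per shape file, targeting PostGIS."""
    commands = []
    for shp in shapefiles:
        table = os.path.splitext(os.path.basename(shp))[0].lower()
        commands.append([
            "ogr2ogr",
            "-f", "PostgreSQL",      # write to a PostGIS database
            f"PG:dbname={dbname}",
            shp,                     # converted model export
            "-nln", table,           # one table per model layer
        ])
    return commands

commands = build_ingest_commands(
    ["elbe_bathymetry.shp", "elbe_currents.shp"], "mdi_de")
# subprocess.run(cmd, check=True) would then execute each command in turn.
```

Generating the command lines separately from executing them keeps the batch step easy to log and to rerun for individual layers.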
Integrating modeling data enhances the capabilities of an SDI, as the results from numerical models enable the user to browse through a complex product based on spatial reference data. To achieve the desired state, some steps will be considered in the future. The first proof of concept simply displayed 2D maps consisting of model data aggregated into polygons according to the structure of the underlying model grid. A wide range of services can be provided using model data and the variety of OGC Web Service specifications. An obvious choice would be the provision of a Web Feature Service (WFS) [12], but more elaborate services could also be implemented. Simulations of currents, tides and other marine parameters are usually conducted in three dimensions, thus three-dimensional data is produced. Displaying such data as two-dimensional maps means losing a lot of the information inherent in the original model data. In order to provide the best information to the user through the SDI, 3D services could be set up, using one of the upcoming OGC services for displaying 3D data like the Web 3D Service [13] or the Web Perspective View Service [14]. These would allow the visualization of 3D structures, objects or point clouds to present depictions closer to the original data than plain maps. The above-mentioned SLD technology can be applied to these, too [15]. Using the WFS for data provision is an established and sound method, but it is also conceivable to utilize a Sensor Observation Service to get the data from the data storage, even though it is not sensor data in the narrow sense [16]. Having set up such a service, the capabilities of the SDI could additionally be improved by adding live sensor data to the SDI and combining them with previously modeled data. On this level, cooperation with the COSYNA project is planned. COSYNA [17] aims at
constructing a long-term observatory for the North Sea including the presentation of near-real-time data on its portal. Metadata is already stored in a NOKIS instance and a connection with proper integration into MDI-DE is being established. A consequent although difficult step for the future would be to migrate the simulation process itself into the SDI using Web Processing Services (WPS) [18]; some approaches for using model data in WPS have already been made [19][20].

Acknowledgements

The work presented here is funded by the Federal Ministry for Education and Research (BMBF) through Project Management Jülich (PtJ) under grant number 03KIS089 (Coastal Engineering). The authors gratefully acknowledge this support as well as the contributions from co-workers and other partners to this research project.

References

[1] Projektbüro Fahrrinnenanpassung (beim Wasser- und Schifffahrtsamt Hamburg), Fahrrinnenanpassung Unter- und Außenelbe. Das Projekt im Überblick. Fraujansen Kommunikation, Hamburg. [2] Bundesanstalt für Gewässerkunde, Umweltrisikoeinschätzung und FFH-Verträglichkeitseinschätzung für Projekte an Bundeswasserstraßen, Weitere Fahrrinnenanpassung von Unter- und Außenelbe an die Containerschifffahrt mit einem Salzwassertiefgang von rd. 14,50 m, BfG 1380, Koblenz. [3] Lehfeldt, R., Melles, J., Marine Dateninfrastruktur Deutschland MDI-DE. In Traub, K.-P., Kohlus, J., Lüllwitz, T. (Eds.) Geoinformationen für die Küstenzone. Beiträge des 3. Hamburger Symposiums zur Küstenzone. Wichmann Verlag (2011, in print). [4] Klimke, Jürgen, Elbvertiefung bringt klare Wettbewerbsvorteile für Hamburg. Kommunalpolitische Blätter 58 (2006), 3. [5] Kraak, M.-J., Ormeling, F., Cartography - Visualization of Geospatial Data. 2nd ed. Prentice Hall. [6] Lehfeldt, R., Reimers, H.-C., Kohlus, J., Sellerhoff, F., A Network of Metadata and Web Services for Integrated Coastal Zone Management, COPEDEC VII, Dubai, Cyber-proceedings (2008), paper 207.
[7] Soetje, Kai C., To be on the right path from BOOS to an integrated pan-European marine data management system, US/EU-Baltic International Symposium, 2008 IEEE/OES, Tallinn. [8] Lehfeldt, R., Heidmann, C., Piasecki, M., Metadata in Coastal Information Systems. In Holz, K.P., Kawahara, M., Wang, S.Y. (Eds.) Advances in Hydro-Science and Engineering, Vol. 5. Cyber-proceedings, 5th Intl. Conf. Hydro-Science & -Engineering, Warsaw (2002). [9] ISO/TC211: ISO 19115:2003 Geographic Information Metadata. [10] OGC Web Service Common Implementation Specification. [11] OpenGIS Styled Layer Descriptor Profile of the Web Map Service Implementation Specification. [12] OpenGIS Web Feature Service. [13] Zipf, A., Basanow, J., Neis, P., Neubauer, S., Schilling, A., Towards 3D Spatial Data Infrastructures (3D-SDI) based on Open Standards - experiences, results and future issues. In: 3D GeoInfo07. ISPRS WG IV/8 International Workshop on 3D Geo-Information: Requirements, Acquisition, Modelling, Analysis, Visualisation. Delft, Netherlands (2007). [14] Hagedorn, B., Hildebrandt, D., Döllner, J., Towards Advanced and Interactive Web Perspective View Services, Developments in 3D Geo-Information Sciences, Lecture Notes in Geoinformation and Cartography, Springer. [15] Neubauer, S., Zipf, A., Suggestions for Extending the OGC Styled Layer Descriptor (SLD) Specification into 3D. Towards Visualization Rules for 3D City Models. In: Urban Data Management Symposium (2007). Stuttgart, Germany. [16] OpenGIS Sensor Observation Service.
[17] Doerffer, R., Colijn, F., van Beusekom, J. (Eds.), Observing the Coastal Sea - an Atlas of Advanced Monitoring Techniques. LOICZ Reports & Studies No. 33. Geesthacht, Germany: GKSS Research Centre. [18] OpenGIS Web Processing Service. [19] Geller, G. and Melton, F., Looking forward: Applying an ecological model web to assess impacts of climate change, Biodiversity 9, no. 3&4, 2008. [20] Schade, S. and Díaz, L. (2010), Supporting Content Provision in Environmental Information Infrastructures. envip2010 workshop in conjunction with EnviroInfo2010, Cologne/Bonn, Germany.
92 Manfred MITTLBOECK et al. 85

Leveraging standardized near real-time in-situ sensor measurements in nature conservation areas

Manfred MITTLBOECK a; Bernd RESCH b; Thomas BLASCHKE c; Helmut FRANZ d
a Key Researcher, Research Studio ispace, Salzburg, Austria
b Research Scientist, Research Studio ispace, Salzburg, Austria
c Professor, University Salzburg, Salzburg, Austria
d Head Research Coordination, National Park, Berchtesgaden, Germany

Abstract. This paper presents a standardized workflow for integrating, processing and presenting real-time in-situ sensor measurements in the nature conservation application domain. In particular, the integration of weather phenomena such as temperature, humidity and wind speed into an automated, SOA-based distributed geographic information architecture allows for the contextual provision of new domain-specific spatial knowledge for understanding ongoing changes in a protected area like the National Park Berchtesgaden. To this end, the local climate measurements from weather stations in the national park area have been integrated via geo-enabled sensor networks utilizing OGC Sensor Web Enablement interface standards. New dynamic geographic knowledge can be derived by integrating, combining and presenting this real-time data with existing geographical layers, mainly from the hydrosphere, lithosphere and biosphere domains. To achieve and verify this goal, a framework has been developed which utilizes recent OGC standardization achievements for sensor measurement integration and distributed geographical processing, analysis and visualization methods. Measured data from various sensor sources (text, database) are transformed on-the-fly into OGC Observations & Measurements XML-structured data, accessible via a custom OGC Sensor Observation Service (OGC SOS). We present a workflow, modules and components which permit near-real-time integration into GI systems. Keywords.
OGC SWE; Sensor Observation Service; geographical analysis; real-time geographical analysis; spatial data infrastructures.

1. Introduction

Environmental data are traditionally maintained, processed, and archived by different national and international organizations. Increasingly, international environmental monitoring initiatives call for the integration of heterogeneous geographic data from local to global levels to assist in decision making and in achieving societal benefits [8]. By nature, in-situ data are valid for points or very small test areas or, more rarely, for transects. Today, in-situ data loggers, analogue or digital, are creating terabytes of data, resulting in massive archives with enormous information potential. In many, if not most, cases data access is very often
constrained by organizational, technological and/or security barriers. Tools for spatially analyzing, comparing, visualizing and even sharing these data and their extracted information are still in their infancy. Furthermore, policy, legal and remuneration issues in regard to ownership and responsibility of value-added products, or products that represent the culmination of different users' input, are yet to be stipulated. However, there is significant recent progress in spatial data infrastructure research (e.g., [6], [11], [5], [9]) and research on distributed geographic information processing [8], as well as new legal mandates like the European INSPIRE Directive [3] and the national laws derived from it on National Spatial Data Infrastructures within the EU member states. In this paper we briefly describe recent standardization efforts of OGC and other organizations and exploit them by dovetailing different types of real-time data integration, information/processing services, visualization and knowledge provision. We describe Berchtesgaden live, a thematic service bus integrating real-time location-aware in-situ sensor data; we enhance this structured data into thematic information layers, combine and analyze it with legacy GI layers, and finally evaluate the results spatially to extract new geographical knowledge for use in the nature conservation domain.

2. Geo-sensor Network Data Integration approach

Future geo-sensor networks will be based on distributed ad-hoc wireless networks of sensor-enabled miniature platforms that monitor environmental phenomena in geographic space. Individual sensor communication nodes are low cost and low power, potentially allowing dense networks of nodes to be deployed to monitor environmental phenomena.
Such geo-sensor networks provide the capability to monitor geographic phenomena in remote, sensitive, or hazardous environments at much higher spatial and temporal granularity than is possible with well-established monitoring systems. Current research into geo-sensor networks is proceeding rapidly on several fronts. For example, special tasking services may ensure that not all sensors operate all the time. Some, being in a sleeping mode, may be activated based on threshold values of other sensors and, consequently, power consumption is minimized. However, apart from technical sensor network challenges, the real-time integration and usage of sensor data in expert and decision support systems is a vital part of evaluating and assessing current environmental conditions. "Timely" can vary significantly depending on the specific application context: the update cycle for landslide monitoring can be around five seconds in some cases, whereas for tracking wildlife, half-hour intervals may be sufficient for appropriate research. To guarantee maximum interoperability and wide applicability, the authors aim for real-time sensor data integration and fusion into existing GI systems using well-established geo-data provision standards such as the OGC Web Feature Service (WFS), Web Map Service (WMS), and Web Coverage Service (WCS). This allows for an integration of real-time measurements by interfacing live data sources with a broad spectrum of geo-processing service infrastructures. Therefore, a plug-in data source for ESRI ArcGIS and a datastore extension for the open source GeoServer 2 software have been developed to allow the transparent usage of real-time sensor data in a broad range of GI systems.
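The core idea of such a plug-in, handing the latest measurement of each station to the GIS as an ordinary vector feature, can be sketched as follows. This is a simplified in-memory version in Python; station names, coordinates and values are invented sample data, not actual park measurements.

```python
# Sketch: expose current sensor readings as GeoJSON-like features, the
# shape of data a WFS/WMS datastore plug-in would hand to the GIS.
# Stations and values below are invented sample data.

def readings_to_features(readings):
    """Turn (station, lon, lat, parameter, value) tuples into features."""
    features = []
    for station, lon, lat, parameter, value in readings:
        features.append({
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"station": station,
                           "parameter": parameter,
                           "value": value},
        })
    return {"type": "FeatureCollection", "features": features}

collection = readings_to_features([
    ("Kuehroint", 12.93, 47.57, "air_temperature", 4.2),
    ("Jenner", 13.02, 47.58, "snow_depth", 0.35),
])
```

Because the readings appear as plain point features, any WMS/WFS client can overlay them on legacy GI layers without knowing anything about the sensor network behind them.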
3. Real-time data integration through Geo-Sensor Web standardization

Current approaches towards real-time data integration usually rely on the traditional request/response model in web service implementations [7]; [10]. Sarjakoski et al. [18] establish a real-time spatial data infrastructure (SDI), which performs a few basic operations such as coordinate transformation, spatial data generalization, query processing or map rendering and adaptation. However, the implemented system does not consider the integration of real-time sensor data and event-based push technologies like XMPP, which is widely used in various internet messaging clients (ICQ, Google Talk, etc.). Other approaches try to achieve real-time data integration via the creation of a temporary database. Oracle's system, presented by Rittman [17], is essentially a middleware between (web) services and a continuously updated database layer. The Oracle approach is able to detect database events in order to analyse heterogeneous data sources and to trigger actions accordingly. In Rahm et al. [16], a more dynamic way of data integration and fusion is presented, using object matching and metadata repositories to create a flexible data integration environment. However, all these approaches have their limitations. As data integration and fusion originated in the domain of computer science, very few approaches exist which are specifically designed for location-aware data. Thus, the integration of sensor data into GI systems currently mostly happens via the laborious interim step of a temporary physical database. This is not desirable in an automated GIS workflow chain, as the database can easily become a bottleneck when handling very large data volumes and spatial data sets. Moreover, such an indirect approach unnecessarily adds another component to the overall workflow, which can result in substantially lower performance.
Thus, a need arises for an approach towards on-the-fly integration of sensor measurements and flexible adaptation of data containers. Sensor Web Enablement (SWE) extends the OGC web services and encodings framework by providing additional models to enable the creation of web-accessible sensor assets through common interfaces and encodings. SWE services are designed to support the discovery of sensor assets and capabilities, access to those resources and data retrieval, subscription to alerts, and tasking of sensors to control observations (OGC 2008). SWE shall foster interoperability between disparate sensors and, optionally, simulation models and decision support systems. A number of authors have addressed the issue of service chaining in SDIs and, more particularly, the use of distributed processing services that can be combined into value-added service chains to serve specific GI applications (e.g., [6], [11], [5], [9]).

4. Leveraging near-real-time GI technologies for nature conservation areas and the INSPIRE protected sites domain

According to the International Union for the Conservation of Nature (IUCN), a Protected Site is an area of land and/or sea especially dedicated to the protection and maintenance of biological diversity, and of natural and associated cultural resources, and managed through legal or other effective means. The European Union Directive 2007/2/EC [3] aims at establishing an Infrastructure for Spatial Information in the European Community (INSPIRE) to support environmental policies. Therefore, harmonized datasets, services and structured information about the geographic
resources are the main requirements for supporting decision-making processes at all levels. Domain experts in protected areas worldwide have already generated a vast amount of environmental data in accordance with their immediate requirements and priorities. They organize specialized datasets including protected biotopes and habitats, flora and fauna species distributions, and supporting datasets such as geological data, soil data, forestry, hydrological data, climate data, elevation, administrative units etc. Within the nature conservation domain, representations of these various geospatial entities can differ in terms of data model; spatial, temporal and thematic scales; data generalization (the preserved information about real entities and/or phenomena); conceptual models; geographic projections etc. Therefore, the integration of heterogeneous geospatial data needs a standardized conceptual model for capturing the spatial and temporal characteristics of environmental entities. The OGC SOS provides an API for describing, accessing and managing deployed sensors and retrieving sensor data in a well-structured, standardized manner using XML technology. Such approaches, e.g. for hydrosphere measurements via standardized interfaces, allow the time and effort of integrating this timely data into nature conservation SDIs to be reduced dramatically. Additional essential functionalities for embedding real-time sensor data analysis into application-specific workflows are alerting and notification. As the variety of available real-time data sources keeps growing, it is increasingly important to organize and filter these data according to pre-defined criteria and rules. For instance, when monitoring protected landscapes, temperature variations may be of specific interest mainly in connection with other phenomena like precipitation and soil conditions, in order to support landslide risk assessment better and faster.
Only specific combinations of several of these parameters allow for the identification of so-called events, which can trigger appropriate user actions. Actions could be sending out automated information (e.g. SMS or e-mails) and/or triggering further tasks. OGC SWE therefore defines the Sensor Alert Service (SAS), which specifies interfaces (not a service in the traditional sense) enabling sensors to advertise and publish alerts including associated metadata. Clients may subscribe to sensor data alerts based on defined spatial and property-based constraints. Also, sensors can be advertised to the SAS to allow clients to subscribe to the sensor data via the SAS. The SAS specification, currently at version 0.9.0, has not yet been released as an official OGC standard. There are still discussions on the general suitability of this standard, and the standard document is currently under investigation, especially in conjunction with the OGC Sensor Event Service (SES). SAS may use the Extensible Messaging and Presence Protocol (XMPP), a push protocol with major technological advantages in terms of very light-weight (low network payload) delivery of sensor notifications in comparison to HTTP. SAS notifications are provided via a Multi-User Chatroom (MUC) for each registered sensor and each predefined sensor alert definition. To receive notifications, a client has to join the specific MUC. For example, there are chatrooms for different gauge heights of a river level; values and alerts are posted only into those chatrooms whose height definitions have been exceeded. With this XMPP technology it is possible for thousands of users to subscribe to an OGC SAS while the network traffic stays low, as the information alerts are cast just once.
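The chatroom-per-threshold pattern described above can be sketched in a few lines of Python. The room names and alert levels are invented for illustration; a real SAS deployment derives them from its registered alert definitions.

```python
def rooms_to_notify(gauge_height, thresholds):
    """Return the MUC room names whose gauge-height thresholds are exceeded.

    Mirrors the SAS/XMPP pattern: one chatroom per alert level, and a new
    reading is posted only into the rooms whose thresholds it exceeds.
    Room names and levels below are illustrative, not from any real SAS.
    """
    return [room for level, room in sorted(thresholds.items())
            if gauge_height >= level]

# Hypothetical alert levels for a river gauge (metres).
ALERT_ROOMS = {3.0: "gauge-warn@muc.example.org",
               4.5: "gauge-alert@muc.example.org",
               6.0: "gauge-flood@muc.example.org"}

# A reading of 5.1 m exceeds the first two thresholds only.
rooms = rooms_to_notify(5.1, ALERT_ROOMS)
```

Because every subscriber of a room receives the single message posted into it, the alert is cast once regardless of how many clients have joined.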
5. GI analysis of real-time environmental monitoring information: National Park Berchtesgaden

The geographic information infrastructure in the National Park Berchtesgaden is based on Service Oriented Architectures (SOA) to ensure flexibility, reusability and portability of the components and the overall infrastructure. The Berchtesgaden National Park in southeast Germany is one of the oldest protected areas in the Alps. It comprises 218 square kilometres, with altitudes ranging from 540 metres at the lowland Königssee to the towering Watzmann Mountain (2670 m). It hosts one of the oldest GIS installations in Germany. Since 1984, enormous amounts of data have been collected, analyzed and archived. The GI system is used as the main instrument for the long-term National Park Plan. The original GIS data structure is currently being reorganized and restructured, aiming to integrate local weather-station measurements directly into the GI environment for spatio-temporal analysis. The existing climate-station network consists of nine stations which send their measurements via GSM/GPRS to a hosted centralized server providing visual web access (tables and graphs) for the different entities like temperature, humidity, snow cover etc. in a proprietary manner. To get spatial information on the actual climatologic situation within the National Park, we implemented automated mechanisms (using the Python scripting language) to structure the proprietary measurements of the weather stations, making them accessible as OGC O&M on a 10-minute basis. This enables the accessibility of the structured sensor datasets (temperature, humidity, barometric pressure, rainfall, snow depth etc.) via the well-established OGC SOS interface. A special Web map solution (Fig.
1) has been developed (based on Microsoft Silverlight) to support real-time areal snapshot analysis of the recent climatologic situation in the National Park. The measurements are directly integrated into ArcGIS Server (using a custom SOS plugin data source) as input for co-kriging and IDW interpolation processes, also integrating the differences in altitude (via a DEM) of the weather stations.

Fig. 1: NP Berchtesgaden, Temperature Interpolation Application
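To make the server-side interpolation step concrete, here is a bare-bones inverse-distance-weighting (IDW) sketch in Python. It is a simplified stand-in for the production workflow: the station coordinates and temperatures are invented, and the altitude correction via DEM used in the real system is omitted.

```python
def idw(x, y, stations, power=2.0):
    """Inverse-distance-weighted estimate at location (x, y).

    `stations` is a list of (sx, sy, value) tuples. This is a minimal
    sketch of plain IDW; the production workflow described above runs
    inside ArcGIS Server and additionally corrects for station altitude.
    """
    num = den = 0.0
    for sx, sy, value in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return value                      # exactly at a station
        w = 1.0 / d2 ** (power / 2.0)         # weight = 1 / distance**power
        num += w * value
        den += w
    return num / den

# Hypothetical station readings: (easting, northing, temperature in degC).
readings = [(0.0, 0.0, 10.0), (10.0, 0.0, 14.0)]
mid = idw(5.0, 0.0, readings)   # midpoint between two equally distant stations
```

At the midpoint both stations carry equal weight, so the estimate is the mean of the two readings.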
This technological approach empowers the on-demand, personalized, web-based processing of near-real-time measurements to provide an interpolated map of the climatologic situation.

6. Conclusion

This new approach of directly integrating near-real-time environmental measurements into the nature conservation SDI of the National Park Berchtesgaden enables a faster and better understanding and assessment of environmental dynamics. The proposed and validated system utilizes upcoming international interface and processing standards like the OGC SOS to open up new possibilities for extracting spatial knowledge and to enhance the support for the National Park administration in its everyday work.

List of literature

[1] Bartelme, N.: Geoinformatik: Modelle, Strukturen, Funktionen. 4. Auflage. Heidelberg.
[2] Botts, M., Percivall, G., Reed, C. and Davidson, J. (Eds.) (2007a) OGC Sensor Web Enablement: Overview and High Level Architecture. OpenGIS White Paper OGC, Version 3, 28 December. (17 August 2009)
[3] Echterhoff, J. and Everding, T. (2008) OpenGIS Sensor Event Service Implementation Specification. OpenGIS Discussion Paper OGC, Version 0.0.1, 27 August. (20 September 2009)
[4] European Union (2007) Directive 2007/2/EC of the European Parliament and of the Council. Online: uri=oj:l:2007:108:0001:0014:en:pdf
[5] Friis-Christensen, A., Lucchi, R., Lutz, M., Ostländer, N. (2009) Service chaining architecture for implementing distributed geoprocessing applications. Int. Journal of Geogr. Inf. Science, 23(5).
[6] Granell, C., Gould, M. and Esbri, M.A. (2008) Geospatial web service chaining. In: H.A. Karimi (Ed.), Handbook of Research on Geoinformatics. Hershey, PA: IGI Global.
[7] Harrie, L. (2004) Using Simultaneous Graphic Generalisation in a System for Real-Time Maps. Papers of the ICA Workshop on Generalisation and Multiple Representation, Leicester, August 20-21.
[8] Jasani, B., Pesaresi, M., Schneiderbauer, S., Zeug, G. (Eds.)
(2008) Remote Sensing from Space: Supporting International Peace and Security. New York: Springer.
[9] Kiehle, C., Greve, K. and Heier, C. (2007) Requirements for next generation spatial data infrastructures: standardized web based geoprocessing and web service orchestration. Transactions in GIS, 11(6).
[10] Lehto, L. and Sarjakoski, L.T. (2005) Real-time Generalisation of XML-encoded Spatial Data for the Web and Mobile Devices. Intern. Journal of Geographical Information Science, 19(8-9).
[11] Lemmens, R., de By, R., Gould, M., Wytzisk, A., Granell, C., van Oosterom, P. (2007) Enhancing geoservice chaining through deep service descriptions. Transactions in GIS, 11(6).
[12] Mittlboeck, M. and Resch, B. (2008) Federal Pervasive Sensor Networks Serving Geographic Information Services. In: Proceedings of the 5th International Symposium on LBS and Telecartography, November 26-28, 2008, Salzburg, Austria.
[13] Mittlboeck, M. and Resch, B. (2008) Standardisierte Integration von Real-time Sensormessungen für zeitnahe GIS-Analyse. In: Strobl, J., Blaschke, T., Griesebner, G. (Eds.) Angewandte Geoinformatik 2008, Wichmann Verlag, Heidelberg.
[14] Mhatre, V. and Rosenberg, C. (2004) Homogeneous vs Heterogeneous Clustered Sensor Networks: A Comparative Study. Proceedings of the IEEE International Conference on Communications, Paris, France, June.
[15] Nash, E. (2008) WPS Application Profiles for Generic and Specialised Processes. Proceedings of the GI Days, Münster, Germany, June.
[16] Rahm, E., Thor, A. and Aumueller, D. (2007) Dynamic Fusion of Web Data. XSym 2007, Vienna, Austria.
[17] Rittman, M. (2008) An Introduction to Real-Time Data Integration. (22 July 2009)
[18] Sarjakoski, T., Sester, M., Illert, A., Rystedt, B., Nissen, F., Ruotsalainen, R. (2004) Geospatial Infomobility Service by Real-time

First, the existing concepts refer to a system. Based on general systems theory, a system can be defined as a complex of interacting elements, with the interactions of the elements specifying a particular relation [28]. This can be applied to the inter-relation of the components discussed in the previous section that constitute the social and ecological systems, coined the socio-ecological system. Furthermore, the system has been described in the literature as having a particular ability or capacity to do something as it pertains to resilience. This something relates to the ability to manage or deal, via some mechanism, with stress or disturbance to the system, which we termed 'cope'. The external factor is a generalization of the stress or disturbance that threatens the system. However, its qualification as undermining the system makes it a synonym of stress or
Desiree DANIEL; Jens ORTMANN

threat. The latter definitions of resilience included the function of recovery by the system from a disturbance, denoted by 'bouncing back' in the generic definition. Thus, the generic definition represents the most general form of resilience that can be deduced. From our observations, we deemed the idea of coping the central component of resilience. According to the Oxford dictionary [8], to cope means to effectively deal with something. In the concepts examined, the idea of a system dealing with or handling stress has been conveyed through words such as 'absorb' [13,24,9], 'resist' [27,1,7], 'withstand' [1], 'adapt' [18,3,27,29,7], 'self-organise' [13,3,27], 'maintain' [1], 'return' [20], 'remain' [3]. These words describe actions taken by the system to deal with stress or disturbance and can be seen as strategies of coping. To see how well the specialized definitions fit into the general one, two familiar definitions are aligned with it, i.e. the Resilience Alliance and UN/ISDR definitions. The Resilience Alliance considers three dimensions of resilience. At a glance, this definition fits well with our general take on resilience, since the key elements are identifiable within its structure, wherein each dimension indicates one strategy of coping. The first dimension can be re-arranged such that it is expressed as: the capacity of a system to absorb an external factor and remain within the same state or domain of attraction. The second dimension can be repositioned such that it states: the ability of a system to self-organise in the face of an external factor that undermines the system. The third dimension can be altered in a similar manner: the degree of adaptation by a system to an external factor that undermines the system. The UN/ISDR definition consists of two dimensions. The first dimension fits easily into the general definition: coping with respect to an external factor can be implied.
Likewise, the second dimension does not require altering to meet the general definition. In the second dimension, the authors specified a particular system, i.e. the social system. If 'the degree to which a system is capable of organizing itself' is one way of describing resilience, then this part of the UN/ISDR definition implies that social resilience determines the overall resilience of a system. This reiterates the link that the social and ecological systems share via institutions and the management of resources (see [1]). A disparity exists between the first dimensions of both definitions. The Resilience Alliance emphasizes the perpetuation of a system in its current state, whereas the UN/ISDR implies a transformative perspective wherein the system assumes a new acceptable state. Miller et al. [16] have identified this conflict among resilience concepts. The authors acknowledged that, in light of climate change, the persistence of a current state by systems would curtail the positive adaptation that encourages sustainability and social equity. Thus, this form of coping undermines social resilience in the long run. Examining the idea of 'cope', some specializations are more related to each other than others. Withstand and resist are synonymous with each other. To withstand or resist a disturbance implies keeping the disturbance out of the system as much as possible, before the disturbance occurs. In the U.S., forms of resistance are evident: local governments have begun taking precautionary measures against sea level rise by adding sand to beaches to offset beach erosion; in other coastal communities, residents are encouraged to raise their structures via the incentive of lower flood insurance premiums; and areas of coast have been reserved to allow for the retreat of wetlands 3. Absorb implies the amount of disturbance a system can take in before a change in

3 (last accessed: )
its structure occurs. Absorb can be tied to Adger's [1] concept of criticality that exists between social and ecological systems. Criticality states that feasible adaptation measures are in place; however, a system can absorb a disturbance without choosing to resist first. In Tuvalu, some farmers continue crop cultivation despite coastal inundation and a decrease in the amount and quality of produce, thereby absorbing the disturbance 4. Maintain, return and remain have been linked to preserving a particular state of the system, i.e. the system does not transform to a new state. With regard to sea level rise, returning to an original state prior to coastal inundation is not possible. As stated previously, maintaining and remaining in a particular state stunts adaptation and sustainability in the face of climate change and sea level rise, and these are not desirable attributes. To adapt to a disturbance implies continuous change by the system, based on the system's experience, which enables it to adjust accordingly. Self-organisation is one form of adaptation wherein the system alters its structure in the face of a particular stress. In Tuvalu, where there is evidence of sea level rise and coastal inundation, residents have opted to raise their houses and buildings to restrict flooding, and new houses are required to be constructed on 10-foot-tall stilts 4. Therefore, we can suggest a process of coping wherein a system first resists, then absorbs, followed by adaptation to a disturbance (climate change and sea level rise), after which the system absorbs once more.

3. Conclusion and Future Work

Global warming is expected to have disastrous effects on societies, such as the induced rise in sea level. The threat of sea level rise to SIDS would expose their socio-ecological systems to the effects of coastal flooding, salinisation, erosion, etc.
It is established that resilience is a pertinent concept that can be effective in dealing with the uncertainty surrounding climate change. Resilience has been linked to vulnerability, as both concepts aim to understand socio-ecological system dynamics in the face of disturbance. As such, there is a need for convergence of the theoretical and methodological approaches of both concepts [16,25]; such an integration of concepts can lead to holistic adaptation strategies in the face of sea level rise. Noting the complementary nature of both concepts, this paper aimed to focus solely on resilience, since it underscores the systemic characteristics that facilitate recovery. However, resilience still remains on the conceptual level, as several domains focus their academic debates on the definition of resilience rather than on methods of extraction, formalization and application across domains to resolve developmental and sustainability issues of society. The notion of resilience as a property (of a socio-ecological system) taken with reference to an external entity (the external threat) makes semantic reference systems [14,15] a promising approach to operationalize resilience. In creating this reference frame, other aspects of resilience still need to be considered, such as socio-ecological system feedbacks, limits to adaptation, and the dichotomy of system-based versus actor-based approaches to resilience [16].

4 see (last accessed )
Acknowledgements

This work has been partly supported through the International Research Training Group on Semantic Integration of Geospatial Information by the DFG, GRK.

References

[1] W.N. Adger. Social and ecological resilience: are they related? Progress in Human Geography, 24(3).
[2] F. Berkes and C. Folke. Linking social and ecological systems, chapter: Linking social and ecological systems for resilience and sustainability. Cambridge University Press, Cambridge, UK.
[3] S. Carpenter, B. Walker, J.M. Anderies, and N. Abel. From metaphor to measurement: Resilience of what to what? Ecosystems, 4(8).
[4] F.S. Chapin, E.S. Zavaleta, V.T. Eviner, R.L. Naylor, P.M. Vitousek, H.L. Reynolds, D.U. Hooper, S. Lavorel, O.E. Sala, S.E. Hobbie, M.C. Mack, and S. Diaz. Consequences of changing biodiversity. Nature, 405.
[5] R. Costanza, M. Kemp, and W. Boynton. Scale and biodiversity in estuarine ecosystems. Cambridge University Press, Cambridge.
[6] R.S. Groot. Functions of nature: evaluation of nature in environmental planning, management and decision making. Wolters-Noordhoff BV, Groningen, The Netherlands.
[7] J.W. Handmer and S.R. Dovers. A typology of resilience: Rethinking institutions for sustainable development. Organization & Environment, 9(4).
[8] S. Hawker and M. Waite, editors. Oxford Paperback Thesaurus. Oxford University Press.
[9] C.S. Holling. Resilience and stability of ecological systems. Annual Review of Ecology and Systematics, 4:1-23.
[10] T. Hosein and J. Opadeyi. GIS-Based Assessment of Coastal Vulnerability to Sea Level Disturbance in the Caribbean. Department of Surveying and Land Information, The University of the West Indies, St Augustine, Trinidad.
[11] J.M. Kendra and T. Wachtendorf. Elements of resilience after the World Trade Center disaster: Reconstituting New York City's emergency operations centre. Disasters, 27(1):37-53.
[12] R.J.T. Klein.
Coastal Vulnerability, Resilience and Adaptation to Climate Change: An Interdisciplinary Perspective. PhD thesis.
[13] R.J.T. Klein, J.R. Nicholls, and F. Thomalla. Resilience to natural hazards: How useful is this concept? Global Environmental Change Part B: Environmental Hazards, 5(1-2):35-45.
[14] W. Kuhn. Semantic reference systems. International Journal of Geographical Information Science, 17(5).
[15] W. Kuhn and M. Raubal. Implementing semantic reference systems.
[16] F. Miller, H. Osbahr, E. Boyd, F. Thomalla, S. Bharwani, G. Ziervogel, B. Walker, J. Birkmann, S. van der Leeuw, and J. Rockstrom. Resilience and vulnerability: Complementary or conflicting concepts? Ecology and Society, 15(3):11.
[17] S. Naeem, L.J. Thompson, S.P. Lawler, J.H. Lawton, and R.M. Woodfin. Declining biodiversity can alter the performance of ecosystems. Nature, 368:734-37.
[18] M. Pelling. The vulnerability of cities: natural disasters and social resilience. Earthscan Publications, London, UK.
[19] G.D. Peterson, C.R. Allen, and C.S. Holling. Ecological resilience, biodiversity, and scale. Ecosystems, 1:6-18.
[20] S.L. Pimm. The complexity and stability of ecosystems. Nature, 307(5949).
[21] E.-D. Schulze and H.A. Mooney. Biodiversity and Ecosystem Function. Springer, New York.
[22] M. Sutherland, P. Dare, and K. Miller. Monitoring Sea Level Change in the Caribbean. Geomatica, 62(4).
[23] D. Tilman. Biodiversity and ecosystem functioning. Island Press, Washington, DC.
[24] P. Timmerman. Vulnerability, resilience and the collapse of society: A review of models and possible climatic applications. Technical report, Institute for Environmental Studies, University of Toronto.
[25] B.L. Turner. Vulnerability and resilience: Coalescing or paralleling approaches for sustainability science? Global Environmental Change, 20.
[26] B.L. Turner, R.E. Kasperson, P.E. Matson, J. McCarthy, R.W. Corell, L. Christensen, N. Eckley, J.X.
Kasperson, E. Luers, M.L. Martello, C. Polsky, A. Pulsipher, and A. Schiller. A framework for vulnerability analysis in sustainability science. Proc. of the Nat. Academy of Science, 100(14).
[27] UN/ISDR. Living with risk: A global review of disaster reduction initiatives, 2004 version. Technical report, Inter-Agency Secretariat of the International Strategy for Disaster Reduction.
[28] L. von Bertalanffy. An outline of general system theory. The British Journal for the Philosophy of Science, 1(2).
[29] M.A. Waller. Resilience in ecosystemic context: evolution of the concept. Am J Orthopsychiatry, 71(3):290-7.
[30] A. Wildavsky. Searching for Safety. Transactions Publishers, New Brunswick, NJ, USA, 1991.
Mobile In-Situ Sensor Platforms in Environmental Research and Monitoring

Juliane BRINK a; Timo JANSEN b
a Institute for Geoinformatics, University of Muenster
b Institute for Landscape Ecology, University of Muenster

Abstract. The use of Unmanned Aerial Vehicles and Autonomous Underwater Vehicles as mobile sensor platforms in environmental science is growing. While the vehicles and sensor technology have reached maturity for practical operation, we observe that the potential of artificial intelligence to allow these devices to perform their tasks autonomously is not being utilized. We give an overview of current applications of such mobile sensor platforms in the domains of oceanography, meteorology and atmospheric dispersion, and discuss the approaches for intelligent adaptive sensor movement proposed in research and applied in practice.

Keywords. mobile sensors, sensor movement planning, Unmanned Aerial Vehicles (UAV), Autonomous Underwater Vehicles (AUV)

Introduction

Technological advances in Unmanned Aerial Vehicles (UAV) and Autonomous Underwater Vehicles (AUV) generate new opportunities in environmental research and applications. UAVs and AUVs are increasingly being deployed as sensor platforms in environmental exploration, monitoring and emergency response. There are growing efforts and investments in unmanned aircraft technology for earth science by European and US national research institutions [13,12]. Unmanned vehicles have the ability to operate in remote areas with limited accessibility for humans, or in hostile and hazardous environments, avoiding direct human intervention and risk to humans. An obvious example is the assessment of pollution from chemicals that are poisonous, odourless and invisible gases, where vision sensors (i.e. remote sensing) are not applicable. This paper provides an overview of the current use and concepts for use of mobile in-situ sensor platforms in environmental exploration and monitoring.
We will discuss examples from the domains of oceanography, meteorology and atmospheric dispersion. Our special interest is in concepts and algorithms for intelligent movement strategies for in-situ sensor platforms. This brings us to the fundamental problems of integrating mobile in-situ sensor data into environmental models and of using environmental models for the efficient use of mobile in-situ sensors, both of which constitute research challenges of multi-disciplinary interest to Geographic Information Science.
1. Environmental applications of mobile in-situ sensor platforms

Mobile in-situ sensors are employed in environmental research and monitoring for various tasks. This section reviews their current use in the three major application areas: oceanography, meteorology and atmospheric dispersion.

1.1 Oceanography

Scientific mapping and survey missions of the deep sea have traditionally been performed by inhabited submersibles, towed vehicles, and tethered remotely operated vehicles (ROVs). These are now being replaced by AUVs due to superior mapping capabilities and improved logistics [30]. These AUVs carry sensors like cameras and sonars, but also in-situ sensors for conductivity, temperature or chemicals, mass spectrometers or magnetometers. Yoerger et al. deployed an AUV for the exploration of hydrothermal plumes and the discovery of hydrothermal vents [29]. Other applications are estimating the heat flux from vent fields, the exploration of cold seeps, or bathymetric and magnetic mapping [30]. AUVs are used in research expeditions under the Arctic ice, where the operation of inhabited submersibles is considered too risky and the ice prevents the use of towed or remotely operated vehicles. NASA's Astrobiology Science and Technology for the Exploration of Planets program funded an expedition for the exploration of hydrothermal vent fields in the Arctic in 2007, with the explicit goal of investigating robotic technology to explore Europa's ice-covered ocean [16]. Ramos and Abreu describe AUV surveys of wastewater plumes from coastal sewage discharges [20]. These surveys aim at a better understanding of the dilution processes and at predicting environmental impacts. AUVs equipped with mass spectrometers have been used for analysing naturally occurring oil seeps and also for tracking subsurface oil leaks from damaged blow-out preventers.
When the blow-out of the Deepwater Horizon offshore oil drilling rig in April 2010 caused the largest oil spill in history, researchers deployed AUVs equipped with mass spectrometers and found a continuous subsurface oil plume of more than 35 kilometres in length [3].

1.2 Meteorology

Unmanned aircraft technology is increasingly employed in meteorological research to complement observations from meteorological towers and radiosondes. The robotic aircraft Aerosonde was first used for meteorological observations in the Arctic in Alaska in 1999 [4]. The Aerosonde is equipped with sensors for relative humidity and air temperature and is continuously improved for operation in Arctic weather conditions. One future goal is to use the Aerosonde in targeted or adaptive observational strategies to provide input to operational numerical weather prediction models. Van den Kroonenberg et al. describe an unmanned aircraft called M²AV, which collects horizontal wind vector data for boundary layer research [25]. The M²AV is equipped with a five-hole probe for wind measurements and a combined temperature and relative humidity sensor, and performed flights in the Weddell Sea of the Antarctic. Reuder et al. used an unmanned aircraft system equipped with sensors for temperature, humidity and pressure to obtain profiles for atmospheric boundary layer research in 2007 and 2008 in Iceland and Spitsbergen [21]. They also describe a study using a UAV to monitor the horizontal variability of
temperature and humidity fields above different types of agricultural land use.

Figure 1. The Ifgicopter equipped with humidity and temperature sensors.

Frew and Argrow propose an unmanned aircraft system to study the process of tornado formation in severe convective storms, which requires in-situ measurements of the thermodynamic and microphysical properties in the rear-flank region of supercell storms [9]. The Ifgicopter project at the Institute for Geoinformatics in Münster uses microcopters for identifying boundary layers near the ground [27]. Figure 1 shows the Ifgicopter equipped with humidity and temperature sensors. The microcopter is also used for research on vertical distributions of methane gases and for locating emitters of methane.

1.3 Atmospheric Dispersion

The deployment of UAVs for surveillance of the atmospheric dispersion of gases or particles, i.e. toxic emissions, is well motivated, but compared to applications in meteorology and oceanography it has not matured beyond an experimental stage to this day. UAVs and swarms of UAVs equipped with in-situ sensors are proposed in a number of application scenarios including environmental monitoring and emergency response [2]. Daniel et al. propose a system architecture called AirShield: a swarm of micro unmanned aerial vehicles for the assessment of airborne contaminants in emergency response [5]. The AirShield project is funded by the German Federal Ministry of Education and Research as part of a program for protection systems for security and emergency services. One practical example of a UAV-like platform deployed in atmospheric dispersion monitoring is the work of Harrison et al., who equipped a balloon (radiosonde) with charge and aerosol particle sensors to investigate the volcanic ash plume generated by the Eyjafjallajökull in Iceland in April 2010, which prohibited aviation over large parts of Europe for several days [10].
Volcanic ash constitutes a serious threat to aviation and thus is continuously monitored by the nine Volcanic Ash Advisory Centers 1, which have to run their models on the basis of satellite imagery of limited availability due to temporal delay and cloud coverage. This is one example of an application where the quick availability of in-situ data collected by unmanned aircraft systems would be beneficial.
2. Approaches for intelligent sensor movement

An intelligent movement strategy for a mobile sensor is crucial to effectively perform time-critical observation tasks or to account for the dynamics of the phenomenon to be observed, e.g. to track a pollutant plume. Research in statistics and artificial intelligence has brought about various concepts for movement strategies for mobile in-situ sensors. These address different observation goals such as mapping a phenomenon, locating the source of an emission, or delineating an area where measurements exceed some threshold. We give an overview of approaches for intelligent sensor movement proposed in research, which we divide into biomimetic and model-based approaches, and of approaches being applied in practice.

2.1 Biomimetic approaches proposed in research

Various sensor movement strategies for locating sources of gases or odours developed in robotics are inspired by nature, e.g. by insect orientation. For example, Ishida et al. and Li et al. suggest mimicking insect orientation strategies towards pheromone with robot platforms [14,17]. Marques et al. discuss odour source localization strategies inspired by the male silkworm's tracking of female moth pheromone [18]. The main shortcomings of these methods are that they can only deal with odour sources that are not moving and that they require wind information.

2.2 Model-based approaches proposed in research

Several methods utilize a model of the observed phenomenon. This model can be, for example, a numerical model based on partial differential equations describing a dispersion process. Patan et al. and Song et al. propose methods of optimal sensor motion planning for parameter estimation of distributed systems [19,23]. Walkowski describes a geostatistical approach for sensor network optimization that uses the kriging variance as a measure of the information deficit at a location, i.e. the need for additional measurements [26].
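The information-deficit idea behind Walkowski's kriging-variance criterion can be illustrated with a deliberately crude stand-in: instead of computing the full kriging variance, the sketch below evaluates a spherical variogram model at the distance to the nearest existing measurement, which, like the kriging variance, grows with distance from the existing sensors. All coordinates and variogram parameters are invented for illustration.

```python
import math

def spherical_variogram(h, nugget=0.0, sill=1.0, rng=500.0):
    """Spherical variogram model gamma(h); parameters are illustrative."""
    if h >= rng:
        return nugget + sill
    r = h / rng
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

def information_deficit(candidate, measured):
    """Crude proxy for the kriging-variance criterion: the variogram value
    at the distance to the nearest existing measurement. It is zero at a
    measured location and grows monotonically with distance from the
    network, mimicking the behaviour of the kriging variance."""
    cx, cy = candidate
    d = min(math.hypot(cx - mx, cy - my) for mx, my in measured)
    return spherical_variogram(d)

# Hypothetical existing sensors and candidate locations (map units).
stations = [(0.0, 0.0), (400.0, 0.0)]
candidates = [(200.0, 0.0), (50.0, 0.0), (200.0, 300.0)]

# Send the mobile sensor where the information deficit is largest.
best = max(candidates, key=lambda c: information_deficit(c, stations))
```

A full implementation would solve the ordinary kriging system at each candidate location; this proxy only preserves the qualitative ranking of sparsely covered locations.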
A similar idea can be found in Elston et al., who use the geostatistical concept of the variogram to identify regions of high variability, which they associate with high scientific interest [7]. Heuvelink et al. propose a geostatistical methodology for optimizing the allocation of mobile measurement devices complementing a static radioactivity monitoring network [11]. Spatial simulated annealing is used to optimize the sampling design according to the criterion of minimizing the costs of false classifications into above- or below-intervention-level concentrations. The reference concentration map is based on the outcome of a physical atmospheric dispersion model. These geostatistical approaches address the problem of positioning a mobile sensor within an existing network of sensors and thus are not easily applicable to a scenario with one or only a few mobile sensors. Moreover, these methods identify some sensor location within the study area rather than one step of a continuous sensor movement. Other approaches use qualitative models of the observed phenomenon. Subchan et al. present a method for cooperative path planning of two UAVs to detect and model the shape of a contaminant cloud, which is modelled as a discrete Gaussian-shaped plume [24]. The boundary is approximated by connecting the entry and exit points detected by the UAVs with line segments of constant curvature to form splinegons. This method, however, uses a static model and does not address the temporal dynamics
of the contaminant cloud. Brink proposes a concept for spatio-temporal reasoning about a gas plume based on a Gaussian model as a basis for adaptive sensor movement [1]. This approach infers qualitative information about the plume movement and size from the sensor data that is relevant for tracking a moving plume. The main drawback of these qualitative methods is that they yield only imprecise results. However, compared to the methods that use quantitative models, they also require less, or less precise, information as input.

2.3. Approaches applied in practice

Very few UAV or AUV explorations in environmental science are automated in terms of an intelligent adaptive movement strategy. Yoerger et al. employ a movement strategy for hydrothermal vent discovery and exploration with an AUV that begins with a conventional grid survey and then revisits the locations of clusters of anomalous sensor readings, which are ranked according to their relative value of being revisited [29]. For assessing the extent of the hydrocarbon plume resulting from the Deepwater Horizon oil spill, Camilli et al. navigated the AUV in a zig-zag pattern starting from the leak [3].

3. Conclusion

Developing intelligent autonomous mobile sensor platforms poses a number of open interdisciplinary research questions, among which we identify two important aspects. The first aspect is the integration of geospatial information relevant for the exploration or monitoring task. In the context of monitoring contaminant dispersion using UAVs, Daniel et al. point to the need for integration of dynamic and static data, such as safety-relevant geodata, e.g. locations of kindergartens, schools, hospitals and retirement homes, and terrain and weather data [5]. This information can be used to increase the efficiency of the observation by concentrating the measurements where the data is most urgently needed.
The second aspect is the development of intelligent sensor movement strategies that are able to efficiently collect the data relevant for a specific observation task [8]. This requires the integration of environmental models describing the behaviour of the observed phenomenon, e.g. atmospheric dispersion models, or models describing underlying and related phenomena such as weather nowcasts and forecasts. Daniel et al. suggest real-time dispersion modelling of aerosols and gases as a basis for flight routes of UAVs, using the collected sensor data to enhance the model [5]. Such high-fidelity models can be too computationally intensive to be included in a real-time path planning loop, or there can be too little data to characterize the phenomenon with sufficient accuracy [7]. Frew and Argrow propose the combination of real-time science-driven control of unmanned aircraft systems with online modeling and data assimilation using domain-specific reduced order phenomenological models [9]. They envision vehicles that have simple models of atmospheric phenomena onboard that only retain features of the environment necessary for their guidance. Research in Geosensor Networks elaborates on object-based models of dynamic environmental phenomena [6,28,15,22]. Object-based modelling of the observed phenomenon (as for example in [24,1] mentioned in Section 2.2) might be a reasonable approach to meet
the requirements of 1) sensor movement planning in real-time and 2) applicability in situations where there is little phenomenological information. This paper illustrates the state of the art in autonomous mobile in-situ sensor platforms for environmental exploration and monitoring. The presented examples from practice reveal a gap between the advances in vehicle and sensor technology, which are mature enough for practical operation, and the development of artificial intelligence, sensor data integration and modelling, which seems to lag behind. This review suggests that there is currently no generic method for navigating mobile sensors that would make use of all potentially available phenomenological and other relevant data and allow automating an exploration or monitoring task. We think that future research in this area would be of high benefit to various geo-disciplines and is essential to fully exploit the large potential of UAV and AUV technology for environmental sciences.

Acknowledgements

This research is funded by the International Research Training Group on Semantic Integration of Geospatial Information by the DFG (German Research Foundation), GRK.

References

[1] J. Brink. Qualitative Spatio-temporal Reasoning from Sensor Data using Qualitative Trigonometry. In Proceedings of the 14th AGILE International Conference on Geographic Information Science. Springer.
[2] A. Bürkle, F. Segor, and M. Kollmann. Towards autonomous micro UAV swarms. Journal of Intelligent and Robotic Systems, pages 1-15.
[3] R. Camilli, C.M. Reddy, D.R. Yoerger, B.A.S. Van Mooy, M.V. Jakuba, J.C. Kinsey, C.P. McIntyre, S.P. Sylva, and J.V. Maloney. Tracking Hydrocarbon Plume Transport and Biodegradation at Deepwater Horizon. Science, 330:201.
[4] J.A. Curry, J. Maslanik, G. Holland, and J. Pinto. Applications of aerosondes in the Arctic. Bulletin of the American Meteorological Society, 85(12).
[5] K. Daniel, B. Dusza, A. Lewandowski, and C. Wietfeld.
AirShield: A system-of-systems MUAV remote sensing architecture for disaster response. In Systems Conference, rd Annual IEEE. IEEE.
[6] M. Duckham, S. Nittel, and M. Worboys. Monitoring dynamic spatial fields using responsive geosensor networks. In Proceedings of the 13th annual ACM international workshop on Geographic information systems. ACM.
[7] J. Elston, M. Stachura, E.W. Frew, and U.C. Herzfeld. Toward model free atmospheric sensing by aerial robot networks in strong wind fields. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, May 2009.
[8] J. Exeler. Sensor Trajectory Planning in Environmental Observations. In Proceedings of the GIScience 2010 Doctoral Colloquium, Zurich, Switzerland, pages 23-28.
[9] E.W. Frew and B. Argrow. Embedded reasoning for atmospheric science using unmanned aircraft systems. In AAAI 2010 Spring Symposium on Embedded Reasoning: Intelligence in Embedded Systems, Palo Alto, CA, March 2010.
[10] R.G. Harrison, K.A. Nicoll, Z. Ulanowski, and T.A. Mather. Self-charging of the Eyjafjallajökull volcanic ash plume. Environmental Research Letters, 5:024004.
[11] G.B.M. Heuvelink, Z. Jiang, S. De Bruin, and C.J.W. Twenhofel. Optimization of mobile radioactivity monitoring networks. International Journal of Geographical Information Science, 24(3).
[12] A. Houston, B. Argrow, J. Elston, and J. Lahowetz. Unmanned aircraft observations of airmass boundaries: The collaborative Colorado-Nebraska unmanned aircraft system experiment. In 24th Conference on Severe Local Storms, 2008.
[13] COST European Cooperation in the field of Scientific and Technical Research. Action ES0802: Unmanned aerial systems (UAS) in atmospheric research. &action_number=ES0802.
[14] H. Ishida, T. Nakamoto, T. Moriizumi, T. Kikas, and J. Janata. Plume-tracking robots: A new application of chemical sensors. The Biological Bulletin, 200(2):222.
[15] J. Jiang and M. Worboys. Detecting basic topological changes in sensor networks by local aggregation. In Proceedings of the 16th ACM SIGSPATIAL international conference on Advances in geographic information systems. ACM.
[16] C. Kunz, C. Murphy, R. Camilli, H. Singh, J. Bailey, R. Eustice, M. Jakuba, K. Nakamura, C. Roman, T. Sato, et al. Deep sea underwater robotic exploration in the ice-covered Arctic ocean with AUVs. In Intelligent Robots and Systems (IROS), IEEE/RSJ International Conference on. IEEE.
[17] W. Li, J.A. Farrell, R.T. Cardé, et al. Tracking of fluid-advected odor plumes: Strategies inspired by insect orientation to pheromone. Adaptive Behavior, 9(3-4).
[18] L. Marques, U. Nunes, and A.T. de Almeida. Olfaction-based mobile robot navigation. Thin Solid Films, 418(1):51-58.
[19] M. Patan and D. Ucinski. Robust activation strategy of scanning sensors via sequential design in parameter estimation of distributed systems. In Parallel Processing and Applied Mathematics, volume 3019 of Lecture Notes in Computer Science. Springer Berlin / Heidelberg.
[20] P. Ramos and N. Abreu. Environmental monitoring of wastewater discharges using an autonomous underwater vehicle. In IROS 2010 Workshop on Robotics for Environmental Monitoring.
[21] J. Reuder, P. Brisset, M. Jonassen, M. Muller, and S. Mayer. The Small Unmanned Meteorological Observer SUMO: A new tool for atmospheric boundary layer research. Meteorologische Zeitschrift, 18(2).
[22] M. Shi and S. Winter. Detecting change in snapshot sequences.
In Sara Fabrikant, Tumasch Reichenbacher, Marc van Kreveld, and Christoph Schlieder, editors, Geographic Information Science, volume 6292 of Lecture Notes in Computer Science. Springer Berlin / Heidelberg.
[23] Z. Song, Y.Q. Chen, J.S. Liang, and D. Ucinski. Optimal mobile sensor motion planning under nonholonomic constraints for parameter estimation of distributed systems. International Journal of Intelligent Systems Technologies and Applications, 3(3).
[24] S. Subchan, B.A. White, A. Tsourdos, M. Shanmugavel, and R. Zbikowski. Dubins path planning of multiple UAVs for tracking contaminant cloud. In Proceedings of the 17th World Congress of the International Federation of Automatic Control. IFAC.
[25] A. van den Kroonenberg, T. Martin, M. Buschmann, J. Bange, and P. Voersmann. Measuring the Wind Vector Using the Autonomous Mini Aerial Vehicle M²AV. Journal of Atmospheric and Oceanic Technology, 25(11).
[26] A.C. Walkowski. Model based optimization of mobile geosensor networks. In The European Information Society, Lecture Notes in Geoinformation and Cartography. Springer Berlin / Heidelberg.
[27] Sensor Web, Web-based Geoprocessing and Simulation Lab. IfgiCopter: A microcopter as a sensor platform.
[28] M. Worboys and M. Duckham. Monitoring qualitative spatiotemporal change for geosensor networks. International Journal of Geographical Information Science, 20(10).
[29] D.R. Yoerger, A.M. Bradley, M. Jakuba, C.R. German, T. Shank, and M.A. Tivey. Autonomous and remotely operated vehicle technology for hydrothermal vent discovery, exploration, and sampling.
[30] D.R. Yoerger, M. Jakuba, A.M. Bradley, and B. Bingham. Techniques for deep sea near bottom survey using an autonomous underwater vehicle. Robotics Research, 2007.
129 Simon JIRKA; Henning BREDEL

Building Tracking Applications with Sensor Web Technology

Simon JIRKA a,1; Henning BREDEL a
a 52 North Initiative for Geospatial Open Source Software GmbH

Abstract. This article introduces a new kind of Sensor Web application for near real-time ship tracking. Typically, most existing Sensor Web systems concentrate on the monitoring of certain thematic attributes (e.g. water level, meteorological parameters or air quality) that are visualised as diagrams. The system presented in this paper has a different focus: it shows how Sensor Web components can be used for handling observations that concern the dynamically changing position geometries of objects (ships). Furthermore, an approach is presented for coupling a Sensor Web based tracking system with other conventional spatial data infrastructure components (i.e. Web Map Services) to provide user-friendly visualisations.

Keywords. Sensor Web, Ship Tracking, Automatic Identification System

Introduction

In recent years more and more applications based on Sensor Web technology have been deployed. Most of these applications were built to handle observations of all kinds of environmental phenomena like environmental pollution [1], hydrological measurements [2] or oceanographic data [3]. Usually, these implementations concern time-dependent thematic parameters for certain fixed locations that are visualised as time-series plots. This article introduces a new type of application which demonstrates that Sensor Web technology can also be used for tracking moving objects like ships. The next sections offer a more detailed view of the developed system, which is the result of a project of the Open Source Initiative 52 North and the Service Centre Information Technology of the Federal Ministry of Transport, Building and Urban Development (DLZ-IT BMVBS). The developed system has to be seen in the context of existing (web-based) GIS applications that are currently available.
Usually there are specific applications which provide functionality for displaying tracking data (e.g. Flightradar24.com and MarineTracking.com). However, for integrating tracking functionality into (internal) spatial data infrastructures, the presented system was developed as a complementary standards-based approach.

1 Corresponding Author
1. Technological Background and System Architecture

The ship tracking data used in the project is gathered by the Automatic Identification System (AIS). This is a system which is mandatory for most ships that are active in commercial shipping. Usually a ship has to be equipped with an AIS device which comprises a GPS receiver for determining the ship position and a radio transmitter that broadcasts the ship position as well as further data about the ship (e.g. destination, information about dangerous goods, dimensions of the ship, etc.). A base station (e.g. operated by the Waterways and Shipping Administration of the Federal Government) is then able to collect the transmitted ship data for further processing and use. In the context of the presented project, access to the collected ship tracking data is provided through a SOAP web service that is based on a non-standardised interface. In order to achieve the seamless integration of the ship tracking data into spatial data infrastructures, the Sensor Web Enablement (SWE) framework of the OGC was chosen [4]. More specifically, a Sensor Observation Service (SOS) was set up that encapsulates the proprietary interface of the SOAP web service so that the AIS ship tracking data becomes accessible in a standardized manner. Thus, every client capable of understanding spatial observations provided by an SOS is able to retrieve the ship tracking data and to make them available to users. Finally, for creating user-friendly visualisations of the ship tracking data, an OGC Web Map Service (WMS) was set up. This WMS instance, which is based on the 52 North OX-Framework [5], is able to consume the latest ship tracking data for a certain area from an SOS and to render corresponding maps. These maps can be displayed by any WMS-compliant client application.
Within the project an OpenLayers-based client was developed that features an automatic update mechanism so that every five seconds a new map is displayed. This makes it possible to watch the ship movements directly within the WMS client.

Figure 1. Overview of the system.

An overview of the system is shown in Figure 1. Figure 2 depicts an example of a map which shows the ship locations as well as additional labels containing information about the speed and course of the individual ships. In summary, our work comprised the development of an approach for mapping tracking data to the SWE information model, the design of a service chain that allows adapting a proprietary data source through a proxy to the SWE interfaces, and the implementation of a WMS façade for integrating the tracking data also into legacy web mapping clients.
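To illustrate the kind of mapping involved, the following Python sketch parses a simplified, entirely hypothetical observation payload into ship position records suitable for map rendering. The real service chain uses the standardized O&M/SWE XML encodings (with namespaces) and the 52 North implementations, not this ad-hoc format; all element names and values below are invented.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified observation payload standing in for a real
# SOS GetObservation response (which uses full O&M/SWE Common encodings).
SAMPLE = """
<observations>
  <observation>
    <mmsi>211234560</mmsi>
    <time>2011-05-10T12:00:05Z</time>
    <position lat="52.12" lon="8.75"/>
    <speedKnots>7.4</speedKnots>
  </observation>
  <observation>
    <mmsi>211987650</mmsi>
    <time>2011-05-10T12:00:07Z</time>
    <position lat="52.15" lon="8.80"/>
    <speedKnots>10.1</speedKnots>
  </observation>
</observations>
"""

def parse_positions(xml_text):
    """Extract one record per ship observation for map rendering."""
    root = ET.fromstring(xml_text.strip())
    ships = []
    for obs in root.iter("observation"):
        pos = obs.find("position")
        ships.append({
            "mmsi": obs.findtext("mmsi"),
            "time": obs.findtext("time"),
            "lat": float(pos.get("lat")),
            "lon": float(pos.get("lon")),
            "speed_knots": float(obs.findtext("speedKnots")),
        })
    return ships

ships = parse_positions(SAMPLE)
```

A client polling such a service every few seconds and re-rendering the returned positions reproduces, in miniature, the update mechanism described above.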
2. Conclusion

The experiences gained in the project showed that Sensor Web technology is capable of handling near real-time ship tracking data. Within the area covered by the project (the Weser River between Bremen and Minden) it was possible to achieve update rates of less than five seconds with a latency of a few seconds. The number of ships travelling in the area covered by the system was usually in the range of 50 to 60. By optimising the service architecture, a significantly larger number of objects can be handled as well.

Figure 2. WMS layer showing ship tracking data.

The system developed within the project had the character of a feasibility study. Thus, for the future, the speed of the data delivery can be further enhanced. The main factor limiting the performance was the relatively long chain of services used for delivering the AIS data. In the future this can be optimized by relying on efficient caching mechanisms. Furthermore, it might be interesting to deploy event-based approaches that ensure the delivery of new tracking information as soon as it becomes available. In summary, it can be concluded that the Sensor Web framework is capable not only of delivering conventional time series data but also of handling near real-time tracking data. By relying on the interoperable SWE interfaces it is possible to apply the presented approach to any other kind of tracking application besides ship tracking.

References

[1] S. Jirka, A. Broering, and C. Stasch, Applying OGC Sensor Web Enablement to Risk Monitoring and Disaster Management, GSDI 11 World Conference, Rotterdam, Netherlands, June.
[2] S. Guru, P. Taylor, H. Neuhaus, Y. Shu, D. Smith, and A. Terhorst, Hydrological Sensor Web for the South Esk Catchment in the Tasmanian State of Australia, eScience, IEEE Fourth International Conference on, December 2008.
[3] L. Bermudez, P. Bogden, E. Bridger, G. Creager, D. Forrest, and J.
Graybeal, Toward an ocean observing system of systems, OCEANS 2006, 2006, 1-4.
[4] M. Botts, G. Percivall, C. Reed, and J. Davidson, OGC Sensor Web Enablement: Overview and High Level Architecture, Lecture Notes in Computer Science, vol. 4540.
[5] A. Bröring, E.H. Juerrens, S. Jirka, and C. Stasch, Development of Sensor Web Applications with Open Source Software, Proceedings of the First Open Source GIS UK Conference, Nottingham, UK.
132 Günther SAGL et al.

Web-Based Near Real-Time Geo-Analyses of Environmental Sensor Measurements

Günther SAGL 1a; Michael LIPPAUTZ a; Manfred MITTLBÖCK a; Bernd RESCH a; Thomas BLASCHKE a
a Studio ispace, Research Studios Austria

Abstract. This paper demonstrates two web-based applications for near real-time geo-analysis. Environmental sensor measurements are directly integrated in a fully service-oriented workflow. Emphasis is put on rapid web-based dissemination of in-situ data (points) and their interpolation results (lines, polygons, or surfaces) for web clients such as Google Earth or common web browsers. One aim of such applications is to enhance time-critical spatial decision support in crisis management.

Keywords. web-based geo-processing, standardised environmental monitoring

Introduction

Nowadays, near real-time analyses of vast amounts of sensor information are crucial for decision-support systems utilized for crisis management. Geographically oriented perspectives on such sensor data might enhance the spatio-temporal awareness of decision makers. Today's technology is already capable of generating measurements of environmental phenomena that can be (pre-)filtered and quality assured. Additionally, standardized web services are able to deliver this information in real-time by means of smart in-situ sensors [1, 2]. Distributed geo-sensor networks in combination with Geographic Information Systems (GIS) can be deployed intelligently to automatically generate multidimensional information beyond point measurements through web-based geo-processing routines [3, 4]. Such information could, for example, inform the general public with live weather data or enhance crisis management with near real-time localisation of harmful substances such as toxic gases or radioactivity.
In order to demonstrate how such technology can be used to assist time-critical decision support, we developed two web applications for near real-time geo-analyses of environmental sensor measurements. In contrast to previous research, we integrate these measurements directly into GIS. We show live interpolation results of temperature and gamma radiation dose rate measurements. The utilization of standardized services enables the seamless integration of these measurements and their analysis results into a variety of other systems, including widely accepted visualization clients such as Google Earth.
1. Methodology

Environmental measurements are usually available in a variety of well-established but mostly heterogeneous, and thus incompatible, systems. Recent standardization efforts tackle this problem and enhance data accessibility and integration. We therefore follow the live geography approach [1], which fulfils the needs of environmental monitoring almost perfectly in terms of interoperability. Up-to-date environmental in-situ sensor measurements are requested from accurately calibrated weather stations and highly mobile intelligent sensor pods (see [5] for a detailed description) using the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS). In order to integrate these measurements in real-time into GIS, we developed an SOS plug-in for ESRI ArcGIS, which we utilize as a geo-processing engine. As a result, spatio-temporal data (e.g. temperature) provided via SOS are directly integrated into the (geo-)processing workflow. In-situ sensor measurements and their geo-processed results, i.e. interpolations, are then published as standardized web services. Client-side usability includes easily interpretable visualization of interpolation results. Thus, emphasis has been put on simple user interface design and the use of widely accepted visualization clients (e.g. Google Earth).

2. Results

Figure 1 illustrates elevation-corrected live kriging interpolation results of temperature measurements from fixed weather stations within a mountainous region. Two selected clients are shown herein, ArcGIS Explorer (left) and Google Earth (right).

Figure 1: web-based live kriging interpolation of temperature measurements (elevation corrected)

Figure 2 shows Inverse Distance Weighting (left) and kriging (right) interpolation results of gamma radiation dose rate measurements obtained from the intelligent sensor pods mentioned above.
During a radiation safety exercise, two scenarios were conducted: placement of one radiation source (Figure 2, left) and of two radiation sources (Figure 2, right).
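The localisation idea (interpolate the point measurements to a surface and inspect its maximum) can be sketched in a few lines of NumPy. Only the IDW variant is shown, and the coordinates and dose-rate values below are invented for illustration; the exercise data itself is not reproduced here.

```python
import numpy as np

def idw_grid(xy, z, gx, gy, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of point measurements
    onto a regular grid; the weights are normalized to sum to one."""
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    d = np.linalg.norm(pts[:, None, :] - xy[None, :, :], axis=-1)
    w = 1.0 / (d + eps) ** power
    w /= w.sum(axis=1, keepdims=True)
    return (w @ z).reshape(gx.shape)

# hypothetical dose-rate readings (uSv/h) from mobile sensor pods;
# the peak near (40, 60) plays the role of the radiation source
xy = np.array([[10.0, 10.0], [40.0, 60.0], [42.0, 58.0], [80.0, 20.0], [70.0, 80.0]])
z = np.array([0.1, 5.0, 4.2, 0.1, 0.2])

gx, gy = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 100, 101))
surface = idw_grid(xy, z, gx, gy)

# the source candidate is the grid cell with the maximum interpolated value
j = np.argmax(surface)
source = (gx.ravel()[j], gy.ravel()[j])
```

Because the normalized IDW weights form a convex combination, the interpolated surface is bounded by the measured values, so its maximum coincides with the strongest reading, here the assumed source location.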
Figure 2: tailored web application: live spatial interpolation of radiation dose rate measurements (points)

3. Discussion and Conclusion

This paper illustrates web-based geo-analysis of sensor measurements in near real-time. Two selected applications served as examples to highlight the added value of using web-based geo-processing routines, in particular spatial interpolation methods. The temperature measurements and interpolation results shown in Figure 1 underline the flexibility of visualisation on different clients. They also demonstrate the use of rather complex underlying models which also integrate legacy geodata, for example Digital Elevation Models for the elevation correction of temperature measurements. The geo-analysis results shown in Figure 2 were captured in the course of the G2real project exercise 'shining garden', which is described in detail in [6]. In the first case, as shown in Figure 2 (left), the radiation source is clearly identifiable based on the spatial interpolation result. In the second case, two radiation sources were placed and subsequently precisely localised, as shown in Figure 2 (right). The results have been evaluated by domain experts and show that this near real-time approach can enhance time-critical decision making. We conclude that standardized services enable an easy and direct integration of sensor measurements and their interpolation results into a variety of internet-based clients, in particular for visualisation purposes. The rapid web-based dissemination of geo-processing and geo-analysis results improves, depending on the application-specific context, the spatial awareness of the environmental phenomenon of interest. Monitoring the current state of the environment is an important component of various applications and domains, for example public information platforms and time-critical spatial decision support.
Acknowledgement

This research is partially funded by the Austrian Federal Ministry for Science and Research.
References

[1] B. Resch, M. Mittlboeck, F. Girardin, R. Britter, and C. Ratti, Live Geography: Embedded Sensing for Standardised Urban Environmental Monitoring, International Journal on Advances in Systems and Measurements, 2 (2009).
[2] J. K. Hart and K. Martinez, Environmental Sensor Networks: A revolution in the earth system science?, Earth-Science Reviews, 78 (2006).
[3] A. Friis-Christensen, R. Lucchi, M. Lutz, and N. Ostländer, Service chaining architectures for applications implementing distributed geographic information processing, International Journal of Geographical Information Science, 23 (2009).
[4] S. Ninsawat, V. Raghavan, and S. Masumoto, Integration of Web Processing Service and Sensor Observation Service for distributed geoprocessing using real-time data, Geoinformatics, 13 (2008).
[5] B. Resch, M. Mittlboeck, and M. Lippautz, Pervasive Monitoring: An Intelligent Sensor Pod Approach for Standardised Measurement Infrastructures, Sensors, 10 (2010).
[6] G. Sagl, M. Lippautz, B. Resch, and M. Mittlboeck, Near Real-Time Geo-Analyses for Emergency Support: A Radiation Safety Exercise, in 14th AGILE International Conference on Geographic Information Science, Utrecht, The Netherlands, in press.
136 Katharina HENNEBÖHL; Marius APPEL

Towards Highly Parallel Geostatistics with R

Katharina HENNEBÖHL a,1; Marius APPEL a
a Institute for Geoinformatics, University of Muenster, Germany

Abstract. Tools for geostatistical analysis are standard functionality in geographic information systems. The underlying algorithms are well studied and usually optimized for today's state-of-the-art hardware. However, real-time provision of large spatio-temporal datasets, the availability of highly parallel hardware such as programmable graphics processing units (GPUs), and the resulting specification of programming models such as OpenCL or CUDA call for re-thinking established sequential programming paradigms. This paper presents the prototype of the gpugeo R package, which aims at taking advantage of highly parallel computing resources for geostatistical analysis.

Keywords. Parallel Geostatistics, OpenCL, Spatial Interpolation, R

1. Introduction

The R Project for Statistical Computing [9] provides a wide range of well-developed tools for geostatistical analysis. Since open, web-enabled geosensor networks envision real-time availability of large spatio-temporal datasets that report information on environmental change, it would be convenient to improve the available tools using accelerated hardware, in particular for fundamental predictive modeling tasks such as spatio-temporal interpolation. In fact, graphics processing units (GPUs) offer the opportunity for highly parallel processing on virtually every notebook or desktop computer. Yet, their computational power seems to be underemployed due to a lack of integration into common data analysis and GIS software. To bridge this gap, it is necessary to re-think established sequential programming paradigms and to identify parallelizable computing tasks. There are several commercial and open source software projects that aim at integrating the computational power of GPUs into existing software.
We limit our discussion to current open source initiatives relevant to the research field of geostatistics. The GPUML package [12] accelerates machine learning algorithms [8] building upon CUDA, NVIDIA's proprietary parallel computing architecture for GPUs [6]. The R gputools package [1] provides a set of common data mining algorithms also implemented in CUDA [6]. The R magma package enables parallel basic linear algebra operations (BLAS) in hybrid CPU and GPU architectures [13]. However, geostatistics involves, but is not restricted to, data mining algorithms, and accelerating geostatistical functions at the BLAS level is often not optimal since

1 Corresponding Author: Katharina Henneböhl, Institute for Geoinformatics, University of Muenster, Weseler Str. 253, Münster, Germany
matrices in geostatistics often have a functional representation, such as distance or covariance matrices. This paper discusses the prototype of the R gpugeo package, which aims at improving the performance of geostatistical algorithms using the non-proprietary Open Computing Language (OpenCL) [5]. The remainder is organized as follows. Section 2 describes the architecture of the R gpugeo package. Section 3 evaluates accelerated functions for inverse distance weighted (IDW) and kriging interpolation, i.e. spatio-temporal prediction, and discusses some possible extensions for conditional simulation of spatial random fields. Section 4 concludes with relevant findings.

2. R and OpenCL for highly parallel geostatistics

2.1. GPU basics

Highly parallel geostatistics as addressed in this paper aims at taking advantage of the massively multithreaded computing capabilities provided by today's programmable GPUs. A GPU's architecture is designed for single-instruction-multiple-data (SIMD) tasks with high arithmetic intensity, i.e. a high number of numeric operations per memory access. Due to its specific properties, the GPU is a co-processor: it can support the CPU in certain computations rather than replace it. The architecture of a modern GPU can be seen as a grid consisting of blocks. Each block houses a number of work items (or threads) that perform the computations. Each work item can read from and write to a private register. All work items within a block share a local memory, and all blocks have access to the global memory. Among the different kinds of memory on the GPU, global memory has the largest capacity but also the slowest access time. Regular access patterns that make each work item within a block read and write to consecutive global memory locations reduce this cost (memory coalescing).
Thus, optimization of algorithms for the GPU has two main targets: first, define regular access patterns to control global memory accesses, and second, maximize the usage of private registers and local shared memory.

2.2. Architecture of the gpugeo R package

The open source R statistical environment does not provide support for GPUs by default. It is necessary to add specific interfaces. We build upon the R gstat package [7] and use OpenCL to accelerate the spatial interpolation functions. In contrast to NVIDIA's CUDA programming standard, OpenCL is an open standard not only for GPUs but for parallel computing on heterogeneous platforms with CPUs, GPUs and other processors. It is specified by the Khronos Group, a non-profit consortium. OpenCL consists of a programming language (OpenCL C) for writing functions that are executed on a computing device (also called kernels) and an API that interfaces between host applications and the task executions of the computing devices. Kernels are executed in a massively parallel manner and are compiled at runtime for the dedicated devices. Figure 1 illustrates how the gpugeo package interfaces between R and OpenCL, i.e. between the host (CPU) and the highly parallel device (GPU). The gpugeo package comprises a set of R function definitions (.R) and C source files (.cpp) that wrap the OpenCL-specific kernels. The CPU invokes the R functions and the host C source
code. The host C source code is the heart of the R/OpenCL interface: it is responsible for device memory allocation, device initialization and kernel execution. Because of the different types and architectures of devices, different kernels can be implemented for one function, each optimized for a specific device type. The gpugeo package is optimized for GPUs. When the gpugeo package is loaded into an R session, it automatically checks the properties of the available computing devices and compiles the respective kernels.

Figure 1. Architecture of the gpugeo R package: here 'host' represents the CPU, 'device' a highly parallel co-processor, i.e. GPU.

3. Interpolation and simulation with the gpugeo R package

3.1. Spatio-temporal prediction

Spatial and spatio-temporal interpolation algorithms such as IDW and kriging [2] predict quantities at unobserved locations based on a limited number of data points. Usually, the interpolated values are computed as a linear combination of data values. This part of interpolation, to which we will refer as spatio-temporal prediction, can then be expressed as a simple matrix-vector product

ẑ = W z_data    (1)
where W represents the m × n spatial weights matrix, z_data the n × 1 vector of data values, and ẑ the m × 1 vector of interpolated values. An entry w_ji denotes the weight of the i-th data point for predicting the j-th location. In the case of IDW, the w_ji are given as a function of the distance between the data point and the prediction location. A condition for valid IDW predictions is that the w_ji sum up to one. For kriging, the w_ji are determined by the covariance function (or variogram model) of the spatial random process [2]. The IDW implementation for the GPU is straightforward and has a computational complexity of O(nm). Kriging requires the solution of an n × n linear equation system prior to spatio-temporal prediction:

    C q = z_k,   i.e.   q = C⁻¹ z_k        (2)

where C is the variance-covariance matrix of the data z_k and q the n × 1 solution vector of the linear system that replaces the vector z_data in equation (1). The overall complexity of kriging is O(nm + n³), where n³ is the effort for solving the linear equation system. For a comprehensive explanation of how the classic kriging algorithm can be re-formulated to exploit highly parallel computing architectures, the reader is referred to Srinivasan et al. [11]. In our implementation, the linear kriging system is solved on the CPU. The main steps of the GPU-optimized OpenCL algorithm for spatio-temporal prediction can be summarized as follows (according to [12]):

1. Each work item within a GPU block is assigned to compute the weighted linear combination, i.e. the interpolated value, for one prediction location.
2. To reduce accesses to global memory, the data points are loaded into local shared memory. Each work item reads the information of one data point, thus taking advantage of coalesced memory access.
3. If the number of data points exceeds the capacity of local memory, which is likely to occur, the data points are divided into subsets and processed sequentially.
4.
Each work item computes the weights for the data points currently present in local shared memory on the fly and updates the weighted sum.
5. Once all subsets of data points are processed, the resulting weighted sum is written to global memory.

Exploiting the highly parallel GPU architecture for spatio-temporal prediction results in significant acceleration. Table 1 compares the computation time of OpenCL-accelerated gpugeo IDW to gstat IDW. In most cases we obtained speedup factors between 8 and 15. Note that these depend on the technical set-up of the test setting, i.e. not every GPU would compute our implementation faster than any CPU. All our experiments were performed on an NVIDIA Quadro FX 4800 GPU with 1.5 GB global memory and an Intel Xeon Quad-Core 2.67 GHz machine with 6 GB RAM running an Ubuntu Linux operating system. The size of the speedups in Table 1 also illustrates the potential of acceleration for the spatio-temporal prediction step in kriging. The possible overall acceleration of kriging is subject to further investigation and depends on different factors, including the computation time needed for (i) parameter estimation of the covariance model, (ii) solution of the n × n linear equation system, and (iii) spatio-temporal prediction.
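For readers who want to experiment with the prediction step outside the package, the matrix-vector formulation ẑ = W z_data can be sketched as follows. This is an illustrative CPU-side NumPy translation, not the package's OpenCL kernel; the function name and the power parameter default are our own choices.

```python
import numpy as np

def idw_predict(pred_xy, data_xy, z_data, power=2.0, eps=1e-12):
    """Inverse distance weighting expressed as the product of a weights
    matrix W (one row per prediction location) with the data vector."""
    # pairwise distances between m prediction and n data locations -> (m, n)
    d = np.linalg.norm(pred_xy[:, None, :] - data_xy[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)       # raw inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)  # normalize: rows of W sum to one
    return w @ z_data                  # the matrix-vector product W z_data

# two data points with values 0 and 10; predict at the midpoint and on a data point
data_xy = np.array([[0.0, 0.0], [1.0, 0.0]])
z_data = np.array([0.0, 10.0])
pred_xy = np.array([[0.5, 0.0], [0.0, 0.0]])
z_hat = idw_predict(pred_xy, data_xy, z_data)  # midpoint -> 5.0; data point -> ~0.0
```

The GPU version follows the same arithmetic, but each work item computes its weights on the fly instead of materializing W.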
Table 1. Comparison of computation times in seconds (sec) between gpugeo IDW and gstat IDW for different numbers of data and prediction locations. The speedup factor is specific to the test setting (see text for technical details).

Data points | Prediction points | gstat (sec) | gpugeo (sec) | Speedup

3.2. Routines for conditional simulation of random fields

The ideas presented in the previous section can be re-used and expanded for the simulation of spatial random fields. The gpugeo package offers some basic routines for conditional simulation [2] based on covariance matrix decomposition. These routines may complement existing procedures in gstat and other R packages. Due to the limited scope of this paper, we focus our discussion on the main concepts:

1. The conditional simulation procedure starts with drawing a vector of independent random numbers from the standard Gaussian distribution with mean μ = 0 and variance σ² = 1. This step is entirely computed on the GPU [3].
2. By multiplying the vector of independent normally distributed random numbers with a lower triangular Cholesky root of a specified variance-covariance matrix, we obtain a vector of correlated random numbers. The variance-covariance matrix describes the spatial dependence among the simulation locations and, for m simulation locations, has dimension m × m. We compute the lower triangular Cholesky root 'offline' on the CPU and employ the GPU for the matrix multiplication part.
3. The vector of now correlated random numbers is equivalent to an unconditional simulation of a random field and is conditioned to data values using ordinary kriging [2]. Here, we re-use the procedures for spatio-temporal prediction as described in section 3.1.

The above approach exploits a GPU's strength for random number generation and matrix multiplication. Steps 1 and 2 have been discussed in [10] in the context of accelerating financial Monte Carlo simulations.
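Steps 1 and 2, i.e. an unconditional simulation, can be sketched outside the package as follows. This is a CPU-only NumPy illustration under assumed choices (an exponential covariance model, a small 1-D grid, a fixed seed), not gpugeo code; in gpugeo, the random draw and the matrix multiplication run on the GPU while the Cholesky factor is computed on the CPU.

```python
import numpy as np

def exp_cov(xy_a, xy_b, sill=1.0, range_=10.0):
    """Exponential covariance model C(h) = sill * exp(-h / range_)."""
    h = np.linalg.norm(xy_a[:, None, :] - xy_b[None, :, :], axis=2)
    return sill * np.exp(-h / range_)

rng = np.random.default_rng(42)
m = 50
sim_xy = np.column_stack([np.linspace(0.0, 20.0, m), np.zeros(m)])

# step 1: vector of independent standard-normal random numbers
u = rng.standard_normal(m)

# step 2: multiply by the lower triangular Cholesky root of the m x m
# variance-covariance matrix to obtain correlated random numbers,
# i.e. one unconditional simulation of the random field
C = exp_cov(sim_xy, sim_xy) + 1e-10 * np.eye(m)  # jitter for numerical stability
L = np.linalg.cholesky(C)
z_uncond = L @ u
```

Step 3 then conditions z_uncond to the data values via the kriging prediction routine of section 3.1.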
Critical for the computational performance is the number of simulation locations, which determines the size of the covariance matrix. Obtaining several conditionally simulated realizations, however, seems less critical. Roughly speaking, our implementation allows generating 'stacks' of correlated random numbers, i.e. unconditional simulations in matrix form. These are conditioned to a set of data values
within one GPU call, which is equivalent to interpolating several datasets at the same time.

4. Discussion and Conclusion

In this paper, we presented a prototype of the gpugeo R package that integrates highly parallel computing power into the R gstat package, a widely used open source tool for geostatistical analysis. In contrast to the majority of other software initiatives that target the integration of highly parallel computing power into existing software, the gpugeo R package builds upon the non-proprietary OpenCL programming model. Classic geostatistics usually comprises four core components: (i) model selection, (ii) parameter estimation, (iii) prediction and (iv) simulation. The current gpugeo implementation particularly addresses spatio-temporal prediction and provides accelerated functions for IDW and kriging interpolation. Nevertheless, the underlying ideas can be expanded to geostatistical simulation, employing e.g. efficient random number generation and matrix multiplication. Thus, the gpugeo package also offers some highly parallel routines for unconditional and conditional simulation of (spatial) random fields, which may complement existing procedures in gstat and other R packages. At the stage of this paper, the architecture of the gpugeo R package follows a hybrid approach, taking advantage of CPU and GPU capabilities in a personal computing environment. Whereas IDW is an example of an algorithm that can be executed completely on the GPU, kriging and the routines for conditional simulation are examples of hybrid implementations. Performance tests indicate significant acceleration for both, promising considerable benefit for GIS applications such as, but not restricted to, (real-time) mapping from sensor networks. We conclude that, even if only a part of the overall task complies with the highly parallel paradigm, re-thinking wider classes of geostatistical algorithms is worth the effort.
Moreover, advances in multi-GPU computing (e.g. [4]) offer numerous perspectives not only for geostatistics but also for further GIS-related modeling and image processing tasks.

References

[1] Buckner, J., Wilson, J., Seligman, M., Athey, B., Watson, S., Meng, F. (2010). The gputools package enables GPU computing in R. Bioinformatics 26(1).
[2] Cressie, N. (1993). Statistics for Spatial Data. Wiley Series in Probability and Statistics.
[3] Howes, L., Thomas, D. (2007). Efficient random number generation and application using CUDA. In Nguyen, H. (Ed.) GPU Gems 3.
[4] Kindratenko, V., Enos, J., Shi, G., Showerman, M., Arnold, G., Stone, J., Phillips, J., Hwu, W. (2009). GPU clusters for high-performance computing. In Proceedings of the Workshop on Parallel Programming on Accelerator Clusters (PPAC 09), New Orleans, Louisiana, USA, 2009.
[5] Munshi, A. (2010). The OpenCL Specification. The Khronos Group. URL org/registry/cl/specs/opencl-1.1.pdf
[6] NVIDIA (2008). NVIDIA CUDA Programming Guide 2.0. URL download.nvidia.com/compute/cuda/2_0/docs/NVIDIA_CUDA_Programming_Guide_2.0.pdf
[7] Pebesma, E. (2004). Multivariable geostatistics in S: the gstat package. Computers & Geosciences 30(7).
[8] Rasmussen, C., Williams, C. (2005). Gaussian Processes for Machine Learning. The MIT Press.
[9] R Development Core Team (2010). R: A language and environment for statistical computing. R
Foundation for Statistical Computing, Vienna, Austria. URL R-project.org/.
[10] Singla, N., Hall, M., Shands, B., Chamberlain, R. D. (2008). Financial Monte Carlo Simulation on Architecturally Diverse Systems. In Proc. of Workshop on High Performance Computational Finance, Nov.
[11] Srinivasan, B. V., Duraiswami, R., Murtugudde, R. (2010a). Efficient Kriging for Real-Time Spatio-Temporal Interpolation. 20th Conference on Probability and Statistics in the Atmospheric Sciences, American Meteorological Society, January.
[12] Srinivasan, B. V., Qi, H., Duraiswami, R. (2010b). GPUML: Graphical processors for speeding up kernel machines. Workshop on High Performance Analytics: Algorithms, Implementations, and Applications. SIAM Conference on Data Mining, April.
[13] Volkov, V. and Demmel, J. (2008). Benchmarking GPUs to tune dense linear algebra. In Proceedings of SC 08. IEEE Press, Piscataway, NJ, USA.
Agricultural land use dynamics in Tenerife (Canary Islands): The development of fallow land as resettlement area for adjacent natural ecosystems

Sebastian GÜNTHERT a,b; Alexander SIEGMUND a,b; Simone NAUMANN a
a Research Group for Earth Observation (rgeo), Department of Geography, University of Education Heidelberg, Heidelberg, Germany
b Institute of Geography, Heidelberg University, Heidelberg, Germany

Abstract. Since the middle of the 1960s, the island of Tenerife has been subject to an economic change from an agrarian to a service-based society, mainly focused on tourism. This development led not only to an increasing expansion of infrastructure near the coasts but also to increasing fallow land in higher and more remote areas. The presented study aims at modelling agricultural land use changes in order to detect fallow lands as potential regeneration areas for natural ecosystems on Tenerife. This formerly cultivated land can be seen as a major factor influencing natural reforestation and renaturation of sensitive ecosystems (e.g. laurel forest or pinewood). It provides potential space into which adjacent ecosystems can spread, so that a natural resettlement of formerly cleared and agriculturally used land can come into effect. For this purpose, an object-based classification of satellite images covering the last thirty years is performed, followed by a change detection analysis on the basis of a post-classification comparison. Taking into account the different local and global driving forces for these changes, the spatial development of fallow land will then be simulated and visualised. Based on these results, a prospection of the possible resettlement trend in fallow lands through different sensitive ecosystems can be given.

Keywords. Tenerife, Object-based LULC classification, Change detection analysis, Multi-Agent-System modelling
1. Introduction and objectives

On Tenerife, political and economic developments have led to a transformation process over the last decades (especially induced by expansive tourism), which caused tendencies towards concentration and intensification of agricultural land use in specific areas as well as agricultural set-aside and rural exodus. Since the 1960s, mass tourism on the island has increased from 1.3 million tourists in 1978 to about 6 million a year today [8]. Today more than 75% of the employees on Tenerife work in the service-based tourist sector, whereas in 1978 this share was only 56%. At about the same time the cultivated area decreased from about ha in 1982 to ha in. The number of small-scale farms with a parcel size under 5 ha also decreased from to ha (from ) [2].
Due to these modifications in the economic sectors, significant changes in land use and land cover (LULC) can be observed. The tourism-induced development of the service sector in the last few decades led to migration from the economically disadvantaged rural areas to the urban tourist centres, resulting in a spatial concentration of the population and settlements near the coasts and increasing fallow land in higher and more remote areas [6]. However, these LULC changes will also have an impact on the future development of natural ecosystems on Tenerife. The fallow land in peripheral regions can generally be seen as potential area for natural reforestation and renaturation. It provides space into which adjacent ecosystems can spread, so that a natural regeneration of the formerly cleared and agriculturally used land can come into effect. The central hypothesis of the project is that there will be a further increase of agricultural fallow land on Tenerife in the next decades, driven by the change from an agrarian to a service-based society. These areas will have a high environmental value with regard to the preservation of sensitive ecosystems like the laurel forest and its numerous endemic species. To address and valuate the spatial increase of ecologically valuable areas, it is first proposed to analyse and model the future trend of agricultural land use changes on Tenerife (with special respect to fallow land) under different economic scenarios using GIS and remote sensing based methods. After detecting those possible hot spots of natural regeneration, a prospection of the possible resettlement trend through different sensitive ecosystems can be given.

2. The study area Tenerife Island

The study area is located about 350 km off the western coast of Morocco (see fig. 1).
Tenerife is the largest and highest (2052 km², 3718 m at Pico del Teide) island in the Canary Archipelago (27–29 N, W), with a very mountainous landscape of volcanic origin, dating back to Ma [3].

Fig. 1: Location of the Canary Islands and the research area Tenerife [2]

Due to the different mountain ranges in conjunction with the exposure to the humid northeast trade winds, the research area furthermore shows an enormous spatial variation in precipitation, with a subtropical-arid climate in the south (mean annual precipitation of approximately 100 mm) and a more humid northern part with a mean
annual precipitation of mm (up to over 900 mm in the highest northern and northeastern parts of the Anaga massif) [1, 3]. Both parts of the island (especially the south) feature developed agricultural areas, which can be divided into two types: a dynamic and more developed type aimed at export (plantations) and a more traditional second type for local consumption and the domestic market. In general, agriculture has drastically transformed the landscape of Tenerife, mainly through the building of large terraces to obtain flat agricultural surfaces. The use of greenhouses is also a frequent practice today. The main crops grown for export are bananas and tomatoes. Agriculture for the domestic market consists mainly of potatoes, wine grapes and fruit orchards. These crops are primarily localised in a region between m above sea level [12].

3. Research methodology

The study is based on three main working steps: (1) remote sensing techniques to obtain information about agricultural land use changes, (2) spatial modelling techniques to analyse the future spatial development of fallow land and (3) a GIS-based prospection of resettlement trends. These will be explained in detail below.

3.1. LULC classification and multiresolutional change detection analysis

In order to obtain information about the agricultural development of the last 30 years, medium resolution satellite images from 1978 (Landsat 3), 1988 (Landsat 5), 1998 (SPOT 4) and 2010 (RapidEye) are classified by an object-based classification method. To this end, a multi-scale, knowledge-based segmentation procedure will be developed, with a main focus on the exact differentiation of the main ecosystems and their subsystems as well as settlements, different forms of agriculture and fallow land (fig. 2). To achieve this aim, additional geodata like a Digital Elevation Model (DEM), digital soil and geological maps etc.
and, in addition, information extracted from the satellite images (NDVI, Principal Component Analysis etc.) will be integrated into the segmentation algorithm. The generated LULC classifications are then analysed by a change detection analysis on the basis of a post-classification comparison. In this approach, two independently classified and co-registered satellite images are compared pixel-wise by generating a change matrix [4]. The main advantage of this method is the possibility to merge land use classes which present very different spectral signatures (e.g. due to different seasonal recording dates) into the same land use. Thus the procedure, based on the comparison of independent classifications, is less sensitive to radiometric variations between the scenes and is more appropriate when dealing with data recorded at different dates [6]. The aim of this method is to detect two main change types: (1) the change from 'agricultural areas' to 'fallow', to obtain information about the spatial development and the main regions of recently abandoned land, and (2) the resettlement of fallow land by adjacent ecosystems. By analysing this second change type, potential resettlement trends on formerly abandoned agricultural land through the ecosystems, as well as different influencing factors for renaturation, shall be identified.
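The pixel-wise cross-tabulation behind a post-classification comparison can be illustrated with a toy example; the class codes and the two small 'classifications' below are hypothetical, and NumPy merely stands in for the actual image processing software.

```python
import numpy as np

# hypothetical class codes: 0 = agriculture, 1 = fallow, 2 = forest
lulc_t1 = np.array([[0, 0, 2],
                    [0, 1, 2],
                    [1, 1, 2]])
lulc_t2 = np.array([[1, 0, 2],
                    [1, 1, 2],
                    [1, 2, 2]])

n_classes = 3
# change matrix: entry (i, j) counts pixels classified as i in the first
# scene and as j in the second scene
change_matrix = np.zeros((n_classes, n_classes), dtype=int)
np.add.at(change_matrix, (lulc_t1.ravel(), lulc_t2.ravel()), 1)

agri_to_fallow = change_matrix[0, 1]    # change type (1): agriculture -> fallow
fallow_to_forest = change_matrix[1, 2]  # change type (2): resettlement of fallow land
```

Because the two inputs are classified independently, classes with differing spectral signatures can be merged before the cross-tabulation, which is exactly the robustness argument made above.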
Fig. 2: Target class structure of the object-based LULC classification

3.2. Modelling the future spatial development of agricultural fallow land

Once the change detection analysis is realised, it is necessary to identify the different influencing factors which are responsible for agricultural land use changes. In general, these so-called driving forces can be subdivided into different groups: (1) socioeconomic drivers, for example different price trends (of e.g. water, crops, level of wages), the development of employment in agriculture and the tourist sector (as an alternative income for farmers) etc.; (2) proximate drivers, which can be seen as land management variables, for example regional planning or the establishment of nature protection areas [14]; and (3) location-specific drivers, which do not 'drive' land use changes directly, but can influence land abandonment decisions, for example the closeness of the agricultural areas to settlements or the location of fields in very mountainous areas. The statistical connections between agricultural land use changes and driving forces will be identified by the use of correlation and regression analyses, followed by a causal proof to avoid statistically significant but not causally meaningful correlations. Once these connections are detected, it is also possible to construct different future scenarios of the socioeconomic development on Tenerife. These various scenarios are then used as a basis for the subsequent land use modelling. Using the detected driving forces and land use changes, an adequate modelling technique for the simulation of the future spatial development of agricultural land use will be developed. Multi-Agent-System models of land use and land cover change (MAS/LUCC models), a combination of two approved modelling techniques, seem to be well suited for this step.
They combine a cellular model that represents the landscape of interest with an agent-based model that simulates decision making by individual agents, explicitly addressing interactions among individuals. The cellular model, as the spatial modelling framework, further includes biogeophysical and ecological submodels, whereas the agent-based model represents human decision
making. So the first model is part of the agents' environment, in which agents act according to given rules [9, 10, 11].

3.3. GIS-based prospection of possible resettlement by different ecosystems

With the use of the land use scenarios and the former analysis of past resettlement trends, it is possible to give a prospection for different ecosystems and their potential expansion on formerly cultivated areas. However, it must be noted that a successful renaturation of fallow land depends on many factors: on the one hand on soil characteristics and the type and intensity of the former land use, and on the other hand on characteristics of the respective ecosystem, e.g. the kind of seed dispersal, growth rates etc. These factors will be considered in the GIS-based resettlement analyses.

4. First results and outlook

First results of the object-based segmentation procedure are shown in fig. 3. The example comprises the region around Puerto de Santiago in the south-western part of Tenerife. The algorithm, generated with the object-based image analysis software eCognition 8.0 Developer, has been developed and tested with RapidEye imagery from April.

Fig. 3: LULC classification of SW-Tenerife

To obtain more detailed information about the formerly agriculturally used areas on Tenerife, an adequate detection method additionally needs to be developed which allows a more exact identification of older agricultural fallow land with a higher level of natural succession. Previous field trips on Tenerife showed that, especially in mountainous areas, crops are solely planted in terraces. This leads to the assumption that long-term fallow areas can mainly be detected together with old agricultural terraces and their specific linear texture. One solution for the acquisition of such areas could be the texture-based
detection and area-wide extraction of linear terrace structures in current orthophoto images of Tenerife and their subsequent integration into the existing LULC classification. A first object-based algorithm in this respect has been developed and now needs to be validated. After further modification of the classification technique with regard to its applicability to satellite imagery from other sensors, and after the subsequent change detection on the basis of a post-classification comparison, it is necessary to find the controlling factors (driving forces) for agricultural land use changes. These will have a considerable influence on how the MAS/LUCC model can finally be implemented, especially with regard to the agent architecture (agents' behavior) and the interaction with the agents' environment.

Acknowledgements

We thank the DLR for the allocation of data from the RapidEye Science Archive and CNES for providing SPOT 4 and SPOT 5 imagery.

References

[1] F. Aguilera-Klink, E. Pérez-Moriana, and J. Sánchez-Guarcía, The social construction of scarcity. The case of water in Tenerife (Canary Islands), Ecological Economics 34 (2000).
[2] Instituto Canario de Estadística, ed., Canarias en Cifras 2008, Gobierno de Canarias.
[3] S. Günthert, A. Siegmund and S. Naumann, Modeling and valuation of ecological impacts of land cover and land use changes on Tenerife (Canary Islands), Earth Resources and Environmental Remote Sensing/GIS Applications, edited by Ulrich Michel, Daniel L. Civco, Proceedings of SPIE 7831 (2010).
[4] J. Keuchel, S. Naumann, M. Heiler, and A. Siegmund, Automatic land cover analysis for Tenerife by supervised classification using remotely sensed data, Remote Sensing of Environment 86 (2003).
[5] T. Lillesand, R. Kiefer, and J. Chipman, Remote Sensing and Image Interpretation, Wiley, Hoboken, NJ, 5th ed.
[6] J.-F.
Mas, Monitoring land-cover changes: a comparison of change detection techniques, International Journal of Remote Sensing 20:1 (1999).
[7] S. Naumann and A. Siegmund, Modellierung der Siedlungsentwicklung auf Teneriffa auf Basis von multikriteriellen Entscheidungsverfahren und Zellulären Automaten, Salzburger Geographische Arbeiten 43 (2008).
[8] S. Naumann, Modellierung der Siedlungsentwicklung auf Tenerife. Eine fernerkundungsgestützte Analyse zur Bewertung des touristisch induzierten Landnutzungswandels, Heidelberger Geographische Arbeiten 125 (2008).
[9] D. C. Parker, S. M. Manson, M. A. Janssen, M. J. Hoffmann, and P. Deadman, Multi-agent systems for the simulation of land-use and land-cover change: A review, Annals of the Association of American Geographers 93:2 (2003).
[10] P. Schreinemachers and T. Berger, Land use decisions in developing countries and their representation in multi-agent systems, Journal of Land Use Science 1:1 (2006).
[11] P. H. Verburg, P. P. Schot, M. J. Dijst, and A. Veldkamp, Land use change modelling: current practice and research priorities, GeoJournal 61 (2004).
[12] S. Villa, A. Finizio, R. Diaz Diaz and M. Vighi, Distribution of organochlorine pesticides in pine needles of an oceanic island: The case of Tenerife (Canary Islands, Spain), Water, Air, and Soil Pollution 146 (2003).
Towards Standards-Based, Interoperable Geo Image Processing Services

Peter BAUMANN a,b,1; Roger BRACKIN c; Michael OWONIBI a; Tim CUTSWORTH
a Jacobs University, Bremen, Germany
b rasdaman GmbH, Bremen, Germany
c Envitia Ltd, Horsham, UK

Abstract. The OGC Web Processing Service (WPS) Standard has received high attention in the geo community due to its generality: in principle, any algorithm can be offered via an open WPS-based interface. However, this generality comes at a price: normally, WPS instances are not interoperable, as only the function signature, but not the semantics, is described in a machine-readable fashion. To achieve WPS interoperability we propose to establish specializations ('application profiles' in OGC nomenclature) based on well-defined service components, and we present such an application profile for raster data. It is based on the OGC Web Coverage Processing Service (WCPS), which defines a raster processing language.

Keywords. OGC, open standards, WPS, WCPS, raster services

1. Introduction

Server-side processing capabilities are of steadily growing importance for geo services. Quality of service can be distinctly improved when shifting from a paradigm of data stewardship to service stewardship, in particular as the sheer amount of data increasingly prohibits shipping data from the server for client-side processing. The alternative is a code shipping approach where the server, possibly in some scripting language, can be tasked by the clients. However, this raises the issue of the server-side programming interface. Recently, the OGC Web Processing Service (WPS) Standard [5] has received high attention in the geo service community due to its generality: in principle, any algorithm can be offered via a WPS interface.
However, this generality comes at a price: normally, WPS instances are not interoperable, as only the function signature, but not the application semantics, is described in a machine-readable fashion. Concretely, WPS only describes the syntax in a formalized XML structure; semantics is only captured in the Title and Abstract tags containing free text (Figure 1). Hence, there is no way for a client to automatically verify that the server code really performs what is advertised.

Figure 1. Sample WPS 1.0 process description.

1 Corresponding Author.
Database-style query languages show a way forward: their high-level style allows phrasing requests independently of storage structures; they are usually safe in evaluation, so that no single query can block the server forever; and the declarative language design enables extensive server-side optimizations. This role model, which SQL has introduced on alphanumeric data, has been adopted by the Web Coverage Processing Service (WCPS) Standard issued by OGC in 2008 [2]. WCPS can be summarized as 'XQuery for multi-dimensional raster data'. Due to its formal semantics there is a stable foundation for interoperability: all servers will interpret a given query identically. In the VAROS project [7], funded by the European Space Agency (ESA), we have developed a specification draft, backed by an implementation pilot, in which WCPS acts as an Application Profile of WPS. An Application Profile, in OGC nomenclature, is a strict subset of a standard (in terms of this standard's functionality) with the possibility to add extra, application-specific definitions. In the case of WCPS, only a specific process type, ProcessCoverages, is introduced, which receives a WCPS query as input and delivers a result list. Actually, the WPS specification [5] expressly recommends such a procedure for interoperability: 'This document does not specify the specific processes that could be implemented by a WPS. Instead, it specifies a generic mechanism that can be used to describe and web-enable any sort of geospatial process. To achieve interoperability, each process must be specified in a separate document, which might be called an Application Profile of this specification.' The remainder of this contribution is organized as follows. In the next section, we give a brief overview of WCPS so as to enable understanding of the semantic level. Section 3 explains the approach of using the WPS protocol for WCPS query shipping.
The interoperability experiment conducted is described in Section 4, followed by a discussion of use case scenarios in Section 5. Section 6 concludes the paper.

2. WCPS in Brief

The WCPS language resembles a high-level, declarative query language on spatiotemporal geo raster data of unlimited volume [2]. As such, it defines syntax and semantics of expressions for specifying ad-hoc search, extraction, aggregation, and analysis of coverages containing multi-dimensional sensor, image, or statistics data. The following example shows the flavour of the WCPS language; see [4] for an extensive discussion of concepts, expressive power, and design decisions. From a list of 3-D MODIS time series data cubes, it picks those which contain at least one pixel where, over land, the near-infrared component (nir) exceeds 250; of these matches, a particular time slice is delivered as an HDF-EOS file:

for $c in ( ModisTS1, ModisTS2, ModisTS3 ),
    $m in ( LandSurfaceMask )
where
    count( $c.nir > 250 and $m ) > 0
return
    encode(
        ( ($c.nir - $c.red) / ($c.nir + $c.red) )
          [ t: ISO-8601/0/Gregorian+UTC ( 'Sun Mar 22 13:33:29 CET 2009' ) ],
        'HDF-EOS' )
WCPS only defines the abstract service language and is protocol agnostic. The syntax comes in two semantically equivalent flavors: XML and a so-called Abstract Syntax, used in the above example, which lends itself towards XQuery to facilitate a future integration.

3. WCPS over WPS

The WPS Application Profile for Coverage Processing is a draft specification for a WPS supporting the server-side evaluation of WCPS queries on multi-dimensional coverages [1]. To this end, the WPS process type ProcessCoverages is defined. This process has one input parameter, the query string, and a possibly empty list of output parameters, either coverages or scalars (in case of an aggregation query). The XML code below shows a sample WPS Execute request invoking ProcessCoverages, with the input query provided in WCPS Abstract Syntax (it could equally be given in WCPS XML encoding):

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<wps:Execute service="WPS" version="1.0.0">
  <ows:Identifier>ProcessCoverages</ows:Identifier>
  <wps:DataInputs>
    <wps:Input>
      <ows:Identifier>query</ows:Identifier>
      <ows:Title>Query in Abstract Syntax</ows:Title>
      <wps:Data>
        <wps:LiteralData>
          for $c in ( MODIS )
          return encode( $c.red + $c.nir, tiff )
        </wps:LiteralData>
      </wps:Data>
    </wps:Input>
  </wps:DataInputs>
</wps:Execute>

Notably, no change to WPS as such had to be made. This orthogonality keeps all WPS functionality available, such as asynchronous processing. Another advantage is stability against WPS changes: at the time of drafting the specification, only WPS 1.0 was available, although the WPS Standards Working Group is working heavily on WPS 2.0. It is expected that few or no changes will have to be made to the Application Profile once WPS 2.0 is stable and rolled out.

4. The Interoperability Experiment

In the VAROS project, a proprietary geo client, ChartLink, has been coupled with an open-source raster database, rasdaman [8].
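Shipping such an Execute document to a server is plain HTTP POST. The sketch below shows one possible client in Python; the endpoint URL is a placeholder, not an actual VAROS or rasdaman service address.

```python
import urllib.request

WPS_ENDPOINT = "http://example.org/wps"  # placeholder endpoint

# minimal WPS 1.0 Execute document carrying a WCPS query as literal input
execute_doc = """<?xml version="1.0" encoding="UTF-8"?>
<wps:Execute service="WPS" version="1.0.0"
    xmlns:wps="http://www.opengis.net/wps/1.0.0"
    xmlns:ows="http://www.opengis.net/ows/1.1">
  <ows:Identifier>ProcessCoverages</ows:Identifier>
  <wps:DataInputs>
    <wps:Input>
      <ows:Identifier>query</ows:Identifier>
      <wps:Data>
        <wps:LiteralData>for $c in ( MODIS ) return encode( $c.red + $c.nir, tiff )</wps:LiteralData>
      </wps:Data>
    </wps:Input>
  </wps:DataInputs>
</wps:Execute>"""

def build_request(endpoint, body):
    """Wrap the Execute document in an HTTP POST with an XML content type."""
    return urllib.request.Request(
        endpoint,
        data=body.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
        method="POST",
    )

req = build_request(WPS_ENDPOINT, execute_doc)
# urllib.request.urlopen(req) would return the server's response, e.g. the
# encoded coverage or an asynchronous status document
```

Because the profile adds nothing to the WPS protocol itself, any generic WPS client library can submit ProcessCoverages requests this way.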
Figure 2 shows the overall architecture. The interactive client accepts user input and transforms it into a WCPS request, which is shipped to the server using WPS. The rasdaman server receives this request through its petascope component, a Java servlet which decodes WCPS queries, resolves the geo semantics with the aid of its internal metadata, and translates the query into a pure array query of the rasdaman raster query language, rasql. This rasql query is evaluated by the rasdaman engine. The multi-dimensional array data themselves are maintained in a partitioned manner in so-called tiles; each such tile is kept in a PostgreSQL BLOB. The query result is constructed by the rasdaman engine by fetching all relevant tiles into main memory, composing the result by performing all necessary processing steps, encoding the result into the requested data format, and delivering this result back to petascope, which finally forwards it to the client.

Figure 2. ChartLink / rasdaman architecture.

5. Use Case Scenarios

The two stakeholder and end user partners in the project, the UK Royal Navy and the German Governmental Spatial Data Infrastructure Initiative (GDI-DE), have made available suitable sample data sets encompassing bathymetry, elevation, orthoimagery, and several thematic layers; additionally, AVHRR SST images have been downloaded from USGS. On these data sets, the following seven use case scenarios for WPS-based coverage processing have been addressed:

Ortho image retrieval. This consists of simple zoom and pan access to otherwise unmodified data; in other words, it emulates a WMS.

Terrain/bathymetry integration, demonstrating dynamic multi-source fusion by generating a seamless altitude map from a Digital Elevation Model and a Bathymetry Model.

Terrain slope calculation, as a representative of focal operations and general convolutions.

Topographic classification/coloring.

Copyright protection. In this case, a constant image (such as the data provider's logo) is overlaid on all requests. In a practical scenario, a corresponding query sub-expression would be added server-side to enforce stamping.

Decision support. A weighted composition of an elevation layer and several thematic layers is established and colored so as to highlight areas fulfilling particular criteria.
Weights can be adjusted in the client by the user.

Flood analysis, as a representative of what-if analysis types: the combined bathymetry/elevation terrain map is shaded according to assumed high or low tidal or flood situations.

Most of these use cases focus on interactive clients for map visualization tasks (Figure 4). Fully automated (batch) clients probably convey further characteristics to be captured by additional use cases. This is subject to future research.
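The decision-support use case, a user-weighted composition of layers, can be pictured as a per-pixel weighted sum. The sketch below is illustrative only; the layer values and weights are invented, and a real deployment would express this as a WCPS query evaluated server-side:

```python
def weighted_overlay(layers, weights):
    """Per-pixel weighted sum of equally sized raster layers,
    as a client might request for the decision-support use case."""
    size = len(layers[0])
    return [sum(w * layer[i] for layer, w in zip(layers, weights))
            for i in range(size)]

elevation = [10.0, 20.0, 30.0]   # invented elevation layer
thematic  = [1.0, 0.0, 1.0]      # invented thematic layer
# Weights as the user might set them interactively in the client.
score = weighted_overlay([elevation, thematic], [0.1, 5.0])
print(score)
```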
Figure 3. ChartLink query editing dialog.

Figure 4. ChartLink visualization of map overlay (left), weighted classification (center), and projected view of classification (right).

We want to draw particular attention to the way queries are generated in the client. As the WCPS interface defines query strings (or XML documents, respectively) as the only interface, the client is tasked with transforming user input into such queries. We have found three scenarios particularly characteristic:

Completely prefabricated queries, maintained in the client and invoked by the user through a mouse click.

Queries parametrized with geometry. Users navigate to a particular location and activate a virtual layer, whereupon the client patches the corresponding coordinates into the query generating this virtual layer. This we found to be the most frequent use case by far.

Queries parametrized with sub-expressions. Depending on the user's knowledge of the WCPS language, the spectrum ranges from simple expressions to complete queries. For example, a normalized difference index formula, such as the NDVI, could be entered by the user; the client would embed this into an overall query and add appropriate bounding box and resolution parameters. This request type has not been used in VAROS, but in an earlier coupling of rasdaman to an astrophysical simulation tool.

The ChartLink client is powerful enough to allow defining and editing of any such definitions, including interactive parameter instantiation (Figure 3). The language interface has proven outstandingly useful for distributed development and debugging in the project: code snippets could be exchanged easily for discussion, and both syntactic and semantic errors could be spotted easily in the server log by looking at the queries sent.

6. Conclusion

Geo processing has high potential, but currently suffers from a lack of machine-understandable semantics, the more powerful the operations get. However, a concise agreement between client and server is indispensable for Semantic Web services, where there is no human supervision any longer. Envitia Ltd, engaged in the OGC WPS group and a vendor of WPS solutions, has stated: 'Previous research led to the conclusion that WPS offered no significant benefit over a bespoke web service because unlike other OGC services there is no common domain semantics between differing WPSs. WCPS would offer a particular benefit therefore in the areas of environmental data comparison, exploitation and validation which are key interests of many of our customers.'

One might ask, then, why go the WPS way in the first place and not the plain Web Service way? The answer is twofold. First, geo processing standards obviously benefit the community if interoperability can be achieved, and WPS sees an eager take-up in geo communities and by geo tool vendors. Second, the roadmap to WPS interoperability is already described in WPS 1.0 [5]; what we did was exactly to follow this roadmap.

With our work, we aim at lifting raster services to the semantic level. Coupling WPS and WCPS is one step towards this goal (a connector into WCS exists already). The WPS Coverage Processing specification is in fact the first part of a major initiative within OGC to establish altogether five Application Profiles. In VAROS, two independently developed tools, the rasdaman server and the ChartLink client, have been coupled successfully, giving a first indication for interoperability.
In the future, service providers can choose whether to use WCPS raster processing through the WCS or the WPS transport protocol. Services which are mainly data oriented or embedded in standard Web GIS environments, such as combinations of a Web Map Service (WMS) with virtual coverages obtained through dynamic WCPS requests, might prefer WCS. A more processing-oriented environment, which might want to exploit the advanced process handling foreseen for WPS 2.0, might choose WPS.

On a larger perspective, this activity is part of the overall harmonization and integration of coverage-related activities in OGC. Harmonization between GML, SWE Common, and WCS has been achieved already; WPS, O&M, and data exchange formats like GeoTIFF, JPEG2000, and NetCDF are under way. The ultimate vision is that coverages can be exchanged freely, and independently from their particular encoding, between all OGC-based services, thereby getting closer to the Holy Grail of coverage service interoperability.

Acknowledgement

VAROS has been funded by ESA under contract no. ESRIN/22742/09/I-EC. Bremen geo data have been provided by Geoinformation Bremen, Plymouth data by the Royal Navy, and satellite imagery by ESA and NASA.
References

[1] P. Baumann (ed.), OGC WPS Application Profile for Coverage Processing. OGC document.
[2] P. Baumann (ed.), OGC Web Coverage Processing Service (WCPS). OGC document r2.
[3] P. Baumann (ed.), OGC Web Coverage Service (WCS) Core. OGC document r3.
[4] P. Baumann, The OGC Web Coverage Processing Service (WCPS) Standard. Geoinformatica 14(4), 2010.
[5] P. Schut (ed.), OpenGIS Web Processing Service. OGC document r7.
[6] P. Baumann (ed.), OGC GML Application Schema for Coverages. OGC document r1.
[7] last seen: 2011-feb-04.
[8] last seen: 2011-feb-04.
[9] last seen: 2011-feb-04.
Extracting the Evolution of Land Cover Objects from Remote Sensing Image Series

Lúbia VINHAS a,1; Olga BITTENCOURT a; Gilberto CÂMARA a; Sergio COSTA a
a National Institute for Space Research (INPE), Brazil

Abstract. We live in a changing world, and this acceleration of change provides a strong motivation for research in GeoInformatics. Our tools and methods for representing and handling land cover data should be capable of dealing with change. In this work we developed a land cover model to handle the evolution of land cover objects. It consists of an abstract data type ('land cover object') and uses a limited number of spatial operations, such as create and split, to elicit the changes an object suffers in time. We present an application of the model to the data provided by the Brazilian Amazon Deforestation Monitoring Program.

Keywords. Evolving objects, geosemantic algebra, object history, spatiotemporal object evolution.

Introduction

We live in a changing world, and this acceleration of change provides a strong motivation for research in GeoInformatics. Our tools and methods for representing and handling land cover data should be capable of dealing with change, which Singh [1] defines as the different states that objects adopt at distinct observed timestamps. Indeed, there is much work in the literature about modelling and representing change in land cover objects [2] [3] [4] [5]. One of the challenges for modelling geographical change is dealing with the massive spatial data sets generated by remote sensing. Understanding land change using remote sensing data requires more than having access to multitemporal data sets. The researcher needs to build a conceptual model of the processes that cause change. Then, these processes need to be related to objects identifiable in the images. The next step is describing how these objects change from one image to the next.
After the information is organized in a coherent spatio-temporal view, the researcher can relate the evolution of the objects found in the images to their conception of the land cover dynamics. An important question that we should be able to answer is: how can we reconstruct the changes in an area, given a set of snapshots of that area? To help answer this question, this work builds on the notions of image objects and land cover objects previously described in [6]. Image objects are static entities whose existence is tied to the image from which they were extracted. An image object is geometrically represented by one or more polygons which enclose homogeneous areas. They are usually obtained by a segmentation followed by a classification [7].

1 Corresponding Author: Lúbia Vinhas, INPE/DPI, Av. dos Astronautas, 1758, S. José dos Campos, SP, Brazil.

A land
cover object is a spatio-temporal object whose boundaries and properties change, but whose identity remains during its existence. Given a sequence of images from the same area, a land cover object will be associated with one or more image objects extracted from each image. Matching land cover objects to image objects is usually subject to topological restrictions. When a land cover object is matched to two image objects extracted from successive images, most applications require that the image objects have part of their boundaries or interiors in common. Thus, one land cover object might be mapped to many image objects detected in different images taken at different times. We propose to exploit this multitemporal dataset of land information systems by detecting and storing the history of land cover objects that emerge from processes related to change. We do this by applying a limited number of spatial operations (create, update, merge, and split) on the land cover and image objects, according to a set of rules that gives semantics to the land cover objects. We call this the land cover evolution model; it is described in the next section.

1. The Land Cover Evolution Model

There is a common situation in land information systems where data comes from multitemporal remote sensing images. The input is the state of the world at discrete times t_0, t_1, ..., t_n. At a time t_i we have a set of image objects detected from an image taken at t_i that represents the change detected from t_(i-1) to t_i. These are image objects, and there is no link from them to the land cover objects detected at times t_0, ..., t_(i-1). In our model, the application needs to define a suitable set of rules for controlling how an image object (which represents change) relates to land cover objects detected at the same time or previously.
We start with a merge rule that defines when an image object should be spatially joined to an existing land cover object, for example: if two objects are topologically adjacent (they touch), they are joined. This means that the image object detected at this time is the result of a change suffered by an existing land cover object at a previous time. Conversely, we have a split rule that determines when an image object represents the changes suffered by two previously existing land cover objects. For example: if an object is detected and it overlaps a previously existing object, it should be split. The sequence of operations undergone by each land cover object in the data set represents its history and allows us to infer patterns of evolution or identify the agents responsible for the evolution.

1.1. The Evolution Data Set

In order to apply our model, we consider an input data set, called OD (Objects Dataset), that contains the image objects, representing change, detected at a given time. From this dataset we create a second data set to store the corresponding land cover objects, including the history of the image objects that contributed to their current configuration. This data set is called EOD (Evolving Objects Dataset). At the initial time t_0, we retrieve the set of objects from the original OD and insert it into the EOD. For each timestamp from t_1 to t_n, the land cover evolution model combines the image objects from the OD at time t_i with the land cover objects from the EOD at t_(i-1), resulting in the objects in the EOD at time t_i. This sequence of steps is illustrated in Figure 1.
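As a toy illustration of this OD-to-EOD combination step, the sketch below represents object geometries as sets of grid cells and applies only a merge rule based on 4-neighbour adjacency, recording each operation in the object's history. This is a deliberately simplified stand-in for the model, which works on polygons and supports split and update as well:

```python
def touches(a, b):
    """True if any cell of a equals, or is a 4-neighbour of, a cell of b."""
    return any((x + dx, y + dy) in b
               for (x, y) in a
               for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)))

def evolve(eod, image_objects):
    """Combine new image objects (OD at time t_i) with the evolving
    objects (EOD at t_(i-1)): adjacent objects are merged, anything
    else creates a new evolving object; every operation is recorded."""
    for obj in image_objects:
        merged = False
        for rec in eod:
            if touches(obj, rec["cells"]):
                rec["cells"] |= obj
                rec["history"].append("merge")
                merged = True
        if not merged:
            eod.append({"cells": set(obj), "history": ["create"]})
    return eod

eod = [{"cells": {(0, 0)}, "history": ["create"]}]   # EOD at t_0
evolve(eod, [{(0, 1)}, {(5, 5)}])                    # OD at t_1
```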
Figure 1. The Land Cover Evolution Model.

We use a genealogy tree to describe the history of each object. At the lowermost level of the tree, we have the ancestor objects. As these merge and split with others, the tree grows upwards. We implemented the concepts described here as an algebraic model for handling land cover objects. The algebra is described in detail in [8] and [9]. In this algebraic model, we can associate types with the land cover objects and define the evolution rules according to these types. As a practical consideration, evolution rules that depend on topological considerations (such as objects touching each other) may be affected by the geometric matching between two snapshots. In general, the user needs to perform suitable pre-processing operations to ensure that there is good correspondence between data from subsequent time steps. This pre-processing avoids rules being applied incorrectly because of geometric mismatches that fall within the tolerance values.

2. Evolution of deforestation in the Brazilian Amazon: a case study

This section presents an example of applying the Land Cover Evolution Model to study the evolution of deforestation in the Brazilian Amazon rainforest. We use data from the surveying work done by the Brazilian National Institute for Space Research (INPE). Using remote sensing images, INPE provides data on deforestation and degradation of the Brazilian Amazon tropical forest: it indicates that more than 37,000,000 ha were cut from 1988 to 2008 [10]. To study the evolution of deforestation, our input data are a set of image objects associated with patches of change detected in subsequent remote sensing images. We are interested in detecting and monitoring the evolving deforestation land cover objects during the evolution process. We analyzed the deforestation process in the Vale do Anari municipality, Rondônia State (Figure 2).
This is a 400,000 ha region where occupation started with government-planned rural settlement.
Figure 2. Location of Vale do Anari study area. Source: [11].

Figure 3 shows the sequence of images acquired by LANDSAT at four different timestamps, showing the changes that occurred in the Vale do Anari municipality from 1985 to 1994, with a three-year interval between images. Clear areas represent the deforested areas in these images.

Figure 3. LANDSAT images of the Vale do Anari municipality: (a) 1985, (b) 1988, (c) 1991, (d) 1994.

Silva et al. (2008) [11] and Mota et al. (2009) [12] proposed a classification of this dataset according to the land change agents acting in the region, based on expert knowledge about the area. They recognize three different types of land cover objects: AlongRoad, Concentration and SmallLot. At one timestamp, each image object can be classified with one of these three types. Mota et al. (2009) [12] went a step further and developed the following rules for the evolution of these land cover objects according to their types:

R1. Two adjacent Concentrations merge, and the new object is a Concentration.
R2. Two adjacent SmallLots with areas smaller than 50 ha merge, and the new object is a SmallLot.
R3. A SmallLot with an area smaller than 50 ha adjacent to a Concentration merges with it, and the new object is a Concentration.
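Rules R1-R3 translate directly into a type-resolution function for merges. In this minimal sketch, the 50 ha threshold comes from the rules above; the function signature itself is illustrative:

```python
def merge_type(type_a, area_a, type_b, area_b):
    """Resulting type when two adjacent objects merge under R1-R3;
    returns None when no rule applies and the objects stay separate."""
    pair = {type_a, type_b}
    if pair == {"Concentration"}:                                  # R1
        return "Concentration"
    if pair == {"SmallLot"} and area_a < 50 and area_b < 50:       # R2
        return "SmallLot"
    if pair == {"SmallLot", "Concentration"}:                      # R3
        small = area_a if type_a == "SmallLot" else area_b
        if small < 50:
            return "Concentration"
    return None

print(merge_type("SmallLot", 30, "Concentration", 400))
```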
Our model allows us to discover when objects of one type become another type, which we call evolution. In this case, this means discovering when and where the deforestation processes change. The corresponding OD and EOD can be seen in Figure 4, which gives a general view of the model application.

Figure 4. The OD and EOD datasets for Vale do Anari.

By looking at the resulting EOD, we verified when some SmallLot objects evolved into Concentrations, allowing us to identify the land concentration process in this study area and how it reflects on the observed deforestation patch patterns. Field data [12]
support this conclusion; thus the application of the land cover evolution model can improve the accuracy of deforestation detection.

3. Conclusions

The main contribution of this paper is to propose a land cover evolution model to track the evolution history of a set of evolving objects, as well as the individual history of each object in the set. Our model combines distinct types of land cover objects, describes and recovers the evolution of objects in a flexible way, and considers constraints derived from knowledge about the application domain. We applied the land cover algebra in the domain of environmental change monitoring using remote sensing images to analyze a time series of deforestation land cover objects in the Brazilian Amazon. We identified land cover objects as evolving objects and were able to evolve them by applying the operations merge and split, which are semantically adaptable to the application. We can therefore verify the influence of land cover objects in nearby regions, discover patterns associated with the evolution histories, and increase our ability to understand the land use changes detectable in remote sensing image datasets. Advances can be made to improve the model in the environmental domain and to use it to better support economics and policy making in the Brazilian Amazon. The evolution of objects provides insight into the broader scope and complementary perspectives. These methods can also be applied in other areas and scenarios.

References

[1] A. Singh, Digital change detection techniques using remotely-sensed data. International Journal of Remote Sensing (1989).
[2] A. Frank, Ontology for Spatio-temporal Databases. In: Spatio-Temporal Databases: The Chorochronos Approach. Springer, Berlin.
[3] M. Worboys, Event-oriented approaches to geographic phenomena. International Journal of Geographical Information Science 19 (2005).
[4] P. Grenon, B. Smith, SNAP and SPAN: towards dynamic spatial ontology.
Spatial Cognition & Computation 4(1) (2003).
[5] M. Goodchild, M. Yuan, T. Cova, Towards a general theory of geographic representation in GIS. International Journal of Geographical Information Science 21(3) (2007).
[6] B. K. Becker, Amazônia. Ática, São Paulo, 5th ed.
[7] M. I. S. Escada, Evolução de Padrões da Terra na Região Centro-Norte de Rondônia. São José dos Campos: INPE.
[8] O. R. F. O. Bittencourt, G. Câmara, L. Vinhas and J. S. Mota, Rule-based evolution of typed spatiotemporal objects. Proceedings of the VIII Brazilian Symposium in Geoinformatics, Campos do Jordão, Brazil, 2007.
[9] O. R. F. O. Bittencourt, Algebraic modelling of spatiotemporal objects: understanding change in the Brazilian Amazon. São José dos Campos: INPE.
[10] Instituto Nacional de Pesquisas Espaciais (INPE), PRODES - Program for deforestation assessment in the Brazilian Legal Amazonia. São José dos Campos, Brazil: INPE. Available at: Last access: 31st January.
[11] M. P. S. Silva, G. Câmara, M. I. S. Escada and R. C. M. Souza, Remote sensing image mining: detecting agents of land use change in tropical forest areas. International Journal of Remote Sensing 29(16) (2008).
[12] J. S. Mota, G. Câmara, M. I. S. Escada, O. R. F. O. Bittencourt, L. M. G. Fonseca and L. Vinhas, Case-Based Reasoning eliciting the evolution of geospatial objects. In: Conference on Spatial Information Theory (COSIT'09), Aber Wrac'h, France, 2009.
Ontology-Based Modeling of Land Change Trajectories in the Brazilian Amazon

Tomi KAUPPINEN a; Giovana MIRA DE ESPINDOLA b
a Institute for Geoinformatics, University of Muenster, Germany
b National Institute for Space Research (INPE), Brazil

Abstract. Tropical deforestation is an example of geochange with massive impacts on the environment, locally and globally. In the Brazilian Amazon, deforestation has prevailed owing mostly to private investments in agricultural expansion, associated with large-scale cattle ranching, small-scale subsistence farming and soybean expansion. Data on deforestation have relied mostly on satellite remote sensing, mapping the extent of forest loss. Several existing data sets with diverse spatial and temporal resolutions are maintained to analyze the whole land cover dynamics in the region. Although the extent of forest loss has been examined across the Brazilian Amazon, little is known about the transitions among land change pathways. Nevertheless, there is much information about site conditions available from different sources, such as land management and agricultural production, as well as existing settlements, land tenure and household assets. In this paper we propose the Process-oriented Land Use and Tenure Ontology (PLUTO) for semantically integrating and reasoning about data sets related to deforestation and land change trajectories in the Brazilian Amazon, and for publishing them as Linked Data.

Keywords. Linked Data, Semantic Web, GeoChange, Ontologies, Deforestation, The Brazilian Amazon Rainforest

Introduction

Tropical deforestation causes large greenhouse gas emissions to the atmosphere every year. In the Brazilian Amazon, the rates of deforestation have averaged 17,486 km² per year [9], creating significant negative externalities such as loss of biodiversity, erosion, floods and lowered water tables [14,6].
Although considerable research has focused on estimating rates of forest conversion in the Brazilian Amazon, less is known about the fate of land that has been converted to human use. Data on forest loss have relied mostly on satellite remote sensing to reveal regions where deforestation has taken place. Several existing data sets with diverse spatial and temporal resolutions are maintained to analyze the whole land cover dynamics in the region. For example, the National Institute for Space Research (INPE) in Brazil has four operating systems for monitoring deforestation in the Brazilian Amazon: PRODES, DETER, QUEIMADAS and DEGRAD. These systems are complementary and were designed to meet different goals. Nevertheless, there is much information about site conditions available from different sources, such as land
management and agricultural production, as well as existing settlements, land tenure and household assets. Regarding land management and agricultural production, agricultural census data are a rich archive of regional information on land use. Agricultural censuses constitute the most complete survey of agricultural production, including the area under different land use categories (temporary versus permanent agriculture, for example), crop production, and levels of mechanization and agricultural inputs. In addition, planning dimensions, including the creation of protected areas, indigenous lands and settlements, as well as land tenure and household assets, have been crucial in shaping the land change trajectories in the region. However, integrating all the different data about the different pathways of the whole land change trajectory is not straightforward, for several reasons. First of all, there is often a lack of syntactic interoperability. But an even more crucial problem is the lack of semantic interoperability [10] between data sets regarding tropical forests (see e.g. [4,11]). It is necessary to analyse causal links in order to explicate relationships between events and environmental changes [12]. Essentially, different data sets maintain information about different pathways of the whole land change trajectory. The problem hence is how to formally define these pathways and relate them to each other. Ontologies provide mechanisms for interconnecting concepts (e.g. functions, purposes, activities, and plans) [5] in a machine-processable way, and hence, together with reasoning mechanisms, offer a potential solution for modeling the spatio-temporal semantics of land change pathways. In this paper we are interested in giving an ontological foundation to essential land change trajectories, and in modeling them with formal semantics.
To achieve this we propose the Process-oriented Land Use and Tenure Ontology (PLUTO), built as an alignment to the top-level ontology DOLCE [8], for semantically integrating several data sets related to deforestation and land change trajectories in the Brazilian Amazon, and for publishing and sharing these data sets as Linked Data.

1. Modeling Semantics of Land Change Trajectories

1.1. Background

The most compelling reason to monitor land change in the Brazilian Amazon is the strong effect of the land change trajectory on the state of converted areas. Concepts of land change trajectories have been used to identify some dominant pathways leading to specific outcomes, and have been presented as typical successions of causes of tropical deforestation across the region. The potential transition pathway from forest to other uses depends on the state of human occupation and site conditions, such as distance to roads [1], presence of settlements and land tenure [13], soils and environmental weather, and market conditions. Therefore, land use and tenure issues have been affecting deforestation in the Brazilian Amazon in several ways, and they are related to recent controversies about the detrimental impact of land law on deforestation [2]. Since the 1970s, the Brazilian federal government has set up agricultural settlement projects that constrain the ways natural resources are used and territory is occupied. These official
colonization incentive policies and the associated agricultural and cattle expansion remained dominant until the end of the 1980s [4]. A growing environmentalist trend took shape during the 1990s, allied with rules enabling local populations to take part in natural resource management. Since around the year 2000, the federal government has created policies about land management, including policies about the creation of settlements. In this scenario, government policies have played a significant role in the agricultural colonization frontier. Generally speaking, settler farms have in common a production system characterized by intense use of family labor and simple agricultural technologies, joined to a strong drive for cattle ownership and overexploitation of land. As a result, areas destined for settler farms move through a similar progression of land use pathways over time. The role of the land change trajectory is quite complex, since it involves social and institutional arrangements that need to be better understood [3].

1.2. Process-oriented Land Use and Tenure Ontology (PLUTO)

For the purpose of this paper, we will describe a minimal process that represents the most significant pathways related to settler farms. It starts when farmers get parcels (Figure 1, Land reform) and carry out some initial deforestation to establish ownership and produce food crops to meet immediate food needs (Figure 1, Subsistence). Farmers then clear additional lands for more crops, and at some point they start to purchase cattle (Figure 1, Extensive cattle raising). From this point, the activities of farmers planting subsistence crops are small relative to the clearing for cattle raising. After an intensive use of the pasture, the land can be recuperated (Figure 1, Recuperation) or abandoned (Figure 1, Abandonment). Considering these main descriptions, we present below the main concepts of PLUTO.

Land reform: Redistribution of land.
Deforestation: Forest is removed from an area.
Subsistence: Land stays in subsistence until the portion of deforestation reaches a critical amount.
Extensive cattle raising: Extensive cattle raising, after which pasture typically gets exhausted.
Abandonment: Regrowth of the forest.
Reclaim: Public repossession.
Recuperation: Removal of stumps and logs, and plowing, fertilizing,
These concepts were defined using DOLCE [8]. They are perdurants in the sense of DOLCE and were introduced as subclasses of process. In addition, the ontology contains endurants that participate in perdurants. These include concepts such as farmer, land and parcel. The intended use of PLUTO is to spatio-temporally annotate land regions in the Brazilian Amazon and to model the land change trajectories related to them. This allows for analysis of different regions, and of the characteristics of the land change trajectories in each land region. The hypothesis is that these characteristics may be used to find similar pathways around the Brazilian Amazon and hence to help predict the future of regions based on similar pathways found in other regions. Combined with further data sets about regions (e.g. distance to market, distance to a river or road system, policies regarding the region), this analysis can reveal new knowledge about land change trajectories and help to create better policies for sustainability.

Figure 1. Land tenure model.

1.3. Rules for Reasoning about PLUTO

In this section we define rules for reasoning about the PLUTO ontology. Essentially, these rules are used for asserting new facts concerning endurants and perdurants. The syntax of the rules is as follows. Each rule consists of a left-hand side (before the arrow) and a right-hand side (after the arrow). The idea is that if the left-hand side of the rule matches some individuals in the knowledge base, then the facts on the right-hand side are asserted to the knowledge base. The notation parcel(x) means that x is an individual of parcel, i.e. the isA relationship holds between x and the category parcel. Slots (properties) for each class are defined inside brackets. For example, deforestation(d){participant x} means that there is an individual d of the class deforestation which has a participant x.
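This match-and-assert style of evaluation can be sketched as a tiny forward-chaining step in Python. All predicate and slot names below are illustrative stand-ins, not the actual PLUTO vocabulary, and facts are flattened into simple triples for the sake of the sketch:

```python
def apply_rules(kb):
    """One forward-chaining pass: for every fact matching a rule's
    left-hand side, assert the right-hand side facts into kb.
    Facts are (name, subject, value) triples."""
    new = set()
    for name, subj, value in kb:
        # Illustrative rule in the style described above: a parcel with
        # a deforested portion > 0 participates in a deforestation process.
        if name == "deforestedPortion" and value > 0:
            new.add(("participant", "deforestation-of-" + subj, subj))
    kb |= new
    return kb

kb = {("parcel", "p1", None), ("deforestedPortion", "p1", 0.3)}
apply_rules(kb)
```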
The rules are as follows:

parcel(p) land-reform(a){participant p} → farmer-gets-parcel(g){participant p}, (1)

2 PLUTO is downloadable at
where if a parcel p is a participant in a land-reform a, then the parcel participates in an event farmer-gets-parcel g.

parcel(p){deforestedportion > 0} → deforestation(d){participant p}, (2)

meaning that if at least some portion of parcel p is deforested then p is a participant in deforestation process d.

parcel(p){deforestedportion < p1} deforestation(d){participant p} → subsistence(s){participant p}, (3)

where p1 is a typical maximum portion of deforestation of a parcel such that the parcel p is still a participant in subsistence process s, i.e. the parcel is not (yet) used for extensive cattle raising.

parcel(p){deforestedportion ≥ p2} → extensive-cattle-raising(e){participant p}, (4)

where p2 is a typical minimum portion of deforestation of a parcel such that if it is exceeded then the parcel will be a participant in extensive-cattle-raising e.

parcel(p) extensive-cattle-raising(e){participant p}{duration ≥ t1} → land-exhaustion(e){participant p}, (5)

where t1 is a typical time period of extensive-cattle-raising e after which pasture land normally gets exhausted (degraded), i.e. the parcel is then a participant in land-exhaustion e. For example, according to [7] 'pasture degrades after about ten years', i.e. t1 = 10 years.

parcel(p) land-exhaustion(e){participant p} ¬recuperation(c){participant p} → abandonment(b){participant p}, (6)

where if a parcel participates in land-exhaustion e and not in recuperation c, then the parcel will participate in abandonment b.

2. Conclusions

A tremendous quantity of information related to land change (deforestation, creation of settlements, types of land use) should be interoperable, linked and shared, but this is still not effectively done. In this paper we addressed the need for information interoperability in the domain of deforestation research. The idea was to support the linking of information resources with the help of an exhaustive and rigorous ontology.
We propose the use of PLUTO for semantic information integration within the domain of deforestation research. PLUTO enables interconnection between disparate sources for the purpose of 1) processing them automatically, 2) reasoning about them in a way only
possible when information has been integrated, and finally 3) sharing and publishing information about deforestation as Linked Data for different organizations to use.

Acknowledgments

The research of the first author has been funded by the International Research Training Group on Semantic Integration of Geospatial Information (DFG GRK 1498, see

References

[1] D. S. Alves. Space-time dynamics of deforestation in Brazilian Amazonia. International Journal of Remote Sensing, 23(14): ,
[2] Claudio Araujo, Catherine Araujo Bonjean, Jean-Louis Combes, Pascale Combes Motel, and Eustaquio J. Reis. Does land tenure insecurity drive deforestation in the Brazilian Amazon? Working Papers , CERDI,
[3] Gilberto Câmara. Land use change in Amazonia: Institutional analysis and modelling at multiple temporal and spatial scales (LUA/IAM). INPE,
[4] Davis Clodoveu, Gilberto Câmara, and Frederico Fonseca. Beyond SDI: integrating science and communities to create environmental policies for the sustainability of the Amazon. International Journal of Spatial Data Infrastructure Research (IJSDIR), 4: ,
[5] Helen Couclelis. The abduction of geographic information science: transporting spatial reasoning to the realm of purpose and design. In Proceedings of the 9th International Conference on Spatial Information Theory, COSIT'09, pages 342-356, Berlin, Heidelberg, Springer-Verlag.
[6] Philip M. Fearnside. Amazonian deforestation and global warming: carbon stocks in vegetation replacing Brazil's Amazon forest. Forest Ecology and Management, 80(1-3):21-34,
[7] Philip M. Fearnside. Land-tenure issues as factors in environmental destruction in Brazilian Amazonia: The case of southern Pará. World Development, 29(8): , August
[8] Aldo Gangemi, Nicola Guarino, Claudio Masolo, Alessandro Oltramari, and Luc Schneider. Sweetening ontologies with DOLCE.
In EKAW '02: Proceedings of the 13th International Conference on Knowledge Engineering and Knowledge Management. Ontologies and the Semantic Web, pages , London, UK, Springer-Verlag.
[9] INPE. PRODES-Amazon deforestation database: São José dos Campos. Technical report, INPE, Available online at:
[10] Werner Kuhn. Geospatial semantics: Why, of what, and how? Journal on Data Semantics III, pages 1-24,
[11] Patrick Maué and Jens Ortmann. Getting across information communities. Earth Science Informatics, 2(4): ,
[12] M. L. Mendonça-Santos and C. Claramunt. An integrated landscape and local analysis of land cover evolution in an alluvial zone. Computers, Environment and Urban Systems, 25(6): ,
[13] E. F. Moran, E. S. Brondizio, and L. K. VanWey. Population and environment in Amazônia: Landscape and household dynamics. Population, Land Use, and Environment,
[14] J. Shukla, C. A. Nobre, and P. J. Sellers. Amazon deforestation and climate change. Science, 247: , 1990.
169 Thomas SPANGENBERG; Hardy PUNDT 162

Integration of dynamic environmental data in the process of travel planning

Thomas SPANGENBERG; Hardy PUNDT
University of Applied Sciences Harz, Wernigerode, Germany /

Abstract. This article describes research concerning the development of a web-based travel planner that is used mainly for touristic purposes. The paper focuses on the integration of factors that are vital for the planning process. These factors are complemented with dynamic environmental data. Interoperability, as well as the integration of data into the planning algorithm, plays an important role. The interplay between different GI services in combination with different influencing factors provides a potential solution for the optimization of a web-based and a mobile tool to support touristic leisure and travel planning.

Keywords. Travel planning, environmental data, tourism, web services

Introduction

The vulnerability study of the State of Saxony-Anhalt [1] presents several findings concerned with the consequences of climate change for tourism. The climate is an essential and determining factor for outdoor activities, yet it is hardly taken into account in existing leisure and travel planning applications. Up to now, these applications have focused mainly on adequate routing by different means of transport and itinerary, as well as visualization through web-mapping services. Dynamic weather data such as probability of precipitation, temperature, wind chill factor and visual range, as well as the daily sunrise and sunset times, greatly influence the personal travel planning process. Currently they require the user to investigate different media manually and then draw conclusions to match personal needs with the travel route. This paper describes the automated integration of weather data using a web-based travel planner and its mobile version.
It takes the dynamic environmental data into account during the planning procedure, combines it with further parameters and generates optimized route suggestions.
1. The Travel-Planner

The vantage point for this study is the Travel-Planner [2], which has been developed in a research project called GeoToolsHarz-Advanced (GOTHA) at the University of Applied Sciences Harz. It was designed to plan day and weekend tours taking into account individual requirements of users as much as possible. Based on a variety of points of interest (POI) within a desired travel period, and taking into account various parameters (e.g. means of transport, personal travel type), the system suggests possible accommodation (hotels, pensions, vacation rentals) for the region. The suggestions are generated according to the itineraries. A distinctive feature of the Travel-Planner is a modified algorithm for solving the classical Traveling Salesman Problem [3]. It was developed in a previous project at the Martin-Luther-University Halle-Wittenberg (MLU). Here, a genetic algorithm [4] that performs optimizations based on time windows using the Euclidean distance was realized. On the one hand, time windows are created to plan the complete travel period (which may extend over several days). On the other hand, the individual travel time per day, which can vary between users and their desired start, end and break times, is considered. Additionally, time windows are coupled specifically with the selected POI. Currently, major factors are the POI opening times, because it makes no sense to include a POI in the travel plan if it is not accessible. The time spent to visit a POI is also a variable factor, depending strongly on the user type and his or her personal interests. For example, a family with children spends more time in a zoo and most likely less time in a museum. The travel times between destinations are also variable conditions, depending on the choice of transport means and its speed.
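The time-window handling described above can be illustrated with a minimal sketch. The class and parameter names below are our own and are not taken from the GOTHA implementation; it only shows the kind of feasibility test an optimizer must apply to every candidate POI visit.

```java
// Hypothetical sketch, not the Travel-Planner's actual code: does a visit of
// 'visitMin' minutes fit a POI's opening hours, given the planned arrival?
// All times are expressed in minutes since midnight.
public class TimeWindowCheck {
    public static boolean fits(int arrivalMin, int visitMin, int openMin, int closeMin) {
        int start = Math.max(arrivalMin, openMin); // wait at the door if early
        return start + visitMin <= closeMin;       // visit must end before closing
    }
}
```

A genetic algorithm of the kind cited in [4] would evaluate such a test for every POI in every candidate tour ordering.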
2. Provider for dynamic environmental data

In a first attempt it was decided to integrate dynamic environmental data into the planning process. Taking into account the two variants of the Travel-Planner, the web-based and the mobile tool, the integration of such environmental data requires the definition of different use cases. We differentiated the integration of real-time data for the mobile tool, and short-term and middle-term data for the web-based application. The data, however, come from different web sites that must be linked to the travel planner, aiming at optimizing the planning procedure.

Real Time Data

The first opportunity is the use of real-time data when the trip is planned on the current day. In such a context the user plans a tour and realizes it on the same day. The user must have access to real-time weather data of a relevant measurement station. This use case represents an example for the application of the mobile, smartphone-based travel planning tool. Since smartphones are often equipped with GPS sensors, the appropriate measurement station can be determined directly in the context of the user's current location. Ideally, the data of the weather station are accessed automatically and
integrated into the planning algorithm, which eventually provides a modified tour for the user.

Forecast

A second option to integrate external data is weather forecasts. If the user plans a trip applying the web-based travel planner, the integration of weather forecasts can lead to significant changes of the plan. The basis is middle-term local weather forecasts, which are remarkably accurate for a period of five to six days [5]. The German Weather Service (DWD), for instance, provides weather forecasts via its webpage. Such information can be accessed by the travel planner in a suitable way. In terms of interoperability, the usage of web services (e.g. the Google Weather API) is another opportunity. They are communicated through standardized formats, which is an advantage.

Empirical weather data

If the tour is planned more than one week in advance, it is not possible to supply reliable forecasts. At this point, empirical weather data can indicate a trend and serve the user with a rough orientation. A freely available service with daily weather statistics of recent years is offered by the site Parameters like temperature, humidity, barometric pressure, visual range and wind speed are grouped by high, low and average values and supplied for the past ten to fifteen years. It also offers a probability calculation for day temperature, wind speed and precipitation.

3. Integration of weather data into the travel planning process

The next section presents an overview of how the integration of weather data into the tour planning process is realized.

Prototype implementation

The optimization algorithm was initially implemented as a Java application. To guarantee interoperability, the algorithm was provided as a SOAP web service. This enables an XML-based data exchange, independent of the programming language that has been used. Additional input parameters can be integrated easily.
Providing the service for a multitude of mobile devices is therefore possible.

Influence on the data basis

The integration of weather data influences the content description of the POI data basis. To integrate weather-related information into the POI descriptions, the metadata need to be extended. Apart from a pure metadata-based approach, the development of an ontology for the purpose of semantically enriching the POI descriptions could also be an option that is currently under investigation [6].
Travel planning and tour optimization

The POI that are taken into account for a concrete trip are chosen by the user from an interactive map or by directly searching for them through text or perimeter search. The application starts after the POI are determined, as shown in Figure 1.

Figure 1. The workflow of the tour planning process, considering weather data.

After the POI have been chosen, the weather module collects the data for the relevant destinations. Since not all POI are associated with an explicit location, the specific coordinates are used to find a weather station in the vicinity by reverse geocoding. Real-time, predictive, or empirical data are used depending on the temporal distance to the travel period. Based on these data, a valid time window is generated for each destination. For example, periods of the day with a minimal chance of rainfall are weighted higher for outdoor activities (a suggestion could be to take longer time in a zoo) than those times of day when rainfall has a large probability (where it could be proposed to visit a museum). For hiking tours, to give another example, it is worthwhile to consider not only weather-related information but also sunset times. In any case, the time windows proposed by the travel planner will be adjusted accordingly. POI, including their time windows and opening times, are sent via a SOAP request to the web service, and the optimization procedure is performed, during which the algorithm tries to consider as many parameters as possible. According to the chosen starting point (e.g. a hotel), one or more tour suggestions will be displayed as a result. The results can be visualized depending on the application. Within the GOTHA project, the travel planning prototype is developed as an extension [7] for the TYPO3 open source content management system. The web services architecture also enables usage on mobile devices.
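The choice among real-time, forecast and empirical weather data, driven by the temporal distance to the travel period, can be sketched as follows. This is a simplified illustration, not the GOTHA implementation: the class and method names are our own, and only the roughly six-day forecast horizon follows [5].

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

// Illustrative sketch: pick a weather data source depending on how far
// ahead the travel day lies.
public class WeatherSourceSelector {
    public enum Source { REAL_TIME, FORECAST, EMPIRICAL }

    static final long FORECAST_HORIZON_DAYS = 6; // forecasts reliable for ~5-6 days [5]

    public static Source select(LocalDate today, LocalDate travelDay) {
        long lead = ChronoUnit.DAYS.between(today, travelDay);
        if (lead <= 0) return Source.REAL_TIME;            // trip on the current day
        if (lead <= FORECAST_HORIZON_DAYS) return Source.FORECAST;
        return Source.EMPIRICAL;                           // beyond reliable forecasts
    }
}
```

In the planner, each branch would then bind to the corresponding provider: the nearest station's live readings, the DWD forecast, or the statistical service described above.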
In a first step, a series of optimized browser templates were created to provide a general presentation on smart phones. Due to security-related limitations, a mobile web application does not have direct access to the device
hardware and requires a permanent Internet connection. To solve this problem, a native application for the Android platform has already been developed.

4. Conclusion

In this paper the integration of dynamic environmental data into the automated process of travel planning was presented. An exemplary use case is the integration of weather data into the trip planning procedure. Further research is required on the individualization of the planning process, based on specific user groups. In order to get optimized planning results, the requirements must be queried with the help of dialogues and interactions with the user. The next step is to include methods of recommender systems [8] in the planning process to get better suggestions for tours and destinations. The goal is the combination of different recommendation methods to develop a hybrid system for travel planning. This work will be part of a new project that started in January 2011 under the name KOGITON.

List of literature

[1] J. Kropp, Klimawandel in Sachsen-Anhalt, Potsdam-Institut für Klimaforschung, 2009
[2] T. Spangenberg, H. Pundt, Tourenplanung als Geo-Extension in einem Open Source Content-Management-System, In: A. Zipf, K. Behncke, F. Hillen, J. Schaefermeyer, Geoinformatik 2010 Die Welt im Netz. Akademische Verlagsgesellschaft, Heidelberg, pp. , 2010
[3] D. L. Applegate, R. E. Bixby, W. J. Cook, The Traveling Salesman Problem: A Computational Study, Princeton Series in Applied Mathematics, 2007
[4] G. Neumann, R. Weinkauf, W. Welz, A GIS-based Decision Support for Multicriteria Location-Routing Problems, Reports of the Institute of Optimization and Stochastics, Report No. 12,
[5] H. Malberg, Meteorologie und Klimatologie, p. 246, Springer Verlag, Berlin, 2006
[6] H. Pundt, T. Spangenberg, Individualized Travel Planning through the Integration of different Information Sources including a POI Ontology, In: A. Zipf, K. Behncke, F. Hillen, J.
Schaefermeyer, Geoinformatik 2010 Die Welt im Netz. Akademische Verlagsgesellschaft, Heidelberg, pp. , 2010
[7] D. Dulepov, Typo3 Extension Development, Packt Publishing, 2008
[8] P. B. Kantor, F. Ricci, L. Rokach: Recommender Systems Handbook, Springer Verlag, Berlin, 2010
174 Bashir SHALAIK et al. 167

TransitDroid: Delivering real-time bus tracking information on mobile devices

Bashir SHALAIK; Ricky JACOB; Adam WINSTANLEY
Computer Science Department, National University of Ireland Maynooth
{bsalaik, rjacob,

Abstract. Recent technological advances in mobile communication, computing and geo-positioning technologies have made real-time transit vehicle information systems an interesting application area. In this paper we present a transit application system which displays transit information on an OpenStreetMap (OSM) web interface and delivers this information on Google Android mobile devices. The content is in the form of predicted arrival/departure times for buses at user-selectable geographic locations within a transit region. The application uses real-time information, such as the current location and timestamp of both bus and user, to provide bus route information. The public interface provides a graphical view which is used to display and update the vehicle locations and to allow users to see routes, stops and moving buses. The mobile device provides the user with the expected arrival/departure time of the next bus at the bus stop based on the user's current position.

Keywords. Real-time data, OSM, AVL, Android, OpenLayers.

Introduction

In transportation information systems, different mapping interfaces are used to provide cartographical means for displaying, monitoring, and improving the performance of transit vehicles. In a pilot project between the National University of Ireland Maynooth (NUIM) and Blackpool transit (Great Britain), a prototype of a working web-based real-time bus tracking transit system was developed using OpenStreetMap (OSM). In real-time bus tracking systems, data is collected in real time and transmitted to a central server for analyzing and extracting transit information.
Computer software technologies such as Ajax are used to update the map-based display without interrupting users by switching pages or screens as they view the map [1]. The developed transit system provides services that take advantage of modern technologies to display bus arrival/departure time information on the Google Android mobile device platform. Doing so enables passengers to enquire about bus arrival/departure times at selectable geographic locations of interest. For bus arrival/departure time estimation, three prediction models (namely, a historical-data-based model, a multiple linear regression model and a one-dimensional Kalman filter model) were implemented and their performance evaluated using the Mean Absolute Percentage Error (MAPE) [2] [3]. In this paper, the process for
downloading data to the Google Android device platform is described; the mobile interfaces were designed using the Android operating system for mobile devices and the Java programming language. The utility of displaying bus arrival/departure times for selectable geographic locations applies in both a fleet management context and a bus information system environment. The developed system provides real-time information about bus routes and bus locations for those who have a mobile device with Internet access. They can link to the web site and get current transit information such as the bus arrival time at the nearest bus stop.

1. Related works

Real-time arrival information for bus, subway, light rail, and other transit vehicles is displayed in a significant number of cities worldwide at places such as rail stations, transit centers, and major bus stops. Since real-time transit information will not be available on a public display at every stop, smart mobile devices are being used to help manage the complexity of using transit information. Whether it is a simple phone or SMS interface, or a more complex native mobile application, these systems can provide schedules, routes and real-time arrival information. Google Transit, which started as a Google Labs project in December 2005, is now directly integrated into the Google Maps product; interfaces to Google Transit exist on a variety of mobile devices, making use of location sensors such as GPS and WiFi localization on the device to improve the usability of the transit application. Various mobile-phone-based transit information systems have been developed to provide users with transit information. The Intelligent Transportation Systems research group at the University of Washington has developed a real-time system for predicting bus arrival times, based on access to transit agency data.
The predicted times are made available to the traveling public via a web site known as MyBus. The usability of a public transit system can be enhanced by providing a good traveler information system. OneBusAway [4] is a set of transit tools focused on providing real-time arrival information. This application makes use of the increased availability of powerful mobile devices and the possibility of displaying transit data in machine-readable format. In OneBusAway, transit information such as the bus arrival time at a particular bus stop is displayed on Internet-enabled mobile devices. In [5] the usage of a transit vehicle information system that delivers estimated departure times for a large transit fleet is described. Due to the physical restrictions of mobile devices, which affect user interaction and data presentation, WML was introduced as the language for WAP-enabled devices. Transitr is a transit trip planner (TTP) system from the University of California, Berkeley [6]. The system provides the shortest paths between any two points within the transit network using real-time information provided by a third-party bus arrival prediction system, relying on GPS-equipped transit vehicles. Users submit their origin and destination points through a map-based iPhone application or through a JavaScript-enabled web browser. Services such as 511.org and Google Transit allow users to plan a public transit trip by generating routes based on static schedule data, whereas with the proposed TransitDroid system, dynamic transit information is received via web services. In [7] a mobile public transportation information service was developed to provide map-based information on the nearest mass rapid transit station and the nearest bus stop of the bus route chosen by the user that can take the user to his/her chosen
destination. The developed system can deliver a map marked with the nearest mass rapid transit station on a Nokia 6600 cell phone. Bertolotto et al. [16] describe the BusCatcher system. The main functionality provided includes display of maps with overlaid route plotting, user and bus location, and display of bus timetables and arrival times. Barbeau et al. [13] describe a Travel Assistance Device (TAD) which aids transit riders with special needs in using public transportation. Turunen et al. [14] present approaches for mobile public transport information services such as route guidance and push timetables using speech-based feedback. Bantre et al. [15] describe an application called UbiBus which is used to help blind or visually impaired people to take public transport. This system allows the user to request in advance the bus of his choice to stop, and to be alerted when the right bus has arrived. In [8] a transit information system was developed to implement a bus arrival time predictor on a Google Android mobile device. The developed system provides relevant bus route information with arrival times to users in order to explore the possibilities and capabilities of the various sensors and GPS on the device. In that system, the user's current location is collected and, together with a stored bus schedule, a bus arrival time is calculated on the server and displayed on the mobile device on a built-in Google Maps display.

2. Real-time Bus Tracking Systems

Intelligent transportation systems provide the technology to enable people to make smarter travel choices. Better transit information allows users to make better decisions about their travel options. One of the first online bus tracking systems, BusView, was developed at the University of Washington [9]. Nowadays, many public bus services provide on-the-fly information to their users, including the current locations of buses and the predicted arrival times at bus stops.
These buses typically use the Global Positioning System (GPS) for positioning and wireless communication such as radio or GSM/GPRS for communicating their position to a central server. Real-time prediction of accurate bus arrival times has been studied in the literature for a couple of decades. Different methods are used for predicting bus arrival times; some researchers use simple statistical/mathematical models, e.g. prediction according to deviation from the schedule [10]. Kalman filters and more sophisticated artificial intelligence and machine learning algorithms have also been used [11] [12]. In this project, three bus arrival models were tested, namely a historical-data-based model, a multiple linear regression model and a one-dimensional Kalman filter model. The Automatic Vehicle Location (AVL) data is used to track vehicle location, while the other vehicle data is used to predict the arrival time at a certain bus stop or a selected transit area along the route. Figure 1 shows the tracking and predicting components of the real-time bus tracking system.
Figure 1. Tracking and Predicting Components of real-time bus tracking system.

The historical data model predicts travel time using the average travel time for the same journey under similar conditions obtained from the data archive. The Kalman filter is a multi-dimensional model based on the number of state variables to be estimated [11]. The regression models predict a dependent variable with a mathematical function formed by independent variables. In order to evaluate the performance of the three models, the Mean Absolute Percentage Error (MAPE) was used as a measure of closeness between predicted and observed values. MAPE represents the average percentage difference between the observed value and the predicted value. Table 1 shows the MAPE values of the three prediction models on the same sample test data.

Table 1. MAPE values of bus arrival time prediction models

Model                   MAPE
Historical Data Model   13 %
Kalman Filter Model     20 %
MLR Model               29 %

The historical data model has the lowest MAPE, as the number of observations (samples) used to predict bus travel time is sufficient and higher compared to the other two models. The drawback of this model, however, is that the accuracy of the results depends on the similarity of travel patterns. Thus the historical data model will not perform well in the case of an unexpected event like an accident or traffic congestion.

3. System Implementation

To increase satisfaction among transit users, the Blackpool real-time bus tracking system delivers transit information via standard web browsers and mobile devices. The system uses off-the-shelf GPS/GPRS integrated units programmed to transmit their location at regular intervals (approximately 45 seconds) while the vehicle is in motion.
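The MAPE measure used to compare the three models in Table 1 is straightforward to compute; the following sketch uses our own method and variable names rather than the project's code.

```java
// Mean Absolute Percentage Error between observed and predicted values,
// as used to compare the three arrival-time prediction models.
public class Mape {
    public static double mape(double[] observed, double[] predicted) {
        double sum = 0.0;
        for (int i = 0; i < observed.length; i++) {
            sum += Math.abs((observed[i] - predicted[i]) / observed[i]);
        }
        return 100.0 * sum / observed.length; // expressed as a percentage
    }
}
```

For example, observed travel times of 100 and 200 minutes against predictions of 90 and 220 minutes give a MAPE of 10 %.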
The data is stored on a server and is then visualised through a standard web browser to show views representing current locations of vehicles in close-to-real-time.
Web interface development

The system uses web technologies such as JavaScript, MySQL, XML, PHP and Ajax. The position of the bus, along with the timestamp and bus details, is sent to the server using GPRS. The remote server inserts the data into a MySQL database. An interactive public interface was developed to allow users more interaction with the transit system. Figure 2 shows the public interface of the Blackpool transit system with updated vehicle locations on an OSM interface developed using OpenLayers.

Figure 2. The public interface showing the updating textual display plus moving locations on OSM.

4. Mobile Interface

The wireless communication technology is designed to utilise existing Internet protocols and standards. A URL address is used for the identification of a resource, and HTTP is used as the protocol between WAP gateways and content servers. Wireless content can be served using existing web server software. The smartphone interface was developed using the open source Android operating system for mobile devices. In this project an HTC Magic smartphone running the Android operating system was used to communicate with the transit project server. The mobile device uses the HTTP protocol to connect to the MySQL database on the server. Data exchanged between Android and PHP is encoded in JavaScript Object Notation (JSON), for which both languages have built-in functions. To view arrival/departure times on the mobile device, the user selects his/her preferred destination; the transit system collects the current location from the built-in GPS. The user's current location, along with the selected destination, is sent to the server using the HTTP protocol by appending it to the URL which connects to the server. The URL from the mobile device is transmitted and the response is displayed on the mobile screen. The nearest bus stop to the user's current location is suggested and the bus arrival time is displayed on the device. Figure 3 shows a sequence diagram of the interaction between users, mobile devices and the web server.
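The step of sending the destination and location to the server can be sketched as building a parameterised request URL. The endpoint and parameter names below are assumptions for illustration only, not the actual Blackpool service interface.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Illustrative sketch: the mobile client appends the GPS fix and the chosen
// destination to the URL that queries the transit server over HTTP.
public class TransitRequest {
    public static String buildUrl(String base, double lat, double lon, String destination) {
        return base + "?lat=" + lat + "&lon=" + lon
             + "&dest=" + URLEncoder.encode(destination, StandardCharsets.UTF_8);
    }
}
```

The server-side PHP script would read these parameters, query the MySQL database with the current time, and return the next bus time as JSON.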
The interaction proceeds as follows: the mobile device requests the user's location via GPS and prompts for a destination selection; the destination choice and the current location are sent to the web server; the server performs a database query with these parameters and the current time; the database returns the next bus time, and the formatted result is presented to the user.

Figure 3. A sequence diagram showing the interaction between users, mobile devices and the web server.

Figure 4(a) shows a desired destination selected by the user; Figure 4(b) shows the response from the server.

5. Conclusions and Future Work

In this work we have shown that transit information collected in real time can be shown on OSM for tracking and monitoring purposes. Internet-enabled mobile phones can receive real-time transit information. Android software for smart mobile phones offers the ability to overcome the physical restrictions of interface design on mobile phones. To further improve bus arrival time prediction accuracy, transit data from other sources can be incorporated into the predictor algorithms. Future work on this project includes
development of a feature which alerts a user when a bus is a specified number of minutes away.

References

[1] J. S. Zepeda and S. V. Chapa, From Desktop Applications Towards Ajax Web Applications, in the proceedings of the 4th International Conference on Electrical and Electronics Engineering (ICEEE 2007), Mexico City.
[2] J. Patnaik, S. Chien and A. Baldikas, Estimation of Bus Arrival Times using APC Data, Journal of Public Transportation 7(1).
[3] B. Predic and B. Rancis, Automatic Vehicle Location in Public Bus Transportation System, in the proceedings of the 11th International Conference on Computers (WSEAS), Crete Island, Greece (2007).
[4] F. Brian, E. Kari and B. Alan, OneBusAway: Behavioral and Satisfaction Changes Resulting from Providing Real-Time Arrival Information for Public Transit, in the proceedings of CHI 2010, USA.
[5] S. D. Maclean and D. J. Dailey, The use of Wireless Internet Services to Access Real-Time Transit Information, in the proceedings of the IEEE Intelligent Transportation Systems Conference, USA (2001).
[6] J. Jariyasunant, Mobile Transit Trip Planning with Real-Time Data, in the proceedings of the (TRB 2010) Annual Meeting, USA (2009).
[7] J. Shwu, H. Gong and H. Shian, Location-Aware Mobile Transportation Information Service, in the proceedings of the 2nd International Conference on Mobile Technology, Applications and Systems, Guangzhou (2009).
[8] Lam, K., Bus Arrival Predictor on the platform of Google Android.
[9] D. Dailey and J. Fisher, BusView and Transit Watch: an Update on Two Products from the Seattle Smart Trek Model Deployment Initiative, in the proceedings of the 6th World Congress on Intelligent Transport Systems, USA (1999).
[10] L. Wei-Hua and Z, Experimental Study of Real-Time Bus Arrival Time Prediction with GPS Data, Publication of the Transportation Research Board of the National Academies, USA (1999).
[11] D. Dailey and Z. R.
Wall An Algorithm and Implementation to Predict the Arrival of Transit Vehicles in the proceedings of the 2nd IEEE Conference on Intelligent Transportation Systems, USA, (2000). [12] R. Jeong, and R. Rilett Bus arrival time prediction using artificial neural network model In the proceedings of the 7th IEEE Conference on Intelligent Transportation Systems, (2005) [13] Barbeau, S., Winters, P., Georggi, N., Labrador, M., Perez, R.: Travel assistance device: utilising global positioning system-enabled mobile phones to aid transit riders with special needs. Intelligent Transport Systems, IET, 4(1):12 23, (2010). [14] Turunen, M., Hurtig, T., Hakulinen, J., Virtanen, A., and Koskinen, S., Mobile Speech-based and Multimodal Public Transport Information Services. In proceedings of MobileHCI 2006 Workshop on Speech in Mobile and Pervasive Environments, (2006). [15] Bantre, M., Couderc, P., Pauty, J., Becus, M.: Ubibus: Ubiquitous computing to help blind people in public transport. In S. Brewster and M. Dunlop, editors, Mobile HumanComputer Interaction MobileHCI 2004, volume 3160 of Lecture Notes in Computer Science, pages Springer Berlin / Heidelberg, (2004). [16] Bertolotto, M., M. P. O Hare, G., Strahan, R., Brophy, A., N. Martin, A., McLoughlin, E.: Bus catcher: a context sensitive prototype system for public transportation users. In B. Huang, T. W. Ling, M. K. Mohania, W. K. Ng, J.-R. Wen, and S. K. Gupta, editors, WISE Workshops, pages IEEE Computer Society, (2002).
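The server-side step in Figure 3 — query the timetable with the chosen destination, the stop, and the current time, then return the next bus time — can be sketched as follows. The table schema and names are illustrative assumptions, not taken from the paper.

```python
import sqlite3

def next_bus_time(con, stop_id, destination, now_hms):
    """Return the earliest arrival at stop_id towards destination
    after the current time (as 'HH:MM:SS'), or None if no bus remains.
    The 'arrivals' table is a hypothetical stand-in for the real-time
    predictions described in the paper."""
    row = con.execute(
        "SELECT MIN(arrival_time) FROM arrivals "
        "WHERE stop_id = ? AND destination = ? AND arrival_time > ?",
        (stop_id, destination, now_hms),
    ).fetchone()
    return row[0]
```

In the deployed system the arrival times would come from the predictor algorithms rather than a static table; the query shape stays the same.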
DGPS- and INS-Based Orthophotogrammetry on Micro UAV Platforms for Precision Farming Services

Jakob GEIPEL a,1; Christian KNOTH b; Olga ELSÄSSER c; Torsten PRINZ d

a Institute for Geoinformatics, WWU Münster, Germany; b Institute for Geography, WWU Münster, Germany; c Institute for Geoinformatics, WWU Münster, Germany; d Institute for Geoinformatics, WWU Münster, Germany

Abstract. Small unmanned aerial vehicle (UAV) sensor platforms such as multicopters are developing into important auxiliary tools for close-range remote sensing. They provide new opportunities of data acquisition for various environmental and geoscientific purposes. One of the objectives of the ifgicopter project is to produce CIR images as classified input data for subsequent procedures of precision farming. To this end, the project's multicopter is equipped with a modified digital camera to record high-resolution aerial CIR images. In a subsequent step, the captured images are processed into orthophotographs: focal distortions are eliminated and the positions of the images are calculated relative to a desired frame of reference. This processing usually requires a certain number of known Ground Control Points (GCPs) to acquire information about the parameters of exterior orientation of the images. The measurement of these points is cost-intensive and time-consuming. For this reason the ifgicopter navigation system is extended with DGPS devices in order to determine the essential parameters of the exterior orientation without the need for GCPs. Moreover, data from onboard sensors such as gyroscopes, accelerometers and a magnetometer is combined into an inertial navigation system (INS) and coupled to the GPS position solutions, so that the trajectory and orientation of the multicopter and its camera system are tracked at each point in time.
The information now available allows the calculation of CIR orthophotographs as part of a sensor web-enabled service for subsequent procedures of precision farming.

Keywords. Micro UAV, color infrared (CIR) orthophotographs, DGPS, INS, sensor-web services, precision farming

1 Corresponding Author.

Introduction

The spatial and temporal versatility of Micro UAVs, due to their small size, low weight and low operating costs, makes them an attractive tool especially for monitoring and observation tasks in application fields like landscape ecology, forestry or disaster management (see [1] and [3]). To explore the applicability of a multicopter as a platform for the acquisition of various types of sensor data such as orthorectified aerial images, the interdisciplinary ifgicopter project aims at establishing techniques,
workflows and communication frameworks for collecting and processing data as well as providing it in real time via sensor web services.

1. Color Infrared (CIR) Aerial Photography for Precision Farming

One promising application field for this technique is small-format aerial photography within the scope of precision farming. Here the application of inputs such as crop protection products and fertilizer can be adapted to the spatial and temporal dynamics of soil and population parameters, reducing costs and keeping processes more environmentally compatible [11]. To exploit these capabilities effectively, accurate information on the varying site-specific conditions is necessary. Other authors have already shown that UAVs are able to provide such substantial information and geobase data to precision farming processes [13]. In this context, an objective of this project is to produce CIR photographs and other remote sensing products in the visible (VIS) and near infrared (NIR) range as classified input data for subsequent procedures of precision farming and for conducting efficiency tests at a user-defined spatial and temporal resolution. Therefore the ifgicopter is equipped with a modified compact digital camera now capable of capturing radiation ranging from about 330 nm to 1100 nm [8]. Combined with a special color filter, the camera provides a setup for generating high-resolution aerial CIR photographs [6]. In order to conduct further image analyses it is necessary to calculate accurate orthophotographs. Consequently, the aerial images need to be rectified and georeferenced as a prerequisite for obtaining coordinate and scale information. Regarding the distortion correction it is necessary to have a digital terrain model of the survey area. The interior and exterior orientation of the camera are needed for georeferencing the photographs [7].
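The user-defined spatial resolution mentioned above is bounded by the ground sampling distance (GSD). A small helper using the standard photogrammetric relation (a generic aid, not taken from the paper; the example numbers are hypothetical):

```python
def ground_sampling_distance(pixel_size_m, focal_length_m, altitude_m):
    """GSD for a nadir view: pixel size times flying height over focal
    length, i.e. the ground footprint of a single pixel in metres."""
    return pixel_size_m * altitude_m / focal_length_m

# Hypothetical example: 2 um pixels, 6 mm lens, 100 m flying height
gsd = ground_sampling_distance(2e-6, 6e-3, 100.0)  # ~0.033 m per pixel
```

Halving the flying height halves the GSD, which is the lever the operator has for meeting a required resolution.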
While the parameters of interior orientation of the camera are defined by calibration regarding the focal distortion, the parameters of exterior orientation have to be determined during image post-processing. In remote sensing, two different approaches are usually applied: the direct and the indirect georeferencing method. For the indirect georeferencing method, Ground Control Points (GCPs) are required [4]. GCPs represent terrain points whose 3D positions are known and which are clearly identifiable in the images. However, creating and measuring GCPs in the field is very time- and cost-intensive work.

Figure 1. UAV georeferencing system applying DGPS/INS for orthorectified aerial images

In order to reduce the field work to a minimum (and in regions where GCPs are hard to establish) it is possible to calculate the exterior orientation directly by
exploiting the onboard sensor data of the ifgicopter's (D)GPS (Figure 1) and the inertial navigation system (INS) [7].

2. DGPS and INS for Determination of Exterior Orientation Parameters

The majority of currently used UAVs are equipped with standard low-cost GPS receivers which provide an absolute position accuracy of about 2 m to 15 m. Since these measurements are too inaccurate for determining the precise focal 3D position of the mounted camera, the GPS system is improved by upgrading the ifgicopter with a GPS receiver capable of using various differential GPS (DGPS) techniques. Consequently, an L1 C/A receiver with carrier phase smoothing is installed to track satellites on the L1 frequency and to calculate the position by considering pseudoranges and carrier phase measurements. Additionally, the navigation system is improved by establishing communication between the GPS receiver and the German SAPOS Ground Based Augmentation System (GBAS). As a consequence, the accuracy of the absolute GPS position solution is increased by approximately a factor of 10 [5]. Since the orientation of the ifgicopter and its camera system is known from the onboard gyroscopes, accelerometers and magnetic sensors (INS), it is possible to transform the detected values into angles that indicate the absolute orientation of the ifgicopter axes in an earth-centered, earth-fixed coordinate reference system. Furthermore, the improved position of the DGPS and the sensor values of the INS are loosely coupled. Then the parameters of exterior orientation (position and orientation) are calculated via a strapdown navigation algorithm for each point in time [15].
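Given such exterior orientation parameters, direct georeferencing of an image point amounts to intersecting its viewing ray with the terrain. The sketch below is an illustrative reconstruction of this general collinearity approach for flat terrain, not the ifgicopter's actual algorithm:

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Camera-to-ground rotation from Euler angles in radians
    (R = Rz(yaw) @ Ry(pitch) @ Rx(roll))."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ]

def image_to_ground(x_img, y_img, focal, cam_pos, R, ground_z=0.0):
    """Intersect the ray through image point (x_img, y_img) with the
    horizontal plane Z = ground_z (flat-terrain stand-in for a DTM)."""
    ray = (x_img, y_img, -focal)
    d = [sum(R[i][j] * ray[j] for j in range(3)) for i in range(3)]
    s = (ground_z - cam_pos[2]) / d[2]  # scale factor to the ground plane
    return (cam_pos[0] + s * d[0], cam_pos[1] + s * d[1])
```

With a real DTM the ray-terrain intersection is found iteratively instead of against a single plane.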
These values determine the exterior orientation of the captured images and are used in a subsequent step for the processing of orthophotos.

2.1. Architecture of the ifgicopter's Navigation System

The architecture of the ifgicopter's navigation system is based on different components which are introduced in the following. In this study, a md4-200 quadrocopter serves as Micro UAV sensor platform [9] and is extended by an OEMStar GPS receiver [10] and a come2ascos radio modem [2]. These three navigation components are essential to generate adequate sensor data for the determination of a reliable exterior orientation. Having established a first position fix, the GPS receiver transmits its position information via a serial interface to the radio modem using a standardized NMEA GGA message. The modem then establishes a GPRS connection, logs on to SAPOS GBAS and forwards this message to gather standardized RTCM 3.1 correction data from the Satellite Positioning Service. In a next step, the RTCM data is handed back to the GPS receiver, which then calculates its improved 3D position as a function of time. Subsequently, it passes this information to the md4-200's Navigation Control through a second serial interface. There, 3D position, gyroscope, accelerometer and magnetometer data is analyzed and processed using a strapdown navigation algorithm [15]. In a final step, this set of navigation information is committed to the md4-200's Flight Control, which offers a 2.4 GHz wireless downlink connection to a ground station where this data is logged (Figure 2).
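The NMEA GGA message mentioned above carries the position fix that is forwarded to SAPOS. A minimal parser for the fields relevant here (latitude, longitude, fix quality) might look like this; the sentence in the test is a synthetic textbook example, not ifgicopter data:

```python
def parse_gga(sentence):
    """Extract latitude/longitude (decimal degrees) and fix quality
    from an NMEA GGA sentence. Checksum verification is omitted."""
    fields = sentence.split(",")
    assert fields[0].endswith("GGA"), "not a GGA sentence"

    def dm_to_deg(value, hemisphere):
        # NMEA encodes angles as (d)ddmm.mmmm: degrees then minutes
        dot = value.index(".")
        degrees = float(value[:dot - 2])
        minutes = float(value[dot - 2:])
        deg = degrees + minutes / 60.0
        return -deg if hemisphere in ("S", "W") else deg

    lat = dm_to_deg(fields[2], fields[3])
    lon = dm_to_deg(fields[4], fields[5])
    quality = int(fields[6])  # 0 = invalid, 1 = GPS fix, 2 = DGPS fix
    return lat, lon, quality
```

The quality field is what changes from 1 to 2 once the RTCM corrections take effect.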
Figure 2. Navigation data workflow and communication

2.2. Future Integration into a Sensor Software Framework

As this procedure still requires extracting exterior orientation parameters from the collected navigation data log, the ifgicopter sensor platform has to be integrated into an already existing software framework. This specific framework is designed to simplify sensor integration and synchronization of sensor data streams, and to support multiple output formats [12]. The aim is to connect the ifgicopter's data downlink as one possible data source to the sensor platform framework, where the data is processed, interpolated and then provided by an output plugin in an appropriate format for the consequent calculation of orthophotos (Figure 3).

Figure 3. Framework model for ifgicopter sensor data integration [12]

3. Applications and Outlook

Given the exterior orientation and a digital terrain model (DTM), the images obtained from the aerial surveys can be orthorectified using the direct georeferencing method (provided by various photogrammetric image analysis software packages). Subsequently they are subjected to object-oriented texture analysis for supervised classifications regarding the distinction of different vegetation types like weed and crop, or different states of vegetation health due to soil dryness, precipitation damage or pest infestation. The classified images are processed to generate accurate vector and position data (centroid) of examined areas and objects. The results of the analysis can then be provided to the user (farmer) as web mapping services (orthophotographs, Figure 4), web feature services (vector data) or via an open geo data portal like StudMap14 (
muenster.de/geoext/index.html).

Figure 4. Decision support in precision farming via web mapping services (WMS) and web feature services (WFS)

This geo data portal, realized and maintained by the faculty of geosciences of the University of Muenster, is a web application which can be used not only to browse various types of geo data and web services, but also supports the upload and download of specific user data [16]. Another near-future application is to include the orthophotographs as a subsidiary WMS source within the prototypical flight planning and communication software of the ifgicopter project (muenster.de:8080/flugplanung/flugplanung.html). The latter has been developed concurrently as an open-source and web-based product for the operation of UAVs [14].

References

[1] Aber, J. S., Marzolff, I. and Ries, J. B., Small-Format Aerial Photography. Principles, Techniques and Geoscience Applications, Elsevier, Oxford.
[2] Allsat GmbH, come2ascos, Hannover, on/come2ascos.html
[3] Becker, T., Kutzbach, L., Forbrich, I., Schneider, J., Jager, D., Thees, B. and Wilmking, M., Do we miss the hot spots? - The use of very high resolution aerial photographs to quantify carbon fluxes in peatlands, Biogeosciences 5 (2008).
[4] Eisenbeiß, H., UAV Photogrammetry, IGP Mitteilungen 105, Institute of Geodesy and Photogrammetry, ETH Zürich.
[5] Kettemann, R., GPS-Verfahren, Einsatzgebiete, Rahmenbedingungen, Kombinationslösungen, Faculty for Geomatics, Computer Science and Mathematics, University of Applied Sciences Stuttgart.
[6] Knoth, C., Prinz, T. & Loef, P., Microcopter-Based Color Infrared (CIR) Close Range Remote Sensing as a Subsidiary Tool for Precision Farming, ISPRS Workshop on Methods for Change Detection and Process Modelling, Cologne.
[7] Kraus, K., Photogrammetry. Geometry from Images and Laserscans, de Gruyter, Berlin.
[8] LDP LLC, Camera Conversions, Carlstadt, nversions.htm
[9] Microdrones GmbH, md4-200 / Einführung, Siegen.
[10] Novatel Inc., OEMStar, Calgary.
[11] Ponitka, J. & Pößneck, J., Precision Farming - Anwendungen, Saxon State Office for the Environment, Agriculture and Geology, Dresden.
[12] Rieke, M., Foerster, T. & Bröring, A., Unmanned Aerial Vehicles as Mobile Multi-sensor Platforms, accepted paper at AGILE 2011: The 14th AGILE International Conference on Geographic Information Science, April 2011, Utrecht, The Netherlands.
[13] Universität der Bundeswehr München, Aufbau eines UAV für Zwecke der Datenerfassung im Bereich der Präzisionslandwirtschaft.
[14] Verhoeven, P., Konzeption und prototypische Umsetzung eines flexiblen GUI-Frameworks für die effiziente Steuerung von unbemannten Flugobjekten, Diploma Thesis at the Institute for Geoinformatics, University of Münster.
[15] Woodman, O. J., An introduction to inertial navigation, Technical Report Number 696, Computer Laboratory, University of Cambridge.
[16] Zentrum für Digitale Medien und Mediendidaktik des Fachbereichs Geowissenschaften, StudMap14, University of Münster.
RM-ODP for WPS Process Descriptions

Theodor FOERSTER a,1; Bastian SCHÄFFER b

a Institute for Geoinformatics, University of Münster, Germany
b 52 North GmbH, Münster, Germany

Abstract. Web-based geoprocess models are currently published through the Web Processing Service interface. For interoperability of these models, profiles of these geoprocess models have been developed. However, these profiles are rarely documented and are not human-readable. Motivated by the lack of well-documented profiles and the absence of adequate semantic descriptions, this article presents an approach to documenting profiles using the viewpoints of the Reference Model for Open Distributed Processing (RM-ODP). The presented approach is illustrated in an exemplary walkthrough.

Keywords. Web Processing Service, WPS, profiles, RM-ODP, interoperability.

1. Introduction

Geoprocessing on the web is often described as the next evolutionary step of Spatial Data Infrastructures (SDIs), from data serving to information provision [1]. Information provision on the web is enabled by web-based geoprocess models, which transform web-based data as currently available in SDIs into web-based information. These geoprocess models are available as distributed and loosely coupled resources on the web, encapsulated as Web Services. In this context, interoperability is a key requirement for making use of such Web Services. To provide geoprocess models as interoperable Web Services, the Open Geospatial Consortium (OGC) specified the Web Processing Service (WPS) interface [2]. In particular, profiles are regarded as enhancing the interoperability of geoprocess models which are available through the WPS interface. Profiles are web-based descriptions of common interfaces of a geoprocess model.
If one profile is referenced by different geoprocess models providing the same functionality and using the same interface, the different geoprocess models become semantically equal from an interface point of view. Besides some elements for linking and providing machine-readable information, a profile consists of a textual description, which is the only source for humans to retrieve the semantics of the specific process. Currently, this textual description is unstructured but highly relevant for inspecting the semantics of the specific process, as semantic descriptions are still missing. However, a coherent and structured approach to documenting the functionality of a process in such textual descriptions has not been proposed yet. In this article, we apply the Reference Model for Open Distributed Processing (RM-ODP) to structure the textual description of profiles. RM-ODP is a widely accepted model for documenting complex structures such as web services comprehensively by using several views.

1 Corresponding Author.

RM-ODP will help to enhance the existing descriptions and
thereby tackle the problem of semantic descriptions, which is identified as one of the challenges in web-based geoprocessing [3]. Section 2 describes the related work on web-based geoprocessing and RM-ODP. Based on this, the proposed approach is described (Section 3), which is then exemplified by a walkthrough regarding profile interaction (Section 4). The article ends with a conclusion.

2. Related Work

This section describes the related work as applied in this article to create comprehensive textual descriptions of profiles and puts the presented work into context.

2.1. Web-based Geoprocessing

Geoprocessing is the application of functionality representing real-world processes (e.g. hydrological runoff models) or processing of geodata (e.g. generalization, (coordinate) transformation). Providing these models and functionality on the web is a relevant topic in research and industry, as it allows users to generate web-based information to support decision making. The WPS interface specification is OGC's attempt at a standardized interface for web-based geoprocess models. The WPS specification describes three operations: GetCapabilities provides service metadata, DescribeProcess provides process metadata with input and output parameters of the designated process, and Execute allows the client to perform the specific process according to the process metadata. The process metadata currently only consists of syntactic information about the input and output data (e.g. schema, datatype). Profiles are used to address semantic interoperability of processes. A profile has the following properties [2]:

- An OGC Uniform Resource Name (URN) that uniquely identifies the process (mandatory).
- A reference response to a DescribeProcess request for that process (mandatory).
- A human-readable document that describes the process and its implementation (optional, but recommended).
- A WSDL description for that process (optional).
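All three operations can be exercised as plain HTTP GET requests with key-value-pair (KVP) encoding. A small URL builder illustrates the shape of such requests; the endpoint and process identifier below are placeholders, not a real service:

```python
from urllib.parse import urlencode

def wps_request_url(endpoint, operation, **params):
    """Build a KVP-encoded WPS 1.0.0 request URL."""
    query = {"service": "WPS", "version": "1.0.0", "request": operation}
    query.update(params)
    return endpoint + "?" + urlencode(query)

# Hypothetical endpoint: ask for the metadata of a 'Buffer' process
url = wps_request_url("http://example.org/wps", "DescribeProcess",
                      identifier="Buffer")
```

The DescribeProcess response obtained this way is exactly the document whose profile reference is discussed below.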
A few profiles are available and are listed in Table 1. All these profiles focus on a specific classification of processes and input and output parameters (second property of a profile), but are rarely described in a structured and comprehensive way (third property of a profile).

Table 1: Overview of published WPS profiles.

Profile topic                     | Editors
Vector and raster-based processes | Nash 2008 [4]
Analysis of 3D data               | Lanig & Zipf [5]
Geomarketing                      | Walenciak & Zipf [6]
Decision support                  | Ostlaender [7]
Feature and Statistical Analysis  | Foerster & Schaeffer [8]
Besides the design of profiles, semantic annotations and semantic descriptions of geoprocess models have been investigated. For instance, [9] investigated the use of fine-granular ontologies for Geoprocessing Services, whereas [10] proposed coarse-granular semantic descriptions of Geoprocessing Services based on the service classification of ISO [11].

2.2. Reference Model for Open Distributed Processing

RM-ODP is a standardized approach from the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) for developing distributed systems. RM-ODP's realization consists of object modeling, viewpoint specification, distribution transparency and conformance [12]. Object modeling allows building abstractions of the basic system concepts. Viewpoints are used to specify a system from different perspectives (Figure 1). Distribution transparency of specific distributed components and conformance support interoperability of the components. For this work, viewpoints have been selected, as they allow describing a component (such as a profile) comprehensively.

Figure 1: RM-ODP viewpoints.

3. The Approach

To enhance the description of WPS profiles, an approach is required which provides a comprehensive view on the specific process. It can thereby be seen as human-readable metadata, which helps users to reason about the specific process. RM-ODP and its viewpoints have been identified as appropriate to help developers and communities in designing and documenting profiles. Table 2 describes the different viewpoints (areas of concern) and how they can be used (main concepts). The engineering viewpoint is not listed in the table, as it provides implementation-specific information, which is not considered due to the encapsulation of web service interfaces.
The Feature and Statistical Analysis report of the OGC testbed phase 7 [8] can be considered a first attempt to structure textual process descriptions by RM-ODP viewpoints.
Table 2: RM-ODP viewpoints and their function in WPS profile development.

Viewpoint   | Areas of Concern                                | Main concepts
Enterprise  | Objectives of processes                         | Artifacts, roles
Information | Information models and information manipulation | Data schemas
Computation | Logical decomposition of processes              | Computational interfaces
Technology  | Technical solutions                             | Technical artifacts

The documented viewpoints can be linked as a plain file in the metadata element of the profile or be included directly in the profile to support search regarding the different viewpoints. In particular, we propose a new set of XML elements in the WPS DescribeProcess document extending the ProcessBriefType, as presented in Figure 2.

Figure 2. Profile with RM-ODP extension.

A top-level RM-ODP element contains four child elements labeled according to the corresponding viewpoints. Each child element holds a human-readable description of the viewpoint according to the description from above. It is thereby possible for a client to query specific parts of the metadata.

4. Walkthrough

This section exemplifies a walkthrough for interacting with web-based geoprocess models using profiles from a user's perspective. This perspective already assumes that the profile is described accordingly (based on RM-ODP, Section 3) and has been registered officially. This walkthrough (Figure 3) does not take a catalog search into account, as profiles are not considered in catalogs yet. Thus, given a WPS entrypoint,
the WPS user accesses the GetCapabilities and DescribeProcess documents of a specific WPS. The profile URN included in the DescribeProcess response is used to retrieve the profile information using an official URN resolver. The URN resolver returns the WPS profile with the documented RM-ODP viewpoints. Based on the WPS profile, the user can inspect the syntactic interface. The documented viewpoints (Section 3) provide specific information about the process and its application. Based on this information the user can decide whether the process fits their needs and can specify the request to perform the process with the designated data.

Figure 3: WPS profile walkthrough.

5. Conclusion

Motivated by the lack of well-documented profiles (as also shown in the overview of available profiles in Section 2), this article proposes the use of RM-ODP to document profiles of geoprocess models. RM-ODP allows documenting distributed architectures using a viewpoint analysis. These viewpoints have been adopted in the approach for profile description (Section 3). Based on the structured way of viewpoint analysis, RM-ODP helps to create comprehensive and well-designed descriptions which are human-readable. Using this approach to document well-known and referenced profiles will increase the interoperability of the profiles and limit misinterpretation by the specific user. This becomes especially important in the Model Web [13], which exposes many different models as standardized Geoprocessing Services. The RM-ODP document can be referenced as a separate metadata file or be included as an extension of the structure of the profile. Using such a structure and the proposed encoding is a first step towards querying of profiles. The walkthrough (Section 4) shows the course of action involved in using a profile and how RM-ODP can support it.
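One possible encoding of the proposed extension — a top-level RM-ODP element with one child per viewpoint — is sketched below. The element names follow the viewpoints of Table 2, but the exact schema, the omitted namespaces and the process identifier are illustrative assumptions, not taken from the WPS specification:

```python
import xml.etree.ElementTree as ET

# Hypothetical DescribeProcess fragment carrying the RM-ODP extension
profile = ET.fromstring("""
<ProcessBrief>
  <Identifier>urn:ogc:def:process:example:buffer</Identifier>
  <RM-ODP>
    <Enterprise>Buffers features for proximity analysis.</Enterprise>
    <Information>GML feature collections in and out.</Information>
    <Computation>Single Execute call with a distance parameter.</Computation>
    <Technology>HTTP/XML bindings of WPS 1.0.0.</Technology>
  </RM-ODP>
</ProcessBrief>
""")

# A client can query one viewpoint of the metadata directly:
enterprise = profile.findtext("RM-ODP/Enterprise")
```

This is what makes the metadata queryable per viewpoint rather than one unstructured text blob.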
However, the approach is also limited in terms of establishing semantic interoperability for automatic web service interaction. Consequently, future research needs to enhance the proposed structure by including semantic descriptions using ontologies [9], [10] or the Object Constraint Language (OCL) from object-oriented modeling [14]. As demonstrated in the walkthrough, a unified approach for the interaction with profiles is not yet specified. Future research will need to investigate the
handling of URNs and the querying of profiles, for instance in catalogs. The approach presented here can be used as a starting point.

References

[1] C. Kiehle, C. Heier, and K. Greve, Requirements for Next Generation Spatial Data Infrastructures - Standardized Web Based Geoprocessing and Web Service Orchestration, Transactions in GIS, vol. 11, no. 6, Dec.
[2] OGC, OpenGIS Web Processing Service. Open Geospatial Consortium.
[3] J. Brauner, T. Foerster, B. Schaeffer, and B. Baranski, Towards a Research Agenda for Geoprocessing Services, in 12th AGILE International Conference on Geographic Information Science.
[4] E. Nash, WPS Application Profiles for Generic and Specialised Processes, in Proceedings of the 6th Geographic Information Days, vol. 32.
[5] S. Lanig and A. Zipf, Proposal for a Web Processing Services (WPS) Application Profile for 3D Processing Analysis, in 2010 Second International Conference on Advanced Geographic Information Systems, Applications, and Services.
[6] G. Walenciak and A. Zipf, Designing a Web Processing Service Application Profile for Spatial Analysis in Business Marketing, in 13th AGILE International Conference on Geographic Information Science, p. 8.
[7] N. Ostlaender, Creating Specific Spatial Decision Support Systems in Spatial Data Infrastructures, PhD thesis, University of Muenster.
[8] T. Foerster and B. Schaeffer, OWS-7 Feature and Statistical Analysis Engineering Report. OGC, 2010, p. 41.
[9] M. Lutz, Ontology-based Descriptions for Semantic Discovery and Composition of Geoprocessing Services, GeoInformatica, vol. 11, no. 1, pp. 1-36.
[10] R. Lemmens, Semantic Interoperability of Distributed Geo-services, PhD thesis, Delft University of Technology.
[11] ISO/TC 211, Geographic information - Services. International Organization for Standardization, 2005, p. 67.
[12] ISO/IEC, Information technology - open distributed processing - reference model: overview.
Geneva, Switzerland: ISO.
[13] G. N. Geller and W. Turner, The Model Web: A Concept for Ecological Forecasting, in 2007 IEEE International Geoscience and Remote Sensing Symposium.
[14] J. Warmer and A. Kleppe, The Object Constraint Language, Second Edition. Addison Wesley, 2003.
Towards Linking the Digital and Real World with OpenThingMap

Damian LASNIA a,1; Theodor FOERSTER a; Arne BRÖRING a,b,c

a Institute for Geoinformatics, University of Muenster, Germany
b 52 North, Münster, Germany
c ITC Faculty, University of Twente, Netherlands

Abstract. Digital inventories of outdoor features, such as OpenStreetMap, are currently evolving. A digital inventory of real-world things is still missing, but is required to enable full search and interaction in the digital world. This article presents an approach to link any real-world object to a georeferenced digital representation, thus suggesting a concept for location within the areas of the Web of Things, Ubiquitous Computing and Ambient Technology. The approach is twofold, based on a) OpenFloorMap as an inventory for buildings and b) OpenThingMap as an inventory for things. The inventories are based on a lightweight data model and are populated through the knowledge of the crowd using advanced mobile devices.

Keywords: Crowdsourcing, Spatial Inventories, Mobile Applications.

Introduction

Building inventories of real-world features in databases can be realized by crowdsourcing. Projects such as OpenStreetMap [1] use mobile devices which are able to measure the location context (i.e. GPS), the knowledge of the crowd, and a lightweight data model to build an inventory of outdoor features. This supports outdoor wayfinding and search for locations, represented as POIs. Thus, every real-world object outdoors is represented in the digital world and described through tags. However, as most of our daily life is spent indoors, an approach to map the inside of buildings, as well as the things residing within them, has not been proposed yet. Therefore, this article a) describes the OpenFloorMap for modeling buildings and b) proposes OpenThingMap as a way to link any entity in the real world to the digital world.
Thereby, the interaction between things in the real world and the digital world will be seamless. The presented approach is based on smartphone technology, which provides a set of sensors to measure the real world in such a way that digital representations of things can be created easily. The approach is based on lightweight data models and the Web of Things. Section 1 presents OpenStreetMap and the Web of Things as the main building blocks of the presented approach. Section 2 describes the OpenFloorMap and OpenThingMap, respectively. Section 3 presents a conclusion and provides an outlook for future work.

1 Corresponding Author.
1. Related work

1.1. OpenStreetMap

The OpenStreetMap project [1] was founded in 2004 and has a rapidly growing data inventory of streets and POIs. Its access and use are free of charge (Creative Commons license). The data is reported by so-called mappers, who use digitizing tools or mobile devices with GPS capabilities to capture the data. OpenStreetMap is used for several applications such as mapping or routing. The data model is based on three types of objects: nodes, ways and closed ways (to represent polygons). Thematic data is captured through tags (key-value pairs); the set of available tags is unlimited. Specific renderers use a set of designated tags to create cartographic representations.

1.2. Web of Things

The Web of Things [2] evolved from the Internet of Things [3] and integrates real-world things with the Web. Examples of such things are household appliances, embedded and mobile devices, but also smart sensing devices. Often, the user interaction takes place through a cell phone acting as the mediator within the triangle of human, thing, and Web. Applications of the Web of Things are influenced by the idea of ubiquitous computing [4] and range from smart shoes posting your running performance online, over management of logistics (e.g., localization of goods in the production chain), to insurance (e.g., car insurance costs based on the kilometres actually driven). The Web of Things leverages existing Web protocols as a common language for real objects to interact with each other. HTTP is used as an application protocol rather than a transport protocol, as is generally the case in web service infrastructures such as OGC's SWE framework [5]. Things are addressed by URLs and their functionality is accessed through well-defined HTTP operations (GET, POST, PUT, etc.). Hence, Web of Things applications follow the REST paradigm [6]. Specific frameworks (e.g.
[7, 8]) offer REST APIs to enable access to things and their properties as resources. These REST APIs can be used not only to interact with a thing via the Web; website representations of things may also be provided to display dynamically generated visualizations of data gathered by the thing. The mash-up paradigm and tools from the Web 2.0 realm can then be applied to easily build new applications. An example application may use Twitter to announce the status of a washing machine, or may let a fridge post to an Atom feed to declare which groceries are about to run out.

2. Approach

The proposed approach establishes a ubiquitous, tight coupling between our real life and web content that is connected with things in our environment. It is based on two components: OpenFloorMap and OpenThingMap.
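The REST interaction style described in Section 1.2 (things addressed by URLs, manipulated through HTTP verbs) can be sketched with a small in-memory dispatcher. This is a hedged illustration of the principle: the URLs, state layout, and `handle` function are hypothetical and not taken from any particular Web of Things framework.

```python
# Minimal sketch of REST-style access to things: each thing is a resource
# addressed by a URL, and its state is read/updated through HTTP verbs.
# The registry and URL scheme below are illustrative assumptions.

things = {
    "/things/washing-machine": {"status": "idle"},
    "/things/fridge": {"groceries": ["milk", "eggs"]},
}

def handle(method: str, url: str, body=None):
    """Dispatch a REST-style request against the in-memory thing registry."""
    if url not in things:
        return 404, None
    if method == "GET":                        # read a thing's current state
        return 200, things[url]
    if method == "PUT" and body is not None:   # update its state
        things[url].update(body)
        return 200, things[url]
    return 405, None

# Usage: poll the washing machine, then report a status change.
code, state = handle("GET", "/things/washing-machine")
handle("PUT", "/things/washing-machine", {"status": "spinning"})
```

In a real deployment the dispatcher would sit behind an HTTP server, so the same resources could feed mash-ups such as the Twitter or Atom examples above.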
2.1 OpenFloorMap

The OpenFloorMap is motivated by the need for a location referencing model for all human-accessible spaces, and by the fact that current flagship smartphones are capable of measuring room extent to a usable degree of accuracy. To enable easy data capturing and data management, the data model of OpenFloorMap (Figure 1) adopts the simplicity of the OpenStreetMap data model (Section 1.1). In particular, the data model of the OpenFloorMap consists of two-dimensional levels and three-dimensional rooms inside them. A set of levels, a set of geographic coordinates, and a unique identifier represent a building in the real world. Each building is associated with a set of POIs in OpenStreetMap, which represent the entries to the building. Figure 1: Data model of OpenFloorMap. The OpenFloorMap is based on an Android application (Figure 2). The user can capture the layout of the rooms and report the data to the server. Modern smartphones provide APIs to access their built-in sensors, which can provide the parameters for a system of equations based on trigonometric functions to determine room extent: proximity sensors measure the distance between the camera and room corners, while gyroscopes and orientation sensors determine the device's attitude. An assisted user interface adds building and level information. In a browser-based application, reported rooms can be arranged via drag-and-drop to represent the floor's actual layout. Figure 2: Android application user interfaces
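The kind of trigonometric relation mentioned above can be illustrated with a minimal sketch: given the sensed distances to two adjacent room corners and the horizontal angle between the two sightlines (from the orientation sensor), the law of cosines yields the length of the wall between the corners. This is an illustration of the principle under those assumptions, not the app's actual algorithm.

```python
import math

def wall_length(d1: float, d2: float, theta_deg: float) -> float:
    """Length of the wall between two corners sighted from the phone's
    position: d1, d2 are the measured distances to the corners, theta_deg
    the horizontal angle between the sightlines. Law of cosines:
    L^2 = d1^2 + d2^2 - 2*d1*d2*cos(theta)."""
    theta = math.radians(theta_deg)
    return math.sqrt(d1 ** 2 + d2 ** 2 - 2.0 * d1 * d2 * math.cos(theta))

# Example: corners at 3 m and 4 m, sightlines 90 degrees apart -> 5 m wall.
length = wall_length(3.0, 4.0, 90.0)
```

Repeating this for each pair of adjacent corners gives the full room polygon, which is the "room extent" the application reports to the server.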
2.2 OpenThingMap

The OpenThingMap integrates web content into our everyday environment. It is based on the OpenFloorMap (Section 2.1). Persons are able to access the OpenThingMap via gateways and explore the digital world with an in-world knowledge browser. This means that the user is able to set the gateway's position to any human-accessible space inside buildings (levels/<level_id>/rooms/<room_id>) or in the outside world (latitude, longitude) to gather a catalogue of surrounding things and their linked content on the web. Photos may link to social network profiles, the microwave to its manual, and the air conditioner to its remote-control web service. Things that expose their capabilities in a standardized format become localizable through spatial queries as physical entities, no longer just as URIs. This enablement of spatial queries within mashups of things [2] is a great practical achievement of the OpenFloorMap integration towards personalized ambient spaces within universal environments. We propose that things act as gateways between the real and the digital world and thereby link them (Figure 3). A gateway is the most specialized class within OpenThingMap: it represents not only itself but the user as well. Smart things are generalizations of gateways without user login; they differ from plain things in that they host embedded or attached computer components providing direct connectivity with the web. The generalization of the smart thing class is the thing class, which can hold everything in the real world that can be referenced either with geographic coordinates or with a room in OpenFloorMap plus local room coordinates. Things can be linked with static web resources, e.g. for descriptive purposes, and with web feeds to announce a thing's status. Figure 3: Concept of OpenThingMap.

3. Conclusion

This article presents an approach to incorporating real-world things into the digital world for search and interaction.
The approach is twofold, using (a) the OpenFloorMap and (b) the OpenThingMap. Both concepts are based on a light-weight data model, the knowledge of the crowd, and the capabilities of current smartphones. The presented approach of light-weight data models and protocols is complementary to existing semantic approaches for annotating data [9]. Future research should address machine-readable search and bind capabilities in the OpenThingMap and the inclusion of other digital sources which are not modeled as things, such as noise and air quality (phenomena).

References
[1] M.M. Haklay and P. Weber. OpenStreetMap: user-generated street maps. IEEE Pervasive Computing, pages 12–18.
[2] D. Guinard and V. Trifa. Towards the Web of Things: Web Mashups for Embedded Devices. In International World Wide Web Conference, Madrid, Spain.
[3] N. Gershenfeld, R. Krikorian, and D. Cohen. The Internet of Things. Scientific American, 291(4):76–81.
[4] M. Weiser. The Computer for the 21st Century. Scientific American, 265(9):94–104.
[5] M. Botts, G. Percivall, C. Reed, and J. Davidson. OGC Sensor Web Enablement: Overview and high level architecture. GeoSensor Networks.
[6] R.T. Fielding and R.N. Taylor. Principled Design of the Modern Web Architecture. ACM Transactions on Internet Technology, 2(2).
[7] J. Pinto, R. Martins, and J.B. Sousa. Towards a REST-style Architecture for Networked Vehicles and Sensors. In Proceedings of the 8th IEEE International Conference on Pervasive Computing and Communications: WoT 2010: First International Workshop on the Web of Things.
[8] B. Ostermaier, F. Schlup, and K. Römer. WebPlug: A Framework for the Web of Things. In Proceedings of the 8th IEEE International Conference on Pervasive Computing and Communications: WoT 2010: First International Workshop on the Web of Things.
[9] V. Stirbu, P. Selonen, and A. Palin. The location graph: towards a symbolic location architecture for the web.
In Proceedings of the 3rd International Workshop on Location and the Web, page 10. ACM, 2010.
Design of acoustically supported animated maps for presenting spatio-temporal information

Jochen SCHIEWE; Beate WENINGER
HafenCity University Hamburg, Lab for Geoinformatics and Geovisualization (g2lab)

Keywords. Geovisualization, multimedia cartography, spatio-temporal visualization

1. Introduction

Static map graphics are without doubt the central form for presenting geoinformation, since they guarantee a space-saving rendition of information georeferenced in a coordinate system. Beyond that, however, other forms of information encoding are conceivable (e.g. animations, videos, or acoustic presentations), which can be combined into multimedia representations as needed. Against the background that the need for representing spatio-temporal change (change detection and analysis) will continue to grow strongly, this contribution focuses on the combination of two particular forms of encoding: animations and acoustic presentations. The underlying hypothesis for this so far little-studied combination is that multicodal or multimodal communication enables a richer and more flexible communication of time-dependent geoinformation. Ideally, the disadvantages of each of the two encoding forms are cancelled out by the other.

2. State of the art

Owing to their dynamic character, animated representations are in principle very well suited to depicting spatio-temporal change. On the other hand, they also exhibit a number of disadvantages. Among these, due to the temporal pacing, is the high information density, which can overload short-term memory. Furthermore, when an animation simply plays through, previous states cannot easily be revisited, so comparisons are difficult.
For these reasons, care must also be taken that neither too many values (e.g. one value for each US state) nor values changing in opposite directions (e.g. average temperatures partly decreasing, partly increasing) are represented. Ideally, animations should therefore be used only for
conveying a single thematic attribute with a uniform trend. Acoustic presentations (tones, sounds, noises, bangs) are rarely used to convey geoinformation, apart from the case of supplementary speech output (e.g. in navigation systems). While the human ear can distinguish very small differences in loudness, pitch, etc. quite well, the perception of absolute values is impossible. Likewise, the simultaneous presentation of two or more different quantities is practically as infeasible as the representation of geometric properties. The first detailed investigations of acoustic presentations of geoinformation stem from Krygier (1994), who, analogous to the graphic variables (size, colour, etc.), defines sound variables (loudness, pitch, duration, tempo, etc.). These are in principle suitable for conveying ordinally scaled data. Acoustic signals find practical use in maps for the blind, e.g. in the iSonic software. For other uses of multimedia cartographic representations with sound support there are only a few implementations and corresponding evaluations. Exceptions include the works of Krygier (1993; topic: AIDS cases in the USA), Fisher (1994) and Lodha et al. (1996; both on uncertainty), Lodha et al. (1999) and Harding et al. (2002; both on seismic events), and Brauen (2006; topic: elections in Canada). In each case, static (and partly interactive) thematic maps are extended with acoustic elements, and in most cases improved usability is claimed. The combination of acoustic presentations with animated maps has so far hardly been treated systematically in the literature.
Buziek (2003) describes the design of an audio-visual representation of a flooding event, in which important features of the process are reinforced by musical or spoken elements. However, an actual encoding of quantitative values does not take place there, nor does a well-founded evaluation from the user's perspective. From the psychology of perception, general studies are known that concentrate on the risks of overload and interference (Goldstein, 2002).

3. Experiments

Within the research project a2maps, supported by GiN e.V., various novel acoustically supported animated maps are being produced, which are to be made freely available on the WWW and tested there, among other things by comparison with alternative static representations (time series, time maps). These implementations, realized with Adobe Flash, are to consider two basic concepts of multimedia maps for conveying quantitative attributes: double coding and supplementary coding (see also Fig. 1). In double coding of an attribute (e.g. the spread of algae or the change of Germany's population), the corresponding values are represented both by typical graphic variations in a choropleth map and, synchronously, by the variation of tones (e.g. through loudness). A distinction is made between a global presentation (i.e. one acoustic value for the
entire scene) and an object-related presentation (i.e. separate acoustic values for individual spatial reference units). In the latter case, an interactive selection of the respective reference units (e.g. areas) is necessary (e.g. through prior definition or on-the-fly selection via mouse-over). The core question of this experiment is whether this doubling allows the user to grasp a quantitative attribute better, or whether, conversely, an overload effect is observed. Concrete aspects of investigation, to be answered through surveys of users of the WWW version and eye-tracking analyses, relate to the usability criteria of effectiveness (are values grasped correctly, or more correctly?), efficiency (can the animation perhaps run faster?), and satisfaction (does the acoustic presentation increase motivation, or rather distract?). In this context, different variants of legend design form a central aspect of investigation: for example, a complete and exclusive a-priori presentation is possible, or a selective (possibly interactively invoked) legend of the symbols currently shown. Figure 1. Schematic representation of the multimedia combination of animated map and acoustic presentation for encoding quantitative attributes of geo-objects
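As a sketch of how Krygier's sound variables could encode a quantitative attribute in the double-coding setup described above, the following Python example maps a value linearly onto loudness and pitch. The ranges are illustrative assumptions, not values from the a2maps project.

```python
def to_sound(value, vmin, vmax, *,
             loud_range=(0.2, 1.0), pitch_range=(220.0, 880.0)):
    """Map a quantitative attribute value onto two sound variables:
    loudness (relative amplitude) and pitch (Hz). The default ranges
    are illustrative stand-ins, not empirically derived parameters."""
    t = (value - vmin) / (vmax - vmin)          # normalize to [0, 1]
    loud = loud_range[0] + t * (loud_range[1] - loud_range[0])
    pitch = pitch_range[0] + t * (pitch_range[1] - pitch_range[0])
    return loud, pitch

# Example: mid-range attribute value -> mid-range loudness and pitch.
loud, pitch = to_sound(50.0, 0.0, 100.0)
```

In the object-related variant, such a mapping would be evaluated per selected reference unit; in the global variant, once per animation frame.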
In supplementary coding, the spatial spread of a phenomenon is again visualized in a choropleth map, while the global (or interactively selectable for subregions) rate of change is described by variations of tones (e.g. loudness for quantity, pitch for the sign of the change). Here too, besides the usability aspects mentioned above, the question of the perceivable information density is at the centre of the investigations. A distinction is also made between inherently correlated values (e.g. visual presentation of the change in the population of the federal states and acoustic rendition of the change in the total population) and merely potentially correlated values (e.g. visual presentation of the change in the population of the federal states and acoustic rendition of the change in road deaths in Germany).

Acknowledgement. This project is funded by the Verein zur Förderung der Geoinformatik in Norddeutschland (GiN e.V.).

References
[1] Brauen, G. (2006): Designing interactive sound maps: Using scalable vector graphics. Cartographica, 41(1).
[2] Buziek, G. (2003): Eine Konzeption der kartographischen Visualisierung. Habilitation thesis, Universität Hannover.
[3] Fisher, P.F. (1994): Visualization of the reliability in classified remotely sensed images. Photogrammetric Engineering and Remote Sensing, 60(7).
[4] Goldstein, E.B. (2002): Wahrnehmungspsychologie. 2nd edition. Spektrum Akademischer Verlag.
[5] Harding, C. et al. (2002): A Multi-Sensory System for the Investigation of Geoscientific Data. Computers & Graphics, 26.
[6] Krygier, J.B. (1993): Sound and cartographic design. Videotape. Pennsylvania State University, Department of Geography.
[7] Krygier, J.B. (1994): Sound and geographic visualization. In: MacEachren, A.M. & Taylor, D.R.F. (eds.): Visualization in Modern Cartography. Pergamon.
[8] Lodha, S.K. et al.
(1996): Visualizing geometric uncertainty of surface interpolants. Graphics Interface. [9] Lodha, S.K., Joseph, A.J. & Renteria, J.C. (1999): Audio-visual data mapping for GIS-based data: an experimental evaluation. Proceedings of the 1999 Workshop on New Paradigms in Information Visualization and Manipulation, in conjunction with the eighth ACM International Conference on Information and Knowledge Management.
WebGIS technologies in use for volunteer nature conservation

Astrid LIPSKI 1,a; Roland HACHMANN a
a IP SYSCON GmbH, Hannover

Abstract. With their commitment and the data they collect, volunteers contribute substantially to the work of nature conservation. Given the declining numbers of volunteers active in nature conservation, it becomes all the more important to support their work by providing suitable tools. WebGIS technologies can be used profitably here. Based on several practical examples, their fields of application and their advantages for supporting volunteer work in nature conservation are presented.

Keywords. Nature conservation, volunteering, WebGIS, species recording, information system, mobile application, GPS, PDA

1. Introduction

The commitment of volunteers is of great importance for nature conservation in Germany, with the recording of animal and plant species being one of the main fields of activity [1]. Associations and conservation societies compile information on the occurrence of species and thus provide a basis for scientific analyses, for coping with conservation and planning tasks, and for environmental education. Owing to various social and economic developments (including an ageing population, individualization, and economic uncertainty), however, the willingness to volunteer for nature conservation over the long term is declining [2]. Besides pursuing other ways of motivating volunteer engagement and recruiting new volunteers, the provision of suitable tools to support efficient data capture and data transfer is therefore being pushed forward. Appropriately designed WebGIS-based systems can, on the one hand, meet the requirements of data users for scientifically analysable, high-quality data capture.
On the other hand, they meet the needs of the volunteer users, who favour intuitively usable tools, communication facilities, and a vivid presentation of the data. How these requirements can be implemented, and which fields of application can be supported in practice by such systems, is currently being explored by IP SYSCON GmbH in cooperation with various volunteer, governmental, and scientific partners in Lower Saxony. (1 Corresponding author: Dr. Astrid Lipski, IP SYSCON GmbH, Tiestestraße 16-18, Hannover.) The main challenge is to combine existing GIS and Internet technologies, and to extend them with new functions, in such a way that the multilayered requirements regarding data capture and analysis, the provision of information via the Internet, and the collaboration among users are met.

2. The use of WebGIS technologies for ...

2.1 ... data capture and analysis by volunteer users

A WebGIS-based solution developed jointly by NABU Laatzen e.V. and IP SYSCON GmbH primarily addresses the requirements for data capture from the volunteers' perspective [3] (2). The portal emapper, based on free software, offers tools for uniform digital recording of species data while taking existing professional standards into account. To avoid redundant data storage, further (base) geodata can be integrated as web services (WMS, WFS) alongside the association's own data. Various filter and export functions are available for data analysis and transfer. Usability tests showed that the portal meets the requirements of the volunteer users. Particular praise went to the intuitive operation, the clearly arranged user interface, and the automatic plausibility checking of the entries.

2.2 ... merging heterogeneous data on the government side

A flexible system architecture also allows the emapper portal to be used for data capture by public authorities. Together with the Region Hannover, the portal was specifically adapted to the requirements of an authority. In this way a digital species register can be established that supports various planning tasks and also makes individual reports from mapping campaigns available for higher-level analysis.
Standardized data capture also optimizes the data exchange with the Lower Saxony Water Management, Coastal Defence and Nature Conservation Agency (NLWKN) on the part of the volunteer reporters and the Region Hannover. Besides integrating data of different origins, a key goal of the development was to build a roles-and-rights management that does justice to the intended heterogeneous user structure and the resulting requirements for data transfer. The development is now being followed by a one-year practical trial phase. (2 The project was funded by the Deutsche Bundesstiftung Umwelt and the Region Hannover. Further information is available online.)
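The integration of external geodata as OGC web services mentioned in Section 2.1 can be illustrated with a short sketch that assembles a WMS GetMap request. The service URL and layer name below are placeholders, not the actual emapper services.

```python
from urllib.parse import urlencode

def getmap_url(base_url, layer, bbox, size=(800, 600), srs="EPSG:25832"):
    """Assemble a WMS 1.1.1 GetMap request of the kind used to pull an
    external geodata layer into a portal map view. base_url and layer
    are placeholders for a real service endpoint and layer name."""
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "SRS": srs,
        "BBOX": ",".join(str(c) for c in bbox),   # minx,miny,maxx,maxy
        "WIDTH": size[0], "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Example: request a map image of a hypothetical species-records layer.
url = getmap_url("https://example.org/wms", "species_records",
                 (550000, 5800000, 560000, 5810000))
```

Fetching the resulting URL would return a rendered PNG that the portal overlays on its base map; WFS requests for vector features follow the same key-value pattern.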
2.3 ... the integration of mobile field data

The research project ARDINI (Artenschutz digital in Niedersachsen) (3), a cooperation with the University of Oldenburg, Jade Hochschule Wilhelmshaven/Oldenburg/Elsfleth, the NLWKN, and several conservation associations, aims at the wireless transmission of recorded data directly from the field using mobile communications technology. Map material and species identification functions in the field are also part of the planned application. The already proven WebGIS technologies of emapper are used for the central collection and further processing of the GPS-supported field data. The system is being developed to application maturity for selected species groups and evaluated with regard to content validity and practical benefit for a sustainable, digital support of species recording by volunteers.

2.4 ... the public presentation of conservation data

Together with BUND Landesverband Niedersachsen e.V., a Lower Saxony-wide information system for orchard meadows is being set up (4). In future it will offer all interested parties the possibility of recording and querying information on orchard meadows via the Internet. The development of the WebGIS-based information system also builds on the existing experience with WebGIS-based species recording. In addition to spatial information on the orchard meadows, the system will contain information on marketing and events, the condition of the meadows, and the occurrence of old fruit varieties. Tools for uniform data capture are also provided. The platform is being built step by step: the trial phase, including the integration of existing information and of the various actors involved, initially covers four model districts.
At the end of this trial phase, the platform will be fully available to all interested users.

3. Conclusion and outlook

WebGIS-based solutions make an important contribution to more effective networking in nature conservation and channel the results of volunteer work into many kinds of use. They can act not only as a data and communication platform between an association and its volunteer mappers, but, depending on the breadth of deployment, also involve public authorities, other associations, and the public. To broaden the fields of application and target groups, they can be linked with existing technical solutions (e.g. specialized portals and registers). They can likewise be integrated into technological infrastructures and metadata information systems, and used to publish environmental data in accordance with environmental information legislation. The integration of mobile technologies and an expansion of communication facilities in the spirit of Web 2.0 moreover give younger people in particular more incentive to volunteer. (3 The project is funded by the Deutsche Bundesstiftung Umwelt. 4 The project is funded by the Niedersächsische Bingo-Umweltstiftung. Further information is available online.)

References
[1] R. Schulte, Freiwillige in Naturschutzverbänden, Naturschutz und Biologische Vielfalt 37 (2006).
[2] H.-W. Frohn & J. Rosebrock, Europäisches Jahr des Ehrenamts und Biodiversität: Der Beitrag naturwissenschaftlicher Vereinigungen zur Erhaltung der biologischen Vielfalt, Natur und Landschaft 1 (2011), 2-6.
[3] S. Rüter, R. Hachmann, S. Krohn-Grimberghe, D. Laske, A. Lipski & E. v. Ruschkowski, GIS-gestütztes Gebietsmonitoring im ehrenamtlichen Naturschutz, ibidem-Verlag, Stuttgart, 2010.
XErleben: a data model for a municipal leisure register

Christine ANDRAE a; Jens HINRICHS b; Friedhelm KRUTH c; Katja NIENSTEDT d; Birgit PIEKE b; Axel ZOLPER a
a Regionalverband Ruhr, Stadtplanwerk Ruhrgebiet; b Kreis Warendorf, Vermessungs- und Katasteramt; c Bezirksregierung Köln, Abt. GEObasis.nrw; d Stadt Solingen, Vermessungs- und Katasteramt

Abstract. XErleben describes a data model for the standardized exchange of places of interest carrying municipal leisure and infrastructure information. The goal is interoperable provision and use in regional, state-wide, or thematically specialized portals, and interoperable feeding into the portals of neighbouring municipalities or regions. The model was developed by a working group on behalf of the municipal umbrella associations in North Rhine-Westphalia and coordinated with various actors of the state.

Keywords. Data model, places of interest, municipal spatial data infrastructure

1. Introduction

Where exactly is the Veltins-Arena? Which is the nearest U-Bahn stop? Where is the nearest open-air pool? Which sights will I pass on the Römerroute? These are questions to which citizens and tourists seek answers in municipal and regional geoportals, and which do not stop at municipal boundaries. Technically, data exchange poses no problem. What has been missing so far, however, is an agreement on the structure of the data at the level of object classes and their properties. The working group "Kommunales Freizeitkataster" of the municipal umbrella associations in North Rhine-Westphalia has, on the basis of national and international standards (1), developed a domain model named XErleben and documented it in UML and as a GML schema (2). With this semantic model, data held in a decentralized manner can be merged in portals and displayed uniformly in a rule-based way.
As important actors in this field, the Regionalverband Ruhr and the Bezirksregierung Köln (GEObasis.NRW division) were involved in the working group. In a second step, tourism associations, regional associations (Landschaftsverbände), and further actors related to tourism and leisure were included in the technical coordination. (1 Schüttel, Marcel: AAA-konforme Modellierung von Geofachdaten. zfv 2009/01. 2 Information and data model on the project homepage.)
2. Contents of the XErleben data model

Places of interest can be not only classic, touristically relevant points of interest such as museums or hotels, but also objects of municipal infrastructure such as schools, kindergartens, social facilities, or the locations of weekly markets and commercial areas. Objects with linear or areal extent, such as leisure routes or nightlife districts, also want to be searched for and found in portals. The model additionally provides for event objects as well as company locations with industry codes for Yellow-Pages applications. The starting point for the model was the semantic model of the tourism and leisure information system of the AdV (4), which already existed under the name TFIS (3). While TFIS is restricted to leisure and tourism information, the municipal data model XErleben additionally covers the important areas of municipal infrastructure. Besides extending the object catalogue with additional object classes, the depth of information was also expanded with regard to suitable search criteria.

3. Structure of the XErleben data model

Comparable to CityGML, XPlanung, OKSTRA XML, and other domain-specific standards (5), XErleben is a GML (3.1) application schema. It is, however, deliberately kept simple: it restricts the geometries used to simple two-dimensional primitives and forgoes topological relationships for now. The central object class of the data model is XE_OrtVonInteresse (OVI). It bundles all master data that are independent of the kind of object. Each OVI can be assigned a touristic and a thematic significance, which enables a simple scale-appropriate selection in geoportals. An XE_OrtVonInteresse must be assigned at least one point geometry; additional point, line, or area geometries can be captured. It is also possible to reference existing geometries.
Any number of relationships can exist between individual places of interest, expressing an organizational, thematic, or spatial affiliation with another place of interest. Thus sights or restaurants can be assigned to a tourist route, and companies to a commercial area. Any number of synonyms can be used for search queries, but also for eliminating duplicates. An XE_OrtVonInteresse must be assigned at least one XE_Kategorie; any number of categories may be chosen. Depending on the category, further attributes are assigned. So that data maintenance does not become too laborious, there is only a small set of mandatory attributes, but a whole series of optional attributes. The attribute set was designed with search criteria in mind and does not replace a domain database. (3 Flocke, Berthold & Wolf, Peter: Touristik- und Freizeitinformationssystem: Konzeptioneller Ansatz zur Führung von raumbezogenen Fachinformationen und praktische Realisierung. zfv 2009/04. 4 Arbeitsgemeinschaft der Vermessungsverwaltungen der Länder der Bundesrepublik Deutschland.)
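To make the schema structure described above concrete, the following sketch builds a minimal XE_OrtVonInteresse feature as XML: master data, the mandatory point geometry, and at least one category. The namespace URI and the lower-case element names are assumptions for illustration only; the published GML schema defines the authoritative names.

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace URI; the real XErleben schema defines its own.
XE = "http://example.org/xerleben"
GML = "http://www.opengis.net/gml"

def ort_von_interesse(name, lon, lat, categories):
    """Build a minimal XE_OrtVonInteresse-like feature: name, one
    mandatory point geometry, and at least one category."""
    ovi = ET.Element(f"{{{XE}}}XE_OrtVonInteresse")
    ET.SubElement(ovi, f"{{{XE}}}name").text = name
    point = ET.SubElement(ovi, f"{{{GML}}}Point")
    ET.SubElement(point, f"{{{GML}}}pos").text = f"{lon} {lat}"
    for cat in categories:   # XE_Kategorie: at least one is required
        ET.SubElement(ovi, f"{{{XE}}}kategorie").text = cat
    return ovi

xml_bytes = ET.tostring(ort_von_interesse(
    "Veltins-Arena", 7.068, 51.554, ["Sport", "Veranstaltungsort"]))
```

A portal ingesting such a transfer document would validate it against the GML application schema and map the feature into its own relational model.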
Fig. 1: The central object of the XErleben model is the class XE_OrtVonInteresse. Only the top two levels of the category tree are shown.

4. Realizations of the model

The goal of the modelling is the definition of a transfer schema in GML. The technical implementation in a relational model is left to the municipalities or regional cooperations. The model was deliberately kept simple so that even small municipalities can hold and provide their data with simple means; more ambitious realizations can be delivered by regional cooperations. The Regionalverband Ruhr and the Geoportal Münsterland are working on implementations. First results show that the data model can be maintained via web interfaces. In North Rhine-Westphalia, various regional and state-wide actors such as Geobasis.NRW, the Radroutenplaner NRW, and the regional and tourism associations provide their leisure and infrastructure information via geoservices. With the XErleben standard there is now a common denominator that allows this decentrally held information to be merged in portals and displayed through rule-based symbolization. Wide adoption beyond the borders of North Rhine-Westphalia is desirable; this will also be promoted through a recommendation of the municipal umbrella associations to their member municipalities. Inclusion in the family of XÖV standards is being sought.
Risk assessment of visual obstruction by low sun angles for the road network

Mona RICHTER; Marc-Oliver LÖWNER

Abstract. We present a model and implementation for assessing the risk posed by low sun angles on Lower Saxony's motorways, using OpenStreetMap data and a temporally high-resolution solar position module. Visual obstruction by a low sun repeatedly leads to sometimes severe road accidents. By matching the sun angle pairs computed for individual road geometries against the empirically determined view-field angles of a test vehicle, endangered road sections are identified at a temporal resolution of 15 minutes. The analyses presented here thus constitute a pilot study for an early-warning system for such hazard situations in private road transport.

Keywords. Risk assessment, OpenStreetMap, transport

1. Introduction

Visual obstruction by a low sun poses a considerable risk to private road transport. This is illustrated not least by the mass pile-up of July 2009, which stretched over 30 kilometres of the A2 motorway, left 80 people injured in more than 250 vehicles, and caused property damage of more than 1.5 million euros (Welt 2009). At present, neither official bodies in general nor commercial navigation systems individually warn of these hazards. Yet predictions of visual obstruction by low sun angles for the road network are possible in principle. While data on road geometries are in part even freely provided by the OpenStreetMap project (OSM 2011), the sun's path can be computed for any place at any time of day. Combined with a vehicle model for estimating the driver's field of view, a risk analysis can be carried out.
Here we present such a risk assessment for individual road traffic endangered by low sun positions. By combining astrophysical calculations with free geodata on the motorways of Lower Saxony, a spatially and temporally highly resolved, geometry-based assessment of visual obstruction by low sun positions is derived, initially offline.
2. Model architecture and implementation of the risk analysis

Modelling the risk that low sun positions pose to individual road traffic comprises the provision of geometries of the Lower Saxony motorways and the computation of sun positions at quarter-hour intervals for a whole year. These are analysed geometrically in combination with a vehicle model that represents the driver's sight angles. As a result, the five risk classes are computed for the driver for every quarter-hour of a year on the given road geometries. The vehicle model (Fig. 1) accounts for visual obstruction by a low sun from the front and from behind. For obstruction from the front, the normal case (α_O + δ_O + α_U) is distinguished from the case with the sun visor folded down (δ_O + α_U). For obstruction from behind, glare via the rear-view mirror (β_L + β_R) and the left wing mirror (γ_L) is taken into account. In this approach, the corresponding sight angles were measured on a Škoda Octavia.

Fig. 1: Vehicle model with the driver's sight angles endangered by low sun positions (vehicle images left: Boellet-Formenbau 2001, right: Audiquattrofan 2001).

The sight angles identified in the vehicle model at which a hazard from a low sun occurs are grouped into the hazard classes "no danger", "glare from the front", "glare from the front despite sun visor" and "glare from behind". The implementation sketched in Fig. 2 is based on OpenStreetMap data of the Lower Saxony motorways, which are imported via the tool osm2pgsql into a PostgreSQL database extended with PostGIS.
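The geometric comparison described above can be sketched as follows: a hazard class is assigned from a sun azimuth/elevation pair and a road bearing. The threshold angles below are illustrative placeholders, not the values measured on the test vehicle; the class names merely mirror the four hazard classes named in the text.

```python
# Hypothetical field-of-view thresholds in degrees. The paper derived
# such thresholds empirically on a test vehicle; these values are
# illustrative stand-ins only.
FRONT_HALF_ANGLE = 45.0    # horizontal half-angle for glare from the front
FRONT_ELEV_NORMAL = 25.0   # max sun elevation that still blinds (visor up)
FRONT_ELEV_VISOR = 12.0    # max sun elevation with the sun visor folded down
REAR_HALF_ANGLE = 30.0     # horizontal half-angle for glare via the mirrors
REAR_ELEV_MAX = 15.0       # max sun elevation for glare from behind

def risk_class(sun_azimuth, sun_elevation, road_bearing):
    """Classify glare risk for a driver heading along `road_bearing`.
    All angles in degrees; azimuths/bearings clockwise from north."""
    if sun_elevation <= 0:           # sun below horizon: no glare possible
        return "no danger"
    # signed angle between driving direction and sun direction, in (-180, 180]
    rel = (sun_azimuth - road_bearing + 180.0) % 360.0 - 180.0
    if abs(rel) <= FRONT_HALF_ANGLE:
        if sun_elevation <= FRONT_ELEV_VISOR:
            return "glare from front despite visor"
        if sun_elevation <= FRONT_ELEV_NORMAL:
            return "glare from front"
    elif abs(abs(rel) - 180.0) <= REAR_HALF_ANGLE and sun_elevation <= REAR_ELEV_MAX:
        return "glare from behind (mirrors)"
    return "no danger"
```

For example, a driver heading due west (bearing 270°) with the sun in the west at 10° elevation falls into the most severe frontal class, while the same sun position blinds an eastbound driver via the mirrors.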
After transformation from the Google Mercator projection to the Gauß-Krüger coordinate system, the geometries, which are available as polylines, were split into their individual segments for more precise modelling (shown in blue in the figure). To speed up the risk modelling, the azimuth and midpoint of each of these lines were precomputed and stored in additional columns of the geometry table.
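The per-segment precomputation described above (splitting polylines and storing an azimuth and midpoint per segment) can be sketched as follows; the dictionary layout is an assumption standing in for the additional geometry-table columns.

```python
import math

def segment_azimuth_midpoint(x1, y1, x2, y2):
    """Azimuth (degrees clockwise from grid north) and midpoint of one
    road segment in a projected (e.g. Gauss-Krueger) coordinate system."""
    azimuth = math.degrees(math.atan2(x2 - x1, y2 - y1)) % 360.0
    return azimuth, ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

def split_polyline(coords):
    """Split a polyline into its segments, each with a precomputed
    azimuth and midpoint (mirroring the extra table columns)."""
    segments = []
    for (x1, y1), (x2, y2) in zip(coords, coords[1:]):
        az, mid = segment_azimuth_midpoint(x1, y1, x2, y2)
        segments.append({"start": (x1, y1), "end": (x2, y2),
                         "azimuth": az, "midpoint": mid})
    return segments
```

Precomputing these values once trades storage for query speed, since every segment would otherwise be re-measured against every sun-angle pair.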
Fig. 2: Implementation of the hazard analysis (see text for explanations).

The computation of the quarter-hourly sun positions as azimuth and elevation angles following (AstAlm 2006; Meeus 2000), as well as the computation of the risk classes, was carried out in Java (shown in pink in the figure). For each geometry considered, the results were stored in the database encoded as a string. In a further step, the threshold incidence angles defined in the risk classes were compared with the sun angles, and the result was likewise stored in the database. Database access was realised with the PostgreSQL JDBC interface, which had to be extended to meet the requirements (PG-Forum 2007). It became apparent that reading and writing database access is a decisive cost factor, which urgently needs to be improved in further development of this approach. This bottleneck is hard to avoid, since all the road geometries available here must be compared with sun-angle pairs. When concrete routes and corresponding times are given, however, the cost of the database query becomes negligible; the database load will then be reduced considerably. The visualisation of the shapefiles exported with pgsql2shp was done in ArcMap. Using a script developed for this purpose in VBA with ArcObjects, a map of the Lower Saxony motorways was created for each time step and exported as an image, coloured according to the hazard class. The result is thus a time series of images.

3. Results

The result of the modelling described above is a 15-minute risk classification for all available individual geometries of the Lower Saxony motorways. Figure 3 shows the situation on the motorways around Braunschweig at 16:00 (UT).
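The paper computes the quarter-hourly sun positions in Java following the Astronomical Almanac and Meeus. As a rough illustration only, a coarse solar-position approximation (not the Meeus algorithm, and accurate to roughly a degree at best) evaluated on the same 15-minute grid might look like:

```python
import math

def solar_position(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation and azimuth in degrees.
    `solar_hour` is local solar time. This is a coarse stand-in for the
    Meeus / Astronomical Almanac algorithms used in the paper."""
    # approximate solar declination for the given day of year
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    h = math.radians(15.0 * (solar_hour - 12.0))   # hour angle
    lat, dec = math.radians(lat_deg), math.radians(decl)
    elev = math.degrees(math.asin(
        math.sin(lat) * math.sin(dec)
        + math.cos(lat) * math.cos(dec) * math.cos(h)))
    az = math.degrees(math.atan2(
        math.sin(h), math.cos(h) * math.sin(lat) - math.tan(dec) * math.cos(lat)))
    return elev, (az + 180.0) % 360.0   # azimuth clockwise from north

# a 15-minute grid over one daytime span, echoing the paper's
# quarter-hour temporal resolution (here: 06:00 to 17:45)
quarter_hours = [solar_position(52.3, 80, 6 + i * 0.25) for i in range(48)]
```

At solar noon on an equinox at Braunschweig's latitude (about 52.3°N) this yields an elevation near 37° with the sun due south, which matches the rule-of-thumb value 90° minus latitude.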
As expected, at the time chosen here as an example, drivers on the east-west motorways are blinded from the front when heading west. At the Braunschweig-Nord motorway junction, where the A2 and the A391 meet, the enlarged detail view shows a changing hazard situation on the exit ramps.
This situation is particularly precarious because the driver is taken by surprise by the glare of the sun. Further results, in the form of videos of selected time periods, are available on the web pages of the Institut für Geodäsie und Photogrammetrie at TU Braunschweig.

Fig. 3: Hazard to drivers on the Lower Saxony motorways at 16:00 (UT).

4. Discussion

The results of the analysis carried out here are available for road geometries at 15-minute intervals for a whole solar year. It could be shown that visual obstruction by low sun positions can, in principle, be modelled at high temporal resolution with the help of free geodata. Building on this modelling, corresponding traffic management measures become possible, for instance temporary speed limits or a requirement to wear sunglasses. The model results shown above arose from a severely limited geodata situation: neither the relief nor other geometries, such as buildings or trees, which can shade the roads considered, have been taken into account. Likewise, the individual sight situation of vehicle classes other than the one mentioned here has not been addressed. An evaluation through test drives is also required in order to verify and, where necessary, improve the model results.
In the next steps, in addition to the model improvements mentioned above, modelling of further road types is planned, for instance in inner-city areas. The database queries, which are costly even in the simplified approach presented here, will then move even further into the foreground. Database access must be minimised by suitable algorithms, for instance to enable real-time analysis for individual routes as well. However, as discussed above, the database load in such applications is considerably lower than when considering all geometries at all time intervals. A real-time hazard analysis can then be deployed individually, in interplay with navigation systems and current weather information.

Acknowledgement

The pilot project "Risikobewertung von Sichtbehinderungen durch niedrige Sonnenstände für das Verkehrswegenetz" was financially supported by the Verein zur Förderung der Geoinformatik in Niedersachsen (GiN) e. V. We also thank the reviewers of this contribution for their constructive comments.

References

[1] Audiquattrofan, URL: spqgrb2oben.gif
[2] AstAlm 2006: The Astronomical Almanac For The Year 2006, The Stationery Office, London 2004
[3] Boellet-Formenbau, URL
[4] Meeus 2000: Astronomical Algorithms, Willmann-Bell, Richmond 2000
[5] OSM 2011
[6] PG-Forum 2007
[7] Welt 2009, URL
Poster Abstracts
Unveiling the design framework behind transactional map symbols

Martin LOIDL a,1; Florian FISCHER b; Christoph TRAUN a
a Center for Geoinformatics (Z_GIS), University of Salzburg
b Institute for Geographic Information Science, Austrian Academy of Science

Abstract. Map symbols in general play a central role in cartography, since they represent real-world entities on a map. Due to the increasing use of maps as Web 2.0 interfaces for all kinds of applications, point symbols in particular have become windows to the real world, allowing communication about and interaction with real-world entities. Although this functional shift has been clearly observable for years, a sound formalization of graphical and functional design is still lacking. This paper tries to unveil currently applied concepts behind the graphical and functional design of so-called transactional map symbols, based on an examination of several popular map-based web platforms. Furthermore, this paper should serve as a starting point for a discussion about the extent to which cartographic concepts could improve point symbol design in a highly dynamic and interactive map environment.

Keywords. Cartography, map symbol, graphical user interface, design

Introduction

The growing popularity of digital maps in internet media and their use as a baseline for ubiquitous urban computing [1-4] has led to an alteration of the core purpose of maps. It is shifting from a static communication medium to a graphical user interface (GUI) for any application with location-related content [5]. Whether objects, places or persons, everything can be located precisely and therefore be represented in maps, which in turn serve as an intuitive organizational structure and interface for any kind of communication (e.g. rating, commenting) or transaction (e.g. ordering, booking).
In contrast to interactive map use for exploratory spatial data analysis (ESDA), maps in such contexts serve as direct interfaces to real-world entities. In other words, they are "the stage or space" [6: p.2534] for any location-related communication and transaction. In contrast to analog maps, digital maps offer the opportunity to integrate dynamic, interactive elements (e.g. hyperlinks). Enormous functional advancements have been observed in this field within the last twenty years [7]. But with the establishment of the Web 2.0 concept [8], the utilization of maps entered a new era [9]. Techniques like mash-ups have made it possible to use map elements as entry points to a nearly unlimited amount of location-related information [10], available on any device. Due to the rising popularity of spatial information and the ease of use of map APIs and
1 Martin LOIDL, Center for Geoinformatics (Z_GIS), University of Salzburg, Hellbrunnerstraße 34, A-5020 Salzburg.
GI-tools [11], countless portals, websites and mobile applications have incorporated maps as interfaces for their content [12]. In this context, map symbols (mostly point symbols) often represent not only the corresponding entity as such but possess connective abilities, blurring the boundary between the virtual representation of space and real-world settings (cf. Harrison & Tatar's interpretation of place construction in a pervasive environment [13]). Due to their role as an interface for map-facilitated spatial communication and transaction, we accordingly call them transactional map symbols. This term characterizes their function as a window to real-world entities and emphasizes the ability to interact with them directly [14]. Transactional map symbols are applied in several contexts:

- Collection of location-relevant information (e.g. Google Places): any information about a specific place, shop, restaurant etc. is collected and tagged. Most of these platforms are run commercially.
- Location-related collaborative content (e.g. Qype): any kind of information, ratings and comments are user-generated and can be edited by any member of the platform.
- Location- or object-related real-time information (e.g. public transport): real-time information is connected with symbols representing either locations, like bus stops, or moving entities. Platforms communicating the real-time whereabouts of persons enjoy rising popularity (e.g. Foursquare).
- Location-related interaction (e.g. booking portals): opportunities to interact with places/objects are bound directly to map symbols (e.g. a "book now" button in a map overview of hotels). Interaction with localized persons [15] is provided by several platforms (e.g. Trendsmap).
- Augmented location-based information and interaction (e.g. Wikitude): real-world environments are overlaid with location-specific information.
Mobile devices with integrated positioning systems and an active internet connection serve as tools for mobile spatial interaction [16]. All these examples have in common that transactional map symbols do not primarily code for a general type of object (like restaurant, post office etc.) but collect, and connect to, a specific location and its relevant information, or even allow real-world actions to be triggered. In this sense the design of such symbols is not merely a matter of cartography, but of a more general geomedia or geointeraction design [5]. By now, the graphical and functional design of such symbols is primarily driven by default options (like the already famous Google pushpin), although a well-established set of design rules for map symbology exists (e.g. Bertin's work [17] is regarded as fundamental). The urgent question arises for cartographers to which extent they
could contribute to a more effective and user-friendly design of transactional map symbols, which are most often used in a not purely cartographic context. To put it the other way round: it seems worthwhile to discuss existing design guidelines in the light of a broader application of maps [18]. As a first step, this paper aims to unveil the conceptual design framework behind transactional map symbols as actually used in several representative examples. Furthermore, it should serve as groundwork for discussing the potential contribution of cartography to an enhanced user interface design. The relevance of this has been expressed by Bilandzic & Foth [16: p.2606]: "Only if designers manage to create intuitive and easy-to-learn interfaces, mobile spatial interaction (MSI) applications might be adopted and used by the broad mass of users." By now the technical and conceptual standard of platforms and applications is quite high [19], but a cartographic design framework for this specific context is still widely lacking.

5. Unveiling the Design Framework

Since transactional map symbols possess interactive functions and therefore need to be selectable, they are almost exclusively based on point geometries. A major reason for this, besides some technical advantages, is that point symbols require little space on the map (or screen), so more information associated with the displayed area can be communicated. Take for example an urban park with several facilities like an ice-cream parlor, a fountain, benches, a restroom etc. A communal information web portal wants to tag all these entities with transactional map symbols for feedback or maintenance requests. Given that the park is stored as a polygon feature, it would not be advisable to derive a polygonal hotspot or transactional symbol, because entities within the park would be covered and could no longer be targeted distinctly.
The obvious and most practicable solution is to represent the park itself and every facility within it as a point symbol, as shown in Figure 1.

Figure 1: Transactional map symbol "Central Park" as polygon (left) vs. points (right); hotspots are indicated by grey shading/circles and hyperlinks. The usage of points as transactional map symbols prevents hotspots from overlapping.

The functional and graphical design of transactional map symbols strongly depends on the general map purpose. Nevertheless, in a review of various popular portals of different types, five core aspects influencing the design of transactional map symbols were identified as a common denominator. In the following, the impact of these aspects on transactional map symbol design is discussed, and core concepts are collected at the end of each section.
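The selectability argument behind the park example can be illustrated with a toy hit test: with point hotspots, a click resolves to the nearest symbol anchor within a pick radius, so facilities inside the park remain individually reachable instead of being swallowed by a polygonal hotspot. The symbol structure and radius below are hypothetical.

```python
def hit_test(symbols, click, radius=10.0):
    """Return the symbol whose anchor point is nearest to `click`,
    if any lies within `radius` (screen units). Point hotspots never
    fully cover each other the way a large polygon hotspot would."""
    best, best_d2 = None, radius * radius
    for sym in symbols:
        dx, dy = sym["x"] - click[0], sym["y"] - click[1]
        d2 = dx * dx + dy * dy
        if d2 <= best_d2:
            best, best_d2 = sym, d2
    return best
```

A click near the fountain resolves to the fountain even though it sits "inside" the park, because proximity, not containment, decides the hit.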
Intention of the User

Initial (mostly textual) search and filter options allow users to define the thematic and spatial range of their queries, leading to potentially interesting results. Depending on the design of the platform, query results are presented either as textual lists or in combination with maps. Thus the map, in combination with a thematic database (e.g. yellow pages), is used as a catalog in which location is just one of several selection criteria. In contrast, map symbols in analog maps are often packed with as much information as visually possible in order to give the map reader the option of extracting individually relevant information. The explicit focus on predefined content through initial filtering makes the map legend more or less dispensable. Since in most cases only one thematic layer matching the search (like "restaurants") is mapped, unique symbols are generally used. Further graphical differentiation for the sake of communicating additional information is not common, since it can lower readability, and additional information can be communicated in a consecutive step anyway. Despite this, some platforms discriminate among the symbols in order to reflect relevance, rating or popularity, although this is, depending on the general concept of a platform, usually reflected in the selection of results or the ranking in a list. An example of a reasonable differentiation of symbols can be found in the next section. In contrast to most provider-driven platforms, the design of map symbols (selection of shape, size, color etc.) in collaborative mapping projects [20; 21] is entirely up to the user.

- A graphical differentiation of map symbols is not common because of initial filtering or search. However, in some cases differentiations are applied in order to communicate the intention of the provider.
- Most applications do not offer legends, since the commonly used unique symbols show the result of an initial query, and base map symbology (Google Maps, OSM, Bing Maps etc.) is widely known.

Intention of the Provider

The graphical and technical design of transactional map symbols depends highly on the intention and business model of the provider. Ideally, the user is guided to, and attracted by, the most relevant content from the provider's point of view. This is mainly achieved through visual variables [22] such as size, color hue, transparency and effects of any kind. Effects can help navigate the user to the intended target. Mouseover effects, for example, change the appearance of symbols when touched by the mouse pointer and signal that additional information is available for a certain object. The typical application of visual variables in the design of transactional map symbols can be illustrated by the following example. A web platform provides freely available directory searches and visualizes the search results in an overview map. Any information related to the resulting entries is address-based and consequently connected to the map symbol. The service is financed by commercial ads, which in turn are higher rated and
displayed more prominently than conventional entries. In this sense, visual prominence is auctioned. In addition to a pure selection mechanism, commercial entries can be highlighted by more extraordinary or more noticeable map symbols. Variations of shape, size and color hue are commonly used for this purpose. Analogous business models can be found behind real estate platforms, yellow pages or city guides.

- Visual differentiation of symbols is used in some cases for guidance and attraction.
- Effects, like mouseover, are utilized for effective guidance of users and to enhance attractiveness. Quite often an excessive application leads to unfavorable visual noise.

Resolution & Display Dimension

From a technical point of view, the design of transactional symbols basically depends on the display resolution and dimensions of the map medium. In the case of embedded maps, the window or object size is decisive. Low resolution or small displays require simple and unambiguous map symbols. As the functionality of transactional symbols is often used in mobile apps, complex symbols with multiple information layers are generally put aside. Since the attractiveness and usability of map applications are of major interest, it is crucial to keep every single symbol selectable with any device (e.g. mouse, touchscreen). If too many locations or objects meet the initial search criterion, a selection algorithm has to be applied; this is dealt with in the next section. In most cases the basemap exhibits a comparably high degree of generalization, due to potentially small displays and/or low resolution. A clear cartographic hierarchy can be observed between the basemap and the actual content represented by transactional map symbols.

- Large and concise transactional map symbols are utilized on comparably generalized basemaps.
- Ideally, symbols are individually selectable.

Scale & Zoom Level

In most cases the size of transactional map symbols is not proportional to the scale or zoom level but remains as initially fixed. To prevent the map image from being visually overloaded, different selection algorithms can be applied. Google Maps, for example, maps a constant number of POIs, whereas Bing Maps focuses on POIs located in the central area of the map (these different strategies can easily be observed when searching for any business in either service). In the case of small-scale representations, sometimes more entries exist in the database than can be meaningfully mapped. This problem increases with low resolution or small display dimensions. Depending on the purpose of the map and on the
provider's intention, different approaches are observable (cf. [23] for conceptual details):

a) Selection of a limited number of symbols. The selection can be regulated by ratings, relevance, popularity, spatial distribution or commercial interests (purchase of ads).
b) A symbol represents several symbols at a higher zoom level. This can be communicated by annotations or effects.
c) All database entries are mapped despite overlaps, in order to communicate the density of relevant locations, objects or incidents. In this case the principle of individually selectable symbols is violated, but additional information comparable to dot density maps is communicated. Most often this method is applied following commercial or political intentions (e.g. crime mapping).

The extensive dissociation of symbol size and scale/zoom level indicates the discussed shift from a pure cartographic paradigm to a more general framework of geocommunication, dealing with specific, user-requested information. Consequently the map cannot be used for questions like "How do I get from A to B?", but answers specific, user-generated requests like "Where are the best-rated restaurants in town?".

- Symbols have a fixed size and are not adapted to scale or zoom level.
- Depending on the scale/zoom level, selections or alternative graphical methods are applied in order to prevent the map from being visually overloaded.

Perception & Dwell Time

With regard to symbol quality (graphical and functional design) and quantity (number of POIs), two important factors have to be accounted for: a) visual perception is limited for physiological and cognitive reasons; b) the mean dwell time on websites is relatively low. To attract the user's attention to transactional map symbols and the connected information, it is fundamentally necessary to keep the visual communication simple and concise.
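Approach (b) above — one symbol standing in for several symbols at a coarser zoom level — is commonly realized with grid-based clustering. A minimal sketch, where the cell size is an assumed tuning parameter tied to the zoom level:

```python
from collections import defaultdict

def cluster_symbols(points, cell_size):
    """Group point symbols into grid cells; at a coarse zoom level each
    non-empty cell is drawn as a single aggregate symbol with a count."""
    cells = defaultdict(list)
    for x, y in points:
        cells[(int(x // cell_size), int(y // cell_size))].append((x, y))
    clusters = []
    for members in cells.values():
        cx = sum(p[0] for p in members) / len(members)
        cy = sum(p[1] for p in members) / len(members)
        clusters.append({"x": cx, "y": cy, "count": len(members)})
    return clusters
```

Shrinking the cell size as the user zooms in lets the aggregate symbols dissolve back into individually selectable points, which preserves the selectability principle discussed earlier.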
Mapping too many objects at once leads to capacity overload in short-term memory [24] and a decreasing attractiveness of the resulting map [25]. Virtually no study results are available on the mean dwell time on websites with maps, or on the extent to which maps influence dwell time. For news portals, a mean dwell time of approximately three minutes per website and session is reported [26]. Given that the portals mentioned in this paper have comparable dwell times, the importance of straightforward visual communication is obvious. Most providers of map-based portals tackle this challenge with user-driven preselection and filtering and with a simple and concise design of transactional map symbols. Putting together limited perceptual abilities and a relatively short dwell time, it can be deduced that:
- Selections are applied in order to focus the user on the most relevant content. Consequently the provider has to understand what the user is interested in; the other way round, the users must define what they are looking for.
- Map symbols are kept simple. In analog maps, symbols often carry several inherent information layers in order to transport a maximum of content. For transactional map symbols this approach does not seem to be conducive.

6. Conclusion

The shift from classical cartography to geomedia design [5] is widely recognized and accepted [27]. However, many of the examples cited in this paper already foreshadow the next step towards geointeraction design for an augmented space. Driven by enormous technical dynamics and the general availability of GI-tools and spatial data, new mapping techniques and map applications arise. This development fundamentally affects map symbols. Instead of representing specific objects in coded form, they serve as entry points for interaction with a vast number of specific location-based entities. The map itself becomes a Geographical User Interface (GeoUI) for all applications that will prospectively pervade the environment. In short, it can be said that the context of web cartography, as well as the graphical and functional design of map symbols, has changed due to the establishment of the Web 2.0 concept [9]. Although transactional map symbols might contribute to better functionality, traditional cartographic design guidelines are as yet virtually not considered in the process of symbol design. On the other hand, cartographic design guidelines will have to compete with the new transactional use of mass-market map applications (above all those based on Google Maps) that are experiencing an enormous rise in popularity.
The summary of the major concepts behind the graphical and functional design of transactional map symbols presented in this paper should serve as input to a discussion about the extent to which cartographic concepts could improve point symbol design in a highly dynamic and interactive map environment. On the other hand, it can stimulate a discussion about the role of cartography in a growing and very successful broader geointeraction design context.

Acknowledgement

This paper has been written in the context of authoring a digital textbook on cartography for UNIGIS, an international network of universities dedicated to GI education at a postgraduate level. Many thanks go to the director, Prof. Dr. Strobl (University of Salzburg, Center for Geoinformatics), for his support. The useful hints and critical questions by the anonymous reviewers are gratefully acknowledged.

References

[1] T. Blaschke and J. Strobl, Geographic Information Science Developments, GIS.Science 23 (2010).
[2] T. Kindberg, M. Chalmers, and E. Paulos, Guest Editors' Introduction: Urban Computing, Pervasive Computing, IEEE 6 (2007).
[3] H. Hagras, Embedding Computational Intelligence in Pervasive Spaces, Pervasive Computing, IEEE 6 (2007).
[4] P. Froehlich, L. Baillie, and R. Simon, FEATURE: Realizing the vision of mobile spatial interaction, interactions 15 (2008).
[5] L. Brodersen, Geocommunication and Information Design, Forlaget Tankegang, Frederikshavn.
[6] A. MacEachren, I. Brewer, and E. Steiner, Geovisualization to mediate collaborative work: tools to support different-place knowledge construction and decision-making, in: Proceedings, 20th International Cartographic Conference, ICA, Beijing, China, August 6-10, 2001.
[7] D. DiBiase, A.M. MacEachren, J.B. Krygier, and C. Reeves, Animation and the Role of Map Design in Scientific Visualization, Cartography and Geographic Information Science 19 (1992).
[8] T. O'Reilly, What is Web 2.0: Design Patterns and Business Models for the Next Generation of Software, Journal of Digital Economics 65 (2007).
[9] A. Turner, Introduction to Neogeography, O'Reilly Media, Sebastopol, CA.
[10] R. Edsall, Map Interactivity, in: International Encyclopedia of Human Geography, K. Rob and T. Nigel, eds., Elsevier, Oxford, 2009.
[11] M. Haklay, A. Singleton, and C. Parker, Web Mapping 2.0: The Neogeography of the GeoWeb, Geography Compass 2 (2008).
[12] ESRI, The GeoWeb: Spatially Enabling the Next-Generation Web, ESRI White Paper, Environmental Systems Research Institute, Redlands, 2006, p. 10.
[13] S. Harrison and D. Tatar, Places: People, Events, Loci - the Relation of Semantic Frames in the Construction of Place, Computer Supported Cooperative Work 17 (2008).
[14] M. Foth, H.G. Klaebe, and G.N. Hearn, The Role of New Media and Digital Narratives in Urban Planning and Community Development, Body, Space & Technology 7 (2008).
[15] K. Field and J. O'Brien, Cartoblography: Experiments in Using and Organising the Spatial Context of Micro-blogging, Transactions in GIS 14 (2010).
[16] M. Bilandzic and M. Foth, Mobile Spatial Interaction and Mediated Social Navigation, in: Encyclopedia of Information Science and Technology (2nd ed.), M. Khosrow-Pour, ed., IGI Global, Hershey, PA, 2009.
[17] J. Bertin, Semiology of graphics, University of Wisconsin Press, Madison.
[18] H. Faby, Adaption kartographischer Kommunikationsmodelle im Kontext der Neo-Cartography, in: AGIT, J. Strobl, T. Blaschke, and G. Griesebner, eds., Wichmann Verlag, Salzburg, 2009.
[19] M. Foth, V.M. Gonzalez, and K.L. Kraemer, Design Considerations for Community Portals in Master-Planned Developments in Australia and Mexico, in: OZCHI: Australasian Computer-Human Interaction Conference, F. Vetere, G. Connor, and C. Satchell, eds., Association for Computing Machinery, 2008.
[20] T.A. Slocum, C. Blok, B. Jiang, A. Koussoulakou, D.R. Montello, S. Fuhrmann, and N.R. Hedley, Cognitive and Usability Issues in Geovisualization, Cartography and Geographic Information Science 28 (2001).
[21] R. Kingston, Public Participation in Local Policy Decision-making: The Role of Web-based Mapping, The Cartographic Journal 44 (2007).
[22] T.A. Slocum, R.B. McMaster, F.C. Kessler, and H.H. Howard, Thematic Cartography and Geovisualization, Pearson Prentice Hall, Upper Saddle River, NJ.
[23] A. Edwardes, D. Burghardt, and R. Weibel, Portrayal and Generalisation of Point Maps for Mobile Information Services, in: Map-based Mobile Services - Theories, Methods and Implementations, L. Meng, A. Zipf, and T. Reichenbacher, eds., Springer, Berlin, Heidelberg, New York, 2005.
[24] A.M. MacEachren, How Maps Work - Representation, Visualization and Design, The Guilford Press, New York, London.
[25] S. Weinschenk, 100 Things You Should Know about People: #3 - You Can Only Remember 3 to 4 Things At A Time (The Magic Number 3 or 4), in: What Makes Them Click, S. Weinschenk, ed.
[26] Journalism.org, Nielsen Analysis, in: The State of the News Media - An Annual Report on American Journalism, P.F.E.I. Journalism and P.I.A.L. Project, eds.
[27] F. Hruby and R.M. Guerrero, Kartographie im Spannungsfeld expliziter und impliziter Forschung, Journal for Theoretical Cartography, meta-carto-semiotics (2008).
Geographical information systems for research on the biological resources of the World Ocean under climate fluctuation conditions

Pavel P. CHERNYSHKOV a; Stanislav G. GLUSHCHENKO b
Immanuel Kant Baltic Federal University
a Oceanographer, Doctor of Science
b Postgraduate student

Abstract. The results of applying GIS technologies in research on changes in the biological resources of the World Ocean under climate fluctuation conditions are presented. Databases were created using ArcGIS 9.1 for three regions: the Canary upwelling, the Scotia Sea and the South Pacific. The analysis yielded parameters of the seasonal and interannual fluctuations of oceanological conditions, as well as biomass fluctuations and the distribution of fishing objects.

Keywords. Biological resources of the World Ocean, seasonal and interannual fluctuations, Canary upwelling, Scotia Sea (Antarctica), South Pacific, geoinformation technologies, climatic changes

Introduction

A variety of remotely sensed, surveyed, statistical and species life-history data can be integrated through extensive GIS analysis, resulting in the seasonal mapping of species population dynamics. This dynamic GIS mapping provides valuable information for fisheries managers, who continuously require background information for developing management scenarios [1]. To date, a large amount of data on fishery statistics, biological parameters of populations, interannual variability of biomass and the distribution of bioresources has been collected, along with data characterizing the environmental conditions in the Canary upwelling region.
Geographic Information System technology is the most appropriate instrument for the maintenance and analysis of such a database [2]. With respect to monitoring ocean and fisheries dynamics and generating new information-based management schemes, GIS technology is closely related to several other technologies, such as Global Positioning Systems, remote sensing, modeling, image processing, spatial statistics, and the Internet (Figure 1) [1].
Figure 1. A diagram showing the variety of technological disciplines and issues that are highly associated with the core of marine GIS technology [1]

1. Data and methods

The biological resources of the ocean undergo significant changes under the influence of their habitat (oceanological conditions). This appears especially strongly in the Canary upwelling region (small pelagic fishes), the Scotia Sea (Antarctic krill, Euphausia superba), and the South Pacific (jack mackerel). Therefore, databases were created for these regions first. The databases were built in the ArcGIS environment, and research on the influence of oceanological conditions on the state of the biological resources was performed [3-5]. Multidimensional statistical methods were used for the analysis: decomposition of fields into principal components and cluster analysis. The scheme of functioning of the created system is represented in Figure 2.
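The two multivariate methods named above, principal-component decomposition of a field and cluster analysis, can be sketched in a few lines. This is an illustrative toy example, not code from the study; the synthetic SST array, its dimensions, and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "SST field": 60 monthly snapshots at 12 grid points,
# arranged as (time, space) for the decomposition.
field = rng.normal(size=(60, 12))
anom = field - field.mean(axis=0)          # remove the long-term mean

# Principal components via SVD of the anomaly matrix:
# rows of vt are spatial modes (EOFs), u*s are the PC time series.
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)            # variance fraction per mode
pcs = u * s                                # principal-component time series

# Simple agglomerative clustering of grid points by their anomaly
# time series (average linkage, Euclidean distance, 2 clusters).
def average_linkage(points, k):
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.mean([np.linalg.norm(points[i] - points[j])
                             for i in clusters[a] for j in clusters[b]])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)
    return clusters

clusters = average_linkage(list(anom.T), 2)
print("variance in first 3 modes:", round(float(explained[:3].sum()), 3))
```

The same pattern scales to real SST grids; only the distance computation and the clustering routine would normally come from an optimized library.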
Figure 2. Using ArcGIS 9.1 for research on biological resources

2. Results and Discussion

2.1. Canary Upwelling

One of the main oceanographic features of the waters off the Northwest African coast is the boundary between North Atlantic Central Water (NACW) and South Atlantic Central Water (SACW), which occurs near Cap Blanc. The location of this boundary can be well defined on the basis of an oceanographic survey. Multivariate statistical analysis of temperature and salinity data obtained in eight cruises in the Canary Upwelling region was applied to reveal the spatial variation of the boundary in the upper layer. The main reason for the boundary shifts is a dynamic factor, namely the intensity of the northern branch of the Equatorial Counter Current (NECC) and of the Canary Current (CC). A number of comprehensive surveys carried out off the Northwest African coast demonstrated that interannual variability in recruitment abundance and distribution might be partly explained by environmental factors. The main factors are mesoscale physical processes (e.g. coastal upwelling, eddies and hydrographic fronts). Remote sensing data (e.g. satellite altimetry) are a good addition to the in situ measurements and allow revealing peculiarities of the environmental conditions. The analysis revealed that in some cases the interannual variability of the recruitment abundance index might be caused by the cohort's abundance rate, while in other cases the index was mostly influenced by survey conditions. Thus, it was quantitatively demonstrated that the environmental share might prevail among all the other factors [3,6].
2.2. Scotia Sea

In this work the variability of dynamic processes in the atmosphere and ocean (atmospheric transport, geostrophic circulation, water mass structure, krill drift) is considered in the western part of the Atlantic sector of the Antarctic in relation to pelagic ecosystem parameters and the distribution of krill, the major element of this ecosystem and a traditional fishing object. The available data, including oceanographic and meteorological observations, data of acoustic and trawl surveys, and catch statistics, were used as initial material. In presenting the research results, the authors pay special attention to the historical and current areas of krill fishery in the Scotia Sea: the South Orkneys Subarea (Subarea 48.2), the South Georgia Subarea (Subarea 48.3), and the Antarctic Peninsula Subarea (Subarea 48.1). It is demonstrated that the variability of oceanographic conditions in the Scotia Sea fishing areas is determined by large-scale atmospheric processes occurring within the latitude band, the dynamic regime of the central and southern branches of the ACC, and the intensity of the western periphery of the Weddell Circulation. Interannual fluctuations of climatic conditions in the studied areas are characterized by long-term trends years in duration. Synchronous fluctuations have been revealed in the sea-level atmospheric pressure anomaly fields and in the SST anomalies in the fishing grounds of the Antarctic sector of the Atlantic Ocean. These variations, interpreted as a quasi-four-year Antarctic circumpolar wave, are typical of all the areas researched. An assessment of the characteristics of geostrophic krill transport in the Scotia Sea Subareas was carried out, and the impact of transport on krill distribution and on the operational indices of commercial vessels in the fishing grounds was demonstrated.
A typification of the horizontal circulation of the water masses and the associated krill distribution, based on long-term scientific observations and fishery data, was provided; in addition, biomass estimates and their commercial importance, and the factors of krill drift in different modifications of the Antarctic water masses, were demonstrated [4].

2.3. South Pacific

Cycles of 2.8, 3.6 and 5.2 years are clearly distinguished in the interannual trend of the first and second principal components (PC) of sea surface temperature. Classification of the area with cluster analysis divides it into 5 classes. Class 1 surrounds the STO area from the South American coast to the northwest, west and southwest, up to 50° S. Class 2 is located in the areas of the Peruvian Current and the Southern Trade-Wind Current. Class 3 is situated closer to the central part of the subtropical circulation (30-40° S). Classes 4 and 5 cover most of the STO south of 40° S; the boundary between them lies near 130° W. In the interannual trend of the mean-class SST anomalies, cycles similar to the PCs are observed. The ocean level anomalies provide information on the current field: they make it possible to calculate geostrophic velocity anomalies, the absolute velocity of the geostrophic currents, and the kinetic eddy energy (KEE), which is maximal in jet flows with high velocities of 25 to 50 cm/s and above. Currents at the boundaries of the STO subtropical circulation have velocities of 1 to 20 cm/s on average. On the map of the long-term average eddy energy, values do not exceed 25 cm²/s² [5,7].
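The step from ocean level anomalies to geostrophic velocity anomalies uses geostrophic balance: u' = -(g/f) ∂η'/∂y and v' = (g/f) ∂η'/∂x, where η' is the sea level anomaly and f the Coriolis parameter. A minimal sketch follows; the synthetic anomaly field, the grid spacing, and the reference latitude are illustrative assumptions, not data from the paper.

```python
import numpy as np

g = 9.81                      # gravitational acceleration, m/s^2
lat = -40.0                   # reference latitude (South Pacific), degrees
f = 2 * 7.2921e-5 * np.sin(np.radians(lat))   # Coriolis parameter, 1/s

# Synthetic Gaussian sea level anomaly eta (meters) on a 50 km grid.
dx = dy = 50e3
y, x = np.mgrid[0:20, 0:20] * dx
eta = 0.1 * np.exp(-((x - 5e5)**2 + (y - 5e5)**2) / (2 * (2e5)**2))

# Geostrophic velocity anomalies from the SLA gradients.
deta_dy, deta_dx = np.gradient(eta, dy, dx)
u = -(g / f) * deta_dy        # zonal anomaly, m/s
v = (g / f) * deta_dx         # meridional anomaly, m/s

speed = np.hypot(u, v)
print(f"max speed: {100 * speed.max():.1f} cm/s")
```

With altimetry-derived η' on a real grid, the same two `np.gradient` calls yield the anomaly fields from which KEE maps like those described above are computed.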
Analysis of the principal components and a classification (Ward's method) of the atmospheric pressure and sea surface temperature fields made it possible to reveal areas of the South Pacific Ocean that differ considerably in their seasonal and interannual variability patterns. Analysis of ocean level anomalies on the basis of satellite altimetry data allowed zones of intensive eddy formation, associated with the main frontal zones of the upper ocean layers, to be determined.

3. Conclusion

The presented results are preliminary, but a 3D temporal model of the frontal zone dividing intermediate waters of Antarctic and Arctic origin was created for the Canary upwelling region. Antarctic waters contain considerably more biogenic elements, so the model makes it possible to estimate tendencies in biological productivity changes in the Canary upwelling region, which in turn assists fishery planning under climate fluctuation conditions.

References

[1] Valavanis, Vasilis D. Geographic information systems in oceanography and fisheries. Taylor & Francis, 11 New Fetter Lane, London EC4P 4EE, 209 p. [2] Fishery and oceanological researches in the Atlantic Ocean and the South Pacific / Edited by V.N. Yakovlev. Kaliningrad, p. [3] Chernyshkov P., Bukatin P., Timoshenko N. Seasonal and annual distribution of small pelagic fish stocks related to environment conditions in the Canary Upwelling area from 1994 to 2007. Abstract, 3rd GLOBEC Open Science Meeting, Victoria, Canada. [4] Chernyshkov P., Polishchuk I., Shnar V., Kasatkina S., Litvinov F. Inter-annual and seasonal variability of pelagic ecosystem and krill resources parameters in the Scotia Sea in relation to oceanographic conditions. Abstract, 3rd GLOBEC Open Science Meeting, Victoria, Canada. [5] Chernyshkov P., Timokhin E. Variability of oceanological conditions in the Southern Pacific and structure of the pelagic ecosystem. Abstract, 3rd GLOBEC Open Science Meeting, Victoria, Canada.
[6] Ostrowski, M., Gleza, I., Hilmi, K. et al. Evolution of oceanographic conditions in the permanent upwelling region off northwest Africa, with consequences for the distribution of pelagic stocks. Symposium Eastern Boundary Upwelling Ecosystems: integrative and comparative approaches, 2-6 June 2008, Las Palmas, Gran Canaria, Spain. [7] Koshlyakov M.N., Tarakanov R.Yu. Intermediate waters of the South Pacific. Oceanology, Volume 45, 4.
Christian KUKA; Susanne BOLL

Individual Geographic Stream Processing for Driver Assistance

Christian KUKA a; Susanne BOLL b
a OFFIS - Institute of Information Technology, Oldenburg, Germany
b Carl von Ossietzky University, Oldenburg, Germany

Abstract. Today, more and more sensors are available that measure obstacles on the street. However, these measurements of events are not available while driving a car. Sensor information, and especially predictions and aggregations of these events, could be used to assist driving behavior, e.g., to inform the driver about possible problems on the current street or in a nearby street. In this paper, we introduce a way to allow the car to perform complex event processing on predicted sensor data. This event processing allows the driver to adapt her driving behavior accordingly.

Keywords. Data Stream Processing, Geographic Aggregation, Driver Assistance

1. Motivation

Today a wide range of information sources exists that could support driving behavior at the current or even a future position. These sources include GPS data from transport vehicles, current weather data, measurements from induction loops, and information from the CAN bus of different car vendors. Even microblogs and social networks provide traffic information to everyone; for example, about 0.05% of Twitter messages include traffic information [5]. Traffic systems today do provide information on traffic problems, but they report only a few of them, and we as drivers combine this information into a scenario that will probably occur in the future. What is needed is a way to process driver-related data in a fast and flexible way, to predict upcoming events, and to allow complex event processing on the individually predicted events to support drivers.

2.
Related Work

The goal of this PhD thesis is the complex event processing (CEP) of predicted geographic data from heterogeneous sensor sources in cars. A prerequisite for processing multiple data streams from heterogeneous sources in cars is communication between the car or the driver and other cars and the environment. In the field of Vehicular Ad-Hoc Networks (VANET), multiple communication standards and routing strategies have been proposed [15]. These communication protocols are used in several projects [7,6] to provide location-aware services, traffic monitoring, and collision detection. However, these works only cover the communication of data between the cars and the environment, and can only warn or inform drivers when an
event has already happened. Prediction, and especially individual geographic event processing, is not yet possible. Current work on the prediction and processing of moving objects is done using Moving Object Databases (MOD) [11]. MODs store object information persistently in a database [8] and use index structures optimized for select and update queries. However, in driver scenarios a moving object is only a matter of concern for a short period of time, resulting in continuous insert and delete queries at a high frequency [4]. An approach for flexible processing of data streams is the Data Stream Management System (DSMS). A DSMS allows the processing of continuously incoming data through a temporally extended relational algebra [3]. DSMS systems like StreamSpinner [13] and PLACE [10] also tackle the processing of geographic data streams. However, these systems do not provide prediction of geographic data, and complex event processing on the geographic data is not supported. To perform processing, and especially detection of multiple events based on predefined patterns, different approaches such as rule engines and complex event processing engines exist. CEP languages like SASE [12] allow the definition of complex events through multiple operators like loops and sequences. However, both approaches only allow the detection of events on the current and previous events and do not support complex event processing on predicted events and their different probabilities. The prediction of events in data streams is targeted by several research groups, especially from the field of data mining and machine learning, using a concept model [14]. Other approaches use prediction functions either embedded into the data stream [9] or predefined in the query [2]. These works only cover the prediction itself and not the further processing of the predicted events and their probability.

3.
Complex event processing on predicted geographic events

Current CEP engines use a non-deterministic finite automaton (NFA) to represent a pattern query [1]. This NFA is defined as A = (Q, E, Θ, q1, F) and consists of a set of states Q, a set of directed edges E, a set of formulas Θ labeling those edges, a start state q1, and a final state F. A changeover from one state to a successor state connected by an edge only happens when a given input event satisfies the formula on that edge. A changeover into multiple states can also happen in the case of formulas that are not mutually exclusive. The naive approach to performing complex event processing on predicted events would be the continuous generation of events through a prediction function. This approach would lead to a high overhead for the CEP engine. Furthermore, the probability of predicted events is not handled by current CEP engines. A more efficient approach is the transformation of the prediction function into inequalities with time as a variable, as shown in [2]. These inequalities, in combination with the condition for a changeover of state, only have to be checked at a given time and not continuously. With ongoing time and new prediction probabilities, states in the NFA can become invalid. This happens especially when the probability of a predecessor state has dropped to zero or below the probability required by the formulas on the edges; in this case all successor states become invalid too. A further aspect to keep in mind is the definition of the pattern using a query language. Current query languages for defining complex event processing on data streams only target current and previous events. To allow a query formulation
on previous, current, and future events, different query languages will be analyzed and their grammar will be extended to define event processing on predicted events.

4. Conclusion

The presented concept will support the driver of a car in daily use. This will be achieved through individual event processing of predicted geographic sensor data. Targeted research topics include the event processing of predicted geographic data with their different probabilities and the pattern definition using a query language.

References

[1] Jagrati Agrawal, Yanlei Diao, Daniel Gyllstrom, and Neil Immerman. Efficient pattern matching over event streams. Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data - SIGMOD 08, page 147. [2] A. Bolles, Marco Grawunder, Jonas Jacobi, Daniela Nicklas, and H. Appelrath. Prediction Functions in Bi-temporal Datastreams. In Database and Expert Systems Applications, Springer. [3] Michael Cammert, Christoph Heinz, J. Kraemer, and B. Seeger. Anfrageverarbeitung auf Datenströmen. Philipps-Universitaet Marburg (BWT), pages 13-24. [4] Michael Cammert, Christoph Heinz, Jürgen Krämer, and Bernhard Seeger. Datenströme im Kontext des Verkehrsmanagements. In Mobilität und Informationssysteme. [5] Sara Carvalho, L. Sarmento, and R.J.F. Rossetti. Real-Time Sensing of Traffic Information in Twitter Messages. paginas.fe.up.pt. [6] S. Dashtinezhad, T. Nadeem, B. Dorohonceanu, C. Borcea, P. Kang, and L. Iftode. TrafficView: a driver assistant device for traffic monitoring based on car-to-car communication. IEEE 59th Vehicular Technology Conference, VTC 2004-Spring (IEEE Cat. No.04CH37514). [7] Marios Dikaiakos, Andreas Florides, Tamer Nadeem, and Liviu Iftode. Location-Aware Services over Vehicular Ad-Hoc Networks using Car-to-Car Communication. IEEE Journal on Selected Areas in Communications, 25(8), October. [8] Kejia He and Liangxu Liu.
Efficiently Indexing Moving Objects on Road Network. International Conference on Computational Intelligence and Software Engineering, pages 1-4, December. [9] S. Ilarri, O. Wolfson, E. Mena, A. Illarramendi, and N. Rishe. Processing of Data Streams with Prediction Functions. Proceedings of the 39th Annual Hawaii International Conference on System Sciences (HICSS 06), 00(C):237a. [10] M.F. Mokbel, Xiaopeng Xiong, M.A. Hammad, and W.G. Aref. Continuous query processing of spatio-temporal data streams in PLACE. Geoinformatica, 9(4). [11] O. Wolfson, B. Xu, S. Chamberlain, and L. Jiang. Moving objects databases: issues and solutions. Proceedings, Tenth International Conference on Scientific and Statistical Database Management (Cat. No.98TB100243). [12] Eugene Wu, Yanlei Diao, and Shariq Rizvi. High-performance complex event processing over streams. Proceedings of the 2006 ACM SIGMOD International Conference on Management of Data - SIGMOD 06, page 407. [13] S. Yamada, Y. Watanabe, H. Kitagawa, and T. Amagasa. Location-Based Information Delivery Using Stream Processing Engine. 7th International Conference on Mobile Data Management (MDM 06). [14] Ying Yang, Xindong Wu, and Xingquan Zhu. Combining proactive and reactive predictions for data streams. Proceedings of the Eleventh ACM SIGKDD International Conference on Knowledge Discovery in Data Mining - KDD 05, page 710. [15] Sherali Zeadally, Ray Hunt, Yuh-Shyan Chen, Angela Irwin, and Aamir Hassan. Vehicular ad hoc networks (VANETS): status, results, and challenges. Telecommunication Systems, December 2010.
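The NFA formulation in Section 3 can be illustrated with a minimal pattern matcher that also carries the probability of predicted events, the aspect the thesis argues current engines lack. Everything below (the state names, the probability threshold, the event tuples) is an illustrative assumption, not code from the described system.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str        # e.g. "rain", "congestion"
    prob: float      # 1.0 for observed events, < 1.0 for predicted ones

# NFA A = (Q, E, Theta, q1, F): states, edges labeled with formulas,
# a start state q1 and a final state F. Formulas are event predicates.
edges = {
    "q1": [("q2", lambda e: e.kind == "rain")],
    "q2": [("F",  lambda e: e.kind == "congestion")],
}

def run(events, threshold=0.5):
    """Advance the NFA over a stream; a predicted event only triggers a
    state changeover if its probability stays above the threshold."""
    active = {"q1"}
    for e in events:
        nxt = set(active)                # existing runs stay active
        for state in active:
            for target, formula in edges.get(state, []):
                if formula(e) and e.prob >= threshold:
                    nxt.add(target)
        active = nxt
        if "F" in active:
            return True                  # complex event detected
    return False

# Observed rain followed by a *predicted* congestion (probability 0.8).
print(run([Event("rain", 1.0), Event("congestion", 0.8)]))   # prints True
print(run([Event("rain", 1.0), Event("congestion", 0.3)]))   # prints False
```

Invalidation of successor states, as described in the paper, would correspond to removing a state from `active` once the probability backing its predecessor drops below the edge formula's requirement.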
Peter WOLFF et al.

Taking the Cure Virtually with a WebGIS

Peter WOLFF a; Matthias BINDING b; Viviane WOLFF c
a Königsweg Geosolutions UG (i.Gr.), Fulda
b Königsweg Geosolutions UG (i.Gr.), Bad Schwalbach
c Hochschule Fulda, FB ET, Fulda

Abstract. In the national and international competition for spa and bathing guests and for self-paying wellness visitors, traditional spa towns must sharpen their unique selling points and take more contemporary approaches to providing information. The Hessian spa and district town of Bad Schwalbach aims to convince potential visitors of the merits of the town with a WebGIS and thematic routes integrated into it.

Keywords. WebGIS, geographic information systems, Bad Schwalbach, spa town, wellness, history

1. Motivation

The Hessian district town of Bad Schwalbach, near Wiesbaden, looks back on more than 400 years of tradition as a spa and bathing resort. These years were eventful. Initially a village in which health-promoting springs and moors were discovered by chance, it soon became a spa town in the heart of the Taunus that attracted crowned and wealthy heads. After the Second World War it was still a social spa paid for by the public health insurers, but this era too disappeared with the inflationary succession of health-care reforms and ever greater cuts and restrictions. For years the state spas were therefore subsidized by the State of Hesse, and the baths were privatized. The Staatsbad Bad Schwalbach likewise passed into the ownership of the town a few years ago, together with a pledge of several million euros in funding from the State of Hesse. Now that this funding has expired, the spa competes directly with many baths in its immediate region (Bad Ems, Schlangenbad, Bad Soden, Wiesbaden, Bad Kreuznach), but also with large international competitors and the wellness market that has been booming for years. One instrument among many intended to sharpen public awareness of this idyllic town in the Taunus is a WebGIS offering routes around and within the town that are of interest to tourists and spa guests.

2. A WebGIS for the Spa

Three walking tours were implemented in the pilot version: the famous heads [1] who once lived or took the cure in the town, the numerous fountains [3] in the town area, and important or historical buildings (the so-called POIs, points of interest). The tours are presented in a separate map view, with the popup function and the POI pages invoked via WFS. For this purpose a JavaScript program is used that is shipped with the GeoServer package and was adapted for the tours described here.

Figure 1: Start page of SchwalbIS, the WebGIS of the town of Bad Schwalbach

To avoid future dependence on proprietary vendors and, moreover, to build on common standards, the WebGIS was implemented entirely with open-source software (see Fig. 2). GeoServer serves as the map server and Apache Tomcat as the web server. The geodata are held either as shapefiles and coverages in GeoServer or in the PostgreSQL/PostGIS database; the latter is currently still optional, because the town and its provider use a different data store for the information on its regular website. Via WMS (Web Map Service), further freely available information from external databases can be integrated. Map rendering is done with OpenLayers. The WebGIS is reachable via the website of the town of Bad Schwalbach (subdomain). The start page consists of the map view of the spa town with its districts and the functions navigation, zooming, street search and distance measurement, as well as a separate link to each of the three tours. In addition, the map legend, help and imprint can be displayed (see Fig. 1).

Figure 2: System architecture of the WebGIS 'SchwalbIS' (graphic modified after [2])

3. Conclusion

Since SchwalbIS only went online in spring 2011, user feedback is not yet available. Conceptually the WebGIS is extensible, for example with video sequences showing landscape or street views, or with historical content that could be integrated directly from the local museum. A dedicated version for smartphones would also be an option. As an instrument for informing potential visitors online in advance about the town's merits, unique features and historical background, however, it already has an advantage over the classical advertising of spa towns.

References

[1] WOLFF, P. (2011): Bad Schwalbacher Berühmtheiten. Namen die Geschichte schrieben (i.V.): Eckhard-Humbert-Verlag, Bodenheim
[2] ADEN, Chr.; KLEPPIN, L.; SCHMIDT, G.; SCHRÖDER, W. (2009): Zusammenführung, Visualisierung und Analyse von waldzustandsrelevanten Daten im WebGIS WaldIS, in: Strobl/Blaschke/Griesebner (eds.): Angewandte Geoinformatik 2009, Wichmann-Verlag
[3] STAATSBAD Bad Schwalbach (ed.) (2010): Der Brunnenweg in Bad Schwalbach
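The OGC WMS interface used in the SchwalbIS stack is a plain HTTP GET protocol, so a map request to a GeoServer instance reduces to URL construction. A minimal sketch follows; the endpoint, workspace, layer name and bounding box are invented placeholders, not the real SchwalbIS configuration.

```python
from urllib.parse import urlencode

def wms_getmap_url(base, layer, bbox, size=(640, 480), srs="EPSG:4326"):
    """Build an OGC WMS 1.1.1 GetMap request URL."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": srs,
        "BBOX": ",".join(str(c) for c in bbox),   # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical endpoint and fountain-tour layer, for illustration only.
url = wms_getmap_url(
    "https://example.org/geoserver/wms",
    "schwalbis:brunnen",
    (8.05, 50.13, 8.08, 50.15),
)
print(url)
```

In the portal itself, OpenLayers issues equivalent requests automatically; the point of the sketch is that any WMS-capable client can consume the same layers.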
Robert KULAWIK

Flex-I-Geo-Web: a service-based software toolkit for site development

Robert KULAWIK
AG GIS, Universität Bonn, Meckenheimer Allee 166, Bonn

Keywords. Open source, web processing, GIS, web service, resource management, urban development

1. Motivation

Entrepreneurial, private and governmental (investment) decisions that result in soil sealing on previously natural or near-natural land are frequently based on incomplete information about possible alternatives in areas that are already developed, or about derelict sites that could be converted. It often appears more attractive to build commercial areas or housing on the proverbial greenfield, because planning is freer there than in already developed and densified areas. One cause of new development outside existing settlements is frequently the difficulty of obtaining the information and framework conditions needed for the use of inner-city brownfields and gaps between buildings. Such planning projects require not only basic geodata but above all thematic geodata. These range from socio-economic population data, traffic data, and information on the connection and capacity of inner-city supply and disposal networks, to information about the surrounding facades, needed to fit new buildings harmoniously into the townscape or to set impulse-giving accents with them. Project developers, builders and architects thus face the choice of engaging in a potentially lengthy process of information gathering and evaluation, or of planning on the greenfield from the outset. Against this background the scales often tip towards new development, leading to growing land consumption with manifold economic and ecological consequences (cf. Schiller et al. 2009). A contribution to solving this problem could be made by a web-based information and analysis system that provides the relevant data, presents it in an intuitively understandable way, and allows users to set their local project parameters so as to identify attractive sites for their particular project independently, in an individually configurable, multi-stage analytical process. For many reasons this cannot be a finished or even closed online portal, because no operator can foresee the multitude of conceivable information needs of the users. It therefore makes sense to use the internationally recognized OGC specifications and standards of a spatial data infrastructure to develop an open-source-based software toolkit. This allows interested users to realize web-based online portals for arbitrary spatially related questions quickly, easily and intuitively; these portals provide numerous software functions and use existing SDI components to integrate arbitrary data. Flex-I-Geo-Web is a joint open-source development project of eight members of the Geo-Initiative of the Bonn region. Project partners are the Fraunhofer Institute IAIS, the Department of Geography of the University of Bonn, the Chamber of Industry and Commerce, the economic development agency of the City of Bonn, and four innovative GIS companies from the region (CPA, lat/lon, interactive instruments, WhereGroup). The project is funded within the Technology and Innovation Programme NRW by the European Regional Development Fund (ERDF). The project runs for 2.5 years and ends in October.

2. Implementation of the Project

In realizing the project, emphasis is placed on the integration and interaction of server-based services, both for providing the data and for processing it. Accordingly, map services can be integrated via the OGC WMS interface and data services via the OGC WFS interface. Users' own data can be georeferenced via an interface, transferred into a map or data service, and made available for further processing (Fig. 1). By means of transactional access to a WFS it is possible to modify one's own data and thus keep it up to date.
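The transactional WFS access mentioned above means POSTing a WFS-T Transaction document to the service. The sketch below only assembles such an Insert request with the standard library; the feature type, its properties and the geometry are made-up placeholders, not part of the Flex-I-Geo-Web schema.

```python
import xml.etree.ElementTree as ET

WFS = "http://www.opengis.net/wfs"
GML = "http://www.opengis.net/gml"
ET.register_namespace("wfs", WFS)
ET.register_namespace("gml", GML)

def wfs_insert(type_name, properties, lon, lat):
    """Build a WFS 1.1.0 Transaction/Insert document as a string."""
    tx = ET.Element(f"{{{WFS}}}Transaction",
                    {"service": "WFS", "version": "1.1.0"})
    insert = ET.SubElement(tx, f"{{{WFS}}}Insert")
    feature = ET.SubElement(insert, type_name)   # application feature type
    for key, value in properties.items():
        ET.SubElement(feature, key).text = str(value)
    geom = ET.SubElement(feature, "geometry")
    point = ET.SubElement(geom, f"{{{GML}}}Point", {"srsName": "EPSG:4326"})
    ET.SubElement(point, f"{{{GML}}}pos").text = f"{lon} {lat}"
    return ET.tostring(tx, encoding="unicode")

# Hypothetical brownfield-site feature, for illustration only.
doc = wfs_insert("Brachflaeche",
                 {"name": "Altes Depot", "area_m2": 4200},
                 7.10, 50.73)
print(doc)
```

Update and Delete operations follow the same Transaction envelope, which is what lets a portal keep user-contributed site data current.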
Figure 1: Provision of data sources within Flex-I-Geo-Web

A central role in the project is played by the integration of the Web Processing Service (OGC WPS 1.0.0) into the framework, through which arbitrary standardized geoprocessing services and analysis functions can be integrated into web portals. The entire system is thus very flexible in its functionality and can be extended without limit by adding further processes as required by the use case, including the combination of individual atomic processes into a process chain, for example to model a recurring workflow. A WPS client developed within the Flex-I-Geo-Web project is intended to make it possible to invoke and configure web processes easily via a graphical interface. The WPS client thus acts as the link between the web portal and the processes through which various (computation) functions are provided for the project (Fig. 2).

Figure 2: Integration of WPS into the Flex-I-Geo-Web framework

Further services and functions provided within the Flex-I-Geo-Web toolkit include, besides the tools already mentioned for interacting with map and data services and for integrating one's own data sources, further analysis tools for data and results as well as their graphical preparation and visualization. They offer users the possibility of classifying, weighting, filtering or sorting their data or results. A corresponding tool assists in creating and executing workflows composed of individual functions. The data used and the results derived from it can subsequently be printed out on the basis of templates or exported as PDF using a reporting tool. Standards-compliant rights management, as a service component governing access to services and data, secures the portal and the user data. It is based on the relevant ISO/IEC 2700x standards and on specifications of the German Federal Office for Information Security (BSI). For rights management the corresponding OGC standards (GeoRM) are consulted, in particular for the fields of authentication (verifying identity), authentisation (proving identity), authorization (checking access rights), support for license management and license trading, and support for billing procedures. This is intended to allow interaction with licensed and fee-based data services to be handled via a billing module. The security component interacts with the portal through an easily configurable and intuitive user interface. The technical architecture of the project follows its design idea of a client-server architecture, with outsourced server services for data and computations and a client in the browser through which the services are controlled and the visualization of the data and less computation-intensive evaluations are performed. In addition, the framework integrates partly different solutions of the participating project partners. The web client is based mainly on HTML pages and JavaScript. The application core controlling the application is likewise programmed in JavaScript. The libraries jQuery and jQuery UI are used, together with the OpenLayers library for visualization. On the server side the components are based on Java; JavaServer Pages (JSP) and JavaServer Faces (JSF) are used to integrate the Java components. External services are also integrated in the project, such as the WMS/WFS/WPS services of deegree and a user database. The project is currently in its completion phase. This means that missing modules will be delivered or modified shortly, and the interoperability of the modules with one another and with the application will be tested and adapted as needed. After these tasks are completed, a demonstrator with a possible user view of the system will be created, which is to represent the workflow and the required functions and at the same time replace the current developer view of the project (Fig. 3). Completion of the demonstrator is planned for spring 2011. A further evaluation of the result with selected target users, who were already involved in an earlier phase of the project, will follow. A portal enabling site analyses for finding and individually evaluating building gaps, brownfields and vacancies is to serve as the demonstrator for the project.
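A WPS 1.0.0 process of the kind the toolkit's WPS client wraps can also be invoked directly with a KVP-encoded Execute request. The sketch builds such a request URL; the endpoint, process identifier and inputs are invented examples, not actual Flex-I-Geo-Web processes.

```python
from urllib.parse import urlencode

def wps_execute_url(base, identifier, inputs):
    """Build an OGC WPS 1.0.0 Execute request in KVP (GET) encoding.

    `inputs` maps input identifiers to literal values; they are joined
    into the DataInputs parameter as key=value pairs separated by ';'.
    """
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "DataInputs": ";".join(f"{k}={v}" for k, v in inputs.items()),
    }
    return base + "?" + urlencode(params)

# Hypothetical buffer process in a site-analysis chain, for illustration.
url = wps_execute_url(
    "https://example.org/wps",
    "geo:buffer",
    {"distance": 500, "geometry": "site_42"},
)
print(url)
```

Chaining atomic processes into a workflow, as described above, amounts to feeding one process's output reference into the DataInputs of the next Execute request.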
The demonstrator will at the same time provide architects and building owners with a new instrument for shaping land-conserving urban development and for weighing economic and ecological aspects in a balanced way.

Figure 3: Administrator/developer view of the Flex-I-Geo-Web interface
Summary

The Flex-I-Geo-Web project decisively advances the possibilities for analysing geographic data in a web-based solution. Users need only a web browser and access to a geodata portal with integrated Flex-I-Geo-Web building blocks. The project offers the chance to develop an exemplary, freely accessible open-source solution for an intuitively understandable geoinformation portal construction kit. In the implementation, existing standards of the Open Geospatial Consortium (OGC) are supported and further developed. The building blocks developed are to be made available to the general public after the end of the project in order to ensure sustainable reusability. With the idea of a flexible construction kit, the project already follows the open-source philosophy: on the one hand it is itself composed of open-source components adapted for this purpose, and on the other hand a higher acceptance of the project result can be expected if it can be freely extended or adapted to the needs and purposes of the individual user. Flex-I-Geo-Web is furthermore designed to work together with existing geodata portal solutions: in the base configuration the viewer component is based on the OpenLayers API, and interfaces to portals such as iGeoportal or MapBender are planned so that Flex-I-Geo-Web can also be deployed within an existing infrastructure. The development of the Flex-I-Geo-Web construction kit also ensures that the result is not a special-purpose system for sustainable settlement development, but that general requirements for innovative spatially enabled, service-based web applications are derived.
Starting from the demonstrator, an online software construction kit is intended to spawn numerous further usage ideas and new information portals in the future, which can be realized independently by end users or commissioned from any IT service provider. In a sense, this creates a catalyst for a wide variety of follow-up projects. The planned demonstrator can also address macroeconomic and societal questions; at the same time, every new application derived from the construction kit generates economic impulses in two ways: first through the demand created by building an information and analysis system, and second through its use.

A central issue in the project is data provision. The reluctance of many data owners to make the required geospatial domain data available poses risks for this and comparable projects. For the success of the intended demonstrator it is crucial not only to master software services and data services technically, but to offer reliable access to real and, as far as possible, area-wide data holdings from the outset. In many cases the necessary information exists as (geo)domain data under the responsibility of individual specialist authorities, municipal utilities or municipal enterprises, which often handle their data more restrictively than is technically necessary. Numerous research and development projects of individual project participants have therefore been unable to realize their full potential in the past. For the example region of Bonn, this project can thus, with close involvement of the political leadership, initiate a process leading to a fundamental decision by the decision makers to make all public data available without usage restrictions, at least within the scope and for the duration of the project development, insofar as the data are not personal data subject to data protection law. For data providers, the project offers in the long term a new alternative platform for offering their geodata, along with the opportunity to evaluate and harmonize simple billing models. Success of this approach may in the future encourage further municipalities and districts to offer their data in a comparably simple way via web services.

A final innovation not to be underestimated is the potential of the Flex-I-Geo-Web construction kit to turn potential end users into active developers. If you will, these are first contributions to a Web 3.0 concept, in which not only content is user-generated (Web 2.0), but also software and processes. All software components created within the project will be made freely available as open source, and new standards and specifications will be contributed to an international harmonization process. This governs general as well as mutual access to the results achieved in the work packages.

References

[1] OGC (2007): OpenGIS Web Processing Service, doc. nr. r7
[2] OGC (2005): OpenGIS Web Feature Service Implementation Specification, doc. nr.
[3] OGC (2002): OpenGIS Web Map Service Implementation Specification, doc. nr. r3
[4] Schiller, Georg; Gutsche, Jens-Martin; Siedentop, Stefan; Deilmann, Clemens (2009): Von der Außen- zur Innenentwicklung in Städten und Gemeinden - Das Kostenparadoxon der Baulandentwicklung. Texte Nr. 31/2009, Umweltbundesamt 2009, online resource.
Challenges and Advantages of Using GPS Data in Outdoor Advertisement

Dirk HECKER, Christine KÖRNER, Michael MAY
Fraunhofer Institut Intelligente Analyse- und Informationssysteme, Schloss Birlinghoven, Sankt Augustin, Germany

Abstract. A growing number of companies use mobility data in their day-to-day business. Especially in the area of outdoor advertising, GPS devices have been successfully applied in recent years to measure poster performance. Based on personal mobility traces, the quality and precision of performance measures have increased significantly. However, the usage of GPS technology poses several challenges when applied to critical business processes. We present several challenges and solutions which we have developed over the last years of our mobility research with GPS data.

Keywords. mobility surveys, GPS data, outdoor advertising, missing data

Introduction

Outdoor advertisement is one of the oldest advertising media and continues to play an important role in the advertisement industry. In 2008 the turnover was 684 million CHF (about 460 million Euros) in Switzerland and 805 million Euros in Germany. In recent years the market has changed rapidly, driven predominantly by two factors: the competition with other advertising media and the emergence of digital media. First, outdoor advertisement competes with other media, including classic television, radio and press as well as modern online ads and direct mailing. In order to be included by media planners in an advertising mix, transparent measures of campaign performance are needed. Typical measures are (1) the reach of a campaign, i.e. the percentage of persons within a target group defined by socio-demographic attributes that has had contact with the campaign in a certain time interval (often one week), and (2) the number of total contacts this group has had.
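The two performance measures can be made concrete in a short sketch (illustrative only; per-person campaign contact counts over the evaluation period are assumed as given input):

```python
def reach_and_contacts(contacts_per_person):
    """contacts_per_person: list of campaign contact counts,
    one entry per person in the target group."""
    n = len(contacts_per_person)
    reached = sum(1 for c in contacts_per_person if c > 0)
    reach_pct = 100.0 * reached / n    # share of group with >= 1 contact
    total = sum(contacts_per_person)   # total number of contacts
    return reach_pct, total

# Four persons, one week: two had contacts, two had none.
print(reach_and_contacts([3, 0, 1, 0]))  # (50.0, 4)
```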
Improved methods for audience measurement have become available in recent years due to technological advances and improved methodology. For example, for measuring audience performance for car and pedestrian traffic on the street, GPS technology has established itself as a new standard in Switzerland and Germany, greatly improving the possibilities of fine-grained media planning [1]. Other countries, such as Austria and the UK, are currently preparing GPS studies, and GPS-based measurement can be expected to become a worldwide standard.

This paper is based on two industry projects of Fraunhofer IAIS concerned with audience measurement in outdoor advertising in Switzerland and Germany. Both projects have raised numerous challenging questions and led to many developments concerning the industrial use of GPS data. In this paper we give a collection of challenges and solutions that need to be addressed in similar applications.
1. Mobility Surveys in Germany and Switzerland

The Arbeitsgemeinschaft Media-Analyse e.V. (ag.ma), a joint industry committee of German advertising vendors and customers, commissioned a nationwide mobility survey as the basis for an objective performance evaluation of outdoor advertisement. The survey was conducted using two different observation policies. In the first part, a nationwide representative sample of persons was interviewed about their movements of the previous day in a Computer Assisted Telephone Interview (CATI). In the second part, persons were provided with GPS devices for a period of 7 days. Swiss Poster Research Plus (SPR+) is a neutral research organization of the Swiss outdoor advertising branch. SPR+ equipped a representative sample of test persons with a GPS logger for a period of 7-10 days. In total, the survey includes more than participants. In addition to mobility data, both empirical studies contain information about the poster sites ( in Germany and about in Switzerland). Besides geographic coordinates, a visibility area is defined for each panel, from within which the poster is likely to be seen. Given the trajectories of an individual and the visibility area of a poster panel, all resulting passages can be calculated by geographic intersection, and the performance measures can be derived. A formal, application-independent description of the scenario has been given by Körner et al. [2]. Although the calculation sounds easy at first sight, several challenges, including missing measurements, small spatial variability and obfuscated movements within objects, have to be handled to derive valid values for poster performance.

2. Application Challenges

In this section we give a short introduction to the challenges of measuring poster performance with GPS data.

2.1 Missing measurements

GPS mobility studies contain different types of missing measurements.
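The geographic intersection of a trajectory with a visibility area can be sketched in pure Python (a simplified illustration, not the production code: a ray-casting point-in-polygon test that counts how many times a GPS point sequence enters the visibility polygon):

```python
def point_in_polygon(pt, poly):
    """Ray-casting test; poly is a list of (x, y) vertices."""
    x, y = pt
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def count_passages(trajectory, visibility_poly):
    """Count how often a GPS point sequence enters the visibility area."""
    passages, was_inside = 0, False
    for pt in trajectory:
        is_inside = point_in_polygon(pt, visibility_poly)
        if is_inside and not was_inside:
            passages += 1
        was_inside = is_inside
    return passages

square = [(0, 0), (10, 0), (10, 10), (0, 10)]      # visibility area
track = [(-5, 5), (5, 5), (15, 5), (5, 5), (-5, 5)]  # passes through twice
print(count_passages(track, square))  # 2
```

In practice a GIS library or spatial database performs this intersection, but the principle is the same.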
First, short interruptions occur due to tunnels, street canyons or the warm-up phase of a GPS device. Second, single trips within a day may not be recorded; for example, people may easily forget to carry the device during a short trip to the bakery. Third, complete measurement days may be missing for several reasons: the GPS device may have been defective, or people drop out of the study early. Depending on the kind of missing data, different courses of action need to be taken. Short interruptions can be detected during data preparation and closed using routing algorithms. The second and third types of missing data pose serious problems because they cannot be identified from the data itself; usually, only completely missing measurement days can realistically be detected within a mobility study. May et al. [3] analyzed the effects that such a bias can introduce on data mining results.

Moreover, one requirement of mobility surveys is that inference about population-wide mobility patterns can be made. The monitoring technologies mentioned above easily lead to missing measurements. Missing data is not only inconvenient to handle but can also introduce a bias into the data sample, which calls the validity of data mining results into question. It is therefore necessary to provide a systematic approach for detecting possible sample biases in mobility data due to missing measurements. Hecker et al. [4] show that dependencies between mobility, socio-demography and missing data are not unusual and need attention in the data mining process, and apply the approach to a large GPS mobility survey. The core of the approach is subgroup analysis, which identifies such dependencies in mobility surveys.

2.2 Modeling Micro-Movement Variability

One demand in mobility surveys, especially in outdoor advertising, is a high level of spatial detail. The high dimensionality of geographic space, however, makes this requirement hard to fulfill: even large mobility studies cannot guarantee to comprise all movement variation in large regions. For Germany this would mean forming a representative set of test persons whose movements cover about 6.7 million street segments in a given period of time. In Germany, test persons cover barely 26.7% of the German street network. However, as mobility studies are very laborious and expensive, it is not realistic to perform larger GPS studies in the near future. It is thus necessary to increase the spatial variability of a given mobility data set while retaining important mobility characteristics of the sample. Hecker et al. [5] introduced a spatial aggregation of the mobility data and a subsequent simulation-driven disaggregation based on information about the traffic distribution.

2.3 Pedestrian Movement Model for Indoor Poster Campaigns

GPS technology has the drawback that it cannot be applied indoors due to signal loss. In Germany and Switzerland many valuable posters are situated in public buildings such as train stations or shopping malls, and their evaluation is of high interest. Liebig et al. [6] introduced a method that allows performance measurements for billboards placed indoors.
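A first data-preparation step for the missing-data types above is to classify the gaps between consecutive GPS fixes by duration. The sketch below is illustrative only (the thresholds are assumptions, not those used in the surveys): short gaps could later be closed by routing, while day-long gaps are candidates for missing measurement days.

```python
from datetime import datetime, timedelta

def classify_gaps(fixes, sample=timedelta(seconds=10)):
    """Classify gaps between consecutive GPS fix timestamps.

    Thresholds are illustrative: short gaps (tunnels, warm-up phase)
    can be closed by routing; gaps of a day or more hint at missing
    measurement days.
    """
    gaps = {"short": 0, "trip_scale": 0, "day_scale": 0}
    for a, b in zip(fixes, fixes[1:]):
        dt = b - a
        if dt <= sample:
            continue                      # regular sampling, no gap
        if dt <= timedelta(minutes=15):
            gaps["short"] += 1            # e.g. tunnel, warm-up
        elif dt < timedelta(hours=24):
            gaps["trip_scale"] += 1       # possibly a forgotten device
        else:
            gaps["day_scale"] += 1        # candidate missing day(s)
    return gaps

t0 = datetime(2011, 3, 1, 8, 0, 0)
fixes = [t0,
         t0 + timedelta(seconds=10),
         t0 + timedelta(minutes=5),   # short interruption
         t0 + timedelta(hours=6),     # trip-scale gap
         t0 + timedelta(days=2)]      # missing day(s)
print(classify_gaps(fixes))  # {'short': 1, 'trip_scale': 1, 'day_scale': 1}
```

The second and third gap types still cannot be told apart from forgotten trips without external information, which is exactly the bias problem discussed above.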
The approach is based on obtaining a number of relatively inexpensive manual frequency counts and on generating a general model of indoor pedestrian movement from these counts and a network of the possible pathways through a train station.

References

[1] D. Hecker, C. Körner, and M. May, Räumlich differenzierte Reichweiten für die Außenwerbung, in Angewandte Geoinformatik 2010, Beiträge zum 22. AGIT-Symposium Salzburg.
[2] C. Körner, D. Hecker, M. May, and S. Wrobel, Visit potential: A common vocabulary for the analysis of entity-location interactions in mobility applications, in Proc. of the 13th International Conference on Geographic Information Science (AGILE'10).
[3] M. May, C. Körner, D. Hecker, M. Pasquier, U. Hofmann, and F. Mende, Handling Missing Values in GPS Surveys Using Survival Analysis: A GPS Case Study of Outdoor Advertising, in Proc. of the 3rd International Workshop on Data Mining and Audience Intelligence for Advertising (ADKDD), 2009.
[4] D. Hecker, H. Stange, C. Körner, and M. May, Sample Bias due to Missing Data in Mobility Surveys, in Proc. of the 6th International Workshop on Spatial and Spatiotemporal Data Mining (SSTDM).
[5] D. Hecker, C. Körner, D. Schulz, H. Stange, and M. May, Modeling Micro-Movement Variability in Mobility Studies, in Proc. of the 14th International Conference on Geographic Information Science (AGILE'11), 2011 (to appear).
[6] T. Liebig, H. Stange, D. Hecker, M. May, C. Körner, and U. Hofmann, A general pedestrian movement model for the evaluation of mixed indoor-outdoor poster campaigns, in Proc. of the Third Workshop on Pervasive Advertising and Shopping, 2010.
Dynamic 3D Time-Series Visualization in Interactive Web-Mapping Applications

Martin GEMEINHOLZER; Andre SCHÜCKER
geosys - Gesellschaft für angewandte Informationstechnologie, Berlin

Introduction

Three-dimensional visualization of spatial data has been spreading rapidly, at the latest since the proliferation of 3D city models. The focus here is on rendering real, immobile geo-objects such as buildings and street furniture, which change little over time, as faithfully as possible. Beyond that, however, the third dimension also offers ways to visualize temporal changes of (including mobile) objects and their attribute data. Time series are used in a large number of important application areas to analyse and visualize long-term data and to derive trends; examples include environmental monitoring and the visualization of climate change, recurring events and natural disasters. For complex information, combining 3D rendering with time series enables and simplifies the simultaneous display, capture and querying of a wealth of information [1][2]. Using the KML geodata format and the Google Earth plugin as an example, this contribution presents current options for dynamically visualizing 3D geodata with temporal information in the browser. Advantages and restrictions of 3D rendering as well as future developments are outlined.

1. Prerequisites

At present, the options for visualizing three-dimensional geodata with a temporal component in a web application are still poorly developed and barely standardized, and they currently require the installation of a browser plugin. In April 2008, the Keyhole Markup Language (KML) was adopted as an open standard by the Open Geospatial Consortium.
Originally developed for displaying spatial data in the Google Earth geodata browser, the XML-based geodata markup language KML is now increasingly used by software from other vendors and in web-mapping applications. KML supports the display of two- and three-dimensional geodata, the integration of 3D models, and temporal animation. With the Google Earth browser plugin, available for the common browsers, a virtual globe is available for such data in web-mapping applications.

2. Example: Simulation of a Weather Balloon Ascent

We illustrate the options for 3D web visualization with a temporal component using the example of a simulated weather balloon flight (Fig. 1). The temporal course of the ascent from the ground up to an altitude of m is shown, during which the balloon records measurements (ozone concentration, air pressure, temperature) and its current position at short time intervals.

Fig. 1: Weather balloon example in the ExtJS framework, display of one time interval

The Google Earth plugin is embedded in an ExtJS layout that provides the necessary control elements and descriptive information. The user can control time (start, stop, speed, time interval) and freely choose the view (viewing and tilt angle, zoom) in the manner familiar from the Google Earth desktop application. In addition, a large number of further plugin functions can be controlled via the Google Earth API (layer selection, predefined views, place search, etc.).

The collected data are stored in a PostgreSQL/PostGIS database. Via the Google Earth API, a KML file is loaded that contains a network link to a PHP script. This script generates KML code by retrieving geometry and attribute data from the database with SQL queries and embedding them in the KML. The spatial information can either be stored in columns of an arbitrary table or be read, using PostGIS functions, from the geometry column of, for example, a shapefile loaded into the database.

The weather balloon and the measured values for ozone and air pressure are symbolized by 3D COLLADA models that move along the 3D points in space; missing intermediate values are linearly interpolated so that a fluid animation results. This is achieved with the KML extension <gx:Track>. Arbitrary attribute data can be visualized through colour, size and shape changes of COLLADA objects. In the example, the colour of the balloon symbolizes the temperature, the size of the sphere above the balloon the ozone concentration, and the length of the bar below the balloon the air pressure. Arbitrary information can be retrieved with a click on an object and displayed in a description balloon. The example can be transferred to other application areas, such as the visualization of traffic or goods flows.

3. Particularities of 3D Rendering

A major advantage of 3D cartography is that it preserves the user's viewing habits and the familiar environment of 3D space [3][4]. This advantage of the pseudo-3D environment of virtual globes, however, creates a problem when attribute data are visualized through element size, because size in 3D space also serves the perception of distance. One way around this is to attach legends of fixed extent to the object, which can be realized, for example, with COLLADA models (cf. Fig. 2) or photo overlays [5].

Fig. 2: Object-attached legend

It should further be noted that whether a 3D globe is a sensible choice depends strongly on the data to be visualized, and that users first have to learn to navigate the plugin [1].
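The KML generation described above can be sketched as follows (an illustrative stand-in for the PHP script, with invented sample coordinates and timestamps): the server emits a <gx:Track> whose paired <when>/<gx:coord> elements Google Earth interpolates between, producing the fluid animation.

```python
def gx_track_kml(samples):
    """samples: list of (iso_time, lon, lat, alt_m) tuples.

    Emits a minimal KML document with a <gx:Track>; the viewer
    interpolates the position between the given time/coordinate
    pairs. Illustrative only, not the project's PHP generator.
    """
    whens = "\n".join(f"      <when>{t}</when>" for t, *_ in samples)
    coords = "\n".join(f"      <gx:coord>{lon} {lat} {alt}</gx:coord>"
                       for _, lon, lat, alt in samples)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:gx="http://www.google.com/kml/ext/2.2">
  <Placemark>
    <gx:Track>
      <altitudeMode>absolute</altitudeMode>
{whens}
{coords}
    </gx:Track>
  </Placemark>
</kml>"""

kml = gx_track_kml([("2011-03-01T08:00:00Z", 13.40, 52.52, 0),
                    ("2011-03-01T08:10:00Z", 13.42, 52.55, 3000)])
```

In the project this document would be returned by the network-linked server script, with the tuples coming from the PostGIS query.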
Only with the right combination of task, user and interface do 3D visualizations offer added value, which can then, however, be immense thanks to the simultaneous capture and visualization of complex data. Where appropriate, a combination of synchronized 2D/3D views can yield a further gain in information.
4. Future/Outlook

The need to first install a plugin (which may not be available on all platforms) to display 3D content in the web browser, a plugin which then takes considerable time to load, is highly unsatisfactory and stands in the way of the spread and general acceptance of this kind of data visualization. In the (near) future, 3D rendering in the browser will be possible without proprietary plugins. The HTML5 standard, already usable with the common browsers but not yet finally adopted, provides the basis for this: it allows so-called canvas elements to be defined in which 3D data can be visualized with the free WebGL format. Technically, WebGL is the combination of JavaScript and OpenGL. Furthermore, the cartographically interesting vector graphics format SVG [6] is supported directly in HTML5 source code, which previously was only possible with a plugin. These new techniques can be applied to advantage in 3D geodata visualization, as promising examples demonstrate.

References

[1] Hilbring, D. (2005): 3D-GIS Visualisierung in der Umweltinformatik. Univ.-Verl. Karlsruhe, Karlsruhe.
[2] Lin, H., J. Zhu, B. Xu, W. Lin and Y. Hu (2009): A Virtual Geographic Environment for a Simulation of Air Pollution Dispersion in the Pearl River Delta (PRD) Region. In: Yi, C. (ed.) (2009): 3D geoinformation sciences. Springer, Berlin.
[3] Zenner, C., H. Asche and M. Wolff (2008): Virtuelle 3D-Geovisualisierungen: innovative Formen der Kommunikation und Perzeption räumlicher Strukturen. In: Strobl, Blaschke and Griesebner (eds.): Angewandte Geographische Informationsverarbeitung XX. Beiträge zum AGIT-Symposium Salzburg. Wichmann, Heidelberg.
[4] Jobst, M. (2004): Interpretierte Realität: 3D-Kartographie als Hilfsmittel der Geokommunikation. In: Strobl, Blaschke and Griesebner (eds.): Angewandte Geographische Informationsverarbeitung XVI. Beiträge zum AGIT-Symposium Salzburg. Wichmann, Heidelberg.
[5] Sandvik, B. (2008): Using KML for Thematic Mapping. University of Edinburgh, School of Geosciences, Institute of Geography. Dissertation.
[6] Ueberschär, N. and A. Winter (2006): Visualisieren von Geodaten mit SVG im Internet. Wichmann, Heidelberg.

Additional literature: Erickson, T. A.: A data system for visualizing 4-D atmospheric CO2 models and data. Presentation, FOSS4G conference, Sydney, Australia.
WebGIS for Municipal Information Management: Hamburg's Approach to Integrating WebGIS into E-Government Infrastructures

Sascha TEGTMEYER; Dirk ROHRMOSER
Landesbetrieb Geoinformation und Vermessung, Freie und Hansestadt Hamburg

Abstract. Using the Hamburg Solar Atlas and the species register (Artenkataster) as examples, this contribution presents two WebGIS solutions developed by the LGV for municipal information management in the field of climate and species protection in the Hamburg region.

Keywords. WebGIS, Solar Atlas, Artenkataster, Hamburg, GDI-HH

Introduction

The Hamburg metropolitan region is a dynamic urban area. The Free and Hanseatic City of Hamburg and the 14 surrounding districts cover an area of about square kilometres and are home to more than four million people. In this context, the public administration faces various fields of action linked to geoinformation. The title European Green Capital 2011 makes climate and species protection one of the city's currently most important fields of action. The Landesbetrieb Geoinformation und Vermessung (LGV) Hamburg offers extensive services for developing individual WebGIS solutions in this context. A prerequisite is the availability of geoinformation in standardized spatial data infrastructures (SDI). With their service-based approach, the GDI-Hamburg and the SDI of the Hamburg metropolitan region form the basis for WebGIS development at the LGV. Using the Hamburg Solar Atlas and the species register as examples, this contribution presents two innovative WebGIS solutions developed by the LGV for municipal information management in the field of climate and species protection.
1. Hamburg Solar Atlas

Every roof owner should ask the question "How much sun shines on your roof?" before considering the installation of a solar system. The Hamburg Solar Atlas (1), which the LGV developed in cooperation with HAMBURG ENERGIE SOLAR, answers this question easily. Interested parties can use the Hamburg Solar Atlas to find out whether their roof is suitable for a photovoltaic or solar thermal installation. For a large share of Hamburg's roof surfaces, roof areas with associated irradiation values and suitability classes were derived from the 3D city model developed by the LGV. As shown in Figure 1, this information can be retrieved online and printed for every single roof surface via the WebGIS developed for this purpose.

Figure 1. Excerpt from the Hamburg Solar Atlas

The solar potential analysis took into account all objects relevant for shading, such as topography, buildings and vegetation. The data basis for the analysis was Hamburg's digital 3D city model (LOD2), which comprised about buildings at the time of the study. For all roofs of this model, the parameters size, orientation and slope were determined, and a high-resolution computation grid with cells of 0.25 m² was then derived. The influence of shading objects was determined using horizon lines, allowing shadow casting to be accounted for at minute resolution. The automatic segmentation of the roofs into spatially high-resolution sub-areas made it possible to capture in great detail the seasonally varying roof shading caused by surrounding terrain, buildings, vegetation or dormers.
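The horizon-line idea can be illustrated with a small sketch (a simplification, not the LGV algorithm; the skyline samples are invented): a roof cell is shaded at a given minute if the sun's elevation lies below the horizon elevation interpolated at the sun's azimuth.

```python
from bisect import bisect_right

def is_shaded(sun_azimuth_deg, sun_elevation_deg, horizon):
    """horizon: list of (azimuth_deg, elevation_deg) samples, sorted by
    azimuth, describing the skyline seen from one 0.25 m2 roof cell.

    Returns True if the sun stands below the skyline at its azimuth.
    """
    az = [a for a, _ in horizon]
    i = bisect_right(az, sun_azimuth_deg % 360.0)
    (a0, e0), (a1, e1) = horizon[i - 1], horizon[i % len(horizon)]
    span = (a1 - a0) % 360.0 or 360.0
    frac = ((sun_azimuth_deg - a0) % 360.0) / span
    horizon_elev = e0 + frac * (e1 - e0)   # linear skyline interpolation
    return sun_elevation_deg < horizon_elev

# Invented skyline: a tall building towards the south (azimuth 180 deg).
horizon = [(0, 2), (90, 2), (180, 25), (270, 2)]
print(is_shaded(180, 20, horizon))  # True: sun at 20 deg behind 25 deg skyline
print(is_shaded(180, 40, horizon))  # False: sun clears the building
```

Evaluating such a test for every sun position over a year, per grid cell, yields the irradiation sums from which the suitability classes are derived.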
RESTful Web Processing Service Theodor Foerster 1, Andre Brühl 2, Bastian Schäffer 3 1 Institute for Geoinformatics (ifgi) - University of Muenster, Germany 2 dynport GmbH, Hamburg, Germany 3 52 North
More informationAdvancing Sustainability with Geospatial Steven Hagan, Vice President, Server Technologies João Paiva, Ph.D. Spatial Information and Science
Advancing Sustainability with Geospatial Steven Hagan, Vice President, Server Technologies João Paiva, Ph.D. Spatial Information and Science Engineering 1 Copyright 2011, Oracle and/or its affiliates.
More informationsecure intelligence collection and assessment system Your business technologists. Powering progress
secure intelligence collection and assessment system Your business technologists. Powering progress The decisive advantage for intelligence services The rising mass of data items from multiple sources
More informationCityGML goes to Broadway
CityGML goes to Broadway Thomas H. Kolbe, Barbara Burger, Berit Cantzler Chair of Geoinformatics thomas.kolbe@tum.de September 11, 2015 Photogrammetric Week 2015, Stuttgart The New York City Open Data
More informationNetwork for testing GI services
Network for testing GI services Anders Östman GIS Institute, University of Gävle, Nobelvägen 2, SE 80176, Gävle, Sweden Anders.Ostman@hig.se Abstract. The use of standards is essential when building a
More informationIntroduction to Geospatial Web Services
Geospatial Web Services Introduction to Geospatial Web Services An introduction and inventory of geospatial web services and their importance to interoperability in the geospatial domain. vers. 20100604
More informationInformation Services for Smart Grids
Smart Grid and Renewable Energy, 2009, 8 12 Published Online September 2009 (http://www.scirp.org/journal/sgre/). ABSTRACT Interconnected and integrated electrical power systems, by their very dynamic
More informationService-Oriented Visualization of Virtual 3D City Models
Service-Oriented Visualization of Virtual 3D City Models Authors: Jan Klimke, Jürgen Döllner Computer Graphics Systems Division Hasso-Plattner-Institut, University of Potsdam, Germany http://www.hpi3d.de
More informationSODDA A SERVICE-ORIENTED DISTRIBUTED DATABASE ARCHITECTURE
SODDA A SERVICE-ORIENTED DISTRIBUTED DATABASE ARCHITECTURE Breno Mansur Rabelo Centro EData Universidade do Estado de Minas Gerais, Belo Horizonte, MG, Brazil breno.mansur@uemg.br Clodoveu Augusto Davis
More informationA framework for Itinerary Personalization in Cultural Tourism of Smart Cities
A framework for Itinerary Personalization in Cultural Tourism of Smart Cities Gianpaolo D Amico, Simone Ercoli, and Alberto Del Bimbo University of Florence, Media Integration and Communication Center
More informationMobile GIS for Cadastral Data Collection in Ghana
Mobile GIS for Cadastral Data Collection in Ghana Eric MENSAH OKANTEY Barend KÖBBEN 1 Introduction With the development of Web GIS and the emergence of Mobile GIS new possibilities of data capture and
More informationA Technical Framework for Visualizing Spatio-temporal Quality Metrics of Volunteered Geographic Information
A Technical Framework for Visualizing Spatio-temporal Quality Metrics of Volunteered Geographic Information Oliver ROICK a,1, Lukas LOOS a and Alexander ZIPF a a Chair of GIScience, Heidelberg University,
More informationIFS-8000 V2.0 INFORMATION FUSION SYSTEM
IFS-8000 V2.0 INFORMATION FUSION SYSTEM IFS-8000 V2.0 Overview IFS-8000 v2.0 is a flexible, scalable and modular IT system to support the processes of aggregation of information from intercepts to intelligence
More informationAn Esri White Paper June 2011 ArcGIS for INSPIRE
An Esri White Paper June 2011 ArcGIS for INSPIRE Esri, 380 New York St., Redlands, CA 92373-8100 USA TEL 909-793-2853 FAX 909-793-5953 E-MAIL info@esri.com WEB esri.com Copyright 2011 Esri All rights reserved.
More informationRemote Sensing and GIS Application In Change Detection Study In Urban Zone Using Multi Temporal Satellite
Remote Sensing and GIS Application In Change Detection Study In Urban Zone Using Multi Temporal Satellite R.Manonmani, G.Mary Divya Suganya Institute of Remote Sensing, Anna University, Chennai 600 025
More informationManaging Large Imagery Databases via the Web
'Photogrammetric Week 01' D. Fritsch & R. Spiller, Eds. Wichmann Verlag, Heidelberg 2001. Meyer 309 Managing Large Imagery Databases via the Web UWE MEYER, Dortmund ABSTRACT The terramapserver system is
More informationArcGIS. Server. A Complete and Integrated Server GIS
ArcGIS Server A Complete and Integrated Server GIS ArcGIS Server A Complete and Integrated Server GIS ArcGIS Server enables you to distribute maps, models, and tools to others within your organization
More informationApplying GIS in seismic hazard assessment and data integration for disaster management
Applying GIS in seismic hazard assessment and data integration for disaster management Rumiana Vatseva, Dimcho Solakov, Emilia Tcherkezova, Stela Simeonova, Petya Trifonova National Institute of Geophysics,
More informationSDI Workshop ESDIN Best Practices. 2011-06-28 INSPIRE conference, Edinburgh. Arnulf Christl, Metaspatial
SDI Workshop ESDIN Best Practices 2011-06-28 INSPIRE conference, Edinburgh Arnulf Christl, Metaspatial Download this slide set as ODP (1.2MB) or PDF (1.1MB) at http://arnulf.us/publications Arnulf Christl
More informationProbabilistic Risk Assessment Studies in Yemen
Probabilistic Risk Assessment Studies in Yemen The catastrophic risk analysis quantifies the risks of hazard, exposure, vulnerability, and loss, thus providing the decision maker with the necessary information
More informationFOSS4G-based energy management system for planning virtual power plants at the municipal scale
FOSS4G-based energy management system for planning virtual power plants at the municipal scale Luis Ramirez Camargo1,2, Roland Zink1, Wolfgang Dorner1 1 2 Applied Energy Research group, Technologie Campus
More informationEmerging Trends in SDI.
Emerging Trends in SDI. Jeanne Foust ESRI gsdi 1 Spatial Data Infrastructure TRENDS GIS use continues to rapidly grow. Recognition Of GIS As Critical Infrastructure growing. Alignment of SDI and National
More information12th AGILE International Conference on Geographic Information Science 2009 page 1 of 9 Leibniz Universität Hannover, Germany
12th AGILE International Conference on Geographic Information Science 2009 page 1 of 9 Sensor Web for River Water Pollution Monitoring and Alert System Natasa Markovic, Aleksandar Stanimirovic, Leonid
More informationADVANCED GEOGRAPHIC INFORMATION SYSTEMS Vol. II - Using Ontologies for Geographic Information Intergration Frederico Torres Fonseca
USING ONTOLOGIES FOR GEOGRAPHIC INFORMATION INTEGRATION Frederico Torres Fonseca The Pennsylvania State University, USA Keywords: ontologies, GIS, geographic information integration, interoperability Contents
More informationTerraLib as an Open Source Platform for Public Health Applications. Karine Reis Ferreira
TerraLib as an Open Source Platform for Public Health Applications Karine Reis Ferreira September 2008 INPE National Institute for Space Research Brazilian research institute Main campus is located in
More informationREFERENCE ARCHITECTURE FOR SMAC SOLUTIONS
REFERENCE ARCHITECTURE FOR SMAC SOLUTIONS Shankar Kambhampaty 1 and Sasirekha Kambhampaty 2 1 Computer Science Corporation (CSC), India skambhampaty@gmail.com 2 Student, Department of Computer Science,
More informationProject Title: Project PI(s) (who is doing the work; contact Project Coordinator (contact information): information):
Project Title: Great Northern Landscape Conservation Cooperative Geospatial Data Portal Extension: Implementing a GNLCC Spatial Toolkit and Phenology Server Project PI(s) (who is doing the work; contact
More informationManaging Variability in Software Architectures 1 Felix Bachmann*
Managing Variability in Software Architectures Felix Bachmann* Carnegie Bosch Institute Carnegie Mellon University Pittsburgh, Pa 523, USA fb@sei.cmu.edu Len Bass Software Engineering Institute Carnegie
More informationApplying OGC Sensor Web Enablement to Risk Monitoring and Disaster Management
Applying OGC Sensor Web Enablement to Risk Monitoring and Disaster Management Simon Jirka 1, Arne Bröring 2, Christoph Stasch 3 1 52 North Initiative for Geospatial Open Source Software GmbH, jirka@52north.org
More informationWHAT IS GIS - AN INRODUCTION
WHAT IS GIS - AN INRODUCTION GIS DEFINITION GIS is an acronym for: Geographic Information Systems Geographic This term is used because GIS tend to deal primarily with geographic or spatial features. Information
More informationAn Assessment of the Effectiveness of Segmentation Methods on Classification Performance
An Assessment of the Effectiveness of Segmentation Methods on Classification Performance Merve Yildiz 1, Taskin Kavzoglu 2, Ismail Colkesen 3, Emrehan K. Sahin Gebze Institute of Technology, Department
More informationArcGIS Data Models Practical Templates for Implementing GIS Projects
ArcGIS Data Models Practical Templates for Implementing GIS Projects GIS Database Design According to C.J. Date (1995), database design deals with the logical representation of data in a database. The
More informationDISMAR: Data Integration System for Marine Pollution and Water Quality
DISMAR: Data Integration System for Marine Pollution and Water Quality T. Hamre a, S. Sandven a, É. Ó Tuama b a Nansen Environmental and Remote Sensing Center, Thormøhlensgate 47, N-5006 Bergen, Norway
More informationMSc Geo-information Science (MGI) INFORMATION. Willy ten Haaf (Study Advisor)
MSc Geo-information Science (MGI) INFORMATION Willy ten Haaf (Study Advisor) Why Wageningen University? international university flexible programmes - much free choice intensive study advise small and
More informationUSE OF GEOSPATIAL AND WEB DATA FOR OECD STATISTICS
USE OF GEOSPATIAL AND WEB DATA FOR OECD STATISTICS CCSA SPECIAL SESSION ON SHOWCASING BIG DATA 1 OCTOBER 2015 Paul Schreyer Deputy-Director, Statistics Directorate, OECD OECD APPROACH OECD: Facilitator
More informationPART 1. Representations of atmospheric phenomena
PART 1 Representations of atmospheric phenomena Atmospheric data meet all of the criteria for big data : they are large (high volume), generated or captured frequently (high velocity), and represent a
More informationSENSOR WEB SERVICES FOR EARLY FLOOD WARNINGS BASED ON SOIL MOISTURE PROFILES
SENSOR WEB SERVICES FOR EARLY FLOOD WARNINGS BASED ON SOIL MOISTURE PROFILES Thomas Brinkhoff a, *, Stephan Jansen b a Jade University Oldenburg, Institute for Applied Photogrammetry and Geoinformatics,
More informationBig Data Collection and Utilization for Operational Support of Smarter Social Infrastructure
Hitachi Review Vol. 63 (2014), No. 1 18 Big Data Collection and Utilization for Operational Support of Smarter Social Infrastructure Kazuaki Iwamura Hideki Tonooka Yoshihiro Mizuno Yuichi Mashita OVERVIEW:
More informationService Oriented Architecture
Service Oriented Architecture Charlie Abela Department of Artificial Intelligence charlie.abela@um.edu.mt Last Lecture Web Ontology Language Problems? CSA 3210 Service Oriented Architecture 2 Lecture Outline
More informationData interchange between Web client based task controllers and management information systems using ISO and OGC standards
Data interchange between Web client based task controllers and management information systems using ISO and OGC standards Michael Nørremark 1*, Ole Jørgensen 1, Jens Bligaard 2 and Claus G. Sørensen 1
More informationChaining Façades: Higher Efficiency in evolution-enabled Spatial Data Infrastructures (SDI)
Chaining Façades: Higher Efficiency in evolution-enabled Spatial Data Infrastructures (SDI) Roland M. Wagner Fraunhofer ISST, Berlin/Dortmund, Germany ABSTRACT After the first publication of the known
More informationSextant. Spatial Data Infrastructure for Marine Environment. C. Satra Le Bris, E. Quimbert, M. Treguer
Sextant On-Line information system for marine geographical information E. Quimbert, M. Bellouis, F. Lecuy, M. Treguer Centre de Bretagne BP 70, Plouzané 29280 France E-mail: sextant@ifremer.fr Sextant
More informationPremium Data Centre Europe - 2012 Pricing, Business Model & Services
Premium Data Centre Europe - 2012 Pricing, Business Models & Services Premium Data Centre Europe - 2012 Pricing, Business Model & Services Table of Contents A list of figures used in this report 5 Methodology
More informationM.S. Civil Engineering, Drexel University, Philadelphia, PA. Dec. 2001. B.S. Industrial Engineering, Los Andes University, Bogotá, Colombia. Sep.
EDUCATION Ph.D. Hydro-informatics, Drexel University, Philadelphia, PA. Dissertation title: Ontomet: Ontology Metadata Framework Advisor: Dr. Michael Piasecki Dec. 2004 M.S. Civil Engineering, Drexel University,
More informationGIS BASED LAND INFORMATION SYSTEM FOR MANDAL SOUM, SELENGE AIMAG OF MONGOLIA
GIS BASED LAND INFORMATION SYSTEM FOR MANDAL SOUM, SELENGE AIMAG OF MONGOLIA B. Tuul GTZ, Land Management and Fiscal Cadastre project, Government building 12, ALAGCaC, Ulaanbaatar, Mongolia tuul1119@yahoo.com,
More informationBusiness Rule Standards -- Interoperability and Portability
Rule Standards -- Interoperability and Portability April 2005 Mark H. Linehan Senior Technical Staff Member IBM Software Group Emerging Technology mlinehan@us.ibm.com Donald F. Ferguson IBM Fellow Software
More informationMultiscale Object-Based Classification of Satellite Images Merging Multispectral Information with Panchromatic Textural Features
Remote Sensing and Geoinformation Lena Halounová, Editor not only for Scientific Cooperation EARSeL, 2011 Multiscale Object-Based Classification of Satellite Images Merging Multispectral Information with
More informationStrategy for Improving Cadastral Spatial Quality toward Effective e- Government based NSDI
Strategy for Improving Cadastral Spatial Quality toward Effective e- Government based NSDI Young-ho LEE, Republic of Korea Key words: e-government, interoperability, NSDI, Spatial quality SUMMARY E-Government
More informationGeosciences - Programme subject in programme for Specialization in General Studies
Geosciences - Programme subject in programme for Specialization in General Studies Dette er en oversettelse av den fastsatte læreplanteksten. Læreplanen er fastsatt på Bokmål Laid down as a regulation
More informationManagement Control and Reporting of Intangibles
Special Issue 4/13 Management Control and Reporting of Intangibles edited by Andreas Duhr and Axel Haller Arbeitskreis Immaterielle Werte im Rechnungswesen der Schmalenbach-Gesellschaft für Betriebswirtschaft
More informationCatalogue or Register? A Comparison of Standards for Managing Geospatial Metadata
Catalogue or Register? A Comparison of Standards for Managing Geospatial Metadata Gerhard JOOS and Lydia GIETLER Abstract Publication of information items of any kind for discovery purposes is getting
More informationWildfire Prevention and Management in a 3D Virtual Environment
Wildfire Prevention and Management in a 3D Virtual Environment M. Castrillón 1, P.A. Jorge 2, I.J. López 3, A. Macías 2, D. Martín 2, R.J. Nebot 3,I. Sabbagh 3, J. Sánchez 2, A.J. Sánchez 2, J.P. Suárez
More informationMeasurement of the hotel Average Daily Rate using Internet Distribution Systems
Measurement of the hotel Average Daily Rate using Internet Distribution Systems Ibai Roman a, Igor Ibarguren a, Jon Kepa Gerrikagoitia a, and Emilio Torres-Manzanera b a CICtourGUNE, Cooperative Research
More informationIntroduction to Imagery and Raster Data in ArcGIS
Esri International User Conference San Diego, California Technical Workshops July 25, 2012 Introduction to Imagery and Raster Data in ArcGIS Simon Woo slides Cody Benkelman - demos Overview of Presentation
More informationJOURNAL OF OBJECT TECHNOLOGY
JOURNAL OF OBJECT TECHNOLOGY Online at www.jot.fm. Published by ETH Zurich, Chair of Software Engineering JOT, 2008 Vol. 7 No. 7, September-October 2008 Applications At Your Service Mahesh H. Dodani, IBM,
More informationInteroperable Solutions in Web-based Mapping
ISPRS SIPT IGU UCI CIG ACSG Table of contents Table des matières Authors index Index des auteurs Search Recherches Exit Sortir Interoperable Solutions in Web-based Mapping Marta Wojnarowska and Bridget
More informationAndrea Buffam, Natural Resources Canada Canadian Metadata Forum National Library of Canada Ottawa, Ontario September 19 20, 2003
Geospatial Metadata Andrea Buffam, Natural Resources Canada Canadian Metadata Forum National Library of Canada Ottawa, Ontario September 19 20, 2003 The Presentation - Geospatial Metadata This presentation
More informationGEOSPATIAL SERVICE PLATFORM FOR EDUCATION AND RESEARCH
GEOSPATIAL SERVICE PLATFORM FOR EDUCATION AND RESEARCH Jianya Gong, Huayi Wu, Wanshou Jiang, Wei Guo, Xi Zhai, Peng Yue State Key Laboratory of Information Engineering in Surveying, Mapping and Remote
More informationSUSTAINABLE TRAFFIC CONCEPT IN MUNICIPALITY OF NEA KYDONIA
SUSTAINABLE TRAFFIC CONCEPT IN MUNICIPALITY OF NEA KYDONIA The municipality of Nea Kydonia in prefecture of Chania, Crete, is located in an area of 2.150 hectares in a coastal district, 5 km west of the
More informationOPEN STANDARD WEB SERVICES FOR VISUALISATION OF TIME SERIES DATA OF FLOOD MODELS
OPEN STANDARD WEB SERVICES FOR VISUALISATION OF TIME SERIES DATA OF FLOOD MODELS Barend Köbben FRMRC RPA 7 Workshop visualisations and flow simulations as flood risk communication tools
More information不 来 梅 科 技 大 学. Hochschule Bremen Programme: September 20 - October 08. Bremen University of Applied Sciences at the Expo Shanghai 2010
Bremen University of Applied Sciences at the Expo Shanghai 2010 不 来 梅 科 技 大 学 Hochschule Bremen Programme: September 20 - October 08 Science in the City - 城 市 中 的 科 学 中 心 How will we want to live and work
More informationCoventry Development Plan 2016 Appendix 89. Glossary of Key Terms
Coventry Development Plan 2016 Appendix 89 Glossary of Key Terms Area Action Plan A Development Plan Document which focuses upon a specific location or an area subject to significant change. Affordable
More informationNATIONAL CLIMATE CHANGE & WILDLIFE SCIENCE CENTER & CLIMATE SCIENCE CENTERS DATA MANAGEMENT PLAN GUIDANCE
NATIONAL CLIMATE CHANGE & WILDLIFE SCIENCE CENTER & CLIMATE SCIENCE CENTERS DATA MANAGEMENT PLAN GUIDANCE Prepared by: NCCWSC/CSC Data Management Working Group US Geological Survey February 26, 2013 Version
More informationINTEROPERABLE IMAGE DATA ACCESS THROUGH ARCGIS SERVER
INTEROPERABLE IMAGE DATA ACCESS THROUGH ARCGIS SERVER Qian Liu Environmental Systems Research Institute 380 New York Street Redlands, CA92373, U.S.A - qliu@esri.com KEY WORDS: OGC, Standard, Interoperability,
More informationCROP CLASSIFICATION WITH HYPERSPECTRAL DATA OF THE HYMAP SENSOR USING DIFFERENT FEATURE EXTRACTION TECHNIQUES
Proceedings of the 2 nd Workshop of the EARSeL SIG on Land Use and Land Cover CROP CLASSIFICATION WITH HYPERSPECTRAL DATA OF THE HYMAP SENSOR USING DIFFERENT FEATURE EXTRACTION TECHNIQUES Sebastian Mader
More informationA new cost model for comparison of Point to Point and Enterprise Service Bus integration styles
A new cost model for comparison of Point to Point and Enterprise Service Bus integration styles MICHAL KÖKÖRČENÝ Department of Information Technologies Unicorn College V kapslovně 2767/2, Prague, 130 00
More informationonetransport 2016 InterDigital, Inc. All Rights Reserved.
onetransport 1 onetransport: Who We are Today Platform Provider Transport Expert Analytics Sensors / Analytics Data providers / Use case owners 11 partners 2- year project 3.5m Total funding 2 How this
More informationConcepts for quality assurance during mobile online data acquisition
Concepts for quality assurance during mobile online data acquisition Stephan Mäs, Wolfgang Reinhardt, Admire Kandawasvika, Fei Wang AGIS - Arbeitsgemeinschaft GIS Universität der Bundeswehr München Werner-Heisenberg-Weg
More informationBusiness Rules and SOA: Parallels and Synergies
Business Rules and SOA: Parallels and Synergies White Paper As of January 2006 Innovations Software Technology GmbH, 2009. All rights reserved. Dissemination or reproduction of this publication or any
More informationData classification methods in GIS The most common methods
University of Thessaly, Department of Planning and Regional Development Master Franco Hellenique POpulation, DEveloppement, PROspective Volos, 2013 Data classification methods in GIS The most common methods
More informationFrom Big Data to Smart Data How to improve public transport through modelling and simulation.
From Big Data to Smart Data How to improve public transport through modelling and simulation. Dr. Alex Erath, Pieter Fourie, Sergio Ordó ~ nez, Artem Chakirov FCL Research Module: Mobility and Transportation
More informationDynamism and Data Management in Distributed, Collaborative Working Environments
Dynamism and Data Management in Distributed, Collaborative Working Environments Alexander Kipp 1, Lutz Schubert 1, Matthias Assel 1 and Terrence Fernando 2, 1 High Performance Computing Center Stuttgart,
More informationACE GIS Project Overview: Adaptable and Composable E-commerce and Geographic Information Services
ACE GIS Project Overview: Adaptable and Composable E-commerce and Geographic Information Services José Poveda, Michael Gould, Carlos Granell 64 Departamento de Lenguajes y Sistemas Informáticos Universitat
More informationUnderstanding Raster Data
Introduction The following document is intended to provide a basic understanding of raster data. Raster data layers (commonly referred to as grids) are the essential data layers used in all tools developed
More informationA Hybrid Architecture for Mobile Geographical Data Acquisition and Validation Systems
A Hybrid Architecture for Mobile Geographical Data Acquisition and Validation Systems Claudio Henrique Bogossian 1, Karine Reis Ferreira 1, Antônio Miguel Vieira Monteiro 1, Lúbia Vinhas 1 1 DPI Instituto
More informationERDAS Tools for Online Sharing of Geospatial Knowledge and Workflows
ERDAS Tools for Online Sharing of Geospatial Knowledge and Workflows Oliver Zimmermann ERDAS Leica Geosystems AG, Heinrich Wildstrasse, CH-9434 AU, Switzerland Oliver.zimmermann@erdas.com KEY WORDS: Online
More informationResource Oriented Architecture and REST
Resource Oriented Architecture and REST Assessment of impact and advantages on INSPIRE Roberto Lucchi, Michel Millot European Commission Joint Research Centre Institute for Environment and Sustainability
More informationAssessment of Workforce Demands to Shape GIS&T Education
Assessment of Workforce Demands to Shape GIS&T Education Gudrun Wallentin, Barbara Hofer, Christoph Traun gudrun.wallentin@sbg.ac.at University of Salzburg, Dept. of Geoinformatics Z_GIS, Austria www.gi-n2k.eu
More informationImpact of Service Oriented Architecture on ERP Implementations in Technical Education
Impact of Service Oriented Architecture on ERP Implementations in Technical Education Swati Verma Department of Computer Science & Engg, B.T. Kumaon Institute of Technology, Dwarahat, 263653, India. E-mail:
More information