GI-Forum ST 2017
Date | Topic | Presenter | Affiliation
18.04.2017 | Institute Meeting
25.04.2017 | Smart Cartography – Relevance and Challenges | Jochen Schiewe | HafenCity University Hamburg
Today everything is, or should be, “smart” – from “Smart Cities” through “Smart Mobility” to “Smart Buildings”; even society is supposed to be smart. This presentation discusses the possible – and value-adding – contributions of cartography, with the overall aim of shaping a more satisfactory, efficient and sustainable life for humans. For this purpose, selected projects currently carried out at the Lab for Geoinformatics and Geovisualization (g2lab) at HafenCity University Hamburg will be presented. In particular, both the cartographic relevance and the challenges will be worked out for public participation workshops aimed at finding housing sites for refugees, for creating better insights into the fairway adjustment process of the river Elbe, and for generating adaptive visualizations for pedestrian navigation in indoor/outdoor environments.
02.05.2017 | Amplifying the Mind with Digital Tools | Albrecht Schmidt | Uni Stuttgart
Historically, the use and development of tools is strongly linked to human evolution and intelligence. The last 10,000 years show a stunning progress in physical tools that have transformed what people can do and how people live. Currently, we are at the beginning of an even more fundamental transformation: the use of digital tools to amplify the mind and human perception. Digital tools provide us with entirely new opportunities to enhance the perceptual and cognitive abilities of humans. However, our understanding of how this can be achieved through ubiquitous computing and media is still very basic, and research to explore this domain lacks a systematic approach. In our research we create novel systems and devices that enhance human cognition and perception through digital technologies. Our experimental approach is: first, to understand the users in their context as well as the potential for enhancement; second, to create innovative interventions that provide functionality amplifying human capabilities; and third, to empirically evaluate and quantify the enhancement gained by these developments. I will address the following exemplary research topics from our lab to highlight the feasibility of creating such novel systems and illustrate the challenges faced during experimental quantification of improvements: (1) improving human abilities to create and understand instructions through assistive systems, (2) enhancing human memory through life-logging technologies, and (3) augmenting the human visual sense for thermal depth perception. These research programs are the starting point for the ERC project AMPLIFY. Ultimately, these technologies have the potential to become the foundation for overcoming the temporal and spatial boundaries in human perception and for massively improving cognition. The transformation can be expected to have an even more drastic impact than the invention of writing. With these statements I would like to open up the discussion of the question: what will be the key enablers for this transformation with regard to systems, technologies, algorithms, and methods?
09.05.2017 | The ethics of big data as a public good: which public? Whose good? | Linnet Taylor | Tilburg University
This talk will examine the increasing claims by scientists and the media that big data is a public good and therefore any data available must be used – and in particular the claim that the needs of the poor and marginalised provide a mandate for such use. I will discuss the nature of the visibility and power that (big) data creates, how they are distributed, and different arguments about governing the power to visualise through data. The talk will focus on the international political economy of data, including the use of data for human development and humanitarian response; the public/private interface with regard to governance; and the notion of a duty of visibility on the part of the 'data subject'.
16.05.2017 | Real-time Geo-information Fusion | Florian Hillen | IfGI, WWU Münster
In recent years, the number of sensors providing geo-information has grown substantially. The resulting flood of geo-information builds the basis for new, time-critical geo-applications that would have been inconceivable a decade ago. The real-time characteristics of geo-information, which are also becoming more important for traditional sensors (e.g. remote sensors), require new methodologies and scientific investigations regarding aggregation and analysis, which can be summarised under the term geo-information fusion. In my talk I will introduce the basic idea of geo-information fusion and present an agent-based modelling use case in the context of crowd monitoring. The scenario is designed to emphasise the benefits of fusing geo-information from different sources as well as to demonstrate the need for up-to-date information and real-time processing.
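As a rough illustration of the fusion idea (not the speaker's method – the talk uses agent-based modelling), a minimal real-time fusion rule might combine heterogeneous sensor readings of the same quantity by inverse-variance weighting, inflating the variance of stale readings so that up-to-date information dominates. The `Reading` type, the exponential staleness model, and the half-life value are all assumptions made for this sketch:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    value: float     # e.g. estimated persons per m^2 in one grid cell
    variance: float  # sensor-specific measurement uncertainty
    t: float         # timestamp in seconds

def fuse(readings, now, half_life=30.0):
    """Inverse-variance fusion with temporal decay: a reading's variance
    doubles every `half_life` seconds of age, so fresh, precise sensors
    dominate the fused estimate."""
    num = den = 0.0
    for r in readings:
        age = now - r.t
        inflated = r.variance * 2.0 ** (age / half_life)
        w = 1.0 / inflated
        num += w * r.value
        den += w
    return num / den
```

With equally precise, equally fresh readings this reduces to a plain weighted mean; as one reading ages, its influence decays smoothly instead of being dropped at a hard cutoff.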
23.05.2017 | Implicit semantic structures in spatial data analysis | Simon Scheider | Universiteit Utrecht
In this presentation I will discuss, based on a range of examples, how spatial data analysis depends on implicit assumptions about the meaning of data, and how these assumptions influence not only our choice of analysis methods, tools, and data sources, but also the usefulness of analysis results with respect to the questions we ask. Turning such implicit semantic structures into an explicit form is therefore a central task for eScience in general, and for spatial information science in particular. I propose to handle implicit semantic structures for spatial analysis based on ontologies, linked data and latent semantics. I will discuss the cases of choropleth maps and statistical summaries of social statistics in geoportals and the Modifiable Areal Unit Problem (MAUP), and show how implicit semantic structures can be added to geoprocessing workflows in ordinary GIS. The goal is to enrich, modularize and share spatial analysis methods together with data on the Web.
30.05.2017 | Institute Meeting
06.06.2017 | Pentecost Holidays
13.06.2017 | Statistical algorithms for change detection in optical and SAR imagery, implementations in Python and on the Google Earth Engine | Mort Canty | Forschungszentrum Jülich
The talk will outline the theory and open-source software implementations of two change detection procedures applicable to visual/infrared and polarimetric SAR satellite data. Specifically, the "Iteratively Re-weighted Multivariate Alteration Detection" (IR-MAD) algorithm for multi- and hyperspectral imagery [1] will be explained, together with its application to automatic radiometric normalization [2]. Additionally, a recent procedure for detecting changes in time series of polarimetric SAR images [3] will be discussed. The algorithms have all been programmed in the scripting language Python. The source code is available on GitHub (MIT License) and is also encapsulated in Docker containers serving Jupyter notebooks for easy installation and use. The programs are currently being ported to the Python API of the Google Earth Engine. The software will be demonstrated with examples.
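To give a flavour of the MAD idea, the sketch below implements a single-pass MAD transformation (without the iterative re-weighting that gives IR-MAD its name): canonical correlation analysis pairs up maximally correlated linear combinations of the two acquisitions, their differences are the MAD variates, and the chi-square statistic of the standardized variates flags change pixels. This is an illustrative reconstruction from the published theory, not the speaker's GitHub code:

```python
import numpy as np
from scipy.linalg import eigh
from scipy.stats import chi2

def mad_change(X, Y, alpha=0.01):
    """Single-pass MAD change detection between two co-registered
    multispectral images, each of shape (n_pixels, n_bands).
    Returns a boolean change mask."""
    n, B = X.shape
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
    Sxx = Xc.T @ Xc / (n - 1)
    Syy = Yc.T @ Yc / (n - 1)
    Sxy = Xc.T @ Yc / (n - 1)
    # canonical correlations: Sxy Syy^-1 Syx a = rho^2 Sxx a
    A = Sxy @ np.linalg.solve(Syy, Sxy.T)
    rho2, a = eigh(A, Sxx)               # ascending order
    rho2, a = rho2[::-1], a[:, ::-1]
    b = np.linalg.solve(Syy, Sxy.T) @ a
    b /= np.sqrt(np.sum(b * (Syy @ b), axis=0))  # unit variance in Y
    U, V = Xc @ a, Yc @ b
    V *= np.sign(np.sum(U * V, axis=0))  # enforce positive correlation
    M = U - V                            # the MAD variates
    # no-change variance of variate i is 2 (1 - rho_i)
    sigma2 = np.maximum(2.0 * (1.0 - np.sqrt(np.clip(rho2, 0.0, 1.0))), 1e-12)
    Z = np.sum(M**2 / sigma2, axis=1)    # ~ chi2(B) for no-change pixels
    return Z > chi2.ppf(1.0 - alpha, df=B)
```

In IR-MAD proper, the chi-square no-change probabilities would then be fed back as pixel weights and the decomposition repeated until the canonical correlations converge.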
20.06.2017 | Data-Enabled Design for Smart Cities | Gerd Kortuem | Uni Delft
Cities around the world are embarking on ambitious journeys to turn themselves into so-called smart cities. While initial efforts were driven by technocratic visions of efficiency and control, more recent efforts focus on creating tangible benefits for citizens and communities. In this talk I will provide insights into the MK:Smart smart city project in Milton Keynes and discuss the role of design thinking in bringing together technological, business and societal drivers. I will particularly focus on the role of urban data in conceiving and delivering smart city services, and ask how we can utilise (urban and sensor) data as design material. Gerd Kortuem is Professor of Internet of Things at the Faculty of Industrial Design Engineering at Delft University of Technology. He also holds an associate professorship at The Open University in the UK, where he was deputy director of the Milton Keynes smart city project MK:Smart from 2013 to 2016.
27.06.2017 | Graduate School for Geoinformatics | | IfGI, WWU Münster
Members of the Graduate School for Geoinformatics here at ifgi will present topics connected to their dissertations.
04.07.2017 | Algorithmically-Guided User Interaction: Smart Interfaces for Information Extraction from Old Maps | Thomas van Dijk | Uni Würzburg
Historical maps are important sources of information for scholars of various disciplines. Many libraries are digitising their map collections as bitmap images, but for these collections to be most useful, there is a need for searchable metadata describing the information contained in the maps. Due to the heterogeneity of the images, computer vision software does not perform well, and information is often extracted by hand, if at all: many collections are so large that anything more than the most rudimentary metadata would require an infeasible amount of manual effort. Given that computers cannot solve these information extraction tasks fully automatically, we need a different approach. In this talk, we propose "smart" user interaction based on active learning and sensitivity analysis, where the algorithm is designed to minimize manual effort while providing quality assurance. We illustrate this on several information extraction tasks from historical maps.
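The active-learning principle behind such "smart" interaction can be sketched generically: train a model on a few labeled items, ask the human only about the item the model is least certain of, and retrain. The sketch below uses plain uncertainty sampling with a logistic-regression classifier; the speaker's actual system, its models, and its quality-assurance machinery are not described here, so everything below is an assumed minimal baseline:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def uncertainty_sampling(model, X_pool, k=1):
    """Indices of the k pool items whose predicted class probability
    is closest to 0.5, i.e. where the model is least certain."""
    p = model.predict_proba(X_pool)[:, 1]
    return np.argsort(np.abs(p - 0.5))[:k]

def active_learn(X, y, n_init=10, n_queries=20, rng=None):
    """Pool-based active learning: train, query the most uncertain
    item, reveal its label (simulating the human annotator), retrain."""
    rng = rng or np.random.default_rng(0)
    labeled = [int(i) for i in rng.choice(len(X), n_init, replace=False)]
    if len(set(y[labeled])) < 2:          # ensure both classes at start
        labeled += [int(np.argmin(y)), int(np.argmax(y))]
    pool = [i for i in range(len(X)) if i not in labeled]
    model = LogisticRegression()
    for _ in range(n_queries):
        model.fit(X[labeled], y[labeled])
        q = int(pool[uncertainty_sampling(model, X[pool])[0]])
        labeled.append(q)                 # "annotator" supplies y[q]
        pool.remove(q)
    model.fit(X[labeled], y[labeled])
    return model
```

The point of the loop is economy of annotation: each human answer is spent where the model expects to learn the most, rather than on items it would have classified correctly anyway.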
11.07.2017 | Holistic Approaches to Manage Processes of the Anthropocene: Geomorphology, GIScience and Community Resilience | Chris Renschler | University at Buffalo
Existing isotope techniques based on fallout radionuclides and process-based soil redistribution modelling are complementary, providing more reliable and detailed data to a broad spectrum of stakeholders with different objectives: managers of natural resources and disaster managers of contaminated soils. On the one hand, combining process-based model approaches with the fallout radionuclides of the surface atomic bomb tests of more than half a century ago (often considered the beginning of the Anthropocene) supports more detailed soil and water conservation analyses of the past, as well as future impact studies under changing land use and/or climate around the world. While in that case the main objective is the sustainable use of natural resources, the same approach can also be used to assess a variety of land management strategies whose primary goal is to minimize erosion of radiation-contaminated soils and to increase the deposition of contaminated sediments before they reach a stream or water body. The talk presents techniques for developing modelling tools, based on holistic approaches in geomorphology and GIScience, that allow stakeholders to design, verify, validate and apply models assessing soil redistribution and the return periods of extreme events, both for agricultural soil conservation strategies and for the recovery of radiation-contaminated soils. It also presents a methodology for integrating quantitative models (such as GeoWEPP, the Geospatial Interface for the Water Erosion Prediction Project) to drive the analysis of the complex, interdependent processes that interact within multi-dimensional, functional systems in landscapes. Creating potential win-win situations, based on quantitative measures, among a larger group of stakeholders in a watershed is an important aspect of building long-term partnerships, particularly in communities exposed both to the need for natural resource development and to higher risks of natural and man-made hazards (e.g. the Fukushima Nuclear Power Plant disaster). The 'PEOPLES Resilience Framework' is a platform that provides the basis for integrating quantitative and qualitative models that continuously measure the resilience of communities against extreme events or disasters in any one, or a combination, of the above-mentioned dimensions.
18.07.2017 | Spatializing Global Population Projections | Carsten Kessler | Aalborg University Copenhagen
The global population is projected to pass the 10 billion mark in the second half of this century. At the same time, the effects of climate change are starting to show, including rising temperatures and prolonged, more extreme heat waves. We need a solid assessment of the spatial distribution of the future global population to estimate how many people will be affected by these phenomena. For this purpose, a series of simulations has been developed that model where the global population will live. Starting from the status quo, the simulation uses the numbers from different demographic projections up to the year 2100 and spatializes them on a global 1 km grid. This talk will discuss the simulation approach, computational considerations to make it feasible on standard hardware, and initial results that show how many people can be expected to be affected by varying degrees of extreme heat in the future.
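The simplest conceivable spatialization rule – distributing a projected total over a grid in proportion to each cell's current population share, then counting the people in heat-exposed cells – can be sketched as below. This baseline ignores urbanization and migration dynamics that a real simulation like the one in the talk would model; the function names and the toy grids are illustrative assumptions:

```python
import numpy as np

def spatialize(current_grid, projected_total):
    """Proportional-allocation baseline: give each raster cell the same
    share of the projected total that it holds of today's population."""
    share = current_grid / current_grid.sum()
    return projected_total * share

def exposed_population(pop_grid, tmax_grid, threshold):
    """People living in cells whose maximum temperature exceeds threshold."""
    return pop_grid[tmax_grid > threshold].sum()
```

Allocation preserves the demographic total by construction, so the grid always sums to the projection it spatializes – a useful sanity check for any fancier scheme as well.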
25.07.2017 | Institute Meeting