2021-09-30, 16:30–16:35, Group on Earth Observations
Defining temporal quality for OpenStreetMap (OSM) poses additional challenges when assessing data from the Global South. Extrinsic quality measures cannot be applied there, and intrinsic measures can only indicate, subjectively, where the data is doubtful. This work takes on the challenge of defining a temporal quality aspect for the purpose of disaster risk reduction: one that would determine when data should be updated, revisited, or produced, and when it can be considered incomplete in OSM. This presentation is part of a larger research aim that I, as an independent researcher, am pursuing: developing a relationship between (regional) contexts and OSM data quality.
OSM data is often stale and in many instances not updated. OSM data production in Haiti is an interesting and alarming case because data is only produced when there is distress. If we visualize the pattern of data production with the OSM history viewer from the Heidelberg institute, we can see sudden jumps in data production that correlate with major HOT OSM projects and/or major disasters. This causes a large backlog in aid and situational awareness: the data needs to be added before the urgency arises. This work will explore how such temporal uncertainty can be detected.
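As a minimal sketch of the idea above, the snippet below flags sudden jumps in a series of monthly edit counts as a simple intrinsic temporal-quality signal. The counts are invented example data, not real Haiti figures, and the spike rule (count exceeding a multiple of the median of preceding months) is a hypothetical illustration, not the method of this research; in practice such counts could be obtained from the Heidelberg institute's ohsome API.

```python
from statistics import median

def detect_spikes(monthly_counts, factor=5.0):
    """Return indices of months whose edit count exceeds `factor`
    times the median of all preceding months (simple spike rule).

    This is an illustrative heuristic only: a flagged month suggests
    event-driven mapping (e.g. a disaster response), while long quiet
    stretches before it suggest the data may have gone stale.
    """
    spikes = []
    for i in range(1, len(monthly_counts)):
        baseline = median(monthly_counts[:i])
        if baseline > 0 and monthly_counts[i] > factor * baseline:
            spikes.append(i)
    return spikes

# Example: mostly quiet months with one disaster-driven mapping burst.
counts = [120, 90, 110, 100, 95, 4800, 300, 130]
print(detect_spikes(counts))  # → [5], the burst month is flagged
```

A real analysis would of course need per-region counts over many years and a more robust change-point method, but the shape of the signal (long low baseline, abrupt event-driven spike) is exactly the pattern described above.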
Open GIScience Research Lab, Enschede, the Netherlands
Transition to FOSS4G
Topic – FOSS4G implementations in strategic application domains: land management, crisis/disaster response, smart cities, population mapping, climate change, ocean and marine monitoring, etc.
Level – 1 – Beginners. No specific prior knowledge is needed.
Language of the Presentation –
Independent researcher based in the Netherlands. Holds an MSc in Spatial Engineering from the Faculty of Geo-Information Science and Earth Observation (ITC). I have conducted several research projects aimed at a better understanding of the spatial data quality of OSM.