GL: I am a volcanologist and am particularly interested in volcanic and tsunami mapping. In volcanology, we get big datasets (large or complex), usually consisting of multiple data types. Images are a primary data type for volcanology. We also have a lot of data stored in GeoNet databases, MS Excel tables, as files in folders, and in the GNS Oracle databases. Many data sets have to come together in order to create a cohesive story about a volcano or volcanic event. This is Big Data in the sense of many types of unstructured data coming together while looking for relationships. We also use LIDAR and satellite data that are TB-sized. Satellite imagery in particular is used to monitor Pacific volcanoes, for example.
Keeping on top of all this data (archiving, managing and updating it) is a huge challenge for us. We deal with volatile subjects, so we have highly dynamic data, but the workload is uneven depending on the source. GeoNet data is always available. Satellite data is often available only when we choose to purchase it; chemical data is hard to collect because we have to gather it directly from volcanic vents, and when we do, it is complex to analyse. If there's an event, photographic data rapidly proliferates. The research disciplines in volcanology are diverse. We have mapping specialists (GIS, remote sensing), geological sample specialists, geo-chemical sample specialists, and seismologists, to name a few. We use point-in-time sensing data; we collect airborne gas and particulate samples using planes, and in future perhaps UAVs and helicopters; we have many thousands of digital photos and webcam images of every volcanic vent. Our webcams update automatically every 15 minutes, but this is not always granular enough to capture an event. Where we need information within the 15-minute intervals, we can download split-second images from each webcam and create a detailed time-lapse movie. Occasionally, when required, we use densely set GPS sensors to measure and map ground deformation.
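The first step of that time-lapse workflow, selecting and ordering the downloaded webcam frames that fall inside a monitoring window, can be sketched as below. This is a minimal illustration, not GeoNet's actual tooling; the filename pattern (`vent_YYYYMMDD-HHMMSS.jpg`) and the 15-minute window are hypothetical assumptions.

```python
from datetime import datetime
from pathlib import Path

def frames_between(filenames, start, end, fmt="%Y%m%d-%H%M%S"):
    """Select webcam frames whose filename timestamps fall in [start, end),
    ordered chronologically for time-lapse assembly.

    Filenames are assumed (hypothetically) to look like
    'vent_20230101-120005.jpg', i.e. a prefix, '_', then a timestamp.
    """
    stamped = []
    for name in filenames:
        stem = Path(name).stem                      # e.g. 'vent_20230101-120005'
        ts = datetime.strptime(stem.split("_", 1)[1], fmt)
        if start <= ts < end:
            stamped.append((ts, name))
    # Sorting by parsed timestamp gives the playback order for the movie.
    return [name for ts, name in sorted(stamped)]

# Example: pick the frames captured inside one 15-minute window.
names = ["vent_20230101-121530.jpg",
         "vent_20230101-120005.jpg",
         "vent_20230101-115959.jpg"]
window = frames_between(names,
                        datetime(2023, 1, 1, 12, 0),
                        datetime(2023, 1, 1, 12, 15))
# window → ['vent_20230101-120005.jpg']
```

The resulting ordered list could then be handed to a rendering tool such as ffmpeg to produce the detailed time-lapse movie described above.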
GL: There are a number of barriers to getting timely, reliable data in our field. The most obvious is probably health and safety – we can't get very close to an erupting volcano, so it's difficult to get good information. Sensors, no matter how hardened, get destroyed pretty quickly if they are too close to the action. We need sensors in UAVs and robots that can enter the hazardous zones with real-time, robust connections so we can get the data back for analysis.
The other barrier is secure IT connectivity and access – our team is diverse and works from around the country. We have duty officers working from home, researchers in the field and key emergency response agencies close to the scene. Volcanologists are a diverse, virtual team, highly dispersed around the country and around the world. When there's an event we need to bring experts and data together very quickly to provide analysis at short notice.
We have reached the point where we can use statistics and observations to forecast eruptions. This will progress as our connectivity increases in speed and becomes agnostic to where we are. Our ability to forecast volcanic activity will also improve as the capability to present diverse data in an integrated manner matures. Volcanology is image driven, and event analysis focuses to a high degree on imagery, so we need everything brought together and available in an integrated picture that many experts can view.
GL: Internationally, we are heading towards a unified volcano model. The global goal is unified data from a single volcano, so we have a standard that allows comparison over time at one volcano, and across volcanoes and systems of volcanoes. Data is still quite siloed though – this global model is some years away. In NZ, we need to move to automated management of volcano data before we can contribute to the single model view; to date, archiving and indexing of data has not been well done here, mostly due to workload issues.
GL: Funding is a problem, because there are sources of funding for sensors and experts, but less funding for building the data models that bring them together. We have a plan, but we haven’t been able to execute it to date – while we can see how to better organise the data, we’d have to stop research activity to spend the time doing the organising – and then we’d just have better organised data, not necessarily better science because we’d cut back on research for a time period.
GL: In a crisis, there's a need for visualisation-driven computational steering in real time, available to collaborators around the world, and updated as new information becomes available. For tsunami events, the aim is better forecasting between earthquake and tsunami arrival – this has global impact – but it requires a degree of real-time monitoring, optimised supercomputing and secure virtual presence that isn't available at the moment. Volcanic events span a longer timeframe, so there's a need for longitudinal dynamic modelling – we need to run finite element models comparing physical deformation, crustal strength and, ultimately, many other parameters.
GL: Volcanology and tsunami forecasting have some particular requirements from technology:
– We need robust sensors – especially robust geo-chemical sensors – that can operate in volcanically active areas, so are hardened in terms of sensing capacity and data telemetry, and require little or no maintenance.
– We need to monitor gas and fluid data at the volcanic vent – so this means we need mobile, remotely operated sensors that can get closer than humans can to extreme conditions.
– In tsunami events, a really big deal is to be able to detect the tsunami directly – not just predict it based on the geological event. This need for sensing tsunamis is a major area of research – current efforts involve ocean-wide monitoring by satellites, but telemetry and maintenance of sensors is a huge problem here too. A key barrier in tsunami modelling is the quantity of legacy code in use and the lack of funding for parallelisation.
Volcanoes, earthquakes, and tsunamis are major forces that can be devastating, so sensors and telemetry that can improve forecasting and reduce potential loss of life and damage are a big deal for researchers in this area. Thinking about earthquakes, for example, the Japanese have implemented light-speed communications between earthquake sensors and nuclear power stations, with automated shutdown as soon as shaking is signalled. Of course, in the New Zealand context, we have to understand what contribution such high-speed warnings might make. Perhaps before the ground shaking arrives we could instantly bring all our trains to a halt, cease activity in surgical theatres when an alarm sounds, or close a tunnel.
The other major natural hazard in NZ is flooding. Again, data and modelling are an issue – the water management data produced by regional councils is heterogeneous and not particularly integrated, because there is little centralised standardisation in this area, as far as I am aware. This is not core business to GNS Science, but I mention it to give tsunamis, earthquakes and volcanic eruptions some context against this other major natural hazard.