Volcanic plume height monitoring using calibrated web cameras at the Icelandic Meteorological Office: system overview and first application during the 2021 Fagradalsfjall eruption
Journal of Applied Volcanology volume 12, Article number: 4 (2023)
Abstract
The Icelandic Meteorological Office maintains a national network of webcams designed and built in-house for environmental monitoring. During the 2021 Fagradalsfjall eruption these cameras, along with a temporary near-field network of commercial cameras installed by the Department of Civil Protection and Emergency Management, were used to estimate the height of the \(SO_2\) plume and lava fountain. Here we present the webcam designs, the techniques used to calibrate them, and the messaging system and web interface that allow near-real-time measurements to be made from the images. With this system we were able to estimate heights with an accuracy on the order of tens to a few hundreds of meters, with a typical lag of five to ten minutes, at intervals of up to ten minutes when weather conditions were favorable. The plume heights were then used to constrain the performance of the \(SO_2\) dispersion model used for air quality forecasts, while fountain heights were used to delineate danger zones where visitors at the eruption site were in danger of being hit by ballistic clasts.
Introduction
Visual webcams are an established volcano monitoring tool used by volcano observatories and research institutes worldwide, for instance by the United States Geological Survey (USGS) at the Alaska, Cascades and Hawaiian volcano observatories (Poland et al. 1992; Snedigar et al. 2006), Pusat Vulkanologi dan Mitigasi Bencana Geologi (PVMBG) in Indonesia (ESDM 2022), Istituto Nazionale di Geofisica e Vulcanologia (INGV) at Etna (Behncke et al. 2009; Calvari et al. 2011) and Stromboli (Calvari et al. 2016), the Kamchatka Volcanic Eruption Response Team (KVERT) in Russia (e.g. Lovick et al. 2008; Melnikov et al. 2018), Kagoshima University in Japan (Tupper et al. 2003), Servicio Nacional de Geología y Minería (SERNAGEOMIN) in Chile, and Instituto Geológico Minero y Metalúrgico (INGEMMET) in Peru (Machacca-Puma et al. 2019), amongst others. These cameras are used for a range of purposes, including simple visual identification of activity, as well as quantitative measurements such as incandescence (Patrick et al. 2010), plume heights (Arason et al. 2011; Petersen et al. 2012; Scollo et al. 2014), and plume geometry more generally (Valade et al. 2014).
Plume heights are particularly important, as they are needed for establishing relationships with the Mass Eruption Rate (MER), e.g. (Mastin 2014; Mastin et al. 2009; Sparks et al. 1997). Once such a relationship has been established, the MER can be estimated in real time during an eruption by inverting the relationship using plume height observations from various sensors. A number of semi- and fully automatic systems have been set up at volcano observatories to do this, including the VESPA system (Arason et al. 2017) at the Icelandic Meteorological Office (IMO) and REFIR at INGV Catania (Dürig et al. 2018). The MER, along with the injection height, can then be used to initialize Volcanic Ash Dispersal Models (VADMs), which forecast the spread of tephra and ash through the atmosphere. At IMO a newly developed system allows the creation of volcanic cloud forecasting maps for various volcanoes and eruptive scenarios. The tool is designed for maximum flexibility in launching a new run whenever needed, by selecting the volcano name and entering input parameters such as the starting time of the eruption, injection height, grain-size distribution, gas flux rate and the duration of the emission. Each day the system runs automatically to produce current forecasts for hypothetical eruptions at five key volcanoes: Hekla, Katla, Grímsvötn, Bárðarbunga and Reykjanes/Svartsengi. In this way, even for eruptions with very little warning, the specialists on duty have a reference for where and when tephra and gases will be transported and what the impact might be on the ground and in the atmosphere. For the modelling of tephra transport IMO uses the Lagrangian model NAME, developed at the UK Met Office and used by the London VAAC for the creation of their Volcanic Ash Advisories (Jones et al. 2007).
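As an illustration of the inversion step, the widely cited Mastin et al. (2009) fit \(H = 2.00\,V^{0.241}\) (H the plume height above the vent in km, V the dense-rock-equivalent volumetric flow rate in m\(^3\)/s) can be inverted to estimate the MER from an observed height. The sketch below is illustrative only, not the VESPA implementation, and the DRE magma density of 2500 kg/m\(^3\) is an assumed typical value:

```python
# Invert the Mastin et al. (2009) height-flow-rate relationship,
# H = 2.00 * V**0.241, to estimate MER from an observed plume height.

def mer_from_height(height_km: float, dre_density: float = 2500.0) -> float:
    """Estimate mass eruption rate (kg/s) from plume height (km above vent).

    dre_density is an assumed dense-rock-equivalent magma density.
    """
    volume_rate = (height_km / 2.00) ** (1.0 / 0.241)  # m^3/s DRE
    return volume_rate * dre_density
```

For example, a 10 km plume corresponds to a volume flow rate of roughly 800 m\(^3\)/s DRE, i.e. an MER on the order of 2 × 10\(^6\) kg/s.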
Plume heights are typically measured manually from webcam images by interpolating visually between calibrated guidelines located above the vent at regular intervals and projected into the camera’s Field of View (FOV) onto the image (e.g. Tupper et al. 2003). More advanced approaches orient the guidelines above the volcano in the direction of the wind before projecting them onto the image, to account for the perspective effect as the plume drifts away from the vent (Scollo et al. 2014). These wind direction profiles can be taken from Numerical Weather Prediction (NWP) models or from radiosondes. To project the guidelines from geographic coordinates (latitude, longitude, altitude) into image coordinates (column, row) so that they can be drawn on the image, the camera has to be calibrated, such that its internal geometry (focal length, principal point location, lens distortion) and external geometry (location in geographic space and orientation) are known. The former are often found using a chessboard calibration (for an example in a volcanological context see Scollo et al. 2014), while the latter are sometimes found using image landmarks (e.g. Arason et al. 2011).
In this paper we present an overview of the IMO camera network for volcanic plume height monitoring, including the selection criteria for new sites, the camera designs in use and the operational considerations that went into them. In particular, we extend the techniques of Arason et al. (2011); Scollo et al. (2014); Tupper et al. (2003) with an interactive user interface that allows the user to select (i) the web camera, (ii) the source of the vertical wind profile, (iii) the vent location the guideline is situated above and (iv) the height of the guideline. Once the guideline is matched with the level of interest in the plume (plume top, bottom, center, etc.), the height can be saved and exported for further processing in IMO’s VESPA system. Point (iii) is particularly important for Icelandic fissure volcanoes, where prior information on potential vent locations is weak and the vent may well migrate after eruption onset. In addition, we allow the user to measure distances downplume to locate e.g. visible ash fallout. The whole system is implemented as a single page application using the open source JavaScript react.js framework, and fully integrated into the automated data distribution system at IMO, which is based on the open source messaging broker software RabbitMQ, allowing forecasters and research scientists to access the latest data from any physical location via the web. In order to project the height guideline onto the image the cameras also have to be calibrated, which is to say that the internal geometry of the camera and its location and orientation in geographical space have to be known. The internal geometry, also known as the intrinsic parameters, is found either using a conventional laboratory calibration procedure or using vicarious calibration techniques based on features in the scene, while the orientation can be retrieved using horizon matching, all of which are described here.
The 2021 Fagradalsfjall eruption, 30 km SW of Reykjavík, provided an opportunity to test the system during a comparatively benign event, and we describe how it was used to accurately measure plume heights for initializing \(SO_2\) dispersion models in a timely manner.
Materials and methods
Webcam designs
Here we present the three webcam types referred to in this paper, PiCams, ArduinoCams, and Mobotix cameras. The PiCams and ArduinoCams were designed, built, installed and maintained by IMO, while the Mobotix cameras were installed and maintained by Icelandic Civil Protection.
The PiCam design is used for routine acquisitions where power and communication infrastructure are available, and consists of a Raspberry Pi single board computer mounted in a weatherproof plastic box with a window inserted into the front through which the PiCam web camera unit can view outside (Fig. 1). The box is partially covered by an aluminum weather shield for extra protection, e.g. from falling ice, and to aid with mounting to some external structure. Power and data transfer are over ethernet which requires pre-existing power and data transfer infrastructure, and a heating unit is located inside the enclosure to prevent condensation. These cameras are typically configured to acquire images regularly at multiples of ten minutes past the hour, and to vary exposure time during the night to adapt to ambient light levels. The exposure time is automatically adjusted throughout the year based on the site’s calculated civil twilight time with the program ‘sunwait’ (Risacher and Craig 2022). Although this approach still returns almost black images on completely overcast or dark nights without moon illumination, it works surprisingly well at other times, and facilitates calibrating units using the stars, as discussed later.
The ArduinoCam design is used for opportunistic acquisitions in the wilderness where power and communications are absent, and consists of two low power 5MP OV5642 CMOS sensors from OmniVision mounted in a weatherproof plastic box, with views out of plastic windows mounted at 180 degrees to each other (Fig. 2). The cameras are powered by a 12V battery, and use approximately 200 mA when taking pictures and transmitting, and 300 \(\mu\)A when sleeping between acquisitions. Picture transmission is by FTP over the GPRS mobile phone network, which uses the greater proportion of the power, such that energy consumption is largely a function of image size, which can be varied between 2018x1563 and 320x240 pixels depending on the need to conserve power. Missed acquisitions due to poor signal reception or data transfer issues cannot be retried, so images are acquired sporadically and not necessarily at pre-configured times. The simple camera design and basic on board processing result in images with substantial lens distortion, making accurate calibration very important.
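The current figures above translate directly into battery life. The sketch below assumes an illustrative duty cycle (six images per hour with 60 s of activity each) and an illustrative battery capacity; only the 200 mA active and 300 \(\mu\)A sleep currents come from the text:

```python
# Back-of-envelope ArduinoCam power budget under assumed duty-cycle values.
# Active/sleep currents are from the text; the 60 s active window per image,
# image rate and 50 Ah capacity are illustrative assumptions.

def battery_life_days(capacity_mah=50000.0, images_per_hour=6,
                      active_s_per_image=60.0, i_active_ma=200.0,
                      i_sleep_ma=0.3):
    """Estimated battery life in days for a given acquisition duty cycle."""
    active_frac = images_per_hour * active_s_per_image / 3600.0  # fraction of time active
    avg_ma = i_active_ma * active_frac + i_sleep_ma * (1.0 - active_frac)
    return capacity_mah / avg_ma / 24.0
```

Under these assumptions the average draw is about 20 mA, giving roughly 100 days of operation; halving the image rate or shrinking the image size (and hence transmission time) extends this proportionally, which is why image size is the main power-saving control.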
The Icelandic Civil Protection set up five Mobotix webcams in the area surrounding the volcanic eruption. They were used to monitor the course of events for risk assessment, and the locations were chosen carefully to give a complete overview of the whole area yet remain far enough from the eruption to record long-lasting events without the need to relocate the setup as the lava advanced. The interval can vary from one image per minute to one image per hour based on how fast changes are occurring in the area and the amount of battery power available. Remote mountain peaks are often the best locations, and the setup therefore has to be rugged, highly weatherproof and self-sufficient with respect to power and communication.
Each setup consists of a Mobotix S74 camera housing and two independent lens modules (Fig. 3). Power to the camera is controlled by a Campbell data logger, which switches power to the camera at an interval chosen according to the charging load on the batteries. It is powered by two 200 Ah, 12 V batteries charged by solar panels. Communication to and from each setup is via a 4G GSM modem. Using a Campbell data logger also allows weather data to be collected by adding sensors to the mast.
Still images from the cameras are sent via FTP to a server and copied to the civil protection website simultaneously (Department of Civil Protection and Emergency Management 2022). The images are available to scientists as well as the public. Due to the two lenses per camera, a total of 10 different views were available during the eruption.
The webcam network of IMO and Icelandic Civil Protection
The Icelandic Meteorological Office maintains a substantial national network of ArduinoCam and PiCam cameras for a variety of uses (e.g. atmospheric visibility, snow accumulation and river flow conditions). A website with the location, orientation and operational status of each webcam is maintained at webcam.vedur.is (with restricted access), screenshots of which are shown in Fig. 4. Figure 4a shows how the web camera network is concentrated along the main volcanically active corridor running from southwest to northeast. Within an ICAO funded project, this network is currently being expanded to improve surveillance of the volcanoes assessed to pose the greatest threat: Hekla, Grímsvötn and Bárðarbunga. However, during 2021 the network was densified around Fagradalsfjall on the Reykjanes peninsula (Fig. 4b) and around Askja, in response to new unrest at both locations. Webcams from a temporary network installed by the Department of Civil Protection and Emergency Management to monitor Fagradalsfjall were also integrated into IMO’s systems (Fig. 4c).
Possible locations to install PiCam units, which require power and communications, were considered from places where continuous weather, hydrological, seismic and deformation stations are already installed. We also considered the locations where mobile LiDAR and RADAR trailers were positioned on an ad hoc basis as they were moved to different places. The potential PiCam sites were assessed for the view they give of volcanic systems (central volcanoes and their fissure swarms) using the profile tool in Google Earth. The following features are considered important: (i) accessibility, for maintenance as needed, (ii) distance from the potential eruption sites, to ensure different expected ranges of plume heights are within view with sufficient resolution (it is desirable to have stations at different distances for different coverage), and (iii) azimuthal coverage, where it is desirable to have multiple units viewing a particular volcano. The ArduinoCams, which are less power demanding, can be located anywhere with mobile reception where the view would be beneficial and the site is possible to access.
Once the cameras are connected to the network they start acquiring images on their pre-defined schedule and immediately transmit them back to IMO where they are processed and stored, with messages published on a messaging broker containing relevant metadata such as image size, site location and camera direction, as well as URLs pointing to the location of the images where they can be retrieved. This process is designed to be ‘event based’ in contrast with more common ‘schedule based’ processing, with the goal of reducing artificial lag in image availability to downstream processing systems.
At IMO the RabbitMQ messaging broker is used (RabbitMQ 2022). A Python script listens to the messaging broker, and whenever a message is received indicating a new image is available, it automatically extracts the appropriate wind profiles from the latest weather forecast data, and stores them in JavaScript Object Notation (JSON) files, which are subsequently loaded by the website, as described later.
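The event-driven flow can be sketched as follows. The message schema (keys `camera`, `time`, `lat`, `lon`) and the placeholder `extract_wind_profile` helper are illustrative assumptions, not IMO's actual implementation; with the pika client library, `on_new_image` would be registered as the queue's message callback:

```python
# Sketch of the message-driven wind-profile update. The message keys and
# the placeholder profile extractor are assumptions for illustration.
import json
import pathlib

def extract_wind_profile(lat, lon, time):
    """Placeholder: interpolate the latest NWP forecast at (lat, lon, time)."""
    return {"heights_m": [0, 1000, 2000], "direction_deg": [180, 190, 200]}

def on_new_image(body: bytes, out_dir="profiles"):
    """Callback fired for each broker message announcing a new image:
    extract the matching wind profile and write it as JSON for the website."""
    msg = json.loads(body)
    profile = extract_wind_profile(msg["lat"], msg["lon"], msg["time"])
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    path = out / f"{msg['camera']}_{msg['time']}.json"
    path.write_text(json.dumps({"site": msg["camera"], "wind": profile}))
    return path

# With pika this would be wired up roughly as:
# channel.basic_consume(queue="images",
#                       on_message_callback=lambda ch, m, p, b: on_new_image(b))
```

Keeping the callback free of broker-specific code, as here, makes the profile-extraction step easy to test in isolation.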
Webcam calibration
When measuring plume heights we are fundamentally trying to estimate a 3D (X, Y, Z) measurement of the plume top from a 2D (U, V) observation on an image. In order to do this, we need two things, (i) a relationship between points in 3D geographic space and 2D image space, which we will invert, and (ii) extra geometric constraints to give us a unique answer. The first is given by a pinhole camera with lens distortion model, which requires the camera location, orientation and its internal geometry and lens distortion coefficients as parameters. The second is provided by assuming the vent location and wind direction, which gives us the X and Y locations of the 3D point, leaving only the Z component, or the plume height. Both requirements are discussed in more detail below.
A basic visualization of the relationship between 3D coordinates and 2D image coordinates is shown in Fig. 5a. For a simple pinhole camera with no lens distortion, a point in the camera’s field of view is projected onto the 2D camera image through the focal point, with the exact location on the image determined by the camera’s focal length, principal point, and sensor size. However, most camera lenses introduce some appreciable distortion into the image, and the final projected point may be offset from that of an ideal pinhole camera as a consequence. A comprehensive overview of the equations for calculating the 2D projection of a 3D point, including lens distortion, is given in OpenCV (2022), but in summary, in order to project a point \((X_w, Y_w, Z_w)\) in geographic space onto an image as a point (u, v) we need to know the focal length \(f_x , f_y\), principal point \(c_x , c_y\), the distortion coefficients \(k_1 , k_2 , k_3 , p_1 , p_2\), the camera location in geographic coordinates (t) and its orientation as yaw, pitch and roll angles, expressed as the rotation matrix R. These parameters are typically grouped into two categories - those that change when the camera is moved (the location and orientation) are called the extrinsics, and those that stay the same when the camera is moved (the camera’s internal geometry, i.e. focal length, principal point, distortion coefficients) are called the intrinsics. When we know the extrinsics and intrinsics for a camera we say that it is calibrated.
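For reference, the projection just described can be written out in a few lines of NumPy. The equations follow the OpenCV pinhole-plus-distortion model term by term; the function itself is an illustrative reimplementation rather than the OpenCV call:

```python
import numpy as np

def project_point(Xw, R, t, fx, fy, cx, cy, k1=0, k2=0, k3=0, p1=0, p2=0):
    """Project a 3D world point (Xw) onto the image using the pinhole camera
    model with radial (k1, k2, k3) and tangential (p1, p2) distortion."""
    Xc = R @ (np.asarray(Xw, float) - np.asarray(t, float))  # world -> camera frame
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]                      # perspective divide
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3           # radial distortion
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)  # + tangential terms
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return fx * xd + cx, fy * yd + cy                        # pixel coordinates
```

With all distortion coefficients set to zero this reduces to the ideal pinhole camera; the point on the optical axis projects exactly onto the principal point \((c_x, c_y)\).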
Camera calibration is fundamentally about imaging points of known location, and varying the extrinsics and intrinsics to minimize the offset between the imaged points and the projection of the points onto the image - the so called reprojection error. This can take the form of imaging a known pattern with the same camera many times from different angles, and solving for the camera extrinsics for each image plus the intrinsics common to all images (Fig. 5b), or where the camera location is known relative to the pattern, solving for the intrinsics and a single orientation (Fig. 5c,d showing examples using terrain and stars, respectively). The former approach is typically used in the laboratory, where geometric patterns are printed out and affixed to a flat surface, while the latter is used for cameras that have been already installed in the field, where we know the camera location and the bearing and elevation of stars or features in the terrain. We refer to these two different approaches as laboratory and vicarious calibration, respectively.
Once a camera is calibrated, we can project 3D points onto any image taken by the camera, as if the camera was viewing those points in the scene. By taking the location of the vent and the variation in wind direction with height, we can describe a point at a height h above the vent and a distance d downwind (Fig. 5e). Assuming the plume has reached neutral buoyancy and is being passively carried in the wind direction, we can trace out a line of notional plume height h in 3D space by varying d. Projecting this onto the image using the calibration parameters gives a line that can be moved up or down by varying h to find the desired level in the plume - say the plume top, bottom, center line, etc.
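In a local east-north-up frame, tracing out the guideline amounts to the following sketch (the meteorological "wind blowing from" convention and the frame choice are our assumptions for illustration):

```python
import numpy as np

def guideline_points(vent_xyz, height, wind_from_deg, distances_m):
    """Points at `height` above the vent, carried downwind by distances d.

    wind_from_deg uses the meteorological convention (direction the wind
    blows FROM); coordinates are local east (x), north (y), up (z).
    """
    drift = np.radians(wind_from_deg + 180.0)   # direction the plume moves toward
    dx, dy = np.sin(drift), np.cos(drift)       # east/north components of drift
    pts = [(vent_xyz[0] + d * dx, vent_xyz[1] + d * dy, vent_xyz[2] + height)
           for d in distances_m]
    return np.array(pts)
```

Projecting each of these 3D points with the calibrated camera model and joining them yields the on-image guideline; varying `height` moves the whole line up or down.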
A summary of the different calibration procedures for field deployments is shown in the Venn diagram in Fig. 6. Given we know the camera location, vicarious calibration provides all the information we need: focal length, the rest of the intrinsics, and orientation. However, laboratory calibration does not provide the orientation at the installation site, so in this situation the orientation is calculated by identifying known landmarks in the image (buildings, mountain peaks) or matching a horizon from a digital elevation model with the imaged one. In operational situations horizon matching is typically used to find the orientation, even for vicariously calibrated cameras, as cameras tend to drift in their orientation over time, and horizon matching is the easiest and most robust method to implement as a user interface, as will be discussed later.
The digital horizon for horizon matching and the terrain features for vicarious calibration are extracted from high resolution topography and orthophotos as follows. The viewshed from the camera location is calculated from a Digital Elevation Model (DEM), and the horizon is extracted from the viewshed by dividing it into azimuthal bins about the camera location and taking the most distant point in each bin (Fig. 7a). This results in a list of XYZ values that trace the edge of the viewable terrain. The terrain features are found by manually identifying the same feature in both a real camera image and in simulated images of the landscape rendered using a best guess camera calibration. The DEM, viewshed and an RGB orthophoto or DEM shaded relief image are resampled to the same grid (Fig. 7b), and then iterated over row by row. For each row, cells that are marked as visible in the viewshed are selected and projected onto the image using the best guess calibration, the X, Y location of the cell and the height from the DEM. The appropriate pixel of the image is then coloured according to either the grey scale value of the shaded relief or the RGB value of the orthophoto in that cell, giving the simulated images in Fig. 7c (a simulated view of the shaded DEM) and Fig. 7d (a simulated view of the DEM coloured according to the orthophoto). Where more than one cell is projected into a pixel the average RGB or greyscale value is taken, and empty pixels are linearly interpolated after the whole of the grid is processed. Features common to both the simulated and real images are then identified, and the azimuth and elevation of the selected points on the simulated image are converted to cartesian unit vectors relative to the camera’s location, which can then be used for calibration.
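The azimuthal-binning step of the horizon extraction can be sketched as follows (the bin count and the dictionary-based bookkeeping are illustrative choices, not the operational code):

```python
import numpy as np

def extract_horizon(cam_xyz, visible_xyz, n_bins=3600):
    """From visible terrain points (e.g. viewshed cells), keep the most
    distant point in each azimuthal bin about the camera: the digital horizon."""
    pts = np.asarray(visible_xyz, float)
    dx, dy = pts[:, 0] - cam_xyz[0], pts[:, 1] - cam_xyz[1]
    az = np.degrees(np.arctan2(dx, dy)) % 360.0   # bearing clockwise from north
    dist = np.hypot(dx, dy)                       # horizontal distance
    bins = (az / (360.0 / n_bins)).astype(int)
    farthest = {}                                 # bin -> (distance, point)
    for b, d, p in zip(bins, dist, pts):
        if b not in farthest or d > farthest[b][0]:
            farthest[b] = (d, p)
    return np.array([farthest[b][1] for b in sorted(farthest)])
```

The result is the list of XYZ values tracing the edge of the viewable terrain, ordered by azimuth, ready to be projected onto the image for horizon matching.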
Astronomical bodies, the other source of image features for vicarious calibration, are identified in a similar manner. The azimuth and elevation of stars and planets at the camera location are calculated from the Hipparcos catalogue (Lindegren et al. 1997) using functionality in the Skyfield python package (Rhodes 2019). The Skyfield package corrects for atmospheric refraction, for which typical values of surface atmospheric pressure and temperature are used. These are projected onto night time images using a best guess camera calibration, and the projected stars are then matched with their true imaged counterparts. The azimuth and elevation of these points are then converted to cartesian unit vectors relative to the camera location for use in calibration, as before. This technique has been used previously for calculating the intrinsic parameters of consumer cameras (Klaus et al. 2004) and for all sky cameras (Antuña-Sánchez et al. 2022).
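The conversion from azimuth and elevation to cartesian unit vectors, used for both star and terrain calibration points, is a few lines of trigonometry (an east-north-up frame with azimuth measured clockwise from north is assumed here):

```python
import numpy as np

def azel_to_unit(az_deg, el_deg):
    """Convert azimuth (clockwise from north) and elevation to an
    east-north-up unit vector pointing from the camera toward the feature."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.cos(el) * np.sin(az),   # east component
                     np.cos(el) * np.cos(az),   # north component
                     np.sin(el)])               # up component
```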
Once enough calibration points have been collected, the reprojection error is minimized for the appropriate combination of intrinsic and extrinsic parameters using a Levenberg-Marquardt procedure (Levenberg 1944; Marquardt 1963). For a laboratory calibration, the standard OpenCV function “calibrateCamera” is used (Bradski and Kaehler 2000), but this lacks the ability to hold the camera location constant, which is needed for field based vicarious calibration, so those fits were performed using the lmfit python package (LMFIT 2022). Examples of the two different calibration approaches (laboratory and vicarious) are shown in Fig. 8. For the laboratory calibration, many views of the target pattern (an 8x13 40 mm chessboard pattern printed on A2 paper and affixed to a flat board; Fig. 8a, b) were acquired, and then processed to automatically identify the square corners with subpixel refinement. Once enough images are acquired to fill the camera’s field of view from multiple orientations (typically 40 to 50 images, Fig. 8c), the extrinsic parameters for each image acquisition and the single set of intrinsics common to them all are retrieved. The goodness of fit can then be inspected by plotting the calibration points in image space, coloured according to their reprojection error, to look for systematic error indicating a poor fit, and also to check that the spatial coverage is adequate (Fig. 8c). Two examples of vicarious calibration are also shown in Fig. 8: one where stars fill enough of the field of view to be used alone (Fig. 8d, e, f), and one where both terrain features and stars are needed to fill the frame (Fig. 8g, h, i). For the former, a single set of intrinsics and a single orientation are retrieved, but for the latter two orientations are required, one for the stars and one for the terrain features.
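A minimal sketch of the vicarious fit is shown below, using scipy.optimize.least_squares as a stand-in for the lmfit-based implementation. A level (zero pitch and roll), distortion-free pinhole camera is assumed so that only the focal length and yaw need fitting, here against synthetic observations:

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch: hold the camera location fixed and fit intrinsics plus orientation
# by minimizing reprojection error. Only (f, yaw) are fitted in this toy model.

def project(params, az, el, cx=960.0, cy=540.0):
    """Toy projection of azimuth/elevation features for a level pinhole camera."""
    f, yaw = params
    u = cx + f * np.tan(np.radians(az - yaw))
    v = cy - f * np.tan(np.radians(el))
    return np.column_stack([u, v])

def calibrate(az, el, uv_observed, x0=(800.0, 0.0)):
    """Least-squares fit of (f, yaw) to observed pixel coordinates."""
    resid = lambda p: (project(p, az, el) - uv_observed).ravel()
    return least_squares(resid, x0).x

# Synthetic check: observations generated with f = 1000 px, yaw = 5 degrees
az = np.array([0.0, 5.0, 10.0, 15.0])
el = np.array([2.0, 4.0, 1.0, 3.0])
uv = project((1000.0, 5.0), az, el)
f_fit, yaw_fit = calibrate(az, el, uv)
```

The real problem is the same shape, just with the full parameter set (focal lengths, principal point, distortion coefficients, and yaw, pitch and roll) and calibration points from stars and terrain rather than synthetic data.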
This is because the azimuth and elevation of the stars are defined with respect to true north and the local zenith while the terrain feature azimuth and elevation are defined relative to grid north and Z direction, and it is simpler to treat these separately.
The intrinsics of all calibrations performed on IMO and Civil Protection cameras to date are shown in Fig. 9. No Mobotix cameras were calibrated before deployment, so they had to be calibrated vicariously, and no star calibrations are available for Arduino cameras as they do not acquire low light images at night. The parameters for the same type of camera cluster together, and for the PiCam cameras, where comparison between star and laboratory calibration is possible, star calibration appears to perform well. However, the errors in the retrieved parameters for vicarious calibration are significantly higher than for laboratory calibration, as might be expected given the lower number of features available and the lower point accuracy on the images due to manual digitization errors and the lack of subpixel refinement for vicarious features. This can be seen more clearly in a plot of fractional uncertainty (Fig. 10), where the uncertainty in each parameter is plotted as a fraction of the estimated value to facilitate comparison between webcams with different geometries. The laboratory chessboard calibration consistently performs the best, with stars performing less well and the combination of stars and terrain performing the worst. The Root Mean Square (RMS) reprojection error for each calibration is also plotted, again normalized to facilitate comparison between different camera designs, this time by the width of the image, and again laboratory calibration is found to perform better than the vicarious techniques.
We can inspect how uncertainty in the camera parameters propagates into plume height measurements using a Monte Carlo approach. We select a pixel in the camera’s field of view which we wish to use for a plume height estimate, and draw a number of random camera calibrations from the probability distribution over the parameters defined by the calibration result and its uncertainty. For each sample, we find the unit vector in 3D space in the direction of that pixel, and we take the standard deviation over the z component of all these unit vectors to get the height uncertainty per unit distance from the camera. By repeating this for pixels across the image we can map out the variability in plume height uncertainty across the field of view, as shown in Fig. 11. Here we have split the error into contributions from intrinsic, orientation and digitization uncertainty (the latter being uncertainty in exactly which pixel is chosen for the height measurement). Note that the intrinsic and orientation contributions increase substantially towards the corners as a result of the strong image distortion in this particular camera being poorly constrained there.
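A sketch of this Monte Carlo propagation is given below; the toy distortion-free back-projection and the Gaussian distribution over the calibration parameters are illustrative assumptions:

```python
import numpy as np

def toy_pixel_to_ray(uv, params):
    """Toy back-projection for a level, distortion-free pinhole camera
    looking north: params = (f, cx, cy); returns an east-north-up unit ray."""
    f, cx, cy = params
    ray = np.array([uv[0] - cx, f, cy - uv[1]], float)
    return ray / np.linalg.norm(ray)

def height_uncertainty_per_m(pixel_uv, params_mean, params_cov, pixel_to_ray,
                             n_samples=2000, seed=0):
    """Sample calibrations from a Gaussian over the parameters, back-project
    the pixel, and take the std of the ray's vertical (up) component, i.e.
    the height uncertainty per unit distance from the camera."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(params_mean, params_cov, n_samples)
    z_up = np.array([pixel_to_ray(pixel_uv, p)[2] for p in samples])
    return z_up.std()
```

Repeating this over a grid of pixels, and with the covariance restricted to the intrinsic, orientation or digitization terms in turn, produces the per-contribution maps of Fig. 11.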
We can then calculate the height uncertainty at a particular distance by multiplying the height uncertainty per unit distance by the assumed distance to the plume in the pixel. Figure 12 shows this for all calibrated cameras, separated by calibration method. Additionally, for each camera we show the error for intrinsic parameters alone, intrinsic and digitization, and intrinsic, digitization and orientation. As before, the error increases from laboratory chessboard calibration, through star calibration, to star and terrain calibration, with error arising from the larger uncertainty in the intrinsics contributing overwhelmingly to the latter. However, despite star and terrain calibration giving the worst performance, it still gives a height uncertainty of between 300 and 400 m at 100 km, so the worst case scenario still provides useful data, while laboratory chessboard calibration gives the best performance of 200 to 250 m height uncertainty at 100 km. It should be noted that the orientation uncertainty was calculated using an assumed error in the yaw, pitch and roll angles of x degrees, which is a reasonable uncertainty for manual horizon matching using the web interface, as is discussed later. It should also be noted that these uncertainties are specified given a particular wind direction profile, because we do not have well defined errors for wind directions at significant elevations above the surface. In operational use the system provides the user with a number of profiles from different sources, which can be used to provide a “poor man’s ensemble” if an error estimate including model uncertainty is needed. In practice, the model whose guideline best fits the top of the plume is usually chosen.
Measuring the plume height
The height measurement is made from the images using a single page react.js application shown in Fig. 13. The system consists of static (i.e. always the same, not generated per request) .html and .js files that render the website, and .json files that store camera calibration and wind profile data. These files can be hosted on any basic file server and accessed through a web browser. The .json files are updated by the aforementioned Python script whenever triggered by the RabbitMQ messaging server, ensuring the list of available dates and times for each camera, and the vertical wind profiles, are kept up to date. In this way the system is near real time, with a delay of 5 to 13 minutes depending on the camera system (Fig. 13). The user selects the camera they wish to measure from, which updates a date picker, which in turn updates a list of available times in a dropdown box when a particular date is selected (Fig. 14a). Once a date and time are chosen the website fetches the appropriate image and wind velocity profile for that date, time and location. The user can then adjust the vent location and plume height (Fig. 14b), the latter of which is used to interpolate the wind direction from the wind velocity profile. The user can also adjust the camera orientation (Fig. 14c) if it is found to have drifted. The main display (Fig. 14d) shows the image with the digital horizon projected onto it to check for camera drift. Once the camera is correctly oriented in space, the location and plume height are adjusted such that the vertical yellow line is over the vent and the cyan line is at the desired level in the plume. The user also has the option to plot a point at some specific distance upwind or downwind of the vent, as well as to set the wind direction manually (Fig. 14c), and to save, plot and export measurements from multiple images and webcams (Fig. 15). Note the curved guidelines in Fig. 15b; these straight 3D lines are curved when projected onto the image due to the strong lens distortion, illustrating how important it is to accurately estimate the distortion coefficients. The uncertainty in the height measurement is calculated by finding the uncertainty per unit distance, interpolated from a reduced resolution grid (e.g. Fig. 11), for the pixel in the image at the intersection of the yellow and cyan lines, and multiplying it by the distance to that point in 3D space (i.e. the point at height h above the assumed vent). Additionally, ray projection can be used to constrain the location of an image feature to a line on the inset map, which can be useful for identifying the location of a vent in the initial stages of an eruption (Fig. 16).
Results
The system was used during the 2021 Fagradalsfjall eruption to track the height of the \(SO_2\)-rich volcanic plume. This eruption, which started on 19 March 2021 and lasted six months, was characterized by lava flows and the release of volcanic gases into the atmosphere. Its proximity to inhabited regions and its accessibility to hundreds of thousands of visitors made the hazard due to gas pollution the foremost priority. The Icelandic Meteorological Office responded to this volcanic crisis by setting up a dispersal modelling service for the production of daily forecasts of the transport of volcanic \(SO_2\) (Barsotti et al. 2023). The CALPUFF model used for this application requires input parameters such as the \(SO_2\) flux, the geometry of the vent, and meteorological data (Scire et al. 2000). The model simulates the rise of the volcanic mixture in the atmosphere and calculates the injection height at which dispersal takes place. Observations of plume heights are therefore very important to validate and confirm the quality of the model results. Indeed, incorrect injection heights would imply the use of incorrect wind speeds and directions in the transport equations and, consequently, incorrect dispersal patterns. During the eruption, large discrepancies between the observed and modelled plume heights required investigation and the tuning of model settings in order to improve its performance. Two cameras proved useful for this application: a laboratory calibrated ArduinoCam located at Slaga, 4 km SSW of the eruption, and a laboratory calibrated PiCam at Hvassahraun, 15 km NNE, with the former capturing higher plumes and the latter being useful for measuring low plumes (Fig. 17a). Note that height estimates from Slaga are more uncertain at higher altitudes because these lie close to the upper edge of the image, where lens distortion is strong and poorly modeled.
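When comparing observed and modelled plume heights, as during the model tuning described above, simple misfit statistics make the discrepancy concrete. The paired series below are invented for illustration; in practice the observed values would come from the calibrated webcams and the modelled ones from the plume-rise calculation feeding CALPUFF:

```python
import numpy as np

# Hypothetical paired injection-height time series (metres a.s.l.).
observed = np.array([350.0, 420.0, 500.0, 610.0, 480.0])
modelled = np.array([300.0, 500.0, 520.0, 700.0, 450.0])

def height_misfit(obs, mod):
    """Mean bias and RMSE of modelled injection heights vs observations.

    A positive bias means the model injects the plume too high, so the
    transport equations would sample winds from the wrong level.
    """
    diff = mod - obs
    return diff.mean(), np.sqrt((diff ** 2).mean())

bias, rmse = height_misfit(observed, modelled)
```

A persistent bias of even a few hundred metres can place the injected \(SO_2\) in a layer with a different wind direction, producing the wrong dispersal pattern downwind.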
The calibrated camera network also proved useful for measuring lava fountain heights while the fountain was visible over the rim of the crater, in the period 27 April to 28 June. Estimating the fountain heights became a key element in assessing the local hazard due to clast fallout. The lava fountain activity started on 27 April, and shortly afterwards clasts up to 5 cm in size were detected several hundreds of meters downwind from the vent. At that time tourists could still get very close to the eruption site, and clast fallout was witnessed by many people. A numerical model for ballistics was rapidly set up to define the area potentially exposed to clast fallout. The model was configured using the maximum observed fountain heights in order to evaluate the worst-case scenario. Eventually two areas of 500 m and 650 m radius were added to the danger zone map, which visitors were advised not to enter. The fountain height time series recorded by the vicariously calibrated Mobotix Meradalahnúkur camera is shown in Fig. 17b.
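As a first-order illustration of how a maximum fountain height translates into a danger radius (the operational ballistic model is more sophisticated, accounting for drag, clast size and topography), note that a clast reaching the fountain top must leave the vent at \(v_0 = \sqrt{2gh}\), and at a 45° launch angle the same speed carries it a horizontal distance \(v_0^2/g = 2h\) in the drag-free limit:

```python
import math

def drag_free_range(fountain_height_m, g=9.81):
    """Worst-case horizontal range of a clast, neglecting drag and relief.

    A clast that just reaches the fountain top of height h left the vent
    at v0 = sqrt(2*g*h); launched at 45 degrees, the drag-free range is
    v0**2 / g, which simplifies to 2*h.
    """
    v0 = math.sqrt(2.0 * g * fountain_height_m)
    return v0 ** 2 / g
```

Under these simplifying assumptions a 250 m fountain implies a danger radius of roughly 500 m, the same order as the zones adopted during the eruption; this is a back-of-the-envelope check, not the method used operationally.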
Discussion
The main limitation of this method of plume height measurement is the implicit assumption that the plume is compressed onto a vertical 2D plane extending from the vent in the direction of the wind. In reality, plumes are 3D objects with some finite width, and as a consequence most cameras will not see the top of the plume from their vantage point on the ground. As such, the “top” of the plume visible in the image will in reality be at some location on the side of the plume. The plume height measurement will therefore have a positive bias, negligible at a distance but growing the closer the camera is to the plume. The size of the bias is a function of plume geometry, with a wide, flat plume viewed from close up providing the most unfavorable geometry. The introduction of extra geometric constraints can go some way toward alleviating this effect; for instance, two cameras viewing a plume from approximately right angles can be used to create a minimum bounding box that must entirely contain the plume, by projecting the outlines of the plume in the images into two cones in 3D space and intersecting them. Alternatively, stereo rigs can be used to create 3D point clouds of the side of the plume facing each rig, and several rigs placed around a vent can be used to create a full 3D point cloud. Moderate to high resolution satellite data can also be used to align the guideline with the center of the plume, which aids plume height estimation when the wind direction varies with height. These approaches are currently under development at IMO.
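A minimal sketch of the two-view geometry underlying such multi-camera constraints is the triangulation of a single feature (e.g. the apparent plume top) from two viewing rays. The east-north-up convention, azimuth measured clockwise from north, and the closest-approach midpoint are illustrative choices; this is not the cone-intersection method itself:

```python
import numpy as np

def ray_from_angles(az_deg, el_deg):
    """Unit view ray from azimuth (clockwise from north) and elevation."""
    az, el = np.radians([az_deg, el_deg])
    return np.array([np.sin(az) * np.cos(el),   # east
                     np.cos(az) * np.cos(el),   # north
                     np.sin(el)])               # up

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two viewing rays.

    p1, p2 : camera positions; d1, d2 : unit view directions.
    Solves the two normal equations of the closest-approach problem.
    """
    r = p2 - p1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b          # ~0 only for near-parallel rays
    t = (d * c - b * e) / denom
    s = (b * d - a * e) / denom
    return ((p1 + t * d1) + (p2 + s * d2)) / 2.0
```

With two cameras at roughly right angles to the plume, the vertical component of the triangulated point gives a height estimate free of the single-view side-of-plume bias, at the cost of requiring the same feature to be identified in both images.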
Another limitation is that the system presented here is largely manual, in that identifying image features for vicarious calibration, horizon matching for finding the camera orientation, and measuring the plume height are all performed by hand. We deliberately decided to implement manual techniques first so we would always have them to fall back on should automatic systems fail during a crisis. A number of methods for automatic horizon matching (e.g. Baboud et al. 2011) have been proposed, and we are developing techniques tailored for our application.
Conclusions
The network of custom and commercial webcams maintained by the Icelandic Meteorological Office and the Department of Civil Protection and Emergency Management was successfully used to estimate \(SO_2\) plume and lava fountain heights during the 2021 Fagradalsfjall eruption. The custom-built webcams were calibrated in the laboratory by IMO, while the vicarious calibration techniques described in this paper allowed the network maintained by Civil Protection to be calibrated and integrated into the system after it had already been deployed. Accurate calibration allowed heights to be measured with accuracies on the order of tens to hundreds of meters depending on the distance of the camera, while the use of a message queue system to provide notifications allowed measurements to be made in near real time with a typical lag of only five to ten minutes. These height estimates were then used operationally for air quality forecasts and for delineating zones with a high risk of ballistics. At the time of writing the system is still operational and is being expanded to cover the volcanoes most likely to produce ashy eruptions in the future, Hekla, Grímsvötn and Bárðarbunga, with expansion planned for Katla, Öræfajökull and Askja. Height estimates of ash plumes are a critical parameter for ash dispersion models and for delivery by volcano observatories to Volcanic Ash Advisory Centres (Barsotti et al. 2022). Future work will focus on automating currently manual procedures such as star identification, horizon matching and plume height identification, as well as providing extra geometric constraints using multiple camera views and satellite data to better constrain the height.
Availability of data and materials
Data and code are available from the authors.
Abbreviations
- CMOS: Complementary metal-oxide-semiconductor
- FOV: Field of View
- GPRS: General Packet Radio Service
- ICAO: International Civil Aviation Organization
- IMO: Icelandic Meteorological Office (Veðurstofan Íslands)
- INGENMET: Instituto Geológico Minero y Metalúrgico
- INGV: Istituto Nazionale di Geofisica e Vulcanologia
- KVERT: Kamchatka Volcanic Eruption Response Team
- MER: Mass Eruption Rate
- NAME: Numerical Atmospheric-dispersion Modelling Environment
- NWP: Numerical Weather Prediction
- PVMBG: Pusat Vulkanologi dan Mitigasi Bencana Geologi
- REFIR: Real-time Eruption source parameters FutureVolc Information and Reconnaissance system
- SERNAGEOMIN: Servicio Nacional de Geología y Minería
- VAAC: Volcanic Ash Advisory Center
- VADMs: Volcanic Ash Dispersal Models
- VESPA: Volcanic Eruptive Source Parameter Assessment
References
Antuña-Sánchez JC, Román R, Bosch JL, Toledano C, Mateos D, González R, Cachorro V, de Frutos Á (2022) ORION software tool for the geometrical calibration of all-sky cameras. PLoS ONE 17(3):e0265959
Arason Þ, Barsotti S, de’Michieli Vitturi M, Jónsson S, Arngrímsson H, Bergsson B, Pfeffer MA, Petersen GN, Björnsson H (2017) Real-Time Estimation of Mass Eruption Rate and Ash Dispersion During Explosive Volcanism. In: International Association of Volcanology and Chemistry of the Earth’s Interior (IAVCEI), Scientific Assembly, Portland, Oregon, USA, 14-18 August 2017. http://www.hergilsey.is/arason/rit/2017/arason_etal_2017_iavcei_e.pdf. Accessed 9 Apr 2022
Arason P, Petersen G, Bjornsson H (2011) Observations of the altitude of the volcanic plume during the eruption of Eyjafjallajökull, April-May 2010. Earth Syst Sci Data 3(1):9–17
Baboud L, Čadík M, Eisemann E, Seidel HP (2011) Automatic photo-to-terrain alignment for the annotation of mountain pictures. In: CVPR 2011. IEEE, pp 41–48
Barsotti S, Parks MM, Pfeffer MA, Óladóttir BA, Barnie T, Titos MM, Jónsdóttir K, Pedersen GB, Hjartardóttir ÁR, Stefansdóttir G, et al (2023) The eruption in fagradalsfjall (2021, iceland): how the operational monitoring and the volcanic hazard assessment contributed to its safe access. Nat Hazards 116:3063–3092
Barsotti S, Witham C, Scollo S, Gurioli L, Donnadieu F (2022) The 2nd European VOs-VAACs workshop took place successfully in November 2021. IAVCEI News 1
Behncke B, Falsaperla S, Pecora E (2009) Complex magma dynamics at Mount Etna revealed by seismic, thermal, and volcanological data. J Geophys Res Solid Earth 114, B03211
Bradski G, Kaehler A (2000) The OpenCV Library. Dr Dobb's Journal of Software Tools 25(11):120
Calvari S, Intrieri E, Di Traglia F, Bonaccorso A, Casagli N, Cristaldi A (2016) Monitoring crater-wall collapse at active volcanoes: a study of the 12 January 2013 event at Stromboli. Bull Volcanol 78(5):1–16
Calvari S, Salerno G, Spampinato L, Gouhier M, La Spina A, Pecora E, Harris AJ, Labazuy P, Biale E, Boschi E (2011) An unloading foam model to constrain Etna’s 11–13 January 2011 lava fountaining episode. J Geophys Res Solid Earth 116, B11207
Dürig T, Gudmundsson MT, Dioguardi F, Woodhouse M, Björnsson H, Barsotti S, Witt T, Walter TR (2018) REFIR-A multi-parameter system for near real-time estimates of plume-height and mass eruption rate during explosive eruptions. J Volcanol Geotherm Res 360:61–83
ESDM (2022) MAGMA Indonesia. https://magma.esdm.go.id. Accessed 9 Apr 2022
Jones A, Thomson D, Hort M, Devenish B (2007) The UK Met Office’s next-generation atmospheric dispersion model, NAME III. Air pollution modeling and its application XVII. Springer, Berlin, pp 580–589
Klaus A, Bauer J, Karner K, Elbischger P, Perko R, Bischof H (2004) Camera calibration from a single night sky image. In: Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2004. CVPR 2004., vol 1. IEEE, p I–I
Levenberg K (1944) A method for the solution of certain non-linear problems in least squares. Q Appl Math 2(2):164–168
Lindegren L, Kovalevsky J, Hoeg E, Bastian U, Bernacca P, Crézé M, Donati F, Grenon M, Grewing M, Van Leeuwen F et al (1997) The HIPPARCOS catalogue. Astron Astrophys-A &A 323(1):49–52
LMFIT (2022) LMFIT: Non-Linear Least-Squares Minimization and Curve-Fitting for Python. https://lmfit.github.io/lmfit-py/. Accessed 9 Apr 2022
Lovick J, Lawlor O, Dean K, Dehn J (2008) Observation of volcanoes through webcams: Tools and techniques. AGU Fall Meeting Abstracts 2008:51–2019
Machacca-Puma R, Lesage P, Larose E, Lacroix P, Anccasi-Figueroa RM (2019) Detection of pre-eruptive seismic velocity variations at an andesitic volcano using ambient noise correlation on 3-component stations: Ubinas volcano, Peru, 2014. J Volcanol Geotherm Res 381:83–100
Marquardt DW (1963) An algorithm for least-squares estimation of nonlinear parameters. J Soc Ind Appl Math 11(2):431–441
Mastin LG (2014) Testing the accuracy of a 1-D volcanic plume model in estimating mass eruption rate. J Geophys Res Atmos 119(5):2474–2495
Mastin LG, Guffanti M, Servranckx R, Webley P, Barsotti S, Dean K, Durant A, Ewert JW, Neri A, Rose WI et al (2009) A multidisciplinary effort to assign realistic source parameters to models of volcanic ash-cloud transport and dispersion during eruptions. J Volcanol Geotherm Res 186(1–2):10–21
Melnikov D, Manevich A, Girina O (2018) Correlation of the satellite and video data for operative monitoring of volcanic activity in Kamchatka. 10th biennual workshop on Japan-Kamchatka-Alaska subduction processes, Petropavlovsk-Kamchatsky, Russia, 20th-26th August, 2018. http://repo.kscnet.ru/3336/1/MelnikovDV_101-82.pdf
Department of Civil Protection and Emergency Management (2022) Myndavélar Almannavarna. www.almannavarnir.is/eldgos/myndavelar/. Accessed 9 Apr 2022
OpenCV (2022) Camera Calibration and 3D Reconstruction. https://docs.opencv.org/4.x/d9/d0c/group__calib3d.html. Accessed 9 Apr 2022
Patrick MR, Kauahikaua JP, Antolik L (2010) MATLAB tools for improved characterization and quantification of volcanic incandescence in Webcam imagery: Applications at Kilauea Volcano. Hawaii. US Geol Surv Tech Methods 13(A1):1–16
Petersen G, Bjornsson H, Arason P (2012) The impact of the atmosphere on the Eyjafjallajökull 2010 eruption plume. J Geophys Res Atmos 117, D00U07
Poland MP, Dzurisin D, LaHusen RG, Major JJ, Lapcewich D, Endo ET, Gooding DJ, Schilling SP, Janda CG (1992) Remote camera observations of lava dome growth at Mount St. Helens, Washington, October 2004 to February 2006. Development 1980(86):225–236
RabbitMQ (2022) RabbitMQ. https://www.rabbitmq.com/. Accessed 9 Apr 2022
Rhodes B (2019) Skyfield: High precision research-grade positions for planets and Earth satellites generator. Astrophys Source Code Library, record ascl:1907.024. https://ascl.net/
Risacher D, Craig I (2022) Sunwait. https://github.com/risacher/sunwait. Accessed 9 Apr 2022
Scire JS, Strimaitis DG, Yamartino RJ, et al (2000) A user’s guide for the CALPUFF dispersion model. Earth Tech Inc 521:1–521
Scollo S, Prestifilippo M, Pecora E, Corradini S, Merucci L, Spata G, Coltelli M (2014) Eruption column height estimation of the 2011–2013 Etna lava fountains. Ann Geophys 57(2):214–214
Snedigar S, Cameron C, Nye C (2006) The Alaska Volcano Observatory Website a Tool for Information Management and Dissemination. AGU Fall Meeting Abstracts 2006:51–1695
Sparks RSJ, Bursik M, Carey S, Gilbert J, Glaze L, Sigurdsson H, Woods A (1997) Volcanic plumes. Wiley, Hoboken
Tupper A, Kinoshita K, Kanagaki C, Iino N, Kamada Y (2003) Observations of volcanic cloud heights and ash-atmosphere interactions. WMO/ICAO Third International Workshop on Volcanic Ash, Toulouse, France, September 29th to October 3rd, 2003. https://pages.mtu.edu/~raman/papers2/Tupperetal.pdf
Valade S, Harris AJ, Cerminara M (2014) Plume Ascent Tracker: Interactive Matlab software for analysis of ascending plumes in image data. Comput Geosci 66:132–144
Acknowledgements
The authors would like to acknowledge the help of Icelandic Search and Rescue who helped facilitate safe access to the eruption and moved instruments when lava threatened them.
Funding
This research was funded by the International Civil Aviation Organization (ICAO) under the Joint Finance Agreement with the Icelandic Meteorological Office, for working paper JS.212.WP.2054.
Author information
Authors and Affiliations
Contributions
Conceptualization, S.B., M.A.P., S.M., E.M.S., T.H., B.B., þ.A. and T.B.; methodology, T.B., T.H. and E.M.S.; software, T.B., T.H. and B.B.; hardware development, installation and operation, T.H., B.B., S.K.P., þ.I., B.O. and V.S.þ.; investigation, T.B., M.T. and E.M.S.; resources, T.H., B.B., S.K.P., þ.I., V.S.þ. and B.O.; data curation, T.H., B.O., M.T.; writing—original draft preparation, T.B.; writing—review and editing, T.B., S.B., M.A.P., M.T., þ.A., T.H., S.v.L.o.M.; visualization, T.B.; project administration, S.B.; funding acquisition, S.B., M.A.P., S.M., E.M.S. All authors have read and agreed to the published version of the manuscript.
Corresponding author
Ethics declarations
Ethics approval and consent to participate
Not applicable.
Consent for publication
All authors give their consent for publication.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
About this article
Cite this article
Barnie, T., Hjörvar, T., Titos, M. et al. Volcanic plume height monitoring using calibrated web cameras at the Icelandic Meteorological Office: system overview and first application during the 2021 Fagradalsfjall eruption. J Appl. Volcanol. 12, 4 (2023). https://doi.org/10.1186/s13617-023-00130-9
Received:
Accepted:
Published:
DOI: https://doi.org/10.1186/s13617-023-00130-9