Published: Jul 2017
Authors
  • Richard Hill, QPS bv
  • Frans Nijsen, QPS bv
Categories
  • Hydrographic Sonar software

Innovative Hydrography: delivering a workflow for Acquisition, Processing, Visualization and Sharing for all modalities of today’s shallow water multibeam echosounders

Today’s shallow water multibeam echosounders are capable of efficiently delivering bathymetry, backscatter and water column data types. To benefit from this technology, Hydrographers must adjust their data collection and data processing workflows to deliver detailed and accurate information effectively to a wider variety of End Users.

Issues usually arise when software products from different vendors are incorporated into a single, non-seamless workflow from acquisition through data delivery. This can result in an accumulation of human error, which in turn leads to inaccurate final products and/or poor decisions with undesirable consequences. Researchers and engineers at Quality Positioning Systems (QPS) have, wherever possible, automated the mundane, human-error-prone tasks, and the QPS workflow offering guides the user through the steps needed to get from sensor data to processed deliverables. This workflow removes redundancy and capitalizes on advanced computing technology to provide a dynamic multidimensional user interface that allows even those with a low knowledge threshold to deliver high-quality final products.

Introduction

While software generally keeps pace with advancements in hardware and processing methods, many frustrations persist because human operators must connect all the pieces together to arrive at a final processed solution (Beaudoin, 2016). A paradigm shift is currently underway to isolate and minimize human error in the modern-day hydrographic workflow while maximizing advancements in computing technology to automate the mundane, error-prone tasks.

Uncertainty

To obtain a single, accurate sounding on the seafloor, one must account for error sources such as: position (XY), draft, squat, load, tide, geoid model, bathymetric depth, node offsets, timing offsets, speed, gyro heading, vessel motion (HPR), mounting offsets, beam range, beam angle, beam width, beam steering, sound velocity at the transducer head, and the sound velocity profile. Within the integrated hydrographic survey system, these uncertainties accumulate into what is known as the Total Propagated Uncertainty, or TPU. The TPU of a sounding is a measure of the accuracy to be expected for that point when all uncertainty sources are considered. The TPU can be computed statistically using the well-known Law of Uncertainty Propagation, hence the indication “Propagated”; the value results from the combination of all contributing uncertainties, hence the indication “Uncertainty”.
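
The Law of Uncertainty Propagation reduces, for independent error sources, to summing variances. As a minimal sketch, assuming independent, zero-mean contributions and using purely hypothetical component values (not taken from this article), the vertical TPU of a sounding could be combined as follows:

```python
import math

# Hypothetical 1-sigma vertical uncertainty contributions (metres) for a
# single sounding; the values are illustrative only.
components = {
    "tide": 0.05,
    "heave": 0.05,
    "draft_and_load": 0.03,
    "sound_velocity": 0.08,
    "beam_range": 0.04,
}

# Law of uncertainty propagation for independent error sources:
# the combined variance is the sum of the component variances.
vertical_tpu = math.sqrt(sum(sigma ** 2 for sigma in components.values()))

print(f"Vertical TPU: {vertical_tpu:.3f} m (1-sigma)")
# Survey specifications are often quoted at 95% confidence (~1.96 sigma)
print(f"At 95% confidence: {1.96 * vertical_tpu:.3f} m")
```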

What is not considered is the “human uncertainty” in any given workflow. Quality controls are imposed to minimize error; however, errors still occur, and their cumulative total results in what is now referred to as “human induced TPU”. During data acquisition, we may utilize survey logs to note line names, positions, start/stop times, features (shoals, wrecks) and unique observations pertaining to any survey line. During data processing, we incorporate check lists to ensure the proper application of such things as tides, sound velocity, post-processed heave, etc. The common denominator between survey logs and check lists is that they require manual (analog) input with cognitive feedback to ensure quality. Unlike systematic TPU, which can be calculated, human TPU is generally unpredictable (a function of each individual operator) and difficult to measure. Typical areas for human TPU include:

— Transcription: There are three different coordinate frame conventions that may exist within a single installation, spread across a minimum of eight software interfaces. An error transcribing a vessel configuration from the logging solution to the processing solution results in large, very cumbersome errors (Figure 1); see the sketch after this list.

— Importing ancillary data and then failing to apply it to the correct data files

— Changing processing configurations and then failing to complete the appropriate reprocessing of the data to account for the new settings
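
To make the transcription hazard concrete, here is a hypothetical sketch (the offsets and conventions are illustrative, not taken from Figure 1) of a lever arm recorded in an x-forward, y-starboard, z-down convention being pasted unchanged into a package that expects x-starboard, y-forward, z-up:

```python
import numpy as np

# Hypothetical GNSS antenna lever arm in a logging package that uses
# x-forward, y-starboard, z-down (metres); antenna 15 m above the reference.
offset_logging = np.array([5.0, 2.0, -15.0])

# Correct transcription into a processing package that uses
# x-starboard, y-forward, z-up: swap x/y and negate z.
offset_correct = np.array([offset_logging[1], offset_logging[0], -offset_logging[2]])

# A naive copy-paste keeps the numbers but not the convention.
offset_transcribed = offset_logging.copy()

error = np.linalg.norm(offset_correct - offset_transcribed)
print(f"Resulting lever-arm error: {error:.2f} m")  # roughly 30 m here
```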

Minimizing TPU, both systematic and human, represents a logical evolution in the Acquire, Process, Visualize and Share workflow.

Guiding the User

Real-Time Improvements

Standard hydrographic software solutions are governed by a menu-driven Graphical User Interface (GUI) that references files structured through a “folder” organization scheme residing on the computer hard drive. In the year 2000, Quality Positioning Systems (QPS) introduced the guided workflow to their QINSy software to assist users in the setup process. The process includes a guided “Wizard Based Setup” for project preparation, including automated folder creation; template database creation, including guided geodesy, vessel setup and hardware interfacing; and online preparation, including data recording setup, on-the-fly data filtering, and calibrations and field checks.

A key objective for QINSy is to save time in post-processing and to avoid the possible need for re-survey. By providing tools for real-time QA/QC and on-the-fly correction for sensor offsets, attitude, sound velocity refraction, data flagging and tide/height, DTM points are derived as the survey proceeds. The multi-layered sounding grid shown in the QINSy displays is populated with corrected DTM points on the fly, giving the operator a complete view of what has been surveyed. Advanced surface shading and DTM cell attributes such as the “95% confidence level” and “hit count” are some of the tools that promote real-time QA/QC of collected data. A design DTM and/or previous survey allows real-time monitoring of DTM differences.
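
As an illustration, the minimal sketch below bins synthetic soundings into grid cells and computes a hit count and a 95% confidence value per cell; the confidence definition used here (1.96 standard errors of the cell mean) is one common convention and is not claimed to be QINSy's implementation:

```python
import numpy as np

def grid_cell_stats(x, y, z, cell_size):
    """Bin soundings into grid cells and compute per-cell QC attributes."""
    ix = np.floor(x / cell_size).astype(int)
    iy = np.floor(y / cell_size).astype(int)
    stats = {}
    for key in set(zip(ix, iy)):
        mask = (ix == key[0]) & (iy == key[1])
        depths = z[mask]
        n = depths.size
        # 95% confidence of the cell mean, assuming normal errors.
        ci95 = 1.96 * depths.std(ddof=1) / np.sqrt(n) if n > 1 else float("nan")
        stats[key] = {"hit_count": n, "mean_depth": depths.mean(), "ci95": ci95}
    return stats

# Usage with synthetic soundings: 500 points over a 10 m x 10 m patch.
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 10, 500), rng.uniform(0, 10, 500)
z = 20 + 0.1 * rng.standard_normal(500)
print(list(grid_cell_stats(x, y, z, cell_size=2.0).items())[0])
```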

Paradigm Shift

Despite the collection of high-quality data, producing a high-quality product is still hard. Mistakes still happen, sometimes at great cost, despite safeguards put in place to minimize or eliminate them. Even with improved field procedures, the hydrographic workflow remains complex because it requires a human to connect all the pieces to produce a final product. Following suit with the QINSy model, QPS has extended the guided workflow into the processing portion of the workflow. The solution is known as Qimera and represents a paradigm shift in hydrographic processing. Qimera incorporates the QINSy hydrographic algorithms with the Fledermaus 4D visualization capability, and its optimised file/data handling methods provide users with a seamless, dynamic and pleasant work environment. It performs complete hydrographic processing for most modern sonar formats (.db, .all, .s7k, .hsx, .jsf, .gsf, etc.), supports many ancillary formats (SBET, POSPac, most tide and SVP formats) and exports to a variety of industry standard exchange formats (GSF, FAU, BAG, Arc and other image formats).

The Qimera paradigm shift is the reduction of human induced TPU. This is accomplished through the automation of mundane and error-prone tasks (transcription automation and processing state management) in order to isolate the stages for which the hydrographer is best suited, such as data validation, processing configuration management and trouble-shooting. To accomplish this, two types of workflows are incorporated: Guided and Dynamic. The Guided workflow allows non-expert users to arrive at typical bathymetric deliverables with little training or expert knowledge. The Dynamic workflow is processing state management, which codifies and manages the relationships between the observations and the results. You do not need to remember what processing needs to be done, only that “some” processing must be done. Everybody, regardless of knowledge or experience, should be able to reach the end of a post-processing session with a data deliverable.
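
One way to picture processing state management is as a dependency graph in which changing any input marks everything downstream as stale. The sketch below is purely illustrative of that idea and does not reflect Qimera's internals:

```python
class Node:
    """A processing product that knows what depends on it."""

    def __init__(self, name):
        self.name = name
        self.dirty = False
        self.dependents = []

    def depends_on(self, *parents):
        for p in parents:
            p.dependents.append(self)

    def mark_dirty(self):
        # Changing this node invalidates every downstream product.
        self.dirty = True
        for child in self.dependents:
            child.mark_dirty()

raw = Node("raw sonar")
svp = Node("sound velocity profile")
soundings = Node("ray-traced soundings")
surface = Node("dynamic surface")
soundings.depends_on(raw, svp)
surface.depends_on(soundings)

svp.mark_dirty()  # the user swaps in a new SVP...
print([n.name for n in (raw, svp, soundings, surface) if n.dirty])
# ['sound velocity profile', 'ray-traced soundings', 'dynamic surface']
```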

Using what’s already there

The Guided workflow within Qimera is a series of prompts that step the user from one stage to the next. The innovation, however, lies in the background. Upon opening Qimera, a minimum of seven functions are automated. For example, creating a project establishes the file structure, organizes it by file type (processed, grid, image, SVP, tide, SBET) and determines where all raw and soon-to-be processed data will be stored. Most modern file formats (.db, .all, .s7k, .gsf) contain most, if not all, of the information required for processing, as set up prior to and during acquisition. Qimera uses this available information to guide the user, eliminating the need for manual check lists. File import searches each format for the information it contains (range, angle, motion, dynamic heave, SV, SVP, etc.), catalogues data availability, transcribes all vessel configuration information and the processing configuration, and then performs the initial processing, such as ray tracing based on the raw sonar data (bathymetry and ancillary). Following this initial automated processing, Qimera prompts you for how you wish to create your surface: resolution, CUBE settings and colour map are presented in a simplified interface. Within a minimum number of mouse clicks, and in the shortest time possible, you have a map view of your data and are ready to use one or more of Qimera's data editing tools to clean and validate your data, apply SBETs, or edit and validate both the assembled bathymetric data and the ancillary data used to calculate them.
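
The flavour of this setup automation can be sketched as follows; the folder names and the format table are assumptions made for illustration, not Qimera's actual project layout:

```python
from pathlib import Path

# Illustrative folder structure and format table; not Qimera's real layout.
SUBFOLDERS = ["processed", "grid", "image", "svp", "tide", "sbet", "raw"]
KNOWN_FORMATS = {".db": "QINSy", ".all": "Kongsberg", ".s7k": "Reson",
                 ".gsf": "Generic Sensor Format"}

def create_project(root: Path, raw_files: list[Path]) -> dict:
    """Create the standard folder tree and catalogue raw files by format."""
    for sub in SUBFOLDERS:
        (root / sub).mkdir(parents=True, exist_ok=True)
    catalogue = {}
    for f in raw_files:
        fmt = KNOWN_FORMATS.get(f.suffix.lower())
        if fmt:  # only ingest formats we can decode
            catalogue[f.name] = fmt
    return catalogue

print(create_project(Path("demo_project"),
                     [Path("line_001.all"), Path("line_002.s7k")]))
```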

Data validation consists of creating a loop between the DTM surface and the calculated point cloud results to expose errors immediately. This is referred to as “live” processing state management. In Qimera it is very easy to make processing configuration adjustments or to perform data validation and to immediately assess the impact of any changes. Near-immediate feedback shortens the time between cause and effect while promoting causal reasoning, a key ingredient for natural cognitive evolution. In short, it allows users to train themselves through something as simple as immediately recalculating the dynamic surface, which shows the user that they have accidentally deleted good data or applied the wrong SVP, etc. (Figure 3).
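
A toy version of that live loop, using made-up data structures, is sketched below: an edit flags soundings as rejected and only the touched grid cells are recomputed, so the feedback is immediate:

```python
import numpy as np

# id: (cell_ix, cell_iy, depth); two good soundings plus a shallow flier.
soundings = {1: (0, 0, 20.1), 2: (0, 0, 20.2), 3: (0, 0, 3.5)}
rejected = set()

def cell_depth(cell):
    """Mean depth of the accepted soundings in one grid cell."""
    zs = [z for sid, (ix, iy, z) in soundings.items()
          if (ix, iy) == cell and sid not in rejected]
    return float(np.mean(zs)) if zs else None

def reject_and_update(sids):
    """Reject soundings, then re-grid only the cells they touched."""
    rejected.update(sids)
    touched = {soundings[s][:2] for s in sids}
    return {cell: cell_depth(cell) for cell in touched}

print(cell_depth((0, 0)))      # biased by the 3.5 m flier: ~14.6 m
print(reject_and_update({3}))  # {(0, 0): ~20.15} -- instant feedback
```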

Scalability: Collaboration and Production Line Processing

QPS provides workflows within Qimera that scale to multiple users (a team of data processors) contributing to an overall processing effort through Cooperative Cleaning and Production Line Processing. Cooperative Cleaning allows multiple users to clean a large project by dividing it into smaller, more manageable sub-projects while maintaining complete data integrity. Data processors work within their sub-projects and, once complete, merge their efforts back into the main project. In parallel, ancillary data processing (e.g. SBET, SVP, height corrections) can be completed in the main project. The edits from the sub-projects are incorporated back into the main project without impacting the overall progress of the project, enormously increasing efficiency.
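
In outline, with a deliberately simplified flag dictionary standing in for the real edit records, the merge step might look like this:

```python
# Main project's edit record: sounding id -> "accepted" / "rejected".
main_flags = {}

def merge_subproject(main, sub_edits):
    """Fold a sub-project's edits into the main project.

    Refuse to merge if two processors edited the same soundings, which
    preserves data integrity in this simplified model.
    """
    overlap = set(main) & set(sub_edits)
    if overlap:
        raise ValueError(f"conflicting edits for soundings: {overlap}")
    main.update(sub_edits)

merge_subproject(main_flags, {101: "rejected", 102: "accepted"})  # processor A
merge_subproject(main_flags, {201: "rejected"})                   # processor B
print(main_flags)
```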

Production Line Processing, on the other hand, allows projects to be broken down into stages. These stages can be based on survey days, survey segments, survey vessels, etc. The processing for a stage (e.g. a day, vessel or segment) of data is handled in its own isolated processing project. The processed outputs from this effort are then aggregated into a master project, where they are evaluated by the senior hydrographer and compared to other stages, for either approval or further cleaning, either separately or within the master project. This is done repeatedly as stages are combined within the master project. If problems are identified, the stage project can be re-opened, corrections applied, and the results re-introduced into the main project. The net result is the integration of multiple smaller projects, processed in exactly the same manner, into a final deliverable.

Backscatter and Water Column

Data Modalities

Seafloor mapping is all about creating a complete picture of the seabed — the morphology, sedimentology and biology — and interpreting the results to create thematic maps of distinct habitats that can guide marine policy, management and resource utilization. The Fledermaus Geocoder Toolbox (FMGT) is designed to visualize and analyze backscatter data from MBES and, to a lesser degree, SSS sensors. In processing the source files into mosaics, it performs as many sonar data corrections as possible to maximize the information content of the backscatter signal. The software can read multiple files of backscatter data, apply corrections, and then create a 2D representation of the ocean floor called a backscatter mosaic. Once the mosaic has been generated, various statistics can be calculated and exported in a number of formats, along with the mosaic backscatter values. Angle Range Analysis (ARA) can also be performed to attempt to classify seabed types. All of the processing stages of FMGT are multi-core aware, to maximize data throughput and minimize any reprocessing required by changes in the desired output mosaic resolution or in the number of data files.
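
One correction that matters greatly for mosaic quality is removing the angular response of the sonar/seabed geometry, so the mosaic reflects seabed character rather than beam angle. The sketch below implements a simplified empirical version (estimate a mean response per angle bin and subtract it); this is a toy model for illustration, not Geocoder's algorithm:

```python
import numpy as np

def angle_normalise(bs_db, angles_deg, n_bins=30):
    """Flatten the angular response of backscatter samples (dB)."""
    bins = np.linspace(angles_deg.min(), angles_deg.max(), n_bins + 1)
    idx = np.clip(np.digitize(angles_deg, bins) - 1, 0, n_bins - 1)
    # Mean backscatter per angle bin = empirical angular response.
    response = np.array([bs_db[idx == b].mean() if np.any(idx == b) else 0.0
                         for b in range(n_bins)])
    # Subtract the response, then restore the overall mean level.
    return bs_db - response[idx] + bs_db.mean()

# Synthetic data whose level falls off with incidence angle.
rng = np.random.default_rng(1)
angles = rng.uniform(-60, 60, 10000)
bs = -20 - 0.1 * np.abs(angles) + rng.standard_normal(10000)
flat = angle_normalise(bs, angles)
print(round(np.corrcoef(np.abs(angles), flat)[0, 1], 3))  # near zero
```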

FMMidwater can rapidly extract relevant water column features from a range of sonar file formats. Typically, raw sonar files are first converted to a Generic Water Column (GWC) format for use in further processing and visualization. A simple graphical user interface is used to perform threshold filtering on a number of key parameters to help with feature extraction. FMMidwater also provides multiple views of the water column features and allows easy export to a variety of Fledermaus visualization objects and exchange files. Being able to show echosounder bathymetry and water column data interactively in the same Fledermaus 4D scene significantly aids the subsequent interpretation of seabed survey results during hydrographic surveys for charting purposes. Identifying the shoalest depth has often proved tricky to achieve, with a bar sweep typically being the preferred methodology. In recent years, results derived from water column data have been accepted as an alternative, and far more cost-efficient and safe, way of determining the shoalest depth. The most recent developments for the Midwater utility have been in the field of semi-automated seep detection (Dr. Tom Weber, CCOM UNH) for oil and gas projects. This research has not gone unnoticed by the dredging community, who are increasingly interested in making full use of the data modalities available from today's echosounders in order to identify different seabed types and to research the visualization of dredge plumes.
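
The threshold-filtering idea can be sketched in a few lines; the array layout and values below are invented for illustration and do not represent the GWC format:

```python
import numpy as np

def shoalest_target(amplitude_db, sample_depths_m, threshold_db=-30.0):
    """Return the depth of the shoalest echo above an amplitude threshold."""
    beams, samples = np.where(amplitude_db > threshold_db)
    if beams.size == 0:
        return None
    return sample_depths_m[samples].min()

rng = np.random.default_rng(2)
wc = rng.uniform(-60, -40, (256, 500))   # background reverberation (dB)
wc[120, 80] = -15.0                      # a strong mid-water target (a mast?)
depths = np.linspace(0.0, 25.0, 500)     # depth of each range sample (m)
print(shoalest_target(wc, depths))       # ~4.0 m
```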

Very Rapid Electronic Chart Production

“The Port of Rotterdam ENC and IENC production is coming close to perfection thanks to the use of the latest versions of ArcGIS, QPS QINSy Processing and QPS Qarto.”

Thanks to cooperation between Esri, QPS and the Port of Rotterdam, the port's ENC production process has been improved through the following steps:

• ENC and IENC production from the ArcGIS product library

• The ArcGIS S-57 validation utility

• The QINSy Processing generalize DTM utility

• Qarto depth contours and depth areas with quality and performance improvements

• Qarto date-of-survey areas (M_SREL) auto-populated from DTM metadata

• The Qarto S-58 validation utility

Starting from an up-to-date GIS database, a must for a world-class port, the Maritime and Inland ENCs are produced within minutes for each ENC cell. The Qarto workflow is completed within just a few mouse clicks: it takes the GIS-exported ENC (base cell) and integrates it with the depth model from QINSy Processing to make available user-defined depth contours, depth areas and spot soundings.

Maximise port availability and goods throughput

All ports aim to maximize availability for vessels under all circumstances and to increase the throughput of goods by maximizing vessel draft as much as possible. The two principal ENC consumers at the Port of Rotterdam are the Port Authority and the Marine Pilots group. Both groups use the information, in different formats, to assist in the safe navigation of ships with marginal under keel clearance (UKC). The use of High Density ENCs, or BENCs, by marine pilots is a critical factor in this part of the operation, as these show exactly where it is and is not safe to navigate a ship, taking the vessel draft and the real-time water level into account. Any ship that is limited to the fairway by draft has to call in to the Port Authority at least 48 hours ahead. When the ship calls in, the Harbour Master's Harbour Coordination Center (HCC) checks the fairway and berth depths using ArcGIS Maritime Chart Server (MCS). In the MCS user interface, the HCC officer can enter the ship's draft, UKC and the tide level; the safety contour is then derived and shown automatically in MCS. During this time, the pilot updates his Portable Pilot Unit (PPU) with the same ENCs as MCS to prepare the ship's transit to the berthing location.
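
The arithmetic behind that MCS query can be sketched as follows, with illustrative values; the function and contour list here are hypothetical, not the MCS API:

```python
def safety_contour(draft_m, ukc_m, tide_m, contour_intervals_m):
    """Pick the shallowest charted contour that still guarantees the UKC.

    A charted depth is safe when charted depth + tide >= draft + UKC,
    i.e. charted depth >= draft + UKC - tide.
    """
    required_depth = draft_m + ukc_m - tide_m
    candidates = [c for c in sorted(contour_intervals_m) if c >= required_depth]
    return candidates[0] if candidates else None

# 10 cm contour intervals, as in the Port of Rotterdam usage band 6 BENCs.
contours = [round(14 + 0.1 * i, 1) for i in range(60)]   # 14.0 .. 19.9 m
print(safety_contour(draft_m=15.0, ukc_m=1.0, tide_m=0.8,
                     contour_intervals_m=contours))      # 15.2 m
```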

The Port of Rotterdam produces in total some 300 usage band 5 and usage band 6 ENCs. The usage band 6 charts all have 10 cm interval depth contours, making them BENCs, or High Density ENCs. Based on the hydrographic surveys done in the port, new editions of the charts are selected and produced overnight. In other words, what was surveyed yesterday is available as a BENC today, in use by the pilots and the harbour masters, and potentially even by the captains (with dispensation) of the ferry services that run daily schedules to and from the port.

Summary

This article demonstrates how today's shallow water multibeam echosounders are capable of efficiently delivering bathymetry, backscatter and water column data types. Accordingly, to benefit from this technology, Hydrographers must adjust their data collection and data processing workflows to deliver detailed and accurate information effectively to a wider variety of End Users.

The hydrographic workflow offered by QPS has evolved to provide a dynamic multidimensional user interface that allows even those with a low knowledge threshold to make good decisions that lead to high-end final products. The critical component is the isolation of tasks within the workflow, capitalizing on advances in computing technology to automate the mundane, error-prone tasks and to bring more value to the stages in which the human brain adds value. Through its products QINSy, Qimera, Fledermaus and Qarto, QPS innovates the user experience through several key design features: the guided workflow, transcription automation, processing state management, real-time QA, the dynamic workflow for validation, cooperative cleaning and production line processing. All of these reduce human error and the QA burden in general, and lower the entry knowledge barrier. For the hydrographic manager, the return on investment is found in a demonstrated ability to collect once and use many times, in lower training costs due to the guided workflow (easier to learn and retain knowledge), in improved processing outcomes, in easy scalability, in reduced post-processing times and, finally, in better results.

References

Malzone, Chris and Beaudoin, Jonathan, "Evolving Ocean Mapping: Developing a Seamless Workflow for Acquisition, Processing, Visualization and Sharing of Hydrographic Based Data," June 2017.

Van Reenen, Jeroen and Raines, Caitlyn, "The Port of Rotterdam: A Modern Hydrographic Workflow," U.S. Hydrographic Conference (US Hydro) Proceedings, 2015.

Beaudoin, Jonathan and Doucet, Moe, "Advances in Hydrographic Data Processing: Time for a Paradigm Shift," U.S. Hydrographic Conference (US Hydro), Galveston, TX, USA, March 2017.

Calder, Brian R., "How to Run CUBE (with the Baseline CCOM/JHC Implementation)," Internal Report, University of New Hampshire (UNH), Center for Coastal and Ocean Mapping (CCOM)/Joint Hydrographic Center (JHC), 2003.

Calder, Brian R. and Wells, David, "CUBE User's Manual (Version 1.13)," University of New Hampshire (UNH), Center for Coastal and Ocean Mapping (CCOM)/Joint Hydrographic Center (JHC), 2007.

Fonseca, Luciano and Calder, Brian R., "Geocoder: An Efficient Backscatter Map Constructor," U.S. Hydrographic Conference (US Hydro), San Diego, CA, USA, 2005.
