An Approach to Seabed Classification from Multi-beam Bathymetric Sonar Data

© April 2001

algorithm specifications

Network architecture
     • Training and Classification

In seabed classification, the number of different acoustic classes mapped during a survey is generally very limited. A system capable of discriminating 10 bottom types in any particular survey would be satisfactory for all practical purposes. With only a small number of classes, it is possible to employ one associative memory for each individual class.

An echo return is then presented to each of these associative memories, and each response merely indicates whether or not the return is similar to the returns that memory was trained with. The votes from all memories are then passed to a fuzzy decision rule, which ultimately assigns the echo's class membership and a confidence level for that decision. Figure 4 shows a schematic diagram of the classification network.
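The exact form of the fuzzy decision rule is not specified here. As an illustration only, the following Python sketch combines the memory responses for one echo by normalising them into fuzzy memberships and taking the margin between the two largest memberships as the confidence; the function name classify_echo and this particular rule are assumptions, not part of the algorithm as implemented.

    import numpy as np

    def classify_echo(responses):
        """Combine the votes of the per-class associative memories for one echo.

        `responses` maps each class name to the (non-negative) similarity score
        returned by that class's memory.  Memberships are obtained here by
        normalising the scores, and the confidence is the margin between the
        two largest memberships; the actual fuzzy rule may differ.
        """
        names = list(responses)
        scores = np.array([responses[name] for name in names], dtype=float)
        total = scores.sum()
        if total <= 0.0:
            return None, 0.0                      # no memory recognised this return
        memberships = scores / total              # fuzzy class memberships
        order = np.argsort(memberships)[::-1]     # classes sorted by membership
        runner_up = memberships[order[1]] if len(order) > 1 else 0.0
        confidence = float(memberships[order[0]] - runner_up)
        return names[order[0]], confidence

    # Example: votes from three class memories for one beam return
    label, conf = classify_echo({"sand": 0.82, "mud": 0.15, "gravel": 0.08})
    print(label, round(conf, 2))                  # sand 0.64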



Training and Classification

Loading an associative memory does not require a large number of training samples.

Figure 4: Schematic of classification algorithm (Sample → Associative Memories → Fuzzy Classifier → Class and Confidence-Level Assignment)

In the example given below, about 300 beams were used in the training sets, but fewer than 100 are actually required to load each class memory. Associative memories differ from other types of artificial neural networks in that, for example, the training procedure involves no iterative error back-propagation. Presenting the memory with a new training sample results in an update process quite similar to the subspace-tracking methods employed in adaptive array processing. The computational complexity is roughly O(N²), where N = p × q, with p and q being the dimensions of the time-frequency image submitted for each beam.
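The internal structure of the class memories is not detailed in this report. The sketch below assumes a simple correlation-matrix (outer-product) associative memory over the flattened p × q time-frequency image; it is meant only to illustrate why loading or querying the memory with a single beam costs roughly O(N²) with N = p × q, not to reproduce the actual implementation.

    import numpy as np

    class ClassMemory:
        """Illustrative associative memory for one acoustic class.

        The memory is an N x N correlation matrix built from flattened,
        unit-normalised p x q time-frequency images, so loading one beam
        (a rank-one outer-product update) and computing one response are
        both roughly O(N^2) operations, with N = p * q.
        """

        def __init__(self, p, q):
            self.n = p * q
            self.matrix = np.zeros((self.n, self.n))   # correlation (memory) matrix
            self.count = 0                              # number of beams loaded

        def _vectorise(self, tf_image):
            x = np.asarray(tf_image, dtype=float).ravel()
            return x / (np.linalg.norm(x) + 1e-12)      # unit-norm flattened image

        def train(self, tf_image):
            """Load one training beam into the memory (no back-propagation)."""
            x = self._vectorise(tf_image)
            self.matrix += np.outer(x, x)               # O(N^2) rank-one update
            self.count += 1

        def response(self, tf_image):
            """Average squared correlation of a new beam with the stored beams."""
            x = self._vectorise(tf_image)
            return float(x @ self.matrix @ x) / max(self.count, 1)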

The classification phase differs from training mainly in that no update is performed on the memories. It is, however, possible to interleave classification and training in order to implement a form of reinforcement learning: samples presented for classification that qualify with high confidence for a particular class could be used to update and reinforce the memory associated with that class. The effects of memory reinforcement have not been investigated at this time.
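A possible interleaving of classification and training, building on the ClassMemory and classify_echo sketches above, is outlined below; the confidence threshold is an assumed parameter, since no value is specified here for what constitutes high confidence.

    def classify_and_reinforce(tf_image, memories, threshold=0.9):
        """Classify one beam and, if the decision is confident, reinforce its class.

        `memories` maps class names to ClassMemory instances as sketched above;
        `threshold` is an assumed confidence level above which the sample is fed
        back into the winning memory (the report gives no value for it).
        """
        responses = {name: mem.response(tf_image) for name, mem in memories.items()}
        label, confidence = classify_echo(responses)    # fuzzy rule sketched earlier
        if label is not None and confidence >= threshold:
            memories[label].train(tf_image)             # reinforce the winning memory
        return label, confidence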

The architecture of the classifier presented in this report is inherently parallel, and real-time classification would be feasible on a multi-processor system. On a single-processor computer equipped with an AMD K6-400 running Linux, classification of a single beam return takes about 200 ms. Updating a class memory with a single beam return during the training phase can take up to 2 seconds on the same computer.

algorithm example