© 2000 - 2011 LIN - Leibniz Institut für Neurobiologie Magdeburg



 Multilevel Hypermap Architecture

Structure of the Multilevel Hypermap Architecture

The Multilevel Hypermap Architecture (MHA) belongs to the family of self-organizing maps and is an extension of the Hypermap introduced by Kohonen.

Instead of the two levels proposed in the Hypermap, the data level and the context level, the MHA supports several levels of data relationship. The MHA is trained with the different levels of the input vector, whose representation is a hierarchy of encapsulated subsets of units, the so-called clusters and subclusters, which define different generalized stages of classification.
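One way this multilevel winner search could look is sketched below: each level of the input vector narrows the candidate set of units (cluster, then subcluster, then unit), generalizing the Hypermap's context/data split to several levels. This is an illustrative sketch, not the published MHA algorithm; the function name, the radius-based narrowing, and the Euclidean distance are assumptions.

```python
import numpy as np

def mha_winner(level_weights, level_inputs, radii):
    """Sketch of a multilevel winner search (assumed, not the published
    algorithm): each level restricts the candidate units before the
    final winner is chosen on the deepest level."""
    candidates = np.arange(level_weights[0].shape[0])
    for W, x, r in zip(level_weights, level_inputs, radii):
        # distance of the remaining candidates to this level's input part
        d = np.linalg.norm(W[candidates] - x, axis=1)
        within = candidates[d <= r]
        # narrow to units matching this level; fall back to the nearest one
        candidates = within if within.size else candidates[[np.argmin(d)]]
    # final winner: smallest distance on the deepest (data) level
    d = np.linalg.norm(level_weights[-1][candidates] - level_inputs[-1], axis=1)
    return candidates[np.argmin(d)]
```

In this sketch the radii play the role of the cluster boundaries: a large radius on a level effectively disables the restriction for that level.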

The MHA makes it possible to analyze structured or hierarchical data, i.e.

  • data with priorities, e.g. projection of hierarchical data structures in databases
  • data with context (databases, associative memories)
  • time series, e.g. speech, moving objects

One advantage of the MHA is that it supports both the classification of data and the projection of its structure in one unified map. The resulting hierarchy has some redundancy, as in biological systems.

In previous years several real-world applications of the MHA have been reported in the literature. Besides a system for speech processing and recognition, these include an implementation of the Modified Hypermap Architecture for the classification of image objects in moving scenes, and an application analyzing fMRI images of auditory cortex activity obtained during acoustic stimulation (see list of publications).

With the new hypothetical learning algorithm the MHA has an advantage in learning and representing weak data relationships. Furthermore, the MHA is able to find these relationships by itself, without their being presented explicitly in the input training data. This behavior is also useful when the MHA operates as an associative memory.
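The associative-memory behavior mentioned above can be illustrated with a minimal sketch: only the higher (context) level of a pattern is presented, the winner is found on that level alone, and the data-level weights of the winner are read back as the recalled content. The two-level restriction and the function name are assumptions for illustration, not the published method.

```python
import numpy as np

def associative_recall(weights_ctx, weights_data, x_ctx):
    """Recall sketch (assumed): present only the context part of a
    pattern and return the data-level weights of the matching unit."""
    # winner search restricted to the context level
    winner = np.argmin(np.linalg.norm(weights_ctx - x_ctx, axis=1))
    # the data-level weight vector acts as the recalled association
    return weights_data[winner]
```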

last update: 2010-02-24