
Adaptive High Order Neural Trees for Pattern Recognition

Gian Luca Foresti, Christian Micheloni, Lauro Snidaro
2002

Abstract

In this paper, a new classifier, called the adaptive high-order neural tree (AHNT), is proposed for pattern recognition applications. It is a hierarchical multi-level neural network in which the nodes are organized into a tree topology. It successively partitions the training set into subsets, assigning each subset to a different child node. Each node can be a first-order or a high-order perceptron (HOP), according to the complexity of the local training set. First-order perceptrons split the training set with hyperplanes, while n-order perceptrons use n-dimensional surfaces. An adaptive procedure decides the best order of the HOP to apply at a given node of the tree. The AHNT is grown automatically during the learning phase: its hybrid structure reduces the number of internal nodes with respect to classical neural trees and achieves greater generalization capability. Moreover, it overcomes the classical problems of feed-forward neural networks (e.g., multilayer perceptrons), since neither type of perceptron requires any a priori information about the number of neurons, hidden layers, or neuron connections. Tests on patterns with different distributions, and comparisons with classical neural-tree-based classifiers, demonstrate the validity of the proposed method.
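The record contains no pseudocode, but the growth procedure the abstract describes can be sketched. The following is a minimal, illustrative Python sketch under stated assumptions: the HOP is modeled as a perceptron trained on a polynomial feature expansion, labels are binary in {-1, +1}, and "best order" is taken to be the lowest order that separates the node's local subset. All names (`expand`, `HOP`, `AHNTNode`) are hypothetical and do not come from the paper.

```python
import numpy as np
from itertools import combinations_with_replacement

def expand(X, order):
    """Polynomial feature map: all monomials up to the given degree (incl. cross terms)."""
    cols = [np.ones(len(X))]
    for d in range(1, order + 1):
        for idx in combinations_with_replacement(range(X.shape[1]), d):
            cols.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(cols)

class HOP:
    """High-order perceptron: a perceptron trained on polynomial features."""
    def __init__(self, order, epochs=500, lr=0.1):
        self.order, self.epochs, self.lr = order, epochs, lr

    def fit(self, X, y):                      # y holds labels in {-1, +1}
        Phi = expand(X, self.order)
        self.w = np.zeros(Phi.shape[1])
        for _ in range(self.epochs):
            mistakes = 0
            for phi, t in zip(Phi, y):
                if t * (phi @ self.w) <= 0:   # misclassified: perceptron update
                    self.w += self.lr * t * phi
                    mistakes += 1
            if mistakes == 0:                 # subset separated at this order
                break
        return self

    def predict(self, X):
        return np.where(expand(X, self.order) @ self.w >= 0, 1, -1)

class AHNTNode:
    """One tree node: fits the lowest-order HOP that separates its local subset;
    otherwise partitions the subset by the HOP's decision and grows children."""
    def __init__(self, max_order=3, max_depth=8):
        self.max_order, self.max_depth = max_order, max_depth
        self.children, self.label = {}, None

    def fit(self, X, y, depth=0):
        if len(np.unique(y)) == 1 or depth >= self.max_depth:
            vals, counts = np.unique(y, return_counts=True)
            self.label = int(vals[np.argmax(counts)])   # pure subset (or cap): leaf
            return self
        for order in range(1, self.max_order + 1):      # adaptive order selection
            self.hop = HOP(order).fit(X, y)
            pred = self.hop.predict(X)
            if np.all(pred == y):                       # this order suffices: stop growing
                return self
        for side in (-1, 1):                            # split by the node's decision
            mask = pred == side
            if mask.any() and not np.all(y[mask] == side):
                self.children[side] = AHNTNode(self.max_order, self.max_depth).fit(
                    X[mask], y[mask], depth + 1)
        return self

    def predict_one(self, x):
        if self.label is not None:
            return self.label
        side = int(self.hop.predict(x[None, :])[0])
        child = self.children.get(side)
        return child.predict_one(x) if child is not None else side

# XOR is not linearly separable, so a first-order node cannot stop growth here;
# a second-order HOP (via the x1*x2 cross term) separates it in a single node.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])
tree = AHNTNode(max_order=3).fit(X, y)
```

In this sketch a node becomes a leaf as soon as some order separates its subset; the actual AHNT's order-selection and splitting criteria may differ — the code only mirrors the behaviour the abstract describes.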

Use this identifier to cite or link to this document: http://hdl.handle.net/11390/738078

Citations
  • Scopus: 2
  • ISI: 3