SVM Learning with the Schur-Hadamard Inner Product

Abstract

We apply support vector learning to attributed graphs, where the kernel matrices are based on approximations of the Schur-Hadamard inner product. Evaluating the Schur-Hadamard inner product for a pair of graphs requires determining an optimal match between their nodes and edges; it is therefore approximated efficiently by means of recurrent neural networks. The optimal mapping involved allows a direct interpretation of the similarity or dissimilarity of the two graphs under consideration. We present and discuss experimental results for different classifiers constructed by SVMs operating on positive semi-definite (psd) and non-psd kernel matrices.
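To illustrate the kind of kernel construction the abstract describes, the following minimal Python sketch computes a Schur-Hadamard-style inner product between two equal-sized attributed graphs (encoded simply as weighted adjacency matrices) by brute-force search over node permutations, instead of the recurrent-network approximation used in the paper, and feeds the resulting, possibly non-psd, kernel matrix to an SVM. The graph encoding, function names, and toy data are illustrative assumptions, not the authors' implementation.

```python
import itertools
import numpy as np
from sklearn.svm import SVC

def sh_inner_product(A, B):
    """Schur-Hadamard inner product of two attributed graphs given as
    equal-sized weighted adjacency matrices A and B.

    The optimal node correspondence is found here by exhaustive search over
    permutations (feasible only for very small graphs); the paper instead
    approximates this maximization with a recurrent neural network."""
    n = A.shape[0]
    best = -np.inf
    for perm in itertools.permutations(range(n)):
        P = np.eye(n)[list(perm)]          # permutation matrix for this matching
        # Sum of elementwise (Schur-Hadamard) products after aligning B with A.
        score = np.sum(A * (P @ B @ P.T))
        best = max(best, score)
    return best

def kernel_matrix(graphs):
    """Pairwise SH kernel matrix; in general this matrix need not be psd."""
    n = len(graphs)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            K[i, j] = K[j, i] = sh_inner_product(graphs[i], graphs[j])
    return K

# Toy example: ten random 4-node attributed graphs with arbitrary labels.
rng = np.random.default_rng(0)
graphs = [rng.random((4, 4)) for _ in range(10)]
labels = [0] * 5 + [1] * 5

K = kernel_matrix(graphs)
clf = SVC(kernel="precomputed").fit(K, labels)
print(clf.predict(K))
```

The SVM is trained on the precomputed kernel matrix directly; since the SH kernel is not guaranteed to be psd, the optimization problem solved by the SVM may be indefinite, which is exactly the situation examined experimentally in the paper.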

Authors:
Brijnesh Jain, Peter Geibel, Fritz Wysotzki
Category:
Journal
Year:
2005
Location:
Neurocomputing, 64, pp. 93-105