  • Print publication year: 2004
  • Online publication date: March 2011

2 - Kernel methods: an overview

from Part I - Basic concepts
Summary

In Chapter 1 we gave a general overview of pattern analysis. We identified three properties that we expect of a pattern analysis algorithm: computational efficiency, robustness and statistical stability. Motivated by the observation that recoding the data can increase the ease with which patterns can be identified, we will now outline the kernel methods approach to be adopted in this book. This approach to pattern analysis first embeds the data in a suitable feature space, and then uses algorithms based on linear algebra, geometry and statistics to discover patterns in the embedded data.
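
As a rough illustration of this two-step structure (a minimal sketch in Python, not code from the book; the quadratic map phi, the toy circular data and all names below are assumptions chosen for the example), the following embeds two-dimensional points with an explicit feature map and then fits an ordinary linear model in the embedded space:

    import numpy as np

    def phi(x):
        """Hypothetical quadratic feature map: embeds a 2-D point in R^3.
        A relation that is non-linear in the input space (e.g. a circle)
        becomes linear in these coordinates."""
        x1, x2 = x
        return np.array([x1 ** 2, np.sqrt(2) * x1 * x2, x2 ** 2])

    # Toy data: points labelled by whether they lie inside the unit circle,
    # a pattern that is not linear in (x1, x2) but is linear in phi(x).
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.5, 1.5, size=(200, 2))
    y = (X[:, 0] ** 2 + X[:, 1] ** 2 <= 1.0).astype(float)

    # Step 1: embed the data in the feature space.
    Phi = np.array([phi(x) for x in X])
    Phi1 = np.hstack([Phi, np.ones((len(X), 1))])   # add a bias column

    # Step 2: apply a standard linear method (least-squares regression
    # onto the labels) in the feature space.
    w, *_ = np.linalg.lstsq(Phi1, y, rcond=None)

    # The linear pattern w in feature space corresponds to a circular,
    # i.e. non-linear, pattern in the original input space.
    pred = (Phi1 @ w > 0.5).astype(float)
    print("training accuracy:", (pred == y).mean())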

The current chapter will elucidate the different components of the approach by working through a simple example task in detail. The aim is to demonstrate all of the key components and hence provide a framework for the material covered in later chapters.

Any kernel methods solution comprises two parts: a module that performs the mapping into the embedding or feature space and a learning algorithm designed to discover linear patterns in that space. There are two main reasons why this approach should work. First of all, detecting linear relations has been the focus of much research in statistics and machine learning for decades, and the resulting algorithms are both well understood and efficient. Secondly, we will see that there is a computational shortcut which makes it possible to represent linear patterns efficiently in high-dimensional spaces to ensure adequate representational power. The shortcut is what we call a kernel function.
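
The nature of this shortcut can be sketched with the same hypothetical quadratic map used above: the inner product between two embedded points can be computed directly from the original inputs by a kernel function, without ever constructing the feature vectors. This is an illustrative sketch, not the book's code; the homogeneous polynomial kernel of degree two, k(x, z) = <x, z>^2, is one standard example of such a kernel.

    import numpy as np

    def phi(x):
        """Same hypothetical quadratic feature map as above (R^2 -> R^3)."""
        x1, x2 = x
        return np.array([x1 ** 2, np.sqrt(2) * x1 * x2, x2 ** 2])

    def kernel(x, z):
        """Polynomial kernel of degree 2: k(x, z) = <x, z>^2.
        Equals <phi(x), phi(z)> without forming phi explicitly."""
        return float(np.dot(x, z)) ** 2

    x = np.array([1.0, 2.0])
    z = np.array([3.0, -1.0])

    explicit = float(np.dot(phi(x), phi(z)))   # inner product in feature space
    implicit = kernel(x, z)                    # computed directly in input space
    print(explicit, implicit)                  # both equal (x . z)^2 = 1.0

For higher-degree or higher-dimensional feature maps the explicit embedding becomes expensive or impossible to form, while the kernel evaluation remains a single inner-product computation in the input space; this is the sense in which the kernel function provides the computational shortcut.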

Kernel Methods for Pattern Analysis
  • Online ISBN: 9780511809682
  • Book DOI: https://doi.org/10.1017/CBO9780511809682