
7 - Implementation Techniques

Published online by Cambridge University Press:  05 March 2013

Nello Cristianini, University of London
John Shawe-Taylor, Royal Holloway, University of London

Summary

In the previous chapter we showed how the training of a Support Vector Machine can be reduced to maximising a convex quadratic form subject to linear constraints. Such convex quadratic programmes have no local maxima and their solution can always be found efficiently. Furthermore this dual representation of the problem showed how the training could be successfully effected even in very high dimensional feature spaces. The problem of minimising differentiable functions of many variables has been widely studied, especially in the convex case, and most of the standard approaches can be directly applied to SVM training. However, in many cases specific techniques have been developed to exploit particular features of this problem. For example, the large size of the training sets typically used in applications is a formidable obstacle to a direct use of standard techniques, since just storing the kernel matrix requires a memory space that grows quadratically with the sample size, and hence exceeds hundreds of megabytes even when the sample size is just a few thousand points.
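The quadratic memory growth mentioned above is easy to make concrete. The following sketch (function name is ours, chosen for illustration) computes the storage required for a dense kernel matrix of double-precision entries:

```python
def kernel_matrix_megabytes(n, bytes_per_entry=8):
    """Storage in megabytes for a dense n x n kernel matrix.

    The footprint grows quadratically with the sample size n,
    assuming double-precision (8-byte) entries.
    """
    return n * n * bytes_per_entry / 2**20

# A few thousand points already exceed hundreds of megabytes:
print(kernel_matrix_megabytes(10000))  # roughly 763 MB
```

This is why the algorithms discussed in this chapter avoid holding the full kernel matrix in memory, recomputing or caching kernel evaluations instead.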

Such considerations have driven the design of specific algorithms for Support Vector Machines that can exploit the sparseness of the solution, the convexity of the optimisation problem, and the implicit mapping into feature space. All of these features help to create remarkable computational efficiency. The elegant mathematical characterisation of the solutions can be further exploited to provide stopping criteria and decomposition procedures for very large datasets.
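One concrete instance of such a characterisation-based stopping criterion is checking the Karush-Kuhn-Tucker (KKT) conditions of the dual problem: at the optimum, α_i = 0 implies y_i f(x_i) ≥ 1, 0 < α_i < C implies y_i f(x_i) = 1, and α_i = C implies y_i f(x_i) ≤ 1. A sketch of such a check (function name and tolerance handling are ours, for illustration):

```python
import numpy as np

def kkt_violations(alpha, y, f, C, tol=1e-3):
    """Count training points whose KKT conditions are violated beyond tol.

    alpha : dual variables, shape (n,)
    y     : labels in {-1, +1}, shape (n,)
    f     : decision values f(x_i) = sum_j alpha_j y_j K(x_j, x_i) + b
    C     : upper bound of the box constraint
    """
    margins = y * f
    violated = np.zeros(len(alpha), dtype=bool)
    # alpha_i = 0      =>  y_i f(x_i) >= 1
    violated |= (alpha < tol) & (margins < 1 - tol)
    # 0 < alpha_i < C  =>  y_i f(x_i) == 1
    violated |= (alpha > tol) & (alpha < C - tol) & (np.abs(margins - 1) > tol)
    # alpha_i = C      =>  y_i f(x_i) <= 1
    violated |= (alpha > C - tol) & (margins > 1 + tol)
    return int(violated.sum())
```

Training can stop once no point violates these conditions beyond the tolerance; the same test also guides decomposition methods in selecting which variables to optimise next.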

In this chapter we will briefly review some of the most common approaches before describing in detail one particular algorithm, Sequential Minimal Optimisation (SMO), which has the additional advantage of being not only one of the most competitive but also simple to implement. As an exhaustive discussion of optimisation algorithms is not possible here, a number of pointers to the relevant literature and on-line software are provided in Section 7.8.
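To give a flavour of SMO's simplicity, the sketch below trains a linear-kernel SVM dual with a simplified variant: the first multiplier is chosen by scanning for KKT violations and the second is picked at random, rather than by the full selection heuristics. Function and parameter names are ours, and this is a teaching sketch rather than a production implementation.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-3, max_passes=10, seed=0):
    """Simplified SMO for the SVM dual with a linear kernel.

    Repeatedly picks a pair of multipliers (alpha_i, alpha_j), solves
    the two-variable subproblem analytically, and clips the result to
    the box [0, C]. Returns the dual variables alpha and threshold b.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    K = X @ X.T                        # linear kernel matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            if (y[i] * Ei < -tol and alpha[i] < C) or \
               (y[i] * Ei > tol and alpha[i] > 0):
                j = rng.integers(n - 1)
                if j >= i:
                    j += 1             # random j != i
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # feasible interval for alpha_j on the constraint line
                if y[i] != y[j]:
                    L, H = max(0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]   # 2nd derivative
                if eta >= 0:
                    continue
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                # keep the equality constraint sum_i alpha_i y_i = 0
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # update the threshold b from whichever point is interior
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b
```

Because each two-variable subproblem is solved in closed form, the inner step needs no numerical QP library at all; the full algorithm and its working-set heuristics are described in the remainder of the chapter.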

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2000


  • Implementation Techniques
  • Nello Cristianini, University of London, John Shawe-Taylor, Royal Holloway, University of London
  • Book: An Introduction to Support Vector Machines and Other Kernel-based Learning Methods
  • Online publication: 05 March 2013
  • Chapter DOI: https://doi.org/10.1017/CBO9780511801389.009