Filling a gap in the literature, this book explores the theory of gradient flows of convex functionals in metric measure spaces, with an emphasis on weak solutions. It is largely self-contained and assumes only a basic understanding of functional analysis and partial differential equations. With appendices on convex analysis and the basics of analysis in metric spaces, it provides a clear introduction to the topic for graduate students and non-specialist researchers, and a useful reference for anyone working in analysis and PDEs. The text focuses on several key recent developments and advances in the field, paying careful attention to technical detail. These include how to use a first-order differential structure to construct weak solutions to the p-Laplacian evolution equation and the total variation flow in metric spaces, how to show an Euler–Lagrange characterisation of least gradient functions in this setting, and how to study metric counterparts of Cheeger problems.
• To understand the working principle of support vector machine (SVM).
• To comprehend the rules for identifying the correct hyperplane.
• To understand the concepts of support vectors, maximum margin, and positive and negative hyperplanes.
• To apply an SVM classifier to linear and non-linear datasets.
• To understand the process of mapping data points to higher dimensional space.
• To comprehend the working principle of the SVM Kernel.
• To highlight the applications of SVM.
10.1 Support Vector Machines
Support vector machines (SVMs) are supervised machine learning (ML) models used to solve regression and classification problems; however, they are most widely used for classification. The main goal of an SVM is to segregate the n-dimensional space into labels or classes by defining a decision boundary, or hyperplane. In this chapter, we shall explore SVMs for solving classification problems.
10.1.1 SVM Working Principle
SVM Working Principle | Parteek Bhatia, https://youtu.be/UhzBKrIKPyE
To understand the working principle of the SVM classifier, we will take a standard ML problem where we want a machine to distinguish between a peach and an apple based on their size and color.
Let us suppose the size of the fruit is represented on the X-axis and the color of the fruit on the Y-axis. The distribution of the apple and peach dataset is shown in Figure 10.1.
To build this classifier, we must provide the machine with a sample stock of fruits, labeling each fruit in the stock as an “apple” or a “peach”. Suppose we have a labeled dataset of some 100 fruits with the corresponding labels, i.e., “apple” or “peach”. When this data is fed into a machine, it will analyze these fruits and train itself. Once training is completed, if a new fruit comes into the stock, the machine will classify whether it is an “apple” or a “peach”.
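A minimal sketch of this train-and-predict workflow, using scikit-learn's SVC with a linear kernel; the fruit measurements and the (size, color) feature encoding below are made-up illustrative values, not data from this book:

```python
# Apple-vs-peach toy classifier: size on one axis, color on the other,
# mirroring the setup of Figure 10.1. Values are illustrative only.
import numpy as np
from sklearn.svm import SVC

# Hypothetical training set: each row is (size in cm, color from
# 0 = yellow to 1 = red).
X = np.array([
    [7.0, 0.90], [7.5, 0.80], [8.0, 0.85],   # labeled apples
    [5.0, 0.30], [5.5, 0.20], [6.0, 0.25],   # labeled peaches
])
y = np.array(["apple", "apple", "apple", "peach", "peach", "peach"])

clf = SVC(kernel="linear")   # a linear hyperplane suffices here
clf.fit(X, y)

# With a linear kernel, the separating hyperplane w . x + b = 0 is
# exposed through coef_ and intercept_.
print("w =", clf.coef_[0], "b =", clf.intercept_[0])

# Classify a new, unseen fruit.
print(clf.predict([[6.8, 0.75]]))   # expected: ['apple']
```

The fitted hyperplane read off from coef_ and intercept_ is the decision boundary that separates the two classes in the feature space.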
Most traditional ML algorithms would learn by observing the perfect apples and perfect peaches in the stock, i.e., they would train themselves on the ideal apples (apples that are very typical of apples in size and color) and the perfect peaches (peaches that are very typical of peaches in size and color). These standard samples are likely to be found at the heart of the stock. The heart of the stock is shown in Figure 10.2.
When words and phrases combine into whole clauses, the manner in which they combine can result in quite different meanings and in structural ambiguities that lead to lawsuits. Chapter 4 considers an “attachment ambiguity” in a false advertising suit, as in WE ARE PROVIDING GREAT REFRESHMENTS DURING THE SUPER BOWL AT CANDLESTICK PARK. The refreshments were not actually offered onsite at Candlestick Park, but at some distance from it. The ambiguity involves whether AT CANDLESTICK PARK modifies the verb, ‘We are providing at Candlestick Park great refreshments during the Super Bowl,’ or the noun phrase THE SUPER BOWL, in which case it is logically asserted only that the Super Bowl takes place there. In this latter attachment, questions of contextual inference and real-world knowledge then come into play and can still “implicate” that the services are also provided at Candlestick Park. More complex syntax, involving subordinate clauses and phrases that follow SUBJECT TO, resulted in a dispute over when, or whether at all, the prior condition for sale specified in a real-estate contract needed to be met. The subordinating conjunction ALTHOUGH and its “implicatures” were at the center of a libel suit in which unprofessional conduct by a lawyer and drinking to excess were alleged.
Chapter V turns to the image, first discussing earlier art-historical approaches to narrative in the image within ancient Near Eastern studies. It then introduces W. Mitchell’s notion of the text-image dialectic and Schriftbildlichkeit (pictorial notation). It makes a case for a semiotic approach, informed by Gottfried Boehm’s notion of the deixis of the image, to justify a narrative reading of the image in the process of reception. This narrative reading is informed by the interpictoriality of the image, which is anchored in a stream of tradition, as well as by its intermediality with mythic narratives and ritual performance.
• To define machine learning (ML) and discuss its applications.
• To learn the differences between traditional programming and ML.
• To understand the importance of labeled and unlabeled data and their various uses in ML.
• To understand the working principles of supervised, unsupervised, and reinforcement learning.
• To understand the key terms like data science, data mining, artificial intelligence, and deep learning.
1.1 Introduction
In today’s data-driven world, information flows through the digital landscape like an untapped river of potential. Within this vast data stream lies the key to unlocking a new era of discovery and innovation. Machine learning (ML), a revolutionary field, acts as the gateway to this wealth of opportunities. With its ability to uncover patterns, generate predictive insights, and adapt to evolving information, ML has transformed industries, redefined technology, and opened the door to limitless possibilities. This book is your guide to the fascinating realm of ML—a journey that empowers you to harness the power of data, enabling you to build intelligent systems, make informed decisions, and explore the boundless possibilities of the digital age.
ML has emerged as the dominant approach for solving problems in the modern world, and its wide-ranging applications have made it an integral part of our lives. From search engines to social networking sites, everything is powered by ML algorithms. Your favorite search engine uses ML algorithms to fetch you the most relevant search results. Smart home assistants like Alexa and Siri use ML to serve us better. ML influences our day-to-day activities so pervasively that we often do not even realize it. Online shopping sites like Amazon, Flipkart, and Myntra use ML to recommend products. Facebook uses ML to curate our feeds. Netflix and YouTube use ML to recommend videos based on our interests.
Data is growing exponentially with the Internet and smartphones, and ML has made this data more usable and meaningful. Social media, entertainment, travel, mining, medicine, bioinformatics: almost any field you could name uses ML in some form.
To understand the role of ML in the modern world, let us first discuss the applications of ML.
The Channel Islands (Jersey, Guernsey, Alderney and Sark) are situated off the coast of Normandy (France), west of the Cotentin peninsula. A brief look at a map shows that, from a geographical point of view, they are much closer to France than to England. As the original language in these islands is a form of Norman French, they have traditionally been regarded in dialectology as a French-speaking area. However, the exclusive interest of traditional dialectology in Channel Islands French is not an adequate reflection of the current linguistic situation. Today, English is clearly the dominant language in the Channel Islands. The number of speakers of Norman French is rather small and steadily decreasing. Over the past 200 years, English has gained more and more influence and has gradually replaced the local Norman French dialects. Indeed, there are clear indications that they will become extinct in the not-too-distant future.
This chapter investigates the continuum which exists between vernacular speech and standard language and examines various issues which arise in this area. Key to the continuum of speech in any Western-style society is the notion of a supraregional variety which, on the one hand, embodies sufficient vernacular features to fulfil the identity function of language but, on the other hand, does not contain features which are stigmatised in a speech community. Supraregional varieties are dynamic entities and are thus subject to language variation and change. Such varieties are only occasionally explicitly codified. However, speakers in any speech community will be aware of stigmatised and non-stigmatised features (with regard to accepted usage in more formal situations) and can move along the continuum of relative vernacularity in given contexts.
• To understand the concept of artificial neural network (ANN).
• To comprehend the working of the human brain as an inspiration for the development of neural networks.
• To understand the mapping of human brain neurons to an ANN.
• To understand the working of ANN with case studies.
• To understand the role of weights in building an ANN.
• To perform forward and backward propagation to train the neural networks.
• To understand different activation functions like the threshold function, sigmoid function, rectified linear unit function, and hyperbolic tangent function (see the code sketch after this list).
• To find the optimized values of the weights that minimize the cost function by using the gradient descent approach and the stochastic gradient descent algorithm.
• To understand the concept of the mini-batch method.
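As a preview of the activation functions and the gradient descent idea named above, here is a minimal NumPy sketch; the definitions are the standard textbook ones, and the toy cost function and learning rate are illustrative assumptions, not code from this chapter:

```python
# Standard activation functions, vectorized over NumPy arrays.
import numpy as np

def threshold(z):   # step function: 1 if z >= 0, else 0
    return np.where(z >= 0, 1.0, 0.0)

def sigmoid(z):     # squashes z into the interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):        # rectified linear unit: max(0, z)
    return np.maximum(0.0, z)

def tanh(z):        # hyperbolic tangent: squashes z into (-1, 1)
    return np.tanh(z)

z = np.linspace(-2.0, 2.0, 5)
print(threshold(z), sigmoid(z), relu(z), tanh(z), sep="\n")

# Gradient descent on a toy cost J(w) = (w - 3)^2, whose gradient is
# 2 * (w - 3); the learning rate 0.1 is an arbitrary illustrative choice.
w, lr = 0.0, 0.1
for _ in range(50):
    w -= lr * 2.0 * (w - 3.0)   # step opposite the gradient
print(round(w, 4))              # converges toward the minimizer w = 3
```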
16.1 Introduction to Artificial Neural Network
Neural networks and deep learning are the buzzwords of modern-day computer science. If you think they are recent entrants to the field, you probably have a misconception: neural networks have been around for quite some time, and they have only now started picking up, making a huge positive impact on computer science.
Artificial neural networks (ANNs) were invented in the 1960s and 1970s. They became part of common tech talk, and people started thinking that this machine learning (ML) technique would solve all the complex problems that were challenging researchers at the time. But soon, over the next decade, those hopes and expectations died off.
The decline could not be attributed to loopholes in neural networks themselves; the major reason was the technology of the day. It was not up to the standard needed to facilitate neural networks, which require a lot of data for training and huge computational resources for building the model. At that time, both data and computing power were scarce. Hence, neural networks remained on paper rather than taking center stage in solving real-world problems.
Later, at the beginning of the 21st century, storage techniques improved considerably, reducing the cost per gigabyte of storage, and humanity witnessed a huge rise in big data due to the Internet boom and smartphones.
A long-established rationale for progressive taxation is that all taxpayers should be subjected to an equal sacrifice of utility in paying taxes. The term “equal sacrifice” may be interpreted in three different ways. According to equal absolute sacrifice, everybody gives up the same amount of utility while paying taxes, so that the difference between the utility of before-tax income and the utility of after-tax income is the same across taxpayers (Mill, 1989). Thus, the idea implicit in the notion of equal absolute sacrifice is that the social evaluation function is of the symmetric utilitarian type. That is why the equal absolute sacrifice rule is often referred to as the “utilitarian” form of equal sacrifice (see Lambert and Naughton, 2009).
The equal sacrifice rule of taxation has sometimes been interpreted as sacrifice proportional to one's wellbeing (Cohen Stuart, 1889; Seligman, 1894). In other words, under equal proportional sacrifice everybody should give up the same percentage of utility while paying taxes. Evidently, given the positivity of utilities, equal proportional sacrifice is the same as equal absolute sacrifice in the logarithms of utilities. In this case there is a tacit assumption that the social evaluation function is of the symmetric Cobb–Douglas category, so that under a logarithmic transformation of utilities it becomes the symmetric utilitarian sum of the logarithms of utilities. Finally, the equal marginal sacrifice principle claims that the total tax burden should be divided among the taxpayers such that their after-tax marginal utilities are equalized.
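Assuming a common utility function U of income, and writing x_i for taxpayer i's before-tax income and t_i for the tax paid, the three notions can be summarized as follows; this is a compact restatement of the definitions above, not the cited authors' own notation:

```latex
% Equal absolute sacrifice (Mill): every taxpayer loses the same utility.
U(x_i) - U(x_i - t_i) = c \quad \text{for all } i.

% Equal proportional sacrifice (Cohen Stuart; Seligman): every taxpayer
% gives up the same fraction of utility; with positive utilities this is
% equal absolute sacrifice in logarithms:
\frac{U(x_i) - U(x_i - t_i)}{U(x_i)} = c
\;\Longleftrightarrow\;
\log U(x_i) - \log U(x_i - t_i) = -\log(1 - c).

% Equal marginal sacrifice: after-tax marginal utilities are equalized.
U'(x_i - t_i) = c \quad \text{for all } i.
```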
Chapter 2 describes four regular challenges for lexicographers when they are composing their semantic definitions for individual words, and it shows how each of these challenges has led to disputed meanings in lawsuits. First, just about every word of English has more than one basic sense, and disputes can center on which meaning is intended. An example is given from a patent for a dental prosthesis involving SUBSTANTIALLY. It can have a “booster” meaning or a “downtoner” meaning, and which one is selected determines whether there is patent infringement or not. Second, there is the issue of the precise extent of the meaning of a word and its permitted “vagueness.” What exactly does APPROXIMATELY (300 acres) cover in a real-estate contract? And how broadly or narrowly is the word DISEASE understood in a medical insurance contract? Third, words can be used with nonliteral and extended meanings, as in metaphors. Some lawsuits are described involving FIGHTING, CHEATING, and MURDER, where the dispute centered on whether the intended meaning was literal or extended. Fourth, how are dictionary definitions impacted by the changing technological world around us? The word TANGIBLE is discussed in connection with a movie contract from fifty years ago, when the objects referred to were in plastic or paper form. Are they still TANGIBLE today when they exist on a computer screen?