Chapter 08 - answers
Further questions for chapter 08 - suggested answers
Q. Label the domain and range in the figure below (from Figure 8.4).
Q. The above figure illustrates a mapping function. How many items in the range of a mapping function can be attached to each item in its domain?
A. Only one. A mapping function attaches exactly one item in the range to each item in its domain, although different domain items may be attached to the same range item.
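The point can be made concrete with a small sketch. A Python dictionary behaves like a mapping function: each key (domain item) is attached to exactly one value (range item). The particular labels below are made up for illustration.

```python
# A mapping function: each domain item maps to exactly one range item.
# (The domain/range labels here are illustrative, not from the text.)
mapping = {"input_A": "output_1", "input_B": "output_2", "input_C": "output_1"}

# Each domain item yields one and only one range item...
print(mapping["input_A"])  # -> output_1

# ...although several domain items may share the same range item,
# as "input_A" and "input_C" do here.
shared = mapping["input_A"] == mapping["input_C"]
print(shared)  # -> True
```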
Q. Explain the phrase "neurons that fire together, wire together."
A. This slogan comes out of Donald Hebb's work on neural networks. It refers to Hebbian learning: when two neurons repeatedly fire in close synchrony, the connection between them is strengthened, which makes them more likely to fire together in the future. The feedback does not come from outside the system but from the correlated activity of other neurons already within it.
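A minimal sketch of the simplest rate-based form of Hebb's rule makes the slogan concrete: the weight between two units grows only on steps when both fire together. The learning rate and the activity patterns below are assumed values for illustration.

```python
# Sketch of Hebbian learning (simple rate-based form, assumed):
# delta_w = eta * pre * post. No external teacher is involved.
eta = 0.1          # learning rate (assumed value)
w = 0.0            # weight between a pre- and a post-synaptic unit

# Activity over five time steps (1 = firing, 0 = silent; made up).
pre_activity  = [1, 1, 0, 1, 0]
post_activity = [1, 1, 0, 0, 0]

for pre, post in zip(pre_activity, post_activity):
    # The weight changes only on steps where both units fire together.
    w += eta * pre * post

print(w)  # strengthened by the two co-active steps: 0.2
```

Note that at step four the pre-synaptic unit fires alone and the weight is unchanged: only joint firing "wires together."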
Q. Identify some differences between artificial and real, biological neural networks.
A. There are many different types of real neurons, and real brains contain vastly more units (neurons) than artificial networks do; real neurons do not start out with randomly assigned weights, as units in artificial networks typically do; it is not clear whether backpropagation is a biologically realistic mechanism; and real neurons do not receive the kind of "external" supervision and feedback that artificial units do.
Q. Explain why networks with hidden layers solve problems created by non-linearly separable functions.
A. A single-layer network computes a single weighted sum of its inputs, so it can only draw one linear boundary through the space of inputs. Non-linearly separable functions, such as exclusive-or (XOR), cannot be computed this way: no single line separates the inputs that should produce 1 from those that should produce 0. A hidden layer lets the network recode the inputs: the hidden units compute intermediate functions (for XOR, something like OR and AND), and the output unit combines their activity. In the hidden units' representation the problem becomes linearly separable, so the output unit can solve it. With enough hidden units, multilayer networks can compute any such function.
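The XOR case can be sketched with hand-wired threshold units. The weights and thresholds below are chosen by hand for illustration: one hidden unit computes OR, the other AND, and the output unit fires when OR is on but AND is off.

```python
def unit(inputs, weights, threshold):
    """A binary threshold unit: fires (1) iff the weighted sum meets the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def xor_net(x1, x2):
    # Hidden layer recodes the inputs (weights chosen by hand).
    h_or = unit([x1, x2], [1, 1], 1)     # fires if either input is on
    h_and = unit([x1, x2], [1, 1], 2)    # fires only if both inputs are on
    # Output: "OR but not AND" -- linearly separable in the hidden layer.
    return unit([h_or, h_and], [1, -2], 1)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor_net(x1, x2))  # 0, 1, 1, 0
```

No assignment of weights to a single threshold unit reproduces this truth table; it is the intermediate recoding that does the work.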
Q. Explain the process of learning for a neural network, and do so by using the perceptron convergence rule.
A. Learning in a neural network is a matter of adjusting the weights so that the network produces the desired output. The perceptron convergence rule relies on external supervision of this adjustment: whenever the network produces the wrong output for a training input, each weight is changed in proportion to the error (the difference between the desired and the actual output) and to the corresponding input, and the threshold is adjusted as well. Repeating this process across the training inputs is guaranteed to converge on a correct set of weights, provided the function being learned is linearly separable.
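The rule can be sketched on a linearly separable problem such as logical AND. The learning rate and starting values below are assumed for illustration; the update is delta_w_i = eta * (target - output) * x_i, with the threshold learned alongside the weights.

```python
# Sketch of the perceptron convergence rule on logical AND.
eta = 0.25                       # learning rate (assumed)
w = [0.0, 0.0]                   # initial weights (arbitrary)
theta = 0.0                      # threshold, adjusted during learning

# Training set: inputs paired with the desired (supervised) output.
training = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

def output(x):
    return 1 if w[0] * x[0] + w[1] * x[1] >= theta else 0

for epoch in range(20):          # repeat until no input produces an error
    errors = 0
    for x, target in training:
        err = target - output(x)          # external "teacher" signal
        if err != 0:
            w[0] += eta * err * x[0]      # weight change proportional
            w[1] += eta * err * x[1]      # to error and to each input
            theta -= eta * err            # threshold moves the other way
            errors += 1
    if errors == 0:
        break                             # weights have converged

print([output(x) for x, _ in training])  # -> [0, 0, 0, 1]
```

Run on a non-linearly separable function such as XOR, the same loop would never reach zero errors, which is exactly why hidden layers are needed for those cases.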