To convince us that computers cannot have mental states, Searle (1980) imagines a “Chinese room” that simulates a computer “speaking” Chinese, and he asks us to find the understanding in the room. It's a trick. There is no understanding in the room, not because computers can't have it, but because the room's simulation of a computer is defective. Fix the simulation and understanding appears. Abracadabra!