In this paper, we solve an exit probability game between two players, each of whom controls a linear diffusion process. One player aims to minimize the probability that the difference of the two processes reaches a low level before it reaches a high level, while the other player aims to maximize that probability. By solving the Bellman–Isaacs equations, we obtain the sub-value and sup-value functions of the game in explicit form; both are twice continuously differentiable. The optimal plays associated with the sub-value and sup-value are also found explicitly.
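To fix notation, a generic formulation consistent with the setup described above might read as follows; the specific dynamics, coefficients $a_i, b_i$, and levels $l < h$ are illustrative assumptions, not taken from the paper.

```latex
% Illustrative setup (a_i, b_i, l, h are assumptions, not from the paper):
% each player i controls a linear diffusion
\[
  dX^i_t = a_i(u^i_t)\,dt + b_i(u^i_t)\,dW^i_t, \qquad i = 1,2,
\]
% the game is driven by the difference process and its first exit time from (l,h)
\[
  D_t = X^1_t - X^2_t, \qquad
  \tau = \inf\{t \ge 0 : D_t \notin (l,h)\}, \qquad l < D_0 < h,
\]
% player 1 minimizes, player 2 maximizes, the probability of exiting at the low level
\[
  J(u^1,u^2) = \mathbb{P}\bigl(D_\tau = l\bigr), \qquad
  \underline{V} = \sup_{u^2}\inf_{u^1} J(u^1,u^2), \qquad
  \overline{V} = \inf_{u^1}\sup_{u^2} J(u^1,u^2).
\]
```

Here $\underline{V}$ and $\overline{V}$ stand in for the sub-value and sup-value; when the Isaacs condition holds they coincide and the game has a value.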