This paper proposes a new nonparametric test for conditional
parametric distribution functions based on the first-order linear
expansion of the Kullback–Leibler information function and
the kernel estimation of the underlying distributions. The test
statistic is shown to be asymptotically distributed as standard normal
under the null hypothesis that the parametric distribution is
correctly specified, and to reject the null with probability
approaching one if the parametric distribution is misspecified. The
test is also shown to have power against any local alternatives
approaching the null at rates slower than the parametric rate
n^{-1/2}. The finite sample performance of
the test is evaluated via a Monte Carlo simulation.
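As a hypothetical illustration of the underlying idea (not the paper's actual statistic), a Kullback–Leibler discrepancy between a kernel density estimate and a fitted parametric density can be sketched as follows; the choice of Gaussian kernel, rule-of-thumb bandwidth, normal null model, and exponential alternative are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kde(x_eval, sample, h):
    # Gaussian kernel density estimate evaluated at the points x_eval
    u = (x_eval[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def kl_discrepancy(sample, h):
    # Plug-in estimate of KL(f_hat || f_theta): average log ratio of the
    # kernel estimate to a normal density with fitted mean and variance.
    mu, sigma = sample.mean(), sample.std(ddof=1)
    f_hat = gaussian_kde(sample, sample, h)
    f_par = np.exp(-0.5 * ((sample - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return np.mean(np.log(f_hat / f_par))

n = 500
h = 1.06 * n ** (-1 / 5)  # rule-of-thumb bandwidth for unit-variance data

# Under the (illustrative) null, the data really are normal; under the
# alternative they are exponential, so the fitted normal is misspecified.
stats_null = [kl_discrepancy(rng.normal(size=n), h) for _ in range(200)]
stats_alt = [kl_discrepancy(rng.exponential(size=n), h) for _ in range(200)]
print(np.mean(stats_null), np.mean(stats_alt))
```

A studentized version of such a discrepancy, centered and scaled by its estimated asymptotic variance, is the kind of quantity that can be asymptotically standard normal under correct specification while diverging under misspecification.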