Retinal ganglion cells (RGCs) are highly sensitive to changes in contrast, which is crucial for detecting edges in a visual scene. In the natural environment, however, edges vary not only in contrast but also in degree of blur, which can arise from distance from the plane of fixation, motion, and shadows. Blur is therefore as much a characteristic of an edge as luminance contrast, yet its effects on the responses of RGCs are largely unexplored.
We examined the responses of rabbit RGCs to sharp edges varying in contrast and to high-contrast edges varying in blur; the width of the blur profile ranged from 0.73 to 13.05 deg of visual angle. For most RGCs, blurring a high-contrast edge produced the same pattern of reduced response strength and increased latency as lowering the contrast of a sharp edge. Consistent with this, the amount of blur required to reduce the response by 50% correlated significantly with receptive field size, suggesting that blur acts by reducing the range of luminance values within the receptive field. Such RGCs cannot individually encode blur; blur could be estimated only by comparing the responses of populations of neurons with different receptive field sizes. However, some RGCs showed a different pattern of latency and magnitude changes for contrast than for blur; these neurons could encode blur directly.
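The intuition above can be sketched with a toy linear receptive-field model. Here the RF is an odd-symmetric (derivative-of-Gaussian) weighting function and the stimulus is a step edge smoothed by a Gaussian; both the RF shape and its size are illustrative assumptions, not fits to the recorded cells. The sketch shows that, for such a linear RF, a blurred high-contrast edge is indistinguishable from a sharp edge of lower contrast, and that a larger RF retains more of its response at a given blur, mirroring the observed correlation between half-response blur and receptive field size.

```python
import numpy as np

# 1-D spatial grid in degrees of visual angle
x = np.linspace(-10, 10, 2001)

def edge(contrast, blur_sigma):
    """Luminance profile of an edge: a step of the given contrast,
    smoothed by a Gaussian of width blur_sigma (0 = sharp)."""
    step = contrast * (x > 0).astype(float)
    if blur_sigma == 0:
        return step
    k = np.exp(-x**2 / (2 * blur_sigma**2))
    return np.convolve(step, k / k.sum(), mode="same")

def rf(sigma):
    """Odd-symmetric (derivative-of-Gaussian) receptive field of size
    sigma -- a generic edge-sensitive linear RF, chosen for illustration."""
    w = x * np.exp(-x**2 / (2 * sigma**2))
    return w / np.abs(w).sum()

def response(stim, sigma=1.0):
    """Linear response: inner product of RF and stimulus."""
    return float(np.dot(rf(sigma), stim))

# Blurring a high-contrast edge mimics lowering the contrast of a sharp one:
r_sharp = response(edge(1.0, 0.0))
r_blur  = response(edge(1.0, 2.0))
c_eff   = r_blur / r_sharp               # equivalent contrast of the blurred edge
r_dim   = response(edge(c_eff, 0.0))     # sharp edge at that reduced contrast
assert abs(r_blur - r_dim) < 1e-9        # a single linear cell cannot tell them apart

# Larger RFs tolerate more blur: the retained fraction of the sharp-edge
# response at a fixed blur is higher for the larger receptive field.
ret_small = response(edge(1.0, 1.0), 0.5) / response(edge(1.0, 0.0), 0.5)
ret_large = response(edge(1.0, 1.0), 2.0) / response(edge(1.0, 0.0), 2.0)
assert ret_small < ret_large
```

Under this model the retained response fraction falls as sigma / sqrt(sigma^2 + b^2) for blur width b, so the blur needed to halve the response grows in proportion to RF size, which is why single cells of one size confound blur with contrast while a population spanning several sizes could disambiguate them.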
We also tested whether the response of an RGC to a blurred edge was linear, that is, whether the response to a sharp edge equaled the response to the blurred edge plus the response to the missing spatial components (the difference between the sharp and blurred edges). Brisk-sustained cells responded more linearly, whereas brisk-transient cells exhibited both linear and nonlinear behavior.
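The decomposition behind this test can be made concrete with the same toy model. A purely linear RF satisfies the additivity exactly, whereas a hypothetical rectifying (subunit-style) nonlinearity of the kind often used for Y-like transient cells breaks it; neither model is drawn from the paper itself.

```python
import numpy as np

x = np.linspace(-10, 10, 2001)          # degrees of visual angle

def edge(blur_sigma):
    """Unit-contrast edge, Gaussian-smoothed by blur_sigma (0 = sharp)."""
    step = (x > 0).astype(float)
    if blur_sigma == 0:
        return step
    k = np.exp(-x**2 / (2 * blur_sigma**2))
    return np.convolve(step, k / k.sum(), mode="same")

rf = x * np.exp(-x**2 / 2)              # illustrative edge-sensitive RF (1 deg)
rf /= np.abs(rf).sum()

sharp   = edge(0.0)
blurred = edge(2.0)
missing = sharp - blurred               # the removed spatial components

def linear(stim):
    """Linear cell: weighted sum over the receptive field."""
    return float(np.dot(rf, stim))

def rectified(stim):
    """Hypothetical nonlinear cell: half-wave rectify local (subunit)
    signals before pooling them."""
    return float(np.maximum(rf * stim, 0.0).sum())

# Linear cell: sharp-edge response = blurred response + missing-component response
assert abs(linear(sharp) - (linear(blurred) + linear(missing))) < 1e-9

# Rectifying cell: the decomposition fails -- the parts sum to more than the whole
assert rectified(blurred) + rectified(missing) > rectified(sharp) + 1e-3
```

In the rectifying model the blurred edge and the missing components each drive subunits on both sides of the edge, and rectification discards the cancelling negative terms, so their summed responses exceed the response to the intact sharp edge. This is one simple way a brisk-transient cell could fail the additivity test while a more linear brisk-sustained cell passes it.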