Linear representations are frequently used in image analysis, but they are seldom found to be optimal in specific applications. We propose a stochastic gradient algorithm for finding optimal linear representations of images for use in appearance-based object recognition. Using the nearest neighbor classifier, we specify a recognition performance function with continuous derivatives. By formulating the search for optimal linear representations as an optimization problem on a Grassmann manifold, we first derive an intrinsic gradient flow on the Grassmannian and then generalize it to a stochastic gradient algorithm. We present several experimental results to establish the effectiveness of this algorithm.
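To make the optimization concrete, the following is a minimal sketch, not the paper's implementation, of stochastic gradient ascent over subspaces for a nearest-neighbor recognition criterion. The synthetic Gaussian class data standing in for images, the sigmoid-smoothed performance function, the finite-difference gradient estimate, and the QR-based retraction (used here in place of the exact geodesic flow on the Grassmannian) are all illustrative assumptions.

```python
# Sketch: stochastic gradient ascent on the Grassmannian for a smoothed
# nearest-neighbor recognition rate. All modeling choices below are
# illustrative assumptions, not the authors' exact formulation.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "image" data: C classes of n-dimensional vectors (stand-in data).
n, d, C, per_class = 20, 3, 4, 15      # ambient dim, subspace dim, classes, samples
means = rng.normal(scale=3.0, size=(C, n))
X = np.vstack([m + rng.normal(size=(per_class, n)) for m in means])
y = np.repeat(np.arange(C), per_class)

def performance(U, beta=5.0):
    """Smoothed leave-one-out nearest-neighbor recognition rate in span(U)."""
    Z = X @ U                                         # project images to R^d
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)                       # exclude self-matches
    same = np.where(y[:, None] == y[None, :], D, np.inf).min(axis=1)
    diff = np.where(y[:, None] != y[None, :], D, np.inf).min(axis=1)
    # Sigmoid of the margin: near 1 when the nearest neighbor has the right label.
    return np.mean(1.0 / (1.0 + np.exp(-beta * (diff - same))))

def grassmann_gradient(U, eps=1e-4):
    """Finite-difference gradient projected onto the tangent space at U."""
    G = np.zeros_like(U)
    base = performance(U)
    for i in range(n):
        for j in range(d):
            V = U.copy()
            V[i, j] += eps
            G[i, j] = (performance(V) - base) / eps
    return (np.eye(n) - U @ U.T) @ G                  # horizontal projection

def retract(M):
    """Map a perturbed matrix back to an orthonormal basis (QR retraction)."""
    Q, _ = np.linalg.qr(M)
    return Q

# Stochastic gradient ascent over d-dimensional subspaces of R^n.
U = retract(rng.normal(size=(n, d)))                  # random starting subspace
step, noise = 0.5, 0.05
for it in range(200):
    G = grassmann_gradient(U)
    W = (np.eye(n) - U @ U.T) @ rng.normal(size=(n, d))   # random tangent direction
    U = retract(U + step * G + noise * W)             # noisy ascent step + retraction
    noise *= 0.98                                     # anneal the stochastic term
    if it % 50 == 0:
        print(f"iter {it:3d}  F(U) = {performance(U):.3f}")
```

The stochastic term added to the gradient step plays the role of the random perturbation that turns the deterministic gradient flow into a stochastic search, helping the iterates escape poor local maxima of the recognition performance.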