In the machine vision community, multi-scale image enhancement and
analysis have frequently been accomplished using a diffusion or
equivalent process. Linear diffusion can be replaced by convolution
with Gaussian kernels, as the Gaussian is the Green's function of such
a system. In this paper we present a technique which obtains an
approximate solution to a nonlinear diffusion process via the solution
of an integral equation which is the nonlinear analog of convolution.
The kernel function of the integral equation plays the same role that
a Green's function does for a linear PDE, allowing the direct solution
of the nonlinear PDE for a specific time without requiring integration
through intermediate times. We then use a learning technique to
approximate the kernel function for arbitrary input images. The result
is an improvement in speed and a reduction in noise sensitivity, as
well as a means to parallelize an otherwise serial algorithm.
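The linear case underlying this analogy can be checked directly: integrating the heat equation forward in time and convolving the initial image with a Gaussian of standard deviation sqrt(2t) give the same result. The following is a minimal one-dimensional sketch (not from the paper; the function names and the explicit-Euler discretization are illustrative choices) comparing the two routes.

```python
import numpy as np

def diffuse(u, t, dt=0.1):
    """Explicit Euler integration of the linear heat equation u_t = u_xx
    on a periodic grid (dx = 1); stable for dt <= 0.5."""
    for _ in range(int(round(t / dt))):
        u = u + dt * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))
    return u

def green_solution(u0, t):
    """Solve the same equation in one shot: circular convolution of u0
    with the heat kernel G(x, t) = exp(-x^2 / (4 t)) / sqrt(4 pi t),
    i.e. a Gaussian of standard deviation sqrt(2 t)."""
    n = len(u0)
    x = np.minimum(np.arange(n), n - np.arange(n))  # circular distance to 0
    g = np.exp(-x.astype(float) ** 2 / (4.0 * t))
    g /= g.sum()                                    # discrete normalization
    return np.real(np.fft.ifft(np.fft.fft(u0) * np.fft.fft(g)))

# Smooth test signal: two low-frequency harmonics on a ring of 256 samples.
n = 256
s = np.arange(n)
u0 = np.sin(2 * np.pi * s / n) + 0.5 * np.cos(4 * np.pi * s / n)
t = 4.0
err = np.max(np.abs(diffuse(u0.copy(), t) - green_solution(u0, t)))
print(err)  # the two solutions agree to within discretization error
```

Note that the Green's-function route reaches time t in a single convolution, with no integration through intermediate times, which is exactly the property the kernel function of the integral equation extends to the nonlinear case.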