Comments: 
From: eub 2016-08-12 08:48 am (UTC)

Missing a basic point: I don't get how (b) works. In the limit, doesn't that mean any two distinct points are independent?
Correct. In the same way that the Poisson process is the limit of imagining that a zillion infinitesimal rectangles each have some infinitesimal independent chance of being on or off, my intent here is that a zillion infinitesimal rectangles independently take some value in [0,∞) distributed Gamma(ε, 1) for an infinitesimal ε. The value you get by querying a realistically-sized rectangle is the sum of all those independent contributions. The Gamma distribution has the property that if you have independent random variables X and Y distributed Gamma(x, θ) and Gamma(y, θ), then X + Y is distributed Gamma(x + y, θ).
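That additivity property is easy to check numerically; a minimal sketch with numpy/scipy (the shape values 0.3 and 0.7 and the sample count are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a1, a2, theta, n = 0.3, 0.7, 1.0, 100_000

# X ~ Gamma(a1, theta) and Y ~ Gamma(a2, theta), independent
x = rng.gamma(a1, theta, n)
y = rng.gamma(a2, theta, n)

# X + Y should match Gamma(a1 + a2, theta): the KS distance between the
# empirical distribution of x + y and that Gamma CDF stays near zero
d, p = stats.kstest(x + y, stats.gamma(a1 + a2, scale=theta).cdf)
```

With 100,000 samples the KS statistic d lands around 1/sqrt(n), i.e. a few thousandths, which is what "same distribution" looks like empirically.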
Then the thing that ideally comes out of convolving this with a kernel of compact support has short-range but no long-range correlations.
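A discretized sketch of that claim in one dimension (eps stands in for the infinitesimal shape parameter, and the box kernel is one arbitrary choice of compactly supported kernel):

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretize: each cell gets an independent Gamma(eps, 1) increment;
# eps stands in for the infinitesimal shape parameter of the limit.
eps, n = 0.1, 1 << 16
field = rng.gamma(eps, 1.0, n)

# Compact kernel: a box of width w, so correlations should vanish
# beyond lag w.
w = 8
smooth = np.convolve(field, np.ones(w) / w, mode="valid")

def autocov(x, lag):
    x = x - x.mean()
    return np.mean(x * x) if lag == 0 else np.mean(x[:-lag] * x[lag:])

c0 = autocov(smooth, 0)         # variance
c_near = autocov(smooth, 1)     # inside the kernel support: correlated
c_far = autocov(smooth, 2 * w)  # beyond the support: ~zero
```

The lag-1 autocovariance comes out large (the box kernel gives (w-1)/w of the variance) while everything past lag w is sampling noise around zero.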
Dunno how to compute this particularly efficiently, though, which is a bummer, because I still have, as a personal white whale, the task of replacing Perlin noise with something really isotropic and rotation-invariant but also reasonable to compute locally.
From: eub 2016-08-13 08:42 am (UTC)

Ah, sorry, I spaced out at the convolution step! So you could get jcreed!perlin by a one-octave bandpass, etc.
So this has a family resemblance to sparse convolution noise, which uses a sparse shot noise instead of your distribution. (Sparseness of the noise makes its brute-force computation reasonable.) People filter that with a lowpass kernel, or with random-orientation Gabor kernels to get an octave like Perlin noise.
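Roughly, sparse convolution noise can be sketched like this in 1D (the density parameter and the smooth cubic falloff kernel here are illustrative choices, not the published algorithm's exact ones):

```python
import numpy as np

rng = np.random.default_rng(2)

# Sparse convolution noise (illustrative sketch): Poisson-many impulses
# at random positions with random weights, filtered by a compact kernel.
# Evaluating at x only needs the impulses within the kernel radius,
# which is what makes brute-force computation reasonable.
density, length = 2.0, 100.0          # assumed parameters
n_imp = rng.poisson(density * length)
pos = rng.uniform(0.0, length, n_imp)
wgt = rng.normal(0.0, 1.0, n_imp)

def kernel(r, radius=1.0):
    # compact smooth falloff: 1 at r = 0, exactly 0 for r >= radius
    r = np.abs(r) / radius
    return np.where(r < 1.0, (1.0 - r) ** 2 * (1.0 + 2.0 * r), 0.0)

def noise(x):
    # brute force over all impulses; a cell grid would restrict this
    # sum to the impulses near x
    return float(np.sum(wgt * kernel(x - pos)))
```

Swapping the lowpass kernel for a randomly oriented Gabor kernel is what turns this into one Perlin-like octave.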
... OK, there are some very basic things I don't know about stochastic processes. What is the choice of distribution accomplishing for us? If I take Gaussian white noise and convolve it with a compact kernel (i.e. FIR lowpass it), that has the "short-range but no long-range correlations" property. Is that different from what you're going for, though?
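One way to see what the choice of distribution does and doesn't buy you: the autocovariance of filtered noise depends only on the kernel and the input variance, not on the input distribution, so second-order statistics can't distinguish filtered Gaussian from filtered Gamma noise (a sketch; the box kernel and the Gamma shape 0.5 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
n, w = 1 << 18, 8
kernel = np.ones(w) / w   # compact FIR box (arbitrary compact kernel)

def autocorr(x, lags):
    x = x - x.mean()
    c0 = np.mean(x * x)
    return np.array([1.0 if k == 0 else np.mean(x[:-k] * x[k:]) / c0
                     for k in lags])

# same variance, different input distributions
gauss = rng.normal(0.0, 1.0, n)
gamma = rng.gamma(0.5, 1.0, n)
gamma = (gamma - gamma.mean()) / gamma.std()

lags = range(2 * w)
ag = autocorr(np.convolve(gauss, kernel, mode="valid"), lags)
am = autocorr(np.convolve(gamma, kernel, mode="valid"), lags)
# both trace the kernel's own autocorrelation and both die off past
# lag w; the distributions differ only in higher-order statistics
```

So any visible difference between the two fields has to live beyond second-order statistics.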
Gaussian white noise, or sparse convolution noise (which uses a sparse noise for computational efficiency), or your fancier noise: what differences result? I don't remember what, for example, you see spectrally for Gaussian vs. uniform white noise; it was something like the power spectra are both flat, but the difference shows up in the bispectrum?
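The flat-power-spectrum part at least is easy to confirm: averaging periodograms over many realizations gives the same flat level for Gaussian and variance-matched uniform white noise (a quick numerical sketch; any difference would indeed have to show up in higher-order spectra):

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 1024, 200

def mean_spectrum(samples):
    # periodogram averaged over many realizations
    return np.mean(np.abs(np.fft.rfft(samples, axis=1)) ** 2, axis=0) / n

gauss = rng.normal(0.0, 1.0, (trials, n))
unif = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), (trials, n))  # unit variance

sg = mean_spectrum(gauss)[1:-1]  # drop DC and Nyquist bins
su = mean_spectrum(unif)[1:-1]
# both sit flat at the common variance level of 1.0
```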
From: eub 2016-08-13 08:44 am (UTC)

(Hm, it is not obvious to me why the Gabor noise people use random-orientation Gabors instead of just using a radial bandpass, if a radial bandpass is what they want to end up with. For getting anisotropy, sure.)
Edited at 2016-08-13 08:45 am (UTC)
From: eub 2016-08-13 09:17 am (UTC)

(Is sparse impulse noise white at high frequencies? Intuitively it's got something screwy up there past the impulse frequency, but what is it? If I understood this, I might have a step towards understanding other distributions.
Intuitively, once you zoom in to where it looks like a single impulse, that has a perfectly flat spectral density, i.e. the phase is different from GWN's phase. Something something.)
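One piece of this is checkable numerically: to second order, per-cell Bernoulli impulse noise does come out flat, at the level given by the per-cell variance rate * (1 - rate); whatever is "screwy" would have to live in the phase or higher-order statistics rather than the power spectrum (a sketch, with arbitrary rate and sizes):

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials, rate = 1024, 400, 0.02   # sparse: ~2% of cells carry an impulse

def mean_spectrum(samples):
    # periodogram averaged over many realizations
    return np.mean(np.abs(np.fft.rfft(samples, axis=1)) ** 2, axis=0) / n

# unit impulses at independent random positions (Bernoulli per cell)
imp = (rng.random((trials, n)) < rate).astype(float)
s = mean_spectrum(imp)[1:-1]        # drop DC (the nonzero mean) and Nyquist

# expected flat level: the per-cell variance
level = rate * (1.0 - rate)
```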