Is there a name, I wonder, for the following stochastic process? - Notes from a Medium-Sized Island


[Aug. 10th, 2016|09:13 pm]

Is there a name, I wonder, for the following stochastic process? I may be using nonstandard terminology because I am not super familiar with the field.

The type of object I mean the process to yield is, I think, a 2D distribution in the sense of Schwartz, in that it's the kind of thing that you can query by throwing, say, a rectangle in ℝ² at it, and it'll tell you what answer you get if you integrate over that rectangle.

The defining properties of the distribution are these:
(a) For any rectangle R with area A, the density of the integral over R at a value x ≥ 0 is

x^(A−1) e^(−x) / Γ(A)

i.e. it's distributed ~ Gamma(A, 1), and
(b) the values on disjoint rectangles are independent.

If this is a reasonable thing, I'm tempted to convolve it with a kernel that's nice and compactly supported and C^∞ and arrive at a nice random smooth function from ℝ² to ℝ≥0 that has short-range but no long-range correlations.
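A minimal sketch of the construction, assuming numpy and a grid discretization: each tiny cell gets an independent Gamma(cell-area, 1) draw, rectangle queries are cell sums, and the smoothing uses a hypothetical bump kernel of my own choosing (the grid size, cell size, and kernel radius are all illustrative, not anything from the post):

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretize the plane into cells of area eps = dx*dx; each cell gets
# an independent Gamma(eps, 1) draw, so any union of cells with total
# area A sums to a Gamma(A, 1) variable by Gamma additivity.
n, dx = 256, 0.1
eps = dx * dx
field = rng.gamma(shape=eps, scale=1.0, size=(n, n))

# Querying a rectangle = summing the cells it covers.  A 50x40-cell
# rectangle has area 50*40*eps = 20, so this should look like a
# Gamma(20, 1) draw (mean 20, variance 20).
rect_sum = field[:50, :40].sum()

# Smooth with a compactly supported C^inf bump kernel to get a smooth
# random field with short-range but no long-range correlations.
ax = np.arange(-8, 9)
r = np.hypot(*np.meshgrid(ax, ax)) / 8.0
kernel = np.where(r < 1, np.exp(-1.0 / np.maximum(1.0 - r**2, 1e-12)), 0.0)
kernel /= kernel.sum()
smooth = np.real(np.fft.ifft2(np.fft.fft2(field) * np.fft.fft2(kernel, s=(n, n))))
```

(The FFT convolution here is circular, which is fine for a sketch; a real implementation would pad or evaluate the kernel locally.)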

From: eub
2016-08-12 08:48 am (UTC)
Missing a basic point, I don't get how (b) works. In the limit doesn't that mean any two distinct points are independent?
From: jcreed
2016-08-12 08:49 pm (UTC)
Correct. In the same way that the Poisson process is the limit of imagining that a zillion infinitesimal rectangles have some infinitesimal independent chance of being on or off, my intent here is that it's a zillion infinitesimal rectangles that independently take some value in [0,∞) distributed Gamma(ε, 1) for an infinitesimal ε. The value you get by querying a realistically-sized rectangle is the sum of all those independent contributions. The Gamma distribution has the property that if you have independent random variables X and Y distributed Gamma(x, θ) and Gamma(y, θ), then X + Y is distributed Gamma(x+y, θ).
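That additivity property is easy to sanity-check by sampling (a sketch, assuming numpy; the particular shapes and scale are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# If X ~ Gamma(x, theta) and Y ~ Gamma(y, theta) are independent,
# then X + Y ~ Gamma(x + y, theta).
x, y, theta = 0.7, 1.8, 2.0
s = rng.gamma(x, theta, N) + rng.gamma(y, theta, N)

# Gamma(k, theta) has mean k*theta and variance k*theta^2, so the sum
# should match Gamma(2.5, 2.0): mean ~5.0, variance ~10.0.
print(s.mean(), s.var())
```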

Then the thing that ideally comes out of convolution of this with a kernel with compact support has short-range but no long-range correlations.

Dunno how to particularly efficiently compute this, though, which is a bummer, because I kind of still have as a personal white whale the task of replacing Perlin noise with something really isotropic and rotation-invariant but also reasonable to compute locally.
From: eub
2016-08-13 08:42 am (UTC)
Ah, sorry, I spaced out for the convolution step! So you could get jcreed!perlin by a one-octave bandpass, etc.

So this has a family resemblance to sparse convolution noise, which uses a sparse shot noise instead of your distribution. (Sparseness of the noise makes its brute-force computation reasonable.) People filter that with a lowpass kernel, or with random-orientation Gabor kernels to get an octave like Perlin noise.
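For the record, sparse convolution noise in its simplest form looks something like this (a sketch, assuming numpy; impulse count, kernel width, and the plain Gaussian lobe standing in for a Gabor kernel are all my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_impulses, sigma = 256, 400, 6.0

# Sparse shot noise: a handful of random impulses with random weights
# instead of a dense white-noise field; the sparseness is what keeps
# brute-force evaluation cheap.
xs = rng.uniform(0, n, n_impulses)
ys = rng.uniform(0, n, n_impulses)
ws = rng.normal(0.0, 1.0, n_impulses)

# Filter each impulse with a compact Gaussian lobe (a stand-in for the
# lowpass or randomly oriented Gabor kernels used in Gabor noise).
yy, xx = np.mgrid[0:n, 0:n]
noise = np.zeros((n, n))
for x0, y0, w in zip(xs, ys, ws):
    noise += w * np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma**2))
```

In a real implementation each pixel would only sum over impulses in nearby cells rather than looping over all of them.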

... OK there are some very basic things I don't know about stochastic processes. What is the choice of distribution accomplishing for us? If I take Gaussian white noise and convolve it with a compact kernel -- i.e. FIR lowpass it -- that has a "short-range but no long-range correlations" property. Different than what you're going for though?
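The FIR-lowpass point can be seen directly in 1D (a sketch, assuming numpy; the 9-tap boxcar and the lags are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, taps = 1 << 16, 9

# FIR-lowpass Gaussian white noise with a compact (9-tap boxcar)
# kernel: samples closer than the kernel support are correlated,
# samples farther apart are independent.
white = rng.normal(size=n)
kernel = np.ones(taps) / taps
smooth = np.convolve(white, kernel, mode="valid")

def corr(a, lag):
    a = a - a.mean()
    return np.dot(a[:-lag], a[lag:]) / ((len(a) - lag) * a.var())

print(corr(smooth, 1))    # strong: lag 1 is inside the kernel support
print(corr(smooth, 50))   # ~0: lag 50 is beyond the support
```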

So the options are: Gaussian white noise; sparse convolution noise, which uses a sparse noise for computational efficiency; or your fancy noise. What differences result? I don't remember what, for example, you see spectrally for Gaussian vs. uniform white noise -- it was something like the power spectra are both flat but the difference shows up in the bispectrum?
From: eub
2016-08-13 08:44 am (UTC)
(Hm, it is not obvious to me why the Gabor noise people use random-orientation Gabors instead of just using a radial bandpass, if radial bandpass is what they want to end up with. For getting anisotropy, sure.)

From: eub
2016-08-13 09:17 am (UTC)
(Is sparse impulse noise white at high frequencies? It's got something screwy up there past the impulse frequency, intuitively, but what is it? If I understood this I might have a step towards understanding other distributions.

Intuitively once you zoom in to where it looks like an impulse, that has perfectly flat spectral density, i.e. the phase is different than GWN's phase. Something something.)