Randomly generate an m x n array, where each cell is independently "cheap" with probability p and "expensive" with probability 1-p. I want to take a path from the origin to (m, n). Stepping into a "cheap" cell costs c, stepping into an "expensive" cell costs x. The cost of a path is the sum of the costs of its steps. Since I'm only allowed to take steps north and east, there are (n+m choose n) possible paths.

What does the expected cost of the cheapest of these paths look like for large m and n? Specifically, (how) can I choose p and the ratio x/c such that it's proportional to sqrt(m^2 + n^2)? I.e., is it possible to have Manhattan distance with random costs rigged up to resemble Euclidean distance? Empirical futzing around suggests "maybeish".
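For reference, the "empirical futzing" can be done with a simple dynamic program: the cheapest monotone path to a cell is its own entry cost plus the cheaper of the best paths to its south and west neighbors. A minimal sketch (the parameters p, c, x, the grid size, and the seed are all arbitrary illustrative choices, not claims about where the interesting regime is):

```python
import random

def cheapest_path_cost(m, n, p, c, x, rng):
    """Sample an m x n grid and return the cost of the cheapest
    north/east path from the origin cell to the far corner."""
    # each cell independently costs c ("cheap") w.p. p, else x ("expensive")
    cell = [[c if rng.random() < p else x for _ in range(n)] for _ in range(m)]
    best = [[0.0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            if i == 0 and j == 0:
                prev = 0.0
            elif i == 0:
                prev = best[i][j - 1]
            elif j == 0:
                prev = best[i - 1][j]
            else:
                prev = min(best[i - 1][j], best[i][j - 1])
            best[i][j] = prev + cell[i][j]
    return best[m - 1][n - 1]

rng = random.Random(0)
m, n, p, c, x = 40, 40, 0.5, 1.0, 3.0
trials = 200
avg = sum(cheapest_path_cost(m, n, p, c, x, rng) for _ in range(trials)) / trials
print(avg)  # Monte Carlo estimate of the expected cheapest-path cost
```

Sweeping m and n (say along a quarter-circle of fixed sqrt(m^2 + n^2)) and watching whether the estimate stays roughly constant is one way to probe the question numerically.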

---

Alternate version: assign to every edge of the infinite square lattice an independent random cost, uniform in [0, 1]. Does the cost of the shortest path from point a to point b grow proportionally to the Euclidean distance between a and b as they move far apart?
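The lattice version can also be probed numerically: draw one Uniform[0,1] cost per edge (lazily, so only the explored region is sampled) and run Dijkstra between the two points on a finite box. The box size, endpoints, and seed below are arbitrary choices for illustration:

```python
import heapq
import random

def lattice_shortest_path(size, a, b, rng):
    """Dijkstra on the (size+1) x (size+1) grid graph with i.i.d.
    Uniform[0,1] edge costs, drawn lazily and cached per edge."""
    edge_cost = {}
    def cost(u, v):
        key = (u, v) if u <= v else (v, u)  # undirected edge
        if key not in edge_cost:
            edge_cost[key] = rng.random()
        return edge_cost[key]

    dist = {a: 0.0}
    heap = [(0.0, a)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == b:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        ux, uy = u
        for v in ((ux + 1, uy), (ux - 1, uy), (ux, uy + 1), (ux, uy - 1)):
            if 0 <= v[0] <= size and 0 <= v[1] <= size:
                nd = d + cost(u, v)
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
    return float("inf")

rng = random.Random(1)
d = lattice_shortest_path(60, (10, 30), (50, 30), rng)
print(d)  # random, but strictly between 0 and the 40-edge straight-line cost
```

Plotting the estimated cost against separation for points in several directions would show whether the growth rate looks direction-independent (i.e., Euclidean-like) or not.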