Is there a notion of average that is robust against this arbitrariness of counting?

Try this: Take a collection v_{1}, ..., v_{m} of vectors in **R**^{n}. Compute the m × m matrix D_{ij} = ||v_{i} − v_{j}|| of distances between them. Find a vector μ in **R**^{m} such that Dμ is a constant vector (every entry equal to some k) and Σ_{i} μ_{i} = 1. If D happens to be invertible, we can take μ to be the appropriate multiple of D^{-1}(1 1 ... 1)^{T}. Now, if I did my math right, Σ_{i} μ_{i} v_{i} serves as a kind of average of the v_{i}, but one that is invariant under rotation, uniform scaling (but not nonuniform scaling!), and, most importantly, *duplication* of items in the list of vectors v_{i}. If we literally duplicate an entry, the matrix D is no longer invertible, but the equation Dμ = k(1 1 ... 1)^{T} allows exactly those reweightings that assign the two duplicated vectors a total weight equal to the weight of the single unduplicated one.
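The construction above is easy to try numerically. Here is a minimal sketch (the function name `distance_weighted_mean` is my own, and it assumes D is invertible, i.e. no exact duplicates): build the pairwise distance matrix, solve Dw = (1 1 ... 1)^T, normalize w to sum to 1, and take the weighted average.

```python
import numpy as np

def distance_weighted_mean(vectors):
    """A sketch of the proposed average: find weights mu with
    D @ mu = k * ones and sum(mu) = 1, where D is the matrix of
    pairwise Euclidean distances, then return sum_i mu_i * v_i.
    Assumes D is invertible (fails for exactly duplicated points)."""
    V = np.asarray(vectors, dtype=float)
    # D[i, j] = ||v_i - v_j||, built via broadcasting
    D = np.linalg.norm(V[:, None, :] - V[None, :, :], axis=2)
    # Any solution of D @ w = ones is proportional to mu
    w = np.linalg.solve(D, np.ones(len(V)))
    mu = w / w.sum()  # enforce sum(mu) = 1
    return mu @ V

# Quick check of the claimed invariances on three generic 2D points
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.3, 2.0]])
m = distance_weighted_mean(pts)

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
# Rotation equivariance: mean of rotated points = rotated mean
assert np.allclose(distance_weighted_mean(pts @ R.T), R @ m)
# Uniform-scaling equivariance: D scales uniformly, so mu is unchanged
assert np.allclose(distance_weighted_mean(2.0 * pts), 2.0 * m)
```

For the duplicated case one would instead solve the singular system Dμ = k·1 in a least-squares sense subject to Σμ = 1, since, as noted above, D itself is no longer invertible.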

Anyhow, this seems like an utterly elementary operation. I wonder why I haven't heard of it before. Is it because it doesn't actually work the way I think it does?