Is there a notion of average that is robust against this arbitrariness of counting?
Try this: Take a collection v_1, ..., v_m of vectors in R^n. Compute the m × m matrix D_{ij} = ||v_i - v_j|| of distances between them. Find a vector μ in R^m such that Dμ is a constant vector (every entry equal to some k) and Σ_i μ_i = 1. If D happens to be invertible, we can take μ to be a suitable multiple of D^{-1}(1 1 ... 1)^T. Now if I did my math right, Σ_i μ_i v_i serves as a kind of average of the v_i, but one that is invariant under rotation and uniform scaling (though not nonuniform scaling!) and, most importantly, under duplication of items in the list of vectors v_i. If we literally duplicate entries, the matrix D is no longer invertible, but the equation Dμ = k(1 1 ... 1)^T admits exactly those reweightings that assign a total weight to the two duplicated vectors equal to the weight of the unduplicated one.
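For what it's worth, here's a small sketch of the construction in Python with NumPy (the name `robust_mean` is mine, and using least squares as a fallback for the singular-D duplicate case is my own choice; the minimum-norm solution splits weight equally between copies, which is one of the reweightings the equation allows):

```python
import numpy as np

def robust_mean(vs):
    """Distance-matrix-weighted average of the rows of vs (shape m x n)."""
    vs = np.asarray(vs, dtype=float)
    # D_{ij} = ||v_i - v_j||
    D = np.linalg.norm(vs[:, None, :] - vs[None, :, :], axis=-1)
    ones = np.ones(len(vs))
    # Solve D mu = (1 ... 1)^T.  Least squares also handles the singular D
    # that arises from literally duplicated entries: the minimum-norm
    # solution splits the duplicated vector's weight equally among copies.
    mu, *_ = np.linalg.lstsq(D, ones, rcond=None)
    mu = mu / mu.sum()   # normalize so the weights sum to 1 (rescales k)
    return mu @ vs

vs = [[0.0, 0.0], [3.0, 0.0], [0.0, 4.0]]
robust_mean(vs)                 # the weighted average
robust_mean(vs + [[0.0, 0.0]])  # same result with an entry duplicated
```

Quick sanity checks confirm the claimed invariances: duplicating an entry, rotating all the vectors, or scaling them uniformly changes the result only in the obvious equivariant way.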
Anyhow, this seems like an utterly elementary operation. I wonder why I haven't heard of it before. Is it because it doesn't actually work as I think it does?