I have no idea what name to give it,
Ok. Abstractly, your method is as follows. Given a distribution with unknown parameters,
you define as many statistics, computed on samples, as there are unknown parameters.
Then you match the expectation values of these statistics to the observed ones
for a given sample.
In your case the unknown parameters are the Elos of the players, and the statistics
are the observed scores of each player.
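To make the recipe concrete, here is a minimal sketch of that matching procedure for the Elo case (my own illustration, not anyone's actual implementation; the step size and the zero-mean anchoring are arbitrary choices of mine):

```python
def expected_score(r_a, r_b):
    # Logistic (Elo) win expectancy of player a against player b.
    return 1.0 / (1.0 + 10.0 ** ((r_b - r_a) / 400.0))

def fit_elos_by_matching(games, n_players, iters=500):
    """Pick ratings so each player's expected total score matches the observed one.

    games: list of (i, j, s) triples, s = score of player i against player j
           (1 = win, 0.5 = draw, 0 = loss).
    """
    ratings = [0.0] * n_players
    for _ in range(iters):
        for p in range(n_players):
            observed = expected = 0.0
            for i, j, s in games:
                if i == p:
                    observed += s
                    expected += expected_score(ratings[p], ratings[j])
                elif j == p:
                    observed += 1.0 - s
                    expected += expected_score(ratings[p], ratings[i])
            # Nudge the rating in the direction of the score surplus;
            # at the fixed point, expected == observed for every player.
            ratings[p] += 10.0 * (observed - expected)
        mean = sum(ratings) / n_players
        ratings = [r - mean for r in ratings]  # only rating differences matter
    return ratings
```

For example, `fit_elos_by_matching([(0, 1, 1.0), (0, 2, 0.5), (1, 2, 0.0)], 3)` fits three players from three games.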
But I think it is similar to ML.
No, it is similar to the method of moments. In the method of moments, the statistics mentioned above are the moments. But for the logistic distribution, by some accident
of nature, your method happens to be equivalent to ML...
In fact (please correct me, since I am not a mathematician), you cannot do ML, because there is no set of parameters that will guarantee a prediction (most likely outcome) matching the results you have. That is the definition of ML, if I am not wrong.
You can certainly do ML. In fact, BayesElo does it. Basically, ML picks the Elos so that the probability of the actually observed game outcomes is maximal among all possible Elos. This is well defined regardless of how bizarre (e.g. intransitive) those outcomes might be; bizarre outcomes just mean that the probability of the observation will be very small.
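To illustrate what "picks the Elos so that the probability of the observed outcomes is maximal" means, here is a hedged sketch of ML fitting by plain gradient ascent on the log-likelihood. It treats a draw as half a win plus half a loss, which is a simplification (BayesElo models draws explicitly); the step size and iteration count are arbitrary choices of mine:

```python
import math

K = math.log(10.0) / 400.0  # scale factor of the Elo logistic curve

def win_prob(r_i, r_j):
    return 1.0 / (1.0 + 10.0 ** ((r_j - r_i) / 400.0))

def log_likelihood(ratings, games):
    # Log-probability of the observed outcomes given the ratings.
    ll = 0.0
    for i, j, s in games:
        p = win_prob(ratings[i], ratings[j])
        ll += s * math.log(p) + (1.0 - s) * math.log(1.0 - p)
    return ll

def fit_elos_ml(games, n_players, step=100.0, iters=5000):
    ratings = [0.0] * n_players
    for _ in range(iters):
        grad = [0.0] * n_players
        for i, j, s in games:
            p = win_prob(ratings[i], ratings[j])
            grad[i] += K * (s - p)   # d(log L)/d(r_i)
            grad[j] -= K * (s - p)   # d(log L)/d(r_j)
        ratings = [r + step * g for r, g in zip(ratings, grad)]
        mean = sum(ratings) / n_players
        ratings = [r - mean for r in ratings]  # likelihood fixes only differences
    return ratings
```

Note that if some player wins every game, the likelihood has no finite maximum and his rating drifts off to infinity; that is one practical motivation for the prior discussed next.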
This is the frequentist approach to ML. The Bayesian version without a prior is mathematically equivalent. The Bayesian version of ML is slightly more general, in the
sense that it allows the inclusion of a prior in a non-ad-hoc way (but of course
the prior itself is ad hoc...).
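A sketch of how the prior enters (my illustration with a Gaussian prior; BayesElo's actual prior has its own form): one maximizes the log-posterior, which is just the log-likelihood plus a log-prior term.

```python
def log_posterior(ratings, games, prior_sigma=200.0):
    # Builds on log_likelihood from the previous sketch.
    # Gaussian log-prior centred on 0; prior_sigma = 200 is an ad hoc
    # choice of mine -- exactly the "the prior itself is ad hoc" point.
    lp = log_likelihood(ratings, games)
    for r in ratings:
        lp -= r * r / (2.0 * prior_sigma ** 2)
    return lp
```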
The reason why people like ML is that it is asymptotically optimal: the variance
of an ML estimator approaches the theoretically minimal value for large samples. And of course it is also domain independent.
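For reference, the "theoretically minimal value" is the Cramér-Rao bound: for a sample of size n,

```latex
\operatorname{Var}\bigl(\hat\theta_{\mathrm{ML}}\bigr)\;\longrightarrow\;\frac{1}{n\,I(\theta)},
\qquad
I(\theta)=\mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{\!2}\right],
```

where I(theta) is the Fisher information; no unbiased estimator can have smaller variance than 1/(n I(theta)).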
If you write things out, you see that ML estimation leads to a system of equations very similar to the one you obtain.
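Making this explicit for the logistic model (my derivation; s_g is player i's score in game g against opponent j, and p_g is his win expectancy, with draws again counted as half points):

```latex
\log L=\sum_{g}\Bigl[s_g\log p_g+(1-s_g)\log(1-p_g)\Bigr],
\qquad
p_g=\frac{1}{1+10^{-(r_i-r_j)/400}}.
```

Using dp_g/dr_i = (ln 10 / 400) p_g (1 - p_g), setting the gradient to zero gives

```latex
\frac{\partial\log L}{\partial r_i}
=\frac{\ln 10}{400}\sum_{g\ni i}\bigl(s_g-p_g\bigr)=0
\quad\Longleftrightarrow\quad
\sum_{g\ni i}p_g=\sum_{g\ni i}s_g,
```

i.e. each player's expected total score equals his observed total score: exactly the score-matching system described at the top.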