Proceedings of the AAAI Conference on Artificial Intelligence, 33(1), pp. 1222–1229, 2019
DOI: 10.1609/aaai.v33i01.33011222
This paper proposes an approach to strength adjustment for MCTS-based game-playing programs. In this approach, we use a softmax policy with a strength index z to choose moves. Most importantly, we filter out low-quality moves by excluding those whose simulation count falls below a pre-defined threshold ratio of the maximum simulation count. We perform a theoretical analysis showing that, given a threshold ratio, the adjusted policy is guaranteed to choose moves whose strength exceeds a lower bound. The approach is applied to the Go program ELF OpenGo. The experimental results show that z is highly correlated with empirical strength; namely, given a threshold ratio of 0.1, z is linearly related to the Elo rating with a regression error of 47.95 Elo for −2 ≤ z ≤ 2. Meanwhile, the covered strength range is about 800 Elo over the interval z ∈ [−2, 2]. Given the ease of strength adjustment via z, we present two methods to adjust strength and to predict opponents' strengths dynamically. To our knowledge, this result is state-of-the-art in terms of the range of strengths covered in Elo rating while maintaining a controllable relationship between strength and the strength index.
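As an illustration of the move-selection scheme sketched above, the following Python snippet combines the threshold filter with a softmax policy over MCTS simulation counts. This is a minimal sketch under stated assumptions: the function name select_move, the dictionary-based interface, and the particular softmax parameterization (weights proportional to N(a)^z, i.e., a softmax over z·log N(a)) are illustrative choices, not necessarily the paper's exact formulation.

```python
import math
import random

def select_move(sim_counts, z, threshold_ratio=0.1):
    """Sample a move from MCTS simulation counts with a softmax policy
    controlled by a strength index z.

    sim_counts: dict mapping each move to its simulation count N(a).
    z: strength index; larger z favors high-count (stronger) moves,
       while negative z favors lower-count moves among the survivors.
    threshold_ratio: moves with N(a) below this fraction of the maximum
       simulation count are filtered out as low-quality moves.
    """
    n_max = max(sim_counts.values())
    # Filter low-quality moves: keep only those whose simulation count
    # reaches the threshold ratio of the maximum count. The most-visited
    # move always survives, so the candidate set is never empty.
    candidates = {m: n for m, n in sim_counts.items()
                  if n >= threshold_ratio * n_max}
    # Softmax over z * log N(a), i.e., weights proportional to N(a)^z.
    # (Assumed form; the paper's exact parameterization may differ.)
    logits = {m: z * math.log(n) for m, n in candidates.items()}
    max_logit = max(logits.values())  # subtract the max for numerical stability
    weights = {m: math.exp(l - max_logit) for m, l in logits.items()}
    # Roulette-wheel sampling proportional to the softmax weights.
    r = random.uniform(0.0, sum(weights.values()))
    acc = 0.0
    for move, w in weights.items():
        acc += w
        if r <= acc:
            return move
    return move  # fallback for floating-point edge cases

# Example: with threshold_ratio=0.1, the move with 30 simulations is
# filtered out (30 < 0.1 * 800); z = 2 then mostly picks the top move,
# while z = -2 shifts probability toward the weaker surviving move.
counts = {"D4": 800, "Q16": 650, "C3": 30}
print(select_move(counts, z=2.0))
```

As z grows, the policy approaches the greedy max-visit choice; as z decreases, it spreads probability toward weaker surviving moves, which is what makes z behave as a monotone strength dial while the threshold filter keeps clearly bad moves out of the mix.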