It MAY not take a bigger, more complex net. One of the entertaining things about neural nets is that once they get a bit complex, what they are 'up to' is hard to guess. We humans are lucky: we get a lot of our neural nets pre-sized and pre-weighted, and it still only takes us forever to do things. Having one node too many, or too few, on an internal layer can produce a net that nearly works really well after a couple of million training epochs, yet never hits the sweet spot required to go the full 10 moves, which could take billions of training epochs to get right.
Still waiting for it to work out how to peel the labels off.
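That width-sensitivity point is easy to poke at on a toy problem. Here's a minimal pure-NumPy sketch (the `train_xor` helper and the XOR task are my own illustration, not anything from the discussion above): the same training loop, run with hidden widths of 1, 2, and 8, tends to plateau at width 1 (XOR isn't separable with one hidden unit) and only really converges once the layer is wide enough.

```python
import numpy as np

def train_xor(hidden, epochs=3000, lr=0.5, seed=0):
    """Train a tiny 2 -> hidden -> 1 sigmoid net on XOR; return (first, last) MSE."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0, 1, (2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1)); b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    losses = []
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                      # forward pass, hidden layer
        out = sig(h @ W2 + b2)                    # forward pass, output
        losses.append(float(np.mean((out - y) ** 2)))
        d_out = (out - y) * out * (1 - out)       # backprop through MSE + sigmoid
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
    return losses[0], losses[-1]

for width in (1, 2, 8):
    first, last = train_xor(width)
    print(f"hidden={width}: MSE {first:.3f} -> {last:.3f}")
```

Even on something this small, which width "nearly works" versus actually converges depends on the seed and the number of epochs, which is the sweet-spot problem in miniature.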