In the past we might have assumed that computers would keep getting exponentially faster and solve the problem for us, but that’s no longer the case. The days when the latest and greatest achievements in computing land ‘in your pocket in 2 years’ are numbered. We can scale up by adding more machines, but that gets expensive, and past a certain point you tend to see diminishing returns (see https://twitter.com/demishassabis/status/708488229750591488, where Hassabis tweeted: "We are using roughly same amount of compute power as in Fan Hui match: distributing search over further machines has diminishing returns").
1. More machines getting expensive is bullshit. Moore's law keeps going, and we keep getting nice exponential growth in total compute power (accounting for the growing number of cores) for the same price.
2. The diminishing-returns argument is another instance of bullshit. One specific algorithm failing to take advantage of more compute power doesn't tell us anything general about scaling.