What do you mean, passively? I let the computer crunch on larger Sudokus while I was on vacation.
On August 3, 2024, a Sudoku with a box dimension of 65 (a 4225×4225 grid) with 8.6 million blanks completed in 1.5 days. A few weeks later, a Sudoku with a box dimension of 70 (a 4900×4900 grid) with 11.5 million blanks finished in 6.7 days.
Why were the times different from what was expected? When the first Sudoku (the 4225×4225 grid) started, I realized that a different memory management technique was needed to cope with the rapidly growing memory requirements of the solve. I paused the solver, changed it to subdivide the work into smaller chunks that stay within memory limits, and restarted it. Chunking the puzzle slows the solver down, but at least the problem gets solved.
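The chunking idea can be sketched roughly like this (a minimal illustration, not the solver's actual code; `cells`, `process_chunk`, and `chunk_size` are made-up names):

```python
# Sketch: process a large workload in fixed-size chunks so that peak
# temporary memory stays bounded, at the cost of extra passes.

def solve_in_chunks(cells, process_chunk, chunk_size):
    """Apply `process_chunk` to successive slices of `cells`.

    Only one chunk's worth of temporary data is alive at a time,
    so memory use is bounded by `chunk_size`, not by `len(cells)`.
    """
    results = []
    for start in range(0, len(cells), chunk_size):
        chunk = cells[start:start + chunk_size]   # bounded working set
        results.extend(process_chunk(chunk))      # temporaries freed per chunk
    return results

# Toy example: square ten "cells" four at a time instead of all at once.
out = solve_in_chunks(list(range(10)), lambda c: [x * x for x in c], 4)
print(out)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The tradeoff is exactly the one described above: each chunk boundary adds bookkeeping overhead, but the working set never exceeds the chosen chunk size.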
I have rerun the puzzles on the leaderboard using the new solver. Times are down across the board by an order of magnitude, although chunking the puzzle does introduce some overhead. That is a reasonable tradeoff: without the new memory management, the problem would be intractable with the current algorithm. Specifically, the solver now caps the temporary buffers used for hidden-single detection at half the available memory. So far this strategy seems to work.
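For readers unfamiliar with the term, a hidden single is a value that can appear in only one cell of a unit (a row, column, or box), so it must go there. A minimal sketch of the detection step (illustrative only; the real solver applies this to chunks of a grid with millions of cells):

```python
# Sketch: hidden-single detection within one unit (row, column, or box).
# `unit` is a list of candidate sets, one per unsolved cell in the unit.

def hidden_singles(unit):
    """Return {value: cell_index} for every value that is a candidate
    in exactly one cell of the unit."""
    positions = {}
    for i, candidates in enumerate(unit):
        for v in candidates:
            positions.setdefault(v, []).append(i)
    return {v: cells[0] for v, cells in positions.items() if len(cells) == 1}

# Toy unit of three cells: 1, 3, and 4 each fit in only one cell,
# while 2 is a candidate in all three, so it is not a hidden single.
print(hidden_singles([{1, 2}, {2, 3}, {2, 4}]))  # {1: 0, 3: 1, 4: 2}
```

The intermediate `positions` map is the kind of temporary structure whose size grows with the grid, which is why capping its memory and working chunk by chunk matters at this scale.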
How does this “biggest Sudoku ever solved” compare to the Guinness Book of World Records? The largest multi-Sudoku puzzle on record consists of 280 Sudoku grids (approximately 17,920 blanks). Measured by blanks, that is about 640 times smaller than the 4900×4900 puzzle we solved in 6.7 days, and about 480 times smaller than the 4225×4225 puzzle.
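The ratios follow directly from the blank counts quoted above (the record's blank count is approximate, so the ratios are too):

```python
# Approximate blank counts from the text above.
record_blanks = 17_920     # ~280 standard grids in the record multi-Sudoku
blanks_4900 = 11_500_000   # 4900x4900 puzzle, solved in 6.7 days
blanks_4225 = 8_600_000    # 4225x4225 puzzle, solved in 1.5 days

print(round(blanks_4900 / record_blanks))  # 642
print(round(blanks_4225 / record_blanks))  # 480
```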
The computer is currently crunching on a puzzle with a box dimension of 75 (a 5625×5625 grid). The ETA for a solution is ~10 days.