Simulated Annealing and the Eight Queen Problem

Unsupervised Learning and Random Optimization Algorithms

Sebastián Gerard Aguilar Kleimann
Towards Data Science


The Simulated Annealing (SA) algorithm is one of many random optimization algorithms. Unlike algorithms such as Hill Climbing, where the intent is only to improve the current solution, SA allows for more exploration. The idea is that with this exploration the algorithm is more likely to reach a global optimum rather than a local optimum (for more on local optima, global optima and the Hill Climbing algorithm follow this link).

The term “annealing” comes from metallurgy, where a metal is heated and then cooled slowly. The process allows the molecules to regroup into a more stable, lower-energy configuration. The analogy applies to the SA algorithm: we get closer to a solution, move away from it by doing exploration, and then get closer again to an even better solution.

The Simulated Annealing Algorithm

The algorithm can be decomposed into 4 simple steps:

  1. Start at a random point x.
  2. Choose a new point xⱼ in a neighborhood N(x).
  3. Decide whether to move to the new point xⱼ. The decision is made based on the probability function P(x,xⱼ,T) (explained ahead).
  4. Reduce T and repeat from step 2.

As mentioned before, P(x,xⱼ,T) is the function that guides whether we move to the new point or not. For a maximization problem it is defined as:

P(x,xⱼ,T) = 1 if F(xⱼ) ≥ F(x), and P(x,xⱼ,T) = exp((F(xⱼ) − F(x)) / T) otherwise.

Let’s explore what the function is telling us. F(x) is the objective function (the function for which we want to find the optimal point x). If the new point xⱼ improves the result of the objective function, we move to it with probability 1.

When the new point doesn’t improve the objective function, we move to it with a probability that depends on the difference F(xⱼ) − F(x) and the variable T (the “temperature”).

When T is high the probability of moving to the new point is also high, and when T is low that probability is low. That’s why we start with a high T to do more exploration, and gradually lower the value of T to converge on the optimal point.
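The four steps above can be sketched in a few lines of Python. This is a minimal toy sketch of the technique, not the mlrose implementation used later; the function names, the cooling schedule and the toy objective are my own assumptions.

```python
import math
import random

def anneal(f, x0, neighbor, t0=10.0, decay=0.99, iters=2000, seed=0):
    """Minimal simulated annealing sketch for maximizing f."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    for _ in range(iters):
        xj = neighbor(x, rng)        # step 2: pick a point in the neighborhood
        fxj = f(xj)
        # Step 3: accept with probability 1 if the objective improves,
        # otherwise with probability exp((F(xj) - F(x)) / T).
        if fxj >= fx or rng.random() < math.exp((fxj - fx) / t):
            x, fx = xj, fxj
        t *= decay                   # step 4: reduce T
    return x

# Toy example: maximize f(x) = -(x - 3)^2, whose optimum is at x = 3
best = anneal(lambda x: -(x - 3) ** 2, x0=0.0,
              neighbor=lambda x, rng: x + rng.uniform(-1, 1))
```

With a high initial T almost every move is accepted (exploration); as T decays the loop behaves more and more like Hill Climbing and settles near the optimum.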

Now that we know how the algorithm works, it’s time to work through an example in Python to take our understanding further.

The Eight Queen Problem

The eight queen problem consists of placing 8 queens on an 8×8 chess board without any of them being on another’s line of attack. Queens can move horizontally, vertically and diagonally, as seen in Image 1.1.

Image from Brainmetrix

Finding a solution is not easy, since a great number of combinations exist. Image 1.2 shows some attempts where only 6 or 7 queens could be safely positioned on the board.

Image from Brainmetrix

Now that we understand the problem, let’s go to the Python code and solve it.

The 8 Queens using Python

In Python there is a library called “mlrose” that is very helpful for implementing random optimization algorithms, so the first few lines of code import this library as well as the numpy library, which helps us handle arrays.

### Importing the necessary libraries
import mlrose
import numpy as np

The next step is defining the objective function. The objective function will count the number of queens that are positioned where they cannot be attacked. Given that queens attack vertically, no two queens can share a column, so we can represent the position of each queen with a simple array of 8 entries, where entry i is the row of the queen in column i. For example, on a chess board the array A=[0,2,1,3,7,4,5,6] would look like Image 2.1.

Image from DataGenetics
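To see why this representation works, here is a small self-contained sketch that counts attacking pairs for an array in this format. The helper name `count_conflicts` is my own, not from the article or mlrose:

```python
def count_conflicts(position):
    """Count pairs of queens that attack each other, assuming one queen
    per column: position[i] is the row of the queen in column i."""
    conflicts = 0
    n = len(position)
    for i in range(n - 1):
        for j in range(i + 1, n):
            # Same row, or same diagonal (row distance equals column distance)
            same_row = position[j] == position[i]
            same_diagonal = abs(position[j] - position[i]) == j - i
            if same_row or same_diagonal:
                conflicts += 1
    return conflicts

# The example array A still has diagonal conflicts,
# while a valid arrangement has none.
print(count_conflicts([0, 2, 1, 3, 7, 4, 5, 6]))
print(count_conflicts([0, 4, 7, 5, 2, 6, 1, 3]))  # a known solution: 0
```

A valid solution is simply an array for which this count is zero, which is the kind of condition the objective function below rewards.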

Now it is time to code the objective function:

### Defining the objective function
def queens_max(position):
    # Counter for queens that are free of attack
    queen_not_attacking = 0
    # Compare each pair of queens
    for i in range(len(position) - 1):
        # Counter of later columns that queen i does not attack
        no_attack_on_j = 0
        for j in range(i + 1, len(position)):
            # Check for a horizontal or diagonal attack between columns i and j
            if (position[j] != position[i]
                    and position[j] != position[i] + (j - i)
                    and position[j] != position[i] - (j - i)):
                # No attack on column j: the reference counter is increased
                no_attack_on_j += 1
        # If queen i attacks none of the following queens,
        # it is counted as correctly positioned
        if no_attack_on_j == len(position) - 1 - i:
            queen_not_attacking += 1
    # The return value is the number of queens not attacking each other.
    # If it reaches 7, the last queen is also free of attack, so we add 1.
    if queen_not_attacking == 7:
        queen_not_attacking += 1
    return queen_not_attacking

The objective function defined above is passed as an argument to the “CustomFitness” method in mlrose. That’s how any custom objective function can work with this library.

# Assign the objective function to "CustomFitness" method.
objective = mlrose.CustomFitness(queens_max)

Now we have finished the tricky part. The only thing remaining is to tell mlrose the type of problem it has to solve. The problem we are solving uses discrete (whole) numbers, which is why we’ll use the method “DiscreteOpt”. This method holds the description of the problem and takes 4 arguments:

  1. length - The size of the array we are working with (for the eight queen problem it’s 8).
  2. fitness_fn - The objective function we previously defined with the name “objective”.
  3. maximize - It must be “True” if we want to maximize the objective function and “False” otherwise.
  4. max_val - This parameter indicates how many values each position of the array can take: they range from 0 to max_val − 1. In our case a queen can be placed in rows 0 to 7, so max_val = 8.

The corresponding line of code:

#Description of the problem
problem = mlrose.DiscreteOpt(length = 8, fitness_fn = objective, maximize = True, max_val = 8)

Finally, it’s time to tell mlrose how to solve the problem. We know we are going to use Simulated Annealing (SA), and it’s important to specify 5 parameters.

  1. problem - This parameter contains the information of the problem. We defined it earlier with the name “problem”.
  2. schedule - This parameter tells T how to decrease over the iterations. There are many ways to decrease T, and it’s possible to check them in the mlrose documentation. This time T will be decreased exponentially using ExpDecay().
  3. max_attempts - The number of attempts the algorithm will make to find a better solution. If the number of attempts reaches this maximum, the algorithm stops.
  4. max_iters - The maximum number of new points the algorithm can evaluate, i.e. the maximum number of iterations it performs.
  5. init_state - The initial position of the array.

The last chunk of code looks like this:

# Define decay schedule
T = mlrose.ExpDecay()
# Define initial state
initial_position = np.array([4, 6, 1, 5, 2, 0, 3, 7])
# Solve problem using simulated annealing
best_position, best_objective = mlrose.simulated_annealing(problem=problem, schedule = T,
max_attempts = 500, max_iters = 5000,
init_state = initial_position)
print('The best position found is: ', best_position)
print('The number of queens that are not attacking each other is: ', best_objective)
Output:
The best position found is: [4 6 1 5 2 0 3 7]
The number of queens that are not attacking each other is: 8.0
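As a quick visual check, we can render the board implied by the result array. This helper is my own sketch, not part of mlrose or the article:

```python
def board_str(position):
    """Render the board: 'Q' where a queen sits, '.' elsewhere.
    position[i] is the row of the queen in column i."""
    n = len(position)
    return '\n'.join(
        ' '.join('Q' if position[col] == row else '.' for col in range(n))
        for row in range(n)
    )

print(board_str([4, 6, 1, 5, 2, 0, 3, 7]))
```

Printing the board makes it easy to confirm by eye that no two queens share a row, column or diagonal.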

We solved the 8 queen problem successfully, as seen in Image 2.2.

Image from DataGenetics

Final Remarks and Code

I hope you enjoyed the article and will play with the eight queen problem. I leave you the code on Google Colab and GitHub. It’s possible to run this code and, with a few tweaks, even solve the 9, 10, … n queen problem.

Bibliography:

Machine Learning, Randomized Optimization and Search. (n.d.). Retrieved August 13, 2020, from https://mlrose.readthedocs.io/en/stable/

Brain Training. (n.d.). Retrieved August 13, 2020, from http://www.brainmetrix.com/8-queens/

Eight Queens. (n.d.). Retrieved September 10, 2020, from http://www.datagenetics.com/blog/august42012/

Originally published at https://www.estudiodedatos.com on August 25, 2020.
