Improving an Evolutionary Algorithm for an ANN That Should Solve XOR

I have to implement an Artificial Neural Network (ANN) with 2 input, 2 hidden, and 1 output neurons that can solve the XOR problem. The network weights should be optimized using an evolutionary algorithm. The activation function for each neuron and the fitness function for each ANN are given. The following image summarizes the problem and introduces the variable names I used:

[Image: diagram of the 2-2-1 XOR network and the weight variable names]

Now I have tried my best to solve this problem, but even with an evolutionary algorithm using a population of 1000 ANNs and 2000 generations, my best fitness never gets better than 0.75. My code consists of an ANN class with the neuron, activation, and fitness functions, and a Main class that contains the evolutionary algorithm that optimizes the ANN weights. Here is the code:

Each ANN is initialized with random weights in [-1, 1) and can mutate, that is, return a copy that differs in one randomly chosen weight.

import java.util.Random;

public class ANN implements Comparable<ANN> {
    private Random rand = new Random();
    public double[] w = new double[6];  //weights: in1->h1, in1->h2, in2->h1, in2->h2, h1->out, h2->out

    public ANN() {
        for (int i=0; i<6; i++) //randomly initialize weights in [-1,1)
            w[i] = rand.nextDouble() * 2 - 1;
    }

    //calculates the output for input a & b
    public double ann(double a, double b) {
        double h1 = activationFunc(a*w[0] + b*w[2]);
        double h2 = activationFunc(a*w[1] + b*w[3]);
        double out = activationFunc(h1*w[4] + h2*w[5]);

        return out;
    }

    //bipolar sigmoid: 2/(1 + e^(-2x)) - 1, which equals tanh(x)
    private double activationFunc(double x) {
        return 2.0 / (1 + Math.exp(-2*x)) - 1;
    }

    //calculates the fitness: 1 minus the average deviation from the correct XOR outputs
    public double fitness() {
        double sum = 0;
        //test all possible inputs (0,0; 0,1; 1,0; 1,1)
        sum += 1 - Math.abs(0 - ann(0, 0));
        sum += 1 - Math.abs(1 - ann(0, 1));
        sum += 1 - Math.abs(1 - ann(1, 0));
        sum += 1 - Math.abs(0 - ann(1, 1));
        return sum / 4.0;
    }

    //randomly change random weight and return the mutated ANN
    public ANN mutate() {
        //copy weights
        ANN mutation = new ANN();
        for (int i=0; i<6; i++)
            mutation.w[i] = w[i];

        //randomly change one
        int weight = rand.nextInt(6);
        mutation.w[weight] = rand.nextDouble() * 2 - 1;

        return mutation;
    }

    @Override
    public int compareTo(ANN arg) {
        if (this.fitness() < arg.fitness())
            return -1;
        if (this.fitness() == arg.fitness())
            return 0;
        return 1;   //this.fitness > arg.fitness
    }

    @Override
    public boolean equals(Object obj) {
        if (!(obj instanceof ANN))  //also covers null
            return false;
        ANN ann = (ANN)obj;
        for (int i=0; i<w.length; i++) {    //not equal if any weight is different
            if (w[i] != ann.w[i])
                return false;
        }
        return true;
    }
}


The Main class contains the evolutionary algorithm and uses elitism and rank-based selection to create the next generation of the population: the top 100 ANNs are copied unchanged, and the remaining 900 are mutations of previously successful ANNs.

import java.util.ArrayList;
import java.util.Collections;
import java.util.Random;

//rank-based selection + elitism
public class Main {
    static Random rand = new Random();
    static int size = 1000;                     //population size
    static int elitists = 100;                  //number of elitists

    public static void main(String[] args) {
        int generation = 0;
        ArrayList<ANN> population = initPopulation();
        print(population, generation);

        //stop after good fitness is reached or after 2000 generations
        while(bestFitness(population) < 0.8 && generation < 2000) {
            generation++;
            population = nextGeneration(population);
            print(population, generation);
        }
    }

    public static ArrayList<ANN> initPopulation() {
        ArrayList<ANN> population = new ArrayList<ANN>();
        for (int i=0; i<size; i++) {
            ANN ann = new ANN();
            if (!population.contains(ann))  //no duplicates
                population.add(ann);
        }
        return population;
    }

    public static ArrayList<ANN> nextGeneration(ArrayList<ANN> current) {
        ArrayList<ANN> next = new ArrayList<ANN>();
        Collections.sort(current, Collections.reverseOrder());  //sort according to fitness (0=best, 999=worst)

        //copy elitists
        for (int i=0; i<elitists; i++) {
            next.add(current.get(i));
        }

        //rank-based roulette wheel
        while (next.size() < size) {                        //keep same population size
            double total = 0;
            for (int i=0; i<size; i++)
                total += 1.0 / (i + 1.0);                   //fitness = 1/(rank+1)

            double r = rand.nextDouble() * total;
            double cap = 0;
            for (int i=0; i<size; i++) {
                cap += 1.0 / (i + 1.0);                     //higher rank => higher probability
                if (r < cap) {                              //select for mutation
                    ANN mutation = current.get(i).mutate(); //no duplicates
                    if (!next.contains(mutation))
                        next.add(mutation);     
                    break;
                }
            }
        }       

        return next;
    }

    //returns best ANN in the specified population
    public static ANN best(ArrayList<ANN> population) {
        Collections.sort(population, Collections.reverseOrder());
        return population.get(0);
    }

    //returns the best fitness of the specified population
    public static double bestFitness(ArrayList<ANN> population) {
        return best(population).fitness();
    }

    //returns the average fitness of the specified population
    public static double averageFitness(ArrayList<ANN> population) {
        double totalFitness = 0;
        for (int i=0; i<size; i++)
            totalFitness += population.get(i).fitness();
        double average = totalFitness / size;
        return average;
    }

    //print population best and average fitness
    public static void print(ArrayList<ANN> population, int generation) {       
        System.out.println("Generation: " + generation + "\nBest: " + bestFitness(population) + ", average: " + averageFitness(population));
        System.out.print("Best weights: ");
        ANN best = best(population);
        for (int i=0; i<best.w.length; i++)
            System.out.print(best.w[i] + " ");
        System.out.println();
        System.out.println();
    }
}


Even though I have thought about this at some length and applied the methods I have learned, the result is not satisfactory. For some reason, the best weights all seem to drift down toward -1. Why would that happen? Is -1 to 1 a good range for the weights? Should I also introduce crossover in addition to mutation? I know this is a very specific problem, but I would really appreciate some help!
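By crossover I mean something like the following untested sketch, which I would add to Main: a single-point crossover over the six weights of two parent ANNs.

//untested sketch of what I mean by crossover: the child takes the first k
//weights from one parent and the remaining weights from the other parent,
//with the cut point k chosen at random
public static ANN crossover(ANN p1, ANN p2) {
    ANN child = new ANN();
    int cut = rand.nextInt(p1.w.length + 1);        //cut point in [0, 6]
    for (int i = 0; i < p1.w.length; i++)
        child.w[i] = (i < cut) ? p1.w[i] : p2.w[i];
    return child;
}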


1 answer


The network structure is the problem: without a bias (threshold) input for each node, this network cannot solve the XOR problem.

One hidden node needs to encode OR and the other hidden node needs to encode AND. The output node can then combine them with a positive weight on the OR node and a negative weight on the AND node, so it only produces a positive result when the OR node fires and the AND node does not, which is exactly XOR.
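To make that concrete, here is a minimal sketch with hand-picked weights (my own illustrative values, not something an EA produced) that adds one bias term per neuron to your 2-2-1 network and reuses your activation function:

//Sketch only: a 2-2-1 network with one bias term per neuron and hand-picked
//weights, to show that the topology can represent XOR once biases are added.
//The particular values are illustrative, not unique.
public class XorWithBias {
    //same activation as in the question: 2/(1 + e^(-2x)) - 1, i.e. tanh(x)
    static double act(double x) { return 2.0 / (1 + Math.exp(-2*x)) - 1; }

    static double ann(double a, double b) {
        double h1 = act(4*a + 4*b - 2);     //~OR : negative only for (0,0)
        double h2 = act(4*a + 4*b - 6);     //~AND: positive only for (1,1)
        //output is high only when the OR node fires and the AND node does not
        return act(3*h1 - 3*h2 - 0.1);
    }

    public static void main(String[] args) {
        System.out.println(ann(0, 0));      //~0.008
        System.out.println(ann(0, 1));      //~0.99998
        System.out.println(ann(1, 0));      //~0.99998
        System.out.println(ann(1, 1));      //~0.008
    }
}

Note that these hand-picked weights lie well outside [-1, 1], which is related to the next point.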

I would also loosen the limits on the weights and let the EA figure out suitable values on its own. Whether that is necessary depends on the structure of the network.
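One way to do that, shown here only as a sketch of one possible option (the 0.5 step size is an arbitrary choice of mine), is to perturb the selected weight with Gaussian noise instead of redrawing it from a fixed interval:

//sketch of an alternative mutate() for the ANN class: instead of redrawing the
//chosen weight from [-1, 1), nudge it with Gaussian noise so its magnitude can
//grow beyond that range if the fitness rewards it
public ANN mutate() {
    ANN mutation = new ANN();
    System.arraycopy(w, 0, mutation.w, 0, w.length);    //copy current weights

    int i = rand.nextInt(w.length);
    mutation.w[i] += rand.nextGaussian() * 0.5;         //unbounded perturbation
    return mutation;
}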



If you want to keep this kind of network, with hidden nodes and thresholds, see http://www.heatonresearch.com/online/introduction-neural-networks-java-edition-2/chapter-1/page4.html

For another network that uses bias nodes to solve XOR, see http://www.mind.ilstu.edu/curriculum/artificial_neural_net/xor_problem_and_solution.php
