Solving the XOR problem with a genetic algorithm


I am trying to solve the XOR problem with a neural network. For training, I am using a genetic algorithm.

Population size: 200

Maximum generations: 10000

Crossover rate: 0.8

Mutation rate: 0.1

Number of weights: 9

Activation function: sigmoid

Selection method: fitness-proportionate (the fittest individuals have a higher chance of being selected)
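
For orientation, a minimal sketch of how these hyperparameters might be stored on the GA class. The class name is only illustrative; the attribute names mirror the ones used in the code below:

    class GeneticXor:
        def __init__(self):
            self.population_size = 200    # individuals per generation
            self.max_generations = 10000  # upper bound on generations to evolve
            self.crossover_perc = 0.8     # probability of applying crossover to a parent pair
            self.mut_perc = 0.1           # probability of mutating an offspring
            self.number_weights = 9       # 2-2-1 network: 6 weights + 3 biases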

Code:

    def crossover(self, wfather, wmother):
        # With probability crossover_perc, blend the two parents into two children
        r = np.random.random()
        if r <= self.crossover_perc:
            new_weight = self.crossover_perc * wfather + (1 - self.crossover_perc) * wmother
            new_weight2 = self.crossover_perc * wmother + (1 - self.crossover_perc) * wfather
            return new_weight, new_weight2
        else:
            return wfather, wmother

    def select(self, fits):
        # Roulette-wheel selection over the normalized fitness values
        percentuais = np.array(fits) / float(sum(fits))
        vet = [percentuais[0]]
        for p in percentuais[1:]:
            vet.append(vet[-1] + p)
        r = np.random.random()
        #print(len(vet), r)
        for i in range(len(vet)):
            if r <= vet[i]:
                return i

    def mutate(self, weight):
        # With probability mut_perc, add Gaussian noise to one randomly chosen weight
        r = np.random.random()
        if r <= self.mut_perc:
            mutr = np.random.randint(self.number_weights)
            weight[mutr] = weight[mutr] + np.random.normal()
        return weight

    def activation_fuction(self, net):
        # Sigmoid activation
        return 1 / (1 + math.exp(-net))

Problem:

~5 out of 10 runs work correctly

Expected output:

0,0 0

0,1 1

1,0 1

1,1 0

Test results:

Inconsistent: sometimes I get four 0s, sometimes three 1s, among other results. Can you help me find the error?

**Edit**

Full code:

    def create_initial_population(self):
        # Each individual is a vector of number_weights weights drawn uniformly from [-40, 40]
        population = np.random.uniform(-40, 40, [self.population_size, self.number_weights])
        return population

    def feedforward(self, inp1, inp2, weights):
        # 2-2-1 network: two sigmoid hidden units and one sigmoid output unit
        bias = 1
        x = self.activation_fuction(bias * weights[0] + (inp1 * weights[1]) + (inp2 * weights[2]))
        x2 = self.activation_fuction(bias * weights[3] + (inp1 * weights[4]) + (inp2 * weights[5]))
        out = self.activation_fuction(bias * weights[6] + (x * weights[7]) + (x2 * weights[8]))
        print(inp1, inp2, out)
        return out

    def fitness(self, weights):
        # Fitness is the inverse of the squared total error over the four XOR cases
        y1 = abs(0.0 - self.feedforward(0.0, 0.0, weights))
        y2 = abs(1.0 - self.feedforward(0.0, 1.0, weights))
        y3 = abs(1.0 - self.feedforward(1.0, 0.0, weights))
        y4 = abs(0.0 - self.feedforward(1.0, 1.0, weights))
        error = (y1 + y2 + y3 + y4) ** 2
        # print("Error: ", 1/error)
        return 1 / error

    def sortpopbest(self, pop):
        pop_with_fit = [(weights, self.fitness(weights)) for weights in pop]
        sorted_population = sorted(pop_with_fit, key=lambda weights_fit: weights_fit[1])  # worst -> best
        fits = []
        pop = []
        for i in sorted_population:
            pop.append(i[0])
            fits.append(i[1])
        return pop, fits

    def execute(self):
        pop = self.create_initial_population()
        for g in range(self.max_generations):  # maximum number of generations
            pop, fits = self.sortpopbest(pop)
            nova_pop = []
            for c in range(int(self.population_size / 2)):
                weights = pop[self.select(fits)]
                weights2 = pop[self.select(fits)]
                new_weights, new_weights2 = self.crossover(weights, weights2)
                new_weights = self.mutate(new_weights)
                new_weights2 = self.mutate(new_weights2)
                #print(fits)
                nova_pop.append(new_weights)  # append to the new population
                nova_pop.append(new_weights2)
            pop = nova_pop
            print(len(fits), fits)
Tags: python, numpy, genetic-algorithm, xor
1 Answer

Some input:

  • XOR is an easy problem. With a few hundred random initializations, some lucky ones should solve it right away (if "solved" means they output the correct sign). This is a good test to check that your initialization and feed-forward pass are correct, without having to debug the whole GA at once. Alternatively, you can hand-craft correct weights and biases and check that they work (a sketch of such weights follows after this list).
  • Your initial weights (uniform in -40 ... +40) are far too large. For XOR it might still work, but initial weights should be chosen so that most neurons are not saturated, yet not entirely inside the linear region of the sigmoid either (see the short example after this list).
  • Once your implementation works, have a look at a numpy implementation of the feed-forward pass to see how the same thing can be done with much less code.
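
On the first point, a minimal hand-crafted sanity check. It assumes the same 9-weight layout as the question's feedforward() (bias and two input weights per hidden unit, then bias and two hidden weights for the output unit); the concrete values are only one of many possible solutions.

    import math

    def sigmoid(net):
        return 1 / (1 + math.exp(-net))

    def feedforward(inp1, inp2, w):
        # Same 2-2-1 layout as in the question
        h1 = sigmoid(w[0] + inp1 * w[1] + inp2 * w[2])   # roughly an OR gate
        h2 = sigmoid(w[3] + inp1 * w[4] + inp2 * w[5])   # roughly an AND gate
        return sigmoid(w[6] + h1 * w[7] + h2 * w[8])     # OR and not AND -> XOR

    # [bias1, w11, w12, bias2, w21, w22, bias_out, w_h1, w_h2]
    hand_crafted = [-5, 10, 10, -15, 10, 10, -5, 10, -10]
    for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(a, b, round(feedforward(a, b, hand_crafted), 2))
    # Prints approximately: 0 0 0.01 / 0 1 0.99 / 1 0 0.99 / 1 1 0.01

If these values, dropped into your own feedforward(), do not reproduce that output, the bug is in the forward pass rather than in the GA.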
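
On the second point, a possible variant of create_initial_population() with a narrower range; the ±1 bound is an illustrative assumption, not a tuned value.

    import numpy as np

    def create_initial_population(population_size, number_weights):
        # Smaller initial weights keep the sigmoid units out of their flat,
        # saturated regions, so fitness differences between individuals stay visible
        return np.random.uniform(-1, 1, [population_size, number_weights])

    pop = create_initial_population(200, 9)  # e.g. the question's 200 individuals, 9 weights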