Friday, December 23, 2022

 

Reducing Trials and Errors 

Model: When trials and errors produce scattered results, an objective function that measures the cost or benefit of each candidate solution helps the search converge. If the number of samples is large, a batch analysis mode is recommended. The objective function can be minimized or maximized with gradient descent methods, but gradient descent can get trapped in a local minimum. Simulated annealing can escape local minima because it will sometimes accept a worse solution with a certain probability. At each step, the cost of the current solution and the cost of a randomly perturbed neighbor are computed. A better neighbor is always accepted; a worse one is accepted with probability p = e^(-(new cost - current cost)/T). The temperature T is lowered by a cooling factor on every iteration, so worse moves become less and less likely as the search proceeds.
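For contrast, here is a minimal gradient descent sketch using a numerical derivative; the cost function, learning rate, and starting points are illustrative assumptions, not part of this post's implementation. Started near the wrong basin, plain gradient descent settles into the nearest local minimum, which is exactly the failure mode simulated annealing is designed to escape.

import math

def gradient_descent(f, x, lr=0.01, steps=1000, h=1e-6):
    # Follow the negative numerical gradient of f from the starting point x
    for _ in range(steps):
        grad = (f(x + h) - f(x - h)) / (2 * h)
        x -= lr * grad
    return x

# Illustrative multimodal cost: global minimum near x = -1.3,
# local minimum near x = 3.8
def cost(x):
    return x * x + 10 * math.sin(x)

print(gradient_descent(cost, 5.0))   # stuck in the local minimum near 3.8
print(gradient_descent(cost, -1.0))  # reaches the global minimum near -1.3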
A sample implementation of simulated annealing follows: 

import math
import random

def annealingoptimize(domain, costf, T=10000.0, cool=0.95, step=1):
    # Initialize the solution with a random value in each dimension's range
    vec = [float(random.randint(domain[i][0], domain[i][1]))
           for i in range(len(domain))]
    while T > 0.1:
        # Choose one of the indices at random
        i = random.randint(0, len(domain) - 1)
        # Choose a direction and distance to change it
        direction = random.randint(-step, step)
        # Create a new candidate with one of the values changed
        vecb = vec[:]
        vecb[i] += direction
        # Clamp the changed value to its allowed range
        if vecb[i] < domain[i][0]:
            vecb[i] = domain[i][0]
        elif vecb[i] > domain[i][1]:
            vecb[i] = domain[i][1]

        # Calculate the current cost and the new cost
        ea = costf(vec)
        eb = costf(vecb)
        # Acceptance probability: greater than 1 when the new cost is
        # lower, and shrinking as the temperature drops
        p = pow(math.e, -(eb - ea) / T)
        # Accept if it is better, or if it makes the probability cutoff
        if eb < ea or random.random() < p:
            vec = vecb
        # Decrease the temperature
        T = T * cool
    return vec
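
As a quick check, here is a hypothetical usage sketch; the domain and cost function below are made-up examples, not part of the implementation above. The search looks for ten values in [0, 9] whose sum is as close as possible to 30.

# Hypothetical example: ten values in [0, 9] whose sum should be near 30
domain = [(0, 9)] * 10

def sumcost(vec):
    # Cost is the distance of the sum from the target value of 30
    return abs(sum(vec) - 30)

solution = annealingoptimize(domain, sumcost)
print(solution, sumcost(solution))

With the defaults, the temperature falls from 10000.0 to 0.1 over roughly 225 iterations; raising the cooling factor toward 1.0 extends the search and improves the odds of escaping a poor basin, at the cost of more cost-function evaluations.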

 

 
