\subsection{Optimization process}
The optimization process searches for the best set of parameters within the user-defined parameter ranges. The best parameter set is the one that minimizes the optimization cost function, i.e., the parameters that produce a synthetic result closest to the observation. A physics-based ground motion simulation approach uses $Q$ as an input value, and different combinations of parameters (i.e., $C$, $\alpha$, and $\beta$) can generate the same $Q$ value. Consequently, the cost function can have an infinite number of local minima with acceptable accuracy. All of these results converge for the effective shear wave velocity and differ for other velocity ranges. Such a problem requires a global optimization method with a good search strategy across different parts of the domain. In this study, we use the genetic algorithm (GA), a derivative-free, single-objective method, to search for the optimum parameters \citep{Holland_1973}. GA is not a mathematically guided solution to the problem; rather, it is a stochastic, discrete, nonlinear, high-dimensional search algorithm. \citet{Man_1996} provided guidelines for implementing GA. Each individual in the population consists of the quality factor parameters, and its fitness is obtained by comparing the resulting signal with the target. The first population is chosen randomly within the defined parameter ranges. We generate a series of optimal results for each station; by converging to a common value, these results provide a good understanding of the effective shear wave velocity range at each station. The algorithm iteratively creates new populations and evaluates them with the cost function. The objective is to minimize the cost function value by searching for appropriate parameters.
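The initialization and ranking steps above can be sketched as follows. This is a minimal illustration, not the study's implementation (which uses MATLAB's optimization toolbox); the parameter bounds and the placeholder cost function are assumptions, since the actual ranges and the simulation-based misfit are not specified here.

```python
import random

# Illustrative bounds for the quality-factor parameters (C, alpha, beta);
# the actual user-defined ranges are assumptions, not taken from the text.
BOUNDS = {"C": (0.01, 1.0), "alpha": (0.0, 1.0), "beta": (-1.0, 1.0)}

def random_individual():
    """One candidate parameter set, drawn uniformly from the allowed ranges."""
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def cost(ind):
    # Placeholder misfit: the real cost function runs the physics-based
    # simulation and compares the synthetic signal to the observation.
    return sum(v * v for v in ind.values())

# First generation: random draws within the parameter ranges.
population = [random_individual() for _ in range(40)]
# Evaluate and rank: a lower cost means a closer match to observation.
population.sort(key=cost)
best = population[0]
```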
To drive the GA evolution cycle, two fundamental operators are required: crossover and mutation. We use the uniform crossover approach, which generates offspring from the parents based on a randomly generated crossover mask. The crossover mask determines which genes (i.e., $Q$ input parameters) of one parent are exchanged with the corresponding genes of the other. In this process, for each population and during each iteration, parents exchange parameters to generate offspring. Crossover tends to conserve the genetic information present in the strings.
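A uniform crossover with a random binary mask can be sketched as below; this is an illustrative implementation of the general operator, not the study's code, and the example parent values are made up.

```python
import random

def uniform_crossover(parent1, parent2):
    """Uniform crossover: a random binary mask decides, gene by gene,
    whether each offspring gene is copied from parent 1 or parent 2."""
    mask = [random.randint(0, 1) for _ in parent1]
    return [g1 if m else g2 for m, g1, g2 in zip(mask, parent1, parent2)]

# Hypothetical parents, each a (C, alpha, beta) triple.
p1 = [0.5, 0.3, -0.2]
p2 = [0.9, 0.1, 0.4]
child = uniform_crossover(p1, p2)
```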
Mutation, on the other hand, is not a conservative operator; it is capable of radically generating new building blocks. Mutation randomly chooses a gene in an individual and substitutes it with a random value from the allowed range. See Fig.~\ref{fig:Figure_1} for a schematic illustration of the crossover and mutation operators. Upon creating a new population, the algorithm calculates the costs, sorts the population by score, and crosses over the best parameters with part of the other good parameters in the population. Since the best chromosome of a population may fail to reproduce better offspring in the next generation, GA is usually combined with an elitist strategy, in which one or more of the best chromosomes are copied into the succeeding generation. The next generation (offspring) is thus a combination of the best solutions of the previous generation, the mutated individuals, and the crossover offspring. The cycle of evolution repeats until the desired termination criterion is reached. In this study, we use the adaptive-feasible mutation function and the scattered crossover function (see \citet{Matlab_optim} for more details).
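The assembly of a new generation from elites, crossover offspring, and mutated individuals can be sketched as below. This is a hedged illustration: the bounds, elite count, and crossover fraction are assumed values (the elite count and crossover fraction mirror typical MATLAB `ga` defaults), and the actual study uses MATLAB's adaptive-feasible mutation rather than the simple uniform resampling shown here.

```python
import random

# Assumed parameter bounds (illustrative, not from the text).
BOUNDS = {"C": (0.01, 1.0), "alpha": (0.0, 1.0), "beta": (-1.0, 1.0)}

def mutate(individual):
    """Simple mutation: pick one gene at random and replace it with a
    fresh random value drawn from that gene's allowed range."""
    child = dict(individual)
    gene = random.choice(list(BOUNDS))
    lo, hi = BOUNDS[gene]
    child[gene] = random.uniform(lo, hi)
    return child

def next_generation(ranked, elite_count=2, crossover_fraction=0.8):
    """Next generation = elites (copied unchanged) + crossover offspring
    + mutated individuals, following the elitist strategy in the text.
    `ranked` must be sorted best-first by cost."""
    n = len(ranked)
    elites = ranked[:elite_count]
    n_cross = int(crossover_fraction * (n - elite_count))
    crossed = []
    for _ in range(n_cross):
        p1, p2 = random.sample(ranked[: n // 2], 2)  # favour the fitter half
        # Uniform crossover, gene by gene.
        crossed.append({k: random.choice([p1[k], p2[k]]) for k in BOUNDS})
    mutated = [mutate(random.choice(ranked))
               for _ in range(n - elite_count - n_cross)]
    return elites + crossed + mutated
```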
We define three termination criteria: (1) a maximum number of iterations, after which the optimization process terminates regardless of the quality of the results; (2) a fitness limit, reached when the value of the fitness function for the best point in the current population is less than or equal to the limit; and (3) a stall condition, in which successive iterations fail to improve on the best score.
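The three criteria reduce to a simple disjunction, sketched below. The default thresholds here (20 generations, a fitness limit of $10^{-4}$, 50 stall generations) follow the solver options listed in the commented block at the end of this section; the function itself is an illustrative paraphrase, not the toolbox's internal logic.

```python
def should_terminate(generation, best_cost, stall_count,
                     max_generations=20, fitness_limit=1e-4, stall_limit=50):
    """Stop when any of the three criteria is met:
    (1) the generation cap is reached,
    (2) the best cost drops to or below the fitness limit, or
    (3) too many successive generations pass without improvement."""
    return (generation >= max_generations
            or best_cost <= fitness_limit
            or stall_count >= stall_limit)
```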
% PopulationType: 'doubleVector'
% PopInitRange: [2×3 double]
% PopulationSize: 40
% EliteCount: 2
% CrossoverFraction: 0.8000
% ParetoFraction: []
% MigrationDirection: 'forward'
% MigrationInterval: 20
% MigrationFraction: 0.2000
% Generations: 20
% TimeLimit: 600
% FitnessLimit: 1.0000e-04
% StallGenLimit: 50
% StallTest: 'averageChange'
% StallTimeLimit: 300
% TolFun: 1.0000e-06
% TolCon: 1.0000e-03
% InitialPopulation: [1 0.8995 -1]
% InitialScores: [0×1 double]
% NonlinConAlgorithm: 'auglag'
% InitialPenalty: 10
% PenaltyFactor: 100
% PlotInterval: 1
% CreationFcn: @gacreationuniform
% FitnessScalingFcn: @fitscalingrank
% SelectionFcn: @selectionstochunif
% CrossoverFcn: @crossoverscattered
% MutationFcn: @mutationadaptfeasible
% Display: 'final'
% Vectorized: 'off'
% UseParallel: 1
% UserSpecPopInitRange: 0
% MultiObjective: 0
% Verbosity: 1