Concept of the stochastic gradient
- Differentiation of the objective function of two-stage SLP
- Stochastic perturbation approximation
- Likelihood approach
- Differentiation of integrals given by inclusion
- Projection of the stochastic gradient
Several estimators are examined for the stochastic gradient:
- Analytical approach (AA);
- Finite difference approach (FD);
- Likelihood ratio approach (LR);
- Simultaneous perturbation approximation.
Assume that the density of the random variable does not depend on the decision variable. Then the analytical stochastic gradient coincides with the gradient of the integrand (the random function):
Let us consider the two-stage SLP:
The stochastic analytical gradient is defined as
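The formulas themselves are missing from this extraction; a standard two-stage formulation consistent with the description (the symbols c, A, b, q, W, h, T and u* are my notation, not taken from the source) is:

```latex
\min_{x}\; c^{T}x + \mathbf{E}\,Q(x,\xi)
  \quad \text{s.t.}\quad Ax = b,\; x \ge 0,
\qquad
Q(x,\xi) = \min_{y}\{\, q^{T}y \;:\; Wy = h - Tx,\; y \ge 0 \,\},
```

where $\xi$ collects the random entries of $(q, h, T)$. By second-stage LP duality, an analytical stochastic gradient then takes the form $g(x,\xi) = c - T^{T}u^{*}(x,\xi)$, with $u^{*}$ an optimal dual solution of the second-stage problem.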
Let us approximate the gradient of the random function by finite differences. Then each i-th component of the stochastic gradient is computed as a difference quotient along the direction e_i, where e_i is the vector with zero components except the i-th one, equal to 1, and the perturbation step is some small value.
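The finite-difference estimator can be sketched as follows (a minimal illustration; the function names and the step `delta` are my assumptions, not from the source). Note that the same random sample `xi` is reused in every evaluation, so the difference estimates the gradient of the random function itself:

```python
import numpy as np

def fd_stochastic_gradient(f, x, xi, delta=1e-4):
    """Forward-difference estimate of the stochastic gradient.

    f(x, xi) is the random objective evaluated at decision x and
    random sample xi (illustrative names, not from the source).
    """
    n = len(x)
    g = np.zeros(n)
    f0 = f(x, xi)            # base value, same sample xi throughout
    for i in range(n):
        e = np.zeros(n)
        e[i] = 1.0           # unit vector: zeros except the i-th component
        g[i] = (f(x + delta * e, xi) - f0) / delta
    return g
```

This needs n + 1 evaluations of the random function per gradient estimate, which motivates the simultaneous perturbation approach below.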
In the simultaneous perturbation approach, the perturbation is a random vector whose components take the values 1 or -1 with probabilities p = 0.5, and the step is again some small value (Spall 1992).
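A sketch of this simultaneous perturbation estimator in the style of Spall (1992) follows; the names and the default step are illustrative assumptions. All n components are estimated from only two evaluations of the random function:

```python
import numpy as np

def spsa_gradient(f, x, xi, delta=1e-4, rng=None):
    """Simultaneous perturbation estimate of the stochastic gradient.

    d is a random vector of +-1 entries (each with probability 0.5);
    a single central difference along d yields all n components.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = rng.choice([-1.0, 1.0], size=len(x))      # Bernoulli +-1 perturbation
    diff = f(x + delta * d, xi) - f(x - delta * d, xi)
    return diff / (2.0 * delta * d)               # elementwise division by d_i
```

A single estimate is noisy but unbiased to first order; averaging over many perturbation draws recovers the gradient.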
Two-stage stochastic linear optimisation test problem. The dimensions of the task are as follows:
- the first stage has 10 rows and 20 variables;
- the second stage has 20 rows and 30 variables.
- Test data: http://www.math.bme.hu/~deak/twostage/l1/20x20.1/ (accessed 2006-01-20).
The methods of nonlinear stochastic programming are built on the concept of the stochastic gradient. Several methods exist to obtain the stochastic gradient by evaluating the objective function and the stochastic gradient from the same random sample.
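The projection step named in the outline can be sketched as one iteration of a projected stochastic-gradient method (the box feasible set, step size, and names are illustrative assumptions; any convex feasible set with a computable projection works the same way):

```python
import numpy as np

def projected_sg_step(x, g, step, lo, hi):
    """One projected stochastic-gradient iteration.

    Moves against the stochastic gradient estimate g, then projects
    the result back onto the box constraints lo <= x <= hi.
    """
    return np.clip(x - step * g, lo, hi)
```

For example, a step that would leave the box is clipped back to the nearest feasible point, which keeps every iterate feasible while the stochastic gradient drives the descent.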