Second Derivative Methods
For binary, ordered, censored, and count models, EViews can estimate the model using Newton-Raphson or quadratic hill-climbing.
Newton-Raphson
Candidate values for the parameters, θ(i+1), may be obtained using the method of Newton-Raphson by linearizing the first order conditions at the current parameter values, θ(i):

θ(i+1) = θ(i) − H(i)⁻¹ g(i)   (32.2)

where g(i) is the gradient vector ∂l/∂θ, and H(i) is the Hessian matrix ∂²l/∂θ∂θ′, both evaluated at θ(i).
If the function is quadratic, Newton-Raphson will find the maximum in a single iteration. If the function is not quadratic, the success of the algorithm will depend on how well a local quadratic approximation captures the shape of the function.
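As a minimal sketch of the update in (32.2), consider a scalar example: maximizing the Poisson log-likelihood l(θ) = Σy·θ − n·exp(θ) in θ = log(λ), where the update reduces to θ − g/H. The data and function names below are illustrative, not part of EViews:

```python
import math

def newton_raphson(grad, hess, theta, tol=1e-10, max_iter=50):
    """Scalar Newton-Raphson: repeat theta <- theta - g/H until the step is tiny."""
    for _ in range(max_iter):
        step = grad(theta) / hess(theta)
        theta = theta - step
        if abs(step) < tol:
            break
    return theta

y = [2, 3, 1, 4, 2, 3]               # hypothetical count data
n, s = len(y), sum(y)
g = lambda t: s - n * math.exp(t)    # gradient dl/dtheta
H = lambda t: -n * math.exp(t)       # Hessian d2l/dtheta2

theta_hat = newton_raphson(g, H, theta=0.0)
lam_hat = math.exp(theta_hat)        # MLE of lambda is the sample mean, 2.5
```

Because this log-likelihood is globally concave, the iterations converge quickly even from a poor starting value.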
Quadratic hill-climbing (Goldfeld-Quandt)
This method, which is a straightforward variation on Newton-Raphson, is sometimes attributed to Goldfeld and Quandt. Quadratic hill-climbing modifies the Newton-Raphson algorithm by adding a correction matrix (or ridge factor) to the Hessian. The quadratic hill-climbing updating algorithm is given by:
θ(i+1) = θ(i) − (H(i) − αI)⁻¹ g(i)   (32.3)

where I is the identity matrix and α is a positive number that is chosen by the algorithm.
The effect of this modification is to push the parameter estimates in the direction of the gradient vector. The idea is that when we are far from the maximum, the local quadratic approximation to the function may be a poor guide to its overall shape, so we may be better off simply following the gradient. The correction may provide better performance at locations far from the optimum, and allows for computation of the direction vector in cases where the Hessian is near singular.
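The following scalar sketch illustrates the idea behind (32.3); the step-selection rule (doubling α until the ridge-corrected step moves uphill) is an illustrative choice, not the rule EViews uses. The objective is concave only near its maximum, so plain Newton-Raphson would move in the wrong direction from a distant start:

```python
import math

def f(t):      # toy objective; non-concave for |t| > 1
    return -math.log(1.0 + t * t)

def grad(t):
    return -2.0 * t / (1.0 + t * t)

def hess(t):   # positive (wrong-signed for maximization) when |t| > 1
    return -2.0 * (1.0 - t * t) / (1.0 + t * t) ** 2

def qhc_maximize(t, tol=1e-8, max_iter=200):
    """Scalar quadratic hill-climbing: step is -(H - alpha)^{-1} g."""
    for _ in range(max_iter):
        g, h = grad(t), hess(t)
        if abs(g) < tol:
            break
        alpha = 0.0
        while True:
            denom = h - alpha            # scalar analogue of H - alpha*I
            if denom < 0.0:              # corrected "Hessian" is negative definite
                cand = t - g / denom
                if f(cand) >= f(t):      # accept only non-decreasing steps
                    break
            alpha = 1.0 if alpha == 0.0 else 2.0 * alpha   # grow the ridge factor
        if abs(cand - t) < tol:
            t = cand
            break
        t = cand
    return t

t_hat = qhc_maximize(3.0)   # starts in the non-concave region, ends near 0
```

Note that as α grows, the step direction approaches the gradient direction g/α, which is the behavior described above for points far from the optimum.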
For models which may be estimated using second derivative methods, EViews uses quadratic hill-climbing as its default method. You may elect to use traditional Newton-Raphson, or the first derivative methods described below, by selecting the desired algorithm in the Options menu.
Note that asymptotic standard errors are always computed from the unmodified Hessian once convergence is achieved.
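In the scalar case this means the reported variance is the inverse of minus the unmodified Hessian at the converged estimate; the ridge factor α plays no role once convergence is reached. A sketch, reusing the Poisson-in-log(λ) setup from above (illustrative values, not EViews output):

```python
import math

def asymptotic_se(hess_at_mle):
    """Standard error from the unmodified Hessian: sqrt of -H^{-1} (scalar case)."""
    return math.sqrt(-1.0 / hess_at_mle)

# Poisson log-likelihood in theta = log(lambda): H(theta) = -n * exp(theta),
# so at theta_hat = log(lambda_hat) the s.e. is sqrt(1 / (n * lambda_hat)).
n, lam_hat = 6, 2.5
se = asymptotic_se(-n * lam_hat)
```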