In statistics, nonlinear regression is the problem of inference for a model
- y = f(x, θ)
based on multidimensional x, y data, where f is some nonlinear function with respect to unknown parameters θ. At a minimum, we may wish to obtain the parameter values associated with the best-fitting curve (usually, in the least-squares sense). Statistical inference may also be needed, such as confidence intervals for the parameters, or a test of whether or not the fitted model agrees well with the data.
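As a sketch of the ideas above, the following fits a curve that is nonlinear in its parameters by iterative least squares. The model f(x, θ) = a·exp(−b·x) is a hypothetical example chosen for illustration, not one prescribed by the text; scipy's curve_fit is one of many tools that could be used.

```python
import numpy as np
from scipy.optimize import curve_fit

def f(x, a, b):
    # Nonlinear in the parameter b: the partial derivative of f with
    # respect to b depends on b itself, so no closed-form least-squares
    # solution exists and an iterative optimizer is required.
    return a * np.exp(-b * x)

# Simulated data from the model with a little noise (illustrative only).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 50)
y = f(x, 2.5, 1.3) + 0.05 * rng.standard_normal(x.size)

# Least-squares estimates of (a, b), starting from an initial guess.
theta, cov = curve_fit(f, x, y, p0=(1.0, 1.0))

# The diagonal of the parameter covariance matrix gives approximate
# standard errors, from which rough confidence intervals follow.
se = np.sqrt(np.diag(cov))
```

Because the optimization is iterative, the result can depend on the starting values p0, unlike the closed-form linear case discussed next.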
The scope of nonlinear regression is clarified by considering the case of polynomial regression, which actually is best not treated as a case of nonlinear regression. When f takes a form such as
- f(x) = ax² + bx + c
our function f is nonlinear as a function of x, but it is linear as a function of the unknown parameters a, b, and c. The latter is the sense of "linear" in the context of statistical regression modeling. The appropriate computational procedures for polynomial regression are those of (multiple) linear regression, with two predictor variables x and x², say. Nevertheless, it is occasionally suggested that nonlinear regression is needed for fitting polynomials. Practical consequences of this misunderstanding include the use of an iterative nonlinear optimization procedure when the solution is actually available in closed form. Also, capabilities for linear regression are likely to be more comprehensive in some software than those for nonlinear regression.
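The closed-form solution mentioned above can be sketched as follows: because f(x) = ax² + bx + c is linear in (a, b, c), the least-squares fit reduces to a linear system, solved here with a general linear least-squares routine (the data values are made up for illustration).

```python
import numpy as np

# Illustrative data lying near the quadratic y = 2x² + 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8])

# Design matrix with columns x², x, and 1: the two derived predictor
# variables plus an intercept. The model is linear in the coefficients,
# so ordinary (multiple) linear regression applies.
X = np.column_stack([x**2, x, np.ones_like(x)])

# Closed-form least-squares solution; no iterative optimizer needed.
a, b, c = np.linalg.lstsq(X, y, rcond=None)[0]
```

The same coefficients would be returned by any linear-regression routine given the predictors x² and x, which is why treating this as nonlinear regression gains nothing.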