In metrology, **linearity** is actually a measure of the nonlinearity of the measurement instrument. When we talk about sensitivity, we assume that the input/output characteristic of the instrument is approximately linear. In practice, however, it is normally nonlinear, as shown in the figure below.

Linearity is defined as the maximum deviation of the output of the measuring system from a specified straight line, applied to a plot of the measured (output) values versus the measurand (input) values.

\[\textrm{Linearity}=\dfrac{\Delta O}{O_{max}-O_{min}}\]

\[\Delta O=\max(\Delta O_1,\Delta O_2)\]

where \(\Delta O_1\) and \(\Delta O_2\) are the maximum deviations of the output above and below the reference straight line, respectively, and \(O_{max}\) and \(O_{min}\) are the maximum and minimum output values over the operating range.
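As a minimal sketch of the formula above, the following computes the linearity figure for a set of data points, given a reference straight line. The function name and the callable `line` parameter are illustrative choices, not from the text:

```python
def linearity_percent(inputs, outputs, line):
    """Linearity as the maximum deviation of the measured outputs
    from a reference straight line, expressed as a percentage of
    the output span (O_max - O_min).

    `line` is any callable returning the reference-line output for
    a given input value (e.g. a best-fit or terminal line).
    """
    # Deviation of each data point from the reference line;
    # taking the overall maximum covers both Delta O_1 (above the
    # line) and Delta O_2 (below the line).
    deviations = [abs(o - line(x)) for x, o in zip(inputs, outputs)]
    delta_o = max(deviations)            # Delta O
    span = max(outputs) - min(outputs)   # O_max - O_min
    return 100.0 * delta_o / span


# Example: outputs that deviate slightly from the ideal line y = x
print(linearity_percent([0, 1, 2], [0.0, 1.1, 2.0], lambda x: x))  # 5.0
```

Here a 0.1 maximum deviation over a 2.0 output span gives a linearity of 5% of the operating range, matching the percentage specification discussed below.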

In order to obtain accurate measurement readings, a high degree of linearity should be maintained in the instrument, or efforts must be made to minimize linearity errors. A better degree of linearity also makes the instrument easier to calibrate. In practice, however, only approximate linearity is achieved, as there is always some small variance associated with the measuring system. Hence, the expected linearity of the instrument is usually specified as a percentage of the operating range.

Before making any interpretation or comparison of the linearity specifications of the measuring instrument, it is necessary to define the exact nature of the reference straight line adopted, as several lines can be used as the reference of linearity. The most common lines are as follows:

- **Best-fit line.** The plot of the output values versus the input values with the best line fit. The line of best fit is the most common way to show the correlation between two variables. This line, also known as the trend line, is drawn through the centre of a group of data points on a scatter plot. The best-fit line may pass through all of the points, some of the points, or none of the points.
- **End point line.** This is employed when the output is bipolar. It is the line drawn by joining the end points of the data plot, without any consideration of the origin.
- **Terminal line.** When the line is drawn from the origin to the data point at full-scale output, it is known as the terminal line.
- **Least squares line.** This is the most preferred and extensively used method in regression analysis. Carl Friedrich Gauss (1795) gave the earliest description of the least squares method. It is a statistical technique and a more precise way of determining the line of best fit for a given set of data points. The best-fit line is drawn through a number of data points by minimizing the sum of the squares of the deviations of the data points from the line, hence the name least squares. The line is specified by an equation relating the input value to the output value, determined from the set of data points.