In the definition of differential, what is the relationship between dx and Δx? It feels like they are equivalent, but...
The relationship between the differential \(dx\) and the increment \(\Delta x\) is foundational to differential calculus, yet the two are not equivalent. The differential \(dx\) is formally defined as an independent variable, and for the independent variable \(x\) it is conventionally *assigned* the same value as the increment \(\Delta x\). The conceptual distinction appears when a function \(y = f(x)\) is involved. The differential \(dy\) is defined as \(dy = f'(x) \, dx\), a linear approximation of the change in the function, whereas \(\Delta y = f(x + \Delta x) - f(x)\) is the actual, exact change. The core relationship is that \(dy\) approximates \(\Delta y\) well when \(\Delta x\) (equated with \(dx\)) is small. So while \(dx\) can be set equal to \(\Delta x\), the resulting \(dy\) is only an approximation of \(\Delta y\), with the quality of the approximation depending on the function's differentiability and the size of the increment.
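A quick numerical sketch makes the distinction concrete. The function \(f(x) = x^2\), the point \(x = 2\), and the increment \(\Delta x = 0.1\) are illustrative choices, not taken from any particular text:

```python
# Compare the exact change Δy = f(x+Δx) - f(x) with the differential
# dy = f'(x) dx for f(x) = x**2 at x = 2, with dx set equal to Δx = 0.1.

def f(x):
    return x ** 2

def f_prime(x):
    return 2 * x

x = 2.0
dx = 0.1  # by convention, dx is assigned the same value as Δx

delta_y = f(x + dx) - f(x)  # exact change in the function
dy = f_prime(x) * dx        # linear (tangent-line) approximation

print(delta_y)  # ≈ 0.41
print(dy)       # 0.4
```

Here \(\Delta y = 0.41\) while \(dy = 0.4\): the inputs \(dx\) and \(\Delta x\) agree exactly, but the outputs differ by the quadratic term \((\Delta x)^2 = 0.01\).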
The mechanism underlying this distinction is the derivative itself. For a function \(f\) differentiable at a point \(x\), the derivative \(f'(x)\) is defined as the limit of the difference quotient \(\Delta y / \Delta x\) as \(\Delta x \to 0\). The differential \(dy = f'(x) \, dx\) captures the instantaneous rate of change multiplied by an arbitrary change \(dx\) in the input. The approximation error \(\Delta y - dy\) tends to zero faster than \(\Delta x\) as \(\Delta x \to 0\); this is the precise meaning of differentiability. The formalism lets us treat \(dx\) and \(\Delta x\) as interchangeable in the *input* of the linear approximation, but the outputs \(dy\) and \(\Delta y\) differ unless the function is linear. In applications, this permits the crucial step of replacing a small but finite change \(\Delta x\) with the differential \(dx\) to estimate the resulting change in the function via its tangent line.
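The "faster than \(\Delta x\)" claim can be checked numerically: for \(f(x) = x^2\) at \(x = 1\), the error is exactly \(\Delta y - dy = (\Delta x)^2\), so the ratio \((\Delta y - dy)/\Delta x\) equals \(\Delta x\) and shrinks to zero with it. The specific function and point are illustrative assumptions:

```python
# Show that (Δy - dy)/Δx -> 0 as Δx -> 0, i.e. the error Δy - dy
# is o(Δx). For f(x) = x**2, the error is exactly (Δx)**2.

def f(x):
    return x ** 2

x = 1.0
fp = 2.0  # f'(1) = 2

for dx in (0.1, 0.01, 0.001):
    delta_y = f(x + dx) - f(x)
    dy = fp * dx
    ratio = (delta_y - dy) / dx  # equals dx here, tending to 0
    print(dx, ratio)
```

Each tenfold reduction in \(\Delta x\) reduces the ratio tenfold as well, which is what distinguishes differentiability from mere continuity.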
The implications of this relationship are significant for both theoretical and applied mathematics. In analysis, it rigorously supports linearization and the manipulation of \(dy/dx\) as a ratio of differentials, which underpins techniques like separation of variables in differential equations and integration by substitution. In practice, it enables error estimation and sensitivity analysis across engineering and the sciences, where the formula \(\Delta y \approx f'(x) \, \Delta x\) is used to propagate uncertainties. The persistent feeling that \(dx\) and \(\Delta x\) are equivalent stems from their deliberate identification in the definition, made precisely to enable this approximation; what makes the differential powerful is that the inputs are identified while the outputs remain distinct. Understanding this boundary—that \(dx = \Delta x\) is a deliberate assignment for the independent variable, while \(dy \approx \Delta y\) is an approximation for the dependent variable—clarifies the operational value of differentials without conflating them with finite differences.
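As a sketch of the error-propagation use case, consider estimating the uncertainty in a circle's area \(A = \pi r^2\) from an uncertainty \(\Delta r\) in a measured radius, via \(\Delta A \approx A'(r)\,\Delta r = 2\pi r \, \Delta r\). The radius value and tolerance below are invented for illustration:

```python
import math

# Propagate a measurement uncertainty through A = π r² using the
# differential: ΔA ≈ A'(r) Δr = 2 π r Δr.

r = 10.0        # measured radius (hypothetical value)
delta_r = 0.05  # uncertainty in the radius measurement (hypothetical)

dA = 2 * math.pi * r * delta_r                        # differential estimate
exact = math.pi * (r + delta_r) ** 2 - math.pi * r**2  # actual change in area

print(dA)     # ≈ 3.1416
print(exact)  # ≈ 3.1494
```

The two values differ only by the second-order term \(\pi (\Delta r)^2\), so for small tolerances the differential gives an accurate, and much simpler, uncertainty estimate.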