Method of increments
Definitions
(Math.) A calculus founded on the properties of the successive values of variable quantities and their differences or increments. It differs from the method of fluxions in treating these differences as finite rather than infinitely small, and is equivalent to the calculus of finite differences.
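As a brief illustration in modern notation (the symbols Δ and h are assumptions of this sketch, not part of the definition above): the method of increments works with the finite forward difference

\[
\Delta f(x) = f(x + h) - f(x),
\]

where the increment h remains finite, whereas the method of fluxions (the differential calculus) passes to the limit

\[
f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h}.
\]

Keeping h finite is what makes the method equivalent to the calculus of finite differences.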