The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables.
The timeline of computing presents a summary list of major developments in computing by date.
See, for example, Euclid's algorithm for finding the greatest common divisor of two numbers.
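Euclid's algorithm survives essentially unchanged as a few lines of modern code. A minimal sketch in Python (the function name and the sample inputs are illustrative):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace the pair (a, b)
    with (b, a mod b) until the remainder is zero; the last
    nonzero value is the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # prints 6
```

The same procedure Euclid described geometrically, as repeated subtraction of line segments, is here compressed into the remainder operation.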
By the High Middle Ages, the positional Hindu–Arabic numeral system had reached Europe, which allowed for systematic computation of numbers.
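What makes a positional system "systematic" is that every number is a fixed pattern of digits weighted by powers of the base, so the same digit-by-digit procedures work for any number. A brief illustration in Python (the function name is an assumption for this sketch):

```python
def to_digits(n: int, base: int = 10) -> list[int]:
    """Decompose a non-negative integer into its positional
    digits, most significant first, e.g. 1234 -> [1, 2, 3, 4]
    because 1234 = 1*10**3 + 2*10**2 + 3*10 + 4."""
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(r)
    return digits[::-1] or [0]

print(to_digits(1234))  # prints [1, 2, 3, 4]
```

Pen-and-paper algorithms such as long addition and long multiplication operate on exactly this digit representation, column by column.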
This was the earliest known computing aid and the most advanced system of calculation of its time, preceding Greek methods by some 2,000 years.

During this period, the representation of a calculation on paper allowed the evaluation of mathematical expressions, and the tabulation of mathematical functions such as the square root, the common logarithm (for use in multiplication and division), and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and well into the twentieth century researchers such as Enrico Fermi would cover random scraps of paper with calculations to satisfy their curiosity about an equation.

Digital computing is intimately tied to the representation of numbers. But long before abstractions like the number arose, there were mathematical concepts serving the purposes of civilization.
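The reason logarithm tables helped with multiplication is the identity log(xy) = log(x) + log(y): a calculator could look up two logarithms, add them by hand, and look up the antilogarithm of the sum. A minimal sketch of that procedure in Python, with the computer's `math.log10` standing in for a printed table (the function name is an assumption):

```python
import math

def log_multiply(x: float, y: float) -> float:
    """Multiply two positive numbers the way a log table was used:
    since log10(x*y) = log10(x) + log10(y), add the logarithms
    and take the antilogarithm (10 to the power of the sum)."""
    return 10 ** (math.log10(x) + math.log10(y))

# 31.4 * 2.72 is recovered (up to floating-point error)
print(log_multiply(31.4, 2.72))
```

A human computer worked the same way, trading one multiplication for two table look-ups and an addition, which is far less error-prone by hand.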
Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. In time the operations were formalized, and concepts about them became understood well enough to be stated formally, and even proven.