Depending upon where one is from, a Decimal Point, a Decimal Comma, or *both* notations are used as a Decimal Mark to separate the integer part of a number from its fractional part. Decimal notation has shifted over the centuries, evolving by trial and error into its modern usage. Below are some examples from Florian Cajori's *A History of Mathematical Notations*, which provides an excellent account of the steps that brought mathematics to the current state of Decimal Point and Decimal Comma usage.
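The same digits can mean different numbers under the two conventions, which is why software must know which Decimal Mark a string uses before parsing it. A minimal sketch in Python (the `parse_decimal` helper and its behavior are illustrative assumptions, not a standard API):

```python
def parse_decimal(text: str, mark: str = ".") -> float:
    """Parse a number written with the given decimal mark ('.' or ',')."""
    if mark == ",":
        # Decimal Comma convention: '.' groups thousands, ',' marks the fraction.
        text = text.replace(".", "").replace(",", ".")
    else:
        # Decimal Point convention: ',' groups thousands.
        text = text.replace(",", "")
    return float(text)

print(parse_decimal("2,5", mark=","))       # 2.5
print(parse_decimal("1.234,56", mark=","))  # 1234.56
print(parse_decimal("1,234.56", mark="."))  # 1234.56
```

Note that "1.234" alone is ambiguous: it is one-and-a-fraction under one convention and one thousand two hundred thirty-four under the other, so the convention must always be supplied out of band.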

According to Florian Cajori, the confusion started with John Napier's *Rabdologia*, published in 1617:

In section 5 he says: "Whatever is written after the period is a fraction," and he actually uses the period. But in the passage we quoted from Rabdologia he speaks of a "period or comma" and actually uses a comma in his illustration. Thus, Napier vacillated between the period and the comma; mathematicians have been vacillating in this matter ever since.

A History of Mathematical Notations | Florian Cajori

Florian Cajori centers the Decimal Comma/Decimal Point divide on two titans of mathematics born four years apart, Sir Isaac Newton and Gottfried Leibniz, and their preferred notation for the multiplication mark. Newton preferred "x", the multiplication mark commonly used in England at the time, and thus had no concerns about using the Decimal Point; Leibniz, the extremely popular polymath from Germany, preferred the "•" multiplication mark and thus championed the Decimal Comma.

In the eighteenth century, trials of strength between the comma and the dot as the separatrix were complicated by the fact that Leibniz had proposed the dot as the symbol of multiplication, a proposal which was championed by the German textbook writer Christian Wolf and which met with favorable reception throughout the Continent.

A History of Mathematical Notations | Florian Cajori

With the progression of colonialism, the United States of America went on to use a mix of Decimal Comma and Decimal Point notation due to influences from both France and England:

Nor did these two dots introduce confusion, because (if we may use a situation suggested by Shakespeare) the symbols were placed in Romeo and Juliet positions, the Juliet dot stood on high, above Romeo's reach, her joy reduced to a decimal over his departure, while Romeo below had his griefs multiplied and was "a thousand times the worse" for want of her light. Thus, 2•5 means 2 5/10, while 2.5 equals 10.

A History of Mathematical Notations | Florian Cajori

However, America would eventually settle on the reverse of the Romeo & Juliet notation commonly used in England:

Sherwin writes: "To distinguish the sign of Multiplication from the period used as a decimal point, the latter is elevated by inverting the type, while the former is larger and placed down even with the lower extremities of the figures or letters between which it stands." In 1881 George Bruce Halsted placed the decimal point halfway up and the multiplication point low.

A History of Mathematical Notations | Florian Cajori

Florian Cajori sums up the timeline with the following paragraphs:

That no general agreement in the notation for decimal fractions exists at the present time is evident from the publication of the International Mathematical Congress in Strasbourg (1920), where decimals are expressed by commas as in 2,5 and also by dots as in 2.5. In that volume a dot, placed at the lower border of a line, is used also to indicate multiplication.

The opinion of an American committee of mathematicians is expressed in the following: "Owing to the frequent use of the letter x, it is preferable to use the dot (a raised period) for multiplication in the few cases in which any symbol is necessary. For example, in a case like 1•2•3 . . . . (x−1)•x, the center dot is preferable to the symbol ×; but in cases like 2a(x−a) no symbol is necessary. The committee recognizes that the period (as in a.b) is more nearly international than the center dot (as in a•b); but inasmuch as the period will continue to be used in this country as a decimal point, it is likely to cause confusion, to elementary pupils at least, to attempt to use it as a symbol for multiplication."

A History of Mathematical Notations | Florian Cajori

Below is a googleVis visualization from DataCamp that shows which countries use Decimal Point notation, Decimal Comma notation, or both:

Decimal Comma or Point? A googleVis visualization | DataCamp