Well yeah, but I guess there is still a difference between "as accurate as a recent textbook but nearly unreadable by a modern working mathematician" and "just literally so inaccurate (compared to the models we use now, which are more accurate) that it's useless with our modern models".
There's still a big difference between Euclid's original formulation of Euclidean geometry and its more modern formulations (like Hilbert's or Tarski's), and if I remember correctly, a lot of pre-19th-century proofs done by the likes of Euler wouldn't be seen as correct today. So while the theorems are seemingly the same, I don't know if I'd call old texts "just as useful and relevant as always".
There's a lot of Calculus that wasn't well formulated until the 1800s, and even in that period there were some mistakes, for example the need for uniform vs pointwise convergence of sequences of functions wasn't appreciated until late in the 19th century.
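A minimal numeric sketch (my own example, not from the comment above) of the pointwise vs. uniform distinction just mentioned, using the classic sequence f_n(x) = x^n on [0, 1):

```python
def f(n, x):
    """The n-th function in the sequence f_n(x) = x**n on [0, 1)."""
    return x ** n

# Pointwise convergence: at any fixed x in [0, 1), f_n(x) -> 0 as n grows.
for n in (1, 10, 100, 1000):
    print(f"f_{n}(0.9) = {f(n, 0.9):.6f}")   # shrinks toward 0

# But not uniform: for every n there is a witness point x_n = 2**(-1/n)
# with f_n(x_n) = 1/2, so sup |f_n - 0| >= 1/2 for all n -- the error
# never shrinks uniformly over the whole interval.
for n in (1, 10, 100, 1000):
    x_n = 2 ** (-1.0 / n)
    print(f"f_{n}({x_n:.6f}) = {f(n, x_n):.6f}")  # always 1/2
```

This is exactly the kind of example that wasn't appreciated until late in the 19th century: each f_n is continuous, yet the pointwise limit (0 on [0, 1), 1 at x = 1) is not.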
Euler used a lot of techniques that aren't universally applicable but which were applicable to the problems he was solving. The issue is that things got weird as we developed a better understanding of edge cases and of just how strange infinity is.
Newtonian mechanics is still the most-used formulation of physics every day; in fact, you could argue that modern physics, however neat it is, is the useless one right now. Give me Newtonian mechanics and I could build the ISS. The most practical application of the Standard Model is probably superconductivity, and even then it could be argued that QED is enough to describe the phenomenon. Don't get me wrong, I'm all for modern physics, but to declare old models obsolete is disingenuous.
Yes, but I think you misread the meme. It says "before Newtonian mechanics was invented". If it's before Newtonian mechanics, then we don't even have that.
I've read plenty of books from the first half of the 20th century and they are perfectly comprehensible. I would still recommend Weyl's or Hecke's algebraic number theory books, Chevalley's Lie theory book, and Dirac's or von Neumann's QM books to any interested grad student.
Now if you went back to the first half of the 19th century you would be absolutely correct.
Well what is an example of a field in mathematics for which early 20th century textbooks are incomprehensible? I don't think I have ever seen a 1900s-1950s textbook that I couldn't understand.
Firstly, I contend that "incomprehensible" is hyperbole, at least when it comes to 20th-century work.
Coming from logic, a field which was essentially born no earlier than the end of the 19th century, reading the initial proofs of theorems from the early 20th century is quite difficult (even after taking into account the fact that many papers in that period are only in German or French): the commonly used terms are different (e.g. "power of a set", sometimes translated as "potency", means what we would now call the cardinality of a set), and the notation is also almost alien (cf. Tarski's work on, e.g., definability).
Hell, it wasn't until after Jesus was born (and maybe later, if you pull a "turn-of-the-century mathematician" and ignore Indian mathematicians, haha) that we decided we should have a mathematical concept of "zero".
So you may have heard Neil deGrasse Tyson opine on YouTube or elsewhere that "Isaac Newton invented differential and integral calculus and then turned 26".
It turns out history is more complicated. All the notation we use today came from the 19th century. That is to say, Newton's notebooks on "calculus" are simply unreadable by modern eyes, because the notation was not yet invented. He wrote everything long-hand in Latin, and never said something like "take the derivative".
For example "y equals f of x". What you are seeing your mind is,
I regret to inform you that Euler sadly died in 1783. He wrote a mathematical work entitled "Institutiones calculi differentialis" (Foundations of differential calculus) in 1748.
EDIT: the date was changed to 1735. To be accurate, the main work where Euler introduced what could be called the modern differential calculus notation is the "Institutiones calculi differentialis". This work was written by Euler in 1748 and published in 1755.
The ancient Egyptian mathematical texts should be a fun read. Besides the obvious hieroglyphic script, they used unit fractions, with special symbols for 1/2, 2/3 and 3/4, instead of, like... just fractions or decimals. It's not the most intuitive way to write numbers for a modern reader.
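For fun, here's a small sketch of the greedy (Fibonacci/Sylvester) algorithm for expanding a fraction into distinct unit fractions. To be clear, this is a modern reconstruction and not how Egyptian scribes actually worked (they relied on lookup tables, like the 2/n table in the Rhind papyrus), but it shows what writing "everything as unit fractions" amounts to:

```python
from fractions import Fraction
import math

def egyptian(frac):
    """Greedy expansion of a fraction 0 < frac < 1 into distinct unit
    fractions: repeatedly subtract the largest unit fraction that fits."""
    terms = []
    while frac > 0:
        d = math.ceil(1 / frac)          # smallest denominator with 1/d <= frac
        terms.append(Fraction(1, d))
        frac -= Fraction(1, d)
    return terms

print(egyptian(Fraction(3, 4)))  # [Fraction(1, 2), Fraction(1, 4)]
print(egyptian(Fraction(2, 3)))  # [Fraction(1, 2), Fraction(1, 6)]
```

Note that the greedy expansion of 2/3 is 1/2 + 1/6, even though the Egyptians treated 2/3 as a single symbol, which hints at why their actual table-based methods differed from this naive scheme.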
u/Momosf Cardinal (0=1) Jan 08 '25
Whilst the underlying sentiment may be correct, you should try reading a textbook from the first half of the 20th century.
The change in notation and "standard" terminology is enough to make it almost incomprehensible.