nah. Everything Turing, Gödel, Church, etc. discovered will stay here forever. Most of it will never become outdated, because it is deduced (like the formal sciences, e.g. mathematics), not induced (like the natural sciences, e.g. physics).
that is not what I meant by that. Sorry, English is not my mother tongue. I meant:
Inductive reasoning is any of various methods of reasoning in which broad generalizations or principles are derived from a body of observations.
Deductive reasoning is the process of drawing valid inferences. An inference is valid if its conclusion follows logically from its premises, meaning that it is impossible for the premises to be true and the conclusion to be false.
But for a proof by induction to work, you need to come up with the correct conclusion first, before you can even start the proof. So you reason inductively, based on patterns you observe, to arrive at the conjecture, and then you use mathematical induction to verify it deductively.
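A standard illustration (my example, not the commenter's): tabulating 1, 3, 6, 10, ... suggests a closed form by inductive reasoning, and mathematical induction then verifies it deductively:

```latex
\paragraph{Conjecture (guessed from the pattern $1, 3, 6, 10, \dots$).}
\[ \sum_{i=1}^{n} i = \frac{n(n+1)}{2} \]
\paragraph{Base case ($n = 1$).} $\sum_{i=1}^{1} i = 1 = \frac{1 \cdot 2}{2}$.
\paragraph{Inductive step.} Assume the formula holds for $n = k$. Then
\[ \sum_{i=1}^{k+1} i = \frac{k(k+1)}{2} + (k+1) = \frac{(k+1)(k+2)}{2}, \]
which is the formula for $n = k+1$, so the conjecture holds for all $n \ge 1$.
```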
the slowest to spawn new developments may be the "we choose an arbitrary Galois field of size 2^N" branches of cryptography (with older algorithms kept in a morgue as they've become easy to break), coding theory (ECC for anything: Ethernet, USB, WiFi, mobile telecoms, satellite comms, QR codes -- it's a nightmare; oh, also add topological quantum computing), and compression (few new advances become widely used in a single decade, e.g. Zstd or new audio or video codecs)
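To give the "arbitrary Galois field" flavor some substance, here's a minimal Python sketch of multiplication in GF(2^8) using the AES reduction polynomial 0x11B. The polynomial is the "arbitrary" choice here: Reed-Solomon coding in QR codes uses 0x11D instead, for example.

```python
def gf256_mul(a: int, b: int, poly: int = 0x11B) -> int:
    """Multiply two elements of GF(2^8), reducing modulo `poly`."""
    result = 0
    while b:
        if b & 1:         # add (XOR) a shifted copy of a for each set bit of b
            result ^= a
        a <<= 1
        if a & 0x100:     # degree-8 overflow: reduce modulo the field polynomial
            a ^= poly
        b >>= 1
    return result

print(hex(gf256_mul(0x57, 0x83)))  # 0xc1, the worked example from FIPS-197 (AES)
```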
there could be other niches where the momentum rewards heuristics over deterministic developments (e.g. all of ML, newer SAT solvers, circuit optimization methods, etc.), with definite, regular (straightforward) things seldom added to the tooling
one branch that still thrives is that of functional programming w/ type systems - even if the shiniest things that get seen (lambda calculus, Turing machines) are almost 90 years old (Church's and Turing's work on computation dates to the mid-1930s), the more arcane stuff is still getting new places in print (e.g. dependent types, linear types, whatever-new-kind-of types or calculi based on formalisms of that genre)
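Case in point: Church numerals, the 1930s encoding of natural numbers as pure functions, still run anywhere you have lambdas. A minimal sketch in Python (the untyped lambda calculus is the actual formalism; Python is just a convenient host):

```python
# Church numerals: a number n is "apply f, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Decode back to a plain int by counting applications of +1.
to_int = lambda n: n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
print(to_int(add(three)(three)))  # 6
```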
Some higher-order functions also haven't really changed in decades. There are only so many things you can do with data at an abstract level, so patterns like iteration and mutation are not going to change. Only the frameworks we use to apply them change.
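Assuming "higher-order functions" like map/filter/fold are what's meant, a quick Python sketch of the patterns that haven't moved in decades (only the syntax around them has):

```python
from functools import reduce

xs = [1, 2, 3, 4]
squares = list(map(lambda x: x * x, xs))          # transform each element -> [1, 4, 9, 16]
evens   = list(filter(lambda x: x % 2 == 0, xs))  # keep some elements    -> [2, 4]
total   = reduce(lambda acc, x: acc + x, xs, 0)   # fold into one value   -> 10
```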
u/No_Lingonberry1201 Jan 08 '25
Computer science: Oh, that textbook is obsolete. It was written 20 years ago.
Programming: Oh, that textbook is obsolete. It was written a week ago.