Description
In this episode, Ned and Chris examine classical computing fundamentals, breaking down topics like Turing machines, the von Neumann architecture, and the role of logic gates in computing. They explain how transistors implement logic gates, and how those gates combine to perform the binary operations at the foundation of modern computers. They also discuss reduced instruction set computing (RISC) versus x86 architectures and the trade-offs between speed, efficiency, and complexity in modern processors.
Links:
xkcd Purity: https://xkcd.com/435/
Turing Machine: https://plato.stanford.edu/entries/turing-machine/
Von Neumann Architecture: https://en.wikipedia.org/wiki/Von_Neumann_architecture
Half Adder: https://www.geeksforgeeks.org/half-adder-in-digital-logic/
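The half adder linked above is the classic example of gates combining into arithmetic: an XOR gate produces the sum bit and an AND gate produces the carry. A minimal sketch in Python (the function name is our own, not from the episode):

```python
# Half adder: adds two bits using only logic gates.
# XOR (^) yields the sum bit; AND (&) yields the carry bit.
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum, carry) for two input bits a and b."""
    return a ^ b, a & b

# Walk the full truth table: only 1 + 1 sets the carry.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chaining two half adders (plus an OR gate for the carries) gives a full adder, and chaining full adders gives the multi-bit addition hardware the episode describes.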