Learning in the AI Era
Reading and remembering with LLMs, top 0.1% of ideas, and Markdown-friendly sticky notes.
Hi friends! Thanks for tuning in to another episode of Manan's Marvelous Musings. I hope you've enjoyed these so far because they take so long to write 🥹.
What I thought about this week
Inspired by Dwarkesh Patel and Andy Matuschak's recent podcast episode, I took some notes on learning in the AI era. In a world where information is abundant but time is scarce, how do we effectively gain and retain knowledge? How are LLMs most useful for learning? Check out my notes to find out.
What I read this week
Code: The Hidden Language of Computer Hardware and Software. This book explains how the modern computer works from first principles. When I say first principles, I mean going from Morse code all the way to the full circuit design of the Intel 8080 microprocessor. Sometimes the author's desire to be thorough makes it a little tiring to read, but I've found it satisfying and informative regardless.
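To give a flavor of what that building-up feels like: every logic gate in a computer can be derived from a single primitive like NAND, and those gates compose into adders and, eventually, a processor. Here's a toy Python sketch of my own (not from the book, which builds its gates out of relays):

```python
# Toy illustration of gates-to-arithmetic, in the spirit of Code's
# first-principles progression. Everything below derives from NAND.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Add two bits: returns (sum, carry)."""
    return xor_(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} -> sum={half_adder(a, b)[0]}, carry={half_adder(a, b)[1]}")
```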
The 0.1% Of Ideas I've Found. A list of the 0.1%-most-{interesting, insightful, life-changing, etc.} articles, speeches, books, and podcasts that the author has come across. Pretty legit stuff in here. I like the concept of the article and am considering writing a version for myself.
College student falls from Yosemite's Half Dome cables and dies. I hiked Half Dome exactly one week before the incident discussed in this article. Terrifying and sobering.
Spaced repetition memory systems make memory a choice. I think not enough people have embraced spaced repetition for what it is — the single biggest cognitive "hack" to increase human intelligence. I use spaced repetition every day to retain and reflect on things I've read (using Readwise) and plan on embracing it as part of my workflow even more (see Learning in the AI Era). Andy's write-up explains well why leveraging spaced repetition effectively is such a useful skill.
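If you're curious what's under the hood: most spaced repetition tools descend from the SM-2 algorithm (Anki uses a variant, and I'd guess Readwise does something similar), which stretches the review interval every time you successfully recall a card. A minimal Python sketch of the core update — my simplification, not any tool's actual code:

```python
# Minimal SM-2-style spaced repetition sketch (a simplification, not
# Readwise's or Anki's actual implementation).
# quality: 0-5 self-rating of how well you recalled the card.

from dataclasses import dataclass

@dataclass
class Card:
    interval: int = 1      # days until next review
    repetitions: int = 0   # consecutive successful recalls
    ease: float = 2.5      # interval multiplier; drifts with performance

def review(card: Card, quality: int) -> Card:
    if quality < 3:
        # Forgot: reset the streak and review again soon.
        card.repetitions, card.interval = 0, 1
    else:
        # Recalled: stretch the interval multiplicatively.
        card.repetitions += 1
        if card.repetitions == 1:
            card.interval = 1
        elif card.repetitions == 2:
            card.interval = 6
        else:
            card.interval = round(card.interval * card.ease)
        # Ease drifts up for easy recalls, down for hard ones (SM-2 formula).
        card.ease = max(1.3, card.ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return card

# Four good reviews in a row: intervals grow 1 -> 6 -> 16 -> 43 days.
card = Card()
for q in (5, 5, 4, 5):
    card = review(card, q)
    print(card.interval, "days")
```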
What is entropy?. How are the concepts of entropy in thermodynamics, statistical mechanics, and information theory related? Why does each molecule of hydrogen have ~23 bits of entropy at standard temperature and pressure? I'm reading this enlightening and surprisingly funny note to find out.
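The ~23 bits figure is fun to sanity-check yourself: take the tabulated standard molar entropy of H2, divide by the gas constant to get entropy per molecule in units of k_B (nats), then convert nats to bits. My own back-of-the-envelope in Python:

```python
# Back-of-the-envelope check of the "~23 bits per H2 molecule" claim.
# S_molar is the standard tabulated molar entropy of H2 gas at 298 K.
import math

S_molar = 130.68   # J/(mol*K), standard molar entropy of H2
R = 8.314462618    # J/(mol*K), gas constant (k_B * N_A)

nats_per_molecule = S_molar / R               # S / k_B, per molecule
bits_per_molecule = nats_per_molecule / math.log(2)
print(f"{bits_per_molecule:.1f} bits")        # -> 22.7 bits, i.e. ~23
```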
Other cool things I wanted to share
This section is getting long and I'm considering different ways to structure it. If you have any ideas, please let me know!
I didn't know glue used to be (and still sometimes is) made of animals, particularly horses... wtf. I guess I never thought about what people used for glue until Elmer's existed. Shoutout to Albert for enlightening me.
Animal glue was the most common woodworking glue for thousands of years until the advent of synthetic glues, such as polyvinyl acetate (PVA) and other resin glues, in the 20th century.
Animal glue - Wikipedia
Stripe Press — Ideas for progress. Stripe's publishing library, featuring "ideas that [they] think can be broadly useful". Some really cool books in here.
typst/typst: A new markup-based typesetting system that is powerful and easy to learn. Markdown x LaTeX? Might migrate my resume over.
Blank Page. Simple, minimalistic notes tool for the browser. Just type.
SideNotes. I've enjoyed using this app over the last week as a more featureful, Markdown-friendly, well-structured alternative to Sticky Notes.
StudyChat. Transforms reading material into interactive quizzes. I think this would've been an awesome tool to use when I was still in school.
Thanks for making it to the end! Hope you have a great week.