The model training paradox
Plus, some surprising plans from OpenAI and a peek inside the GPT-4 black box
We’re back with another edition of Machine Learnings, brought to you by the folks at Heyday.
Heyday is an AI-powered memory assistant that resurfaces content you forgot about while you browse the web.
We’ve been training and tweaking models all year, and we’re seeing great returns.
When we think about your experience using an AI tool, we don’t want you to think about AI. We just want to help you go where you want to go, with a dash of magic while you’re en route.
If you’d like to get your hands on an experience like this, connect to Heyday for a free 2-week trial today.
-@samdebrule
What we're reading.
1/ [Research] Reports of AI models being trained on AI-generated data have plenty of people thinking. The researchers’ core question – what will happen to the next GPT model once LLMs contribute much of the language found online? – yields insightful and worrying findings. Learn more at arXiv >
2/ The latest Midjourney update is blowing artists’ minds with its perfection of the zoom-out technique. Ethical questions aside, this is quite impressive. Learn more at Ars Technica >
3/ Ben Thompson interviews Marc Andreessen on AI. That should be enough to get you to read on. Learn more at Stratechery >
4/ It turns out OpenAI has larger ambitions for ChatGPT. With a “supersmart personal assistant for work” in Sam Altman’s plans, what does that mean for their partnership with Microsoft? Learn more at The Information >
5/ When GPT-4 was released, many were blown away by the leap in performance from GPT-3.5. Was a different method used? Indeed: instead of a single large model, GPT-4 is reportedly built from a blend of 8 different models. Step into the black box! (A toy sketch of the idea follows this list.) Learn more at The Algorithmic Bridge >
6/ Our friend Dan Shipper is talking about how he’s turned his email over to AI. Plenty of nuggets here, but we’re most interested in his single-line response approach. Learn more at Every’s Chain of Thought >
7/ [Deep Dive] Curious about the latest techniques for building with LLMs? This thorough, holistic resource from Andreessen Horowitz covers the emerging architectures powering LLM applications. Learn more at a16z >
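For the curious, here’s a minimal, purely illustrative sketch of the “mixture of experts” idea behind that GPT-4 claim from item 5: several smaller models plus a router that decides which of them handle a given input. This is not OpenAI’s code – the expert count matches the rumored 8, but the sizes, weights, and top-k routing here are all hypothetical toys.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # the rumored number of sub-models in GPT-4
TOP_K = 2         # how many experts handle each input (hypothetical)
HIDDEN = 16       # toy hidden size, purely for illustration

# Each "expert" is just a random linear layer in this toy example.
experts = [rng.normal(size=(HIDDEN, HIDDEN)) for _ in range(NUM_EXPERTS)]
router = rng.normal(size=(HIDDEN, NUM_EXPERTS))  # scores each expert per input

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def mixture_of_experts(x):
    """Route input x to the top-k experts and blend their outputs."""
    scores = softmax(x @ router)               # one score per expert
    top = np.argsort(scores)[-TOP_K:]          # pick the best-scoring experts
    weights = scores[top] / scores[top].sum()  # renormalize their weights
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.normal(size=HIDDEN)
print(mixture_of_experts(x).shape)  # (16,) – a blended output vector
```

The appeal of this design is that only a couple of experts run for any given input, so (in principle) you get the capacity of many models at a fraction of the compute per query.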
Research for this edition of Machine Learnings was enhanced by Heyday, the AI-powered memory assistant.