Give your bot some personality
Plus detecting cancer, data poisoning, and a ChatGPT oral history
We’re back with another edition of Machine Learnings, brought to you by the folks at Heyday.
Heyday is an AI-powered memory assistant that resurfaces content you forgot about while you browse the web.
These days, we’re thinking about personally intelligent experiences. Today’s LLMs serve everyone the same generic way, but what can we do to turn your own insights into action?
Want early access to personalized drops? Try Heyday for free today.
-@samdebrule
What we're reading.
1/ The big news from Microsoft this week was the ability to control Bing Chat’s personality. This opens new doors to more tailored AI experiences. Learn more at The Verge >
2/ The hotbed of medical AI testing lives in Hungary, where we’re seeing advances against the worst of human ailments – cancer. What does this mean for the future of medicine? Learn more at The New York Times >
3/ [Long read] A deep-dive profile of Dr. Emily Bender, one of the foremost experts in computational linguistics. Why does she believe ChatGPT is nothing like a human? This one will take me months to digest. Learn more at NYMag >
4/ Data poisoning. That sounds bad, and nearly everyone agrees it’s alarmingly easy to pull off. Learn more at ZDNet >
5/ A technical breakdown mixed with speculative analysis of what’s gone wrong with the launch of Bing Chat. Strap in. Learn more at 80000 Hours >
6/ We missed this one last week, but it’s well worth returning to. A fun, albeit spooky, read on cloning your digital self. No spoilers. Learn more at VICE >
7/ Closing this week with an oral history from the internal team that built ChatGPT. Like an episode of Inside the Actors Studio, you can almost feel what it was like to be in the room. Learn more at MIT Technology Review >
Research for this edition of Machine Learnings was enhanced by Heyday, the AI-powered memory assistant.