We’re back with another edition of Machine Learnings, brought to you by the folks at Heyday.
Heyday is an AI-powered memory assistant that resurfaces content you forgot about while you browse the web.
When you feel overwhelmed by the deluge of content, we focus you on the signal in the noise.
In the coming weeks, we'll be rolling out our own experiments with LLMs. We’re building prototypes to help us synthesize insights from our full days of calls, conversations, meetings, and readings.
What do you want to see? Leave it in the comments or drop me a message. DMs are open.
-@samdebrule
What we're reading.
1/ Nick Cave levels ChatGPT in a way only a great musician and poet can. The humanity. Learn more at The Red Hand Files >
2/ Google Research is publishing a series of posts on its near-term forecasts for language, vision, and generative models. Entry point here. Learn more at the Google AI Blog >
3/ Our favorite AI content on the web today comes from Every. They're friends of ours, but the ingenuity and depth in their work is inspiring. Here’s Dan Shipper asking his personal model what his past journals have to say about his hopes and dreams. Learn more at Every >
4/ We know LLMs absorb the open web without always crediting their sources, and this publication felt the effects firsthand. So what happens after your work is taken? Learn more at Big Technology >
5/ Late 2023 scenario: everyone has their own LLM-powered AI assistant. Now what? Learn more at WIRED >
6/ Do we actually want Artificial Intelligence to sound like humans? What are realistic alternatives? Learn more at Bloomberg >
7/ Microsoft is investing $10B in OpenAI and integrating its technology into Office products and Bing search, so let’s speculate. What might we see? Learn more at MIT Technology Review >
Research for this edition of Machine Learnings was enhanced by Heyday, the AI-powered memory assistant.