Friday, March 6

Tag: Reddit

Why is this book taking so long to release -
News Feed, Reddit

Why is this book taking so long to release – “The Singularity Is Nearer: When We Merge with AI” by Ray Kurzweil – Release Date: 25 June 2024

Seriously, the book is finished – Ray gifted a copy to Joe Rogan live on his podcast. He is one of the longest-serving figures in the AI space and would probably agree that by the time a book on AI comes out, especially one covering the latest trends, it can be outdated within days. So why is he making us all wait? Just release it instead of holding it back for a few more months. Is this all to generate hype and boost sales, or are other factors keeping us from this most likely awesome book? If possible, how can I get an early copy? submitted by /u/Maddragon0088
Google will only train on your Google Docs if it finds them online.
News Feed, Reddit

Google will only train on your Google Docs if it finds them online.

Business Insider’s Katie Notopoulos wondered whether Google trains its AI models on Google Docs shared with “anyone with a link.” Google, which added AI features to Workspace last year, says it only trains on “publicly available” Google Docs. The company adds that even documents accessible to “anyone with a link” remain private unless that link is posted online where Google’s web crawler can find it. Source: Your Google Docs are (probably) safe from AI training. submitted by /u/Nearby-Ad-5130
Stanford CS 25 Transformers Course (OPEN TO EVERYBODY)
News Feed, Reddit

Stanford CS 25 Transformers Course (OPEN TO EVERYBODY)

TL;DR: One of Stanford's hottest seminar courses. We are opening the course to the public via Zoom. Lectures are on Thursdays, 4:30–5:50pm PDT (Zoom link on the course website). Talks will be recorded and released about two weeks after each lecture. Course website: https://web.stanford.edu/class/cs25/ Each week, we invite folks at the forefront of Transformers research to discuss the latest breakthroughs, from LLM architectures like GPT and Gemini to creative use cases in generating art (e.g. DALL-E and Sora), biology and neuroscience applications, robotics, and so forth! We invite speakers such as Andrej Karpathy, Geoffrey Hinton, Jim Fan, Ashish Vaswani, and folks from OpenAI, Google, NVIDIA, etc. Check out our course website for more! submitted by /u/MLPhDStudent
The AI Report