Pyxis.ai
We’re hosting Pyxis.ai on March 24th at 6pm EST! Zoom link below. You can read more about them here and find them on LinkedIn.
Despite orders from Google, Timnit Gebru kept her name on this paper, and she was fired for it: On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?
Join us on Discord! https://discord.gg/Bxp63Mcu7X
Self-supervised and unsupervised learning have been behind some of the biggest AI stories. You may have heard some of them already, but here is a collection.
Here’s a link to Andrej Karpathy’s Medium article on Software 2.0, a really interesting outlook on the future of software.
Ashish Vaswani’s guest lecture at Stanford on attention and Transformers from Attention Is All You Need, followed by Anna Huang’s lecture on music generation. There is also a wonderful notebook from harvardnlp, The Annotated Transformer, for understanding the paper.
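If you want a quick feel for the core equation before watching, here is a minimal NumPy sketch of the paper’s scaled dot-product attention (the function name and toy shapes are our own illustration, not from the lecture):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Computes softmax(Q K^T / sqrt(d_k)) V, the core operation
    of the Transformer from "Attention Is All You Need"."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # how much each query attends to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of the values

# Toy example: 3 tokens with d_k = 4
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((3, 4)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V))
```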
Jacob Devlin’s guest lecture at Stanford on Bidirectional Encoder Representations from Transformers (BERT), which was the state-of-the-art language model of its time and a landmark paper.
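To play with BERT yourself, here is a short sketch using the Hugging Face transformers library’s fill-mask pipeline (assumes transformers is installed and downloads the bert-base-uncased weights on first run; the example sentence is our own):

```python
from transformers import pipeline

# BERT is pretrained to fill in masked-out words using context from both directions.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill_mask("Transformers have changed the field of natural language [MASK]."):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```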