789: Do More With AI - LLMs With Big Token Counts
Description
Join Scott and CJ as they dive into the fascinating world of AI, exploring topics from LLM token sizes and context windows to understanding input length. They discuss practical use cases and share insights on how web developers can leverage larger token counts to maximize the potential of AI and LLMs.

Show Notes
00:00 Welcome to Syntax!
01:31 Brought to you by Sentry.io.
02:42 What is a token? Quizgecko GPT-4 Token Counter.
04:22 Context window, sometimes called “max tokens”. OpenAI Platform Models. Claude Models. (See the sizing sketch below.)
10:42 Understanding input length.
11:59 Models + services with big token counts. Gemini Docs.
13:22 Generating OpenAPI documentation for a complex API.
17:29 Generating JSDoc-style typing. Drop-In stolinski GitHub. (See the JSDoc example below.)
21:07 Generating seed data for a complex database. bytedash w3cj GitHub.
24:34 Summarizing 8+ hours of video.
29:35 Some things we’ve yet to try.
31:32 What about cost? Google AI for Developers Cost.

Hit us up on Socials!
Syntax: X Instagram TikTok LinkedIn Threads
Wes: X Instagram TikTok LinkedIn Threads
Scott: X Instagram TikTok LinkedIn Threads
CJ: X Instagram YouTube TwitchTV
Randy: X Instagram YouTube Threads
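The token and context-window segments boil down to a budgeting question: will your input, plus the room you reserve for the model's reply, fit inside the window? Here is a minimal sketch of that check, assuming the common rough rule of thumb of about 4 characters per token for English text and an illustrative (not authoritative) table of window sizes; always verify current limits against the OpenAI, Anthropic, and Gemini docs linked above.

```ts
// Rough token estimate using the "~4 characters per token" rule of thumb for
// English text. Real tokenizers (tiktoken, gpt-tokenizer, etc.) give exact
// counts; this is only a ballpark for sizing prompts.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Illustrative context-window sizes -- check each provider's docs for the
// numbers that are current when you read this.
const CONTEXT_WINDOWS: Record<string, number> = {
  "gpt-4-turbo": 128_000,
  "claude-3-opus": 200_000,
  "gemini-1.5-pro": 1_000_000,
};

// Will this input, plus tokens reserved for the model's reply, fit?
function fitsContextWindow(
  text: string,
  model: string,
  reservedForOutput = 4_000
): boolean {
  const limit = CONTEXT_WINDOWS[model];
  if (limit === undefined) throw new Error(`Unknown model: ${model}`);
  return estimateTokens(text) + reservedForOutput <= limit;
}

// e.g. checking whether a huge API spec fits before sending it:
// console.log(fitsContextWindow(bigSpecText, "gemini-1.5-pro"));
```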
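For the JSDoc segment, the idea is to paste an entire untyped JavaScript module into a large-context model and ask it to add type annotations. Purely illustrative (a hypothetical function, not code from the Drop-In repo), the generated output looks something like this:

```js
// Illustrative only -- the kind of JSDoc typing a large-context model can
// generate when you hand it a whole untyped module at once.

/**
 * Look up a user and their most recent orders.
 * @param {string} userId - The user's unique ID.
 * @param {{ limit?: number, includeCancelled?: boolean }} [options] - Query options.
 * @returns {Promise<{ user: { id: string, email: string }, orders: Array<{ id: string, total: number }> }>}
 */
async function getUserWithOrders(userId, options = {}) {
  // ...implementation unchanged; only the JSDoc block above is generated
}
```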
More Episodes
Scott and Wes dive into the 2023 State of JavaScript survey, breaking down the latest trends and pain points in front-end frameworks, build tools, and JavaScript runtimes. Tune in for their hot takes and insights on what’s shaping the JavaScript landscape this year! Show Notes 00:00 Welcome to...
Published 07/03/24
Scott and CJ chat with Paul Copplestone, CEO and co-founder of Supabase, about the journey of building an open source alternative to Firebase. Learn about the tech stack, the story behind their excellent documentation, and how Supabase balances business goals with open-source values. Show Notes ...
Published 06/28/24