Hi! 👋 Welcome to The Big Y!
If LLMs are taking over the software world, then the hardware powering them must be seeing exploding demand too. Anyone trying to get more GPUs from Amazon or Microsoft knows the demand is unprecedented: there's a waitlist when you ask for more.
Nvidia has caught the AI wave: first with the A100 (the chip everyone is in line for) and now with its newest chip, the H100, which is in "private preview," with OpenAI, Meta, and StabilityAI all using it to power their latest models. Nvidia's stock surged this past week, pushing its market cap toward the trillion-dollar mark (only Apple, Microsoft, Alphabet, and Amazon sit up there).
There are other chip makers out there, like AMD and Intel, but their AI-specialized chips haven't kept up with Nvidia's. Nvidia has also built a software moat, CUDA, that effectively prevents other hardware makers from making inroads. Google has a small presence with its TPUs and thinks its newest supercomputer is competitive. With so much demand for compute, this space is ripe for disruption by whoever can deliver enough of it as LLMs keep growing in size and popularity.
A lawyer used ChatGPT to help write a 10-page brief filled with citations to relevant court decisions supporting his case. But it turns out ChatGPT made them all up… My favorite quote from the article: "Mr. Schwartz said that he had never used ChatGPT, and 'therefore was unaware of the possibility that its content could be false.'" Imagine if this were your lawyer.
Know someone who might enjoy this newsletter? Share it with them and help spread the word!
Thanks for reading! Have a great week! 😁
🎙 The Big Y Podcast: Listen on Spotify, Apple Podcasts, Stitcher, Substack