Here are the highlights from the week that was in AI and our spicy takes on what it actually means. 🌶️
This week, we're diving into NVIDIA's controversial financing deals, OpenAI's 400x efficiency gains and enterprise adoption surge, why implementation beats product when selling to enterprises, and the launch of ChatGPT apps.
On the Chipp side, we're working with Stripe on usage-based billing for AI so you can access the best models without flat-fee limits. Discover how voice is the new interface for customer support and sales - from hotels booking rooms in multiple languages to sales teams practicing with AI.
AI Highlights from Around the Web
Is NVIDIA the Next Enron?
Ed Zitron dives deep into why NVIDIA keeps insisting it's nothing like Enron (which, honestly, makes people wonder even more). He breaks down the company's circular financing deals, inventory questions, and the massive GPU orders that seem to be piling up without clear profit paths. Read the full analysis.

🌶️ Value is being created with AI - but is it enough to justify the valuations? Time will tell, but with the massive investments being made, expect new competitors to NVIDIA and other infrastructure providers to spring up.
Enterprise AI Is Actually Happening
OpenAI just dropped their first-ever enterprise AI report, and the numbers are wild. Weekly ChatGPT Enterprise messages jumped 8x in a year, workers are saving 40-60 minutes daily, and 75% say AI improved their work speed or quality. But here's the kicker - the gap between power users and everyone else is widening fast.

🌶️ Enterprises are spending and seeing results. Here at Chipp, we're focused on the "Chipp Flipp" - bringing these opportunities to every business no matter how big or small.
The Real Cost of Selling AI to Big Companies
Harry Stebbings drops an unpopular truth - you can't just sell AI tools to large enterprises and expect them to stick around. Without hands-on implementation support and customization (think full white-glove service), companies won't adopt them. The art isn't in the product, it's in making it actually work for them. Explore the insight.

🌶️ Forward-deployed engineer, or FDE, is the term du jour - it's really just consulting and onboarding. We recently added additional support for "do it for me" requests because hand-holding is what makes AI sticky.
OpenAI Gets 400x More Efficient in One Year
Steven Adler highlights OpenAI's massive efficiency gains - dropping from $4,500 per problem to just $12 in one year, nearly a 400x improvement. The catch? Human labor costs don't drop 400x annually, creating a widening gap between what AI can do and what it costs versus human alternatives.

🌶️ Price drops are continuing with AI. As AI gets cheaper, we'll start doing tasks we never considered before because they were prohibitively expensive. That will change how we work.
ChatGPT Apps Are Coming
Romain Huet announces builders can now submit ChatGPT apps. This opens the door for UI-driven experiences that feel like natural extensions of conversations, powered by MCP. First approved apps roll out in the new year.

🌶️ It will be fascinating to see if businesses want to disintermediate their relationships with customers. ChatGPT will offer a massive opportunity for user acquisition, but companies will be giving up data and direct connection. Look for lots of experimentation here.
Chipp Updates
We're Working with Stripe on Usage-Based Billing for AI
Small businesses shouldn't have to compromise on AI quality because of unpredictable pricing. That's why we worked with Stripe as their first design partner on Billing for LLM tokens. Now you can use the best models for each task and only pay for what you actually use. Every Chipp subscription includes 1/3 of your plan value in monthly tokens - that's roughly 10 million words a month.
Voice as the New Interface
Scott explores why voice is transforming customer support. From hotel guests booking rooms in multiple languages to sales teams practicing pitches with AI, voice removes friction and lets people communicate naturally.
