It's been a while since I sent an email. We had our second baby girl! I've been spending a lot of late nights and early mornings with AI, specifically Claude Code and now Codex, and I want to share the journey.

I haven't had time to write or make videos because I've been knee-deep in figuring out best practices for working with AI (it's been amazing and frustrating at the same time). To be blunt, we're at a tipping point. You can just do things now; it's incredibly exciting. At the same time, it's still a little early, and there are rough edges. Since I last emailed you all, I've been able to build 6 different tools (macOS apps, CLI tools, and iOS apps), all in the direction of an app that I want to build.

Want to learn with me on a weekly basis and finally publish your app on the App Store? Join the waitlist for the App Launch Community here (ETA: January).

Spider Warning

I've discussed this a little bit in the past. I thought I would just be able to create a little OpenAI wrapper around spider detection. If spiders scare you, you might not want to watch this video or continue scrolling...

We're at a tipping point where you can just build tools and apps. You can learn new technologies and frameworks, and you can get a prototype out the door to experiment with things. I've been using AI, specifically GPT-5 now, to create spider-related image classification. I'm not sticking just to spiders, but my first exploration into this space is going to be dangerous spiders. I might do snakes or other things that people consider dangerous. That's my angle, and I've been able to build the tooling to support machine learning. I'm still not sure if the results are going to be great, but I know that the results from other apps are mixed. I have ideas that can improve the experience, and that's what I'm working on.

Watch the spider video where I show off one of my tools to detect spiders and then classify them (two stages).
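The two-stage flow (run a detector first, then classify the winning candidate) can be sketched as plain Swift decision logic. This is a hypothetical sketch, not the actual app's code; the type names, thresholds, and `identifySpider` function are all my assumptions about how such a pipeline could be chained:

```swift
// Hypothetical stage-1 output: one detected spider candidate in the image.
// A real detector (e.g. a Create ML object-detection model) would also
// return a bounding box used to crop the image for stage 2.
struct Detection {
    let confidence: Double
}

// Hypothetical stage-2 output: a species label for the cropped candidate.
struct Classification {
    let label: String          // e.g. "black widow", "wolf spider"
    let confidence: Double
}

// Chain the two stages: only classify candidates the detector is confident
// about, and only report labels the classifier is confident about.
func identifySpider(
    detections: [Detection],
    classify: (Detection) -> Classification,
    detectionThreshold: Double = 0.5,
    labelThreshold: Double = 0.7
) -> String? {
    // Stage 1: pick the strongest detection above the threshold.
    guard let best = detections
        .filter({ $0.confidence >= detectionThreshold })
        .max(by: { $0.confidence < $1.confidence })
    else { return nil }        // no spider found at all

    // Stage 2: classify the best candidate; reject low-confidence labels.
    let result = classify(best)
    return result.confidence >= labelThreshold ? result.label : nil
}
```

In a real app, the `classify` closure would run the Create ML image classifier on the region the detector cropped out; splitting the stages like this keeps the classifier from being asked to label images that contain no spider at all.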
This Mac app is a test harness for the models that I'm training in Create ML.

Along with that, I still plan to kickstart the App Launch Community. I put that on hold with the baby. I just didn't have the bandwidth, and I still don't have the bandwidth yet. I won't for another 2-3 months, until the baby's in day care (plus we're doing a massive renovation project). My attic is a construction zone, and I only have so much time in the day.

Over the next few weeks I'll share what's working for me, and how you can leverage what I'm learning (for free). If you want to take it to the next level, I'll be opening a weekly community group to help you drive toward your own app goals.

Now don't get me wrong; I still like coding, and I still have to drop into code to fix problems that the AI is unable to solve. But I get to operate at a higher level: I get to be more of the idea person driving the app, testing it, and making sure it works. That's been really exciting.

If you're interested in publishing an iPhone app or a Mac app, or in anything related to artificial intelligence, such as machine learning, stick around. I've been deep-diving into image classification and object detection using Create ML, which is Apple's framework for creating models that can detect things. I've been working with data from iNaturalist.org, and then building out apps that help with image detection and classification. It's really interesting, and it's fascinating that I can use AI to prototype this.

Click here if you want to sign up for when I announce more details about my small community, where I'll be working on a weekly basis with people building their apps.

Alright, that's all I have for today. Talk to you soon,

Paul Solt