Every now and then a post comes along that says everything I would want to say about a subject. Michael Taggart’s “I used AI. It worked. I hated it.” is almost exactly that kind of post. I say almost, because clearly, I’m sitting here writing another post to add my own perspective. His post is good, though. So good! It’s well organized and covers so many points I would want to make. It will probably have people on both sides of the AI-for-coding debate complaining, which I take as a sign that the post is doing something right.
One of the things I like most is that the post has a very AI-works-but-it’s-not-perfect-yet view. So many people today are either boomers or doomers. They either write about AI coding as if they’ve discovered computers for the first time, or they act like it can’t do anything right. My experience is more like the one Taggart describes in his well-documented post:
Well, the thing works. The code is in production today, serving certificates for TTI. The only direct changes I made to the codebase were for elegance. The core logic was solid from the jump, owing I believe as much to Rust's safeties in development as to the model's capabilities.
But then he notes:
Did the model hallucinate? Yes, albeit rarely and with self-correction. A handful of times it made up methods for a struct in one of the libraries or another. However, Rust's error messages from the LSP server and compilation checks coerced the model to recheck its work, leading to correct implementation. I did not intervene in this process. It took about five minutes per issue.
Hallucinations still happen for me, too. A lot, actually, even though developers I work with say it rarely or never happens to them. I’m working in Python and JavaScript, so I do have to manually intervene, unlike Taggart. He makes a compelling case for using Rust if you’re going to be using coding agents. I just appreciate that he’s being fair here, calling out how cool and unique this is, while also being clear about its issues.
Regardless of how good (or not!) LLM-assisted coding is, I end up in a place pretty similar to Taggart’s:
If I could disinvent this technology, I would. My experiences, while enlightening as to models' capabilities, have not altered my belief that they cause more harm than good. And yet, I have no plan on how to destroy generative AI. I don't think this is a technology we can put back in the box. It may not take the same form a year from now; it may not be as ubiquitous or as celebrated, but it will remain.
It’s this “cause more harm than good” where I think I sit these days. For me, the harm is in forcing devs down a productivity-at-all-costs path. It’s hustle culture at the expense of building deep expertise. Practically, there’s a time and a place for moving fast and for being contemplative. AI hype has gotten so strong that you would think moving fast is all that matters.
I get it, though. I really do! The reason that contemplative, thoughtful coding is not seen as something to value is that most developers don’t work on things they care very much about. Devs want to get their code done and move on to the next thing. That’s an indictment of our industry more than it says anything about the importance of LLM-assisted programming.
I end up in a pretty similar place to Taggart. Like him, I think coding assistants are here to stay. I don’t expect things to look the same a year or two from now. There will certainly be a correction. When there is, deep expertise in both writing and understanding software will be a valuable skill to have. If anyone asks me, I would say hold on to that, no matter how you feel about LLMs and agentic coding.
I'm at work today, and because of my job, there's CNN playing on every screen. I looked up to see this scene. My immediate thought was—did Trump think it was Easter today? So I snapped this pic and sent it to the Hodge Peeps group chat with this message:
He was dropping F-bombs on Easter Sunday, but then decided to show up with the Easter Bunny today. What even is going on?!
Trump would be a hard character to write in fiction. He just wouldn't be believable enough. I think that's still true, even after all these years of Trump. That's how surreal the current moment is.
Given I’ve refocused my spare time on this site, I should warn you all: I’m going to be making some updates around here. Be patient with me as things shift and change.
I plan to make some theme updates here first. I want a text-first, easy reading experience. The current theme is basically the style I want, but I also need to make a pass over it to polish things up a bit. The typography needs some love first and foremost.
Once that’s done, I want to work on the functionality. I’ll probably be disabling Ghost’s subscriber feature for a little while. I want a web-first approach for this site, and I’d like to have a reason to bring back the subscriber model when appropriate. I’m thinking eventually I’ll bring this back as a membership program, with a set of perks for loyal readers. Before I do that, I need to build up that loyal audience.
For that reason, I am sending this post as an email to subscribers, too. If you’re getting this in email, it will probably be the last email for a while. If and when I reopen the subscriber/membership part of this site, I’ll send another email then to share updates. For those of you here with me up to this point, thanks so much for reading and being part of my journey on this site!
This is the first moon trip in 50 years, so it’s the first one where people have phones and modern DSLR cameras to take pictures. These photos of the earth from the spacecraft are so cool.
This is my favorite one.
NASA astronaut Christina Koch peers out of the Orion spacecraft
Yesterday, I wrote about wanting to hit 1000 words a day in writing. I’m sure some might think, why would you want to do that? I even asked myself. It’s not like I don’t have plenty to do. I’ve got a good career and no shortage of work for my day job. I’m married, and though my daughters are grown now, we still have an active and busy family life. For me, the desire to write more comes down to a few things.
The first thing is that I love writing. Nothing satisfies like spending time crafting sentences that lead me to some new thought. It’s that idea made famous by several authors — Joan Didion and Flannery O’Connor come to mind — that a writer (well, anyone really!) doesn’t know what she thinks until she writes it down. I very much fall into that camp. Writing is thinking for me, which leads me to the second thing.
I want to slow down and spend more time thinking. Writing takes time. Thinking takes time. I’m just so over hustle culture. To combat that pressure to move faster, to do more, to hustle, I want to intentionally slow down. Carving out time to write is the best way, and the most rewarding way, for me to do that.
Which leads me to the last thing.
I feel like I am uniquely situated for this present moment where science wants to overtake art, in that I am myself equal parts art and science. I love story, emotion, feeling, and art. I spent the first quarter of my life studying language, literature, and fiction writing. I have also spent years professionally building up skills in programming, math, and science. I reject the idea that we need more STEM and less art. We need both, and so, I want to explore the territory there in the middle. Hopefully, these 1000 words per day will lead to something worth saying in that space.
I’m setting myself a goal to write 1000 words a day. Every. Single. Day. It’s an ambitious goal, but I’ve let myself go, in terms of spending time on writing. It’s time to get back to it, get serious about the craft again.
I’m proud of the writing I’ve managed to string together here over the last 2 years on this site. I started back to blogging just before going to grad school at SCAD. I’ve posted around 45 posts in those nearly two years. That’s not nothing, but I still don’t feel like I’ve found my rhythm with tone, style, and focus. The trick with getting a rhythm going is playing more often, if you’ll allow me to stay with that analogy.
Anyway, that’s the goal. Write often. Write a lot. Write 1000 words per day. Not everything will end up posted here. I’m sure I’ll work on short stories and longer pieces that will be posted when they’re ready. But still, I want to push myself and write a lot every day.
I’ve managed about 750 words across the 3 posts I’ve added here today. Now I think I’ll get away from the site and work on some fiction for a bit. One day down, a bunch more to go.
AI is a hot topic among developers these days. And I mean hot. There’s clearly a divide amongst us. I see it most sharply between the devs I know at work and the devs I know in open source. Work devs are all in on this stuff. My open source friends, not so much.
It would be too easy for me to draw the conclusion that the AI divide among my friends and colleagues is a divide between those who just want to ship software and those who see some intrinsic value in code. There’s been plenty of writing lately trying to draw similar conclusions. The best starting point for this is Les Orchard’s “Grief and the AI Split”:
Before AI, both camps were doing the same thing every day. Writing code by hand. Using the same editors, the same languages, the same pull request workflows. The craft-lovers and the make-it-go people sat next to each other, shipped the same products, looked indistinguishable. The motivation behind the work was invisible because the process was identical.
It’s a really great, thoughtful post. I think there’s some truth to it, maybe even a lot of truth. The reason I can’t so easily accept this explanation is that it’s not so neat and clean for me. I love craft, and I also love to ship software. I’m a weird mix of pragmatism and ideology. I’m definitely the kind of developer who has feelings about AI, but I’m not sure what they are. Maybe Les is right, and I’m just at war with myself. I'm definitely going to keep writing, keep working it out for myself, until I figure out what I really think.
They mentioned “hustle culture” in passing when describing the current moment, but then spoke of it as something that started in the early 2000s, something going on 25 years now. I mean, it’s obvious, right? We’re clearly fixated on hustle culture and min-maxing life, but also, it struck me as a fresh observation, this idea that it’s been going on for this long. And then I thought, have we reached peak hustle culture yet?
This moment won’t last forever, of that I’m sure. We’re now able to turn a critical eye to the way the industrial revolution infected our view of everything. We can see how having schools be little factories for children isn’t the best idea. Or maybe we’re not all neatly divided into worker and owner classes. Or maybe a body isn’t so similar to a machine after all.
That’s the problem with metaphors. They’re great for writing, great for distilling complex ideas into understandable imagery. They make fiction sing. But that’s the thing — they’re fiction. We’re not actually all machines. Our brains are not really like computers. We don’t think in binary. The world cannot be understood that way. Not really. One day, we’ll wake up and realize it, or at least move on to the next thing, and the process will start all over again.
My developer career was born in the news business. While working on web sites for Auburn University Libraries, I started doing more and more in open source, especially with Django. That led me to work for a small paper in Naples, FL, which led to a job at the Washington Post. I think about my time at the Post a lot, even more so here recently.
So many things were happening then. My youngest daughter was born right as I joined the Post. I’ll never forget the month and year of starting there! It was October 2006. Not long after that came the iPhone. Then, social media. I was working on what I think we called the “Special Projects” team, so I was helping build stuff for all of that.
We were trying to figure out mobile web sites then. This was before iOS apps. We were also trying to figure out integration with social media. I spent a week in California at Facebook and actually sat in a conference room with Mark Zuckerberg as he explained to us their new F8 platform launch. F8 is what they were calling their platform for embedded integrations. Think little pieces of the Washington Post, our users and data, running on Facebook’s web site. We were going to build one of those first embedded apps. All of this was super exciting.
Looking back now, some of it has aged well, and other parts, not so much.
My daughter, she clearly aged well. She’s 19 and deep into her first year in college. The iPhone, likewise. Clearly, this is an essential device for so many people. The mobile web is the only web for a lot of folks. Social media, on the other hand, is a mixed bag.
I can remember being really excited about Facebook and Twitter when I first joined those sites. It felt like we were building something completely new. There was all this talk of social graphs and connecting the world. News sites in particular couldn’t wait to figure it out. The newspaper business was in decline, and there was hope social would be a new source of attention and revenue for the business. I really can’t say enough about how uniformly excited people were about the possibilities then.
Then came 2016. Cambridge Analytica. Political polarization. Mental health issues. Anxiety. Anger. Isolation. The very thing that was meant to bring us all together has driven us apart. Just two days ago, people were celebrating as Facebook and YouTube were held accountable in the courts here in the US. News and media once craved social media attention, and now we’re all focused on “direct to consumer,” trying to build our own relationship with our audiences. 2006 seems such a long time ago.
I think it’s too easy to dismiss social media as uniformly bad. There is real utility for keeping in touch with friends and family. Small businesses have options for reaching people that really can help a business grow and thrive. My older daughter is a hair stylist, and she went from 0 clients to an overbooked schedule in a matter of a few months, largely thanks to her posts on social media. Maybe the bad outweighs the good, but it's not all bad. Either way, it's complex and clearly not the panacea we thought it would be in 2006.
I said up top that I’ve been thinking about all this a lot recently. I joined Warner Bros. Discovery to work on CNN last year. I’m back in the news and media industry again, it’s 2026, and what was old feels new again. I see a lot of similarities between that social media fervor and how people are thinking about and embracing AI today. It’s a little different. For one, it’s not so uniformly positive here at the start. A lot of programmers and business people are excited about AI, but creatives and journalists are skeptical at best and fearful at worst. What happened with social media and big tech has left so many people jaded. Maybe even as a result of social media, we’re more polarized about any new development. But still, there’s something that feels similar.
Maybe it’s just me and my circumstances. The cable news business is in decline, the way newspapers were back in 2006. AI will almost certainly change how we get and share news, the same way social media did back then. The question is whether it will be a largely net positive, like my daughter’s birth and the iPhone, or something we’re left regretting but for a couple of pockets of promise.
Only time will tell. Let’s check in again in 2036 and see.
Something Benedict Evans wrote on LinkedIn today struck me as related.
Benedict Evans on Apple treading its own path
I love that phrase "Apple treads its own path." It's hard to do that. Seriously hard. It's why there really is only one Apple. It's also why a company like Amazon has done more with cloud infrastructure than category-defining products. I love generative AI as technology, but it's more cloud infrastructure than iPhone. And that's ok. That's still a big deal. But let's be real about what kind of deal it is.