Anderycks.Net by Deryck Hodge

The kind of developer who has feelings about AI

AI is a hot topic among developers these days. And I mean hot. There’s clearly a divide amongst us. I see it most sharply between the devs I know at work and the devs I know in open source. Work devs are all in on this stuff. My open source friends, not so much.

It would be too easy for me to draw the conclusion that the AI divide among my friends and colleagues is a divide between those who just want to ship software and those who see some intrinsic value in code. There’s been plenty of writing lately trying to draw similar conclusions. The best starting point for this is Les Orchard’s Grief and the AI Split.

Before AI, both camps were doing the same thing every day. Writing code by hand. Using the same editors, the same languages, the same pull request workflows. The craft-lovers and the make-it-go people sat next to each other, shipped the same products, looked indistinguishable. The motivation behind the work was invisible because the process was identical.

It’s a really great, thoughtful post. I think there’s some truth to it, maybe even a lot of truth. The reason I can’t so easily accept this explanation is that it’s not so neat and clean for me. I love craft, and I also love to ship software. I’m a weird mix of pragmatism and ideology. I’m definitely the kind of developer who has feelings about AI, but I’m not sure what they are. Maybe Les is right, and I’m just at war with myself. I'm definitely going to keep writing, keep working it out for myself, until I figure out what I really think.

Our brains are not really like computers

We’re living in peak hustle culture now, at least I hope we are. I was struck by this thought while watching this discussion between Arthur Brooks and Cal Newport on how to have meaning in a distracted life.

They mentioned “hustle culture” in passing when describing the current moment, but then spoke of it as something that started in the early 2000s, something going on 25 years now. I mean, it’s obvious, right? We’re clearly fixated on hustle culture and min-maxing life, but also, it struck me as a fresh observation, this idea that it’s been going on for this long. And then I thought, have we reached peak hustle culture yet?

This moment won’t last forever, of that I’m sure. We’re now able to turn a critical eye on the way the industrial revolution infected our view of everything. We can see how having schools be little factories for children isn’t the best idea. Or maybe we’re not all neatly divided into worker and owner classes. Or maybe a body isn’t so similar to a machine after all.

That’s the problem with metaphors. They’re great for writing, great for distilling complex ideas into understandable imagery. They make fiction sing. But that’s the thing — they’re fiction. We’re not actually all machines. Our brains are not really like computers. We don’t think in binary. The world cannot be understood that way. Not really. One day, we’ll wake up and realize it, or at least move on to the next thing and the process will start all over again.

Thinking about the news business, social media, and AI

My developer career was born in the news business. While working on web sites for Auburn University Libraries, I started doing more and more in open source, especially with Django. That led me to work for a small paper in Naples, FL, which led to a job at the Washington Post. I think about my time at the Post a lot, even more so recently.

So many things were happening then. My youngest daughter was born right as I joined the Post. I’ll never forget the month and year of starting there! It was October 2006. Not long after that came the iPhone. Then, social media. I was working on what I think we called the “Special Projects” team, so I was helping build stuff for all of that.

We were trying to figure out mobile web sites then. This was before iOS apps. We were also trying to figure out integration with social media. I spent a week in California at Facebook and actually sat in a conference room with Mark Zuckerberg as he explained to us their new F8 platform launch. F8 is what they were calling their platform for embedded integrations. Think little pieces of the Washington Post’s users and data running on Facebook’s web site. We were going to build one of those first embedded apps. All of this was super exciting.

Looking back now, some of it has aged well, and other parts, not so much.

My daughter, she clearly aged well. She’s 19 and deep into her first year in college. The iPhone, likewise. Clearly, this is an essential device for so many people. The mobile web is the only web for a lot of folks. Social media, on the other hand, is a mixed bag.

I can remember being really excited about Facebook and Twitter when I first joined those sites. It felt like we were building something completely new. There was all this talk of social graphs and connecting the world. News sites in particular couldn’t wait to figure it out. The newspaper business was in decline, and there was hope social would be a new source of attention and revenue for the business. I really can’t say enough about how uniformly excited people were about the possibilities then.

Then came 2016. Cambridge Analytica. Political polarization. Mental health issues. Anxiety. Anger. Isolation. The very thing that was meant to bring us all together has driven us apart. Just two days ago, people were celebrating as Facebook and YouTube were held accountable in the courts here in the US. News and media once craved social media attention, and now we’re all focused on “direct to consumer,” trying to build our own relationship with our audiences. 2006 seems such a long time ago.

I think it’s too easy to dismiss social media as uniformly bad. There is real utility for keeping in touch with friends and family. Small businesses have options for reaching people that really can help a business grow and thrive. My older daughter is a hair stylist, and she went from 0 clients to an overbooked schedule in a matter of a few months, largely thanks to her posts on social media. Maybe the bad outweighs the good, but it's not all bad. Either way, it's complex and clearly not the panacea we thought it would be in 2006.

I said up top that I’ve been thinking about all this a lot recently. I joined Warner Bros. Discovery to work on CNN last year. I’m back in the news and media industry again, it’s 2026, and what was old feels new again. I see a lot of similarities in that social media fervor to how people are thinking about and embracing AI today. It’s a little different. For one, it’s not all so uniformly positive here at the start. A lot of programmers and business people are excited about AI, but creatives and journalists are skeptical at best and fearful at worst. What happened with social media and big tech has left so many people jaded. Maybe even as a result of social media, we’re more polarized about any new development. But still, there’s something that feels similar.

Maybe it’s just me and my circumstances. The cable news business is in decline, the way newspapers were back in 2006. AI will almost certainly change how we get and share news, the same way social media did back then. The question is whether it will be a largely net positive, like my daughter’s birth and the iPhone, or something we’re left regretting but for a couple of pockets of promise.

Only time will tell. Let’s check in again in 2036 and see.

AI Is Cloud Infrastructure, Not the Next iPhone

Two things can be true at once. I am becoming convinced software engineering as an industry is forever changed by LLM coding agents. I also think the AI hype is way overblown. Folks are out here talking about curing cancer and solving climate change with AI. I think it’s much more likely that AI is just normal technology and something more akin to cloud computing than the iPhone. So AI can be great and amazing and also perfectly normal. That makes complete sense to me.

Something Benedict Evans wrote on LinkedIn today hit me as related.

Benedict Evans on LinkedIn: "Apple treads its own path"

I love that phrase "Apple treads its own path." It's hard to do that. Seriously hard. It's why there really is only one Apple. It's also why a company like Amazon has done more with cloud infrastructure than category defining products. I love generative AI as technology, but it's more cloud infrastructure than iPhone. And that's ok. That's still a big deal. But let's be real about what kind of deal it is.

Pragmatism and AI Coding Tools

Over the past few months, the way software engineers talk about AI has started to change. I hear it in Slack threads at work and conversations with other developers. I read it more and more in other developers' blog posts. For a while, the conversation was only polarized. Some were convinced, and some still are, that large language models will replace programmers entirely. Others dismiss the hype. Recently the tone feels more practical, even among skeptics.

Glyph, who has written critically about LLMs, recently published a post asking What Is Code Review For? He's so spot-on about code review, btw – go read that post! – but then drops a surprise. The post is actually about LLMs!

He writes:

Sigh. I’m as disappointed as you are, but there are no two ways about it: LLM code generators are everywhere now, and we need to talk about how to deal with them.

I find my own point of view similar to Glyph's:

My own personal preference would be to eschew their use entirely, but in the spirit of harm reduction, if you’re going to use LLMs to generate code, you need to remember the ways in which LLMs are not like human beings.

That's kind of where I am. I have my own qualms about this. Anyone paying attention probably should. These systems raise real questions about what it means to write code, how engineers develop skill, and what our work might look like a decade from now. Never mind that the people pushing this stuff hardest are just creeps.

But still, I’m pragmatic. If these tools are going to be part of our future, then the responsible thing to do is to understand them, and understand them well enough that we can decide how best to ethically and humanely incorporate this technology into our lives.

Humanism, Religion, and Disney Vacations too

I had this thought while trying to make sense of all the week’s news. Recent stories about the chaos of business and the greed of tech are an almost weekly feature these days. It's so exhausting. That thought I had: when did humanism go out of fashion?

Which is also to say: how did I end up here?

I never imagined a life for myself in tech and business. It was an accident, the kind of accident born from a confluence of circumstances. One thing leads to another, then consumes your life, and here you are.


As a kid, I thought I’d write or draw comic books. I was a little artsy nerd. Then I found music, and the world opened for me, both literally and figuratively. I left home to be a touring musician. After a few years of that, I started to split time between touring and going to school for an English degree. College brought me back around to writing. I’d like to think I started to take writing seriously. I imagined then that I would write literary fiction and be a teacher. Maybe even a professor.

During that time, I took a geography class to satisfy some core curriculum requirement. The geography professor was excited by the early web, and for a couple weeks, we learned to build web sites by hand to host our class projects. I was hooked on web development. I imagined using the web as a means to share my writing. I was one of those naive 90s web enthusiasts. I saw the web as a tool in the hands of artists, professors, and writers. It was more text than technology for me.

I started grad school, while also doing more and more on the web. I was an older student, already married a decade at that point, and my wife and I were ready to have children. I’d better get a real job, I thought, and after a year of teaching high school English, I took my first technical job at Auburn University Libraries.

From there, it was a developer role for a small newspaper. Then, the Washington Post. Canonical. Amazon. Disney. Apple. Some others sprinkled between, and now, CNN at Warner Bros. Discovery. (God rest its soul.)

The web has been good to me. I really shouldn’t complain.


Liberal arts degrees, like an English degree, spend a lot of time on the early Renaissance and the beginnings of humanism. It’s that moment in time when modern, rational thought took hold. You can draw a straight line from the Renaissance to classical liberalism, from rational thought to modern science, from humanism to democracy.

It makes sense to focus on it. We are who we are today because of humanism.

You cannot, however, truly understand humanism without understanding the context into which it was born. It seems simplistic today, but back then, it was religion versus reason. Religion granted authority to kings. It forced servitude. You didn’t matter. The church mattered. Land mattered. Your relationship to the land as nobility or peasant mattered.

Humanism came along and said everyone mattered because of individual worth and agency. It wasn’t God who gave you meaning. It was your own existence. It was a radical idea, and though modern churches still denounce humanism, especially in evangelical circles, most religious folks — at least here in the western world — recognize individual liberty and personal autonomy as basic human rights.

That’s humanism.


When I say I never thought I’d build a life for myself in tech and business, I mean that it wasn't even imaginable to me. I grew up splitting time between two households. My parents were divorced. In my dad’s house, it was the Wall Street Journal over breakfast. In my mom’s, it was tears of regret over coffee and a cigarette.

I never wanted much to do with either approach, largely because both seemed hollow. Business felt like a stuffy, greedy endeavor. Those tears of regret for my mom — they were born in religious superstition.

Religion and business were things best avoided at all costs.


I had to attend Christian school from ages 10 to 14. It was oppressive and insulting. A pastor who was also my teacher at school told me I would never be a real man because I lived alone with my mom. The pastor didn't know, or couldn't be bothered to learn, that I spent Christmases and summers with my dad. People select facts that fit the narratives they want to tell, both then and now.

I have this vivid memory from my days in Christian school. I don’t remember the exact age, but I was young. I’m guessing elementary school aged.

I took a trip to Disney with my grandparents on my dad’s side. They were the best, the one part of my family that seemed normal and loving. I was excited about this trip and talked about it at school. I’ve always been a talker, always loved a good story. Then, the first Sunday after sharing my excitement at school, the pastor — the same one who said I would never be a man — preached on the evils of Disney and how good Christians would boycott that filth.

Boycotts are the modern age’s religious sacrament, a way to advertise one’s devotion via the most consumer-friendly means possible. I grew up with boycotts. I also see boycotts regularly employed by my circle of liberal, technological friends. Don't drive a Tesla; Elon is evil! AI, the end of humanity! And, of course, social media is the cause of all that ails society!

It’s funny how there isn’t much daylight between the religious fundamentalists I grew up with and the techno-fundamentalists who make up my professional circle. Fundamentalism is convenient. It’s a way to draw a line in the sand. Us versus them. Heathen versus religious. Owner versus employee.


I guess the path I’ve taken has made it less clean and easy for me. I think of myself as a person of faith, at least in the "substance of things hoped for" sense, but I still see the harm organized religion causes. I’m an intellectual and welcome deep, thoughtful engagement with life, but I don’t feel trapped in my mind. I’m a creative person, I hope an artist even, but I earn my living from a mostly boring, corporate job. I'm equal parts writer and web developer.

I’m an individual, all messy and complicated and hard to pin down, and that’s ok. That’s how it should be. We need more of that. More individualism, less dogma. More learning and understanding. More celebration of what makes us unique as human beings. If we had more of that, there'd be less corporate greed, less desire to see humanity replaced by computers. That’s what humanism taught me. So more of that please, and more Disney vacations, too.

FWIW, Disney vacations are still my favorite.

The True Cost of Principles

Yesterday morning, over coffee, I was reading the latest Benedict Evans newsletter. In the section "Principles cost money," Ben quoted from advertising legend Bill Bernbach:

The ad industry guru Bill Bernbach said “it’s not a principle until it costs you money”. I left Twitter two years ago. It probably cost me money.

I followed a link to Ben's newsletter from this post of his on Threads:

Tim Cook has spent a decade or two talking about principles and ‘fundamental human rights’ and putting rainbow flags and ‘allyship’ on Apple surfaces, and then he goes to watch a movie at the White House, and says nothing.

“It’s only a principle if it costs you money”

This struck me profoundly. I've been feeling the collision of my principles and my own relationship to tech, to politics, to culture for at least the last year.

I think seriously about returning to Linux, to get out of the Apple ecosystem, but I go back to my Mac after a month. It's hard. Well no, not hard. Inconvenient. I miss the apps I love. I miss the ease with which my digital life carries with me across devices, anchored and mobile, all at once.

I think about leaving social media entirely – to be clear, I'm not that active now – but then FOMO takes over. Sigh. That FOMO. It's a force, invisible and inescapable, like the black hole at the center of our galaxy. You don't realize it's there, but you're trapped in orbit around it.

Then, there are all the other things I mentally charge toward and then backflip away from.

I should retire from tech, build a business for myself. No, I can still make a 10 year run somewhere.

Writing, that's all I've ever wanted to do. No, I've been a developer so long, that's what I really am.

Stop using gen AI, it's sleazy. But this comic I made with the help of AI, it's kind of cool!

Today, I realized: it's the cost that keeps me trapped in this spin of indecision. It's the cost, the loss, that I want to avoid. Maybe not money, but it's something. Loss of pride, that fear of looking silly or dumb. Loss of respect, of being found out for the lack of depth of understanding and commitment I carry with me. Failure, that true loss, the true cost you might pay for trying to be more than you are.

In our modern world, money is the surest test of success. It's the thing Benedict Evans knows best. He is primarily devoted to investments and business, and so, it's the framework that makes things make sense to him. But loss of any sort, it's the thing we all most want to avoid. Loss is the real cost, and if your principles don't carry with them the risk of true, fundamental loss, they must not be principles at all.

The Obligatory 2026 Post

The new year is here, and I'm naturally in a reflective mood. What a year 2025 was for me. I changed jobs, saw my youngest graduate and start college, moved twice, and so much more. I expect 2026 to be less chaotic and more focused. At least that’s what I’m hoping for. 

I've never been one for New Year's resolutions. I don't like the idea of making plans once and executing on them. I'm more of a be in the moment and adjust as you go kind of person. But even with that, I can't help but look ahead and think about what I'd like to get done this year.

Let's start with my work at CNN.

To say I'm grateful for this job is an understatement. My prior two roles didn't work out as I had hoped. I was starting to get a bit down about my ability to land something that could be a long-term role for me. CNN came along at the perfect time. I've now got a role where I enjoy the work I do, it's a good technical fit for me and my expertise, and I've got great colleagues to share the journey with. I want to maintain that, so any plans I have around work are to be consistent in my role and try to grow within it.

We're doing more with Rust and edge compute, so I want to really learn Rust well this year. I've dabbled before, and I know enough Rust to be dangerous. I want to grow my technical expertise there. I also want to re-up my connection to Python and the Python community. I'm thinking of going to PyCon this year. For all my experience in Python and its open source community, I've never been to a PyCon. I hope to change that this year. I'm thinking about re:Invent this year too. AWS is a mainstay in my toolkit, both personally and at work, and I'd like to be more involved in re:Invent, even if it's the end of the year before we get there.

Outside of work, I'd like to spend more time with the things that help me feel more present.

One of those is travel with my wife. We're empty nesters now, and we've been talking about taking more weekend getaways. It feels cliche to say it, but it's true – my wife Wendy really is my best friend. We enjoy each other's company, and we love travel. I'm excited about visiting nice locales within a day trip of Atlanta. I hope we literally get away as often as we can.

I also want to find some opportunities to play more of the Marvel Multiverse Role-Playing Game with friends or other local Atlanta players. I love superheroes, comics, and playing games with friends. MMRPG combines all those. I took up the game a little over a year ago and played a few times last year. I'd like to really lean in this year and play more, especially in person with others in Atlanta.

So 2026, here we go. Onward and upward. More focus. More living in the present. And hopefully I'll do a little better at sharing updates on all that here on my blog.

Weekend Plans

Reading

  • The Ministry of Time by Kaliane Bradley. Cannot put this down or say enough good things about how I’m enjoying this book.
  • Catching up on a bunch of tech news pieces and blog posts I marked in Tapestry.

Watching 

  • Latest episode of Pluribus — I can’t say enough good things about this show.
  • Binging Outlander with my wife. We’re currently starting on Season 4.

And speaking of Outlander…

We’re going to the GA Renaissance Festival on Saturday. My daughter loves it, and my wife and I are going to check it out. I’m going in Scottish Highland attire.

My family is giving me no end of grief for suddenly getting in touch with my Scottish heritage since watching Outlander. But it’s a fun escape this weekend, and it feels overdue after the week behind us.

Being Present When We Live in the Future

I’m thinking a lot about writing lately. I also spend a lot of my time working on technology. There’s actually a tension there. It might not be obvious, but it’s there. Writing is about slowing down, savoring the language. Every. Single. Word. Technology is about speeding things up, about doing more with less. The oft-talked-about and (depending on your perspective) dreaded: productivity.

I was out walking earlier this week, here in Atlanta, maybe headed into work, maybe out for exercise. I was on this part of a street near my home where the trees dotted both sides of the road as it stretched out toward the horizon. High-rise apartments and corporate offices stood off in the distance, reaching into the clear, blue sky. People were walking by, listening to music on their iPhones. The AirPods were the giveaway. A lone Uber delivery robot was creeping up the sidewalk across from me, and I thought, “Wow, I’m living in the present and the future, all at the same time.”

That’s writing and technology. One takes us into the future, the other makes us present. The world is a little slanted toward technology at this point, and it shows. We're starting to wear down under the weight of that singularity. There's a workaround, though. It's being present and writing about it.
