DOOM! Or, the web developer market in 2024

If you’ve ever referred to yourself as a “web developer”, you’re probably aware that the job market right now is not great.

As someone who was out of a job for a bit, this is my hot take on the current state of things.

What is happening!?!

Maybe you’ve read things online about the effects of the end of ZIRP, pandemic “overhiring”, or the rise of the robots. The reality is that the web, as long as I’ve known it, has always been boom and bust, because it’s currently the only place where you can address a market of ~6 billion people with not a whole lot of effort. If you get that right — boom! If you don’t, the “not a whole lot of effort” is also pretty easy to dismantle. It’s just made out of people! Your office might just be a WeWork[1], and there’s no inventory or machinery. Most of the people who wrote the code even wrote it to work without them! How fortunate for the people running these companies.

[Image: Dr. Manhattan from Watchmen, watching the market for web developers collapse, time and time again.]

But the vibes in 2024 feel different

At least I think they do, for a few reasons:

1. A web developer is not a strategic asset any more

Let’s get some definitions out of the way, because I’m going to use them a few times from here out:

Web developer - A person who makes services and/or UIs that are consumed via a web browser.

Web 1.0 - You go to a search engine and search for a local Thai restaurant. You find nothing relevant, but the first page of search results has a link to a blog where someone has documented the history of Thai food in America through stories of working in their family’s restaurant.

Web 2.0 - You go to Google and search for a local Thai restaurant. You get a map showing all the ones near you, with reviews generated by other human beings. The order of the restaurants on the page is determined by which one paid Google more. You find one that looks interesting and see that they have a page on Facebook where they last posted three years ago. The blog you found in the Web 1.0 days still exists, it’s just on page 1,275,431 of the results.

In both the Web 1.0 and Web 2.0 days, being a web developer was a huge asset. For Web 1.0, there just weren’t a lot of people who knew how to put things on the Internet. For Web 2.0, you rode the wave of the Internet extending to the whole world, and taking input from the world. Web 2.0 was quickly followed (or overlapped) by the rise of mobile phones, which extended the domain of web developers to these devices and introduced new and interesting avenues of development.

Today though, we as developers benefit from all of those solved problems, but suffer from a lack of new, interesting ones. For any given company online today, you’ll find 20 similar competitors, all with the same UIs, differentiated more by their marketing or legal departments than by anything on the web.

And while each of those services still needs an app and a website, the market for new devices to help solve new problems has dried up, unless you really believe in the AR/VR future. And if you do believe in that future, you’d be better off with a background in 3D game development than web development.

And on top of that

2. Web 3.0 is very unfriendly to the web

Alright, so we defined Web 1.0 and 2.0 above — what is Web 3.0?

Web 3.0 - The Internet replaces money. Or, it replaces contracts. No wait, it replaces reality? If that’s too much what if it just replaces websites with a colorful animated blob that reads websites to you?

The blobs reading websites to you might be the most concerning development for traditional websites. While they rely on data from websites (see the big deals made with sites like Reddit and Stack Overflow lately), they also want you to not actually go to the website for information. This has already been a growing issue with Google search, where a simple search returns inline “answers” for results, and even a more complicated search returns primarily ads or the major sites Google wants you to see.

I don’t have any predictions about where this leads us. It seems not great. Certainly there’s nothing stopping anyone from putting a website up on the WWW, but if it’s completely undiscoverable and simply scraped by an AI bot, what’s the point?

Speaking of AI…

3. AI won’t take your job, but maybe it + one of the five billion other people who learned how to code will

I’m not an economist, so I won’t make claims about whether the market for software developers is oversaturated, but it’s a simple reality that there are a lot more developers out there than there were 10 or 20 years ago. Part of it is this idea that “learning how to code” is an “easy” way to get into a middle-class life, a claim that keeps getting repeated over, and over, and over in sometimes truly absurd ways.

Part of it is simple economics I can understand. 20 years ago when I went to college:

  • a degree cost x.
  • you could assume that almost any major[2] would get you at least x per year as a starting salary.

You could do some math based on expenses and projected increases over time and figure out how long it would take to pay off the degree. To put it in simpler terms, I went to school and got an English degree, which was considered a bad idea if I wanted to make piles of money[3], but a fine idea if I wanted a white-collar job. Today, if you’re not in a STEM field, the logic looks more like:

  • a degree costs x
  • the starting salary is GET FUCKED.

Most STEM fields might get you back to a starting salary of x, but if you want > x, going into software is your safest bet.

On top of that most software positions don’t require any professional certifications or further education. If you’re looking to min/max your college education around salary, it’s a pretty clear cut winner.
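The "do some math and figure out how long it would take to pay off the degree" exercise above is easy to sketch in code. This is purely back-of-the-envelope, and every number below is invented for illustration — plug in your own:

```python
# Back-of-the-envelope: years until cumulative savings cover a degree,
# given a starting salary, living expenses, and modest annual raises.
# All figures are made up for illustration.

def years_to_pay_off(degree_cost, starting_salary, expenses, raise_rate=0.03):
    """Count years until what's left after expenses adds up to the degree cost."""
    salary, saved, years = starting_salary, 0.0, 0
    while saved < degree_cost:
        saved += salary - expenses   # whatever is left over goes toward the loan
        salary *= 1 + raise_rate     # modest annual raise
        years += 1
        if years > 100:              # effectively never pays off at these numbers
            return None
    return years

# A degree that cost x = $40k, with a starting salary of x and $35k in expenses:
print(years_to_pay_off(40_000, 40_000, 35_000))  # → 6
```

Run it with a “GET FUCKED” starting salary instead and the loop makes the point for you.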

Also, I’m only talking about the US here. That thing I mentioned before, about Web 2.0 reaching the entire world, inspired people around the world to become developers. I won’t get into the pros and cons of sourcing talent from other countries, but I will note that we’re rapidly approaching a future where the industry assumes every role will be part developer, part AI, and junior and mid-level roles become a person who knows some code and how to ask AI for the rest. AI might not take your job, but it will make standing out a lot harder.

Now the good stuff

[Image: 4 panel Anakin meme]

I don’t have a great conclusion here. I am still a “web developer”. Currently I’m working on something that is web based, but you’d never find it unless you work in a niche field. I intend to keep doing this up until we’re at the point where you can yell “go go gadget website!” to generate a fully functioning SaaS.

I will say — I think there’s a lot of opportunity to make things better online. If you venture off the main drag of the Internet, away from the shining Salesforce megatower or Apple spaceship campus or Facebook legless metaverse portal, you’ll find that things get shitty real quick[4]. Local businesses and restaurants at least made an attempt to be online during the pandemic, but you’re more likely to find them infrequently updated with no online sales presence, or at best using something called STEVESALES that’s a pile of CGI scripts written in 2001 by a guy named Steve who currently raises alpacas in Vermont. My daughter’s school puts report cards on a site that frequently renders nothing but a blank page, every type of doctor I see has a different patient portal that is varying levels of obtuse and frustrating, and the last time we tried using a local bank we gave up because the website appeared to have been made by the apes from 2001: A Space Odyssey that didn’t touch the obelisk.

That went into a bit of a rant, but here’s what I’m saying: we might be past the “shiny toy” days of the Internet, but this idea of people getting information and communicating with each other online is going to go on forever. If you like building things for the web, find the little weird corner that interests you the most, and make it a little better. Maybe that’s not a sure shot to a job where you commute in a car where the doors go up and down instead of side to side, but then again the early days of the web never held that promise, and people built it anyway.


  1. uh, unless WeWork is gone? It’s hard to keep up.

  2. OK, sure, certain degrees even 20 years ago were known to be unlikely to be economically viable, but they were also often studied by people who didn’t have to worry about things like economic viability.

  3. Unless you went into law.

  4. I mean the missing legs was shitty but I have not personally built a metaverse, so maybe there’s just complexity I don’t understand.
