There's No Such Thing as a Full Stack Developer

Language matters. Giving something a label and a definition takes an abstract, ill-defined idea in our minds and makes it "real." Once this abstract concept becomes "real," it frames the terms of the debate by forcing even those who oppose it to argue within the construct it defines.

"Language matters because whoever controls the words controls the conversation, because whoever controls the conversation controls its outcome, because whoever frames the debate has already won it..."

Erica Jong

Let's take the concept of a "full stack developer." This is a fairly recent concept, but its use has picked up steam lately. There seems to be a constant stream of posts on sites like Hacker Noon or Medium that try to help junior or aspiring developers become a "full stack developer." More and more companies are posting jobs looking for "full stack developers."

However, I want to argue that simply by creating and using the term, we've solidified "full stack developer" into a concept that reframes how we (and especially the employers who'd hire us) look at the skills required to do the job. What was once a preposterous list of job requirements that many of us used to mock now gets lumped under the term "full stack developer."

Let's look at a (by no means comprehensive) list of what are often cited as the "essential skills to become a full stack developer":

  • HTML/CSS including:
    • Front-end frameworks like Bootstrap or Foundation
    • CSS preprocessors like Sass
    • Responsive and/or adaptive design
  • JavaScript including:
    • JavaScript frameworks like Angular, React or Vue
    • JavaScript toolchains featuring tools like TypeScript, Babel, ESLint, and npm
    • JavaScript testing with tools like Jasmine and Mocha
  • Back-end development, which, depending on the needs of the employer, may include:
    • PHP
    • Node.JS/JavaScript
    • Ruby
    • Python
    • C# or Java
  • Database development including:
    • RDBMS like MS SQL
    • NoSQL data stores like MongoDB
    • In-memory stores like Redis
  • Web application architecture including:
    • Leveraging serverless/cloud services for a microservices architecture
    • Deployment to various platforms including AWS, Azure, Heroku, etc.

Let's admit that this is an impossible list, even assuming only a minimal level of expertise in some or most of these. I also left out things like understanding web application security, managing version control, and configuring a web server, which are all essentially assumed skills.

<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">The &quot;full stack&quot; dev is a dying breed. Just keeping up with JS &amp; React ecosystems is a full-time job! Redux, MobX, GraphQL/Apollo/Relay, Jest, Enzyme, Babel, Webpack, ESLint...plus frequent React releases, annual JS versions, and countless npm packages!<a href="https://twitter.com/hashtag/javascript?src=hash&amp;ref_src=twsrc%5Etfw">#javascript</a> <a href="https://twitter.com/hashtag/greatproblem?src=hash&amp;ref_src=twsrc%5Etfw">#greatproblem</a></p>&mdash; Cory House 🏠 (@housecor) <a href="https://twitter.com/housecor/status/974700015414382592?ref_src=twsrc%5Etfw">March 16, 2018</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

Listed out in a manner such as this, the requirements of a "full stack developer" seem laughable. However, as an employer, I can still imply all of those same requirements under the acceptable term "full stack developer." Recruiters and others training developers in the industry are reading things like "6 Essential Tips on How to Become a Full Stack Developer" or "What the Heck is a Full Stack Developer?" that attempt to normalize the definition and make it seem more reasonable and palatable.

The net result is that even if companies don't get everything they want on this list, they've still managed to ratchet up the requirements for being a successful developer - and made it acceptable to ask for everything in the first place. They've made junior devs and future devs aspire to the qualifications of a supposed "full stack developer."

<blockquote class="twitter-tweet" data-lang="en"><p lang="en" dir="ltr">After reading an article on Hacker Noon, I am reminded that I&#39;m not a fan of the term &quot;full stack developer.&quot; I think it gives companies an easy way to unrealistically ask for everything and sets a unachievable standard, especially for junior devs.</p>&mdash; Brian Rinaldi (@remotesynth) <a href="https://twitter.com/remotesynth/status/971074236596084738?ref_src=twsrc%5Etfw">March 6, 2018</a></blockquote>

In effect, we have allowed the "full stack developer" term to frame the debate. We should not. It is a term that defines something that doesn't exist - that cannot exist, because it is an impossible standard. We can begin by refusing to use the term ourselves. We can try to cut through the BS being fed to junior devs and aspiring devs so that they don't see the term as an impediment to future success. And we can ask our employers not to use the term - to lay out the actual requirements rather than hide them under a ridiculous title. If you have a good job at a good company under the "full stack developer" title, try to convince them that a clearer title reflecting the specifics of the role would serve everyone better.

Hopefully, the less willing we are to accept this term, the less we'll see it used, and the more we can regain control over the discussion about what it takes to become a successful developer.

Note: This post was originally published on my blog.