Progressive Enhancement—Ain’t Nobody Got Time for that

Note: Content might be outdated!

I had the chance to witness an exchange of ideas between two WordPress developers I personally respect a lot, Andrey Savchenko and Joe Hoyle. The discussion was started by Andrey and centered around a JavaScript/REST-API-powered site and progressive enhancement as a development principle—or the lack thereof. This post is a summary of the key arguments of both sides, appended with a couple of thoughts of my own.

“Progressive Enhancement, Please”

There is excitement in WordPress community about upcoming REST API. It represents an opportunity to build JavaScript driven sites on top of WordPress. Unfortunately the trend seems to ignore progressive enhancement principles.


In his blog post, Andrey takes the familiar stance in favor of progressive enhancement, i.e., why one should care about disabled JavaScript: JavaScript can break, and when it does, the content of a site should still be accessible to users.

He fully acknowledges server-side JavaScript as a key part of the typical hosting stack for a JavaScript-driven site, while at the same time identifying it as a “facility which PHP-powered WordPress core simply doesn’t have”.

Therefore, WordPress developers would typically still be dealing with a scenario where their JS-powered site “only receives data via API and is templated client-side”—and should thus rely on progressive enhancement to deliver content in any case, whether JavaScript gets processed or not.
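Andrey’s baseline can be sketched in a few lines. This is a hypothetical example of my own, not code from either post: the server renders a plain HTML list of posts as the baseline, and JavaScript, when it runs at all, enhances that list with fresh data from the WP REST API’s `/wp-json/wp/v2/posts` endpoint. The function names and the `#post-list` container are illustrative assumptions.

```javascript
// Pure templating step: turn an array of WP REST API post objects
// into list-item markup. (Post objects expose `link` and `title.rendered`.)
function renderPosts(posts) {
  return posts
    .map((p) => `<li><a href="${p.link}">${p.title.rendered}</a></li>`)
    .join("");
}

// Enhancement step: only replaces the server-rendered markup when
// everything it needs (fetch, the container, a successful response)
// is actually available. On any failure, the static HTML stays intact.
async function enhancePostList(doc) {
  const list = doc.querySelector("#post-list");
  if (!list || typeof fetch !== "function") return; // keep baseline HTML
  try {
    const res = await fetch("/wp-json/wp/v2/posts");
    if (!res.ok) return; // keep baseline HTML
    list.innerHTML = renderPosts(await res.json());
  } catch {
    // Network or parse failure: the server-rendered content remains.
  }
}
```

The point of the sketch is the order of operations: the content exists before the script runs, and the script only ever improves on it.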

“Progressive Enhancement Evangelists Hold Back Innovation”

I don’t value progressive enhancement very highly and don’t agree it’s a fundamental principle of the web that should be universally employed. Quite frankly, I don’t care about not supporting JavaScript, and neither does virtually anyone else.


Joe’s reply reads like the exact counterpoint to Andrey’s argument. Not only does he frankly state his own disinterest in supporting static fallbacks for JS-powered sites, he also argues that advocates of progressive enhancement “hold back innovation by working to lowest common denominator theory—allowing technology rather than pragmatism to decide what is best for the users”.

Joe agrees that theoretically a site not requiring JavaScript to deliver any content would be preferable, but that in practice “one has to prioritise what we’ll support and not support” and that “developing for no-javascript is just typically not a priority and I think shouldn’t be”.

To support his argument he lists a series of screenshots of larger sites like Facebook, Netflix and others, taken with JavaScript disabled. None of those sites seems to provide full functionality without JavaScript, and three of them blank out completely, with no visible content at all.

He closes with the general remark that, depending on whether the reader is a person “who lives by principles rather than case-by-case pragmatism”, they might or might not strongly disagree with him.

Ain’t Nobody Got Time for that!

I said o Lord Jesus, it’s a fire! Then I ran out, I didn’t grab no shoes or nothin’, Jesus, I ran for my life. And then the smoke got me, I got bronchitis—ain’t nobody got time for that!

Kimberly Wilkins

When I read Joe’s response to Andrey’s initial post, I couldn’t help but immediately think of Kimberly Wilkins. (You might remember her local news appearance that went viral in 2012.)

Although I haven’t written much code myself for a while, I still read a lot to stay on top of things. I was quite familiar with the concept of progressive enhancement, to the point that I considered it a principle every self-respecting developer would see as a requirement, not an option. (And I still do.)

And then: Joe. 🙂

Never before have I heard a developer publicly state with such clarity that he basically doesn’t give a heck about a practice many, if not most, renowned voices of our industry (at least the ones I’ve read) consider a requirement for modern web development.

The argument is as simple as it is compelling: pragmatism. Progressive enhancement may be desirable in theory, but in practice? Ain’t nobody got time for that. Proof? Facebook, Netflix, Bank of America, AMEX—big players who without any doubt could provide the resources to implement no-JavaScript versions of their sites if they considered them valuable.

I don’t have any problem buying into pragmatism as the main and often pressing reason for not investing in a no-JS fallback. The idealistic nature of a design directive like progressive enhancement is very clear to me, and so are the typical restrictions in client projects (budgets, deadlines, decision-making processes).

What both frightens and, then again, fascinates me is the notion in Joe’s post of simply ditching principle altogether—even as an ideal that may do nothing but set direction—and instead making decisions that potentially affect millions of people based upon what he calls “case-by-case pragmatism”.

“I ran out, didn’t grab no shoes or nothin’, Jesus”—to me, that is a valid employment of case-by-case pragmatism. You don’t waste a thought on principle when your life is in danger.

Is this the type of situation project managers and decision makers for website projects find themselves in on a day-to-day basis? How can one build anything that’ll last for even the short life cycle we’ve come to accept in web technology, when one’s decisions are based on ad-hoc standards and passing phenomena?

Wouldn’t a house be built as a solid structure first, onto which decoration of the current taste is then applied? And wouldn’t the decoration change over time while the underlying structure holds for years, hopefully centuries?

Or have building principles like these become outdated?

Do we have to acquaint ourselves with a new concept in which things—digital or physical—are built solely out of decoration, without any tangible inner core? Is a website no longer a document written in Hypertext Markup Language that then becomes subject to design and interaction, but are modern web apps rather dynamically generated conglomerates of … what, pixels?

I’m not entirely sure, but I don’t think so.

As long as there is content made by humans to be comprehended by humans, semantics do matter. As long as semantics do matter, there is a text document structured with intention. As long as there is an intentional structure in a text document, it can and must be used as a foundation to derive design, motion, and interaction from.

Ain’t nobody got time for progressive enhancement always, maybe. But entirely ditching principle as a compass for resilient decision making won’t do.

It’s the nature of the web as a continuum for support and capability to be in a constant state of flux. Embracing this variability means that we, as web designers, must prioritize content delivery to all browsers (usually via HTML and mobile-first structured CSS), and progressively enhance from there.

Trent Walton

Update 2015-11-30: Mike Little, co-founding developer of WordPress, chimed in on the discussion with Thoughts on Progressive Enhancement and Accessibility.