In March 2012, Guy Podjarny ran a test comparing the performance of hundreds of shiny new responsive websites across four different screen resolutions. The results were very disappointing.
Two years into the rise of responsive web design, after every imaginable sort of designer and developer had jumped on the bandwagon, it took a single test to shake the theory to its foundations.
Guy proved that almost every known responsive site was overweight.
~ Jason Grigsby
But, more importantly, every mobile user was downloading the same kilobyte payload as a desktop user.
The community had varied reactions to this. Some claimed responsive design wasn’t the ultimate solution, perhaps not mature enough for the challenges web designers face today.
Thankfully, the Web community can always count on a number of people who will grab the bull by the horns and turn the situation around.
~ Christian Heilmann
Web performance has traditionally been built around (no offense) developer-exclusive jargon. Terms like GZIPping, uglifying, minifying, DNS lookups, file concatenation… This obscure vocabulary pushes designers out of the equation.
Smart people in the community, though, have since realized that the problem has a deeper root. It really doesn’t matter whether you optimize or compress an ultra-high-res image if your plan is to hide it from mobile users and still make them download it.
~ Brad Frost
To achieve truly lightweight sites, performance shouldn’t only be a concern, it should be treated as a design feature.
Performance is like any other issue. Sites that overcome it are the ones who acknowledged it from the beginning. And the ones that overlook it are the ones that suffer for it in the end.
~ Brad Frost
- The Why
- Sounds reasonable, right?
- And How?
Let’s Get Technical!
- Image Techniques
- Responsive Images
- Compressive Images
- Vectors vs. Bitmaps
- Icon Fonts
- HiDPI Images
- What’s Next
- Asset Loading
- CSS, Images
- Advertising, Social Widgets or any third party assets
- Old-school Performance Techniques
- Reduce the number of HTTP Requests
- Reduce the number of Bytes
- In Summary
The Why
Research shows that 57% of users will leave a site if it takes more than three seconds to load.
Google, Page Speed & SEO
As of spring 2010, Google takes speed into account as a ranking factor. The impact is not major for average-speed sites, but if a page falls behind a certain threshold, it is penalized by the company’s search algorithm.
This underlines that speed is a genuine concern when we talk about user experience.
Back in the day, people used to talk about the rather abstract concept of ‘Mobile Context’. Google’s well-known theory breaks mobile users down into three types:
- Repetitive Now: People that use their phone to stay up to date with ongoing, repetitive changes (sports scores, Facebook feeds or stock market)
- Bored Now: Users that take their phone out while waiting for something to happen
- Urgent Now: People who need to find something specific right away (an address, directions, a phone number)
Sounds reasonable, right?
Well, the truth is, there is no such truth. There is no ‘mobile context’. People use their phones while walking down the street, traveling by train or relaxing at home. They do everything, everywhere!
Phones follow people everywhere, so people use them anywhere.
~ Tim Kadlec
Luke Wroblewski highlights some really interesting stats:
Where are people using mobile devices?
- 84% at home
- 80% during miscellaneous downtime throughout the day
- 76% waiting in lines or waiting for appointments
- 69% while shopping
- 64% at work
- 62% while watching TV (alt. study claims 84%)
- 47% during their commute to work
As new situations emerge, as new markets and different habits rise, mobile context will change. We can safely assume that the concept of mobile context will always be on the move until people stop using mobile phones.
This leads us to keep an eye on bandwidth. There is only one scenario in which you can serve users an obese website and get away with it: serving it to their MacBook Pros while they are at home on a full-speed connection.
But all the other possible situations, and there are a great many, have to be covered as well. These include the seemingly endless stream of devices poured into the market every day, which, of course, people use to visit websites.
~ Karen McGrane
They include the countries that had few smartphones a few years ago but are now moving forward relentlessly.
~ Karen McGrane
But more importantly, they include all the places people will be when they use your site. So you have to account for every bandwidth. It is not only the inhabitants of the poorer areas of the world who lack fast data connections. Users will try to access a site at work on a 100 Mb/s connection; at home on anything from 2 to 30 Mb/s; and also over 3G, over 4G, on a metered data plan, and so on.
To put it bluntly, responsive design is no longer about screen sizes but about different scenarios, so the solutions must be flexible, adaptable and thought out from top to bottom.
And How?
Well, glad you asked.
We said earlier not to treat performance as a bunch of automated, server-side tasks bolted onto an already doomed site. There are ways to tackle these concerns and turn them into a competitive advantage.
What to avoid
Guy Podjarny cites three key reasons for the number of bloated responsive websites we see out there:
- Download and Hide: Assets are still downloaded, but hidden
- Download and Shrink: High-res, desktop-level images are downloaded, then shrunk to fit the user’s screen
- Excess DOM: There is no way to avoid browsers parsing and processing all areas of the DOM, including the hidden ones
A Preemptive Approach
There’s a great deal of information out there about why websites keep failing to meet performance expectations. But most of it boils down to the same advice: be responsible from the start.
All the techniques I’m going to cover have been around for a while. To me, the interesting part is how they mix and intertwine, covering each other’s flaws and combining their strengths. Now, deep into the mobile explosion, they are showing just how powerful they are.
Progressive Enhancement
…is all about providing a web experience reduced to the essential and taking it from there.
A couple of years ago, this principle was applied mostly from a browser point of view. Amid emerging technologies like HTML5, CSS3 and jQuery, web makers had somewhat forgotten about their users. A sizeable percentage of visitors were getting an incomplete version of a site that relied a bit too much on shiny new tech.
Now that WebKit engines, Firefox and others have taken over much of the market, the problem is the enormous number of devices whose browsers don’t have the capabilities of an iPhone or a Samsung flagship. Again, Progressive Enhancement is the only approach that takes care of these forgotten players first and leaves the shine for the ones that can take it.
Mobile First Development
Back in 2009, Luke Wroblewski proposed designing mobile first for three reasons:
- Mobile is exploding
- Mobile forces you to focus (allowing you to get rid of the clutter that stems from having too much screen real estate)
- Mobile extends your capabilities (with technology like GPS, geolocation, multi-touch gestures, accelerometer, cameras…)
Since then, web design has been rapidly shifting to this approach. Along the way, many designers and developers have pointed out that building mobile-first gives you an edge over desktop-first development (closely related to Luke’s second point above). Progressive Enhancement and Mobile First development have fused, in a sense: devs start building for mobile and progressively enhance from there, treating larger screen space as an enhancement on top of a mobile core foundation.
Jordan Moore offers a good summary of the reasons. He argues that, since we can’t safely bet on connection speed, the ‘responsible web designer’ should build for the lowest point of entry: a mobile-first approach that assumes the slowest connection speed and builds up from there to larger breakpoints for faster connections. One day we may be able to rely on solid bandwidth detection, but for now it is wise to treat bandwidth as a constraint and avoid steps in the wrong direction.
To Sum Up:
Code the site for the lowest resolution and capabilities. Make true use of Progressive Enhancement from the start, adding extra functionality, richer visuals and interaction only where they can actually be used.
RESS: REsponsive Web design + Server Side components
To many people, responsive design has one major shortcoming: it relies mainly on screen-width detection.
As more and more types of devices emerge, hybrid devices like touch-screen laptops and so on, feature detection has become essential for responsive design. Libraries that provide it, chiefly Modernizr, have bloomed and are now used on most projects. They help devs evaluate whether the client’s browser supports certain functionality and provide it accordingly. But it is often tricky to rely on browsers, because they will claim to support features and then do whatever they want; support for new features is usually partial.
RESS was born to provide a solution. Like mobile first, the term was coined by Luke Wroblewski, in 2011. It relies on detecting the user’s device type, evaluating it and providing an experience tailored to it. To do this, there are heavy tools out there, like WURFL and DeviceAtlas, and lighter ones like Browser Gem, that read the user-agent string and work from there.
This gives responsive design an edge over Mdot sites. Until RESS came along, an Mdot site’s only advantage was providing an experience tailored specifically to mobile devices.
The BBC (very smart people, with millions of readers across the globe and a big responsibility toward their users) talk about how RESS and Progressive Enhancement can work as one. They call their approach ‘Cut the Mustard’. It consists of creating a core experience that works on every device you can imagine. After that, the device is evaluated against a simple capability test to decide whether it ‘cuts the mustard’. If it does, a progressively enhanced experience is served. If it doesn’t, the user can still access the core content.
~ Mat ‘Wilto’ Marquis
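As a sketch, a ‘cut the mustard’ capability test in the spirit of the BBC’s approach looks like this. The three checks shown are the commonly cited ones; the browser stand-ins are purely illustrative:

```javascript
// Every browser gets the core experience; only browsers that pass
// the capability check get the enhanced one.
function cutsTheMustard(win, doc) {
  return 'querySelector' in doc &&
         'localStorage' in win &&
         'addEventListener' in win;
}

// Illustrative stand-ins for a modern and a legacy browser:
var modernWin = { localStorage: {}, addEventListener: function () {} };
var modernDoc = { querySelector: function () {} };
var legacyWin = {};
var legacyDoc = {};

console.log(cutsTheMustard(modernWin, modernDoc)); // true
console.log(cutsTheMustard(legacyWin, legacyDoc)); // false
```

In a real page you would call `cutsTheMustard(window, document)` and only then load the enhanced scripts and styles.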
Let’s take a couple of points of view into account:
- Mobile users want THE content, just as much as desktop users do.
~ Brad Frost
- Mobile forces you to focus. There are constraints designers have to embrace to serve the same content, like bandwidth and smaller screens.
Also referred to as ‘Aggressive Enhancement’, this development technique lets designers focus on the core content and progressively enhance it for bigger screens. It provides basic access to certain content, which can later be injected into the page as space becomes available.
~ Jeremy Keith
Image- and user-heavy sites that need to be optimized for mobile, like Facebook, Twitter or Pinterest, use lazy loading to provide a better experience. When you first load the page, only a handful of posts are loaded. When you scroll down, the site assumes you want to browse more content, so it is injected into the page via Ajax. This makes the initial page load much faster by avoiding DOM excess.
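The core of the pattern can be sketched in a few lines. All names and numbers here are illustrative, not any particular site’s implementation:

```javascript
// Decide whether the user has scrolled close enough to the bottom
// to justify fetching the next batch of posts.
function shouldLoadMore(scrollTop, viewportHeight, contentHeight, threshold) {
  return scrollTop + viewportHeight >= contentHeight - threshold;
}

// Pick the next slice of posts to inject into the page.
function nextBatch(allPosts, renderedCount, batchSize) {
  return allPosts.slice(renderedCount, renderedCount + batchSize);
}

var posts = ['post-1', 'post-2', 'post-3', 'post-4', 'post-5'];

// 600px viewport, scrolled to 1300px of a 2000px page, 200px threshold:
console.log(shouldLoadMore(1300, 600, 2000, 200)); // true
console.log(nextBatch(posts, 2, 2));               // ['post-3', 'post-4']
```

In a real page, `shouldLoadMore` would run on the scroll event and the batch would be fetched via Ajax and appended to the DOM.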
Setting a Performance Budget
Tim Kadlec argues that setting a maximum page weight, and staying aware of it at all times, is the best way to keep page load down. ‘Set your goals and stick to them.’ Steve Souders describes three options to choose from if you go over budget:
- Optimize an existing feature or asset
- Remove an existing feature or asset
- Don’t add a new feature or asset
To me this sounds a bit radical, but it makes the point of closely following a site’s overall performance over time and with each new feature.
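Tracking a budget can be as simple as summing asset weights and flagging the overrun. This is a rough sketch; the asset list and the 400 KB budget are made-up numbers:

```javascript
// Sum the weight, in kilobytes, of every asset on the page.
function pageWeightKb(assets) {
  return assets.reduce(function (total, asset) {
    return total + asset.kb;
  }, 0);
}

// True when the page has blown its performance budget.
function overBudget(assets, budgetKb) {
  return pageWeightKb(assets) > budgetKb;
}

var assets = [
  { name: 'app.css',  kb: 45 },
  { name: 'app.js',   kb: 120 },
  { name: 'hero.jpg', kb: 260 }
];

console.log(pageWeightKb(assets));    // 425
console.log(overBudget(assets, 400)); // true: optimize, remove, or skip a feature
```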
Let’s Get Technical!
There are certain speed techniques that operate on a more technical and less conceptual level.
Image Techniques
Images account for around 60% of the average page’s weight. If you are serving desktop-sized images to mobile users on unknown connections, you are basically dooming your site to poor performance.
Responsive Images
The trick to overcoming this is to serve different versions of each image depending on screen size or type: a small image to a mobile phone, a high-res one to a desktop, and a double-density image to a HiDPI device.
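In today’s standard markup this is handled by the `picture` element with `srcset`, where browsers without support simply fall back to the plain `img`. The breakpoints and file names here are illustrative:

```html
<picture>
  <source media="(min-width: 60em)" srcset="photo-large.jpg">
  <source media="(min-width: 37.5em)" srcset="photo-medium.jpg">
  <img src="photo-small.jpg" alt="A responsive photo">
</picture>
```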
Compressive Images
Daan Jobsis, a Dutch designer, discovered a very curious phenomenon while compressing images in Photoshop. He showed the following: take an image, export it at double its display size (200%), compress it to 25% quality or less, then scale it back down in the browser (100%). The image will not only be lighter in file size but also already optimized for HiDPI screens, since its pixel density is doubled.
The only observed problem is that the browser may have a hard time painting double-sized images back down to their display size (especially if it has to do it a hundred times, as on image-heavy sites). A little testing is required to see whether this is the optimal solution for your case.
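A minimal sketch of the recipe, with illustrative file names and sizes:

```html
<!-- photo.jpg was exported at 1600px wide but at very low JPEG quality
     (25% or less). Displayed at 800px, the compression artifacts become
     invisible, and HiDPI screens get the extra pixel density for free. -->
<img src="photo.jpg" alt="A compressive JPEG" width="800">
```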
Vectors vs. Bitmaps
SVG images are the way to go at this point. They are completely scalable, so they render crisply on any screen, and providing a fallback is very easy with Modernizr.
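A common fallback pattern, sketched here: when Modernizr reports no SVG support, swap each `.svg` source for a pre-made `.png` version (file names are illustrative):

```javascript
// Map an SVG source to its assumed PNG fallback.
function svgToPngSrc(src) {
  return src.replace(/\.svg$/, '.png');
}

// In a browser, rewrite every SVG image when support is missing.
if (typeof Modernizr !== 'undefined' && !Modernizr.svg) {
  var images = document.querySelectorAll('img[src$=".svg"]');
  for (var i = 0; i < images.length; i++) {
    images[i].src = svgToPngSrc(images[i].src);
  }
}

console.log(svgToPngSrc('logo.svg')); // 'logo.png'
```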
Icon Fonts
Technically, icon fonts are vector-based images served as a font. As Chris Coyier puts it, ‘Icon Fonts are Awesome’ because:
- You can easily resize
- You can easily change the color
- You can easily shadow their shape
- They will work in IE6, unlike transparent PNGs
- You can do everything you can do with images
- You can do anything you would do with typography
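A minimal icon-font setup looks something like this. The font name, file path and code point are all illustrative:

```css
@font-face {
  font-family: "MyIcons";
  src: url("fonts/myicons.woff") format("woff");
}

.icon-search::before {
  font-family: "MyIcons";
  content: "\e600"; /* the glyph mapped to the search icon */
  color: #333;      /* restyle freely, like any text */
  font-size: 2em;   /* resizes crisply, like any text */
}
```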
HiDPI Images
Dave Bushell recently wrote a very interesting article with some thoughts on HiDPI images. He argues that, even though we can now serve iPhones, iPads and other modern devices with images that match their screen capabilities, it is still too easy to cripple a site by doing so.
~ Dave Bushell
The point is to do it, but to do it sensibly, considering each case before jumping into 4x images.
What’s Next
Google recently developed a new image format, WebP. It provides both lossless and lossy compression for web images; Google cites lossless WebP files around 26% smaller than equivalent PNGs.
Asset Loading
Load assets carefully and in order. Controlling this aspect provides a big advantage, by allowing the page to render the basic content first and enhance it afterwards.
CSS, Images
Control loading through media queries, conditional or lazy loading, and responsive or compressive image techniques.
Make use of HTML5 functionality like async or defer. There are also loading helpers, like RequireJS, that can handle loading order and dependencies.
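The two attributes differ in when the script runs; the file names below are illustrative:

```html
<!-- async: fetch without blocking parsing, execute as soon as ready -->
<script src="analytics.js" async></script>

<!-- defer: fetch without blocking, execute in order after parsing finishes -->
<script src="app.js" defer></script>
```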
Advertising, Social Widgets or any third party assets
Just inject them after the page has loaded.
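A sketch of the pattern: append the third-party script only after the load event, so it can never block the page’s own rendering. The widget URL is illustrative:

```javascript
// Create a script element and append it to the document body.
function injectScript(doc, src) {
  var script = doc.createElement('script');
  script.src = src;
  script.async = true;
  doc.body.appendChild(script);
  return script;
}

// In a browser, defer the injection until the page has fully loaded.
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    injectScript(document, 'https://widgets.example.com/share.js');
  });
}
```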
Old-school Performance Techniques
They have been around for a while, but are still just as relevant today.
Reduce the number of HTTP Requests
To achieve this, devs have to think resource by resource, but here are a number of guidelines:
- Concatenate all CSS files or make use of CSS Preprocessors to compile them into one file.
- Unify all JS plugins in a single file and always load them in the footer, unless they really need to block the rendering of the page (if you load Typekit fonts in the footer, you will get the famous FOUT, or ‘Flash of Unstyled Text’).
- If you must use PNG images, use sprites: unify all images in one file and use CSS to display the right piece.
- Make use of the data URI scheme where possible; it allows you to embed images as inline data, shaving off a few more HTTP requests.
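Both techniques look like this in CSS. The sprite file and icon positions are illustrative; the data URI is the well-known 1x1 transparent GIF:

```css
/* A CSS sprite: one HTTP request serves many icons */
.icon      { background-image: url("sprite.png"); }
.icon-mail { background-position: 0 0;     width: 16px; height: 16px; }
.icon-user { background-position: -16px 0; width: 16px; height: 16px; }

/* A tiny image inlined as a data URI: no extra request at all */
.divider {
  background-image: url("data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7");
}
```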
Reduce the number of Bytes
Uglify and minify every script and CSS file the page calls. Set your server up for GZIP compression, and GZIP every text asset.
In Summary
The importance of web performance has been somewhat overlooked since the birth of responsive design.
Designers and developers have been focusing on how to solve the responsive puzzle and, along their way, a new multi-bandwidth, multi-device, multi-location web is starting to come into focus.
To be prepared for tomorrow’s problems, we have to treat performance as an essential design consideration, because the desktop-centered web is disappearing before our eyes. Mobile users are in more of a hurry and won’t jump through hoops to get at your content. And as more sites spring up every day, being fast will mean being ahead.