Introducing a progressive webapp for Pokémon fans

The mobile web has a bad reputation these days. Everyone agrees it's slow, but there's no shortage of differing opinions on how to fix it.

Recently, Jeff Atwood argued convincingly that the state of single-threaded JavaScript on Android is poor. Then Henrik Joreteg questioned the viability of JavaScript frameworks on mobile altogether, saying that tools like Ember and Angular are just too bloated for mobile networks to bear. (See also these posts for a good follow-up discussion.)

So to recap: Atwood says the problem is single-threadedness; Joreteg says it's mobile networks. And in my opinion, they're both right. As someone who does nearly as much Android development as web development, I can tell you first-hand that the network and concurrency are two of my primary concerns when writing a performant native app.

Ask any iOS or Android developer how we make our apps so fast, and most likely you'll hear about two major strategies:

  1. Eliminate network calls. Chatty network activity can kill the performance of a mobile app, even on a good 3G or 4G connection. Staring at a loading spinner is not a good user experience.
  2. Use background threads. To hit a silky-smooth 60 FPS, your operations on the main thread must take less than 16ms. Anything unrelated to the UI should be offloaded to a background thread.

I believe the web is as capable of solving these problems as native apps, but most web developers just aren't aware that the tools are out there. For the network and concurrency problems, the web has two very good answers:

  1. Offline-first (e.g. IndexedDB and ServiceWorkers)
  2. Web workers

I decided to put these ideas together and build a webapp with a rich, interactive experience that's every bit as compelling as a native app, but is also "just" a web site. Following guidelines from the Chrome team, I built a progressive webapp that works offline, can be launched from the home screen, and runs at 60 FPS even on mediocre Android phones. This blog post explains how I did it.

Pokémon – an ambitious target

For those uninitiated to the world of Pokémon, a Pokédex is an encyclopedia of the hundreds of species of cutesy critters, as well as their stats, types, evolutions, and moves. The data is surprisingly vast for what is supposedly a children's game (read up on Effort Values if you want your brain to hurt over how deep this can get). So it's the perfect target for an ambitious web application.

The first issue is getting the data, which is easy thanks to the wonderful Pokéapi. The second issue is that, if we want the app to work offline, the database is far too large to keep in memory, so we'll need some clever use of IndexedDB and/or ServiceWorker.

For this app, I decided to use PouchDB for the Pokémon data (because it's good at sync), as well as LocalForage for app state data (because it has a nice key-value API). Both PouchDB and LocalForage are using IndexedDB inside a web worker, which means any database operations are fully non-blocking.

However, the Pokémon data isn't immediately available when the site is first loaded, because it takes a while to sync from the server. So I'm also using a fallback strategy of "local first, then remote":

When the site first loads, PouchDB starts syncing with the remote database, which in my case is Cloudant (a CouchDB-as-a-service provider). Since PouchDB has a dual API for both local and remote, it's easy to query the local database and then fall back to the remote database if that fails:

async function getById(id) {
  try {
    return await localDB.get(id);
  } catch (err) {
    return await remoteDB.get(id);
  }
}
(Yes, I also decided to use ES7 async/await for this app, using Regenerator and Babel. It adds < 4KB minified/gzipped to the build size, so it's well worth the developer convenience.)

So when the site first loads, it's a pretty standard AJAX app, using Cloudant to fetch and display data. Then once the sync has completed (which only takes a few seconds on a good connection), all interactions become purely local, meaning they are much faster and work offline. This is one of the ways that the app is a "progressive" experience.
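To make the fallback concrete, here's a minimal sketch that exercises the same "local first, then remote" logic against in-memory stand-ins for the two databases. The `makeDB` stub is hypothetical; the real app uses PouchDB's local and remote APIs.

```javascript
// In-memory stand-in for a PouchDB-style database (hypothetical stub).
function makeDB(docs) {
  return {
    async get(id) {
      if (!(id in docs)) throw new Error('not found: ' + id);
      return docs[id];
    }
  };
}

// Simulate the moment just after first load: the local database hasn't
// finished syncing yet (so it's empty), but the remote one has the data.
const localDB = makeDB({});
const remoteDB = makeDB({ bulbasaur: { name: 'Bulbasaur', number: 1 } });

async function getById(id) {
  try {
    return await localDB.get(id);   // fast path once sync has completed
  } catch (err) {
    return await remoteDB.get(id);  // fallback while sync is in progress
  }
}
```

Once the sync finishes, the same call resolves from the local database and never touches the network at all.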

I like the way you work it

I also heavily incorporated web workers into this app. A web worker is essentially a background thread where you can access nearly all browser APIs except the DOM, with the benefit being that anything you do inside the worker cannot possibly block the UI.

From reading the literature on web workers, you might be forgiven for thinking their usefulness is limited to checksumming, parsing, and other computationally expensive tasks. However, it turns out that Angular 2 is planning on an architecture where nearly the entire application lives inside of the web worker, which in theory should increase parallelism and reduce jank, especially on mobile. Similar techniques have also been explored for Flux and Ember, although nothing solid has been shipped yet.

The idea is to run basically the whole app in [a web worker], and send rendering instructions to the UI side.
— Brian Ford, Angular core developer


Since I like to live on the bleeding edge, I decided to put this Angular 2 notion to the test, and run nearly the entire app inside of the web worker, limiting the UI thread's responsibilities to rendering and animation. In theory, this should maximize parallelism and milk that multi-core smartphone for all it's worth, addressing Atwood's concerns about single-threaded JavaScript performance.

I modeled the app architecture after React/Flux, but in this case I'm using the lower-level virtual-dom, as well as some helper libraries I wrote, vdom-as-json and vdom-serialized-patch, which can serialize DOM patches as JSON, allowing them to be sent from the web worker to the main thread. Based on advice from IndexedDB spec author Joshua Bell, I'm also stringifying the JSON during communication with the worker.
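The hand-off between threads boils down to a JSON round-trip. Here's a sketch of that protocol; the toy patch object below is a simplified stand-in for what vdom-as-json and vdom-serialized-patch actually produce.

```javascript
// Worker side: the computed DOM patch is plain data, so it can be
// stringified and posted across the thread boundary as a single string.
function serializePatch(patch) {
  return JSON.stringify(patch);
}

// UI side: parse the string back into a patch object. (In the real app,
// vdom's patch() function would then apply it to the DOM.)
function deserializePatch(message) {
  return JSON.parse(message);
}

// A toy patch: "replace the text of virtual node #3".
const patch = { 3: { type: 'VTEXT', text: 'Bulbasaur' } };
const wire = serializePatch(patch);
const received = deserializePatch(wire);
```

Sending one string rather than a structured object is the point of the stringify advice above: a single string crosses the worker boundary more cheaply than a large object graph.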

The app structure looks like this:

Note that the entire "Flux" application can live inside the web worker, as well as the "render" and "diff" parts of the "render/diff/patch" pipeline, since none of those operations rely on the DOM. The only thing that needs to be done on the UI thread is the patch, i.e. the minimal set of DOM instructions to be applied. And since this patch operation is (usually) small, the serialization costs should be negligible.

To illustrate, here's a timeline recording from the Chrome profiler, using a Nexus 5 running Chrome 47 on Android 5.1.1. The timeline starts from the moment the user clicks on a Pokémon in the list, which is when the "detail" panel is patched and then slides up into view:

(The delay between applying the patch and calculating the FLIP animation is intentional; it's to allow the "ripple" animation to play.)

The important thing to notice is that the UI thread is totally unblocked between the user tapping and the patch being applied. Also, the deserialization of the patch (JSON.parse()) is inconsequential; it doesn't even register on the timeline. I also measured the overhead of the worker itself for a single request, and it's typically in the 5-15ms range (although it does occasionally spike to as much as 200ms).

Now let's see what it looks like if I move those operations back to the UI thread, by removing the worker:

Whoa nelly, there's a lot of action happening on the UI thread! Besides IndexedDB, which introduces some slight DOM-blocking, there's also the rendering/diffing operation, which is considerably more expensive than the patching.

You'll also notice that both versions take roughly the same amount of time (300-400ms), but the former blocks the UI thread way less than the latter. In my case, I'm using GPU-accelerated CSS animations, so you won't notice much of a difference either way. But you can imagine that, in a more complex app, where there might be many simultaneous bits of JavaScript fighting for the UI thread (third-party ads, scroll effects, etc.), this trick could mean the difference between a janky UI and a smooth UI.

Progressive rendering

Another benefit of Virtual DOM is that we can pre-render the initial state of the app on the server side. I used vdom-to-html to render the first 30 Pokémon and inline that HTML directly in the page. (HTML in our HTML! What a concept.) To re-hydrate that Virtual DOM on the client side, it's as simple as using vdom-as-json to build up the initial Virtual DOM state.
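The rehydration hand-off can be sketched like this: the server inlines the initial virtual-DOM tree as JSON alongside the pre-rendered HTML, and the client parses that JSON instead of re-rendering from scratch. The tree shape here is a simplified stand-in for what vdom-as-json actually produces.

```javascript
// Server side: render the first screen and embed the initial state.
const initialTree = {
  tagName: 'UL',
  children: [{ tagName: 'LI', text: 'Bulbasaur' }]
};
const inlinedState = JSON.stringify(initialTree);
// ...this string would be embedded in the page, e.g. in a
// <script type="application/json"> tag next to the pre-rendered HTML.

// Client side: recover the virtual DOM without re-rendering, so the
// first diff is computed against exactly what the server already drew.
const rehydrated = JSON.parse(inlinedState);
```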

I'm also inlining most of the critical CSS and JavaScript, with non-critical CSS loaded asynchronously thanks to a pretty nifty hack. The pouchdb-load plugin is also being leveraged for faster initial replication.

For hosting, I'm simply putting up static files on Amazon S3, with SSL provided by Cloudflare. (SSL is required for ServiceWorkers.) Gzip, cache headers, and SPDY are all handled automatically by Cloudflare.

Testing in the Chrome Dev Tools with the network throttled to 2G, the site manages to get to DOMContentLoaded in 5 seconds, with the first paint at around 2 seconds. This means that the user at least has something to look at while the JavaScript is loading, vastly improving the site's perceived performance.

The "do everything in a web worker" approach also helps out with progressive rendering, because most of the JavaScript related to UI (click animations, side menu behavior, etc.) can be loaded in a small initial JavaScript bundle, whereas the larger "framework" bundle is only loaded when the web worker starts up. In my case, the UI bundle weighs in at 24KB minified/gzipped, whereas the worker bundle is 90KB. This means that the page at least has some minor UI flourishes while the full "framework" is downloading.

Of course, ServiceWorker is also storing all of the static "app shell" – HTML, CSS, JavaScript, and images. I'm using a local-then-remote strategy to ensure the best possible offline experience, with code largely borrowed (well, stolen really) from Jake Archibald's lovely SVGOMG. Like SVGOMG, the app also displays a little toast message to reassure the user that yes, the app works offline. (This is new tech; users need to be educated about it!)
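The ServiceWorker's local-then-remote logic reduces to "check the cache, fall back to the network." Here's the shape of it, written against stand-ins for the Cache and fetch APIs so the logic is self-contained; the real handler lives inside a `fetch` event listener in the ServiceWorker.

```javascript
// respond() mirrors a cache-first fetch handler: serve from the cache
// when we have the resource, otherwise go to the network.
async function respond(request, cache, fetchFn) {
  const cached = await cache.match(request);
  return cached !== undefined ? cached : fetchFn(request);
}

// Stand-ins for the Cache API and fetch (hypothetical stubs).
const cache = {
  entries: { '/index.html': '<html>app shell</html>' },
  async match(request) { return this.entries[request]; }
};
const fetchFn = async (request) => 'network response for ' + request;
```

The app shell is served instantly from the cache on every visit after the first; only uncached resources ever touch the network.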

Thanks to ServiceWorker, subsequent loads of the page aren't constrained by the network at all. So after the first visit, the entire site is available locally, meaning it renders in about a second or less, depending on the speed of the device.


Since my goal was to make this app perform at 60 FPS even on substandard mobile devices, I chose Paul Lewis' famous FLIP technique for dynamic animations, using only hardware-accelerated CSS properties (i.e. transform and opacity). The result is this beautiful Material Design-style animation, which runs great even on my ancient Galaxy Nexus phone:

The best part about FLIP animations is that they combine the flexibility of JavaScript with the performance of CSS animations. So although the Pokémon's initial state isn't pre-determined, we can still animate from anywhere in the list to a fixed position in the detail view, without sacrificing any frames. We can also run quite a few animations in parallel – notice that the background fill, the sprite movement, and the panel slide are three separate animations.
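The heart of FLIP is the "invert" step: measure where the element starts (First) and where it ends (Last), then compute the transform that visually puts it back at the start, so the browser can animate that transform away on the GPU. A sketch of the math, with hard-coded boxes standing in for what `getBoundingClientRect()` would measure:

```javascript
// Given the First and Last bounding boxes, compute the inverse transform.
function computeInvert(first, last) {
  return {
    translateX: first.left - last.left,
    translateY: first.top - last.top,
    scaleX: first.width / last.width,
    scaleY: first.height / last.height
  };
}

// A list item at (0, 400) animating into a detail header at (0, 0),
// doubling in size along the way.
const invert = computeInvert(
  { left: 0, top: 400, width: 100, height: 50 },
  { left: 0, top: 0, width: 200, height: 100 }
);
// Applying translate(0, 400px) scale(0.5, 0.5) makes the moved element
// appear unmoved; transitioning that transform to "none" then plays the
// whole animation on the compositor.
```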

The only place where I deviated slightly from Lewis' FLIP algorithm was the animation of the Pokémon sprite. Because neither the source nor the target were positioned in a way that was conducive to animations, I had to create a third sprite, absolutely positioned within the body, as a façade to transition between the two.


Of course, any webapp can suffer from slowdowns if you're not careful to keep an eye on the Chrome profiler and constantly check your assumptions on a real device. Some of the issues I ran into:

  1. CSS sprites are great for reducing the payload size, but they slowed the app down to a crawl due to excessive memory usage. I ultimately went with base64 inlining.
  2. I needed a performant scrolling list, so I took some inspiration from Ionic collection-repeat, Ember list-view, and Android ListView, to build a simple <ul> that just renders the <li>s that are in the visible viewport, with stubs for everything else. This cuts down on memory usage, making the animations and touch interactions much snappier. And once again, all of the list computation and diffing is done inside of the web worker, so scrolling is kept buttery-smooth. This works with as many as 649 Pokémon being shown at once.
  3. Be careful what libraries you choose! I'm using MUI as my "Material" CSS library, which is great for bootstrapping, but sadly I discovered that it often wasn't doing the optimal thing for performance. So I ended up having to re-implement parts of it myself. For instance, the side menu was originally being animated using margin-left instead of transform, which leads to janky animations on mobile.
  4. Event listeners are a menace. At one point MUI was adding an event listener on every <li> (for the "ripple" effect), which slowed down even the hardware-accelerated CSS animations due to memory usage. Luckily the Chrome Dev Tools has a "Show scrolling perf issues" checkbox, which immediately revealed the problem:

To work around this, I attached a single event listener to the entire <ul>, which is responsible for animating the ripple effect on individual <li>s.
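That delegated listener can be sketched as a walk up the tree from the event target to the nearest `<li>`. Stub DOM nodes are used here so the logic is self-contained; in the browser you'd attach this to the real `<ul>`.

```javascript
// One listener on the <ul>; on each click, walk up from the event
// target until we find the <li> that should receive the ripple.
function delegate(listElement, onItemTap) {
  listElement.addEventListener('click', (event) => {
    let node = event.target;
    while (node && node !== listElement) {
      if (node.tagName === 'LI') return onItemTap(node);
      node = node.parentNode;
    }
  });
}

// Stub DOM nodes, just enough to exercise the walk (hypothetical).
const handlers = {};
const ul = { tagName: 'UL', addEventListener(type, fn) { handlers[type] = fn; } };
const li = { tagName: 'LI', parentNode: ul };
const sprite = { tagName: 'IMG', parentNode: li };

let tapped = null;
delegate(ul, (node) => { tapped = node; });
handlers.click({ target: sprite }); // a tap on the sprite bubbles to its <li>
```

One listener instead of 649 means far less memory pressure, which is what was hurting the CSS animations in the first place.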
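The windowing math behind the scrolling list (item 2 above) is simple when rows have a fixed height: given the scroll offset and viewport, only a handful of the 649 rows ever need real content. A sketch of that calculation; the actual list also renders cheap stubs for everything off-screen.

```javascript
// Which <li> indices are visible for a given scroll position?
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight));
  const last = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) - 1
  );
  return { first, last };
}

// A 600px viewport over 649 rows of 60px each: only 10 rows are "real".
const range = visibleRange(0, 600, 60, 649);
// range.first === 0, range.last === 9; rows 10..648 stay as stubs
```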

Browser support

As it turns out, a lot of the APIs I mention above aren't perfectly supported in all browsers. Most notably, ServiceWorker is not available in Safari, iOS, IE, or Edge. (Firefox has it in nightly and will ship very soon.) This means that the offline functionality won't work in those browsers – if you refresh the page without a connection, the content won't be there anymore.

Another hurdle I ran into is that Safari does not support IndexedDB in a web worker, meaning I had to write a workaround to avoid the web worker in Safari and just use PouchDB/LocalForage over WebSQL. Safari also still has the 350ms tap delay, which I chose not to fix with the FastClick hack because I expect Safari to fix it themselves in an upcoming release. Momentum scrolling is also broken in iOS, for reasons I don't yet understand. (Update: looks like it needs -webkit-overflow-scrolling: touch.)

Surprisingly, Edge and FirefoxOS both worked without a hitch (except for ServiceWorker). FirefoxOS even has the status bar theme colors, which is neat. I haven't tested on Windows Phone yet.

Of course, I could have fixed all these compatibility issues with a million polyfills – Apple touch icons instead of Web Manifests, AppCache instead of ServiceWorker, FastClick, etc. However, one of my goals for this app was to make a high-quality experience with progressive degradation for the less standards-compliant browsers. On browsers with ServiceWorker, the app is a rich, high-quality offline app. On other browsers, it's just a web site.
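In code, that "progressive degradation" policy is just feature detection: register the ServiceWorker where it exists, and silently do nothing elsewhere. Here's a sketch, written to take the navigator object as a parameter so the branch is easy to see; `/sw.js` is an assumed script path.

```javascript
// Register the ServiceWorker only where it's supported; elsewhere the
// app is still a perfectly functional (if online-only) web site.
function registerIfSupported(nav) {
  if (nav && 'serviceWorker' in nav) {
    return nav.serviceWorker.register('/sw.js');
  }
  return null; // no polyfill, no AppCache: graceful degradation
}

// Stub navigators (hypothetical) for the two cases.
const modern = { serviceWorker: { register(url) { return 'registered ' + url; } } };
const legacy = {};

const a = registerIfSupported(modern);
const b = registerIfSupported(legacy);
```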

And I'm okay with that. I strongly believe that web developers need to push the envelope on this stuff, if we expect browser vendors to have any motivation to improve their implementations. To quote WebKit developer Dean Jackson, one of the reasons they didn't prioritize IndexedDB was because they "don't see much use." In other words, if there were a lot of high-quality sites that depended on IndexedDB, then WebKit would have been pushed to implement it. But developers didn't step up their game, so browser vendors just shrugged it off.

If we only use features that work in IE8, then we're condemning ourselves to live in an IE8 world. This app is a protest against that mindset.


There are still more improvements to make to this app. Some unanswered questions for me, especially involving ServiceWorker:

  1. How to handle routing? If I do it the "right" way with the HTML5 History API (as opposed to hash URLs), does that mean I need to duplicate my routing logic on the server side, the client side, and in the ServiceWorker? Sure seems that way.
  2. How to update the ServiceWorker? I'm versioning all the data I store in the ServiceWorker Cache, but I'm not sure how to evict stale data for existing users. Currently they need to refresh the page or restart their browser so that the ServiceWorker can update, but I'd like to do it live somehow.
  3. How to control the app banner? Chrome will show an "install to home screen" banner if you visit the site twice in the same week (with some heuristics), but I really like the way Flipkart Lite captures the banner event so that they can launch it themselves. It feels like a more streamlined experience.


The web is quickly catching up on mobile, but of course there are always improvements to be made. And like any good Pokémaniac, I want to be the very best, like no app ever was.

So I encourage anyone to take a look at the source on GitHub and tell me where it can improve. As it stands, though, I think it's a gorgeous, immersive mobile app, and one that's tailor-made for the web. My hope is that it can demonstrate some of the great features that the web of 2015 can offer, while also serving as a valuable resource for the Pokémon fan who's gotta catch 'em all.

Thanks to Jacob Angel for providing feedback on a draft of this blog post.

For more on the tech behind the app, check out my "progressive webapp" reading list.

The quest for smooth scrolling

Back in 2012, when Facebook famously ditched HTML5 for fully native apps, the main problem they cited was a lack of smooth scrolling:

This is one of our most important issues. It’s typically a problem on the newsfeed and on Timeline, which use infinite scrolling. [...] Currently, we do all of the scrolling using JS, as other options were not fast enough (because of implementation issues).
— Tobie Langel

It should be clear why this was a dealbreaker for Facebook. If you want people to stay glued to their timelines, then you need a scrolling list that doesn't break a sweat. Otherwise the user's attention starts to drift pretty fast.

This affects more than just Facebook, though. As any mobile developer knows, the "ginormous scrolling list" is a common UI pattern across a wide variety of apps. Even if you're not Facebook, you probably have some kind of list – of notes, tasks, beers, Pokémon, whatever – that figures heavily into your app's presentation.

Recently Flipboard caused quite a stir by unveiling their scrolling list implementation, which renders to canvas in order to achieve 60 FPS. My reaction was: "neat!" The reaction of the web intelligentsia was a bit more mixed.

As an Android developer who only recently learned the web stack, I can see where Flipboard is coming from. I've heard for years that HTML5 was the equal of native platforms, and yet every time I've tried to write a PhoneGap or AppCache-powered web app, I've always had fun, but ended up feeling a little disappointed.

For instance, I recently built a prototype app using Ionic. Overall I found it to be a great developer experience, but getting a scrolling list to perform at 60 FPS turned out to be impossible. And this was despite the admirable efforts of the Ionic team, who developed some very clever tricks to achieve even 45-50 FPS.

So the Flipboard approach starts to look pretty attractive by comparison. The cry of the web advocates, however, is that we shouldn't sacrifice accessibility, copy-paste, "find in page," "open in new tab," or other classic browser features at the altar of 60 FPS.

My point of comparison, however, is with native apps. And when, exactly, was the last time you used a native app that let you "find in page"? Or even copy-paste? Sometimes you can "share," but such functionality is usually hobbled in comparison to the web experience.

Native apps have long abandoned the kind of built-in DOM features that web advocates swoon over (with the exception of accessibility, which works out-of-the-box with TextViews). And yet, native apps are the reigning champions of user attention, especially when it comes to "ginormous scrolling list"-style apps. That speaks volumes about what people really value when they're flicking through tweets at the bus stop.

The truth is that for a largely consumptive experience – which is what infinite scrolling embodies – 60 FPS is much more valuable than the interactive features like "find in page" or copy-paste. Admittedly, accessibility is harder to justify throwing under the bus, which is why even Flipboard acknowledged that they need to fix it.

But amidst all the hand-wringing from web apologists, as well as some gleeful jabs from the web's detractors, I found the most insightful response to the Flipboard fiasco to be Jacob Rossi's:

In other words, the web can solve the problem of slow scrolling in the same way that it solved the problem of slow animations. Just as CSS animations allowed us to achieve native-level performance by moving rendering off the main thread (where it blocks the DOM) and onto a background thread or the GPU (where it hums along nicely), the animation triggers proposal would move the responsibility for scrolling squarely to the browser.

Trying to re-implement the browser's scrolling in JavaScript, as Ionic and other frameworks do, is simply a non-starter if we want to achieve 60 FPS. JavaScript blocks the DOM, and as Flipboard says, "if you touch the DOM in any way during [JavaScript] animation, you’ve already blown through your 16ms frame budget."

Flipboard's solution was to replace the DOM with GPU-powered canvas. But the real solution is to replace JavaScript scrolling with GPU-powered scrolling.

For the time being, though, JavaScript scrolling is the best we've got, unless we want to dip into canvas or WebGL experiments like Flipboard's. Personally I think their approach is pretty exciting, especially if it spurs browser vendors to finally solve the sorts of scrolling problems that Facebook identified way back in 2012.

Of course, now Facebook has an even bolder experiment in React Native, based on the presumption that, even at 60 FPS, WebViews just "feel" wrong. But that will have to wait for a future blog post.

Closing the UX gap

For JavaScript to be an attractive tool for mobile developers, it's not enough for it to offer a good developer experience. It also has to offer a good user experience.

The UI should feel spry and peppy. Animations should zip. Screens should respond immediately to the user's touch. If there's a reason people love their phones so much, it's because mobile apps are very good at delivering on these promises.

Historically, though, hybrid and web apps have lagged behind their native counterparts in the "peppy" department. Mark Zuckerberg famously griped that HTML5 "just wasn't ready" – a phrase that still echoes in the minds of many mobile developers today.

For the mobile web to be taken seriously, it needs to provide the same smooth performance that we've come to expect from native apps. In short, it needs to close the UX gap.


I recently gave a talk at Brooklyn JS where I shared some tricks for building web and hybrid apps with near-native performance. It wasn't recorded, but you can read the full slides and speaker notes online.

The slides themselves are kinda neat, because they contain the very CSS animations that I talk about in the presentation. So you can open it up on a mobile browser and see the difference between, say, hardware-accelerated and non-accelerated animations. The FPS demo is also fun to play with.

Just don't judge me too harshly if you open the slides in IE or Safari – I didn't have enough patience to get WebM video or flexbox working properly. The CSS animations should look lovely, though.

Smooth animations in CSS

Working on the Squarespace Blog app for Android, one of the biggest challenges is that the app is half-native, half-web.

My job is to make sure you can't tell which is which.

Spoiler alert.

While trying to achieve this goal, one of the first things I learned is that animations with native-level performance – i.e. animations that don't stutter, jerk, or lag – can only be achieved with hardware-accelerated CSS.

Trust me, I've tried. The other methods don't even come close.

Here's a video showing a JavaScript animation, which I rewrote using hardware-accelerated CSS transitions:

And here's a regular, non-accelerated CSS transition, which I rewrote in the same way:

Behold the power of 3D transforms! And no, I'm not actually using the z-axis, but it doesn't matter. Just consider "3D" to be the "open sesame" that coaxes the GPU into revealing its treasures.

If you'd like to get started with hardware-accelerated CSS, then you can take a look at this sample widget, which is the same one from the first video. It works on all modern browsers, both desktop and mobile. (Note the hand-crafted flexbox shim and -webkit prefixes.)

These videos were taken on a rather dated Galaxy Nexus running Android 4.2, so the performance improvement is crystal-clear. But in fact, you can also see the difference on more powerful devices, and even on desktop browsers. I actually first noticed the slow trashcan animation while playing with it on desktop Chrome.

Believe me: if it's anything less than 60 frames per second, your users will notice.