User-Agent Sniffing Only Way to Deal With Upcoming SameSite Cookie Changes

On February 4th 2020, Chrome will introduce changes to the handling of SameSite cookies that affect anyone reading cookies from cross-origin requests. The cookie format required by the new Chrome cannot be read by a significant share of older browsers.

One proposed way to deal with this is to set two versions of each cookie, but as explained below this approach is infeasible for many apps. That leaves user-agent sniffing, the most error-prone practice in web development, as the only compatible approach to cross-origin cookies. For this reason, we'll conclude this post with a comprehensive guide to getting this user-agent sniffing right, synthesizing the recommendations from the major players in a way that tackles the full list of incompatible browsers, and validating it on a dataset of more than 40,000 unique real-world user-agent strings in use today.

The web developer community has done its best to avoid user-agent sniffing in JavaScript for over a decade, but server-side user-agent sniffing has hardly been done since the 90s, and even then it felt bad. Now it's back, and for certain use cases it's mandatory.

First let's look into how this came about. If you just want to see how to reliably fix this, scroll on down.

How did we get here?

Why did this happen?

The point of the changes introduced by Chrome is to get stronger Cross-Site Request Forgery protections by default. These are breaking changes on their own - no matter what older browsers were doing, the changes pushed by Chrome mean that SameSite=None; Secure must be set for intentional cross-site cookie passing to work with POST requests. These changes drastically reduce the chance of accidentally introducing CSRF vulnerabilities, so they clearly have some value. Had this been introduced in Netscape 1.0, advocating for it would have been a no-brainer. When breaking changes like these arrive a quarter of a century after the introduction of cookies, a lot of things will inevitably break as a result.

The bigger problem is that older versions of the spec turned out to be incompatible with the new one. A number of older Chrome versions, and all browsers on iOS 12, follow versions of the 2016 standard where SameSite=None is either ignored or treated as SameSite=Strict. The author of the 2019 spec was also an author of the 2016 spec, but it's not clear whether the mutual exclusivity of these specifications was understood.

Browser usage shares

According to w3counter, 4.75% of web traffic comes from an iOS 12 device. Caniuse.com provides breakdowns of usage share per Chrome version, and says Chrome 51-66 makes up 0.75% of web traffic in total. UC Browser is listed as having 0.3 - 2.9% global market share depending on the source, although this varies greatly from country to country. For example, it is reported to have a 22% share in India. For UC Browser, the sources do not differentiate by version number. In summary, using statistics from December 2019, about 6% of web traffic will be unable to handle SameSite=None; Secure. Come February 4th, auto-updating Chrome installations, making up about 55% of web traffic, will refuse to pass cookies along across origins if they are not set with SameSite=None; Secure.

Who needs to care about this?

First, think about whether or not you ever need to read cookies from third-party POSTs. If you don't, you can set cookies to SameSite=Lax (or perhaps leave it unset, which will default to SameSite=Lax behavior going forward). If you need to protect against cross-site GET requests, go with SameSite=Strict.
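For example, if myCookie (a placeholder name) never needs to be readable in cross-site POSTs, either of these response headers will keep working across all the browsers discussed here:

Set-Cookie: myCookie=value; SameSite=Lax
Set-Cookie: myCookie=value; SameSite=Strict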

If you actually need cookies in POSTs originating from other sites, you need to take special steps. Two ways to handle this are explained below.

A fix with caveats: Double cookies

One way to fix this is to read and write two different cookies, one with SameSite=None; Secure, and one without the SameSite attribute, and check for both in your cookie handling. Your HTTP response would look something like this:

HTTP/1.1 200 OK
Date: Fri, 17 Jan 2020 10:10:01 GMT
Content-Type: text/html; charset=utf-8
Set-Cookie: myCookie1=value; SameSite=None; Secure
Set-Cookie: myCookie2=value; Secure

This may well be a viable solution for you, but there are a couple of reasons why it might not be a good idea, and you should think twice about whether or not it is.

  1. One of the browsers you're targeting, Safari on iOS 12, has hard limits on cookie size per domain. Safari allows about 4KB in total per domain. If you're setting two cookies, you're halving the available space, and chances are good you'll exceed this limit. It all depends on what you're storing in the cookie.
  2. Changing how a cookie is set has some complexity to it, but is likely to be centralized in one location. Changing where a cookie is read, however, might be spread out over a larger portion of the code base. You also need to change every place where cookies are deleted, and in each of these places you must handle the cases where only one cookie is present, only the other, both, or neither. In all, there is a lot of complexity here, and it's complexity that's likely related to authentication. That is not where you want to hack around and take chances.
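To make the second point concrete, here is a minimal sketch of what the read side might look like server-side, using the cookie names from the response above. The function name and the raw Cookie header argument are illustrative, not part of any framework:

```javascript
// Sketch: parse a raw Cookie request header and prefer the
// SameSite=None variant, falling back to the legacy cookie.
function readEitherCookie(cookieHeader) {
    var cookies = {};
    (cookieHeader || "").split(";").forEach(function (part) {
        var idx = part.indexOf("=");
        if (idx === -1) return;
        cookies[part.slice(0, idx).trim()] = part.slice(idx + 1).trim();
    });
    if ("myCookie1" in cookies) return cookies["myCookie1"]; // SameSite=None variant
    if ("myCookie2" in cookies) return cookies["myCookie2"]; // legacy variant
    return null;
}
```

Every read site, delete site, and refresh site needs this kind of either-or handling, which is where the complexity accumulates.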

While the double cookie approach is what Google recommends, the ASP.NET team concluded otherwise - they found user-agent sniffing to be the safer approach. User-agent sniffing is hard to get right. It's hard to cover all the relevant browsers, and it's hard to do so in a way where you can be reasonably sure it won't break in user-agents of the future. But when someone gets it right, that becomes a readily copiable solution that anyone can use.

The other fix: User-agent sniffing

Parsing user-agent strings is hard, because they don't actually follow any particular format, and because every browser lies about who it is, claiming to be a lot of other browsers as well as itself. The reason browsers lie is to get around bad code that does the wrong thing when parsing user-agent strings. So because user-agent string parsing was hard originally, the universe conspired to make it even harder now. There are really no rules for how to parse them, and you must rely on statistical evidence to see if you got it right. You need to check your code against known user-agent variants, and be specific enough about checking version numbers that you can be confident it won't break in future releases.
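A sketch of what checking against known variants might look like in practice. The sniffing function here is a stand-in for whatever check you are validating, and the user-agent strings are abbreviated examples, not a real dataset:

```javascript
// Stand-in sniffing function to validate: detects iOS 12 by substring
function detectsIOS12(ua) {
    return ua.includes("iPhone OS 12_") || ua.includes("iPad; CPU OS 12_");
}

// Abbreviated example user-agents, paired with the expected result
var samples = [
    ["Mozilla/5.0 (iPhone; CPU iPhone OS 12_4 like Mac OS X) ...", true],
    ["Mozilla/5.0 (iPhone; CPU iPhone OS 13_3 like Mac OS X) ...", false],
    ["Mozilla/5.0 (iPad; CPU OS 12_2 like Mac OS X) ...", true],
];

// Fail loudly on any mismatch between detection and expectation
samples.forEach(function (pair) {
    if (detectsIOS12(pair[0]) !== pair[1]) {
        throw new Error("Sniffing broke for: " + pair[0]);
    }
});
```

The real work is in making the sample list large and representative; the loop itself is trivial.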

There are good reasons why we generally avoid user-agent sniffing at all costs. However, these SameSite issues are an extreme case where it may be the solution causing the least breakage.

Being a JavaScript error logging service, CatchJS has a lot of data on user agents, and some expertise in parsing them. What follows are some recommendations based on this. First we'll outline how to reliably detect any browser on iOS 12, Safari on MacOS 10.14, and Chrome versions 51-66. Then we'll expand on this to include the full list of known incompatible browsers.

//lightweight version, covers most incompatible browsers
function shouldntGetSameSiteNone(ua) {
    return ua.includes("iPhone OS 12_") || ua.includes("iPad; CPU OS 12_") ||  //iOS 12
        ua.includes("Chrome/5") || ua.includes("Chrome/6") ||                  //Chrome
        (ua.includes(" OS X 10_14_") 
            && ua.includes("Version/") && ua.includes("Safari"));              //Safari on MacOS 10.14
}

The above code detects almost all SameSite=None incompatible browsers, and is fine if you want a low-complexity solution that covers 99.X% of browsers in use. If you want to cover the long tail of rare browsers, the following paragraphs show how.

One browser group ignored above is older installs of UC Browser on Android. In most markets, this may be fine. Detecting them requires more expensive parsing to pull out the version numbers. The browser generally updates automatically, and has relatively low usage share in most markets, so for many sites the share of visits coming from not-yet-updated UC Browsers will be microscopic. However, in some Asian countries UC Browser is huge, and you might not be able to ignore older installs. In that case, you can use the code below to reliably detect the installs that shouldn't get SameSite=None cookies.

function isOlderUcBrowser(ua) {
    //true for UC Browser versions older than 12.13.2
    var match = ua.match(/UCBrowser\/(\d+)\.(\d+)\.(\d+)\./);
    if (!match) return false;
    var major = parseInt(match[1], 10);
    var minor = parseInt(match[2], 10);
    var build = parseInt(match[3], 10);
    if (major != 12) return major < 12;
    if (minor != 13) return minor < 13;
    return build < 2;
}

After including the check for UC Browser, we're still leaving out older installs of Chromium, and embedded web views on MacOS 10.14. These are fringe, but they can be detected. Chromium can be detected similarly to how we detected Chrome, and the Mac embedded browser can be detected by checking that the user-agent string includes " OS X 10_14_" and ends with "(KHTML, like Gecko)". In the end that gives the somewhat long, but complete, check below.

//more expensive version, covers all known incompatible browsers
function shouldntGetSameSiteNoneFull(ua) {
    return ua.includes("iPhone OS 12_") || ua.includes("iPad; CPU OS 12_") ||  //iOS 12
        (ua.includes("UCBrowser/")
            ? isOlderUcBrowser(ua)                                             //UC Browser < 12.13.2
            : (ua.includes("Chrome/5") || ua.includes("Chrome/6"))) ||         //Chrome
        ua.includes("Chromium/5") || ua.includes("Chromium/6") ||              //Chromium
        (ua.includes(" OS X 10_14_") && 
            ((ua.includes("Version/") && ua.includes("Safari")) ||             //Safari on MacOS 10.14
            ua.endsWith("(KHTML, like Gecko)")));                              //Embedded browser on MacOS 10.14
}

We can compare with user-agent sniffing recommendations given by other sources.

The ASP.NET blog provides sample code for doing this, which is used by the Azure Active Directory team. It is much in line with our lightweight version above, in that it uses substring checks and covers only the older Chrome, iOS 12, and MacOS 10.14 cases.

The Chrome team provides a pseudo code implementation of user-agent sniffing that detects the full list of known incompatible browsers. It relies on running multiple regular expressions to extract and compare version numbers. A quick informal benchmark showed it to be about 2.5 times slower than our version. This may not matter at all, depending on your load volume.
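For comparison, a version-number approach along those lines might look like the sketch below. This is our reading of the general technique, not Chrome's exact regular expressions, and it only covers the Chrome/Chromium case:

```javascript
// Extract the Chrome/Chromium major version and compare it numerically
// against the 51-66 range that mishandles SameSite=None.
function isBuggyChromeVersion(ua) {
    var match = ua.match(/Chrom(?:e|ium)\/(\d+)/);
    if (!match) return false;
    var major = parseInt(match[1], 10);
    return major >= 51 && major <= 66;
}
```

Unlike the substring checks above, this flags exactly the 51-66 range rather than everything matching Chrome/5 or Chrome/6, at the cost of running a regex per request.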

We compared results from Google's version and our version on a dataset of 40,000 unique real-world user-agents. They differ only in the fringe of the fringe: For example, Google's regex requires a space after Chrome/[versionNumber], and therefore won't recognize the (hardly used) Chromium derivative Vivo Browser 6.0, which puts Chrome/[versionNumber] at the end of the user-agent string. Our version requires an underscore in "iPhone OS 12_" to future-proof against iOS 120, but therefore misses the Google Maps app's embedded web view on iOS 12, which reports itself as "iPhone OS 12.n" (with a dot). The usage share of either of these is essentially zilch. Figuring out the trade-off between supporting a future iOS 120 vs supporting Google Maps web view on iOS 12 vs full version number parsing is left as an exercise for the masochistic reader. For most, either solution is good enough.