
UA Detection and code libs legacy

A Web site is a mix of technologies with different lifetimes. The HTML markup is mostly rock-solid for years. CSS is not bad either, apart from the vendor prefixes. JavaScript, PHP, name-your-language are also not that bad. And then there's the full social and business infrastructure holding these pieces together. How do we create Web sites which are resilient and robust over time?

Cyclic Release of Web sites

The business infrastructure of Web agencies is set up for the temporary. They receive a request from a client. They create the Web site with the mood of the moment. After 2 or 3 years, the client thinks the Web site no longer matches the new fashion and trends. The new Web agency (because it's often a new one) promises to do a better job. They throw away the old Web site, breaking the old URIs at the same time. They create a new Web site with the technologies of the day. In the best case scenario, they understand that keeping URIs is good for branding and karma. They release the site for the next 2-3 years.

Basically, the full Web site has changed in terms of content and technologies, and it works fine in the current browsers. It's a 2-3 year release cycle of maintenance.

Understanding Legacy and Maintenance

Web browsers are updated every 6 weeks or so. This is a very fast cycle. They are released with a lot of unstable technologies. Sometimes, vendors release entirely new browsers with new version numbers and new technologies. Device makers also release new devices very often. This feeds both a habit of consumerism and a difficulty for all these browsers and devices to coexist.

Web developers design and focus their code on what is most popular at the moment. Most of the time it's not their fault. These are the requirements from the business team in the Web agency or from the clients. A lot of small Web agencies do not have the resources to invest in automated testing of their Web sites. So they focus on two browsers: the one they develop with (the most popular of the moment) and the one the client defined as the absolute minimum bar (the most popular of the past).

Code libraries rely on User Agent detection to cope with bugs or unstable features in each browser. These libraries know only the past, never the future, not even the present. Code libraries are often piles of legacy by design. Some are open source, some have license fees attached to them. In both cases, they require a lot of maintenance and testing, which is not planned into the budget of a Web site (a budget already blown by the development of the shiny new site).
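As a rough illustration (this is not code from any particular library; the workaround, the names and the version cut-off are made up), this kind of detection often boils down to keying behaviour off the User Agent string:

    // Hypothetical sketch of a library keying a bug workaround off the UA string.
    // The version cut-off reflects the "known bad" browsers at the time of writing.
    function needsOldFlexboxSyntax(userAgent) {
      var match = userAgent.match(/Chrome\/(\d+)/);
      return match !== null && parseInt(match[1], 10) < 21;
    }

    function layoutMode(userAgent) {
      if (needsOldFlexboxSyntax(userAgent)) {
        return "legacy flexbox";   // path for the browsers the library knew about
      }
      return "standard flexbox";   // every browser it has never heard of lands here
    }

The list of browsers and versions is frozen the moment the library ships; anything released afterwards is guessed at, correctly or not.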

UA detection and Legacy

The Web site will break for some users at some point in time. They chose to use a Web browser which didn't fit in the boxes the current Web site was designed for. Last week, I went through the Web Compatibility bugs for the WPTouch library. Basically, Firefox OS was not recognized by the User Agent detection code, and as a result WordPress Web sites didn't send the mobile version to these mobile devices. We opened that bug in August 2013. We contacted BraveNewCode, the company behind the library, which fixed the bug in March 2014. As of today, December 2014, 7 of the 12 sites on our list have still not switched to the new version of the library.
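To make the failure mode concrete: WPTouch itself is a PHP plugin, so the JavaScript below is only an illustrative sketch, and the regular expressions are mine, not the actual WPTouch code.

    // A Firefox OS phone sends a User Agent string of this form:
    var firefoxOSUA = "Mozilla/5.0 (Mobile; rv:26.0) Gecko/26.0 Firefox/26.0";

    // A typical mobile-detection pattern written before Firefox OS existed:
    var oldPattern = /Android|iPhone|iPod|BlackBerry|IEMobile|Opera Mini/i;
    console.log(oldPattern.test(firefoxOSUA)); // false, so the desktop site is served

    // The fix teaches the pattern about the new browser...
    var fixedPattern = /Android|iPhone|iPod|BlackBerry|IEMobile|Opera Mini|Mobile.*Firefox/i;
    console.log(fixedPattern.test(firefoxOSUA)); // true, the mobile site is served

    // ...but every site still has to update its copy of the library for the fix
    // to reach its visitors, and the pattern is blind again to the next new browser.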

These were bugs reported by users of these sites. It means people who can't use their favorite browser to access a particular Web site. I'm pretty sure that there are more sites still running the old version of WPTouch. Users either think their browser is broken or just don't understand what is happening.

Eventually these bugs will go away. It's one of my axioms in Web Compatibility: wait long enough and the bug goes away. Usually the Web site doesn't exist anymore, or has been redesigned from the ground up. In the meantime, some users had a very bad experience.

We need a better story for code legacy: one with fallbacks, one which doesn't rely only on the past to make things work.
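A minimal sketch of what that could look like is feature detection instead of User Agent sniffing (the function name and the 600px breakpoint below are hypothetical, not taken from any specific library):

    // Ask the browser what it can do instead of guessing who it is.
    function prefersSmallScreenLayout() {
      // matchMedia is widely available; if it's missing, fall back to the viewport width.
      if (typeof window.matchMedia === "function") {
        return window.matchMedia("(max-width: 600px)").matches;
      }
      return window.innerWidth <= 600;
    }

    // Any browser, past or future, reporting a narrow viewport gets the mobile
    // layout; everything else falls back to the desktop layout.
    var layout = prefersSmallScreenLayout() ? "mobile" : "desktop";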

Otsukare.