When testing for Web compatibility issues, we encounter many strange cases; some of them are related to how servers implement HTTP verbs and understand the request made by the client.
These are examples. They are not meant to point fingers at companies, but to show the variety of issues you may encounter when testing Web sites.
HEAD and GET
HEAD is a useful HTTP verb that lets a client obtain information about a resource (URI). The server sends the same headers as it would for a
GET, but without the payload (body) for this specific resource.
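This contract can be reproduced with a tiny local server; here is a minimal sketch using only Python's standard library (the handler class, addresses, and the reuse of the "Yeah!" body are invented for the demo):

```python
import http.client
import http.server
import threading

# Hypothetical resource whose body is "Yeah!", as in the example below.
# GET and HEAD go through the same header-writing path, so the headers
# (including Content-Length) are identical; only GET writes the payload.
class Resource(http.server.BaseHTTPRequestHandler):
    BODY = b"Yeah!"

    def _send_headers(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(self.BODY)))
        self.end_headers()

    def do_GET(self):
        self._send_headers()
        self.wfile.write(self.BODY)  # headers plus the payload

    def do_HEAD(self):
        self._send_headers()         # exact same headers, no payload

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = http.server.HTTPServer(("127.0.0.1", 0), Resource)
threading.Thread(target=server.serve_forever, daemon=True).start()

results = {}
for verb in ("GET", "HEAD"):
    conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
    conn.request(verb, "/")
    resp = conn.getresponse()
    body = resp.read()  # http.client knows a HEAD response carries no body
    results[verb] = (resp.getheader("Content-Length"), len(body))
    conn.close()
server.shutdown()
print(results)  # same Content-Length for both verbs, bytes only for GET
```

Both responses advertise the same Content-Length; only the GET actually delivers the five bytes.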
So if you request:
HEAD / HTTP/1.1
Accept: text/plain
Host: somewhere.example
User-Agent: sharethelove
the server can answer
HTTP/1.1 200 OK
Accept-Ranges: bytes
Content-Length: 5
Content-Type: text/plain; charset=utf-8
Date: Thu, 20 Feb 2014 06:02:53 GMT
Last-Modified: Fri, 03 Jan 2014 00:01:41 GMT
without the body, which in this case is "Yeah!". Let's see with a
GET request on ask.com:
http -v GET http://ask.com/
GET / HTTP/1.1
Accept: */*
Accept-Encoding: gzip, deflate, compress
Host: ask.com
User-Agent: HTTPie/0.8.0
The response is simple.
HTTP/1.1 301 Moved Permanently
Content-Length: 226
Content-Type: text/html; charset=iso-8859-1
Date: Thu, 20 Feb 2014 05:56:42 GMT
Location: http://www.ask.com
Server: Apache
from-tr: trafrt015iad.io.askjeeves.info
tr-request-id: UwWZGgpcqP4AAHZCH8oAAAD@

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved <a href="http://www.ask.com">here</a>.</p>
</body></html>
So if we make the same HTTP request with
HEAD this time, we should receive the same headers.
http -v HEAD http://ask.com/
HTTP/1.1 301 Moved Permanently
Content-Type: text/html; charset=iso-8859-1
Date: Thu, 20 Feb 2014 06:13:13 GMT
Location: http://www.ask.com
Server: Apache
from-tr: trafrt018iad.io.askjeeves.info
tr-request-id: UwWc@QpcqQEAAHL83AoAAACD
It worked! The issue: it took a very long time, around 30 seconds. In some cases the answer didn't come back at all and the request timed out. Something is not right.
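When a server may sit on a request like this, it pays to test with an explicit timeout so the probe fails fast instead of hanging. A sketch with Python's standard library, using an invented local server that stalls on HEAD to stand in for the behavior above:

```python
import http.client
import http.server
import socket
import threading
import time

# Hypothetical origin that stalls on HEAD requests (as ask.com seemed to).
class StallsOnHead(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        time.sleep(5)  # simulate a server sitting on the request
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = http.server.HTTPServer(("127.0.0.1", 0), StallsOnHead)
# Daemon thread: the stalled handler won't keep the process alive.
threading.Thread(target=server.serve_forever, daemon=True).start()

# A 1-second timeout turns a hung HEAD into a quick, visible failure.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port, timeout=1)
conn.request("HEAD", "/")
try:
    conn.getresponse()
    outcome = "answered"
except socket.timeout:
    outcome = "timed out"
conn.close()
print(outcome)
```

With the stall in place, the client gives up after one second rather than waiting 30 seconds or forever.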
In the past, the Yahoo! home page sent an error 500 when doing a
HEAD while giving a response when doing a
GET. They have since switched to HTTPS and the error disappeared.
ROT will stand for "Rule Of Testing" in the rest of this article.
ROT 1: So most of the time, it's unfortunately better to test with HTTP
GET than trying to use HTTP HEAD.
With HTTPie on the command line, to keep only the headers and drop the body, you may use:
http --print hH GET http://example.org/
It will print the request and the response headers and will not print the payload.
Good Headers For Top Testing
Since we were talking about the Yahoo! home page: there are still issues. This is a HEAD request with the Firefox OS user agent.
http -v HEAD https://www.yahoo.com/ User-Agent:"Mozilla/5.0 (Mobile; rv:26.0) Gecko/26.0 Firefox/26.0"
The request from the client is:
HEAD / HTTP/1.1
Accept: */*
Accept-Encoding: gzip, deflate, compress
Host: www.yahoo.com
User-Agent: Mozilla/5.0 (Mobile; rv:26.0) Gecko/26.0 Firefox/26.0
And the HTTP response code is…
HTTP/1.1 200 OK
[…]
(I have cut the rest of the response to focus on the issue.) Let's restart with a different Accept header:
http -v HEAD https://www.yahoo.com/ User-Agent:"Mozilla/5.0 (Mobile; rv:26.0) Gecko/26.0 Firefox/26.0" Accept:"text/html"
The request from the client becomes:
HEAD / HTTP/1.1
Accept: text/html
Accept-Encoding: gzip, deflate, compress
Host: www.yahoo.com
User-Agent: Mozilla/5.0 (Mobile; rv:26.0) Gecko/26.0 Firefox/26.0
The response from the server is:
HTTP/1.1 302 Found
Location: http://m.yahoo.com/?.tsrc=yahoo&mobile_view_default=true
[…]
It shows another pitfall of Web compatibility testing: in this case, Yahoo!'s servers are very sensitive to the Accept header.
ROT 2: When testing HTTP requests and responses on the command line, stay as close as possible to the environment you are testing.
It means sending the same HTTP headers in the request as your device/browser would send to the server. That's not always possible, depending on the tool you are using.
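ROT 2 can be sketched with an invented local origin that, like the Yahoo! servers above, branches on the Accept header (the handler class, redirect target, and status choices are assumptions for the demo):

```python
import http.client
import http.server
import threading

# Hypothetical origin: an explicit text/html Accept gets redirected to a
# mobile site, anything else gets a plain 200 — mimicking the behavior
# observed above.
class AcceptSensitive(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        if self.headers.get("Accept") == "text/html":
            self.send_response(302)
            self.send_header("Location", "http://m.example.test/")
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = http.server.HTTPServer(("127.0.0.1", 0), AcceptSensitive)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Same User-Agent as the device; only the Accept header changes.
ua = "Mozilla/5.0 (Mobile; rv:26.0) Gecko/26.0 Firefox/26.0"
statuses = {}
for accept in ("*/*", "text/html"):
    conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
    conn.request("HEAD", "/", headers={"User-Agent": ua, "Accept": accept})
    statuses[accept] = conn.getresponse().status
    conn.close()
server.shutdown()
print(statuses)
```

Two requests that differ by a single header get two different responses, which is exactly why your test tool must reproduce the device's headers.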
HTTP Verbs are overrated
While recently testing a French bank's Web site, I noticed that they were using SPIP, a publishing platform popular in France. During the test I happened to make a typo when writing the verb. Yeah! Even humans make mistakes.
Let's use the verb
FUNKY, which doesn't exist.
http -v FUNKY http://www.credit-agricole.fr/
The server response was:
HTTP/1.1 200 Condition Intercepted
Cache-Control: no-cache
Content-Type: text/html
Date: Thu, 20 Feb 2014 06:40:27 GMT
ETag: an-err-no2
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Last-Modified: Thu, 01 Jan 1970 00:00:00 GMT
Pragma: no-cache
Server: Apache
Transfer-Encoding: chunked

<html>
<head><title>Security alert!</title></head>
<body bgcolor=#FFFFFF>
<h1>Security alert!<hr></h1>
<p>
Alert #<b>2</b>!
</p>
<hr>
</body>
</html>
Hmmm, interesting. So I wondered about other SPIP sites.
http --print hH FUNKY http://www.openweb.eu.org/
The request was:
FUNKY / HTTP/1.1
Accept: */*
Accept-Encoding: gzip, deflate, compress
Content-Length: 0
Host: openweb.eu.org
User-Agent: HTTPie/0.8.0
(By the way, it seems that Pygments, which gives color to this code, has a better knowledge of HTTP verbs than SPIP does. ;) Who knew?!
HTTP/1.1 200 OK
Cache-Control: max-age=0
Composed-By: SPIP @ www.spip.net
Content-Encoding: gzip
Content-Length: 4459
Content-Type: text/html; charset=utf-8
Date: Thu, 20 Feb 2014 06:45:48 GMT
Expires: Thu, 20 Feb 2014 06:45:48 GMT
Last-Modified: Thu, 20 Feb 2014 06:45:48 GMT
Server: Apache/2.2.22 (Debian) DAV/2 PHP/5.4.4-14+deb7u4
Vary: Cookie,Accept-Encoding
X-Powered-By: PHP/5.4.4-14+deb7u4
X-Spip-Cache: 86400
And this was consistent across SPIP Web sites: SPIP accepts any verb which is not a
GET and treats it like a…
GET. If I had more time on my hands, I would test all barebones installations of HTTP servers for this kind of thing.
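For contrast, here is a sketch of how a stricter stack handles an unknown verb: Python's standard-library server (an invented minimal handler, not SPIP's behavior) answers 501 for any method it has no handler for, instead of treating it like a GET:

```python
import http.client
import http.server
import threading

# Minimal server that only implements GET; everything else falls through
# to BaseHTTPRequestHandler's built-in 501 "Unsupported method" error.
class Minimal(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

server = http.server.HTTPServer(("127.0.0.1", 0), Minimal)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("FUNKY", "/")  # an arbitrary verb goes out on the wire just fine
status = conn.getresponse().status
conn.close()
server.shutdown()
print(status)  # 501, not a cheerful 200
```

Note that the client happily sends the made-up verb; whether it is rejected, ignored, or quietly handled like a GET is entirely the server's call.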
Anyway, as usual, I wish you a good evening: