Thursday 13 December 2012

Website fixing

Today I was doing more website error fixing. I found in my error logs another 499 error for wp-cron, even after making those changes about a week or two ago. But it was only a single error, and other POSTs to wp-cron succeeded with a 200 response code. So I will leave it as it is for the moment, and not switch to a real cron.
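
If I do end up switching later, the usual approach seems to be to stop WordPress firing wp-cron.php on page loads and trigger it from the system crontab instead. Something like this in wp-config.php (the schedule and site URL are just placeholders):

// In wp-config.php: stop WordPress running wp-cron.php on every page load.
define('DISABLE_WP_CRON', true);

// Then in the crontab, request wp-cron.php directly, e.g. every 15 minutes:
// */15 * * * * wget -q -O /dev/null http://example.com/wp-cron.php?doing_wp_cron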

I also noticed lots of 302 temporary redirects. These came from requests for wp-comments-post.php. Looking into this, I found that this is standard practice to reduce the likelihood of a form accidentally being re-submitted: Wikipedia - Post/Redirect/Get.
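
The basic pattern is simple enough to sketch in PHP (save_comment() and the redirect target here are made up, this is not WordPress's actual handler):

<?php
// Handle the POSTed comment, then redirect, so that refreshing the page
// issues a harmless GET instead of re-submitting the form.
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    save_comment($_POST); // hypothetical function that stores the comment
    header('Location: http://example.com/some-post/#comments', true, 302);
    exit;
}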

I also had 499 responses to some Google Earth requests for KML. I don't know why they generated a 499 code; when I tested one of the requested URLs, it worked okay. I do need to check whether my KML is working properly, though, as that particular request returned a KML file with no points. (Of course that is possible, but it would be strange for a user to load dynamic KML and then zoom into an area with no markers.)
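
A quick way to sanity check the KML would be to load it and count the placemarks, something like this (the URL is just an example, and I'm assuming the KML 2.2 namespace):

<?php
// Load the dynamically generated KML and count its Placemark elements.
$kml = simplexml_load_file('http://example.com/photo-map.kml?bbox=-1.6,53.7,-1.4,53.9');
if ($kml === false) {
    die("Could not load or parse the KML\n");
}
$kml->registerXPathNamespace('k', 'http://www.opengis.net/kml/2.2');
$placemarks = $kml->xpath('//k:Placemark');
printf("KML contains %d placemark(s)\n", count($placemarks));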

I also posted on the Google Webmaster forums about some issues I was seeing related to bad / incorrect behaviour by Google. While doing this I read a few of the other threads on the forums there. The replies seemed pretty unhelpful, mostly accusing posters of having spammy websites or engaging in webspam tactics rather than offering any practical advice. Unfortunately, most of the issues would really have needed someone from Google to explain why people's sites had dropped down in, or out of, Google's results.

I was seeing some 206 Partial Content response codes. When I looked into this, it seems that a client can request just part of a file. Most of the requests were coming from facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php). This is a Facebook bot, but I don't know why it is making partial (Range) requests for some files from my website.
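
Out of interest, it is easy enough to make a partial (Range) request yourself and see the 206 come back, e.g. in PHP (the URL is a placeholder, and if the server doesn't support ranges you just get a 200 and the whole file):

<?php
// Ask for only the first 1 KB of the file by sending a Range header.
$context = stream_context_create(array(
    'http' => array('header' => "Range: bytes=0-1023\r\n"),
));
$body = file_get_contents('http://example.com/photos/large.jpg', false, $context);

// file_get_contents() fills in $http_response_header with the response headers.
echo $http_response_header[0], "\n"; // e.g. HTTP/1.1 206 Partial Content
echo strlen($body), " bytes received\n";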

Some other requests that generated 206 responses seemed to be from normal browsers, and some were referred from Google image search. So long as the browser / Facebook bot really did request partial content, I can't see any problem with this. I don't think I can check all the headers of the actual requests, though.

Another error I had was that some of my sites were configured to pass all .php requests to PHP without checking whether the file existed first. The solution is to add try_files $uri =404; to the PHP location block in the nginx config.

I had a blog post where I had forgotten to (or didn't think I needed to) make a larger version of an image open in a lightbox when its link was clicked. (Instead it was just a plain link to the large image.) When I checked that the page worked, though, I got a strange error: "Refused to execute a JavaScript script. Source code of script found within request". I had no idea what this meant, but apparently it is just a browser security feature to prevent XSS attacks: Refused to execute a JavaScript script. Source code of script found within request.

Just refreshing the page got rid of the error.