Friday, 31 December 2010
Not a lot
I signed up to AboutUs.org and Alexa.com as well, and updated some of my site info on them.
In the evening I watched the standard Star Trek TOS episode with Mauser and Bo. Then I did a Japanese lesson with Mauser and watched Magi Rangers with Biddleberry.
Thursday, 30 December 2010
Pano processing
So I thought I might as well fix my XMP panel. What should have been a quick fix actually took me quite a long time. The problem was that in my function I had a variable like
var t:Array;
Then further down in my code I was populating this array like
t = taxon.split("(");
I needed to change this so that it would only split the string at the last bracket, so instead of using split, I changed my code to:
pos = taxon.lastIndexOf("(");
t[0] = taxon.substr(0, pos);
t[1] = taxon.substr(pos+1);
But this didn't work, and I got an error message instead. I spent ages trying to find out what the problem was, and eventually found that just because I had declared the variable t to be of type Array, this doesn't actually initialise an array. So I had to initialise the variable like:
var t:Array = [];
And now my code worked.
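Putting it all together, the working version ends up looking something like this (a quick sketch using the same variable names as above, with an example value for taxon added just for illustration):
var taxon:String = "Small tortoiseshell (Aglais urticae)"; //made-up example value
var t:Array = []; //declaring "var t:Array;" on its own doesn't create an array
var pos:int = taxon.lastIndexOf("("); //only split at the last opening bracket
t[0] = taxon.substr(0, pos); //text before the last bracket
t[1] = taxon.substr(pos + 1); //text after it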
Maybe obvious if you normally work with strictly typed languages, but I am used to the loose typing of javascript and php.
Before uploading one of the panos, I found a problem with it. Then I found a few other problems, so I spent quite a while fixing it. I was also unsure as to what the white balance should be like. After checking the internet, I found that most people seem to prefer using Daylight white balance for night images, thus giving them a warm (and in my opinion, natural) feel.
My photo had been taken using cloudy white balance, so I cooled the colour temperature down slightly.
After getting the panos uploaded I started processing one of the other Korea panos.
After lunch I updated my wordpress installations as the latest version fixes a serious vulnerability. (They even sent out an email urging users to update!) My multiuser blog and another standard installation upgraded automatically okay. But my photosite one took quite a while to update as I have to hack the core code to get it to work.
I processed another pano, then it was dinner time. I think with these panos I must spend about as much time trying to figure out exactly where the pano was taken as I do with the actual processing!
Goes to show how important it is to sync your GPS and camera clock and make sure you have your GPS with you and switched on. Would have saved me a lot of time!
After dinner I watched an episode of Star Trek TOS with Mauser and Bo. It was quite unusual because it didn't feature any scenes of Shatner with his top off. Also, Spock had a good chance of throwing Uhura into her chair, but he pushed her back into her chair very gently. We thought it must be because he hasn't had much practice at throwing Uhura into her chair in recent episodes, and so has forgotten how to do it properly.
The rest of the evening I processed another pano and also played on Mario Kart Wii with Sarah, Mark, and Bo for a bit.
Wednesday, 29 December 2010
Websiting
In the afternoon I worked on an ebay plugin for wordpress and also spent ages trying to find out why the Amazon ads weren't appearing on my website. The answer was that I have the adblock plugin enabled in Chrome. Doh!
I did a Japanese lesson with Mauser as well.
In the evening I watched an episode of Star Trek TOS with Mauser and Bo, then did more work on my ebay wp plugin.
Tuesday, 28 December 2010
Poggin'
*Most of them I have had for probably at least a year
After that I sorted out some cardboard boxes and other rubbish. I updated my faces made from food blog, then it was lunch time.
After lunch I did some more website work, still working on the comment / email post updates subscription thing.
In the evening I scanned and sorted more pogs, and did more website work. I also watched an episode of Star Trek TOS with Mauser, Bo, and Clare.
Monday, 27 December 2010
Pano processing
In the evening I also watched the DS9 tribbles episode in HD with Mauser and Bo. It is included on the TOS Season 2 Blu-ray. I also watched Magi Rangers with Belly, which was quite funny.
Lad said that there is a Power Rangers (American) version of Magi Rangers called Power Rangers Magical Force. It would be quite interesting to watch both series at the same time and see the differences and similarities between the American and Japanese versions of the same show.
Sunday, 26 December 2010
Boxing Day
I also finished watching 'The Roaring Twenties' with Mauser.
Saturday, 25 December 2010
Christmas Day
Friday, 24 December 2010
Making a YTP video
But today I finally finished it and uploaded it to youtube, just in time for Christmas.
It's amazing how a video that's purposefully rubbish can take so long to make. When watching videos by people like MadAnonymous or some of the better Ran Ran Ruu videos, you don't really appreciate how much work must have gone into putting the video together.
About the only other thing I did today was to wrap up my christmas presents for everyone.
Monday, 20 December 2010
Various stuff
I also decided that it would be a good idea to not have a shower as soon as I get up (like I normally do), but rather wait for the temperature to warm up a bit.
I switched my PC on, but it wouldn't come on. I looked at the UPS and the battery indicator wasn't showing. The UPS had been showing the battery as charging continuously for the last couple of weeks, so I wasn't that surprised that the battery might be gone.
I had previously looked into the problem, but the UPS manual just says that if something goes wrong with it, then you need to get an authorised technician (or similar) to look at it.
I pressed the self test button on the UPS to see if that would do anything, and it did. Now instead of showing the voltage as 230V and the status as 'normal' it showed the voltage as 000V and the status as 'fault'.
So I got some normal plugs from the garage and plugged my PC and monitor into a normal power strip so I could use the PC. I checked ebuyer, where I bought the UPS from, and found I bought it in July 2009, so it wouldn't be in warranty now. I looked at other UPS models, which seem to be quite expensive, especially given that my broken one was rated for 1000VA and only cost about £35.
Doing some research I found that thankfully you don't need to use a UPS that is rated the same or better than your PC's PSU. The rating given to a PSU like 500W or 600W is the maximum power it can supply; under normal usage the PC will actually draw much less than that.
For the moment I think I'll go without a UPS, but in the future I'll probably buy a more expensive APC model like the Back-UPS CS 650. It's £80, but has replacement batteries available for about £25, and I expect the batteries last longer than the 1½ years my cheap UPS lasted.
Next I did some more website work.
An error quite often occurring in the website error logs was something like:
2010/12/12 00:56:01 [error] 2297#0: *1135474 FastCGI sent in stderr: "PHP Notice: ob_end_flush() [<a href='ref.outcontrol'>ref.outcontrol</a>]: failed to delete buffer zlib output compression. in /home/djeyewater/webapps/htdocs/photosite/blog/wp-includes/functions.php on line 3071" while reading response header from upstream
I had looked into this briefly before, but couldn't find a fix. I looked into it more deeply today and found a fix: Re: Error message after a jpegPhoto - msg#00001
The problem is caused by wordpress, in the includes/functions.php file. I edited the file and changed the wp_ob_end_flush_all function to the following:
/**
 * Flush all output buffers for PHP 5.2.
 *
 * Make sure all output buffers are flushed before our singletons are destroyed.
 *
 * @since 2.2.0
 */
function wp_ob_end_flush_all() {
    $levels = ob_get_level();
    //Edit by DK
    for ($i = 0; $i < $levels; $i++) {
        $obStatus = ob_get_status();
        if (!empty($obStatus['type']) && $obStatus['status']) {
            ob_end_flush();
        }
    }
}
Now I don't get the 'failed to delete buffer zlib output compression' error.
After correcting that and a few other errors, I then spent the rest of the morning and quite a bit of the afternoon working on improving the format of the weekly digest email my blog sends.
For a bit I did think of using Feedburner again (I investigated Feedburner previously when first setting up a mailing list). It would be so much easier to get a nice looking email using Feedburner than hacking the s2subscribe plugin like I am doing.
But Feedburner still has the same problem of needing to enter Google's tricky captcha, and I suspect they also require you to validate your email address. Generally with mailing lists you want to make them as easy to join as possible, not as difficult as possible.
In the afternoon I also went in the garden and took some photos of the frosty cobwebs and other frozen stuff.
In the evening I processed the photos I took in the afternoon and watched an episode of Star Trek TOS with Mauser and Bo. Star Trek was very strange as they didn't have any shots of Shatner with his top off, something which is normally included in every episode.
Also in the evening I made some Cinnamon Swirls with Bo, though only half the usual amount as we didn't have enough bread flour.
Sunday, 19 December 2010
Websiting
I also did a blog post for my photography website and some website stat checking / attempted error fixing.
Saturday, 18 December 2010
Getting annoyed by banks
Surely the Inland Revenue must have this info already, otherwise I could put whatever I wanted and they wouldn't be able to check it!
Anyway, I thought I best just get on with the job and find out how much interest I'd been paid and how much tax I'd paid by going through my bank statements. Unfortunately, the Alliance & Leicester bank statements only go back about 250 days. Looking at the info on Alliance & Leicester's website, it says that you should download statements to ensure you have historical statements.
Why didn't they tell me this when I opened the account? It was probably in the small print, but something like this is quite important. Knowing Alliance & Leicester they probably knew it was important and so hid it in the small print knowing that no-one would read it, and so ensuring they would annoy people.
Also, there appears not to be any way to request old statement data from Alliance & Leicester.
This morning I thought I might as well get on with filling in the form to prevent them from taking any more tax from future interest payments, so I did that. Unfortunately I didn't have 32p for a 2nd class stamp and no-one had change for a note. Luckily I had a 10p stamp and lots of 1p stamps from a few years ago, so I just used those instead.
I started doing a sound recording for a video of my top 5 photography tips article, then went to help Peter set up his new printer.
After lunch I spent most of the afternoon playing Wii Sports Golf, Puzzle Bobble, and Mario Kart Wii with Mauser and Bo.
In the evening I went on Animal Crossing and also created a hubpage version of my top 5 photography tips article. They don't allow you to include images in a text 'module', so you have to use a separate image 'module' for each image. So since my article has quite a lot of images in it, it took quite a long time to do.
I also tried to set up the affiliate options in hub pages as well.
Friday, 17 December 2010
Still articling
It's a lot of work just for a few nice sunrise photos!
When I got back home I wrote up yesterday's blog post, then did some more work on my top 5 photography tips article PDF version. I changed some of the layout a bit more, then found that the links weren't working when exported to PDF.
I tried a different PDF and the links in that worked okay, so I thought you must need to do something special to hyperlinks in OpenOffice to get them to work as a hyperlink in a PDF. Googling I found this thread: Links not working in Adobe Reader in Exported PDF, which said the problem was caused by exporting as PDF/a. So I tried tagged PDF instead, and now the links worked.
When I'd got the PDF finished and uploaded I then tried slideshare, but found that the PDF was really in the wrong orientation (portrait) for slideshare. So I spent a lot of the afternoon and evening making a presentation version of the article in OpenOffice.org impress.
Unfortunately it wasn't just a case of copying and pasting: with the different page orientation everything needed rearranging, and Impress would always mess up the formatting whenever I pasted text into it. It was particularly annoying with the flickr links, where it would make the whole flickr credit text look like a hyperlink, and you had to alter the style of the text in a very specific and cumbersome way to make the text look like normal text while keeping the actual hyperlinks looking like links.
I did some checking on the creative commons licensed flickr photos, and found that one of the requirements was that when using the photo you must specify that it is CC licensed. I didn't realise this previously.
The Firefox Greasemonkey script I'd been using to get the image / credit html for CC licensed flickr photos did originally produce the credit in the form of something like
<a href="link to photo on flickr">Creative Commons licensed photo</a> by <a href="link to user on flickr">username</a> on flickr
But I didn't think that was that great so I had changed it to
<a href="link to photo on flickr">Photo title</a> by <a href="link to user on flickr">username</a> on flickr
Now I realised why the script creator had written it to mention CC instead of the photo title. Anyway, all I did was to put a note on my website, the squidoo lens, the presentation, and the pdf that all the flickr photos used are CC licensed.
That way I keep using the photo titles, don't make the photo credits really long by mentioning on every photo credit that it's CC licensed, and also comply with the CC licensing rules.
I also read this blog post about CC licenses being revoked and the burden of proof being on you as a user to prove that you had a license to use the photo: Gaming the Creative Commons for Profit.
So although it is a pain, I took screenshots of every page for each flickr photo I was using, making sure that the little CC icons were on show. Of course, if someone did decide to falsely sue me they could claim that I doctored the images. But still, I think it is better than nothing.
And when there are probably people using the images with no attribution that they could go after, just the fact that I have a screenshot might be enough to put off any of these claims. I hope that in reality I never receive any such claim in the first place though.
In the evening I also watched an episode of Star Trek with Mauser and Bo. My favourite quote was
Dammit, I'm a doctor not an escalator!
Thursday, 16 December 2010
Article submitting
Well, it took me over a day to write the article, and then when I'd finished I realised it was far too long. So I cut it down to 5 top tips and can either use the 10 tips version for a free opt-in PDF or cut it up into other articles.
Submitting the article to the article directories isn't as simple as it sounds as they have various regulations about the number of links in an article, what HTML is allowed, and whether images are allowed.
I only submitted the article to articlesbase.com and ezinearticles.com, but both have different requirements. If you were just writing a plain text article, with <strong> tags used for headlines then submitting to the article directories would be easy.
Since I was writing about photography, I had included a lot of CC licensed flickr photos in my article, and each of these images linked back to flickr and included a text credit link back to flickr as well. But due to the article directories only allowing 3-4 links in an article, I had to remove all these photos.
I also made a squidoo page, which thankfully lets you use as many images and links as you want. However, it inserts HTML line breaks where you have a text line break in your HTML code. So I had to remove all line breaks from my HTML code. I also had to insert each section as a different block in squidoo. I added some Amazon and ebay ad blocks to my squidoo version of the article as well.
There was also a problem with squidoo that when I set my account up I hadn't actually set up a mailbox for the email address I registered with. And squidoo requires you to validate your email address before you can publish an article. So I set my mailbox up and made sure the email could send and receive okay.
But when I used the link on squidoo to resend my validation email, I never received anything. I tried a few times and decided to wait in case it was just really slow.
That took all morning and afternoon. In the evening I watched an episode of Star Trek TOS with Mauser and Lad.
I checked my email for the squidoo confirmation email again, but still hadn't received anything. So I changed the email address in squidoo to my standard one, and this time the resend validation worked. So I validated my email and published the article.
I set up a facebook fanpage for the website as well, then started converting the article to a PDF. Again, because the article wasn't just plain text, this was quite a bit of work. I tried to resize and re-arrange some of the photos and text so the document would flow nicely, instead of have a line of text about a photo on one page and then a large space and the photo on the next page.
While I was doing this I was also listening to (and occasionally watching) a webinar by Daniel Wagner. The title of the webinar was something like 'What to do and what not to do to make money online in 2010'. But actually all it covered was Daniel's membership site 6 figure mentors. There was no useful content at all. It seems that Daniel's idea for what to do for success in 2011 is that you should join his membership site, and what not to do is miss out on joining his membership site.
The fact that he has integrated a ponzi / pyramid scheme (make money for referring new members and any members they refer etc.) into the site makes it less appealing in my opinion than a standard affiliate program as ponzi schemes tend to be those used by fraudsters.
Tuesday, 14 December 2010
Various
I found Bing Webmaster tools to be pretty useless. According to their stats, they have only crawled about 9 pages of my several thousand page site, which has been up for about a year now. And I have even added the sitemap for that site in Bing Webmaster tools.
I guess this explains why Bing search is still rubbish compared to Google, if Bing doesn't even bother crawling sites properly.
In the afternoon I worked on a top 10 photography tips article.
In the evening I watched the final episode of DS9 with Mauser and Bo, and then some of the extras interviews.
After that I noticed that the sky was quite clear, so I went out to try and take a few night sky photos. Unfortunately it seems my lens / camera aren't up to letting in enough light to get a really star filled sky. And when trying to do a star trail, the camera's battery ran out after five or so minutes.
Monday, 13 December 2010
Installing php
In the afternoon I did some more work fixing the articles, adding paragraphs round the amazon ads and flickr photos I'd inserted. I also installed the latest version of php, though I had a few problems along the way.
Last week
Saturday, 4 December 2010
Pano fixing
I managed to process the panos yesterday, and today I was just adding the descriptions etc. and uploading them to the website. I only had 4, so I didn't think it would take very long.
Unfortunately I kept finding things wrong with them so I'd have to open them and fix them, save them, and then re-save the various different versions (full size TIFF, full size JPEG, sharpened sRGB TIFF, 640px sRGB JPEG, 316px sRGB JPEG, 2000px cube faces, 640px cube faces).
One of the problems I found was due to the High pass LCE being different at each edge of the image (so it created a seam when wrapping 360°). This is something I've never noticed before, but it wouldn't be that surprising if it is a problem in all my existing panos, just that I somehow hadn't noticed it.
In the afternoon I also played on Wii Sports Golf with Mauser and Bo, and in the evening I watched an episode of DS9 with them.
Later in the evening we went to see K.K. as well.
I also watched a Canon 600mm f/4 IS lens go for £2,651.00 on ebay. Very cheap for this lens, but I'd rather save my money for things I'd get more use from.
Thursday, 2 December 2010
Various stuff
After that I had to replace all the places where wordpress uses named entities like &raquo; and &laquo;, as these entities don't exist by default in xml and so break pages served as XHTML.
Next I updated the subscribe2 plugin that I have customised slightly. But when I tested it, it didn't work. After looking into it I found that part of my customisation didn't make sense. I think I must have just copied it from the filsofo enroll comments plugin that I also customised slightly.
After attempting to fix my code, I tried again, but it still wasn't working. After a while of trying to find what the problem was, I found that, according to the SMTP plugin, I hadn't configured PHP with SSL support. And so because the SMTP plugin wasn't working, the Subscribe2 plugin couldn't send any email.
Re-installing PHP is quite a task, and apparently a new version of PHP will be released soon anyway: PHP 5.3.4 in December. So I'll wait for PHP 5.3.4 to be released, then build it with SSL support.
I went out for a walk and took a few photos around sunset. After the sun had set it got very annoying: the sun would light up the clouds nicely from below the horizon, but as soon as I set up my tripod and camera the light would go from the clouds / sky.
I would take a few photos anyway and wait for a few minutes, but when the light didn't improve at all I'd give up and pack up the tripod and camera. Then about a minute later the light would come back and light up the clouds and sky very nicely again.
This happened three times in a row, very annoying!
In the evening I watched an episode of DS9 with Mauser and Bo, then did a Japanese lesson with Mauser.
After that I spent ages looking for the rubber grip bits that had fallen off my tripod legs quite a while ago. I thought they were all in a bag, but the bag only had one out of nine bits in it. I looked everywhere I could think of that they might be, but couldn't find any more.
I wanted to glue them back onto the tripod as tightening / untightening the legs is quite difficult in the cold without the rubber grips. So it was annoying that I couldn't find them. I was sure they were in the bag, and since I've looked just about everywhere else they could be, I've no idea what's happened to them.
Wednesday, 1 December 2010
Various stuff
While it copied across I sorted and processed some old MT-24EX diffusion test photos, mostly of flea beetles.
When the copy was finished I checked and the file size was different between my folder and the folder on Host Gator.
I spent a while trying to check where the difference might be. In the end, I did
ls -al ~/path/to/folder > contents.txt
on both the webfaction and Host Gator servers.
Then I had to strip out the info that was different, like the permissions (not sure why those differed), date created, owner user, and owner group. I did this by just pasting the text into OpenOffice.org calc, which splits the text into columns so you can easily delete the columns you don't want.
Probably you could get ls (or pipe it through something like awk) to output only the columns you want, but stripping them out through calc isn't much trouble. Then I pasted the two columns I wanted (filename and filesize) for each host into separate text files. I then did a compare using KomodoEdit, but it didn't find any difference. So I guess that actually all the files did copy okay.
My hex / allen key set arrived from Amazon, so I spent most of the afternoon trying to adjust the autofocus on my D200.
Using the 1.5mm allen key I was able to adjust the autofocus, something I had found too tricky with our existing allen keys that had much shorter 'handles'.
After quite a few adjustments I found what seems to be an okay setting. The exact focus point chosen by the autofocus seems to vary from shot to shot (assuming you move the focus between shots, forcing the camera to refocus), and also from lens to lens.
I haven't taken any 'real life' shots using autofocus with the camera since adjusting it yet though.
In the evening I watched an episode of Star Trek DS9 with Mauser and Bo, then did a Japanese lesson with Mauser.
After that I did more photo processing / sorting. I also received some more stuff from Amazon, so I sorted it out and put it all in a 'Christmas presents' box.
Tuesday, 30 November 2010
Websiting
After updating the config files, I got awstats to update with the log files it had missed (which was since mid-October). Thankfully I still had the batch script I used last time this happened, so I modified that with the missed log dates, and then updating was easy.
I found that I had a 500 error on a couple of my pages. When I looked into this it was because I was missing the PEAR Mail_mime module, so I installed that and it fixed it. When looking into this problem I also found some problems with another page, so I fixed that too.
In the afternoon I was trying to set up my host gator account. I managed to get a subdomain of one of my sites pointing at my host gator account. I also managed to get a different domain pointing at hostgator while the subdomains pointed to webFaction (my current host).
But I couldn't sftp into my host gator account, so after quite a while of trying different things and reading the host gator help topics I gave up and sent them an email to get some help.
I read a few articles on the Luminous Landscape until it was dinner time.
After dinner I watched an episode of DS9 with Mauser and Bo, did a Japanese lesson with Mauser, then finished reading the latest Luminous Landscape articles.
After that I checked my email and had received a reply from Host Gator. It seems they have to enable ssh access for you to be able to use it / sftp. I followed their instructions on setting up the ssh keys so that I could login to sftp without being prompted for a password: SSH Keys.
This worked okay, but my shell script to ftp the images from my current host to host gator didn't seem to work, so I'll have to try and find what the problem with that is tomorrow.
Monday, 29 November 2010
Various
After looking into this I found the problem was that two of the listing titles were really long, with no spaces in them.
Now I knew what the problem was, I just had to figure out how to solve it. The answer was the CSS rule word-wrap: break-word. However, this didn't work in IE8 and 9. I found two solutions for IE8 and 9. One was to use word-break: break-all. Unfortunately this meant that the text would break at the edge of the container even if the text contained spaces, e.g. 'this is some text that has break-all applied'. The other solution was making sure that the element the word-wrap was applied to had hasLayout. Unfortunately setting zoom: 1 didn't seem to work, but setting display: block did.
It didn't work completely though - despite the long titles now wrapping, the table cells containing the titles were still expanded to the unwrapped size of the title. This in turn pushed the whole table out to a larger size, and so the ebay logo that was floated right in the table's <caption> ended up outside the containing <div> and couldn't be seen. To fix this I just set a fixed width on the <caption> to make sure it wouldn't expand outside the containing <div>.
I also wrote a blog post for my photo website about the print I ordered from SnapMad.com, and tried again to adjust the autofocus on my D200. I couldn't do it though, and so ordered a set of Allen keys from Amazon that look like they have longer handles than the ones I currently have.
In the evening I watched an episode of DS9 with Mauser and Bo. Then I spent quite a while looking at the In the picture website, as I'm going to be helping out with the local event.
I watched an episode of Magi Rangers with Bo as well.
Sunday, 28 November 2010
Pano processing
I also finished processing the panos that I took yesterday and watched an episode of DS9 with Mauser and Bo.
Saturday, 27 November 2010
Cinnamon Whirl making and photo taking
After lunch I did some washing up and then watched an episode of Zeo Rangers with L. After that me, L, and Mauser went on a walk. When we got home I rushed out again to try and get some sunset photos from Farndon Fields.
Unfortunately I was a bit too late and the sun was just setting behind the hill as I got there, but I still took a few panos anyway.
When I got home I copied the pics across to my comp and then geo-coded them all. After that I started work on processing one of the panos.
In the evening I watched an episode of Star Trek DS9 with Mauser and Bo, then finished processing the pano I was working on.
I went on Animal Crossing with Mauser and Bo, and we all went to see K.K. play. After that I started processing another pano, then went to bed.
Friday, 26 November 2010
Various
Yesterday when sorting a few Korea photos taken with the Nikon 50mm f/1.4D lens, I noticed that they were back focused.
So in the afternoon today I took some test photos with the lens on my D200, and found that it did have a back focusing problem. Next I checked my 18-70mm f/3.5-4.5G lens at 50mm and 70mm, and that back focused as well. Then I tried my 70-300mm f/4.5-5.6G VR lens, but this seemed to work okay.
So I wasn't sure if the 18-70mm and 50mm lenses back focused or the D200 back focused and the 70-300mm lens front focused. I was in a bit of a pickle as I wanted the 50mm lens at least working for some photography I've got coming up in December.
Then I remembered that I had the Fuji IS-Pro, which is Nikon F mount. So I tried the 50mm and 18-70mm lenses on that, and they both auto-focused okay.
I did a google search to see if anyone else had trouble with the D200 back focusing, and found this thread: D200 back focus?, which had a few people complaining about back focus with their D200s. Thankfully, the thread also contains a link to a website with information on how to adjust the focus in the D200: My Camera needs adjusting. Can I do it myself?
Unfortunately, I couldn't get the allen key into the hole, so I gave up after a bit of trying. The position of the hole also means that you have to put the allen key very close to the sensor, and when you're trying to wiggle it to get it to lock into the hole, you could easily slip and damage the sensor. I might try again tomorrow though.
I went out on a walk about 3pm to try and get some sunset photos, but unfortunately the sun set behind a bank of cloud again, and so there was no sunset to be seen. The ground was mostly frozen, which made a nice change to the usual wet mud of the countryside paths at this time of year.
In the evening I watched an episode of Star Trek Deep Space 9 and an episode of VR Troopers with Mauser and Bo.
After that I added a description etc. to the pano I'd processed in the morning and then uploaded it to my website.
I also purchased 3 years of hosting from HostGator as they had a special 50% off Black Friday sale, making it $4/month instead of the normal $8/month for a 3 year plan.
Later in the evening I did a Japanese lesson with Mauser.
I received my print from Snap Mad today as well, it looks quite good other than a small blob of ink in the top left corner. I'll probably do a more detailed review and put it on my photo website blog.
Thursday, 25 November 2010
Pano processing
In the afternoon I did a bit more work on my photo website then went out around sunset, but the sunset wasn't visible.
In the evening I watched an episode of Star Trek DS9 with Mauser and Bo, an episode each of Magi Rangers and Zeo Rangers with L, and Autumn Watch with Clare and Brian.
I did a bit more website work, and processed a photo from Korea. Looking at the photos it looks like either my 50/1.4 lens or D200 has a back focusing problem. I'll have to try and do some tests tomorrow.
Wednesday, 24 November 2010
SEOing
I had a very weird problem where a page that I had updated was still loading the old version, even after refreshing the page. I downloaded the file from the server to check it had been updated, and it had. I then tried the page in Opera (previously I was using Google Chrome), and the updated page loaded okay.
So I started up Fiddler to check that Chrome was requesting the page from the server when I refreshed it, but now when I refreshed the page Chrome did request the page and did load the updated page. It's like Chrome was trying to annoy me by loading from cache but realised that I was watching it when I started up Fiddler and so decided to load the page from the server like it should.
After getting the site working okay I tried out the IIS SEO Toolkit on the local copy of my site. The first time I ran it, it only downloaded 2 pages though. It seems that it doesn't work properly if you tell it to consider links from both the main domain and subdomains as internal.
It did find quite a few true errors with my site, but mostly it found thousands of incorrect errors:
The description is missing (on 301s, 302s with Location, and 404s)
The <h1> tag is missing (on 301s, 302s with Location, and 404s)
The title is missing (on 301s, and 302s with Location)
The page contains multiple canonical formats (on one 301, 302s with Location, and 404s)
The page contains broken hyperlinks (on 301s, 302s with Location, links to )
An unexpected Error has occurred (on XML files larger than 100kB)
The link Text is not relevant (for links that contain a title attribute with relevant text)
The page contains too many hyperlinks (on the xml sitemap files!)
It is possible to filter results to show only pages with a 200 HTTP status code and without an <h1>, for example, but it would be more sensible if it already ignored things missing from pages that are automatically redirected.
In the evening I watched an episode of Star Trek DS9 with Mauser and Bo and did some online Christmas shopping.
Tuesday, 23 November 2010
Websiting
1, 2, ... 25, 26, ... 99, 100
While I was at it I also namespaced most of my javascript objects. The problem with namespacing is that when you first start writing javascript for a website you might only have 3 javascript objects, so adding them all to a single namespace seems pointless. But then when you've got about 20 objects, it makes much more sense.
But then of course, making the changes to those 20 objects and elsewhere to add them all to a single namespace is quite a bit of work. So I think whenever starting a new website, it is best to start by creating a single namespace everything else will go under, and avoid the work of having to do this later in the project.
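As a rough sketch of what I mean (the object names here are just made up for the example):
//One global object acts as the namespace for the whole site
var MYSITE = MYSITE || {};

//Each 'object' then hangs off the namespace instead of being its own global
MYSITE.gallery = {
    init: function () {
        //set up the image gallery here
    }
};
MYSITE.mapControls = {
    init: function () {
        //set up the map controls here
    }
};

//Only one global name (MYSITE) is created, however many objects you add
MYSITE.gallery.init();
MYSITE.mapControls.init();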
In the evening I watched an episode of Deep Space 9 with Mauser and Bo. Apparently the writer René Echevarria had written what sounded like a reasonable script, but the bear called Stephen who works for the IRA was such a terrorist that he made the script totally different and rubbish instead: Chrysalis DS9 episode on Star Trek wiki.
There was a quite good / maniacal bit where the genetically modified people were just singing a song for ages. Personally, I think it would have been better if they just kept singing for about half an hour. They would need to keep stopping about every minute or two for a few seconds so that you think they've finished, then carry on again.
After that I was doing some more work on my website, and found that a php variable wasn't available in my footer.php file in wordpress.
After quite a bit of debugging, I found the answer - I was including my file that declares the variable in the header.php file. Since this file is called by get_header(), any variables declared were local to the get_header function, and weren't available outside of that function.
So to fix this I had to declare the variables as global before I included the file that set the variables.
Like the header.php file, footer.php is included via a function - get_footer(). So again, you have to declare the variable(s) as global to be able to access the global variables instead of referring to local variables.
After doing that, I could now access the variables in the wordpress footer.php file okay.
Monday, 22 November 2010
Websiting
In the afternoon I also tried contacting travelodge to see if they wanted to purchase any of my photos for decorating their new hotel with, or if they wanted a virtual tour done.
And I ordered a canvas print of one of my photos from Snap Mad to see what the print quality is like.
In the evening I also did a Japanese lesson with Mauser and watched an episode of DS9 with Mauser and Bo. It was about the main characters playing baseball and was directed by the legend that is Chip Chalmers. I'm not sure why Chip Chalmers is a legend though, I think it's probably just because his name is alliterative.
Sunday, 21 November 2010
Poggin'
I also watched an episode of Deep Space Nine with Mauser and Bo, and I watched the Jackie Chan Drunken Master film with them as well.
Saturday, 20 November 2010
Watching a film and stuff
Replace the current document with the one at the provided URL. The difference from the assign() method is that after using replace() the current page will not be saved in session history, meaning the user won't be able to use the Back button to navigate to it.
I tested it out to try setting the hash like
window.location.replace(url+'#someHash');
and was glad to see that it didn't reload the whole page, but did set the hash.
After testing though, I found it didn't seem to be working properly. Then I realised I'd made a mistake in my code that made it redirect to a different version of the page. After correcting this, I was happy to find that it did work as advertised in most browsers.
In Opera I found that it didn't replace the history state, and so if you call window.location.replace() 100 times, you'll still need to press the back button 100 times to get back to the previous page. And Arora had the same problem as well. Firefox 3.5 on Ubuntu, Safari 5, Chrome 7, IE6, and IE8 (on Windows) all seemed to work properly though.
After lunch I watched most of an episode of Star Trek DS9 with Mauser and Bo.
In the morning Mauser had been talking about doing a trip round China / parts of Asia. So I decided to look at how much a micro four thirds setup would cost, as it looks like it would make a good camera system for travelling (small and light).
I'm not too bothered by the size / weight of consumer lenses for 'normal' DSLRs, but mainly thought the small size of micro four thirds lenses would make it easy to keep them in one or two pockets. This should reduce the chance of them being pickpocketed compared to having to keep 5 lenses in 5 different pockets.
However, it looks like micro four thirds lenses are very expensive, e.g. comparing wide angle zooms for Micro Four Thirds, Nikon and Canon:
Panasonic Lumix 7–14mm f/4: £900
Nikon 16-35mm f4 G AF-S ED VR: £829.00
Canon 17-40mm f/4 L: £530
And comparing fisheyes:
Panasonic Lumix G Fisheye 8mm F3.5 £600
Tokina 10-17mm f/3.5-4.5 AT-X 107 DX Fisheye £410
Taking into account that I already own a few lenses for Nikon, buying the equivalent lenses for use with Micro Four Thirds would make it very expensive.
After that I spent a while reading the PhotoShelter docs to see how I could integrate PhotoShelter with my existing website. Unfortunately it doesn't look like PhotoShelter can be integrated with an existing website, so I sent them a message to see if it is possible or not.
In the evening I watched another episode of Star Trek DS9 with Mauser and Bo, and also finished watching 'The Last Emperor' with them.
Thursday, 18 November 2010
Google Map and Earthing
Strangely, the info bubbles for items in the layers built in to Google Earth seem to work okay. But if I copied the html from one of those info bubbles and pasted it as my info bubble, although my info bubble would look exactly the same as the bubble I'd copied from, my bubble would still do the annoying resizing thing!
I think I will give up and try filing a bug report.
I also watched an episode of Power Rangers Zeo with L and part of The Last Emperor with Mauser, Bo and Clare. The Last Emperor is quite annoying as the people speak a mixture of English and Chinese. I wish they had just done it in Chinese except for any bits that are meant to be in English.
Wednesday, 17 November 2010
Google Mapping, Record Recording, and Forest Gump on CDI watching
Unfortunately after getting the function working in Chrome, I found that it didn't work in IE7, and debugging it, it didn't look like there would be any (easy) way to make it work.
The problem was that I was using the getContent method of the info window, and then using jquery to get the offset of the content's parent. But in IE7 it seemed like getContent returned a copy of the content node, rather than the actual node itself, and so getting the offset didn't work.
I spent most of the rest of the day testing in other browsers, and found webkit / Safari doesn't handle history.replaceState and window.location.hash properly. So I had to add some extra logic to make sure that history.replaceState is only used if the browser supports it and it works correctly. I also filed a bug report with webkit about this issue.
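The extra logic is basically just a guard along these lines (a rough sketch - the function names are made up, and the real check for the webkit problem was a bit more involved than a simple feature test):
//Placeholder for the real test of whether replaceState exists and behaves correctly
function replaceStateWorksCorrectly() {
    return !!(window.history && typeof window.history.replaceState === 'function');
}

function updateUrlHash(hash) {
    if (replaceStateWorksCorrectly()) {
        //replace the current history entry, so the back button still takes
        //you to the previous page rather than the last map state
        window.history.replaceState(null, document.title, '#' + hash);
    } else {
        //fallback: location.replace() also avoids adding a history entry
        //(though Opera and Arora don't honour that)
        window.location.replace(window.location.pathname + window.location.search + '#' + hash);
    }
}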
I also got my record player from the garage, set it up, and recorded a few records to my PC.
In the evening I also watched a bit of Forest Gump on the CDI with Diddlebury, and 2 episodes of Power Rangers Zeo.
Tuesday, 16 November 2010
Google Mapping
Still to do is more extensive testing in IE8, IE9, Safari, Firefox3.5, Firefox4, Opera, Konqueror and K-Meleon.
In the evening I watched the "It's reeeeaaal" episode of Star Trek DS9 with Mauser and Bo.
And I also watched some Robert W Paul films with Mauser.
Monday, 15 November 2010
Google Mapping
In the evening I also finished watching 'For All Mankind' with Mauser, Bo, and Clare.
Sunday, 14 November 2010
Various stuff
In the afternoon I updated my pog website then accidentally deleted some files from my Ubuntu VM.
I went out to help someone with a computer problem for a few minutes, then came back home and restored my backup of the Ubuntu VM from this morning. While that copied over I read a bit of Moose Peterson's blog.
When the backup of the Ubuntu VM had finished copying over, I started it up and then re-copied all the pog images etc. I had previously copied to it since the last backup.
I then did some Google Earth work, but was having trouble in that I had put the wrong URL for an image in my kml. I corrected the mistake and cleared the cache (both memory and disk), but Google Earth was still requesting the wrong URL for the image. I tried again, closing Google Earth first, but it was still the same. I tried logging out and choosing to delete the cache file, but it was still the same.
Eventually I tried changing the file name of the kml file that originally had the wrong URL in it, and updating the KML that requested the kml file with the file's new name. Hooray, it now worked. It seems that clearing the cache in Google Earth does not actually clear the cache.
Possibly clearing the cache only deletes items downloaded from Google's servers. If Google Earth downloads files from your website with long expires headers, it will cache them (as it should), but then there seems to be no way to clear this cache. I filed a 'feature suggestion' with Google that Google Earth should clear its cache when you clear the cache, as they don't seem to have a bug reporting mechanism.
In the evening I watched the first episode of Mahou sentai Magirenjâ, the first two episodes of VR Troopers, and an episode of Masked Rider with L. I also watched part of a film about Apollo 8 and the moon with Mauser, L, and Clare.
I did some more work on my google maps / earth as well.
Saturday, 13 November 2010
Watching a film
Well, Home Shopping Selections normally sells the watch for £15. Strangely, they also sell it for £2. And according to a poster on this thread, even free is too much:
they are cheap rubbish!
I have one that i got free, which was still too much!
As usual, you get what you pay for, and if it seems too good to be true it probably is.
In the afternoon and evening I watched Master & Commander: Far Side of the World and a DS9 episode with Mauser and Bo.
Friday, 12 November 2010
Google Map and Earthing
But then this morning, I was thinking that actually if you are using Google Maps, do you expect the back button to go back to the last interaction you made with the map (e.g. scrolling or opening an info window)? My thought is that no, you would expect the back button to go back to the previous page. By recording all interactions with the map in the url hash, if you made 100 interactions, you would then need to use the back button 100 times to get back to the previous page.
So I changed my code to not update the hash, then wondered if it was just me who thought this way about how the back button should work with Google Maps. So I did a google search, and found this post where the writer is complaining that you can't easily bookmark a google map. While he doesn't complain about lack of back button functionality in Google Maps, he does complain about it with another website.
So I posted to the sitepoint forums to see what other people thought.
I did some vacuuming and then spent quite a while doing KML stuff. I was trying to find out why my KML would load but wouldn't fly to the loaded placemark and open the bubble for it. After quite a while of trying different things I eventually figured it out - I had used ;balloonFlyTo when it should be ;balloonFlyto. Doh!
I then tried to get multiple icons working in my KML but got stuck, so posted to the KML forums / google group to try and get some help.
I wrote some more of this blog post, then looked into whether it is possible to somehow delete the browser history for the page so that the URL could be updated without creating history states. This way bookmarking the current state of the map would be easy, while pressing the back button would take you back to the previous page instead of back to the last interaction you made with the map.
I didn't think this would be possible, but it is - using history.replaceState(). It seems that currently this is only available in Chrome (maybe other webkit browsers as well), and the FF4 beta. I downloaded the IE9 Platform preview, and was disappointed to see that this doesn't support it either. So I filed a feedback report in the vain hope they'll fix it. I would guess they'll wait until other browsers have been implementing it for a few years before they add it to IE though.
In the evening I was working on converting an image in DXO, ACR, and CNX2 to compare how well they handled CA correction of the Tokina fisheye. While I was doing this, I listened to a webinar with Dean Hunt and Phil Henderson. Dean introduced the webinar and said that he would hand over to Phil who would get straight into the 'meat', and wouldn't bore you with his life story.
But then, Phil spent ages talking about his life story, and how his life is so great now etc. etc. He was saying that he loved having the freedom to do what he wants and just go out trout fishing whenever he wants.
Then later he said that he was retiring as internet marketing was too much work and too tiring. He said that he would never promote or sell another 'make money' product after this one. And he said he only had 3 places on his new course. In fact, the course is so 'hush hush' and secret that you have to sign an NDA if you purchase one of the three places.
Then later he expanded the number of places to 5. And later still Dean Hunt kept saying that Phil might open some more spaces in two weeks. Phil said that it might not be in two weeks, but did indicate that he would indeed be opening up more places in the future. So suddenly the course is not so exclusive as it was at the start of the webinar.
And regarding not selling any more 'make money' products in the future, I guess Phil must think that closing a course and then opening it again in the future doesn't constitute selling any more 'make money' products because it is the same product?
On the whole then, I wasn't impressed with his sales technique.
Another thing was that he said the course was a good few months 'ahead of the curve'. Now, I don't know exactly what the course was about, but Dean hinted strongly and Phil somewhat confirmed that it was about local search marketing (where you do internet marketing for local businesses). Local search marketing certainly isn't 'ahead of the curve', as I've seen quite a few people promoting it heavily already.
Thursday, 11 November 2010
Photo processing and Google Mapping
In the evening I also watched an episode of Masked Rider and Kamen Rider with Bo and looked at Shaz's photos from South Africa.
Tuesday, 9 November 2010
AJAX url permalinking
The rest of the day I was working on my google maps implementation, mainly adding unique urls / permalinks as per http://ajaxpatterns.org/Unique_URLs. I don't have back button functionality or any url change polling working at the moment though.
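The url change polling part would just be something along these lines (a rough sketch, with a placeholder for the bit that actually restores the map state):
//Placeholder for whatever will actually restore the map from the hash
function restoreMapStateFromHash(hash) {
    //parse the hash and re-centre the map / reopen the info window etc.
}

//Poll for hash changes, for browsers without a hashchange event
var lastHash = window.location.hash;
setInterval(function () {
    if (window.location.hash !== lastHash) {
        lastHash = window.location.hash;
        restoreMapStateFromHash(lastHash);
    }
}, 250);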
In the evening I also watched an episode of Masked Rider and Kamen Rider Black with Diddleberry.
Monday, 8 November 2010
Processing photos
Shaz came home for a few days today as well.
Wednesday, 3 November 2010
Okay weather
I also went out to Farndon Fields again and took a pano by one of the trees on the edge of a field.
When I got home I copied the photos to my PC and manually geo-coded them (I forgot to switch my GPS on again). I also uploaded some photos to my photo website.
While I was waiting I started watching RIP: A Remix Manifesto.
In the afternoon I finished watching the Remix Manifesto film. It started raining so I was glad I had gone out this morning.
After that I checked my email. Reading the Money Morning email, Dominic Frisby said
But with so many markets at such obvious turning points, and inflation very much the mot du jour, my short-term speculative money is betting on a rise in the dollar, if only in the short term, and a fall in everything else.
So I decided to take his advice, and sold my silver and precious metal ETFs, and put £1000 in a short yen long dollar ETF. But I kept my gold ETFs in case things go the other way to how Dominic thinks they might.
The rest of the afternoon and most of the evening I was working on a couple of panos that I took in the morning.
In the evening I also watched an episode of Masked Rider with L.
Monday, 1 November 2010
Processing photos
In the late afternoon the weather was looking the same as it did yesterday afternoon when there was a nice sunset, so I decided to go out. Unfortunately the sunset today was rubbish.
In the evening I also watched the final episode of Chojin Sentai Jetman with L. It was quite weird for a final episode. I also watched an episode of Masked Rider with L, which was quite nonsensical (even more than usual).
Sunday, 31 October 2010
Pano processing
After dinner I watched Jetman, Masked Rider, and Kamen Rider with L.
The rest of the afternoon and evening I was working on a couple of panos I took yesterday.
The weather was overcast nearly all day today. Around sunset the sun started shining through the clouds. I didn't think it was worth going out to take any photos since the cloud was absorbing quite a bit of the light, and I could see a bank of cloud that the sun was going to set behind.
But then actually the sunset was quite good, with quite a bit of the cloud covered sky being lit up a pale orange. After the sunset had been going on for quite a while I eventually decided that I might as well go and get some photos. Unfortunately I was a bit late, and just got the end of the afterglow really. And I didn't have time to find a place with a nice foreground / landscape either.
Saturday, 30 October 2010
Photo taking and geocoding
After having a shower I went out to photograph the sunrise. Although I got out to where I wanted to go (the same place as I went the other morning when it was too cloudy) in time for the sunrise, I needed to be about 15 minutes earlier to catch the pink / purple clouds before sunrise. There weren't a lot of clouds around so it wasn't that annoying that I'd missed the dawn though.
After taking the sunrise photos I wanted, I took some photos of the nearby trees in autumn colours (the same ones I took blurred photos of the other day). I took a pano in the field there as well.
I went on up through the next field towards East Farndon, then along the road towards the main road that goes through East Farndon. I thought that I would then be able to go across a field towards the valley on the east of East Farndon, but actually that footpath was further down the road.
Rather than walking down the road and then back up across the field I was originally planning to go through, I thought I would go up through East Farndon, then across the fields towards the valley.
So I did this, and was relieved to find that the bulls in the fields didn't chase me. When I got to the point that I had been planning to visit, I found that actually I didn't think it was that great for a pano. (It was a point I had walked through probably in July and August and thought that it would make a nice place for a daytime pano).
I was surprised that the large drinking trough bowl there didn't have any water in it, given all the rain we've had lately. (I had been planning to put the tripod in the drinking trough bowl for the pano).
Anyway, I didn't think it was worth taking a pano so I didn't take one.
I then walked back across the fields to Harborough Road, and back home along the roads. I also saw someone from Church while I was out walking.
I got back home about 10am, and was hungry and thirsty since I hadn't had anything to eat or drink yet today. I heated a cinnamon whirl up in the oven for me, Mauser, Lad, and Clare (though Clare had actually already had her breakfast).
The rest of the morning and afternoon I spent copying the images to the PC, and geo-tagging them and some of my other recent images. Unfortunately I keep forgetting to use my GPS lately, so I had to manually geo-code them all. And the altitude lookup in Robogeo hasn't been working lately, so I had to manually look up the altitude for each image taken at a different location. So that's why it took most of the morning and afternoon to do the geo-tagging.
I also added a bit of other basic metadata to the images.
About 4.40pm I went out to try and take a pano of the sunset in a field I'd passed through on my morning walk. It had a really nice yellow hedge and yellow oak (I think) trees in it, but in the morning the sun was hitting the back of them.
They looked very nice backlit in the morning light, but I thought that shooting into the sun, the sky would either be blown or the leaves dark. And it was windy in the morning, so exposure blending wouldn't have worked well.
Unfortunately, by the time I got to the field in the late afternoon, the sun had hidden itself behind a cloud, so there wasn't the nice warm sunlight hitting the trees and hedge that I wanted. The sun started to shine through the cloud a bit, and it had nearly set behind the hill that East Farndon sits on. So I took a pano, knowing that I might not get another chance this year to capture the trees and hedge looking so nice and yellow.
Then a few minutes later, the sun started to shine through the cloud more strongly, so I took the pano again. It wasn't perfect though - the sun was shining through a layer of cloud, and so wasn't at its strongest, and the cloud wasn't lit up that well by the setting sun either. But it was better than the first pano (or at least should be, I haven't processed it yet).
I went back into the previous field, and did a pano near a muddy puddle there. I was planning to do some twilight panos near the building work that is going on in the fields, but after sunset the clouds just went to being grey instead of purple / pink, so I didn't bother.
When I got home I copied the evening photos to my PC then had dinner.
After dinner me and L watched Kamen Rider, Masked Rider, and Birdman Rangers Jetman.
Then I geo-coded the evening's photos. I had remembered to switch on my GPS and sync the GPS and camera clocks, so geo-coding was relatively easy, though since I only took photos in 3 different locations, and two of them were virtually the same, geo-coding manually would have been about the same speed.
While I waited for the images to geocode I checked my email, Nikonrumors, and Canonrumors. I changed my watch and alarm to account for the DST change tonight, did a backup, wrote this blog post, then went to bed.
Friday, 29 October 2010
Google Earthing
My guess would be that fiddler can't intercept requests made through google earth's internal webkit browser, which is what is used to display the info bubbles.
Then I spent a lot of time trying to get things working. One of the problems was that I needed to load the css <link>s from an html page that I was fetching using jquery's AJAX function. I set the datatype to 'html' in the AJAX call, but the data object that jquery creates from the response only contained the contents of the <body> of the fetched html page.
So there was no way for me to get the <link>s from the <head> of the fetched page as jquery (or webkit) had stripped the <head> away.
So I tried using a datatype of xml for the AJAX request instead. I had to get the page I was requesting to send an xml content type header as well to get that to work. Now the data object that jquery created with the response did include the full document, so I could get the <link>s from the <head> as well as the contents of the <body>.
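To give an idea of what I mean, the request was roughly along these lines (the URL and variable names here are just placeholders, not my actual code):
// Rough sketch of the xml-datatype request
$.ajax({
    url: '/some-page.html', // placeholder URL
    dataType: 'xml', // the fetched page has to be served with an XML content type (and be well-formed) for this to work
    success: function(data){
        // data is now the full document, so the <head> is still available
        var links = $(data).find('head > link');
    }
});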
However, when I used jquery to get the <link>s from the xml and then appended them to the <head> of the page I wanted to manipulate, they had no effect. I spent quite a while trying to figure the problem out.
Eventually I found that if I created a jquery object with each <link> (to convert it to an html object), and then took its outerHTML and converted that to an html object using jquery, and appended that to the page, it would work.
E.g.
//Doesn't work
$(data).find('head > link').appendTo($('head'));
//Does work
$(data).find('head > link').each(function(){$('head').append(this.outerHTML);});
But then I found that the css wasn't being applied to a list (<ul>) that I was appending to the <body>. Again, I spent quite a while trying to find what the problem was, and it was similar to the problem I had with the <link>s not working.
//HTML appears to be appended, but you can't access the appended html with jquery and the css rules aren't applied to it either
$('#contentContainer').append($(data).find('body > div'));
//Works properly
$('#contentContainer').append($(data).find('body > div')[0].outerHTML);
Then I also had lots of anti-fun trying to find out how to get google earth to zoom in to a placemark, and then open that placemark's info bubble. Unfortunately it seems that it is currently impossible. I think for the moment I am going to go with opening the info bubble and zooming to the placemark at the same time (this is possible).
This looks horrible as the info bubble keeps resizing and fills up most of the screen while google earth is zooming into the placemark. But if there's no better option, what can you do?
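For the record, I think the 'at the same time' behaviour just comes down to Google Earth's special feature anchors in balloon HTML, roughly like this (the placemark id 'photo123' is made up for the example):
// Links of the form #featureID;balloonFlyto make Google Earth fly to the feature
// and open its balloon together; #featureID;flyto and #featureID;balloon do each on its own
$('#contentContainer').append('<a href="#photo123;balloonFlyto">Zoom to this photo</a>');
As far as I can tell there's no anchor (or anything else) that does the fly-to first and only opens the balloon once the zoom has finished.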
Most of the changes I've made to try and make browsing my data in Google Earth a similar experience to browsing it in Google Maps will have broken the Google Maps implementation. So when I have the Google Earth implementation working as best I can, I'll then have to go back and fix the Google Maps implementation.
About 4.40pm today the sun started to come out (it had been overcast all day, just like yesterday), so I quickly grabbed my photo stuff and went out to try and get some nice sunset photos. I figured that the large amount of cloud around should look very nice when lit up from underneath by the setting sun.
But about 5 minutes after I'd left, the sun went back behind the clouds. I hung around until about 5.30pm, when I figured that the sun probably wasn't going to come out again and so there'd be no sunset, and then went back home. I did take some blurry leaf photos while I was waiting for the sun though.
Also in the evening, I watched an episode of Masked Rider and Birdman Rangers Jetman with Belly.
Thursday, 28 October 2010
Google Map and Earthing
The rest of the morning I was working on making a custom marker for use in google maps, and getting it working there. I made a number of different coloured versions, with each colour representing a different range of image counts. E.g. a red marker meant one image at that point, purple meant 2-10 images at that point, blue was 11-25 images at that point, etc. I'm not sure if this is actually a good idea or not.
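The banding itself is just a simple lookup, something like this sketch (this assumes the Maps API v3, and the icon filenames and the top-band colour are made up for illustration):
// Pick a marker icon based on how many images are at a point
function markerIconForCount(count){
    if(count === 1) return '/markers/red.png';    // 1 image at this point
    if(count <= 10) return '/markers/purple.png'; // 2-10 images
    if(count <= 25) return '/markers/blue.png';   // 11-25 images
    return '/markers/green.png';                  // more than 25 (made-up colour)
}

var marker = new google.maps.Marker({
    position: new google.maps.LatLng(lat, lng), // lat, lng and imageCount come from the image data
    icon: markerIconForCount(imageCount),
    map: map
});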
Most of the afternoon and part of the evening I was trying to fix my google earth kml so it would work the same as the google maps. This involved using javascript in google earth, which is quite difficult since I don't think google earth has a debugger. It seems to have some weird bugs as well, e.g. where DOMAIN is a variable containing the site domain (domain.com):
//Doesn't work
var WWW='http://www'+DOMAIN;
//Doesn't work
var WWW='http://www';
//Does work
var WWW='http://'+'www'+DOMAIN;
I haven't got very far with it yet though.
Wednesday, 27 October 2010
Geo-coding
In the late afternoon I went out to photograph the sunset, which was quite nice.
In the evening I watched an episode of The Masked Rider and Birdman Rangers Jetman, and the final two episodes of Mighty Morphin Power Rangers season 3 with L. After that I finished watching 2001: A Space Odyssey with Mauser and Biddles.
Tuesday, 26 October 2010
Geo-coding
As well as correcting the geocoding of the actual images, I also had to update the image data on my website's database, so it took a long time to do each image.
In the evening I watched an episode of Power Rangers, Masked Rider, and Birdman Rangers Jetman with L. I also watched quite a bit of 2001: A Space Odyssey with Mauser and Bo.
Monday, 25 October 2010
broke stuff
After that I tried to find out why the Amazon Machine tags wordpress plugin wasn't working for me. I found out what the problem was, but not what the cause was or how to fix it, so I posted to the wordpress forum for the plugin to try and get some help.
Next I noticed that the ebay listings widget for my pog website wasn't working. I spent a while trying to debug that, but couldn't see what the problem was, so posted to the ebay developers forum to try and get some help with that.
After that I did some work on my geo clustering script. Then I spent quite a long time trying to test the different versions that I've made. The problem is that I have 12 versions, each version takes between 4 and 30 seconds to run, and my test script runs each one 5 times. So I kept getting various timeout errors.
Eventually I got it working by calling it via the php command line interface instead of via the browser.
In the afternoon I spent most of my time trying to improve the query that I use to retrieve the clusters from the database.
In the evening I watched Police Story, and episodes of Masked Rider, Power Rangers, and Birdman Rangers Jetman with L and Mauser.
Sunday, 24 October 2010
Watching films and baking
After dinner I finished watching Japanese Celine and Julie go boating (雨月物語, Ugetsu Monogatari) with Mauser and Bo.
Me and Bo made some cinnamon swirls, which took all afternoon. It takes so long because you have to let them rise (in the warmed oven) for 20 minutes, and then cook for 10 minutes. We can only fit two trays at a time in our oven, so doing about eight trays worth takes ages. Still, they taste nice and should last most of the week.
In the evening I watched Tokyo Drifter with Mauser, then I checked my email.
Friday, 22 October 2010
Processing and uploading photos
I spent most of the rest of the day uploading photos from my walk a couple of weeks ago, and also processing and then uploading the photos I took yesterday and today.
In the evening I watched Arsenal with Mauser and Bo. It was well directed, and I thought the music went well with the film, but the story didn't make much sense to me.
Thursday, 21 October 2010
Moving files
When I got back home I uploaded a few photos to my photo website, but then my Ubuntu VM kept complaining that it only had 1.7GB of free space.
So when the photos had finished uploading and the website had processed them, I decided to expand the VM's virtual hard drive. I loaded up the VMWare Server 2 web interface, but when I tried to expand the drive size I got the error
Insufficient disk space on datastore ''.
I had enough free space on the drive where the datastore was located, but only about 5-10GB. I thought that maybe it needed some extra spare capacity for expanding the virtual drive, or maybe it creates a new virtual drive and then deletes the old one.
So I went through some folders deleting old stuff I didn't need any more, then tried expanding the virtual hard drive in VMWare Server 2 again. But I still got the same message about
Insufficient disk space on datastore ''.
I did some googling and found lots of info, but it all seemed to be about VMWare ESX, and the solutions didn't seem to be applicable to my situation.
I checked what the VMWare Server 2 Web Interface said about the free space level of the datastore, and while it was over 9000 (MB), it was less than the actual free space on the drive. I refreshed the page, but it still reported the same amount of free space.
So I restarted the PC, and then the VMWare Server 2 Web Interface finally reported the correct amount of free space for the datastore. I tried expanding the virtual hard drive again, thinking that maybe it didn't work previously as VMWare wasn't detecting the amount of free space correctly, but I still got the same message
Insufficient disk space on datastore ''.
In the afternoon I finished adding metadata to all the photos that I had taken a couple of weeks ago, and then moved them across from my 'Needs sorting' folder, which is on the same drive as the VMWare datastore, to my 'Pictures' folder, which is on a different drive.
When that was done I did a backup and checked my email.
When doing the backup, I noticed that one of the backup drives had much less free space than the drive in the PC that it was backing up, even though it was meant to be a mirror backup. The Recycle bin said it was empty, but when I checked the folder sizes on the backup drive and the computer drive, I found they were both the same.
So it was like there were some hidden files on the backup drive that were filling it up. It turned out there were - it seems that the Recycle bin on the computer doesn't register drives plugged in after you've already booted. So while there were files in the Recycle bin on the backup drive, you couldn't see them.
The solution was to either
- In Windows Explorer go to Tools > Folder Options, and on the View tab untick 'Hide protected operating system files (Recommended)'. A hidden Recycle bin folder then appears in the root of the disk, and you can open it and delete its contents.
- Otherwise, right-click on the drive in Windows Explorer and choose 'Properties'; on the 'General' tab there is a 'Disk Cleanup' button you can use to empty the recycle bin.
But when I analysed the drive, it showed 0% fragmentation. So both disks have the same amount of fragmentation, the same amount of data, and the same size, but differing amounts of free space. Weird.
When the backups were complete I tried expanding the virtual hard drive for my Ubuntu VM again. This time it worked, though it took absolutely ages. I think VMWare Server 2 must require free space at least equal to the new size of the resized disk before it will let you resize it. This would mean that if you had a VM with a 100GB drive, and you wanted to resize it to 101GB, you would need at least 101GB of free space, even though you're only increasing the disk size by 1GB.
When the drive resize was finally finished, I then had to move the VM partitions about and extend the main partition using gparted. Again, this took quite a while to do.
When that was done I wrote up this long, boring, useless blog post.
In the evening I went out to try and get some photos of the sunset. It was a nice fiery red sunset, but unfortunately the area of the clouds being lit up was rather limited and quite far away.
After dinner I watched an episode of Power Rangers, two Masked Rider episodes, and one Chojin Sentai Jetman episode with Belly.
After that I watched Autumn Watch with Clare and Brian (and Belly watched the first half too). Then I uploaded some more photos from my walk two weeks ago and geo-coded the photos from today.
Tuesday, 19 October 2010
Ebay APIing and power cut
So I changed my code to instead just check the cache on every page request, and if the cache is stale or empty, fetch the listings and cache them. It does mean checking the cache date on every page request, and if the cache is stale the user has to wait longer for the page to load while the new listings are fetched, but I think it's better than auto-generating listings that aren't needed.
In the morning we had a power cut for about an hour, so I couldn't do any work until the power came back. Strangely, HFM was out (playing music with no DJ) for about another hour. Power cuts are also annoying as all the burglar alarms go off.
Later in the afternoon I was looking at adding an ebay logo to my listings ad, but ebay have quite a lot of restrictions on the use of the logo. Strangely, on the logo use page it says that you must use a logo, but it doesn't seem to say this anywhere else in their docs. And the ebay API developers example scripts don't seem to include the mandatory logo and other text either.
Possibly ebay don't actually care if you include the logo and all the other text they say you must use when displaying data retrieved using the API. But it's better to be safe than sorry - you wouldn't want to earn lots of affiliate commissions from your ebay widget, and then ebay say they won't pay you because you resized the logo or missed out the copyright text.
So I emailed ebay using the contact form on their website to check if my usage of their logo and omission of other mandatory stuff was okay. Rather worryingly, I got a javascript error message pop up when I pressed submit, but the actual webpage said the email had been sent successfully. So I don't actually know if the message did get sent or not.
In the evening I watched an episode of Chojin Sentai Jetman and Masked Rider with Belly. After that I processed a few photos.
Monday, 18 October 2010
Ebay APIing
When I'd finished checking through the latest error logs it was lunch time. Most of the errors I saw in the logs were to do with url encoding. I don't encode most characters in my urls as all browsers I've tested do this (with most characters) automatically. Possibly IE < 6 and bots might have problems.
This isn't a problem for most people anyway, so I don't consider it a priority to fix. Indeed, bots not being able to rip your site could be considered advantageous.
After lunch I activated the new installs (well, actually about a week or so old now) of nginx and php on the webserver, and checked my websites worked okay.
When I was satisfied the websites seemed to be working okay, I checked that the country was being detected from the IP address (which was the whole point of installing the new versions of nginx and php). Unfortunately it seemed that my IP address was being recorded as 127.0.0.1.
I wondered what the problem was until I realised I was checking phpinfo() on my local dev site instead of the live site. When I checked the live site I was relieved to see it had my outside IP address and the country recorded as 'GB'.
The rest of the afternoon I was working on the ebay listing code for my site.
In the evening I watched an episode of Chojin Sentai Jetman, Masked Rider, and Power Rangers with Diddleberry.
After that I did some more ebay API work. So far I have got it retrieving the ebay listings from all ebay sites and caching the html. And it is including the cached html in the page okay. I still need to style the listing and ping the ebay site using an Impression Pixel.
The weather today started off with a nice red cloud sunrise, then the rest of the day was overcast.
Sunday, 17 October 2010
Pogging and regexing
I got up to date on Moose Peterson's blog, then spent quite a bit of the afternoon and evening trying to get sed and awk to first get the pids of a command, and then get a command by pid (using the output of ps).
I also watched Once Upon A Time in China with Mauser and Bo, and a couple of Power Rangers episodes with Bo.
Saturday, 16 October 2010
Various stuff
I tried installing the PEAR Mail package like
php/bin/pear install Mail-1.2.0
but this gave an error about not being able to write to php_dir. Weirdly, I just tried it again now like
php/bin/pear install --alldeps Mail
and it worked fine. It definitely was not working this morning.
After trying a few things to get pear working, I gave up and decided to download the packages from the PEAR website and see if it was something that needed to be installed using pear or pyrus.phar, or just some files I could put in the right place manually. Thankfully, it was the latter, so I downloaded and unzipped the packages I needed, then placed the PHP files in the correct place (php/lib/php).
With php/lib/php included in the include_path setting for the site in php.ini, I loaded up the website and tested the contact form. It worked and PHP didn't break!!!
When manually downloading the PEAR packages I needed, I noted that they actually have a GEO_IP class. Since I'm only updating nginx and php to get this ability, that means I could have just installed that module, and saved all the work I've done all last week trying to get nginx and php installed and working properly. But then I would likely still have had the same problems when I try to update nginx / php sometime in the future, so it wasn't a week lost for no reason.
After getting that working I watched an episode of The Masked Rider with L and Mauser, and then topped up the garden pond.
After lunch I processed my Fuji IS-Pro files using CornerFix, and it worked well. It took quite a while to process them though.
When that was done I checked whether the problem with php breaking was due to the old PEAR packages or their location. I copied the old PEAR files from /usr/local/lib/php to php/lib/php, and tried the contact form on the website, and it still worked okay. So the problem was with the location rather than the packages, though I'm not sure how the files being in a folder that php doesn't have permission to write to would make php break.
In the evening I watched an episode of Power Rangers with L, and the rest of Even Dwarves Started Small with Mauser and L. I processed some more photos as well.
Friday, 15 October 2010
Getting annoyed with php not working
Unfortunately the CentOS init script for nginx wouldn't start it as it already thought nginx was running (the old installation was still running). So I had to deactivate the old one (thus making my websites inaccessible) before I could even test whether the new installation would work okay.
After finally getting them (nginx and php) up and working on the web server, I tried my websites and they loaded okay. But when I tried to send myself a message using the contact form on one of the websites, it made php stop responding. So when I realised this I had to make the changes to go back to the old nginx and php. So my websites were down for a few minutes.
Next I spent a while trying to find what the problem was. I found that the problem only occurred when the send function of the PEAR Mail class was used. So I spent a long time trying to work out what the problem with this was, and why it would cause php to crash and become unresponsive.
I thought maybe the problem was php-fpm, so I tried to use spawn-fcgi to start php instead. But the php-cgi binary didn't exist.
So I tried re-installing PHP using --enable-fastcgi but got a message that it didn't recognise that option. Next I tried --enable-cgi (as well as --enable-fpm). configure didn't give any complaints, but when I had finished installing it (which takes quite a while), I found that it hadn't created the php-cgi binary. So I tried again, but this time without --enable-fpm, and it built with the php-cgi binary okay.
After modifying the php spawn-fcgi init script to work with the new installations of spawn-fcgi and php, and a few hiccups, I got php up and running through spawn-fcgi. And guess what? Yep, same problem as php-fpm, goes braindead when I use the pear mail class to send an email.
I thought maybe I should try using the latest PEAR Mail class in case the older one wasn't compatible with PHP 5.3.3. So I looked on the PEAR website, and after reading through a few of the docs saw they said you needed to download pyrus.phar, and then use that to install the PEAR modules.
So I downloaded the pyrus.phar and tried to use it as directed in the docs to download the SMTP class first. pyrus said that since it was a first run I should specify where the PEAR stuff should go. I left it at the default, but it then installed a load of folders into the main php folder. It turns out it just uses the current working directory as the default. Doh!
I looked back up at all the pyrus options that had been printed out as part of the initialization process, but didn't see anything about changing the defaults or re-initializing. Near the top though it did say
Usage:
php pyrus.phar [/path/to/pear] [options]
php pyrus.phar [/path/to/pear] [options] <command> [options] [args]
It had quite a few errors because it hadn't loaded php.ini as well. Rather than always using -c /path/to/php.ini when using php pyrus.phar, I thought I would just move php.ini to php/lib, where it is automatically looked for by default. Of course, this meant modifying the spawn-fcgi php init script as well to point to the correct location for php.ini.
Now I deleted all the new directories that had been created when I installed the SMTP module using pyrus and tried again. The usage examples show /path/to/pear as an option, but aren't explicit about what this is. Is it the pear file in php/bin, or is it /php/lib/PEAR? I thought probably the latter, so I tried that.
But this just created a new folder called php in the PEAR directory, with the Mail class inside it. This didn't seem right. I thought the correct directory to point it at must be php/lib, so that the packages would actually be installed into php/lib/php. So I deleted the new folder and tried again, but now got an error (well actually, lots of errors), which ended with
PEAR2\Pyrus\ChannelRegistry\Exception: Unable to process package name
PEAR2\Pyrus\ChannelRegistry\ParseException: Exception: corrupt registry, could not retrieve channel pear.php.net information
Googling for this error didn't come up with anything except the code that generates this error. Very helpful!
So I tried downloading pyrus.phar again. I'm not sure whether the registry is saved and updated within the pyrus.phar file or somewhere else, but I thought it was worth a try. On the PEAR2 website it says
Pyrus is a tool to manage PEAR packages. Pyrus simplifies and improves the PEAR experience.
If pyrus simplifies installing PEAR packages, it must have been an absolute nightmare to install them before!
After downloading pyrus.phar again, I still got the same errors. I tried removing the $HOME/.pear directory in case it was a problem with a file in there causing the errors, but still got the same error messages.
After restarting the VM I tried again, and pyrus now acted as if it hadn't been activated before. But I then got a load of error messages during the activation process.
I recompiled php yet again, copied the pyrus.phar again, but still the same errors. I checked the PEAR website in case the latest version of pyrus is buggy (the error messages seem to be to do with an xml parser inside pyrus). But I couldn't find any downloads of pyrus other than the latest one on the homepage.