Sunday, 31 October 2010
This morning I cut out some pogs in Photoshop, then updated my pog website. After church I watched L and Mauser play Kirby for a few minutes, then had dinner.
After dinner I watched Jetman, Masked Rider, and Kamen Rider with L.
The rest of the afternoon and evening I worked on a couple of panos I took yesterday.
The weather was overcast nearly all day today. Around sunset the sun started shining through the clouds, but I didn't think it was worth going out to take any photos since the cloud was absorbing quite a bit of the light, and I could see a bank of cloud that the sun was going to set behind.
In the end, though, the sunset was actually quite good, with quite a bit of the cloud-covered sky lit up a pale orange. After the sunset had been going on for quite a while I eventually decided that I might as well go and get some photos. Unfortunately I was a bit late, and really only caught the end of the afterglow. And I didn't have time to find a place with a nice foreground / landscape either.
Saturday, 30 October 2010
Photo taking and geocoding
This morning I woke up at 6am, but thought it was a bit early to get up, so stayed in bed a bit longer. Eventually I did get back to sleep, but not for long as my other alarm woke me up at 7am.
After having a shower I went out to photograph the sunrise. Although I got out to where I wanted to go (the same place as I went the other morning when it was too cloudy) in time for the sunrise, I needed to be about 15 minutes earlier to catch the pink / purple clouds before sunrise. There weren't a lot of clouds around so it wasn't that annoying that I'd missed the dawn though.
After taking the sunrise photos I wanted, I took some photos of the nearby trees in autumn colours (the same ones I took blurred photos of the other day). I took a pano in the field there as well.
I went on up through the next field towards East Farndon, then along the road towards the main road that goes through East Farndon. I thought that I would then be able to go across a field towards the valley on the east of East Farndon, but actually that footpath was further down the road.
Rather than walking down the road and then back up across the field I was originally planning to go through, I thought I would go up through East Farndon, then across the fields towards the valley.
So I did this, and was relieved to find that the bulls in the fields didn't chase me. When I got to the point that I had been planning to visit, I decided it wasn't actually that great for a pano. (It was a point I had walked through, probably in July or August, and thought would make a nice place for a daytime pano.)
I was surprised that the large drinking trough bowl there didn't have any water in it, given all the rain we've had lately. (I had been planning to put the tripod in the drinking trough bowl for the pano).
Anyway, I didn't think it was worth taking a pano so I didn't take one.
I then walked back across the fields to Harborough Road, and back home along the roads. I also saw someone from Church while I was out walking.
I got back home about 10am, hungry and thirsty since I hadn't had anything to eat or drink yet today. I heated up cinnamon whirls in the oven for me, Mauser, Lad, and Clare (though Clare had actually already had her breakfast).
The rest of the morning and afternoon I spent copying the images to the PC and geo-tagging them, along with some of my other recent images. Unfortunately I keep forgetting to use my GPS lately, so I had to geo-code them all manually. The altitude lookup in Robogeo hasn't been working lately either, so I also had to look up the altitude by hand for each image taken at a different location. That's why the geo-tagging took most of the morning and afternoon.
I also added a bit of other basic metadata to the images.
About 4.40pm I went out to try and take a pano of the sunset in a field I'd passed through on my morning walk. It had a really nice yellow hedge and yellow oak (I think) trees in it, but in the morning the sun was hitting the back of them.
They looked very nice backlit in the morning light, but I thought that, shooting into the sun, the sky would either be blown out or the leaves too dark. And it was windy in the morning, so exposure blending wouldn't have worked well.
Unfortunately, by the time I got to the field in the late afternoon, the sun had hidden itself behind a cloud, so there wasn't the nice warm sunlight hitting the trees and hedge that I wanted. The sun started to shine through the cloud a bit, and it had nearly set behind the hill that East Farndon sits on. So I took a pano, knowing that I might not get another chance this year to capture the trees and hedge looking so nice and yellow.
Then a few minutes later, the sun started to shine through the cloud more strongly, so I took the pano again. It wasn't perfect though - the sun was shining through a layer of cloud, and so wasn't at its strongest, and the cloud wasn't lit up that well by the setting sun either. But it was better than the first pano (or at least should be, I haven't processed it yet).
I went back into the previous field, and did a pano near a muddy puddle there. I was planning to do some twilight panos near the building work going on in the fields, but after sunset the clouds just went grey instead of purple / pink, so I didn't bother.
When I got home I copied the evening photos to my PC then had dinner.
After dinner me and L watched Kamen Rider, Masked Rider, and Birdman Rangers Jetman.
Then I geo-coded the evening's photos. I had remembered to switch on my GPS and sync the GPS and camera clocks, so geo-coding was relatively easy, though since I only took photos in 3 different locations, and two of them were virtually the same, geo-coding manually would have been about the same speed.
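As a rough illustration of how this clock-synced geo-coding works (just a sketch, not what Robogeo actually does - the function and data layout here are made up): each photo's timestamp is matched against the GPS track log, interpolating between the two nearest track points.

```javascript
// Hypothetical sketch of timestamp-based geo-coding: with the GPS and camera
// clocks in sync, a photo's position can be estimated by linear interpolation
// between the two track points whose timestamps bracket the photo's timestamp.
function interpolatePosition(trackPoints, photoTime) {
  // trackPoints: array of {time, lat, lng}, sorted by time
  for (let i = 0; i < trackPoints.length - 1; i++) {
    const a = trackPoints[i], b = trackPoints[i + 1];
    if (photoTime >= a.time && photoTime <= b.time) {
      // Fraction of the way from point a to point b
      const t = (photoTime - a.time) / (b.time - a.time);
      return {
        lat: a.lat + t * (b.lat - a.lat),
        lng: a.lng + t * (b.lng - a.lng)
      };
    }
  }
  return null; // photo taken outside the logged track
}
```

A photo taken halfway between two track points lands halfway between their positions; photos taken before the log started or after it ended get no position and have to be placed by hand.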
While I waited for the images to geocode I checked my email, Nikonrumors, and Canonrumors. I changed my watch and alarm to account for the DST change tonight, did a backup, wrote this blog post, then went to bed.
Friday, 29 October 2010
Google Earthing
This morning I was still working on Google Earth / KML stuff. I spent quite a while trying to find out why Google Earth wasn't loading my external JavaScript files, only to find out that it was. Strangely, while Fiddler intercepts Google Earth's image and KML requests, it doesn't seem to intercept the requests for JavaScript files.
My guess would be that Fiddler can't intercept requests made through Google Earth's internal WebKit browser, which is what is used to display the info bubbles.
Then I spent a lot of time trying to get things working. One of the problems was that I needed to load the CSS <link>s from an HTML page that I was fetching using jQuery's AJAX function. I set the dataType to 'html' in the AJAX call, but the data object that jQuery creates from the response only contained the contents of the <body> of the fetched HTML page.
So there was no way for me to get the <link>s from the <head> of the fetched page, as jQuery (or WebKit) had stripped the <head> away.
So I tried using a dataType of 'xml' for the AJAX request instead. I had to make the requested page send an XML content type header as well to get that to work. Now the data object that jQuery created from the response did include the full document, so I could get the <link>s from the <head> as well as the contents of the <body>.
However, when I used jQuery to get the <link>s from the XML and appended them to the <head> of the page I wanted to manipulate, they had no effect. I spent quite a while trying to figure out the problem.
Eventually I found that if I created a jQuery object from each <link> (to convert it to an HTML element), then took its outerHTML and appended that to the page using jQuery, it would work.
E.g.
//Doesn't work
$(data).find('head > link').appendTo($('head'));
//Does work
$(data).find('head > link').each(function(){$('head').append(this.outerHTML);});
But then I found that the CSS wasn't being applied to a list (<ul>) that I was appending to the <body>. Again, I spent quite a while trying to find the problem, and it was similar to the one I had with the <link>s not working.
// The HTML appears to be appended, but you can't access the appended HTML with jQuery and the CSS rules aren't applied to it either
$('#contentContainer').append($(data).find('body > div'));
//Works properly
$('#contentContainer').append($(data).find('body > div')[0].outerHTML);
Then I also had lots of anti-fun trying to find out how to get Google Earth to zoom in to a placemark and then open that placemark's info bubble. Unfortunately, it seems that this is currently impossible. For the moment I think I'll go with opening the info bubble and zooming to the placemark at the same time (which is possible).
This looks horrible, as the info bubble keeps resizing and fills up most of the screen while Google Earth is zooming in to the placemark. But if there's no better option, what can you do?
With most of the changes I've made to try and make browsing my data in Google Earth a similar experience to browsing in Google Maps, I will have broken the Google Maps implementation. So when I have the Google Earth implementation working as well as I can, I'll have to go back and fix the Google Maps implementation.
About 4.40pm today the sun started to come out (it had been overcast all day, just like yesterday), so I quickly grabbed my photo stuff and went out to try and get some nice sunset photos. I figured that the large amount of cloud around should look very nice when lit up from underneath by the setting sun.
But about 5 minutes after I'd left, the sun went back behind the clouds. I hung around until about 5.30pm, when I figured that the sun probably wasn't going to come out again, and so there'd be no sunset, and then went back home. I did take some blurry leaf photos while I was waiting for the sun though.
Also in the evening, I watched an episode of Masked Rider and Birdman Rangers Jetman with Belly.
Thursday, 28 October 2010
Google Map and Earthing
This morning I went out to photograph the sunrise, but unfortunately it was rubbish. I waited until about 8.30am to see what would happen when the sun rose above the cloud bank it was behind but the answer was just that the clouds got a bit brighter. The light on the earth was still very diffused and boring.
The rest of the morning I was working on making a custom marker for Google Maps and getting it working there. I made a number of different coloured versions, each colour for a different range of images: e.g. a red marker meant one image at that point, purple meant 2-10 images, blue meant 11-25 images, etc. I'm not sure whether this is actually a good idea or not.
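The colour bucketing could be sketched like this (a hypothetical function for illustration - the ranges for red, purple, and blue are the ones mentioned above, while the final catch-all colour is just an assumption):

```javascript
// Sketch of the marker-colour scheme: each colour stands for a range of image
// counts at a point. Red / purple / blue thresholds are from the post; the
// 'green' fallback for larger clusters is an invented example value.
function markerColour(imageCount) {
  if (imageCount === 1) return 'red';     // one image at this point
  if (imageCount <= 10) return 'purple';  // 2-10 images
  if (imageCount <= 25) return 'blue';    // 11-25 images
  return 'green';                         // hypothetical colour for 26+
}
```

The marker icon URL for a point would then be picked from the colour this returns.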
Most of the afternoon and part of the evening I was trying to fix my Google Earth KML so it would work the same as the Google Maps version. This involved using JavaScript in Google Earth, which is quite difficult since I don't think Google Earth has a debugger. It seems to have some weird bugs as well, e.g. where DOMAIN is a variable containing the site domain (domain.com):
var WWW='http://www'+DOMAIN; // Doesn't work
var WWW='http://www'; // Doesn't work
var WWW='http://'+'www'+DOMAIN; // Does work
I haven't got very far with it yet though.
Wednesday, 27 October 2010
Geo-coding
Most of today I was still going through the Shropshire photos from last year and correcting incorrectly geo-coded ones.
In the late afternoon I went out to photograph the sunset, which was quite nice.
In the evening I watched an episode of The Masked Rider and Birdman Rangers Jetman, and the final two episodes of Mighty Morphin Power Rangers season 3 with L. After that I finished watching 2001: A Space Odyssey with Mauser and Biddles.
Tuesday, 26 October 2010
Geo-coding
Most of today I was checking and updating the geo-coding of various photos. The majority of the photos I took in Leominster in July last year were incorrectly geo-coded, so I also checked some of the other ones from Shropshire. I found a batch that were geo-coded correctly but missing altitude data; unfortunately, the altitude lookup service Robogeo uses seemed to be down for most of the day.
As well as correcting the geocoding of the actual images, I also had to update the image data on my website's database, so it took a long time to do each image.
In the evening I watched episodes of Power Rangers, Masked Rider, and Birdman Rangers Jetman with L. I also watched quite a bit of 2001: A Space Odyssey with Mauser and Bo.
Monday, 25 October 2010
broke stuff
This morning I checked my email and then updated my pano website with a sunrise pano I took a few days ago.
After that I tried to find out why the Amazon Machine Tags WordPress plugin wasn't working for me. I found out what the problem was, but not what the cause was or how to fix it, so I posted to the WordPress forum for the plugin to try and get some help.
Next I noticed that the eBay listings widget for my pog website wasn't working. I spent a while trying to debug it, but couldn't see what the problem was, so I posted to the eBay developers forum to try and get some help with that.
After that I did some work on my geo clustering script, then spent quite a long time trying to test the different versions I've made. The problem is that I have 12 versions, each takes between 4 and 30 seconds to run, and my test script runs each one 5 times, so I kept getting various timeout errors.
Eventually I got it working by calling the script via the PHP command-line interface instead of through the browser.
In the afternoon I spent most of my time trying to improve the query that I use to retrieve the clusters from the database.
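For illustration, one common approach to geo clustering is a fixed grid: points falling in the same grid cell get merged into a single cluster with a count. This is only a sketch of the general idea, not the actual script or database query used here:

```javascript
// Hedged sketch of grid-based geo clustering (not the real implementation):
// bucket each point into a lat/lng grid cell, then emit one cluster per cell,
// placed at the centroid of its points, with the number of points it holds.
function clusterByGrid(points, cellSize) {
  // points: array of {lat, lng}; cellSize: cell width/height in degrees
  const cells = new Map();
  for (const p of points) {
    const key = Math.floor(p.lat / cellSize) + ':' + Math.floor(p.lng / cellSize);
    if (!cells.has(key)) cells.set(key, { latSum: 0, lngSum: 0, count: 0 });
    const c = cells.get(key);
    c.latSum += p.lat;
    c.lngSum += p.lng;
    c.count++;
  }
  // One marker per occupied cell, at the mean position of its points
  return Array.from(cells.values()).map(c => ({
    lat: c.latSum / c.count,
    lng: c.lngSum / c.count,
    count: c.count
  }));
}
```

The count on each cluster is what a colour-coded marker scheme would then key off, and the per-cell aggregation is the kind of thing that can also be pushed down into a database query.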
In the evening I watched Police Story, and episodes of Masked Rider, Power Rangers, and Birdman Rangers Jetman with L and Mauser.
Sunday, 24 October 2010
Watching films and baking
This morning I cut out some pogs in Photoshop, then went to church. After church I updated my pog website.
After dinner I finished watching the Japanese Celine and Julie Go Boating (雨月物語, Ugetsu Monogatari) with Mauser and Bo.
Me and Bo made some cinnamon swirls, which took all afternoon. It takes so long because you have to let them rise (in the warmed oven) for 20 minutes, and then cook for 10 minutes. We can only fit two trays at a time in our oven, so doing about eight trays worth takes ages. Still, they taste nice and should last most of the week.
In the evening I watched Tokyo Drifter with Mauser, then I checked my email.
Friday, 22 October 2010
Processing and uploading photos
This morning I went out to try and photograph the sunrise again. Unfortunately, while it looked like the sunrise hadn't started yet from my bedroom window, when I actually got outside I could see the sunrise was already well underway, and I'd missed most of it.
I spent most of the rest of the day uploading photos from my walk a couple of weeks ago, and also processing and then uploading the photos I took yesterday and today.
In the evening I watched Arsenal with Mauser and Bo. It was well directed, and I thought the music went well with the film, but the story didn't make much sense to me.
Thursday, 21 October 2010
Moving files
This morning I went out on a short walk round the nearby field to try and get some sunrise photos. It was a nice sunrise, but the clouds were a bit far away. Really could have done with more clouds.
When I got back home I uploaded a few photos to my photo website, but then my Ubuntu VM kept complaining that it only had 1.7GB of free space.
So when the photos had finished uploading and the website had processed them, I decided to expand the VM's virtual hard drive. I loaded up the VMWare Server 2 web interface, but when I tried to expand the drive size I got the error "Insufficient disk space on datastore ''". I had enough free space on the drive where the datastore was located, but only about 5-10GB, so I thought that maybe it needed some extra spare capacity for expanding the virtual drive, or maybe it creates a new virtual drive and then deletes the old one.
So I went through some folders deleting old stuff I didn't need any more, then tried expanding the virtual hard drive in VMWare Server 2 again. But I still got the same "Insufficient disk space on datastore ''" message. I did some googling and found lots of info, but it all seemed to be about VMWare ESX, and the solutions didn't seem applicable to my situation.
I checked what the VMWare Server 2 Web Interface said about the free space level of the datastore, and while it was over 9000 (MB), it was less than the actual free space on the drive. I refreshed the page, but it still reported the same amount of free space.
So I restarted the PC, and then the VMWare Server 2 Web Interface finally reported the correct amount of free space for the datastore. I tried expanding the virtual hard drive again, thinking that maybe it hadn't worked previously because VMWare wasn't detecting the amount of free space correctly, but I still got the same "Insufficient disk space on datastore ''" message.
In the afternoon I finished adding metadata to all the photos that I had taken a couple of weeks ago, and then moved them across from my 'Needs sorting' folder, which is on the same drive as the VMWare datastore, to my 'Pictures' folder, which is on a different drive.
When that was done I did a backup and checked my email.
When doing the backup, I noticed that one of the backup drives had much less free space than the drive in the PC that it was backing up, even though it was meant to be a mirror backup. The Recycle bin said it was empty, but when I checked the folder sizes on the backup drive and the computer drive, I found they were both the same.
So it was as if there were some hidden files on the backup drive filling it up. It turned out there were - it seems that the Recycle Bin on the computer doesn't register drives plugged in after you've already booted, so while there were files in the Recycle Bin on the backup drive, you couldn't see them.
The solution was to either:
- In Windows Explorer go to Tools > Folder Options, and untick 'Hide protected operating system files (Recommended)' on the View tab. A hidden Recycle Bin folder then appears in the disk root, which you can open and delete the contents of.
- Or right-click on the drive in Windows Explorer, choose 'Properties', and on the 'General' tab use the 'Disk Cleanup' button to empty the Recycle Bin.
But when I analysed the drive, it showed 0% fragmentation. So both disks have the same amount of fragmentation, the same amount of data, and the same size, but differing amounts of free space. Weird.
When the backups were complete I tried expanding the virtual hard drive for my Ubuntu VM again. This time it worked, though it took absolutely ages. I think VMWare Server 2 must require free space at least equal to the new size of the resized disk before it will let you resize it. That would mean that if you had a VM with a 100GB drive and wanted to resize it to 101GB, you would need at least 101GB of free space, even though you're only increasing the disk size by 1GB.
When the drive resize was finally finished, I then had to move the VM partitions about and extend the main partition using gparted. Again, this took quite a while to do.
When that was done I wrote up this long, boring, useless blog post.
In the evening I went out to try and get some photos of the sunset. It was a nice fiery red sunset, but unfortunately the area of the clouds being lit up was rather limited and quite far away.
After dinner I watched an episode of Power Rangers, two Masked Rider episodes, and one Chojin Sentai Jetman episode with Belly.
After that I watched Autumn Watch with Clare and Brian (and Belly watched the first half too). Then I uploaded some more photos from my walk two weeks ago and geo-coded the photos from today.
When I got back home I uploaded a few photos to my photo website, but then my Ubuntu VM kept complaining that it only had 1.7GB of free space.
So when the photos had finished uploading and the website had processed them, I decided to expand the VM's virtual hard drive. I loaded up the VMWare Server 2 web interface, but when I tried to expand the drive size I got the error
Insufficient disk space on datastore ''.I had enough free space on the drive where the datastore was located, but only about 5-10GB. I thought that maybe it needed some extra spare capacity for expanding the virtual drive, or maybe it creates a new virtual drive and then deletes the old one.
So I went through some folders deleting old stuff I didn't need any more, then tried expanding the virtual hard drive in VMWare Server 2 again. But I still got the same message about
Insufficient disk space on datastore ''.. I did some googling and found lots of info, but they all seemed to be about VMWare ESX, and the solutions didn't seem to be applicable to my situation.
I checked what the VMWare Server 2 Web Interface said about the free space level of the datastore, and while it was over 9000 (MB), it was less than the actual free space on the drive. I refreshed the page, but it still reported the same amount of free space.
So I restarted the PC, and then the VMWare Server 2 Web Interface finally reported the correct amount of free space for the datastore. I tried expanding the virtual hard drive again, thinking that maybe it hadn't worked previously because VMWare wasn't detecting the amount of free space correctly, but I still got the same message: Insufficient disk space on datastore ''.
In the afternoon I finished adding metadata to all the photos that I had taken a couple of weeks ago, and then moved them across from my 'Needs sorting' folder, which is on the same drive as the VMWare datastore, to my 'Pictures' folder, which is on a different drive.
When that was done I did a backup and checked my email.
When doing the backup, I noticed that one of the backup drives had much less free space than the drive in the PC that it was backing up, even though it was meant to be a mirror backup. The Recycle bin said it was empty, but when I checked the folder sizes on the backup drive and the computer drive, I found they were both the same.
So it was like there were some hidden files on the backup drive that were filling it up. It turned out there were - it seems that the Recycle bin on the computer doesn't register drives plugged in after you've already booted. So while there were files in the Recycle bin on the backup drive, you couldn't see them.
The solution was to either
- In Windows Explorer go to Tools > Folder Options, and then untick 'Hide protected operating system files (Recommended)' on the view tab. On the disk root there now appears a hidden Recycle bin folder that you can access and delete its contents.
- Otherwise, if you right-click on the drive in Windows Explorer and choose 'Properties', and then on the 'General' tab there is a 'Disk Cleanup' button you can use to empty the recycle bin.
I wondered if fragmentation might explain the difference, but when I analysed the drive it showed 0% fragmentation. So both disks had the same amount of fragmentation, the same amount of data, and the same size, but differing amounts of free space. Weird.
When the backups were complete I tried expanding the virtual hard drive for my Ubuntu VM again. This time it worked, though it took absolutely ages. I think VMWare Server 2 must require free space at least equal to the new size of the resized disk before it will let you resize it. That would mean that if you had a VM with a 100GB drive and wanted to resize it to 101GB, you would need at least 101GB of free space, even though you're only increasing the disk size by 1GB.
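My guess above about the free-space requirement amounts to this check (a sketch of my assumption about VMWare's behaviour, not anything documented):

```shell
# Guessed rule: a resize is only allowed if the datastore's free space
# is at least the disk's NEW size, not just the amount being added.
# Sizes are in GB.
can_resize() {
    new_size=$1
    free_space=$2
    [ "$free_space" -ge "$new_size" ]
}

can_resize 101 50 && echo "resize allowed" || echo "insufficient space"   # only adding 1GB, but 50GB free < 101GB
can_resize 101 110 && echo "resize allowed" || echo "insufficient space"  # 110GB free >= 101GB
```

So under this rule, growing a 100GB disk by just 1GB still fails unless the datastore has a full 101GB spare.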
When the drive resize was finally finished, I then had to move the VM partitions about and extend the main partition using gparted. Again, this took quite a while to do.
When that was done I wrote up this long, boring, useless blog post.
In the evening I went out to try and get some photos of the sunset. It was a nice firey red sunset, but unfortunately the area of the clouds being lit up was rather limited and quite far away.
After dinner I watched an episode of Power Rangers, two Masked Rider episodes, and one Chojin Sentai Jetman episode with Belly.
After that I watched Autumn Watch with Clare and Brian (and Belly watched the first half too). Then I uploaded some more photos from my walk two weeks ago and geo-coded the photos from today.
Tuesday, 19 October 2010
Ebay APIing and power cut
Today I was just doing more work on my ebay ad. After looking at my site logs I decided that auto generating the listings for all ebay sites was a bit pointless as I don't really get many visitors from, say, the Philippines. So generating a new listings ad for the Philippines every 10 minutes would be pointless.
So I changed my code to instead just check the cache on every page request, and if the cache is stale or empty, then get the listings and cache them. Although it means checking the cache date on every page request, and if the cache is stale, the user will have to wait longer for the page to load while the new listings are fetched, I think it is better than auto generating listings that aren't needed.
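The check-the-cache-on-each-request idea can be sketched like this (a shell stand-in for the PHP; the cache path, 10-minute threshold, and fetch function are all made up for illustration):

```shell
# Regenerate the cached listings only when the cache file is missing or
# older than 10 minutes; otherwise serve the cached copy. The PHP on the
# site does the equivalent timestamp check on each page request.
CACHE="/tmp/ebay-listings-cache.$$"

fetch_listings() {
    # Stand-in for the real ebay API call
    echo "<ul><li>example listing</li></ul>"
}

get_listings() {
    # `find -mmin -10` prints the file only if it was modified less than
    # 10 minutes ago, i.e. the cache is still fresh
    if [ ! -f "$CACHE" ] || [ -z "$(find "$CACHE" -mmin -10 2>/dev/null)" ]; then
        fetch_listings > "$CACHE"
    fi
    cat "$CACHE"
}

get_listings
```

The trade-off is exactly as described above: most requests just pay for a cheap freshness check, and only the unlucky request after the cache goes stale waits for the fetch.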
In the morning we had a power cut for about an hour, so I couldn't do any work until the power came back. Strangely, HFM was out (playing music with no DJ) for about another hour. Power cuts are also annoying as all the burglar alarms go off.
Later in the afternoon I was looking at adding an ebay logo to my listings ad, but ebay have quite a lot of restrictions on the use of the logo. Strangely, on the logo use page it says that you must use a logo, but it doesn't seem to say this anywhere else in their docs. And the ebay API developers example scripts don't seem to include the mandatory logo and other text either.
Possibly ebay don't actually care if you include the logo and all the other text they say you must use when displaying data retrieved using the API. But it's better to be safe than sorry - you wouldn't want to earn lots of affiliate commissions from your ebay widget, and then ebay say they won't pay you because you resized the logo or missed out the copyright text.
So I emailed ebay using the contact form on their website to check if my usage of their logo and omission of other mandatory stuff was okay. Rather worryingly, I got a javascript error message pop up when I pressed submit, but the actual webpage said the email had been sent successfully. So I don't actually know if the message did get sent or not.
In the evening I watched an episode of Chojin Sentai Jetman and Masked Rider with Belly. After that I processed a few photos.
Monday, 18 October 2010
Ebay APIing
This morning I was still working on getting php-fpm up and running how I wanted. When I thought that it seemed to be working okay I checked the error logs for my website in case there were any immediate problems that needed dealing with before I tried activating the new installs of nginx and php on the webserver.
When I'd finished checking through the latest error logs it was lunch time. Most of the errors I saw in the logs were to do with url encoding. I don't encode most characters in my urls as all browsers I've tested do this (with most characters) automatically. Possibly IE < 6 and bots might have problems.
This isn't a problem for most people anyway, so I don't consider it a priority to fix. Indeed, bots not being able to rip your site could be considered advantageous.
After lunch I activated the new installs (well, actually about a week or so old now) of nginx and php on the webserver, and checked my websites worked okay.
When I was satisfied the websites seemed to be working okay I checked that the country was being checked from the IP address (which was the whole point of installing the new versions of nginx and php). Unfortunately it seemed that my IP address was being recorded as 127.0.0.1.
I wondered what the problem was until I realised I was checking phpinfo() on my local dev site instead of the live site. When I checked the live site I was relieved to see it had my outside IP address and the country recorded as 'GB'.
The rest of the afternoon I was working on the ebay listing code for my site.
In the evening I watched an episode of Chojin Sentai Jetman, Masked Rider, and Power Rangers with Diddleberry.
After that I did some more ebay API work. So far I have got it retrieving the ebay listings from all ebay sites and caching the html. And it is including the cached html in the page okay. I still need to style the listing and ping the ebay site using an Impression Pixel.
The weather today started off with a nice red cloud sunrise, then the rest of the day was overcast.
Sunday, 17 October 2010
Pogging and regexing
This morning I started cutting out some pogs in photoshop, then went to Church. After church I did more pogging until dinner time. After dinner I finished updating my pog website, which took quite a while.
I got up to date on Moose Peterson's blog, then spent quite a bit of the afternoon and evening trying to get sed and awk to first get the pids of a command, and then get a command by pid (using the output of ps).
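For the record, the sort of thing I was trying to get awk to do looks like this (using a canned sample in the shape of `ps -eo pid,comm` output so the example is self-contained; the process names are made up):

```shell
# Sample output in the shape of `ps -eo pid,comm`
sample="  101 nginx
  202 php-fpm
  303 nginx"

# All PIDs of a command, looked up by name (field 2 -> field 1)
pids_of() { echo "$sample" | awk -v c="$1" '$2 == c { print $1 }'; }

# The command name for a given PID (field 1 -> field 2)
cmd_of() { echo "$sample" | awk -v p="$1" '$1 == p { print $2 }'; }

pids_of nginx    # prints 101 and 303
cmd_of 202       # prints php-fpm
```

On a live system you would pipe `ps -eo pid,comm` straight into the awk instead of using the canned sample.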
I also watched Once Upon A Time in China with Mauser and Bo, and a couple of Power Rangers episodes with Bo.
Saturday, 16 October 2010
Various stuff
This morning I was still trying to get the PEAR Mail class installed. With pyrus.phar refusing to work, I tried using
php/bin/pear install Mail-1.2.0
, but this gave an error about not being able to write to php_dir. Weirdly, I just tried it again now as
php/bin/pear install --alldeps Mail
and it worked fine. It definitely was not working this morning.
After trying a few things to get pear working, I gave up and decided to download the packages from the PEAR website and see if it was something that needed to be installed using pear or pyrus.phar, or just some files I could put in the right place manually. Thankfully, it was the latter, so I downloaded and unzipped the packages I needed, then placed the PHP files in the correct place (php/lib/php).
With php/lib/php included in the include setting for the site in php.ini, I loaded up the website and tested the contact form. It worked and PHP didn't break!!!
When manually downloading the PEAR packages I needed, I noticed that they actually have a GEO_IP class. Since I'm only updating nginx and php to get this ability, I could have just installed that module and saved all the work I did last week trying to get nginx and php installed and working properly. But then I would likely still have had the same problems when I try to update nginx / php sometime in the future, so it wasn't a week lost for no reason.
After getting that working I watched an episode of The Masked Rider with L and Mauser, and then topped up the garden pond.
After lunch I processed my Fuji IS-Pro files using CornerFix, and it worked well. It took quite a while to process them though.
When that was done I checked whether the problem with php breaking was due to the old PEAR packages or their location. I copied the old PEAR files from /usr/local/lib/php to php/lib/php, and tried the contact form on the website, and it still worked okay. So the problem was with the location, though I'm not sure how the files being in a folder that php doesn't have permission to write to would make php break.
In the evening I watched an episode of Power Rangers with L, and the rest of Even Dwarves Started Small with Mauser and L. I processed some more photos as well.
Friday, 15 October 2010
Getting annoyed with php not working
Today I was still doing website work, trying to get the user's country from their IP address. I thought that I finally had everything working with my latest installs of nginx and php, and so decided to try activating them on the live server (I'd already installed them on there).
Unfortunately the CentOS init script for nginx wouldn't start it as it already thought nginx was running (the old installation was still running). So I had to deactivate the old one (thus making my websites inaccessible) before I could even test whether the new installation would work okay.
After finally getting them (nginx and php) up and working on the web server, I tried my websites and they loaded okay. But when I tried to send myself a message using the contact form on one of the websites, it made php stop responding. So when I realised this I had to make the changes to go back to the old nginx and php. So my websites were down for a few minutes.
Next I spent a while trying to find what the problem was. I found that the problem only occurred when the send function of the PEAR Mail class was used. So I spent a long time trying to work out why that would cause php to crash and become unresponsive.
I thought maybe the problem was php-fpm, so I tried to use spawn-fcgi to start php instead. But the php-cgi binary didn't exist.
So I tried re-installing PHP using --enable-fastcgi but got a message that it didn't recognise that option. Next I tried --enable-cgi (as well as --enable-fpm). configure didn't give any complaints, but when I had finished installing it (which takes quite a while), I found that it hadn't created the php-cgi binary. So I tried again, but this time without --enable-fpm, and it built with the php-cgi binary okay.
After modifying the php spawn-fcgi init script to work with the new installations of spawn-fcgi and php, and a few hiccups, I got php up and running through spawn-fcgi. And guess what? Yep, same problem as php-fpm, goes braindead when I use the pear mail class to send an email.
I thought maybe I should try using the latest PEAR Mail class in case the older one wasn't compatible with PHP 5.3.3. So I looked on the PEAR website, and after reading through a few of the docs saw they said you needed to download pyrus.phar, and then use that to install the PEAR modules.
So I downloaded the pyrus.phar and tried to use it as directed in the docs to download the SMTP class first. pyrus said that since it was a first run I should specify where the PEAR stuff should go. I left it at the default, but it then installed a load of folders into the main php folder. Turns out it just specifies the current working directory as the default. Doh!
I looked back up at all the pyrus options that had been printed out as part of the initialization process, but didn't see anything about changing the defaults or re-initializing. Near the top though it did say
Usage:
php pyrus.phar [/path/to/pear] [options]
php pyrus.phar [/path/to/pear] [options] <command> [options] [args]
It had quite a few errors because it hadn't loaded php.ini as well. Rather than always using -c /path/to/php.ini when running php pyrus.phar, I thought I would just move php.ini to php/lib, where it is automatically looked for by default. Of course, this meant modifying the spawn-fcgi php init script as well to point to the correct location for php.ini.
Now I deleted all the new directories that had been created when I installed the SMTP module using pyrus and tried again. The usage examples show /path/to/pear as an option, but aren't explicit about what this is. Is it the pear file in php/bin, or is it /php/lib/PEAR? I thought probably the latter, so tried that.
But this just created a new folder in the PEAR directory called php, and then in that was the Mail class. This didn't seem right. I reasoned that the correct directory to specify must be php/lib, since then it would actually be installed into php/lib/php. So I deleted the new folder and tried again, but now got an error (well actually, lots of errors), which ended with
PEAR2\Pyrus\ChannelRegistry\Exception: Unable to process package name
PEAR2\Pyrus\ChannelRegistry\ParseException: Exception: corrupt registry, could not retrieve channel pear.php.net information
Googling for this error didn't come up with anything except the code that generates it. Very helpful!
So I tried downloading pyrus.phar again. I'm not sure whether the registry is saved and updated within the pyrus.phar file or somewhere else, but I thought it was worth a try. On the PEAR2 website it says
Pyrus is a tool to manage PEAR packages. Pyrus simplifies and improves the PEAR experience.
If pyrus simplifies installing PEAR packages, it must have been an absolute nightmare to install them before!
After downloading pyrus.phar again, I still got the same errors. I tried removing the $HOME/.pear directory in case it was a problem with a file in there causing the errors, but still got the same error messages.
After restarting the VM I tried again, and pyrus now acted as if it hadn't been activated before. But I then got a load of error messages during the activation process.
I recompiled php yet again, copied the pyrus.phar again, but still the same errors. I checked the PEAR website in case the latest version of pyrus is buggy (the error messages seem to be to do with an xml parser inside pyrus). But I couldn't find any downloads of pyrus other than the latest one on the homepage.
Thursday, 14 October 2010
Websiting
This morning I checked my email, which I had quite a bit of due to not being able to check it most of yesterday.
After that I looked into fixing the cyan shift caused by the UV IR filter, and found a program called CornerFix. Unfortunately it didn't recognise my camera and so didn't work, so I sent a message to the developer hoping they can update it.
I did some more website work, still looking at getting nginx and php-fpm up and running so that I can get the country a user comes from (using the geo-ip module for nginx).
In the afternoon I went to help someone with their computer for a bit, then did some more website work. I was having trouble with the PERL5LIB environment variable not being loaded into my bash session on Ubuntu, despite it being specified in my ~/.bash_profile. After trying a few things I gave google a try and found this helpful thread: Do I put PATH in bash_profile or bashrc or both?.
It turns out the problem is that in Ubuntu, when you open a terminal session it loads ~/.bashrc and not ~/.bash_profile. So to fix this, in the terminal go to 'Edit' and then 'Profiles...'. In the window that opens click on 'Edit', then in the new window that opens go to the 'Title and Command' tab. Under the 'Command' section of this tab tick 'Run command as login shell'. Close the two windows and close the terminal. Now if you open the terminal again, it should load ~/.bash_profile.
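An alternative to the GUI route is to have ~/.bashrc source ~/.bash_profile itself. A minimal sketch, using a temporary file so it doesn't touch real dotfiles (the PERL5LIB value here is just an example):

```shell
# Create a stand-in ~/.bash_profile that exports the variable
profile=$(mktemp)
echo 'export PERL5LIB=/usr/local/lib/perl5' > "$profile"

# This is the line you would add to ~/.bashrc (with ~/.bash_profile
# in place of $profile): source the profile if it exists
[ -f "$profile" ] && . "$profile"

echo "$PERL5LIB"    # prints /usr/local/lib/perl5
rm -f "$profile"
```

The advantage is that it works in any terminal emulator, not just the one whose profile you edited.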
When L got home from school we started watching Batman and Robin, then finished watching it after dinner. In the evening I did some more website stuff. First I was trying to figure out how to (let nginx) log serious php errors and exceptions without displaying error messages on screen, then I found that I had multiple php processes running thanks to my cron job.
The cron job just tries to start the nginx, mysql, and php processes every 15 minutes or so, expecting that if they are already running they will just say so and won't start a new instance. Unfortunately it seems that the php-fpm init script that ships with php doesn't include this logic (why should it?) and so I ended up with new php processes being started each time the cron job called the php-fpm init script.
So I modified the php-fpm init script to only start a new process if the php-fpm pid file doesn't exist. Not a great solution - if the server or php crashed, it would probably leave the pid file behind, and then the edited script would think php was running when it isn't. I will try and improve it tomorrow.
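A slightly more robust check than "does the pid file exist" is to also test whether the pid recorded in it belongs to a live process, e.g. with kill -0. A sketch (the pid file path is made up, and 99999999 stands in for a stale pid from a crashed process):

```shell
PIDFILE="/tmp/php-fpm-example.pid.$$"

is_running() {
    [ -f "$PIDFILE" ] || return 1       # no pid file at all: not running
    pid=$(cat "$PIDFILE")
    kill -0 "$pid" 2>/dev/null          # fails if the pid is dead or stale
}

# Simulate a stale pid file left behind by a crash
echo 99999999 > "$PIDFILE"
is_running && echo "running" || echo "stale or not running"
rm -f "$PIDFILE"
```

With this, the cron job could safely call start every time: a leftover pid file from a crash no longer blocks the restart.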
Wednesday, 13 October 2010
No internet means I can't do anything
Today I was doing more website stuff, trying to improve my nginx site configs. In the afternoon and all evening the internet was broken so I couldn't really do anything.
In the evening I watched an episode of Chojin Sentai Jetman with L, the end of Last of the Mohicans with L and Mauser, and part of Even Dwarves started small with Mauser and Bo.
The weather was mainly cloudy, but then the cloud cleared later in the day.
Tuesday, 12 October 2010
configuring and making and make installing
Today I processed a few photos, but was mainly compiling php and its dependencies (all this just so I can display an ebay ad on my website!!).
The weather was cloudy most of the day, then later in the afternoon the sun started to shine through the clouds. It set behind the clouds though so there wasn't much of a sunset to see.
Monday, 11 October 2010
Various stuff
I spent all morning and the first part of this afternoon doing website stuff. I was trying to install the latest version of nginx with the geo-ip module on the web server. But when I ran my shell script to do this, it had an error that it couldn't copy (backup) the old nginx installation. And then the new one didn't work!
So I was left without a working web server for probably 5 - 10 minutes while I tried to compile a working nginx (I couldn't just switch back to the previous one since the copy had failed, and so the new version had overwritten it). After some googling I found that I just needed to put set -e near the top of my shell script, which makes it exit when there is an error instead of continuing on with the next command.
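What set -e changes can be seen in a one-line experiment - without it a failing command is just skipped past, with it the script stops dead (the failing cp is deliberate):

```shell
# Without set -e, the script carries on past the failed cp
without=$(sh -c 'cp /nonexistent /tmp 2>/dev/null; echo carried on')

# With set -e, the shell exits at the failed cp, so the echo never runs
with=$(sh -c 'set -e; cp /nonexistent /tmp 2>/dev/null; echo carried on')

echo "without set -e: $without"
echo "with set -e: $with"
```

In an install script this is exactly what you want: if the backup copy fails, stop there rather than ploughing on and overwriting the only working installation.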
The problem causing my new installation of nginx not to work was that it couldn't find the GeoIP library. I couldn't work out why I was getting the error, so I posted to the nginx forums / mailing list to try and get some help.
After that I tried to find why my awstats hadn't been updating for about a month. I found it was because it couldn't find the Geo-IP perl module. After some googling I found how to make the PERL5LIB environment variable (which I had added the location of the Geo-IP perl module to) persist, so that awstats could find the Geo-IP perl module okay. Now I just needed to update the awstats with all the log files it had missed while it wasn't working.
This post - Overriding the AWStats LogFile Configuration Option - came in very handy, and I used the shell example there except with two loops - one to loop through my site configs, and then an inner loop to loop through the dated log files for that site. Loads quicker than having to manually update awstats with the log files for each week per site separately.
For the rest of the afternoon I went on a walk and processed / sorted some photos.
In the evening I watched a Christmas episode of Power Rangers with L, and Batman Returns with L and Mauser.
So I was left without a working web server for probably 5 - 10 minutes while I tried to compile a working nginx (I couldn't just switch back to the previous one since the copy had failed, and so the new version had overwritten it). After some googling I found that I just needed to put
set -e
near the top of my shell script, which makes it exit when there is an error instead of continuing on with the next command.

The problem causing my new installation of nginx not to work was that it couldn't find the GeoIP library. I couldn't work out why I was getting the error, so I posted to the nginx forums / mailing list to try and get some help.
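The difference `set -e` makes can be shown in miniature. In an upgrade script, a failed backup step would otherwise be silently ignored and the install step would still clobber the working binary; with `set -e` the script stops at the first failing command:

```shell
#!/bin/sh
# Minimal demonstration of what `set -e` buys you: without it a failed
# command (here, `false` standing in for a failed backup copy) is
# ignored and the next step still runs; with it, the script exits
# before reaching that step.

run_without_e() {
    sh -c 'false; echo "kept going"'          # failure ignored
}

run_with_e() {
    sh -c 'set -e; false; echo "kept going"'  # exits before the echo
}

run_without_e         # prints "kept going"
run_with_e || true    # prints nothing
```

In the real upgrade script, `false` would be the `cp` backing up the old nginx installation, so a failed backup would halt the script before `make install` could overwrite anything.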
After that I tried to find why my awstats hadn't been updating for about a month. I found it was because it couldn't find the Geo-IP perl module. After some googling I found how to make the PERL5LIB environment variable (which I had added the location of the Geo-IP perl module to) persist, so that awstats could find the Geo-IP perl module okay. Now I just needed to update the awstats with all the log files it had missed while it wasn't working.
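The reason an exported PERL5LIB didn't "stick" for awstats is that cron jobs don't read an interactive shell's profile. The usual fixes are to define the variable in the crontab itself, or to set it inline on the command. The inline form can be demonstrated directly (the module path below is an assumption, not the real one):

```shell
#!/bin/sh
# An environment variable set inline on a command is visible to that
# command (and its children) without touching any profile, which is
# exactly what a cron-driven awstats run needs.
out=$(PERL5LIB=/usr/local/lib/perl5/site_perl sh -c 'echo "$PERL5LIB"')
echo "$out"
```

In a crontab, the equivalent would be either a `PERL5LIB=/usr/local/lib/perl5/site_perl` line at the top of the file, or the same inline prefix on the awstats command itself.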
This post - Overriding the AWStats LogFile Configuration Option - came in very handy, and I used the shell example there except with two loops - one to loop through my site configs, and then an inner loop to loop through the dated log files for that site. Loads quicker than having to manually update awstats with the log files for each week per site separately.
For the rest of the afternoon I went on a walk and processed / sorted some photos.
In the evening I watched a Christmas episode of Power Rangers with L, and Batman Returns with L and Mauser.
Sunday, 10 October 2010
Various stuff
This morning I cut out some pogs in Photoshop then went to Church. After Church I finished updating my pog website and had dinner.
In the afternoon I watched 3 Power Rangers episodes with L, then checked various photography websites.
In the evening I did some more work on my ebay listings widget for my pog website, and tried installing the latest version of nginx with the geo-ip module so that I can serve up ebay listings according to the user's location. Unfortunately I can't tell if it worked on my local system as of course my ip address is local, and weirdly when I checked the page headers it still gave the old nginx version number. Possibly it could be that the nginx version number being sent with the headers is the front-end nginx rather than the back-end nginx, which was the one I upgraded.
I'll probably try upgrading nginx on the live web server tomorrow and see what happens. By using a shell script to do the upgrade I can keep downtime to around one or two seconds.
In the evening I also watched Project A with Mauser and L.
The weather today started off overcast, then the cloud cleared by about 11am. The rest of the day was sunny with a clear blue sky.
Saturday, 9 October 2010
Taking photos
I spent most of today taking photos of my UVIR filter and Fuji IS-Pro. Trying to get a photo of the filter reflecting red light was too difficult.
I also watched an episode of Power Rangers and an episode of Jetman with L.
The weather was overcast all day today, though it didn't rain.
Friday, 8 October 2010
Writing a blog post
I spent most of today working on a blog post for my photography website. In the evening I published the post and also uploaded a photo to a few photo sharing websites. I watched an episode of Power Rangers with Mauser and L as well.
The weather today started off foggy, then gradually the fog lifted to reveal an overcast sky. Late in the afternoon the sun started to shine through the clouds a bit and some of the fog returned, looking like it might make for a nice sunset. But then the sun disappeared again, and there wasn't a nice sunset.
Thursday, 7 October 2010
Ebay APIing and Walking
I spent quite a bit of this morning trying to figure out how to use the ebay developer's API as an affiliate. There are quite a few places on both the ebay partner network website and the ebay developers website where they mention using the API as an affiliate, but I couldn't find any info on either site about how exactly you use the API as an affiliate i.e. what the code looks like where you add your affiliate id in.
Then eventually I just tried google and found this blog post, which explains it. You need to add
&affiliate.trackingId=[yourcampaignid]
&affiliate.networkId=9
&affiliate.customId=[customid]
to your url request string.
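Tacked onto a request, those parameters end up looking something like the sketch below. The base URL is a Finding-API-style placeholder and both ids are made up, not real affiliate values:

```shell
#!/bin/sh
# Sketch of appending the affiliate parameters to an ebay API request
# URL. The base URL, campaign id, and custom id are all placeholders.
base="http://svcs.ebay.com/services/search/FindingService/v1?OPERATION-NAME=findItemsByKeywords&keywords=pogs"
campaign_id="1234567890"
custom_id="pog-sidebar"

url="${base}&affiliate.trackingId=${campaign_id}&affiliate.networkId=9&affiliate.customId=${custom_id}"
echo "$url"
```

The `networkId=9` is the fixed value for the ebay partner network; the tracking id is the campaign id from your partner network account, and the custom id is a free-form label for your own reporting.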
That sorted, I wondered how best to go about retrieving the info from ebay. Obviously a url request to ebay's server when creating the page (in php) would slow down the page creation. But using javascript to get the info after page creation would mean that there would be a delay between the page loading and the info loading (with the info in the top right corner of the page, the user might scroll down before it had been loaded).
I was thinking about having a cron job that runs every 10 minutes, gets the info from ebay, and caches it. The PHP page would then include this page (so no delay caused by making a call to ebay's servers). Then when the page had loaded javascript would make a fresh request to ebay's servers and update the ad with the latest info.
This way the user sees the ebay info as soon as the page loads, the page should load quickly, and they get up-to-date info (when the js has finished loading it).
But then I thought that actually I might as well just use php. I think it should be possible to make sure that listings have at least x minutes left, so I could just retrieve listings with at least 10 minutes left every 10 minutes.
I'm not sure if this should be done on a request basis (code in page checks whether cached info is older than 10 minutes, and if it is, requests new info and writes new info to cache file / db). This method would need something to avoid dog piling. Or otherwise just use a cron job to refresh the cache.
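The request-driven variant with dog-pile avoidance could be sketched like this: when the cache is older than ten minutes, only the one request that wins an atomic lock refetches, while everyone else keeps serving the stale copy. File locations and the fetch function are assumptions (the real fetch would call ebay's API), and `stat -c` is the GNU form:

```shell
#!/bin/sh
# Sketch: serve from a cache, refresh it at most once per expiry window,
# and let only one process do the refreshing.
CACHE=/tmp/ebay-listings.cache
LOCK=/tmp/ebay-listings.lock
MAX_AGE=600   # ten minutes, in seconds

fetch_listings() {
    # stand-in for the real call to ebay's API
    echo "listings fetched at $(date)"
}

refresh_if_stale() {
    now=$(date +%s)
    if [ -f "$CACHE" ]; then
        mtime=$(stat -c %Y "$CACHE")
    else
        mtime=0
    fi
    if [ $((now - mtime)) -ge "$MAX_AGE" ]; then
        # mkdir is atomic, so only one request wins the lock and
        # refetches; the others fall through and serve the old cache
        if mkdir "$LOCK" 2>/dev/null; then
            fetch_listings > "$CACHE"
            rmdir "$LOCK"
        fi
    fi
    cat "$CACHE" 2>/dev/null
}
```

The one gap in this sketch is the very first request on a cold cache: if another process holds the lock, there is nothing stale to serve yet, so the real version would need to either wait or show a placeholder.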
In the afternoon I went out on a walk since it was nice weather. My UVIR cut filter for use with my IS-Pro arrived, so I also wanted to test it out. I went up through the town towards Robert Smythe school, then across the fields towards Great Bowden. Then I went back along the canal, and waited around for quite a while at the canal basin, hoping to take a pano there with the sky lit up by the sunset.
Unfortunately while there was a great sunset, it couldn't be seen from the canal basin (I saw the remnants of the sunset while walking home though). I had wondered if I should go to the canal basin or the fields for sunset, obviously the fields would have been a much better choice. Still, at least now I know.
In the evening I watched Sadgati (Deliverance) with Mauser and Bo. It was well directed (by Satyajit Ray), and the story was alright, but not a lot happens.
I also geo-coded all the photos I took on my walk.
Monday, 4 October 2010
Photo metadataring
Today I was just sorting, processing, and metadataring my photos from my walk on Saturday morning.
I thought I should have easily been able to get them all done and uploaded to my website by the end of the day, but I haven't even finished rating them all yet.
In the evening I also watched an episode of Birdman Rangers Jetman with L, and two British Transport railway films with Clare and Mauser.
The weather started off foggy, then the fog gradually cleared during the morning. It was nice weather for about half an hour, then came over cloudy. It was cloudy most of the morning and afternoon, but later in the afternoon the clouds cleared. There was a nice sunset, but I missed it as I was still doing the washing up after dinner.
Sunday, 3 October 2010
Various stuff
This morning I cut out some pogs in Photoshop, then went to Church.
When we got back home we watched Streetfighter the Movie on blu-ray until dinner.
After dinner I updated my pog website, then we (me, Mauser, and L) finished watching Streetfighter.
I did a bit of photo processing, but then I had a bad headache so I went to bed for a bit.
After a while in bed I opened my eyes to find that sunlight was coming through my bedroom windows (it had been raining all day). So I quickly grabbed my tripod and photo bag and went out to try and photograph the sunset. Unfortunately the sunset was pretty rubbish, so I just spent an hour standing around in a field waiting to see if the sun would shine up from below the horizon and light up the underneaths of the clouds.
When I got back home I processed some more photos, then spent a long time making salad for a sausage sandwich.
For the rest of the evening I processed some more photos, then did a backup.
Saturday, 2 October 2010
Stopped raining
This morning I went out on a walk because it wasn't raining. The weather wasn't that great for photography as the sky was mostly white rather than a mixture of blue sky and white clouds, but better than rain.
In the afternoon I sorted and processed some of the photos.
In the evening me and Mauser went on the halfcost.co.uk website to see if he wanted to buy some clothes from it.
The weather was sunny in the morning (sun shining through thin cloud), cloudy in the afternoon, and rainy in the evening.
Friday, 1 October 2010
Wordpressing
This morning I finished sorting some photos, then did a backup. The rest of the morning and part of the afternoon, I was reading Luminous Landscape articles, as I hadn't checked their website in quite a while.
The rest of the afternoon and a bit of the evening I was trying to upgrade wordpress and figure out why I wasn't being sent weekly digest emails by my wordpress blog (with the subscribe2 plugin). I didn't figure it out; I think I'll have to leave it and see if it works or not (since I upgraded the plugin). If not, then I'll try changing it to send an email for each new post instead of a weekly digest.
In the evening I also watched an episode each of Jetman and Power Rangers with L. I looked at some of Mauser's photos from Europe as well.
The weather today was rain all day.