Sunday, 31 January 2010

Looking after G-Dad and playing Civ 4

This morning I was tasked with looking after Grandad. I started cutting out a few flippos in Photoshop, but then Grandad's breathing was quite bad, so I just sat in the living room with him most of the morning to make sure he was okay. His breathing seemed to be quite bad when he was awake, but alright when he was asleep.

After lunch I finished cutting out the flippos in Photoshop, and uploaded them to my Pog website.

The rest of the day I played on Civ IV in Mauser's room, with help from L. I only had it on the 2nd easiest difficulty level, but the Germans went to war against me and easily nicked 2 of my cities, despite me thinking they were well defended. Also, whenever I attacked their Knights with my elephants, it said I had a 98% chance of winning, but I would always lose.

Food
Breakfast: Bowl of Choco Moons Cereal; Cup o' Tea.
Dinner: Chicken Curry; Vegetable Curry; Sultanas; Rice. Pudding was a Belgian Chocolate Brownie. Cup o' Tea; ½ Cadbury's Twirl.
Tea: Beef with mustard and sliced tomato sandwich; Banana; Slice of freshly baked home-made Victoria Sandwich Cake with Strawberry Jam; Cup o' Tea.

Saturday, 30 January 2010

Processing photos

This morning I had breakfast, then went out in the garden with L to see how frozen the pond was. It seemed to have a few flecks of snow on it, and the ice was about 1cm thick, which is quite thick considering the temperature hasn't been below zero for probably a couple of weeks now.

I calibrated my monitor, wrote up yesterday's blog post, and did some vacuuming.

I started processing more photos from Acton Scott, which we visited on the fourth day of our Shropshire holiday, back in July.

Mike and Nicky came to visit, and brought their granddaughter Scarlet with them, though I didn't see much of them as I was too busy processing my photos.

After lunch I carried on processing photos. Mike and Nicky went home about 3.30pm.

I finished processing the Acton Scott images about 4.30pm, then checked the metadata and renamed the images.

In the evening I watched an episode of Power Rangers with L, then started uploading the Acton Scott photos to my photo website, and also uploaded a photo to various photo sharing websites.

After uploading one image (not larger than the other images or anything, just a normal image), I found that I couldn't access my website and was getting a 504 Gateway Timeout error. I tried my pog website, but that seemed to be working okay. So I logged into the webserver and looked at the error logs for my photo website. The error log just said it (i.e. nginx) wasn't receiving any response from upstream (i.e. PHP).

I checked what processes were running, and PHP was running (well, obviously, since my other website that runs on the same server with the same PHP instance was working okay). I tested my photo website again, but nope, still 504.

I restarted PHP using the CentOS spawn-fcgi init script, which said that it had failed to close down the current PHP processes, but had started the new ones successfully. I checked what processes I was running, and only the new PHP processes showed up, so I guess it did close down the old PHP processes, even though it said it had failed. The fact that it reported a failure does suggest that something had gone wrong with the old PHP processes though.
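
For reference, this is roughly the sequence of checks and the restart (a sketch only - the log path and init script location depend on how nginx and spawn-fcgi are set up on the server, so treat them as examples):
# see what the nginx error log says (path is just an example)
tail -n 50 $HOME/logs/nginx/error.log
# check which PHP FastCGI processes are running
ps aux | grep php
# restart the spawn-fcgi managed PHP processes (script name/location is an assumption)
/etc/init.d/spawn-fcgi restart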

It's strange how my pog website still worked though. The only thing I can think of is that one or two of the PHP processes were dead (I had one PHP parent process with 3 child processes running), and when I tried to access my pog website my request luckily got handled by a working PHP process, while whenever I tried to access my photo website, my request unluckily got handled by one of the broken PHP processes.

Anyway, after restarting php fcgi, I could access my photo website again. I checked the last image I uploaded had uploaded okay, and it had. I uploaded another image, but got a blank screen. I thought that php had broken again, but I pressed enter in the url bar, and the page reloaded okay. I checked the image had uploaded okay, and it hadn't, there wasn't even a record in the database for it (the image metadata is added to the database before the image is moved and resized).

I tried again, and this time the image uploaded okay.

Food
Breakfast: Orange Marmalade Toast Sandwich; Cup o' Tea.
Lunch: Mature Cheddar Cheese Sandwich made with fresh Bakery Bread; 2x Clementines; Slice of fresh Bakery Bread with Strawberry Jam; Chocolate Snowball; Cup o' Tea.
Dinner: Toad in the Hole; Mustard; Gravy; Potatoes; Green Beans; Carrots. Pudding was a Belgian Chocolate Truffle. Coffee.

Friday, 29 January 2010

Metadataring

This morning I was adding metadata to images from the fourth day of our Shropshire Holiday back in July.

After lunch I had a bad headache, so I went to bed until about 3.20pm. I got up and my headache wasn't too bad, so I finished adding metadata to the images of Acton Scott Farm.

After dinner I watched an episode of The Equalizer, then processed a few of the photos from Acton Scott Farm. Then I watched The Night of the Hunter with Mauser, which had Robert Mitchum as a money-obsessed evil preacher.

The weather started off overcast, then started to brighten up during the morning. It was a mixture of sun and clouds most of the day, and there was quite a good sunset with some nice clouds.

Food
Breakfast: Orange Marmalade Toast Sandwich; Cup o' Tea.
Lunch: 2x Clementines; Ibuprofen.
Dinner: Battered Fish Portion; Potato; Spaghetti Hoops; Tartar Sauce; Ground Black Pepper; Mushrooms. Pudding was: Vanilla and Peach flavour Yoghurt; Hobnob; Dark Chocolate Digestive. Coffee.
Supper: Bowl of Choco Moons Cereal; Cup o' Tea.

Thursday, 28 January 2010

Photo uploading

This morning I got Grandad breakfast (actually I quite often do that), then checked my email. After that I tried a suggestion for limiting memory usage by the Imagick PHP extension. I spent quite a while doing that, trying to figure out what was happening, and also installed the latest beta of Imagick. But I couldn't get it working; it seemed that PHP would die as soon as I called a method of the Imagick class statically.

I wrote a blog post for my photo website about Church Stretton, and then updated my photo website with the blog posts I'd written yesterday and today.

In the afternoon I uploaded one of my photos to a few websites, then I did a bit of gardening, cutting down some old plants and cutting back the Chilean Potato Tree.

I uploaded my photo to a couple more photo websites, waited for Robogeo to adjust the time of some images by minus one hour, looked after Grandad, and went on the pinternet a bit for the rest of the afternoon.

In the evening I watched an episode of Power Rangers and Star Trek TOS with Mauser and L, then I watched Out of The Past with Mauser, which had a nice twisty story.

The weather started off overcast, then started to brighten up during the morning. After lunch it clouded over again and rained a bit, then was overcast the rest of the day.

Food
Breakfast: Orange Marmalade Toast Sandwich; Cup o' Tea.
Lunch: 2x Cheese on Toast; Crunchy Salad; 2x Clementines; Grapes; Cup o' Tea.
Dinner: Slice of Chicken & Mushroom Pie; Potato; Peas; Sweetcorn; Roast Parsnip. Pudding was Chocolate Cereal Crunch Cake. Coffee.

Wednesday, 27 January 2010

Uploading images

This morning I was uploading more photos to my photo website. I also had one photo that somehow I had missed out on processing, so I had to process that. A few images were missing metadata or needed cropping and converting from TIFFs to JPEGs, so I did that as well.

After lunch McRad wanted to go on the internet, and since uploading an image seems to stop any downloads while it's uploading, I had to stop the uploads for a bit. While I was waiting I checked my email and The Web Squeeze.

McRad finished going on the pinternet, so I carried on uploading some more images while doing some work on my pano website. I was trying for ages to figure out why my rewrite rule for the panoramas wasn't working properly. It seemed that the Flash panorama player was ignoring the query string added by the rewrite, and was instead working based on the URL displayed in the browser.

Then eventually I figured it out: obviously the Flash file works based on the URL in the browser. Since Flash is a client-side technology, it never sees the query string added by the server's silent rewrite.

After figuring out how to get the pano to work, I started writing up some blog posts for the photo website about some of the images I've uploaded to the website over the last couple of days.

In the evening I also watched an episode of Power Rangers with L and Mauser, and Cape Fear with Mauser.

The weather started off overcast, then started to brighten up during the morning. After lunch it clouded over again and rained a bit, then was overcast the rest of the day.

Food
Breakfast: Orange Marmalade Toast Sandwich; Cup o' Tea.
Lunch: Cheese topped bun; Chicken with Mayonnaise, sliced Tomato, and Crispy Salad Sandwich; Banana; Grapes; Chocolate Snowball; Cup o' Tea; Couple of pieces of Sainsbury's Caramel Chocolate.
Dinner: Cheese & Corned Beef Fritter; Baked Beans; Mashed Potato; Brown Sauce; Ground Black Pepper. Pudding was Jamaica Ginger Cake with Custard and Golden Syrup. Coffee; 4x Pieces of Sainsbury's Truffle Chocolate.
Supper: Dark Chocolate Digestive Biscuit; Cup o' Tea.

Tuesday, 26 January 2010

Got photo website uploads working again

This morning I checked my email, then made the Licensing XMP templates for Commercial and Editorial Image Licences.

I opened up Adobe Bridge and found an image to apply the XMP to, but I wasn't sure if the image should be licensable on a commercial or editorial basis. The photo was of Acton Scott Hall, taken from what I'm pretty sure was a public footpath.

I looked on Alamy, and found there were quite a few photos from Acton Scott there. Clicking on the 'Commercial' tab (as opposed to the 'All of Alamy' tab) though, there were no results.

Next I checked the advice on Photoshelter, which says
Property releases are generally required for private homes and other privately-owned buildings and land. Some stock sites might require property releases for other forms of property as well, such as animals, vehicles, and personal possessions. In general, both stock houses and image buyers tend to be less concerned about property releases.
and
Images that focus on private homes and other buildings need a property release for most stock sites. PhotoShelter will still accept these images without releases.

Images that feature many private homes or buildings do not generally require a release, since the focus is not on any one particular building. However, identifying details such as street names and house numbers should not be visible. (PhotoShelter will accept without a release)
Since skyline images do not focus on any one particular building, a property release is not required.
Pets are considered property, and some stock sites might require property releases for images of pets and working animals. (PhotoShelter will accept without a release)
In general, if an attraction, such as a park, is visible from a public place, it does not need a property release.
Some famous buildings or landmarks do require a property release for certain usages. (PhotoShelter will accept without a release)

I checked the article linked to from the Photoshelter article, Photo Attorney: Property Releases Revisited, which says there is no legal precedent set to require property releases.

I checked Getty, who say
Model and property releases ensure depicted individuals and property, e.g. brands, homes, etc. contained within an image/clip are cleared for commercial use.

However, if an image/clip does not contain an individual or property (see examples), then commercial use of the image/clip will not require any model or property releases as there will be no one to obtain a release from.
So while the Photo Attorney says there's no legal need to obtain property releases, Getty seem to require them for all images except where there isn't anyone to obtain a release from. The example images they have are of a frog, the sea, and a dimly lit landscape where the focus is mainly on the clouds covering the valley below and the sun rising above the clouds.

Looking at images of Big Ben and the Parliament buildings on Getty, it is quite interesting to see that whether a property release is needed or not for an image to be used commercially is quite inconsistent.

I modified the local copy of my website to take account of the new XMP. One of the things I wasn't sure about was how I should store the license type. Both license types (Commercial and Editorial) are Boolean (true or false). MySQL actually stores a Boolean as an integer, where 0 equates to false and anything else (though you would normally use 1) equates to true. But for printing values out on a website you don't really want to be using 0 and 1, but rather true and false, or yes and no.

Doing some googling I saw that some people suggested using tinyint, some suggested BOOL (which is an alias of tinyint), and some suggested using enum with true and false or yes and no as the enum values. Then I read this thread on Stack Overflow, where the top answer suggested using BOOL or BOOLEAN, and that MySQL hopes to implement a proper BOOL datatype in the future.

So I decided to stick with using the BOOL datatype for my boolean fields, then check in PHP whether the value equates to true, printing 'Yes' or 'True' if it does, and otherwise 'No' or 'False'.
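
As a rough sketch of what that looks like from the command line (the table and column names here are made-up examples, not my actual schema):
# BOOL is just an alias for TINYINT(1), so the stored values end up as 0 and 1
mysql -u myuser -p mydatabase -e "ALTER TABLE images ADD COLUMN commercial_licence BOOL NOT NULL DEFAULT 0"
# the website then checks whether the value equates to true in PHP and prints 'Yes' or 'No' accordingly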

After finishing the updates on my local copy of the site, I updated the live site on the web server, then proceeded to upload some photos. However, when I got to uploading a large pano (approx 10000px x 5000px), ImageMagick didn't create a resized smaller version of the image. I was calling ImageMagick with 2>&1 from PHP, and printing the result, in the hope that this would mean any error messages would be printed out on the webpage.

But there weren't any errors printed for the initial image resize, only for the further resizes, which failed because they try to resize the resized image, which obviously didn't exist since ImageMagick had failed to create it.

Testing the exact same command from the shell, all I got was
Segmentation fault
I resized the image to the correct size in Photoshop, and uploaded this to the correct place on the server. Then I tried the ImageMagick command used to resize the resized image to a smaller image. That gave me the following:
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': 4MB @ cache.c/OpenPixelCache/4078.
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': 2MB @ cache.c/OpenPixelCache/4078.
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': MB @ cache.c/OpenPixelCache/4078.
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': 4MB @ cache.c/OpenPixelCache/4078.
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': 9MB @ cache.c/OpenPixelCache/4078.
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': 3MB @ cache.c/OpenPixelCache/4078.
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': 8MB @ cache.c/OpenPixelCache/4078.
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': 3MB @ cache.c/OpenPixelCache/4078.
convert: unable to extend cache `../CSI/Img/99-The-Long-Mynd.jpg': 8MB @ cache.c/OpenPixelCache/4078.
convert: Application transferred too few scanlines `../CSI/Img/99-The-Long-Mynd.jpg' @ jpeg.c/EmitMessage/233.

So I created the other image sizes needed in Photoshop, and uploaded them to the web server so at least the website wouldn't have any problems with missing images.

Then I did some googling to try and find what the problem was, and found this thread on the ImageMagick forums: [solved] Big jpeg issue, where the answer was given:
the temporary disk space required by the pixel cache could not be allocated. The solution is to point the temporary space that ImageMagick writes to a partition with plenty of free space. For example,

convert -define registry:temporary-path=/data/tmp mybigassimage.png mybigassimage.jpg


or you can set the MAGICK_TMPDIR environment variable.
I'm not sure how you set shell environment variables when using it from PHP, so I just used the -define registry:temporary-path option.

First I tried resizing the medium image (which gave the lack of tmp space error) to a smaller image using a different tmp path (I just created a folder called tmp inside my apps directory). That worked okay. Then I tried resizing the original image, which had given the segmentation error, and that worked okay as well.
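
For future reference, these are the two equivalent ways of pointing ImageMagick's temp space somewhere with more room (the paths and image names here are just examples):
# either set the environment variable before running convert...
export MAGICK_TMPDIR=$HOME/apps/tmp
$HOME/apps/ImageMagick/bin/convert big-pano.tif -resize 1024x720 big-pano-small.jpg
# ...or pass it as a define, which is what I ended up doing since it was easy to add to the command built in PHP
$HOME/apps/ImageMagick/bin/convert -define registry:temporary-path=$HOME/apps/tmp big-pano.tif -resize 1024x720 big-pano-small.jpg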

After fixing that, I carried on uploading images to my website (takes ages for each image), while checking my email and the internet.

After dinner I watched an episode of The Equalizer.

I had accidentally added some XMP to a couple of images that indicated that they were available for commercial licensing, when actually they weren't. Unfortunately, while adding XMP via Adobe's XMP File Info Panel Import option is easy, there's not any easy way provided to remove XMP (short of replacing the whole XMP block).

I tried using exiftool, but since it was one of my own tags I needed to remove, it couldn't do it. So I had to add the tags I wanted to remove to the User Defined tags in exiftool. Luckily this is very easy thanks to the great examples in the ExifTool config file; it's just a case of copying or modifying a couple of the examples in there. Then I could use exiftool to remove the tags I didn't want in the files.
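
The removal itself is then just a case of setting the offending tags to nothing, something along these lines (the config file name and tag names here are made-up examples rather than my actual ones):
exiftool -config .ExifTool_config -XMP-myschema:CommercialLicence= -XMP-myschema:EditorialLicence= image.jpg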

I also received a couple of CDs today that I had bought on Amazon the other day. One I already had downloaded, but the other one I hadn't downloaded, and I couldn't find it on isohunt either. So I actually had to break the shrinkwrap and rip it myself.

After ripping it, tidying up the tags / directory structure, and downloading the album artwork, I started listening to it in Winamp. But only the left and right speakers seemed to be working. So I tried googling for 'Winamp 5.1 Upmix', which found me a thread on the Winamp forums linking to some drivers for the Audigy 2 (the same as my soundcard) that allow you to upmix to 5.1. And the person asking the question said that the drivers fixed it for them. But unfortunately the link had been removed.

So I did some more googling and found this thread - Sound upmix software for Vista?, where one of the posters says

Originally posted by JaylumX: I thought one of the features of CMSS was the upmixing of stereo signals to 5.1.

Yep, and it works in Vista as well, just need to enable it in the Creative Audio Console.
But I have CMSS enabled in Creative Audio Console, and it's not upmixing. Luckily, they go on to say
Or if you're not adverse to a little INF modifying you can add Vista's own version of CMSS (as well as a couple of other little goodies),
http://forums.creative.com/creativel...cending&page=1
So I followed that link, and followed the instructions there. After installing the drivers as described, I ticked the 'Speaker Fill' option on the 'Enhancements' tab of the Speakers Properties, and I now have my stereo tracks upmixed to 5.1 in Winamp.

The weather was mainly overcast, then started to brighten up in the afternoon. The sunset was quite boring with a slightly hazy sky, but after the sun had set the sky gradually turned orange, making for an amazing sunset/twilight that actually lasted quite a while as well.

Food
Breakfast: Orange Marmalade Toast Sandwich; Cup o' Tea.
Lunch: Mature Cheddar Cheese with Crispy Salad Sandwich; 2x Clementine; ½ Eccles Cake that me and L made on Sunday; Home-made Apricot Flapjack; Cup o' Tea.
Dinner: Shepherd's Pie; Green Beans; Carrots; Tomato Ketchup; Grated Mature Cheddar Cheese; Ground Black Pepper. Pudding was a Cherry Bakewell and a Chunky Choc Chip Cookie. Coffee; 2x Pieces of Sainsbury's Truffle Chocolate; Piece of Nestle Choc Noir (or sumat similar) Dark Chocolate.

Monday, 25 January 2010

Fixing things

This morning I moved my boxes of Pog stuff back behind the cupboard where they were a few months ago, so the boxes weren't taking up space in my room any more. This also meant that I could reconnect my speakers to my PC.

I hadn't bothered connecting my speakers since having to disconnect them to remove the boxes of pog stuff from behind a cupboard a few months ago, since I figured that it would be pointless to reconnect them, and then have to disconnect them again when I moved the boxes back behind the cupboard. Obviously, I didn't realise it would take me a few months until I moved the boxes back behind the cupboard.

After getting everything plugged in again, I went on my comp and finished writing Saturday's blog post, and wrote yesterday's blog post, and today's blog post so far.

I found out that the surround sound wasn't working properly - it seemed that only the center speaker was working okay, so I ran the Creative Software Update (since they're Creative speakers and a Creative soundcard) to download the latest updates.

After installing the updates and restarting my PC, I found that surround sound was now working, but the front left and right speakers were swapped with the rear ones. So I unplugged the 2 leads from the soundcard and swapped them over. But this froze my PC. I tried restarting, but nothing was output to the monitor.

After unplugging everything from the PC except the power and monitor, I found that the PC would start okay. I shut down and plugged in the cables for the Creative Sound Blaster Audigy 2 Platinum Pro EX external breakout box, and now the PC wouldn't output anything to the screen again when I booted it up.

I unplugged the Audigy 2 breakout box, started up the PC again, and went into the BIOS. Here I found that the Motherboard's onboard sound had been enabled (causing a conflict with the Sound Blaster Audigy 2 soundcard). I found that the extended memory setting allowing you to use more than 4GB of RAM had also been disabled, so it seemed like the BIOS had been reset somehow.

After disabling the motherboard's onboard sound, I shut down the PC, plugged in the Audigy 2 Platinum Pro EX external breakout box, and then started the PC again, which now started okay.

I shut down again, plugged everything else back into the PC, started up again, and found everything was working okay. I setup the surround sound, and that was working as well. But then I noticed that the time on the computer was wrong. Lately it's happened a few times that my computer clock has been an hour or two out.

So I did some googling, and as I thought, it seems this is likely due to the motherboard CMOS battery having run out, which would also be why the BIOS had reset and caused the problem with my soundcard.

So I shut the PC down again, unplugged everything again, and moved the PC to the floor where I could work on it easily. I went into the garage and found the box for the PC case, and found the Motherboard manual in there. I looked through the manual, but it didn't have anything about changing the CMOS battery other than showing its position on the Motherboard.

I had to remove the graphics card, as the end of the graphics card goes over the CMOS battery location. The battery seemed to have 4 clips around the edges holding it in place, so I couldn't really pull all 4 clips out of place to pop the battery out. I just put a flat head screwdriver under the battery (this is on an ASUS P5B Deluxe Wifi AP board) and levered it up until it popped out.

I then swapped it with a replacement CR2032 battery that was in the battery box; however, I don't know how much charge the replacement battery has left, as it isn't new.

I plugged the power and monitor back into the PC, then started it up and went into the BIOS. I disabled the onboard sound, set the SATA configuration to RAID, and enabled the extended memory again (the BIOS had reset due to not having power for a bit). The PC worked okay, so I shut down and plugged everything back in, then switched it on again.

I noticed the time was wrong again, but couldn't get the Windows Time to update from the Internet. I then had a message from Windows Update that it couldn't update. It said 'click here for more information', which just took me to the Windows Update screen. So I pressed the update button, and now it did actually give me more information, in that it had an error code 80072f8f.

Clicking on the link for information about the error code just brought up Windows help with general information about different problems with Windows Update, none of which were related to the problem I was having or the error code it had given.

Googling did bring up information - apparently the error code means that your computer time and the time on the Windows Update server are too out of sync. Why they don't just print that message with the error code, or include this information in Windows help, I don't know.

So I went into the time settings and manually selected the correct date (it was in 2002 before). This also had the added benefit that the Internet time update now worked.

After lunch I checked my email. When I'd finally finished wading through them, I cut out some Pogs in Photoshop. When I'd done a few of them, I needed to upload them to my Ubuntu Virtual Machine. Since I can't run my VMWare Virtual Machine at the same time as Microsoft Virtual PC 2007, I thought I'd uninstall VMWare Server 2.0 and then re-install it, and see if this fixes the problem.

Initially I tried downloading the latest version of VMWare Server 2.0 using Firefox, but the download terminated partway through. This happened a few times yesterday when I was trying to download Adobe Flash Builder 4 Beta. Then I tried Chrome to download the file, which had no problem, so I tried the same thing today for downloading VMWare.

While I was waiting for VMWare Server 2.0 to download, I wrote a bit more of this blog post. It was very annoying though, as when I have an HTTP download going, it sometimes seems to make my PC really slow, so the mouse is really laggy. This time though, it seemed to really mess up the keyboard: it would take about 10 presses of a key until the key press actually registered. Also, sometimes it would print a different letter to the one you were pressing, e.g. once when I was pressing 'p', a 'b' was printed to the screen.

Another strange thing was that Firefox would randomly launch itself (or new windows if it was already open). Luckily, everything went back to normal when the download was complete, but it took quite a long time for the download to complete.

When it was done I restarted my PC (to make sure the previous install of VMWare Server 2.0 had been cleaned up properly), and then installed VMWare Server 2.0. Except that it didn't install, I just got a message
Cannot continue. The microsoft runtime dll installer failed for complete installation.
After googling I found this thread. Following the advice given in the thread linked to from the first reply, I tried the Windows Installer CleanUp Utility, but there wasn't any entry for VMWare to remove.

I also tried
1. Open a CMD.EXE prompt.
2. Type msiexec /unregister and press Enter.
3. Type msiexec /regserver and press Enter.
But that didn't work either.

The other replies in that thread weren't helpful to me either. So I went back to the original thread to see the other replies there, and found one that said to try installing vcredist manually. So I installed the 2008 x64 version (vcredist_x64), but still VMWare wouldn't install.

So I carried on looking at the replies, and found the last reply where the OP said (how) they had fixed it. The answer was that when you try installing VMWare Server 2.0, you don't click 'ok' on the error message, but leave it open. You can then find the VMWare temp files that the installer has extracted, and install VMWare from the temp files.

The author of the thread/post was using Windows XP, so their VMWare temporary install files were extracted to
C:\Documents and Settings\Administrator\Local Settings\Temp\{AF08C71F-F822-4416-87A9-2BBF5A8A5F12}

But I'm using Vista x64, so my VMWare Server 2.0 install files were extracted to
C:\Users\Rusty\AppData\Local\Temp\{AF08C71F-F822-4416-87A9-2BBF5A8A5F12}~setup

In that folder was a vcredist_x64.exe and a vcredist_x86.exe file, so I tried installing the x64 version. But I got a message that there was no free space on the D: to install it. D: is my CD/DVD drive, so not surprising it couldn't install. There didn't seem to be any way to correct it to install on the correct drive, so I just tried running the VMWare Server.msi, which was in the same folder. This ran and installed okay.

After restarting the PC (required by the VMWare Server 2.0 installer), I booted up the Ubuntu Virtual Machine, and it worked okay. I then shut it down and started up a MS Virtual PC 2007 Virtual Machine. I then tried to boot up the VMWare Ubuntu Virtual Machine again, but it still had the same problem that it wouldn't work when a MS Virtual PC VM was on.

So I tried posting to the VMWare Server forums, but they still had the same problem as before: an endless redirect when you try to post a message. So I thought I'd try in IE, since before I'd only tried FF and Chrome. Amazingly, I could post a message to the VMWare forums in IE8.

Unfortunately the message posting system VMWare uses on their forums seems to be rather rubbish: after inserting a quote block, I clicked in the text box and pressed enter to make a line break, but instead it posted the message. There didn't seem to be any option to edit your post either, so I had to make a reply to my first message with the rest of the message.

In the evening I updated my Pog website and also the facebook group and twitter for it, and then found that finally the facebook updates to twitter were working, so some things got put on twitter twice, by me and by facebook (since the facebook updates to twitter hadn't been working previously, I had also updated twitter manually).

I found I had had a reply to my question on the VMWare forums VMWare Server 2 conflict with Microsoft Virtual PC 2007, which said
It is likely that VPC 2007 leaves the CPU in VMX operation, which prevents VMware Server from switching between legacy mode and long mode. Unfortunately, that mode switch is required for the majority of host/guest configurations under VMware Server 2.

I don't know if it is possible to disable VT-x support in VPC 2007, but if it is, that may be a workaround for this issue.
Looking up 'VT-x support' to see what that was, I found that it was hardware virtualization support. In MS Virtual PC there is a per VM option to disable/enable hardware virtualization, so I disabled hardware virtualization on the VM, then started up one of the MS VPC VMs. When that had loaded, I started up my VMWare Virtual Machine, and it started okay! So that's got that problem fixed.

I looked at the Adobe XMP File Info Panel SDK documentation to see how to edit my panel using Flex Builder (well actually, Flash Builder 4, but that wasn't around when the XMP File Info Panel SDK was released). I followed the instructions of copying a file to the Flash Builder plugins folder, but when I then started up Flash Builder, it said my evaluation period had ended, even though it's meant to be a 60 day evaluation copy (it's a beta), and I'd only had it installed a couple of days.

So I uninstalled the Flash Builder 4 beta 2, restarted, and then installed it again, but it still said that the evaluation period had ended.

So I decided to try and build my panel in the same way that I was doing before. The first thing I had to do was to find the location of ant (it's bundled with Flash Builder 4), then add its location to the PATH environment variable.

After doing that I tried building my project, but got a message that java.exe couldn't be found.
'"java.exe"' is not recognized as an internal or external command,
operable program or batch file.
The system cannot find the batch label specified - end
So I found the location of java.exe, and added that to the PATH environment variable. I tried building the project again, but now I got a message that it couldn't find tools.jar.

So I removed the Java location from the PATH environment variable, installed the JDK, and then tried again, but got the same message I did earlier about Java.exe not being found. So I added the JDK Java.exe location to the PATH environment variable, and tried again. Now I got the message
Buildfile: build.xml
[taskdef] Could not load definitions from resource flexTasks.tasks. It could not be found.

BUILD FAILED
Target "build.xml" does not exist in the project "DKimageXMPPanel".

Total time: 0 seconds
The system cannot find the batch label specified - end
So I edited my build.xml file and found the line
<taskdef resource="flexTasks.tasks" classpath="${FLEX_HOME}\ant\lib\flexTasks.jar"/>
I checked the correct path to flexTasks.jar, and that was the correct path.

I tried changing the Flex SDK location in the build.xml file to point to the Flex 3.4.1 SDK instead of the Flex 4.0.0 SDK, but that gave me the following error when I tried to build the project:
Buildfile: build.xml

BUILD FAILED
Target "build.xml" does not exist in the project "DKimageXMPPanel".

Total time: 0 seconds
The system cannot find the batch label specified - end


I then tried building one of the sample panels that comes with the File Info SDK, but that didn't work either:
Buildfile: build.xml
[taskdef] Could not load definitions from resource flexTasks.tasks. It could not be found.

BUILD FAILED
Target "build.xml" does not exist in the project "Flash Flex Panel".

Total time: 0 seconds
The system cannot find the batch label specified - end
I tried with both the Flex 4 SDK and Flex 3.4.1 SDK, but got the same error message when using both.
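
For what it's worth, that "Target "build.xml" does not exist" error is the kind of thing ant gives when the build file name is passed to it as if it were a target name, so it may just have been down to how I was invoking it. As a sketch, the usual invocations (assuming you're in the directory containing build.xml) are:
ant
ant -buildfile build.xml
ant -buildfile build.xml name-of-target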

I think what I will have to do is not modify the XMP File Info Panel (since neither Flash Builder nor compiling the panel manually seems to work), but instead make different XMP templates for the different license types. Then I can just import the correct one from the XMP File Info Panel, and get the correct info written to the image's XMP.

Then I still need to do the work on the website so it will read this XMP and add it to the database.

The weather today was overcast all day.

Food
Breakfast: Bowl of Choco Moons Cereal; Cup o' Tea.
Lunch: Mature Cheddar Cheese with Crispy Salad Sandwich; Clementine; ½ Eccles Cake that me and L made yesterday; Tesco fake Caramel Rocky; Cup o' Tea.
Dinner: Slice of bacon quiche; Potatoes; Peas; Ground Black Pepper. Pudding was a Slice of Home-made Apricot Flapjack. Coffee; 2x Pieces of Sainsbury's Truffle Chocolate; 2x Pieces of Sainsbury's Soft Caramel Chocolate.
Supper: Chunky Choc Chip Cookie; Cup o' Tea.

Sunday, 24 January 2010

Defragging

This morning I was cutting out some of the photos of Pog stuff I took yesterday evening.

After church and in the afternoon I carried on cutting out some of the photos of Pog stuff I took yesterday evening.

Then about 4.30pm I made some Eccles Cakes with Bo.

In the evening I watched Aguirre, the Wrath of God with Mauser. It was quite boring as nothing really happens in it, something to be expected really for a Werner Herzog film.

Then after that I just watched my PC defragging as it didn't finish yesterday. That was exciting watching the little blocks slowly move about.

The weather started off overcast, then cleared up a bit and was sunny for the rest of the morning. Then in the afternoon it clouded over again and was overcast the rest of the day. In the evening it rained.

Food
Breakfast: Bowl of Choco Moons Cereal; Cup o' Tea.
Dinner: Pasta; Carbonara sauce stuff; Mixed Veg; Bacon; Ground Black Pepper. Pudding was 2x Chunky Choc Chip Cookies. Cup o' Tea.
Tea: Ham with Crispy Salad and Sliced Cherry Tomatoes Sandwich; Clementine; Warm Eccles Cake that me and L made; Cup o' Tea; Bit of Cadbury's Wispa; bit of Yorkie.

Saturday, 23 January 2010

Not much

This morning I decided to try and investigate why my website wasn't sending output to the browser as the page was being processed (so the page can be loaded incrementally as the server processes it, rather than the server processing the complete page and then sending it off to the browser).

It seemed that this was due to FastCGI buffering. I did some googling and found this thread, which said
> Is there no such option just because nobody implemented it? Or is it
> because of some kind of technical constraint?
Yes. It's because of FastCGI protocol internals. It splits "stream" into blocks max 32KB each. Each block has header info (how many bytes it contains, etc). So nginx can't send content to the client until it get the whole block from upstream.


So, not being sure about how the Fast CGI buffering works, and whether it is possible to flush the buffer to the browser from PHP when using Fast CGI, I posted to the Nginx forum/mailing list to see if anyone could advise me.

After lunch I did some more work on my website, testing some uploads, including a grayscale (Dot Gain 20%) image. It uploaded and converted to sRGB okay. Strangely, when viewing the original image in Firefox, it wasn't colour managed (the dark tones got much darker), so it seems that maybe Firefox only colour manages Adobe RGB images.

I was getting ready to upload the changes to the web server, but then I remembered that I also needed to change the way the image license is recorded - at the moment I just record whether the image is available on a royalty free or rights managed license, but I also need to record whether an image is licensable for commercial purposes or editorial use.

To record this, I need to modify my Adobe XMP File Info panel so that I can add the necessary metadata to the images. But since I've re-installed Windows Vista, I've lost the Flex SDK and software that was needed to compile the panel. So I did some searching and found a beta of Flash Builder 4, which replaces Flex Builder 3. Previously I wasn't using Flex Builder, but hand coding and using ant to compile.

It seemed like a license for Flex/Flash Builder was available for free if you already owned an Adobe Creative Suite product. But on the info page, it said that you needed to send Adobe a copy of the receipt for the Adobe Creative Suite product that you bought from an authorised Adobe reseller.

Using the Adobe website to find what authorised Adobe resellers there are in the UK, it told me that it couldn't find any!

In the evening I installed PerfectDisk and started doing a defrag of my C: drive. While I waited for that I took some photos of Pog stuff in some boxes that I've had in my room for quite a while waiting for me to take photos of them.

After taking photos of all the Pog stuff, the defrag was still going, so I watched part of the first episode of Star Trek TNG with Mauser and L.

The weather was overcast all day.

Food
Breakfast: Bowl of Choco Moons Cereal; Cup o' Tea.
Lunch: Mature Cheddar Cheese with Salad Sandwich made with Fresh Bread-Maker-made Bread; Crust of Fresh Bread-Maker-made Bread with Strawberry Jam; ½ a slice of Fresh Bread-Maker-made Bread with Strawberry Jam; Clementine; 2x Jaffa Cakes; Cup o' Tea; 3x Pieces of Sainsbury's Truffle Chocolate.
Dinner: 2x Delee Posh Sausages; Mashed Potato; Spaghetti Hoops. Pudding was some Banana with Butterscotch Whip and Chocolate Sauce. Coffee; Dark Chocolate Digestive.

Friday, 22 January 2010

Websiting

This morning I checked my bug report on Imagick and nothing had been done, so I thought I'd better change my website to call ImageMagick from the command line instead of using the PHP Imagick extension.

I looked up how to do sharpening in ImageMagick, and found some info about adaptiveSharpen, which is meant to be more like Photoshop's smart sharpen filter. So I edited my page to do an adaptive sharpen, resize, then another adaptive sharpen using ImageMagick. But when I tested it, I found that I was getting a 504 Gateway Time-out from Nginx.

Googling about this I found this thread on stack overflow: How do I prevent a Gateway Timeout with Nginx. After adding fastcgi_read_timeout 120; to my site config, I found that I was still getting the timeout error. Looking at the error logs I found the problem was not with my nginx/fastcgi backend, but rather the nginx frontend, timing out waiting for the response from the nginx backend.

So I added proxy_read_timeout 120; to the front end Nginx config, but still got a 504 Gateway Time-out. I decided to run the ImageMagick command in the terminal and time how long it actually took. I don't know how to time stuff in Linux so I just ran top refreshing every second and kept an eye on the time column until the process ended, which was quite boring.
$HOME/apps/ImageMagick/bin/convert \
-limit memory 64 \
-limit map 128 \
-filter Lanczos \
$originalImage \
-adaptive-sharpen 0x.6 \
-set option:filter:filter Lanczos \
-set option:filter:blur 0.8 \
-resize 1024x720 \
-adaptive-sharpen 0x.6 \
$HOME/resizedImage.jpg
Amazingly it managed to take 6 minutes 45 seconds to finish (this was with an approx 10000px x 5000px image). So I tried the same thing with unsharp mask to see if it was any quicker:
$HOME/apps/ImageMagick/bin/convert \
-limit memory 64 \
-limit map 128 \
-filter Lanczos \
$originalImage \
-unsharp 0.5, 0.5, 5, 0.05 \
-set option:filter:filter Lanczos \
-set option:filter:blur 0.8 \
-resize 1024x720 \
-unsharp 0.5, 0.5, 1.66667, 0.05 \
$HOME/resizedImage.jpg
But this took 16 minutes 57 seconds - now that was a long time to wait just watching top!
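
As an aside, it turns out there's a much easier way to time a command than watching top - the shell's built-in time command. A sketch (with made-up image names), using the same sort of convert invocation as above:
time $HOME/apps/ImageMagick/bin/convert original.tif -resize 1024x720 resized.jpg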

Also, someone had replied to my question on the Ubuntu forums about how to record the memory usage of a program/process until it finishes; here's the script they suggested:
#!/bin/bash
# run the convert command in the background so the script can keep polling it
$HOME/apps/ImageMagick/bin/convert -limit memory 64 -limit map 128 $ORIGINAL_IMAGE -set option:filter:filter Lanczos -set option:filter:blur 0.8 -resize 1024x720 $HOME/resizedImage.jpg &
# $! holds the PID of the background convert process
CMD_PID=$!

if [ $? -eq 0 ]; then
    # keep logging until ps returns non-zero, i.e. until the convert process has finished
    while [ $? -eq 0 ]; do
        sleep 2
        ps --no-headers -p $CMD_PID -o pid,%cpu,rss,cmd >> test.txt
    done
fi

I must admit I don't really understand how it works - $? contains the return code of the last executed command. So I would have thought that $? would start off as null (or something similar) and then be 0 when ImageMagick has successfully processed the image. But it works, so I guess that's all that matters really.

The time taken for processing by ImageMagick is quite bad, especially considering that I also need to convert the image to sRGB, create two smaller versions, and create a thumbnail as well. So I looked to see if I could find anything about Lightroom 3 and creating an export plugin like the one it ships with for Flickr. Then I could resize my images in Lightroom (which could also use the superior smart sharpen) and upload the different sized images to the web server instead of the web server having to do the image resizing and sharpening.

I couldn't find anything so I posted a question about it on the Lightroom forums.

My website doesn't currently have any facility for dealing with grayscale images (as far as I'm aware), so I checked to see how grayscale (Dot Gain 20%) images look in a non colour managed web browser. First I had to find a non colour managed browser. I tried Firefox with an Adobe RGB image first, but it rendered it with the correct colours, so I guess that Firefox 3.5 is now colour managed.

So I tried IE8, which of course isn't colour managed (given that FF has only recently implemented colour management, I wouldn't expect IE to implement it for a couple of years). There was a very noticeable difference between the grayscale image in IE and in Photoshop - much more contrasty in IE, with the darker tones very dark in IE.

After this I did more work on trying to work out the ImageMagick commands equivalent to my Imagick commands. I needed to get ImageMagick to convert the image to sRGB. But again, I needed a way to check whether the conversion had been successful. I don't have Internet Explorer on Ubuntu, so I couldn't use that. I tried Conkeror, but it wouldn't do anything when I dragged an image onto it.

I tried Konqueror, which loaded the image at half its actual size (weird, since it wasn't resizing it to fit in the screen or anything, just halving its size). I knew that Firefox was colour managed, so I tried opening an image in Firefox and also the 'Eye of GNOME' default image viewer in Ubuntu. It seems that the Eye of GNOME isn't colour managed, which is surprising for a specialised image viewing program. Anyway, this at least gave me a way to compare a colour managed and non-colour managed version of an image.

I tried removing the sharpening before resizing the image, and this massively sped up the process - even though ImageMagick was now also converting the image to sRGB (from Adobe RGB). I didn't time it, but it took about 30 seconds compared to the nearly 7 minutes it was taking before.
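
Roughly what the quicker version looks like - a sketch rather than my exact command, with the profile and image paths made up:
$HOME/apps/ImageMagick/bin/convert \
-limit memory 64 \
-limit map 128 \
$originalImage \
-profile sRGB.icc \
-set option:filter:filter Lanczos \
-set option:filter:blur 0.8 \
-resize 1024x720 \
-adaptive-sharpen 0x.6 \
$HOME/resizedImage.jpg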

I then tried tweaking the sharpening settings to try and get the output as good as it was when using Imagick, but changing the adaptive sharpen parameters didn't seem to make any difference. Adaptive Sharpen must be working though, since when I was using it on the image before resizing as well as after resizing (as well as taking a lot longer), the final image was sharpened a lot more than the same one processed with unsharp (instead of adaptive sharpen).

In the evening I finished watching the Mysterians with Mauser and L, and watched Power Rangers with L. I played on Banjo Kazooie Nuts & Bolts, but it was really hard to control with the Xbox steering wheel (and pedals). Then I did some more website stuff.

The weather today was overcast and rained all day.

Food
Breakfast: Orange Marmalade Toast Sandwich; Cup o' Tea.
Lunch: Grated Mature Cheddar with Salad Sandwich; Packet of Prawn Cocktail Flavour Crisps; Clementine; Grapes; Mint KitKat; Cup o' Tea; 2x Pieces of Sainsbury's Mint Creme Chocolate.
Dinner: Breaded Fish Portion; Tartar Sauce; Peas; Tinned New Potatoes; Ground Black Pepper. Pudding was some Apple Crumble that L made at School today with Custard. Coffee; 2x Pieces of Sainsbury's Truffle Chocolate; Piece of Sainsbury's Caramel Chocolate.
Supper: Shortbread finger; Dark Chocolate Digestive; Cup o' Tea.

Thursday, 21 January 2010

Processing photos

This morning I carried on processing my photos of Ludlow from the third day of our Shropshire Holiday back in July.

I decided to check the GPS co-ordinates had been correctly added to an image I'd just finished processing, and annoyingly they hadn't been - they pointed somewhere else. I checked various other images to try and find at what point the GPS co-ordinates had gone awry, but after checking about 10 images, which all had correct GPS co-ordinates, I got up to the problem image I'd just processed.

So it seemed that just this one image (out of the ones I'd processed so far) had got the wrong GPS co-ordinates. But weirdly, when I checked the NEFs that the image had been built from, the NEFs had the correct GPS co-ordinates. So it was like Photoshop had somehow done something to mess up the GPS co-ordinates (I would guess they were about 20 miles out).

Now I had to try and get the correct GPS co-ordinates into the jpg and psd files. Adobe Bridge doesn't have anywhere to let you update GPS co-ordinates, so I couldn't use that. RoboGeo doesn't work with psd files, so I couldn't use that. I couldn't use 'copy from file' with exiftoolGUI as that just allows you to import the whole XMP block, and I didn't want to overwrite the existing XMP.

So I decided to use exiftool with -tagsFromFile, though that wasn't that easy. First I had to make a list of all the GPS tags to be copied, which I got from the exiftool website, but then I had to format them for use with exiftool. After that I had trouble with it saying that it couldn't find the file I wanted to update. So I was messing about with the carets ^ for a while trying to get it working, which I did eventually.

This was the final command (for exiftool on windows):
exiftool.pl -tagsfromfile ^
^"E:\path\to\file-to-be-copied-from.NEF^" ^
-GPS:GPSVersionID ^
-GPS:GPSLatitudeRef ^
-GPS:GPSLatitude ^
-GPS:GPSLongitudeRef ^
-GPS:GPSLongitude ^
-GPS:GPSAltitudeRef ^
-GPS:GPSAltitude ^
-GPS:GPSTimeStamp ^
-GPS:GPSSatellites ^
-GPS:GPSStatus ^
-GPS:GPSMeasureMode ^
-GPS:GPSDOP ^
-GPS:GPSSpeedRef ^
-GPS:GPSSpeed ^
-GPS:GPSTrackRef ^
-GPS:GPSTrack ^
-GPS:GPSImgDirectionRef ^
-GPS:GPSImgDirection ^
-GPS:GPSMapDatum ^
-GPS:GPSDestLatitudeRef ^
-GPS:GPSDestLatitude ^
-GPS:GPSDestLongitudeRef ^
-GPS:GPSDestLongitude ^
-GPS:GPSDestBearingRef ^
-GPS:GPSDestBearing ^
-GPS:GPSDestDistanceRef ^
-GPS:GPSDestDistance ^
-GPS:GPSProcessingMethod ^
-GPS:GPSAreaInformation ^
-GPS:GPSDateStamp ^
-GPS:GPSDifferential ^
^
-xmp-exif:GPSAltitude ^
-xmp-exif:GPSAltitudeRef ^
-xmp-exif:GPSAreaInformation ^
-xmp-exif:GPSDestBearing ^
-xmp-exif:GPSDestBearingRef ^
-xmp-exif:GPSDestDistance ^
-xmp-exif:GPSDestDistanceRef ^
-xmp-exif:GPSDestLatitude ^
-xmp-exif:GPSDestLongitude ^
-xmp-exif:GPSDifferential ^
-xmp-exif:GPSDOP ^
-xmp-exif:GPSImgDirection ^
-xmp-exif:GPSImgDirectionRef ^
-xmp-exif:GPSLatitude ^
-xmp-exif:GPSLongitude ^
-xmp-exif:GPSMapDatum ^
-xmp-exif:GPSMeasureMode ^
-xmp-exif:GPSProcessingMethod ^
-xmp-exif:GPSSatellites ^
-xmp-exif:GPSSpeed ^
-xmp-exif:GPSSpeedRef ^
-xmp-exif:GPSStatus ^
-xmp-exif:GPSDateTime ^
-xmp-exif:GPSTrack ^
-xmp-exif:GPSTrackRef ^
-xmp-exif:GPSVersionID ^
^
^"E:\path\to\file-to-be-updated.psd^"
Note that lines that start with a caret ^ have a space before the caret, while lines that end with a caret have no space after it.

In the afternoon and evening I was still processing the images, but finally managed to get them finished in the evening.

In the evening I also watched an episode of Power Rangers with L and part of the Mysterians with L and Mauser. I went on the pinternet a bit and did a backup.

The weather today was overcast most of the day, but was also sunny for a while in the afternoon.

Food
Breakfast: Bowl of Pecan Crunch Oat Cereal; Cup o' Tea.
Lunch: Bowl of Minestrone fake cup a soup; Slice of Toast; Ibuprofen; Banana; Mint KitKat; Cup o' Tea.
Dinner: Meatballs; Spaghetti; Tomato Bolognese sauce stuff; Green Beans; Ground Black Pepper. Pudding was a Chocolate Eclair. Coffee; 2x Pieces of Sainsbury's Truffle Chocolate; 2x Pieces of Sainsbury's Caramel Chocolate.
Supper: Shortbread finger; Dark Chocolate Digestive; Cup o' Tea.

Wednesday, 20 January 2010

Photo Processing

Today I was processing photos from the third day of our Shropshire holiday back in July. The ones I was processing were the ones I was adding metadata to the last couple of days.

In the evening I also watched an episode of Power Rangers with L.

I didn't manage to finish processing the photos - still got probably another day's worth of processing to finish this batch.

In the post I received my Goldfish Perceptions of Pacha album, which is weird as when I try to get the downloads that you're meant to be able to access as soon as you've bought the album, it still says that my credit card payment is awaiting processing.

The weather today was overcast all day.

Food
Breakfast: Bowl of Pecan Crunch Oat Cereal; Cup o' Tea.
Lunch: Peppered Ham with sliced Cherry Tomatoes Sandwich; Banana; Slice of Home-made Treacle Tart; Cup o' Tea.
Dinner: Slice of Chicken Pie; Mashed Potato; Baked Beans; Ground Black Pepper. I didn't have any pudding. Coffee; Piece of Fairtrade Fudge.

Tuesday, 19 January 2010

Metadataring still

Today I was still adding metadata to the images from Ludlow on the third day of our Shropshire Holiday back in July.

Some useful sites I used for getting info for the captions/descriptions were:
Of course, I also used Wikipedia quite a bit, and did a lot of reading of various Wikipedia articles.

For some reason I found that quite a few of the images had been geo-coded with a wrong location, so I had to re-geocode them, using RoboGeo and Google Earth.

In the evening I also watched an episode of Power Rangers with L.

I did actually manage to finish adding metadata to all the Ludlow photos by the end of the day. Now they just all need processing.

The weather today was overcast all day.

Food
Breakfast: Strawberry Jam Toast Sandwich; Cup o' Tea.
Lunch: 2x Cheese on Toasts; Cherry Tomatoes; Clementine; A few Grapes; Cup o' Tea.
Afternoon Snack: Dark Chocolate Digestive; Shortbread Finger; Cup o' Tea.
Dinner: Sloppy shop-bought Shepherds Pie; Carrots; Green Beans; Grated Mature Cheddar Cheese; Tomato Ketchup; Ground Black Pepper. Pudding was Tinned home-made Treacle Tart with Custard. Coffee; Piece of Turkish Delight.
Supper: Slice of home-made Treacle Tart; Cup o' Tea.

Monday, 18 January 2010

Metadataring

Today I was adding metadata to the images from the Shropshire holiday in July that I processed the other day.

That took all morning and part of the afternoon. When I'd done that I thought I should start adding metadata to the rest of the images that I'd taken on the 3rd day of our Shropshire holiday in July. It's much easier to add metadata to the RAW files first (before processing them), as then you don't have to bother so much with adding metadata to the converted files as well, since most converted files will already contain the metadata from the RAW files.

A couple of useful websites I found when trying to find information for photo descriptions/captions were

In the evening I also watched a couple of episodes of Power Rangers with L and The Thing From Another World with Mauser.

The weather today was mainly overcast with the sun appearing every now and then. After sunrise there was a minute or so where the sun lit up the bottom of a large cloud with a nice orange glow, and the same thing at sunset as well.

Food
Breakfast: Bowl of Choco Moons Cereal; Cup o' Tea.
Lunch: Peppered Ham with Sweet & Crunchy Salad Sandwich; Packet of Chilli flavour Doritos; 2x Clementines; Cup o' Tea.
Afternoon Snack: Dark Chocolate Digestive; Shortbread Finger; Cup o' Tea.
Dinner: 1½ Scotch Eggs; Potatoes; Sweet & Crunchy Salad; Ground Black Pepper. Pudding was Tinned Mixed Fruit with a Trifle Sponge and Cold Custard. Coffee; 3x pieces of Sainsbury's Caramel Chocolate; Piece of Sainsbury's Truffle Chocolate.
Supper: Dark Chocolate Digestive; Shortbread Finger; Cup o' Tea.

Sunday, 17 January 2010

Cutting out Pogs in Photoshop

This morning Clare stayed at home to look after G-Dad, L went to church with McRad, and me and Mauser went to the Catholic Church, as it was a Churches together service or sumat.

We got there about 15 minutes early as I didn't realise it was much closer than 'normal' church, but there were hardly any seats left. Quite a few people who arrived later than us had to stand up, so lucky we were so early.

There were about 400-500 people there, and the priest said that it was about 100 more than they normally have, plus they would normally have children in for part of the service (because there wasn't any room all the children stayed out at Sunday School for mass), so it must normally be pretty packed.

The Priest was Irish, which seemed quite stereotypical for a Catholic Priest. What wasn't stereotypical is that while they had sung responses, they had a folksy sounding band playing music for them, instead of the typical chant.

After Church I started cutting out some Pogs in Photoshop. After lunch I carried on cutting out Pogs in Photoshop.

I had received a reply back from WebFaction about how to monitor the memory usage of a program that doesn't run in the background, but the script they sent me was pretty useless as it just printed the memory usage at the point in time that you run it (so the same as ps).

Doing some googling I managed to cobble together a simple bash script that would record the output of ps (piped into grep for just the program I wanted to monitor) every half second for 30 seconds. Far from perfect, but good enough to get the stats on memory usage that I needed.
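
It was something along these lines (a rough reconstruction rather than the exact script, with 'convert' as the example process to watch):
#!/bin/bash
# log the ps output for the convert process roughly every half second for 30 seconds
for i in $(seq 1 60); do
    ps aux | grep '[c]onvert' >> memory-usage.txt
    sleep 0.5
done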

Running this script, I found (unsurprisingly) that ImageMagick used about the same amount of memory for resizing an image as php with the imagick extension does. Doing some googling I found this thread on the ImageMagick forums that explains how to limit the memory being used by ImageMagick.

So I added -limit memory 64 -limit map 128 to my ImageMagick parameters, and re-ran the memory monitoring script, and while ImageMagick took a while longer to run (though still finished within 30 seconds), it now didn't use tons of memory.

Next I tried to find out how to do the same thing in Imagick, and found this page (google cache as page doesn't seem to be loading properly, scroll up a bit to see the relevant section). That page says you should use
/*** Set memory limit to 8 MB ***/
$im->setResourceLimit( Imagick::RESOURCETYPE_MEMORY, 8 );

But when I modified my script to include the Imagick commands to limit memory usage, restarted PHP, and then ran my PHP script again, I could see (using top to monitor the php processes) that the memory usage was still going up to 467MB (or sumat similar).

I tried downloading the latest Imagick and the latest ImageMagick, but still the memory was being eaten up. So I filed a bug report for Imagick. Hopefully they will fix it shortly or tell me I'm doing something wrong and how to fix it, otherwise I'll have to make command line calls to ImageMagick from my PHP script.

The problem with doing that is that it's slower than Imagick, and I'll also have to learn the ImageMagick command line equivalents of the Imagick methods and options I'm using.

The weather today was sunny most of the day. There was a cloudless sunset; the sky wasn't particularly interesting, but seeing the sun is always quite nice (you can look at the sun without it burning your retinas when it's setting).

In the evening I watched Hapkido with Mauser, which was good, lots of action, and a young Sammo.

Food
Breakfast: ½ Grapefruit; Tangerine Marmalade on Crust of Toast; Cup o' Tea.
Dinner: Stir Fry Vegetables; Noodles; Soy Sauce. Pudding was 2x Oven heated Apple Pies with Custard. Coffee.
Tea: Peppered Ham with Ground Black Pepper and Sweet & Crunchy Salad Sandwich; Clementine; Double Chocolate Muffin that L made at school; Cup o' Tea.

Saturday, 16 January 2010

Processing photos

Beans-a-Zoy!

Today I was just processing some more photos, mainly from the 3rd day of our Shropshire holiday, back in July.

The weather today was overcast all day and it also rained a bit.

Food
Breakfast: Bowl of Choco Moons Cereal; Cup o' Tea.
Lunch: Seriously Strong Farmy Cheddar Cheese with Sweet & Crunchy Salad Sandwich made with Fresh Bread-Maker-made Bread; Strawberry Jam on Crust of Fresh Bread-Maker-made Bread; Banana; Pocky Sticks; Cup o' Tea.
Dinner: Slice of Home-made Pizza; Peas; Mashed Potato; Ground Black Pepper. Pudding was a Vanilla and Chocolate Crisp Balls Muller Corner. Coffee; Piece of Sainsbury's Truffle Chocolate; Piece of Sainsbury's Caramel Chocolate.

Friday, 15 January 2010

Solving a problem so I can solve a problem

This morning I checked my emails, then posted on the Sitepoint forums to see if I could get any help with the problem I'm having with PHP dying. I decided to ask on Sitepoint since I hadn't had any more replies on the Nginx mailing list; however, the Sitepoint forums did look very Apache-centric.

After that I tried to see if I could find a good Wordpress theme for my pano website, but I couldn't. I wanted one with a white background and a wide enough post size to fit in the panorama preview images, which I currently had set at 640px wide. A lot of themes seemed to have 2 sidebars with the same information in both sidebars, so that ruled them out.

I did find a theme that was quite nice, though the post width was a bit thin and couldn't fit the full width of the image (the theme automatically resized the image to fit the post - browser resized, not server resized). I thought maybe I could change the CSS to make the post width wider, but then I noticed a link at the bottom of the theme to an acne website.

Obviously the theme was one of these sponsored ones where a company pays the theme creator to have a link to their website in the theme somewhere. I didn't really want a link to an acne website from my pano website, not because it's not relevant, but rather because linking to rubbish sites can harm your own site's SEO.

So I went back to the theme I found yesterday. The main problem with this theme is that it's blue, and so makes my blue coloured twilight photos look rather flat and boring (well, okay, more flat and boring than they would do on a white or black background). It has a couple of other problems, e.g. a 'read more' button when there isn't any more of the post to read, and the hierarchical categories don't seem to display (at least certainly not in a hierarchical manner).

But I guess I'll just get the site online with the current theme and then worry about fixing these things later, otherwise I'll be in the same situation I'm normally in, where making the fixes turns out to be much more complicated and time consuming than I envisaged and delays the site from going online.

I tried to get a pano working using the FPP show_pano.swf file, which is meant to do some flash version detection before loading the pano, so that users with old versions of flash will get a prompt to upgrade rather than a non-working pano. But I couldn't work out how to pass the panorama parameters as a query string, so (after googling without success) I asked in the FPP forum.

I processed a couple of non-pano images from my walk (just over a week ago now), and then tried geo-coding all the images from the walk in Robogeo. Before doing the actual geo-coding, I checked that the co-ordinates to be geo-coded were correct. They weren't, as the camera time was an hour into the future. So I told Robogeo to change the time for each file to minus 1 hour, then it was lunch time.

After lunch Robogeo was still applying the time change to the images - or was it? Unfortunately the Robogeo window just went white, and maxed out one CPU core, so there wasn't any way to tell if it had just crashed or was actually doing anything. Just have to wait and be patient.

Yesterday evening I put a bid in (via goofbay) on a virtually new Canon 500mm/4 L IS lens with Canon 1.4x TC II for £2850; the price on ebay at that time was £2200. The normal second hand selling price on ebay for the lens alone is around £4000. But when I'd gone to bed I thought about how I didn't really have any money, and that I wouldn't have much time to use it, unless I wanted to do less macro or panoramic photography.

So this morning I removed my snipe, though actually as it happens, I wouldn't have won anyway as the end price was £3125. A good deal for the winner. The reason for the cheap price I think is that the seller only had 9 feedback, and none of it recent. Still, not as good as the £2500 and £2200 (or sumat like that) 500/4s that a seller with 0 feedback sold a year or two ago (and yes, positive feedback was left for both transactions).

I thought when the recession started that second hand lenses (and new lenses to some extent) would become a lot cheaper, as people wouldn't be able to afford to bid so much, and would also sell off their lenses to try and get money for things like paying the mortgage. But that hasn't happened: new lens prices have increased quite a bit (thanks to the strong JPY compared to the weak GBP), and I would say that not much has happened to second hand prices, though they may have increased a bit as well.

I guess maybe the low interest rates have meant those on tracker mortgages actually have more money to spend.

Robogeo did finish updating the times on the images, so it hadn't crashed, but then I had another long wait while it did the actual geo-coding. And thanks to the high CPU and disk usage by the geo-coding process, there wasn't a lot I could do while waiting for the geo-coding to finish.

While I was waiting for Robogeo, I thought I might as well scan some pogs in on Mauser's comp. But unfortunately Mauser is now using Windows 7, and I couldn't get the scanner drivers to install properly on it.

Eventually Robogeo finished, and I had received a reply on the sitepoint forums about the problem I was having with PHP dying. So I started Ubuntu up and tried their suggestion, and found that the php imagick extension was causing the problem.

So I recompiled PHP without the imagick extension. While I was waiting for that I cut out some Pogs in Photoshop.

When L got back from School I asked him to make Mauser's PC go into Vista so I could scan some pogs in. He did, then I scanned the pogs in. When I'd done that, I got Bo to put the pogs away in the folders.

PHP finished recompiling, so I installed the imagick extension, and my page now worked, yay! Using top I monitored the PHP memory usage, and found it went up to nearly 900MB, then went back down to about 450MB. I also found that it would alternate which php process handled my request, so if I uploaded the same file twice, I would end up with 2 php processes using 450MB each!

After dinner I watched a couple of episodes of Power Rangers with L.

Doing more testing on my image upload/resize script, I found I was occasionally getting messages (warning popups) from Ubuntu that I didn't have much free space left. After deleting the duplicate uploads and emptying the trash, I still only had just over 1GB of free space, so I shut down the Ubuntu VM and modified the hard drive size in the VMWare Server admin control panel web interface.

It took me quite a while to get the VM to boot from the gparted ISO, but I managed it eventually. Once in gparted I then had to play the rearranging partitions game. While gparted rearranged and resized the partitions I played on Dirt a bit.

When the Ubuntu VM had finished having its new drive space added, I did some more work on trying to see where all the memory was going. I found that just loading the image with imagick took about 367MB (so about double the uncompressed image size). Loading the image and resizing it took about 487MB. Loading the image, sharpening it, resizing it, and sharpening it took about 967MB.

I wanted to see how just loading and resizing the image with ImageMagick from the command line would compare. So I looked to try and find out what the equivalent command for ImageMagick would be. Then I needed to monitor how much memory ImageMagick was using when it did the resize. But I couldn't find out how to monitor the memory usage of a process that wasn't already running, so I asked WebFaction how to do this (since they had said they'd try and help if I had any questions).
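The command line equivalent of the basic load-and-resize is something along these lines (the file names and dimensions are just placeholders):

convert input.jpg -resize 1200x1200 output.jpg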

I did a backup and also noticed a new Canon 500mm/4 IS was on ebay for £3500 (so nearly £400 more than the one that ended earlier today, which had also included a 1.4x TC), and it already had a bid on it as well. Ebay is quite weird in how variable prices can be.

The weather today was rainy in the morning, then overcast the rest of the day. Nearly all the snow is gone now.

Food
Breakfast: Bowl of Choco Moons Cereal; Cup o' Tea.
Lunch: Seriously Strong Farmy Cheddar Cheese with Salad Cream and Sweet & Crunchy Salad Sandwich; Banana; Clementine; Cup o' Tea.
Dinner: Breaded Fish Portion; Chips; Peas; Ground Black Pepper; Salt; Tartar Sauce. Pudding was a double Choc Chip Muffin that L made at school today. Coffee; Piece of Sainsbury's Truffle Chocolate; Piece of Sainsbury's Caramel Chocolate.

Thursday, 14 January 2010

Panoing

This morning and the first part of the afternoon I was processing the last two panos that I took a few days ago and didn't manage to process yesterday.

The last pano took quite a bit of time to do, as unfortunately the brightness of the images was quite variable. For one image I think this was because the battery ran out while taking the image, so the exposure time was quite a bit less than it should have been (I didn't realise this at the time).

Then the images taken after the battery ran out were quite a bit darker, as they were taken a few minutes after the other images (I didn't have a spare battery, so I just tried to warm the flat battery up for a few minutes before putting it back in the camera and finishing the image sequence). Because the light was fading fast, those few extra minutes between shots meant the post-empty-battery shots were quite a bit darker (probably about 2 stops) than the pre-empty-battery ones.

So I had to do quite a bit of work on the pano trying to get the light levels relatively equal across the images. This meant processing the pano multiple times using images with different RAW exposure compensation applied until it looked 'okay'.

The other pano I had to process didn't have any features on the zenith shot to align it with the other shots. But it did have a telephone (or maybe electric) line going across the zenith, so it needed to be aligned perfectly. I managed to get it looking 'okay' by doing some copy-paste-warping.

I also finished off the pano I was working on yesterday where PTGUI didn't want to render the handheld nadir shot. I just removed the handheld nadir shot from the PTGUI project file, added it again, added some control points again, and this time when I ran the optimiser it worked okay and didn't make the handheld nadir disappear.

When I'd finished getting the panos processed I did a backup, then started up my Ubuntu Virtual Machine. I tried to get Nginx et al running at startup. I had thought that by putting the Nginx init script in /etc/init.d it would start up automatically, but this wasn't happening. Doing some googling, the answer to getting a program to start up with Ubuntu was to edit /etc/rc.local and add the command you wanted run to that file (before the line that says exit 0).
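So /etc/rc.local ended up looking roughly like this (the init script path is just where I happen to keep nginx - treat it as illustrative):

#!/bin/sh -e
# Commands in this file are run as root at the end of boot
/home/djeyewater/webapps/nginx/nginxctl start
exit 0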

I added the commands for the programs that I wanted started up under my username to a file which I called start-progs.sh.

I then edited my crontab (crontab -e) and told it to run start-progs.sh every 30 minutes, and also @reboot (this is the same setup as I have on the webserver). But when I restarted, the nginx I had added to /etc/rc.local was running okay, but the programs in my start-progs.sh weren't being started.
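The crontab entries were along these lines (the path to start-progs.sh is an assumption):

*/30 * * * * /home/djeyewater/start-progs.sh
@reboot /home/djeyewater/start-progs.sh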

Doing some more googling, I read that Ubuntu needs there to be a newline after the last command in your crontab. Editing the crontab, it looked like there was a new line after the last command (which was the @reboot one), but the editor said the document had 2 lines, so I added another new line just to be sure.

I saved the crontab, and restarted again, but still the programs that the crontab was meant to start weren't started. So I tried running my start-progs.sh file manually, and found it didn't work - I had forgotten to set execute on it! So after making start-progs.sh executable it all worked okay, and the crontab will now run the script that will start the programs each time Ubuntu is started.

I tried to get rid of the annoying backup tilde files that are created in Ubuntu whenever I edit a file, and found that actually it is not Ubuntu that creates these files, but gedit. To stop gedit creating backup tilde ~ files, in gedit go to Edit > Preferences, then in the preferences dialog go to the Editor tab and untick 'Create a backup copy of files before saving'. Easy!

I got my pano website and wordpress working on my Ubuntu machine, and then it was dinner time.

After dinner I watched a couple of youtube videos with Mauser and L. I also watched Mauser go on Dirt and went on Dirt a bit myself.

I did a bit of work on my pano website and watched a couple of episodes of Power Rangers with L as well.

Food
Breakfast: Tangerine Marmalade Toast Sandwich; Cup o' Tea.
Lunch: 1½ Cheese on Toasts; Apple; Clem 'n' tyne; Cherry Bakewell; Cup o' Tea.
Dinner: 2x Posh delee sausages; Mashed potato; baked beans. Pudding was Rice Pudding with Jam. Coffee.

Wednesday, 13 January 2010

Got some stuff done!!!!

Today I was just processing some panos that I took a few days ago when it was snowy. I actually managed to get 5½ done!

I tested my MT-24EX Macro Twin Flash that arrived back from Fixation yesterday, after having the broken plastic foot on it replaced, and it was working okay.

Also in the evening I watched a couple of episodes of Power Rangers with L.

The weather was a mixture of snow and sleet all day.

Food
Breakfast: Tangerine Marmalade Toast Sandwich; Cup o' Tea.
Lunch: Mature Cheddar Cheese with Sweet & Crunchy Salad Sandwich; a few Grapes; Clementine; Slice of Home-made Flapjack; Chocolate Waver Bar; Cup o' Tea.
Dinner: Slice of Chicken Pie; Roast Potatoes; Carrots; Swede; Gravy; Peas. Pudding was Coffee Custard with Tinned Mandarin Segments. Coffee; Cherry Liqueur; Piece of Sainsbury's Caramel Chocolate.

Tuesday, 12 January 2010

Trying to get PHP working

This morning I was trying to get PHP running on Ubuntu. I installed spawn-fcgi, and then copied the CentOS init script I was using on the webserver for starting and stopping PHP.

The first thing I needed to know was where the source function library was in Ubuntu. The CentOS init script had '/etc/rc.d/init.d/functions', but this doesn't exist in Ubuntu. A quick google and I found the Ubuntu version was '/lib/lsb/init-functions'.

I tried starting PHP using the script, but got the error
$Starting fcgi: ./webapps/spawn-fcgi/php-fcgictl: 62: daemon: not found
Another quick google and I found that Ubuntu doesn't use daemon, but instead start-stop-daemon. Looking at the Lighttpd init script for Ubuntu, it seemed that I could actually use just the plain spawn-fcgi command without having to put daemon or start-stop-daemon in front of it.
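So the start section of my init script boiled down to calling spawn-fcgi directly, roughly like this (the address, port, child count and paths are placeholders rather than my real settings):

#!/bin/bash
# Use the Ubuntu LSB init functions instead of the CentOS ones
. /lib/lsb/init-functions
# Spawn the PHP FastCGI processes directly - no daemon or start-stop-daemon needed
spawn-fcgi -a 127.0.0.1 -p 9000 -C 3 \
  -P /home/djeyewater/webapps/spawn-fcgi/php-fcgi.pid \
  -f /home/djeyewater/webapps/php/bin/php-cgi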

I could now get PHP started okay, but when I tried to stop it, I would get the message
$$Stopping fcgi:
And the PHP processes wouldn't be stopped. I did some googling, but couldn't work out how to get it working, so I posted to the Ubuntu forums to try and get some help.

While I was waiting for a reply I thought I might as well make sure that PHP was working okay. I started up Nginx but got the following:
djeyewater@rusty-ubuntu:~$ ./webapps/nginx/nginxctl restart
* Stopping Nginx Server... [ OK ]
* Starting Nginx Server... [warn]: duplicate MIME type "text/html" in /home/djeyewater/webapps/nginx/conf/nginx.conf:21
[warn]: conflicting server name "www.photosite.com" on 0.0.0.0:7776, ignored
[warn]: conflicting server name "photosite.com" on 0.0.0.0:7776, ignored
[warn]: conflicting server name "static1.photosite.com" on 0.0.0.0:7776, ignored
[warn]: conflicting server name "static2.photosite.com" on 0.0.0.0:7776, ignored
Googling didn't come up with anything much useful, but then I remembered that I'd had this problem before - the problem is that when you edit a file, Ubuntu creates a backup of the file before you edited it, with a tilde ~ on the end of the file name. So the sites folder that nginx was loading the site configs from had a file 'photosite' and a backup of this file called 'photosite~'. Obviously nginx loads both files, and so you get the conflicting server name error as both files contain the same server names.

So to fix it, you just need to make sure you delete the backup files so Nginx only loads one version of each site config.
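Clearing the stray backups out is just a case of something like this (the exact path of the sites folder is an assumption):

rm /home/djeyewater/webapps/nginx/conf/sites/*~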

After lunch I tried importing the mysql databases from the web server into mysql on my Ubuntu Virtual Machine. But I kept getting an error about foreign key constraints. I looked at the table in question in the database that had been imported, and indeed, it contained values that violated the foreign key constraints.

I looked at the same table on the server, and it had different values that were okay. So I looked in the sql I was trying to import, and that had the correct values. It seemed like MySQL was assigning the rows that it was importing an auto increment id rather than the actual id value that was specified in the SQL.

Then I realised that the problem was down to a trigger I had on the table - the trigger was setting NEW.id to the value returned by a SELECT statement. This works fine in normal use, as you're not going to be inserting rows into this table with a specific id. But of course, when you're importing the table, the value returned by that SELECT statement is NULL, and since the dump has already set the table's auto increment counter to its value from the full table, the rows get inserted with the wrong ids, causing the foreign key constraint to fail when you try to apply it to the table.

e.g. On my table I had 10 rows, so on the server they would have ids of 1-10, but when I tried to import the rows, they would start off at the next auto increment number, giving me rows numbered 11-20 instead.

The solution was just to modify the trigger to only set NEW.id if NEW.id starts off as NULL.
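The modified trigger follows this sort of pattern (the trigger name, table names, and SELECT here are made-up stand-ins for my actual schema), fed into mysql from the shell:

mysql --socket=$HOME/webapps/mysql/mysql.sock mydb <<'SQL'
DROP TRIGGER IF EXISTS set_item_id;
CREATE TRIGGER set_item_id BEFORE INSERT ON items
  FOR EACH ROW
  SET NEW.id = IFNULL(NEW.id, (SELECT MAX(id) + 1 FROM id_source));
SQL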

Eventually I managed to get a local copy of my website working, so I tried uploading a large image to see why it was using so much memory on the webserver. But the PHP process processing the request died. I tried again, and got the same thing. So after googling with no success I posted to the Nginx forum to see if I could get some advice there.

In the evening I watched 3 episodes of Power Rangers with L, and 'Hell is a City' with Mauser.

The weather was overcast all day.

Food
Breakfast: Tangerine Marmalade Toast Sandwich; Cup o' Tea.
Lunch: Bowl of Vegetable Fake Cup a Soup; Slice of Toast; Banana; Pear; Slice of Home-made Flapjack; Chocolate Waver Bar; Cup o' Tea.
Dinner: Chilli con Carne; Pepper; Rice; Cheese flavour Salsa Dip; Grated Mature Cheddar Cheese; Sweet & Crunchy Salad. Pudding was a Jam & Cream Doughnut. Coffee; Cherry Liqueur; 2x pieces of Sainsbury's Truffle Chocolate.

Monday, 11 January 2010

Today I was vectoring a picture of Pogman slamming some pogs, and also trying to get my Ubuntu Virtual Machine set up more like the web server my websites are actually hosted on.

After getting an nginx front end and nginx back end installed, I found that the front end wasn't proxying to the back end properly, so I posted on the nginx forum (mailing list) to see if I could get some advice on what I was doing wrong (I copied the syntax from the default file for proxying to an apache backend).

While I was waiting for a reply on that I thought I might as well install PHP. To make the setup similar to how the webserver was set up, this meant I also had to install MySQL. I did have some trouble installing MySQL compared to when I installed it on the webserver. When running mysql_install_db, I had to pass it the my.cnf file I wanted it to use as a parameter, otherwise it would use the one in /etc/mysql, which obviously wasn't what I wanted.

I didn't have to do this when installing mysql on the web server. Also, on the webserver I could access my installation of mysql by just typing mysql at the command terminal. In my Ubuntu Virtual Machine, I had to also pass the path of the socket that mysql was listening on as a parameter, otherwise I would connect to the normal Ubuntu mysql installation.
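On the VM that meant commands along these lines (the paths to my my.cnf and the mysql socket are illustrative, not the exact ones I used):

mysql_install_db --defaults-file=$HOME/webapps/mysql/my.cnf
mysql --socket=$HOME/webapps/mysql/mysql.sock -u root -p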

On the webserver, WebFaction had made me a 'PHP Stack' running on Apache, so when I installed nginx and PHP, I just copied the configure line from the 'PHP Stack' version of PHP, which kept most of the php configure options pointing to directories in the 'PHP Stack'.

But on my Ubuntu virtual machine, I didn't have a 'PHP Stack' with all the accompanying libraries installed. So I had to try and get the libraries needed for PHP to build, and I spent most of the day downloading and installing them. While I was waiting for them to configure, make, and make install, I worked on doing a vector of pogman slamming some pogs.

When building the libraries required for an install of PHP similar to what I had installed on the webserver, I configured them all with the defaults except I changed them to install in a folder in my home directory so they would be easy to find.
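Each library got roughly the same three steps, just with the install prefix pointed at my home directory (the exact folder name here is illustrative):

./configure --prefix=$HOME/webapps/libs
make
make install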

Hmm... I did have loads of other stuff written here about building PHP from source and all the various errors I came across, but it seems that Blogger didn't actually save the post when it said that it had. ANNOYING

The weather was overcast all day and also it snowed a bit.

Food
Breakfast: Bowl of Chocolate Crunch Oat Cereal; Cup o' Tea.
Lunch: Bowl of Minestrone Fake Cup a Soup; Slice of Toast; Pear; Piece of Dried Mango; Cup o' Tea.
Dinner:

Sunday, 10 January 2010

Pogging

This morning I didn't go to Church because I was looking after Grandad again. I spent most of the day cutting out Pogs in Photoshop (had some weird shape Croky Caps that took a long time to cut out).

I also watched a couple of episodes of Power Rangers with L and watched Christo's Valley Curtain on youtube.

I got an email with a link to some work by photographer Peter Funch - he takes lots of photos from one point, then combines the images in Photoshop to get lots of people doing or wearing the same thing in one photo.

In the evening I made a squidoo lens for pogs, milkcaps, tazos, and flippos.

The weather was overcast all day with occasional snow, though the temperature was above freezing so quite a bit of the snow melted (not that we had loads of snow in the first place, maybe about 5cms).

Food
Breakfast: Tangerine Marmalade Toast Sandwich; Cup o' Tea.
Dinner: Southern Fried Chicken; Flavoured Rice; Tiger Bun with marg. Pudding was sliced Kiwi Fruit with Strawberry flavour Whip. Coffee; 2x pieces of Sainsbury's Truffle Chocolate.
Tea: Honey and Mustard Ham with Sweet & Crunchy Salad Sandwich; Apple; Lemon Slice; Cup o' Tea.

Saturday, 9 January 2010

Uploading images

I spent quite a bit of today just uploading photos to my photo website. But when I uploaded a pano image, it caused my account to go way over its memory limit, and so all my processes got terminated. The image was 10616 x 5308 pixels (161MB uncompressed), and the php process that was resizing the image spiked to 879MB memory usage before my webhost killed it (along with the rest of my processes).

After restarting the processes needed for my websites to function, I thought I'd better hold off uploading any more images for the moment. My D200 images are 3872 x 2592 pixels (28.7MB uncompressed), so if the image resizing process uses 5½ times as much memory as the uncompressed image takes up, even resizing a D200 image would put me quite a lot over my memory limit.

Now, I'm sure I should be able to resize my images on the server without the resize process taking up 5½ times as much memory as the uncompressed image. So I need to do some testing in my local environment to try and find why the resize process is using so much memory. I found that the top -d 1 command can give you a useful output on how much memory different processes are using (-d 1 tells it to refresh the data every second).

But I couldn't really run the test on my Ubuntu virtual machine as I was using apache + suPHP, whereas the webserver is running Nginx + php fcgi. So I needed to reconfigure my Ubuntu Virtual Machine to more accurately mirror the setup on the webserver (my current Ubuntu Virtual Machine configuration is based on the setup that my old webhost evohosting used).

I started by changing Apache to listen on port 8080 instead of 80, and then disabled all the sites on it (sudo a2dissite sitename). I downloaded the latest stable version of nginx, and that's as far as I've got so far; I'll have to continue tomorrow.
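The Apache side of it was just along these lines (the site name is a placeholder):

# in /etc/apache2/ports.conf, change "Listen 80" to "Listen 8080"
# (any VirtualHost *:80 entries need the same change)
sudo a2dissite mysite
sudo /etc/init.d/apache2 restart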

In the afternoon I also played on Colin McRae Dirt a bit.

And in the evening I also watched Mauser play 1 vs 100 (except it wasn't), watched an episode of Power Rangers with L, and watched 'The Killing' with Mauser and Clare.

Food
Breakfast: Bowl of Maple & Pecan Crunch Oat Cereal; Cup o' Tea.
Lunch: Arran Mustard Ham with Sweet & Crunchy Salad Sandwich made with Fresh Bread-maker-made Bread; Crust of Fresh Bread-maker-made Bread with Berry Jam; Clementine; 2x Posh Chocolate Biscuits; Cup o' Tea.
Dinner: Piece of Toad in the Hole; Potatoes; Green Beans; Swede; Ground Black Pepper; Gravy; Mustard. Pudding was Home-made Lemon Meringue Pie with Squirty Cream. Coffee; Cherry Liqueur; Piece of Sainsbury's Mint Creme Chocolate.
Supper: Fox's Choc Chip Cookie; Shortbread Finger; Cup o' Tea.

Friday, 8 January 2010

Adding metadata still

This morning I was mainly adding metadata to the same images I've been working on for the last week or so. I also helped Clare move the big table back into the garage, and take the smaller table out of the garage back into the kitchen.

While I was looking for information on Sheep (I needed to add a description/caption to a photo of a sheep), I came across this page, which lists various UK varieties of sheep.

I did finally finish adding metadata to my 33 images today. It's taken about a week to process them, and then a week to add metadata to them.

I took 3 versions of one of the images I'd processed, and the original, and after getting them ready for the web, posted them to the Flickr critique group to try and get some feedback on them.

Also in the evening, I watched Mauser play a game on Xbox Live called something like 'the one'. Basically, it's like Who Wants to be a Millionaire, except the person is playing against 'the mob', which is a panel of 100 people. When anyone in 'the mob' gets a question wrong, they are knocked out. If 'the one' gets a question wrong, then they are eliminated, and all their points are distributed to the remaining members of 'the mob'.

Also like WWTBAM, 'the one' can choose to stop playing, and take their winnings.

Also, everyone in the audience can play (over 10,000 people); you get points for each question you get right, and bonus points depending on how fast you answer. If you're in the audience, your points aren't actually worth anything (other than being able to boast about how many points you have), but I presume that the more points you have, the more likely you are to be chosen to be in 'the mob' or to be 'the one'.

Those in 'the mob' or 'the one' get to win Xbox points, which I think you can redeem for downloadable games etc.

I have seen a few things lately where people mentioned that they use the ProPhoto RGB colorspace. I wondered why this was, and if it was actually any better than Adobe RGB, so I did some searching and found this thread over at photo.net: sRGB - AdobeRGB - ProPhoto RGB.

To quote Peter Werner's post there:
However, lately I've begun to question the wisdom of that. Basically, I could see the wisdom of capturing in ProPhoto, or some other super-wide gamut if you have the capability to do so - this will give you a "digital negative" that you can use to make images when future monitor and printing technologies that aren't around today (actually, this is what Camera Raw is for, really). However, for printing or web publishing in the here and now, ProPhoto is a way bigger color space then one could possibly use - to the best of my knowledge, even really good printers are only capable of rendering in Adobe RGB as their widest gamut. (Correct me if I'm wrong on this.)

The downside as I see it of working with ProPhoto is that you're inevitably going to have to downsample at some point. If you do so at the very end, there's a high likelihood that in you have all kinds of out-of-gamut colors. In fact, you might have even increased the number as you worked on the image color in ProPhoto. Neither the Colorimetric or Perceptual rendering intents that exist now to downsample to a smaller color space are perfect, and may have very real problems if a lot of your color is out of gamut, resulting in banding or like artifacts. On the other hand, if you start with or switch to the color space you're going to output in early in your workflow, you will only be shifting colors within the bounds of what you can actually print. Now, I'm not sure if having a wider gamut gives you some advantage with color correcting or otherwise rendering color even knowing you're going to downsample later.
So I think I'll stick with Adobe RGB for the moment.

The weather today started off a bit cloudy, about 9am the sun broke through a gap in the large grey cloud, looking very nice. It snowed a bit (very lightly), then the clouds blew away, and it was quite cloudless most of the day (including at sunset, which was very similar to yesterday, with a bald sky). About 4.30pm (about half an hour after sunset) some clouds started to roll in, then in the evening it snowed lightly a couple of times.

Food
Breakfast: Bowl of Maple & Pecan Crunch Oat Cereal; Cup o' Tea.
Lunch: Arran Mustard ham with ½ a sliced Vine Tomato Sandwich; Small Banana; Clementine; Lemon Slice; Chocolate Waver Bar; Cup o' Tea; 2x Cherry Liqueurs.
Dinner: Battered Fish Portion; Baked Beans; Mashed Potato; Ground Black Pepper. Pudding was 2x Home-made Mince Pies. 2x Coffees; Cherry Liqueur; Piece of Sainsbury's Mint Creme Chocolate.
Supper: Milk Chocolate Coated Crinkle Crunch; Cup o' Tea.