Thursday 7 October 2010

Ebay APIing and Walking

I spent quite a bit of this morning trying to figure out how to use the eBay developer's API as an affiliate. There are quite a few places on both the eBay Partner Network website and the eBay developers website where they mention using the API as an affiliate, but I couldn't find any info on either site about how exactly you do it, i.e. what the code looks like where you add your affiliate ID in.

Then eventually I just tried Google and found this blog post, which explains it. You need to add
&affiliate.trackingId=[yourcampaignid]
&affiliate.networkId=9
&affiliate.customId=[customid]

to your URL request string.
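Putting that together, here's a minimal sketch of what the final request URL might look like, in Python for illustration (the site itself was PHP). The endpoint, operation name, and app ID are placeholder assumptions for the Finding API; only the three affiliate parameters are the ones from the blog post.

```python
from urllib.parse import urlencode

def build_finding_url(keywords, campaign_id, custom_id=""):
    """Build an eBay Finding API request URL with affiliate tracking.

    Endpoint and SECURITY-APPNAME are placeholders; networkId 9
    identifies the eBay Partner Network."""
    base = "https://svcs.ebay.com/services/search/FindingService/v1"
    params = {
        "OPERATION-NAME": "findItemsByKeywords",
        "SECURITY-APPNAME": "YOUR-APP-ID",      # placeholder
        "RESPONSE-DATA-FORMAT": "JSON",
        "keywords": keywords,
        "affiliate.trackingId": campaign_id,    # your EPN campaign ID
        "affiliate.networkId": "9",             # 9 = eBay Partner Network
        "affiliate.customId": custom_id,        # optional free-form ID
    }
    return base + "?" + urlencode(params)

url = build_finding_url("camera filter", "1234567890")
```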

That sorted, I wondered how best to go about retrieving the info from eBay. Obviously a URL request to eBay's server when creating the page (in PHP) would slow down page creation. But using JavaScript to get the info after page creation would mean a delay between the page loading and the info loading (with the info in the top right corner of the page, the user might scroll down before it had loaded).

I was thinking about having a cron job that runs every 10 minutes, gets the info from eBay, and caches it. The PHP page would then include this cached file (so no delay caused by making a call to eBay's servers). Then, when the page had loaded, JavaScript would make a fresh request to eBay's servers and update the ad with the latest info.

This way the user sees the eBay info as soon as the page loads, the page should load quickly, and they get up-to-date info (once the JS has finished loading it).
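The cron-plus-cache half of that idea can be sketched like this (Python for illustration; the filenames and the fetch stub are made up). The cron job calls the refresh function; the page just reads the local file, so rendering never waits on eBay:

```python
import json, time

CACHE_FILE = "ebay_listings.json"  # path is an assumption

def fetch_listings():
    """Stand-in for the real eBay API call; returns listing data."""
    return [{"title": "Example item", "time_left": "PT25M"}]

def refresh_cache(path=CACHE_FILE):
    """What the 10-minute cron job would do: fetch and write the cache."""
    data = {"fetched_at": time.time(), "listings": fetch_listings()}
    with open(path, "w") as f:
        json.dump(data, f)

def read_cache(path=CACHE_FILE):
    """What the page does at render time: just read the local file."""
    with open(path) as f:
        return json.load(f)

refresh_cache()
cached = read_cache()
```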

But then I thought that actually I might as well just use PHP. I think it should be possible to request only listings with at least x minutes left, so I could retrieve listings with at least 10 minutes remaining every 10 minutes.
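If the Finding API's item filters support a minimum end time (I believe an EndTimeFrom filter does this, but treat the filter name as an assumption), the value to send would just be "now plus ten minutes" as an ISO-8601 timestamp:

```python
from datetime import datetime, timedelta, timezone

def end_time_from(minutes_left=10):
    """ISO-8601 UTC timestamp for 'now + minutes_left', to use as an
    end-time item filter so every returned listing still has at least
    that long to run when the cache is next refreshed."""
    t = datetime.now(timezone.utc) + timedelta(minutes=minutes_left)
    return t.strftime("%Y-%m-%dT%H:%M:%S.000Z")

stamp = end_time_from(10)
```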

I'm not sure whether this should be done on a request basis (code in the page checks whether the cached info is older than 10 minutes and, if it is, requests new info and writes it to the cache file / db). This method would need something to avoid dog-piling. Or otherwise just use a cron job to refresh the cache.
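The request-based variant with dog-pile avoidance might look like this sketch (paths, timings, and the fetch stub are all assumptions): an exclusive lock file means only the first request to notice staleness does the fetch, while everyone else keeps serving the old copy.

```python
import json, os, time

CACHE = "ebay_cache.json"   # paths are assumptions
LOCK = CACHE + ".lock"
MAX_AGE = 600               # ten minutes, in seconds

def fetch_fresh():
    """Stand-in for the real eBay API call."""
    return {"listings": []}

def get_listings():
    """Serve from the cache; if it's stale, let exactly one request refresh it."""
    stale = (not os.path.exists(CACHE)
             or time.time() - os.path.getmtime(CACHE) > MAX_AGE)
    if stale:
        try:
            # O_EXCL makes creation atomic: only one request wins the lock
            fd = os.open(LOCK, os.O_CREAT | os.O_EXCL)
        except FileExistsError:
            pass  # another request is already refreshing; serve the old copy
        else:
            try:
                with open(CACHE, "w") as f:
                    json.dump(fetch_fresh(), f)
            finally:
                os.close(fd)
                os.remove(LOCK)
    if os.path.exists(CACHE):
        with open(CACHE) as f:
            return json.load(f)
    return {"listings": []}  # very first hit, before any cache exists
```

The cron approach avoids all of this locking, at the cost of refreshing even when nobody is visiting.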

In the afternoon I went out on a walk since it was nice weather. My UVIR cut filter for use with my IS-Pro arrived, so I also wanted to test it out. I went up through the town towards Robert Smythe school, then across the fields towards Great Bowden. Then I went back along the canal, and waited around for quite a while at the canal basin, hoping to take a pano there with the sky lit up by the sunset.

Unfortunately, while there was a great sunset, it couldn't be seen from the canal basin (I saw the remnants of the sunset while walking home though). I had wondered whether to go to the canal basin or the fields for sunset; obviously the fields would have been a much better choice. Still, at least now I know.

In the evening I watched Sadgati (Deliverance) with Mauser and Bo. It was well directed (by Satyajit Ray) and the story was alright, but not a lot happens.

I also geo-coded all the photos I took on my walk.
