24 December 2007

What Hi-Fi disappoint on Lossless

In the Sound Advice section of the Jan 2008 issue of What Hi-Fi, a reader asks what the best portable player and file format are for maximising digital audio quality. The response is essentially: buy an iPod and use Apple Lossless. Whilst I agree the iPod's audio hardware is very good, I don't think its effectively proprietary encoding format is something What Hi-Fi should be encouraging people to use.

They later go on to say that you could buy a device from Creative and use "one of the other lossless formats", but they don't give any examples of other formats, or mention that the only lossless formats the current Creative products support are Microsoft's WMA Lossless and Apple Lossless.

All in all I'm disappointed that a magazine that professes to be unbiased and impervious to the corporates is so casual about recommending proprietary formats which facilitate consumer lock-in. In my opinion they should be promoting the open source formats and marking down the products they review for only supporting proprietary technologies. Decreasing consumer choice is, after all, a negative factor and the corporates shouldn't be allowed to get away with it.

On the plus side, from 31st Dec, 7 Digital will be offering Radiohead's In Rainbows in FLAC format.

10 December 2007

Reference types in prototype declaration - JavaScript gotcha

I've recently started doing some heavy-duty JavaScript, including messing around with constructors, prototypes, lambdas, closures etc. I started off creating a few classes by writing a constructor function and then defining its prototype using object-literal notation, as it makes your code look quite like class declarations in other OO languages.

Here is a simple example without any methods, just with two member variables:

function MyClass(){}
MyClass.prototype = {
 iNum: 0,
 sStr: 'Derek'
};

var oA = new MyClass();
var oB = new MyClass();

// oA: iNum -> 0, sStr -> Derek
// oB: iNum -> 0, sStr -> Derek

oA.iNum++;
oA.sStr += ' Fowler';

// oA: iNum -> 1, sStr -> Derek Fowler
// oB: iNum -> 0, sStr -> Derek

All seems well, however, upon adding some reference types to the prototype things get a bit strange:

//...

MyClass.prototype = {
 iNum: 0,
 sStr: 'Derek',
 aAry: []
};

//...

// oA: aAry.length -> 0
// oB: aAry.length -> 0

oA.aAry.push('test');

// oA: aAry.length -> 1
// oB: aAry.length -> 1

Adding an element to aAry of oA has added it to oB's aAry too. "Odd", I thought, so I did a little more investigation:

// MyClass.prototype.aAry.length -> 1

The array property of the instances is pointing back to the property of the prototype. In other OO languages, all member variables within the class declaration are copied into the instances; only static variables and method implementations are shared between them. These JavaScript variables seem to be neither member nor static: they behave like members if they hold value types and like statics if they hold reference types. Next I tried looping through the properties and testing hasOwnProperty:

for(var sProp in oA){
 // oA.hasOwnProperty(sProp)
}

Before the assignments, hasOwnProperty returns false for every property: none of them are members of the instance; they all reside within the prototype. Following the assignments, however, iNum and sStr report themselves as members, and only aAry still claims to belong to the prototype.

This behaviour isn't a bug in one implementation either; it is consistent across all the browsers, but after having a quick look in the ECMA-262 doc I'm still none the wiser about why this is the case. Anyone who can shed light on this, please comment below.
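
As far as I can tell, the rule seems to be that property reads walk the prototype chain, but plain assignment always writes to the instance itself, creating an "own" property that shadows the prototype's copy; mutating an object you reached via the chain, on the other hand, writes nothing to the instance at all. A small sketch of that rule at work, reusing the MyClass example from above:

```javascript
function MyClass(){}
MyClass.prototype = {
 iNum: 0,
 aAry: []
};

var oA = new MyClass();

// reads come from the prototype until the instance has its own copy
// oA.hasOwnProperty('iNum') -> false

// oA.iNum++ means oA.iNum = oA.iNum + 1: the read falls through to the
// prototype but the assignment creates an own property on oA
oA.iNum++;
// oA.hasOwnProperty('iNum') -> true
// MyClass.prototype.iNum    -> 0 (untouched)

// oA.aAry.push() assigns nothing to oA; it finds aAry on the prototype
// and mutates that single shared array in place
oA.aAry.push('test');
// oA.hasOwnProperty('aAry')     -> false
// MyClass.prototype.aAry.length -> 1
```

Seen this way, iNum and sStr only became members because they were assigned to; aAry was only ever mutated, so it never left the prototype.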

The solution

The solution to this is to create all member variables using the this keyword in the class constructor thus:

function MyClass(){
 this.aAry = [];
}

This ensures that the variable is created upon the new instance and you don't get any sharing problems.
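
Putting it together, a common pattern is to keep per-instance state in the constructor and share only the methods via the prototype. A quick sketch (the addItem method is made up purely for illustration):

```javascript
function MyClass(){
 // per-instance state: created fresh on every new instance
 this.iNum = 0;
 this.sStr = 'Derek';
 this.aAry = [];
}

// method implementations are safe to share via the prototype
MyClass.prototype.addItem = function(item){
 this.aAry.push(item);
};

var oA = new MyClass();
var oB = new MyClass();

oA.addItem('test');

// oA: aAry.length -> 1
// oB: aAry.length -> 0
```

Each call to new MyClass() now gets its own array, while all instances share the single addItem implementation.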

27 October 2007

ALA Web Design Survey 2007 Results

The results from this year's A List Apart Survey of Web Design Professionals are in.

The findings are in an 82-page PDF which makes for some interesting reading. You can also get an anonymised version of the 33,000 raw responses if you want to do some number crunching of your own.

20 October 2007

Winamp 5.5 - 10th Anniversary Edition

That's right, 10th Anniversary, the first version of Winamp came out in 1997. Interestingly, on the 6th of August that year, Microsoft bought a $150 million share of the "financially troubled" Apple.

Since I bought my iPod in 2004 I've pretty much exclusively used iTunes. However, my foray into the world of FLAC has meant that it's not really meeting my requirements any more, and I've been looking for a media library application with better format support.

I'll be posting a feature comparison, and which one I've decided to go with, in the coming weeks, but until then I'll just say that I'm impressed with Winamp 5.5 so far. I'm only using the free version, but it seems to have come a long way since the last time I used it. Features I'm particularly enjoying are:

  • Album art
  • Advanced SQL query style smart playlist generation
  • iPod support
  • Ripping to FLAC

17 October 2007

Ripping to FLAC with Exact Audio Copy

In a previous post I talked about the convenience of having your CD collection archived to a hard disk. I weighed up the options for an archiving format and decided FLAC met my needs best.

Anyone who has done any serious ripping on a Windows PC in the past will likely know that there are two big players: CDex and Exact Audio Copy. While EAC is not open source, it seems to me to be the more complete solution and, in a like-for-like comparison, it ripped far faster than CDex without any loss of quality. Mac or Unix users can check out the download page on the FLAC site for the ripping options available to them.

Getting it set up

After downloading and installing EAC you'll be presented with the very handy setup wizard, which does a lot of the hard work for you. Unfortunately, in my case the drive setup parts of the wizard didn't work first time, which was down to EAC defaulting to a third-party ASPI drive interface I didn't have installed. This is easily fixed after the setup finishes by going to:
EAC > EAC Options... > Interface (tab)
and selecting "Native Win32 interface for Win NT/2000/XP". Then re-run the wizard from the EAC menu to configure your drives.

Once you're finished with the wizard it's best to run through the settings it's chosen anyway, as in some cases they aren't optimal. The main things to check, from the EAC menu:

EAC Options:

  • Extraction tab
    • Extraction and compression priority
      Setting this to "High" will ensure that EAC is prioritised above other applications while it's ripping.

Drive Options:

  • Extraction Method tab
    • Mode
      Ideally you want "Secure".
    • Examine C2 Feature...
      You can use this tool along with a suitably scratched CD to determine whether your drive supports C2. This is worth doing as it will increase the speed of your rips.
  • Drive tab
    • Drive read command
      Hit the "Autodetect read command now" button as the wizard doesn't seem to set this.
  • Offset/Speed tab
    • Allow speed reduction during extraction
      Ideally you want this checked but some drives don't speed up again once they've slowed down.
    • Use AccurateRip with this drive
      You want this checked as it will check the checksums of your rips against a database of other people's, helping to verify that they're accurate.
  • Gap Detection tab
    • Gap/Index retrieval method
      It's worth putting a CD in and running a Detect Gaps once for each one of these options to see which is quickest.
    • Detection accuracy
      Set this to "Secure".

Compression Options:

The configuration wizard should have configured your FLAC settings for you, so long as you picked the right option; you can check this under "Compression Options...". If it hasn't, run the wizard again and choose the FLAC option when prompted rather than trying to set the command line and all its parameters yourself.

freedb/Database Options:

  • freedb tab
    • Your e-mail address
      Enter your e-mail address to use freedb.
    • Get active freedb server list
      Hit this to get the list of freedb servers.

Ripping a CD

  1. Put a CD in and select the right drive from the drop-down list; the list of tracks should appear.
  2. Hit F4 (Action > Detect Gaps) to run the gap detection.
  3. Hit Alt-G (Database > Get CD Information From > Remote freedb) to get the album and track info.
  4. Check the album info is right and if not update it.
  5. Hit Ctrl-A (Edit > Select All) and then Shift-F5 (Action > Copy Selected Tracks > Compressed...) to begin ripping.

After this finishes you get a status log that gives you a quality percentage, an AccurateRip confidence and any errors for each track.

Track quality indicates what percentage of the track was ripped first time with no problems. If everything is set up properly and you have a scratch-free CD, it will normally be 99.9% or 100%.

AccurateRip confidence is the number of people in the AccurateRip database who have submitted the same checksum for a particular track as you. The higher the confidence the better; however, a low confidence doesn't necessarily mean it is a bad rip, especially if you're ripping a lesser-known CD.

Finally

Fire up Winamp, have a listen and then kick yourself for ever thinking MP3s sounded good!

01 October 2007

Amazon MP3 downloads

Today's top two MP3 artists on Amazon MP3 are Richard Wagner and Pink Floyd. I find this odd considering that the audiences for these two artists i.e. the classical lot and the prog lot are probably the most demanding when it comes to audio quality.

After it went public beta last week I have to admit I'm very impressed with Amazon's offering; comparing its service to a few others:

              iTunes             HMV      Walmart            Amazon
              Standard  Plus              WMA      MP3
Format        AAC       AAC      WMA      WMA      MP3       MP3
Quality       128kbps   256kbps  ?        128kbps  256kbps   256kbps
DRM           Yes       No       Yes      Yes      No        No
Price*        $9.99     $12.99   $16.22   $9.44    $9.22     $8.99

* Feist - The Reminder

I'm not sure how many people consider things like quality and DRM when downloading music but one thing they will consider is price and Amazon is the clear leader here. Especially seeing as the album I chose for the comparison is one of the more expensive ones; you can download the whole of Pink Floyd's Wish You Were Here for $4.45, that's £2.20!

The fact that Amazon offers DRM-free MP3s means that they will play on pretty much anything. In addition to computers and portable music players, these days you'd be hard pushed to find a CD or DVD player that doesn't play MP3s. The AAC and WMA offerings of the competitors very much limit your choice of player; for example, AAC files work on iPods and WMA files work on Creative Zens, but not vice versa. MP3s, however, will work on both.

Amazon already has a massive share of the music retail market so its existing customers are more likely to download from them than go with one of the alternatives. Although they don't offer this yet, something that would seal up the Christmas market would be a "Buy an MP3 player preloaded with albums of your choice" service.

It all seems to stack up in Amazon's favour, I'd wager iTunes are going to see their market share slipping away in the coming months.

27 September 2007

FLAC on your iPod with Rockbox

The list of hardware with native FLAC support on the FLAC website includes quite a few home systems; Slim Devices' Squeezebox and Transporter, the Sonos Music System and wireless media players from the likes of Netgear, Olive and Helios. The list of portable players with native support is less impressive however, Cowon and Rio being the only names I recognise.

Thankfully the very nice people responsible for the Rockbox firmware had the foresight to include a FLAC codec, meaning that if you own an iPod or one of the other supported players listed below, all is not lost.

  • Apple: 1st through 5.5th generation iPod, iPod Mini and 1st generation iPod Nano (not the Shuffle, 2nd/3rd gen Nano, Classic or Touch)
  • Archos: Jukebox 5000, 6000, Studio, Recorder, FM Recorder, Recorder V2 and Ondio
  • Cowon: iAudio X5, X5V, X5L, M5 and M5L
  • iriver: H100, H300 and H10 series
  • SanDisk: Sansa c200, e200 and e200R series
  • Toshiba: Gigabeat X and F series (not the S series)

I'm normally skeptical of any such warranty-voiding activities; however, as my iPod is third generation, its warranty days are long gone, so I thought I'd give it a go. The firmware itself runs alongside the Apple one in a very nifty dual-boot setup and the installation couldn't really be simpler. All you do is download one of the builds, copy it to the root of your iPod's hard disk, run a flash utility and it's done.

There are a lot of benefits to Rockbox over the native firmware, including folder browsing, playlist editing, skinning and, one the audiophiles will love, a proper parametric EQ so you can even out your headphones' response. There does seem to be one disadvantage too, in that battery life seems somewhat reduced, possibly due to the increased disk access required for FLAC playback. That's a small price to pay for an impeccable performance from full-quality tracks going through the iPod's DAC though.

23 August 2007

.NET Performance Counter Problems

I've recently been doing some work with performance counters in .NET and discovered the support for them in the framework is a bit nasty.

One example is that the only way to add counters is through the PerformanceCounterCategory.Create method, so you can't append a new counter to an existing category. You have to copy all the counters from the old category, add your new counter, delete the old category and then recreate it with the new list of counters.

Another example is that in order to use any of the fraction based counters like averages you have to add two separate counters, one for the numerator and one for the denominator.

All very long-winded, so I decided that I'd wrap up the creation of the counters and categories, along with a few other bits, in a helper class. That was the plan, but it fell at the first hurdle - trying to create categories.

The symptom

This seemingly innocuous bit of code...

if(!PerformanceCounterCategory.Exists(categoryName))
{
 PerformanceCounterCategory.Create(categoryName, categoryName, new CounterCreationDataCollection() );
}

if(!PerformanceCounterCategory.CounterExists(counterName, categoryName))
{
 //add a counter
}

...was yielding the following error on line 6:

System.InvalidOperationException: Category does not exist.
   at System.Diagnostics.PerformanceCounterLib.CounterExists(String machine, String category, String counter)
   at System.Diagnostics.PerformanceCounterCategory.CounterExists(String counterName, String categoryName, String machineName)
   at System.Diagnostics.PerformanceCounterCategory.CounterExists(String counterName, String categoryName)

i.e. three lines after the statement creating a category it says the category doesn't exist. Odd, I thought, so I knocked together a quick Snippet Compiler script:

string categoryName = "foo";
string counterName = "bar";
  
if(PerformanceCounterCategory.Exists(categoryName))
 PerformanceCounterCategory.Delete(categoryName);
  
WL("Category " + categoryName + " exists? " + PerformanceCounterCategory.Exists(categoryName));
  
PerformanceCounterCategory.Create(categoryName, categoryName, new CounterCreationDataCollection() );
  
WL("Category " + categoryName + " exists? " + PerformanceCounterCategory.Exists(categoryName));
      
try
{
 WL("Counter foobar exists? " + PerformanceCounterCategory.CounterExists(counterName, categoryName));
}
catch(Exception ex)
{
 WL(ex.ToString());
}
  
RL();

Further oddness arose because the call to PerformanceCounterCategory.Exists after the Create actually gives the correct response; it is only the CounterExists method that fails. I immediately suspected some sort of caching issue.

The cause

After having a poke around with Lutz Roeder's .NET Reflector I discovered that all the Performance Counter classes make use of an internal class called PerformanceCounterLib which has a few hashtables it uses to cache information about the categories and their counters.

The hashtable concerned here is the CategoryTable which, on adding a category for the first time, does not update properly. When the CounterExists call runs this line:

if(this.CategoryTable.ContainsKey(category))

it fails, throwing the error. The Exists call, however, has this line:

return (PerformanceCounterLib.IsCustomCategory(machineName, categoryName) 
   || PerformanceCounterLib.CategoryExists(machineName, categoryName));

The right-hand part of this calls the same CategoryTable.ContainsKey, but the IsCustomCategory call on the left accesses the registry directly and returns the correct answer - true.

Why CounterExists doesn't call this method itself, opting instead to provide its own implementation, I've no idea.

The solution

Luckily the PerformanceCounter.CloseSharedResources() method flushes all these internal caches so the calls all return the same answer. Adding a call to this straight after my PerformanceCounterCategory.Create solved the problem.

19 August 2007

Archiving your CD collection

This post could probably be considered to be the second part of my very first post on this blog entitled Freeing your digital media in which I talked about the DLNA standards and how they allowed you to store all your media on a NAS device and stream it around your house over your network.

I'd been considering this more seriously and realised that a large proportion of my MP3 collection was made up of low-bitrate, poor-quality rips which just sound awful played through a half-decent hi-fi.

Storage space is only getting cheaper and today you can pick up a 500GB external hard disk, enough for 700 uncompressed albums, for less than £80; the Western Digital My Book series proving to be a popular choice. There really is no reason why, with this much space, we should be sticking to lossy compression formats for our music archiving. So, with that in mind, what are the options?

The main contenders

WAV
Microsoft's uncompressed audio format
AIFF
Apple's uncompressed audio format
WMA Lossless
Microsoft's lossless audio compression format
Apple Lossless
Apple's lossless audio compression format
FLAC
Open source lossless audio compression format - now part of the Xiph.org Foundation, originally developed by Josh Coalson
Monkey's Audio
A lossless audio compression format developed by Matt Ashland
Shorten
An old open source lossless audio compression format developed by Tony Robinson at SoftSound

Hydrogen Audio have prepared a good comparison table of lossless formats. Uncompressed 44.1kHz 16-bit audio uses up about 10MB of space per minute. The lossless compression formats listed above all reduce file sizes to about 6MB per minute. Not bad, but compared to the 1 - 2MB per minute we're used to with lossy formats this still seems like a lot. My justification for the space usage is that I want the convenience of being able to carry 700 albums around with me, and I've paid for the CDs so I might as well be listening to them in full CD quality.

My criteria for a format were: open source, good compression, and good hardware and software support. This immediately rules out all the Microsoft and Apple options and Monkey's Audio. Shorten is old and has poor support, so the natural choice is FLAC which, being part of the Xiph.org Foundation, has a large developer base behind it and support in all the major software and hardware music players, albeit via new firmware in some cases. Not to mention that it's now packaged with arguably the best CD ripper, Exact Audio Copy.

A quick download, setup and a few rips later, and I'm rather astounded by the results. To my ears the FLAC files actually sound better than playing the original CD through the same system, delivering a far more punchy and detailed sound. I have to admit I wasn't expecting this, and I'm assuming it's down to my CD-ROM reading the audio data poorly when playing the CD directly.

23 July 2007

Mobile madness

I've recently been going through the yearly trauma of picking a new phone as my free upgrade. It's the same every time - there is never one phone that meets all my criteria so I have to go for the best compromise. The phone I'm upgrading is a Nokia 6680 which is a rather large Symbian smartphone, lacking in both the processor speed and memory stakes. The criteria for my new phone are therefore:

  • Features - preferably a smartphone with a camera (40%)
  • Usability - a decent size keypad and quick processor (30%)
  • Size - not a brick but not very thin either (20%)
  • Design - least important but something that looks reasonable (10%)

I've only ever had Nokia phones and have stuck with them mainly because they were a lot more usable than other brands. Nokia's current offerings were leaving me a bit cold, however, so I'd been thinking "surely the other brands must have caught up" - and indeed Sony Ericsson's sales are topping Nokia's in the UK at the moment. Anyway, I thought I'd give another brand a try, so I went for a Samsung U600: I like the look of their sliders and, although it's not a smartphone, it's from their new "Ultra Edition 2" line so I thought it would meet my needs.

I'm currently awaiting the Jiffy bag from Orange to send it back in, so needless to say it fell at the first hurdle. A hopelessly inadequate phonebook, an annoying predictive text set-up requiring constant swapping between the keypad and navigation key, and some very questionable calling and hanging-up behaviour were my main grievances.

So now I'm back to square one, thinking I'll either get a Nokia, or another brand only if it's running Symbian so I can be assured of a certain amount of base functionality. The next two candidates were the Nokia 6300 and the Sony Ericsson W950i - but the Sony is missing the all-important camera and the Nokia isn't running Symbian, so it's not clear cut by any means. I may have finally found the answer in the Nokia 6120, which is a smartphone, has a camera and is a centimetre narrower and half a centimetre thinner than the 6680. The only problem is that it's not currently available on Orange so I've got to wait for it - Doh!

20 June 2007

500 Geeks + 1.21 Jigawatts = ?

Well, I'm back. Thanks to Yahoo! and the BBC for a great weekend. It didn't get off to the best of starts, what with the lightning and broken WiFi and all, but everyone made the best of it.

A total of 73 hacks were presented on the Sunday evening; ours, however, wasn't finished in time, which was a shame because no-one else had made anything similar. If 500 geeks went and were in teams of five then there should have been 100 hacks, so I'm guessing we weren't the only ones who didn't finish.

Hack-tastic, looking forward to the next one!

28 May 2007

Hack Day UK 2007

Thought a quick post was in order as I've just received the e-mail confirming my place at this year's Hack Day UK.

I'll be there along with the fellows from my blogroll - current and former colleagues - and we'll likely be attempting our own hack as well as pimping our skills to the highest bidder!

13 May 2007

ALA Web Design Survey 2007

Just a quick post to point you all in the direction of A List Apart's survey of Web Design Professionals. It's been up for a while now and I've only just remembered to fill it in myself. I look forward to the results and hope at least some of them are plotted on a nice graph.

02 May 2007

Why don't Google support SyncML?

I use quite a few of Google's online services and recently I've started exclusively using Mail and Calendar for managing all my e-mail accounts and calendars. Both are brilliant applications and it's great to be able to access this data from any computer without worrying about remote access or setting up something like MS Exchange. One thing I'd really like to be able to do however is have a two way sync between my Google services and my mobile phone.

Thom Shannon's PocketGCal does just that on a device running Windows Mobile, and I toyed with the idea of writing a similar Java app for my Symbian device. Having got as far as downloading Eclipse and all the various projects and SDKs I'd need, I happened upon a Sync app already installed on my phone.

The app uses the Data Synchronisation standard from the Open Mobile Alliance which was formerly called SyncML. It allows you to sync over Bluetooth or HTTP and supports different profiles specifying which access point to use and whether authentication is required etc.

"Great, an open standard. I bet Google support that, they're the champions of free data!", I thought naively. Alas it is not the case, Google seem to be pushing their proprietary GData as the means for programmatically updating the data you hold with them and nothing else.

Google do seem to have missed a trick here, as I'm sure there are a lot of business types who'd see native support for SyncML as a very useful feature. This whole episode is at least going to make me look around at the other Mail and Calendar providers out there to see if any of them have better support.

Update - 09/05/2009

Google does now support SyncML for syncing of contacts, it also supports various other syncing methods including ActiveSync for Windows Mobile devices which allows you to sync your calendar entries as well. Check out Google Sync for details.

01 May 2007

Silverlight, Microsoft on the bleeding edge yet again

I jest.

It's odd to think that 12 years on from the introduction of Java applets Microsoft are giving the whole thing another go. Their marketing clout is a given but Silverlight won't be successful through that alone.

The two major limiting factors for the uptake of Java applets were the need for a Java Virtual Machine to be installed and the security restrictions placed upon the applet. While the JVM isn't a big download by today's standards, on the modems of 1995 it took forever. An applet's access to the local machine was very limited unless the user explicitly trusted it, which was an instant hit to how useful it could be.

The first factor has been removed, as broadband connections are now widespread and the Silverlight installer is only a few megabytes, depending on your platform. As for security, it remains to be seen whether Silverlight is full of holes, but seeing as it runs from within a browser - and not just Internet Explorer - it will be interesting to see how the other browser vendors react if this turns out to be the case, and whether they implement their own measures to restrict its access.

All that doom and gloom aside, security issues don't stop you using all the awesome XAML-based visual trickery available in Silverlight. Using Java applets it was pretty difficult to make anything that looked nice, let alone things like performing image manipulation on video as it plays. Powerful stuff indeed, and I'm looking forward to being able to make great-looking, feature-rich UIs that should just work on any platform.

28 March 2007

Keeping your data secure online

There is a growing trend among Web 2.0 sites of asking you to enter a username and password for some other site in order to perform an action. Two recent examples of this I've come across are Technorati, which asks for your Blogger username and password in order to verify you own a particular blog, and Facebook, which asks for your e-mail account username and password in order to match your contact list against existing Facebook members.

As an IT professional I'm naturally skeptical of any such demand and, in the case of Technorati, chose to verify my blog ownership in a different way. Terms and conditions and data protection declarations are one thing, but you've no guarantee your details won't be stored in a database, and if they are there's all the more chance they may at some point fall into the wrong hands. The only way to be sure is not to type them in in the first place.

A Blogger user account is one thing but Facebook asks for something totally different. I wonder how many people actually consider what data is stored in their e-mail inbox before they submit their details with the promise of being shown how many of their friends are already signed up.

When you sign up to the majority of sites you're e-mailed a confirmation containing all your details, including things like home and work addresses, phone numbers, date of birth, passwords and perhaps the most crucial of all, answers to secret questions e.g. mother's maiden name. If some unscrupulous person gets hold of your e-mail account password it will likely take them little effort to steal your identity, empty your bank accounts and max out your credit cards.

This article from the BBC, Many net users 'not safety-aware', serves to reaffirm the point that a large number of net users simply do not think before typing their sensitive information into random websites.

12 March 2007

Reference type keys and .NET dictionaries

The default implementation of the Equals method for reference types is to call ReferenceEquals i.e. to test whether two variables reference the same instance of an object.

When using a ListDictionary the Equals method is used, so you can be sure you will be accessing the correct item. However, if you use a HybridDictionary, which swaps to a Hashtable for collections of more than 10 items, you can get inconsistent results. This is down to the fact that the Hashtable uses the GetHashCode method to get a code representing an object, and this code is then used as the key. GetHashCode does not always return a unique code for dissimilar objects, so you can end up accessing the wrong item of your dictionary.

To get around this you can either stick to using the ListDictionary or implement your own IHashCodeProvider for the classes used as keys in your dictionary.

20 February 2007

32-bit Windows Script Components under 64-bit Windows

We're currently setting up a new suite of 64-bit web and database servers at work all running Windows Server 2003 x64 Edition. We've got quite a few legacy 32-bit Windows Script Components we need to use which by all accounts should run fine in the 32-bit environment provided by WOW64.

Imagine our surprise when, on registering the components, none of them worked - throwing errors at the point of instantiation.

WOW64 is rather an odd beast, residing as it does in its own SYSWOW64 folder in the Windows directory. This folder essentially contains the 32-bit versions of all the DLLs and suchlike that are available in a 32-bit version of Windows. The caveat is that in order to get your 32-bit fare to work you need to call on the services of these SYSWOW64 versions rather than the ones in the folder still called SYSTEM32 (note the stupid naming convention).

When registering WSCs you actually register the hosting service, scrobj.dll, with regsvr32.exe, passing the path to your WSC as the command line for scrobj.dll using the /i switch, e.g.

regsvr32 /i:"C:\Components\Display.wsc" "C:\WINDOWS\SYSTEM32\scrobj.dll"

Oddly, the Register option in the file association for WSCs seems to mix versions, calling the 64-bit version of regsvr32.exe and the 32-bit version of scrobj.dll.

"C:\WINDOWS\system32\REGSVR32.EXE" /i:"%1" "C:\WINDOWS\SYSWOW64\scrobj.dll"

I'm not sure of the significance of this mixed-version arrangement; it didn't work in our case, however, so we added a 32-bit Register option which calls the 32-bit versions of both files from the SYSWOW64 folder, e.g.

"C:\WINDOWS\SYSWOW64\REGSVR32.EXE" /i:"%1" "C:\WINDOWS\SYSWOW64\scrobj.dll"

and a 32-bit Unregister e.g.

"C:\WINDOWS\SYSWOW64\REGSVR32.EXE" /u /n /i:"%1" "C:\WINDOWS\SYSWOW64\scrobj.dll"

which sorted the issue.

13 February 2007

XML namespace prefixes in MSXML

If you're working with XSL, or a similar technology that makes use of XML namespace prefixes, using the Microsoft XML DOM, you'll likely run into problems if you try to do anything more than just load in a file.

Adding elements

The W3C DOM specification includes a createElementNS method for creating an element scoped within a namespace; MSXML, however, doesn't. You can create an element with a prefix using createElement but this doesn't correctly register the namespace of the node and you'll get a schema error something like:

msxml3.dll: Keyword xsl:stylesheet may not contain xsl:include.

To create an element and register it correctly you have to use createNode instead, which takes the node type (1 for an element), the node name and the namespace URI as arguments e.g.

Set ndIncl = xslDoc.createNode(1, "xsl:include", _
   "http://www.w3.org/1999/XSL/Transform")

Using XPath

Similar to the createElement problem, even if you've only loaded an XSL document you won't be able to use XPath to query it because oddly the namespaces aren't automatically registered with XPath e.g.

Set nlTemps = xslDoc.documentElement.selectNodes("/xsl:stylesheet/xsl:template")

yields the following error:

msxml3.dll: Reference to undeclared namespace prefix: 'xsl'.

To get this to play ball you have to set the "SelectionNamespaces" second-level property which takes a space delimited list of namespace definitions using a setProperty call of the form:

xslDoc.setProperty "SelectionNamespaces", _
   "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'"
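
This gotcha isn't unique to MSXML: most XPath APIs want an explicit prefix-to-URI map before prefixed queries will work. For comparison, here's the same kind of stylesheet query in Python's standard library (purely illustrative; not part of the MSXML code above):

```python
import xml.etree.ElementTree as ET

xsl = """<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/"/>
  <xsl:template match="foo"/>
</xsl:stylesheet>"""

doc = ET.fromstring(xsl)

# As with MSXML's SelectionNamespaces, the 'xsl' prefix has to be
# mapped to its URI explicitly before it can be used in a query.
ns = {"xsl": "http://www.w3.org/1999/XSL/Transform"}
templates = doc.findall("xsl:template", ns)

print(len(templates))  # 2
```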

05 February 2007

XML and SQL Server

In this post I'll cover how you can get SQL Server to return data as XML using the FOR XML clause, and how you can use XML as input for updating records and as a rich argument for row-returning stored procedures using the OPENXML function.

There are lots of reasons you may want to get data out of a database as XML:

  • You may be building an AJAX app and want to send XML to the client directly for processing by your client-side JavaScript
  • You may want to use XSL to transform your data into some format such as HTML or a CSV
  • You may want to export data and store it in a form which retains its original structure

These also give you reasons for needing to pass XML into your database; with the AJAX app, for example, you may want to receive changes as XML from the client and post them straight to a stored proc that updates your tables.

Examples are in VBScript using ASP and ADO.

Getting SQL Server to return XML

The key to getting SQL Server to return XML is the FOR XML command. It comes in three flavours:

FOR XML RAW
The least useful, RAW mode simply outputs the rows returned by your query as <row> nodes with the columns being either elements within this node or attributes of it as you define.
FOR XML AUTO
Automagically translates your SQL query, joins and all, into suitably nested XML elements and attributes. For example, if you join Orders to OrderItems the XML output will be OrderItem nodes nested within their associated Order node. You can alter the naming of the nodes by aliasing your table and column names but that's about it.
FOR XML EXPLICIT
Explicit mode allows the most customisation but it's also the most fiddly, requiring you to alias all your column names to a specific format which describes which nodes they should belong to.

You'll mostly use AUTO mode because it gives you the most useful results in the least amount of time, so here it is in an example:

SELECT [Order].*, OrderItem.*
FROM [Order]
INNER JOIN OrderItem
   ON [Order].order_key = OrderItem.order_fkey
WHERE [Order].customer_fkey = 1
FOR XML AUTO

All you do is tag FOR XML AUTO onto the end of your query, that's it! The output will look something like this:

<Order order_key="1" customer_fkey="1" date_placed="24/08/2006 12:31">
   <OrderItem orderitem_key="123" order_fkey="1" product_fkey="234" list_price="£14" />
   <OrderItem orderitem_key="124" order_fkey="1" product_fkey="64" list_price="£3" />
   <OrderItem orderitem_key="125" order_fkey="1" product_fkey="73" list_price="£27" />
</Order>

If you run this in Query Analyzer you'll notice in the results pane that the XML looks like it has been split into rows. We need to use an ADODB.Stream object to get at the output properly, thus:

Set conn = Server.CreateObject("ADODB.Connection")
Set cmd = Server.CreateObject("ADODB.Command")
Set strm = Server.CreateObject("ADODB.Stream")

conn.Open "Provider=SQLOLEDB;Data Source=myServerAddress;" & _
   "Initial Catalog=myDataBase;User Id=myUsername;Password=myPassword;"

strm.Open

Set cmd.ActiveConnection = conn

cmd.Properties("Output Stream").Value = strm
cmd.Properties("Output Encoding") = "UTF-8"
cmd.Properties("XML Root") = "Root"  'this can be anything you want

cmd.CommandType = adCmdText
cmd.CommandText = strSQL

cmd.Execute , , adExecuteStream

Set xmlDoc = Server.CreateObject("Microsoft.XMLDOM")
xmlDoc.async = False   'a boolean, not the string "false"

xmlDoc.LoadXML(strm.ReadText)

strm.Close : Set strm = Nothing
Set cmd = Nothing
conn.Close : Set conn = Nothing

xmlDoc now contains our XML to do with as we will.

Passing XML into SQL Server

The easiest way to get XML into SQL Server is as a parameter of a stored procedure thus:

cmd.Parameters.Append cmd.CreateParameter("somexml", adVarChar, _
   adParamInput, 8000, xmlDoc.xml)

You then use two system stored procedures along with the OPENXML function to SELECT from the contents of the XML parameter as if it were a table:

DECLARE @idoc int
EXEC sp_xml_preparedocument @idoc OUTPUT, @somexml

SELECT * FROM OPENXML (@idoc, '/Root/Order') WITH [Order]

EXEC sp_xml_removedocument @idoc

OPENXML takes the prepared XML document and an XPath expression telling it which nodes to operate on. The WITH clause in this case tells OPENXML that those nodes have the same schema as rows in the Order table.

The result of this call is a set of records with the same schema as the Order table but which have actually come from the passed-in XML document. Because the schema is that of Order you can put an INSERT INTO [Order] in front of the SELECT and this will add the rows from the XML to the Order table. You probably wouldn't want to do that but you get the idea.

You don't have to have a table in your database matching the schema of the XML you're passing in. WITH also accepts a normal schema declaration, i.e. comma-delimited column names with their types and the node or attribute each maps to in the XML:

SELECT order_key, customer_fkey, description
FROM OPENXML (@idoc, '/Root/Order') 
WITH (
   order_key     int           '@order_key',
   customer_fkey int           '@customer_fkey',
   description   nvarchar(100) 'description'
)

The advantage of being able to do this is that you can pass complex structured criteria into one of your stored procedures and use OPENXML to turn it into a rowset which you can JOIN to the tables in your database. Powerful stuff, with a large number of applications in both querying data and updating it.
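
To visualise what OPENXML is doing with that schema declaration, here's a rough Python equivalent that shreds some hypothetical Order XML into rows (an illustration only; SQL Server does this internally):

```python
import xml.etree.ElementTree as ET

# Hypothetical document of the shape the stored proc would receive.
xml_param = """<Root>
  <Order order_key="1" customer_fkey="48">
    <description>Christmas order</description>
  </Order>
  <Order order_key="2" customer_fkey="52">
    <description>Birthday order</description>
  </Order>
</Root>"""

doc = ET.fromstring(xml_param)

# Mimic: SELECT ... FROM OPENXML(@idoc, '/Root/Order')
# WITH (order_key int '@order_key', customer_fkey int '@customer_fkey',
#       description nvarchar(100) 'description')
rows = [
    (
        int(order.get("order_key")),       # '@...' maps to an attribute
        int(order.get("customer_fkey")),
        order.findtext("description"),     # a bare name maps to a child element
    )
    for order in doc.findall("Order")
]

print(rows)
# [(1, 48, 'Christmas order'), (2, 52, 'Birthday order')]
```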

02 February 2007

A room with a Vista

I've been toying with the idea of buying Vista for some time now. I have a beta version installed in a spare partition but some hard disk corruption issues prevented me from giving it a good run for its money. So, having not looked at it for a while, I've been reading CNET's Seven days of Vista series over the past week to see what's what in the final release.

Being a technology-obsessed geek, I saw the Ultimate Edition as the only viable option, but it came as a roundhouse kick to the face when I learnt this would retail at £350! "Guess I'll be sticking to XP for now then", I thought mournfully.

Not so, for at the end of day 7's post was the pièce de résistance:

Our final tip would be to consider buying OEM versions of Vista ... the consumer version of Ultimate is [£349], yet it's just £121.68 for the OEM version

I was aware you could buy OEM software but had always thought the difference in price was similar to that between retail and OEM hardware. "Surely that can't be right", I thought, but sure enough dabs.com has both versions, retail boxed and OEM.

The only differences with the OEM version are that you don't get any support and it's tied to the motherboard it's first installed on. I don't need support and, in the unlikely event that my mobo dies and I have to buy another OEM copy, I'll still be £50 richer. Besides, you can get almost three OEM copies for the price of one retail box so I could just get a few spares!

01 February 2007

SQL Server Oddness

I recently came across a rather odd bug in SQL Server 2000 which, although fixed in SP4, I had to find a work-around for because an immediate solution was required that couldn't wait for scheduled server patching.

In my case it concerned a JOIN subselect which contained a JOIN using a user-defined scalar function as one of its predicates e.g.

...
LEFT JOIN (
   SELECT foo.col1, MIN(bar.col1)
   FROM foo
   INNER JOIN bar
      ON foo.col1 = bar.col2
      AND bar.col3 = dbo.fn_GetVal('derek')
   GROUP BY foo.col1 
)
...

which produced the rather unhelpful error:

Server: Msg 913, Level 16, State 8, Line 4
Could not find database ID 102. Database may not be activated yet or may
be in transition.

The work-around in this case was to move the predicate to the WHERE clause rather than the ON e.g.

...
LEFT JOIN (
   SELECT foo.col1, MIN(bar.col1)
   FROM foo
   INNER JOIN bar
      ON foo.col1 = bar.col2
   WHERE bar.col3 = dbo.fn_GetVal('derek')
   GROUP BY foo.col1 
)
...

24 January 2007

What is it with PC component categorisation?

Like many geeks I buy my computers as components and assemble them myself, taking great care to make sure the quality and compatibility of the kit is as good as I can afford. One thing of constant annoyance, however, is the inadequacy of online PC component retailers' product categorisation and searching facilities.

With all the different types of processor, RAM, hard disk etc. that are available, accurate categorisation is essential for you to find what you're looking for. Why is it, then, that so many of the online component retailers have such lacklustre categorisation schemes in place, often repeating categories of product, e.g. "Core 2 Duo" and "Core Duo 2"?

It does seem to be CPUs that suffer the worst, probably because of the many different ways you can group them; by manufacturer, product line, socket type, number of cores etc. A lot of these retailers only allow a product to be in one category rather than "tagging" products with all their relevant information and allowing a user to group by anything. Because of this one category restriction each retailer has gone with what they see as the best categorisation, the problem being that none of them are the same and most of them are inflexible.

The result of this is a very poor user experience making it difficult for the consumer wanting to shop around. What's needed is an open data initiative between the component industry and retailers to standardise on these categorisations thus empowering the consumer to find what they're looking for more easily.

As much as this sounds like a massive plug, the only online component retailer I regularly use is dabs.com as they're the only one I know of that implements a good product categorisation scheme. So take note, ebuyer et al - proper categorisation pays!

22 January 2007

Is it time for a thin-client resurrection?

Microsoft is making much of the performance benefits flash memory brings to Windows Vista. Two items on the performance features page utilise it: ReadyBoost, as an extension to the RAM, and ReadyDrive, as a large hard disk cache - part of hybrid drive technology.

Hybrid hard disks cache frequently used data in flash memory attached to the disk. Surely this mostly means the operating system gets cached, so why not go the whole hog and give the OS its own flash drive to run from? But wait, we can go further than this: Office Live removes the need to have Office installed on a local disk, and on a lot of home PCs that is the only application installed.

We still need a hard disk for bulk storage of documents, media etc but a NAS over gigabit ethernet can potentially manage a transfer rate of 125MB/s, which is comparable to a local hard disk. We've reached the point where the only people who need a local hard disk are those who rely on consistent real-time disk access, i.e. audio and video editing and the like, and those folk tend to use FireWire disks anyway.
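
As a quick sanity check of that 125MB/s figure - a theoretical ceiling assuming ideal conditions, before any protocol overhead:

```python
# Gigabit ethernet moves 1 gigabit per second; divide by 8 bits per
# byte to get bytes, then scale to megabytes. Real-world NAS
# throughput will be lower once TCP/IP and filesystem overhead bite.
bits_per_second = 1_000_000_000
bytes_per_second = bits_per_second / 8
megabytes_per_second = bytes_per_second / 1_000_000

print(megabytes_per_second)  # 125.0
```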

Processor, RAM, a load of Compact Flash and maybe an optical drive is all that's needed - hopefully it won't be long until desktop PCs are built that way.

18 January 2007

A convenient falsehood

This was supposed to be a post about thin-clients but that will just have to wait...Mike's power supply problems have struck a nerve.

His problem stemmed from the fact that computers are becoming more power hungry, necessitating ever higher wattage power supplies. 4GHz CPUs and double-height graphics cards requiring their own power connectors all come at a price. It's not a case of "they don't make them like they used to", it's a case of "my CPU requires a 500W PSU where my last computer only needed 200W". On average the 500W would burn out 2.5 times faster than the 200W.

It raises the question: is there really a need for this much horsepower in a desktop PC? I'd suggest there isn't; the majority of users could probably get by with a 2GHz machine and onboard graphics. Obviously gamers are a different matter, but hardware enthusiast and overclocker types make up a small percentage of PC users.

I've lost count of the number of times I've heard a PC salesman say "...and you've got the latest graphics card so the kids can play their games...". These kids have probably already got one of the latest consoles so aren't going to be bothered that the family PC's packing a behemoth. It'll be there anyway, sucking the life out of the planet like the Crystalline Entity from TNG.

It's a sorry state of affairs. Considering all this "climate change" stuff, component manufacturers should be concentrating on making their fare more efficient and PC retailers should be advising customers to buy greener PCs.

15 January 2007

Apple's iPod/phone mashup

I'm purposefully steering clear of Cisco's iPhone brand name in the title of this post in the event that Apple lose the impending court battle. Not that it will make much difference now that every reference to Cisco's product is buried underneath a mountain of links to Apple's. Anyway, I will hereafter refer to Apple's product as the iPhone, although I recognise this is Cisco's brand name, yada yada...

I wasn't going to blog anything about the iPhone because there seems to be plenty of that going around already, all along the lines of "The 10 worst things about the iPhone (but I'm still going to buy one)". However, the comments in an article on the BBC News website entitled From iPhone to iGroan have prompted me to get writing.

It occurs to me that a lot of the contributors to the article haven't really considered their comments and are just complaining for the sake of it - the TV show "Grumpy Old Men" springs to mind. However...

I'll start my deconstruction of these arguments with this excerpt from one of the contributors:

The functionality really doesn't differ that much from some of the mobiles already on the market. It's just another example of how well Apple have mastered the use of brand loyalty.

When the iPod came out it didn't differ in functionality from other MP3 players on the market. The things that set it apart were its design and Apple's meticulous attention to detail making it perform those functions in the best way possible. You just need to look at the number of units sold to know that it's not just the loyal Apple fans who appreciated that and bought one. Its success had nothing to do with brand loyalty and everything to do with the fact that Apple had created a great product.

Older people seem to miss the point of convergence devices, saying things like "Why do I want to take photos on my mobile phone?". The point is convenience. Your mobile phone, by its nature, is something you have with you most of the time. Building functions like cameras and music players into it means that, with no extra effort, you can also take photos or listen to music whenever you want to. How many times have you wished you'd had a camera with you? How many times have you remembered a favourite song and wanted to listen to it straight away?

Several contributors allude to the fact that they have somewhat of a love/hate relationship with their phone. They can make calls but all the other features are hidden away behind a labyrinth of menu screens. Convergence devices are all well and good but, as Apple knows, there's no point making something unless it looks nice and is easy to use. Just as with the iPod, it's these two factors that will set the iPhone apart from the crowd and ensure it's a hit with not only the iPod generation but also the sceptics.

This leaves me asking one question - in this day and age, where cultural misunderstanding and racial hatred are rife, how can anything that facilitates better communication be a bad thing?

11 January 2007

Windows Script Components as an alternative to COM

Windows Script Components provide VBScript and JScript developers with the ability to create COM-style components. They can wrap up shared functionality into a component which can be registered on a server and instantiated using a CreateObject call just like any other COM component.

Defining a component

<?xml version="1.0"?>
<component>
   <registration progid="MyApp.MyClass" description="Example component"/>
   <public>
      <property name="Forename" internalName="strForename" />
      <property name="Surname">
         <get/>
         <put/>
      </property>
      <method name="Save" />
   </public>
   <script language="VBScript">
   <![CDATA[
   
   Dim strForename, strSurname

   Function get_Surname()
      get_Surname = strSurname
   End Function

   Function put_Surname(strValue)
      strSurname = strValue
   End Function

   Function Save()
      If IsEmpty(strForename) Or IsEmpty(strSurname) Then
         Save = False
      Else
         Save = True
      End If
   End Function

   ]]>
   </script>
</component>

Usage

Set myObj = Server.CreateObject("MyApp.MyClass")
myObj.Forename = "Derek"
myObj.Surname = "Fowler"
myObj.Save()
Set myObj = Nothing
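
The script block can just as easily be JScript. As a rough sketch of the same get/put/Save logic in plain JavaScript (runnable outside the WSC host, so the property plumbing the host normally provides is simulated with direct function calls):

```javascript
// Plain-JavaScript sketch of the component's internals. In a real WSC
// the host wires get_Surname/put_Surname to the Surname property and
// internalName handles Forename; here we call the functions directly.
var strForename, strSurname;

function put_Surname(value) {
  strSurname = value;
}

function get_Surname() {
  return strSurname;
}

function Save() {
  // Mirrors the VBScript IsEmpty checks: refuse to save until both
  // member variables have been assigned.
  return strForename !== undefined && strSurname !== undefined;
}

strForename = "Derek";
put_Surname("Fowler");

console.log(get_Surname()); // "Fowler"
console.log(Save());        // true
```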

Advantages

  • You don't need to server-side include anything to use them; once they're registered they're available from anywhere
  • Like other COM components you can use them Application or Session scoped in ASP, which means you can use a tailored object for things like a shopping basket rather than abusing Arrays or a Recordset
  • Anywhere you can use a COM component you can use a WSC - even within a .NET application

Disadvantages

  • You don't get the performance benefits of proper compiled COM components
  • They don't support destructors which can make clearing up a pain
  • You can't do proper locking in VBScript or JScript so it's difficult to avoid concurrency issues, such as when using them Application scoped in ASP

Having said all that, for the majority of applications the advantages certainly outweigh the disadvantages. Creating your data access and business tiers using WSCs allows you to work outside the confines of the ASP environment and create components you can use anywhere that supports COM.

For anyone working in a company that is resisting the adoption of .NET, this ability to write functionality for use in ASP which you can then reuse in ASP.NET provides a clear upgrade path.


06 January 2007

Freeing your digital media

Yesteryear

Ever since MP3 came about and the prospect of storing my entire music collection on my computer became reality, I've ripped CDs and downloaded tracks with much rejoicing. At the time I spent a large proportion of my time in front of a computer and it was great to have any music I felt like listening to either right there or a couple of minutes of downloading away. The iPod came along and added a whole other dimension, allowing me to venture forth into the world with my record collection tucked neatly in my pocket.

These days, however, I work in the computer industry and the last thing I want to do when I get home is sit in front of the computer to listen to my music.

I'd investigated the whole network music thing a while back when Slim Devices was the only kid on the block with their SLIMP3. Although it was a nice design it was rather expensive, and the prospect of having to turn on my computer to use it wasn't all that appealing.

Movies too?!

These days video has gone the way of audio, to the detriment of the MPAA. You have the two original industry standards, MPEG-1 and MPEG-2, but with MPEG-1's poor picture quality and MPEG-2's large file sizes they're not really viable options in the way MP3 is for audio. There are a number of other formats which are, however:

MPEG-4 Part 2
An improvement upon MPEG-1 which produces similar file sizes but with image quality closer to that of MPEG-2. There are several implementations of this but the main two are the commercial DivX and the open source XviD, both of which you may have seen support for on some new DVD players.
Windows Media Video
Now in its ninth incarnation, Microsoft's video compression format is good and has quite wide support, even if it does leave a bad taste in the mouth.
RealVideo
Has been around almost as long as MPEG-1 and is used for vidcasts by, among others, the BBC. That doesn't stop it being the worst of the three, however.

With these compression formats able to fit a DVD onto a CD (4.7GB down to 700MB) and ever more massive hard disks available, it starts looking like not only can we store and stream all our music but all our movies and TV shows too!

Affordable Network Storage

Network Attached Storage (NAS) appliances are similar to an external USB or FireWire hard disk except that they plug into your network and can be accessed by any computer on it. Recently companies have started making NAS appliances targeted at the home user and I've seen these popping up on Amazon for under £130 for 250GB.

One important point missed out of the product specs on Amazon, however, is that some of these NAS units are DLNA-compliant; indeed it was only after looking on some other sites for price comparisons that I even discovered DLNA existed.

DLNA - The Keystone

DLNA is the Digital Living Network Alliance (formerly the Digital Home Working Group) and it's been around since 2003, coming up with a set of guidelines for interoperability between networked devices.

Although I'd never heard of it before, if you look at the roster the list of companies involved is huge.

So what exactly does a NAS appliance being DLNA-compliant mean?

It means the NAS can not only store digital media (and any other files) but can also stream that media to any DLNA-compliant player connected to the network - a player being something similar to the SLIMP3.

Some of these products are already on the market and more are in development. They include set-top boxes such as Buffalo's LinkTheater and Philips' Wireless Multi-media Adapter, which will allow you to browse and watch media from your DLNA-compliant NAS, or any Windows PC, on your normal TV. You'll soon be able to buy TVs with a built-in ethernet port that do the same, and portable wireless devices that let you listen to music and watch video anywhere in the house over your network.

Exciting stuff

All this means our digital media is about to be set free. A NAS for £150 and a media player to sit under the TV for another £150 and you can watch all your digital video on your TV and listen to all your digital music on your hi-fi, all without needing your computer on. What's more, if you have HD video on your NAS you can stream that to your HD TV.

If your hi-fi is in another room and you want to stream music there as well then just buy a music player to sit on top of it that features a remote control and a screen for browsing through your music collection.

It seems DLNA has all the bases covered and I'll look forward to seeing more products coming to market with a DLNA Certified logo on them. Now all they need to do is tell people about it, which may be easier said than done with Microsoft pushing Windows Media Connect and Intel pushing Viiv. DLNA may still have the edge, however, as both of these require you to buy an expensive media centre PC to go under your TV. We'll have to wait and see.

Update - CES 2007

A recent article from the BBC's Click technology programme on the 2007 Consumer Electronics Show has this to say about DLNA:

Many companies are supporting a set of standardised formats through industry groups like the Digital Living Network Alliance (DLNA).

Not much, admittedly, but it's a start.