
psake 

A build automation tool… now with less XML…

psake is a build automation tool written in PowerShell. It avoids the angle-bracket tax associated with executable XML by leveraging PowerShell syntax in your build scripts. psake has a syntax inspired by rake (aka make in Ruby) and bake (aka make in Boo), but is easier to script because it leverages your existing command-line knowledge.

psake is pronounced sake – as in Japanese rice wine. It does NOT rhyme with make, bake, or rake.

psake is a proof-of-concept still in the early stages of development. It consists of a single file, psake.ps1, which contains all the logic for creating a graph of dependent tasks and executing them in the correct order. You can download it from Google Code using Subversion or TortoiseSVN:

svn checkout http://psake.googlecode.com/svn/trunk/ psake-read-only

Here is a simple psake build script:

properties {
  $testMessage = 'Executed Test!'
  $compileMessage = 'Executed Compile!'
  $cleanMessage = 'Executed Clean!'
}

task default -depends Test

task Test -depends Compile, Clean {
  Write-Host $testMessage
}

task Compile -depends Clean {
  Write-Host $compileMessage
}

task Clean {
  Write-Host $cleanMessage
}

The properties and task script blocks can contain any valid PowerShell commands. To execute the script, simply type:

psake [task(s)]

(This assumes that psake.ps1 is in your PowerShell path.) If you don’t specify a task or list of tasks (separated by spaces), it will look for a task named “default”. You can display the command line syntax by typing:

psake -help

psake [buildFile] [tasks] [-framework ver] [-debug]
  where buildFile is the name of the build file (default: default.ps1),
        tasks is a list of tasks to execute from the build file,
        ver is the .NET Framework version to target – 1.0, 1.1, 2.0, 3.0, or 3.5
            3.5 is the default
        debug dumps information about the tasks.
psake -help
  Displays this message.
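
For example, with the sample script above saved as default.ps1, the following runs the Compile task and everything it depends on (Clean runs first):

psake default.ps1 Compile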

Remember that psake is syntactic sugar around PowerShell. So anything you can do in PowerShell, you can do in psake. That means that you can run MSBuild, NAnt, or other scripts. There is no need to completely replace your current build system. You can use psake to automate and extend it!
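
For example, a task can simply shell out to an existing NAnt build. (A sketch only – the build file name here is a placeholder for whatever your project uses.)

task LegacyBuild {
  # delegate to the existing NAnt build script
  nant -buildfile:legacy.build clean build
}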

psake automatically adds the directory of the appropriate .NET Framework version to its path. So you can access MSBuild, csc.exe, vbc.exe, or any other tools installed in $env:windir\Microsoft.NET\Framework\$version\ without the fully qualified path.

task default -depends DisplayNotice
task DisplayNotice {
  msbuild /version
}
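
Similarly, a more realistic Compile task might call MSBuild against your solution. (Again, just a sketch – the solution name and switches are placeholders.)

task Compile -depends Clean {
  # msbuild resolves because psake put the framework directory on the path
  msbuild MySolution.sln /p:Configuration=Release /v:minimal
}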

As I mentioned earlier, psake is a proof-of-concept. You’re likely to find some rough edges. I am releasing it to the community under an MIT License. I would love to get your feedback and ideas for improving it. If you are interested in contributing, please contact me. Go forth and free yourselves from the shackles of executable XML! Take a sip of psake and enjoy!

Thanks to everyone for coming out to see Achieving Persistence Ignorance with NHibernate at the Calgary .NET User Group this past Wednesday. You can download the slide deck and code here. The following are some good resources on NHibernate and the importance of persistence ignorance.

I also mentioned during the presentation the trick of modifying Configuration before constructing your SessionFactory to turn immutable entities into mutable ones for testing or data loading. You can find the details in my blog post:

Lastly, I will be putting up a registration page for the Object-Relational Mapping with NHibernate course in the next few days. If you’re interested or have any questions, don’t hesitate to email me.

Bil joins John and James on ooVoo to try out three-way videocasting. Play Silverlight video. Play MP3 audio only.

I’ll be speaking about Achieving Persistence Ignorance with NHibernate at the Calgary .NET User Group next Wednesday.

Achieving Persistence Ignorance with NHibernate

Object-relational persistence can be very complex and middle-tier code is often dominated by persistence concerns. Your Customer class probably contains more code related to loading and saving customers to the database than it does actual business rules about customers. Wouldn’t it be nice if you could remove all this persistence-related noise? This session examines why the concept of persistence ignorance is important and how to use NHibernate to build persistence ignorant domain models.

You can register here.

Date: 25-June-2008
Location: Nexen Conference Centre (801 – 7th Avenue SW, Calgary)
Registration: 4:45 pm to 5:15 pm
Presentation: 5:15 pm until everyone’s brain is full

Food and beverages will be provided.

I’m writing some integration tests around the .NET PetShop, which has no tests whatsoever. Since the architecture is tightly coupled, you can’t really start writing unit tests effectively. You have to start applying Michael Feathers’ techniques for breaking dependencies. Before doing that, I want some smoke tests around the application. That’s where WatiN comes in. I am writing integration tests at the browser level. These tests are slow because you’re exercising the full stack – browser to web server to database. You need at least some of these full stack tests in every application. A good heuristic (for a large application) is a few dozen full stack integration tests, a few hundred presenter/controller integration tests, and a few thousand unit tests. (Smaller applications would probably be a dozen full stack integration, 50 presenter/controller integration, and a few hundred unit tests.) Enough testing theory… I wrote the following test:

[Test]
public void CanLoadHomePage() {
    using (var ie = new IE("http://localhost:9999")) {
        Assert.AreEqual("Welcome to .NET Pet Shop Evolved", ie.Title);
    }
}

When I ran the test (using Gallio’s awesome ReSharper 4 Unit Test Runner support for MbUnit), Internet Explorer appeared, but I got a failed unit test:

[screenshot: the failed test and its stack trace]

and this stack trace (reproduced in text form from the image for Google’s indexing benefit):

WatiN.Core.Exceptions.TimeoutException: Timeout while waiting for main document becoming available
   at WatiN.Core.WaitForComplete.ThrowExceptionWhenTimeout(String timeoutMessage)
   at WatiN.Core.WaitForComplete.WaitWhileMainDocumentNotAvailable(DomContainer domContainer)
   at WatiN.Core.WaitForComplete.WaitForCompleteOrTimeout()
   at WatiN.Core.IEWaitForComplete.DoWait()
   at WatiN.Core.DomContainer.WaitForComplete(IWait waitForComplete)
   at WatiN.Core.IE.WaitForComplete()
   at WatiN.Core.IE.CreateNewIEAndGoToUri(Uri uri, LogonDialogHandler logonDialogHandler, Boolean createInNewProcess)
   at WatiN.Core.IE..ctor(String url)
   at WatiNTests.HomePageTests.CanLoadHomePage

Alrighty then. A TimeoutException and the Internet Explorer* instance was left stranded on my desktop. Taking a look at WatiN’s FAQ, I find this:

Which windows versions are supported?

Windows 2000, Windows XP and Windows 2003. Using WatiN on Vista gives some problems when User Account Control (UAC) is enabled. If you disable UAC almost all WatiN unit tests pass.

Disable UAC to run a testing framework??? Note that I’ve used WatiN on Vista before without disabling UAC, so it had to be some other setting. I noticed that Internet Explorer was running in Protected Mode, which dramatically limits the damage that a hijacked browser can do. Protected Mode runs IE in low integrity mode. (Low integrity processes are dramatically limited in which window messages they can send to higher integrity processes – such as Visual Studio – and in where they can read/write files.)


Solution

The obvious solution is to turn off IE Protected Mode for the site. Easiest way to do this? Add the site to your trusted sites list.
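
If you prefer to script that change – a minimal sketch, assuming the standard ZoneMap registry layout for the current user; double-check the result in Internet Options afterward:

# add http://localhost to the Trusted Sites zone (zone 2) for the current user
$domains = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains'
New-Item -Path "$domains\localhost" -Force | Out-Null
New-ItemProperty -Path "$domains\localhost" -Name 'http' -Value 2 -PropertyType DWord -Force | Out-Null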


Survey says!

[screenshot: the test passes]

So rather than turning off UAC or Protected Mode in IE completely, you can just add the site under test to your trusted sites list and WatiN works.

For the Paranoid

The astute among you may be wondering if I haven’t opened up a security vulnerability by adding http://localhost to my trusted sites list. Let’s put it this way… If some l33t haxor can drop files on your box and start a webserver, they have already pwned your box. They don’t need you to navigate to a local site with unprotected IE to infect you more.

* For the curious, I rarely use IE, mostly just for certain Microsoft sites that are still IE-only. Heck, I’ve been running RCs of Firefox 3 for a while and upgraded to the RTM version yesterday when it was released. If you haven’t tried Firefox 3, you really need to. Gmail is oodles faster in FF3 than in IE or FF2. You might wonder why I’m using IE with WatiN… WatiN only started supporting Firefox as of WatiN v2, which is currently in CTP. Historically, WatiN (and Watir) have relied on IE’s COM automation API to interact with the browser, which is also why WatiN/R doesn’t provide a way of getting HTTP status codes – the COM API doesn’t expose them!

For those of you who missed it, JetBrains officially released ReSharper 4 last week. A list of new features can be found here. Most notable is full support for C# 3.0 and LINQ, but there are improvements in lots of other areas. (I’ll point out these improvements in the Becoming a Jedi screencasts. I’ve got an episode planned all around the new C# 3.0 features, which should surface in the next few weeks.) New keymaps for both the Visual Studio and ReSharper 2.X/IDEA schemes can be found here. Congratulations to the entire JetBrains team for getting this release out the door! If you haven’t tried ReSharper 4, you owe it to yourself to take it for a spin.

My third episode of Becoming a Jedi is live. In this episode, I start looking at ReSharper’s refactoring capabilities.

Episode Listing

Part 1 of N: Code Browsing – streaming | download
Part 2 of N: Code Cleanup – streaming | download
Part 3 of N: Refactoring I – streaming | download

Streaming requires Silverlight 1.0 or higher. Download is via Microsoft SkyDrive.

After finishing the episode, I realized that I committed a huge refactoring faux pas. I neglected to run the unit tests after each refactoring. I was feeling cocky and was only doing simple refactorings such as renames. When I tried to run the application later, it failed because it could no longer find PetShop.SqlServerDAL.Category, which had been renamed to PetShop.Repositories.CategorySqlRepository. So even on simple refactorings, you need the safety net of a good suite of unit tests. Lesson learnt.

Earlier this week, I received three (3) complimentary copies of Visual Studio 2008 Team Suite with an MSDN Premium Subscription from S. Somasegar, Corporate Vice President of DevDiv at Microsoft. This very generous gift, worth $10,939 USD per copy, was a thank you to developer MVPs for their work in making Visual Studio 2008 a success. I am free to distribute the three copies as I see fit. There are no strings attached for me or the recipient (other than import rules related to your country, of course).

The first copy goes to Fabio Maulo, the recently-appointed leader of the NHibernate project. Fabio has been tirelessly porting features from Hibernate to NHibernate. Due to his hard work, the soon-to-be-released* NHibernate 2.0 has feature-parity with Java’s Hibernate 3.2.5. He is now working on porting Hibernate 3.2.6** features to NHibernate 2.1 (the current trunk). When not porting features, he is answering NHibernate-related questions on the NHibernate developers and nhusers mailing lists. Thank you for all your hard work, Fabio.

The other two (2) copies are part of The Great NHibernate/Castle Giveaway. This is a big thank you from me to the vibrant communities around both the NHibernate and Castle projects. If you’re not familiar with either project, you owe it to yourself to check them out. Each project receives one copy to award to whoever they feel has made (or makes) significant contributions to their project. I humbly suggest that they encourage developers to get involved and contribute to the project, with the award going to the developer with the most significant contribution over the next few months. The exact rules will be determined by the core team of each project. If you’re interested in contributing to either project, you can find information here:

NHibernate – Getting Started with the NHibernate Source Code

Castle Project – Get involved

* NHibernate 2.0 Alpha1 is available now. The general availability (GA) release should happen in the next few months.

** Hibernate 3.2.6 is the current production version of Hibernate.

Today I decided to contribute some patches to the upcoming NHibernate 2.0 release. First order of business was to get latest and then run:

nant clean build > output-debug-build.log

which compiles NHibernate and the unit tests. Unfortunately, something seemed to have gone horribly wrong: the build finished early. Usually NAnt chews on a large project like NHibernate for a while and then spits out something like this near the bottom of the output:

BUILD SUCCEEDED

Total time: 57.5 seconds.

I started spelunking through the output to try to figure out what was wrong and noticed this at the bottom of the output:

BUILD SUCCEEDED

Total time: 11.9 seconds.

About 12 seconds to compile everything! I’m used to a quick return from NAnt usually meaning a failed compile. Not in this case. The Ultimate Developer Rig – Kovacs Edition is just plain fast. And that made me happy.

CORRECTION: The above NAnt command builds the unit tests, but doesn’t actually run them. Mea culpa.

UPDATE: I ran everything again including running the unit tests:

nant clean build test > output-debug-build.log

On my Latitude D820 (Core Duo 2.0 GHz), total time was 98.6 seconds.

On the Ultimate Developer Rig – Kovacs Edition, total time was 62.8 seconds.

Yes, unit tests – especially for NHibernate where you really have to touch the database for most tests – take time to run. Note that a really fast box doesn’t help unit test times as the unit tests take about 40 seconds to run regardless. This says to me that unit test times are dominated by the time spent communicating with the local SQL Server database.

N.B. NAnt does not support parallel builds, so only one core was kept busy. You can hack MSBuild to support parallel builds, as documented by Scott Hanselman. Given that most developers have at least dual- if not quad-core machines, I hope that build tools start officially supporting parallel builds.
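
For the command line at least, MSBuild 3.5 can already build independent projects in parallel via its /m (maxcpucount) switch – for example (the solution name here is a placeholder):

# build projects in parallel using all available cores
msbuild MySolution.sln /m /t:Build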

As I mentioned here, my motherboard died a horrible death two days after its warranty expired. The motherboard was based on the 939 platform for AMD processors, which has been discontinued in favour of the newer AM2 platform. My choice was to either start hunting around eBay for a used (and hopefully still working) 939 motherboard or get a new one based on a different chipset. The problem with getting a new one is that AM2 uses a different processor socket (hence new processor) and DDR2 RAM (hence new RAM as 939 uses DDR). So I’d be replacing a lot of components. Decisions, decisions…

Lately I’m doing a lot of podcasting and screencasting, as well as development. Encoding audio and video is time-consuming and, if you’ve got a decent encoder, is one of the few places you’ll benefit from a quad-core processor. So I started doing some research. I’ve been a big fan of AMD for years, but the reality is that the Phenom quad-core has had some problems, such as the TLB bug (fixed in the B3 stepping), a higher TDP than the Core 2 (125W vs. 95W), and mediocre performance compared to the Q6600. The Core 2 platform has been out for a while, it’s stable, and it has excellent performance. Since I had to buy a new processor and RAM as well as a motherboard, I decided that a Core 2 quad was the way to go…

It was time to do my homework. My primary starting point was Jeff Atwood’s (aka Coding Horror) Building a PC series, where he builds the Ultimate Developer Rig for Scott Hanselman – hence the title of this blog post. With some initial ideas, I started looking for more information at Tom’s Hardware and AnandTech. (I’ve been a fan of Tom’s Hardware for years – ever since Thomas Pabst, the founder, was running the site as a hobby while interning as a physician.) One of my goals was to re-use as many of my existing components as I could. Here is what I ended up with:

Component  | Scott’s                                                        | My Old                    | My New
Case       | Antec P182                                                     | Antec Sonata II           | Antec Sonata II
PSU        | Corsair 520HX                                                  | SilverStone Strider 750W  | SilverStone Strider 750W
Mobo       | MSI P6N SLI Platinum                                           | Gigabyte K8N Ultra-SLI    | MSI P7N SLI Platinum
Memory     | 2 x Kingston ValueRAM 2GB                                      | 2 x Corsair 2GB (DDR)     | 2 x Patriot 4GB (DDR2)
CPU        | Intel Core 2 Quad Q6600                                        | AMD Athlon X2 4800+       | Intel Core 2 Quad Q9450
CPU cooler | Scythe Mine                                                    | Stock                     | Zalman CNPS9700NT
Video      | 2x GeForce 8600GTS                                             | GeForce 7900GS            | GeForce 8800GTS
HDD        | 1 x 150GB 10,000rpm Western Digital, 1 x 500GB 7200rpm Seagate | 4 x 320GB 7200rpm Seagate | 4 x 320GB 7200rpm Seagate
DVD        | 20X DVD+/-R Burner                                             | 16X DVD+/-R Burner        | 16X DVD+/-R Burner

* I’m linking to Memory Express, an awesome computer parts store here in Calgary. They also have a location in Edmonton, with another one opening in Winnipeg soon. They’ll ship anywhere in Canada, though they cannot ship outside Canada, unfortunately.

I’m not going to bother with prices as those change so quickly. I ended up replacing mobo, CPU, cooler, RAM, and graphics card for about $1000. Let’s examine the parts and my reasoning for each…

Case

My Antec Sonata II case has been serviceable and I didn’t have any major complaints. Overall I’ve been very pleased with Antec cases. If I had to buy one right now, I’d buy the P182 (same as Scott’s) or the Nine Hundred Ultimate. Neither comes with a power supply, but you’re better off buying a separate power supply than using the stock supply that comes with a case like the Sonata III. I replaced my stock supply when I put in a RAID array a while ago, as the stock supply would have been dangerously close to its limit with the 4 hard drives plus other components.

PSU

I’m running a SilverStone Strider 750W and highly recommend it. Previously I was running a SilverStone Zeus, but had to replace it under warranty a year ago. I bought a Strider so that I had a spare PSU while waiting for the Zeus to return. I’ve got the Zeus sitting in my closet as a spare because I prefer the Strider. My favourite feature… detachable cables. You only attach the cables that you need, which dramatically reduces cable clutter in your case.

HDD/DVD

I know that Jeff Atwood swears by 10,000rpm drives for your boot partition. I’m running 4 x 7200rpm drives in a RAID 0+1, which provides striping and mirroring: striping for increased performance, mirroring for data integrity. I considered running RAID 5 for a while, but everything I’ve read indicates that the parity calculations kill performance. The advantage of RAID 5 is that you get more usable space. Maybe I’ll try a 10,000rpm boot drive at some point, but for the time being, I’ll stick with my RAID 0+1 array.

As for DVD, get yourself a good burner that does DVD+R, DVD-R, and any other format you care about. Not much to say here as you can get a decent 16x or 20x DVD+/-R for $25 to $50. You likely won’t see much difference between 16x and 20x drives as they only reach full speed at the outer edge of the disk. (CDs and DVDs are written in a continuous spiral from inner hub to outer edge.)

Mobo

Gigabyte is a good name in motherboards, but I’ve never been terribly happy with the K8N – aside from it dying two days after its warranty expired. There were lots of minor annoyances, such as a non-standard 1394 connector so I couldn’t use the front FireWire connection, SATA connectors that prevent newer (and longer) graphics cards from being installed, poor memory timings when using 4GB rather than 2GB of RAM, and an nForce4 bug that caused awful static for my Creative X-Fi audio card – though some of those deficiencies can be attributed to chipset limitations.

So it was time to figure out which new mobo to choose. First choice was the basic platform, not manufacturer. When choosing a motherboard, choose a chipset to match your processor, memory, and graphics card requirements, then choose a manufacturer that makes a motherboard with the desired extras, such as built-in high def audio, eSATA ports, extra SATA connectors, RAID levels, etc.

X38 and X48 boards are getting good reviews and have nice features, but they’re currently only available with CrossFire – ATI’s equivalent of SLI. This isn’t a huge deal if you’re only running one NVidia video card, which I am, but it doesn’t leave a lot of options for the future. Plus, X38/X48 boards are stupidly expensive. If you want a board capable of handling multiple NVidia cards, I would recommend an nForce-based board. (I was a bit worried given my experiences with the K8N, which uses nForce4.)

The MSI P6N, which Scott has, is based on the nForce 650i chipset. I decided on the MSI P7N SLI Platinum, which is an updated version of Scott’s board and based on the nForce 750i. It has a nice range of features and the extra cost of the nForce 780i boards didn’t seem worth it. (The 780i can handle faster RAM and has better PCIe speeds in SLI mode.) Reviewers had positive things to say about the MSI P7N SLI in terms of features, overclockability, and stability.

Memory

I loaded the system up with 8GB of RAM. Why? Because the motherboard supports it, I’m running Vista x64, and you can never have enough memory. Most importantly, it was dirt cheap. $180 for 8GB of DDR2-800 RAM!

I decided to go with DDR2 rather than DDR3. DDR3 is much more expensive (8GB would have cost upwards of $650) and the performance difference is negligible. (Read AnandTech’s DDR3 vs. DDR2 and Tom’s Ultimate RAM Speed Tests.) DDR3 is the future, but from what I’ve read, you don’t see a difference until you get DDR3 clocked at 1.8GHz, which is just becoming available. (The DDR3 above for $650 was for 1333MHz sticks.) Additionally, motherboards capable of supporting DDR3 are more expensive. So you’re paying more for your mobo and more for your RAM without seeing any real performance difference – usually in the neighbourhood of 1-3%.

What is the downside of DDR2? Everyone is going DDR3. So when you want to move up, you have to replace your mobo and RAM. The upside… By the time we’re seeing real differences between DDR2 and DDR3, prices will have dropped for DDR3 RAM and mobos such that the money you saved by buying DDR2 will more than pay for the upgrade, IMHO.

Once I chose DDR2, the next choice was manufacturer and speed. DDR2-1066 or DDR2-1200 is more expensive than DDR2-800 (aka PC2-6400) and once again doesn’t offer significant performance advantages. I was also warned of stability problems at the higher frequencies, even with good RAM and a quality mobo. Getting good quality DDR2-800 with good timings gives you better stability and more overclocking headroom. Patriot got good scores on overclocking. Also noted in Tom’s Ultimate RAM Speed Tests is that you’re better off going with high quality DDR2-800 or DDR2-1066 RAM with good timings rather than DDR2-1200 or DDR3 with average timings. So I chose two sets of the Patriot Extreme Performance DDR2 4GB (2 x 2GB) PC2-6400 Enhanced Latency Dual Channel Kit, which run at 800MHz stock with 5-5-5-12 timings.

CPU

The Intel Core 2 chip is a well-engineered piece of hardware. Quad-core Core 2’s routinely out-perform quad-core Phenoms. The top-of-the-line Phenom – the 9600 Black Edition – when overclocked to 2.7 GHz is still marginally slower than a stock Intel Core 2 Q6600. At stock speeds, Q6600 beats the Phenom 9600 by over 13% on average. Yes, there are certain benchmarks where the Phenom does better, but overall, the winner is the Q6600. In most audio and video encoding benchmarks at stock speeds, the Q6600 is the clear winner.

Now Q9450 vs. Q6600… The Q6600 is an excellent chip as you can see from the benchmarks above. It runs at 2.4 GHz, has 2x4MB of cache, and currently costs about $240 CAD. The Q9450 runs at 2.66 GHz, has 12 MB of cache, and currently costs about $380 CAD. You can always overclock the Q6600, but you’ll never increase the cache size. Since trips to main memory are often a bottleneck, more cache seems like a good idea. I wasn’t able to find any benchmarks comparing the Q9450 vs. Q6600, but $140 for the extra stock speed and cache seemed like a reasonable thing to do.

CPU Cooler

I know that Jeff Atwood swears by Scythe coolers. After reading a few reviews, especially this Scythe Infinity review at AnandTech, I wasn’t wild about the coolers. The deal breaker for me was this quote from the AnandTech review:

Installation is very easy after the mounting plate is installed. The 775 mount uses push pins – just like the Intel retail design. However easy the mount is, the fact that the Infinity weighs right at 2.2 lbs, or a kilogram, gives reason for pause. It is very uncomfortable having so much weight held by just those pop clips.

A kilogram of metal held by pop clips? I checked the data sheet for the Infinity and it is listed as 960 grams – just shy of a kilogram. The Sonata II is a tower case, which means the motherboard is on end and the full weight of the cooler would be held by the pop clips. This just makes me nervous. If the clips gave way (or I didn’t install the cooler properly), I would have a big hunk of metal falling onto my graphics card. The sudden loss of cooling would likely have dire repercussions on my Q9450 too!

Tom’s New Reference System is equipped with a Zalman CNPS9700 cooler. This seemed like a solid cooler with good marks. Take a look at the AnandTech Infinity review, where the Zalman CNPS9700 ranks near the top both at idle and under load. You’ll also note that aftermarket coolers are much better than the stock cooler that comes with your processor. Do your research though, as some Zalman coolers, such as the CNPS8700LED, are not recommended. A good place to start is Tom’s Hardware CPU Cooler Charts 2008, Part 1, Part 2, and Part 3.

The biggest complaint from reviewers regarding the Zalman 9700 was the installation: you have to remove your mobo to secure the cooler. I was replacing the mobo anyway, so this wasn’t a concern, and I liked the idea of a secure mounting system for such a crucial component – one that could mean the difference between the life and heat death of my processor. No surprise – I opted for the Zalman CNPS9700NT.

Video

The perennial question… NVidia or ATI… NVidia’s GeForce cards are winning most benchmarks, but the ATI Radeon 3870 supports DirectX 10.1. Given that most software barely supports DirectX 10, 10.1 support wasn’t high on my must-have list. I’ve been using GeForce boards for years and they generally have good driver stability. (ATI is historically known for problematic drivers – especially with newer cards. This may have changed in recent years.) I wanted a card with dual DVI out to run my two Acer x243w monitors. (Great monitors. Highly recommended.) The GeForce 8800GTS fit the bill.

If you’re in the market for a new video card, compare the GeForce 8800GT and GeForce 8800GTS. The main difference between the two is that the GT has 112 universal shaders while the GTS has 128. The GTS is a dual-slot card that exhausts its hot air outside the case, resulting in a cooler case. You won’t see much difference between the two cards in benchmarks and both are good buys. I chose the 8800GTS because the sale price was virtually the same as the 8800GT’s. And whenever I’m looking for a new card, my first stop is Tom’s Hardware Best Graphics Cards for the Money. Look for the latest edition to see what gives you the best bang for the buck.

Conclusion

One of the key advantages of building your own developer rig is the ease of replacing or upgrading parts. You learn a lot about computer hardware, which I believe is valuable for any developer. It doesn’t take a rocket scientist to build a computer. Just some research and asking around. I don’t consider myself in the same league as Jeff Atwood when it comes to building PCs, but I still find it enjoyable and rewarding to build out my own systems.

How hard is it to build a PC? Ask my wife. The first time I did it, I spent hours swearing and fiddling with parts. (I had built systems a decade earlier, but connectors, cases, mobos, and everything else had changed dramatically. Plus I was out of practice.) It was a learning experience. The second one I built, I had assembled in less than an hour. This latest upgrade, which was sizable because I was replacing the mobo, took around 30 to 45 minutes. Most upgrades can be done in much less time. It’s not as hard as you might think.

The end result of my mobo upgrade (with corresponding CPU and RAM upgrades) was going from this:

[screenshot: old system specs]

to this:

[screenshot: new system specs]

The speed boost over the old system is noticeable. (I haven’t overclocked the system, though it is amenable to it. I’ll get around to some light tweaking eventually.) I can encode a half-hour episode of Plumbers @ Work using the LAME encoder in under 7 minutes. (As I recall, it used to take around 15 minutes to encode, or about half the length of the audio.) Encoding screencasts is also much faster, though I haven’t timed it. (The old system would take roughly twice as much time as the video length for encoding. So a 10 minute screencast would take 20 minutes to encode. The new system takes about as much time to encode as the length of the video.) More importantly, the system remains responsive and usable even when encoding audio or video. I can simultaneously encode a screencast, compile in Visual Studio 2008, listen to music (without skips), and check my email without a hiccup. Multitasking at its best. I’m a happy developer. Do yourself a favour and consider building your own Ultimate Developer Rig.