My last purchase of a new desktop computer was, if I remember correctly, made in January 2003. I bought it at a shop in San Jose, back when I was stuck working in the USA.
That machine has served me well, but in recent months it finally started to feel excruciatingly slow, even when performing trivial tasks such as browsing the Web; although that’s arguably no longer a trivial task, given all the clutter that now adorns many sites.
Justifying the replacement of a computer nearly seven years old didn’t require much thought, so I set about slowly bringing myself up to date on modern PC hardware, which is something I don’t normally stay abreast of.
After looking into buying from a local shop, I found little reason to pursue a purchase with them. They lack knowledge and expertise, their assortment of hardware is too limited, and one doesn’t get a sense that they particularly care about the customer.
Instead, I decided to follow a different path this time and order from Dell. As a Linux user, this hasn’t been a particularly attractive option in the past, because it can be hard to know which components will be used to build one’s computer. For example, one can typically specify that one wants a hard drive of a certain size, but not which make or model should be used. One therefore typically ends up with components that wouldn’t be one’s first choice, with a specification that perhaps falls short of what one would like.
In spite of this potential for dissatisfaction, I don’t have a lot of free time any more, so I was reluctant to exhaustively research the vast array of options for every component I wanted to build into my new machine. I therefore decided to relinquish such fine-grained control over the build and to restrict my choice to the much smaller range of parts that Dell had pre-selected.
Furthermore, I had been very happy with the Dell machines we had purchased and used whilst at Google, both desktops and servers. They’re robust, reliable machines and I reasoned that a new model would probably serve me very well as my new desktop machine.
I thus set about researching the various product lines offered by Dell and found them to be quite competitive, as far as price is concerned.
The biggest problem did, indeed, turn out to be that I couldn’t initially find a product line that allowed me to choose parts close enough to my ideal that I didn’t feel the compromise was too great to warrant the purchase. I could have a machine with the processor I wanted, but not the video card. Or I could have the hard drive capacity I wanted, but not the amount of memory.
Eventually, though, I came across the Precision T7500 workstation, which, whilst still not ideal, was close enough that I could live with the minor compromises. I decided to take the plunge.
My pre-sales experience of the company soon included several phone calls and e-mails with an account manager and project manager, in order to clear up some ambiguities and on-line ordering glitches. For example, the site would tell me that ordering English language software was incompatible with a Dutch power cord. This kind of thing is enough to prevent you from completing the order process.
Anyway, my contacts at Dell were surprisingly helpful and efficient, so I ultimately felt very comfortable giving Dell the order and confident that if there were any foul-ups along the way, they wouldn’t be too great in magnitude.
The new machine arrived very quickly once the order had been placed. I was surprised how quickly, actually, as I had thought there to be a reasonable chance of some component or other being temporarily out of stock.
The big day arrived a couple of weeks ago, although with two children in the house, it took me another couple of days to get around to unpacking the boxes and setting things up. It was exciting to unpack my first new desktop in almost seven years.
Here’s a breakdown of the most important hardware, together with commentary on the parts of note:
- 2 x Intel Xeon quad-core W5580 @ 3.20 GHz, 1 MB L2 cache, 8 MB L3 cache.
- 24 GB 1333 MHz DDR3 RAM.
That’s a lot of memory by today’s desktop standards, but I wanted to indulge myself and be able to concurrently run a large number of demanding tasks. You can simply never have enough RAM.
The memory is configured as 3 x 4 GB DIMMs on the first CPU and another 3 on the riser card of the second CPU. This is the optimal configuration, as it maximises the bandwidth available across the 6 memory channels of the 2 CPUs. This results in the fastest access to the largest amount of memory.
I take my hat off to Dell for optimally configuring this, because it wasn’t something I was able to specify for the build. Dell simply did the right thing.
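As a back-of-the-envelope illustration of why that layout matters (nominal figures only; real-world throughput will be lower):

```python
# Nominal peak bandwidth of one CPU socket with all three DDR3-1333
# memory channels populated (theoretical maximum, not a measurement).
TRANSFERS_PER_SEC = 1333 * 10**6   # DDR3-1333 performs ~1333 mega-transfers/s
BYTES_PER_TRANSFER = 8             # each channel has a 64-bit data bus
CHANNELS = 3                       # one DIMM on each of the three channels

per_socket = TRANSFERS_PER_SEC * BYTES_PER_TRANSFER * CHANNELS
print(per_socket / 10**9)          # ~32 GB/s per socket, ~64 GB/s across both
```

Adding further DIMMs per channel on boards of this era often forces a lower memory clock, which is another reason one DIMM per channel is the attractive configuration.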
- 2 x 1.5 TB SATA hard drives.
I didn’t know what model of disc drive I’d be receiving. I could probably have found out from my account manager, but I didn’t think to do so. Besides, I doubt that I could have influenced the choice of component.
It turned out that I received Seagate ST31500341AS drives, which look good enough on paper, but don’t feel terribly fast in use.
Regrettably, Dell didn’t offer a solid state drive as a build option, or I would have bought one for booting the OS.
As my /home is mounted over NFS, anyway, local disc speed is only really a factor when booting and writing temporary files, such as when compiling, ripping CDs, etc.
- 2 x Dell UltraSharp U2410 24″ LCD monitors: 1920 x 1200; 6 ms response time; 16:10 aspect ratio; 80,000:1 contrast ratio.
These are real beauties and a huge upgrade from my previous single 20″ LCD monitor. I plan to run them as a single display, affording me a 3840 x 1200 work area. They’ve had some mixed reviews, but I find the colours to be vibrant and true. Each monitor is individually calibrated in the factory.
Once I’d mounted them on their stands and turned them on, they had me puzzled for a while, because I couldn’t find any buttons with which to configure them. It turned out that the blue LEDs at the right edge of the screen, which I had thought were just status lights, were actually touch-sensitive buttons. I would have known this had I paid closer attention to the documentation sheet that accompanied each monitor.
- nVidia Quadro FX 4800 graphics card: 192 CUDA parallel processing cores; 1.5 GB GDDR3 RAM; 76.8 GB/sec bandwidth.
This is a very high-end card, more the preserve of engineers doing visual modelling than consumer desktops.
- Soundblaster X-Fi Titanium sound card.
A discrete sound card is still preferable to on-board audio. Dell didn’t offer me any options in the choice of card, but this is a decent one, adequate for my needs.
Looking at the BIOS configuration, I was surprised to find that the machine has a very comprehensive BIOS, in fact the most extensive I’ve ever seen. I’ll mention only the features and settings that struck me as noteworthy:
- SMART Reporting of drive errors during system start-up is disabled. I’m curious why that is, but I run smartd in Linux anyway, so I’ll hear about any drive errors soon enough.
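For completeness, here’s the kind of /etc/smartd.conf entry I mean; a minimal sketch, in which the device name, test schedule and mail address are placeholder assumptions, not my actual configuration:

```
# Monitor the first drive: -a enables all standard checks, -o turns on
# automatic offline testing, -S enables attribute autosave, -s schedules
# a short self-test nightly at 02:00 and a long one on Saturdays at
# 03:00, and -m mails warnings to the given address.
/dev/sda -a -o on -S on -s (S/../.././02|L/../../6/03) -m root@localhost
```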
- On-board audio had been properly disabled, in line with the add-in Soundblaster X-Fi Titanium card that had been installed in the machine. This is no more than one would hope for, but it would have been an easy setting for Dell to forget.
- The performance section of the BIOS has me scratching my head a little:
  - Multi-core support is on, as you’d expect, but hyper-threading is disabled. I have enabled it, as I see no reason to leave it off.
  - Intel Turbo Boost is on, but SpeedStep is off.
  - C States Control is on, as are the Hardware Prefetcher and the Adjacent Cache Line Prefetch.
  - Memory Node Interleaving was set to SMP. I left it that way until I had installed Linux and verified its proper running, but I’ve since changed it to NUMA with no perceptible improvement in performance. Theoretically, there should be one, assuming my kernel has support built in.
On the other hand, dmesg shows:
Dec 5 21:28:37 coffee kernel: No NUMA configuration found
A quick Google search suggests a Dell BIOS bug that omits essential information from the ACPI tables, needed by the kernel. Grr.
Oh well; the performance is blindingly fast, anyway.
- HDD Acoustic Mode was set to Bypass, which does nothing. I’ve changed this to Performance, which theoretically gives me better drive performance at the expense of quiet operation.
With the hardware set up, it was time to install Linux. I’ve been running Fedora for years, and Red Hat Linux for many years before that, so Fedora 12, released just a week earlier, was the obvious distribution to install. At some point, I’d like to devote some time to gaining familiarity with Ubuntu, but for now, F12 it was.
- The CD-ROM/DVD drive was positioned after the hard drives in the boot device order, so I had to move it up the list before I could boot from the F12 installation DVD.
- The DisplayPort connectors of the video card work only until X starts. I therefore had to connect the monitor via the DVI port instead and select installation using the simple video driver option.
The second monitor wasn’t used at all during the installation.
- In the factory configuration, my 2 hard drives were sdg and sdh in Linux.
The box can house 6 drives, which would suggest sda to sdf. The fact that my two drives are sdg and sdh is perhaps because they are not configured with the on-board RAID controller.
I have a media card reader installed in the machine instead of a floppy drive. It appears that these removable drives are sda to sdf.
When examining the hard drives, I found that they had been partitioned by Dell as follows (the partition labels are Dell’s):
/dev/sdg1 VFAT DellUtility
/dev/sdg2 NTFS Recovery
/dev/sdg3 NTFS OS (Windows 7)
/dev/sdg4 (Extended)
/dev/sdg5 NTFS Data
/dev/sdh1 NTFS Data
As you can see, I’d ordered the machine with a Windows 7 Professional installation. It wasn’t very expensive and I reasoned that a dual-boot machine may prove useful at some point. Having Dell install an OS on the machine may also bring hardware faults to light that would otherwise go unnoticed.
In order to make some space for Linux, I had to delete sdg5 and recreate it as an ext4 file-system. Naturally, I made sure it was empty before I did this.
I also blew away sdh1, and replaced it with a swap partition, adding an sdh2 partition with a large ext4 file-system for an as yet undefined purpose.
Interestingly, sdg3 isn’t bootable. To boot Windows, one must boot sdg2, which is where the Windows boot loader resides. This chains to the actual Windows 7 installation on sdg3.
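In GRUB Legacy terms (the boot loader Fedora 12 still used), the corresponding grub.conf stanza looks roughly like the sketch below; the (hd0,1) numbering is an assumption that the BIOS presents this disk first, so check your own mapping before copying it:

```
# Chain-load the Windows boot loader from the second partition (sdg2).
# GRUB Legacy counts from zero, so the second partition on the first
# BIOS disk is (hd0,1); the chained loader then boots Windows 7 on sdg3.
title Windows 7
    rootnoverify (hd0,1)
    chainloader +1
```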
- Once Linux was installed, the DisplayPort connectors of the video card still wouldn’t work. I still had a single-headed DVI system. Having done my homework prior to purchasing, though, I had known that this would be the case. Xorg doesn’t yet support DisplayPort.
As soon as I had installed the latest nVidia driver, both DisplayPort connectors sprang to life and it was easy to configure the dual-head. With that working, I was able to put my DVI cable away.
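For the record, the xorg.conf fragment involved is roughly the following sketch; nvidia-settings generated mine, and the identifier and orientation here are illustrative assumptions rather than a copy of my actual file:

```
# Dual-head via the nVidia proprietary driver's TwinView mode: both
# panels are driven from one card as a single 3840 x 1200 desktop.
Section "Device"
    Identifier "Quadro FX 4800"
    Driver     "nvidia"
    Option     "TwinView" "true"
    Option     "TwinViewOrientation" "RightOf"
    Option     "MetaModes" "1920x1200,1920x1200"
EndSection
```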
- I’d had the box pre-configured with a 60 GB primary partition, into which Windows 7 had been installed. However, I’d had no idea that Windows 7 was so large and that this would leave that OS with just 7 GB of free space.
In any case, 60 GB had been the largest partitioning option available to me during the specification of the build, so it was either that or allow Windows to span the entire drive, which I knew would give me nowhere to install Linux.
I had been trying to avoid doing my own partitioning this time around. If Dell could partition the drive for me, why not have them do it and save myself the bother?
I would still have to add partitions for Linux, of course, but what I had really been trying to avoid was doing anything with NTFS, especially partition resizing. Unfortunately, due to the apparent size of Windows 7, I was now going to have to resize that 60 GB NTFS partition, anyway.
Even worse, I had already installed Linux in the extended partition after Windows, so I was going to have to shrink that one before I could increase the size of the Windows partition.
Back in the day, I used Partition Magic for this kind of thing. Three years ago, however, that program had become obsolete, so I purchased Acronis Disk Director and used that to repartition my then new Lenovo laptop.
Unfortunately, Disk Director doesn’t yet work on Windows 7, and that’s part of the reason I had wanted my new box to arrive with NTFS file-systems that would require no further work.
That left me with just Linux’s own parted, which I’ve always shied away from, fearing bugs and my own stupidity. Its command-line interface is very basic and offers ample opportunity to destroy a working system.
Happily, though, there’s a nice GTK+ interface to libparted (the underlying library of parted), called GParted, and the maintainers provide a nice live CD image that you can use to boot a simple X session running GParted. Running it from a live CD is necessary, because you can’t modify partitions containing a mounted file-system. That means that running it from a file-system on a partition you intend to modify isn’t going to work.
I burned a copy of the image to CD and booted into it.
GParted seems pretty mature and looks like a less-frilled version of Disk Director. I just hoped that it had no serious bugs.
I had to perform 4 operations in total:
Move /dev/sdg6 (/) to the right and shrink it from 1.30 TB to 1.22 TB.
Move /dev/sdg5 (/boot) to the right.
Move /dev/sdg4 (extended partition) to the right and shrink it from 1.30 TB to 1.22 TB.
Move /dev/sdg3 (Windows 7) to the right and expand it from 60 GB to 150 GB.
Large discs with large partitions take a long time to process. The estimated time to complete the above operations was some 20 hours. I’m not sure why moving so many empty sectors had to take so long, but there you go. I had no other tool at my disposal. I resigned myself to not getting to play with my new computer for a while.
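A quick sanity check suggests the estimate wasn’t unreasonable, given that a move rewrites every sector:

```python
# Moving ~1.3 TB of partition data means reading and re-writing every
# byte once, so GParted's 20-hour estimate implies this throughput.
bytes_moved = 2 * 13 * 10**11      # read + write of roughly 1.3 TB
seconds = 20 * 3600                # the estimated 20 hours
throughput_mb_s = bytes_moved / seconds / 10**6
print(round(throughput_mb_s))      # ~36 MB/s, plausible for seek-heavy disc I/O
```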
Needless to say, I had the machine running via my UPS while performing this task.
The job actually took more like 18 hours, but there was an error during the final step, the resizing of the Windows 7 NTFS file-system. It threw up a message, notifying me that overlapping partitions weren’t allowed. Aargh!
Naturally, I feared the worst, that serious damage had already been inflicted on one or more partitions, but the problem turned out to be easy to correct. GParted was trying to resize the NTFS partition to 150 GB, but there was now only enough free space to allow the partition to grow to 149.99 GB.
I’d specified only that there should be no space either before or after the resized partition, i.e. that it should grow to fill the available space. It was GParted itself that had calculated that this would allow the partition to grow to 150 Gb, so the program must have a bug somewhere, possibly related to rounding sizes to cylinder boundaries. It, not I, was responsible for coming up with the values that had resulted in the attempt to create overlapping partitions.
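My guess at the mechanism, purely speculative, sketched in Python with the classic 255-head, 63-sector CHS geometry assumed:

```python
CYLINDER = 255 * 63 * 512          # 8225280 bytes per cylinder (classic CHS)

def round_up_to_cylinders(nbytes):
    """Smallest whole number of cylinders, in bytes, covering nbytes."""
    return -(-nbytes // CYLINDER) * CYLINDER   # ceiling division

requested = 150 * 10**9            # the "150 GB" the GUI reported would fit
aligned = round_up_to_cylinders(requested)
print(aligned - requested)         # 4431360 bytes of overshoot
# If the free gap is even slightly smaller than the aligned size, the
# resize request would overlap the next partition; hence the error.
```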
With that minor hiccup rectified, I booted into Windows 7. As expected, Windows first needed to CHKDSK the file-system, after which the OS came up and saw the extra 90 GB that was now available. Great.
I then booted Linux, which went equally smoothly, so I concluded that GParted had done a good job, even if it had put my heart in my throat for a minute or two at the end there. I’d certainly use it again, rather than shell out on expensive Windows software.
Since then, I’ve been configuring the various bits and pieces of Linux. Moving from a Fedora 9 to a Fedora 12 desktop has been nice, thanks to the incremental improvements here and there.
For one thing, configuring CUPS is particularly easy now. Sarah’s Vista laptop now properly prints to my CUPS-served Epson R800 printer, which is a configuration I could never get to work before.
Sound also works very well, because Creative have finally released their proprietary driver for the X-Fi card under an open source licence. It was integrated into ALSA as of 1.0.21. Before that, only a sub-standard open-source driver had been available.
Another thing that makes sound so good in this release of Fedora is that PulseAudio, the sound server, is now quite mature and reliable. For example, using PulseAudio, one can now configure the sound from different applications to go to different outputs. This allows, for example, sound from software telephone applications, such as Skype, to be redirected to a USB headset, whilst sound from all other applications is sent to the external speakers. This is a really cool feature.
Having my first 64-bit box has tripped me up a couple of times. For example, getting WINE to run the 32-bit Sonos Windows Desktop Controller software had me scratching my head for a minute or two, but it was ultimately trivial to fix. Incidentally, that was probably the slowest application on my previous computer, but it is scarcely slower than a native Linux application on this new box.
I’m immensely enjoying having two large monitors on my desk. My desktop is now 3840 x 1200, which allows me to leave Firefox maximised on the second monitor and configured to be visible on all workspaces, regardless of which virtual desktop I’m currently on. This is the only way to work! The browser is now never more than a mouse movement away.
I’ve also installed MythTV on this box and configured it as a front-end to the MythTV back-end that runs on the DVR in the living-room. This allows me to use one of the monitors as a TV, while I do work on the other. It’s pretty cool to be able to watch both live and recorded TV in my office, even though the computer contains no TV tuner. This is yet another very powerful feature of MythTV.
Incidentally, one of my new monitors had a dead pixel. I rang Dell and, without any fuss, they agreed to replace the monitor free of charge. Within a couple of hours of calling them, I had received an e-mail with the UPS tracking information of the new monitor.
The new monitor arrived less than 24 hours after calling Dell to report the defect. I wouldn’t get that level of service from a local shop, so any fears I might have had about making a major hardware purchase with an on-line retailer have been nicely squared away by this experience.
To conclude, I’m very happy with both the new computer and the service I have received from Dell, both before and after the purchase. It’s been a very smooth process, the equipment was quickly delivered to my door, a duff part was replaced with no fuss in 24 hours, and everything works as expected. I couldn’t really have expected more from the experience.
Wow. That is a pretty serious machine. I’m surprised you’d slow things down with an NFS home directory though. Wouldn’t it be easier to have that local and back it up with rsync or something? Maybe stripe those disks for speed & make sure it backs up…
OK, that is all very unnecessary with a machine that fast. 😉
NFS doesn’t have to be slow, Shawn. My NAS doesn’t have many users, so it really doesn’t take very long to read and write the few files that my everyday applications use.
I have a gigabit network, so the NAS is definitely the bottleneck, but it’s not noticeable in practice, even when the MythTV box is recording multiple streams to disc while I’m working.
As I wrote, if I rip CDs or compile code, I do that to local disc, but everything else goes straight to the network, where it’s nice and safe on the RAID 5 NAS.
Why bother with rsync scripts, which inevitably suffer from a race condition? There will always be a period of time between committing a file to local disc and having it backed up. It’s not worth the hassle for the imperceptible gain in speed.
Interesting post. I’m surprised that you didn’t add an Intel SSD into such a badass machine. I know you mentioned that Dell didn’t sell it as an option, but if I were to build a super development workstation I would surely have an SSD for my OS and home directory. It’s easy enough to add an aftermarket drive.