Archive for the 'software' Category

14 Sep 07

a minor annoyance

I’ve started using Picasa to organize and edit my photos, since I don’t have the money at the moment to invest in Adobe Lightroom (the demo was lovely, though).  One gripe… Picasa can’t show EXIF data associated with a NEF (Nikon RAW) file.  It does seem to preserve it, because it’s there when I edit photos in Picasa and upload them to Flickr. Hopefully they’ll fix it in the next release.
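Incidentally, the EXIF block really is sitting inside the NEF itself (NEF is TIFF-based), so this looks like a display limitation on Picasa's side rather than lost data. Here's a rough sketch that confirms it, assuming the Python exifread package and a placeholder filename:

```python
# Dump a few EXIF tags straight out of a Nikon NEF file.
# Assumes the exifread package (pip install exifread); the filename is a placeholder.
import exifread

with open("DSC_0001.NEF", "rb") as f:
    tags = exifread.process_file(f, details=False)

for name in ("Image Model", "EXIF ExposureTime", "EXIF FNumber", "EXIF ISOSpeedRatings"):
    if name in tags:
        print(name, "=", tags[name])
```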

How quickly I’ve turned into a degenerate digicam nerd.  And to think I used to be a proud film snob.

12 Sep 07

protective camouflage

I do most of my web hosting at dreamhost.com, which has worked well for me.  I can install WordPress there (and have, for extraterrestrialhighway.net).  But for daily blogging purposes, I stay here.  Why? 

Well, here I get the benefit of a large community to protect me from “predators”… spammers and site hackers.  I don’t have to worry about the security stuff as much here, because the WordPress.com community has much more interest in keeping that stuff away than I do.  I could go for days without checking one of my other websites.  It could get defaced or used as a spambot and I’d never know it.  For that matter, I prefer hosting services over setting up a Linux box and running my websites from home, even though I could.  Paying someone $10/mo is worth it just for the backups and security patches!

Y’know, given where this idea is going, the title sucks.  But it’s a cool phrase, so I think I’ll keep it.

07 Sep 07

polish

I just remixed the Late November song “Blue Frogs from Mars”, applying some much-needed reverb. The reverb came from the excellent freeware SIR Reverb, an impulse-response (convolution) reverb. Rather than synthesizing a space the way digital reverbs usually do, it works from a recorded sound impulse of a real space (or a digital reverb, or anything else). I also applied some EQ to the vocals, and a bit of other processing. Certainly, there’s still improvement to be had, but the results so far are interesting.
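SIR itself is a VST plugin, so all of this happened inside the host, but the underlying trick is simple enough to sketch: convolve the dry track with a recorded impulse response and blend the result back in. Here's a rough Python sketch of the idea (the filenames are placeholders, and it assumes the soundfile and scipy packages, not anything SIR actually exposes):

```python
# Minimal convolution-reverb sketch: convolve a dry recording with a
# recorded impulse response of a real space, then blend dry and wet.
# Assumes the soundfile and scipy packages; filenames are placeholders.
import numpy as np
import soundfile as sf
from scipy.signal import fftconvolve

def to_mono(x):
    """Collapse a stereo array to mono so the shapes always line up."""
    return x if x.ndim == 1 else x.mean(axis=1)

dry, rate = sf.read("song_dry.wav")        # the dry mix
ir, ir_rate = sf.read("hall_impulse.wav")  # impulse response of a real space
assert rate == ir_rate, "resample the impulse response to match the song first"

dry, ir = to_mono(dry), to_mono(ir)
wet = fftconvolve(dry, ir, mode="full")[: len(dry)]

peak = np.max(np.abs(wet))
if peak > 0:
    wet /= peak                            # keep the wet signal from clipping

mix = 0.8 * dry + 0.2 * wet                # simple dry/wet blend
sf.write("song_reverb.wav", mix, rate)
```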

Here’s the remixed version

And here’s the old, dry version

07 Sep 07

excuses

Earlier this year, I built a new computer for music recording and photo editing.  For audio, I installed my cheap-but-good old M-Audio 410 PCI card; it’s not the greatest, but it has kept me from needing to invest in another interface.  But audio performance was dreadful.  If I didn’t run it with a 2048-sample buffer (over 40ms of delay), dropouts were unacceptable and the driver often crashed.  The computer itself is modern and very fast, so there was obviously some other sort of problem.  I went through all the usual suspects – driver updates, IRQ conflicts, etc. – but couldn’t find it.  Ultimately, some surfing led me to doubt the interaction between the modern PCI Express video interface and the PCI bus, and to believe that the video subsystem was stealing cycles and interrupts from PCI, killing audio performance.  I decided I would invest in a PCI Express Firewire card and a new Firewire audio interface whenever I had the money – and that’s a fair bit of money.

So last night, I decided to take another look at it.  Turned out that the soundcard was on IRQ 16… a virtual IRQ assigned by ACPI in Windows.  That meant it could be conflicting with a sub-15 IRQ.  I disabled the serial and parallel ports to free up their IRQs, rebooted, and voilà!  I was able to crank the card down to 256 samples, or about 5ms of delay, where it ought to be.  So it was an IRQ conflict after all… I just wasn’t enough of a studly nerd to spot the Windows ACPI hazard.  Now I can put off getting a new audio interface until I really need more inputs/have money to buy something extra nice.
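For the curious, the latency figures are just buffer size divided by sample rate. Assuming the card is running at 44.1 kHz (a guess; the exact numbers shift a bit at other rates), the math works out like this:

```python
# One-way buffer latency = buffer size / sample rate.
# Assumes the interface is running at 44.1 kHz; round-trip latency is
# roughly double, plus converter and driver overhead.
RATE = 44100  # samples per second

for buffer_samples in (2048, 256):
    latency_ms = buffer_samples / RATE * 1000
    print(f"{buffer_samples} samples = {latency_ms:.1f} ms")

# 2048 samples = 46.4 ms
# 256 samples = 5.8 ms
```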

The lesson to be learned, though, isn’t technical.  Rather, it’s that when I see a problem, I should keep working to resolve it instead of making excuses and looking for something or someone to blame.  And this applies to non-technical problems as well.

05 Sep 07

the fine line

A while back, I made an off-the-cuff remark on a software development mailing list… I said that most software hovers near the line between “barely works” and “almost works”. I’d like to expand on that a bit.

Much has been made about the sorry state of software quality, and smart professionals wonder why so much software just plain sucks. This question is often answered by the idea that Technology X (whatever X is) will finally result in improved software quality – whether it’s object-oriented programming, agile development, Ruby on Rails, whatever. But honestly, I don’t think software will ever get significantly above that almost/barely line.

Why? Because advances in computing power and software development tools have allowed software to become more complex. And given more resources, developers (and their customers) will choose complexity over quality. They will choose more features and new capabilities over stability and elegance. It’s easier to let a software project degenerate into a Big Ball of Mud and then rewrite it from scratch in a few years than it is to design quality in from the start.

And the complexity of modern software is amazing by the standards of, say, ten or twenty years ago.  Imagine, say, convenient libraries for handling virtually any kind of graphics file over a standardized connection protocol.  That would have been wizardry in the 1980s, but it’s something kids can knock off in Javascript today.  And did I mention the native multithreading support?  A lot of complexity gets hidden.

Occasionally, a lucky bit of software will reach its complexity limits without being obsoleted by something cooler, and then can be refined for quality.   A lot of Unix shell tools are like that, and the Open Source versions are quite lovely.  But when do we reach that limit of functionality and start going for quality, finally?  And will it happen to really large systems, like operating systems, languages, or web browsers?

And will it matter?  Ultimately, most software is designed to solve relatively small, unique problems, and is subject to that same quality limit.

Bah.  I need to finish this for now and rewrite later.

23 Aug 07

Time, resources, scope, quality

This is an idea I originally encountered in Extreme Programming Explained, by Kent Beck.  These are the four factors governing software development, but really, they apply to many other art forms. Time is the amount of time available for a project. Resources are the skill the creators have, the tools, and so on. Scope is the amount of work to be done. Quality is how good the resulting work is.

In practice, the key factors are time, resources, and scope. Two will be controlled, and the third uncontrolled. If the time and resources for a project are fixed, the possible scope is governed by them. If resources and scope are fixed, the time varies, and so forth.

Quality is a special factor – work quality can be sacrificed for short-term gains in the other three, but over the course of a long-running project, sacrificed quality will eventually affect the other three factors as well. In general, it is foolish to sacrifice quality for other gains.

Many project management problems boil down to trying to control all three key factors simultaneously – you will get this much work done, in this much time, with these resources – “stone knives and bearskins”, as Spock once put it.  Because this sort of obsessive control usually arises from a semi-conscious awareness of failure, such projects are doomed, either to outright failure or to partial success, as measured by whoever cares.