I know we already have a lot of holidays and special occasions in September, but I think we need another one. Let’s make September 9th International Verify Your Backups Day. On 9/9, it seems like a good idea to make sure that at least 99% of the files you’ve been backing up can be recovered; if not, why back things up?
I am certain that many of us are sporadically dutiful about using backup software, compressing a bunch of files and copying them to a CD, or syncing with a backup server. All too often this labor is lost when we can’t actually recover our files, or can’t make sense of what we do recover, when we need to (and there will always be a time when you need to recover some data). Why not spend a few minutes making sure that all of that effort isn’t in vain? Try to recover some of your old files and make sure they’re file-liciously fresh and usable!
Yes, for some of you, this means that September 8th will be International Backup Day – but that’s OK, at least you’re backing up your valuable data.
How do I back up? I work on three different systems (with four different operating systems between them, sigh) and try to keep most of my working files in one main directory that’s the same on each. I routinely compress and back up this directory into one large file, making the date of the backup part of the file name (as in 08-08-2005-Docs.zip). Then I copy this file to another hard disk and also burn it to a CD, label it with a Sharpie marker, and store it at my home or office (alternating between the two). I also have specific configuration files on each system, and I back those up too with a combination of small scripts (to run a copy, merge, and compress sequence); I then either keep the backup on that system in a directory called “backup”, SFTP it to a server, or, less frequently, burn it to CD. I usually do not worry about backing up whole applications, since in most cases it’s easier to re-install an application than to manage a huge backup file. Much less frequently, I use a full-disk backup application (like Retrospect, which I really don’t care for) and keep the giant backup file on an external 250GB hard disk.
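The dated-archive routine above can be sketched as a small shell script. The directory names here are hypothetical stand-ins for my own setup, and I’ve used tar/gzip rather than zip so it runs on just about any system:

```shell
#!/bin/sh
# Hypothetical paths -- substitute your own working directory and backup disk.
SRC_DIR="Docs"                        # the main working directory, under $HOME
DEST="$HOME/backup"
STAMP="$(date +%m-%d-%Y)"             # e.g. 08-08-2005, matching the naming above
ARCHIVE="$DEST/$STAMP-Docs.tar.gz"

mkdir -p "$HOME/$SRC_DIR" "$DEST"

# One dated archive of the whole working directory.
tar -czf "$ARCHIVE" -C "$HOME" "$SRC_DIR"

# The 9/9 step: verify the archive can actually be read back.
tar -tzf "$ARCHIVE" > /dev/null && echo "archive OK: $ARCHIVE"
```

From here, copying `$ARCHIVE` to a second disk or burning it to CD is a separate step; the point of keeping the date in the file name is that old archives never get overwritten.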
For other content like all my music files, I just do a full copy to an external drive (I have three external drives, all at least 250GB in size) and rotate among them.
I have tried many other systems, like using version control, automated .Mac-like backup services, and any number of personal or large-scale sync applications (more on them in a later post), but none seem to have the simplicity of what I’m using now.
How often do you back up your data? How do you do it?
Spotlight – TigerWiki
TigerWiki has some tips on using boolean searches with Spotlight. They also note that you can use Spotlight from the command line as “mdfind”. This means that “man mdfind” will reveal all of Spotlight’s secrets.
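A few examples of the kind of queries mdfind supports. This is macOS-only, and the search terms here are just illustrations, not anything from the TigerWiki page:

```shell
#!/bin/sh
# mdfind is the command-line front end to Spotlight (macOS only);
# "man mdfind" documents the full query syntax.
command -v mdfind >/dev/null 2>&1 || { echo "mdfind is macOS-only"; exit 0; }

# Plain keyword search, restricted to one folder:
mdfind -onlyin "$HOME/Documents" backup

# A raw metadata query using boolean operators (&&, ||):
mdfind 'kMDItemContentType == "public.plain-text" && kMDItemFSName == "*notes*"'

# Count the matches instead of listing them:
mdfind -count backup
```

The raw-query form is where the boolean searching lives: any Spotlight metadata attribute (file name, content type, author, and so on) can be combined with `&&` and `||`.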
OK, I’m impressed. It looks like Gary Becker, the Nobel Prize-winning economist from the University of Chicago, has a blog, The Becker-Posner Blog, with Richard Posner, a law professor of some distinction his own self.
Professor Becker was for many years a columnist in BusinessWeek magazine, writing on how economics affects our everyday life and how those same routine life decisions have large-scale economic implications. These columns are collected, for the most part, in his enjoyable book The Economics of Life: From Baseball to Affirmative Action to Immigration, How Real-World Issues Affect Our Everyday Life. I’ve also wanted to take a look at what is probably a brainier read: The Essence of Becker, a compilation of some of his more widely read articles. While I may not wholly agree with many of his conclusions, and also regret that many of his hypotheses didn’t have ideal or far-ranging datasets, I find the approach of studying everyday problems with an economist’s sensibility very appealing.
If I remember correctly, Becker was also recently mentioned by Steven D. Levitt, one of the authors of Freakonomics (a rather scattered and tepid book that was more of a general read on applying statistics) as a colleague and mentor.
I wonder how many other Nobel laureates have blogs?
Just a quick reminder: this Wednesday at 7pm there is another Austin Bloggers meetup. Austin Blogger meetups are certainly fabled in song and story and sure to be entertaining. If you haven’t come before, take some time out from your busy week and treat yourself to both Austin’s tastiest pizza at Mangia Pizza on Mesa Drive and to some lively conversation.
See you there.
As you may or may not have noticed, donturn.com was down for almost 72 hours. My hosting provider, dr2.net (which became mesopia.com and is now netbunch.com), seemed to have a little trouble (well, more than a little if you ask me) updating my domain name and then getting my account back online. I’ve been living with online hosting for about a decade, and I have to say that this was the most frustrating time I’ve ever had trying to get something fixed.
I am pretty certain that the people at netbunch are nice, hard-working people, but their systems have a series of problems that are not very customer-centric. There is no live phone support (you can call and leave a message, but they won’t call you back). There is a live chat feature on their web site (a great idea), but it doesn’t seem to be staffed during the hours they claim it will be. Also, despite getting a ticket number when you send in an email, the feedback loop is either slow or a null op (I’ll give them the benefit of the doubt: maybe they aren’t responding because they’re busy fixing my problem?).
To make a multi-paragraph story short(er): I think it would be wise to look around for another hosting provider in case I have more trouble. Do you have a recommendation? Ideally, it would be someone who makes hosting WordPress easy, uses something like the cPanel interface to coordinate things, provides log analysis support (like Urchin), lets me coordinate multiple domains (and their blogs) from one account (and purchase order), and has phone support (I’d even pay a fee if I really needed real-time help in a pinch). In my dream world, they’d also provide VPN, or maybe just SSL POP and SSL IMAP.
Cory Doctorow, over at Boing Boing, links to a potential scoop about Apple adding Trusted Computing to its new kernel, via a Slashdot posting and commentary that references www.osx86.classicbeta.com (where I don’t see a story about it on the main page).
Like Cory, I’ve been an on-and-off Macintosh user for a long time (since 1985 for me; Cory, since 1979? You must have been a 6-year-old hacking on that Apple II!). If Apple Computer, Inc. adds “trusted” computing, even in iTunes (wait, it’s not already in iTunes?) or in any other part of the OS, it would push me away from the Macintosh as a computing platform I would use or recommend. I have been running GNU/Linux (though I usually just say “Linux”, or in this case Ubuntu) on a PC/Intel machine for a while now, and it is pretty respectable and easy to use. I suspect many people are moving toward Linux-based systems, and this would surely push them (or their companies) over the edge.
I would miss a few applications on the Mac, such as Salling Clicker, NetNewsWire and Quicksilver, but that’s a sacrifice I’d make to know that I can use my data whenever and in whatever application I like. I encourage these software developers to make their case known to Apple that choosing to enable a DRM system inside the OS (at the kernel level even) would impact the sales of their applications.
I also happen to work for a place that buys a huge number of Macs (let’s say 10,000 a year as a guess) and I would do my best to persuade them to stop purchasing all Apple equipment. I encourage anyone else with a dog in this fight to make a declaration about this too.
If I were the rabble-rousing, organizing type, I would recommend that someone start an online petition to communicate my (and your) opinions on trusted computing to Apple. Steve Jobs has managed to reinvigorate Apple in the past few years, but I can think of nothing that would kill the Macintosh buzz and cachet quicker than locking owners out of their own data.
Update: It looks like I and others didn’t have the whole story (but who does?): there do not seem to be any current plans to build this technology into the core of the Mac OS. Some have mentioned that it could be used to ensure that the Intel-flavored OS will only run on Apple hardware. As an Apple Computer, Inc. shareholder, I can understand this; as a Macintosh user, I do not want an extra thing to worry about when using the system; and as an OS X developer, I do not want an extra set of functions or libraries to have to work with or worry about conflicting with.
I do have to ask myself: is there any situation or clever use of “Trusted” Computing or DRM that is actually useful to a user? One comes to mind, version control, but as we know there are a number of non-restrictive ways to solve that problem. Let’s discuss it.