Rolling out technology for a non-profit with volunteers and open source

Imaging

I’d like to tell you that we moved the whole office onto Linux desktops, but that’s not the case.  All the systems are going out with a standard image of Windows XP.  However, we did make progress that will allow at least some of the workstations to move to Linux in the future.  More on that later.

The first challenge was to come up with a consistent way of working with images.  It needed to be cheap.  It needed to be easy.  It needed to have the potential for automation.  In a previous life, I was responsible for the agency-wide system replacement for Y2K at the Texas Lottery Commission.  I learned a lot about system imaging then, and I knew the commercial players in the imaging space.  Since it was all government money, I didn’t have to worry about expense.  (I’m so ashamed.)  But now we want to go free if we can.  Enter Clonezilla.

Clonezilla is a nice little combination of open source tools that lets you create and restore disk images.  There is a server version and a live CD.  We opted for the live version because of network problems that have to wait for the contractor to finish.  (If you must know, we need a buggy run between buildings replaced.  We’re going with fiber, and it will be much better!)  As it turned out, it was not difficult to put both the base system image and the bootable imaging system on an 8GB flash drive, so the entire solution fits in a pocket.

Overall, I’m pleased with the Clonezilla solution.  It did as well as the commercial products I’ve used in the past and wasn’t much more trouble to learn.  With the flash drive I was able to build a new system in about six minutes.  Not too shabby.  Since the live CD is just Linux with some tools added in and startup scripting to get it going, I can adjust that scripting to make automated builds available, along the lines of the sketch below.  A user could potentially load their own image off of a flash drive with no technical help.
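
For the automation piece, what I have in mind is roughly the sketch below.  This is only a sketch, assuming the Clonezilla live environment’s ocs-sr save/restore command is available and the image is in Clonezilla’s usual image folder; the image name and target disk are made-up examples, not our real ones.

    #!/usr/bin/env python3
    # Rough sketch of an unattended restore that the live USB's startup
    # scripting could call. Assumes the Clonezilla live environment provides
    # the ocs-sr command; "xp-golden" and "sda" are placeholder names.
    import shutil
    import subprocess
    import sys

    IMAGE_NAME = "xp-golden"   # hypothetical golden image name
    TARGET_DISK = "sda"        # hypothetical target disk

    def restore():
        if shutil.which("ocs-sr") is None:
            sys.exit("ocs-sr not found; run this inside the Clonezilla live environment")
        # -b runs in batch mode so nobody has to answer prompts at the workstation;
        # -p reboot restarts the machine when the restore finishes.
        cmd = ["ocs-sr", "-g", "auto", "-e1", "auto", "-e2", "-b",
               "-p", "reboot", "restoredisk", IMAGE_NAME, TARGET_DISK]
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        restore()

Wired into the flash drive’s boot options, something along these lines is what would let a user rebuild their own machine without calling for help.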

Structured for support

We made some decisions about the structure of the systems based on my previous experience.  Here are the key points of what we did.  You might consider some of them if you are doing your own roll-out.

1. Separate operating system and data.

System administrators have known this for years: keep your data on separate storage from your operating system.  That helps isolate damage when there’s a problem.  If the OS disk has problems you don’t have to worry about the data so much, and vice versa.  Our workstations don’t have two disks, but they do have two partitions: 20GB for the OS and the rest for the data.  That way we can re-image the OS at any time and leave the user’s data intact.  The policy is: keep your information on the D: drive and everything will be fine.  Later we’ll probably change this policy to putting those things on server storage… but some local data is almost always required, even if it’s just program settings.  Windows does allow us to move the user data to another drive, so that’s been pretty easy (see the sketch below).  Of course, as Linux moves onto the desktop we’ll be able to do whatever we want, thanks to the Linux file system’s robust aliasing and mounting approach.  (Life is so much better without drive letters!)
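
To give you a feel for the “data on D:” policy, here is a minimal sketch of the kind of per-user tweak involved, pointing the My Documents folder at the data partition through the registry.  The D:\Documents path is just an example, and in practice this can also be done from the folder’s Properties dialog rather than a script.

    # Sketch: point the current user's "My Documents" at the data partition.
    # The registry key below is the standard per-user shell folders location;
    # D:\Documents is a placeholder path for illustration.
    import os
    import winreg

    DATA_PATH = r"D:\Documents"

    def redirect_my_documents():
        os.makedirs(DATA_PATH, exist_ok=True)
        key_path = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, key_path, 0, winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "Personal", 0, winreg.REG_EXPAND_SZ, DATA_PATH)

    if __name__ == "__main__":
        redirect_my_documents()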

2. Keep careful track of your golden image

When imaging, it is important that the golden image, the base you use to build each workstation, is as clean as possible.  Any personal user information or configuration taints the image, so you want to keep it clean.  While designing the image, expect to spend a lot of time saving and restoring while you figure out what you need.  If you so much as add an icon that you want on the base image, you need to save a new one.  And if you’ve been tinkering with a system, you shouldn’t save from it; build a new clean system from the current image, add the icon, and then save the updated golden image.  This is tedious, but it’s the only way to keep your image pristine and reliable.  Virtual machines are becoming a usable resource for this kind of work, and I may explore what I can do with image creation that way.

Until you have time to redo the golden image, document the change steps that must be done after each new image install.  These will be exactly the changes you need to make when you do update the golden image.  This also gives you a nice change log for the image, which is a good thing to keep anyway (a sketch of what I mean follows).
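
Even a tiny script helps keep that change log honest.  The sketch below just appends a dated note next to the image along with a short checksum, so you can tell later whether the image you’re holding really matches the log entry; the directory layout is an assumption for illustration, not anything Clonezilla requires.

    # Sketch: append a dated entry to the golden image's change log, along with
    # a short checksum of the image files so the log and the image can be
    # matched up later. The image directory layout is assumed for illustration.
    import hashlib
    import sys
    from datetime import date
    from pathlib import Path

    def checksum_image(image_dir: Path) -> str:
        digest = hashlib.sha256()
        for part in sorted(image_dir.iterdir()):
            if part.name == "CHANGELOG.txt" or part.is_dir():
                continue
            with open(part, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    digest.update(chunk)
        return digest.hexdigest()[:16]

    def log_change(image_dir: Path, note: str) -> None:
        entry = f"{date.today()}  {checksum_image(image_dir)}  {note}\n"
        with open(image_dir / "CHANGELOG.txt", "a") as log:
            log.write(entry)

    if __name__ == "__main__":
        # e.g. python log_change.py /path/to/image "Added printer icon to desktop"
        log_change(Path(sys.argv[1]), " ".join(sys.argv[2:]))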

3. Have a plan for migrating the old data

I was blessed not to be very involved with the migration of old user information onto the new systems.  Some parts of user migration are very complicated, especially moving between versions of Windows.  I’ve gotten so used to simply scooping up a home directory and moving on that cleaning out system IDs and all of those other things just seems like a mess.  Some of our team found a solution and made it work, and I just came in afterward to check on some remote control and drive mapping issues.  The point is that users don’t want to lose their stuff.  You need a plan for how to get the old stuff out and some thought about how you’re going to keep it safe.  Separating the OS and data can help, but there may be some intertwining you aren’t aware of that turns into a big nasty surprise when you actually have to restore someone.
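
If I were scripting the “scoop up the home directory” approach myself, it would look something like the sketch below: copy the old profile onto the new data partition and leave the per-machine clutter behind.  The paths and the skip list are made-up examples, not the actual migration tool our team used.

    # Sketch: copy an old user profile onto the new data partition, skipping
    # the per-machine clutter that shouldn't follow the user around.
    # Paths and the skip list are placeholders for illustration.
    import shutil
    from pathlib import Path

    OLD_PROFILE = Path(r"E:\old-disk\Documents and Settings\jdoe")  # hypothetical
    NEW_HOME = Path(r"D:\Users\jdoe")                               # hypothetical

    SKIP = shutil.ignore_patterns(
        "Local Settings", "NTUSER.DAT*", "ntuser.*", "*.tmp",
    )

    def migrate_profile():
        shutil.copytree(OLD_PROFILE, NEW_HOME, ignore=SKIP, dirs_exist_ok=True)

    if __name__ == "__main__":
        migrate_profile()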

4. Test your procedures

You know it should work.  I know it should work.  But does it know it should work?  Workstations defy logic sometimes.  When you have a plan, try it out a few times before inflicting it on your users.  Having everyone involved do a few dry runs on test systems during the image development phase will save a lot of trouble later.  Remember, by imaging you are ultimately avoiding many kinds of troubleshooting: when something is wrong enough you just re-image, saving the trouble of installing the OS and other software and tweaking everything to the right settings for your environment.  The time you spend up front will be paid back in the time you save not doing all that re-installing and reconfiguring.
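
Part of the dry run can even be automated.  The checks below are only a sketch of the kind of sanity test I have in mind for a freshly imaged machine, confirming that the two-partition layout described earlier actually came out that way; the size thresholds are rough and just for illustration.

    # Sketch: quick sanity checks for a freshly imaged workstation, verifying
    # the layout described above: a roughly 20GB C: for the OS plus a D: data
    # partition. Thresholds are rough and for illustration only.
    import shutil
    import sys

    def check(label: str, ok: bool) -> bool:
        print(f"{'PASS' if ok else 'FAIL'}  {label}")
        return ok

    def main() -> int:
        c = shutil.disk_usage("C:\\")
        results = [
            check("C: is about 20GB", 15e9 < c.total < 25e9),
            check("C: has room left for updates", c.free > 5e9),
        ]
        try:
            d = shutil.disk_usage("D:\\")
            results.append(check("D: data partition exists", d.total > 0))
        except OSError:
            results.append(check("D: data partition exists", False))
        return 0 if all(results) else 1

    if __name__ == "__main__":
        sys.exit(main())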

Moving on

I’m pleased with how everything has been going.  We’ve been rolling the systems out and it’s been going pretty well.  The goal of all of this is to make sure the church doesn’t spend money it doesn’t have to, and can apply those funds to the unique things it must buy and, of course, to its mission, which is surprisingly not IT-related.  Any volunteer organization can benefit from using more open tools and techniques with the help of volunteers.  Sure, many organizations can get pretty spectacular software licenses donated by members.  But often those members move away or reduce their involvement, and then the organization has to make the tough decision of paying the bill to keep the whiz-bang tools or migrating to the next thing that gets donated.  Using open source solutions stops that cycle.  It also removes the barriers to entry for technical people who want to help out.  There is a bit of a learning curve for some, though most know much more than they think they do if they stop being stubborn and give it a try.  Either way, they don’t need to buy expensive software or have an enterprise background to help.

Next time I’ll probably tell you more about the software we used to gently introduce users to open source choices.  I’ll also tell you about some of the choices we made on the back end to help volunteers support the environment inexpensively.  Maybe later I’ll spend some time talking about getting volunteers to appreciate these choices and get on board.  (Maybe you’ll have some ideas for me too!  It can be so hard to keep people motivated on volunteer efforts like this, even when they are worthwhile.)
