Basic Wireless Auditing with Linux and Open Source Tools

 

 

By Todd Hughes, November 22, 2009

 

            On occasion I am called upon to perform some basic auditing of a client’s wireless network. The standard audit consists of war-walking/driving around the client’s premises to collect data on any available wireless networks and then hauling this data back to my office for a detailed analysis.

            Being a FOSS kinda guy, I utilize several open source tools and a Linux operating system. My OS preference is any Debian-based distro; currently I am running Mepis 8 on my laptop. The latest BackTrack distro is an excellent choice too, as it has all the necessary tools already installed and mostly configured. While I do use BT a lot (I’ve got it installed on a bootable USB stick with a persistent partition for saving changes), I wanted to install some basic tools on my Mepis laptop so that I wouldn’t have to fire up BT every time I needed to do a quick audit.

            In order to do basic wireless testing you’re going to want a few wireless pen-testing tools installed. The best one I’ve found for discovery and data gathering is Kismet. Kismet sniffs 802.11a, b, and g traffic, identifies named wireless networks, exposes hidden networks, and captures all the packets into a dump file. It also works in conjunction with GPS to record the location coordinates of any networks discovered. I need to point out that you will require the latest version of Kismet; the older versions save data in a different format that won’t work with some of the tools referenced in this article. Debian-based distros that use the stable repositories (like Mepis 8) will most likely grab the older version if you do an apt-get, while Ubuntu, I believe, will get the latest version. Your best bet is just to grab the latest source code (via SVN) and build from scratch; it’s very easy to do. If you’re familiar with the older Kismet, you are going to love the newest version.
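            For reference, the build from source goes something like this (the SVN path below is illustrative; grab the current URL from kismetwireless.net):

# You'll need the usual build tools plus the libpcap and ncurses dev packages
svn co http://svn.kismetwireless.net/code/trunk/kismet kismet-devel
cd kismet-devel
./configure
make
make suidinstall    # or plain 'make install' if you skip the setuid capture helper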

            You will also need a decent rfmon (raw monitoring) capable wireless card. It’s also nice to have a card that is capable of doing packet injection for those jobs that require you to actually attack the client network (but that is out of scope for both the standard wireless assessment and this article). I use either a Cisco Aironet card (AIR-CB21AG-A-K9) or a Ubiquiti SRC300 with an external antenna. Both of these are PCMCIA cards with the Atheros chipset, perfect for our whitehat wireless exploits. Do a bit of Googling to determine if your card supports the necessary modes.

            Next up on our requirements list is some type of GPS antenna/receiver. There are many different flavors out there, I use the Pharos GPS500 III with the USB adapter. In addition to the GPS hardware you will need some software to make it work. I recommend “gpsd” and “python-gps”, both packages available via the standard apt repositories. (As a side note, there is a lot of mapping software available to turn your laptop into a GPS device ala TomTom, Garmin, etc. Take a look at Viking, GPSDrive, Roadnav, or Navit.)
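            Once the packages are installed, a quick sanity check looks something like this (the device node is an assumption; check dmesg for yours):

# Point gpsd at the receiver
gpsd -n /dev/ttyUSB0
# Watch for a 2D/3D fix with the curses client (gpsd-clients package on Debian)
cgps -s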

            Some other tools I use: a Python script, “pykismetkmlv0.42.py” (available at Google Code, code.google.com), which converts the Kismet data files into a format that can be imported into Google Earth; “macchanger” (apt-get install macchanger), which allows me to spoof the MAC address of my wireless card (always a good idea to disguise yourself); and “thcrut”, “fing”, “nmap”, or some other tool that will allow you to gather MAC addresses on the client’s wired network. My favorite for this is “arp-scan” (it’s in the standard Debian repositories).

             At this point I will not be spoon-feeding you a step-by-step “how-to for dummies” but will rather assume that if you are doing this type of testing/auditing, you know how to handle yourself around Linux and the command line. That said, here’s my basic framework for a simple wireless audit:

             Configure Kismet to work with your wireless card and GPS receiver, and make sure it will dump its data files somewhere that you can find them (the default /tmp/datafiles is not a good idea). Start up gpsd, verify that it’s getting a signal lock, spoof the MAC address of your wireless interface, put the interface in monitor mode, start up Kismet, and then spend some time walking/driving around the target site. Better yet, write a little script that will do all of the above (below is mine; feel free to edit for your needs):

#!/bin/bash
# Tear down the existing madwifi VAP and recreate it in monitor (rfmon) mode
wlanconfig ath0 destroy
wlanconfig ath0 create wlandev wifi0 wlanmode monitor
# Spoof the MAC address, then bring the interface up
macchanger -m 00:DE:AD:BE:EF:00 ath0
ifconfig ath0 up
# Start gpsd against the USB GPS receiver, then launch Kismet
gpsd -n -D 2 /dev/ttyUSB0
/usr/local/bin/kismet

             After you’ve collected a good amount of data, shut down Kismet, make sure you’ve got data, and then head to the wired network to grab a list of MAC addresses. I run arp-scan against the internal network and save the data to a file somewhere. Remember, ARP requires that you be on the same network segment that you are arp-scanning, so if there are several subnets you will need to physically plug into each one to grab the MACs.

 arp-scan -I eth0 192.168.11.0/24 > /data/11 gets me:

root@mepis:/home/thughes# cat /data/11
Interface: eth0, datalink type: EN10MB (Ethernet)
Starting arp-scan 1.6 with 256 hosts (http://www.nta-monitor.com/tools/arp-scan/)
192.168.11.1    00:12:17:02:97:fe       Cisco-Linksys, LLC
1 packets received by filter, 0 packets dropped by kernel
Ending arp-scan 1.6: 256 hosts scanned in 1.385 seconds (184.84 hosts/sec).  1 responded

            I suppose you may be wondering why I bother with the arp-scan stuff. Comparing a list of MAC addresses gathered on the wired network against the MAC addresses of all the wireless access points and clients can sometimes aid in determining if there are any “rogue” access points on the network.
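            If you want to script that comparison, something like the following works as a starting point. This is only a sketch: the file names come from the examples above, and the Kismet .nettxt parsing is an assumption that may need adjusting for your version.

# Wired MACs from the arp-scan output saved earlier (/data/11)
awk '/^[0-9]+\./ {print tolower($2)}' /data/11 | sort -u > wired-macs.txt
# Anything MAC-shaped in Kismet's network summary (file name/format varies by version)
grep -oiE '([0-9a-f]{2}:){5}[0-9a-f]{2}' Kismet-*.nettxt | tr 'A-F' 'a-f' | sort -u > wireless-macs.txt
# MACs seen on both the wire and the air -- candidates for a closer look.
# Note: an AP's wired MAC and its BSSID often differ by a bit or two, so
# treat near-misses (same OUI, adjacent addresses) as interesting too.
comm -12 wired-macs.txt wireless-macs.txt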

            Now that you have all the data, you need to analyze it and present a report to the client. I use Wireshark to dig through the pcap dump and look for any sensitive info (passwords, usernames, etc.). Using the Python script mentioned earlier, a .kml file can be created and imported into Google Earth to create some nice images to include in your report (like this):

[Image: wireless networks discovered during the audit, plotted in Google Earth]

            That’s about all there is to your basic wireless audit/assessment. Your report should probably include a list of all wireless networks found and whether they could be associated with the client’s network, signal strengths, what type of encryption (if any) each network uses, a list of all clients discovered and which networks they were associated with, whether any sensitive data associated with the client’s network was captured, a map or diagram of the wireless networks discovered, and anything else you feel is relevant.

            As always, if you have any questions, comments, complaints, or would like to contribute to my beer fund, feel free to contact me at thughes@fwpm.com.

Copyright 2009 Todd Hughes

Auditing and Logging

Is auditing and logging enabled on your servers and workstations? I bet many of you answer “yes”. My follow-up question would be, “When did you enable it?” That's right, folks: basic logging is NOT enabled by default within Windows. This may come as a surprise to some, but the default Windows Event Logs do not amount to basic auditing and logging.

 Strap on your tinfoil hats and follow along with this scenario: Imagine that your database server is hacked. You are tasked with finding out what happened. Which files have been modified? Who hacked the machine? How long has it been compromised? Was it an inside job? 

 Let's start with the server itself: which files have been modified recently? Probably easy enough to figure out by looking at the timestamps. Now, who modified them? What's that? You don't have logging enabled for object access? Or directory access? That's right, it's not enabled by default.

 Perhaps you can narrow down the time frame to somewhere between 2:00 and 2:15 AM last Sunday night. So, who logged into the server at that time? What, you don't log successful and failed logons? (Again, not enabled by default.) You get the picture.

 Another real-world example: a colleague was having issues with an Exchange server sending large amounts of spam. Apparently an external entity was using the box as a relay, yet the box was not misconfigured as an open SMTP relay. So what was going on? Basic Exchange logging was useless. After enabling detailed logging, it was discovered that a third-party backup utility installed on that server and configured with default credentials was being used to obtain a valid logon and send mail through the box (let this be a lesson about changing default credentials).

 As we can see, logging and auditing are very important. You can utilize them for forensics, for troubleshooting, to detect anomalies before real problems surface, and to gain a better understanding of what is going on within your network.

 One of the oldest forms of logging is syslog, from the Unix world. This is an accepted standard and format that *nix machines have used forever. Unix and its variants are very good when it comes to logging (some may say a bit too anal about it). These machines tend to log all operating system and user related events, along with a majority of the events generated by any applications running on them. Troubleshooting becomes very easy with this detailed level of logging.

 But, alas, we're not here to talk about *nix machines only. Windows uses the Event Logs to keep track of what's happening with the device. The only pitfall is that even basic events are not logged by default. Successful and failed logons, object access, etc., are not logged. I can't stress how important it is that you develop a policy of enabling this on all of the servers that you build. Here's a good primer on getting started:

Recommendations on what to log

A simple "how to"

 All of this logging tends to rapidly eat up drive space, which is most likely why it is disabled by default in Windows. You will find that the best solution to this is centralized log aggregation (required for regulatory compliance in many industries). Basically, you point all the logs from all of your servers to one log server device. This gives you a central repository for the information, allows you to control access to the logs (an important security consideration), and increases your log retention time (dedicated space for log storage). Most central log servers also have a search interface, and some even have a correlation engine that allows you to set up alerts based upon certain thresholds or events. Examples include the Kiwi syslog products, Cisco MARS (more of a Cisco-centric product), a simple home-built Linux syslog server, xDefenders' ESM appliance (shameless plug), etc.
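 On the *nix side of the house, pointing a box at a central collector is trivial. Here's a minimal sketch for a classic sysklogd setup (the server address is just a placeholder):

# Forward everything to the central log server (placeholder address)
echo "*.*    @192.168.1.50" >> /etc/syslog.conf
/etc/init.d/sysklogd restart    # or restart rsyslog/syslog-ng, depending on the distro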

 There are some issues with central logging in a network environment, not the least of which is the fact that Windows Event Logs are created in a proprietary format that is not directly compatible with the syslog standard. This is not a problem if you have a pure Windows network, don't want to log anything from your firewalls, switches, routers, etc., and are using a central log server that understands the Event Log format. I prefer to not only log my servers, but also to capture all that juicy information from my other network gear. Fortunately, there is a solution in the form of products that can convert Event Log entries to syslog format. The best one by far (in my experience) is the Snare Agent for Windows. This product is free (everybody likes “free”), does a great job of formatting the information into the syslog standard, and has a very powerful web-based interface to configure things to your heart's content. The best feature of the Snare Agent is the fact that during the installation process it will ask you if it should enable some basic auditing and logging. Nice!

Get Snare Agent for Windows here

 In summary, enable some basic logging on all of the important devices on your network, implement a central log server, and start to utilize the benefits that logging can provide for you. Some day you'll thank me.

Copyright Todd Hughes 2008

Yahoo releases the Zimbra Desktop

    A while back I wrote an article about an open source alternative to Exchange called Zimbra. Zimbra was acquired by Yahoo recently, and it appears that Yahoo is continuing to develop this product. Yahoo released the Yahoo Zimbra Desktop today as a free download. (Be aware, it's rather large.)

    This email/productivity software works in offline mode and also has options available for word processing, spreadsheets, task management, and more. The desktop runs as a stand-alone application and uses Java to store data locally. Google has been working on something similar, built on the open source “Gears” project, but has yet to release anything.

    The Zimbra Desktop does not run in a browser; it's more like Outlook. A nice feature is that it functions so much like Outlook that even the keyboard shortcuts you're used to using with Outlook will work within the Zimbra application. A few “gotchas” remain, though, most notably that it doesn't sync your contacts/calendar (you have to do a manual export/import) and the IMAP folder sync seems to be a bit funky.

    All in all, it's not a bad first attempt. Take a look at it if you're contemplating an alternative to Outlook. Just keep your fingers crossed that a Microsoft-Yahoo buyout doesn't happen; the Zimbra suite is a major competitor to Exchange, and we know what would happen to Zimbra if a deal goes through.

 

NY State AG battles Dell

NY State has charged that the computer retail giant has engaged in “false and deceptive advertising” in regards to free financing and next-day/on-site service. State Attorney General Andrew Cuomo has set up a web site where those who feel they were misled by Dell can submit their complaints. These complaints will be used to gauge how much restitution Dell and Dell Financial Services owe these customers. New York residents can use the link below to contact Mr. Cuomo’s office and file an official complaint:

http://www.oag.state.ny.us/dell_comp/index.html

 


Squishing Bugs

    It's Saturday morning and I just finished doing my weekly chores around the house. While I wait to swap the wash, I'm updating SSH on my Debian servers. It seems the random number generator used when generating keys was broken, so the RSA keys it produced may not be so “unique”. The fix was very simple: one command in a terminal and the application is updated, new keys are generated, and the SSH daemon is restarted. For those not comfortable in the shell, most GUI package managers either automatically updated the system or notified the user that an update was available. A simple click and you're safe once again.
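    For the record, the whole fix on a Debian box amounts to something like this (a sketch; the exact package set is in Debian's advisory):

apt-get update
apt-get install openssh-server openssh-client    # fixed packages; host keys get regenerated
ssh-vulnkey -a    # scan the system for any remaining known-weak keys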

    Why do I feel compelled to share this with you? Well, this recent vulnerability with SSH reinforces the fact that even those of us who choose not to use Windows must remain vigilant and keep our systems updated. More importantly, it points out a major difference between FOSS (free and open source software) and proprietary software: the way vulnerabilities are handled.

    Open source software is frequently criticized by its detractors as insecure and dangerous due to the very fact that the source code is freely available. The argument is that “since the source code is available, it's very easy for the bad guys to find the flaws”. The counter-argument from the FOSS folks is that “since the source code is available, it's very easy for the community to review it and find flaws”. Basically, the code is reviewed by a large number of people on a regular basis, which should result in an inherently safer product. Case in point: the vulnerability with the random number generator affecting SSH was discovered and announced on May 13. Within hours, the open source community had resolved the issue and had an updated version of the software available.

    Let's look at the other side now. Proprietary software does not make the source code available. It is up to the manufacturer of the software to review it for vulnerabilities and patch as necessary. We are left at the mercy of said manufacturers and must assume that the software has been tested and is safe to use. What happens when a user finds a security issue? While there is much debate on how this should be properly handled, the standard procedure is for the discoverer of the bug to notify the manufacturer. At this point the manufacturer will determine if there really is an issue, what (if anything) they are going to do about it, whether they should go public with the disclosure, etc. There is always the possibility that the manufacturer will never let the public know about the vulnerability, never issue a patch, and simply hope no one else discovers the flaw. A more common scenario is that the manufacturer releases an announcement of the flaw in conjunction with a patch, months (sometimes years) after they were initially notified, or simply waits until the release of the next version to fix the flaw. This puts users of the software at risk for the whole period between the initial notification by the discovering party and the release of a patch.

    So, which do you prefer, FOSS or proprietary? I'll stick with the stuff that has nothing to hide, thanks.

Brain Dump 20080501


I just finished reading an article in a trade magazine. It seems the market for ULCPCs (ultra-low-cost PCs) is on the increase. Basically stated, folks are finding a use for a cheap ($200-$300) PC that does everything the average user needs it to do: email, web browsing, word processing, playing music, watching video, etc. Granted, these are usually purchased as second, third, or even fourth machines, but the point is that the trend of ever-increasing power and speed seems to be slowing in favor of an adequate machine that can do what it's asked to do without all the bells and whistles. (My theory is that with the release of Vista and its accompanying hardware requirements, most folks are finally saying “enough is enough”, but I digress...)

    What I found interesting was the title of the article: “Microsoft targets low-cost PC market”. It seems Microsoft kind of missed the boat on this rapidly growing market while they were busy putting the final polish on Vista.  Now that they have seen many of the big names (HP, Dell) going with Linux on these devices to reduce cost and development time, Microsoft has to play catch up. This can be evidenced by the recently announced extension of XP Home availability for OEM installation on ULCPC devices until 2010. Vista is way too bloated and resource hungry to run on this new generation of devices.

    Microsoft's COO stated recently that they are “still evaluating which type of hardware will eventually be the most popular in this market segment”. Think about that; this is the first time in recent memory that the operating system manufacturer from Redmond is not in a position to dictate the hardware requirements to the consumer. They actually have to either pare down what they already have or develop something new that is lean enough to work on these devices.

    Maybe the “powers that be” at Microsoft will take all this to heart. Maybe they will take the lessons they are learning from the Vista backlash and start with a clean slate. Maybe they will realize that the average consumer does not want to purchase a new high-end PC every 3 years just so they can run the latest version of Windows. Maybe they will finally understand that enterprises don't want to deal with the application compatibility issues that arise with every new version of the OS. Maybe the average consumer will become comfortable with their cheap little ULCPCs and realize they don't need an expensive operating system just to read their email and shop on Amazon.

     I know what you're thinking: “Todd is a Microsoft basher”. (It's true, but it's my blog so I can write anything I want!) Seriously though, I really do not wish Microsoft any ill will. Without Windows and the other products from Redmond, I would not have a job. Even more frightening is a scenario where Windows falls out of favor and Linux becomes the 800 pound gorilla that everybody loves to hate. Yikes!

When Does Open Source Make Sense?

    Two weeks ago, my quiet Sunday afternoon was interrupted by a knock on the back door; my neighbor from across the street needed some computer help. While I normally shy away from helping friends and family with computer issues due to the “ownership of all future problems” factor, I decided to take a quick look. (I should note that my neighbor owns a heating and cooling company, so he's a good guy to have owing me a favor.)

    It seems that he had purchased a brand new PC a week ago and decided that he did not want Windows Vista anymore, so he had his cousin (who is a technician at a local computer repair shop) load a pirated copy of XP Pro. The problem they were having was that the PC would not boot to the XP CD. After spending several hours working on this, his cousin gave up. My neighbor's wife suggested he ask me for some help.

    A quick “CD E:” followed by “DIR” revealed a blank CD. Duh! (I should have gotten the name of the PC shop where his cousin works.) “So, what do we do now?” he asks. Well, I am certainly not going to provide a copy of XP for him to use with his pirated key, and while he does have a valid XP Home upgrade key, I am not going to load his pirated copy of 98 just so he can upgrade. This left one choice: Linux.

    We booted up to a live Mepis CD and verified that his digital cameras, printer, and all peripherals worked properly. Within 10 minutes of clicking the “install to hard drive” icon on the Mepis desktop, we were booted into the freshly installed Linux OS. Both my neighbor and his “computer tech” cousin were impressed by the ease of installation, amount of available software, and how easy it was to figure out how to move around in the OS. Two weeks have gone by now and the only time my neighbor has contacted me regarding the computer was to drop off a case of beer in appreciation and tell me how much he likes his new operating system.
    
    Here's another scenario: a small business finally outgrows the residential-grade router it was using as a firewall and is looking for an upgrade to something more robust. The company receives several quotes for Sonicwall, Pix, Fortigate, etc., but all of these quotes exceed the available budget. A good time for open source? You betcha! An IPCop firewall would provide excellent perimeter protection, a VPN solution, and web content filtering, all in one box. “How can I sell IPCop, it's open source?” you may ask. Well, the answer to that particular question is that you don't sell IPCop, you sell a firewall and support for that firewall.

   The above situations are what I consider to be good examples of the “right” time for open source. My neighbor needed a new PC, purchased one with Vista preloaded, and decided he did not care for the new operating system at all. In addition, there were compatibility issues with his printer and the older of his two digital cameras. Linux offered him a “new” operating system without the learning curve he was experiencing with Vista, worked with all of his peripherals, and was free. The small office needed to upgrade their firewall without spending a large amount of money on hardware and licensing. IPCop provides a very nice solution with an easy-to-use web interface that the “IT person” (read: office manager) can use effectively.

    There are other times when a switch to open source makes sense: a client that needs a mail server but can't justify the expense of Exchange can use Zimbra. Ebox is a great replacement for SBS. Nagios is a nice option to WhatsUp Gold or HP OpenView. Need an enterprise-class router but can't justify Cisco gear? Take a look at Vyatta.

    Do you have users that need nothing more than email, a word processor, and a web browser? Switch them to Linux. Thunderbird, OpenOffice, and Firefox look and work the same on Linux as they do in Windows. As an added bonus, that user's PC is not going to be nearly as susceptible to the viruses, spyware, and other malware that will eventually affect performance and can even become a risk to their personal information.
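    On an Ubuntu box, for instance, setting up that trio is a one-liner (package names vary by distro; Debian proper rebrands Firefox and Thunderbird as Iceweasel and Icedove):

# Package names as on Ubuntu
apt-get install firefox thunderbird openoffice.org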

    The list of open source alternatives to commercial products is growing every day. A majority of these alternative applications do not require any special knowledge of Linux at all; they are easy to install, utilize web-based administration, and have excellent support through the community. Take the time to familiarize yourself with some of the open source applications that are available. Download a few and play around with them. Install Linux on a spare computer and get familiar with it. Thinking “outside of the box” and having something to offer your customers that all the “other guys” don't is what will set you apart from the competition.

    As always, if you have any questions, comments, problems, or want to list me as the beneficiary of your life insurance policy, please feel free to contact me at thughes@fwpm.com.

Copyright 2008 Todd Hughes.

Grass Roots Effort to Save Zimbra

  April 7, 2008 —  A major open source player in the email/groupware category may fall victim to Microsoft’s latest acquisition plans. The Zimbra Collaboration Suite, an “Exchange-like” email and collaboration server which is available in both a commercial version and a community (free) edition, may just disappear if Yahoo is purchased by Microsoft. Yahoo bought Zimbra towards the end of 2007, and in doing so became one of the largest competitors to Microsoft’s Exchange (especially in Europe). With the possibility of Microsoft now acquiring Yahoo, the future of this excellent product is uncertain.

    A grass roots movement has begun to at least save the community (free) edition of Zimbra while also raising questions involving anti-trust, etc. If you are interested in saving Zimbra, voting “yes” for open source, or just learning more, please go to FreeZimbraNow.org.


Selling Open Source

There are many open source alternatives to the proprietary/licensed versions of common commercial software applications. For those of us who serve the small business customer, these alternatives can provide a means to increase our income and drive sales by providing solutions for our customers that they might not otherwise be able to afford.    
   
     A good example of this is a small company with about 15 users that wants the benefits of a groupware server (mail, shared calendaring and documents, etc.) but cannot realistically afford Microsoft's SBS or Exchange. Enter Zimbra, an open source collaboration suite. Zimbra offers everything that Exchange does in an easy-to-install and manage package: mail server (POP & IMAP), shared calendaring, shared documents, and a host of other goodies. Users access their mail and other features via a web browser (a la OWA), making remote access available to the users even when they are at home or on the road.

    Take a look at the screenshot below; does it look familiar? No, that's not OWA, it's Zimbra! Setup and administration are done via the web interface, but there are also several command line tools available to perform various functions such as importing mail from an existing mail server, batch creation of user accounts, etc. User authentication can be done locally, or Zimbra can be tied into an existing Active Directory environment for authentication. As a service provider, you can enable inbound port 22 and port 7071 traffic through the customer's firewall (from your IP address at the shop only!) and have secure remote access to the configuration files and the administrative interface to help your customers with any problems they may have.

 

[Screenshot: the Zimbra web client]

Zimbra is installed on top of a basic Linux operating system. I built mine on top of Ubuntu Server. Download the ISO image for Ubuntu Server 6.06.1 LTS and install the operating system. You can accept the defaults during the install with the exception of the IP address; if it grabs a DHCP address just use the “back” button and manually assign the proper information. Next, install the Zimbra suite. A good how-to can be found here.
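    The Zimbra install itself boils down to unpacking the tarball and running the bundled installer. Roughly (the file name is an example; grab the current Community Edition release):

tar xzf zcs-5.0.x_GA.tgz
cd zcs-*
./install.sh    # walks you through package selection and initial configuration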

    Setup of Zimbra (including a quick start guide) can be found in the documentation at the Zimbra web site.
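    As a taste of those command line tools, batch account creation can be scripted around Zimbra's zmprov utility. A rough sketch, assuming a simple “name password” text file and a placeholder domain:

#!/bin/bash
# Run as the zimbra user; users.txt holds one "name password" pair per line
while read name password; do
    zmprov createAccount "${name}@example.com" "$password"
done < users.txt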

    So, procure a decent server (nothing fancy: P4 2+ GHz, 512 MB RAM, 150-200 GB HD), install Ubuntu/Zimbra, and offer it to your clients as an “open source Exchange” server. Mark up the hardware to include a nice profit and cover your time building the server, charge them a few bucks for installation, offer support at maybe $500.00/yr, and start making some money on open source!

    Keep in mind that Zimbra is just the tip of the iceberg. There's Ebox (an SBS-like clone), Snort/BASE (IDS), IPCop (a Sonicwall-like firewall), MySQL/phpMyAdmin (database and frontend), Ntop (network and bandwidth utilization), Apache (web server), Nagios (network monitoring and alerting), PacketFence (NAC), Zenoss (an HP OpenView/WhatsUp Gold-like clone), Squid/SquidGuard (a Websense/SurfControl-like clone), many different versions of the Linux desktop, etc., etc. Start getting familiar with Linux and open source and you will find that you have something to offer your customers that the “other guys” don't.

    As always, if you have any questions, comments, problems, or want to name your first born after me, please feel free to contact me at thughes@fwpm.com.

Copyright 2007 Todd Hughes

 

Backups in a Linux Environment

Backups are an important task within any environment. While there are multiple options for performing backups on a Windows network, you may not be aware of the options available for a Linux environment. Given that I work strictly in the Linux world, both at home and at the office, I can offer some suggestions as to what options are available. We’ll stick with open source (free) solutions here, although there are many commercial products available.

             Bacula (http://www.bacula.org/) is the big player in the enterprise-level network backup game. Setup and configuration of Bacula is a job for intermediate/advanced Linux users, although once it is installed and configured there is a very easy-to-use web GUI for administration.

             Another good product is Mondo Rescue (http://www.mondorescue.org/). While this application can be used for backup, it is designed more as a disaster recovery tool. Mondo Rescue creates bootable CDs/DVDs that contain snapshots of a complete system (a la Ghost or True Image) which can be used to restore a system from bare metal. It’s administered from a simple text-based interface, and packages are available for most Linux distributions. Installation and use are simple enough for the average Linux user.

             Next up is Amanda (http://www.amanda.org/). The source code is available at the Amanda web site, but a quick Google search will net you a package in the proper format for your particular flavor of Linux. This is strictly a command-line tool, so it’s best left to those power users who are comfortable working without a GUI. It is a very powerful and flexible backup application.

             Of course, if you are not looking for a network-type solution and just want something to back up your own machine, almost all Linux distributions come prepackaged with some type of easy-to-use GUI-based software. The KDat application included with the KDE desktop is one example. Several folks have also written their own and made them freely available to us all; see http://simplelinuxbkup.sourceforge.net/ for one example.

             While I am by no means a code monkey, I have learned enough basic shell scripting to write my own backup software. It works equally well for stand-alone machines or as a network solution. You can find it here (you’ll have to be a registered member):

http://www.theforcefield.net/joomla/index.php?option=com_docman&task=cat_view&gid=911&Itemid=53  


             My backup software consists of a simple script that runs as a scheduled job and uses ‘tar’ to create compressed backup files. The default is to back up the /home directory daily and perform a full backup weekly. A copy of the backup is stored on the local machine (so that you may manually burn it to CD/DVD, archive it to a tape drive, copy it to a USB stick, FTP it somewhere, etc.). The software can optionally use ‘rsync’ to automatically push a copy of the backup to a network storage device. I use an old server with a multiple-SCSI-disk array, but you could use an external enclosure with a large hard drive attached to another Linux machine, or even push the backup across the WAN to an off-site storage device running Linux. Another option would be to automatically FTP the tar files to any local or remote FTP server (although you will have to learn a bit of shell scripting and write this code yourself). *NOTE: If you choose the preceding FTP option, PLEASE consider security and use SFTP or tunnel through a VPN!

             Full restores can be done simply by partitioning and formatting a new hard drive and then extracting the compressed tar file onto the disk. If you delete something important from your home directory, you can simply extract the /home tar file stored on the local machine back into your existing /home directory. Basic instructions are in the README, and the scripts are liberally commented.
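             To give you the flavor of it, here is a minimal sketch of the same approach (the paths and the rsync target are placeholders, not my actual script; the real thing at the link above does more):

#!/bin/bash
# Minimal sketch of a tar-based backup; /backup and the rsync target are placeholders
DATE=$(date +%Y%m%d)
DEST=/backup

# Daily: archive /home
tar czf "$DEST/home-$DATE.tar.gz" /home

# Weekly (Sunday): full backup, skipping pseudo-filesystems and the backup dir itself
if [ "$(date +%u)" -eq 7 ]; then
    tar czf "$DEST/full-$DATE.tar.gz" --exclude=/proc --exclude=/sys \
        --exclude=/tmp --exclude="$DEST" /
fi

# Optional: push copies to a network storage box over SSH
rsync -av -e ssh "$DEST/" backupuser@192.168.1.50:/backups/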

             As always, if you have any questions, comments, problems, or want to erect a statue in my likeness, please feel free to contact me at thughes@fwpm.com.

 

Copyright 2008 Todd Hughes