Rethinking Home Servers

Since my first home-built server (a Pentium 75 MHz behemoth) I’d used Red Hat based distributions on my home servers.  This lasted until around 2002–2003, when I moved into a 4 bedroom house with 3 of my Air Force buddies, one of whom wanted to learn Linux.

I knew from experience in the mid-nineties that Slackware was probably the most Unix-like distribution out there…I felt quite at home there after learning the *nix ropes on Solaris 2.0.  So we set up Slackware 8.1 on a dual-processor tower server he was lucky enough to acquire, and it became our home firewall and all-around-great Linux box.  He took his beginning steps there and flourished, since our Air Force job already had us jumping around in a VAX/VMS mainframe.  We had many late night hacking sessions attempting to get things to work or compile there.  We also had a multi-gigabyte hard disk (unheard of at the time!) shared over Samba.

After I moved out, I continued to keep the Slackware box up to date.  I moved onward to Slackware 9.  Samba operated like a champ, and Slackware made a great router and DHCP server.  Then I discovered ClarkConnect and loved the web interface.  I could do things in half the time!  I could do them over the web from work without SSH tunneling!  All this appealed to me at the time.

I ran ClarkConnect from that point on, right up until it changed to ClearOS this past year.  Indeed, ClearOS is now my central server.

The only problem is that I’ve suffered the 2 most catastrophic losses of files in my Samba shares while running ClarkConnect/ClearOS…and I didn’t connect the dots between these separate incidents until just recently.

The first loss came when an entire Samba share was completely eradicated…13GB of music was just gone.  The second loss happened just the other day, when tons of scanned pictures just VANISHED into thin air.  Each time this happened, I was using ClarkConnect/ClearOS.  Each time, a few users were reporting instability in the forums of those distributions.  I am not sure how it could have happened, and the second time I was caught completely off guard…my backups were not yet configured since it was a new server.  The first time it happened…I didn’t yet know the value of a good backup routine.  So each time, no backups 🙁  Lesson learned the hard way, but learned nonetheless.

I recall running Slackware on my server and NEVER having the problems I have had with ClarkConnect/ClearOS.  This got me rethinking my home server design.  Servers should be the epitome of stability.  One should be able to migrate from one version of the operating system to the next with few hiccups.  Measured against those criteria, it is very apparent that I should be running a Slackware core on my main Samba server.

I will be making that transition in the next week or two, moving to a Slackware core based server.  I’m not sure what to use for backups across the network (I usually mirror the drive to an NTFS drive in my Windows based multimedia server) or for local backups to other hard drives.  If you have any suggestions, I’d really like to hear them.  I’d also like to know what readers use for a server.  Please vote for your favorite below, drop me a comment letting me know specifics, and thanks for your help!

[poll id="3"]


Author: devnet

devnet has been a project manager for a Fortune 500 company, a Unix and Linux administrator, a technical writer, a systems analyst, and a systems engineer during his 20+ years working with technology.

18 thoughts on “Rethinking Home Servers”

  1. Have you actually traced the file wipe to ClearOS/ClarkConnect, or is it just a suspicion?

    1. I’ve actually not been able to…in both cases I was using ClarkConnect/ClearOS, but the logs were empty as to what went wrong…in fact, there were no connections outside of the ordinary (a mounted drive on one central Windows machine). So unless Windows 7 just decided to delete a single directory…something happened on the Linux box.

      So, I don’t have concrete proof because there is none…but it’s one thing or the other and since it’s happened twice in the same fashion, I’m more likely to blame the common denominator in each case.

  2. Hi devnet!

    Inquiring minds want to know …

    How come you did not stick with Amahi?

    What can be done in Amahi to improve it? What would you do?

    cpg

    1. Just small things…I reported one bug on one problem (the Transmission web interface reset the download area after each update). The second problem was that I wasn’t able to unzip files on a Samba share from Windows. So if I downloaded a bunch of rar’d files and used 7-Zip in Windows (my multimedia machine runs Windows 7 64-bit) to try to extract things…it barfed on itself. This forced me to unzip them to the Windows machine’s desktop and then copy and paste the results to the share. So, I worked for weeks trying to get the permissions just right before I got frustrated and hopped back to my old flame ClarkConnect…which is now known as ClearOS. I set up a ‘flexshare’ there (a fancy name for a Samba share) and BOOM, things worked. However, they lack a comprehensive backup solution. So, it looks like I’ll probably be coming back to Amahi.

      Here’s what I’d need to know in order to come back: 1) If I install it on Fedora…will it continue to be released on Fedora? I don’t want to have to switch over to Ubuntu in the near future… 2) Can we work on the default permissions to kick the crap out of this problem? I have an smb.conf from a working system in my arsenal 🙂
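      For what it’s worth, the kind of share-level settings I’d expect to address it look something like this (the share name, path, group, and masks here are made-up examples, not my actual config):

      ```ini
      ; Hypothetical Samba share — names and paths are illustrative only.
      [storage]
         path = /srv/storage
         writable = yes
         ; force consistent ownership and permissions so files created
         ; from Windows (e.g. by 7-Zip extracting an archive) stay
         ; readable and writable for every user of the share
         force group = users
         create mask = 0664
         directory mask = 0775
      ```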

  3. I think the transmission issue is in feedback for someone to make sure the proposed fix works (I assume it’s bug 479). We can see if we can test it ourselves first and fix it.

    For the samba issue, … we have been adding more technical options to Samba (all after advanced settings are on). Definitely interested in fixing it, of course. If you could file a bug and perhaps include a small test case, that would be great. I use unrar locally on a Mac machine, but I can try on Windows with the client you mentioned.

    I am thinking perhaps the rar file had some symbolic link or something in it?

  4. Oh … Idea! Maybe we should have a setting for “ninja options” … beyond advanced, where you can put whatever you want in the samba conf. 🙂

  5. I’ve been running Slackware since the 8.1 timeframe, too. I got my first always-on Internet connection in fall 2002, and Slackware 8.1 became my home & Internet server. IIRC, though, 8.1 didn’t ship until 2002.
    I’m now running Slackware 13, and I have learned more about Linux in the last 8 years running Slackware than I would have running anything else. I’ve had to!

    \\Greg

  6. Hi there – my home setup is an Ubuntu 9.10 box running Samba and Netatalk (I only have Samba to accommodate the wife’s work laptop) and it works fine. I have my iTunes and iPhoto libraries backed up to it from my main Mac using a standard rsync script. From the server I back up ‘across the cloud’ (as the nouveaux trendies call the interweb…) to a virtual machine at my workplace, once again using rsync. The result is that I have two copies of my data – one local, and one that I can get to with a USB drive the very next day. It’s all set up and scheduled with cron, so I can just fit and forget. I have had a hard drive fail on my Mac, but it was simply a case of replacing it, reinstalling the OS and patches, and then rsyncing my data back. For a simple and reliable setup it’s hard to beat, in my opinion!
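    Roughly, each mirror step boils down to something like this (the paths and remote host below are made-up examples, not my real setup):

    ```shell
    #!/bin/sh
    # One-way mirror with rsync; --delete keeps the copy exact, so
    # anything removed from the source disappears from the mirror too.
    mirror_dir() {
        src="$1"
        dest="$2"
        # trailing slash on src: copy its contents into dest
        rsync -a --delete "$src"/ "$dest"/
    }

    # Example calls — local copy first, then the offsite one over SSH,
    # e.g. scheduled nightly from cron:
    #   30 2 * * * /home/me/bin/backup.sh
    # mirror_dir "$HOME/Pictures/iPhoto Library" /srv/backup/iphoto
    # mirror_dir /srv/backup user@work-vm:/backups/home
    ```
    
    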

  7. Both of my home servers are running Ubuntu 9.10. I have a local file server, as well as a development web server and they both run great. No matter if you go for Debian alone or Ubuntu I don’t think you will be disappointed.

    The apt package manager is great, the repositories for Ubuntu are huge, and Debian has always been stable and reliable for me. Also the size of the Ubuntu community is worth mentioning, I have always found help when I needed it.

    1. here’s the thing though Daniel…

      Canonical isn’t making a lot of money selling servers and Ubuntu is known for desktops. Red Hat is making money hand over fist selling a server operating system (well, support for it anyway).

      If anything, I’d go with a Red Hat based distribution instead of an Ubuntu one…it makes more sense.

      1. You make a solid point, and I agree with you. For me though, my Ubuntu home servers have been great to me. If you’re willing to fork out the $80–$2500 it’s a good choice. I personally can’t afford that, so it doesn’t even cross my mind.

  8. I am going to have to agree with Daniel. RHEL has served me very well in a corporate environment. However, it is a little pricey to have as a home solution (especially when cost saving is at the top of most of our lists these days).

    I would put Ubuntu LTS on a unit (the server edition if you are using the unit as a server) and give it a whirl. My current core server is an Ubuntu 10.04 LTS … it upgraded flawlessly recently and runs a software RAID array. I have never had an issue with it.

    Install Webmin (lock it down with ACLs and an iptables firewall) and you will have a great little point-and-click interface for when you either (a) can’t remember the syntax of the commands/configuration files or (b) want to take the quick and dirty route.

    1. I’ve found Amahi to be a GREAT drop in server for my needs…and I was able to use the advanced options to get samba tweaked how I like it.

      The price is right (free of course) and it runs on top of Fedora…so it’s not too bad 😀 The amahi web interface makes other home server solutions look dumb.

      I’m a happy camper right now.

  9. I’m all for Gentoo since it’s a rolling release. No headaches going from one release to another. Just do your updates regularly and you’re perfectly fine.

Comments are closed.

Creative Commons License
Except where otherwise noted, the content on this site is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.