LinuxWorld, Powered by Windows?

Did anyone else catch this? According to The Register, the LinuxWorld Conference and Expo 2006 web page is powered by Windows Server 2003?! That’s a bit odd…so I checked things out myself over at Netcraft. Yep, they sure are. Very odd. You’d think that someone who runs a Linux website would make sure that the host they went with and the designers they hired to do the website were Linux shops instead of Windows ones. In fact, if it were me, I’d make damn sure I did it that way.

I looked at their site report from Netcraft and saw that they switched just within the last month. As I’ve spoken about in the past, some of these larger Linux websites/news agencies have really gone downhill. I used to think LinuxWorld was a really great magazine/website. Then they go and pull something like this. Oddly enough, their website has been suffering as of late: according to Alexa, they’ve been on a steady decline since 2004. In fact, my lowly blog here has been garnering more traffic than their site, according to Alexa. You do the math…if they can’t beat my silly little blog in traffic, they’re going out fast.

Don’t worry though, I at least have enough sense to always power this site with Open Source and on the Linux platform…even though my primary job is with Microsoft Windows 2000 and 2003 servers. I may be good at Windows AND Linux, but I’m no sellout. I bet LinuxWorld wishes they could say the same. If you have a spot at that expo, I’d cancel my tickets and reservations. Make sure you check out the heavy hitters that are there too and express your opinions to them on this subject.

Bringing Linux to Work – Portal Part 2

Beginning this month, I’ll be attempting to infuse my place of work with Linux. I am a new Applications Analyst and resident AIX/Linux expert for a government agency that lives and breathes Microsoft. I feel that Open Source software, mainly Linux…can be a great addition to this agency. I’ll be documenting my attempts here as I go along. If you have tips, tricks, solutions, advice or supportive comments…please respond in kind.


Well, Ubuntu had some troubles but CentOS did a fine job for me. The problem was in the compilation of the mod_ntlm module for Apache. Ubuntu couldn’t get it right. Changing the makefile a bit (Thanks Billy!) did allow me to build the mod_ntlm.so file (finally), but I couldn’t get things to work for Apache 2. I reverted back to Apache 1.3 on the Ubuntu box but ran into the same odd authentication issues that I hit on the CentOS box. By contrast, CentOS had no problems compiling the mod_ntlm module for Apache 1.3 OR Apache 2.x, which is much better than Ubuntu managed.

Of course, the real problem wasn’t getting the various software installed, the problem was doing it in the correct order. My advice to anyone who wants to use mod_ntlm with Apache to pass parameters to a Zope server for Plone: install Zope and Plone first and get a working site up and running on port 80 (an intranet site, that is), THEN install Apache and work on mod_ntlm. I had trouble figuring this out, as most of the instructions I found assumed Apache was working before the Zope server came into play. Another thing you could do is turn Apache off during your Zope/Plone configuration.
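For anyone following along, the Apache-in-front-of-Zope piece ends up looking something like the sketch below. This is not my exact config; the hostname, the port 8080, and the instance name "plone" are all assumptions you’d swap for your own setup. It uses Zope’s VirtualHostMonster so that Plone generates links for port 80 instead of its own port:

```apache
# Sketch: Apache virtual host proxying through to a Zope/Plone
# instance listening on localhost:8080. Requires mod_rewrite and
# mod_proxy. Hostname, port, and "plone" instance name are placeholders.
<VirtualHost *:80>
    ServerName intranet.example.local

    RewriteEngine On
    # Hand every request to Zope's VirtualHostMonster so Plone
    # writes its links against port 80, not 8080.
    RewriteRule ^/(.*) http://localhost:8080/VirtualHostBase/http/%{SERVER_NAME}:80/plone/VirtualHostRoot/$1 [P,L]
</VirtualHost>
```

With something like this in place, the mod_ntlm directives can then sit in front of the rewrite and gate the whole site.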

Something else that is odd: by default, when you install Zope in CentOS, it isn’t started. You can add it to the automatic startup using chkconfig in CentOS, but finding out where the RPM installs Zope is another story. Not being familiar with Zope hindered my progress initially. After some fumbling I was able to get things working.
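In case it saves someone else the fumbling, the CentOS 4 side of it boils down to a handful of commands. This assumes the RPM installed an init script named "zope", which is worth verifying first:

```shell
rpm -ql zope | less      # see where the RPM actually put Zope
service zope start       # start it now
chkconfig zope on        # have it start automatically at boot
chkconfig --list zope    # confirm which runlevels it's enabled for
```

The rpm -ql trick is the quickest way I know to answer the "where did the package put everything" question.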

Overall, on both the Ubuntu and CentOS installs, I was able to get things in working order but could not get Apache to use mod_ntlm correctly. Normally, if mod_ntlm is set up correctly and all directives are listed correctly (I was using .htaccess to house the NTLM directives), you’ll get a 404 Not Found page when accessing the document root. Instead, I received a 401 Unauthorized. That meant that, as far as Apache was concerned, I was not validating against my Active Directory source.
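For reference, the .htaccess I was testing with looked roughly like this. The domain and domain-controller names below are placeholders, not our real Active Directory values:

```apache
# Sketch of the mod_ntlm .htaccess directives I was using.
# EXAMPLEDOM, dc1, and dc2 are placeholders for the real
# AD domain and domain controllers.
AuthType NTLM
NTLMAuth on
NTLMAuthoritative on
NTLMDomain EXAMPLEDOM
NTLMServer dc1
NTLMBackup dc2
require valid-user
```

If the module is loaded and the directives parse, the browser and the domain controller do the NTLM handshake without the user ever seeing a login box…in theory, anyway.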

Continue reading “Bringing Linux to Work – Portal Part 2”

Bringing Linux to Work – Portal Part 1

Beginning this month, I’ll be attempting to infuse my place of work with Linux. I am a new Applications Analyst and resident AIX/Linux expert for a government agency that lives and breathes Microsoft. I feel that Open Source software, mainly Linux…can be a great addition to this agency. I’ll be documenting my attempts here as I go along. If you have tips, tricks, solutions, advice or supportive comments…please respond in kind.


You’ve Got to Start Somewhere…

Recently, I’ve been investigating portal applications (CMS portals) for an intranet server at work. The portal will act as a document repository and project status report tool. It needs to plug into the framework we currently have in place…which is a Windows 2000 Active Directory environment. Instead of powering this with IIS, or Apache on Windows XP, I’ve elected to go with Linux and Apache. However, I didn’t really investigate enough up front to figure out whether this would even be possible. Problems were rampant and still are. Allow me to explain.

I’ve been given the requirement that any intranet page must be single sign-on, meaning that when users visit the page, they don’t have to log in…they’re simply there and logged in already. This can be done using the Apache NTLM module (mod_ntlm). I can also pass this parameter using Tomcat and JOSS with PHP. However, the NTLM module won’t compile on Ubuntu or SuSE and hence won’t install. So that took away my top two choices for Linux distros (not to mention it cost me 2 wasted days). JOSS requires that I write and plug in my own PHP script, which is something I don’t want to do currently. So I’m back at square one. I’ve changed direction and am installing CentOS 4 currently…we’ll see where that takes me. I’ve had more luck with CentOS as a server (my server at home runs CentOS at its core and currently has around 120 days of uptime).
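For the record, the build I was attempting on each distro went roughly like this (the tarball name is a placeholder, and apxs comes with the Apache development packages, so those need to be installed first):

```shell
tar xzf mod_ntlm.tar.gz      # unpack the mod_ntlm source (name is a placeholder)
cd mod_ntlm
apxs -i -a -c mod_ntlm.c     # compile the module, install it, add the LoadModule line
apachectl configtest         # sanity-check that the new LoadModule line parses
```

On Ubuntu and SuSE the apxs compile step is where it fell over; on CentOS it went through clean.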

Continue reading “Bringing Linux to Work – Portal Part 1”

Fan the Linux Flames

Anyone who knows me knows that I HATE inefficiency. If I find a new way of doing things that cuts down the resources I spend doing that thing, I pounce on it. So when I ran across a nifty little program that makes life managing my two Linux boxes easier, I pounced. The tool I’m speaking about is called “Fanterm” and it makes managing a small number of Linux boxes a snap. I had forgotten that I had installed this, and when I brought up my second Linux box (motherboard upgrade) I remembered reading about it on the web somewhere. A quick Google search refreshed my memory…although that article only talks about fanout. Fanterm is a really powerful tool for small-network system admins.

So what does it do? It’s pretty easy and straightforward. After you download and install the necessary files, open up an xterm and use the following syntax to issue your command:

fansetup onemachine anothermachine user@yetathirdmachine

The command above opens up 3 xterm windows in addition to the local one you opened. Now you type your command in the original and watch as the command is mirrored in the other xterm windows. Making quick changes to smb.conf files works like a top. If you want to know the uptime of all your systems, you’re set. This makes managing a small number of Linux boxes a snap…apt-get update; apt-get upgrade anyone? The thing I like most about it is that I get to SEE what happens on each computer…that way, if something goes haywire, I’m not executing a command on a file that doesn’t exist on the remote Linux box.
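To give a flavor of a session, it goes something like this on my two boxes (the machine names below are made up; once the mirrored xterms are open, everything typed in the original window is sent to all of them):

```shell
fansetup desktopbox serverbox    # opens a mirrored xterm per machine

# ...then, typed once in the original window and echoed to both boxes:
sudo apt-get update && sudo apt-get upgrade
uptime
```

Each mirrored window shows its own box’s output, so you can eyeball the results side by side instead of trusting that the command did the same thing everywhere.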

Make sure you give this tool a go, it makes life much easier in small networks. Hope it comes to be as useful to you as it is to me.

ITWire in Australia on the Desktop

The point of all this is that from the standpoint of a new Linux user, having a snazzy looking interface is all well and good but it means nothing if users have to revert to the command line to perform what should be simple tasks. Installing new downloaded software is one of the most common tasks performed by desktop users at home and in small offices. Until the Linux suppliers can make this task trivial, they will continue to miss out on a whole world of users beyond the command line geeks.

NOTE: I normally don’t re-publish news like many of the “blogs” you see out there but in this case the article was pretty good and hits home with a theme I’ve been stating a bit lately.

The article above was taken from ITWire…IT News in Australia.

This article was a good read and I believe it to be true. Until Linux can come up with ways to make the user oblivious to what is going on underneath the GUI, it won’t make inroads to the desktop.

UPDATE: 3/2007

Penguin Pete, the not-so-famous blogger over at penguinpetes blog, flagged this post as the main reason that he no longer posts links to my blog. Interesting, in that if anyone were to read this post out of context, they might not know what I was driving at. The main intention of the post is to show that new users need to feel comfortable in their OS before they drop down and get dirty with the shell. That’s a fact, Jack. Nothing is going to sway that…I’ve had many users I’ve switched over DESPISE dropping to the shell and cite that as the main reason they go back to Windows. That is what I was agreeing with in this instance…that new Linux users need to be semi-oblivious to what is going on underneath and not have to worry about it in their beginnings…not that we should ‘dumb down’ Linux or remove the functionality underneath it.

The PII 350 MHz Computer Dies?!

I always hate to send hardware off to that big chipyard in the sky. However, the PII 350 MHz PC decided to give up on me. Perhaps that is why I was getting so many errors while attempting to install various distros of Linux (including those optimized for old PCs). So, for those of you who were following along with my little journey, the PII is no more…too many errors began to pop up, even in steady Slackware. I made a judgment call and retired the motherboard.

In its place, I forked out 23 bucks for a PC Chips Socket A motherboard. I then slapped in a spare XP 2600 and I have the newest flavor of SimplyMEPIS and PCLinuxOS installed. It’s running like a champ and is turning out to be the best 23 bucks I’ve spent in some time. For those that want a steady board for Linux, check Newegg here.

Alas, the PII was a good board. I knew it well. So glad I didn’t have to put it down and that I could gracefully retire it on a good note. Now the slowest PC I have is the CentOS 4 gateway/firewall with a Celeron 900 (Emachines w/ a refurb Gateway mATX mobo). Works great. Sorry I couldn’t finish out all those other distros.

In the meantime, I’ve made it my mission to document some really simple things in KDE and GNOME (how-tos) for stuff you’d normally do in Windows. I’m attempting to track down the easiest way to set up an anonymous share using KDE and Samba (with no smb.conf or smbpasswd or smbuser alteration…no shell). Thus far this has proved quite challenging. Getting Samba to play nice with no passwords and full write access for users on a share is murder. If anyone has tips or links to a great how-to, I’m all ears. Thanks for reading.
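For contrast, a hand-edited smb.conf version of what I’m after would look roughly like the sketch below (the share name and path are placeholders, and `security = share` is the old Samba 3-era option). The whole point of the exercise is getting KDE to produce the same result without ever touching this file:

```ini
# Sketch: a guest-writable anonymous Samba share, done the
# shell way. Share name and path are placeholders.
[global]
    security = share          ; pre-Samba 4 style share-level security
    map to guest = Bad User   ; unknown users become the guest account
    guest account = nobody

[public]
    path = /srv/public
    guest ok = yes            ; no password required
    writable = yes            ; full write access for everyone
    create mask = 0666
    directory mask = 0777
```

If a KDE dialog can generate the equivalent of those dozen lines without the user knowing smb.conf exists, that’s the how-to I want to write.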

Creative Commons License
Except where otherwise noted, the content on this site is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.