Thursday, December 24, 2009

eMachines eM250 with Ubuntu 9.10 Netbook Remix

My 10-year-old son really wants a laptop for Christmas. He wants the portability, and he wants to access his Mac Mini desktop at my place (over VNC) when he is with his mom. The idea of him using Skype over wireless Internet appealed to me much more than getting him a cell phone. After much deliberation, I decided to get him an entry-level notebook and install Ubuntu for him.

I had made my mind up to get an HP Mini 110-116NR netbook. I had just installed Ubuntu on a Samsung N130, and although it worked fine, there were a few things I did not like about it. My biggest problem was that the screen does not tilt back far enough, so you end up typing with the unit not flat, but tilted. Far from ideal. Also, the trackpad and keyboard just felt cheap and flimsy.

So yesterday I drove over to my local Walmart Supercenter in Ottumwa, Iowa. They were out of stock, but said the nearest store that had it (five units, in fact) was about 45 minutes up the road in Oskaloosa. At the Oskaloosa Walmart there were no HP notebooks to be found, although they had a few Acers at $298. Hmmm.

After fossicking around a bit, I found several eMachines eM250s, which they were selling at $228. I had never heard of that model. I checked all the specs and liked what I saw, all except that there was no listing for 802.11n wireless (it does have the slower 802.11b/g). The other specs included:

Intel Atom CPU N270 1.60GHz
InsydeH20 BIOS
250GB HD
3-cell Li-ion battery
10.1 " LED LCD
3 x USB

Nice! I pulled out $228, but their cash register rang it up as "Not For Sale". I was told I could buy something else. After making a bit of a ruckus, they checked it out - I was a little scared it might have been under recall or something. But no, some zealous Walmart associate had prematurely put the units out in the store, when in fact the retailer was under strict instructions not to sell them until the day AFTER Christmas. They had my money, so I got a receipt . . . AND the eM250.

Got it home, turned it on.

Blew away all vestiges of Microsoft, erasing Windows 7, formatting the 250 GB hard drive and installing Ubuntu 9.10 Netbook Remix . . . for free of course. Please take note eMachines, Acer, Samsung, HP and other manufacturers . . . we want your hardware, but we DO NOT WANT Windows or anything else by Microsoft. Bloated, buggy, unstable, not secure ... and way, way too expensive. I wonder how much cheaper they could actually sell these things without having to fork out for the Windows license?

It was interesting that the netbook came with both Windows Vista and Windows 7 on it (or is Windows 7 just another alias for Vista?).

After installing Ubuntu, the built-in webcam and Ethernet worked "out-of-the-box". I also got my son a Logitech USB headset, which worked with the sound interface just fine.

Hope he likes it!

Friday, October 30, 2009

Vomiting Pumpkins

If you can find the pair of vomiting pumpkins .... then you can find our place!


Tuesday, July 21, 2009

How To Choose The Best Domain Name

I wanted to put up a new website about A/B testing (split testing) web pages, and it needed a domain. I decided to go after a functional domain name, one based on keyword analysis. The other approach would be to try to establish a customized, brandable domain name that had no lexical basis or keyword connection.

Procedure to Determine The Most Effective Domain Name

Here is the systematic and quantitative procedure I followed to identify and register my desired domain name:

1. Keyword Phrases. I brainstormed a quick list of keyword phrases, and then added to it with some keywords from similar and competing web sites.

a b response testing
A B test
a/b split testing
A/B Test
AB test
A-B test
baseline test
conversion test
conversion testing
conversion tests
copy test
effective website testing
effectiveness testing
how to test your site
instant site comparison
instant site tester
instant website testing
internet effectiveness testing
internet test marketing
marketing copy testing
marketing research
multivariate test
multivariate testing
response testing
split test
test your site
test your website
web site split test
web site testing
website a b testing
website effectiveness testing
website testing
web testing

2. Domain Availability. I then checked for domain name availability at my domain registrar of choice. After reading Syed Balkhi's "Which TLD is the best for SEO?" blog post, I decided to restrict domain registrations to the five most common TLDs (.com, .net, .org, .us and .info). Of course you could go crazy and use all 951 TLDs offered by some registrars. The other condition I imposed was that the domain had to be available in the same TLD both with and without the dashes. In this manner I compiled a list of available domains from the keyword phrases in step 1, e.g. the dashed and un-dashed .com variants of the first phrase on the list.

It was important to me to have the name available both with and without the dashes, as I was going to set up a 301 redirect from the name without the dashes to the name with the dashes (the same kind of redirect commonly used to point a bare domain to its www. version). That way, the search engines would be able to parse the keyword phrase of the dashed domain, while I could just tell people the un-dashed name verbally, and they would still quickly find the site.
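In Apache, that 301 redirect is a couple of lines of configuration. A minimal sketch, using the hypothetical pair example-phrase.com / examplephrase.com in place of whichever names you actually register:

```apache
# Un-dashed domain: permanently (301) redirect every request
# to the dashed domain that serves the real site.
<VirtualHost *:80>
    ServerName examplephrase.com
    Redirect permanent / http://example-phrase.com/
</VirtualHost>

# Dashed domain: the actual site.
<VirtualHost *:80>
    ServerName example-phrase.com
    DocumentRoot /var/www/example-phrase
</VirtualHost>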

I now had a spreadsheet listing the keyword phrases of step 1, collapsed to just those available as domain names both with and without the dashes.

3. Keyword Analysis. I then took the phrases that survived step 2 (it was pointless to include the phrases for which domains were not available), and I ran those phrases through the (free) Google AdWords Keyword Tool. For each phrase I recorded the Global Monthly Search Volume back on my spreadsheet.

The wonderful thing about the Keyword Tool is that it made its own contribution of new keywords to my list. I took those new keywords and ran them through step 2, which expanded my population of keyword phrases for which domains were available.

As well as Google's Keyword Tool, I went through a similar process using my accounts on both WordTracker and Trellian Keyword Discovery. My list of names was growing.

4. Sorting. I now had a list of all the domains in my target TLDs, available both with and without dashes, together with their Global Monthly Search Volume. I then simply sorted this list in descending order of Search Volume, and my new domain spoke so loudly to me that I immediately registered it in both its dashed and un-dashed forms. My confidence was buoyed knowing there were 74,000 global searches for this phrase every month. Now .. to build the site!
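The mechanical parts of steps 1 through 4 are easy to script. A minimal sketch in Python: the search volumes here are made-up placeholders rather than real keyword-tool data, and the availability check of step 2 is left out, since a real check would have to query a registrar.

```python
# Sketch of the procedure above: generate the dashed and un-dashed .com
# candidates for each keyword phrase, then rank phrases by monthly volume.

def candidates(phrase):
    """Return the (dashed, undashed) .com domain pair for a phrase."""
    words = phrase.lower().replace("/", " ").replace("-", " ").split()
    return ("-".join(words) + ".com", "".join(words) + ".com")

def rank(volumes):
    """Sort phrases by global monthly search volume, highest first."""
    ranked = sorted(volumes.items(), key=lambda kv: kv[1], reverse=True)
    return [(phrase, candidates(phrase), vol) for phrase, vol in ranked]

# Placeholder volumes; step 3 would pull these from a keyword tool.
volumes = {"a b split testing": 1900, "website testing": 74000, "split test": 12100}
for phrase, (dashed, undashed), vol in rank(volumes):
    print(vol, dashed, undashed)
```

The step-2 availability filter would slot in between `candidates` and `rank`, dropping any phrase whose dashed or un-dashed name is already taken.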

Steal This Idea (Please)!

You know, I believe that someone is going to make some money automating this procedure into a web application. Imagine it ... a combined keyword analysis site and domain registrar .... you go to a site and enter your initial keywords into a text area ... you click on the "Make it Awesome" button and it goes and finds additional keywords from the meta tags of websites matching your existing keywords, does the domain name availability searches and the final keyword analysis. Revealing to you the optimum domain name(s), the web application instantly monetizes by accepting payment for your domain registration(s).

Sites like WordTracker could actually add significant additional value. WordTracker incorporates Sumantra Roy's Keyword Effectiveness Index (KEI), that compares the number of times a keyword has appeared with the number of competing web pages to pinpoint exactly which keywords are most effective. Not only could the nascent web application quantitatively determine the optimum domain name, but it could weigh how many other sites are competing for those keywords.

Usually when someone gives away a good idea, no one does anything about it. People figure that if it's such a good idea, why doesn't he build it himself? The reasons are simple: I have challenges enough with my own web applications, about which I have considerably more knowledge, familiarity and ambition than something I barely understand. I know that Andy Mindel of WordTracker and David Warmuz of Trellian are able to mysteriously log all requests made at the Metacrawler/Dogpile metasearch engines, giving them a database from which to perform their magic. Anyone who knows how to do that could have a killer web app. I just want to use it. Perhaps they'll give me a free account.

Saturday, May 30, 2009

SEO: Google Page 1 in 10 days for Free

I thought I would share this interesting timeline of Google search engine dominance.  I wanted to place a brand new site on the first page of the search engine, which in this case meant getting in front of about 539,000 other results.  I find this to be pretty compelling proof that if one has the desired Google search phrase (in this case "postfix tutorial") as:
  1. Domain name in the .com TLD, registered both with and without dashes, with an Apache 301 redirect from the un-dashed name to the dashed name
  2. Title of Home Page ("Postfix Tutorial")
  3. H1 Heading Tag on Home Page ("Postfix Tutorial")
  4. Keyword as one of up to 10 meta tag keywords (first one is 'postfix tutorial')
... then if you follow basic SEO principles (see below) you can get to page 1 of Google pretty quickly (in this case 10 days, from May 19 to May 29).  Take a look at our timeline:


April 2, 09

Domain registered, placeholder site up, URL submitted to the search engines (Google, Yahoo, MSN)

May 19, 09

Site deployed with content; submitted to social bookmarking sites, including a Digg article

May 22, 09

1st listing on page 1 of one search engine, 4th listing on page 1 of another, and page 12 of Google

May 24, 09

Google page 16 (changed the H1 tag and title from the bare domain name to "Postfix Tutorial")

May 29, 09

Google Page 1 (10th position)

May 30, 09

Google Page 1 (9th position) - Yahoo Page 1 (2nd position) - MSN Page 1 (1st position)

The only basic SEO techniques used were a social bookmark, an article, some comments on a few leading online discussions, and some photos (all submitted to search engines).  That is with a website-grader score of only 35/100.  No paid AdWords or any other commercial work - although I have now placed an Amazon advertisement for my least-disliked Postfix book.
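The on-page half of the checklist above (the target phrase in the title, the H1 and the meta keywords) boils down to a few lines of HTML. A minimal sketch using the post's "postfix tutorial" phrase; the surrounding keyword list is illustrative:

```html
<!-- Target phrase in the title, the H1, and first in the meta keywords -->
<html>
<head>
  <title>Postfix Tutorial</title>
  <meta name="keywords" content="postfix tutorial, postfix, ubuntu mail server">
</head>
<body>
  <h1>Postfix Tutorial</h1>
  <!-- page content -->
</body>
</html>
```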

Tuesday, May 19, 2009

Having installed, administered and maintained email servers for more than 12 years now, I was not at all surprised to find how many online tutorials and even texts there are for setting up Postfix on the Ubuntu Linux distro. I was, however, quite alarmed at how incomplete they all seem to be.

Just to try to make sense of things for myself, I kept a checklist of the steps in setting up an Enterprise-ready open-source mail server. When I kept reading comments from other users online who were as confused as I was with the state of tutorial information, I decided to put my checklist on the web, and the Postfix tutorial site was born.

This tutorial covers all the steps required to install and configure the popular open-source Postfix mail server, with Courier for POP email, SASL for authentication and MySQL for configuration and administration. It will support large numbers of users from multiple domains. Included is an option to filter spam with SpamAssassin. The installation is based on the current long-term support version of Ubuntu (LTS version 8.04 Hardy Heron, supported until April 2013).

As well as listing all the steps to deploy Postfix and its trimmings, the site also offers commercial installation and support services, where Postfix will be remotely installed with everything included in the tutorial.
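The MySQL-driven configuration mentioned above is wired up in Postfix's main.cf by pointing the virtual-domain lookups at MySQL map files. A sketch only: the map file names are assumptions, and each referenced .cf file must define its own MySQL connection details and query.

```
# /etc/postfix/main.cf (fragment): virtual domains, mailboxes and
# aliases resolved from MySQL tables rather than flat files
virtual_mailbox_domains = mysql:/etc/postfix/mysql-virtual-domains.cf
virtual_mailbox_maps    = mysql:/etc/postfix/mysql-virtual-mailboxes.cf
virtual_alias_maps      = mysql:/etc/postfix/mysql-virtual-aliases.cf

# let Postfix authenticate senders through SASL
smtpd_sasl_auth_enable  = yes
```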

Friday, November 7, 2008

Tutorial

The Mac OS X operating system comes with several key components of an open-source server already installed: Mac OS X (10.4) shipped with the open-source Apache (1.x), MySQL (4.1.14) and PHP (4.x). They just need to be configured and activated.

The problem with enabling the built-in open-source environment on Mac OS X is that it is not up to date, and inevitably one updates a module (say, PHP 5 instead of the default PHP 4), or installs other custom open-source software, only to find on the next Mac OS X Security Update that the environment has been compromised - either downgraded, obliterated or otherwise fatally modified.

Fortunately there are solutions. The first of these to be offered was the Fink environment, which installs a complete DAMP (Darwin/Apache/MySQL/PHP, after LAMP for Linux) environment in its own root directory (/sw/). Apple promises to leave this directory alone through all system software updates, and Fink promises to keep that directory up to date. The drawback with Fink is that it has very few packages available, and the user soon needs either to create a package of their own or to install from source code - not a welcome idea when there are so many excellent DAMP packages out there.

The solution preferred and promoted by this website is MacPorts. Apple very wisely based Mac OS X on FreeBSD, which has its own Ports package-management system claiming some 17,576 packages available. MacPorts is the migration of FreeBSD Ports to Mac OS X. You will find (nearly) all the packages you need are available, and installation and maintenance of a comprehensive, up-to-date DAMP environment is now feasible. The site is a comprehensive tutorial that guides you through the installation of MacPorts, and its use to install and configure the open-source packages that change the Mac into an industrial-strength open-source server, with installation and configuration tutorials for each open-source component.  You can use just the applications you want or install the whole open-source plethora.  Also included is information on how to implement Porticus - a must-have tool for MacPorts users wanting to remain within the Mac OS X graphical user interface.

Saturday, September 6, 2008

PGP Encryption

In these days of communications that are just too easy for unintended people to intercept and divert, I feel it is essential to secure communications with encryption.  I am amazed daily how many people (including good friends that know better) inadvertently share with the public their private files and communications.

Fortunately, Pretty Good Privacy (PGP) was created by Philip Zimmermann in 1991 as a public-key cryptography system.  You can keep private things private while assuring that you are in fact communicating with whom you think you are communicating. These days you can either buy PGP management tools from PGP Corporation or obtain them open-source (for free) from GnuPG (GNU Privacy Guard is the GNU project's complete and free implementation of the OpenPGP standard).  Take a read of the GNU Privacy Handbook.

Implementations of PGP for Windows can be found via GnuPG. Mac GNU Privacy Guard (Mac GPG for short) is, after a fashion, the Mac OS X port of GnuPG; you can install it from SourceForge. There are also some really useful instructions online on how to configure GnuPG for Mac OS X, including a GUI key-management tool called GPG Keychain Access, GPGFileTool (used to Encrypt/Sign/Decrypt/Verify with a GUI) and GPGDropThing (to quickly use GnuPG on text via a GUI).
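For a taste of GnuPG from the command line, here is an encrypt/decrypt round trip. This sketch uses the simpler symmetric (passphrase) mode rather than the public-key workflow described above; the file name and passphrase are of course illustrative, and it assumes a GnuPG 2.1+ gpg binary (for --pinentry-mode).

```shell
# Write a plaintext note, encrypt it with AES256 under a passphrase,
# then decrypt it back out again.
echo "meet me at noon" > note.txt

gpg --batch --yes --pinentry-mode loopback --passphrase "s3cret" \
    --symmetric --cipher-algo AES256 note.txt      # produces note.txt.gpg

gpg --batch --yes --pinentry-mode loopback --passphrase "s3cret" \
    --output note.out --decrypt note.txt.gpg

cat note.out   # the original plaintext again
```

The public-key workflow is analogous: generate a keypair with gpg --gen-key, then encrypt with --encrypt --recipient instead of --symmetric.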

You may find a link to my public key both at the header of this blog and also here.