Category Archives: Computers

Of computers and gadgets — why your screen goes blank, what kind of hosting you should get, how to get started on blogging etc.


High Performance Blogs and Websites

Do you have a website or a blog and feel that it is getting bogged down with heavy traffic? First of all, congratulations — it is one of those problems that webmasters and bloggers would love to have. But how would you solve it? The first thing to do is to enable PHP acceleration, if your site/blog is PHP based. Although it should be straightforward (in theory), it might take a while to get it right. You know what they say — In theory, theory and practice are the same. In practice, they are not. Acceleration, however, is a low-hanging fruit, and will go a long way in solving your problems.

Once you have extracted all the mileage out of the accelerator solution, it is time to incorporate a Content Delivery Network or CDN. What a CDN does is to serve all your static files (images, style sheets, javascript files, and even cached blog pages) from a network of servers other than your own. These servers are strategically placed around the continent (and around the globe) so that your readers receive the content from a location geographically close to them. In addition to reducing the latency due to distance, a CDN also helps by reducing the load on your own server.
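To make the idea concrete, here is a minimal sketch (in Python, with a made-up CDN hostname) of what an origin-pull setup amounts to on your side: your pages keep their own URLs, and only the static asset URLs get rewritten to point at the CDN, which in turn pulls the files from your server the first time and caches them at its edge locations.

```python
# Minimal sketch of wiring an origin-pull CDN into a site.
# The CDN hostname is a placeholder; your provider gives you the real one
# (or you map a "cdn." subdomain to it with a CNAME record).
from urllib.parse import urlparse

CDN_HOST = "cdn.example.com"                     # hypothetical pull zone
STATIC_EXTENSIONS = (".css", ".js", ".png", ".jpg", ".gif")

def cdn_url(original_url: str) -> str:
    """Rewrite a static asset URL so browsers fetch it from the CDN edge."""
    parsed = urlparse(original_url)
    if parsed.path.lower().endswith(STATIC_EXTENSIONS):
        return parsed._replace(netloc=CDN_HOST).geturl()
    return original_url                          # dynamic pages stay on your own server

print(cdn_url("http://www.example.com/wp-content/themes/style.css"))
# -> http://cdn.example.com/wp-content/themes/style.css
```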

If you have the technical know-how and time to spare, you can do it the hard way, by defining a distribution and an origin, and setting up DNS records pointing to something like Amazon CloudFront. If that sounds too daunting a task, go with the right provider who will make it both cheap and easy. The daunting solution will work best for those who consider themselves semi-hackers or developers. The easier option is to take up something like MaxCDN. They provide round-the-clock expert support as well as faster service in the continental US. They can also work out to be cheaper at the right volume. [See the comparison]


Regardless of which route you decide to take, a CDN works by “pulling” the static files from the specified location, caching them across the globe, and serving your readers from the closest location. When you choose a CDN provider, you have to compare features and cost. For instance, if you are a developer, it may become important to you to be able to refresh (“invalidate”) the cache on demand, which is quite a bit easier (and cheaper) on MaxCDN compared to CloudFront. Also of interest is the fact that MaxCDN gives you detailed statistics about your CDN usage.
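For instance, if you go with CloudFront and need that on-demand cache refresh, the invalidation call looks roughly like the sketch below, written against the boto3 AWS SDK. The distribution ID and file paths here are placeholders for illustration, not values from any real setup.

```python
# Rough sketch: invalidating cached files on CloudFront with boto3.
# The distribution ID and the paths are placeholders.
import time
import boto3

cloudfront = boto3.client("cloudfront")

response = cloudfront.create_invalidation(
    DistributionId="E1234EXAMPLE",       # hypothetical distribution ID
    InvalidationBatch={
        "Paths": {
            "Quantity": 2,
            "Items": ["/wp-content/uploads/header.jpg", "/css/style.css"],
        },
        # CallerReference must be unique per request; a timestamp is a common choice.
        "CallerReference": str(time.time()),
    },
)
print(response["Invalidation"]["Status"])   # typically "InProgress" at first
```

MaxCDN exposes the same kind of purge with much less ceremony, which is part of what the comparison above is getting at.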

In short, if you are a professional blogger and webmaster, consider MaxCDN as your content delivery solution. It will significantly improve the performance of your popular sites, and enhance end user experience.

Note that the links to MaxCDN on this post are affiliate links.

Photo by Yordie Sands

Man as Chinese Room

In the previous posts in this series, we discussed how devastating Searle’s Chinese Room argument was to the premise that our brains are digital computers. He argued, quite convincingly, that mere symbol manipulation could not lead to the rich understanding that we seem to enjoy. However, I refused to be convinced, and found the so-called systems response more convincing. It was the counter-argument saying that it was the whole Chinese Room that understood the language, not merely the operator or symbol pusher in the room. Searle laughed it off, but had a serious response as well. He said, “Let me be the whole Chinese Room. Let me memorize all the symbols and the symbol manipulation rules so that I can provide Chinese responses to questions. I still don’t understand Chinese.”

Now, that raises an interesting question — if you know enough Chinese symbols, and Chinese rules to manipulate them, don’t you actually know Chinese? Of course you can imagine someone being able to handle a language correctly without understanding a word of it, but I think that is stretching the imagination a bit too far. I am reminded of the blindsight experiments where people could see without knowing it, without being consciously aware of what it was that they were seeing. Searle’s response points in the same direction — being able to speak Chinese without understanding it. What the Chinese Room is lacking is the conscious awareness of what it is doing.

To delve a bit deeper into this debate, we have to get a bit formal about Syntax and Semantics. Language has both syntax and semantics. For example, a statement like “Please read my blog posts” has the syntax originating from the grammar of the English language, symbols that are words (syntactical placeholders), letters and punctuation. On top of all that syntax, it has a content — my desire and request that you read my posts, and my background belief that you know what the symbols and the content mean. That is the semantics, the meaning of the statement.

A computer, according to Searle, can only deal with symbols and, based on symbolic manipulation, come up with syntactically correct responses. It doesn’t understand the semantic content as we do. It is incapable of complying with my request because of its lack of understanding. It is in this sense that the Chinese Room doesn’t understand Chinese. At least, that is Searle’s claim. Since computers are like Chinese Rooms, they cannot understand semantics either. But our brains can, and therefore the brain cannot be a mere computer.

When put that way, I think most people would side with Searle. But what if the computer could actually comply with the requests and commands that form the semantic content of statements? I guess even then we would probably not consider a computer fully capable of semantic comprehension, which is why if a computer actually complied with my request to read my posts, I might not find it intellectually satisfying. What we are demanding, of course, is consciousness. What more can we ask of a computer to convince us that it is conscious?

I don’t have a good answer to that. But I think you have to apply uniform standards in ascribing consciousness to entities external to you — if you believe in the existence of other minds in humans, you have to ask yourself what standards you apply in arriving at that conclusion, and ensure that you apply the same standards to computers as well. You cannot build circular conditions into your standards — requiring, for instance, that others have human bodies, nervous systems and an anatomy like yours before granting that they have minds, which is what Searle did.

In my opinion, it is best to be open-minded about such questions, and important not to answer them from a position of insufficient logic.

Minds as Machine Intelligence

Prof. Searle is perhaps most famous for his proof that computing machines (or computation as defined by Alan Turing) can never be intelligent. His proof uses what is called the Chinese Room argument, which shows that mere symbol manipulation (which is what Turing’s definition of computation amounts to, according to Searle) cannot lead to understanding and intelligence. Ergo our brains and minds could not be mere computers.

The argument goes like this — assume Searle is locked up in a room where he gets inputs corresponding to questions in Chinese. He has a set of rules to manipulate the input symbols and pick out an output symbol, much as a computer does. So he comes up with Chinese responses that fool outside judges into believing that they are communicating with a real Chinese speaker. Assume that this can be done. Now, here is the punch line — Searle doesn’t know a word of Chinese. He doesn’t know what the symbols mean. So mere rule-based symbol manipulation is not enough to guarantee intelligence, consciousness, understanding etc. Passing the Turing Test is not enough to guarantee intelligence.
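To see just how mechanical “rule-based symbol manipulation” can be, here is a toy sketch: a lookup table maps question strings to canned answer strings, and nowhere in the program is there anything resembling the meaning of those strings. It is a caricature of the Chinese Room rather than a model of it, and the rule book is just a couple of made-up entries.

```python
# Toy illustration of rule-based symbol manipulation: the program maps input
# strings to output strings with no representation of what they mean.
# The rule book is a tiny, made-up stand-in for Searle's vast set of rules.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",
    "你叫什么名字？": "我叫小房间。",
}

def chinese_room(question: str) -> str:
    """Return a canned response purely by lookup, with no understanding anywhere."""
    return RULE_BOOK.get(question, "对不起，我不明白。")

print(chinese_room("你好吗？"))   # a fluent-looking reply from pure symbol pushing
```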

One of the counter-arguments that I found most interesting is what Searle calls the systems argument. It is not Searle in the Chinese room that understands Chinese; it is the whole system including the ruleset that does. Searle laughs it off saying, “What, the room understands Chinese?!” I think the systems argument merits more than that derisive dismissal. I have two supporting arguments in favor of the systems response.

The first one is the point I made in the previous post in this series. In Problem of Other Minds, we saw that Searle’s answer to the question whether others have minds was essentially by behavior and analogy. Others behave as though they have minds (in that they cry out when we hit their thumb with a hammer) and their internal mechanisms for pain (nerves, brain, neuronal firings etc) are similar to ours. In the case of the Chinese room, it certainly behaves as though it understands Chinese, but it doesn’t have any analogs in terms of the parts or mechanisms like a Chinese speaker. Is it this break in analogy that is preventing Searle from assigning intelligence to it, despite its intelligent behavior?

The second argument takes the form of another thought experiment — I think it is called the Chinese Nation argument. Let’s say we can delegate the work of each neuron in Searle’s brain to a non-English speaking person. So when Searle hears a question in English, it is actually being handled by billions of non-English speaking computational elements, which generate the same response as his brain would. Now, where is the English language understanding in this Chinese Nation of non-English speaking people acting as neurons? I think one would have to say that it is the whole “nation” that understands English. Or would Searle laugh it off saying, “What, the nation understands English?!”

Well, if the Chinese nation could understand English, I guess the Chinese room could understand Chinese as well. Computing with mere symbol manipulation (which is what the people in the nation are doing) can and does lead to intelligence and understanding. So our brains could really be computers, and minds software manipulating symbols. Ergo Searle is wrong.

Look, I used Prof. Searle’s arguments and my counter arguments in this series as a sort of dialog for dramatic effect. The fact of the matter is, Prof. Searle is a world-renowned philosopher with impressive credentials while I am a sporadic blogger — a drive-by philosopher at best. I guess I am apologizing here to Prof. Searle and his students if they find my posts and comments offensive. It was not intended; only an interesting read was intended.

Problem of Other Minds

How do you know other people have minds as you do? This may sound like a silly question, but if you allow yourself to think about it, you will realize that you have no logical reason to believe in the existence of other minds, which is why it is an unsolved problem in philosophy – the Problem of Other Minds. To illustrate – I was working on that Ikea project the other day, and was hammering in that weird two-headed nail-screw-stub thingie. I missed it completely and hit my thumb. I felt the excruciating pain, meaning my mind felt it and I cried out. I know I have a mind because I felt the pain. Now, let’s say I see another bozo hitting his thumb and crying out. I feel no pain; my mind feels nothing (except a bit of empathy on a good day). What positive logical basis do I have to think that the behavior (crying) is caused by pain felt by a mind?

Mind you, I am not suggesting that others do not have minds or consciousness — not yet, at least. I am merely pointing out that there is no logical basis to believe that they do. Logic certainly is not the only basis for belief. Faith is another. Intuition, analogy, mass delusion, indoctrination, peer pressure, instinct etc. are all bases for beliefs, both true and false. I believe that others have minds; otherwise I wouldn’t bother writing these blog posts. But I am keenly aware that I have no logical justification for this particular belief.

The thing about this problem of other minds is that it is profoundly asymmetric. If I believe that you don’t have a mind, it is not an issue for you — you know that I am wrong the moment you hear it because you know that you have a mind (assuming, of course, that you do). But I do have a serious issue — there is no way for me to attack my belief in the non-existence of your mind. You could tell me, of course, but then I would think, “Yeah, that is exactly what a mindless robot would be programmed to say!”

I was listening to a series of lectures on the philosophy of mind by Prof. John Searle. He “solves” the problem of other minds by analogy. We know that we have the same anatomical and neurophysical wirings in addition to analogous behavior. So we can “convince” ourselves that we all have minds. It is a good argument as far as it goes. What bothers me about it is its complement — what it implies about minds in things that are wired differently, like snakes and lizards and fish and slugs and ants and bacteria and viruses. And, of course, machines.

Could machines have minds? The answer to this is rather trivial — of course they can. We are biological machines, and we have minds (assuming, again, that you guys do). Could computers have minds? Or, more pointedly, could our brains be computers, and minds be software running on it? That is fodder for the next post.

Brains and Computers

We have a perfect parallel between brains and computers. We can easily think of the brain as the hardware and mind or consciousness as the software or the operating system. We would be wrong, according to many philosophers, but I still think of it that way. Let me outline the compelling similarities (according to me) before getting into the philosophical difficulties involved.

A lot of what we know of the workings of the brain comes from lesion studies. We know, for instance, that features like color vision, face and object recognition, motion detection, language production and understanding are all controlled by specialized areas of the brain. We know this by studying people who have suffered localized brain damage. These functional features of the brain are remarkably similar to computer hardware units specialized in graphics, sound, video capture etc.

The similarity is even more striking when we consider that the brain can compensate for the damage to a specialized area by what looks like software simulation. For instance, the patient who lost the ability to detect motion (a condition normal people would have a hard time appreciating or identifying with) could still infer that an object was in motion by comparing successive snapshots of it in her mind. The patient with no ability to tell faces apart could, at times, deduce that the person walking toward him at a pre-arranged spot at the right time was probably his wife. Such instances give us the following attractive picture of the brain.
Brain → Computer hardware
Consciousness → Operating System
Mental functions → Programs
It looks like a logical and compelling picture to me.

This seductive picture, however, is far too simplistic at best; or utterly wrong at worst. The basic, philosophical problem with it is that the brain itself is a representation drawn on the canvas of consciousness and the mind (which are again cognitive constructs). This abysmal infinite regression is impossible to crawl out of. But even when we ignore this philosophical hurdle, and ask ourselves whether brains could be computers, we have big problems. What exactly are we asking? Could our brains be computer hardware and minds be software running on them? Before asking such questions, we have to ask parallel questions: Could computers have consciousness and intelligence? Could they have minds? If they had minds, how would we know?

Even more fundamentally, how do you know whether other people have minds? This is the so-called Problem of Other Minds, which we will discuss in the next post before proceeding to consider computing and consciousness.


Missing Events and Photos in iPhoto?

Let me guess – you got your new iMac. You had a recent Time Machine backup on your Time Capsule. Setting up the new iMac was ridiculously easy — just point to the backup. A few hours later, your new iMac is just like your old Mac, right down to the wallpaper and browser history. You shake your head in disbelief and say to yourself, “Man, this thing just works! This is the way it is supposed to be!”

A couple of days later, you fire up your iPhoto. It says it needs to update the database or whatever. No sweat. Just a couple of minutes — the new iMac is ridiculously fast. Hullo — what is wrong with the last four events? How come they have no photos in them? Well, actually, they do have something: you can see the thumbnails for a second, and then they disappear. The events seem to have the right number of photos. They even list the camera model and exposure data.

You scratch your head and say to yourself, “Well, maybe the Time Machine backup didn’t unpack properly or whatever. Maybe the version upgrade messed up some data. No sweat. I can use the Time Machine and find the right iPhoto Library.” You fire up the Time Machine — probably for the first time for real. You restore the last good backup of the iPhoto Library to your desktop, and launch iPhoto again. Database update again. Anxious wait. Hey, the damned events are still missing.

Panic begins to set in. Mad Google for answers. Ok, hold down the Option and Command keys, and launch iPhoto. Regenerate thumbnails. Repair the library. Rebuild the Database. Still, the ****** events refuse to come back.

How do I know all this? Because this is exactly what I did. I was lucky though. I managed to recover the events. It dawned on me that the problem was not with the restore process, nor the version update of iPhoto. It was the Time Machine backup process — the backup was incomplete. I had the old Mac and the old iPhoto library intact. So I copied the old library over to the new iMac (directly, over the network; not from the Time Machine backup). I then started iPhoto on the new machine. After the necessary database update, all the events and photos showed up. Phew!
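For the record, here is roughly what that direct copy amounts to, sketched as a small script. The host name, user names and paths are placeholders, and you should quit iPhoto on both Macs first; the point is simply to pull the library across the network rather than out of the Time Machine backup.

```python
# Hedged sketch: pull the old iPhoto Library straight over the network with rsync.
# Host name, user names and paths are placeholders; quit iPhoto on both Macs first.
import subprocess

OLD_MAC = "olduser@old-mac.local"          # hypothetical old machine
SRC = "Pictures/iPhoto\\ Library"          # default location; space escaped for the remote shell
DEST = "/Users/newuser/Pictures/"          # hypothetical destination on the new iMac

subprocess.run(["rsync", "-a", "--progress", f"{OLD_MAC}:{SRC}", DEST], check=True)
```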

So what exactly went wrong? It appears that Time Machine doesn’t back up the iPhoto Library properly if iPhoto is open (according to Apple). More precisely, the recently imported photos and events may not get backed up. This bug (or “feature”) was reported earlier and discussed in detail.

I thought I would share my experience here because it is an important piece of information and might save somebody some time, and possibly some valuable photos. And I feel it is disingenuous of Apple to tout the Time Machine as the mother of all backup solutions with this glaring bug. After all, your photos are among the most precious of your data. If they are not backed up and migrated properly, why bother with Time Machine at all?

To recap:

  1. If you find your photo collection incomplete after migrating to your shiny new iMac (using a Time Machine backup), don’t panic if you still have your old Mac.
  2. Exit gracefully from iPhoto on both machines.
  3. Copy your old iPhoto Library from the old Mac over to the new one (directly, over the network).
  4. Restart iPhoto on the new Mac and enjoy.

How to prevent this from happening

Before the final Time Machine backup from your old Mac, ensure that iPhoto is not running. In fact, it may be worth exiting from all applications before taking the final snapshot.
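If you want a quick way to confirm this before kicking off that last backup, a check along these lines (it simply asks pgrep whether a process named iPhoto exists) will do:

```python
# Quick sanity check before the final Time Machine backup: warn if iPhoto is
# still running. The process name "iPhoto" is assumed here.
import subprocess

result = subprocess.run(["pgrep", "-x", "iPhoto"], capture_output=True, text=True)
if result.returncode == 0:
    print("iPhoto is still running (PID %s) -- quit it before backing up!" % result.stdout.strip())
else:
    print("iPhoto is not running; safe to take the final backup.")
```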

If you want to be doubly sure, consider another automated backup solution just for your iPhoto Library. I use Carbon Copy Cloner.

Photo by Victor Svensson

Slow Time Machine with Time Capsule – SOLVED!

Let me guess — you bought a new Time Capsule, set up your Time Machine to back up half a terabyte of family photos and home videos, and expected it to be “hands-free” from then on? Then you got this progress bar saying that it will take 563 days (or some such ridiculous number) to sync?

Your next step was to trawl Google, which would have shown you that you are not alone. You would have tried disk utility to repair your Time Capsule disk, disabled Spotlight indexing, connected your Mac directly to TC etc. Nothing has helped so far? Fear not, here is what you need to do.

First of all, launch the Software Update pane from System Preferences on your Mac, and ensure that you have the OS X Lion 10.7.5 Supplemental Update, which specifically addresses this problem.

Here is what Apple says about this update:

About OS X Lion 10.7.5 Supplemental Update
The OS X v10.7.5 Supplemental Update is recommended for all users running OS X Lion v10.7.5 and includes the following fixes:

  • Resolves an issue that may cause Time Machine backups to take a very long time to complete
  • Addresses an issue that prevents certain applications signed with a Developer ID from launching

If it is not installed, click on the “Scheduled Check” tab, and install it. Note that it may be installed as bundled with other updates. So, as long as your Mac is up-to-date, you don’t have to worry too much about missing this particular update.

In all likelihood, this update is all that you will need to fix your slow Time Machine on Time Capsule. To verify, restart your machine and launch Time Machine. Give it a few minutes and see if the speed is acceptable (about 10-20 MB a second on a wired Gigabit network).

If it is not, or if you have other reasons for not installing the update, there are a few other tips you can try.

  • Quit applications that may be indexing the file system (Dropbox, QuickSilver, etc.). Find their icons on your menu bar, right-click on them and select Quit.
  • Ensure that Finder is not set to calculate all sizes. Open a Finder window, hit Cmd-J to bring up the view options, and make sure “Calculate all sizes” is not ticked.

    Note that this setting is not under the usual Finder preferences (which you would bring up with Cmd-,); it lives in the view options (Cmd-J).

  • The last thing to try is to kill and relaunch Finder. Click on the Apple logo on the menu bar, select “Force Quit…” to bring up the Force Quit window, select Finder and hit the Relaunch button.

The last step (of killing and relaunching Finder) has been touted as something that definitely works. So do give it a try if nothing else helps. Another way of killing and relaunching Finder is to issue the command killall Finder from a terminal window.

If these tips didn’t work, you are pretty much out of luck. There is still one more thing you could try, which probably will not work. It certainly didn’t for me, but it gave me a sense that I was “fixing” the problem.

Connect your Time Capsule (TC) directly to your Mac. In order to do this, follow these steps.

  • First, connect your TC to your network, and set it up using the Airport Utility.
  • Disconnect it from your network. (Disconnect the ethernet cable.)
  • Disconnect the ethernet cable from your Mac, and connect TC (one of the three output ports) to your Mac.


How to Avoid Duplicate Imports in iPhoto

For the budding photographer in you, iPhoto is a godsend. It is the iLife photo organization program that comes pre-installed on your swanky new iMac or MacBook Air. In fact, I would go as far as to say that iPhoto is one of the main reasons to switch to a Mac. I know, there are alternatives, but for seamless integration and smooth-as-silk workflow, iPhoto reigns supreme.

But (ah, there is always a “but”), the workflow in iPhoto can create a problem for some. It expects you to shoot pictures, connect your camera to your Mac, move the photos from the camera to the Mac, enhance/edit and share (Facebook, flickr) or print or make photo books. This flow (with some face recognition, red-eye removal, event/album creation etc.) works like a charm — if you are just starting out with your new digital camera. What if you already have 20,000 old photos and scans on your old computer (in “My Pictures”)?

This is the problem I was faced with when I started playing with iPhoto. I pride myself on anticipating such problems. So, I decided to import my old library very carefully. While importing “My Pictures” (which was fairly organized to begin with), I went through it folder by folder, dragging-and-dropping them on iPhoto and, at the same time, labeling them (and the photos therein) with what I thought were appropriate colors. (I used the “Get Info” function in Finder for color labels.) I thought I was being clever, but I ended up with a fine (but colorful) mess, with my folders and photos sporting random colors. It looked impossible to compare and figure out where my 20,000 photos had got imported to in iPhoto; so I decided to write my very first Mac App — iPhotoTagger. It took me about a week to write it, but it sorted out my photo worries. Now I want to sell it and make some money.

Here is what it does. It first goes through your iPhoto library and catalogs what you have there. It then scans the folder you specify and compares the photos in there with those in your library. If a photo is found exactly once, it will get a Green label, so that it stands out when you browse to it in your Finder (which is Mac-talk for Windows Explorer). Similarly, if the photo appears more than once in your iPhoto library, it will be tagged in Yellow. And, going the extra mile, iPhotoTagger will color your folder Green if all the photos within have been imported into your iPhoto library. Those folders that have been partially imported will be tagged Yellow.

The photo comparison is done using Exif data, and is fairly accurate. Note that iPhotoTagger doesn’t modify anything within your iPhoto library. Doing so would be unwise. It merely reads the library to gather information.
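iPhotoTagger’s own code is not reproduced here, but the gist of an Exif-based comparison can be sketched as below. This is a simplification for illustration only: it assumes the Pillow imaging library is installed and compares just a couple of Exif fields plus file size, whereas the app reads the iPhoto library itself.

```python
# Simplified sketch of Exif-based duplicate detection (not iPhotoTagger's actual code).
# Assumes the Pillow library is available: pip install Pillow
import os
from collections import Counter
from PIL import Image, ExifTags

def exif_key(path):
    """Build a comparison key from a few Exif fields plus the file size."""
    with Image.open(path) as img:
        exif = img.getexif()
    tags = {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    return (tags.get("Model"), tags.get("DateTime"), os.path.getsize(path))

def find_duplicates(folder):
    """Return the Exif keys that occur more than once in a folder of JPEGs."""
    counts = Counter()
    for name in os.listdir(folder):
        if name.lower().endswith((".jpg", ".jpeg")):
            counts[exif_key(os.path.join(folder, name))] += 1
    return {key: n for key, n in counts.items() if n > 1}
```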

This first version (V1.0) is released to test the waters, as it were, and is priced at $1.99. If there is enough interest, I will work on V2.0 with improved performance (using Perl and SQLite, if you must know). I will price it at $2.99. And, if the interest doesn’t wane, a V3.0 (for $3.99) will appear with a proper help file, performance pane, options to choose your own color scheme, SpotLight comments (and, if you must know, probably rewritten in Objective-C). Before you rush to send me money, please know that iPhotoTagger requires Snow Leopard or Lion (OS X 10.6 or 10.7). If in doubt, you can download the lite version and play with it. It is fully functional, and will create lists of photos/folders to be tagged in Green and Yellow, but won’t actually tag them.


Your Virtual Thumbdrive

I wrote about DropBox a few weeks ago, ostensibly to introduce it to my readers. My hidden agenda behind that post was to get some of you to sign up using my link so that I get more space. I was certain that all I had to do was to write about it and every one of you would want to sign up. Imagine my surprise when only two signed up, one of whom turned out to be a friend of mine. So I must have done it wrong. I probably didn’t bring out all the advantages clearly enough. Either that or not many people actually lug their data around in their thumbdrives. So here I go again (with the same, not-so-hidden agenda). Before we go any further, let me tell you clearly that DropBox is a free service. You pay nothing for 2GB of online storage. If you want to go beyond that limit, you do pay some fee.

Most people carry their thumbies around so that they can access their files from any computer they happen to find themselves in front of. If these computers are not your habitual computers (i.e., your wife’s notebook, kids’ PC, office computer etc.), the virtual DropBox may not totally obviate the necessity of a real thumbdrive. For random computers, virtual just doesn’t cut it. But if you are a person of habits and shuttle from one regular computer to another, DropBox is actually a lot better than a real USB drive. All you have to do is to install DropBox on all those machines, which don’t even have to be of the same kind — they can be Macs, PCs, Linux boxes etc. (In fact, DropBox can be installed on your mobile devices as well, although how you will use it is far from clear.) Once you install DropBox, you will have a special folder (or directory) where you can save stuff. This special folder/directory is, in reality, nothing but a regular one. Just that there is a background program monitoring it and syncing it magically with a server (which is on a cloud), and with all other computers where you have DropBox installed under your credentials. Better yet, if your computers share a local network, DropBox uses it to sync among them in practically no time.
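To demystify that last bit, here is a toy version of what such a background program does: take a snapshot of the folder, poll it, and flag anything whose modification time has changed. The real DropBox client is far more sophisticated (block-level transfers, LAN sync and so on); this is only meant to illustrate the idea.

```python
# Toy illustration of a sync agent: poll a folder and report changed files.
# The real DropBox client is far more sophisticated; this only shows the idea.
import os
import time

WATCHED = os.path.expanduser("~/Dropbox")      # the special folder being monitored

def snapshot(folder):
    """Map every file under the folder to its last-modified time."""
    state = {}
    for root, _dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            state[path] = os.path.getmtime(path)
    return state

previous = snapshot(WATCHED)
while True:
    time.sleep(5)
    current = snapshot(WATCHED)
    for path, mtime in current.items():
        if previous.get(path) != mtime:
            print("Changed, would upload:", path)   # a real client would sync it here
    previous = current
```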

In addition to this file synchronization, DropBox is an off-site mirror of your synced files. So if you keep your important files in the DropBox folder, they will survive forever. This is an advantage that no physical, real thumbdrive can offer you. With real thumbdrives, I personally have lost files (despite the fact that I am fairly religious about regular copies and mirrors) due to USB drives dying on me. With DropBox, it will never happen. You have local copies on all the computers where you have DropBox running and a remote copy on a cloud server.

But you might say, “Ha, that is the problem — how can I put my personal files on some remote location where anybody can look at them?” Well, DropBox says that they use industry-standard encryption that they themselves cannot unlock without your password. I chose to trust them. After all, even if they could decrypt it, how could they trawl terabytes of data in random formats in the hope of finding your account number or whatever? Besides, if you are really worried about the security, you can always create a TrueCrypt volume in DropBox.

Another use you can put DropBox to is in keeping your application data synced between computers. This works best with Macs and symbolic links. For instance, if you have a MacBook and an iMac, you can put your address book in your DropBox directory, create a symbolic link from the normal location (in ~/Library/ApplicationData/Mail.app) and expect to see the same address book on both computers. A similar trick will work with other applications as well. I have tried it with my offline blogging software (ecto) and my development environment (NetBeans).
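Here is the move-and-symlink trick sketched in a few lines. The application data path below is a made-up example (the real one varies by application), and you should quit the application before touching its data.

```python
# Sketch of the move-and-symlink trick: park an application's data folder in
# DropBox and leave a symbolic link where the application expects it.
# The paths are made-up examples; quit the application before running this.
import os
import shutil

original = os.path.expanduser("~/Library/Application Support/SomeApp")   # hypothetical data folder
dropbox_copy = os.path.expanduser("~/Dropbox/AppData/SomeApp")

if not os.path.islink(original):
    os.makedirs(os.path.dirname(dropbox_copy), exist_ok=True)
    shutil.move(original, dropbox_copy)     # move the data into DropBox (first machine only)
    os.symlink(dropbox_copy, original)      # the application follows the link transparently
```

On the second machine the folder already exists in DropBox, so you would skip the move, rename the local folder out of the way, and just create the symbolic link.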

Want more reasons to sign up? Well, you can also share files with other users. Suppose your spouse has a DropBox of her own, and you want to share some photos with her. This can be easily arranged. And I believe the photos folder in DropBox behaves like a gallery, although I haven’t tested it.

So, if these strike you as good reasons to have a virtual thumbdrive in addition to (or instead of) a real physical one, do sign up for DropBox via any of the million links on this page. Did I tell you that if your friends signed up using your link, you would get 250MB extra for each referral?

Photo by Debs (ò‿ó)♪


Hosting Services

In today’s world, if you don’t have a website, you don’t exist. Well, that may not be totally accurate — you may do just fine with a facebook page or a blog. But the democratic nature of the Internet inspires a lot of us to become providers of information rather than just consumers. The smarter ones, in fact, strategically position themselves in between the providers and the consumers, and reap handsome rewards. Look at the aforementioned facebook, or Google, or any one of those Internet businesses that made it big. Even the small fries of the Internet, including small-time bloggers such as yours truly, find themselves facing technical issues around web traffic and stability. I recently moved from my shared hosting at NamesDirect.com to a virtual private host at Arvixe.com. There, I have done it. I have gone and dropped technical jargon on my readers. But this post is on the technical choices budding webmasters have. (Before we proceed further, let me disclose the fact that the links to Arvixe in this post are all affiliate links.)

When you start off with a small website, you typically go with what they call “shared hosting” — the economy class of web hosting solutions. You register a domain name (such as thulasidas.com) for $20 or $30 and look around for a place on the web to put your pages. You can find this kind of hosting for under $10 a month. (For instance, Arvixe has a package for as low as $4 a month, with a free domain name registration thrown in.) Most of these providers advertise unlimited bandwidth, unlimited storage, unlimited databases etc. Well, don’t believe everything you see on the Internet; you get what you pay for. If you read the fine print before clicking “here” to accept the 30 page-long terms and conditions, you would see that unlimited really means limited.

For those who have played around with web development at home, shared hosting is like having XAMPP installed on your home computer with multiple users accessing it. Sure, the provider may have a mighty powerful computer, huge storage space and large pipe to the Internet or whatever, but it is still sharing. This means that your own particular needs cannot be easily accommodated, especially if it looks as though you might hog an unfair share of the “unlimited” resources, which is what happened with my provider. I needed a “CREATE TEMPORARY TABLE” privilege for a particular application, and my host said, “No way dude.”

Shared hosting comes in different packages, of course. Business, Pro, Ultimate etc. — they are all merely advertising buzzwords, essentially describing different sizes of the share of the resources you will get. The next upgrade is another buzzword — Cloud Hosting. Here, the resources are still shared. But apparently they reside on geographically dispersed data centers, optimized and scalable through some kind of grid technology. This type of hosting is considered better because, if you run out of resources, the hosting program can allocate more. For instance, if you suddenly have a traffic spike because of your funny post going viral on facebook and digg, the cloud could easily handle it. They will, of course, charge you more, but in the shared hosting scenario, they would probably lock you out temporarily. To me, cloud hosting sounds like shared hosting with some of the resource constraints removed. It is like sharing a pie, but with all the ingredients on hand, so that if you run out, they can quickly bake some more for you.

The “business class” of web hosting is VPS or Virtual Private Server. Here, you have a server (albeit a virtual one) for yourself. Since you “own” this server, you can do whatever you like with it — you have “root” access. And the advertised resources are, more or less, dedicated to you. This is like having a VirtualBox running on your home PC where you have installed XAMPP. The only downside is that you don’t know how many other VirtualBoxes are running on the computer where your VPS is running. So the share of the resources you actually get to enjoy may be different from the so-called “dedicated” ones. For root access and quasi-dedicated resources, you pay a premium. VPS costs roughly ten times as much as shared hosting. Arvixe, for instance, has a VPS package for $40 a month, which is what I signed up for.

VPS hosting comes with service level agreements that typically state 99.9% uptime or availability. It is important to note that this uptime refers, not to your instance of VPS, but to the server that hosts the virtual servers. Since you are the boss of your VPS, if it crashes, it is largely your problem. Your provider may offer a “fully managed” service (Arvixe does), but that usually means you can ask them to do some admin work and seek advice. In my case, my VPS started hanging (because of some FastCGI issues before I decided to move to DSO for PHP support so that APC worked — I know, lots of techie jargon, but I am laying the groundwork for my next post on server management). When I asked the support to help diagnose the problem, they said, “It is hanging because your server is spawning too many PHP processes. Anything I can help you with?” Accurate statement, I must admit, but not necessarily the kind of help you are looking for. They were saying, ultimately, the VPS server was my baby, and I would have to take care of it.

If you are a real high-flying webmaster, the type of hosting you should go for is a fully dedicated one. This is the first-class or private-jet situation in my analogy. This hosting option will run you a considerable cost, anywhere from $200 to several thousand dollars per month. For that kind of money, what you will get is a powerful server (well, at least for the costlier ones of these plans) housed in a datacenter with redundant power supplies and so on. Dedicated hosting, in other words, is a real private server, as opposed to a virtual one.

I have no direct experience with a hosted dedicated server, but I do have a couple of servers running at home for development purposes. I run two computers with XAMPP (one real and one on a VirtualBox on my iMac) and two with MAMP. And I presume the dedicated-server experience is going to be similar — a server at your beck and call with resources earmarked for you, running whatever it is that you would like to run.

Straddling shared and VPS hosting is what they call a reseller account. This type of hosting essentially sets you up as a small web hosting provider (presumably in a shared hosting mode, as described above) yourself. This can be interesting if you want to make a few bucks on the side. Arvixe, for instance, offers you a reseller package for $20, and promises to look after end-user support themselves. Of course, when you actually resell to your potential customers, you may want to make sure your offering has something better than what they can get directly from the company, either in terms of pricing or features. Otherwise, it wouldn’t make much sense for them to come to you, would it?

So there. That is the spectrum of hosting options you have. All you need to do is to figure out where in this spectrum your needs fall, and choose accordingly. If you end up choosing Arvixe (a wise choice), I would be grateful if you do so using one of my affiliate links.