Paradigms All the Way

Paradigms permeate almost all aspects of computing. Some of these paradigms feel natural. For instance, it is natural to talk about an image or a song when we actually mean a JPEG or an MP3 file. A file is itself an abstraction that evolved within the file-folder paradigm popularized by Windows systems. The underlying objects or streams are again abstractions for patterns of ones and zeros, which in turn represent voltage levels in transistors or the orientation of magnetic domains on a disk. There is an endless hierarchy of paradigms. Like the proverbial turtles that confounded Bertrand Russell (or was it William James?), it is paradigms all the way down.

Some paradigms have faded into the background, although the terminology that evolved from them lingers. The original paradigm for computer networks (and for the Internet) was a mesh of interconnections residing in the sky above us. That view has more or less been replaced by the World Wide Web residing on the ground at our level. But we still invoke the original paradigm whenever we say “download” or “upload.” The World Wide Web, by the way, is represented by the acronym WWW that figures in so many web addresses. It has the dubious distinction of being about the only acronym that takes longer to say than what it stands for. But, getting back to our topic, paradigms are powerful and useful means of guiding our interactions with unfamiliar systems and environments, especially computers, which are strange and complicated beasts to begin with.

A basic computer processor is deceptively simple. It is a string of gates. A gate is a switch (more or less) made up of a small group of transistors. A 32-bit processor has 32 such switches in an array. Each switch can be either off, representing a zero, or on, representing a one. And such a processor can perform only one function: add the contents of another array of gates (called a register) to itself. In other words, it can only “accumulate.”
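To make that picture concrete, here is a minimal sketch of such a machine in Python. The eight-bit width, the names, and the wrap-around behavior are illustrative assumptions, not a description of any real processor.

```python
# A toy "accumulate-only" machine, as described above. The 8-bit
# width and the names here are illustrative assumptions.

WIDTH = 8
MASK = (1 << WIDTH) - 1   # keeps results within the gate array's width


class Accumulator:
    def __init__(self):
        self.acc = 0      # the array of gates, modeled as an integer

    def accumulate(self, register):
        """The machine's single operation: add a register into itself."""
        self.acc = (self.acc + register) & MASK  # overflow wraps around
        return self.acc


machine = Accumulator()
machine.accumulate(0b00000101)   # add 5
machine.accumulate(0b00000011)   # add 3
print(machine.acc)               # prints 8
```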

In writing this last sentence, I have already started a process of abstraction. I wrote “contents,” thinking of the register as a container holding numbers. It is the power of multiple levels of abstraction, each simple and obvious in itself, but each building on whatever comes before it, that makes a computer enormously powerful.

We can see abstraction, followed by the modularization of the abstracted concept, in every aspect of computing, both hardware and software. Groups of transistors become arrays of gates, and gates become processors, registers, caches and memory. Accumulations (additions) become the full set of arithmetic operations, then string manipulations, user interfaces, image and video editing, and so on.
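As a toy illustration of that layering, the sketch below builds richer arithmetic out of nothing but the single “accumulate” primitive from the earlier sketch. The two’s-complement trick and the fixed eight-bit width are, again, illustrative assumptions, not how real arithmetic units are wired.

```python
# One layer up from "accumulate": multiplication and subtraction
# expressed purely in terms of repeated addition. An illustrative
# sketch, not a real arithmetic unit.

WIDTH = 8
MASK = (1 << WIDTH) - 1

def accumulate(acc, register):
    return (acc + register) & MASK   # the machine's lone primitive

def multiply(a, b):
    acc = 0
    for _ in range(b):               # multiplication as repeated accumulation
        acc = accumulate(acc, a)
    return acc

def subtract(a, b):
    neg_b = (~b + 1) & MASK          # two's complement turns addition into subtraction
    return accumulate(a, neg_b)

print(multiply(6, 7))    # 42
print(subtract(10, 4))   # 6
```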

Another feature of computing that aids the seemingly endless march of Moore’s Law (which, in its popular form, holds that computing power doubles roughly every eighteen months) is that each advance seems to fuel further advances, generating explosive growth. The first compiler, for instance, was written in primitive assembly language. The second one was written using the first, and so on. Even in hardware development, one generation of computers becomes the tool for designing the next generation, stoking a seemingly inexorable cycle of development.
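The bootstrapping idea can be caricatured in a few lines. In this hypothetical sketch, a stage-zero “compiler” for a tiny stack language is written in the host language; once it works, the next compiler could in principle be written in the tiny language itself and compiled by stage zero. The language and all the names here are invented for illustration.

```python
# A toy caricature of compiler bootstrapping. The tiny stack language
# (PUSH/ADD) and the function names are hypothetical.

def compile_tiny(source):
    """Stage 0: translate a tiny-language program into a Python function."""
    lines = ["def program():", "    stack = []"]
    for instr in source.strip().splitlines():
        op, *arg = instr.split()
        if op == "PUSH":
            lines.append(f"    stack.append({int(arg[0])})")
        elif op == "ADD":
            lines.append("    stack.append(stack.pop() + stack.pop())")
    lines.append("    return stack.pop()")
    namespace = {}
    exec("\n".join(lines), namespace)     # "emit" and load the compiled code
    return namespace["program"]

program = compile_tiny("PUSH 2\nPUSH 3\nADD")
print(program())   # prints 5
```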

While this positive feedback in hardware and software is a good thing, the explosive nature of the growth can also take us in wrong directions, much as the runaway growth in the credit markets led to the banking collapse of 2008. Many computing experts now wonder whether object-oriented technology has been overplayed.
