Sure, paradigm is one of those "business-speak" words that was popularized in the last decade, like the Macarena and Beanie Babies. It does have etymological roots that go way back, though. The roots of computing go way back, too. If you looked up "computer" in a dictionary before the turn of the 20th century, you would find it defined as a person who does computation. Whether calculating ballistic trajectories for the military, actuarial charts for insurance companies, or tide tables for sailors, computers were people working through a fixed set of formulas to derive their answers. Then, as now, computers were well suited to performing well-defined, repetitive calculations.
Computers have obviously changed over the last 100+ years. So have the methods used to program them. One of the first electronic computers, ENIAC, used patch cords and wires to direct stored data through its different components, tables, and computing engines. The realization that a computer's program could be treated and stored in the same way as data led to the idea and implementation of the "stored-program" computer. What seems commonplace to us today was a major breakthrough back in 1948.
If programs could be stored in memory, like data, how would you encode the instructions of a program into the machine? At first, you wrote in the machine language of the computer, entered directly in binary. Then came assembly language, which is just a set of mnemonics used in place of the binary machine instructions. It is easier for a human programmer to understand the operation performed by an 'ADD' instruction than it is to remember that '01101110' will do the same thing. An assembler is the program that performs the well-defined and repetitive process of translating assembly instructions into machine language.
Engineers, physicists, chemists, and other scientists had always realized the advantages that computers brought to their research and work. However, assembly language must have seemed like the secret language of an underground brotherhood from the lost continent of Atlantis. Scientists work with mathematical formulas, and this fact led to the development of FORTRAN (FORmula TRANslation), the first high-level language. High-level languages require a compiler so that a computer can perform the well-defined and repetitive process of translating the (more) human-readable programs into machine language. Since FORTRAN, we've seen a plethora of high-level programming languages and techniques, most recently object-oriented programming and managed run-time environments.
What does this history lesson have to do with anything? Well, each of these innovations in programming has been a paradigm shift. (There's another one of those '90s buzzwords.) From patch cords to binary machine instructions to assembly language to high-level languages to object-oriented programming. Now, with the advent of multi-core processors, the next paradigm shift in software is concurrent programming through threads.
Our bodies and brains do parallel processing all the time (heart beating, breathing, cogitating, walking, and chewing gum all at the same time). Thinking about things being done in parallel can still be pretty difficult, no matter how much we think we can multitask. Yet, this is the skill that will be needed to succeed in this new programming paradigm. But, is it really all that new? No. Remember all those human computers that I mentioned? They were an example of parallel processing: each computer was assigned a portion of the whole job, each person worked at the same time as all the others, and the results were compiled together when complete. Sounds simple. Can it really be that easy to thread your applications?
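Here's a minimal sketch of that division of labor in C with POSIX threads. Each thread plays the role of one human computer, summing its assigned slice of an array, and the partial results are compiled together once everyone finishes. The array contents, thread count, and names here are my own illustration, not from any particular application.

    /* Sketch: split a sum across threads, combine partial results. */
    #include <pthread.h>
    #include <stdio.h>

    #define NUM_THREADS 4
    #define N 1000

    static int data[N];

    typedef struct {
        int start;   /* first index this thread handles */
        int end;     /* one past the last index */
        long sum;    /* this thread's partial result */
    } Slice;

    static void *sum_slice(void *arg)
    {
        Slice *s = (Slice *)arg;
        s->sum = 0;
        for (int i = s->start; i < s->end; i++)
            s->sum += data[i];
        return NULL;
    }

    int main(void)
    {
        pthread_t workers[NUM_THREADS];
        Slice slices[NUM_THREADS];
        long total = 0;

        for (int i = 0; i < N; i++)   /* something to add up */
            data[i] = i + 1;

        /* Assign each "computer" its portion of the whole job. */
        for (int t = 0; t < NUM_THREADS; t++) {
            slices[t].start = t * (N / NUM_THREADS);
            slices[t].end   = (t + 1) * (N / NUM_THREADS);
            pthread_create(&workers[t], NULL, sum_slice, &slices[t]);
        }

        /* Compile the results together when complete. */
        for (int t = 0; t < NUM_THREADS; t++) {
            pthread_join(workers[t], NULL);
            total += slices[t].sum;
        }

        printf("total = %ld\n", total);  /* 1 + 2 + ... + 1000 = 500500 */
        return 0;
    }

The only reason this one is easy is that each thread writes to nothing but its own slice; the moment threads have to share data, it stops being that easy.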
(Parallel computation and high performance computing (HPC) have been used by engineers and scientists for many years now. In support of HPC, computer scientists have gone through paradigm shifts of their own: parallel architectures, message-passing libraries like MPI, and research into distributed and parallel algorithms. I'm not all that sure how much of this will be of use to programmers using threads, though.)
Multi-core processors are bringing parallel execution to the masses. So, while the paradigm of concurrent programming and parallel processing may not be new, it is going to be much more pervasive from here on out. Will you jump on this bandwagon and take advantage of dual- and quad-cores? A more relevant question might be: do you really need to? Will the benefits outweigh the investment of time and effort to thread your code? Just because everyone else in your office starts doing the Macarena during their lunch hours, that doesn't mean you need to start doing it, too, right?