Pipelining - Computer Architecture


The Pipeline Defined

According to John Hayes:

"A pipeline processor consists of a sequence of processing circuits, called stages or segments, through which a stream of operands may be passed.

"Partial processing of the operands takes place in each segment.

"... a fully processed result is achieved only after an operand set has passed through the whole pipeline."

In daily life, people perform many tasks in stages. For example, when we do the laundry, we put a load in the washing machine. When that load is finished washing, we transfer it to the dryer and put another load in the washing machine. When the first load is dry, we pull it out for folding or ironing, move the second load to the dryer, and start a third load in the washing machine. We fold or iron the first load at the same time as the second and third loads are being dried and washed, respectively. We may never have thought of it this way, but we do laundry by pipeline processing.

A pipeline is a series of stages, where part of the work is completed at each stage. The work is not finished until it has passed through all of the stages.

Review Hayes' definition as it pertains to our laundry example. The washing machine is the first of the "processing circuits", or stages. The second stage is the dryer, and the third is the folding or ironing stage.

Partial processing takes place in each stage. We certainly aren't done when the clothes leave the washer, nor when they leave the dryer; we are only getting closer. We still have to take the third step and fold (if we're lucky) or iron the clothes. The "fully processed result" is achieved only after the operand (the load of clothes) has passed through the entire pipeline.

We are often taught to take a big task and divide it into smaller pieces. This can turn an unmanageable composite task into a series of more tractable steps. In the case of already manageable tasks such as the laundry example, it lets us speed up the work by performing the steps in an overlapping fashion.

This is the key to pipelining: dividing a big task into smaller, overlapping tasks.
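To make the overlap concrete, here is a minimal Python sketch, not taken from the original text, that steps three hypothetical laundry stages (wash, dry, fold) through time. Each stage is assumed to take one time step, and a new load enters the washer at every step, so up to three loads are in flight at once.

# A minimal sketch (an illustration, not a definitive implementation) of the
# laundry pipeline: three stages, each taking one "time step", with a new load
# entering the washer every step so the stages work on different loads in parallel.

STAGES = ["wash", "dry", "fold"]

def run_pipeline(num_loads):
    """Print which load occupies each stage at every time step."""
    # The last load still has to drain through every remaining stage.
    total_steps = num_loads + len(STAGES) - 1
    for step in range(total_steps):
        occupancy = []
        for stage_index, stage in enumerate(STAGES):
            load = step - stage_index  # the load that reaches this stage at this step
            if 0 <= load < num_loads:
                occupancy.append(f"{stage}: load {load + 1}")
            else:
                occupancy.append(f"{stage}: idle")
        print(f"step {step + 1}: " + " | ".join(occupancy))

if __name__ == "__main__":
    run_pipeline(3)

Run sequentially, three loads would need 3 x 3 = 9 steps; in the sketch they finish in 5, which is exactly the gain that overlapping the stages buys.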

"An important aspect of our civilization is the division of labor. Chief engineering achievements are based on subdividing the entire work into individual tasks which can be handled in spite of their inter-dependencies.

"Overlap and pipelining are important operation management techniques based on job sub-divisions under a precedence constraint."
