Explain the term granularity (Computer Engineering)


Granularity

Granularity refers to the amount of computation performed concurrently relative to the size of the whole program. In parallel computing, granularity is a qualitative measure of the ratio of computation to communication. Based on their granularity, parallel processing systems can be divided into two groups: coarse-grain systems and fine-grain systems. In fine-grained systems the parallel parts are relatively small, which means more frequent communication; they have a low computation-to-communication ratio and incur high communication overhead. In coarse-grained systems the concurrent parts are relatively large, which means more computation and less communication. If the granularity is too fine, the overhead required for communication and synchronization between tasks can take longer than the computation itself. Coarse-grain parallel systems, on the other hand, perform a relatively large amount of computational work per communication; they have a high computation-to-communication ratio and offer more opportunity for performance gains.
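As a rough, self-contained illustration (a sketch added here, not part of the original text), the code below runs the same workload at fine and coarse granularity using Python's multiprocessing module. The chunksize parameter controls how much work each inter-process message carries: a small value means many messages (fine grain), a large value means few messages (coarse grain). The task sizes and process count are arbitrary assumptions.

```python
# Illustrative sketch: the same parallel workload at fine vs. coarse grain.
import time
from multiprocessing import Pool

def cpu_bound(n):
    # A small, purely computational task (no I/O, no shared state).
    total = 0
    for i in range(n):
        total += i * i
    return total

def run(chunksize):
    with Pool(processes=4) as pool:
        start = time.perf_counter()
        # 10,000 small tasks; chunksize sets how many tasks travel
        # per inter-process message, i.e. the grain size.
        pool.map(cpu_bound, [1000] * 10_000, chunksize=chunksize)
        return time.perf_counter() - start

if __name__ == "__main__":
    print(f"fine grain   (chunksize=1):   {run(1):.3f} s")
    print(f"coarse grain (chunksize=500): {run(500):.3f} s")
```

On most machines the coarse-grained run finishes noticeably faster, because the per-message overhead is amortized over much more computation.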

The degree of granularity in a system is determined by the algorithm applied and by the hardware environment in which it runs. Even on an architecturally neutral system, granularity affects the performance of the resulting program. Communicating the data needed to start a large process can take a substantial amount of time; conversely, a large process will often have less communication to do during its processing. A process might need only a small amount of data to get going but then require more data to continue, or it might need to communicate heavily with other processes to do its work. In most cases the overhead associated with communication and synchronization is high relative to execution speed, so it is advantageous to have coarse granularity.
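To make that trade-off concrete, here is a hypothetical back-of-envelope cost model (all numbers below are illustrative assumptions, not values from the text): total runtime is modelled as the computation time divided among the workers plus a fixed overhead per message, so at fine granularity the communication term dominates.

```python
# Hypothetical cost model: T_total = T_comp / P + messages * T_msg.
def total_time(t_comp, workers, messages, t_msg):
    """Idealised parallel runtime: perfectly divided computation
    plus a fixed overhead per communication."""
    return t_comp / workers + messages * t_msg

T_COMP = 10.0    # seconds of total computation (assumed)
T_MSG = 0.001    # seconds of overhead per message (assumed)
WORKERS = 8

# Fine grain: 100,000 tiny tasks -> communication dominates.
print(total_time(T_COMP, WORKERS, 100_000, T_MSG))  # 101.25 s
# Coarse grain: 100 large tasks -> computation dominates.
print(total_time(T_COMP, WORKERS, 100, T_MSG))      # 1.35 s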

 

