
Granularity


(also grain size)

Granularity definition

Granularity is a computer science term for the computation-to-communication ratio (the C/C ratio) of a parallel program, or, equivalently, the degree to which a larger task is broken down into smaller ones. Granularity is closely associated with parallel computing, an architecture in which multiple processors execute parts of the same program simultaneously.

Finer granularity in computing means smaller tasks: greater detail and smaller code sizes per task, but more frequent communication between tasks. Granularity is an essential element of development practices because it helps development teams understand how individual computing tasks perform in the context of the whole project.

How granularity is calculated

  • Granularity can be calculated using the following formula:
    • G = Tcomp / Tcomm
  • G refers to granularity.
  • Tcomp is the computation time (also known as running time): the length of time that a computational process takes to run.
  • Tcomm is the communication time: the time spent communicating between processors or processing elements.
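The formula above can be sketched directly; this is a minimal illustration, and the timing values below are hypothetical measurements chosen for the example, not real benchmark data.

```python
def granularity(t_comp: float, t_comm: float) -> float:
    """Return G = Tcomp / Tcomm for a single parallel task."""
    return t_comp / t_comm

# Hypothetical measurements in seconds: a task that computes for 80 ms
# and spends 20 ms communicating has a C/C ratio of 4.0.
g = granularity(0.080, 0.020)
print(g)  # a larger G means computation dominates communication
```

In practice Tcomp and Tcomm would come from profiling or instrumentation of the parallel program rather than being supplied by hand.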

Granularity levels explained

  • Fine-grained: Individual tasks are small in terms of their execution time and code size, so tasks communicate frequently.
  • Medium-grained: A combination of fine-grained and coarse-grained parallelism.
  • Coarse-grained: Tasks are much larger in size and require more execution time, so tasks communicate infrequently.
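The three levels above can be contrasted with a toy model. This is a hedged sketch under stated assumptions: a fixed (hypothetical) communication cost per task and computation time proportional to the chunk of work each task receives; the constants are illustrative, not measured.

```python
TOTAL_WORK = 1_000_000      # abstract work units to process
COMM_COST_PER_TASK = 1e-3   # assumed fixed communication overhead per task (s)
TIME_PER_UNIT = 1e-8        # assumed computation time per work unit (s)

def model(chunk_size: int) -> tuple[int, float]:
    """Return (number of tasks, per-task G) for a given chunk size."""
    n_tasks = TOTAL_WORK // chunk_size
    t_comp = chunk_size * TIME_PER_UNIT
    return n_tasks, t_comp / COMM_COST_PER_TASK

# Fine-grained: many small tasks, low G (communication dominates).
print(model(100))
# Coarse-grained: few large tasks, high G (computation dominates).
print(model(500_000))
```

Under this model, shrinking the chunk size multiplies the number of tasks while driving each task's C/C ratio down, which is exactly the trade-off that distinguishes fine-grained from coarse-grained parallelism.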