
Grid computing

Grid computing definition

Grid computing is a distributed computing model in which a network of loosely coupled computers is combined to perform large-scale, complex tasks.

Grid computing is used in fields that require high processing power, such as scientific research, financial modeling, media rendering, and bioinformatics. Both grid computing and cloud computing draw on distributed resources, but grid computing pools those resources to solve a single large task, while cloud computing provides on-demand resources and services to many users for many different tasks.
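A toy sketch of this split-and-combine idea is shown below. It assumes local worker processes standing in for the machines of a real grid, and the analyze_chunk workload and data are made up for illustration; it is not a real grid framework.

    # Minimal sketch: one large task split into independent chunks
    # that are processed in parallel and then combined.
    # Local worker processes stand in for the nodes of a real grid.
    from concurrent.futures import ProcessPoolExecutor

    def analyze_chunk(chunk):
        # Placeholder for the real per-node workload (e.g. signal analysis).
        return sum(x * x for x in chunk)

    def split(data, n_chunks):
        # Divide the dataset into roughly equal chunks, one per "node".
        size = max(1, len(data) // n_chunks)
        return [data[i:i + size] for i in range(0, len(data), size)]

    if __name__ == "__main__":
        data = list(range(1_000_000))        # input for the single large task
        chunks = split(data, n_chunks=8)     # one chunk per worker "node"
        with ProcessPoolExecutor() as pool:
            partial_results = pool.map(analyze_chunk, chunks)
        print(sum(partial_results))          # combine the partial results

In a real grid, the chunks would be dispatched over a network to machines often run by different organizations, with grid middleware handling scheduling, authentication, and fault tolerance.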

History of grid computing

The term was inspired by the electric power grid, which lets users draw on remote energy resources as easily as a local utility, much as grid computing aims to make remote computing resources available on demand. Ian Foster and Carl Kesselman’s “The Grid: Blueprint for a New Computing Infrastructure” (1998) played a significant role in defining the concept.

One of the earliest examples of grid computing was the SETI@home project, launched in 1999 by the University of California, Berkeley. The project used internet-connected computers volunteered by the general public to analyze radio signals from space in the search for extraterrestrial intelligence.

Many scientific research projects adopted grid computing in the early 2000s to tackle complex computational problems. One notable example is the Large Hadron Collider at CERN, which relies on the Worldwide LHC Computing Grid, a network of computing centers around the world, to store and analyze data from its experiments.

Grid computing was a precursor to cloud computing, which has gained popularity for its scalability, flexibility, and ease of access to resources. However, grid computing still plays a critical role, especially in scientific research and areas where large-scale, distributed computing resources are necessary.