Commodity computing definition
Commodity computing refers to the use of standardized, off-the-shelf hardware components for computing, which are widely available and interchangeable among various systems. In other words, commodity computing means using common, affordable computer parts that you can find almost anywhere, instead of special, expensive ones made just for one purpose.
Commodity hardware is typically mass-produced, which lowers costs and makes it accessible to a broad range of users and applications. Organizations can build and scale their systems from these standard, low-cost components. The approach is particularly prevalent in data centers and cloud computing, where high performance and reliability are achieved by clustering many commodity servers together.
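To make the clustering point concrete, here is a minimal Python sketch of the failover idea: when work fails on one commodity server, it is simply retried on another, so the cluster stays reliable even though individual machines are not. The node names, failure rate, and helper functions are illustrative assumptions, not any specific product's API.

```python
import random

NODES = ["node-1", "node-2", "node-3", "node-4"]  # hypothetical commodity servers
FAILURE_RATE = 0.2  # assumed chance that any single attempt fails

def run_on(node: str, task: str) -> bool:
    """Pretend to run a task on a node; cheap hardware fails now and then."""
    return random.random() > FAILURE_RATE

def run_with_failover(task: str, max_attempts: int = 4) -> str:
    """Retry the task on randomly chosen nodes until one succeeds."""
    for attempt in range(1, max_attempts + 1):
        node = random.choice(NODES)
        if run_on(node, task):
            return node
        print(f"{task}: attempt {attempt} failed on {node}, retrying elsewhere")
    raise RuntimeError(f"{task}: all {max_attempts} attempts failed")

if __name__ == "__main__":
    for i in range(5):
        node = run_with_failover(f"task-{i}")
        print(f"task-{i} completed on {node}")
```

The same logic, applied across thousands of nodes by cluster schedulers, is what lets cloud providers build dependable services on top of individually unremarkable hardware.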
Types of commodity computing
Commodity computing can be segmented based on its application and the scale at which it's implemented:
- Desktop and personal computing. Includes hardware components such as graphics cards, motherboards, and CPUs, allowing users to upgrade each component separately.
- Enterprise servers. Deployed in data centers, where commodity hardware lowers the per-unit cost as the infrastructure grows.
- Storage solutions. Off-the-shelf components used to store large amounts of data at low cost, typically by replicating data across many cheap disks (see the sketch after this list).
- Networking equipment. Used to build a wide range of network configurations, though it may require more expertise to set up than integrated solutions.
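As an illustration of the storage point above, the following Python sketch replicates each object across several directories that stand in for independent commodity disks. All names and the replication factor of three are assumptions for the example, not a real system's layout; the point is that losing any single "disk" does not lose the data.

```python
import tempfile
from pathlib import Path

# Three temporary directories stand in for three independent commodity disks.
DISKS = [Path(tempfile.mkdtemp(prefix=f"disk{i}_")) for i in range(3)]

def put(name: str, data: bytes) -> None:
    """Write the object to every 'disk' so no single failure loses it."""
    for disk in DISKS:
        (disk / name).write_bytes(data)

def get(name: str) -> bytes:
    """Read from the first 'disk' that still holds the object."""
    for disk in DISKS:
        path = disk / name
        if path.exists():
            return path.read_bytes()
    raise FileNotFoundError(name)

if __name__ == "__main__":
    put("example.txt", b"replicated across cheap components")
    (DISKS[0] / "example.txt").unlink()   # simulate one disk failing
    print(get("example.txt").decode())    # data survives via the other replicas
```

Real systems such as RAID arrays and distributed file systems apply this same redundancy idea at much larger scale, which is how cheap, individually unreliable components add up to durable storage.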