Memory-driven computing

(also MDC, memory-centric computing)

Memory-driven computing definition

Memory-driven computing is a type of computer architecture that connects large memory pools to multiple processing units. Memory-driven systems can overcome a traditional computing bottleneck: processors being forced to request data and additional memory from one another to accomplish complex tasks.

See also: grid computing, cognitive computing, direct memory access, memory allocation, memory cache

How memory-driven computing works

Memory-driven architecture aims to process data directly in memory. Instead of pairing each processor with its own discrete block of memory, a memory-driven system connects large pools of memory to multiple processors at once.

Such pools give each processor access to far more memory than it would have on its own in a traditional setup. The pools may also integrate technologies such as high-bandwidth memory and persistent memory to boost performance further.

With a centralized memory pool, processors can coordinate more efficiently on resource-intensive tasks. They no longer need to request data or intermediate results from one another, because the work done by one processor is available to every other processor connected to the pool.
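The contrast between discrete processor-memory pairs and a shared pool can be loosely illustrated in software. The Python sketch below uses the standard library's multiprocessing.shared_memory module as a rough single-machine analogy: several worker processes attach to one shared block and operate on it in place instead of exchanging copies of the data. The pool size, worker count, and increment operation are arbitrary illustrative choices; this is a conceptual analogy, not HPE's memory-fabric hardware.

from multiprocessing import Process, shared_memory

POOL_SIZE = 1024  # one shared "memory pool" in bytes (illustrative size)

def worker(shm_name: str, start: int, end: int) -> None:
    # Attach to the existing pool by name -- no data is copied between processes.
    shm = shared_memory.SharedMemory(name=shm_name)
    for i in range(start, end):
        shm.buf[i] += 1          # work in place on the shared pool
    shm.close()

if __name__ == "__main__":
    # One large pool that every worker attaches to directly.
    shm = shared_memory.SharedMemory(create=True, size=POOL_SIZE)
    shm.buf[:] = bytes(POOL_SIZE)    # zero out the pool

    # Four "processors" operating on the same pool concurrently,
    # instead of exchanging copies of the data with each other.
    chunk = POOL_SIZE // 4
    procs = [Process(target=worker, args=(shm.name, i * chunk, (i + 1) * chunk))
             for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    print(bytes(shm.buf[:8]))        # each byte incremented once by its worker
    shm.close()
    shm.unlink()

In a real memory-driven system, the pool would be fabric-attached memory shared by physically separate processors, but the access pattern is the same idea: every processor sees the same data without copying it around.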

Memory-driven computing examples

The most prominent example of memory-driven computing is The Machine, an experimental prototype that Hewlett Packard Enterprise (HPE) began developing in 2014. As of 2018, HPE was looking for commercial applications for the technology.
