Mutual exclusion

(also mutex)

Mutual exclusion definition

Mutual exclusion is a program object that prevents multiple threads or processes from accessing the same shared variable or data at the same time. The concept is applied in concurrent programming together with the critical section — a region of code in which multiple processes or threads access the same shared resource. Only one thread can hold a mutex at a time. Typically, a mutex with a unique name is created when the program starts. Before using a shared resource, a thread must lock the mutex, which blocks other threads from accessing that resource at the same time; when the thread is finished with the resource, it unlocks the mutex.

Mutual exclusion can be enforced at both the hardware and the software level. On a single-processor system, a kernel can enforce mutual exclusion and prevent corruption of shared data structures by disabling interrupts for the smallest possible number of instructions. When multiple CPUs share memory, a flag marking whether the resource is acquired is set with an indivisible (atomic) test-and-set operation. At the software level, mutual exclusion is enforced with busy-waiting algorithms such as Dekker's, Peterson's, Szymanski's, Lamport's bakery algorithm, and the black-white bakery algorithm.
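The lock/unlock cycle described above can be sketched in Python with the standard `threading.Lock`. This is a minimal illustration, not part of the original text: the counter, thread count, and iteration count are arbitrary choices made for the example.

```python
import threading

counter = 0
lock = threading.Lock()  # the mutex

def increment(n):
    global counter
    for _ in range(n):
        # Only one thread at a time may enter this critical section.
        with lock:           # lock (acquire) the mutex
            counter += 1     # access the shared resource
        # the mutex is unlocked (released) on exiting the `with` block

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 — without the lock, concurrent updates could be lost
```

Removing the `with lock:` line turns `counter += 1` into an unprotected read-modify-write, and interleaved threads can then overwrite each other's updates.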

Types of mutual exclusion devices

  • Locks. A synchronization primitive that limits access to a resource among multiple threads. Locks enforce a mutual exclusion concurrency control policy, and different applications implement them in different ways.
  • Readers-writer locks. A synchronization primitive that solves the readers-writers problem. An RW lock allows concurrent read-only access, while write operations require exclusive access: multiple threads can read the data in parallel, but writing or modifying it requires an exclusive lock.
  • Recursive locks. A mutual exclusion device that can be locked multiple times by the same process or thread without causing a deadlock.
  • Semaphores. A variable or abstract data type used to control access to a shared resource by multiple threads and avoid critical-section problems in a concurrent system, such as a multitasking operating system.
  • Monitors. A synchronization construct in concurrent programming that enables threads both to mutually exclude one another and to wait (block) until a certain condition is met.
  • Message passing. A method of invoking behavior in a computer program. The invoking program sends a message to a process and relies on that process and its supporting infrastructure to select and run the appropriate code.
  • Tuple space. In parallel and distributed computing, a tuple space is an implementation of the associative memory paradigm: it provides a repository of tuples that can be accessed concurrently.
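As an example of one of the devices listed above, a counting semaphore caps how many threads may use a resource at once. The sketch below, with an assumed limit of 2 and a simulated workload, tracks the peak number of threads inside the guarded section; the illustrative names (`worker`, `peak`) are chosen for this example only.

```python
import threading
import time

sem = threading.Semaphore(2)   # at most 2 threads may hold the semaphore at once
active = 0                     # threads currently inside the guarded section
peak = 0                       # highest concurrency observed
state_lock = threading.Lock()  # protects the two counters above

def worker():
    global active, peak
    with sem:                  # acquire (decrement); blocks while the count is 0
        with state_lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.05)       # simulate using the shared resource
        with state_lock:
            active -= 1
    # release (increment) happens on exiting the `with sem:` block

threads = [threading.Thread(target=worker) for _ in range(6)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(peak)  # never exceeds 2, the semaphore's initial count
```

A semaphore initialized with a count of 1 (a binary semaphore) behaves much like a mutex, while larger counts allow bounded sharing instead of strict mutual exclusion.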
