Concurrency control definition
Concurrency control is the practice of managing access to shared resources in concurrent (or parallel) environments. It aims to prevent conflicts and preserve data integrity when multiple processes attempt to alter stored values at the same time.
Concurrency control is particularly important for databases, where multiple transactions may read and write shared data at the same time.
See also: race condition, synchronization
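The hazard that concurrency control guards against can be shown with a minimal sketch (hypothetical variable and function names): two threads update a shared counter, and a lock serializes the read-modify-write so no update is lost.

```python
import threading

balance = 0
lock = threading.Lock()

def deposit(n_times: int) -> None:
    """Increment the shared balance n_times, holding the lock each time."""
    global balance
    for _ in range(n_times):
        with lock:  # exclusive access: "balance += 1" is not atomic on its own
            balance += 1

threads = [threading.Thread(target=deposit, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(balance)  # always 400000 with the lock; updates can be lost without it
```

Without the lock, the interleaved read-modify-write sequences can overwrite one another's results, which is exactly the race condition mentioned above.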
Concurrency control objectives
- Isolation: Isolating each transaction from the others to avoid interference caused by concurrent execution.
- Consistency: Enforcing rules (such as primary key constraints and foreign key constraints) to keep data consistent in concurrent operations.
- Correctness: Ensuring that the final state of the database reflects the intended semantics of the transactions.
Common concurrency control measures
- Lock-based concurrency control involves assigning locks to resources, such as database records or objects. These locks can be either shared (read locks) or exclusive (write locks). Conflicts between transactions are resolved by granting or denying their individual lock requests.
- Multiversion concurrency control maintains multiple versions of data items in the database, so that multiple transactions can access them at the same time. Each transaction sees a consistent snapshot of the database as of the moment it started.
- Timestamp ordering assigns a unique timestamp to each transaction based on its start time. These timestamps determine the order in which transactions may access data.
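The lock-based approach above can be sketched as a small lock table (hypothetical class and method names, not a real database API): shared locks may be held by many transactions at once, while an exclusive lock is granted only when no other transaction holds any lock on the record.

```python
class LockTable:
    """Minimal lock manager: grant or deny shared/exclusive lock requests."""

    def __init__(self):
        self.shared = {}     # record id -> set of transaction ids holding read locks
        self.exclusive = {}  # record id -> transaction id holding the write lock

    def acquire_shared(self, txn, record):
        holder = self.exclusive.get(record)
        if holder is not None and holder != txn:
            return False  # conflict: another transaction holds a write lock
        self.shared.setdefault(record, set()).add(txn)
        return True

    def acquire_exclusive(self, txn, record):
        holder = self.exclusive.get(record)
        other_readers = self.shared.get(record, set()) - {txn}
        if (holder is not None and holder != txn) or other_readers:
            return False  # conflict: record is being read or written by others
        self.exclusive[record] = txn
        return True

locks = LockTable()
print(locks.acquire_shared("T1", "row42"))     # True: first reader
print(locks.acquire_shared("T2", "row42"))     # True: shared locks coexist
print(locks.acquire_exclusive("T3", "row42"))  # False: readers block the writer
```

A real lock manager would also queue or time out denied requests and release locks at commit; this sketch only shows the grant/deny decision.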
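Multiversioning can likewise be sketched in a few lines (hypothetical store, with a simple counter standing in for a commit clock): each write appends a new timestamped version, and a reader sees only the latest version committed at or before its own start time.

```python
class MVStore:
    """Minimal multiversion store: append versions, read by snapshot timestamp."""

    def __init__(self):
        self.versions = {}  # key -> list of (commit_ts, value), in commit order
        self.clock = 0      # stand-in for a monotonically increasing commit clock

    def write(self, key, value):
        self.clock += 1
        self.versions.setdefault(key, []).append((self.clock, value))
        return self.clock

    def snapshot_read(self, key, start_ts):
        # Only versions committed at or before start_ts are visible.
        visible = [v for ts, v in self.versions.get(key, []) if ts <= start_ts]
        return visible[-1] if visible else None

db = MVStore()
db.write("x", "v1")          # committed at ts=1
snapshot = db.clock          # a reader's transaction starts here
db.write("x", "v2")          # a later writer commits at ts=2
print(db.snapshot_read("x", snapshot))  # "v1": the snapshot is unaffected
print(db.snapshot_read("x", db.clock))  # "v2": a new transaction sees the update
```

The key property is that the reader and the later writer never block one another: each simply sees the set of versions appropriate to its timestamp.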
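Basic timestamp ordering can be sketched as follows (hypothetical record class): each record tracks the timestamps of its youngest reader and writer, and an operation that arrives "too late" - an older transaction touching data a younger one has already used - is rejected, typically forcing the older transaction to abort and restart.

```python
class TORecord:
    """Minimal timestamp-ordering check for a single record."""

    def __init__(self, value=None):
        self.value = value
        self.read_ts = 0   # timestamp of the youngest transaction that read
        self.write_ts = 0  # timestamp of the youngest transaction that wrote

    def read(self, ts):
        if ts < self.write_ts:
            return None, False        # reject: a younger transaction already wrote
        self.read_ts = max(self.read_ts, ts)
        return self.value, True

    def write(self, ts, value):
        if ts < self.read_ts or ts < self.write_ts:
            return False              # reject: a younger transaction already read/wrote
        self.write_ts = ts
        self.value = value
        return True

rec = TORecord("old")
print(rec.write(5, "new"))   # True: transaction 5 writes first
print(rec.read(3))           # (None, False): transaction 3 is too old to read
print(rec.read(7))           # ('new', True): transaction 7 sees the new value
```

Because the serialization order is fixed up front by the timestamps, no locks are needed; the cost is that late operations cause aborts rather than waits.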