A Resource Allocation Graph (RAG) is a directed graph used by the OS to model the allocation of resources to processes and to detect or prevent deadlocks. By representing requests and assignments explicitly, the OS can ensure that it never enters a state in which a cycle—and hence a potential deadlock—exists.
1. RAG Components
A RAG contains two kinds of nodes and two kinds of directed edges:
Process nodes P₁, P₂, …, one per active process.
Resource nodes R₁, R₂, …, one per resource type.
Request edge Pᵢ → Rⱼ: process Pᵢ is waiting for resource Rⱼ.
Assignment edge Rⱼ → Pᵢ: resource Rⱼ is currently allocated to Pᵢ.
2. Cycle Detection and Deadlock
A cycle in the RAG represents a circular wait: each process in the cycle holds a resource that the next one needs. If every resource has exactly one instance, a cycle is both necessary and sufficient for deadlock. If resources have multiple instances, a cycle is necessary but not sufficient, and the OS must further check whether some process in the cycle can still be satisfied.
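Since everything below hinges on detecting cycles, here is a minimal sketch of a depth-first cycle check; the `has_cycle` name and the dict-of-successors graph encoding are illustrative assumptions, not part of the text:

```python
def has_cycle(graph):
    """Return True if the directed graph (dict: node -> iterable of
    successors) contains a cycle, using a three-color DFS."""
    WHITE, GRAY, BLACK = 0, 1, 2          # unvisited / on current path / done
    color = {node: WHITE for node in graph}

    def dfs(node):
        color[node] = GRAY                # node is on the current DFS path
        for succ in graph.get(node, ()):
            c = color.get(succ, WHITE)
            if c == GRAY:                 # back edge into the path: cycle
                return True
            if c == WHITE and dfs(succ):
                return True
        color[node] = BLACK               # fully explored, no cycle through node
        return False

    return any(color[n] == WHITE and dfs(n) for n in list(graph))

# R1 -> P1 -> R2 -> P2 is a chain: no cycle.
print(has_cycle({"R1": ["P1"], "P1": ["R2"], "R2": ["P2"], "P2": []}))   # False
# Adding P2 -> R1 closes the loop.
print(has_cycle({"R1": ["P1"], "P1": ["R2"], "R2": ["P2"], "P2": ["R1"]}))  # True
```

A GRAY node is one still on the recursion stack, so an edge back to a GRAY node is exactly a back edge, the defining feature of a directed cycle.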
3. Preventing Deadlocks via RAG Acyclicity
To prevent deadlocks, the OS enforces that the RAG remains acyclic at all times. The high-level algorithm is:
When process Pᵢ requests resource Rⱼ, tentatively add the request edge Pᵢ → Rⱼ.
Run cycle detection on the resulting graph.
If no cycle forms and Rⱼ is free, grant the request: convert Pᵢ → Rⱼ into the assignment edge Rⱼ → Pᵢ.
If no cycle forms but Rⱼ is held, let Pᵢ wait; the request edge stays in the graph.
If a cycle would form, deny (or delay) the request and remove the tentative edge.
By never allowing the RAG to become cyclic, the OS eliminates the circular-wait condition, so deadlock cannot occur.
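The steps above can be sketched as a small request handler, assuming every resource has a single instance; the class and method names (`RAG`, `request`) are illustrative, not from the text:

```python
class RAG:
    """Minimal resource-allocation graph with an acyclicity check on each request."""

    def __init__(self):
        self.edges = {}                      # node -> set of successor nodes

    def _has_cycle(self):
        WHITE, GRAY, BLACK = 0, 1, 2
        color = {n: WHITE for n in self.edges}

        def dfs(n):
            color[n] = GRAY                  # n is on the current DFS path
            for m in self.edges.get(n, ()):
                c = color.get(m, WHITE)
                if c == GRAY or (c == WHITE and dfs(m)):
                    return True              # back edge: cycle found
            color[n] = BLACK
            return False

        return any(color[n] == WHITE and dfs(n) for n in list(self.edges))

    def request(self, proc, res):
        """Tentatively add the request edge proc -> res, then grant,
        block, or deny according to the acyclicity rule."""
        self.edges.setdefault(proc, set()).add(res)
        self.edges.setdefault(res, set())
        if self._has_cycle():
            self.edges[proc].discard(res)    # roll back the tentative edge
            return "denied"
        if not self.edges[res]:              # no assignment edge: res is free
            self.edges[proc].discard(res)
            self.edges[res].add(proc)        # convert to assignment res -> proc
            return "granted"
        return "waiting"                     # res is held; request edge stays
```

For a single-instance resource, "free" is exactly "has no outgoing assignment edge", which is why the grant test can simply inspect `self.edges[res]`.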
4. Illustrative Example
Before request: P₁ holds R₁ (R₁ → P₁), and P₂ holds R₂ (R₂ → P₂).
P₁ requests R₂: add the request edge P₁ → R₂. No cycle forms, but R₂ is held by P₂, so P₁ waits and the request edge remains in the graph.
P₂ now requests R₁: tentatively add P₂ → R₁; this would create the cycle
P₁ → R₂ → P₂ → R₁ → P₁
The OS therefore denies (or delays) P₂’s request and removes the tentative edge, thereby preventing deadlock.
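The trace above can be reproduced in a few lines; the `has_cycle` helper is an illustrative DFS check, not part of the text:

```python
def has_cycle(edges):
    """DFS cycle check on a directed graph given as node -> set of successors."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {n: WHITE for n in edges}

    def dfs(n):
        color[n] = GRAY
        for m in edges.get(n, ()):
            c = color.get(m, WHITE)
            if c == GRAY or (c == WHITE and dfs(m)):
                return True
        color[n] = BLACK
        return False

    return any(color[n] == WHITE and dfs(n) for n in list(edges))

# Before any request: P1 holds R1, P2 holds R2.
edges = {"R1": {"P1"}, "R2": {"P2"}, "P1": set(), "P2": set()}

# P1 requests R2 (held by P2): the request edge stays and P1 waits.
edges["P1"].add("R2")
print(has_cycle(edges))    # False: R1 -> P1 -> R2 -> P2 is a chain, not a cycle

# P2 requests R1: tentatively add P2 -> R1 and re-check.
edges["P2"].add("R1")
print(has_cycle(edges))    # True: P1 -> R2 -> P2 -> R1 -> P1, so deny
edges["P2"].discard("R1")  # roll back the tentative edge
```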