Process Synchronization in OS (Operating System) – A Complete Guide

Process synchronization is the coordination of multiple processes in a multi-process system so that they access shared resources in a controlled, predictable manner. Its goal is to prevent race conditions and the many other synchronization problems that arise in concurrent systems.

 

The main objective of process synchronization is to ensure that when more than one process accesses a shared resource, no process interferes with another and no inconsistent or conflicting data results from concurrent access. This is accomplished using synchronization techniques such as semaphores, monitors, and critical sections.

 

Synchronization is essential both for maintaining data consistency and integrity and for reducing the risk of deadlocks and other synchronization problems. It is a core part of modern operating systems and is necessary for the correct and efficient operation of multi-process systems.

What is Process Synchronization?

When multiple processes run at the same time and more than one of them can access the same data or resources, process synchronization becomes essential. Multi-process systems rely on it because data inconsistency occurs when two or more processes access the same data or resource simultaneously; to prevent this, the processes must be synchronized with one another.

 

Process Synchronization

 

Consider, as in the picture above, a bank account with a current balance of 500 and two users of that account. User 1 and User 2 both try to read the balance. If Process 1 is performing a withdrawal while Process 2 is checking the balance, the user asking for the current balance may get a wrong or stale value. Process synchronization is exactly what prevents this kind of data inconsistency.

 

Also Read: What is the Process in OS

Importance of Process Synchronization

In multi-process operating system environments, concurrent execution is the norm, which makes process synchronization a necessary concept. Here are the key reasons why it is important:

Data Consistency

In multi-process systems, multiple processes share resources such as memory, files, and devices. Data inconsistency arises when that shared data is accessed concurrently without synchronization. This inconsistency can cause:

 

  • Corrupted Data: For example, if one process modifies a shared variable while another process is reading it, the reader can end up with erroneous computations or an invalid system state.
  • Loss of Updates: If two processes update the same data at the same time, one update may overwrite the other, losing crucial information.

 

Take the example of two processes maintaining the balance of a shared bank account. If both processes run the daily balance computation and they simultaneously read the current balance, compute their new balance, and write it back, one of the updates will be lost and the resulting account balance will be incorrect.
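To make the lost update concrete, here is a minimal C sketch, assuming POSIX threads (the deposit function and the balance variable are illustrative, not from the article). Two threads each add 100 to a balance of 500 without any lock, so the read-modify-write steps can interleave and one deposit can be lost:

/* Lost-update sketch: two unsynchronized deposits into a shared balance. */
#include <pthread.h>
#include <stdio.h>

static int balance = 500;          /* shared bank balance */

static void *deposit(void *arg) {
    (void)arg;
    int current = balance;         /* read   */
    current = current + 100;       /* modify */
    balance = current;             /* write  */
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, deposit, NULL);
    pthread_create(&t2, NULL, deposit, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("final balance = %d\n", balance);   /* 700 expected, but 600 is possible */
    return 0;
}

Built with -pthread, the program usually prints 700, but an unlucky interleaving can print 600: exactly the lost update described above.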

Resource Sharing

Process synchronization makes it possible for several processes to share resources safely. In many applications, processes must cooperate and share resources such as printers, files, and memory. Proper synchronization mechanisms allow:

 

  • Controlled Access: Processes can request shared resources without fear of contention or deadlock, ensuring the resources are used in an orderly way rather than all at once.
  • Maximized Utilization: Synchronization lets several processes share resources without conflict, which raises overall system utilization. For example, several processes can submit jobs to the same printer without garbling each other's output.

Deadlock Avoidance

A deadlock occurs when two or more processes each hold a resource the other needs, and neither can proceed or release what it holds. Through careful design and management of resource allocation, synchronization helps avoid deadlocks. It ensures that:

 

  • Orderly Resource Allocation: Processes request resources in a specific order, which prevents the circular-wait condition that is a common cause of deadlock.
  • Timeouts and Rollbacks: Some synchronization strategies make a process wait only a fixed amount of time before rolling back its request, reducing the risk of deadlock.

Starvation Prevention and Fairness

Process synchronization mechanisms make sure that all processes have a fair chance to use the shared resources. This is crucial for:

 

  • Preventing Starvation: Without synchronization, some processes can be perpetually denied access to the resources they need. Scheduling policies such as round-robin or priority-based scheduling guarantee that each process eventually gets a turn.
  • Ensuring Predictable Behavior: Synchronization mechanisms can enforce an order of execution, which makes complex systems easier to debug and maintain.

IPC (Inter Process Communication)

In many applications, processes must communicate and share data with each other. IPC mechanisms such as message passing, shared memory, and semaphores all require synchronization. It allows:

 

  • Ordered Communication: Messages are sent and received in a controlled way and processed in the order they arrive, which is necessary for maintaining state and coordinating processes.
  • Data Integrity: Data transferred between processes is not corrupted, even when processes could otherwise access it simultaneously.

System Performance and Efficiency

Efficient synchronization mechanisms can improve system performance. By minimizing contention for shared resources, synchronization lets processes make progress without unnecessary delays. This leads to:

 

  • Reduced Overheads: Well-designed synchronization minimizes the overhead of context switching and waiting, increasing overall system throughput.
  • Increased Responsiveness: When processes are not blocked on bottlenecks, applications can respond to user input faster because the processes operate in harmony.

How Does Process Synchronization Work?

To understand how process synchronization works, let’s consider three processes:

 

  • Process 1 is writing data.
  • Process 2 and Process 3 are both reading that same data.

 

If all three processes run at the same time without synchronization, Processes 2 and 3 might read outdated or incorrect data because they are trying to access the data while it is being written.
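One way to coordinate such a writer and two readers, assuming POSIX threads, is a read-write lock. This is only an illustrative sketch (the shared_data variable and thread functions are made up for the example): the writer holds the write lock while updating, so readers never observe a half-written value:

/* Writer/reader coordination sketch using a POSIX read-write lock. */
#include <pthread.h>
#include <stdio.h>

static pthread_rwlock_t lock = PTHREAD_RWLOCK_INITIALIZER;
static int shared_data = 0;

static void *writer(void *arg) {
    (void)arg;
    pthread_rwlock_wrlock(&lock);   /* exclusive access while writing */
    shared_data = 42;
    pthread_rwlock_unlock(&lock);
    return NULL;
}

static void *reader(void *arg) {
    (void)arg;
    pthread_rwlock_rdlock(&lock);   /* several readers may hold this at once */
    printf("read %d\n", shared_data);
    pthread_rwlock_unlock(&lock);
    return NULL;
}

int main(void) {
    pthread_t w, r1, r2;
    pthread_create(&w, NULL, writer, NULL);
    pthread_create(&r1, NULL, reader, NULL);
    pthread_create(&r2, NULL, reader, NULL);
    pthread_join(w, NULL);
    pthread_join(r1, NULL);
    pthread_join(r2, NULL);
    return 0;
}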

Key Sections of a Synchronized Program

A synchronized program is organized into four important sections (a skeleton follows the list):

 

  • Entry Section: Decides when a process may enter its critical section.
  • Critical Section: Ensures that only one process at a time can access and modify the shared data.
  • Exit Section: Lets the process leave the critical section and signals waiting processes that they may now enter.
  • Remainder Section: Holds the rest of the code, which does not belong to the other sections.
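Put together, these four sections give the following generic skeleton (a C-style pseudocode sketch, independent of any particular synchronization primitive):

do {
    /* entry section: request permission to enter the critical section */

    /* critical section: access and modify the shared data */

    /* exit section: release permission so a waiting process may enter */

    /* remainder section: the rest of the program */
} while (true);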

Race Condition

A race condition occurs when several processes work on the same shared data at the same time. The final value of the data depends on the order in which the processes run, which can lead to inconsistent or incorrect results.

Managing Race Conditions

To avoid race conditions, the critical section must be set up so that only one process at a time can operate on it. Such a section is sometimes called an atomic section, because the operations inside it are indivisible and execute without interruption.

The Critical Section Problem

The critical section problem is about ensuring that shared resources are modified by only one process at a time. Without a properly implemented critical section, several processes can write to the same shared data simultaneously, producing unpredictable behavior.

Conditions for Solving the Critical Section Problem

To solve critical section problems, three essential conditions must be satisfied:

 

  • Mutual Exclusion: Only one process at a time can be inside the critical section.
  • Progress: If no process is in the critical section, a process waiting to enter must be allowed to proceed.
  • Bounded Waiting: Once a process has requested entry, there is a bound on how many times other processes may enter the critical section before that request is granted.
DevOps & Cloud Engineering
Internship Assurance
DevOps & Cloud Engineering

Solutions to the Critical Section Problem

Peterson’s Solution

Peterson’s solution is a classical software-based way of solving the critical section problem for two processes. Each process keeps a flag indicating whether it wants to enter the critical section, and a shared turn variable decides which process may go first when both are interested.

do {
    flag[i] = true;              /* process i is interested in entering */
    turn = j;                    /* give the other process the first turn */
    while (flag[j] && turn == j)
        ;                        /* wait while process j is interested and it is j's turn */

    /* critical section */

    flag[i] = false;             /* process i leaves and is no longer interested */

    /* remainder section */
} while (true);

The solution keeps a flag array, with one entry per process indicating whether that process wants to enter the critical section, together with a shared turn variable. This guarantees mutual exclusion while still letting each process carry on with its other code when it is outside the critical section.

Synchronization Hardware

Some operating systems provide synchronization support at the hardware level, using features such as atomic locking instructions (for example, test-and-set). A process acquires the lock before entering its critical section, which prevents any other process from acquiring it until it is released. As a result, no other process can enter the critical section while one process is already inside it. The lock itself holds one of two values, 0 or 1.
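As a rough illustration, such a hardware-assisted lock can be sketched with C11 atomics, whose atomic_flag behaves like a test-and-set bit (the acquire/release function names here are illustrative, not from the article):

/* Spinlock sketch built on a test-and-set style primitive (C11 atomics). */
#include <stdatomic.h>

static atomic_flag lock = ATOMIC_FLAG_INIT;   /* clear = free, set = held */

void acquire(void) {
    while (atomic_flag_test_and_set(&lock))   /* atomically set the flag and test its old value */
        ;                                     /* spin until the lock was free */
}

void release(void) {
    atomic_flag_clear(&lock);                 /* mark the lock free again */
}

int main(void) {
    acquire();
    /* critical section */
    release();
    return 0;
}

Because a waiting process spins on the flag, locks like this are best held only for very short critical sections.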

Mutex Lock

Mutex locks are a simpler method of process synchronization. The lock is set (acquired) when a process enters the critical section and unset (released) when it exits. This guarantees that only one process at a time can change the shared data.
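A minimal sketch of this idea with a POSIX mutex (the shared counter and the increment function are illustrative, not from the article) could look like this:

/* Mutex lock sketch: only one thread at a time may update the shared counter. */
#include <pthread.h>

static pthread_mutex_t m = PTHREAD_MUTEX_INITIALIZER;
static long counter = 0;

void increment(void) {
    pthread_mutex_lock(&m);     /* lock is set on entering the critical section */
    counter++;                  /* critical section: modify shared data */
    pthread_mutex_unlock(&m);   /* lock is unset on leaving */
}

int main(void) {
    increment();
    return 0;
}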

Semaphores

Semaphores are synchronization variables shared between processes and manipulated through two operations: wait() and signal(). These operations control access to shared resources, letting a process wait for access and notify others when it has released the resource.

 

There are two kinds of semaphores:

  • Binary Semaphores

Binary semaphores can take only one of two values, 0 or 1. They are often called mutex locks because they can guarantee mutual exclusion.

  • Counting Semaphores

Counting semaphores can take any non-negative integer value and are not restricted to 0 and 1. They are used to control access to a resource that allows a limited number of concurrent users.
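For instance, a counting semaphore initialized to 3 can limit a resource to three concurrent users. The following POSIX sketch (the slot count and the worker function are illustrative assumptions) maps wait() and signal() onto sem_wait() and sem_post():

/* Counting semaphore sketch: at most 3 threads use the shared resource at once. */
#include <pthread.h>
#include <semaphore.h>

static sem_t slots;

static void *worker(void *arg) {
    (void)arg;
    sem_wait(&slots);           /* wait(): take one of the 3 available slots */
    /* ... use the shared resource; at most 3 threads are in here at once ... */
    sem_post(&slots);           /* signal(): give the slot back */
    return NULL;
}

int main(void) {
    pthread_t t[5];
    sem_init(&slots, 0, 3);     /* counting semaphore with initial value 3 */
    for (int i = 0; i < 5; i++)
        pthread_create(&t[i], NULL, worker, NULL);
    for (int i = 0; i < 5; i++)
        pthread_join(t[i], NULL);
    sem_destroy(&slots);
    return 0;
}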

 

Also Read: Operating System Interview Questions With Answers

Conclusion

Process synchronization is required to coordinate the many processes of a multi-process system so that access to shared resources in concurrent computing is regulated and predictable. It addresses race conditions and data inconsistency, which are central to preserving data integrity, using mechanisms such as semaphores and Peterson’s solution. Because synchronization often introduces programming complexity and performance overhead, correct implementation and careful control are essential in multi-process systems.

FAQs
What is process synchronization?
Process synchronization is the mechanism by which a multi-process system coordinates the execution of multiple processes and controls their access to shared resources. It seeks to overcome race conditions and other synchronization problems in a concurrent system.

What is a semaphore?
A semaphore is a two-field data type: the first field is a non-negative integer S.V, and the second is a queue of processes S.L. It is used to solve critical section problems through two atomic operations, wait and signal, which synchronize the processes that use it.

Where does the term semaphore come from?
Semaphores were adopted and widely used in the 19th century, when mechanical shutter semaphores, and later hand-held flags, were used for signalling at sea. Flag semaphore is still used during underway replenishment at sea and is acceptable at night when lighted wands replace the flags.

What is a process?
A process is a program in execution, the entity that actually performs computation. It is not the same as the program’s code, although the two are closely related: a process is an ‘active’ entity, whereas a program is a ‘passive’ one.

What is deadlock in an OS?
In an OS, deadlock is a situation in which two or more processes or threads are each waiting for another to release a resource. In general, when a set of processes gets stuck in a way it cannot resolve, the system is said to be in a deadlocked state.
