
What is the difference between concurrency, parallelism, and asynchronous methods?

With parallelism, two tasks are performed in parallel on separate threads. Asynchronous methods, however, are executed "in parallel" but on the same single thread. How is that achieved? And what about concurrency?

What are the differences between these 3 concepts?


Concurrent and parallel are effectively the same principle, as you correctly guess; both refer to tasks running at the same time. I would say that parallel tasks are truly multitasking, literally running "at the same time," whereas concurrent tasks may share a single thread of execution while merely appearing to run in parallel.

Asynchronous methods are not directly related to the previous two concepts. Asynchronicity is used to give the impression of simultaneous or parallel tasking. Effectively, however, an asynchronous method call is usually used on a process that needs to work outside of the current application, and we don't want to wait and block our application for the response.

For example, getting data from a database might take some time, but we don't want our user interface to freeze while waiting for the data. The asynchronous call takes a callback reference and returns execution to your code as soon as the request has been sent to the remote system. Your user interface can stay responsive to the user while the remote system does the necessary processing. Once the data is returned to your callback method, that method can update (or pass that update to) the user interface.

From the user's point of view, it seems like multitasking, but it may not be.
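The callback pattern described above can be sketched in Python (a hypothetical `fetch_record` stands in for the remote/database call; names are illustrative, not from any particular library):

```python
import threading
import time

# Hypothetical stand-in for a slow remote call (e.g. a database query).
def fetch_record(record_id):
    time.sleep(0.1)              # simulate network/database latency
    return {"id": record_id, "name": "example"}

# Start the request on a worker thread and return immediately;
# the callback receives the data once it arrives.
def fetch_async(record_id, callback):
    t = threading.Thread(target=lambda: callback(fetch_record(record_id)))
    t.start()
    return t

results = []
t = fetch_async(42, results.append)
# The "user interface" stays responsive here: we can keep doing other work...
t.join()                         # ...and only block when we truly need the data
print(results[0]["id"])          # → 42
```

Whether the callback runs on a worker thread (as here) or is queued back onto the calling thread is an implementation choice, which is exactly the point made below.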


It's probably worth noting that in many implementations an asynchronous method call will spin up a thread. However, this is not strictly necessary. It really depends on the operation being performed and how the response can be reported back to the system.

In summary,

Concurrency means multiple tasks that start, run, and complete in overlapping periods of time, in no specific order. Parallelism is when multiple tasks, or several parts of a single task, literally run at the same time, e.g. on a multi-core processor.

Remember that concurrency and parallelism are NOT the same thing.

Differences between concurrency and parallelism

Now let's list the notable differences between concurrency and parallelism.

Concurrency is when two tasks can start, run, and complete in overlapping periods of time. Parallelism is when tasks literally run at the same time, e.g. on a multi-core processor.

Concurrency is the composition of independently executing processes, while parallelism is the simultaneous execution of (possibly related) computations.

Concurrency is about dealing with lots of things at once. Parallelism is about doing lots of things at once.

An application can be concurrent, but not parallel. This means it makes progress on more than one task over the same period, but no two tasks are executing at the same instant.

An application can be parallel, but not concurrent. This means it processes several sub-tasks of a single task simultaneously on a multi-core CPU.

An application can be neither parallel nor concurrent, i.e. it processes all tasks one after the other.

An application can be both parallel and concurrent, meaning several tasks are processed simultaneously on a multi-core CPU.
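The "concurrent but not parallel" case can be sketched in Python with cooperatively interleaved generators, everything on one thread (the `task`/`run_concurrently` names are illustrative):

```python
# Two tasks interleaved cooperatively on ONE thread: concurrent, not parallel.
def task(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"      # hand control back after each step

def run_concurrently(*tasks):
    log, queue = [], list(tasks)
    while queue:
        t = queue.pop(0)
        try:
            log.append(next(t))  # advance the task by one step
            queue.append(t)      # round-robin: re-queue the unfinished task
        except StopIteration:
            pass                 # task finished, drop it
    return log

log = run_concurrently(task("A", 2), task("B", 2))
print(log)   # → ['A:0', 'B:0', 'A:1', 'B:1']
```

Both tasks make progress in overlapping periods, yet at no instant do two steps execute at the same time, which is exactly the distinction drawn above.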


Concurrency essentially applies when we are talking about at least two tasks. When an application can make progress on two tasks at virtually the same time, it is known as a concurrent application. Although the tasks appear to run simultaneously, essentially they do not: they use the operating system's CPU time-slicing feature, in which each task performs part of its work and then switches to a waiting state. While the first task is waiting, the CPU is assigned to the second task to complete its part of the work.

The operating system thus allocates the CPU and other computing resources (e.g. memory) based on the priority of the tasks, in turns across all tasks, giving each a chance to complete. To the end user, all tasks appear to run in parallel. This is known as concurrency.


Parallelism does not require two tasks. It literally, physically runs parts of a task OR multiple tasks at the same time, using the multi-core infrastructure of the CPU by assigning one core to each task or sub-task.

Essentially, parallelism requires hardware with multiple processing units. On a single-core CPU you can get concurrency, but NOT parallelism.

Asynchronous methods

This is not related to concurrency and parallelism. Asynchronicity is used to give the impression of concurrent or parallel tasking. In effect, an asynchronous method call is usually used for a process that needs to do work outside the current application, when we don't want to wait and block our application for the response.

Concurrency is when the execution of multiple tasks is interleaved, instead of each task being executed one after the other.

Parallelism is when these tasks are actually running in parallel.

Asynchronicity is a separate concept (though related in some contexts). It refers to the fact that one event may occur at a different time (out of sync) than another event. The difference between synchronous and asynchronous execution can be pictured with actors that correspond to different threads, processes, or even servers.

Everyone has trouble associating asynchronicity with either parallelism or concurrency, because asynchronous is not an antonym of parallel or concurrent. It is an antonym of synchronous. It only indicates whether something (in this case threads) is synchronized with something else (in this case another thread).

There are several scenarios in which concurrency can occur:

Asynchronicity - This means that your program is not performing any blocking operations. For example, it can initiate a request for a remote resource over HTTP and then perform some other task while waiting to receive the response. It's a bit like sending an email and then getting on with your life without waiting for a reply.

Parallelism - This means that your program uses the hardware of multi-core machines to execute tasks at the same time, by breaking the work into tasks, each of which runs on a separate core. It's a bit like singing while you shower: you are actually doing two things at exactly the same time.

Multithreading - This is a software implementation that allows different threads to run at the same time. A multithreaded program appears to be doing several things at the same time, even when running on a single-core computer. This is a bit like chatting with different people through different IM windows. Although you actually switch back and forth, the net result is that you have multiple conversations at the same time.
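The multithreading scenario can be sketched in Python; even on a single core, the two "IM windows" make interleaved progress (the `chat` helper is illustrative):

```python
import threading

messages = []
lock = threading.Lock()

# Each "IM window" posts its messages; the scheduler switches between
# the threads, so the two conversations interleave on the timeline.
def chat(window, count):
    for i in range(count):
        with lock:               # protect the shared list
            messages.append((window, i))

threads = [threading.Thread(target=chat, args=(w, 3)) for w in "AB"]
for t in threads:
    t.start()
for t in threads:
    t.join()

# All six messages arrive; the interleaving order is up to the scheduler.
print(sorted(messages))
```

The net result is two "conversations" held over the same period, whether or not the hardware ever ran two of their steps at the same instant.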


Concurrency means that an application is making progress on several tasks over the same period of time. If the computer has only one CPU, the application may not progress on more than one task at exactly the same instant, but more than one task is being processed inside the application at the same time. One task does not need to run to completion before the next one begins.


Parallelism means that an application divides its tasks into smaller sub-tasks that can be processed in parallel, for example on several CPUs at the same time.

Concurrency vs. parallelism in detail

As you can see, concurrency is about how an application handles the multiple tasks it works on. An application can process one task at a time (sequentially) or work on several tasks at the same time (concurrently).

Parallelism, on the other hand, is about how an application handles each individual task. An application can process a task serially from start to finish, or it can split the task into subtasks that can be completed in parallel.

As you can see, an application can run concurrently, but not in parallel. This means that multiple tasks are processed at the same time, but the tasks are not divided into sub-tasks.

An application can also be parallel, but not concurrent. This means the application works on only one task at a time, and that task is divided into sub-tasks that can be processed in parallel.

In addition, an application can be neither concurrent nor parallel. This means that only one task is processed at a time, and the task is never divided into sub-tasks for parallel execution.

Finally, an application can also be both concurrent and parallel, in that it processes several tasks at the same time and also divides each task into sub-tasks for parallel execution. However, some of the benefits of concurrency and parallelism can be lost in this scenario, because the CPUs in the computer are already kept reasonably busy by either concurrency or parallelism alone. Combining them may bring only a small performance gain, or even a performance loss. Make sure you analyze and measure before adopting a concurrent, parallel model blindly.

From http://tutorials.jenkov.com/java-concurrency/concurrency-vs-parallelism.html

Parallel: It is a broad term meaning that two pieces of code run "at the same time." It doesn't matter whether it's "real" parallelism or whether it's faked through a clever design pattern. The point is that you can start the "tasks" at the same time and then control them separately (with mutexes and all the appropriate tricks). Usually, however, you prefer to use the word "parallel" only for "real" parallelism, as in: made possible by non-cooperative multitasking (whether through CPU/GPU cores, or purely in software by letting the operating system manage it at a very low level). People are reluctant to say "parallel" for complicated sequential code that merely simulates parallelism, like the JavaScript in a browser window, for example. Hence why people in this thread say "asynchronous has nothing to do with parallelism." Well, just don't confuse them.

Concurrent: There cannot be concurrency without parallelism (whether simulated or real, as explained above), but this term specifically focuses on the fact that at some point the two systems will try to access the same resource concurrently. It puts the emphasis on the fact that you have to deal with that.

Asynchronous: Everyone is right to say that asynchronous has nothing to do with parallelism, but it paves the way to get there (the burden is on you to make things parallel or not - read on).

"Asynchronous" refers to a presentation of parallelism that formalizes the three basic things normally involved in parallelism: 1) defining the task's initialization (say, when it starts and what parameters it receives), 2) what must be done after it finishes, and 3) what the code should do in between.

But it's still just syntax (usually presented as a callback method). Behind the scenes, the underlying system may simply decide that these so-called "tasks" are just snippets of code to pile up until the code currently executing has finished. Then they are unstacked one at a time and run one after the other. Or not: a thread per task could be created and run in parallel. Who cares? That part is not included in the concept ;)

There is a bit of semantics to clarify here:

Concurrency or parallelism is a matter of resource contention, whereas asynchrony is a matter of control flow.

Different procedures (or their constituent operations) are called asynchronous when the order of their processing is not deterministic; in other words, there is a chance that any of them could be processed at a given time T. By definition, multiple processors (e.g. CPUs or people) make it possible to process several of them at the same time; on a single processor their processing is interleaved (e.g. threads).

Asynchronous procedures or operations are said to be concurrent when they share resources; concurrency is the definite possibility of contention at any given time T. Parallelism is trivially guaranteed when no resources are shared (e.g. different processors and memory); otherwise, concurrency control must be addressed.

Therefore, an asynchronous procedure or operation can be processed in parallel or concurrently with others.

Concurrency means performing multiple tasks at a time, but not necessarily simultaneously. When you have to do more than one task but have only one resource, we go for concurrency. In a single-core environment, concurrency is achieved by context switching.

Parallelism is like doing multiple tasks simultaneously, for example singing and bathing together: now you are doing the tasks in parallel.

Asynchrony relates to how tasks are executed: in the asynchronous model, while one task is executing, you can switch to another task without waiting for the previous one to complete.

Asynchronous programming helps us achieve concurrency. Asynchronous programming in a multi-threaded environment is one way to achieve parallelism.

"Sync and async are programming models. Concurrent and parallel are ways tasks are executed ...". Source: https://medium.com/better-programming/sync-vs-async-vs-concurrent-vs-parallel-5754cdb60f66

In other words, sync and async describe how your program behaves when a function call is made (will it wait, or will it keep executing?), while concurrent and parallel describe how a function (task) will be executed (concurrent = possibly running at the same time; parallel = effectively running at the same time).
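The "will it wait or keep executing?" distinction can be illustrated with Python's asyncio, timing a pair of sequential awaits against an overlapped pair (the delay values are arbitrary stand-ins for remote calls):

```python
import asyncio
import time

# Stand-in for a non-blocking remote call.
async def request(name, delay):
    await asyncio.sleep(delay)
    return name

async def main():
    # Sync style: await each call in turn; total time ≈ 0.1 + 0.1
    start = time.monotonic()
    await request("a", 0.1)
    await request("b", 0.1)
    sequential = time.monotonic() - start

    # Async style: start both, then wait; total time ≈ max(0.1, 0.1)
    start = time.monotonic()
    await asyncio.gather(request("c", 0.1), request("d", 0.1))
    overlapped = time.monotonic() - start
    return sequential, overlapped

sequential, overlapped = asyncio.run(main())
print(overlapped < sequential)   # → True: the second pair of calls overlapped
```

Both pairs run on the same single thread; only the control flow differs, which is why async is a programming model and not parallelism.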

CONCURRENCY VS. PARALLELISM: Concurrency executes only one task at any given point in time (example: a single-core CPU). Parallelism lets us execute multiple tasks at once (example: a dual-core or multi-core processor).

Here I explain with some examples


A GPU uses parallel processing to run the same block of code (a kernel) on thousands of physical and logical threads. Ideally, the process starts and ends for all threads at the same time. A single CPU core without hyperthreading cannot do parallel processing.

Note: I said ideally, because running a kernel with 7 million calls on hardware with 6 million threads would have to run the same code twice, in parallel across all 6 million threads, consuming them each time.

  • One kernel (a piece of code) runs on multiple processors
  • simultaneously
  • with a single execution sequence (a kernel must do the same thing in all threads, so avoid "branches" or "ifs", since they drain resources drastically by generating lots of NOPs (no-operations) to keep all threads in sync)
  • drastically increases speed
  • drastically restricts what you can do
  • depends heavily on the hardware

Note: Parallel processing is not limited to the GPU.


A web service receives many small requests in real time and must handle each of these requests differently, at any moment, and independently of the other requests or of internal jobs. At the same time, you want the web service to stay up and running at all times without compromising data state or system health.

Imagine one user updating a record and another user deleting the same record at the same time.

  • Many tasks are executed
  • in real time (or whenever a request comes in)
  • with different execution sequences (unlike the kernel in parallel processing, concurrent tasks can do different things, and you will most likely need to queue or prioritize them)
  • essentially improves the average response time, because task 2 does not have to wait for task 1 to complete
  • essentially sacrifices computing time, because many tasks run at the same time while only limited resources are available
  • must manage shared resources properly to avoid deadlocks and data corruption

Note: These requests usually consume some essential resources such as memory, database connections, or bandwidth. Yet you want the web service to be responsive at all times. Asynchronicity is the key to responsiveness, not parallelism.
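The update-vs-delete race described above can be sketched in Python with a lock guarding a hypothetical in-memory record store (names are illustrative):

```python
import threading

# Hypothetical in-memory "record store" a web service might guard.
records = {1: {"name": "Alice"}}
lock = threading.Lock()

def update(record_id, field, value):
    with lock:                       # one writer at a time
        if record_id in records:     # the record may already be deleted
            records[record_id][field] = value

def delete(record_id):
    with lock:
        records.pop(record_id, None)

# One user updates while another deletes the same record.
t1 = threading.Thread(target=update, args=(1, "name", "Bob"))
t2 = threading.Thread(target=delete, args=(1,))
t1.start(); t2.start()
t1.join(); t2.join()

# Whichever order the scheduler chose, the store ends up consistent:
# the delete always runs, so the record is gone and nothing is corrupted.
print(records)                       # → {}
```

Without the lock (and the existence check), the two requests could interleave mid-operation and corrupt or resurrect the record, which is the deadlock/data-corruption concern listed above.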


A heavy process (such as an I/O operation) can block the GUI when it runs on the GUI thread. To keep the user interface responsive, a heavy process can be run asynchronously. It is also better to run similar asynchronous operations one after another: for example, multiple I/O-bound operations can be significantly slower when run at the same time, so it is better to queue them up.

  • A task, or a batch of tasks, runs on another thread
  • once
  • If it's a single task, there is no sequence, so you either wait for it to finish or fire-and-forget
  • If it's a batch of tasks, you can fire them all and forget, wait for all of them to complete, or handle each task as it finishes
  • essentially reduces raw performance because of the overhead
  • provides responsiveness to the other thread (effective against blocking the UI thread or other critical threads)

Note: An asynchronous operation that runs more than once at a time (i.e. concurrently) is a concurrent operation.

Note: Concurrency and asynchronicity are often confused with one another. Concurrency refers to different parts of the system working together without interfering with one another (such problems are often solved with locks, semaphores, or mutexes). Asynchronicity is how you achieve responsiveness (e.g. via threading).

* Note: Asynchronicity and multithreading are often confused with one another. Asynchronous code doesn't necessarily involve a new thread; it can be a hardware operation or, as Stephen Cleary calls it, a pure operation.

Example: WPF + C# code that runs a heavy operation asynchronously solves a responsiveness (asynchronicity) problem, not a parallelism problem.
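The C# snippet itself is not reproduced here; as a stand-in, here is a Python asyncio sketch of the same idea: several operations overlap without any additional thread being created (cf. the note that asynchronous code doesn't necessarily involve a new thread):

```python
import asyncio
import threading

# A non-blocking wait: the event loop regains control; no thread is spawned.
async def fetch(name):
    await asyncio.sleep(0.05)
    return name

async def main():
    before = threading.active_count()
    # Three "requests" overlap in time on the one and only thread.
    results = await asyncio.gather(fetch("a"), fetch("b"), fetch("c"))
    after = threading.active_count()
    return results, before, after

results, before, after = asyncio.run(main())
print(results)                   # → ['a', 'b', 'c']
print(after == before)           # → True: three overlapping waits, one thread
```

This is asynchrony solving a responsiveness problem with no parallelism at all: the thread count never changes, yet no call blocks the others.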

I'll make it short and interesting to wrap your head around these concepts.

Concurrent vs. parallel - Ways in which tasks are carried out.

Take a real-life example: there is a challenge that requires you both to eat a whole huge cake and to sing a whole song. You win if you are the fastest to sing the whole song and finish the cake. So the rule is that you sing and eat concurrently. How you do that is not part of the rule: you can eat the whole cake and then sing the whole song, or eat half the cake, then sing half the song, then alternate again, and so on.

Parallelism is a specific kind of concurrency where tasks really are executed simultaneously. In computer science, parallelism can only be achieved in multicore environments.

Synchronous vs. asynchronous - programming models.

With synchronous code, you write steps that execute in order, from top to bottom. In an asynchronous programming model, you write code as tasks, which are then executed concurrently. Executing concurrently means that all the tasks are likely to run at the same time.
