
How to Send Large Data to WebWorkers Efficiently

Offloading heavy computations to Web Workers to unblock the main thread is an appealing idea for any web developer. However, sending large data to Web Workers can itself become a bottleneck and cause performance issues. In this blog post, we will explore how to send large data to Web Workers efficiently.

Serialization and Deserialization

When you use worker.postMessage() to send large data to a Web Worker, the structured clone algorithm first serializes the data. The structured clone algorithm performs a deep copy, producing a new object with the same properties and values as the original. When the Web Worker receives the message, the data is automatically deserialized using the same algorithm.

It is important to note that serializing and deserializing data can be expensive and blocking operations, particularly for large or complex objects. To optimize performance, you should try to minimize the amount of data that the main thread and Web Worker need to send between each other.
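As a sketch of that idea (the object shape and field names here are made up for illustration): if the worker only consumes one field of a larger state object, posting just that field avoids cloning everything else.

```javascript
// Hypothetical example: the worker only needs the `points` array, so send
// just that field instead of the entire application state.
const appState = {
  points: Array.from({ length: 1000 }, (_, i) => i),
  ui: { theme: 'dark', openDialogs: [] }, // unrelated state the worker never reads
};

// worker.postMessage(appState);                    // clones everything
// worker.postMessage({ points: appState.points }); // clones only what is needed
```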

Use Patches

If your use case involves repeatedly sending the same large object (or sending it back and forth) with only a few changes each time, it is better to send the entire object only the first time. After that, send only the changes (also known as patches) instead of the entire object.

To generate the patches when changes are made to the object, you can use libraries like immer, as follows:

import { produce, enablePatches } from 'immer';

// Patch generation must be enabled once before produce() can report patches
enablePatches();

const largeData = {...};
const nextData = produce(largeData, draft => {
  // make changes on the draft here
}, (patches, inversePatches) => {
  // Patches describing the changes are generated; send them to the worker
  worker.postMessage(patches);
});
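On the receiving side, immer ships an `applyPatches` helper that replays the patches onto the worker's copy of the object. The minimal sketch below hand-rolls only the 'replace' case to show what applying a patch involves; the patch shape follows immer's `{ op, path, value }` format, and `applyReplacePatches` is a made-up name for illustration.

```javascript
// A minimal sketch of applying immer-style patches in the worker
// (in practice you would use immer's `applyPatches` helper instead).
function applyReplacePatches(target, patches) {
  for (const patch of patches) {
    if (patch.op !== 'replace') continue; // sketch handles only 'replace'
    let obj = target;
    const path = patch.path;
    // Walk to the parent of the changed property, then overwrite it
    for (let i = 0; i < path.length - 1; i++) obj = obj[path[i]];
    obj[path[path.length - 1]] = patch.value;
  }
  return target;
}

// Example: the worker keeps its own copy and applies incoming patches
const workerCopy = { user: { name: 'Ada' }, count: 1 };
applyReplacePatches(workerCopy, [
  { op: 'replace', path: ['user', 'name'], value: 'Grace' },
  { op: 'replace', path: ['count'], value: 2 },
]);
```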


Use Transferable Objects

Transferable objects are objects that can be transferred to another context, such as a Web Worker, without being copied. This allows us to send large data between the main thread and a Web Worker without having to serialize it.

But there is a catch: only a few types of objects are transferable, and once an object is transferred, it can no longer be accessed by the sender.

If you work with binary data, you can and should take advantage of transferable objects, such as ArrayBuffer. You can transfer an ArrayBuffer like this:

// A large binary buffer (~100 MB) in the main thread
const data = new Uint8Array(100_000_000);
const worker = new Worker("worker.js");
// The second argument lists the buffers to transfer instead of copy
worker.postMessage({ data }, [data.buffer]);

// In the Web Worker
onmessage = (event) => {
  const { data } = event.data;
  // use the data here
};
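The "can no longer be accessed by the sender" part can be seen directly: after the transfer, the sender's buffer is detached and its length drops to zero. The sketch below demonstrates this with the global `structuredClone`, which accepts the same transfer list as `postMessage` (assuming a runtime that provides it, i.e. modern browsers or Node 17+).

```javascript
// Demonstrating detachment with structuredClone, which uses the same
// transfer semantics as postMessage's transfer list.
const data = new Uint8Array(16);
console.log(data.byteLength); // 16 before the transfer

const clone = structuredClone(data, { transfer: [data.buffer] });
console.log(data.byteLength);  // 0: the sender's buffer is now detached
console.log(clone.byteLength); // 16: the receiver owns the memory
```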

Use SharedArrayBuffer

SharedArrayBuffer is like ArrayBuffer, with a few key differences:

  • It is not transferable, so it is sent using postMessage like any other non-transferable message. The structured clone algorithm handles it internally, but the underlying memory block is shared rather than copied.
  • After it is sent to a Web Worker, the same SharedArrayBuffer can be accessed by multiple threads simultaneously.
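The sharing in the second point can be sketched as follows: two typed-array views over the same SharedArrayBuffer see each other's writes, and Atomics makes those reads and writes safe across threads. (The worker side is shown inline here for brevity; in a real app the second view would live in the worker script.)

```javascript
// Two views over the same SharedArrayBuffer observe each other's writes
const sab = new SharedArrayBuffer(4);
const mainView = new Int32Array(sab);

// Main thread writes a value...
Atomics.store(mainView, 0, 42);
// worker.postMessage(sab); // the worker receives a handle to the SAME memory

// ...and a view created over the received buffer sees that write:
const workerView = new Int32Array(sab);
console.log(Atomics.load(workerView, 0)); // 42
```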

To use SharedArrayBuffers, there are a few security requirements that your app should meet. For top-level documents, two headers need to be set to cross-origin isolate your site:

Cross-Origin-Opener-Policy: same-origin
Cross-Origin-Embedder-Policy: require-corp

To check whether cross-origin isolation has been successful, you can read the crossOriginIsolated property, which is available in both window and worker contexts.
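A sketch of that check (the `typeof` guard keeps it safe in environments that do not expose the property at all):

```javascript
// crossOriginIsolated is a read-only boolean on window and worker scopes
const isolated = typeof crossOriginIsolated === 'boolean' && crossOriginIsolated;

if (isolated) {
  // COOP/COEP are set correctly; SharedArrayBuffer can be constructed here
} else {
  // The headers above are missing, so SharedArrayBuffer may be unavailable
}
console.log(isolated);
```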

Conclusion

As you may have noticed, there aren’t many options for non-binary format data at present, but you needn’t worry too much unless you’re sending more than 100KB. If you’re sending JSON data, a simple JSON.stringify on the sender side and JSON.parse on the receiver side should suffice.
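A sketch of that approach (the payload shape is made up for illustration):

```javascript
// Sender side: serialize once, then post the resulting string
const payload = { type: 'sum', values: [1, 2, 3] };
const wire = JSON.stringify(payload);
// worker.postMessage(wire);

// Receiver side (inside the worker's onmessage handler): parse it back
const received = JSON.parse(wire);
console.log(received.values.length); // 3
```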

On the other hand, if you work with binary data, you can leverage transferable objects to send large data to Web Workers with virtually no copying overhead.
