4 min read | Saved February 14, 2026
Do you care about this?
Transformers.js v4 is now available on NPM, making installation easier. The new version features a rewritten WebGPU runtime for improved performance and offline support, along with a modular structure and a separate tokenizers library for better usability.
If you do, here's more
Transformers.js v4 is now available on NPM after almost a year of development; it can be installed with `npm i @huggingface/transformers@next`. This version ships a new WebGPU runtime, rewritten in C++, which significantly improves performance across roughly 200 model architectures. The runtime targets a wide range of JavaScript environments, so models run both in browsers and in server-side runtimes such as Node and Deno. The update also enables offline use by caching WASM files in the browser.
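Offline support comes down to a cache-first loading pattern: check a local cache for the WASM binary and only go to the network on a miss. Here is a minimal, runnable sketch of that idea; the `loadArtifact`/`fetchFromNetwork` names and the Map-backed cache are invented for illustration (a browser would use the Cache API and a real `fetch()`), not the library's actual code:

```javascript
// Stand-in for remote artifacts (in reality: WASM files served over HTTP).
const remote = new Map([["ort-wasm.wasm", Uint8Array.of(0x00, 0x61, 0x73, 0x6d)]]);
let networkFetches = 0;

// Stand-in for fetch(): returns the artifact bytes and counts each call.
function fetchFromNetwork(name) {
  networkFetches += 1;
  const bytes = remote.get(name);
  if (!bytes) throw new Error(`not found: ${name}`);
  return bytes;
}

// Cache-first loader: a plain Map keeps this sketch runnable in any JS
// environment; the browser would persist entries so later visits work offline.
const cache = new Map();
function loadArtifact(name) {
  if (cache.has(name)) return cache.get(name); // served locally, no network
  const bytes = fetchFromNetwork(name);        // cache miss: fetch once
  cache.set(name, bytes);
  return bytes;
}

const first = loadArtifact("ort-wasm.wasm");
const second = loadArtifact("ort-wasm.wasm");
console.log(networkFetches, first === second); // → 1 true
```

Only the first load touches the network; every subsequent load, including after going offline, is served from the cache.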
The codebase has undergone a major restructuring. The repository has moved to a monorepo managed with pnpm workspaces, making it easier to maintain sub-packages for specific use cases. The sprawling models.js file has been broken down into smaller, focused modules, improving maintainability, and examples have been moved into a separate repository where they are easier to find. The build system has migrated from Webpack to esbuild, cutting build times from 2 seconds to 200 milliseconds and shrinking bundle sizes by about 10%.
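A pnpm monorepo is declared with a `pnpm-workspace.yaml` at the repository root. The fragment below shows the general shape of such a file; the glob and package layout are illustrative assumptions, not the repository's actual structure:

```yaml
# pnpm-workspace.yaml — every directory matching these globs
# becomes a workspace package with its own package.json.
packages:
  - "packages/*"   # e.g. the core library and a tokenizers package as siblings
```

Workspace packages can then depend on one another with pnpm's `workspace:` protocol in their `package.json` (for example, `"workspace:*"` as a version range), so local sub-packages are linked during development instead of pulled from the registry.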
Transformers.js v4 introduces a range of new models, including GPT-OSS and FalconH1, all compatible with WebGPU hardware acceleration. A standalone Tokenizers.js library has been launched, providing lightweight, type-safe tokenization. Quality-of-life improvements include stronger type safety and better logging around model execution. The library can now handle larger models: GPT-OSS, with 20 billion parameters, reaches about 60 tokens per second on specific hardware. The development team credits the ONNX Runtime team and other collaborators for helping bring this release to fruition.
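At its core, a tokenizer maps text to a sequence of integer ids over a fixed vocabulary. The sketch below shows that mapping with a greedy longest-match loop; the vocabulary and function name are invented for the example, and real tokenizers use trained BPE or WordPiece rules rather than a hand-written table — this is not Tokenizers.js code:

```javascript
// Toy vocabulary: token string → integer id.
const vocab = new Map([
  ["transform", 0], ["trans", 1], ["form", 2], ["ers", 3], ["er", 4], ["s", 5],
]);

// Greedy longest-match tokenization: at each position, take the longest
// vocabulary entry that matches, emit its id, and advance past it.
function tokenize(text) {
  const ids = [];
  let i = 0;
  while (i < text.length) {
    let matched = false;
    for (let len = text.length - i; len > 0; len--) {
      const piece = text.slice(i, i + len);
      if (vocab.has(piece)) {
        ids.push(vocab.get(piece));
        i += len;
        matched = true;
        break;
      }
    }
    if (!matched) throw new Error(`no token for input at position ${i}`);
  }
  return ids;
}

console.log(tokenize("transformers")); // → [ 0, 3 ]
```

Note the greedy choice: `"transform"` wins over `"trans"` + `"form"` because the longest match is tried first, which is why `"transformers"` becomes two tokens rather than three.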