Lazy Panda



Best Practices for NuxtJS Bundle Optimization and Performance

NuxtJS uses webpack-bundle-analyzer, which produces an interactive, zoomable treemap to help you visualize your bundles and optimize them. To enable it and generate a bundle report, we just need to add the following script to our package.json:

"analyze": "nuxt-ts build --analyze"
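In context, the scripts section of package.json might look like this (the dev and build entries are assumptions for illustration; only the analyze entry matters here):

```json
{
  "scripts": {
    "dev": "nuxt-ts",
    "build": "nuxt-ts build",
    "analyze": "nuxt-ts build --analyze"
  }
}
```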

For more information, see the NuxtJS build configuration documentation.

Now if we run npm run analyze, we get warning messages in the console:

WARNING in asset size limit: The following asset(s) exceed the recommended size limit.

Webpack recommends that bundles not exceed 244 KiB, and our vendor bundle is currently far too big at 1.19 MiB.

Let’s look at the generated report:

Our second-heaviest package is lodash.js at 88.47 KB. If we look at how lodash.js is used in our codebase, in the src/services/configuration.service.ts file we find the following import:

import { get, cloneDeep } from 'lodash'


We can improve the usage of lodash.js by importing just the functions that we need, like this:

import get from 'lodash/get'

import cloneDeep from 'lodash/cloneDeep'


Now if we run npm run analyze again with the lodash changes, we see the following report:

We can see that our vendor package is now 383 KiB instead of 449 KiB, and our lodash chunk is only 22.61 KiB. If there are other libraries whose size we can reduce by importing just the parts we use instead of the whole library, we should do so to keep our bundle as small as possible.


Bundle splitting

The idea behind bundle splitting is pretty simple. If you have one giant file and change one line of code, the user must download that entire file again. But if you split it into two files, the user only needs to download the one that changed, and the browser serves the other file from its cache.
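This caching only works if unchanged chunks keep the same filenames across builds, which Nuxt 2 handles in production by hashing filenames based on content. As a sketch of making that explicit via the build.filenames option (the hash length here is arbitrary):

```javascript
// nuxt.config.js (sketch): content-based hashes mean a chunk's filename
// changes only when its content changes, so unchanged chunks stay cached
export default {
  build: {
    filenames: {
      app: ({ isDev }) => (isDev ? '[name].js' : '[contenthash:7].js'),
      chunk: ({ isDev }) => (isDev ? '[name].js' : '[contenthash:7].js'),
    },
  },
}
```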

It’s worth mentioning that since bundle splitting is all about caching, it makes no difference to first-time visitors.

Since our vendor package still exceeds the recommended size, let’s split it.


Splitting out each npm package

We can split our vendor package into one chunk per npm package by adding the following code to the nuxt.config.js file:

build: {

  optimization: {

    runtimeChunk: 'single',

    splitChunks: {

      chunks: 'all',

      maxInitialRequests: Infinity,

      minSize: 0,

      cacheGroups: {

        vendor: {

          test: /[\\/]node_modules[\\/]/,

          name(module) {

            // get the name. E.g. node_modules/packageName/not/this/part.js

            // or node_modules/packageName

            const packageName = module.context.match(
              /[\\/]node_modules[\\/](.*?)([\\/]|$)/
            )[1]

            // npm package names are URL-safe, but some servers don't like @ symbols

            return `npm.${packageName.replace('@', '')}`

          },

        },

      },

    },

  },

},


Set up Explanation

Webpack has some clever defaults that aren’t so clever for our purposes: a maximum of 3 initial files when splitting the output, and a minimum chunk size of 30 KB (all smaller chunks would be merged together). So we override these with maxInitialRequests: Infinity and minSize: 0.

cacheGroups is where we define rules for how Webpack should group chunks into output files. We have one here called ‘vendor’ that will be used for any module loaded from node_modules. Normally, you would just define the name of the output file as a string. But here we define the name as a function (which will be called for every parsed file) and return the name of the package derived from the module’s path. As a result, we get one file for each package, e.g. npm.react-dom.899sadfhj4.js.
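To see what the name() function produces, here is a standalone sketch of its logic, using made-up module paths:

```javascript
// Standalone sketch of the cacheGroups name() logic: derive a chunk name
// from a module's path under node_modules (the example paths are made up)
const chunkNameFor = (context) => {
  const packageName = context.match(/[\\/]node_modules[\\/](.*?)([\\/]|$)/)[1]
  // npm package names are URL-safe, but some servers don't like @ symbols
  return `npm.${packageName.replace('@', '')}`
}

console.log(chunkNameFor('/app/node_modules/lodash/get.js')) // npm.lodash
console.log(chunkNameFor('/app/node_modules/@nuxt/axios/dist/index.js'))
// npm.nuxt (the lazy match stops at the first slash)
```

Note that scoped packages such as @nuxt/axios end up grouped by their scope with this regex, which may or may not be what you want.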

This whole setup is great because it’s set-and-forget: no maintenance is required, since we never refer to any package by name.

Questions about splitting the bundle


1 - Isn’t it slower to have lots of network requests?

No. This used to be the case back in the days of HTTP/1.1, but it is not the case with HTTP/2.

That said, this post from 2016 and Khan Academy’s post from 2015 both reached the conclusion that even with HTTP/2, downloading too many files was still slower. But in both of these posts, ‘too many’ files meant ‘several hundred’. So just keep in mind that if you’ve got hundreds of files, you might start hitting concurrency limits.


2 - Won’t I lose out on compression by having multiple small files?

Yes: more files means less compression, because gzip can only deduplicate repeated content within a single file.

So we may need to quantify how much of our bundle is actually worth splitting.