SIGKILL or Out of Memory (OOM) errors on Vercel generally point towards memory-related issues, often resulting from the limitations of the Vercel build container. If you’re encountering this error, don’t worry — with a few tweaks and optimizations, you can prevent these errors from surfacing. While this guide primarily provides recommendations for Next.js, strategies and techniques for diagnosing memory issues in other frameworks are also included.
Why builds fail due to memory issues
Each Vercel build container is allocated 8192 MB of memory. When your project overruns this, a SIGKILL or OOM error may be thrown. Memory can be consumed by various project elements, including code, dependencies, assets, or processes running during build time.
Enterprise customers seeking increased memory allocation can purchase Enhanced Build Machines.
Common causes of memory issues include:
- Large number of dependencies: Large-scale projects, or ones loaded with numerous dependencies, can easily consume memory.
- Large data handling: Massive datasets or high-resolution assets naturally use more memory during processing.
- Inefficient code: Code that inadvertently creates many objects, or that doesn’t free up memory, can rapidly eat up resources.
- External services: Overuse or poor optimization of calls to third-party services during build.
Reducing memory overhead in Next.js
1. Update your Next.js version
We recommend running Next.js v14.1.3 or higher. Newer Next.js versions come with performance improvements and optimizations to reduce memory overhead.
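For example, with npm (use the equivalent pnpm or yarn command if that is your package manager):
# upgrade Next.js to the latest release in your project
npm install next@latest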
2. Run Webpack compilations in separate workers
The Webpack build worker allows you to run Webpack compilations inside a separate Node.js worker, which decreases the memory usage of your application during builds.
Starting in v14.1.0, this option is enabled by default if your application does not have a custom Webpack configuration.
If you are using an older version of Next.js, or you have a custom Webpack configuration, you can enable an experimental flag inside your next.config.js, which forces the Webpack compiler to run in a separate worker rather than in the main process itself (this feature may not be compatible with all custom Webpack plugins):
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    webpackBuildWorker: true,
  },
}

export default nextConfig
You can also disable the webpack cache entirely by adding a custom Webpack configuration to your application:
/** @type {import('next').NextConfig} */
const nextConfig = {
  webpack: (
    config,
    { buildId, dev, isServer, defaultLoaders, nextRuntime, webpack }
  ) => {
    if (config.cache && !dev) {
      // Swap the persistent cache for an in-memory cache during production builds
      config.cache = Object.freeze({
        type: 'memory',
      })
    }
    // Important: return the modified config
    return config
  },
}

export default nextConfig
3. Run next build with --experimental-debug-memory-usage
From Next.js v14.2.0 onwards, you can run next build --experimental-debug-memory-usage to run the build in a mode where Next.js will print out information about memory usage continuously throughout the build, such as heap usage and garbage collection statistics. Heap snapshots will also be taken automatically when memory usage gets close to the configured limit.
This feature is not compatible with the Webpack build worker option, which is enabled automatically unless you have a custom Webpack configuration.
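A minimal invocation, assuming your project is on Next.js v14.2.0 or later and uses the locally installed next binary:
# print heap usage and garbage collection statistics while the build runs
npx next build --experimental-debug-memory-usage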
4. Record a heap profile
To look for memory issues, you can record a heap profile from Node.js and load it in Chrome DevTools to identify potential sources of memory leaks.
In your terminal, pass the --heap-prof flag to Node.js when starting your Next.js build:
node --heap-prof node_modules/next/dist/bin/next build
At the end of the build, a .heapprofile file will be created by Node.js.
In Chrome DevTools, you can open the Memory tab and click on the “Load Profile” button to visualize the file.
5. Analyze a snapshot of the heap
You can use an inspector tool to analyze the memory usage of the application.
When running the next build or next dev command, add NODE_OPTIONS=--inspect to the beginning of the command. This will expose the inspector agent on the default port. If you wish to break before any user code starts, you can pass --inspect-brk instead. While the process is running, you can use a tool such as Chrome DevTools to connect to the debugging port to record and analyze a snapshot of the heap to see what memory is being retained.
While running in this mode, you can send a SIGUSR2 signal to the process at any point, and the process will take a heap snapshot.
The heap snapshot will be saved to the project root of the Next.js application and can be loaded in any heap analyzer, such as Chrome DevTools, to see what memory is retained. This mode is not yet compatible with Webpack build workers.
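As a sketch, starting the build with the inspector enabled and then requesting a snapshot from a second terminal might look like this (replace <pid> with the process ID of the build, for example from ps or pgrep):
# expose the inspector agent on the default port (9229) during the build
NODE_OPTIONS=--inspect next build

# from another terminal: ask the running build process to write a heap snapshot
kill -USR2 <pid>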
See how to record and analyze heap snapshots for more information.
Reducing memory overhead in other frameworks
1. Reduce number of dependencies
Redundant or heavy dependencies can quietly add significant memory usage to a build, especially in projects that have grown and evolved over time.
For Next.js users, the Bundle Analyzer can help you investigate large dependencies in your application that could be removed to improve performance and memory usage.
Other methods to diagnose problematic dependencies include:
- The node_modules directory can grow substantially, sometimes including unnecessary or deprecated packages. pnpm list, npm ls, or yarn list will show a tree of your installed packages and their dependencies.
- Consider using npm-check or depcheck to identify unused or missing dependencies.
- Some libraries are heavy for their functionality. Sites like Bundlephobia can show the footprint of npm packages. Look for lighter alternatives when possible.
- Ensure you aren’t including multiple versions or duplicate dependencies in your project. Use pnpm dedupe, npm dedupe, or yarn dedupe to help identify instances of this.
- Keep your dependencies up-to-date, as newer versions might have optimizations. Use pnpm outdated, npm outdated, or yarn outdated to identify and then update outdated dependencies (example commands follow this list).
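For example, with npm (the pnpm and yarn equivalents behave the same way), a quick dependency audit might look like this:
# show the full tree of installed packages and their dependencies
npm ls --all

# report dependencies that appear unused or missing (depcheck is a third-party tool)
npx depcheck

# collapse duplicate packages where version ranges allow it
npm dedupe

# list installed packages with newer versions available
npm outdated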
2. Optimize images and assets
Large assets, especially high-resolution images, play a significant role in the overall memory consumption of a project during the build process. When these assets are processed, converted, or optimized as part of the build pipeline, they demand a large share of the available memory. This is particularly true for web applications that employ real-time image processing or transformations during the build.
To reduce memory overhead caused by images and assets:
- Reduce file sizes using tools like ImageOptim to manually compress images without a noticeable quality loss.
- Integrate image compression tools into your build process. Libraries like imagemin can be used with plugins tailored for specific image types (JPEG, PNG, SVG, etc.).
- Consider using modern web formats, such as WebP, for better compression than older formats (see the example commands after this list).
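As a rough sketch (the globs, quality setting, and file names here are placeholders to adapt to your project):
# batch-compress JPEG and PNG files with the imagemin CLI
npx imagemin-cli "images/*.{jpg,png}" --out-dir=images/optimized

# convert an image to WebP with cwebp from libwebp, at quality 80
cwebp -q 80 hero.png -o hero.webp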
3. Invalidate your build cache
Clearing your Vercel Deployment’s build cache can sometimes alleviate these errors if the build cache has become excessively large or corrupt. There are a few different ways to clear your project’s build cache:
- Use the Redeploy button for the specific deployment in the Project’s Deployments page. In the popup window that follows, leave the Use existing Build Cache checkbox unchecked.
- Use vercel --force with Vercel CLI (see the example after this list).
- Use the Environment Variable VERCEL_FORCE_NO_BUILD_CACHE with a value of 1.
- Use the Environment Variable TURBO_FORCE with a value of true on your project to skip Turborepo Remote Cache.
- Use the forceNew optional query parameter with a value of 1 when creating a new deployment with the Vercel API.
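For the CLI route, assuming the Vercel CLI is installed and you are logged in to the right scope, a forced deployment looks like this:
# create a new deployment without reusing the existing build cache
vercel --force

# the same, but targeting production
vercel --prod --force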
4. Analyze bundles
Bundles can also inadvertently increase memory overhead. During build time, tasks like transpiling, code splitting, and source map generation can intensify memory demand. You can use tools like webpack-bundle-analyzer to generate visualizations of what’s in your webpack bundle.
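For a webpack-based project, one way to do this is to emit a stats file and open it in the analyzer (the stats file name below is arbitrary):
# write build statistics to a JSON file
npx webpack --profile --json=stats.json

# open an interactive treemap of the modules in each bundle
npx webpack-bundle-analyzer stats.json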
When analyzing bundles, consider the following:
- Are any large libraries tree-shakable?
- Are you depending on deprecated libraries?
- Does the report show any large bundles?
- Webpack suggests keeping bundles under 250 KB before minification. If bundles exceed this size, consider code splitting and possibly lazy loading for certain parts.
5. Limit memory allocation for Node
Limiting the allocated memory size will make Node.js more aggressive with its garbage collection process, which can alleviate memory errors for some projects. To limit the memory size for Node.js, prefix your build command with the NODE_OPTIONS environment variable:
NODE_OPTIONS="--max-old-space-size=6144" next build
Conclusion
Memory errors can be frustrating and difficult to pin down, but with the right approach and optimizations in place, you can prevent them from occurring. Regularly reviewing and optimizing your project is key to ensuring smooth and efficient builds on Vercel.
If you’ve tried the above steps and still face issues, don’t hesitate to reach out to our support team for further assistance.