Why is Turbopack so slow?

My MacBook M1 Pro is burning :fire: We migrated from Vite, and compared to that, Turbopack is maybe 30x slower. I would say the way it works is “not as advertised”. I am disappointed.

Hi, @andylacko!

Thanks for your feedback, I’ve shared internally. :pray:

Engineer from the Turbopack team reporting in! I’m sorry you’ve had a poor experience with Turbopack.

Would you be willing to try out the canary version of Next.js 15 (npm install next@canary react@canary react-dom@canary) and see if that helps? We’ve landed hundreds of fixes to canary that have not been backported to the Next.js 14.x branch.

I understand that asking you to use a canary version is potentially a huge ask. Turbopack is rapidly developing, and we’re working internally on how to put our best foot forward. Just this last week we migrated Turbopack’s core into the Next.js repository. One of the goals of that effort was to make backporting changes easier.

On top of trying a canary release, a few more details about what performance problems you’re running into might be helpful:

  • Are these cold starts of the dev server, or warm loads of pages (e.g. HMR)? Cold start times are a known issue because we don’t have persistent (on-disk) caching yet, though we’ve found that a cold start with Turbopack is often still faster than a warm start with Webpack. Persistent caching for faster cold starts is a high priority on our roadmap.

  • What’s the scale of your application? Is this a small application? A large one? 10s of thousands of modules? 100s of thousands?

  • How much memory does your machine have? We’re rapidly narrowing the gap (especially in canary releases), but Turbopack does often use more RAM than Webpack, as it caches more incremental results in memory. This can be an issue for machines with 8GB of RAM.


Thank you for the quick response. I tried canary maybe a month ago and there was no difference.

HMR is perfect; its overhead is not even noticeable.

The biggest problem is when you navigate to or open a new page. I don’t know if this is considered a cold start (my laptop is definitely not cold), but this is the point where it takes around 10 seconds to build. If you refresh the same page, it is alright.

I wouldn’t say our product is big, around 8k files right now. This is the build log from Next:

 ✓ Compiled /api/version in 26.5s (8263 modules)
 GET /api/version 200 in 12619ms

I have the 16-inch 16 GB M1 Pro, the lowest config, I think.


Looking forward to file caching, because this is really bad DX. We gained a little bit of benefit from Next but lost the whole DX, and I assume this is problematic for many teams using Next. Considering Next.js is the most popular JS framework right now, it would be nice to calculate how much energy you could save by optimizing Turbopack :muscle:

Thanks for trying out the canary, @andylacko. Yes, when we say “cold start”, we’re referring to loading a route without a cache.

We make the assumption that the next dev process is long-lived. You should be able to start it in the morning and leave it running most or all of your day. We cache things in memory, so the second time you load a route, or if you use HMR, it should be fast. However, every time you ctrl+c the next dev --turbo process, the cache is lost, as it’s not persisted to disk.

Once we have persistence support, we will read in the cache of any previous execution from disk, so Turbopack will only be slow on a fresh clone of your repository.

Because Turbopack uses an uncommon bottom-up caching architecture (we think this will be a key differentiator of Turbopack), persistence is a bit of a challenge, but @sokra previously built some prototypes, and we are confident we can get there.
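To illustrate the general idea (this is a toy sketch of memoized, cache-keyed computation, not Turbopack’s actual engine; `memoize` and `parse` are made-up names):

```javascript
// Toy sketch: each task's result is keyed by its inputs and memoized,
// so repeating the same work is served from the cache instead of recomputed.
const cache = new Map();

function memoize(name, inputs, compute) {
  const key = name + ":" + JSON.stringify(inputs);
  if (!cache.has(key)) cache.set(key, compute(...inputs));
  return cache.get(key);
}

// A hypothetical "task": split source text into tokens.
const parse = (src) => memoize("parse", [src], (s) => s.trim().split(/\s+/));

console.log(parse(" a b  c ")); // first call: computed
console.log(parse(" a b  c ")); // second call: returned from the cache
```

Persistence would roughly mean serializing that cache to disk on shutdown and reading it back on startup, so a fresh process starts warm instead of recomputing everything.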

It’s certainly a priority. We need to support production builds with Turbopack, and we see persistence as a key part of making production builds performant.

16 GB of RAM and 8k modules shouldn’t be a problem :+1:


Hey @andylacko, taking another look at this, 26.5s does seem a bit longer than we’d expect for a cold start, particularly on the canary version. It’s possible that some combination of dependencies or configuration in your project is causing a problem for Turbopack.

For example, we’ve seen postcss+tailwind config files with glob patterns that result in matching the .next directory, leading to broken behavior where postcss/tailwind must re-check its own output.
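As a hypothetical example (not necessarily your config), an over-broad `content` glob like this one would also match the .next build output:

```javascript
// tailwind.config.js -- hypothetical example of an over-broad glob.
// "./**/*" matches everything under the project root, including .next,
// so Tailwind ends up re-scanning files the build itself just wrote.
module.exports = {
  content: ["./**/*.{js,ts,jsx,tsx}"], // problem: also matches .next/**
  // Safer: scope globs to your source directories only, e.g.:
  // content: ["./app/**/*.{js,ts,jsx,tsx}", "./components/**/*.{js,ts,jsx,tsx}"],
};
```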

Would you be willing to share a .next/trace.log file with us? There are instructions on how to do this here: Architecture: Turbopack | Next.js


Yes, currently we are on 14.2.5; I downgraded the project from canary just after checking the speed because there were some issues. The trace.log is 420 MB, so I uploaded it to WeTransfer: Unique Download Link | WeTransfer

Thanks. I’ll get back to you!


@andylacko, I’ve looked through the log you sent, and unfortunately nothing in particular looks off to me, but I’ve shared it with the rest of the team and will let you know if anyone else notices something.

Either way, I expect that once we have a persistent caching solution (which is actively being worked on), that we’ll have a much better story for these initial page loads.

The trace that you sent shows endpoint_write_to_disk (which should be the critical path for an initial page load) taking 10.78s, which is less than the 26.5s you posted earlier. Was this trace with the canary version? Is the canary version faster for you than the stable version?


Sidenote: It’s not very user-friendly at the moment, but if you’re curious about how we analyze these traces, in the Next.js repository (with Rust installed via rustup), we run the trace server with:

cargo run --bin turbo-trace-server --release -- ~/Downloads/trace.log

And then load the viewer in a web browser at https://turbo-trace-viewer.vercel.app/

That gives an interactive chart, similar to Chrome’s JS profiler.

At some point, we’ll need to turn this into a more accessible self-service tool.


Thanks @bgw, I’ll try again with canary when I have spare time.
