I genuinely enjoy the Vercel experience overall, especially the ease of deployments during rapid app iterations. At its best, it’s terrific: full-stack Next.js apps with the magic of server components are amazing for swift development cycles.
However, I increasingly run into AI-related use cases where a Python backend is preferable, if not necessary. Personally, I find FastAPI a bit more ergonomic than Flask, though I’m not sure if others feel the same way.
With this rise in Python/AI use cases, I keep hitting a wall with Vercel: the serverless architecture often becomes problematic for these projects, sometimes at the last minute. Many AI applications involve file handling and processing that require either a containerized environment or more memory than the 250 MB serverless function limit allows. This is worse than an annoyance: it makes me hesitant to start new projects on Vercel at all, and leaves me contemplating whether to split a project between Vercel and another IaaS provider for the Python environment, or to migrate everything elsewhere (but then to where? AWS?).
A good example of one of these serverless “walls” popped up on this forum just this week: Link to forum post. It doesn’t happen on every project, but hitting a limitation like that mid-project is a real productivity problem, and it makes me nervous about committing the next project to this environment.
I guess I’m curious how others are handling these challenges. As much as I love the rapid deployment experience Vercel offers, the boundaries that come up, and the lack of any smooth off-ramp (that I’m aware of) to server-full partners, have me frustrated. But maybe I’m missing some good, easy answers for when I realize I actually need my app to store user uploads and run NumPy on the files (or something like that)?