Deploying a web application that is built locally

I am quite new to Docker and containerization in general. But I do have some experience programming, also with Node.js, Next.js, HTML, etc. However, this is all self-taught in my spare time, so please don’t be too harsh. :angel:

My question is the following: I want to build my personal webpage, but I've never had a server where I could host this stuff. I recently got a Synology NAS and installed the Container Manager (it's basically a Docker container manager).

I started developing my website with Next.js, and everything works fine on my PC. I looked up the (very short) documentation for how to containerize the application and followed the steps here. They use a Dockerfile to set up a Node.js instance, install all the dependencies, build the application, and run it afterwards.
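For reference, the documented approach boils down to something like the following sketch (the exact base image and commands in the docs may differ; `node:18-alpine` here is just an assumption):

```dockerfile
# Sketch: install dependencies, build, and run everything inside the image
FROM node:18-alpine
WORKDIR /app

# Install dependencies from the lockfile
COPY package.json package-lock.json ./
RUN npm ci

# Copy the sources and build the Next.js app
COPY . .
RUN npm run build

EXPOSE 3000
CMD ["npm", "start"]
```

The `RUN npm run build` step is the part that takes ten-plus minutes on the NAS, since the whole build happens on the deployment device.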

My problem is that building the whole application from scratch with Docker Compose takes a long time (>10 mins) on my NAS, but only like 1 min on my PC (understandable, of course).

As I understand it, if I want to update my website, I have to edit the Next.js project and copy the whole project to my NAS so that it can build everything again.

My question is: Can I develop the website on my PC, build it there too and just have Next.js (aka Node.js) run the build output? And do I have to use the “standalone output” described here, or can I just use “next start” (which I would prefer)?

I think I understand the concept of volumes and my best guess is that the solution has to involve volumes, but I’m just not sure how to use them correctly.

TL;DR: Do I have to build a Next.js application on the deployment device, or can I build the application on my PC and just give the container the build files?

Thank you for your patience and understanding. I just couldn’t find information online on this specific use case.

But you do use a git repo to manage your code, right? A workflow could look like this: you develop on your PC, push the modified sources to git, then pull them on your NAS. This won't speed up the build, though; it's just a more robust and reliable approach than copying the sources by hand.

To speed up the image build, you could potentially build your Node application on your PC and copy the current package.json, package-lock.json, and the build output (for Next.js, the `.next` folder) to the NAS. Your Dockerfile would then copy package.json and package-lock.json, perform `npm ci`, and copy the build output into the container.
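As a sketch of that idea (assuming a default Next.js setup where `npm run build` produces a `.next` folder, and that `next` is listed in the production dependencies):

```dockerfile
# Sketch: assumes `npm run build` was already run on the PC, and that
# package.json, package-lock.json, .next/ and public/ were copied to the NAS.
FROM node:18-alpine
WORKDIR /app

# Install only production dependencies from the lockfile
COPY package.json package-lock.json ./
RUN npm ci --omit=dev

# Copy the pre-built output and static assets instead of building here
COPY .next ./.next
COPY public ./public

EXPOSE 3000
CMD ["npx", "next", "start"]
```

This way the NAS only runs `npm ci`, which is much faster than a full `next build`.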

Yes, I do use git. But I think I'm missing something fundamental about Docker containers: there must be a way to just use the official Node.js Docker image and run "next start" with all the needed files in a volume? If I do it like that, I should be able to simply replace/update the files in the folder the volume is linked to, and I'm good to go. I might have to restart the container to pick up the new files, but I wouldn't have to rebuild the image every time, right? Or am I missing something?

It really depends on what you want.

  • You can bind the sources into the container and use it as a runtime environment
  • You can build an image that consists of your application and all its dependencies.

While the first approach is often used during development, the second approach is used to actually create a self-contained image for deployments.
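A minimal Compose sketch of the first (bind-mount) approach could look like this; the host path is hypothetical, so adjust it to wherever the pre-built project lives on the NAS:

```yaml
# Sketch: run `next start` from the official Node image, with the
# project bind-mounted from the NAS filesystem.
services:
  web:
    image: node:18-alpine
    working_dir: /app
    volumes:
      # Hypothetical NAS path; the folder must already contain
      # node_modules and a finished `next build` output (.next)
      - /volume1/docker/my-site:/app
    command: npx next start
    ports:
      - "3000:3000"
    restart: unless-stopped
```

One caveat: a bind-mounted node_modules must be compatible with the container's Linux environment (native modules installed on Windows or macOS won't work there), so it can be safer to run `npm ci` once inside the container rather than mounting node_modules from the PC.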

Ok, that actually makes sense. Thank you!

This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.