When it comes to installing a framework (such as Next.js) for a web application into a container, what I think I've seen people do is create a Next.js project locally (presumably using npx) and then copy it into the container using the COPY command in the Dockerfile. Of course there is more to the Dockerfile (and .dockerignore) than that, such as installing Node/npm and ignoring node_modules.
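For reference, the pattern I'm describing usually looks something like this (a sketch only; the base image tag, paths, and scripts are illustrative, not from any particular tutorial):

```dockerfile
# Sketch of the common pattern: the Next.js project already exists
# locally (created with `npx create-next-app`), and the Dockerfile
# only copies it in and installs dependencies inside the image.
FROM node:20-alpine

WORKDIR /app

# Copy the dependency manifests first so the install layer is cached
# and only re-runs when package.json / the lockfile change.
COPY package.json package-lock.json ./
RUN npm ci

# Copy the rest of the project; a .dockerignore entry for
# node_modules keeps the host's installed packages out of the image.
COPY . .

RUN npm run build
CMD ["npm", "start"]
```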
What I’m wondering is…
Why can't you `RUN npx create-next-app@latest`, literally installing the Next.js framework into the container by means of the Dockerfile itself? That would be a different approach from installing it on the local file system and then copying it into the container.
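To make the idea concrete, here is roughly what that alternative would look like (a sketch; note that create-next-app is interactive by default, so some non-interactive flag is needed — `--yes` is my assumption here and the exact flag may differ by version):

```dockerfile
# Sketch of the alternative: scaffold the project inside the image
# at build time instead of copying an existing project in.
FROM node:20-alpine
WORKDIR /app

# create-next-app normally prompts for options; a non-interactive
# flag (assumed to be --yes here) would be required in a build,
# since RUN steps cannot answer prompts.
RUN npx create-next-app@latest my-app --yes

WORKDIR /app/my-app
CMD ["npx", "next", "dev"]
```

One practical consequence: the app's source would then live only inside the image, so every code change would mean editing inside the container or rebuilding, which is part of why this pattern is rare in tutorials.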
Is that not a valid approach? Why or why not? And why don't I see examples of that approach in tutorials and videos?
Hello? Does anyone understand this better than I do? Because I'm not finding the internet to be a very robust source of info on Docker. Mostly it seems to be a few really basic ideas getting repeated over and over.
Regardless of the package manager, I have seen both of these approaches used:

- If dependencies are only fetched from public repos, everything can be done in the Dockerfile.
- If dependencies need to be fetched from private repos, fetch the dependencies and build outside, then copy the artifacts into the image in the Dockerfile. This prevents accidental leakage of the credentials used to access the repository.
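For the private-repo case, the Dockerfile then only packages artifacts that were built on the host or in CI, so registry credentials never enter an image layer. A sketch of what I mean (paths are illustrative and assume a standard Next.js build output):

```dockerfile
# Sketch: the app was built outside the image (e.g. `npm run build`
# on the host or in CI, where the private-registry credentials live).
# The Dockerfile only copies the resulting artifacts, so no
# credentials or .npmrc files can leak into the image layers.
FROM node:20-alpine
WORKDIR /app

# Copy the prebuilt output and installed dependencies from the host.
COPY .next ./.next
COPY node_modules ./node_modules
COPY package.json ./

CMD ["npx", "next", "start"]
```

One caveat with copying node_modules built on the host: packages with native bindings must be built for the same platform as the image, which is one reason multi-stage builds (build stage and runtime stage in one Dockerfile) are also popular for the public-repo case.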
I am not a Node guy, but I have seen plenty of projects use the approach shown in the link above.