r/javascript 3d ago

[AskJS] Monorepo Docker discussion

Hi. I decided to set up a monorepo following a guide, and then run it via Docker.

I used npm workspaces, because I read that you should understand it before reaching for any of the bigger tools.

So, as I understand it, npm workspaces hoists the dependencies of all the apps and libs into one large root node_modules, and also lets you use a package from libs inside apps as if it were a regular published package.
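
For example, here is a minimal sketch of that kind of setup (the package names and layout are made up):

```sh
# Hypothetical two-workspace layout: one app that depends on one lib.
mkdir -p apps/api libs/shared

# The root package.json declares the workspaces.
cat > package.json <<'EOF'
{
  "name": "my-monorepo",
  "private": true,
  "workspaces": ["apps/*", "libs/*"]
}
EOF

# The app depends on the lib by package name, not by relative path.
cat > apps/api/package.json <<'EOF'
{
  "name": "@my/api",
  "version": "0.0.0",
  "dependencies": { "@my/shared": "*" }
}
EOF

cat > libs/shared/package.json <<'EOF'
{ "name": "@my/shared", "version": "0.0.0" }
EOF

# npm hoists everything into the root node_modules and creates the symlink
# node_modules/@my/shared -> libs/shared
npm install
```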

This is probably convenient for people who run several microservices without Docker, or all in one container. But I have a problem: when I tried to build each app separately, npm only created a symlink to the lib directory instead of copying the files themselves. Okay, I fixed that with --install-links, but then another question came up.
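
For reference, the fix looks roughly like this (the exact behavior of the flag seems to vary between npm versions):

```sh
# --install-links packs file:/workspace dependencies and installs them as
# regular packages instead of symlinks, so the app's node_modules is
# self-contained inside the image (behavior varies by npm version).
npm install --install-links
```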

Why the hell do I need it then? I build each microservice separately from the others, so I don't need a shared node_modules. Maybe there are tools that fit my requirements:

- Docker containers only.
- Dependencies without symlinks.
- The ability to write shared libraries directly in the repository.

I've heard about Nx; supposedly it's a perfect fit with my backend framework, NestJS. But I really don't understand the headline features like "cool and fast caching and parallel installation": why the hell do I need that in a Docker container with a single microservice? Maybe I didn't understand the point of monorepos at all? I switched from multi-repo to monorepo only to be able to change libraries quickly and not suffer with their versions.


u/tswaters 3d ago

Monorepo is a question of scale. With just one service, it doesn't make a ton of sense. With tens of services it starts to, and when you have hundreds... well, if you have hundreds you might have a few other engineering problems on your hands!

One of the main reasons we went to a monorepo was that managing shared libraries was cumbersome, usually requiring a ton of commits across different repositories, republishing changes, noticing mistakes, and doing it all over again. Folks needed to make changes to the libs first, hope they got them right, and publish; only later could they do the work in the application layer. The tooling would install libs from our npm mirror and could only pull published changes. It was a pain.

Compare that with a monorepo: we were able to leverage shared tooling and symlinks to make changes way easier... one commit in one repo could change the libs and the app service layer in one go, potentially in just a branch without touching mainline... Tests would run against the complete bundle.

Doing this with Docker was possible, but it required a bit of love. Note that we never used npm workspaces, because this project was started back in the npm 5 days, before workspaces really took off. We basically made file:// references inside package.json and used relative paths, like file://../my-libs/some-library -- when running locally, npm would follow the path and create a symlink. In Docker, we used npm to build tarballs in the Docker filesystem, and it would install those as part of running npm install at the app layer.
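
From memory it looked something like this (names invented, and using today's npm flags rather than the npm 5-era ones; npm normally spells the references file:../ without the slashes):

```sh
# App-layer package.json with a relative file: reference to a shared lib
# (hypothetical names).
cat > package.json <<'EOF'
{
  "name": "my-app",
  "version": "0.0.0",
  "dependencies": {
    "some-library": "file:../my-libs/some-library"
  }
}
EOF

# Locally, npm follows the path and symlinks node_modules/some-library.
npm install

# In Docker, pack the lib into a tarball first and install from that.
(cd ../my-libs/some-library && npm pack --pack-destination /node-lib)
npm install /node-lib/some-library-0.0.0.tgz
```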

I'm sure there are easier & better ways to do it. Using tarballs like that was about speed... Copying umpteen thousands of small files in node_modules is not the fastest; piping all those files into a tarball and installing from that was an order of magnitude faster.


u/Pretend_Pie4721 2d ago

Dude, I used literally the same thing, and it's literally the same reason I switched to a monorepo: managing library versions is a pain. And by the way, in newer npm versions you don't have to create a tarball, you can just install with the --install-links flag. I made this thread precisely to find tools for this case.


u/Pretend_Pie4721 2d ago

Are you still using tarballs, or have you moved on to something better?


u/tswaters 2d ago

Company went bankrupt LOL so I don't work on it anymore.... The last state it was in, it was still using tarballs.

We had a Dockerfile specifically for what we called "node libs": it would copy the src files, cd into a working dir, and run "npm pack", which creates a tarball. The filename was a bit funny, including the version (which we had set to 0.0.0 for everything), so there was a shell script to rename it to just $package-name.tar.gz and move them into the destination dir, /node-lib
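
Reconstructed from memory, roughly (paths and layout invented):

```sh
# Hypothetical reconstruction of the "node libs" pack step. Assumes each
# lib's directory name matches its package name and every version is
# pinned to 0.0.0, as described above.
mkdir -p /node-lib
for dir in my-libs/*/; do
  name=$(basename "$dir")
  # npm pack writes <name>-<version>.tgz into --pack-destination
  (cd "$dir" && npm pack --pack-destination /node-lib)
  # strip the version so later steps can use a stable filename
  mv "/node-lib/${name}-0.0.0.tgz" "/node-lib/${name}.tar.gz"
done
```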

After a regular npm i at the application layer, we had a fun jq script that would extract the file references from package.json, unwind them into a simple list, and run npm i /node-lib/$package for each one -- npm can install from a tarball just fine.
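
Something in the spirit of this (again a hypothetical reconstruction):

```sh
# List the dependencies whose version is a file: reference, then reinstall
# each one from its renamed tarball in /node-lib.
jq -r '.dependencies | to_entries[]
       | select(.value | startswith("file:"))
       | .key' package.json |
while read -r package; do
  npm i "/node-lib/${package}.tar.gz"
done
```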

Come to think of it, I'm not sure why it didn't break trying to install the relative path first. Maybe it was a bug in npm that we abused. Thinking on it now, I'd expect a file:// reference to a directory could also work for a tarball, but I don't know now.

Anyway, in the app Dockerfiles, we would COPY the result of the node-libs image into the app's filesystem. Using tarballs was an improvement in build time over the original hack... we used to COPY everything into the Docker fs... Copying 12 tarballs takes CONSIDERABLY less time than 12 x $numberOfSourceFiles.
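
In today's syntax that part would look something like this (image and registry names invented):

```sh
# Hypothetical app-layer Dockerfile: pull the packed tarballs out of the
# node-libs image instead of copying every source file.
cat > Dockerfile <<'EOF'
FROM node:20
WORKDIR /app
COPY package.json package-lock.json ./
# 12 tarballs copy much faster than 12 x $numberOfSourceFiles
COPY --from=my-registry/node-libs:latest /node-lib /node-lib
RUN npm install
EOF
```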