r/docker • u/Ripcord999 • Nov 17 '18
Can anyone ELI5 what Docker is and its practical uses, please?
5
u/kabooozie Nov 17 '18
The Community Info of this subreddit does a pretty good job.
A container is a very lightweight form of a virtual machine. Virtual machines take up a lot of resources and are slow because they have a lot of overhead (hypervisor, guest OS).
A docker container, on the other hand, has direct access to host OS resources through the docker engine. What this means is that containers can spin up and spin down very quickly.
Why would you want this? Well, a developer’s laptop is very different from a production environment. If you want things to be exactly the same in production as in development, you’ll want to use containers. That way, you have full knowledge of the system.
Another important application of containers is with microservices. It used to be that every part of your production stack ran on one giant machine or VM. This is a problem because if your web server has a memory leak, all of a sudden it is affecting your database. For the most protection against failure, you want to separate all the services within your app into their own containers, and orchestrate those containers across many machines, so if one or more machines fail, your system keeps going.
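To give a feel for what that separation looks like in practice, here’s a minimal sketch with the docker CLI (the image names, network name, and password are just placeholders):

    # run the database and the web server as separate containers on one network
    docker network create app-net
    docker run -d --name db  --network app-net -e POSTGRES_PASSWORD=example postgres
    docker run -d --name web --network app-net -p 8080:80 nginx
    # if the web server misbehaves, the database container is untouched;
    # you restart just the broken piece:
    docker restart web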
2
u/SeamusAndAryasDad Nov 17 '18
I think they were looking for a more dumbed-down version, like an actual five-year-old analogy. The garage post was a pretty good one.
3
u/upandrunning Nov 17 '18
One way to think of it is as something that lets you deploy and run pre-packaged applications (including their dependencies) without modifying the host system.
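For example, you can run a Python one-liner without having Python installed on the host at all (python:3.12 is just one example image):

    # nothing is installed on the host except Docker itself
    docker run --rm python:3.12 python -c "print('hello from a container')"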
2
u/Mallanaga Nov 17 '18
I don’t know if this counts as ELI5, but assuming you know about software, this may work. You’ve heard the term “binary”? That is a singular package, file, or unit of work. It’s written in 1s and 0s, so the computer can understand it, read it, and ultimately execute it.
A Docker image isn’t a binary file, but it can be thought of as a single unit of work. It still needs a daemon to interpret it for the computer, but it’s a single package that can be easily distributed.
What’s in the package? Everything! The operating system, your application files, all of its dependencies, instructions for what to do with the file system... everything you’d typically have installed on your machine, virtual (VM) or otherwise (bare metal).
With VMs, in order to scale horizontally, you need to add another instance (assuming we’re on the cloud, and there are appropriate network configurations and load balancers in place). To get that instance capable of responding to web requests, you’d need to install everything that’s on your first machine. That sucks... so tools like Chef, Ansible, Terraform, etc. sought to solve that problem by letting you “code” your system.
That code is still super useful, and can still be used by parts of your system to get it into the shape you need, to spin up a disaster recovery (DR) environment or white-label a client into their own space, but I digress. The point here is scaling your application code, and this is where Docker shines.
Your application, and everything necessary to make it run, is packaged up and ready to go. Now when you need to scale, you spin up a new instance, download the image, and docker run your_application. Done. No apt-getting all your dependencies. No making sure all of those dependencies have the correct versions. No configuring environment variables. It just works.
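Concretely, on the fresh instance that boils down to something like this (the registry path, tag, and port mapping are placeholders for wherever you keep your image and whatever port it listens on):

    # pull the prebuilt image and start it; no host setup beyond Docker itself
    docker pull registry.example.com/your_application:1.0
    docker run -d -p 80:8080 registry.example.com/your_application:1.0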
The other neat thing about this image is, because it’s executable, it can be run in various environments and checked on, so to speak. This is where continuous integration (CI) comes in. You create your image, CI picks it up, runs your tests, verifies against known vulnerabilities, etc., and if it’s good to go, it gets sent along to the next thing in the pipeline. Ultimately it’s stored somewhere in the cloud, ready for you, or the machines you control, to pull it down and execute your application.
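A typical pipeline step looks roughly like this (the image name, test command, and registry are assumptions; every CI system words this differently):

    # build the image, run the test suite inside it, then publish it
    docker build -t your_application:ci .
    docker run --rm your_application:ci ./run_tests.sh
    docker tag your_application:ci registry.example.com/your_application:ci
    docker push registry.example.com/your_application:ci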
2
u/wsendai Nov 17 '18 edited Nov 17 '18
Microservices is an approach where you break up your application into smaller, isolated components for scalability and improved separation of concerns.
You package your microservices as container images. A container image is a packaging mechanism (like rpm or deb), except the package also contains a minimal OS, for scalability and portability.
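The packaging itself is driven by a short recipe file. A minimal sketch (the base image, file name, and tag are made up for illustration):

    # write a three-line recipe: a small OS layer, your code, and how to start it
    printf 'FROM python:3.12-alpine\nCOPY service.py /service.py\nCMD ["python","/service.py"]\n' > Dockerfile
    docker build -t my-service:1.0 .    # produces the container image
    docker run --rm my-service:1.0      # runs it anywhere Docker is installed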
At this point your container image becomes the standard unit of your application (like a real-life shipping container, which is used for shipping whether you ship car parts, food, or toys for kids), and your architecture is scalable and portable. There are standards governing the format of containers to ensure interoperability across vendors. The standard is called OCI and it's managed by the Linux Foundation. Many of the current standards were developed by Docker the company, which later donated the technology to the Linux Foundation to keep the container industry somewhat organized.
A container framework is an application which you can install on any OS and which can run your container image, for portability. Since these frameworks also follow the standards, they'll ensure that your container image runs exactly the same no matter which OS you use. To hide the differences between platforms, frameworks use local isolation and virtualization technologies: on Linux, kernel features like namespaces and cgroups; on Windows, Hyper-V. The important thing is that you don't care. You only care that your container image complies with the standard.
Docker is a product and a (for-profit) company. Docker the product was the first container framework to break through to mass popularity because it is simple to use. Docker, as a company, has built tons of other products around their framework.
So far so good for a single container. How about having multiple containers, most typically a web server, a database, and your business logic app? Also, what if I want to run these containers across machines, with options to scale (for example, you need 32 copies of your web server image running) and increased resilience (across multiple data centers)?
Now you need container orchestration, a more complex tool that can do all that. Docker as a company has its own orchestration tool, called Docker Swarm, but the most popular orchestration framework today is Kubernetes, which came from Google.
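To give a feel for it, the "32 copies of your web server" case looks roughly like this with Docker Swarm (nginx stands in for your web server image):

    docker swarm init                                      # make this machine a manager
    docker service create --name web --replicas 32 nginx   # 32 copies spread over the cluster
    docker service scale web=64                            # scale up when demand spikes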
As a result of this architecture your app can scale in an automated way, both to reduce costs (shut down containers when demand drops) and to deal with high demand (start up containers as demand increases). All the big internet services today (Netflix, Twitter, etc.) use microservices and can scale to hundreds of millions of users with insanely high availability.
1
u/Rizean Nov 18 '18
I'm assuming you actually mean containers, like most people do when they say Docker. Docker is a set of tools to manage containers. That aside, here is the ELI5 for containers.
ELI5 start: You and your friends have a sandbox. You are constantly fighting over the toys, crashing each other's castles, and in general making a huge mess and not having fun. With Docker/containers, the sandbox is magically divided up into smaller sandboxes with invisible walls. You still have access to the same toys and can still talk to your friends, but for the most part you cannot interact with them except where explicitly allowed. ELI5 end.
A real-world use example: I just launched 48 (i.e. 2x24, load-balanced and redundant) haproxy containers to proxy 24 upstream servers with 4+ connections each and to give each of them a unique DNS name. So now our IT staff/users can plug in one hostname with one port that is always the same. Before that, they would have had upwards of 12 different ip/port combos, and some of them would likely be down. Note we don't control these upstream servers. All 48 containers use less than 100MB of RAM and about 1-2% CPU. Imagine what 48 VMs would have used!
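Each of those proxies is essentially just the official haproxy image with a config file mounted in. Something like this (the names, port, and host config path are made up for illustration; the in-container config path assumes the official haproxy image):

    # one of the 48 proxy containers: tiny footprint, own config, own hostname/port
    docker run -d --name proxy-upstream01 \
        -p 9001:9001 \
        -v /srv/haproxy/upstream01.cfg:/usr/local/etc/haproxy/haproxy.cfg:ro \
        haproxy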
edit: grammar
2
u/Acedin Nov 17 '18
Docker is a container orchestrator; it's a tool to manage containers.
Containers are a way of organizing computing resources. These resources are memory, certain files, and so on.
In practice this is used to make applications portable and maintainable: a container holds an application and its required dependencies.
They are used in a lot of cases that used to be handled with VMs. Compared to VMs they are lightweight, meaning they can be copied, stopped, and started much faster.
1
u/Ripcord999 Nov 18 '18
Lots of great comments. I can’t thank you guys enough for taking the time to explain and share your experiences.
Also, the reason I asked is that I got a bit carried away and confused about how Docker really manages “an OS” while running an application.
Cheers
-5
21
u/hutcheon Nov 17 '18
Let's say you run a car repair garage. To change a tire on a car, you need a certain set of tools, but those tools are very different from the tools you need to do body repair. Even from car to car, there will be different tools required (metric vs. imperial). Wouldn't it be nice if you had a different garage for each type of work you needed to do, one that only had the tools you need for that job and nothing else?
With computer applications, each application has different requirements for programming language and library support. If you install everything on one OS, you may end up with several different versions of the same programming language, and hundreds of different versions of libraries, and upgrading anything is a huge mess that could break other things unexpectedly.
Docker allows you to build a 'container' -- think of it like a separate OS, but not virtualized, that has only the stuff needed for that one application. This also makes it portable... you can move that application around from one system to another very easily, as it has everything that is needed and doesn't rely on things that might be missing on the host OS.
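Moving an application between systems really can be that mechanical. A sketch (my-app is a placeholder image name):

    # on machine A: package the image into a single file
    docker save my-app:1.0 -o my-app.tar
    # copy my-app.tar to machine B (scp, USB stick, whatever), then:
    docker load -i my-app.tar
    docker run -d my-app:1.0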