How to deploy a docker container on a remote ubuntu server?

I have implemented an API inside a Docker container and I want to deploy this container to a remote Ubuntu server. How can I do this exactly? My API uses a lot of resources and I used the MLDB framework to implement it. So far I've found tons of tutorials on how to deploy APIs on AWS and DigitalOcean, but since I already have access to a remote Ubuntu server, why would I need those? How can I deploy my container so someone else can check my API? If there is a better way to deploy my API (hopefully free or cheap), please let me know.

Thanks in advance.

+3




3 answers


  1. Install passwordless SSH on the target machine.

  2. Run the following command to remotely manage Docker on the target VM (it also installs Docker if needed):

docker-machine create --driver generic --generic-ip-address=10.123.2.74 --generic-ssh-user=docker --generic-ssh-key ~/.ssh/id_rsa some_name

More information on the generic driver is in the Docker documentation.

  3. Set the required environment variables for the newly configured docker machine:

eval $(docker-machine env some_name)



  4. Any Docker command run in this terminal window / cmd prompt will now be executed on the remote machine. To test, run:

docker ps

You can now run docker containers just like you would locally.
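
For example, a minimal check might look like this (hello-world is just a throwaway test image, and the last step is only needed if you want to point the shell back at your local daemon):

# with the env vars set, this runs on the remote host
docker run --rm hello-world

# list the containers running remotely
docker ps

# switch the shell back to the local Docker daemon when done
eval $(docker-machine env -u)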

PS - If you need to remotely manage a Docker instance running on Windows via the Docker Toolbox, things get a little more complicated: you need to allow network access to the relevant ports of the Linux Docker VM (SSH, Docker engine, container ports), either through a VirtualBox bridged network adapter or port forwarding, and you also need to sort out Windows firewall issues.

+5




I would suggest installing docker-machine in your local development environment and using the generic driver to add remote_server; you can then use eval $(docker-machine env remote_server) to connect to it and deploy your API.
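
For reference, adding the server would look roughly like this (the IP address and SSH user are placeholders for your own values, and remote_server is just the name used in this answer):

docker-machine create --driver generic --generic-ip-address=<server-ip> --generic-ssh-user=<ssh-user> --generic-ssh-key ~/.ssh/id_rsa remote_server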

The driver will perform a list of tasks upon creation:

  • If docker is not running on the host, it will be installed automatically.
  • It will update the host's packages (apt-get update, yum update, ...).
  • It will generate certificates to protect the docker daemon.
  • The docker daemon will be restarted, so all running containers will be stopped.
  • The hostname will be changed to match the machine name.

Deploying a local container to remote_server:



Once remote_server has been added to docker-machine using the generic driver, follow these steps to deploy your API (a consolidated sketch follows the list):

  • Get envs for server: docker-machine env remote_server

  • Connect the shell to the server: eval $(docker-machine env remote_server)

  • Create an image of the API (run from the directory containing your Dockerfile): docker build -t api_image .

  • Start container: docker run -d -p 1111:1111 api_image

  • Use curl: curl $(docker-machine ip remote_server):1111
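
Put together, the whole deployment is a short shell session (port 1111 and the image name api_image come from the steps above; the Dockerfile itself is assumed to already exist in your project):

# point this shell at the remote server
eval $(docker-machine env remote_server)

# build and start the API container there
docker build -t api_image .
docker run -d -p 1111:1111 api_image

# check that it is up and responding
docker ps
curl $(docker-machine ip remote_server):1111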

Hope you find this helpful.

+1




In fact, it doesn't matter what your container is; it could be anything. You just need Docker on the server to deploy it and make it accessible. If you have SSH access to your Ubuntu server, install Docker there. On Ubuntu this is simple; you can follow this guide:

https://docs.docker.com/engine/installation/linux/ubuntu/
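
If you just want something quick, Docker's convenience script is a common shortcut (a sketch only; the linked guide is the authoritative set of steps):

curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# optional: run docker without sudo (log out and back in afterwards)
sudo usermod -aG docker $USER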

After that, deploy your container to the server, mapping the port(s) of your API to the host with the -p flag of the docker run command. Of course, you will also need to open those port(s) so the machine is reachable from outside; how to do that depends on your server configuration.
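
A minimal sketch, assuming your API listens on port 1111 inside the container and your image is called api_image (ufw is just one common way to open a port on Ubuntu):

# publish the container port on the host
docker run -d -p 1111:1111 api_image

# allow inbound traffic on that port if the server runs ufw
sudo ufw allow 1111/tcp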

So you can do it the "normal" way; I'm not sure whether you were looking for an AWS-specific approach.

0

