Piping with docker exec, from the command line and from the Python API

What I am trying to implement is calling mysqldump in the container and dumping the database into the container's own directory.

I tried the command below first:

$ docker exec container-name mysqldump [options] database | xz > database.sql.xz


It didn't work: the pipe and the redirection are interpreted by my shell on the host, so xz runs on the host and the file lands there instead of inside the container. So I tried wrapping the whole pipeline in bash -c:

$ docker exec container-name bash -c 'mysqldump [options] database | xz > database.sql.xz'


This time it worked: the quoted pipeline is executed by bash inside the container, so the file is written there.

But this is really lame.

Then I tried the same thing through docker-py; the cmd that worked looks like this:

cmd=['bash', '-c', 'mysqldump [options]  database | xz > database.sql.xz']

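For completeness, a minimal runnable sketch of the same call, assuming the current Python docker SDK (the docker package that succeeded docker-py) and a container literally named container-name; [options] stays a placeholder exactly as above:

import docker

# Connect using the environment's Docker settings (DOCKER_HOST, etc.).
client = docker.from_env()
container = client.containers.get('container-name')

# Hand the whole pipeline to bash inside the container, just like
# docker exec container-name bash -c '...' on the command line.
exit_code, output = container.exec_run(
    ['bash', '-c', 'mysqldump [options] database | xz > database.sql.xz'])
print(exit_code, output)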

The Docker daemon logged the event below:

level="info" msg="-job log(exec_start: bash -c mysqldump [options]  database | xz > database.sql.xz, fe58e681fec194cde23b9b31e698446b2f9d946fe0c0f2e39c66d6fe68185442, mysql:latest) = OK (0)"


My question is: is there a more elegant way to achieve my goal?



1 answer


You're almost there, you just need to add the -i flag to make the pipe work:

-i, --interactive    Keep STDIN open even if not attached

docker exec -i container-name mysqldump [options] database > database.sql


I replaced the pipe with a plain file redirection (hence the uncompressed .sql name), but it works with a pipe just as well. Just remember not to use the -t option, as allocating a TTY will break the pipe.
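Since the question also asks about the Python API: as a sketch under the same assumption (the current Python docker SDK, placeholder names from the question), exec_run(stream=True) streams the dump out of the container in chunks, and the standard lzma module compresses it on the host, which is the API-level analogue of docker exec -i ... | xz > database.sql.xz:

import lzma
import docker

client = docker.from_env()
container = client.containers.get('container-name')

# stream=True yields output chunks as they arrive instead of one blob;
# the exit code is not available in this mode. Note that stderr is
# merged into the stream unless demux=True is passed.
_, stream = container.exec_run(['mysqldump', 'database'], stream=True)

# Compress on the host while the dump streams out of the container.
with lzma.open('database.sql.xz', 'wb') as f:
    for chunk in stream:
        f.write(chunk)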


Additionally:



To load the SQL dump back into mysql:

docker exec -i container-name mysql [options] database < database.sql


This little script detects whether mysql is being run interactively on a terminal or fed from a pipe:

#!/bin/bash
# If stdin is a terminal, allocate a TTY so mysql runs interactively;
# otherwise omit -t, since a TTY would mangle piped input and output.
if [ -t 0 ]; then
    docker exec -it container-name mysql "$@"
else
    docker exec -i container-name mysql "$@"
fi

