Running scrapy in the background (Ubuntu)

I was able to run the scraping program in an Ubuntu terminal. However, I cannot use Ctrl + Z and the bg command to send it to the background: the spider connection closes every time I press Ctrl + Z.

Is there a workaround or solution to this problem?

+3




3 answers


The simplest solution is to use nohup along with &, with the following syntax:

nohup python parser.py &

While the & suffix alone runs the process in the background, closing the session would still kill it. nohup creates a session-independent process suitable for all kinds of environments (SSH sessions and remote servers, for example) and saves all console output to a log file (nohup.out by default, unless you redirect it).
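As a runnable sketch of the same idea, with `sleep` standing in for `python parser.py` (the script name is the asker's, not a fixed convention), plus saving the PID so the spider can be stopped later:

```shell
# Stand-in for `python parser.py`; any long-running command behaves the same.
nohup sleep 5 > spider.log 2>&1 &
echo $! > spider.pid        # save the background job's PID for later
jobs                        # shows the job running in the background
kill "$(cat spider.pid)"    # stop the process when done
```

Because stdout is redirected to spider.log, nohup does not create its default nohup.out here.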

+7




If you start your spider with scrapy crawl:

  • If you want to keep the logs: scrapy crawl my_spider > /path/to/logfile.txt 2>&1 &
  • If you want to discard the logs: scrapy crawl my_spider > /dev/null 2>&1 &
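In both commands, `2>&1` merges stderr into stdout so a single file (or /dev/null) captures everything, and the trailing `&` backgrounds the job. A minimal stand-in demonstrates the redirection (any command substitutes for `scrapy crawl my_spider`):

```shell
# Stand-in for `scrapy crawl my_spider`: one line on stdout, one on stderr.
( echo "out line"; echo "err line" >&2 ) > both.log 2>&1 &
wait            # wait for the background job to finish
cat both.log    # both lines were captured in the same file
```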

+1




You can use screen to run one or more tasks in the background.
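For example (the session name `scrapy_session` is arbitrary; detach from an attached session with Ctrl+A then D):

```shell
# -dm starts the session detached, running the given command inside it.
screen -S scrapy_session -dm python parser.py
screen -ls                    # list sessions to confirm it is running
screen -r scrapy_session      # reattach later to check on the spider
```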

0








