Scrapy: running multiple spiders in one project at the same time

Create a commands package at the same level as the spiders directory: a folder containing an empty __init__.py (so Python can import it) plus a new .py file for the command itself. I named mine crawlall.py.
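As a rough sketch of the layout, assuming the project package is named ProxyPool (to match the COMMANDS_MODULE line below; any project name works the same way):

ProxyPool/
    scrapy.cfg
    ProxyPool/
        __init__.py
        settings.py
        commands/
            __init__.py
            crawlall.py
        spiders/
            __init__.py
            ...

crawlall.py contains the custom command: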

from scrapy.commands import ScrapyCommand


class Command(ScrapyCommand):
    # A custom command that only makes sense inside a project;
    # this flag makes Scrapy refuse to run it elsewhere.
    requires_project = True

    def syntax(self):
        return '[options]'

    def short_desc(self):
        return 'Runs all of the spiders'

    def run(self, args, opts):
        # Enumerate every spider registered in the project.
        # (Older Scrapy versions exposed this as self.crawler_process.spiders;
        # current versions use spider_loader.)
        spider_list = self.crawler_process.spider_loader.list()
        for name in spider_list:
            # crawl() only schedules the spider; nothing runs yet.
            self.crawler_process.crawl(name, **opts.__dict__)
        # A single start() call runs all scheduled spiders in one process.
        self.crawler_process.start()

Then, in the project's settings.py, point Scrapy at the module that holds the new command (ProxyPool here is the project's package name):

COMMANDS_MODULE = "ProxyPool.commands"

Finally, run scrapy crawlall from the command line; every spider in the project will be scheduled and run in a single process.
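For comparison, the same effect can be achieved without a custom command, using Scrapy's documented CrawlerProcess API from a standalone script. A minimal sketch, assuming it is run from the project root so that get_project_settings() can find scrapy.cfg:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Load the project's settings.py (requires scrapy.cfg to be discoverable).
process = CrawlerProcess(get_project_settings())

# Schedule every spider in the project, then run them all in one process.
for name in process.spider_loader.list():
    process.crawl(name)
process.start()  # blocks until all spiders have finished

The custom-command approach is more convenient when you want the behavior available everywhere via the scrapy CLI; the script approach needs no settings change.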


Reposted from blog.csdn.net/u014248032/article/details/83351291