How do I fix "exceptions.KeyError: 'd'" when running a Scrapy spider?

I wrote a spider with the Scrapy framework on Windows and everything ran fine. I'm now deploying it to a server running Ubuntu.
After installing Python, Scrapy, and the other dependencies, running the spider raises the error below.
I've googled around quite a bit but still haven't solved it. Could anyone take a look? Thanks!

Unhandled Error
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 153, in crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1237, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "/usr/lib/python2.7/dist-packages/twisted/internet/defer.py", line 1099, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 71, in crawl
    self.engine = self._create_engine()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 83, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 67, in __init__
    self.scraper = Scraper(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/scraper.py", line 70, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 56, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 34, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/media.py", line 33, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/images.py", line 57, in from_settings
    return cls(store_uri)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/files.py", line 160, in __init__
    self.store = self._get_store(store_uri)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/pipelines/files.py", line 180, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
exceptions.KeyError: 'd'
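
Update: judging from the last frame (files.py line 180, store_cls = self.STORE_SCHEMES[scheme]), the pipeline is looking up a storage scheme named 'd'. My guess (an assumption, since my settings.py isn't shown above) is that IMAGES_STORE still contains the Windows path from my development machine, something like D:\images. On Windows that works as a local directory, but on Ubuntu the drive letter appears to be parsed as a URI scheme, and 'd' is not a registered scheme. A minimal sketch of that parsing behaviour under the same Python 2.7 as in the traceback, where 'D:\\images' is a hypothetical value of IMAGES_STORE:

    # Python 2.7; 'D:\\images' is a hypothetical Windows-style IMAGES_STORE value
    from urlparse import urlparse

    store_uri = 'D:\\images'
    print urlparse(store_uri).scheme   # prints 'd', matching the KeyError above

If that guess is right, pointing IMAGES_STORE (or FILES_STORE) at a path that exists on the server, e.g. /home/user/images (a hypothetical server path), should avoid the lookup on 'd'. Can anyone confirm whether this is the actual cause?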