[Verified fix] [twisted] CRITICAL: Unhandled error in Deferred: solution

Stack Overflow answer: https://stackoverflow.com/questions/45314514/scrapy-importerror-no-module-named-pipelines

The cause first:

ITEM_PIPELINES = {
    'tutorial.pipelines.jsonpipeline': 800,               # wrong
    'tutorial.pipelines.jsonpipeline.jsonpipeline': 800,  # correct: Scrapy's layout convention requires the class name at the end
}

Here the pipeline module was moved into a `pipelines` package (i.e. `tutorial/pipelines/jsonpipeline.py`), so Scrapy's dotted path has to repeat the name once: `tutorial.pipelines.jsonpipeline` is the module, and the trailing `.jsonpipeline` is the class defined inside it.
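The rule can be demonstrated without Scrapy. Scrapy's `scrapy.utils.misc.load_object` splits the dotted path at the last dot: everything before it is imported as a module, and the last segment must be an attribute (usually a class) of that module. Below is a simplified stand-in for that function, exercised against the stdlib `json` package instead of a project package, to show why the path must end in the class name:

```python
import importlib

def load_object(path):
    # Simplified sketch of what Scrapy's load_object does:
    # import everything before the last dot as a module,
    # then look up the last segment as an attribute of it.
    module_path, _, name = path.rpartition(".")
    mod = importlib.import_module(module_path)
    try:
        return getattr(mod, name)
    except AttributeError:
        raise NameError(
            f"Module '{module_path}' doesn't define any object named '{name}'"
        )

# module path + class name: resolves to the class
cls = load_object("json.decoder.JSONDecoder")
print(cls.__name__)  # JSONDecoder

# a last segment that the module does not define reproduces
# the NameError from the log above
try:
    load_object("json.decoder.nosuchclass")
except NameError as e:
    print(e)
```

So when `jsonpipeline` is a module inside the `pipelines` package rather than a class inside `pipelines.py`, the path `tutorial.pipelines.jsonpipeline` makes Scrapy import `tutorial.pipelines` and then fail to find an attribute named `jsonpipeline` that is a pipeline class.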
----------------
By default, a single `pipelines.py` module contains all the pipeline classes, so the paths look like:

ITEM_PIPELINES = {
    'tutorial.pipelines.jsonpipeline1': 800,
    'tutorial.pipelines.jsonpipeline2': 801,
}
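For comparison, a minimal `pipelines.py` in that default layout might look like the sketch below. The class names `jsonpipeline1` and `jsonpipeline2` come from the settings above; what each pipeline does (JSON serialization, dropping empty items) is an assumption purely for illustration, and a real Scrapy pipeline would write to a file and raise `scrapy.exceptions.DropItem` instead:

```python
# tutorial/pipelines.py -- the default single-module layout.
# Class names must match the last segment of the dotted paths
# in ITEM_PIPELINES ('tutorial.pipelines.jsonpipeline1', ...).
import json

class jsonpipeline1:
    """Hypothetical pipeline: serialize each item as a JSON line."""
    def process_item(self, item, spider):
        line = json.dumps(dict(item))
        # a real pipeline would write `line` to an output file here
        return item

class jsonpipeline2:
    """Hypothetical pipeline: filter out items missing a 'title'."""
    def process_item(self, item, spider):
        if not dict(item).get("title"):
            return None  # a real pipeline would raise DropItem here
        return item
```

With this layout the dotted path is module (`tutorial.pipelines`) plus class (`jsonpipeline1`), with no repetition needed.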

The full error output:
2022-04-30 23:19:28 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\crawler.py", line 206, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\crawler.py", line 210, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\twisted\internet\defer.py", line 1905, in unwindGenerator
    return _cancellableInlineCallbacks(gen)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\twisted\internet\defer.py", line 1815, in _cancellableInlineCallbacks
    _inlineCallbacks(None, gen, status)
--- <exception caught here> ---
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\twisted\internet\defer.py", line 1660, in _inlineCallbacks
    result = current_context.run(gen.send, result)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\crawler.py", line 102, in crawl
    self.engine = self._create_engine()
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\crawler.py", line 116, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\core\engine.py", line 84, in __init__
    self.scraper = Scraper(crawler)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\core\scraper.py", line 75, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\middleware.py", line 59, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\middleware.py", line 40, in from_settings
    mwcls = load_object(clspath)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\utils\misc.py", line 66, in load_object
    raise NameError(f"Module '{module}' doesn't define any object named '{name}'")
builtins.NameError: Module 'tutorial.pipelines' doesn't define any object named 'jsonpipeline'

2022-04-30 23:19:28 [twisted] CRITICAL: 
Traceback (most recent call last):
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\utils\misc.py", line 64, in load_object
    obj = getattr(mod, name)
AttributeError: module 'tutorial.pipelines' has no attribute 'jsonpipeline'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\twisted\internet\defer.py", line 1660, in _inlineCallbacks
    result = current_context.run(gen.send, result)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\crawler.py", line 102, in crawl
    self.engine = self._create_engine()
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\crawler.py", line 116, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\core\engine.py", line 84, in __init__
    self.scraper = Scraper(crawler)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\core\scraper.py", line 75, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\middleware.py", line 59, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\middleware.py", line 40, in from_settings
    mwcls = load_object(clspath)
  File "C:\ProgramData\Miniconda3\envs\python383\lib\site-packages\scrapy\utils\misc.py", line 66, in load_object
    raise NameError(f"Module '{module}' doesn't define any object named '{name}'")
NameError: Module 'tutorial.pipelines' doesn't define any object named 'jsonpipeline'

Process finished with exit code 1

Source: https://54852.com/langs/870770.html