

運(yùn)行scrapy crawl douban_spider后報(bào)錯(cuò)

2019-05-31 20:40:43 [scrapy.utils.log] INFO: Scrapy 1.6.0 started (bot: douban)

2019-05-31 20:40:43 [scrapy.utils.log] INFO: Versions: lxml 4.3.2.0, libxml2 2.9.9, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.0, Python 2.7.5 (default, Apr 9 2019, 14:30:50) - [GCC 4.8.5 20150623 (Red Hat 4.8.5-36)], pyOpenSSL 0.13.1 (OpenSSL 1.0.1e-fips 11 Feb 2013), cryptography 1.7.2, Platform Linux-3.10.0-693.el7.x86_64-x86_64-with-centos-7.4.1708-Core

2019-05-31 20:40:43 [scrapy.crawler] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'douban.spiders', 'SPIDER_MODULES': ['douban.spiders'], 'ROBOTSTXT_OBEY': True, 'BOT_NAME': 'douban'}

Traceback (most recent call last):
  File "/usr/bin/scrapy", line 10, in <module>
    sys.exit(execute())
  File "/usr/lib64/python2.7/site-packages/scrapy/cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/lib64/python2.7/site-packages/scrapy/cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "/usr/lib64/python2.7/site-packages/scrapy/cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "/usr/lib64/python2.7/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/lib64/python2.7/site-packages/scrapy/crawler.py", line 171, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "/usr/lib64/python2.7/site-packages/scrapy/crawler.py", line 200, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "/usr/lib64/python2.7/site-packages/scrapy/crawler.py", line 205, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "/usr/lib64/python2.7/site-packages/scrapy/crawler.py", line 55, in __init__
    self.extensions = ExtensionManager.from_crawler(self)
  File "/usr/lib64/python2.7/site-packages/scrapy/middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/lib64/python2.7/site-packages/scrapy/middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "/usr/lib64/python2.7/site-packages/scrapy/utils/misc.py", line 44, in load_object
    mod = import_module(module)
  File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/usr/lib64/python2.7/site-packages/scrapy/extensions/memusage.py", line 16, in <module>
    from scrapy.mail import MailSender
  File "/usr/lib64/python2.7/site-packages/scrapy/mail.py", line 25, in <module>
    from twisted.internet import defer, reactor, ssl
  File "/usr/lib64/python2.7/site-packages/twisted/internet/ssl.py", line 230, in <module>
    from twisted.internet._sslverify import (
  File "/usr/lib64/python2.7/site-packages/twisted/internet/_sslverify.py", line 14, in <module>
    from OpenSSL._util import lib as pyOpenSSLlib
ImportError: No module named _util

It seems _util isn't a standalone module? I can't install it separately. How do I fix this?
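One way to narrow this down is to probe the failing import directly, outside of Scrapy. A minimal sketch (the helper name has_openssl_util is mine, not from the thread); OpenSSL._util only exists in the cryptography-based rewrite of pyOpenSSL, which the pyOpenSSL 0.13.1 shown in the version log predates:

```python
import importlib


def has_openssl_util():
    """Return True if the installed pyOpenSSL exposes OpenSSL._util.

    Twisted's twisted.internet._sslverify imports OpenSSL._util, so if
    this returns False, any Scrapy feature touching SSL will fail with
    the ImportError seen in the traceback above.
    """
    try:
        importlib.import_module("OpenSSL._util")
        return True
    except ImportError:
        return False


print(has_openssl_util())
```

If this prints False, the problem is the pyOpenSSL installation rather than anything in the spider code.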

1 Answer

You are running under Python 2, and the pyOpenSSL that ships with it (0.13.1, per the version log above) predates the OpenSSL._util module that Twisted tries to import. Try running the spider with Python 3, which comes with a modern pyOpenSSL.
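Before re-running the crawl, it is worth confirming which interpreter Scrapy will actually use. A minimal sketch (the helper name interpreter_version is mine); note that Python 2 support was deprecated in the Scrapy 1.x line and removed entirely in Scrapy 2.0:

```python
import sys


def interpreter_version():
    """Return the (major, minor) version of the running interpreter."""
    return sys.version_info[:2]


major, minor = interpreter_version()
if major < 3:
    # This is the situation from the question: /usr/bin/scrapy is wired
    # to the system Python 2.7 on CentOS 7.
    print("Running under Python %d.%d -- switch to Python 3" % (major, minor))
else:
    print("Running under Python %d.%d" % (major, minor))
```

Running this with the same interpreter that launches scrapy (for example via python -c or at the top of the spider) tells you whether the python3 switch actually took effect.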

