
Scrapy [twisted] CRITICAL: Unhandled error in Deferred

Date: 2017-07-17 23:27 | Author: admin

Today I was installing Scrapy on CentOS 6.7 (64-bit), but it failed with the error in the title.
Python version: 2.7.11
`pip list` output:
beautifulsoup4 (4.4.1)
cffi (1.4.2)
characteristic (14.3.0)
cryptography (1.1.2)
cssselect (0.9.1)
enum34 (1.1.2)
idna (2.0)
ipaddress (1.0.16)
lxml (3.5.0)
meld3 (1.0.2)
pip (7.1.2)
pyasn1 (0.1.9)
pyasn1-modules (0.0.8)
pycparser (2.14)
pyOpenSSL (0.15.1)
queuelib (1.4.2)
Scrapy (1.0.4)
service-identity (14.0.0)
setuptools (19.2)
six (1.10.0)
supervisor (3.2.0)
Twisted (15.5.0)
w3lib (1.13.0)
web.py (0.37)
wheel (0.26.0)
zope.interface (4.1.3)
Following the official tutorial, I created a new project:
1. `scrapy startproject tutorial`
2. Edit tutorial/items.py:

import scrapy

class DmozItem(scrapy.Item):
    title = scrapy.Field()
    link = scrapy.Field()
    desc = scrapy.Field()

3. Edit tutorial/tutorial/spiders/dmoz_spider.py:

import scrapy

class DmozSpider(scrapy.Spider):
    name = "dmoz"
    allowed_domains = ["dmoz.org"]
    start_urls = [
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/",
        "http://www.dmoz.org/Computers/Programming/Languages/Python/Resources/"
    ]

    def parse(self, response):
        # Use the second-to-last URL path segment as the output filename
        filename = response.url.split("/")[-2] + '.html'
        with open(filename, 'wb') as f:
            f.write(response.body)
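As a quick sanity check on the filename logic in `parse()`, here is a standalone sketch (plain Python, no Scrapy needed) using one of the tutorial's start URLs. Because the URL ends with a trailing slash, the last element of the split is an empty string, so `[-2]` picks the final path segment:

```python
# Standalone check of the filename derivation used in parse().
url = "http://www.dmoz.org/Computers/Programming/Languages/Python/Books/"

# split("/") -> ['http:', '', 'www.dmoz.org', ..., 'Books', '']
# [-2] is the last real path segment because of the trailing slash.
filename = url.split("/")[-2] + ".html"
print(filename)  # Books.html
```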

4. Run `scrapy crawl dmoz` from the project root, and it fails:

2016-01-07 16:10:42 [scrapy] INFO: Scrapy 1.0.4 started (bot: tutorial)
2016-01-07 16:10:42 [scrapy] INFO: Optional features available: ssl, http11
2016-01-07 16:10:42 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'tutorial.spiders', 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial'}
2016-01-07 16:10:42 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, LogStats, CoreStats, SpiderState
Unhandled error in Deferred:
2016-01-07 16:10:42 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 153, in crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/twisted/internet/defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
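The traceback above is cut off before it reaches the actual cause. In my experience, this startup-time "Unhandled error in Deferred" on Scrapy 1.0 with an old pyOpenSSL/cryptography stack usually hides an import failure in one of Scrapy's dependencies. A way to surface the real exception is to import the likely suspects directly; this is a diagnostic sketch, not a confirmed fix for this particular machine:

```python
# Diagnostic sketch (assumption: the Deferred error masks an import failure
# in a Scrapy dependency). Importing each suspect directly prints either its
# version or the underlying exception that Twisted swallowed.
import importlib

for mod in ("OpenSSL", "twisted", "cryptography", "lxml"):
    try:
        m = importlib.import_module(mod)
        print(mod, getattr(m, "__version__", "unknown"))
    except Exception as exc:  # a failure here is the real error
        print(mod, "FAILED:", exc)
```

If one of these imports raises, upgrading that package (e.g. `pip install -U pyOpenSSL cryptography`) is the usual next step on CentOS 6 systems.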
