Resource u'tokenizers/punkt/english.pickle' not found
My code:

    import nltk.data
    tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')

Error message:

    [ec2-user@ip-172-31-31-31 sentiment]$ python mapper_local_v1.0.py
    Traceback (most recent call last):
      File "mapper_local_v1.0.py", line 16, in <module>
        tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')
      File "/usr/lib/python2.6/site-packages/nltk/data.py", line 774, in load
        opened_resource = _open(resource_url)
      File "/usr/lib/python2.6/site-packages/nltk/data.py", line 888, in _open
        return find(path_, path + ['']).open()
      File "/usr/lib/python2.6/site-packages/nltk/data.py", line 618, in …
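For context, here is a minimal sketch of how the punkt data is usually made available before calling nltk.data.load. It assumes the EC2 host has network access and that NLTK's default data directory (e.g. ~/nltk_data) is acceptable; the exact cause behind the truncated traceback above is not confirmed here.

    # Sketch: fetch the punkt tokenizer models so nltk.data.load can resolve
    # 'tokenizers/punkt/english.pickle'. nltk.download('punkt') stores the data
    # in a directory on nltk.data.path (assumption: default download location).
    import nltk
    nltk.download('punkt')

    import nltk.data
    tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')
    print(tokenizer.tokenize("Hello world. This is a test."))

If the data lives in a non-default location, appending that directory to nltk.data.path (a plain Python list) before the load call is another option.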