I ran into a problem when creating a project with Scrapy by executing the following command.

scrapy startproject myproject
Error message:
Traceback (most recent call last):
  File "/mnt/c/ubuntu_home/scraping/bin/scrapy", line 8, in <module>
    sys.exit(execute())
  File "/mnt/c/ubuntu_home/scraping/lib/python3.6/site-packages/scrapy/cmdline.py", line 146, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/mnt/c/ubuntu_home/scraping/lib/python3.6/site-packages/scrapy/cmdline.py", line 100, in _run_print_help
    func(*a, **kw)
  File "/mnt/c/ubuntu_home/scraping/lib/python3.6/site-packages/scrapy/cmdline.py", line 154, in _run_command
    cmd.run(args, opts)
  File "/mnt/c/ubuntu_home/scraping/lib/python3.6/site-packages/scrapy/commands/startproject.py", line 102, in run
    self._copytree(self.templates_dir, abspath(project_dir))
  File "/mnt/c/ubuntu_home/scraping/lib/python3.6/site-packages/scrapy/commands/startproject.py", line 78, in _copytree
    self._copytree(srcname, dstname)
  File "/mnt/c/ubuntu_home/scraping/lib/python3.6/site-packages/scrapy/commands/startproject.py", line 78, in _copytree
    self._copytree(srcname, dstname)
  File "/mnt/c/ubuntu_home/scraping/lib/python3.6/site-packages/scrapy/commands/startproject.py", line 78, in _copytree
    self._copytree(srcname, dstname)
  File "/mnt/c/ubuntu_home/scraping/lib/python3.6/site-packages/scrapy/commands/startproject.py", line 81, in _copytree
    copystat(src, dst)
  File "/usr/lib/python3.6/shutil.py", line 229, in copystat
    _copyxattr(src, dst, follow_symlinks=follow)
  File "/usr/lib/python3.6/shutil.py", line 165, in _copyxattr
    os.setxattr(dst, name, value, follow_symlinks=follow_symlinks)
PermissionError: [Errno 13] Permission denied: '/mnt/c/ubuntu_home/myproject/module/spiders/__pycache__'
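As the last frames of the traceback show, `shutil.copystat()` copies permission bits, timestamps, and (on Linux) extended attributes, and it is the extended-attribute step, `os.setxattr()`, that raised `PermissionError`. This can happen on filesystems without full xattr support, such as a Windows drive mounted at /mnt/c under WSL. As a minimal sketch of what `copystat()` does when it succeeds (on a native Linux filesystem; the file names here are illustrative):

```python
import os
import shutil
import tempfile

# Set up two empty files with different permission bits.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "src.txt")
dst = os.path.join(tmp, "dst.txt")
open(src, "w").close()
open(dst, "w").close()
os.chmod(src, 0o640)

# copystat copies mode bits, timestamps, and extended attributes from
# src to dst. The xattr step (os.setxattr) is what failed in the
# traceback above when the destination was on /mnt/c.
shutil.copystat(src, dst)
print(oct(os.stat(dst).st_mode & 0o777))  # 0o640, copied from src
```

Scrapy's `_copytree` calls `copystat()` on every directory it creates, which is why project generation aborts partway through and leaves the incomplete tree shown below.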

The myproject directory was partially created, so when I checked it with tree, the output was as follows.

└── module
    ├── items.py.tmpl
    ├── middlewares.py.tmpl
    ├── pipelines.py.tmpl
    ├── settings.py.tmpl
    └── spiders
        ├── __init__.py
        └── __pycache__
3 directories, 5 files

The tree is supposed to look like this:

myproject/
├── myproject
│   ├── __init__.py
│   ├── items.py
│   ├── middlewares.py
│   ├── pipelines.py
│   ├── settings.py
│   └── spiders
│       └── __init__.py
└── scrapy.cfg
What I did

I looked at various sites to find out why this happened, but I couldn't find a solution. I'd be grateful if you could tell me how to get the tree that was supposed to be created.

Addition (Environment)

I am using Ubuntu 18.04 in a WSL environment.

  • Answer #1

Install Scrapy with administrator privileges using sudo pip install scrapy, then run sudo scrapy startproject myproject.
