Error running Scrapy after installing Docker

I want to crawl dynamic content using Scrapy. I read on the internet that I must install Docker. But after installing it, I always get an error when I run:

scrapy runspider example.py

or any other Scrapy command. Then I uninstalled Docker, but the error is still shown. This is the error:
[screenshot of the error]

Then I tried to install pypiwin32, and there was an error there too: [screenshot of the pip error]

How can I solve this?
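For reference, here is a minimal sketch of what an example.py runnable with scrapy runspider might look like; the asker's actual spider is not shown, so the start URL and selector below are placeholders.

# Hypothetical example.py -- the real spider from the question is not shown.
# `scrapy runspider example.py` executes the Spider subclass defined in the file.
import scrapy


class ExampleSpider(scrapy.Spider):
    name = "example"
    # Placeholder start URL; replace with the site you actually want to crawl.
    start_urls = ["https://example.com"]

    def parse(self, response):
        # Yield one item per page; adjust the selector to the target site.
        yield {"title": response.css("title::text").get()}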

One solution:

    Maybe it is just a permission error: pip is trying to write files into a directory that requires more privileges.

    Try running your console with administrator privileges and executing:

    pip install pypiwin32
    

    Then try again to run your script, without admin privileges.
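    As a quick sanity check (not part of the original answer), you can confirm that the Windows bindings are now importable; pypiwin32 is essentially a thin compatibility package whose real payload is pywin32, so after a successful install the win32api module should resolve:

    # Sanity check (sketch): pypiwin32's real payload is pywin32, so a
    # successful install should make the win32api module importable.
    try:
        import win32api
        print("pywin32 is available:", win32api.__file__)
    except ImportError as exc:
        print("pywin32 still missing:", exc)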


    Edit:

    If you don’t want to run cmd as admin, you could try something like:

    python -m pip install pypiwin32
    

    But I’m not sure if it will work.
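    Another way to avoid an elevated console (not mentioned in the original answer, just a standard pip option) is a per-user install, which writes into your user site-packages directory instead of a system location:

    python -m pip install --user pypiwin32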
