Error running Scrapy after installing Docker

I want to crawl dynamic content using Scrapy. I read online that I need to install Docker for this. But after installing it, I always get an error when I run:

scrapy runspider example.py

or any other Scrapy command. I then uninstalled Docker, but the error still appears. This is the error:

(error screenshot not available)

Then I tried to install pypiwin32, and that fails with an error too:

(error screenshot not available)

    How can I solve this?

One solution, collected from the web:

    Maybe it’s just a permission-denied error, because pip is trying to write files into a directory that requires more privileges.

    Try running your console with administrator privileges and executing:

    pip install pypiwin32
    

    Then try running your script again, without admin privileges.
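If elevating the console is not possible, another common workaround (not part of the original answer, so treat it as a suggestion) is a per-user install, which writes packages into a directory the current user already owns:

```shell
# Install into the per-user site-packages instead of the system-wide
# directory, so no administrator privileges are needed
pip install --user pypiwin32
```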


    Edit:

    If you don’t want to run cmd as admin, you could try something like:

    python -m pip install pypiwin32
    

    But I’m not sure if it will work.
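Running pip via `python -m pip` also guards against a related problem: the `pip` on your PATH may belong to a different Python installation than the one running Scrapy, in which case the package lands in the wrong place. Assuming `python` is on your PATH, you can check whether the two match:

```shell
# Print the interpreter that "python" resolves to
python -c "import sys; print(sys.executable)"

# Print the pip version and, in parentheses, the Python it belongs to;
# it should reference the same installation as the line above
python -m pip --version
```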
