Scrapy-Splash: Failed to run docker container with scrapinghub/splash:latest as base image

I am building a Python Scrapy application that uses some Azure services and Scrapy-Splash. I tried creating a Docker image for the application with scrapinghub/splash:latest as the base image on my local Windows machine.
Below is the Dockerfile I am using:
FROM scrapinghub/splash:latest
WORKDIR /usr/src/snapshot
ADD requirements.txt ./
RUN pip install -r requirements.txt
ADD . ./
EXPOSE 8888 80
ENTRYPOINT ["/usr/src/snapshot/init_container.sh"]
The init_container.sh file contains the command that runs the application: python /usr/src/snapshot/SiteCrawler.py
Now when I run the image with the command docker run testsnapshot:0.1, the application starts and then stops with an import error: ImportError: No module named azure.servicebus
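Since the same requirements.txt works under python:3.6.6 but not under the Splash base image, one thing worth checking is whether the interpreter that runs SiteCrawler.py is the same one pip installed into (an image may ship more than one Python, so `pip install` and `python` can point at different interpreters). A minimal check you could run inside the container, a sketch only:

```python
import importlib.util
import sys

def module_available(name: str) -> bool:
    """True if `name` is importable by the interpreter running this script."""
    return importlib.util.find_spec(name) is not None

if __name__ == "__main__":
    # Which interpreter is actually executing? If it differs from the one
    # pip targeted during the build, installed packages won't be visible.
    print(sys.executable)
    print(module_available("json"))   # stdlib module: always True
    print(module_available("azure"))  # True only if installed for THIS python
```

If `azure` shows as unavailable here but pip reported a successful install at build time, the two commands are targeting different interpreters; pinning both (e.g. `python3 -m pip install ...` and running with `python3`) would rule that out.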
When I instead build the image with python:3.6.6 as the base image, it works fine. In both cases the image builds and pip installs the packages from requirements.txt without errors.
Here is my requirements.txt:
asn1crypto==0.24.0
attrs==18.2.0
Automat==0.7.0
azure-common==1.1.16
azure-nspkg==3.0.2
azure-servicebus==0.21.1
azure-storage==0.36.0
certifi==2018.10.15
cffi==1.11.5
chardet==3.0.4
constantly==15.1.0
cryptography==2.3.1
cssselect==1.0.3
hyperlink==18.0.0
idna==2.7
incremental==17.5.0
lxml==4.2.5
parsel==1.5.0
pip==18.0
pyasn1==0.4.4
pyasn1-modules==0.2.2
pycparser==2.19
PyDispatcher==2.0.5
PyHamcrest==1.9.0
pyOpenSSL==18.0.0
python-dateutil==2.7.3
queuelib==1.5.0
requests==2.20.0
Scrapy==1.5.1
scrapy-splash==0.7.2
service-identity==17.0.0
setuptools==39.0.1
six==1.11.0
Twisted==16.1.1
urllib3==1.24
w3lib==1.19.0
zope.interface==4.5.0
python docker scrapy azureservicebus scrapy-splash
Got it working. I just added VOLUME ["/usr/src/snapshot"] to the Dockerfile. But now I face a different issue: the Splash URL is not reachable from my Scrapy spider. I tried both http://127.0.0.1:8050 and http://0.0.0.0:8050
– sadiqmc
2 days ago
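On the unreachable Splash URL: inside a container, 127.0.0.1 refers to that container itself, and overriding the image's ENTRYPOINT means the Splash server never starts, so nothing is listening on port 8050 there. One common alternative (a sketch only; the network and container names below are placeholders, not from the question) is to run Splash and the crawler as two containers on a shared Docker network and point SPLASH_URL at the Splash container by name:

```shell
# Create a user-defined network so containers can reach each other by name.
docker network create crawl-net

# Run Splash with its default entrypoint intact (it listens on 8050).
docker run -d --name splash --network crawl-net scrapinghub/splash:latest

# Run the crawler on the same network; the Splash URL uses the container
# name, not 127.0.0.1 (which would point back at the crawler container).
docker run --rm --network crawl-net \
    -e SPLASH_URL=http://splash:8050 \
    testsnapshot:0.1
```

This assumes the spider reads its Splash endpoint from configuration (here a hypothetical SPLASH_URL environment variable); the key point is that the endpoint must name the Splash container, not localhost.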
asked Nov 8 at 8:33 by sadiqmc