I have two Python files, a.py and b.py; the directory structure is like this:
files (dir)
- main (dir)
  - a.py
  - Dockerfile
- b.py
In a.py, there is some code like this:
import sys
sys.path.append('..')  # parent dir, where b.py lives
import b
It runs fine from the command line, but fails when run with Docker. Here are the Dockerfile and the commands for building and running the image:
The Dockerfile:
FROM python:3.6
ADD a.py /.
WORKDIR /.
ENV PYTHONPATH /files/
CMD [ "python3", "a.py" ]
Commands for building the image:
# cd /files/main
# docker build -t a:1.0 .
The image was built successfully. The command for running it:
# docker run --name a a:1.0
It gives me:
Traceback (most recent call last):
  File "a.py", line 3, in <module>
    import b
ModuleNotFoundError: No module named 'b'
My question is: given this layout, how can I build and run the image correctly?
b.py isn't inside the container, because you're only copying one script and not the whole directory. You need to copy at least every module you want to import.
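Here is a minimal sketch of one way to fix it, assuming you keep the layout from the question: build from the files directory so that both scripts are in the build context, and recreate that layout inside the image. The Dockerfile still lives at files/main/Dockerfile:

FROM python:3.6
# Recreate the host layout inside the image: b.py in /files, a.py in /files/main
COPY b.py /files/
COPY main/a.py /files/main/
# sys.path.append('..') resolves relative to the working directory,
# so a.py must run from /files/main; PYTHONPATH /files also covers "import b"
WORKDIR /files/main
ENV PYTHONPATH /files
CMD [ "python3", "a.py" ]

Then build from the parent directory, pointing -f at the Dockerfile while keeping . (the files dir) as the context, and run as before:

# cd /files
# docker build -t a:1.0 -f main/Dockerfile .
# docker run --name a a:1.0

Alternatively, you can copy the whole directory in one go (COPY . /files) instead of listing files individually; the two COPY lines above just keep the image minimal.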