TheCopy and ThePaste is an entry-level space to share ideas. It is intended to hold simple
experiments/examples, or maybe just thoughts, related to ML/AI and information technologies in
general. The main purpose is to spread some concepts and apply them to the real world.
I try to stay up to date and to provide modern solutions for the tasks proposed, so the
solutions presented here use recent technologies (at least at the time they were written).
I understand that your time is valuable and you don't want to waste it. Therefore, most of
the entries won't take you more than 10 minutes, and they contain all the code necessary to
reproduce them. Nonetheless, when an entry is too extensive, I highlight the important parts
and refer you to the source code.
So, please, if you're interested in any post, I encourage you to fork/clone the repository from
GitHub → [thecopy-and-thepaste].
All in all, I appreciate you giving me some time. And thanks for coming here.
Consider using a [venv] to keep your system unmodified (sometimes you will have to install many
packages with pip or conda). By using a venv, all dependencies stay isolated inside that
environment.
To create a virtual environment, you can do as follows:
python3 -m venv path/to/virtual/environment
Then, activate it and install the dependencies, with:
source path/to/virtual/environment/bin/activate
pip install -r requirements.txt
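As a side note, two standard pip/venv commands you may find handy (nothing specific to this
repository):

# list what is currently installed inside the venv
pip freeze

# leave the virtual environment when you are done
deactivate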
[Docker] is a platform that offers you a set of tools to isolate a software unit. It can be an
app you developed yourself or an environment to test something. It works based on containers
that package all the resources needed for a specific task.
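For instance, just to see that isolation in action, you can start a throwaway container from a
public image (the image tag here is only an example, not something the entries depend on):

# run an interactive shell in a disposable Python container, removed on exit
docker run --rm -it python:3.10 bash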
In some entries, Docker is used to build the complete environment with its dependencies.
You can download/install Docker Engine for your OS from the
[following link].
Besides, I encourage you to download/install [docker-compose], which lets you create and
configure a complete environment using YAML. It's also useful when multiple containers need to
interact with each other.
One of the drawbacks of Docker is that you need to "enter" the container to modify something.
You can achieve this with:
docker exec -it [container] [command]
To mitigate this, I recommend sharing the code directory with the host. So, the workaround
you'll see here is a docker-compose file that already shares it. A common template is the
following:
version: '2'
services:
  name_of_service:
    container_name: name_of_container
    build:
      context: image_to_build
    ports:
      - host_port:container_port
    volumes:
      - ./code:/lib/code_environment/code
    entrypoint: entry_point
    tty: true
Note the volume sharing in:
volumes:
  - ./code:/lib/code_environment/code
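With that template in place, a typical workflow (using the service name from the template above)
looks like this:

# build the image and start the service in the background
docker-compose up --build -d

# open a shell inside the running container
docker-compose exec name_of_service bash

# any change you make in ./code on the host is immediately
# visible in /lib/code_environment/code inside the container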
Finally, to present some steps more interactively, and to better depict the results obtained,
I rely on [Jupyter notebook/lab], a web application for creating documents that mix
markdown and code blocks.
For a virtual environment or a Docker image
To install it, with your virtualenv activated or inside your Docker container, use:
pip install jupyterlab
Then, to launch it:
jupyter lab --ip=0.0.0.0 --port=9000
By default, if no port is set, it will start on port 8888. The Jupyter server uses the working
directory as its base path.
Using docker-compose
For this scenario, you don't have to modify anything. Each docker-compose file will be based on
a Jupyter-ready image; I only add Jupyter as the entrypoint:
entrypoint: jupyter notebook --ip=0.0.0.0 --allow-root
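Putting it together with the template above, a minimal sketch of such a service could look like
the following; the jupyter/minimal-notebook image, the host path, and the port mapping are only
assumptions for illustration:

version: '2'
services:
  notebook:
    container_name: notebook
    image: jupyter/minimal-notebook   # assumption: any Jupyter-ready image works here
    ports:
      - 8888:8888
    volumes:
      - ./code:/home/jovyan/code      # assumption: container path chosen for this image
    entrypoint: jupyter notebook --ip=0.0.0.0 --allow-root
    tty: true

Here image: replaces the build: section of the template, since a ready-made Jupyter image is
pulled instead of built.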
It's possible that some entries need a specific setup, but don't worry, I make sure it won't be
like dragging a dead deer up a hill.
On the other hand, some entries just won't need any "tailored" dependency. In those cases, I provide the link to the proper [Google colab] notebook.