More often than not, I find myself setting up a new Python virtual environment with the same packages required for my basic data analysis workflow. It’s always the same steps:
- Create a virtualenv
- Source the new venv
- `pip install` the same packages
- Create a new notebook for Jupyter
- Import the same few packages
- Finally start the work
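The routine above can be sketched as shell commands. Note that the exact package list is illustrative; the post doesn’t name the specific packages installed:

```shell
# A sketch of the manual routine. The packages here are examples only;
# substitute your own usual set.
setup_env() {
    python3 -m venv venv              # 1. create a virtualenv
    source venv/bin/activate          # 2. source the new venv
    pip install jupyterlab pandas     # 3. install the same packages
    jupyter lab                       # 4. then create a notebook and import
}
```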
Sometimes I do these steps multiple times per day, and over time that adds up to quite a lot of time spent on non-productive, repetitive work.
So I automated the first five steps listed above.
TL;DR: You can find the code here.
I made a git repo with all the necessary files for this process and wrote a bash-script that does all the installing.
```shell
❯ tree -L 1
.
├── LICENSE
├── README.md
├── install.sh
├── requirements.txt
├── starter.ipynb
└── venv
```
```shell
#! /bin/bash

# First create a python3 virtualenv
python3 -m venv venv

# Then source the newly created env
source venv/bin/activate

# Next, install the requirements
pip install -r requirements.txt

echo 'Done!'
echo 'Run `source venv/bin/activate`'
echo 'And after that `jupyter lab`'
```
Now I could just clone the repo, run the script and start the actual work.
However, the new process has only a few steps fewer than the original one.
To make this better (i.e. faster), I wrote a script that I can simply curl and run.
```shell
#! /bin/bash

# clone from github to current directory
# TODO make the dir an argument to be passed in
git clone git@github.com:juhosa/jupyter-lab-template.git .

# run the installer
./install.sh

# remove the .git directory
rm -rf .git/
```
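The script’s TODO, taking the target directory as an argument, could be sketched as follows. The function name and the default of the current directory are my own choices, not something from the repo:

```shell
# Hypothetical variant of the clone-and-install script: take the target
# directory as the first argument, defaulting to the current directory.
clone_and_install() {
    local target="${1:-.}"
    mkdir -p "$target"
    git clone git@github.com:juhosa/jupyter-lab-template.git "$target"
    cd "$target" || return 1
    ./install.sh
    rm -rf .git/
}
```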
Now when I run

```shell
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/juhosa/jupyter-lab-template/main/clone_and_install.sh)"
```
I’ll get a basic setup in no time compared to the original. Sweet! 🎉
Thinking about this, I’ll probably make an alias for the setup-script call. That way I could just call a short command and not have to worry about finding the clone-and-install script.
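One way to set that up, assuming a bash shell (the alias name `jlab-init` is a placeholder of my own, not from the post):

```shell
# Hypothetical alias; put this in ~/.bashrc or ~/.zshrc. The single quotes
# defer the curl command substitution until the alias is actually used.
alias jlab-init='/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/juhosa/jupyter-lab-template/main/clone_and_install.sh)"'
```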