Advanced Linux AI Development & Containerization
Master advanced Linux development tools, Python environments, Docker containerization, and professional AI deployment workflows on Linux systems.
Core Skills
Fundamental abilities you'll develop
- Implement Docker containerization for AI applications
- Build professional AI development workflows
Learning Goals
What you'll understand and learn
- Master Python environment management with conda and pip
Practical Skills
Hands-on techniques and methods
- Configure advanced Linux development environments for AI
- Deploy AI models using Linux server environments
Advanced Content Notice
This lesson covers advanced concepts and techniques. A strong grounding in AI fundamentals and intermediate topics is recommended.
Advanced Linux AI Development & Containerization
Master advanced Linux development tools, Python environments, Docker containerization, and professional AI deployment workflows on Linux systems.
Tier: Advanced
Difficulty: Advanced
Learning Objectives
- Configure advanced Linux development environments for AI
- Master Python environment management with conda and pip
- Implement Docker containerization for AI applications
- Deploy AI models using Linux server environments
- Build professional AI development workflows
Linux AI Development Tools & Environment
Setting Up Linux AI Development Tools
Package Management Mastery
Advanced APT Usage
# Update package database
sudo apt update
# Show package information
apt show python3-pip
# Search for packages
apt search "machine learning"
# List installed packages
apt list --installed | grep python
# Remove packages completely
sudo apt purge package-name
sudo apt autoremove
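Beyond the commands above, it is often useful to see exactly what a package put on disk and to pin a package at its current version; the snippet below is an optional sketch using `python3-pip` as the example package.

```bash
# List the files installed by a package
dpkg -L python3-pip

# Find out which package owns a given file
dpkg -S /usr/bin/pip3

# Hold a package at its current version, and release the hold later
sudo apt-mark hold python3-pip
sudo apt-mark unhold python3-pip
```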
Essential AI Development Packages
# System dependencies for AI development
sudo apt install -y \
build-essential \
software-properties-common \
python3-dev \
python3-pip \
python3-venv \
libblas-dev \
liblapack-dev \
libatlas-base-dev \
gfortran \
libhdf5-dev \
libssl-dev \
libffi-dev \
libjpeg-dev \
libpng-dev \
libfreetype6-dev \
pkg-config
# Additional utilities
sudo apt install -y \
htop \
btop \
ncdu \
tree \
jq \
tmux \
screen \
neofetch \
speedtest-cli
Python Environment Management
Install Miniforge (Recommended over Anaconda)
# Download Miniforge installer
wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh
# Make executable and install
chmod +x Miniforge3-Linux-x86_64.sh
./Miniforge3-Linux-x86_64.sh
# Follow installation prompts, then restart shell
source ~/.bashrc
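A quick sanity check confirms that conda is on your PATH; disabling auto-activation of the base environment is optional but keeps new shells clean.

```bash
# Verify the installation and list existing environments
conda --version
conda info --envs

# Optional: stop the base environment from auto-activating in every new shell
conda config --set auto_activate_base false
```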
Create AI Development Environment
# Create conda environment for AI development
conda create -n ai-dev python=3.11 -y
conda activate ai-dev
# Install essential AI packages
conda install -y \
numpy \
pandas \
matplotlib \
seaborn \
scikit-learn \
jupyter \
jupyterlab \
ipython
# Install PyTorch (CPU version for now)
conda install pytorch torchvision torchaudio cpuonly -c pytorch
# Install additional packages with pip
pip install \
transformers \
openai \
anthropic \
langchain \
streamlit \
fastapi \
uvicorn \
requests \
beautifulsoup4 \
selenium
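Once the environment works, it is worth capturing it so a teammate or a container build can reproduce it exactly; a minimal sketch (the file names are just common conventions):

```bash
# Snapshot the conda environment (packages and channels, without build strings)
conda env export --no-builds > environment.yml

# Or capture only the pip-installed packages
pip freeze > requirements.txt

# Recreate the environment on another machine from the snapshot
conda env create -f environment.yml
```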
GPU Support Setup (If Available)
# Check for NVIDIA GPU
nvidia-smi
# Install CUDA toolkit (if GPU available)
wget https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-wsl-ubuntu.pin
sudo mv cuda-wsl-ubuntu.pin /etc/apt/preferences.d/cuda-repository-pin-600
sudo apt-key adv --fetch-keys https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/3bf863cc.pub
sudo add-apt-repository "deb https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/ /"
sudo apt update
sudo apt install -y cuda-toolkit-12-2
# Install PyTorch with CUDA support
conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
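After installing the CUDA-enabled build, a quick check inside the activated `ai-dev` environment confirms whether PyTorch can actually see the GPU:

```bash
# Report the PyTorch version and whether CUDA is usable
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"

# Print the detected GPU name, if any
python -c "import torch; print(torch.cuda.get_device_name(0) if torch.cuda.is_available() else 'no GPU detected')"
```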
Development Tools Configuration
Configure Zsh with Oh My Zsh (Optional but Recommended)
# Install Zsh
sudo apt install -y zsh
# Install Oh My Zsh
sh -c "$(curl -fsSL https://raw.github.com/ohmyzsh/ohmyzsh/master/tools/install.sh)"
# Install useful plugins
git clone https://github.com/zsh-users/zsh-autosuggestions ~/.oh-my-zsh/custom/plugins/zsh-autosuggestions
git clone https://github.com/zsh-users/zsh-syntax-highlighting ~/.oh-my-zsh/custom/plugins/zsh-syntax-highlighting
# Edit ~/.zshrc to enable plugins
plugins=(git zsh-autosuggestions zsh-syntax-highlighting python conda-env)
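After enabling the plugins, you can make Zsh your login shell and start a fresh session so the changes take effect (optional):

```bash
# Make Zsh the default login shell for the current user
chsh -s "$(which zsh)"

# Start a new Zsh session to pick up the plugin changes
exec zsh
```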
Configure Git for AI Projects
# Advanced Git configuration for AI development
git config --global core.editor "code --wait"
git config --global merge.tool vscode
git config --global mergetool.vscode.cmd 'code --wait "$MERGED"'
git config --global diff.tool vscode
git config --global difftool.vscode.cmd 'code --wait --diff "$LOCAL" "$REMOTE"'
# Git LFS for large model files
sudo apt install -y git-lfs
git lfs install
# Track common AI file types with LFS
git lfs track "*.pkl"
git lfs track "*.h5"
git lfs track "*.pth"
git lfs track "*.bin"
git lfs track "*.safetensors"
git lfs track "*.onnx"
# Add .gitattributes
echo "*.pkl filter=lfs diff=lfs merge=lfs -text" >> .gitattributes
echo "*.h5 filter=lfs diff=lfs merge=lfs -text" >> .gitattributes
echo "*.pth filter=lfs diff=lfs merge=lfs -text" >> .gitattributes
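With tracking in place, a typical first commit looks like the following; `model.pth` is only a placeholder name used for illustration:

```bash
# Commit the LFS configuration together with a (hypothetical) model file
git add .gitattributes model.pth
git commit -m "Track model weights with Git LFS"

# Confirm which files are actually stored via LFS
git lfs ls-files
```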
Jupyter Lab Configuration
# Generate Jupyter config
jupyter lab --generate-config
# Install useful extensions
pip install \
jupyterlab-git \
jupyterlab-lsp \
jupyter-ai \
nbdime \
jupyterlab_code_formatter
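With the extensions installed, Jupyter Lab can be launched for browser access; the port shown is just a common default and can be changed.

```bash
# Start Jupyter Lab without auto-opening a browser (handy on WSL2 or remote servers)
jupyter lab --no-browser --port 8888
```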
System Monitoring Script
# Create a simple system performance monitoring script
cat > ~/monitor_system.sh << 'EOF'
#!/bin/bash
echo "=== System Performance Monitor ==="
echo
echo "CPU Usage:"
top -bn1 | grep "Cpu(s)" | sed "s/.*, *\([0-9.]*\)%* id.*/\1/" | awk '{print 100 - $1"%"}'
EOF
chmod +x ~/monitor_system.sh
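The script above only reports CPU usage; if you want a fuller picture, it can be extended along these lines (a sketch, not part of the original script):

```bash
# Append memory and disk checks to the monitoring script
cat >> ~/monitor_system.sh << 'EOF'
echo
echo "Memory Usage:"
free -h
echo
echo "Disk Usage:"
df -h /
EOF

# Run the monitor
~/monitor_system.sh
```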
---
## Docker Containerization for AI Applications
Master containerization to create reproducible, scalable AI applications that run consistently across development, testing, and production environments.
## Why Docker for AI Development?
### Reproducibility Challenges in AI
- Environment Dependencies: Complex Python packages, CUDA versions, system libraries
- Version Conflicts: Different projects requiring different package versions
- Deployment Consistency: "Works on my machine" syndrome
- Collaboration Issues: Team members with different OS and configurations
### Docker Solutions
- Isolated Environments: Each container runs independently
- Consistent Deployment: Same environment from development to production
- Easy Scaling: Horizontal scaling across multiple servers
- Version Control: Infrastructure as code with Dockerfiles
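To make the "infrastructure as code" point concrete, here is a minimal sketch of a Dockerfile for a Python AI service; the file names (`requirements.txt`, `app.py`) and the FastAPI/uvicorn entry point are illustrative assumptions, not something this lesson prescribes.

```bash
# Sketch: generate a minimal Dockerfile for a Python AI service (illustrative names)
cat > Dockerfile << 'EOF'
FROM python:3.11-slim

WORKDIR /app

# Install pinned dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY app.py .

# Launch a (hypothetical) FastAPI app with uvicorn
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8000"]
EOF
```

Building the image with `docker build -t ai-app .` and running it with `docker run --rm -p 8000:8000 ai-app` gives every machine the same environment, which is exactly the reproducibility benefit listed above.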
## Docker Installation and Setup
### Install Docker on WSL2
```bash
# Update package database
sudo apt update
# Install prerequisites
sudo apt install -y \
apt-transport-https \
ca-certificates \
curl \
gnupg \
lsb-release
# Add Docker's official GPG key
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
# Add Docker repository
echo "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Install Docker Engine
sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin
# Add user to docker group (avoid sudo)
sudo usermod -aG docker $USER
# Restart shell or run:
newgrp docker
# Test installation
docker --version
docker run hello-world
```
### Configure Docker for AI Development
```bash
# Create Docker daemon configuration for better performance
sudo mkdir -p /etc/docker
sudo tee /etc/docker/daemon.json <<EOF
{
"storage-driver": "overlay2
---
Master Advanced AI Concepts
You're working with cutting-edge AI techniques. Continue your advanced training to stay at the forefront of AI technology.