Learn JAX with Real Code Examples

Updated Nov 24, 2025

JAX provides a NumPy-compatible API with hardware acceleration on CPU, GPU, and TPU.

It offers automatic differentiation of arbitrary Python functions through `grad`, one of its composable function transformations alongside `vmap`, `jit`, and `pmap`.

JAX enables composable transformations like vectorization, parallelization, and just-in-time compilation, making it ideal for research and large-scale ML experiments.
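As a minimal sketch of how these transformations compose (the `loss` function and its input values are illustrative, not part of any library API):

```python
import jax
import jax.numpy as jnp

# A plain Python function written against the NumPy-like jax.numpy API.
def loss(w):
    return jnp.sum(jnp.tanh(w) ** 2)

# Transformations compose: differentiate the function, then JIT-compile the gradient.
grad_loss = jax.jit(jax.grad(loss))

w = jnp.array([0.5, -1.0, 2.0])
print(loss(w))       # scalar loss value
print(grad_loss(w))  # gradient with the same shape as w
```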

Core Features

Autograd for forward- and reverse-mode differentiation

JIT compilation for CPU/GPU/TPU performance

Vectorized operations over batches with `vmap`

Parallel computation across devices with `pmap`

Random number generation with functional, reproducible API
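A sketch of that functional random-number API (the seed and shapes are illustrative): explicit keys replace global RNG state, so the same key always reproduces the same draws.

```python
import jax

# Keys are explicit values; the same key always produces the same numbers.
key = jax.random.PRNGKey(0)

# Split keys instead of mutating global state, so results stay reproducible.
key, subkey = jax.random.split(key)
x = jax.random.normal(subkey, shape=(3, 2))

key, subkey = jax.random.split(key)
y = jax.random.uniform(subkey, shape=(3,))

print(x)
print(y)
```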

Basic Concepts Overview

Array: JAX’s primary data structure, similar to NumPy arrays

grad: computes derivatives of functions automatically

jit: compiles Python functions for optimized execution

vmap: vectorizes functions over batch dimensions

pmap: parallelizes functions across multiple devices
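A small sketch tying these concepts together; the toy model, batch shapes, and values are illustrative assumptions, and `pmap` appears only in a comment because it needs more than one device:

```python
import jax
import jax.numpy as jnp

def predict(w, x):
    # Toy per-example model: scalar prediction from a weight vector and one input vector.
    return jnp.dot(w, x)

def loss(w, x, y):
    return (predict(w, x) - y) ** 2

# grad: derivative of the scalar loss with respect to the first argument, w.
grad_loss = jax.grad(loss)

# vmap: vectorize the per-example loss over a batch of (x, y) pairs.
batched_loss = jax.vmap(loss, in_axes=(None, 0, 0))

# jit: compile the batched computation for the available backend (CPU/GPU/TPU).
mean_loss = jax.jit(lambda w, xs, ys: jnp.mean(batched_loss(w, xs, ys)))

w = jnp.ones(3)
xs = jnp.arange(12.0).reshape(4, 3)   # batch of 4 input vectors
ys = jnp.array([1.0, 2.0, 3.0, 4.0])

print(grad_loss(w, xs[0], ys[0]))
print(mean_loss(w, xs, ys))
# pmap mirrors vmap across devices, e.g. jax.pmap(loss, in_axes=(None, 0, 0)),
# when jax.device_count() > 1.
```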

Project Structure

scripts/ - JAX model and function scripts

datasets/ - input data for training or simulations

notebooks/ - exploratory computation and experimentation

models/ - saved parameters or checkpoints

logs/ - performance metrics and experiment tracking

Typical Workflow

Define functions using JAX-compatible NumPy API

Apply `grad` for automatic differentiation

Use `jit` to compile functions for hardware acceleration

Vectorize computations over batches with `vmap`

Parallelize computations across devices with `pmap` for scalability
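Putting the workflow together, a minimal end-to-end sketch of gradient descent on a toy linear model (the synthetic data, learning rate, and step count are illustrative assumptions):

```python
import jax
import jax.numpy as jnp

# 1. Define the model and loss with the JAX-compatible NumPy API.
def predict(params, xs):
    w, b = params
    return xs @ w + b

def loss(params, xs, ys):
    return jnp.mean((predict(params, xs) - ys) ** 2)

# 2./3. Apply grad for differentiation and jit to compile the update step.
@jax.jit
def update(params, xs, ys, lr=0.1):
    grads = jax.grad(loss)(params, xs, ys)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

# Synthetic data for y = 2*x0 - x1 + 0.5.
key = jax.random.PRNGKey(0)
xs = jax.random.normal(key, (64, 2))
ys = xs @ jnp.array([2.0, -1.0]) + 0.5

params = (jnp.zeros(2), jnp.zeros(()))
for _ in range(200):
    params = update(params, xs, ys)

print(params)  # weights near [2, -1], bias near 0.5
```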

Use Cases by Difficulty

Beginner: basic array operations and gradients

Intermediate: JIT compilation and batched operations

Advanced: multi-device parallelization with `pmap`

Expert: custom differentiable functions and research pipelines (see the `custom_vjp` sketch after this list)

Enterprise: scaling scientific computing or ML models on TPU clusters
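For the expert case, a sketch of a custom differentiable function using `jax.custom_vjp`; the `softplus` example and its hand-written gradient rule are illustrative:

```python
import jax
import jax.numpy as jnp

# Define a function with a hand-written backward (VJP) rule.
@jax.custom_vjp
def softplus(x):
    return jnp.log1p(jnp.exp(x))

def softplus_fwd(x):
    # Forward pass: return the output plus residuals needed by the backward pass.
    return softplus(x), x

def softplus_bwd(x, g):
    # Backward pass: d/dx softplus(x) = sigmoid(x), scaled by the incoming cotangent g.
    return (g * jax.nn.sigmoid(x),)

softplus.defvjp(softplus_fwd, softplus_bwd)

print(jax.grad(softplus)(2.0))  # ~0.8808, i.e. sigmoid(2.0)
```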

Comparisons

JAX vs NumPy: JAX adds autograd, JIT, vmap, pmap, and GPU/TPU support (see the sketch after this list)

JAX vs TensorFlow: JAX is functional, research-focused, flexible; TensorFlow has higher-level APIs and production tools

JAX vs PyTorch: JAX uses functional programming and composable transformations; PyTorch is imperative and popular for production

JAX vs NumPy+Autograd: JAX is faster and supports hardware acceleration and composable transforms

JAX vs MATLAB: JAX is Python-first, differentiable, and GPU/TPU compatible
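A short sketch of the JAX vs NumPy comparison above: the array API mirrors NumPy, but JAX arrays are immutable and functions become differentiable. The values are illustrative.

```python
import numpy as np
import jax
import jax.numpy as jnp

x_np = np.arange(4.0)
x_jx = jnp.arange(4.0)

# NumPy mutates in place; JAX arrays are immutable and use functional updates.
x_np[0] = 10.0
x_jx = x_jx.at[0].set(10.0)

# Only the JAX version can be differentiated and JIT-compiled.
f = lambda x: jnp.sum(jnp.sin(x))
print(jax.grad(f)(x_jx))  # elementwise cos(x)
```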

Versioning Timeline

2018 – Initial release by Google Research

2019 – Added `vmap` for vectorization and `pmap` for multi-device parallelism

2020 – Stable releases with enhanced JIT compilation

2021 – Full TPU support and Flax integration

2025 – Latest releases with an expanded scientific computing ecosystem and further GPU/TPU optimizations

Glossary

JAX: high-performance numerical computing library

grad: automatic differentiation function

jit: just-in-time compilation

vmap: vectorized map for batch processing

pmap: parallel map across multiple devices