Guides
Long-form guides on the math, tools, and ideas we use to build AI systems at Ephizen. Read them in any order — each one is self-contained and shows its prerequisites up front.
Foundations
Mathematics (4 guides)
The calculus you need to understand what backprop is actually doing. Derivatives, gradients, and the chain rule — the rest you can Google.
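A quick way to see the chain rule at work is to check an analytic derivative against a finite-difference estimate, the same trick used to gradient-check backprop implementations. The function below is an arbitrary illustration:

```python
import math

# Chain rule on f(x) = sin(x**2): df/dx = cos(x**2) * 2x
# (outer derivative evaluated at the inner function, times the inner derivative).

def f(x):
    return math.sin(x ** 2)

def df_analytic(x):
    # Chain rule: cos of the inner function, times the inner derivative 2x.
    return math.cos(x ** 2) * 2 * x

def df_numeric(x, h=1e-6):
    # Central finite difference: nudge x both ways and measure the slope.
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.3
print(df_analytic(x), df_numeric(x))  # the two estimates should closely agree
```

If the two numbers disagree, either the analytic derivative is wrong or the step size is off; that disagreement is exactly what a gradient check in a backprop implementation flags.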
The bits of linear algebra you actually need to read ML papers and debug models. Vectors, matrices, dot products, eigenvalues — tied to embeddings, PCA, and attention.
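As a taste of the embeddings connection: cosine similarity, the standard way to compare embeddings, is just a normalized dot product. The 3-d vectors below are made up for illustration; real embeddings have hundreds of dimensions:

```python
import numpy as np

# Toy "embeddings" -- invented 3-d vectors, purely illustrative.
cat = np.array([0.9, 0.1, 0.0])
dog = np.array([0.8, 0.2, 0.1])
car = np.array([0.0, 0.1, 0.9])

def cosine(a, b):
    # Dot product of the vectors, divided by the product of their lengths.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine(cat, dog))  # high: the vectors point in similar directions
print(cosine(cat, car))  # near zero: the vectors are nearly orthogonal
```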
The probability you actually need to reason about ML models — not the full textbook course, just the parts that show up in loss functions, sampling, and evaluation.
The statistics you need to trust your eval numbers — hypothesis testing, confidence intervals, and the fact that one run is never enough.
Data Structures & Algorithms (2 guides)
How to reason about the cost of code without running it. The single most reused idea from DSA in day-to-day ML engineering.
The data structure ML engineers use more than any other — for deduping data, counting features, caching embeddings, and 80% of interview problems.
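As a sketch of the first two use cases, deduping and counting are each a one-pass loop over a hash-based container, giving average-case O(1) lookups per item. The data here is a toy list:

```python
# Toy records standing in for rows of training data.
records = ["cat", "dog", "cat", "fish", "dog", "cat"]

# Dedupe while preserving order: a set gives O(1) membership checks.
seen = set()
deduped = []
for r in records:
    if r not in seen:
        seen.add(r)
        deduped.append(r)

# Count feature values: a dict gives O(1) average-case increments.
counts = {}
for r in records:
    counts[r] = counts.get(r, 0) + 1

print(deduped)  # ['cat', 'dog', 'fish']
print(counts)   # {'cat': 3, 'dog': 2, 'fish': 1}
```

The same patterns drop into real pipelines with record hashes as keys, or via `collections.Counter` for the counting half.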
Python (2 guides)
NumPy is the substrate every ML framework is built on. Vectorized operations, broadcasting, axis semantics — the stuff that makes the difference between fast code and slow code.
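A minimal example of broadcasting and axis semantics together: standardizing the columns of a small matrix without an explicit loop. The matrix is arbitrary:

```python
import numpy as np

# Three samples, two features.
X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])

mu = X.mean(axis=0)    # axis=0 reduces over rows: one mean per column, shape (2,)
sigma = X.std(axis=0)  # likewise, one std per column

# Broadcasting: a (3, 2) array combined with (2,) arrays replicates the
# per-column stats across every row -- no Python loop needed.
Z = (X - mu) / sigma

print(Z.mean(axis=0))  # ~[0. 0.]
print(Z.std(axis=0))   # ~[1. 1.]
```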
The table library every data scientist reaches for. Learn the 20 operations that cover 95% of your data-cleaning life.
Core ML
Classical ML (2 guides)
The winner on tabular data for the last decade. How it works, which library to pick, and the three mistakes everyone makes tuning it.
The simplest useful classifier, and a surprisingly strong baseline for most problems. Understand it deeply and you understand half of classical ML.
LLMs & Generative AI (2 guides)
When RAG isn't enough and you actually need to teach the model something new. How to decide, how to do it, and what it'll cost you.
Give an LLM fresh, private, or domain-specific knowledge at query time by retrieving relevant chunks and stuffing them into the prompt.
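The retrieve-then-stuff loop can be sketched without a real LLM or vector database. Here retrieval is naive word overlap rather than embedding search, and every name and string below is illustrative:

```python
import re

# A few stopwords so generic query words don't dominate the overlap score.
STOPWORDS = {"what", "is", "the", "a", "an", "of"}

def tokens(text):
    # Lowercase and keep alphabetic runs only, so "policy?" matches "policy".
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(query, chunks, k=2):
    # Score each chunk by word overlap with the query, ignoring stopwords,
    # and keep the top-k chunks that matched at all. A real system would
    # score with embedding similarity instead.
    q = tokens(query) - STOPWORDS
    scored = sorted(((len(q & tokens(c)), c) for c in chunks),
                    key=lambda t: -t[0])
    return [chunk for score, chunk in scored if score > 0][:k]

chunks = [
    "Refund policy: purchases can be returned within 30 days.",
    "The office is closed on public holidays.",
    "Shipping takes five business days.",
]

query = "What is the refund policy?"
context = "\n".join(retrieve(query, chunks))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```

Swap the overlap scorer for an embedding index and send `prompt` to a model, and this is the whole RAG pattern: the model never retrains, it just reads fresh context at query time.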