An Obsidian-style knowledge base for the Deep Learning Do It Yourself course, generated from 35 video lecture transcripts.
```
transcripts/
├── knowledge_base/            # 318 Obsidian-style concept notes
├── knowledge_base_graph.html  # interactive graph visualization
└── transcript_to_module.json  # transcript filename → module ID mapping
```
An Obsidian-compatible vault of 318 concept notes covering 20 lecture modules, 9 practicals, and 6 bonus lectures. Each note contains:
- YAML frontmatter — aliases, source modules, tags
- Summary — short explanation with [[wikilinks]] to related concepts
- Professor's quotes — exact words with timestamps, organized by module
- See also — links to related concepts
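The note layout above is simple enough to parse with a few lines of Python. A minimal sketch (this is an illustration, not the project's actual tooling; it handles only flat `key: value` frontmatter lines, not full YAML):

```python
import re

def parse_note(text: str) -> dict:
    """Split a note into frontmatter fields, wikilinks, and body.
    Minimal sketch: assumes flat `key: value` frontmatter lines."""
    frontmatter, body = {}, text
    if text.startswith("---\n") and "\n---" in text[4:]:
        end = text.index("\n---", 4)  # closing fence of the YAML block
        for line in text[4:end].splitlines():
            key, sep, value = line.partition(":")
            if sep:
                frontmatter[key.strip()] = value.strip()
        body = text[end + 4:].lstrip("\n")
    # Capture the link target only, dropping any |alias part.
    wikilinks = re.findall(r"\[\[([^\]|]+)", body)
    return {"frontmatter": frontmatter, "wikilinks": wikilinks, "body": body}
```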
| Metric | Value |
|---|---|
| Total notes | 318 |
| Multi-module concepts | 46 (appear in 2+ modules) |
| Most connected note | training loop (6 modules, 20 wikilinks) |
| Total links | 1,506 |
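Link counts like those in the table can be recomputed directly from the note texts. A minimal sketch (the `notes` dict shape is an assumption; the real vault stores one `.md` file per note):

```python
import re
from collections import Counter

# Match the link target only, stopping before any |alias or closing ]].
WIKILINK = re.compile(r"\[\[([^\]|]+)")

def link_stats(notes: dict[str, str]) -> tuple[int, Counter]:
    """notes maps note name -> markdown text.
    Returns the total outgoing-link count and a per-note Counter."""
    per_note = Counter(
        {name: len(WIKILINK.findall(text)) for name, text in notes.items()}
    )
    return sum(per_note.values()), per_note
```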
Example note:

```markdown
# softmax

The softmax function converts a vector of raw scores ([[logits]]) into a
[[probability distribution]]. It is the standard output activation for
multi-class [[classification]].

## What the professor says

### Module 3 — Loss Functions for Classification

> [29:12 → 29:58] "This function is called the softmax function. Why is
> that? It's because if all the theta values here are large, you see that
> this softmax function will concentrate on the maximum. It is a soft
> version of the max function."

## See also

- [[cross-entropy]] — the loss function paired with softmax
- [[sigmoid]] — the binary classification equivalent
- [[logits]] — the raw network outputs before softmax
```

- Obsidian — open `knowledge_base/` as a vault in Obsidian to explore the graph visually
- Interactive graph — open `knowledge_base_graph.html` in your browser
This knowledge base is designed to be used as a resource by dataflowr-tools. Two MCP tools can leverage it:
- `search_knowledge_base(query)` — fuzzy-match against note names, aliases, and tags
- `get_knowledge_note(concept)` — fetch a specific note for the AI tutor to read
This enables queries like "What did the professor say about softmax?" to return a focused ~1 KB note with exact quotes and timestamps, instead of dumping a 47 KB raw transcript into the LLM context.
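The fuzzy-match step can be sketched with the standard library's `difflib`. This is a stand-in, not the dataflowr-tools implementation; the `index` shape (note name mapped to its searchable keys: name, aliases, tags) is an assumption:

```python
from difflib import SequenceMatcher

def search_knowledge_base(
    query: str, index: dict[str, list[str]], top_k: int = 3
) -> list[str]:
    """Rank note names by the best fuzzy match between the query
    and any of the note's searchable keys."""
    def best_score(keys: list[str]) -> float:
        return max(
            SequenceMatcher(None, query.lower(), k.lower()).ratio()
            for k in keys
        )
    return sorted(index, key=lambda name: best_score(index[name]),
                  reverse=True)[:top_k]
```

Ranking by the best score over all keys means a typo-laden query like `"softmx"` still resolves to the `softmax` note via its name or aliases.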
See LICENSE.