Jeff Liu Lab
Loss Functions

Table of Contents
How to Read This Report
1. Cross-Dimensional Alignment of Five Core Concepts
   1.1 Full Concept Comparison Matrix
   1.2 Hierarchical Relationships: The Containment Structure of the Five Concepts
   1.3 Practical Example: All Five Concepts in a Single Project
2. Loss Functions from Interdisciplinary Perspectives
   2.1 The Statistical Perspective: Prior Assumptions About Noise Distributions
   2.2 The Information-Theoretic Perspective: "Communication Cost" Between Distributions
   2.3 The Physics Perspective: Energy Equilibria and Potential Energy Surfaces
   2.4 The Economics and Decision Theory Perspective: Regret and Utility
3. Comprehensive Summary of Loss Functions by Task
   3.1 Regression Tasks: Pursuing Accuracy in Continuous Values
   3.2 Classification Tasks: Pursuing Overlap of Probability Distributions
   3.3 Metric Learning and Geometric Spaces: Pursuing Clustering in Embeddings
4. Modern Frontier Loss Functions
   4.1 Preference Alignment for Large Language Models (LLMs)
   4.2 Self-Supervised Learning (SSL)
   4.3 Loss Functions for Generative Models
5. Engineering Implementation: Framework Differences and Common Pitfalls
   5.1 Logits vs. Probabilities: The Most Common Fatal Error
   5.2 Reduction Strategies
   5.3 Numerical Stability Best Practices
6. Selection Guide and Summary
   6.1 Decision Tree: How to Choose a Loss Function
   6.2 Core Philosophy Summary
   6.3 The Essence in One Sentence
7. References and Further Reading
   Foundational Theory
   Classic Papers
   Frontier Alignment Research

© 2026 Jeff Liu Lab. All rights reserved.
