Chen Kai Blog
  • HOME
  • ARCHIVES
  • CATEGORIES
  • TAGS
  • Essence of Linear Algebra (18): Frontiers and Summary

    We've completed a long journey through linear algebra. From vectors and matrices to eigenvalue decomposition, SVD, tensor analysis, and applications in machine learning and deep learning — each chapter has revealed the remarkable universality of this discipline. Now, let's turn our gaze to the frontiers: quantum computing, graph neural networks, large language models, and other technologies that are changing the world. These fields may seem mysterious, but their core remains the familiar linear algebra we've studied.

    2019-03-30 · Mathematics > Linear Algebra
    Tags: Linear Algebra, quantum computing
    Read more
  • Essence of Linear Algebra (17): Linear Algebra in Computer Vision

    The core mission of computer vision is to make machines "see" and understand images and videos. Remarkably, this entire field is built almost entirely on linear algebra: images themselves are matrices, geometric transformations are matrix multiplications, camera imaging is projective transformation, and 3D reconstruction is solving systems of linear equations. Master linear algebra, and you master the mathematical core of computer vision.

    2019-03-26 · Mathematics > Linear Algebra
    Tags: Linear Algebra, 3D reconstruction
    Read more
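    The excerpt's claim that "geometric transformations are matrix multiplications" can be made concrete with a minimal sketch (the square's corner coordinates are made up for illustration): rotating points in the plane is a single matrix product.

    ```python
    import numpy as np

    # A 2D rotation is just a matrix multiplication: rotated = R @ points.
    # Illustrative example: rotate the corners of a unit square by 90 degrees.
    theta = np.pi / 2
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    corners = np.array([[0.0, 1.0, 1.0, 0.0],   # x coordinates
                        [0.0, 0.0, 1.0, 1.0]])  # y coordinates
    rotated = R @ corners                       # all four corners at once
    ```

    The same idea scales up: scaling, shearing, and (in homogeneous coordinates) translation and camera projection are all matrix products applied to columns of points.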
  • Essence of Linear Algebra (16): Linear Algebra in Deep Learning

    At its core, deep learning is just large-scale matrix computation. Whether it's the simplest fully connected network or the complex Transformer architecture, linear algebra is the mathematical foundation that powers everything. Understanding this connection will help you debug models, optimize performance, and design more efficient network architectures.

    2019-03-22 · Mathematics > Linear Algebra
    Tags: Linear Algebra, neural networks
    Read more
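    "Deep learning is just large-scale matrix computation" is easy to verify on the smallest case. A hedged sketch of one fully connected layer (the layer sizes and random weights here are purely illustrative):

    ```python
    import numpy as np

    # One fully connected layer is literally y = ReLU(W @ x + b).
    # Shapes are illustrative: 4 input features, 3 hidden units.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((3, 4))   # weight matrix
    b = np.zeros(3)                   # bias vector
    x = rng.standard_normal(4)        # input vector

    h = np.maximum(0, W @ x + b)      # the entire layer: matmul + nonlinearity
    ```

    Stacking such layers, and batching inputs as columns of a matrix, is all a "forward pass" is.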
  • Essence of Linear Algebra (15): Linear Algebra in Machine Learning

    If you ask a machine learning engineer "what math do you use most often?", the answer is almost certainly linear algebra. From vector representations of data, to forward propagation in neural networks, to matrix factorization in recommender systems — linear algebra is the "native language" of machine learning. This chapter starts from intuition and dives deep into the linear algebra principles behind these core algorithms.

    2019-03-18 · Mathematics > Linear Algebra
    Tags: SVM, PCA, Linear Algebra, machine learning
    Read more
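    As a taste of the post's PCA tag: the whole algorithm is a few lines of linear algebra. A minimal sketch on made-up random data (centering, covariance, eigendecomposition, projection):

    ```python
    import numpy as np

    # PCA as pure linear algebra: center the data, eigendecompose the
    # covariance matrix, project onto the top eigenvector. Data are synthetic.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 3))      # 100 samples, 3 features
    Xc = X - X.mean(axis=0)                # center each feature
    cov = Xc.T @ Xc / (len(X) - 1)         # covariance matrix
    vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    top = vecs[:, -1]                      # first principal component
    projected = Xc @ top                   # 1-D representation of each sample
    ```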
  • Essence of Linear Algebra (14): Random Matrix Theory

    When you fill a huge matrix with random numbers and compute its eigenvalues, something magical happens: the distribution of these eigenvalues exhibits stunning regularity. It is like finding order in chaos, hearing music in noise. Random matrix theory tells us that when dimensions are high enough, randomness itself gives rise to profound mathematical structure.

    2019-03-14 · Mathematics > Linear Algebra
    Tags: Linear Algebra, random matrices, Wigner semicircle law, Marchenko-Pastur distribution
    Read more
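    The "stunning regularity" the excerpt describes is easy to see for yourself. A minimal sketch of the Wigner semicircle phenomenon (matrix size and seed are arbitrary): the eigenvalues of a large symmetric random matrix, suitably scaled, settle into the interval [-2, 2].

    ```python
    import numpy as np

    # Eigenvalues of a large scaled symmetric random matrix fill [-2, 2]
    # with a semicircular density (Wigner's semicircle law).
    rng = np.random.default_rng(42)
    n = 500
    A = rng.standard_normal((n, n))
    S = (A + A.T) / np.sqrt(2 * n)     # symmetrize and scale by 1/sqrt(n)
    eigs = np.linalg.eigvalsh(S)       # all real, since S is symmetric
    # eigs.min() and eigs.max() land close to -2 and +2 for large n
    ```

    Plotting a histogram of `eigs` shows the semicircle shape; no individual entry "knows" about it, yet it appears every time.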
  • Essence of Linear Algebra (13): Tensors and Multilinear Algebra

    If you have worked with deep learning, you have certainly encountered the word "tensor" — PyTorch calls it torch.Tensor, and TensorFlow literally has "Tensor" in its name. But what exactly is a tensor? Why do deep learning frameworks use this physics-sounding term?

    This chapter starts from the familiar concepts of scalars, vectors, and matrices, and guides you to understand the essence of tensors: they are simply arrays generalized to arbitrary dimensions. We will see how tensor operations naturally describe multidimensional data in images, videos, and recommender systems, and how decomposition techniques like CP and Tucker help us compress and understand these high-dimensional structures.

    2019-03-09 · Mathematics > Linear Algebra
    Tags: Linear Algebra, tensor decomposition, deep learning, recommender systems
    Read more
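    "Arrays generalized to arbitrary dimensions" can be shown in a few lines. A minimal sketch (the video dimensions are made up): a color video clip is an order-4 tensor, and "unfolding" it into a matrix is the basic move behind CP and Tucker decompositions.

    ```python
    import numpy as np

    # A tensor is an array with more axes. A color video clip, for instance,
    # is an order-4 tensor: (frames, height, width, channels).
    video = np.zeros((30, 64, 64, 3))   # 30 frames of 64x64 RGB (illustrative)

    # "Unfolding" (matricization) flattens all axes but one, turning the
    # tensor into an ordinary matrix that SVD-style tools can handle.
    unfolded = video.reshape(30, -1)    # 30 x (64*64*3)
    ```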
  • Essence of Linear Algebra (12): Sparse Matrices and Compressed Sensing

    Sparsity is a ubiquitous feature of nature and data. Compressed sensing exploits this property to recover signals from far fewer measurements than classical sampling theory requires: a genuine "less is more" miracle.

    2019-03-04 · Mathematics > Linear Algebra
    Tags: Linear Algebra, sparsity
    Read more
  • Essence of Linear Algebra (11): Matrix Calculus and Optimization

    When you adjust the shower water temperature, you're essentially doing the same thing as training a neural network — adjusting "parameters" (knob position) based on current "error" (water too cold or too hot). The only difference is that neural networks have millions of parameters, and the mathematical tool for adjusting them is matrix calculus.

    2019-02-28 · Mathematics > Linear Algebra
    Tags: Linear Algebra, gradients
    Read more
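    The "adjust the knob based on the error" loop from the excerpt is gradient descent, and matrix calculus supplies the adjustment direction in one line. A minimal sketch on a small least-squares problem (the matrix, learning rate, and iteration count are illustrative):

    ```python
    import numpy as np

    # For f(x) = ||Ax - b||^2, matrix calculus gives grad f = 2 A^T (Ax - b).
    # Gradient descent repeatedly steps against that gradient.
    A = np.array([[2.0, 0.0],
                  [0.0, 1.0]])
    b = np.array([2.0, 3.0])
    x = np.zeros(2)
    lr = 0.1                             # the size of each "knob adjustment"
    for _ in range(200):
        grad = 2 * A.T @ (A @ x - b)     # one line of matrix calculus
        x -= lr * grad                   # adjust parameters against the error
    # x converges to the solution of Ax = b, here [1, 3]
    ```

    A neural network does exactly this, except x holds millions of parameters and the gradient comes from backpropagation.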
  • Essence of Linear Algebra (10): Matrix Norms and Condition Numbers

    The condition number tells us: how "dangerous" is it to solve this linear system?

    In numerical computing, there's a problem that haunts countless engineers and scientists: the equations are correct, the algorithm is correct, so why are the computed results completely wrong? The answer often lies hidden in a concept called the condition number. The condition number is like a "health check report" for a linear system — it tells you how "sensitive" the system is, and whether tiny input errors will be amplified into catastrophic output errors. To understand condition numbers, we first need to figure out how to measure the "size" of vectors and matrices, which is exactly what norms solve.

    2019-02-23 · Mathematics > Linear Algebra
    Tags: Linear Algebra, numerical stability
    Read more
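    The "tiny input errors amplified into catastrophic output errors" phenomenon is easy to demonstrate. A minimal sketch with a nearly singular matrix (the numbers are chosen for illustration):

    ```python
    import numpy as np

    # A nearly singular system: the rows are almost parallel, so the
    # condition number is huge and solutions are extremely sensitive.
    A = np.array([[1.0, 1.0],
                  [1.0, 1.0001]])
    kappa = np.linalg.cond(A)          # on the order of 4e4

    b = np.array([2.0, 2.0001])
    x1 = np.linalg.solve(A, b)         # solution: [1, 1]

    b2 = b + np.array([0.0, 0.0001])   # perturb one entry by 0.0001
    x2 = np.linalg.solve(A, b2)        # solution jumps to [0, 2]
    ```

    A relative input change of about 5e-5 moved the solution by order 1: exactly the amplification the condition number predicts.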
  • Essence of Linear Algebra (9): Singular Value Decomposition

    SVD (Singular Value Decomposition) is hailed as the "crown jewel" of linear algebra — it can decompose any matrix, not just square or symmetric ones. From image compression to Netflix recommendation algorithms, from face recognition to gene analysis, SVD is everywhere. Understanding SVD means mastering one of the most powerful mathematical tools in data science.

    2019-02-17 · Mathematics > Linear Algebra
    Tags: PCA, SVD, Linear Algebra, singular value decomposition, dimensionality reduction
    Read more
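    "It can decompose any matrix, not just square or symmetric ones" is worth seeing directly. A minimal sketch on a made-up non-square matrix, including the rank-1 truncation that underlies SVD image compression:

    ```python
    import numpy as np

    # SVD factors ANY matrix: A = U @ diag(s) @ Vt, square or not.
    rng = np.random.default_rng(7)
    A = rng.standard_normal((5, 3))              # a non-square matrix
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    A_full = U @ np.diag(s) @ Vt                 # exact reconstruction
    A_rank1 = s[0] * np.outer(U[:, 0], Vt[0])    # keep only the top singular value
    # A_rank1 is the best rank-1 approximation of A (Eckart-Young theorem)
    ```

    Keeping only the largest singular values is precisely how SVD compresses images and powers recommendation-style matrix factorizations.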
© 2020 - 2026  Chen Kai