This intermediate-level course introduces the mathematical foundations needed to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We'll cover basic statistics of data sets, such as mean values and variances; compute distances and angles between vectors using inner products; and derive orthogonal projections of data onto lower-dimensional subspaces. Using these tools, we'll then derive PCA as the method that minimizes the average squared reconstruction error between data points and their reconstructions.
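The idea in the description above can be sketched in a few lines of NumPy. This is an illustrative example, not course material: it assumes the standard derivation in which the principal subspace is spanned by the top eigenvectors of the data covariance matrix, and the names (`B`, `codes`) are hypothetical.

```python
import numpy as np

# Illustrative sketch: PCA as the orthogonal projection that minimizes
# the average squared reconstruction error.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))           # 200 data points in 5 dimensions

# 1. Center the data (subtract the mean of each feature).
mean = X.mean(axis=0)
X_centered = X - mean

# 2. Eigendecomposition of the covariance matrix.
cov = np.cov(X_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

# 3. Keep the k eigenvectors with the largest eigenvalues.
k = 2
B = eigvecs[:, -k:]                     # orthonormal basis of the principal subspace

# 4. Orthogonal projection onto the subspace, then reconstruction.
codes = X_centered @ B                  # lower-dimensional coordinates
X_reconstructed = codes @ B.T + mean

# Average squared reconstruction error over all data points.
error = np.mean(np.sum((X - X_reconstructed) ** 2, axis=1))
```

For this projection the error equals (up to the 1/n vs. 1/(n-1) normalization) the sum of the discarded eigenvalues, which is exactly why picking the largest eigenvalues minimizes it.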
- 5 stars: 51.02%
- 4 stars: 22.60%
- 3 stars: 12.83%
- 2 stars: 6.69%
- 1 star: 6.83%
Top reviews from MATHEMATICS FOR MACHINE LEARNING: PCA
This course was definitely a bit more complex than the others in the specialisation, not so much in the assignments as in the core concepts covered. Overall, it was fun to do this course!
The programming assignments are quite challenging. The teaching doesn't equip you with enough NumPy resources to get full marks in the programming assignments. Good teaching, though.
This course is well worth the time. I have a better understanding of one of the most foundational and biologically plausible machine learning algorithms used today! Love it.
Definitely the most challenging of the courses making up this specialization. Finishing it with full scores is proportionally far more satisfying!!! Well done Marc!
About the Mathematics for Machine Learning Specialization
What level of programming is required to do this course?
How difficult is this course in comparison to the other two courses in this specialization?