01. What is Deep Learning
02. What is a Neural Network
03. Supervised Learning with Neural Networks
04. Drivers Behind the Rise of Deep Learning
05. Binary Classification in Deep Learning
06. Logistic Regression
07. Logistic Regression Cost Function
08. Gradient Descent
09. Derivatives
10. Derivatives Examples
11. Computation Graph
12. Derivatives with a Computation Graph
13. Logistic Regression Derivatives
14. Gradient Descent on m Training Examples
15. Vectorization
16. More Vectorization Examples
17. Vectorizing Logistic Regression
18. Vectorizing Logistic Regression's Gradient Computation
19. Broadcasting in Python
20. Python/NumPy
21. Jupyter/IPython
22. Logistic Regression Cost Function Explanation
23. Neural Network Overview
24. Neural Network Representation
25. Computing a Neural Network's Output
26. Vectorizing Across Multiple Training Examples
27. Vectorized Implementation Explanation
28. Activation Functions
29. Why Non-Linear Activation Function
30. Derivatives of Activation Functions
...
58. Exponentially Weighted Averages
59. Understanding Exponentially Weighted Averages
60. Bias Correction in Exponentially Weighted Average
61. Gradient Descent with Momentum
62. RMSprop
63. Adam Optimization Algorithm
64. Learning Rate Decay
65. The Problem of Local Optima
66. Tuning Process
67. Right Scale for Hyperparameters
68. Hyperparameter Tuning in Practice: Panda vs. Caviar
69. Batch Norm
70. Fitting Batch Norm into a Neural Network
71. Why Does Batch Norm Work
72. Batch Norm at Test Time
73. Softmax Regression
74. Training a Softmax Classifier
75. Deep Learning Frameworks
76. TensorFlow
77. Why ML Strategy
78. Orthogonalization
79. Single Number Evaluation Metric
80. Satisficing and Optimizing Metrics
81. Train/Dev/Test Distributions
82. Size of Dev and Test Sets
83. When to Change Dev/Test Sets and Metrics
84. Why Human-Level Performance
85. Avoidable Bias
86. Understanding Human-Level Performance
87. Surpassing Human-Level Performance
88. Improving Your Model Performance
89. Carrying Out Error Analysis
90. Cleaning Up Incorrectly Labeled Data
91. Build Your First System Quickly, Then Iterate
92. Training and Testing on Different Distributions
93. Bias and Variance with Mismatched Data Distributions
94. Addressing Data Mismatch
95. Transfer Learning
96. Multi-Task Learning
97. End-to-End Deep Learning
98. Whether to Use End-to-End Learning
Are these the lecture recordings of Andrew Ng's course?