Deep Learning (深度学习)

Uploader: 春风得意遇知音

01. What is Deep Learning
02. What is a Neural Network
03. Supervised Learning with Neural Networks
04. Drivers Behind the Rise of Deep Learning
05. Binary Classification in Deep Learning
06. Logistic Regression
07. Logistic Regression Cost Function
08. Gradient Descent
09. Derivatives
10. Derivatives Examples
11. Computation Graph
12. Derivatives with a Computation Graph
13. Logistic Regression Derivatives
14. Gradient Descent on m Training Examples
15. Vectorization
16. More Vectorization Examples
17. Vectorizing Logistic Regression (see the sketch after this list)
18. Vectorizing Logistic Regression's Gradient Computation
19. Broadcasting in Python
20. Python-Numpy
21. Jupyter-iPython
22. Logistic Regression Cost Function Explanation
23. Neural Network Overview
24. Neural Network Representation
25. Computing a Neural Network's Output
26. Vectorizing Across Multiple Training Examples
27. Vectorized Implementation Explanation
28. Activation Functions
29. Why Non-Linear Activation Functions
30. Derivatives of Activation Functions
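
As a companion to lectures 15–19 above, here is a minimal NumPy sketch of one vectorized gradient-descent step for logistic regression: forward pass, cross-entropy cost, gradients, and parameter update, with the scalar bias broadcast across all m examples. It follows the course's w, b, X, Y shape conventions, but the function name and the learning-rate default are illustrative assumptions, not code from the course.

```python
import numpy as np

def sigmoid(z):
    """Element-wise logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_regression_step(w, b, X, Y, learning_rate=0.01):
    """One vectorized gradient-descent step over all m training examples.

    Assumed shapes (course convention): X is (n_x, m), Y is (1, m),
    w is (n_x, 1), b is a scalar.
    """
    m = X.shape[1]
    # Forward pass: Z = w^T X + b; the scalar b is broadcast across all m columns.
    Z = np.dot(w.T, X) + b
    A = sigmoid(Z)
    # Cross-entropy cost, averaged over the m examples.
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    # Backward pass: vectorized gradients of the cost.
    dZ = A - Y                  # shape (1, m)
    dw = np.dot(X, dZ.T) / m    # shape (n_x, 1)
    db = np.sum(dZ) / m         # scalar
    # Gradient-descent update.
    w = w - learning_rate * dw
    b = b - learning_rate * db
    return w, b, cost
```

Calling this function in a loop over many iterations (or once per mini-batch) is the vectorized counterpart of the explicit for-loop over examples discussed in lecture 14.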

… (lectures 31–57 not listed)


58. Exponentially Weighted Averages (see the sketches after this list)
59. Understanding Exponentially Weighted Averages
60. Bias Correction in Exponentially Weighted Average
61. Gradient Descent with Momentum
62. RMSprop
63. Adam Optimization Algorithm
64. Learning Rate Decay
65. The Problem of Local Optima
66. Tuning Process
67. Right Scale for Hyperparameters
68. Hyperparameters Tuning in Practice: Panda vs. Caviar
69. Batch Norm
70. Fitting Batch Norm into a Neural Network
71. Why Does Batch Norm Work
72. Batch Norm at Test Time
73. Softmax Regression (see the sketches after this list)
74. Training a Softmax Classifier
75. Deep Learning Frameworks
76. TensorFlow
77. Why ML Strategy
78. Orthogonalization
79. Single Number Evaluation Metric
80. Satisficing and Optimizing Metrics
81. Train/Dev/Test Distributions
82. Size of Dev and Test Sets
83. When to Change Dev/Test Sets and Metrics
84. Why Human-Level Performance
85. Avoidable Bias
86. Understanding Human-Level Performance
87. Surpassing Human-Level Performance
88. Improving Your Model Performance
89. Carrying Out Error Analysis
90. Cleaning Up Incorrectly Labeled Data
91. Build Your First System Quickly, Then Iterate
92. Training and Testing on Different Distributions
93. Bias and Variance with Mismatched Data Distributions
94. Addressing Data Mismatch
95. Transfer Learning
96. Multi-Task Learning
97. End-to-End Deep Learning
98. Whether to Use End-to-End Learning
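
Lectures 58–63 build momentum, RMSprop, and Adam on top of exponentially weighted averages (v_t = β·v_{t−1} + (1−β)·θ_t, with bias correction v_t / (1 − β^t)), and lectures 73–74 cover softmax. The two NumPy sketches below are minimal illustrations of those building blocks; the function names and the beta=0.9 default are assumptions for illustration, not code from the course.

```python
import numpy as np

def ewa(values, beta=0.9, bias_correction=True):
    """Exponentially weighted average of a sequence of scalars.

    v_t = beta * v_{t-1} + (1 - beta) * theta_t; the optional bias
    correction divides by (1 - beta**t) to offset the v_0 = 0 start.
    """
    v = 0.0
    out = []
    for t, theta in enumerate(values, start=1):
        v = beta * v + (1 - beta) * theta
        out.append(v / (1 - beta ** t) if bias_correction else v)
    return np.array(out)


def softmax(z):
    """Numerically stable softmax over the last axis.

    Subtracting the row-wise max leaves the result unchanged
    but prevents overflow in exp for large logits.
    """
    z = z - np.max(z, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)
```

For example, ewa(np.random.randn(200) + 10) smooths a noisy series (with beta ≈ 0.9 averaging over roughly the last 10 points), and softmax(np.array([1.0, 2.0, 3.0])) returns probabilities that sum to 1.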

User Comments
  • 1507636ouvx

    Is this a recording of Andrew Ng's course?