Andrew Ng's Deep Learning - Course Quizzes - Course 1 (Neural Networks and Deep Learning) - Week 2 - Neural Network Basics

Week 2 Quiz - Neural Network Basics

1. What does a neuron compute?

【 】 A neuron computes an activation function followed by a linear function (z = Wx + b)

【 】 A neuron computes a linear function (z = Wx + b) followed by an activation function

【 】 A neuron computes a function g that scales the input x linearly (Wx + b)

【 】 A neuron computes the mean of all features before applying the output to an activation function

<details> <summary>Answer</summary> <p>【★】 A neuron computes a linear function (z = Wx + b) followed by an activation function</p> <p>Note: The output of a neuron is a = g(Wx + b), where g is the activation function (sigmoid, tanh, ReLU, …).</p> </details>
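To make the two-step computation concrete, here is a minimal numpy sketch of a single neuron with a sigmoid activation (the sizes are illustrative, not part of the quiz):

```python
import numpy as np

def sigmoid(z):
    # Logistic sigmoid activation.
    return 1 / (1 + np.exp(-z))

W = np.random.randn(1, 3)   # weights for 3 hypothetical input features
b = np.random.randn(1, 1)   # bias
x = np.random.randn(3, 1)   # one input example

z = np.dot(W, x) + b        # step 1: the linear function z = Wx + b
a = sigmoid(z)              # step 2: the activation g(z)
print(a.shape)              # (1, 1)
```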

 

2. Which of these is the "Logistic Loss"?

【★】 $L(\hat{y}^{(i)},y^{(i)})=-y^{(i)}\log\hat{y}^{(i)}-(1-y^{(i)})\log(1-\hat{y}^{(i)})$

<details> <summary>Answer</summary> <p> Note: We are using a cross-entropy loss function. </p> </details>
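As a sanity check, a direct numpy translation of this formula for a single example (the example values below are made up):

```python
import numpy as np

def logistic_loss(y_hat, y):
    # Cross-entropy loss for one example; y_hat must lie strictly in (0, 1).
    return -y * np.log(y_hat) - (1 - y) * np.log(1 - y_hat)

print(logistic_loss(0.9, 1))  # ~0.105: confident and correct, small loss
print(logistic_loss(0.9, 0))  # ~2.303: confident and wrong, large loss
```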

 

3. Suppose img is a (32,32,3) array, representing a 32x32 image with 3 color channels red, green and blue. How do you reshape this into a column vector?

<details> <summary>Answer</summary> <p> x = img.reshape((32 * 32 * 3, 1)) </p> </details>
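A quick way to convince yourself, using a random array as a stand-in for an actual image:

```python
import numpy as np

img = np.random.randn(32, 32, 3)   # stand-in for a 32x32 RGB image
x = img.reshape((32 * 32 * 3, 1))  # flatten to a single column vector
print(x.shape)                     # (3072, 1)
```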

 

4. Consider the two following random arrays "a" and "b":

a = np.random.randn(2, 3) # a.shape = (2, 3)
b = np.random.randn(2, 1) # b.shape = (2, 1)
c = a + b

What will be the shape of "c"?

<details> <summary>Answer</summary> <p> c.shape = (2, 3) </p> <p> b (a column vector) is broadcast: its single column is copied 3 times so that it can be added to each column of a. Therefore, c.shape = (2, 3). </p> </details>
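What broadcasting does implicitly can be spelled out with np.broadcast_to; this sketch just verifies the explanation above:

```python
import numpy as np

a = np.random.randn(2, 3)
b = np.random.randn(2, 1)

# Broadcasting tiles b's single column across all 3 columns of a.
b_tiled = np.broadcast_to(b, (2, 3))
print(np.allclose(a + b, a + b_tiled))  # True
print((a + b).shape)                    # (2, 3)
```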

 

5. Consider the two following random arrays "a" and "b":

a = np.random.randn(4, 3) # a.shape = (4, 3)
b = np.random.randn(3, 2) # b.shape = (3, 2)
c = a * b

What will be the shape of "c"?

<details> <summary>Answer</summary> <p> The computation cannot happen because the sizes don't match, so it raises an error. Note: the "*" operator denotes element-wise multiplication, which requires the two operands to have matching (broadcast-compatible) shapes; (4, 3) and (3, 2) are not, so this is an error. </p> </details>
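The failure can be observed directly; numpy raises a ValueError because the shapes (4, 3) and (3, 2) cannot be broadcast together:

```python
import numpy as np

a = np.random.randn(4, 3)
b = np.random.randn(3, 2)

try:
    c = a * b
except ValueError as e:
    print("Element-wise multiply failed:", e)

# np.dot(a, b) would succeed instead, producing a (4, 2) matrix product.
```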

 

6. Suppose you have $n_x$ input features per example. Recall that $X = [x^{(1)}, x^{(2)}, \dots, x^{(m)}]$. What is the dimension of X?

<details> <summary>Answer</summary> <p> $(n_x, m)$ </p> <img src="https://images.cnblogs.com/cnblogs_com/phoenixash/1603337/o_191212093150test2.1.png"/> </details>
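A small sketch of how X is assembled (n_x and m are arbitrary here): stacking the m column vectors $x^{(i)}$ side by side yields an $(n_x, m)$ matrix:

```python
import numpy as np

n_x, m = 5, 10  # hypothetical feature count and number of examples
examples = [np.random.randn(n_x, 1) for _ in range(m)]

X = np.hstack(examples)  # columns are the individual examples x^(i)
print(X.shape)           # (5, 10), i.e. (n_x, m)
```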

 

7. Recall that np.dot(a,b) performs a matrix multiplication on a and b, whereas a*b performs an element-wise multiplication. Consider the two following random arrays "a" and "b":

a = np.random.randn(12288, 150) # a.shape = (12288, 150)
b = np.random.randn(150, 45) # b.shape = (150, 45)
c = np.dot(a, b)

What is the shape of c?

<details> <summary>Answer</summary> <p> c.shape = (12288, 45); this is a simple matrix multiplication example. </p> </details>
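The shape rule for matrix multiplication is (m, n) · (n, p) -> (m, p): the inner dimensions must match and cancel, which a quick check confirms:

```python
import numpy as np

a = np.random.randn(12288, 150)
b = np.random.randn(150, 45)

c = np.dot(a, b)  # inner dimension 150 matches and cancels
print(c.shape)    # (12288, 45)
```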

 

8. Consider the following code snippet:

# a.shape = (3,4)
# b.shape = (4,1)
# assume c has been initialized with shape (3,4)
for i in range(3):
    for j in range(4):
        c[i][j] = a[i][j] + b[j]

How do you vectorize this?

<details> <summary>Answer</summary> <p> c = a + b.T </p> </details>
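The loop and the vectorized expression can be compared directly; b.T has shape (1, 4), so it broadcasts across the 3 rows of a:

```python
import numpy as np

a = np.random.randn(3, 4)
b = np.random.randn(4, 1)

# Loop version from the question (c initialized explicitly here).
c_loop = np.zeros((3, 4))
for i in range(3):
    for j in range(4):
        c_loop[i][j] = a[i][j] + b[j]

# Vectorized version from the answer.
c_vec = a + b.T
print(np.allclose(c_loop, c_vec))  # True
```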

 

9. Consider the following code:

a = np.random.randn(3, 3)
b = np.random.randn(3, 1)
c = a * b

What will be c?

<details> <summary>Answer</summary> <p> c.shape = (3, 3) </p> <p> This invokes broadcasting: b is copied three times to become (3, 3), and "*" is an element-wise product, so c.shape = (3, 3). </p> </details>
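The broadcast-then-multiply behavior can be made explicit with np.tile:

```python
import numpy as np

a = np.random.randn(3, 3)
b = np.random.randn(3, 1)

# Explicit version of what broadcasting does: copy b's column 3 times.
b_tiled = np.tile(b, (1, 3))
print(np.allclose(a * b, a * b_tiled))  # True
print((a * b).shape)                    # (3, 3)
```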

 

10. Consider the following computation graph. What is the output J? (In the graph, u = a*b, v = a*c, and w = b + c.)

<details> <summary>Answer</summary> <p> J = u + v - w<br> = a * b + a * c - (b + c)<br> = a * (b + c) - (b + c)<br> = (a - 1) * (b + c) </p> </details>
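The algebra can be checked numerically with arbitrary scalar inputs (the values below are made up):

```python
# The graph computes u = a*b, v = a*c, w = b + c, and outputs J = u + v - w.
a, b, c = 3.0, 1.0, 2.0

u = a * b          # 3.0
v = a * c          # 6.0
w = b + c          # 3.0
J = u + v - w      # 6.0

print(J)                  # 6.0
print((a - 1) * (b + c))  # 6.0, matching the simplified form
```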

 

 



Week 2 Code Assignments:

✧Course 1 - Neural Networks and Deep Learning - Week 2 Assignment - Logistic Regression with a Neural Network Mindset

assignment2_1:Python Basics with Numpy (optional assignment)

https://github.com/phoenixash520/CS230-Code-assignments

assignment2_2:Logistic Regression with a Neural Network mindset

https://github.com/phoenixash520/CS230-Code-assignments
