Commit 09e31f4

Mark Linderman authored and committed
done with ex8
1 parent 91503ff commit 09e31f4

2 files changed: +4 -4 lines changed
machine-learning-ex8/ex8/cofiCostFunc.m (+3 -3)
@@ -73,7 +73,7 @@
 % Y is also num_movies x num_users - so subtracting that is very simple
 % likewise, R is also num_movies x num_users so to negate users+movies without ratings, just use dot product
 % then square every element and sum in both directions to produce the cost
-J = 1/2 * sum(sum(((X * transpose(Theta) - Y) .* R).^2));
+J = 1/2 * sum(sum(((X * transpose(Theta) - Y) .* R).^2)) + ((lambda/2) * sum(sum(Theta.^2))) + ((lambda/2) * sum(sum(X.^2)));
 
 
 % same basic X * transpose(Theta) as above but since these are partial derivatives, no squaring,
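The changed `J` line adds the standard L2 regularization terms over every entry of `Theta` and `X` to the unregularized collaborative-filtering cost. A minimal NumPy sketch of the same computation (the function name `cofi_cost` and the tiny test matrices are illustrative, not part of the commit):

```python
import numpy as np

def cofi_cost(X, Theta, Y, R, lam):
    # prediction error, with unrated entries zeroed out by the indicator matrix R
    err = (X @ Theta.T - Y) * R
    # unregularized cost: half the sum of squared errors over rated entries
    J = 0.5 * np.sum(err ** 2)
    # L2 regularization over every entry of Theta and X
    J += (lam / 2) * (np.sum(Theta ** 2) + np.sum(X ** 2))
    return J

# tiny example: one movie, one user, one rating
X = np.array([[1.0, 2.0]])      # movies x features
Theta = np.array([[0.5, 0.5]])  # users x features
Y = np.array([[1.0]])           # ratings
R = np.array([[1.0]])           # 1 where a rating exists
print(cofi_cost(X, Theta, Y, R, lam=0.0))  # 0.125
print(cofi_cost(X, Theta, Y, R, lam=2.0))  # 5.625
```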
@@ -83,8 +83,8 @@
 % like usual, focus on aligning the rows columns in the matrix multiplications to what you need
 % in calculating these two grads, you want the same dimensions as the starting X and Theta matrices:
 % movies x features and users by features, respectively
-X_grad = ((X * transpose(Theta) - Y) .* R) * Theta;
-Theta_grad = transpose(((X * transpose(Theta) - Y) .* R)) * X;
+X_grad = ((X * transpose(Theta) - Y) .* R) * Theta .+ (lambda .* X);
+Theta_grad = transpose(((X * transpose(Theta) - Y) .* R)) * X .+ (lambda .* Theta);
 
 % =============================================================

machine-learning-ex8/ex8/token.mat (+1 -1)
@@ -1,4 +1,4 @@
-# Created by Octave 4.4.1, Sat Apr 27 21:19:06 2019 EDT <[email protected]>
+# Created by Octave 4.4.1, Sun Apr 28 13:33:20 2019 EDT <[email protected]>
 # name: email
 # type: sq_string
 # elements: 1
