
2. Loss Functions for Classification:

Again, assume that the function is $f(w, b, x) = w^\top x + b$. In the case of SVMs and Perceptrons, we saw the following two loss functions: $L_i(w, b) = \max(0, -y_i f(w, b, x_i))$ for the Perceptron and $L_i(w, b) = \max(0, 1 - y_i f(w, b, x_i))$ for the hinge loss (SVM). Similar to question 1, let us see if the following loss functions are good choices:

(a) $L_i(w, b) = \max(0,\ 1 - y_i f(w, b, x_i))^2$

(b) $L_i(w, b) = [\,y_i - f(w, b, x_i)\,]^4$

(c) $L_i(w, b) = \exp[\,f(w, b, x_i) - y_i\,]$

(d) $L_i(w, b) = \exp[\,-y_i f(w, b, x_i)\,]$
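As a minimal sketch (with assumed function and variable names, and assuming labels $y_i \in \{-1, +1\}$ as is standard for the Perceptron and SVM), the Python snippet below evaluates the Perceptron loss, the hinge loss, and the candidate losses (a)–(d) for a single training example:

```python
import numpy as np

def f(w, b, x):
    # Linear score f(w, b, x) = w^T x + b
    return np.dot(w, x) + b

def perceptron_loss(w, b, x, y):
    # L_i = max(0, -y * f)
    return max(0.0, -y * f(w, b, x))

def hinge_loss(w, b, x, y):
    # L_i = max(0, 1 - y * f)
    return max(0.0, 1.0 - y * f(w, b, x))

def loss_a(w, b, x, y):
    # (a) squared hinge: max(0, 1 - y * f)^2
    return max(0.0, 1.0 - y * f(w, b, x)) ** 2

def loss_b(w, b, x, y):
    # (b) fourth-power error: (y - f)^4
    return (y - f(w, b, x)) ** 4

def loss_c(w, b, x, y):
    # (c) exp(f - y)
    return np.exp(f(w, b, x) - y)

def loss_d(w, b, x, y):
    # (d) exponential loss: exp(-y * f)
    return np.exp(-y * f(w, b, x))

# Example usage with made-up numbers:
w, b = np.array([0.5, -1.0]), 0.2
x, y = np.array([1.0, 2.0]), -1
for name, loss in [("perceptron", perceptron_loss), ("hinge", hinge_loss),
                   ("(a)", loss_a), ("(b)", loss_b), ("(c)", loss_c), ("(d)", loss_d)]:
    print(name, loss(w, b, x, y))
```

Under this labeling convention, (a) and (d) depend on $y_i$ only through the margin $y_i f(w, b, x_i)$, whereas (b) and (c) compare $f(w, b, x_i)$ directly to $y_i$ as if it were a regression target.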

Part 1: Explain precisely why each of these loss functions may or may not be a good choice for classification.

Part 2: Also, compute the gradients of the final loss function in each of the cases above.
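As an illustration of the kind of computation Part 2 asks for (this derivation is not part of the original problem statement), the gradient for case (d) with respect to $w$ and $b$ follows directly from the chain rule for a single example:

$$\frac{\partial L_i}{\partial w} = -\,y_i\, x_i\, \exp\!\left[-y_i \left(w^\top x_i + b\right)\right], \qquad \frac{\partial L_i}{\partial b} = -\,y_i\, \exp\!\left[-y_i \left(w^\top x_i + b\right)\right].$$

The remaining cases follow the same pattern of differentiating through $f(w, b, x_i) = w^\top x_i + b$.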

Hint
To calculate the gradient of a straight line, choose two points on the line and divide the difference in height (y-coordinates) by the difference in width (x-coordinates). If the result is positive, the line runs uphill; if it is negative, the line runs downhill...
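For example (with numbers chosen purely for illustration), the line through the points $(1, 2)$ and $(3, 8)$ has gradient $(8 - 2)/(3 - 1) = 3$, a positive value, so that line runs uphill.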
