Stochastic Gradient Ascent (SGA) for Logistic
Regression. In this exercise, you will implement a logistic
regression algorithm using SGA, similar to the logistic regression
algorithm that you have seen in class. You will work with the
datasets attached to the assignment and complete the
logisticRegression.py file to learn the coefficients and predict
binary class labels. The data comes from breast cancer diagnosis,
where each sample (30 features) is labeled by a diagnosis: either M
(malignant) or B (benign), recorded in the 31st column of the
datasets. Read the main code, check the configuration parameters,
and make sure the data is loaded and augmented correctly. Do not
use logistic regression packages.
(a) Complete the predict(x, w), gradient(x, y, w), and
cross_entropy(y_hat, y) functions according to the instructions in
logisticRegression.py. These functions will be used in the main SGA
algorithm (logisticRegression_SGA).
I am using the sigmoid function as the activation function.
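As a quick sanity check on the sigmoid itself (this standalone snippet is just illustrative, not part of the starter file):

```python
import numpy as np

def sigmoid(s):
    # logistic function: maps any real score to a probability in (0, 1)
    return 1 / (1 + np.exp(-s))

print(sigmoid(0.0))                        # 0.5 by symmetry
print(sigmoid(np.array([-2.0, 0.0, 2.0]))) # monotonically increasing
```

A useful property: sigmoid(-s) = 1 - sigmoid(s), so scores above 0 map to probabilities above 0.5.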
predict(x, w)
    # I am writing only the main formulae, using numpy
    m = x.shape[1]
    Y_prediction = np.zeros((1, m))
    s = np.dot(w.T, x)  # the bias is absorbed into w since the data is augmented, so no separate b
    y_hat = 1 / (1 + np.exp(-s))  # sigmoid activation
    for i in range(y_hat.shape[1]):
        # convert probability y_hat[0, i] into a hard 0/1 prediction
        Y_prediction[0, i] = 1 if y_hat[0, i] > 0.5 else 0
    assert Y_prediction.shape == (1, m)
    return Y_prediction
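A minimal usage sketch of the same logic on made-up data (the numbers below are invented for illustration; the loop can also be replaced by vectorized thresholding):

```python
import numpy as np

# toy augmented data: a bias row of ones plus 2 features, 3 samples (made up)
x = np.array([[1.0, 1.0, 1.0],
              [0.5, -1.2, 2.0],
              [1.5, 0.3, -0.7]])
w = np.array([[0.1], [0.8], [-0.5]])   # arbitrary weights for the sketch

s = np.dot(w.T, x)                     # scores, shape (1, 3)
y_hat = 1 / (1 + np.exp(-s))           # sigmoid probabilities
Y_prediction = (y_hat > 0.5).astype(float)  # vectorized 0/1 thresholding
print(Y_prediction)                    # [[0. 0. 1.]]
```

The vectorized `(y_hat > 0.5).astype(float)` produces the same result as the element-wise loop.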
gradient(x, y, w)
    m = x.shape[1]
    # FORWARD PROPAGATION (FROM x TO ACTIVATIONS)
    s = np.dot(w.T, x)  # bias again absorbed into the augmented w, so no b term
    y_hat = 1 / (1 + np.exp(-s))  # compute activation
    # BACKWARD PROPAGATION (TO FIND GRAD)
    dw = np.dot(x, (y_hat - y).T) / m  # gradient of the cross-entropy cost w.r.t. w
    return dw
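One way to convince yourself the gradient formula is right is a finite-difference check (a standalone sketch with randomly generated data, not part of the starter file):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 10))                  # 4 augmented features, 10 samples (made up)
y = (rng.random((1, 10)) > 0.5).astype(float) # random 0/1 labels
w = rng.normal(size=(4, 1))
m = x.shape[1]

def cost(w):
    # average cross-entropy cost for weights w
    y_hat = 1 / (1 + np.exp(-np.dot(w.T, x)))
    return -np.sum(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat)) / m

# analytic gradient, same formula as in gradient(x, y, w)
y_hat = 1 / (1 + np.exp(-np.dot(w.T, x)))
dw = np.dot(x, (y_hat - y).T) / m

# numerical gradient by central differences
eps = 1e-6
num = np.zeros_like(w)
for j in range(w.shape[0]):
    wp, wm = w.copy(), w.copy()
    wp[j] += eps
    wm[j] -= eps
    num[j] = (cost(wp) - cost(wm)) / (2 * eps)

print(np.max(np.abs(dw - num)))  # should be tiny, on the order of 1e-9
```

If the two disagree by more than roughly the square root of machine epsilon, the analytic formula has a bug.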
cross_entropy(y_hat, y)
    m = y.shape[1]
    cost = (-np.sum(y*np.log(y_hat)+(1-y)*np.log(1-y_hat)))/m # compute average cost
    return cost
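Putting the pieces together, the main loop can be sketched as below. This is a hedged sketch: the actual signature, learning rate, and epoch count of logisticRegression_SGA in the starter file may differ, and the names here are my own. Note that the assignment phrases this as gradient *ascent* on the log-likelihood; subtracting the cross-entropy gradient, as below, is the identical update.

```python
import numpy as np

def sigmoid(s):
    return 1 / (1 + np.exp(-s))

def logistic_regression_sga(x, y, lr=0.5, epochs=200, seed=0):
    # x: augmented data, shape (n_features + 1, m); y: labels, shape (1, m)
    rng = np.random.default_rng(seed)
    n, m = x.shape
    w = np.zeros((n, 1))
    for _ in range(epochs):
        for i in rng.permutation(m):          # visit samples in random order
            xi = x[:, i:i+1]                  # one sample, shape (n, 1)
            yi = y[:, i:i+1]
            y_hat = sigmoid(np.dot(w.T, xi))  # forward pass on this sample
            dw = np.dot(xi, (y_hat - yi).T)   # per-sample cross-entropy gradient
            w -= lr * dw                      # stochastic update (= ascent on log-likelihood)
    return w
```

Shuffling the sample order each epoch (the `rng.permutation(m)` call) is what makes the updates stochastic rather than cyclic.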