1/24/2012

Stochastic Gradient Descent & non ML algo

These past few weeks I've been exploring Gradient Descent, which is a simple Machine Learning algorithm :D. SGD is an online learning method that keeps updating the weights as it calculates the loss of each candidate.
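To make "keeps updating the weights" concrete, here is a minimal sketch of one SGD update for a linear model with squared loss. The names (sgd_update, learning_rate, etc.) and the choice of squared loss are my own assumptions for illustration:

```python
def sgd_update(weights, features, target, learning_rate=0.01):
    """Return new weights after seeing a single training example.

    Assumes a linear model: prediction = sum(w * x), squared loss.
    """
    prediction = sum(w * x for w, x in zip(weights, features))
    error = prediction - target
    # Gradient of 0.5 * error**2 w.r.t. each weight w_i is error * x_i,
    # so each weight steps a little against its gradient.
    return [w - learning_rate * error * x for w, x in zip(weights, features)]

# One online update: the weights move toward the target immediately,
# without waiting for the rest of the dataset.
weights = [0.0, 0.0, 0.0]
weights = sgd_update(weights, features=[1.0, 2.0, 3.0], target=14.0)
```

Because each example triggers its own update, the model can learn from a stream of candidates instead of a fixed batch.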

It's pretty intuitive how to test whether the model is working:
1. randomly pick a set of weights (or pick whatever seems reasonable);
2. produce scores and generate the candidates using those weights;
3. split the candidates into two parts, one for training and one for testing;
4. if the weights predicted after training are the same as or close to the real weights, the precision and recall should be high.

1. For example, weights = {a1, a2, a3}
2. score = weights[1]*x1 + weights[2]*x2 + weights[3]*x3;
we randomly generate a matrix of n rows of candidates with features x1, x2 and x3, then calculate the score.
3. Here I use 80% of the data for training and 20% for testing.
4. After training with SGD, there is a set of predicted weights {a1', a2', a3'}.
Now I can compare the weights and see whether the SGD model is doing a good job.
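The whole sanity check above can be sketched end to end. Everything here is an illustrative assumption: true_weights plays the role of {a1, a2, a3}, the features are random, the data is noiseless, and the learning rate and epoch count were picked by hand:

```python
import random

random.seed(0)

# Step 1-2: pick "real" weights {a1, a2, a3} and generate scored candidates.
true_weights = [2.0, -1.0, 0.5]
n = 1000
candidates = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(n)]
scores = [sum(w * x for w, x in zip(true_weights, row)) for row in candidates]

# Step 3: 80% of the data for training, 20% for testing.
split = int(0.8 * n)
train_x, train_y = candidates[:split], scores[:split]
test_x, test_y = candidates[split:], scores[split:]

# Step 4: train with plain SGD on squared loss, one example at a time.
weights = [0.0, 0.0, 0.0]
lr = 0.1
for epoch in range(20):
    for x, y in zip(train_x, train_y):
        err = sum(w * xi for w, xi in zip(weights, x)) - y
        weights = [w - lr * err * xi for w, xi in zip(weights, x)]

# Compare the predicted weights {a1', a2', a3'} with the real ones.
print(weights)  # should land close to [2.0, -1.0, 0.5]
```

Since the generated data is noiseless and linear, the learned weights should match the real ones almost exactly, which is exactly the check the post describes.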
