Taking CS445 has been interesting. I’ve noticed quite a few gaps in my understanding of statistics, which makes sense given my focus on more theoretical or structure-driven maths. We’re halfway through the term and we’ve covered a few types of learning mechanisms as well as contrasted them. I look forward to learning about more of them and eventually ensemble methods.
Keep in mind, most of these homeworks were programmed in a day…
Homework 1 : Perceptrons : Classify MNIST, a database of grey-scale images of handwritten digits, using a perceptron. The accuracy hovered around 80% and overall wasn’t impressive, which was exactly what was expected of a perceptron. Source
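The core of a perceptron is just one update rule: when an example is misclassified, nudge the weights toward it. Here’s a minimal sketch on a toy 2D dataset (not the MNIST homework code; data, names, and hyperparameters are made up for illustration):

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Classic perceptron: update weights only on misclassified examples.
    Labels y must be in {-1, +1}. Toy sketch, not the homework code."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Misclassified (or exactly on the boundary): push w toward yi*xi
            if yi * (xi @ w + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy data: label is the sign of the first feature.
X = np.array([[2.0, 1.0], [1.5, -0.5], [-1.0, 0.3], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = train_perceptron(X, y)
preds = np.sign(X @ w + b)
```

Since a perceptron only ever finds a linear separator (one per digit in the one-vs-all MNIST setup), ~80% accuracy on raw pixels is about what you’d expect.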
Homework 2 : Neural Network : Given the same problem, we instead applied an NN with a single hidden layer. The accuracy of this one was near-perfect on the training set (possibly over-fitting) and around 95% on the test set. The implementation, if you poke around it, isn’t crazy fast but allows for as many layers and nodes as you like. Source
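For a sense of what the single-hidden-layer version boils down to, here’s a bare-bones forward pass and backprop loop on a toy binary task. This is a sketch under assumptions (sigmoid activations, squared-error loss, full-batch gradient descent, 8 hidden units); the actual homework implementation and its MNIST hyperparameters may differ:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy binary task: label is 1 when x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# One hidden layer of 8 units; small random initial weights.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # output probability
    return h, out

lr = 0.5
losses = []
for _ in range(200):
    h, out = forward(X)
    losses.append(np.mean((out - y) ** 2))
    # Backprop for squared error through both sigmoid layers
    d_out = (out - y) * out * (1 - out)   # gradient at the output
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient at the hidden layer
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(0)
```

Generalizing this to arbitrarily many layers (as the homework code allows) is mostly a matter of storing the per-layer activations on the way forward and chaining the `d_h`-style gradients on the way back.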
Homework 3 : SVM : For this one we created a spam filter based on SVM_light and the spambase data set. We also played around with weighted feature selection vs random feature selection. Overall, this one was more about wrestling with the oddities of SVM_light than demonstrating an understanding of SVM internals. Oh well. (I’ll post the source once this has been due for more than a week.)
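The weighted-vs-random feature selection comparison is easy to illustrate without SVM_light itself. The sketch below ranks features by the magnitude of a linear model’s weights (a least-squares fit stands in for the SVM weight vector; the data, sizes, and `k` are all made up) and compares keeping the top-k against keeping a random k:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 20 features, but only the first 3 carry any signal.
n, d = 300, 20
X = rng.normal(size=(n, d))
y = np.sign(X[:, 0] + 0.8 * X[:, 1] - 0.6 * X[:, 2]
            + 0.1 * rng.normal(size=n))

# Stand-in for the SVM weight vector: least-squares weights.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

k = 3
top_k = np.argsort(-np.abs(w))[:k]               # weighted selection
rand_k = rng.choice(d, size=k, replace=False)    # random selection

def accuracy(cols):
    """Refit a linear model on the chosen columns and score it."""
    wc, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
    return np.mean(np.sign(X[:, cols] @ wc) == y)

acc_weighted, acc_random = accuracy(top_k), accuracy(rand_k)
```

The point of the exercise: weight-based selection reliably keeps the informative features, while random selection only does so by luck, so the accuracy gap grows as k shrinks relative to the number of junk features.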