Another one of these incredibly interesting online classes has come to an end. Machine Learning was one of the first two classes that started last fall. Since I thought taking AI and ML in parallel would be too much, I decided on AI with the intention of taking ML later. The second round of ML was announced for January, but actually started in April. Andrew Ng from Stanford, whom some people call a rock star in ML, taught the class. The videos were longer than what I was used to, so I downloaded them to my Android phone to watch on the train to work. The homework consisted of review questions and programming assignments in Octave. The last time I did anything with MATLAB was more than ten years ago, and I remembered nothing of it.
The class started with gradient descent and logistic regression, and almost everything that followed was compared against and related to them.
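To give an idea of how compact the core of it is, here is a minimal sketch of batch gradient descent for logistic regression in Octave. The function and variable names are my own illustration, not the official exercise code:

function theta = logreg_gd(X, y, alpha, iters)
  % X: m-by-n design matrix, y: m-by-1 labels in {0,1}
  % alpha: learning rate, iters: number of gradient descent steps
  m = rows(X);
  theta = zeros(columns(X), 1);
  for i = 1:iters
    h = 1 ./ (1 + exp(-X * theta));   % sigmoid hypothesis
    grad = (X' * (h - y)) / m;        % gradient of the logistic loss
    theta = theta - alpha * grad;     % take one descent step
  end
end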
I had some prior experience in ML, but no formal training. At TCG I learned the basics from a co-worker. Then I implemented a document classification engine using an SVM. I read many books on the topic. Later I developed a prediction system for good days and locations for paragliding, again using an SVM as well as evolutionary optimization.
The flightpred system was designed in a modular fashion, so that I could plug in other learning algorithms to experiment with. I had always wanted to experiment with neural networks, and from all my reading I understood how they worked, but I didn't know where to start with an actual implementation. NNs were covered in the class, and we implemented one in Octave. So, when I find the time, I'll add it to the flightpred system.
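For my own reference, this is roughly what the prediction part (forward propagation) of such a network looks like in Octave. Theta1 and Theta2 would come from training, and the names here are just illustrative, not flightpred code:

function p = nn_predict(Theta1, Theta2, X)
  % one hidden layer; Theta1 maps input to hidden, Theta2 hidden to output
  m = rows(X);
  sigmoid = @(z) 1 ./ (1 + exp(-z));
  a1 = [ones(m, 1), X];                        % input plus bias unit
  a2 = [ones(m, 1), sigmoid(a1 * Theta1')];    % hidden layer activations plus bias
  a3 = sigmoid(a2 * Theta2');                  % output layer activations
  [dummy, p] = max(a3, [], 2);                 % predicted class per example
end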
My goal in reading all the ML books was to fully understand Support Vector Machines and be able to implement one. SVMs were covered in the class, but also only from the user's perspective. I gained a better understanding, but I still wouldn't be able to implement one myself. The math behind it is just too involved.
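The user-level part did stick, though; for example the Gaussian (RBF) kernel that gets plugged into an off-the-shelf SVM trainer. A small sketch of it (sigma is the bandwidth parameter; the function name is my own):

function sim = gaussian_kernel(x1, x2, sigma)
  % similarity of two feature vectors: 1 when identical, near 0 when far apart
  d = x1(:) - x2(:);
  sim = exp(-(d' * d) / (2 * sigma^2));
end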
Then we learned about topics that I hadn't even thought about in advance, or didn't expect to be as interesting as they were presented in the class. These include: clustering, dimensionality reduction, anomaly detection, as well as recommender systems. It's also worth mentioning that the general advice on how to analyze problems and where to invest further effort is very valuable.
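To pick one example from that list, the PCA recipe for dimensionality reduction fits into a handful of Octave lines. This is just my paraphrase of the idea, with X assumed to be mean-normalized and k the target dimension:

function [Z, U] = pca_project(X, k)
  % project mean-normalized data X onto its top k principal components
  Sigma = (X' * X) / rows(X);     % covariance matrix of the features
  [U, S, V] = svd(Sigma);         % columns of U are the principal directions
  Z = X * U(:, 1:k);              % reduced representation
end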
I can highly recommend the Coursera machine learning class to anyone interested in ML.
This week I could already put the newly gained Octave knowledge to use at work. I had an intermediate XML file and was not sure whether the vertex positions it contained were valid. With some grep/sed/octave magic, I had the answer in no time:
find . -name '*.xml' | xargs grep -h '<p>' | sed -e "s/[<]p[>]//g" -e "s/[<].p[>]//g" -e "s/[<]p.name.*[>]//g" > puppe.mat
octave --eval 'xyz = load("puppe.mat"); plot3(xyz(:,1), xyz(:,2), xyz(:,3)); print -dpng puppe.png;'