This year didn’t start out so well for my online classes. I signed up for and started a bunch of them, but have quit all but one so far. Some were not as interesting as I had hoped, some didn’t contain enough new material, or the material covered was too different from what I expected. I just couldn’t motivate myself to invest the time and effort to complete them. Maybe it’s not as exciting as it was for the first few classes, or maybe these teachers are just trying out a new channel and are not as determined and enthusiastic about this new form of education. For me personally, the first MOOC that I completed, the introduction to AI, is still the best.
Finally I found a class that I was keen enough to complete: one about computational neuroscience. I had read some books about neurology before and was familiar with the basic structure of neurons and synapses, as well as with some neurotransmitters such as GABA. But the details about ion channels and their behaviour were new to me. The calculations with spike voltages and spike-triggered averages were very interesting; they highlighted just how simplified the common perceptron neural network models are. The second part of the class, which was more about applying insights from biological neuroscience to artificial intelligence and machine learning, was more familiar and partly repetition.
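To illustrate the idea, here is a toy sketch of a spike-triggered average in NumPy. Everything in it is made up for the example: the "neuron" is just an exponentially decaying filter over a white-noise stimulus plus a threshold, not any model from the class.

```python
import numpy as np

rng = np.random.default_rng(0)

n_steps = 10_000
window = 50                       # stimulus samples to keep before each spike
stimulus = rng.normal(size=n_steps)

# Toy neuron: an exponentially decaying filter over the recent stimulus
# drives spiking; the neuron "spikes" whenever the drive crosses a threshold.
h = np.exp(-np.arange(window) / 10.0)
drive = np.convolve(stimulus, h)[:n_steps]
threshold = drive.mean() + 2 * drive.std()
spike_times = np.where(drive > threshold)[0]
spike_times = spike_times[spike_times >= window]   # keep a full window of history

# Spike-triggered average: the mean stimulus window preceding each spike.
sta = np.mean([stimulus[t - window + 1 : t + 1] for t in spike_times], axis=0)
```

Because the toy neuron weights recent stimulus most heavily, the STA recovers (a noisy version of) the filter: large near the spike, near zero at the oldest lags.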
Another one of these incredibly interesting online classes has come to an end. Machine learning was one of the first two classes that started last fall. Since I thought taking AI and ML in parallel would be too much, I decided on AI with the intention of taking ML later. The second round of ML was announced for January, but actually started in April. Andrew Ng from Stanford, whom some people call a rock star in ML, taught the class. The videos were longer than what I was used to, so I downloaded them to my Android phone to watch on the train to work. The homework consisted of review questions and programming assignments in Octave. The last time I did anything with MATLAB was more than ten years ago, and I remembered nothing of it.
The class started with gradient descent and logistic regression, and almost everything that followed was compared against and related to them.
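For readers who haven't seen the combination, here is a minimal sketch of logistic regression trained with batch gradient descent in Python/NumPy (the class used Octave; this is my own toy version, with made-up data):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, alpha=0.1, iters=1000):
    """Batch gradient descent for logistic regression.

    X: (m, n) feature matrix (first column of ones for the intercept),
    y: (m,) labels in {0, 1}. alpha is the learning rate.
    """
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(iters):
        h = sigmoid(X @ theta)        # current predictions
        grad = X.T @ (h - y) / m      # gradient of the average log loss
        theta -= alpha * grad         # step downhill
    return theta

# Tiny example: separate points by a threshold on a single feature.
X = np.array([[1.0, x] for x in [-2, -1, -0.5, 0.5, 1, 2]])
y = np.array([0, 0, 0, 1, 1, 1])
theta = gradient_descent(X, y)
preds = (sigmoid(X @ theta) > 0.5).astype(int)
```

The same loop structure carries over to linear regression and neural networks; only the hypothesis and the gradient change, which is presumably why the class kept relating everything back to it.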
I had some prior experience in ML, but no formal training. At TCG I learned the basics from a co-worker. Then I implemented a document classification engine using an SVM, and I read many books on the topic. Later I developed a prediction system for good days and locations for paragliding, again using an SVM together with an evolutionary optimization. Continue reading “machine learning class”
I’ve worked in Baar for two months now, and I go to work by train. It takes a while longer than the trip to Schwyz did before, but I don’t have to switch trains or buses, which means it’s good for reading. Currently I’m reading “Artificial Intelligence: A Modern Approach”, which accompanies an online course (ai-class.com from Stanford) that I’m attending. Today I was reading the chapter about neural networks, where it describes an algorithm called “optimal brain damage”. It tries to find a good topology for the NN by iteratively cutting the least important connections from an initially fully connected NN. While the name describes adequately what the algorithm does, it struck me as awkward when I first read it.
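The pruning mechanic itself is simple to sketch in NumPy. As a hedge: this toy version cuts the connections with the smallest weight magnitude, which is only a crude stand-in for the second-derivative saliency measure the published Optimal Brain Damage algorithm uses to decide what to cut; the masking idea is the same.

```python
import numpy as np

rng = np.random.default_rng(42)

# Weights of a small fully connected layer (8 inputs, 8 outputs).
weights = rng.normal(size=(8, 8))

def prune_smallest(weights, fraction):
    """Mask out the given fraction of connections with the smallest magnitude."""
    n_cut = int(fraction * weights.size)
    order = np.argsort(np.abs(weights), axis=None)   # smallest magnitudes first
    mask = np.ones(weights.size, dtype=bool)
    mask[order[:n_cut]] = False
    return mask.reshape(weights.shape)

mask = prune_smallest(weights, 0.25)
pruned = weights * mask   # cut connections contribute nothing
```

In practice one would retrain the network after each pruning round and repeat until accuracy starts to drop.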
What are the best names for algorithms you have come across?
Continue reading “Best names for algorithms”