Thoughts on data science, statistics and machine learning.
To the memory of Kevin Conroy. There was only ever one true Batman. You have been playing for months. Slowly and steadily, you have harvested every collectable - making yourself stronger and stronger until you can kill the toughest enemies. Every enemy defeated, every monster slain. No side quest worth doing remains. Those not worth doing are also done because you are a completionist (which is a dignified way of saying that you have no life).
There’s an unavoidable, inherent difficulty in fine-tuning deep neural networks for specific tasks, which stems primarily from the lack of training data. It would seem ridiculous to a layperson that a pretrained vision model (containing millions of parameters, trained on millions of images) could learn to solve highly specific problems. When fine-tuned models do perform well, therefore, they seem all the more miraculous. On the other hand, we also know that it is easier to move from the general to the specific than the reverse.
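That intuition, adapting a frozen general-purpose representation by training only a small task-specific head, can be sketched in a few lines of plain Python. Everything below (the toy "backbone", the synthetic task, the head) is an illustrative assumption, not code from the post:

```python
import math
import random

random.seed(0)

def backbone(x):
    """Stands in for a pretrained feature extractor; its weights stay frozen."""
    return (x, x * x)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A specific downstream task with little data: classify whether x > 0.5.
xs = [random.random() for _ in range(200)]
data = [(x, 1.0 if x > 0.5 else 0.0) for x in xs]

# Trainable head: sigmoid(w . features + b), fit by SGD on the log loss.
# Only these three numbers are learned; the backbone is never updated.
w0 = w1 = b = 0.0
lr = 0.5
for _ in range(200):
    for x, y in data:
        f0, f1 = backbone(x)
        p = sigmoid(w0 * f0 + w1 * f1 + b)
        g = p - y  # gradient of log loss w.r.t. the pre-sigmoid score
        w0 -= lr * g * f0
        w1 -= lr * g * f1
        b -= lr * g

correct = 0
for x, y in data:
    f0, f1 = backbone(x)
    correct += (sigmoid(w0 * f0 + w1 * f1 + b) > 0.5) == (y == 1.0)
print(f"head-only training accuracy: {correct}/{len(data)}")
```

The design choice mirrors real fine-tuning: because the frozen features already carry most of the useful structure, the head has very few parameters to fit, which is exactly why a small task-specific dataset can suffice.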
I took a course on data structures and algorithms over the last few months. It is offered as part of IIT Madras' Online Degree Program in Data Science and Programming, and is taught by Prof. Madhavan Mukund. The program is a MOOC in the true sense, with tens of thousands of students enrolling each year. The DSA course itself is offered every trimester and sees an average of around 700 enrollments each time.
It was the night of the recent 5-state assembly elections. One of my company’s clients is a major news channel, and I was at the studio late into the night, until the Election Commission announced that it had cancelled the press conference at which the final vote counts in Madhya Pradesh were to be announced. A colleague offered to drop me home, and I got off at the gate of my colony, not wanting to subject them to navigating the labyrinth that is any gated colony in South Delhi.