Nerd Thoughts

This is a non-academic yet academic page. The primary purpose is fun; however, the topics I write about are quite meaningful (to me and my extremely nerdy mind). I do not claim that anything here is scientifically correct; these are just thoughts I wonder about. I would also love to receive answers, opinions, and/or other ideas to include. Don't hesitate to shoot me an email!


Neural Networks and Time travel

May 18, 2018: OK, here is a crazy theory I have: Neural Networks are one way to model the space-time continuum. As seen in science fiction movies, when a superhero travels back in time, they have the responsibility of not making the slightest modification to the space-time continuum; otherwise, the series of events that follows would result in a lot of changes to the universe. When adversarial examples are computed, this is exactly what happens: the gradient is back-propagated to the input and added as a tiny perturbation, which completely messes up the classification accuracy. To me, it seems pretty clear that back-propagation and time travel are operationally similar. Anyway, neural networks are universal approximators, so it sounds quite reasonable that they could emulate the space-time continuum.
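For the non-time-travelers: the "tiny noise" trick is the fast gradient sign method from the adversarial-examples literature. Here is a minimal sketch on a toy logistic-regression "network"; the weights, input, and epsilon are all made-up illustrative values, not from any real model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([2.0, -3.0, 1.5])   # hypothetical trained weights
x = np.array([0.5, -0.2, 0.1])   # a correctly classified input, true label y = 1
y = 1.0

p = sigmoid(w @ x)               # model's confidence in the true class
grad_x = (p - y) * w             # gradient of the cross-entropy loss w.r.t. the INPUT
eps = 0.25
x_adv = x + eps * np.sign(grad_x)  # the tiny "time-travel" nudge to the input

p_adv = sigmoid(w @ x_adv)
print(p, p_adv)  # confidence in the true class drops after the perturbation
```

The point is that the perturbation is small in each coordinate (at most eps), yet it is aimed exactly along the direction that hurts the model most.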


How ridiculously expensive Neymar's transfer to PSG was - A graduate student's perspective

August 30, 2017: Neymar's transfer to PSG is reported to be around 250 million dollars. It is hard for me personally to grasp how big such a number is, so it was quite fun and nerdy for me to try to equate this voluminous fee to something slightly more meaningful.
On average, I would say that a graduate student's own fortune is about 5,000 dollars. Some graduate students run themselves dry until the next stipend and some save up a little, so I think 5,000 dollars is a reasonable estimate for the average fortune of a graduate student. Let's say, then, that one grad student is worth 5,000 dollars. Hence, Neymar's transfer fee is worth 250M/5k = 50k, or 50,000 graduate students.

Now take those 50,000 graduate students and make each one write one distinct paper. Also, have each of these papers cite the same person at least once. That person gets 50,000 citations. So I would say Neymar's transfer fee is worth the career of a scholar with around 50,000 citations. A quick check on Google Scholar tells us that, at the time of this writing, Neymar's transfer fee is approximately worth the career of Yann LeCun. For those of you who don't know, the latter is the director of the Facebook Artificial Intelligence Research lab, a member of the academy of engineers, the inventor of convolutional neural networks, and a main contributor to the advancements of Deep Learning. If machines ever become as smart as humans, we will have this guy (among others) to thank.
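Since this is a nerd page, the back-of-the-envelope conversion above can of course be unit-tested; both numbers are the rough estimates from the text, not real financial data.

```python
transfer_fee_usd = 250_000_000      # reported fee, rough figure
grad_student_worth_usd = 5_000      # my estimated average grad-student fortune

# One paper per student, each citing the same scholar once:
citations = transfer_fee_usd // grad_student_worth_usd
print(citations)  # the citation count of a Neymar-equivalent scholar
```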
So yeah, Neymar's transfer fee is almost equal to Yann LeCun's career. But seriously, all this cutting-edge Deep Learning research will never get you to win the Champions League, while Neymar's arrival just might do it for PSG, so I think it was a good deal.


Discriminator vs. generator

May 14, 2017: One of my favorite recent papers is the GAN paper (Goodfellow et al., NIPS 2014). Mainly because in their acknowledgments they thank a brewery for stimulating their creativity (which I found purely hilarious and awesome), but also because the concept of a discriminator and a generator playing a min-max game during training is very interesting. Actually, one of my colleagues always critiques my ideas, so I told him: "Sometimes I feel you and I are involved in a min-max game like in GANs. I try to generate ideas and results, but you discriminate them. You are the max to my min." Ever since, I often see min-max games in my daily life: my qual exam with the lovely committee I had, the peer review process (especially when there's a rebuttal, since we get an extra SGD iteration), preparing a paper draft with my advisor, having arguments with friends in general, etc.
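For reference, the min-max game in question is the GAN value function from that paper: the discriminator D maximizes it while the generator G minimizes it,

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\bigl[\log D(x)\bigr]
  + \mathbb{E}_{z \sim p_z(z)}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr]
```

so in the colleague analogy, my ideas play the role of G(z) and his critique is D.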


The paper will be accepted with probability p

May 14, 2017: So recently my first ICML paper was accepted. That was amazing. A few days before the notification, though, I was discussing with some colleagues our chances of making it based on how the reviews looked. Someone said, "I think there is a 70% chance," to which I replied, "Nah, I would say 40%." A friend of mine said, "You can't really assign a probability; what does it mean after the decision notification?" That was a legitimate question, to which I replied: "It's like quantum mechanics. I cannot say for sure whether the paper is accepted or not, but each state has a probability, and yes, after measurement the state collapses." That got me thinking: a lot of our life events could be modeled like that. We often say "oh, there's a high chance this or that will happen," and eventually it does or does not. When we talk like that, are we doing quantum mechanics without knowing it?
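My pre-notification paper-state can be sketched as a two-state system that collapses on measurement; the 0.4 is just my subjective estimate from the conversation above, and the "many parallel universes" loop is there only to check that the long-run frequency matches it.

```python
import random

random.seed(0)  # a reproducible universe

p_accept = 0.4  # my subjective amplitude-squared for the |accepted> state

def measure(p):
    """Open the decision email: collapse the paper's state. True = accepted."""
    return random.random() < p

# Run the measurement in many parallel universes:
outcomes = [measure(p_accept) for _ in range(10_000)]
print(sum(outcomes) / len(outcomes))  # long-run acceptance frequency, close to 0.4
```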


Training at the gym

May 14, 2017: One day I went to the gym and thought, "I'm doing unsupervised training." We had just finished an ML paper submission in our group, so the following day I told my advisor about it. He said, "Well, make sure your model does not overfit." That was a good point; ever since, I have been applying dropout (in the Geoff Hinton sense; if you didn't get it, congrats, you are a normal person, not a nerd). Unfortunately, it looks like I'm always stuck in a local minimum (or maximum, for that matter).
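For the normal people: dropout, in the Geoff Hinton sense, randomly zeroes out units during training so the network cannot over-rely on any one of them. A minimal inverted-dropout sketch (the layer size and drop probability are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)

def dropout(activations, p=0.5):
    """Zero each unit with probability p; rescale survivors so the
    expected activation is unchanged (inverted dropout)."""
    mask = rng.random(activations.shape) >= p   # keep with probability 1 - p
    return activations * mask / (1.0 - p)

h = np.ones(8)            # a hypothetical hidden layer, all activations = 1
print(dropout(h, p=0.5))  # roughly half the units zeroed, the rest scaled to 2
```

The rescaling by 1/(1 - p) means nothing special needs to happen at test time, which is why skipping gym days still counts as regularization.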


Dropout was used by hippies before it was cool

May 14, 2017: Some time ago I met a couple of neuroscientists from Berkeley. Naturally, they were telling me about how much cooler Berkeley is than Stanford. I said I had heard it is a very hippie place, and of course, they proudly told me about the consumption of neuron-weakening substances there (if you didn't get this, congrats, heaven awaits your pure heart). Talking to these neuroscientists about those substances, I said: "I just realized something: when someone smokes, it's like they are applying dropout to their brain." To my surprise, they loved this joke, and one of them started miming the act of smoking, saying "hold on, I am generalizing..." That was a lot of fun, but then again, we always hear of artists (painters, poets, musicians, etc.) producing their best works while high. Isn't that exactly dropout??