While doing research, I often find myself wondering: “What’s the state-of-the-art result for XYZ right now?” I just want a tool that returns a summary of the latest SOTA research. My usual go-to is Google, but I quickly realize that Google often returns:
- papers that have a lot of citations, which usually means they are old.
- articles that discuss SOTA systems. Again, these articles can be a couple of years old.
- irrelevant results that share the same name as XYZ.
At best, I have to skim those results. At worst, I don’t find what I’m looking for at all (I might if I went to the next page of results, but who does that?) [See the rest of the post and more information on the script that I wrote on huyenchip.com]
Sophia the robot visited Vietnam today and VnExpress asked me to write an op-ed about what it means to be human. In the article, I mentioned the program that Andrej Karpathy wrote that generated Shakespeare-sounding text.
I didn’t think much of it until my editor texted me: “Is Andrej Karpathy human?”
I was like: “I guess.”
She was disappointed and took that example out of my article.
Today, I was invited to give a guest lecture for the Stanford class CS224N: Natural Language Processing with Deep Learning. I was pretty excited about the opportunity. First, I’d never given a lecture to such a big audience before – there are 400+ students in the class. Second, it’s Richard Socher’s class. He’s hands down one of the most chill professors I know. For some reason, he always looks like he’s just got out of bed, and we occasionally catch him biking down the stairs to the classroom. Third, I’d always heard that speaking at NVIDIA Auditorium is lit, and I wanted to try it out before graduating. Continue reading “[Day 626] I just gave a lecture to 400 students”
This is me attempting to get Google Assistant to say my name right … I was almost convinced that they didn’t hardcode the assistant.
When I’m too busy, I can’t do work because I’m too stressed. But because I haven’t been doing work, I have even more work to do. My life is a catch-22.
Since I’m super stressed with work, I’ve decided to dedicate more of my life to making memes. Continue reading “[Day 283] Procrastination”
I’m half excited, half nervous about the TensorFlow Dev Summit tomorrow. I’m excited because it’s the first official TF event, and I will undoubtedly learn a lot. I’m nervous because I’ll be around strangers! S-T-R-A-N-G-E-R-S!!!!
Seriously, what do people do at an event like this? Do you approach people? Do you maintain eye contact? What if nobody wants to talk to you? What if you accidentally come across as creepy af? What if you want to take a nap?
TL;DR: If you’re at the event, please come say hi.
If you don’t already know, style transfer is the cool, hip thing that has been taking the recreational AI community by storm. It’s so cool that even Kristen Stewart co-authored a paper about it. To quote one researcher who has done extensive work in style transfer whom I got a chance to talk to, “it is an utterly unremarkable paper that wouldn’t have been published otherwise [if Kristen Stewart’s name weren’t on it]. That’s a publicity stunt.”
Some background on why I’m doing this: I’m teaching the course CS 20SI: “TensorFlow for Deep Learning Research”, and for the assignment on convolutional neural networks, I thought it’d be fun for students to do style transfer as their at-home exercise. They, after all, showed a lot of enthusiasm when we did Deep Dream in class.
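For readers who haven’t seen style transfer before: the core trick (from the original Gatys et al. approach) is matching the Gram-matrix statistics of a conv layer’s feature maps between the style image and the generated image. Here’s a minimal numpy sketch of that one piece – the function names and the (channels × positions) feature-map shape are my own illustrative choices, not the assignment’s actual code:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a conv feature map.

    features: array of shape (channels, positions), i.e. a layer's
    activations flattened over the spatial dimensions.
    Returns a (channels, channels) matrix of channel co-activations,
    normalized by the number of spatial positions.
    """
    channels, positions = features.shape
    return features @ features.T / positions

def style_loss(style_features, generated_features):
    """Mean squared difference between the two Gram matrices."""
    g_style = gram_matrix(style_features)
    g_gen = gram_matrix(generated_features)
    return np.mean((g_style - g_gen) ** 2)
```

In the full algorithm you’d compute this loss at several conv layers, add a content loss on raw activations, and optimize the generated image’s pixels directly.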
Continue reading “[Day 276] Detailed instruction on how to do Style Transfer”