4 min read
I survived my first day working at a tech giant. Although arguably it wasn't actual work: we have almost a whole week of orientation to introduce us to Google and how it's run before getting into the actual projects.
3 min read
It's been a long journey, but I'm finally here. Well, then again, the train ride was only 2.5 hours, but looking back to when the adventure began in October, it's been a heck of a journey. (The three prequels can be found on my general blog.)
2 min read
So another brutal round of finals has come and gone at UConn. Rather than relax and enjoy my brief week of freedom before packing up and leaving on my next adventure, I tried scrambling to get the first third of my thesis working. Although HORNET hasn't melted down (yet), the grad students and professors who also have access flooded it with jobs in the first few days after finals, now that the semester's over. Initially I had thought the cluster was down for the summer, which resulted in a concerned email from one of my thesis advisors to whoever's in charge of the cluster -- not good.
4 min read
Men and ladies, boys and girls, prepare to be astounded, bedazzled, and otherwise stupefied!
-Three Dog, Fallout 3
Today was the second time that I've attended the annual Invention Convention held at UConn. As at last year's CIC, I did my usual drone demo, and it was a real hit with the crowd, only this time there was a bit of a twist. Instead of just using eyebrow raises and smirks, I used mental commands, facial expressions, nods, and head shakes to get it to move while airborne, which completely bewildered people.
3 min read
So I've been working on my honors thesis/University Scholar project since the start of the semester, and results have been, well... inconclusive. I've been working with machine learning algorithms such as artificial neural nets (ANNs) and support vector machines (SVMs) to detect facial expressions from EEG data for part of my thesis. The end goal is to make my own API that detects more events and/or is more accurate than the original SDK for my Emotiv neuroheadset, and to send this data as JSON between a client and server over a TCP socket. There's a bit more to it than that, such as using it to improve my previous Mind-Controlled UAV project with blended commands and path-planning, but that's the gist of it.
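To give a rough idea of what that client/server piece could look like, here's a minimal sketch in Python. The port number, event names, and confidence values are placeholders for illustration, not the actual classifier output or the real API.

```python
# Minimal sketch: a detection server pushes newline-delimited JSON events
# over a TCP socket, and a client reads and prints them. Everything here
# (port, event names, confidences) is a hypothetical placeholder.
import json
import socket
import time

HOST, PORT = "127.0.0.1", 9000  # hypothetical address for the event stream

def serve_events():
    """Accept one client and stream detection events as JSON lines."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            # In the real project these events would come from the EEG
            # classifier (ANN/SVM output); here they're hard-coded.
            for event in ({"type": "smirk", "confidence": 0.91},
                          {"type": "eyebrow_raise", "confidence": 0.87}):
                conn.sendall((json.dumps(event) + "\n").encode("utf-8"))
                time.sleep(0.1)

def read_events():
    """Connect to the server and print each JSON event as it arrives."""
    with socket.create_connection((HOST, PORT)) as sock, sock.makefile("r") as stream:
        for line in stream:
            print(json.loads(line))

if __name__ == "__main__":
    import threading
    threading.Thread(target=serve_events, daemon=True).start()
    time.sleep(0.2)  # give the server a moment to start listening
    read_events()
```

Newline-delimited JSON keeps the framing simple: each detected event is one line, so a client (say, the UAV controller) can parse messages as they stream in without any extra protocol machinery.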