Style Transfer on Tweets
This project was the result of my independent study with Prof. Chenhao Tan during Spring 2020. I replicated the paper Style Transfer from Non-Parallel Text by Cross-Alignment by Shen et al. and applied the technique to non-parallel Twitter data. The task was to see whether we could efficiently translate an input tweet into a better-worded tweet with the same content, so that the translated tweet might have a higher chance of getting retweeted. It was a great learning experience, as I got to learn a lot about style transfer in text, which is what I am working on now. Check out the GitHub repository here.
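To give a flavor of the method, here is a minimal PyTorch sketch of the cross-alignment idea: a shared encoder maps a sentence to a style-independent content code, a decoder generates conditioned on that code plus a style embedding, and discriminators over the decoder's hidden states adversarially align the two style domains. All module names and sizes below are my own illustrative choices, not the paper's exact architecture.

```python
# Minimal sketch of cross-aligned style transfer (after Shen et al.):
# shared encoder -> content code z; decoder conditioned on (z, style);
# per-style discriminators over decoder hidden states for alignment.
import torch
import torch.nn as nn

class CrossAlignModel(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, n_styles=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.style_embed = nn.Embedding(n_styles, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Decoder input: token embedding concatenated with the style embedding.
        self.decoder = nn.GRU(emb_dim * 2, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)
        # One discriminator per style; during adversarial training these are
        # applied to the decoder hidden states returned below.
        self.discriminators = nn.ModuleList(
            [nn.Sequential(nn.Linear(hid_dim, 64), nn.ReLU(), nn.Linear(64, 1))
             for _ in range(n_styles)]
        )

    def forward(self, tokens, style_ids):
        # tokens: (batch, seq_len) int; style_ids: (batch,) int
        emb = self.embed(tokens)
        _, z = self.encoder(emb)                  # content code (1, batch, hid)
        s = self.style_embed(style_ids)           # (batch, emb_dim)
        s = s.unsqueeze(1).expand(-1, tokens.size(1), -1)
        dec_in = torch.cat([emb, s], dim=-1)      # teacher-forced decoding
        h, _ = self.decoder(dec_in, z)
        return self.out(h), h                     # logits and hidden states
```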
Clipboard Synchronization Application
I worked on developing an application for synchronizing clipboard data across a user's devices. The idea is to give users a simple way to copy text, such as a website URL, between their devices without much hassle: if a user copies text on one device, it becomes available in the clipboard of their other devices. We used Electron and Angular to develop the cross-platform desktop application, Node.js and MongoDB for the backend, and Keycloak for identity and access management. We used sockets to synchronize text between multiple devices and deployed our applications and services on Google Cloud Platform. Check out the source code here.
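The synchronization itself is a simple broadcast pattern: every device holds an open socket connection, and text received from one device is relayed to all of the user's other devices. The sketch below illustrates that pattern in Python with the standard socket module; our actual implementation used Node.js sockets, and the host, port, and function names here are purely illustrative.

```python
# Illustrative broadcast relay: clipboard text received from one device
# is forwarded to every other connected device of the same user.
import socket
import threading

clients = []                 # open connections, one per device
lock = threading.Lock()

def handle(conn):
    with lock:
        clients.append(conn)
    try:
        while True:
            data = conn.recv(4096)        # clipboard text from one device
            if not data:
                break
            with lock:
                for other in clients:     # relay to every other device
                    if other is not conn:
                        other.sendall(data)
    finally:
        with lock:
            clients.remove(conn)
        conn.close()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("0.0.0.0", 9000))            # illustrative host and port
server.listen()
while True:
    conn, _ = server.accept()
    threading.Thread(target=handle, args=(conn,), daemon=True).start()
```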
Event Sequence Prediction
Countless events occur in our everyday lives, and in this project we studied the patterns behind such events. A solid understanding of how events unfold is useful in many applications, such as medical treatment, purchase prediction, and behavior study. We implemented an LSTM-based sequence model for predicting such events on Twitter data (retweet events) and Hawkes process data. We compared the Neural Hawkes model with our LSTM to see whether letting the time difference between events directly influence the hidden layers has any positive effect on predicting future events. You can check out the source code for our LSTM model here.
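Below is a minimal PyTorch sketch of the design choice under comparison: the inter-event time difference is concatenated to each event embedding, so timing information flows directly into the LSTM's hidden state. Module names and sizes are illustrative, not our exact configuration.

```python
# Sketch of an event-sequence LSTM that takes the time delta since the
# previous event as an extra input feature at every step.
import torch
import torch.nn as nn

class EventLSTM(nn.Module):
    def __init__(self, n_event_types, emb_dim=32, hid_dim=64):
        super().__init__()
        self.embed = nn.Embedding(n_event_types, emb_dim)
        # +1 input feature for the time delta since the previous event
        self.lstm = nn.LSTM(emb_dim + 1, hid_dim, batch_first=True)
        self.type_head = nn.Linear(hid_dim, n_event_types)  # next event type
        self.time_head = nn.Linear(hid_dim, 1)              # next time delta

    def forward(self, event_types, time_deltas):
        # event_types: (batch, seq_len) int; time_deltas: (batch, seq_len) float
        x = torch.cat([self.embed(event_types),
                       time_deltas.unsqueeze(-1)], dim=-1)
        h, _ = self.lstm(x)
        return self.type_head(h), self.time_head(h)
```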
Non-normal Recurrent Neural Networks
Recurrent neural networks (RNNs) are known to suffer from the exploding and vanishing gradient problem (EVGP). In recent years, a class of RNNs known as orthogonal RNNs has been developed to address this issue. The expressivity of orthogonal RNNs is limited, however, because they span only a subset of all transformations with unit-norm eigenspectra. The non-normal recurrent neural network (nnRNN) overcomes this limitation by relaxing the orthogonality constraint on the eigenbasis: recurrent matrices are parameterized to have unit-norm eigenspectra without requiring an orthogonal eigenbasis, which provides better expressivity. In this project, we performed hyperparameter explorations and ablation studies of nnRNNs. We also implemented the Adding task to measure whether nnRNNs perform better than other RNN architectures; a sketch of the task follows below. Check out the GitHub repository of our project here.
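For reference, here is a minimal NumPy sketch of the Adding task: each input is a length-T sequence of (value, marker) pairs with exactly two markers set to 1, and the target is the sum of the two marked values. Solving it requires carrying information across long spans, which is why it stresses gradient propagation. The batch shapes and marker placement below are illustrative.

```python
# Adding task data generator: the model must output the sum of the two
# values whose marker bit is 1, which are placed far apart in the sequence.
import numpy as np

def make_adding_batch(batch_size=128, seq_len=200, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    values = rng.uniform(0.0, 1.0, size=(batch_size, seq_len))
    markers = np.zeros((batch_size, seq_len))
    for i in range(batch_size):
        # mark one position in each half so the two cues are far apart
        first = rng.integers(0, seq_len // 2)
        second = rng.integers(seq_len // 2, seq_len)
        markers[i, first] = 1.0
        markers[i, second] = 1.0
    x = np.stack([values, markers], axis=-1)   # (batch, seq_len, 2)
    y = (values * markers).sum(axis=1)         # (batch,) regression target
    return x, y
```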