Thanks to Stephan Gouws for his help on writing and improving this blog post. Transformers have recently become a competitive alternative to RNNs for a range of sequence modeling tasks. They address a significant shortcoming of RNNs, namely their inherently sequential computation, which prevents parallelization across elements of the input sequence, while still addressing the […]
I’ve started an internship at Apple in San Francisco. I am working with the Siri Machine Learning team on learning disentangled representations 🙂
I’ve started a five-month internship at Google Brain \o/. I am working with Lukasz Kaiser, Jakob Uszkoreit, and Stephan Gouws from the Brain team and Oriol Vinyals from DeepMind to make the Transformer model computationally universal 🙂
Our paper “Learning to Attend, Copy, and Generate for Session-Based Query Suggestion”, with Sascha Rothe, Enrique Alfonseca, and Pascal Fleury, has been accepted as a long paper at the International Conference on Information and Knowledge Management (CIKM’17). This paper is the outcome of my internship at Google Research. \o/ Users interact with search engines […]
I’ve started a four-month internship at Google Research \o/. I am working on Natural Language Generation using Neural Computational Models, with Aliaksei Severyn, Enrique Alfonseca, and Sascha Rothe.