Towards Universal Natural Language Processing

Slav Petrov ( Google Research )

In this talk I will first describe our efforts towards a universal representation of syntax that makes it possible to model syntax consistently across languages. I will then highlight several examples of how we have successfully used syntax in different applications, and discuss how we have used indirect signals for domain and task adaptation.

Speaker bio

Slav Petrov completed his PhD at UC Berkeley in 2009 under Dan Klein, and is now a researcher at Google in New York, working on problems at the intersection of natural language processing and machine learning. He is particularly interested in multilingual syntactic analysis and its applications to machine translation and information extraction. His work in this area has been recognized with best paper awards at NAACL 2012 and ACL 2011.
