Zach Li
[Bing]

On the Bing Autosuggest team, I worked on both enterprise-scale engineering challenges and cutting-edge machine learning techniques to improve the autosuggest experience. One of my main contributions was pioneering the next-word prediction capability for autosuggest.

For the first phase of the project, I created pipelines that regularly build n-gram language models as tries from Bing's latest search data, and developed a performant service that uses these models to predict the next words of a user's query from the prefix typed so far. This phase provided a quick and cheap implementation of next-word prediction that vastly increased our autosuggest coverage with very little additional server or latency cost.
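To give a flavor of the idea (not the production pipeline or service), here is a minimal sketch of next-word prediction with an n-gram trie: query n-grams are folded into a trie keyed by the context words, and lookup walks the trie with the last words of the prefix to retrieve the most frequent continuations. The class and method names (`NGramTrie`, `predict_next`) and the toy query log are illustrative assumptions.

```python
from collections import defaultdict

class NGramTrie:
    """Toy trie over word n-grams: each path of n-1 context words
    ends in a node holding counts of observed next words."""

    def __init__(self, n=3):
        self.n = n
        self.root = {}

    def add_query(self, query):
        words = query.lower().split()
        # Slide an n-word window over the query; the first n-1 words are
        # the context, the last word is the observed continuation.
        for i in range(len(words) - self.n + 1):
            context = words[i:i + self.n - 1]
            nxt = words[i + self.n - 1]
            node = self.root
            for word in context:
                node = node.setdefault(word, {})
            counts = node.setdefault("#counts", defaultdict(int))
            counts[nxt] += 1

    def predict_next(self, prefix, k=3):
        """Return the k most frequent next words for the last n-1 words of the prefix."""
        node = self.root
        for word in prefix.lower().split()[-(self.n - 1):]:
            node = node.get(word)
            if node is None:
                return []
        counts = node.get("#counts", {})
        return sorted(counts, key=counts.get, reverse=True)[:k]

# Toy usage with a pretend query log
trie = NGramTrie(n=3)
for q in ["weather in seattle today", "weather in seattle tomorrow", "weather in new york"]:
    trie.add_query(q)
print(trie.predict_next("weather in seattle"))  # ['today', 'tomorrow']
```

Because the trie is built offline and lookups are simple pointer chases over the context words, serving cost stays close to that of ordinary prefix matching, which is what makes this phase cheap relative to its coverage gains.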

For the second phase, I worked closely with a senior data scientist to experiment with, develop, and productionize an LSTM-based language model, which is invoked for complex queries where deeper language understanding is necessary for accurate predictions. Both the n-gram model and the LSTM model produced large, statistically significant metric improvements within their respective scopes.
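As a rough illustration of what an LSTM next-word model looks like (the architecture, vocabulary handling, and hyperparameters below are assumptions for the sketch, not the production model), the prefix is embedded, run through an LSTM, and the final hidden state scores every vocabulary word as a candidate continuation:

```python
import torch
import torch.nn as nn

class NextWordLSTM(nn.Module):
    """Illustrative LSTM language model for next-word prediction."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices for the query prefix
        embedded = self.embed(token_ids)
        output, _ = self.lstm(embedded)
        # Score next-word candidates from the last position's hidden state
        return self.out(output[:, -1, :])  # (batch, vocab_size)

# Toy usage: rank next-word candidates for a 3-word prefix
vocab_size = 10000
model = NextWordLSTM(vocab_size)
prefix = torch.randint(0, vocab_size, (1, 3))   # pretend-tokenized prefix
logits = model(prefix)
top_words = logits.topk(5, dim=-1).indices      # indices of the 5 most likely next words
```

Unlike the n-gram trie, a model like this can generalize to contexts it has never seen verbatim, which is why it is reserved for the complex queries where the trie's exact-match contexts fall short.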

Some of my other contributions to autosuggest include a defective-suggestion classifier, a non-prefix-match look-up service for international markets, the incorporation of new datasets into the ranking algorithm, and various other experiments and improvements over the two years I worked on autosuggest.

[Blog]

A-Deeper-Look-at-Autosuggest

© 2024 Zach (Zuqi) Li