BERT Fine-Tuning Tutorial with PyTorch
By Chris McCormick and Nick Ryan
In this post, I take an in-depth look at the word embeddings produced by Google’s BERT and show you how to get started with BERT by generating your own embeddings.
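To give a flavor of what this looks like in practice, here is a minimal sketch of pulling contextual token embeddings out of a pre-trained BERT model. It assumes the Hugging Face `transformers` library rather than any particular code from this tutorial, and the model name and example sentence are purely illustrative.

```python
import torch
from transformers import BertTokenizer, BertModel

# Load a pre-trained BERT tokenizer and model (bert-base-uncased is illustrative).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()  # evaluation mode: disables dropout

# Tokenize an example sentence and run it through BERT without tracking gradients.
inputs = tokenizer("Here is a sentence to embed.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.last_hidden_state has shape [batch, num_tokens, hidden_size (768)];
# each row is the contextual embedding of one WordPiece token.
token_embeddings = outputs.last_hidden_state
print(token_embeddings.shape)
```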
I’m proud to announce that I’ve published my first eBook, The Inner Workings of word2vec. It includes all of the material in the popular word2vec tutorial on my blog, and goes deeper with additional topics like CBOW and Hierarchical Softmax. I’ve also created example code to go along with the book that exposes the algorithm details and lets you see them in action.
In this article, I want to share a trend that has emerged over the past few years: applying the word2vec model not just to natural language tasks, but to recommender systems as well.
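As a rough sketch of the idea (using gensim’s word2vec implementation; the item IDs and session data below are made up), you treat each user’s interaction history as a “sentence” and each item ID as a “word”:

```python
from gensim.models import Word2Vec

# Hypothetical data: each "sentence" is one user's ordered sequence of item
# interactions, with item IDs playing the role of words.
sessions = [
    ["item_1", "item_5", "item_3", "item_9"],
    ["item_2", "item_5", "item_9"],
    ["item_1", "item_3", "item_7"],
]

# Train skip-gram word2vec (sg=1) on the sessions (gensim 4.x API).
model = Word2Vec(sentences=sessions, vector_size=64, window=5, min_count=1, sg=1)

# Items that co-occur in similar contexts end up with similar vectors, so
# nearest neighbors can serve as "customers also viewed" recommendations.
print(model.wv.most_similar("item_5", topn=2))
```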
In part 1 of this tutorial, I described the most basic form of a product quantizer. In this post, I’ll be explaining the IndexIVFPQ index from the FAISS library, which uses a product quantizer along with a couple of additional techniques introduced in their 2011 paper.
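Before digging into the details, here is a minimal sketch of building and querying an IndexIVFPQ with the FAISS Python bindings; the dimensionality, cluster counts, and random data are placeholder values.

```python
import numpy as np
import faiss

d = 64        # vector dimensionality (must be divisible by m below)
nlist = 100   # number of coarse (IVF) clusters
m = 8         # number of product-quantizer sub-vectors
nbits = 8     # bits per sub-quantizer code (2^8 = 256 centroids each)

# Placeholder data: 10,000 database vectors and 5 query vectors.
xb = np.random.random((10000, d)).astype("float32")
xq = np.random.random((5, d)).astype("float32")

# The coarse quantizer assigns each vector to one of the nlist inverted lists;
# the product quantizer then compresses the vectors within each list.
quantizer = faiss.IndexFlatL2(d)
index = faiss.IndexIVFPQ(quantizer, d, nlist, m, nbits)

index.train(xb)     # learns the coarse centroids and the PQ codebooks
index.add(xb)       # encodes and stores the database vectors
index.nprobe = 10   # number of inverted lists to scan per query

D, I = index.search(xq, 4)  # distances and ids of the 4 nearest neighbors
```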