How to get meaning from text with language model BERT | AI Explained
2.5K Likes · 60,291 Views · Sep 1, 2020
In this video, we give a step-by-step walkthrough of self-attention, the mechanism powering the deep learning model BERT and other state-of-the-art transformer models for natural language processing (NLP).

More on attention and BERT: https://bit.ly/38vpOyW
Solve a text classification problem with BERT using this tutorial: https://bit.ly/2Ij6tGa

0:00 Introduction to NLP
0:39 Text tokenization
1:07 Text embedding
2:06 Context and attention
2:25 Self-attention mechanism
5:57 Key, Query, and Value projections
7:25 Multi-head attention
8:12 Building a full NLP network
9:00 Example

Find Peltarion here:
Website: https://bit.ly/3k2MCIC
Twitter: https://bit.ly/2RJZpnB
LinkedIn: https://bit.ly/2FGWkSS

#peltarion #textsimilarity #nlp
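As a rough illustration of the self-attention step the video walks through, the sketch below computes scaled dot-product attention with separate key, query, and value projections. This is not code from the tutorial; the function name, array shapes, and random toy inputs are assumptions for demonstration only.

```python
# Minimal sketch of scaled dot-product self-attention (illustrative only).
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model) token embeddings; w_*: (d_model, d_k) projection matrices."""
    q = x @ w_q                                  # queries: what each token is looking for
    k = x @ w_k                                  # keys: what each token offers
    v = x @ w_v                                  # values: the content to be mixed
    scores = q @ k.T / np.sqrt(k.shape[-1])      # similarity of every query to every key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v                           # each output is a context-weighted mix of values

# Toy example: 4 tokens, embedding size 8
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)    # -> (4, 8)
```

In multi-head attention, this same computation runs several times in parallel with different projection matrices, and the per-head outputs are concatenated before the next layer.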


Peltarion
