How to get meaning from text with language model BERT | AI Explained
2.5K likes
60,291 views
Sep 1, 2020
In this video, we give a step-by-step walkthrough of self-attention, the mechanism powering the deep learning model BERT, and other state-of-the-art transformer models for natural language processing (NLP). More on attention and BERT: https://bit.ly/38vpOyW
Learn how to solve a text classification problem with BERT in this tutorial: https://bit.ly/2Ij6tGa
0:00 Introduction to NLP
0:39 Text tokenization
1:07 Text embedding
2:06 Context and attention
2:25 Self-attention mechanism
5:57 Key, Query, and Value projections
7:25 Multi-head attention
8:12 Building a full NLP network
9:00 Example
Find Peltarion here:
Website: https://bit.ly/3k2MCIC
Twitter: https://bit.ly/2RJZpnB
LinkedIn: https://bit.ly/2FGWkSS
#peltarion #textsimilarity #nlp
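
As a rough illustration of the mechanism the chapters above walk through (not code from the video), the sketch below implements one head of standard scaled dot-product self-attention in NumPy. The projection matrices W_q, W_k, W_v and the toy dimensions are made up for the example; a real BERT model learns these weights during training.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the chosen axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """One head of scaled dot-product self-attention.

    X              : (seq_len, d_model) token embeddings
    W_q, W_k, W_v  : (d_model, d_head) learned projection matrices
    Returns          (seq_len, d_head) context-aware token representations.
    """
    Q = X @ W_q                          # queries: what each token is looking for
    K = X @ W_k                          # keys: what each token offers to others
    V = X @ W_v                          # values: the information that gets mixed
    d_head = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_head)   # (seq_len, seq_len) pairwise similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1: attention per token
    return weights @ V                   # weighted sum of values for every token

# Toy example: 4 tokens, 8-dimensional embeddings, one 8-dimensional head.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (4, 8)
```

Multi-head attention, covered at 7:25, runs several such heads in parallel with independent projection matrices and concatenates their outputs before a final linear projection.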