Tuesday, February 4, 2020

An Insight into BERT Attentions


Attention mechanisms are now commonly used in many NLP tasks. Broadly, attention lets a model focus on the most relevant parts of its input, much as humans do. The year 2017 was pivotal for machine learning, and for natural language processing in particular: the Transformer models reframed how NLP tasks are approached. Previously, for tasks like summarization and question answering, a common implementation was dot-product attention (flat attention).
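To make the idea concrete, here is a minimal sketch of the scaled dot-product attention that sits at the core of the Transformer (and hence BERT). The function name, shapes, and toy inputs below are illustrative assumptions, not code from the article.

```python
# A minimal sketch of scaled dot-product attention in NumPy.
# Shapes and toy inputs are illustrative assumptions.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V

    Q: (seq_len_q, d_k) queries
    K: (seq_len_k, d_k) keys
    V: (seq_len_k, d_v) values
    """
    d_k = Q.shape[-1]
    # Raw similarity between every query and every key.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the keys turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V, weights

# Toy self-attention example: 3 tokens, 4-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, attn = scaled_dot_product_attention(x, x, x)
print(attn)  # each row sums to 1: how much a token attends to the others
```

In self-attention, as above, queries, keys, and values all come from the same sequence, so each row of the weight matrix shows how strongly one token attends to every other token.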

Read the full article here.