Building Transformer Models with Attention: Implementing a Neural Machine Translator from Scratch in Keras
Stefania Cristina, Mehreen Saeed


Category: Technology

If you have been using search engines for a while, you may have noticed that they understand human language far better than they did a few years ago. The game changer was the attention mechanism. It is not an easy topic to explain, and it is unfortunate when it is dismissed as inscrutable magic. If you understand attention and the problem it solves, you can decide whether it fits your project and use it with confidence. If you are interested in natural language processing and want to tap into the most advanced deep learning techniques for NLP, this new Ebook, written in the friendly Machine Learning Mastery style that you're used to, is all you need. Through clear explanations and step-by-step tutorial lessons, you will learn how attention works, why transformer models are built to handle sequence data, and how to create your own transformer model that translates sentences from one language to another.
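To give a flavor of the core technique the book builds on, here is a minimal sketch of scaled dot-product attention in plain NumPy. This is an illustrative simplification (no batching, masking, or learned projections), not the book's own Keras implementation; the function name and toy shapes are assumptions for the example.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Minimal scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = queries.shape[-1]
    # Similarity score between each query and every key, scaled by sqrt(d_k)
    scores = queries @ keys.T / np.sqrt(d_k)
    # Softmax (numerically stabilized) turns each query's scores into weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the value vectors
    return weights @ values

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4))
k = rng.normal(size=(3, 4))
v = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4)
```

In a full transformer, Q, K, and V are produced by learned linear projections and the computation runs over several heads in parallel; Keras provides this as a built-in layer, which the book reimplements from scratch.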

📄 File format: PDF
💾 File size: 7.4 MB
