Should translators understand the power of the technology behind Google Translate and other providers of machine translation? We think so, and that is why we prepared this webinar as a first step in helping translators make informed decisions when working with this technology. The volume and quality of machine translation produced today impresses anyone who works, or wants to start working, in translation.

Neural machine translation (NMT) comprises some of the most advanced machine learning technologies, and it has been producing large volumes of surprisingly high-quality translation output. NMT uses neural network-based models to learn probabilistic models of languages, which can then be used to estimate good translations of new source sentences. One of the key benefits of this approach is that it simplifies the process of training MT systems: unlike earlier statistical translation systems, which consisted of many small sub-components that were tuned separately, NMT builds and trains a single, large neural network that reads a source sentence and outputs a translation for it.

In this webinar, we will explain how widely available programming libraries can be used to create a customised translation engine, or model, with NMT. The webinar will consist of a step-by-step demonstration of the different stages of training an NMT model, with simple explanations of the terms used by researchers who train these systems. The webinar will include introductory answers to questions such as: What is a neural network? What are word embeddings? What do training and learning mean? How many stages are there in an NMT training process? What methods exist to improve the quality of the MT output? Why is the translation stage called decoding? There will be time for participants to ask questions, but they are not expected to perform any hands-on work.
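To give a flavour of one of the questions above ("What are word embeddings?"), the sketch below illustrates the core idea: words are represented as vectors of numbers, and words with related meanings end up geometrically close. The three-number vectors here are invented for the example, not learned by a real NMT system, which would use much longer vectors trained on large corpora.

```python
import math

# Toy word embeddings: each word is a vector of numbers.
# These values are made up for illustration only.
embeddings = {
    "cat":   [0.90, 0.80, 0.10],
    "dog":   [0.85, 0.75, 0.20],
    "table": [0.10, 0.20, 0.95],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: close to 1.0 means very similar."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words ("cat", "dog") are closer than unrelated ones ("cat", "table").
sim_cat_dog = cosine_similarity(embeddings["cat"], embeddings["dog"])
sim_cat_table = cosine_similarity(embeddings["cat"], embeddings["table"])
print(sim_cat_dog > sim_cat_table)  # True
```

In a trained NMT model, such vectors are not hand-written but learned automatically from parallel text, which is part of what the "training" stage of the webinar covers.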
As a follow-up to the webinar, we are preparing a short course with hands-on exercises in which participants can train their own initial models. This talk was presented by Diptesh Kanojia, Leonardo Zilio and Félix do Carmo.