Meysam Asgari-Chenaghlu is the author of Mastering Transformers. We got the chance to sit down with him and find out more about his experience of writing with Packt.
Q: What are your specialist tech areas?
Meysam: Artificial Intelligence, Natural Language Processing, Deep Learning
Q: How did you become an author for Packt? Tell us about your journey. What was your motivation for writing this book?
Meysam: After seeing the huge impact of Transformers on NLP, I had long thought about writing a book that would address most of these problems from scratch. It was important for the community to have access to such a book. Now that it is finished, I hope it helps readers.
Q: What kind of research did you do, and how long did you spend researching before beginning the book?
Meysam: As a professional and an academic, I have worked in the field of NLP for more than seven years, and I had many publications in the field before starting the book.
Q: Did you face any challenges during the writing process? How did you overcome them?
Meysam: It was a bit hard to find useful material because this is a new research field, and most of the work takes the form of academic articles rather than tutorials.
Q: What’s your take on the technologies discussed in the book? Where do you see these technologies heading in the future?
Meysam: Transformers have dominated the industry for less than a decade. In the near future, with the help of better solutions for improving speed and latency, I think other fields such as RL and Computer Vision will also be impacted by them.
Q. Why should readers choose this book over others already on the market? How would you differentiate your book from its competition?
Meysam: What makes this book different from others is the novel way we approached the problems. I think anyone who wants to learn transformers from scratch should at least take a glance at this book.
Q. What are the key takeaways you want readers to come away with from the book?
Meysam: After finishing the book, readers will understand how to train a Transformer model of any kind, from GPT to BERT, and use it on real-life problems.
Q. What advice would you give to readers learning tech? Do you have any top tips?
Meysam: It is important to note that this book helps readers grasp the state-of-the-art (SOTA) methods in NLP, but they should know some basics of NLP before starting.
Q. Do you have a blog that readers can follow?
Meysam: I mostly use LinkedIn to share my ideas and thoughts. https://www.linkedin.com/in/meysam-ac/
Q. Can you share any blogs, websites, and forums to help readers gain a holistic view of the tech they are learning?
Meysam: HuggingFace has a ton of good material for reading about and learning Transformers: https://huggingface.co/blog. The Keras blog is also a wonderful place to learn: https://blog.keras.io/
Q. How would you describe your author journey with Packt? Would you recommend Packt to aspiring authors?
Meysam: It was a really enjoyable experience. I really liked the Packt team's approach and the way they help authors understand their workflow. I highly recommend Packt to aspiring authors.
Q. What are your favorite tech journals? How do you keep yourself up to date on tech?
Meysam: I follow academic journals mostly and NIPS, ICML, and AAAI conferences.
Q. How did you organize, plan, and prioritize your work and write the book?
Meysam: It is important to have a plan when you are working with a team (the Packt team and co-authors). While writing a book, you also have to give up some vacation time and manage your schedule carefully between your job and the book.
Q. What is the one writing tip that you found most crucial and would like to share with aspiring authors?
Meysam: Always keep yourself updated about the technology.
You can find Meysam’s book, Mastering Transformers, on Amazon.