{"id":2404,"date":"2022-11-30T15:47:57","date_gmt":"2022-11-30T15:47:57","guid":{"rendered":"https:\/\/www.codeastar.com\/?p=2404"},"modified":"2022-12-02T18:40:47","modified_gmt":"2022-12-02T18:40:47","slug":"the-lazy-and-easy-pre-trained-translator-of-the-year","status":"publish","type":"post","link":"https:\/\/www.codeastar.com\/the-lazy-and-easy-pre-trained-translator-of-the-year\/","title":{"rendered":"The Lazy (and Easy) Pre-Trained Translator of the Year"},"content":{"rendered":"\n

We made our own Neural Machine Translator (NMT) in 2019, which helped us translate Dutch to English. Now it is 2022, and many things have changed in the world of Data Science. The arrival of Bidirectional Encoder Representations from Transformers (BERT), a pre-trained transformer model, in 2019 opened a new chapter for Natural Language Processing (NLP). And nowadays we have Generative Pre-trained Transformer 3 (GPT-3), an even larger model than BERT with 175 billion(!) parameters. As technologies evolve day after day, we should take advantage of that evolution. So our project this time is to take a shortcut and use a pre-trained model to build our translator.

Pick a Pre-Trained Model

We mentioned that BERT is the new gold standard of Data Science, so can we use it as our pre-trained translator model? Well, BERT is good, but it is not a good translation model. BERT is a pre-trained model specialized in contextual word representation, i.e. it can tell the difference between “mouse” as a computer peripheral and as a small mammal. But the way it is trained, by predicting masked words, may cause problems when handling translation. There is a research paper on using BERT in neural translation; however, judging from its content and setup, it is far from the “lazy and easy” thing we want.

\"BERT
BERT on word representation<\/figcaption><\/figure>\n\n\n\n
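To see what that contextual word representation looks like in practice, here is a minimal sketch using the Hugging Face transformers library (an assumption on my side, it is not part of this project's stack) to let bert-base-uncased fill in a masked word from its context:

from transformers import pipeline

# fill-mask pipeline: BERT predicts the word hidden behind [MASK]
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# "mouse" as a computer peripheral vs. a small mammal, guessed from context
for result in fill_mask("I plugged the mouse into the [MASK] port."):
    print(result["token_str"], round(result["score"], 3))

for result in fill_mask("The mouse ran into a hole in the [MASK]."):
    print(result["token_str"], round(result["score"], 3))

Good at guessing a word from its context, but that is a different job from generating a whole sentence in another language.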

Then what about the almighty GPT-3? Yes, GPT-3 can legitimately do everything, including translation. But then the problem is on our side. The place I am currently living in, Hong Kong, is one of the regions where GPT-3 is not supported. There is an open source alternative to GPT-3, GPT-J, which comes with 6 billion parameters of its own. This powerful open source pre-trained model does need a fair bit of time to set up, though, so it also violates our “lazy and easy” principle.
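For reference, this is roughly what a GPT-J setup looks like with the Hugging Face transformers library (a hedged sketch, assuming the EleutherAI/gpt-j-6B checkpoint and enough disk and RAM for a download in the tens of gigabytes); it works, it is just not “lazy and easy”:

from transformers import AutoTokenizer, AutoModelForCausalLM

# downloading and loading the 6-billion-parameter checkpoint takes a while
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

# prompt-based translation, the GPT way
prompt = "Translate Portuguese to English:\nPortuguese: Eu falo um pouco de português.\nEnglish:"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))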

When one door closes, another opens. And on the internet nowadays, there are always more doors than we expect. So we have Argos Translate, an open source translator powered by OpenNMT and SentencePiece. You may notice that this is the same tech stack we used last time for our own translator. This time, we can skip the training part and go straight to the translating part, easy peasy!

No Training, Just Straight to Results

Although we don’t need any training to use Argos Translate, we do need to install the library :]]. As we did in the past, we use pipenv to do the deployment job.

$pipenv --three
$pipenv shell
$pipenv install argostranslate
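If you are not a pipenv user, the same package installs with plain pip (assuming a Python 3 environment; argostranslate is the only dependency the script needs):

$pip install argostranslate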

Okay, we are good to go. We are CodeAStar, so let’s do what we do best here: code!

WAIT!

To become a good developer, please always remember: design first, before you code. So we have the following sequence diagram:

\"Argo<\/figure>\n\n\n\n

The translation flow is straightforward, but there are a few things we can find out from the diagram:

1. we need to provide a way for the user to enter the translation language pair, e.g. from a GUI (Graphical User Interface) or command line arguments
2. the application should allow the user to submit input through a GUI or a command line interface
3. an internet connection is required; otherwise we have to download the language packages to a local drive first (see the sketch after this list)
4. like the input, the application should present the translated outcome in different ways, such as a GUI, file output or console output
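For the offline case in point 3, Argos Translate can install a language package straight from a local file, using the same package.install_from_path() call we will meet later in the script. A minimal sketch, assuming you have already downloaded a .argosmodel file to the local drive (the file name below is just an example):

from argostranslate import package

# install a manually downloaded language package from the local drive
package.install_from_path("translate-pt_en-1_0.argosmodel")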

Once we have cleared up those design considerations, we can finally start to work.

EZ Coding for the Pre-Trained Translator

When we say “EZ coding”, it naturally comes with an equally easy interface. So we are building the pre-trained translator with a command line interface, i.e.

$python ez_trans.py <FROM LANG> <TO LANG> <INPUT FILE>

We also need some code to handle the command line arguments, and for that we have argparse, the argument handler bundled with Python.

import argparse
from argostranslate import package, translate

parser = argparse.ArgumentParser()
parser.add_argument("from_lang", help="From Language, e.g. en")
parser.add_argument("to_lang", help="To Language, e.g. es")
parser.add_argument("input_file", help="Input Text File, e.g. abc.txt")
args = parser.parse_args()

After getting the user inputs, it is time for us to fetch the pre-trained language models from Argos Translate.

try:
    print("Getting the ArgosTranslate package index...")
    available_packages = package.get_available_packages()
except:
    package.update_package_index()
    available_packages = package.get_available_packages()

try:
    selected_package = list(filter(
        lambda x: x.from_code == args.from_lang and x.to_code == args.to_lang,
        available_packages
    ))[0]
except:
    print(f"Error: cannot find a language pair for [{args.from_lang}] to [{args.to_lang}]")
    exit()

print(f"Downloading the '{selected_package}' model from ArgosTranslate if no model is found in the current system...")
download_path = selected_package.download()
package.install_from_path(download_path)

installed_languages = translate.get_installed_languages()
argo_from_lang = list(filter(lambda x: x.code == args.from_lang, installed_languages))[0]
argo_to_lang = list(filter(lambda x: x.code == args.to_lang, installed_languages))[0]
translation = argo_from_lang.get_translation(argo_to_lang)
translated_lines = []
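As a side note, the same package index can also tell us which language pairs are available at all. A small sketch built on the same get_available_packages() call used above (the from_code / to_code attributes are the ones our filter already relies on):

from argostranslate import package

# print every language pair offered by the Argos Translate package index
package.update_package_index()
for p in package.get_available_packages():
    print(f"{p.from_code} -> {p.to_code}")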

Then we have only a few things left: open the input file, translate it line by line with Argos Translate and save the result to an output file. Since we are working with foreign languages, remember to add encoding='utf8' to the open(...) call.

    \nprint(f"Reading '{args.input_file}' and starting to translate...")\n#load text from file\nwith open(args.input_file, encoding='utf8') as f:\n    lines =  f.read().splitlines() \n    for l in lines:\n        translated_l = translation.translate(l)\n        translated_lines.append(translated_l)\ntranslated_output = '\\n'.join(translated_lines)\n\n#save our translated result into a file\nwith open("output_"+args.input_file,'w',encoding='utf8') as o:\n    o.write(translated_output)\nprint(f"Translated output is saved as 'output_{args.input_file}', enjoy!")\n<\/pre><\/div>\n\n\n

Our EZ pre-trained translator is done! See? The whole script, comments included, is just under 50 lines!

Test Drive on the Pre-Trained Translator

It is a tiny translator, but does size matter? Let’s prove it. Since I am studying Portuguese (or Português), we will take a Portuguese passage from my textbook and use our EZ translator to translate it back into English. Then we can see how well it matches the actual translation.

Here is our input:

Afonso: Olá! Eu chamo-me Afonso. E você, como é que se chama?
Russell: Olá! Eu sou o Russell.
Afonso: De onde é?
Russell: Sou da Austrália, de Darwin. E vocé?
Afonso: Eu sou Portugal, de Évora. Que línguas fala?
Russell: Falo inglês, alemão e um pouco de português.
Afonso: Eu falo português, cantonês, inglês e um pouco de mandarim.

This is a conversation between two men, Afonso and Russell, introducing themselves and asking which languages the other speaks.

What we do next is save the conversation into a file, then run our EZ translator for a Portuguese (pt) to English (en) translation.

$python ez_trans.py pt en pt_test.txt

And we get the output file as:

Afonso: Hello! My name is Afonso. What’s your name?
Russell: Hello! I’m Russell.
Afonso: Where are you from?
Russell: I’m from Australia, from Darwin. And you?
Afonso: I’m Portual from Évora. What languages do you speak?
Russell: I speak English, German and a little Portuguese.
Afonso: I speak Portuguese, Cantonese, English and a little Mandarin.

This is exactly what they are talking about, so our tiny EZ translator does score big in this translation task. Maybe it helps that I am taking the elementary course, so the conversation piece is a bit simple and straightforward for our EZ translator. Overall, it is a tiny, fast and accurate machine translator.

What have we learned in this post?
1. Life is short; if there is something we can use, just use it and don't reinvent the wheel
2. Always design first, code later
3. How to use Argos Translate
4. Size doesn't matter in the world of coding

(the complete source package can be found at GitHub: https://github.com/codeastar/lazy_transator)
