The task of translating from one language into another by computer, known as Machine Translation (MT), has been one of the central challenges for the natural language processing community for decades. Recently, neural models for MT have received much attention. This interest is partly fueled by the successes of neural and other representation-learning methods in other domains (e.g., image and speech processing, reinforcement learning), but it is also motivated by recognized limitations of traditional MT systems (e.g., these systems do not directly model paraphrasing or semantic similarity). The aim of this (sub-)project is to exploit fast parallel GPU computation to train large neural networks that learn meaningful representations of input sentences, informed by the hierarchical structure of those sentences, so that better translation quality can be achieved.
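To give a concrete flavor of the kind of system involved, the sketch below shows a minimal neural encoder-decoder for translation trained with GPU acceleration when available. It is an illustrative assumption only, not the project's actual model: the module names, vocabulary sizes, and hyperparameters are placeholders, and it uses plain recurrent encoding rather than any hierarchical, syntax-informed representation.

    # Minimal sketch of a neural encoder-decoder for MT (illustrative only;
    # NOT the project's model). All sizes and names below are assumptions.
    import torch
    import torch.nn as nn

    class Seq2Seq(nn.Module):
        def __init__(self, src_vocab=8000, tgt_vocab=8000, emb=256, hidden=512):
            super().__init__()
            self.src_emb = nn.Embedding(src_vocab, emb)
            self.tgt_emb = nn.Embedding(tgt_vocab, emb)
            self.encoder = nn.GRU(emb, hidden, batch_first=True)
            self.decoder = nn.GRU(emb, hidden, batch_first=True)
            self.out = nn.Linear(hidden, tgt_vocab)

        def forward(self, src_ids, tgt_ids):
            # Encode the source sentence, then decode conditioned on its
            # final state (teacher forcing on the target side).
            _, state = self.encoder(self.src_emb(src_ids))
            dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
            return self.out(dec_out)  # per-position logits over target vocab

    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = Seq2Seq().to(device)

    # Toy batch of already-tokenised sentence pairs (random ids stand in for data).
    src = torch.randint(0, 8000, (4, 12), device=device)
    tgt = torch.randint(0, 8000, (4, 10), device=device)

    logits = model(src, tgt[:, :-1])
    loss = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)), tgt[:, 1:].reshape(-1)
    )
    loss.backward()  # gradients computed in parallel on the GPU
    print(loss.item())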
