[Figure: nmt-model-fast.gif — animation of the seq2seq model]

tf-seq2seq is a general-purpose encoder-decoder framework for Tensorflow that can be used for Machine Translation, Text Summarization, Conversational Modeling, Image Captioning, and more.

Design Goals

We built tf-seq2seq with the following goals in mind:

General Purpose: We initially built this framework for Machine Translation, but have since used it for a variety of other tasks, including Summarization, Conversational Modeling, and Image Captioning. As long as your problem can be phrased as encoding input data in one format and decoding it into another format, you should be able to use or extend this framework.
Usability: You can train a model with a single command. Several types of input data are supported, including standard raw text.
Reproducibility: Training pipelines and models are configured using YAML files. This allows others to run your exact model configurations.
Extensibility: Code is structured in a modular way that is easy to build upon. For example, adding a new type of attention mechanism or encoder architecture requires only minimal code changes.
Documentation: All code is documented using standard Python docstrings, and we have written guides to help you get started with common tasks.
Good Performance: For the sake of code simplicity, we did not try to squeeze out every last bit of performance, but the implementation is fast enough to cover almost all production and research use cases. tf-seq2seq also supports distributed training to trade off computational power and training time.
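The YAML-driven workflow described above can be sketched as follows. This is an illustrative configuration only; the file name, parameter keys, and class names here are assumptions modeled on the style of the framework's example configs, not copied from them.

```yaml
# my_model.yml -- hypothetical tf-seq2seq training configuration (illustrative)
model: AttentionSeq2Seq                  # assumed model class name
model_params:
  embedding.dim: 128                     # size of source/target embeddings
  encoder.class: seq2seq.encoders.BidirectionalRNNEncoder  # assumed class path
  decoder.class: seq2seq.decoders.AttentionDecoder         # assumed class path
  optimizer.name: Adam
  optimizer.learning_rate: 0.0001
```

With such a file checked into version control, anyone can reproduce the same model by pointing the training script at the same configuration, which is what makes a single-command training run reproducible.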

    Original author: 不知道自己是谁
    Original link: https://www.jianshu.com/p/8bc8b46f2b03