Fundamenta Informaticae

Symbolic Tensor Neural Networks for Digital Media - from Tensor Processing via BNF Graph Rules to CREAMS Applications

Abstract

This tutorial material on Convolutional Neural Networks (CNN) and their applications in digital media research is based on the concept of Symbolic Tensor Neural Networks (STNN). The set of STNN expressions is specified in Backus-Naur Form (BNF), annotated with constraints typical of labeled directed acyclic graphs (DAG). The BNF induction begins from a collection of neural unit symbols with extra (up to five) decoration fields (including tensor depth and sharing fields). The inductive rules provide not only the general graph structure but also specific shortcuts for residual blocks of units. A syntactic mechanism for modularization of network fragments is introduced via user-defined units and their instances. Moreover, dual BNF rules are specified in order to generate the Dual Symbolic Tensor Neural Network (DSTNN). The joint interpretation of STNN and DSTNN provides the correct flow of gradient tensors back-propagated at the training stage. The proposed symbolic representation of CNNs is illustrated for six generic digital media applications (CREAMS): Compression, Recognition, Embedding, Annotation, 3D Modeling for human-computer interfacing, and data Security based on digital media objects. In order to make the CNN description and its gradient flow complete, the symbolic representations of the mathematically defined loss/gain functions and the gradient flow equations for all core units used are given for all presented applications. The tutorial aims to convince the reader that STNN is not only a convenient symbolic notation for public presentations of CNN-based solutions to CREAMS problems, but also a design blueprint with the potential for automatic generation of application source code.
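
The abstract describes STNN expressions as decorated neural unit symbols that BNF rules elaborate into a labeled DAG, with dedicated shortcuts for residual blocks. The sketch below is only a minimal illustration of that idea in Python; the names Unit, Residual and build_dag, and the particular decoration fields, are hypothetical and are not the paper's actual grammar or notation.

# Illustrative sketch only: elaborating a flat symbolic expression of
# decorated unit symbols into a labeled DAG, with a shortcut edge added
# for each residual block. Names and fields are assumptions for this example.

from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Unit:
    """A neural unit symbol with a few decoration fields (illustrative)."""
    kind: str          # e.g. "conv", "relu", "pool"
    depth: int = 1     # tensor depth decoration
    label: str = ""    # optional sharing/label decoration


@dataclass
class Residual:
    """A shortcut wrapper: the wrapped chain's output is added to its input."""
    body: List[Unit] = field(default_factory=list)


def build_dag(expr: List) -> Tuple[List[str], List[Tuple[str, str]]]:
    """Turn a symbolic expression into a labeled DAG (node list + edge list),
    inserting an extra shortcut edge for every Residual block."""
    nodes, edges = ["input"], []
    prev, counter = "input", 0
    for item in expr:
        if isinstance(item, Unit):
            counter += 1
            name = f"{item.kind}_{counter}"
            nodes.append(name)
            edges.append((prev, name))
            prev = name
        elif isinstance(item, Residual):
            entry = prev
            for u in item.body:
                counter += 1
                name = f"{u.kind}_{counter}"
                nodes.append(name)
                edges.append((prev, name))
                prev = name
            counter += 1
            add_node = f"add_{counter}"
            nodes.append(add_node)
            edges.append((prev, add_node))   # main path into the addition
            edges.append((entry, add_node))  # residual shortcut edge
            prev = add_node
    nodes.append("output")
    edges.append((prev, "output"))
    return nodes, edges


if __name__ == "__main__":
    expr = [Unit("conv", depth=64),
            Residual([Unit("conv", depth=64), Unit("relu"), Unit("conv", depth=64)]),
            Unit("pool")]
    nodes, edges = build_dag(expr)
    print(nodes)
    print(edges)

In the same spirit, the paper's dual grammar (DSTNN) would generate the reversed graph over which gradient tensors flow during training; the sketch above covers only the forward, structural side.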