
Meta-programming for Cross-Domain Tensor Optimizations

Many modern application domains rely crucially on tensor operations. The optimization of programs that operate on tensors poses difficulties that are not adequately addressed by existing languages and tools. Frameworks such as TensorFlow offer good abstractions for tensor operations, but target a specific domain, i.e., machine learning, and their optimization strategies cannot easily be adjusted to other domains. General-purpose optimization tools such as Pluto and existing meta-languages offer more flexibility in applying optimizations but lack abstractions for tensors. This work closes the gap between domain-specific tensor languages and general-purpose optimization tools by proposing the Tensor optimizations Meta-Language (TeML). TeML offers high-level abstractions for both tensor operations and loop transformations, and enables flexible composition of transformations into effective optimization paths. This compositionality is built into TeML's design, as our formal language specification will reveal. We also show that TeML can express tensor computations as comfortably as TensorFlow and that it can reproduce Pluto's optimization paths. Thus, optimized programs generated by TeML execute at least as fast as the corresponding Pluto programs. In addition, TeML enables optimization paths that often outperform Pluto.
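To make the abstract's central idea concrete, here is a minimal Python sketch of what "composing loop transformations into optimization paths" at the meta-level can look like. This is purely illustrative and does not use TeML's actual syntax: the loop nest representation, the `interchange` transformation, and the `codegen` helper are all hypothetical names invented for this example.

```python
# Hypothetical sketch (not TeML syntax): a tensor operation is represented
# as a loop nest, and loop transformations are pure, composable functions
# on that representation, from which code is generated afterwards.

def matmul_nest():
    # Loop nest for C[i][j] += A[i][k] * B[k][j]: loop order plus the
    # statement executed in the innermost loop.
    return (["i", "j", "k"], "C[i][j] += A[i][k] * B[k][j]")

def interchange(nest, a, b):
    # Swap two loops in the nest; a meta-level transformation that
    # returns a new nest instead of mutating the old one.
    order, body = nest
    order = order[:]
    ia, ib = order.index(a), order.index(b)
    order[ia], order[ib] = order[ib], order[ia]
    return (order, body)

def codegen(nest, n):
    # Emit Python source for the (possibly transformed) loop nest.
    order, body = nest
    lines = []
    for depth, var in enumerate(order):
        lines.append("    " * depth + f"for {var} in range({n}):")
    lines.append("    " * len(order) + body)
    return "\n".join(lines)

def run(nest, A, B, n):
    # Execute the generated code and return the result tensor.
    C = [[0] * n for _ in range(n)]
    exec(codegen(nest, n), {"A": A, "B": B, "C": C})
    return C

n = 4
A = [[i + j for j in range(n)] for i in range(n)]
B = [[i * j + 1 for j in range(n)] for i in range(n)]
ref = run(matmul_nest(), A, B, n)

# An "optimization path" is then just function composition on the nest:
opt = interchange(interchange(matmul_nest(), "j", "k"), "i", "k")
assert run(opt, A, B, n) == ref  # transformations preserve semantics
```

Because transformations are ordinary values that map loop nests to loop nests, paths such as interchange-then-tile can be assembled, reordered, and reused freely; this is the kind of compositionality the abstract attributes to TeML's design.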

tensor algebrameta-programmingcode generation and optimizationdenotational semantics

Adilla Susungi, Norman A. Rink, Albert Cohen, Jeronimo Castrillon, Claude Tadonki


MINES ParisTech, PSL Research University, France

Chair for Compiler Construction, Technische Universität Dresden, Germany

INRIA & ENS DI, France

2018

ACM SIGPLAN Notices


Indexed in: EI, ISTP
ISSN: 0362-1340
Year, Volume (Issue): 2018, 53(9)