Dataflow analyses for free, and the compiler optimizations that use them — embedded in academia

  • Compiler improvement over time: Compilers do improve, but slowly. "Proebsting's Law" suggests that advances in compiler optimization double program speed only every 18 years, and even that may be optimistic. Such slow evolution is a problem in today's environment of rapid innovation in GPUs, TPUs, and other accelerators.
  • Research group's goal: Build technologies for self-improving compilers. One such technology is superoptimization, which uses an expensive search to find missing optimizations. Another is generalization, which turns specific optimizations into broadly applicable forms.
  • Self-improvement loop with superoptimization and generalization: Combined with a benchmark suite, these yield a fully automated self-improvement loop for the peephole optimizer. Dataflow analyses compute facts that make such optimizations possible, but the literature describes many dataflow analyses and it is hard to know which ones are worth implementing.
  • Automating dataflow work: Create a representation for dataflow facts and formalize their meaning. Automatically derive dataflow transfer functions, building on a published technique. Mitigate the loss of precision across instruction boundaries and construct efficient product operators for combining analyses. Plug the results into a generic dataflow framework to obtain working dataflow analyses.
  • Using dataflow facts for optimizations: Have a superoptimizer such as Souper consume LLVM's dataflow results. The generalization engine should also support dataflow analyses.
  • An interesting dataflow analysis to implement: congruences, where for a variable v the analysis proves v = ax + b for some integer x and constants a, b. Today's compilers are divorced from the formal foundations of compilation; in the future, parts of the compiler such as dataflow analyses and peephole optimizations will be derived from those foundations.
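
The congruence analysis in the last bullet can be made concrete with a minimal sketch. This is an illustrative assumption, not the talk's actual representation: each abstract value `(a, b)` claims the concrete value equals a*x + b for some integer x (i.e., is congruent to b modulo a), with a = 0 encoding an exactly known constant.

```python
from math import gcd

def const(c):
    # A known constant c: v = 0*x + c.
    return (0, c)

def add(p, q):
    # (a1*x + b1) + (a2*y + b2) = gcd(a1, a2)*z + (b1 + b2) for some z,
    # since gcd(a1, a2) divides any integer combination of a1 and a2.
    (a1, b1), (a2, b2) = p, q
    a = gcd(a1, a2)
    return (a, (b1 + b2) % a if a else b1 + b2)

def mul_const(p, c):
    # c*(a*x + b) = (c*a)*x + c*b
    a, b = p
    aa = abs(c * a)
    return (aa, (c * b) % aa if aa else c * b)
```

For example, adding a value of the form 4x + 1 to one of the form 6y + 3 yields `(2, 0)`: the analysis proves the sum is even, a fact neither operand alone reveals.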
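
The transfer functions mentioned under "Automating dataflow work" can likewise be illustrated with a tiny known-bits domain, modeled loosely on LLVM's KnownBits; the class and the hand-written rule below are sketches of what an automatically derived transfer function must compute, not the talk's actual code.

```python
class KnownBits:
    """Tracks, per bit position, whether a bit is known 0, known 1, or unknown."""
    def __init__(self, zeros, ones):
        # zeros: bitmask of positions known to be 0
        # ones:  bitmask of positions known to be 1
        assert zeros & ones == 0, "a bit cannot be known 0 and known 1"
        self.zeros, self.ones = zeros, ones

def and_transfer(a, b):
    # For bitwise AND: a result bit is known 1 only if both inputs are
    # known 1, and known 0 if either input is known 0.
    return KnownBits(a.zeros | b.zeros, a.ones & b.ones)
```

Writing such rules by hand for every instruction and every abstract domain is tedious and error-prone, which is exactly what motivates deriving them automatically from a formal specification of the facts.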