Tensor Programs I

By Greg Yang
Published on May 8, 2021

Table of Contents

1. Introduction
2. Contributions
3. Gaussian Process with Variable-Dimensional Output
4. GP Behavior of a Multilayer Perceptron (MLP)
5. NETSOR: Language for Expressing Neural Network Computation
6. Computing the GP Kernel from a NETSOR Encoding of a Neural Network

Summary

The document shows that wide neural networks with randomly initialized weights and biases are Gaussian processes, and extends this classical result to a broad range of modern architectures. It introduces the tensor programs technique used to prove these results, works out the corresponding Gaussian process kernels, and provides open-source implementations of the GP kernels for the architectures covered.
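To make the headline claim concrete, below is a minimal numerical sketch (my own illustration, not the paper's released code): it draws one wide, randomly initialized ReLU MLP with many output units and compares the empirical covariance of its outputs on two fixed inputs against the infinite-width GP kernel computed by the standard layerwise recursion. The width, depth, weight variance, and the choice of ReLU are assumptions made for illustration only.

```python
# Sketch: a wide random MLP's outputs look Gaussian with the recursively computed kernel.
# All hyperparameters here are illustrative assumptions, not the paper's settings.
import numpy as np

rng = np.random.default_rng(0)
relu = lambda z: np.maximum(z, 0.0)

def wide_mlp_outputs(xs, width=4096, depth=3, n_out=4096, sw2=2.0):
    """One forward pass of a randomly initialized ReLU MLP with many output units.
    Weights ~ N(0, sw2 / fan_in); biases are omitted to keep the sketch short."""
    h = xs  # shape (num_inputs, d_in)
    for _ in range(depth):
        W = rng.normal(0.0, np.sqrt(sw2 / h.shape[1]), (h.shape[1], width))
        h = relu(h @ W)
    V = rng.normal(0.0, np.sqrt(sw2 / width), (width, n_out))
    return h @ V  # shape (num_inputs, n_out)

def nngp_kernel(xs, depth=3, sw2=2.0, mc=400_000):
    """Infinite-width kernel via the recursion K <- sw2 * E_{f ~ N(0, K)}[relu(f) relu(f)^T],
    starting from K = sw2 * x x^T / d_in; the Gaussian expectation is Monte Carlo estimated."""
    K = sw2 * (xs @ xs.T) / xs.shape[1]
    for _ in range(depth):
        f = rng.multivariate_normal(np.zeros(len(xs)), K, size=mc)
        g = relu(f)
        K = sw2 * (g.T @ g) / mc
    return K

xs = np.array([[1.0, 0.0], [0.6, 0.8]])   # two unit-norm inputs
out = wide_mlp_outputs(xs)                # (2, n_out) matrix of output-unit values
print("empirical covariance across output units:\n", (out @ out.T) / out.shape[1])
print("infinite-width GP kernel prediction:\n", nngp_kernel(xs))
```

With width and n_out in the thousands, the two printed 2x2 matrices should agree to within a few percent. The tensor programs machinery described in the document is what justifies the analogous agreement for architectures well beyond this plain MLP.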