I LOVE PyTorch for experimenting with <i>dynamic</i> deep neural nets (DNNs) -- that is, DNNs that can have different graphs for different input samples. I find it much, MUCH easier to create and tinker with dynamic DNNs in PyTorch than in, say, TensorFlow Fold. PyTorch is great for R&D experimentation.<p>For example, here's how easy it is to construct a fully-connected neural net with a <i>dynamically random</i> number of recurrent hidden layers. Yes, it's a silly example, but it shows how little code a dynamic DNN takes in PyTorch:<p><pre><code> import random
 import torch

 class MySillyDNN(torch.nn.Module):
     def __init__(self, input_dim, hidden_dim, output_dim):
         super(MySillyDNN, self).__init__()
         self.input_layer = torch.nn.Linear(input_dim, hidden_dim)
         self.hidden_layer = torch.nn.Linear(hidden_dim, hidden_dim)
         self.output_layer = torch.nn.Linear(hidden_dim, output_dim)

     def forward(self, x, max_recurrences=3):
         hidden_relu = self.input_layer(x).clamp(min=0)
         # The recurrence depth is drawn at run time, so each forward
         # pass can trace a different graph through the (weight-tied)
         # hidden layer.
         for _ in range(random.randint(0, max_recurrences)):
             hidden_relu = self.hidden_layer(hidden_relu).clamp(min=0)
         y_pred = self.output_layer(hidden_relu)
         return y_pred
 </code></pre>
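To make that concrete, here's a minimal usage sketch (the dimensions and input are made-up values just for illustration; assumes the class above is in scope):<p><pre><code> net = MySillyDNN(input_dim=4, hidden_dim=8, output_dim=2)
 x = torch.randn(1, 4)  # a single random input sample

 # Two forward passes on the same input: each call draws its own
 # recurrence depth, so each call can trace a different graph.
 y1 = net(x)
 y2 = net(x)
 print(y1)
 print(y2)
 </code></pre>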
It would be a hassle to do something like this in a static-graph framework like TensorFlow or Theano, which requires you to specify the full computational graph (including any conditionals) before you can run it.<p>PyTorch's define-the-graph-by-running-it approach is so nice for quick-and-dirty experimentation with dynamic graphs.<p>You can even create and tinker with dynamic graphs <i>interactively</i> on a Python REPL, as in the sketch below :-)
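<p>For instance, a REPL session might look something like this (hypothetical dims; the outputs vary from run to run since the depth is random):<p><pre><code> >>> net = MySillyDNN(4, 8, 2)
 >>> x = torch.randn(1, 4)
 >>> net(x).shape
 torch.Size([1, 2])
 >>> net(x, max_recurrences=10).shape  # crank up the possible depth on the fly
 torch.Size([1, 2])
 </code></pre>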