I'm very puzzled about the Swift for TensorFlow investment. I understand it exists primarily because of Chris Lattner's personal involvement... but given Google's massive mindshare in Kotlin/Dart/JavaScript, it's a strange strategic choice.<p>Kotlin is successful because Google was able to drive the direction of the language. For example, AndroidX Compose is being built with compiler extensions in Kotlin, in partnership with JetBrains.
Dart is obviously driven by Google.<p>Swift is going to be controlled and driven by Apple. From a strategic perspective, I don't understand why Google would make an investment like this when Swift for TensorFlow fundamentally needs to drive changes in the language itself.<p>Now, I would understand if Google didn't have an alternative. But is Kotlin really so much worse than Swift that you couldn't take a similar approach with it? Kotlin is currently one of the world's most popular languages, born out of sheer love from the community rather than being pushed by one of the giants.
To those upvoting this and the previous Swift + TF announcements: what are you excited about, specifically? Why Swift? Why not Julia? Is it the syntax? The types? Compilation? Performance? Community?<p>I like Swift, but our ML/DL/RL/DS tools and libraries are in Python (and occasionally R). Most of them are missing for Swift, short of going through an awkward Python compatibility layer, and I don't see a compelling reason to adopt it.
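For reference, the compatibility layer I mean looks roughly like this; a minimal sketch assuming the `Python` interop module that ships with the Swift for TensorFlow toolchains:

    import Python  // Python interop module bundled with S4TF toolchains

    // Import an existing Python library and call into it from Swift.
    let np = Python.import("numpy")
    let x = np.array([1.0, 2.0, 3.0])
    let doubled = np.multiply(x, 2)
    print(np.sum(doubled))  // 12.0, computed by NumPy via dynamic PythonObject calls

It works, but every result comes back as a dynamically typed PythonObject, so you lose most of what Swift's type system would otherwise buy you.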
Is Swift a common language for the kind of thing TensorFlow is used for? Given that Swift doesn't work on Windows and its Linux support is subpar, is this project happening only because of Chris Lattner?
Glad to see the update. I have experimented with both the macOS and Linux drops before, and they were rough; v0.3.1 is focused on providing a productive platform. Anyway, I am traveling without a laptop, so I'll have to wait to try it.<p>The idea of LLVM/Swift ‘turtles all the way down’ might really pay off in the long run. In TensorFlow there is an impenetrable barrier (for me, at least) between the Python layer and the lower-level C++ code. It's good to see this Swift effort alongside Julia + Flux, which offers the same advantage of using one implementation language for the whole system.
I took the recent fastai course that had two lessons about S4TF taught by Chris Lattner and walked away excited.<p>It's not ready for prime time yet, but the roadmap looks great. Building auto-differentiation into the language and giving anyone an easy way to experiment with core architecture changes (without killing performance) will be a huge win for everyone; a sketch of what that looks like is below.<p>I expect a lot of prototyping will be done on S4TF (because XLA and MLIR will make it easy for developers to keep things on the GPU), and the things that work will then be back-ported to other ecosystems.<p><a href="https://medium.com/tensorflow/mlir-a-new-intermediate-representation-and-compiler-framework-beba999ed18d" rel="nofollow">https://medium.com/tensorflow/mlir-a-new-intermediate-repres...</a>
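For the curious, language-integrated autodiff looks roughly like this; a minimal sketch using API spellings from the 0.x S4TF toolchains, which have shifted between releases:

    import TensorFlow  // Swift for TensorFlow toolchain; provides the autodiff support

    // @differentiable asks the compiler to synthesize the derivative of this function,
    // so there is no tape or graph-tracing library involved.
    @differentiable
    func loss(_ w: Float) -> Float {
        return (w - 3) * (w - 3)
    }

    // gradient(at:in:) evaluates the compiler-generated derivative at w = 1.
    let dLossDw = gradient(at: Float(1)) { w in loss(w) }
    print(dLossDw)  // -4.0, i.e. 2 * (1 - 3)

Because differentiation is a compiler feature rather than a library, the same mechanism works on ordinary Swift functions and custom types, not just tensor ops.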
If you missed it, Chris gave a talk a little while back, and went into detail regarding the motivation behind the project, what Swift provides, etc. It answers a lot of the questions being asked here:<p><a href="https://www.tensorflow.org/swift" rel="nofollow">https://www.tensorflow.org/swift</a>