It's a shame. I had high hopes at the beginning that S4TF - and the investment in Swift from Google - would help Swift break out of the iOS ghetto and cement it as a mainstream language.<p>Swift's a delightful language to use. It has a lot of the nice things about Rust's type system, but is a heck of a lot easier to use at the expense of a bit of performance. For a lot of use cases, I think this is a great value proposition. ML/data science is a great example where practitioners could benefit from a more grown-up type system than what Python has on offer, but would still be able to keep low-level details at arm's length.<p>I think serverless would be another ideal use case for Swift, where the productivity, clarity and correctness tools it offers would be a huge benefit.<p>Some very interesting things came out of the S4TF project - like the work on autodiff and Python interop. It's a shame the project never really seemed to get legs; it seems like it just languished for years with no clear direction.<p>These days I do most of my work in Rust, and I'm happy to do so because the tooling, community, and ecosystem are really amazing. But there are a lot of language concepts and features I miss from Swift. I guess it goes to show that the governance and availability of a language have a lot more to do with adoption than the merits of the language itself.
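The Python interop mentioned above survives as the PythonKit package. As a rough illustration of what calling into NumPy from Swift looks like - a minimal sketch, assuming PythonKit is added as a package dependency and NumPy is installed in the host Python:

```swift
import PythonKit  // the package that grew out of the S4TF Python interop work

// Python modules are imported dynamically; `np` is a PythonObject whose
// members are resolved at runtime against the host Python interpreter.
let np = Python.import("numpy")

let x = np.arange(12).reshape(3, 4)
print(x.mean(axis: 0))          // keyword arguments pass straight through

// Results convert back into native Swift types when needed.
let total = Double(x.sum())!
print(total)                    // 66.0
```

The appeal is that exploratory code can lean on the Python ecosystem while the surrounding Swift code keeps its static types.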
Back when I was doing a lot of Swift programming, this seemed like such a great idea. Since then I moved on to Julia. And when I looked at the ML code for Swift and compared it to Julia, I was thinking "How on earth could anyone think this was ever a good idea?"<p>Doing machine learning stuff in Julia is simply much more user friendly than doing the same in Swift. Swift is nice for iOS development, but I think in data science and machine learning Julia will always be a much better choice.<p>It is a pity Google did not go for Julia. Julia has gotten exceptionally far despite limited resources. With Google-style resources, Julia could have been massive.<p>Despite modest investment, the Julia JIT beats almost anything out there. It would have been amazing to see what they could have pulled off if they had put the kind of resources that went into the JavaScript V8 JIT into the Julia JIT.<p>Julia uses a method JIT, which is quick to implement but makes latency suffer. Meaning that if you load a whole new package, the first call to a function can incur a noticeable delay. With a JavaScript-style tracing JIT added into the mix, one could probably have had both high performance and low latency.<p>Alternatively, with more investment they could have had better precompile caching, meaning libraries that had been loaded in the past would JIT really fast. Julia does that today, but it could work a lot better than it currently does. It isn't that it can't be done; it just requires resources.
S4TF lost out, not to Python, but to Python's AI/ML ecosystem -- people, projects, libraries, frameworks.<p>Despite its many shortcomings, Python has become the <i>lingua franca</i> of AI and ML, to the point that whenever I come across a newly published AI or ML paper that I find interesting, I <i>expect</i> to be able to find code implementing it in Python.<p>For example, yesterday I saw a post on HN about approximating self-attention matrices in transformers, which have O(n²) computational cost, with a seemingly clever approach that has O(n) computational cost: <a href="https://news.ycombinator.com/item?id=26105455" rel="nofollow">https://news.ycombinator.com/item?id=26105455</a> . "Huh, that looks interesting," I thought, "let me see if I can find some code." Three clicks later, I found myself at <a href="https://github.com/mlpen/Nystromformer" rel="nofollow">https://github.com/mlpen/Nystromformer</a> -- the official implementation... in Python. A few moments later, I was playing with this thingamajiggy. No other language comes close to having this kind of ecosystem in AI and ML.<p>Julia still has a shot at becoming a viable alternative to Python, especially if the Julia developers can shorten the "time to first interactive plot" to make it as fast as Python's, but they face an uphill battle against such an entrenched ecosystem.
Swift for TensorFlow was a nice idea on paper, but you need more than ideas to push a project forward. You need people who actually want to use your product.<p>For all the comments on HN about how great Swift was, there were hundreds if not thousands of ML engineers and research scientists who did not know it existed and frankly did not really care about it either.<p>Swift for TensorFlow was not addressing the issues of ML in production and felt like "just another framework", but with the added handicap of needing its users to learn an entirely new language, which is not something most scientists want to do. One might argue that engineers might be more interested, but in a lot of cases the science team will push the models and then the engineering team will put them in production, leaving very little room to migrate from TF/PyTorch to Swift for TF.
Swift now being a 100% Apple-sponsored & owned project again makes me a bit nervous.<p>Does anyone know if Chris Lattner is at least using Swift at his new company? I have the feeling Swift never really took off on the server side, data science is now officially a failure, and all that is left is the very niche market of 100% native mobile development.<p>I love this language, but I'm eager to see it handled by a proper foundation with big names behind it.
Looks like @throw6606 was right: <a href="https://news.ycombinator.com/item?id=24533937" rel="nofollow">https://news.ycombinator.com/item?id=24533937</a>. Does anyone know the status of <a href="https://ai.facebook.com/blog/paving-the-way-for-software-20-" rel="nofollow">https://ai.facebook.com/blog/paving-the-way-for-software-20-</a>...?
Looks like Chris Lattner's "exec commitment and strong roadmaps" comment was not that meaningful:<p><a href="https://twitter.com/clattner_llvm/status/1222032740897284097?s=19" rel="nofollow">https://twitter.com/clattner_llvm/status/1222032740897284097...</a>
No surprises there. There were maybe a total of 5 people excited about adding automatic differentiation to Swift. Too bad they didn't try improving Julia instead.
Every time I see a post about TensorFlow for Swift, there seem to be so many misconceptions about why it was created. The top one being: oh, it's Chris L's baby.<p>As a software engineer who's been working on teams for the past 15 years, seeing the craft devolve and the market saturate with people who don't understand the fundamentals but memorize the frameworks, you need a language like Swift to bridge that gap.<p>It's hard enough to get people to program to an interface, let alone communicate what they are going to return from a function or how a function behaves. 70% of people don't understand that software engineering is legitimately just plumbing; they think it's pretty much an artistic endeavour with no rhyme or reason. That makes it pretty difficult to stay on the same page when building a product to scale up.<p>Having a strong type system solves that problem. When working with ML people, my question 100% of the time is: what does that function return, and what type? I need that to build off what they are doing, and even debugging requires understanding the types and the values.<p>It seems like team-oriented programming is foreign to most people.<p>Swift has a low cognitive load, pushing you to solve problems with software engineering and reducing communication lead time between engineers.<p>All languages eventually converge toward Swift; I think Chris L has solved the language UI problem.
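As a minimal Swift sketch of the types-as-communication point above - the `Model` protocol and `LinearRegressor` are hypothetical names, purely for illustration - the signature alone tells a teammate what goes in and what comes out:

```swift
// Hypothetical interface: "program to an interface" with explicit types.
protocol Model {
    associatedtype Input
    associatedtype Output
    func predict(_ input: Input) -> Output
}

// A concrete conformance with self-documenting input and output types.
struct LinearRegressor: Model {
    var weights: [Double]
    var bias: Double

    func predict(_ input: [Double]) -> Double {
        var result = bias
        for (w, x) in zip(weights, input) {
            result += w * x
        }
        return result
    }
}

let model = LinearRegressor(weights: [0.5, -1.0], bias: 2.0)
let y: Double = model.predict([4.0, 1.0])   // 0.5*4 - 1.0*1 + 2 = 3.0
print(y)
```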
I love Swift and its potential for use cases beyond just the iOS ecosystem, but I think Apple never fostered or encouraged the open source community to get involved in its development.<p>It's another classic example where it's simply not enough to open source something: you have to really allow and encourage outside developers to contribute if you want to increase adoption.
S4TF from my (external) perspective was always more a (very cool!) programming language research project than anything else, at least since Lattner left.<p>I would personally assume the shutdown was due to a combination of reasons:<p>- There simply being no good reason for Python users to ever move to Swift. There is no big pain point being solved for the broad ML user community<p>- Organisational momentum lost with Lattner leaving<p>- General disarray of TensorFlow as a project, and the parallel rise of JAX from/within prominent Google orgs, it being the cool new thing
It's awesome to see this whole comment section be about Julia! As a community, eventually we'll realize that we don't have the bandwidth to keep inventing the same thing over and over just so a new technique can be used efficiently. Currently, Julia is the only language that makes this kind of deep reuse possible, by being efficient through just-ahead-of-time compilation that specialises well through open multiple dispatch.
I'm curious which of the many "evolutions" that went into Swift in the last few years were motivated heavily by its dance with TensorFlow. If there were any, can we take them back out, now that the marriage is over?<p>One of the things that frustrates me with a lot of the languages I'm having to grok these days is that they often lack consistency, because it's one carrot after another to get buy-in from different sub-communities. And in the end it tastes like vegetable soup--edible, but not tasty (if you love vegetable soup, this reach of an analogy probably won't work for you).
Another sad lesson in why no open source developer should trust Google with their time and effort.<p>Swift is a gem of a language with an unfortunately Apple-centric ecosystem, but at least you know Apple won't abandon the effort.
I think people are deceived by Swift's syntax. It looks simple thanks to syntactic sugar, but it's not actually any more approachable than other statically typed languages like C# or Java. Given that a lot of work in ML is exploratory, it doesn't seem like a good fit compared to a hackable scripting language like Python. I would bet against SwiftUI for similar reasons.
This seems like good news for Julia. Some wondered why Google would develop S4TF with an LLVM backend when Julia already existed and had better support for things like linear algebra than Swift had.
It seems the writing was on the wall once Chris Lattner left the project. It's a shame, since this could have been a real breakthrough in machine learning, combining performance and ease of use. But the success of JAX means Google probably doesn't feel the investment is worth it.
> Added language-integrated differentiable programming into the Swift language. This work continues in the official Swift compiler.<p>added lots of novel paradigms and rarely trodden code paths to the compiler and then peaced out.<p>I hope the inherent complexity added doesn’t impede future Swift development much, or is expeditiously deleted.
I'm looking through the meeting notes[1] linked from the archive PR[2] and it looks like there hasn't been much activity on the project recently. Can anyone with more insight provide an explanation on what happened?<p>[1] - <a href="https://docs.google.com/document/d/1Fm56p5rV1t2Euh6WLtBFKGqI43ozC3EIjReyLk-LCLU/edit" rel="nofollow">https://docs.google.com/document/d/1Fm56p5rV1t2Euh6WLtBFKGqI...</a><p>[2] - <a href="https://github.com/tensorflow/swift/commit/1b1381ccb89a342bababd1511cc39b8ba7b0708b" rel="nofollow">https://github.com/tensorflow/swift/commit/1b1381ccb89a342ba...</a>
My reading of the discussion at <a href="https://forums.swift.org/t/swift-concurrency-roadmap/41611" rel="nofollow">https://forums.swift.org/t/swift-concurrency-roadmap/41611</a>: the Google Swift team members expressed hope for a bit more community involvement in the Swift language planning, and Apple folks very politely told them to shove it where the sun don't shine.<p>My guess: Google concluded that investing in a language developed behind closed doors of another company is a bad deal.<p>I don't blame Apple, it may be a good business decision for them, and of course it is completely within their rights.
I think it is important to note (for those who didn't read) that much of the work on the core of the language is being upstreamed.<p><a href="https://forums.swift.org/t/differentiable-programming-for-gradient-based-machine-learning/42147" rel="nofollow">https://forums.swift.org/t/differentiable-programming-for-gr...</a><p><a href="https://github.com/rxwei/swift-evolution/blob/autodiff/proposals/0000-differentiable-programming.md" rel="nofollow">https://github.com/rxwei/swift-evolution/blob/autodiff/propo...</a>
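For a sense of what the upstreamed work looks like, here is a minimal sketch of differentiable Swift as shipped in recent toolchains; the feature is still experimental, so the `_Differentiation` module name and the exact spelling of helpers like `gradient(at:of:)` may vary between toolchain versions:

```swift
import _Differentiation  // experimental module in recent Swift toolchains

// The compiler synthesizes a reverse-mode derivative for this function.
@differentiable(reverse)
func loss(_ w: Double) -> Double {
    (3 * w - 6) * (3 * w - 6)   // simple quadratic with its minimum at w = 2
}

// d(loss)/dw = 18w - 36, so the gradient at w = 1 is -18.
let g = gradient(at: 1.0, of: loss)
print(g)   // -18.0

// One step of plain gradient descent using the synthesized derivative.
var w = 1.0
w -= 0.05 * gradient(at: w, of: loss)
print(w)   // 1.9
```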
So that's it then - another decade of unchallenged dominance of AI by a Betamax scripting language designed by accretion for early 90s CPU architecture. I just don't get the whole Python for ML craze.
And TensorFlow for C# is alive and kicking: <a href="https://losttech.software/gradient.html" rel="nofollow">https://losttech.software/gradient.html</a><p>I guess this is sort of an ad.<p>Swift was a weird choice for statically typed TensorFlow, being popular only on a platform that does not have GPU/TPU support in TensorFlow, which is basically a requirement for any serious work. The fact that they had to fork the compiler did not help either.<p>TensorFlow for C# is much like TensorFlow for Swift. There's a statically typed binding to the core types, with the rest of the TensorFlow API available through an open source Python.NET project [1]. Unlike the Swift version, though, that second part is also mostly statically typed, so you can get IDE autocompletion hints. Also, the .NET runtime has native support for dynamic languages.<p>Like with Swift for TensorFlow, we ported all recent interesting neural network architectures (and maybe even more): Transformers (GPT-2) [2], CNNs (YOLOv4) [3], Q-learning (e.g. RL, actor-critic) [4], and even some cool ones that have lots of unexplored potential, like Siren [5] (this one uses sin/cos as activation functions).<p>Although we have not worked on automatic differentiation yet, unlike Swift it will not need compiler support: .NET (and Java) can inspect and generate code at runtime, so autodiff can be implemented in a library.<p>We also have integration with Unity ML Agents for training robotic agents [4].<p>[1] <a href="https://github.com/pythonnet/pythonnet/" rel="nofollow">https://github.com/pythonnet/pythonnet/</a><p>[2] <a href="https://github.com/losttech/Gradient-Samples/tree/master/GPT-2" rel="nofollow">https://github.com/losttech/Gradient-Samples/tree/master/GPT...</a><p>[3] <a href="https://github.com/losttech/YOLOv4" rel="nofollow">https://github.com/losttech/YOLOv4</a><p>[4] <a href="https://github.com/losttech/Gradient-Samples/tree/master/RL-MLAgents" rel="nofollow">https://github.com/losttech/Gradient-Samples/tree/master/RL-...</a><p>[5] <a href="https://github.com/losttech/Siren" rel="nofollow">https://github.com/losttech/Siren</a> <a href="https://vsitzmann.github.io/siren/" rel="nofollow">https://vsitzmann.github.io/siren/</a>
This is sad for the project itself, but I predict that the coming programming competition will be Rust vs. Go, not Swift -- as Swift was always a distant third among the "new" languages. And of course the more mature languages will continue to be popular and widely used for decades.<p>In my opinion, Rust will end up winning this contest, but others have perfectly good arguments against that view. Unfortunately for server-side Swift and the many good people who've worked on it, there seems to be little chance for it, in TensorFlow or other use cases.
I am disappointed, but so it goes. I started writing a Swift AI book a while back, and I had planned that about 20% of the book would be on S4TF. I still have lots of other material, but obviously I am not going to include the S4TF material.<p>BTW, the Haskell bindings for TensorFlow have been actively maintained for years. The latest update was 3 days ago.
> Added language-integrated differentiable programming into the Swift language. This work continues in the official Swift compiler.<p>While the readme tells us that the project is in archive mode, it also tells us that the differentiable programming part is still being worked on by the compiler team.
Is there further information on that?
Is it safe to assume that this is the main driver for this event? <a href="https://www.businessinsider.com/chris-lattner-former-apple-google-engineer-swift-ai-startup-sifive-2020-2" rel="nofollow">https://www.businessinsider.com/chris-lattner-former-apple-g...</a>
What is really needed in the ML industry is a code-independent "neural net interpreter" with an API that binds to any programming language.<p>How hard is it to have a program that takes in neural network architectures & trained weights and just runs inference?
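Something close to this already exists in the form of model interpreters such as TensorFlow Lite or ONNX Runtime, which expose a C core with bindings for many languages. A rough sketch of the idea using the TensorFlowLite Swift package - the model path and input shape below are made up for illustration:

```swift
import Foundation
import TensorFlowLite   // Swift binding over the TFLite C API

do {
    // Load a trained model file; the interpreter only needs the graph + weights.
    let interpreter = try Interpreter(modelPath: "/path/to/model.tflite")  // made-up path
    try interpreter.allocateTensors()

    // Feed one input tensor as raw bytes (4 floats here; the shape is made up).
    let input: [Float32] = [0.1, 0.2, 0.3, 0.4]
    let inputData = input.withUnsafeBufferPointer { Data(buffer: $0) }
    try interpreter.copy(inputData, toInputAt: 0)

    // Run inference and read the output tensor back into Swift floats.
    try interpreter.invoke()
    let output = try interpreter.output(at: 0)
    let results = output.data.withUnsafeBytes { Array($0.bindMemory(to: Float32.self)) }
    print(results)
} catch {
    print("Inference failed: \(error)")
}
```

The same model file can be driven through the equivalent Python, Java, or C bindings, which is essentially the "binds to any programming language" property asked for above.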
Not surprised that this project had trouble gaining traction, given Swift's obsession with shoehorning functional concepts into every crevice of the language; e.g., I can't get through a single Swift tutorial without having to get into closures.
Swift for TensorFlow never made any sense. I remember the only reason it started was that one main developer on the team liked Swift.<p>TensorFlow for Node.js makes much more sense; the Node community is a lot bigger and not sponsored by a big corp.