I’m most worried about the rise of fintech apps enabled by APIs like Plaid. The media seems more worried about 10-year-old Facebook likes being sold than about a perpetual real-time feed of bank transaction data ending up in the wrong hands or in the hands of a nefarious developer.<p>For the record, I’m highly critical of Plaid and hope the tech media catches on soon. They do not require developers to communicate which permissions they are asking for when onboarding new customers (I don’t think that’s even an option if developers wanted it), and there’s no central UI for an end customer to review the permissions they’ve granted across developers and revoke them. I don’t think Plaid has any requirements to encrypt this data on the developer side, and I have no idea how they audit developers to make sure they aren’t using various endpoints in violation of their developer terms.
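To make the onboarding point concrete, here is roughly what the developer side looks like: the data scope is picked when the developer creates a Link token, and nothing in that flow forces it to be surfaced to the end user. A minimal sketch against Plaid’s sandbox (the credentials, app name, and user id below are placeholders):

```python
import requests

# Developer-chosen scope: nothing requires showing this list to the end user.
# Placeholder credentials; real values come from the Plaid dashboard.
PLAID_CLIENT_ID = "your-client-id"
PLAID_SECRET = "your-sandbox-secret"

resp = requests.post(
    "https://sandbox.plaid.com/link/token/create",
    json={
        "client_id": PLAID_CLIENT_ID,
        "secret": PLAID_SECRET,
        "client_name": "Example Fintech App",
        "user": {"client_user_id": "user-123"},
        "products": ["transactions"],  # ongoing transaction feed once linked
        "country_codes": ["US"],
        "language": "en",
    },
    timeout=10,
)
link_token = resp.json()["link_token"]
# The front end then opens Plaid Link with this token. The scope above is
# baked in by the developer; per the point above, there is no central
# user-facing place to review or revoke it later.
```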
Shoshana Zuboff is one of those people who make me upset when I discover them. Why didn't I hear about her and her books much earlier? Is it only because she doesn't market her books well enough?<p>In the Age of the Smart Machine (1988) is truly visionary and well written.<p>edit:<p>I'm currently reading The Age of Surveillance Capitalism.<p>The book has well-developed concepts like 'behavioral surplus' and 'instrumentarianism'. There are also clever terms like 'radical indifference', 'observation without witness', and 'equivalence without equality'. They are just plain insightful. I can instantly recognize them as things I could not conceptualize before.
I manage a deep learning team, but I have some reservations about the technology. IMO deep learning is best for optimizing back-end systems and not good for systems that ‘touch’ people: deciding whether to loan money, automated sentencing of criminals, targeted marketing based on personal information, etc.<p>For me, the problems are lack of explainability and possible bias.<p>There are many great applications for deep learning and AI in general, but some guard rails must be in place for the public good.
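On the explainability point: post-hoc attribution tools like SHAP are the usual partial answer, and they only go so far. A rough sketch of what that looks like, on synthetic data with made-up feature names (not anything from our systems):

```python
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for a loan-style dataset; features are invented.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))  # e.g. income, debt ratio, age (scaled)
y = (X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=1000) > 0).astype(int)

model = GradientBoostingClassifier().fit(X, y)

# Post-hoc explanation of a single decision: which features pushed the
# score up or down for this one applicant.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:1])
print(shap_values)  # per-feature contributions to this prediction
```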
IMHO, “deep” is just the new “smart”. People are doing the same things they were doing before, just bigger and better, but when you are building a new company you need to use the adjective of the times.<p>We have had “smart”-everything, and it already sounds tired; hence “deep”-everything. Let’s see how long it lasts.
I've recently realized how trivial it is to detect suspicious activity in real-time video feeds just by tracking human poses, and how this is now an almost completely solved problem (what remains is mostly incremental improvement in the accuracy of the underlying models). I doubt this will in any way "democratize AI"; instead it might end up as a powerful weapon of oppression. No wonder most of the papers on this topic originate from China.
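To make the "trivial" part concrete, here is a minimal sketch of the kind of pipeline I mean, using off-the-shelf pose estimation (MediaPipe plus OpenCV here; the "suspicious activity" rule is a deliberately dumb placeholder for whatever classifier a real system would run over the pose stream):

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)  # a file or RTSP camera source works the same way
with mp_pose.Pose(min_detection_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Pose landmarks for the person in frame (normalized 0..1 coordinates).
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            nose = lm[mp_pose.PoseLandmark.NOSE]
            l_wrist = lm[mp_pose.PoseLandmark.LEFT_WRIST]
            r_wrist = lm[mp_pose.PoseLandmark.RIGHT_WRIST]
            # Toy rule: both wrists above the head -> flag the frame.
            # Real systems replace this with a model over pose sequences.
            if l_wrist.y < nose.y and r_wrist.y < nose.y:
                print("flagged: hands raised")
cap.release()
```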
> "That’s why the adjective that so many people are affixing to all of these new capabilities to convey their awesome power is “deep.”"<p>One of the best pieces of academic marketing was calling this set of techniques "deep" learning. The word is so rich with connotations, it immediately brings to mind all the synonyms: profound, complex, arcane, etc. It makes people ascribe <i>far</i> more complexity to the system than it actually has.<p>When in reality, it's just a "massively multi-layered and multi-stage" network. But that doesn't sound nearly as profound, and doesn't allow journalists to spin wild tales.
I wonder if it’s time for software engineers to form our own union or guild to combat the misuse of our profession in corrupt and immoral ways. We would have immense power as a group, but on our own we’re each beholden to our employers, which makes us complicit in doing work without thought for the long-term societal damage we do.
I posed the following to @Dang a few days ago, with respect to what one might think is, at minimum, a responsibility of YC (and the greater VC/SV population) to acknowledge -- though I don't see this happening any time soon:<p>----<p><i>[How can we]</i> find a way to have a serious, objective talk with the greater community about the extraordinarily far-reaching impact of Silicon Valley on society, community, and culture as a whole?<p>Look at what has emerged in just the last decade and a half from "unicorns" in Silicon Valley:<p>* US policy seemingly being set/disrupted via Twitter<p>* Mental health studies coming out on the negative impact of Facebook<p>* Election manipulation through ad-powered platforms such as Google and FB<p>* Massive cultural dialogue and political revolutions being fueled through Twitter<p>* Assassinations being corroborated through an Apple Watch<p>* Global spying and surveillance conducted through all our connected technology<p>Just to name a few of the globally impactful issues of our day which stem directly from the efforts of Silicon Valley in particular and the tech industry in general.<p>As the preeminent VC firm in the mind of any young entrepreneur who wants to build the Next Big Thing, I would pose that YC has a social responsibility to, at a minimum, foster a conversation on these issues in a meaningful, serious, and deep manner.<p>What are the consequences of MASSIVE success of a company?<p>----
The article makes me wonder if the author is aware of the technical meaning of "deep" in the context of the term "deep learning." Not that I disagree with the article; these things tend to take on a life of their own, and that's just how it goes with language and culture. But at least in the case of machine learning, "deep" is not just arbitrary terminology meant to sound fancy: it refers to a series of breakthroughs that allowed incredible training performance on multi-layered neural networks, where "deep" specifically contrasts these results with the prior state of the art of three-layer networks. And presumably this use of the term is at the root of several of the other "hyped" uses of it, perhaps with the exception of "deep state", so it's frustrating to see it thrown into the same basket.
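Concretely, the difference between those older three-layer setups and a "deep" net is mostly how many hidden layers you stack (plus the breakthroughs that made such stacks trainable at all). A toy Keras sketch, with arbitrary layer sizes:

```python
from tensorflow import keras
from tensorflow.keras import layers

# The "shallow" prior art: input -> one hidden layer -> output.
shallow = keras.Sequential([
    keras.Input(shape=(784,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

# A "deep" network: the same idea, just with more hidden layers stacked up.
deep = keras.Sequential([
    keras.Input(shape=(784,)),
    *[layers.Dense(256, activation="relu") for _ in range(6)],
    layers.Dense(10, activation="softmax"),
])

shallow.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
deep.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```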
The term 'surveillance capitalism' has become rather misleading, especially since Snowden pretty much showed the whole thing was never really about terrorism or capitalism but about control. It forgets the relationship between big tech and the state, which today sometimes amount to the same thing.
The Intercept has published an interview with the author, and I found it compelling enough to immediately start reading the book.<p>> You’re not technically the product, she explains over the course of several hundred tense pages, because you’re something even more degrading: an input for the real product, predictions about your future sold to the highest bidder so that this future can be altered.<p>> it’s clear that surveillance capitalists have discovered that the most predictive sources of data are when they come in and intervene in our lives, in our real-time actions, to shape our action in a certain direction that aligns with the kind of outcomes they want to guarantee to their customers.<p>https://theintercept.com/2019/02/02/shoshana-zuboff-age-of-surveillance-capitalism/
If surveillance capitalism were so successful, you would expect overall ad spending to have spiked recently, since people claim to have found the holy grail that turns ads directly into profits. But it hasn't.