I mean, maybe? But ultimately, how will an artificial intelligence that does not have a body understand the visceral experience of kissing, or attach the same value to it that a young boy does, or the different values that a young girl does, or that a horny middle-aged woman does?<p>The whole presumption that subjective experience does not matter to 'being human' and can be replaced by gormless algorithmic surfaces seems flawed at its root. These things will never be human, because they ARE NOT human. They do not have human bodies, families, histories, friendships, fears, or hopes. They are not experiencing the passage of time; they are not participating in the dance of culture.<p>Perhaps we can substitute for all of these things algorithmically, but that requires the constructors of these machines to have some insight into what it means to be human, and I'm just not seeing that.
What <i>kind</i> of romance novels? Because I've been told the really cheap ones follow such fixed patterns that publishers have checklists and databases to prevent accidentally repeating the same story too blatantly between novels.<p>Which, if true, would actually make them an interesting source for machine learning, now that I think about it.
Another vapid AI story...lovely. The "...more human" in the title is just clickbait that makes it seem like Google and co. are on the cusp of developing a human-like AI, which, of course, is not the case at all. And romance novels are the finest example of "more human" literature that they could come up with? Right. Also, the "more" before "human" implies that a human-like AI is desirable...the more "human" the better. Really...why is that? There is a limit to how much "really cool shit" tech companies can pile onto the market before it becomes repetitive and, well, boring. The internet of things is a case in point: who wants to buy an internet-capable toaster or can opener for its own sake? The whole "tech is a fount of limitless awesomeness that will save the world" thing has been way oversold.
Why would you want to make artificial intelligence more human?<p>If it's just out of curiosity, I get it, but it doesn't seem to make sense beyond that.<p>As far as I can tell, the unspoken truth about the drive for AI is the creation of a new slave class: intelligent entities that do the bidding of those who own them. If we succeeded in creating AI capable of emotional responses, that would make it harder to control. It also opens Pandora's box with regard to the rights of AI. For example, should we start considering the intellectual and emotional fulfillment of an AI when designing it?<p>I'm not saying machines shouldn't be given intelligence, but perhaps we owe it to our creations not to burden them with the combination of emotions and restricted freedom to act upon them.