OpenAI seems like a completely disingenuous organization. They have some of the best talent in machine learning, but the leadership seems completely clueless.

1) (on cluelessness) If Sama/GDB were as smart as they claim to be, would they not have realized it is impossible to run a non-profit research lab that is effectively trying to "compete" with DeepMind?

2) (on disingenuousness) The original OpenAI charter cast OpenAI as an organization trying to save the world from nefarious actors and uses of AI. Who were such actors? To me it seemed like entities with vastly superior compute resources using the latest AI technology for presumably profit-oriented goals. There are few organizations in the world like that, namely FAANG and their international counterparts. Originally OpenAI sounded incredibly appealing to me, and to a lot of us here. But if their leadership had had more forethought, they would perhaps not have made this promise. Given the press and the money they accrued, it has become impossible to go back on that charter, so the only way to get out of the hole they dug was to turn the organization into a for-profit research lab. And if it commercializes a superior version of the very tools that Microsoft, Google, and the other large AI organizations are commercializing, how is OpenAI any different from them?

How do we know OpenAI will not be the bad actor that abuses AI, given their self-interest?

All we have is their charter to go by. But given how they are constantly "re-inventing" their organizational structure, what grounds do we have to trust them?

Do we perhaps need a new Open OpenAI? One that we can actually trust? One that is actually transparent about its research process? One that actually releases its code and papers and has no interest in commercializing them? Oh, that's right, we already have that -- research labs at AI-focused schools like MIT, Stanford, Berkeley (BAIR), and CMU.

I am quite wary of this organization, and I would encourage other HN readers to think more carefully about what they are doing here.