I am dealing with data that is currently stored as the JSONB data type in Postgres. This column needs to be used as a filter in the `where` clause. For example, a JSON document can contain a title, body, url, some boolean values, etc.<p>While Postgres does allow creating an index on this JSONB column, I am wondering if MongoDB would be better suited for this than Postgres?<p>Have you switched any project from Postgres or relational data to MongoDB or a similar NoSQL DB?<p>When dealing with JSON data, is Postgres better or MongoDB?
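For concreteness, here is a sketch of the kind of filter described above; the table name (`articles`) and column name (`payload`) are hypothetical:

```sql
-- Filter rows on keys inside a JSONB column.
-- ->> extracts a field as text, so booleans need a cast.
SELECT *
FROM articles
WHERE payload->>'title' = 'Hello'
  AND (payload->>'published')::boolean = true;
```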
Postgres, by far. The built-in JSON support is great. In addition, Postgres as a whole has decades of bullet-proofing behind it. MongoDB has many bugs and sharp edges, and not enough advantages to justify its use. In [an old analysis by Jepsen](<a href="https://jepsen.io/analyses/mongodb-4.2.6" rel="nofollow noreferrer">https://jepsen.io/analyses/mongodb-4.2.6</a>), MongoDB 4.2.6 exhibited "read skew, cyclic information flow, duplicate writes, and internal consistency violations. Weak defaults meant that transactions could lose writes and allow dirty reads."<p>I will note that json/text columns can be better than jsonb: lower serialization overhead and, surprisingly, a smaller on-disk size. Or better yet, just normalize the data into regular columns.
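To make the indexing point concrete, here is a sketch of a GIN index on a jsonb column; the table and column names (`articles`, `payload`) are assumptions, not from the original:

```sql
-- A GIN index on a jsonb column accelerates containment (@>)
-- and key-existence (?) queries.
CREATE INDEX idx_articles_payload ON articles USING GIN (payload);

-- This query can use the index:
SELECT * FROM articles WHERE payload @> '{"published": true}';
```

If only a handful of keys are ever filtered on, an expression index such as `CREATE INDEX ... ON articles ((payload->>'title'))` is a smaller, more targeted alternative.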