The idea behind this library is to provide an alternative way to query databases: GraphQL instead of SQL. Unlike direct SQL or an ORM, you don't have to write out all the steps required to get the data you want in the structure you need; you just describe it with GraphQL and get exactly that data back.

    package main

    import (
        "context"
        "database/sql"
        "fmt"
        "log"

        "github.com/dosco/super-graph/core"
        _ "github.com/jackc/pgx/v4/stdlib"
    )

    func main() {
        db, err := sql.Open("pgx", "postgres://postgres:@localhost:5432/example_db")
        if err != nil {
            log.Fatal(err)
        }

        sg, err := core.NewSuperGraph(nil, db)
        if err != nil {
            log.Fatal(err)
        }

        // And here's the GraphQL query to fetch posts, comments, author, etc.
        query := `
        query {
            posts {
                id
                title
                body
                comments(limit: 5) {
                    id
                    user {
                        id
                        name
                    }
                }
                user {
                    id
                    name
                }
            }
        }`

        res, err := sg.GraphQL(context.Background(), query, nil)
        if err != nil {
            log.Fatal(err)
        }

        fmt.Println(string(res.Data))
    }

https://github.com/dosco/super-graph
Yes, but how do you work with the data once you get it back? I typically decode it into Go structs.

What happens when the data model changes? How does type checking and validation happen?

Your example looks cut off on the right; certain lines seem truncated.
I was thinking of the exact same concept the other day; I need something along these lines for one of my own projects. Anyway, I’ll take a closer look at this during the weekend :)