If you regularly do command line JSON requests, I'm a big fan of HTTPie. It's so much easier to use correctly. <a href="https://httpie.io/docs/cli/examples" rel="nofollow">https://httpie.io/docs/cli/examples</a><p>For example, here's a JSONy POST request with cURL:<p>curl -s -H "Content-Type: application/json" -X POST <a href="https://api.ctl.io/v2/authentication/login" rel="nofollow">https://api.ctl.io/v2/authentication/login</a> --data '{"username":"YOUR.USERNAME","password":"YOUR.PASSWORD"}'<p>Here's that same request with HTTPie:<p>http POST <a href="https://api.ctl.io/v2/authentication/login" rel="nofollow">https://api.ctl.io/v2/authentication/login</a> username=YOUR.USERNAME password=YOUR.PASSWORD
The --jp (json part) command line option, described at <a href="https://github.com/curl/curl/wiki/JSON" rel="nofollow">https://github.com/curl/curl/wiki/JSON</a>, has "anti-pattern" written all over it to me. Why introduce some specific, curl-only, wonky-ish version of JSON? Is this any easier to remember than normal JSON? I mean, right now I use cURL all the time with JSON posts, just doing something like<p>-d '{ "foo": "bar", "zed": "yow" }'<p>The proposed --jp flag seems worse to me in every way.<p>(Note I do like --json as just syntactic sugar for -H "Accept: application/json" -d <jsonBody>.)
> --jp a=b --jp c=d --jp e=2 --jp f=false<p>Uh oh, this looks like it would have the problems of YAML: the data type changes based on the provided string.
To everyone saying "just use tool x for this": the advantage of curl is that it is so widely available.<p>For your development laptop you can install anything you want, but more often than not you need to log into an EC2 instance, a Docker container, you name it.<p>Curl is often pre-installed or very easy to install. I know it's usually not an up-to-date version, but as time goes by you will be able to rely on this feature on pretty much any machine.
I was wondering, "Why not pipe output to jq?" up until I read this:<p>> A not insignificant amount of people on stackoverflow etc have problems to
> send correct JSON with curl and to get the quoting done right, as json uses
> double-quotes by itself and shells don't expand variables within single quotes
> etc.<p>It's about sanitized inputs.
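The quoting trap described there is easy to reproduce. A minimal sketch of the two failure-prone approaches (variable and field names are made up for illustration):

```shell
# JSON needs double quotes, but single quotes -- which would protect
# them from the shell -- also block $variable expansion.
user="alice"

# Single quotes: the shell leaves $user literal, so the server would
# receive the string "$user" rather than "alice".
literal='{"username": "$user"}'

# Escaping every inner double quote works, but is easy to get wrong.
escaped="{\"username\": \"$user\"}"

echo "$literal"
echo "$escaped"
```

The first line prints the unexpanded `$user`; only the second produces the intended JSON, at the cost of escaping every quote by hand.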
Sounds like this idea is limited to the curl tool and wouldn't add anything to libcurl, which is great. I'd prefer libcurl leaving JSON to other libraries.<p>I use bash variables inside JSON with curl all the time, which leads to string-escape screw-ups. I know there are alternatives that make testing REST + JSON easier, but since our software uses libcurl in production I prefer to test with curl to keep things consistent.
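One way to sidestep those escape screw-ups, assuming jq is available alongside curl, is to let jq do the interpolation (the endpoint and field names here are illustrative):

```shell
# jq's --arg passes a shell variable in as a pre-escaped JSON string,
# so quotes and backslashes in the value can't break the document.
password='p@ss"word'   # deliberately contains a double quote

body=$(jq -cn --arg u "j.doe" --arg p "$password" \
    '{username: $u, password: $p}')
echo "$body"

# Then hand the finished body to curl, e.g.:
#   curl -d "$body" -H "Content-Type: application/json" https://example.com/login
```

The embedded double quote arrives in the body correctly escaped as `\"`, with no manual escaping in the shell.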
I like using the httpie CLI, in part because it has a nice interface for sending JSON and receiving JSON: <a href="https://httpie.io/docs/cli/json" rel="nofollow">https://httpie.io/docs/cli/json</a>
I can see this being useful, but I'm not looking forward to the list of command line options being even longer. The output of "curl --help" on my system is already 212 lines long.<p>I wish the curl command was split such that different protocols had different commands. I REALLY don't want to see a list of FTP specific command line options whenever I'm just trying to look up a lesser-used HTTP option.<p>That said, this is really a minor gripe compared to just how useful curl has been for me over the years.
In the linked GitHub wiki there's an example of the syntax of the suggested --jp flag used to pass key-value pairs and put them together as a JSON object:[1]<p><pre><code>--jp a=b --jp c=d --jp e=2 --jp f=false</code></pre><p>Gives:<p><pre><code>{
  "a": "b",
  "c": "d",
  "e": 2,
  "f": false
}</code></pre><p><pre><code>--jp map=europe --jp prime[]=13 --jp prime[]=17 --jp target[x]=-10 --jp target[y]=32</code></pre><p>Gives:<p><pre><code>{
  "map": "europe",
  "prime": [
    13,
    17
  ],
  "target": {
    "x": -10,
    "y": 32
  }
}</code></pre><p>While this is neat, I suppose, it seems like such a waste that the first one isn't given as:<p><pre><code>--jp a=b,c=d,e=2,f=false</code></pre><p>And the second as:<p><pre><code>--jp map=europe --jp prime[]=13,17 --jp target[]=x:-10,y:32</code></pre><p>...or similar. The repetition kind of bothers me.<p>[1]: <a href="https://github.com/curl/curl/wiki/JSON" rel="nofollow">https://github.com/curl/curl/wiki/JSON</a>
I feel like if you only want to make a single JSON request, a simple curl invocation with the JSON data in single quotes or in a file should be enough. And if you make many different JSON requests, you're probably much better off with one of the alternative tools.<p>Related to the second point, I really wish more people put more time into creating tools for their testers. Shell/Ruby/Python/Perl scripts that are custom-made for the specific service they're testing and provides better UI. So that instead of a sequence of curl invocations, logins, and error-prone copy-pasting, people could just:<p><pre><code> test-my-service --user j.doe:hunter2 --api comments/create --param body="hello world"</code></pre>
Some of the replies say this is a layering violation: HTTP doesn't care about JSON, so curl shouldn't either. But you have to add Content-Type and Accept headers when working with JSON, which I personally often forget, so I think this does make sense.
I'm indifferent about whether they do this or not, since I can always use pipes and jq, but if they do, I hope the json-part option uses a syntax that's a subset of JSONPath and/or jq, so I don't have to learn a third syntax when people start using this.
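For reference, jq's existing path syntax applied to the wiki's example object is roughly what such a compatible subset might look like on the reading side:

```shell
# jq paths: .key descends into objects, [n] indexes arrays.
json='{"map":"europe","prime":[13,17],"target":{"x":-10,"y":32}}'

echo "$json" | jq '.target.x'   # prints -10
echo "$json" | jq '.prime[1]'   # prints 17
```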
I think this idea violates the Unix philosophy. What should happen is that a separate utility is used to pipe the request body into cURL, similar to <a href="https://stackoverflow.com/questions/12583930/use-pipe-for-curl-data" rel="nofollow">https://stackoverflow.com/questions/12583930/use-pipe-for-cu...</a>
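A sketch of that division of labor, here using jq as the generator (any JSON-emitting tool would do; the URL is a placeholder):

```shell
# The generator owns JSON correctness; curl owns HTTP.
# `-d @-` makes curl read the request body from stdin.
jq -cn '{foo: "bar", zed: "yow"}' |
    curl -s -H "Content-Type: application/json" -d @- https://example.com/api
```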
Check out curlie[0], which is really great and already does this. It's essentially a wrapper for curl with JSON support.<p>[0] <a href="https://github.com/rs/curlie" rel="nofollow">https://github.com/rs/curlie</a>
It seems the goal is to make it easier to craft JSON by having curl perform escaping, while the proposal would seem to require some sort of in-memory tree representation of the data.<p>One alternative would be to provide escaping more directly like this:<p><pre><code> curl --json '{
"map": %s,
"prime": [
%i,
%i
],
"target": {
"x": %i,
"y": %i
}
}' "$continent" "$p1" "$p2" "$x" "$y" https://example.com
</code></pre>
And then curl would do the substitution with the appropriate type-specific escaping for each variable. This has a few nice properties:<p>1. What's on the command line resembles what's actually going to be sent.<p>2. Curl doesn't actually need to parse (nor validate) the JSON, or to create a tree representation of the data within itself. %s is invalid JSON anyway, so you can do a string substitution - all you need to keep track of are matching quotes (including escape sequences).<p>I've used a printf style format string here, which could be expanded for extra convenience. For example the Python-style `%(env_var)s` sequences could be used which could expand environment variables directly. Or something could be added for convenient handling of bash arrays.
JSON is underspecified, leading to various incompatibilities between implementations.<p>Because cURL is so ubiquitous, whatever Daniel implements may become the de facto standard.
This <i>would</i> provide some additional utility, but honestly I don't see the point.
Anyone sending JSON via the curl CLI a lot is probably having to manipulate JSON on the CLI for purposes <i>other</i> than sending requests with curl as well. It makes more sense for most people to just learn one JSON manipulation tool and pipe input in and out of the things that need it.
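As a sketch of that "one tool" approach with jq on both sides of the request (the response here is simulated rather than actually fetched):

```shell
# Build the request body with jq...
request=$(jq -cn --arg q "curl" '{query: $q, limit: 5}')

# ...and pick apart the response with the same tool. A real run would
# look like:  curl -s -d "$request" ... | jq -r '.results[].name'
response='{"results":[{"name":"curl"},{"name":"libcurl"}],"total":2}'
echo "$response" | jq -r '.results[].name'
```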
What happened to “Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new "features".”?<p>If it’s too tough to integrate with other tools like jq, maybe making that integration easier would be the better outcome.
I use cURL a lot. I can see how this would maybe be somewhat useful for working very quickly, but the wiki-given use cases of k-v pairs and lists are simple enough in raw JSON.<p>Something that would be helpful is for cURL, HTTPie, Postman, Fiddler, etc. to standardize on a request/response pair format such as Chrome's HAR. There are already some tools on npm, like the HAR-to-curl converter below, so I think native HAR support would be more helpful than a JSON builder.<p><a href="https://mattcg.github.io/har-to-curl/" rel="nofollow">https://mattcg.github.io/har-to-curl/</a>
I would rather write a new tool - say jcurl - which uses curl under the hood.<p>As a user I would not expect curl to have json functionality.<p>And as a developer I would prefer to have one codebase deal with http and another one with json.
I've included --json in a custom redefinition for years, glad to see something like that coming to the official binary!<p><pre><code> curl() {
      local arg args=()
      for arg in "$@"; do
          case $arg in
              # expand our shorthand into the real header option
              --json) args+=("-H" "Content-Type: application/json") ;;
              *) args+=("$arg") ;;
          esac
      done
      # `command` bypasses this function and runs the real curl binary
      command curl "${args[@]}"
  }</code></pre>
If this means I can just use libcurl to GET a web endpoint and parse the JSON in a C program rather than have to manage multiple dependencies, I'm all for it!
This is great. When a new user uses Darklang, we want them to be able to make JSON API requests quickly and easily, and there aren't great client-side tools for that which you can expect users to have installed. Giving them a big long curl command is no fun, but `curl --json 'the-body'` would be amazing.
Doesn't really look like it's adding anything, and the `jp` part looks like something the people referenced on Stack Overflow will just find more confusing.<p>Oftentimes the JSON being sent down is complex; I can't imagine anyone wanting to basically rewrite it into something else for anything other than two-field JSON objects.
I know I've done the quoting dance before; while exploring an API in one project I resorted to using zsh heredocs to build the payload argument to avoid all quoting issues. I'm sure there is a better way already, but it sounds nice to have this built into curl, as it's so common.
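A minimal version of that heredoc trick (works in bash as well as zsh; the endpoint is a placeholder):

```shell
# An unquoted heredoc delimiter lets $variables expand while leaving
# JSON's double quotes untouched; -d @- feeds the body to curl via stdin.
user="j.doe"

curl -s -H "Content-Type: application/json" -d @- https://example.com/login <<EOF
{"username": "$user", "password": "hunter2"}
EOF
```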
I would prefer the --json flag to provide syntactic sugar for setting the Content-Type and Accept headers and leave the marshaling of data to a separate tool. Or, if it has to be baked in, refactor `jo` into `libjo` and a CLI wrapper so that the two tools behave the same way.
I use cURL for local development with JSON cookies and I think it's perfectly adequate for that purpose.<p>curl --insecure --cookie test_cookie='{"test":"bob"}' <a href="https://localhost:8081/" rel="nofollow">https://localhost:8081/</a>
Shouldn't this be a GitHub issue or GitHub discussion?<p><a href="https://github.com/curl/curl/wiki/JSON" rel="nofollow">https://github.com/curl/curl/wiki/JSON</a><p>A wiki is a weird format to use for a proposal.
So great! This has been one of the most requested curl features for years. Without this feature, to send JSON, you had to craft a valid JSON string yourself or shell out to another utility that creates a valid JSON string.
I just want to pass it a filename that contains the JSON. I've never been a fan of hauling around post bodies that dangle from a curl command... and I hate Postman.
dupe of <a href="https://news.ycombinator.com/item?id=30011382" rel="nofollow">https://news.ycombinator.com/item?id=30011382</a>
On one hand, this is awesome.<p>But aren't there also several command line utilities which already support JSON?<p>Why cram new stuff into such an industry-standard tool?