Congrats on launching, but this feels like an uphill climb to get paying customers. You need to find the intersection of potential customers who know SQL but don't want to use one of the open-source options (perhaps data analysts working in restricted environments where the only option is a web browser).
Have you seen duckdb? <a href="https://duckdb.org/" rel="nofollow">https://duckdb.org/</a><p>It's basically what you're building, but more low-level. Really cool, to be honest -- serves the same market too. Do you have any significant differentiator, other than charts?
You might have a wider audience if you put it on the app store. I only install very well-known software outside of the app store. For anything more niche, I need it to be on the app store to offer some assurance that it is not malicious and that sandboxing is enforced.
Congratulations. I do see value in quickly viewing and querying files in a nice desktop interface. I am curious why there is no Parquet support, though. If DuckDB is running in the background, wouldn't it be easy to support?
This looks awesome. I'm the target audience. I do quite a bit of development around SQL Server and there's an endless stream of CSV and XLSX files coming and going that need spot checks and quick looks. I use ModernCSV quite a bit and would have purchased that if it built these SQL features in. I've used DuckDB directly a few times to join and query CSV and XLSX files, I'll pay my own $$ for something that quickly streamlines this.<p>I can import into SQL Server but there's too much ceremony needed (column types, etc) for quick looks at data I'm going to answer a question about and then discard. After a quick look at TextQuery I'm running into the same issues (although TextQuery is just a couple of clicks instead of 5+). I was also seeing an error yesterday from associating XLSX files with TextQuery but that seems to have gone away today.
Congrats on the release.<p>It reminds me of Log Parser Studio [1] on Windows. Using SQL to query text and log files is a great idea.<p>[1] <a href="https://web.archive.org/web/20170710212920/http://gallery.technet.microsoft.com/office/Log-Parser-Studio-cd458765" rel="nofollow">https://web.archive.org/web/20170710212920/http://gallery.te...</a>
I think I'm your target user, but I currently use DuckDB for this type of work, so unlikely to buy your product. That said, lots of devs pay for Rider and/or DataGrip - sometimes with their own money - so maybe there is a market here?
A few hours ago this would have been useful; I will probably give it a try in a few days.
On another note, I recommend clarifying in the hero section that it's a one-time purchase, because that's a really big plus.
Another tool in the same vein is 'q': <a href="https://harelba.github.io/q/" rel="nofollow">https://harelba.github.io/q/</a>
Quick question - is it possible to import multiple files at once? I frequently get ZIP files full of csv/xlsx files that I need to search through. I didn't see a way to import more than 1 file at a time. Thanks!
Pretty cool seeing someone care about a good UI for this kind of tool; it always annoys me when the workflow is clunky. Do you think people actually care more about small features, or is it just all about price?
Is it just me, or are the images on the website not loading? I'm using Firefox, and from what I searched it seems like it could be a Firefox-only issue. The CDN links are HTTPS, but when navigating there manually it says the connection is not private.
I could use this.<p>Questions:
What file sizes have you tested?<p>What about a directory of similar CSV files? I have a use case with similarly structured CSVs: 2 TB of data broken into 700 files instead of one large file. Would that work?
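For reference, outside of TextQuery the many-files-one-table pattern can be handled by appending each file into a single SQLite table. A rough stdlib sketch (the shard files and column names here are invented stand-ins for the real 700-file dataset; at 2 TB you would want the same loop, since `executemany` streams one file's rows at a time):

```python
import csv
import glob
import os
import sqlite3
import tempfile

# Build a throwaway directory of "similar" CSV shards to stand in
# for the real multi-file dataset.
d = tempfile.mkdtemp()
for i in range(3):
    with open(os.path.join(d, f"part-{i}.csv"), "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["id", "value"])  # same header in every shard
        w.writerow([i, i * 10])

con = sqlite3.connect(":memory:")  # use a filename to persist
con.execute("CREATE TABLE data (id INTEGER, value INTEGER)")

# Append every shard into the one table.
for path in sorted(glob.glob(os.path.join(d, "*.csv"))):
    with open(path, newline="") as f:
        r = csv.reader(f)
        next(r)  # skip the repeated header row
        con.executemany("INSERT INTO data VALUES (?, ?)", r)

result = con.execute("SELECT COUNT(*), SUM(value) FROM data").fetchone()
print(result)  # (3, 30)
```

Whether TextQuery exposes an equivalent (e.g. DuckDB-style glob imports) is exactly the commenter's question.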
For CSV files you can also import them directly into a SQLite file using <a href="https://sqlitebrowser.org/" rel="nofollow">https://sqlitebrowser.org/</a><p>For XLSX the workflow would be the same: "save as" CSV, then push it into SQLite.
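That import step can also be scripted with Python's stdlib alone. A minimal sketch (the table name and sample data are invented; a real run would read from an actual file instead of the `StringIO` stand-in):

```python
import csv
import io
import sqlite3

# Stand-in for a real CSV file: a tiny document with a header row.
sample = io.StringIO("id,amount\n1,9.99\n2,4.50\n")

reader = csv.reader(sample)
header = next(reader)
rows = list(reader)

con = sqlite3.connect(":memory:")  # use a filename instead to persist
cols = ", ".join(f'"{c}"' for c in header)
con.execute(f"CREATE TABLE orders ({cols})")
con.executemany(
    f"INSERT INTO orders VALUES ({','.join('?' * len(header))})", rows
)

# Ordinary SQL now works against the imported data.
total = con.execute("SELECT SUM(CAST(amount AS REAL)) FROM orders").fetchone()[0]
print(round(total, 2))  # 14.49
```

Note that, as with sqlitebrowser's import, every column arrives as text unless you cast or declare types, which is part of the "ceremony" other commenters mention.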
Try <a href="https://sql-workbench.com" rel="nofollow">https://sql-workbench.com</a> if you'd like to do this directly in the browser, for free. It supports Parquet and Arrow as well.
Readers may also enjoy Steampipe [1], an open source tool to live query 140+ services with SQL (e.g. AWS, GitHub, CSV, Kubernetes, etc). It uses Postgres Foreign Data Wrappers under the hood and supports joins etc with other tables. (Disclaimer - I'm a lead on the project.)<p>1 - <a href="https://github.com/turbot/steampipe">https://github.com/turbot/steampipe</a>
Uh huh. The NoSQL zombie yet shuffles on.<p>Anyone who knows SQL sees dozens of problems immediately. What enforces data integrity? How do we know the records are in 1NF? How do we perform a join, or test existential quantification, without table names? How do we know all the supposed "dates" are valid dates, and not my uncle's ex-wife's maiden name? How does one reference XML attributes from SQL?<p>The answers produced by SQL are only as good as the data they're drawn from. The quality and internal consistency of those data are enforced by the DBMS. No amount of pretty graphs and syntax highlighting changes that. The effectiveness of SQL depends on the knowledge of the practitioner. No tool changes that, either.
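The integrity point above can be made concrete: a DBMS can refuse bad values at insert time, which no query-side tooling replicates. A toy sketch using SQLite's CHECK constraints (table and column names invented; SQLite's `date()` returns NULL for unparseable input, so the constraint rejects anything that isn't a real date):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# The DBMS, not the query tool, guarantees every stored value
# really is a date.
con.execute("CREATE TABLE events (d TEXT CHECK (date(d) IS NOT NULL))")

con.execute("INSERT INTO events VALUES ('2024-05-01')")  # accepted
rejected = False
try:
    con.execute("INSERT INTO events VALUES ('Aunt Edna')")  # not a date
except sqlite3.IntegrityError:
    rejected = True  # the constraint fired; the bad row never landed

count = con.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # 1
```

A tool that queries raw CSV or XLSX in place has no equivalent gate: whatever is in the file is what the query sees.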