The counter-intuitive part is that a 100MB file is considered large on a machine with 8-16GB of RAM.<p>It's definitely not a resource problem, only a problem with how the applications are architected.<p>(Indeed, the right tool is a DBMS, not a spreadsheet.)
You might be interested in:<p><a href="https://github.com/BurntSushi/xsv" rel="nofollow">https://github.com/BurntSushi/xsv</a>
No mention of vi or SQLite? While I'm no vi expert, it's a great tool for working with big files when you want to browse around without grep. And SQLite is similarly ubiquitous and capable of crunching large files.
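For the SQLite route, the stock sqlite3 shell can load and query a CSV with no schema work up front; a minimal sketch, where "data.csv" and the "people" table name are placeholders for your own file:

```shell
# Load a CSV into an in-memory SQLite table and query it.
# With .mode csv, .import uses the first row as column names
# when the target table does not already exist.
sqlite3 :memory: <<'SQL'
.mode csv
.import data.csv people
SELECT COUNT(*) FROM people;
SQL
```

Point sqlite3 at a file instead of `:memory:` if you want the imported table to persist between sessions.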
Two of my favorite tools for this kind of thing:<p><a href="http://visidata.org/" rel="nofollow">http://visidata.org/</a><p>And<p><a href="http://recs.pl/" rel="nofollow">http://recs.pl/</a>
> Excel for Mac performed well but it is a paid solution so I did not consider it viable for many developers<p>Because developers handling gigabyte-size data, and wanting to reliably manipulate it in a GUI, cannot possibly be expected to pay/afford the $7/month to Microsoft.<p>That said, the recommended solution is probably the best option for developers, not just because it's free, but for the ability to run complex SQL statements and visualize the results.<p>If I were to edit this article, that'd be my takeaway: use tool X for snappy visualization of SQL queries, even on multi-gigabyte CSVs.
I wonder how well Table Tool [1] would perform with your large dataset.
It's an open-source CSV editor for Mac from the developer of Postico, my favorite PostgreSQL client for Mac [2].<p>[1] <a href="https://github.com/jakob/TableTool" rel="nofollow">https://github.com/jakob/TableTool</a><p>[2] <a href="https://eggerapps.at/postico/" rel="nofollow">https://eggerapps.at/postico/</a>
I use XSV: A fast CSV command line toolkit written in Rust.<p><a href="https://formulae.brew.sh/formula/xsv" rel="nofollow">https://formulae.brew.sh/formula/xsv</a><p><a href="https://github.com/BurntSushi/xsv" rel="nofollow">https://github.com/BurntSushi/xsv</a>
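To give a flavor of what xsv covers, here are a few of its common subcommands; the file and column names are illustrative, not from the article:

```shell
# Assumes xsv is installed (e.g. `brew install xsv`).
xsv headers data.csv                  # list column names with their indices
xsv select name,city data.csv        # project just two columns
xsv search -s city 'Berlin' data.csv # filter rows by matching one column
xsv stats data.csv | xsv table       # per-column summary stats, aligned
```

Because it streams and can build an index (`xsv index`), these stay fast even on files far larger than RAM.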
I am not clear on what "manipulate" means here -- what is the author trying to do with the comma-separated values? FWIW, I can accomplish most CSV manipulation with a handful of Unix utilities (sed, awk, cut) and call it a day.
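A sketch of that approach, with the caveat that these tools split naively on commas, so quoted fields containing commas need a CSV-aware tool instead (file and column choices here are hypothetical):

```shell
# Keep only columns 1 and 3:
cut -d, -f1,3 data.csv

# Rows where the numeric second column exceeds 100 (header preserved):
awk -F, 'NR == 1 || $2 > 100' data.csv

# Convert a semicolon-delimited file to comma-delimited:
sed 's/;/,/g' data.semi.csv
```

All three stream line by line, so file size is effectively unlimited.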
<a href="https://kothar.net/csview" rel="nofollow">https://kothar.net/csview</a><p>A fast viewer a friend of mine created to view large CSVs in a GUI - might be useful to someone.
There seem to be literally dozens of solutions for read-only operations but very few that enable comfortable in-place editing of files in a Unix / command-line environment.<p>Seems like a real gap in the software ecosystem at the moment:<p><pre><code> - fast
- no limit on file size
- spreadsheet style layout
- command line
- easily edit and update individual cell => save
</code></pre>
I've tried Vim's CSV plugins many times and have never been satisfied.
@alecdibble Could you try LibreOffice's Calc [1]? It's my daily driver on Linux, and it mostly works well as an Excel replacement, but I'm interested to see how well it does on a Mac.<p>[1] <a href="https://www.libreoffice.org/download/download/" rel="nofollow">https://www.libreoffice.org/download/download/</a>
Tad is absolutely great for this; I tested it for the exact use case mentioned. I had to file a ticket and wait for the author to add a CSV export function. Tad is built on SQLite and can filter/sort/pivot/aggregate and export the result, which is all my business team partners ever need. It can handle GB files; I didn't test with TB sizes.<p><a href="https://www.tadviewer.com/" rel="nofollow">https://www.tadviewer.com/</a>; it's a desktop app.<p>edit:
One note after seeing other comments based on CLI tools (xsv, sed, awk, etc.): the OP's use case is something that marketing/PM/business stakeholders can handle themselves. My favorite tools are CLI-based, but that doesn't fly with business teams, so forget that option.
I'm always astounded that there doesn't seem to be a decent general-purpose CSV editor/viewer application. Excel is atrocious: it's always dog slow, and it mangles every CSV I've ever opened by trying to interpret the data and format it "smartly".<p>Having to build a table in a database and import the CSV into it feels a bit like swatting a house fly with a sledgehammer, but it's the most effective approach I've seen.