I built sentences from /usr/share/dict with a small one-line script I made (a sketch of something similar is below).

I have `final_data` as a source file with a bunch of strange sentences in it.

$ cat final_data | wc -l
1501920

So, a little more than 1.5 million "records" in the file.
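My actual one-liner isn't reproduced here, but a minimal sketch of something similar, assuming the standard /usr/share/dict/words word list and an arbitrary eight words per sentence, might look like this:

$ awk 'BEGIN { srand() }
       { words[NR] = $0 }                 # load every dictionary word into memory
       END {
         for (i = 0; i < 1501920; i++) {  # one "sentence" per record
           line = words[int(rand() * NR) + 1]
           for (j = 1; j < 8; j++)        # seven more random words
             line = line " " words[int(rand() * NR) + 1]
           print line "."                 # end each sentence with a period
         }
       }' /usr/share/dict/words > final_data

Note the trailing period on every line; that detail matters for the grep test further down.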
And let's time the operation:

$ time cat final_data | wc -l
1501920

real 0m0.063s
user 0m0.052s
sys 0m0.043s

That's some speedy stuff. Keep in mind, I am running this on a Mac OS X Yosemite MacBook Pro (15-inch, Mid 2010) with 8 GB of 1067 MHz DDR3 (self-upgraded from 4 GB) and a 2.53 GHz Intel Core i5, so by no means a speed demon. There are probably faster phones on the market by now.

I imagine that if I dropped two SSDs in here, the second in the optical drive bay, and kept the source and destination data on separate drives so they never compete for read and write access, this would speed up a great deal. Or if I had an SSD array.
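One caveat worth adding here (my note, not a measured result): after the first read, a file this size sits in the operating system's page cache, so these warm timings barely touch the disk; faster drives would mostly help the first, cold read. OS X ships a `purge` command that flushes the filesystem cache, so a cold-cache run can be approximated like this:

$ sudo purge             # flush the filesystem disk cache (ships with OS X)
$ time wc -l final_data  # the first read afterwards actually hits the disk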
Next up, let's drop the cat and work directly with the `wc` application. It should surely get faster:

$ time wc -l final_data
1501920 final_data

real 0m0.059s
user 0m0.044s
sys 0m0.015s

Faster in every regard. Just for fun, let's pump a grep call in there and see whether it speeds up, slows down, who knows:

* Remember to clear your terminal scrollback when looking at large data like this, or set the scrollback threshold to a lower value. Until I did, Mac OS X did not like the multi-megabyte walls of text.

Wow, that was significantly faster than I thought:

$ time grep '\.' final_data | wc -l
1501920

real 0m0.705s
user 0m0.712s
sys 0m0.035s
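As an aside (not one of my original runs): grep can count matching lines on its own with `-c`, which drops the wc process and the pipe entirely; since every line here contains a period, the count comes out the same:

$ grep -c '\.' final_data
1501920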
And let's run it the way these things sometimes happen in real life, as a one-off with a redundant cat in front:

$ time cat final_data | grep '\.' final_data | wc -l
1501920

real 0m0.766s
user 0m0.771s
sys 0m0.041s
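Worth spelling out why that extra cat costs so little: when grep is given a filename argument, it ignores its standard input entirely, so the cat's output is never read at all. A quick way to convince yourself (my example, not from the runs above):

$ printf 'stdin.line\n' | grep '\.' /dev/null | wc -l
0

grep reads the (empty) /dev/null it was handed, not the pipe, so the dotted line from printf never matches.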