43 Folders


“What’s 43 Folders?”
43Folders.com is Merlin Mann’s website about finding the time and attention to do your best creative work.

Text file question

I've recently learned basic (read: remedial) command line and VIM controls and have really gotten into text files. Following up on a year-old post on 43 Folders (http://www.43folders.com/2005/08/17/life-inside-one-big-text-file), I'm intrigued by the idea of doing everything in big text files: dumping information into the file line by line, tagging it, and then relying on grep, sort, tr, sed, and so on to manipulate and make sense of it. I can honestly say that I have never found a more powerful or flexible tool. (And to think that it's been there all along, even when I shelled out big bucks for more complicated solutions. And don't even get me started on TeX. Yippee!)
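The workflow described above can be sketched in a few lines of shell. The file name and the date-plus-@tag convention here are hypothetical, just one common way people structure a "big text file":

```shell
# Create a sample "one big text file": one entry per line,
# each prefixed with a date and tagged with a hypothetical @context tag.
cat > todo.txt <<'EOF'
2006-09-01 @work Draft the quarterly report
2006-09-02 @home Fix the leaky faucet
2006-09-03 @work Email Dan about the podcast
EOF

# grep filters entries by tag; sort orders them by the leading date.
grep '@work' todo.txt | sort

# sed rewrites tags across the whole file, e.g. renaming a context.
sed 's/@home/@errands/' todo.txt
```

Because every entry is a self-describing line, any line-oriented Unix tool (grep, sort, sed, awk, wc) works on the file without further structure.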

But my main question is: how big can a plain text file become before it grows too slow or unwieldy? Is there a practical limit to the size of text files? Obviously this depends on the power of the system and the software used; I'm currently using an iBook G4 and VIM 7.0 from the command line. I'd prefer not to split up the file, if possible, but I suppose I may have to archive older lines at some point if it gets too slow.
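If archiving does become necessary, date-prefixed lines make it mechanical. A minimal sketch, assuming each entry starts with an ISO date and using hypothetical file names:

```shell
# Sample file: each line begins with an ISO date (YYYY-MM-DD).
cat > todo.txt <<'EOF'
2005-08-17 Old entry to archive
2006-09-01 Current entry
EOF

# Append all 2005 entries to an archive file...
grep '^2005-' todo.txt >> archive.txt

# ...then keep only the non-2005 entries in the working file.
grep -v '^2005-' todo.txt > todo.tmp && mv todo.tmp todo.txt
```

Since grep streams line by line, this stays fast even on files far larger than anything an editor handles comfortably.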

Any advice would be greatly appreciated!


