<p>SQLite or Lucene would be my recommendation.</p>
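As a minimal sketch of the SQLite route: an FTS5 virtual table gives you a full-text index you can query with MATCH instead of grepping the raw files. The column names below (date, origin, subject, body) are assumptions — adjust them to the actual CSV headers — and the two toy rows stand in for loading the real 1.7GB file with csv.reader.

```python
import sqlite3

# Sketch: index cable-style records in an SQLite FTS5 virtual table.
# Column names are assumptions; match them to the real CSV headers.
conn = sqlite3.connect(":memory:")  # use a file path for a persistent index
conn.execute(
    "CREATE VIRTUAL TABLE cables USING fts5(date, origin, subject, body)"
)

# In practice, stream rows from the CSV with csv.reader; toy data here.
rows = [
    ("2009-01-05", "Embassy Canberra", "Trade talks",
     "Discussion of tariff schedules."),
    ("2009-02-11", "Embassy Jakarta", "Security brief",
     "Regional security assessment."),
]
conn.executemany("INSERT INTO cables VALUES (?, ?, ?, ?)", rows)

# Full-text query: the index makes this far faster than grep at scale.
hits = conn.execute(
    "SELECT origin, subject FROM cables WHERE cables MATCH ?", ("tariff",)
).fetchall()
print(hits)  # -> [('Embassy Canberra', 'Trade talks')]
```

The Lucene route is similar in spirit (build an index once, query it repeatedly) but runs on the JVM; for a single 1.7GB dataset on one machine, SQLite's FTS is usually the lighter option.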
<div class="gmail_quote">On Sep 2, 2011 2:30 PM, "Michael Holland" <<a href="mailto:michael.holland@gmail.com">michael.holland@gmail.com</a>> wrote:<br type="attribution">> Suppose you had a large database - 1.7GB, with about 250k records in a CSV file.<br>
> Each record has 8 fields - 7 headers plus a body.<br>> You might use a Perl script to split to files, sort into folders by<br>> embassy name, convert the ALLCAPS to more legible case, and remove the<br>> quote escaping from the body.<br>
> Maybe add links to a glossary for the more obscure military/diplomatic<br>> terms and acronyms.<br>> But grepping all this data is still slow. What is a good way to store<br>> it in Linux, with a full-text index?<br>
> _______________________________________________<br>> PLUG discussion list: <a href="mailto:plug@plug.org.au">plug@plug.org.au</a><br>> <a href="http://lists.plug.org.au/mailman/listinfo/plug">http://lists.plug.org.au/mailman/listinfo/plug</a><br>
> Committee e-mail: <a href="mailto:committee@plug.org.au">committee@plug.org.au</a><br>> PLUG Membership: <a href="http://www.plug.org.au/membership">http://www.plug.org.au/membership</a><br></div>