[plug] importing a large text database for fast search
Lyndon Maydwell
maydwell at gmail.com
Fri Sep 2 14:59:58 WST 2011
SQLite or Lucene would be my recommendation.
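As a rough sketch of the SQLite route: its FTS5 extension builds a full-text index you can query with MATCH, which stays fast at 250k records. The schema below (7 header fields plus a body) and the column names are guesses at the layout described in the question, and it assumes your SQLite build has FTS5 compiled in (substitute fts4 otherwise):

```python
import sqlite3

# In-memory database for the demo; a real import would use a file path.
conn = sqlite3.connect(":memory:")

# Hypothetical schema: 7 header fields plus a body, as described below.
# FTS5 builds a full-text index over all listed columns.
conn.execute("""
    CREATE VIRTUAL TABLE cables USING fts5(
        reference, created, released, classification,
        origin, destination, tags, body
    )
""")

# A couple of toy rows standing in for the real CSV import.
rows = [
    ("09STATE1", "2009-01-01", "2011-09-02", "UNCLASSIFIED",
     "STATE", "PERTH", "PREL", "Routine administrative traffic."),
    ("10CANBERRA2", "2010-02-03", "2011-09-02", "CONFIDENTIAL",
     "CANBERRA", "STATE", "PGOV",
     "Discussion of diplomatic terms and acronyms."),
]
conn.executemany("INSERT INTO cables VALUES (?,?,?,?,?,?,?,?)", rows)

# MATCH uses the full-text index instead of scanning every row.
hits = [r[0] for r in conn.execute(
    "SELECT reference FROM cables WHERE cables MATCH 'diplomatic'")]
print(hits)
```

The whole 1.7GB would live in a single database file, so there is no need to split it into thousands of files just to search it.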
On Sep 2, 2011 2:30 PM, "Michael Holland" <michael.holland at gmail.com> wrote:
> Suppose you had a large database - 1.7GB, with about 250k records in a CSV
> file.
> Each record has 8 fields - 7 headers plus a body.
> You might use a Perl script to split it into files, sort them into folders
> by embassy name, convert the ALLCAPS to a more legible case, and remove the
> quote escaping from the body.
> Maybe add links to a glossary for the more obscure military/diplomatic
> terms and acronyms.
> But grepping all this data is still slow. What is a good way to store
> it in Linux, with a full text index?
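
For the cleanup pass itself, Python's csv module does most of the work: it undoes the doubled-quote escaping during parsing, and str.title() handles the ALLCAPS. This is only a sketch with made-up field names (the real headers aren't given above):

```python
import csv
import io

# Toy stand-in for the 1.7GB file; the column names are guesses at the
# 7-headers-plus-body layout described in the question.
raw = io.StringIO(
    'reference,embassy,subject,body\n'
    '09STATE1,EMBASSY CANBERRA,WEEKLY REPORT,"He said ""hello"" twice."\n'
)

records = []
for row in csv.DictReader(raw):
    # The csv parser already removes the "" quote escaping in the body.
    row["embassy"] = row["embassy"].title()  # ALLCAPS -> legible case
    row["subject"] = row["subject"].title()
    records.append(row)

print(records[0]["embassy"])
```

From there, each cleaned row can be written into folders by embassy name, or fed straight into whatever index you settle on.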
> _______________________________________________
> PLUG discussion list: plug at plug.org.au
> http://lists.plug.org.au/mailman/listinfo/plug
> Committee e-mail: committee at plug.org.au
> PLUG Membership: http://www.plug.org.au/membership