[plug] importing a large text database for fast search
maydwell at gmail.com
Fri Sep 2 14:59:58 WST 2011
SQLite or Lucene would be my recommendation.
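For the SQLite route, a minimal sketch using the FTS4 full-text extension (available in most stock SQLite builds): the table name and the column names ("embassy", "subject", "body") are made up for illustration, and the two inline rows stand in for the real import.

```python
# Sketch: a full-text index with SQLite's FTS4 extension.
# Table/column names here are illustrative, not from the real CSV.
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a persistent database
conn.execute("CREATE VIRTUAL TABLE cables USING fts4(embassy, subject, body)")

rows = [
    ("EMBASSY CANBERRA", "Trade talks", "Discussion of tariff schedules."),
    ("EMBASSY LONDON", "Visit summary", "Summary of the ministerial visit."),
]
conn.executemany("INSERT INTO cables VALUES (?, ?, ?)", rows)
conn.commit()

# MATCH queries the inverted index instead of scanning every row,
# which is what makes it so much faster than grep on 1.7GB of text.
hits = conn.execute(
    "SELECT embassy, subject FROM cables WHERE cables MATCH ?", ("tariff",)
).fetchall()
print(hits)
```

Bulk-loading the real 250k records is just a loop over the CSV feeding `executemany`; the index is built as you insert.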
On Sep 2, 2011 2:30 PM, "Michael Holland" <michael.holland at gmail.com> wrote:
> Suppose you had a large database - 1.7GB, with about 250k records in a CSV
> Each record has 8 fields - 7 headers plus a body.
> You might use a Perl script to split it into files, sort them into folders
> by embassy name, convert the ALLCAPS to more legible case, and remove the
> quote escaping from the body.
> Maybe add links to a glossary for the more obscure military/diplomatic
> terms and acronyms.
> But grepping all this data is still slow. What is a good way to store
> it on Linux, with a full-text index?
> PLUG discussion list: plug at plug.org.au
> Committee e-mail: committee at plug.org.au
> PLUG Membership: http://www.plug.org.au/membership
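As for the pre-processing pass described in the question, here is a rough sketch; the inline two-record sample, the column layout (embassy first, subject second, body last), and the file naming are all assumptions, since the real export's headers aren't shown. Note that the `csv` module already undoes CSV quote doubling for you.

```python
# Sketch: split a CSV of records into per-embassy folders, one file per
# record, with ALLCAPS header fields converted to title case.
# The sample data and column layout are assumptions for illustration.
import csv
import io
import os
import tempfile

SAMPLE = '''EMBASSY CANBERRA,TRADE TALKS,"He said ""hello"" and left."
EMBASSY LONDON,VISIT SUMMARY,Routine visit notes.
'''

def clean(row):
    """Title-case ALLCAPS header fields; csv.reader has already
    unescaped the doubled quotes in the body."""
    *headers, body = row
    return [h.title() if h.isupper() else h for h in headers], body

outdir = tempfile.mkdtemp()
for row in csv.reader(io.StringIO(SAMPLE)):
    headers, body = clean(row)
    folder = os.path.join(outdir, headers[0])   # sort into folders by embassy
    os.makedirs(folder, exist_ok=True)
    with open(os.path.join(folder, headers[1] + ".txt"), "w") as out:
        out.write("\n".join(headers) + "\n\n" + body)

print(sorted(os.listdir(outdir)))  # one folder per embassy
```

That only makes the data legible, though; for fast search you still want to load it into a full-text index rather than grep the resulting tree.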