
Reduce on-disk model NLP size by compressing it #1165

Open
LiamKarlMitchell opened this issue Jul 30, 2022 · 1 comment

Comments

@LiamKarlMitchell

Is your feature request related to a problem? Please describe.
My model.nlp file is growing quite large over time.

Describe the solution you'd like
It would be great to be able to compress it.

Describe alternatives you've considered
I can, and probably will, do this manually with the import and export features, but it could be a good built-in feature. I'm opening this to ask whether anyone has considered a way to do this in the NlpManager, or whether there is already a built-in option I'm not aware of; I looked but couldn't find one in the docs.

Perhaps a `compressed` option, defaulting to false.
https://github.com/axa-group/nlp.js/blob/master/docs/v3/nlp-manager.md#importexport-using-json
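
For reference, a minimal sketch of the manual workaround, assuming the `export()`/`import()` API described in the linked docs plus Node's built-in `zlib`; the file name `model.nlp.gz` and the helper names are just illustrative, not existing nlp.js API:

```js
const { NlpManager } = require('node-nlp');
const zlib = require('zlib');
const fs = require('fs');

const manager = new NlpManager({ languages: ['en'] });

// Save: export to a minified JSON string, then gzip it before writing to disk.
function saveCompressed(filename) {
  const json = manager.export(true); // true => minified JSON string
  fs.writeFileSync(filename, zlib.gzipSync(json));
}

// Load: read the gzipped file, inflate it, then import the JSON string.
function loadCompressed(filename) {
  const json = zlib.gunzipSync(fs.readFileSync(filename)).toString('utf8');
  manager.import(json);
}

saveCompressed('model.nlp.gz');
loadCompressed('model.nlp.gz');
```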

@Apollon77
Contributor

I would propose to simply detect it from the provided filename: when it ends with ".gz", store the model with gzip compression, otherwise not. This could be implemented rather easily.
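
For illustration only, a hedged sketch of how that filename-based detection could look if wired into the save/load path; `saveModel`/`loadModel` are hypothetical helpers, not current nlp.js API, and non-`.gz` names keep today's plain-JSON behaviour:

```js
const zlib = require('zlib');
const fs = require('fs');

function saveModel(manager, filename) {
  const json = manager.export(true);
  if (filename.endsWith('.gz')) {
    fs.writeFileSync(filename, zlib.gzipSync(json)); // "*.gz" => gzip-compressed
  } else {
    fs.writeFileSync(filename, json, 'utf8'); // anything else => plain JSON as before
  }
}

function loadModel(manager, filename) {
  const raw = fs.readFileSync(filename);
  const json = filename.endsWith('.gz')
    ? zlib.gunzipSync(raw).toString('utf8')
    : raw.toString('utf8');
  manager.import(json);
}
```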
