Mirror of https://gitlab.ub.uni-bielefeld.de/sfb1288inf/nlp.git, synced 2024-12-26 20:34:18 +00:00
Update README and pipeline help

Commit dc62755d12 (parent aa1bfa259d)
README.md (46 changed lines)
@@ -5,18 +5,13 @@ This software implements a heavily parallelized pipeline for Natural Language Pr
 ## Software used in this pipeline implementation
 - Official Debian Docker image (buster-slim) and programs from its free repositories: https://hub.docker.com/_/debian
 - pyFlow (1.1.20): https://github.com/Illumina/pyflow/releases/tag/v1.1.20
-- spaCy (3.0.3): https://github.com/tesseract-ocr/tesseract/releases/tag/4.1.1
+- spaCy (3.0.5): https://github.com/tesseract-ocr/tesseract/releases/tag/4.1.1
 - spaCy medium sized models (3.0.0):
-  - https://github.com/explosion/spacy-models/releases/tag/da_core_news_md-3.0.0
   - https://github.com/explosion/spacy-models/releases/tag/de_core_news_md-3.0.0
-  - https://github.com/explosion/spacy-models/releases/tag/el_core_news_md-3.0.0
   - https://github.com/explosion/spacy-models/releases/tag/en_core_web_md-3.0.0
-  - https://github.com/explosion/spacy-models/releases/tag/es_core_news_md-3.0.0
-  - https://github.com/explosion/spacy-models/releases/tag/fr_core_news_md-3.0.0
   - https://github.com/explosion/spacy-models/releases/tag/it_core_news_md-3.0.0
   - https://github.com/explosion/spacy-models/releases/tag/nl_core_news_md-3.0.0
-  - https://github.com/explosion/spacy-models/releases/tag/pt_core_news_md-3.0.0
+  - https://github.com/explosion/spacy-models/releases/tag/pl_core_news_md-3.0.0
-  - https://github.com/explosion/spacy-models/releases/tag/ru_core_news_md-3.0.0
   - https://github.com/explosion/spacy-models/releases/tag/zh_core_web_md-3.0.0
 
 
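The model links in the hunk above point at pinned release tags. As an illustration only (not taken from the pipeline's Dockerfile), one of the listed 3.0.0 models could be installed and checked like this; the tarball URL follows the usual spacy-models release-asset naming and is an assumption:

``` bash
# Hedged sketch: install one pinned model release and verify spaCy can load it.
pip install "https://github.com/explosion/spacy-models/releases/download/de_core_news_md-3.0.0/de_core_news_md-3.0.0.tar.gz"
python -c "import spacy; nlp = spacy.load('de_core_news_md'); print(nlp.meta['version'])"
```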
@@ -29,7 +24,7 @@ mkdir -p /<my_data_location>/input /<my_data_location>/output
 
 2. Place your text files inside `/<my_data_location>/input`. Files should all contain text of the same language.
 
-3. Start the pipeline process. Check the [Pipeline arguments](#pipeline-arguments) section for more details.
+3. Start the pipeline process. Check the pipeline help (`nlp --help`) for more details.
 ```
 # Option one: Use the wrapper script
 ## Install the wrapper script (only on first run). Get it from https://gitlab.ub.uni-bielefeld.de/sfb1288inf/nlp/-/raw/1.0.0/wrapper/nlp, make it executeable and add it to your ${PATH}
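The install comment in the hunk above is prose only; a minimal shell sketch of what it describes, assuming `~/.local/bin` as the target directory (any directory already on `${PATH}` works):

``` bash
# Hedged sketch of the wrapper installation described above; ~/.local/bin is an assumption.
mkdir -p ~/.local/bin
wget https://gitlab.ub.uni-bielefeld.de/sfb1288inf/nlp/-/raw/1.0.0/wrapper/nlp -O ~/.local/bin/nlp
chmod +x ~/.local/bin/nlp
nlp --help   # the new README points here for the argument reference
```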
@@ -51,38 +46,3 @@ docker run \
 ```
 
 4. Check your results in the `/<my_data_location>/output` directory.
-```
-
-### Pipeline arguments
-
-`--check-encoding`
-* If set, the pipeline tries to automatically determine the right encoding for
-your texts. Only use it if you are not sure that your input is provided in UTF-8.
-* default = False
-* required = False
-
-`-l languagecode`
-* Tells spaCy which language will be used.
-* options = da (Danish), de (German), el (Greek), en (English), es (Spanish), fr (French), it (Italian), nl (Dutch), pt (Portuguese), ru (Russian), zh (Chinese)
-* required = True
-
-`--nCores corenumber`
-* Sets the number of CPU cores being used during the NLP process.
-* default = min(4, multiprocessing.cpu_count())
-* required = False
-
-``` bash
-# Example with all arguments used
-docker run \
---rm \
--it \
--u $(id -u $USER):$(id -g $USER) \
--v "$HOME"/ocr/input:/input \
--v "$HOME"/ocr/output:/output \
-gitlab.ub.uni-bielefeld.de:4567/sfb1288inf/nlp:1.0.0 \
--i /input \
--l en \
--o /output \
---check-encoding \
---nCores 8 \
-```
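With the argument reference removed from the README, the built-in help becomes the single source for flag descriptions. One hedged way to print it straight from the published image (image name taken from the removed example above, and assuming the image's entry point is the `nlp` script itself):

``` bash
# Hedged sketch: show the pipeline's built-in help without installing the wrapper script.
docker run --rm -it gitlab.ub.uni-bielefeld.de:4567/sfb1288inf/nlp:1.0.0 --help
```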
nlp (2 changed lines)
@@ -156,7 +156,7 @@ def parse_args():
                         type=int)
     parser.add_argument('--n-cores',
                         default=min(4, multiprocessing.cpu_count()),
-                        help='Number of CPU threads to be used',
+                        help='Number of CPU threads to be used (Default: min(4, number of CPUs))',
                         type=int)
     parser.add_argument('--zip',
                         help='Create one zip file per filetype')
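For completeness, a hedged invocation sketch that exercises the flag documented by the new help string; the surrounding flags and mount paths are copied or adapted from the README example this commit removed and may not match the current entry point exactly:

``` bash
# Hedged sketch: cap the pipeline at 2 CPU threads via the --n-cores flag defined above.
# The "$HOME"/nlp paths are assumptions; use your own input/output directories.
docker run --rm -it \
  -u $(id -u $USER):$(id -g $USER) \
  -v "$HOME"/nlp/input:/input \
  -v "$HOME"/nlp/output:/output \
  gitlab.ub.uni-bielefeld.de:4567/sfb1288inf/nlp:1.0.0 \
  -i /input -l en -o /output --n-cores 2
```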