flairNLP

Description

flairnlp website

A very simple framework for state-of-the-art Natural Language Processing (NLP)

Environment Modules

Run module spider flairnlp to find out what environment modules are available for this application.
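On a typical Lmod-based cluster, a session might look like the following sketch (the exact module versions listed will depend on the cluster):

```shell
# Discover which versions of the flairnlp module are available
module spider flairnlp

# Load the module (pick a specific version from the module spider output if needed)
module load flairnlp

# The installation directory is exposed through an environment variable
echo "$HPC_FLAIRNLP_DIR"
```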

Environment Variables

  • HPC_FLAIRNLP_DIR - installation directory

Additional Usage Information

Flair is:

  • A powerful NLP library. Flair allows you to apply state-of-the-art natural language processing (NLP) models to your text, such as named entity recognition (NER), part-of-speech tagging (PoS), word sense disambiguation, and text classification, with special support for biomedical data and a rapidly growing number of languages.

  • A text embedding library. Flair has simple interfaces that allow you to use and combine different word and document embeddings, including the proposed Flair embeddings, BERT embeddings and ELMo embeddings.

  • A PyTorch NLP framework. The framework builds directly on PyTorch, making it easy to train your own models and experiment with new approaches using Flair embeddings and classes.

Citation

If you publish research that uses flairNLP, please cite the appropriate papers as follows:

If you use Flair embeddings:

  • Akbik, A., Blythe, D., & Vollgraf, R. (2018). Contextual string embeddings for sequence labeling. In Proceedings of the 27th International Conference on Computational Linguistics (COLING 2018) (pp. 1638–1649). https://aclanthology.org/C18-1139/

If you use the Flair framework for your experiments:

  • Akbik, A., Bergmann, T., Blythe, D., Rasul, K., Schweter, S., & Vollgraf, R. (2019). FLAIR: An easy-to-use framework for state-of-the-art NLP. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (Demonstrations) (pp. 54–59). https://aclanthology.org/N19-4010/

If you use the FLERT models or approach:

  • Schweter, S., & Akbik, A. (2020). FLERT: Document-level features for named entity recognition. arXiv preprint. arXiv:2011.06993. https://arxiv.org/abs/2011.06993

If you use the TARS approach for few-shot and zero-shot learning:

  • Halder, K., Akbik, A., Krapac, J., & Vollgraf, R. (2020). Task-aware representation of sentences for few-shot and zero-shot learning. In Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020) (pp. 3035–3049). https://kishaloyhalder.github.io/pdfs/tars_coling2020.pdf

Categories

visualization, language