# llama

## Description
Llama 2 pretrained models are trained on 2 trillion tokens and have double the context length of Llama 1. Its fine-tuned models have been trained on over 1 million human annotations.
## Environment Modules
Run `module spider llama` to find out what environment modules are available for this application.
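As a sketch, a typical session on an Lmod-based cluster looks like the following; the specific version string shown is an assumption and will differ by system:

```shell
# List every available version of the llama module (Lmod)
module spider llama

# Show what is required to load one specific version
# (the version "2" here is an assumed example, not a guaranteed name)
module spider llama/2

# Load the module, then confirm it is active
module load llama/2
module list
```

After loading, the module typically prepends the application's paths to your environment so its files are available in your job scripts.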
## Environment Variables

## Categories
library, math