* common: custom hf endpoint support

Add support for custom Hugging Face endpoints via the HF_ENDPOINT environment variable.

You can now specify a custom Hugging Face endpoint with the HF_ENDPOINT environment variable when using the --hf-repo flag, which works similarly to huggingface-cli's endpoint configuration.

Example usage:

    HF_ENDPOINT=https://hf-mirror.com/ ./bin/llama-cli --hf-repo Qwen/Qwen1.5-0.5B-Chat-GGUF --hf-file qwen1_5-0_5b-chat-q2_k.gguf -p "The meaning to life and the universe is"

The trailing slash in the URL is optional:

    HF_ENDPOINT=https://hf-mirror.com ./bin/llama-cli --hf-repo Qwen/Qwen1.5-0.5B-Chat-GGUF --hf-file qwen1_5-0_5b-chat-q2_k.gguf -p "The meaning to life and the universe is"

* Update common/arg.cpp

Readability improvement.

Co-authored-by: Xuan-Son Nguyen <thichthat@gmail.com>

* Apply suggestions from code review

---------

Co-authored-by: ベアトリーチェ <148695646+MakiSonomura@users.noreply.github.com>
Co-authored-by: Xuan-Son Nguyen <thichthat@gmail.com>
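
For reference, a minimal C++ sketch of the behavior this change describes: read HF_ENDPOINT if set, fall back to the default hub, and normalize the value so it ends with exactly one slash. The function name get_hf_endpoint and the URL assembly in main are illustrative assumptions, not the exact symbols in common/arg.cpp.

    // Sketch of the HF_ENDPOINT handling described above (assumed names).
    #include <cstdlib>
    #include <iostream>
    #include <string>

    // Use HF_ENDPOINT when set, otherwise the default hub; append a '/'
    // if missing, so the user may pass the URL with or without one.
    static std::string get_hf_endpoint() {
        std::string endpoint = "https://huggingface.co/";
        if (const char * env = std::getenv("HF_ENDPOINT")) {
            endpoint = env;
            if (!endpoint.empty() && endpoint.back() != '/') {
                endpoint += '/';
            }
        }
        return endpoint;
    }

    int main() {
        // e.g. HF_ENDPOINT=https://hf-mirror.com ./endpoint_demo
        const std::string url = get_hf_endpoint()
            + "Qwen/Qwen1.5-0.5B-Chat-GGUF/resolve/main/qwen1_5-0_5b-chat-q2_k.gguf";
        std::cout << url << '\n';
        return 0;
    }

Normalizing the endpoint once, at the point the variable is read, keeps every later string concatenation simple and accounts for the "trailing slash is optional" behavior shown in the examples.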