How to change the location where LangChain's CTransformers downloads the Hugging Face model on AWS SageMaker Studio Lab

I used the following command:

from langchain.llms import CTransformers
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

llm = CTransformers(
    model="TheBloke/Llama-2-7B-Chat-GGML",
    model_file="llama-2-7b-chat.ggmlv3.q2_K.bin",
    callbacks=[StreamingStdOutCallbackHandler()],
)

to download an LLM, but I can't find where the model file was saved. I'd like to save the model in the same directory as my script.
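One route I'm considering is redirecting the Hugging Face cache before constructing the model. This is a sketch under the assumption that CTransformers fetches models via huggingface_hub, which honors the HF_HOME environment variable; the "models" subfolder name is just my choice:

```python
import os

# Assumption: the download goes through huggingface_hub, whose cache
# location defaults to ~/.cache/huggingface but can be redirected via
# HF_HOME. This must be set BEFORE the model is constructed/downloaded.
# Resolve the script's own directory (fall back to the working
# directory when __file__ is not defined, e.g. in a notebook cell).
if "__file__" in globals():
    script_dir = os.path.dirname(os.path.abspath(__file__))
else:
    script_dir = os.getcwd()

os.environ["HF_HOME"] = os.path.join(script_dir, "models")

# With HF_HOME set, creating CTransformers(...) as above should place
# the downloaded .bin file under ./models/ instead of the home cache.
```

If that assumption about the cache variable turns out to be wrong, an alternative would be downloading the file explicitly (e.g. with huggingface_hub's hf_hub_download and its local_dir parameter) and passing the resulting local path as the model argument.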
