cguna committed · verified · Commit 11ca87c · Parent(s): fb3dbb5

Update context_relevancy_lora/README.md

Files changed (1):
  1. context_relevancy_lora/README.md (+3, -2)
context_relevancy_lora/README.md CHANGED
@@ -60,7 +60,8 @@ To prompt the LoRA adapter to determine context relevancy, a special context rel
 
 ## Quickstart Example
 
-Use the code below to get started with the model.
+Use the code below to get started with the model. Before running the script, set the `LORA_NAME` parameter to the path of the directory to which you downloaded the LoRA adapter. The download process is explained [here](https://huggingface.co/ibm-granite/granite-3.3-8b-rag-agent-lib#quickstart-example).
+
 ```python
 import torch
 from transformers import AutoTokenizer, AutoModelForCausalLM
@@ -71,7 +72,7 @@ device=torch.device('cuda' if torch.cuda.is_available() else 'cpu')
 
 CONTEXT_RELEVANCY_PROMPT = "<|start_of_role|>context_relevance<|end_of_role|>"
 BASE_NAME = "ibm-granite/granite-3.3-8b-instruct"
-LORA_NAME = "ibm-granite/granite-3.3-8b-lora-rag-context-relevance-prediction"
+LORA_NAME = "PATH_TO_DOWNLOADED_DIRECTORY"
 
 tokenizer = AutoTokenizer.from_pretrained(BASE_NAME, padding_side='left',trust_remote_code=True)
 model_base = AutoModelForCausalLM.from_pretrained(BASE_NAME,device_map="auto")
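The diff above defines `CONTEXT_RELEVANCY_PROMPT`, the special role tag that triggers the adapter's classification behavior. As a rough, model-free sketch of where that tag lands, the generation input can be pictured as role-tagged turns with the tag appended last. This assumes Granite-style role markers; `build_prompt` and the sample document/question are hypothetical, and the actual quickstart builds the input with `tokenizer.apply_chat_template` rather than string concatenation:

```python
# Illustrative only: hand-assemble a context-relevance prompt using
# Granite-style role tags. The real quickstart uses the tokenizer's
# chat template; build_prompt and the sample strings are hypothetical.
CONTEXT_RELEVANCY_PROMPT = "<|start_of_role|>context_relevance<|end_of_role|>"

def build_prompt(document: str, question: str) -> str:
    """Concatenate a document turn, a user turn, and the special
    context-relevance tag that the LoRA adapter completes."""
    turns = [
        f"<|start_of_role|>document<|end_of_role|>{document}<|end_of_text|>",
        f"<|start_of_role|>user<|end_of_role|>{question}<|end_of_text|>",
    ]
    # The adapter generates its relevance judgment after this final tag.
    return "\n".join(turns) + "\n" + CONTEXT_RELEVANCY_PROMPT

prompt = build_prompt(
    "Solar panels convert sunlight into electricity.",
    "How do solar panels work?",
)
```

The key point the commit documents is unchanged by this sketch: after download, `LORA_NAME` must point at the local adapter directory, and generation is conditioned on the prompt ending in the `context_relevance` role tag.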