Abdulvajid committed · verified
Commit 627f2d6 · 1 Parent(s): a87c761

Upload tokenizer

Files changed (2):
  1. chat_template.jinja +15 -17
  2. special_tokens_map.json +14 -2
chat_template.jinja CHANGED
@@ -1,25 +1,23 @@
- {{ bos_token}}{% if messages[0]['role']==system%}
- {{ raise_exception('System message is not supported in gemma, it would be good to merget the system prompt with first user message')}}
+ {{ bos_token }}{% if messages[0].role == 'system' %}
+ {{ raise_exception('System message is not supported in gemma; merge the system prompt with the first user message') }}
  {% endif %}
- {% if not tools%}
- {{ raise_exception('You should provide tools, this model only trained for tool calling with Reasoning')}}
+ {% if not tools %}
+ {{ raise_exception('You should provide tools; this model was only trained for tool calling with reasoning') }}
  {% endif %}
- {% set first_message=True %}
+ {% set first_message = True %}
  {% for message in messages %}
- {% if first_message and tools%}
- {{"<start_of_turn>human
- You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags.You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions.Here are the available tools:<tools> " ~ (tools | tojson) ~ " </tools>Use the following pydantic model json schema for each tool call you will make: {'title': 'FunctionCall', 'type': 'object', 'properties': {'arguments': {'title': 'Arguments', 'type': 'object'}, 'name': {'title': 'Name', 'type': 'string'}}, 'required': ['arguments', 'name']}For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
+ {% if first_message and tools %}
+ {% raw -%}
+ <start_of_turn>human
+ You are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags.You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions.Here are the available tools:<tools> {% endraw %}{{ tools }}{% raw -%} </tools>Use the following pydantic model json schema for each tool call you will make: {'title': 'FunctionCall', 'type': 'object', 'properties': {'arguments': {'title': 'Arguments', 'type': 'object'}, 'name': {'title': 'Name', 'type': 'string'}}, 'required': ['arguments', 'name']}For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:
  <tool_call>
  {tool_call}
  </tool_call>Also, before making a call to a function take the time to plan the function to take. Make that thinking process between <think>{your thought}</think>

- " + message['content'] }}
- {% set first_message = False %}
- {% else %}
- {{'<start_of_turn>' + message['role'] + '
- ' + message['content'] | trim + '<eos_turn><eos>
- '}}
- {% endif %}
+ {% endraw %}{{ message.content }}{% raw -%}
+ <eos_turn><eos>
+ <start_of_turn>model
+ {% endraw -%}
+ {% set first_message = False %}
+ {% endif %}
  {% endfor %}
- {% if add_generation_prompt %}{{'<start_of_turn>model
- '}}{% endif %}"
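For context, a minimal sketch of how the updated template might be rendered from Python with transformers (assuming a version recent enough to accept the tools argument of apply_chat_template). The repository id and the get_weather tool schema below are placeholders for illustration, not part of this commit:

from transformers import AutoTokenizer

# Placeholder repo id -- substitute the actual model repository.
tokenizer = AutoTokenizer.from_pretrained("Abdulvajid/<repo-name>")

# Hypothetical tool schema; the template injects whatever is passed via
# `tools` into the <tools>...</tools> block of the first human turn.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# No system turn: the template raises if messages[0] is a system message,
# and also raises if `tools` is not provided.
messages = [{"role": "user", "content": "What's the weather in Kochi?"}]

prompt = tokenizer.apply_chat_template(messages, tools=tools, tokenize=False)
print(prompt)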
 
 
special_tokens_map.json CHANGED
@@ -16,6 +16,18 @@
  "rstrip": false,
  "single_word": false
  },
- "eos_token": "<eos>",
- "pad_token": "<pad>"
+ "eos_token": {
+   "content": "<eos>",
+   "lstrip": false,
+   "normalized": false,
+   "rstrip": false,
+   "single_word": false
+ },
+ "pad_token": {
+   "content": "<pad>",
+   "lstrip": false,
+   "normalized": false,
+   "rstrip": false,
+   "single_word": false
+ }
  }
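As a quick sanity check (sketch only; the repo id is again a placeholder), the eos/pad entries resolve to the same token strings after loading, with the dict form simply making the AddedToken flags (lstrip, rstrip, normalized, single_word) explicit:

from transformers import AutoTokenizer

# Placeholder repo id -- substitute the actual model repository.
tokenizer = AutoTokenizer.from_pretrained("Abdulvajid/<repo-name>")

# Both the old string form and the new dict form expose plain strings here.
print(tokenizer.eos_token)   # <eos>
print(tokenizer.pad_token)   # <pad>
print(tokenizer.eos_token_id, tokenizer.pad_token_id)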