lucataco /tinyllama-1.1b-chat-v1.0:c35e854d
Input schema
The fields you can use to run this model with an API. If you don't give a value for a field, its default value will be used.
Field | Type | Default value | Description
---|---|---|---
prompt | string | How many helicopters can a human eat in one sitting? | Instruction for the model
system_prompt | string | You are a friendly chatbot who always responds in the style of a pirate | System prompt for the model; helps guide model behaviour
prompt_template | string | `<\|system\|>\n{system_prompt}</s>\n<\|user\|>\n{prompt}</s>\n<\|assistant\|>` | Template to pass to the model. Override if you are providing multi-turn instructions.
max_new_tokens | integer | 256 | The maximum number of tokens the model should generate as output
top_p | number | 0.95 | Nucleus sampling: restricts generation to the smallest set of highest-probability tokens whose cumulative probability exceeds top_p
top_k | integer | 50 | The number of highest-probability tokens to consider when generating output. If > 0, only the top k tokens are kept (top-k filtering)
temperature | number | 0.7 | The value used to modulate the next-token probabilities
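To make the `prompt_template` field concrete, here is a minimal sketch of how the default template expands once `system_prompt` and `prompt` are substituted in. The template string below is reconstructed from the default value shown above; the exact whitespace the model server uses may differ slightly.

```python
# Default prompt_template from the schema above, written out with
# explicit newlines. {system_prompt} and {prompt} are the two
# placeholders the other input fields fill in.
prompt_template = (
    "<|system|>\n"
    "{system_prompt}</s>\n"
    "<|user|>\n"
    "{prompt}</s>\n"
    "<|assistant|>"
)

# Substitute the default field values to get the text the model sees.
filled = prompt_template.format(
    system_prompt=(
        "You are a friendly chatbot who always responds "
        "in the style of a pirate"
    ),
    prompt="How many helicopters can a human eat in one sitting?",
)
print(filled)
```

For multi-turn conversations you would override `prompt_template` with one that repeats the `<|user|>`/`<|assistant|>` turns, which is why the field is exposed at all.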
Output schema
The shape of the response you’ll get when you run this model with an API.
Schema
{
  "items": {"type": "string"},
  "title": "Output",
  "type": "array",
  "x-cog-array-display": "concatenate",
  "x-cog-array-type": "iterator"
}
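The `x-cog-array-type: "iterator"` and `x-cog-array-display: "concatenate"` hints mean the model streams string chunks that a client joins into a single response. A minimal sketch of that client-side handling, using a stand-in generator (`fake_stream` is hypothetical, not part of any API) in place of the real streaming iterator:

```python
# Stand-in for the iterator of string chunks the API yields.
# A real client would iterate over the API response instead.
def fake_stream():
    yield "Arr, "
    yield "ye can't be eatin' "
    yield "helicopters, matey!"

# Per x-cog-array-display "concatenate": join chunks into one string.
text = "".join(fake_stream())
print(text)
```

Iterating chunk by chunk instead of joining at the end is what enables token-by-token display in a UI.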