
lucataco/qwen1.5-72b:f919d3c4

Input schema

The fields you can use to run this model with an API. If you don't give a value for a field, its default value will be used.

prompt (string)
Default: Give me a short introduction to large language model.
Input prompt

system_prompt (string)
Default: You are a helpful assistant.
System prompt

max_new_tokens (integer)
Default: 512 | Min: 1 | Max: 32768
The maximum number of tokens to generate

temperature (number)
Default: 1 | Min: 0.1 | Max: 5
Adjusts the randomness of outputs: values greater than 1 are more random, 0 is deterministic, and 0.75 is a good starting value.

top_p (number)
Default: 1 | Min: 0.01 | Max: 1
When decoding text, samples from the top p percentage of most likely tokens; lower to ignore less likely tokens.

top_k (integer)
Default: 1
When decoding text, samples from the top k most likely tokens; lower to ignore less likely tokens.

repetition_penalty (number)
Default: 1 | Min: 0.01 | Max: 10
Penalty for repeated words in generated text; 1 is no penalty, values greater than 1 discourage repetition, and values less than 1 encourage it.

seed (integer)
The seed for the random number generator
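
As a minimal sketch of passing these fields, assuming the official Replicate Python client (pip install replicate) and a REPLICATE_API_TOKEN set in the environment; the version hash shown on this page is truncated, so substitute the full version id before running:

    # Minimal sketch: calling this model with the documented input fields.
    # Assumes the Replicate Python client and REPLICATE_API_TOKEN are set up.
    import replicate

    output = replicate.run(
        # Truncated version hash as shown on this page; use the full version id.
        "lucataco/qwen1.5-72b:f919d3c4...",
        input={
            "prompt": "Give me a short introduction to large language model.",
            "system_prompt": "You are a helpful assistant.",
            "max_new_tokens": 512,    # 1..32768
            "temperature": 1,         # 0.1..5
            "top_p": 1,               # 0.01..1
            "top_k": 1,
            "repetition_penalty": 1,  # 0.01..10
            # "seed": 42,             # optional; omit for a random seed
        },
    )

Any field left out of the input dict falls back to the default listed above.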

Output schema

The shape of the response you’ll get when you run this model with an API.

Schema
{
  "items": {"type": "string"},
  "title": "Output",
  "type": "array",
  "x-cog-array-display": "concatenate",
  "x-cog-array-type": "iterator"
}
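
Because the output is an array of string chunks marked for concatenation ("x-cog-array-display": "concatenate", "x-cog-array-type": "iterator"), the pieces are meant to be joined into a single text. A small sketch, continuing from the call above:

    # The output is a sequence of string chunks intended to be concatenated;
    # joining them yields the full generated response.
    text = "".join(output)
    print(text)

With the Python client you can also loop over the output chunk by chunk to stream the text as it is generated, instead of joining it at the end.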