Engines
Engines describe and provide access to the various models available in the API. You can refer to the Models documentation to understand what engines are available and the differences between them.
Get engines GET /v1/engines/{engine_id}
Gets information about an individual engine, such as its owner, name, availability, and description.
GET https://api.goose.ai/v1/engines/{engine_id}
Curl Example
curl https://api.goose.ai/v1/engines/gpt-j-6b \
-H 'Authorization: Bearer YOUR_API_KEY'
Example Response
{
"description": "6B parameter EleutherAI model trained on the Pile, using the Mesh Transformer JAX framework.",
"id": "gpt-j-6b",
"name": "GPT-J 6B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
}
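The same call can be sketched in Python using only the standard library. This is an illustrative client, not an official SDK: `get_engine` and `API_BASE` are names chosen here for clarity, and `YOUR_API_KEY` must be replaced with a real key before the request will succeed.

```python
import json
import urllib.request

API_BASE = "https://api.goose.ai/v1"

def get_engine(engine_id: str, api_key: str) -> dict:
    """GET /v1/engines/{engine_id} with a Bearer token, returning the parsed JSON."""
    req = urllib.request.Request(
        f"{API_BASE}/engines/{engine_id}",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Parsing the documented response shape, e.g. to confirm an engine is usable
# before sending completions to it (payload copied from the example above):
engine = json.loads("""
{"id": "gpt-j-6b", "name": "GPT-J 6B", "object": "engine",
 "owner": "gooseai", "ready": true, "tokenizer": "gpt2", "type": "text"}
""")
if engine["ready"]:
    print(f'{engine["name"]} is ready (tokenizer: {engine["tokenizer"]})')
```

The `ready` flag is the field worth checking programmatically; the rest of the object is descriptive metadata.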
List engines GET /v1/engines
Lists the currently available engines and provides basic information about each one, such as the owner and availability.
GET https://api.goose.ai/v1/engines
Curl Example
curl https://api.goose.ai/v1/engines \
-H 'Authorization: Bearer YOUR_API_KEY'
Example Response
{
"data": [
{
"description": "20B parameter EleutherAI model trained on the Pile, using the NeoX framework.",
"id": "gpt-neo-20b",
"name": "GPT-NeoX 20B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "pile",
"type": "text"
},
{
"description": "6B parameter EleutherAI model trained on the Pile, using the Mesh Transformer JAX framework.",
"id": "gpt-j-6b",
"name": "GPT-J 6B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
},
{
"description": "2.7B parameter EleutherAI model trained on the Pile, using the Neo framework.",
"id": "gpt-neo-2-7b",
"name": "GPT-Neo 2.7B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
},
{
"description": "1.3B parameter EleutherAI model trained on the Pile, using the Neo framework.",
"id": "gpt-neo-1-3b",
"name": "GPT-Neo 1.3B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
},
{
"description": "125M parameter EleutherAI model trained on the Pile, using the Neo framework.",
"id": "gpt-neo-125m",
"name": "GPT-Neo 125M",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
},
{
"description": "13B parameter Facebook Mixture of Experts model trained on RoBERTa and CC100 subset data.",
"id": "fairseq-13b",
"name": "Fairseq 13B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
},
{
"description": "6.7B parameter Facebook Mixture of Experts model trained on RoBERTa and CC100 subset data.",
"id": "fairseq-6-7b",
"name": "Fairseq 6.7B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
},
{
"description": "2.7B parameter Facebook Mixture of Experts model trained on RoBERTa and CC100 subset data.",
"id": "fairseq-2-7b",
"name": "Fairseq 2.7B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
},
{
"description": "1.3B parameter Facebook Mixture of Experts model trained on RoBERTa and CC100 subset data.",
"id": "fairseq-1-3b",
"name": "Fairseq 1.3B",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
},
{
"description": "125M parameter Facebook Mixture of Experts model trained on RoBERTa and CC100 subset data.",
"id": "fairseq-125m",
"name": "Fairseq 125M",
"object": "engine",
"owner": "gooseai",
"ready": true,
"tokenizer": "gpt2",
"type": "text"
}
],
"object": "list"
}
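The list endpoint can be sketched the same way. `list_engines` is an illustrative helper (not part of any official client) that returns the `data` array from the documented response; the snippet below then shows client-side grouping of ready engines by tokenizer, using a hand-copied subset of the example payload above.

```python
import json
import urllib.request

API_BASE = "https://api.goose.ai/v1"

def list_engines(api_key: str) -> list:
    """GET /v1/engines with a Bearer token, returning the "data" array."""
    req = urllib.request.Request(
        f"{API_BASE}/engines",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"]

# Working with the documented response shape: group the ids of ready
# engines by the tokenizer they use (subset of the example list above).
engines = [
    {"id": "gpt-neo-20b", "ready": True, "tokenizer": "pile"},
    {"id": "gpt-j-6b", "ready": True, "tokenizer": "gpt2"},
    {"id": "fairseq-13b", "ready": True, "tokenizer": "gpt2"},
]
by_tokenizer = {}
for engine in engines:
    if engine["ready"]:
        by_tokenizer.setdefault(engine["tokenizer"], []).append(engine["id"])
```

Grouping by `tokenizer` matters because it determines how prompts are tokenized and therefore how prompt length is counted: GPT-NeoX 20B uses the `pile` tokenizer, while the other engines use `gpt2`.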