List of LLMs in DKubeX LLM Catalog
The following table lists the base LLMs currently registered with the DKubeX LLM Catalog in DKubeX v0.8.1.
| LLM Name | Required GPU Type | GPU Memory Utilization | Max number of batched tokens | Max number of sequences | Trust remote code | Max number of total tokens | Number of GPUs needed |
|---|---|---|---|---|---|---|---|
|  | a10 | 0.85 | 4096 | 64 | true | 2048 | 1 |
|  | a10 | 0.8 | 4096 | 64 | true | 4096 | 4 |
|  | a10 | 0.85 | 4096 | 64 | true | 2048 | 1 |
|  | a10 | 0.7 | 4096 | 64 | true | 4096 | 1 |
|  | a10 | 0.8 | 4096 | 64 | true | 4096 | 4 |
|  | a10 | 0.85 | 4096 | 64 | true | 4096 | 1 |
|  | a10 | 0.85 | 4096 | 64 | true | 2048 | 1 |
|  | a10 | 0.7 | 4096 | 64 | true | 4096 | 1 |
|  | a10 | 0.7 | 4096 | 64 | true | 4096 | 1 |
|  | a10 | 0.8 | 4096 | 64 | true | 4096 | 4 |
These LLMs can be deployed on the DKubeX platform without a config file, provided the resource and permission requirements are met. You can deploy them using local resources, or on cloud resources via SkyPilot.
Note
For more information regarding deploying LLMs on DKubeX, refer to Deploying LLMs in DKubeX.
To deploy an LLM using local resources, use the command given below. Replace the following fields in the command with the appropriate values:

- <name of the deployment>: Unique name for the LLM deployment
- <LLM Name>: Name of the LLM provided in the list above
- <GPU Type>: GPU type required for the LLM
- <access token>: Access token required for the LLM (if any)
Note

If you are using an EKS setup, change the value of the --type flag from a10 to g5.4xlarge in the following command. Use the --publish flag if you want to make the deployment available to any user on the same setup.
d3x llms deploy --name <name of the deployment> --model <LLM Name> --type <GPU Type> --token <access token> --publish
d3x llms deploy --name llama27b --model meta-llama/Llama-2-7b-chat-hf --type a10 --token hf_Ahq**********************WmO --publish
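As a convenience, the local deployment command above can be wrapped in a small shell script. This is only a sketch: the deployment name, model, GPU type, and token values are hypothetical placeholders, and the script echoes the assembled command as a dry run instead of executing it.

```shell
#!/bin/sh
# Placeholder values (hypothetical) -- replace with your own
# deployment name, model, GPU type, and access token.
DEPLOY_NAME="llama27b"
MODEL="meta-llama/Llama-2-7b-chat-hf"
GPU_TYPE="a10"          # per the note above, use g5.4xlarge on an EKS setup
HF_TOKEN="hf_xxxxxxxx"  # placeholder access token

# Assemble the deploy command exactly as shown in the documentation.
CMD="d3x llms deploy --name $DEPLOY_NAME --model $MODEL --type $GPU_TYPE --token $HF_TOKEN --publish"

# Echo instead of executing, so the command can be reviewed first.
echo "$CMD"
```

Dropping the final `echo` in favor of `eval "$CMD"` (or calling d3x directly) would run the deployment.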
To deploy an LLM on cloud resources using SkyPilot, use the command given below. Replace the following fields in the command with the appropriate values:

- <name of the deployment>: Unique name for the LLM deployment
- <LLM Name>: Name of the LLM provided in the list above
- <access token>: Access token required for the LLM (if any)
Note

Use the --publish flag if you want to make the deployment available to any user on the same setup.
d3x llms deploy --name <name of the deployment> --model <LLM Name> -sky --token <access token> --publish
d3x llms deploy --name llama27b --model meta-llama/Llama-2-7b-chat-hf -sky --token hf_Ahq**********************WmO --publish
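The SkyPilot variant differs from the local command only in that -sky replaces the --type flag. A minimal sketch with the same hypothetical placeholder values, again echoing the command as a dry run:

```shell
#!/bin/sh
DEPLOY_NAME="llama27b"                 # hypothetical deployment name
MODEL="meta-llama/Llama-2-7b-chat-hf"
HF_TOKEN="hf_xxxxxxxx"                 # placeholder access token

# -sky asks DKubeX to provision cloud resources via SkyPilot
# instead of pinning a local GPU type.
CMD="d3x llms deploy --name $DEPLOY_NAME --model $MODEL -sky --token $HF_TOKEN --publish"
echo "$CMD"
```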