Creates an embedding vector that can be easily consumed by machine learning models and algorithms.
POST
/v1/embeddings
Request Body
The following parameters can be included in the request body:
Parameters
model
string
Required
Default Value:
alphaedge-large-3-2512
ID of the model to use.
input
string
array<string>
Required
Input text to embed, encoded as a string or array of strings.
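Because `input` accepts either a single string or an array of strings, several texts can be embedded in one request. A minimal sketch of a batch request body (field names taken from the parameters above; the texts themselves are placeholders):

```python
import json

# Build a batch request body: `input` carries an array of strings,
# so multiple texts are embedded in a single call.
body = {
    "model": "alphaedge-large-3-2512",
    "input": [
        "The food was delicious and the waiter...",
        "The service was slow but friendly.",
    ],
}

# Serialize to the JSON payload that would be sent to /v1/embeddings.
payload = json.dumps(body)
print(payload)
```

The response's `data` array then contains one embedding object per input string, matched by `index`.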
Successful Response
The following fields are returned in a successful response:
Response Fields
object
string
Required
The object type, which is always "list".
data
array<Embedding>
Required
The list of embedding objects.
model
string
Required
The model used for the embedding.
usage
UsageInfo
Required
Usage statistics for the embedding request.
Examples
Code examples for using this endpoint:
typescript
import { AlphaEdge } from '@alphaedge/alphaedge';

const alphaedge = new AlphaEdge({
  apiKey: process.env.ALPHAEDGE_API_KEY,
});

const embedding = await alphaedge.embeddings.create({
  model: 'alphaedge-large-3-2512',
  input: 'The food was delicious and the waiter...'
});
python
from alphaedge import AlphaEdge

alphaedge = AlphaEdge(api_key="your-api-key")

embedding = alphaedge.embeddings.create(
    model="alphaedge-large-3-2512",
    input="The food was delicious and the waiter..."
)
curl
curl https://api.alphaedge-ai.com/v1/embeddings \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ALPHAEDGE_API_KEY" \
  -d '{
    "model": "alphaedge-large-3-2512",
    "input": "The food was delicious and the waiter..."
  }'
Response
Example API response:
json
{
  "object": "list",
  "data": [
    {
      "object": "embedding",
      "embedding": [
        0.0023064255,
        -0.009327292,
        0.0023054255
      ],
      "index": 0
    }
  ],
  "model": "alphaedge-large-3-2512",
  "usage": {
    "prompt_tokens": 8,
    "total_tokens": 8
  }
}
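The vectors in `data[i].embedding` can be consumed directly by downstream code. As a sketch (not part of the API, and using only the truncated example vector above), cosine similarity between two embeddings can be computed with the standard library:

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# A response shaped like the example above (vector truncated for illustration).
response = {
    "object": "list",
    "data": [
        {
            "object": "embedding",
            "embedding": [0.0023064255, -0.009327292, 0.0023054255],
            "index": 0,
        }
    ],
    "model": "alphaedge-large-3-2512",
    "usage": {"prompt_tokens": 8, "total_tokens": 8},
}

vector = response["data"][0]["embedding"]
# A vector's cosine similarity with itself is approximately 1.0.
print(cosine_similarity(vector, vector))
```

In practice the full vectors returned by the endpoint are much longer than the three-element excerpt shown here; the extraction pattern is the same.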