Embeddings are numerical representations of text that capture semantic meaning. Similar texts have similar embeddings, enabling powerful search and comparison features.
```python
from mythicdot import MythicDot
import numpy as np

client = MythicDot()

# Generate embeddings for a batch of texts
texts = [
    "How do I reset my password?",
    "I forgot my login credentials",
    "What are your business hours?",
]

response = client.embeddings.create(
    model="mythic-embed-3",
    input=texts,
)

# Extract the embedding vectors from the response
embeddings = [item.embedding for item in response.data]

# Cosine similarity: the dot product of two vectors divided by
# the product of their magnitudes (1.0 = identical direction)
def cosine_similarity(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Compare the first two texts (similar meaning)
sim_1_2 = cosine_similarity(embeddings[0], embeddings[1])
print(f"Similarity 1-2: {sim_1_2:.3f}")  # ~0.85 (similar)

# Compare the first and third texts (different topics)
sim_1_3 = cosine_similarity(embeddings[0], embeddings[2])
print(f"Similarity 1-3: {sim_1_3:.3f}")  # ~0.32 (different)
```
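The same cosine-similarity function extends naturally to semantic search: embed a query, score it against every document vector, and return the best matches. The sketch below uses small toy vectors in place of real API output (in practice the vectors would come from `client.embeddings.create`, as above); the `search` helper and its dimensions are illustrative, not part of the SDK.

```python
import numpy as np

def cosine_similarity(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def search(query_emb, corpus_embs, top_k=2):
    # Score every corpus vector against the query, then return the
    # top_k (index, score) pairs sorted by similarity, descending.
    scores = [cosine_similarity(query_emb, e) for e in corpus_embs]
    order = np.argsort(scores)[::-1][:top_k]
    return [(int(i), float(scores[i])) for i in order]

# Toy 3-dimensional vectors standing in for real embeddings
# (real vectors from mythic-embed-3 would have 3072 dimensions).
corpus = [
    np.array([1.0, 0.0, 0.1]),   # doc 0: close to the query
    np.array([0.9, 0.1, 0.0]),   # doc 1: also close
    np.array([0.0, 1.0, 0.2]),   # doc 2: unrelated direction
]
query = np.array([1.0, 0.05, 0.05])

results = search(query, corpus)
print(results)  # docs 0 and 1 rank above doc 2
```

For large corpora you would precompute and normalize the document vectors once, then score queries with a single matrix-vector product instead of a Python loop.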
| Model | Dimensions | Price per 1M Tokens |
|---|---|---|
| mythic-embed-3 | 3072 | $0.13 |
| mythic-embed-3-mini | 1536 | $0.02 |
Build powerful search and similarity features with our embeddings.