Agentic Service Discovery

Resources

Groq

by Groq, Inc.

Run extremely fast chat completions and audio transcriptions on open-source models (Llama 3, Mixtral, Gemma) using Groq's LPU inference hardware.

Category: AI services · Pricing: Free / Paid · Auth: API key · Tags: llm, fast-inference, llama, mixtral, open-source, ai

How agents use Groq

  • Agent uses Groq for low-latency real-time inference where response speed is critical
  • Agent transcribes audio recordings or voice memos using Whisper on Groq hardware
  • Agent uses Llama 3 70B for complex reasoning tasks at lower cost than frontier models
  • Agent runs structured JSON extraction with response_format: json_object for parsing unstructured data
  • Agent uses Groq as a fast fallback when primary LLM providers are throttled or unavailable
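The fallback pattern in the last bullet can be sketched provider-agnostically. The `chat_with_fallback` helper and the provider labels below are illustrative assumptions, not part of any SDK; a real agent would wrap its primary provider's client and a Groq client in the callables:

```python
def chat_with_fallback(messages, providers):
    """Try each (label, callable) provider in order; return the first reply.

    `providers` is a list of (label, call) pairs where `call(messages)`
    returns a reply string or raises on throttling, timeout, or outage.
    """
    last_error = None
    for name, call in providers:
        try:
            return name, call(messages)
        except Exception as exc:  # rate limits, timeouts, 5xx errors
            last_error = exc
    raise RuntimeError(f"all providers failed, last error: {last_error}")


# Usage with stand-in callables (hypothetical, for illustration only):
def primary(messages):
    raise TimeoutError("primary provider throttled")

def groq(messages):
    return "fast reply from Groq"

used, reply = chat_with_fallback(
    [{"role": "user", "content": "hi"}],
    [("primary", primary), ("groq", groq)],
)
# used == "groq"
```

Ordering the providers list by preference keeps the primary model in charge while Groq absorbs throttled or failed requests.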

Agent actions

Chat Completion: Generate a chat response using a Groq-hosted model with ultra-low latency.

Input: messages, model, systemPrompt, temperature, maxTokens, responseFormat

Returns: content, finishReason, promptTokens, completionTokens, model, tokensPerSecond
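Assuming Groq's OpenAI-compatible REST endpoint (`/openai/v1/chat/completions`), the inputs above map onto a request body roughly as follows; the model name used in the test is an illustrative placeholder, and the return dict covers only two of the listed fields:

```python
import json
import urllib.request

GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_chat_payload(messages, model, system_prompt=None,
                       temperature=None, max_tokens=None, response_format=None):
    """Map the action inputs onto an OpenAI-style chat request body."""
    msgs = list(messages)
    if system_prompt:
        msgs = [{"role": "system", "content": system_prompt}] + msgs
    payload = {"model": model, "messages": msgs}
    if temperature is not None:
        payload["temperature"] = temperature
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    if response_format is not None:
        payload["response_format"] = {"type": response_format}  # e.g. "json_object"
    return payload

def chat_completion(api_key, **inputs):
    """POST the payload and pull out the reply (sketch, no retry logic)."""
    req = urllib.request.Request(
        GROQ_CHAT_URL,
        data=json.dumps(build_chat_payload(**inputs)).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        choice = json.load(resp)["choices"][0]
    return {"content": choice["message"]["content"],
            "finishReason": choice["finish_reason"]}
```

Setting `response_format` to `"json_object"` is what enables the structured JSON extraction use case listed above.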

Transcribe Audio: Transcribe audio to text using Whisper on Groq hardware.

Input: audioUrl, language, prompt

Returns: text, language, duration
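A transcription call is a multipart upload. Assuming the OpenAI-compatible `/openai/v1/audio/transcriptions` endpoint and the `whisper-large-v3` model name (both assumptions to verify against Groq's docs), the inputs map onto form fields like this:

```python
import json
import urllib.request
import uuid

TRANSCRIBE_URL = "https://api.groq.com/openai/v1/audio/transcriptions"

def build_transcription_fields(model="whisper-large-v3", language=None, prompt=None):
    """Map the action inputs onto the form fields Whisper expects."""
    fields = {"model": model}
    if language:
        fields["language"] = language  # e.g. "en"; omit to auto-detect
    if prompt:
        fields["prompt"] = prompt      # optional vocabulary/context hint
    return fields

def encode_multipart(fields, filename, audio_bytes):
    """Minimal multipart/form-data encoder (stdlib-only sketch)."""
    boundary = uuid.uuid4().hex
    parts = []
    for name, value in fields.items():
        parts.append(
            (f'--{boundary}\r\nContent-Disposition: form-data; '
             f'name="{name}"\r\n\r\n{value}\r\n').encode()
        )
    parts.append(
        (f'--{boundary}\r\nContent-Disposition: form-data; name="file"; '
         f'filename="{filename}"\r\n'
         "Content-Type: application/octet-stream\r\n\r\n").encode()
        + audio_bytes + b"\r\n"
    )
    parts.append(f"--{boundary}--\r\n".encode())
    return boundary, b"".join(parts)

def transcribe(api_key, audio_bytes, filename="audio.mp3", **inputs):
    boundary, body = encode_multipart(build_transcription_fields(**inputs),
                                      filename, audio_bytes)
    req = urllib.request.Request(
        TRANSCRIBE_URL, data=body,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": f"multipart/form-data; boundary={boundary}"})
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["text"]
```

An agent acting on an `audioUrl` input would fetch the bytes first and pass them to `transcribe`.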

List Models: Retrieve all available models on Groq.

Returns: models, modelCount
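Assuming an OpenAI-style `GET /openai/v1/models` endpoint, the raw response can be reshaped into the two return fields above; `shape_models_response` is an illustrative helper, not a Groq API name:

```python
import json
import urllib.request

MODELS_URL = "https://api.groq.com/openai/v1/models"

def shape_models_response(api_json):
    """Map the raw OpenAI-style list response to the action's return fields."""
    models = [m["id"] for m in api_json.get("data", [])]
    return {"models": models, "modelCount": len(models)}

def list_models(api_key):
    req = urllib.request.Request(
        MODELS_URL, headers={"Authorization": f"Bearer {api_key}"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return shape_models_response(json.load(resp))
```

Agents typically call this once at startup to validate that a configured model id is still served.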

Example workflows

1. Real-time chat agent

An agent that needs sub-second response times uses Groq with Llama 3 for low-latency interactive processing.

About Groq

Provider
Groq, Inc.
Price (always check details with the provider)
Free / Paid: generous free tier available; paid usage billed per million tokens, typically cheaper than OpenAI equivalents.
Authentication
API key
Rate limit (always check details with the provider)
30 requests / minute
Compatible nodes
AgentResource

Build an AI workflow with Groq

Use the Agentic Planner to design, visualize, and connect Groq with your other tools.

Open Agentic Planner
