GoChat API Reference

This reference follows godoc conventions and covers the externally exposed packages, interfaces, functions, and structs under the pkg directory.

Package core (Core Types)

Defines the foundational interfaces, data carriers, and domain models used throughout the project.

Interface Client

The core interface for interacting with large language models.

type Client interface {
    // Chat executes a blocking chat request and returns the complete response at once.
    Chat(ctx context.Context, messages []Message, opts ...Option) (*Response, error)

    // ChatStream executes a streaming chat request, returning an iterable Stream object.
    ChatStream(ctx context.Context, messages []Message, opts ...Option) (*Stream, error)
}
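
A blocking round-trip can be sketched as follows; `llm` stands for any core.Client implementation (for example one constructed via client/openai), and error handling is abbreviated:

```go
// Sketch: a blocking chat call against any core.Client implementation.
ctx := context.Background()
msgs := []core.Message{
    core.NewSystemMessage("You are a concise assistant."),
    core.NewUserMessage("Explain Go's context package in one sentence."),
}
resp, err := llm.Chat(ctx, msgs, core.WithMaxTokens(256))
if err != nil {
    log.Fatal(err)
}
fmt.Println(resp.Content)
```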

Struct Message

Represents a single message in a conversation context.

type Message struct {
    Role       string         `json:"role"`           // Role: system, user, assistant, tool
    Content    []ContentBlock `json:"content"`        // Multimodal content block slice
    ToolCalls  []ToolCall     `json:"tool_calls"`     // Function call requests initiated by assistant
    ToolCallID string         `json:"tool_call_id"`   // Callback ID for tool response when role is tool
}

// NewUserMessage and NewSystemMessage create a message containing a single text block
func NewUserMessage(text string) Message
func NewSystemMessage(text string) Message
// TextContent returns the message's text blocks concatenated into a single string
func (m Message) TextContent() string
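
For plain text, the helpers above avoid building the Content slice by hand (a sketch; the printed result assumes the message holds exactly the one text block the constructor created):

```go
// Sketch: building a short conversation with the helper constructors.
msgs := []core.Message{
    core.NewSystemMessage("Answer briefly."),
    core.NewUserMessage("What is a goroutine?"),
}
// TextContent flattens the message's text blocks into one string.
fmt.Println(msgs[1].TextContent()) // "What is a goroutine?"
```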

Struct ContentBlock

type ContentBlock struct {
    Type      ContentType // "text", "image", "file", "thinking"
    Text      string
    MediaType string      // MIME type, e.g., "image/png"
    Data      string      // Base64-encoded data of the resource
}
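
A multimodal user message might be assembled like this (a sketch; it assumes the ContentType values mirror the string literals listed above):

```go
// Sketch: attaching a base64-encoded image alongside a text block.
img, err := os.ReadFile("chart.png")
if err != nil {
    log.Fatal(err)
}
msg := core.Message{
    Role: "user",
    Content: []core.ContentBlock{
        {Type: "text", Text: "What trend does this chart show?"},
        {
            Type:      "image",
            MediaType: "image/png",
            Data:      base64.StdEncoding.EncodeToString(img),
        },
    },
}
```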

Struct Response & Stream

type Response struct {
    ID               string
    Model            string
    Content          string      // Merged final text
    ReasoningContent string      // Model's thinking (CoT) process
    Message          Message     // Returned structured message
    FinishReason     string      // Stop reason, e.g., "stop"
    Usage            *Usage      // Token consumption
    ToolCalls        []ToolCall  // Tools expected to be called
}

type Stream struct { ... }
// Next advances the iteration; returning false indicates the stream has ended or an error occurred.
func (s *Stream) Next() bool
// Event gets the stream event object pointed to by the current cursor.
func (s *Stream) Event() StreamEvent
// Close closes the underlying response stream to avoid connection leaks.
func (s *Stream) Close() error
// Usage returns the total token usage once the stream has ended.
func (s *Stream) Usage() *Usage
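
A typical consumption loop might look like this (a sketch; the StreamEvent fields are not listed in this reference, so the Delta field name is an assumption):

```go
// Sketch: iterating a streaming response.
stream, err := llm.ChatStream(ctx, msgs)
if err != nil {
    log.Fatal(err)
}
defer stream.Close() // release the underlying connection

for stream.Next() {
    ev := stream.Event()
    fmt.Print(ev.Delta) // Delta is an assumed field name for incremental text
}
// Next returning false may mean end-of-stream or an error;
// Usage is only non-nil after the stream has ended.
if u := stream.Usage(); u != nil {
    fmt.Printf("\ntokens: %+v\n", *u)
}
```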

Functional Options (Option)

func WithModel(model string) Option
func WithTemperature(t float64) Option
func WithMaxTokens(n int) Option
func WithTools(tools ...Tool) Option
func WithSystemPrompt(prompt string) Option
func WithThinking(budget int) Option
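
Options are passed variadically and compose per call (a sketch; the model name is illustrative):

```go
// Sketch: combining several per-call options.
resp, err := llm.Chat(ctx, msgs,
    core.WithModel("gpt-4o"), // illustrative model name
    core.WithTemperature(0.2),
    core.WithMaxTokens(1024),
    core.WithThinking(2048), // reasoning-token budget
)
```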

Package client (LLM Clients)

Subpackages under this directory provide core.Client implementations for each specific vendor.

client/openai

func New(config Config) (*Client, error)

Supports all model endpoints compatible with the standard OpenAI protocol (/v1/chat/completions).

client/anthropic

func New(config Config) (*Client, error)

Tightly integrated with Anthropic's API conventions and versioned request headers, with native support for its reasoning (extended thinking) protocol and image transmission format.

client/ollama

func DefaultOllamaClient() (*Client, error)
func New(config Config) (*Client, error)

Connects to a locally deployed Ollama service, internally adapting to Ollama's quirk that non-streaming mode still returns responses in a streaming format (NDJSON).

(Other subpackages such as azureopenai, deepseek, and qwen expose the same constructor entry points.)
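
Construction might be sketched as below; the Config field names (APIKey, BaseURL) are assumptions, so consult each subpackage's Config struct:

```go
// Sketch: an OpenAI-compatible client (Config field names assumed).
oc, err := openai.New(openai.Config{
    APIKey:  os.Getenv("OPENAI_API_KEY"),
    BaseURL: "https://api.openai.com/v1",
})
if err != nil {
    log.Fatal(err)
}

// Sketch: a local Ollama client with default settings.
lc, err := ollama.DefaultOllamaClient()
if err != nil {
    log.Fatal(err)
}
_, _ = oc, lc
```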


Package embedding (Vectorization Models)

Provides the text embedding interface, a local ONNX loading engine, a tokenizer, and a dynamically concurrent batch processor.

Interface Provider & MultimodalProvider

type Provider interface {
    Embed(ctx context.Context, texts []string) ([][]float32, error)
    Dimension() int // Dimensionality of the returned embedding vectors
}

type MultimodalProvider interface {
    Provider
    EmbedImages(ctx context.Context, images [][]byte) ([][]float32, error)
}
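
A usage sketch, assuming a `provider` obtained from one of the factory methods below:

```go
// Sketch: embedding a small batch of texts.
vecs, err := provider.Embed(ctx, []string{"hello world", "goodbye world"})
if err != nil {
    log.Fatal(err)
}
// One vector per input text, each of length provider.Dimension().
fmt.Println(len(vecs), len(vecs[0]) == provider.Dimension())
```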

Factory Construction Methods

// NewProvider creates a provider from a local model file path, automatically detecting BERT, BGE, and Sentence-BERT model types
func NewProvider(modelPath string) (Provider, error)

// Convenience factories: if the model is absent locally, it is streamed from a Hugging Face mirror and initialized
func WithBGE(modelName, modelPath string) (Provider, error)
func WithBERT(modelName, modelPath string) (Provider, error)
func WithCLIP(modelName, modelPath string) (MultimodalProvider, error)

BatchProcessor

Achieves high-throughput embedding of large text corpora through an in-memory text-hash cache and a concurrent dispatch strategy.

func NewBatchProcessor(provider Provider, options BatchOptions) *BatchProcessor
// ProcessWithProgress executes the batch, invoking the callback to report progress
func (bp *BatchProcessor) ProcessWithProgress(ctx context.Context, texts []string, callback ProgressCallback) ([][]float32, error)
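
A batch-usage sketch; the ProgressCallback signature and the BatchOptions fields are assumptions:

```go
// Sketch: batch embedding with a progress callback.
bp := embedding.NewBatchProcessor(provider, embedding.BatchOptions{})
vecs, err := bp.ProcessWithProgress(ctx, texts,
    func(done, total int) { // callback signature assumed
        fmt.Printf("\rembedded %d/%d", done, total)
    })
if err != nil {
    log.Fatal(err)
}
_ = vecs
```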

Package pipeline (Workflow Engine)

Provides a high-level workflow orchestration layer for LLM applications, built on generics for full type safety.

Pipeline[T]

The core pipeline driver.

type Pipeline[T any] struct { ... }

// Initialize type-safe Pipeline
func New[T any]() *Pipeline[T]

// Assemble processing steps
func (p *Pipeline[T]) AddStep(step Step[T]) *Pipeline[T]

// Execute runs the steps in order, blocking until done; it fails fast on the first error
func (p *Pipeline[T]) Execute(ctx context.Context, state T) error

Interface Step[T] & Hook[T]

// All processing components must implement this interface
type Step[T any] interface {
    Name() string
    Execute(ctx context.Context, state T) error
}

// Hook intercepts the step lifecycle for cross-cutting (AOP-style) monitoring
type Hook[T any] interface {
    OnStepStart(ctx context.Context, step Step[T], state T)
    OnStepError(ctx context.Context, step Step[T], state T, err error)
    OnStepComplete(ctx context.Context, step Step[T], state T)
}
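
Putting the pieces together, a custom Step over a pointer state type might look like this (a sketch; T is instantiated as a pointer so steps can mutate shared state):

```go
// Sketch: a self-defined state struct and a Step that mutates it.
type AppState struct {
    Input, Output string
}

type UppercaseStep struct{}

func (UppercaseStep) Name() string { return "uppercase" }

func (UppercaseStep) Execute(ctx context.Context, s *AppState) error {
    s.Output = strings.ToUpper(s.Input)
    return nil
}

func run() error {
    p := pipeline.New[*AppState]().AddStep(UppercaseStep{})
    return p.Execute(context.Background(), &AppState{Input: "hello"})
}
```

Because Execute fails fast, any step returning a non-nil error aborts the remaining steps.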

Control Flow

Built-in helper Steps for conditional branching and bounded looping.

func NewIf[T any](name string, cond func(context.Context, T) bool, thenStep, elseStep Step[T]) *IfStep[T]
func NewLoop[T any](name string, cond func(context.Context, T) bool, body Step[T], maxLoops int) *LoopStep[T]
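
For example (a sketch; FetchStep and CacheHitStep are hypothetical Step[*FetchState] implementations, and FetchState is a hypothetical state type defined here for illustration):

```go
type FetchState struct{ Output string }

// Retry the (hypothetical) FetchStep while Output is still empty,
// running the body at most 3 times.
retry := pipeline.NewLoop("retry-fetch",
    func(ctx context.Context, s *FetchState) bool { return s.Output == "" },
    FetchStep{}, // hypothetical Step[*FetchState]
    3,
)

// Branch: skip fetching when a cached result is already present.
branch := pipeline.NewIf("use-cache-or-fetch",
    func(ctx context.Context, s *FetchState) bool { return s.Output != "" },
    CacheHitStep{}, // hypothetical Step[*FetchState]
    retry,
)

err := pipeline.New[*FetchState]().AddStep(branch).
    Execute(context.Background(), &FetchState{})
```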