ai-mocks

Project URL: mokksy/ai-mocks
Introduction: AI-Mocks is a Kotlin-based mock server toolkit that brings service virtualization to both HTTP/SSE and LLM APIs — think WireMock meets local OpenAI/Anthropic/Gemini/A2A testing, but with real streaming and Server-Sent Events support.


AI-Mocks is a family of mock LLM (Large Language Model) servers built on Mokksy, a mock server inspired by WireMock that supports response streaming and Server-Sent Events (SSE). It is designed for building, testing, and mocking LLM responses during development.
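The core stubbing idea behind such a mock server — match an incoming request against predicates and answer with a canned response — can be sketched in plain Kotlin. This is a toy illustration only; the types and function names below are assumptions for this sketch, not the actual Mokksy/AI-Mocks API:

```kotlin
// Toy model of predicate-based request stubbing (not the real Mokksy API).
data class Request(val path: String, val body: String)

class StubServer {
    // Each stub pairs a request matcher with a response producer.
    private val stubs = mutableListOf<Pair<(Request) -> Boolean, (Request) -> String>>()

    fun stub(matcher: (Request) -> Boolean, respond: (Request) -> String) {
        stubs += matcher to respond
    }

    // First matching stub wins; unmatched requests get a 404-style answer.
    fun handle(request: Request): String =
        stubs.firstOrNull { (matches, _) -> matches(request) }
            ?.second?.invoke(request)
            ?: "404: no stub matched ${request.path}"
}

fun main() {
    val server = StubServer()
    server.stub({ it.path == "/v1/chat/completions" && "Hello" in it.body }) {
        """{"choices":[{"message":{"content":"Hi there!"}}]}"""
    }
    println(server.handle(Request("/v1/chat/completions", """{"prompt":"Hello"}""")))
}
```

The real library layers request matching, HTTP semantics, and streaming on top of this basic "matcher plus canned response" model.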


Mokksy

Mokksy is a mock HTTP server built with Kotlin and Ktor. It addresses the limitations of WireMock by supporting true SSE and streaming responses, making it extremely useful for integration testing of LLM clients.
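Server-Sent Events are a simple line-oriented text protocol, which is why faithfully reproducing the framing matters for streaming LLM clients. A minimal sketch of the wire format Mokksy emulates — the helper below is illustrative, not part of the Mokksy API:

```kotlin
// Format one SSE frame: optional "event:" line, one or more "data:" lines,
// and a blank-line terminator, per the SSE wire format.
fun sseFrame(data: String, event: String? = null): String = buildString {
    if (event != null) append("event: ").append(event).append('\n')
    // Multi-line payloads become multiple data: lines.
    data.lineSequence().forEach { append("data: ").append(it).append('\n') }
    append('\n')
}

fun main() {
    // A stream of three frames, as an LLM mock might emit token chunks.
    print(sseFrame("""{"delta":"Hel"}"""))
    print(sseFrame("""{"delta":"lo"}"""))
    print(sseFrame("[DONE]"))
}
```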

AI-Mocks

AI-Mocks is a set of specialized mock server implementations (e.g., mocking OpenAI API) built using Mokksy.

It supports mocking the following AI services:

  1. OpenAI - ai-mocks-openai
  2. Anthropic - ai-mocks-anthropic
  3. Google VertexAI Gemini - ai-mocks-gemini
  4. Ollama - ai-mocks-ollama
  5. Agent-to-Agent (A2A) Protocol - ai-mocks-a2a
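Streamed chat completions from these APIs arrive as a sequence of chunk payloads, each carrying a small delta of the final text, with OpenAI-compatible streams terminated by a `[DONE]` sentinel — this is the shape the mocks reproduce. A simplified, illustrative chunker (real chunks carry more fields, such as `id`, `model`, and per-choice metadata):

```kotlin
// Split a full assistant reply into OpenAI-style streaming chunk payloads.
// Real chunks have a richer schema; this keeps only the delta text.
fun toStreamChunks(reply: String, chunkSize: Int = 4): List<String> {
    val chunks = reply.chunked(chunkSize).map { piece ->
        """{"choices":[{"delta":{"content":"$piece"}}]}"""
    }
    return chunks + "[DONE]" // OpenAI-compatible streams end with the [DONE] sentinel
}

fun main() {
    // Each payload would be sent to the client as one SSE "data:" frame.
    toStreamChunks("Hello, world!").forEach { println("data: $it") }
}
```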

Feature Support Matrix

| Feature          | OpenAI    | Anthropic | Gemini | Ollama   | A2A |
|------------------|-----------|-----------|--------|----------|-----|
| Chat Completions | ✅        | ✅        | ✅     | ✅       | –   |
| Streaming        | ✅        | ✅        | ✅     | ✅       | ✅  |
| Embeddings       | ✅        | –         | –      | ✅       | –   |
| Moderation       | ✅        | –         | –      | –        | –   |
| Additional APIs  | Responses | –         | –      | Generate | Full A2A Protocol (11 endpoints) |

How to build

To build the project locally:

./gradlew build

or using Make:

make

Contributing

Contributions are welcome! Please see the Contributing Guidelines for details.
