Generating Your Application Layer from Domain Models

7 min read
ddd codegen architecture kotlin cqrs

If your REST endpoint and your MCP tool behave differently for the same operation, you have a bug. Not a feature gap, not a documentation issue — a bug. Two humans wrote two translations of the same domain operation and got two different answers. This happens constantly in hand-written application layers, and it is entirely preventable.

I spent too long maintaining application code that should never have been hand-written in the first place. This post explains why I now generate the entire application layer from a single domain model, and how the codegen system works.

The problem with hand-written application layers

A typical DDD application has a clean domain layer — aggregates, commands, events, all carefully modeled. Then someone needs to expose it over HTTP. They write a controller. They write DTOs. They write mappers. They make judgment calls: should this be a PATCH or a PUT? What is the URL structure? How do I represent this nested sealed class in JSON?

Now someone needs an MCP integration so AI agents can interact with the system. They write tool definitions. They write more DTOs. They write more mappers. They make the same judgment calls again, slightly differently, because they are a different person or the same person on a different day.

Six months later, the REST API accepts null for a field that the MCP tool rejects. A GraphQL mutation returns a different error shape than the REST endpoint. A TypeScript client has a stale enum value. Nobody notices until a user files a bug report.

The application layer is not where your business logic lives. It is a translation layer — technology-specific plumbing that maps protocol concepts (HTTP methods, JSON schemas, tool definitions) onto domain concepts (commands, queries, views). Translation layers should not contain decisions. They should contain conventions applied mechanically. That is what machines are for.

The principle

The application layer is a technology-specific lens into the domain. It has no logic. No special cases. No opinions that differ from the domain model. Every protocol — REST, MCP, GraphQL, Protobuf — should be a deterministic function of the domain model.

If you accept this principle, the architecture follows naturally: parse the domain into an intermediate representation, then run generators that produce protocol-specific code from that IR.

The domain intermediate representation

The codegen system starts with DomainModelParser, which introspects Kotlin sealed classes at compile time and produces a DomainModel — the IR that every generator consumes.

data class DomainModel(
    val name: String,
    val packageName: String,
    val aggregates: Map<String, AggregateMetadata>,
    val queries: List<QueryMetadata>,
    val views: List<ViewMetadata>,
    val sagaStates: List<SagaStateMetadata>,
)

Each aggregate carries its commands and events, with semantic analysis already applied:

data class CommandMetadata(
    val name: String,
    val kClass: KClass<*>,
    val aggregateName: String,
    val verb: String,              // "Create", "Rename", "Complete"
    val targetProperty: String?,   // "DueDate" for UpdateDueDate, null for CompleteTask
    val hasSideEffects: Boolean,
    val parameters: List<ParameterMetadata>,
    val isCreate: Boolean,
    val isDelete: Boolean,
    val isModification: Boolean,
    val isAction: Boolean,
    val isFullReplacement: Boolean,
    val isPartialModification: Boolean,
)

The parser does real semantic analysis. It extracts the verb from a command name, determines what property it targets, and classifies the command into a category that drives protocol mapping. CreateTask is identified as a creation command. ReplaceTaskName is identified as a partial modification targeting the name property. ClarifyTask is identified as a domain action. This classification is derived from naming conventions, not configuration files.
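To make the idea concrete, the classification step can be sketched as a pure function of the command name. This is an illustration of the convention, not the parser's actual internals — `classify`, `CommandKind`, and the name-stripping heuristics here are all hypothetical:

```kotlin
// Hypothetical sketch of name-driven command classification.
enum class CommandKind { CREATE, DELETE, FULL_REPLACEMENT, PARTIAL_MODIFICATION, ACTION }

fun classify(commandName: String, aggregateName: String): Pair<String, CommandKind> {
    // Strip the aggregate name so "ReplaceTaskName" reads as "ReplaceName" and
    // "AddContextToTask" as "AddContext" (simplified handling of To-phrases).
    val stripped = commandName.replace(aggregateName, "").removeSuffix("To")
    // The leading PascalCase word is the verb; the remainder is the target property.
    val verb = Regex("^[A-Z][a-z]+").find(stripped)?.value ?: stripped
    val target = stripped.removePrefix(verb).ifEmpty { null }
    val kind = when {
        verb == "Create" -> CommandKind.CREATE
        verb == "Delete" -> CommandKind.DELETE
        verb == "Replace" && target == null -> CommandKind.FULL_REPLACEMENT
        verb in setOf("Replace", "Update", "Add", "Remove") -> CommandKind.PARTIAL_MODIFICATION
        else -> CommandKind.ACTION  // domain verbs like Clarify, Complete, Trash
    }
    return verb to kind
}
```

Under these simplified rules, `CreateTask` classifies as a creation, `ReplaceTaskName` as a partial modification targeting `Name`, and `ClarifyTask` as a domain action — matching the behavior described above.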

The parser is invoked with the sealed class roots of your domain:

val domainModel = DomainModelParser().parseDomain(
    domainName = "GTD",
    packageName = "com.example.gtd",
    aggregateInterface = GtdAggregate::class,
    commandInterface = GtdCommand::class,
    eventInterface = GtdEvent::class,
    readerInterface = GtdReader::class,
    viewInterface = GtdView::class,
)

From those five entry points, the parser walks the sealed class hierarchies, extracts every command, event, query, and view, and produces the complete IR. The domain author never writes a mapping file or an annotation (beyond the standard @CreationCommand for non-obvious creation commands). The naming conventions carry all the information the generators need.
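The walk itself is straightforward with Kotlin reflection. Here is a minimal sketch of collecting the concrete leaves of a sealed hierarchy (it assumes `kotlin-reflect` on the classpath; the real parser also reads constructor parameters and annotations from each leaf):

```kotlin
import kotlin.reflect.KClass

// Minimal sketch: recursively collect the concrete leaves of a sealed hierarchy.
fun leavesOf(root: KClass<*>): List<KClass<*>> =
    if (root.sealedSubclasses.isEmpty()) listOf(root)
    else root.sealedSubclasses.flatMap { leavesOf(it) }

// Illustrative hierarchy, mirroring the article's naming style.
sealed interface TaskCommand
data class CreateTask(val name: String) : TaskCommand
data class CompleteTask(val taskId: String) : TaskCommand

fun main() {
    // Each leaf becomes one CommandMetadata entry in the IR.
    println(leavesOf(TaskCommand::class).mapNotNull { it.simpleName })
}
```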

The generator interface

Every generator implements a single interface:

interface Generator {
    fun generate(domainModel: DomainModel, config: GeneratorConfig)
}

data class GeneratorConfig(
    val kotlinSourceOutputDir: File,
    val resourceOutputDir: File,
    val packageName: String,
    val managerClassName: String,
    val generateTests: Boolean = false,
    val additionalProperties: Map<String, Any> = emptyMap(),
)

The system currently has generators for:

  • RestApiGenerator — OpenAPI 3.1.0 spec, Ktor route handlers, request/response DTOs, DTO mappers
  • MCPGenerator — MCP tool definitions, tool handlers, MCP-optimized DTOs, DTO mappers
  • GraphQLGenerator — GraphQL schema from domain IR
  • ProtobufGenerator — Protocol Buffer definitions from domain IR
  • TypeScriptApiGenerator — TypeScript API client, service classes, command types, enum types, view types, reader types

Each generator reads the same DomainModel. Each produces a complete, working integration for its protocol. None of them contain business logic.
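Because every generator shares one interface, the whole build step collapses into a single loop. The sketch below uses simplified stand-ins for the article's types to show the shape of that driver; the recording generator is purely illustrative:

```kotlin
import java.io.File

// Simplified stand-ins for the article's types, just to show the driver loop.
data class DomainModel(val name: String)
data class GeneratorConfig(val kotlinSourceOutputDir: File, val packageName: String)

interface Generator {
    fun generate(domainModel: DomainModel, config: GeneratorConfig)
}

// A stand-in generator that records what it was asked to produce.
class RecordingGenerator(
    private val protocol: String,
    private val log: MutableList<String>,
) : Generator {
    override fun generate(domainModel: DomainModel, config: GeneratorConfig) {
        log += "$protocol code for ${domainModel.name}"
    }
}

fun runPipeline(model: DomainModel, config: GeneratorConfig): List<String> {
    val log = mutableListOf<String>()
    // One IR in, every protocol out: adding a protocol means adding a list entry.
    listOf("REST", "MCP", "GraphQL", "Protobuf", "TypeScript")
        .map { RecordingGenerator(it, log) }
        .forEach { it.generate(model, config) }
    return log
}
```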

How REST generation works

The RestApiGenerator orchestrates two sub-generators:

class RestApiGenerator : Generator {
    private val openApiGenerator = RestOpenApiGenerator()
    private val controllerGenerator = RestControllerGenerator()

    override fun generate(domainModel: DomainModel, config: GeneratorConfig) {
        // Generate and write the OpenAPI specification
        // (used for docs and client generation)
        val openApiSpec = openApiGenerator.generateOpenApiSpec(domainModel)
        config.resourceOutputDir.resolve("api.yaml").writeText(openApiSpec)

        // Generate Ktor routing directly from the domain IR
        controllerGenerator.generateControllers(domainModel, config)
    }
}

From a single DomainModel, this produces:

  1. An OpenAPI 3.1.0 spec (api.yaml) for documentation and client generation
  2. Ktor route handlers with complete implementations
  3. Request/response DTOs
  4. DTO mappers between domain types and transport types

The OpenAPI spec and the Ktor routes are generated from the same IR in the same build step. They cannot diverge.
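The DTOs and mappers in that list are deliberately boring. A sketch of the shape they take — the names here are illustrative, not the generator's actual output:

```kotlin
import java.time.LocalDate

// Domain-side view (simplified stand-in).
data class TaskView(val id: String, val name: String, val dueDate: LocalDate?)

// Generated transport DTO: primitives only, ready for JSON serialization.
data class TaskResponseDto(val id: String, val name: String, val dueDate: String?)

// Generated mapper: a mechanical, decision-free translation.
fun TaskView.toResponseDto() = TaskResponseDto(
    id = id,
    name = name,
    dueDate = dueDate?.toString(),  // ISO-8601, e.g. "2025-01-15"
)
```

Nothing in the mapper is a judgment call; every line follows from the view's type signature, which is exactly why a machine can write it.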

Convention over configuration

The most important design decision in the codegen system is how command names map to HTTP semantics. The ConventionValidator encodes these rules:

Create       -> POST   /{resource}
Delete       -> DELETE /{resource}/{id}
Replace      -> PUT    /{resource}/{id}        (full replacement)
Replace{Prop}-> PATCH  /{resource}/{id}        (RFC 6902: replace /prop)
Update{Agg}  -> PATCH  /{resource}/{id}        (RFC 6902: replace /)
Add{X}To{Y}  -> PATCH  /{resource}/{id}        (RFC 6902: add /x)
Remove{X}    -> PATCH  /{resource}/{id}        (RFC 6902: remove /x)
{DomainVerb} -> POST   /{resource}/{id}/{verb} (domain actions)

Partial modification commands are consolidated into a single PATCH endpoint per aggregate, using RFC 6902 JSON Patch format. The path and operation are derived mechanically from the command name:

val CommandMetadata.rfc6902Op: String
    get() = when (verb) {
        "Replace", "Update" -> "replace"
        "Add" -> "add"
        "Remove" -> "remove"
        else -> throw IllegalStateException("Not a PATCH command: $name")
    }

This means if you define a command called AddContextToTask, the codegen system knows:

  • This is a partial modification (PATCH)
  • The RFC 6902 operation is add
  • The RFC 6902 path is /context
  • It belongs on the consolidated PATCH endpoint for the Task aggregate

You did not configure any of that. You named your command according to the ubiquitous language of your domain, and the conventions extracted the protocol semantics.
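The path half of that derivation can be sketched the same way as the `rfc6902Op` extension above. This is a hypothetical companion, assuming `targetProperty` holds the PascalCase property name from the IR:

```kotlin
// Hypothetical companion to rfc6902Op: derive the JSON Pointer path from the
// command's target property ("Context" -> "/context"). A whole-aggregate
// Update{Agg} command has no target property and patches the root, "/".
fun rfc6902Path(targetProperty: String?): String =
    targetProperty?.let { prop -> "/" + prop.replaceFirstChar { c -> c.lowercase() } } ?: "/"
```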

Domain-specific verbs — Clarify, Complete, Trash, Delegate — are not special-cased. They all become POST /{resource}/{id}/{action}. The domain gets to define its own vocabulary without fighting the framework.

The MCP side of the same coin

The MCPGenerator consumes the identical DomainModel and produces MCP tool definitions. It generates two tools per domain — a read tool exposing all queries and a write tool exposing all commands — each with proper JSON Schema oneOf discriminators so LLMs can understand the available operations.

The generated MCP handlers call the same domain service that the REST handlers call. The DTOs are different (MCP-optimized for LLM consumption, with an Mcp prefix), but they map from the same domain views. The behavior is identical because the source of truth is the same IR.
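To make the oneOf discriminator concrete, here is roughly the shape of input schema a write tool might carry, sketched as a Kotlin map. The field names and command variants are assumptions for illustration, not the generator's actual output:

```kotlin
// Sketch of a write-tool input schema: one oneOf branch per command,
// discriminated by a constant "command" field so an LLM can pick an operation.
val writeToolSchema: Map<String, Any> = mapOf(
    "type" to "object",
    "oneOf" to listOf(
        mapOf(
            "properties" to mapOf(
                "command" to mapOf("const" to "CreateTask"),
                "name" to mapOf("type" to "string"),
            ),
            "required" to listOf("command", "name"),
        ),
        mapOf(
            "properties" to mapOf(
                "command" to mapOf("const" to "CompleteTask"),
                "taskId" to mapOf("type" to "string"),
            ),
            "required" to listOf("command", "taskId"),
        ),
    ),
)
```

Because each branch is generated from the same CommandMetadata that drives the REST routes, the set of operations an agent sees is exactly the set a REST client sees.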

The payoff

When I add a new command to the domain — say, ArchiveTask — I add the sealed subclass to TaskCommand, implement the business logic in the domain service, and rebuild. The codegen system:

  1. Detects Archive as a domain action verb
  2. Generates POST /tasks/{id}/archive in the OpenAPI spec
  3. Generates the corresponding Ktor route handler
  4. Generates an MCP tool variant for the write tool
  5. Generates the TypeScript client method
  6. Generates the GraphQL mutation
  7. Generates the Protobuf service definition

Every protocol gains the capability simultaneously. There is zero drift between them because there is zero hand-written translation code. The application layer is a derived artifact, not an authored one.
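For concreteness, the only hand-authored artifact in that whole list is the command itself — something like the following, where the `TaskCommand` and `TaskId` shapes are assumed from the article's examples:

```kotlin
// The one hand-written piece: a new leaf in the domain's sealed hierarchy.
sealed interface TaskCommand {
    val taskId: TaskId
}

@JvmInline
value class TaskId(val value: String)

// "Archive" is not Create/Delete/Replace/Update/Add/Remove, so the conventions
// classify it as a domain action: POST /tasks/{id}/archive, an MCP write-tool
// variant, a GraphQL mutation, and so on — all derived, none authored.
data class ArchiveTask(override val taskId: TaskId) : TaskCommand
```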

This is what I mean when I say the application layer should have no logic. If your REST controller is making decisions that your MCP handler does not make, one of them is wrong. If they are both generated from the same model, they are both right by construction.

The domain is where the thinking happens. The application layer is where the conventions happen. Keep them separate, and let the machine handle the conventions.