Templates Overview

Templates are the source code blueprints that blissful-infra start copies and customises to create your project. Each template is a complete, production-ready starting point for a specific technology stack.

Templates live in packages/cli/templates/ in the blissful-infra repository:

packages/cli/templates/
├── spring-boot/ # Kotlin + Spring Boot backend
├── react-vite/ # React + Vite frontend
├── fastapi/ # Python + FastAPI backend
├── express/ # Node.js + Express backend
├── go-chi/ # Go + Chi backend
├── nextjs/ # Next.js frontend
├── loki/ # Log aggregation config
├── prometheus/ # Metrics scrape config
├── grafana/ # Pre-provisioned dashboards
├── jenkins/ # Jenkins server configuration
└── plugins/
└── ai-pipeline/ # AI/ML data platform plugin

Template files use {{PROJECT_NAME}} as a placeholder. When you run blissful-infra start my-app, every occurrence of {{PROJECT_NAME}} in every file is replaced with my-app. This affects:

  • Spring Boot application properties (spring.application.name)
  • Gradle project name and Docker image tags
  • Docker Compose container names and volume names
  • Database name and credentials
  • Kafka consumer group IDs
  • Nginx configuration
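The substitution step described above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the actual blissful-infra implementation; `render_template` is a hypothetical helper name:

```python
def render_template(text: str, project_name: str) -> str:
    """Replace every {{PROJECT_NAME}} placeholder with the chosen name."""
    return text.replace("{{PROJECT_NAME}}", project_name)

# Example: two lines as they might appear in a template file.
rendered = render_template(
    "spring.application.name={{PROJECT_NAME}}\n"
    "container_name: {{PROJECT_NAME}}-postgres\n",
    "my-app",
)
# Every occurrence is replaced, so no placeholder survives in the output.
```

The same pass runs over every text file in the template tree, which is why the project name shows up consistently in Gradle, Docker Compose, Kafka, and Nginx config.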

Templates also support conditional blocks based on the --database flag:

{{#IF_POSTGRES}}
# Included when --database is 'postgres' or 'postgres-redis'
spring.datasource.url=jdbc:postgresql://postgres:5432/{{PROJECT_NAME}}
{{/IF_POSTGRES}}
{{#IF_REDIS}}
# Included when --database is 'redis' or 'postgres-redis'
spring.data.redis.url=${REDIS_URL:redis://localhost:6379}
{{/IF_REDIS}}
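One way to process conditional blocks like these is a regex pass that keeps or drops each delimited region. This is a hedged sketch of the technique, not blissful-infra's actual code; `apply_conditionals` is a hypothetical helper:

```python
import re

def apply_conditionals(text: str, database: str) -> str:
    """Keep or strip {{#IF_X}}...{{/IF_X}} blocks based on --database."""
    flags = {
        "IF_POSTGRES": database in ("postgres", "postgres-redis"),
        "IF_REDIS": database in ("redis", "postgres-redis"),
    }
    for name, keep in flags.items():
        pattern = re.compile(
            r"\{\{#" + name + r"\}\}\n?(.*?)\{\{/" + name + r"\}\}\n?",
            re.DOTALL,
        )
        # Keep the inner content (without markers) or drop the whole block.
        text = pattern.sub(lambda m: m.group(1) if keep else "", text)
    return text
```

With `database="postgres"`, the Postgres block survives (minus its markers) and the Redis block is removed entirely, so the rendered properties file contains only the relevant settings.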

Binary files (images, compiled assets, JARs) are copied as-is without substitution.

Backend templates:

| Template | Language | Framework | Features |
| --- | --- | --- | --- |
| spring-boot | Kotlin | Spring Boot 3 | Kafka producer/consumer, WebSockets, JPA, Flyway, Actuator, OpenTelemetry |
| fastapi | Python | FastAPI | Kafka consumer, WebSockets, async handlers, Pydantic models |
| express | TypeScript | Express | Kafka producer/consumer, WebSockets, TypeScript strict mode |
| go-chi | Go | Chi | Kafka consumer, WebSockets, structured logging |
Frontend templates:

| Template | Language | Framework | Features |
| --- | --- | --- | --- |
| react-vite | TypeScript | React + Vite | TailwindCSS, WebSocket client, chat UI, hot reload |
| nextjs | TypeScript | Next.js | App Router, TailwindCSS, WebSocket client |

These are always included and are not selectable:

| Template | Purpose |
| --- | --- |
| loki | Loki log aggregation config + Promtail Docker socket scraper |
| prometheus | Prometheus config scraping backend:8080/actuator/prometheus |
| grafana | Datasource provisioning + 3 pre-built dashboards |

All backend templates generate a working chat application to demonstrate the stack. When you first open your app, you can send messages through the React frontend and see them:

  1. Sent from the frontend to the backend via WebSocket
  2. Published to a Kafka topic by the backend
  3. Consumed by a Kafka listener in the backend
  4. Broadcast back to all connected WebSocket clients
  5. Persisted to Postgres (with postgres or postgres-redis)
  6. Served from the Redis cache on subsequent page loads (with postgres-redis)
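The round trip above can be modelled with in-memory queues standing in for the Kafka topic and the WebSocket connections. This is a toy sketch of the message flow only, with no relation to the generated application code:

```python
import asyncio

async def chat_round_trip(message: str) -> list[str]:
    """Model the chat app's publish/consume/broadcast path in memory."""
    kafka_topic = asyncio.Queue()                   # stands in for the Kafka topic
    clients = [asyncio.Queue() for _ in range(2)]   # two connected WebSocket clients

    # Steps 1-2: the frontend sends over WebSocket; the backend publishes to Kafka.
    await kafka_topic.put(message)

    # Steps 3-4: the backend's Kafka listener consumes the event and
    # broadcasts it to every connected client.
    event = await kafka_topic.get()
    for client in clients:
        await client.put(event)

    # Each client receives the broadcast copy.
    return [await client.get() for client in clients]

received = asyncio.run(chat_round_trip("hello"))
```

Persistence (step 5) and cache reads (step 6) would hang off the same consume step in the real backend; the point here is just that one published message fans out to every open connection.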

This means you can observe Kafka message flow, cache hit/miss patterns in Grafana, and distributed traces in Jaeger — all from a working app — before writing any code.

You can modify the template files directly if you are working on blissful-infra itself (see blissful-infra dev --templates).

For project-specific customisation, edit the generated files in your project directory. They are real files you own — blissful-infra does not re-generate or overwrite them after start.

Plugins extend the generated project with additional services. Unlike templates, plugins are additive — they add new containers to docker-compose.yaml and new directories to your project.

| Plugin | What it adds |
| --- | --- |
| ai-pipeline | FastAPI + scikit-learn classifier consuming Kafka events. Co-deploys ClickHouse (columnar store), MLflow (experiment tracking), and Mage (visual pipeline orchestrator). |
| agent-service | Claude-powered agent service with workspace access. Reads logs, runs tools, and responds to structured task requests via HTTP API. |
| scraper | Scrapy-based web scraper that publishes articles to a Kafka topic for downstream processing. |

Enable plugins at creation time:

blissful-infra start my-app --plugins ai-pipeline
blissful-infra start my-app --plugins ai-pipeline,agent-service