Convert tarot-service from blocking JPA to reactive R2DBC while maintaining existing WebFlux controllers and API contracts.

Changes:
- Replace spring-boot-starter-data-jpa with spring-boot-starter-data-r2dbc
- Remove kotlin-jpa plugin (R2DBC uses data classes)
- Convert entities to data classes with @Table/@Id/@Column annotations
- Replace JpaRepository with R2dbcRepository (native reactive queries)
- Add ArcanaTypeRepository for FK lookups (Card -> ArcanaType)
- Update TarotService to use fully reactive repository methods
- Update CardMapper to accept an explicit ArcanaType parameter
- Configure R2DBC + Flyway (JDBC for migrations) in tests
- All 12 tests pass

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Replace spring-boot-starter-web with spring-boot-starter-webflux
- Replace spring-boot-starter-data-jpa with spring-boot-starter-data-r2dbc
- Convert JPA entities (Role, User) to R2DBC data classes with @Table
- Convert JpaRepository to R2dbcRepository with reactive return types
- Update services to return Mono/Flux and wrap Feign calls with Mono.fromCallable
- Convert controllers to return Mono<ResponseEntity<T>>
- Replace GatewayAuthenticationFilter (OncePerRequestFilter) with GatewayAuthenticationWebFilter (WebFilter)
- Convert SecurityConfig to use @EnableWebFluxSecurity and @EnableReactiveMethodSecurity
- Update GlobalExceptionHandler for WebFlux (WebExchangeBindException, ServerWebExchange)
- Convert tests to use WebTestClient and StepVerifier

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Update ORM description: all services now use Spring Data R2DBC
- Update architecture table: user-service and tarot-service use WebFlux + R2DBC
- Remove JPA lazy loading behavior example (no longer applicable)
- Expand reactive programming section to cover all services
- Add reactive security notes for user-service
- Add testing notes for WebTestClient and StepVerifier

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
After the reactive migration, two issues caused e2e test failures:
1. user-service: missing HttpMessageConverters bean
   - Feign's SpringDecoder requires this bean, which Spring MVC provides automatically but WebFlux doesn't
   - Added FeignConfiguration.kt to provide it manually
2. divination-service: blocking Feign calls on reactor threads
   - Feign's BlockingLoadBalancerClient calls .block() when resolving services via Eureka, which fails on reactor threads
   - Refactored SpreadMapper and InterpretationMapper to return Mono<T> and wrap Feign calls with subscribeOn(Schedulers.boundedElastic())
   - Updated DivinationService to use flatMap for async mapper calls
Also added error logging to GlobalExceptionHandler in both services.
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
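The boundedElastic offloading fix above can be illustrated without Reactor: the same idea (run the blocking call on a dedicated worker pool so the event-loop threads stay free) sketched with plain java.util.concurrent. Here fetchCardBlocking is a hypothetical stand-in for a blocking Feign client call, not the project's actual code.

```kotlin
import java.util.concurrent.CompletableFuture
import java.util.concurrent.Executors

// Hypothetical blocking call standing in for a Feign client method.
fun fetchCardBlocking(id: Long): String {
    Thread.sleep(50) // simulates network latency of the blocking HTTP call
    return "card-$id"
}

fun main() {
    // Dedicated pool for blocking work, analogous to Schedulers.boundedElastic():
    // the caller's thread is never blocked by the HTTP round trip itself.
    val blockingPool = Executors.newFixedThreadPool(4)
    val result = CompletableFuture
        .supplyAsync({ fetchCardBlocking(42L) }, blockingPool)
        .join()
    println(result) // card-42
    blockingPool.shutdown()
}
```

In the Reactor version, the same shape becomes `Mono.fromCallable { client.fetchCard(id) }.subscribeOn(Schedulers.boundedElastic())`.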
Reorganize tarot-service following Clean Architecture principles:

Domain layer:
- domain/model/Card, ArcanaType, LayoutType - pure domain models

Application layer:
- application/interfaces/repository/CardRepository, LayoutTypeRepository
- application/service/TarotService - returns domain models

Infrastructure layer:
- persistence/entity/*Entity - R2DBC @Table classes
- persistence/repository/SpringData*Repository - Spring Data interfaces
- persistence/R2dbc*Repository - implements application interfaces
- persistence/mapper/*EntityMapper - entity ↔ domain mapping

API layer:
- api/controller/*Controller - REST endpoints
- api/mapper/*DtoMapper - domain ↔ DTO mapping

All 12 tests pass.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
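The entity ↔ domain split above can be sketched in a few lines; field names here are hypothetical, and the R2DBC annotations (@Table, @Id) that the real entity classes carry are omitted so the example stays framework-free.

```kotlin
// Domain layer: pure model, no persistence concerns.
data class Card(val id: Long, val name: String, val arcana: String)

// Infrastructure layer: persistence-shaped entity
// (in the real code, an R2DBC @Table class with a generated @Id).
data class CardEntity(val id: Long?, val name: String, val arcanaTypeId: Long)

// Infrastructure mapper: entity -> domain, resolving the FK to a value
// (the real code looks the ArcanaType up via its own repository).
fun CardEntity.toDomain(arcanaName: String): Card =
    Card(id = requireNotNull(id), name = name, arcana = arcanaName)

fun main() {
    val entity = CardEntity(id = 1L, name = "The Fool", arcanaTypeId = 1L)
    println(entity.toDomain("MAJOR")) // Card(id=1, name=The Fool, arcana=MAJOR)
}
```

The point of the split: the domain model never changes when the persistence technology does, which is what made the JPA → R2DBC swap local to the infrastructure layer.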
- Create domain layer: User, Role, RoleType models (pure Kotlin data classes)
- Create application layer:
- Interfaces: UserRepository, RoleRepository, DivinationServiceProvider,
PasswordEncoder, TokenProvider
- Services: UserService with authentication and user management
- Create infrastructure layer:
- Persistence: R2DBC entities, Spring Data repositories, entity mappers,
R2dbcUserRepository, R2dbcRoleRepository implementations
- External: FeignDivinationServiceProvider
- Security: JwtTokenProvider, SpringPasswordEncoder, GatewayAuthenticationWebFilter
- Create API layer: Controllers and UserDtoMapper
- Update all tests for new package structure
- All 48 tests pass
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Apply Clean Architecture pattern with:
- Domain layer: pure business models (Spread, SpreadCard, Interpretation)
- Application layer: repository interfaces, provider interfaces, DivinationService
- Infrastructure layer: R2DBC repositories, Feign providers, security
- API layer: controllers moved to api/controller package

Key changes:
- Extract Feign calls from mappers to provider interfaces
- Pure domain models with no framework annotations
- Repository interfaces in application layer, implementations in infrastructure
- CurrentUserProvider abstracts security context access
- All 41 tests pass

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
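The dependency-inversion idea behind the provider interfaces can be sketched in plain Kotlin; names and signatures below are illustrative, not the project's actual API.

```kotlin
// Application layer owns the abstractions...
interface CurrentUserProvider {
    fun currentUserId(): Long
}

interface SpreadRepository {
    fun findByUserId(userId: Long): List<String>
}

// ...and the service depends only on them, never on Feign, R2DBC, or Spring Security.
class DivinationService(
    private val users: CurrentUserProvider,
    private val spreads: SpreadRepository,
) {
    fun mySpreads(): List<String> = spreads.findByUserId(users.currentUserId())
}

// Infrastructure supplies implementations (in the real code, an R2DBC-backed
// repository and a security-context-backed CurrentUserProvider).
fun main() {
    val service = DivinationService(
        users = object : CurrentUserProvider { override fun currentUserId() = 7L },
        spreads = object : SpreadRepository {
            override fun findByUserId(userId: Long) = listOf("celtic-cross-$userId")
        },
    )
    println(service.mySpreads()) // [celtic-cross-7]
}
```

This is also what makes the service trivially unit-testable with fakes instead of Spring context bootstrapping.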
Add Clean Architecture section documenting:
- Four-layer structure (domain, application, infrastructure, api)
- Three model types (domain, entity, DTO)
- Naming conventions (technology prefix, no Adapter suffix)
- Data flow patterns
- Key design decisions

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Fix Swagger UI location (now centralized at gateway)
- Add missing API endpoints (auth/login, cards/random, layout-types/{id})
- Fix JPA → R2DBC in configuration description
- Update E2E test count (31 → 63)
- Remove verbose sections (coverage paths, pre-commit setup, detailed health checks)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add 3-broker Kafka cluster with KRaft mode (no Zookeeper):
- kafka-1, kafka-2, kafka-3 using confluentinc/cp-kafka:7.5.0
- Ports: 9092, 9093, 9094 (external access)
- Replication: factor=3, min.insync.replicas=2
- kafka-ui for monitoring (port 8090)
- Persistent volumes for each broker

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Update highload-config submodule with Kafka producer configuration:
- Shared producer settings (serializers, acks, idempotence)
- Topic names for user-service, divination-service

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- EventType enum (CREATED, UPDATED, DELETED) for Kafka message headers
- UserEventData for users-events topic
- SpreadEventData for spreads-events topic
- InterpretationEventData for interpretations-events topic

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
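A sketch of what such shared event contracts might look like; only the EventType values are taken from the commit above, while the UserEventData fields are assumptions, not the actual schema.

```kotlin
import java.time.Instant

// Event kind; travels as a Kafka message header rather than in the payload.
enum class EventType { CREATED, UPDATED, DELETED }

// Payload for the users-events topic (illustrative field set).
data class UserEventData(
    val userId: Long,
    val username: String,
    val occurredAt: Instant,
)

fun main() {
    val event = UserEventData(1L, "alice", Instant.parse("2024-01-01T00:00:00Z"))
    println("${EventType.CREATED}: $event")
}
```

Keeping these contracts in a shared module lets producer and consumer services agree on serialization without depending on each other's internals.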
- Add spring-kafka dependency to build.gradle.kts
- Create UserEventPublisher interface in application layer
- Create UserEventDataMapper for domain-to-DTO mapping
- Create KafkaUserEventPublisher implementation with bounded elastic scheduler
- Configure KafkaTemplate with Spring's ObjectMapper for proper JSON serialization

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Inject UserEventPublisher into UserService
- Publish CREATED event after user creation
- Publish UPDATED event after user update
- Publish DELETED event after user deletion
- Change deleteUser from existsById to findById (needed for event data)
- Add UserEventPublisher mock to unit and integration tests

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
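The publish-after-mutation flow above can be sketched with in-memory fakes; the real UserService is reactive and publishes to Kafka, so everything below is purely illustrative.

```kotlin
enum class Event { CREATED, DELETED }

// Application-layer abstraction; the real implementation sends to Kafka.
interface UserEventPublisher {
    fun publish(event: Event, userId: Long)
}

class UserService(private val publisher: UserEventPublisher) {
    private val users = mutableMapOf<Long, String>()

    fun create(id: Long, name: String) {
        users[id] = name
        publisher.publish(Event.CREATED, id) // publish only after the state change
    }

    // Mirrors the existsById -> findById change: the user must be loaded
    // (not just existence-checked) so the event can carry its data.
    fun delete(id: Long) {
        users.remove(id) ?: return
        publisher.publish(Event.DELETED, id)
    }
}

fun main() {
    val published = mutableListOf<Pair<Event, Long>>()
    val service = UserService(object : UserEventPublisher {
        override fun publish(event: Event, userId: Long) { published += event to userId }
    })
    service.create(1L, "alice")
    service.delete(1L)
    println(published) // [(CREATED, 1), (DELETED, 1)]
}
```

The recording fake is the same shape as the publisher mocks the tests use to verify that exactly one event fires per mutation.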
Add spring-kafka dependency and create event publishing infrastructure:
- SpreadEventPublisher/InterpretationEventPublisher interfaces (application layer)
- SpreadEventDataMapper/InterpretationEventDataMapper (infrastructure/messaging/mapper)
- KafkaSpreadEventPublisher/KafkaInterpretationEventPublisher implementations
- KafkaConfig with separate KafkaTemplate beans for spread and interpretation events

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Injected SpreadEventPublisher and InterpretationEventPublisher into DivinationService.

Added event publishing:
- createSpread: publishCreated after spread and spread cards saved
- deleteSpread: publishDeleted after spread deleted
- addInterpretation: publishCreated after interpretation saved
- updateInterpretation: publishUpdated after interpretation saved
- deleteInterpretation: publishDeleted after interpretation deleted

Updated unit tests (DivinationServiceTest) with publisher mocks and verification. Updated integration test base classes (BaseControllerIntegrationTest, BaseIntegrationTest) to mock publishers.

All 41 tests pass.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add KAFKA_BOOTSTRAP_SERVERS environment variable to user-service and divination-service. Add depends_on kafka-1 (healthy) to both services to ensure Kafka is ready before services start. Corrected plan typo: all brokers use port 29092 for internal PLAINTEXT listener (not 29093/29094 as originally stated in plan). Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
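For reference, a dual-listener cp-kafka setup matching the ports above (internal 29092, external 9092) typically looks like the fragment below; this is a sketch with assumed values, not the project's actual compose file, and kafka-2/kafka-3 would differ only in hostname and external port.

```yaml
# kafka-1 environment (sketch)
environment:
  KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092,CONTROLLER://0.0.0.0:29093
  KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka-1:29092,PLAINTEXT_HOST://localhost:9092
  KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT,CONTROLLER:PLAINTEXT
  KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
```

The internal PLAINTEXT listener is what other containers (and the brokers themselves) dial, which is why all three brokers can share port 29092: each is reachable under its own hostname on the Docker network.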
R2DBC save() only returns the generated @Id, not database defaults like created_at. Set createdAt = Instant.now() explicitly before save so the value is available for Kafka event publishing. Also fixes duplicate spring: key in application.yml (highload-config).
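A tiny sketch of the workaround; the hypothetical save() below mimics R2DBC behaviour by filling only the generated id while knowing nothing about database column defaults such as created_at.

```kotlin
import java.time.Instant

// Illustrative entity; the real one is an R2DBC @Table class.
data class SpreadEntity(val id: Long? = null, val createdAt: Instant? = null)

// Hypothetical save(): populates the generated id, ignores DB defaults.
fun save(entity: SpreadEntity): SpreadEntity = entity.copy(id = 1L)

fun main() {
    // Without the fix, createdAt is still null after save():
    check(save(SpreadEntity()).createdAt == null)

    // With the fix, the timestamp is set explicitly before save and so
    // survives the round trip, ready for the Kafka event payload.
    val saved = save(SpreadEntity(createdAt = Instant.now()))
    println(saved.createdAt != null) // true
}
```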
- Add kafka-1/2/3 and kafka-ui to architecture table
- Add Event-Driven Architecture section with topics, events, and message format
- Update Quick Start with Kafka UI link
- Update Running Locally to include Kafka services
- Add Event Streaming to Technology Stack

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add notification-service module to settings.gradle.kts
- Create build.gradle.kts with WebFlux, R2DBC, Kafka consumer, Feign dependencies
- Create Dockerfile following existing service pattern
- Create minimal application.yml pointing to config-server
- Create NotificationServiceApplication with Eureka, Feign, R2DBC, OpenAPI annotations
Add MinIO object storage infrastructure for file attachments:
- MinIO service with S3-compatible API on port 9000
- MinIO Console on port 9001
- Health check with curl to live endpoint
- minio_data volume for persistent storage

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add domain model (FileUpload, FileUploadStatus) and persistence layer (FileUploadEntity, R2dbcFileUploadRepository) following Clean Architecture patterns from notification-service. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add application-test.yml with disabled Eureka/config server
- Add FileUploadServiceTest with 14 unit tests covering:
  - requestUpload (content type validation, PENDING record creation)
  - verifyAndCompleteUpload (ownership, status, expiry, file existence, size)
  - getUploadMetadata, getDownloadUrl, deleteUpload
- Add FileUploadControllerIntegrationTest with 10 integration tests using:
  - TestContainers (PostgreSQL + MinIO)
  - WebTestClient for public and internal endpoints
- Fix build.gradle.kts MinIO TestContainers dependency name

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add FileAttachmentE2ETest with 9 tests for file upload API:
- Presigned URL generation
- Content type validation (400 for invalid types)
- Interpretation without attachment
- Non-existent uploadId rejection
- Delete pending upload
- MinIO upload tests (disabled by default)
- Interpretation with attachment (disabled)
- Cross-user upload rejection (disabled)
- Delete completed upload (disabled)
- Add FilesServiceClient (public Feign client for files-service)
- Add FileUploadDtos (PresignedUploadRequest, PresignedUploadResponse, etc.)
- Add GlobalExceptionHandler to files-service for proper HTTP status codes
- Update BaseE2ETest with FilesServiceClient and vararg assertThrowsWithStatus
- Update application.yml with files-service.url
- Update files-service tests to expect 400 instead of 500 for errors
Note: Tests requiring direct MinIO access are disabled by default because
presigned URLs contain internal Docker hostname ('minio'). To enable:
add '127.0.0.1 minio' to /etc/hosts
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add files-service and file attachment feature documentation to CLAUDE.md:
- MinIO to Technology Stack
- files-service (8085) and minio (9000/9001) to Microservices table
- file_upload and interpretation_attachment tables to Database Schema
- files-service endpoints with upload flow documentation
- files-events topic and FileEventData to Event-Driven Architecture
- Environment variables for MinIO configuration

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Presigned URLs were using the internal Docker hostname (minio:9000), which clients outside Docker cannot resolve. Added a separate MinioClient for presigned URL generation, configured with the external endpoint (localhost:9000).

Changes:
- Add externalEndpoint property to MinioProperties
- Create presignedMinioClient bean with external endpoint and explicit region
- Update MinioFileStorage to use presignedMinioClient for URL generation
- Update integration tests to configure external-endpoint

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add userConsumerFactory and userKafkaListenerContainerFactory beans to enable consuming UserEventData messages from users-events topic. Config submodule updated with users-events topic definition. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Adds Kafka consumer that listens to users-events topic for DELETED events. When a user is deleted, asynchronously cleans up all their spreads and interpretations via DivinationService.deleteUserData(). Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
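The consumer-side dispatch can be sketched framework-free; in the real code this is a Kafka listener on users-events and the cleanup callback is DivinationService.deleteUserData(), so the names below are illustrative.

```kotlin
enum class EventType { CREATED, UPDATED, DELETED }

// Cleanup is injected as a callback; the real consumer delegates to the service.
class UserEventConsumer(private val cleanup: (Long) -> Unit) {
    // Called once per consumed record; eventType arrives as a message header.
    fun onUserEvent(eventType: EventType, userId: Long) {
        if (eventType == EventType.DELETED) cleanup(userId)
    }
}

fun main() {
    val cleaned = mutableListOf<Long>()
    val consumer = UserEventConsumer { cleaned += it }
    consumer.onUserEvent(EventType.CREATED, 1L) // ignored
    consumer.onUserEvent(EventType.DELETED, 1L) // triggers cleanup
    println(cleaned) // [1]
}
```

Because the cleanup now happens asynchronously on the consumer side, callers of the delete endpoint no longer wait for (or couple to) divination-service availability.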
Remove DivinationServiceProvider references from test files:
- UserServiceTest: remove mock, constructor param, verify calls; delete ServiceUnavailableException test
- BaseIntegrationTest: remove @MockBean and doNothing setup
- BaseControllerIntegrationTest: remove @MockBean and doNothing setup

Tests now verify the simplified delete flow (find→delete→publish).

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Remove deleteUserData method from InternalController (getSpreadOwner preserved)
- Delete InternalControllerIntegrationTest (all tests were for deleteUserData)
- Add Kafka consumer config to application-test.yml (group-id, bootstrap-servers)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Remove the unused deleteUserData method from DivinationServiceInternalClient. User data cleanup now happens via Kafka event consumer in divination-service. getSpreadOwner method preserved for notification-service usage. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add awaitCondition and awaitStatus utilities to BaseE2ETest for polling
- Update UserCascadeDeleteE2ETest to use awaitStatus(404) instead of assertThrowsWithStatus(404) for spread deletion checks
- Update test KDoc to reflect Kafka-based async cleanup
- Fix missing spring.kafka.consumer.group-id in divination-service.yml
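A polling helper along the lines of awaitCondition might look like this; a framework-free sketch, not the actual BaseE2ETest code (which asserts HTTP status codes via the Feign clients).

```kotlin
// Polls the condition until it holds or the timeout elapses.
fun awaitCondition(
    timeoutMillis: Long = 5_000,
    pollMillis: Long = 100,
    condition: () -> Boolean,
): Boolean {
    val deadline = System.currentTimeMillis() + timeoutMillis
    while (System.currentTimeMillis() < deadline) {
        if (condition()) return true
        Thread.sleep(pollMillis)
    }
    return condition() // one final check at the deadline
}

fun main() {
    var callsLeft = 3
    // Succeeds once the simulated asynchronous work completes (after three polls).
    val ok = awaitCondition(timeoutMillis = 2_000, pollMillis = 10) { --callsLeft <= 0 }
    println(ok) // true
}
```

Polling with a deadline is what makes the e2e tests robust against the inherent lag of Kafka-driven cascade deletion, where a plain one-shot assertion races the consumer.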
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>