diff --git a/README.md b/README.md
new file mode 100644
index 0000000..42e4262
--- /dev/null
+++ b/README.md
@@ -0,0 +1,159 @@
+# Explore-with-me
+
+## Introduction
+
+Explore-with-me is a platform that lets users create events, manage them, and participate in them. The application has moved to a **microservice architecture** to improve scalability, fault tolerance, and development flexibility.
+
+Key architectural components:
+* **Config Server**: centralized configuration management for all microservices.
+* **Discovery Server (Eureka)**: service registration and discovery.
+* **Gateway Server**: the single entry point for all client requests; routes them to the appropriate microservices.
+* **Event Service**: handles event-related logic (creation, search, management).
+* **User Service**: manages user data and authentication/authorization.
+* **Request Service**: processes participation requests for events.
+* **Stats Service**: collects and analyzes statistics, split into:
+  * **Collector**: receives and stores information about views and endpoint hits.
+  * **Aggregator**: aggregates the collected data for further analysis.
+  * **Analyzer**: analyzes the aggregated data to produce recommendations for users.
+
+Services communicate with each other via **Feign clients**.
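As a minimal sketch of what such inter-service communication looks like — the interface and endpoint below are hypothetical; the real clients (e.g. `UserServiceClient`, `RequestServiceClient`) live in the service modules:

```java
// Hypothetical sketch of a Feign client; the concrete interfaces and
// endpoint paths in this project may differ.
import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;

@FeignClient(name = "user-service") // resolved through the discovery server
public interface UserClient {

    // Declarative call: Feign generates the HTTP request to user-service
    @GetMapping("/internal/users/{userId}/exists")
    boolean userExists(@PathVariable("userId") long userId);
}
```

Because the client is registered by service name, no host or port is hard-coded; the discovery server supplies the target instance at call time.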
+
+## Technologies Used
+
+- **Spring Boot**: the foundation for building the microservices.
+- **Spring Cloud**:
+  - **Spring Cloud Config**: for the `config-server`.
+  - **Spring Cloud Netflix Eureka** (or **Consul**): for the `discovery-server`.
+  - **Spring Cloud Gateway**: for the `gateway-server`.
+  - **OpenFeign**: declarative REST communication between services.
+- **REST**: API communication.
+- **Docker & Docker Compose**: containerization and environment orchestration.
+- **PostgreSQL**: the primary database for services (where needed).
+- **Lombok**: reduces boilerplate code.
+- **SLF4J**: logging.
+
+## Service Overview
+
+The project consists of the following main microservices:
+
+- **config-server**: configuration server.
+- **discovery-server**: service registration and discovery server.
+- **gateway-server**: API gateway that routes requests.
+- **event-service**: event management service.
+- **user-service**: user management service.
+- **request-service**: participation request management service.
+- **collector**: statistics collection service.
+- **aggregator**: statistics aggregation service.
+- **analyzer**: statistics analysis and recommendation service.
+
+## Setup and Installation
+
+### Prerequisites
+
+- Java 21+
+- Maven
+- Docker
+- Docker Compose
+
+### Installation Steps
+
+1. **Clone the Repository**
+
+ ```bash
+ git clone https://github.com/yiqes/java-plus-graduation.git
+ cd java-plus-graduation
+ ```
+
+2. **Build the Project Modules**
+
+ ```bash
+ mvn clean install
+ ```
+
+3. **Running with Docker Compose (Recommended)**
+
+   - Make sure Docker and Docker Compose are installed.
+   - Go to the project root directory, where `docker-compose.yml` is located.
+   - Run:
+
+ ```bash
+ docker-compose up --build
+ ```
+   This brings up all the configured services, including `config-server`, `discovery-server`, and the databases.
+
+4. **Running Without Docker (More Complex)**
+
+   - Start `config-server`.
+   - Start `discovery-server`. Make sure it fetches its configuration from `config-server`.
+   - Start the remaining microservices (`event-service`, `user-service`, `request-service`, `collector`, `aggregator`, `analyzer`). They must register with `discovery-server` and fetch their configuration from `config-server`.
+   - Start `gateway-server`. It uses `discovery-server` to route requests.
+   - For each service, configure the database connection (where one is used) in its `bootstrap.properties` (for reaching the config server) and `application.properties`, or serve those settings from the config server.
+   - Start each service:
+   ```bash
+   # Example for event-service
+   java -jar core/event-service/target/event-service-0.0.1-SNAPSHOT.jar
+   ```
+
+## API Documentation
+
+All API requests should go through the **Gateway Server**, usually available at `http://localhost:8080` (or the port set in your `gateway-server` configuration). The gateway automatically routes requests to the appropriate microservices.
+
+**The endpoints are largely the same as in the monolithic version.** New in this release are `GET /events/recommendations` and `PUT /events/{event-id}/like`, both of which read the user id from the `X-EWM-USER-ID` header.
+
+### Admin API Endpoints (via Gateway)
+
+- **POST /admin/categories**
+  - Creates a new category.
+  - Request Body: `NewCategoryDto`
+  - *Routed to `event-service`*
+
+- **GET /admin/compilations**
+  - Returns event compilations.
+  - Query Params: `pinned`, `from`, `size`
+  - *Routed to `event-service`*
+
+- **POST /admin/events** (plus the other PATCH and GET event endpoints)
+  - Creates/updates/returns events (admin).
+  - Request Body: `NewEventDto` / `UpdateEventAdminRequest`
+  - *Routed to `event-service`*
+
+- **GET /admin/users** (plus the other user management endpoints)
+  - Returns users.
+  - Query Params: `ids`, `from`, `size`
+  - *Routed to `user-service`*
+
+### Public API Endpoints (via Gateway)
+
+- **GET /categories**
+  - Returns categories.
+  - Query Params: `from`, `size`
+  - *Routed to `event-service`*
+
+- **GET /compilations**
+  - Returns event compilations.
+  - Query Params: `pinned`, `from`, `size`
+  - *Routed to `event-service`*
+
+- **GET /events**
+  - Returns events (public access).
+  - Query Params: `text`, `categories`, `paid`, `rangeStart`, `rangeEnd`, `onlyAvailable`, `sort`, `from`, `size`
+  - *Routed to `event-service`*
+
+### Private API Endpoints (via Gateway)
+
+- **POST /users/{userId}/events** (plus the other user event endpoints)
+  - Creates an event for a user.
+  - Request Body: `NewEventDto`
+  - *Routed to `event-service` (the user is validated via `user-service`, or the `userId` is passed through)*
+
+- **GET /users/{userId}/requests** (plus the other user request endpoints)
+  - Returns participation requests for a user's events.
+  - *Routed to `request-service` (with user validation)*
+
+## Usage Examples
+
+### Getting an event with provided id & user-id from header (via Gateway)
+
+```bash
+curl -X GET "http://localhost:8080/events/123" \
+  -H "X-EWM-USER-ID: 456"
+```
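The recommendation and like endpoints introduced in this change can be called the same way (host, port, and ids here are illustrative):

```shell
# Get up to 5 recommended events for user 456
curl -X GET "http://localhost:8080/events/recommendations?maxResults=5" \
  -H "X-EWM-USER-ID: 456"

# Like event 123 as user 456
curl -X PUT "http://localhost:8080/events/123/like" \
  -H "X-EWM-USER-ID: 456"
```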
\ No newline at end of file
diff --git a/core/event-service/pom.xml b/core/event-service/pom.xml
index e38dce8..dc34144 100644
--- a/core/event-service/pom.xml
+++ b/core/event-service/pom.xml
@@ -1,132 +1,133 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <project xmlns="http://maven.apache.org/POM/4.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
     <modelVersion>4.0.0</modelVersion>

     <parent>
         <groupId>ru.practicum</groupId>
         <artifactId>core</artifactId>
         <version>0.0.1-SNAPSHOT</version>
     </parent>

     <artifactId>event-service</artifactId>

     <properties>
-        <maven.compiler.source>22</maven.compiler.source>
-        <maven.compiler.target>22</maven.compiler.target>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+        <querydsl.version>5.0.0</querydsl.version>
     </properties>

     <dependencies>
-        <dependency>
-            <groupId>org.springframework.data</groupId>
-            <artifactId>spring-data-jpa</artifactId>
-        </dependency>
-        <dependency>
-            <groupId>org.projectlombok</groupId>
-            <artifactId>lombok</artifactId>
-            <scope>provided</scope>
-        </dependency>
-        <dependency>
-            <groupId>org.apache.tomcat.embed</groupId>
-            <artifactId>tomcat-embed-core</artifactId>
-        </dependency>
-        <dependency>
-            <groupId>com.querydsl</groupId>
-            <artifactId>querydsl-jpa</artifactId>
-            <classifier>jakarta</classifier>
-            <version>5.1.0</version>
-        </dependency>
         <dependency>
             <groupId>ru.practicum</groupId>
             <artifactId>interaction-api</artifactId>
             <version>0.0.1-SNAPSHOT</version>
         </dependency>
-        <dependency>
-            <groupId>org.mapstruct</groupId>
-            <artifactId>mapstruct</artifactId>
-            <version>1.6.2</version>
-            <scope>provided</scope>
-        </dependency>
-        <dependency>
-            <groupId>org.mapstruct</groupId>
-            <artifactId>mapstruct-processor</artifactId>
-            <version>1.6.2</version>
-            <scope>provided</scope>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-web</artifactId>
+        </dependency>
-        <dependency>
-            <groupId>com.netflix.spectator</groupId>
-            <artifactId>spectator-api</artifactId>
-            <version>1.7.3</version>
-            <scope>compile</scope>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-configuration-processor</artifactId>
+            <optional>true</optional>
+        </dependency>
-        <dependency>
-            <groupId>jakarta.persistence</groupId>
-            <artifactId>jakarta.persistence-api</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-actuator</artifactId>
+        </dependency>
-        <dependency>
-            <groupId>org.hibernate.orm</groupId>
-            <artifactId>hibernate-core</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-validation</artifactId>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.retry</groupId>
-            <artifactId>spring-retry</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-data-jpa</artifactId>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
-            <version>3.1.8</version>
-        </dependency>
+        <dependency>
+            <groupId>org.postgresql</groupId>
+            <artifactId>postgresql</artifactId>
+            <scope>runtime</scope>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-config-client</artifactId>
-            <version>4.2.0</version>
-        </dependency>
+        <dependency>
+            <groupId>com.h2database</groupId>
+            <artifactId>h2</artifactId>
+            <scope>runtime</scope>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-web</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>org.mapstruct</groupId>
+            <artifactId>mapstruct</artifactId>
+            <version>1.5.5.Final</version>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-data-jpa</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>org.projectlombok</groupId>
+            <artifactId>lombok-mapstruct-binding</artifactId>
+            <version>0.2.0</version>
+        </dependency>
-        <dependency>
-            <groupId>org.postgresql</groupId>
-            <artifactId>postgresql</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>com.querydsl</groupId>
+            <artifactId>querydsl-apt</artifactId>
+            <version>${querydsl.version}</version>
+            <classifier>jakarta</classifier>
+            <scope>provided</scope>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-starter-openfeign</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>com.querydsl</groupId>
+            <artifactId>querydsl-jpa</artifactId>
+            <classifier>jakarta</classifier>
+            <version>${querydsl.version}</version>
+        </dependency>
-        <dependency>
-            <groupId>ru.practicum</groupId>
-            <artifactId>stats-client</artifactId>
-            <version>0.0.1-SNAPSHOT</version>
-            <scope>compile</scope>
-        </dependency>
+        <dependency>
+            <groupId>org.projectlombok</groupId>
+            <artifactId>lombok</artifactId>
+            <scope>provided</scope>
+        </dependency>
     </dependencies>

-    <dependencyManagement>
-        <dependencies>
-            <dependency>
-                <groupId>org.springframework.cloud</groupId>
-                <artifactId>spring-cloud-dependencies</artifactId>
-                <version>2024.0.0</version>
-                <type>pom</type>
-                <scope>import</scope>
-            </dependency>
-        </dependencies>
-    </dependencyManagement>
-
     <build>
         <plugins>
             <plugin>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-maven-plugin</artifactId>
-                <configuration>
-                    <image>
-                        <builder>paketobuildpacks/builder-jammy-base:latest</builder>
-                    </image>
-                </configuration>
             </plugin>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-compiler-plugin</artifactId>
+                <version>3.11.0</version>
+                <configuration>
+                    <annotationProcessorPaths>
+                        <path>
+                            <groupId>org.mapstruct</groupId>
+                            <artifactId>mapstruct-processor</artifactId>
+                            <version>1.5.5.Final</version>
+                        </path>
+                        <path>
+                            <groupId>org.projectlombok</groupId>
+                            <artifactId>lombok</artifactId>
+                            <version>1.18.30</version>
+                        </path>
+                        <path>
+                            <groupId>org.projectlombok</groupId>
+                            <artifactId>lombok-mapstruct-binding</artifactId>
+                            <version>0.2.0</version>
+                        </path>
+                    </annotationProcessorPaths>
+                </configuration>
+            </plugin>
@@ -144,36 +145,8 @@
-            <plugin>
-                <configuration>
-                    <annotationProcessorPaths>
-                        <path>
-                            <groupId>com.querydsl</groupId>
-                            <artifactId>querydsl-apt</artifactId>
-                            <classifier>jakarta</classifier>
-                            <version>5.1.0</version>
-                        </path>
-                    </annotationProcessorPaths>
-                </configuration>
-            </plugin>
-            <plugin>
-                <groupId>org.codehaus.mojo</groupId>
-                <artifactId>build-helper-maven-plugin</artifactId>
-                <version>3.3.0</version>
-                <executions>
-                    <execution>
-                        <id>add-source</id>
-                        <phase>generate-sources</phase>
-                        <goals>
-                            <goal>add-source</goal>
-                        </goals>
-                        <configuration>
-                            <sources>
-                                <source>${project.build.directory}/generated-sources/java/</source>
-                            </sources>
-                        </configuration>
-                    </execution>
-                </executions>
-            </plugin>
         </plugins>
     </build>
 </project>
\ No newline at end of file
diff --git a/core/event-service/src/main/java/ru/practicum/controller/PublicEventController.java b/core/event-service/src/main/java/ru/practicum/controller/PublicEventController.java
index 6e23883..1f3ae0d 100644
--- a/core/event-service/src/main/java/ru/practicum/controller/PublicEventController.java
+++ b/core/event-service/src/main/java/ru/practicum/controller/PublicEventController.java
@@ -7,13 +7,16 @@
import org.springframework.web.bind.annotation.*;
import ru.practicum.dto.event.EventFullDto;
import ru.practicum.dto.event.EventShortDto;
+import ru.practicum.dto.event.RecommendedEventDto;
import ru.practicum.exception.IncorrectValueException;
import ru.practicum.service.event.EventSearchParams;
import ru.practicum.service.event.EventService;
import ru.practicum.service.event.PublicSearchParams;
+import ru.practicum.stats.client.StatClient;
import java.time.LocalDateTime;
import java.util.List;
+import java.util.stream.Stream;
import static ru.practicum.constant.Constant.PATTERN_DATE;
@@ -25,7 +28,10 @@
@RequiredArgsConstructor
@Slf4j
public class PublicEventController {
+
+ private final StatClient statClient;
private final EventService eventService;
+ private static final String X_EWM_USER_ID_HEADER = "X-EWM-USER-ID";
/**
* Gets events.
@@ -83,21 +89,23 @@ public List<EventShortDto> getAll(
return eventShortDtoList;
}
- /**
- * Gets event by id.
- *
- * @param eventId the event id
- * @param request the request
- * @return the event by id
- */
@GetMapping("/{event-id}")
- public EventFullDto getEventById(@PathVariable("event-id") Long eventId, HttpServletRequest request) {
+ public EventFullDto getEventById(@PathVariable("event-id") Long eventId, @RequestHeader(X_EWM_USER_ID_HEADER) long userId) {
         log.info("Getting event info for id={}", eventId);
-        // Get the client IP
-        String clientIp = request.getRemoteAddr();
-
         // Fetch the event via the service
-        return eventService.getEventById(eventId, clientIp);
+        return eventService.getEventById(eventId, userId);
+ }
+
+ @GetMapping("/recommendations")
+    public Stream<RecommendedEventDto> getRecommendations(@RequestHeader(X_EWM_USER_ID_HEADER) long userId,
+ @RequestParam(defaultValue = "10") int maxResults) {
+ return eventService.getRecommendations(userId, maxResults);
+ }
+
+ @PutMapping("/{event-id}/like")
+ public void likeEvent(@PathVariable("event-id") Long eventId,
+ @RequestHeader(X_EWM_USER_ID_HEADER) long userId) {
+ eventService.addLike(userId, eventId);
}
}
diff --git a/core/event-service/src/main/java/ru/practicum/mapper/event/EventMapper.java b/core/event-service/src/main/java/ru/practicum/mapper/event/EventMapper.java
index a8f096e..f59ed4a 100644
--- a/core/event-service/src/main/java/ru/practicum/mapper/event/EventMapper.java
+++ b/core/event-service/src/main/java/ru/practicum/mapper/event/EventMapper.java
@@ -3,6 +3,8 @@
import org.mapstruct.Mapper;
import ru.practicum.dto.event.EventFullDto;
import ru.practicum.dto.event.EventShortDto;
+import ru.practicum.dto.event.RecommendedEventDto;
+import ru.practicum.grpc.stat.request.RecommendedEventProto;
import ru.practicum.mapper.location.LocationMapper;
import ru.practicum.model.Event;
@@ -45,4 +47,6 @@ default String formatDateTime(LocalDateTime dateTime) {
DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
return dateTime.format(formatter);
}
+
+ RecommendedEventDto map(RecommendedEventProto proto);
}
diff --git a/core/event-service/src/main/java/ru/practicum/mapper/event/UtilEventClass.java b/core/event-service/src/main/java/ru/practicum/mapper/event/UtilEventClass.java
index a70527f..baadf02 100644
--- a/core/event-service/src/main/java/ru/practicum/mapper/event/UtilEventClass.java
+++ b/core/event-service/src/main/java/ru/practicum/mapper/event/UtilEventClass.java
@@ -128,7 +128,7 @@ public Event updateEvent(Event updatedEvent, UpdateEventAdminRequest request, Ca
request.getRequestModeration() : updatedEvent.getRequestModeration())
.state(updatedEvent.getState())
.title(request.getTitle() != null ? request.getTitle() : updatedEvent.getTitle())
- .views(updatedEvent.getViews())
+ .rating(updatedEvent.getRating())
.build();
}
@@ -160,7 +160,7 @@ public EventFullDto toEventFullDto(Event event) {
eventFullDto.setRequestModeration(event.getRequestModeration());
eventFullDto.setState(event.getState());
eventFullDto.setTitle(event.getTitle());
- eventFullDto.setViews(event.getViews());
+ eventFullDto.setRating(event.getRating());
eventFullDto.setEventDate(event.getEventDate().format(formatter));
return eventFullDto;
diff --git a/core/event-service/src/main/java/ru/practicum/model/Event.java b/core/event-service/src/main/java/ru/practicum/model/Event.java
index 93c5b4e..2e7cd3b 100644
--- a/core/event-service/src/main/java/ru/practicum/model/Event.java
+++ b/core/event-service/src/main/java/ru/practicum/model/Event.java
@@ -55,7 +55,7 @@ public class Event {
EventState state;
String title;
@Transient
- Long views;
+ double rating;
@Transient
Long likes;
}
diff --git a/core/event-service/src/main/java/ru/practicum/repository/EventRepository.java b/core/event-service/src/main/java/ru/practicum/repository/EventRepository.java
index 723a1c0..d0832d9 100644
--- a/core/event-service/src/main/java/ru/practicum/repository/EventRepository.java
+++ b/core/event-service/src/main/java/ru/practicum/repository/EventRepository.java
@@ -1,7 +1,9 @@
package ru.practicum.repository;
+import jakarta.transaction.Transactional;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.JpaRepository;
+import org.springframework.data.jpa.repository.Modifying;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.querydsl.QuerydslPredicateExecutor;
import org.springframework.data.repository.query.Param;
@@ -77,4 +79,18 @@ List findAllEvents(@Param("text") String text,
@Query(value = "SELECT COUNT(*) FROM LIKES_EVENTS WHERE EVENT_ID = :eventId", nativeQuery = true)
long countLikesByEventId(Long eventId);
+
+ @Query(value = "SELECT EXISTS (" +
+ "SELECT * FROM LIKES_EVENTS WHERE USER_ID = :userId AND EVENT_ID = :eventId)", nativeQuery = true)
+ boolean checkLikeExistence(long userId, long eventId);
+
+ @Modifying
+ @Transactional
+ @Query(value = "INSERT INTO LIKES_EVENTS (USER_ID, EVENT_ID) values (:userId, :eventId)", nativeQuery = true)
+ void addLike(Long userId, Long eventId);
+
+ @Modifying
+ @Transactional
+ @Query(value = "DELETE FROM LIKES_EVENTS WHERE USER_ID = :userId AND EVENT_ID = :eventId", nativeQuery = true)
+ void deleteLike(Long userId, Long eventId);
}
diff --git a/core/event-service/src/main/java/ru/practicum/service/event/EventService.java b/core/event-service/src/main/java/ru/practicum/service/event/EventService.java
index ac758c6..80cd92c 100644
--- a/core/event-service/src/main/java/ru/practicum/service/event/EventService.java
+++ b/core/event-service/src/main/java/ru/practicum/service/event/EventService.java
@@ -1,14 +1,12 @@
package ru.practicum.service.event;
import org.springframework.transaction.annotation.Transactional;
-import ru.practicum.dto.event.EventFullDto;
-import ru.practicum.dto.event.EventShortDto;
-import ru.practicum.dto.event.NewEventDto;
-import ru.practicum.dto.event.UpdateEventAdminRequest;
+import ru.practicum.dto.event.*;
import ru.practicum.enums.EventState;
import java.time.LocalDateTime;
import java.util.List;
+import java.util.stream.Stream;
/**
* The interface Event service.
@@ -89,19 +87,18 @@ List<EventShortDto> getEvents(String text, List<Long> categories, Boolean paid,
LocalDateTime rangeStart, LocalDateTime rangeEnd,
Boolean onlyAvailable, String sort, int from, int size, String clientIp);
- /**
- * Gets event by id.
- *
- * @param id the id
- * @param clientIp the client ip
- * @return the event by id
- */
- EventFullDto getEventById(Long id, String clientIp);
+ EventFullDto getEventById(Long id, long userId);
EventFullDto getByIdInternal(long eventId);
@Transactional(readOnly = true)
    List<EventShortDto> getAllByPublic(EventSearchParams searchParams, Boolean onlyAvailable, String sort, String clientIp);
+
+ void addLike(long userId, long eventId);
+
+ void deleteLike(long userId, long eventId);
+
+    Stream<RecommendedEventDto> getRecommendations(Long userId, int limit);
}
diff --git a/core/event-service/src/main/java/ru/practicum/service/event/EventServiceImpl.java b/core/event-service/src/main/java/ru/practicum/service/event/EventServiceImpl.java
index 4fafb81..5f25457 100644
--- a/core/event-service/src/main/java/ru/practicum/service/event/EventServiceImpl.java
+++ b/core/event-service/src/main/java/ru/practicum/service/event/EventServiceImpl.java
@@ -9,11 +9,8 @@
import org.springframework.data.domain.Pageable;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
-import ru.practicum.StatClient;
import ru.practicum.client.RequestServiceClient;
import ru.practicum.client.UserServiceClient;
-import ru.practicum.dto.EndpointHitDto;
-import ru.practicum.dto.ViewStatsDto;
import ru.practicum.dto.category.CategoryDto;
import ru.practicum.dto.event.*;
import ru.practicum.enums.AdminStateAction;
@@ -22,6 +19,7 @@
import ru.practicum.exception.ConflictException;
import ru.practicum.exception.NotFoundException;
import ru.practicum.exception.ValidationException;
+import ru.practicum.grpc.stat.action.ActionTypeProto;
import ru.practicum.mapper.event.EventMapper;
import ru.practicum.mapper.event.UtilEventClass;
import ru.practicum.mapper.location.LocationMapper;
@@ -29,11 +27,14 @@
import ru.practicum.model.Location;
import ru.practicum.repository.*;
import ru.practicum.service.category.CategoryService;
+import ru.practicum.stats.client.StatClient;
+import java.time.Instant;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.*;
import java.util.stream.Collectors;
+import java.util.stream.Stream;
import static ru.practicum.constant.Constant.PATTERN_DATE;
import static ru.practicum.model.QEvent.event;
@@ -66,9 +67,9 @@ public EventServiceImpl(EventRepository eventRepository,
RequestServiceClient requestServiceClient,
EventMapper eventMapper, CategoryService categoryService, UtilEventClass utilEventClass,
LocationRepository locationRepository, SearchEventRepository searchEventRepository,
- CategoryRepository categoryRepository, StatClient statClient,
+ CategoryRepository categoryRepository,
LocationMapper locationMapper,
- UserServiceClient userServiceClient) {
+ UserServiceClient userServiceClient, StatClient statClient) {
this.eventRepository = eventRepository;
this.requestServiceClient = requestServiceClient;
this.eventMapper = eventMapper;
@@ -86,7 +87,6 @@ public EventServiceImpl(EventRepository eventRepository,
@Transactional(readOnly = true)
    public List<EventShortDto> getEventsForUser(Long userId, Integer from, Integer size) {
        List<Event> events = eventRepository.findByInitiatorId(userId, PageRequest.of(from, size));
-
return events.stream()
.map(eventMapper::toEventShortDto)
.collect(Collectors.toList());
@@ -213,12 +213,6 @@ public List<EventShortDto> getEvents(String text, List<Long> categories, Boolean
if (Boolean.TRUE.equals(text == null && categories == null && paid == null && rangeStart == null && rangeEnd == null
&& !onlyAvailable && sort == null && from == 0) && size == 10) {
- log.info("==> Статистика: вызов метода getEvents с пустыми параметрами от клиента {}", clientIp);
-
- // Записываем статистику
- saveEventsRequestToStats(clientIp);
-
- // Возвращаем пустой список
return Collections.emptyList();
}
@@ -256,21 +250,6 @@ public List<EventShortDto> getEvents(String text, List<Long> categories, Boolean
                .map(event -> "/events/" + event.getId()) // build the URI for each event
.toList();
- // Запрашиваем статистику просмотров с использованием StatClient
- List viewStats = statClient.getStats(rangeStart.toString(), rangeEnd.toString(), uris, true);
-
- // Обработка случая, если статистика отсутствует
- if (viewStats == null || viewStats.isEmpty()) {
- log.warn("Сервис статистики вернул пустой результат или null");
- viewStats = Collections.emptyList();
- }
-
- // Заполняем Map с количеством просмотров
- for (ViewStatsDto stat : viewStats) {
- Long eventId = Long.valueOf(stat.getUri().substring(stat.getUri().lastIndexOf("/") + 1));
- eventViews.put(eventId, Math.toIntExact(stat.getHits()));
- }
-
        // Sorting
        if ("VIEWS".equalsIgnoreCase(sort)) {
            // Sort by number of views
@@ -283,10 +262,7 @@ public List<EventShortDto> getEvents(String text, List<Long> categories, Boolean
            // Sort by event date
            filteredEvents.sort(Comparator.comparing(Event::getEventDate));
}
- log.info("Передаем запрос в статистику");
- // Логируем запрос в статистику
- saveEventsRequestToStats(clientIp);
        // Apply pagination
int start = Math.min(from, filteredEvents.size());
@@ -299,7 +275,7 @@ public List getEvents(String text, List categories, Boolean
}
@Override
- public EventFullDto getEventById(Long eventId, String clientIp) {
+ public EventFullDto getEventById(Long eventId, long userId) {
        // Check that the event exists
Event event = eventRepository.findById(eventId).orElseThrow(
() -> new NotFoundException("Event with id=" + eventId + " not found!", "")
@@ -310,13 +286,6 @@ public EventFullDto getEventById(Long eventId, String clientIp) {
throw new NotFoundException("Event with id=" + eventId + " is not published yet!", "");
}
- // Увеличение количества просмотров
- saveEventRequestToStats(event, clientIp);
-
- // Получение количества просмотров из статистики
- long views = getViewsFromStats(event);
-
- event.setViews(views);
eventRepository.save(event);
        // Count confirmed requests
@@ -324,72 +293,11 @@ public EventFullDto getEventById(Long eventId, String clientIp) {
        // Build the DTO
EventFullDto eventFullDto = utilEventClass.toEventFullDto(event);
- eventFullDto.setViews(views);
eventFullDto.setConfirmedRequests(confirmedRequests);
return eventFullDto;
}
- private void saveEventsRequestToStats(String clientIp) {
- try {
- // Создание объекта для статистики
- log.info("Создание объекта для статистики");
- EndpointHitDto hitDto = new EndpointHitDto();
- hitDto.setApp("ewm-main-service");
- hitDto.setUri("/events");
- hitDto.setIp(clientIp);
- hitDto.setTimestamp(LocalDateTime.now().format(dateTimeFormatter));
-
- // Логируем успешный запрос
- log.info("Логируем запрос в статистику: URI={}, IP={}", hitDto.getUri(), hitDto.getIp());
-
- // Отправка статистики
- statClient.saveHit(hitDto);
- } catch (Exception e) {
- log.error("Ошибка при сохранении статистики для URI=/events, IP=" + clientIp, e);
- }
- }
-
- private void saveEventRequestToStats(Event event, String clientIp) {
- try {
- EndpointHitDto hitDto = new EndpointHitDto();
- hitDto.setApp("ewm-main-service");
- hitDto.setUri("/events/" + event.getId());
- hitDto.setIp(clientIp);
- hitDto.setTimestamp(LocalDateTime.now().format(dateTimeFormatter));
-
- statClient.saveHit(hitDto);
- } catch (Exception e) {
- log.error("Ошибка при сохранении статистики для события id=" + event.getId(), e);
- }
- }
-
- private long getViewsFromStats(Event event) {
- DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
- try {
- String uri = "/events/" + event.getId();
- // Добавляем одну секунду к началу и завершению диапазона
- String start = event.getCreatedOn().minusSeconds(1).format(formatter);
- String end = LocalDateTime.now().plusSeconds(1).format(formatter);
-
- List stats = statClient.getStats(
- start,
- end,
- List.of(uri),
- true
- );
-
- return stats.stream()
- .filter(stat -> stat.getUri().equals(uri))
- .mapToLong(ViewStatsDto::getHits)
- .sum();
- } catch (Exception e) {
- log.error("Ошибка при получении статистики просмотров для события id=" + event.getId(), e);
- return 0;
- }
- }
-
-
private void checkDateTime(LocalDateTime rangeStart, LocalDateTime rangeEnd) {
if (rangeStart != null && rangeEnd != null && rangeStart.isAfter(rangeEnd)) {
throw new ValidationException("start time can't be after end time", "time range is incorrect");
@@ -484,7 +392,7 @@ public List<EventShortDto> getAllByPublic(EventSearchParams searchParams, Boolean
Long view = 0L;
- event.setViews(view);
+ event.setRating(view);
event.setConfirmedRequests(
requestServiceClient.countByStatusAndEventId(RequestStatus.CONFIRMED, event.getId()));
event.setLikes(eventRepository.countLikesByEventId(event.getId()));
@@ -509,18 +417,6 @@ public List<EventShortDto> getAllByPublic(EventSearchParams searchParams, Boolean
.toList();
-
- List viewStats = statClient.getStats(rangeStart.toString(), rangeEnd.toString(), uris, true);
-
- if (viewStats == null || viewStats.isEmpty()) {
- log.warn("Сервис статистики вернул пустой результат или null");
- viewStats = Collections.emptyList();
- }
- for (ViewStatsDto stat : viewStats) {
- Long eventId = Long.valueOf(stat.getUri().substring(stat.getUri().lastIndexOf("/") + 1));
- eventViews.put(eventId, stat.getHits());
- }
-
if ("VIEWS".equalsIgnoreCase(sort)) {
            // Sort by number of views
filteredEvents.sort((e1, e2) -> {
@@ -534,11 +430,41 @@ public List<EventShortDto> getAllByPublic(EventSearchParams searchParams, Boolean
}
        log.info("Forwarding request to statistics");
- // Логируем запрос в статистику
- saveEventsRequestToStats(clientIp);
-
return eventListBySearch.stream()
.map(eventMapper::toEventShortDto)
.toList();
}
+
+    @Override
+    public void addLike(long userId, long eventId) {
+        Event event = eventRepository.findById(eventId).orElseThrow(
+                () -> new NotFoundException("Event with id=" + eventId + " not found", "")
+        );
+        if (event.getState() != EventState.PUBLISHED) {
+            throw new ConflictException("Event with id=" + eventId + " is not published", "");
+        }
+        if (eventRepository.checkLikeExistence(userId, eventId)) {
+            return; // idempotent: the like already exists
+        }
+        eventRepository.addLike(userId, eventId);
+        event.setLikes(eventRepository.countLikesByEventId(eventId));
+        statClient.registerUserAction(eventId, userId, ActionTypeProto.ACTION_LIKE, Instant.now());
+    }
+
+    @Override
+    public void deleteLike(long userId, long eventId) {
+        eventRepository.findById(eventId).orElseThrow(
+                () -> new NotFoundException("Event with id=" + eventId + " not found", "")
+        );
+        boolean likeExists = eventRepository.checkLikeExistence(userId, eventId);
+        if (!likeExists) {
+            throw new NotFoundException("Like for event " + eventId
+                    + " by user " + userId + " does not exist", "");
+        }
+        eventRepository.deleteLike(userId, eventId);
+    }
+
+ @Override
+    public Stream<RecommendedEventDto> getRecommendations(Long userId, int limit) {
+ return statClient.getRecommendationsFor(userId, limit)
+ .map(eventMapper::map);
+ }
}
diff --git a/core/interaction-api/src/main/java/ru/practicum/dto/event/EventFullDto.java b/core/interaction-api/src/main/java/ru/practicum/dto/event/EventFullDto.java
index d53ede0..7b22ecf 100644
--- a/core/interaction-api/src/main/java/ru/practicum/dto/event/EventFullDto.java
+++ b/core/interaction-api/src/main/java/ru/practicum/dto/event/EventFullDto.java
@@ -36,6 +36,6 @@ public class EventFullDto {
boolean requestModeration;
EventState state;
String title;
- Long views;
+ double rating;
Long likes;
}
diff --git a/stats/stats-dto/src/main/java/ru/practicum/dto/ViewStatsDto.java b/core/interaction-api/src/main/java/ru/practicum/dto/event/EventRecommendationDto.java
similarity index 63%
rename from stats/stats-dto/src/main/java/ru/practicum/dto/ViewStatsDto.java
rename to core/interaction-api/src/main/java/ru/practicum/dto/event/EventRecommendationDto.java
index 99f252e..b88ffe3 100644
--- a/stats/stats-dto/src/main/java/ru/practicum/dto/ViewStatsDto.java
+++ b/core/interaction-api/src/main/java/ru/practicum/dto/event/EventRecommendationDto.java
@@ -1,4 +1,5 @@
-package ru.practicum.dto;
+package ru.practicum.dto.event;
+
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
@@ -6,15 +7,11 @@
import lombok.NoArgsConstructor;
import lombok.experimental.FieldDefaults;
-/**
- * The type View stats dto.
- */
@Data
-@NoArgsConstructor
@AllArgsConstructor
+@NoArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE)
-public class ViewStatsDto {
- String app;
- String uri;
- Long hits;
-}
+public class EventRecommendationDto {
+ long eventId;
+ double score;
+}
\ No newline at end of file
diff --git a/core/interaction-api/src/main/java/ru/practicum/dto/event/EventShortDto.java b/core/interaction-api/src/main/java/ru/practicum/dto/event/EventShortDto.java
index a4366a2..2ef8628 100644
--- a/core/interaction-api/src/main/java/ru/practicum/dto/event/EventShortDto.java
+++ b/core/interaction-api/src/main/java/ru/practicum/dto/event/EventShortDto.java
@@ -24,7 +24,7 @@ public class EventShortDto {
UserShortDto initiator;
Boolean paid;
String title;
- Long views;
+ double rating;
diff --git a/stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitResponseDto.java b/core/interaction-api/src/main/java/ru/practicum/dto/event/RecommendedEventDto.java
similarity index 58%
rename from stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitResponseDto.java
rename to core/interaction-api/src/main/java/ru/practicum/dto/event/RecommendedEventDto.java
index cd1f8ab..5bbd5bd 100644
--- a/stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitResponseDto.java
+++ b/core/interaction-api/src/main/java/ru/practicum/dto/event/RecommendedEventDto.java
@@ -1,21 +1,18 @@
-package ru.practicum.dto;
+package ru.practicum.dto.event;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.experimental.FieldDefaults;
+import lombok.experimental.SuperBuilder;
-/**
- * The type Endpoint hit response dto.
- */
@Data
+@SuperBuilder(toBuilder = true)
@AllArgsConstructor
@NoArgsConstructor
@FieldDefaults(level = AccessLevel.PRIVATE)
-public class EndpointHitResponseDto {
-
- String app;
- String uri;
- String ip;
-}
+public class RecommendedEventDto {
+ Long eventId;
+ double score;
+}
\ No newline at end of file
diff --git a/core/pom.xml b/core/pom.xml
index eb6fb25..a0a3336 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -17,4 +17,69 @@
         <module>interaction-api</module>
         <module>request-service</module>
     </modules>
+
+    <properties>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+    </properties>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-config</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.retry</groupId>
+            <artifactId>spring-retry</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-openfeign</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>ru.practicum</groupId>
+            <artifactId>aggregator</artifactId>
+            <version>0.0.1-SNAPSHOT</version>
+            <scope>compile</scope>
+        </dependency>
+        <dependency>
+            <groupId>ru.practicum</groupId>
+            <artifactId>stats-client</artifactId>
+            <version>0.0.1-SNAPSHOT</version>
+            <scope>compile</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework</groupId>
+            <artifactId>spring-tx</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>com.querydsl</groupId>
+            <artifactId>querydsl-core</artifactId>
+            <version>5.0.0</version>
+            <scope>compile</scope>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.data</groupId>
+            <artifactId>spring-data-commons</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.projectlombok</groupId>
+            <artifactId>lombok</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-actuator</artifactId>
+        </dependency>
+    </dependencies>
 </project>
\ No newline at end of file
diff --git a/core/request-service/pom.xml b/core/request-service/pom.xml
index 620d13e..9728d2f 100644
--- a/core/request-service/pom.xml
+++ b/core/request-service/pom.xml
@@ -1,8 +1,9 @@
 <?xml version="1.0" encoding="UTF-8"?>
 <project xmlns="http://maven.apache.org/POM/4.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
     <modelVersion>4.0.0</modelVersion>
+
     <parent>
         <groupId>ru.practicum</groupId>
         <artifactId>core</artifactId>
@@ -12,97 +13,147 @@
     <artifactId>request-service</artifactId>

     <properties>
-        <maven.compiler.source>22</maven.compiler.source>
-        <maven.compiler.target>22</maven.compiler.target>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+        <querydsl.version>5.0.0</querydsl.version>
     </properties>

     <dependencies>
         <dependency>
             <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter</artifactId>
+            <artifactId>spring-boot-starter-web</artifactId>
         </dependency>
-        <dependency>
-            <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
-            <version>3.1.8</version>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-actuator</artifactId>
+        </dependency>
         <dependency>
             <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-web</artifactId>
+            <artifactId>spring-boot-starter-data-jpa</artifactId>
         </dependency>
-        <dependency>
-            <groupId>org.projectlombok</groupId>
-            <artifactId>lombok</artifactId>
-            <scope>provided</scope>
-        </dependency>
+        <dependency>
+            <groupId>org.postgresql</groupId>
+            <artifactId>postgresql</artifactId>
+            <scope>runtime</scope>
+        </dependency>
+        <dependency>
+            <groupId>com.h2database</groupId>
+            <artifactId>h2</artifactId>
+            <scope>runtime</scope>
+        </dependency>
         <dependency>
             <groupId>org.mapstruct</groupId>
             <artifactId>mapstruct</artifactId>
-            <version>1.6.2</version>
-            <scope>compile</scope>
+            <version>1.5.5.Final</version>
         </dependency>
+        <dependency>
+            <groupId>org.projectlombok</groupId>
+            <artifactId>lombok-mapstruct-binding</artifactId>
+            <version>0.2.0</version>
+        </dependency>
+        <dependency>
+            <groupId>jakarta.validation</groupId>
+            <artifactId>jakarta.validation-api</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-validation</artifactId>
+        </dependency>
         <dependency>
             <groupId>org.mapstruct</groupId>
             <artifactId>mapstruct-processor</artifactId>
-            <version>1.6.2</version>
+            <version>1.5.5.Final</version>
             <scope>provided</scope>
         </dependency>
-        <dependency>
-            <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-data-jpa</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
+        </dependency>
-        <dependency>
-            <groupId>com.querydsl</groupId>
-            <artifactId>querydsl-jpa</artifactId>
-            <classifier>jakarta</classifier>
-            <version>5.1.0</version>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-config</artifactId>
+        </dependency>
-        <dependency>
-            <groupId>ru.practicum</groupId>
-            <artifactId>interaction-api</artifactId>
-            <version>0.0.1-SNAPSHOT</version>
-            <scope>compile</scope>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-openfeign</artifactId>
+        </dependency>
-        <dependency>
-            <groupId>org.postgresql</groupId>
-            <artifactId>postgresql</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>org.springframework.retry</groupId>
+            <artifactId>spring-retry</artifactId>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-actuator</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>org.projectlombok</groupId>
+            <artifactId>lombok</artifactId>
+            <optional>true</optional>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-starter-openfeign</artifactId>
-        </dependency>
+        <dependency>
+            <groupId>com.querydsl</groupId>
+            <artifactId>querydsl-apt</artifactId>
+            <version>${querydsl.version}</version>
+            <classifier>jakarta</classifier>
+            <scope>provided</scope>
+        </dependency>
-        <dependency>
-            <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-config-client</artifactId>
-            <version>4.2.0</version>
-        </dependency>
+        <dependency>
+            <groupId>com.querydsl</groupId>
+            <artifactId>querydsl-jpa</artifactId>
+            <classifier>jakarta</classifier>
+            <version>${querydsl.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>ru.practicum</groupId>
+            <artifactId>interaction-api</artifactId>
+            <version>0.0.1-SNAPSHOT</version>
+            <scope>compile</scope>
+        </dependency>
     </dependencies>

-    <dependencyManagement>
-        <dependencies>
-            <dependency>
-                <groupId>org.springframework.cloud</groupId>
-                <artifactId>spring-cloud-dependencies</artifactId>
-                <version>2024.0.0</version>
-                <type>pom</type>
-                <scope>import</scope>
-            </dependency>
-        </dependencies>
-    </dependencyManagement>
-
     <build>
         <plugins>
             <plugin>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-maven-plugin</artifactId>
-                <configuration>
-                    <image>
-                        <builder>paketobuildpacks/builder-jammy-base:latest</builder>
-                    </image>
-                </configuration>
             </plugin>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-compiler-plugin</artifactId>
+                <version>3.11.0</version>
+                <configuration>
+                    <annotationProcessorPaths>
+                        <path>
+                            <groupId>org.mapstruct</groupId>
+                            <artifactId>mapstruct-processor</artifactId>
+                            <version>1.5.5.Final</version>
+                        </path>
+                        <path>
+                            <groupId>org.projectlombok</groupId>
+                            <artifactId>lombok</artifactId>
+                            <version>1.18.30</version>
+                        </path>
+                        <path>
+                            <groupId>org.projectlombok</groupId>
+                            <artifactId>lombok-mapstruct-binding</artifactId>
+ 0.2.0
+
+
@@ -120,33 +171,6 @@
-
-
- com.querydsl
- querydsl-apt
- jakarta
- 5.1.0
-
-
-
-
- org.codehaus.mojo
- build-helper-maven-plugin
- 3.3.0
-
-
- add-source
- generate-sources
-
- add-source
-
-
-
- ${project.build.directory}/generated-sources/java/
-
-
-
-
diff --git a/core/request-service/src/main/java/ru/practicum/controller/PrivateRequestController.java b/core/request-service/src/main/java/ru/practicum/controller/PrivateRequestController.java
index bd7ae7f..c1ffc4c 100644
--- a/core/request-service/src/main/java/ru/practicum/controller/PrivateRequestController.java
+++ b/core/request-service/src/main/java/ru/practicum/controller/PrivateRequestController.java
@@ -6,7 +6,6 @@
import org.springframework.http.HttpStatus;
import org.springframework.validation.annotation.Validated;
import org.springframework.web.bind.annotation.*;
-import ru.practicum.client.EventServiceClient;
import ru.practicum.dto.request.EventRequestStatusUpdateRequest;
import ru.practicum.dto.request.EventRequestStatusUpdateResult;
import ru.practicum.dto.request.ParticipationRequestDto;
@@ -27,7 +26,6 @@ public class PrivateRequestController {
     private static final String USERID = "user-id";
     private static final String EVENTID = "event-id";
-    private final EventServiceClient eventServiceClient;
     private final RequestService requestService;

     /**
diff --git a/core/request-service/src/main/java/ru/practicum/service/RequestServiceImpl.java b/core/request-service/src/main/java/ru/practicum/service/RequestServiceImpl.java
index 95a059f..5412e3a 100644
--- a/core/request-service/src/main/java/ru/practicum/service/RequestServiceImpl.java
+++ b/core/request-service/src/main/java/ru/practicum/service/RequestServiceImpl.java
@@ -12,10 +12,13 @@
import ru.practicum.enums.RequestStatus;
import ru.practicum.exception.ConflictException;
import ru.practicum.exception.NotFoundException;
+import ru.practicum.grpc.stat.action.ActionTypeProto;
import ru.practicum.mapper.RequestMapper;
import ru.practicum.model.Request;
import ru.practicum.repository.RequestRepository;
+import ru.practicum.stats.client.StatClient;
+import java.time.Instant;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
diff --git a/core/user-service/pom.xml b/core/user-service/pom.xml
index 47b9445..e610891 100644
--- a/core/user-service/pom.xml
+++ b/core/user-service/pom.xml
@@ -1,142 +1,179 @@
-
4.0.0
+
ru.practicum
core
0.0.1-SNAPSHOT
+
user-service
- 22
- 22
+ 21
+ 21
UTF-8
+ 5.0.0
+
+
+
+ ru.practicum
+ stats-client
+ 0.0.1-SNAPSHOT
+
+
+
+ ru.practicum
+ aggregator
+ 0.0.1-SNAPSHOT
+
+
org.springframework.boot
spring-boot-starter-web
+
- org.zalando
- logbook-spring-boot-starter
- 3.9.0
+ org.springframework.boot
+ spring-boot-starter-actuator
+
org.springframework.boot
- spring-boot-starter-test
+ spring-boot-starter-validation
+
- org.springframework
- spring-web
+ org.springframework.boot
+ spring-boot-starter-data-jpa
+
- org.projectlombok
- lombok
- provided
+ org.postgresql
+ postgresql
+ runtime
+
- org.springframework.data
- spring-data-commons
+ com.h2database
+ h2
+ runtime
+
org.mapstruct
mapstruct
- 1.6.2
+ 1.5.5.Final
+
- org.mapstruct
- mapstruct-processor
- 1.6.2
- provided
+ org.projectlombok
+ lombok-mapstruct-binding
+ 0.2.0
+
jakarta.validation
jakarta.validation-api
+
- jakarta.transaction
- jakarta.transaction-api
+ org.springframework.boot
+ spring-boot-starter-validation
+
- org.springframework.data
- spring-data-jpa
+ org.mapstruct
+ mapstruct-processor
+ 1.5.5.Final
+ provided
+
- com.querydsl
- querydsl-core
- 5.1.0
+ org.springframework.cloud
+ spring-cloud-starter-netflix-eureka-client
+
- javax.persistence
- javax.persistence-api
- 2.2
+ org.springframework.cloud
+ spring-cloud-starter-config
+
- org.springframework.boot
- spring-boot-starter-data-jpa
+ org.springframework.cloud
+ spring-cloud-starter-openfeign
+
- ru.practicum
- interaction-api
- 0.0.1-SNAPSHOT
- compile
+ org.springframework.retry
+ spring-retry
+
- org.springframework.boot
- spring-boot-starter-actuator
+ org.projectlombok
+ lombok
+ true
+
+
+
com.querydsl
- querydsl-jpa
+ querydsl-apt
+ ${querydsl.version}
jakarta
- 5.1.0
-
-
- org.springframework.cloud
- spring-cloud-starter-netflix-eureka-client
- 3.1.8
-
-
- org.springframework.cloud
- spring-cloud-config-client
- 4.2.0
+ provided
+
- org.postgresql
- postgresql
+ com.querydsl
+ querydsl-jpa
+ jakarta
+ ${querydsl.version}
- org.springframework.cloud
- spring-cloud-starter-config
+ ru.practicum
+ interaction-api
+ 0.0.1-SNAPSHOT
+ compile
-
-
-
-
- org.springframework.cloud
- spring-cloud-dependencies
- 2024.0.0
- pom
- import
-
-
-
+
org.springframework.boot
spring-boot-maven-plugin
+
+
+ org.apache.maven.plugins
+ maven-compiler-plugin
+ 3.11.0
-
- paketobuildpacks/builder-jammy-base:latest
-
+
+
+ org.mapstruct
+ mapstruct-processor
+ 1.5.5.Final
+
+
+ org.projectlombok
+ lombok
+ 1.18.30
+
+
+ org.projectlombok
+ lombok-mapstruct-binding
+ 0.2.0
+
+
@@ -154,33 +191,6 @@
-
-
- com.querydsl
- querydsl-apt
- jakarta
- 5.1.0
-
-
-
-
- org.codehaus.mojo
- build-helper-maven-plugin
- 3.3.0
-
-
- add-source
- generate-sources
-
- add-source
-
-
-
- ${project.build.directory}/generated-sources/java/
-
-
-
-
diff --git a/docker-compose.yml b/docker-compose.yml
index b7f2a57..d5973aa 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -37,108 +37,104 @@ services:
     depends_on:
       config-server:
         condition: service_healthy
+      event-service:
+        condition: service_healthy
       user-service:
         condition: service_healthy
       request-service:
         condition: service_healthy
-      event-service:
-        condition: service_healthy
-      stats-server:
-        condition: service_healthy
+
     networks:
       - ewm-net
     environment:
       - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://discovery-server:8761/eureka/
-  stats-server:
-    build: stats/stats-server
-    container_name: ewm-stats-server
-    ports:
-      - "9090:9090"
+  event-service:
+    build: core/event-service
+    container_name: event-service
     depends_on:
-      stats-db:
+      event-db:
         condition: service_healthy
       config-server:
         condition: service_healthy
+
     networks:
       - ewm-net
     environment:
-      - SPRING_DATASOURCE_URL=jdbc:postgresql://stats-db:5432/ewm-stats
+      - SPRING_DATASOURCE_URL=jdbc:postgresql://event-db:5432/ewm-event
       - SPRING_DATASOURCE_USERNAME=root
       - SPRING_DATASOURCE_PASSWORD=root
       - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://discovery-server:8761/eureka/
-      - SERVER_PORT=9090
+      - SERVER_PORT=8081
     healthcheck:
-      test: "curl --fail --silent localhost:9090/actuator/health | grep UP || exit 1"
+      test: "curl --fail --silent localhost:8081/actuator/health | grep UP || exit 1"
       timeout: 5s
-      interval: 15s
+      interval: 25s
       retries: 10
-  stats-db:
+  event-db:
     image: postgres:16.1
-    container_name: postgres-ewm-stats-db
+    container_name: postgres-ewm-event-db
+    networks:
+      - ewm-net
     environment:
       - POSTGRES_PASSWORD=root
       - POSTGRES_USER=root
-      - POSTGRES_DB=ewm-stats
-    networks:
-      - ewm-net
+      - POSTGRES_DB=ewm-event
+    ports:
+      - "5434:5432"
     healthcheck:
       test: pg_isready -q -d $$POSTGRES_DB -U $$POSTGRES_USER
       timeout: 5s
       interval: 10s
       retries: 15
-  # Event-service
-  event-service:
-    build: core/event-service
-    container_name: event-service
+
+  request-service:
+    build: core/request-service
+    container_name: ewm-request-service
     depends_on:
-      event-db:
+      request-db:
         condition: service_healthy
       config-server:
         condition: service_healthy
-      stats-server:
-        condition: service_healthy
     networks:
       - ewm-net
     environment:
-      - SPRING_DATASOURCE_URL=jdbc:postgresql://event-db:5432/ewm-event
+      - SPRING_DATASOURCE_URL=jdbc:postgresql://request-db:5432/ewm-request
       - SPRING_DATASOURCE_USERNAME=root
       - SPRING_DATASOURCE_PASSWORD=root
       - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://discovery-server:8761/eureka/
-      - SERVER_PORT=8081
+      - SERVER_PORT=8083
     healthcheck:
-      test: "curl --fail --silent localhost:8081/actuator/health | grep UP || exit 1"
+      test: "curl --fail --silent localhost:8083/actuator/health | grep UP || exit 1"
       timeout: 5s
       interval: 25s
       retries: 10
-  event-db:
+
+  request-db:
     image: postgres:16.1
-    container_name: postgres-ewm-event-db
+    container_name: postgres-ewm-request-db
     networks:
       - ewm-net
     environment:
       - POSTGRES_PASSWORD=root
       - POSTGRES_USER=root
-      - POSTGRES_DB=ewm-event
+      - POSTGRES_DB=ewm-request
     healthcheck:
       test: pg_isready -q -d $$POSTGRES_DB -U $$POSTGRES_USER
       timeout: 5s
       interval: 10s
       retries: 15
-  # USER-SERVICE
   user-service:
     build: core/user-service
-    container_name: user-service
+    container_name: ewm-user-service
     depends_on:
-      event-db:
+      user-db:
         condition: service_healthy
       config-server:
         condition: service_healthy
-      stats-server:
-        condition: service_healthy
     networks:
       - ewm-net
     environment:
@@ -146,9 +142,9 @@ services:
       - SPRING_DATASOURCE_USERNAME=root
       - SPRING_DATASOURCE_PASSWORD=root
       - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://discovery-server:8761/eureka/
-      - SERVER_PORT=8083
+      - SERVER_PORT=8084
     healthcheck:
-      test: "curl --fail --silent localhost:8083/actuator/health | grep UP || exit 1"
+      test: "curl --fail --silent localhost:8084/actuator/health | grep UP || exit 1"
       timeout: 5s
       interval: 25s
       retries: 10
@@ -168,51 +164,131 @@ services:
       interval: 10s
       retries: 15

-  #REQUEST-SERVICE
-  request-db:
-    image: postgres:16.1
-    container_name: postgres-ewm-request-db
+  kafka:
+    image: confluentinc/confluent-local:7.4.3
+    hostname: kafka
+    container_name: kafka
+    ports:
+      - "9092:9092" # for client connections
+      - "29092:29092"
+    restart: unless-stopped
     networks:
       - ewm-net
     environment:
-      - POSTGRES_PASSWORD=root
-      - POSTGRES_USER=root
-      - POSTGRES_DB=ewm-request
-    volumes:
-      - ${PWD}/core/request-service/src/main/resources/schema.sql:/docker-entrypoint-initdb.d/schema.sql
+      KAFKA_NODE_ID: 1
+      KAFKA_ADVERTISED_LISTENERS: 'PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092'
+      KAFKA_JMX_PORT: 9101
+      KAFKA_JMX_HOSTNAME: localhost
+      KAFKA_PROCESS_ROLES: 'broker,controller'
+      KAFKA_CONTROLLER_QUORUM_VOTERS: '1@kafka:29093'
+      KAFKA_LISTENERS: 'PLAINTEXT://kafka:29092,CONTROLLER://kafka:29093,PLAINTEXT_HOST://0.0.0.0:9092'
+      CLUSTER_ID: 'K0EA9p0yEe6MkAAAAkKsEg'
     healthcheck:
-      test: pg_isready -q -d $$POSTGRES_DB -U $$POSTGRES_USER
-      timeout: 5s
+      test: [ "CMD", "kafka-topics", "--bootstrap-server", "localhost:9092", "--list" ]
       interval: 10s
-      retries: 15
+      timeout: 5s
+      retries: 10

-  request-service:
-    build: core/request-service
-    container_name: request-service
+  kafka-init-topics:
+    image: confluentinc/confluent-local:7.4.3
+    container_name: kafka-init-topics
     depends_on:
-      user-db:
+      kafka:
         condition: service_healthy
+    networks:
+      - ewm-net
+    command: "bash -c \
+      'kafka-topics --create --topic stats.user-actions.v1 \
+      --partitions 1 --replication-factor 1 --if-not-exists \
+      --bootstrap-server kafka:29092 && \
+      kafka-topics --create --topic stats.events-similarity.v1 \
+      --partitions 1 --replication-factor 1 --if-not-exists \
+      --bootstrap-server kafka:29092'"
+    init: true
+
+  collector:
+    build: stats/collector
+    container_name: collector
+    restart: on-failure
+    depends_on:
       config-server:
         condition: service_healthy
-      stats-server:
+      kafka-init-topics:
+        condition: service_completed_successfully
+      aggregator:
         condition: service_healthy
-      user-service:
+    networks:
+      - ewm-net
+    environment:
+      - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://discovery-server:8761/eureka/
+      - SERVER_PORT=8080
+    healthcheck:
+      test: "curl --fail --silent localhost:8080/actuator/health | grep UP || exit 1"
+      timeout: 5s
+      interval: 25s
+      retries: 10
+
+  aggregator:
+    build: stats/aggregator
+    container_name: aggregator
+    restart: on-failure
+    depends_on:
+      config-server:
         condition: service_healthy
-      request-db:
+      kafka-init-topics:
+        condition: service_completed_successfully
+    networks:
+      - ewm-net
+    environment:
+      - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://discovery-server:8761/eureka/
+      - SERVER_PORT=8080
+    healthcheck:
+      test: "curl --fail --silent localhost:8080/actuator/health | grep UP || exit 1"
+      timeout: 5s
+      interval: 25s
+      retries: 10
+
+  analyzer:
+    build: stats/analyzer
+    container_name: analyzer
+    restart: on-failure
+    depends_on:
+      analyzer-db:
         condition: service_healthy
+      config-server:
+        condition: service_healthy
+      kafka-init-topics:
+        condition: service_completed_successfully
     networks:
       - ewm-net
     environment:
-      - SPRING_DATASOURCE_URL=jdbc:postgresql://request-db:5432/ewm-request
+      - SPRING_DATASOURCE_URL=jdbc:postgresql://analyzer-db:5432/analyzer
       - SPRING_DATASOURCE_USERNAME=root
       - SPRING_DATASOURCE_PASSWORD=root
       - EUREKA_CLIENT_SERVICEURL_DEFAULTZONE=http://discovery-server:8761/eureka/
-      - SERVER_PORT=8085
+      - SERVER_PORT=8080
     healthcheck:
-      test: "curl --fail --silent localhost:8085/actuator/health | grep UP || exit 1"
+      test: "curl --fail --silent localhost:8080/actuator/health | grep UP || exit 1"
       timeout: 5s
       interval: 25s
       retries: 10
+  analyzer-db:
+    image: postgres:16.1
+    container_name: analyzer-db
+    restart: on-failure
+    environment:
+      - POSTGRES_DB=analyzer
+      - POSTGRES_USER=root
+      - POSTGRES_PASSWORD=root
+    networks:
+      - ewm-net
+    healthcheck:
+      test: pg_isready -q -d $$POSTGRES_DB -U $$POSTGRES_USER
+      timeout: 5s
+      interval: 10s
+      retries: 10
+
+
 networks:
   ewm-net:
\ No newline at end of file
diff --git a/infra/config-server/src/main/resources/application.yml b/infra/config-server/src/main/resources/application.yml
index bf89bc7..117066e 100644
--- a/infra/config-server/src/main/resources/application.yml
+++ b/infra/config-server/src/main/resources/application.yml
@@ -11,16 +11,17 @@ spring:
- classpath:config/core/{application}
- classpath:config/infra/{application}
- classpath:config/stats/{application}
+ discovery:
+ enabled: true
+
eureka:
client:
- register-with-eureka: true
- fetch-registry: true
serviceUrl:
defaultZone: http://localhost:8761/eureka/
instance:
- prefer-ip-address: true
+ preferIpAddress: true
hostname: localhost
instance-id: "${spring.application.name}:${random.value}"
- lease-renewal-interval-in-seconds: 10
+ leaseRenewalIntervalInSeconds: 10
server:
port: 0
\ No newline at end of file
diff --git a/infra/config-server/src/main/resources/config/stats.stats-server/application.yml b/infra/config-server/src/main/resources/config/stats.stats-server/application.yml
deleted file mode 100644
index 6886a06..0000000
--- a/infra/config-server/src/main/resources/config/stats.stats-server/application.yml
+++ /dev/null
@@ -1,20 +0,0 @@
-spring:
-  datasource:
-    driverClassName: org.postgresql.Driver
-    url: jdbc:postgresql://stats-db:5432/ewm-stats
-    username: root
-    password: root
-
-  jpa:
-    hibernate:
-      ddl-auto: none
-    database-platform: org.hibernate.dialect.PostgreSQLDialect
-    generate-ddl: false
-    properties:
-      hibernate:
-        format_sql: true
-    show-sql: false
-
-  sql:
-    init:
-      mode: always
\ No newline at end of file
diff --git a/infra/config-server/src/main/resources/config/stats/aggregator/application.yml b/infra/config-server/src/main/resources/config/stats/aggregator/application.yml
new file mode 100644
index 0000000..cd8fbb0
--- /dev/null
+++ b/infra/config-server/src/main/resources/config/stats/aggregator/application.yml
@@ -0,0 +1,20 @@
+server:
+  port: 8889
+
+kafka:
+  bootstrapServers: localhost:9092
+  producerClientIdConfig: aggregator-producer
+  producerKeySerializer: org.apache.kafka.common.serialization.LongSerializer
+  producerValueSerializer: ru.practicum.serializer.AvroSerializer
+  consumerGroupId: aggregator-group
+  consumerClientIdConfig: aggregator-consumer
+  consumerKeyDeserializer: org.apache.kafka.common.serialization.LongDeserializer
+  consumerValueDeserializer: ru.practicum.deserializer.UserActionDeserializer
+  consumerEnableAutoCommit: "false"
+  userActionTopic: stats.user-actions.v1
+  eventsSimilarityTopic: stats.events-similarity.v1
+
+logging:
+  level:
+    ru.practicum: debug
+    root: info
\ No newline at end of file
diff --git a/infra/config-server/src/main/resources/config/stats/analyzer/application.yml b/infra/config-server/src/main/resources/config/stats/analyzer/application.yml
new file mode 100644
index 0000000..cd9c0c9
--- /dev/null
+++ b/infra/config-server/src/main/resources/config/stats/analyzer/application.yml
@@ -0,0 +1,63 @@
+server:
+  port: 9090
+
+kafka:
+  bootstrapServers: localhost:9092
+  userActionTopic: stats.user-actions.v1
+  eventsSimilarityTopic: stats.events-similarity.v1
+
+  userActionConsumer:
+    groupId: analyzer-group
+    clientId: analyzer-consumer
+    keyDeserializer: org.apache.kafka.common.serialization.LongDeserializer
+    valueDeserializer: ru.practicum.deserializer.UserActionDeserializer
+    enableAutoCommit: "false"
+    maxPollRecords: 500
+    maxPollIntervalMs: 300000
+    sessionTimeoutMs: 10000
+
+  eventSimilarityConsumer:
+    groupId: event-similarity
+    clientId: event-similarity-client
+    keyDeserializer: org.apache.kafka.common.serialization.LongDeserializer
+    valueDeserializer: ru.practicum.deserializer.EventSimilarityDeserializer
+    enableAutoCommit: "false"
+    maxPollRecords: 500
+    maxPollIntervalMs: 300000
+    sessionTimeoutMs: 10000
+
+spring:
+  jpa:
+    hibernate.ddl-auto: none
+    show-sql: true
+    properties:
+      hibernate:
+        format_sql: true
+  sql.init.mode: always
+
+logging:
+  file:
+    name: .from_the_beginning/analyzer_report.txt
+    max-size: 10MB
+    max-history: 1
+  level:
+    ru.practicum.repository: TRACE
+    ru.practicum.service: DEBUG
+    org.hibernate.SQL: DEBUG
+    org.hibernate.type.descriptor.sql.BasicBinder: TRACE
+
+logging.level:
+  org.springframework.orm.jpa: INFO
+  org.springframework.transaction: INFO
+  ru.practicum: DEBUG
+---
+
+spring:
+  config:
+    activate:
+      on-profile: dev
+  datasource:
+    driver-class-name: org.h2.Driver
+    url: jdbc:h2:mem:analyzer
+    username: stats
+    password: stats
diff --git a/infra/config-server/src/main/resources/config/stats/collector/application.yml b/infra/config-server/src/main/resources/config/stats/collector/application.yml
new file mode 100644
index 0000000..1445b4e
--- /dev/null
+++ b/infra/config-server/src/main/resources/config/stats/collector/application.yml
@@ -0,0 +1,18 @@
+logging:
+  level:
+    ru.practicum: debug
+    root: info
+
+grpc:
+  server:
+    port: 0
+
+server:
+  port: 8888
+
+kafka:
+  user-action-topic: stats.user-actions.v1
+  bootstrap-servers: localhost:9092
+  client-id-config: collector-client
+  producer-key-serializer: org.apache.kafka.common.serialization.LongSerializer
+  producer-value-serializer: ru.practicum.serializer.UserActionsAvroSerializer
\ No newline at end of file
diff --git a/pom.xml b/pom.xml
index 599095a..58eecfe 100644
--- a/pom.xml
+++ b/pom.xml
@@ -6,7 +6,7 @@
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-parent</artifactId>
-        <version>3.4.0</version>
+        <version>3.3.0</version>
@@ -15,7 +15,7 @@
         <module>core</module>
         <module>infra</module>
         <module>stats</module>
-
+
     <groupId>ru.practicum</groupId>
     <artifactId>explore-with-me</artifactId>
@@ -25,7 +25,14 @@
21
UTF-8
- 2024.0.0
+ 2023.0.3
+ 1.12.0
+ 3.25.1
+ 1.63.0
+ ${avro.version}
+ 2.4.0
+ 3.11.0
+ 3.1.0.RELEASE
@@ -37,6 +44,29 @@
                 <type>pom</type>
                 <scope>import</scope>
             </dependency>
+            <dependency>
+                <groupId>net.devh</groupId>
+                <artifactId>grpc-spring-boot-starter</artifactId>
+                <version>${grpc-spring-boot-starter.version}</version>
+            </dependency>
+
+            <dependency>
+                <groupId>net.devh</groupId>
+                <artifactId>grpc-server-spring-boot-starter</artifactId>
+                <version>${grpc-spring-boot-starter.version}</version>
+            </dependency>
+
+            <dependency>
+                <groupId>io.grpc</groupId>
+                <artifactId>grpc-stub</artifactId>
+                <version>${grpc.version}</version>
+            </dependency>
+
+            <dependency>
+                <groupId>io.grpc</groupId>
+                <artifactId>grpc-protobuf</artifactId>
+                <version>${grpc.version}</version>
+            </dependency>
diff --git a/stats/stats-server/Dockerfile b/stats/aggregator/Dockerfile
similarity index 100%
rename from stats/stats-server/Dockerfile
rename to stats/aggregator/Dockerfile
diff --git a/stats/aggregator/pom.xml b/stats/aggregator/pom.xml
new file mode 100644
index 0000000..aec55c0
--- /dev/null
+++ b/stats/aggregator/pom.xml
@@ -0,0 +1,69 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+
+    <parent>
+        <groupId>ru.practicum</groupId>
+        <artifactId>stats</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+    </parent>
+
+    <artifactId>aggregator</artifactId>
+
+    <properties>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+    </properties>
+
+    <dependencies>
+        <dependency>
+            <groupId>ru.practicum</groupId>
+            <artifactId>avro-schemas</artifactId>
+            <version>0.0.1-SNAPSHOT</version>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.projectlombok</groupId>
+            <artifactId>lombok</artifactId>
+            <optional>true</optional>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-config</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.retry</groupId>
+            <artifactId>spring-retry</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-actuator</artifactId>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-maven-plugin</artifactId>
+                <configuration>
+                    <excludes>
+                        <exclude>
+                            <groupId>org.projectlombok</groupId>
+                            <artifactId>lombok</artifactId>
+                        </exclude>
+                    </excludes>
+                </configuration>
+            </plugin>
+        </plugins>
+    </build>
+</project>
\ No newline at end of file
diff --git a/stats/aggregator/src/main/java/ru/practicum/AggregatorApplication.java b/stats/aggregator/src/main/java/ru/practicum/AggregatorApplication.java
new file mode 100644
index 0000000..87a9504
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/AggregatorApplication.java
@@ -0,0 +1,16 @@
+package ru.practicum;
+
+import org.springframework.boot.SpringApplication;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.springframework.context.ConfigurableApplicationContext;
+import ru.practicum.service.AggregationStarter;
+
+@SpringBootApplication
+public class AggregatorApplication {
+    public static void main(String[] args) {
+        ConfigurableApplicationContext context = SpringApplication.run(AggregatorApplication.class, args);
+
+        AggregationStarter aggregator = context.getBean(AggregationStarter.class);
+        aggregator.start();
+    }
+}
diff --git a/stats/aggregator/src/main/java/ru/practicum/config/KafkaConfig.java b/stats/aggregator/src/main/java/ru/practicum/config/KafkaConfig.java
new file mode 100644
index 0000000..7ad4423
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/config/KafkaConfig.java
@@ -0,0 +1,51 @@
+package ru.practicum.config;
+
+import lombok.Getter;
+import lombok.extern.slf4j.Slf4j;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.clients.consumer.ConsumerConfig;
+import org.apache.kafka.clients.consumer.KafkaConsumer;
+import org.apache.kafka.clients.producer.KafkaProducer;
+import org.apache.kafka.clients.producer.Producer;
+import org.apache.kafka.clients.producer.ProducerConfig;
+import org.springframework.boot.context.properties.EnableConfigurationProperties;
+import org.springframework.context.annotation.Bean;
+import org.springframework.context.annotation.Configuration;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+
+import java.util.Properties;
+
+@Slf4j
+@Getter
+@Configuration
+@EnableConfigurationProperties({KafkaConfigProperties.class})
+public class KafkaConfig {
+    private final KafkaConfigProperties kafkaProperties;
+
+    public KafkaConfig(KafkaConfigProperties properties) {
+        this.kafkaProperties = properties;
+    }
+
+    @Bean
+    public Producer<Long, SpecificRecordBase> producer() {
+        Properties properties = new Properties();
+        properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaProperties.getBootstrapServers());
+        properties.put(ProducerConfig.CLIENT_ID_CONFIG, kafkaProperties.getProducerClientIdConfig());
+        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, kafkaProperties.getProducerKeySerializer());
+        properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, kafkaProperties.getProducerValueSerializer());
+        log.info("properties for producer are: {}", properties);
+        return new KafkaProducer<>(properties);
+    }
+
+    @Bean
+    public KafkaConsumer<Long, UserActionAvro> consumer() {
+        Properties props = new Properties();
+        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaProperties.getBootstrapServers());
+        props.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaProperties.getConsumerGroupId());
+        props.put(ConsumerConfig.CLIENT_ID_CONFIG, kafkaProperties.getConsumerClientIdConfig());
+        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, kafkaProperties.getConsumerKeyDeserializer());
+        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, kafkaProperties.getConsumerValueDeserializer());
+        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, kafkaProperties.getConsumerEnableAutoCommit());
+        return new KafkaConsumer<>(props);
+    }
+}
\ No newline at end of file
diff --git a/stats/aggregator/src/main/java/ru/practicum/config/KafkaConfigProperties.java b/stats/aggregator/src/main/java/ru/practicum/config/KafkaConfigProperties.java
new file mode 100644
index 0000000..f436052
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/config/KafkaConfigProperties.java
@@ -0,0 +1,26 @@
+package ru.practicum.config;
+
+import lombok.AccessLevel;
+import lombok.Getter;
+import lombok.Setter;
+import lombok.experimental.FieldDefaults;
+import org.springframework.boot.context.properties.ConfigurationProperties;
+
+@Getter
+@Setter
+@ConfigurationProperties(prefix = "kafka")
+@FieldDefaults(level = AccessLevel.PRIVATE)
+public class KafkaConfigProperties {
+    String bootstrapServers;
+    String producerClientIdConfig;
+    String producerKeySerializer;
+    String producerValueSerializer;
+    String consumerGroupId;
+    String consumerClientIdConfig;
+    String consumerKeyDeserializer;
+    String consumerValueDeserializer;
+    long consumerAttemptTimeout;
+    String consumerEnableAutoCommit;
+    String userActionTopic;
+    String eventsSimilarityTopic;
+}
\ No newline at end of file
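Note: the aggregator's `application.yml` spells these keys in camelCase (`kafka.bootstrapServers`) while the collector's uses kebab-case (`kafka.bootstrap-servers`); Spring Boot's relaxed binding maps both spellings onto the same `KafkaConfigProperties` fields. A simplified, stdlib-only sketch of the canonicalization idea (an illustration only, not Spring's actual implementation):

```java
// Sketch: relaxed binding canonicalizes property names, so "bootstrapServers"
// and "bootstrap-servers" resolve to the same field. Simplified: real Spring
// relaxed binding also handles underscores and upper-case environment variables.
public class RelaxedNames {
    static String toKebab(String camel) {
        StringBuilder sb = new StringBuilder();
        for (char c : camel.toCharArray()) {
            if (Character.isUpperCase(c)) {
                sb.append('-').append(Character.toLowerCase(c)); // camel hump -> "-x"
            } else {
                sb.append(c);
            }
        }
        return sb.toString();
    }
}
```

This is why the two stats services can use different key styles in the files served by `config-server` and still bind the same configuration class.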
diff --git a/stats/aggregator/src/main/java/ru/practicum/deserializer/BaseAvroDeserializer.java b/stats/aggregator/src/main/java/ru/practicum/deserializer/BaseAvroDeserializer.java
new file mode 100644
index 0000000..f4a6ca7
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/deserializer/BaseAvroDeserializer.java
@@ -0,0 +1,42 @@
+package ru.practicum.deserializer;
+
+import org.apache.avro.Schema;
+import org.apache.avro.io.BinaryDecoder;
+import org.apache.avro.io.DatumReader;
+import org.apache.avro.io.DecoderFactory;
+import org.apache.avro.specific.SpecificDatumReader;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.common.serialization.Deserializer;
+
+public class BaseAvroDeserializer<T extends SpecificRecordBase> implements Deserializer<T> {
+
+    private final DecoderFactory decoderFactory;
+    private final DatumReader<T> reader;
+
+    public BaseAvroDeserializer(Schema schema) {
+        this(DecoderFactory.get(), schema);
+    }
+
+    public BaseAvroDeserializer(DecoderFactory decoderFactory, Schema schema) {
+        this.decoderFactory = decoderFactory;
+        this.reader = new SpecificDatumReader<>(schema);
+    }
+
+    @Override
+    public T deserialize(String topic, byte[] data) {
+        try {
+            if (data != null) {
+                BinaryDecoder decoder = decoderFactory.binaryDecoder(data, null);
+                return reader.read(null, decoder);
+            }
+            return null;
+        } catch (Exception e) {
+            throw new RuntimeException("Data deserialization error from topic [" + topic + "]", e);
+        }
+    }
+}
\ No newline at end of file
diff --git a/stats/aggregator/src/main/java/ru/practicum/deserializer/UserActionDeserializer.java b/stats/aggregator/src/main/java/ru/practicum/deserializer/UserActionDeserializer.java
new file mode 100644
index 0000000..957ca3d
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/deserializer/UserActionDeserializer.java
@@ -0,0 +1,11 @@
+package ru.practicum.deserializer;
+
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+
+public class UserActionDeserializer extends BaseAvroDeserializer<UserActionAvro> {
+    public UserActionDeserializer() {
+        super(UserActionAvro.getClassSchema());
+    }
+}
\ No newline at end of file
diff --git a/stats/aggregator/src/main/java/ru/practicum/serializer/AvroSerializer.java b/stats/aggregator/src/main/java/ru/practicum/serializer/AvroSerializer.java
new file mode 100644
index 0000000..118cc0d
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/serializer/AvroSerializer.java
@@ -0,0 +1,36 @@
+package ru.practicum.serializer;
+
+import lombok.extern.slf4j.Slf4j;
+import org.apache.avro.io.BinaryEncoder;
+import org.apache.avro.io.DatumWriter;
+import org.apache.avro.io.EncoderFactory;
+import org.apache.avro.specific.SpecificDatumWriter;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.common.errors.SerializationException;
+import org.apache.kafka.common.serialization.Serializer;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+
+@Slf4j
+public class AvroSerializer implements Serializer<SpecificRecordBase> {
+    private final EncoderFactory encoderFactory = EncoderFactory.get();
+    private BinaryEncoder encoder;
+
+    @Override
+    public byte[] serialize(String topic, SpecificRecordBase data) {
+        try (ByteArrayOutputStream out = new ByteArrayOutputStream()) {
+            byte[] result = null;
+            encoder = encoderFactory.binaryEncoder(out, encoder);
+            if (data != null) {
+                DatumWriter<SpecificRecordBase> writer = new SpecificDatumWriter<>(data.getSchema());
+                writer.write(data, encoder);
+                encoder.flush();
+                result = out.toByteArray();
+            }
+            return result;
+        } catch (IOException ex) {
+            throw new SerializationException("Data serialization error for topic [" + topic + "]", ex);
+        }
+    }
+}
\ No newline at end of file
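At the byte level, the `BinaryEncoder` that `AvroSerializer` delegates to writes numeric fields (such as event ids) as zig-zag varints, per the Avro binary encoding specification. A standalone sketch of that primitive (`ZigZagVarint` is a hypothetical helper for illustration, not part of the project):

```java
import java.io.ByteArrayOutputStream;

// Sketch of Avro's zig-zag varint encoding of a long, following the Avro
// binary encoding spec: zig-zag maps small magnitudes (positive or negative)
// to small unsigned codes, which are then emitted 7 bits at a time.
public class ZigZagVarint {
    static byte[] encodeLong(long n) {
        long z = (n << 1) ^ (n >> 63);            // zig-zag transform
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        while ((z & ~0x7FL) != 0) {
            out.write((int) ((z & 0x7F) | 0x80)); // low 7 bits + continuation flag
            z >>>= 7;
        }
        out.write((int) z);                       // final byte, no continuation flag
        return out.toByteArray();
    }
}
```

So an id of 1 occupies a single byte on the wire, which is part of why Avro payloads stay compact for these small records.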
diff --git a/stats/aggregator/src/main/java/ru/practicum/service/AggregationStarter.java b/stats/aggregator/src/main/java/ru/practicum/service/AggregationStarter.java
new file mode 100644
index 0000000..132f44c
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/service/AggregationStarter.java
@@ -0,0 +1,87 @@
+package ru.practicum.service;
+
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import org.apache.kafka.clients.consumer.*;
+import org.apache.kafka.common.TopicPartition;
+import org.apache.kafka.common.errors.WakeupException;
+import org.springframework.stereotype.Component;
+import ru.practicum.config.KafkaConfig;
+import ru.practicum.ewm.stats.avro.EventSimilarityAvro;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+
+import java.time.Duration;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Slf4j
+@Component
+@RequiredArgsConstructor
+public class AggregationStarter {
+ private final AggregatorService aggregatorService;
+ private final Consumer consumer;
+ private final KafkaConfig kafkaConfig;
+ private final Map currentOffsets = new HashMap<>();
+
+
+
+ private void manageOffsets(ConsumerRecord consumerRecord, int count, Consumer consumer) {
+ currentOffsets.put(
+ new TopicPartition(consumerRecord.topic(), consumerRecord.partition()),
+ new OffsetAndMetadata(consumerRecord.offset() + 1)
+ );
+
+ if (count % 10 == 0) {
+ consumer.commitAsync(currentOffsets, (offsets, exception) -> {
+ if (exception != null) {
+ log.warn("Ошибка во время фиксации оффсетов: {}", offsets, exception);
+ }
+ });
+ }
+ }
+
+ public void start() {
+ Runtime.getRuntime().addShutdownHook(new Thread(consumer::wakeup));
+
+ try {
+ consumer.subscribe(List.of(kafkaConfig.getKafkaProperties().getUserActionTopic()));
+ while (true) {
+ ConsumerRecords<Long, UserActionAvro> records = consumer
+ .poll(Duration.ofMillis(kafkaConfig.getKafkaProperties().getConsumerAttemptTimeout()));
+ int count = 0;
+ for (ConsumerRecord<Long, UserActionAvro> record : records) {
+ log.info("Received UserActionAvro from consumer: {}", record);
+ handleRecord(record);
+ manageOffsets(record, count, consumer);
+ count++;
+ }
+ }
+
+ } catch (WakeupException ignored) {
+ // wakeup() was called during shutdown; fall through to the finally block
+ } catch (Exception e) {
+ log.error("Error while processing user action events", e);
+ } finally {
+
+ try {
+ consumer.commitSync(currentOffsets);
+
+ } finally {
+ log.info("Closing the consumer");
+ consumer.close();
+ log.info("Flushing all buffered producer messages");
+ aggregatorService.flush();
+ log.info("Closing the producer");
+ aggregatorService.close();
+ }
+ }
+ }
+
+ private void handleRecord(ConsumerRecord<Long, UserActionAvro> consumerRecord) throws InterruptedException {
+ List<EventSimilarityAvro> eventSimilarityList = aggregatorService.updateSimilarity(consumerRecord.value());
+ for (EventSimilarityAvro eventSimilarity : eventSimilarityList) {
+ aggregatorService.collectEventSimilarity(eventSimilarity);
+ }
+ }
+}
\ No newline at end of file
diff --git a/stats/aggregator/src/main/java/ru/practicum/service/AggregatorService.java b/stats/aggregator/src/main/java/ru/practicum/service/AggregatorService.java
new file mode 100644
index 0000000..85e8f07
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/service/AggregatorService.java
@@ -0,0 +1,16 @@
+package ru.practicum.service;
+
+import ru.practicum.ewm.stats.avro.EventSimilarityAvro;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+
+import java.util.List;
+
+public interface AggregatorService {
+ List<EventSimilarityAvro> updateSimilarity(UserActionAvro userAction);
+
+ void collectEventSimilarity(EventSimilarityAvro eventSimilarityAvro);
+
+ default void close() {}
+
+ void flush();
+}
\ No newline at end of file
diff --git a/stats/aggregator/src/main/java/ru/practicum/service/AggregatorServiceImpl.java b/stats/aggregator/src/main/java/ru/practicum/service/AggregatorServiceImpl.java
new file mode 100644
index 0000000..1672f0a
--- /dev/null
+++ b/stats/aggregator/src/main/java/ru/practicum/service/AggregatorServiceImpl.java
@@ -0,0 +1,184 @@
+package ru.practicum.service;
+
+import lombok.AllArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.clients.producer.Producer;
+import org.apache.kafka.clients.producer.ProducerRecord;
+import org.springframework.stereotype.Service;
+import ru.practicum.config.KafkaConfig;
+import ru.practicum.ewm.stats.avro.EventSimilarityAvro;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+import ru.practicum.ewm.stats.avro.ActionTypeAvro;
+
+import java.time.Instant;
+import java.util.ArrayList;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Slf4j
+@Service
+@AllArgsConstructor
+public class AggregatorServiceImpl implements AggregatorService {
+
+ private final Producer<Long, SpecificRecordBase> producer;
+ private final KafkaConfig kafkaConfig;
+
+ private final Map<Long, Map<Long, Double>> eventUserWeights = new HashMap<>();
+ private final Map<Long, Double> eventTotalWeights = new HashMap<>();
+ private final Map<Long, Map<Long, Double>> pairMinWeights = new HashMap<>();
+
+ @Override
+ public List<EventSimilarityAvro> updateSimilarity(UserActionAvro userAction) {
+ log.info("Processing action for user {} and event {}",
+ userAction.getUserId(), userAction.getEventId());
+
+ List<EventSimilarityAvro> results = new ArrayList<>();
+ Long eventId = userAction.getEventId();
+ Long userId = userAction.getUserId();
+ double newWeight = getWeightByActionType(userAction.getActionType());
+
+ log.debug("Received weight: {} for event: {}, user: {}", newWeight, eventId, userId);
+
+ eventUserWeights.putIfAbsent(eventId, new HashMap<>());
+ double currentWeight = eventUserWeights.get(eventId).getOrDefault(userId, 0.0);
+ log.debug("Current weight: {} for event: {}, user: {}", currentWeight, eventId, userId);
+
+ if (newWeight <= currentWeight) {
+ log.debug("Weight not increased, skipping processing");
+ return results;
+ }
+
+ eventUserWeights.get(eventId).put(userId, newWeight);
+ log.debug("Updated user weight to: {} for event: {}, user: {}", newWeight, eventId, userId);
+
+ double deltaWeight = newWeight - currentWeight;
+ double newTotalWeight = eventTotalWeights.merge(eventId, deltaWeight, Double::sum);
+ log.debug("Updated total weight for event {}: {}", eventId, newTotalWeight);
+
+ for (Map.Entry<Long, Map<Long, Double>> entry : eventUserWeights.entrySet()) {
+ Long otherEventId = entry.getKey();
+
+ if (otherEventId.equals(eventId)) {
+ log.debug("Skipping same event: {}", eventId);
+ continue;
+ }
+
+ if (entry.getValue().containsKey(userId)) {
+ double otherWeight = entry.getValue().get(userId);
+ log.debug("Found interaction with event: {}, weight: {}", otherEventId, otherWeight);
+
+ long firstEvent = Math.min(eventId, otherEventId);
+ long secondEvent = Math.max(eventId, otherEventId);
+ log.debug("Processing pair: {} and {}", firstEvent, secondEvent);
+
+ double oldMin = Math.min(currentWeight, otherWeight);
+ double newMin = Math.min(newWeight, otherWeight);
+ double deltaMin = newMin - oldMin;
+ log.debug("Min weights - old: {}, new: {}, delta: {}", oldMin, newMin, deltaMin);
+
+ Map<Long, Double> secondLevelMap = pairMinWeights.computeIfAbsent(firstEvent, k -> new HashMap<>());
+ double currentSum = secondLevelMap.getOrDefault(secondEvent, 0.0);
+ double updatedSum = currentSum + deltaMin;
+ secondLevelMap.put(secondEvent, updatedSum);
+
+ log.debug("Updated min weights sum for pair ({}, {}): was {}, now {}",
+ firstEvent, secondEvent, currentSum, updatedSum);
+
+ double sumA = eventTotalWeights.get(firstEvent);
+ double sumB = eventTotalWeights.get(secondEvent);
+ log.debug("Total weights - sumA: {}, sumB: {}", sumA, sumB);
+
+ double score = calculateCosineSimilarity(sumA, sumB, updatedSum);
+ log.info("Calculated similarity score for events {} and {}: {}",
+ firstEvent, secondEvent, score);
+
+ if (score > 0) {
+ EventSimilarityAvro similarity = createSimilarityAvro(firstEvent, secondEvent, score);
+ results.add(similarity);
+ log.debug("Created similarity record: {}", similarity);
+ }
+ }
+ }
+
+ return results;
+ }
+
+ private double calculateCosineSimilarity(double sumA, double sumB, double sumMin) {
+ if (sumA <= 0 || sumB <= 0 || sumMin <= 0) {
+ log.debug("Invalid input for similarity calculation - sumA: {}, sumB: {}, sumMin: {}",
+ sumA, sumB, sumMin);
+ return 0;
+ }
+
+ double sqrtA = Math.sqrt(sumA);
+ double sqrtB = Math.sqrt(sumB);
+ double denominator = sqrtA * sqrtB;
+
+ if (denominator == 0) {
+ log.debug("Denominator is zero - sumA: {}, sumB: {}", sumA, sumB);
+ return 0;
+ }
+
+ double score = sumMin / denominator;
+ double roundedScore = Math.round(score * 100000.0) / 100000.0;
+ return roundedScore;
+ }
+
+ private EventSimilarityAvro createSimilarityAvro(long eventA, long eventB, double score) {
+ return EventSimilarityAvro.newBuilder()
+ .setEventA(eventA)
+ .setEventB(eventB)
+ .setScore((float) score)
+ .setTimestamp(Instant.now())
+ .build();
+ }
+
+ private double getWeightByActionType(ActionTypeAvro actionType) {
+ return switch (actionType) {
+ case VIEW -> 0.4;
+ case REGISTER -> 0.8;
+ case LIKE -> 1.0;
+ };
+ }
+
+ @Override
+ public void collectEventSimilarity(EventSimilarityAvro eventSimilarityAvro) {
+ try {
+ ProducerRecord<Long, SpecificRecordBase> record = new ProducerRecord<>(
+ kafkaConfig.getKafkaProperties().getEventsSimilarityTopic(),
+ eventSimilarityAvro.getEventA(),
+ eventSimilarityAvro);
+ producer.send(record);
+ } catch (Exception e) {
+ log.error("Error sending to Kafka", e);
+ }
+ }
+
+ @Override
+ public void flush() {
+ if (producer != null) {
+ producer.flush();
+ }
+ }
+
+ @Override
+ public void close() {
+ try {
+ if (producer != null) {
+ producer.flush();
+ }
+ } finally {
+ if (producer != null) {
+ producer.close();
+ }
+ }
+ }
+
+ public void resetState() {
+ eventUserWeights.clear();
+ eventTotalWeights.clear();
+ pairMinWeights.clear();
+ }
+}
\ No newline at end of file
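
The `AggregatorServiceImpl` above computes the similarity of two events as `S_min(A,B) / (sqrt(S_A) * sqrt(S_B))`, where `S_X` is the sum of user weights for event X and `S_min` sums, per shared user, the minimum of that user's weights for the two events. A minimal standalone sketch (not project code; the example weights are illustrative) of that formula:

```java
// Standalone sketch of the cosine-similarity formula from AggregatorServiceImpl.
public class SimilaritySketch {

    public static double cosineSimilarity(double sumA, double sumB, double sumMin) {
        if (sumA <= 0 || sumB <= 0 || sumMin <= 0) {
            return 0;
        }
        double score = sumMin / (Math.sqrt(sumA) * Math.sqrt(sumB));
        // Round to five decimal places, as the service does.
        return Math.round(score * 100000.0) / 100000.0;
    }

    public static void main(String[] args) {
        // Event A: user 1 viewed (0.4), user 2 registered (0.8) -> S_A = 1.2
        // Event B: user 1 liked (1.0)                           -> S_B = 1.0
        // Only user 1 interacted with both: min(0.4, 1.0)       -> S_min = 0.4
        System.out.println(cosineSimilarity(1.2, 1.0, 0.4));
    }
}
```

Because only the minimum of the two weights enters the numerator, a user who barely interacted with one of the events contributes little, even if they engaged heavily with the other.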
diff --git a/stats/aggregator/src/main/resources/application.yml b/stats/aggregator/src/main/resources/application.yml
new file mode 100644
index 0000000..f0fb364
--- /dev/null
+++ b/stats/aggregator/src/main/resources/application.yml
@@ -0,0 +1,21 @@
+spring:
+ application:
+ name: aggregator
+ config:
+ import: 'configserver:'
+ cloud:
+ config:
+ discovery:
+ service-id: config-server
+ enabled: true
+ fail-fast: true
+ retry:
+ use-random-policy: true
+ max-interval: 10000
+eureka:
+ instance:
+ prefer-ip-address: true
+ client:
+ service-url:
+ defaultZone: http://localhost:8761/eureka
+ register-with-eureka: true
\ No newline at end of file
diff --git a/stats/analyzer/Dockerfile b/stats/analyzer/Dockerfile
new file mode 100644
index 0000000..0ff1817
--- /dev/null
+++ b/stats/analyzer/Dockerfile
@@ -0,0 +1,5 @@
+FROM eclipse-temurin:21-jre-jammy
+VOLUME /tmp
+ARG JAR_FILE=target/*.jar
+COPY ${JAR_FILE} app.jar
+ENTRYPOINT ["sh", "-c", "java ${JAVA_OPTS} -jar /app.jar"]
\ No newline at end of file
diff --git a/stats/stats-server/pom.xml b/stats/analyzer/pom.xml
similarity index 72%
rename from stats/stats-server/pom.xml
rename to stats/analyzer/pom.xml
index d126e6a..10747be 100644
--- a/stats/stats-server/pom.xml
+++ b/stats/analyzer/pom.xml
@@ -3,96 +3,79 @@
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
     <modelVersion>4.0.0</modelVersion>
-
     <parent>
         <groupId>ru.practicum</groupId>
         <artifactId>stats</artifactId>
         <version>0.0.1-SNAPSHOT</version>
     </parent>
-    <artifactId>stats-server</artifactId>
+    <artifactId>analyzer</artifactId>

     <properties>
-        <maven.compiler.source>17</maven.compiler.source>
-        <maven.compiler.target>17</maven.compiler.target>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
     </properties>

     <dependencies>
+        <dependency>
+            <groupId>ru.practicum</groupId>
+            <artifactId>avro-schemas</artifactId>
+            <version>0.0.1-SNAPSHOT</version>
+        </dependency>
+        <dependency>
+            <groupId>ru.practicum</groupId>
+            <artifactId>proto-schemas</artifactId>
+            <version>0.0.1-SNAPSHOT</version>
+        </dependency>
         <dependency>
             <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-web</artifactId>
+            <artifactId>spring-boot-starter</artifactId>
         </dependency>
-
         <dependency>
             <groupId>org.springframework.boot</groupId>
             <artifactId>spring-boot-starter-data-jpa</artifactId>
         </dependency>
-
         <dependency>
             <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-actuator</artifactId>
+            <artifactId>spring-boot-starter-validation</artifactId>
         </dependency>
-
         <dependency>
             <groupId>org.postgresql</groupId>
             <artifactId>postgresql</artifactId>
             <scope>runtime</scope>
         </dependency>
-
         <dependency>
             <groupId>com.h2database</groupId>
             <artifactId>h2</artifactId>
             <scope>runtime</scope>
         </dependency>
-
-        <dependency>
-            <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-configuration-processor</artifactId>
-            <optional>true</optional>
-        </dependency>
-
-        <dependency>
-            <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-test</artifactId>
-            <scope>test</scope>
-        </dependency>
-
-        <dependency>
-            <groupId>ru.practicum</groupId>
-            <artifactId>stats-dto</artifactId>
-            <version>0.0.1-SNAPSHOT</version>
-            <scope>compile</scope>
-        </dependency>
-
         <dependency>
             <groupId>org.projectlombok</groupId>
             <artifactId>lombok</artifactId>
             <optional>true</optional>
         </dependency>
         <dependency>
-            <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-commons</artifactId>
+            <groupId>net.devh</groupId>
+            <artifactId>grpc-server-spring-boot-starter</artifactId>
+            <version>${grpc-spring-boot-starter.version}</version>
         </dependency>
-
-        <dependency>
-            <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
-        </dependency>
-
         <dependency>
             <groupId>org.springframework.cloud</groupId>
             <artifactId>spring-cloud-starter-config</artifactId>
         </dependency>
-
         <dependency>
             <groupId>org.springframework.retry</groupId>
             <artifactId>spring-retry</artifactId>
         </dependency>
         <dependency>
             <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-openfeign-core</artifactId>
+            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-actuator</artifactId>
         </dependency>
-
@@ -100,8 +83,15 @@
             <plugin>
                 <groupId>org.springframework.boot</groupId>
                 <artifactId>spring-boot-maven-plugin</artifactId>
+                <configuration>
+                    <excludes>
+                        <exclude>
+                            <groupId>org.projectlombok</groupId>
+                            <artifactId>lombok</artifactId>
+                        </exclude>
+                    </excludes>
+                </configuration>
-
             </plugin>
         </plugins>
     </build>
 </project>
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/AnalyzerApplication.java b/stats/analyzer/src/main/java/ru/practicum/AnalyzerApplication.java
new file mode 100644
index 0000000..71939da
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/AnalyzerApplication.java
@@ -0,0 +1,27 @@
+package ru.practicum;
+
+import org.springframework.boot.SpringApplication;
+import org.springframework.boot.autoconfigure.SpringBootApplication;
+import org.springframework.cloud.client.discovery.EnableDiscoveryClient;
+import org.springframework.context.ConfigurableApplicationContext;
+import ru.practicum.processor.EventSimilarityProcessor;
+import ru.practicum.processor.UserActionEventProcessor;
+
+@EnableDiscoveryClient
+@SpringBootApplication
+public class AnalyzerApplication {
+ public static void main(String[] args) {
+ ConfigurableApplicationContext context = SpringApplication.run(AnalyzerApplication.class, args);
+
+ final UserActionEventProcessor userActionProcessor =
+ context.getBean(UserActionEventProcessor.class);
+ final EventSimilarityProcessor eventSimilarityProcessor =
+ context.getBean(EventSimilarityProcessor.class);
+
+ Thread userActionThread = new Thread(userActionProcessor);
+ userActionThread.setName("UserActionHandlerThread");
+ userActionThread.start();
+
+ eventSimilarityProcessor.run();
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/config/ConsumerProperties.java b/stats/analyzer/src/main/java/ru/practicum/config/ConsumerProperties.java
new file mode 100644
index 0000000..e4cace7
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/config/ConsumerProperties.java
@@ -0,0 +1,20 @@
+package ru.practicum.config;
+
+import lombok.AccessLevel;
+import lombok.Getter;
+import lombok.NoArgsConstructor;
+import lombok.Setter;
+import lombok.experimental.FieldDefaults;
+
+@Getter
+@Setter
+@NoArgsConstructor
+@FieldDefaults(level = AccessLevel.PRIVATE)
+public class ConsumerProperties {
+ String groupId;
+ String clientId;
+ String keyDeserializer;
+ String valueDeserializer;
+ long attemptTimeout;
+ String enableAutoCommit;
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/config/KafkaConfig.java b/stats/analyzer/src/main/java/ru/practicum/config/KafkaConfig.java
new file mode 100644
index 0000000..61f9085
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/config/KafkaConfig.java
@@ -0,0 +1,54 @@
+package ru.practicum.config;
+
+import lombok.Getter;
+import org.apache.kafka.clients.consumer.ConsumerConfig;
+import org.apache.kafka.clients.consumer.KafkaConsumer;
+import org.springframework.boot.context.properties.EnableConfigurationProperties;
+import org.springframework.context.annotation.Bean;
+import org.springframework.context.annotation.Configuration;
+import ru.practicum.ewm.stats.avro.EventSimilarityAvro;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+
+import java.util.Properties;
+
+@Getter
+@Configuration
+@EnableConfigurationProperties({KafkaConfigProperties.class})
+public class KafkaConfig {
+ private final KafkaConfigProperties kafkaProperties;
+
+ public KafkaConfig(KafkaConfigProperties properties) {
+ this.kafkaProperties = properties;
+ }
+
+ @Bean
+ public KafkaConsumer<Long, EventSimilarityAvro> getEventSimilarityConsumer() {
+ Properties props = new Properties();
+ props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaProperties.getBootstrapServers());
+ props.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaProperties.getEventSimilarityConsumer().getGroupId());
+ props.put(ConsumerConfig.CLIENT_ID_CONFIG, kafkaProperties.getEventSimilarityConsumer().getClientId());
+ props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
+ kafkaProperties.getEventSimilarityConsumer().getKeyDeserializer());
+ props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
+ kafkaProperties.getEventSimilarityConsumer().getValueDeserializer());
+ props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG,
+ kafkaProperties.getEventSimilarityConsumer().getEnableAutoCommit());
+
+ return new KafkaConsumer<>(props);
+ }
+
+ @Bean
+ public KafkaConsumer<Long, UserActionAvro> getUserActionConsumer() {
+ Properties props = new Properties();
+ props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaProperties.getBootstrapServers());
+ props.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaProperties.getUserActionConsumer().getGroupId());
+ props.put(ConsumerConfig.CLIENT_ID_CONFIG, kafkaProperties.getUserActionConsumer().getClientId());
+ props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
+ kafkaProperties.getUserActionConsumer().getKeyDeserializer());
+ props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
+ kafkaProperties.getUserActionConsumer().getValueDeserializer());
+ props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG,
+ kafkaProperties.getUserActionConsumer().getEnableAutoCommit());
+ return new KafkaConsumer<>(props);
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/config/KafkaConfigProperties.java b/stats/analyzer/src/main/java/ru/practicum/config/KafkaConfigProperties.java
new file mode 100644
index 0000000..f04b42f
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/config/KafkaConfigProperties.java
@@ -0,0 +1,19 @@
+package ru.practicum.config;
+
+import lombok.AccessLevel;
+import lombok.Getter;
+import lombok.Setter;
+import lombok.experimental.FieldDefaults;
+import org.springframework.boot.context.properties.ConfigurationProperties;
+
+@Getter
+@Setter
+@ConfigurationProperties(prefix = "kafka")
+@FieldDefaults(level = AccessLevel.PRIVATE)
+public class KafkaConfigProperties {
+ String bootstrapServers;
+ ConsumerProperties userActionConsumer;
+ ConsumerProperties eventSimilarityConsumer;
+ String userActionTopic;
+ String eventsSimilarityTopic;
+}
\ No newline at end of file
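
`KafkaConfigProperties` above binds everything under the `kafka` prefix, with the two nested `ConsumerProperties` blocks mapped by relaxed binding. A sketch of the YAML this shape expects — all broker addresses, topic names, group/client ids, and timeouts here are illustrative placeholders, not values from the project's config server:

```yaml
kafka:
  bootstrap-servers: localhost:9092            # illustrative
  user-action-topic: stats.user-actions.v1     # illustrative topic names
  events-similarity-topic: stats.events-similarity.v1
  user-action-consumer:
    group-id: analyzer-user-actions
    client-id: analyzer-user-action-consumer
    key-deserializer: org.apache.kafka.common.serialization.LongDeserializer
    value-deserializer: ru.practicum.deserializer.UserActionDeserializer
    attempt-timeout: 1000
    enable-auto-commit: "false"
  event-similarity-consumer:
    group-id: analyzer-events-similarity
    client-id: analyzer-event-similarity-consumer
    key-deserializer: org.apache.kafka.common.serialization.LongDeserializer
    value-deserializer: ru.practicum.deserializer.EventSimilarityDeserializer
    attempt-timeout: 1000
    enable-auto-commit: "false"
```

With `enable-auto-commit: "false"`, the processors below are responsible for committing offsets themselves, which is why they maintain `currentOffsets` maps and call `commitAsync`/`commitSync` manually.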
diff --git a/stats/analyzer/src/main/java/ru/practicum/controller/RecommendationController.java b/stats/analyzer/src/main/java/ru/practicum/controller/RecommendationController.java
new file mode 100644
index 0000000..1867f7a
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/controller/RecommendationController.java
@@ -0,0 +1,75 @@
+package ru.practicum.controller;
+
+import io.grpc.stub.StreamObserver;
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import net.devh.boot.grpc.server.service.GrpcService;
+import ru.practicum.grpc.stat.dashboard.RecommendationsControllerGrpc;
+import ru.practicum.grpc.stat.request.InteractionsCountRequestProto;
+import ru.practicum.grpc.stat.request.RecommendedEventProto;
+import ru.practicum.grpc.stat.request.SimilarEventsRequestProto;
+import ru.practicum.grpc.stat.request.UserPredictionsRequestProto;
+import ru.practicum.service.RecommendationService;
+
+import java.util.List;
+
+@GrpcService
+@Slf4j
+@RequiredArgsConstructor
+public class RecommendationController extends RecommendationsControllerGrpc.RecommendationsControllerImplBase {
+ private final RecommendationService recommendationService;
+
+ @Override
+ public void getRecommendationsForUser(UserPredictionsRequestProto request,
+ StreamObserver<RecommendedEventProto> responseObserver) {
+ log.info("Received recommendations request for user: {}", request);
+ try {
+ List<RecommendedEventProto> recommendedEvents = recommendationService.generateRecommendationsForUser(request);
+ recommendedEvents.forEach(responseObserver::onNext);
+ responseObserver.onCompleted();
+ log.info("Successfully generated recommendations for the user");
+ } catch (Exception e) {
+ log.error("Error generating recommendations for user: {}", request, e);
+ responseObserver.onError(io.grpc.Status.INTERNAL
+ .withDescription("Server error while getting recommendations: " + e.getMessage())
+ .withCause(e)
+ .asRuntimeException());
+ }
+ }
+
+ @Override
+ public void getSimilarEvents(SimilarEventsRequestProto request,
+ StreamObserver<RecommendedEventProto> responseObserver) {
+ log.info("Received similar events request: {}", request);
+ try {
+ List<RecommendedEventProto> similarEvents = recommendationService.getSimilarEvents(request);
+ similarEvents.forEach(responseObserver::onNext);
+ responseObserver.onCompleted();
+ log.info("Successfully found similar events");
+ } catch (Exception e) {
+ log.error("Error while searching for similar events: {}", request, e);
+ responseObserver.onError(io.grpc.Status.INTERNAL
+ .withDescription("Server error while searching for similar events: " + e.getMessage())
+ .withCause(e)
+ .asRuntimeException());
+ }
+ }
+
+ @Override
+ public void getInteractionsCount(InteractionsCountRequestProto request,
+ StreamObserver<RecommendedEventProto> responseObserver) {
+ log.info("Received interactions count request: {}", request);
+ try {
+ List<RecommendedEventProto> interactions = recommendationService.getInteractionsCount(request);
+ interactions.forEach(responseObserver::onNext);
+ responseObserver.onCompleted();
+ log.info("Successfully retrieved interactions count");
+ } catch (Exception e) {
+ log.error("Error while getting interactions count: {}", request, e);
+ responseObserver.onError(io.grpc.Status.INTERNAL
+ .withDescription("Server error while getting interactions count: " + e.getMessage())
+ .withCause(e)
+ .asRuntimeException());
+ }
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/deserializer/BaseAvroDeserializer.java b/stats/analyzer/src/main/java/ru/practicum/deserializer/BaseAvroDeserializer.java
new file mode 100644
index 0000000..52d1c4d
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/deserializer/BaseAvroDeserializer.java
@@ -0,0 +1,41 @@
+package ru.practicum.deserializer;
+
+import org.apache.avro.Schema;
+import org.apache.avro.io.BinaryDecoder;
+import org.apache.avro.io.DatumReader;
+import org.apache.avro.io.DecoderFactory;
+import org.apache.avro.specific.SpecificDatumReader;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.common.serialization.Deserializer;
+
+public class BaseAvroDeserializer<T extends SpecificRecordBase> implements Deserializer<T> {
+
+ private final DecoderFactory decoderFactory;
+ private final DatumReader<T> reader;
+
+ public BaseAvroDeserializer(Schema schema) {
+ this(DecoderFactory.get(), schema);
+ }
+
+ public BaseAvroDeserializer(DecoderFactory decoderFactory, Schema schema) {
+ this.decoderFactory = decoderFactory;
+ this.reader = new SpecificDatumReader<>(schema);
+ }
+
+ @Override
+ public T deserialize(String topic, byte[] data) {
+ try {
+ if (data != null) {
+ BinaryDecoder decoder = decoderFactory.binaryDecoder(data, null);
+ return reader.read(null, decoder);
+ }
+ return null;
+ } catch (Exception e) {
+ throw new RuntimeException("Error deserializing data from topic [" + topic + "]", e);
+ }
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/deserializer/EventSimilarityDeserializer.java b/stats/analyzer/src/main/java/ru/practicum/deserializer/EventSimilarityDeserializer.java
new file mode 100644
index 0000000..68326e6
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/deserializer/EventSimilarityDeserializer.java
@@ -0,0 +1,11 @@
+package ru.practicum.deserializer;
+
+import ru.practicum.ewm.stats.avro.EventSimilarityAvro;
+
+
+public class EventSimilarityDeserializer extends BaseAvroDeserializer<EventSimilarityAvro> {
+
+ public EventSimilarityDeserializer() {
+ super(EventSimilarityAvro.getClassSchema());
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/deserializer/UserActionDeserializer.java b/stats/analyzer/src/main/java/ru/practicum/deserializer/UserActionDeserializer.java
new file mode 100644
index 0000000..6be984d
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/deserializer/UserActionDeserializer.java
@@ -0,0 +1,10 @@
+package ru.practicum.deserializer;
+
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+
+public class UserActionDeserializer extends BaseAvroDeserializer<UserActionAvro> {
+ public UserActionDeserializer() {
+ super(UserActionAvro.getClassSchema());
+ }
+
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/mapper/Mapper.java b/stats/analyzer/src/main/java/ru/practicum/mapper/Mapper.java
new file mode 100644
index 0000000..d5827b4
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/mapper/Mapper.java
@@ -0,0 +1,43 @@
+package ru.practicum.mapper;
+
+import ru.practicum.ewm.stats.avro.ActionTypeAvro;
+import ru.practicum.ewm.stats.avro.EventSimilarityAvro;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+import ru.practicum.grpc.stat.request.RecommendedEventProto;
+import ru.practicum.model.ActionType;
+import ru.practicum.model.EventSimilarity;
+import ru.practicum.model.RecommendedEvent;
+import ru.practicum.model.UserAction;
+
+public class Mapper {
+
+ public static UserAction mapToUserAction(UserActionAvro userActionAvro) {
+ return UserAction.builder()
+ .userId(userActionAvro.getUserId())
+ .eventId(userActionAvro.getEventId())
+ .actionType(toActionType(userActionAvro.getActionType()))
+ .created(userActionAvro.getTimestamp())
+ .weight(toActionType(userActionAvro.getActionType()).getWeight())
+ .build();
+ }
+
+ public static ActionType toActionType(ActionTypeAvro actionTypeAvro) {
+ return ActionType.valueOf(actionTypeAvro.name());
+ }
+
+ public static EventSimilarity mapToEventSimilarity(EventSimilarityAvro eventSimilarityAvro) {
+ return EventSimilarity.builder()
+ .aeventId(eventSimilarityAvro.getEventA())
+ .beventId(eventSimilarityAvro.getEventB())
+ .score(eventSimilarityAvro.getScore())
+ .build();
+
+ }
+
+ public static RecommendedEventProto mapToRecommendedEventProto(RecommendedEvent recommendedEvent) {
+ return RecommendedEventProto.newBuilder()
+ .setEventId(recommendedEvent.getEventId())
+ .setScore(recommendedEvent.getScore())
+ .build();
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/model/ActionType.java b/stats/analyzer/src/main/java/ru/practicum/model/ActionType.java
new file mode 100644
index 0000000..83a1a2b
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/model/ActionType.java
@@ -0,0 +1,16 @@
+package ru.practicum.model;
+
+import lombok.Getter;
+
+@Getter
+public enum ActionType {
+ VIEW(0.4),
+ REGISTER(0.8),
+ LIKE(1.0);
+
+ final double weight;
+
+ ActionType(double weight) {
+ this.weight = weight;
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/model/EventSimilarity.java b/stats/analyzer/src/main/java/ru/practicum/model/EventSimilarity.java
new file mode 100644
index 0000000..1d6025e
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/model/EventSimilarity.java
@@ -0,0 +1,48 @@
+package ru.practicum.model;
+
+import jakarta.persistence.*;
+import lombok.*;
+import lombok.experimental.FieldDefaults;
+import org.hibernate.proxy.HibernateProxy;
+
+import java.util.Objects;
+
+@Entity
+@Table(name = "event_similarity")
+@Getter
+@Setter
+@Builder(toBuilder = true)
+@FieldDefaults(level = AccessLevel.PRIVATE)
+@AllArgsConstructor
+@NoArgsConstructor
+public class EventSimilarity {
+ @Id
+ @GeneratedValue(strategy = GenerationType.IDENTITY)
+ Long id;
+
+ Long aeventId;
+
+ Long beventId;
+
+ double score;
+
+ @Override
+ public final boolean equals(Object o) {
+ if (this == o) return true;
+ if (o == null) return false;
+ Class<?> oEffectiveClass = o instanceof HibernateProxy ?
+ ((HibernateProxy) o).getHibernateLazyInitializer().getPersistentClass() : o.getClass();
+ Class<?> thisEffectiveClass = this instanceof HibernateProxy ?
+ ((HibernateProxy) this).getHibernateLazyInitializer().getPersistentClass() : this.getClass();
+ if (thisEffectiveClass != oEffectiveClass) return false;
+ EventSimilarity eventSimilarity = (EventSimilarity) o;
+ return getId() != null && Objects.equals(getId(), eventSimilarity.getId());
+ }
+
+ @Override
+ public final int hashCode() {
+ return this instanceof HibernateProxy
+ ? ((HibernateProxy) this).getHibernateLazyInitializer().getPersistentClass().hashCode()
+ : getClass().hashCode();
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/model/RecommendedEvent.java b/stats/analyzer/src/main/java/ru/practicum/model/RecommendedEvent.java
new file mode 100644
index 0000000..1b01c73
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/model/RecommendedEvent.java
@@ -0,0 +1,14 @@
+package ru.practicum.model;
+
+import lombok.*;
+import lombok.experimental.FieldDefaults;
+
+@Builder
+@Getter
+@Setter
+@ToString
+@FieldDefaults(level = AccessLevel.PRIVATE)
+public class RecommendedEvent {
+ Long eventId;
+ double score;
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/model/UserAction.java b/stats/analyzer/src/main/java/ru/practicum/model/UserAction.java
new file mode 100644
index 0000000..073a6e3
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/model/UserAction.java
@@ -0,0 +1,54 @@
+package ru.practicum.model;
+
+import jakarta.persistence.*;
+import lombok.*;
+import lombok.experimental.FieldDefaults;
+import org.hibernate.proxy.HibernateProxy;
+
+import java.time.Instant;
+import java.util.Objects;
+
+@Entity
+@Table(name = "user_action")
+@Getter
+@Setter
+@Builder(toBuilder = true)
+@FieldDefaults(level = AccessLevel.PRIVATE)
+@AllArgsConstructor
+@NoArgsConstructor
+public class UserAction {
+ @Id
+ @GeneratedValue(strategy = GenerationType.IDENTITY)
+ Long id;
+
+ Long userId;
+
+ Long eventId;
+
+ @Enumerated(EnumType.STRING)
+ ActionType actionType;
+
+ Instant created;
+
+ double weight;
+
+ @Override
+ public final boolean equals(Object o) {
+ if (this == o) return true;
+ if (o == null) return false;
+ Class<?> oEffectiveClass = o instanceof HibernateProxy ?
+ ((HibernateProxy) o).getHibernateLazyInitializer().getPersistentClass() : o.getClass();
+ Class<?> thisEffectiveClass = this instanceof HibernateProxy ?
+ ((HibernateProxy) this).getHibernateLazyInitializer().getPersistentClass() : this.getClass();
+ if (thisEffectiveClass != oEffectiveClass) return false;
+ UserAction userAction = (UserAction) o;
+ return getId() != null && Objects.equals(getId(), userAction.getId());
+ }
+
+ @Override
+ public final int hashCode() {
+ return this instanceof HibernateProxy
+ ? ((HibernateProxy) this).getHibernateLazyInitializer().getPersistentClass().hashCode()
+ : getClass().hashCode();
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/processor/EventSimilarityProcessor.java b/stats/analyzer/src/main/java/ru/practicum/processor/EventSimilarityProcessor.java
new file mode 100644
index 0000000..27cbe95
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/processor/EventSimilarityProcessor.java
@@ -0,0 +1,92 @@
+package ru.practicum.processor;
+
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import org.apache.kafka.clients.consumer.Consumer;
+import org.apache.kafka.clients.consumer.ConsumerRecord;
+import org.apache.kafka.clients.consumer.ConsumerRecords;
+import org.apache.kafka.clients.consumer.OffsetAndMetadata;
+import org.apache.kafka.common.TopicPartition;
+import org.apache.kafka.common.errors.WakeupException;
+import org.springframework.stereotype.Component;
+import ru.practicum.config.KafkaConfig;
+import ru.practicum.ewm.stats.avro.EventSimilarityAvro;
+import ru.practicum.mapper.Mapper;
+import ru.practicum.model.EventSimilarity;
+import ru.practicum.repository.EventSimilarityRepository;
+
+import java.time.Duration;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Slf4j
+@Component
+@RequiredArgsConstructor
+public class EventSimilarityProcessor implements Runnable {
+
+ private final Consumer<Long, EventSimilarityAvro> consumer;
+ private final KafkaConfig kafkaConfig;
+ private final Map<TopicPartition, OffsetAndMetadata> currentOffsets = new HashMap<>();
+ private final EventSimilarityRepository eventSimilarityRepository;
+
+ @Override
+ public void run() {
+ Runtime.getRuntime().addShutdownHook(new Thread(consumer::wakeup));
+ try {
+ consumer.subscribe(List.of(kafkaConfig.getKafkaProperties().getEventsSimilarityTopic()));
+ while (true) {
+ ConsumerRecords<Long, EventSimilarityAvro> records = consumer
+ .poll(Duration.ofMillis(kafkaConfig.getKafkaProperties()
+ .getEventSimilarityConsumer().getAttemptTimeout()));
+ int count = 0;
+ for (ConsumerRecord<Long, EventSimilarityAvro> record : records) {
+ handleRecord(record);
+ manageOffsets(record, count, consumer);
+ count++;
+ }
+ }
+
+ } catch (WakeupException ignored) {
+ // wakeup() was called during shutdown; fall through to the finally block
+ } catch (Exception e) {
+ log.error("Error while processing event similarity records", e);
+ } finally {
+
+ try {
+ consumer.commitSync(currentOffsets);
+
+ } finally {
+ log.info("Closing the consumer");
+ consumer.close();
+ }
+ }
+ }
+
+ private void handleRecord(ConsumerRecord<Long, EventSimilarityAvro> consumerRecord) throws InterruptedException {
+ log.info("handleRecord {}", consumerRecord);
+ EventSimilarity eventSimilarity = Mapper.mapToEventSimilarity(consumerRecord.value());
+
+ eventSimilarityRepository.findByAeventIdAndBeventId(
+ eventSimilarity.getAeventId(),
+ eventSimilarity.getBeventId()).ifPresent(oldEventSimilarity ->
+ eventSimilarity.setId(oldEventSimilarity.getId()));
+ eventSimilarityRepository.save(eventSimilarity);
+ }
+
+ private void manageOffsets(ConsumerRecord<Long, EventSimilarityAvro> consumerRecord,
+ int count,
+ Consumer<Long, EventSimilarityAvro> consumer) {
+ currentOffsets.put(
+ new TopicPartition(consumerRecord.topic(), consumerRecord.partition()),
+ new OffsetAndMetadata(consumerRecord.offset() + 1)
+ );
+
+ if (count % 10 == 0) {
+ consumer.commitAsync(currentOffsets, (offsets, exception) -> {
+ if (exception != null) {
+                    log.warn("Error committing offsets: {}", offsets, exception);
+ }
+ });
+ }
+ }
+}
\ No newline at end of file
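Both processors above share the same manual offset-management pattern: track the *next* offset to consume per partition, commit asynchronously every 10 records inside a batch, and commit synchronously in `finally` before closing. A broker-free sketch of that bookkeeping, with illustrative names (`OffsetTracker`, plain `String` partition keys standing in for `TopicPartition`):

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Minimal simulation of the offset bookkeeping in manageOffsets():
 * store lastProcessedOffset + 1 per partition and trigger an async
 * commit every 10th record of the current poll() batch.
 */
public class OffsetTracker {
    private final Map<String, Long> currentOffsets = new HashMap<>();
    private int asyncCommits = 0;

    /** count is the record's position inside the current poll() batch. */
    public void track(String partition, long offset, int count) {
        // Kafka expects the offset of the NEXT record to read, hence offset + 1
        currentOffsets.put(partition, offset + 1);
        if (count % 10 == 0) {
            asyncCommits++; // stands in for consumer.commitAsync(currentOffsets, callback)
        }
    }

    public long committedOffset(String partition) {
        return currentOffsets.getOrDefault(partition, 0L);
    }

    public int asyncCommitCount() {
        return asyncCommits;
    }

    public static void main(String[] args) {
        OffsetTracker tracker = new OffsetTracker();
        for (int i = 0; i < 25; i++) {
            tracker.track("events-similarity-0", i, i);
        }
        System.out.println(tracker.committedOffset("events-similarity-0")); // 25
        System.out.println(tracker.asyncCommitCount()); // commits at i = 0, 10, 20 -> 3
    }
}
```

Because records are saved before their offsets are committed, a crash between `handleRecord` and the commit replays those records on restart: at-least-once delivery, which the upsert logic in `handleRecord` tolerates.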
diff --git a/stats/analyzer/src/main/java/ru/practicum/processor/UserActionEventProcessor.java b/stats/analyzer/src/main/java/ru/practicum/processor/UserActionEventProcessor.java
new file mode 100644
index 0000000..45e4bf8
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/processor/UserActionEventProcessor.java
@@ -0,0 +1,85 @@
+package ru.practicum.processor;
+
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import org.apache.kafka.clients.consumer.Consumer;
+import org.apache.kafka.clients.consumer.ConsumerRecord;
+import org.apache.kafka.clients.consumer.ConsumerRecords;
+import org.apache.kafka.clients.consumer.OffsetAndMetadata;
+import org.apache.kafka.common.TopicPartition;
+import org.apache.kafka.common.errors.WakeupException;
+import org.springframework.stereotype.Component;
+import ru.practicum.config.KafkaConfig;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+import ru.practicum.service.RecommendationService;
+
+import java.time.Duration;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+
+@Slf4j
+@Component
+@RequiredArgsConstructor
+public class UserActionEventProcessor implements Runnable {
+
+    private final Consumer<Long, UserActionAvro> consumer;
+    private final KafkaConfig kafkaConfig;
+    private final Map<TopicPartition, OffsetAndMetadata> currentOffsets = new HashMap<>();
+ private final RecommendationService recommendationService;
+
+ @Override
+ public void run() {
+ Runtime.getRuntime().addShutdownHook(new Thread(consumer::wakeup));
+ try {
+ consumer.subscribe(List.of(kafkaConfig.getKafkaProperties().getUserActionTopic()));
+ while (true) {
+                ConsumerRecords<Long, UserActionAvro> records = consumer
+                        .poll(Duration.ofMillis(kafkaConfig.getKafkaProperties()
+                                .getUserActionConsumer().getAttemptTimeout()));
+                int count = 0;
+                for (ConsumerRecord<Long, UserActionAvro> record : records) {
+ handleRecord(record);
+ manageOffsets(record, count, consumer);
+ count++;
+ }
+ }
+
+        } catch (WakeupException ignores) {
+            // expected: thrown by consumer.wakeup() on shutdown
+        } catch (Exception e) {
+            log.error("Error while processing user action record", e);
+ } finally {
+
+ try {
+ consumer.commitSync(currentOffsets);
+
+ } finally {
+                log.info("Closing consumer");
+ consumer.close();
+ }
+ }
+ }
+
+    private void handleRecord(ConsumerRecord<Long, UserActionAvro> consumerRecord) {
+ log.info("handleRecord {}", consumerRecord);
+ recommendationService.saveUserAction(consumerRecord.value());
+ }
+
+    private void manageOffsets(ConsumerRecord<Long, UserActionAvro> consumerRecord,
+                               int count,
+                               Consumer<Long, UserActionAvro> consumer) {
+ currentOffsets.put(
+ new TopicPartition(consumerRecord.topic(), consumerRecord.partition()),
+ new OffsetAndMetadata(consumerRecord.offset() + 1)
+ );
+
+ if (count % 10 == 0) {
+ consumer.commitAsync(currentOffsets, (offsets, exception) -> {
+ if (exception != null) {
+                    log.warn("Error committing offsets: {}", offsets, exception);
+ }
+ });
+ }
+ }
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/repository/EventSimilarityRepository.java b/stats/analyzer/src/main/java/ru/practicum/repository/EventSimilarityRepository.java
new file mode 100644
index 0000000..f9e1b8c
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/repository/EventSimilarityRepository.java
@@ -0,0 +1,29 @@
+package ru.practicum.repository;
+
+import org.springframework.data.jpa.repository.JpaRepository;
+import org.springframework.data.jpa.repository.Query;
+import org.springframework.data.repository.query.Param;
+import org.springframework.stereotype.Repository;
+import ru.practicum.model.EventSimilarity;
+
+import java.util.List;
+import java.util.Optional;
+
+@Repository
+public interface EventSimilarityRepository extends JpaRepository<EventSimilarity, Long> {
+
+    Optional<EventSimilarity> findByAeventIdAndBeventId(Long aEventId, Long bEventId);
+
+    @Query("select es from EventSimilarity es where es.aeventId = :id or es.beventId = :id")
+    List<EventSimilarity> findAllByEvent(@Param("id") Long eventId);
+
+    @Query("select es from EventSimilarity es " +
+            " where (es.aeventId = :id and es.beventId in :ids) or " +
+            " (es.beventId = :id and es.aeventId in :ids) " +
+            " order by es.score desc" +
+            " limit :limit")
+    List<EventSimilarity> findAllByEventAndEventIdInLimitedTo(
+            @Param("id") Long eventId,
+            @Param("ids") List<Long> eventIds,
+            @Param("limit") Long limit);
+}
\ No newline at end of file
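`findByAeventIdAndBeventId` treats a similarity row as an *ordered* pair, so a lookup only hits if the producer always writes the two event ids in a canonical order. A sketch of that normalization, under the assumption (not shown in this hunk) that the canonical order is ascending ids:

```java
/**
 * Illustrative only: maps an unordered event pair to the canonical
 * (aeventId, beventId) row key, smaller id first, so that similarities
 * produced as (A,B) and (B,A) resolve to the same database row.
 */
public final class EventPair {
    private EventPair() { }

    /** Returns {aeventId, beventId} with the smaller id first. */
    public static long[] canonical(long eventA, long eventB) {
        return eventA <= eventB
                ? new long[]{eventA, eventB}
                : new long[]{eventB, eventA};
    }

    public static void main(String[] args) {
        long[] p1 = canonical(7, 3);
        long[] p2 = canonical(3, 7);
        // both argument orders resolve to the same row key
        System.out.println(p1[0] + "," + p1[1]); // 3,7
        System.out.println(p2[0] + "," + p2[1]); // 3,7
    }
}
```

The `UNIQUE (aevent_id, bevent_id)` constraint in `schema.sql` relies on the same invariant; without it the table could hold both (A,B) and (B,A) for one pair.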
diff --git a/stats/analyzer/src/main/java/ru/practicum/repository/UserActionRepository.java b/stats/analyzer/src/main/java/ru/practicum/repository/UserActionRepository.java
new file mode 100644
index 0000000..c6aa963
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/repository/UserActionRepository.java
@@ -0,0 +1,26 @@
+package ru.practicum.repository;
+
+import org.springframework.data.jpa.repository.JpaRepository;
+import org.springframework.data.jpa.repository.Query;
+import org.springframework.data.repository.query.Param;
+import org.springframework.stereotype.Repository;
+import ru.practicum.model.RecommendedEvent;
+import ru.practicum.model.UserAction;
+
+import java.util.List;
+import java.util.Optional;
+
+@Repository
+public interface UserActionRepository extends JpaRepository<UserAction, Long> {
+
+    Optional<UserAction> findByUserIdAndEventId(Long userId, Long eventId);
+
+    List<UserAction> findAllByUserId(Long userId);
+
+    @Query("SELECT new ru.practicum.model.RecommendedEvent(ua.eventId, sum(ua.weight)) " +
+            "FROM UserAction ua WHERE ua.eventId in :ids GROUP BY ua.eventId")
+    List<RecommendedEvent> getSumWeightForEvents(@Param("ids") List<Long> ids);
+
+    @Query("SELECT ua FROM UserAction ua WHERE ua.userId = :id ORDER BY ua.created DESC LIMIT :limit")
+    List<UserAction> findByUserIdOrderByCreatedDescLimitedTo(@Param("id") Long userId, @Param("limit") long limit);
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/service/RecommendationService.java b/stats/analyzer/src/main/java/ru/practicum/service/RecommendationService.java
new file mode 100644
index 0000000..eaf4300
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/service/RecommendationService.java
@@ -0,0 +1,20 @@
+package ru.practicum.service;
+
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+import ru.practicum.grpc.stat.request.InteractionsCountRequestProto;
+import ru.practicum.grpc.stat.request.RecommendedEventProto;
+import ru.practicum.grpc.stat.request.SimilarEventsRequestProto;
+import ru.practicum.grpc.stat.request.UserPredictionsRequestProto;
+
+import java.util.List;
+
+public interface RecommendationService {
+
+    List<RecommendedEventProto> generateRecommendationsForUser(UserPredictionsRequestProto request);
+
+    List<RecommendedEventProto> getSimilarEvents(SimilarEventsRequestProto request);
+
+    List<RecommendedEventProto> getInteractionsCount(InteractionsCountRequestProto request);
+
+ void saveUserAction(UserActionAvro userActionAvro);
+}
\ No newline at end of file
diff --git a/stats/analyzer/src/main/java/ru/practicum/service/RecommendationServiceImpl.java b/stats/analyzer/src/main/java/ru/practicum/service/RecommendationServiceImpl.java
new file mode 100644
index 0000000..cf1e02b
--- /dev/null
+++ b/stats/analyzer/src/main/java/ru/practicum/service/RecommendationServiceImpl.java
@@ -0,0 +1,163 @@
+package ru.practicum.service;
+
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import org.springframework.stereotype.Service;
+import org.springframework.transaction.annotation.Transactional;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+import ru.practicum.grpc.stat.request.InteractionsCountRequestProto;
+import ru.practicum.grpc.stat.request.RecommendedEventProto;
+import ru.practicum.grpc.stat.request.SimilarEventsRequestProto;
+import ru.practicum.grpc.stat.request.UserPredictionsRequestProto;
+import ru.practicum.model.EventSimilarity;
+import ru.practicum.model.RecommendedEvent;
+import ru.practicum.model.UserAction;
+import ru.practicum.repository.EventSimilarityRepository;
+import ru.practicum.repository.UserActionRepository;
+import ru.practicum.mapper.Mapper;
+
+import java.util.*;
+import java.util.stream.Collectors;
+
+import static java.util.Collections.emptyList;
+
+@Service
+@Slf4j
+@RequiredArgsConstructor
+@Transactional(readOnly = true)
+public class RecommendationServiceImpl implements RecommendationService {
+ private static final long EVENT_COUNT_PREDICTION = 5;
+ private final EventSimilarityRepository eventSimilarityRepository;
+ private final UserActionRepository userActionRepository;
+
+ @Override
+    public List<RecommendedEventProto> generateRecommendationsForUser(UserPredictionsRequestProto request) {
+        List<UserAction> lastUserEvents = userActionRepository.findByUserIdOrderByCreatedDescLimitedTo(
+ request.getUserId(), request.getMaxResults()
+ );
+
+ if (lastUserEvents.isEmpty()) {
+ return emptyList();
+ }
+
+        List<RecommendedEvent> recommendedEvents = new ArrayList<>();
+ lastUserEvents.forEach(event -> recommendedEvents.addAll(
+ getSimilarEvents(request.getUserId(), event.getEventId(), request.getMaxResults())
+ .stream()
+ .sorted(Comparator.comparingDouble(EventSimilarity::getScore).reversed())
+ .limit(request.getMaxResults())
+ .map(similarEvent -> genRecommendedEventFrom(similarEvent, event.getEventId()))
+ .toList()));
+
+        List<RecommendedEvent> limitRecommendedEvents = recommendedEvents.stream()
+ .sorted(Comparator.comparingDouble(RecommendedEvent::getScore).reversed())
+ .limit(request.getMaxResults())
+ .toList();
+ log.info("RecommendedEvents: {}", recommendedEvents);
+ limitRecommendedEvents.forEach(
+ event -> event.setScore(getPrediction(event.getEventId(), request.getUserId()))
+ );
+ return limitRecommendedEvents.stream()
+ .map(Mapper::mapToRecommendedEventProto)
+ .toList();
+ }
+
+ @Override
+    public List<RecommendedEventProto> getSimilarEvents(SimilarEventsRequestProto request) {
+ return getSimilarEvents(request.getUserId(), request.getEventId(), request.getMaxResults()).stream()
+ .map(event -> genRecommendedEventProtoFrom(event, request.getEventId()))
+ .toList();
+ }
+
+ @Override
+    public List<RecommendedEventProto> getInteractionsCount(InteractionsCountRequestProto request) {
+ return userActionRepository.getSumWeightForEvents(request.getEventIdList())
+ .stream()
+ .map(Mapper::mapToRecommendedEventProto)
+ .toList();
+ }
+
+ @Override
+ @Transactional
+ public void saveUserAction(UserActionAvro userActionAvro) {
+ UserAction userAction = Mapper.mapToUserAction(userActionAvro);
+ log.info("Saving UserAction: userId={}, eventId={}, type={}, weight={}",
+ userAction.getUserId(), userAction.getEventId(), userAction.getActionType(), userAction.getWeight());
+
+        Optional<UserAction> oldUserAction = userActionRepository.findByUserIdAndEventId(
+ userAction.getUserId(), userAction.getEventId()
+ );
+
+ if (oldUserAction.isPresent()) {
+ log.info("Updating existing UserAction: oldWeight={}", oldUserAction.get().getWeight());
+ userAction.setId(oldUserAction.get().getId());
+ if (userAction.getWeight() < oldUserAction.get().getWeight()) {
+ userAction.setWeight(oldUserAction.get().getWeight());
+ }
+ }
+
+ UserAction savedAction = userActionRepository.save(userAction);
+ log.info("Saved UserAction: id={}, weight={}", savedAction.getId(), savedAction.getWeight());
+ }
+
+ private RecommendedEvent genRecommendedEventFrom(EventSimilarity eventSimilarity, Long eventId) {
+ Long recommendedEventId = Objects.equals(eventSimilarity.getAeventId(), eventId) ?
+ eventSimilarity.getBeventId() : eventSimilarity.getAeventId();
+
+ return RecommendedEvent.builder()
+ .eventId(recommendedEventId)
+ .score(eventSimilarity.getScore())
+ .build();
+ }
+
+ private RecommendedEventProto genRecommendedEventProtoFrom(EventSimilarity eventSimilarity, Long eventId) {
+ Long recommendedEventId = Objects.equals(eventSimilarity.getAeventId(), eventId) ?
+ eventSimilarity.getBeventId() : eventSimilarity.getAeventId();
+
+ return RecommendedEventProto.newBuilder()
+ .setEventId(recommendedEventId)
+ .setScore(eventSimilarity.getScore())
+ .build();
+ }
+
+    private List<EventSimilarity> getSimilarEvents(Long userId, Long eventId, Long limit) {
+        List<EventSimilarity> events = eventSimilarityRepository.findAllByEvent(eventId);
+        List<Long> actions = userActionRepository.findAllByUserId(userId).stream()
+                .map(UserAction::getEventId).toList();
+
+        List<EventSimilarity> result = events.stream()
+ .filter(event -> !(actions.contains(event.getAeventId()) && actions.contains(event.getBeventId())))
+ .sorted(Comparator.comparingDouble(EventSimilarity::getScore).reversed())
+ .limit(limit)
+ .toList();
+
+ log.info("similar events are {}", result);
+ return result;
+ }
+
+ private double getPrediction(Long eventId, Long userId) {
+ double prediction = 0.0;
+
+        Map<Long, Double> ratedEvents = userActionRepository.findAllByUserId(userId).stream()
+                .collect(Collectors.toMap(UserAction::getEventId, UserAction::getWeight));
+        List<RecommendedEvent> similarEvents = eventSimilarityRepository.findAllByEventAndEventIdInLimitedTo(
+ eventId, ratedEvents.keySet().stream().toList(), EVENT_COUNT_PREDICTION)
+ .stream()
+ .map(eventSimilarity -> genRecommendedEventFrom(eventSimilarity, eventId))
+ .toList();
+
+ double weightedSum = 0.0;
+ double similaritySum = 0.0;
+
+ for (RecommendedEvent event : similarEvents) {
+ weightedSum += event.getScore() * ratedEvents.get(event.getEventId());
+ similaritySum += event.getScore();
+ }
+
+ if (similaritySum != 0) {
+ prediction = weightedSum / similaritySum;
+ }
+
+ return prediction;
+ }
+}
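The `getPrediction` method above is the classic item-based collaborative-filtering estimate: the predicted weight for a new event is the similarity-weighted average of the user's weights on rated events, `sum(sim * weight) / sum(sim)`. A standalone sketch with made-up similarity and weight values (class and method names are illustrative only):

```java
import java.util.Map;

/**
 * Standalone version of the weighted average in getPrediction():
 * score(candidate) = sum(sim(candidate, rated) * weight(rated)) / sum(sim).
 */
public final class PredictionSketch {
    private PredictionSketch() { }

    public static double predict(Map<Long, Double> similarities, Map<Long, Double> weights) {
        double weightedSum = 0.0;
        double similaritySum = 0.0;
        for (Map.Entry<Long, Double> e : similarities.entrySet()) {
            weightedSum += e.getValue() * weights.get(e.getKey());
            similaritySum += e.getValue();
        }
        // Same guard as the service: no similar rated events -> prediction 0
        return similaritySum == 0 ? 0.0 : weightedSum / similaritySum;
    }

    public static void main(String[] args) {
        // user rated events 1 and 2 with weights 1.0 and 0.5
        Map<Long, Double> weights = Map.of(1L, 1.0, 2L, 0.5);
        // similarity of the candidate event to events 1 and 2
        Map<Long, Double> similarities = Map.of(1L, 0.8, 2L, 0.2);
        // (0.8*1.0 + 0.2*0.5) / (0.8 + 0.2) ≈ 0.9
        System.out.println(predict(similarities, weights));
    }
}
```

Dividing by the similarity sum keeps the prediction on the same scale as the stored action weights, so events with only weakly similar neighbours are not unfairly inflated.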
diff --git a/stats/analyzer/src/main/resources/application.yml b/stats/analyzer/src/main/resources/application.yml
new file mode 100644
index 0000000..a419635
--- /dev/null
+++ b/stats/analyzer/src/main/resources/application.yml
@@ -0,0 +1,21 @@
+spring:
+ application:
+ name: analyzer
+ config:
+ import: 'configserver:'
+ cloud:
+ config:
+ discovery:
+ service-id: config-server
+ enabled: true
+ fail-fast: true
+ retry:
+ use-random-policy: true
+ max-interval: 10000
+eureka:
+ instance:
+ prefer-ip-address: true
+ client:
+ service-url:
+ defaultZone: http://localhost:8761/eureka
+ register-with-eureka: true
\ No newline at end of file
diff --git a/stats/analyzer/src/main/resources/schema.sql b/stats/analyzer/src/main/resources/schema.sql
new file mode 100644
index 0000000..2abff13
--- /dev/null
+++ b/stats/analyzer/src/main/resources/schema.sql
@@ -0,0 +1,19 @@
+CREATE TABLE IF NOT EXISTS user_action (
+ id BIGINT GENERATED BY DEFAULT AS IDENTITY,
+ user_id BIGINT NOT NULL,
+ event_id BIGINT NOT NULL,
+ action_type VARCHAR(20) NOT NULL,
+ created TIMESTAMP NOT NULL,
+ weight DOUBLE PRECISION,
+ CONSTRAINT pk_user_action PRIMARY KEY (id),
+ CONSTRAINT unique_user_action_user_id_event_id UNIQUE (user_id, event_id)
+);
+
+CREATE TABLE IF NOT EXISTS event_similarity (
+ id BIGINT GENERATED BY DEFAULT AS IDENTITY,
+ aevent_id BIGINT NOT NULL,
+ bevent_id BIGINT NOT NULL,
+ score DOUBLE PRECISION,
+ CONSTRAINT pk_event_similarity PRIMARY KEY (id),
+ CONSTRAINT unique_event_similarity_aevent_id_bevent_id UNIQUE (aevent_id, bevent_id)
+);
\ No newline at end of file
diff --git a/stats/collector/Dockerfile b/stats/collector/Dockerfile
new file mode 100644
index 0000000..0ff1817
--- /dev/null
+++ b/stats/collector/Dockerfile
@@ -0,0 +1,5 @@
+FROM eclipse-temurin:21-jre-jammy
+VOLUME /tmp
+ARG JAR_FILE=target/*.jar
+COPY ${JAR_FILE} app.jar
+ENTRYPOINT ["sh", "-c", "java ${JAVA_OPTS} -jar /app.jar"]
\ No newline at end of file
diff --git a/stats/collector/pom.xml b/stats/collector/pom.xml
new file mode 100644
index 0000000..3a9779d
--- /dev/null
+++ b/stats/collector/pom.xml
@@ -0,0 +1,86 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <parent>
+        <groupId>ru.practicum</groupId>
+        <artifactId>stats</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+    </parent>
+
+    <artifactId>collector</artifactId>
+
+    <properties>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+        <avro-schemas.version>0.0.1-SNAPSHOT</avro-schemas.version>
+        <proto-schemas.version>0.0.1-SNAPSHOT</proto-schemas.version>
+    </properties>
+
+    <dependencies>
+        <dependency>
+            <groupId>ru.practicum</groupId>
+            <artifactId>avro-schemas</artifactId>
+            <version>0.0.1-SNAPSHOT</version>
+        </dependency>
+        <dependency>
+            <groupId>ru.practicum</groupId>
+            <artifactId>proto-schemas</artifactId>
+            <version>0.0.1-SNAPSHOT</version>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>net.devh</groupId>
+            <artifactId>grpc-server-spring-boot-starter</artifactId>
+            <version>${grpc-spring-boot-starter.version}</version>
+        </dependency>
+        <dependency>
+            <groupId>org.projectlombok</groupId>
+            <artifactId>lombok</artifactId>
+            <optional>true</optional>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-validation</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-config</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.retry</groupId>
+            <artifactId>spring-retry</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
+        </dependency>
+        <dependency>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-actuator</artifactId>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.springframework.boot</groupId>
+                <artifactId>spring-boot-maven-plugin</artifactId>
+                <configuration>
+                    <excludes>
+                        <exclude>
+                            <groupId>org.projectlombok</groupId>
+                            <artifactId>lombok</artifactId>
+                        </exclude>
+                    </excludes>
+                </configuration>
+            </plugin>
+        </plugins>
+    </build>
+</project>
\ No newline at end of file
diff --git a/stats/stats-server/src/main/java/ru/practicum/StatServer.java b/stats/collector/src/main/java/ru/practicum/CollectorApplication.java
similarity index 65%
rename from stats/stats-server/src/main/java/ru/practicum/StatServer.java
rename to stats/collector/src/main/java/ru/practicum/CollectorApplication.java
index 1066613..e5177f4 100644
--- a/stats/stats-server/src/main/java/ru/practicum/StatServer.java
+++ b/stats/collector/src/main/java/ru/practicum/CollectorApplication.java
@@ -3,14 +3,11 @@
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.client.discovery.EnableDiscoveryClient;
-import org.springframework.cloud.openfeign.EnableFeignClients;
@SpringBootApplication
@EnableDiscoveryClient
-@EnableFeignClients
-public class StatServer {
+public class CollectorApplication {
public static void main(String[] args) {
- SpringApplication.run(StatServer.class, args);
+ SpringApplication.run(CollectorApplication.class, args);
}
-
-}
\ No newline at end of file
+}
diff --git a/stats/collector/src/main/java/ru/practicum/config/KafkaConfig.java b/stats/collector/src/main/java/ru/practicum/config/KafkaConfig.java
new file mode 100644
index 0000000..bcf4c4a
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/config/KafkaConfig.java
@@ -0,0 +1,45 @@
+package ru.practicum.config;
+
+import lombok.Getter;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.clients.producer.KafkaProducer;
+import org.apache.kafka.clients.producer.Producer;
+import org.apache.kafka.clients.producer.ProducerConfig;
+import org.springframework.boot.context.properties.EnableConfigurationProperties;
+import org.springframework.context.annotation.Bean;
+import org.springframework.context.annotation.Configuration;
+
+import java.util.Objects;
+import java.util.Properties;
+
+@Getter
+@Configuration
+@EnableConfigurationProperties({KafkaConfigProperties.class})
+public class KafkaConfig {
+ private final KafkaConfigProperties kafkaProperties;
+
+ public KafkaConfig(KafkaConfigProperties properties) {
+ this.kafkaProperties = properties;
+ }
+
+ @Bean
+    public Producer<Long, SpecificRecordBase> producer() {
+        // Fail fast if required properties are missing
+        Objects.requireNonNull(kafkaProperties.getBootstrapServers(),
+                "kafka.bootstrap-servers is not set in the configuration");
+        Objects.requireNonNull(kafkaProperties.getClientIdConfig(),
+                "kafka.client-id-config is not set in the configuration");
+        Objects.requireNonNull(kafkaProperties.getProducerKeySerializer(),
+                "kafka.producer-key-serializer is not set in the configuration");
+        Objects.requireNonNull(kafkaProperties.getProducerValueSerializer(),
+                "kafka.producer-value-serializer is not set in the configuration");
+        Objects.requireNonNull(kafkaProperties.getUserActionTopic(),
+                "kafka.user-action-topic is not set in the configuration");
+ Properties properties = new Properties();
+ properties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaProperties.getBootstrapServers());
+ properties.put(ProducerConfig.CLIENT_ID_CONFIG, kafkaProperties.getClientIdConfig());
+ properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, kafkaProperties.getProducerKeySerializer());
+ properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, kafkaProperties.getProducerValueSerializer());
+ return new KafkaProducer<>(properties);
+ }
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/java/ru/practicum/config/KafkaConfigProperties.java b/stats/collector/src/main/java/ru/practicum/config/KafkaConfigProperties.java
new file mode 100644
index 0000000..b8611f3
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/config/KafkaConfigProperties.java
@@ -0,0 +1,19 @@
+package ru.practicum.config;
+
+import lombok.AccessLevel;
+import lombok.Getter;
+import lombok.Setter;
+import lombok.experimental.FieldDefaults;
+import org.springframework.boot.context.properties.ConfigurationProperties;
+
+@Getter
+@Setter
+@ConfigurationProperties(prefix = "kafka")
+@FieldDefaults(level = AccessLevel.PRIVATE)
+public class KafkaConfigProperties {
+ String bootstrapServers;
+ String clientIdConfig;
+ String producerKeySerializer;
+ String producerValueSerializer;
+ String userActionTopic;
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/java/ru/practicum/mapper/UserActionMapper.java b/stats/collector/src/main/java/ru/practicum/mapper/UserActionMapper.java
new file mode 100644
index 0000000..6557a69
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/mapper/UserActionMapper.java
@@ -0,0 +1,47 @@
+package ru.practicum.mapper;
+
+import ru.practicum.ewm.stats.avro.ActionTypeAvro;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+import ru.practicum.grpc.stat.action.ActionTypeProto;
+import ru.practicum.grpc.stat.action.UserActionProto;
+import ru.practicum.model.ActionType;
+import ru.practicum.model.UserAction;
+
+import java.time.Instant;
+
+public class UserActionMapper {
+ public static UserActionAvro toUserActionAvro(UserAction userAction) {
+ return UserActionAvro.newBuilder()
+ .setUserId(userAction.getUserId())
+ .setEventId(userAction.getEventId())
+ .setTimestamp(userAction.getTimestamp())
+ .setActionType(toActionTypeAvro(userAction.getActionType()))
+ .build();
+
+ }
+
+ public static ActionTypeAvro toActionTypeAvro(ActionType actionType) {
+ return ActionTypeAvro.valueOf(actionType.name());
+ }
+
+
+ public static UserAction map(UserActionProto userActionProto) {
+ return UserAction.builder()
+ .userId(userActionProto.getUserId())
+ .eventId(userActionProto.getEventId())
+ .actionType(toActionType(userActionProto.getActionType()))
+ .timestamp(Instant.ofEpochSecond(userActionProto.getTimestamp().getSeconds(),
+ userActionProto.getTimestamp().getNanos()))
+ .build();
+ }
+
+ public static ActionType toActionType(ActionTypeProto actionTypeProto) {
+        return switch (actionTypeProto) {
+            case ACTION_VIEW -> ActionType.VIEW;
+            case ACTION_REGISTER -> ActionType.REGISTER;
+            case ACTION_LIKE -> ActionType.LIKE;
+            default -> throw new IllegalArgumentException("Unknown action type: " + actionTypeProto);
+        };
+ }
+
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/java/ru/practicum/model/ActionType.java b/stats/collector/src/main/java/ru/practicum/model/ActionType.java
new file mode 100644
index 0000000..a2d84fa
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/model/ActionType.java
@@ -0,0 +1,7 @@
+package ru.practicum.model;
+
+public enum ActionType {
+ VIEW,
+ REGISTER,
+ LIKE
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/java/ru/practicum/model/UserAction.java b/stats/collector/src/main/java/ru/practicum/model/UserAction.java
new file mode 100644
index 0000000..993069b
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/model/UserAction.java
@@ -0,0 +1,22 @@
+package ru.practicum.model;
+
+import jakarta.validation.constraints.NotNull;
+import lombok.*;
+import lombok.experimental.FieldDefaults;
+
+import java.time.Instant;
+
+@Builder
+@Getter
+@Setter
+@ToString
+@FieldDefaults(level = AccessLevel.PRIVATE)
+public class UserAction {
+ @NotNull
+ Long userId;
+ @NotNull
+ Long eventId;
+ @NotNull
+ ActionType actionType;
+    @Builder.Default
+    Instant timestamp = Instant.now();
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/java/ru/practicum/serializer/UserActionsAvroSerializer.java b/stats/collector/src/main/java/ru/practicum/serializer/UserActionsAvroSerializer.java
new file mode 100644
index 0000000..f6d1e49
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/serializer/UserActionsAvroSerializer.java
@@ -0,0 +1,36 @@
+package ru.practicum.serializer;
+
+import lombok.extern.slf4j.Slf4j;
+import org.apache.avro.io.BinaryEncoder;
+import org.apache.avro.io.DatumWriter;
+import org.apache.avro.io.EncoderFactory;
+import org.apache.avro.specific.SpecificDatumWriter;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.common.errors.SerializationException;
+import org.apache.kafka.common.serialization.Serializer;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+
+@Slf4j
+public class UserActionsAvroSerializer implements Serializer<SpecificRecordBase> {
+ private final EncoderFactory encoderFactory = EncoderFactory.get();
+ private BinaryEncoder encoder;
+
+    @Override
+    public byte[] serialize(String topic, SpecificRecordBase data) {
+
+ try (ByteArrayOutputStream out = new ByteArrayOutputStream()) {
+ byte[] result = null;
+ encoder = encoderFactory.binaryEncoder(out, encoder);
+ if (data != null) {
+                DatumWriter<SpecificRecordBase> writer = new SpecificDatumWriter<>(data.getSchema());
+ writer.write(data, encoder);
+ encoder.flush();
+ result = out.toByteArray();
+ }
+ return result;
+ } catch (IOException ex) {
+ throw new SerializationException("Error with serialization data for topic [" + topic + "]", ex);
+ }
+ }
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/java/ru/practicum/service/ActionService.java b/stats/collector/src/main/java/ru/practicum/service/ActionService.java
new file mode 100644
index 0000000..ab5b8d6
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/service/ActionService.java
@@ -0,0 +1,9 @@
+package ru.practicum.service;
+
+
+import ru.practicum.model.UserAction;
+
+public interface ActionService {
+
+ void collectUserAction(UserAction userAction);
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/java/ru/practicum/service/ActionServiceImpl.java b/stats/collector/src/main/java/ru/practicum/service/ActionServiceImpl.java
new file mode 100644
index 0000000..efc082d
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/service/ActionServiceImpl.java
@@ -0,0 +1,62 @@
+package ru.practicum.service;
+
+import jakarta.annotation.PreDestroy;
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.clients.producer.Producer;
+import org.apache.kafka.clients.producer.ProducerRecord;
+import org.springframework.stereotype.Service;
+import ru.practicum.mapper.UserActionMapper;
+import ru.practicum.model.UserAction;
+import ru.practicum.config.KafkaConfig;
+import ru.practicum.ewm.stats.avro.UserActionAvro;
+
+import java.util.Objects;
+
+@Service
+@RequiredArgsConstructor
+@Slf4j
+public class ActionServiceImpl implements ActionService {
+    private final Producer<Long, SpecificRecordBase> producer;
+ private final KafkaConfig kafkaConfig;
+
+ @Override
+ public void collectUserAction(UserAction userAction) {
+ Objects.requireNonNull(userAction, "UserAction cannot be null");
+
+ String topic = kafkaConfig.getKafkaProperties().getUserActionTopic();
+ Objects.requireNonNull(topic, "Kafka topic is not configured!");
+
+ log.info("Sending UserAction to Kafka. Topic: {}, UserID: {}, EventID: {}",
+ topic, userAction.getUserId(), userAction.getEventId());
+
+ UserActionAvro avroRecord = UserActionMapper.toUserActionAvro(userAction);
+ send(topic, userAction.getUserId(), userAction.getTimestamp().toEpochMilli(), avroRecord);
+ }
+
+ private void send(String topic, Long key, Long timestamp, SpecificRecordBase specificRecordBase) {
+        ProducerRecord<Long, SpecificRecordBase> rec = new ProducerRecord<>(
+ topic,
+ null,
+ timestamp,
+ key,
+ specificRecordBase);
+ producer.send(rec, (metadata, exception) -> {
+ if (exception != null) {
+                log.error("Kafka: message NOT sent, topic: {}", topic, exception);
+            } else {
+                log.info("Kafka: message sent successfully, topic: {}, partition: {}, offset: {}",
+                        metadata.topic(), metadata.partition(), metadata.offset());
+ }
+ });
+ }
+
+ @PreDestroy
+ private void close() {
+ if (producer != null) {
+ producer.flush();
+ producer.close();
+ }
+ }
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/java/ru/practicum/service/CollectorController.java b/stats/collector/src/main/java/ru/practicum/service/CollectorController.java
new file mode 100644
index 0000000..f1a8bda
--- /dev/null
+++ b/stats/collector/src/main/java/ru/practicum/service/CollectorController.java
@@ -0,0 +1,26 @@
+package ru.practicum.service;
+
+import com.google.protobuf.Empty;
+import io.grpc.stub.StreamObserver;
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import net.devh.boot.grpc.server.service.GrpcService;
+import ru.practicum.mapper.UserActionMapper;
+import ru.practicum.grpc.stat.action.UserActionProto;
+import ru.practicum.grpc.stat.collector.UserActionControllerGrpc;
+
+@GrpcService
+@Slf4j
+@RequiredArgsConstructor
+public class CollectorController extends UserActionControllerGrpc.UserActionControllerImplBase {
+ private final ActionService actionService;
+
+ @Override
+    public void collectUserAction(UserActionProto request, StreamObserver<Empty> responseObserver) {
+ log.info("ActionController call collectUserAction for request = {}", request);
+ actionService.collectUserAction(UserActionMapper.map(request));
+
+ responseObserver.onNext(Empty.getDefaultInstance());
+ responseObserver.onCompleted();
+ }
+}
\ No newline at end of file
diff --git a/stats/collector/src/main/resources/application.yml b/stats/collector/src/main/resources/application.yml
new file mode 100644
index 0000000..8d4e45f
--- /dev/null
+++ b/stats/collector/src/main/resources/application.yml
@@ -0,0 +1,23 @@
+spring:
+ application:
+ name: collector
+ config:
+ import: 'configserver:'
+ cloud:
+ config:
+ discovery:
+ service-id: config-server
+ enabled: true
+ fail-fast: true
+ retry:
+ use-random-policy: true
+ max-interval: 10000
+eureka:
+ instance:
+ hostname: localhost
+ prefer-ip-address: true
+ ip-address: 127.0.0.1
+ client:
+ service-url:
+ defaultZone: http://localhost:8761/eureka
+ register-with-eureka: true
\ No newline at end of file
diff --git a/stats/pom.xml b/stats/pom.xml
index 321657b..7c3a106 100644
--- a/stats/pom.xml
+++ b/stats/pom.xml
@@ -17,8 +17,10 @@
         <module>stats-client</module>
-        <module>stats-dto</module>
-        <module>stats-server</module>
+        <module>collector</module>
+        <module>aggregator</module>
+        <module>serialization</module>
+        <module>analyzer</module>
diff --git a/stats/serialization/avro-schemas/pom.xml b/stats/serialization/avro-schemas/pom.xml
new file mode 100644
index 0000000..27077af
--- /dev/null
+++ b/stats/serialization/avro-schemas/pom.xml
@@ -0,0 +1,81 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <parent>
+        <groupId>ru.practicum</groupId>
+        <artifactId>serialization</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+    </parent>
+
+    <artifactId>avro-schemas</artifactId>
+
+    <properties>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+    </properties>
+
+    <dependencies>
+        <dependency>
+            <groupId>org.apache.avro</groupId>
+            <artifactId>avro</artifactId>
+            <version>${avro.version}</version>
+        </dependency>
+
+        <dependency>
+            <groupId>org.apache.kafka</groupId>
+            <artifactId>kafka-clients</artifactId>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-compiler-plugin</artifactId>
+            </plugin>
+
+            <plugin>
+                <groupId>org.apache.avro</groupId>
+                <artifactId>avro-maven-plugin</artifactId>
+                <executions>
+                    <execution>
+                        <id>schemas</id>
+                        <phase>generate-sources</phase>
+                        <goals>
+                            <goal>idl-protocol</goal>
+                        </goals>
+                        <configuration>
+                            <sourceDirectory>${project.basedir}/src/main/avro</sourceDirectory>
+                            <outputDirectory>${project.build.directory}/generated-sources/avro</outputDirectory>
+                            <stringType>String</stringType>
+                        </configuration>
+                    </execution>
+                </executions>
+            </plugin>
+
+            <plugin>
+                <groupId>org.codehaus.mojo</groupId>
+                <artifactId>build-helper-maven-plugin</artifactId>
+                <version>3.5.0</version>
+                <executions>
+                    <execution>
+                        <id>add-source</id>
+                        <phase>generate-sources</phase>
+                        <goals>
+                            <goal>add-source</goal>
+                        </goals>
+                        <configuration>
+                            <sources>
+                                <source>${project.build.directory}/generated-sources</source>
+                            </sources>
+                        </configuration>
+                    </execution>
+                </executions>
+            </plugin>
+        </plugins>
+    </build>
+</project>
\ No newline at end of file
diff --git a/stats/serialization/avro-schemas/src/main/avro/ru/practicum/ewm/stats/avro/EventSimilarityProtocol.avdl b/stats/serialization/avro-schemas/src/main/avro/ru/practicum/ewm/stats/avro/EventSimilarityProtocol.avdl
new file mode 100644
index 0000000..117946e
--- /dev/null
+++ b/stats/serialization/avro-schemas/src/main/avro/ru/practicum/ewm/stats/avro/EventSimilarityProtocol.avdl
@@ -0,0 +1,10 @@
+@namespace("ru.practicum.ewm.stats.avro")
+protocol EventSimilarityProtocol {
+
+ record EventSimilarityAvro {
+ long eventA;
+ long eventB;
+ double score;
+ timestamp_ms timestamp;
+ }
+}
\ No newline at end of file
diff --git a/stats/serialization/avro-schemas/src/main/avro/ru/practicum/ewm/stats/avro/UserActionProtocol.avdl b/stats/serialization/avro-schemas/src/main/avro/ru/practicum/ewm/stats/avro/UserActionProtocol.avdl
new file mode 100644
index 0000000..b8ea932
--- /dev/null
+++ b/stats/serialization/avro-schemas/src/main/avro/ru/practicum/ewm/stats/avro/UserActionProtocol.avdl
@@ -0,0 +1,16 @@
+@namespace("ru.practicum.ewm.stats.avro")
+protocol UserActionProtocol {
+
+ enum ActionTypeAvro {
+ VIEW,
+ REGISTER,
+ LIKE
+ }
+
+ record UserActionAvro {
+ long userId;
+ long eventId;
+ ActionTypeAvro actionType;
+ timestamp_ms timestamp;
+ }
+}
\ No newline at end of file
diff --git a/stats/serialization/avro-schemas/src/main/java/ru/practicum/AvroDeserializer.java b/stats/serialization/avro-schemas/src/main/java/ru/practicum/AvroDeserializer.java
new file mode 100644
index 0000000..2431732
--- /dev/null
+++ b/stats/serialization/avro-schemas/src/main/java/ru/practicum/AvroDeserializer.java
@@ -0,0 +1,51 @@
+package ru.practicum;
+
+import org.apache.avro.io.BinaryDecoder;
+import org.apache.avro.io.DecoderFactory;
+import org.apache.avro.specific.SpecificDatumReader;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.common.serialization.Deserializer;
+
+import java.io.ByteArrayInputStream;
+import java.util.Map;
+
+public class AvroDeserializer<T extends SpecificRecordBase> implements Deserializer<T> {
+
+    private final Class<T> targetType;
+
+    public AvroDeserializer(Class<T> targetType) {
+        this.targetType = targetType;
+    }
+
+    public AvroDeserializer() {
+        this.targetType = null;
+    }
+
+    @Override
+    public void configure(Map<String, ?> configs, boolean isKey) {
+    }
+
+    @Override
+    public T deserialize(String topic, byte[] data) {
+        if (data == null) {
+            return null;
+        }
+
+        if (targetType == null) {
+            throw new IllegalStateException("targetType is undefined in AvroDeserializer");
+        }
+
+        try {
+            SpecificDatumReader<T> datumReader = new SpecificDatumReader<>(targetType);
+            ByteArrayInputStream in = new ByteArrayInputStream(data);
+            BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(in, null);
+            return datumReader.read(null, decoder);
+        } catch (Exception e) {
+            throw new RuntimeException("Couldn't deserialize avro message for topic " + topic, e);
+        }
+    }
+
+    @Override
+    public void close() {
+    }
+}
diff --git a/stats/serialization/avro-schemas/src/main/java/ru/practicum/AvroSerializer.java b/stats/serialization/avro-schemas/src/main/java/ru/practicum/AvroSerializer.java
new file mode 100644
index 0000000..e958572
--- /dev/null
+++ b/stats/serialization/avro-schemas/src/main/java/ru/practicum/AvroSerializer.java
@@ -0,0 +1,39 @@
+package ru.practicum;
+
+import org.apache.avro.io.BinaryEncoder;
+import org.apache.avro.io.DatumWriter;
+import org.apache.avro.io.EncoderFactory;
+import org.apache.avro.specific.SpecificDatumWriter;
+import org.apache.avro.specific.SpecificRecordBase;
+import org.apache.kafka.common.errors.SerializationException;
+import org.apache.kafka.common.serialization.Serializer;
+
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.util.Map;
+
+public class AvroSerializer<T extends SpecificRecordBase> implements Serializer<T> {
+
+    @Override
+    public void configure(Map<String, ?> configs, boolean isKey) {
+    }
+
+    @Override
+    public byte[] serialize(String topic, T data) {
+        if (data == null) {
+            return null;
+        }
+
+        try (ByteArrayOutputStream out = new ByteArrayOutputStream()) {
+            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
+            DatumWriter<T> writer = new SpecificDatumWriter<>(data.getSchema());
+            writer.write(data, encoder);
+            encoder.flush();
+            return out.toByteArray();
+        } catch (IOException e) {
+            throw new SerializationException("Error occurred during serialization of avro message", e);
+        }
+    }
+
+    @Override
+    public void close() {
+    }
+}
diff --git a/stats/serialization/pom.xml b/stats/serialization/pom.xml
new file mode 100644
index 0000000..3f32dfb
--- /dev/null
+++ b/stats/serialization/pom.xml
@@ -0,0 +1,27 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <parent>
+        <groupId>ru.practicum</groupId>
+        <artifactId>stats</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+    </parent>
+
+    <artifactId>serialization</artifactId>
+    <packaging>pom</packaging>
+
+    <modules>
+        <module>avro-schemas</module>
+        <module>proto-schemas</module>
+    </modules>
+
+    <properties>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+    </properties>
+</project>
\ No newline at end of file
diff --git a/stats/serialization/proto-schemas/pom.xml b/stats/serialization/proto-schemas/pom.xml
new file mode 100644
index 0000000..e1dd02e
--- /dev/null
+++ b/stats/serialization/proto-schemas/pom.xml
@@ -0,0 +1,93 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<project xmlns="http://maven.apache.org/POM/4.0.0"
+         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+    <modelVersion>4.0.0</modelVersion>
+    <parent>
+        <groupId>ru.practicum</groupId>
+        <artifactId>serialization</artifactId>
+        <version>0.0.1-SNAPSHOT</version>
+    </parent>
+
+    <artifactId>proto-schemas</artifactId>
+
+    <properties>
+        <maven.compiler.source>21</maven.compiler.source>
+        <maven.compiler.target>21</maven.compiler.target>
+        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+    </properties>
+
+    <dependencies>
+        <dependency>
+            <groupId>io.grpc</groupId>
+            <artifactId>grpc-stub</artifactId>
+        </dependency>
+
+        <dependency>
+            <groupId>io.grpc</groupId>
+            <artifactId>grpc-protobuf</artifactId>
+        </dependency>
+
+        <dependency>
+            <groupId>jakarta.annotation</groupId>
+            <artifactId>jakarta.annotation-api</artifactId>
+            <version>1.3.5</version>
+            <optional>true</optional>
+        </dependency>
+    </dependencies>
+
+    <build>
+        <plugins>
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-compiler-plugin</artifactId>
+            </plugin>
+
+            <plugin>
+                <groupId>io.github.ascopes</groupId>
+                <artifactId>protobuf-maven-plugin</artifactId>
+                <configuration>
+                    <protocVersion>${protobuf.version}</protocVersion>
+                    <binaryMavenPlugins>
+                        <binaryMavenPlugin>
+                            <groupId>io.grpc</groupId>
+                            <artifactId>protoc-gen-grpc-java</artifactId>
+                            <version>${grpc.version}</version>
+                        </binaryMavenPlugin>
+                    </binaryMavenPlugins>
+                </configuration>
+                <executions>
+                    <execution>
+                        <goals>
+                            <goal>generate</goal>
+                        </goals>
+                    </execution>
+                </executions>
+            </plugin>
+
+            <plugin>
+                <groupId>org.codehaus.mojo</groupId>
+                <artifactId>build-helper-maven-plugin</artifactId>
+                <version>3.5.0</version>
+                <executions>
+                    <execution>
+                        <id>add-source</id>
+                        <phase>generate-sources</phase>
+                        <goals>
+                            <goal>add-source</goal>
+                        </goals>
+                        <configuration>
+                            <sources>
+                                <source>${project.build.directory}/generated-sources/protobuf</source>
+                            </sources>
+                        </configuration>
+                    </execution>
+                </executions>
+            </plugin>
+        </plugins>
+    </build>
+</project>
\ No newline at end of file
diff --git a/stats/serialization/proto-schemas/src/main/protobuf/stats/messages/recommendation_request.proto b/stats/serialization/proto-schemas/src/main/protobuf/stats/messages/recommendation_request.proto
new file mode 100644
index 0000000..f86bb52
--- /dev/null
+++ b/stats/serialization/proto-schemas/src/main/protobuf/stats/messages/recommendation_request.proto
@@ -0,0 +1,26 @@
+syntax = "proto3";
+
+package stats.messages.request;
+
+option java_multiple_files = true;
+option java_package = "ru.practicum.grpc.stat.request";
+
+message UserPredictionsRequestProto{
+ int64 user_id = 1;
+ int64 max_results = 2;
+}
+
+message SimilarEventsRequestProto{
+ int64 event_id = 1;
+ int64 user_id = 2;
+ int64 max_results = 3;
+}
+
+message InteractionsCountRequestProto{
+ repeated int64 event_id = 1;
+}
+
+message RecommendedEventProto{
+ int64 event_id = 1;
+ double score = 2;
+}
\ No newline at end of file
diff --git a/stats/serialization/proto-schemas/src/main/protobuf/stats/messages/user_action.proto b/stats/serialization/proto-schemas/src/main/protobuf/stats/messages/user_action.proto
new file mode 100644
index 0000000..db7cdaf
--- /dev/null
+++ b/stats/serialization/proto-schemas/src/main/protobuf/stats/messages/user_action.proto
@@ -0,0 +1,20 @@
+syntax = "proto3";
+
+package stats.messages.action;
+
+option java_multiple_files = true;
+option java_package = "ru.practicum.grpc.stat.action";
+import "google/protobuf/timestamp.proto";
+
+message UserActionProto{
+ int64 user_id = 1;
+ int64 event_id = 2;
+ ActionTypeProto action_type = 3;
+ google.protobuf.Timestamp timestamp = 4;
+}
+
+enum ActionTypeProto{
+ ACTION_VIEW = 0;
+ ACTION_REGISTER = 1;
+ ACTION_LIKE = 2;
+}
\ No newline at end of file
diff --git a/stats/serialization/proto-schemas/src/main/protobuf/stats/service/recommendations_controller.proto b/stats/serialization/proto-schemas/src/main/protobuf/stats/service/recommendations_controller.proto
new file mode 100644
index 0000000..04abaf6
--- /dev/null
+++ b/stats/serialization/proto-schemas/src/main/protobuf/stats/service/recommendations_controller.proto
@@ -0,0 +1,18 @@
+syntax = "proto3";
+
+package stats.service.dashboard;
+
+option java_multiple_files = true;
+option java_package = "ru.practicum.grpc.stat.dashboard";
+import "stats/messages/recommendation_request.proto";
+
+service RecommendationsController{
+ rpc GetRecommendationsForUser(stats.messages.request.UserPredictionsRequestProto)
+ returns (stream stats.messages.request.RecommendedEventProto);
+
+ rpc GetSimilarEvents(stats.messages.request.SimilarEventsRequestProto)
+ returns (stream stats.messages.request.RecommendedEventProto);
+
+ rpc GetInteractionsCount(stats.messages.request.InteractionsCountRequestProto)
+ returns (stream stats.messages.request.RecommendedEventProto);
+}
\ No newline at end of file
diff --git a/stats/serialization/proto-schemas/src/main/protobuf/stats/service/user_action_controller.proto b/stats/serialization/proto-schemas/src/main/protobuf/stats/service/user_action_controller.proto
new file mode 100644
index 0000000..8c81016
--- /dev/null
+++ b/stats/serialization/proto-schemas/src/main/protobuf/stats/service/user_action_controller.proto
@@ -0,0 +1,12 @@
+syntax = "proto3";
+
+package stats.service.collector;
+
+option java_package = "ru.practicum.grpc.stat.collector";
+
+import "google/protobuf/empty.proto";
+import "stats/messages/user_action.proto";
+
+service UserActionController{
+ rpc CollectUserAction(stats.messages.action.UserActionProto) returns (google.protobuf.Empty);
+}
\ No newline at end of file
diff --git a/stats/stats-client/pom.xml b/stats/stats-client/pom.xml
index b110bde..1d28091 100644
--- a/stats/stats-client/pom.xml
+++ b/stats/stats-client/pom.xml
@@ -16,35 +16,50 @@
         <maven.compiler.source>21</maven.compiler.source>
         <maven.compiler.target>21</maven.compiler.target>
         <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
+        <grpc-spring-boot-starter.version>2.13.1.RELEASE</grpc-spring-boot-starter.version>
     </properties>

     <dependencies>
         <dependency>
             <groupId>ru.practicum</groupId>
-            <artifactId>stats-dto</artifactId>
+            <artifactId>proto-schemas</artifactId>
             <version>0.0.1-SNAPSHOT</version>
+            <scope>compile</scope>
+        </dependency>
+
+        <dependency>
+            <groupId>io.grpc</groupId>
+            <artifactId>grpc-stub</artifactId>
+        </dependency>
+
+        <dependency>
+            <groupId>net.devh</groupId>
+            <artifactId>grpc-client-spring-boot-starter</artifactId>
+            <version>${grpc-spring-boot-starter.version}</version>
         </dependency>

         <dependency>
             <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-web</artifactId>
+            <artifactId>spring-boot-starter-actuator</artifactId>
         </dependency>

         <dependency>
-            <groupId>org.springframework</groupId>
-            <artifactId>spring-web</artifactId>
+            <groupId>org.springframework.boot</groupId>
+            <artifactId>spring-boot-starter-web</artifactId>
         </dependency>

         <dependency>
             <groupId>org.projectlombok</groupId>
             <artifactId>lombok</artifactId>
-            <optional>true</optional>
+            <scope>provided</scope>
         </dependency>

         <dependency>
             <groupId>org.springframework.cloud</groupId>
-            <artifactId>spring-cloud-openfeign-core</artifactId>
+            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
+        </dependency>
+
+        <dependency>
+            <groupId>org.springframework.cloud</groupId>
+            <artifactId>spring-cloud-commons</artifactId>
+        </dependency>
+
+        <dependency>
+            <groupId>org.springframework.retry</groupId>
+            <artifactId>spring-retry</artifactId>
         </dependency>
     </dependencies>
\ No newline at end of file
diff --git a/stats/stats-client/src/main/java/ru/practicum/StatClient.java b/stats/stats-client/src/main/java/ru/practicum/StatClient.java
deleted file mode 100644
index b371b8d..0000000
--- a/stats/stats-client/src/main/java/ru/practicum/StatClient.java
+++ /dev/null
@@ -1,24 +0,0 @@
-package ru.practicum;
-
-import org.springframework.cloud.openfeign.FeignClient;
-import org.springframework.web.bind.annotation.GetMapping;
-import org.springframework.web.bind.annotation.PostMapping;
-import org.springframework.web.bind.annotation.RequestBody;
-import org.springframework.web.bind.annotation.RequestParam;
-import ru.practicum.dto.EndpointHitDto;
-import ru.practicum.dto.ViewStatsDto;
-
-import java.util.List;
-
-@FeignClient(name = "stats-server")
-public interface StatClient {
-
- @PostMapping("/hit")
- void saveHit(@RequestBody EndpointHitDto hitDto);
-
- @GetMapping("stats")
-    List<ViewStatsDto> getStats(@RequestParam(defaultValue = "") String start,
-                                @RequestParam(defaultValue = "") String end,
-                                @RequestParam(defaultValue = "") List<String> uris,
-                                @RequestParam(defaultValue = "false") Boolean unique);
-}
\ No newline at end of file
diff --git a/stats/stats-client/src/main/java/ru/practicum/StatServiceClient.java b/stats/stats-client/src/main/java/ru/practicum/StatServiceClient.java
deleted file mode 100644
index 40afb07..0000000
--- a/stats/stats-client/src/main/java/ru/practicum/StatServiceClient.java
+++ /dev/null
@@ -1,34 +0,0 @@
-package ru.practicum;
-
-import lombok.RequiredArgsConstructor;
-import lombok.extern.slf4j.Slf4j;
-import org.springframework.stereotype.Component;
-import ru.practicum.dto.EndpointHitDto;
-import ru.practicum.dto.ViewStatsDto;
-
-import java.util.Collections;
-import java.util.List;
-
-@Slf4j
-@Component
-@RequiredArgsConstructor
-public class StatServiceClient {
-
- private final StatClient statClient;
-
- public void saveHit(EndpointHitDto dto) {
- statClient.saveHit(dto);
- }
-
-    public List<ViewStatsDto> getStats(String start,
-                                       String end,
-                                       List<String> uris,
-                                       Boolean unique) {
- try {
- return statClient.getStats(start, end, uris, unique);
- } catch (Exception e) {
- log.warn("Failed to get stats: {}", e.getMessage());
- }
- return Collections.emptyList();
- }
-}
\ No newline at end of file
diff --git a/stats/stats-client/src/main/java/ru/practicum/stats/client/StatClient.java b/stats/stats-client/src/main/java/ru/practicum/stats/client/StatClient.java
new file mode 100644
index 0000000..a6e6c35
--- /dev/null
+++ b/stats/stats-client/src/main/java/ru/practicum/stats/client/StatClient.java
@@ -0,0 +1,19 @@
+package ru.practicum.stats.client;
+
+import ru.practicum.grpc.stat.action.ActionTypeProto;
+import ru.practicum.grpc.stat.request.RecommendedEventProto;
+
+import java.time.Instant;
+import java.util.List;
+import java.util.stream.Stream;
+
+public interface StatClient {
+
+    void registerUserAction(long eventId, long userId, ActionTypeProto actionTypeProto, Instant instant);
+
+    Stream<RecommendedEventProto> getSimilarEvents(long eventId, long userId, int maxResults);
+
+    Stream<RecommendedEventProto> getRecommendationsFor(long userId, int maxResults);
+
+    Stream<RecommendedEventProto> getInteractionsCount(List<Long> eventIds);
+}
\ No newline at end of file
diff --git a/stats/stats-client/src/main/java/ru/practicum/stats/client/StatClientImpl.java b/stats/stats-client/src/main/java/ru/practicum/stats/client/StatClientImpl.java
new file mode 100644
index 0000000..a7bf2c0
--- /dev/null
+++ b/stats/stats-client/src/main/java/ru/practicum/stats/client/StatClientImpl.java
@@ -0,0 +1,102 @@
+package ru.practicum.stats.client;
+
+import com.google.protobuf.Timestamp;
+import lombok.RequiredArgsConstructor;
+import lombok.extern.slf4j.Slf4j;
+import net.devh.boot.grpc.client.inject.GrpcClient;
+import org.springframework.stereotype.Component;
+import ru.practicum.grpc.stat.action.ActionTypeProto;
+import ru.practicum.grpc.stat.action.UserActionProto;
+import ru.practicum.grpc.stat.collector.UserActionControllerGrpc;
+import ru.practicum.grpc.stat.dashboard.RecommendationsControllerGrpc;
+import ru.practicum.grpc.stat.request.InteractionsCountRequestProto;
+import ru.practicum.grpc.stat.request.RecommendedEventProto;
+import ru.practicum.grpc.stat.request.SimilarEventsRequestProto;
+import ru.practicum.grpc.stat.request.UserPredictionsRequestProto;
+
+import java.time.Instant;
+import java.util.Iterator;
+import java.util.List;
+import java.util.Spliterator;
+import java.util.Spliterators;
+import java.util.stream.Stream;
+import java.util.stream.StreamSupport;
+
+@Slf4j
+@Component
+@RequiredArgsConstructor
+public class StatClientImpl implements StatClient {
+
+ @GrpcClient("collector")
+ private UserActionControllerGrpc.UserActionControllerBlockingStub userClient;
+
+ @GrpcClient("analyzer")
+ private RecommendationsControllerGrpc.RecommendationsControllerBlockingStub analyzerClient;
+
+ @Override
+ public void registerUserAction(long eventId, long userId, ActionTypeProto actionTypeProto, Instant instant) {
+ log.info("statClientImpl registerUserAction for eventId = {}, userId = {}, actionType = {}, time = {}",
+ eventId, userId, actionTypeProto, instant);
+ Timestamp timestamp = Timestamp.newBuilder()
+ .setNanos(instant.getNano())
+ .setSeconds(instant.getEpochSecond())
+ .build();
+
+ UserActionProto request = UserActionProto.newBuilder()
+ .setUserId(userId)
+ .setEventId(eventId)
+ .setActionType(actionTypeProto)
+ .setTimestamp(timestamp)
+ .build();
+
+ log.info("statClientImpl registerUserAction request = {}", request);
+ userClient.collectUserAction(request);
+ }
+
+ @Override
+    public Stream<RecommendedEventProto> getSimilarEvents(long eventId, long userId, int maxResults) {
+ log.info("statsClientImpl getSimilarEvents for eventId = {}, userId = {}, maxResults = {}",
+ eventId, userId, maxResults);
+ SimilarEventsRequestProto request = SimilarEventsRequestProto.newBuilder()
+ .setEventId(eventId)
+ .setUserId(userId)
+ .setMaxResults(maxResults)
+ .build();
+
+        Iterator<RecommendedEventProto> iterator = analyzerClient.getSimilarEvents(request);
+
+ return toStream(iterator);
+ }
+
+ @Override
+    public Stream<RecommendedEventProto> getRecommendationsFor(long userId, int maxResults) {
+ log.info("statsClientImpl getRecommendationsForUser for userId = {}, maxResults = {}", userId, maxResults);
+ UserPredictionsRequestProto request = UserPredictionsRequestProto.newBuilder()
+ .setUserId(userId)
+ .setMaxResults(maxResults)
+ .build();
+
+        Iterator<RecommendedEventProto> iterator = analyzerClient.getRecommendationsForUser(request);
+ return toStream(iterator);
+ }
+
+ @Override
+    public Stream<RecommendedEventProto> getInteractionsCount(List<Long> eventIds) {
+ log.info("statsClientImpl getInteractionsCount for event list = {}", eventIds);
+
+ InteractionsCountRequestProto request = InteractionsCountRequestProto.newBuilder()
+ .addAllEventId(eventIds)
+ .build();
+
+        Iterator<RecommendedEventProto> iterator = analyzerClient.getInteractionsCount(request);
+ return toStream(iterator);
+ }
+
+    private Stream<RecommendedEventProto> toStream(Iterator<RecommendedEventProto> iterator) {
+ return StreamSupport.stream(
+ Spliterators.spliteratorUnknownSize(iterator, Spliterator.ORDERED),
+ false
+ );
+ }
+
+}
\ No newline at end of file
diff --git a/stats/stats-client/src/main/resources/application.properties b/stats/stats-client/src/main/resources/application.properties
deleted file mode 100644
index f181fe5..0000000
--- a/stats/stats-client/src/main/resources/application.properties
+++ /dev/null
@@ -1 +0,0 @@
-client.url=http://stats-server:9090
\ No newline at end of file
diff --git a/stats/stats-dto/pom.xml b/stats/stats-dto/pom.xml
deleted file mode 100644
index e810cb1..0000000
--- a/stats/stats-dto/pom.xml
+++ /dev/null
@@ -1,39 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<project xmlns="http://maven.apache.org/POM/4.0.0"
-         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-    <modelVersion>4.0.0</modelVersion>
-    <parent>
-        <groupId>ru.practicum</groupId>
-        <artifactId>stats</artifactId>
-        <version>0.0.1-SNAPSHOT</version>
-    </parent>
-
-    <artifactId>stats-dto</artifactId>
-
-    <properties>
-        <maven.compiler.source>21</maven.compiler.source>
-        <maven.compiler.target>21</maven.compiler.target>
-        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
-    </properties>
-
-    <dependencies>
-        <dependency>
-            <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-validation</artifactId>
-        </dependency>
-        <dependency>
-            <groupId>org.projectlombok</groupId>
-            <artifactId>lombok</artifactId>
-            <scope>provided</scope>
-        </dependency>
-        <dependency>
-            <groupId>org.springframework.boot</groupId>
-            <artifactId>spring-boot-starter-actuator</artifactId>
-        </dependency>
-        <dependency>
-            <groupId>com.fasterxml.jackson.datatype</groupId>
-            <artifactId>jackson-datatype-jsr310</artifactId>
-        </dependency>
-    </dependencies>
-</project>
\ No newline at end of file
diff --git a/stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitDto.java b/stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitDto.java
deleted file mode 100644
index 79babb3..0000000
--- a/stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitDto.java
+++ /dev/null
@@ -1,48 +0,0 @@
-package ru.practicum.dto;
-
-import com.fasterxml.jackson.annotation.JsonProperty;
-import jakarta.validation.constraints.Pattern;
-import jakarta.validation.constraints.Size;
-import lombok.AccessLevel;
-import lombok.AllArgsConstructor;
-import lombok.Data;
-import lombok.NoArgsConstructor;
-import lombok.experimental.FieldDefaults;
-
-/**
- * The type Endpoint hit dto.
- */
-@Data
-@NoArgsConstructor
-@AllArgsConstructor
-@FieldDefaults(level = AccessLevel.PRIVATE)
-public class EndpointHitDto {
- @JsonProperty(access = JsonProperty.Access.READ_ONLY)
- Integer id;
- @Size(max = 255)
- String app;
- @Size(max = 2048)
- String uri;
- @Pattern(
- regexp = "^((25[0-5]|2[0-4]\\d|1?\\d\\d?)\\.){3}(25[0-5]|2[0-4]\\d|1?\\d\\d?)$",
- message = "Неверный формат IP-адреса"
- )
- String ip;
- String timestamp;
-
- /**
- * Instantiates a new Endpoint hit dto.
- *
- * @param app the app
- * @param uri the uri
- * @param ip the ip
- * @param timestamp the timestamp
- */
-// Дополнительный конструктор без id
- public EndpointHitDto(String app, String uri, String ip, String timestamp) {
- this.app = app;
- this.uri = uri;
- this.ip = ip;
- this.timestamp = timestamp;
- }
-}
diff --git a/stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitSaveRequestDto.java b/stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitSaveRequestDto.java
deleted file mode 100644
index 81441b9..0000000
--- a/stats/stats-dto/src/main/java/ru/practicum/dto/EndpointHitSaveRequestDto.java
+++ /dev/null
@@ -1,27 +0,0 @@
-package ru.practicum.dto;
-
-import com.fasterxml.jackson.annotation.JsonFormat;
-import lombok.AccessLevel;
-import lombok.AllArgsConstructor;
-import lombok.Data;
-import lombok.NoArgsConstructor;
-import lombok.experimental.FieldDefaults;
-
-import java.time.LocalDateTime;
-
-/**
- * The type Endpoint hit save request dto.
- */
-@Data
-@AllArgsConstructor
-@NoArgsConstructor
-@FieldDefaults(level = AccessLevel.PRIVATE)
-public class EndpointHitSaveRequestDto {
-
- String app;
- String uri;
- String ip;
- @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd HH:mm:ss")
- LocalDateTime timestamp;
-
-}
diff --git a/stats/stats-dto/src/main/java/ru/practicum/dto/SecondaryViewStatsDto.java b/stats/stats-dto/src/main/java/ru/practicum/dto/SecondaryViewStatsDto.java
deleted file mode 100644
index 48603cf..0000000
--- a/stats/stats-dto/src/main/java/ru/practicum/dto/SecondaryViewStatsDto.java
+++ /dev/null
@@ -1,26 +0,0 @@
-package ru.practicum.dto;
-
-import lombok.AllArgsConstructor;
-import lombok.Data;
-import lombok.NoArgsConstructor;
-
-/**
- * The type Secondary view stats dto.
- */
-@Data
-@AllArgsConstructor
-@NoArgsConstructor
-public class SecondaryViewStatsDto {
- /**
- * The App.
- */
- String app;
- /**
- * The Uri.
- */
- String uri;
- /**
- * The Hits.
- */
- Long hits;
-}
diff --git a/stats/stats-server/src/main/java/ru/practicum/ErrorResponse.java b/stats/stats-server/src/main/java/ru/practicum/ErrorResponse.java
deleted file mode 100644
index 6965644..0000000
--- a/stats/stats-server/src/main/java/ru/practicum/ErrorResponse.java
+++ /dev/null
@@ -1,32 +0,0 @@
-package ru.practicum;
-
-import com.fasterxml.jackson.annotation.JsonIgnore;
-import com.fasterxml.jackson.annotation.JsonProperty;
-import lombok.Data;
-import org.slf4j.Logger;
-
-import java.io.PrintWriter;
-import java.io.StringWriter;
-
-@Data
-public class ErrorResponse {
-
- @JsonProperty("error")
- private String message;
- @JsonIgnore
- private String stacktrace;
-
- public ErrorResponse(String message) {
- this.message = message;
- }
-
- public static ErrorResponse getErrorResponse(Exception e, Logger log) {
- log.info("Error", e);
- ErrorResponse errorResponse = new ErrorResponse(e.getMessage());
- StringWriter stringWriter = new StringWriter();
- PrintWriter pw = new PrintWriter(stringWriter);
- e.printStackTrace(pw);
- errorResponse.setStacktrace(pw.toString());
- return errorResponse;
- }
-}
\ No newline at end of file
diff --git a/stats/stats-server/src/main/java/ru/practicum/controller/StatController.java b/stats/stats-server/src/main/java/ru/practicum/controller/StatController.java
deleted file mode 100644
index 5bfbc21..0000000
--- a/stats/stats-server/src/main/java/ru/practicum/controller/StatController.java
+++ /dev/null
@@ -1,67 +0,0 @@
-package ru.practicum.controller;
-
-import lombok.extern.slf4j.Slf4j;
-import org.springframework.beans.factory.annotation.Autowired;
-import org.springframework.format.annotation.DateTimeFormat;
-import org.springframework.http.HttpStatus;
-import org.springframework.web.bind.annotation.*;
-import ru.practicum.ErrorResponse;
-import ru.practicum.dto.EndpointHitDto;
-import ru.practicum.dto.ViewStatsDto;
-import ru.practicum.service.StatService;
-
-import java.io.PrintWriter;
-import java.io.StringWriter;
-import java.time.LocalDateTime;
-import java.util.ArrayList;
-import java.util.List;
-
-@RestController
-@Slf4j
-public class StatController {
-
- private final StatService statsService;
-
- @Autowired
- public StatController(StatService service) {
- this.statsService = service;
- }
-
-
- @PostMapping("/hit")
- @ResponseStatus(HttpStatus.CREATED)
- public EndpointHitDto saveHit(@RequestBody EndpointHitDto hitDto) {
-
- return statsService.saveHit(hitDto);
- }
-
- @GetMapping("/stats")
-    public List<ViewStatsDto> getHits(@RequestParam(value = "start", required = false) @DateTimeFormat(pattern = "yyyy-MM-dd HH:mm:ss") LocalDateTime start,
-                                      @RequestParam(value = "end", required = false) @DateTimeFormat(pattern = "yyyy-MM-dd HH:mm:ss") LocalDateTime end,
-                                      @RequestParam(value = "uris", required = false) List<String> uris,
- @RequestParam(value = "unique", defaultValue = "false") boolean unique
- ) {
- if (start == null || end == null) {
- throw new IllegalArgumentException("Время не может быть Null");
- }
- if (start.isAfter(end)) {
- throw new IllegalArgumentException("Дата начала не может быть позже даты конца");
- }
- if (uris == null) {
- uris = new ArrayList<>();
- }
- log.info("/GET запрос на получение статистики");
- return statsService.getStats(start, end, uris, unique);
- }
-
- @ExceptionHandler
- @ResponseStatus(HttpStatus.BAD_REQUEST)
- public ErrorResponse handleException(final IllegalArgumentException e) {
- ErrorResponse errorResponse = new ErrorResponse(e.getMessage());
- StringWriter sw = new StringWriter();
- PrintWriter pw = new PrintWriter(sw);
- e.printStackTrace(pw);
- errorResponse.setStacktrace(pw.toString());
- return errorResponse;
- }
-}
\ No newline at end of file
diff --git a/stats/stats-server/src/main/java/ru/practicum/mapper/EndpointHitMapper.java b/stats/stats-server/src/main/java/ru/practicum/mapper/EndpointHitMapper.java
deleted file mode 100644
index 7252c50..0000000
--- a/stats/stats-server/src/main/java/ru/practicum/mapper/EndpointHitMapper.java
+++ /dev/null
@@ -1,37 +0,0 @@
-package ru.practicum.mapper;
-
-import lombok.experimental.UtilityClass;
-import ru.practicum.dto.EndpointHitDto;
-import ru.practicum.model.EndpointHit;
-
-import java.time.LocalDateTime;
-
-import static ru.practicum.util.Constants.FORMATTER;
-
-@UtilityClass
-public class EndpointHitMapper {
-
- public static EndpointHitDto toHitDto(EndpointHit hit) {
- String dateTime = hit.getTimestamp().format(FORMATTER);
-
- return new EndpointHitDto(
- hit.getId(),
- hit.getApp(),
- hit.getUri(),
- hit.getIp(),
- dateTime
- );
- }
-
- public static EndpointHit dtoToHit(EndpointHitDto hitDto) {
-
- LocalDateTime localDateTime = LocalDateTime.parse(hitDto.getTimestamp(), FORMATTER);
- EndpointHit hit = new EndpointHit();
- hit.setId(hitDto.getId());
- hit.setApp(hitDto.getApp());
- hit.setUri(hitDto.getUri());
- hit.setIp(hitDto.getIp());
- hit.setTimestamp(localDateTime);
- return hit;
- }
-}
\ No newline at end of file
diff --git a/stats/stats-server/src/main/java/ru/practicum/model/EndpointHit.java b/stats/stats-server/src/main/java/ru/practicum/model/EndpointHit.java
deleted file mode 100644
index 6b78cc2..0000000
--- a/stats/stats-server/src/main/java/ru/practicum/model/EndpointHit.java
+++ /dev/null
@@ -1,34 +0,0 @@
-package ru.practicum.model;
-
-import jakarta.persistence.*;
-import lombok.*;
-import lombok.experimental.FieldDefaults;
-
-import java.time.LocalDateTime;
-
-/**
- * The type Endpoint hit.
- */
-@Setter
-@Getter
-@Entity
-@Table(name = "endpoint_hit")
-@AllArgsConstructor
-@FieldDefaults(level = AccessLevel.PRIVATE)
-@NoArgsConstructor
-public class EndpointHit {
-
- @Id
- @GeneratedValue(strategy = GenerationType.IDENTITY)
- Integer id;
-
- @Column(name = "app")
- String app;
- @Column(name = "uri")
- String uri;
- @Column(name = "ip")
- String ip;
- @Column(name = "timestamp")
- LocalDateTime timestamp;
-
-}
\ No newline at end of file
diff --git a/stats/stats-server/src/main/java/ru/practicum/repository/EndpointHitRepository.java b/stats/stats-server/src/main/java/ru/practicum/repository/EndpointHitRepository.java
deleted file mode 100644
index f786e29..0000000
--- a/stats/stats-server/src/main/java/ru/practicum/repository/EndpointHitRepository.java
+++ /dev/null
@@ -1,89 +0,0 @@
-package ru.practicum.repository;
-
-import org.springframework.data.jpa.repository.JpaRepository;
-import org.springframework.data.jpa.repository.Query;
-
-import ru.practicum.dto.ViewStatsDto;
-import ru.practicum.model.EndpointHit;
-
-import java.time.LocalDateTime;
-import java.util.List;
-
-/**
- * The interface Endpoint hit repository.
- */
-public interface EndpointHitRepository extends JpaRepository<EndpointHit, Integer> {
-
- String SELECT_STAT_WITHOUT_UNIQUE_IP_SQL = "SELECT " +
- "new ru.practicum.dto.ViewStatsDto(e.app, e.uri, " +
- "(SELECT count(ep.ip) FROM EndpointHit AS ep WHERE ep.uri = e.uri) AS hits) " +
- "FROM EndpointHit AS e WHERE e.uri IN ( ?3 ) AND e.timestamp BETWEEN ?1 AND ?2 " +
- "GROUP BY e.uri, e.app ORDER BY hits DESC ";
-
- /**
- * The constant SELECT_STAT_WITH_UNIQUE_IP_SQL.
- */
- String SELECT_STAT_WITH_UNIQUE_IP_SQL = "SELECT " +
- "new ru.practicum.dto.ViewStatsDto(e.app, e.uri, " +
- "(SELECT count(DISTINCT ep.ip) FROM EndpointHit AS ep WHERE ep.uri = e.uri) AS hits) " +
- "FROM EndpointHit AS e WHERE e.uri IN ( ?3 ) AND e.timestamp BETWEEN ?1 AND ?2 " +
- "GROUP BY e.uri, e.app ORDER BY hits DESC ";
-
- /**
- * The constant SELECT_STAT_ALL_WITHOUT_UNIQUE_IP_SQL.
- */
- String SELECT_STAT_ALL_WITHOUT_UNIQUE_IP_SQL = "SELECT " +
- "new ru.practicum.dto.ViewStatsDto(e.app, e.uri, " +
- "(SELECT count(ep.ip) FROM EndpointHit AS ep WHERE ep.uri = e.uri) AS hits) " +
- "FROM EndpointHit AS e WHERE e.timestamp BETWEEN ?1 AND ?2 GROUP BY e.uri, e.app ORDER BY hits DESC ";
-
- /**
- * The constant SELECT_STAT_ALL_WITH_UNIQUE_IP_SQL.
- */
- String SELECT_STAT_ALL_WITH_UNIQUE_IP_SQL = "SELECT " +
- "new ru.practicum.dto.ViewStatsDto(e.app, e.uri, " +
- "(SELECT count(DISTINCT ep.ip) FROM EndpointHit AS ep WHERE ep.uri = e.uri) AS hits) " +
- "FROM EndpointHit AS e WHERE e.timestamp BETWEEN ?1 AND ?2 GROUP BY e.uri, e.app ORDER BY hits DESC ";
-
- /**
- * Find stat without unique ip list.
- *
- * @param start the start
- * @param end the end
- * @param uris the uris
- * @return the list
- */
- @Query(SELECT_STAT_WITHOUT_UNIQUE_IP_SQL)
-    List<ViewStatsDto> findStatWithoutUniqueIp(LocalDateTime start, LocalDateTime end, List<String> uris);
-
- /**
- * Find stat with unique ip list.
- *
- * @param start the start
- * @param end the end
- * @param uris the uris
- * @return the list
- */
- @Query(SELECT_STAT_WITH_UNIQUE_IP_SQL)
-    List<ViewStatsDto> findStatWithUniqueIp(LocalDateTime start, LocalDateTime end, List<String> uris);
-
- /**
- * Find stat all without unique ip list.
- *
- * @param start the start
- * @param end the end
- * @return the list
- */
- @Query(SELECT_STAT_ALL_WITHOUT_UNIQUE_IP_SQL)
-    List<ViewStatsDto> findStatAllWithoutUniqueIp(LocalDateTime start, LocalDateTime end);
-
- /**
- * Find stat all with unique ip list.
- *
- * @param start the start
- * @param end the end
- * @return the list
- */
- @Query(SELECT_STAT_ALL_WITH_UNIQUE_IP_SQL)
-    List<ViewStatsDto> findStatAllWithUniqueIp(LocalDateTime start, LocalDateTime end);
-}
\ No newline at end of file
diff --git a/stats/stats-server/src/main/java/ru/practicum/service/StatService.java b/stats/stats-server/src/main/java/ru/practicum/service/StatService.java
deleted file mode 100644
index a7d24b9..0000000
--- a/stats/stats-server/src/main/java/ru/practicum/service/StatService.java
+++ /dev/null
@@ -1,14 +0,0 @@
-package ru.practicum.service;
-
-import ru.practicum.dto.EndpointHitDto;
-import ru.practicum.dto.ViewStatsDto;
-
-import java.time.LocalDateTime;
-import java.util.List;
-
-public interface StatService {
-
- EndpointHitDto saveHit(EndpointHitDto hitDto);
-
-    List<ViewStatsDto> getStats(LocalDateTime start, LocalDateTime end, List<String> uris, Boolean unique);
-}
diff --git a/stats/stats-server/src/main/java/ru/practicum/service/StatServiceImpl.java b/stats/stats-server/src/main/java/ru/practicum/service/StatServiceImpl.java
deleted file mode 100644
index 25bfb72..0000000
--- a/stats/stats-server/src/main/java/ru/practicum/service/StatServiceImpl.java
+++ /dev/null
@@ -1,51 +0,0 @@
-package ru.practicum.service;
-
-import lombok.RequiredArgsConstructor;
-import lombok.extern.slf4j.Slf4j;
-import org.springframework.stereotype.Service;
-import org.springframework.transaction.annotation.Transactional;
-import ru.practicum.dto.*;
-import ru.practicum.repository.EndpointHitRepository;
-
-import java.time.LocalDateTime;
-import java.util.ArrayList;
-import java.util.List;
-
-import static ru.practicum.mapper.EndpointHitMapper.dtoToHit;
-import static ru.practicum.mapper.EndpointHitMapper.toHitDto;
-
-/**
- * The type Stat service.
- */
-@Service
-@RequiredArgsConstructor
-@Slf4j
-public class StatServiceImpl implements StatService {
-
- private final EndpointHitRepository endpointHitRepository;
-
- @Override
- @Transactional
- public EndpointHitDto saveHit(EndpointHitDto hitDto) {
- return toHitDto(endpointHitRepository.save(dtoToHit(hitDto)));
- }
-
- @Transactional(readOnly = true)
- @Override
-    public List<ViewStatsDto> getStats(LocalDateTime start, LocalDateTime end, List<String> uris, Boolean unique) {
-        log.info("Получение статистики с параметрами: start={}, end={}, uris={}, unique={}", start, end, uris, unique);
-        List<ViewStatsDto> stats;
-
- if (unique) {
- stats = uris.isEmpty() ?
- endpointHitRepository.findStatAllWithUniqueIp(start, end) :
- endpointHitRepository.findStatWithUniqueIp(start, end, uris);
- } else {
- stats = uris.isEmpty() ?
- endpointHitRepository.findStatAllWithoutUniqueIp(start, end) :
- endpointHitRepository.findStatWithoutUniqueIp(start, end, uris);
- }
- log.info("Полученные статистические данные: {}", stats);
- return new ArrayList<>(stats);
- }
-}
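The deleted `StatServiceImpl` branches on `unique`: when true, the repository queries count each IP at most once per URI (typically `COUNT(DISTINCT ip)` in SQL), otherwise every hit is counted. A minimal, self-contained sketch of that aggregation, using a hypothetical `Hit` record in place of the project's `EndpointHit` entity:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// In-memory sketch of the stats aggregation the deleted repository
// queries perform in SQL. "Hit" is a hypothetical stand-in for the
// EndpointHit entity (app/uri/ip fields only).
public class StatsSketch {
    public record Hit(String app, String uri, String ip) {}

    // Counts hits per URI; with unique=true each IP is counted once per URI,
    // mirroring COUNT(DISTINCT ip) in the unique-IP repository query.
    public static Map<String, Long> countHits(List<Hit> hits, boolean unique) {
        if (unique) {
            return hits.stream()
                    .collect(Collectors.groupingBy(Hit::uri,
                            Collectors.mapping(Hit::ip,
                                    Collectors.collectingAndThen(
                                            Collectors.toSet(), s -> (long) s.size()))));
        }
        return hits.stream()
                .collect(Collectors.groupingBy(Hit::uri, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<Hit> hits = List.of(
                new Hit("ewm", "/events/1", "10.0.0.1"),
                new Hit("ewm", "/events/1", "10.0.0.1"),
                new Hit("ewm", "/events/1", "10.0.0.2"));
        System.out.println(countHits(hits, false).get("/events/1")); // 3
        System.out.println(countHits(hits, true).get("/events/1"));  // 2
    }
}
```

Pushing this aggregation into the database query, as the service does, avoids loading every hit row into memory.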
diff --git a/stats/stats-server/src/main/java/ru/practicum/util/Constants.java b/stats/stats-server/src/main/java/ru/practicum/util/Constants.java
deleted file mode 100644
index 8766eb6..0000000
--- a/stats/stats-server/src/main/java/ru/practicum/util/Constants.java
+++ /dev/null
@@ -1,10 +0,0 @@
-package ru.practicum.util;
-
-import java.time.format.DateTimeFormatter;
-
-public class Constants {
-
- public static final String TIMESTAMP_PATTERN = "yyyy-MM-dd HH:mm:ss";
-
- public static final DateTimeFormatter FORMATTER = DateTimeFormatter.ofPattern(TIMESTAMP_PATTERN);
-}
\ No newline at end of file
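The deleted `Constants` class pins the timestamp format shared by the stats API. A small round-trip sketch with the same `yyyy-MM-dd HH:mm:ss` pattern (class name `FormatterDemo` is illustrative, not from the project):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class FormatterDemo {
    // Same pattern the deleted Constants class defined.
    public static final DateTimeFormatter FORMATTER =
            DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    public static void main(String[] args) {
        // Parse a request parameter and format it back: a lossless round trip.
        LocalDateTime ts = LocalDateTime.parse("2024-05-01 12:30:00", FORMATTER);
        System.out.println(FORMATTER.format(ts)); // 2024-05-01 12:30:00
    }
}
```

Centralizing the pattern in one constant keeps the client, controller, and service serializing dates identically.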
diff --git a/stats/stats-server/src/main/resources/application.yml b/stats/stats-server/src/main/resources/application.yml
deleted file mode 100644
index e5a978d..0000000
--- a/stats/stats-server/src/main/resources/application.yml
+++ /dev/null
@@ -1,39 +0,0 @@
-spring:
- application:
- name: stats-server
- config:
- import: optional:configserver:http://config-server:9091
- cloud:
- config:
- enabled: false
- datasource:
- driverClassName: org.postgresql.Driver
- url: jdbc:postgresql://ewm-db:5432/ewm-stats
- username: root
- password: root
- jpa:
- hibernate:
- ddl-auto: none
- database-platform: org.hibernate.dialect.PostgreSQLDialect
- generate-ddl: false
- properties:
- hibernate:
- format_sql: true
- show-sql: false
- sql:
- init:
- mode: always
-
-eureka:
- client:
- register-with-eureka: true
- fetch-registry: true
- serviceUrl:
- defaultZone: http://${eureka.instance.hostname:localhost}:${eureka.instance.port:8761}/eureka/
- instance:
- prefer-ip-address: true
- hostname: localhost
- instance-id: "${spring.application.name}:${random.value}"
- lease-renewal-interval-in-seconds: 10
-server:
- port: 9090
\ No newline at end of file
diff --git a/stats/stats-server/src/main/resources/schema.sql b/stats/stats-server/src/main/resources/schema.sql
deleted file mode 100644
index 49082a7..0000000
--- a/stats/stats-server/src/main/resources/schema.sql
+++ /dev/null
@@ -1,9 +0,0 @@
-DROP TABLE IF EXISTS endpoint_hit;
-
-CREATE TABLE IF NOT EXISTS endpoint_hit (
- id SERIAL PRIMARY KEY,
- app VARCHAR(255),
- uri VARCHAR(255) NOT NULL,
- ip VARCHAR(255) NOT NULL,
- timestamp TIMESTAMP WITHOUT TIME ZONE NOT NULL
-);
\ No newline at end of file