The Sorting Hat's backend is responsible for collecting service-based system data and extracting CharM metrics.
Step 1) Create a .env file based on the .env.sample in this repository.
Make sure you set the `DB_HOST` value correctly (`localhost` or `db`, depending on how you will run the application -- see Step 3).
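As a sketch only (the full variable list must be copied from `.env.sample`, which is not reproduced here), the `DB_HOST` line would look like:

```
# Sketch only -- copy the full variable list from .env.sample.
# Use DB_HOST=localhost when running the jar on your machine,
# or DB_HOST=db when running the app inside docker-compose.
DB_HOST=localhost
```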
Step 2) Run the following command to start the database:

```shell
docker-compose up db
```

Step 3) You can run the application on your local machine (if you have a JDK installed) or in Docker.
On a local machine:

```shell
./gradlew clean build
./gradlew :server:shadowJar
java -jar server/build/libs/server-0.0.1-SNAPSHOT-all.jar
```

On Docker:
```shell
docker-compose build   # to build the app image
docker-compose up app  # to run db and app containers
```

The main technologies used are:

- Kotlin language
- Spring Boot Framework
- MongoDB
- JUnit5 and Mockito for tests
- Gradle
Every time the backend is started, it fetches systems' data manually collected from a spreadsheet and saves any systems that are not yet in the database.
The endpoints are as follows:

- `GET /systems`: to get all systems registered in the tool.
- `GET /systems/{name}`: to get more detailed information about a specific system with name `name`.
  - Example: `GET /systems/InterSCity`.
- `GET /systems/{name}/metrics`: to extract and get the CharM metrics of a specific system with name `name`.
  - Example: `GET /systems/TrainTicket/metrics`.
- `POST /systems`: to collect system data from a remote repository.
- `PUT /systems/{name}/endpoints`: to register services' endpoints.
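Because the system name appears directly in the URL path, callers should URL-encode it. A minimal sketch of building the metrics URL (the base URL and port `8080` are assumptions, not given in this README):

```python
from urllib.parse import quote

BASE_URL = "http://localhost:8080"  # assumed host/port; adjust to your deployment


def metrics_url(system_name: str) -> str:
    """Build the GET /systems/{name}/metrics URL, URL-encoding the system name."""
    return f"{BASE_URL}/systems/{quote(system_name)}/metrics"


print(metrics_url("TrainTicket"))
# → http://localhost:8080/systems/TrainTicket/metrics
```

For names without special characters the encoding is a no-op; a name containing spaces would be percent-encoded automatically.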
Example of a request body for `POST /systems`:

```json
{
  "repoUrl": "https://github.com/codelab-alexia/buscar-hackathon",
  "filename": "docker-compose.yaml"
}
```

Example of a request body for `PUT /systems/fictional-system/endpoints`:
```json
{
  "repoUrl": "https://github.com/myorg/fictional-system",
  "servicesAndOpenApiFilenames": [
    {
      "serviceName": "user-service",
      "openApiFilename": "user-service-openapi.yaml"
    },
    {
      "serviceName": "order-service",
      "openApiFilename": "order-service-openapi.yaml"
    }
  ]
}
```

The backend is composed of the following modules:
- `domain`: contains the core entities used by other modules.
- `data_collector`: responsible for collecting service-based systems' data. Currently, it collects data from docker-compose files.
- `metrics_extractor`: responsible for extracting CharM metrics from systems collected by `data_collector`.
- `persistence`: handles saving systems' data in a MongoDB database and accessing it. It is shared by `data_collector` and `metrics_extractor`.
- `server`: starts an HTTP web server and connects the other modules. It is the entry point of the application.
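For systems with many services, the `PUT /systems/{name}/endpoints` body shown earlier can be generated programmatically rather than written by hand. A sketch using the fictional repository and service names from that example:

```python
import json

# Fictional services and OpenAPI filenames from the PUT example above
services = {
    "user-service": "user-service-openapi.yaml",
    "order-service": "order-service-openapi.yaml",
}

# Assemble the request body in the shape the endpoint expects
body = {
    "repoUrl": "https://github.com/myorg/fictional-system",
    "servicesAndOpenApiFilenames": [
        {"serviceName": name, "openApiFilename": spec}
        for name, spec in services.items()
    ],
}

print(json.dumps(body, indent=2))
```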
All images follow the color key below: