4 changes: 3 additions & 1 deletion .gitignore
@@ -1,2 +1,4 @@
.DS_Store
.env
.env
.env.json
.aws-sam
10 changes: 5 additions & 5 deletions .gitmodules
@@ -1,16 +1,16 @@
[submodule "api"]
path = api
url = git@github.com:mirrulations/API.git
url = https://github.com/mirrulations/API.git
branch = main
[submodule "transformation_trigger"]
path = transformation_trigger
url = git@github.com:mirrulations/transformation_trigger.git
url = https://github.com/mirrulations/transformation_trigger.git
branch = main
[submodule "data-product-kit"]
path = data-product-kit
url = git@github.com:mirrulations/Data-Product-Kit.git
path = data_product_kit
url = https://github.com/mirrulations/Data-Product-Kit.git
branch = main
[submodule "website"]
path = website
url = git@github.com:mirrulations/mirrulations-website.git
url = https://github.com/mirrulations/mirrulations-website.git
branch = main
99 changes: 93 additions & 6 deletions README.md
@@ -1,19 +1,106 @@
# dev


## Setup

Requirements

- Docker
- Docker Compose
- Docker & Docker Compose
- AWS SAM
- python3

To get started clone this repository and run the following commands:
To get started, clone this repository and run the following command:

```bash
chmod u+x Submodules.sh
./Submodules.sh
git submodule update --init --recursive
```
This will initialize all of the submodules in this repo.

### Environment variables:

Create a `.env` file with the following values. Values surrounded by `<>` require you to fill them in:
```
VITE_GATEWAY_API_URL=http://localhost:3000/dummy
VITE_COGNITO_USER_POOL_ENDPOINT=http://localhost:9229
VITE_COGNITO_USER_POOL_ID=local_2EfVJC8K
VITE_COGNITO_CLIENT_ID=1r4k4b23bva9jj3kxgd28zcc3
VITE_LOCAL=true

ENVIRONMENT=local
POSTGRES_DB=postgres
POSTGRES_USER=postgres
POSTGRES_PASSWORD=<enter a secure password>
POSTGRES_HOST=postgres
POSTGRES_PORT=5432
OPENSEARCH_INITIAL_ADMIN_PASSWORD=<enter a secure password (requires numbers, special characters and capitals)>
OPENSEARCH_HOST=opensearch-node1
OPENSEARCH_PORT=9200

AWS_DEFAULT_REGION=us-east-1
```

Then run the `createEnvJson.py` script to create a `.env.json` file for the Lambdas to use.
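As a sketch of what `createEnvJson.py` presumably does (an assumption — the real script may differ), `sam local --env-vars` accepts JSON of the documented shape `{"Parameters": {...}}`, which applies the variables to every function:

```python
import json

# Sketch of a .env -> .env.json conversion (assumption: the real
# createEnvJson.py may behave differently). `sam local ... --env-vars`
# accepts JSON shaped like {"Parameters": {...}}.
def env_to_sam_json(env_text):
    params = {}
    for line in env_text.splitlines():
        line = line.strip()
        # skip blank lines, comments, and lines without KEY=VALUE form
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        params[key.strip()] = value.strip()
    return {"Parameters": params}

if __name__ == "__main__":
    with open(".env") as f:
        result = env_to_sam_json(f.read())
    with open(".env.json", "w") as f:
        json.dump(result, f, indent=2)
```
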


### **EACH OF THE FOLLOWING STEPS ASSUMES THAT YOU ARE AT THE ROOT/TOP LEVEL OF THE PROJECT**

### Initializing the Databases
**THIS ONLY NEEDS TO BE DONE ON THE INITIAL SETUP**

- start the databases with `docker compose up -d --build postgres opensearch-node1 opensearch-node2`

#### Postgres
1. cd into `data_product_kit`
2. set up a Python virtual environment
1. create a virtual environment with `python3 -m venv .venv`
2. activate the virtual environment with `source .venv/bin/activate`
3. install the requirements file with `pip install -r requirements.txt`
3. cd into `sql`
4. run `POSTGRES_HOST=localhost python3 ResetDatabase.py`
- note: the `POSTGRES_HOST=localhost` override is required because the script runs on the host, outside the Docker network where the `postgres` hostname resolves

#### OpenSearch
- There is no initialization step because indexes are created when data is first ingested.


### Starting the frontend

- **NOTE**: Run this from the root of the project, not from a subfolder.
- if Docker cannot find a `.env` file, you are in the wrong directory.
- start the frontend with `docker compose up -d --build website cognito`

- open your browser of choice and navigate to `localhost:5500`

- login with username `test` and password `test`


### Starting the API Gateway
- NOTE: This will require another terminal, as it does not run detached

1. cd into `api`
2. run `sam build`
3. run `sam local start-api --docker-network dev_network --env-vars ../.env.json`
- this starts the API inside the Docker network the databases are in, using the environment variables in `.env.json`
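Once the gateway is running, a quick smoke test from Python can confirm it is reachable (a hypothetical check; the `/dummy` path is taken from `VITE_GATEWAY_API_URL` in the `.env` above):

```python
import json
import urllib.request

def get_json(base_url, path="/dummy", timeout=5):
    """GET `path` from the locally running API gateway and decode JSON
    if possible. Hypothetical helper; the /dummy path is an assumption
    based on VITE_GATEWAY_API_URL in the .env file."""
    with urllib.request.urlopen(base_url.rstrip("/") + path, timeout=timeout) as resp:
        body = resp.read().decode("utf-8")
    try:
        return json.loads(body)
    except json.JSONDecodeError:
        return body  # fall back to raw text for non-JSON responses

# With the gateway running: get_json("http://localhost:3000")
```
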


### Starting the Orchestrator Lambda
- NOTE: This will require another terminal, as it does not run detached

1. cd into `transformation_trigger/dev-env`
2. run `sam build`
3. run `sam local start-lambda --docker-network dev_network --container-env-vars ../../.env.json --env-vars ../../.env.json`



### Ingesting Data
- The `ingest.py` file will invoke the orchestrator lambda with a given file.
- this requires the databases and the orchestrator lambda to be running

- this requires the `boto3` library
1. create a virtual environment with `python3 -m venv .venv`
2. activate the virtual environment with `source .venv/bin/activate`
3. install the requirements file with `pip install -r requirements.txt`

- to ingest run `python3 ingest.py <path-to-file>`

- **NOTE**: When ingesting dockets, documents, and comments, dockets must be ingested before documents, and documents before comments, due to the relational database structure.
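The docket → document → comment ordering can be enforced mechanically when ingesting a batch. A hypothetical helper (it assumes the entity type appears somewhere in each file path, which may not match the real layout):

```python
# Hypothetical helper: sort files so dockets are ingested before documents,
# and documents before comments. Assumes the entity type appears in the path.
INGEST_RANK = {"docket": 0, "document": 1, "comment": 2}

def sort_for_ingest(paths):
    def rank(path):
        lowered = path.lower()
        # substring check: "dockets/" contains "docket", etc.
        for kind in ("docket", "document", "comment"):
            if kind in lowered:
                return INGEST_RANK[kind]
        return len(INGEST_RANK)  # unknown types go last
    return sorted(paths, key=rank)
```

One could then loop over `sort_for_ingest(files)` and call `python3 ingest.py <path>` for each entry in order.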
4 changes: 0 additions & 4 deletions Submodules.sh

This file was deleted.

2 changes: 1 addition & 1 deletion api
1 change: 1 addition & 0 deletions cognito/config.json
@@ -0,0 +1 @@
{}
30 changes: 30 additions & 0 deletions cognito/db/clients.json
@@ -0,0 +1,30 @@
{
"Clients": {
"1r4k4b23bva9jj3kxgd28zcc3": {
"CallbackURLs": [
"http://localhost:3000/callback"
],
"ClientId": "1r4k4b23bva9jj3kxgd28zcc3",
"ClientName": "MyAppClient",
"ClientSecret": "5oolz6muthfeg2309apcivzel",
"CreationDate": "2025-04-15T18:12:43.442Z",
"ExplicitAuthFlows": [
"ALLOW_USER_PASSWORD_AUTH",
"ALLOW_REFRESH_TOKEN_AUTH"
],
"LastModifiedDate": "2025-04-15T18:12:43.442Z",
"LogoutURLs": [
"http://localhost:3000/logout"
],
"SupportedIdentityProviders": [
"COGNITO"
],
"TokenValidityUnits": {
"AccessToken": "hours",
"IdToken": "minutes",
"RefreshToken": "days"
},
"UserPoolId": "local_2EfVJC8K"
}
}
}