2 changes: 1 addition & 1 deletion .dockerignore
@@ -41,4 +41,4 @@ coverage/

# Docker
.dockerignore
Dockerfile
Dockerfile
2 changes: 1 addition & 1 deletion .github/workflows/codeql.yaml
@@ -32,4 +32,4 @@ jobs:
uses: github/codeql-action/autobuild@v3

- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3
uses: github/codeql-action/analyze@v3
2 changes: 1 addition & 1 deletion .github/workflows/documentation.yml
@@ -26,4 +26,4 @@ jobs:
# Run the mkdocs command using Poetry's environment
- name: Deploy documentation
run: |
poetry run mkdocs gh-deploy --force
poetry run mkdocs gh-deploy --force
51 changes: 51 additions & 0 deletions .github/workflows/lint.yml
@@ -0,0 +1,51 @@
name: Lint

permissions:
contents: read

on:
push:
branches: [main]
paths:
- '**.py'
- 'pyproject.toml'
- '.github/workflows/lint.yml'
pull_request:
branches: [main]
paths:
- '**.py'
- 'pyproject.toml'
- '.github/workflows/lint.yml'
workflow_dispatch:

jobs:
ruff:
name: Ruff Linting
runs-on: ubuntu-latest

steps:
- name: Checkout repository
uses: actions/checkout@v4

- name: Set up Python 3.11
uses: actions/setup-python@v5
with:
python-version: '3.11'
cache: 'pip'

- name: Install Poetry
run: |
curl -sSL https://install.python-poetry.org | python -
echo "$HOME/.local/bin" >> $GITHUB_PATH

- name: Install dependencies
run: |
poetry install --no-interaction --no-ansi

- name: Run ruff check
run: |
poetry run ruff check --output-format=github .

- name: Run ruff format check
run: |
poetry run ruff format --check .
6 changes: 3 additions & 3 deletions .github/workflows/pypi-release.yml
@@ -33,10 +33,10 @@ jobs:
cd flowfile_frontend
npm install
npm run build:web

# Create the static directory if it doesn't exist
mkdir -p ../flowfile/flowfile/web/static

# Copy the built files to the Python package
cp -r build/renderer/* ../flowfile/flowfile/web/static/
echo "Contents of web/static directory:"
@@ -81,4 +81,4 @@ jobs:
with:
skip-existing: true
packages-dir: dist/
verbose: true
verbose: true
2 changes: 1 addition & 1 deletion .github/workflows/release.yaml
@@ -112,4 +112,4 @@ jobs:
echo "GitHub Release created."
echo "Tag: ${{ github.ref_name }}"
echo "Release ID: ${{ steps.create_release.outputs.id }}"
echo "Assets have been uploaded to the GitHub Release."
echo "Assets have been uploaded to the GitHub Release."
6 changes: 3 additions & 3 deletions .github/workflows/test.yaml
@@ -52,7 +52,7 @@ jobs:
cd flowfile_frontend
npm install
npm run build:web

# Create the static directory if it doesn't exist
mkdir -p ../flowfile/flowfile/web/static

@@ -112,7 +112,7 @@ jobs:
cd flowfile_frontend
npm install
npm run build:web

# Create the static directory if it doesn't exist
New-Item -ItemType Directory -Force -Path ../flowfile/flowfile/web/static | Out-Null

@@ -283,4 +283,4 @@ jobs:
shell: pwsh
working-directory: flowfile_frontend
run: |
npm run test
npm run test
39 changes: 39 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,39 @@
# Pre-commit hooks configuration
# See https://pre-commit.com for more information

repos:
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version should match the version in pyproject.toml
rev: v0.8.6
hooks:
# Run the linter
- id: ruff
args: [--fix]
types_or: [python, pyi]
# Run the formatter
- id: ruff-format
types_or: [python, pyi]

- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
# Identify invalid files
- id: check-ast
- id: check-yaml
- id: check-json
exclude: 'tsconfig.*\.json$' # Exclude TypeScript config files (they use JSONC with comments)
- id: check-toml
# Check for files that would conflict in case-insensitive filesystems
- id: check-case-conflict
# Check for merge conflicts
- id: check-merge-conflict
# Check for debugger imports
- id: debug-statements
# Make sure files end with newline
- id: end-of-file-fixer
# Trim trailing whitespace
- id: trailing-whitespace
args: [--markdown-linebreak-ext=md]
# Check for large files
- id: check-added-large-files
args: ['--maxkb=1000']
142 changes: 142 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,142 @@
# Contributing to Flowfile

Thank you for your interest in contributing to Flowfile! This guide will help you set up your development environment and understand our code quality standards.

## Development Setup

### Prerequisites

- Python 3.10 or higher (but less than 3.14)
- [Poetry](https://python-poetry.org/docs/#installation) for dependency management
- Git

### Initial Setup

1. **Clone the repository**
```bash
git clone https://github.com/Edwardvaneechoud/Flowfile.git
cd Flowfile
```

2. **Install dependencies**
```bash
poetry install
```

3. **Install pre-commit hooks** (recommended)
```bash
poetry run pre-commit install
```

This will automatically run linting and formatting checks before each commit.

## Code Quality

### Linting with Ruff

We use [Ruff](https://docs.astral.sh/ruff/) for linting and code formatting. Ruff is configured in `pyproject.toml`.

**Run linting manually:**
```bash
# Check for linting issues
poetry run ruff check .

# Auto-fix linting issues
poetry run ruff check --fix .

# Check code formatting
poetry run ruff format --check .

# Format code
poetry run ruff format .
```

**Configuration:**
- Target: Python 3.10+
- Line length: 120 characters
- Rules: F (Pyflakes), E/W (pycodestyle), I (isort), UP (pyupgrade), B (flake8-bugbear) (see the example below)
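
As a quick illustration, the hypothetical snippet below (not taken from the Flowfile codebase) is annotated with the kind of findings these rule families report:

```python
import sys  # F: Pyflakes flags this unused import
from typing import List, Optional


def add_row(row: Optional[dict], rows: List[dict] = []) -> List[dict]:
    # UP: pyupgrade would rewrite Optional[dict] and List[dict] as dict | None and list[dict]
    # B: flake8-bugbear flags the mutable default argument rows=[]
    if row is not None:
        rows.append(row)
    return rows
```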

### Pre-commit Hooks

Pre-commit hooks automatically run before each commit to ensure code quality. They will:

1. **Ruff linting** - Check and auto-fix Python code issues
2. **Ruff formatting** - Format Python code consistently
3. **File checks** - Validate YAML, JSON, TOML, and Python syntax
4. **Trailing whitespace** - Remove unnecessary whitespace
5. **End of file** - Ensure files end with a newline
6. **Merge conflicts** - Detect merge conflict markers
7. **Large files** - Prevent committing large files (>1MB)

**Skip pre-commit hooks** (not recommended):
```bash
git commit --no-verify -m "Your commit message"
```

**Run pre-commit manually on all files:**
```bash
poetry run pre-commit run --all-files
```

### Continuous Integration

Our GitHub Actions workflows automatically run:

- **Linting** (`lint.yml`) - Runs ruff check and format validation on all PRs
- **Tests** (`test-docker-auth.yml`, `e2e-tests.yml`) - Runs test suites
- **Documentation** (`documentation.yml`) - Builds and deploys docs

All checks must pass before a PR can be merged.

## Running Tests

```bash
# Run all tests
poetry run pytest

# Run tests for a specific module
poetry run pytest flowfile_core/tests/
poetry run pytest flowfile_worker/tests/

# Run tests with coverage
poetry run pytest --cov=flowfile_core --cov=flowfile_worker
```
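
If you are adding tests, a minimal example of a pytest test that the commands above would pick up might look like this (the file path and names are hypothetical, not existing Flowfile tests):

```python
# flowfile_core/tests/test_parsing.py  (hypothetical path and names)
import pytest


@pytest.mark.parametrize("raw, expected", [("1", 1), ("42", 42)])
def test_parse_int(raw: str, expected: int) -> None:
    assert int(raw) == expected
```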

## Code Style Guidelines

- Follow [PEP 8](https://pep8.org/) style guidelines (enforced by Ruff)
- Use type hints where appropriate
- Write descriptive variable and function names
- Keep functions focused and modular
- Add docstrings for public functions and classes
- Keep line length under 120 characters (see the sketch below)
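
A short hypothetical sketch (not an actual Flowfile function) of these guidelines applied together:

```python
def summarize_column(values: list[float], precision: int = 2) -> dict[str, float]:
    """Return the count and mean of a numeric column, rounded to `precision` decimals."""
    if not values:
        return {"count": 0.0, "mean": 0.0}
    mean = sum(values) / len(values)
    return {"count": float(len(values)), "mean": round(mean, precision)}
```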

## Submitting Changes

1. **Create a new branch** for your feature or fix:
```bash
git checkout -b feature/your-feature-name
```

2. **Make your changes** and ensure all tests pass

3. **Commit your changes** (pre-commit hooks will run automatically):
```bash
git add .
git commit -m "Add your descriptive commit message"
```

4. **Push to your fork**:
```bash
git push origin feature/your-feature-name
```

5. **Create a Pull Request** on GitHub

## Getting Help

- Check the [documentation](https://edwardvaneechoud.github.io/Flowfile/)
- Open an issue on GitHub
- Read the [architecture documentation](docs/for-developers/architecture.md)

Thank you for contributing to Flowfile! 🚀
2 changes: 1 addition & 1 deletion README.md
@@ -104,7 +104,7 @@ For a deeper dive into the technical architecture, check out [this article](http
#### 1. Desktop Application
The desktop version offers the best experience with a native interface and integrated services. You can either:

**Option A: Download Pre-built Application**
**Option A: Download Pre-built Application**
- Download the latest release from [GitHub Releases](https://github.com/Edwardvaneechoud/Flowfile/releases)
- Run the installer for your platform (Windows, macOS, or Linux)
> **Note:** You may see security warnings since the app isn't signed with a developer certificate yet.
4 changes: 2 additions & 2 deletions build_backends/build_backends/main.py
@@ -50,7 +50,7 @@ def get_connectorx_metadata():
for dist_info in glob.glob(dist_info_pattern):
metadata_locations.append(dist_info)

# Look for egg-info directories
# Look for egg-info directories
egg_info_pattern = os.path.join(site_packages, 'connectorx*.egg-info')
for egg_info in glob.glob(egg_info_pattern):
metadata_locations.append(egg_info)
@@ -126,7 +126,7 @@ def patched_version(distribution_name):
"""

# Collect minimal snowflake dependencies
snowflake_imports = collect_submodules('snowflake.connector',
snowflake_imports = collect_submodules('snowflake.connector',
filter=lambda name: any(x in name for x in [
'connection',
'errors',
2 changes: 1 addition & 1 deletion build_backends/build_backends/main_prd.py
@@ -22,7 +22,7 @@ def wait_for_endpoint(url, timeout=60):
def shutdown_service():
"""Shutdown the service gracefully using the shutdown endpoint."""
try:
response = requests.post("http://0.0.0.0:63578/shutdown", headers={"accept": "application/json"}, data="")
requests.post("http://0.0.0.0:63578/shutdown", headers={"accept": "application/json"}, data="")
print("Shutdown request sent, waiting for service to stop...")
time.sleep(1) # Wait 1 second to ensure the service is fully stopped
return True
2 changes: 1 addition & 1 deletion docker-compose.yml
@@ -88,4 +88,4 @@ volumes:

secrets:
flowfile_master_key:
file: ./master_key.txt
file: ./master_key.txt
2 changes: 1 addition & 1 deletion docs/MakeFile
@@ -20,4 +20,4 @@ clean:
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
2 changes: 1 addition & 1 deletion docs/for-developers/architecture.md
@@ -239,4 +239,4 @@ This design enables Flowfile to:

---

*For a deep dive into the implementation details, see the [full technical article](https://dev.to/edwardvaneechoud/building-flowfile-architecting-a-visual-etl-tool-with-polars-576c).*
*For a deep dive into the implementation details, see the [full technical article](https://dev.to/edwardvaneechoud/building-flowfile-architecting-a-visual-etl-tool-with-polars-576c).*