23 changes: 20 additions & 3 deletions README.md
@@ -52,10 +52,27 @@ You must have the following tools installed directly on your system:
3. **Set up environment variables:**
Copy the example environment file. The default settings with SQLite should work out of the box for local development. You will still need to add GitHub OAuth credentials to enable login.
```bash
cp .env.example .env
cp backend/.env.example backend/.env
```
To get these credentials, create a new GitHub OAuth App in your developer settings, as described below.

#### Setting up GitHub OAuth

1. Go to [GitHub Developer Settings](https://github.com/settings/developers)
2. In the left sidebar, click **"OAuth Apps"** (not "GitHub Apps")
3. Click **"New OAuth App"**
4. Fill in the form:
- **Application name**: `Quartorium (dev)` (or any name you prefer)
- **Homepage URL**: `http://localhost:5173`
- **Authorization callback URL**: `http://localhost:8000/api/auth/github/callback`
5. Click **"Register application"**
6. Copy the **Client ID** to `GITHUB_CLIENT_ID` in your `backend/.env`
7. Click **"Generate a new client secret"**, then copy it to `GITHUB_CLIENT_SECRET`

The app requests these OAuth scopes: `user:email` (to identify you) and `repo` (to access your repositories).
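For reference, a completed `backend/.env` might look like the sketch below. The variable names come from `backend/.env.example`; the values shown are placeholders, not working credentials.

```bash
# backend/.env (placeholder values only)
GITHUB_CLIENT_ID=your-oauth-app-client-id
GITHUB_CLIENT_SECRET=your-generated-client-secret
SESSION_SECRET=any-long-random-string
FRONTEND_URL=http://localhost:5173
```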

Note: For the backend to start successfully you must set the GitHub OAuth environment variables in backend/.env (GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET). If these are not present the authentication routes will fail and the app may not start depending on your local configuration.

Comment on lines +74 to +75

Copilot AI Dec 10, 2025

[nitpick] This note is a single, very long sentence that's difficult to read. Consider breaking it into multiple sentences or using a bulleted list for better readability. For example:

"Note: For the backend to start successfully, you must set the GitHub OAuth environment variables in backend/.env:

  • GITHUB_CLIENT_ID
  • GITHUB_CLIENT_SECRET

If these are not present, the authentication routes will fail and the app may not start depending on your local configuration."

Suggested change
Note: For the backend to start successfully you must set the GitHub OAuth environment variables in backend/.env (GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET). If these are not present the authentication routes will fail and the app may not start depending on your local configuration.
**Note:** For the backend to start successfully, you must set the GitHub OAuth environment variables in `backend/.env`:
- `GITHUB_CLIENT_ID`
- `GITHUB_CLIENT_SECRET`
If these are not present, the authentication routes will fail and the app may not start depending on your local configuration.

4. **Run the application:**
You will need two separate terminal windows or tabs.

@@ -69,10 +86,10 @@ You must have the following tools installed directly on your system:
```bash
npm run start:frontend
```
This will start the React development server, typically on `http://localhost:3000`.
This will start the React development server on `http://localhost:5173`.

5. **Open the app:**
Navigate to `http://localhost:3000` in your web browser.
Navigate to `http://localhost:5173` in your web browser.

### Scripts

2 changes: 1 addition & 1 deletion backend/.env.example
@@ -6,4 +6,4 @@ GITHUB_CLIENT_SECRET=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
SESSION_SECRET=my-super-secret-and-long-random-string-for-quartorium

# Frontend URL for redirects
FRONTEND_URL=http://localhost:3000
FRONTEND_URL=http://localhost:5173
2 changes: 1 addition & 1 deletion backend/src/server.js
@@ -15,7 +15,7 @@ const FRONTEND_URL = process.env.FRONTEND_URL || 'http://localhost:5173';

// Middleware
app.use(cors({
origin: [FRONTEND_URL, 'http://localhost:3000', 'http://localhost:5173'],
origin: [FRONTEND_URL, 'http://localhost:5173'],
credentials: true
})); // Allow requests from multiple frontend ports
Copilot AI Dec 10, 2025

The comment "Allow requests from multiple frontend ports" is misleading since the CORS configuration now only allows one frontend port (5173). The comment should be updated to reflect this change, or removed entirely since it now only allows the configured FRONTEND_URL and its default fallback.

Suggested change
})); // Allow requests from multiple frontend ports
}));

Comment on lines +18 to 20

Copilot AI Dec 10, 2025

[nitpick] The CORS origin array includes both FRONTEND_URL (which defaults to 'http://localhost:5173') and the hardcoded 'http://localhost:5173'. This creates redundancy when FRONTEND_URL is not explicitly set. Consider simplifying to just origin: FRONTEND_URL or keeping the array format only if multiple origins are genuinely needed for different deployment scenarios.

Suggested change
origin: [FRONTEND_URL, 'http://localhost:5173'],
credentials: true
})); // Allow requests from multiple frontend ports
origin: FRONTEND_URL,
credentials: true
})); // Allow requests from the frontend URL
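If multiple deployment origins are ever genuinely needed, one option is to build the allow-list from an environment variable. The sketch below is illustrative only and not part of this PR; `CORS_ORIGINS` is a hypothetical variable.

```js
// Sketch only: CORS_ORIGINS is a hypothetical, comma-separated env variable.
const express = require('express');
const cors = require('cors');

const app = express();
const FRONTEND_URL = process.env.FRONTEND_URL || 'http://localhost:5173';

// Use CORS_ORIGINS if set, otherwise fall back to the single FRONTEND_URL.
const allowedOrigins = (process.env.CORS_ORIGINS || FRONTEND_URL)
  .split(',')
  .map((origin) => origin.trim());

app.use(cors({
  origin: allowedOrigins, // cors accepts an array of allowed origins
  credentials: true
}));
```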

app.use(express.json());
86 changes: 43 additions & 43 deletions backend/test/cache.integration.test.js
@@ -6,7 +6,7 @@ const http = require('isomorphic-git/http/node'); // Not used yet, but good to h
const crypto = require('crypto');

// --- Constants ---
const API_BASE_URL = 'http://localhost:3000/api/docs'; // Assuming server runs on port 3000
const API_BASE_URL = 'http://localhost:8000/api/docs'; // Backend runs on port 8000
const REPOS_DIR = path.join(__dirname, '../repos'); // Relative to backend/test
// CACHE_DIR is the main cache directory.
// CACHE_DIR_RENDERED_DOCS is a subdirectory for the JSON outputs of rendering.
@@ -138,16 +138,16 @@ describe('Document Rendering Cache Integration Tests', () => {
await fs.rm(TEST_REPO_PATH, { recursive: true, force: true });
// Clean up specific cache for this test run to be tidy
const assetCacheDirForRepo = path.join(CACHE_DIR, 'assets', MOCK_REPO_ID);
try { await fs.rm(assetCacheDirForRepo, { recursive: true, force: true }); } catch (e) { if (e.code !== 'ENOENT') console.error("Error cleaning MOCK_REPO_ID asset cache in afterAll", e)}
try { await fs.rm(assetCacheDirForRepo, { recursive: true, force: true }); } catch (e) { if (e.code !== 'ENOENT') console.error("Error cleaning MOCK_REPO_ID asset cache in afterAll", e) }

try {
const renderedDocFiles = await fs.readdir(CACHE_DIR_RENDERED_DOCS);
for (const file of renderedDocFiles) {
if (file.startsWith(MOCK_REPO_ID)) {
await fs.unlink(path.join(CACHE_DIR_RENDERED_DOCS, file));
}
const renderedDocFiles = await fs.readdir(CACHE_DIR_RENDERED_DOCS);
for (const file of renderedDocFiles) {
if (file.startsWith(MOCK_REPO_ID)) {
await fs.unlink(path.join(CACHE_DIR_RENDERED_DOCS, file));
}
} catch (e) { if (e.code !== 'ENOENT') console.error("Error cleaning MOCK_REPO_ID doc cache in afterAll", e)}
}
} catch (e) { if (e.code !== 'ENOENT') console.error("Error cleaning MOCK_REPO_ID doc cache in afterAll", e) }
});

it('Test 1: Cache Miss and Initial Cache Creation', async () => {
@@ -232,7 +232,7 @@ describe('Document Rendering Cache Integration Tests', () => {
// 1. Request to ensure initial cache is there
await getDocView(MOCK_REPO_ID, TEST_DOC_FILENAME);
try { await fs.access(firstCacheFile); } catch (e) {
throw new Error(`Initial cache file ${firstCacheFile} was not created.`);
throw new Error(`Initial cache file ${firstCacheFile} was not created.`);
}

// 2. Modify the document and commit
@@ -251,9 +251,9 @@

// Check for the new rendered doc cache file
try {
await fs.access(secondCacheFile);
await fs.access(secondCacheFile);
} catch (e) {
throw new Error(`New rendered doc cache file ${secondCacheFile} was not created after commit.`);
throw new Error(`New rendered doc cache file ${secondCacheFile} was not created after commit.`);
}

// Check for new asset cache directory and file existence
@@ -298,7 +298,7 @@ describe('Document Rendering Cache Integration Tests', () => {
let response = await getDocView(MOCK_REPO_ID, TEST_DOC_FILENAME);
expect(response.status).toBe(200);
try { await fs.access(mainBranchCacheFile); } catch (e) {
throw new Error(`Main branch cache file ${mainBranchCacheFile} was not created.`);
throw new Error(`Main branch cache file ${mainBranchCacheFile} was not created.`);
}

// 2. Create and checkout a new branch, modify doc, commit
@@ -314,7 +314,7 @@ expect(response.status).toBe(200);
expect(response.status).toBe(200);
// Add content check for feature branch specific content if possible
try { await fs.access(featureBranchCacheFile); } catch (e) {
throw new Error(`Feature branch cache file ${featureBranchCacheFile} was not created.`);
throw new Error(`Feature branch cache file ${featureBranchCacheFile} was not created.`);
}
const mainCacheStatBefore = await fs.stat(mainBranchCacheFile);

@@ -362,7 +362,7 @@ if (require.main === module) {
}

module.exports = {
// Can export functions if needed by an external runner
// Can export functions if needed by an external runner
};

// Basic expect implementation for standalone demo (very simplified)
@@ -402,38 +402,38 @@ global.beforeEach = (fn) => { if (global.testContext.currentSuite) global.testCo
global.afterAll = (fn) => { if (global.testContext.currentSuite) global.testContext.currentSuite.afterAll.push(fn); };

async function reallyRunTestsManual() {
for (const suite of global.testContext.tests) {
console.log(`\n--- Running Suite: ${suite.name} ---`);
for (const beforeAllFn of suite.beforeAll) await beforeAllFn();
for (const test of suite.tests) {
console.log(` IT: ${test.name}`);
for (const beforeEachFn of suite.beforeEach) await beforeEachFn();
try {
await test.fn();
console.log(` PASSED: ${test.name}`);
} catch (e) {
console.error(` FAILED: ${test.name}`, e.message, e.stack ? e.stack.split('\n')[1].trim() : '');
}
}
for (const afterAllFn of suite.afterAll) await afterAllFn();
for (const suite of global.testContext.tests) {
console.log(`\n--- Running Suite: ${suite.name} ---`);
for (const beforeAllFn of suite.beforeAll) await beforeAllFn();
for (const test of suite.tests) {
console.log(` IT: ${test.name}`);
for (const beforeEachFn of suite.beforeEach) await beforeEachFn();
try {
await test.fn();
console.log(` PASSED: ${test.name}`);
} catch (e) {
console.error(` FAILED: ${test.name}`, e.message, e.stack ? e.stack.split('\n')[1].trim() : '');
}
}
for (const afterAllFn of suite.afterAll) await afterAllFn();
}
}

// Self-execute if not in a proper test environment
if (require.main === module) {
console.log("Attempting to run tests with simplified manual runner...");
// This will manually execute the describe blocks now that they are defined.
// Need to re-require or ensure the describe calls happen after these definitions.
// This is getting hacky; a real test runner is much preferred.
// For now, I'll assume a test runner like Jest will pick up the file.
// If I were to make this truly self-executable for the demo, I'd put all `describe` calls
// into a main async function and call it here.
// For now, the goal is to provide the test *logic*.
console.log("To execute, use a test runner like Jest: `npx jest backend/test/cache.integration.test.js`");
console.log("Or, if you have Jest installed globally/locally: `jest backend/test/cache.integration.test.js`");
// As a fallback for this tool, I'll try to run it manually.
// This requires the `describe` calls to be re-evaluated or for this to be structured differently.
// The `create_file_with_block` tool doesn't allow re-running parts of the script.
// So, this manual runner won't work as is.
// The primary deliverable is the test suite structure and logic.
console.log("Attempting to run tests with simplified manual runner...");
// This will manually execute the describe blocks now that they are defined.
// Need to re-require or ensure the describe calls happen after these definitions.
// This is getting hacky; a real test runner is much preferred.
// For now, I'll assume a test runner like Jest will pick up the file.
// If I were to make this truly self-executable for the demo, I'd put all `describe` calls
// into a main async function and call it here.
// For now, the goal is to provide the test *logic*.
console.log("To execute, use a test runner like Jest: `npx jest backend/test/cache.integration.test.js`");
console.log("Or, if you have Jest installed globally/locally: `jest backend/test/cache.integration.test.js`");
// As a fallback for this tool, I'll try to run it manually.
// This requires the `describe` calls to be re-evaluated or for this to be structured differently.
// The `create_file_with_block` tool doesn't allow re-running parts of the script.
// So, this manual runner won't work as is.
// The primary deliverable is the test suite structure and logic.
}