Merged
- Implement LLM factory service for dynamic provider selection
- Create LLM configuration utility for managing provider settings
- Add server-side endpoint for generating content across multiple providers
- Update environment configuration to support LLM API keys
- Introduce ProviderSelector component for frontend LLM provider selection
- Modify server configuration to support new LLM integration services
- Add TestLLM page for exploring multi-provider LLM functionality

Enables flexible AI model integration with support for OpenAI, Anthropic, Google, Grok, and OpenRouter providers.
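A factory like the one this commit describes might be sketched as below. All names here are illustrative placeholders — the provider list matches the commit message, but the env-var names, base URLs, and default models are assumptions, not the repository's actual configuration.

```typescript
// Hypothetical sketch of a provider factory keyed by provider name.
type ProviderName = "openai" | "anthropic" | "google" | "grok" | "openrouter";

interface ProviderConfig {
  baseUrl: string;
  apiKeyEnv: string;    // env var expected to hold the API key (assumed names)
  defaultModel: string; // illustrative defaults only
}

const registry: Record<ProviderName, ProviderConfig> = {
  openai:     { baseUrl: "https://api.openai.com/v1", apiKeyEnv: "OPENAI_API_KEY", defaultModel: "gpt-4o-mini" },
  anthropic:  { baseUrl: "https://api.anthropic.com", apiKeyEnv: "ANTHROPIC_API_KEY", defaultModel: "claude-3-haiku" },
  google:     { baseUrl: "https://generativelanguage.googleapis.com", apiKeyEnv: "GOOGLE_API_KEY", defaultModel: "gemini-1.5-flash" },
  grok:       { baseUrl: "https://api.x.ai/v1", apiKeyEnv: "GROK_API_KEY", defaultModel: "grok-beta" },
  openrouter: { baseUrl: "https://openrouter.ai/api/v1", apiKeyEnv: "OPENROUTER_API_KEY", defaultModel: "auto" },
};

function getProvider(name: string): ProviderConfig {
  const config = registry[name as ProviderName];
  if (!config) throw new Error(`Unknown LLM provider: ${name}`);
  return config;
}
```

Centralizing provider settings in one registry is what lets the later commits swap, remove, and fall back between providers without touching the call sites.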
…k strategy

- Integrate OpenRouter as a new LLM provider with extensive model support
- Expand available models for Groq and add new Llama 3.2 variants
- Improve LLM service fallback mechanism with more robust error handling
- Add detailed error logging for LLM generation failures
- Implement dynamic fallback provider selection with configurable options
- Update LLM factory to support OpenRouter's unique API configuration
- Enhance provider validation and error tracking in LLM service

Adds support for a more flexible multi-provider LLM integration, improving system resilience and model availability.
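The fallback mechanism described here presumably tries providers in order and records each failure before moving on. A minimal sketch, with an assumed `generate()` signature (the real service's API may differ):

```typescript
// Hedged sketch of a fallback chain across providers.
type Generate = (prompt: string) => Promise<string>;

async function generateWithFallback(
  providers: { name: string; generate: Generate }[],
  prompt: string,
): Promise<{ text: string; provider: string }> {
  const errors: string[] = [];
  for (const p of providers) {
    try {
      const text = await p.generate(prompt);
      // Return which provider actually answered, for response metadata.
      return { text, provider: p.name };
    } catch (err) {
      // Log the failure detail and fall through to the next provider.
      errors.push(`${p.name}: ${err instanceof Error ? err.message : String(err)}`);
    }
  }
  throw new Error(`All providers failed:\n${errors.join("\n")}`);
}
```

Collecting every error before giving up is what makes the "detailed error logging for LLM generation failures" possible: the final exception carries the whole chain, not just the last attempt.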
…k strategy

- Add new LLM service with flexible provider support
- Refactor server routes to use new multi-provider LLM generation
- Introduce standardized content generation method with provider metadata
- Update chat, prompt, and generation endpoints to support multiple LLM providers
- Add support for dynamic provider selection and fallback mechanisms
- Enhance error handling and response metadata for LLM interactions
- Prepare infrastructure for future LLM provider integrations

Motivation: Improve system resilience and flexibility by decoupling content generation from specific provider implementations, allowing easier provider switching and fallback strategies.
Add Authorization headers with Bearer tokens to axios requests in flashcard, guide, and quiz services for secure API access. Update quiz delete method to properly stringify request body.
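The two changes in this commit can be illustrated with small helpers. These are hypothetical names, not the repository's actual service code; the real services would pass the result into their axios request configs.

```typescript
// Hypothetical helper: build headers carrying a Bearer token.
function authHeaders(token: string | null): Record<string, string> {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (token) headers["Authorization"] = `Bearer ${token}`; // secure API access
  return headers;
}

// DELETE request bodies are easy to get wrong: some clients send the
// payload unserialized unless it is explicitly stringified.
function deleteBody(payload: unknown): string {
  return JSON.stringify(payload);
}
```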
Implement dynamic model loading for OpenRouter by fetching available models from the API, with caching and preference for free/low-cost models. Update provider initialization to be asynchronous and add support for runtime model refresh. Include OPENROUTER_API_KEY in environment example. This enhances flexibility by allowing the system to adapt to changing model availability without hardcoded lists, improving reliability and cost management.
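The caching and free-model preference might look like the following sketch. The `ModelInfo` shape and field names are assumptions (OpenRouter's real `/models` response has more fields); the fetcher is injected so the cache logic stands alone.

```typescript
// Illustrative model record; promptPrice of 0 means a free model.
interface ModelInfo { id: string; promptPrice: number }

let cache: { models: ModelInfo[]; fetchedAt: number } | null = null;
const CACHE_TTL_MS = 60 * 60 * 1000; // refresh roughly hourly (assumed TTL)

async function getModels(fetchModels: () => Promise<ModelInfo[]>): Promise<ModelInfo[]> {
  const now = Date.now();
  // Serve from cache while it is fresh; avoids hitting the API per request.
  if (cache && now - cache.fetchedAt < CACHE_TTL_MS) return cache.models;
  const models = await fetchModels();
  // Prefer free models, then cheapest first.
  models.sort((a, b) => a.promptPrice - b.promptPrice);
  cache = { models, fetchedAt: now };
  return models;
}
```

Because the list is fetched at runtime, a model being added or retired upstream needs no code change here — exactly the flexibility the commit message claims.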
Simplify the LLM service by removing configurations and implementations for Groq, OpenAI, and Anthropic providers, retaining only Google and OpenRouter for better maintainability and focus. BREAKING CHANGE: Users relying on Groq, OpenAI, or Anthropic providers will need to switch to Google or OpenRouter.
This change extends the dynamic provider initialization to support Gemini, enabling models to be fetched dynamically from the Google AI API. It includes caching, filtering for text generation models, and sorting by stability and input limits to prefer newer, stable models. The update ensures the Gemini provider behaves similarly to OpenRouter with runtime model discovery.
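The filtering and sorting for Gemini might be sketched as below. The field names here (`supportedMethods`, `inputTokenLimit`) are illustrative stand-ins for whatever the Google AI models endpoint actually returns, and "stability" is approximated by excluding experimental names.

```typescript
// Illustrative Gemini model record (field names assumed).
interface GeminiModel { name: string; supportedMethods: string[]; inputTokenLimit: number }

function pickTextModels(models: GeminiModel[]): GeminiModel[] {
  return models
    // Keep only models that can do text generation.
    .filter((m) => m.supportedMethods.includes("generateContent"))
    .sort((a, b) => {
      const aExp = a.name.includes("exp") ? 1 : 0;
      const bExp = b.name.includes("exp") ? 1 : 0;
      if (aExp !== bExp) return aExp - bExp;        // stable before experimental
      return b.inputTokenLimit - a.inputTokenLimit; // larger input limit first
    });
}
```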
- Update README.md with tag options for build, push, and deploy commands
- Modify server-deploy-production.sh to accept a tag argument defaulting to "production"
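The tag-argument pattern in server-deploy-production.sh is presumably the standard positional-parameter default; a minimal sketch (the function name and echo output are illustrative, not the script's actual contents):

```shell
# Hypothetical sketch of the tag handling in server-deploy-production.sh.
deploy_tag() {
  # "${1:-production}" uses the first argument, or "production" if none given.
  local tag="${1:-production}"
  echo "deploying tag: ${tag}"
}
```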
Add support for OpenAI integration via LangChain by including the @langchain/openai package in both client and server dependencies. This enables dynamic model fetching and other OpenAI-related features.
- Introduce `getCachedApiKey` method to retrieve API keys from cache in production
- Make `getLLM` method async to support cached key retrieval
- Update `healthCheck` and `llmService` to await `getLLM` calls
- Add `OPENROUTER_API_KEY` to critical settings preload

BREAKING CHANGE: `getLLM` method is now async and must be awaited by callers
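The reason `getLLM` became async follows from the cache pattern: the first lookup has to await a load from wherever keys live in production. A sketch with an injected loader (the signature is an assumption; the real method likely reads from a settings store):

```typescript
const keyCache = new Map<string, string>();

// Hypothetical shape of getCachedApiKey: return a cached key, or await
// the loader once and memoize the result.
async function getCachedApiKey(name: string, load: () => Promise<string>): Promise<string> {
  const hit = keyCache.get(name);
  if (hit !== undefined) return hit;
  const key = await load(); // only the cold path is slow
  keyCache.set(name, key);
  return key;
}
```

Any function awaiting this becomes async itself, which is why the change propagates to `getLLM`, `healthCheck`, and `llmService` and is flagged as breaking.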
- Add server/.env.production to gitignore
- Maintain existing pattern of ignoring environment files
- Prevent accidental tracking of sensitive production configuration
- Enhance LLM service with detailed request/response logging, performance metrics, and error tracking
- Add safe property access utilities for robust API response handling
- Implement fallback mechanisms for external API failures (e.g., Unsplash)
- Improve error handling and correlation with unique request IDs across all LLM endpoints

BREAKING CHANGE: LLM service methods now require async initialization and some parameters are mandatory for logging
Add detailed section covering log file locations, real-time monitoring commands, log structure, LLM operation logging, analysis examples, log rotation, and production management best practices for the AiCourse application.
- Implement server-side cookie-based authentication using cookie-parser
- Update all frontend components to remove localStorage token handling
- Configure secure, httpOnly cookies for authentication
- Add logout endpoint to clear authentication cookies
- Remove Authorization header token retrieval in middleware
- Update axios/fetch requests to include credentials
- Create migration documentation in jwt-to-cookies-migration-todo.md
- Modify authentication middleware to read tokens from cookies
- Ensure CSRF protection with strict sameSite settings

Improves application security by preventing XSS attacks and simplifying token management across client and server.
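The cookie attributes the commit names (secure, httpOnly, strict sameSite) can be shown as the `Set-Cookie` value they produce. The helper and cookie name below are hypothetical; in Express this would normally go through `res.cookie()` with equivalent options.

```typescript
// Illustrative: build a Set-Cookie value with the attributes described above.
function authCookie(token: string, maxAgeSeconds: number): string {
  return [
    `token=${encodeURIComponent(token)}`,
    `Max-Age=${maxAgeSeconds}`,
    "HttpOnly",        // not readable from client-side JS (mitigates XSS theft)
    "Secure",          // only sent over HTTPS
    "SameSite=Strict", // not sent on cross-site requests (CSRF protection)
    "Path=/",
  ].join("; ");
}
```

The client side changes accordingly: instead of attaching a token header, requests set `withCredentials: true` (axios) or `credentials: "include"` (fetch) so the browser sends the cookie itself.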
…Storage for persistence
- Implement new Courses page component in `src/pages/dashboard/Courses.tsx`
- Add `/dashboard/courses` route in `src/App.tsx`
- Create comprehensive course management functionality with:
  * Pagination for course listing
  * Course progress tracking
  * Grid and list view modes
  * Course deletion capability
  * Infinite scroll loading
- Integrate user course fetching from backend API
- Add course navigation and interaction methods
- Implement error handling and loading states for course management
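The pagination and infinite-scroll items above share one core computation: slice the current page and report whether more pages remain. A generic sketch (the real page fetches from the backend rather than slicing an in-memory array):

```typescript
// Illustrative page slicer; `hasMore` drives the infinite-scroll "load next"
// decision. Pages are 1-indexed here by assumption.
function paginate<T>(items: T[], page: number, pageSize: number): { items: T[]; hasMore: boolean } {
  const start = (page - 1) * pageSize;
  return {
    items: items.slice(start, start + pageSize),
    hasMore: start + pageSize < items.length,
  };
}
```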
Refactored FlashcardList, GuideList, and QuizList components to use Card components for improved UI consistency and responsiveness. Updated error handling from 'any' to 'unknown' type, enhanced styling with gradients and hover effects, and replaced confirm dialogs with AlertDialog for delete actions. Minor styling adjustments applied to Courses page.
Replace hardcoded dark mode classes with semantic design system classes like text-muted-foreground, bg-muted, and border-input for improved consistency and maintainability across ProviderSelector, FlashcardCreator, FlashcardViewer, GuideCreator, QuizCreator, and GenerateCourse components.
…k buttons

- Add new AppLayout and AppSidebar components for unified navigation
- Wrap public and authenticated routes with AppLayout for consistent UI
- Remove back buttons from FlashcardViewer, GuideViewer, and QuizViewer components
- Add backward compatibility redirects for old dashboard routes
- Enhance auth state management with new useAuthState hook
- Improve settings cache with timeout and robust error handling
- Add dark mode support to QuizViewer and related components
- Optimize API utilities with server URL caching and performance logging
- Update login page to use uid for auth checks and improve navigation
- Add visibility toggle and fork functionality to all content types (courses, quizzes, flashcards, guides)
- Implement public content discovery API endpoints with search and filtering
- Add access control middleware for private content protection
- Include database migrations for visibility and fork tracking fields
- Update UI components with visibility indicators, fork buttons, and attribution
- Add public content browser page and enhanced content lists with filters
- Implement pending fork operations during authentication flow

BREAKING CHANGE: Database schema requires migration to add visibility and fork fields to existing content
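The access-control middleware's core decision reduces to a visibility predicate; a minimal sketch with an assumed content shape (the real middleware would look the record up and return 403/404 on failure):

```typescript
// Illustrative visibility check behind an access-control middleware.
interface Content { ownerId: string; visibility: "public" | "private" }

function canView(content: Content, userId: string | null): boolean {
  if (content.visibility === "public") return true; // anyone may read public content
  return userId !== null && userId === content.ownerId; // private: owner only
}
```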
…nd email sender script
…or palette, and enhance responsive layouts.
…racefully. It will now show a fallback icon if an image fails to load. Please verify the changes.
… refine UI/layout for flashcard and quiz lists.