This project is a real-time chat service that leverages an LLM API to provide seamless, natural conversations.
- Clone the repository and install the dependencies

  ```bash
  git clone https://github.com/GWjun/global-chat.git
  cd global-chat
  yarn install
  ```
- Create a `.env` file (an example appears below the list)

  ```bash
  cp .env.example .env
  ```
- Start the database

  Starts the database via Docker, then pushes the Prisma schema to it (a quick connection check is sketched below the list).

  ```bash
  yarn docker:start
  npx prisma db push
  ```
- Start the application
  - Development:

    ```bash
    yarn dev
    ```
  - Production:

    ```bash
    yarn start
    ```

    Builds the application and starts the server in production mode.
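For reference, the `.env` might look something like the sketch below. The variable names are assumptions for illustration: `DATABASE_URL` is the name Prisma conventionally reads, while `LLM_API_KEY` is a placeholder; the authoritative list lives in `.env.example`.

```env
# Hypothetical example -- consult .env.example for the actual variable names.
DATABASE_URL="postgresql://user:password@localhost:5432/global_chat"
LLM_API_KEY="your-api-key"   # assumed name for the LLM API credential
```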
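To confirm the database step worked, a throwaway script along these lines can help. `PrismaClient`, `$connect()`, and `$disconnect()` are standard Prisma Client APIs; the file name and how you run it (e.g. with `tsx`) are assumptions.

```ts
// check-db.ts -- hypothetical one-off script to verify the database is reachable.
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

async function main() {
  // $connect() fails fast if DATABASE_URL is wrong or the container is down.
  await prisma.$connect();
  console.log("Database connection OK");
}

main()
  .catch((error) => {
    console.error("Database connection failed:", error);
    process.exit(1);
  })
  .finally(() => prisma.$disconnect());
```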
Project structure:

```
├── dist/          # Production build output
├── src/           # React application source files
│   ├── main.tsx   # Client-side entry point
│   └── routes/    # Route objects
└── server/        # Fastify server code
    ├── entry.tsx  # Server-side entry point for the SSR render
    └── main.ts    # Main server file
```
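For orientation, `server/main.ts` plausibly wires Fastify to the SSR entry along the lines of the sketch below. This is not the project's actual code: the `render()` export and its signature are assumptions about what `server/entry.tsx` provides.

```ts
// Rough sketch of the server shape, not the actual implementation.
import Fastify from "fastify";

// Assumption: server/entry.tsx exports render(url), returning the HTML
// string produced by React's server-side renderer for that route.
import { render } from "./entry";

const app = Fastify({ logger: true });

// Catch-all route: every path is rendered on the server.
app.get("/*", async (request, reply) => {
  const html = await render(request.url); // assumed signature
  reply.type("text/html");
  return `<!DOCTYPE html>${html}`;
});

app.listen({ port: 3000 }).catch((err) => {
  app.log.error(err);
  process.exit(1);
});
```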