Purpose:
When building client applications that use the OpenAI API, you may want to encapsulate requests behind a microservice layer rather than calling the API directly from the client (see the OpenAI documentation). This repo creates a microservice for calling the OpenAI API using the following:
- Amazon Web Services
- Amazon API Gateway
- AWS Lambda
- The Serverless Framework for infrastructure-as-code
- A .env file to hold your OpenAI API key
Deploying:
Pre-requisites:
- You have the Serverless Framework installed and the AWS CLI configured on your workstation. The Serverless Framework is the infrastructure-as-code tool that deploys this microservice. If not, you can refer to this tutorial.
- You have an OpenAI API key. This can be created here: https://beta.openai.com/account/api-keys
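If you still need to set up the prerequisites, one common path (assuming Node.js and npm are already installed) is:

```shell
# Install the Serverless Framework CLI globally
npm install -g serverless

# Configure AWS credentials (access key, secret key, default region)
aws configure
```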
Steps:
- Clone this repo.
- Install project dependencies with:
npm install
- Update the .env file with your own OpenAI API key.
- Deploy this microservice to your own AWS account with:
sls deploy
- Use the deployed microservice to call the OpenAI API!
- To clean up, tear down the microservice in your AWS account with:
sls remove
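Once `sls deploy` completes, the Serverless Framework prints the API Gateway endpoint URL. A hypothetical invocation might look like the following; the URL, route, and request body here are placeholders, so check the deploy output and handler code for the real ones:

```shell
# Base URL printed by `sls deploy` (placeholder shown)
API_BASEURL="https://<api-id>.execute-api.us-east-1.amazonaws.com/dev"

# Hypothetical request; the route and body shape depend on this repo's handler
curl -X POST "$API_BASEURL/completions" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Write a haiku about serverless."}'
```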
Testing:
- This repo uses Jest for unit tests and supertest for integration tests.
- Before running the integration tests, update .env.test with your own API Gateway base URL (API_BASEURL).
- Both unit and integration tests can be run via:
npm test
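As a sketch of what an integration test looks like, a supertest-based test run by Jest might resemble the following; the file name, route, and response assertions are hypothetical, so see the actual test files in this repo:

```javascript
// integration.test.js -- hypothetical sketch, run via `npm test` (Jest)
const request = require('supertest');

describe('OpenAI microservice', () => {
  it('returns a completion for a prompt', async () => {
    const res = await request(process.env.API_BASEURL) // read from .env.test
      .post('/completions') // route is an assumption
      .send({ prompt: 'Say hello' })
      .expect(200);
    expect(res.body).toBeDefined();
  });
});
```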
