sls-openai-api

Purpose:

When building client applications that use the OpenAI API, you may want to encapsulate requests behind a microservice layer rather than calling the API directly from the client (see the OpenAI documentation for API details). This repo creates such a microservice for calling the OpenAI API using the following (minimal configuration and handler sketches follow the list):

  • Amazon Web Services
  • Amazon API Gateway
  • AWS Lambda
  • serverless framework for Infrastructure-as-code
  • .env to hold your OpenAI API key
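The repo's own serverless.yml defines the actual infrastructure; the sketch below is only a minimal illustration of how the serverless framework could wire the Lambda and API Gateway route together and pass the .env key into the function's environment (the service name, runtime, handler path, and route are assumptions, not the repo's real configuration):

    service: sls-openai-api

    useDotenv: true                 # one way to load .env values (serverless framework v3+)

    provider:
      name: aws
      runtime: nodejs18.x
      environment:
        OPENAI_API_KEY: ${env:OPENAI_API_KEY}   # exposed to the Lambda at runtime

    functions:
      completion:
        handler: src/handler.completion         # hypothetical handler path
        events:
          - http:
              path: completion                  # assumed route
              method: post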

(Tech stack diagram)
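Likewise, the Lambda handler itself is not reproduced here; the following is a rough JavaScript sketch of how such a handler might forward a request body to the OpenAI API using the key from its environment (the endpoint path, model, and request shape are assumptions for illustration, and native fetch assumes a Node 18+ runtime):

    // handler.js - illustrative sketch only; the repo's actual handler may differ
    module.exports.completion = async (event) => {
      const { prompt } = JSON.parse(event.body || '{}');

      // Forward the prompt to the OpenAI API, reading the key from the Lambda environment.
      const response = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
        },
        body: JSON.stringify({
          model: 'gpt-3.5-turbo',                        // assumed model
          messages: [{ role: 'user', content: prompt }],
        }),
      });
      const data = await response.json();

      // Return the OpenAI response to the API Gateway caller.
      return {
        statusCode: 200,
        body: JSON.stringify(data),
      };
    };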

Deploying:

Pre-requisites:

  • Node.js and npm installed
  • The serverless framework CLI (sls) installed
  • An AWS account with credentials configured for the serverless framework
  • Your own OpenAI API key

Steps:

  1. Clone this repo.
  2. Install project dependencies with:
npm install
  3. Update the .env file with your own OpenAI API key (an example .env is shown after these steps).
  4. Deploy this microservice to your own AWS account with:
sls deploy
  5. Use the deployed microservice to call the OpenAI API (an example request is shown after these steps).
  6. To clean up, tear down the microservice in your own AWS account with:
sls remove
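For step 3, the .env file is expected to hold the key as an environment variable. A minimal sketch, assuming the variable is named OPENAI_API_KEY (check the repo's .env and serverless.yml for the exact name used):

    OPENAI_API_KEY=sk-...your-key-here...

For step 5, sls deploy prints the API Gateway base URL for the deployed stage. A request to the microservice might then look like the following; the /completion path and JSON payload are assumptions carried over from the sketches above, not necessarily the repo's actual route:

    curl -X POST https://<api-id>.execute-api.<region>.amazonaws.com/dev/completion \
      -H "Content-Type: application/json" \
      -d '{"prompt": "Say hello"}'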

Testing:

  • This repo uses Jest for unit tests and supertest for integration tests.
  • Before running integration tests, update .env.test with your own API Gateway API_BASEURL (an example is shown below).
  • Both unit and integration tests can be run via:
npm test
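As an illustration of the integration-test setup (the route, payload, and use of dotenv below are assumptions rather than the repo's actual test code), .env.test might hold the deployed API Gateway base URL:

    API_BASEURL=https://<api-id>.execute-api.<region>.amazonaws.com/dev

and a supertest test could then exercise the live endpoint:

    // completion.integration.test.js - illustrative sketch only
    const request = require('supertest');
    require('dotenv').config({ path: '.env.test' });   // assumes dotenv is available

    describe('POST /completion', () => {
      it('returns a response from the deployed microservice', async () => {
        const res = await request(process.env.API_BASEURL)
          .post('/completion')
          .send({ prompt: 'Say hello' })
          .set('Content-Type', 'application/json');

        expect(res.statusCode).toBe(200);
      }, 30000); // allow extra time for the live OpenAI call
    });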
