
Next js migration #22

Draft
realisticattorney wants to merge 2 commits into effectiveaccelerationism:main from realisticattorney:next-js-migration

Conversation

@realisticattorney

Hey there,

I’ve been working on a few things to make our lives easier:

- TypeScript and Tailwind (to ship fast)
- The OpenAI API server is now an endpoint (app/api/banger), hosted out of the box by Vercel and running on the Next.js edge runtime (it's fast). No more juggling three local environments or installing dependencies three times over.
- Threw in a spinner and added history to keep track of tweet-banger pairs.

check it out:
https://text-to-banger-next-js.vercel.app/
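For reference, a minimal sketch of what the new edge endpoint could look like (the route path is from this PR; the model name, prompt wording, and response shape are my assumptions, not necessarily what's in the branch):

```typescript
// Hypothetical sketch of app/api/banger/route.ts (Next.js 13 App Router).
export const runtime = 'edge'; // run on the Next.js edge runtime via Vercel

// Pure helper so the prompt construction is easy to test in isolation.
export function buildPrompt(tweet: string): string {
  return `Rewrite this tweet as a banger:\n\n${tweet}`;
}

export async function POST(req: Request): Promise<Response> {
  const { tweet } = await req.json();

  // Call the OpenAI chat completions API; the key comes from Vercel env vars,
  // so it never reaches the browser.
  const res = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'gpt-3.5-turbo', // assumption: whichever model the branch actually uses
      messages: [{ role: 'user', content: buildPrompt(tweet) }],
    }),
  });

  const data = await res.json();
  return Response.json({ banger: data.choices[0].message.content });
}
```

Since this runs server-side, Vercel hosts it with zero extra setup, which is part of why the three-local-environments dance goes away.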

Haven't migrated yet:
- The functions from the model server (@martinshkreli, can I get the Twitter API keys?)
- The light/dark mode switch (it's broken on main, so I've commented it out for now)
- The tweet banger button/link

Other to-dos:
- CI/CD so PRs automatically show a deployment for each commit pushed, making reviewers' jobs way easier
- A function that triggers a build to automate fine-tuning
- The fine-tuning itself (will need Twitter API keys for this as well)
- Many changes to the UI and the handling of bangers

TL;DR: Made things faster to develop, iterate on, and ship; added a tweet-banger history and a loading spinner. More to come later!

That’s about it. LMK what you think!

@realisticattorney realisticattorney marked this pull request as draft August 11, 2023 20:18
@codethazine
Collaborator

Hey @realisticattorney, thanks for the contrib! Will wait for the whole migration, but LGTM.

A couple of notes on my side:

  1. The model functions you're referring to in model/data-scripts are meant to be run one-shot to retrieve and filter the data for fine-tuning the model, so no need to productionize them yet
  2. The light/dark mode switch worked on my local, what error were you getting?
  3. The data has been downloaded and filtered already, so no need for the Twitter API key anymore

As a side note, the next step is fine-tuning, so it might be useful to think about how we want to integrate it with the current OAI server setup. I'm thinking of fine-tuning this weekend directly through OAI and seeing how that performs. If it's good enough, I'd say we can keep the current setup; otherwise, we'd need to figure out how to deploy the fine-tuned model (llama2, probably) in a scalable way and integrate it.
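If we go the OAI route, the tweet-banger pairs would need to be serialized as JSONL first; a minimal sketch of that step (the function name is mine, and the "###" separator / " END" stop-token conventions follow OpenAI's legacy prompt/completion fine-tune format, so adjust to whatever we actually pick):

```typescript
// Turn (tweet, banger) pairs into fine-tuning JSONL: one JSON object
// per line with "prompt" and "completion" fields.
export function toFinetuneJsonl(pairs: Array<[string, string]>): string {
  return pairs
    .map(([tweet, banger]) =>
      JSON.stringify({
        prompt: `${tweet}\n\n###\n\n`, // "###" marks the end of the prompt
        completion: ` ${banger} END`,  // leading space + explicit stop token
      })
    )
    .join('\n');
}

// Example usage: build the JSONL, then write it to disk and upload it
// as the training file for the fine-tune job.
const jsonl = toFinetuneJsonl([
  ['just shipped the app', 'WE ARE SO BACK. the app is LIVE.'],
]);
console.log(jsonl);
```

Keeping this as a pure function makes it easy to reuse from either a one-shot script or a build step later.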

@realisticattorney
Author

@codethazine thanks for the quick response!

1 & 3: Ohh OK, now I see, gotcha. Yeah, I'm looking at the data you pushed an hour ago, awesome!! Gonna play around with it.
2: The dark mode doesn't fully revert back:

[Screenshots from 2023-08-11 at 19:03:02 and 19:03:07 showing the dark mode not fully reverting]

I agree with your points. Was thinking of giving davinci-003 a shot at fine-tuning too, but maybe I'll wait for you to do it? Also, I personally want to fine-tune llama-2 for something anyway, so I'll try that on AWS, or might try both!

@realisticattorney realisticattorney marked this pull request as ready for review August 16, 2023 19:02
@realisticattorney
Author

realisticattorney commented Aug 16, 2023

@codethazine hey quick update:

- Added dark mode
- Matched the code with the new @finnbags design
- Added history of prompt-banger pairs

check it out: https://text-to-banger-next-js.vercel.app/

Lmk what you think!

@codethazine
Collaborator

Hey @realisticattorney, please fix the PR to maintain the monorepo structure and remove node_modules.

@codethazine codethazine marked this pull request as draft August 19, 2023 22:11