Conversation

@stdthoth
Contributor

@stdthoth stdthoth commented Jan 6, 2025

/claim #8

Writer's Checklist

Writing Structure

  • Use short sentences and paragraphs, and include bucket brigades.
  • Include more than two descriptive H2 headings to organize content.
  • Capitalize headings according to the AP Stylebook
    (use this tool)
  • Include an introduction with at least two paragraphs before the first H2
    section.
  • Use appropriate formatting (bold, italic, underline), notes, quotes,
    TLDRs, and key points.
  • Incorporate code elements and Markdown format where appropriate.
  • Ensure at least one visual element per “screen” of the article
    (screenshots, diagrams, tables, graphs, lists, sidenotes, blockquotes).

Fact-Checking

  • Verify all facts and data points included in the article.

Asset Management

  • Save images in the /assets folder.
  • Follow naming conventions:
    YYYYMMDD_title_of_the_article_IMG_NAME_NO.png.
  • (Optional) Create a GitHub repo for the code referenced in the article and
    share it.
  • (Optional) Include a link to this Loom video in the PR comments.

Interlinking

Glossary/Definitions

  • Create new definitions in the /definitions folder.

Review and Edit

  • Ensure articles address the needs of the target audience and their search
    intent.
  • Read the article out loud to catch any awkward phrasing.
  • Run the draft through Grammarly or a similar
    grammar tool.
  • Double-check adherence to the style guide and repository guidelines.
  • Use the name of the article for the title of the PR.

@stdthoth stdthoth force-pushed the daytona-gpu-utilization branch 2 times, most recently from d3f651b to 3c81cf6 Compare January 6, 2025 22:20
@stdthoth
Contributor Author

stdthoth commented Jan 7, 2025

Happy new year, @mojafa! This PR is ready for review.

@stdthoth
Contributor Author

stdthoth commented Jan 9, 2025

@mojafa could you please review this PR? I'd like to move on to another issue.

@mojafa
Contributor

mojafa commented Jan 10, 2025

@stdthoth apologies, just to let you know there's another PR that also tackles the same issue. I'm reviewing all of them and will assign the one with the best quality.

@stdthoth
Contributor Author

@mojafa no problem. Take your time; I'm available for any clarifications.

@stdthoth
Contributor Author

Any updates on the review, @mojafa?

@mojafa
Contributor

mojafa commented Jan 16, 2025

@stdthoth Just to let you know, someone else was assigned this task but had to close their PR as he had attempted more than one issue. I'll also check for code similarity.

I'm reviewing your PR now, and if it's sufficient, I'll assign this to you.

@mojafa
Contributor

mojafa commented Jan 16, 2025

@stdthoth please link keywords in most of your paragraphs, from the intro all the way down. It's very plain, and terms like GPU should be linked as keywords.

Screenshot 2025-01-16 at 14 06 05

@mojafa
Contributor

mojafa commented Jan 16, 2025

Also, your TL/DR section is not supposed to be there.
The intro section (anything before TL/DR) is too long. Follow the template.

I'll test the code, but please ensure that your article follows the template.

Equally, this is an article, not a guide, according to the issue: #8

Signed-off-by: stdthoth <ichthoth@gmail.com>
…in GPU utilization guide

…ver, and LLM inference and added as keyword in article

Signed-off-by: stdthoth <ichthoth@gmail.com>
@stdthoth stdthoth force-pushed the daytona-gpu-utilization branch from 4ae7386 to 51932d1 Compare January 16, 2025 20:02
Signed-off-by: stdthoth <ichthoth@gmail.com>
@stdthoth
Contributor Author

Also, your TL/DR section is not supposed to be there. The intro section (anything before TL/DR) is too long. Follow the template.

I'll test the code, but please ensure that your article follows the template.

Equally, this is an article, not a guide, according to the issue: #8

  • Reduced the intro section and made it more relevant to the subject
  • Added new definitions and linked them to the respective keywords
  • Moved the article to the article folder
  • Added references

@stdthoth
Contributor Author

@mojafa any updates on this?

@mojafa
Contributor

mojafa commented Jan 24, 2025

@stdthoth allow me to re-review this over the next 2 days

@stdthoth
Contributor Author

@mojafa okay thanks 👍🏽

@stdthoth
Contributor Author

@mojafa any updates?

@mojafa
Contributor

mojafa commented Jan 29, 2025

@stdthoth reviewing now

@mojafa
Contributor

mojafa commented Jan 29, 2025

@stdthoth I've assigned this issue to you.

However, I'll need more time, as I need to provision a GPU-based machine to run the NVIDIA stack.

In the meantime, please share a screen recording of your solution.

The content of the article looks good.

@mojafa
Contributor

mojafa commented Feb 1, 2025

@stdthoth please message me on Slack (@jafa).

I need your time on provisioning a VM.

@stdthoth
Contributor Author

stdthoth commented Feb 1, 2025

@mojafa sent

@mojafa
Contributor

mojafa commented Feb 2, 2025

@stdthoth I managed to spin up a GPU-enabled VM, but I'm facing some issues.

I feel like your article jumps directly into the technical steps without providing some essential context upfront, such as system architecture requirements and prerequisites.

My suggestions:

  1. Add more context in the introductory section to introduce the reader to the necessary system specifications and prerequisites in a clearer way. This could include a brief overview of the hardware, software, and environment compatibility.
  2. Talk about the System Architecture: Before diving into the installation steps, briefly mention the supported architectures (e.g., x86_64 vs ARM) and CUDA compatibility, as this can impact the ability to run GPU-based tasks.
  3. Clarify Prerequisites: In the "Prerequisites" section, list the minimum requirements such as:
    • System Architecture: AMD64 (x86_64) or ARM (e.g., NVIDIA Jetson devices).
    • Operating System: Ubuntu or other Linux distributions (with specific versions).
    • Hardware Requirements: NVIDIA GPU with CUDA support (e.g., RTX series or higher).
    • Software Requirements: Docker, WSL2 for Windows users, IDEs (VS Code/JetBrains), CUDA drivers.
  4. Verify GPU Compatibility: Make sure to highlight the importance of checking the GPU's architecture and whether it's supported by CUDA, as this directly affects the ability to run GPU-enabled containers.
  5. Diagram or Visualization: Consider including a visual representation of how each piece of the environment fits together, such as:
    • High-level architecture of GPU utilization.
    • System requirements (OS version, GPU, etc.).
  6. Define Key Terms Early: Some terms like "CUDA-enabled GPU," "Daytona," "LLM," and "Fine-Tuning" are defined later, but it might help to give brief explanations earlier in the article. A quick glossary at the start could help the reader grasp the terminology before diving into technical details.
  7. Guide for Different OS Users: Since you're dealing with different environments (Linux and Windows), you could add a section like "Special Considerations for Linux vs. Windows" early on to guide users based on their system.
  8. Pre-installation Verification: You can have a section to verify that the prerequisites are met before proceeding, such as checking if Docker, WSL2, NVIDIA drivers, and CUDA are installed and functioning.
  9. Final Walkthrough/Conclusion: Wrap up with a brief summary or checklist that recaps the setup process and highlights the steps that users should verify to ensure everything is set up correctly.

Revised Example Intro:

Prerequisites and System Architecture

Before starting the setup process for GPU-based LLM fine-tuning and inference with Daytona, ensure your system meets the following requirements:

Hardware:

  • CPU: x86_64 (AMD64) or ARM architecture (e.g., NVIDIA Jetson devices).
  • GPU: CUDA-compatible NVIDIA GPU (e.g., RTX 20xx, 30xx, or A100).
  • Memory: Minimum of 8GB RAM recommended, 16GB or more for better performance.

Software:

  • Operating System: Ubuntu 18.04+ or other Linux distributions (Windows users will use WSL2).
  • Docker: Must be installed and configured for running containers.
  • NVIDIA Drivers: Ensure the correct GPU drivers for your hardware are installed. You can verify this by running nvidia-smi.

Environment:

  • CUDA: Install the correct version of the CUDA toolkit matching your GPU drivers. Ensure compatibility between Docker images and the driver version.
  • WSL2: For Windows users, ensure that WSL2 is properly configured to allow GPU access within Linux environments.

Something like:
This guide will walk you through setting up a Daytona-powered environment for LLM fine-tuning and inference using your GPU. It assumes you have a CUDA-compatible GPU and a Linux-based system (Windows users need to enable WSL2). You’ll use Docker containers, Daytona, and VS Code to streamline the process and leverage your GPU for faster model training and inference.
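The pre-installation verification suggested in point 8 could be sketched as a small shell script. The tool list mirrors the prerequisites above; adjust it for your own setup:

```shell
# Rough pre-installation check for the prerequisites listed above.
# Reports each tool as ok/missing rather than failing on the first gap.
check() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: ok"
  else
    echo "$1: missing"
  fi
}

for tool in git docker daytona nvidia-smi; do
  check "$tool"
done
```

On a correctly provisioned GPU VM, every line should read ok; a missing nvidia-smi usually means the NVIDIA drivers are not installed.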


This way, the reader gets a full picture right from the start!

@mojafa
Contributor

mojafa commented Feb 3, 2025

@stdthoth running into this error. Looks like the image is outdated. I'll find a workaround or a fix as well.
Screenshot 2025-02-03 at 11 18 33

@mojafa
Contributor

mojafa commented Feb 3, 2025

I had to use: docker run --gpus all nvidia/cuda:12.0.1-base-ubuntu22.04 nvidia-smi

Screenshot 2025-02-03 at 11 21 26
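That tag works because it matches a CUDA release the host driver supports (nvidia-smi prints the driver's maximum CUDA version in its header). A tiny helper to build the same smoke-test command for other versions; the tag naming is assumed from the nvidia/cuda images on Docker Hub, so confirm the exact tag exists before relying on it:

```shell
# Build the GPU smoke-test command for a given nvidia/cuda base-image version.
# Tag naming is assumed from Docker Hub conventions; verify the tag exists.
gpu_smoke_test() {
  echo "docker run --gpus all nvidia/cuda:${1}-base-ubuntu22.04 nvidia-smi"
}

gpu_smoke_test 12.0.1
```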

@mojafa
Contributor

mojafa commented Feb 3, 2025

@stdthoth please have a look at your dev configs; running daytona create fails to create the workspace.

Please test locally.

Screenshot 2025-02-03 at 11 28 56

@mojafa
Contributor

mojafa commented Feb 3, 2025

There's a step you've missed here: after pushing the code to GitHub, please specify that readers who are running VMs need to SSH into the server and do a daytona create, etc.

Also, at the top, when doing the setup, specify that yours uses a hardware setup and that users can set up a GPU-enabled Linux VM from cloud providers. They can verify that git, daytona, docker, nvidia-smi, and all the other commands are installed and running. From there, they can do daytona git-provider add, then daytona create repo-url. Something like that; you get the gist.
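The VM flow described above could be captured in the article as a short sequence. The sketch below echoes the Daytona commands as a dry run instead of executing them, since they need a provisioned GPU VM; repo-url is a placeholder:

```shell
# Dry-run sketch of the VM flow described above. On the real VM you would
# first `ssh <user>@<vm-ip>` and verify the toolchain, e.g.:
#   git --version && docker --version && nvidia-smi
# This function echoes the Daytona steps rather than running them.
daytona_setup() {
  repo_url=$1   # placeholder: the GitHub repo the reader pushed the code to
  echo "daytona git-provider add"
  echo "daytona create $repo_url"
}

daytona_setup "<repo-url>"
```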

NOTE: I had to request access to GPUs from AWS; the request took a few days, so you might also include that. There's no fast way to get a GPU-enabled device running, apart from buying the hardware.

Screenshot 2025-02-03 at 11 30 26

@stdthoth
Contributor Author

stdthoth commented Feb 3, 2025

@mojafa already making edits

Signed-off-by: stdthoth <ichthoth@gmail.com>
@stdthoth
Contributor Author

stdthoth commented Feb 3, 2025

@mojafa I have made all the necessary changes you requested 👌🏽

@stdthoth
Contributor Author

stdthoth commented Feb 3, 2025

@mojafa concerning the workspace failing to start: I made a mistake with the name of the .txt file the LLM uses for fine-tuning. Please change train.txt to data.txt in the Dockerfile and also in finetune_workflow.py. Apologies for the error; I have corrected it in the article.
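For anyone applying the fix locally before the article update lands, the rename can be done in one pass. GNU sed -i is shown (BSD/macOS sed needs -i ''), and the demo runs in a temp dir with made-up file contents so it is safe to execute anywhere:

```shell
# Demo of the train.txt -> data.txt rename in a temp dir; in a real checkout
# you would run only the sed line, at the repository root.
tmp=$(mktemp -d)
printf 'COPY train.txt /app/train.txt\n' > "$tmp/Dockerfile"
printf 'data = open("train.txt").read()\n' > "$tmp/finetune_workflow.py"

# Replace every occurrence of train.txt with data.txt in both files:
sed -i 's/train\.txt/data\.txt/g' "$tmp/Dockerfile" "$tmp/finetune_workflow.py"

grep 'data\.txt' "$tmp/Dockerfile"
```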

Signed-off-by: stdthoth <ichthoth@gmail.com>
@nibzard
Contributor

nibzard commented Feb 6, 2025

I have tried reading the article, but it is not of the expected quality. It is hard to read, and the introduction misses the point. Please read the connected issue and think about how to rewrite the article more fluently.

utilization
/ˌjuːtɪlʌɪˈzeɪʃn/
noun
the action of making practical and effective use of something.

...and not "a percentage of a [GPU's]" - this is nonsense....

I appreciate the hard work, but this needs to be properly written, have a narrative and a story arc. And in the end make sense. @stdthoth

@nibzard
Contributor

nibzard commented Feb 10, 2025

Due to the low quality of the writing and the lack of response, I have to close this. Thanks.

@nibzard nibzard closed this Feb 10, 2025
@stdthoth
Contributor Author

@nkko I am currently working on this, can you please reopen.. I'll have something by tomorrow
