Resource-efficient LLM distillation: Improving sustainability and reducing computational costs of Large Language Models in financial analytics through knowledge distillation.
Updated May 15, 2025 · Jupyter Notebook
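A minimal sketch of the kind of distillation objective such a project might use, assuming a PyTorch teacher/student setup; the temperature `T` and weighting `alpha` are illustrative defaults, not values taken from the repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft-target KL divergence (teacher -> student) with hard-label cross-entropy.

    T softens both distributions; alpha balances the two terms. Both hyperparameters
    are illustrative, not taken from the repository.
    """
    # Soft targets: KL divergence between temperature-scaled distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```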
SymRAG adaptively routes queries through neuro-symbolic, neural, or hybrid paths based on complexity and system load, ensuring efficient and accurate RAG for diverse QA tasks.
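A minimal sketch of the routing idea as described: pick a processing path from query complexity and current system load. The thresholds and the path names as Python identifiers are illustrative assumptions, not SymRAG's actual routing policy.

```python
from enum import Enum

class Path(Enum):
    SYMBOLIC = "neuro-symbolic"
    NEURAL = "neural"
    HYBRID = "hybrid"

def route_query(complexity: float, system_load: float) -> Path:
    """Choose a retrieval path from query complexity and system load (both in [0, 1]).

    Thresholds are illustrative; the real policy may be learned or tuned differently.
    """
    if complexity < 0.3:
        # Simple, structured queries: take the cheap neuro-symbolic path.
        return Path.SYMBOLIC
    if complexity > 0.7 and system_load < 0.6:
        # Hard queries get the full hybrid pipeline when capacity allows.
        return Path.HYBRID
    # Fall back to the purely neural path under high load or mid-range complexity.
    return Path.NEURAL
```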
Claude Code plugin that prevents unnecessary file creation through smart policies and modular SKILLs. Chat-first approach with contextual analysis for cleaner codebases.
A machine-learning project for automatically distinguishing woven from knitted fabrics for textile recycling. Includes a custom demo dataset, an annotation guide, model training, and evaluation.
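A minimal sketch of a binary image classifier for the woven-vs-knitted task, assuming a pretrained torchvision ResNet-18 backbone; the backbone choice and two-class head are placeholders, not the project's actual architecture or training setup.

```python
import torch.nn as nn
from torchvision import models

def build_fabric_classifier(num_classes: int = 2) -> nn.Module:
    """Binary classifier (woven vs. knitted) on top of a pretrained ResNet-18.

    The backbone and head are assumptions for illustration; the project's
    actual model and training code may differ.
    """
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    # Replace the ImageNet head with a two-class output layer.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model
```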
Source code for a bachelor's thesis.
Add a description, image, and links to the resource-efficiency topic page so that developers can more easily learn about it.
To associate your repository with the resource-efficiency topic, visit your repo's landing page and select "manage topics."