dalgarak
  • Electronics and Telecommunications Research Institute @etri
  • Daejeon, Korea

Pinned

  1. lua-sentencepiece (Public)

    Simple Lua bindings for sentencepiece (https://github.com/google/sentencepiece)

    C++

  2. conv2kr (Public)

    Forked from hongcho/conv2kr

    Automatically exported from code.google.com/p/conv2kr

    Perl

  3. Megatron-LM (Public)

    Forked from NVIDIA/Megatron-LM

    Ongoing research training transformer models at scale

    Python · 1 star · 5 forks

  4. ms-swift (Public)

    Forked from modelscope/ms-swift

    Use PEFT or full-parameter training for CPT/SFT/DPO/GRPO on 500+ LLMs (Qwen3, Qwen3-MoE, Llama4, InternLM3, DeepSeek-R1, ...) and 200+ MLLMs (Qwen2.5-VL, Qwen2.5-Omni, Qwen2-Audio, Ovis2, InternVL3, Llava, GLM4…

    Python

  5. etri-crossmodal/gbswt5 (Public)

    CharFormer (Tay et al., 2022; Gradient-Based Subword Tokenizer + T5) model implementation for Huggingface Transformers; a loading sketch follows below.

    Python · 19 stars · 12 forks
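
    A hypothetical loading sketch (not taken from the repository): it assumes the custom GBST+T5 classes are exposed to Huggingface Transformers via remote code, and the checkpoint id below is a placeholder rather than a confirmed published model.

    ```python
    # Minimal sketch, assuming the gbswt5 model classes are loadable through
    # Transformers' trust_remote_code mechanism; "etri-crossmodal/gbswt5-base"
    # is a placeholder id, not a verified checkpoint name.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    model_id = "etri-crossmodal/gbswt5-base"  # placeholder
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id, trust_remote_code=True)

    # GBST works on character/byte-level input, so no extra subword
    # pre-tokenization is assumed beyond what the loaded tokenizer provides.
    inputs = tokenizer("example input text", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```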

  6. etri-crossmodal/llm-downstream-s2s (Public)

    PyTorch Lightning-based fine-tuning software for pretrained seq-to-seq LMs (BART, T5)

    Python · 12 stars · 1 fork