This repository provides daily, easy-to-read TL;DR summaries of recent research papers published on arXiv. By default, it focuses on the following arXiv categories:
- `gr-qc` for General Relativity and Quantum Cosmology
- `hep-th` for High-Energy Physics (Theory)
- `math-ph` for Mathematical Physics
Each day, it automatically:
- Fetches the most recent papers from the previous day in the above arXiv categories
- Uses a transformer-based summarization model to produce readable, condensed summaries
- Commits and pushes a new Markdown file with the latest summaries to this repository at 08:00 CET (07:00 UTC)
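For illustration, here is a minimal sketch of the fetching step. The helper name `fetch_recent_entries` and the exact query parameters are illustrative assumptions, not the repository's actual code; it only shows how the arXiv API can be queried with the `requests` and `feedparser` libraries credited below.

```python
# Sketch only: fetch the newest submissions in one arXiv category.
# fetch_recent_entries and the parameter values are illustrative assumptions.
import feedparser
import requests

ARXIV_API = "http://export.arxiv.org/api/query"

def fetch_recent_entries(category: str, max_results: int = 50):
    """Query the arXiv API for the most recent submissions in one category."""
    params = {
        "search_query": f"cat:{category}",
        "start": 0,
        "max_results": max_results,
        "sortBy": "submittedDate",
        "sortOrder": "descending",
    }
    response = requests.get(ARXIV_API, params=params, timeout=30)
    response.raise_for_status()
    # The API returns an Atom feed; feedparser exposes entries with
    # .title, .summary (the abstract), .published, and .link.
    return feedparser.parse(response.text).entries

if __name__ == "__main__":
    for entry in fetch_recent_entries("gr-qc", max_results=5):
        print(entry.published, "-", entry.title)
```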
- **Data Source:** Papers are fetched using the arXiv API based on the categories listed above (sketched above).
- **Summarization Model:** Summaries are generated using Hugging Face's `transformers` library and the facebook/bart-large-cnn model developed by Meta AI (see the sketch after this list).
- **Automation:** A workflow runs every day at a fixed time and pushes the resulting `.md` file to this repository.
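The following is a hedged sketch of the summarization and output steps (the scheduling itself is handled by the daily workflow and is not shown). The function names `summarize_abstract` and `write_markdown`, the summary length limits, and the file header are illustrative assumptions rather than the repository's actual code.

```python
# Sketch only: condense abstracts with facebook/bart-large-cnn and
# write them to a dated Markdown file. Names and limits are assumptions.
from datetime import date
from transformers import pipeline

# facebook/bart-large-cnn is the model named in this README.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

def summarize_abstract(abstract: str) -> str:
    """Condense one arXiv abstract into a short TL;DR."""
    result = summarizer(abstract, max_length=80, min_length=20, do_sample=False)
    return result[0]["summary_text"]

def write_markdown(category: str, items: list[tuple[str, str]]) -> str:
    """Write (title, summary) pairs to a dated Markdown file and return its path."""
    today = date.today().isoformat()
    path = f"arXivBytes_{category}_summaries_{today}.md"
    with open(path, "w", encoding="utf-8") as f:
        f.write(f"# arXivBytes: {category} ({today})\n\n")
        for title, summary in items:
            f.write(f"## {title}\n\n{summary}\n\n")
    return path
```

In this sketch, running each fetched abstract through `summarize_abstract` and passing the results to `write_markdown` produces the Markdown file that the workflow then commits and pushes.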
- **Latest Summary:** The most recent summary files are placed in the root directory of this repository and are named `arXivBytes_{category}_recent.md`, where `{category}` corresponds to the arXiv category.
- **Previous Summaries:** All previously generated `.md` files remain in the repository's commit history and are also available in each arXiv category folder. You can view older summaries by browsing past commits or the relevant category folder. These summaries are saved as `arXivBytes_{category}_summaries_{date}.md`, where `{date}` is the date on which the summary was generated.
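As a small sketch of the naming convention described above: the helper name `output_paths` and the ISO date format are assumptions, since the README does not pin down the exact date format used in the filenames.

```python
# Sketch only: build the "latest" and dated archive filenames.
# output_paths and the ISO date format are illustrative assumptions.
from datetime import date

def output_paths(category: str, run_date: date) -> tuple[str, str]:
    """Return the 'latest' filename and the dated archive filename."""
    latest = f"arXivBytes_{category}_recent.md"
    archived = f"arXivBytes_{category}_summaries_{run_date.isoformat()}.md"
    return latest, archived

print(output_paths("hep-th", date(2024, 1, 15)))
# ('arXivBytes_hep-th_recent.md', 'arXivBytes_hep-th_summaries_2024-01-15.md')
```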
- **arXiv:** The research papers come from arXiv, a free distribution service and open-access archive for scholarly articles.
- **Summarization Model:** The summaries are generated using facebook/bart-large-cnn, a model provided by Facebook (Meta) and hosted on Hugging Face. Credit goes to the developers and maintainers of the model.
- **Tooling:**
  - Hugging Face Transformers for model inference and pipelines.
  - Feedparser and Requests for fetching and parsing arXiv data.
The arXiv paper abstracts are subject to arXiv’s terms of use and the original authors’ copyright.
The facebook/bart-large-cnn model is provided under its respective license on Hugging Face.
This project is strictly non-commercial.