2 changes: 1 addition & 1 deletion .copier-answers.yml
Original file line number Diff line number Diff line change
@@ -1,5 +1,5 @@
# Do NOT update manually; changes here will be overwritten by Copier
_commit: 2f2f7c4
_commit: a740779
_src_path: https://github.com/ingadhoc/addons-repo-template.git
description: ''
is_private: false
25 changes: 6 additions & 19 deletions .github/copilot-instructions.md
@@ -38,8 +38,6 @@

* Confirm that every file used (views, security, data, reports, wizards) is referenced in the manifest.
* Verify declared dependencies: no required modules missing and no unnecessary ones declared.
* **Version rule (mandatory):**
Only suggest a version bump if `__manifest__.py` does not increment `version` and the structure of a model, a view, or some .xml record was modified (e.g. changes in field definitions, XML views, XML data, security).
* Only do it once per review, even if multiple files are affected.

---
@@ -61,7 +59,6 @@

* Check the `ir.model.access.csv` files for new models: they must grant only the minimum required permissions.
* Do not propose opening global access without justification.
* If new models or access-control fields are added, **remember the version bump** (see the manifest section).
* If `record rules` are changed, pay special attention to multi-company and multi-website combinations.

### ORM security and performance
@@ -86,7 +83,7 @@

## Structural changes and migration scripts – **general considerations**

When the diff suggests **data structure changes**, **always assess** whether a **migration script** under `migrations/` (pre/post/end) should be proposed **and remember the version bump**.
When the diff suggests **data structure changes**, **always assess** whether a **migration script** under `migrations/` (pre/post/end) should be proposed.
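The kind of migration script called for here can be sketched as a minimal, idempotent pre-script; the version directory, table, and column names below are hypothetical, not taken from this repository:

```python
# migrations/18.0.1.1.0/pre-migration.py  (hypothetical path and version)
import logging

_logger = logging.getLogger(__name__)


def migrate(cr, version):
    """Copy values from a renamed column before the ORM upgrade runs.

    Idempotent: the ADD COLUMN is guarded with IF NOT EXISTS and the
    UPDATE only touches rows not yet migrated, so re-running the script
    leaves the data unchanged.
    """
    if not version:  # fresh install, nothing to migrate
        return
    # Hypothetical rename: res_partner.old_ref -> res_partner.new_ref
    cr.execute(
        "ALTER TABLE res_partner ADD COLUMN IF NOT EXISTS new_ref varchar"
    )
    cr.execute(
        """
        UPDATE res_partner
           SET new_ref = old_ref
         WHERE new_ref IS NULL
           AND old_ref IS NOT NULL
        """
    )
    _logger.info("Copied old_ref -> new_ref (%s rows)", cr.rowcount)
```

A matching post-script would then trigger the recompute of any stored field that depends on the new column.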

### General structure rules for `migrations/`

@@ -283,31 +280,21 @@ def migrate(cr, registry):
| ------------------ | -------------------------------------------------------------------------------------------------------- |
| Models | Valid relations; constraints; proper use of `@api.depends`; correct `super()` |
| XML views | Correct inheritance; valid fields; adaptation to version changes (e.g. `<list>` vs `<tree>`) |
| Manifest | **Mandatory version bump** if there are structural changes in models/views/.xml records; files referenced |
| Security | Minimum required access; rules reviewed |
| Migrations | **If there are structural changes, suggest a script in `migrations/` (pre/post/end)** and describe what it does |
| Performance / ORM | Avoid costly loops; no unnecessary SQL; leverage the ORM optimizations of the version |
| Spelling & typos | Obvious errors fixable without changing language or style |

---

## Practical heuristic for the version bump (general)

* **IF** the diff modifies the structure of a model, a view, or some .xml record (e.g. changes in field definitions, XML views, XML data, security)
**AND** `__manifest__.py` does not change `version` → **Suggest a bump**.
* **IF** there are new `migrations/pre_*.py` or `migrations/post_*.py` scripts → **Suggest at least a minor bump**.
* **IF** there are changes that break compatibility (renames, impactful type changes, massive data cleanup) → **Suggest minor/major** depending on impact.

---

## Feedback style (general)

* Be brief, clear, and useful. Examples:

* "The field `partner_id` is not referenced in the view."
* "This method overrides `write()` without calling `super()`."
* "Tip: there is a spelling mistake in the parameter name."
* **Bump + migration:** "`old_ref` is renamed to `new_ref`: missing a **version bump** and a **pre-script** in `migrations/` to copy values before the upgrade; add a **post-script** to recompute the stored field."
* **Migration:** "`old_ref` is renamed to `new_ref`: missing a **pre-script** in `migrations/` to copy values before the upgrade; add a **post-script** to recompute the stored field."

* Avoid long explanations or full rewrites unless the change is clear and necessary.
* Prefer comments as a **short bullet list** (3–7 items) and brief sentences instead of long blocks of text.
@@ -316,10 +303,10 @@ def migrate(cr, registry):

## Operational summary for Copilot

1. **Detect structural changes in models, views, or .xml records → require a `version` bump in `__manifest__.py` if it is not incremented.**
2. **If there is a structural change (per the updated list) → propose and describe migration script(s) in `migrations/` (pre/post/end)**, with an idempotent, batched approach.
3. Distinguish between:
1. **If there is a structural change (per the updated list) → propose and describe migration script(s) in `migrations/` (pre/post/end)**, with an idempotent, batched approach.
2. Distinguish between:

* **general considerations** (valid for any version),
* and **Odoo 18-specific nuances** (for example, use of `<list>`, passkeys, tours, and framework behavior).
4. Keep the feedback **concrete, brief, and actionable**.

3. Keep the feedback **concrete, brief, and actionable**.
7 changes: 6 additions & 1 deletion .github/workflows/pre-commit.yml
@@ -6,8 +6,13 @@ name: pre-commit

on:
  push:
    branches: "[0-9][0-9].0"
    branches:
      - "1[8-9].0"
      - "[2-9][0-9].0"
  pull_request_target:
    branches:
      - "1[8-9].0*"
      - "[2-9][0-9].0*"

jobs:
  pre-commit:
2 changes: 2 additions & 0 deletions .pre-commit-config.yaml
@@ -29,6 +29,8 @@ repos:
      - id: check-docstring-first
      - id: check-executables-have-shebangs
      - id: check-merge-conflict
        args: ['--assume-in-merge']
        exclude: '\.rst$'
      - id: check-symlinks
      - id: check-xml
      - id: check-yaml
19 changes: 16 additions & 3 deletions export_bg/models/export_bg_mixin.py
@@ -14,6 +14,8 @@ class DateTimeEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, (datetime, date, time)):
            return obj.isoformat()
        if isinstance(obj, (bytes, bytearray, memoryview)):
            return base64.b64encode(bytes(obj)).decode()
        return super().default(obj)
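With the bytes branch added, the encoder serializes both temporal and binary values; a self-contained sketch mirroring the class above (the sample payload is made up):

```python
import base64
import json
from datetime import date, datetime, time


class DateTimeEncoder(json.JSONEncoder):
    """JSON encoder handling datetimes (ISO 8601) and raw bytes (base64)."""

    def default(self, obj):
        if isinstance(obj, (datetime, date, time)):
            return obj.isoformat()
        if isinstance(obj, (bytes, bytearray, memoryview)):
            return base64.b64encode(bytes(obj)).decode()
        return super().default(obj)


payload = {"when": date(2024, 1, 31), "blob": b"\x00\x01"}
print(json.dumps(payload, cls=DateTimeEncoder))
# → {"when": "2024-01-31", "blob": "AAE="}
```

Without the new branch, the `bytes` value would raise `TypeError: Object of type bytes is not JSON serializable`.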


@@ -39,8 +41,18 @@ def _export_chunk_bg(self, data, export_id, export_format):
            ]
        )

        field_names = [f.get("name") or f.get("value") or f.get("id") for f in params["fields"]]
        field_labels = [f.get("label") or f.get("string") for f in params["fields"]]
        # Extract field names considering import_compat mode
        import_compat = params.get("import_compat", True)

        # For field_names (data extraction), always use the technical field name
        # Only use 'value' as fallback when import_compat=True (for import compatibility)
        if import_compat:
            field_names = [f.get("name") or f.get("value") or f.get("id") for f in params["fields"]]
            field_labels = field_names  # Use field names as headers for import compatibility
        else:
            # When not import_compat, use only 'name' or 'id' for field_names, not 'value'
            field_names = [f.get("name") or f.get("id") for f in params["fields"]]
            field_labels = [f.get("label") or f.get("string") for f in params["fields"]]

        export_data = self.export_data(field_names).get("datas", [])

@@ -129,7 +141,8 @@ def _combine_chunks(self, export_id, export_format):
            ws.write_row(0, 0, chunk_data["headers"])
            row_num = 1
            for row in chunk_data["rows"]:
                ws.write_row(row_num, 0, row)
                cleaned_row = [str(cell) if isinstance(cell, (dict, list)) else cell for cell in row]
                ws.write_row(row_num, 0, cleaned_row)
                row_num += 1
        wb.close()
        chunks.unlink()
64 changes: 64 additions & 0 deletions pot_github_push/README.rst
@@ -0,0 +1,64 @@
.. |company| replace:: ADHOC SA

.. |company_logo| image:: https://raw.githubusercontent.com/ingadhoc/maintainer-tools/master/resources/adhoc-logo.png
:alt: ADHOC SA
:target: https://www.adhoc.com.ar

.. |icon| image:: https://raw.githubusercontent.com/ingadhoc/maintainer-tools/master/resources/adhoc-icon.png

.. image:: https://img.shields.io/badge/license-AGPL--3-blue.png
:target: https://www.gnu.org/licenses/agpl
:alt: License: AGPL-3

=============
POT Generator
=============

Automatic POT (Portable Object Template) file generator for Odoo modules with GitHub API integration.

Features
========

**POT Generation**
- Generate .pot files using Odoo's native ``trans_export``
- Direct GitHub API push (no local Git required)
- Smart content comparison (ignores timestamp changes)

**Integration**
- Runbot compatible execution
- Auto-execution on module installation
- Environment variable configuration

Configuration
=============

Set environment variables for GitHub integration::

    export GITHUB_TOKEN="your_github_token"
    export GITHUB_REPO_OWNER="your_organization"
    export GITHUB_REPO_NAME="your_repository"
    export GITHUB_BRANCH="your_branch"

.. image:: https://odoo-community.org/website/image/ir.attachment/5784_f2813bd/datas
:alt: Try me on Runbot
:target: http://runbot.adhoc.com.ar/

Credits
=======

Images
------

* |company| |icon|

Contributors
------------

Maintainer
----------

|company_logo|

This module is maintained by the |company|.

To contribute to this module, please visit https://www.adhoc.com.ar.
33 changes: 33 additions & 0 deletions pot_github_push/__init__.py
@@ -0,0 +1,33 @@
from . import wizard

import logging
import ast
import os

_logger = logging.getLogger(__name__)


def post_init_hook(env):
    """Auto-generate POT files on installation

    Environment variables:
    - MODULE_INFO: Dict with tuple key (repo_owner, repo_name) and modules list as value
      {("owner", "repo"): ["module1", "module2"], ...}
    - GITHUB_TOKEN: GitHub token (required)
    - GITHUB_BRANCH: Target branch (required)
    """
    module_info = os.getenv("MODULE_INFO", "{}")
    github_token = os.getenv("GITHUB_TOKEN")
    github_branch = os.getenv("GITHUB_BRANCH")

    if not module_info or module_info == "{}":
        _logger.info("No modules specified for POT generation (MODULE_INFO)")
        return False

    try:
        module_info = ast.literal_eval(module_info)
    except Exception as e:
        _logger.error("Error parsing MODULE_INFO: %s", str(e))
        return False

    env["pot.generator"]._generate_pots(module_info, github_token, github_branch)
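Since `MODULE_INFO` is parsed with `ast.literal_eval`, it can be a dict literal with tuple keys, as the docstring describes; a sketch with hypothetical owner, repo, and module names:

```python
import ast
import os

# Hypothetical configuration, as it would be exported in the environment
os.environ["MODULE_INFO"] = '{("ingadhoc", "odoo-addons"): ["export_bg"]}'

# Same parsing step the hook performs
module_info = ast.literal_eval(os.getenv("MODULE_INFO", "{}"))
for (repo_owner, repo_name), module_names in module_info.items():
    print(repo_owner, repo_name, module_names)
# → ingadhoc odoo-addons ['export_bg']
```

`ast.literal_eval` only accepts Python literals, so a malformed or malicious `MODULE_INFO` raises instead of executing code, which the hook catches and logs.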
12 changes: 12 additions & 0 deletions pot_github_push/__manifest__.py
@@ -0,0 +1,12 @@
{
    "name": "POT Generator",
    "version": "18.0.1.0.0",
    "category": "Tools",
    "summary": "Helper module to generate POT files",
    "author": "ADHOC SA",
    "license": "AGPL-3",
    "depends": ["base"],
    "data": [],
    "installable": True,
    "post_init_hook": "post_init_hook",
}
1 change: 1 addition & 0 deletions pot_github_push/wizard/__init__.py
@@ -0,0 +1 @@
from . import pot_generator_wizard
138 changes: 138 additions & 0 deletions pot_github_push/wizard/pot_generator_wizard.py
@@ -0,0 +1,138 @@
import base64
import contextlib
import io
import logging

import requests
from odoo import api, models
from odoo.tools.translate import trans_export

_logger = logging.getLogger(__name__)


class PotGenerator(models.AbstractModel):
    _name = "pot.generator"
    _description = "Simple POT Generator"

    @api.model
    def _generate_pots(self, module_info, github_token, github_branch):
        """Generate POT files for specified modules and push to GitHub

        :param module_info: Dict with tuple key (owner, repo) and modules list {("owner", "repo"): ["mod1"]}
        :param github_token: GitHub API token
        :param github_branch: Target branch name
        """
        try:
            for repo_key, module_names in module_info.items():
                # repo_key should be tuple (owner, repo)
                if isinstance(repo_key, tuple):
                    repo_owner, repo_name = repo_key
                else:
                    _logger.error("Invalid repo key type: %s", type(repo_key))
                    continue

                for module_name in module_names:
                    content = self._generate_pot(module_name)
                    if content:
                        self._github_push(module_name, content, repo_owner, repo_name, github_token, github_branch)
            return True

        except Exception as e:
            _logger.exception("POT generation failed: %s", str(e))
            return False

    def _generate_pot(self, module_name):
        """Generate single POT file"""
        try:
            # Get content using Odoo's trans_export
            with contextlib.closing(io.BytesIO()) as buf:
                trans_export(False, [module_name], buf, "po", self._cr)
                return buf.getvalue().decode("utf-8")
        except Exception as e:
            _logger.exception("Failed POT generation for %s: %s", module_name, str(e))
            return False

    def _github_push(self, module_name, content, repo_owner, repo_name, github_token, branch):
        """Push POT file to GitHub using API

        :param module_name: Name of the module
        :param content: POT file content
        :param repo_owner: GitHub repository owner
        :param repo_name: GitHub repository name
        :param github_token: GitHub API token
        :param branch: Target branch name
        """
        headers = {}
        try:
            # File path in repository
            file_path = f"{module_name}/i18n/{module_name}.pot"

            # GitHub API headers
            headers = {"Authorization": f"Bearer {github_token}", "Accept": "application/vnd.github.v3+json"}

            # Get current file SHA (if exists)
            url = f"https://api.github.com/repos/{repo_owner}/{repo_name}/contents/{file_path}"
            params = {"ref": branch}
            response = requests.get(url, headers=headers, params=params, timeout=30)

            sha = None
            if response.status_code == 200:
                file_info = response.json()
                sha = file_info["sha"]

                # Compare content to avoid unnecessary pushes
                existing_content = base64.b64decode(file_info["content"]).decode("utf-8")
                if self._pot_content_equal(existing_content, content):
                    _logger.info("File %s content unchanged (ignoring timestamps), skipping push", file_path)
                    return True

            elif response.status_code == 404:
                _logger.info("File %s does not exist, will create new", file_path)
            else:
                _logger.error("Error getting file info: %s", response.text)
                return False

            content_encoded = base64.b64encode(content.encode("utf-8")).decode("utf-8")

            # Prepare commit data
            commit_data = {
                "message": f"[I18N] {module_name}: export source terms",
                "content": content_encoded,
                "branch": branch,
            }
            if sha:
                commit_data["sha"] = sha

            # Push to GitHub
            response = requests.put(url, json=commit_data, headers=headers, timeout=30)
            if response.status_code in [200, 201]:
                _logger.info("GitHub push completed for %s", module_name)
                return True
            else:
                _logger.error("GitHub push failed for %s: %s", module_name, response.text)
                return False

        except Exception as e:
            _logger.error("GitHub push failed for %s: %s", module_name, str(e))
            return False
        finally:
            # Clear headers to avoid keeping sensitive token data in memory
            headers.clear()

    def _pot_content_equal(self, content1, content2):
        """Compare POT files ignoring timestamp changes"""

        def normalize_pot_content(content):
            """Remove timestamp lines and normalize content for comparison"""
            lines = content.strip().split("\n")
            normalized_lines = []
            for line in lines:
                # Skip POT-Creation-Date and PO-Revision-Date lines
                if line.startswith('"POT-Creation-Date:') or line.startswith('"PO-Revision-Date:'):
                    continue
                normalized_lines.append(line)
            return "\n".join(normalized_lines)

        normalized1 = normalize_pot_content(content1)
        normalized2 = normalize_pot_content(content2)
        return normalized1 == normalized2