diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
index 6f4b2fd52..92bc5aec1 100644
--- a/.github/workflows/test.yml
+++ b/.github/workflows/test.yml
@@ -42,7 +42,7 @@ jobs:
makepot: "true"
services:
postgres:
- image: postgres:12.0
+ image: postgis/postgis:13-3.4
env:
POSTGRES_USER: odoo
POSTGRES_PASSWORD: odoo
diff --git a/base_geoengine/README.rst b/base_geoengine/README.rst
index a019e3442..170657c6c 100644
--- a/base_geoengine/README.rst
+++ b/base_geoengine/README.rst
@@ -17,13 +17,13 @@ Geospatial support for Odoo
:target: http://www.gnu.org/licenses/agpl-3.0-standalone.html
:alt: License: AGPL-3
.. |badge3| image:: https://img.shields.io/badge/github-OCA%2Fgeospatial-lightgray.png?logo=github
- :target: https://github.com/OCA/geospatial/tree/17.0/base_geoengine
+ :target: https://github.com/OCA/geospatial/tree/18.0/base_geoengine
:alt: OCA/geospatial
.. |badge4| image:: https://img.shields.io/badge/weblate-Translate%20me-F47D42.png
- :target: https://translation.odoo-community.org/projects/geospatial-17-0/geospatial-17-0-base_geoengine
+ :target: https://translation.odoo-community.org/projects/geospatial-18-0/geospatial-18-0-base_geoengine
:alt: Translate me on Weblate
.. |badge5| image:: https://img.shields.io/badge/runboat-Try%20me-875A7B.png
- :target: https://runboat.odoo-community.org/builds?repo=OCA/geospatial&target_branch=17.0
+ :target: https://runboat.odoo-community.org/builds?repo=OCA/geospatial&target_branch=18.0
:alt: Try me on Runboat
|badge1| |badge2| |badge3| |badge4| |badge5|
@@ -31,10 +31,10 @@ Geospatial support for Odoo
GeoEngine is an Odoo module that adds spatial/GIS capabilities to Odoo.
It will allow you to:
-- Visualize and query your business information on map
-- Perform GeoBI and spatial query
-- Configure your spatial layers and spatial datasources
-- Extend Odoo models with spatial columns
+- Visualize and query your business information on a map
+- Perform GeoBI and spatial queries
+- Configure your spatial layers and spatial data sources
+- Extend Odoo models with spatial columns
GeoEngine relies on `OpenLayers `__ and
`PostGIS `__ technologies.
@@ -68,8 +68,8 @@ On Ubuntu:
The module also requires two additional python libs:
-- `Shapely `__
-- `geojson `__
+- `Shapely `__
+- `geojson `__
When you install the module, these two additional libraries will be
installed.
@@ -174,14 +174,14 @@ Changelog
16.0.1.0.0 (2023-03-20)
-----------------------
-- LayerSwitcher has been removed as it was not really practical. A
- LayerPanel is now active.
-- The geo_search method is now deprecated and replaced by the standard
- odoo search method.
-- The widget "geo_edit_map" attribute is no longer necessary as the
- field is automatically detected by his type. We can also provide an
- option attribute that allows us to pass an opacity and a color as
- parameters.
+- LayerSwitcher has been removed as it was not really practical. A
+  LayerPanel is now active.
+- The geo_search method is now deprecated and replaced by the standard
+  Odoo search method.
+- The widget "geo_edit_map" attribute is no longer necessary as the
+  field is automatically detected by its type. We can also provide an
+  options attribute that allows us to pass an opacity and a color as
+  parameters.
.. code:: xml
@@ -193,15 +193,15 @@ Changelog
-- The method geo_search is now deprecated. We now need to use the
- standard odoo search method.
+- The method geo_search is now deprecated. We now need to use the
+  standard Odoo search method.
.. code:: python
obj.search([("the_point","geo_intersect",{"dummy.zip.the_geom": [("id", "=", rec.id)]})])
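Each geo operator in such a domain is ultimately rendered as either a PostGIS comparison or a spatial predicate. A minimal sketch of that mapping (the operator table mirrors the module's `GEO_OPERATORS`; the `render_geo_leaf` helper is purely illustrative, not the module's actual SQL builder):

```python
# Hypothetical sketch: how one (field, geo_operator, value) leaf could
# map to a PostGIS condition. Only the operator table comes from the
# module; the render helper is invented for illustration.

GEO_SQL = {
    "geo_greater": ">",
    "geo_lesser": "<",
    "geo_equal": "=",
    "geo_touch": "ST_Touches",
    "geo_within": "ST_Within",
    "geo_contains": "ST_Contains",
    "geo_intersect": "ST_Intersects",
}


def render_geo_leaf(table, column, operator, value_sql):
    """Render one geo leaf as a SQL condition string."""
    op = GEO_SQL[operator]
    if op in (">", "<", "="):
        # plain comparison operators
        return f'"{table}"."{column}" {op} {value_sql}'
    # spatial predicates become PostGIS function calls
    return f'{op}("{table}"."{column}", {value_sql})'


print(render_geo_leaf("dummy_zip", "the_geom", "geo_intersect", "subquery"))
# ST_Intersects("dummy_zip"."the_geom", subquery)
```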
-- We can now pass to the geoengine view a template to display the
- information we want to see when clicking on a feature.
+- We can now pass a template to the geoengine view to display the
+  information we want to see when clicking on a feature.
.. code:: xml
@@ -223,8 +223,8 @@ Changelog
-- We can now pass a model to use to a layer to display other
- information on the map.
+- We can now pass a model to a layer to display other information
+  on the map.
.. code:: xml
@@ -241,7 +241,7 @@ Changelog
0.8
-- There is some new features in the LayerPanel.
+- There are some new features in the LayerPanel.
1. If you are logged in as an admin, you have the possibility to edit
the layer by clicking on the edit button. This will open a dialog
@@ -254,25 +254,24 @@ Changelog
the layers by sliding them over each other. If you are logged in as a
user, changes will not be persisted in the database.
-- Widget domain is now implemented for geo field This means that the
- geo-operators are also implemented and that there is the possibility
- to add a sub-domain. If we want to add a domain that includes all the
- records that are displayed in the geoengine view (active_ids). We can
- use the two new operators : "in active_ids" and "not in active_ids".
- These will automatically replace the marker with ids. Note that the
- widget will indicate that the domain is invalid because of the
- marker.
-- Creation of the RecordsPanel. This panel allows you to retrieve all
- active records. You can click on record to get the movement to the
- selected record. Two magnifying glass are also available. You can
- click on the left one to zoom on the record. You can click on the
- right one to get the original zoom.
-- A search bar is also available. It allows you to perform a search
- into the RecordsPanel.
-- A button to open/close the panels is also available.
-- The module has been translated in French.
-- Now you can now make the geoengine view editable. Simply add editable
- attribute in the geoengine view.
+- Widget domain is now implemented for geo fields. This means that the
+  geo-operators are also implemented and that there is the possibility
+  to add a sub-domain. To add a domain that includes all the records
+  displayed in the geoengine view (active_ids), we can use the two new
+  operators: "in active_ids" and "not in active_ids". These markers will
+  automatically be replaced with the ids. Note that the widget will
+  indicate that the domain is invalid because of the marker.
+- Creation of the RecordsPanel. This panel allows you to retrieve all
+  active records. You can click on a record to move the map to the
+  selected record. Two magnifying glasses are also available: click on
+  the left one to zoom in on the record, and on the right one to return
+  to the original zoom.
+- A search bar is also available. It allows you to perform a search
+  within the RecordsPanel.
+- A button to open/close the panels is also available.
+- The module has been translated into French.
+- You can now make the geoengine view editable. Simply add the editable
+  attribute to the geoengine view.
.. code:: xml
@@ -303,7 +302,7 @@ Bug Tracker
Bugs are tracked on `GitHub Issues `_.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
-`feedback `_.
+`feedback `_.
Do not contact contributors directly about support or help with technical issues.
@@ -319,33 +318,33 @@ Authors
Contributors
------------
-- Nicolas Bessi
-- Frederic Junod
-- Yannick Payot
-- Sandy Carter
-- Laurent Mignon
-- Jonathan Nemry
-- David Lasley
-- Daniel Reis
-- Matthieu Dietrich
-- Alan Ramos
-- Damien Crier
-- Cyril Gaudin
-- Pierre Verkest
-- Benjamin Willig
-- Devendra Kavthekar
-- Emanuel Cino
-- Thomas Nowicki
-- Alexandre Saunier
-- Sandip Mangukiya
-- Samuel Kouff
-- `APSL-Nagarro `__:
-
- - Antoni Marroig
- - Miquel Alzanillas
-
-- Red Butay <>
-- Sergio Sancho
+- Nicolas Bessi
+- Frederic Junod
+- Yannick Payot
+- Sandy Carter
+- Laurent Mignon
+- Jonathan Nemry
+- David Lasley
+- Daniel Reis
+- Matthieu Dietrich
+- Alan Ramos
+- Damien Crier
+- Cyril Gaudin
+- Pierre Verkest
+- Benjamin Willig
+- Devendra Kavthekar
+- Emanuel Cino
+- Thomas Nowicki
+- Alexandre Saunier
+- Sandip Mangukiya
+- Samuel Kouff
+- `APSL-Nagarro `__:
+
+ - Antoni Marroig
+ - Miquel Alzanillas
+
+- Red Butay <>
+- Sergio Sancho
Maintainers
-----------
@@ -360,6 +359,6 @@ OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.
-This module is part of the `OCA/geospatial `_ project on GitHub.
+This module is part of the `OCA/geospatial `_ project on GitHub.
You are welcome to contribute. To learn how please visit https://odoo-community.org/page/Contribute.
diff --git a/base_geoengine/__manifest__.py b/base_geoengine/__manifest__.py
index 1679b8d32..1e773fd08 100644
--- a/base_geoengine/__manifest__.py
+++ b/base_geoengine/__manifest__.py
@@ -4,7 +4,7 @@
# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl).
{
"name": "Geospatial support for Odoo",
- "version": "17.0.1.0.0",
+ "version": "18.0.1.0.0",
"category": "GeoBI",
"author": "Camptocamp,ACSONE SA/NV,Odoo Community Association (OCA)",
"license": "AGPL-3",
@@ -28,7 +28,13 @@
"web/static/src/scss/pre_variables.scss",
"web/static/lib/bootstrap/scss/_variables.scss",
("include", "web._assets_bootstrap"),
- ]
+ ],
+ "base_geoengine.assets_jsLibs_geoengine": [
+ "/base_geoengine/static/lib/ol-7.2.2/ol.js",
+ "/base_geoengine/static/lib/chromajs-2.4.2/chroma.js",
+ "/base_geoengine/static/lib/geostats-2.0.0/geostats.js",
+ "/base_geoengine/static/lib/geostats-2.0.0/geostats.css",
+ ],
},
"external_dependencies": {"python": ["shapely", "geojson"]},
"installable": True,
diff --git a/base_geoengine/expressions.py b/base_geoengine/expressions.py
index 1e6113474..85b6c46b0 100644
--- a/base_geoengine/expressions.py
+++ b/base_geoengine/expressions.py
@@ -1,17 +1,48 @@
# Copyright 2023 ACSONE SA/NV
# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl).
+import collections
+import collections.abc
+import json
+import logging
import random
+import reprlib
import string
+import traceback
+from datetime import date, datetime, time
+import pytz
+
+from odoo.models import READ_GROUP_NUMBER_GRANULARITY, check_property_field_value_name
from odoo.osv import expression
-from odoo.osv.expression import TERM_OPERATORS
-from odoo.tools import SQL, Query
+from odoo.osv.expression import (
+ AND,
+ AND_OPERATOR,
+ ANY_IN,
+ FALSE_LEAF,
+ NEGATIVE_TERM_OPERATORS,
+ NOT_OPERATOR,
+ OR,
+ SQL_OPERATORS,
+ TERM_OPERATORS,
+ TERM_OPERATORS_NEGATION,
+ TRUE_LEAF,
+ WILDCARD_OPERATORS,
+ check_leaf,
+ domain_combine_anies,
+ is_operator,
+ normalize_leaf,
+)
+from odoo.tools import SQL, Query, get_lang
+from odoo.tools.sql import (
+ pattern_to_translated_trigram_pattern,
+ value_to_translated_trigram_pattern,
+)
from .fields import GeoField
from .geo_operators import GeoOperator
-original__leaf_to_sql = expression.expression._expression__leaf_to_sql
+_logger = logging.getLogger(__name__)
GEO_OPERATORS = {
"geo_greater": ">",
@@ -27,13 +58,24 @@
term_operators_list.append(op)
expression.TERM_OPERATORS = tuple(term_operators_list)
+TERM_OPERATORS = expression.TERM_OPERATORS
+GEO_SQL_OPERATORS = {
+ "geo_greater": SQL(">"),
+ "geo_lesser": SQL("<"),
+ "geo_equal": SQL("="),
+ "geo_touch": SQL("ST_Touches"),
+ "geo_within": SQL("ST_Within"),
+ "geo_contains": SQL("ST_Contains"),
+ "geo_intersect": SQL("ST_Intersects"),
+}
-def __leaf_to_sql(self, leaf, model, alias):
- """
- This method has been monkey patched in order to be able to include
- geo_operators into the Odoo search method.
- """
+expression.SQL_OPERATORS.update(GEO_SQL_OPERATORS)
+
+
+def __leaf_to_sql(leaf, model, alias): # noqa: C901
+ # This method has been monkey patched in order to be able to include
+ # geo_operators into the Odoo search method.
left, operator, right = leaf
if isinstance(leaf, list | tuple):
current_field = model._fields.get(left)
@@ -57,6 +99,7 @@ def __leaf_to_sql(self, leaf, model, alias):
+ "_"
+ "".join(random.choices(string.ascii_lowercase, k=5))
)
+
rel_query = where_calc(
rel_model,
ref_search[key],
@@ -88,8 +131,35 @@ def __leaf_to_sql(self, leaf, model, alias):
query = get_geo_func(
current_operator, operator, left, right, params, model._table
)
- return SQL(query, *params)
- return original__leaf_to_sql(self, leaf=leaf, model=model, alias=alias)
+
+ for idx, param in enumerate(params):
+ if isinstance(param, str):
+ if "%" in param: # or "POINT" in param:
+ param = param.replace("%", "%%")
+ params[idx] = f"'{param}'"
+ continue
+ try:
+ param = int(param)
+ continue
+ except ValueError as e:
+ _logger.info(e)
+
+ if isinstance(param, tuple):
+ entries = []
+ is_number = False
+ for entry in param:
+ try:
+ int(entry)
+ is_number = True
+ except ValueError as e:
+ _logger.info(e)
+ entries.append(entry)
+ if is_number:
+ params[idx] = f'({",".join(entries)})'
+ else:
+ entries_escaped = '","'.join(entries)
+ params[idx] = f'("{entries_escaped}")'
+ return SQL(query % tuple(params))
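The escaping loop above special-cases strings (doubling `%` and quoting) and tuples (rendered as SQL IN-lists). A simplified, self-contained sketch of the same idea (a stand-in, not the module's exact code; note that this sketch quotes every string, while the loop above only quotes strings containing `%`):

```python
# Simplified stand-in for the parameter-escaping loop above:
# strings get % doubled and quoted, tuples become IN-list literals.

def escape_param(param):
    if isinstance(param, str):
        # double % so it survives a later %-interpolation, then quote
        return "'" + param.replace("%", "%%") + "'"
    if isinstance(param, (tuple, list)):
        if all(isinstance(entry, int) for entry in param):
            return "(" + ",".join(str(entry) for entry in param) + ")"
        # non-numeric entries are quoted individually
        return "(" + ",".join(f'"{entry}"' for entry in param) + ")"
    return str(param)


print(escape_param("50% done"))  # '50%% done'
print(escape_param((1, 2, 3)))   # (1,2,3)
```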
def get_geo_func(current_operator, operator, left, right, params, table):
@@ -134,4 +204,875 @@ def where_calc(model, domain, active_test=True, alias=None):
return query
-expression.expression._expression__leaf_to_sql = __leaf_to_sql
+def parse(self): # noqa: C901
+ """Transform the leaves of the expression
+
+ The principle is to pop elements from a leaf stack one at a time.
+ Each leaf is processed. The processing is a if/elif list of various
+ cases that appear in the leafs (many2one, function fields, ...).
+
+ Three things can happen as a processing result:
+
+ - the leaf is a logic operator, and updates the result stack
+ accordingly;
+ - the leaf has been modified and/or new leafs have to be introduced
+ in the expression; they are pushed into the leaf stack, to be
+ processed right after;
+ - the leaf is converted to SQL and added to the result stack
+
+ Example:
+
+ =================== =================== =====================
+ step stack result_stack
+ =================== =================== =====================
+ ['&', A, B] []
+ substitute B ['&', A, B1] []
+ convert B1 in SQL ['&', A] ["B1"]
+ substitute A ['&', '|', A1, A2] ["B1"]
+ convert A2 in SQL ['&', '|', A1] ["B1", "A2"]
+ convert A1 in SQL ['&', '|'] ["B1", "A2", "A1"]
+ apply operator OR ['&'] ["B1", "A1 or A2"]
+ apply operator AND [] ["(A1 or A2) and B1"]
+ =================== =================== =====================
+
+ Some internal var explanation:
+
+ :var list path: left operand seen as a sequence of field names
+ ("foo.bar" -> ["foo", "bar"])
+ :var obj model: model object, model containing the field
+ (the name provided in the left operand)
+ :var obj field: the field corresponding to `path[0]`
+ :var obj column: the column corresponding to `path[0]`
+ :var obj comodel: relational model of field (field.comodel)
+ (res_partner.bank_ids -> res.partner.bank)
+ """
+
+ def to_ids(value, comodel, leaf):
+ """Normalize a single id or name, or a list of those, into a list of ids
+
+ :param comodel:
+ :param leaf:
+ :param int|str|list|tuple value:
+
+ - if int, long -> return [value]
+ - if basestring, convert it into a list of basestrings, then
+ - if list of basestring ->
+
+ - perform a name_search on comodel for each name
+ - return the list of related ids
+ """
+ names = []
+ if isinstance(value, str):
+ names = [value]
+ elif (
+ value
+ and isinstance(value, tuple | list)
+ and all(isinstance(item, str) for item in value)
+ ):
+ names = value
+ elif isinstance(value, int):
+ if not value:
+ # given this nonsensical domain, it is generally cheaper to
+ # interpret False as [], so that "X child_of False" will
+ # match nothing
+ _logger.warning("Unexpected domain [%s], interpreted as False", leaf)
+ return []
+ return [value]
+ if names:
+ return list(
+ {
+ rid
+ for name in names
+ for rid in comodel._search([("display_name", "ilike", name)])
+ }
+ )
+ return list(value)
+
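A stripped-down model of `to_ids` behavior, with a stub dict standing in for the `display_name` search (the names and ids here are invented):

```python
# Sketch of to_ids normalization: ints pass through, strings trigger a
# name lookup. The lookup is a stub dict, not comodel._search.

NAMES = {"Lausanne": 4, "Geneva": 7}  # hypothetical display_name -> id


def to_ids_sketch(value):
    if isinstance(value, int):
        # False/0 means "match nothing"; any other id passes through
        return [value] if value else []
    if isinstance(value, str):
        value = [value]
    if value and all(isinstance(v, str) for v in value):
        # stand-in for comodel._search([("display_name", "ilike", name)])
        return sorted({NAMES[name] for name in value if name in NAMES})
    return list(value)


print(to_ids_sketch("Lausanne"))  # [4]
```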
+ def child_of_domain(left, ids, left_model, parent=None, prefix=""):
+ """Return a domain implementing the child_of operator for [(left,child_of,ids)],
+ either as a range using the parent_path tree lookup field
+ (when available), or as an expanded [(left,in,child_ids)]"""
+ if not ids:
+ return [FALSE_LEAF]
+ left_model_sudo = left_model.sudo().with_context(active_test=False)
+ if left_model._parent_store:
+ domain = OR(
+ [
+ [("parent_path", "=like", rec.parent_path + "%")]
+ for rec in left_model_sudo.browse(ids)
+ ]
+ )
+ else:
+ # recursively retrieve all children nodes with sudo(); the
+ # filtering of forbidden records is done by the rest of the
+ # domain
+ parent_name = parent or left_model._parent_name
+ if left_model._name != left_model._fields[parent_name].comodel_name:
+ raise ValueError(
+ f"Invalid parent field: {left_model._fields[parent_name]}"
+ )
+ child_ids = set()
+ records = left_model_sudo.browse(ids)
+ while records:
+ child_ids.update(records._ids)
+ records = records.search(
+ [(parent_name, "in", records.ids)], order="id"
+ ) - records.browse(child_ids)
+ domain = [("id", "in", list(child_ids))]
+ if prefix:
+ return [(left, "in", left_model_sudo._search(domain))]
+ return domain
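When the model maintains `parent_path`, the `child_of` case above reduces to prefix matching: every descendant's `parent_path` starts with the ancestor's. A self-contained sketch with invented records:

```python
# Sketch of the parent_path trick behind child_of_domain: the
# `=like rec.parent_path + "%"` condition is just a prefix test.

records = {  # hypothetical id -> parent_path
    1: "1/",
    3: "1/3/",
    7: "1/3/7/",
    9: "9/",
}


def child_of(ancestor_id):
    prefix = records[ancestor_id]
    # a record's own path matches the prefix too, so self is included
    return sorted(rid for rid, path in records.items() if path.startswith(prefix))


print(child_of(3))  # [3, 7]
```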
+
+ def parent_of_domain(left, ids, left_model, parent=None, prefix=""):
+ """Return a domain implementing the parent_of operator
+ for [(left,parent_of,ids)],
+ either as a range using the parent_path tree lookup field
+ (when available), or as an expanded [(left,in,parent_ids)]"""
+ ids = [id for id in ids if id] # ignore (left, 'parent_of', [False])
+ if not ids:
+ return [FALSE_LEAF]
+ left_model_sudo = left_model.sudo().with_context(active_test=False)
+ if left_model._parent_store:
+ parent_ids = [
+ int(label)
+ for rec in left_model_sudo.browse(ids)
+ for label in rec.parent_path.split("/")[:-1]
+ ]
+ domain = [("id", "in", parent_ids)]
+ else:
+ # recursively retrieve all parent nodes with sudo() to avoid
+ # access rights errors; the filtering of forbidden records is
+ # done by the rest of the domain
+ parent_name = parent or left_model._parent_name
+ parent_ids = set()
+ records = left_model_sudo.browse(ids)
+ while records:
+ parent_ids.update(records._ids)
+ records = records[parent_name] - records.browse(parent_ids)
+ domain = [("id", "in", list(parent_ids))]
+ if prefix:
+ return [(left, "in", left_model_sudo._search(domain))]
+ return domain
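The `parent_of` counterpart needs no search at all in the `parent_path` case: a record's ancestors are the ids encoded in its own path, exactly as the list comprehension above splits them out:

```python
# Sketch of the parent_path split used by parent_of_domain:
# "1/3/7/" encodes the chain of ancestors (including the record itself).

def parents_from_path(parent_path):
    # split("/") yields ["1", "3", "7", ""]; drop the trailing empty label
    return [int(label) for label in parent_path.split("/")[:-1]]


print(parents_from_path("1/3/7/"))  # [1, 3, 7]
```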
+
+ HIERARCHY_FUNCS = {"child_of": child_of_domain, "parent_of": parent_of_domain}
+
+ def pop():
+ """Pop a leaf to process."""
+ return stack.pop()
+
+ def push(leaf, model, alias):
+ """Push a leaf to be processed right after."""
+ leaf = normalize_leaf(leaf)
+ check_leaf(leaf)
+ stack.append((leaf, model, alias))
+
+ def pop_result():
+ return result_stack.pop()
+
+ def push_result(sql):
+ result_stack.append(sql)
+
+ # process domain from right to left; stack contains domain leaves, in
+ # the form: (leaf, corresponding model, corresponding table alias)
+ stack = []
+ for leaf in self.expression:
+ push(leaf, self.root_model, self.root_alias)
+
+ # stack of SQL expressions
+ result_stack = []
+
+ while stack:
+ # Get the next leaf to process
+ leaf, model, alias = pop()
+
+ # ----------------------------------------
+ # SIMPLE CASE
+ # 1. leaf is an operator
+ # 2. leaf is a true/false leaf
+ # -> convert and add directly to result
+ # ----------------------------------------
+
+ if is_operator(leaf):
+ if leaf == NOT_OPERATOR:
+ push_result(SQL("(NOT (%s))", pop_result()))
+ elif leaf == AND_OPERATOR:
+ push_result(SQL("(%s AND %s)", pop_result(), pop_result()))
+ else:
+ push_result(SQL("(%s OR %s)", pop_result(), pop_result()))
+ continue
+
+ if leaf == TRUE_LEAF:
+ push_result(SQL("TRUE"))
+ continue
+ if leaf == FALSE_LEAF:
+ push_result(SQL("FALSE"))
+ continue
+
+ # Get working variables
+
+ left, operator, right = leaf
+
+ path = left.split(".", 1)
+
+ field = model._fields[path[0]]
+ if field.type == "many2one":
+ comodel = model.env[field.comodel_name].with_context(active_test=False)
+ elif field.type in ("one2many", "many2many"):
+ comodel = model.env[field.comodel_name].with_context(**field.context)
+
+ if (
+ field.company_dependent
+ and field.index == "btree_not_null"
+ and not isinstance(right, SQL | Query)
+ and not (
+ field.type in ("datetime", "date") and len(path) > 1
+ ) # READ_GROUP_NUMBER_GRANULARITY is not supported
+ and model.env["ir.default"]._evaluate_condition_with_fallback(
+ model._name, leaf
+ )
+ is False
+ ):
+ push("&", model, alias)
+ sql_col_is_not_null = SQL(
+ "%s.%s IS NOT NULL", SQL.identifier(alias), SQL.identifier(field.name)
+ )
+ push_result(sql_col_is_not_null)
+
+ if field.inherited:
+ parent_model = model.env[field.related_field.model_name]
+ parent_fname = model._inherits[parent_model._name]
+ # LEFT JOIN parent_model._table AS parent_alias
+ # ON alias.parent_fname = parent_alias.id
+ parent_alias = self.query.make_alias(alias, parent_fname)
+ self.query.add_join(
+ "LEFT JOIN",
+ parent_alias,
+ parent_model._table,
+ SQL(
+ "%s = %s",
+ model._field_to_sql(alias, parent_fname, self.query),
+ SQL.identifier(parent_alias, "id"),
+ ),
+ )
+ push(leaf, parent_model, parent_alias)
+
+ elif left == "id" and operator in HIERARCHY_FUNCS:
+ ids2 = to_ids(right, model, leaf)
+ dom = HIERARCHY_FUNCS[operator](left, ids2, model)
+ for dom_leaf in dom:
+ push(dom_leaf, model, alias)
+
+ elif field.type == "properties":
+ if len(path) != 2 or "." in path[1]:
+ raise ValueError(f"Wrong path {path}")
+ elif operator not in (
+ "=",
+ "!=",
+ ">",
+ ">=",
+ "<",
+ "<=",
+ "in",
+ "not in",
+ "like",
+ "ilike",
+ "not like",
+ "not ilike",
+ ):
+ raise ValueError(f"Wrong search operator {operator!r}")
+ property_name = path[1]
+ check_property_field_value_name(property_name)
+
+ if (isinstance(right, bool) or right is None) and operator in ("=", "!="):
+ # check for boolean value but also for key existence
+ if right:
+ # inverse the condition
+ right = False
+ operator = "!=" if operator == "=" else "="
+
+ sql_field = model._field_to_sql(alias, field.name, self.query)
+ sql_operator = SQL_OPERATORS[operator]
+ sql_extra = SQL()
+ if operator == "=": # property == False
+ sql_extra = SQL(
+ "OR (%s IS NULL) OR NOT (%s ? %s)",
+ sql_field,
+ sql_field,
+ property_name,
+ )
+
+ push_result(
+ SQL(
+ "((%s -> %s) %s '%s' %s)",
+ sql_field,
+ property_name,
+ sql_operator,
+ right,
+ sql_extra,
+ )
+ )
+
+ else:
+ sql_field = model._field_to_sql(alias, field.name, self.query)
+
+ if operator in ("in", "not in"):
+ sql_not = SQL("NOT") if operator == "not in" else SQL()
+ sql_left = SQL("%s -> %s", sql_field, property_name) # raw value
+ sql_operator = (
+ SQL("<@") if isinstance(right, list | tuple) else SQL("@>")
+ )
+ sql_right = SQL("%s", json.dumps(right))
+ push_result(
+ SQL(
+ "(%s (%s) %s (%s))",
+ sql_not,
+ sql_left,
+ sql_operator,
+ sql_right,
+ )
+ )
+
+ elif isinstance(right, str):
+ if operator in ("ilike", "not ilike"):
+ right = f"%{right}%"
+ unaccent = self._unaccent
+ else:
+ unaccent = lambda x: x # noqa: E731
+ sql_left = SQL(
+ "%s ->> %s", sql_field, property_name
+ ) # JSONified value
+ sql_operator = SQL_OPERATORS[operator]
+ sql_right = SQL("%s", right)
+ push_result(
+ SQL(
+ "((%s) %s (%s))",
+ unaccent(sql_left),
+ sql_operator,
+ unaccent(sql_right),
+ )
+ )
+
+ else:
+ sql_left = SQL("%s -> %s", sql_field, property_name) # raw value
+ sql_operator = SQL_OPERATORS[operator]
+ sql_right = SQL("%s", json.dumps(right))
+ push_result(
+ SQL(
+ "((%s) %s (%s))",
+ sql_left,
+ sql_operator,
+ sql_right,
+ )
+ )
+ elif field.type in ("datetime", "date") and len(path) == 2:
+ if path[1] not in READ_GROUP_NUMBER_GRANULARITY:
+ raise ValueError(
+ f"Error when processing the field {field!r}, "
+ f"the granularity {path[1]} is not supported. "
+ f"Only {', '.join(READ_GROUP_NUMBER_GRANULARITY.keys())}"
+ " are supported"
+ )
+ sql_field = model._field_to_sql(alias, field.name, self.query)
+ if (
+ model._context.get("tz") in pytz.all_timezones_set
+ and field.type == "datetime"
+ ):
+ sql_field = SQL(
+ "timezone(%s, timezone('UTC', %s))", model._context["tz"], sql_field
+ )
+ if path[1] == "day_of_week":
+ first_week_day = int(
+ get_lang(model.env, model._context.get("tz")).week_start
+ )
+ sql = SQL(
+ "mod(7 - %s + date_part(%s, %s)::int, 7) %s %s",
+ first_week_day,
+ READ_GROUP_NUMBER_GRANULARITY[path[1]],
+ sql_field,
+ SQL_OPERATORS[operator],
+ right,
+ )
+ else:
+ sql = SQL(
+ "date_part(%s, %s) %s %s",
+ READ_GROUP_NUMBER_GRANULARITY[path[1]],
+ sql_field,
+ SQL_OPERATORS[operator],
+ right,
+ )
+ push_result(sql)
+
+ # ----------------------------------------
+ # PATH SPOTTED
+ # -> many2one or one2many with _auto_join:
+ # - add a join, then jump into linked column: column.remaining on
+ # src_table is replaced by remaining on dst_table,
+ # and set for re-evaluation
+ # - if a domain is defined on the column, add it into evaluation
+ # on the relational table
+ # -> many2one, many2many, one2many: replace by an equivalent computed
+ # domain, given by recursively searching on the remaining of the path
+ # -> note: hack about columns.property should not be necessary anymore
+ # as after transforming the column, it will go through this loop once again
+ # ----------------------------------------
+
+ elif (
+ operator in ("any", "not any")
+ and field.store
+ and field.type == "many2one"
+ and field.auto_join
+ ):
+ # res_partner.state_id = res_partner__state_id.id
+ coalias = self.query.make_alias(alias, field.name)
+ self.query.add_join(
+ "LEFT JOIN",
+ coalias,
+ comodel._table,
+ SQL(
+ "%s = %s",
+ model._field_to_sql(alias, field.name, self.query),
+ SQL.identifier(coalias, "id"),
+ ),
+ )
+
+ if operator == "not any":
+ right = ["|", ("id", "=", False), "!", *right]
+
+ for leaf in right:
+ push(leaf, comodel, coalias)
+
+ elif (
+ operator in ("any", "not any")
+ and field.store
+ and field.type == "one2many"
+ and field.auto_join
+ ):
+ # use a subquery bypassing access rules and business logic
+ domain = right + field.get_domain_list(model)
+ query = comodel._where_calc(domain)
+ sql = query.subselect(
+ comodel._field_to_sql(comodel._table, field.inverse_name, query),
+ )
+ push(("id", ANY_IN[operator], sql), model, alias)
+
+ elif operator in ("any", "not any") and field.store and field.auto_join:
+ raise NotImplementedError(
+ f"auto_join attribute not supported on field {field}"
+ )
+
+ elif operator in ("any", "not any") and field.type == "many2one":
+ right_ids = comodel._search(right)
+ if operator == "any":
+ push((left, "in", right_ids), model, alias)
+ else:
+ for dom_leaf in ("|", (left, "not in", right_ids), (left, "=", False)):
+ push(dom_leaf, model, alias)
+
+ # Making search easier when there is a left operand as one2many or many2many
+ elif operator in ("any", "not any") and field.type in ("many2many", "one2many"):
+ domain = field.get_domain_list(model)
+ domain = AND([domain, right])
+ right_ids = comodel._search(domain)
+ push((left, ANY_IN[operator], right_ids), model, alias)
+
+ elif not field.store:
+ # Non-stored field should provide an implementation of search.
+ if not field.search:
+ # field does not support search!
+ _logger.error(
+ "Non-stored field %s cannot be searched.", field, exc_info=True
+ )
+ if _logger.isEnabledFor(logging.DEBUG):
+ _logger.debug("".join(traceback.format_stack()))
+ # Ignore it: generate a dummy leaf.
+ domain = []
+ else:
+ # Let the field generate a domain.
+ if len(path) > 1:
+ right = comodel._search([(path[1], operator, right)])
+ operator = "in"
+ domain = field.determine_domain(model, operator, right)
+
+ for elem in domain_combine_anies(domain, model):
+ push(elem, model, alias)
+
+ # -------------------------------------------------
+ # RELATIONAL FIELDS
+ # -------------------------------------------------
+
+ # Applying recursivity on field(one2many)
+ elif field.type == "one2many" and operator in HIERARCHY_FUNCS:
+ ids2 = to_ids(right, comodel, leaf)
+ if field.comodel_name != model._name:
+ dom = HIERARCHY_FUNCS[operator](
+ left, ids2, comodel, prefix=field.comodel_name
+ )
+ else:
+ dom = HIERARCHY_FUNCS[operator]("id", ids2, comodel, parent=left)
+ for dom_leaf in dom:
+ push(dom_leaf, model, alias)
+
+ elif field.type == "one2many":
+ domain = field.get_domain_list(model)
+ inverse_field = comodel._fields[field.inverse_name]
+ inverse_is_int = inverse_field.type in ("integer", "many2one_reference")
+ unwrap_inverse = (
+ (lambda ids: ids) if inverse_is_int else (lambda recs: recs.ids)
+ )
+
+ if right is not False:
+ # determine ids2 in comodel
+ if isinstance(right, str):
+ op2 = (
+ TERM_OPERATORS_NEGATION[operator]
+ if operator in NEGATIVE_TERM_OPERATORS
+ else operator
+ )
+ ids2 = comodel._search(
+ AND([domain or [], [("display_name", op2, right)]])
+ )
+ elif isinstance(right, collections.abc.Iterable):
+ ids2 = right
+ else:
+ ids2 = [right]
+ if inverse_is_int and domain:
+ ids2 = comodel._search([("id", "in", ids2)] + domain)
+
+ if inverse_field.store:
+ # In the condition, one must avoid subqueries to return
+ # NULL values, since it makes the IN test NULL instead
+ # of FALSE. This may discard expected results, as for
+ # instance "id NOT IN (42, NULL)" is never TRUE.
+ sql_in = (
+ SQL("NOT IN")
+ if operator in NEGATIVE_TERM_OPERATORS
+ else SQL("IN")
+ )
+ if not isinstance(ids2, Query):
+ ids2 = comodel.browse(ids2)._as_query(ordered=False)
+ sql_inverse = comodel._field_to_sql(
+ ids2.table, inverse_field.name, ids2
+ )
+ if not inverse_field.required:
+ ids2.add_where(SQL("%s IS NOT NULL", sql_inverse))
+ if (
+ inverse_field.company_dependent
+ and inverse_field.index == "btree_not_null"
+ and not inverse_field.get_company_dependent_fallback(comodel)
+ ):
+ ids2.add_where(
+ SQL(
+ "%s IS NOT NULL",
+ SQL.identifier(ids2.table, inverse_field.name),
+ )
+ )
+ push_result(
+ SQL(
+ "(%s %s %s)",
+ SQL.identifier(alias, "id"),
+ sql_in,
+ ids2.subselect(sql_inverse),
+ )
+ )
+ else:
+ # determine ids1 in model related to ids2
+ recs = (
+ comodel.browse(ids2).sudo().with_context(prefetch_fields=False)
+ )
+ ids1 = unwrap_inverse(recs.mapped(inverse_field.name))
+ # rewrite condition in terms of ids1
+ op1 = "not in" if operator in NEGATIVE_TERM_OPERATORS else "in"
+ push(("id", op1, ids1), model, alias)
+
+ else:
+ if inverse_field.store and not (inverse_is_int and domain):
+ # rewrite condition to match records with/without lines
+ sub_op = "in" if operator in NEGATIVE_TERM_OPERATORS else "not in"
+ comodel_domain = [(inverse_field.name, "!=", False)]
+ query = comodel._where_calc(comodel_domain)
+ sql_inverse = comodel._field_to_sql(
+ query.table, inverse_field.name, query
+ )
+ sql = query.subselect(sql_inverse)
+ push(("id", sub_op, sql), model, alias)
+ else:
+ comodel_domain = [(inverse_field.name, "!=", False)]
+ if inverse_is_int and domain:
+ comodel_domain += domain
+ recs = (
+ comodel.search(comodel_domain, order="id")
+ .sudo()
+ .with_context(prefetch_fields=False)
+ )
+ # determine ids1 = records with lines
+ ids1 = unwrap_inverse(recs.mapped(inverse_field.name))
+ # rewrite condition to match records with/without lines
+ op1 = "in" if operator in NEGATIVE_TERM_OPERATORS else "not in"
+ push(("id", op1, ids1), model, alias)
+
+ elif field.type == "many2many":
+ rel_table, rel_id1, rel_id2 = field.relation, field.column1, field.column2
+
+ if operator in HIERARCHY_FUNCS:
+ # determine ids2 in comodel
+ ids2 = to_ids(right, comodel, leaf)
+ domain = HIERARCHY_FUNCS[operator]("id", ids2, comodel)
+ ids2 = comodel._search(domain)
+ rel_alias = self.query.make_alias(alias, field.name)
+ push_result(
+ SQL(
+ "EXISTS (SELECT 1 FROM %s AS %s WHERE %s = %s AND %s IN %s)",
+ SQL.identifier(rel_table),
+ SQL.identifier(rel_alias),
+ SQL.identifier(rel_alias, rel_id1),
+ SQL.identifier(alias, "id"),
+ SQL.identifier(rel_alias, rel_id2),
+ tuple(ids2) or (None,),
+ )
+ )
+
+ elif right is not False:
+ # determine ids2 in comodel
+ if isinstance(right, str):
+ domain = field.get_domain_list(model)
+ op2 = (
+ TERM_OPERATORS_NEGATION[operator]
+ if operator in NEGATIVE_TERM_OPERATORS
+ else operator
+ )
+ ids2 = comodel._search(
+ AND([domain or [], [("display_name", op2, right)]])
+ )
+ elif isinstance(right, collections.abc.Iterable):
+ ids2 = right
+ else:
+ ids2 = [right]
+
+ if isinstance(ids2, Query):
+ # rewrite condition in terms of ids2
+ sql_ids2 = ids2.subselect()
+ else:
+ # rewrite condition in terms of ids2
+ sql_ids2 = SQL("%s", tuple(it for it in ids2 if it) or (None,))
+
+ if operator in NEGATIVE_TERM_OPERATORS:
+ sql_exists = SQL("NOT EXISTS")
+ else:
+ sql_exists = SQL("EXISTS")
+
+ rel_alias = self.query.make_alias(alias, field.name)
+ push_result(
+ SQL(
+ "%s (SELECT 1 FROM %s AS %s WHERE %s = %s AND %s IN %s)",
+ sql_exists,
+ SQL.identifier(rel_table),
+ SQL.identifier(rel_alias),
+ SQL.identifier(rel_alias, rel_id1),
+ SQL.identifier(alias, "id"),
+ SQL.identifier(rel_alias, rel_id2),
+ sql_ids2,
+ )
+ )
+
+ else:
+ # rewrite condition to match records with/without relations
+ if operator in NEGATIVE_TERM_OPERATORS:
+ sql_exists = SQL("EXISTS")
+ else:
+ sql_exists = SQL("NOT EXISTS")
+ rel_alias = self.query.make_alias(alias, field.name)
+ push_result(
+ SQL(
+ "%s (SELECT 1 FROM %s AS %s WHERE %s = %s)",
+ sql_exists,
+ SQL.identifier(rel_table),
+ SQL.identifier(rel_alias),
+ SQL.identifier(rel_alias, rel_id1),
+ SQL.identifier(alias, "id"),
+ )
+ )
+
+ elif field.type == "many2one":
+ if operator in HIERARCHY_FUNCS:
+ ids2 = to_ids(right, comodel, leaf)
+ if field.comodel_name != model._name:
+ dom = HIERARCHY_FUNCS[operator](
+ left, ids2, comodel, prefix=field.comodel_name
+ )
+ else:
+ dom = HIERARCHY_FUNCS[operator]("id", ids2, comodel, parent=left)
+ for dom_leaf in dom:
+ push(dom_leaf, model, alias)
+
+ elif (
+ isinstance(right, str)
+ or isinstance(right, tuple | list)
+ and right
+ and all(isinstance(item, str) for item in right)
+ ):
+ # resolve string-based m2o criterion into IDs subqueries
+
+                    # Special treatment for ill-formed domains
+ operator = "in" if operator in ("<", ">", "<=", ">=") else operator
+ dict_op = {"not in": "!=", "in": "=", "=": "in", "!=": "not in"}
+ if isinstance(right, tuple):
+ right = list(right)
+ if not isinstance(right, list) and operator in ("not in", "in"):
+ operator = dict_op[operator]
+ elif isinstance(right, list) and operator in (
+ "!=",
+ "=",
+ ): # for domain (FIELD,'=',['value1','value2'])
+ operator = dict_op[operator]
+ if operator in NEGATIVE_TERM_OPERATORS:
+ res_ids = comodel._search(
+ [("display_name", TERM_OPERATORS_NEGATION[operator], right)]
+ )
+ for dom_leaf in (
+ "|",
+ (left, "not in", res_ids),
+ (left, "=", False),
+ ):
+ push(dom_leaf, model, alias)
+ else:
+ res_ids = comodel._search([("display_name", operator, right)])
+ push((left, "in", res_ids), model, alias)
+
+ else:
+ # right == [] or right == False
+ # and all other cases are handled by _condition_to_sql()
+ push_result(
+ model._condition_to_sql(alias, left, operator, right, self.query)
+ )
+
+ # -------------------------------------------------
+ # BINARY FIELDS STORED IN ATTACHMENT
+ # -> check for null only
+ # -------------------------------------------------
+
+ elif field.type == "binary" and field.attachment:
+ if operator in ("=", "!=") and not right:
+ sub_op = "in" if operator in NEGATIVE_TERM_OPERATORS else "not in"
+ sql = SQL(
+ (
+ "(SELECT res_id FROM ir_attachment "
+ "WHERE res_model = %s AND res_field = %s)"
+ ),
+ model._name,
+ left,
+ )
+ push(("id", sub_op, sql), model, alias)
+ else:
+ _logger.error(
+ "Binary field '%s' stored in attachment: ignore %s %s %s",
+ field.string,
+ left,
+ operator,
+ reprlib.repr(right),
+ )
+ push(TRUE_LEAF, model, alias)
+
+ # -------------------------------------------------
+ # OTHER FIELDS
+ # -> datetime fields: manage time part of the datetime
+ # column when it is not there
+ # -> manage translatable fields
+ # -------------------------------------------------
+
+ elif field.type in [
+ "geo_polygon",
+ "geo_multi_polygon",
+ "geo_point",
+ "geo_multi_point",
+ "geo_line",
+ "geo_multi_line",
+ ]:
+ push_result(__leaf_to_sql(leaf, model, alias))
+ else:
+ if field.type == "datetime" and right:
+ if isinstance(right, str) and len(right) == 10:
+ if operator in (">", "<="):
+ right += " 23:59:59"
+ else:
+ right += " 00:00:00"
+ push((left, operator, right), model, alias)
+ elif isinstance(right, date) and not isinstance(right, datetime):
+ if operator in (">", "<="):
+ right = datetime.combine(right, time.max)
+ else:
+ right = datetime.combine(right, time.min)
+ push((left, operator, right), model, alias)
+ else:
+ push_result(
+ model._condition_to_sql(
+ alias, left, operator, right, self.query
+ )
+ )
+
+ elif (
+ field.translate
+ and (isinstance(right, str) or right is False)
+ and left == field.name
+ and self._has_trigram
+ and field.index == "trigram"
+ and operator in ("=", "like", "ilike", "=like", "=ilike")
+ ):
+ right = right or ""
+ sql_operator = SQL_OPERATORS[operator]
+ need_wildcard = operator in WILDCARD_OPERATORS
+
+ if need_wildcard and not right:
+ push_result(
+ SQL("FALSE")
+ if operator in NEGATIVE_TERM_OPERATORS
+ else SQL("TRUE")
+ )
+ continue
+ push_result(
+ model._condition_to_sql(alias, left, operator, right, self.query)
+ )
+
+ if not need_wildcard:
+ right = field.convert_to_column(right, model, validate=False)
+
+ # a prefilter using trigram index to speed up '=', 'like', 'ilike'
+ # '!=', '<=', '<', '>', '>=', 'in', 'not in',
+ # 'not like', 'not ilike' cannot use this trick
+ if operator == "=":
+ _right = value_to_translated_trigram_pattern(right)
+ else:
+ _right = pattern_to_translated_trigram_pattern(right)
+
+ if _right != "%":
+ # combine both generated SQL expressions
+ # (above and below) with an AND
+ push("&", model, alias)
+ sql_column = SQL(
+ "%s.%s", SQL.identifier(alias), SQL.identifier(field.name)
+ )
+ indexed_value = self._unaccent(
+ SQL("jsonb_path_query_array(%s, '$.*')::text", sql_column)
+ )
+ _sql_operator = SQL("LIKE") if operator == "=" else sql_operator
+ push_result(
+ SQL(
+ "%s %s %s",
+ indexed_value,
+ _sql_operator,
+ self._unaccent(SQL("%s", _right)),
+ )
+ )
+ else:
+ push_result(
+ model._condition_to_sql(alias, left, operator, right, self.query)
+ )
+
+ # ----------------------------------------
+ # END OF PARSING FULL DOMAIN
+ # -> put result in self.result and self.query
+ # ----------------------------------------
+ [self.result] = result_stack
+ self.query.add_where(self.result)
+
+
+expression.expression.parse = parse
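The patched `parse` above largely mirrors Odoo core, with the geo field types routed to `__leaf_to_sql`. One easily isolated piece is the datetime handling: a date-only operand is widened to a day boundary that depends on the operator. A minimal standalone sketch of that rule (the function name is ours, not Odoo's):

```python
from datetime import date, datetime, time

def expand_datetime_operand(operator, right):
    """Widen a date-only right operand for comparison with a datetime
    column, mirroring the branch in parse() above: '>' and '<=' bind to
    the end of the day, every other operator to its start."""
    if isinstance(right, str) and len(right) == 10:
        return right + (" 23:59:59" if operator in (">", "<=") else " 00:00:00")
    if isinstance(right, date) and not isinstance(right, datetime):
        bound = time.max if operator in (">", "<=") else time.min
        return datetime.combine(right, bound)
    return right
```

For example, `expand_datetime_operand("<=", "2024-01-31")` yields `"2024-01-31 23:59:59"`, so a `<=` comparison includes the whole day.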
diff --git a/base_geoengine/fields.py b/base_geoengine/fields.py
index f0d8beedb..4943543b7 100644
--- a/base_geoengine/fields.py
+++ b/base_geoengine/fields.py
@@ -43,7 +43,7 @@ def column_type(self):
postgis_geom_type += "ZM"
return ("geometry", f"geometry({postgis_geom_type}, {self.srid})")
- def convert_to_column(self, value, record, values=None):
+ def convert_to_column(self, value, record, values=None, validate=True):
"""Convert value to database format
value can be geojson, wkt, shapely geometry object.
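The added `validate=True` keyword matters because newer core code calls `convert_to_column` with it; for instance, the trigram branch of `parse` above passes `validate=False`. A plain-Python sketch (illustrative classes, not the real Odoo field classes) of why the old signature breaks:

```python
class FieldBase:
    # stand-in for odoo.fields.Field, whose convert_to_column
    # accepts a validate keyword in recent versions
    def convert_to_column(self, value, record, values=None, validate=True):
        return value

class GeoFieldOld(FieldBase):
    def convert_to_column(self, value, record, values=None):  # pre-patch signature
        return value

class GeoFieldNew(FieldBase):
    def convert_to_column(self, value, record, values=None, validate=True):  # patched
        return value

try:
    GeoFieldOld().convert_to_column("POINT (0 0)", None, validate=False)
    old_signature_ok = True
except TypeError:  # unexpected keyword argument 'validate'
    old_signature_ok = False

new_value = GeoFieldNew().convert_to_column("POINT (0 0)", None, validate=False)
```

Without the extra keyword, any caller passing `validate=` raises `TypeError`; the one-line signature change restores compatibility.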
diff --git a/base_geoengine/geo_operators.py b/base_geoengine/geo_operators.py
index 8be01f44e..321cad6cb 100644
--- a/base_geoengine/geo_operators.py
+++ b/base_geoengine/geo_operators.py
@@ -14,7 +14,7 @@ def _get_direct_como_op_sql(self, table, col, value, params, op=""):
else:
base = self.geo_field.entry_to_shape(value, same_type=False)
params.append(base.wkt)
- return f" ST_Area({table}.{col}) {op} ST_Area(ST_GeomFromText(%s))"
+ return f" ST_Area({table}.{col}) {op} ST_Area(ST_GeomFromText('%s'))"
def _get_postgis_comp_sql(self, table, col, value, params, op=""):
"""return raw sql for all search based on St_**(a, b) posgis operator"""
@@ -22,7 +22,7 @@ def _get_postgis_comp_sql(self, table, col, value, params, op=""):
srid = self.geo_field.srid
params.append(base.wkt)
params.append(srid)
- return f"{op}({table}.{col}, ST_GeomFromText(%s, %s))"
+ return f"{op}({table}.{col}, ST_GeomFromText('%s', %s))"
def get_geo_greater_sql(self, table, col, value, params):
"""Returns raw sql for geo_greater operator
@@ -46,7 +46,7 @@ def get_geo_equal_sql(
(used for equality comparison)
"""
base = self.geo_field.entry_to_shape(value, same_type=False)
- compare_to = "ST_GeomFromText(%s)"
+ compare_to = "ST_GeomFromText('%s')"
params.append(base.wkt)
return f" {table}.{col} = {compare_to}"
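The quotes added around the `%s` placeholders only make sense if these fragments are later expanded with Python's `%` operator (or an equivalent literal substitution) rather than passed as bound psycopg2 parameters, since psycopg2 adds the quoting itself. A sketch under that assumption:

```python
# Assuming literal %-formatting of the fragment (not psycopg2 binding):
# the explicit quotes turn the WKT text into a valid SQL string literal.
wkt = "POINT (1 2)"  # what self.geo_field.entry_to_shape(value).wkt returns for a point
srid = 4326
fragment = "ST_GeomFromText('%s', %s)"
sql = fragment % (wkt, srid)
```

With bound parameters the explicit quotes would instead produce doubled quoting, so the calling convention is the deciding factor here.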
diff --git a/base_geoengine/models/ir_view.py b/base_geoengine/models/ir_view.py
index 21c469533..c6eb3948c 100644
--- a/base_geoengine/models/ir_view.py
+++ b/base_geoengine/models/ir_view.py
@@ -33,3 +33,10 @@ def _is_qweb_based_view(self, view_type):
if view_type == "geoengine":
return True
return super()._is_qweb_based_view(view_type)
+
+ def _get_view_info(self):
+ view_info = super()._get_view_info()
+ view_info["geoengine"] = {
+ "icon": "fa fa-globe",
+ }
+ return view_info
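The `_get_view_info` override follows the usual cooperative pattern: call `super()`, extend the returned mapping, return it. Stripped of Odoo, the shape is (illustrative classes, not the real `ir.ui.view` model):

```python
class ViewBase:
    def _get_view_info(self):
        # the core would return entries for form, list, kanban, ...
        return {"form": {"icon": "fa fa-address-card"}}

class ViewGeoengine(ViewBase):
    def _get_view_info(self):
        view_info = super()._get_view_info()
        view_info["geoengine"] = {"icon": "fa fa-globe"}  # as in the diff above
        return view_info

info = ViewGeoengine()._get_view_info()
```

Because the base entries are preserved, every registered view type keeps its icon and the new `geoengine` entry is simply added alongside them.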
diff --git a/base_geoengine/static/description/index.html b/base_geoengine/static/description/index.html
index 54de0b054..60ba95514 100644
--- a/base_geoengine/static/description/index.html
+++ b/base_geoengine/static/description/index.html
@@ -369,7 +369,7 @@
records that are displayed in the geoengine view (active_ids). We can
use the two new operators: “in active_ids” and “not in active_ids”.
These will automatically replace the marker with ids. Note that the
-widget will indicate that the domain is invalid because of the
-marker.
+widget will indicate that the domain is invalid because of the marker.
Creation of the RecordsPanel. This panel allows you to retrieve all
active records. You can click on a record to move the map to the
selected record. Two magnifying glasses are also available. You can
click on the left one to zoom in on the record, and on the right one
to restore the original zoom.
-A search bar is also available. It allows you to perform a search
-into the RecordsPanel.
+A search bar is also available. It allows you to perform a search into
+the RecordsPanel.
A button to open/close the panels is also available.
The module has been translated into French.
You can now make the geoengine view editable. Simply add editable
@@ -656,7 +655,7 @@
Bugs are tracked on GitHub Issues.
In case of trouble, please check there if your issue has already been reported.
If you spotted it first, help us to smash it by providing a detailed and welcomed
-feedback.
OCA, or the Odoo Community Association, is a nonprofit organization whose
mission is to support the collaborative development of Odoo features and
promote its widespread use.
-This module is part of the OCA/geospatial project on GitHub.
+This module is part of the OCA/geospatial project on GitHub.