diff --git a/doc/release-notes/11085-suppress-breadcrumbs b/doc/release-notes/11085-suppress-breadcrumbs deleted file mode 100644 index fe229d4e338..00000000000 --- a/doc/release-notes/11085-suppress-breadcrumbs +++ /dev/null @@ -1 +0,0 @@ -This release fixes a bug that allowed the viewing of host collections' names when using anonymized preview URLs. diff --git a/doc/release-notes/11206-refresh-landing-page-after-delete.md b/doc/release-notes/11206-refresh-landing-page-after-delete.md deleted file mode 100644 index 9b5b2812b65..00000000000 --- a/doc/release-notes/11206-refresh-landing-page-after-delete.md +++ /dev/null @@ -1,6 +0,0 @@ -## Bug Fix ## - -The JSF UI will no longer display the deleted Dataset/Dataverse. A one-second delay in the UI page redirect gives Solr time to re-index and remove the deleted object. - -See: -- [#11206](https://github.com/IQSS/dataverse/issues/11206) diff --git a/doc/release-notes/11254-croissant-builtin.md b/doc/release-notes/11254-croissant-builtin.md deleted file mode 100644 index 6ee67059650..00000000000 --- a/doc/release-notes/11254-croissant-builtin.md +++ /dev/null @@ -1,13 +0,0 @@ -## Croissant Support Is Now Built In, Slim Version Added - -Croissant is a metadata export format for machine learning datasets that (until this release) was optional and implemented as an external exporter. The code has been merged into the main Dataverse code base, which means the Croissant format is automatically available in your installation of Dataverse, alongside older formats like Dublin Core and DDI. If you were using the external Croissant exporter, the merged code is equivalent to version 0.1.6. Croissant bugs and feature requests should now be filed against the main Dataverse repo (https://github.com/IQSS/dataverse) and the old repo (https://github.com/gdcc/exporter-croissant) should be considered retired.
- -As described in the [Discoverability](https://dataverse-guide--12130.org.readthedocs.build/en/12130/admin/discoverability.html#id6) section of the Admin Guide, Croissant is inserted into the "head" of the HTML of dataset landing pages, as requested by the [Google Dataset Search](https://datasetsearch.research.google.com) team so that their tool can filter by datasets that support Croissant. In previous versions of Dataverse, when Croissant was optional and hadn't been enabled, we used the older "Schema.org JSON-LD" format in the "head". If you'd like to keep this behavior, you can use the feature flag [dataverse.legacy.schemaorg-in-html-head](https://dataverse-guide--12130.org.readthedocs.build/en/12130/installation/config.html#dataverse.legacy.schemaorg-in-html-head). - -Both Croissant and Schema.org JSON-LD formats can become quite large when the dataset has many files or (for Croissant) when the files have many variables. As of this release, the "head" of the HTML contains a "slim" version of Croissant that doesn't contain information about files or variables. The original, full version of Croissant is still available via the "Export Metadata" dropdown. Both "croissant" and "croissantSlim" formats are available via API. - -See also #11254, #12123, #12130, and #12191. - -## New Settings - -- dataverse.legacy.schemaorg-in-html-head diff --git a/doc/release-notes/11473-harvesting-client-improvements.md b/doc/release-notes/11473-harvesting-client-improvements.md deleted file mode 100644 index 8f14c111860..00000000000 --- a/doc/release-notes/11473-harvesting-client-improvements.md +++ /dev/null @@ -1,5 +0,0 @@ -A setting has been added for configuring sleep intervals between OAI calls for specific harvesting clients, making it possible to harvest uninterrupted from servers enforcing rate limit policies. See the configuration guide for details.
Additionally, this release fixes a problem with harvesting from DataCite OAI-PMH where initial, long-running harvests were failing on sets with large numbers of records. - -## New Database Settings - -- :HarvestingClientCallRateLimit diff --git a/doc/release-notes/11606-hide-spa-oidc-providers-from-jsf-login-screen.md b/doc/release-notes/11606-hide-spa-oidc-providers-from-jsf-login-screen.md deleted file mode 100644 index 89151f5c942..00000000000 --- a/doc/release-notes/11606-hide-spa-oidc-providers-from-jsf-login-screen.md +++ /dev/null @@ -1,7 +0,0 @@ -This release fixes a bug where the value of the dataverse.auth.oidc.enabled setting, available when provisioning an authentication provider via JVM options (see https://guides.dataverse.org/en/latest/installation/oidc.html#provision-via-jvm-options), was not being propagated to the current Dataverse user interface (where enabled=false providers are not displayed for login/registration) or represented in the GET api/admin/authenticationProviders API call. - -A new JVM setting ('dataverse.auth.oidc.hidden-jsf') was added to hide an enabled OIDC Provider from the JSF UI. - -For Dataverse instances deploying both the current JSF UI and the new SPA UI, this fix allows the OIDC Keycloak provider configured for the SPA to be hidden in the JSF UI (useful in cases where it would duplicate other configured providers). - -Note: The API to create a new Auth Provider can only be used to create a provider for both JSF and SPA. Use the JVM / MicroProfile config setting to create SPA-only providers. diff --git a/doc/release-notes/11670-notification-of-moved-datasets.md b/doc/release-notes/11670-notification-of-moved-datasets.md deleted file mode 100644 index bbec87a79fd..00000000000 --- a/doc/release-notes/11670-notification-of-moved-datasets.md +++ /dev/null @@ -1,6 +0,0 @@ -## Notifications - -A new notification was added for Datasets moving between Dataverses.
-Requires SettingsServiceBean.Key.SendNotificationOnDatasetMove setting to be enabled. - -See #11670 diff --git a/doc/release-notes/11714-prevent-exclude-email-from-export-for-permitted-user-get-dataset-version.md b/doc/release-notes/11714-prevent-exclude-email-from-export-for-permitted-user-get-dataset-version.md deleted file mode 100644 index 6a87c065573..00000000000 --- a/doc/release-notes/11714-prevent-exclude-email-from-export-for-permitted-user-get-dataset-version.md +++ /dev/null @@ -1,7 +0,0 @@ -New query parameter (ignoreSettingExcludeEmailFromExport) for API /api/datasets/:persistentId/versions/{versionId} - -SPA requires the ability to have the contact emails included in the response for this API call -This query parameter prevents the contact email from being excluded when the setting (ExcludeEmailFromExport) is set to true and the user has EditDataset permissions. - -See: -- [#11714](https://github.com/IQSS/dataverse/issues/11714) diff --git a/doc/release-notes/11740-api-file-download-with-bearer-token.md b/doc/release-notes/11740-api-file-download-with-bearer-token.md deleted file mode 100644 index 651ae4040d6..00000000000 --- a/doc/release-notes/11740-api-file-download-with-bearer-token.md +++ /dev/null @@ -1,7 +0,0 @@ -## Bug / Not Bug in Dataverse. Bug is in SPA Frontend - -Cleaned up Access APIs to localize getting user from session for JSF backward compatibility - -This bug requires a front end fix to send the Bearer Token in the API call. - -See: #11740 diff --git a/doc/release-notes/11747-review-dataset-type.md b/doc/release-notes/11747-review-dataset-type.md deleted file mode 100644 index 7a67b84f5d0..00000000000 --- a/doc/release-notes/11747-review-dataset-type.md +++ /dev/null @@ -1,24 +0,0 @@ -## Highlights - -### Review Datasets - -Dataverse now supports review datasets, a type of dataset that can be used to review resources such as other datasets in the Dataverse installation itself or various resources in external data repositories. 
APIs and a new "review" metadata block (with an "Item Reviewed" field) are in place, but the UI for this feature will only be available in a future version of the new React-based [Dataverse Frontend](https://github.com/IQSS/dataverse-frontend). See also the [guides](https://dataverse-guide--11753.org.readthedocs.build/en/11753/api/native-api.html#add-dataset-type), #11747, #12015, #11887, #12115, and #11753. - -## Other Features Added - -- Citation Style Language (CSL) output now includes "type:software" or "type:review" when those dataset types are used. See the [guides](https://dataverse-guide--11753.org.readthedocs.build/en/11753/api/native-api.html#get-citation-in-other-formats) and #11753. - -## Updated APIs - -- The Change Collection Attributes API now supports `allowedDatasetTypes`. See the [guides](https://dataverse-guide--11753.org.readthedocs.build/en/11753/api/native-api.html#change-collection-attributes), #12115, and #11753. - -## Bugs Fixed - -- 500 error when deleting a dataset type by name. See #11833 and #11753. -- Dataset Type facet worked in JSF but not the SPA. See #11758 and #11753. - -## Backward Incompatible Changes - -### Dataset Types Must Be Allowed, Per-Collection, Before Use - -In previous releases of Dataverse, as soon as additional dataset types were added (such as "software", "workflow", etc.), they could be used by all users when creating datasets (via API only). As of this release, superusers must allow these dataset types to be used on a per-collection basis. See #12115 and #11753.
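For illustration, allowing a dataset type on a collection via the Change Collection Attributes API mentioned above might look like the sketch below. The `allowedDatasetTypes` attribute name comes from the note; the value format, server URL, and token are assumptions, so check the linked guide before use.

```shell
# Sketch: allow the "dataset" and "software" types in a collection (superuser only).
# The value format (comma-separated type names) is an assumption.
export SERVER_URL=https://demo.dataverse.org
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export ALIAS=myCollection

curl -X PUT -H "X-Dataverse-key:$API_TOKEN" \
  "$SERVER_URL/api/dataverses/$ALIAS/attribute/allowedDatasetTypes?value=dataset,software"
```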
diff --git a/doc/release-notes/11773-incorrect-roles-listed-in-assignrole-notifications.md b/doc/release-notes/11773-incorrect-roles-listed-in-assignrole-notifications.md deleted file mode 100644 index d425c8551a5..00000000000 --- a/doc/release-notes/11773-incorrect-roles-listed-in-assignrole-notifications.md +++ /dev/null @@ -1,4 +0,0 @@ -This release changes the text in assign role notifications to list only the role whose assignment generated the specific notification. -The previous implementation listed all the roles associated with the dataset in each notification. - -See also [the guides](https://dataverse-guide--11664.org.readthedocs.build/en/11664/user/account.html#notifications) and #11773. diff --git a/doc/release-notes/11787-signed-url-improvement.md b/doc/release-notes/11787-signed-url-improvement.md deleted file mode 100644 index 6caadc8523a..00000000000 --- a/doc/release-notes/11787-signed-url-improvement.md +++ /dev/null @@ -1 +0,0 @@ -In prior versions of Dataverse, configuring a proxy to forward to Dataverse over an HTTP connection could result in failure of signed URLs (e.g. for external tools). This version of Dataverse supports having a proxy send an X-Forwarded-Proto header set to https to avoid this issue. \ No newline at end of file diff --git a/doc/release-notes/11914-set-template-default-api.md b/doc/release-notes/11914-set-template-default-api.md deleted file mode 100644 index d9cad2edb71..00000000000 --- a/doc/release-notes/11914-set-template-default-api.md +++ /dev/null @@ -1,15 +0,0 @@ -## New Endpoint: POST `/dataverses/{id}/template/default/{templateId}` - -A new endpoint has been implemented to set the default template for a given dataverse collection. - -### Functionality -- Sets the default template of the given dataverse collection. -- You must have edit dataverse permission in the collection in order to use this endpoint.
- -## New Endpoint: DELETE `/dataverses/{id}/template/default` - -A new endpoint has been implemented to remove the default template from a given dataverse collection. - -### Functionality -- Removes the default template of the given dataverse collection. -- You must have edit dataverse permission in the collection in order to use this endpoint. diff --git a/doc/release-notes/11918-template-apis.md b/doc/release-notes/11918-template-apis.md deleted file mode 100644 index b970fd02c70..00000000000 --- a/doc/release-notes/11918-template-apis.md +++ /dev/null @@ -1,17 +0,0 @@ -## New Endpoint: GET `/dataverses/{id}/template` - -A new endpoint has been implemented to manage templates belonging to a given dataverse collection. - -### Functionality -- Returns the template of the given {id} in JSON format. -- You must have add dataset permission in the collection in order to use this endpoint. - -## New Endpoint: DELETE `/dataverses/{id}/template` - -A new endpoint has been implemented to manage templates belonging to a given dataverse collection. - -### Functionality -- Deletes the template of the given {id}. -- You must have Edit Dataverse permission in order to use this endpoint. - - diff --git a/doc/release-notes/11976-fix-file-replace.md b/doc/release-notes/11976-fix-file-replace.md deleted file mode 100644 index ba9d55138a6..00000000000 --- a/doc/release-notes/11976-fix-file-replace.md +++ /dev/null @@ -1,3 +0,0 @@ -A bug introduced in Dataverse 6.8 that caused attempts to replace non-tabular files via the current Dataverse UI to fail has been fixed. (The bug would also cause the replace API to fail if an empty dataFileTags array is sent.)
- -See #11976 diff --git a/doc/release-notes/11983-COAR-Notify2.md b/doc/release-notes/11983-COAR-Notify2.md deleted file mode 100644 index 1c3451a81aa..00000000000 --- a/doc/release-notes/11983-COAR-Notify2.md +++ /dev/null @@ -1,3 +0,0 @@ -### Improved COAR Notify Relationship Announcement Support - -Dataverse no longer sends duplicate [COAR Notify Relationship Announcement Workflow](https://coar-notify.net/catalogue/workflows/repository-relationship-repository/) messages when new dataset versions are published (and the relationship metadata has not been changed). diff --git a/doc/release-notes/12008-dataset-api-locks.md b/doc/release-notes/12008-dataset-api-locks.md deleted file mode 100644 index 4eeba82b7af..00000000000 --- a/doc/release-notes/12008-dataset-api-locks.md +++ /dev/null @@ -1 +0,0 @@ -The API returning information about datasets (`/api/datasets/{id}`) now includes a `locks` field containing a list of the types of all existing locks, e.g. `"locks": ["InReview"]`. diff --git a/doc/release-notes/12009-my-data-api-params.md b/doc/release-notes/12009-my-data-api-params.md deleted file mode 100644 index abff4d5033a..00000000000 --- a/doc/release-notes/12009-my-data-api-params.md +++ /dev/null @@ -1 +0,0 @@ -The My Data API now supports the `metadata_fields`, `sort` and `order`, `show_collections` and `fq` parameters, which enhances its functionality and brings it in line with the search API. \ No newline at end of file diff --git a/doc/release-notes/12065-allow-crud-on-archival-status-for-deaccessioned.md b/doc/release-notes/12065-allow-crud-on-archival-status-for-deaccessioned.md deleted file mode 100644 index fb0ebdf6a1e..00000000000 --- a/doc/release-notes/12065-allow-crud-on-archival-status-for-deaccessioned.md +++ /dev/null @@ -1 +0,0 @@ -This release removes an undocumented restriction on the API calls to get, set, and delete archival status. They did not work on deaccessioned dataset versions and now do. 
(See https://guides.dataverse.org/en/latest/api/native-api.html#get-the-archival-status-of-a-dataset-by-version ) \ No newline at end of file diff --git a/doc/release-notes/12067-require-embargo-reason.md b/doc/release-notes/12067-require-embargo-reason.md deleted file mode 100644 index 094d2270703..00000000000 --- a/doc/release-notes/12067-require-embargo-reason.md +++ /dev/null @@ -1,8 +0,0 @@ -It is now possible to configure Dataverse to require an embargo reason when a user creates an embargo on one or more files. -By default, the embargo reason is optional. - -In addition, with this release, if an embargo reason is supplied, it must not be blank. - -New Feature Flag: - -dataverse.feature.require-embargo-reason - default false diff --git a/doc/release-notes/12082-permission-indexing-improvements.md b/doc/release-notes/12082-permission-indexing-improvements.md deleted file mode 100644 index 38d71d3f28f..00000000000 --- a/doc/release-notes/12082-permission-indexing-improvements.md +++ /dev/null @@ -1,6 +0,0 @@ -Changes in v6.9 that significantly improved re-indexing performance and lowered memory use in situations -such as when a user's role on the root collection was changed, also slowed reindexing of individual -datasets after editing and publication. - -This release restores/improves the individual dataset reindexing performance while retaining the -benefits of the earlier update. diff --git a/doc/release-notes/12094permission-indexing-improvements3.md b/doc/release-notes/12094permission-indexing-improvements3.md deleted file mode 100644 index 3d431fb567d..00000000000 --- a/doc/release-notes/12094permission-indexing-improvements3.md +++ /dev/null @@ -1,2 +0,0 @@ -(assuming the earlier PRs have been merged, there will be a section on indexing improvements already) -This release also avoids creating unused Solr entries for files in drafts of new versions of published datasets (decreasing the Solr db size and thereby improving performance).
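The archival status calls mentioned above (get, set, and delete) follow the pattern in the linked guide; the sketch below is illustrative only (a superuser API token is required, and the JSON body shape may differ in your version):

```shell
# Sketch: get, set, and delete the archival status of version 1.0 of dataset 1234.
export SERVER_URL=https://demo.dataverse.org
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx

curl -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/datasets/1234/1.0/archivalStatus"
curl -X PUT -H "X-Dataverse-key:$API_TOKEN" -H "Content-Type:application/json" \
  -d '{"status":"success","message":"bag available"}' \
  "$SERVER_URL/api/datasets/1234/1.0/archivalStatus"
curl -X DELETE -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/datasets/1234/1.0/archivalStatus"
```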
diff --git a/doc/release-notes/12127-fix-stored-proc-generated-pids.md b/doc/release-notes/12127-fix-stored-proc-generated-pids.md deleted file mode 100644 index 968bd6007b9..00000000000 --- a/doc/release-notes/12127-fix-stored-proc-generated-pids.md +++ /dev/null @@ -1,3 +0,0 @@ -This release fixes a bug which prevents PIDs from being generated when the `identifier-generation-style` is set to `storedProcGenerated`. - -Previously, this caused a database error ("ERROR: procedure generateidentifierfromstoredprocedure(unknown) does not exist"). \ No newline at end of file diff --git a/doc/release-notes/12158-rdm-integration-new-globus-features.md b/doc/release-notes/12158-rdm-integration-new-globus-features.md deleted file mode 100644 index 55fbed09272..00000000000 --- a/doc/release-notes/12158-rdm-integration-new-globus-features.md +++ /dev/null @@ -1,15 +0,0 @@ -## New Globus Features in rdm-integration 2.0.1 - -[rdm-integration](https://github.com/libis/rdm-integration) is a Dataverse external tool for synchronizing files from various source repositories into Dataverse, with support for background processing, DDI-CDI metadata generation, and high-performance Globus transfers. 
- -Release 2.0.1 brings several new Globus capabilities: - -- **Guest downloads** — public datasets can be downloaded via Globus without a Dataverse account -- **Preview URL support** — reviewers can download draft dataset files via Globus using general preview URLs -- **Scoped institutional login** — `session_required_single_domain` support enables access to institutional Globus endpoints (e.g., HPC clusters); scopes are automatically removed for guest and preview access -- **Real-time transfer progress** — polling-based progress monitoring with percentage display and status updates (ACTIVE/SUCCEEDED/FAILED) -- **Download filtering** — only datasets where the user can download all files are shown, avoiding failed transfers for restricted or embargoed content -- **Hierarchical file tree** — recursive folder selection and color-coded file status - -For full details, see the [README](https://github.com/libis/rdm-integration#readme) and [GLOBUS_INTEGRATION.md](https://github.com/libis/rdm-integration/blob/main/GLOBUS_INTEGRATION.md). - diff --git a/doc/release-notes/12163-REFI-QDA-types b/doc/release-notes/12163-REFI-QDA-types deleted file mode 100644 index 124aea11e32..00000000000 --- a/doc/release-notes/12163-REFI-QDA-types +++ /dev/null @@ -1,5 +0,0 @@ -This release adds support to map *.qdc and *.qdpx files as [REFI-QDA standard](https://www.qdasoftware.org/) Codebook and Project files for qualitative data analysis, which allows them to be used with the -new REFI-QDA Previewers at https://github.com/gdcc/dataverse-previewers. - -Note that, to enable existing qdc and qdpx files to be used with the previewers, their Mimetypes will need to be reassigned (via the API (https://guides.dataverse.org/en/latest/api/native-api.html#redetect-file-type) or in the database) or their existing types used when configuring the previewers.
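As a concrete example, redetecting the mimetype of a single file via the API linked above might look like this (a sketch; the file ID, server URL, and token are placeholders, and `dryRun=true` reports the change without saving it):

```shell
# Sketch: redetect the file type of file 24; set dryRun=false to persist the change.
export SERVER_URL=https://demo.dataverse.org
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx

curl -X POST -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/files/24/redetect?dryRun=true"
```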
- diff --git a/doc/release-notes/12167-ore-bag-archiving-changes.md b/doc/release-notes/12167-ore-bag-archiving-changes.md deleted file mode 100644 index a10dbdce1df..00000000000 --- a/doc/release-notes/12167-ore-bag-archiving-changes.md +++ /dev/null @@ -1,54 +0,0 @@ -## Archiving, OAI-ORE, and BagIt Export - -This release includes multiple updates to the OAI-ORE metadata export and the process of creating archival bags, improving performance, fixing bugs, and adding significant new functionality. - -### General Archiving Improvements -- Multiple performance and scaling improvements have been made for creating archival bags for large datasets, including: - - The duration of archiving tasks triggered from the version table or API are no longer limited by the transaction time limit. - - Temporary storage space requirements have increased by `1/:BagGeneratorThreads` of the zipped bag size. (This is a consequence of changes to avoid timeout errors on larger files/datasets.) - - The size of individual data files and the total dataset size that will be included in an archival bag can now be limited. Admins can choose whether files above these limits are transferred along with, but outside, the zipped bag (creating a complete archival copy) or are just referenced (using the concept of a "holey" bag and just listing the oversized files and the Dataverse URLs from which they can be retrieved in a `fetch.txt` file). In the holey bag case, an active service on the archiving platform must retrieve the oversized files (using appropriate credentials as needed) to make a complete copy. - - Superusers can now see a pending status in the dataset version table while archiving is active. - - Workflows are now triggered outside the transactions related to publication, assuring that workflow locks and status updates are always recorded. 
- - Potential conflicts between archiving/workflows, indexing, and metadata exports after publication have been resolved, avoiding cases where the status/last update times for these actions were not recorded. -- A bug has been fixed where superusers would incorrectly see the "Submit" button to launch archiving from the dataset page version table. -- The local, S3, and Google archivers have been updated to support deleting existing archival files for a version to allow re-creating the bag for a given version. -- For archivers that support file deletion, it is now possible to recreate an archival bag after "Update Current Version" has been used (replacing the original bag). By default, Dataverse will mark the current version's archive as out-of-date, but will not automatically re-archive it. - - A new 'obsolete' status has been added to indicate when an archival bag exists for a version but it was created prior to an "Update Current Version" change. -- Improvements have been made to file retrieval for bagging, including retries on errors and when download requests are being throttled. - - A bug causing `:BagGeneratorThreads` to be ignored has been fixed, and the default has been reduced to 2. -- Retrieval of files for inclusion in an archival bag is no longer counted as a download. -- It is now possible to require that all previous versions have been successfully archived before archiving of a newly published version can succeed. (This is intended to support use cases where deduplication of files between dataset versions will be done and is a step towards supporting the Oxford Common File Layout (OCFL).) -- The pending status has changed to use the same JSON format as other statuses - -### OAI-ORE Export Updates -- The export now uses URIs for checksum algorithms, conforming with JSON-LD requirements. -- A bug causing failures with deaccessioned versions has been fixed. 
This occurred when the deaccession note ("Deaccession Reason" in the UI) was null, which is permissible via the API. -- The `https://schema.org/additionalType` has been updated to "Dataverse OREMap Format v1.0.2" to reflect format changes. - -### Archival Bag (BagIt) Updates -- The `bag-info.txt` file now correctly includes information for dataset contacts, fixing a bug where nothing was included when multiple contacts were defined. (Multiple contacts were always included in the OAI-ORE file in the bag; only the baginfo file was affected). -- Values used in the `bag-info.txt` file that may be multi-line (i.e. with embedded CR or LF characters) are now properly indented and wrapped per the BagIt specification (`Internal-Sender-Identifier`, `External-Description`, `Source-Organization`, `Organization-Address`). -- The dataset name is no longer used as a subdirectory within the `data/` directory to reduce issues with unzipping long paths on some filesystems. -- For dataset versions with no files, the empty `manifest-.txt` file will now use the algorithm from the `:FileFixityChecksumAlgorithm` setting instead of defaulting to MD5. -- A new key, `Dataverse-Bag-Version`, has been added to `bag-info.txt` with the value "1.0" to allow for tracking changes to Dataverse's archival bag generation over time. -- When using the `holey` bag option discussed above, the required `fetch.txt` file will be included. - - -### New Configuration Settings - -This release introduces several new settings to control archival and bagging behavior. - -- `:ArchiveOnlyIfEarlierVersionsAreArchived` (Default: `false`) - When set to `true`, dataset versions must be archived in order. That is, all prior versions of a dataset must be archived before the latest version can be archived. 
- -The following JVM options (MicroProfile Config Settings) control bag size and holey bag support: -- `dataverse.bagit.zip.holey` -- `dataverse.bagit.zip.max-data-size` -- `dataverse.bagit.zip.max-file-size` - -- `dataverse.bagit.archive-on-version-update` (Default: `false`) - Indicates whether archival bag creation should be triggered (if configured) when a version is updated and was already successfully archived, i.e., via the Update-Current-Version publication option. Setting the flag to `true` only works if the archiver being used supports deleting existing archival bags. - - ### Backward Incompatibility - - The name of the archival zipped bag produced by the LocalSubmitToArchiveCommand archiver now has a '.' character before the version number to mirror the name used by other archivers, e.g. the name will be like doi-10-5072-fk2-fosg5q.v1.0.zip rather than doi-10-5072-fk2-fosg5qv1.0.zip \ No newline at end of file diff --git a/doc/release-notes/315-fix-handle-to-citation-processing.md b/doc/release-notes/315-fix-handle-to-citation-processing.md deleted file mode 100644 index cca80d80e84..00000000000 --- a/doc/release-notes/315-fix-handle-to-citation-processing.md +++ /dev/null @@ -1,5 +0,0 @@ -### Bug Fix - -Handles from hdl.handle.net with URLs of `/citation` instead of `/dataset.xhtml` were not properly redirecting. This fix adds a lookup for the alternate PID so the `/citation` endpoint will redirect to `/dataset.xhtml`. - - diff --git a/doc/release-notes/359-enhance-publishing-message-acknowledgement.md b/doc/release-notes/359-enhance-publishing-message-acknowledgement.md deleted file mode 100644 index 69194182cb0..00000000000 --- a/doc/release-notes/359-enhance-publishing-message-acknowledgement.md +++ /dev/null @@ -1,21 +0,0 @@ -## Publishing Enhancement ## - -Before a Dataset can be published, the user must acknowledge acceptance of the disclaimer if it is required.
- -The setting "PublishDatasetDisclaimerText", when set, will prevent a draft dataset from being published without the user acknowledging the disclaimer. -The approved disclaimer text is `"By publishing this dataset, I fully accept all legal responsibility for ensuring that the deposited content is: anonymized, free of copyright violations, and contains data that is computationally reusable. I understand and agree that any violation of these conditions may result in the immediate removal of the dataset by the repository without prior notice."` - -To enable/disable the acknowledgement requirement an Admin can set/delete the setting using the following APIs: - -`curl -X PUT -d "By publishing this dataset, I fully accept all legal responsibility for ensuring that the deposited content is: anonymized, free of copyright violations, and contains data that is computationally reusable. I understand and agree that any violation of these conditions may result in the immediate removal of the dataset by the repository without prior notice." http://localhost:8080/api/admin/settings/:PublishDatasetDisclaimerText` - -`curl -X DELETE http://localhost:8080/api/admin/settings/:PublishDatasetDisclaimerText` - -The UI will prevent the user from publishing a Dataset unless the disclaimer is acknowledged. - -The APIs will continue to publish without the acknowledgement for now. An Info API getter was added for non-superusers to get the disclaimer text. 
- -`curl -X GET http://localhost:8080/api/info/settings/:PublishDatasetDisclaimerText` - -See: -- [#359](https://github.com/IQSS/dataverse.harvard.edu/issues/359) diff --git a/doc/release-notes/6.10-release-notes.md b/doc/release-notes/6.10-release-notes.md new file mode 100644 index 00000000000..594a035cb58 --- /dev/null +++ b/doc/release-notes/6.10-release-notes.md @@ -0,0 +1,256 @@ +# Dataverse 6.10 + +Please note: To read these instructions in full, please go to https://github.com/IQSS/dataverse/releases/tag/v6.10 rather than the [list of releases](https://github.com/IQSS/dataverse/releases), which will cut them off. + +This release brings new features, enhancements, and bug fixes to Dataverse. Thank you to all of the community members who contributed code, suggestions, bug reports, and other assistance across the project! + +## Release Highlights + +Highlights for Dataverse 6.10 include: + +- Optionally require acknowledgment of a disclaimer when publishing +- Optionally require embargo reason +- Harvesting improvements +- Croissant support now built in +- Archiving, OAI-ORE, and BagIt export improvements +- Support for REFI-QDA Codebook and Project files +- Review datasets +- New and improved APIs +- Bug fixes + +### Optionally Require Acknowledgment of a Disclaimer When Publishing + +When users click "Publish" on a dataset they have always seen a popup displaying various information to read before clicking "Continue" to proceed with publication. + +Now you can optionally require users to check a box in this popup to acknowledge a disclaimer that you specify through a new setting called `:PublishDatasetDisclaimerText`. + +For backward compatibility, APIs will continue to publish without the acknowledgement for now. An [API endpoint](https://guides.dataverse.org/en/6.10/api/native-api.html#show-disclaimer-for-publishing-datasets) was added for anyone to retrieve the disclaimer text anonymously. 
See [the guides](https://guides.dataverse.org/en/6.10/installation/config.html#publishdatasetdisclaimertext) and #12051. + +### Optionally Require Embargo Reason + +It is now possible to configure Dataverse to require an embargo reason when a user creates an embargo on one or more files. By default, the embargo reason is optional. `dataverse.feature.require-embargo-reason` can be set to true to enable this feature. + +In addition, with this release, if an embargo reason is supplied, it must not be blank. + +See [the guides](https://guides.dataverse.org/en/6.10/installation/config.html#dataverse-feature-require-embargo-reason), #8692, #11956, #12067. + +### Harvesting Improvements + +A setting has been added for configuring sleep intervals in between OAI-PMH calls on a per-server basis. This can help when some of the servers you want to harvest from have rate limiting policies. You can set a default sleep time and custom sleep times for servers that need more time. + +Additionally, this release fixes a problem with harvesting from DataCite OAI-PMH where initial, long-running harvests were failing on sets with large numbers of records. + +See [:HarvestingClientCallRateLimit](https://guides.dataverse.org/en/6.10/installation/config.html#harvestingclientcallratelimit) in the guides, #11473, and #11486. + +### Croissant Support Is Now Built In, Slim Version Added + +Croissant is a metadata export format for machine learning datasets that (until this release) was optional and implemented as an external exporter. The code has been merged into the main Dataverse code base, which means the Croissant format is automatically available in your installation of Dataverse, alongside older formats like Dublin Core and DDI. If you were using the external Croissant exporter, the merged code is equivalent to version 0.1.6.
Croissant bugs and feature requests should now be filed against the main Dataverse repo (https://github.com/IQSS/dataverse), and the old repo (https://github.com/gdcc/exporter-croissant) should be considered retired. + +As described in the [Discoverability](https://guides.dataverse.org/en/6.10/admin/discoverability.html#id6) section of the Admin Guide, Croissant is inserted into the "head" of the HTML of dataset landing pages, as requested by the [Google Dataset Search](https://datasetsearch.research.google.com) team so that their tool can filter by datasets that support Croissant. In previous versions of Dataverse, when Croissant was optional and hadn't been enabled, we used the older "Schema.org JSON-LD" format in the "head". If you'd like to keep this behavior, you can use the feature flag [dataverse.legacy.schemaorg-in-html-head](https://guides.dataverse.org/en/6.10/installation/config.html#dataverse.legacy.schemaorg-in-html-head). + +Both Croissant and Schema.org JSON-LD formats can become quite large when the dataset has many files or (for Croissant) when the files have many variables. As of this release, the "head" of the HTML contains a "slim" version of Croissant that doesn't contain information about files or variables. The original, full version of Croissant is still available via the "Export Metadata" dropdown. Both "croissant" and "croissantSlim" formats are available via API. + +See also #11254, #12123, #12130, and #12191. + +### Archiving, OAI-ORE, and BagIt Export Improvements + +This release includes multiple updates to the OAI-ORE metadata export and the process of creating archival bags, improving performance, fixing bugs, and adding significant new functionality. See #12144, #12129, #12122, #12104, #12103, #12101, and #12213.
+ +#### General Archiving Improvements + +- Multiple performance and scaling improvements have been made for creating archival bags for large datasets, including: + - The duration of archiving tasks triggered from the version table or API is no longer limited by the transaction time limit. + - Temporary storage space requirements have increased by `1/:BagGeneratorThreads` of the zipped bag size. (Often this is by half because the default value for `:BagGeneratorThreads` is 2.) This is a consequence of changes to avoid timeout errors on larger files/datasets. + - The size of individual data files and the total dataset size that will be included in an archival bag can now be limited. Admins can choose whether files above these limits are transferred along with, but outside, the zipped bag (creating a complete archival copy) or are just referenced (using the concept of a "holey" bag and just listing the oversized files and the Dataverse URLs from which they can be retrieved in a `fetch.txt` file). In the holey bag case, an active service on the archiving platform must retrieve the oversized files (using appropriate credentials as needed) to make a complete copy. + - Superusers can now see a pending status in the dataset version table while archiving is active. + - Workflows are now triggered outside the transactions related to publication, ensuring that workflow locks and status updates are always recorded. + - Potential conflicts between archiving/workflows, indexing, and metadata exports after publication have been resolved, avoiding cases where the status/last update times for these actions were not recorded. +- A bug has been fixed where superusers would incorrectly see the "Submit" button to launch archiving from the dataset page version table. +- The local, S3, and Google archivers have been updated to support deleting existing archival files for a version to allow re-creating the bag for a given version.
+- For archivers that support file deletion, it is now possible to recreate an archival bag after "Update Current Version" has been used (replacing the original bag). By default, Dataverse will mark the current version's archive as out-of-date, but will not automatically re-archive it. + - A new "obsolete" status has been added to indicate when an archival bag exists for a version but it was created prior to an "Update Current Version" change. +- Improvements have been made to file retrieval for bagging, including retries on errors and when download requests are being throttled. + - A bug causing `:BagGeneratorThreads` to be ignored has been fixed, and the default has been reduced to 2. +- Retrieval of files for inclusion in an archival bag is no longer counted as a download. +- It is now possible to require that all previous versions have been successfully archived before archiving of a newly published version can succeed. This is intended to support use cases where de-duplication of files between dataset versions will be done and is a step towards supporting the Oxford Common File Layout (OCFL). +- The pending status has changed to use the same JSON format as other statuses. + +#### OAI-ORE Export Updates + +- The export now uses URIs for checksum algorithms, conforming with JSON-LD requirements. +- A bug causing failures with deaccessioned versions has been fixed. This occurred when the deaccession note ("Deaccession Reason" in the UI) was null, which is permissible via the API. +- The `https://schema.org/additionalType` value has been updated to "Dataverse OREMap Format v1.0.2" to reflect format changes. + +#### Archival Bag (BagIt) Updates + +- The `bag-info.txt` file now correctly includes information for dataset contacts, fixing a bug where nothing was included when multiple contacts were defined. (Multiple contacts were always included in the OAI-ORE file in the bag; only the `bag-info.txt` file was affected).
+- Values used in the `bag-info.txt` file that may be multi-line (i.e. with embedded CR or LF characters) are now properly indented and wrapped per the BagIt specification (`Internal-Sender-Identifier`, `External-Description`, `Source-Organization`, `Organization-Address`). +- The dataset name is no longer used as a subdirectory within the `data/` directory to reduce issues with unzipping long paths on some filesystems. +- For dataset versions with no files, the empty `manifest-.txt` file will now use the algorithm from the `:FileFixityChecksumAlgorithm` setting instead of defaulting to MD5. +- A new key, `Dataverse-Bag-Version`, has been added to `bag-info.txt` with the value "1.0" to allow for tracking changes to Dataverse's archival bag generation over time. +- When using the `holey` bag option discussed above, the required `fetch.txt` file will be included. + +### Support for REFI-QDA Codebook and Project Files + +.qdc and .qdpx files are now detected as [REFI-QDA standard](https://www.qdasoftware.org) Codebook and Project files, respectively, for qualitative data analysis, which allows them to be used with the new REFI-QDA previewers. See https://github.com/gdcc/dataverse-previewers/pull/137 for screenshots. + +To enable existing .qdc and .qdpx files to be used with the previewers, their content type (MIME type) will need to be [redetected](https://guides.dataverse.org/en/6.10/api/native-api.html#redetect-file-type). See #12163. + +### Review Datasets + +Dataverse now supports review datasets, a type of dataset that can be used to review resources such as other datasets in the Dataverse installation itself or various resources in external data repositories. APIs and a new "review" metadata block (with an "Item Reviewed" field) are in place, but the UI for this feature will only be available in a future version of the new React-based [Dataverse Frontend](https://github.com/IQSS/dataverse-frontend) (see [#876](https://github.com/IQSS/dataverse-frontend/pull/876)).
See the [guides](https://guides.dataverse.org/en/6.10/user/dataset-management.html#review-datasets), #11747, #12015, #11887, #12115, and #11753. This feature is experimental. + +## Features Added + +These are features that weren't already mentioned under "highlights" above. + +- A new "DATASETMOVED" notification type was added for when datasets are moved from one collection (dataverse) to another. This requires the `:SendNotificationOnDatasetMove` setting to be enabled. See #11670 and #11805. +- Performance has been improved for the Solr search index. Changes in v6.9 that significantly improved re-indexing performance and lowered memory use (in situations such as when a user's role on the root collection was changed) also slowed reindexing of individual datasets after editing and publication. This release restores/improves the individual dataset reindexing performance while retaining the benefits of the earlier update. This release also avoids creating unused Solr entries for files in drafts of new versions of published datasets (decreasing the Solr database size and thereby improving performance). See #12082, #12093, and #12094. +- In prior versions of Dataverse, configuring a proxy to forward to Dataverse over an HTTP connection could result in failure of signed URLs (e.g. for external tools). This version of Dataverse supports having a proxy send an `X-Forwarded-Proto` header set to HTTPS to avoid this issue. See [the guides](https://guides.dataverse.org/en/6.10/installation/config.html#using-x-forwarded-proto-for-signed-urls) and #11787. +- Citation Style Language (CSL) output now includes "type:software" or "type:review" when those dataset types are used. See the [guides](https://guides.dataverse.org/en/6.10/api/native-api.html#get-citation-in-other-formats) and #11753.
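For the `X-Forwarded-Proto` item above, a minimal sketch of the relevant proxy configuration (nginx is used here as an assumed example; the upstream address is a placeholder for your Dataverse/Payara instance) could look like:

```nginx
# Terminate TLS at the proxy and tell Dataverse the original scheme was HTTPS,
# so signed URLs are generated with the correct protocol.
location / {
    proxy_pass http://localhost:8080;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-Proto https;
}
```

See the linked guides section for the supported configuration details for your specific proxy.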
+ +## External Tool Updates + +### New Globus Features in rdm-integration 2.0.1 + +[rdm-integration](https://github.com/libis/rdm-integration) is a Dataverse external tool for synchronizing files from various source repositories into Dataverse, with support for background processing, DDI-CDI metadata generation, and high-performance Globus transfers. You can find it on the [Integrations](https://guides.dataverse.org/en/6.10/admin/integrations.html#integrations-dashboard) section of the Dataverse Admin Guide. + +Release 2.0.1 brings several new Globus capabilities: + +- **Guest downloads** — public datasets can be downloaded via Globus without a Dataverse account +- **Preview URL support** — reviewers can download draft dataset files via Globus using general preview URLs +- **Scoped institutional login** — `session_required_single_domain` support enables access to institutional Globus endpoints (e.g., HPC clusters); scopes are automatically removed for guest and preview access +- **Real-time transfer progress** — polling-based progress monitoring with percentage display and status updates (ACTIVE/SUCCEEDED/FAILED) +- **Download filtering** — only datasets where the user can download all files are shown, avoiding failed transfers for restricted or embargoed content +- **Hierarchical file tree** — recursive folder selection and color-coded file status + +For full details, see the [README](https://github.com/libis/rdm-integration#readme) and [GLOBUS_INTEGRATION.md](https://github.com/libis/rdm-integration/blob/main/GLOBUS_INTEGRATION.md). + +### New Previewer for REFI-QDA Codebook and Project Files + +See the note above about support for REFI-QDA files. Screenshots of the previewer can be found at https://github.com/gdcc/dataverse-previewers/pull/137 + +## Bugs Fixed + +- Hidden fields on a dataset creation form remained visible and setting a field to "hidden" was not working. See #11992 and #12017. 
+- The names of host collections were visible when using anonymized preview URLs. See #11085 and #12111. +- As of Dataverse 6.8, the "replace file" feature was not working. See #11976, #12107, and #12157. +- A dataset or collection (dataverse) was still visible in browse/search results immediately after deleting it if you didn't refresh the page. See #11206 and #12072. +- The text in "assign role" notifications now only shows the role that was just assigned. Previously, the notification showed all the roles associated with the dataset. See #11773 and #11915. +- Handles from hdl.handle.net with URLs of `/citation` instead of `/dataset.xhtml` were not properly redirecting. This fix adds a lookup for the alternate PID so the `/citation` endpoint will redirect to `/dataset.xhtml`. See #11943. +- Dataverse no longer sends duplicate [COAR Notify Relationship Announcement Workflow](https://coar-notify.net/catalogue/workflows/repository-relationship-repository/) messages when new dataset versions are published (and the relationship metadata has not been changed). See #11983. +- A 500 error occurred when deleting a dataset type by name; this has been fixed. See #11833 and #11753. +- The Dataset Type facet worked in JSF but not in the SPA; this has been fixed. See #11758 and #11753. +- PIDs could not be generated when the `identifier-generation-style` was set to `storedProcGenerated`. See #12126 and #12127. +- It came to our attention that the [Dataverse Uploader GitHub Action](https://guides.dataverse.org/en/6.10/admin/integrations.html#github) was [failing](https://github.com/IQSS/dataverse-uploader/issues/28) with an "unhashable type" error. This has been fixed in a new release, [v1.7](https://github.com/IQSS/dataverse-uploader/releases/tag/v1.7). + +## API Updates + +- The MyData API now supports the `metadata_fields`, `sort`, `order`, `show_collections`, and `fq` parameters, which enhances its functionality and brings it in line with the Search API.
See [the guides](https://guides.dataverse.org/en/6.10/api/native-api.html#mydata) and #12009. +- This release removes an undocumented restriction on the API calls to get, set, and delete [archival status](https://guides.dataverse.org/en/6.10/api/native-api.html#get-the-archival-status-of-a-dataset-by-version). They did not work on deaccessioned dataset versions and now do. See #12065. +- Dataset templates can be listed and deleted for a given collection (dataverse). See [the guides](https://guides.dataverse.org/en/6.10/api/native-api.html#list-single-template-by-its-identifier), #11918, and #11969. The default template can also be set. See [the guides](https://guides.dataverse.org/en/6.10/api/native-api.html#set-a-default-template-for-a-collection), #11914, and #11989. +- Because some clients (such as the [new frontend](https://github.com/IQSS/dataverse-frontend)) need to retrieve contact email addresses along with the rest of the dataset metadata, a new query parameter called `ignoreSettingExcludeEmailFromExport` has been introduced. It requires "EditDataset" permission. See [the guides](https://guides.dataverse.org/en/6.10/api/native-api.html#get-json-representation-of-a-dataset), #11714, and #11819. +- The Change Collection Attributes API now supports `allowedDatasetTypes`. See the [guides](https://guides.dataverse.org/en/6.10/api/native-api.html#change-collection-attributes), #12115, and #11753. +- The API returning information about datasets (`/api/datasets/{id}`) now includes a `locks` field containing a list of the types of all existing locks, e.g. `"locks": ["InReview"]`. See #12008. +- The Access APIs have been cleaned up to localize getting the user from the session for JSF backward compatibility. Fully resolving this bug requires a frontend fix to send the Bearer token in the API call. See #11740 and #11844.
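As an example of the new `locks` field described above (a sketch; the server URL, dataset ID, and API token are placeholders), inspecting a dataset's locks might look like:

```shell
SERVER_URL=http://localhost:8080
ID=42                                          # placeholder dataset database ID
API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx # placeholder API token

# The JSON response now includes a list of lock types, e.g. "locks": ["InReview"]
curl -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/datasets/$ID"
```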
If you are not receiving security notices, please sign up by following [the steps](https://guides.dataverse.org/en/latest/installation/config.html#ongoing-security-of-your-installation) in the guides. + +## Backward Incompatible Changes + +Generally speaking, see the [API Changelog](https://guides.dataverse.org/en/latest/api/changelog.html) for a list of backward-incompatible API changes. + +### Archival Zip Filename Change + +The filename of the archival zipped bag produced by the `LocalSubmitToArchiveCommand` archiver now has a "." character before the "v" (for version number) to mirror the filename used by other archivers. For example, the filename will look like + +`doi-10-5072-fk2-fosg5q.v1.0.zip` + +rather than + +`doi-10-5072-fk2-fosg5qv1.0.zip`. + +### Dataset Types Must Be Allowed, Per-Collection, Before Use + +In previous releases of Dataverse, as soon as additional dataset types were added (such as "software", "workflow", etc.), they could be used by all users when creating datasets (via API only). As of this release, on a per-collection basis, superusers must allow these dataset types to be used. See #12115 and #11753. + +## End-Of-Life (EOL) Announcements + +### PostgreSQL 13 Reached EOL on 13 November 2025 + +We mentioned this in the Dataverse [6.6](https://github.com/IQSS/dataverse/releases/tag/v6.6), [6.8](https://github.com/IQSS/dataverse/releases/tag/v6.8), [6.9](https://github.com/IQSS/dataverse/releases/tag/v6.9) release notes, but as a reminder, according to https://www.postgresql.org/support/versioning/ PostgreSQL 13 reached EOL on 13 November 2025. As stated in the [Installation Guide](https://guides.dataverse.org/en/6.10/installation/prerequisites.html#postgresql), we recommend running PostgreSQL 16 since it is the version we test with in our continuous integration ([since](https://github.com/gdcc/dataverse-ansible/commit/8ebbd84ad2cf3903b8f995f0d34578250f4223ff) February 2025). 
The [Dataverse 5.4 release notes](https://github.com/IQSS/dataverse/releases/tag/v5.4) explained the upgrade process from 9 to 13 (e.g. pg_dumpall, etc.) and the steps will be similar. If you have any problems, please feel free to reach out (see "getting help" in these release notes). + +## Notes for Dataverse Installation Administrators + +### Hiding OIDC Provider from JSF UI + +This release fixes a bug where the value of the [dataverse.auth.oidc.enabled](https://guides.dataverse.org/en/6.10/installation/oidc.html#provision-via-jvm-options) setting (available when provisioning an authentication provider via JVM options) was not being propagated to the current Dataverse user interface (JSF, where `enabled=false` providers are not displayed for login/registration) or represented in the `GET /api/admin/authenticationProviders` API call. + +A new JVM setting (`dataverse.auth.oidc.hidden-jsf`) was added to hide an enabled OIDC provider from the JSF UI. + +For Dataverse instances deploying both the current JSF UI and the [new SPA UI](https://github.com/IQSS/dataverse-frontend), this fix allows the OIDC Keycloak provider configured for the SPA to be hidden in the JSF UI. This is useful in cases where it would duplicate other configured providers. + +Note: The API to create a new auth provider can only be used to create a provider for both JSF and SPA. Use the JVM/MicroProfile config setting to create SPA-only providers. + +See [dataverse.auth.oidc.hidden-jsf](https://guides.dataverse.org/en/6.10/installation/oidc.html#provision-via-jvm-options) in the guides, #11606, and #11922.
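Assuming the JVM-option route (Payara path and service user as assumed elsewhere in these notes; this is a sketch, not the only supported mechanism), the new setting could be enabled with:

```shell
# Option A: set the JVM option via asadmin (restart or redeploy afterwards)
OPTION="-Ddataverse.auth.oidc.hidden-jsf=true"
sudo -u dataverse /usr/local/payara6/bin/asadmin create-jvm-options "$OPTION"

# Option B: the same setting expressed as a MicroProfile Config environment variable
export DATAVERSE_AUTH_OIDC_HIDDEN_JSF=true
```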
+ +## New Settings + +### New JVM Options (MicroProfile Config Settings) + +- dataverse.auth.oidc.hidden-jsf +- dataverse.bagit.archive-on-version-update +- dataverse.bagit.zip.holey +- dataverse.bagit.zip.max-data-size +- dataverse.bagit.zip.max-file-size +- dataverse.feature.require-embargo-reason +- dataverse.legacy.schemaorg-in-html-head + +### New Database Settings + +- :ArchiveOnlyIfEarlierVersionsAreArchived +- :HarvestingClientCallRateLimit +- :PublishDatasetDisclaimerText +- :SendNotificationOnDatasetMove + +## Complete List of Changes + +For the complete list of code changes in this release, see the [6.10 milestone](https://github.com/IQSS/dataverse/issues?q=milestone%3A6.10+is%3Aclosed) in GitHub. + +## Getting Help + +For help with upgrading, installing, or general questions please see [getting help](https://guides.dataverse.org/en/latest/installation/intro.html#getting-help) in the Installation Guide. + +## Installation + +If this is a new installation, please follow our [Installation Guide](https://guides.dataverse.org/en/latest/installation/). Please don't be shy about [asking for help](https://guides.dataverse.org/en/latest/installation/intro.html#getting-help) if you need it! + +Once you are in production, we would be delighted to update our [map of Dataverse installations around the world](https://dataverse.org/installations) to include yours! Please [create an issue](https://github.com/IQSS/dataverse-installations/issues) or email us at support@dataverse.org to join the club! + +You are also very welcome to join the [Global Dataverse Community Consortium](https://www.gdcc.io/) (GDCC). + +## Upgrade Instructions + +Upgrading requires a maintenance window and downtime. Please plan accordingly, create backups of your database, etc. + +Note: These instructions assume that you are upgrading from the immediate previous version. That is to say, you've already upgraded through all the 6.x releases and are now running Dataverse 6.9. 
See [tags on GitHub](https://github.com/IQSS/dataverse/tags) for a list of versions. If you are running an earlier version, the only supported way to upgrade is to progress through the upgrades to all the releases in between before attempting the upgrade to this version. + +If you are running Payara as a non-root user (and you should be!), **remember not to execute the commands below as root**. By default, Payara runs as the `dataverse` user. In the commands below, we use sudo to run the commands as a non-root user. + +Also, we assume that Payara 6 is installed in `/usr/local/payara6`. If not, adjust as needed. + +1. Undeploy Dataverse, using the unprivileged service account ("dataverse", by default). + + `sudo -u dataverse /usr/local/payara6/bin/asadmin list-applications` + + `sudo -u dataverse /usr/local/payara6/bin/asadmin undeploy dataverse-6.9` + +1. Deploy the Dataverse 6.10 war file. + + `wget https://github.com/IQSS/dataverse/releases/download/v6.10/dataverse-6.10.war` + + `sudo -u dataverse /usr/local/payara6/bin/asadmin deploy dataverse-6.10.war` \ No newline at end of file diff --git a/doc/release-notes/dataverse-uploader-github-action-fixed.md b/doc/release-notes/dataverse-uploader-github-action-fixed.md deleted file mode 100644 index 424b7187f42..00000000000 --- a/doc/release-notes/dataverse-uploader-github-action-fixed.md +++ /dev/null @@ -1 +0,0 @@ -It came to our attention that the [Dataverse Uploader GitHub Action](https://guides.dataverse.org/en/6.10/admin/integrations.html#github) was [failing](https://github.com/IQSS/dataverse-uploader/issues/28) with an "unhashable type" error. This has been fixed in a new release, [v1.7](https://github.com/IQSS/dataverse-uploader/releases/tag/v1.7).