The current re-review batch workflow is static, with each entity generally assigned to a single batch (e.g., 0, 1, 2...). We want to extend this so an entity can be re-assigned to multiple re-review batches over time, while still maintaining a stable system that avoids conflicts or data duplication.
We already have two tables—re_review_entity_connect and re_review_assignment—that define entity-to-batch relationships and user assignments. The goal is to enhance these without breaking older references, so that previously completed batches remain valid, and new re-reviews can happen on the same entity in separate, future batches.
Goals
Incremental & Multi-Batch Support
Allow the same entity to appear in multiple re-review batches, each linked to a different review cycle, date range, or criteria.
Make sure each re-review batch is stable (immutable as a snapshot) once completed, but still let new batches be created for the same entity in subsequent cycles.
Selective Updates
Provide partial re-batching if an entity was not fully approved in a prior batch or if additional reviews are needed later.
Offer a way to skip or filter out entities that are fully approved, unless a new re-review is explicitly triggered.
Extended API
New endpoints (e.g., POST /api/re_review/batch) to create “fresh” batches on-demand from filters or explicit entity lists, including entities that may have been in previous batches.
Possibly enhance or add endpoints to split or clone an existing batch if only some entities need further review.
Backward Compatibility
Keep using existing columns (re_review_batch, _saved, _submitted, _approved) to avoid breaking references.
Add minimal schema changes if needed for clarity (e.g., a batch_type or batch_creation_date), but do not remove or rename existing columns.
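A minimal, purely additive migration along those lines could be sketched as follows (SQLite used for illustration; the existing column set and types are assumed from the names mentioned in this issue):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Existing table shape, assumed from the column names in this issue;
# the real types and any extra columns may differ.
cur.execute("""
    CREATE TABLE re_review_entity_connect (
        re_review_entity_id INTEGER PRIMARY KEY,
        entity_id           INTEGER NOT NULL,
        re_review_batch     INTEGER NOT NULL,
        re_review_saved     INTEGER DEFAULT 0,
        re_review_submitted INTEGER DEFAULT 0,
        re_review_approved  INTEGER DEFAULT 0
    )
""")

# Additive only: nothing is dropped or renamed, so old references stay valid.
cur.execute("ALTER TABLE re_review_entity_connect ADD COLUMN batch_type TEXT")
cur.execute("ALTER TABLE re_review_entity_connect ADD COLUMN batch_creation_date TEXT")

cols = [row[1] for row in cur.execute("PRAGMA table_info(re_review_entity_connect)")]
```

Because the migration only appends columns, existing queries that select the original columns by name continue to work unchanged.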
Managing Multiple Batches Over Time
Each new re-review cycle can create new rows in re_review_entity_connect for the same entity_id but a new re_review_batch number.
Ensure that the UI and DB queries can distinguish old vs. new assignments so that older batches remain archived and accessible.
Proposed Approach
Allow Multiple Records per Entity
re_review_entity_connect does not enforce uniqueness on (entity_id). An entity can appear multiple times, each row corresponding to a different batch.
This means if entity_id = 100 was in batch 2 last year and now needs re-review again, we create a new row with entity_id = 100 and a new re_review_batch (e.g., 7).
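That scenario boils down to a single insert (sketched with SQLite; the table is reduced to the columns relevant here):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE re_review_entity_connect (
        entity_id           INTEGER NOT NULL,
        re_review_batch     INTEGER NOT NULL,
        re_review_approved  INTEGER NOT NULL DEFAULT 0
    )
""")

# Last year's cycle: entity 100 in batch 2, already approved.
cur.execute("INSERT INTO re_review_entity_connect VALUES (100, 2, 1)")

# New cycle: same entity, new batch number, fresh (unapproved) flags.
cur.execute(
    "INSERT INTO re_review_entity_connect (entity_id, re_review_batch) VALUES (100, 7)"
)

rows = cur.execute(
    "SELECT re_review_batch, re_review_approved FROM re_review_entity_connect "
    "WHERE entity_id = 100 ORDER BY re_review_batch"
).fetchall()
```

Both rows coexist: the batch-2 approval is untouched, and batch 7 starts from clean flags.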
Snapshot & Versioning
Once a batch is submitted and approved, it can be considered closed or archived. If a new review is needed later, a new batch is created (with its own row(s)) rather than updating the old one.
This approach preserves historical context (old data remains intact in the older batch) while enabling fresh reviews.
Extended Endpoints
POST /api/re_review/batch: Create a brand-new batch.
If certain entities are already fully approved, skip them unless the request explicitly allows re-reviews.
Otherwise, new rows in re_review_entity_connect are added for each entity needing re-review.
PUT /api/re_review/batch/:batchId: Update or finalize an existing batch.
Entities that remain unreviewed can stay in the batch.
Entities that have reached final approval can be considered complete.
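The skip-unless-forced rule behind POST /api/re_review/batch could be sketched like this (function and parameter names are illustrative, not an existing API; SQLite stands in for the real database):

```python
import sqlite3

def create_batch(conn, new_batch_id, entity_ids, allow_re_review=False):
    """Add one row per entity to a new batch, skipping fully approved
    entities unless the caller explicitly allows re-review."""
    cur = conn.cursor()
    added = []
    for entity_id in entity_ids:
        if not allow_re_review:
            approved = cur.execute(
                "SELECT 1 FROM re_review_entity_connect "
                "WHERE entity_id = ? AND re_review_approved = 1",
                (entity_id,),
            ).fetchone()
            if approved:
                continue  # already fully approved in a past batch; skip
        cur.execute(
            "INSERT INTO re_review_entity_connect (entity_id, re_review_batch) "
            "VALUES (?, ?)",
            (entity_id, new_batch_id),
        )
        added.append(entity_id)
    return added

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE re_review_entity_connect ("
    "entity_id INTEGER, re_review_batch INTEGER, "
    "re_review_approved INTEGER DEFAULT 0)"
)
conn.execute("INSERT INTO re_review_entity_connect VALUES (100, 2, 1)")  # approved

added = create_batch(conn, 7, [100, 101])                     # 100 is skipped
forced = create_batch(conn, 8, [100], allow_re_review=True)   # 100 is re-added
```

The same filter would naturally sit behind the HTTP handler, with allow_re_review mapped to a request flag.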
Avoiding Collisions
If an entity is currently in an open batch (not yet finalized), we can decide whether to allow it to also appear in a new batch. In many workflows, you may want to finish or close an existing batch before creating a new one for the same entity.
Alternatively, allow advanced use-cases where an entity could appear in multiple concurrent batches (e.g., two different teams reviewing different criteria), as long as it’s clear in the UI.
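If the stricter rule is chosen (finish an open batch before starting a new one for the same entity), a pre-insert check is enough. The definition of "open" below — a row that is neither submitted nor approved — is an assumption, not something the current schema enforces:

```python
import sqlite3

def has_open_batch(cur, entity_id):
    """Treat a row that is neither submitted nor approved as an open batch."""
    return cur.execute(
        "SELECT 1 FROM re_review_entity_connect "
        "WHERE entity_id = ? AND re_review_submitted = 0 "
        "AND re_review_approved = 0",
        (entity_id,),
    ).fetchone() is not None

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE re_review_entity_connect ("
    "entity_id INTEGER, re_review_batch INTEGER, "
    "re_review_submitted INTEGER DEFAULT 0, "
    "re_review_approved INTEGER DEFAULT 0)"
)
cur.execute("INSERT INTO re_review_entity_connect VALUES (100, 5, 0, 0)")  # still open
cur.execute("INSERT INTO re_review_entity_connect VALUES (200, 5, 1, 1)")  # closed

open_100 = has_open_batch(cur, 100)
open_200 = has_open_batch(cur, 200)
```

Teams that want the concurrent-batch variant would simply not call this check, or scope it by review criteria instead of entity.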
UI Adjustments
Show each batch as a distinct “review cycle” in the UI.
Indicate whether a batch is closed or open so users know if they should create another one.
Stabilizing Factors for Multiple Batches
Row-Level Separation: Each (entity_id, re_review_batch) row is unique, so the same entity in two different batches has two different rows.
Completion Flags: re_review_submitted and re_review_approved flags apply per row, so one batch’s approval does not automatically affect another batch’s row.
Historical Traceability: Because old rows remain in the DB, you can always see an entity’s past re-review batch(es).
Business Rules: The system or user roles can decide whether to prevent or allow overlapping re-reviews for the same entity. This prevents confusion and ensures consistent data.
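Because every (entity_id, re_review_batch) pair keeps its own row, an entity's full re-review history falls out of one ordered query (SQLite sketch, columns reduced to the flags discussed above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE re_review_entity_connect ("
    "entity_id INTEGER, re_review_batch INTEGER, "
    "re_review_submitted INTEGER, re_review_approved INTEGER)"
)
# Two completed past cycles and one current cycle for entity 100.
cur.executemany(
    "INSERT INTO re_review_entity_connect VALUES (?, ?, ?, ?)",
    [(100, 2, 1, 1), (100, 5, 1, 1), (100, 7, 0, 0)],
)

history = cur.execute(
    "SELECT re_review_batch, re_review_submitted, re_review_approved "
    "FROM re_review_entity_connect WHERE entity_id = ? "
    "ORDER BY re_review_batch",
    (100,),
).fetchall()
```

The same query backs both the UI's "review cycle" list and any audit of why an entity was re-reviewed.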
Acceptance Criteria
An entity can appear in multiple re-review batches over time without overwriting old data.
re_review_entity_connect records remain stable snapshots of the entity’s state for each batch.
Old batches can be archived or left unmodified once completed, while new batches can be spun up for the same entity.
The UI clearly distinguishes old batches from newer ones.
Backward compatibility: older batch references (like “batch 0” or “batch 1”) are still valid.
Additional Context
This approach leverages the existing structure, letting (re_review_entity_id, entity_id, re_review_batch) act as a unique set of references.
No forced uniqueness on entity_id alone means we can have any number of batches for the same entity.
Administrators or curators remain responsible for ensuring that the same entity isn’t re-reviewed unnecessarily unless truly needed.
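If the team ever wants the database to enforce that unique set rather than relying on convention, a unique index over (entity_id, re_review_batch) is one option — a suggestion, not something this issue mandates:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE re_review_entity_connect ("
    "entity_id INTEGER, re_review_batch INTEGER)"
)
# Optional safeguard: reject a duplicate (entity_id, re_review_batch)
# pair at the database level.
cur.execute(
    "CREATE UNIQUE INDEX idx_entity_batch "
    "ON re_review_entity_connect (entity_id, re_review_batch)"
)

cur.execute("INSERT INTO re_review_entity_connect VALUES (100, 2)")
cur.execute("INSERT INTO re_review_entity_connect VALUES (100, 7)")  # new batch: fine

try:
    cur.execute("INSERT INTO re_review_entity_connect VALUES (100, 7)")
    duplicate_rejected = False
except sqlite3.IntegrityError:
    duplicate_rejected = True
```

Note this still allows any number of batches per entity; it only blocks the same entity appearing twice in the same batch.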