@@ -308,7 +362,7 @@ Configure what is required for a task to be considered "reviewed."
|
-In a task where multiple annotators submitted labels, the reviewer only needs to accept one to consider the task reviewed.
+For eligible tasks (tasks that enter the **Needs Review** state) in which multiple annotators submitted annotations, the reviewer only needs to accept one annotation to consider the task done.
|
@@ -319,7 +373,7 @@ In a task where multiple annotators submitted labels, the reviewer only needs to
-In a task where multiple annotators submitted labels, the reviewer needs to accept or reject annotations submitted by all annotators.
+For eligible tasks (tasks that enter the **Needs Review** state) in which multiple annotators submitted annotations, the reviewer must accept or reject **every** annotation to consider the task done.
|
@@ -330,7 +384,9 @@ In a task where multiple annotators submitted labels, the reviewer needs to acce
-If enabled, a reviewer can only see tasks to which they've been assigned. Otherwise, they can view all tasks that are ready for review.
+If enabled, a reviewer can only see tasks to which they've been assigned.
+
+This also means that a task can progress from the **Annotating** state directly to the **Done** state if this is enabled and no reviewers are assigned.
|
@@ -341,11 +397,11 @@ If enabled, a reviewer can only see tasks to which they've been assigned. Otherw
-When enabled, a reviewer only sees tasks that have been completed by all required annotators.
+When enabled, a task is considered ready for review only when it has met its annotation requirement.
-If your project is using auto distribution, then this means a reviewer only sees tasks that have met the **Annotations per task** threshold.
+If your project is using [auto distribution](#distribute-tasks), then this means the task has met the [**Annotations per task** threshold](#overlap).
-If your project is using manual distribution, then this means a reviewer only sees tasks in which all assigned annotators have submitted an annotation.
+If your project is using [manual distribution](#distribute-tasks), then this means all assigned annotators have submitted an annotation.
Note that in most cases, skipped tasks do not contribute towards meeting the minimum.
@@ -355,11 +411,25 @@ Note that in most cases, skipped tasks do not contribute towards meeting the min
-Task Ordering
+Review Sampling
-Choose the order in which reviewers see tasks in the review stream.
+Determine how many eligible tasks need to be reviewed.
+
+"Eligible" tasks are first defined by whether you have enabled **Show only finished tasks in the review stream**:
+
+* If enabled, then eligible tasks only include those tasks that have been completed by all required annotators.
+* If not enabled, as soon as a task has at least one submitted annotation, it is eligible for review.
+
+Eligible tasks enter the **Needs Review** state, which means they are included in the review stream.
+
+You can use the review sampling settings to configure whether tasks can skip the **Needs Review** state and go straight to **Done**.
+
+!!! note
+ The percentages set under **Review Sampling** are applied as probabilistic sampling rates, not exact quotas.
+
+ For example, setting **Basic Sampling** to 50% does not guarantee that exactly half of eligible tasks will be reviewed — the actual share will vary around 50%, with less variance the more tasks your project has.
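+
+For intuition, here is a minimal sketch (illustrative only, not Label Studio's implementation) of how a 50% probabilistic sampling rate behaves at different project sizes:
+
+```python
+import random
+
+def sample_for_review(num_tasks: int, rate: float = 0.5) -> int:
+    """Count how many tasks a probabilistic sampler sends to review."""
+    return sum(1 for _ in range(num_tasks) if random.random() < rate)
+
+random.seed(7)
+for n in (10, 100, 10_000):
+    reviewed = sample_for_review(n)
+    print(f"{n} tasks: {reviewed} reviewed ({reviewed / n:.0%})")
+```
+
+Small projects can land noticeably above or below the configured rate; large projects converge toward it.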
@@ -371,26 +441,26 @@ Choose the order in which reviewers see tasks in the review stream.
|
-**By Task ID**
+**Basic Sampling**
|
-Tasks are ordered by their numeric ID (ascending). Annotation order within a task remains stable.
+Configure the percentage of eligible tasks that enter the **Needs Review** state.
For example, if you set this to 80%, then reviewers will see 80% of eligible tasks in the review stream. The remaining 20% of tasks will go straight to the **Done** state and skip the review stream.
|
|
-**Random**
-
+**Agreement-based Sampling**
|
-Tasks are shown in randomized task order while preserving the stable order of annotations within each task. This mode enables **Task limit (%)** (see below).
-
-!!! note
- If any tasks are selected in the Data Manager or reviewers use Quickview, this limit will not be applied. You can disable the Data Manager for reviewers in the project settings to avoid these situations.
+Use [agreement scores](stats) to determine which tasks (and how many) enter the **Needs Review** state and which move straight to the **Done** state.
+- **Agreement threshold**: The agreement score used to partition tasks between the **Low agreement** and **High agreement** sampling rates.
+- **Agreement source**: The source of the agreement score used for the agreement threshold. You can select the overall task agreement or a control tag-level agreement score. For more information, see Overall vs. per-control-tag agreement.
+- **Low agreement**: The percentage of tasks with agreement scores below the **Agreement threshold** that should enter the **Needs Review** state.
+- **High agreement**: The percentage of tasks with agreement scores above the **Agreement threshold** that should enter the **Needs Review** state.
+
+For a sketch of this partition logic, see the example following the **Task Ordering** table below.
|
@@ -398,23 +468,43 @@ Tasks are shown in randomized task order while preserving the stable order of an
-Task Limit (%)
+Task Ordering
-Limit the portion of project tasks that are available to reviewers when **Task Ordering** is set to **Random**.
+Select the order in which reviewers see tasks in the review stream.
-Set this to a percentage from `0` to `100`.
+
+| Field | Description |
+
+|
-!!! note
- Note the following:
+**By Task ID**
+|
+Tasks are ordered by their numeric ID (ascending). Annotation order within a task remains stable.
+|
+|
+**Random**
+|
- * This only applies only when sampling is **Random**.
- * If you enter a percentage of `≤0` or `≥100`, you will effectively disable limiting.
- * This limit is applied over the eligible task set after filters (for example, **Show only finished tasks**) are applied.
- * If reviewers open the review stream by selecting tasks and then clicking **Label *n* Tasks** from the Data Manager, they will bypass the limit.
+Tasks are shown in randomized task order while preserving the stable order of annotations within each task.
- For example, if a project has 1,000 tasks and the limit is set to 60%, at most ~600 tasks will be served for review under Random sampling. When the limit is reached, the API returns “no more annotations to review,” and the UI displays **Review finished**.
+|
+
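+
+The following is a minimal sketch of the **Agreement-based Sampling** partition logic described above. It is illustrative only (not Label Studio's implementation), and the handling of scores exactly at the threshold is an assumption:
+
+```python
+import random
+
+def enters_needs_review(agreement: float, threshold: float = 0.6,
+                        low_rate: float = 0.9, high_rate: float = 0.2) -> bool:
+    """Decide whether an eligible task enters the Needs Review state.
+
+    Tasks below the threshold are sampled at the Low agreement rate;
+    other tasks are sampled at the High agreement rate (tie handling
+    at the threshold is an assumption).
+    """
+    rate = low_rate if agreement < threshold else high_rate
+    return random.random() < rate
+
+# Low-agreement tasks end up in the review stream far more often.
+random.seed(1)
+tasks = [{"id": i, "agreement": random.random()} for i in range(1000)]
+reviewed = [t for t in tasks if enters_needs_review(t["agreement"])]
+print(f"{len(reviewed)} of {len(tasks)} tasks enter Needs Review")
+```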
@@ -934,33 +1024,6 @@ Annotators are assigned one at a time until the agreement threshold is achieved.
-## Members
-
-Use this page to control which users are project members.
-
-Project members have access to published projects, depending on the permissions associated with their role. For more information, see [User roles and permissions](admin_roles).
-
-Some users cannot be added or removed from the Members page at the project level. These users include administrators, who already have access to every project (outside of the Sandbox). This also includes users who have been added as members to the Workspace. Workspace membership is inherited by the projects within the workspace.
-
-* If you have [Automatic distribution](#distribute-tasks) enabled, users with the Annotator role are automatically assigned tasks when they are added as members. Similarly, by default, project members with the Reviewer role are able to begin reviewing annotations once the tasks are labeled.
-
-* If you have [Manual distribution](#distribute-tasks) enabled, you need to add users with the Annotator role as project members before you can assign them to tasks. And if you have [**Review only manually assigned tasks**](#reviewing-options) enabled, the users with the Reviewer role must also be project members before they can be assigned to tasks.
-
-#### Project-level roles
-
-Project-level roles are Annotator and Reviewer.
-
-Users with these roles have their access constrained to the project level (meaning they cannot view organization-wide information and can only view project data when added to a project and assigned tasks). For more information, see [User roles and permissions](admin_roles).
-
-For Annotators and Reviewers, you can change their default role on a per-project basis to suit your needs. For example, a user can be assigned as an Annotator to "Project 1" and as a Reviewer to "Project 2."
-
-To assign a project-level role, first add the person to your project. Once added, you can use the drop-down menu to change their role:
-
-
-
-!!! note
- This is only available for users who have the Annotator or Reviewer role applied at the organization level. Users with Manager, Administrator, and Owner role cannot have their permissions downgraded to Annotator or Reviewer on a per-project basis.
-
## Model
Click **Connect Model** to connect a machine learning (ML) backend to your project. For more information on connecting a model, see [Machine learning integration](ml).
diff --git a/docs/source/guide/project_states.md b/docs/source/guide/project_states.md
index a95c5f0efc46..1424c87a983a 100644
--- a/docs/source/guide/project_states.md
+++ b/docs/source/guide/project_states.md
@@ -119,10 +119,11 @@ If your overlap is greater than `1`, the task will sit in the **Annotating** sta
##### Needs Review state
-There are two settings that can cause tasks (and by extension, the project) to skip the **Needs Review** state:
+There are several settings that can cause tasks (and by extension, the project) to skip the **Needs Review** state:
* If you have [**Review > Reviewing Options > Review only manually assigned tasks**](project_settings_lse#reviewing-options) enabled, but have not assigned any reviewers.
-* If you have [**Review > Task Ordering > Random**](project_settings_lse#task-ordering) selected and the **Task limit (%)** set to `0`.
+* If your [**Review Sampling**](project_settings_lse#review-sampling) settings are configured so that some or all tasks bypass the **Needs Review** state.
+
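+As an illustration, here is a hypothetical sketch of this transition check. The field names are assumptions for illustration, not Label Studio's actual data model:
+
+```python
+from dataclasses import dataclass, field
+
+@dataclass
+class TaskReviewContext:
+    """Hypothetical fields; names are assumptions, not Label Studio's API."""
+    review_only_manual_assignments: bool
+    assigned_reviewers: list = field(default_factory=list)
+    sampled_for_review: bool = True
+
+def skips_needs_review(ctx: TaskReviewContext) -> bool:
+    """Return True when a finished task moves straight to Done."""
+    # Reviews are limited to manual assignments, but no reviewer is assigned.
+    if ctx.review_only_manual_assignments and not ctx.assigned_reviewers:
+        return True
+    # Review Sampling excluded this task from the review stream.
+    if not ctx.sampled_for_review:
+        return True
+    return False
+
+print(skips_needs_review(TaskReviewContext(True)))   # True: skips review
+print(skips_needs_review(TaskReviewContext(False)))  # False: enters review
+```
+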
##### In Review state
@@ -160,8 +161,6 @@ For example, if you have 10 tasks:
Your project will be in the **Annotating** state, even though you do not have any tasks in the **Annotating** state.
-
-
### API values
If you are using the [SDK](https://api.labelstud.io/api-reference/introduction/getting-started), the API values for each state are as follows.
diff --git a/docs/themes/v2/source/images/project/member_roles.png b/docs/themes/v2/source/images/project/member_roles.png
index 848a0f6cac20..5d519608ea96 100644
Binary files a/docs/themes/v2/source/images/project/member_roles.png and b/docs/themes/v2/source/images/project/member_roles.png differ