177 changes: 120 additions & 57 deletions docs/source/guide/project_settings_lse.md
The labeling interface is the central configuration point for projects.

For information on setting up the labeling interface, see [Labeling configuration](setup).

## Members

Use this section to add and remove project members. You can only configure project members for public projects (meaning the project is not in your Personal Sandbox workspace).

##### Add members

Click **Add Members** to open a window where you can search for and add members to the project. Depending on your organization's permission settings, you may also be able to invite new users to an organization and automatically add them to the project.

!!! info Tip
If you are in the Admin or Owner role, you can bulk assign users to workspaces and projects from the **Organization > Members** page.

##### User roles and membership

| Role | Membership |
| ------------- | ------------ |
| **Admins and Owners** | Cannot be added or removed from a public project. These users have access to all projects, regardless of state, unless the project is in a Personal Sandbox workspace. |
| **Managers** | Must be added to a project or parent workspace to have access, but once added will have access to the project regardless of state. |
| **Annotators and Reviewers** | Must be added to a project or parent workspace to have access, but cannot access projects until they are published. |

##### Inherited members

Managers, Reviewers, and Annotators who are added as members at the [workspace level](workspaces#Add-or-remove-workspace-members) are automatically granted membership to any projects within the workspace.

Admins and Owners have inherited membership because they have access to all public projects within an organization.

##### Annotator roles

The annotator role is the most constrained role in Label Studio, and by default annotators can only access the labeling stream where they see tasks that are ready for labeling.

* If you have [Automatic distribution](#distribute-tasks) enabled, users with the Annotator role are automatically assigned tasks when they are added as members. Similarly, by default, project members with the Reviewer role are able to begin reviewing annotations once the tasks are labeled.
* If you have [Manual distribution](#distribute-tasks) enabled, you need to add users with the Annotator role as project members before you can assign them to tasks. And if you have [**Review only manually assigned tasks**](#reviewing-options) enabled, the users with the Reviewer role must also be project members before they can be assigned to tasks.

##### Project-level roles

For users in the Annotator or Reviewer role (set at the organization level), you can change their role from project to project as needed.

For example, if Heidi Opossum is granted the Annotator role at the organization level, but you would like her to work as a Reviewer in "Project 1", you can set her project-level role to Reviewer.

To assign a project-level role, first add the person to your project. Once added, you can use the drop-down menu to change their role:

![Screenshot](/images/project/member_roles.png)


!!! note
This is only available for users who have the Annotator or Reviewer role applied at the organization level. Users with Manager, Administrator, and Owner role cannot have their permissions downgraded to Annotator or Reviewer on a per-project basis.

## Annotation

Use these settings to configure what options annotators will see and how their labeling tasks are assigned.
Enable **Show before reviewing** to display a pop-up message to reviewers.

<dd>

Use this section to configure:

* How many annotations need to be accepted/rejected for the task to be considered reviewed (move from the **Needs Review** state to the **Done** state).

* How many annotations need to be submitted for a task to enter the **Needs Review** state.

* Whether a reviewer can only see tasks to which they've been assigned.

For more information about states, see [Project states](project_states).
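
As a rough sketch, the two completion rules described in the option table behave like the helper below. This is illustrative Python only, not Label Studio source code; the counters and parameter names are simplifications.

```python
def task_is_done(n_annotations, n_accepted, n_rejected, require_all):
    """Illustrative sketch of when a task moves from Needs Review to Done.

    require_all=False: one accepted annotation is enough.
    require_all=True: every submitted annotation must be accepted or rejected.
    (Not Label Studio source code; counters are simplified.)
    """
    if n_annotations == 0:
        return False  # no submitted annotations yet, nothing to review
    if require_all:
        # all annotations must be processed (accepted or rejected)
        return (n_accepted + n_rejected) >= n_annotations
    # one accepted annotation is enough to consider the task done
    return n_accepted >= 1
```

For example, a task with three annotations and one acceptance is done under the first rule but still needs review under the second.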

!!! note
Moving from **Needs Review** to **Done** influences the following:

* **Review stream**: Tasks that are **Done** are not shown in the review stream (what reviewers see when they click **Review All Tasks**).
* **Data Manager**: The value shown in the **Reviewed** column.
* **Export**: Which tasks are included when you want to only include reviewed tasks in your export snapshot.
* **Dashboards**: Various counts and metrics related to reviews.

<table>
<thead>
</td>
<td>

For eligible tasks (tasks that enter the **Needs Review** state) in which multiple annotators submitted annotations, the reviewer only needs to accept one annotation to consider the task done.

</td>
</tr>
</td>
<td>

For eligible tasks (tasks that enter the **Needs Review** state) in which multiple annotators submitted annotations, the reviewer needs to either accept or reject **all** annotations to consider the task done.

</td>
</tr>
</td>
<td>

If enabled, a reviewer can only see tasks to which they've been assigned.

This also means that a task can progress from the **Annotating** state directly to the **Done** state if this is enabled and no reviewers are assigned.

</td>
</tr>
</td>
<td>

When enabled, a task is considered ready for review only when it has achieved its annotation requirement.

If your project is using [auto distribution](#distribute-tasks), then this means the task has met the [**Annotations per task** threshold](#overlap).

If your project is using [manual distribution](#distribute-tasks), then this means all assigned annotators have submitted an annotation.

Note that in most cases, skipped tasks do not contribute towards meeting the minimum.


</dd>

<dt id="review-sampling">Review Sampling <span class="badge"></span></dt>

<dd>

Determine how many eligible tasks need to be reviewed.

"Eligible" tasks are first defined by whether you have enabled **Show only finished tasks in the review stream**:

* If enabled, then eligible tasks only include those tasks that have been completed by all required annotators.
* If not enabled, as soon as a task has at least one submitted annotation, it is eligible for review.

Eligible tasks enter the **Needs Review** state, which means they are included in the review stream.

You can use the review sampling settings to configure whether tasks can skip the **Needs Review** state and go straight to **Done**.

!!! note
The percentages set under **Review Sampling** are applied as probabilistic sampling rates, not exact quotas.

For example, setting **Basic Sampling** to 50% does not guarantee that exactly half of eligible tasks will be reviewed — the actual share will vary around 50%, with less variance the more tasks your project has.
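
The probabilistic behavior called out in the note can be seen with a quick simulation. This is an illustrative model that assumes each eligible task is sampled independently; it is not Label Studio's implementation.

```python
import random

def sample_for_review(n_tasks, rate, seed=None):
    """Count how many of n_tasks enter Needs Review when each task is
    independently sampled with probability `rate` (illustrative model only)."""
    rng = random.Random(seed)
    return sum(rng.random() < rate for _ in range(n_tasks))

# At a 50% rate the reviewed share varies around one half, with less
# variance as the number of tasks grows:
for n in (20, 200, 2000):
    reviewed = sample_for_review(n, 0.5, seed=7)
    print(f"{n} tasks -> {reviewed} reviewed ({reviewed / n:.0%})")
```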

<table>
<thead>
<tr>
<td>

**Basic Sampling**
</td>
<td>

Configure the percentage of eligible tasks that enter the **Needs Review** state.<br/><br/>For example, if you set this to 80%, then reviewers will see 80% of eligible tasks in the review stream. The remaining 20% of tasks will go straight to the **Done** state and skip the review stream.

</td>
</tr>
<tr>
<td>

**Agreement-based Sampling**
</td>
<td>

Use [agreement scores](stats) to determine which and how many tasks enter the **Needs Review** state vs. moving straight to the **Done** state.
<ul><li><b>Agreement threshold</b>: The agreement score threshold used to partition the <b>High agreement</b>/<b>Low agreement</b> sampling rates.</li>
<li><b>Agreement source</b>: The source of the agreement score used for the agreement threshold. You can select the overall task agreement or the control tag-level agreement score. For more information, see <a href="stats#Overall-vs-per-control-tag-agreement">Overall vs. per-control-tag agreement</a>.</li>
<li><b>Low agreement</b>: The percentage of tasks with agreement scores that fall below the <b>Agreement threshold</b> that you want to enter the <b>Needs Review</b> state.</li>
<li><b>High agreement</b>: The percentage of tasks with agreement scores that fall above the <b>Agreement threshold</b> that you want to enter the <b>Needs Review</b> state.</li></ul>

</td>
</tr>
</table>
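
A minimal sketch of the agreement-based decision described above, assuming each task is sampled independently at the rate for its agreement band. This is illustrative only, not Label Studio source code, and the handling of scores exactly at the threshold is a guess.

```python
import random

def enters_needs_review(agreement, threshold, low_rate, high_rate, rng=random):
    """Decide whether one task enters Needs Review under agreement-based
    sampling. `agreement` is the task's (or control tag's) agreement score
    in [0, 1]; the rates are fractions in [0, 1]. Illustrative sketch only."""
    # pick the sampling rate for this task's agreement band
    rate = low_rate if agreement < threshold else high_rate
    return rng.random() < rate
```

For example, with a threshold of `0.8`, a low-agreement rate of `1.0`, and a high-agreement rate of `0.2`, every low-agreement task enters the review stream while only about a fifth of high-agreement tasks do.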

</dd>

<dt id="task-ordering">Task Ordering</dt>

<dd>

Select the order in which reviewers see tasks in the review stream.

<table>
<thead>
<tr>
<th style="width: 20%;">Field</th>
<th>Description</th>
</tr>
</thead>
<tr>
<td>

**By Task ID**
</td>
<td>

Tasks are ordered by their numeric ID (ascending). Annotation order within a task remains stable.

</td>
</tr>
<tr>
<td>

**Random** <br/>
<span class="badge"></span>
</td>
<td>

Tasks are shown in randomized task order while preserving the stable order of annotations within each task.

</td>
</tr>
</table>

</dd>

Annotators are assigned one at a time until the agreement threshold is achieved.
</dd>
</dl>


## Model

Click **Connect Model** to connect a machine learning (ML) backend to your project. For more information on connecting a model, see [Machine learning integration](ml).
7 changes: 3 additions & 4 deletions docs/source/guide/project_states.md
If your overlap is greater than `1`, the task will sit in the **Annotating** state.

##### Needs Review state

There are several settings that can cause tasks (and by extension, the project) to skip the **Needs Review** state:

* If you have [**Review > Reviewing Options > Review only manually assigned tasks**](project_settings_lse#reviewing-options) enabled, but have not assigned any reviewers.
* If you have your [**Review Sampling**](project_settings_lse#review-sampling) settings configured in a way that allows some or all tasks to bypass the **Needs Review** state.
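
The conditions above can be summarized in a small sketch (illustrative Python, not Label Studio source code; the parameter names are paraphrases of the settings):

```python
def skips_needs_review(review_only_assigned, has_assigned_reviewers,
                       sampled_for_review):
    """Return True if a task goes straight from Annotating to Done.

    review_only_assigned: the "Review only manually assigned tasks" setting
    has_assigned_reviewers: whether any reviewer is assigned to the task
    sampled_for_review: whether Review Sampling selected this task
    (Illustrative sketch only.)
    """
    if review_only_assigned and not has_assigned_reviewers:
        return True  # no reviewer can ever see it in the review stream
    if not sampled_for_review:
        return True  # sampling routed it straight to Done
    return False
```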



##### In Review state
Expand Down Expand Up @@ -160,8 +161,6 @@ For example, if you have 10 tasks:

Your project will be in the **Annotating** state, even though you do not have any tasks in the **Annotating** state.



### API values

If you are using the [SDK](https://api.labelstud.io/api-reference/introduction/getting-started), the API values for each state are as follows.
Binary file modified docs/themes/v2/source/images/project/member_roles.png