Create Consensus Tasks
Updated On 21 Apr 2025
Consensus is an important quality control feature that lets you compare annotations made by different users on the same item and generate majority-vote based, high-quality data.
- Purpose: To ensure quality control by comparing annotations from multiple contributors.
- Functionality:
- Facilitates the comparison of annotations made by different contributors on the same item.
- Ensures consistency and reliability by calculating consensus metrics.
- Allows for more items to be annotated within a single task.
- Use Case: Ideal for tasks requiring alignment among multiple annotators working on the same data. The task calculator helps estimate the scope and scale of the task, providing clarity on workload distribution.

How consensus works
When consensus is enabled for a labeling task, it is configured with the percentage of items to cover and the number of assignees. Dataloop automatically creates copies of the items and assigns them randomly to contributors.
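For reference, here is a minimal sketch of the same configuration made programmatically with the dtlpy Python SDK. The consensus-specific parameters (consensus_task_type, consensus_percentage, consensus_assignees) and all project, dataset, and assignee names are assumptions for illustration; verify them against the current SDK reference before use.

```python
# Minimal sketch: creating a labeling task with consensus enabled via the dtlpy SDK.
# The consensus_* parameter names and all placeholder values are assumptions;
# verify them against the current dtlpy SDK reference before use.
import dtlpy as dl

if dl.token_expired():
    dl.login()

project = dl.projects.get(project_name='my-project')        # hypothetical project
dataset = project.datasets.get(dataset_name='my-dataset')   # hypothetical dataset

task = dataset.tasks.create(
    task_name='my-consensus-task',
    assignee_ids=['annotator1@example.com', 'annotator2@example.com'],
    consensus_task_type=dl.entities.ConsensusTaskType.CONSENSUS,  # assumed enum value
    consensus_percentage=100,  # percentage of items to cover with consensus
    consensus_assignees=2      # number of copies (annotators) per consensus item
)
print(task.id)
```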
Key Points:
- Task Browsing: Only the original items are visible when browsing the task.
- Assignment Browsing: Users can view the specific copies assigned to them within their assignments.
- Annotation Merging: Once all copies of an item are assigned a status (e.g., "Completed"), the system merges the annotations back onto the original item. Until then, the original item will not display the annotations from its copies.
- Data Download: When downloading consensus data, the JSON file includes all annotations made by different users, along with their usernames. This allows you to calculate your own scores and determine which annotations are of the highest quality (see the sketch after this list).
- Syncing Datasets with Consensus Task: Cloned external datasets (AWS, GCP, Azure) cannot be synced with a consensus task.
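Following the Data Download point above, here is a minimal, SDK-free sketch of grouping the annotations in one downloaded item JSON by annotator and taking a simple majority vote over labels. The field names used here (annotations, creator, label) are assumptions about the export layout; adjust them to your actual files.

```python
# Minimal sketch: group annotations from a downloaded consensus item JSON by annotator.
# Field names ("annotations", "creator", "label") are assumptions about the export layout.
import json
from collections import Counter, defaultdict

with open('item.json', 'r') as f:        # hypothetical path to one downloaded item JSON
    item = json.load(f)

by_user = defaultdict(list)
for ann in item.get('annotations', []):
    by_user[ann.get('creator', 'unknown')].append(ann)

# Simple majority vote over the labels assigned by each annotator (classification-style check).
label_votes = Counter(ann.get('label') for anns in by_user.values() for ann in anns)

print({user: len(anns) for user, anns in by_user.items()})
print('majority label:', label_votes.most_common(1))
```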
Consensus in the QA workflow
Consensus is integrated into the QA workflow to enhance annotation quality. Reviewers can evaluate items containing annotations from multiple contributors, flag issues, and create note-annotations to trigger corrections.
Workflow Steps:
- Flagging Issues: Reviewers identify errors in annotations and provide feedback through notes or flags.
- Annotator Corrections:
- Annotators correct their work and set the item's status to Completed.
- The original annotations are removed from the master item, and the corrected annotations are copied over to ensure higher-quality work.
- Final Review: Reviewers re-evaluate the corrected annotations and set the item's status to Approved once the item meets quality standards (a sketch of setting these statuses from code follows these steps).
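The status changes in this workflow can also be applied from code. Below is a hedged sketch using the dtlpy SDK's item status update; the identifiers are placeholders, and the method and status names are used as commonly documented for the SDK, so verify them against the current SDK reference.

```python
# Sketch: setting item statuses from code during the consensus QA round-trip.
# Identifiers are placeholders; verify update_status() and the status values
# against the current dtlpy SDK reference.
import dtlpy as dl

project = dl.projects.get(project_name='my-project')       # hypothetical project
dataset = project.datasets.get(dataset_name='my-dataset')  # hypothetical dataset
item = dataset.items.get(item_id='<item-id>')              # hypothetical item id

# Annotator marks the corrected item as completed in the consensus task.
item.update_status(status=dl.ItemStatus.COMPLETED, task_id='<task-id>')

# Reviewer approves the item once it meets quality standards.
item.update_status(status=dl.ItemStatus.APPROVED, task_id='<task-id>')
```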
Consensus score
Quality task scoring is now available for items, annotations, and users. When creating a task with Consensus enabled, a scoring function is activated to evaluate all users, items, and annotations. Detailed scoring insights can be accessed through the new Scores tab on the Analytics page.
Benefits of Consensus Scoring with Application Integration:
- Customizable Scoring: Calculate consensus scores using your preferred methods and thresholds (e.g., IOU).
- Dataset Management: Move or clone items and annotations to a majority-vote dataset for use in training or testing.
- Pipeline Integration: Trigger further processing for annotations with high scores in the pipeline workflow.
To learn more about the Dataloop Scoring Function, see the Scoring and metrics app README file.
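To illustrate the kind of customizable scoring mentioned above, here is a minimal, self-contained sketch of a pairwise IoU-based agreement score between two annotators' bounding boxes on the same item. This is not the Scoring and metrics app's implementation, only one possible way to compute an agreement score with an IoU threshold.

```python
# Sketch: a simple IoU-based agreement score between two annotators' boxes on one item.
# Boxes are (left, top, right, bottom). Illustrative only, not the scoring app's logic.
from typing import List, Tuple

Box = Tuple[float, float, float, float]

def iou(a: Box, b: Box) -> float:
    # Intersection over union of two axis-aligned boxes.
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def agreement_score(boxes_a: List[Box], boxes_b: List[Box], threshold: float = 0.5) -> float:
    # Fraction of annotator A's boxes matched by a box from annotator B above the IoU threshold.
    if not boxes_a:
        return 0.0
    matched = sum(1 for a in boxes_a if any(iou(a, b) >= threshold for b in boxes_b))
    return matched / len(boxes_a)

print(agreement_score([(0, 0, 10, 10)], [(1, 1, 9, 9)]))  # high overlap -> counts as agreement
```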
Redistribute or reassign consensus tasks
Dataloop consensus tasks currently have the following limitations:
- Redistribution Restriction: To ensure consistency in quality evaluation, consensus tasks cannot be redistributed once created.
- Reassignment Rules: Consensus assignments can only be reassigned to users who have never had an assignment in the task. This prevents scenarios where a single annotator works on multiple consensus copies of the same item.
To reassign:
- Open the Labeling > Tasks page.
- Double-click on the consensus task. The Assignments tab is displayed.
- Select the assignment to be reassigned.
- Click Assignment Actions and select Reassign from the list. The Reassign Assignment window of the Edit Task page is displayed.
- Select a new user from the list and click Apply Reassignment.
Create a consensus task
A consensus task is a labeling task in which the same items are assigned to multiple annotators so that their results can be compared and their agreement measured, ensuring label quality and reducing subjectivity.
- Open the Labeling page from the left-side menu.
- Click Create Task. The task type selection popup is displayed.
- Select Consensus from the popup.

- Click Continue.
1. General
Enter or select the required details in the General section:

- Task Name: Enter a name for the new task. By default, the project name + (total number of tasks + 1) is displayed. For example, if the project name is abc and you already have 5 tasks, the new task name is abc-6.
- Owner: By default, the current user's email ID is displayed. Click on it to select a different owner from the list.
- Priority: Select a priority from the list. By default, Medium is selected.
- Completion Due Date (Optional): Select a task's due date from the calendar.
- Click Next: Data Source.
2. Data source
- Enter or select a Dataset from the list.
- Click Next: Instructions.

3. Instructions
- Enter or select the required details in the Instructions section. The number of Labels and Attributes is displayed on the top-right side of the page.
- Recipe: By default, the dataset's default recipe is selected. Select a different recipe from the list, if needed.
- Labeling Instructions (.pdf): The labeling instruction document is displayed, if available. To upload a PDF instruction, go to the Recipe section. You can also select the page range to display.
- Click Next: Statuses. The Statuses section is displayed.

4. Statuses
- By default, the Completed status is selected. Click Add New Status to add a new status.
- Click Next: Assignments.

5. Assignments
When switching the allocation method from Distribution to Pulling or changing the task type from Labeling to Review (QA), Quality tasks (e.g., consensus, honey pot, qualification) will no longer be available. Additionally, any task assignees will be reset. Confirmation dialogs will guide you through these changes.

- Enter or select the required details in the Assignments section.
- Allocation Method: Select one of the following allocation methods. Distribution is selected by default.
- Pulling: Annotators pull a batch of items at a time, up to a maximum number of items per assignment. If required, adjust the Pulling batch size (items) and Max items in an assignment fields.
- Distribution: Items are distributed in advance among users, either equally or based on a custom percentage. The Auto Distribution option, checked by default, distributes the task equally among the users (a sketch of the equal-split arithmetic follows this step).
- Available Users: Search for or select users from the list, and click the Forward arrow icon to add them to the Assigned Users list.
- Assigned Users:
- Search for or view the assigned users in the list. The allocation percentage is distributed equally if Auto Distribution is selected.
- Select a user and click the Backward arrow icon to remove them from the Assigned Users list.
Inactive users are grayed out and cannot be selected for redistribution, but remain available for reassignment.
- Click Next: Consensus.
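As a rough illustration of the Auto Distribution option described in this step, here is a short sketch of an equal split of items over the assigned users, with any remainder going to the first users. This is only the arithmetic idea, not Dataloop's exact allocation logic.

```python
# Sketch: equal auto-distribution of items among assigned users (remainder goes to the first users).
# Illustrative arithmetic only; Dataloop's own allocation may differ in detail.
def auto_distribute(total_items: int, users: list) -> dict:
    base, remainder = divmod(total_items, len(users))
    return {user: base + (1 if i < remainder else 0) for i, user in enumerate(users)}

print(auto_distribute(100, ['annotator1@example.com', 'annotator2@example.com', 'annotator3@example.com']))
# -> {'annotator1@example.com': 34, 'annotator2@example.com': 33, 'annotator3@example.com': 33}
```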
6. Consensus
Enable advanced quality monitoring options to ensure data quality and review performance.

- Modify the Consensus Data & Assignees values as required.
- Consensus items: Define the percentage of the task's items to include in consensus, up to 100%. As you change the percentage, the exact number of consensus items is calculated and shown.
- Assignees: Define the number of users assigned to each consensus item, which is also the number of copies created per item (minimum 2).
- Original Items: The number of items you selected as Data for this task.
- Consensus Items: The number of additional copies included in the task. Since one copy is already counted as part of the original data, this shows the additional payload per task.
- Total Items: The total of original items and additional copies, which helps you estimate the schedule and cost (see the sketch after these steps).
- Score Function (Optional): Select a score function to calculate quality scores. Clicking Apply Score Function creates a pipeline that runs the score function on items completed in the quality task.
- Click Create Task. The consensus task will be created and is displayed in the tasks list.
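Here is a small worked sketch of the task calculator's arithmetic, under the assumption (based on the Consensus Items note above) that each item selected for consensus adds assignees - 1 extra copies on top of the original item:

```python
# Sketch: estimating the consensus task size shown by the task calculator.
# Assumes additional copies = selected items * (assignees - 1); verify against the UI's own numbers.
def consensus_task_size(original_items: int, consensus_percentage: float, assignees: int) -> dict:
    selected = round(original_items * consensus_percentage / 100)
    consensus_items = selected * (assignees - 1)   # extra copies beyond the original item
    return {
        'original_items': original_items,
        'consensus_items': consensus_items,
        'total_items': original_items + consensus_items,
    }

print(consensus_task_size(original_items=1000, consensus_percentage=20, assignees=3))
# -> {'original_items': 1000, 'consensus_items': 400, 'total_items': 1400}
```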
Create consensus tasks in a pipeline
Consensus tasks can also be automated within a pipeline, enabling data to flow through multiple annotation stages with built-in agreement validation between annotators. To create a consensus task in a pipeline:
- Open the Pipelines page from the left-side panel.
- Select and drag the Consensus task node from the node library to the pipeline canvas.
- Select the Consensus task node and click Create Consensus task from the right-side panel.

- The Create New Task page is displayed. Perform the following steps:

1. General
- Enter or select the required details in the General section:
- Task Name: Enter a name for the new task. By default, the project name + (total number of tasks + 1) is displayed. For example, if the project name is abc and you already have 5 tasks, the new task name is abc-6.
- Owner: By default, the current user's email ID is displayed. Click on it to select a different owner from the list.
- Pipeline Name: The current pipeline name is displayed.
- Priority: Select a priority from the list. By default, Medium is selected.
- Click Next: Data Source.
2. Data source
- Enter or select a Dataset from the list.
- Click Next: Instructions.
3. Instructions
- Enter or select the required details in the Instructions section. The number of Labels and Attributes is displayed on the top-right side of the page.
- Recipe: By default, the dataset's default recipe is selected. Select a different recipe from the list, if needed.
- Labeling Instructions (.pdf): The labeling instruction document is displayed, if available. To upload a PDF instruction, go to the Recipe section. You can also select the page range to display.
- Click Next: Statuses. The Statuses section is displayed.
4. Statuses
- By default, the Completed status is selected. Click Add New Status to add a new status.
- Click Next: Assignments.
5. Assignments
- Enter or select the required details in the Assignments section.
- Allocation Method: Select one of the following allocation methods. Distribution is selected by default.
- Pulling: Annotators pull a batch of items at a time, up to a maximum number of items per assignment. If required, adjust the Pulling batch size (items) and Max items in an assignment fields.
- Distribution: Items are distributed in advance among users, either equally or based on a custom percentage. The Auto Distribution option, checked by default, distributes the task equally among the users.
- Available Users: Search for or select users from the list, and click the Forward arrow icon to add them to the Assigned Users list.
- Assigned Users:
- Search for or view the assigned users in the list. The allocation percentage is distributed equally if Auto Distribution is selected.
- Select a user and click the Backward arrow icon to remove them from the Assigned Users list.
Inactive users are grayed out and cannot be selected for redistribution, but remain available for reassignment.
- Click Next: Consensus.
6. Consensus
Enable advanced quality monitoring options to ensure data quality and review performance.
- Modify the Consensus Data & Assignees values as required.
- Click Create Task. The consensus task will be created and is displayed on the pipeline node.
To edit a consensus task, see the Edit a task section.