Encord Python SDK API Reference#
User Client#
- class encord.user_client.EncordUserClient(user_config, querier)[source]#
- get_dataset(dataset_hash, dataset_access_settings=DatasetAccessSettings(fetch_client_metadata=False))[source]#
Get the Dataset class to access dataset fields and manipulate a dataset.
You will only have access to this dataset if you are one of the following:
Dataset admin
Organisation admin of the dataset
- Parameters
dataset_hash (str) – The Dataset ID
dataset_access_settings (DatasetAccessSettings) – Set the dataset_access_settings if you would like to change the defaults.
- Return type
Dataset
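As an illustrative sketch, fetching a dataset and reading its row titles might look like the following (the hash is a placeholder, `user_client` is assumed to be an authenticated EncordUserClient, and `data_rows`/`title` follow this SDK's Dataset and DataRow attributes):

```python
def list_data_row_titles(user_client, dataset_hash):
    """Fetch a dataset and return the titles of its data rows.

    `user_client` is an authenticated EncordUserClient instance.
    """
    dataset = user_client.get_dataset(dataset_hash)
    return [data_row.title for data_row in dataset.data_rows]
```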
- get_project(project_hash)[source]#
Get the Project class to access project fields and manipulate a project.
You will only have access to this project if you are one of the following
Project admin
Project team manager
Organisation admin of the project
- Parameters
project_hash (str) – The Project ID
- Return type
Project
- get_ontology(ontology_hash)[source]#
- Return type
- create_private_dataset(dataset_title, dataset_type, dataset_description=None)[source]#
DEPRECATED - please use create_dataset instead.
- Return type
- create_dataset(dataset_title, dataset_type, dataset_description=None, create_backing_folder=True)[source]#
- Parameters
dataset_title (str) – Title of dataset.
dataset_type (StorageLocation) – StorageLocation type where data will be stored.
dataset_description (Optional[str]) – Optional description of the dataset.
- Return type
- Returns
CreateDatasetResponse
- get_dataset_api_keys(dataset_hash)[source]#
- Return type
List[DatasetAPIKey]
- get_datasets(title_eq=None, title_like=None, desc_eq=None, desc_like=None, created_before=None, created_after=None, edited_before=None, edited_after=None)[source]#
List either all (if called with no arguments) or matching datasets the user has access to.
- Parameters
title_eq (Optional[str]) – optional exact title filter
title_like (Optional[str]) – optional fuzzy title filter; SQL syntax
desc_eq (Optional[str]) – optional exact description filter
desc_like (Optional[str]) – optional fuzzy description filter; SQL syntax
created_before (Union[str, datetime, None]) – optional creation date filter, ‘less’
created_after (Union[str, datetime, None]) – optional creation date filter, ‘greater’
edited_before (Union[str, datetime, None]) – optional last modification date filter, ‘less’
edited_after (Union[str, datetime, None]) – optional last modification date filter, ‘greater’
- Return type
List[Dict[str, Any]]
- Returns
list of (role, dataset) pairs for datasets matching filter conditions.
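As a sketch of combining filters (the title pattern and date are placeholders; `user_client` is assumed to be an authenticated EncordUserClient):

```python
from datetime import datetime

def recent_datasets(user_client):
    """List datasets created since 2023 whose title starts with 'Drone'.

    `user_client` is an authenticated EncordUserClient instance.
    """
    return user_client.get_datasets(
        title_like="Drone%",  # SQL LIKE syntax: % matches any suffix
        created_after=datetime(2023, 1, 1),
    )
```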
- static create_with_ssh_private_key(ssh_private_key=None, password=None, requests_settings=RequestsSettings(max_retries=3, backoff_factor=1.5, connection_retries=3, connect_timeout=180, read_timeout=180, write_timeout=180), ssh_private_key_path=None, **kwargs)[source]#
Creates an instance of EncordUserClient authenticated with a private SSH key. The key can be provided either as the key content or as a path to the key file, via the method parameters or the following environment variables:
ENCORD_SSH_KEY: environment variable with the private key content
ENCORD_SSH_KEY_FILE: environment variable with the path to the key file
- Parameters
ssh_private_key (Optional[str]) – the private key content
ssh_private_key_path (Optional[str | Path]) – the path to the private key file
password (Optional[str]) – private key password
- Return type
EncordUserClient
- get_projects(title_eq=None, title_like=None, desc_eq=None, desc_like=None, created_before=None, created_after=None, edited_before=None, edited_after=None)[source]#
List either all (if called with no arguments) or matching projects the user has access to.
- Parameters
title_eq (Optional[str]) – optional exact title filter
title_like (Optional[str]) – optional fuzzy title filter; SQL syntax
desc_eq (Optional[str]) – optional exact description filter
desc_like (Optional[str]) – optional fuzzy description filter; SQL syntax
created_before (Union[str, datetime, None]) – optional creation date filter, ‘less’
created_after (Union[str, datetime, None]) – optional creation date filter, ‘greater’
edited_before (Union[str, datetime, None]) – optional last modification date filter, ‘less’
edited_after (Union[str, datetime, None]) – optional last modification date filter, ‘greater’
- Return type
List[Dict]
- Returns
list of (role, project) pairs for projects matching filter conditions.
- create_project(project_title, dataset_hashes, project_description='', ontology_hash='', workflow_settings=<encord.orm.project.ManualReviewWorkflowSettings object>, workflow_template_hash=None)[source]#
Creates a new project and returns its uid (‘project_hash’)
- Parameters
project_title (str) – the title of the project
dataset_hashes (List[str]) – a list of the dataset uids that the project will use
project_description (str) – the optional description of the project
ontology_hash (str) – the uid of an ontology to be used. If omitted, a new empty ontology will be created
workflow_settings (Union[ManualReviewWorkflowSettings, BenchmarkQaWorkflowSettings]) – selects and configures the type of the quality control workflow to use. See encord.orm.project.ProjectWorkflowSettings for details. If omitted, ManualReviewWorkflowSettings is used.
workflow_template_hash (Optional[str]) – project will be created using a workflow based on the template provided.
- Return type
str
- Returns
the uid of the project.
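A minimal sketch (titles are placeholders; `user_client` is assumed to be an authenticated EncordUserClient and the dataset hashes must refer to datasets you administer):

```python
def create_example_project(user_client, dataset_hashes):
    """Create a project over existing datasets and return its project_hash.

    `user_client` is an authenticated EncordUserClient instance.
    """
    return user_client.create_project(
        project_title="Example project",
        dataset_hashes=dataset_hashes,
        project_description="Created via the SDK",
        # ontology_hash omitted: a new empty ontology is created
    )
```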
- create_project_api_key(project_hash, api_key_title, scopes)[source]#
- Return type
str
- Returns
The created project API key.
- get_project_api_keys(project_hash)[source]#
- Return type
List[ProjectAPIKey]
- get_dataset_client(dataset_hash, dataset_access_settings=DatasetAccessSettings(fetch_client_metadata=False), **kwargs)[source]#
DEPRECATED - prefer using get_dataset() instead.
- Return type
- get_project_client(project_hash, **kwargs)[source]#
DEPRECATED - prefer using get_project() instead.
- Return type
- create_project_from_cvat(import_method, dataset_name, review_mode=ReviewMode.LABELLED, max_workers=None, *, transform_bounding_boxes_to_polygons=False)[source]#
Export your CVAT project with the “CVAT for images 1.1” option and use this function to import your images and annotations into Encord. Ensure that the “Save images” checkbox is enabled when exporting from CVAT.
- Parameters
import_method (LocalImport) – The chosen import method. See the ImportMethod class for details.
dataset_name (str) – The name of the dataset that will be created.
review_mode (ReviewMode) – Set how much interaction is needed from the labeler and from the reviewer for the CVAT labels. See the ReviewMode documentation for more details.
max_workers (Optional[int]) – DEPRECATED: This argument will be ignored
transform_bounding_boxes_to_polygons – All instances of CVAT bounding boxes will be converted to polygons in the final Encord project.
- Returns
CvatImporterSuccess if the project was successfully imported, or CvatImporterError if the project could not be imported.
- Return type
- Raises
ValueError – If the CVAT directory has an invalid format.
- get_cloud_integrations()[source]#
- Return type
List[CloudIntegration]
- get_ontologies(title_eq=None, title_like=None, desc_eq=None, desc_like=None, created_before=None, created_after=None, edited_before=None, edited_after=None)[source]#
List either all (if called with no arguments) or matching ontologies the user has access to.
- Parameters
title_eq (Optional[str]) – optional exact title filter
title_like (Optional[str]) – optional fuzzy title filter; SQL syntax
desc_eq (Optional[str]) – optional exact description filter
desc_like (Optional[str]) – optional fuzzy description filter; SQL syntax
created_before (Union[str, datetime, None]) – optional creation date filter, ‘less’
created_after (Union[str, datetime, None]) – optional creation date filter, ‘greater’
edited_before (Union[str, datetime, None]) – optional last modification date filter, ‘less’
edited_after (Union[str, datetime, None]) – optional last modification date filter, ‘greater’
- Return type
List[Dict]
- Returns
list of (role, ontology) pairs for ontologies matching filter conditions.
- create_ontology(title, description='', structure=None)[source]#
- Return type
- deidentify_dicom_files(dicom_urls, integration_hash, redact_dicom_tags=True, redact_pixels_mode=DeidentifyRedactTextMode.REDACT_NO_TEXT, save_conditions=None, upload_dir=None)[source]#
Deidentify DICOM files in external storage. Given links to DICOM files pointing to AWS, GCP, AZURE or OTC storage, for example: [ “https://s3.region-code.amazonaws.com/bucket-name/dicom-file-input.dcm” ], the function executes deidentification on those files, removing all DICOM tags (https://dicom.nema.org/medical/Dicom/2017e/output/chtml/part06/chapter_6.html) from the metadata except for:
x00080018 SOPInstanceUID
x00100010 PatientName
x00180050 SliceThickness
x00180088 SpacingBetweenSlices
x0020000d StudyInstanceUID
x0020000e SeriesInstanceUID
x00200032 ImagePositionPatient
x00200037 ImageOrientationPatient
x00280008 NumberOfFrames
x00281050 WindowCenter
x00281051 WindowWidth
x00520014 ALinePixelSpacing
- Parameters
self – Encord client object.
dicom_urls (List[str]) – a list of urls to DICOM files, e.g. [ “https://s3.region-code.amazonaws.com/bucket-name/dicom-file-input.dcm” ]
integration_hash (str) – integration_hash parameter of the Encord platform external storage integration
redact_dicom_tags (bool) – Specifies if DICOM tags redaction should be enabled.
redact_pixels_mode (DeidentifyRedactTextMode) – Specifies which text redaction policy should be applied to pixel data.
save_conditions (Optional[List[Union[SaveDeidentifiedDicomConditionNotSubstr, SaveDeidentifiedDicomConditionIn]]]) – Specifies a list of conditions which all have to be true for the deidentified DICOM file to be saved.
upload_dir (Optional[str]) – Specifies a directory that files will be uploaded to. By default (None), deidentified files are uploaded to the same directory as the source files.
- Return type
List[str]
- Returns
Returns a list of links pointing to the deidentified DICOM files. These are saved to the same bucket and directory as the original files, with the prefix deid_{timestamp}_. Example output: [ “https://s3.region-code.amazonaws.com/bucket-name/deid_167294769118005312_dicom-file-input.dcm” ]
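A sketch of a deidentification call with default redaction settings (the URL and integration hash are placeholders; `user_client` is assumed to be an authenticated EncordUserClient):

```python
def deidentify_bucket_files(user_client, integration_hash):
    """Deidentify externally stored DICOM files; returns links to the outputs.

    `user_client` is an authenticated EncordUserClient instance.
    """
    dicom_urls = [
        "https://s3.region-code.amazonaws.com/bucket-name/dicom-file-input.dcm",
    ]
    return user_client.deidentify_dicom_files(
        dicom_urls=dicom_urls,
        integration_hash=integration_hash,
    )
```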
- class encord.user_client.ListingFilter(value)[source]#
Available properties_filter keys for get_projects() and get_datasets().
The values for _before and _after should be datetime objects.
- TITLE_EQ = 'title_eq'#
- TITLE_LIKE = 'title_like'#
- DESC_EQ = 'desc_eq'#
- DESC_LIKE = 'desc_like'#
- CREATED_BEFORE = 'created_before'#
- CREATED_AFTER = 'created_after'#
- EDITED_BEFORE = 'edited_before'#
- EDITED_AFTER = 'edited_after'#
Project#
- class encord.project.Project(client, project_instance, ontology)[source]#
Access project related data and manipulate the project.
- property project_hash: str#
Get the project hash (i.e. the Project ID).
- Return type
str
- property title: str#
Get the title of the project.
- Return type
str
- property description: str#
Get the description of the project.
- Return type
str
- property created_at: datetime.datetime#
Get the time the project was created at.
- Return type
datetime
- property last_edited_at: datetime.datetime#
Get the time the project was last edited at.
- Return type
datetime
- property ontology: dict#
Get the ontology of the project.
DEPRECATED: Prefer using the encord.Project.ontology_structure() method.
- Return type
dict
- property ontology_hash: str#
Get the ontology hash of the project’s ontology.
- Return type
str
- property ontology_structure: encord.objects.ontology_structure.OntologyStructure#
Get the ontology structure of the project’s ontology.
- Return type
- property datasets: list#
Get the associated datasets.
Prefer using the encord.objects.project.ProjectDataset() class to work with the data.

from encord.objects.project import ProjectDataset

project = user_client.get_project("<project_hash>")
project_datasets = ProjectDataset.from_list(project.datasets)
- Return type
list
- property label_rows: dict#
Get the label rows.
DEPRECATED: Prefer using the list_label_rows_v2() method and the LabelRowV2() class to work with the data.

from encord.orm.label_row import LabelRowMetadata

project = user_client.get_project("<project_hash>")
label_rows = LabelRowMetadata.from_list(project.label_rows)
- Return type
dict
- refetch_data()[source]#
The Project class will only fetch its properties once. Use this method if you suspect the cached state of those properties is stale.
- Return type
None
- refetch_ontology()[source]#
Update the ontology for the project to reflect changes on the backend
- Return type
None
- get_project()[source]#
This function is exposed for convenience. You are encouraged to use the property accessors instead.
- Return type
- list_label_rows_v2(data_hashes=None, label_hashes=None, edited_before=None, edited_after=None, label_statuses=None, shadow_data_state=None, data_title_eq=None, data_title_like=None, workflow_graph_node_title_eq=None, workflow_graph_node_title_like=None)[source]#
- Parameters
data_hashes (Optional[List[str]]) – List of data hashes to filter by.
label_hashes (Optional[List[str]]) – List of label hashes to filter by.
edited_before (Union[str, datetime, None]) – Optionally filter to only rows last edited before the specified time
edited_after (Union[str, datetime, None]) – Optionally filter to only rows last edited after the specified time
label_statuses (Optional[List[AnnotationTaskStatus]]) – Optionally filter to only those label rows that have one of the specified AnnotationTaskStatus values
shadow_data_state (Optional[ShadowDataState]) – Optionally filter by data type in Benchmark QA projects. See ShadowDataState
data_title_eq (Optional[str]) – Optionally filter by exact title match
data_title_like (Optional[str]) – Optionally filter by fuzzy title match; SQL syntax
workflow_graph_node_title_eq (Optional[str]) – Optionally filter by exact match with workflow node title
workflow_graph_node_title_like (Optional[str]) – Optionally filter by fuzzy match with workflow node title; SQL syntax
- Return type
List[LabelRowV2]
- Returns
A list of LabelRowV2 instances for all the matching label rows
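A sketch of listing recently edited rows and downloading their label content (the 7-day window is a placeholder; `project` is assumed to be an encord Project, and `initialise_labels` is the LabelRowV2 method that fetches label data):

```python
from datetime import datetime, timedelta

def recently_edited_label_rows(project):
    """Fetch label rows edited in the last 7 days and download their labels.

    `project` is an encord Project instance.
    """
    label_rows = project.list_label_rows_v2(
        edited_after=datetime.now() - timedelta(days=7),
    )
    for label_row in label_rows:
        label_row.initialise_labels()  # fetches the actual label content
    return label_rows
```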
- add_users(user_emails, user_role)[source]#
Add users to the project.
- Parameters
user_emails (List[str]) – list of user emails to be added
user_role (ProjectUserRole) – the user role to assign to all users
- Return type
List[ProjectUser]
- Returns
A list of ProjectUser objects
- Raises
AuthorisationError – If the project API key is invalid.
ResourceNotFoundError – If no project exists by the specified project EntityId.
UnknownError – If an error occurs while adding the users to the project
- copy_project(copy_datasets=False, copy_collaborators=False, copy_models=False, *, copy_labels=None, new_title=None, new_description=None)[source]#
Copy the current project into a new one with copied contents including settings, datasets and users. Labels and models are optional.
- Parameters
copy_datasets (Union[bool, CopyDatasetOptions]) – if True, the datasets of the existing project are copied over, and new tasks are created from those datasets
copy_collaborators – if True, all users of the existing project are copied over with their current roles. If label and/or annotator reviewer mapping is set, this will also be copied over
copy_models – currently if True, all models with their training information will be copied into the new project
copy_labels (Optional[CopyLabelsOptions]) – options for copying labels, defined in CopyLabelsOptions
new_title (Optional[str]) – when provided, will be used as the title for the new project.
new_description (Optional[str]) – when provided, will be used as the description for the new project.
- Return type
str
- Returns
the EntityId of the newly created project
- Raises
AuthorisationError – If the project API key is invalid.
ResourceNotFoundError – If no project exists by the specified project EntityId.
UnknownError – If an error occurs while copying the project.
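A minimal sketch of a copy that carries over datasets and collaborators but not models (the new title is a placeholder):

```python
def duplicate_project(project):
    """Copy a project with its datasets and collaborators.

    `project` is an encord Project instance; returns the new project's EntityId.
    """
    return project.copy_project(
        copy_datasets=True,
        copy_collaborators=True,
        copy_models=False,
        new_title="Copy of my project",
    )
```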
- submit_label_row_for_review(uid)[source]#
Submit a label row for review.
Note: this method is not supported for workflow-based projects. See the documentation about workflows.
- Parameters
uid (str) – A label_hash (uid) string.
- Returns
Bool.
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
UnknownError – If an error occurs while submitting for review.
OperationNotAllowed – If the write operation is not allowed by the API key.
- add_datasets(dataset_hashes)[source]#
Add datasets to the project.
- Parameters
dataset_hashes (List[str]) – List of dataset hashes of the datasets to be added
- Return type
bool
- Returns
Bool.
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ResourceNotFoundError – If one or more datasets don’t exist by the specified dataset_hashes.
UnknownError – If an error occurs while adding the datasets to the project.
OperationNotAllowed – If the write operation is not allowed by the API key.
- remove_datasets(dataset_hashes)[source]#
Remove datasets from project
- Parameters
dataset_hashes (List[str]) – List of dataset hashes of the datasets to be removed
- Return type
bool
- Returns
Bool.
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ResourceNotFoundError – If no dataset exists by the specified dataset_hash (uid).
UnknownError – If an error occurs while removing the datasets from the project.
OperationNotAllowed – If the operation is not allowed by the API key.
- get_project_ontology()[source]#
DEPRECATED - prefer using the ontology_structure property accessor instead.
- Return type
- add_object(name, shape)[source]#
Add object to an ontology.
ATTENTION: this legacy method will affect all the projects sharing the same ontology
- Parameters
name (str) – the name of the object
shape (ObjectShape) – the shape of the object (BOUNDING_BOX, POLYGON, POLYLINE or KEY_POINT)
- Return type
bool
- Returns
True if the object was added successfully and False otherwise.
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
UnknownError – If an error occurs while adding the object to the project ontology
OperationNotAllowed – If the operation is not allowed by the API key.
ValueError – If invalid arguments are supplied in the function call
- add_classification(name, classification_type, required, options=None)[source]#
Add classification to an ontology.
ATTENTION: this legacy method will affect all the projects sharing the same ontology
- Parameters
name (str) – the name of the classification
classification_type (ClassificationType) – the classification type (RADIO, TEXT or CHECKLIST)
required (bool) – whether this classification is required by the annotator
options (Optional[Iterable[str]]) – the list of options for the classification (to be set to None for texts)
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
UnknownError – If an error occurs while adding the classification to the project ontology
OperationNotAllowed – If the operation is not allowed by the API key.
ValueError – If invalid arguments are supplied in the function call
- list_models()[source]#
List all models that are associated with the project. Use encord.project.Project.get_training_metadata() to get more metadata about each training instance.

from encord.utilities.project_utilities import get_all_model_iteration_uids

project = client_instance.get_project("<project_hash>")
model_configurations = project.list_models()
all_model_iteration_uids = get_all_model_iteration_uids(model_configurations)
training_metadata = project.get_training_metadata(
    all_model_iteration_uids,
    get_model_training_labels=True,
)
- Return type
List[ModelConfiguration]
- get_training_metadata(model_iteration_uids, get_created_at=False, get_training_final_loss=False, get_model_training_labels=False)[source]#
Given a list of model_iteration_uids, get some metadata around each model_iteration.
- Parameters
model_iteration_uids (Iterable[str]) – The model iteration uids
get_created_at (bool) – Whether the created_at field should be retrieved.
get_training_final_loss (bool) – Whether the training_final_loss field should be retrieved.
get_model_training_labels (bool) – Whether the model_training_labels field should be retrieved.
- Return type
List[TrainingMetadata]
- create_model_row(title, description, features, model)[source]#
Create a model row.
- Parameters
title (str) – Model title.
description (str) – Model description.
features (List[str]) – List of <feature_node_hashes>, which are ids of ontology objects or classifications to be included in the model.
model (Union[AutomationModels, str]) – the model type to be used. For backwards compatibility purposes, strings corresponding to the values of the AutomationModels enum are also accepted.
- Return type
str
- Returns
The uid of the added model row.
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ModelFeaturesInconsistentError – If a feature type is different than what is supported by the model (e.g. if creating a classification model using a bounding box).
- model_delete(uid)[source]#
Delete a model created on the platform.
- Parameters
uid (str) – A model_hash (uid) string.
- Return type
bool
- Returns
bool
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ResourceNotFoundError – If no model exists by the specified model_hash (uid).
UnknownError – If an error occurs during deletion.
- model_inference(uid, file_paths=None, base64_strings=None, conf_thresh=0.6, iou_thresh=0.3, device=Device.CUDA, detection_frame_range=None, allocation_enabled=False, data_hashes=None, rdp_thresh=0.005)[source]#
Run inference with model trained on the platform.
The image(s)/video(s) can be provided either as local file paths, base64 strings, or as data hashes if the data is already uploaded on the Encord platform.
- Parameters
uid (str) – A model_iteration_hash (uid) string.
file_paths (Optional[List[str]]) – List of local file paths to image(s) or video(s) - if running inference on files.
base64_strings (Optional[List[bytes]]) – List of base 64 strings of image(s) or video(s) - if running inference on base64 strings.
conf_thresh (float) – Confidence threshold (default 0.6).
iou_thresh (float) – Intersection over union threshold (default 0.3).
device (Device) – Device (CPU or CUDA, default is CUDA).
detection_frame_range (Optional[List[int]]) – Detection frame range (for videos).
allocation_enabled (bool) – Object UID allocation (tracking) enabled (disabled by default).
data_hashes (Optional[List[str]]) – list of hashes of the videos/image_groups you’d like to run inference on.
rdp_thresh (float) – parameter specifying the polygon coarseness to be used while running inference. The higher the value, the fewer points in the segmented image
- Returns
A dict of inference results.
- Return type
Inference results
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ResourceNotFoundError – If no model exists by the specified model_iteration_hash (uid).
UnknownError – If an error occurs while running inference.
FileTypeNotSupportedError – If the file type is not supported for inference (has to be an image or video)
FileSizeNotSupportedError – If the file size is too big to be supported.
DetectionRangeInvalidError – If a detection range is invalid for video inference
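A sketch of running inference on local image files (the threshold values are placeholders; `project` is assumed to be an encord Project):

```python
def detect_on_files(project, model_iteration_hash, file_paths):
    """Run inference with a trained model on local files.

    `project` is an encord Project instance; returns the inference results dict.
    """
    return project.model_inference(
        model_iteration_hash,
        file_paths=file_paths,
        conf_thresh=0.5,  # keep detections with confidence >= 0.5
        iou_thresh=0.3,
    )
```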
- model_train(uid, label_rows=None, epochs=None, batch_size=24, weights=None, device=Device.CUDA)[source]#
Train a model created on the platform.
- Parameters
uid (str) – A model_hash (uid) string.
label_rows (Optional[List[str]]) – List of label row uids (hashes) for training.
epochs (Optional[int]) – Number of passes through the training dataset.
batch_size (int) – Number of training examples utilized in one iteration.
weights (Optional[ModelTrainingWeights]) – Model weights.
device (Device) – Device (CPU or CUDA, default is CUDA).
- Returns
A model iteration object.
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ModelWeightsInconsistentError – If the passed model weights are incompatible with the selected model.
ResourceNotFoundError – If no model exists by the specified model_hash (uid).
UnknownError – If an error occurs during training.
- object_interpolation(key_frames, objects_to_interpolate)[source]#
Run the object interpolation algorithm on project labels (requires an editor ontology and feature uids).
Interpolation is supported for bounding box, polygon, and keypoint.
- Parameters
key_frames –
Labels for frames to be interpolated. Key frames are consumed in the form:
{
    "<frame_number>": {
        "objects": [
            {
                "objectHash": "<object_hash>",
                "featureHash": "<feature_hash>",
                "polygon": {
                    "0": {"x": x1, "y": y1},
                    "1": {"x": x2, "y": y2},
                    # ...,
                }
            },
            # ...
        ]
    },
    # ...,
}
objects_to_interpolate – List of object uid’s (hashes) of objects to interpolate.
- Returns
Full set of filled frames including interpolated objects.
- Return type
Interpolation results
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
UnknownError – If an error occurs while running interpolation.
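The key_frames payload above can be built as a plain dictionary; a minimal sketch with one polygon object on frame 0 (hashes and coordinates are placeholders):

```python
# One polygon annotated on frame 0; interpolation fills the frames in between
# this and the next key frame for the same objectHash.
key_frames = {
    "0": {
        "objects": [
            {
                "objectHash": "<object_hash>",
                "featureHash": "<feature_hash>",
                "polygon": {
                    "0": {"x": 0.10, "y": 0.10},
                    "1": {"x": 0.20, "y": 0.10},
                    "2": {"x": 0.20, "y": 0.20},
                },
            }
        ]
    }
}

def interpolate_objects(project, object_hashes):
    """Run interpolation for the given objects.

    `project` is an encord Project instance.
    """
    return project.object_interpolation(
        key_frames=key_frames,
        objects_to_interpolate=object_hashes,
    )
```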
- fitted_bounding_boxes(frames, video)[source]#
- Parameters
frames (dict) – Labels for frames to be fitted. Frames are consumed in the form:
{
    "<frame_number>": {
        "objects": [
            {
                "objectHash": "<object_hash>",
                "featureHash": "<feature_hash>",
                "polygon": {
                    "0": {"x": x1, "y": y1},
                    "1": {"x": x2, "y": y2},
                    # ...,
                }
            },
            # ...
        ]
    },
    # ...,
}
video (dict) – Metadata of the video for which bounding box fitting needs to be run:
{
    "width": w,
    "height": h,
}
- Returns
Full set of filled frames including fitted objects.
- Return type
Fitting results
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
UnknownError – If an error occurs while running the fitting algorithm.
- get_data(data_hash, get_signed_url=False)[source]#
Retrieve information about a video or image group.
- Parameters
data_hash (str) – The uid of the data object
get_signed_url (bool) – Optionally return signed URLs for timed public access to that resource (default False)
- Return type
- Returns
A tuple consisting of the video (if it exists) and a list of individual images (if they exist)
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
UnknownError – If an error occurs while retrieving the object.
- get_label_logs(user_hash=None, data_hash=None, from_unix_seconds=None, to_unix_seconds=None, after=None, before=None, user_email=None)[source]#
Get label logs, which represent the actions taken in the UI to create labels.
All arguments can be left as None if no filtering should be applied.
- Parameters
user_hash (Optional[str]) – Filter the label logs by the user.
data_hash (Optional[str]) – Filter the label logs by the data_hash.
from_unix_seconds (Optional[int]) – Filter the label logs to only include labels after this timestamp. Deprecated: use parameter after instead
to_unix_seconds (Optional[int]) – Filter the label logs to only include labels before this timestamp. Deprecated: use parameter before instead
after (Optional[datetime]) – Filter the label logs to only include labels after the specified time.
before (Optional[datetime]) – Filter the label logs to only include labels before the specified time.
user_email (Optional[str]) – Filter by the annotator email.
- Return type
List[LabelLog]
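A sketch of summarising one annotator's activity (`project` is assumed to be an encord Project; the `data_hash` attribute on LabelLog is an assumption for illustration):

```python
from collections import Counter

def label_actions_per_data_unit(project, user_email):
    """Count label log entries per data_hash for one annotator.

    `project` is an encord Project instance; each LabelLog is assumed to
    expose a data_hash attribute.
    """
    logs = project.get_label_logs(user_email=user_email)
    return Counter(log.data_hash for log in logs)
```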
- get_cloud_integrations()[source]#
- Return type
List[CloudIntegration]
- list_label_rows(edited_before=None, edited_after=None, label_statuses=None, shadow_data_state=None, *, include_uninitialised_labels=False, label_hashes=None, data_hashes=None)[source]#
DEPRECATED - use list_label_rows_v2 to manage label rows instead.
- Parameters
edited_before (Union[str, datetime, None]) – Optionally filter to only rows last edited before the specified time
edited_after (Union[str, datetime, None]) – Optionally filter to only rows last edited after the specified time
label_statuses (Optional[List[AnnotationTaskStatus]]) – Optionally filter to only those label rows that have one of the specified AnnotationTaskStatus values
shadow_data_state (Optional[ShadowDataState]) – Optionally filter by data type in Benchmark QA projects. See ShadowDataState
include_uninitialised_labels – Whether to return only label rows that are “created” and have a label_hash (default). If set to True, this will return all label rows, including those that do not have a label_hash.
data_hashes (Optional[List[str]]) – List of data hashes to filter by.
label_hashes (Optional[List[str]]) – List of label hashes to filter by.
- Return type
List[LabelRowMetadata]
- Returns
A list of LabelRowMetadata instances for all the matching label rows
- Raises
UnknownError – If an error occurs while retrieving the data.
- set_label_status(label_hash, label_status)[source]#
DEPRECATED - this function is currently not maintained.
Set the label status for a label row to a desired value.
- Parameters
self – Encord client object.
label_hash (str) – unique identifier of the label row whose status is to be updated.
label_status (LabelStatus) – the new status that needs to be set.
- Return type
bool
- Returns
Bool.
- Raises
AuthorisationError – If the label_hash provided is invalid or not a member of the project.
UnknownError – If an error occurs while updating the status.
- get_label_row(uid, get_signed_url=True, *, include_object_feature_hashes=None, include_classification_feature_hashes=None, include_reviews=False)[source]#
DEPRECATED: Prefer using the list_label_rows_v2 function to interact with label rows.
Retrieve a label row. If you need to retrieve multiple label rows, prefer using encord.project.Project.get_label_rows() instead. A code example using the include_object_feature_hashes and include_classification_feature_hashes filters can be found in encord.project.Project.get_label_rows().
- Parameters
uid (
str
) – A label_hash (uid) string.get_signed_url (
bool
) – Whether to generate signed urls to the data asset. Generating these should be disabled if the signed urls are not used to speed up the request.include_object_feature_hashes (
Optional
[Set
[str
]]) – If None all the objects will be included. Otherwise, only objects labels will be included of which the feature_hash has been added.include_classification_feature_hashes (
Optional
[Set
[str
]]) – If None all the classifications will be included. Otherwise, only classification labels will be included of which the feature_hash has been added.include_reviews (
bool
) – Whether to request read only information about the reviews of the label row.
- Returns
A label row instance.
- Return type
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ResourceNotFoundError – If no label exists by the specified label_hash (uid).
UnknownError – If an error occurs while retrieving the label.
OperationNotAllowed – If the read operation is not allowed by the API key.
- get_label_rows(uids, get_signed_url=True, *, include_object_feature_hashes=None, include_classification_feature_hashes=None, include_reviews=False)[source]#
DEPRECATED: Prefer using the list_label_rows_v2 function to interact with label rows.
Retrieve a list of label rows. Duplicates will be dropped, and the results will come back in no particular order.
Behaviour is undefined if any of the uids are invalid: the call may randomly fail or randomly succeed, and should not be relied upon.
# Code example of using the object filters.
from encord.objects.common import Shape
from encord.objects.ontology_structure import OntologyStructure

project = ...  # assuming you already have instantiated this Project object

# Get all feature hashes of the objects which are of type `Shape.BOUNDING_BOX`
ontology = OntologyStructure.from_dict(project.ontology)
only_bounding_box_feature_hashes = set()
for object_ in ontology.objects:
    if object_.shape == Shape.BOUNDING_BOX:
        only_bounding_box_feature_hashes.add(object_.feature_node_hash)

no_classification_feature_hashes = set()  # deliberately left empty

# Get all labels of tasks that have already been initiated.
# Include only labels of bounding boxes and exclude all classifications.
label_hashes = []
for label_row in project.label_rows:
    # Trying to run `get_label_row` on a label_row without a `label_hash` would fail.
    if label_row["label_hash"] is not None:
        label_hashes.append(label_row["label_hash"])

all_labels = project.get_label_rows(
    label_hashes,
    include_object_feature_hashes=only_bounding_box_feature_hashes,
    include_classification_feature_hashes=no_classification_feature_hashes,
)
- Parameters
uids (List[str]) – A list of label_hash (uid) strings.
get_signed_url (bool) – Whether to generate signed urls to the data asset. Generating these should be disabled if the signed urls are not used, to speed up the request.
include_object_feature_hashes – If None, all objects will be included. Otherwise, only object labels whose feature_hash has been added will be included.
include_classification_feature_hashes – If None, all classifications will be included. Otherwise, only classification labels whose feature_hash has been added will be included.
include_reviews – Whether to request read-only information about the reviews of the label row.
- Raises
MultiLabelLimitError – If too many labels were requested. Check the error’s maximum_labels_allowed field to read the most up to date error limit.
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ResourceNotFoundError – If no label exists by the specified label_hash (uid).
UnknownError – If an error occurs while retrieving the label.
OperationNotAllowed – If the read operation is not allowed by the API key.
- Return type
List[LabelRow]
- save_label_row(uid, label)[source]#
DEPRECATED: Prefer using the list_label_rows_v2 function to interact with label rows.
Save existing label row.
If you have a series of frame labels and have not updated answer dictionaries, call the construct_answer_dictionaries utility function to do so prior to saving labels.
- Parameters
uid – A label_hash (uid) string.
label – A label row instance.
- Returns
True if the label row was successfully saved.
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
ResourceNotFoundError – If no label exists by the specified label_hash (uid).
UnknownError – If an error occurs while saving the label.
OperationNotAllowed – If the write operation is not allowed by the API key.
AnswerDictionaryError – If an object or classification instance is missing in answer dictionaries.
CorruptedLabelError – If a blurb is corrupted (e.g. if the frame labels have more frames than the video).
- create_label_row(uid)[source]#
DEPRECATED: Prefer using the list_label_rows_v2 function to interact with label rows.
Create a label row (for data in a project that has not previously been labeled).
- Parameters
uid (str) – the data_hash (uid) of the data unit being labeled. Available in client.get_project().get(‘label_rows’) where label_status is NOT_LABELLED.
- Returns
A label row instance.
- Return type
- Raises
AuthenticationError – If the project API key is invalid.
AuthorisationError – If access to the specified resource is restricted.
UnknownError – If an error occurs while saving the label.
OperationNotAllowed – If the write operation is not allowed by the API key.
AnswerDictionaryError – If an object or classification instance is missing in answer dictionaries.
CorruptedLabelError – If a blurb is corrupted (e.g. if the frame labels have more frames than the video).
ResourceExistsError – If a label row already exists for this project data. Avoids overriding existing work.
- create_bundle()[source]#
Initialises a bundle to reduce the number of network calls performed by the Encord SDK.
See the
encord.http.bundle.Bundle
documentation for more details.
- Return type
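A typical usage pattern, sketched below under the assumption that you already have a Project instance and want to initialise several label rows with fewer network round trips:

```python
project = ...  # an instantiated encord.project.Project object

label_rows = project.list_label_rows_v2()

# Operations registered with the bundle are batched together and
# executed when the context manager exits.
with project.create_bundle() as bundle:
    for label_row in label_rows:
        label_row.initialise_labels(bundle=bundle)
```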
- list_collaborator_timers(after, before=None, group_by_data_unit=True)[source]#
Provides information about time spent for each collaborator that has worked on the project within a specified range of dates.
- Parameters
after (datetime) – the beginning of the period of interest.
before (Optional[datetime]) – the end of the period of interest.
group_by_data_unit (bool) – if True, time spent by a collaborator on each data unit is reported separately; if False, all time spent within the scope of the project is aggregated together.
- Return type
Iterable[CollaboratorTimer]
- Returns
Iterable[CollaboratorTimer]
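For example, a sketch of querying the last week of collaborator activity, assuming an instantiated Project:

```python
from datetime import datetime, timedelta

project = ...  # an instantiated encord.project.Project object

# Collect time spent per collaborator over the last seven days.
one_week_ago = datetime.now() - timedelta(weeks=1)
for timer in project.list_collaborator_timers(after=one_week_ago):
    print(timer)
```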
Dataset#
- class encord.dataset.Dataset(client, orm_dataset)[source]#
Access dataset related data and manipulate the dataset.
- property dataset_hash: str#
Get the dataset hash (i.e. the Dataset ID).
- Return type
str
- property title: str#
- Return type
str
- property description: str#
- Return type
str
- property storage_location: encord.orm.dataset.StorageLocation#
- Return type
- property backing_folder_uuid: Optional[uuid.UUID]#
- Return type
Optional[UUID]
- property data_rows: List[encord.orm.dataset.DataRow]#
Part of the response of this function can be configured by the
encord.dataset.Dataset.set_access_settings()
method.

dataset.set_access_settings(DatasetAccessSettings(fetch_client_metadata=True))
print(dataset.data_rows)
- Return type
List[DataRow]
- list_data_rows(title_eq=None, title_like=None, created_before=None, created_after=None, data_types=None, data_hashes=None)[source]#
Retrieve dataset rows (pointers to data, labels).
- Parameters
title_eq (Optional[str]) – optional exact title row filter
title_like (Optional[str]) – optional fuzzy title row filter; SQL syntax
created_before (Union[str, datetime, None]) – optional datetime row filter
created_after (Union[str, datetime, None]) – optional datetime row filter
data_types (Optional[List[DataType]]) – optional data types row filter
data_hashes (Optional[List[str]]) – optional list of individual data unit hashes to include
- Returns
A list of DataRow objects that match the filter
- Return type
List[DataRow]
- Raises
AuthorisationError – If the dataset API key is invalid.
ResourceNotFoundError – If no dataset exists by the specified dataset EntityId.
UnknownError – If an error occurs while retrieving the dataset.
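A hedged sketch of combining filters; the DataType import path follows the encord.constants.enums.DataType reference used elsewhere in this document:

```python
from datetime import datetime

from encord.constants.enums import DataType

dataset = ...  # an instantiated encord.dataset.Dataset object

# Fetch only videos with an .mp4 title created after the given date.
rows = dataset.list_data_rows(
    title_like="%.mp4",                 # SQL LIKE syntax
    created_after=datetime(2023, 1, 1),
    data_types=[DataType.VIDEO],
)
for row in rows:
    print(row.title)
```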
- refetch_data()[source]#
The Dataset class will only fetch its properties once. Use this function if you suspect the state of those properties to be dirty.
- Return type
None
- get_dataset()[source]#
This function is exposed for convenience. You are encouraged to use the property accessors instead.
- Return type
- set_access_settings(dataset_access_settings, *, refetch_data=True)[source]#
- Parameters
dataset_access_settings (DatasetAccessSettings) – The access settings to use going forward
refetch_data (bool) – Whether a refetch_data() call should follow the update of the dataset access settings.
- Return type
None
- add_users(user_emails, user_role)[source]#
Add users to dataset. If the user was already added, this operation will succeed but the user_role will be unchanged. The existing user_role will be reflected in the DatasetUser instance.
- Parameters
user_emails (List[str]) – list of user emails to be added
user_role (DatasetUserRole) – the user role to assign to all users
- Return type
List[DatasetUser]
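A short sketch; the DatasetUserRole import path is an assumption and may differ between SDK versions:

```python
from encord.orm.dataset import DatasetUserRole  # import path is an assumption

dataset = ...  # an instantiated encord.dataset.Dataset object

# Grant two collaborators the same role; already-added users keep
# their existing role, as reflected in the returned DatasetUser list.
users = dataset.add_users(
    ["alice@example.com", "bob@example.com"],
    DatasetUserRole.USER,
)
```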
- upload_video(file_path, cloud_upload_settings=CloudUploadSettings(max_retries=None, backoff_factor=None, allow_failures=False), title=None)[source]#
Upload video to Encord storage.
- Parameters
file_path (str) – path to video, e.g. ‘/home/user/data/video.mp4’
cloud_upload_settings (CloudUploadSettings) – Settings for uploading data into the cloud. Change this object to overwrite the default values.
title (Optional[str]) – The video title. If unspecified, this will be the file name. This title should include an extension. For example “encord_video.mp4”.
- Returns
True if the upload was successful.
- Raises
UploadOperationNotSupportedError – If trying to upload to external datasets (e.g. S3/GPC/Azure)
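A sketch of overriding the default upload behaviour; the CloudUploadSettings import path is an assumption:

```python
from encord.http.utils import CloudUploadSettings  # import path is an assumption

dataset = ...  # an instantiated encord.dataset.Dataset object

# Retry failed upload chunks a few extra times before giving up.
dataset.upload_video(
    "/home/user/data/video.mp4",
    cloud_upload_settings=CloudUploadSettings(max_retries=5),
    title="encord_video.mp4",  # video titles keep their extension
)
```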
- create_image_group(file_paths, max_workers=None, cloud_upload_settings=CloudUploadSettings(max_retries=None, backoff_factor=None, allow_failures=False), title=None, *, create_video=True)[source]#
Create an image group in Encord storage. Choose this type of image upload for sequential images; otherwise, use the
Dataset.upload_image()
function.
- Parameters
file_paths (Iterable[str]) – a list of paths to images, e.g. [‘/home/user/data/img1.png’, ‘/home/user/data/img2.png’]
max_workers (Optional[int]) – DEPRECATED: This argument will be ignored
cloud_upload_settings (CloudUploadSettings) – Settings for uploading data into the cloud. Change this object to overwrite the default values.
title (Optional[str]) – The title of the image group. If unspecified, this will be randomly generated for you. This title should NOT include an extension. For example “encord_image_group”.
create_video (bool) – A flag specifying how image groups are stored. If True (the previous default behaviour), a compressed video will be created from the image group. If False, the images are saved as a sequence of images.
- Returns
True if the upload was successful.
- Raises
UploadOperationNotSupportedError – If trying to upload to external datasets (e.g. S3/GPC/Azure)
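For instance, to store sequential frames as an image sequence rather than a compressed video (sketch, assuming an instantiated Dataset):

```python
dataset = ...  # an instantiated encord.dataset.Dataset object

dataset.create_image_group(
    ["/home/user/data/img1.png", "/home/user/data/img2.png"],
    title="encord_image_group",  # image group titles take no extension
    create_video=False,          # keep the originals as an image sequence
)
```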
- create_dicom_series(file_paths, cloud_upload_settings=CloudUploadSettings(max_retries=None, backoff_factor=None, allow_failures=False), title=None)[source]#
Upload a DICOM series to Encord storage
- Parameters
file_paths (List[str]) – a list of paths to DICOM files, e.g. [‘/home/user/data/DICOM_1.dcm’, ‘/home/user/data/DICOM_2.dcm’]
cloud_upload_settings (CloudUploadSettings) – Settings for uploading data into the cloud. Change this object to overwrite the default values.
title (Optional[str]) – The title of the DICOM series. If unspecified, this will be randomly generated for you. This title should NOT include an extension. For example “encord_dicom_series”.
- Returns
True if the upload was successful.
- Raises
UploadOperationNotSupportedError – If trying to upload to external datasets (e.g. S3/GPC/Azure)
- upload_image(file_path, title=None, cloud_upload_settings=CloudUploadSettings(max_retries=None, backoff_factor=None, allow_failures=False))[source]#
Upload a single image to Encord storage. If your images are sequential we recommend creating an image group via the
Dataset.create_image_group()
function. For more information, please compare https://docs.encord.com/docs/annotate-images and https://docs.encord.com/docs/annotate-videos
- Parameters
file_path (Union[Path, str]) – The file path to the image
title (Optional[str]) – The image title. If unspecified, this will be the file name. This title should include an extension. For example “encord_image.png”.
cloud_upload_settings (CloudUploadSettings) – Settings for uploading data into the cloud. Change this object to overwrite the default values.
- Return type
- delete_image_group(data_hash)[source]#
Delete an image group in Encord storage.
- Parameters
data_hash (str) – the hash of the image group to delete
- delete_data(data_hashes)[source]#
Delete a video/image group from a dataset.
- Parameters
data_hashes (List[str]) – list of hashes of the videos/image groups you’d like to delete; all should belong to the same dataset
- add_private_data_to_dataset(integration_id, private_files, ignore_errors=False)[source]#
Append data hosted on a private cloud to an existing dataset.
For a more complete example of safe uploads, please follow the guide found in our docs under https://python.docs.encord.com/tutorials/datasets.html#adding-data-from-a-private-cloud
- Parameters
integration_id (str) – The EntityId of the cloud integration you wish to use.
private_files (Union[str, Dict, Path, TextIO]) – A str path or Path object to a JSON file, a JSON str, or a Python dictionary of the files you wish to add
ignore_errors (bool) – When set to True, this will prevent individual errors from stopping the upload process.
- Return type
- Returns
add_private_data_response – List of DatasetDataInfo objects containing data_hash and title
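The private_files argument can be passed as a dictionary or as serialised JSON. The schema sketched below (an "objectUrl" per file, grouped by data type) is an assumption based on the linked tutorial; consult it for the authoritative format:

```python
import json

# Hypothetical private-cloud file specification; the keys and structure
# are an assumption, see the tutorial linked above for the exact schema.
private_files = {
    "videos": [{"objectUrl": "s3://my-bucket/videos/video1.mp4"}],
    "images": [{"objectUrl": "s3://my-bucket/images/img1.png"}],
}

# Either the dictionary itself or its JSON serialisation can be passed.
payload = json.dumps(private_files)

# dataset.add_private_data_to_dataset(integration_id, payload)  # needs a live Dataset
```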
- add_private_data_to_dataset_start(integration_id, private_files, ignore_errors=False)[source]#
Append data hosted on a private cloud to an existing dataset.
This method initialises the upload in Encord’s backend. Once the upload id has been returned, you can exit the terminal while the job continues uninterrupted.
You can check upload job status at any point using the
add_private_data_to_dataset_get_result()
method. This can be done in a separate Python session from the one where the upload was initialised.
- Parameters
integration_id (str) – The EntityId of the cloud integration you wish to use.
private_files (Union[str, Dict, Path, TextIO]) – A str path or Path object to a JSON file, a JSON str, or a Python dictionary of the files you wish to add
ignore_errors (bool) – When set to True, this will prevent individual errors from stopping the upload process.
- Return type
str
- Returns
upload_job_id (str) – UUID identifier of the upload job. This id enables the user to track job progress via the SDK or the web app.
- add_private_data_to_dataset_get_result(upload_job_id, timeout_seconds=604800)[source]#
Fetch data upload status, perform long polling process for timeout_seconds.
- Parameters
upload_job_id (str) – UUID identifier of the upload job. This id enables the user to track job progress via the SDK or the web app.
timeout_seconds (int) – Number of seconds the method will wait for a response. If timeout_seconds == 0, only a single status check is performed and the response is returned immediately.
- Return type
- Returns
DatasetDataLongPolling – Response containing details about job status, errors and progress.
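The start/get_result pair supports a fire-and-poll workflow; a sketch, where the attribute name on the returned DatasetDataLongPolling object is an assumption:

```python
dataset = ...  # an instantiated encord.dataset.Dataset object

upload_job_id = dataset.add_private_data_to_dataset_start(
    integration_id, private_files, ignore_errors=True
)

# Later, possibly in a different Python session, check on the job.
# timeout_seconds=0 performs a single status check and returns immediately.
result = dataset.add_private_data_to_dataset_get_result(
    upload_job_id, timeout_seconds=0
)
print(result.status)  # attribute name is an assumption
```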
- update_data_item(data_hash, new_title)[source]#
DEPRECATED: Use the individual setter properties of the respective
encord.orm.dataset.DataRow
instance instead. These can be retrieved via the
Dataset.data_rows()
function.
Update a data item.
- Parameters
data_hash (str) – Data hash of the item being updated
new_title (str) – String containing the new title of the data item being updated
- Return type
bool
- Returns
A boolean indicating whether the update was successful
- re_encode_data(data_hashes)[source]#
Launches an async task that can re-encode a list of videos.
- Parameters
data_hashes (List[str]) – list of hashes of the videos you’d like to re-encode; all should belong to the same dataset
- Returns
EntityId (integer) of the async task launched.
- re_encode_data_status(job_id)[source]#
Returns the status of an existing async task which is aimed at re-encoding videos.
- Parameters
job_id (int) – id of the async task that was launched to re-encode the videos
- Returns
Object containing the status of the task, along with info about the newly encoded videos if the task has completed
- Return type
- run_ocr(image_group_id)[source]#
Returns an optical character recognition result for a given image group.
- Parameters
image_group_id (str) – the id of the image group in this dataset to run OCR on
- Return type
List[ImageGroupOCR]
- Returns
A list of ImageGroupOCR objects representing the text and corresponding coordinates found in each frame of the image group
- get_cloud_integrations()[source]#
- Return type
List[CloudIntegration]
Ontology#
- class encord.ontology.Ontology(querier, config, instance)[source]#
Access ontology related data and manipulate the ontology. Instantiate this class via
encord.user_client.EncordUserClient.get_ontology()
- property ontology_hash: str#
Get the ontology hash (i.e. the Ontology ID).
- Return type
str
- property title: str#
Get the title of the ontology.
- Return type
str
- property description: str#
Get the description of the ontology.
- Return type
str
- property created_at: datetime.datetime#
Get the time the ontology was created at.
- Return type
datetime
- property last_edited_at: datetime.datetime#
Get the time the ontology was last edited at.
- Return type
datetime
- property structure: encord.objects.ontology_structure.OntologyStructure#
Get the structure of the ontology.
- Return type
Ontology Structure#
- class encord.objects.ontology_labels_impl.OntologyStructure(objects=<factory>, classifications=<factory>)[source]#
- objects: List[encord.objects.ontology_object.Object]#
- classifications: List[encord.objects.classification.Classification]#
- get_child_by_hash(feature_node_hash, type_=None)[source]#
Returns the first child node of this ontology tree node with the matching feature node hash. If there is more than one child with the same feature node hash in the ontology tree node, then the ontology would be in an invalid state. Throws if nothing is found or if the type is not matched.
- Parameters
feature_node_hash (str) – the feature_node_hash of the child node to search for in the ontology.
type_ – The expected type of the item. If the found child does not match the type, an error will be thrown.
- Return type
~OntologyElementT
- get_child_by_title(title, type_=None)[source]#
Returns a child node of this ontology tree node with the matching title and matching type if specified. If more than one child of this node has the same title, an error will be thrown. If no item is found, an error will be thrown as well.
- Parameters
title (str) – The exact title of the child node to search for in the ontology.
type_ – The expected type of the child node. Only a node that matches this type will be returned.
- Return type
~OntologyElementT
- get_children_by_title(title, type_=None)[source]#
Returns all the child nodes of this ontology tree node with the matching title and matching type if specified. Titles in ontologies do not need to be unique; however, we recommend unique titles when creating ontologies.
- Parameters
title (str) – The exact title of the child node to search for in the ontology.
type_ – The expected type of the item. Only nodes that match this type will be returned.
- Return type
List[~OntologyElementT]
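A sketch of looking up ontology nodes by title; the Object import path follows the encord.objects.ontology_object.Object reference used earlier in this document:

```python
from encord.objects.ontology_object import Object

structure = ...  # an OntologyStructure, e.g. ontology.structure

# Throws if no "Nose" node exists, or if the match is not an Object.
nose = structure.get_child_by_title("Nose", type_=Object)
print(nose.feature_node_hash)

# Returns every match (possibly empty) instead of throwing on duplicates.
all_noses = structure.get_children_by_title("Nose", type_=Object)
```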
- classmethod from_dict(d)[source]#
- Parameters
d (Dict[str, Any]) – a JSON blob of an “ontology structure” (e.g. from the Encord web app)
- Raises
KeyError – If the dict is missing a required field.
- Return type
- to_dict()[source]#
- Return type
Dict[str, List[Dict[str, Any]]]
- Returns
The dict equivalent to the ontology.
- Raises
KeyError – If the dict is missing a required field.
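from_dict() and to_dict() allow round-tripping between the Encord dict format and the typed structure (sketch, assuming an instantiated Project):

```python
from encord.objects.ontology_structure import OntologyStructure

project = ...  # an instantiated encord.project.Project object

# Parse the project's raw ontology dict into a typed structure...
structure = OntologyStructure.from_dict(project.ontology)

# ...and serialise it back into the Encord dict format.
ontology_dict = structure.to_dict()
```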
- add_object(name, shape, uid=None, color=None, feature_node_hash=None)[source]#
Adds an object class definition to the structure.
structure = ontology_structure.OntologyStructure()

eye = structure.add_object(
    name="Eye",
)
nose = structure.add_object(
    name="Nose",
)
nose_detail = nose.add_attribute(
    encord.objects.common.ChecklistAttribute,
)
nose_detail.add_option(feature_node_hash="2bc17c88", label="Is it a cute nose?")
nose_detail.add_option(feature_node_hash="86eaa4f2", label="Is it a wet nose?")
- Parameters
name (str) – the user-visible name of the object
shape (Shape) – the kind of object (bounding box, polygon, etc). See the encord.objects.common.Shape enum for possible values
uid (Optional[int]) – integer identifier of the object. Normally auto-generated; omit this unless the aim is to create an exact clone of an existing structure
color (Optional[str]) – the color of the object in the label editor. Normally auto-assigned; should be in ‘#1A2B3F’ syntax.
feature_node_hash (Optional[str]) – global identifier of the object. Normally auto-generated; omit this unless the aim is to create an exact clone of an existing structure
- Return type
- Returns
the created object class that can be further customised with attributes.
- add_classification(uid=None, feature_node_hash=None)[source]#
Adds a classification definition to the ontology.

structure = ontology_structure.OntologyStructure()

cls = structure.add_classification(feature_node_hash="a39d81c0")
cat_standing = cls.add_attribute(
    encord.objects.common.RadioAttribute,
    feature_node_hash="a6136d14",
    name="Is the cat standing?",
    required=True,
)
cat_standing.add_option(feature_node_hash="a3aeb48d", label="Yes")
cat_standing.add_option(feature_node_hash="d0a4b373", label="No")
- Parameters
uid (Optional[int]) – integer identifier of the classification. Normally auto-generated; omit this unless the aim is to create an exact clone of an existing structure
feature_node_hash (Optional[str]) – global identifier of the classification. Normally auto-generated; omit this unless the aim is to create an exact clone of an existing structure
- Return type
- Returns
the created classification node. Note that the classification attribute must be further specified by calling its add_attribute() method.
- class encord.objects.ontology_labels_impl.Object(feature_node_hash, uid, name, color, shape, attributes=<factory>)[source]#
- uid: int#
- name: str#
- color: str#
- shape: encord.objects.common.Shape#
- feature_node_hash: str#
- attributes: List[encord.objects.attributes.Attribute]#
- property title: str#
- Return type
str
- property children: Sequence[encord.objects.ontology_element.OntologyElement]#
- Return type
Sequence[OntologyElement]
- create_instance()[source]#
Create an
encord.objects.ObjectInstance
to be used with a label row.
- Return type
- T#
alias of TypeVar(‘T’, bound=
encord.objects.attributes.Attribute
)
- add_attribute(cls, name, local_uid=None, feature_node_hash=None, required=False, dynamic=False)[source]#
Adds an attribute to the object.
- Parameters
cls (Type[T]) – attribute type, one of RadioAttribute, ChecklistAttribute, TextAttribute
name (str) – the user-visible name of the attribute
local_uid (Optional[int]) – integer identifier of the attribute. Normally auto-generated; omit this unless the aim is to create an exact clone of existing ontology
feature_node_hash (Optional[str]) – global identifier of the attribute. Normally auto-generated; omit this unless the aim is to create an exact clone of existing ontology
required (bool) – whether the label editor would mark this attribute as ‘required’
dynamic (bool) – whether the attribute can have a different answer for the same object across different frames.
- Return type
T
- Returns
the created attribute that can be further specified with Options, where appropriate
- Raises
ValueError – if specified local_uid or feature_node_hash violate uniqueness constraints
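Building on the add_object() example above, a sketch of attaching a dynamic radio attribute; the option labels are illustrative:

```python
import encord.objects.common as common

structure = ...  # an OntologyStructure

car = structure.add_object(name="Car", shape=common.Shape.BOUNDING_BOX)

# A per-frame (dynamic) radio attribute on the Car object.
colour = car.add_attribute(
    common.RadioAttribute,
    name="Colour",
    required=True,
    dynamic=True,
)
colour.add_option(label="Red")
colour.add_option(label="Blue")
```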
- class encord.objects.ontology_labels_impl.Classification(feature_node_hash, uid, attributes)[source]#
Represents a whole-image classification as part of Ontology structure. Wraps a single Attribute that describes the image in general rather than an individual object.
- uid: int#
- feature_node_hash: str#
- attributes: List[encord.objects.attributes.Attribute]#
- property title: str#
- Return type
str
- property children: Sequence[encord.objects.ontology_element.OntologyElement]#
- Return type
Sequence[OntologyElement]
- create_instance()[source]#
Create an
encord.objects.ClassificationInstance
to be used with a label row.
- Return type
- T#
alias of TypeVar(‘T’, bound=
encord.objects.attributes.Attribute
)
- add_attribute(cls, name, local_uid=None, feature_node_hash=None, required=False)[source]#
Adds an attribute to the classification.
- Parameters
cls (Type[T]) – attribute type, one of RadioAttribute, ChecklistAttribute, TextAttribute
name (str) – the user-visible name of the attribute
local_uid (Optional[int]) – integer identifier of the attribute. Normally auto-generated; omit this unless the aim is to create an exact clone of existing ontology
feature_node_hash (Optional[str]) – global identifier of the attribute. Normally auto-generated; omit this unless the aim is to create an exact clone of existing ontology
required (bool) – whether the label editor would mark this attribute as ‘required’
- Return type
T
- Returns
the created attribute that can be further specified with Options, where appropriate
- Raises
ValueError – if the classification already has an attribute assigned
Encord Objects#
- class encord.objects.common.PropertyType(value)[source]#
An enumeration.
- RADIO = 'radio'#
- TEXT = 'text'#
- CHECKLIST = 'checklist'#
- class encord.objects.common.Shape(value)[source]#
An enumeration.
- BOUNDING_BOX = 'bounding_box'#
- POLYGON = 'polygon'#
- POINT = 'point'#
- SKELETON = 'skeleton'#
- POLYLINE = 'polyline'#
- ROTATABLE_BOUNDING_BOX = 'rotatable_bounding_box'#
- BITMASK = 'bitmask'#
- class encord.objects.common.DeidentifyRedactTextMode(value)[source]#
An enumeration.
- REDACT_ALL_TEXT = 'REDACT_ALL_TEXT'#
- REDACT_NO_TEXT = 'REDACT_NO_TEXT'#
- REDACT_SENSITIVE_TEXT = 'REDACT_SENSITIVE_TEXT'#
- class encord.objects.common.SaveDeidentifiedDicomConditionType(value)[source]#
An enumeration.
- NOT_SUBSTR = 'NOT_SUBSTR'#
- IN = 'IN'#
- class encord.objects.common.SaveDeidentifiedDicomConditionIn(value, dicom_tag, condition_type=SaveDeidentifiedDicomConditionType.IN)[source]#
- value: List[str]#
- dicom_tag: str#
- condition_type: encord.objects.common.SaveDeidentifiedDicomConditionType = 'IN'#
- class encord.objects.common.SaveDeidentifiedDicomConditionNotSubstr(value, dicom_tag, condition_type=SaveDeidentifiedDicomConditionType.NOT_SUBSTR)[source]#
- value: str#
- dicom_tag: str#
- condition_type: encord.objects.common.SaveDeidentifiedDicomConditionType = 'NOT_SUBSTR'#
- encord.objects.utils.short_uuid_str()[source]#
Generates a short string that is used as a condensed uuid.
- Return type
str
Label Row V2#
- class encord.objects.LabelRowV2(label_row_metadata, project_client, ontology)[source]#
This class represents a single label row. It corresponds to exactly one data row within a project and holds all the labels for that data row.
You can access many metadata fields with this class directly. If you want to read or write labels, you will need to call
initialise_labels()
first. To upload your added labels, call
save()
.
- property label_hash: Optional[str]#
- Return type
Optional[str]
- property data_hash: str#
- Return type
str
- property dataset_hash: str#
- Return type
str
- property dataset_title: str#
- Return type
str
- property data_title: str#
- Return type
str
- property data_type: encord.constants.enums.DataType#
- Return type
- property label_status: encord.orm.label_row.LabelStatus#
Returns the current labeling status for the label row.
Note: This method is not supported for workflow-based projects. Please see our workflow documentation for more details.
- Return type
- property annotation_task_status: encord.orm.label_row.AnnotationTaskStatus#
Returns the current annotation task status for the label row.
Note: This method is not supported for workflow-based projects. Please see our workflow documentation for more details.
- Return type
- property workflow_graph_node: Optional[encord.orm.label_row.WorkflowGraphNode]#
- Return type
Optional[WorkflowGraphNode]
- property is_shadow_data: bool#
- Return type
bool
- property created_at: Optional[datetime.datetime]#
The creation date of the label row. None if the label row was not yet created.
- Return type
Optional[datetime]
- property last_edited_at: Optional[datetime.datetime]#
The time the label row was updated last as a whole. None if the label row was not yet created.
- Return type
Optional[datetime]
- property number_of_frames: int#
- Return type
int
- property duration: Optional[float]#
Only a value for Video data types.
- Return type
Optional[float]
- property fps: Optional[float]#
Only a value for Video data types.
- Return type
Optional[float]
- property data_link: Optional[str]#
The data link in either your cloud storage or the encord storage to the underlying object. This will be None for DICOM series or image groups that have been created without performance optimisations, as there is no single underlying file for these data types.
- Return type
Optional[str]
- property width: Optional[int]#
This is None for image groups without performance optimisation, as there is no single underlying width for this data type.
- Return type
Optional[int]
- property height: Optional[int]#
This is None for image groups without performance optimisation, as there is no single underlying height for this data type.
- Return type
Optional[int]
- property priority: Optional[float]#
Get workflow priority for the task associated with the data unit.
This property only works for workflow-based projects.
It is None for label rows in “complete” state.
- Return type
Optional[float]
- property ontology_structure: encord.objects.ontology_structure.OntologyStructure#
Get the corresponding ontology structure
- Return type
- property is_labelling_initialised: bool#
Whether you can start labelling or not. If this is False, call the member
initialise_labels()
to read or write specific ObjectInstances or ClassificationInstances.
- Return type
bool
- initialise_labels(include_object_feature_hashes=None, include_classification_feature_hashes=None, include_reviews=False, overwrite=False, bundle=None)[source]#
Call this function to download or export labels stored on the Encord server, as well as to perform any other reading or writing operations. If you only want to inspect a subset of labels, you can filter them. Please note that if you filter the labels, and upload them later, you will effectively delete all the labels that were previously filtered.
If the label was not yet in progress, this will set the label status to LabelStatus.LABEL_IN_PROGRESS.
You can call this function at any point to overwrite the current labels stored in this class with the most up to date labels stored in the Encord servers. This would only matter if you manipulate the labels while someone else is working on the labels as well. You would need to supply the overwrite parameter to True
- Parameters
include_object_feature_hashes (
Optional
[Set
[str
]]) – If None all the objects will be included. Otherwise, only objects labels will be included of which the feature_hash has been added. WARNING: it is only recommended to use this filter if you are reading (not writing) the labels. If you are requesting a subset of objects and later, save the label, you will effectively delete all the object instances that are stored in the Encord platform, which were not included in this filtered subset.include_classification_feature_hashes (
Optional
[Set
[str
]]) – If None, all classifications will be included. Otherwise, only classification labels whose feature_hash has been added will be included. WARNING: this filter is only recommended when reading (not writing) labels. If you request a subset of classifications and later save the label, you will effectively delete all classification instances stored on the Encord platform that were not included in this filtered subset.include_reviews (
bool
) – Whether to request read only information about the reviews of the label row.overwrite (
bool
) – If the label row was already initialised, you need to set this flag to True to overwrite the current labels with the labels stored in the Encord server. If this is False and the label row was already initialised, this function will throw an error.bundle (
Optional
[Bundle
]) – If not passed, initialisation is performed independently. If passed, it will be delayed and initialised along with other objects in the same bundle.
- Return type
None
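The re-initialisation contract described above (a second call fails unless overwrite=True) can be illustrated with a minimal stand-in class. This is not the real SDK, just a sketch of the documented behaviour with hypothetical names:

```python
# Minimal stand-in mirroring the initialise_labels overwrite contract
# described above. NOT the Encord SDK -- the class and error are invented.
class FakeLabelRow:
    def __init__(self):
        self._initialised = False
        self.labels = None

    def initialise_labels(self, overwrite=False):
        # Re-initialising without overwrite=True is an error, as documented.
        if self._initialised and not overwrite:
            raise RuntimeError("Label row already initialised; pass overwrite=True")
        self.labels = {"objects": [], "classifications": []}  # pretend server fetch
        self._initialised = True

row = FakeLabelRow()
row.initialise_labels()                 # first call: fine
try:
    row.initialise_labels()             # second call without overwrite: raises
except RuntimeError:
    pass
row.initialise_labels(overwrite=True)   # explicit overwrite: fine
```

The real method additionally accepts the filter and bundle arguments documented above.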
- from_labels_dict(label_row_dict)[source]#
If you have a label row dictionary in the same format that the Encord servers produce, you can initialise the LabelRow from that directly. In most cases you should prefer using the initialise_labels method.
This function also initialises the label row.
Calling this function will reset all the labels that are currently stored within this class.
- Parameters
label_row_dict (
dict
) – The dictionary of all labels as expected by the Encord format.- Return type
None
- get_image_hash(frame_number)[source]#
Get the corresponding image hash of the frame number. Return None if the frame number is out of bounds. Raise an error if this function is used for non-image data types.
- Return type
Optional
[str
]
- get_frame_number(image_hash)[source]#
Get the corresponding frame number for the image hash. Return None if the image hash was not found with an associated frame number. Raise an error if this function is used for non-image data types.
- Return type
Optional
[int
]
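For image groups, get_image_hash and get_frame_number are inverse lookups. A dict-based sketch of that mapping, using placeholder hashes rather than real Encord identifiers:

```python
# Sketch of the frame-number <-> image-hash lookups described above,
# using invented placeholder hashes (not real Encord identifiers).
image_hashes = ["hash-a", "hash-b", "hash-c"]  # frame order within an image group
frame_to_image_hash = dict(enumerate(image_hashes))
image_hash_to_frame = {h: f for f, h in frame_to_image_hash.items()}

def get_image_hash(frame_number):
    # Returns None when the frame number is out of bounds, as documented.
    return frame_to_image_hash.get(frame_number)

def get_frame_number(image_hash):
    # Returns None when the hash has no associated frame, as documented.
    return image_hash_to_frame.get(image_hash)

assert get_image_hash(1) == "hash-b"
assert get_frame_number("hash-c") == 2
assert get_image_hash(99) is None
```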
- save(bundle=None)[source]#
Upload the created labels to the Encord server. This will overwrite any labels that someone else has created on the platform in the meantime.
- Parameters
bundle (
Optional
[Bundle
]) – if not passed, save is executed immediately. If passed, it is executed as a part of the bundle- Return type
None
- property metadata: Optional[encord.objects.metadata.DICOMSeriesMetadata]#
Metadata for the given data type. Currently only supported for DICOM, and will return None for other formats.
Label row needs to be initialised before using this property
- Return type
Optional
[DICOMSeriesMetadata
]
- get_frame_view(frame=0)[source]#
- Parameters
frame (Union[int, str]) – Either the frame number or the image hash if the data type is an image or image group. Defaults to the first frame.
- Return type
- get_frame_views()[source]#
- Return type
List[FrameView]
- Returns
A list of frame views in order of available frames.
- get_object_instances(filter_ontology_object=None, filter_frames=None)[source]#
- Parameters
- Return type
List
[ObjectInstance
]- Returns
All the `ObjectInstance`s that match the filter.
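The two filters above combine with AND semantics: an instance must match the ontology filter and appear on at least one of the requested frames. A dict-based stand-in (feature_hash values are invented placeholders, not real ontology hashes):

```python
# Stand-in for the object-instance filtering described above.
# "feature_hash" values are placeholders, not real ontology hashes.
instances = [
    {"feature_hash": "car", "frames": {0, 1, 2}},
    {"feature_hash": "person", "frames": {1}},
    {"feature_hash": "car", "frames": {5}},
]

def get_object_instances(filter_feature_hash=None, filter_frames=None):
    result = []
    for inst in instances:
        if filter_feature_hash is not None and inst["feature_hash"] != filter_feature_hash:
            continue  # ontology filter: wrong object type
        if filter_frames is not None and not inst["frames"] & set(filter_frames):
            continue  # frame filter: no overlap with requested frames
        result.append(inst)
    return result

assert len(get_object_instances()) == 3
assert len(get_object_instances(filter_feature_hash="car")) == 2
assert len(get_object_instances(filter_frames=[1])) == 2
```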
- add_object_instance(object_instance, force=True)[source]#
Add an object instance to the label row. If the object instance already exists and force is True (the default), the existing instance is overwritten.
- Parameters
object_instance (
ObjectInstance
) – The object instance to add.force (
bool
) – If True, overwrite an existing object instance with the same hash; if False, raise an error if the instance already exists.
- Return type
None
- add_classification_instance(classification_instance, force=False)[source]#
Add a classification instance to the label row.
- Parameters
classification_instance (
ClassificationInstance
) – The classification instance to add.force (
bool
) – If True, overwrite an existing classification instance with the same hash; if False, raise an error if the instance already exists.
- Return type
None
- remove_classification(classification_instance)[source]#
Remove a classification instance from a label row.
- add_to_single_frame_to_hashes_map(label_item, frame)[source]#
This is an internal function, it is not meant to be called by the SDK user.
- Return type
None
- get_classification_instances(filter_ontology_classification=None, filter_frames=None)[source]#
- Parameters
filter_ontology_classification (
Optional
[Classification
]) – Optionally filter by a specific ontology classification.filter_frames (
Union
[int
,List
[int
],Range
,List
[Range
],None
]) – Optionally filter by specific frames.
- Return type
List
[ClassificationInstance
]- Returns
All the `ClassificationInstance`s that match the filter.
- to_encord_dict()[source]#
This is an internal helper function. Likely this should not be used by a user. To upload labels use the
save()
function.- Return type
Dict
[str
,Any
]
- workflow_reopen()[source]#
A label row is returned to the first annotation stage for re-labeling. No data will be lost during this call.
This method is only relevant for the projects that use the Workflow feature, and will raise an error for pre-workflow projects.
- Return type
None
- workflow_complete()[source]#
A label row is moved to the final workflow node, marking it as ‘Complete’.
This method can be called only for labels for which
initialise_labels()
was called at least once, and consequently the “label_hash” field is not None. Please note that labels need not be initialised every time the workflow_complete() method is called.
This method is only relevant for the projects that use the Workflow feature, and will raise an error for projects that don’t use Workflows.
- Return type
None
- set_priority(priority)[source]#
Set the priority for the task in a workflow project.
- Parameters
priority (
float
) – A float value from 0.0 to 1.0, where 1.0 is the highest priority.
- Return type
None
- class FrameView(label_row, label_row_read_only_data, frame)[source]#
This class can be used to inspect what object/classification instances are on a given frame, or what metadata, such as an image's file size, is on a given frame.
- property image_hash: str#
- Return type
str
- property image_title: str#
- Return type
str
- property file_type: str#
- Return type
str
- property frame: int#
- Return type
int
- property width: int#
- Return type
int
- property height: int#
- Return type
int
- property data_link: Optional[str]#
- Return type
Optional
[str
]
- property metadata: Optional[encord.objects.metadata.DICOMSliceMetadata]#
Annotation metadata. Particular format depends on the data type. Currently only supported for DICOM, and will return None for other formats.
- Return type
Optional
[DICOMSliceMetadata
]
- add_object_instance(object_instance, coordinates, *, overwrite=False, created_at=None, created_by=None, last_edited_at=None, last_edited_by=None, confidence=None, manual_annotation=None)[source]#
- Return type
None
- add_classification_instance(classification_instance, *, overwrite=False, created_at=None, created_by=None, confidence=1.0, manual_annotation=True, last_edited_at=None, last_edited_by=None)[source]#
- Return type
None
- get_object_instances(filter_ontology_object=None)[source]#
- Parameters
filter_ontology_object (
Optional
[Object
]) – Optionally filter by a specific ontology object.- Return type
List
[ObjectInstance
]- Returns
All the `ObjectInstance`s that match the filter.
- get_classification_instances(filter_ontology_classification=None)[source]#
- Parameters
filter_ontology_classification (
Optional
[Classification
]) – Optionally filter by a specific ontology object.- Return type
List
[ClassificationInstance
]- Returns
All the `ClassificationInstance`s that match the filter.
- class FrameLevelImageGroupData(image_hash, image_title, file_type, frame_number, width, height, data_link=None)[source]#
This is an internal helper class. A user should not directly interact with it.
- image_hash: str#
- image_title: str#
- file_type: str#
- frame_number: int#
- width: int#
- height: int#
- data_link: Optional[str] = None#
- class LabelRowReadOnlyData(label_hash, created_at, last_edited_at, data_hash, data_type, label_status, annotation_task_status, workflow_graph_node, is_shadow_data, number_of_frames, duration, fps, dataset_hash, dataset_title, data_title, width, height, data_link, priority, frame_level_data=<factory>, image_hash_to_frame=<factory>, frame_to_image_hash=<factory>)[source]#
This is an internal helper class. A user should not directly interact with it.
- label_hash: Optional[str]#
This is None if the label row does not have any labels and was not initialised for labelling.
- created_at: Optional[datetime.datetime]#
This is None if the label row does not have any labels and was not initialised for labelling.
- last_edited_at: Optional[datetime.datetime]#
This is None if the label row does not have any labels and was not initialised for labelling.
- data_hash: str#
- data_type: encord.constants.enums.DataType#
- label_status: encord.orm.label_row.LabelStatus#
- annotation_task_status: Optional[encord.orm.label_row.AnnotationTaskStatus]#
- workflow_graph_node: Optional[encord.orm.label_row.WorkflowGraphNode]#
- is_shadow_data: bool#
- number_of_frames: int#
- duration: Optional[float]#
- fps: Optional[float]#
- dataset_hash: str#
- dataset_title: str#
- data_title: str#
- width: Optional[int]#
- height: Optional[int]#
- data_link: Optional[str]#
- priority: Optional[float]#
- frame_level_data: Dict[int, encord.objects.ontology_labels_impl.LabelRowV2.FrameLevelImageGroupData]#
- image_hash_to_frame: Dict[str, int]#
- frame_to_image_hash: Dict[int, str]#
Label ObjectInstance#
- class encord.objects.ObjectInstance(ontology_object, *, object_hash=None)[source]#
An object instance is an object that has coordinates and can be placed on one or multiple frames in a label row.
- is_assigned_to_label_row()[source]#
- Return type
Optional
[LabelRowV2
]
- property object_hash: str#
A unique identifier for the object instance.
- Return type
str
- property ontology_item: encord.objects.ontology_object.Object#
- Return type
- property feature_hash: str#
Feature node hash from the project ontology
- Return type
str
- property object_name: str#
Object name from the project ontology
- Return type
str
- get_answer(attribute, filter_answer=None, filter_frame=None, is_dynamic=None)[source]#
Get the answer set for a given ontology Attribute. Returns None if the attribute is not yet answered.
For the ChecklistAttribute, it returns None if and only if the attribute is nested and the parent is unselected. Otherwise, if not yet answered it will return an empty list.
- Parameters
attribute (
Attribute
) – The ontology attribute to get the answer for.filter_answer (
Union
[str
,Option
,Iterable
[Option
],None
]) – A filter for a specific answer value. Only applies to dynamic attributes.filter_frame (
Optional
[int
]) – A filter for a specific frame. Only applies to dynamic attributes.is_dynamic (
Optional
[bool
]) – Optionally specify whether a dynamic answer is expected or not. This will throw if it is set incorrectly according to the attribute. Set this to narrow down the return type.
- Return type
Union
[str
,Option
,Iterable
[Option
],List
[AnswerForFrames
],None
]- Returns
If the attribute is static, then the answer value is returned, assuming an answer value has already been set. If the attribute is dynamic, the AnswersForFrames object is returned.
- set_answer(answer, attribute=None, frames=None, overwrite=False)[source]#
Set the answer for a given ontology Attribute. This is the equivalent of e.g. selecting a checkbox in the UI after drawing the ObjectInstance. There is only one answer per ObjectInstance per Attribute, unless the attribute is dynamic (check the args list for more instructions on how to set dynamic answers).
- Parameters
answer (
Union
[str
,Option
,Sequence
[Option
]]) – The answer to set.attribute (
Optional
[Attribute
]) – The ontology attribute to set the answer for. If not set, an attempt is made to infer it. For answers toencord.objects.common.RadioAttribute
orencord.objects.common.ChecklistAttribute
, this can be inferred automatically. Forencord.objects.common.TextAttribute
, this will only be inferred if there is only one possible TextAttribute to set for the entire object instance. Otherwise, aencord.exceptions.LabelRowError
will be thrown.frames (
Union
[int
,List
[int
],Range
,List
[Range
],None
]) – Only relevant for dynamic attributes. The frames to set the answer for. If None, the answer is set for all frames that this object currently has set coordinates for (also overwriting current answers). This will not automatically propagate the answer to new frames that are added in the future. If this is anything but None for non-dynamic attributes, this will throw a ValueError.overwrite (
bool
) – If True, the answer will be overwritten if it already exists. If False, this will throw a LabelRowError if the answer already exists. This argument is ignored for dynamic attributes.
- Return type
None
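The one-answer-per-attribute rule and the overwrite flag described above can be sketched with a small stand-in class. This is not the real SDK; the class is invented and LabelRowError is simulated:

```python
# Stand-in illustrating the one-answer-per-attribute rule and the overwrite
# flag described above. NOT the Encord SDK; LabelRowError is simulated here.
class LabelRowError(Exception):
    pass

class FakeInstance:
    def __init__(self):
        self._answers = {}

    def set_answer(self, answer, attribute, overwrite=False):
        # A second set_answer for the same attribute raises unless overwrite=True.
        if attribute in self._answers and not overwrite:
            raise LabelRowError(f"Answer for {attribute!r} already set")
        self._answers[attribute] = answer

    def get_answer(self, attribute):
        # None when the attribute is not yet answered, as documented.
        return self._answers.get(attribute)

inst = FakeInstance()
assert inst.get_answer("colour") is None
inst.set_answer("red", "colour")
try:
    inst.set_answer("blue", "colour")            # raises without overwrite=True
except LabelRowError:
    pass
inst.set_answer("blue", "colour", overwrite=True)
assert inst.get_answer("colour") == "blue"
```

Dynamic attributes relax this rule to one answer per attribute per frame, as the frames argument above explains.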
- set_answer_from_list(answers_list)[source]#
This is a low level helper function and should usually not be used directly.
Sets the answer for the classification from a dictionary.
- Parameters
answers_list (
List
[Dict
[str
,Any
]]) – The list of dictionaries to set the answer from.- Return type
None
- delete_answer(attribute, filter_answer=None, filter_frame=None)[source]#
This resets the answer of an attribute as if it was never set.
- Parameters
attribute (
Attribute
) – The attribute to delete the answer for.filter_answer (
Union
[str
,Option
,Iterable
[Option
],None
]) – A filter for a specific answer value. Delete only answers with the provided value. Only applies to dynamic attributes.filter_frame (
Optional
[int
]) – A filter for a specific frame. Only applies to dynamic attributes.
- Return type
None
- set_for_frames(coordinates, frames=0, *, overwrite=False, created_at=None, created_by=None, last_edited_at=None, last_edited_by=None, confidence=None, manual_annotation=None, reviews=None, is_deleted=None)[source]#
Places the object onto the specified frame. If the object already exists on the frame and overwrite is set to True, the currently specified values will be overwritten.
- Parameters
coordinates (
Union
[BoundingBoxCoordinates
,RotatableBoundingBoxCoordinates
,PointCoordinate
,PolygonCoordinates
,PolylineCoordinates
,SkeletonCoordinates
,BitmaskCoordinates
]) – The coordinates of the object in the frame. This will throw an error if the type of the coordinates does not match the type of the attribute in the object instance.frames (
Union
[int
,List
[int
],Range
,List
[Range
]]) – The frames to add the object instance to. Defaults to the first frame for convenience.overwrite (
bool
) – If True, overwrite existing data for the given frames. This will not reset all the non-specified values. If False and data already exists for the given frames, raises an error.created_at (
Optional
[datetime
]) – Optionally specify the creation time of the object instance on this frame. Defaults to datetime.now().created_by (
Optional
[str
]) – Optionally specify the creator of the object instance on this frame. Defaults to the current SDK user.last_edited_at (
Optional
[datetime
]) – Optionally specify the last edit time of the object instance on this frame. Defaults to datetime.now().last_edited_by (
Optional
[str
]) – Optionally specify the last editor of the object instance on this frame. Defaults to the current SDK user.confidence (
Optional
[float
]) – Optionally specify the confidence of the object instance on this frame. Defaults to 1.0.manual_annotation (
Optional
[bool
]) – Optionally specify whether the object instance on this frame was manually annotated. Defaults to True.reviews (
Optional
[List
[dict
]]) – Should only be set by internal functions.is_deleted (
Optional
[bool
]) – Should only be set by internal functions.
- Return type
None
- get_annotation(frame=0)[source]#
Get the annotation for the object instance on the specified frame.
- Parameters
frame (Union[int, str]) – Either the frame number or the image hash if the data type is an image or image group. Defaults to the first frame.
- Return type
- copy()[source]#
Creates an exact copy of this ObjectInstance, but with a new object hash and without being associated with any LabelRowV2. This is useful if you want to add the semantically same ObjectInstance to multiple `LabelRowV2`s.
- Return type
- get_annotations()[source]#
Get all annotations for the object instance on all frames it has been placed to.
- Return type
List[Annotation]
- Returns
A list of ObjectInstance.Annotation in order of available frames.
- is_valid()[source]#
Check whether the object instance is valid. This may raise an error with human-readable messages describing the problem.
- Return type
None
- are_dynamic_answers_valid()[source]#
Check whether any dynamic answers are set on frames that have no coordinates.
- Return type
None
- class Annotation(object_instance, frame)[source]#
This class can be used to set or get data for a specific annotation (i.e. the ObjectInstance for a given frame number).
- property frame: int#
- Return type
int
- property coordinates: Union[encord.objects.coordinates.BoundingBoxCoordinates, encord.objects.coordinates.RotatableBoundingBoxCoordinates, encord.objects.coordinates.PointCoordinate, encord.objects.coordinates.PolygonCoordinates, encord.objects.coordinates.PolylineCoordinates, encord.objects.coordinates.SkeletonCoordinates, encord.objects.bitmask.BitmaskCoordinates]#
- Return type
Union
[BoundingBoxCoordinates
,RotatableBoundingBoxCoordinates
,PointCoordinate
,PolygonCoordinates
,PolylineCoordinates
,SkeletonCoordinates
,BitmaskCoordinates
]
- property created_at: datetime.datetime#
- Return type
datetime
- property created_by: Optional[str]#
- Return type
Optional
[str
]
- property last_edited_at: datetime.datetime#
- Return type
datetime
- property last_edited_by: Optional[str]#
- Return type
Optional
[str
]
- property confidence: float#
- Return type
float
- property manual_annotation: bool#
- Return type
bool
- property reviews: Optional[List[Dict[str, Any]]]#
A read only property about the reviews that happened for this object on this frame.
- Return type
Optional
[List
[Dict
[str
,Any
]]]
- property is_deleted: Optional[bool]#
This property is only relevant for internal use.
- Return type
Optional
[bool
]
- class FrameInfo(created_at=<factory>, created_by=None, last_edited_at=<factory>, last_edited_by=None, confidence=1.0, manual_annotation=True, reviews=None, is_deleted=None)[source]#
- created_at: datetime.datetime#
- created_by: Optional[str] = None#
None defaults to the user of the SDK once uploaded to the server.
- last_edited_at: datetime.datetime#
- last_edited_by: Optional[str] = None#
None defaults to the user of the SDK once uploaded to the server.
- confidence: float = 1.0#
- manual_annotation: bool = True#
- reviews: Optional[List[dict]] = None#
- is_deleted: Optional[bool] = None#
- class FrameData(coordinates, object_frame_instance_info)[source]#
- coordinates: Union[encord.objects.coordinates.BoundingBoxCoordinates, encord.objects.coordinates.RotatableBoundingBoxCoordinates, encord.objects.coordinates.PointCoordinate, encord.objects.coordinates.PolygonCoordinates, encord.objects.coordinates.PolylineCoordinates, encord.objects.coordinates.SkeletonCoordinates, encord.objects.bitmask.BitmaskCoordinates]#
- object_frame_instance_info: encord.objects.ontology_object_instance.ObjectInstance.FrameInfo#
Label ClassificationInstance#
- class encord.objects.ClassificationInstance(ontology_classification, *, classification_hash=None)[source]#
- property classification_hash: str#
A unique identifier for the classification instance.
- Return type
str
- property ontology_item: encord.objects.classification.Classification#
- Return type
- property classification_name: str#
Classification name from the project ontology
- Return type
str
- property feature_hash: str#
Feature node hash from the project ontology
- Return type
str
- set_for_frames(frames=0, *, overwrite=False, created_at=None, created_by=None, confidence=1.0, manual_annotation=True, last_edited_at=None, last_edited_by=None, reviews=None)[source]#
Places the classification onto the specified frame. If the classification already exists on the frame and overwrite is set to True, the currently specified values will be overwritten.
- Parameters
frames (
Union
[int
,List
[int
],Range
,List
[Range
]]) – The frames to add the classification instance to. Defaults to the first frame for convenience.overwrite (
bool
) – If True, overwrite existing data for the given frames. This will not reset all the non-specified values. If False and data already exists for the given frames, raises an error.created_at (
Optional
[datetime
]) – Optionally specify the creation time of the classification instance on this frame. Defaults to datetime.now().created_by (
Optional
[str
]) – Optionally specify the creator of the classification instance on this frame. Defaults to the current SDK user.last_edited_at (
Optional
[datetime
]) – Optionally specify the last edit time of the classification instance on this frame. Defaults to datetime.now().last_edited_by (
Optional
[str
]) – Optionally specify the last editor of the classification instance on this frame. Defaults to the current SDK user.confidence (
float
) – Optionally specify the confidence of the classification instance on this frame. Defaults to 1.0.manual_annotation (
bool
) – Optionally specify whether the classification instance on this frame was manually annotated. Defaults to True.reviews (
Optional
[List
[dict
]]) – Should only be set by internal functions.
- Return type
None
- get_annotation(frame=0)[source]#
- Parameters
frame (Union[int, str]) – Either the frame number or the image hash if the data type is an image or image group. Defaults to the first frame.
- Return type
- get_annotations()[source]#
- Return type
List[Annotation]
- Returns
A list of ClassificationInstance.Annotation in order of available frames.
- set_answer(answer, attribute=None, overwrite=False)[source]#
Set the answer for a given ontology Attribute. This is the equivalent of e.g. selecting a checkbox in the UI after adding a ClassificationInstance. There is only one answer per ClassificationInstance per Attribute.
- Parameters
answer (
Union
[str
,Option
,Sequence
[Option
]]) – The answer to set.attribute (
Optional
[Attribute
]) – The ontology attribute to set the answer for. If not set, an attempt is made to infer it. For answers toencord.objects.common.RadioAttribute
orencord.objects.common.ChecklistAttribute
, this can be inferred automatically. Forencord.objects.common.TextAttribute
, this will only be inferred if there is only one possible TextAttribute to set for the entire classification instance. Otherwise, aencord.exceptions.LabelRowError
will be thrown.overwrite (
bool
) – If True, the answer will be overwritten if it already exists. If False, this will throw a LabelRowError if the answer already exists.
- Return type
None
- set_answer_from_list(answers_list)[source]#
This is a low level helper function and should not be used directly.
Sets the answer for the classification from a dictionary.
- Parameters
answers_list (
List
[Dict
[str
,Any
]]) – The list to set the answer from.- Return type
None
- get_answer(attribute=None)[source]#
Get the answer set for a given ontology Attribute. Returns None if the attribute is not yet answered.
For the ChecklistAttribute, it returns None if and only if the attribute is nested and the parent is unselected. Otherwise, if not yet answered it will return an empty list.
- Parameters
attribute (
Optional
[Attribute
]) – The ontology attribute to get the answer for.- Return type
Union
[str
,Option
,Iterable
[Option
],None
]
- delete_answer(attribute=None)[source]#
This resets the answer of an attribute as if it was never set.
- Parameters
attribute (
Optional
[Attribute
]) – The ontology attribute to delete the answer for. If not provided, the first level attribute is used.- Return type
None
- copy()[source]#
Creates an exact copy of this ClassificationInstance, but with a new classification hash and without being associated with any LabelRowV2. This is useful if you want to add the semantically same ClassificationInstance to multiple `LabelRowV2`s.
- Return type
- class Annotation(classification_instance, frame)[source]#
This class can be used to set or get data for a specific annotation (i.e. the ClassificationInstance for a given frame number).
- property frame: int#
- Return type
int
- property created_at: datetime.datetime#
- Return type
datetime
- property created_by: Optional[str]#
- Return type
Optional
[str
]
- property last_edited_at: datetime.datetime#
- Return type
datetime
- property last_edited_by: Optional[str]#
- Return type
Optional
[str
]
- property confidence: float#
- Return type
float
- property manual_annotation: bool#
- Return type
bool
- property reviews: Optional[List[dict]]#
A read only property about the reviews that happened for this object on this frame.
- Return type
Optional
[List
[dict
]]
- class FrameData(created_at=<factory>, created_by=None, confidence=1.0, manual_annotation=True, last_edited_at=<factory>, last_edited_by=None, reviews=None)[source]#
- created_at: datetime.datetime#
- created_by: Optional[str] = None#
- confidence: float = 1.0#
- manual_annotation: bool = True#
- last_edited_at: datetime.datetime#
- last_edited_by: Optional[str] = None#
- reviews: Optional[List[dict]] = None#
- class encord.objects.AnswerForFrames(answer, ranges)[source]#
- answer: Union[str, encord.objects.options.Option, Iterable[encord.objects.options.Option]]#
- ranges: List[encord.objects.frames.Range]#
The ranges are essentially a run length encoding of the frames where the unique answer is set. They are sorted in ascending order.
Label Frame Utilities#
- encord.objects.frames.Frames#
alias of
Union
[int
,List
[int
],encord.objects.frames.Range
,List
[encord.objects.frames.Range
]]
- encord.objects.frames.Ranges#
alias of
List
[encord.objects.frames.Range
]
- encord.objects.frames.frame_to_range(frame)[source]#
Convert a single frame to a Range.
- Parameters
frame (
int
) – The single frame- Return type
- encord.objects.frames.frames_to_ranges(frames)[source]#
Create a sorted list (in ascending order) of run length encoded ranges of the frames. The Ranges will not be overlapping.
- Parameters
frames (
Collection
[int
]) – A collection of integers representing frames- Return type
List
[Range
]
- encord.objects.frames.ranges_to_list(ranges)[source]#
Convert a list of Range to a list of lists (run length encoded) of integers.
- Parameters
ranges (
List
[Range
]) – A list of Range objects- Return type
List
[List
[int
]]
- encord.objects.frames.range_to_ranges(range_)[source]#
Convert a single Range to a list of Ranges.
- Parameters
range – The single Range
- Return type
List
[Range
]
- encord.objects.frames.range_to_frames(range_)[source]#
Convert a single Range (run length encoded) to a list of integers. Useful to flatten out run length encoded values.
- Parameters
range – The single Range
- Return type
List
[int
]
- encord.objects.frames.ranges_to_frames(range_list)[source]#
Convert a list of Ranges (run length encoded) to a list of integers. Useful to flatten out run length encoded values.
- Parameters
range_list (
List
[Range
]) – A list of Ranges- Return type
List
[int
]
Utilities - Client#
- class encord.utilities.client_utilities.APIKeyScopes(value)[source]#
The APIKeyScope is used to provide specific access rights to a project through
EncordUserClient.create_project_api_key()
. The options are as follows:
LABEL_READ: access to Getting label rows
LABEL_WRITE: access to Saving label rows
MODEL_INFERENCE: access to Inference
MODEL_TRAIN: access to Creating a model row and Training
LABEL_LOGS_READ: access to Reviewing label logs
ALGO_LIBRARY: access to algorithms like Object interpolation
- LABEL_READ = 'label.read'#
- LABEL_WRITE = 'label.write'#
- MODEL_INFERENCE = 'model.inference'#
- MODEL_TRAIN = 'model.train'#
- LABEL_LOGS_READ = 'label_logs.read'#
- ALGO_LIBRARY = 'algo.library'#
- class encord.utilities.client_utilities.LocalImport(file_path)[source]#
file_path: Supply the path of the exported folder which contains the images and annotations.xml file. Make sure to select “Save images” when exporting your CVAT Task or Project.
- file_path: str#
- encord.utilities.client_utilities.ImportMethod#
Using images/videos in cloud storage as an alternative import method will be supported in the future.
- class encord.utilities.client_utilities.Issue(issue_type, instances)[source]#
For each issue_type there may be multiple occurrences which are documented in the instances. The instances list can provide additional information on how the issue was encountered. If there is no additional information available, the instances list will be empty.
- issue_type: str#
- instances: List[str]#
- class encord.utilities.client_utilities.Issues(errors, warnings, infos)[source]#
Any issues that came up during importing a project. These usually come from incompatibilities between data saved on different platforms.
- errors: List[encord.utilities.client_utilities.Issue]#
- warnings: List[encord.utilities.client_utilities.Issue]#
- infos: List[encord.utilities.client_utilities.Issue]#
- class encord.utilities.client_utilities.CvatImporterSuccess(project_hash, dataset_hash, issues)[source]#
- project_hash: str#
- dataset_hash: str#
- class encord.utilities.client_utilities.CvatImporterError(dataset_hash, issues)[source]#
- dataset_hash: str#
Utilities - Project User#
Utilities - Label#
- encord.utilities.label_utilities.construct_answer_dictionaries(label_row)[source]#
Adds answer object and classification answer dictionaries to a label row if they do not exist. Integrity checks are conducted upon saving of labels.
- Parameters
label_row – A label row.
- Returns
A label row instance with updated answer dictionaries
- Return type
Utilities - ORM classes#
- class encord.orm.api_key.ApiKeyMeta(dic)[source]#
ApiKeyMeta contains key information.
ORM:
title, resource_type
- DB_FIELDS: collections.OrderedDict = {'resource_type': <class 'str'>, 'title': <class 'str'>}#
- NON_UPDATABLE_FIELDS: set = {'resource_type'}#
- class encord.orm.base_orm.BaseORM(dic)[source]#
Base ORM for all database objects.
- DB_FIELDS: collections.OrderedDict = {}#
- NON_UPDATABLE_FIELDS: set = {}#
- static from_db_row(row, db_field)[source]#
Static method for conveniently converting a db row to a client object.
- to_dic(time_str=True)[source]#
Conveniently convert the client object to a dict. Only the dict items are considered; no other object attributes are included.
- Parameters
time_str (
bool
) – If True, datetime fields are converted to strings with the format %Y-%m-%d %H:%M:%S. If False, the original datetime type is kept. Defaults to True.
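The time_str behaviour can be sketched in a few lines. This is an illustrative stand-in, not the BaseORM implementation itself:

```python
# Sketch of the time_str behaviour of BaseORM.to_dic described above:
# datetime fields become "%Y-%m-%d %H:%M:%S" strings when time_str=True,
# and stay datetime objects otherwise. Illustrative stand-in only.
from datetime import datetime

def to_dic(fields, time_str=True):
    out = {}
    for key, value in fields.items():
        if time_str and isinstance(value, datetime):
            value = value.strftime("%Y-%m-%d %H:%M:%S")
        out[key] = value
    return out

row = {"title": "demo", "created_at": datetime(2023, 5, 1, 12, 30, 0)}
assert to_dic(row)["created_at"] == "2023-05-01 12:30:00"
assert isinstance(to_dic(row, time_str=False)["created_at"], datetime)
```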
- class encord.orm.base_orm.BaseListORM(iter_)[source]#
A wrapper for a list of objects of a specific ORM.
- BASE_ORM_TYPE#
alias of
encord.orm.base_orm.BaseORM
- class encord.orm.dataset.DatasetUser(user_email, user_role, dataset_hash)[source]#
- user_email: str#
- user_role: encord.orm.dataset.DatasetUserRole#
- dataset_hash: str#
- class encord.orm.dataset.ImageData(image_hash, title, file_link, file_type, file_size, storage_location, created_at, last_edited_at, width, signed_url, height)[source]#
Information about individual images within a single
DataRow
of typeDataType.IMG_GROUP
. Get this information via theDataRow.images
property.- property image_hash: str#
- Return type
str
- property title: str#
- Return type
str
- property file_link: str#
- Return type
str
- property file_type: str#
The MIME type of the file.
- Return type
str
- property file_size: int#
The size of the file in bytes.
- Return type
int
- property storage_location: encord.orm.dataset.StorageLocation#
- Return type
StorageLocation
- property created_at: datetime.datetime#
- Return type
datetime
- property last_edited_at: datetime.datetime#
- Return type
datetime
- property height: int#
- Return type
int
- property width: int#
- Return type
int
- property signed_url: Optional[str]#
The signed URL if one was generated when this class was created.
- Return type
Optional[str]
- class encord.orm.dataset.DataRow(uid, title, data_type, created_at, last_edited_at, width, height, file_link, file_size, file_type, storage_location, client_metadata, frames_per_second, duration, images_data, signed_url, is_optimised_image_group, backing_item_uuid)[source]#
Each individual DataRow is one upload of a video, image group, single image, or DICOM series.
This class has dict-style accessors for backwards compatibility. Clients using this class for the first time are encouraged to use the property accessors and setters instead of the underlying dictionary. Mixing the dict-style member functions with the property accessors and setters is discouraged.
WARNING: Do NOT use the .data member of this class. Using it can corrupt the internal data structure.
- property uid: str#
The unique identifier for this data row. Note that the setter does not update the data on the server.
- Return type
str
- property title: str#
The data title.
The setter updates the title. This queues a request for the backend, which is executed on a call of DataRow.upload().
- Return type
str
- property data_type: encord.constants.enums.DataType#
- Return type
DataType
- property created_at: datetime.datetime#
- Return type
datetime
- property frames_per_second: Optional[int]#
If the data type is DataType.VIDEO, this returns the number of frames per second of the video. Otherwise, it returns None, as a frames_per_second field is not applicable.
- Return type
Optional[int]
- property duration: Optional[int]#
If the data type is DataType.VIDEO, this returns the duration of the video. Otherwise, it returns None, as a duration field is not applicable.
- Return type
Optional[int]
- property client_metadata: Optional[dict]#
The currently cached client metadata. To cache the client metadata, use the refetch_data() function.
The setter updates the custom client metadata. This queues a request for the backend, which is executed on a call of DataRow.upload().
- Return type
Optional[dict]
- property width: Optional[int]#
The actual width of the data asset. This is None for data of type DataType.IMG_GROUP where is_image_sequence is False, because each image in the group can have different dimensions. Inspect the images to get the width of individual images.
- Return type
Optional[int]
- property height: Optional[int]#
The actual height of the data asset. This is None for data of type DataType.IMG_GROUP where is_image_sequence is False, because each image in the group can have different dimensions. Inspect the images to get the height of individual images.
- Return type
Optional[int]
- property last_edited_at: datetime.datetime#
- Return type
datetime
- property file_link: Optional[str]#
A permanent file link for the given data asset. When stored in StorageLocation.CORD_STORAGE, this is the internal file path. For private bucket storage locations, this is the full path to the file. If the data type is DataType.DICOM, this returns None, as no single file is associated with the series.
- Return type
Optional[str]
- property signed_url: Optional[str]#
The cached signed URL of the given data asset. To cache the signed URL, use the refetch_data() function.
- Return type
Optional[str]
- property file_size: int#
The file size of the given data asset in bytes.
- Return type
int
- property file_type: str#
The MIME type of the given data asset, as a string.
- Return type
str
- property storage_location: encord.orm.dataset.StorageLocation#
- Return type
StorageLocation
- property images_data: Optional[List[encord.orm.dataset.ImageData]]#
A list of the cached ImageData objects for the given data asset. Fetch the images with appropriate settings via the refetch_data() function. If the data type is not DataType.IMG_GROUP, this returns None.
- Return type
Optional[List[ImageData]]
- property is_optimised_image_group: Optional[bool]#
If the data type is an
DataType.IMG_GROUP
, returns whether this is a performance optimised image group. Returns None for other data types.DEPRECATED: This method is deprecated and will be removed in the upcoming library version. Please use
is_image_sequence()
instead- Return type
Optional
[bool
]
- property is_image_sequence: Optional[bool]#
If the data type is DataType.IMG_GROUP, returns whether this is an image sequence. Returns None for other data types.
For more details, refer to the documentation on image sequences.
- Return type
Optional[bool]
- property backing_item_uuid: uuid.UUID#
- Return type
UUID
- refetch_data(*, signed_url=False, images_data_fetch_options=None, client_metadata=False)[source]#
Fetches the most up-to-date data. Any parameter that is falsy leaves the corresponding cached value unchanged.
- Parameters
signed_url (bool) – If True, fetch a freshly generated signed URL for the data asset.
images_data_fetch_options (Optional[ImagesDataFetchOptions]) – If not None, fetch the image data of the data asset. You can additionally specify what to fetch with the ImagesDataFetchOptions class.
client_metadata (bool) – If True, fetch the client metadata of the data asset.
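The caching behaviour above — falsy parameters leave cached values untouched — can be illustrated with a minimal stand-in. The class below is illustrative only, not the real DataRow, and the stubbed fetch values are made up:

```python
from typing import Optional


class CachedRow:
    """Toy stand-in mimicking DataRow's refetch_data caching semantics."""

    def __init__(self) -> None:
        self._signed_url: Optional[str] = None
        self._client_metadata: Optional[dict] = None

    @property
    def signed_url(self) -> Optional[str]:
        return self._signed_url  # cached value; None until refetched

    def refetch_data(self, *, signed_url: bool = False, client_metadata: bool = False) -> None:
        # Falsy flags leave the corresponding cached values unchanged.
        if signed_url:
            self._signed_url = "https://example.com/signed"  # stub fetch
        if client_metadata:
            self._client_metadata = {"source": "stub"}  # stub fetch


row = CachedRow()
assert row.signed_url is None           # nothing cached yet
row.refetch_data(client_metadata=True)  # signed_url flag is falsy: still None
assert row.signed_url is None
row.refetch_data(signed_url=True)       # now the URL is fetched and cached
print(row.signed_url)
```

With the real DataRow, each refetch_data call goes to the backend, so fetching only what you need keeps the call cheap.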
- class encord.orm.dataset.DataRows(data_rows)[source]#
A helper class that forms requests for filtered dataset rows. Not intended to be used directly.
- class encord.orm.dataset.DatasetInfo(dataset_hash, user_hash, title, description, type, created_at, last_edited_at, backing_folder_uuid=None)[source]#
This class represents a dataset in the context of listing datasets.
- dataset_hash: str#
- user_hash: str#
- title: str#
- description: str#
- type: int#
- created_at: datetime.datetime#
- last_edited_at: datetime.datetime#
- backing_folder_uuid: Optional[uuid.UUID] = None#
- class encord.orm.dataset.Dataset(title, storage_location, data_rows, dataset_hash, description=None, backing_folder_uuid=None)[source]#
- property dataset_hash: str#
- Return type
str
- property title: str#
- Return type
str
- property description: str#
- Return type
str
- property storage_location: encord.orm.dataset.StorageLocation#
- Return type
StorageLocation
- property data_rows: List[encord.orm.dataset.DataRow]#
- Return type
List[DataRow]
- property backing_folder_uuid: Optional[uuid.UUID]#
- Return type
Optional[UUID]
- class encord.orm.dataset.DatasetDataInfo(data_hash, title, backing_item_uuid)[source]#
- data_hash: str#
- title: str#
- backing_item_uuid: Optional[uuid.UUID]#
- class encord.orm.dataset.AddPrivateDataResponse(dataset_data_list)[source]#
Response of add_private_data_to_dataset
- dataset_data_list: List[encord.orm.dataset.DatasetDataInfo]#
- class encord.orm.dataset.DatasetAPIKey(dataset_hash, api_key, title, key_hash, scopes)[source]#
- dataset_hash: str#
- api_key: str#
- title: str#
- key_hash: str#
- scopes: List[encord.orm.dataset.DatasetScope]#
- class encord.orm.dataset.CreateDatasetResponse(title, storage_location, dataset_hash, user_hash, backing_folder_uuid)[source]#
- property title: str#
- Return type
str
- property storage_location: encord.orm.dataset.StorageLocation#
- Return type
StorageLocation
- property dataset_hash: str#
- Return type
str
- property user_hash: str#
- Return type
str
- property backing_folder_uuid: Optional[uuid.UUID]#
- Return type
Optional[UUID]
- class encord.orm.dataset.StorageLocation(value)[source]#
An enumeration.
- CORD_STORAGE = 0#
- AWS = 1#
- GCP = 2#
- AZURE = 3#
- OTC = 4#
- NEW_STORAGE = -99#
This is a placeholder for a new storage location that is not yet supported by your SDK version. Please update your SDK to the latest version.
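The NEW_STORAGE placeholder follows a common forward-compatibility pattern: unknown integer values map to a sentinel member instead of raising. A simplified re-creation of the documented values illustrates the idea (this sketch is not the real encord.orm.dataset.StorageLocation):

```python
from enum import IntEnum


class StorageLocationSketch(IntEnum):
    # Values as documented above.
    CORD_STORAGE = 0
    AWS = 1
    GCP = 2
    AZURE = 3
    OTC = 4
    NEW_STORAGE = -99  # sentinel for values this SDK version does not know

    @classmethod
    def _missing_(cls, value):
        # Any unrecognised storage location degrades to the placeholder
        # rather than raising ValueError.
        return cls.NEW_STORAGE


print(StorageLocationSketch(2).name)   # GCP
print(StorageLocationSketch(42).name)  # NEW_STORAGE
```

This way, older SDK versions keep working when the platform introduces a new storage backend; they simply report NEW_STORAGE until upgraded.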
- encord.orm.dataset.DatasetType#
For backwards compatibility
- class encord.orm.dataset.DatasetScope(value)[source]#
An enumeration.
- READ = 'dataset.read'#
- WRITE = 'dataset.write'#
- class encord.orm.dataset.DatasetData(dic)[source]#
Video base ORM.
- DB_FIELDS: collections.OrderedDict = {'data_hash': <class 'str'>, 'images': <class 'list'>, 'video': <class 'dict'>}#
- class encord.orm.dataset.SignedVideoURL(dic)[source]#
A signed URL object with supporting information.
- DB_FIELDS: collections.OrderedDict = {'data_hash': <class 'str'>, 'file_link': <class 'str'>, 'signed_url': <class 'str'>, 'title': <class 'str'>}#
- class encord.orm.dataset.SignedImageURL(dic)[source]#
A signed URL object with supporting information.
- DB_FIELDS: collections.OrderedDict = {'data_hash': <class 'str'>, 'file_link': <class 'str'>, 'signed_url': <class 'str'>, 'title': <class 'str'>}#
- class encord.orm.dataset.SignedImagesURL(iter_)[source]#
A signed URL object with supporting information.
- BASE_ORM_TYPE#
alias of
encord.orm.dataset.SignedImageURL
- class encord.orm.dataset.SignedDicomURL(dic)[source]#
A signed URL object with supporting information.
- DB_FIELDS: collections.OrderedDict = {'data_hash': <class 'str'>, 'file_link': <class 'str'>, 'signed_url': <class 'str'>, 'title': <class 'str'>}#
- class encord.orm.dataset.SignedDicomsURL(iter_)[source]#
A signed URL object with supporting information.
- BASE_ORM_TYPE#
alias of
encord.orm.dataset.SignedDicomURL
- class encord.orm.dataset.Video(dic)[source]#
A video object with supporting information.
- DB_FIELDS: collections.OrderedDict = {'backing_item_uuid': <class 'uuid.UUID'>, 'data_hash': <class 'str'>, 'file_link': <class 'str'>, 'title': <class 'str'>}#
- NON_UPDATABLE_FIELDS: set = {'backing_item_uuid', 'data_hash'}#
- class encord.orm.dataset.ImageGroup(dic)[source]#
An image group object with supporting information.
- DB_FIELDS: collections.OrderedDict = {'data_hash': <class 'str'>, 'file_link': <class 'str'>, 'title': <class 'str'>}#
- NON_UPDATABLE_FIELDS: set = {'data_hash'}#
- class encord.orm.dataset.Image(dic)[source]#
An image object with supporting information.
- DB_FIELDS: collections.OrderedDict = {'data_hash': <class 'str'>, 'file_link': <class 'str'>, 'image_hash': <class 'str'>, 'title': <class 'str'>}#
- NON_UPDATABLE_FIELDS: set = {'data_hash'}#
- class encord.orm.dataset.Images(success)[source]#
Uploading multiple images in a batch mode.
- success: bool#
- class encord.orm.dataset.DicomDeidentifyTask(dicom_urls, integration_hash)[source]#
- dicom_urls: List[str]#
- integration_hash: str#
- class encord.orm.dataset.ReEncodeVideoTaskResult(data_hash, signed_url, bucket_path)[source]#
- data_hash: str#
- signed_url: Optional[str]#
- bucket_path: str#
- class encord.orm.dataset.ReEncodeVideoTask(status, result=None)[source]#
A re-encode video task with supporting information.
- status: str#
- result: Optional[List[encord.orm.dataset.ReEncodeVideoTaskResult]] = None#
- class encord.orm.dataset.DatasetAccessSettings(fetch_client_metadata)[source]#
Settings for using the dataset object.
- fetch_client_metadata: bool#
Whether client metadata should be retrieved for each data_row.
- class encord.orm.dataset.ImagesDataFetchOptions(fetch_signed_urls=False)[source]#
- fetch_signed_urls: bool = False#
Whether to fetch signed urls for each individual image. Only set this to true if you need to download the images.
- class encord.orm.dataset.LongPollingStatus(value)[source]#
An enumeration.
- PENDING = 'PENDING'#
The job will start automatically soon (it is waiting in the queue) or has already started processing.
- DONE = 'DONE'#
The job has finished successfully (possibly with errors if ignore_errors=True). If ignore_errors=False was specified in encord.dataset.Dataset.add_private_data_to_dataset_start(), the job only has the status DONE if there were no errors. If ignore_errors=True was specified, the job always shows the status DONE once complete and never shows the ERROR status, although some errors may have been ignored. The number of errors and the stringified exceptions are available in the units_error_count: int and errors: List[str] attributes.
- ERROR = 'ERROR'#
The job has completed with errors. This can only happen if ignore_errors was set to False. Information about the errors is available in the units_error_count: int and errors: List[str] attributes.
- class encord.orm.dataset.DatasetDataLongPolling(status, data_hashes_with_titles, errors, units_pending_count, units_done_count, units_error_count)[source]#
Response of the upload job’s long polling request.
Note: An upload job consists of job units, where a job unit can be a video, an image group, a DICOM series, or a single image.
- status: encord.orm.dataset.LongPollingStatus#
Status of the upload job, documented in detail in LongPollingStatus.
- data_hashes_with_titles: List[encord.orm.dataset.DatasetDataInfo]#
Information about data which was added to the dataset.
- errors: List[str]#
Stringified list of exceptions.
- units_pending_count: int#
Number of upload job units that have pending status.
- units_done_count: int#
Number of upload job units that have done status.
- units_error_count: int#
Number of upload job units that have error status.
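The PENDING/DONE/ERROR protocol suggests a simple polling loop. The sketch below is generic: fetch_status is a hypothetical callable standing in for whatever mechanism retrieves the job status, and the poll interval is arbitrary:

```python
import time
from typing import Callable


def wait_for_upload(fetch_status: Callable[[], str],
                    poll_interval: float = 1.0,
                    max_polls: int = 100) -> str:
    """Poll until the upload job leaves the PENDING state."""
    for _ in range(max_polls):
        status = fetch_status()
        if status in ("DONE", "ERROR"):
            return status  # terminal state reached
        time.sleep(poll_interval)  # still PENDING: wait and retry
    raise TimeoutError("upload job did not finish in time")


# Simulated run: two PENDING responses, then DONE.
responses = iter(["PENDING", "PENDING", "DONE"])
print(wait_for_upload(lambda: next(responses), poll_interval=0.0))  # DONE
```

On a DONE result with ignore_errors=True, remember to inspect units_error_count and errors, since individual units may still have failed.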
- class encord.orm.dataset_with_user_role.DatasetWithUserRole(user_role, dataset)[source]#
A helper class denoting the relationship between the current user and a dataset.
- user_role: int#
- dataset: dict#
- class encord.orm.label_log.LabelLog(log_hash, user_hash, user_email, annotation_hash, identifier, data_hash, feature_hash, action, label_name, time_taken, created_at, frame)[source]#
- log_hash: str#
- user_hash: str#
- user_email: str#
- annotation_hash: str#
- identifier: str#
- data_hash: str#
- feature_hash: str#
- action: encord.orm.label_log.Action#
- label_name: str#
- time_taken: int#
- created_at: datetime.datetime#
- frame: int#
- class encord.orm.label_log.Action(value)[source]#
An enumeration.
- ADD = 0#
- EDIT = 1#
- DELETE = 2#
- START = 3#
- END = 4#
- MARK_AS_NOT_LABELLED = 5#
- MARK_AS_IN_PROGRESS = 6#
- MARK_AS_LABELLED = 7#
- MARK_AS_REVIEW_REQUIRED = 8#
- MARK_AS_REVIEWED = 9#
- MARK_AS_REVIEWED_TWICE = 10#
- SUBMIT_TASK = 11#
- APPROVE_LABEL = 12#
- REJECT_LABEL = 13#
- CLICK_SAVE = 14#
- CLICK_UNDO = 15#
- CLICK_REDO = 16#
- CLICK_BULK = 17#
- CLICK_ZOOM = 19#
- CLICK_BRIGHTNESS = 20#
- CLICK_HOTKEYS = 21#
- CLICK_SETTINGS = 22#
- ADD_ATTRIBUTE = 23#
- EDIT_ATTRIBUTE = 24#
- DELETE_ATTRIBUTE = 25#
- APPROVE_NESTED_ATTRIBUTE = 26#
- REJECT_NESTED_ATTRIBUTE = 27#
- SUBMIT_LABEL = 28#
- SUBMIT_NESTED_ATTRIBUTE = 29#
- BUFFERING_OVERLAY_SHOWN = 30#
- BITRATE_WARNING_SHOWN = 31#
- SEEKING_OVERLAY_SHOWN = 32#
- class encord.orm.label_log.LabelLogParams(**data)[source]#
- user_hash: Optional[str]#
- data_hash: Optional[str]#
- start_timestamp: Optional[int]#
- end_timestamp: Optional[int]#
- user_email: Optional[str]#
- include_user_email_and_interface_key: bool#
- class encord.orm.label_row.LabelRow(dic)[source]#
A label row contains a data unit or a collection of data units and associated labels, and is specific to a data asset with type video or img_group:
A label row with a data asset of type video contains a single data unit.
A label row with a data asset of type img_group contains any number of data units.
The label row ORM is as follows:
label_hash (uid) is the unique identifier of the label row
dataset_hash (uid) is the unique identifier of the dataset which contains the particular video or image group
dataset_title is the title of the dataset which contains the particular video or image group
data_title is the title of the video or image group
data_type is either video or img_group, depending on the data type
data_units is a dictionary with (key: data hash, value: data unit) pairs
object_answers is a dictionary with (key: object hash, value: object answer) pairs
classification_answers is a dictionary with (key: classification hash, value: classification answer) pairs
object_actions is a dictionary with (key: <object_hash>, value: object action) pairs
label_status is a string indicating the label status. It can take the values enumerated in encord.orm.label_row.LabelStatus. Note that this does not reflect the status shown in the Projects → Labels section of the web app.
A data unit, mentioned for the dictionary entry data_units above, has the form:

label_row = {
    # The label row
    # ...
    "data_units": {
        "<data_hash>": {
            "data_hash": "<data_hash>",  # A data_hash (uid) string
            "data_title": "A data title",
            "data_link": "<data_link>",  # Signed URL expiring after 7 days
            "data_type": "<data_type>",  # (video/mp4, image/jpeg, etc.)
            "data_fps": 24.95,  # For video, the frame rate
            "data_sequence": "0",  # Defines the order of the data units
            "width": 640,  # The width of the content
            "height": 610,  # The height of the content
            "labels": {
                # ...
            }
        },
        # ...,
    }
}
Objects and classifications
A data unit can have any number of vector labels (e.g., bounding boxes, polygons, polylines, keypoints) and classifications. Each frame-level object and classification has unique identifiers ‘objectHash’ and ‘classificationHash’. Each frame-level entity has a unique feature identifier ‘featureHash’, defined in the editor ontology.
The object and classification answers are contained separately from the individual data units to preserve space for video, sequential images, DICOM, etc.
The object and classification answer dictionaries contain classification ‘answers’ (i.e. attributes that describe the object or classification). This avoids storing the information at every frame in the label blurb, which is of particular importance for videos.
A labels dictionary for video is in the form:
label_row["data_units"]["<data_hash>"]["labels"] = {
    "<frame_number>": {
        "objects": [
            # { object 1 },
            # { object 2 },
            # ...
        ],
        "classifications": [
            # { classification 1 },
            # { classification 2 },
            # ...
        ],
    }
}
A labels dictionary for an img_group data unit is in the form:
label_row["data_units"]["<data_hash>"]["labels"] = {
    "objects": [
        # { object 1 },
        # { object 2 },
        # ...
    ],
    "classifications": [
        # { classification 1 },
        # { classification 2 },
        # ...
    ],
}
The object answers dictionary is in the form:
label_row["object_answers"] = {
    "<object_hash>": {
        "objectHash": "<object_hash>",
        "classifications": [
            # {answer 1},
            # {answer 2},
            # ...
        ]
    },
    # ...
}
The classification answers dictionary is in the form:
label_row["classification_answers"] = {
    "<classification_hash>": {
        "classificationHash": "<classification_hash>",
        "classifications": [
            # {answer 1},
            # {answer 2},
            # ...
        ],
    },
    # ...
}
The object actions dictionary is in the form:
label_row["object_actions"] = {
    "<object_hash>": {
        "objectHash": "<object_hash>",
        "actions": [
            # {answer 1},
            # {answer 2},
            # ...
        ],
    },
    # ...
}
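The nested schema above can be traversed directly. A small sketch that counts objects and classifications per frame of a video-style label row — the synthetic label_row and the data hash "abc123" below are illustrative and simply follow the documented shape:

```python
def count_labels_per_frame(label_row: dict, data_hash: str) -> dict:
    """Map frame number -> (object count, classification count) for a video data unit."""
    labels = label_row["data_units"][data_hash]["labels"]
    return {
        frame: (len(content["objects"]), len(content["classifications"]))
        for frame, content in labels.items()
    }


# Synthetic row following the documented video layout.
label_row = {
    "data_units": {
        "abc123": {
            "data_hash": "abc123",
            "labels": {
                "0": {"objects": [{}, {}], "classifications": [{}]},
                "1": {"objects": [{}], "classifications": []},
            },
        }
    }
}
print(count_labels_per_frame(label_row, "abc123"))  # {'0': (2, 1), '1': (1, 0)}
```

For img_group label rows the "labels" dictionary has no frame-number level, so the same traversal would read labels["objects"] and labels["classifications"] directly.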
- DB_FIELDS: collections.OrderedDict = {'annotation_task_status': <class 'str'>, 'classification_answers': <class 'dict'>, 'created_at': <class 'str'>, 'data_hash': <class 'str'>, 'data_title': <class 'str'>, 'data_type': <class 'str'>, 'data_units': <class 'dict'>, 'dataset_hash': <class 'str'>, 'dataset_title': <class 'str'>, 'label_hash': <class 'str'>, 'label_status': <class 'str'>, 'last_edited_at': <class 'str'>, 'object_actions': <class 'dict'>, 'object_answers': <class 'dict'>}#
- NON_UPDATABLE_FIELDS: set = {'label_hash'}#
- class encord.orm.label_row.AnnotationTaskStatus(value)[source]#
An enumeration.
- QUEUED = 'QUEUED'#
- ASSIGNED = 'ASSIGNED'#
- IN_REVIEW = 'IN_REVIEW'#
- RETURNED = 'RETURNED'#
- COMPLETED = 'COMPLETED'#
- class encord.orm.label_row.ShadowDataState(value)[source]#
Specifies the kind of data to fetch when working with a BenchmarkQa project
- ALL_DATA = 'ALL_DATA'#
Fetch all the label rows
- SHADOW_DATA = 'SHADOW_DATA'#
Only fetch the label rows that were submitted against “shadow data”: the annotator’s view of the benchmark
- NOT_SHADOW_DATA = 'NOT_SHADOW_DATA'#
Only fetch the label rows for “production” data
- class encord.orm.label_row.LabelStatus(value)[source]#
An enumeration.
- NOT_LABELLED = 'NOT_LABELLED'#
- LABEL_IN_PROGRESS = 'LABEL_IN_PROGRESS'#
- LABELLED = 'LABELLED'#
- REVIEW_IN_PROGRESS = 'REVIEW_IN_PROGRESS'#
- REVIEWED = 'REVIEWED'#
- REVIEWED_TWICE = 'REVIEWED_TWICE'#
- MISSING_LABEL_STATUS = '_MISSING_LABEL_STATUS_'#
This value will be displayed if the Encord platform has a new label status and your SDK version does not understand it yet. Please update your SDK to the latest version.
- class encord.orm.label_row.WorkflowGraphNode(uuid, title)[source]#
- uuid: str#
- title: str#
- classmethod from_optional_dict(json_dict)[source]#
- Return type
Optional[WorkflowGraphNode]
- class encord.orm.label_row.LabelRowMetadata(label_hash, created_at, last_edited_at, data_hash, dataset_hash, dataset_title, data_title, data_type, data_link, label_status, annotation_task_status, workflow_graph_node, is_shadow_data, number_of_frames, duration, frames_per_second, height, width, priority=None)[source]#
Contains helpful information about a label row.
- label_hash: Optional[str]#
Only present if the label row is initiated
- created_at: Optional[datetime.datetime]#
Only present if the label row is initiated
- last_edited_at: Optional[datetime.datetime]#
Only present if the label row is initiated
- data_hash: str#
- dataset_hash: str#
- dataset_title: str#
- data_title: str#
- data_type: str#
- data_link: Optional[str]#
Can be None for label rows of image groups or DICOM series.
- label_status: encord.orm.label_row.LabelStatus#
Can be None for TMS2 projects
- annotation_task_status: Optional[encord.orm.label_row.AnnotationTaskStatus]#
Only available for TMS2 projects
- workflow_graph_node: Optional[encord.orm.label_row.WorkflowGraphNode]#
- is_shadow_data: bool#
- number_of_frames: int#
- duration: Optional[float]#
Only available for the VIDEO data_type
- frames_per_second: Optional[int]#
Only available for the VIDEO data_type
- height: Optional[int]#
- width: Optional[int]#
- priority: Optional[float] = None#
Only available for tasks that are not complete
- classmethod from_list(json_list)[source]#
- Return type
List[LabelRowMetadata]
- class encord.orm.labeling_algorithm.LabelingAlgorithm(dic)[source]#
Labeling algorithm base ORM.
ORM:
algorithm_name, algorithm_params
- DB_FIELDS: collections.OrderedDict = {'algorithm_name': <class 'str'>, 'algorithm_parameters': <class 'dict'>}#
- class encord.orm.labeling_algorithm.ObjectInterpolationParams(dic)[source]#
Labeling algorithm parameters for interpolation algorithm
ORM:
key_frames, objects_to_interpolate
- DB_FIELDS: collections.OrderedDict = {'key_frames': <class 'dict'>, 'objects_to_interpolate': <class 'list'>}#
- class encord.orm.labeling_algorithm.BoundingBoxFittingParams(dic)[source]#
Labeling algorithm parameters for bounding box fitting algorithm
ORM:
labels, video
- DB_FIELDS: collections.OrderedDict = {'labels': <class 'dict'>, 'video': <class 'dict'>}#
- class encord.orm.model.ModelOperations(value)[source]#
An enumeration.
- INFERENCE = 0#
- TRAIN = 1#
- CREATE = 2#
- class encord.orm.model.Model(dic)[source]#
Model base ORM.
ORM:
model_operation, model_parameters,
- DB_FIELDS: collections.OrderedDict = {'model_operation': <class 'int'>, 'model_parameters': <class 'dict'>}#
- class encord.orm.model.ModelConfiguration(model_uid, title, description, feature_node_hashes, model, model_iteration_uids)[source]#
- model_uid: str#
- title: str#
- description: str#
- feature_node_hashes: List[str]#
The corresponding feature node hashes of the ontology object
- model: encord.constants.model.AutomationModels#
- model_iteration_uids: List[str]#
All the UIDs of individual model training instances
- class encord.orm.model.ModelTrainingLabelMetadata(label_uid, data_uid, data_link)[source]#
- label_uid: str#
- data_uid: str#
- data_link: str#
- class encord.orm.model.ModelTrainingLabel(label_metadata_list, feature_uids)[source]#
- label_metadata_list: List[encord.orm.model.ModelTrainingLabelMetadata]#
- feature_uids: List[str]#
- class encord.orm.model.TrainingMetadata(model_iteration_uid, created_at=None, training_final_loss=None, model_training_labels=None)[source]#
- model_iteration_uid: str#
- created_at: Optional[datetime.datetime] = None#
- training_final_loss: Optional[float] = None#
- model_training_labels: Optional[encord.orm.model.ModelTrainingLabel] = None#
- static get_model_training_labels(json_dict)[source]#
- Return type
Optional[ModelTrainingLabel]
- class encord.orm.model.ModelRow(dic)[source]#
A model row contains a set of features and a model (resnet18, resnet34, resnet50, resnet101, resnet152, vgg16, vgg19, faster_rcnn, mask_rcnn).
ORM:
model_hash (uid), title, description, features, model,
- DB_FIELDS: collections.OrderedDict = {'description': <class 'str'>, 'features': <class 'list'>, 'model': <class 'str'>, 'model_hash': <class 'str'>, 'title': <class 'str'>}#
- class encord.orm.model.ModelInferenceParams(dic)[source]#
Model inference parameters for running models trained via the platform.
ORM:
local_file_path, conf_thresh, iou_thresh, device, detection_frame_range (optional)
- DB_FIELDS: collections.OrderedDict = {'allocation_enabled': <class 'bool'>, 'conf_thresh': <class 'float'>, 'data_hashes': <class 'list'>, 'detection_frame_range': <class 'list'>, 'device': <class 'str'>, 'files': <class 'list'>, 'iou_thresh': <class 'float'>, 'rdp_thresh': <class 'float'>}#
- class encord.orm.model.ModelTrainingWeights(dic)[source]#
Model training weights.
ORM:
training_config_link, training_weights_link,
- DB_FIELDS: collections.OrderedDict = {'model': <class 'str'>, 'training_config_link': <class 'str'>, 'training_weights_link': <class 'str'>}#
- class encord.orm.model.ModelTrainingParams(dic)[source]#
Model training parameters.
ORM:
model_hash, label_rows, epochs, batch_size, weights, device
- DB_FIELDS: collections.OrderedDict = {'batch_size': <class 'int'>, 'device': <class 'str'>, 'epochs': <class 'int'>, 'label_rows': <class 'list'>, 'model_hash': <class 'str'>, 'weights': <class 'encord.orm.model.ModelTrainingWeights'>}#
- class encord.orm.project.Project(dic)[source]#
DEPRECATED - prefer using the encord.project.Project class instead.
A project defines a label ontology and is a collection of datasets and label rows.
ORM:
title,
description,
editor_ontology,
ontology_hash,
datasets:
[
    {
        dataset_hash (uid),
        title,
        description,
        dataset_type (internal vs. AWS/GCP/Azure),
    },
    ...
],
label_rows:
[
    {
        label_hash (uid),
        data_hash (uid),
        dataset_hash (uid),
        dataset_title,
        data_title,
        data_type,
        label_status
    },
    ...
]
annotation_task_status
- DB_FIELDS: collections.OrderedDict = {'created_at': <class 'datetime.datetime'>, 'datasets': (<class 'list'>, <class 'str'>), 'description': <class 'str'>, 'editor_ontology': (<class 'dict'>, <class 'str'>), 'label_rows': (<class 'list'>, <class 'str'>), 'last_edited_at': <class 'datetime.datetime'>, 'ontology_hash': <class 'str'>, 'project_hash': <class 'str'>, 'source_projects': <class 'list'>, 'title': <class 'str'>, 'workflow_manager_uuid': <class 'str'>}#
- NON_UPDATABLE_FIELDS: set = {'datasets', 'editor_ontology', 'label_rows'}#
- get_labels_list()[source]#
Returns a list of all optional label row IDs (label_hash uid) in a project. If no label_hash is found, a None value is appended. This can be useful, for example, when fetching additional label row data via encord.project.Project.get_label_rows().

project = client_instance.get_project(<project_hash>)
project_orm = project.get_project()
labels_list = project_orm.get_labels_list()
created_labels_list = []
for label in labels_list:
    if label is not None:  # None values will fail the operation
        created_labels_list.append(label)
label_rows = project.get_label_rows(created_labels_list, get_signed_url=False)
- Return type
List[Optional[str]]
- property project_hash: str#
- Return type
str
- property title: str#
- Return type
str
- property description: str#
- Return type
str
- property editor_ontology#
- property datasets#
- property label_rows#
- property ontology_hash: str#
- Return type
str
- property source_projects#
- property created_at: datetime.datetime#
- Return type
datetime
- property last_edited_at: datetime.datetime#
- Return type
datetime
- property workflow_manager_uuid: uuid.UUID#
- Return type
UUID
- class encord.orm.project.ProjectCopyOptions(value)[source]#
An enumeration.
- COLLABORATORS = 'collaborators'#
- DATASETS = 'datasets'#
- MODELS = 'models'#
- LABELS = 'labels'#
- class encord.orm.project.ReviewApprovalState(value)[source]#
An enumeration.
- APPROVED = 'APPROVED'#
- PENDING = 'PENDING'#
- REJECTED = 'REJECTED'#
- DELETED = 'DELETED'#
- NOT_SELECTED_FOR_REVIEW = 'NOT_SELECTED_FOR_REVIEW'#
- class encord.orm.project.CopyDatasetAction(value)[source]#
An enumeration.
- ATTACH = 'ATTACH'#
Attach the datasets associated with the original project to the copied project.
- CLONE = 'CLONE'#
Clone the data units from the associated datasets into a new dataset and attach it to the copied project.
- class encord.orm.project.CopyDatasetOptions(action=CopyDatasetAction.ATTACH, dataset_title=None, dataset_description=None, datasets_to_data_hashes_map=<factory>)[source]#
Options for copying the datasets associated with a project.
- action: encord.orm.project.CopyDatasetAction = 'ATTACH'#
One of CopyDatasetAction.ATTACH or CopyDatasetAction.CLONE. (defaults to ATTACH)
- dataset_title: Optional[str] = None#
- dataset_description: Optional[str] = None#
- class encord.orm.project.CopyLabelsOptions(accepted_label_hashes=None, accepted_label_statuses=None)[source]#
Options for copying the labels associated with a project.
- accepted_label_hashes: Optional[List[str]] = None#
A list of label hashes that will be copied to the new project
- accepted_label_statuses: Optional[List[encord.orm.project.ReviewApprovalState]] = None#
A list of label statuses to filter the labels copied to the new project, defined in ReviewApprovalState
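A plausible reading of these two options is that a label is copied when it passes both filters, with None meaning “no filter”. The sketch below implements that reading as plain data filtering — it is an assumption for illustration, not the SDK's documented internal behaviour:

```python
from typing import List, Optional, Tuple


def filter_labels(labels: List[Tuple[str, str]],
                  accepted_hashes: Optional[List[str]] = None,
                  accepted_statuses: Optional[List[str]] = None) -> List[Tuple[str, str]]:
    """Keep (label_hash, status) pairs that pass both optional allow-lists."""
    result = []
    for label_hash, status in labels:
        if accepted_hashes is not None and label_hash not in accepted_hashes:
            continue  # not in the hash allow-list
        if accepted_statuses is not None and status not in accepted_statuses:
            continue  # not in the status allow-list
        result.append((label_hash, status))
    return result


labels = [("h1", "APPROVED"), ("h2", "REJECTED"), ("h3", "APPROVED")]
print(filter_labels(labels, accepted_statuses=["APPROVED"]))
# [('h1', 'APPROVED'), ('h3', 'APPROVED')]
```

In practice you would pass CopyLabelsOptions with accepted_label_hashes and/or accepted_label_statuses and let the platform apply the filtering server-side.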
- class encord.orm.project.CopyProjectPayload(copy_project_options=<factory>, copy_labels_options=None, project_copy_metadata=None)[source]#
WARNING: Do not use; this is for internal purposes only.
- copy_project_options: List[ProjectCopyOptions]#
- copy_labels_options: Optional[_CopyLabelsOptions] = None#
- project_copy_metadata: Optional[_ProjectCopyMetadata] = None#
- class encord.orm.project.ProjectWorkflowType(value)[source]#
An enumeration.
- MANUAL_QA = 'manual'#
- BENCHMARK_QA = 'benchmark'#
- class encord.orm.project.ManualReviewWorkflowSettings[source]#
Sets the project QA workflow to “manual reviews”. There are currently no available settings for this workflow
- class encord.orm.project.BenchmarkQaWorkflowSettings(source_projects=<factory>)[source]#
Sets the project QA workflow to “Automatic”, with benchmark data being presented to all the annotators
- source_projects: List[str]#
For Benchmark QA projects, a list of project ids (project_hash-es) that contain the benchmark source data
- encord.orm.project.ProjectWorkflowSettings#
A variant type that allows you to select the type of quality assurance workflow for the project, and configure it.
- Currently one of:
ManualReviewWorkflowSettings: a workflow with optional manual reviews
BenchmarkQaWorkflowSettings: annotators are presented with “benchmark” or “honeypot” data
alias of
Union[encord.orm.project.ManualReviewWorkflowSettings, encord.orm.project.BenchmarkQaWorkflowSettings]
- class encord.orm.project.ReviewMode(value)[source]#
- UNLABELLED:
- The labels are added to the images, but someone must still go over
all of the labels before submitting them for review.
- LABELLED:
- The labels are added to the images and marked as labelled. A reviewer
still needs to approve them.
- REVIEWED:
- The labels are added to the images and considered reviewed. No further action is
required from the labeller or reviewer.
- UNLABELLED = 'unlabelled'#
- LABELLED = 'labelled'#
- REVIEWED = 'reviewed'#
- class encord.orm.project.ProjectImporter(dic)[source]#
- DB_FIELDS: collections.OrderedDict = {'errors': <class 'list'>, 'project_hash': typing.Union[str, NoneType]}#
- class encord.orm.project.CvatExportType(value)[source]#
An enumeration.
- PROJECT = 'project'#
- TASK = 'task'#
- class encord.orm.project.ProjectImporterCvatInfo(dic)[source]#
- DB_FIELDS: collections.OrderedDict = {'export_type': <enum 'CvatExportType'>}#
- class encord.orm.project_api_key.ProjectAPIKey(api_key, title, scopes)[source]#
- api_key: str#
- title: str#
- scopes: List[encord.utilities.client_utilities.APIKeyScopes]#
Utilities - Other#
- class encord.http.utils.CloudUploadSettings(max_retries=None, backoff_factor=None, allow_failures=False)[source]#
The settings for uploading data into GCP cloud storage. These apply to each individual upload. These settings overwrite the
encord.http.constants.RequestsSettings
that is set during
encord.EncordUserClient
creation.
- max_retries: Optional[int] = None#
Number of allowed retries when uploading
- backoff_factor: Optional[float] = None#
With each retry, there will be a sleep of backoff_factor * (2 ** (retry_number - 1) )
- allow_failures: bool = False#
If failures are allowed, the upload will continue even if some items were not successfully uploaded even after retries. For example, upon creation of a large image group, you might want to create the image group even if a few images were not successfully uploaded. The unsuccessfully uploaded images will then be logged.
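As a sketch of how these settings might be constructed for a large, failure-tolerant upload (requires the encord SDK; the values and the upload call these would be passed to are illustrative assumptions, not prescriptions):

```python
# Hypothetical sketch: permissive settings for a large image-group upload.
from encord.http.utils import CloudUploadSettings

upload_settings = CloudUploadSettings(
    max_retries=5,        # retry each failed item up to five times
    backoff_factor=0.5,   # sleep 0.5 * 2**(retry_number - 1) seconds between retries
    allow_failures=True,  # proceed even if a few items fail after retries
)
```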
- class encord.http.constants.RequestsSettings(max_retries=3, backoff_factor=1.5, connection_retries=3, connect_timeout=180, read_timeout=180, write_timeout=180)[source]#
The settings for all outgoing network requests. These apply for each individual request.
- max_retries: int = 3#
Number of allowed retries when a request is sent. It only affects idempotent retryable requests.
- backoff_factor: float = 1.5#
With each retry, there will be a sleep of backoff_factor * (2 ** (retry_number - 1) )
- connection_retries: int = 3#
Number of allowed retries to establish TCP connection when a request is sent.
- connect_timeout: int = 180#
Maximum number of seconds from connection establishment to the first byte of response received
- read_timeout: int = 180#
Maximum number of seconds to obtain full response
- write_timeout: int = 180#
Maximum number of seconds to send request payload
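The retry sleep formula quoted above is plain arithmetic, so the resulting delay schedule can be sketched without the SDK (the helper name `retry_delay` is ours, not part of the library):

```python
# Sketch of the retry delay schedule: sleep = backoff_factor * (2 ** (retry_number - 1))
def retry_delay(backoff_factor: float, retry_number: int) -> float:
    """Seconds slept before the given retry (1-indexed), per the docstring formula."""
    return backoff_factor * (2 ** (retry_number - 1))

# With the default backoff_factor of 1.5, the first three retries wait:
delays = [retry_delay(1.5, n) for n in (1, 2, 3)]
print(delays)  # → [1.5, 3.0, 6.0]
```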
- encord.utilities.project_utilities.get_all_model_iteration_uids(model_configurations)[source]#
A convenience function that works well with
encord.project.Project.list_models()
- Return type
Set[str]
Configs#
- class encord.configs.BaseConfig(endpoint, requests_settings=RequestsSettings(max_retries=3, backoff_factor=1.5, connection_retries=3, connect_timeout=180, read_timeout=180, write_timeout=180))[source]#
- class encord.configs.UserConfig(private_key, domain='https://api.encord.com', requests_settings=RequestsSettings(max_retries=3, backoff_factor=1.5, connection_retries=3, connect_timeout=180, read_timeout=180, write_timeout=180))[source]#
- static from_ssh_private_key(ssh_private_key, password='', requests_settings=RequestsSettings(max_retries=3, backoff_factor=1.5, connection_retries=3, connect_timeout=180, read_timeout=180, write_timeout=180), **kwargs)[source]#
Instantiate a UserConfig object from the contents of a private SSH key.
- Parameters
ssh_private_key (str) – The content of a private key file.
password (Optional[str]) – The password for the private key file.
requests_settings (RequestsSettings) – The requests settings for all outgoing network requests.
- Returns
UserConfig.
- Raises
ValueError – If the provided key content is not of the correct format.
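A minimal sketch of using this factory method (requires the encord SDK; the key path is a hypothetical placeholder, and the key must be registered with the platform for subsequent calls to succeed):

```python
# Sketch: read a private SSH key from disk and build a UserConfig.
from pathlib import Path

from encord.configs import UserConfig

key_contents = Path("~/.ssh/encord_key").expanduser().read_text()  # hypothetical path
config = UserConfig.from_ssh_private_key(key_contents, password="")
```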
- class encord.configs.Config(resource_id=None, web_file_path='/public', domain=None, websocket_endpoint='wss://ws.encord.com/websocket', requests_settings=RequestsSettings(max_retries=3, backoff_factor=1.5, connection_retries=3, connect_timeout=180, read_timeout=180, write_timeout=180))[source]#
Config defining endpoint, project id, API key, and timeouts.
- encord.configs.get_env_ssh_key()[source]#
Returns the raw SSH key by looking up the ENCORD_SSH_KEY_FILE and ENCORD_SSH_KEY environment variables, in that order, and returning the first key found.
- Return type
str
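A brief sketch of the lookup described above (requires the encord SDK; the key contents shown are a placeholder):

```python
# Sketch: resolving the SSH key from the environment.
# ENCORD_SSH_KEY_FILE is checked first; ENCORD_SSH_KEY is the fallback.
import os

from encord.configs import get_env_ssh_key

os.environ["ENCORD_SSH_KEY"] = "<private key contents>"  # placeholder
key = get_env_ssh_key()
```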
- class encord.configs.ApiKeyConfig(resource_id=None, api_key=None, domain=None, requests_settings=RequestsSettings(max_retries=3, backoff_factor=1.5, connection_retries=3, connect_timeout=180, read_timeout=180, write_timeout=180))[source]#
- encord.configs.EncordConfig#
alias of
encord.configs.ApiKeyConfig
- encord.configs.CordConfig#
alias of
encord.configs.ApiKeyConfig
- encord.configs.ENCORD_DOMAIN = 'https://api.encord.com'
Exceptions#
- exception encord.exceptions.EncordException(message, context=None)[source]#
Base class for all exceptions.
- encord.exceptions.CordException#
alias of
encord.exceptions.EncordException
- exception encord.exceptions.InitialisationError(message, context=None)[source]#
Exception thrown when API key fails to initialise.
- exception encord.exceptions.AuthenticationError(message, context=None)[source]#
Exception thrown when API key fails authentication.
- exception encord.exceptions.AuthorisationError(message, context=None)[source]#
Exception thrown when access is unauthorised. (E.g. access to a data asset or method).
- exception encord.exceptions.ResourceNotFoundError(message, context=None)[source]#
Exception thrown when a requested resource is not found. (E.g. label, data asset).
- exception encord.exceptions.TimeOutError(message, context=None)[source]#
Exception thrown when a request times out.
- exception encord.exceptions.RequestException(message, context=None)[source]#
Ambiguous exception while handling request.
- exception encord.exceptions.InvalidDateFormatError(message, context=None)[source]#
Invalid date format error
- exception encord.exceptions.MethodNotAllowedError(message, context=None)[source]#
Exception thrown when HTTP method is not allowed.
- exception encord.exceptions.OperationNotAllowed(message, context=None)[source]#
Exception thrown when a read/write operation is not allowed. The API key blocks the operation.
- exception encord.exceptions.AnswerDictionaryError(message, context=None)[source]#
Exception thrown when answer dictionaries are incomplete. Occurs when an object or classification is missing.
- exception encord.exceptions.CorruptedLabelError(message, context=None)[source]#
Exception thrown when a label is corrupted. (E.g. the frame labels have more frames than the video).
- exception encord.exceptions.FileTypeNotSupportedError(message, context=None)[source]#
Exception thrown when a file type is not supported. Supported file types are: image/jpeg, image/png, video/webm, video/mp4.
- exception encord.exceptions.FileSizeNotSupportedError(message, context=None)[source]#
Exception thrown when the combined size of the input files is larger than the supported limit.
- exception encord.exceptions.FeatureDoesNotExistError(message, context=None)[source]#
If a feature uid does not exist in a given project ontology.
- exception encord.exceptions.ModelWeightsInconsistentError(message, context=None)[source]#
Exception thrown when an attempted model training iteration has a different type of weights than what is recorded (i.e. if type of model_hash (uid) is faster_rcnn, but is attempted trained with different model weights).
- exception encord.exceptions.ModelFeaturesInconsistentError(message, context=None)[source]#
If a feature type is different than what is supported by the model (e.g. if creating a classification model using a bounding box).
- exception encord.exceptions.UploadOperationNotSupportedError(message, context=None)[source]#
Exception thrown when trying to upload a video/image group to non-Encord storage dataset
- exception encord.exceptions.DetectionRangeInvalidError(message, context=None)[source]#
Exception thrown when a detection range is invalid. (E.g. negative or higher than num frames in video).
- exception encord.exceptions.InvalidAlgorithmError(message, context=None)[source]#
Exception thrown when invalid labeling algorithm name is sent.
- exception encord.exceptions.ResourceExistsError(message, context=None)[source]#
Exception thrown when trying to re-create a resource. Avoids overriding existing work.
- exception encord.exceptions.DuplicateSshKeyError(message, context=None)[source]#
Exception thrown when using an SSH key that was added twice to the platform.
- exception encord.exceptions.SshKeyNotFound(message, context=None)[source]#
Exception thrown when using an SSH key that was not added to the platform.
- exception encord.exceptions.InvalidArgumentsError(message, context=None)[source]#
Exception thrown when the arguments are invalid.
- exception encord.exceptions.GenericServerError(message, context=None)[source]#
The server has reported an error which is not recognised by this SDK version. Try upgrading the SDK version to see the precise error that is reported.
- exception encord.exceptions.CloudUploadError(message, context=None)[source]#
The upload to the cloud was not successful
- exception encord.exceptions.MultiLabelLimitError(message, maximum_labels_allowed, context=None)[source]#
Too many labels were requested
- exception encord.exceptions.LabelRowError(message, context=None)[source]#
An error thrown when the construction of a LabelRow class is invalid.
Helpers#
- class encord.http.bundle.Bundle[source]#
This class allows you to perform operations in bundles, improving performance by reducing the number of network calls.
It is not supposed to be instantiated directly by the user. Use the
encord.project.Project.create_bundle()
method to initiate bundled operations.
To execute a batch you can either call
execute()
directly, or use a Context Manager:
# Code example of performing batched label initialisation
project = ...  # assuming you have already instantiated this Project object
label_rows = project.list_label_rows_v2()
bundle = project.create_bundle()
for label_row in label_rows:
    label_row.initialise_labels(bundle=bundle)  # no real network operations happened at this point

# now, trigger the actual network interaction
bundle.execute()
# all labels are initialised at this point
And this is the same flow with the Context Manager approach:
# Code example of performing batched label initialisation
project = ...  # assuming you have already instantiated this Project object
label_rows = project.list_label_rows_v2()
with project.create_bundle() as bundle:
    for label_row in label_rows:
        label_row.initialise_labels(bundle=bundle)  # no real network operations happened at this point

# At this point all labels will be initialised
DEPRECATED Client#
Using this client is deprecated. You are encouraged to use the Project
and Dataset
instead.
encord.client
provides a simple Python client that allows you
to query project resources through the Encord API.
Here is a simple example for instantiating the client for a project and obtaining project info:
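A minimal sketch of such an example, under the assumptions that the resource ID and API key shown are placeholders and that the returned project client exposes a `get_project()` accessor for project info:

```python
# Sketch: instantiating the deprecated client for a project (placeholders only).
from encord.client import EncordClient

client = EncordClient.initialise(
    resource_id="<project_hash>",  # hypothetical placeholder
    api_key="<api_key>",           # hypothetical placeholder
)
project = client.get_project()  # assumed accessor for project info
```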
- class encord.client.EncordClient(querier, config, api_client=None)[source]#
Encord client. Allows you to query db items associated with a project (e.g. label rows, datasets).
- static initialise(resource_id=None, api_key=None, domain='https://api.encord.com', requests_settings=RequestsSettings(max_retries=3, backoff_factor=1.5, connection_retries=3, connect_timeout=180, read_timeout=180, write_timeout=180))[source]#
Create and initialise an Encord client from a resource EntityId and API key.
- Parameters
resource_id (Optional[str]) – Either of the following:
A <project_hash>. If None, uses the ENCORD_PROJECT_ID environment variable. The CORD_PROJECT_ID environment variable is supported for backwards compatibility.
A <dataset_hash>. If None, uses the ENCORD_DATASET_ID environment variable. The CORD_DATASET_ID environment variable is supported for backwards compatibility.
api_key (Optional[str]) – An API key. If None, uses the ENCORD_API_KEY environment variable. The CORD_API_KEY environment variable is supported for backwards compatibility.
domain (str) – The Encord api-server domain. If None, encord.configs.ENCORD_DOMAIN is used.
requests_settings (RequestsSettings) – The RequestsSettings from this config.
- Returns
An Encord client instance.
- Return type
- static initialise_with_config(config)[source]#
Create and initialise an Encord client from an Encord config instance.
- Parameters
config (ApiKeyConfig) – An Encord config instance.
- Returns
An Encord client instance.
- Return type
- get_cloud_integrations()[source]#
- Return type
List[CloudIntegration]
- class encord.client.EncordClientDataset(querier, config,