Task Creation

To create a task, the toolkit needs to upload the scene information to a server that Scale has access to (usually this means valid S3 credentials and an S3 bucket). The uploaded files are a set of JSON files holding data such as device position, images, camera positions, points, and any other relevant data.

You also need a valid Scale API key to create the task.
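As a rough illustration of what gets uploaded, a single frame file might look something like the sketch below. The field names are illustrative only; the toolkit generates these files for you, so you normally never write them by hand.

{
  "device_position": { "x": 0.0, "y": 0.0, "z": 0.0 },
  "device_heading": { "x": 0.0, "y": 0.0, "z": 0.0, "w": 1.0 },
  "points": [
    { "x": 12.3, "y": -4.5, "z": 0.8, "i": 0.42 }
  ],
  "images": [
    { "image_url": "https://example.com/frame-1-camera-0.jpg", "timestamp": 1564124132 }
  ]
}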

S3

The lidar toolkit uses boto3, which reads the AWS credentials stored in ~/.aws/credentials (the same credentials used by the aws-cli in your terminal).
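If you have not configured these credentials yet, the standard AWS credentials file uses an INI-style layout (the values below are placeholders):

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY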

The method scene.s3_upload(bucket, path) uses boto3 to upload all of the scene's attachments and images to the specified bucket and path.

scene.s3_upload('scaleapi-sales-test', path='test/lidar-toolkit')

Scale File Upload

As an alternative, you can use the Scale File Upload feature (see Scale's documentation for more information). To use Scale File Upload, replace the following line:

scene.s3_upload(bucket, path)

with

attachments = scene.scale_file_upload('project_name')

Replace project_name with the name of the project these files should be associated with.

The method scale_file_upload returns a list of URLs, one per frame of the scene. These URLs are what should be included in your lidar task's attachments field (as part of the payload).

import json

with open('template.json') as json_file:
    TEMPLATE = json.load(json_file)

if __name__ == '__main__':
    scene = create_scene(
            'data/',
            frames=range(1,6)
            )

    # Debugging methods
    #  scene.get_frame(index=1).add_debug_lines()
    #  scene.preview()

    # Upload data to S3 bucket
    attachments = scene.scale_file_upload('test_project')
    TEMPLATE['attachments'] = attachments
    # Create tasks/ Scale API request
    scene.create_task(TEMPLATE).publish()

See also

You can hide the progress bar by calling the method with verbose=False: scene.scale_file_upload('test_project', verbose=False).

Scale API Key

Before you can create a task on Scale's platform, you need to define an environment variable called SCALE_API_KEY. You can do this by entering export SCALE_API_KEY=live_xxxx in your terminal.
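If you prefer to set the key from Python instead of your shell (for example, in a notebook), something like the sketch below should work, assuming the toolkit reads SCALE_API_KEY from the environment:

import os

# Make the key available to the toolkit for this process only
os.environ['SCALE_API_KEY'] = 'live_xxxx'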

The following method is used to create a task:

scene.create_task().publish()

With everything finally in place, you should be able to debug your data, upload it to S3 or your cloud provider of choice, and create tasks in Scale’s platform!

See also

You can create different task types with the same method by changing the task_type parameter: scene.create_task().publish(task_type='[TASK TYPE]'). The available task types are lidarannotation (default), lidartopdown, and lidarsegmentation.
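For example, to create a top-down task instead of the default annotation task:

scene.create_task().publish(task_type='lidartopdown')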

Template

Scene.create_task() supports a JSON template, which is used to generate the payload containing the task parameters from Scale's documentation (labels, project name, instructions, etc.).

# Code example of how to load a template.json file and use it
import json

with open('template.json') as json_file:
    TEMPLATE = json.load(json_file)
scene.create_task(TEMPLATE).publish()

Example of template.json:

{
  "attachment_type": "json",
  "callback_url": "http://example.com/callback",
  "project": "default_project",
  "instruction": "Please label the objects on the scene",
  "labels": [
    "car",
    "truck",
    "bus",
    "bicycle",
    "motorbike",
    "towed_object",
    "person"
  ],
  "max_distance_meters": 20,
  "meters_per_unit": 1
}
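If you need to tweak the template at runtime (for example, adding a label for one particular scene), you can modify the loaded dictionary before passing it to create_task. The label below is purely illustrative:

# Adjust the loaded template before creating the task
TEMPLATE['labels'].append('trailer')  # illustrative label, not part of the default template
scene.create_task(TEMPLATE).publish()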

Create the Task

Comment out the lidar debug lines and uncomment the S3 and task-creation lines.

Add your task template and run the script to create your task!

import json

# Replace with the name of your S3 bucket
S3_BUCKET = 'scaleapi-sales-test'

with open('template.json') as json_file:
    TEMPLATE = json.load(json_file)

if __name__ == '__main__':
    scene = create_scene(
            'data/',
            frames=range(1,6)
            )

    # Debugging methods
    #  scene.get_frame(index=1).add_debug_lines()
    #  scene.preview()

    # Upload data to S3 bucket
    scene.s3_upload(S3_BUCKET, path='test-scale')
    # Create tasks/ Scale API request
    scene.create_task(TEMPLATE).publish()