This page describes how to configure the ClickUp connector component as part of a data pipeline. The ClickUp component uses the Connect and Configure parameters to create a table of ClickUp data, which is then stored in your preferred storage location (Snowflake, Databricks, Amazon Redshift, or cloud storage). You do not need to use the Create Table component with this connector: the ClickUp component will create a new table, or replace an existing table, using the Destination parameters you define.

The ClickUp connector is a Flex connector. Flex connectors let you connect to a curated set of endpoints to load data. You can use the ClickUp connector in its preconfigured state, or you can edit the connector by adding or amending the available ClickUp endpoints to suit your use case. You can edit Flex connectors in the Custom Connector user interface.

For detailed information about authentication, endpoint-specific parameters, pagination, and other aspects of the ClickUp API, read the ClickUp API documentation. There are two versions of the ClickUp API, both of which you can access through that link.

Properties

Reference material is provided below for the Connect, Configure, Destination, and Advanced Settings properties.
Name
string
required
A human-readable name for the component.

Connect

Data Source
drop-down
required
The data source to load data from in this pipeline. The drop-down menu lists the ClickUp API endpoints available in the connector. For detailed information about specific endpoints, read the ClickUp API documentation.
Endpoint | Method | Reference
Get Authorized User | GET | View the details of the authenticated user's ClickUp account
Get Authorized Teams | GET | View the Workspaces available to the authenticated user
Get Task Comments | GET | View task comments
Get Chat View Comments | GET | View comments from a Chat view
Get List Comments | GET | View the comments added to a List
Get Threaded Comments | GET | View threaded comments
Get Custom Task Types | GET | View the custom task types available in a Workspace
Get Accessible Custom Fields | GET | View the custom fields available on tasks in a specific list
Search Docs | GET | View the docs in your Workspace
Get Doc PageListing | GET | View the PageListing for a doc
Get Doc Pages | GET | View pages belonging to a doc
Get Folders | GET | View the folders in a space
Get Goals | GET | View the goals available in a workspace
Get Lists | GET | View the lists within a folder
Get Folderless Lists | GET | View the lists in a space that aren't located in a folder
Get Task Members | GET | View the people who have access to a task
Get List Members | GET | View the people who have access to a list
Get Custom Roles | GET | View the custom roles available in a workspace
Shared Hierarchy | GET | View the tasks, lists, and folders that have been shared with the authenticated user
Get Spaces | GET | View the spaces available in a workspace
Get Space Tags | GET | View the task tags available in a space
Get Tasks | GET | View the tasks in a list
Get Filtered Team Tasks | GET | View the tasks that meet specific criteria from a workspace
Get Task Templates | GET | View the task templates available in a workspace
Get Workspace Seats | GET | View the used, total, and available member and guest seats for a workspace
Get Time Entries | GET | View time entries filtered by start and end date
Get Time Entry History | GET | View a list of changes made to a time entry
Get Running Time Entry | GET | View a time entry that's currently tracking time for the authenticated user
Get All Tags From Time Entries | GET | View all the labels that have been applied to time entries in a Workspace
Get User | GET | View information about a user in a Workspace
Get Workspace Views | GET | View the task and page views available at the Everything Level of a workspace
Get Space Views | GET | View the task and page views available for a space
Get Folder Views | GET | View the task and page views available for a folder
Get List Views | GET | View the task and page views available for a List
Get View Tasks | GET | See all visible tasks in a view in ClickUp
Get Webhooks | GET | View the webhooks created via the API for a Workspace
Authentication Type
drop-down
required
The authentication method to authorize access to your ClickUp data. Currently supports API Key.
Key
string
required
The key of a working API key:value pair. Enter Authorization in the Key field.
Value
drop-down
required
Use the drop-down menu to select the corresponding secret definition that denotes the value of a working API key:value pair. Read Secrets and secret definitions to learn how to create a new secret definition. Read the ClickUp API documentation to learn how to acquire an API key.
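As a point of reference, the key:value pair above corresponds to a plain HTTP header. A minimal sketch of what the connector sends is shown below, using only the Python standard library; the API key value is a hypothetical placeholder, and the endpoint URL is the Get Authorized User endpoint from the ClickUp API documentation. Note that personal ClickUp API keys are sent as the raw Authorization header value, without a Bearer prefix.

```python
import urllib.request

# Hypothetical placeholder -- substitute the API key stored in your secret definition.
API_KEY = "pk_your_personal_api_key"

# Build (but do not send) a request against the Get Authorized User endpoint.
# The Key/Value pair configured in the component becomes this Authorization header.
req = urllib.request.Request(
    "https://api.clickup.com/api/v2/user",
    headers={
        "Authorization": API_KEY,      # Key = Authorization, Value = your secret
        "Content-Type": "application/json",
    },
    method="GET",
)

print(req.get_method(), req.full_url)
# To actually send the request: urllib.request.urlopen(req)  (requires network access)
```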

Configure

URI Parameters
column editor
required
  • Parameter Name: The name of a URI parameter.
  • Parameter Value: The value of the corresponding parameter.
Required parameter | Endpoints | Description
api_version | Get Authorized User, Get Authorized Teams, Get Task Comments, Get Chat View Comments, Get List Comments, Get Custom Task Types, Get Accessible Custom Fields, Get Folders, Get Goals, Get Lists, Get Folderless Lists, Get Task Members, Get List Members, Get Custom Roles, Shared Hierarchy, Get Spaces, Get Time Entry History, Get Running Time Entry, Get All Tags From Time Entries, Get User, Get Workspace Views, Get Space Views, Get Folder Views, Get List Views, Get View Tasks, Get Webhooks | v2
api_version | Search Docs, Get Doc PageListing, Get Doc Pages | v3
task_id | Get Task Comments, Get Task Members | The task ID.
view_id | Get Chat View Comments, Get View Tasks | The view ID.
list_id | Get List Comments, Get Accessible Custom Fields, Get List Members, Get List Views | The list ID. Read the list_id parameter description in Get Tasks for instructions on how to obtain the list_id.
comment_id | Get Threaded Comments | Your comment ID.
team_id | Get Custom Task Types, Shared Hierarchy, Get Goals, Get Custom Roles, Get Spaces, Get Time Entry History, Get Running Time Entry, Get All Tags From Time Entries, Get User, Get Workspace Views, Get Webhooks | The Team ID of the workspace.
workspaceId | Search Docs, Get Doc PageListing, Get Doc Pages | The ID of the workspace.
docId | Get Doc PageListing, Get Doc Pages | The ID of the doc.
space_id | Get Folders, Get Folderless Lists, Get Space Views | The ID of a space.
folder_id | Get Lists, Get Folder Views | The folder ID.
timer_id | Get Time Entry History | The ID of the timer.
user_id | Get User | The user ID in a workspace.
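To make the role of URI parameters concrete, the sketch below shows how values such as api_version and list_id slot into an endpoint's path template. The path template for Get Tasks is taken from the ClickUp API documentation; the list_id value is a made-up example.

```python
BASE = "https://api.clickup.com/api"

def build_url(template: str, **uri_params: str) -> str:
    """Fill a /{placeholder}/ style path template with URI parameter values."""
    return BASE + template.format(**uri_params)

# Get Tasks uses the list_id URI parameter (example ID is hypothetical),
# and api_version is v2 for this endpoint, per the table above.
url = build_url("/{api_version}/list/{list_id}/task",
                api_version="v2", list_id="901100012345")
print(url)  # https://api.clickup.com/api/v2/list/901100012345/task
```

Each Parameter Name in the column editor corresponds to one of the placeholders in the path, and the Parameter Value replaces it.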
Query Parameters
column editor
required
  • Parameter Name: The name of a query parameter.
  • Parameter Value: The value of the corresponding parameter.
Required parameter | Endpoints | Description
page | Get View Tasks | 0
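The page query parameter for Get View Tasks is 0-indexed, so the first request uses page=0. The sketch below builds the sequence of page URLs; the view ID is a hypothetical example, and a real client would stop when the response body signals the last page rather than using a fixed count.

```python
from urllib.parse import urlencode

def page_urls(view_id: str, last_page: int):
    """Yield one Get View Tasks URL per page, starting from page=0."""
    for page in range(last_page + 1):
        query = urlencode({"page": page})
        yield f"https://api.clickup.com/api/v2/view/{view_id}/task?{query}"

# Hypothetical view ID; three pages (0, 1, 2).
urls = list(page_urls("3c-123", 2))
print(urls[0])  # ...view/3c-123/task?page=0
```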
Header Parameters
column editor
required
  • Parameter Name: The name of a header parameter.
  • Parameter Value: The value of the corresponding parameter.
Required parameter | Endpoints | Description
Content-Type | All endpoints | application/json
Post Body
JSON
A JSON body to include as part of a POST request. Use Custom Connector to test that your endpoints work as expected before moving to Designer pipelines. You should also consult the developer documentation for the API you're connecting to, as the developer portal may provide additional information about endpoints and requests.
Page Limit
integer
A numeric value to limit the maximum number of records per page.

Destination

Select your cloud data warehouse.
Destination
drop-down
required
  • Snowflake: Load your data into Snowflake. You’ll need to set a cloud storage location for temporary staging of the data.
  • Cloud Storage: Load your data directly into your preferred cloud storage location.
Click either the Snowflake or Cloud Storage tab on this page for documentation applicable to that destination type.
Warehouse
drop-down
required
The Snowflake warehouse used to run the queries. The special value [Environment Default] uses the warehouse defined in the environment. Read Overview of Warehouses to learn more.
Database
drop-down
required
The Snowflake database. The special value [Environment Default] uses the database defined in the environment. Read Databases, Tables and Views - Overview to learn more.
Schema
drop-down
required
The Snowflake schema. The special value [Environment Default] uses the schema defined in the environment. Read Database, Schema, and Share DDL to learn more.
Table Name
string
required
The name of the table to be created.
Load Strategy
drop-down
required
  • Replace: If the specified table name already exists, that table will be destroyed and replaced by the table created during this pipeline run.
  • Truncate and Insert: If the specified table name already exists, all rows within the table will be removed and new rows will be inserted on the next run of this pipeline.
  • Fail if Exists: If the specified table name already exists, this pipeline will fail to run.
  • Append: If the specified table name already exists, then the data is inserted without altering or deleting the existing data in the table. It’s appended onto the end of the existing data in the table. If the specified table name doesn’t exist, then the table will be created, and your data will be inserted into the table.
Clean Staged files
boolean
required
  • Yes: Staged files will be destroyed after data is loaded. This is the default setting.
  • No: Staged files are retained in the staging area after data is loaded.
Stage Access Strategy
drop-down
Select the stage access strategy. The strategies available depend on the cloud platform you select in Stage Platform.
  • Credentials: Connects to the external stage (AWS, Azure) using your configured cloud provider credentials. Not available for Google Cloud Storage.
  • Storage Integration: Use a Snowflake storage integration to grant access to Snowflake to read data from and write to a cloud storage location. This will reveal the Storage Integration property, through which you can select any of your existing Snowflake storage integrations.
Stage Platform
drop-down
required
Choose a data staging platform using the drop-down menu.
  • Amazon S3: Stage your data on an AWS S3 bucket.
  • Snowflake: Stage your data on a Snowflake internal stage.
  • Azure Storage: Stage your data in an Azure Blob Storage container.
  • Google Cloud Storage: Stage your data in a Google Cloud Storage bucket.
Click one of the tabs below for documentation applicable to that staging platform.
Storage Integration
drop-down
required
Select the storage integration. Storage integrations are required to permit Snowflake to read data from and write to a cloud storage location. Integrations must be set up in advance of selecting them. Storage integrations can be configured to support Amazon S3, Google Cloud Storage, or Microsoft Azure Blob Storage, regardless of the cloud provider that hosts your Snowflake account.
Amazon S3 Bucket
drop-down
required
An AWS S3 bucket to stage data into. The drop-down menu will include buckets tied to the cloud provider credentials that you have associated with your environment.

Advanced Settings

Log Level
drop-down
Set the severity level of logging. Choose from Error, Warn, Info, Debug, or Trace. Logs can be found in the Message field of the task details after the pipeline has been run.
Load Selected Data
boolean
Choose whether to return the entire payload or only selected data objects. Read Structure to learn how to select which data objects to include in your API response.
  • No: Will return the entire payload. This is the default setting.
  • Yes: Will return only the objects in Custom Connector that are marked as Selected Data in the Structure setting.

Deactivate soft delete for Azure blobs (Databricks)

If you intend to set your destination as Databricks and your stage platform as Azure Storage, you must turn off the “Enable soft delete for blobs” setting in your Azure account for your pipeline to run successfully. To do this:
  1. In the Azure portal, navigate to your storage account.
  2. In the menu, under Data management, click Data protection.
  3. Clear the Enable soft delete for blobs checkbox. For more information, read Soft delete for blobs.