docs/en/sql-reference/10-sql-commands/00-ddl/04-task/01-ddl-create_task.md (18 additions, 16 deletions)
@@ -113,7 +113,7 @@ CREATE TASK my_daily_task
 WAREHOUSE = 'compute_wh'
 SCHEDULE = USING CRON '0 0 9 * * *' 'America/Los_Angeles'
 COMMENT = 'Daily summary task'
-AS
+AS
 INSERT INTO summary_table SELECT * FROM source_table;
 ```
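Pieced together from the fragments in this hunk, the full statement would read roughly as follows (a sketch; only the lines shown above are confirmed by the diff):

```sql
-- Six-field CRON expression (second, minute, hour, day, month, weekday):
-- fires at 09:00 each day in the America/Los_Angeles time zone.
CREATE TASK my_daily_task
WAREHOUSE = 'compute_wh'
SCHEDULE = USING CRON '0 0 9 * * *' 'America/Los_Angeles'
COMMENT = 'Daily summary task'
AS
INSERT INTO summary_table SELECT * FROM source_table;
```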
@@ -127,7 +127,7 @@ CREATE TASK IF NOT EXISTS mytask
 SCHEDULE = 2 MINUTE
 SUSPEND_TASK_AFTER_NUM_FAILURES = 3
 AS
-INSERT INTO compaction_test.test VALUES((1));
+INSERT INTO compaction_test.test VALUES((1));
 ```
This example creates a task named `mytask` if it doesn't already exist. The task is assigned to the **system** warehouse and is scheduled to run **every 2 minutes**. It is **automatically suspended** if it **fails three times consecutively**. The task performs an INSERT operation into the `compaction_test.test` table.
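Reassembled, the statement would look like this (a sketch; the WAREHOUSE line is not part of the hunk, so its value is taken from the prose above):

```sql
-- Retry-limited task: suspended after three consecutive failures.
CREATE TASK IF NOT EXISTS mytask
WAREHOUSE = 'system'  -- value from the prose, not shown in the hunk
SCHEDULE = 2 MINUTE
SUSPEND_TASK_AFTER_NUM_FAILURES = 3
AS
INSERT INTO compaction_test.test VALUES((1));
```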
@@ -139,9 +139,9 @@ CREATE TASK IF NOT EXISTS daily_sales_summary
 WAREHOUSE = 'analytics'
 SCHEDULE = 30 SECOND
 AS
-SELECT sales_date, SUM(amount) AS daily_total
-FROM sales_data
-GROUP BY sales_date;
+SELECT sales_date, SUM(amount) AS daily_total
+FROM sales_data
+GROUP BY sales_date;
 ```
In this example, a task named `daily_sales_summary` is created with **second-level scheduling**. It is scheduled to run **every 30 seconds**. The task uses the **analytics** warehouse and calculates the daily sales summary by aggregating data from the `sales_data` table.
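The reassembled statement (every line here appears in the hunk above):

```sql
-- Second-level scheduling: the task wakes every 30 seconds.
CREATE TASK IF NOT EXISTS daily_sales_summary
WAREHOUSE = 'analytics'
SCHEDULE = 30 SECOND
AS
SELECT sales_date, SUM(amount) AS daily_total
FROM sales_data
GROUP BY sales_date;
```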
@@ -152,22 +152,24 @@ In this example, a task named `daily_sales_summary` is created with **second-level scheduling**.
 In this example, a task named `process_orders` is created, and it is defined to run **after the successful completion** of **task1** and **task2**. This is useful for creating **dependencies** in a **Directed Acyclic Graph (DAG)** of tasks. The task uses the **etl** warehouse and transfers data from the staging area to the data warehouse.
 
+> Tip: Using the AFTER parameter does not require setting the SCHEDULE parameter.
-DELETE FROM archived_data
-WHERE archived_date < DATEADD(HOUR, -24, CURRENT_TIMESTAMP());
+DELETE FROM archived_data
+WHERE archived_date < DATEADD(HOUR, -24, CURRENT_TIMESTAMP());
 
 ```
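The `process_orders` DDL itself falls outside this hunk, so it is not visible here. A minimal sketch consistent with the prose (run after two upstream tasks, using the **etl** warehouse) might look like the following; the body and its table names are hypothetical:

```sql
-- DAG member: runs after task1 and task2 both complete successfully.
-- Per the tip above, no SCHEDULE is needed when AFTER is used.
CREATE TASK process_orders
WAREHOUSE = 'etl'
AFTER task1, task2
AS
INSERT INTO warehouse_orders       -- hypothetical target table
SELECT * FROM staging_orders;      -- hypothetical staging table
```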
@@ -181,12 +183,12 @@ CREATE TASK IF NOT EXISTS mytask
 SCHEDULE = 30 SECOND
 ERROR_INTEGRATION = 'myerror'
 AS
-BEGIN
+BEGIN
 BEGIN;
 INSERT INTO mytable(ts) VALUES(CURRENT_TIMESTAMP);
 DELETE FROM mytable WHERE ts < DATEADD(MINUTE, -5, CURRENT_TIMESTAMP());
 COMMIT;
-END;
+END;
 ```
In this example, a task named `mytask` is created. It uses the **mywh** warehouse and is scheduled to run **every 30 seconds**. The task executes a **BEGIN block** containing an INSERT statement and a DELETE statement, and commits the transaction after both statements have executed. If the task fails, it triggers the **error integration** named **myerror**.
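Reassembled (the WAREHOUSE value is taken from the prose; everything else appears in the hunk):

```sql
-- Transactional task body: both statements commit together.
-- A failure triggers the 'myerror' error integration.
CREATE TASK IF NOT EXISTS mytask
WAREHOUSE = 'mywh'  -- value from the prose, not shown in the hunk
SCHEDULE = 30 SECOND
ERROR_INTEGRATION = 'myerror'
AS
BEGIN
    BEGIN;
    INSERT INTO mytable(ts) VALUES(CURRENT_TIMESTAMP);
    DELETE FROM mytable WHERE ts < DATEADD(MINUTE, -5, CURRENT_TIMESTAMP());
    COMMIT;
END;
```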
@@ -201,10 +203,10 @@ CREATE TASK IF NOT EXISTS cache_enabled_task
 enable_query_result_cache = 1,
 query_result_cache_min_execute_secs = 5
 AS
-SELECT SUM(amount) AS total_sales
-FROM sales_data
-WHERE transaction_date >= DATEADD(DAY, -7, CURRENT_DATE())
-GROUP BY product_category;
+SELECT SUM(amount) AS total_sales
+FROM sales_data
+WHERE transaction_date >= DATEADD(DAY, -7, CURRENT_DATE())
+GROUP BY product_category;
 ```
In this example, a task named `cache_enabled_task` is created with **session parameters** that enable query result caching. The task is scheduled to run **every 5 minutes** and uses the **analytics** warehouse. The session parameters **`enable_query_result_cache = 1`** and **`query_result_cache_min_execute_secs = 5`** are specified **after all other task parameters**; they enable the query result cache for queries that take at least 5 seconds to execute. This can **improve performance** for subsequent executions of the same task if the underlying data hasn't changed.
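Reassembled, with the WAREHOUSE and SCHEDULE lines (which fall outside the hunk) filled in from the prose:

```sql
-- Session parameters come after all other task parameters.
CREATE TASK IF NOT EXISTS cache_enabled_task
WAREHOUSE = 'analytics'  -- from the prose, not shown in the hunk
SCHEDULE = 5 MINUTE      -- from the prose, not shown in the hunk
enable_query_result_cache = 1,
query_result_cache_min_execute_secs = 5
AS
SELECT SUM(amount) AS total_sales
FROM sales_data
WHERE transaction_date >= DATEADD(DAY, -7, CURRENT_DATE())
GROUP BY product_category;
```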