Deadline User Manual
Release 7.1.0.35
Thinkbox Software
May 04, 2015
CONTENTS

1 Introduction
   1.1 Overview
   1.2 Feature Set
   1.3 Supported Software
   1.4 Render Farm Considerations

2 Installation
   2.1 System Requirements
   2.2 Licensing
   2.3 Database and Repository Installation
   2.4 Client Installation
   2.5 Submitter Installation
   2.6 Upgrading or Downgrading Deadline
   2.7 Relocating the Database or Repository
   2.8 Importing Repository Settings

3 Getting Started
   3.1 Application Configuration
   3.2 Submitting Jobs
   3.3 Monitoring Jobs
   3.4 Controlling Jobs
   3.5 Archiving Jobs
   3.6 Monitor and User Settings
   3.7 Local Slave Controls

4 Client Applications
   4.1 Launcher
   4.2 Monitor
   4.3 Slave
   4.4 Pulse
   4.5 Balancer
   4.6 Command
   4.7 Web Service
   4.8 Mobile

5 Administrative Features
   5.1 Repository Configuration
   5.2 User Management
   5.3 Slave Configuration
   5.4 Pulse Configuration
   5.5 Balancer Configuration
   5.6 Job Scheduling
   5.7 Pools and Groups
   5.8 Limits and Machine Limits
   5.9 Job Failure Detection
   5.10 Notifications
   5.11 Remote Control
   5.12 Network Performance
   5.13 Cross Platform Rendering

6 Advanced Features
   6.1 Manual Job Submission
   6.2 Power Management
   6.3 Slave Scheduling
   6.4 Farm Statistics
   6.5 Client Configuration
   6.6 Auto Configuration
   6.7 Render Environment
   6.8 Multiple Slaves On One Machine
   6.9 Cloud Controls
   6.10 Job Transferring

7 Scripting
   7.1 Scripting Overview
   7.2 Application Plugins
   7.3 Event Plugins
   7.4 Cloud Plugins
   7.5 Balancer Plugins
   7.6 Monitor Scripts
   7.7 Job Scripts
   7.8 Web Service Scripts
   7.9 Standalone Python API

8 REST API
   8.1 REST Overview
   8.2 Jobs
   8.3 Job Reports
   8.4 Tasks
   8.5 Task Reports
   8.6 Slaves
   8.7 Pulse
   8.8 Balancer
   8.9 Limits
   8.10 Users
   8.11 Repository
   8.12 Pools
   8.13 Groups

9 Application Plugins
   9.1 3ds Command
   9.2 3ds Max
   9.3 After Effects
   9.4 Anime Studio
   9.5 Arion Standalone
   9.6 Arnold Standalone
   9.7 AutoCAD
   9.8 Blender
   9.9 Cinema 4D
   9.10 Cinema 4D Team Render
   9.11 Clarisse iFX
   9.12 Combustion
   9.13 Command Line
   9.14 Command Script
   9.15 Composite
   9.16 Corona Standalone
   9.17 Corona Distributed Rendering
   9.18 CSiBridge
   9.19 CSiETABS
   9.20 CSiSAFE
   9.21 CSiSAP2000
   9.22 DJV
   9.23 Draft
   9.24 Draft Tile Assembler
   9.25 EnergyPlus
   9.26 FFmpeg
   9.27 Fusion
   9.28 Fusion Quicktime
   9.29 Generation
   9.30 Hiero
   9.31 Houdini
   9.32 Lightwave
   9.33 LuxRender
   9.34 LuxSlave
   9.35 Mantra Standalone
   9.36 Maxwell
   9.37 Maya
   9.38 Media Encoder
   9.39 Mental Ray Standalone
   9.40 Messiah
   9.41 MetaFuze
   9.42 MetaRender
   9.43 MicroStation
   9.44 modo
   9.45 Naiad
   9.46 Natron
   9.47 Nuke
   9.48 Nuke Frame Server
   9.49 Octane Standalone
   9.50 PRMan (Renderman Pro Server)
   9.51 Puppet
   9.52 Python
   9.53 Quicktime Generation
   9.54 Realflow
   9.55 REDLine
   9.56 Renderman (RIB)
   9.57 Rendition
   9.58 Rhino
   9.59 RVIO
   9.60 Salt
   9.61 Shake
   9.62 SketchUp
   9.63 Softimage
   9.64 Terragen
   9.65 Tile Assembler
   9.66 V-Ray Distributed Rendering
   9.67 VRay Standalone
   9.68 VRay Ply2Vrmesh
   9.69 VRay Vrimg2Exr
   9.70 VRED
   9.71 VRED Cluster
   9.72 Vue
   9.73 xNormal

10 Event Plugins
   10.1 Draft
   10.2 FontSync
   10.3 ftrack
   10.4 Puppet
   10.5 Salt
   10.6 Shotgun

11 Cloud Plugins
   11.1 Amazon EC2
   11.2 Google Cloud
   11.3 Microsoft Azure
   11.4 OpenStack
   11.5 vCenter

12 Release Notes
   12.1 Deadline 7.0.0.54 Release Notes
   12.2 Deadline 7.0.1.3 Release Notes
   12.3 Deadline 7.0.2.3 Release Notes
   12.4 Deadline 7.0.3.0 Release Notes
   12.5 Deadline 7.1.0.35 Release Notes
CHAPTER ONE

INTRODUCTION
1.1 Overview
Deadline is a hassle-free administration and rendering toolkit for Windows, Linux, and Mac OSX based render farms.
It offers a world of flexibility and a wide range of management options for render farms of all sizes, and supports over
60 different rendering packages out of the box.
Deadline 7 is the latest version of Thinkbox Software’s scalable high-volume compute management solution. It features built-in VMX (Virtual Machine Extension) capabilities, which allow artists, architects and engineers to harness
resources in both public and private clouds.
In addition to enhanced cloud support, Deadline 7 expands support for the Jigsaw multi-region rendering feature,
which can now be accessed in 3ds Max, Maya, modo, and Rhino. Deadline 7 also includes an updated version of
Draft, Thinkbox’s lightweight compositing and video processing plug-in designed to automate typical post-render
tasks such as image format conversion as well as the creation of animated videos and QuickTimes, contact sheets, and
watermark elements on exported images. Finally, Deadline 7 introduces a wealth of new features, enhancements, and
bug fixes.
Deadline 7.1 adds many new features to Deadline 7.0, including new slave metrics, better font synchronization, and
new application support. It also fixes some bugs that were discovered after Deadline 7.0 was released.
Note that a new 7.1 license is required to run this version. If you have a license for Deadline 7.0 or earlier, you will
need an updated license. In addition, the version of Draft that ships with Deadline 7.1 needs a new 1.3 license. If you
have a license for Draft 1.2 or earlier, you will need an updated license.
1.1.1 Components
The Deadline Render Farm Management System is made up of three components:
• A single Deadline Database
• A single Deadline Repository
• One or more Deadline Clients
The Database and Repository together act as a global system where all of Deadline’s data is stored. The Clients
(workstations and render nodes) then connect to this system to submit, render, and monitor jobs. It is important to
note that while the Database and Repository work together, they are still separate components, and therefore can be
installed on separate machines if desired.
1.1.2 Database
The Database is the global database component of the Deadline Render Farm Management System. It stores the jobs,
settings, and slave configurations. The Clients access the Database via a direct socket connection over the network. It
only needs to be installed on one machine (preferably a server), and does not require a license.
1.1.3 Repository
The Repository is the global file system component of the Deadline Render Farm Management System. It stores the
plugins, scripts, logs, and any auxiliary files (like scene files) that are submitted with the jobs. The Clients access the
Repository via a shared network path. It only needs to be installed on one machine (preferably a server), and does not
require a license.
1.1.4 Client
The Client should be installed on your render nodes, workstations, and any other machines you wish to participate in
submitting, rendering, or monitoring jobs. The Client consists of the following applications:
• Launcher: Acts as a launch point for the Deadline applications on workstations, and facilitates remote communication on render nodes.
• Monitor: An all-in-one application that artists can use to monitor their jobs and administrators can use to monitor
the farm.
• Slave: Controls the rendering applications on the render nodes.
• Command: A command line tool that can submit jobs to the farm and query for information about the farm.
• Pulse: An optional mini server application that performs maintenance operations on the farm, and manages
more advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering,
and the Web Service. If you choose to run Pulse, it only needs to be running on one machine.
• Balancer: An optional Cloud-controller application that can create and terminate Cloud instances based on
things like available jobs and budget settings.
Note that the Slave and Balancer applications are the only Client applications that require a license.
1.1.5 Jobs
A Deadline job typically represents one of the following:
• The rendering of an animation sequence from a 3D scene.
• The rendering of a frame sequence from a composition. It could represent a single write node, or multiple write
nodes with the same frame range.
• The generation of a Quicktime movie from an existing image sequence.
• A simulation.
These are just some common cases. Since a job simply represents some form of processing, a plug-in can be created
for Deadline to do almost anything you can think of.
Job Breakdown
A job can be broken down into one or more tasks, where each task is an individual unit that can be rendered by the
Slave application. Each task can then consist of a single frame or a sequence of frames. Here are some examples:
• When rendering an animation with 3ds Max where each frame can take hours to render, each frame can be
rendered as a separate task.
• When rendering a compositing job with After Effects where each frame can take seconds to render, each task
could consist of 20 frames.
• When rendering a Quicktime job to create a movie from an existing sequence of images, the job would consist
of a single task, and that task would consist of the entire image sequence.
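The breakdown above is a simple chunking computation. The function below is a hypothetical illustration (not part of the Deadline API) that splits an inclusive frame range into per-task frame lists:

```python
def split_into_tasks(start_frame, end_frame, chunk_size):
    """Split an inclusive frame range into per-task frame lists."""
    frames = range(start_frame, end_frame + 1)
    return [list(frames[i:i + chunk_size])
            for i in range(0, len(frames), chunk_size)]

# An After Effects style job: frames 1-100 in chunks of 20 -> 5 tasks.
tasks = split_into_tasks(1, 100, 20)
print(len(tasks))  # 5

# A Quicktime-style job: the whole sequence becomes one task.
print(len(split_into_tasks(1, 100, 100)))  # 1
```

A 3ds Max style job with hour-long frames would use a chunk size of 1, giving one frame per task.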
Job Scheduling
Use numeric job priorities, machine groups and pools, and job-specific machine lists to explicitly control distribution
of rendering resources among multiple departments. Limits allow you to handle both limited license plug-ins and
render packages, while job dependencies and scheduling allow you to control when your jobs will begin rendering.
The Slave applications are fully responsible for figuring out which job they should render next, and they do this by
connecting directly to the Database. In other words, there is no central server application that controls which jobs the
Slaves are working on. The benefit to this is that as long as your Database and Repository are online, Deadline will be
fully operational.
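Because each Slave chooses its own next job, scheduling reduces to an ordering query over the shared job data. A minimal sketch of that decision (assuming first-in, first-out within priority, and ignoring pools, groups, limits, and machine lists) might look like:

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    priority: int       # higher priority renders first
    submit_time: float  # earlier submission breaks priority ties

def next_job(queued_jobs):
    """Pick the job a slave would dequeue next: highest priority, then oldest."""
    if not queued_jobs:
        return None
    return min(queued_jobs, key=lambda j: (-j.priority, j.submit_time))

jobs = [Job("comp", 50, 2.0), Job("sim", 80, 3.0), Job("anim", 80, 1.0)]
print(next_job(jobs).name)  # anim: tied for top priority, submitted first
```

In Deadline itself this query runs against the Database rather than an in-memory list, which is why the farm keeps scheduling as long as the Database is reachable.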
1.2 Feature Set
1.2.1 Rock-steady Operation
Deadline’s unique architecture removes the need for a centralized manager application by using a highly-scalable
database and basic file sharing to manage the farm. As long as your Database and File Server are running, Deadline is
running.
1.2.2 Intuitive User Interface
Built with your creativity in mind, Deadline’s User Interface has evolved in response to extensive feedback from artists.
The flexible and intuitive interface provides a unified experience to artists and administrators across all platforms.
For job submission, Deadline offers integrated submission scripts for 3ds Max, After Effects, Blender, Cinema 4D,
Clarisse iFX, Composite, Fusion, Generation, Hiero, Houdini, Lightwave, Maya, Messiah, modo, Nuke, RealFlow,
Rhino, SketchUp 3D, Softimage, and Vue, providing a comfortable native environment for cross-application tasks.
1.2.3 Supported Software
Deadline supports over 60 different rendering packages out of the box. See the Supported Software page in the
Deadline documentation for more information.
1.2.4 Customizable and Scriptable
With its Python based plug-in API, studios can customize the out of the box plug-ins and scripts to suit their individual
pipelines, or create custom plug-ins to support in-house applications. Event plug-ins can be created to trigger events
like updating pipeline tools when jobs are submitted or finish rendering, and Cloud plug-ins can be created to control
VMs in public and private Cloud providers. Finally, job scripts can be created to setup custom dependencies, as well
as perform operations when a job starts, when a job finishes, and before and after each task is rendered.
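Event plug-ins follow a listener pattern: Deadline invokes a callback on the plug-in when something happens on the farm. The self-contained sketch below imitates that shape with a hypothetical dispatcher and listener class; it is not the real Deadline event API, whose base classes and entry points are covered in the Scripting chapter.

```python
class FarmEventListener:
    """Hypothetical base class; real plug-ins derive from Deadline's own base."""
    def on_job_submitted(self, job_name):
        pass

class PipelineNotifier(FarmEventListener):
    """Example listener that records submissions, standing in for a pipeline hook."""
    def __init__(self):
        self.log = []
    def on_job_submitted(self, job_name):
        # A real plug-in might create a version in a production tracker here.
        self.log.append(f"submitted: {job_name}")

def dispatch_job_submitted(listeners, job_name):
    """Stand-in for Deadline firing the 'job submitted' event to all plug-ins."""
    for listener in listeners:
        listener.on_job_submitted(job_name)

notifier = PipelineNotifier()
dispatch_job_submitted([notifier], "shot_010_lighting")
print(notifier.log[0])  # submitted: shot_010_lighting
```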
1.2.5 Flexible Job Scheduling
Use numeric job priorities, machine groups and pools, and job-specific machine lists to explicitly control distribution
of rendering resources among multiple departments. Limits allow you to handle both limited license plug-ins and
render packages, while job, asset, and script based dependencies allow you to control when your jobs will begin
rendering. Stick with the default First-in, First-out scheduling logic, or switch to a Balanced or Weighted system.
Launch and configure an arbitrary number of Slaves on a single machine. Each Slave instance can be given a unique
name, and can be assigned its own list of pools and groups, which allows Slaves to work on separate jobs. A single
high performance machine can process multiple 3D, compositing, and simulation jobs simultaneously. Slave instances
running on the same machine will share a single Deadline license.
1.2.6 Notifications
Deadline can be configured to notify users of job completion or failure through an automatic e-mail notification or a
popup message on the users’ machine.
Administrators can also configure Deadline to notify them with information about Power Management, stalled Slaves,
licensing issues, and other issues that may arise on the farm.
1.2.7 Statistics Gathering
Deadline automatically stores job and render farm statistics in the Database. Statistics can be viewed from the Monitor,
or retrieved from the Database by custom pipeline tools.
1.2.8 Shotgun and ftrack Integration
Deadline integrates with Shotgun to enable a seamless render and review data flow. When a render job is submitted,
a version is automatically created in Shotgun with key metadata. When the render is complete, Shotgun is updated
with a thumbnail image, paths to frames, render stats, and playback links. Deadline can also automatically upload a
movie and/or a filmstrip when the render is complete. Shotgun then dispatches targeted notifications with links back
to the work. Studios can view versions in various contexts, create reports, and organize work into playlists for review
sessions where they can quickly take notes with the Shotgun Note App.
The Deadline/FTrack integration enables a seamless render and review data flow. When Deadline starts a render, an
Asset Version is automatically created within FTrack using key metadata. When the render is complete, Deadline
automatically updates the created Version appropriately – a thumbnail image is uploaded, components are created
from the Job’s output paths (taking advantage of FTrack’s location plugins), and the Version is flagged for Review. In
doing so, Deadline provides a seamless transition from Job Submission to Review process, without artists needing to
monitor their renders.
1.2.9 Draft
Draft is a tool that provides simple compositing functionality. It is implemented as a Python library, which exposes
functionality for use in python scripts. Draft is designed to be tightly integrated with Deadline, but it can also be used
as a standalone tool.
Using the Draft plugin for Deadline, artists can automatically perform simple compositing operations on rendered
frames after a render job finishes. They can also convert them to a different image format, or generate Quicktimes for
dailies.
Active Deadline subscribers are entitled to Draft licenses at no additional cost. Active Deadline subscribers can request
a Draft license by emailing sales@thinkboxsoftware.com.
1.2.10 QuickTime Support
Install QuickTime on your slaves to create QuickTime movies from your own rendered frames.
1.2.11 Jigsaw and Tile Rendering
Jigsaw is available for 3ds Max, Maya, modo, and Rhino, and can be used to split up large frames into arbitrarily sized
tiles and distribute them over your render farm. When the tiles are finished rendering, they are automatically assembled
into the final image using Draft. Specific tiles can be re-rendered and automatically composited on top of the original
image.
Regular tile rendering, which supports fixed tile sizes only, is still supported as well, and is available for 3ds Max,
Maya, modo, Rendition, Rhino, and Softimage.
1.2.12 Easy Installation and Upgrade Deployment
Deadline has gone through rigorous analysis to make the installation and configuration process smooth and efficient. A
detailed document provides easy, step-by-step instructions explaining the various components that will be installed. In
addition, Deadline has the ability to auto-upgrade the whole render farm from a centralized deployment - an incredible
time-saver for large render farms.
Auto Configuration allows studios to efficiently increase the size of their farm by removing the need to configure each
new Slave individually. The Repository Path, License Server, and additional settings can be configured in a single
location, and broadcast to the slaves when they start up.
1.2.13 Slave Scheduling and Idle Detection
Start and stop the slave based on the time of day to allow workstations to join the render farm overnight. Alternatively,
start the slave if the machine has been idle for a certain amount of time, and stop it when the machine is in use again.
Other criteria like CPU usage, memory usage, and running processes can also be checked before starting the slave.
A warning message can be displayed before the slave starts, allowing an artist to delay the launch if they are still
using the machine.
Deadline User Manual, Release 7.1.0.35
1.2.14 Local Slave Controls
Artists can monitor and control the slave application running on their workstation, which is useful if the slave is
running as a service. Override the Idle Detection settings for your slave, or change the slave’s Job Dequeuing Mode
to control if the slave should render all jobs, jobs submitted from the artist’s machine, or jobs submitted by specific
users.
1.2.15 Remote Control and Farm Administration
Stream the log from a Slave in real time, or start, stop, and restart Slave instances (as well as the remote machine on
which it is running) remotely from within the Monitor. In addition, execute arbitrary command lines (applications,
command line operations or batch files) on a single machine or a group of remote machines to roll out software or install updates.
In addition, Deadline integrates seamlessly with VNC, Remote Desktop Connection, Apple Remote Desktop, and
Radmin using custom scripts. These scripts can be modified or new scripts can be created to support other remote
access software.
1.2.16 Access Control and Auditing
While full access is granted for all users to modify their own jobs, the User Group Management System prevents users
from inadvertently disrupting other jobs, and allows Administrators to configure the types of actions available to each
user group. An optional password protected Super User mode allows for global network administration.
Any command that affects a job or Slave is logged along with the originating user name and machine. This allows
everyone, including project managers and supervisors, to track changes and troubleshoot issues with confidence. It
also encourages responsibility and cooperation on the part of all users.
1.2.17 Reduced Energy Footprint
Save on energy consumption, power and cooling costs with Power Management, a feature that shuts down idle machines and starts them back up when needed. This feature is available for render farms with machines that support
Wake-on-LAN.
1.3 Supported Software
Deadline offers extensive out of the box support for third party applications, as well as an Application Plugin API and
Event Plugin API for custom plugin development. The following applications (and associated renderers) are supported
out of the box.
1.3.1 3ds Max
Highlighted Features
• Supports Versions 2010 to 2016
• 3ds Max and 3ds Max Design
• Integrated Submission
• RPManager Submission
• Keeps Scene In Memory
• Tile Rendering
• Jigsaw Support
• Interactive VRay Distributed Rendering
• Interactive Corona Distributed Rendering
• Offload VRay Distributed Rendering
• Offload Mental Ray Distributed Rendering
• Render To Texture Support
• Maxscript Jobs
• Scene States/Sub-States
• Custom Sanity Check
• Local Rendering
• Sticky/Default Settings Configuration
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
• Path Mapping Of Pre/Post Script Paths
• Path Mapping Of Path Config File Path
Supported Renderers
• Brazil r/s
• Corona
• finalRender
• finalToon
• Krakatoa
• Maxwell
• NVIDIA iray
• NVIDIA Mental Ray
• Quicksilver
• RenderPipe
• Scanline
• VRay
Documentation: 3ds Max Documentation, 3ds Command Documentation
1.3.2 After Effects
Highlighted Features
• Supports Versions CS3 to CS6 and CC to CC2014
• Integrated Submission
• Local Rendering
• Multi-Machine Rendering
• Submit Layers As Separate Jobs
• Custom Sanity Check
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output Path
• Path Mapping Of Scene File Contents (.aepx format only)
Documentation: After Effects Documentation
1.3.3 Anime Studio
Highlighted Features
• Supports Versions 8 to 11
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output Path
Documentation: Anime Studio Documentation
1.3.4 Arion Standalone
Highlighted Features
• Supports Version 2 and Later
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output Path
• Path Mapping Of Scene File Contents
Documentation: Arion Standalone Documentation
1.3.5 Arnold Standalone
Highlighted Features
• Supports the Pre-Release Beta and Version 1
• Local Rendering
• Shotgun Support
• ftrack Support
• Path Mapping Of Input File Paths
• Path Mapping Of Output Path
• Path Mapping Of Plugin Folder Paths
Documentation: Arnold Standalone Documentation
1.3.6 AutoCAD
Highlighted Features
• Supports AutoCAD 2015
• Local Rendering
• Plotting
• Exporting
• Shotgun Support
• ftrack Support
• Draft Support
Documentation: AutoCAD Documentation
1.3.7 Blender
Highlighted Features
• Supports Version 2.5 and Later
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output Path
Supported Renderers
• All
Documentation: Blender Documentation
1.3.8 Cinema 4D
Highlighted Features
• Supports Versions 12 to 16
• Integrated Submission
• Local Rendering
• Automatic Scene Exporting
• Team Render Support
• Custom Sanity Check
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output Path
Supported Renderers
• All
Documentation: Cinema 4D Documentation, Cinema 4D Team Render Documentation
1.3.9 Clarisse iFX
Highlighted Features
• Integrated Submission
• Automatic Render Archiving
• Path Mapping Of Scene File Path
• Path Mapping Of Config File Path
• Path Mapping Of Module Paths
• Path Mapping Of Search Paths
Documentation: Clarisse iFX Documentation
1.3.10 Combustion
Highlighted Features
• Supports Versions 4 and 2008
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Scene File Contents
Documentation: Combustion Documentation
1.3.11 Command Line
Highlighted Features
• Run Arbitrary Command Line Jobs
• Run The Same Command For Different Frames
• Run Different Commands For Different Tasks
• Path Mapping Of Executable File Path
• Path Mapping Of Arguments
Documentation: Command Line Documentation, Command Script Documentation
1.3.12 Composite
Highlighted Features
• Supports Versions 2010 to 2016
• Integrated Submission
• Shotgun Support
• ftrack Support
Documentation: Composite Documentation
1.3.13 Corona Standalone
Highlighted Features
• Override number of passes and render time during submission
• Specify multiple configuration files to use when rendering
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
• Path Mapping Of Config File Paths
Documentation: Corona Standalone Documentation
1.3.14 Corona Distributed Rendering
Highlighted Features
• Submit DR Jobs to Reserve Machines
• Interactive Distributed Rendering
• Existing Server Process Handling
• Slave Auto Session Timeout Controls
Supported Applications
• 3ds Max (fully integrated)
• DR Server (server launching only)
Documentation: Corona Distributed Rendering Documentation
1.3.15 CSiBridge
Highlighted Features
• Submit Solver, Analysis and Reporting jobs
• Cleanup Options to Optimize Data Size
• Optionally Perform Design after Analysis
• Optional Automatic Compression of Output
Documentation: CSiBridge Documentation
1.3.16 CSiETABS
Highlighted Features
• Submit Solver, Analysis and Reporting jobs
• Cleanup Options to Optimize Data Size
• Optional Automatic Compression of Output
Documentation: CSiETABS Documentation
1.3.17 CSiSAFE
Highlighted Features
• Submit Solver, Analysis and Reporting jobs
• Cleanup Options to Optimize Data Size
• Optionally Export to external Database
• Optional Automatic Compression of Output
Documentation: CSiSAFE Documentation
1.3.18 CSiSAP2000
Highlighted Features
• Submit Solver, Analysis and Reporting jobs
• Cleanup Options to Optimize Data Size
• Optionally Perform Design after Analysis
• Optional Automatic Compression of Output
Documentation: CSiSAP2000 Documentation
1.3.19 DJV
Highlighted Features
• Image/Movie Type Conversion
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Input File Path
• Path Mapping Of Output File Path
• Path Mapping Of Slate Input Path
Documentation: DJV Documentation
1.3.20 Draft
Highlighted Features
• Deep Integration With Deadline
• Create Movies From Rendered Images
• Perform Other Image Processing
• Shotgun Support
• ftrack Support
• Path Mapping Of Template File Path
• Path Mapping Of Template Arguments
Documentation: Draft Documentation, Draft Event Documentation
1.3.21 EnergyPlus
Highlighted Features
• Off-load US Gov. Energy Analysis Jobs
• Optional Weather EPW Files
• Multithreading/DEBUG Options
• Post-Processing Options
• Optional Automatic Compression of Output
Documentation: EnergyPlus Documentation
1.3.22 FFmpeg
Highlighted Features
• Up To 10 Input Files or Sequences
• Path Mapping Of Input File Paths
• Path Mapping Of Output File Path
• Path Mapping Of Video Preset File Path
• Path Mapping Of Audio Preset File Path
• Path Mapping Of Subtitle Preset File Path
Documentation: FFmpeg Documentation
1.3.23 ftrack
Highlighted Features
• Create new Asset Versions on job submission
• Update Version status on job completion
• Automatic thumbnail generation and upload
• Automatic component upload
Documentation: ftrack Event Documentation
1.3.24 Fusion
Highlighted Features
• Supports Versions 5 to 7
• Integrated Submission
• Keeps Scene In Memory
• Custom Sanity Check
• Quicktime Generation
• Shotgun Support
• ftrack Support
• Draft Support
Documentation: Fusion Documentation, Fusion Quicktime Documentation
1.3.25 Generation
Highlighted Features
• Integrated Submission
• Submit Comp Jobs To Fusion
Documentation: Generation Documentation
1.3.26 Hiero
Highlighted Features
• Integrated Submission
• Submit Transcoding Jobs To Nuke
Documentation: Hiero Documentation
1.3.27 Houdini
Highlighted Features
• Supports Versions 9 to 14
• Integrated Submission
• Submit ROPs as Separate Jobs
• Submit Wedge ROPs as Separate Jobs
• IFD Export Jobs
• Custom Sanity Check
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
• Path Mapping Of Scene File Contents
• Path Mapping Of IFD File Path
Supported Renderers
• All
Documentation: Houdini Documentation
1.3.28 Lightwave
Highlighted Features
• Supports Versions 8 to 11 and 2015
• FPrime Rendering
• Integrated Submission
• Keeps Scene In Memory
• Custom Sanity Check
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Config Folder Path
• Path Mapping Of Content Folder Path
• Path Mapping Of Content File Contents
Supported Renderers
• All
Documentation: Lightwave Documentation
1.3.29 LuxRender
Highlighted Features
• Path Mapping Of Scene File Path
Documentation: LuxRender Documentation
1.3.30 LuxSlave
Highlighted Features
• Submit Luxconsole Jobs to Reserve Machines
• Interactive Distributed Rendering
• Existing Slave Process Handling
• Slave Auto Session Timeout Controls
Documentation: LuxSlave Documentation
1.3.31 Mantra Standalone
Highlighted Features
• Supports Versions 7 to 13
• Shotgun Support
• ftrack Support
• Path Mapping Of IFD File Path
• Path Mapping Of Output File Path
• Path Mapping Of IFD File Contents
Documentation: Mantra Standalone Documentation
1.3.32 Maxwell
Highlighted Features
• Supports Versions 2 and 3
• Cooperative Rendering
• Automatic MXI Merging
• Local Rendering
• Resume Rendering from MXI Files
• Override Time and Sampling Level Values
• Override Extra Sampling Values
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
• Path Mapping Of MXI File Path
• Path Mapping Of Output File Path
Documentation: Maxwell Documentation
1.3.33 Maya
Highlighted Features
• Supports Versions 2010 to 2016
• Integrated Submission
• Keeps Scene In Memory
• Tile Rendering
• Jigsaw Support
• VRay Distributed Rendering
• Local Rendering
• Submit Layers As Separate Jobs
• Submit Cameras As Separate Jobs
• Mental Ray Export Jobs
• VRay Export Jobs
• Renderman Export Jobs
• Arnold Export Jobs
• Melscript/Python Script Jobs
• Custom Sanity Check
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output Folder Path
• Path Mapping Of Project Folder Path
• Path Mapping Of Scene File Contents (.ma format only)
Supported Renderers
• 3Delight
• Arnold
• Caustic Visualizer
• Final Render
• Gelato
• Krakatoa
• Maxwell
• MayaSoftware
• MayaHardware
• MayaVector
• Mental Ray
• Octane
• Redshift
• Renderman
• Renderman RIS
• Turtle
• VRay
Documentation: Maya Documentation
1.3.34 Media Encoder
Highlighted Features
• Local Rendering
• Shotgun Support
• ftrack Support
• Path Mapping Of Input File Path
• Path Mapping Of Output File Path
Documentation: Media Encoder Documentation
1.3.35 Mental Ray Standalone
Highlighted Features
• Local Rendering
• Shotgun Support
• ftrack Support
• Path Mapping Of Input File Path
• Path Mapping Of Output File Path
Documentation: Mental Ray Standalone Documentation
1.3.36 Messiah
Highlighted Features
• Integrated Submission
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output Folder Path
• Path Mapping Of Content Folder Path
Documentation: Messiah Documentation
1.3.37 MetaFuze
Highlighted Features
• Batch Folder Submission
• Path Mapping Of Scene File Path
Documentation: MetaFuze Documentation
1.3.38 MetaRender
Highlighted Features
• Path Mapping Of Input File Path
• Path Mapping Of Output File Path
Documentation: MetaRender Documentation
1.3.39 MicroStation
Highlighted Features
• Supports MicroStation v8i SS3
• Integrated Submission
• Keeps Design File in Memory
• Animation Renders
• Single View Renders
• Batched View Renders
• Export to modo scene file
• Export to DWG / DXF
• Export to ACIS SAT
• Export to flat DGN
• Export visible edges
• Print jobs
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
Supported Renderers
• Luxology (modo)
• All built-in renderers
Documentation: MicroStation Documentation
1.3.40 modo
Highlighted Features
• Supports Versions 3xx to 8xx
• Integrated Submission
• Keeps Scene In Memory
• Modo Distributed Rendering
• Tile Rendering
• Jigsaw Support
• Pass Groups Support
• Submit Pass Group As Separate Jobs
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
• Path Mapping Of Scene File Contents
Supported Renderers
• modo’s default renderer
• VRay
Documentation: modo Documentation
1.3.41 Naiad
Highlighted Features
• Simulation Jobs
• EMP to PRT Conversion Jobs
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
• Path Mapping Of EMP File Path
Documentation: Naiad Documentation
1.3.42 Natron
Highlighted Features
• Supports Versions 0.9 to 1.0
• Specify Writer Node to Render
• Shotgun Support
• ftrack Support
• Path Mapping Of Project File Path
• Path Mapping Of Project File Contents
Documentation: Natron Documentation
1.3.43 Nuke
Highlighted Features
• Supports Versions 6 to 9
• Integrated Submission
• Keeps Scene In Memory
• Submit Write Nodes As Separate Jobs
• Submit Write Nodes in Precomp Nodes
• Specify Views to Render
• Render Using Proxy Mode
• Nuke Studio Support
• Studio Frame Server distributed rendering
• Studio Sequence Submission
• Custom Sanity Check
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Scene File Contents
Documentation: Nuke Documentation
1.3.44 Octane Standalone
Highlighted Features
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
Documentation: Octane Standalone Documentation
1.3.45 PRMan (Renderman Pro Server)
Highlighted Features
• Shotgun Support
• ftrack Support
• Path Mapping Of Input File Path
• Path Mapping Of Working Directory Path
Documentation: PRMan Documentation
1.3.46 Puppet
Highlighted Features
• Sync applications and plugins across render nodes
• Automatically sync when render nodes are idle
Documentation: Puppet Event Documentation
1.3.47 Python
Highlighted Features
• Supports Versions 2.3 to 2.7 and 3.0 to 3.2
• Submit Python Scripts as Jobs
• Path Mapping Of Script File Path
• Path Mapping Of Script Arguments
Documentation: Python Documentation
1.3.48 Quicktime
Highlighted Features
• Generate Quicktime Movies from Images
• Shotgun Support
• ftrack Support
• Path Mapping Of Input File Path
• Path Mapping Of Output File Path
• Path Mapping Of Audio File Path
Documentation: Quicktime Documentation
1.3.49 RealFlow
Highlighted Features
• Supports Versions 4 to 5, and 2012 to 2014
• Integrated Submission
• Submit IDOCs as Separate Jobs
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
Documentation: RealFlow Documentation
1.3.50 REDLine
Highlighted Features
• Path Mapping Of Scene File Path
• Path Mapping Of Output Folder Path
• Path Mapping Of RSX File Path
Documentation: REDLine Documentation
1.3.51 Renderman (RIB)
Note that while this plugin supports PRMan, it is recommended that you use PRMan’s dedicated plugin instead if you
are using that renderer.
Highlighted Features
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Input File Path
Supported Renderers
• 3Delight
• AIR
• Aqsis
• BMRT
• Entropy
• PRMan
• Pixie
• RenderDotC
• RenderPipe
Documentation: Renderman Documentation
1.3.52 Rendition
Highlighted Features
• Tile Rendering
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
Documentation: Rendition Documentation
1.3.53 Rhino
Highlighted Features
• Supports Versions 4 and 5
• Integrated Submission
• Render Bongo Animations
• Shotgun Support
• ftrack Support
• Draft Support
• Tile Rendering
• Jigsaw Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
Supported Renderers
• Brazil r/s
• Flamingo Raytrace
• Flamingo Photometric
• Maxwell
• Penguin
• Rhino
• TreeFrog
• VRay
Documentation: Rhino Documentation
1.3.54 RVIO
Highlighted Features
• Shotgun Support
• ftrack Support
• Path Mapping Of Input File Paths
• Path Mapping Of Audio File Paths
• Path Mapping Of Output File Path
Documentation: RVIO Documentation
1.3.55 Salt
Highlighted Features
• Sync applications and plugins across render nodes
• Automatically sync when render nodes are idle
Documentation: Salt Event Documentation
1.3.56 Shake
Highlighted Features
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
Documentation: Shake Documentation
1.3.57 Shotgun
Highlighted Features
• Create new Versions on job submission
• Update Version status on job completion
• Automatic thumbnail generation and upload
• Automatic movie generation and upload
• Automatic film strip generation and upload
Documentation: Shotgun Event Documentation
1.3.58 SketchUp
Highlighted Features
• Supports Versions 7 to 8 and 2013 to 2015
• Integrated Submission
• Export 3D Models
• Export 2D Images
• Export 2D Image Sequences
• Path Mapping Of Scene File Path
• Path Mapping Of Export Directory Path
Supported Renderers
• All
Documentation: SketchUp Documentation
1.3.59 Softimage
Highlighted Features
• Supports Versions 2010 to 2015
• Integrated Submission
• Keeps Scene In Memory
• Tile Rendering
• Local Rendering
• Submit Passes As Separate Jobs
• Fx Render Tree Jobs
• Shotgun Support
• ftrack Support
• Draft Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
• Path Mapping Of Workgroup Folder Path
Supported Renderers
• All
Documentation: Softimage Documentation
1.3.60 Terragen
Highlighted Features
• Supports Versions 2 to 3
• Local Rendering
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
Documentation: Terragen Documentation
1.3.61 VRay Distributed Rendering
Highlighted Features
• Submit Spawner Jobs to Reserve Machines
• Interactive Distributed Rendering
Supported Applications
• 3ds Max (fully integrated)
• Maya (fully integrated)
• Rhino (spawner launching only)
• SketchUp (spawner launching only)
• Softimage (fully integrated)
• VRay Standalone (spawner launching only)
Documentation: VRay Distributed Rendering Documentation
1.3.62 VRay Standalone
Highlighted Features
• VRIMG to EXR Conversion
• PLY to VRMESH Conversion
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
• Path Mapping Of Output File Path
• Path Mapping Of Scene File Contents
Documentation: VRay Standalone Documentation, Ply2Vrmesh Documentation, Vrimg2Exr Documentation
1.3.63 VRED
Highlighted Features
• Supports VRED 2015
• Local Rendering
• Single Frame and Animation Rendering
• Sequencer and RenderQueue Supported
• Shotgun Support
• ftrack Support
• Draft Support
Documentation: VRED Documentation
1.3.64 VRED Cluster
Highlighted Features
• Supports VRED 2015
• Submit Cluster Jobs to Reserve Machines
Documentation: VRED Cluster Documentation
1.3.65 Vue
Highlighted Features
• Supports Versions 7 to 11 and 2014 to 2015
• Integrated Submission
• Shotgun Support
• ftrack Support
• Path Mapping Of Scene File Path
Documentation: Vue Documentation
1.3.66 xNormal
Highlighted Features
• Path Mapping Of Scene File Path
Documentation: xNormal Documentation
1.4 Render Farm Considerations
The following should be taken into consideration before installing Deadline.
1.4.1 Rendering Software and Licensing
It is recommended that the applications you plan to use for rendering (e.g. 3ds Max, Maya) be installed
on all of your render nodes. It is preferable that you install an application to the same location on each machine,
because this makes configuring the Deadline plugins easier. Note that some applications support being installed and
run from a network location, which can make setup and configuration easier. Refer to your rendering application’s
documentation to see if this is supported.
In addition, it is recommended that all licensing that your rendering applications require be set up before attempting to
render on your network. Deadline doesn't handle the licensing of third-party rendering applications, so you should refer
to your application’s documentation or contact its support team if you run into issues with licensing.
1.4.2 Store Assets On The Network
It is recommended that all assets (e.g. scenes, footage, textures) used by your render jobs be placed on a network
share (preferably a server), which can be accessed via a shared path or a mapped network drive. This is important for
two reasons:
• It ensures that all the slaves in your render farm have access to your asset files.
• It ensures that the slaves use the same version of the asset files that are used by your job.
Note that you can optionally submit the scene file with the job. This results in the scene file being sent to the Repository
or an alternate location, and then copied locally to the Slave that renders it. If the scene file contains relative asset
paths, it is recommended to not submit the scene file with the job, as these relative paths will likely be broken when
the Slave renders the scene from its local location.
When rendering in a mixed OS environment, you can configure Deadline to swap paths based on the operating system
it is running on. The way this works is often specific to the rendering application that you are using, so please refer
to the Cross-Platform Rendering Considerations section for the plug-in that you are using for more information. You can
access plug-in specific documentation in the Plug-ins documentation.
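Conceptually, cross-platform path swapping replaces a per-OS prefix on each asset path. The following is an illustrative sketch of the idea, not Deadline's actual implementation; the server and mount names are made-up placeholders:

```python
def swap_path(path, mappings):
    """Replace the first matching path prefix, normalizing separators.

    mappings is a list of (source_prefix, destination_prefix) pairs, e.g.
    mapping a Windows UNC share to its Linux mount point.
    """
    normalized = path.replace("\\", "/")
    for src, dst in mappings:
        src_norm = src.replace("\\", "/")
        # Case-insensitive prefix match, since Windows paths are
        # case-insensitive.
        if normalized.lower().startswith(src_norm.lower()):
            return dst + normalized[len(src_norm):]
    return path  # no mapping matched; leave the path untouched

mappings = [(r"\\fileserver\projects", "/mnt/projects")]
print(swap_path(r"\\fileserver\projects\shotA\tex.png", mappings))
# -> /mnt/projects/shotA/tex.png
```

The real behavior depends on the plug-in in use, so treat this only as a mental model for what the Mapped Paths settings accomplish.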
1.4.3 Save Output Files To The Network
All output should be saved to a network share as well (preferably a server). This is important because it ensures that
all the slaves in your render farm have access to the output path.
When rendering in a mixed OS environment, you can configure Deadline to swap output paths based on the operating
system it is running on. The way this works is often specific to the rendering application that you are using, so please
refer to the Cross-Platform Rendering Considerations section for the plug-in that you are using for more information. You
can access plug-in specific documentation in the Plug-ins documentation.
1.4.4 Remote Administration
Deadline has a Remote Administration feature that can be enabled in the Client Setup section of the Repository
Options, which can be accessed from the Monitor by selecting Tools -> Configure Repository Options while in Super
User Mode. This feature allows you to control all the render nodes remotely from a single machine, including starting
and stopping the Slave application, and running arbitrary command line applications on each machine. However, this
feature can be a potential security risk if you are not behind a firewall. If this is the case, we recommend that you keep
this feature disabled.
1.4.5 Automatic Updates
Deadline has an Automatic Updates feature that can be enabled in the Client Setup section of the Repository Options,
which can be accessed from the Monitor by selecting Tools -> Configure Repository Options while in Super User
Mode. Enabling this feature makes minor Deadline upgrades easy, with little to no downtime. Refer to the Upgrading
Documentation for more information.
1.4.6 Set Up An SMTP Server for Emails
Deadline can use email to notify users when their jobs have succeeded or failed. Email can also be used to notify
system administrators of all sorts of events, like when slaves stall or when jobs fail. It is recommended that an SMTP
server be set up so that you can make use of these features.
You can configure the email notification settings in the Repository Options, which can be accessed from the Monitor
by selecting Tools -> Configure Repository Options while in Super User Mode.
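Before enabling notifications, it can be useful to confirm that the SMTP server is reachable. A minimal Python sketch for sending a test email is shown below; the host name and addresses are placeholders, not Deadline settings:

```python
import smtplib
from email.message import EmailMessage

# Hypothetical values; substitute your own SMTP host and addresses.
SMTP_HOST = "smtp.example.com"

msg = EmailMessage()
msg["Subject"] = "Deadline SMTP test"
msg["From"] = "deadline@example.com"
msg["To"] = "admin@example.com"
msg.set_content("If you received this, the farm's SMTP server is working.")

# Uncomment once the host above is reachable to actually send the message:
# with smtplib.SMTP(SMTP_HOST, 25) as server:
#     server.send_message(msg)

print(msg["Subject"])
```

If the test message arrives, the same host and port can be entered in the Repository Options email settings.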
1.4.7 Auto Login on Windows Render Nodes
If you’re not running the Slave as a service, it can be set to start automatically when the render node it is on starts up,
but this requires that the render node log in automatically. On Windows, this can be done by modifying the registry on
each render node.
These are the steps to set up automatic login on your render node:
1. Download the Registry Entry File For Auto Login from the Miscellaneous Deadline Downloads Page.
2. Edit the file to use the username and password you wish to use.
3. Login to the render node as the specified user, then double-click on the file to run it.
4. The next time you restart the machine, it should login automatically as the specified user.
By default, the Slaves are set to start automatically when the machine logs in. This setting, as well as others, can be
modified from the Launcher on each machine.
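For reference, the downloadable registry file sets the standard Windows automatic logon values under the Winlogon key. A sketch of what such a file contains is shown below; the user name, domain, and password are placeholders, and note that the password is stored in plain text, so weigh the security implications for your site:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
"AutoAdminLogon"="1"
"DefaultUserName"="renderuser"
"DefaultDomainName"="STUDIO"
"DefaultPassword"="hunter2"
```

The file provided on the Miscellaneous Deadline Downloads Page remains the recommended starting point; this fragment only illustrates the values it edits.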
1.4.8 App Nap on Mac OS X Render Nodes and Workstations
App Nap is a collection of new features in OS X Mavericks (10.9+) that helps conserve CPU energy use by “slowing
down” or stopping applications that cannot be seen, for example if they are behind another window or the screen has
been put to sleep. However, this can have an adverse effect on Deadline and/or the applications it is rendering with.
Because of this, we recommend disabling App Nap and screen power saving modes (if applicable) on render nodes
across the entire operating system by enabling the “Prevent App Nap” checkbox via right-click “Get Info” for each
application on each machine or by following these steps in terminal:
1. Open a terminal (the Terminal can be found in /Applications/Utilities).
2. Run the following command (sudo rights required), then restart the machine:
defaults write NSGlobalDomain NSAppSleepDisabled -bool YES
If you wish to re-enable App Nap, follow the steps above, but run the following command for (2) instead:
defaults delete NSGlobalDomain NSAppSleepDisabled
You can check the status of the setting (if it already exists on a machine) by running the following command, where “1”
means App Nap is disabled and “0” means it is enabled:
defaults read NSGlobalDomain NSAppSleepDisabled
If workstations are being used as render nodes, it is recommended to disable App Nap on them as well. However, if
workstations are simply being used to submit and monitor render jobs, then this shouldn’t be necessary.
On Macs which have built-in or connected external displays, once a screen saver has begun or the display has been put
to sleep by power management, Deadline as well as other rendering applications will be throttled down to conserve
energy, regardless of the per-app App Nap setting.
Finally, the machine that is running Pulse/Balancer should also have App Nap disabled, or at the very least, disabled for
the Pulse/Balancer applications. To disable App Nap for the Pulse/Balancer application only, right-click (or Command-click)
on the DeadlinePulse/DeadlineBalancer application in Finder, and select Get Info. Then in the General section,
check the “Prevent App Nap” box. If Pulse/Balancer is currently running, you will have to restart it for the changes to
take effect.
1.4.9 Disable WER on Windows
When an application crashes on Windows, the system holds the application open in memory and displays a series of
dialog boxes asking if you want to submit the error report to Microsoft. While that's handy for all sorts of reasons, if
there's no one there to click the dialog (for example, on a headless render node), Deadline will assume the application
is still running and wait indefinitely by default.
The registry fix below stops those dialogs from popping up on unattended render nodes, so that when an application
crashes, it actually exits like it should. This change is system-wide, but can be configured per user if you like by
changing the registry hive used (HKEY_CURRENT_USER versus HKEY_LOCAL_MACHINE).
Ensure you restart the machine after changing the registry setting, and it is always recommended to take a backup before
editing a machine's registry. Copy the code below into a file named “DisableCrashReporting.reg” and double-click this file
as a user with administrator privileges. Alternatively, you can manually add/edit the registry entry via “regedit.exe”, or
import it silently via the command line: “regedit.exe /s DisableCrashReporting.reg”.
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Windows Error Reporting]
"Disabled"=dword:00000001
For more information about the possible settings, see the MSDN article on WER Settings.
It’s also possible to just default to sending them if you like, or to store the crash dumps in a safe place if you’re a
developer.
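For reference, the per-user variant mentioned above swaps in the HKEY_CURRENT_USER hive. This is a sketch of the equivalent .reg file; the key path and value simply mirror the system-wide example:

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\Windows Error Reporting]
"Disabled"=dword:00000001
```

This only affects crashes in applications run by that user, so on a render node it must be applied for the account that the Slave renders under.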
1.4.10 Firewall, Anti-Virus & Security Considerations
Here is a checklist of items that should be considered by those responsible for deploying the Deadline Repository and
Client software.
Ensure you consider additional configuration requirements for any software/hardware firewall clients, network
switches, anti-virus software clients and Operating System specific security controls such as Windows UAC or
SELinux (Security-Enhanced), which may attempt to block Deadline communication.
It is recommended during initial setup & configuration to disable all firewalls, anti-virus software, etc., and test the
basic operation and functionality of Deadline. Once this has been verified as correct, slowly re-enable each piece of
software, re-testing and confirming that Deadline still operates correctly.
Windows UAC
Ensure Windows UAC is correctly configured to allow Deadline communication and the correct execution of the
Deadline applications.
Anti-Virus Software
Ensure Anti-Virus software does NOT block Deadline and allows Deadline executables to run normally on ALL
machines.
Deadline Executables
Allow Deadline executables to pass through any applicable Client Firewall. Ensure you consider all applicable policy
scopes (Windows - domain, private, public) and both inbound & outbound rules:
• [INSTALL_PATH] Windows executable / Mac OSX executable / Linux executable
• [INSTALL_PATH] deadlinecommand.exe / DeadlineCommand.app / deadlinecommand
• [INSTALL_PATH] deadlinecommandbg.exe / DeadlineCommandBG.app / deadlinecommandbg
• [INSTALL_PATH] deadlinelauncher.exe / DeadlineLauncher.app / deadlinelauncher
• [INSTALL_PATH] deadlinelauncherservice.exe (Windows Only)
• [INSTALL_PATH] deadlinemonitor.exe / DeadlineMonitor.app / deadlinemonitor
• [INSTALL_PATH] deadlineslave.exe / DeadlineSlave.app / deadlineslave
• [INSTALL_PATH] deadlinepulse.exe / DeadlinePulse.app / deadlinepulse
• [INSTALL_PATH] deadlinebalancer.exe / DeadlineBalancer.app / deadlinebalancer
Deadline’s default local client software [INSTALL_PATH] for each OS is as follows (where # is the Deadline version):
• Windows: “C:\Program Files\Thinkbox\Deadline#\bin”
• Mac OSX: “/Applications/Thinkbox/Deadline#/bin”
• Linux: “/opt/Thinkbox/Deadline#”
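On headless render nodes, these firewall exceptions can also be added from an elevated command prompt rather than the firewall UI. The following is a sketch using the standard netsh syntax, shown for the Slave executable on a default Deadline 7 Windows install; repeat it for the other executables, and adjust the path to match your [INSTALL_PATH]:

```
netsh advfirewall firewall add rule name="Deadline Slave (in)" dir=in action=allow program="C:\Program Files\Thinkbox\Deadline7\bin\deadlineslave.exe"
netsh advfirewall firewall add rule name="Deadline Slave (out)" dir=out action=allow program="C:\Program Files\Thinkbox\Deadline7\bin\deadlineslave.exe"
```

The rule names here are just labels chosen for this example; pick whatever naming convention suits your farm.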
Application Executables
Make sure you allow your application executables to pass through any applicable Client Firewall. Ensure you consider
all applicable policy scopes (Windows - domain, private, public) and both inbound & outbound rules. See the
3dsMax Firewall Exceptions documentation for application-specific details.
MongoDB Server & Deadline clients
Ensure you allow the MongoDB service daemon to pass through any firewall and network switch. Ensure you consider
all applicable policy scopes (Windows - domain, private, public) and both inbound & outbound rules:
• [INSTALL_PATH] Windows executable / Mac OSX executable / Linux executable
• [INSTALL_PATH] mongod.exe / mongod / mongod
Deadline’s default local database software [INSTALL_PATH] for each OS is as follows (where # is the Deadline
version):
• Windows: “C:\DeadlineDatabase#\mongo\application\bin”
• Mac OSX: “/Applications/Thinkbox/DeadlineDatabase#/mongo/application/bin”
• Linux: “/opt/Thinkbox/DeadlineDatabase#/mongo/application/bin”
Mono (Mac OSX / Linux Only)
Ensure the Mono executable is allowed to pass through any firewall / anti-virus software.
Port Configuration
Ensure that the machine(s) running MongoDB, the Deadline Repository, and Deadline Pulse/Balancer/Monitor/Slave
can ALL communicate with each other over your local and/or extended network on the following (default) TCP and
UDP ports.
Protocol  Port Number  Service                    Comment
--------  -----------  -------------------------  ------------------------------------------------------------------
UDP       17061        Pulse auto-configuration   Default UDP port - Pulse listens for broadcasts on the UDP port
TCP       17061        Pulse auto-configuration   Default TCP port - Pulse sends auto-config data over TCP
TCP       17062        Pulse                      Default TCP port - “Configure Repository Options” - “Pulse Settings” - “General”
TCP       27017        MongoDB                    -
TCP       28017        MongoDB Web API            Access the http web site (optional) for database information
TCP       8080         Pulse WebService           Default TCP port - “Configure Repository Options” - “Pulse Settings” - “WebService”
UDP       7            WoL (Wake-On-Lan)          Default UDP port - “Configure Repository Options” - “Wake On Lan Settings”
UDP       123          NTP                        -
TCP       25           SMTP                       For mail server to receive e-mail notifications from Slaves and Pulse
TCP       587          SMTP (submission)          -
TCP       465          SMTP SSL                   For sending notifications using SSL
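A quick way to verify that one machine can actually reach another on a given TCP port is a connectivity probe from a shell. The sketch below uses bash’s built-in /dev/tcp pseudo-device so no extra tools are required; “localhost” and 27017 are placeholders, so substitute your own host name and whichever port from the table you want to test:

```shell
# Probe a TCP port and report whether a connection could be opened.
# Works in bash via the /dev/tcp pseudo-device; falls back to "blocked
# or closed" if the connection attempt fails for any reason.
check_port() {
    if (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; then
        echo "port $2 on $1: reachable"
    else
        echo "port $2 on $1: blocked or closed"
    fi
}

# Placeholder host and port - replace with your database server and port.
check_port localhost 27017
```

UDP ports (Pulse auto-configuration, Wake-On-Lan, NTP) cannot be probed this way, since UDP is connectionless; for those, confirm the firewall rules directly on each machine.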
License Server
If necessary, ensure that the Thinkbox FLEXlm license file has been configured to use a specific TCP port, and that
this port has also been allowed through any required firewall or network switch. Please refer to the FLEXnet
Licensing Documentation.
External Web Service Access & Deadline Mobile
If external network access is required, please see the Web Service and Deadline Mobile documentation.
CHAPTER
TWO
INSTALLATION
2.1 System Requirements
This section covers the system requirements for all the Deadline components. It is also recommended to read through
the Render Farm Considerations documentation before proceeding with the installation.
For a more complete description of the Deadline components listed below, see the Deadline Overview documentation.
2.1.1 Database
Deadline uses MongoDB for the Database, and requires MongoDB 2.6.1 or later. The Repository installer can install the MongoDB database for you, or you can use an existing MongoDB installation, provided that it is running
MongoDB 2.6.1 or later.
The following operating systems are supported for the Database:
• Windows Server 2003 and later (64-bit)
• Linux (64-bit)
• Mac OS X 10.7 and later (64-bit)
These are the minimum recommended hardware requirements for a production Database:
• 64-bit Architecture
• 8 GB RAM
• 4 Cores
• RAID or SSD disks
• 20 GB disk space
Note that MongoDB performs best if all the data fits into RAM, and it has fast disk write speeds. In addition, larger
farms may have to scale up on RAM and Cores as necessary, or even look at Sharding their database. Finally, while
you can install MongoDB to a 32-bit system for testing, it has limitations and is not recommended for production. For
example, the database size will be limited to 2 gigabytes, and Journaling will be disabled. Without Journaling, it will
not be possible to repair the database if a crash corrupts the data. See the MongoDB FAQ for more information.
Windows
If you choose a non-Server Windows Operating System (Vista, 7, or 8) to host the database, you should be aware
that these operating systems have a TCP/IP connection limitation of 10 new connections per second. If your render
farm consists of more than 10 machines, it is very likely that you’ll hit this limitation every now and then (and the
odds continue to increase as the number of machines increases). This is a limitation of the operating systems, and isn’t
something that we can work around, so we recommend using a Server edition of Windows, or a different operating
system like Linux.
Linux
If you choose a Linux system to host the database, you will need to make sure the system resource limits are configured
properly to avoid connection issues. More details can be found in the Database and Repository Installation Guide.
Other Linux recommendations include:
• Do not run MongoDB on systems with Non-Uniform Memory Access (NUMA). It can cause a number of
operational problems, including slow performance or high system process usage.
• Install on a system with a minimum Linux kernel version of 2.6.36.
• Install on a system with Ext4 or XFS file systems.
• Turn off atime or relatime for the storage volume containing the database files, as it can impact performance.
• Do not use hugepages, as MongoDB performs better with normal virtual memory pages.
Mac OS X
If you choose a Mac OS X system to host the database, you will need to make sure the system resource limits are
configured properly to avoid connection issues. More details can be found in the Database and Repository Installation
Guide.
2.1.2 Repository
The Repository is just a collection of files and folders, so it can be installed to any type of share on any type of
operating system. Common Repository choices include:
• Windows Server
• Linux
• FreeBSD
While the Repository can be installed on any operating system, the Repository installer is only supported on the
following operating systems. To install on a different operating system, first create the network share on that
system, and then run the Repository installer on one of the systems below and choose the network share as the
installation location.
• Windows (32 and 64-bit)
– Windows XP and later (32 and 64-bit)
– Windows Server 2003 and later (32 and 64-bit)
• Linux (64-bit only)
– Ubuntu 12.04 and later
– Debian 7 and later
– Fedora 16 and later
– CentOS 6 and later
– RHEL 6 and later
• Mac OS X (64-bit only)
– 10.7 (OS X Lion) and later
If you choose a non-Server Windows Operating System (XP, Vista, 7, or 8), be aware that these operating systems
usually will not allow more than 10 incoming connections without purchasing additional user access licenses from
Microsoft. This means that if more than 10 machines (render nodes or workstations) connect to the Repository,
connections will be dropped, which could result in unexpected behavior. This is a limitation of the operating systems,
and isn’t something that we can work around, so we recommend using a Server edition of Windows, or a different
operating system like Linux or FreeBSD.
For hardware requirements, it mainly depends on whether you are planning to submit scene files and other auxiliary files
with your jobs. If you are, keep in mind that the Repository machine will need to serve out these files to the Client
machines, so you will want to treat it like another asset server when it comes to picking hardware. That being said, if
you already have an asset server, you could probably just install the Repository on it. If you are not submitting your
scene files with your jobs (because they are already stored in a network location), then you should be fine with a less
powerful machine.
2.1.3 Client
The Client can be installed on Windows, Linux, or Mac OS X. The requirements for today’s rendering applications go
far beyond the requirements of the Client, so if a machine is powerful enough to be used for rendering, it is more than
capable of running the Client applications.
If you choose to run Pulse or Balancer, and you wish to run it on the same machine as the Database and/or Repository,
you will have to install the Client on that machine as well.
The following operating systems are supported for the Client:
• Windows (32 and 64-bit)
– Windows XP and later (32 and 64-bit)
– Windows Server 2003 and later (32 and 64-bit)
• Linux (64-bit only)
– Ubuntu 12.04 and later
– Debian 7 and later
– Fedora 16 and later
– CentOS 6 and later
– RHEL 6 and later
• Mac OS X (64-bit only)
– 10.7 (OS X Lion) and later
Note that on Linux, the Deadline applications have dependencies on some libraries that are installed with the lsb
(Linux Standard Base) package. To ensure you have all the dependencies you need, we recommend installing the full
lsb package. In addition, the libX11 and libXext libraries must be installed on Linux for the Deadline applications to run, even
if running them with the -nogui flag. They’re required for the Idle Detection feature, among other things. To check if
libX11 and libXext are installed, open a Terminal and run the following commands. If they are installed, then the path
to the libraries will be printed out by these commands.
ldconfig -p | grep libX11
ldconfig -p | grep libXext
If any of these libraries are missing, then please contact your local system administrator to resolve this issue. Here is
an example assuming you have root access, using YUM to install them on your system:
sudo -s
yum install redhat-lsb
yum install libX11
yum install libXext
Note that if you are choosing a machine to run Pulse, you should be aware that non-Server editions of Windows
have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of more than
10 render nodes, it is very likely that you’ll hit this limitation every now and then (and the odds continue to increase
as the number of machines increases). This is a limitation of the operating systems, and isn’t something that we can
work around, so we recommend using a Server edition of Windows, or a different operating system like Linux.
2.1.4 License Server
Deadline requires Flexnet License Server version 11.12 or later, and the license server can be run on the following
operating systems:
• Windows (32 and 64-bit)
– Windows XP and later (32 and 64-bit)
– Windows Server 2003 and later (32 and 64-bit)
• Linux (64-bit only)
– Ubuntu 12.04 and later
– Debian 7 and later
– Fedora 16 and later
– CentOS 6 and later
– RHEL 6 and later
• Mac OS X (64-bit only)
– 10.7 (OS X Lion) and later
See the License Server Documentation for more information on the License Server requirements.
Note that if you choose a non-Server Windows Operating System (XP, Vista, 7, or 8), you should be aware that these
operating systems have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists
of more than 10 machines, it is very likely that you’ll hit this limitation every now and then (and the odds continue to
increase as the number of machines increases). This is a limitation of the operating systems, and isn’t something that
we can work around, so we recommend using a Server edition of Windows, or a different operating system like Linux.
2.2 Licensing
See the License Server Documentation for more information on installing and configuring the License Server.
2.3 Database and Repository Installation
2.3.1 Overview
Before proceeding with this installation, it is highly recommended to read through the Render Farm Considerations
documentation.
The Database is the global database component of the Deadline Render Farm Management System. It stores the jobs,
settings, and slave configurations. The Clients access the Database via a direct socket connection over the network.
It only needs to be installed on one machine (preferably a server), and does not require a license. Deadline uses
MongoDB for the Database.
The Repository is the global file system component of the Deadline Render Farm Management System. It stores the
plugins, scripts, logs, and any auxiliary files (like scene files) that are submitted with the jobs. The Clients access the
Repository via a shared network path. It only needs to be installed on one machine (preferably a server), and does not
require a license.
The Database and Repository together act as a global system where all of Deadline’s data is stored. The Clients
then connect to this system to submit, render, and monitor jobs. It is important to note that while the Database and
Repository work together, they are still separate components, and therefore can be installed on separate machines if
desired.
The Repository installer can install the MongoDB database for you, but you can also choose to connect to an existing
MongoDB installation.
2.3.2 Installation
While the Repository can be installed on any operating system, the Repository installer is only available for Windows,
Linux, and Mac OS X. However, the machine that you run the Repository installer on doesn’t have to be the same
machine you’re installing the Repository to. For example, if you have an existing share on a FreeBSD server or a NAS
system, you can run the Repository installer on Windows, Linux, or Mac OS X and choose that share as the install
location.
To install the Repository, simply run the appropriate installer for your operating system and follow the steps. This
procedure is identical for all operating systems. The Repository installer also supports silent installations.
When choosing the Installation Directory, you can choose either a local path on the current machine, or the path to an
existing network share. Note that if you choose a local path, you must ensure that path is shared on the network so that
the Clients can access it. Do not install over an existing installation unless it’s the same major version, or there
could be unexpected results.
If you’re installing over an existing Repository installation, all previous binaries, plug-ins, and scripts will be backed
up prior to being overwritten. After the installation is complete, you can find these backed up files in the Backup folder
in the Repository installation root. Note that installing over an existing repository is only supported for repairing a
damaged repository, or for performing a minor upgrade. Major upgrades require a fresh repository installation. See
the Upgrading or Downgrading Deadline Documentation for more information.
After choosing the installation directory, you will be asked to install the MongoDB Database, or connect to an existing
one. If you choose to install the MongoDB Database, you will be asked to choose an installation location and a port
number. It is highly recommended that you choose a local directory to install the Database.
Note that Deadline 7 requires a newer version of the MongoDB database application than the one shipped with
Deadline 6. However, this newer version is backward compatible with Deadline 6. So if you are installing the
MongoDB database application to a machine that already has a Deadline 6 database installed, you can just
install it over top of the existing Deadline 6 database installation.
Next, you need to specify the Database Settings so that the installer can set up the Database. These settings will also
be used by the Clients to connect to the database. The following are required:
• Database Server: The host name or the IP address of the machine that the MongoDB database is running on.
If desired, you can specify multiple entries and separate them with semicolons. There are a couple reasons to
specify multiple entries:
– You have machines on different subnets that need to access the database differently (e.g. machines in the
cloud might use a different host name than machines on the local network).
– Some machines need to resolve the database machine by its host name, and others need to use its IP
address.
Note that if there are IP addresses listed that cannot be resolved, the Deadline Command application
can run slower on Linux and OSX Clients, because it won’t exit until the connection attempts for those IP
addresses time out.
• Database Port: The port that the MongoDB database is listening on.
• Database Name: The name of the Database. If you are setting up a new Database, you can leave this as the
default. If you are connecting to an existing Database, make sure to enter the same name you used when you
initially set up the Database.
• Replica Set: If you set up your MongoDB database manually and it is part of a Replica Set, specify the Replica
Set Name here. If you don’t have a Replica Set, just leave this blank.
When you press Next, the installer will try to connect to the database using these settings to configure it. This can take
a minute or two. If an error occurs, you will be prompted with the error message. If the setup succeeds, you can then
proceed with the installation of the Repository.
Command Line or Silent Installation
The Repository installer can be run in command line mode or unattended mode on each operating system. Note though
that on Mac OS X, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is
inside the Mac Repository Installer package.
To run in command line mode, pass the “--mode text” command line option to the installer. For example, on Linux:
./DeadlineRepository-X.X.X.X-linux-x64-installer.run --mode text
To run in silent mode, pass the “--mode unattended” command line option to the installer. For example, on Windows:
DeadlineRepository-X.X.X.X-windows-installer.exe --mode unattended
To get a list of all available command line options, pass the “--help” command line option to the installer. For example,
on Mac OS X:
./DeadlineRepository-X.X.X.X-osx-installer.app/Contents/MacOS/installbuilder.sh --help
Note that there are a few Repository installer options that are only available from the command line, which you can
view when running with the “--help” option. These options include:
• --backuprepo: If enabled, many folders in the Repository will be backed up before overwriting them (this is
enabled by default).
• --dbauth: If enabled, Deadline will use the given user and password to connect to MongoDB (if authentication
is enabled on your database).
• --dbuser: The user name to connect to MongoDB if authentication is enabled.
• --dbpassword: The password to connect to MongoDB if authentication is enabled.
• --dbsplit: If enabled, the database collections will be split into separate databases to improve performance (this
is enabled by default).
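Putting these together, an unattended Repository install that connects to an authentication-enabled MongoDB might look like the sketch below. The user name and password are placeholders, and the exact value syntax expected by each flag should be confirmed by running the installer with --help:

```
./DeadlineRepository-X.X.X.X-linux-x64-installer.run --mode unattended \
    --dbauth true --dbuser deadlineuser --dbpassword secret
```

Keep in mind that passwords passed on the command line can appear in shell history and process listings, so treat this approach with care on shared machines.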
Database Config File
A file called config.conf is installed to the data directory in the database installation folder. This file is used to configure
the MongoDB database, and can be modified to add or change functionality. This is what you will typically see by
default:
#MongoDB config file

#where to log
systemLog:
  destination: file
  path: C:/DeadlineDatabase7/data/logs/log.txt
  quiet: true
  #verbosity: <integer>

#port for mongoDB to listen on
#uncomment the ipv6 and REST options below to enable them
net:
  port: 27070
  #ipv6: true
  #http:
  #  RESTInterfaceEnabled: true

#where to store the data
storage:
  dbPath: C:/DeadlineDatabase7/data

#enable sharding
#sharding:
#  clusterRole:
#  configDB:

#setup replica set with given replica set name
#replication:
#  replSetName:

#enable authentication
#security:
#  authorization: enabled
After making changes to this file, simply restart the mongod process for the changes to take effect. See the MongoDB
Configuration File Options for more information on the available options.
Manual Database Installation
The Repository installer installs MongoDB with the bare minimum settings required for Deadline to operate. Manually
installing the Database might be preferable for some because it gives you greater control over things like authentication,
and allows you to create sharded clusters or replica sets for backup.
If you wish to install MongoDB manually, you can download MongoDB from the MongoDB Downloads Page. Once
MongoDB is running, you can then run the Repository installer, and choose to connect to an existing MongoDB
Database. Here are some helpful links for manually installing the MongoDB database:
• Installing MongoDB
• Enabling Authentication
• Replication
• Sharding
MongoDB also has a management system called MMS. It’s a cloud service that makes it easy to provision, monitor,
backup, and scale your MongoDB database. Here are some helpful links for setting up and using MMS:
• Getting Started
• Add MongoDB Servers to MMS
• Install the Automation Agent
The Automation Agent mentioned above makes it possible to set up your MongoDB database from a web interface, and
easily configure which MongoDB servers are replica sets or shards. It also allows you to easily upgrade the version of
your MongoDB database. Here are some additional links for how you can use the Automation Agent:
• Deploy a Replica Set
• Deploy a Sharded Cluster
• Deploy a Standalone MongoDB Instance
• Change the MongoDB Version
Note though that as of this writing, the Automation Agent is only available for Linux and Mac OS X.
Database Resource Limits
Linux and Mac OS X systems impose a limit on the number of resources a process can use, and these limits can
affect the number of open connections to the database. It is important to be aware of these limits, and make sure they
are set appropriately to avoid unexpected behavior. Note that MongoDB will allocate 80% of the system limit for
connections, so if the system limit is 1024, the maximum number of connections will be 819.
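The 80% figure can be sanity-checked with quick shell arithmetic (a sketch, using the common default limit of 1024 mentioned above):

```shell
# MongoDB allocates 80% of the system's resource limit to connections.
limit=1024
max_connections=$((limit * 80 / 100))
echo "$max_connections"   # prints 819
```

The same arithmetic tells you what limit to set if you need a particular connection count: to allow N connections, the system limit must be at least N * 100 / 80.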
If you choose a Linux system to host the database, make sure the system limits are configured properly to avoid connection issues. See MongoDB’s Linux ulimit Settings documentation for more information, as well as the recommended
system limits to use.
You can check your current Linux/OSX ulimit settings in a terminal shell:
#overall ulimit settings on the machine
ulimit -a
#number of open files allowed
ulimit -n
MongoDB provides these Recommended ulimit Settings for optimal performance of your database. Note that you
must restart the Deadline Database daemon after changing these ulimit settings.
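On Linux, persistent limits are commonly raised in /etc/security/limits.conf. The entries below are a sketch: the “mongod” account name is an assumption (use whatever user actually runs your database), and the 64000 value should be confirmed against MongoDB’s recommended ulimit settings referenced above:

```
# /etc/security/limits.conf - raise limits for the account running mongod
mongod  soft  nofile  64000
mongod  hard  nofile  64000
mongod  soft  nproc   64000
mongod  hard  nproc   64000
```

Changes to limits.conf take effect on the next login session of that account, so restart the database daemon afterwards.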
If you choose a Mac OS X system to host the database, and you use the Repository installer to install the database,
the resource limits will be set to 1024. These limits can be adjusted later by manually editing the HardResourceLimits
and SoftResourceLimits values in /Library/LaunchDaemons/org.mongodb.mongod.plist after the Repository installer
has finished.
2.3.3 Open Firewall Ports
To ensure that the Deadline applications can communicate with MongoDB, you will need to update the firewall on
the machine that MongoDB is running on. You can either disable the firewall completely (assuming it operates in an
internal network), or you can open the port that you chose for the database to use during install. More information on
opening ports can be found below.
Windows
Open Windows Firewall with Advanced Security. Click on Inbound Rules in the left panel to view all inbound rules,
and then right-click on Inbound Rules and select New Rule to start the Inbound Rule Wizard. Select Port for the Rule
Type, and then click Next.
On the Protocol and Ports page, choose TCP, specify the port that you chose for the database during the
install, and then press Next. Then on the Action page, choose Allow The Connection and press Next.
On the Profile page, choose the networks that this rule applies to, and then press Next. Then on the Name page, specify
a name for the rule (for example, MongoDB Connection), and then press Finish.
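The same inbound rule can be created without the wizard from an elevated command prompt, which is convenient on headless machines. This is a sketch using the standard netsh syntax; replace 27070 with whichever port you chose during the install:

```
netsh advfirewall firewall add rule name="MongoDB Connection" dir=in action=allow protocol=TCP localport=27070
```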
Linux
On RedHat and CentOS, the following commands should allow incoming connections to the Mongo database if iptables are being used. Just make sure to specify the port that you chose for the database during the install.
sudo iptables -I INPUT 1 -p tcp --dport 27070 -j ACCEPT
sudo ip6tables -I INPUT 1 -p tcp --dport 27070 -j ACCEPT
Ubuntu has no firewall installed by default, and we have not yet tested Fedora Core’s FirewallD.
Mac OS X
Mac OS X has its firewall disabled by default, but if enabled, it is possible to open ports for specific applications. Open
up System Preferences, choose the Security & Privacy option, and click on the Firewall tab.
Press the Firewall Options button to open the firewall options. Press the [+] button and choose the path to the mongod
application, which can be found in the database installation folder in mongo/application/bin (for example, /Applications/Thinkbox/DeadlineDatabase7/mongo/application/bin/mongod). Then click OK to save your settings.
2.3.4 Sharing The Repository Folder
In general, the Repository must have open read and write permissions for Deadline to operate properly. This section
explains how to share your Repository folder and configure its permissions to ensure the Clients have full access.
Without full read/write access, the Client applications will not be able to function properly.
Note that this guide is for applying full read/write permissions to the entire Repository folder structure. For the more
advanced user, it is possible to enforce tighter restrictions on the Repository folders. Just make sure the Clients have
full read/write access to the following folders in the Repository. The rest must have at least read access.
• jobs: This is where job auxiliary files are copied to during submission.
• jobsArchived: This is where archived jobs are exported to.
• reports: This is where the physical log files for job and slave reports are saved to.
Windows
First, you need to configure the Repository folder permissions. Note that the images shown here are from Windows
XP, but the procedure is basically the same for any version of Windows.
• On the machine where the Repository is installed, navigate to the folder where it is installed using
Windows Explorer.
• Right-click on the Repository folder and select Properties from the menu.
• Select the Security tab.
• If there is already an Everyone item under Group or user names, you can skip the next two steps.
• Click on the Add button.
• In the resulting dialog, type Everyone and click OK.
• Select Everyone under Group or user names.
• Ensure that Modify, Read & Execute, List Folder Contents, Read, and Write are all checked under
the Allow column.
• Click on the OK button to save the settings.
Second, you need to share the Repository folder. Note that the images shown here are from Windows XP, but the
procedure is basically the same for any version of Windows.
• On the machine where the Repository is installed, navigate to the folder where it is installed using
Windows Explorer.
• Right-click on the Repository folder and select Properties from the menu. If you’re unable to see the
Sharing tab, you may need to disable Simple File Sharing in the Explorer Folder Options.
• Select the Sharing tab.
• Select the option to Share This Folder, then specify the share name.
• Click the Permissions button.
• Give Full Control to the Everyone user.
• Press OK on the Permissions dialog and then the Properties dialog.
Linux
Since the Clients expect full read and write access to the Repository, it’s recommended to use a single user account
to mount shares across all machines. It is possible to add particular users to a ‘deadline’ group, but you will need to
experiment with that on your own.
So for both of the sharing mechanisms we explain below, you’ll need to create a user and a group named ‘deadline’.
They don’t need a login or credentials, we just need to be able to set files to be owned by them and for their account to
show up in /etc/passwd. So, to do this use the useradd command.
sudo useradd -d /dev/null -c "Deadline Repository User" -M deadline
This should create a user named "deadline" with no home folder and a descriptive comment. The account login should also
be disabled (no password is set), meaning your standard users can't ssh or ftp into your file server using this account. Set a
password using sudo passwd deadline if you do need your users to log in as deadline using ftp or ssh.
Now add a group using:
sudo groupadd deadline
And finally, set the Repository to be owned by this new user and group:
sudo chown -R deadline:deadline /path/to/repository
sudo chmod -R 777 /path/to/repository
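As a quick sanity check (a sketch; substitute your actual Repository path), you can confirm the account exists and that the ownership and permissions took effect:

```shell
# Confirm the 'deadline' user and group now exist
getent passwd deadline
getent group deadline

# Confirm ownership (deadline:deadline) and mode (drwxrwxrwx) on the Repository
ls -ld /path/to/repository
```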
Now you're ready to set up your network sharing protocol. There are many ways this can be done; this guide just
covers a few of them.
Samba Share
This is an example entry in the /etc/samba/smb.conf file:
[DeadlineRepository]
path = /path/to/repository
writeable = Yes
guest ok = Yes
create mask = 0777
force create mode = 0777
force directory mode = 0777
unix extensions = No
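After editing smb.conf, it's worth validating the file and restarting Samba so the new share takes effect. This is a sketch; the init script name varies by distribution (smbd on Debian/Ubuntu, smb on Red Hat-based systems):

```shell
# Check smb.conf for syntax errors and print the parsed configuration
testparm -s

# Restart the Samba daemon so the new share is picked up
sudo /etc/init.d/smbd restart
```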
NFS Share
The simplest thing that could possibly work. Note that this is not the most secure thing that could possibly work:
For Linux and BSD, open up /etc/exports as an administrator, and make one new export:
/path/to/repository 192.168.2.0/24(rw,all_squash,insecure)
Breakdown of this command is as follows:
• /path/to/repository: The Repository folder to share. Change the path as necessary.
• 192.168.2.0/24: The IP range to allow. The zero is important for these ranges. You can also go by hostname if
you have reverse DNS, or use * to allow connections from any host.
• rw: Allow read/write for the repository, which is required for the Clients to operate properly.
• all_squash: Make every single person who connects to the Repository share map to the nobody:nogroup user
and group. This relieves a lot of permissions pain for new users at the cost of zero security. Files and folders
within your repository will be fully readable and writeable by whomever is able to connect to your NFS server.
The Clients require this, but it can also be achieved by creating a group and adding individual users into that
group. Many studios will only need all_squash as Deadline will keep track of who submits what jobs.
• insecure: Required for Mac OS X to mount nfs shares. It simply means that NFS doesn’t need to receive
requests on a port in the secure port range (a port number less than 1024).
Once that’s done, you may need to install an NFS server. To do so, open a terminal or your favourite package manager
to install one. For Ubuntu Server, type the following:
sudo apt-get install nfs-kernel-server
Then start up the server (for those living in an init.d world):
sudo /etc/init.d/nfs-kernel-server start
Any time you change the exports file, you’ll need to issue the same command, but replace ‘start’ with ‘reload’.
There is an excellent tutorial here as well: https://help.ubuntu.com/community/SettingUpNFSHowTo
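On the render-node side, the exported Repository then needs to be mounted. A hypothetical example, assuming the file server is at 192.168.2.10 and the mount point already exists (adjust both to your environment):

```shell
# One-off mount from a Linux render node
sudo mount -t nfs 192.168.2.10:/path/to/repository /mnt/DeadlineRepository

# Or make it permanent with an /etc/fstab entry:
# 192.168.2.10:/path/to/repository  /mnt/DeadlineRepository  nfs  rw  0  0
```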
Mac OS X
First, you need to configure the Repository folder permissions. Note that the images shown here are from Leopard
(10.5), but the procedure is basically the same for any version of Mac OS X.
• On the machine where the Repository is installed, navigate to the folder where it is installed using
Finder.
• Right-click on the Repository folder and select Get Info from the menu.
• Expand the Sharing & Permissions section, and unlock the settings if necessary.
• Give everyone Read & Write privileges.
• While probably not necessary, also give admin Read & Write privileges.
If you prefer to set the permissions from the Terminal, run the following commands:
$ chown -R nobody:nogroup /path/to/repository
$ chmod -R 777 /path/to/repository
Now you can share the folder. There are many ways this can be done; this guide just covers a few of them.
Using System Preferences
Note that the images shown here are from Leopard (10.5), but the procedure is basically the same for any version of
Mac OS X.
• Open System Preferences, and select the Sharing option.
• Make sure File Sharing is enabled, and then add the Repository folder to the list of shared folders.
• Under Users, give everyone Read & Write privileges.
• If sharing with Windows machines, press the Options button and make sure the “Share files and
folders using SMB (Windows)” is enabled.
Samba Share
Interestingly, Mac OS X uses Samba as well; Apple just does a good job of hiding it. To create a Samba share in Mac
OS X, paste this at the bottom of /etc/smb.conf:
[DeadlineRepository]
path = /path/to/repository
writeable = Yes
guest ok = Yes
create mask = 0777
force create mode = 0777
force directory mode = 0777
unix extensions = No
2.3.5 Uninstallation
The Repository installer creates an uninstaller in the folder that you installed the Repository to. To uninstall the
Repository, simply run the uninstaller and confirm that you want to proceed with the uninstallation.
Note that if you installed the Database with the Repository installer, it will be uninstalled as well. If you chose to
connect to a Database that you manually installed, the Database will be unaffected.
Command Line or Silent Uninstallation
The Repository uninstaller can be run in command line mode or unattended mode on each operating system.
To run in command line mode, pass the “--mode text” command line option to the installer. For example, on Linux:
./uninstall --mode text
To run in silent mode, pass the “--mode unattended” command line option to the installer. For example, on Windows:
uninstall.exe --mode unattended
To get a list of all available command line options, pass the “--help” command line option to the installer. For example,
on Mac OS X:
./uninstall --help
2.4 Client Installation
2.4.1 Overview
Before proceeding with this installation, it is highly recommended to read through the Render Farm Considerations
documentation.
This guide will walk you through the installation of the Client. At this point, you should already have the Database and
Repository installed. If you do not, please see the Database and Repository Installation documentation for installation
instructions.
The Client consists of the following applications:
• Launcher: Acts as a launch point for the Deadline applications on workstations, and facilitates remote communication on render nodes.
• Monitor: An all-in-one application that artists can use to monitor their jobs and administrators can use to monitor
the farm.
• Slave: Controls the rendering applications on the render nodes.
• Command: A command line tool that can submit jobs to the farm and query for information about the farm.
• Pulse: An optional mini server application that performs maintenance operations on the farm, and manages
more advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering,
and the Web Service. If you choose to run Pulse, it only needs to be running on one machine.
• Balancer: An optional Cloud-controller application that can create and terminate Cloud instances based on
things like available jobs and budget settings. If you choose to run Balancer, it only needs to be running on one
machine.
Note that the Slaves and the Balancer applications are the only Client applications that require a license.
2.4.2 Installing The Clients
The Client should be installed on your render nodes, workstations, and any other machines you wish to participate in
submitting, rendering, or monitoring jobs. The Slaves and the Balancer applications are the only Client applications
that require a license. Before you can configure the license for the Client, the license server must be running. See the
Licensing documentation for more information.
If you choose to run Pulse, you need to install the Client on the chosen machine. Note that if you wish to run it on the
same machine as the Database and/or Repository, you still have to install the Client on that machine.
There are Client installers for Windows, Linux, and Mac OS X. To install the Client, simply run the appropriate
installer for your operating system and follow the steps.
Windows
Start the installation process by double-clicking on the Windows Client Installer. The Windows Client installer also
supports silent installations with additional options.
Choose an installation location and press Next to continue.
Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:
• Repository Directory: This is the shared path to the Repository. If you are unable to browse to your Repository
shared path via a mapped drive in the install wizard, this is most likely due to Windows UAC elevation: even if
the currently logged in user has the network drive configured, that configuration is not available in the elevated
scope, because the installer is technically running as another user. This behaviour is handled by the OS, so
possible workarounds are to select the UNC path that the drive is mapped to instead, or to log on to the system
as the user account with elevated permissions (local administrator, for example) and then run the Client install
wizard.
• License Server: The license server entry should be in the format @SERVER, where SERVER is the host name
or IP address of the machine that the license server is running on. If you configured your license server to use a
specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you
are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave
this blank for now.
The following Launcher settings are available:
• Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher starts.
• Install Launcher As A Service: Enable this if you wish to install the Launcher as a service. The service
must run under an account that has network access. See the Windows Service documentation below for more
information.
After configuring the Client and Launcher settings, press Next to continue with the installation.
Linux
Note that on Linux, the Deadline applications depend on some libraries that are installed with the lsb
(Linux Standard Base) package. To ensure you have all the dependencies you need, we recommend installing the full
lsb package. In addition, the libX11 and libXext libraries must be installed on Linux for the Deadline applications to run,
even if running them with the -nogui flag; they're required for the Idle Detection feature, among other things. To check if
libX11 and libXext are installed, open a Terminal and run the following commands. If they are installed, the path
to the libraries will be printed out by these commands.
ldconfig -p | grep libX11
ldconfig -p | grep libXext
If any of these libraries are missing, then please contact your local system administrator to resolve this issue. Here is
an example assuming you have root access, using YUM to install them on your system:
sudo -s
yum install redhat-lsb
yum install libX11
yum install libXext
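On Debian or Ubuntu systems, the equivalent packages can be installed with APT. The package names below are assumptions for releases of that era and may vary by distribution version:

```shell
sudo apt-get install lsb libx11-6 libxext6
```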
Start the installation process by double-clicking on the Linux Client Installer. The Linux Client installer also supports
silent installations with additional options.
Choose an installation location and press Next to continue.
Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:
• Repository Directory: This is the shared path to the Repository.
• License Server: The license server entry should be in the format @SERVER, where SERVER is the host name
or IP address of the machine that the license server is running on. If you configured your license server to use a
specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you
are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave
this blank for now.
The following Launcher settings are available:
• Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher launches.
• Install Launcher As A Daemon: Enable this if you wish to install the Launcher as a daemon. You can also
choose to run the daemon as a specific user. If you leave the user blank, it will run as root instead. See the Linux
Daemon documentation below for more information.
After configuring the Client and Launcher settings, press Next to continue with the installation.
Mac OSX
Start the installation process by double-clicking on the Mac Client Installer. The Mac Client installer also supports
silent installations with additional options.
Choose an installation location and press Next to continue.
Configure the necessary Client Setup and Launcher Setup settings. The following Client settings are available:
• Repository Directory: This is the shared path to the Repository. Deadline isn’t able to understand paths starting
with “afp://” or “smb://”, so point the installer to the Repository path mounted under “/Volumes”.
• License Server: The license server entry should be in the format @SERVER, where SERVER is the host name
or IP address of the machine that the license server is running on. If you configured your license server to use a
specific port, you can use the format PORT@SERVER. For example, @lic-server or 27000@lic-server. If you
are running Deadline in LICENSE-FREE MODE, or you have not set up your license server yet, you can leave
this blank for now.
The following Launcher settings are available:
• Launch Slave When Launcher Starts: If enabled, the Slave will launch whenever the Launcher launches.
• Install Launcher As A Daemon: Enable this if you wish to install the Launcher as a daemon. You can also
choose to run the daemon as a specific user. If you leave the user blank, it will run as root instead. See the Mac
OSX Daemon documentation below for more information.
After configuring the Client and Launcher settings, press Next to continue with the installation.
2.4.3 Command Line or Silent Installation
The Client installer can be run in command line mode or unattended mode on each operating system. Note though that
on OSX, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is inside the
Mac Client Installer package.
To run in command line mode, pass the “--mode text” command line option to the installer. For example, on Linux:
./DeadlineClient-X.X.X.X-linux-x64-installer.run --mode text
To run in silent mode, pass the “--mode unattended” command line option to the installer. For example, on Windows:
DeadlineClient-X.X.X.X-windows-installer.exe --mode unattended
To get a list of all available command line options, pass the “--help” command line option to the installer. For example,
on OSX:
./DeadlineClient-X.X.X.X-osx-installer.app/Contents/MacOS/installbuilder.sh --help
Note that there are quite a few Client installer options that are only available from the command line, which you can
view when running the “--help” command. These options include:
• --configport: The port that the Client uses for Auto Configuration.
• --slavestartupport: The port that the Slaves use to ensure that only one slave is initializing at a time.
• --slavedatadir: The local path where the Slave temporarily stores plugin and job data from the Repository during
rendering (if not specified, the default location is used).
• --noguimode: If enabled, the Launcher, Slave, and Pulse will run without a user interface on this machine.
• --killprocesses: If enabled, the installer will kill any running Deadline processes before proceeding with the
installation (Windows only).
• --launcherport: The Launcher uses this port for Remote Administration, and it should be the same on all Client
machines.
• --launcherstartup: If enabled, the Launcher will automatically launch when the system logs in (non-service
mode on Windows only).
• --restartstalled: If enabled, the Launcher will try to restart the Slave application on this machine if it stalls.
• --autoupdateoverride: Overrides the Auto Update setting for this client installation (leave blank to use the value
specified in the Repository Options).
• --launcherservicedelay: If the Launcher is running as a service or daemon, this is the number of seconds it waits
after starting up before launching other Deadline applications.
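Putting a few of these together, a hypothetical unattended render-node install might look like the following. The flag values shown are illustrative assumptions; run the installer with --help to confirm the exact value formats your version expects:

```shell
./DeadlineClient-X.X.X.X-linux-x64-installer.run --mode unattended \
    --noguimode true \
    --restartstalled true \
    --launcherstartup true
```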
2.4.4 Installing as a Service or Daemon
On Windows and Linux, you can choose to install the Launcher as a service or daemon during installation. There are
a few things to keep in mind when running Deadline in this mode.
Windows Service
When running as a service on Windows, the Launcher will run without displaying its system tray icon. If the Slave or
Pulse application is started through the Launcher while it is in this mode, they will also run without a user interface.
Finally, the Launcher can still perform an auto-upgrade, but only when launching the Slave and Pulse applications
(launching the Monitor, for example, will not invoke an upgrade).
Note that when running the Launcher as a service, the Slave or Pulse application will also run in a service context.
Since services run in a different environment, and potentially under a different user profile than the one currently
logged in, certain considerations need to be made.
First, the default user for a service has no access to network resources, so while the Launcher service will run without any
issues, neither the Slave nor Pulse applications will be able to access the Repository. To avoid network access issues,
you must configure the service to run as a user with network privileges. Typical desktop users have this permission,
but check with your system administrator to find which account is best for this purpose.
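On Windows, the service account can be changed after installation with the sc utility. The service name shown below is an assumption (check the Services control panel for the actual name on your system), and the account needs the "Log on as a service" right:

```shell
:: Run from an elevated command prompt; note the mandatory space after obj= and password=
sc config deadlinelauncherservice obj= "DOMAIN\rendersvc" password= "s3cret"
sc query deadlinelauncherservice
```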
Another issue presented by the service context is that there is no access to the default set of mapped drives. Applications will either need to map drives for themselves, or make use of UNC paths. While Deadline supports Automatic
Drive Mapping, the SMB protocol does not allow sharing a resource between two users on the same machine. This
means that mapping of drives or accessing a resource with different credentials may fail when running as a service on
a machine which already requires access to the Repository.
There is also an issue with hardware-based renderers. Starting with Windows Vista, services now run in a virtualized
environment which prevents them from accessing hardware resources. Because the renderer will run in the context of
a service, hardware-based renderers will typically fail to work.
Linux Daemon
When installing the daemon, the Client installer creates the appropriate deadlinelauncherservice script in /etc/init.d.
When running as a daemon on Linux, the Launcher will run without displaying its system tray icon. If the Slave or
Pulse application is started through the Launcher while it is in this mode, they will also run without a user interface.
This is useful when running Deadline on a Linux machine that doesn’t have a Desktop environment.
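The daemon can then be controlled with the usual init.d conventions. Start and stop are shown here; whether other actions (such as status) are implemented depends on the generated script:

```shell
sudo /etc/init.d/deadlinelauncherservice start
sudo /etc/init.d/deadlinelauncherservice stop
```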
Mac OSX Daemon
When installing the daemon, the Client installer creates the appropriate com.thinkboxsoftware.deadlinelauncher.plist
file in /Library/LaunchDaemons.
When running as a daemon on Mac OSX, the Launcher will run without displaying its system tray icon. If the Slave
or Pulse application is started through the Launcher while it is in this mode, they will also run without a user interface.
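The daemon can be loaded and unloaded with launchctl, using the plist path mentioned above (the -w flag also persists the choice across reboots by toggling the Disabled key):

```shell
sudo launchctl load -w /Library/LaunchDaemons/com.thinkboxsoftware.deadlinelauncher.plist
sudo launchctl unload /Library/LaunchDaemons/com.thinkboxsoftware.deadlinelauncher.plist
```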
2.4.5 Client License Configuration
Before you can configure the license for the Client, the license server must be running. See the Licensing documentation for more information.
If you didn’t configure the license for the Client during installation (see above), there are a couple of ways to set the
license for the Client. The quickest way is to use the right-click menu in the Launcher or the File menu in the Slave
application to change the license server.
The other option is to set up Auto Configuration so that the Client automatically pulls the license server information.
2.4.6 Uninstallation
The Client installer creates an uninstaller in the folder that you installed the Client to. To uninstall the Client, simply
run the uninstaller and confirm that you want to proceed with the uninstallation.
Command Line or Silent Uninstallation
The Client uninstaller can be run in command line mode or unattended mode on each operating system.
To run in command line mode, pass the “--mode text” command line option to the installer. For example, on Linux:
./uninstall --mode text
To run in silent mode, pass the “--mode unattended” command line option to the installer. For example, on Windows:
uninstall.exe --mode unattended
To get a list of all available command line options, pass the “--help” command line option to the installer. For example,
on Mac OS X:
./uninstall --help
2.5 Submitter Installation
2.5.1 Overview
This guide will walk you through the installation of the integrated submitters, which can be used to submit jobs from
within your application (3ds Max, Maya, Nuke, etc). These should be installed on any machines you wish to submit
jobs from. Note that jobs can also be submitted from the Submit menu in the Monitor. See the Submitting Jobs
documentation for more information.
At this point, you should already have the Database and Repository installed, and the Client software installed. If
you do not, please see the Database and Repository Installation and Client Installation documentation for installation
instructions. You also need to have the software that you will be submitting from installed as well (3ds Max, Maya,
Nuke, etc).
2.5.2 Installing The Submitters
The submitter installers can be found in the submission folder in the Deadline Repository. Open the folder for the
application you want to install the submitter for (3dsmax, Maya, Nuke, etc), and then open the Installers folder. There
will be an installer for each operating system that the current application runs on.
Simply run the appropriate installer and follow the steps below. Note that these steps are similar for each application and each operating system.
The Deadline Client Bin Directory page shows what DEADLINE_PATH is currently set to. This value is originally
set by the Client installer, and is used by the submission scripts to find the Client’s bin directory so that it can find the
Repository and submit jobs. You can change the DEADLINE_PATH value here if it’s incorrect or if it doesn’t exist,
and the submitter installer will give you the option to make the change permanent.
The next page will show the Repository directory that the Client is currently connected to, which is where the submission scripts are installed from. If this path is incorrect, you can change it here.
Select the components you wish to install (the installer will try to auto select the versions it detects), and then verify
the install location for each one.
After configuring these, press Next to continue with the installation.
2.5.3 Silent Installation
The Submitter installers can be run in command line mode or unattended mode on each operating system. Note though
that on OSX, you must run the installbuilder.sh script that can be found in the Contents/MacOS folder, which is inside
the Mac Submitter Installer package.
To run in command line mode, pass the “--mode text” command line option to the installer. For example, on Linux:
./Nuke-submitter-linux-installer.run --mode text
To run in silent mode, pass the “--mode unattended” command line option to the installer. For example, on Windows:
Maya-submitter-windows-installer.exe --mode unattended
To get a list of all available command line options, pass the “--help” command line option to the installer. For example,
on OSX:
./Maya-submitter-osx-installer.app/Contents/MacOS/installbuilder.sh --help
Note that there are quite a few Submitter installer options that are only available from the command line, which you
can view when running the “--help” command. These options include:
• --enable-components: Select the components which you would like to enable (programs installed in default
locations will be auto selected).
• --disable-components: Select the components which you would like to disable.
• --destDir###: The destination directories for the components (will be defaulted to if installed in default locations).
An example batch script that puts these all together:
@echo off
.\Maya-submitter-windows-installer.exe --mode unattended --disable-components Maya2014
.\3dsMax-submitter-windows-installer.exe --mode unattended ^
  --enable-components 3dsMax2011,3dsMax2015 ^
  --disable-components 3dsMax2012,3dsMax2013,3dsMax2014 ^
  --destDir2011 "C:\3dsMax2011_64"
.\Nuke-submitter-windows-installer.exe --mode unattended
This script installs the submitters for Maya (ignoring Maya 2014), 3ds Max (2011 and 2015 only, with 2011 in an
unusual directory), and Nuke (default settings).
2.5.4 Change the DEADLINE_PATH Value
The DEADLINE_PATH value is a system setting that the Integrated Submission scripts use to determine where the
Deadline Client is installed to, and what the Repository path is. This value is set by the Client installer, and if you’ve
installed more than one version of Deadline on your machine, it’s possible that this value could be incorrect.
You can use a Submitter installer to change the DEADLINE_PATH value without installing anything by following
these steps:
• Run any submitter installer and set the DEADLINE_PATH value on the Deadline Client Bin Directory page.
• Skip past the Repository Directory page.
• Uncheck all options on the Components page.
• Click Next on the Ready to Install page.
The installer will then update the DEADLINE_PATH variable without actually installing anything.
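To verify the result, you can inspect the value directly. On Windows, DEADLINE_PATH is a system environment variable; how it is exposed on Linux and Mac OS X may differ by setup, so treat the POSIX line as an assumption (it only works if the variable is exported in your shell):

```shell
# Windows (cmd):     echo %DEADLINE_PATH%
# Linux / Mac OS X:  prints "not set" if the variable is absent in this shell
echo "${DEADLINE_PATH:-not set}"
```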
2.6 Upgrading or Downgrading Deadline
2.6.1 Overview
This will guide you through the process of upgrading or downgrading an existing Deadline installation.
2.6.2 Major Upgrades or Downgrades
If upgrading to a new major version (for example, Deadline 6 to 7), or downgrading from a new major version (for
example, Deadline 7 to 6), you will need to install a new Repository and Database, and you will need to reinstall the
Client software. This is necessary because there are often breaking changes between major releases. Do not install
over an existing installation unless it’s the same major version, or there could be unexpected results.
Note that Deadline 7 requires a newer version of the MongoDB database application. However, this newer
version is backward compatible with Deadline 6. So if you are installing the MongoDB database application
to a machine that already has a Deadline 6 database installed, you can just install it over top of the existing
Deadline 6 database installation.
You should also reinstall your integrated submission scripts on your workstations, since it’s possible these were
changed between major releases. See the Application Plug-ins documentation for more information on how to set
up the integrated submission scripts (where applicable).
The license server should also be upgraded to ensure it will work with newer releases in case there are incompatibilities
with the previous version of the license server.
Please refer to the following documentation for more information:
• Database and Repository Installation Guide
• Client Installation Guide
• Licensing Guide
2.6.3 Minor Upgrades or Downgrades
If upgrading or downgrading to a minor version that is part of the same major release cycle (for example, Deadline 7.0
to 7.0.1, or Deadline 7.0 to 7.1), you can simply install over the existing installation. If you have Automatic Upgrades
/ Downgrades enabled, you can have the Clients automatically upgrade or downgrade themselves after upgrading or
downgrading the Database and Repository. Automatic Upgrades / Downgrades can be enabled in the Client Setup
section of the Repository Configuration.
You can also enable Remote Administration in the Client Setup section of the Repository Configuration. This will
make it easier to upgrade or downgrade your render nodes remotely.
Note that this upgrade/downgrade method is only supported when upgrading or downgrading an existing Repository
installation. For example, it is NOT recommended to install the Deadline 7.1 Repository to a new location and then
have your 7.0 Clients upgrade by pointing them to the new Repository path. Instead, you should first move your
Repository installation and then do the upgrade once your 7.0 Clients are connected to the new Repository.
Important Notice When Upgrading From 7.0 to 7.1: Due to a change in the Slave Scheduling settings in the
database, you should avoid editing the Slave Scheduling settings from a machine running version 7.1 until all machines
have upgraded to 7.1. Otherwise, the Launcher may fail with the following error when it tries to auto-upgrade; the
workaround is to delete all Slave Scheduling groups in the Slave Scheduling settings, and then recreate them once all
machines have upgraded to 7.1.
An error occurred while deserializing the SlaveSchedulingGroups property of class
Deadline.Configuration.DeadlineNetworkSettings: Element 'AllSlaves' does not match
any field or property of class Deadline.Slaves.SlaveSchedulingGroup.
(System.IO.FileFormatException)
Upgrading or Downgrading the Database and Repository
Launch the new Repository installer, and choose the existing Repository folder for the Installation Directory. Then
choose the option to connect to an existing MongoDB database, and use the same Database Settings you used when
installing the previous version (they should be pre-populated for you).
During the installation, all binaries, plug-ins, and scripts from the previous version will be backed up. You can find
them in the backup folder in the Repository after the installation is complete. Note that any scripts or plugins in the
‘custom’ folder will not be affected when upgrading the Repository.
After upgrading or downgrading the Database and Repository, you can then upgrade or downgrade the Clients.
Upgrading or Downgrading Pulse and Balancer
Before upgrading or downgrading all of your client machines, you should first upgrade or downgrade Pulse and the
Balancer (if you’re running either of them). If you don’t have Automatic Upgrades / Downgrades enabled, you will
have to upgrade or downgrade Pulse and the Balancer manually, which involves running the Client Installer on the
Pulse machine. See the Client Installation Guide for more information.
If you have Automatic Upgrades / Downgrades enabled, all you have to do is restart the Pulse or Balancer application
from the Monitor, providing that Remote Administration is enabled. The Client will notice that the Repository has
been upgraded or downgraded, and will automatically upgrade or downgrade itself.
• To restart Pulse remotely, select Pulse in the Pulse List in the Monitor while in Super User mode, then right
click and select Remote Control -> Restart Pulse.
• To restart the Balancer remotely, select the Balancer in the Balancer List in the Monitor while in Super User
mode, then right click and select Remote Control -> Restart Balancer.
See the Remote Control documentation for more information about the remote commands that are available.
Upgrading or Downgrading the Clients
If you don’t have Automatic Upgrades / Downgrades enabled, you will have to upgrade or downgrade the Clients
manually, which involves running the Client Installers on the machines. See the Client Installation Guide for more
information.
If you have Automatic Upgrades / Downgrades enabled, all you have to do is restart the Slave application on each
render node through the Launcher. The Client will notice that the Repository has been upgraded or downgraded,
and will automatically upgrade or downgrade itself. In addition, the next time artists launch the Monitor on their
workstations through the Launcher, their installation will also be upgraded or downgraded.
To restart the Slaves remotely, Remote Administration must be enabled. Select the Slaves you want to upgrade or
downgrade in the Monitor while in Super User mode, then right click and select Remote Control -> Restart Slaves.
If the slaves are currently rendering and you don't want to disrupt them, you can choose the option to Restart Slaves
After Current Task instead. This option allows the Slaves to upgrade or downgrade after they finish rendering
their current task, preventing the loss of any render time. See the Remote Control documentation for more information.
After restarting the Slaves, several Slaves may appear offline, or a message may pop up saying that certain Slaves did
not respond. This can occur because all the Slaves are trying to upgrade or downgrade at once. Wait a little while and
eventually all the Slaves should come back online.
2.7 Relocating the Database or Repository
2.7.1 Overview
There may come a time where you have to move the Database or Repository (or both) to another location or another
machine. This guide will walk you through the steps required.
2.7.2 Migrating the Database
These are the steps to move your Database to a new location:
1. Shut down all the Slave applications running on your render nodes. You don’t want them making changes during
the move.
2. Stop the mongod process on the Database machine.
3. Copy the Database folder from the original location to the new one.
4. Update the config.conf file in the data folder to point to the new system log folder and storage folder locations.
5. Start the mongod process on the Database machine.
6. Modify the dbConnect.XML file in the settings folder in the Repository to set the new database host name or IP
address (if you moved it to another machine).
7. Start up the Slaves and ensure that they can connect to the new Database.
Here is an example of how you would update the config.conf file if the new database location is
C:\NEW_DATABASE_FOLDER:
systemLog:
    destination: file
    path: C:/NEW_DATABASE_FOLDER/data/logs/log.txt
    quiet: true
storage:
    dbPath: C:/NEW_DATABASE_FOLDER/data
Because the Clients use the dbConnect.xml file in the Repository to determine the database connection settings, you
don’t have to reconfigure the Clients to find the new database.
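If you moved the database to another machine, step (6) above can also be scripted. The following sketch uses Python's standard ElementTree module; note that the 'Hostname' element name is an assumption for illustration, so check your own dbConnect.xml for the actual tag before relying on this:

```python
import xml.etree.ElementTree as ET

def update_db_host(dbconnect_path, new_host):
    """Point the Repository's dbConnect.xml at a new database host.
    The 'Hostname' element name is an assumption for illustration;
    inspect your own dbConnect.xml for the actual tag name."""
    tree = ET.parse(dbconnect_path)
    node = tree.getroot().find("Hostname")
    if node is None:
        raise RuntimeError("Hostname element not found; edit the file manually")
    node.text = new_host
    tree.write(dbconnect_path)
```

For example, update_db_host(r"\\fileserver\DeadlineRepository\settings\dbConnect.xml", "new-db-server") would repoint all Clients at the new host the next time they read the file.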
2.7.3 Migrating the Repository
These are the steps to move your Repository to a new location:
1. Ensure that the share for the new location already exists. Also ensure that the proper permissions have been set.
2. Shut down all the Slave applications running on your render nodes. You don’t want them making changes during
the move.
3. Copy the Repository folder from the original location to the new location.
4. Redirect all your Client machines to point to the new Repository location.
5. Start up the Slaves and ensure that they can connect to the new Repository location.
6. Delete the original Repository (optional).
As an alternative to step (4), you can configure your share name (if the new Repository is on the same machine) or
your DNS settings (if the new Repository is on a different machine) so that the new Repository location has the same
path as the original. This saves you the hassle of having to reconfigure all of your Client machines.
2.8 Importing Repository Settings
After installing a new Repository, you can import settings from a previous Repository into the new one. To do this,
open the Monitor and ensure that you’re connected to the new Repository (the title bar for the Monitor window will
show the Repository that you're connected to). Then enter Super User Mode from the Tools menu, and select Tools ->
Import Settings to bring up the Import Repository Settings window.
Specify the path to the old Repository that you want to import the settings from, and then choose which settings you
want to import and press the Import Settings button. Note that all passwords in Repository Options (Super User, SMTP,
Mapped Drives) and Users (Web Service, Windows Login) will not be transferred, so these must be set manually after
the transfer is complete.
Also note that this feature only allows you to import settings from Deadline 6 or later. An unsupported Python
script, DeadlineV5Migration.py, attempts to migrate Deadline v5.x customers over to Deadline v6.x. It can be found
together with other useful example scripts on our GitHub site. Please note the disclaimer before executing this script
in your Deadline queue.
CHAPTER THREE: GETTING STARTED
3.1 Application Configuration
3.1.1 Overview
Deadline needs to know the executable file path of your installed application before it can process jobs across
your network. For many applications (which ship with a default install path), the binary executable file
and its path for each operating system and version is already included in the “Configure Plugins...” dialog for each
application, which can be accessed via Deadline Monitor –> Tools –> Super User Mode –> “Configure Plugins...”.
Below are example default application paths for the MayaBatch and Nuke plugins.
3.1.2 Multiple Application Paths
Looking at the MayaBatch plugin configuration section as an example, there are multiple render executable paths
defined. When a Deadline Slave dequeues (starts) a MayaBatch job, amongst many other functions in the ../<DeadlineRepository>/plugins/MayaBatch/MayaBatch.py plugin file, the slave attempts to retrieve the first Application
Path that exists on the machine, matching the exact version of Maya to be used, the bitness build (None, 32bit, 64bit),
and the operating system. In the example of MayaBatch below, we also have separate versions for Maya (2016) and
Maya's Extension (2016_5), which are defined in the ../<DeadlineRepository>/plugins/MayaBatch/MayaBatch.dlinit configuration file as a semicolon separated list:
RenderExecutable2016_0=C:\\Program Files\\Autodesk\\Maya2016\\bin\\MayaBatch.exe;/usr/autodesk/maya2
RenderExecutable2016_5=C:\\Program Files\\Autodesk\\Maya2016.5\\bin\\MayaBatch.exe;/usr/autodesk/may
The MayaBatch.dlinit file is automatically written to as you commit UI changes in the Plugin Configuration dialog
in Monitor. There is no need to manually edit these text files although this is possible. The ../<DeadlineRepository>/plugins/MayaBatch/MayaBatch.param file is an optional file that is used by the Plugin Configuration dialog in
the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying custom settings in
the MayaBatch.dlinit file.
Typically, there are 3 functions in our scripting API which help identify the correct application executable to
return as the “Render Executable”, depending on which Build option is selected in your in-app or monitor
submission UI (see above for an example) - None (default), 32bit or 64bit. These functions check the actual
bitness of the application binary executable to ensure we use a 32bit or 64bit application if applicable:
• FileUtils.SearchFileList( string fileList ): Searches a semicolon separated list of files (fileList) for the first
one that exists. For relative file paths in the list, the current directory and the PATH environment variable
will be searched. Returns the first file that exists, or “” if no file is found.
• FileUtils.SearchFileListFor32Bit( string fileList ): Searches a semicolon separated list of files (fileList) for
the first 32bit file that exists. For relative file paths in the list, the current directory and the PATH environment variable will be searched. Returns the first file that exists, or “” if no file is found.
• FileUtils.SearchFileListFor64Bit( string fileList ): Searches a semicolon separated list of files (fileList) for
the first 64bit file that exists. For relative file paths in the list, the current directory and the PATH environment variable will be searched. Returns the first file that exists, or “” if no file is found.
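To make the search behavior concrete, here is a rough Python approximation of SearchFileList (a sketch only, not Deadline's actual implementation; the 32bit/64bit variants additionally inspect the binary's architecture):

```python
import os

def search_file_list(file_list):
    """Return the first file in a semicolon separated list that exists.
    Relative paths are checked against the current directory and each
    entry of the PATH environment variable. Returns "" if none exist."""
    for candidate in file_list.split(";"):
        candidate = candidate.strip()
        if not candidate:
            continue
        if os.path.isabs(candidate):
            if os.path.isfile(candidate):
                return candidate
        else:
            # Relative path: search the current directory, then PATH.
            search_dirs = [os.getcwd()] + os.environ.get("PATH", "").split(os.pathsep)
            for directory in search_dirs:
                full_path = os.path.join(directory, candidate)
                if os.path.isfile(full_path):
                    return full_path
    return ""
```

For example, search_file_list(r"C:\Program Files\Autodesk\Maya2016\bin\MayaBatch.exe;/usr/autodesk/maya2016-x64/bin/mayabatch") would return whichever path exists on the current machine, which is how one configured list can serve Windows and Linux slaves at once.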
3.1.3 Network Installed Applications
If the application in question supports running across a network, then you can add network application install paths
to the Plugin Configuration dialog in the Monitor as well. Access permissions should be configured so that the user
account(s) that the Deadline Slave runs under have the correct access. Alternatively, you may wish to create desktop
shortcuts/symlinks instead and configure those paths in the Plugin Configuration dialog. Although it is beyond the
scope of this documentation, please note that while many Windows based applications can be installed to a network
location, they still require various C++/C#/.NET redistributable packages to be installed locally.
Application Wrapper Scripts
Typically, Linux based VFX studios use a bash/python ‘wrapper’ script which is called to start up an application. This
allows the studio to execute other commands and configure environment variables before the actual launch
of an application such as Maya. As the wrapper script file is not itself a binary executable, the two functions
which check the actual bitness of the file will cause a failure; this check can be skipped by simply ensuring None
is used as the build option. This can be better explained by a working example if we inspect the actual Python
code in the MayaBatch plugin:
## Called by Deadline to get the render executable.
def RenderExecutable( self ):
    versionString = str( self.Version ).replace( ".", "_" )
    mayaExecutable = ""
    mayaExeList = self.deadlinePlugin.GetConfigEntry( "RenderExecutable" + versionString )

    if( self.Build == "32bit" ):
        self.deadlinePlugin.LogInfo( "Enforcing 32 bit build of Maya" )
        if( SystemUtils.IsRunningOnWindows() ):
            mayaExecutable = FileUtils.SearchFileListFor32Bit( mayaExeList )
            if( mayaExecutable == "" ):
                self.deadlinePlugin.FailRender( "32 bit Maya " + versionString + " render executable was not found" )
        else:
            # Need to check bitness of Render because maya is just a shell script.
            mayaExeList = mayaExeList.replace( "\\", "/" )
            for executable in mayaExeList.split( ";" ):
                tempExecutable = PathUtils.ChangeFilename( executable, "Render" )
                tempExecutable = FileUtils.SearchFileListFor32Bit( tempExecutable )
                if tempExecutable != "":
                    mayaExecutable = executable
                    break
            if( mayaExecutable == "" ):
                self.deadlinePlugin.FailRender( "32 bit Maya " + versionString + " render executable was not found" )
    elif( self.Build == "64bit" ):
        self.deadlinePlugin.LogInfo( "Enforcing 64 bit build of Maya" )
        if( SystemUtils.IsRunningOnWindows() ):
            mayaExecutable = FileUtils.SearchFileListFor64Bit( mayaExeList )
            if( mayaExecutable == "" ):
                self.deadlinePlugin.FailRender( "64 bit Maya " + versionString + " render executable was not found" )
        else:
            # Need to check bitness of Render because maya is just a shell script.
            mayaExeList = mayaExeList.replace( "\\", "/" )
            for executable in mayaExeList.split( ";" ):
                tempExecutable = PathUtils.ChangeFilename( executable, "Render" )
                tempExecutable = FileUtils.SearchFileListFor64Bit( tempExecutable )
                if tempExecutable != "":
                    mayaExecutable = executable
                    break
            if( mayaExecutable == "" ):
                self.deadlinePlugin.FailRender( "64 bit Maya " + versionString + " render executable was not found" )
    else:
        self.deadlinePlugin.LogInfo( "Not enforcing a build of Maya" )
        mayaExecutable = FileUtils.SearchFileList( mayaExeList )
        if( mayaExecutable == "" ):
            self.deadlinePlugin.FailRender( "Maya " + versionString + " render executable was not found" )

    return mayaExecutable
3.2 Submitting Jobs
3.2.1 Overview
The easiest and most common way to submit render jobs to Deadline is via our many submission scripts, which are
written for each rendering application it supports. After you have submitted your job, you can monitor its progress
using the Monitor. See the Monitoring Jobs documentation for more information.
If you would like more control over the submission process, or would like to submit arbitrary command line jobs to
Deadline, see the Manual Job Submission documentation for more information.
3.2.2 Integrated Submission Scripts
Where possible, we have created integrated submission scripts that allow you to submit jobs directly from the application you’re working with. These scripts are convenient because you don’t have to launch a separate application to
submit the job. In addition, these scripts often provide more submission options because they have direct access to the
scene or project file you are submitting.
See the Plug-ins documentation for more information on how to set up the integrated submission scripts (where applicable) and submit jobs for specific applications.
3.2.3 Monitor Submission Scripts
In cases where an application doesn't have an integrated submission script, you can submit the jobs from the Submit
menu in the Monitor. Note that applications that have integrated submission scripts also have Monitor scripts here,
but in most cases there are fewer options to choose from. This is because the integrated submission scripts use the
application's native scripting language to pull additional information from the file being submitted. See the Plug-ins
documentation for more information on how to submit jobs for specific applications.
You can also create your own submission scripts for the Monitor. Check out the Monitor Scripting documentation for
more details.
3.2.4 Common Job Submission Options
There are many job options that can be specified on submission. A lot of these options are general job properties
that aren’t specific to the application you’re rendering with. Some of these options are described below. There are
also many other options that are specific to the application that you’re rendering with. These are covered in each
application’s plug-in guide, which can be found in the Plug-ins documentation.
Job Name
The name of your job. This is optional, and if left blank, it will default to “Untitled”.
Comment
A simple description of your job. This is optional and can be left blank.
Department
The department you belong to. This is optional and can be left blank.
Pool and Group
The pool and group that the job belongs to. See the Job Scheduling documentation for more information
on how these options affect job scheduling.
Priority
A job can have a numeric priority ranging from 0 to 100, where 0 is the lowest priority and 100 is the
highest priority. See the Job Scheduling documentation for more information on how this option affects
job scheduling.
Task Timeout and Auto Task Timeout
The number of minutes a slave has to render a task for this job before an error is reported and the task is
requeued. Specify 0 for no limit. If the Auto Task Timeout is properly configured in the Repository Options, then enabling the Auto Task Timeout option will allow a task timeout to be automatically calculated
based on the render times of previous frames for the job.
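As an illustration only, an automatic timeout derived from previous task render times might look like the sketch below. This is not Deadline's actual formula; the real behavior is controlled by the Auto Task Timeout settings in the Repository Options. The multiplier and minimum values are hypothetical:

```python
def auto_task_timeout(previous_task_minutes, multiplier=2.0, minimum=10):
    """Purely illustrative: derive a task timeout (in minutes) from the
    render times of previously completed tasks. The multiplier and
    minimum are hypothetical knobs, not Deadline settings."""
    if not previous_task_minutes:
        return 0  # no history yet: fall back to "no limit"
    longest = max(previous_task_minutes)
    return max(minimum, int(longest * multiplier))

print(auto_task_timeout([12, 15, 9]))  # 30
```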
Concurrent Tasks and Limiting Tasks To A Slave’s Task Limit
The number of tasks that can render concurrently on a single slave. This is useful if the rendering application only uses one thread to render and your slaves have multiple CPUs. Use caution with
this feature though if your renders require a large amount of RAM.
If you limit the tasks to a slave's task limit, then by default, the slave won't dequeue more tasks than it has
CPUs. This task limit can be overridden for individual slaves by an administrator. See the Slave Settings
documentation for more information.
Machine Limit and Machine Whitelists/Blacklists
Use the Machine Limit to specify the maximum number of slaves that can render your job at one time.
Specify 0 for no limit. You can also force the job to render on specific slaves by using a whitelist, or you
can avoid specific slaves by using a blacklist. See the Limit Documentation for more information.
Limits
The limits that your job must adhere to. See the Limit Documentation for more information.
Dependencies
Specify existing jobs that this job will be dependent on. This job will not start until the specified dependencies finish rendering.
On Job Complete
If desired, you can automatically archive or delete the job when it completes.
Submit Job As Suspended
If enabled, the job will submit in the suspended state. This is useful if you don’t want the job to start
rendering right away. Just resume it from the Monitor when you want it to render.
Scene/Project/Data File (if applicable)
The file path to the Scene/Project/Data File to be processed/rendered as the job. The file needs to be in
a shared location so that the slave machines can find it when they go to render it directly. See Submit
Scene/Project File With Job below for a further option. Note, all external asset/file paths referenced by
the Scene/Project/Data File should be resolvable by your slave machines on your network.
Frame List
The list of frames to render. See the Frame List Formatting Options below for valid frame lists.
Frames Per Task
Also known as Chunk Size. This is the number of frames that will be rendered at a time for each job task.
Increasing the Frames Per Task can help alleviate some of the inherent overhead that comes with network
rendering, but if your frames take longer than a couple of minutes to render, it is recommended that you
leave the Frames Per Task at 1.
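The relationship between a frame list and the resulting tasks can be sketched as follows (a hypothetical helper for illustration, not Deadline's actual code):

```python
def chunk_frames(frames, frames_per_task):
    """Split an ordered list of frames into tasks of at most
    frames_per_task frames each (the last task may be smaller)."""
    return [frames[i:i + frames_per_task]
            for i in range(0, len(frames), frames_per_task)]

# A 10-frame job with Frames Per Task = 4 yields 3 tasks:
print(chunk_frames(list(range(1, 11)), 4))
# [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10]]
```

With Frames Per Task at 1 (the recommended setting for frames that take more than a couple of minutes), every frame becomes its own task, so a failed or requeued task only re-renders a single frame.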
Submit Scene/Project File With Job
If this option is enabled, the scene or project file you want to render will be submitted with the job, and
then copied locally to the slave machine during rendering. The benefit to this is that you have a copy of
the file in the state that it was in when it was submitted. However, if your scene or project file uses relative
asset paths, enabling this option can cause the render to fail when the asset paths can’t be resolved.
Note, only the Scene/Project File is submitted with the job and ALL external/asset files referenced by the
Scene/Project File are still required by the slave machines.
If this option is disabled, the file needs to be in a shared location so that the slave machines can find
it when they go to render it directly. Leaving this option disabled is required if the file has references
(footage, textures, caches, etc) that exist in a relative location. Note though that if you modify the original
file, it will affect the render job.
3.2.5 Draft and Integration Submission Options
The majority of the submission scripts that ship with Deadline have Integration options to connect to Shotgun and
ftrack, and/or use Draft to perform post-rendering compositing operations. The Integration and Draft job options are
essentially the same in every submission script, and more information can be found in their respective documentation:
• Draft Documentation
• Shotgun Documentation
• ftrack Documentation
3.2.6 Jigsaw
Jigsaw is a flexible multi-region rendering system for Deadline, and is available for 3ds Max, Maya, modo, and Rhino.
It can be used to render regions of various sizes for a single frame, and in 3ds Max and Maya, it can be used to track
and render specific objects over an animation.
Draft can then be used to automatically assemble the regions into the final frame or frames. It can also be used to
automatically composite re-rendered regions onto the original frame.
Jigsaw is built into the 3ds Max, Maya, modo, and Rhino submitters, and with the exception of 3ds Max, the Jigsaw
viewport will be displayed in a separate window.
The viewport can be used to create and manipulate regions, which will then be submitted to Deadline to render. The
available options are listed below.
General Options
These options are always available:
• Add Region: Adds a new region.
• Delete All: Deletes all the current regions.
• Create From Grid: Creates a grid of regions to cover the full viewport. The X value controls the number of
columns and the Y value controls the number of rows.
• Fill Regions: Automatically creates new regions to fill the parts of the viewport that are not currently covered
by a region.
• Clean Regions: Deletes any regions that are fully contained within another region.
• Undo: Undo the last change made to the regions.
• Redo: Redo the last change that was previously undone.
Selected Regions Options
These options are only available when one or more regions are selected.
• Delete: Deletes the selected regions.
• Split: Splits the selected regions into sub-regions based on the Tiles In X and Tiles In Y settings.
These options are only available when a single region is selected:
• Clone: Creates a duplicate region parallel to the selected region in the specified direction.
• Lock Position: If enabled, the region will be locked to its current position.
• Enable Region: If disabled, the region will be ignored when submitting the job.
• X Position: The horizontal position of the selected region, taken from the left.
• Y Position: The vertical position of the selected region, taken from the top.
• Width: The width of the selected region.
• Height: The height of the selected region.
These options are only available when multiple regions are selected.
• Merge: Combines the selected regions into a single region that covers the full area of the selected regions.
Zoom Options
These zoom options are always available:
• Zoom Slider: Use the slider to zoom the viewport in and out. You can also use the mouse wheel to zoom in and
out, and you can click the mouse wheel down to pan the image if it doesn’t fit in the viewport.
• Reset Zoom: Resets the zoom within the viewport.
• Fit Viewport: Zoom to see everything in the viewport.
• Keep Fit: Zoom to see everything in the viewport, and force the viewport to not change. This allows the
viewport to scale when resizing the Jigsaw window.
Maya Options
These options are currently only available for Maya:
• Reset Background: Gets the current viewport image from Maya.
• Fit Selection: Create regions surrounding the selected items in the Maya scene.
• Mode: The type of regions to be used when fitting the selected items. The options are Tight (fitting the minimum
2D bounding box of the points) and Loose (fitting the minimum 2D bounding box of the bounding box of the
object).
• Padding: The amount of padding to add when fitting the selection (this is a percentage value that is added in
each direction).
• Save Regions: Saves the region information directly into the Maya scene.
• Load Regions: Loads the saved regions information from the Maya scene.
3.2.7 Frame List Formatting Options
During job submission, you usually have the option to specify the frame list you want to render, which often involves
manually typing the frame list into a text box. In this case, you can make use of the following frame list formatting
options.
Specifying Individual Frames or a Sequence
You can specify a single frame just by typing in the frame number:
5
You can specify individual frames by separating each frame with a comma or a space:
5,10,15,20
5 10 15 20
You can specify a frame range by separating the start and end frame with a dash:
1-100
Specifying a Sequence with a Step Frame
You can specify a step frame for a sequence using x, step, by, or every:
1-100x5
1-100step5
1-100by5
1-100every5
Each of these examples will render every 5th frame between 1 and 100 (1, 6, 11, 16, etc).
Specifying a Reverse Frame Sequence
You can specify a reverse frame range by separating the end frame and start frame with a dash:
100-1
Using a step frame also works for reverse frame sequences:
100-1x5
Advanced Frame Lists
Individual frames are never repeated when creating tasks for a job, which allows you to get creative
with your frame lists without worrying about rendering the same frame more than once.
To render frames 5, 18, and then from 28 to 100, you can specify one of the following:
5,18,28-100
5 18 28-100
To render every 5th frame between 1 to 100, then fill in the rest, you can specify one of the following:
1-100x5,1-100
1-100x5 1-100
To render every 10th frame between 1 to 100, then every 5th frame, then every 2nd frame, then fill in the rest, you can
specify one of the following:
1-100x10,1-100x5,1-100x2,1-100
1-100x10 1-100x5 1-100x2 1-100
To render in a mix of forward and reverse by different Nth frames, then fill in the rest in reverse, you can specify one
of the following:
100-1x10,0-100x5,100-1
100-1x10 0-100x5 100-1
NOTE, a job’s frame range can be modified after a job has been submitted to Deadline by right-clicking on a job and
selecting “Modify Frame Range...”.
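The frame list rules above can be illustrated with a small Python sketch that expands a frame list string into a list of unique frames. This is an approximation for illustration, not Deadline's actual parser:

```python
import re

def expand_frame_list(frame_list):
    """Expand a Deadline-style frame list string into an ordered list of
    unique frame numbers. Duplicate frames keep their first occurrence,
    matching the "never repeated" rule described above."""
    frames = []
    seen = set()
    # Tokens are separated by commas and/or spaces.
    for token in re.split(r"[,\s]+", frame_list.strip()):
        if not token:
            continue
        # A range may use x, step, by, or every as the step separator.
        m = re.match(r"^(-?\d+)-(-?\d+)(?:(?:x|step|by|every)(\d+))?$", token)
        if m:
            start, end = int(m.group(1)), int(m.group(2))
            step = int(m.group(3)) if m.group(3) else 1
            if start > end:          # reverse frame sequence
                seq = range(start, end - 1, -step)
            else:
                seq = range(start, end + 1, step)
        else:
            seq = [int(token)]       # a single frame
        for frame in seq:
            if frame not in seen:
                seen.add(frame)
                frames.append(frame)
    return frames

print(expand_frame_list("1-100x5,1-100")[:6])  # → [1, 6, 11, 16, 21, 26]
```

Running it on "1-100x5,1-100" yields every 5th frame first, then the remaining frames, with no frame listed twice, just as the advanced frame list examples describe.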
3.3 Monitoring Jobs
3.3.1 Overview
The Monitor application lets you monitor and control your jobs after they have been submitted to the farm. This
documentation only covers some of the basics regarding the Monitor application. For more in-depth information, see
the Monitor documentation.
If you’re launching the Monitor for the first time on your machine, you will be prompted with a Login dialog. Simply
choose your user name or create a new one before continuing. Once the Monitor is running, you’ll see your user name
in the bottom right corner. If this is the wrong user, you can log in as another user by selecting File -> Change User.
Note that if your administrator set up Deadline to lock the user to the system’s login account, you will have to log off
of your system and log back in as the correct user.
3.3.2 Finding Your Jobs
Information in the Monitor is broken up into different panels. When monitoring your jobs, you typically want to use
the following panels:
• Job Panel: This panel shows all the jobs in the farm.
• Task Panel: When a job is selected, this will show all the tasks for the job.
• Job Reports Panel: When a job is selected, this will show all reports (logs and errors) for the job.
These panels, and others, can be created from the View menu, or from the main toolbar. They can be re-sized, docked,
or floated as desired. This allows for a highly customized viewing experience which is adaptable to the needs of
different users. See the Panel Features documentation for instructions on how to create new panels in the Monitor.
The easiest way to find your jobs is to enable Ego-Centric Sorting in the job panel’s drop down menu, which can be
found in the upper-right corner of the panel. This keeps all of your jobs at the top of the job list, regardless of which
column the job list is sorted on. Then sort on the Submit Date/Time column to show your jobs in the order they were
submitted.
3.3.3 Filtering the Job List
Another way to find the jobs you are interested in is to use the filtering options in the job panel. The Quick Filter
option in the job panel’s drop down menu will open a side panel that allows you to filter out jobs based on status, user,
pool, group, and plugin.
For more advanced filtering, use the Edit Filter option in the drop down menu to filter on any column in the job list. If
you would like to save a filter for later use, use the Pinned Filters option in the drop down menu to pin your filter. You
will then be able to select it later from the Pinned Filters sub menu.
Finally, you can use the search box above the job list to filter your results even further.
3.3.4 Job Batches
Jobs that share the same Batch Name property will be grouped together in the job list. All of the job submitters that
are included with Deadline will automatically set the Batch Name if they are submitting multiple jobs that are related
to each other. The Batch Name for a job can be modified in the Job Properties in the Monitor.
If you prefer to not have the jobs grouped together in the job list, you can disable the Group Jobs By Batch Name
option in the Monitor and User Settings.
3.3.5 Controlling Your Jobs
If you need to pause your job, you can right-click on the job in the job list and select Suspend Job. When you are ready
to let the job continue, simply right-click on the job again and select Resume Job. See the Job States documentation
for more information.
To modify the properties of your job, you can double-click on the job, or right-click on it and select Modify Properties.
Here you can change scheduling options such as priority and pool, as well as other general properties like the job
name. If you wish to limit which render nodes your job runs on, as well as the number of nodes that can render it
concurrently, you can do so on the Machine Limit page. Depending on the application you’re rendering with, you
may see an extra page at the bottom of the properties list (with the name of the plug-in) that allows you to modify
properties which are specific to that application. More information on job properties can be found in the Job Properties
documentation.
3.3.6 Why Is My Job Not Rendering?
If a slave isn’t rendering a job that you think it should be, you can use the Job Candidate Filter option in the panel’s drop
down menu to try and figure out why. See the Job Candidate Filter section in the Slave Configuration documentation
for more information.
The job could also be producing errors when rendering. See the following section below about handling job errors.
3.3.7 Handling Job Errors
If your job starts producing errors, you’ll notice that your job will change from green to brown, then eventually to
red (depending on the number of errors). These error reports can be viewed in the Job Reports panel, which can be
opened from the View menu, or from the job’s right-click menu. Here you will find all the reports generated for a job
including the error reports which will be red. You can filter and sort the reports to help find what you are looking for.
Often, the error reports will clearly show the cause of the error, allowing you to take the appropriate steps
to resolve the problem. If you're ever unsure of what an error means, feel free to email the error report to Deadline
Support and we'll try to help. See the Job Reports and History documentation for more information.
3.3.8 Completed Jobs
When your job is complete, you can view the output images by right-clicking on the individual tasks in the task list
and selecting the output filename. This will open the image in the application that is set to open that type of file by
default. Note that this option isn't available for some applications. In most cases though, you can view the
output image folder by right-clicking on the job and selecting Explore Output. See the Job Output documentation for
more information.
You can also view the logs for the job in the Job Reports panel, which can be opened from the View menu, or from the
job’s right-click menu. Finally, once you are happy with the results and no longer need the job, you can delete it by
right-clicking on the job and selecting Delete Job.
3.3.9 Re-rendering Jobs
If you have a completed job that you need to re-render, you can do so by right-clicking on the job and selecting
Requeue Job. If you only need to re-render a few bad frames, you can just requeue their corresponding tasks by
right-clicking on one or more tasks in the task list and selecting Requeue Tasks.
In some cases, the Monitor can try to detect bad frames for you. You can use this feature by right-clicking on the
job and selecting Scan For Missing Output. The scan will check for missing frames or frames that don’t meet a size
threshold. You will then have the option to requeue all the corresponding tasks automatically. Note that the Scan For
Missing Output option isn’t available for all jobs. See the Job Output documentation for more information.
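The check performed by Scan For Missing Output can be sketched in a few lines of Python. This is a simplified illustration of the idea described above, not Deadline's actual implementation; the function name and parameters are made up:

```python
import os

def find_bad_frames(output_dir, frame_files, min_size_bytes):
    """Return output files that are missing or smaller than the size threshold."""
    bad = []
    for name in frame_files:
        path = os.path.join(output_dir, name)
        # A frame is "bad" if its file doesn't exist or is suspiciously small.
        if not os.path.isfile(path) or os.path.getsize(path) < min_size_bytes:
            bad.append(name)
    return bad
```

The frames this sketch flags would correspond to the tasks the Monitor offers to requeue.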
3.4 Controlling Jobs
3.4.1 Overview
The Jobs panel allows jobs to be controlled and modified using the right-click menu. In addition, the Task panel allows
specific tasks to be controlled using the right-click menu. Note that the availability of these options can vary depending
on the context in which they are used, as well as the User Group Permissions that are defined for the current user.
If the Job or Task panels are not visible, see the Panel Features documentation for instructions on how to create new
panels in the Monitor.
3.4.2 Job States
The state of jobs can be changed using the Job panel’s right-click menu. In addition, the states of specific tasks can be
changed using the Task panel’s right-click menu. Note that it is possible to modify the states of multiple jobs or tasks
at the same time, provided the selected jobs or tasks are all in the same state.
When suspending a job, a confirmation message will appear that gives you the option to suspend the tasks for the
job that are currently rendering. If you disable this option, any tasks that are currently rendering will be allowed to
complete.
These are the states that a job can be in. They are color coded to make it clear which state the job is in.
• Queued (white): No tasks for the job are currently being rendered.
• Rendering (green): At least one task for the job is being rendered.
• Completed (blue): All tasks for the job have finished rendering.
• Suspended (gray): The job will not be rendered until it is resumed.
• Pending (orange): The job is waiting on dependencies to finish, or is scheduled to start at a later time.
• Failed (red): The job has failed due to errors. It must be resumed before it can be rendered again.
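For scripted tooling that mirrors the Monitor's color coding, the state list above can be captured as a simple lookup table (the dictionary below is illustrative only, built from the list above):

```python
# Job states and the Monitor colors listed above.
JOB_STATE_COLORS = {
    "Queued": "white",
    "Rendering": "green",
    "Completed": "blue",
    "Suspended": "gray",
    "Pending": "orange",
    "Failed": "red",
}
```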
You may notice Queued or Rendering jobs turn slightly red or brown as they sit in the farm. This is an indication that
the job is reporting errors. See the Job Reports section further down for more information.
The Job panel’s right-click menu also gives the option to delete or archive jobs. Both options will remove the jobs
from the farm, but archived jobs can be imported again for later use. You can import archived jobs from the File menu
in the Monitor. See the Job Archiving documentation for more information.
3.4.3 Resubmitting Jobs
If you want to render a specific job again, but you don’t want to lose the statistics for original job, you can resubmit it
from the Job panel’s right-click menu. This will bring up a window allowing you to adjust the frame list and frames
per task if you want to. All other job properties will remain identical.
Note that you can resubmit it as a normal job or a maintenance job. Maintenance jobs are special jobs where each task
for the job will render the same frame(s) on a different machine in your farm. This is useful for performing benchmark
tests on your machines. When a maintenance job is submitted, a task will automatically be created for each slave, and
once a slave has finished a task, it will no longer pick up the job.
It’s even possible to resubmit specific tasks as a new job, which can be done from the Task panel’s right-click menu.
Note though that a Maintenance job can only be resubmitted from the Job panel.
Note that Tile jobs will have their own resubmission dialog, and only the Tile frame can be changed.
3.4.4 Job Properties
To modify job properties, select the Modify Job Properties option from the Job panel's right-click menu. Double-clicking on a job will also bring up the Job Properties window. There are many pages of properties you can modify,
which are covered below. Note that it is possible to modify the properties of multiple jobs at the same time.
General
These are the most common job properties, and most of these were specified when the job was originally submitted.
The properties are as follows:
• Job ID: The internal ID of the job.
• Job Name: The name of the job.
• Comment: The comment for the job.
• Department: The department the job was submitted from.
• Batch Name: The batch the job belongs to. Jobs with the same Batch Name are grouped together in the Monitor.
• User: The user who submitted the job.
• Pool: The pool that the job belongs to.
• Secondary Pool: If enabled, the job can fall back to the secondary pool if there are machines available in that
pool.
• Group: The group that the job belongs to.
• Priority: The priority of the job (0 = lowest, 100 = highest).
• Concurrent Tasks: The number of tasks a slave can dequeue at a time (1-16). Note that some plug-ins, such as
Digital Fusion, do not support this feature.
• Limit Tasks To Slave’s Task Limit: If checked, a slave will not dequeue more tasks than it is allowed to based
on its settings.
• On Job Complete: When a job completes, you can auto-archive or auto-delete it. You can also choose to do
nothing when the job completes.
• Job Is Protected: If enabled, the job can only be deleted by the job’s user, a super user, or a user that belongs
to a user group that has permissions to handle protected jobs. Other users will not be able to delete the job, and
the job will also not be cleaned up by Deadline’s automatic house cleaning.
• Re-synchronize Auxiliary Files Between Tasks: If checked, all job files will be synchronized by the Slave
between tasks for this job. This can add significant network overhead, and should only be used if you are
manually editing any of the files that were submitted with the job.
• Reload Plugin Between Tasks: If checked, the slave reloads all the plug-in files between tasks for the same
job.
• Enforce Sequential Rendering: Sequential rendering forces a slave to render the tasks of a job in order. If
an earlier task is ever requeued, the slave won’t go back to that task until it has finished the remaining tasks in
order.
• Suppress Event Plugins: If enabled, this job will not trigger any event plugins while in the queue.
• Job Is Interruptible: If enabled, tasks for this job can be interrupted during rendering by a job with a higher
priority.
• Interruptible %: A task for this job will only be interrupted if the task progress is less than or equal to this
value.
Timeouts
These properties affect how a job will time out. It is important to note that the Auto Task Timeout feature is based on
the Auto Job Timeout Settings in the Repository Options. The timeout is based on the render times of the tasks that
have already finished for this job, so this option should only be used if the frames for the job have consistent render
times.
The properties are as follows:
• Minimum Task Render Time: The minimum amount of time a Slave has to render a task. If a task finishes
faster, an error will be reported.
• Maximum Task Render Time: The maximum amount of time a Slave has to render a task. If a Maximum Start
Job Time is set, the Maximum Task Render Time will not be applied to the Starting phase of a job.
• Maximum Start Job Time: The maximum amount of time a Slave has to start a job.
• On Task Timeout: You have the option to have the job report an error or notify you when a timeout is reached.
• Enable Timeouts For Pre/Post Job Scripts: If checked, then the timeouts for this job will also affect its
pre/post job scripts, if any are defined.
• Enable Auto Task Timeout: If the job should automatically timeout based on parameters specified in the
Repository Options.
• Use Frame Timeouts: If enabled, timeouts will be calculated based on frames instead of by tasks. The timeouts
entered for tasks will be used for each frame in that task.
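With Use Frame Timeouts enabled, the timeout you enter effectively applies per frame, so a task containing several frames gets proportionally more time. A sketch of that calculation, as we read the description above (function name is ours, not Deadline's):

```python
def effective_task_timeout(per_frame_timeout_seconds, frames_in_task):
    """With frame timeouts enabled, the entered timeout applies to each frame,
    so the task as a whole is allowed frames * timeout seconds."""
    return per_frame_timeout_seconds * frames_in_task
```

For example, a 10-minute frame timeout on a task of 4 frames would allow the task 40 minutes overall.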
Notifications
These properties allow you to notify users as jobs complete. There are two list controls beside each other on this
panel. The left list contains all the current users on your farm. The right list contains the users who will receive
notifications. You can move users from one list to the other using the arrow controls between the two lists.
The properties are as follows:
• Notification Email Addresses: A comma delimited list of the Notification Users email addresses.
• Job Completion Notes: Notes to attach in the email sent when the job has completed.
• Override Notification Method: If checked, you can select whether to send an email or to not send an email.
Machine Limit
A Machine Limit can be used to limit the number of slaves that can render one particular job. This is useful if you
want to render a bunch of jobs simultaneously. The list you create can be a whitelist or a blacklist. A whitelist is the
list of slaves that are approved to render this job (only these approved machines will render it), while a blacklist
contains slaves that will not render this job. To move a machine from one list to another you can use the arrow
buttons between the two lists, drag and drop the machine names you want, or simply double click the machine name.
You are also able to load and save your machine list from a file so you can use the same list across multiple jobs. The
file stores each machine name on its own line.
You can modify the following options for the machine limit:
• Slaves that can render this job simultaneously: The number of slaves that can render this job at the same
time.
• Return Limit Stub When Task Progress % Reaches: If enabled, you can have a slave release its limit stub
when the current task it is rendering reaches the specified progress. Note that not all plug-ins report task progress,
in which case the machine limit stub will not be released until the task finishes rendering.
• Whitelisted/Blacklisted Slaves: If slaves are on a blacklist, they will never try to render this job. If slaves are
on a whitelist, only those slaves will try to render this job. Note that an empty blacklist and an empty whitelist
are functionally equivalent, and have no impact on which machines the job renders on.
• Load Machine List: Open a file dialog to load a list of slaves to be used in the white/blacklist. One machine
name per line in the file (.txt).
• Save Machine List: Open a file dialog to save the current white/black list. Each machine name will be written
on its own line.
Limits
Here you can add or remove the limits that will affect your job. Limits are used to ensure floating licences are used
correctly on your farm. To add a limit to your job, select the limit(s) you require from the limit list and press the
right arrow between the Limit List and the Required Limits. You can also drag and drop selected limits into or out of
the Required Limits list, or double-click a limit to move it from one list to the other.
Dependencies
Dependencies can be used to control when a job should start rendering. See the Job Dependency Options below for
more information.
Failure Detection
Here you can set how your job handles errors and determine when to fail a job.
The properties are as follows:
• Override Job Error Limit: Once checked, the job override limit will be set to the user specified value.
• Override Task Error Limit: Once checked, the task error limit will be changed to the user specified value.
• Send Warning Notification For Job Errors: Whether or not to send a notification to the users specified in the
Notification Panel when a job error occurs.
• Ignore Bad Slave Error Limit: If checked, a bad slave error will not count towards job errors.
• Clear Bad Slave List: Determines whether or not the bad slave list should currently be cleared.
Cleanup
Here you can override if and how your job is automatically cleaned up when it completes.
The properties are as follows:
• Override Automatic Job Cleanup: If enabled, these cleanup settings will be used instead of the ones in the
Repository Options.
• Cleanup Job After This Many Days: If enabled, this is the number of days to wait after this job has completed
before cleaning it up.
• Cleanup Mode: Whether the cleanup should archive the job or delete it.
Scheduling
You can schedule the job to start and/or stop at a specific date and time, and even repeat on regular intervals. This
can be useful for maintenance jobs that need to run every few days or weeks. In addition, you can define a custom
schedule so that the jobs can start and/or stop at different times on different days of the week.
Scheduling properties are as follows:
• Scheduling Mode: Determines how the job will be scheduled. Possible values are Disabled, One Time, Repeating, or Custom.
• Once or Repeating Scheduling Settings:
– Start Date and Time: The date and time this job should start.
– Stop Date and Time: If enabled, the date and time this job should be marked as complete if it is still
active.
– Day Interval: The number of days to wait before repeating this job if the Scheduling Mode is set to
Repeating.
• Custom Scheduling Settings: Configure the days and times that the job should start and/or stop.
Note that if the job is not put into the Pending state, it will not wait for the scheduled time to begin rendering.
When the scheduling settings change, you will be prompted to put the job in the pending state. This can also be done
by right-clicking the job and choosing 'Mark as Pending'.
Scripts
You can attach custom Python scripts to your job which can be run before and after your job has rendered. You may
also attach scripts to your job’s tasks which can be run before and after your job’s tasks render. For more information
on creating custom job scripts, see the Job Scripting section of the documentation.
You may attach the following scripts which will be executed at different times:
• Pre Job Script: Executed before a job is run.
• Post Job Script: Executed after a job has completed.
• Pre Task Script: Executed before a task is rendered.
• Post Task Script: Executed after a task has completed.
For more details on these script properties, see the Job Scripting section of the documentation.
Environment
When running a job, you are able to attach environment variables through the Environment tab. The environment
variables are specified as key-value pairs and are set on the slave machine running the job. You can specify that the
job-specific environment variables are only set while the job is rendering. All job-specific environment variables are
removed when the job has finished running.
You can also set a custom plugin directory on this panel. This acts as an alternative directory to load your job's
plugin from. It is useful while creating and testing custom job plugins, or when you need one or more jobs to use a
custom job plugin which is not stored in the Deadline Repository.
The Environment properties are as follows:
• Custom Plugin Directory: An alternative directory to load your job's plugin from.
• Environment Variables: A list of environment variables to set while running a job. Stored as a list of key value
pairs.
• Only Use Job Environment Variables When Rendering: Environment variables for your job will only be set
when the job is in the rendering state. They will be removed when the job is finished rendering.
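The "only while rendering" behaviour described above amounts to setting the variables before the render begins and removing (or restoring) them afterwards. A rough Python illustration of that pattern, not Deadline's implementation:

```python
import os

def run_with_job_environment(job_env, render):
    """Set job environment variables, run the render callable, then restore
    the environment to its previous state."""
    saved = {key: os.environ.get(key) for key in job_env}
    os.environ.update(job_env)
    try:
        return render()
    finally:
        # Remove variables that didn't exist before; restore ones that did.
        for key, previous in saved.items():
            if previous is None:
                os.environ.pop(key, None)
            else:
                os.environ[key] = previous
```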
Extra Info
When a job is submitted, it can have extra information embedded in it. For example, if a studio has an in-house
pipeline tool, they may want to embed information in the job that will be used to update the pipeline tool when the job
finishes rendering.
The Extra Info 0-9 properties can be renamed from the Jobs section of the Repository Options, and have corresponding
columns in the Job list that can be sorted on. The additional key/value pairs in the list at the bottom do not have
corresponding columns, and can be used to contain internal data that doesn’t need to be displayed in the job list.
Submission Params
Here you can view and export the job info and plugin info parameters that were specified when the job was submitted.
The exported files can be passed to the Command application to manually re-submit the job. See the Manual Job
Submission documentation for more information.
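The exported job info and plugin info files are plain text files of Key=Value lines. A sketch of writing such a file for manual resubmission (the keys shown are examples only, not a complete parameter set):

```python
def write_info_file(path, params):
    """Write submission parameters as Key=Value lines, one per line."""
    with open(path, "w") as f:
        for key, value in params.items():
            f.write("%s=%s\n" % (key, value))

# The resulting files could then be passed to the Command application,
# roughly: deadlinecommand job_info.txt plugin_info.txt
```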
Plugin Specific Properties
The Plug-in specific properties vary between the different plug-ins, and some plug-ins may not have a Plug-in specific
properties tab at all. Note that when modifying properties for multiple jobs at the same time, the Plug-in specific tab
will only be available if all selected jobs use the same plug-in.
To get a description of specific plug-in properties, just hover your mouse cursor over them in the properties dialog and
a tooltip will pop up with a description.
3.4.5 Job Dependency Options
Dependencies can be used to control when a job should start rendering. There are three types of dependencies available,
and one or more can be specified for a job:
• Jobs: Job dependencies can be used to start a job when other jobs that it depends on are finished.
• Assets: Asset dependencies can be used to start a job when specific files exist on disk.
• Scripts: Script dependencies can be used to start a job based on if a Python script returns True or False.
There are a few ways to set up dependencies in the Monitor, which are described below.
Job Properties
In the Job tab on the Dependencies page, you have the ability to set which jobs your job is dependent on. By default,
the job will only resume when each of its dependencies have completed, but you can also have your job resume when
the dependencies have failed, or have been deleted from the queue. Note that you can only set which jobs this job is
dependent on, not which jobs are dependent on this job.
You can also make the job frame dependent, which means that a frame from the job won’t begin rendering until the
same frame from the other job(s) is complete. This is useful if you have a job that is dependent on the frames of
another job, and you want the two jobs to render concurrently.
In the Asset tab, you can make this job dependent on asset files (textures, particle caches, etc). This job won’t be able
to render on a slave unless it can access all the files listed here.
In the Script tab, you can make this job dependent on the results of the specified scripts.
The following properties apply to all dependency types:
• Resume On Completed Dependencies: This job will resume when its dependencies complete.
• Resume On Failed Dependencies: This job will resume when its dependencies fail.
• Resume On Deleted Dependencies: This job will resume when its dependencies are deleted from the queue.
• Resume When Each Dependency is % Complete: This job will resume when each of the jobs this job is
dependent on reaches a certain percentage of completion.
• Use Frame Dependencies: Specifies that this job is dependent on specific frames from its dependencies, and
will release tasks for this job as appropriate.
• Frame Offset Start/End: Use these to offset the frames that this job is dependent on. It can also be used to
make frames for this job dependent on multiple frames from other jobs.
You can also specify notes and set overrides for individual dependencies by clicking on them in the dependency list.
Click the Overrides button to view the overrides panel.
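The frame offsets can be thought of as widening the range of parent-job frames that each frame of this job waits on. A sketch of that mapping, reflecting our reading of the description above (the function is illustrative, not Deadline's code):

```python
def dependent_parent_frames(frame, offset_start, offset_end):
    """Parent-job frames that a given frame of this job waits on,
    given start/end frame offsets."""
    return list(range(frame + offset_start, frame + offset_end + 1))
```

For example, with offsets of -1 and +1, frame 10 of this job would wait on parent frames 9 through 11.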
Drag and Drop
In the Jobs panel, you can drag one or more jobs and drop them on another job. You will then be presented with some
choices on how to set the dependencies.
Note that drag & drop dependencies will not work if you are holding down a modifier key (SHIFT, CTRL, etc). This
is to help avoid accidental drag & drops when selecting multiple jobs in the list.
If you would like to disable drag & drop dependencies, you can do so from the Monitor Options, which can be accessed
from the main toolbar. Note that if you change this setting, you will have to restart the Monitor for the changes to take
effect.
Dependency View
The Job Dependency View lets you visualize and modify your jobs and their dependencies. You can open
the Job Dependency View panel from the View menu in the Monitor.
The view will show your currently selected job and all nodes that are linked to it by dependencies. The job node colors
indicate the state of the job, while the asset nodes are yellow and the script nodes are purple.
Jobs are dependent on everything that has a connection to the Square Socket on their left side. Connections can be
made by dragging from the sockets on the nodes (square/circle) to the socket/main body of the other node. Connections
can be broken by either dragging the connection off of the node or by selecting the connection and pressing the delete
key. Note that changes made in the dependency view do not take effect until saved. If you have made changes and go
to close the dependency view, you will be notified that you have unsaved changes.
Additional job nodes can be added to the view by dragging them in from the job list (after locking the dependency
view first), or through the right-click menu. Asset and script nodes can also be added by dragging the file in from
your explorer/finder window, or through the right-click menu.
Dependencies can be tested by pressing the Test Dependency button in the toolbar. The results are represented by the
following colors:
• Green: The dependency test has passed.
• Red: The dependency test has failed.
• Yellow: The job is frame dependent, and the dependency test for some of the frames has passed.
All the available dependency view options can be found across the toolbar at the top of the view, and/or from the
view’s right click menu.
Options in toolbar and right click menu:
• Lock View: When enabled, the view will no longer show the currently selected job and will display the last
job selected before locking. This is necessary before additional jobs can be dragged from the job list into the
dependency view.
• Reload View: This redraws the dependency view for the selected job. If changes have been made, you will be
prompted to save your changes.
• Save View: Saves the changes made to the dependency view for the selected job.
• Selection Style: If off, all nodes and connections touched by the selection area will be selected. If on, only
nodes and connections that are fully contained by the selection area will be selected.
• Minimap: Controls if the minimap is visible and if so, in which corner.
• Elide Titles: Controls whether or not the titles of nodes should be elided and, if so, in which direction.
• Zoom All: Zoom the view to the point where the entire view (area that has been used) is visible.
• Zoom Extents: Zoom the view to the point where all nodes currently in the view are visible.
Options in toolbar only:
• Modify Job Details: This allows you to set which properties are visible in the nodes.
• Test Dependencies: This allows you to test your dependencies.
• Zoom Level: The current zoom level.
Options in right-click menu only:
• Job Menu: If one or more jobs are selected, you can use the same job menu that is available in the job list.
• Add Job: Choose a job to add to the dependency view.
• Add Asset: Choose an asset file to add to the dependency view.
• Add Script: Choose a script file to add to the dependency view.
• Expand/Collapse: Expand or collapse the details in all nodes.
3.4.6 Job Frame Range
To modify the frame range, select the Modify Frame Range option from the Job panel’s right-click menu. Note that
modifying these settings will stop and requeue all tasks that are currently rendering.
See the Frame List Formatting Options documentation for more information on options for formatting frame lists.
3.4.7 Job Reports and History
All reports for a job can be viewed in the Job Reports panel. This panel can be opened from the View menu or from
the main toolbar in the Monitor. It can also be opened from the Job and Task panel’s right-click menu.
The following reports can be viewed from the Job Report panel:
• Render Logs: These are the reports from tasks that rendered successfully.
• Render Errors: These are the reports from tasks that failed to render.
• Event Logs: These are the reports from Events that were handled successfully.
• Event Errors: These are the reports from Events that raised errors.
• Requeues: These are reports explaining why tasks were requeued.
You can use the Job Report panel’s right-click menu to save reports as files to send to Deadline Support. You can also
delete reports from this menu as well. Finally, if a particular Slave is reporting lots of errors, you can blacklist it from
this menu (or remove it from the job’s whitelist).
In addition to viewing job reports, you can also view the job’s history. The History window can be brought up from
the Job panel’s right-click menu by selecting the Job History option.
3.4.8 Job Output
Many jobs have the options to explore and view the job’s output directly from the Job or Task panel’s right-click menu.
If the options to explore and view the output are available for the job, there will also be the option to copy the output
path to the clipboard. This is helpful if you need to paste the path into another application.
Note that the availability of these options is based on how much information about the job’s output could be determined
at the time the job was submitted. In some cases, the submitter can’t determine where all or some of the job’s output
will be saved to, so these options won’t be available.
When viewing the output for a job, the Monitor will typically open the image file in the default application on the
machine. You can configure the Monitor to use specific image viewer applications in the Monitor Options, which can
be accessed from the main toolbar.
Finally, some jobs will support the ability to scan completed tasks for a job to see if any output is missing or below an
expected file size. The Scan For Missing Output window can be opened by right-clicking on a job and selecting Job
Output -> Scan For Missing Output. If any missing output is detected, or the output file is smaller than the Minimum
File Size specified, you are given the option to requeue those tasks (simply place a check mark beside the tasks to
requeue).
3.4.9 Job Auxiliary Files
Many jobs have additional files submitted with them, such as the scene file being rendered. These files are copied to
the server and are then copied to the Slaves when they render the jobs. If a job has auxiliary files submitted with it,
you can explore these files from the Job panel’s right-click menu. There will also be the option to copy the auxiliary
path to the clipboard, which is helpful if you need to paste the path into another application.
3.5 Archiving Jobs
3.5.1 Overview
Deadline allows you to archive jobs, which is useful if you want to keep a backup of every job you’ve rendered, or
if you want to remove a job from one farm and place it in another. It can also be used to give a problematic job to
Deadline Support for testing purposes.
Jobs can be archived automatically or manually. When a job is archived, its job and task information are exported as
JSON to separate text files. These files are placed in a zip file with any auxiliary files that were submitted with the job,
and any reports the job currently has. The name of the zip file will contain the job’s user, plugin, name, and ID (to
guarantee uniqueness). It will have the following format:
USER__PLUGIN__JOBNAME__JOBID.zip
Typically, this zip file is placed in the jobsArchived folder in the Repository. However, when manually archiving a
job, you have the option to choose an alternative archive location.
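The naming convention above can be assembled or parsed with a few lines of script. This is an illustrative sketch only (the job values shown are hypothetical, and it assumes the individual components do not themselves contain a double underscore):

```python
# Sketch of the USER__PLUGIN__JOBNAME__JOBID.zip naming convention
# described above. The job values below are hypothetical examples.

def archive_name(user, plugin, job_name, job_id):
    """Build an archive file name from its four components."""
    return "__".join([user, plugin, job_name, job_id]) + ".zip"

def parse_archive_name(filename):
    """Split an archive file name back into its four components."""
    base = filename[:-len(".zip")]
    user, plugin, job_name, job_id = base.split("__")
    return {"user": user, "plugin": plugin, "name": job_name, "id": job_id}

name = archive_name("jsmith", "MayaBatch", "shot010_lighting", "54c9a1b2")
print(name)  # jsmith__MayaBatch__shot010_lighting__54c9a1b2.zip
```

Because the job ID is part of the name, two archives of different jobs can never collide even if the user, plugin, and job name are identical.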
3.5.2 Manual Job Archiving
Users can manually archive a job by right-clicking on it in the job list in the Monitor and selecting Archive Job. This
will bring up the following window:
By default, it will save the archive to the jobsArchived folder in the Repository. However, you can choose a different
folder to archive the job to. You can also choose whether or not to delete the job from the database after archiving it.
One case where you might not want to delete it is when you are archiving a job to send to Deadline Support for testing
purposes.
If the Job panel is not visible, see the Panel Features documentation for instructions on how to create new panels in
the Monitor.
3.5.3 Automatic Job Archiving
When submitting a job, users can set the On Job Complete setting to Archive. When the job finishes, it will automatically be archived to the jobsArchived folder in the Repository.
Administrators can also configure Deadline to automatically archive all jobs after they have finished rendering and
place them in the jobsArchived folder in the Repository. This can be done in the Job Settings section of the Repository
Options.
3.5.4 Importing Archived Jobs
To import an archived job, simply select File -> Import Archived Jobs in the Monitor and choose one or more zip files
containing archived jobs.
3.6 Monitor and User Settings
3.6.1 Overview
You can customize your Monitor options, User settings, and Styles in the Monitor Options. On Windows and Linux,
select Tools -> Options, and on Mac OS X, select DeadlineMonitor -> Preferences. You can also open these settings
from the main toolbar in the Monitor.
3.6.2 Monitor Options
The Monitor options allow you to customize a few aspects of the Monitor.
Job List
• Enable Drag & Drop Dependencies: If enabled, you can drag jobs and drop them on other jobs to set dependencies. Note that you must restart the Monitor for this setting to take effect. See the Controlling Jobs
documentation for more information on setting dependencies this way.
• Show Task States In Job Progress Bar: If enabled, the job progress bars will show the states of all the tasks
for the job.
• Group Jobs By Batch Name: If enabled, jobs that have the same Batch Name will be grouped together in the
job list. Note that you must restart the Monitor for this setting to take effect.
• Change Color Of Jobs That Accumulate Errors: If enabled, jobs will change color from the Rendering color
to the Failed color as they accumulate errors. See the Styles section further down for more on the colors.
Task List
• Task Double-click Behavior: Customize the double-click behavior of rendering, completed, and failed tasks in
the task list. Double-clicking on tasks in other states will bring up the task reports panel. These are the available
options:
– View Reports: This will bring up the task reports panel for the selected task.
– Connect To Slave Log: This will connect to the Slave that is rendering or has rendered the selected task.
– View Image: This will open the output image for the selected task in the default viewer.
• Change Color Of Tasks That Accumulate Errors: If enabled, tasks will change color from the Rendering color
to the Failed color as they accumulate errors. See the Styles section further down for more on the colors.
Miscellaneous
• Start In Super User Mode: If enabled, the Monitor will start with Super User mode enabled. If Super User
mode is password protected, you will be prompted for the password when you start the Monitor.
• Stream Job Logs from Pulse: If enabled, the Monitor will stream the job logs from Pulse instead of reading
them directly from the Repository. While streaming the logs this way is typically slower, it can be useful if the
connection to the Repository server is slow.
• Show House Cleaning Updates In Status Bar: If enabled, the Monitor status bar will show when the last
House Cleaning was performed.
• Show Repository Repair Updates In Status Bar: If enabled, the Monitor status bar will show when the last
Repository Repair was performed.
• Show Pending Job Scan Updates In Status Bar: If enabled, the Monitor status bar will show when the last
Pending Job Scan was performed.
• Enable Slave Pinging: If enabled, the Slave List will show if slave machines can be pinged or not.
3.6.3 Image Viewers
Configure the image viewer applications that the Job and Task panels use to view output images. See the Controlling
Jobs documentation for more information on viewing job output.
You can specify up to three image viewer applications with the following options:
• Executable: The path to the image viewer executable you want to use.
• Arguments: The arguments to pass to the image viewer executable. The default is “{FRAME}”, which represents a path to a single image file for a task. More information about the supported argument tags can be found
below.
• Name: The viewer name, which is used in the menu item created for this image viewer (defaults to the executable
name if left blank).
• Viewer Supports Chunked Tasks: If enabled, the tasks image viewer dialog will not be shown when viewing
the output for jobs with Frames Per Task greater than 1.
The following tags are supported in the custom viewer arguments, and can be combined with other arguments that the
image viewer accepts:
• {FRAME}: This represents the task’s frame file. For example: /path/to/image0002.png
• {SEQ#}: This represents the task’s frame sequence files, using ‘#’ as the padding. For example: /path/to/image####.png
• {SEQ?}: This represents the task’s frame sequence files, using ‘?’ as the padding. For example: /path/to/image????.png
• {SEQ@}: This represents the task’s frame sequence files, using ‘@’ as the padding. For example: /path/to/image@@@@.png
• {SEQ%}: This represents the task’s frame sequence files, using ‘%d’ as the padding. For example: /path/to/image%04d.png
You can also specify the Preferred Image Viewer, which is the default image viewer to use when viewing output files.
If set to DefaultViewer, the system’s default application for the output file type will be used.
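The tag expansion above amounts to replacing each tag with a path derived from the frame file name. The following is a sketch, not Deadline's actual implementation; `expand_viewer_args` is a hypothetical helper that assumes the frame number is the trailing run of digits in the file name:

```python
import re

# Hypothetical sketch of expanding the viewer argument tags described
# above into concrete paths.

def expand_viewer_args(template, frame_path):
    """Replace the {FRAME} and {SEQ*} tags in a viewer argument string."""
    stem, digits, ext = re.match(r"(.*?)(\d+)(\.\w+)$", frame_path).groups()
    pad = len(digits)
    replacements = {
        "{FRAME}": frame_path,
        "{SEQ#}": stem + "#" * pad + ext,
        "{SEQ?}": stem + "?" * pad + ext,
        "{SEQ@}": stem + "@" * pad + ext,
        "{SEQ%}": stem + "%%0%dd" % pad + ext,
    }
    for tag, value in replacements.items():
        template = template.replace(tag, value)
    return template

print(expand_viewer_args("-view {SEQ#}", "/path/to/image0002.png"))
# -view /path/to/image####.png
```

The expanded string is then combined with the Executable setting to launch the viewer.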
3.6.4 User Settings
You can configure your user settings here.
Notification Settings
If you would like to receive email notifications for your job, you can specify your email address in the Notification
Settings and enable the option to receive them. Note that this requires your administrator to configure the email settings
in the Repository Options.
If you would like to receive popup message notifications for your job, you can specify your machine name in the
Notification Settings and enable the option to receive them. Note that this requires the Launcher to be running on the
machine that you specify here.
Render Job As User Settings
If the Render Job As User option is enabled in the job settings in the Repository Options, these options will be used to
launch the rendering process as the specified user. For Linux and OSX, only the User Name is required. For Windows,
the Domain and Password must be provided for authentication. See the Render Jobs As Job’s User documentation
for more information.
Web Service Authentication Settings
You can also specify a Web Service password, which is typically used for the Mobile application. A password is
required to authenticate with the Deadline web service if authentication has been enabled and empty passwords are
not allowed.
Region
A user’s region is used for cross-platform rendering. All paths a user sees in the Monitor are replaced based on
the path mappings for their region (for example, when viewing the output of a completed job). See Region Settings and Regions
for more information.
3.6.5 Styles
The Styles panel can be used to customize the color palette and the fonts that the Deadline Applications use. Custom
styles can be saved and imported as well.
By default, the current style will be Default Style, which is the style shipped with Deadline and cannot be modified in
any way. Previously saved styles will be available in the Saved Styles list. Custom styles can be created and deleted
by clicking the Create New Style and Delete Style buttons, respectively.
Once a custom style has been selected, the style’s color palette can be modified:
• The General Palette color is used to generate the colors for the various controls and text in the Deadline applications. Note that dark palettes will result in light text, and light palettes will result in dark text.
• The Selection color is used to highlight selected items or text.
• The remaining colors are used to color the text for jobs, tasks, slaves, etc, based on their current state. It is
recommended to choose colors that contrast well with the General Palette and Selection colors to ensure the text
is readable.
The style’s font can be modified as well:
• Primary Font: This is the font used for almost all the text in the Deadline applications.
• Console Font: This is the font used in console and log windows. By default, a monospace font is used for these
windows.
Any style changes made are not saved until the Monitor Options dialog is accepted by clicking OK. Once the dialog
has been accepted, the Monitor must be restarted in order to apply the style changes. In order to facilitate testing out
new styles, there is a Preview Style button which opens a dialog that displays an approximation of the current style
settings.
Note that the Deadline applications will always load with the style that was last selected in the Styles panel in the
Monitor Options.
Styles may also be saved and loaded using the View menu in the Monitor. Note that when saving styles, all of the
custom styles are saved. When loading saved styles from disk, the loaded styles are appended to the list of styles
currently present, overwriting any styles that share a name.
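The append-and-overwrite behavior when loading styles is equivalent to a simple dictionary merge keyed by style name. This is a sketch of the behavior, not Deadline's actual implementation, and the style contents shown are hypothetical:

```python
# Sketch of the style load behavior: loaded styles are appended to the
# current list, and a loaded style replaces a current style that has
# the same name.

def merge_styles(current, loaded):
    """Merge style dicts by name; styles in 'loaded' win on a collision."""
    merged = dict(current)
    merged.update(loaded)
    return merged

current = {"Dark": {"palette": "dark"}, "Studio": {"palette": "blue"}}
loaded = {"Studio": {"palette": "green"}, "Light": {"palette": "light"}}
merged = merge_styles(current, loaded)
print(sorted(merged))  # ['Dark', 'Light', 'Studio']
```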
3.7 Local Slave Controls
3.7.1 Overview
The Local Slave Controls allow you to control the slave on your machine, as well as configure Idle Detection and the
Job Dequeuing Mode. You can access the Local Slave Controls from the Launcher’s menu, or from the Tools menu in
the Monitor.
Note that it is possible for Administrators to disable the Local Slave Controls. If that’s the case, you will see this
message when trying to open them.
3.7.2 Slave Controls
This section allows you to view the state of the slave running on your machine. Also, if the slave is rendering, you can
see which job it is currently rendering in the list. Finally, you can control the slave on your machine by right-clicking
on it in the list.
More information about the available controls can be found in the Remote Control documentation.
3.7.3 Override Idle Detection
This section overrides the global Slave Scheduling settings for your machine (if there are any). It can be used to
start the slave when your machine becomes idle (based on keyboard and mouse activity), and stop the slave when the
machine is in use again. Note that Idle Detection is managed by the Launcher, so it must be running for this feature to
work.
The available Idle Detection settings are as follows:
• Start Slave When Machine Is Idle For: This option enables Idle Detection, and you can specify the number
of minutes without keyboard, mouse or tablet activity before the slave should start.
• Only Start Slave If CPU Usage Less Than: If enabled, the slave will only start if the machine’s CPU usage is
less than the given value.
• Only Start Slave If Free Memory More Than: If enabled, the slave will only start if the machine has this
much free memory available.
• Only Start Slave If These Processes Are Not Running: If enabled, the slave will not start if any of the listed
processes are running.
• Only Start Slave If Launcher Is Not Running As These Users: If enabled, the slave will not start if the
Launcher process is running as any of the listed users.
• Stop Slave When Machine Is No Longer Idle: If enabled, the slave will automatically stop when there is
keyboard, mouse or tablet activity again.
• Only Stop Slave If Started By Idle Detection: If enabled, the Slave will only be stopped when the machine is
no longer idle if that Slave was originally started by Idle Detection. If the Slave was originally started manually,
it will not be stopped.
• Allow Slave To Finish Its Current Task When Stopping: If enabled, the slave will finish its current task
before stopping when the machine is no longer idle. If disabled, the slave will requeue its current task before
stopping so that another slave can render it.
There are some limitations with Idle Detection depending on the operating system:
• On Windows, Idle Detection will not work if the Launcher is running as a service. This is because the service
runs in an environment that is separate from the Desktop, and has no knowledge of any mouse or keyboard
activity.
• On Linux, the Launcher uses X11 to determine if there has been any mouse or keyboard activity. If X11 is not
available, Idle Detection will not work.
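The start conditions above combine as a logical AND: the slave starts only if every enabled condition passes. The following is a simplified sketch under hypothetical setting names and thresholds; the real checks are performed by the Launcher:

```python
# Hypothetical sketch of how the Idle Detection start conditions
# combine. Field names and threshold values are illustrative only.

def should_start_slave(idle_minutes, cpu_percent, free_memory_mb,
                       running_processes, launcher_user, settings):
    """Return True only if every enabled Idle Detection condition passes."""
    if idle_minutes < settings["idle_threshold_minutes"]:
        return False  # machine has not been idle long enough
    if settings.get("max_cpu_percent") is not None:
        if cpu_percent >= settings["max_cpu_percent"]:
            return False  # CPU usage too high
    if settings.get("min_free_memory_mb") is not None:
        if free_memory_mb <= settings["min_free_memory_mb"]:
            return False  # not enough free memory
    if any(p in settings.get("blocked_processes", []) for p in running_processes):
        return False  # a blocked process is running
    if launcher_user in settings.get("blocked_users", []):
        return False  # Launcher running as a blocked user
    return True

settings = {"idle_threshold_minutes": 10, "max_cpu_percent": 25,
            "min_free_memory_mb": 4096, "blocked_processes": ["maya"],
            "blocked_users": ["render"]}
print(should_start_slave(15, 5, 8192, ["bash"], "artist", settings))  # True
```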
3.7.4 Job Dequeuing Mode
This section can be used to control how your slave dequeues jobs.
The available dequeuing modes are:
• All Jobs: This is the default behavior. The slave will dequeue any job that it can work on.
• Only Jobs Submitted From This Slave’s Machine: This option will only allow the slave to dequeue jobs
submitted from the same machine. This is a useful way of ensuring that your slave will only render your jobs.
• Only Jobs Submitted From These Users: This option will only allow the slave to dequeue jobs submitted by
the specified users. This is another way of ensuring that your slave will only render your jobs. However, it can
also be used to make your slave render jobs from other specific users, which is useful if you’re waiting on the
results of those jobs.
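The three modes above reduce to a simple per-job predicate. This sketch is illustrative only; the job record fields and mode names are hypothetical stand-ins for the actual settings:

```python
# Hypothetical sketch of the three dequeuing modes described above.

def can_dequeue(job, mode, this_machine, allowed_users=()):
    """Decide whether the slave may pick up a job under the given mode."""
    if mode == "All Jobs":
        return True
    if mode == "Machine":  # Only Jobs Submitted From This Slave's Machine
        return job["submit_machine"] == this_machine
    if mode == "Users":    # Only Jobs Submitted From These Users
        return job["user"] in allowed_users
    raise ValueError("unknown dequeuing mode: %s" % mode)

job = {"user": "jsmith", "submit_machine": "ws-042"}
print(can_dequeue(job, "Machine", "ws-042"))           # True
print(can_dequeue(job, "Users", "ws-042", ("alee",)))  # False
```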
CHAPTER FOUR
CLIENT APPLICATIONS
4.1 Launcher
4.1.1 Overview
The Launcher’s main use is to provide a means of remote communication between the Monitor and the Slave or Pulse
applications, and therefore it should always be left running on your render nodes and workstations. It can also detect if
the Slave running on the machine has stalled, and restart it if necessary.
Unless the Launcher is running as a service or daemon, you should see the Launcher icon in your system tray or
notification area. You can right-click on the icon to access the Launcher menu, or double-click it to launch the Monitor.
4.1.2 Running The Launcher
To start the Launcher:
• On Windows, you can start the Launcher from the Start Menu under Thinkbox\Deadline.
• On Linux, you can start the Launcher from a terminal window by running the deadlinelauncher script in the bin
folder.
• On Mac OS X, you can start the Launcher from Finder by running the DeadlineLauncher application in Applications/Thinkbox/Deadline.
The Launcher can also be started from a command prompt or terminal window. For more information, see the Launcher
Command Line documentation.
4.1.3 Administration Features
Running the Launcher can help make some administrative tasks easier, which is why it’s recommended to keep it
running at all times on your render nodes and workstations.
Automatic Updates
If you have enabled Automatic Upgrades under the Client Setup section of the Repository Options, whenever you
launch the Monitor, Slave, or Pulse using the Launcher, it will check the Repository for updates and upgrade itself
automatically if necessary before starting the selected application.
Note that the upgrade will only trigger when launching applications through the Launcher. Also, if the Launcher is
running as a service on Windows, launching the Monitor will not trigger an update.
Remote Administration
If you have enabled Remote Administration under the Client Setup section of the Repository Options, you will be able
to control the Slave or Pulse applications remotely, and remotely execute arbitrary commands. Note that it may be a
potential security risk to leave it running if you are connected to the internet and are not behind a firewall. In this case,
you should leave Remote Administration disabled.
4.1.4 Launcher Menu Options
Right-click on the Launcher system tray icon to bring up the Launcher menu. The available options are listed below.
Note that if the Launcher is running as a service or daemon, this menu is unavailable because the system tray icon will
be hidden.
Launch Monitor
Launches the Monitor application. If the Repository has been upgraded recently, and Automatic Updates
is enabled, this will automatically upgrade the client machine.
Launch Slave(s)
Launches the Slave application. If this machine has been configured to run more than one Slave instance,
this will launch all of them. If the Repository has been upgraded recently, and Automatic Updates is
enabled, this will automatically upgrade the client machine.
Launch Slave By Name
Launch a specific Slave instance, or add/remove Slave instances from this machine (if enabled for the
current user). Note that new Slave instances must have names that only contain alphanumeric characters,
underscores, or hyphens. See the documentation on running Multiple Slaves On One Machine for more
information.
Local Slave Controls
Opens the Local Slave Controls window, which allows you to control and configure the Slave that runs on
your machine.
Launch Slave at Startup
If enabled, the Slave will launch when the Launcher starts up.
Restart Slave If It Stalls
If enabled, the Launcher will try to restart the Slave on the machine if it stalls.
Scripts
Allows you to run general scripts that you can create. Note that these are the same scripts that you can
access from the Scripts menu in the Monitor. Check out the Monitor Scripts documentation for more
information.
Submit
Allows you to submit jobs for different rendering plug-ins. Note that these are the same submission scripts
that you can access from the Submit menu in the Monitor. More information regarding the Monitor submission scripts for each plug-in can be found in the Plug-Ins section of the documentation. You can also
add your own submission scripts to the submission menu. Check out the Monitor Scripts documentation
for more information.
Change Repository
Change the Repository that the client connects to.
Change User
Change the current user on the client.
Change License Server
Change the license server that the Slave connects to.
Explore Log Folder
Opens the Deadline log folder on the machine.
4.1.5 Command Line Options
To run the Launcher from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X) and run the ‘deadlinelauncher’ application. To view all available command
line arguments, you can run the following:
deadlinelauncher -help
Available Options
To start the Monitor with the Launcher, use the -monitor option. If another Launcher is already running, this will tell
the existing Launcher to start the Monitor. If an upgrade is available, this will trigger an automatic upgrade:
deadlinelauncher -monitor
To start the Slave with the Launcher, use the -slave option. If another Launcher is already running, this will tell the
existing Launcher to start the Slave. If an upgrade is available, this will trigger an automatic upgrade:
deadlinelauncher -slave
To start Pulse with the Launcher, use the -pulse option. If another Launcher is already running, this will tell the existing
Launcher to start Pulse. If an upgrade is available, this will trigger an automatic upgrade:
deadlinelauncher -pulse
To start the Balancer with the Launcher, use the -balancer option. If another Launcher is already running, this will tell
the existing Launcher to start the Balancer. If an upgrade is available, this will trigger an automatic upgrade:
deadlinelauncher -balancer
To trigger an automatic upgrade if one is available, use the -upgrade flag:
deadlinelauncher -upgrade
To run the Launcher without a user interface, use the -nogui option. Note that if the Launcher is running in this mode
and you launch the Slave or Pulse through it, they will also run without a user interface:
deadlinelauncher -nogui
deadlinelauncher -nogui -slave
To shutdown the Launcher if it’s already running, use the -shutdown option:
deadlinelauncher -shutdown
To shutdown the Slaves, Pulse, and Balancer on the machine before shutting down the Launcher, use the -shutdownall
option:
deadlinelauncher -shutdownall
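When scripting against these options, the launcher invocation is just an argument list. The following is a sketch; the install path shown is an assumption (the default location varies by platform and install choices):

```python
# Sketch: assembling a deadlinelauncher invocation as an argument list.
# The bin directory shown below is a hypothetical install path.

def launcher_command(bin_dir, *options):
    """Build a deadlinelauncher command line from the options above."""
    return [bin_dir + "/deadlinelauncher"] + list(options)

cmd = launcher_command("/opt/Thinkbox/Deadline7/bin", "-nogui", "-slave")
print(" ".join(cmd))
# On a machine with Deadline installed, this could be executed with
# subprocess.call(cmd).
```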
4.1.6 Launcher As A Service
When installing the Deadline Client on Windows, you can choose to install the Launcher as a service. If you want
to configure the Launcher to run as a service after the Client has been installed, it is possible to set up the service
manually, which is explained below. However, it’s probably easier to simply run the Client installer again and enable
the service option during installation.
There are also some considerations that need to be made when installing the Launcher as a service. See the Windows
Service documentation for more information.
Manually Installing the Launcher Service
You can use Deadline Command along with the following commands to install or uninstall the Launcher service:
InstallLauncherService [true/false]
    Installs the Deadline Launcher Service, and optionally starts it.
    [true/false]: Whether or not to start the Launcher Service after it has been installed (optional).
InstallLauncherServiceLogOn [User Name] [Password] [true/false]
    Installs the Deadline Launcher Service with the given account, and optionally starts it.
    [User Name]: The account user name.
    [Password]: The account password.
    [true/false]: Whether or not to start the Launcher Service after it has been installed (optional).
UninstallLauncherService
    Stops and uninstalls the Deadline Launcher Service.
StartLauncherService
    Starts the Deadline Launcher Service if it is not running.
StopLauncherService
    Stops the Deadline Launcher Service if it is running.
Here is an example command line to install the service:
deadlinecommand.exe -InstallLauncherServiceLogOn "USER" "PASSWORD"
Here is an example command line to uninstall the service:
deadlinecommand.exe -UninstallLauncherService
4.1.7 FAQ
Why should the Launcher application be left running on the client machines?
Its main purpose is to provide a means of remote communication between the Monitor and the Slave
applications. If it’s not running, the Slave will have to be stopped and started manually.
In addition, whenever you launch the Monitor or Slave using the Launcher, it will check the Repository
for updates and upgrade itself automatically if necessary before starting the selected application. If the
Launcher is not running, updates will not be detected.
Finally, the Launcher can detect if the Slave running on the machine has stalled, and restart it.
Can I run the Launcher without a user interface?
Yes, you can do this by passing the -nogui command line argument to the Launcher application:
deadlinelauncher -nogui
I have Idle Detection enabled, but the Launcher doesn’t start the Slave on Linux when it’s been idle long enough.
The libX11 and libXext libraries must be installed on Linux for Idle Detection to work. To check if libX11
and libXext are installed, open a Terminal and run the following commands. If they are installed, then the
path to the libraries will be printed out by these commands.
ldconfig -p | grep libX11
ldconfig -p | grep libXext
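The same check can be scripted; `ctypes.util.find_library` from the Python standard library searches the usual library paths, much like `ldconfig -p`. This is a sketch; on a system without the libraries it simply reports them as missing:

```python
from ctypes.util import find_library

def missing_x11_libraries():
    """Return the names of required X11 libraries that cannot be found."""
    return [name for name in ("X11", "Xext") if find_library(name) is None]

missing = missing_x11_libraries()
if missing:
    print("Missing libraries:", ", ".join("lib" + n for n in missing))
else:
    print("libX11 and libXext are available; Idle Detection can work.")
```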
If any of these libraries are missing, then please contact your local system administrator to resolve this
issue. Here is an example assuming you have root access, using YUM to install them on your system:
sudo -s
yum install redhat-lsb
yum install libX11
yum install libXext
4.2 Monitor
4.2.1 Overview
The Monitor application offers detailed information and control options for each job and Slave in your farm. It provides
normal users a means of monitoring and controlling their jobs, and it gives administrators options for configuring and
controlling the entire render farm.
If you’re launching the Monitor for the first time on your machine, you will be prompted with a Login dialog. Simply
choose your user name or create a new one before continuing. Once the Monitor is running, you’ll see your user name
in the bottom right corner. If this is the wrong user, you can log in as another user by selecting File -> Change User.
Note that if your administrator set up Deadline to lock the user to the system’s login account, you will have to log off
of your system and log back in as the correct user.
4.2.2 Running the Monitor
To start the Monitor:
• On Windows, you can start the Monitor from the Start Menu under Thinkbox\Deadline, or from the Launcher’s
right-click menu.
• On Linux, you can start the Monitor from a terminal window by running the deadlinemonitor script in the bin
folder, or from the Launcher’s right-click menu.
• On Mac OS X, you can start the Monitor from Finder by running the DeadlineMonitor application in Applications/Thinkbox/Deadline, or from the Launcher’s right-click menu.
The Monitor can also be started from a command prompt or terminal window. For more information, see the Monitor
Command Line documentation.
4.2.3 Panel Features
Information in the Monitor is broken up into different panels, which are described further down. These panels have
many features in common, which are explained here.
Customization
Monitor panels can be created from the View menu, or from the main toolbar. They can be re-sized, docked, or floated
as desired. This allows for a highly customized viewing experience which is adaptable to the needs of different users.
The current layout can be pinned to the Pinned Layouts menu so that it can be restored at a later time. This can be
done from the View menu, or from the main toolbar. The current layout can also be saved to a file from the View
menu, and then loaded from that file later.
When you pin a layout, you can choose to save the location and size of the Monitor by checking the “Save Location and
Size” box when pinning the layout.
To prevent accidental modifications to the current layout, you can lock the layout from the View menu, by pressing
“Alt-‘”, or from the main toolbar. When locked, panels cannot be moved, but they can still be docked and undocked.
To dock a floating panel while the layout is locked, simply double-click on the panel’s title. It will be docked to the
same location it was originally undocked from.
The columns in Monitor panels are customizable. Columns can be resized by dragging the separator line between
column headers, and reordered by dragging a column header to a new position. Right-clicking on the column
headers in a panel allows you to toggle the visibility of each column.
In this menu, you can modify the visibility and ordering of the columns by clicking the “Customize...” menu item.
Moving columns to the left-side list hides them, and the order of the columns in the right-side list corresponds to
the order they appear in the panel (top-to-bottom corresponds to left-to-right). You can move the columns between
lists by clicking the arrow buttons.
Once you have configured your column layout you can pin it.
You can also set the current list layout as the default to load when opening new panels of the same type by clicking
“Save Current List Layout As Default”. If you want to restore the original default, click “Reset Default List Layout”.
Data Filtering
Almost every panel has a search box that you can use to filter the information you’re interested in. You can simply
type in the word(s) you are looking for, or use regular expressions for more advanced searching.
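The search-box behavior can be pictured as a case-insensitive pattern match over each row's text; plain words match as substrings, and regular expressions work as well. This sketch is illustrative only (the row values are hypothetical):

```python
import re

# Sketch of search-box filtering: plain words match as substrings,
# and regular expressions can be used for more advanced searches.
rows = ["shot010_lighting", "shot020_comp", "charA_rig_test"]

def filter_rows(rows, pattern):
    """Keep rows whose text matches the (case-insensitive) pattern."""
    rx = re.compile(pattern, re.IGNORECASE)
    return [r for r in rows if rx.search(r)]

print(filter_rows(rows, "shot"))            # ['shot010_lighting', 'shot020_comp']
print(filter_rows(rows, r"shot0\d0_comp"))  # ['shot020_comp']
```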
In addition, every panel that has a search box also supports a more advanced filtering system. To add a filter to a
panel, select the Edit Filter option in the panel’s drop down menu, which can be found in the upper-right corner of the
panel. A window will appear allowing you to specify the name of the filter being created. You can select whether all of
the filters added must match, or whether any of them can match. If all must match, only records that match every filter
are shown; if any can match, records that match at least one filter are shown.
Clicking the add filter button generates a new filter. Each filter requires a column to be selected, an operation to perform,
and a value to use in the operation. Filters can also be removed by clicking the minus button to the right of each filter.
After all filters are entered, press OK to apply the filter to the current panel.
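The match-all versus match-any semantics amount to an AND or an OR over the individual (column, operation, value) filters. This is a sketch only; the records, columns, and operations shown are hypothetical examples:

```python
# Hypothetical sketch of the Match All / Match Any filter semantics.

OPS = {
    "equals": lambda value, target: value == target,
    "contains": lambda value, target: target in value,
    "greater than": lambda value, target: value > target,
}

def matches(record, filters, match_all=True):
    """Apply (column, op, target) filters; AND when match_all, else OR."""
    results = (OPS[op](record[col], target) for col, op, target in filters)
    return all(results) if match_all else any(results)

job = {"Status": "Rendering", "Errors": 12, "User": "jsmith"}
filters = [("Status", "equals", "Rendering"), ("Errors", "greater than", 50)]
print(matches(job, filters, match_all=True))   # False
print(matches(job, filters, match_all=False))  # True
```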
A filter can be cloned and opened in a new tab within the panel through the Clone Filter option in the panel drop down
menu. The Clear Filter option can be used to clear all filters from the current panel.
Finally, you can pin the current filters so that they can be restored at a later time using the Pinned Filters sub menu in
the panel drop down menu. Note that the Pin Current Filter option is only available if a filter is currently being applied.
If there are no filters, the Pin Current Filter option will be hidden.
Automatic Sorting and Filtering
Almost every panel has an option to do automatic sorting and filtering when data changes in the panel. When this
option is disabled, sorting and filters must manually be re-applied to ensure that the data is sorted and filtered properly.
Note that automatic sorting and filtering can affect the Monitor’s performance if there are lots of jobs (10,000+) or lots
of slaves (1000+) in the farm. To improve Monitor performance in this case, it is recommended to disable automatic
sorting and filtering. There is an option in the Monitor Settings in the Repository Configuration to automatically
disable it by default.
Saving and Loading Panel Layouts
Every list-based panel (Jobs, Slaves, Tasks, etc) has an option to save and load the list layout, which you can find in
the panel’s drop down menu. This allows you to save out a list’s filters, column order and visibility, etc, and load them
again later or share them with another user.
Note that when loading a list layout, you must choose a layout that was saved from the same type of list. For example,
you cannot save a layout from the Job list and then load it into the Slave list.
Graph Views
Almost every panel supports showing a graphical representation of the data. The graph can be shown by selecting
the Graph View option in the panel’s drop down menu, which can be found in the upper-left corner of the panel. The
graph view can be saved as an image file by right-clicking anywhere in the view and selecting Save Graph As Image.
If the graph is a line graph, the following operations are available:
• Zoom In: Use the mouse wheel or the UP arrow key to zoom in. You can also click and hold the left mouse
button and drag to select a sub-area of the graph to zoom in.
• Zoom Out: Use the mouse wheel or the DOWN arrow key to zoom out.
• Reset Zoom: Use the right-click menu to reset the zoom level.
• Pan: Use the middle mouse button or the LEFT and RIGHT arrow keys to pan the graph.
• Show/Hide Series: If the line graph has a legend, you can use the right-click menu to customize which series
are shown or hidden.
If the graph is a pie chart, you can filter the data from the graph view by holding down the SHIFT key and clicking
on one of the pie slices. The data will be filtered to only show records that are represented by the pie slice that was
clicked on.
Scripts
Almost every panel has the option to run custom scripts from the panel’s right-click menu. Many scripts are already
shipped with Deadline, and additional custom scripts can be written. See the Monitor Scripts documentation for more
information.
These script menus can also be customized from the Repository Options.
4.2.4 Information Panels
As mentioned earlier, information in the Monitor is broken up into different panels. These panels can be created from
the View menu, or from the main toolbar. They can be re-sized, docked, or floated as desired. This allows for a highly
customized viewing experience which is adaptable to the needs of different users.
Jobs
The Jobs panel contains a list that shows all jobs in the farm. It also displays useful information about each job such
as its name, user, status, error count, plugin, etc. As jobs change states, their colors will change. Active jobs will
appear as green, and will remain green as they continue to render without errors. But if a job starts to accumulate errors,
it will turn brown and then eventually red. This allows you to see at a glance which jobs are having problems. For more
information on job monitoring, see the Monitoring Jobs documentation.
The Jobs panel supports standard filtering, but it also has a Quick Filter option in the panel’s drop down menu to make
it easier to filter out unwanted jobs. By toggling the options within the Status, User, Pool, Group, and Plugin sections,
you can quickly drill down to the jobs you are interested in. There is also an Ego-Centric Sorting option in the panel’s
drop down menu which can be used to keep all of your jobs at the top of the job list.
The Jobs panel also supports the ability to group jobs together based on their Batch Name property. All of the job
submitters that are included with Deadline will automatically set the Batch Name if they are submitting multiple jobs
that are related to each other. The Batch Name for a job can be modified in the Job Properties. If you prefer to not
have the jobs grouped together in the job list, you can disable the Group Jobs By Batch Name option in the Monitor
and User Settings.
Finally, the Jobs panel allows jobs to be controlled and modified using the right-click menu. You can also bring up the
Job Properties window by double clicking on a job. See the Controlling Jobs documentation for more information.
Tasks
The Task panel shows all the tasks for the job that is currently selected. It displays useful information about each task
such as its frame list, status, and if applicable, the Slave that is rendering it.
The Task panel also allows you to control tasks from the right-click menu. See the Controlling Jobs documentation
for more information. In addition, the double-click behavior in the Task panel can be set in the Monitor and User
Settings, which can be accessed from the main toolbar.
Job Details
The Job Details panel shows all available information about the job that is currently selected. The information is split
up into different sections that can be expanded or collapsed as desired.
Job Dependency View
This panel allows you to view and modify a job’s dependency tree in a node-based view. You can lock the view to
the currently selected job, which allows you to drag & drop other jobs into the view to hook up new dependencies. In
addition, you can drag & drop Python scripts or asset files directly into the view and hook them up as dependencies.
See the Controlling Jobs documentation for more information.
Job Report
All reports for a job can be viewed in the Job Reports panel. This includes error reports, logs, and task requeue
reports. This panel can also be opened by right-clicking on a job in the Job List and selecting View Job Reports. More
information can be found in the Controlling Jobs documentation.
Slaves
The Slave panel shows all the Slaves that are in your farm. It shows system information about each Slave, as well as
information about the job the slave is currently rendering.
If you see a slave that is colored orange in the list, this means that the slave is unable to get a license or that the license
is about to expire. When the slave cannot get a license, it could be because there is a network issue, the license has
expired, or the license limit has been reached.
If a slave isn’t rendering a job that you think it should be, you can use the Job Candidate Filter option in the panel’s drop
down menu to try and figure out why. See the Job Candidate Filter section in the Slave Configuration documentation
for more information.
The Slave panel’s right-click menu allows you to modify Slave settings and control the Slaves remotely. See the Slave
Configuration documentation for more information.
Slave Reports
All log and error reports for a Slave can be viewed in the Slave Reports panel. This panel can also be opened by
right-clicking on a slave in the Slave List and selecting View Slave Reports.
Pulses
The Pulse panel shows which machine Pulse is running on, as well as previous machines that Pulse has run on. It also
shows system information about each machine.
Balancers
The Balancer panel shows which machines the Balancer is running on. It also shows system information about each
machine.
The Balancer panel’s right-click menu allows you to modify Balancer settings and control the Balancer remotely. See
the Balancer Configuration documentation for more information.
Limits
The Limit panel shows all the Limits that are in your farm. You can access many options for the Limits by right-clicking on them. See the Limits and Machine Limits documentation for more information.
Console
The Console panel shows all lines of text that are written to the Monitor’s log.
Remote Commands
The Remote Command panel shows all pending and completed remote commands that were sent from the Monitor.
When sending a remote command, if this panel is not already displayed, it will be displayed automatically (assuming
you have permissions to see the Remote Command panel). See the Remote Control documentation for more information.
Cloud
The Cloud panel shows all the instances from the cloud providers that the Monitor is connected to. This panel allows
you to control and close your existing instances. See the Cloud Controls documentation for more information.
4.2.5 Monitor Menu Options
The available options are listed below. They are available in the Monitor’s main menu, and some are also available in
the main toolbar. Note that the availability of these options can vary depending on the context in which they are used,
as well as the User Group Permissions that are defined for the current user.
File Menu
Change Repository
Connect to a different repository, or reconnect to the current repository if the Monitor becomes disconnected. There is also a toolbar button for this option.
Change User
Change the current user. You have the choice to select a different user or create a new one. There is also
a toolbar button for this option.
Import Archived Jobs
Opens a file dialog which allows you to select a zip file containing an archived job which you would like
to add back to the monitor. See the Archiving Jobs documentation for more information.
View Menu
Manual Refresh
Forces an immediate refresh of all the data in the Monitor. Manual refreshing is disabled by default, and
can only be enabled in the Monitor Settings in the Repository Configuration.
New Panel
Spawn a new information panel. See the Information Panels section above for more information. There
is also a toolbar button for this option.
Lock Panels
Prevents the panels from being moved. Panels can still be floated, docked, and closed. To dock a floating
panel, double-click on the panel’s title. There is also a toolbar button for this option.
Pinned Layouts
You are able to save different Monitor layouts for quick use. By selecting Pin Current Layout, your current
layout will be added to your pinned layouts. Selecting a pinned layout will restore the Monitor’s panels to
the pinned layout’s state. There is also a toolbar button for this option.
Open Layout
Load a previously saved layout from file.
Save Layout
Saves the current layout to file.
Save All Pinned Layouts
Save all the pinned Monitor layouts to a zip file.
Reset Layout
Reset the current layout to the Monitor’s default layout.
Submit and Script Menus
Submission scripts can be found under the Submit menu, and general scripts can be found under the Scripts menu.
Many scripts are already shipped with Deadline, and additional custom scripts can be written. Check out the Monitor
Scripts documentation for more information.
Tools Menu
Super User
Enter Super User Mode, which allows you to access the administrative Monitor options. Super User
mode can be password protected simply by specifying a password in the Access Control section of the
Repository Configuration.
View Repository History
View all repository history entries generated on the farm.
View Power Management History
View all power management history entries on the farm. See the Power Management documentation for
more information.
View Farm Reports
View various statistical reports about the farm. See the Farm Statistics documentation for more information.
Manage Pools
Add or remove Pools, and configure which Pools are assigned to the Slaves. See the Pools and Groups
documentation for more information.
Manage Groups
Add or remove Groups, and configure which Groups are assigned to the Slaves. See the Pools and Groups
documentation for more information.
Manage Users
Add or remove users, and set user information. See the User Management documentation for more
information.
Manage User Groups
Add or remove a user group, and set user group permissions to control which features are accessible. See
the User Management documentation for more information.
Configure Repository Settings
Configure a wide range of global settings. See the Repository Configuration documentation for more
information.
Configure Slave Scheduling
Configure the slave scheduling options. See the Slave Scheduling documentation for more information.
Configure Power Management Options
Configure the Power Management settings. See the Power Management documentation for more information.
Configure Cloud Providers
Set up and enable cloud service providers. See the Cloud Controls documentation for more information.
Configure Plugins
Configure the available render plugins, such as 3ds Max, After Effects, Maya, and Nuke. See the plugin
documentation for more information on the configurable settings for each plugin.
Configure Event Plugins
Configure the available event plugins such as Draft and Shotgun. See the event plugin documentation for
more information on the configurable settings for each plugin.
Connect to Pulse Log
Use this to remotely connect to the Pulse log. See the Remote Control documentation for more information.
Perform Pending Jobs Scan
Performs a scan of pending jobs and determines if any should be released. This operation is normally
performed automatically, but you can force an immediate scan with this option if desired.
Perform House Cleaning
Clean up files for deleted jobs, check for stalled slaves, etc. This operation is normally performed automatically, but you can force an immediate clean-up with this option if desired.
Undelete Jobs
Use this to recover any deleted jobs that haven’t been purged from the database yet.
Explore Repository Root
View the root directory of the current Repository.
Import Settings
Import settings from another Repository. See the Importing Repository Settings documentation for more
information.
Synchronize Scripts and Plugin Icons
Rebuilds the script-specific menus, and updates your local plugin icon cache with the icons that are currently in the Repository. Note that if any new icons are copied over, you will have to restart the Monitor
before the jobs in the list show the new icons.
Local Slave Controls
Opens the Local Slave Controls window, which allows you to control and configure the Slave that runs on
your machine.
Options
Modify the Monitor and User Settings. There is also a toolbar button for this option.
4.2.6 Command Line Options
To run the Monitor from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X) and run the ‘deadlinemonitor’ application. To view all available command
line arguments, you can run the following:
deadlinemonitor -help
Available Options
To start a new Monitor if there is already another Monitor running, use the -new option:
deadlinemonitor -new
To start the Monitor connected to a different repository, use the -repository option. You can combine this with the
-new option to have different Monitors connected to different repositories:
deadlinemonitor -repository "\\repository\path"
deadlinemonitor -new -repository "\\repository\path"
To start the Monitor without the splash screen, use the -nosplash option:
deadlinemonitor -nosplash
To shut down the Monitor if it’s already running, use the -shutdown option:
deadlinemonitor -shutdown
You can also set all of the Monitor Options using command line options. For example:
deadlinemonitor -draganddropdep True -groupjobbatches False
4.2.7 FAQ
I’m unable to move panels in the Monitor, or dock floating panels.
You need to unlock the Monitor layout. This can be done from the View menu or from the toolbar.
Can I dock a floating panel when the Monitor layout is locked?
Yes, you can dock the floating panel by double-clicking on its title bar. It will be docked to its previous
location, or to the bottom of the Monitor if it wasn’t docked previously.
What does it mean when a Slave is orange in the Slave list?
This means that the Slave is currently unable to get a license.
4.3 Slave
4.3.1 Overview
The Slave is the application that controls the rendering applications, and should be running on any machine you want
to include in the rendering process.
4.3.2 Running the Slave
To start the Slave:
• On Windows, you can start the Slave from the Start Menu under Thinkbox\Deadline, or from the Launcher’s
right-click menu.
• On Linux, you can start the Slave from a terminal window by running the deadlineslave script in the bin folder,
or from the Launcher’s right-click menu.
• On Mac OS X, you can start the Slave from Finder by running the DeadlineSlave application in Applications/Thinkbox/Deadline, or from the Launcher’s right-click menu.
You can also configure the Slave to launch automatically when the Launcher starts up. To enable this, just enable the
Launch Slave At Startup option in the Launcher menu.
The Slave can also be started from a command prompt or terminal window. For more information, see the Slave
Command Line documentation.
4.3.3 Licensing
The Slave requires a license to run, and more information on setting up licensing can be found in the Licensing Guide.
The Slave only requires a license while rendering. If a Slave cannot get a license, it will continue to run, but it won’t
be able to pick up jobs for rendering. In addition, when a Slave becomes idle, it will return its license. The Slave’s
licensing information can be found under the Slave Information tab (see next section).
If you have more than one Slave running on a machine, they will all share the same license.
4.3.4 Job and Slave Information Tabs
The Job Information tab shows information about the job currently being rendered. By default, the tab will show
information about all render threads combined, but the drop down control gives the option to show information about
a specific render thread. The Slave Information tab shows information about the Slave and the machine that it’s running
on, including license information and resource usage (CPU and memory).
4.3.5 Viewing the Slave Log
To view the Slave’s current log, simply press the Open Slave Log button at the bottom of the Slave window. This will
open the Slave’s log in a new window to avoid impacting the performance of the main Slave application.
If the Slave is running in the background or without an interface, you can connect to the Slave’s log from the command
line. In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the
Resources folder (Mac OS X) and run the following, where “SLAVENAME” is the name of the Slave you want to
connect to:
deadlinecommand -ConnectToSlaveLog "SLAVENAME"
4.3.6 Slave Menu Options
The available options are listed below. They are available in the Slave’s window, or from the Slave system tray icon’s
right-click menu. Note that if the Slave is running in the background or without an interface, these options will be
unavailable.
File Menu
Change License Server
Change the license server that the Slave connects to.
Options Menu
Hide When Minimized
The Slave is hidden when minimized, but can be restored using the Slave icon in the system tray.
Minimize On Startup
Starts the Slave in the minimized state.
Control Menu
Search For Jobs
If the Slave is sitting idle, this option can be used to force the slave to search for a job immediately.
Cancel Current Task
If the Slave is currently rendering a task, this forces the slave to cancel it.
Continue Running After Current Task Completion
Check to keep the Slave application running after it finishes its current task.
Stop/Restart Slave After Current Task Completion
Check to stop or restart the Slave application after it finishes its current task.
Shutdown/Restart Machine After Current Task Completion
Check to shut down or restart the machine after the Deadline Slave finishes its current task.
4.3.7 Command Line Options
To run the Slave from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X) and run the ‘deadlineslave’ application. To view all available command
line arguments, you can run the following:
deadlineslave -help
Available Options
To start a new instance of the Slave, use the -name option. If you already have multiple instances of the Slave
configured, use the -name option to start a specific instance:
deadlineslave -name "second-slave"
To start the Slave without a user interface, use the -nogui option:
deadlineslave -nogui
To start the Slave without the splash screen, use the -nosplash option:
deadlineslave -nosplash
To shut down the Slave if it’s already running, use the -shutdown option. This can be combined with the -name option
if you have more than one Slave instance running and you want to shut down a specific instance:
deadlineslave -shutdown
deadlineslave -shutdown -name "second-slave"
To control what a running Slave should do after it finishes rendering its current task, use the -aftertask option. The
available options are Continue, StopSlave, RestartSlave, ShutdownMachine, or RestartMachine. This can be combined
with the -name option if you have more than one Slave instance running and you want to control a specific instance:
deadlineslave -aftertask RestartSlave
deadlineslave -aftertask RestartMachine -name "second-slave"
4.3.8 FAQ
Can I run the Slave on an artist’s workstation?
Yes. On Windows and Linux, you can set the Affinity in the Slave Settings to help reduce the impact that
the renders have on the artist’s workstation.
Can I run the Slave as a service or daemon?
Yes. If you’re running the Launcher as a service or daemon, then it will run the Slave in the background
as well. See the Client Installation documentation for more information.
The Slave keeps reporting errors for the same job instead of moving on to a different job. What can I do?
You can enable Bad Slave Detection in the Repository Configuration to have a slave mark itself as bad for
a job when it reports consecutive errors on it.
What does it mean when a Slave is stalled, and is this a bad thing?
Slaves become stalled when they don’t update their status for a long period of time, which is often an
indication that the slave has crashed. A stalled slave isn’t necessarily a bad thing, because it’s possible the
slave just wasn’t shut down properly (it was killed from the Task Manager, for example). In either case,
it’s a good idea to check the slave machine and restart the slave application if necessary.
On Linux, the Slave is reporting that the operating system is simply ‘Linux’, instead of showing the actual
Linux distribution.
In order for the Slave to report the Linux distribution properly, you need to have lsb installed, and
lsb_release needs to be in the path. You can use any package management application to install lsb.
On Linux, the Slave crashes shortly after starting up.
The libX11 and libXext libraries must be installed on Linux for the Slave to run, even if running it with
the -nogui flag. To check if libX11 and libXext are installed, open a Terminal and run the following
commands. If they are installed, then the path to the libraries will be printed out by these commands.
ldconfig -p | grep libX11
ldconfig -p | grep libXext
If any of these libraries are missing, then please contact your local system administrator to resolve this
issue. Here is an example assuming you have root access, using YUM to install them on your system:
sudo -s
yum install redhat-lsb
yum install libX11
yum install libXext
4.4 Pulse
4.4.1 Overview
Pulse is an optional mini server application that performs maintenance operations on the farm, and manages more
advanced features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering, and the Web
Service. If you choose to run Pulse, it only needs to be running on one machine. Note that Pulse does not play a role
in job scheduling, so if you are running Pulse and it goes down, Deadline will still be fully operational (minus the
advanced features). To protect against the Primary Pulse failing in your environment, consider configuring Pulse
Redundancy.
When choosing a machine to run Pulse, you should be aware that non-Server editions of Windows have a TCP/IP
connection limitation of 10 new connections per second. If your render farm consists of more than 10 render nodes,
it is very likely that you’ll hit this limitation every now and then (and the odds continue to increase as the number of
machines increases). This is a limitation of the operating system, and isn’t something that we can work around, so we
recommend using a Server edition of Windows, or a different operating system like Linux.
4.4.2 Running Pulse
To start Pulse:
• On Windows, you can start Pulse from the Start Menu under Thinkbox\Deadline.
• On Linux, you can start Pulse from a terminal window by running the deadlinepulse script in the bin folder.
• On Mac OS X, you can start Pulse from Finder by running the DeadlinePulse application in Applications/Thinkbox/Deadline.
You can configure Pulse to launch automatically when the Launcher starts up (similar to how the Slave does this). This
can be done by adding LaunchPulseAtStartup=True to the system’s deadline.ini file. See the Client Configuration
documentation for more information.
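For example, the setting is a single key/value line in deadline.ini. A sketch of what this might look like (the section header shown here is an assumption; keep the key alongside the file's existing settings):

```ini
; deadline.ini -- have the Launcher start Pulse automatically
; (the [Deadline] section name is an assumption for illustration)
[Deadline]
LaunchPulseAtStartup=True
```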
Pulse can also be started from a command prompt or terminal window. For more information, see the Pulse Command
Line documentation.
4.4.3 Viewing the Pulse Log
To view Pulse’s current log, simply press the Open Pulse Log button at the bottom of the Pulse window. This will
open the Pulse log in a new window to avoid impacting the performance of the main Pulse application.
If Pulse is running in the background or without an interface, you can connect to the Pulse log from the command line.
In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux) or the Resources
folder (Mac OS X) and run the following, where “PULSENAME” is the name of the Pulse you want to connect to:
deadlinecommand -ConnectToPulseLog "PULSENAME"
4.4.4 Configuring Pulse
Pulse needs to be configured so that the Slave applications know how to connect to Pulse. This is necessary for the
Slave Throttling feature to function properly. There are a couple different ways to configure Pulse, which are described
below.
Auto Configuration
If you launch Pulse, and a Primary Pulse hasn’t been set yet, it will automatically configure itself to be the Primary,
and configure itself so that it can be connected to by its host name. These settings can be changed from the Pulse Panel in the
Monitor at any time. See the Pulse Configuration documentation for more information.
If Pulse has already been configured, but you want to quickly switch to another machine to run Pulse on, simply launch
Pulse on the desired machine. Then when it appears in the Pulse list in the Monitor, right-click on it and select Auto
Configure Pulse. Generally, this feature is only available in Super User mode.
Manual Configuration
The connection settings, as well as additional settings, can be configured for Pulse from the Monitor. Advanced
features like Auto Configuration, Power Management, Slave Throttling, Statistics Gathering, and the Web Service can
also be configured in the Monitor. See the Pulse Configuration documentation for more information.
4.4.5 Pulse Menu Options
The available options are listed below. They are available in Pulse’s window, or from the Pulse system tray icon’s right-click menu. Note that if Pulse is running in the background or without an interface, these options will be unavailable.
Options Menu
Hide When Minimized
Pulse is hidden when minimized, but can be restored using the Pulse icon in the system tray.
Minimize On Startup
Starts Pulse in the minimized state.
Control Menu
Perform Pending Job Scan
If Pulse is between repository pending job scans, this option can be used to force Pulse to perform a
pending job scan immediately. A pending job scan releases pending jobs by checking their dependencies
or scheduling options.
Perform Repository Clean-up
If Pulse is between repository clean-ups, this option can be used to force Pulse to perform a repository
clean-up immediately. A repository clean-up includes deleting jobs that are marked for automatic deletion.
Perform Repository Repair
If Pulse is between repository repairs, this option can be used to force Pulse to perform a repository repair
immediately. A repository repair includes checking for stalled slaves and orphaned limit stubs.
Perform Power Management Check
If Pulse is between power management checks, this option can be used to force Pulse to perform a power
management check immediately.
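The dependency-release idea behind the pending job scan can be illustrated with a short sketch (hypothetical job representation, not Deadline's actual data model; real scans also consider scheduling options):

```python
# Toy sketch of a pending job scan: release pending jobs whose dependencies are done.
def pending_job_scan(jobs):
    completed = {j["id"] for j in jobs if j["status"] == "Completed"}
    for j in jobs:
        if j["status"] == "Pending" and set(j["deps"]) <= completed:
            j["status"] = "Queued"  # released: the job can now be picked up
    return jobs

jobs = [{"id": "A", "status": "Completed", "deps": []},
        {"id": "B", "status": "Pending", "deps": ["A"]},
        {"id": "C", "status": "Pending", "deps": ["B"]}]
pending_job_scan(jobs)
print([j["status"] for j in jobs])  # B is released, C still waits on B
```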
4.4.6 Command Line Options
To run Pulse from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux)
or the Resources folder (Mac OS X) and run the ‘deadlinepulse’ application. To view all available command line
arguments, you can run the following:
deadlinepulse -help
Available Options
To start Pulse without a user interface, use the -nogui option:
deadlinepulse -nogui
To start Pulse without the splash screen, use the -nosplash option:
deadlinepulse -nosplash
To shut down Pulse if it’s already running, use the -shutdown option:
deadlinepulse -shutdown
4.4.7 FAQ
Does Pulse use any license?
No. It is an unlicensed product and is included in the Deadline Client software installer.
Can I run Pulse on any machine in my farm?
You can run Pulse on any machine in your farm, including the Repository or Database machine. However,
for larger farms, we recommend running Pulse on a dedicated machine.
When choosing a machine to run Pulse on, you should be aware that non-Server editions of Windows
have a TCP/IP connection limitation of 10 new connections per second. If your render farm consists of
more than 100 machines, it is very likely that you’ll hit this limitation every now and then (and the odds
continue to increase as the number of machines increases). Therefore, if you are running Pulse on a farm
with 100 machines or more, we recommend using a Server edition of Windows, or a different operating
system like Linux.
Can I run Pulse as a service or daemon?
Yes. If you’re running the Launcher as a service or daemon, then it will run Pulse in the background as
well. See the Client Installation documentation for more information.
If Pulse is shutdown or terminated, is the Power Management feature still functional?
In this case, the only aspect of Power Management that is still functional is the Temperature Checking.
Redundancy for Temperature checking has been built into the Slave application, so if Pulse isn’t running,
you’re still protected if the temperature in your farm room begins to rise.
Which temperature sensors work with Power Management?
We have tested with many different temperature sensors. Basically, as long as the temperature sensors use
SNMP, and you know its OID (which is configurable in the Power Management settings), it should work.
Can I run multiple Pulses on separate machines?
Yes. In line with typical IT best practices, this provides Pulse Redundancy. Note that only one Pulse can
be Primary at any given time.
4.5 Balancer
4.5.1 Overview
The Balancer is a cloud controller application capable of simultaneously orchestrating virtual or physical,
private or public, and remote or local machines. It can create, start, stop, and terminate cloud instances
based on the current queue load, taking jobs and tasks into account. Further customization to account for
other job or task factors can be achieved by using the Deadline plugin API to create a custom Balancer
algorithm. Note that to protect against a Primary Balancer failure in your environment, consider configuring
Balancer Redundancy.
The Balancer works in cycles, and each cycle consists of a number of stages.
• First, the Balancer will do a House Keeping step in which it will clean up any disks or instances that should
have been terminated but weren’t.
• Second, the Balancer will execute the Balancer Algorithm. These are the steps of the default algorithm (note
that these steps can be customized with your own Balancer Algorithm plugin):
– Create State Structure: This sets up the data structures used in the rest of the algorithm.
– Compute Demand: Examines the groups for jobs that are queued and assigns a weighting to each group
based on the number of tasks that need to be done and the group priority.
– Determine Resources: Determines how much capacity is available with the cloud provider and which
limits apply.
– Compute Targets: Based on the Demand and the available Resources we set a target number of instances
for each group.
– Populate Targets: This sets up a full target data structure for use in Deadline.
• Third, the Balancer will equalize the targets by starting or terminating instances.
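The Compute Demand, Determine Resources, and Compute Targets stages above can be sketched as a simple proportional allocation. This is only an illustrative approximation under assumed behavior, not the actual Balancer Algorithm plugin API; the group names, weighting scheme, and capacity figure are hypothetical.

```python
# Illustrative sketch of the default Balancer algorithm's target computation:
# demand is weighted by queued tasks and group priority, then the provider's
# available capacity is shared proportionally. Not the real Deadline plugin API.

def compute_targets(groups, capacity):
    """groups: {name: (queued_tasks, priority)}; capacity: max total instances."""
    # Compute Demand: weight each group by its queued tasks and priority.
    demand = {name: tasks * priority for name, (tasks, priority) in groups.items()}
    total = sum(demand.values())
    if total == 0:
        return {name: 0 for name in groups}
    # Compute Targets: share capacity proportionally to demand, never
    # exceeding a group's actual number of queued tasks.
    targets = {}
    for name, (tasks, _priority) in groups.items():
        share = int(capacity * demand[name] / total)
        targets[name] = min(share, tasks)
    return targets

# Example: two groups competing for 10 available instances.
print(compute_targets({"maya": (40, 2), "nuke": (10, 2)}, 10))
# -> {'maya': 8, 'nuke': 2}
```

A custom Balancer Algorithm plugin would replace this weighting with whatever job or task factors matter in your environment.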
4.5.2 Running the Balancer
To start the Balancer:
• On Windows, you can start the Balancer from the Start Menu under Thinkbox\Deadline.
• On Linux, you can start the Balancer from a terminal window by running the deadlinebalancer script in the bin
folder.
• On Mac OS X, you can start the Balancer from Finder by running the DeadlineBalancer application in Applications/Thinkbox/Deadline.
You can configure the Balancer to launch automatically when the Launcher starts up (similar to how the Slave does
this). This can be done by adding LaunchBalancerAtStartup=True to the system’s deadline.ini file. See the Client
Configuration documentation for more information.
The Balancer can also be started from a command prompt or terminal window. For more information, see the Balancer
Command Line documentation.
4.5.3 Viewing the Balancer Log
To view the Balancer’s current log, simply press the Open Balancer Log button at the bottom of the Balancer window.
This will open the Balancer log in a new window to avoid impacting the performance of the main Balancer application.
If the Balancer is running in the background or without an interface, you can connect to the Balancer log from the
command line. In a command prompt or terminal window, navigate to the Deadline bin folder (Windows or Linux)
or the Resources folder (Mac OS X) and run the following, where “BALANCERNAME” is the name of the Balancer
you want to connect to:
deadlinecommand -ConnectToBalancerLog "BALANCERNAME"
4.5.4 Configuring the Balancer
The Balancer needs to be configured before it can do anything. See the Balancer Configuration documentation for
more information.
4.5.5 Balancer Menu Options
The available options are listed below. They are available in the Balancer’s window, or from the Balancer system tray
icon’s right-click menu. Note that if the Balancer is running in the background or without an interface, these options
will be unavailable.
Options Menu
Hide When Minimized
The Balancer is hidden when minimized, but can be restored using the Balancer icon in the system
tray.
Minimize On Startup
Starts the Balancer in the minimized state.
Control Menu
Perform Balancing
If the Balancer is between balancing cycles, this option forces the Balancer to perform a balancing cycle
immediately. A balancing cycle looks at tasks, groups, limits, and cloud regions to determine if it should
create or terminate cloud instances.
4.5.6 Command Line Options
To run the Balancer from a command prompt or terminal window, navigate to the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X) and run the ‘deadlinebalancer’ application. To view all available command
line arguments, you can run the following:
deadlinebalancer -help
Available Options
To start the Balancer without a user interface, use the -nogui option:
deadlinebalancer -nogui
To start the Balancer without the splash screen, use the -nosplash option:
deadlinebalancer -nosplash
To shut down the Balancer if it’s already running, use the -shutdown option:
deadlinebalancer -shutdown
4.5.7 FAQ
Can I run Balancer on any machine in my farm?
You can run Balancer on any machine in your farm, including the Repository or Database machine.
However, for larger farms, we recommend running Balancer on a dedicated machine.
When choosing a machine to run Balancer on, you should choose a machine that has the correct routable
network access to your local render farm, as well as external access to any public or private connections
via technologies such as VPN.
Can I run Balancer as a service or daemon?
Yes. If you’re running the Launcher as a service or daemon, then it will run Balancer in the background
as well. See the Client Installation documentation for more information.
Can I run multiple Balancers on separate machines?
Yes. In line with typical IT best practices, this provides Balancer Redundancy. Note that only one Balancer
can be Primary at any given time, and this is the machine that will check out a FlexLM-based Balancer
license.
Does Balancer use a Deadline Slave license?
No. The Primary Balancer will check out a Balancer-specific license, which is included for all customers
who are currently on Thinkbox annual support for Deadline. The Draft and Balancer licenses are renewed
for another 12 months when you renew your annual Thinkbox Deadline support contract. Please email
Deadline Sales for further details.
4.6 Command
4.6.1 Overview
The deadlinecommand application is a command line tool for the Deadline render farm management system. It can be
used to control, query, and submit jobs to the farm.
There is also a deadlinecommandbg application which is identical to deadlinecommand, except that it is executed in
the background. When using deadlinecommandbg, the output and exit code are written to the Deadline temp folder
as dsubmitoutput.txt and dsubmitexitcode.txt respectively. If you want to control where these files get written to, you
can use the ‘-outputFiles’ option, followed by the paths to the output and exit code file names. For example:
deadlinecommandbg -outputFiles c:\output.txt c:\exitcode.txt -pools
You can find the deadlinecommand and deadlinecommandbg applications in the Deadline bin folder (Windows or
Linux) or the Resources folder (Mac OS X).
4.6.2 Command Line Options
The supported command line options and their usage instructions can be printed out by running ‘deadlinecommand’
from a command prompt or terminal with the ‘-help’ argument.
deadlinecommand -help
To get usage information for a specific command, specify the command name after the -help argument:
deadlinecommand -help SubmitCommandLineJob
4.6.3 Long Command Lines
Some operating systems limit the number of characters an individual command line can consist of, which
can cause problems if you are using deadlinecommand with a large number of command line options. To work
around this issue, you can create a text file with one command line option per line, and pass that file as the
only argument to deadlinecommand or deadlinecommandbg. For example, you can create a file called args.txt
that looks like this:
-SubmitMultipleJobs
-dependent
-job
\\path\to\job_1_info_file.txt
\\path\to\job_1_plugin_file.txt
-job
\\path\to\job_2_info_file.txt
\\path\to\job_2_plugin_file.txt
-job
\\path\to\job_3_info_file.txt
\\path\to\job_3_plugin_file.txt
You would then pass it to deadlinecommand like this:
deadlinecommand args.txt
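An args file like the one above can also be generated programmatically. The following is a minimal sketch; the helper name is illustrative, the UNC paths are placeholders, and actually submitting requires a Deadline Client install with deadlinecommand on the PATH.

```python
# Sketch: build an args.txt for -SubmitMultipleJobs with dependent jobs,
# sidestepping OS command-line length limits. Paths are placeholders.

def build_args_lines(job_files, dependent=True):
    """job_files: list of (info_file, plugin_file) pairs, one per job."""
    lines = ["-SubmitMultipleJobs"]
    if dependent:
        lines.append("-dependent")
    for info_file, plugin_file in job_files:
        # Each -job flag is followed by the job info file and plugin info file.
        lines.extend(["-job", info_file, plugin_file])
    return lines

jobs = [(r"\\path\to\job_%d_info_file.txt" % i,
         r"\\path\to\job_%d_plugin_file.txt" % i) for i in (1, 2, 3)]

with open("args.txt", "w") as f:
    f.write("\n".join(build_args_lines(jobs)) + "\n")

# Then submit it as the only argument (requires a Deadline Client install):
#   deadlinecommand args.txt
```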
4.6.4 Usage Examples
Submitting a Job
To submit a 3dsmax scene (e.g. C:\MyScene.max), you must first create a job submission info file (e.g. C:\job_info.job)
and a 3dsmax plugin info file (e.g. C:\max_info.job). See the Manual Job Submission documentation for more information.
Once the files are created, you can submit the job using this command:
deadlinecommand "C:\job_info.job" "C:\max_info.job" "C:\MyScene.max"
Querying For Jobs Using Filters
To query for all jobs that belong to jsmith or cdavis:
deadlinecommand -getjobsfilter username=jsmith username=cdavis
To query for all of jsmith’s jobs with completed status:
deadlinecommand -getjobsfilterand username=jsmith status=completed
Checking Which Slaves Are Assigned To A Specific Pool
To check which slaves are assigned to the 3dsmax pool:
deadlinecommand -getslavenamesinpool 3dsmax Assigned
To check which slaves are excluded from the xsi pool:
deadlinecommand -getslavenamesinpool Xsi Excluded
Querying For Task Information
To query for task information for the job with the ID of “546cc87357dbb04344a5c6b5”:
deadlinecommand -getjobtasks 546cc87357dbb04344a5c6b5
Retrieving and Changing Job Status
To retrieve the status of the job with the ID of “546cc87357dbb04344a5c6b5”:
deadlinecommand -getjob 546cc87357dbb04344a5c6b5
To retrieve all of the job’s details:
deadlinecommand -getjobdetails 546cc87357dbb04344a5c6b5
To suspend the job with the ID of “546cc87357dbb04344a5c6b5”:
deadlinecommand -suspendjob 546cc87357dbb04344a5c6b5
deadlinecommand -suspendjobnonrenderingtasks 546cc87357dbb04344a5c6b5
To resume the job:
deadlinecommand -resumejob 546cc87357dbb04344a5c6b5
To requeue the job:
deadlinecommand -requeuejob 546cc87357dbb04344a5c6b5
To delete the job:
deadlinecommand -deletejob 546cc87357dbb04344a5c6b5
To archive the job:
deadlinecommand -archivejob 546cc87357dbb04344a5c6b5
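The job-control commands above all follow the same pattern (a verb flag followed by the job ID), so they are simple to wrap from a script. This is a sketch; the helper name is illustrative, and running the command requires a Deadline Client install.

```python
# Sketch: a thin wrapper around deadlinecommand's single-job verbs.

def job_command(verb, job_id):
    """Build the argument list for single-job verbs such as suspendjob,
    resumejob, requeuejob, deletejob, and archivejob."""
    return ["deadlinecommand", "-" + verb, job_id]

print(job_command("suspendjob", "546cc87357dbb04344a5c6b5"))
# -> ['deadlinecommand', '-suspendjob', '546cc87357dbb04344a5c6b5']

# To actually run it (requires deadlinecommand on the PATH):
#   import subprocess
#   subprocess.run(job_command("suspendjob", "546cc87357dbb04344a5c6b5"),
#                  check=True)
```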
Sending An Email
To send a message to jsmith@mycompany.com (cc cjones@mycompany.com):
deadlinecommand -sendemail -to jsmith@mycompany.com -cc cjones@mycompany.com
-subject "the subject" -message "C:\MyMessage.html"
To send the same message with the attachment “C:\MyAttachment.txt”:
deadlinecommand -sendemail -to jsmith@mycompany.com -cc cjones@mycompany.com
-subject "the subject" -message "C:\MyMessage.html" -attach "C:\MyAttachment.txt"
Note that the -to, -subject, and -message options are required. The other two options are optional.
4.6.5 FAQ
What’s the difference between the deadlinecommand and deadlinecommandbg applications?
The deadlinecommandbg application is identical to deadlinecommand, except that it is executed in the
background. When using deadlinecommandbg, the exit code and output are written to the Deadline temp
directory as dsubmitexitcode.txt and dsubmitoutput.txt respectively.
4.7 Web Service
4.7.1 Overview
The deadlinewebservice application is a command line application for the Deadline render farm management system.
It allows you to query information from Deadline over an Internet connection, which you can view with the Mobile
application, or you can write custom Web Service Scripts to display this information in a manner of your choice, such
as a web page.
You can find the deadlinewebservice application in the Deadline bin folder (Windows or Linux) or the Resources
folder (Mac OS X).
The Pulse application also has the web service built into it, so if you are already running Pulse, you can just connect to
it directly instead of running the standalone deadlinewebservice application. That being said, there are a few benefits
to running the standalone deadlinewebservice application if you are already running Pulse:
• If you make heavy use of the web service, it won’t impact Pulse’s performance.
• You can run multiple instances of the standalone deadlinewebservice application on different machines.
• Migrating the web service to another machine doesn’t require you to migrate Pulse as well.
If you would like to use Pulse’s web service feature, you must enable it in Pulse, which can be done from the Web
Service tab in the Pulse Settings in the Repository Configuration. Note that if you enable or disable the web service
feature while Pulse is running, you must restart Pulse for the changes to take effect.
4.7.2 Setup
Before you can use the web service, you need to configure the general Web Service settings in the Repository Configuration. These settings apply to both the standalone deadlinewebservice application, and Pulse’s web service feature.
4.7.3 RESTful HTTP API
The RESTful API in the web service can be used to request information from the database, store new data, alter
existing data or remove entries from the database.
See the REST Overview documentation for more information.
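As a hedged sketch, a REST request can be made from Python with the standard library. The host name, port, and the /api/jobs resource path used here are assumptions about a typical setup; consult the REST Overview documentation for the exact routes supported by your version.

```python
# Sketch: query the web service's RESTful API over HTTP. The host, port,
# and the /api/jobs resource path are assumptions; see the REST Overview.
import json
import urllib.request

def rest_url(host, resource, port=8080):
    """Build a REST URL such as http://myhost:8080/api/jobs."""
    return "http://%s:%d/%s" % (host, port, resource.lstrip("/"))

def get_json(host, resource, port=8080):
    """GET a resource from the web service and decode the JSON response."""
    with urllib.request.urlopen(rest_url(host, resource, port)) as response:
        return json.load(response)

print(rest_url("myhost", "/api/jobs"))  # -> http://myhost:8080/api/jobs
# jobs = get_json("myhost", "/api/jobs")  # requires a running web service
```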
4.7.4 Additional Web Service Functionality
This additional web service functionality is still supported, but is now deprecated in favor of the new RESTful HTTP
API.
Connecting to the Web Service
You can connect to the web service using a URL containing the host name or IP address of the machine that is hosting
the web service application, as well as the port, which we will assume to be 8080 for now (this can be configured in
the Web Service Settings). Note that if port 8080 is being blocked by a firewall, the web service will not be able to
accept web requests. An example URL will look like the following:
http://[myhost]:8080/[command][arguments]
Where:
• myhost is your web service server’s IP address or host name.
• command is the command you want to execute. The web service can support two different types of commands,
which are explained below.
• arguments represents the arguments being passed to the command. This can be optional, and depends on the
command.
To confirm that you can at least connect to the web service, try the following URL.
http://[myhost]:8080/
You should see the following if you connect to the web service successfully:
This is the Deadline web service!
Windows Namespace Reservation
If the web service is running on Windows, you may also need to add a namespace reservation for the current user
that the web service is running under, so that it can reserve namespaces for the URL connection. See the Configuring
Namespace Reservations section in this MSDN Article for more information.
Note that by default, the web service listens on http://*:8080/, so make sure you set the port number correctly in the
URL you use when reserving the namespace. For example:
netsh http add urlacl url=http://*:8080/ user=USERNAME
Ensure you have correctly elevated permissions when executing the above in a command prompt and replace USERNAME with the appropriate %USERNAME% that the web service is running under. Depending on your local security
policy, the user account may need to have local administrator rights temporarily for you to initially reserve the namespace. The namespace reservation will also need updating if you ever modify the port number or user account used.
Use the following command in a command prompt to help list what namespace reservations are currently present on
your machine:
netsh http show urlacl
Running Commands
The first set of commands are the same commands that you can use with the Command application. However, these
commands are disabled by default. To enable them, you need to enable the Allow Non-Script Commands setting in the
Web Service settings. If left disabled, you will see the following results when trying to call one of these commands:
Error - Non-Script commands are disabled.
Here is an example of how you would use the web service to call the -GetSlaveNames command:
http://[myhost]:8080/GetSlaveNames
Here is an example of the results that would be displayed:
Jupiter
Rnd-vista
Slave-29
Monkeypantswin7
Electron.franticfilms.com
Test3
Monkeypants
Slave-27d
Proton.franticfilms.com
Atom.franticfilms.com
Rnd-suse
Opensuse-64
Pathos
Neutron.franticfilms.com
Some commands can take arguments. To include arguments, you need to place a ‘?’ between the command name and
the first argument, and then a ‘&’ between additional arguments. Here is an example of how you would use the web
service to call the -GetSlaveNamesInPool command, and pass it two pools as arguments:
http://[myhost]:8080/GetSlaveNamesInPool?show_a&show_b
Here is an example of the results that would be displayed:
Monkeypants
Pathos
Calling Python Scripts
The second set of commands are actually Python scripts that you can create in the Repository. These scripts use Pulse’s
Python API to get data, and then return the data in a readable manner. So basically, you can create scripts to access
any type of data and display it in any way you want. See the Web Service Scripts documentation for more information
on how to create these scripts.
Once a script has been created, you can call it by using the name of the script, without the .py extension. For example,
if you have a web service script called GetFarmStatistics.py, you would call it using:
http://[myhost]:8080/GetFarmStatistics
Some scripts can take arguments. To include arguments, you need to place a ‘?’ between the command name and the
first argument, and then a ‘&’ between additional arguments. Here is an example of how you would pass arg1, arg2,
and arg3 as separate arguments to the GetFarmStatistics.py script:
http://[myhost]:8080/GetFarmStatistics?arg1&arg2&arg3
The way the results are displayed depends on the format in which they are returned. Again, see the Web Service
Scripting documentation for more information.
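The ‘?’-and-‘&’ argument convention described above can be captured in a small helper. The command names come from the examples above, but the helper itself is illustrative.

```python
# Sketch: build web service URLs using '?' before the first argument and
# '&' between additional arguments, as described above.

def webservice_url(host, command, args=(), port=8080):
    """Join a command name and its arguments into a web service URL."""
    url = "http://%s:%d/%s" % (host, port, command)
    if args:
        url += "?" + "&".join(args)
    return url

print(webservice_url("myhost", "GetSlaveNames"))
# -> http://myhost:8080/GetSlaveNames
print(webservice_url("myhost", "GetSlaveNamesInPool", ["show_a", "show_b"]))
# -> http://myhost:8080/GetSlaveNamesInPool?show_a&show_b
```

The same helper works for web service scripts, since a script such as GetFarmStatistics.py is called by name without the .py extension.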
4.8 Mobile
4.8.1 Overview
The Mobile application allows you to monitor your jobs from anywhere. The application connects to the Deadline
web service to download information about the state of your jobs, so the web service must be running before you can
use the Mobile application. See the Web Service documentation for more information.
The minimum requirements for the Mobile application are as follows.
Mobile Device       Minimum Requirements
Android             Deadline 5.0 and Android 2.1
iPhone or iPad      Deadline 4.1 and iPhone OS 3.0 - 7.10
Windows Phone       Deadline 5.0 and Windows Phone 7.0
4.8.2 Mobile Setup
When you launch the Mobile application for the first time, you will need to configure it so that it can connect to your
Deadline web service. Just press the Settings button in the top left corner. The important settings are the Deadline
User settings and the Pulse Server settings. For Mobile to connect to the web service, you must provide the following
information:
• Deadline User Settings -> User Name: This is the Deadline user that you normally submit render jobs
as.
• Deadline User Settings -> Password: If the web service has been configured to require authentication, and
empty passwords are not allowed, you must enter your user password here. This is the password that you
specify in your User Settings in the Monitor. See the User Settings documentation for more information.
• Pulse Server Settings -> Server Name: This is the host name or IP address of the server machine that is running
the web service.
• Pulse Server Settings -> Server Port: The default is 8080, and should only be changed if the web service has
been configured to listen on a different port.
Note that the Pulse Server Settings can be used to connect to a Pulse instance if the web service feature is enabled,
or it can be used to connect to the standalone web service application. See the Web Service documentation for more
information.
After you have configured your Server and User settings, press the Job List button to return and press the Refresh
button to connect to the web service and load the job list. If you get an error when Mobile attempts to contact the web
service, see the Troubleshooting section for known errors and solutions.
4.8.3 Job List
The job list is the main screen, and by default it shows all the jobs in the repository. See the Settings section below for
information on how to sort and filter this list. You can also use the search field to search for specific jobs.
To refresh the job list, just press the Refresh button. If you want to see more information about a specific job, press
the button to the right of the job name to bring up the job details panel.
4.8.4 Job Details
The job details panel shows additional information for a specific job. In this view, you can see most of the information
you could normally see in the Monitor.
To refresh the job details, just press the Refresh Job button. To return to the job list, press the Job List button in the
upper left corner.
4.8.5 Settings
The settings panel can be accessed from the job list by pressing the Settings button. You can access the online help by
pressing the Help button in the top right corner (Android) or by scrolling down to find the Online Help link (iPhone).
To return to the job list, press the Job List button in the upper left corner.
Auto Refresh Settings
• Job List: If enabled, the job list will automatically refresh itself at the increment defined in Job List Interval.
• Job Details: If enabled, the job’s details will automatically refresh itself at the increment defined in Job Details
Interval.
Job List Filter Settings
• Configure filters to only show the jobs that you’re interested in.
Job List Sort Settings
• Ego-centric Sort: If enabled, all of your jobs will appear at the top of the job list, followed by the remaining
jobs.
• Primary Sort: Set the primary sort field and order for the job list.
• Secondary Sort: Set the secondary sort field and order for the job list.
Deadline User Settings
• User Name: Your Deadline user name. This is the user that you normally submit render jobs under.
• Password: If the web service has been configured to require authentication, and empty passwords are not
allowed, you must enter your user password here. This is the password that you specify in your User Settings in
the Monitor. See the User Settings documentation for more information.
Pulse Server Settings
• Server Name: This is the host name or IP address of the server machine that is running the web service.
• Server Port: The default is 8080, and should only be changed if the web service has been configured to listen
on a different port.
Note that the Pulse Server Settings can be used to connect to a Pulse instance if the web service feature is enabled,
or it can be used to connect to the standalone web service application. See the Web Service documentation for more
information.
Proxy Server Settings
• Server URL: If you are using a proxy web server, you may need to set a more specific URL to connect to the
web service.
• Http Authorization: If your proxy web server requires HTTP authorization, you should enable this option and
specify the user name and password.
• SSL: If you are using a proxy web server that requires SSL, you should enable this option. Note that this will
change the server port in the Pulse Server Settings to 443 by default.
Download Information
• This is a running tally of the data that you’ve downloaded from the web service.
4.8.6 Proxy Server
Depending on the security restrictions of your studio, you may wish to set up a proxy server that acts as a middleman
between Mobile and the web service. You can run the proxy server on a different machine, and configure it to require
authentication, use SSL, etc.
We have example scripts that you can start with by downloading the Pulse Proxy Script For Deadline Mobile file from
the Miscellaneous Deadline Downloads Page.
• Place these scripts into a CGI script executable folder. For Apache, the default is the cgi-bin directory, but different
folders can be configured as script folders.
• Once the scripts are in the folder, running them should yield a 403: Not authorized error until the script has been
configured.
The proxy scripts have been written to assume that the root web directory will be where the scripts will be run. Because
of this, if they are placed into the cgi-bin folder you must prepend ‘\cgi-bin\’ to the URI regular expression test in the
scripts. Note that all slashes and regular expression special characters must be escaped (hence the double slash).
Common pitfalls include forgetting to mark the scripts as executable on Unix-based systems (use “chmod og+x
Mobile_GetJob*” to mark them executable), and forgetting to set the owner and group to match the user the web
server runs as (use “chown www:www Mobile_GetJob*” on most systems).
Note that we provide these scripts as is, and we don’t officially support them. However, if you are having difficulties,
contact Deadline Support and we’ll do what we can to help.
4.8.7 Troubleshooting
These are some known Mobile errors and solutions.
You must provide a password for authentication
This error occurs when a password has not been set for the current user while authentication is enabled
and empty passwords are not accepted. To resolve this issue, you must fill in the Web Service Password
field for the user in the User Settings in the Monitor. Before you can connect, you may need to wait for
the web service to update its network settings or manually restart the web service.
The provided user name and password are invalid
This error occurs when the password provided is incorrect for the given user. If you believe the password
is correct, you may need to wait for the web service to update its network settings or manually restart the
web service.
The provided user name is invalid
This error occurs when the provided user is not in the web service’s cached list. If the user name is
valid, you may need to wait for the web service to update its network settings or manually restart the web
service.
There was an error connecting to Pulse
This error occurs when there are two errors connecting to the web service in a row. The likely cause of
this error is that the web service is not running on the specified server. Verify that the web service is
running on the specified server and that you have entered the server’s name or IP address correctly. If you
have a name specified for the server and are not on the local area network of that machine, you may need
to enter the server’s IP address instead of its name.
Network Error
The connection with the server failed. Please check your server settings in the Settings Section
Double check your settings in Mobile to make sure they match the required information. If all the Mobile settings
are entered correctly and you still cannot connect, look in your general mobile device settings and make sure you are
connected to the right network. Depending on how things are set up, your device will try to connect to the strongest
network in the area. If the network it switches to doesn’t have the correct settings to connect to your server then the
connection will fail.
If you are still unable to connect try rebooting the device (fully power off your device and power it back on). This
error also occurs when the server you are trying to connect to has lost access to the internet. Double check that the
server is connected to the internet.
4.8.8 FAQ
How do I get the Mobile application?
The Mobile application can be downloaded from the Android Market and the iPhone App Store.
How much does Mobile cost?
Nothing, it’s free!
CHAPTER FIVE: ADMINISTRATIVE FEATURES
5.1 Repository Configuration
5.1.1 Overview
There are a wide variety of Repository options that can be configured. These options can be modified at any time from
the Deadline Monitor while in Super User Mode by selecting Tools -> Configure Repository Options. If you want to
restore all the Repository Options to their defaults, simply click the Reset Settings button.
Note that long-running applications like the Launcher, Slave, and Pulse only update these settings every 10 minutes,
so after making changes, it can take up to 10 minutes for all machines to recognize them. You can restart these
applications to have them recognize the changes immediately.
5.1.2 Client Setup
These settings affect the Deadline Client installed on each machine.
• Remote Administration: Enabling Remote Administration allows the Deadline Clients to be controlled remotely from the Monitor running on another machine. Note that this can be a security risk if you are not behind
a firewall.
• Automatic Upgrades: Enabling Automatic Upgrades allows the Deadline Clients to detect if the Repository
has been upgraded, and upgrade themselves if necessary. Note that the upgrade check is only performed when
launching applications via the Launcher.
5.1.3 Monitor Settings
These settings affect the Deadline Monitor application on each machine.
Monitor Layouts
Existing Monitor layouts can be added here. These layouts can be assigned to User Groups as a user’s default layout.
If the Pinned option is enabled, they can also be chosen from the Pinned Layouts menu in the Monitor. The order of
the layouts here will be the same in the Pinned Layouts menu.
To add a new layout, simply press the Add button, and then choose an existing Monitor layout file, or use the current
Monitor’s layout. Note that Monitor layout files can be saved from the Monitor by selecting View -> Save Layout.
Update Settings
Enable Manual Refreshing
If your Auto Refreshing Intervals are set to longer intervals, manual refreshing in the Monitor can be enabled to allow
users to get the most up to date data immediately. To prevent users from abusing manual refreshing, a minimum
interval between manual refreshes can be configured.
Sorting and Filtering
For farms that have a large number of jobs (10,000+) or slaves (1000+), disabling Automatic Sorting and Filtering in
the lists in the Monitor can improve the Monitor’s overall performance. This option in the Repository Options can be
used to disable Automatic Sorting and Filtering by default, and users can enable it later in their Monitors if desired.
5.1.4 Slave Settings
These settings affect the Deadline Slave application on each machine.
Slave Settings
General
• Limit the number of characters per line for standard output handling: Lines of standard output that are
longer than the specified limit will be ignored by the Slave’s stdout handling.
• Delete Offline/Stalled Slaves from the Repository after this many days: Slaves that are Offline or Stalled
will be removed from the Repository after this many days.
• Gather System Resources (CPU and RAM) When Rendering Tasks On Linux/Mac: If enabled, the Slave
will collect CPU and RAM usage for a task while it is rendering. We have seen cases where this can cause the
Slave to crash on Linux or Mac, so you should only disable this feature if you run into this problem.
• Use fully qualified domain name (FQDN) for Machine Name instead of host name: If enabled, the Slave
will try to use the machine’s fully qualified domain name (FQDN) when setting its Machine Name instead of
using the machine’s host name. The FQDN will then be used for Remote Control, which can be useful if the
remote machine name isn’t recognized in the local network. If the Slave can’t resolve the FQDN, it will just use
the host name instead.
• Use Slave’s IP Address for Remote Control: If enabled, the Slave’s IP address will be used for remote control
instead of trying to resolve the Slave’s host name.
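The FQDN fallback described above can be sketched in Python's standard library, where `socket.getfqdn()` returns its input unchanged when no fully qualified name can be resolved (the function name `resolve_machine_name` is illustrative, not part of Deadline):

```python
import socket

def resolve_machine_name(use_fqdn):
    """Pick the name a Slave would report, preferring the FQDN when asked.

    Hypothetical sketch: falls back to the plain host name when the FQDN
    cannot be resolved, mirroring the behaviour described above.
    """
    host = socket.gethostname()
    if use_fqdn:
        fqdn = socket.getfqdn()
        if fqdn and fqdn != "localhost":
            return fqdn
    return host
```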
Wait Times
• Number of Minutes Before An Unresponsive Slave is Marked as Stalled: If a slave has not provided a status
update in this amount of time, it will be marked as stalled.
• Number of Seconds To Wait For a Response When Connecting to Pulse: The number of seconds a slave that
is connected to Pulse will wait for Pulse to respond when querying for a job.
• Number of Seconds Between Thermal Shutdown Checks if Pulse is Offline: The number of seconds between
thermal shutdown checks. The Slave only does this check if Pulse is not running.
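The stalled-Slave rule above is a simple age check on the last status update; a minimal sketch (the function name is illustrative):

```python
from datetime import datetime, timedelta

def is_stalled(last_update, now, threshold_minutes):
    """A slave is marked Stalled when its last status update is older
    than the configured threshold (sketch of the rule described above)."""
    return now - last_update > timedelta(minutes=threshold_minutes)
```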
Extra Properties
Extra arbitrary properties can be set for slaves, and these properties can be given user friendly names so that they can
easily be identified and used to filter and sort slaves in the Monitor.
5.1.5 Performance Settings
These settings are used to influence the performance of Deadline by modifying update intervals.
Auto Adjust
The auto adjust option will try to choose the best interval settings based on the number of slaves in your farm. These
should act as a good base that you can modify later as necessary. Press the Auto Adjust button to bring up the interval
settings. Note that this will show you what your current settings are, and what they’ll be changed to based on the
number of slaves you entered.
Monitor Refresh Intervals
• Number of Seconds Between Job Updates: This controls how often the Monitor reads in new job updates.
• Number of Seconds Between Slave Updates: This controls how often the Monitor reads in new slave updates.
• Number of Seconds Between Pulse Updates: This controls how often the Monitor reads in new pulse updates.
• Number of Seconds Between Limit Updates: This controls how often the Monitor reads in new limit updates.
• Number of Seconds Between Settings Updates: This controls how often the Settings such as groups, pools
and users are updated.
• Number of Seconds Between Cloud Updates: This controls how often the Monitor updates the Cloud Panel.
• Number of Seconds Between Balancer Updates: This controls how often the Monitor reads in new Balancer
updates.
Slave Intervals
• Number of Seconds Between Slave Information Updates: This controls how often the Slave updates the
information that’s shown in the Slave list in the Monitor.
• Number of Seconds Between Queries For New Tasks While the Slave is Rendering: The number of seconds
a Slave will wait after it finishes a task before moving on to another. This delay is not applied when the Slave is
idle.
• Multiplier to determine seconds between queries while the Slave is Idle: The multiplier to be applied to the
number of slaves that will determine how long a slave will wait between polls to the Repository for tasks when
it is idle.
• Maximum number of seconds between Job queries while the Slave is Idle: The maximum number of seconds
a slave will wait between polls to the Repository for tasks when it is idle.
• Minimum number of seconds between Job queries when the Slave is Idle: The minimum number of seconds
a slave will wait between polls to the Repository for tasks when it is idle.
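Taken together, the three idle settings above suggest that the idle polling interval scales with the size of the farm and is clamped between the minimum and maximum. This is a sketch of that interaction under the assumption that the multiplier is applied directly to the slave count; Deadline's exact formula may differ:

```python
def idle_poll_interval(num_slaves, multiplier, min_seconds, max_seconds):
    """Clamp (multiplier * slave count) between the configured bounds.

    Illustrative only: larger farms poll less often, within the
    configured minimum and maximum.
    """
    return max(min_seconds, min(max_seconds, num_slaves * multiplier))
```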
5.1.6 Pulse Settings
These settings control how the Slaves connect to Pulse for Throttling, and are also used by the Slave to determine if
Pulse is running.
General
• Maximum Incoming Connections: The maximum number of Slaves that can connect to Pulse at any given
time.
• Connection Timeout (in milliseconds): The number of milliseconds messages to and from Pulse have to
complete before they timeout.
• Maximum Connection Attempts: The maximum number of times a Slave will attempt to connect to Pulse
before giving up.
• Stalled Pulse Threshold (in minutes): Deadline determines if a Pulse has stalled by checking the last time that
the Pulse has provided a status update. If a Pulse has not updated its state in the specified amount of time, it will
be marked as Stalled.
• Use Pulse’s IP Address When Slaves Connect To Pulse and For Remote Control: If enabled, the Pulse’s IP
address will be used when the slaves connect to pulse, and for remote control, instead of trying to resolve the
Pulse’s host name.
Power Management
• Power Management Check Interval: How often Pulse performs Power Management operations.
Throttling
Throttling can be used to limit the number of slave applications that are copying over the job files at the same time.
This can help network performance if large scene files are being submitted with the jobs. Note that a Slave only copies
over the job files when it starts up a new job. When it goes to render subsequent tasks for the same job, it will not be
affected by the throttling feature.
• Enable Throttling: Allow throttling to occur.
• Maximum Number of Slaves That Can Copy Job Files at The Same Time: The maximum number of Slaves
that can copy a scene file at the same time.
• The Interval a Slave Waits Between Updates To See If It Can Start Copying Job Files: The amount of
time (in seconds) a Slave will wait to send throttle checks and updates to Pulse.
• Throttle Update Timeout Multiplier (based on the Slave Interval): The interval a slave waits between updates is multiplied by this value to determine the timeout value.
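Conceptually, throttling behaves like a counting semaphore: only a fixed number of Slaves hold a "copy slot" at once, and the rest wait. A minimal single-process sketch (the names `MAX_CONCURRENT_COPIES` and `copy_job_files` are illustrative; in Deadline the coordination happens through Pulse, not a local semaphore):

```python
import threading

# Bounds how many callers copy job files concurrently, mirroring the
# "Maximum Number of Slaves That Can Copy Job Files" setting.
MAX_CONCURRENT_COPIES = 4
copy_slots = threading.BoundedSemaphore(MAX_CONCURRENT_COPIES)

def copy_job_files(copy_fn):
    with copy_slots:  # blocks until a copy slot is free
        copy_fn()
```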
Web Service
Enable the Web Service
The Web Service allows you to execute commands and scripts from a browser, and must be enabled to use the Mobile
applications and the Pulse RESTful API (see REST Overview). While there is a standalone web service application, it
can also be enabled in Pulse if you are running it. All other Web Service settings can be set in the Web Service page,
which is covered further down this page.
• Enable the Web Service: Makes the Pulse Web Service Available. Note that if you enable or disable the Web
Service feature while Pulse is running, it must be restarted for the changes to take effect.
5.1.7 Balancer Settings
These settings control general settings for the Balancer.
• Balancer Update Interval: How often the Balancer performs a balancing cycle.
• Current Algorithm Logic: The Balancer Plugin to use for determining balancing targets.
• Use Balancer’s IP Address for Remote Control: If enabled, the Balancer’s IP address will be used for remote
control instead of trying to resolve the Balancer’s host name.
• Stalled Balancer Threshold (in minutes): Deadline determines if a Balancer has stalled by checking the last
time that the Balancer has provided a status update. If a Balancer has not updated its state in the specified
amount of time, it will be marked as Stalled.
• Error Tolerance: How many times we try to connect to the primary Balancer before it fails and we make
another Balancer the new primary.
• Enable Group Switching: If enabled, and there are group mappings that share the same image and hardware
types, instances will move between groups as needed. If disabled, instances will shut down and start up as
normal.
The settings for the currently selected Algorithm Logic will be shown here as well (if there are any settings).
5.1.8 Region Settings
This is where you can set up Regions in Deadline. Regions are logical groupings for slaves and users. Cross Platform
Rendering and Balancer Settings can be unique to each region. For example, a slave that’s in the ‘thinkbox_west’
Region will use the path mapping settings for that Region. The list on the right shows the Cloud Regions and the
list on the left shows the general Regions. Regions must have a unique name; ‘all’ and ‘none’ are reserved names that
cannot be used. See Regions for more information.
5.1.9 Email Notification
This section handles all email related settings within the repository.
Primary and Secondary Server
Set up a primary SMTP server to send email notifications. You can set up an optional secondary SMTP server for
Deadline to use if the primary server is unavailable.
• SMTP Server: The SMTP server used by Deadline to send emails.
• Sender Account: The email account that Deadline will use to send emails from.
• Port: The SMTP port to use.
• Use SSL: Enable to use SSL when connecting to the SMTP server.
• SMTP Server Requires Authentication: Enable if the SMTP server requires a user name and password to
authenticate.
• Testing: Send a test email to the specified email address.
• Automatically Generate Email Addresses for New Users: Generates new email address for new users in the
form username@postfix, where ‘postfix’ is the value entered in the Email Address Postfix field.
Note that if you have SSL enabled, you may need to configure your Linux and OSX machines for SSL to work. The
process for doing this is explained in Mono’s Security Documentation.
If you are using Google Mail to send emails (smtp.gmail.com), you will typically use port 25 if SSL is disabled, and
port 465 if SSL is enabled. See Google’s documentation on Sending Emails for more information.
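The primary/secondary server behaviour can be sketched with Python's standard `smtplib`: try the primary host, fall through to the secondary on failure, and pick the port according to the SSL setting as described above. The helper names and the try-in-order logic are assumptions for illustration, not Deadline's actual implementation:

```python
import smtplib

def smtp_port(use_ssl):
    """Port choice per the Gmail guidance above: 465 with SSL, else 25."""
    return 465 if use_ssl else 25

def send_notification(servers, sender, recipients, message, use_ssl=False):
    """Try each SMTP server in order (primary first, then secondary)."""
    smtp_cls = smtplib.SMTP_SSL if use_ssl else smtplib.SMTP
    for host in servers:
        try:
            with smtp_cls(host, smtp_port(use_ssl), timeout=10) as smtp:
                smtp.sendmail(sender, recipients, message)
            return host  # report which server succeeded
        except OSError:
            continue  # server unavailable; fall through to the next one
    raise RuntimeError("no SMTP server reachable")
```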
Notifications
• Job Completed: When a job completes, an email will be sent to these email addresses.
• Job Timed Out: When a job times out, an email will be sent to these email addresses.
• Job Error Warning: When a job accumulates a certain number of errors, a warning email will be sent to these
email addresses. You can configure the warning limit in the Failure Detection settings.
• Job Failed: When a job fails, an email will be sent to these email addresses.
• Job Corrupted: When a corrupted job is detected, an email will be sent to these email addresses.
• Slave License Errors: When a slave is unable to get a license, an email will be sent to these email addresses.
• Slave Status Errors: When a slave is unable to update its state in the Repository, an email will be sent to these
email addresses.
• Slave Error Warning: When a slave accumulates a certain number of errors in one session, a warning email
will be sent to these email addresses. You can configure the warning limit in the Failure Detection settings.
• Stalled Slave: When a stalled slave is detected, an email will be sent to these email addresses.
• System Administrator: When users use the option in the Error Report Viewer to report error messages to their
system administrator, those emails will be sent to these email addresses.
• Low Database Connections: Low Database connection notification emails will be sent to these email addresses.
• Database Connection Thresholds: When the number of available database connections falls below the set threshold, a warning email will be sent.
Power Management Notifications
• Idle Shutdown: Notifications for Idle Shutdown operations will be sent to these email addresses.
• Machine Startup: Notifications for Machine Startup operations will be sent to these email addresses.
• Thermal Shutdown: Notifications for Thermal Shutdown operations will be sent to these email addresses.
• Machine Restart: Notifications for Machine Restart operations will be sent to these email addresses.
5.1.10 House Cleaning
Pending Job Scan
• Pending Job Scan Interval: The maximum amount of time between Pending Job Scans in seconds.
• Allow Slaves to Perform the Pending Job Scan If Pulse is not Running: If enabled, the Slaves will perform
the pending job scan if Pulse is not running. If disabled, only Pulse can perform the pending job scan.
• Run Pending Job Scan in a Separate Process: If enabled, the pending job scan will be run in a separate
process. This can be useful when using dependency scripts to ensure that a crash caused by the script doesn’t
cause the main application (Pulse, Slave, or Monitor) to crash.
– Write Pending Job Scan Output to Separate Log File: If enabled, all output from the pending job scan
will be placed into a separate log file.
– Pending Job Scan Process Timeout: If running the pending job scan in a separate process, this is the
maximum amount of time the process can take before it is aborted.
• Asynchronous Job Events: If enabled, many job events will be processed asynchronously by the Pending Job
Scan operation, which can help improve the performance of the Monitor when performing operations
on batches of jobs. If this is enabled, the OnJobSubmitted event will still be processed synchronously to ensure
that any updates to the job are committed before the job can be picked up by Slaves.
– Maximum Job Events Per Session: The maximum number of pending job events that can be processed
per scan.
House Cleaning
• House Cleaning Interval: The maximum amount of time between House Cleaning operations in seconds.
• Allow Slaves to Perform House Cleaning If Pulse is not Running: If enabled, the Slaves will perform house
cleaning if Pulse is not running. If disabled, only Pulse can perform house cleaning.
• Run House Cleaning in a Separate Process: If enabled, the house cleaning operation will be run in a separate
process.
– Write House Cleaning Output to Separate Log File: If enabled, all output from the house cleaning will
be placed into a separate log file.
– House Cleaning Process Timeout: If running the house cleaning in a separate process, this is the maximum amount of time the process can take before it is aborted.
• House Cleaning Maximum Per Session
– Maximum Deleted Jobs: The maximum number of deleted jobs that can be purged per session.
– Maximum Archived Jobs: The maximum number of jobs that can be archived per session.
– Maximum Auxiliary Folders: The maximum number of job auxiliary folders that can be deleted per
session.
– Maximum Job Reports: The maximum number of jobs report files that can be deleted per session.
Repository Repair
• Repository Repair Interval: The maximum amount of time between Repository Repair operations in seconds.
• Allow Slaves to Perform the Repository Repair If Pulse is not Running: If enabled, the Slaves will perform
the repository repair if Pulse is not running. If disabled, only Pulse can perform the repository repair.
• Run Repository Repair in a Separate Process: If enabled, the repository repair operation will be run in a
separate process.
– Write Repository Repair Output to Separate Log File: If enabled, all output from the repository repair
will be placed into a separate log file.
– Repository Repair Process Timeout: If running the repository repair in a separate process, this is the
maximum amount of time the process can take before it is aborted.
• Automatic Primary Election: If enabled, the Repository Repair operation will elect another running
Pulse/Balancer instance as the Primary if the current Primary instance is no longer running.
5.1.11 Auto Configuration
This allows you to configure your Slaves from a single location. When a Slave starts up, it will automatically pull this
configuration from Pulse and apply it before fully initializing. See the Auto Configuration documentation for more
information.
5.1.12 User Security
• Super User Password: The password needed to access Super User Mode in the Monitor. Leave blank for no
password.
• Enhanced User Security: When using the System User for the Deadline User, the only way to switch Deadline
users is to log off the system and log back in as someone else. This helps improve Deadline’s user security, as
it prevents users from impersonating others to modify their jobs.
– Use The System User For The Deadline User: Enable to use enhanced user security, which prevents
users from impersonating others.
• Rendering Jobs As User: By default, the rendering process will run under the same user account that the
Slave is running as. If Render Jobs As User is enabled, the rendering process will run under the user account
associated with the user that submitted the job. Each Deadline user must have their Render Jobs As User settings
configured properly for this to work. On Windows, the user’s Run As Name, Domain, and Password settings
will be used to start the rendering process as that user. On Linux and Mac OS X, only the user’s Run As Name
setting will be used with ‘su’ or ‘sudo’ to start the rendering process as that user. Note that on Linux and Mac
OS X, the Slave must be running as root for this to work properly.
– Render Jobs As User: Enable to have jobs render as the user that submitted them.
– Use ‘su’ Instead Of ‘sudo’ On Linux and Mac OS X: If enabled, ‘su’ will be used to run the process as
another user instead of ‘sudo’. This setting is ignored on Windows.
– Preserve Environment On Linux and Mac OS X: If enabled, the user environment will be preserved
when running the process as another user using ‘su’ or ‘sudo’. This setting is ignored on Windows, and is
ignored on Mac OS X when using ‘su’ instead of ‘sudo’.
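The su/sudo wrapping described above might be constructed along these lines; the function name and exact argument order are illustrative assumptions, not Deadline's actual code, but `sudo -u`, `sudo -E`, `su -c`, and `su -m` are standard options on Linux:

```python
def build_render_command(cmd, run_as, use_su=False, preserve_env=False):
    """Sketch: wrap a render command so it runs as the submitting user."""
    if use_su:
        wrapper = ["su", run_as, "-c", " ".join(cmd)]
        if preserve_env:
            wrapper.insert(1, "-m")  # su -m keeps the caller's environment
        return wrapper
    wrapper = ["sudo", "-u", run_as]
    if preserve_env:
        wrapper.append("-E")  # sudo -E preserves the caller's environment
    return wrapper + cmd
```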
5.1.13 Job Settings
Job Scheduling
Scheduling Order
• Job Scheduling Order: The order of priority that Deadline uses to schedule jobs. See the Job Scheduling
documentation for more details.
• Priority Weight: Weight given to job priority when using a Weighted scheduling order.
• Submission Time Weight: Weight given to job submission time when using a Weighted scheduling order.
• Error Weight: Weight given to the number of errors a job has when using a Weighted scheduling order.
• Rendering Task Weight: Weight given to the number of rendering tasks a job has when using a Weighted
scheduling order.
• Rendering Task Buffer: A buffer that is used by slaves to give their job extra priority on the farm.
• Enhanced Balancing Logic: If enabled, an enhanced method of balancing slaves between jobs is used,
which should prevent slaves from jumping between jobs as much. This feature is still considered experimental.
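The four weights above combine into a single score per job under the Weighted scheduling order. The exact formula Deadline uses is not stated here, so the following linear combination is only an illustration of how the weights interact (higher score = scheduled first; all names are hypothetical):

```python
def weighted_job_score(priority, minutes_since_submission, error_count,
                       rendering_tasks, weights):
    """Hypothetical weighted score combining the four terms listed above."""
    return (priority * weights["priority"]
            + minutes_since_submission * weights["submission_time"]
            + error_count * weights["error"]
            + rendering_tasks * weights["rendering_task"])
```

For example, with a heavy priority weight and a small negative error weight, high-priority jobs dominate the queue while error-prone jobs slowly sink.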
Submission Limitations
• Task Limit For Jobs: The maximum number of tasks a job can have. Note that this does not impose a frame
limit, so you can always increase the number of frames per task to stay below this limit.
• Maximum Job Priority: The maximum priority value a job can have.
Automatic Job Timeout
Configure Deadline to automatically determine a timeout for a job based on the render times of tasks that have already
completed. If a task goes longer than that timeout, a timeout error will occur and the task will be requeued.
• Minimum number of completed tasks required before calculating a timeout: The minimum number of tasks
that must be completed before Auto Job Timeout Checking occurs.
• Minimum percent of completed tasks required before calculating a timeout: The minimum percent of tasks
that must be completed before Auto Job Timeout Checking occurs.
• Enforce an automatic job timeout for all jobs: If enabled, the Auto Job Timeout will be enabled for all jobs,
overriding the per-job specification of the value.
• Timeout Multiplier: To calculate the Auto Job Timeout, the longest render time of the completed tasks is
multiplied by this value to determine the timeout time.
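Putting the four settings above together, the auto timeout can be computed as follows; the thresholds gate the calculation, and the longest completed render time is scaled by the multiplier (function and parameter names are illustrative):

```python
def auto_job_timeout(completed_times, total_tasks, min_completed,
                     min_percent, multiplier):
    """Timeout = longest completed task time * multiplier, once the
    minimum count and percentage of completed tasks are both met.
    Returns None while the thresholds are unmet (no timeout yet)."""
    done = len(completed_times)
    if done < min_completed or done * 100.0 / total_tasks < min_percent:
        return None
    return max(completed_times) * multiplier
```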
Failure Detection
Job Failure Detection
Sends warnings and fails jobs or tasks if they generate too many errors.
• Send a warning to the job’s user after it has generated this many errors: A warning will be sent to the job’s
notification list once its error count has reached this value. By default, the submitting user is automatically added to this list.
• Mark a job as failed after it has generated this many errors: The number of errors a job must throw before
it is marked as failed.
• Mark a task as failed after it has generated this many errors: The number of errors a task must throw before
it is marked as failed.
• Automatically delete corrupted jobs from the Repository: If enabled, any job found to be corrupted will
be automatically removed from the Repository.
• Maximum Number of Job Error Reports Allowed: This is the maximum number of error reports each job
can generate. Once a job generates this many errors, it will fail and cannot be resumed until some of its error
reports are deleted or this value is increased.
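The three job-level thresholds above can be summarized as a simple state mapping from a job's accumulated error count (a sketch; the state names are illustrative, not Deadline terms):

```python
def job_failure_state(error_count, warn_after, fail_after, max_reports):
    """Map a job's accumulated error count to the actions described above."""
    if error_count >= max_reports:
        return "failed-unresumable"  # delete reports or raise the limit to resume
    if error_count >= fail_after:
        return "failed"
    if error_count >= warn_after:
        return "warned"
    return "ok"
```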
Slave Failure Detection
Sends warnings and prevents Slaves from reattempting jobs that keep generating errors.
• Send a warning after a Slave has generated this many errors for a job in a row: The maximum number of
errors that can occur before email warnings are sent to the users specified in the Email Notification section.
• Mark a Slave as bad after it has generated this many errors for a job in a row: If a Slave hits this many
errors, it will be marked as bad for its current job.
• Frequency at which a slave will attempt a job that it has been marked bad for: The percentage of time a
Slave will attempt a task it has been marked bad for if no good jobs are available.
Cleanup
Automatic Job Cleanup
• Cleanup Jobs After This Many Days: If enabled, this is the number of days to wait before cleaning up unarchived jobs.
• Cleanup Mode: Whether the cleanup should archive the jobs found or delete them.
• You can also set the number of hours since the job was last modified before cleaning it up.
Deleted Job Purging
• Set the number of hours after a job has been deleted before it is purged from the database.
Auxiliary Files
Many jobs have an option to submit the scene file and other auxiliary files with the job. This can be useful because it
stores a copy of the scene file with the job that can be referred to later. However, if these files are large and
the Repository server isn’t designed to handle this load, it can seriously impact the Repository machine’s performance.
This problem can be avoided by storing these files on a different server that is designed to handle the
load.
• Store job auxiliary files in a different location: If enabled, job auxiliary files submitted to Deadline will be
stored at a location specified and not the Repository.
Extra Properties
Extra arbitrary properties can be submitted with a job, and these properties can be given user friendly names so that
they can easily be identified and used to filter and sort jobs in the Monitor.
5.1.14 Application Logging
Application Log Cleanup
• Delete Monitor logs after this many days: The number of days before a Monitor log will be deleted.
• Delete Slave logs after this many days: The number of days before a Slave log will be deleted.
• Delete Pulse logs after this many days: The number of days before a Pulse log will be deleted.
• Delete Balancer logs after this many days: The number of days before a Balancer log will be deleted.
• Delete Launcher logs after this many days: The number of days before a Launcher log will be deleted.
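The age-based cleanup described above amounts to deleting log files whose modification time is older than the configured number of days. A minimal sketch that only identifies the stale files (the function name is illustrative, and Deadline's actual cleanup logic may differ):

```python
import os
import time

def stale_logs(folder, max_age_days, now=None):
    """Return log file names older than the configured age, sorted."""
    now = now if now is not None else time.time()
    cutoff = now - max_age_days * 86400  # seconds per day
    return [name for name in sorted(os.listdir(folder))
            if os.path.getmtime(os.path.join(folder, name)) < cutoff]
```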
History Entries
• Maximum Number of Repository History Entries: The maximum number of repository history entries that
are stored before old entries are overwritten.
• Maximum Number of Job History Entries: The maximum number of job history entries that are stored before
old entries are overwritten.
• Maximum Number of Slave History Entries: The maximum number of slave history entries that are stored
before old entries are overwritten.
• Maximum Number of Pulse History Entries: The maximum number of pulse history entries that are stored
before old entries are overwritten.
• Maximum Number of Balancer History Entries: The maximum number of balancer history entries that are
stored before old entries are overwritten.
Logging Verbosity
• Slave Verbose Logging: If enabled, more information will be written to the Slave log while it is running.
• Pulse Verbose Logging: If enabled, more information will be written to the Pulse log while it is running.
• Balancer Verbose Logging: If enabled, more information will be written to the Balancer log while it is running.
5.1.15 Statistics Gathering
Configure Deadline to keep track of job and farm statistics. Note that Pulse must be running to gather Slave and
Repository statistics. Job statistics will be gathered regardless of whether Pulse is running.
• Enable Statistics Gathering: If enabled, Deadline will gather statistical information.
• Slave Statistics Gathering Interval (in minutes): The amount of time between polling Slaves for statistical
information.
• Repository Statistics Gathering Interval (in minutes): The amount of time between polling the Repository
for statistical information.
• Delete Job Statistics After This Many Days: The number of days from generation that job statistics will be
kept before they are deleted.
• Delete Slave Statistics After This Many Days: The number of days from generation that Slave statistics will
be kept before they are deleted.
• Delete Repository Statistics After This Many Days: The number of days from generation that Repository
statistics will be kept before they are deleted.
5.1.16 Mapped Paths
Paths to be mapped before rendering (based on operating system). You may add, remove, or edit paths, as well as
modify the order in which they will be mapped. See the Cross Platform Rendering section for more details.
5.1.17 Mapped Drives
Drives to be mapped before rendering (Windows only).
• Drive: The drive to be mapped.
• Remote Path: The remote path for the drive.
• Only Map If Unmapped: Enable to only map the drive if it is unmapped. Disabled by default.
• Requires Authentication: (Optional) Enable if the drive requires authentication. If unchecked, the existing
logged in user account credentials will be used.
• Username: Username. Must not be blank.
• Password: Password. Must not be blank.
Note that drives can be mapped when running as a service. Beware that if a user is logged in and has mapped drives
set up, the Deadline Slave service won’t see them, because services run in a different environment. However, if the
drives are mapped in the service’s environment (which is what the Slave does), they will work fine. The following
setting can help avoid this situation.
• Only map drives when the Slave is running as a service: If checked, the Slave will only map the drives when it’s
running as a service. If unchecked, the drives will also be mapped when the Slave is running as a normal application.
5.1.18 Script Menus
There are many scripts that ship with Deadline, and it’s more than likely that you don’t need to use them all, especially
the submission scripts. Here, you can configure the contents of the individual script menus to only display what you
use. You can also set icons and keyboard shortcuts for your script menu items. If a script menu item has the same
shortcut as an existing menu item, the script menu item’s shortcut will take precedence.
Note though that these settings will affect all Monitors that connect to this Repository.
5.1.19 Python Settings
A list of additional paths to be added to the Python search paths. Specify one path per line, and use the Add Path
button to browse for paths.
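Adding paths here has roughly the same effect as appending them to the interpreter's search path, as a Python script could do directly (the paths below are hypothetical examples):

```python
import sys

# Hypothetical studio paths, as might be listed in the Python Settings page.
EXTRA_PATHS = ["/studio/tools/python", "/studio/pipeline/modules"]

for path in EXTRA_PATHS:
    if path not in sys.path:
        sys.path.append(path)  # modules in these folders become importable
```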
5.1.20 Wake On Lan Settings
Deadline’s Power Management uses Wake On Lan to wake up machines, and you can configure which port(s) the
WOL packet is sent over. If no ports are listed here, Deadline will use port 9 by default.
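A standard WOL magic packet is six 0xFF bytes followed by the target MAC address repeated 16 times, sent over UDP. A minimal sketch of what Deadline sends on the configured port (the function name is illustrative; port 9 matches the default noted above):

```python
import socket

def wake_on_lan(mac, broadcast="255.255.255.255", port=9):
    """Send a standard WOL magic packet and return it for inspection."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    packet = b"\xff" * 6 + mac_bytes * 16  # 6 + 6*16 = 102 bytes
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(packet, (broadcast, port))
    sock.close()
    return packet
```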
5.1.21 Web Service
The Web Service allows you to execute commands and scripts from a browser, and must be enabled to use Deadline’s
Mobile applications. The Web Service can be run as a console application or as part of Pulse. Note that only one
instance of the Web Service can run on a machine at a time. Also note that all changes to the Web Service settings
require the Web Service to be restarted before they will be implemented.
• Listening Port: The port on which the Web Service will listen.
• Connection Limit: The maximum number of concurrent connections allowed for the Pulse Web Service.
• Connection Timeout (in seconds): The amount of time in between sending and receiving messages to and from
the Web Service before a timeout occurs.
If the Web Service requires authentication, users would use their Deadline user name along with the password stored
in their User Settings. If empty passwords are allowed, they can leave their password setting blank.
• Require Authentication: If enabled, the Pulse Web Service will require a username and password. These are
stored in the user settings.
• Allow Empty Passwords: If enabled, the Web Service will accept empty passwords.
• Allow Execution of Non-Script Commands: If enabled, users are allowed access to Deadline Command
commands.
5.2 User Management
5.2.1 Overview
Deadline has its own user system, which is primarily used to tie users to Jobs. By default, users cannot control or
modify the settings of another User’s Jobs.
Each user can configure their own user settings from the Monitor by selecting Tools -> Options. See the Monitor and
User Settings documentation for more information on the available user settings.
5.2.2 Managing Users
Administrators can manage all users from the Monitor. This is done by selecting Tools -> Manage Users in Super
User mode, or as a user with appropriate User Group privileges. From here, you can add or remove individual users,
and edit their user settings. See the Monitor and User Settings documentation for more information on the available
user settings.
5.2.3 User Security
User Security settings can be configured in the Repository Configuration.
By default, Deadline does not enforce Enhanced User Security. This means that a user can switch to a different User
and edit someone else’s Jobs. For some pipelines, this “honor system” will work fine, but if you are looking for tighter
security, you should enable Enhanced User Security so that the system user is used as the Deadline User. When this
option is enabled, users will not be able to switch to another Deadline User unless they log off their system and log
back in as someone else.
It is also recommended that you add a Super User password if you are looking for enhanced security, as a Super
User mode without a password would allow Users to circumvent the User Job-editing restrictions, as well as any
restrictions imposed on them by their User Groups (see below).
5.2.4 User Groups
User Groups allow Administrators to restrict what functionality is available to certain users, as well as make certain
features accessible to others without requiring the use of the Super User mode.
Deadline automatically creates an ‘Everyone’ User Group, which always contains all Users, and cannot be removed
or disabled. This User Group is also populated with the default Permission Settings recommended for normal users.
Managing User Groups
The User Group Management section can be accessed as a Super User through the Tools -> Manage User Groups
menu in the Monitor.
The left side of this dialog contains the list of User Groups that have already been created in the Repository. There are
also controls allowing you to manipulate this list in many ways:
• Add: Will create a new User Group using the default options and feature access levels (equivalent to the default
‘Everyone’ group before modification).
• Remove: Will delete the selected User Group from the Repository. Note that the ‘Everyone’ group can never
be Removed in order to guarantee that all Users will at least be part of this group.
• Clone: Will create a new User Group using the Options and Feature Access Levels of the currently selected
group as defaults.
This list is visible regardless of which tab is selected, allowing you to quickly change which Group you’re modifying,
and ensuring you’re always aware of which one is currently selected.
General Options
This tab contains basic higher-level settings for User Groups. Note that most of the features on this tab, described
below, will be disabled when modifying the ‘Everyone’ group, since it is a special Group that must always be active
and enabled for all Users.
• Group Options
– Group Enabled: Indicates whether this User Group is currently active. Disabling a User Group
instead of removing it altogether can be useful if you just want to temporarily disable access for a
group of users without having to re-create it later. This is always enabled for the ‘Everyone’
Group.
– Group Expires: This setting will cause a Group to only be valid up to the specified Date and Time.
This can be useful if you are hiring temporary staff and know in advance that you will need to revoke
their access on a certain Date. This cannot be set for the ‘Everyone’ Group.
• Job Access Level
– Can View Other Users’ Jobs: This setting determines whether or not Users belonging to the Group
can see other users’ jobs.
– Can Modify Other Users’ Jobs: This setting indicates whether or not Users in this Group should be
allowed to modify other users’ jobs (change properties, job state, etc).
– Can Handle Protected Jobs: This setting determines whether or not Users belonging to the Group
can archive or delete protected jobs that don’t belong to them.
– Can Submit Jobs: This setting determines whether or not Users belonging to the Group can submit
jobs.
• Default Monitor Layout: Here you can select a Monitor layout that was added to the Repository Configuration.
This layout will act as the default for users belonging to this user group. The Priority setting is used as a tie
breaker if a user is part of more than one group with a default layout. When a user selects View -> Reset Layout,
it will reset to their user group’s default layout instead of the normal default. Finally, if the Reset Layout On
Startup setting is enabled, the Monitor will always start up with that layout when it is launched.
• Time-Restricted Access: This section allows you to set windows of time during which this Group is considered
Active. This is useful if you want to set up permissions to change based on the time of day, or if you just want
to lock out certain Users after hours. This cannot be enabled for the ‘Everyone’ Group.
• Group Members: This is where you control which Users are considered members of the currently selected
Group. Users can be part of multiple Groups. All Users are always part of the ‘Everyone’ Group, and this
cannot be changed.
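The enabled, expiry, and time-restriction options above combine to determine whether a Group is currently considered active. A minimal sketch of that evaluation (the data layout is illustrative, not Deadline's internal representation):

```python
from datetime import datetime, time

def group_is_active(enabled, expires, windows, now):
    """Combine the Group Enabled, Group Expires, and Time-Restricted
    Access options into one check. An empty window list means no
    time restriction applies."""
    if not enabled:
        return False
    if expires is not None and now > expires:
        return False                 # group has expired
    if windows:
        t = now.time()
        return any(start <= t <= end for start, end in windows)
    return True

now = datetime(2015, 5, 4, 14, 30)
# Enabled, expires June 1st, active 9am-5pm -> active at 2:30pm:
print(group_is_active(True, datetime(2015, 6, 1), [(time(9), time(17))], now))
```

The ‘Everyone’ Group would always pass this check, since it cannot be disabled, expire, or be time-restricted.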
Controlling Feature Access
The other tabs in the Group Management dialog are dedicated to enabling or restricting access to certain Features on
a per-group basis.
Each tab displays a different type of Feature, representing a different aspect of the end-user experience:
• Menu Items: This tab contains all the Menu Item features, including the main menu bar, right-click menus, and
toolbar items.
• Job Properties: This tab contains all of a Job’s modifiable properties, and determines which ones a User will
be allowed to change. Note that these settings only apply to Jobs the User is allowed to modify in the first place
(see the Job Access Level section above).
• Scripts: This tab contains all the different types of Scripts a User can run from the Monitor. This section is a
little different from the others, because the actual Features are dynamically generated based on which Scripts are
currently in the Repository. Note that all scripts will default to a value of ‘Inherited’, so make sure to revisit this
screen when adding new Scripts to your Repository.
• UI Features: This tab contains all the different types of Panels that a User can spawn in the Monitor, and
controls whether or not a particular User Group is allowed to spawn them.
These Features are also grouped further within each tab into logical categories, to try and make maintenance easier.
There are three possible Access Levels that you can specify for each Feature:
• Enabled: The members of this Group will have access to this particular Feature.
• Disabled: This Group is not granted access to this Feature. Note, however, that Users in this Group might be
granted access to this Feature by a different Group.
• Inherited: Whether or not this Feature is ‘Enabled’ or ‘Disabled’ is deferred to the Feature’s Parent Category.
Its current inherited value is reflected in the coloured square next to the dropdown; Red indicates it is currently Disabled, while Green indicates it is currently Enabled. Top-level Parents in a category cannot be set to
‘Inherited’.
If Users are part of multiple Groups, they will always use the least-restrictive Group for a particular Feature. In other
words, a given User will have access to a Feature as long as they are part of at least one currently active Group that
has access to that Feature, regardless of whether their other Groups typically allow it.
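The ‘Inherited’ resolution and the least-restrictive rule described above can be sketched as follows; the feature and category names are hypothetical, chosen only to illustrate the mechanism:

```python
PARENT = {"DeleteJob": "JobMenu"}  # hypothetical feature -> parent category

def resolve(group, feature):
    """Resolve one feature within one group: 'Inherited' defers to
    the feature's parent category (which itself defaults closed)."""
    value = group["features"].get(feature, "Inherited")
    if value == "Inherited":
        value = group["features"].get(PARENT.get(feature), "Disabled")
    return value

def has_access(groups, feature):
    """Least-restrictive rule: access is granted if ANY currently
    active group resolves the feature to 'Enabled'."""
    return any(resolve(g, feature) == "Enabled" for g in groups if g["active"])

artists = {"active": True, "features": {"JobMenu": "Enabled", "DeleteJob": "Disabled"}}
leads = {"active": True, "features": {"JobMenu": "Enabled"}}  # DeleteJob inherits
print(has_access([artists], "DeleteJob"))         # False: explicitly disabled
print(has_access([artists, leads], "DeleteJob"))  # True: 'leads' inherits Enabled
```

Note how membership in the less restrictive group wins, exactly as the paragraph above describes.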
5.3 Slave Configuration
5.3.1 Overview
The Slaves panel allows Slaves to be controlled and modified using the right-click menu. Note that the availability of
these options can vary depending on the context in which they are used, as well as the User Group Permissions that
are defined for the current user.
If the Slaves panel is not visible, see the Panel Features documentation for instructions on how to create new panels
in the Monitor.
5.3.2 Slave States
These are the states that a Slave can be in. They are color coded to make it clear which state the Slave is in.
• Offline (gray): The Slave application is closed.
• Idle (white): The Slave application is running, but it is not currently rendering.
• Rendering (green): The Slave application is running, and is rendering a job.
• Stalled (red): A Slave becomes stalled if it hasn’t updated its state for a certain amount of time. This could be
because the machine crashed, or the Slave simply didn’t shut down cleanly.
• Disabled (yellow): The Slave has been disabled by an administrator. This prevents the Slave application from
launching on the machine.
• License Warning: The Slave received a license error when it last attempted to render. View the Job Reports to
find the exact error message.
• License Problems (orange): The Slave cannot acquire a license, or its temporary license is about to expire.
If you see an orange Slave in the Slave list, it means that the Slave is having licensing problems, or that the license it
is using will expire in less than 10 days. You can check the License column in the Slave list to see what the problem
is.
If you see a red Slave, it means the Slave has been marked as stalled. This happens if the Slave hasn’t updated its state
for a certain amount of time. You can configure this amount of time in the Wait Times section of the Slave Settings in
the Repository Configuration. When a Slave is marked as stalled, it usually means that the machine crashed, or that the
Slave simply didn’t shut down cleanly. In the latter case, you can simply mark the Slave as offline from the right-click
menu.
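The stalled check itself is simple: compare the Slave's last state update against the configured threshold. A sketch (the 10-minute default here is an assumption; the real value comes from the Wait Times settings):

```python
from datetime import datetime, timedelta

def is_stalled(last_update, now, threshold_minutes=10):
    """A Slave is stalled when it hasn't updated its state within
    the configured threshold. The default threshold here is an
    assumption; Deadline reads it from the Slave Settings."""
    return now - last_update > timedelta(minutes=threshold_minutes)

# No update for 30 minutes -> stalled:
print(is_stalled(datetime(2015, 5, 4, 12, 0), datetime(2015, 5, 4, 12, 30)))
```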
The Slave panel’s right-click menu also gives the option to delete or disable Slaves. When disabled, the Slave application will not be allowed to launch on the machine. This is useful if you are doing maintenance on a machine and you
don’t want the Slave accidentally starting up on it.
5.3.3 Job Candidate Filter
If a slave isn’t rendering a job that you think it should be, you can use the Job Candidate Filter option in the Slave
Panel’s drop down menu to try and figure out why. When the option is enabled, simply click on a job in the Job Panel
and the Slave Panel will be filtered to only show the slaves that can render the selected job based on the job’s settings.
The filtering takes the following into account:
• The job’s pool and group (see the Pools and Groups documentation for more information).
• The job’s whitelist/blacklist, and the whitelist/blacklist in the job’s assigned limits (see the Limits and Machine
Limits documentation for more information).
• If the slave has been marked bad for the job (see the Job Failure Detection documentation for more information).
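Conceptually, the filter applies the same checks a Slave performs when deciding whether it can render a job. A sketch of those checks, with illustrative field names rather than Deadline's actual data model:

```python
def can_render(slave, job):
    """The Job Candidate Filter checks, in order. Field names are
    illustrative, not Deadline's actual API."""
    if job["pool"] not in slave["pools"]:
        return False                                  # wrong pool
    if job["group"] not in slave["groups"]:
        return False                                  # wrong group
    if job["whitelist"] and slave["name"] not in job["whitelist"]:
        return False                                  # not whitelisted
    if slave["name"] in job["blacklist"]:
        return False                                  # blacklisted
    if slave["name"] in job["bad_slaves"]:
        return False                                  # marked bad for this job
    return True

slave = {"name": "render01", "pools": ["comp"], "groups": ["nuke"]}
job = {"pool": "comp", "group": "nuke", "whitelist": [],
       "blacklist": [], "bad_slaves": []}
print(can_render(slave, job))  # True
```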
5.3.4 Slave Settings
Most of the Slave settings can be configured from the Monitor while in Super User Mode (or with the proper user privileges) by right-clicking on one or more of them and selecting ‘Modify Slave Properties’. To configure Pools and
Groups, you can use the Tools menu, or you can use the Slave panel’s right-click menu. See the Pools and Groups
documentation for more information.
Note that the only settings here that have an actual impact on rendering are the Concurrent Tasks and CPU Affinity
settings. Furthermore, the CPU Affinity feature is only supported on Windows and Linux operating systems, since
OSX does not support process affinity.
General
These are some general Slave settings:
• Slave Description: A description of the selected Slave. This can be used to provide some pertinent information
about the slave, such as certain system information.
• Slave Comment: A short comment regarding the Slave. This can be used to inform other users why certain
changes were made to that Slave’s settings, or of any known potential issues with that particular Slave.
• Normalized Render Time Multiplier: This value is used to calculate the normalized render time of Tasks. For
example, a Slave that normally takes twice as long to render a Task should be assigned a multiplier of 2.
• Normalized Task Timeout Multiplier: This value is used to calculate the normalized render time of Task
Timeouts. Typically, this should be the same value as above.
• Concurrent Task Limit Override: The concurrent Task Limit for the Slave. If 0, the Slave’s CPU count is
used as the limit.
• Host Name/IP Address Override: Overrides the Host name/IP address for remote commands.
• MAC Address Override: This is used to override the MAC Address associated with this Slave. This is useful
in the event that the slave defaults to a different MAC Address than the one needed for Wake On Lan.
• Region: The Slave’s region. Used for cross platform rendering. Default is ‘None’. See Regions for more
information.
• Exclude Jobs in the ‘none’ Pool: Enable this option to prevent the Slave from picking up Jobs that are assigned
to the ‘none’ Pool.
• Exclude Jobs in the ‘none’ Group: Enable this option to prevent the Slave from picking up Jobs that are
assigned to the ‘none’ Group.
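One plausible reading of the Normalized Render Time Multiplier described above is that a Slave's raw task times are divided by its multiplier, so times become comparable across differently-powered machines. This formula is an assumed interpretation, not taken from Deadline's documentation:

```python
def normalized_render_time(actual_seconds, multiplier):
    """Divide a Slave's raw render time by its multiplier so that a
    machine that is twice as slow (multiplier 2) reports comparable
    times. The exact formula Deadline uses is an assumption here."""
    return actual_seconds / multiplier

# A 10-minute render on a Slave with multiplier 2 normalizes to 5 minutes:
print(normalized_render_time(600, 2.0))  # 300.0
```

The Normalized Task Timeout Multiplier would scale timeouts the same way, which is why the manual suggests keeping the two values identical.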
Idle Detection
These settings can be used to override the global Slave Scheduling settings for the slave (if there are any). It can be
used to start the slave when its machine becomes idle (based on keyboard and mouse activity), and stop the slave when
its machine is in use again. Note that Idle Detection is managed by the Launcher, so it must be running for this feature
to work.
• Start Slave When Machine Idle For: If enabled, the Slave will be started on the machine if it is idle. A
machine is considered idle if there hasn’t been any keyboard, mouse or tablet activity for the specified amount
of time.
• Only Start Slave If CPU Usage Less Than: If enabled, the slave will only be launched if the machine’s CPU
usage is less than the specified value.
• Only Start Slave If Free Memory More Than: If enabled, the slave will only be launched if the machine has
more free memory than the specified value (in Megabytes).
• Only Start Slave If These Processes Are Not Running: If enabled, the slave will only be launched if the
specified processes are not running on the machine.
• Only Start If Launcher Is Not Running As These Users: If enabled, the slave will only be launched if the
launcher is not running as one of the specified users.
• Stop Slave When Machine Is No Longer Idle: If enabled, the Slave will be stopped when the machine is no
longer idle. A machine is considered idle if there hasn’t been any keyboard, mouse or tablet activity for the
specified amount of time.
• Only Stop Slave If Started By Idle Detection: If enabled, the Slave will only be stopped when the machine is
no longer idle if that Slave was originally started by Idle Detection. If the Slave was originally started manually,
it will not be stopped.
• Allow Slave To Finish Its Current Task When Stopping: If enabled, the Slave application will not be closed
until it finishes its current Task.
There are some limitations with Idle Detection depending on the operating system:
• On Windows, Idle Detection will not work if the Launcher is running as a service. This is because the service
runs in an environment that is separate from the Desktop, and has no knowledge of any mouse or keyboard
activity.
• On Linux, the Launcher uses X11 to determine if there has been any mouse or keyboard activity. If X11 is not
available, Idle Detection will not work.
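Taken together, the start-up conditions above form a simple series of checks that the Launcher evaluates. A sketch, with illustrative configuration field names:

```python
def should_start_slave(idle_minutes, cpu_percent, free_mb,
                       processes, launcher_user, cfg):
    """All of the 'Only Start Slave If...' conditions above, as one
    decision function. Thresholds and field names are illustrative."""
    if idle_minutes < cfg["idle_minutes"]:
        return False                      # machine not idle long enough
    if cpu_percent >= cfg["max_cpu_percent"]:
        return False                      # CPU busier than allowed
    if free_mb <= cfg["min_free_mb"]:
        return False                      # not enough free memory
    if any(p in processes for p in cfg["blocked_processes"]):
        return False                      # a blocking process is running
    if launcher_user in cfg["blocked_users"]:
        return False                      # launcher running as a blocked user
    return True

cfg = {"idle_minutes": 10, "max_cpu_percent": 25, "min_free_mb": 1024,
       "blocked_processes": ["maya"], "blocked_users": ["render"]}
print(should_start_slave(15, 5, 4096, [], "jsmith", cfg))  # True
```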
Job Dequeuing
These settings determine when a Slave can dequeue Jobs.
• All Jobs: In this mode, the Slave will dequeue any job.
• Only Jobs Submitted From This Slave’s Machine: In this mode, the Slave will only dequeue jobs submitted
from the machine it’s running on.
• Only Jobs Submitted From These Users: In this mode, the Slave will only dequeue jobs submitted by the
specified users.
CPU Affinity
These settings affect the number of CPUs the Slave renders with (Windows and Linux only):
• Override CPU Affinity: Enable this option to override which CPUs the Slave and its child processes are limited
to.
• Specify Number of CPUs to use: Choose this option if you just want to limit the number of CPUs used, and
you aren’t concerned with which specific CPUs are used.
• Select Individual CPUs: Choose this option if you want to explicitly pick which CPUs are used. This is useful
if you are running multiple Slaves on the same machine and you want to give each of them their own set of
CPUs.
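The two override modes translate naturally into a CPU set: either the first N CPUs, or an explicitly chosen list. On Linux, a process can apply such a set with os.sched_setaffinity; this sketch is illustrative only and is not how the Slave itself applies affinity:

```python
import os

def pick_cpus(count=None, explicit=None):
    """Translate the two override modes above into a CPU set:
    either the first N CPUs, or an explicitly chosen list."""
    if explicit is not None:
        return set(explicit)
    return set(range(count))

cpus = pick_cpus(count=4)            # 'Specify Number of CPUs to use'
# cpus = pick_cpus(explicit=[4, 5])  # 'Select Individual CPUs'

if hasattr(os, "sched_setaffinity"):            # Linux only, per the note above
    allowed = cpus & set(os.sched_getaffinity(0))
    if allowed:
        os.sched_setaffinity(0, allowed)        # restrict this process
```

When running multiple Slaves on one machine, each would receive a disjoint explicit set, matching the ‘Select Individual CPUs’ use case described above.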
Extra Info
As with jobs, arbitrary extra properties can also be set for slaves.
The Extra Info 0-9 properties can be renamed from the Slaves section of the Repository Configuration, and have
corresponding columns in the Slave list that can be sorted on.
5.3.5 Slave Reports and History
All error reports for a Slave can be viewed in the Slave Reports panel. This panel can be opened from the View menu
or from the main toolbar in the Monitor. It can also be opened from the Slave panel’s right-click menu.
You can use the Slave Report panel’s right-click menu to save reports as files to send to Deadline Support. You can
also delete reports from this menu.
In addition to viewing Slave reports, you can also view the Slave’s history. The History window can be brought up
from the Slave panel’s right-click menu by selecting the View Slave History option.
5.3.6 Remote Control
You can view the live log for Slaves or control them remotely from the right-click menu. See the Remote Control
documentation for more information.
5.4 Pulse Configuration
5.4.1 Overview
Pulse has two sets of options that can be configured. There are the global Pulse settings in the Repository Options,
which are applied to every running instance of Pulse, and there are the per-Pulse settings that can be configured from
the right-click menu in the Pulse panel. Note that the availability of these options can vary depending on the context
in which they are used, as well as the User Group Permissions that are defined for the current user.
If the Pulse panel is not visible, see the Panel Features documentation for instructions on how to create new panels in
the Monitor.
5.4.2 Pulse States
These are the states that a Pulse can be in. They are color coded to make it clear which state the Pulse is in.
• Offline (gray): The Pulse application is closed.
• Running (white): The Pulse application is running.
• Stalled (red): Pulse becomes stalled if it hasn’t updated its state for a certain amount of time. This could be
because the machine crashed, or Pulse simply didn’t shut down cleanly.
If you see a red Pulse, it means the Pulse has been marked as stalled. This happens if the Pulse hasn’t updated its
state for a certain amount of time. You can configure the Stalled Pulse Threshold in the General Pulse settings in the
Repository Options. When a Pulse is marked as stalled, it usually means that the machine crashed, or that Pulse simply
didn’t shut down cleanly. In the latter case, you can simply mark Pulse as offline from the right-click menu.
The Pulse panel’s right-click menu also gives the option to delete Pulses.
5.4.3 Pulse Settings
As mentioned above, there are the global Pulse settings in the Repository Options, which are applied to every running
instance of Pulse. However, there are also settings that can be specified for individual Pulse instances, which can be
modified by right-clicking on a Pulse in the Pulse panel and selecting ‘Modify Pulse Properties’.
You can also auto-configure a Pulse instance by right-clicking on it in the Monitor and selecting ‘Auto Configure
Pulse’. This will automatically make this Pulse the Primary Pulse, and set its connection settings.
General
These are some general Pulse settings:
• This Pulse Is The Primary: If enabled, this is the Primary Pulse that the Slaves will connect to. If there is no
Primary, the Slaves will not be able to connect to Pulse.
• Override Port: If enabled, this port will be used by Pulse instead of a random port.
• Host Name/IP Address Override: Overrides the Host name/IP address used by the Slaves to connect to Pulse,
and for remote commands.
• MAC Address Override: This is used to override the MAC Address associated with this Pulse. This is useful
in the event that the pulse defaults to a different MAC Address than the one needed for Wake On Lan.
• Region: The region for Pulse. Used for path mapping when executing commands with the Web Service.
When the Slaves connect to Pulse, they will use Pulse’s host name, unless the option to use Pulse’s IP address is
enabled in the Pulse Settings in the Repository Options. Use the Host Name/IP Address Override setting above to
override what the Slaves use to connect to Pulse.
5.4.4 Pulse History
You can view a Pulse’s history by right-clicking on it in the Pulse panel and selecting the View Pulse History option.
5.4.5 Remote Control
You can view the live log for Pulse or control it remotely from the right-click menu. See the Remote Control documentation for more information.
5.4.6 Pulse Redundancy
You can run multiple instances of Pulse on separate machines as backups in case your Primary Pulse instance goes
down. If the Primary Pulse goes offline or becomes stalled, Deadline’s Repository Repair operation can elect another
running instance of Pulse as the Primary, and the Slaves will automatically connect to the new Primary instance.
To enable Pulse Redundancy, you must enable the Automatic Primary Pulse Election option in the Repository Repair
settings in the Repository Options.
Note that when multiple Pulse instances are running, only the Primary Pulse is used by the Slaves for Throttling.
In addition, only the Primary Pulse is used to perform Housecleaning, Power Management, and Statistics Gathering.
However, you can connect to any Pulse instance to use the Web Service.
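A sketch of how a primary election like the one Repository Repair performs might work; the selection criteria below (keep a healthy current Primary, otherwise promote the first healthy instance) are assumptions for illustration:

```python
def elect_primary(pulses):
    """Keep the current Primary if it is still running; otherwise
    promote the first Pulse that is running and not stalled. The
    real election logic inside Repository Repair may differ."""
    healthy = [p for p in pulses if p["state"] == "Running"]
    current = [p for p in healthy if p["primary"]]
    if current:
        return current[0]["name"]
    return healthy[0]["name"] if healthy else None

pulses = [
    {"name": "pulse01", "state": "Stalled", "primary": True},
    {"name": "pulse02", "state": "Running", "primary": False},
]
print(elect_primary(pulses))  # pulse02: the old Primary is stalled
```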
5.4.7 Advanced Features
Many advanced features are built into Pulse. These features are described below.
Auto Configuration
This allows you to set the repository path in a single location. When a Slave starts up, it will automatically pull the
repository path from Pulse and apply the corresponding settings before fully initializing. See the Auto Configuration
documentation for more information.
Slave Throttling
Pulse supports a throttling feature, which is helpful if you’re submitting large files with your jobs. This is used to limit
the number of Slaves that copy over the job and plugin files at the same time. See the Network Performance Guide
documentation for more information.
Power Management
Power management is a system for controlling how machines startup and shutdown automatically based on sets of
conditions on the render farm, including job load and temperature. Power management is built into Pulse, so Pulse must
be running to use this feature. The only exception to this rule is Temperature checking. See the Power Management
documentation for more information.
Statistics Gathering
While Pulse isn’t required to gather job statistics, it is required to gather the Slave and Repository statistics. See the
Farm Statistics documentation for more information.
Web Service
While Deadline has a standalone Web Service application, Pulse also has a web service feature built in. The web
service can be used to get information over an Internet connection. It is used by the Mobile application, and can also
be used to display information in a web page. See the Web Service documentation for more information.
5.5 Balancer Configuration
5.5.1 Overview
Balancer has three sets of options that can be configured:
• Global Balancer settings in the Repository Options.
• Cloud Provider Balancer settings in the Cloud Provider Configuration dialog.
• Per-Balancer settings that can be configured from the right-click menu in the Balancer panel.
Note that the availability of these options can vary depending on the context in which they are used, as well as the
User Group Permissions that are defined for the current user.
If the Balancer panel is not visible, see the Panel Features documentation for instructions on how to create new panels
in the Monitor.
5.5.2 Global Balancer Settings
As mentioned above, there are the global Balancer settings in the Repository Options, which are applied to every
running instance of Balancer.
5.5.3 Cloud Provider Configuration
Before the Balancer can do anything, you’ll need to set up a Cloud Region. Balancer settings for each Cloud Provider
can be configured in the Cloud Provider Configuration dialog. Deadline supports a number of cloud providers by
default. Custom cloud plugins can be written to support different providers. Here’s a list of all the supported Cloud
Plugins.
When adding a new Cloud Region you’ll have to enter all of your credentials and settings for that particular provider.
You can look at the documentation for each plugin for further details about all the settings and credentials. Enabling
the region will show instances in the Cloud Panel. Your credentials need to be verified before you’re able to enable
the region to work with the Balancer.
Basic Configuration
The basic configuration options are:
• Enabled: Enabling the region makes it usable by the Balancer.
• Region Preference: Weighting towards the region.
• Region Budget: Total Budget for a region. Governs how many instances will be started for this region.
Asset Checking
Asset Checking can be used to sync assets between the repository and the slaves. The Asset Checking options are:
• Enable Asset Checking: Enables asset crawler for jobs with assets.
• Asset Crawler Hostname: Hostname for the Asset Crawler.
• Asset Crawler Port: Port number for the Asset Crawler.
• Asset Crawler OS: Operating system of the Asset Crawler.
The asset script itself can be found in the vmx folder in the Repository, and is called AssetCrawler_Server.py.
Balancer Plugins
The Balancer uses an algorithm that’s defined in a Balancer Plugin. That can be set in the Balancer Settings section in
Repository Configuration. We’ve included a default algorithm that should be fine for most use cases but you can write
your own for your specific needs.
Group Mappings
Group Mappings are the heart of the Balancer. They tell the Balancer what kinds of instances to start for each group.
A Group Mapping mainly consists of a group, an image, a hardware type, and a budget. The image and hardware
type come from the provider. The Cost is how much of the region’s budget will be consumed by each instance.
You can also add Pools to a mapping so that instances will be started in those pools.
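The budget arithmetic is straightforward: the number of instances a Group Mapping can start is bounded by the region's budget divided by the per-instance Cost. A minimal sketch:

```python
def max_instances(region_budget, cost_per_instance):
    """Upper bound on instances a Group Mapping can start before
    the region's budget is exhausted."""
    return int(region_budget // cost_per_instance)

# A region budget of 100 with a per-instance Cost of 8 allows 12 instances:
print(max_instances(100, 8))  # 12
```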
5.5.4 Balancer States
These are the states that a Balancer can be in. They are color coded to make it clear which state the Balancer is in.
• Offline (gray): The Balancer application is closed.
• Running (white): The Balancer application is running.
• Stalled (red): Balancer becomes stalled if it hasn’t updated its state for a certain amount of time. This could be
because the machine crashed, or Balancer simply didn’t shut down cleanly.
If you see a red Balancer, it means the Balancer has been marked as stalled. This happens if the Balancer hasn’t
updated its state for a certain amount of time. You can configure the Stalled Balancer Threshold in the General
Balancer settings in the Repository Options. When a Balancer is marked as stalled, it usually means that the machine
crashed, or that Balancer simply didn’t shut down cleanly. In the latter case, you can simply mark Balancer as offline
from the right-click menu.
The Balancer panel’s right-click menu also gives the option to delete Balancers.
5.5.5 Balancer Settings
There are settings that can be specified for individual Balancer instances, which can be modified by right-clicking on
a Balancer in the Balancer panel and selecting ‘Modify Balancer Properties’.
You can also auto-configure a Balancer instance by right-clicking on it in the Monitor and selecting ‘Auto Configure
Balancer’. This will automatically make this Balancer the Primary Balancer.
General
These are some general Balancer settings:
• This Balancer Is The Primary: If enabled, this is the Primary Balancer.
• Host Name/IP Address Override: Overrides the Host name/IP address for remote commands.
• MAC Address Override: This is used to override the MAC Address associated with this Balancer. This is
useful in the event that the balancer defaults to a different MAC Address than the one needed for Wake On Lan.
• Region: The region for Balancer.
5.5.6 Balancer History
You can view a Balancer’s history by right-clicking on it in the Balancer panel and selecting the View Balancer History
option.
5.5.7 Remote Control
You can view the live log for Balancer or control it remotely from the right-click menu. See the Remote Control
documentation for more information.
5.5.8 Balancer Redundancy
You can run multiple instances of Balancer on separate machines as backups in case your Primary Balancer instance
goes down. If the Primary Balancer goes offline or becomes stalled, Deadline’s Repository Repair operation can elect
another running instance of Balancer as the Primary.
To enable Balancer Redundancy, you must enable the Automatic Primary Balancer Election option in the Repository
Repair settings in the Repository Options.
Note that when multiple Balancer instances are running, only the Primary Balancer starts and stops virtual
instances.
5.6 Job Scheduling
5.6.1 How a Job is Selected by a Slave
By default, a job is selected by a Slave based on the following properties, in this order:
1. The Pools and Groups that the Job has been submitted to.
• A Slave will only select a Job if it has been assigned to the Pool and Group to which the Job belongs.
• Pools are priority-based, so a Slave will favour Jobs in Pools that are higher on its priority list. This
ordering can be configured on a per-Slave basis through the Manage Pools utility.
• Groups are not priority-based, and are typically just used to ensure that Jobs render on machines with
appropriate hardware and software configurations.
2. The Job’s Priority:
• By default, a Job has a numeric Priority ranging from 0 to 100, where 0 is the lowest priority and 100 is
the highest priority. You can adjust the maximum Job priority in the Job Settings section of the Repository
Configuration.
• Everything else being equal, the highest Priority Job will always be chosen first when a Slave is selecting
its next Job.
3. The Date and Time at which the Job was submitted:
• This is set automatically and is the timestamp of when the Job was submitted to Deadline.
• Everything else being equal, an older Job will take priority over a newer Job when a Slave is looking for a
new one.
4. The Job’s Limits and Machine Limits
• With Limits, if a Job has the highest priority, but requires a Limit that is maxed out, a Slave will try to
select a different Job.
• A Machine Limit is a special type of Limit that restricts the number of machines that can render that
particular Job at the same time.
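The default selection order above can be sketched as a filter plus a sort key. This is an illustrative model only; the dictionary fields and function names are hypothetical, not the actual Deadline API:

```python
# Sketch of the default comparison a Slave performs when choosing its
# next Job. All field names here are illustrative, not Deadline's API.

def job_sort_key(job, slave_pool_order):
    """Lower tuples sort first: better pool rank, then higher numeric
    priority, then older submission time."""
    pool_rank = slave_pool_order.index(job["pool"])  # 0 = highest-priority pool
    return (pool_rank, -job["priority"], job["submit_time"])

def pick_next_job(jobs, slave):
    candidates = [
        j for j in jobs
        if j["pool"] in slave["pools"]        # Slave must be in the Job's Pool
        and j["group"] in slave["groups"]     # ...and its Group
        and not j["limit_maxed_out"]          # all required Limits need free stubs
    ]
    if not candidates:
        return None
    return min(candidates, key=lambda j: job_sort_key(j, slave["pools"]))
```

Note how the pool rank comes before priority in the key, which is why a low-priority job in a favoured pool beats a high-priority job in a less-favoured one under the default order.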
5.6.2 Changing the Scheduling Order
It is possible to change the order in which Jobs are scheduled in the Job Settings section of the Repository Configuration.
Chapter 5. Administrative Features
The following options are available:
• First-in First-Out: Job order will be based solely on submission date, and will be rendered in the order they
are submitted.
• Pool, First-in First-Out: Job order will be based on the job’s pool first, with submission date being the tie-breaker.
• Pool, Priority, First-in First-Out: This is the default scheduling order that is used. Job order will be based on
the job’s pool, then priority, with submission date being the tie-breaker.
• Priority, First-in First-Out: Job order will be based on the job’s priority first, with submission date being the
tie-breaker.
• Priority, Pool, First-in First-Out: Job order will be based on the job’s priority, then pool, with submission date
being the tie-breaker.
• Balanced: Job order will be balanced so that each job has the same number of slaves rendering them at a time.
• Pool, Balanced: Job order will be based on the job’s pool first, with a balance being applied to jobs that are in
the same pool.
• Pool, Priority, Balanced: Job order will be based on the job’s pool, then priority, with a balance being applied
to jobs that have the same pool and priority.
• Priority, Balanced: Job order will be based on the job’s priority first, with a balance being applied to jobs that
have the same priority.
• Priority, Pool, Balanced: Job order will be based on the job’s priority, then pool, with a balance being applied
to jobs that have the same pool and priority.
• Weighted, First-in First-out: A weighted system that takes priority, submission time, number of rendering
tasks, and number of job errors into account, but does not take pools into account. If two or more jobs have the
same calculated weight, the submission date will act as the tie-breaker.
• Pool, Weighted, First-in First-out: A weighted system that still respects pool priority. If two or more jobs have
the same calculated weight, the submission date will act as the tie-breaker.
• Weighted, Balanced: A weighted system that takes priority, submission time, number of rendering tasks, and
number of job errors into account, but does not take pools into account. A balance will be applied to jobs that
have the same calculated weight.
• Pool, Weighted, Balanced: A weighted system that still respects pool priority. A balance will be applied to
jobs that have the same calculated weight.
Note that the Secondary Pool feature was designed for job scheduling orders that have Pool listed first, and might not
work as expected otherwise. For example, if Priority is listed first, a job with lower priority that’s found during the
initial Primary Pool scan will be preferred over a job with higher priority that’s found during the Secondary Pool scan.
This is because the Secondary Pool scan is only performed if no jobs are found during the initial Primary Pool scan.
See the Pools and Groups documentation for more information.
Balanced Scheduling
For the balanced options, you can have slaves give the job they are currently working on more priority using the
Rendering Task Buffer. This can help prevent slaves from jumping between jobs. For example, if this is set to 3, a
slave will only drop its current job for another one if the other job has more than 3 fewer rendering tasks than the
current job.
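A minimal sketch of that buffer comparison, with illustrative names rather than Deadline's actual implementation:

```python
def should_switch_jobs(current_tasks, other_tasks, buffer=3):
    """Under balanced scheduling, a slave drops its current job only if
    the other job has more than `buffer` fewer rendering tasks than the
    job it is already on. Illustrative logic, not Deadline's code."""
    return other_tasks < current_tasks - buffer
```

With a buffer of 3, a slave on a job with 10 rendering tasks would switch to a job with 6 rendering tasks, but not to one with 7.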
There is also an experimental option to enhance the balancing logic. When this option is enabled, the slaves will use
the database to get a more accurate snapshot of all the rendering jobs in the farm, and use this information to make
better decisions about which job they should be rendering. Testing has shown that when this option is enabled, a
proper distribution of Slaves among jobs is much more consistent, and Slaves no longer jump between jobs of the
same priority. The result is more predictable behavior, and less wasted time due to the overhead of switching between
jobs that are expensive to start up.
Weighted Scheduling
For the weighted options, you can control how much weight is applied to the job priority, submission time, number of
rendering tasks, and number of errors. You can also give weight to the job that the slave is currently working on using
the Rendering Task Buffer. The buffer is subtracted from the rendering task count for the current job, which pushes it
higher in the queue.
Deadline then sorts by this weight so that jobs with the largest weight value have the higher priority. Note that the
weight values can be negative. For example, if you set a negative weight value to the number of job errors, a job with
more errors will end up having a lower overall weight so that precedence is given to other jobs in the queue.
Here is how the weight is calculated:
weight = (job.Priority * PW) +
         (job.Errors * EW) +
         ((NOW - job.SubmissionTimeSeconds) * SW) +
         ((job.RenderingTasks - RB) * RW)
Where:
• PW = priority weight
• EW = error weight
• SW = submission time weight
• RW = rendering task weight
• RB = rendering task buffer
• NOW = the current repository time
Note that because the job submission time is measured in seconds, it will have the greatest impact on the overall
weight. Reducing the SW value can help reduce the submission time’s impact on the weight value.
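The formula can be written out directly. This is a sketch with hypothetical job field names, not the actual Deadline API:

```python
import time

def job_weight(job, pw, ew, sw, rw, rb, now=None):
    """Weighted-scheduling formula from the manual:
    weight = priority*PW + errors*EW + age_seconds*SW + (tasks - RB)*RW
    Weight values may be negative; a larger weight means higher priority.
    The `job` dict field names here are illustrative."""
    now = time.time() if now is None else now
    return (job["priority"] * pw
            + job["errors"] * ew
            + (now - job["submit_time_seconds"]) * sw
            + (job["rendering_tasks"] - rb) * rw)
```

For example, a negative error weight pushes error-prone jobs down the queue, and keeping the submission time weight small stops the age-in-seconds term from dominating the total.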
The experimental option to enhance the balancing logic, described under Balanced Scheduling above, applies to the
weighted options as well.
5.7 Pools and Groups
5.7.1 What are Pools and Groups?
Groups can be used to organize your farm based on machine configurations (e.g., specs, installed software, etc). For
example, if you have several 64-bit machines with 3ds Max installed, you could assign them to groups like 3dsmax, or
3dsmax_64bit, or simply 3D. Groups have no impact on the order in which Jobs are rendered; they simply help ensure
that Jobs render on machines with an appropriate hardware/software setup. If you don’t care about grouping
your machines, you can simply use the default ‘none’ Group.
Pools are similar to Groups, except that they do affect the order in which Jobs are rendered. Because of this, it is
encouraged to use Pools for prioritizing different shows, shots, types of Jobs, etc. If you don’t want to set up Pools on
your farm, you can simply use the default ‘none’ Pool. Note that the ‘none’ Pool always has the lowest priority of all
the Pools.
Jobs can be added to an optional Secondary Pool. When searching for a Job, a Slave does a first pass using the Primary
Pool of the available Jobs. If the Slave doesn’t find any Jobs using the Primary Pool, it then makes a second pass using
the Secondary Pool. This system can allow a Job to spread to a Secondary Pool as necessary, and it can also ease the
configuration of Pools in the farm if there are lots of Pools and Slaves. An example of this is shown below.
Note that the Secondary Pool feature was designed for Job Scheduling Orders that have Pool listed first, and might not
work as expected otherwise. For example, if Priority is listed first, a job with lower priority that’s found during the
initial Primary Pool scan will be preferred over a job with higher priority that’s found during the Secondary Pool scan.
This is because the Secondary Pool scan is only performed if no jobs are found during the initial Primary Pool scan.
5.7.2 Managing Pools and Groups
Pools and Groups can be managed from the Monitor while in Super User mode (or as a User with the proper User
Group privileges). Just select ‘Manage Pools’ (or ‘Manage Groups’) from the ‘Tools’ menu, or from the Slave panel’s
right-click menu.
The dialogs are very similar to each other, but the nuances between the two are described below in detail. Note that if
you used the Slave panel’s right-click menu to open these dialogs, they will be pre-filtered to just show the slaves that
you right-clicked on. They will also show the same columns that are currently being shown in the slave list.
Group Management Dialog
From here, you can manage individual Groups, and assign them to various Slaves. It is a bit simpler than the Pool
Management Dialog, which will be covered below in more detail, since it does not have to worry about the order of
Groups for a given Slave.
The functions you can perform here are as follows:
• Groups: This section displays existing Groups and allows you to manipulate them, or create new ones. Your
selection here will determine which Groups will be affected by the Group Operations.
– New: This will create a new Group in the Repository, and allow you to assign the Group to different Slaves.
You will be prompted for a name for the new Group. Group names cannot be changed once the Group has
been created. Adding a Group with the name of a previously Deleted Group will effectively ‘re-instate’ that
Group if it hasn’t been Purged yet (see below).
– Delete: This will Delete all of the selected Groups from the Repository, and enable the option to Purge
them (described below).
– Purge Obsolete Groups on Close: This will purge any obsolete (deleted) Groups from existing Jobs and
remove them from any Slaves that are currently assigned to them. They will be replaced with the Group selected in
the drop down. Note that if you choose not to Purge the obsolete Groups right now, you can always return
to this dialog and do it later.
• Slaves: This section displays a list of all known Slaves that have connected to your Repository. Your selection
here will determine which Slaves will be affected by the Groups Operations.
– Only Show Slaves Assigned to a Selected Group: This option will filter the displayed Slaves to only
include the ones that are currently assigned to at least one of the selected Groups.
• Group Operations: These operations are used to manipulate which Groups are assigned to which Slaves. They
typically require a selection of one or more Groups and one or more Slaves to be active.
– Add: This will add all of the selected Groups to all of the selected Slaves, if they aren’t already assigned.
– Remove: This will remove all of the selected Groups from all of the selected Slaves, if applicable.
– Copy: This will copy the groups from the selected slave to the clipboard.
– Paste: This will paste the groups that were copied using the Copy button to the selected slaves.
– Clear: This will clear all the groups from all of the selected Slaves. This option does not require a Group
to be selected.
Pool Management Dialog
The Pool Management dialog functions similarly to the Group Management dialog described above, but with a few
added options to deal with managing Pool Ordering for individual Slaves.
The functions you can perform here are as follows. Note that many of these overlap with the Group Management functionality described in the previous section.
• Pools: This section displays existing Pools and allows you to manipulate them, or create new ones. Your
selection here will determine which Pools will be affected by the Pool Operations described below.
– New: This will create a new Pool in the Repository, and allow you to assign it to Slaves. You will be
prompted for a name for the new Pool; note that Pool names cannot be changed once the Pool has been
created. Adding a Pool with the name of a previously Deleted Pool will effectively ‘re-instate’ that Pool if it
hasn’t been Purged yet (see below).
– Delete: This will Delete all of the selected Pools from the Repository, and enable the option to Purge them
(described below).
– Purge Obsolete Pools on Close: This will purge any obsolete (deleted) Pools from existing Jobs and
remove them from any Slaves that may have them in their list. They will be replaced with the Pool
selected in the drop down. Note that if you choose not to Purge the obsolete Pools right now, you can
always return to this dialog and do it later.
– Priority Distribution: This graph visualizes how many Slaves have one of the selected Pools as #1 priority, #2 priority, etc. It also displays how many Slaves are not currently assigned to the selected Pools.
• Slaves: This section displays a list of all known Slaves that have connected to your Repository. Your selection
here will determine which Slaves will be affected by the Pool Operations described below.
– Only Show Slaves Assigned to a Selected Pool: This option will filter the displayed Slaves to only include
the ones that are currently assigned to at least one of the selected Pools.
• Pool Operations: These operations are used to manipulate which Pools are assigned to which Slaves. They
typically require a selection of one or more Pools and one or more Slaves to be active.
– Add: This will add all of the selected Pools to all of the selected Slaves, if they aren’t already assigned.
– Remove: This will remove all of the selected Pools from all of the selected Slaves, if applicable.
– Promote: This will bump up the selected Pools by one position in all of the selected Slaves’ Pool lists.
Any selected Slaves that are not assigned to the selected Pool(s) are unaffected.
– Demote: This will bump down the selected Pools by one position in all of the selected Slaves’ Pool lists.
Any selected Slaves that are not assigned to the selected Pool(s) are unaffected. Note that a Pool cannot
be demoted to be lower than the ‘none’ pool – the ‘none’ Pool is always assigned the lowest priority by
Slaves.
– Copy: This will copy the pools from the selected slave to the clipboard.
– Paste: This will paste the pools that were copied using the Copy button to the selected slaves.
– Clear: This will clear all the Pools from all of the selected Slaves. This option does not require a Pool to
be selected.
Preventing Slaves from Rendering Jobs in the ‘none’ Pool or Group
In some cases, it may be useful to prevent one or more Slaves from rendering Jobs that are assigned to the ‘none’ Pool
or Group. For example, you may have a single machine that you want to only render Quicktime Jobs. Normally, you
could add this machine to a ‘quicktime’ Group, but if there are no Quicktime Jobs, the Slave could move on to Jobs
that are in the ‘none’ Group. If you want this machine to only be available for Quicktime Jobs, you can configure it to
exclude Jobs in the ‘none’ Group.
The option to exclude Jobs in the ‘none’ Pool or Group can be found in the Slave Settings in the Monitor.
5.7.3 Pools and Job Scheduling
How pools affect the Job selection process is best explained through an example. Note that this example relies on a
Scheduling Order where Pools are the primary determining factor of scheduling (such as the default Pool -> Priority
-> Submit Date scheme).
Say we need to render Jobs for two different shows, and we’ve already created corresponding pools for each show in
Deadline:
• show_a
• show_b
Now say we have 10 machines in our render farm, and we want to give each show top priority on half of them. To do
this, we’d just assign the pools to our Slaves like this:
• Slaves 1-5:
1. show_a
• Slaves 6-10:
1. show_b
With this setup, if Jobs from both shows are in the queue, then Slaves 1-5 will pick up the Jobs from show_a, while
Slaves 6-10 will work on Jobs from show_b. This effectively splits our farm in half, as we desired, but with this
configuration Slaves 1-5 would sit idle once show_a finishes production, even if there are plenty of show_b Jobs in the
queue. The reverse would also be true if show_b production slows down while show_a is still ramping up.
To keep machines busy and maximize our resources, we’ll assign the Pools to our Slaves as follows:
• Slaves 1-5:
1. show_a
2. show_b
• Slaves 6-10:
1. show_b
2. show_a
Now, Slaves 1-5 will still give top priority to show_a Jobs, and Slaves 6-10 will similarly give top priority to show_b
Jobs. However, if there are no show_a Jobs currently in the queue, Slaves 1-5 will start working on show_b Jobs
until another show_a Job comes along. Similarly, Slaves 6-10 would start working on show_a if no show_b Jobs were
available.
This concept also extends to any number of shows/pools; you just have to make sure to have an even Priority
Distribution across your farm (the Priority Distribution graph should help with that). Here’s an example of what the
Priority Distribution for a 3-show farm with 15 Slaves could look like:
• Slaves 1-5:
1. show_a
2. show_b
3. show_c
• Slaves 6-10:
1. show_b
2. show_c
3. show_a
• Slaves 11-15:
1. show_c
2. show_a
3. show_b
5.7.4 Secondary Pools and Job Scheduling
How secondary pools affect the Job selection process is best explained through an example. Note that this example
relies on a Scheduling Order where Pools are the primary determining factor of scheduling (such as the default Pool
-> Priority -> First-in First-out option). The Secondary Pool feature was designed for job scheduling orders that have
Pool listed first, and might not work as expected otherwise.
Let’s say you have 5 pools and 10 slaves. You want each pool to have top priority on 2 machines, but then be able to
spread to the rest of them if they are idle. Without using the secondary pool system, you might have something like
this:
• Slaves 0-1: pool_1, pool_2, pool_3, pool_4, pool_5
• Slaves 2-3: pool_2, pool_3, pool_4, pool_5, pool_1
• Slaves 4-5: pool_3, pool_4, pool_5, pool_1, pool_2
• Slaves 6-7: pool_4, pool_5, pool_1, pool_2, pool_3
• Slaves 8-9: pool_5, pool_1, pool_2, pool_3, pool_4
This can be tricky to maintain if you have to reorganize pools or new slaves are added to the farm. The new secondary
pool system can make this easier:
• Slaves 0-1: pool_1, pool_all
• Slaves 2-3: pool_2, pool_all
• Slaves 4-5: pool_3, pool_all
• Slaves 6-7: pool_4, pool_all
• Slaves 8-9: pool_5, pool_all
In this case, all jobs could have pool_all as their secondary pool, and will spread to the rest of the farm if machines
become available.
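The two-pass scan described above can be sketched as follows (illustrative field names, not the Deadline API):

```python
def find_job(jobs, slave_pools):
    """Two-pass scan: Primary Pools are checked first; Secondary Pools
    are only consulted if the primary pass finds nothing. Within each
    pass the normal scheduling order would apply; this sketch just
    returns the first match. Field names are illustrative."""
    primary = [j for j in jobs if j["pool"] in slave_pools]
    if primary:
        return primary[0]
    secondary = [j for j in jobs if j.get("secondary_pool") in slave_pools]
    return secondary[0] if secondary else None
```

This also illustrates the caveat noted earlier: because the secondary pass only runs when the primary pass is empty, a lower-priority job found via a Primary Pool will always beat a higher-priority job reachable only via a Secondary Pool.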
5.8 Limits and Machine Limits
5.8.1 Overview
Many rendering applications use floating licenses to limit the number of clients that can render at any one time.
Deadline supports this restriction through Limits. When creating a Limit, be sure to set
the limit to the number of network licenses you have for the product.
For example, if you have 20 nodes in your render farm and only 10 licenses of Nuke, you can create a Nuke Limit
with a limit of 10. During rendering Deadline will ensure that no more than 10 machines are rendering Jobs associated
with this Nuke Limit at any given time. Because of this, you never have to worry about licensing issues.
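Conceptually, a Limit behaves like a counting semaphore sized to your license count. A toy model of the Nuke example, not Deadline's actual implementation:

```python
import threading

class Limit:
    """Toy model of a Deadline Limit: at most `count` stubs can be
    held at once. Not the real implementation or API."""
    def __init__(self, name, count):
        self.name = name
        self._sem = threading.BoundedSemaphore(count)

    def acquire_stub(self):
        # A failed non-blocking acquire means the Limit is maxed out,
        # so the Slave must look for a different Job.
        return self._sem.acquire(blocking=False)

    def release_stub(self):
        # Called when the Slave finishes (or releases at task progress).
        self._sem.release()
```

With `Limit("nuke", 10)`, an eleventh Slave attempting to pick up a Nuke Job would fail to acquire a stub and move on to other work.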
Machine Limits function similarly, but are on a per-Job basis. Instead of limiting how many Slaves can render a group
of Jobs, they limit the number of Slaves that can render one particular Job. This is useful if you want to prevent a job
from potentially taking up the entire farm.
5.8.2 Job Machine Limits
Machine Limits are a per-Job option, and can be managed through the Job’s Properties window, which you can get to
by right-clicking on the Job and selecting ‘Modify Job Properties’. More information on the available Machine Limit
settings can be found in the Controlling Jobs documentation.
5.8.3 Limits
Limits can be managed from the Limit list in the Monitor while in Super User mode (or as a user with appropriate
User Group privileges). This list shows all the Limits that are in your Repository. It also displays useful information
about each Limit, such as its name, its limit, and the number of Limit stubs that are currently in use. You can access
many options for the Limits (listed below) by right-clicking on them, and you can create a new Limit by clicking on
the [+] button in the Limit list’s toolbar.
If the Limits panel is not visible, see the Panel Features documentation for instructions on how to create new panels
in the Monitor.
New Limit
Use this option to add a new Limit to your Repository.
You can modify the following settings for the new Limit:
Name
The name of the new Limit. Note that this setting cannot be changed once the Limit has been created.
Usage Level
The level at which a Limit Stub will be checked out. ‘Slave’ is the default, and will require each Slave
to acquire a Stub; if ‘Machine’ is selected, only a single Stub will be required for all Slaves on the same
machine. Conversely, if ‘Task’ is selected, Slaves will try to acquire one Stub per concurrent Render
Thread. Note that this setting cannot be changed after Limit creation.
Limit
The maximum number of simultaneous uses that this Limit can support at any given time. What counts
as a ‘use’ is based on the Usage Level selected above (either the Machine, Slave, or Task level).
Release at Task Progress
If enabled, Slaves will release their Limit stub when the current Task reaches the specified percentage.
Note that not all Plugins report Task progress.
Whitelisted/Blacklisted Slaves
If Slaves (or Machines, depending on Level selected above) are on a Blacklist, they will never try to render
Jobs associated with this Limit. If Slaves/Machines are on a Whitelist, then they are the only ones that
will try to render Jobs associated with this Limit. Note that an empty blacklist and an empty whitelist are
functionally equivalent, and have no impact on which machines the job renders on.
Slaves Excluded From Limit
These Slaves (or Machines, depending on the Level selected above) will ignore this Limit and won’t contribute to the Limit’s stub count. This is useful if you are juggling a mix of floating and node-locked
licenses, in which case your machines with node-locked licenses should be placed on this list.
Clone Limit
This option allows you to create a new Limit while using an existing Limit as a template. It will bring up a dialog
very similar to the one pictured in Create Limit, with all the same options. This option is handy if you want to create
a Limit that is very similar to an existing one, but with a small variation.
Modify Limit Properties
This option allows you to edit the settings for an existing Limit. All of the settings described in the New Limit section
above can be changed except for the Limit’s Name and Usage Level, which cannot be changed once the Limit has
been created.
Reset Limit Usage Count
Sometimes a Limit stub will get orphaned, meaning that it is counting against the Limit’s usage count, but no machine
is actually using it. Deadline will eventually clean up these orphaned Limit stubs on its own. This option provides
the means to delete all existing stubs immediately (whether they are orphaned or not), which will require Slaves to
acquire them again.
Delete Limit
Removes an existing Limit from your Repository. Any Jobs associated with deleted Limits will still be able to render,
but they will print out Warnings indicating that the Limit no longer exists.
5.8.4 Limits and Job Scheduling
Although Limits and Job Machine Limits aren’t priority-based like Pools, they do have an impact on job scheduling.
Here are some examples.
Limits
• If a job is assigned to a Limit, and that Limit is currently maxed out, the job will not be picked up by any
additional slaves.
• If a job is assigned to a Limit, and that Limit has a whitelist, the job will only render on the slaves in that
whitelist.
• If a job is assigned to two Limits, and one of those Limits is currently maxed out, the job will not be picked up
by any additional slaves. This is because a slave must be able to acquire all Limits that the job requires.
• If a job is assigned to two Limits, and one of those Limits has slave_1 on its blacklist, slave_1 will never pick
up the job. This is because a slave must be able to acquire all Limits that the job requires.
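The acquire-all-or-none behaviour behind these last two points can be sketched like this (a toy model, not Deadline's implementation):

```python
class Stub:
    """Minimal stand-in for a Limit: `free` stubs remaining."""
    def __init__(self, free):
        self.free = free
    def acquire(self):
        if self.free > 0:
            self.free -= 1
            return True
        return False
    def release(self):
        self.free += 1

def try_acquire_all(limits):
    """A slave must hold a stub from every Limit a job requires; if any
    acquisition fails, the stubs it already took are released and the
    slave moves on to another job."""
    acquired = []
    for limit in limits:
        if limit.acquire():
            acquired.append(limit)
        else:
            for held in acquired:
                held.release()
            return False
    return True
```

This is why one maxed-out Limit (or one blacklist entry) is enough to keep a slave off a job that requires several Limits.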
Job Machine Limits
• If a job has a Machine Limit greater than 0, and that Limit is currently maxed out, the job will not be picked up
by any additional slaves.
• If a job has a whitelist, the job will only render on the slaves in that whitelist.
5.9 Job Failure Detection
5.9.1 Overview
Job Failure Detection can be used to prevent problematic Jobs from wasting precious render time on the farm. There are
two types of Failure Detection, which are both explained below. By default, Jobs will fail after they have accumulated
100 errors, but this can be changed in the Job Settings section of the Repository Configuration.
5.9.2 Job Failure Detection
A Job will enter the Failed state when it has accumulated the maximum permitted number of errors. Once in the Failed
state, the Job will no longer be picked up by Slaves for rendering without manual intervention. Because of this, Job
Failure Detection can help ensure that problematic Jobs are flagged appropriately and won’t waste precious rendering
time. In the Repository Options, you can set up failure thresholds for Jobs and for individual Tasks.
If you’ve resolved the problems that were preventing the Job from rendering properly, you can right-click on it in the
Monitor and select ‘Resume Failed Job’. You will then be prompted with the option to ignore or override Failure
Detection for this Job going forward. Note that an Error Limit of 0 indicates that there is no limit, and the Job will
never be marked as Failed by Failure Detection.
If you choose not to ignore Failure Detection, make sure to clear the Job’s errors, or a single new error will result in
the Job failing again, because its error count is still over the limit. To clear a Job’s errors, simply delete all of the
Job’s Error Reports using the Job Reports Panel.
5.9.3 Slave Failure Detection
Slave Failure Detection works a little differently than Job Failure Detection. If a particular Slave reports enough
consecutive errors for a given Job, it will add itself to the Job’s list of Bad Slaves. When a Slave has been marked as
bad for a particular Job, it will not try to render that Job again until it has no other Jobs available. This helps ensure
that Slaves aren’t wasting render time on Jobs that they likely aren’t able to render.
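The consecutive-error bookkeeping can be sketched as follows. The threshold constant and field names are hypothetical illustrations; the real threshold is configured in the Repository Options:

```python
CONSECUTIVE_ERROR_THRESHOLD = 5  # hypothetical value, set in Repository Options

def update_bad_slaves(job, slave, task_failed):
    """Sketch of Slave Failure Detection: `slave` lands on the job's
    bad list after N consecutive errors on that job. A successful task
    resets the streak. Field names are illustrative."""
    if task_failed:
        job["consecutive_errors"][slave] = job["consecutive_errors"].get(slave, 0) + 1
        if job["consecutive_errors"][slave] >= CONSECUTIVE_ERROR_THRESHOLD:
            job["bad_slaves"].add(slave)
    else:
        job["consecutive_errors"][slave] = 0  # success resets the streak
```

Because the count is per-slave and per-job, one misconfigured machine can be sidelined from a job without affecting the rest of the farm.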
If the issue preventing a Slave from rendering a particular Job properly has been resolved, you can remove it from a
Job’s ‘bad’ list by navigating to the ‘Failure Detection’ section of a Job’s Properties dialog. There is also an option in
this section to have your Job completely ignore Slave Failure Detection, if you wish.
5.10 Notifications
5.10.1 Overview
Deadline can be configured to notify Users when their Jobs finish, or if they have failed. In addition, Deadline can
be configured to send notifications to administrators when certain events occur on the farm (e.g., when a Slave has
stalled, or if a Slave is being shut down by Power Management).
5.10.2 Email Notifications
Before Deadline can send email notifications, you need to configure the Email Notification settings in the Repository
Configuration.
5.10.3 Popup Message Notifications
The popup message notification system can be used to send job notifications to users by popping up a message window
on their workstations.
In order to receive popup message notifications, the user needs to have the Launcher running on their workstation, and
have their workstation machine name specified in their User Settings (see below).
5.10.4 Job Notifications
Users can edit their User Settings to control whether or not they receive notifications for their own Jobs.
In order to receive email notifications, the user needs to set their Email Address setting and enable the Email Notification option. Note that email notifications will only be sent if the SMTP settings in the Repository Options are set
properly, as mentioned in the previous section.
In order to receive popup message notifications, the user needs to have the Launcher running on their workstation, and
have their workstation machine name specified in their User Settings.
5.11 Remote Control
5.11.1 Overview
Remote control features are built into the Monitor to make farm administration easier. These features allow you to
connect to and control the Slave application on your render nodes, and also run remote commands on them. They also
allow you to control Pulse and the Balancer as well (if you’re running them on your farm).
If the Slaves, Pulse, or Balancer panels are not visible, see the Panel Features documentation for instructions on how to
create new panels in the Monitor.
5.11.2 Connecting to the Application Logs
You can remotely connect to the Slave, Pulse, or Balancer log from the Monitor.
Connecting to the Slave Log
You can remotely connect to a Slave by double-clicking on it in the Slave panel, or by right-clicking on it and selecting
Connect To Slave Log. This will bring up the Slave Log window, allowing you to see what the Slave is currently
doing.
There are a few places in the Monitor you can find the option to connect to the Slave log:
• The Slave panel right-click menu.
• The Task panel right-click menu. Note that it will only appear for rendering or completed tasks.
• The Job Report panel right-click menu.
• The Slave Report panel right-click menu.
Connecting to the Pulse Log
You can remotely connect to a Pulse by double-clicking on it in the Pulse panel, or by right-clicking on it and selecting
Connect To Pulse Log. This will bring up the Pulse Log window, allowing you to see what the Pulse is currently doing.
Connecting to the Balancer Log
You can remotely connect to a Balancer by double-clicking on it in the Balancer panel, or by right-clicking on it
and selecting Connect To Balancer Log. This will bring up the Balancer Log window, allowing you to see what the
Balancer is currently doing.
5.11.3 Remote Controlling Slaves, Pulses, and Balancers
The Remote Control menu can be found in the Slave, Pulse, and Balancer panel’s right-click menu. Note that the
availability of these options can vary depending on the context in which they are used, as well as the User Group
Permissions that are defined for the current user. Remote Administration must also be enabled on the farm, and can be
enabled in the Client Setup.
These are the options that are available in the Slave, Pulse, and Balancer Remote Control menus:
• Start Machine: Starts the machine using Wake On Lan.
• Shutdown Machine: Turns off the machine.
• Restart Machine: Restarts the machine.
• Suspend Machine: Sets the machine's state to suspended (Windows only).
• Execute Command: Executes an arbitrary command on the machine.
When executing an arbitrary command, if you want to execute a DOS command on a Windows machine, the command
must be preceded with “cmd /C”. This opens the DOS prompt, executes the command, and closes the prompt. For
example:
cmd /C echo "foo" > c:\test.txt
These remote commands do not allow for user input, so any command that requires a prompt cannot be answered. An example where this might cause confusion is Microsoft's xcopy command: if xcopy cannot determine whether the destination is a file or a folder, it will immediately exit as though it succeeded instead of asking what should be done.
If a command returns a non-zero exit code, the command will be interpreted as having failed.
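This exit-code convention can be mirrored locally with a small Python sketch (illustrative only, not Deadline's remote execution code): run the command without a usable stdin, capture its output, and treat any non-zero exit code as failure.

```python
import subprocess

def execute_command(command):
    """Run a shell command the way the remote Execute Command option
    behaves: no interactive input is possible, and a non-zero exit
    code is interpreted as failure."""
    proc = subprocess.run(
        command,
        shell=True,                 # on Windows, DOS commands need the "cmd /C" prefix
        stdin=subprocess.DEVNULL,   # commands cannot prompt for user input
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
    return proc.returncode == 0, proc.stdout.decode(errors="replace")
```

For example, a plain echo succeeds, while any command that exits with a non-zero code is reported as failed, exactly as the Remote Commands panel would show it.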
Slave Remote Control Options
These options are only available in the Slave Remote Control menu:
• Search For Jobs: Forces the Slave to search the Repository for a job to do.
• Cancel Current Tasks: Forces the Slave to cancel its current tasks.
• Start Slave: Starts the Slave instance.
• Stop Slave: Stops the Slave instance.
• Restart Slave: Restarts the Slave instance.
• Continue Running After Current Task Completion: The Slave will continue to run after it finishes its current
task.
• Stop Slave After Current Task Completion: The Slave will stop after the current task is completed.
• Restart Slave After Current Task Completion: The Slave will restart after the current task is completed.
• Shutdown Machine After Current Task Completion: The Machine running the Slave will stop after the
current task is completed.
• Restart Machine After Current Task Completion: The machine running the Slave will restart after the current
task is completed.
• Start All Slave Instances: Starts all the slave instances on the selected machines.
• Start New Slave Instance: Starts a new slave instance with the specified name on the selected machine.
Pulse Remote Control Options
These options are only available in the Pulse Remote Control menu:
• Perform Pending Job Scan: Forces Pulse to perform the Pending Job Scan operation.
• Perform House Cleaning: Forces Pulse to perform the House Cleaning operation.
• Perform Repository Repair: Forces Pulse to perform the Repository Repair operation.
• Perform Power Management Check: Forces Pulse to perform the Power Management check.
• Start Pulse: Starts the Pulse instance.
• Stop Pulse: Stops the Pulse instance.
• Restart Pulse: Restarts the Pulse instance.
Balancer Remote Control Options
These options are only available in the Balancer Remote Control menu:
• Perform Balancing: Forces the Balancer to perform the Balancing operation.
• Start Balancer: Starts the Balancer instance.
• Stop Balancer: Stops the Balancer instance.
• Restart Balancer: Restarts the Balancer instance.
5.11.4 Remote Commands Panel
The Remote Command panel shows all pending and completed remote commands that were sent from the Monitor.
When sending a remote command, if this panel is not already displayed, it will be displayed automatically (assuming
you have permissions to see the Remote Command panel).
You can view the results of a remote command by clicking on the command in the list. The full results will be shown
in the log window below. All successful commands will start with “Connection Accepted.”
5.11.5 Remote Desktop Software
There are many applications that allow you to remotely control another computer. The following applications are
supported by Deadline out of the box via Monitor scripts. The scripts can be run from the Scripts menu in the Monitor,
or by right-clicking on a Slave, Pulse, or Balancer in their respective panels. Right-click scripts can also be found in
the Task and Report panels.
Apple Remote Desktop (ARD)
With Apple Remote Desktop (ARD), you can observe and obtain access to the computers on your network. Note that
in order to connect to a machine from the Monitor, that machine must already be in the ARD list of computers because
Deadline can’t create new computer entries and add them to the list. An error message is displayed if the machine
can’t be found in the ARD list.
The following options are available in the ARD window in the Monitor:
• Machine IP Address(es): Specify which machines to connect to. Use a comma to separate multiple addresses.
• Hide this window if running from a right-click Scripts menu: If enabled, this window will be hidden if run
from a right-click menu in the Monitor. You can always run it from the main Scripts menu to see this window.
Radmin
Radmin is fast, secure and affordable remote-control software that enables you to work on a remote computer in real
time as if you were sitting in front of it.
The following options are available in the Radmin window in the Monitor:
• Machine Name(s): Specify which machines to connect to. Use a comma to separate multiple machine names.
• Radmin Viewer: The Radmin viewer executable to use.
• Radmin Port: The Radmin port.
• Hide this window if running from a right-click Scripts menu: If enabled, this window will be hidden if run
from a right-click menu in the Monitor. You can always run it from the main Scripts menu to see this window.
Remote Desktop Connection (RDC)
With Remote Desktop Connection (RDC), you can easily connect to a terminal server or to another computer running
Windows. All you need is network access and permissions to connect to the other computer.
The following options are available in the RDC window in the Monitor:
• Machine Name(s): Specify which machines to connect to. Use a comma to separate multiple machine names.
• Settings:
– No Settings: When this option is chosen, no existing RDP settings are used to connect.
– Settings File: When this option is chosen, the specified RDP config file is used to connect.
– Settings Folder: When this option is enabled, existing RDP config files in this folder are used to
connect. If the machine does not have an RDP config file, you’ll have the option to save one before
connecting.
• Hide this window if running from a right-click Scripts menu: If enabled, this window will be hidden if run
from a right-click menu in the Monitor. You can always run it from the main Scripts menu to see this window.
VNC
Virtual Network Computing (VNC) is a desktop protocol for remotely controlling another computer. It transmits keyboard presses and mouse clicks from one computer to another over a network, relaying screen updates back in the other direction. There are many options available for VNC software. TightVNC, RealVNC, UltraVNC, and Chicken
have all been used successfully with Deadline.
The following options are available in the VNC window in the Monitor:
• Machine Name(s): Specify which machines to connect to. Use a comma to separate multiple machine names.
• VNC Viewer: The VNC viewer executable to use.
• Password: The VNC password.
• VNC Port: The VNC port.
• Remember Password: Enable to remember your password between sessions.
• Hide this window if running from a right-click Scripts menu: If enabled, this window will be hidden if run
from a right-click menu in the Monitor. You can always run it from the main Scripts menu to see this window.
5.12 Network Performance
5.12.1 Overview
This guide is intended to help you find and fix potential bottlenecks in your Deadline render farm. If you are noticing
sluggish performance when using Deadline, there are a few things you can do to try to improve it.
5.12.2 Adjust Monitor and Slave Settings
There are a few Monitor and Slave settings in the Repository Options that you can tweak to help improve performance,
and reduce load on both the network and the database. You can also use the Auto Adjust option to figure out the best
default values to use based on the number of Slaves in your farm. See the Repository Options documentation for more
information.
5.12.3 Enable Throttling
Pulse supports a Throttling feature, which is helpful if you’re submitting large files with your jobs. This can be used
to limit the number of Slaves that are copying over the Job files at the same time. The Throttling settings can be found
in the Pulse Settings section of the Repository Options.
For example, if you have 100 Slaves and you're submitting 500MB scene files with your jobs, you may notice a
performance hit if all 100 Slaves try to copy the Job and Plugin files at the same time. You could set the Slave
Throttle Limit to 10, so that only 10 of those Slaves will ever be copying those files at the same time. When a Slave
renders subsequent tasks for the same Job, it is not affected by the throttling feature, since it already has the
required files. Note that for this feature to work, you must be running Pulse.
5.12.4 Utilize Limits / Machine Limits
Irrespective of Pulse Throttling, if your scene files (Maya, 3dsMax, modo, etc.) reference a large number of
external asset files (textures, geometry caches), then when a job starts up on many machines at once, your network
file storage solution may struggle with this 'fire storm' of I/O demand. To lower this demand on your file server,
you can use Machine Limits or Limits. One aspect of the limits feature is the ability to tell a Slave not to return
its limit stub until its current task has reached a certain progress percentage, by which point it can be presumed
to have loaded all the assets it needs. Note that not all Plugins report task progress, which this feature requires
to operate correctly.
5.12.5 Manage Job Auxiliary Files
If you are submitting your scene files with your Jobs, this can affect overall performance if the scene files are quite
large. This is because whenever a Slave starts a new Job, it copies those Job files locally before rendering, including
the Scene file if submitted with the Job. As mentioned in the previous section, if you have hundreds of Slaves starting
a Job with a large scene file, and your Repository hardware isn’t built to handle a large load, your performance will
suffer.
If enabling Throttling isn’t helping, another option (which can also be used in conjunction if you wish) is to configure
Deadline to store these scene files in an alternate location (like a separate, dedicated file server). This can be done by
configuring the Job Auxiliary Files settings in the Repository Options.
From here, you can choose a server that’s better equipped to handle the load, which will help improve the performance
and stability of your Repository machine, especially if it is also hosting your Database backend. In a mixed farm
environment, you need to ensure that the paths for each operating system resolve to the same location. Otherwise, a
scene file submitted with the Job on one operating system will not be visible to a Slave running on another.
5.13 Cross Platform Rendering
5.13.1 Overview
Many of the applications that Deadline supports are available for multiple operating systems, and if you have a mixed
farm, you will probably run into one or more of these scenarios:
• You want to submit Jobs from one operating system and render on a different one.
• You want one or more Jobs to render on machines with different operating systems concurrently.
Both of these can be achieved, thanks to Deadline’s Path Mapping feature. While there may be other considerations to
take into account, depending on the application you’re rendering with, the Path Mapping feature will do most of the
work for you.
5.13.2 Mapped Path Setup
When using a mixed render farm, it is all but guaranteed that asset paths will be different on each operating system.
In many cases, Deadline is aware of the paths being passed to the rendering application, so you can configure Path
Mappings to swap out these paths when appropriate based on the operating system.
To add a new Path Mapping, just click the ‘Add’ button. Then, you specify the path that needs to be swapped out,
along with the paths that will be swapped in based on the operating system. You can also specify a region so you can
have different mappings for the same path across different regions. For best results, make sure that all paths end with
their appropriate path separator (‘/’ or ‘\’). This helps avoid mangled paths that result from one path having a
trailing separator and the other not.
Note that these swaps only work one-way, so if you are swapping from PC to Linux and vice-versa, you will need
two separate entries. For example, let's say the PC machines use the path ‘\\server\share\’ for assets, while the Linux
machines use the path ‘/mnt/share/’. Here is what your two entries should look like:
• Entry 1 (replaces the Linux path with the PC path on PCs):
Replace Path: /mnt/share/
Windows Path: \\server\share\
Linux Path:
Mac Path:
• Entry 2 (replaces the PC path with the Linux path on Linux):
Replace Path: \\server\share\
Windows Path:
Linux Path: /mnt/share/
Mac Path:
If you have Mac machines as well, you will need three entries. For example, if the Macs use ‘/Volumes/share/’ to
access the assets from the previous example, here is what your three entries should look like:
• Entry 1 (replaces the Linux path with the PC path on PCs and the Mac path on Macs):
Replace Path: /mnt/share/
Windows Path: \\server\share\
Linux Path:
Mac Path: /Volumes/share/
• Entry 2 (replaces the PC path with the Linux path on Linux and the Mac path on Macs):
Replace Path: \\server\share\
Windows Path:
Linux Path: /mnt/share/
Mac Path: /Volumes/share/
• Entry 3 (replaces the Mac path with the PC path on PCs and the Linux path on Linux):
Replace Path: /Volumes/share/
Windows Path: \\server\share\
Linux Path: /mnt/share/
Mac Path:
By default, Deadline just uses regular string replacement to swap out the paths. In this case, Deadline takes care of
the path separators (‘/’ and ‘\’) automatically. If you want more flexibility, you can configure your path mappings to
use regular expressions, but note that you will need to handle the path separators manually using ‘[/\\]’ in your regular
expressions.
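The one-way behavior of the three-entry example above can be modelled with a short Python sketch (a hypothetical helper; Deadline's own implementation also normalizes path separators automatically, which this naive version does not):

```python
# Each entry mirrors one mapping from the example: (Replace Path, {OS: swap-in path}).
# Blank per-OS paths are simply omitted from the dict.
MAPPINGS = [
    ("/mnt/share/",         {"Windows": "\\\\server\\share\\", "Mac": "/Volumes/share/"}),
    ("\\\\server\\share\\", {"Linux": "/mnt/share/", "Mac": "/Volumes/share/"}),
    ("/Volumes/share/",     {"Windows": "\\\\server\\share\\", "Linux": "/mnt/share/"}),
]

def map_path(path, target_os):
    """Apply the first entry whose Replace Path prefixes the given path.
    If the entry has no swap-in path for the target OS, the mapping is skipped."""
    for replace_path, swaps in MAPPINGS:
        swap_in = swaps.get(target_os)
        if swap_in and path.startswith(replace_path):
            return swap_in + path[len(replace_path):]
    return path  # no mapping applies; the path is left untouched
```

For example, `/mnt/share/scene.mb` becomes `\\server\share\scene.mb` on Windows, while on Linux it is left alone, because Entry 1's Linux path is blank.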
5.13.3 Application-Specific Considerations
For some applications, like Maya and Nuke, configuring Path Mappings is enough to allow for cross-platform rendering. For other applications, like After Effects and Cinema 4D, more setup is required. More information on how to
render with these applications on mixed farms can be found in their Cross-Platform Rendering Considerations sections
in the Plugins documentation.
5.13.4 Regions
Regions can be used to set up different mappings for the same path across your farm. For example, let's say we have a
local farm and a remote farm, and we want to map the path ‘/mnt/share/’ in our remote farm but not in our local farm.
All we have to do is set the region of our mapping to the same region our remote slaves are in. Slaves in that region
will replace ‘/mnt/share/’, while all the other slaves will use ‘/mnt/share/’ normally. We could also set up an alternate
path for the slaves in our local farm.
A mapping in the ‘All’ region will apply to every region. It should be noted that a region’s mapping is applied before
the ‘All’ region.
CHAPTER SIX
ADVANCED FEATURES
6.1 Manual Job Submission
6.1.1 Overview
Manual job submission is useful if you want more control over the submission process. For example, if you’re writing
a custom submission script, or you are integrating the submission process into an internal pipeline tool, you will
probably want full control over which job settings are being set.
If you are just looking to submit jobs from one of the many scripts that are shipped with Deadline, you should refer to
the Submitting Jobs documentation.
6.1.2 Arbitrary Command Line Jobs
To manually submit arbitrary command line jobs, you can use the -SubmitCommandLineJob option with the Command
application. The key parameters that you need to specify are:
• -executable: The executable we wish to use.
• -arguments: The arguments we wish to pass to the executable. In the arguments string, there are a few key
words that will be replaced with the appropriate text when rendering the job:
– <STARTFRAME> will be replaced with the current start frame for each task.
– <ENDFRAME> will be replaced by the current end frame for each task.
– <STARTFRAME%#> will be replaced with the current start frame for each task, and will be padded with
0’s to match the length specified for #. For example, <STARTFRAME%4> will ensure a start frame
padding of 4.
– <ENDFRAME%#> will be replaced by the current end frame for each task, and will be padded with 0’s to
match the length specified for #. For example, <ENDFRAME%6> will ensure an end frame padding of 6.
– <QUOTE> will be replaced with an actual quote character (”).
• -frames: The frames we wish to render.
The following parameters can also be included, but are optional:
• -startupdirectory: The directory that the command line will start up in.
• -chunksize: The number of frames per task (defaults to 1).
• -pool: The pool we wish to submit to (defaults to none).
• -group: The group we wish to submit to (defaults to none).
• -priority: The job’s priority (defaults to 50).
• -name: The job’s name (defaults to “Untitled”).
• -department: The job’s department (defaults to “”).
• -initialstatus: Specify “Active” or “Suspended” (defaults to “Active”).
• -prop: Specify additional job properties in the form KEY=VALUE, where KEY is any of the property names
that can be specified in the Job Info File.
For example, say we want to submit a job that uses 3dsmaxcmd.exe to render frames in the scene file
“\\shared\path\scene.max”. We want to render frames 1 to 10, and we want an image resolution of 480x320. The
command line to do this from a command prompt would look like:
3dsmaxcmd.exe -start:1 -end:10 -width:480 -height:320 "\\shared\path\scene.max"
For the job, we want a task chunk size of 2, we want to submit to the 3dsmax group, we want a priority of 50, and we
want a machine limit of 5. Finally, we want to call the job “3dsmax command line job”. The command line to submit
this job would look like this:
deadlinecommand.exe -SubmitCommandLineJob -executable "c:\Program Files\Autodesk\3dsmax8\3dsmaxcmd.exe" -arguments "-start:<STARTFRAME> -end:<ENDFRAME> -width:480 -height:320 <QUOTE>\\shared\path\scene.max<QUOTE>" -frames 1-10 -chunksize 2 -group 3dsmax -priority 50 -name "3dsmax command line job" -prop MachineLimit=5
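As an illustration of how the placeholder keywords expand per task, here is a hypothetical Python sketch (not Deadline's actual substitution code):

```python
import re

def expand_frame_tokens(arguments, start_frame, end_frame):
    """Expand <STARTFRAME>, <ENDFRAME>, their zero-padded %# variants,
    and <QUOTE> in a -arguments string, for one task's frame range."""
    # Padded variants first: <STARTFRAME%4> -> start frame padded to 4 digits.
    arguments = re.sub(r"<STARTFRAME%(\d+)>",
                       lambda m: str(start_frame).zfill(int(m.group(1))), arguments)
    arguments = re.sub(r"<ENDFRAME%(\d+)>",
                       lambda m: str(end_frame).zfill(int(m.group(1))), arguments)
    # Plain variants are simple string replacements.
    arguments = arguments.replace("<STARTFRAME>", str(start_frame))
    arguments = arguments.replace("<ENDFRAME>", str(end_frame))
    return arguments.replace("<QUOTE>", '"')
```

With a chunk size of 2, the first task of the example job would expand `-start:<STARTFRAME> -end:<ENDFRAME>` to `-start:1 -end:2`.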
6.1.3 Maintenance Jobs
Maintenance jobs are special jobs where each task for the job will render on a different machine in your farm. This
is useful for performing benchmark tests, installing new software, synchronizing files to each machine, etc. When a
maintenance job is submitted, a task will automatically be created for each slave, and once a slave has finished a task,
it will no longer pick up the job.
One way to submit a Maintenance job is to manually submit a job to Deadline by creating the necessary job
submission files as documented below. In the job info file, you must set MaintenanceJob to True:
MaintenanceJob=True
By default, a Maintenance job will render frame 0 on every machine. To render a different frame, or a sequence of
frames, you can specify the MaintenanceJobStartFrame and MaintenanceJobEndFrame options in the job info file:
MaintenanceJob=True
MaintenanceJobStartFrame=1
MaintenanceJobEndFrame=5
Note that if you specify a whitelist or blacklist in the job info file, the number of tasks that are created for the Maintenance job will equal the number of valid slaves that the job could render on.
Another way to submit a Maintenance job is to right-click on an existing job in the Monitor and choose the Resubmit
Job option. See the Resubmitting Jobs section of the Controlling Jobs documentation for more information.
6.1.4 Creating Job Submission Files
This is the method that our submission scripts use to submit jobs. This method is far more flexible, but requires more
work to set up the job. It also uses the Command application to submit the job.
Before the job can be submitted though, a Job Info File and a Plug-in Info File must be created. These are the first two
files that should always be submitted with the job.
You can also submit additional auxiliary files with the job, such as the scene file you want to render, or any other files
the job will need. Any number of auxiliary files can be specified after the job info and plugin info file. These auxiliary
files are copied to the Repository, and are then copied locally to each machine during rendering. Because these files
will be copied to the same folder, every file name must be unique.
Once these files are ready to go, you can submit the job using the command line:
deadlinecommand.exe [Job Info File] [Plug-in Info File] [Auxiliary File 1] [Auxiliary File 2]
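As a minimal sketch, the two key=value files can be generated from dictionaries before calling the Command application (the option names come from this section; the file names and the CommandLine plugin choice are illustrative assumptions):

```python
def write_submission_file(path, options):
    """Write a Deadline-style key=value submission file, one option per line."""
    with open(path, "w") as f:
        for key, value in options.items():
            f.write("{0}={1}\n".format(key, value))
    return path

# Hypothetical job: only Plugin is strictly required in the Job Info File.
job_info = {
    "Plugin": "CommandLine",   # must match an existing plugin in the repository
    "Name": "example job",
    "Frames": "1-10",
    "ChunkSize": "2",
}
plugin_info = {}               # plugin-specific keys would go here

write_submission_file("job_info.job", job_info)
write_submission_file("plugin_info.job", plugin_info)
# The job would then be submitted with:
#   deadlinecommand.exe job_info.job plugin_info.job
```

This is the same shape of file that the bundled submission scripts generate behind the scenes.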
Job Info File
The Job Info File is a plain text file that uses Key/Value pairs (key=value) to define all the generic job options used to
render the job. A couple of options are required, but most are optional. All jobs can use these options, regardless of the
plug-in that they use. Some examples have been provided further down.
Required Options
These options must be specified in the job info file, or the job will fail to submit. The rest of the options are optional.
• Plugin=<plugin name> : Specifies the plugin to use. Must match an existing plugin in the repository.
General Options
• Frames=<1,2,3-10,20> : Specifies the frame range of the render job. See the Frame List Formatting Options in
the Job Submission documentation for more information (default = 0).
• Name=<job name> : Specifies the name of the job (default = Untitled).
• UserName=<username> : Specifies the job’s user (default = current user).
• MachineName=<machineName> : Specifies the machine the job was submitted from (default = current machine).
• Department=<department name> : Specifies the department that the job belongs to. This is simply a way to
group jobs together, and does not affect rendering in any way (default = blank).
• Comment=<comment> : Specifies a comment for the job (default = blank).
• Group=<groupName> : Specifies the group that the job is being submitted to (default = none).
• Pool=<poolName> : Specifies the pool that the job is being submitted to (default = none).
• SecondaryPool=<poolName> : Specifies the secondary pool that the job can spread to if machines are available. If not specified, the job will not use a secondary pool.
• Priority=<0 or greater> : Specifies the priority of a job with 0 being the lowest (default = 50). The maximum
priority can be configured in the Job Settings of the Repository Options, and defaults to 100.
• ChunkSize=<1 or greater> : Specifies how many frames to render per task (default = 1).
• ForceReloadPlugin=<true/false> : Specifies whether or not to reload the plugin between subsequent frames of
a job (default = false). This deals with memory leaks or applications that do not unload all job aspects properly.
• SynchronizeAllAuxiliaryFiles=<true/false> : If enabled, all job files (as opposed to just the job info and plugin
info files) will be synchronized by the Slave between tasks for this job (default = false). Note that this can add
significant network overhead, and should only be used if you plan on manually editing any of the files that are
being submitted with the job.
• InitialStatus=<Active/Suspended> : Specifies what status the job should be in immediately after submission
(default = Active).
• LimitGroups=<limitGroup,limitGroup,limitGroup> : Specifies the limit groups that this job is a member of
(default = blank).
• MachineLimit=<0 or greater> : Specifies the maximum number of machines this job can be rendered on at
the same time (default = 0, which means unlimited).
• MachineLimitProgress=<0.1 or greater> : If set, the slave rendering the job will give up its current machine
limit lock when the current task reaches the specified progress. If negative, this feature is disabled (default =
-1.0). The usefulness of this feature is directly related to the progress reporting capabilities of the individual
plugins.
• Whitelist=<slaveName,slaveName,slaveName> : Specifies which slaves are on the job’s whitelist (default =
blank). If both a whitelist and a blacklist are specified, only the whitelist is used.
• Blacklist=<slaveName,slaveName,slaveName> : Specifies which slaves are on the job’s blacklist (default =
blank). If both a whitelist and a blacklist are specified, only the whitelist is used.
• ConcurrentTasks=<1-16> : Specifies the maximum number of tasks that a slave can render at a time (default
= 1). This is useful for script plugins that support multithreading.
• LimitTasksToNumberOfCpus=<true/false> : If ConcurrentTasks is greater than 1, setting this to true will
ensure that a slave will not dequeue more tasks than it has processors (default = true).
• Sequential=<true/false> : Sequential rendering forces a slave to render the tasks of a job in order. If an earlier
task is ever requeued, the slave won’t go back to that task until it has finished the remaining tasks in order
(default = false).
• Interruptible=<true/false> : Specifies if tasks for a job can be interrupted by a higher priority job during
rendering (default = false).
• SuppressEvents=<true/false> : If true, the job will not trigger any event plugins while in the queue (default =
false).
• NetworkRoot=<repositoryUNCPath> : Specifies the repository that the job will be submitted to. This is
required if you are using more than one repository (default = current default repository for the machine from
which submission is occurring).
Cleanup Options
• Protected=<true/false> : If enabled, the job can only be deleted by the job’s user, a super user, or a user that
belongs to a user group that has permissions to handle protected jobs. Other users will not be able to delete the
job, and the job will also not be cleaned up by Deadline’s automatic house cleaning.
• OnJobComplete=<Nothing/Delete/Archive> : Specifies what should happen to a job after it completes (default = Nothing).
• DeleteOnComplete=<true/false> : Specifies whether or not the job should be automatically deleted after it
completes (default = false).
• ArchiveOnComplete=<true/false> : Specifies whether or not the job should be automatically archived after it
completes (default = false).
• OverrideAutoJobCleanup=<true/false> : If true, the job will ignore the global Job Cleanup settings and
instead use its own (default = false).
• OverrideJobCleanup=<true/false> : If OverrideAutoJobCleanup is true, this will determine if the job should
be automatically cleaned up or not.
• JobCleanupDays=<0 or greater> : If OverrideAutoJobCleanup and OverrideJobCleanup are both true, this is the
number of days to keep the job before cleaning it up.
• OverrideJobCleanupType=<ArchiveJobs/DeleteJobs> : If OverrideAutoJobCleanup and OverrideJobCleanup
are both true, this is the job cleanup mode.
Environment Options
• EnvironmentKeyValue#=<key=value> : Specifies environment variables to set when the job renders. This
option is numbered, starting with 0 (EnvironmentKeyValue0), to handle multiple environment variables. For
each additional variable, just increase the number (EnvironmentKeyValue1, EnvironmentKeyValue2, etc). Note
that these variables are only applied to the rendering process, so they do not persist between jobs.
• IncludeEnvironment=<true/false> : If true, the submission process will automatically grab all the environment
variables from the submitter’s current environment and set them in the job’s environment variables (default =
false). Note that these variables are only applied to the rendering process, so they do not persist between jobs.
• UseJobEnvironmentOnly=<true/false> : If true, only the job’s environment variables will be used at render
time (default = false). If False, the job’s environment variables will be merged with the slave’s current environment, with the job’s variables overwriting any existing ones with the same name.
• CustomPluginDirectory=<directoryName> : If specified, the job will look for the plugin it needs to render
in this location. If it does not exist in this location, it will fall back on the Repository plugin directory. For example, if you are rendering with a plugin called MyPlugin, and it exists in \\server\development\plugins\MyPlugin,
you would set CustomPluginDirectory=\\server\development\plugins.
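The numbering scheme for EnvironmentKeyValue# can be sketched as follows (a hypothetical helper for generating job info lines, not part of Deadline itself):

```python
def environment_options(env):
    """Turn a dict of environment variables into numbered
    EnvironmentKeyValue# job info entries, starting at 0."""
    return ["EnvironmentKeyValue{0}={1}={2}".format(i, key, value)
            for i, (key, value) in enumerate(sorted(env.items()))]
```

For example, two variables produce the lines `EnvironmentKeyValue0=...` and `EnvironmentKeyValue1=...`, which can be appended directly to the Job Info File.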
Failure Detection Options
• OverrideJobFailureDetection=<true/false> : If true, the job will ignore the global Job Failure Detection
settings and instead use its own (default = false).
• FailureDetectionJobErrors=<0 or greater> : If OverrideJobFailureDetection is true, this sets the number of
errors before the job fails. If set to 0, job failure detection will be disabled.
• OverrideTaskFailureDetection=<true/false> : If true, the job will ignore the global Task Failure Detection
settings and instead use its own (default = false).
• FailureDetectionTaskErrors=<0 or greater> : If OverrideTaskFailureDetection is true, this sets the number
of errors before a task for the job fails. If set to 0, task failure detection will be disabled.
• IgnoreBadJobDetection=<true/false> : If true, slaves will never mark the job as bad for themselves. This
means that they will continue to make attempts at jobs that often report errors until the job is complete, or until
it fails (default = false).
• SendJobErrorWarning=<true/false> : Specifies whether the job should send warning notifications when it
reaches a certain number of errors (default = false).
Timeout Options
• MinRenderTimeSeconds=<0 or greater> : Specifies the minimum time, in seconds, a slave should render a
task for, otherwise an error will be reported (default = 0, which means no minimum). Note that if MinRenderTimeSeconds and MinRenderTimeMinutes are both specified, MinRenderTimeSeconds will be ignored.
• MinRenderTimeMinutes=<0 or greater> : Specifies the minimum time, in minutes, a slave should render a
task for, otherwise an error will be reported (default = 0, which means no minimum). Note that if MinRenderTimeSeconds and MinRenderTimeMinutes are both specified, MinRenderTimeSeconds will be ignored.
• TaskTimeoutSeconds=<0 or greater> : Specifies the time, in seconds, a slave has to render a task before it
times out (default = 0, which means unlimited). Note that if TaskTimeoutSeconds and TaskTimeoutMinutes are
both specified, TaskTimeoutSeconds will be ignored.
• TaskTimeoutMinutes=<0 or greater> : Specifies the time, in minutes, a slave has to render a task before it
times out (default = 0, which means unlimited). Note that if TaskTimeoutSeconds and TaskTimeoutMinutes are
both specified, TaskTimeoutSeconds will be ignored.
• StartJobTimeoutSeconds=<0 or greater> : Specifies the time, in seconds, a slave has to start a render job
before it times out (default = 0, which means unlimited). Note that if StartJobTimeoutSeconds and StartJobTimeoutMinutes are both specified, StartJobTimeoutSeconds will be ignored.
• StartJobTimeoutMinutes=<0 or greater> : Specifies the time, in minutes, a slave has to start a render job
before it times out (default = 0, which means unlimited). Note that if StartJobTimeoutSeconds and StartJobTimeoutMinutes are both specified, StartJobTimeoutSeconds will be ignored.
• OnTaskTimeout=<Error/Notify/ErrorAndNotify/Complete> : Specifies what should occur if a task times
out (default = Error).
• EnableAutoTimeout=<true/false> : If true, a slave will automatically figure out if it has been rendering too
long based on some Repository Configuration settings and the render times of previously completed tasks (default = false).
• EnableTimeoutsForScriptTasks=<true/false> : If true, then the timeouts for this job will also affect its
pre/post job scripts, if any are defined (default = false).
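For example, the timeout options above might appear in a Job Info File as follows (the values shown are illustrative only):

MinRenderTimeMinutes=1
TaskTimeoutMinutes=90
OnTaskTimeout=ErrorAndNotify

This would report an error for any task that finishes in under a minute or runs longer than 90 minutes, and would both error out and send a notification when a task times out.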
Dependency Options
• JobDependencies=<jobID,jobID,jobID> : Specifies which jobs must finish before this job will resume (default
= blank). These dependency jobs must be identified using their unique job ID, which is output after the job is
submitted and can be found in the “Job ID” column in the Monitor.
• JobDependencyPercentage=<-1, or 0 to 100> : If between 0 and 100, this job will resume when all of its
job dependencies have completed the specified percentage number of tasks. If -1, this feature will be disabled
(default = -1).
• IsFrameDependent=<true/false> : Specifies whether or not the job is frame dependent (default = false).
• FrameDependencyOffsetStart=<-100000 to 100000> : If the job is frame dependent, this is the start frame
offset (default = 0).
• FrameDependencyOffsetEnd=<-100000 to 100000> : If the job is frame dependent, this is the end frame
offset (default = 0).
• ResumeOnCompleteDependencies=<true/false> : Specifies whether or not the dependent job should resume
when its dependencies are complete (default = true).
• ResumeOnDeletedDependencies=<true/false> : Specifies whether or not the dependent job should resume
when its dependencies have been deleted (default = false).
• ResumeOnFailedDependencies=<true/false> : Specifies whether or not the dependent job should resume
when its dependencies have failed (default = false).
• RequiredAssets=<assetPath,assetPath,assetPath> : Specifies which asset files must exist before this job will
resume (default = blank). These asset paths must be identified using full paths, and multiple paths can be
separated with commas. If using frame dependencies, you can replace the padding in a sequence with ‘#’
characters, and a task for the job will only be resumed when the required assets for that task’s frame exist.
• ScriptDependencies=<scriptPath,scriptPath,scriptPath> : Specifies what Python script files will be executed
to determine if a job can resume (default = blank). These script paths must be identified using full paths,
and multiple paths can be separated with commas. See the Scripting section of the documentation for more
information on script dependencies.
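For example, a frame dependent job that waits on another job and on a rendered image sequence might include the following (the job ID and paths are illustrative):

IsFrameDependent=true
FrameDependencyOffsetStart=-1
FrameDependencyOffsetEnd=1
JobDependencies=546cc87357dbb04344a5c6b5
RequiredAssets=\\fileserver\Project\Renders\OutputFolder\o_HDP_010_BG_v01.####.exr

With these offsets, a task covering frame N resumes once the dependency has finished frames N-1 through N+1 and the corresponding padded asset files exist.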
Scheduling Options
• ScheduledType=<None/Once/Daily> : Specifies whether or not you want to schedule the job (default = None).
• ScheduledStartDateTime=<dd/MM/yyyy HH:mm> : The date/time at which the job will run. The start
date/time must match the specified format. Here’s an explanation:
– dd: The day of the month. Single-digit days must have a leading zero.
– MM: The numeric month. Single-digit months must have a leading zero.
– yyyy: The year in four digits, including the century.
– HH: The hour in a 24-hour clock. Single-digit hours must have a leading zero.
– mm: The minute. Single-digit minutes must have a leading zero.
• ScheduledDays=<day interval> : If scheduling a Daily job, this is the day interval for when the job runs
(default = 1).
• JobDelay=<dd:hh:mm:ss> : A start time delay. If there is no ScheduledStartDateTime this delay will be
applied to the submission date. The delay value is represented by the number of days, hours, minutes, and
seconds, all separated by colons.
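For example, to schedule a job to run once at 10:30 PM on May 4, 2015 (note the dd/MM/yyyy HH:mm format; the date is illustrative):

ScheduledType=Once
ScheduledStartDateTime=04/05/2015 22:30

Alternatively, JobDelay=00:02:00:00 would delay the job’s start by two hours after submission.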
Output Options
• OutputFilename#=<fileName> : Specifies the output image filename for each frame (default = blank). This
allows the Monitor to display the “View Output Image” context menu option in the task list. There is no
minimum or maximum limit on padding length; a padding of 4 (####) is very common in many
applications. If the filename is a full path, then the OutputDirectory# option is not needed. This option is numbered, starting with 0 (OutputFilename0), to handle multiple output file names per frame. For each additional
file name, just increase the number (OutputFilename1, OutputFilename2, etc).
• OutputFilename#Tile?=<fileName> : Specifies the output image filenames for each task of a Tile job (default
= blank). This allows the Monitor to display the “View Output Image” context menu option in the task list for
Tile jobs. The ‘#’ is used to support multiple outputs per frame (see OutputFilename# above), and the ‘?’ is
used to specify the output for each task in the Tile job. For example, a Tile job with 2 outputs and 2 tiles would
specify OutputFilename0Tile0, OutputFilename0Tile1, OutputFilename1Tile0, and OutputFilename1Tile1.
• OutputDirectory#=<directoryName> : Specifies the output image directory for the job (default = blank). This
allows the Monitor to display the “Explore Output” context menu option in the job list. If the filename is a full
path, then the OutputDirectory# option is not needed. This option is numbered, starting with 0 (OutputDirectory0), to handle multiple output directories per frame. For each additional directory, just increase the number
(OutputDirectory1, OutputDirectory2, etc).
OutputDirectory0=\\fileserver\Project\Renders\OutputFolder\
OutputFilename0=o_HDP_010_BG_v01.####.exr
OutputDirectory1=\\fileserver\Project\Renders\OutputFolder\
OutputFilename1=o_HDP_010_SPEC_v01####.dpx
OutputDirectory2=\\fileserver\Project\Renders\OutputFolder\
OutputFilename2=o_HDP_010_RAW_v01_####.png
Notification Options
• NotificationTargets=<username,username,username> : A list of users, separated by commas, who should be
notified when the job is completed (default = blank).
• ClearNotificationTargets=<true/false> : If enabled, all of the job’s notification targets will be removed (default
= false).
• NotificationEmails=<email,email,email> : A list of additional email addresses, separated by commas, to send
job notifications to (default = blank).
• OverrideNotificationMethod=<true/false> : If the job user’s notification method should be ignored (default =
false).
• EmailNotification=<true/false> : If overriding the job user’s notification method, whether to use email notification (default = false).
• PopupNotification=<true/false> : If overriding the job user’s notification method, whether to use popup notification (default = false).
• NotificationNote=<note> : A note to append to the notification email sent out when the job is complete (default
= blank). Separate multiple lines with [EOL], for example:
This is a line[EOL]This is another line[EOL]This is the last line
Script Options
• PreJobScript=<path to script> : Specifies a full path to a python script to execute when the job initially starts
rendering (default = blank).
• PostJobScript=<path to script> : Specifies a full path to a python script to execute when the job completes
(default = blank).
• PreTaskScript=<path to script> : Specifies a full path to a python script to execute before each task starts
rendering (default = blank).
• PostTaskScript=<path to script> : Specifies a full path to a python script to execute after each task completes
(default = blank).
Tile Job Options
• TileJob=<true/false> : If this job is a tile job (default = false).
• TileJobFrame=<frameNumber> : The frame that the tile job is rendering (default = 0).
• TileJobTilesInX=<xCount> : The number of tiles in X for a tile job (default = 0). This should be specified
with the TileJobTilesInY option below.
• TileJobTilesInY=<yCount> : The number of tiles in Y for a tile job (default = 0). This should be specified
with the TileJobTilesInX option above.
• TileJobTileCount=<count> : The number of tiles for a tile job (default = 0). This is an alternative to specifying
the TileJobTilesInX and TileJobTilesInY options above.
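For example, a Tile job that renders frame 5 as a 2x2 grid might be submitted with (illustrative values):

TileJob=true
TileJobFrame=5
TileJobTilesInX=2
TileJobTilesInY=2

This produces four tasks, one per tile, whose outputs can be named with the OutputFilename0Tile0 through OutputFilename0Tile3 options described above.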
Maintenance Job Options
• MaintenanceJob=<true/false> : If this job is a maintenance job (default = false).
• MaintenanceJobStartFrame=<frameNumber> : The first frame for the maintenance job (default = 0).
• MaintenanceJobEndFrame=<frameNumber> : The last frame for the maintenance job (default = 0).
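For example, a maintenance job spanning ten frames could be submitted as follows (illustrative values):

MaintenanceJob=true
MaintenanceJobStartFrame=1
MaintenanceJobEndFrame=10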
Extra Info Options
These are extra arbitrary properties that have corresponding columns in the Monitor that can be sorted on. There are a
total of 10 Extra Info properties that can be specified.
• ExtraInfo0=<arbitrary value>
• ExtraInfo1=<arbitrary value>
• ExtraInfo2=<arbitrary value>
• ExtraInfo3=<arbitrary value>
• ExtraInfo4=<arbitrary value>
• ExtraInfo5=<arbitrary value>
• ExtraInfo6=<arbitrary value>
• ExtraInfo7=<arbitrary value>
• ExtraInfo8=<arbitrary value>
• ExtraInfo9=<arbitrary value>
These are additional arbitrary properties. There is no limit on the number that are specified, but they do not have
corresponding columns in the Monitor.
• ExtraInfoKeyValue0=<key=value>
• ExtraInfoKeyValue1=<key=value>
• ExtraInfoKeyValue2=<key=value>
• ExtraInfoKeyValue3=<key=value>
• ...
Job Info File Examples
3ds Max Job Info File:
Plugin=3dsmax
ForceReloadPlugin=false
Frames=0-400
Priority=50
Pool=3dsmax
Name=IslandWaveScene_lighted01
Comment=Testing
OutputDirectory0=\\fileserver\Renders\OutputFolder\
OutputFilename0=islandWaveBreak_Std####.png
Lightwave Job Info File:
Plugin=Lightwave
Frames=1-10,21-30
ChunkSize=10
Priority=99
Pool=LightwavePool
Group=NiceShot
Name=Lightwave Test
OutputFilename0=\\fileserver\Renders\OutputFolder\test####.tif
DeleteOnComplete=true
MachineLimit=5
SlaveTimeout=3600
Fusion Job Info File:
Plugin=Fusion
Frames=1-100
Priority=50
Group=Fusion
Name=Fusion Dependency Test
OutputFilename0=\\fileserver\Renders\OutputFolder\dfusion_test####.tif
JobDependencies=546cc87357dbb04344a5c6b5,53d27c9257dbb027b8a4ccd2
InitialStatus=Suspended
LimitGroups=DFRNode
ExtraInfo0=Regression Testing
ExtraInfoKeyValue0=TestID=344
ExtraInfoKeyValue1=DeveloperID=12
Plug-in Info File
The Plug-in Info File is a plain text file that uses Key/Value pairs (key=value) to define the plug-in specific options
that are used by the individual plug-ins to render the job. Often, these options are used to build up the command line
arguments that are passed to the rendering application.
The plug-ins read in the settings from the Plug-in Info File using the script functions GetPluginInfoEntry(...) and
GetPluginInfoEntryWithDefault(...), which are discussed in more detail in the Plug-in Scripting documentation (Application Plugins).
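To illustrate how the key=value format maps onto those lookups, here is a minimal Python sketch. This is only an illustration of the file format, not Deadline's implementation; the helper names are hypothetical stand-ins for the scripting functions above:

```python
# Minimal sketch: parse a Plug-in Info File's key=value pairs and look
# them up the way GetPluginInfoEntry / GetPluginInfoEntryWithDefault
# behave. Hypothetical helpers -- NOT Deadline's actual implementation.

def parse_plugin_info(text):
    """Parse key=value lines into a dict, skipping blank or invalid lines."""
    entries = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue
        key, _, value = line.partition("=")
        entries[key.strip()] = value.strip()
    return entries

def get_entry_with_default(entries, key, default):
    """Return the value for key, or default if the key was not specified."""
    return entries.get(key, default)

info = parse_plugin_info("Version=2015\nRenderer=default\n")
print(get_entry_with_default(info, "Version", "2014"))  # 2015
print(get_entry_with_default(info, "Threads", "0"))     # 0
```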
6.2 Power Management
6.2.1 Overview
Power Management is a system that automatically controls when machines in the farm start up or shut down, based on
the current conditions of the farm. It can start machines if they are required to render jobs in the farm, and it can shut
down machines that are no longer needed for rendering. It can also poll an external temperature sensor using SNMP
and shut down machines if the server room is too hot. Finally, it can reboot problematic machines on a regular basis.
6.2.2 Running Pulse
Power Management is built into Pulse, so Pulse must be running for Power Management to work. The only exception
to this is the Thermal Shutdown feature. Redundancy for this feature has been built into the Slave applications, so even
if Pulse isn’t running, you’re still protected if the temperature of your server room gets too hot.
See the Pulse documentation for more information about running and configuring Pulse.
6.2.3 Configuration
Power Management can be configured from the Monitor by selecting ‘Tools’ -> ‘Configure Power Management’. You
will need to be in Super User mode for this, if you are not part of a User Group that has access to this feature.
Machine Groups are used by Power Management to organize Slave machines on the farm, and each group has four
sections of settings that can be configured independently of each other. To add a new Machine Group, simply click the
Add button in the Machine Group section.
Power Management Group Settings:
• Group Name: The name with which the Power Management Group will be identified.
• Group Mode: Whether this particular Group is enabled or disabled.
• Include All Slaves in this Group: If enabled, all slaves will be included in this group. Note that you cannot
override the slave order for the Power Management features if this is enabled.
• Slaves Not In Group: The Slaves that will not be part of this Group.
• Slaves In Group: Slaves that will be part of this Group.
To edit the Power Management settings within a group, simply click on the group in the Machine Groups list.
Idle Shutdown
Idle Shutdown is a system that forces Slaves to shut down after they have been idle for a certain period of time. This
can be used to save on energy costs when the render farm is idle, without having to shut down machines manually.
Combining this feature with Wake-On-Lan will ensure that machines in the render farm are only running when they
are needed.
You can split the idle time period between a Daytime period and an Evening period. This is useful because in most
cases, you want most of your machines to stay on during the workday, and then shut down during the evening when
there are no renders left. In addition, you can also specify exceptions to these two periods, which means (for example)
you could have different idle periods for the weekend.
Idle Shutdown Settings:
• Idle Shutdown Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed
as normal, but no action is actually taken.
• Number of Minutes Before Idle Slaves Will Be Shut Down: The amount of idle time, in minutes, after which
a Slave will be shut down.
• Number of Slaves to Leave Running: The minimum number of Slaves to keep running at all times.
• Slave Shutdown Type: The method that will be used to shut down idle Slaves:
– Shutdown: Power off the machine using the normal shutdown method.
– Suspend: Suspends the machines instead of shutting them down. Only works for Windows Slaves.
– Run Command: Use this method to run a command when shutting down a Slave.
• Important Processes: If the Slave has any of these processes running, it will not shut down.
• Overrides: Define overrides for different days and times. Simply specify the day(s) of the week, the time
period, the minimum number of Slaves, and the idle shutdown time for each override required. For example, if
more machines are required to be running continuously for Friday evening and Saturday afternoon, this can be
accomplished with an override.
• Override Shutdown Order: Whether or not to define the order in which Slaves are shutdown. If disabled,
Slaves will be shut down in alphabetical order. If enabled, use the Set Shutdown Order dialog to define the order
in which you would like the Slaves to shut down. Note that this feature is not available if the Power Management
Group is configured to include all slaves.
Machine Startup
This is a system that allows powered-down machines to be started automatically when new Jobs are submitted to the
render farm. Combining this feature with Idle Shutdown will ensure that machines in the render farm are only running
when they are needed.
If Slave machines support it, Wake On Lan (WOL) or IPMI commands can be used to start them up after they shut down. By default, the WOL packet is sent over port 9, but you can change this in the Wake On Lan settings in the
Repository Configuration. Make sure there isn’t a firewall or other security software blocking communication over the
selected port(s).
WOL Packets are sent to the MAC addresses that Deadline has on file for each of the Slaves. If your Slaves have multiple Ethernet ports, the Slave may have registered the wrong MAC address, which may prevent WOL from working
properly. If this is the case, you will have to manually set MAC Address overrides for the Slaves that are having this
problem, which can be done through the Slave Settings dialog.
Note that if machines in the group begin to be shut down due to temperature, this feature may be automatically disabled
for the group to prevent machines from starting up and raising the temperature again.
Machine Startup Settings:
• Machine Startup Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed, but no action is actually taken.
• Number of Slaves to Wake Up per Interval: The maximum number of machines that will be started in a given
Power Management check interval. The interval itself can be configured in the Pulse section of the Repository
Options.
• Wake Up Mode: This determines how the machines will be woken up. See the available Wake Up Modes below
for more information.
• Override Startup Order: Whether or not to define the order in which Slaves are started up. If disabled, Slaves
will be started in alphabetical order. If enabled, use the ‘Set Startup Order’ dialog to define the order. Note that
this feature is not available if the Power Management Group is configured to include all slaves.
Wake Up Modes:
• Use Wake On Lan: Wake On Lan packets will be sent to machines to wake them up.
• Run Command: This is primarily for IPMI support. If enabled, Pulse will run a given command to start Slave
machines. This command will be run once for each Slave that is being woken up. A few tags can be used within
the command:
– {SLAVE_NAME} is replaced with the current Slave’s hostname.
– {SLAVE_MAC} is replaced with the current Slave’s MAC address.
– {SLAVE_IP} is replaced with the current Slave’s IP address.
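For example, a Run Command using ipmitool might look like the following (the hostname suffix and the credentials file path are illustrative; consult your IPMI tooling’s documentation for the exact options):

ipmitool -H {SLAVE_NAME}-ipmi -U admin -f /etc/deadline/ipmi-pass chassis power on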
Thermal Shutdown
The Thermal Shutdown system polls temperature sensors and responds by shutting down machines if the temperature
gets too high. The sensors we have used for testing are NetTherms, and APC Sensors are also known to be compatible.
Note that the temperature sensors communicate over port 161 (the standard SNMP port), so make sure this port is not blocked.
Thermal Shutdown settings:
• Thermal Shutdown Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks will be
performed, but no action is actually taken.
• Temperature Units: The units used to display and configure the temperatures. Note that this is separate from
the units that the actual sensors use.
• Thermal Sensors: The host and OID (Object Identifier) of the sensor(s) in the zone. To add a new sensor,
simply click the ‘Add’ button.
• Temperature Threshold: Thresholds can be added for any temperature. When a sensor reports a temperature
higher than a particular threshold, the Slaves in the zone will respond accordingly. Note that higher temperature
thresholds take precedence over lower ones.
• Shut down Slaves if sensors are offline for this many minutes: If enabled, Slaves will shut down after a period
of time in which the temperature sensor could not be reached for temperature information.
• Disable Machine Startup if thermal threshold is reached: If enabled, Machine Startup for the current group
will be disabled if a thermal threshold is reached.
• Re-enable Machine Startup when temperature returns to: If enabled, this will re-enable Machine Startup
when the temperature returns to the specified temperature.
• Override Shutdown Order: Whether or not to define a custom order in which Slaves will be shutdown. If
disabled, Slaves will be shut down in alphabetical order. If enabled, use the ‘Set Shutdown Order’ dialog to
define the order. Note that this feature is not available if the Power Management Group is configured to include
all slaves.
Sensor Settings:
• Sensor Hostname or IP Address: The host of the temperature sensor.
• Sensor OID: The OID (Object Identifier) of the temperature sensor. The default OID is for the particular type
of sensor we use.
• Sensor SNMP Community: The SNMP community string used to query the sensor. If testing the sensor fails
with ‘private’ selected, try using ‘public’.
• Sensor Reports Temperature As: Select the units that your temperature sensor uses to report the temperature.
• Sensor Timeout in Milliseconds: The timeout value for contacting the sensor.
• Sensor Testing Temperature: If enabled, the corresponding temperature will always be returned by this sensor.
This is useful for testing purposes.
• Test Sensor: Queries the sensor for the current temperature, and displays it. If the temperature displayed seems
incorrect, make sure you have selected the correct unit of temperature above.
If you simply want to test the Thermal Shutdown feature, but you don’t have any thermal sensors to test with, you
can enable the Sensor Testing Temperature in the Sensor settings above. When enabled, you don’t need to provide a
Sensor Hostname or Sensor OID, and the test sensor will always return the specified temperature.
Machine Restart
If you have problematic machines that you need to reboot periodically, you can configure the Machine Restart feature
of Power Management to restart your Slaves for you. Note that if the Slave on the machine is in the middle of
rendering a Task, it will finish its current Task before the machine is restarted.
Machine Restart settings:
• Machine Restart Mode: Select Disabled, Enabled, or Debug mode. In Debug mode, all the checks are performed as normal, but no action is actually taken.
• Restart machines after Slave has been running for: The interval, in minutes, at which this group of Slaves
will be restarted.
6.3 Slave Scheduling
6.3.1 Overview
You can use the Slave Scheduling feature to configure when Slave applications should be launched and shut down.
Slave Scheduling is controlled by the Launcher, so the Launcher must be running on the machines for Slave Scheduling
to work.
If a slave is scheduled to start on a machine, a notification message will pop up for 30 seconds indicating that the slave
is scheduled to start. If someone is still using the machine, they can choose to delay the start of the slave for a certain
amount of time.
6.3.2 Configuration
Slave Scheduling can be configured from the Monitor by selecting ‘Tools’ -> ‘Configure Slave Scheduling’. You will
need to be in Super User mode for this, if you are not part of a User Group that has access to this feature.
Machine Groups are used by Slave Scheduling to organize Slave machines on the farm, and each group can have
different scheduling settings. To add a new Machine Group, simply click the Add button in the Machine Group
section.
Slave Scheduling Group Settings:
• Group Name: The name with which the Slave Scheduling Group will be identified.
• Group Mode: Whether this particular Group is enabled or disabled.
• Include All Slaves in this Group: If enabled, all slaves will be included in this group.
• Unassigned Slaves: The Slaves that will not be part of this Group.
• Slaves In Group: Slaves that will be part of this Group.
To edit the scheduling settings within a group, simply click on the group in the Machine Groups list.
Slave Scheduling
These settings are used to define the schedule for when slaves should start and stop.
• Ensure Slave Is Running During Scheduled Hours: If enabled, slaves will be restarted if they are shut down
during the scheduled hours.
• Day of the Week: Configure which days of the week you want to set a schedule for.
• Start Time: The time on the selected day that the Slave application should be launched if it is not already
running.
• Stop Time: The time on the selected day that the Slave application should be closed if it is running.
Idle Detection
These settings are used to launch the slave if the machine has been idle for a certain amount of time (“idle” means no
keyboard or mouse input). There are also additional criteria that can be checked before launching the slave, including
the machine’s current memory and CPU usage, the currently logged in user, and the processes currently running on the
machine. Finally, this system can stop the slave automatically when the machine is no longer idle.
• Start Slave When Machine Is Idle For ___ Minutes: If enabled, the Slave will be started on the machine if
it is idle. A machine is considered idle if there hasn’t been any keyboard or mouse activity for the specified
amount of time.
• Stop Slave When Machine Is No Longer Idle: If enabled, the Slave will be stopped when the machine is no
longer idle. A machine is considered idle if there hasn’t been any keyboard or mouse activity for the specified
amount of time.
• Only Stop Slave If Started By Idle Detection: If enabled, the Slave will only be stopped when the machine is
no longer idle if that Slave was originally started by Idle Detection. If the Slave was originally started manually,
it will not be stopped.
There are some limitations with Idle Detection depending on the operating system:
• On Windows, Idle Detection will not work if the Launcher is running as a service. This is because the service
runs in an environment that is separate from the Desktop, and has no knowledge of any mouse or keyboard
activity.
• On Linux, the Launcher uses X11 to determine if there has been any mouse or keyboard activity. If X11 is not
available, Idle Detection will not work.
Note that Idle Detection can be overridden in the Local Slave Controls so that users can configure if their local slave
should launch when the machine becomes idle.
Miscellaneous Options
These settings are applied to both Slave Scheduling and Idle Detection.
• Only Start Slave If CPU Usage Less Than ___%: If enabled, the slave will only be launched if the machine’s
CPU usage is less than the specified value.
• Only Start Slave If Free Memory More Than ___ MB: If enabled, the slave will only be launched if the
machine has more free memory than the specified value (in Megabytes).
• Only Start Slave If These Processes Are Not Running: If enabled, the slave will only be launched if the
specified processes are not running on the machine.
• Only Start If Launcher Is Not Running As These Users: If enabled, the slave will only be launched if the
launcher is not running as one of the specified users.
• Allow Slaves to Finish Their Current Task When Stopping: If enabled, the Slave application will not be
closed until it finishes its current Task.
6.4 Farm Statistics
6.4.1 Overview
Deadline can keep track of some basic statistics. It can keep track of all of your completed Jobs so that you can refer to
them later. It stores the User that submitted the Job, when the Job was submitted, the error count, as well as some
useful rendering metrics like render time, CPU usage, and memory usage. You can use all of this information to figure
out if there are any Slaves that aren’t being utilized to their full potential.
Statistical information is also gathered for individual slaves, including the slave’s running time, rendering time, and
idle time. It also includes information about the number of tasks the slave has completed, the number of errors it has
reported, and its average Memory and CPU usage.
Note that some statistics can only be gathered if Pulse is running.
6.4.2 Enabling Statistics Gathering
You must first make sure Statistics Gathering has been enabled before Deadline will start logging information, which
can be done in the Statistics Gathering section of the Repository Options.
Note that if Pulse is not running, only statistics for completed Jobs, User usage, and Slave statistics will be recorded.
You must run Pulse to keep track of Slave Resource Usage and overall Repository statistics. When running, Pulse will
periodically gather information about Slave Resource Usage and the general state of the repository, and record it
in the Database.
6.4.3 Viewing Farm Reports
To view Statistics, open the Monitor and select ‘Tools’ -> ‘View Farm Reports’. This must be done in Super User
mode, unless you have the proper User Privileges to do so.
From this window, you can specify which type of report(s) to generate, and a date range to filter the statistics. You can
also specify a region to filter the statistics, but only the Active Slave Stats and Slaves Overview reports will use it.
There are five default Reports that will always be available, but custom reports can also be created and saved for later
use (see the ‘Custom Reports’ section below for more info).
Active Slave Stats
The Active Slave Stats report displays Slave usage statistics for the farm, which are logged by Slaves as they are
running. The statistics displayed by this report are generated by each individual slave at regular intervals and do not
require Pulse to be running.
Completed Job Stats
The Completed Job Stats report consists of a list of completed Jobs with detailed statistics. Pulse does not need to be
running to gather these statistics.
Farm Overview
The Farm Overview report displays statistics about the Farm using graphs. The statistics displayed by this report are
assembled by Pulse, and will therefore only be gathered if Pulse is running.
The State Counts section displays the statistics in terms of counts.
The State Totals gives a visual representation of the statistics in terms of percentages.
Slaves Overview
The Slaves Overview report displays the statistics for each Slave on the farm with graphs to help display the statistics.
The statistics displayed by this report are assembled by Pulse, and will therefore only be gathered if Pulse is running.
The Slaves Overview chart shows how many slaves were in each state (starting job, rendering, idle, offline, stalled,
and disabled).
The Available/Active Slaves charts show the number of slaves that are available, and the number of available slaves
that are active.
The Individual Slaves list and charts show the average CPU and Memory usage for individual slaves, as well as the
average time each slave spends in each state.
User Farm Time Report
The User Farm Time Report displays the farm usage statistics for each User. Pulse does not need to be running to
gather these statistics.
Custom Reports
Users can create their own custom Reports to control how the gathered statistics are aggregated and presented. By
doing this, users can create their own arsenal of specialized reports that help to drill down and expose potential
problems with the farm.
To create or edit Custom Reports, you first need to be in Super User mode or have the appropriate User Group Permissions. If you do, an additional set of buttons appears below the list of Reports, providing control over Custom Reports.
By clicking the ‘New’ button, you will be prompted to specify a name for your new report and select the type of
statistics which this report will display.
Once you’ve done that, you’ll be brought to the Edit view for your new Report. You’ll note that this is very similar to generating a report under normal circumstances, but with the addition of several buttons that allow further
customization of your Report.
Chief among these new buttons is the ‘Edit Data Columns’ button, which will allow you to select which columns are
displayed. You can also specify if you want to aggregate row information by selecting a Group By column, and a
Group Op for each other column.
The way the aggregation works is similar to a SQL query with a “group by” statement. Data rows will be combined
based on identical values of the Group By column, while the values of other columns will be determined by performing
the Group Ops on the combined rows.
As a simple example to demonstrate how this works in practice, let us consider a case where you might want to view
the error information on a per-plugin basis. We don’t have a built-in report to do this, but all this information is
contained in Completed Job Stats. With that in mind, you can create a Custom Report based on Completed Job Stats
to group by Plugin, and aggregate Error Counts and Wasted Error Time, as illustrated below.
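The grouping behavior described above can be sketched in a few lines of Python. The row data and the "sum" Group Op below are hypothetical stand-ins for the Completed Job Stats columns; the real aggregation happens inside the Monitor.

```python
from collections import defaultdict

def group_rows(rows, group_by, group_ops):
    """Aggregate stat rows the way a Custom Report's Group By works:
    rows sharing the same group_by value are combined, and every other
    column is reduced with its Group Op ('sum', 'avg', 'min', 'max')."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[row[group_by]].append(row)

    ops = {"sum": sum, "min": min, "max": max,
           "avg": lambda vals: sum(vals) / float(len(vals))}
    result = []
    for key, grouped in buckets.items():
        out = {group_by: key}
        for column, op in group_ops.items():
            out[column] = ops[op]([r[column] for r in grouped])
        result.append(out)
    return result

# Hypothetical Completed Job Stats rows, grouped by Plugin.
rows = [
    {"Plugin": "MayaBatch", "ErrorCount": 3, "WastedErrorTime": 120},
    {"Plugin": "MayaBatch", "ErrorCount": 1, "WastedErrorTime": 30},
    {"Plugin": "Nuke",      "ErrorCount": 2, "WastedErrorTime": 45},
]
report = group_rows(rows, "Plugin",
                    {"ErrorCount": "sum", "WastedErrorTime": "sum"})
```

Each resulting row holds one Plugin value with its summed error columns, mirroring a SQL "group by" over the same data.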
Once you’ve specified which columns are displayed, and whether/how rows are aggregated, you can also add simple
Graphs to your report. Simply click the ‘Add Graph’ button, and specify the type of graph you want along with the
columns on which the graph should be based. Graphs are always based on all of the data presented in the list view, and currently cannot be based on a selection or a different data model.
Once you’re done customizing your new report, simply click the ‘OK’ button on the Farm Status Reports window, and
your changes will be committed to the Database. Now, every time anyone brings up this dialog, they should be able to
generate the report you’ve just created!
6.4.4 Custom Statistics
If you need to keep track of more information, we suggest writing your own tool that uses Deadline Command.
Deadline Command can be used to query the repository for all sorts of information, like the current state of all the
Jobs and all the Slaves. You can have it print these out in an ini file format and use any ini file parser to extract the
information (Python has a module for this). This is also handy if you want to post stats to a web page, or insert entries
into a separate database.
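As a sketch of the parsing step, ini-style text can be read with Python's standard configparser module (named ConfigParser on Python 2.7, the version Deadline ships). The section and key names below are invented placeholders; real output would be captured from a Deadline Command query.

```python
import configparser  # named ConfigParser (with readfp) on Python 2
import io

# Placeholder text in the ini-style format described above; in practice
# this would be captured from Deadline Command's standard output.
sample = """\
[Job_001]
Name=BeautyPass
Status=Completed

[Job_002]
Name=ShadowPass
Status=Rendering
"""

parser = configparser.ConfigParser()
parser.read_file(io.StringIO(sample))

# Collect the sections whose Status key says they are still rendering.
rendering = [section for section in parser.sections()
             if parser.get(section, "Status") == "Rendering"]
```

From here the extracted values could be posted to a web page or inserted into a separate database, as mentioned above.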
6.5 Client Configuration
6.5.1 Overview
Clients are configured using the deadline.ini file. Some settings are stored in a system deadline.ini file, and some
are stored in a per-user deadline.ini file. Most of these settings are set during the Client Installation, but they can be
changed afterwards by editing the deadline.ini file directly. Some of these settings can also be updated using Auto
Configuration.
This guide will cover the various settings, and how they can be configured.
6.5.2 DEADLINE_PATH Environment Variable
The DEADLINE_PATH Environment variable is an environment variable on Windows and Linux which contains the
path to Deadline’s bin directory. On OSX, it is instead a file located at /Users/Shared/Thinkbox which contains the
path to Deadline’s resources directory.
DEADLINE_PATH is used by the integrated submission scripts that are shipped with Deadline to determine where the
Deadline Client is installed to, and what the Repository path is. While it is possible to modify this value on the system
manually, you can instead use one of the Submitter installers to Change the DEADLINE_PATH Value.
6.5.3 Local Slave Instance Files
Deadline supports the ability to run Multiple Slaves On One Machine. The local slave instances are represented by .ini
files which are stored in the “slaves” folder in the following locations. Note that the # in the path will change based on
the Deadline version number.
• Windows: %PROGRAMDATA%\Thinkbox\Deadline#\slaves\
• Linux: /var/lib/Thinkbox/Deadline#/slaves/
• OSX: /Users/Shared/Thinkbox/Deadline#/slaves/
To remove local slave instances, simply delete their corresponding .ini file. Note that this does not remove the slave
entries from the repository that the slaves connected to.
6.5.4 Configuration File Format
The deadline.ini file has an ini file format, so there will be a [Deadline] section followed by a number of key=value
pairs that represent each setting. For example:
[Deadline]
LicenseServer=@my-server
NetworkRoot=\\\\repository\\path
LauncherListeningPort=17060
AutoConfigurationPort=17061
6.5.5 System Configuration File
The system deadline.ini file can be found in the following locations. Note that the # in the path will change based on
the Deadline version number.
• Windows: %PROGRAMDATA%\Thinkbox\Deadline#\deadline.ini
• Linux: /var/lib/Thinkbox/Deadline#/deadline.ini
• OSX: /Users/Shared/Thinkbox/Deadline#/deadline.ini
The following settings can be configured in the system deadline.ini file. Note that other settings can show up in this
file, but they are used internally by Deadline and are not documented here.
NetworkRoot
The NetworkRoot setting tells the Client which Repository to connect to.
NetworkRoot=\\\\repository\\path
There can also be additional NetworkRoot# settings that store previous Repository paths. These paths will be prepopulated in the drop down list when changing Repositories.
NetworkRoot0=\\\\repository\\path
NetworkRoot1=\\\\another\\repository
NetworkRoot2=\\\\test\\repository
This setting can be changed using the Change Repository option in the Launcher or the Monitor, and it can also be
configured using Auto Configuration.
LicenseServer
The LicenseServer setting tells the Client where it can get a license from.
LicenseServer=@my-server
This setting can be changed using the Change License Server option in the Launcher or the Slave, and it can also be
configured using Auto Configuration.
LauncherListeningPort
The LauncherListeningPort setting is the port that the Launcher listens on for Remote Control. It must be the same on
all Clients.
LauncherListeningPort=17060
This setting can only be changed manually.
LauncherServiceStartupDelay
The LauncherServiceStartupDelay setting is the number of seconds that the Launcher waits during startup when running as a service or daemon. This delay helps ensure that the machine has set its host name before the Launcher starts
up any other Deadline applications.
LauncherServiceStartupDelay=60
This setting can only be changed manually.
SlaveStartupPort
The SlaveStartupPort setting is the port that the Slaves on this machine use when starting up to ensure that only one
Slave starts up at a time.
SlaveStartupPort=17063
This setting can only be changed manually.
AutoConfigurationPort
The AutoConfigurationPort setting is the port that the Clients use when Auto Configuring themselves. It must be the
same on all Clients.
AutoConfigurationPort=17061
This setting can only be changed manually.
SlaveDataRoot
The SlaveDataRoot setting tells the Slave where to copy its job files temporarily during rendering. The default location is the “slave” folder in the same folder as the per-user deadline.ini file; if this setting is left blank, the default location is used.
SlaveDataRoot=C:\\LocalSlaveData
This setting can be configured using Auto Configuration.
MultipleSlavesEnabled
The MultipleSlavesEnabled setting indicates if multiple slaves are allowed to run on this machine or not. The default
is True.
MultipleSlavesEnabled=True
This setting can only be changed manually.
RestartStalledSlave
The RestartStalledSlave setting indicates if the Launcher should try to restart the Slave on the machine if it becomes
stalled. The default is True.
RestartStalledSlave=True
This setting can be changed from the Launcher menu, and it can also be configured using Auto Configuration.
LaunchPulseAtStartup
The LaunchPulseAtStartup setting controls if the Launcher should automatically launch Pulse after the launcher starts
up. The default is False.
LaunchPulseAtStartup=True
This setting can only be changed manually.
KeepPulseRunning
The KeepPulseRunning setting controls if the Launcher should automatically relaunch Pulse if it is shut down or
crashes. The default is False.
KeepPulseRunning=True
This setting can only be changed manually.
LaunchBalancerAtStartup
The LaunchBalancerAtStartup setting controls if the Launcher should automatically launch the Balancer after the
launcher starts up. The default is False.
LaunchBalancerAtStartup=True
This setting can only be changed manually.
KeepBalancerRunning
The KeepBalancerRunning setting controls if the Launcher should automatically relaunch Balancer if it is shut down
or crashes. The default is False.
KeepBalancerRunning=True
This setting can only be changed manually.
LaunchWebServiceAtStartup
The LaunchWebServiceAtStartup setting controls if the Launcher should automatically launch the Web Service after
the launcher starts up. The default is False.
LaunchWebServiceAtStartup=True
This setting can only be changed manually.
KeepWebServiceRunning
The KeepWebServiceRunning setting controls if the Launcher should automatically relaunch Web Service if it is shut
down or crashes. The default is False.
KeepWebServiceRunning=True
This setting can only be changed manually.
AutoUpdateOverride
The AutoUpdateOverride setting can be used to override the Automatic Upgrades setting in the Repository Configuration. If this setting is left blank or not specified (the default), it does not override the Repository Options.
AutoUpdateOverride=False
This setting can be configured using Auto Configuration.
6.5.6 Per-User Configuration File
The per-user deadline.ini file can be found in the following locations. Note that the # in the path will change based on
the Deadline version number.
• Windows: %LOCALAPPDATA%\Thinkbox\Deadline#\deadline.ini
• Linux: ~/Thinkbox/Deadline#/deadline.ini
• OSX: ~/Library/Application Support/Thinkbox/Deadline#/deadline.ini
The following settings can be configured in the per-user deadline.ini file.
User
The User setting is used by the Client to know which user you are when launching the Monitor or when submitting
jobs.
User=Ryan
This setting can be changed using the Change User option in the Launcher or the Monitor. To prevent users from
changing who they are, see the User Management documentation.
LaunchSlaveAtStartup
The LaunchSlaveAtStartup setting controls if the Launcher should automatically launch the Slave after the launcher
starts up. The default is True.
LaunchSlaveAtStartup=False
This setting can be changed from the Launcher menu, and it can also be configured using Auto Configuration.
6.6 Auto Configuration
6.6.1 Overview
Auto Configuration allows you to configure many Client settings from a single location. When the Deadline applications start up, they will automatically pull these settings, save them locally, and apply them before fully initializing.
Note that Pulse must be running for the Deadline applications to pull the Repository Path setting. All other settings are pulled directly from the Database once the applications are able to connect to it, whether or not Pulse is running. To configure and run Pulse, see the Pulse documentation.
6.6.2 Rulesets
You can set up Client Configuration Rulesets from the Auto Configuration section of the Repository Configuration. If
you want to configure groups of Clients differently from others, you can add multiple Rulesets. This is useful if you
have more than one Repository on your network, or if you want to configure your render nodes differently than your
workstations.
New Rulesets can be added by pressing the Add button. You can give the Ruleset a name, and then choose a Client
Filter method to control which Clients will use this Ruleset. There are currently three types of Slave Filters:
• Hostname Regex: You can use regular expressions to match a Client’s host name. If your Slaves are using IPv6,
this is probably the preferred method to use. Note that this is case-sensitive. For example:
– .*host.* will match hostnames containing the word ‘host’ in lower case.
– host.* will match hostnames starting with ‘host’.
– .*[Hh]ost will match hostnames ending with ‘Host’ or ‘host’.
– .* will match everything.
• IP Regex: You can use regular expressions to match a Client’s IP address. This works with both IPv4 and IPv6
addresses. For example:
– 192.168..* will match IPv4 addresses starting with “192.168” that are not transported inside IPv6.
– [:fF]*192.168. should match IPv4 addresses even if they are carried over IPv6 addresses (e.g. ”::ffff:192.168.2.128”).
– .* will match everything.
• IPv4 Match: You can specify specific IP addresses, or a range of IP addresses (by using wildcards or ranges).
Note that this only works with IPv4. Do not use this for IPv6 addresses. For example:
– 192.168.0.1-150
– 192.168.0.151-255
– 192.168.*.*
– *.*.*.*
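The regex filter examples above can be tried out with Python's re module. How Deadline anchors the pattern internally is an assumption here (match-from-start semantics is used below); the sketch only demonstrates the example expressions, not Deadline's actual matcher.

```python
import re

def rule_matches(pattern, value):
    """Test a ruleset filter pattern against a hostname or IP address.
    Assumption: the pattern is tested from the start of the value
    (re.match semantics); Deadline's exact anchoring may differ."""
    return re.match(pattern, value) is not None

# Hostname Regex examples from above.
hostname_hit = rule_matches(r"host.*", "host-22")          # starts with 'host'
hostname_miss = rule_matches(r"host.*", "render-host")     # does not start with 'host'

# IP Regex examples from above.
ipv4_plain = rule_matches(r"192.168..*", "192.168.0.42")
ipv6_carried = rule_matches(r"[:fF]*192.168.", "::ffff:192.168.2.128")
```

Note that in a regex an unescaped `.` matches any character, so `192.168..*` also matches addresses like `192x168.0.1`; escaping the dots (`192\.168\..*`) is stricter.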
Configurations are generated starting from the top rule and working down, one rule at a time. When a rule matches the requesting Client, any properties in that rule which are not marked as ‘(Inherited)’ override previously set values. By default, Slaves will use their local configuration for any property which is not set by a rule. Based on the example
here, all clients starting with the name ‘Render-‘ and ending with a whole number will use the same Repository Path
and launch the Client at startup, while the ‘Default’ rule above it matches all Clients and sets their license server.
The available options are:
• License Server: The license server setting. Use the format ‘@SERVER’, or if you have configured your license
file to use a specific port, use ‘PORT@SERVER’.
• Launch Slave At Startup: Whether or not the Slave should automatically launch when the Launcher starts up.
• Auto Update Override: Whether or not launching the Client should trigger an automatic upgrade if it is available.
• Restart Slave If It Stalls: If enabled, the Launcher will try to restart the Slave on the machine if it stalls.
• Repository Path: This is the path to the Repository that the Slave will connect to. You can specify a different
path for each operating system.
• Local Data Path: The local path where the Client temporarily stores plugin and job data from the Repository
during rendering. Note that this should be a local path to avoid conflicts. You can specify a different path for
each operating system.
6.7 Render Environment
6.7.1 Job Environment Variables
Environment variables can be set for a job, and these variables will be applied to the rendering process’ environment.
These variables can be set in the Job Properties in the Monitor, and they can be set during Manual Job Submission.
Manual Job Submission
For manual job submission, these variables can be specified in the job info file like this:
EnvironmentKeyValue0=mykey=myvalue
EnvironmentKeyValue1=anotherkey=anothervalue
EnvironmentKeyValue2=athirdkey=athirdvalue
...
There is also an IncludeEnvironment option that takes either True or False (False is the default). When IncludeEnvironment is set to True, Deadline will automatically grab all the environment variables from the submitter’s environment
and set them as the job’s environment variables.
IncludeEnvironment=True
This can be used in conjunction with the EnvironmentKeyValue# options above, but note that the EnvironmentKeyValue# options will take precedence over any current environment variables with the same name.
Finally, there is a UseJobEnvironmentOnly option that takes either True or False (False is the default):
UseJobEnvironmentOnly=True
The UseJobEnvironmentOnly setting controls how the job’s environment variables are applied to the rendering environment. If True, ONLY the job’s environment variables will be used. If False, the job’s environment variables will
be merged with the Slave’s current environment, with the job’s variables overwriting any existing ones with the same
name.
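A small Python sketch of how these options combine at submission and render time. The function names are hypothetical; the real merging happens inside Deadline.

```python
def resolve_job_environment(submitter_env, key_values, include_environment=False):
    """Build a job's environment variables at submission time.

    key_values models the EnvironmentKeyValue# entries. When
    include_environment is True (IncludeEnvironment=True), the submitter's
    environment is captured first, with the explicit key/values taking
    precedence on conflicts.
    """
    env = dict(submitter_env) if include_environment else {}
    env.update(key_values)  # EnvironmentKeyValue# wins over captured values
    return env


def resolve_render_environment(slave_env, job_env, use_job_environment_only=False):
    """Apply UseJobEnvironmentOnly: either the job environment alone, or
    the Slave's current environment merged with the job's, with the job's
    variables overwriting any existing ones with the same name."""
    if use_job_environment_only:
        return dict(job_env)
    merged = dict(slave_env)
    merged.update(job_env)
    return merged
```

For example, submitting with IncludeEnvironment=True and an explicit EnvironmentKeyValue0=LANG=en_US would capture the submitter's variables but keep the explicit LANG value.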
Job Rendering
At render time, the job’s environment variables are applied to the rendering process. As explained above, the job’s
environment can either be merged with the Slave’s current environment, or the job’s environment can be used exclusively.
Note though that if the job’s plugin defines any environment variables, those will take precedence over any job environment variables with the same name. In a job’s plugin, there are two functions that are available for the DeadlinePlugin
object that can be used to set environment variables:
• SetProcessEnvironmentVariable( key, value ):
– This should be used in Advanced plugins only.
– Any variables set by this function are applied to all processes launched through Deadline’s plugin API.
– Note that calling SetProcessEnvironmentVariable in Simple plugins or within ManagedProcess callbacks
will not affect the current process’ environment.
– When using SetProcessEnvironmentVariable in an Advanced plugin, make sure to call it outside of the
ManagedProcess callbacks.
• SetEnvironmentVariable( key, value ):
– This is typically used in Simple plugins, or within ManagedProcess callbacks in Advanced plugins.
– Any variables set by this function are only applied to the process being launched, and they take precedence over any variables set by SetProcessEnvironmentVariable.
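The precedence chain described above can be modeled in a few lines. This is a sketch only; the actual resolution happens inside Deadline's plugin API.

```python
def final_process_environment(base_env, process_vars, launch_vars):
    """Model the precedence described above: SetEnvironmentVariable values
    (launch_vars) override SetProcessEnvironmentVariable values
    (process_vars), which in turn override the job/slave environment
    (base_env)."""
    env = dict(base_env)
    env.update(process_vars)   # SetProcessEnvironmentVariable layer
    env.update(launch_vars)    # SetEnvironmentVariable layer wins last
    return env
```

So a variable set in both layers resolves to the SetEnvironmentVariable value, while untouched job variables pass through unchanged.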
See the Application Plugins documentation for more information.
6.7.2 Render Jobs As Job’s User
Deadline has some features that allow jobs to be rendered with the job’s user account, rather than the user account that the Slave is running as.
• On Windows, this is done by using the job’s user account credentials to start the rendering process using that
account.
• On Linux and Mac OS X, the Slave must be running as root. It will then use “sudo” to start the rendering process
using the job’s user account.
Enabling Render Jobs As User
To render jobs as the job’s user, you must enable Render Jobs As User in the User Security section of the Repository
Options. Note that this setting affects all jobs, and requires users to ensure that their User Account Settings are
configured properly (see below).
User Account Settings
The user account settings used to start the rendering process are stored in the User Settings for each user. For Linux
and OSX, only the User Name is required. For Windows, the Domain and Password must also be provided for
authentication.
6.8 Multiple Slaves On One Machine
6.8.1 Overview
Deadline has the ability to launch and configure an arbitrary number of Slave instances on a single machine. Each
Slave instance can be given a unique name, and can be assigned its own list of Pools and Groups, which allows Slaves
to work independently on separate Jobs. A single high-performance machine could potentially process multiple 3D,
compositing, and simulation Jobs simultaneously.
Note that the configurations for these slave instances are stored locally on the slave machine. This means that these
slave instances exist independently from the repository that the slaves connect to. So if you delete a slave from the
repository, the local configuration for that slave instance still exists. Conversely, if you delete a local slave instance,
the slave will still have an entry in the repository. It is possible to remove both the slave from the repository and the
local slave instance from the slave machine, which is covered below.
6.8.2 Licensing
In Deadline 7, all Slave instances running on a single machine will use the same license. For example, if you had 3
slave instances running on one machine, they would only use 1 license.
6.8.3 Adding and Running Slaves
There are three ways to launch new slave instances:
• From the Launcher menu by selecting Launch Slave By Name -> New Slave Instance. This is disabled by
default, but can be enabled in the User Group Management settings.
• From the right-click menu in the Slave list in the Monitor by selecting Remote Control -> Slave Commands ->
Start New Slave Instance. By default, this is only available when in Super User Mode.
• From the command line using the -name option.
deadlineslave -name "instance-01"
Additionally, for a headless/no-GUI machine, add the -nogui flag.
deadlineslave -name "instance-01" -nogui
Note that the name you enter is the postfix that is appended to the slave’s base name. For example, if the slave’s base
name is “Render-02”, and you start a new instance on it called “instance-01”, the full name for that slave instance will
be “Render-02-instance-01”. This is done so that if the slave’s machine name is changed, the full slave name will be
updated accordingly. Using the same example, if the machine was renamed to “Node-05”, the slave instance will now
be called “Node-05-instance-01”.
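The naming rule can be sketched as a small hypothetical helper (this mirrors the behavior described above; it is not part of Deadline's API):

```python
def full_slave_name(machine_name, instance_postfix=""):
    """Derive a slave instance's full name: the machine's base name plus
    the instance postfix. With no postfix, the base name is used alone."""
    if not instance_postfix:
        return machine_name
    return "%s-%s" % (machine_name, instance_postfix)

before = full_slave_name("Render-02", "instance-01")
after = full_slave_name("Node-05", "instance-01")  # same instance after a machine rename
```

Because only the postfix is stored locally, renaming the machine automatically updates the full slave name.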
Once the new Slave shows up in the Slave List in the Monitor, you can configure it like any other Slave. You might
want to use Slave Settings (see Slave Configuration) to assign the different Slaves to run on separate CPUs. It might
also be a good idea to assign them to different Pools and Groups, so that they can work on different types of Jobs to
avoid competing for the same resource (e.g., you could have one Slave assigned to CPU intensive Jobs, while the other
works on RAM intensive ones).
Once the Slave has been created, you can also launch it remotely like you would any other Slave. See the Remote
Control documentation for more information.
6.8.4 Removing Slaves
There are three ways to remove existing slave instances:
• From the Launcher menu by selecting Launch Slave By Name -> Remove Slave Instances. This is disabled by
default, but can be enabled in the User Group Management settings.
• From the right-click menu in the Slave list in the Monitor by selecting Remote Control -> Slave Commands
-> Remove Slave Instance. This method gives the additional option to automatically remove the slave instance
from the repository as well. By default, this is only available when in Super User Mode.
• Manually delete the .ini files that define the local slave instances on the machine that the slave runs on. See the Client Configuration documentation for more information.
6.8.5 Limiting and Disabling Multiple Slaves
By default, users do not have the ability to launch additional Slaves on their own machines (see User Group Management). However, there are some cases where you might want to completely disable the ability to run multiple slaves
on the same machine.
The only known situation where this might be necessary is if your render nodes all net-boot off the same installation
(meaning they share the same file system). In this case, if multiple Slaves are enabled, each render node will end up
trying to run a Slave instance for every other render node net-booting off the same installation.
In this scenario, you can disable the multi-slave feature by opening the system’s deadline.ini file and adding this line:
MultipleSlavesEnabled=False
The system deadline.ini file can be found in the following locations. Note that the # in the path will change based on
the Deadline version number.
• Windows: %PROGRAMDATA%\Thinkbox\Deadline#\deadline.ini
• Linux: /var/lib/Thinkbox/Deadline#/deadline.ini
• OSX: /Users/Shared/Thinkbox/Deadline#/deadline.ini
6.9 Cloud Controls
6.9.1 Overview
Deadline has some built-in cloud features that allow it to connect to different cloud providers and control your instances. Currently, Amazon EC2, Microsoft Azure, Google Cloud, OpenStack, and vCenter are supported, but more providers may be added in the future.
Note that Deadline only allows you to control existing instances. It does not create instances for you, except in the
case where you clone an existing instance. In order to use instances for rendering, you will need to set them up first,
which includes installing the Deadline Client, installing your rendering software, and setting up any licensing that is
required.
Permission for the Cloud Panel can be edited in the User Group Permissions form. See Controlling Feature Access.
6.9.2 Cloud Providers
Cloud providers can be configured from the Monitor by selecting Tools -> Configure Cloud Providers. By default, this
option is hidden for normal users, so you may need to enter Super User Mode. This will bring up the Cloud Options
window.
Adding Providers
To add a provider, click the Add button under the Cloud Region list. Choose the Cloud plugin you wish to use, and
give it a region name. This is useful for providers like Amazon EC2 that have more than one region. Then click OK.
The new Cloud region will now show up in the Cloud Region list.
Configuring Providers
To configure an existing provider, select it in the Cloud Region box, which will bring up its configuration settings. These are the settings that the Monitor will use to connect to your cloud provider(s).
Every provider has an option to enable or disable it, but the other options can vary between providers. To get more
information about a particular setting, just hover your mouse over the setting text, or refer to the Cloud Plugins section
of the documentation.
6.9.3 Cloud Panel
The Cloud panel in the Monitor shows all the instances from the cloud providers that the Monitor is connected to. By
default, this panel is hidden for normal users, so you may need to enter Super User Mode before you can open it.
If the Cloud panel is not visible, see the Panel Features documentation for instructions on how to create new panels in
the Monitor.
Controlling Instances
The Cloud panel allows you to create new instances and control your existing instances using the right-click context
menu. The following options are available when you right-click on an instance:
• Create New Instance: Creates a new instance.
• Start Instance: Starts an instance that is currently stopped.
• Stop Instance: Stops an instance that is currently running.
• Destroy Instance: Destroys an existing instance. Once an instance is destroyed, it cannot be recovered.
• Clone Instance: Clones an existing instance. This allows you to quickly launch multiple copies of the selected
instance.
• Reboot Instance: Reboots an instance that is currently running.
It should be noted that some cloud providers don’t provide the ability to Start/Stop instances.
6.9.4 Cloud Plug-ins
Cloud providers are supported via the Cloud Plug-in system. This means that the existing ones can be customized,
or you can write your own. See the Cloud Plugins documentation for more information on creating cloud plug-ins.
Plugin data is only loaded and updated when the Cloud Panel is being displayed.
6.10 Job Transferring
6.10.1 Overview
If you have multiple office locations that each have their own Deadline Repository, it is possible to transfer Jobs
between them. This can be handy if one office’s farm is sitting idle while the other is completely swamped.
Note though that Deadline will only transfer over the files that are submitted with the Job, which in most cases is just
the scene file. You must ensure that all assets the scene requires and all output paths that it writes to exist in the remote
location before transferring the Job.
6.10.2 Setting Up a Transfer
Before you can transfer a Job, it must be in the Suspended, Completed, or Failed state. Just right-click on the Job, and select ‘Scripts’ -> ‘TransferSubmission’. A Transfer Job window will be displayed.
You’ll notice that you’re actually submitting another Job that will transfer the original Job. The general Deadline
options are explained in the Job Submission documentation. The Job Transfer specific options are:
• Frame List and Frames Per Task: This is the frame list for the original Job that will be transferred. It will
default to the values for the original Job, but you can change them if you only want to transfer a subset of frames.
• New Repository: This is the path to the remote Repository that the original Job will be transferred to. Note that
the Slaves that the transfer Job will be running on must be able to see this path in order to transfer the original
Job to the new repository.
• Compress Files During Transfer: If enabled, the original Job’s files will be compressed during the transfer.
• Suspend Remote Job After Transfer: If enabled, the original Job will be submitted in the Suspended state to
the new Repository.
• Email Results After Transfer: If enabled, you will be emailed when the original Job has been successfully transferred. Note that this requires you to have your email notification options set up properly.
• Remove Local Job After Transfer: If enabled, the original Job in the local Repository will be deleted after the
Job has been successfully transferred to the remote Repository.
Once you have your options set, click the Submit button to submit the transfer Job.
6.10.3 Global Transfer Options
Job Transfers are handled by the JobTransfer plugin, which has a few configurable options that affect all transfers. To change the JobTransfer plugin options, open the Monitor, select ‘Tools’ -> ‘Configure Plugins’ as a Super User, and then select the JobTransfer plugin from the list on the left.
The following options are available:
• Notification Email(s): The email(s) where successful Job Transfer reports will be sent, so that sys admins can
keep track of all successfully transferred Jobs. Leave blank to disable this feature. Use commas to specify more
than one email address.
CHAPTER SEVEN: SCRIPTING
7.1 Scripting Overview
7.1.1 Overview
Scripts can be used to customize various aspects of Deadline, including creating custom plug-ins, submitting jobs to
the farm, or automating specific tasks after a job completes. The scripting language that Deadline uses is Python 2.7,
which is supported using Python for .NET. In addition to supporting native cPython modules, Python for .NET allows
your scripts to make use of the .NET Libraries. This fantastic combination of cPython & .NET allows for the best
of both worlds, suiting both seasoned cPython scripters and .NET technology based developers. Studios are free to
choose to use either or both technologies to their advantage in further customizing the Deadline compute management
framework.
7.1.2 Custom Repository Folder
If desired, custom scripts and plugins can be placed in the ‘custom’ folder in the Repository. This folder contains
subfolders for different plugins and scripts, allowing you to customize the following areas of Deadline:
• Application Plugins ../<DeadlineRepository>/custom/plugins/
• Event Plugins ../<DeadlineRepository>/custom/events/
• Cloud Plugins ../<DeadlineRepository>/custom/cloud/
• Balancer Plugins ../<DeadlineRepository>/custom/balancer/
• Monitor Scripts
• Submission Scripts ../<DeadlineRepository>/custom/scripts/Submission/
• General Scripts ../<DeadlineRepository>/custom/scripts/General/
• Job Scripts ../<DeadlineRepository>/custom/scripts/Jobs/
• Task Scripts ../<DeadlineRepository>/custom/scripts/Tasks/
• Slave Scripts ../<DeadlineRepository>/custom/scripts/Slaves/
• Pulse Scripts ../<DeadlineRepository>/custom/scripts/Pulse/
• Balancer Scripts ../<DeadlineRepository>/custom/scripts/Balancer/
• Limit Scripts ../<DeadlineRepository>/custom/scripts/Limits/
• Job Report Scripts ../<DeadlineRepository>/custom/scripts/JobReports/
• Slave Report Scripts ../<DeadlineRepository>/custom/scripts/SlaveReports/
• Web Service Scripts ../<DeadlineRepository>/custom/scripts/WebService/
Note that any scripts or plugins in the ‘custom’ folder will not be affected when upgrading or downgrading the Repository. During the install process, the Repository installer also creates a backup of the ‘custom’ directory, together with the other Deadline directories, under the ‘../backup/[timeStamp]’ and/or ‘[mostRecent]’ directories. In addition, any scripts or plugins in the ‘custom’ folder will override any scripts or plugins that are shipped with Deadline if they share the same name. If you want to check out the scripts and plugins that are shipped with Deadline, you can find them in the ‘events’, ‘plugins’, and ‘scripts’ folders in the Repository.
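The override behaviour described above amounts to a two-step lookup: the ‘custom’ tree is checked first, and the stock tree is the fallback. A minimal sketch of that resolution order (the function and layout below are illustrative, not part of the Deadline API):

```python
import os

# Folders to search, in priority order: 'custom' wins over the stock tree.
_SEARCH_ORDER = (("custom", "plugins"), ("plugins",))

def resolve_plugin_dir(repo_root, plugin_name):
    """Return the folder a plugin named plugin_name would load from."""
    for parts in _SEARCH_ORDER:
        candidate = os.path.join(repo_root, *(parts + (plugin_name,)))
        if os.path.isdir(candidate):
            return candidate
    return None
```

The same priority applies to events, cloud plugins, and the various script folders listed above.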
There is also an option for a job to load its Application Plug-in from another location, which can be set in the Job
Properties. This can be useful when testing plugins before updating them directly in the Repository.
Note that the in-app submitters stored under ../<DeadlineRepository>/submission/ are not included in the “Custom Repository Folder” system, due to the complexity and limitations of some of the application scripting languages. To customize any of the code under the “submission” directory, it is recommended to keep a copy/backup for later reference. Any customization you make will still get backed up when the repository installer is run during an upgrade; however, the contents of the “submission” directory will be overwritten during an upgrade.
7.1.3 Scripting Reference
The full Deadline Scripting Reference can be found on the Thinkbox Software Documentation Website. Offline PDF and HTML versions can be downloaded from there as well. Ensure you select the correct version of Deadline from the drop-down to view the API matching your current Deadline version.
There are also many scripts and plug-ins that are shipped with Deadline, which you can use as a reference or starting
point for your own customization. These scripts can be found in the following folders in the Repository:
• ../<DeadlineRepository>/cloud Cloud Plugins
• ../<DeadlineRepository>/events Event Plugins
• ../<DeadlineRepository>/plugins Application Plugins
• ../<DeadlineRepository>/scripts Monitor Scripts
7.1.4 Application Submission Scripting Reference
Located under the ../<DeadlineRepository>/submission directory in Deadline’s repository are the application-specific script files for all the deeply integrated application submitters. Where applicable, each application directory has three sub-directories:
• Client: The local proxy Client script is stored here. It is typically copied manually to the local submitting client machine, allowing users to open the submission UI. These scripts tend not to be modified very often and serve purely as a proxy that references/pulls the Main submission script from the Deadline Repository, where the actual submission code resides.
• Main: The Main script file(s) here are referenced or loaded into application memory, typically by the local proxy Client script. It is in these script file(s) that the deep submission integration code resides for each application in question. All this code is unprotected, and studios are invited to customize it if they so choose. Note that the in-app submitters stored under ../<DeadlineRepository>/submission are not included in the “Custom Repository Folder” system, due to the complexity and limitations of some of the application scripting languages. To customize any of the code under the “submission” directory, it is recommended to keep a copy/backup for later reference. Any customization you make will still get backed up when the repository installer is run during an upgrade; however, the contents of the “submission” directory will be overwritten during an upgrade.
• Installers: For each application that has an in-app submitter, the respective documentation page provides instructions on how to manually install the local proxy Client script into the correct directory, along with any further configuration that may be required to get up and running. As an alternative, we provide Installer(s) which can be run with the correct access permissions to install the local proxy Client script(s) for you and carry out any further configuration that may be required. Where applicable, Installers are provided for the different operating systems.
The following in-application deeply integrated submitters are available for reference or as a starting point for your
own custom submitter:
• 3ds Command ../<DeadlineRepository>/submission/3dsCmd/
• 3ds Max ../<DeadlineRepository>/submission/3dsmax/
• Corona Distributed Rendering ../<DeadlineRepository>/submission/3dsmaxCoronaDR/
• RPManager Script Setup ../<DeadlineRepository>/submission/3dsmaxRPM/
• 3ds Max V-Ray DBR ../<DeadlineRepository>/submission/3dsmaxVRayDBR/
• After Effects ../<DeadlineRepository>/submission/AfterEffects/
• AutoCAD ../<DeadlineRepository>/submission/AutoCAD/
• Blender ../<DeadlineRepository>/submission/Blender/
• Cinema 4D ../<DeadlineRepository>/submission/Cinema4D/
• Cinema 4D Team Render ../<DeadlineRepository>/submission/Cinema4DTeamRender/
• Clarisse iFX ../<DeadlineRepository>/submission/Clarisse/
• Composite ../<DeadlineRepository>/submission/Composite/
• Draft ../<DeadlineRepository>/submission/Draft/
• ftrack ../<DeadlineRepository>/submission/FTrack/
• Fusion ../<DeadlineRepository>/submission/Fusion/
• Generation ../<DeadlineRepository>/submission/Generation/
• Hiero ../<DeadlineRepository>/submission/Hiero/
• Houdini ../<DeadlineRepository>/submission/Houdini/
• Jigsaw ../<DeadlineRepository>/submission/Jigsaw/
• Lightwave ../<DeadlineRepository>/submission/Lightwave/
• Maya ../<DeadlineRepository>/submission/Maya/
• Maya V-Ray DBR ../<DeadlineRepository>/submission/MayaVRayDBR/
• Messiah ../<DeadlineRepository>/submission/Messiah/
• MicroStation ../<DeadlineRepository>/submission/MicroStation/
• modo ../<DeadlineRepository>/submission/Modo/
• Interactive Distributed Rendering ../<DeadlineRepository>/submission/ModoDBR/
• Nuke ../<DeadlineRepository>/submission/Nuke/
• Realflow ../<DeadlineRepository>/submission/RealFlow/
• Rhino ../<DeadlineRepository>/submission/Rhino/
• SketchUp ../<DeadlineRepository>/submission/SketchUp/
• Softimage ../<DeadlineRepository>/submission/Softimage/
• Softimage V-Ray DBR ../<DeadlineRepository>/submission/SoftimageVRayDBR/
7.1.5 Running Scripts from the Command Line
To run scripts from the command line, the only requirement is that you define a __main__ function. This is the function
called by the Command application when it executes the script.
def __main__( *args ):
    # Replace "pass" with code
    pass
If you save this script to a file called myscript.py, you can execute it using this command:
deadlinecommand -ExecuteScript "myscript.py"
If you are running the script in a headless environment where there is no display, you should use this command instead:
deadlinecommand -ExecuteScriptNoGui "myscript.py"
The only difference between these commands is that ExecuteScriptNoGui doesn’t pre-import any of the user interface
modules so that it can run in a headless environment. If your script doesn’t use any user interface modules, then you
can use ExecuteScriptNoGui regardless of whether or not you’re in a headless environment.
7.1.6 Migrating Scripts From Deadline 5
Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with scripts written for Deadline 5.
One change that affects all Deadline scripts is that the globally defined Deadline functions are no longer available.
However, many have functional replacements, which are mentioned below.
For migration tips for specific scripts, see the appropriate documentation:
• Application Plug-ins
• Event Plug-ins
• Monitor Scripts
• Job Scripts
• Web Service Scripts
Deadline Repository Path Functions
• GetJobsDirectory(): No replacement, because most job information is now stored in the Database. If you want to get the auxiliary folder for a job, use RepositoryUtils.GetJobAuxiliaryPath(job), which takes an instance of a job as a parameter.
• GetJobDropDirectory(): No replacement, because drop jobs have been removed.
• GetLimitGroupsDirectory(): No replacement, because Limit information is now stored in the Database.
• GetPluginsDirectory() -> RepositoryUtils.GetPluginsDirectory()
• GetPulseDirectory(): No replacement, because Pulse information is now stored in the Database.
• GetRootDirectory() -> RepositoryUtils.GetRootDirectory()
• GetScriptsDirectory() -> RepositoryUtils.GetScriptsDirectory()
• GetSettingsDirectory() -> RepositoryUtils.GetSettingsDirectory()
• GetSlavesDirectory(): No replacement, because Slave information is now stored in the Database.
• GetSubmissionDirectory(): No replacement.
• GetTempDirectory(): No replacement, because there is no longer a temp folder in the Repository.
• GetTrashDirectory(): No replacement, because there is no longer a trash folder in the Repository.
• GetUsersDirectory(): No replacement, because User information is now stored in the Database.
Deadline Client Path Functions
• GetDeadlineBinPath() -> ClientUtils.GetBinDirectory()
• GetDeadlineHomeCurrentUserPath() -> ClientUtils.GetCurrentUserHomeDirectory()
• GetDeadlineHomePath() -> ClientUtils.GetUsersHomeDirectory()
• GetDeadlineSettingsPath() -> ClientUtils.GetUsersSettingsDirectory()
• GetDeadlineTempPath() -> ClientUtils.GetDeadlineTempPath()
• GetLocalApplicationDataPath() -> PathUtils.GetLocalApplicationDataPath()
• GetSystemTempPath() -> PathUtils.GetSystemTempPath()
General Process Functions
• IsProcessRunning(processName) -> ProcessUtils.IsProcessRunning(name)
• KillAllProcesses(processName) -> ProcessUtils.KillProcesses(name)
• KillParentAndChildProcesses(processName) -> ProcessUtils.KillParentAndChildProcesses(name)
• WaitForProcessToStart(processName, timeoutSeconds) -> ProcessUtils.WaitForProcessToStart(name, timeoutMilliseconds)
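Note the unit change in the WaitForProcessToStart replacement: the original function took its timeout in seconds, while ProcessUtils.WaitForProcessToStart takes milliseconds. When migrating, multiply the old timeout by 1000; a small helper (hypothetical, not part of the Deadline API) makes the conversion explicit:

```python
def migrate_timeout_seconds(timeout_seconds):
    """Convert a Deadline 5 timeout (seconds) to the Deadline 6+ unit (milliseconds)."""
    return int(timeout_seconds * 1000)

# Old call:  WaitForProcessToStart( "render.exe", 30 )
# New call:  ProcessUtils.WaitForProcessToStart( "render.exe", migrate_timeout_seconds( 30 ) )
```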
File/Path/Directory Functions
• AddToPath(semicolonSeparatedList) -> DirectoryUtils.AddToPath(directory)
• ChangeFilename(path, filename) -> PathUtils.ChangeFilename(path, filename)
• FileExists(filename) -> FileUtils.FileExists(filename)
• GetExecutableVersion(filename) -> FileUtils.GetExecutableVersion(filename)
• GetFileSize(filename) -> FileUtils.GetFileSize(filename)
• GetIniFileKeys(iniFilename, section) -> FileUtils.GetIniFileKeys(fileName, section)
• GetIniFileSections(iniFilename) -> FileUtils.GetIniFileSections(fileName)
• GetIniFileSetting(iniFilename, section, key, default) -> FileUtils.GetIniFileSetting(fileName, section, key, defaultValue)
• Is64BitDllOrExe(filename) -> FileUtils.Is64BitDllOrExe(filename)
• SearchDirectoryList(semicolonSeparatedList) -> DirectoryUtils.SearchDirectoryList(directoryList)
• SearchFileList(semicolonSeparatedList) -> FileUtils.SearchFileList(fileList)
• SearchFileListFor32Bit(semicolonSeparatedList) -> FileUtils.SearchFileListFor32Bit(fileList)
• SearchFileListFor64Bit(semicolonSeparatedList) -> FileUtils.SearchFileListFor64Bit(fileList)
• SearchPath(filename) -> DirectoryUtils.SearchPath(filename)
• SetIniFileSetting(iniFilename, section, key, value) -> FileUtils.SetIniFileSetting(filename, section, key, value)
• SynchronizeDirectories(srcPath, destPath, deepCopy) -> DirectoryUtils.SynchronizeDirectories(sourceDirectory, destDirectory, deepCopy)
• ToShortPathName(filename) -> PathUtils.ToShortPathName(path)
Miscellaneous Functions
• BlankIfEitherIsBlank(str1, str2) -> StringUtils.BlankIfEitherIsBlank(str1, str2)
• ExecuteScript(scriptFilename, arguments) -> ClientUtils.ExecuteScript(scriptFilename, arguments)
• Sleep(milliseconds) -> SystemUtils.Sleep(milliseconds)
OS Functions
• GetAvailableRam() -> SystemUtils.GetAvailableRam()
• GetApplicationPath(filename) -> PathUtils.GetApplicationPath(applicationName)
• GetCpuCount() -> SystemUtils.GetCpuCount()
• GetRegistryKeyValue(keyName, valueName, defaultValue) -> SystemUtils.GetRegistryKeyValue(keyName, valueName, defaultValue)
• GetTotalRam() -> SystemUtils.GetTotalRam()
• GetUsedRam() -> SystemUtils.GetUsedRam()
• Is64Bit() -> SystemUtils.Is64Bit()
• IsRunningOnLinux() -> SystemUtils.IsRunningOnLinux()
• IsRunningOnMac() -> SystemUtils.IsRunningOnMac()
• IsRunningOnWindows() -> SystemUtils.IsRunningOnWindows()
7.2 Application Plugins
7.2.1 Overview
All of Deadline’s plug-ins are written in Python, which means that it’s easy to create your own plug-ins or customize
the existing ones. See the Scripting Overview documentation for more information, and links to the Deadline Scripting
reference.
Note that because the Python scripts for application plug-ins will be executed in a non-interactive way, it is important
that your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input.
When a plug-in is loaded, the log will show where the plug-in is being loaded from.
7.2.2 General Plug-in Information
There are two types of plug-ins that can be created:
• Simple
• Advanced
Simple plug-ins provide the basics to wrap a command line application, and are typically used to build up command line arguments to pass to the application. Advanced plug-ins provide more control, and are typically used when running a simple command line application isn’t enough. Other than the plug-in Python script itself though, Simple and Advanced plug-ins are very similar.
7.2.3 Creating a New Plug-in
This section covers the areas that Simple and Advanced plug-ins have in common. Specifics for Simple and Advanced plug-ins are covered later on.
To create a new plug-in, start by creating a folder in the Repository’s custom\plugins folder and give it the name of
your plug-in. See the Scripting Overview documentation for more information on the ‘custom’ folder in the Repository
and how it’s used.
For the sake of this document, we will call our new plug-in MyPlugin. All relevant script and configuration files for this plug-in are to be placed in this plug-in’s folder (some are required and some are optional).
The dlinit File - Required
The first required file is MyPlugin.dlinit, which is the main configuration file for this plug-in. It is a plain text file that
defines a few general key=value plug-in properties, which include:
• About: A short description of the plug-in.
• ConcurrentTasks: Set to True or False (default is False). If tasks for this plug-in can render concurrently without interfering with each other, this can be set to True.
• DebugLogging: Set to True or False (default is False). If set to True, then debug plug-in logging will be printed out during rendering.
• DeprecatedMode: Set to True or False (default is False). Only set to True if you want a custom Python.NET plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later. More information on DeprecatedMode can be found later on.
It can also define key=value custom settings to be used by the plug-in. A common custom setting is the executable to
use to render the job. For this example, our MyPlugin.dlinit file might look like this:
About=My Example Plugin for Deadline
# This is a comment
ConcurrentTasks=True
MyPluginRenderExecutable=c:\path\to\my\executable.exe
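Since the dlinit format is plain key=value text with ‘#’ comment lines, it can be read with a few lines of standard Python. The parser below is an illustrative sketch, not part of the Deadline API:

```python
def parse_dlinit(text):
    """Parse dlinit-style key=value lines, skipping blanks and '#' comments."""
    settings = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, sep, value = line.partition("=")
        if sep:
            settings[key.strip()] = value.strip()
    return settings
```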
The py File - Required
The other required file is MyPlugin.py, which is the main plug-in script file. It defines the main DeadlinePlugin class
that contains the necessary code that Deadline uses to render a job. This is where Simple and Advanced plug-ins will
differ, and the specifics for each can be found later on, but the template for this script file might look like this:
from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    # TODO: Place code here instead of "pass"
    pass
The first thing to note is that we’re importing the Deadline.Plugins namespace so that we can access the DeadlinePlugin
class.
The GetDeadlinePlugin() function is important, as it allows the Slave to get an instance of our MyPlugin class (which is
extending the abstract DeadlinePlugin class). In Deadline 6.2 and later, the GetDeadlinePluginWithJob( job ) function
can be defined as an alternative. It works the same as GetDeadlinePlugin(), except that it accepts an instance of the
Job object that the plug-in is being loaded for. If neither of these functions is defined, the Slave will report an error when it tries to render the job.
The MyPlugin class will need to implement certain callbacks based on the type of plug-in it is, and these callbacks must
be hooked up in the MyPlugin constructor. One callback that all plug-ins should implement is the InitializeProcess
function. There are many other callbacks that can be implemented, which are covered in the Events section for the
DeadlinePlugin class in the Deadline Scripting reference.
The CleanupDeadlinePlugin() function is also important, as it is necessary to clean up the plug-in when it is no longer
in use. Typically, this is used to clean up any callbacks that were created when the plug-in was initialized.
To start off, the InitializeProcess callback is typically used to set some general plug-in settings:
from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the plugin.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple
These are the common plug-in properties that can be set in InitializeProcess callback. See the DeadlinePlugin class in
the Deadline Scripting reference for additional properties.
• PluginType: The type of plug-in this is (PluginType.Simple or PluginType.Advanced).
• SingleFramesOnly: Set to True or False. Set to True if your plug-in can only work on one frame at a time, rather than a frame sequence.
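The ‘+=’ syntax in the constructor above comes from .NET events exposed through Python for .NET: assigning with ‘+=’ subscribes a handler to a callback, and ‘del’ removes it. Outside of Deadline, the pattern can be emulated with a small pure-Python stand-in, which may help when reading plug-in code (this Event class is purely illustrative and is not part of the Deadline API):

```python
class Event(object):
    """Minimal stand-in for a .NET multicast event (illustration only)."""

    def __init__(self):
        self._handlers = []

    def __iadd__(self, handler):
        # Supports the "event += handler" subscription syntax.
        self._handlers.append(handler)
        return self

    def fire(self, *args):
        for handler in self._handlers:
            handler(*args)

# Subscribe a handler and fire the event, mimicking how the Slave
# invokes InitializeProcess through InitializeProcessCallback.
calls = []
initialize = Event()
initialize += (lambda: calls.append("InitializeProcess ran"))
initialize.fire()
```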
The param File - Optional
The MyPlugin.param file is an optional file that is used by the Plugin Configuration dialog in the Monitor. It declares
properties that the Monitor uses to generate a user interface for modifying custom settings in the MyPlugin.dlinit file.
After you’ve created this file, open the Monitor and enter Super User mode. Then select Tools -> Configure Plugins
and look for your plug-in in the list on the left.
The file might look something like:
[MyPluginRenderExecutable]
Type=filename
Label=My Plugin Render Executable
Default=c:\path\to\my\executable.exe
Description=The path to the executable file used for rendering.
Comment lines are supported in the param file, and must start with either ‘;’ or ‘#’. For example:
# This is the file name picker control to set the executable for this plugin.
[MyPluginRenderExecutable]
Type=filename
Label=My Plugin Render Executable
Default=c:\path\to\my\executable.exe
Description=The path to the executable file used for rendering.
You’ll notice that the property name between the square brackets matches the MyPluginRenderExecutable custom
setting we defined in our MyPlugin.dlinit file. This means that this control will change the MyPluginRenderExecutable
setting. The available key=value pairs for the properties defined here are:
• Category: The category the control should go under.
• CategoryIndex: This determines the control’s order under its category. This does the same thing as Index.
• CategoryOrder: This determines the category’s order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
• Default: The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
• DefaultValue: The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
• Description: A short description of the property the control is for (displayed as a tooltip in the UI).
• DisableIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
• IgnoreIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
• Index: This determines the control’s order under its category. This does the same thing as CategoryIndex.
• Label: The control label.
• Required: If True, a control will be shown for this property even if it’s not defined in the dlinit file (True/False).
• Type: The type of control (see table below).
These are the available controls.
• Boolean: A drop-down control that allows the selection of True or False.
• Color: Allows the selection of a color.
• Enum: A drop-down control that allows the selection of an item from a list.
• Enumeration: Same as Enum above.
• Filename: Allows the selection of an existing file.
• FilenameSave: Allows the selection of a new or existing file.
• Float: A floating point spinner control.
• Folder: Allows the selection of an existing folder.
• Integer: An integer spinner control.
• Label: A read-only text field.
• MultiFilename: Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
• MultiLineMultiFilename: Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
• MultiLineMultiFolder: Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
• MultiLineString: A text field with multiple lines.
• Password: A text field that masks the text.
• SlaveList: Allows the selection of existing Slaves, which are then separated by commas in the text field.
• String: A text field.
There are also key/value pairs for specific controls:
• DecimalPlaces: The number of decimal places for the Float controls.
• Filter: The filter string for the Filename, FilenameSave, or MultiFilename controls.
• Increment: The value to increment the Integer or Float controls by.
• Items: The semicolon separated list of items for the Enum control. This does the same thing as Values.
• Maximum: The maximum value for the Integer or Float controls.
• Minimum: The minimum value for the Integer or Float controls.
• Validator: A regular expression for the String control that is used to ensure the value is valid.
• Values: The semicolon separated list of items for the Enum control. This does the same thing as Items.
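Because the param file uses INI-style [sections] with key=value pairs and ‘;’ or ‘#’ comment lines, it can be read with Python’s standard configparser module (named ConfigParser in the Python 2.7 that Deadline embeds). The helper below is a sketch, not part of the Deadline API; note that optionxform must be overridden to preserve the case of keys like ‘Type’:

```python
try:
    import configparser  # Python 3
except ImportError:
    import ConfigParser as configparser  # Python 2.7, as embedded in Deadline

def read_param_file(text):
    """Return {section: {key: value}} for a .param-style INI string."""
    parser = configparser.ConfigParser()
    parser.optionxform = str  # preserve the case of keys such as "Type"
    parser.read_string(text)  # Python 2 would use readfp() instead
    return {section: dict(parser.items(section)) for section in parser.sections()}
```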
The options File - Optional
The MyPlugin.options file is an optional file that is used by the Job Properties dialog in the Monitor. It declares
properties that the Monitor uses to generate a user interface for modifying plug-in specific options as they appear in
the plug-in info file that was submitted with the job. After you’ve created this file, you can right-click on a job in the
Monitor that uses this plug-in and select Modify Properties. You should then see a MyPlugin page at the bottom of the
list on the left which you can select to view these properties.
Often, these plug-in specific options are used to build up the arguments to be passed to the rendering application. Let’s
assume that our render executable takes a “-verbose” argument that accepts a boolean parameter, and that the plug-in
info file submitted with the job contains the following:
Verbose=True
Now we would like to be able to change this value from the Job Properties dialog in the Monitor, so our MyPlugin.options file might look like this:
[Verbose]
Type=boolean
Label=Verbose Logging
Description=If verbose logging is enabled.
Required=true
DisableIfBlank=false
DefaultValue=True
You’ll notice that the property name between the square brackets matches the Verbose setting in our plug-in info file.
This means that this control will change the Verbose setting. The available key=value pairs for the properties defined
here are the same as those defined for the param file above. Comment lines are also supported in the options file in the
same way they are supported in the param file.
The ico File - Optional
The MyPlugin.ico file is an optional 16x16 icon file that can be used to easily identify jobs that use this plug-in in the Monitor. Typically, it is the plug-in application’s logo, or something else that represents the plug-in. If a plug-in does not have an icon file, a generic icon will be shown in the jobs list in the Monitor.
The JobPreLoad.py File - Optional
The JobPreLoad.py file is an optional script that will be executed by the Slave prior to loading a job that uses this
plug-in. Note that in this case, the file does not share its name with the plug-in folder. This script can be used to do
things like synchronize plug-ins or scripts prior to starting the render job.
The only requirement for the JobPreLoad.py script is that you define a __main__ function, which is called by the Slave when it executes the script. It must accept a single parameter, which is the current instance of the DeadlinePlugin class.
Here is an example script that copies a couple files from a server to the local machine, and sets some environment
variables:
from System import *
from System.IO import *

def __main__( deadlinePlugin ):
    deadlinePlugin.LogInfo( "Copying some files" )
    File.Copy( r"\\server\files\file1.ext", r"C:\local\files\file1.ext", True )
    File.Copy( r"\\server\files\file2.ext", r"C:\local\files\file2.ext", True )

    deadlinePlugin.LogInfo( "Setting EnvVar1 to True" )
    deadlinePlugin.SetProcessEnvironmentVariable( "EnvVar1", "True" )

    deadlinePlugin.LogInfo( "Setting EnvVar2 to False" )
    deadlinePlugin.SetProcessEnvironmentVariable( "EnvVar2", "False" )
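Since Python for .NET also exposes the full cPython standard library, the same JobPreLoad.py could be written without the .NET File class. The sketch below uses shutil instead; copy_job_files is a hypothetical helper (not part of the Deadline API), and the paths in __main__ are placeholders:

```python
import shutil

def copy_job_files(file_pairs, log):
    """Copy (source, destination) pairs, logging each copy.

    'log' is any callable taking a message string; inside Deadline this
    could be deadlinePlugin.LogInfo.
    """
    for source, destination in file_pairs:
        log("Copying %s -> %s" % (source, destination))
        shutil.copyfile(source, destination)

def __main__( deadlinePlugin ):
    # Hypothetical paths; adjust them for your own farm.
    copy_job_files(
        [ (r"\\server\files\file1.ext", r"C:\local\files\file1.ext"),
          (r"\\server\files\file2.ext", r"C:\local\files\file2.ext") ],
        deadlinePlugin.LogInfo )

    deadlinePlugin.LogInfo( "Setting EnvVar1 to True" )
    deadlinePlugin.SetProcessEnvironmentVariable( "EnvVar1", "True" )
```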
The PluginPreLoad.py File - Optional
The PluginPreLoad.py file is an optional script that will be executed by the Slave prior to executing any python script
for the plug-in (MyPlugin.py or JobPreLoad.py), and any pre or post job or task script for the current job. Note that
in this case, the file does not share its name with the plug-in folder. This script can be used to set up the Python
environment prior to running any other python script, including setting sys.path to control where additional modules
will be loaded from.
The only requirement for the PluginPreLoad.py script is that you define a __main__ function, which is called by the
Slave when it executes the script. It does not accept any parameters. Here is an example script that updates sys.path
with custom paths:
import sys

def __main__():
    path = r"\\server\python"
    if path not in sys.path:
        sys.path.append( path )
7.2.4 Simple Plug-ins
A render job goes through three stages:
• StartJob: A job enters this stage when it is first picked up by a Slave.
• RenderTasks: A job can enter this stage many times (once for each task a Slave dequeues while it has the
current job loaded).
• EndJob: A job enters this stage when a Slave is unloading the job.
Simple plug-ins only cover the RenderTasks stage, and are pretty straightforward. They are commonly used to render
with applications that support simple command line rendering (running a command line executable and waiting for it
to complete). For example, After Effects has a command line renderer called aerender.exe, which can be executed by
the Slave to render specific frames of an After Effects project file.
Initialization
By default, a plug-in is considered to be a Simple plug-in, but you can explicitly set this in the InitializeProcess()
callback (as explained above). You can also define settings specific to the simple plug-in, as well as any popup or
stdout handlers that you need. These additional settings are covered in the ManagedProcess class in the Deadline
Scripting reference (note that the DeadlinePlugin class inherits from the ManagedProcess class). For example:
from Deadline.Plugins import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback
        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple

        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True
        # StdoutHandling should be enabled if required in your plugin.
        self.StdoutHandling = True
        # PopupHandling should be enabled if required in your plugin.
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.LogWarning( self.GetRegexMatch(0) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.FailRender( "Detected an error: " + self.GetRegexMatch(1) )
Stdout Handlers
The AddStdoutHandlerCallback() function accepts a string parameter, which is a POSIX-compliant regular expression
used to match against lines of stdout from the command line process. This function also returns a RegexHandlerCallback
instance, to which you can hook up a callback that is called whenever a line of stdout matches. This can all be
done on one line, as shown in the example above.
Examples of handler callback functions are also shown in the example above. Within these handler functions, the
GetRegexMatch() function can be used to get a specific match from the regular expression. The parameter passed to
GetRegexMatch() is the index of the match to retrieve: 0 returns the entire matched string, while 1, 2, etc. return
the matched substrings (matches that are surrounded by round brackets). If there isn't a corresponding substring, you'll
get an error (note that 0 is always a valid index).
In HandleStdoutWarning(), 0 is the only valid index because there is no substring in round brackets in the regular
expression. In HandleStdoutError(), 0 and 1 are valid. 0 will return the entire matched string, whereas 1 will return
the substring in the round brackets.
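This indexing follows standard regular-expression group numbering. As a plain-Python illustration using the standard re module (rather than Deadline's API), with a made-up stdout line:

```python
import re

# The same pattern used by HandleStdoutError() above.
match = re.search("ERROR:(.*)", "ERROR: file not found")

# Index 0 is the entire matched string.
print(match.group(0))  # ERROR: file not found
# Index 1 is the first bracketed substring.
print(match.group(1))  #  file not found
```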
For further examples, please open up any of our application plugin Python script files and inspect them. An example
of comprehensive Stdout handlers can be found in the MayaBatch plugin.
• ../plugins/MayaBatch/MayaBatch.py
Note that Deadline's default shipping StdoutHandlers require the Slave's operating system language to be English.
Popup Ignorers and Handlers
The AddPopupIgnorer() function accepts a string parameter, which is a POSIX compliant regular expression. If a
popup is displayed with a title that matches the given regular expression, the popup is simply ignored. Popup ignorers
should only be used if the popup doesn’t halt the rendering because it is waiting for a button to be pressed. In the
case where a button needs to be pressed to continue, popup handlers should be used instead. The AddPopupHandler()
function takes two parameters: a regular expression string, and the button(s) to press (multiple buttons can be separated
with semicolons).
Note that Deadline's default shipping PopupIgnorers and PopupHandlers require the Slave's operating system language
to be English.
Here is an example that uses ".*" at the beginning and end of the title search string to act as a wildcard. The dialog
also has an "Adopt the File's Unit Scale" checkbox that needs to be checked, after which the "OK" button should be
pressed, in that order.
self.PopupHandling = True
self.AddPopupHandler( ".*File Load: Units Mismatch.*", "Adopt the File's Unit Scale?;OK" )
In this example, the Optical Flares license popup uses a "wxWindowClassNR" control for its "OK" button, so we need
to add this special class type to our built-in list of possible button classes, just for the After Effects plugin. Once this
class is added, we can search for it and react by pressing the "OK" button in its dialog. Note that although the button
visually displays the word "OK", its actual name is "panel".
self.PopupHandling = True
self.PopupButtonClasses = ( "Button", "wxWindowClassNR" )
# Handle Optical Flares License popup (the "OK" button is actually called "panel")
self.AddPopupHandler( ".*Optical Flares License.*", "panel" )
For users without access to a recent (2012+) version of Visual Studio, which includes the Spy++ utility, the free
application WinSpy++ is very useful for identifying the correct syntax for a dialog's title or button.
In this example, we force all Qt-based widgets to be native instead of alien widgets, set our HandleQtPopups
variable to True, and are then able to handle V-Ray's Qt-based alien widget dialogs while rendering in Rhino
by pressing the [X] symbol in the top-right corner of the Rhino Qt dialog:
self.PopupHandling = True
self.HandleQtPopups = True
self.SetEnvironmentVariable( "QT_USE_NATIVE_WINDOWS","1" )
self.AddPopupHandler( r"Rhino", "[X]" )
In this final example, we need to handle Windows 8 Mobile / Windows 10 based popup dialogs and ensure we react
correctly depending on the dialog title, which can be tricky if the application has multiple, very similarly named
popup dialogs. We use ".*" as a wildcard, "^" to anchor the text to the start of the string, and "$" to anchor it to the
end of the string we are searching for.
self.PopupHandling = True
self.HandleWindows10Popups = True
self.AddPopupIgnorer( "SAFE 12.*" )
self.AddPopupIgnorer( "^SAFE$" )
self.AddPopupHandler( "^$", "[X]" )
self.AddPopupHandler( "Tip of the Day", "[X]" )
For further examples, please open up any of our application plugin Python script files and inspect them. Good examples
are to be found in:
• ../plugins/3dsmax/3dsmax.py
• ../plugins/AfterEffects/AfterEffects.py
• ../plugins/CSiSAFE/CSiSAFE.py
• ../plugins/Rhino/Rhino.py
Further information on regular expressions can be found on Wikipedia, and many online POSIX-compliant RegEx
testers are available to help you develop and test your RegEx before testing your code in Deadline:
• regex101
• regexr
• regexpal
• regextester
Finally, the Deadline "FranticX.Processes.ManagedProcess" class has a number of additional members to further assist with
popup handling, and we recommend reviewing our Scripting API docs for further information on them:
• PopupButtonClasses
• PopupMaxChildWindows
• PopupTextClasses
• PressEnterDuringRender
Render Executable and Arguments
The RenderExecutable() callback is used to get the path to the executable that will be used for rendering. This callback
must be implemented in a Simple plug-in, or an error will occur. Continuing our example from above, we'll use the
path specified in the MyPlugin.dlinit file, which we can access using the GetConfigEntry() member function.
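A minimal MyPlugin.dlinit for this example might contain just that key; the executable path below is a placeholder, not a real application:

```
MyPluginRenderExecutable=C:\MyApplication\renderer.exe
```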
Another important (but optional) callback is the RenderArgument() callback. This callback should return the arguments you want to pass to the render executable. Typically, these arguments are built from values that are pulled from
the DeadlinePlugin class (like the scene file name, or the start and end frame for the task), or from the plug-in info file
that was submitted with the job using the GetPluginInfoEntry() function. If this callback is not implemented, then no
arguments will be passed to the executable.
After adding these callbacks, our example plug-in script now looks like this:
from Deadline.Plugins import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess
        self.RenderExecutableCallback += self.RenderExecutable
        self.RenderArgumentCallback += self.RenderArgument

    ## Clean up the plugin.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback
        del self.InitializeProcessCallback
        del self.RenderExecutableCallback
        del self.RenderArgumentCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Simple

        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True
        self.StdoutHandling = True
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.LogWarning( self.GetRegexMatch(0) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.FailRender( "Detected an error: " + self.GetRegexMatch(1) )

    ## Callback to get the executable used for rendering.
    def RenderExecutable( self ):
        return self.GetConfigEntry( "MyPluginRenderExecutable" )

    ## Callback to get the arguments that will be passed to the executable.
    def RenderArgument( self ):
        arguments = " -continueOnError"
        arguments += " -verbose " + self.GetPluginInfoEntry( "Verbose" )
        arguments += " -start " + str(self.GetStartFrame())
        arguments += " -end " + str(self.GetEndFrame())
        arguments += " -scene \"" + self.GetDataFilename() + "\""
        return arguments
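Note that the escaped quotes around the scene filename are what keep a path containing spaces intact as a single argument. As a plain-Python sketch of the same concatenation (the values and path are invented for illustration):

```python
def build_arguments(verbose, start_frame, end_frame, scene):
    # Mirrors the string building in RenderArgument() above.
    arguments = " -continueOnError"
    arguments += " -verbose " + verbose
    arguments += " -start " + str(start_frame)
    arguments += " -end " + str(end_frame)
    # Wrap the scene path in quotes in case it contains spaces.
    arguments += " -scene \"" + scene + "\""
    return arguments

print(build_arguments("2", 1, 10, r"\\server\projects\my scene.dat"))
```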
There are many other callbacks that can be implemented for Simple plug-ins, which are covered in the Events section
for the ManagedProcess class in the Deadline Scripting reference. The best place to find examples of Simple plug-ins
is to look at some of the plug-ins that are shipped with Deadline. These range from the very basic (Blender), to the
more complex (MayaCmd).
7.2.5 Advanced Plug-ins
To reiterate, a render job goes through three stages:
• StartJob: A job enters this stage when it is first picked up by a Slave.
• RenderTasks: A job can enter this stage many times (once for each task a Slave dequeues while it has the
current job loaded).
• EndJob: A job enters this stage when a Slave is unloading the job.
Advanced plug-ins are more complex, as they control all three of these job stages. They are commonly used to render
with applications that support some sort of slave/server mode that Deadline can interact with. Usually, this requires the
application to be started during the StartJob phase, fed commands during the RenderTasks stage(s), and finally shut
down during the EndJob stage. For example, the 3ds Max plug-in starts up 3dsmax in slave mode and forces it to load
our Lightning plug-in. The Lightning plug-in listens for commands from Deadline and executes them as necessary.
After rendering is complete, 3ds Max is shut down.
Initialization
To indicate that your plug-in is an Advanced plug-in, you need to set the PluginType property in the InitializeProcess()
callback.
from Deadline.Plugins import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Advanced
Render Tasks
The RenderTasks() callback is the only required callback for Advanced plug-ins. If it is not implemented, an error will
occur. It contains the code to be executed for each task that a Slave renders. This could involve launching applications,
communicating with already running applications, or simply running a script to automate a particular task (like backing
up a group of files).
Other common callbacks for Advanced plug-ins are the StartJob() and EndJob() callbacks. The StartJob() callback can
be used to start up an application, or to set some local variables that will be used in other callbacks. If the StartJob()
callback is not implemented, then nothing is done during the StartJob phase. The EndJob() callback can be used to
shut down a running application, or to clean up temporary files. If the EndJob() callback is not implemented, then
nothing is done during the EndJob phase.
In the example below, we will be launching our application during the StartJob phase. The benefit of this is that
the application can be left running for the duration of the job, which eliminates the overhead of having to launch
the application for each task. To launch and monitor the application, we will be implementing a ManagedProcess
class, and calling it MyPluginProcess. This ManagedProcess class will define the render executable and command line
arguments for launching the process we will be monitoring. Note that we aren't passing it any frame information, as
this needs to be handled in the RenderTasks() callback when it interacts with the process.
After adding these three callbacks, and the MyPluginProcess class, our example code looks like this. Note that the
RenderTasks() callback still needs code to allow it to interact with the running process launched in the StartJob()
callback.
from Deadline.Plugins import *
from FranticX.Processes import *
from System.Diagnostics import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlinePlugin class.
######################################################################
def GetDeadlinePlugin():
    return MyPlugin()

######################################################################
## This is the function that Deadline calls when the plugin is no
## longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlinePlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlinePlugin class for MyPlugin.
######################################################################
class MyPlugin (DeadlinePlugin):
    ## Variable to hold the Managed Process object.
    Process = None

    ## Hook up the callbacks in the constructor.
    def __init__( self ):
        self.InitializeProcessCallback += self.InitializeProcess
        self.StartJobCallback += self.StartJob
        self.RenderTasksCallback += self.RenderTasks
        self.EndJobCallback += self.EndJob

    ## Clean up the plugin.
    def Cleanup( self ):
        del self.InitializeProcessCallback
        del self.StartJobCallback
        del self.RenderTasksCallback
        del self.EndJobCallback

        # Clean up the managed process object.
        if self.Process:
            self.Process.Cleanup()
            del self.Process

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the plugin specific settings.
        self.SingleFramesOnly = False
        self.PluginType = PluginType.Advanced

    ## Called by Deadline when the job starts.
    def StartJob( self ):
        self.Process = MyPluginProcess( self )
        self.StartMonitoredManagedProcess( "My Process", self.Process )

    ## Called by Deadline for each task the Slave renders.
    def RenderTasks( self ):
        # Do something to interact with the running process.
        pass

    ## Called by Deadline when the job ends.
    def EndJob( self ):
        self.ShutdownMonitoredManagedProcess( "My Process" )

######################################################################
## This is the ManagedProcess class that is launched above.
######################################################################
class MyPluginProcess (ManagedProcess):
    deadlinePlugin = None

    ## Hook up the callbacks in the constructor.
    def __init__( self, deadlinePlugin ):
        self.deadlinePlugin = deadlinePlugin
        self.InitializeProcessCallback += self.InitializeProcess
        self.RenderExecutableCallback += self.RenderExecutable
        self.RenderArgumentCallback += self.RenderArgument

    ## Clean up the managed process.
    def Cleanup( self ):
        # Clean up stdout handler callbacks.
        for stdoutHandler in self.StdoutHandlers:
            del stdoutHandler.HandleCallback
        del self.InitializeProcessCallback
        del self.RenderExecutableCallback
        del self.RenderArgumentCallback

    ## Called by Deadline to initialize the process.
    def InitializeProcess( self ):
        # Set the ManagedProcess specific settings.
        self.ProcessPriority = ProcessPriorityClass.BelowNormal
        self.UseProcessTree = True
        self.StdoutHandling = True
        self.PopupHandling = True

        # Set the stdout handlers.
        self.AddStdoutHandlerCallback( "WARNING:.*" ).HandleCallback += self.HandleStdoutWarning
        self.AddStdoutHandlerCallback( "ERROR:(.*)" ).HandleCallback += self.HandleStdoutError

        # Set the popup ignorers.
        self.AddPopupIgnorer( "Popup 1" )
        self.AddPopupIgnorer( "Popup 2" )

        # Set the popup handlers.
        self.AddPopupHandler( "Popup 3", "OK" )
        self.AddPopupHandler( "Popup 4", "Do not ask me this again;Continue" )

    ## Callback for when a line of stdout contains a WARNING message.
    def HandleStdoutWarning( self ):
        self.deadlinePlugin.LogWarning( self.GetRegexMatch(0) )

    ## Callback for when a line of stdout contains an ERROR message.
    def HandleStdoutError( self ):
        self.deadlinePlugin.FailRender( "Detected an error: " + self.GetRegexMatch(1) )

    ## Callback to get the executable used for rendering.
    def RenderExecutable( self ):
        return self.deadlinePlugin.GetConfigEntry( "MyPluginRenderExecutable" )

    ## Callback to get the arguments that will be passed to the executable.
    def RenderArgument( self ):
        arguments = " -verbose " + self.deadlinePlugin.GetPluginInfoEntry( "Verbose" )
        arguments += " -scene \"" + self.deadlinePlugin.GetDataFilename() + "\""
        return arguments
Because the Advanced plug-ins are much more complex than the Simple plug-ins, we recommend taking a look at the
following plug-ins that are shipped with Deadline for examples:
• 3dsmax
• Fusion
• Lightwave
• MayaBatch
• Modo
• Nuke
• SoftimageBatch
7.2.6 Migrating Plug-ins from Deadline 5
Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with plugin scripts written for Deadline 5. However, migrating your scripts over is relatively straightforward, and this guide will walk you through the API changes so that you can update your scripts as necessary.
Global Functions
In Deadline 6, all global API functions were removed, and replaced with DeadlinePlugin member functions, or with
static utility functions. See the Migrating Scripts From Deadline 5 section in the Scripting Overview documentation
for more information, including replacement functions.
Almost all plugin-specific global functions are now DeadlinePlugin member functions. For example, the global
'LogInfo( message )' function has been replaced with a member function of the DeadlinePlugin class, which you
created in your plugin python file. So instead of:
LogInfo( "this is a test message" )
You would use this code:
self.LogInfo( "this is a test message" )
The only functions that aren’t DeadlinePlugin member functions are listed below, along with their replacement utility
functions.
Original Global Function
CheckPathMapping( path )
CheckPathMappingInFile( inFileName,
outFileName )
CheckPathMappingInFileAndReplaceSeparator(
inFileName, outFileName , separatorToReplace,
newSeparator )
PathMappingRequired( path )
Replacement Function
RepositoryUtils.CheckPathMapping( path )
RepositoryUtils.CheckPathMappingInFile( inFileName,
outFileName )
RepositoryUtils.CheckPathMappingInFileAndReplaceSeparator(
inFileName, outFileName, separatorToReplace,
newSeparator )
RepositoryUtils.PathMappingRequired( path )
Callbacks
You need to set up callbacks in the constructor of your DeadlinePlugin class that you created in your plugin python
file. Examples are shown in the documentation above, and you can look at the plug-ins that ship with Deadline for
references as well. For example:
def __init__( self ):
    self.InitializeProcessCallback += self.InitializeProcess
    self.RenderExecutableCallback += self.RenderExecutable
    self.RenderArgumentCallback += self.RenderArgument
    self.PreRenderTasksCallback += self.PreRenderTasks
    self.PostRenderTasksCallback += self.PostRenderTasks
Note that these callbacks need to be manually cleaned up when the plug-in is no longer in use. See the documentation
regarding the CleanupDeadlinePlugin function above for more information.
Deprecated Mode
As mentioned above, you can set the DeprecatedMode property in your dlinit file to True. This mode allows
Python.NET plug-ins written for Deadline 5.1 or 5.2 to work with Deadline 6 and later, which can make the transition to Deadline 6 easier if you have custom plug-ins.
Note that when DeprecatedMode is enabled, all global functions will still be available, so if you have custom
Python.NET plug-ins, you just need to drop them in the ‘custom/plugins’ folder in the Repository, and add “DeprecatedMode=True” to your dlinit file.
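For example, the dlinit file for such a plug-in might look like the following; the executable key and path are placeholders carried over from the earlier MyPlugin example:

```
MyPluginRenderExecutable=C:\MyApplication\renderer.exe
DeprecatedMode=True
```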
If you have custom IronPython plug-ins from Deadline 5.2 or earlier, they will not work with Deadline 6 and later.
7.3 Event Plugins
7.3.1 Overview
Event plug-ins can be created to execute specific tasks in response to specific events in Deadline (like when a job is
submitted or when it finishes). For example, event plug-ins can be used to communicate with in-house pipeline tools
to update the state of shots or tasks, or they can be used to submit a post-processing job when another job finishes. All
of Deadline’s event plug-ins are written in Python, which means that it’s easy to create your own plug-ins or customize
the existing ones. See the Scripting Overview documentation for more information, and links to the Deadline Scripting
reference.
Note that because the Python scripts for event plug-ins will be executed in a non-interactive way, it is important that
your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input.
When an event is executed, the log will show where the script is being loaded from.
7.3.2 Triggering Events
An event plug-in can respond to one or more of the following DeadlineEventListener events:
• When a job is submitted: OnJobSubmittedCallback
• When a job starts rendering: OnJobStartedCallback
• When a job finishes rendering: OnJobFinishedCallback
• When a job is requeued: OnJobRequeuedCallback
• When a job fails: OnJobFailedCallback
• When a job is suspended: OnJobSuspendedCallback
• When a suspended or failed job is resumed: OnJobResumedCallback
• When a job is placed in the pending state: OnJobPendedCallback
• When a job is released from a pending state: OnJobReleasedCallback
• When a job is deleted: OnJobDeletedCallback
• When a job error occurs during rendering: OnJobErrorCallback
• When a job is about to be purged from the database: OnJobPurgedCallback
• When a house cleaning operation finishes: OnHouseCleaningCallback
• When a repository repair operation finishes: OnRepositoryRepairCallback
• When a slave starts: OnSlaveStartedCallback
• When a slave stops: OnSlaveStoppedCallback
• When a slave becomes idle: OnSlaveIdleCallback
• When a slave starts rendering: OnSlaveRenderingCallback
• When a slave starts a job: OnSlaveStartingJobCallback
• When a slave is marked as stalled: OnSlaveStalledCallback
• When power management's Idle Shutdown feature shuts down slaves: OnIdleShutdownCallback
• When power management's Machine Startup feature starts up slaves: OnMachineStartupCallback
• When power management's Thermal Shutdown feature shuts down slaves: OnThermalShutdownCallback
• When power management's Machine Restart feature restarts slaves: OnMachineRestartCallback
The corresponding Event Callbacks for these events can be found in the ‘Deadline.Events.DeadlineEventListener Class
Reference’ section of the Deadline Scripting Reference documentation. The full Deadline Scripting Reference can be
found on the Thinkbox Software Documentation Website. Offline PDF and HTML versions can be downloaded from
here as well.
By default, all jobs will trigger event plug-ins when they are submitted or change state. However, there is a job property
that can be enabled to suppress events. In the Monitor, you can set the Suppress Events property under the Advanced
tab in the Job Properties dialog. If you have a custom submission tool or script, you can specify the following in the
job info file:
SuppressEvents=True
Note that events will be executed by different Deadline applications, depending on the context of the event. For
example, the job submission event is processed by the Command application after the job has been submitted, while
the job finished event is normally processed by the Slave that finishes the last task for the job. However, the job finished
event could also be processed by the Monitor if a job is manually marked as complete.
7.3.3 Creating an Event Plug-in
To create a custom event plug-in, you start by creating a folder in the Repository's custom\events folder, giving it the
name of your event plug-in. See the Scripting Overview documentation for more information on the 'custom' folder
in the Repository and how it’s used.
For the sake of this document, we will call our new event plug-in MyEvent. All relevant script and configuration files
for this event plug-in are to be placed in this folder (some are required and some are optional).
The dlinit File - Required
The first required file is MyEvent.dlinit, which is the main configuration file for this event plug-in. It is a plain text file
that defines a few general key=value event plug-in properties, which include:
• Enabled: Set to True or False (default is False). Only enabled event plug-ins will respond to events.
• DeprecatedMode: Set to True or False (default is False). Only set to True if you want a custom Python.NET event
plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later. More information on DeprecatedMode can be
found later on.
It can also define key=value custom settings to be used by the event plug-in. For example, if you are connecting to an
in-house pipeline tool, you may want the URL and credentials to be configurable, in which case our MyEvent.dlinit
file might look like this:
Enabled=True
PipelineURL=http://[myserver]/pipeline
PipelineUserName=myuser
PipelinePassword=mypassword
The py File - Required
The other required file is MyEvent.py, which is the main event plug-in script file. It defines the main DeadlineEventListener class that contains the necessary callbacks that will respond to specific events. The template for this script file
might look like this:
from Deadline.Events import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return MyEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for MyEvent.
######################################################################
class MyEvent (DeadlineEventListener):
    # TODO: Place code here to replace "pass"
    pass
The first thing to note is that we’re importing the Deadline.Events namespace so that we can access the DeadlineEventListener class.
The GetDeadlineEventListener() function is important, as it allows Deadline to get an instance of our MyEvent class
(which is extending the abstract DeadlineEventListener class). In Deadline 6.2 and later, the GetDeadlineEventListenerWithJobs( jobs ) function can be defined as an alternative. It works the same as GetDeadlineEventListener(), except
that it accepts a list of the Job objects that the event plug-in is being loaded for. If neither of these functions is
defined, Deadline will report an error when it tries to load the event plug-in.
The MyEvent class will need to implement certain callbacks based on the events you want to respond to, and these
callbacks must be hooked up in the MyEvent constructor. All callbacks are optional, but make sure to include at
least one so that your event plug-in actually does something. For a list of all available callbacks, refer to the DeadlineEventListener class in the Deadline Scripting reference.
The CleanupDeadlineEventListener() function is also important, as it is necessary to clean up the event plug-in when
it is no longer in use. Typically, this is used to clean up any callbacks that were created when the event plug-in was
initialized.
After implementing a few functions, your MyEvent.py script file might look something like this:
from Deadline.Events import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return MyEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for MyEvent.
######################################################################
class MyEvent (DeadlineEventListener):

    def __init__( self ):
        # Set up the event callbacks here
        self.OnJobSubmittedCallback += self.OnJobSubmitted
        self.OnJobFinishedCallback += self.OnJobFinished

    def Cleanup( self ):
        del self.OnJobSubmittedCallback
        del self.OnJobFinishedCallback

    def OnJobSubmitted( self, job ):
        # TODO: Connect to pipeline site to notify it that a job has been submitted
        # for a particular shot or task.
        pass

    def OnJobFinished( self, job ):
        # TODO: Connect to pipeline site to notify it that the job for a particular
        # shot or task is complete.
        pass
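To flesh out one of these callbacks, the notification request to the pipeline site has to be built from the job's properties. The endpoint and parameter names below are hypothetical, but a small sketch of assembling the request URL with Python's standard library might look like this:

```python
from urllib.parse import urlencode

def build_notify_url( base_url, job_name, status ):
    # Build the (hypothetical) notification URL for our pipeline site.
    return base_url.rstrip( "/" ) + "/notify?" + urlencode( { "job": job_name, "status": status } )

url = build_notify_url( "http://myserver/pipeline", "shot_010_comp", "submitted" )
# url == "http://myserver/pipeline/notify?job=shot_010_comp&status=submitted"
```

In the real OnJobSubmitted callback, the base URL would come from the event plug-in's settings rather than being hard-coded, and the job name would come from the job object passed to the callback.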
The param File - Optional
The MyEvent.param file is an optional file that is used by the Event Configuration dialog in the Monitor. It declares
properties that the Monitor uses to generate a user interface for modifying custom settings in the MyEvent.dlinit file.
After you’ve created this file, open the Monitor and enter Super User mode. Then select Tools -> Configure Events
and look for your event plug-in in the list on the left.
The file might look something like:
[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this event plug-in should respond to events.
[PipelineURL]
Type=string
Label=Pipeline URL
Default=http://[myserver]/pipeline
Description=The URL for our pipeline website.
[PipelineUserName]
Type=string
Label=Pipeline User Name
Default=
Description=The user name for our pipeline website.
[PipelinePassword]
Type=string
Label=Pipeline Password
Default=
Description=The password for our pipeline website.
Comment lines are supported in the param file, and must start with either ‘;’ or ‘#’. For example:
# This is a comment about this PipelineURL property.
[PipelineURL]
Type=string
Label=Pipeline URL
Default=http://[myserver]/pipeline
Description=The URL for our pipeline website.
You'll notice that the property names between the square brackets match the custom keys we defined in our MyEvent.dlinit file. This means that these controls will change the corresponding settings. The available key=value pairs for the properties defined here are:
Category: The category the control should go under.
CategoryIndex: This determines the control's order under its category. This does the same thing as Index.
CategoryOrder: This determines the category's order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
Default: The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
DefaultValue: The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
Description: A short description of the property the control is for (displayed as a tooltip in the UI).
DisableIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
IgnoreIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
Index: This determines the control's order under its category. This does the same thing as CategoryIndex.
Label: The control label.
Required: If True, a control will be shown for this property even if it's not defined in the dlinit file (True/False).
Type: The type of control (see table below).
These are the available controls.
Boolean: A drop-down control that allows the selection of True or False.
Color: Allows the selection of a color.
Enum: A drop-down control that allows the selection of an item from a list.
Enumeration: Same as Enum above.
Filename: Allows the selection of an existing file.
FilenameSave: Allows the selection of a new or existing file.
Float: A floating point spinner control.
Folder: Allows the selection of an existing folder.
Integer: An integer spinner control.
Label: A read-only text field.
MultiFilename: Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
MultiLineMultiFilename: Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
MultiLineMultiFolder: Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
MultiLineString: A text field with multiple lines.
Password: A text field that masks the text.
SlaveList: Allows the selection of existing Slaves, which are then separated by commas in the text field.
String: A text field.
There are also key/value pairs for specific controls:
DecimalPlaces: The number of decimal places for the Float controls.
Filter: The filter string for the Filename, FilenameSave, or MultiFilename controls.
Increment: The value to increment the Integer or Float controls by.
Items: The semicolon separated list of items for the Enum control. This does the same thing as Values.
Maximum: The maximum value for the Integer or Float controls.
Minimum: The minimum value for the Integer or Float controls.
Validator: A regular expression for the String control that is used to ensure the value is valid.
Values: The semicolon separated list of items for the Enum control. This does the same thing as Items.
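Because the param format is INI-style (sections in square brackets, key=value pairs, comment lines starting with '#' or ';'), a quick way to sanity-check a param file before deploying it is to parse it with Python's standard configparser module. This is just a validation sketch, not how Deadline itself reads the file:

```python
import configparser

PARAM_TEXT = """
# This is a comment about this PipelineURL property.
[PipelineURL]
Type=string
Label=Pipeline URL
Default=http://[myserver]/pipeline
Description=The URL for our pipeline website.
"""

parser = configparser.ConfigParser()
parser.read_string( PARAM_TEXT )

# Option names are case-insensitive in configparser, so "Type" works here.
prop_type = parser["PipelineURL"]["Type"]        # "string"
default_value = parser["PipelineURL"]["Default"]
```

If the file has a malformed section header or a stray line, read_string raises a parsing error, which makes this a cheap pre-deployment check.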
7.3.4 Event Plug-in and Error Reports
Logs and reports can be stored with the job or the slave, depending on the event type.
Job Event Reports
Event types that start with “OnJob...” will save reports with the corresponding job.
When an event plug-in that uses the LogInfo or LogWarning functions finishes executing, its log will be stored with
the job’s other render logs, which you can view in the Monitor by right-clicking on the job and selecting View Job
Reports.
When an error occurs in an event plug-in, an error report will also be stored with the job's other render errors, which you can view in the Monitor by right-clicking on the job and selecting View Job Reports.
Slave Event Reports
Event types that start with “OnSlave...” will save reports with the corresponding slave.
When an event plug-in that uses the LogInfo or LogWarning functions finishes executing, its log will be stored with
the slave’s other render logs, which you can view in the Monitor by right-clicking on the slave and selecting View
Slave Reports.
When an error occurs in an event plug-in, an error report will also be stored with the slave's other render errors, which you can view in the Monitor by right-clicking on the slave and selecting View Slave Reports.
7.3.5 Quicktime Generation Example
An event plug-in can be used to automatically submit a Quicktime job to create a movie from the rendered images of
a job that just finished. An example of an event plug-in like this can be downloaded from the Miscellaneous Deadline
Downloads Page. To install the event plugin, just unzip the downloaded file to your Repository’s custom/events folder.
Configuration Files
The QuicktimeGen.dlinit and QuicktimeGen.param files define a couple of settings that can be configured from the
Monitor. Here you can specify a path to the Quicktime settings XML file you want to use. This settings file can be
generated from the Submit Quicktime Job To Deadline submitter in the Monitor.
The QuicktimeGen.dlinit file:
Enabled=True
QTSettings=\\ws-wpg-026\share\quicktime_export_settings.xml
The QuicktimeGen.param file:
[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this event plug-in should respond to events.
[QTSettings]
Type=filename
Label=QT Settings XML File
Default=
Description=The QT settings xml file.
7.3.6 Cron / Scheduled Event
An event plug-in can be run at a regular time interval by listening for Deadline's House Cleaning event, which fires each time a house cleaning operation completes. This is an ideal point at which to execute a scheduled event plug-in, because the Deadline database is as up to date as possible at that moment. The time interval of the House Cleaning operation is controlled in the Repository Options.
Deadline can integrate with IT monitoring systems such as Zabbix, Zenoss, Nagios, OpenNMS, SolarWinds, or any other monitoring software via the house cleaning event callback. For example, this event could be used to regularly inject Deadline data based on its job, slave, pulse, or balancer statistics or info/settings into another database, thereby providing integration and consistency between separate information systems in different departments of a company.
Building your own scheduled event script file might look something like this:
from Deadline.Events import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return ScheduledEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for ScheduledEvent.
######################################################################
class ScheduledEvent (DeadlineEventListener):

    def __init__( self ):
        # Set up the event callbacks here
        self.OnHouseCleaningCallback += self.OnHouseCleaning

    def Cleanup( self ):
        del self.OnHouseCleaningCallback

    def OnHouseCleaning( self ):
        # TODO: Execute generic pipeline duties here such as
        # reporting to an external studio database or injecting
        # Deadline Farm Stats into Zabbix, Zenoss, Nagios for IT
        pass
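House cleaning can run quite frequently, so a scheduled event often wants to act only every N seconds regardless of how often OnHouseCleaning fires. A small, self-contained gate like the following (illustrative only, not part of the Deadline API) can be kept on the listener and checked at the top of the callback:

```python
import time

class IntervalGate:
    # Lets work proceed only if at least `seconds` have passed since the last run.
    def __init__( self, seconds ):
        self.seconds = seconds
        self.last = float( "-inf" )

    def ready( self, now=None ):
        now = time.time() if now is None else now
        if now - self.last >= self.seconds:
            self.last = now
            return True
        return False

gate = IntervalGate( 3600 )
first = gate.ready( now=1000.0 )    # True: nothing has run yet
second = gate.ready( now=2000.0 )   # False: only 1000 seconds have elapsed
```

Inside OnHouseCleaning you would call gate.ready() with no argument and return early when it is False.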
7.3.7 Software Configuration Management Integration
Deadline can integrate with Software Configuration Management (SCM) systems such as CFEngine, Puppet, SaltStack, Chef, SCCM, or any other SCM software via the slave event callbacks. Deadline ships with Puppet and Salt Maintenance Jobs, which can be submitted to Deadline via their Monitor submission scripts, and also with Puppet and Salt slave-centric event plug-ins.
Building your own SCM event plugin might look something like this:
from Deadline.Events import *
from Deadline.Scripting import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main DeadlineEventListener class.
######################################################################
def GetDeadlineEventListener():
    return SoftwareEvent()

######################################################################
## This is the function that Deadline calls when the event plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupDeadlineEventListener( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main DeadlineEventListener class for SoftwareEvent.
######################################################################
class SoftwareEvent (DeadlineEventListener):

    def __init__( self ):
        # Set up the event callbacks here
        self.OnSlaveIdleCallback += self.OnSlaveIdle
        self.OnSlaveStartedCallback += self.OnSlaveStarted
        self.OnSlaveStartingJobCallback += self.OnSlaveStartingJob

    def Cleanup( self ):
        del self.OnSlaveIdleCallback
        del self.OnSlaveStartedCallback
        del self.OnSlaveStartingJobCallback

    # This is called when a slave becomes idle.
    def OnSlaveIdle( self, string ):
        # If a slave is IDLE, then it is not processing,
        # which might be an optimal time to check for
        # system updates.
        self.SoftwareUpdate()

    # This is called when a slave is started.
    def OnSlaveStarted( self, string ):
        # If a slave has just started on a rendernode,
        # this can typically be a reliable and safe time
        # to carry out config/software deployment.
        self.SoftwareUpdate()

    # This is called when a slave starts a job.
    def OnSlaveStartingJob( self, string, job ):
        # You could query the returned job object when a
        # slave first starts a job. Correct version of
        # renderer installed?
        self.SoftwareUpdate()

    def SoftwareUpdate( self ):
        ClientUtils.LogText( "Preparing for Software Update" )
        # TODO: Execute command here to query your in-house
        # software deployment tool (SCM) to see if any new
        # software/sys env variables are required to be updated.
        pass
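The SoftwareUpdate stub above would typically shell out to the SCM client. The helper below shows the general shape using Python's subprocess module; the actual command ('salt-call', 'puppet agent', etc.) depends on your deployment tool, so a harmless echo stands in for it here:

```python
import subprocess

def run_scm_command( command ):
    # Run an SCM client command and return its exit code and trimmed output.
    proc = subprocess.run( command, capture_output=True, text=True )
    return proc.returncode, proc.stdout.strip()

# A stand-in command; replace with your deployment tool's CLI.
code, output = run_scm_command( [ "echo", "no updates pending" ] )
```

A non-zero exit code could then be logged with self.LogWarning so the result shows up in the slave's event reports.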
7.3.8 Migrating Event Plug-ins from Deadline 5
Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward compatible with event plugin scripts written for Deadline 5. However, migrating your scripts over is relatively
straightforward, and this guide will walk you through the API changes so that you can update your scripts as necessary.
Global Functions
In Deadline 6, all global API functions were removed, and replaced with DeadlineEventListener member functions, or
with static utility functions. See the Migrating Scripts From Deadline 5 section in the Scripting Overview documentation for more information, including replacement functions.
Almost all event plugin-specific global functions are now DeadlineEventListener member functions. For example, the
global ‘LogInfo( message )’ function has been replaced with a member function for the DeadlineEventListener class,
which you created in your event python file. So instead of:
LogInfo( "this is a test message" )
You would use this code:
self.LogInfo( "this is a test message" )
The only functions that aren’t DeadlineEventListener member functions are listed below, along with their replacement
utility functions.
Original Global Function -> Replacement Function
CheckPathMapping( path ) -> RepositoryUtils.CheckPathMapping( path )
CheckPathMappingInFile( inFileName, outFileName ) -> RepositoryUtils.CheckPathMappingInFile( inFileName, outFileName )
CheckPathMappingInFileAndReplaceSeparator( inFileName, outFileName, separatorToReplace, newSeparator ) -> RepositoryUtils.CheckPathMappingInFileAndReplaceSeparator( inFileName, outFileName, separatorToReplace, newSeparator )
PathMappingRequired( path ) -> RepositoryUtils.PathMappingRequired( path )
Callbacks
You need to set up callbacks in the constructor of your DeadlineEventListener class that you created in your event
python file. Examples are shown in the documentation above, and you can look at the event plug-ins that ship with
Deadline for references as well. For example:
def __init__( self ):
    self.OnJobSubmittedCallback += self.OnJobSubmitted
    self.OnJobStartedCallback += self.OnJobStarted
    self.OnJobFinishedCallback += self.OnJobFinished
    self.OnJobRequeuedCallback += self.OnJobRequeued
    self.OnJobFailedCallback += self.OnJobFailed
Note that these callbacks need to be manually cleaned up when the event plug-in is no longer in use. See the documentation regarding the CleanupDeadlineEventListener function above for more information.
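The += syntax works because the callback members are objects that collect handlers. The tiny stand-in below is not Deadline's implementation, but it illustrates the semantics your constructor relies on (and why Cleanup deletes the attribute afterwards):

```python
class Callback:
    # Minimal stand-in for a Deadline callback object: += registers a
    # handler, and calling the object invokes every registered handler.
    def __init__( self ):
        self._handlers = []

    def __iadd__( self, handler ):
        self._handlers.append( handler )
        return self

    def __call__( self, *args ):
        for handler in self._handlers:
            handler( *args )

submitted = []
OnJobSubmittedCallback = Callback()
OnJobSubmittedCallback += submitted.append
OnJobSubmittedCallback( "job-001" )
# submitted == ["job-001"]
```

Deleting the callback attribute in Cleanup drops the references to your handlers, which is what allows the listener to be garbage collected once the plug-in is unloaded.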
Deprecated Mode
As mentioned above, you can set the DeprecatedMode property in your dlinit file to True. This mode allows
Python.NET event plug-ins written for Deadline 5.1 or 5.2 to work with Deadline 6 and later, which can make the
transition to Deadline 6 easier if you have custom event plug-ins.
Note that when DeprecatedMode is enabled, all global functions will still be available, so if you have custom
Python.NET event plug-ins, you just need to drop them in the ‘custom/events’ folder in the Repository, and add
“DeprecatedMode=True” to your dlinit file.
If you have custom IronPython event plug-ins from Deadline 5.2 or earlier, they will not work with Deadline 6 and
later.
7.4 Cloud Plugins
7.4.1 Overview
Cloud plug-ins can be created to allow Deadline to communicate with different cloud providers. All of Deadline's cloud plug-ins are written in Python, which means that it's easy to create your own plug-ins or customize the existing ones. You can also refer to the plug-ins in the Repository's cloud folder for examples of how they work. See the Scripting Overview documentation for more information, and links to the Deadline Scripting reference.
Note that because the Python scripts for cloud plug-ins will be executed in a non-interactive way, it is important that
your scripts do not contain any blocking operations like infinite loops, or interfaces that require user input.
When a cloud script is executed, the log will show where the script is being loaded from.
7.4.2 Creating a Cloud Plug-in
To create a custom cloud plug-in, you start by creating a folder in the Repository's custom\cloud folder and giving it the name of your cloud plug-in. See the Scripting Overview documentation for more information on the 'custom' folder in the Repository and how it's used.
For the sake of this document, we will call our new cloud plug-in MyCloud. All related script and configuration files for this cloud plug-in are to be placed in this folder.
The py File
The first required file is MyCloud.py, which is the main cloud plug-in script file. It defines the main CloudPluginWrapper class that contains the necessary callbacks that will respond to specific commands. The template for this
script file might look like this:
from Deadline.Cloud import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main CloudPluginWrapper class.
######################################################################
def GetCloudPluginWrapper():
    return MyCloud()

######################################################################
## This is the function that Deadline calls when the cloud plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupCloudPlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main CloudPluginWrapper class for MyCloud.
######################################################################
class MyCloud (CloudPluginWrapper):
    # TODO: Place code here instead of "pass"
    pass
The GetCloudPluginWrapper() function is important, as it allows Deadline to get an instance of our MyCloud class
(which is extending the abstract CloudPluginWrapper class). If this function isn’t defined, Deadline will report an
error when it tries to load the cloud plug-in. Notice that we’re importing the Deadline.Cloud namespace so that we
can access the CloudPluginWrapper class.
The MyCloud class will need to implement certain callbacks so that Deadline can get information from the cloud
provider, and these callbacks must be hooked up in the MyCloud constructor. For a list of all available callbacks, refer
to the CloudPluginWrapper class in the Deadline Scripting reference.
After implementing a few functions, your MyCloud.py script file might look something like this:
from Deadline.Cloud import *

######################################################################
## This is the function that Deadline calls to get an instance of the
## main CloudPluginWrapper class.
######################################################################
def GetCloudPluginWrapper():
    return MyCloud()

######################################################################
## This is the function that Deadline calls when the cloud plugin is
## no longer in use so that it can get cleaned up.
######################################################################
def CleanupCloudPlugin( deadlinePlugin ):
    deadlinePlugin.Cleanup()

######################################################################
## This is the main CloudPluginWrapper class for MyCloud.
######################################################################
class MyCloud (CloudPluginWrapper):

    def __init__( self ):
        # Set up our callbacks for cloud control
        self.VerifyAccessCallback += self.VerifyAccess
        self.AvailableHardwareTypesCallback += self.GetAvailableHardwareTypes
        self.AvailableOSImagesCallback += self.GetAvailableOSImages
        self.CreateInstancesCallback += self.CreateInstances
        self.TerminateInstancesCallback += self.TerminateInstances
        self.CloneInstanceCallback += self.CloneInstance
        self.GetActiveInstancesCallback += self.GetActiveInstances
        self.StopInstancesCallback += self.StopInstances
        self.StartInstancesCallback += self.StartInstances
        self.RebootInstancesCallback += self.RebootInstances

    def Cleanup( self ):
        # Clean up our callbacks for cloud control
        del self.VerifyAccessCallback
        del self.AvailableHardwareTypesCallback
        del self.AvailableOSImagesCallback
        del self.CreateInstancesCallback
        del self.TerminateInstancesCallback
        del self.CloneInstanceCallback
        del self.GetActiveInstancesCallback
        del self.StopInstancesCallback
        del self.StartInstancesCallback
        del self.RebootInstancesCallback

    def VerifyAccess( self ):
        # TODO: Return True if connection to cloud provider can be verified.
        pass

    def GetAvailableHardwareTypes( self ):
        # TODO: Return list of HardwareType objects representing the hardware
        # types supported by this provider.
        # Must be implemented for the Balancer to work.
        pass

    def GetAvailableOSImages( self ):
        # TODO: Return list of OSImage objects representing the OS images
        # supported by this provider.
        # Must be implemented for the Balancer to work.
        pass

    def GetActiveInstances( self ):
        # TODO: Return list of CloudInstance objects that are currently active.
        pass

    def CreateInstances( self, hardwareID, imageID, count ):
        # TODO: Start instances and return list of CloudInstance objects that
        # have been started.
        # Must be implemented for the Balancer to work.
        pass

    def CloneInstance( self, instance, count ):
        # TODO: Return list of CloudInstance objects cloned from the
        # given instance. (This method must exist because the constructor
        # hooks it up to CloneInstanceCallback.)
        pass

    def TerminateInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # terminated successfully.
        # Must be implemented for the Balancer to work.
        pass

    def StopInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # stopped successfully.
        pass

    def StartInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # started successfully.
        pass

    def RebootInstances( self, instanceIDs ):
        # TODO: Return list of boolean values indicating which instances
        # rebooted successfully.
        pass
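Callbacks such as TerminateInstances, StopInstances, StartInstances, and RebootInstances return one boolean per requested instance ID, in the same order as instanceIDs. A small helper (illustrative only, not part of the Deadline API) that maps a provider's list of successfully processed IDs back onto the requested order:

```python
def per_instance_results( requested_ids, succeeded_ids ):
    # One boolean per requested ID, in the order the IDs were requested.
    succeeded = set( succeeded_ids )
    return [ instance_id in succeeded for instance_id in requested_ids ]

flags = per_instance_results( [ "i-001", "i-002", "i-003" ], [ "i-001", "i-003" ] )
# flags == [True, False, True]
```

Keeping the result order aligned with the request order matters, because the caller matches the booleans to the instance IDs by position.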
The param File
The MyCloud.param file is an optional file that is used by the Cloud Provider Configuration dialog in the Monitor. It
declares properties that the Monitor uses to generate a user interface for modifying settings for this provider, which are
then stored in the database. After you’ve created this file, open the Monitor and enter Super User mode. Then select
Tools -> Configure Cloud Providers and click the Add button under the Cloud Region box to see your cloud plugin.
The file might look something like:
[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this cloud plug-in should be enabled.
[AccessID]
Type=string
Category=Options
CategoryOrder=0
Index=1
Label=Access ID
Default=
Description=Your Cloud Provider Access ID.
[SecretKey]
Type=password
Category=Options
CategoryOrder=0
Index=2
Label=Secret Key
Default=
Description=Your Cloud Provider Secret Key.
Comment lines are supported in the param file, and must start with either ‘;’ or ‘#’. For example:
# This is a comment about this Enabled property.
[Enabled]
Type=boolean
Label=Enabled
Default=True
Description=If this cloud plug-in should be enabled.
The available key=value pairs for the properties defined here are:
Category: The category the control should go under.
CategoryIndex: This determines the control's order under its category. This does the same thing as Index.
CategoryOrder: This determines the category's order among other categories. If more than one CategoryOrder is defined for the same category, the lowest value is used.
Default: The default value to be used if this property is not defined in the dlinit file. This does the same thing as DefaultValue.
DefaultValue: The default value to be used if this property is not defined in the dlinit file. This does the same thing as Default.
Description: A short description of the property the control is for (displayed as a tooltip in the UI).
DisableIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as IgnoreIfBlank.
IgnoreIfBlank: If True, a control will not be shown if this property is not defined in the dlinit file (True/False). This does the same thing as DisableIfBlank.
Index: This determines the control's order under its category. This does the same thing as CategoryIndex.
Label: The control label.
Required: If True, a control will be shown for this property even if it's not defined in the dlinit file (True/False).
Type: The type of control (see table below).
These are the available controls.
Boolean: A drop-down control that allows the selection of True or False.
Color: Allows the selection of a color.
Enum: A drop-down control that allows the selection of an item from a list.
Enumeration: Same as Enum above.
Filename: Allows the selection of an existing file.
FilenameSave: Allows the selection of a new or existing file.
Float: A floating point spinner control.
Folder: Allows the selection of an existing folder.
Integer: An integer spinner control.
Label: A read-only text field.
MultiFilename: Allows the selection of multiple existing files, which are then separated by semicolons in the text field.
MultiLineMultiFilename: Allows the selection of multiple existing files, which are then placed on multiple lines in the text field.
MultiLineMultiFolder: Allows the selection of multiple existing folders, which are then placed on multiple lines in the text field.
MultiLineString: A text field with multiple lines.
Password: A text field that masks the text.
SlaveList: Allows the selection of existing Slaves, which are then separated by commas in the text field.
String: A text field.
There are also key/value pairs for specific controls:
DecimalPlaces: The number of decimal places for the Float controls.
Filter: The filter string for the Filename, FilenameSave, or MultiFilename controls.
Increment: The value to increment the Integer or Float controls by.
Items: The semicolon separated list of items for the Enum control. This does the same thing as Values.
Maximum: The maximum value for the Integer or Float controls.
Minimum: The minimum value for the Integer or Float controls.
Validator: A regular expression for the String control that is used to ensure the value is valid.
Values: The semicolon separated list of items for the Enum control. This does the same thing as Items.
7.5 Balancer Plugins
7.5.1 Overview
Balancer plugins can be created to customize the algorithm logic for the Balancer application. Balancer plugins are written in Python, which means that they can easily be created and customized. You can also refer to the default plugin in the Repository's balancer folder for a full example of how it works. See the Scripting Overview documentation for more information, and links to the Deadline Scripting reference.
7.5.2 Creating a Balancer Plug-in
To create a custom balancer plug-in, you start by creating a folder in the Repository's custom\balancer folder and giving it the name of your balancer plug-in. See the Scripting Overview documentation for more information on the 'custom' folder in the Repository and how it's used.
For the sake of this document, we will call our new balancer plug-in MyBalancerAlgorithm. All related script and configuration files for this balancer plug-in are to be placed in this folder.
The py File
The first required file is MyBalancerAlgorithm.py, which is the main balancer plugin script. It defines the BalancerPluginWrapper class that contains all the necessary callbacks that will be used during a balancer cycle. The template
for this script file might look like this:
from Deadline.Balancer import *

###########################################################################
## This is the function that Deadline calls to get an instance of the
## main BalancerPluginWrapper class.
###########################################################################
def GetBalancerPluginWrapper():
    return MyBalancerAlgorithm()

###########################################################################
## This is the main BalancerPluginWrapper class for MyBalancerAlgorithm.
###########################################################################
class MyBalancerAlgorithm (BalancerPluginWrapper):
    # TODO: Place code here instead of "pass"
    pass
The GetBalancerPluginWrapper() function is important, as it allows Deadline to get an instance of our MyBalancerAlgorithm class (which is extending the abstract BalancerPluginWrapper class). If this function isn’t defined, Deadline
will report an error when it tries to load the balancer plug-in. Notice that we’re importing the Deadline.Balancer
namespace so that we can access the BalancerPluginWrapper class.
The MyBalancerAlgorithm class will need to implement the BalancerAlgorithm callback so that Deadline can know
how to balance your farm, and these callbacks must be hooked up in the MyBalancerAlgorithm constructor.
After implementing a few functions, your MyBalancerAlgorithm.py script file might look something like this:
from Deadline.Balancer import *

###########################################################################
## This is the function that Deadline calls to get an instance of the
## main BalancerPluginWrapper class.
###########################################################################
def GetBalancerPluginWrapper():
    return MyBalancerAlgorithm()

###########################################################################
## This is the main DeadlineBalancerListener class for MyBalancerAlgorithm.
###########################################################################
class MyBalancerAlgorithm( BalancerPluginWrapper ):
    def __init__( self ):
        self.BalancerAlgorithmCallback += self.BalancerAlgorithm

    def BalancerAlgorithm( self, stateStruct ):
        # TODO: Return a target struct to the Balancer.
        pass
Here’s what a BalancerTargetStruct looks like:
/// <summary>
/// The BalancerTargetStruct indicates the ideal number of VM instances that should
/// be running in each enabled Group of each CloudRegion. The BalancerTargetStruct
/// is populated by a Balancer Logic Plug-in.
/// </summary>
public class BalancerTargetStruct
{
    public BalancerTargetStruct() { }

    // Logic plug-in can set this to true to indicate that an error occurred.
    public bool ErrorEncountered;

    // Logic plug-in can convey an error message here
    // (ErrorEncountered should be set to true).
    public string ErrorMessage;

    // Logic plug-in can convey a non-error message here.
    public string Message;

    // An array of cloud region targets.
    public CloudRegionTargetStruct[] CloudRegionTargets;

    // The time the structure was filled.
    public DateTime Time;
}
public class CloudRegionTargetStruct
{
    public CloudRegionTargetStruct() { }

    // The unique ID of the region.
    public string RegionID;

    // An array of Group targets.
    public GroupTargetStruct[] GroupTargets;
}

public class GroupTargetStruct
{
    public GroupTargetStruct() { }

    public GroupTargetStruct(string Name, int Count)
    {
        this.Name = Name;
        this.Count = Count;
    }

    // The name of the group.
    public string Name;

    // The target number of VM instances for the group.
    public int Count;
}
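To make the nesting concrete, here is a plain-Python mirror of how a filled-in target might look. The classes below are illustrative stand-ins, not the real .NET types, and the region ID and group name are invented:

```python
# Plain-Python stand-ins for the .NET structs above, for illustration only;
# these are NOT the real Deadline.Balancer types.
class GroupTarget:
    def __init__( self, name, count ):
        self.Name = name    # the name of the group
        self.Count = count  # the target number of VM instances

class CloudRegionTarget:
    def __init__( self, regionId, groupTargets ):
        self.RegionID = regionId          # the unique ID of the region
        self.GroupTargets = groupTargets  # list of GroupTarget objects

class BalancerTarget:
    def __init__( self, regionTargets ):
        self.ErrorEncountered = False
        self.ErrorMessage = ""
        self.Message = ""
        self.CloudRegionTargets = regionTargets

# Request 4 instances of a hypothetical "cloud_group" group in a made-up region.
target = BalancerTarget( [ CloudRegionTarget( "us-east-1", [ GroupTarget( "cloud_group", 4 ) ] ) ] )
```

A real BalancerAlgorithm callback would build and return the equivalent BalancerTargetStruct instance for each enabled Group of each CloudRegion.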
The param File
The MyBalancerAlgorithm.param file is an optional file that is used in the Balancer Settings panel of the Repository
Options dialog in the Monitor. It declares properties that the Monitor uses to generate a user interface for modifying
settings for this algorithm, which are then stored in the database. After you've created this file, open the Monitor
and enter Super User mode. Then select Tools -> Repository Options -> Balancer Settings and choose
MyBalancerAlgorithm from the dropdown to see your settings. Comment lines are supported in the param file, and must start with either
‘;’ or ‘#’.
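For illustration, a minimal MyBalancerAlgorithm.param file might look like the following. The setting and its values are hypothetical, and the keys shown (Type, Label, Category, Default, Description) follow the convention used by the param files that ship with Deadline; treat this as a sketch rather than a definitive schema:

```ini
; Hypothetical setting for MyBalancerAlgorithm
[MaxInstances]
Type=integer
Label=Maximum VM Instances
Category=Limits
Default=10
Description=The maximum total number of VM instances the algorithm will target.
```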
The dlinit File
The last required file is MyBalancerAlgorithm.dlinit, which is the main configuration file for this plugin. It is a plain
text file that defines a few general key=value plug-in properties, which include:
Key              Description
Name             The name of the plug-in.
About            A short description of the plug-in.
ConcurrentTasks  Set to True or False (default is False). If tasks for this plug-in can render
                 concurrently without interfering with each other, this can be set to True.
DebugLogging     Set to True or False (default is False). If set to True, then debug plug-in
                 logging will be printed out during rendering.
DeprecatedMode   Set to True or False (default is False). Only set to True if you want a custom
                 Python.NET plug-in from Deadline 5.1 or 5.2 to work with Deadline 6 or later.
                 More information on DeprecatedMode can be found later on.
It can also define key=value custom settings to be used by the plug-in. For this example, our MyBalancerAlgorithm.dlinit file might look like this:
About=My Example Plugin for Deadline
SomeSortOfScript=c:\path\to\my\script.py
Comment lines are supported in the dlinit file, and must start with either ‘;’ or ‘#’.
7.6 Monitor Scripts
7.6.1 Overview
There are several different types of Monitor scripts available. While the large majority of the ones shipping with
Deadline are Submission Scripts used to submit new Jobs to the farm, the Monitor has the capability of running utility
scripts in the context of specific Jobs, Tasks, Slaves, Limits, or even Reports.
Below, we go into more detail for each of the different types of Scripts, and how to create your own.
7.6.2 Scripting Reference
As with all other Deadline scripts, Monitor scripts use Python 2.7, which is supported using Python for .NET. This
means that in addition to typical cPython modules, Python for .NET allows your scripts to make use of .NET Libraries,
and Deadline’s own internal functions.
The full Deadline Scripting Reference can be downloaded in CHM or PDF format from the Deadline Downloads page.
Particular functions of note relevant to Monitor Scripting can be found in the aforementioned Scripting Reference,
under the following sections:
• Deadline.Scripting.MonitorUtils
• Deadline.Scripting.JobUtils
• Deadline.Scripting.SlaveUtils
It can also be very helpful when developing your own Monitor Script to take a look at how our built-in Monitor Scripts
of that type are structured.
7.6.3 General Script Template
We follow a fairly specific template when making any new built-in Monitor scripts. The template is loosely as follows:
• Define your __main__ function: This is the function that Deadline will call when invoking your script. This is
mandatory, and your script will generate an error if it isn’t done.
def __main__( *args ):
    # Replace "pass"
    pass
• Build the submission UI: Typically done in the __main__ function by creating a ScriptDialog object, and
adding controls to it. Each control’s name must be unique, so that each control can be identified properly. You
can also set the dialog’s size (if not using a grid layout), the row and column (if using a grid layout), title, and a
few other settings. For more details, see the ScriptDialog and ScriptControl sections of the Reference Manual.
For an example on how to use the grid layout see the Grid Layout Example Script documentation.
– Define and Load Sticky Settings: Sticky settings are settings that persist after the dialog has been closed.
They are defined by creating a string array that contains the names of the controls for which you want the
settings to persist. After defining them, you can load them by calling the ‘LoadSettings’ function of your
ScriptDialog.
– Show the Dialog: The last thing you should do in your __main__ function is to show your ScriptDialog,
by using its ‘ShowDialog’ function.
• Define Your Functions: Specify any functions that may be used by your script. These could just be helper
functions, or event handlers that do stuff when UI values are modified.
Note that you don’t necessarily need to follow this template, but the closer you stick to it, the more examples you’ll
have to draw on.
7.6.4 Monitor Scripts
There are many different types of scripts you can write for the Monitor, which are listed below. It is recommended that
these scripts be created in the ‘custom’ folder in the Repository to avoid issues when upgrading your Repository in the
future. See the Scripting Overview documentation for more information on the ‘custom’ folder in the Repository and
how it’s used.
When a Monitor script is executed, the log will show where the script is being loaded from.
Submission Scripts
Submission Scripts are used to create custom Submission dialogs, and ultimately submit new Jobs to Deadline. They
are located in the ‘Submit’ menu of the Monitor’s main menu bar, as well as the ‘Submit’ menu in the Launcher.
Creating your own custom Submission dialog is quite simple, and the process is described below.
To create new submission scripts, simply navigate to the ‘custom\scripts\Submission’ folder in your Repository. Then,
create a new Python file named ‘MySubmissionScript.py’, where ‘MySubmissionScript’ is the name of your new
script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
General Scripts
General scripts are used to perform any sort of custom action by selecting them from the Monitor’s (or Launcher’s)
‘Scripts’ menu. Under the hood, there technically isn’t anything different between General and Submission scripts.
The only real difference is that they show up under different menus, which is just to help keep scripts semantically
separated.
To create new General scripts, simply navigate to the ‘custom\scripts\General’ folder in your Repository. Then, create
a new Python file named ‘MyGeneralScript.py’, where ‘MyGeneralScript’ is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Job Scripts
Job Scripts are typically used to modify or to perform actions on a selected Job in the Monitor. They can be accessed
by right-clicking an existing Job in the Job Panel, under the ‘Scripts’ sub-menu.
To create new Job scripts, simply navigate to the ‘custom\scripts\Jobs’ folder in your Repository. Then, create a new
Python file named ‘MyJobScript.py’, where ‘MyJobScript’ is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Task Scripts
Task Scripts are typically used to modify or to perform actions on a selected Task in the Monitor. They can be accessed
by right-clicking an existing Task in the Task Panel, under the ‘Scripts’ sub-menu.
To create new Task scripts, simply navigate to the ‘custom\scripts\Tasks’ folder in your Repository. Then, create a
new Python file named ‘MyTaskScript.py’, where ‘MyTaskScript’ is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Slave Scripts
Slave Scripts are typically used to modify or to perform actions on a selected Slave in the Monitor. They can be
accessed by right-clicking an existing Slave in the Slave Panel, under the ‘Scripts’ sub-menu.
To create new Slave scripts, simply navigate to the ‘custom\scripts\Slaves’ folder in your Repository. Then, create a
new Python file named ‘MySlaveScript.py’, where ‘MySlaveScript’ is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Pulse Scripts
Pulse Scripts are typically used to modify or to perform actions on a selected Pulse in the Monitor. They can be
accessed by right-clicking an existing Pulse in the Pulse Panel, under the ‘Scripts’ sub-menu.
To create new Pulse scripts, simply navigate to the ‘custom\scripts\Pulse’ folder in your Repository. Then, create a
new Python file named ‘MyPulseScript.py’, where ‘MyPulseScript’ is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Balancer Scripts
Balancer Scripts are typically used to modify or to perform actions on a selected Balancer in the Monitor. They can
be accessed by right-clicking an existing Balancer in the Balancer Panel, under the ‘Scripts’ sub-menu.
To create new Balancer scripts, simply navigate to the ‘custom\scripts\Balancer’ folder in your Repository. Then,
create a new Python file named ‘MyBalancerScript.py’, where ‘MyBalancerScript’ is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Limit Scripts
Limit Scripts are typically used to modify or to perform actions on selected Limits in the Monitor. They can be
accessed by right-clicking an existing Limit in the Limit Panel, under the ‘Scripts’ sub-menu.
To create new Limit scripts, simply navigate to the ‘custom\scripts\Limits’ folder in your Repository. Then, create a
new Python file named ‘MyLimitScript.py’, where ‘MyLimitScript’ is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Job Report Scripts
Job Report Scripts are typically used to modify or to perform actions on selected Job Reports in the Monitor. They
can be accessed by right-clicking an existing Job Report in the Job Report Panel, under the ‘Scripts’ sub-menu.
To create new Job Report scripts, simply navigate to the ‘custom\scripts\JobReports’ folder in your Repository. Then,
create a new Python file named ‘MyJobReportScript.py’, where ‘MyJobReportScript’ is the name of your new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
Slave Report Scripts
Slave Report Scripts are typically used to modify or to perform actions on selected Slave Reports in the Monitor. They
can be accessed by right-clicking an existing Slave Report in the Slave Report Panel, under the ‘Scripts’ sub-menu.
To create new Slave Report scripts, simply navigate to the ‘custom\scripts\SlaveReports’ folder in your Repository.
Then, create a new Python file named ‘MySlaveReportScript.py’, where ‘MySlaveReportScript’ is the name of your
new script.
Once created, you can follow the template outlined above in the General Script Template section to build up your
script.
7.6.5 Customizing Script Display
As with any built-in script, once you’ve created your new Monitor Script you can change its Display Name, Keyboard
Shortcut, Icon, and its position within the menu in the Repository Configuration.
You can also control who can see (and use) your Submission Script by tweaking its access level in User
Management. It is probably a good idea to disable access to it for most users until you have your new script in
working order.
7.6.6 Grid Layout Example Script
Grid layouts allow your script dialog to dynamically resize its contents to fit the size of the dialog. Below are some
examples of how to use the grid layout to build a script dialog.
First you must create a ScriptDialog object and start a grid. Once all controls have been added, you must end the grid:
dg = DeadlineScriptDialog()
dg.AddGrid()
#...
#Added controls go here
#...
dg.EndGrid()
Once you start a grid you can add controls to it by row and column. There is no need to specify how many rows or
columns you want the grid to have, just specify the row and column where you want the control to be and the grid will
grow to accommodate. Here is an example of adding a label and a text field to the dialog in the same row.
dg.AddGrid()
dg.AddControlToGrid("Label1", "LabelControl", "I'm a label.", 0,0, "A tooltip", False)
dg.AddControlToGrid( "TextBox1", "TextControl", "", 0, 1 )
dg.EndGrid()
Here is an example of what this dialog would look like:
It is not possible to specify the size of the controls you want to add to the grid; however, it is also not necessary to do
so. The contents of the grid(s) will automatically adjust themselves to share the size of the dialog. If you want certain
elements to not grow within a row, you can disable their “expand” property. If you want a control to take more
space, you can set the control to span multiple rows or columns using “rowSpan” and “colSpan”, respectively. By default,
controls have “expand” set and have their “colSpan” and “rowSpan” properties set to 1.
This is an example of a dialog with two rows and four columns. The first row contains a label in the first column,
which is set not to grow any bigger than it needs to, and a text control that spans the next 3 columns and is allowed to grow.
The second row contains three labels that are not allowed to grow in the first three columns, and a text control in the
fourth column that can grow as needed.
dg.AddGrid()
dg.AddControlToGrid( "L1", "LabelControl", "I'm a label.", 0, 0, "A tooltip", expand=False )
dg.AddControlToGrid( "TextBox1", "TextControl", "", 0, 1, colSpan=3 )
dg.AddControlToGrid( "L2", "LabelControl", "I'm another label.", 1, 0, "A tooltip", expand=False )
dg.AddControlToGrid( "L3", "LabelControl", "I'm another label.", 1, 1, "A tooltip", expand=False )
dg.AddControlToGrid( "L4", "LabelControl", "I'm another label.", 1, 2, "A tooltip", expand=False )
dg.AddControlToGrid( "TextBox2", "TextControl", "", 1, 3 )
dg.EndGrid()
Here is an example of what this dialog would look like:
When you expand the dialog horizontally, only the text controls will grow in the above example. Nothing will grow,
other than the dialog itself, when expanding vertically. Note that if you set all controls in a row to not expand, the
cells in the grid that contain the controls will still expand without allowing any of the controls to expand with them.
This will cause the dialog to lose its layout when it is expanded.
Here is an example of what this dialog would look like expanded horizontally:
Here is an example of what this dialog would look like expanded vertically:
Here is an example of what the dialog would look like expanded horizontally if all controls had “expand=False” set.
If you want to space controls out in the grid you can use labels filled with white space, or you can use horizontal
spacers. Here is an example of adding two buttons to a dialog and keeping them to the far right of the dialog.
dg.AddGrid()
dg.AddHorizontalSpacerToGrid( "DummyLabel", 0, 0 )
ok = dg.AddControlToGrid( "Ok", "ButtonControl", "OK", 0, 1, expand=False )
ok.ValueModified.connect( OkButtonPressed )
cancel = dg.AddControlToGrid( "Cancel", "ButtonControl", "Cancel", 0, 2, expand=False )
cancel.ValueModified.connect( CancelButtonPressed )
dg.EndGrid()
Here is an example of what this dialog will look like when expanded horizontally:
All together, here is an example of a basic script dialog using grid layouts.
from DeadlineUI.Controls.Scripting.DeadlineScriptDialog import DeadlineScriptDialog

########################################################################
## Globals
########################################################################
dg = None

########################################################################
## Main Function Called By Deadline
########################################################################
def __main__( *args ):
    global dg

    dg = DeadlineScriptDialog()
    dg.SetTitle( "Example Deadline Script" )

    dg.AddGrid()
    dg.AddControlToGrid( "L1", "LabelControl", "I'm a label.", 0, 0, "A tooltip", expand=False )
    dg.AddControlToGrid( "TextBox1", "TextControl", "", 0, 1, colSpan=3 )
    dg.AddControlToGrid( "L2", "LabelControl", "I'm another label.", 1, 0, "A tooltip", expand=False )
    dg.AddControlToGrid( "L3", "LabelControl", "I'm another label.", 1, 1, "A tooltip", expand=False )
    dg.AddControlToGrid( "L4", "LabelControl", "I'm another label.", 1, 2, "A tooltip", expand=False )
    dg.AddControlToGrid( "TextBox2", "TextControl", "", 1, 3 )
    dg.EndGrid()

    # Adds an OK and Cancel button to the dialog
    dg.AddGrid()
    dg.AddHorizontalSpacerToGrid( "DummyLabel", 0, 0 )
    ok = dg.AddControlToGrid( "Ok", "ButtonControl", "OK", 0, 1, expand=False )
    ok.ValueModified.connect( OkButtonPressed )
    cancel = dg.AddControlToGrid( "Cancel", "ButtonControl", "Cancel", 0, 2, expand=False )
    cancel.ValueModified.connect( CancelButtonPressed )
    dg.EndGrid()

    dg.ShowDialog( True )

def CloseDialog():
    global dg
    dg.CloseDialog()

def CancelButtonPressed():
    CloseDialog()

def OkButtonPressed( *args ):
    global dg
    dg.ShowMessageBox( "You pressed the OK button.", "Button Pressed" )
Here is what this dialog looks like:
7.6.7 Migrating Scripts from Deadline 5
Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with scripts written for Deadline 5. However, migrating your scripts over is relatively straightforward, and
this guide will walk you through the API changes so that you can update your scripts as necessary.
Global Functions
The globally defined functions are no longer available. See the Migrating Scripts From Deadline 5 section in the
Scripting Overview documentation for more information, including replacement functions.
User Interface
If you are creating a user interface using the ScriptDialog object, you can no longer get an instance of it from DeadlineScriptEngine using the following:
scriptDialog = DeadlineScriptEngine.GetScriptDialog()
Instead, you need to import the DeadlineScriptDialog class, and use its constructor to create an instance:
from DeadlineUI.Controls.Scripting.DeadlineScriptDialog import DeadlineScriptDialog
...
scriptDialog = DeadlineScriptDialog()
Another change is how the ValueModified event handlers are hooked up for the ScriptDialog controls. For example,
this is how the event was hooked up in Deadline 5:
compBox = scriptDialog.AddControl(
    "CompBox", "TextControl", "", dialogWidth-labelWidth-24, -1 )
compBox.ValueModified += CompChanged
Now, because the ScriptDialog object is a Qt object, you need to use the connect function to hook up events:
compBox = scriptDialog.AddControl(
    "CompBox", "TextControl", "", dialogWidth-labelWidth-24, -1 )
compBox.ValueModified.connect( CompChanged )
The File Browser based controls have also changed their file filter syntax. In Deadline 5, the file filter syntax looked
like this:
scriptDialog.AddRow()
scriptDialog.AddControl(
    "FileLabel", "LabelControl", "Select File", labelWidth, -1 )
scriptDialog.AddSelectionControl( "FileBox", "FileBrowserControl", "",
    "All Files (*.*)|*.*|CAD Files: JT (*.jt)|*.jt", dialogWidth-labelWidth-24, -1 )
scriptDialog.EndRow()
Now, because the ScriptDialog object is a Qt object, you need to use the following syntax to filter files in any of the
browser controls. Note that the “|” separator is replaced with ”;;”, and there is no longer a requirement to provide
a file extension filter for each file format entry, as the filter is taken from the text label, e.g. (*.txt) or (*.*), as in
the example below:
scriptDialog.AddRow()
scriptDialog.AddControl(
    "FileLabel", "LabelControl", "Select File", labelWidth, -1 )
scriptDialog.AddSelectionControl( "FileBox", "FileBrowserControl", "",
    "Text Files (*.txt);;All Files (*.*)", dialogWidth-labelWidth-24, -1 )
scriptDialog.EndRow()
7.7 Job Scripts
7.7.1 Overview
Job scripts and Dependency scripts can use Python to implement additional automation. Job scripts can be used to
perform additional tasks during rendering, and Dependency scripts can control when jobs start rendering.
Note that because the Python scripts will be executed in a non-interactive way, it is important that your scripts do not
contain any blocking operations like infinite loops, or interfaces that require user input. See the Scripting Overview
documentation for more information, and links to the Deadline Scripting reference.
7.7.2 Job Scripts
Job scripts can be assigned to Jobs in order to automate certain tasks before a Job starts rendering (Pre-Job Script),
after a Job finishes rendering (Post-Job Script), or before and after each individual Job Task is completed (Pre-
and Post-Task Scripts).
After you create your scripts, you can assign them to a Job by right-clicking on the desired Job in the Monitor, and
selecting ‘Modify Job Properties’. The script options can be found under the ‘Scripts’ section of the Job Properties
window. In addition to this, Job scripts can be specified by custom submitters by including them in the Job Info File
on submission. Note that a full path to the script is required, so it is recommended that the script file be stored in a
location that is accessible to all Slaves.
Creating Job Scripts
The only requirement for a Job script is that you define a __main__ function. This is the function that will be called
by Deadline when it comes time to execute the script, and an instance of the DeadlinePlugin object will be passed as
a parameter.
def __main__( *args ):
    # Replace "pass"
    pass
A common use for Post-Task scripts is to do some processing with the output image files. Here is a sample script that
demonstrates how to get the output file names for the current task, and print them out to the render log:
import re

from System.IO import *
from Deadline.Scripting import *

def __main__( *args ):
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()

    outputDirectories = job.OutputDirectories
    outputFilenames = job.OutputFileNames

    paddingRegex = re.compile( "[^\\?#]*([\\?#]+).*" )

    for i in range( 0, len(outputDirectories) ):
        outputDirectory = outputDirectories[i]
        outputFilename = outputFilenames[i]

        startFrame = deadlinePlugin.GetStartFrame()
        endFrame = deadlinePlugin.GetEndFrame()
        for frameNum in range( startFrame, endFrame+1 ):
            outputPath = Path.Combine( outputDirectory, outputFilename )
            outputPath = outputPath.replace( "//", "/" )

            m = re.match( paddingRegex, outputPath )
            if( m != None ):
                padding = m.group( 1 )
                frame = StringUtils.ToZeroPaddedString( frameNum, len(padding), False )
                outputPath = outputPath.replace( padding, frame )

            deadlinePlugin.LogInfo( "Output file: " + outputPath )
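The script above depends on Deadline’s runtime (DeadlinePlugin, StringUtils), so it cannot run on its own, but the padding substitution at its core is plain Python. Here is a standalone sketch of the same idea; the function name is ours for illustration, not a Deadline API:

```python
import re

# Matches the first run of '?' or '#' padding characters in a path.
PADDING_REGEX = re.compile( "[^\\?#]*([\\?#]+).*" )

def fill_padding( outputPath, frameNum ):
    # Replace the padding run with the zero-padded frame number, mirroring
    # what StringUtils.ToZeroPaddedString does in the Post-Task script above.
    m = PADDING_REGEX.match( outputPath )
    if m is None:
        return outputPath
    padding = m.group( 1 )
    return outputPath.replace( padding, str( frameNum ).zfill( len( padding ) ) )

print( fill_padding( "renders/beauty_####.exr", 37 ) )  # renders/beauty_0037.exr
```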
7.7.3 Dependency Scripts
Dependency scripts can be used to control when a job starts rendering. For example, the script could connect to an
internal pipeline database to see if the job has been approved to start rendering.
After you create your dependency scripts, you can assign them to a Job by right-clicking on the desired Job in the
Monitor, and selecting ‘Modify Job Properties’. The Script Dependencies options can be found under the ‘Scripts’
section of the Job Properties window. In addition to this, Job scripts can be specified by custom submitters by including
them in the Job Info File on submission. Note that a full path to the script is required, so it is recommended that the
script file be stored in a location that is accessible to all Slaves.
Creating Dependency Scripts
The only requirement for a Dependency script is that you define a __main__ function. This is the function that will be called
by Deadline when it comes time to execute the script to determine if a job should be released or not.
For jobs without Frame Dependencies enabled, only the job ID will be passed as a parameter. The __main__ function
should then return True if the job should be released or False if it shouldn’t be.
For jobs with Frame Dependencies enabled, the job ID will be passed as the first parameter, and a list of pending task
IDs will be passed as the second parameter. The __main__ function should then return the list of task IDs that should
be released, or an empty list if none should be released.
Here is a very simple example that will work regardless of whether Frame Dependencies are enabled or not:
def __main__( jobId, taskIds=None ):
    if not taskIds:
        # Frame Dependencies are disabled
        releaseJob = False

        # figure out if the job should be released
        return releaseJob
    else:
        # Frame Dependencies are enabled
        tasksToRelease = []

        # figure out which tasks should be released, and append their IDs to the array
        return tasksToRelease
Giving the taskIds parameter a default of None allows the script to function regardless of whether Frame Dependencies are
enabled or not. You can check if “taskIds” is None, and if it is, you know that Frame Dependencies are disabled.
7.7.4 Migrating Scripts from Deadline 5
Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with scripts written for Deadline 5. However, migrating your scripts over is relatively straightforward, and
this guide will walk you through the API changes so that you can update your scripts as necessary.
The only significant change is that the globally defined functions are no longer available. See the Migrating Scripts
From Deadline 5 section in the Scripting Overview documentation for more information, including replacement functions.
7.8 Web Service Scripts
7.8.1 Overview
Web service scripts allow you to retrieve data from Deadline and display it in any way you see fit. See the Web Service
documentation for more information on Deadline’s web service feature and how you can use it to call scripts and commands.
7.8.2 Creating Web Service Scripts
Custom web service scripts can be created in the ‘custom\scripts\WebService’ folder in your repository. See the
Scripting Overview documentation for more information on the ‘custom’ folder in the Repository and how it’s used.
Just place any new scripts directly into this folder, and they will be available to the Web Service. Script file names
should not contain any spaces, and must end in a ‘.py’ extension (i.e., they must be Python scripts).
The __main__ Function
All web service scripts must define a __main__ function that accepts *args (a tuple containing 2 items). This is the
function that will be called when the web service executes the script. Note that if you decide not to accept args, and an
argument string is passed to your script in the URL, it will result in an exception being thrown. The function should
also return a string value, which is used to display the results. The string can be HTML, XML, plain text, etc.
def __main__( *args ):
    results = ""

    #...
    # append data to results
    #...

    return results
It is also possible for the web service script to set the HTTP status code. This can be done by including the status code
after the results in the return statement. For example:
def __main__( *args ):
    results = ""
    statusCode = "200"

    #...
    # append data to results, and set statusCode as necessary
    #...

    return results, statusCode
Finally, it is possible for the web service script to set additional headers to be included in the HTTP response. This
can be done by including an arbitrary number of “key=value” strings after the status code in the return statement. For
example:
def __main__( *args ):
    results = ""
    statusCode = "200"

    #...
    # append data to results, and set statusCode as necessary
    #...

    return results, statusCode, "header1=value1", "header2=value2"
Supporting Arguments
Arguments are passed to web service scripts as a tuple with 2 items, and can be accepted in two different ways.
The first way is to simply accept *args, which will be a tuple of length 2. The other way is to accept the tuple as two
separate variables, for instance (dlArgs, qsArgs) for Deadline arguments and query string arguments. In the first case,
args[0] is equivalent to dlArgs (the Deadline arguments), and args[1] is equivalent to qsArgs (the query string arguments).
Deadline Arguments
The web service will automatically pass your script a dictionary as the first item in the args tuple. The Dictionary
will contain at least one key (“Authenticated”), but may contain more if the user authenticated with the web service.
Currently, if the user has not authenticated, the Dictionary will only contain the “Authenticated” key, with a value
of ‘False’. However, if the user has authenticated, it will also contain the “UserName” key, with a value of the user
executing the script.
Query String Arguments
Arguments are passed to your script by a query string defined in the URL, and can be in one of the following forms:
Key/Value Pairs: This is the preferred method of passing arguments. Arguments in this form will look something like
this at the end of the URL:
?key0=value0&key1=value1
List of Values: Arguments in this form will instead look something like this:
?value0&value1
The query string will be passed to the Python script as a NameValueCollection and it will be the second item of the
tuple passed to your script’s __main__ function.
Relevant API Functions
For functions that will be relevant to most Web Service scripts, see the ‘Deadline.PulseUtils’ section of the Deadline
Scripting Reference documentation. The full Deadline Scripting Reference can be found on the Thinkbox Software
Documentation Website. Offline PDF and HTML versions can be downloaded from here as well.
7.8.3 Calling Web Service Scripts
Once the script has been created, you can call it using the web service. See the Web Service Documentation for more
information on how to set this up. For example, if you have a Web Service script called ‘GetFarmStatistics.py’, you
would call it using the following URL (where [myhost] is the hostname pointing to your web service machine):
http://[myhost]:8080/GetFarmStatistics
Some scripts can take arguments, as detailed in the previous section. To include arguments, you need to place a ‘?’
between the base URL and the first argument, with ‘&’ separating additional arguments. Here is an example of how you
would pass ‘arg1’, ‘arg2’, and ‘arg3’ as a list of arguments to the GetFarmStatistics.py script:
http://[myhost]:8080/GetFarmStatistics?arg1&arg2&arg3
Here is an example of how you would pass values for arguments named ‘arg1’, ‘arg2’, and ‘arg3’ in the form of
key-value pairs:
http://[myhost]:8080/GetFarmStatistics?arg1=value1&arg2=value2&arg3=value3
The way the results of the script are displayed depends entirely on the format in which the script returns them.
7.8.4 Migrating Scripts from Deadline 5
Some changes were made to the Scripting API in Deadline 6, which means that Deadline 6 and later are NOT backward
compatible with scripts written for Deadline 5. However, migrating your scripts over is relatively straightforward, and
this guide will walk you through the API changes so that you can update your scripts as necessary.
The only significant change is that the globally defined functions are no longer available. See the Migrating Scripts
From Deadline 5 section in the Scripting Overview documentation for more information, including replacement functions.
7.9 Standalone Python API
7.9.1 Overview
The Standalone Python API is essentially a Python wrapper around our RESTful HTTP API (documented in the REST Overview), and can be used to communicate with the Web Service from Python. In order to use the HTTP API you must have the Web Service running on a machine whose address and port number you know. For a list of the API’s functions and how they are used, go to the Deadline Downloads page and download the documentation.
Note that all communication with Deadline travels through the machine running the Web Service, not the local host, which has consequences that should be considered carefully. Any file paths provided must be valid on the Web Service machine, including any differences between operating systems (for example, if your local host is running Windows but the Web Service machine is running Linux). When submitting a job, the job’s username will be the user account currently running the Web Service, NOT the local user submitting the job, unless a UserName is provided in the job info.
7.9.2 Set-up
In order to use the Standalone Python API you must have Python 2.7 or later installed. Copy the “Deadline” folder containing the Standalone Python API from \\your\repository\api\python to the “site-packages” folder of your Python installation, and the API is ready to use.
7.9.3 Using the API
A DeadlineCon object must be created to send requests to, and receive responses from, the web service. First enter “import Deadline.DeadlineConnect as Connect”, then create your connection object with “connectionObject = Connect.DeadlineCon(‘PulseName’, PulsePortNumber)”, where ‘PulseName’ is the DNS name or IP address of the machine currently running the web service and ‘PulsePortNumber’ is the web service port number as configured in the Web Service settings in the Repository Options (8080 by default). The “connectionObject” variable can now be used to communicate requests to the web service.
Example: Getting group names and suspending a job
>>> from Deadline.DeadlineConnect import DeadlineCon as Connect
>>> con = Connect('PulseName', 8080)
>>> con.Groups.GetGroupNames()
[u'none', u'group1', u'group2', u'group3']
>>> jobId = validjobID
>>> con.Jobs.SuspendJob(jobId)
'Success'
Documentation for all the possible API functions can be found at the Deadline Downloads page.
7.9.4 Authenticating
If your Web Service has authentication enabled then you must set up authentication for the Python API. This can be achieved through the “EnableAuthentication” and “SetAuthenticationCredentials” functions. Setting your authentication credentials allows the Python API to use them for as long as that instance of Python is running.
>>> from Deadline.DeadlineConnect import DeadlineCon as Connect
>>> con = Connect('PulseName', 8080)
>>> con.Groups.GetGroupNames()
"Error: HTTP Status Code 401. Authentication with the Web Service failed.
Please ensure that the authentication credentials are set, are correct, and
that authentication mode is enabled."
>>> con.AuthenticationModeEnabled()
False
>>> con.EnableAuthentication(True)
>>> con.AuthenticationModeEnabled()
True
>>> con.SetAuthenticationCredentials("username", "password")
>>> con.Groups.GetGroupNames()
[u'none', u'group1', u'group2', u'group3']
By default, “SetAuthenticationCredentials” also enables authentication, so it is not actually necessary to explicitly call “EnableAuthentication” as well. If you want to store your credentials without enabling authentication, you can do so using the optional third parameter.
>>> con.SetAuthenticationCredentials("username", "password", False)
7.9.5 API Functions
All of the Standalone Python API functions return a Python dictionary, a Python list, or a Python string. Lists often
contain dictionaries.
Examples: Getting a list, a list containing dictionaries, a dictionary, and a string back.
>>> groupNames = con.Groups.GetGroupNames()
>>> groupNames[0]
group1
>>> jobs = con.Jobs.GetJobs()
>>> jobs[0]['FailedChunks']
12
>>> task = con.Tasks.GetJobTask(jobId, 0)
>>> task["Errs"]
8
>>> root = con.Repository.GetRootDirectory()
>>> root
'C:/DeadlineRepository'
Example: Getting a job, changing the pool and priority then saving it.
>>> job = con.Jobs.GetJob(jobId)
>>> str(job['Props']['Pool'])
none
>>> job['Props']['Pool'] = unicode('jobPool')
>>> str(job['Props']['Pool'])
jobPool
>>> print str(job['Props']['Pri'])
50
>>> job['Props']['Pri'] = 75
>>> str(job['Props']['Pri'])
75
>>> con.Jobs.SaveJob(job)
'Success'
>>> job = con.Jobs.GetJob(jobId)
>>> str(job['Props']['Pool']) + ' ' +str(job['Props']['Pri'])
jobPool 75
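Combining the calls above into a small utility, the sketch below suspends every job that reports failed chunks. It assumes the job dictionaries returned by GetJobs() store the job’s ID under the ‘_id’ key; verify this against the Scripting Reference before relying on it:

```python
def suspend_jobs_with_failures(con):
    """Suspend every job that reports at least one failed chunk.

    'con' is a DeadlineCon connection object as created above. The '_id'
    key is assumed to hold the job's ID in the dictionaries returned by
    GetJobs(); check the Scripting Reference to confirm.
    """
    suspended = []
    for job in con.Jobs.GetJobs():
        if job['FailedChunks'] > 0:
            result = con.Jobs.SuspendJob(job['_id'])
            if result == 'Success':
                suspended.append(job['_id'])
    return suspended
```

The function returns the IDs of the jobs it successfully suspended, so the caller can report or log them.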
Example: Submitting a ‘reserve’ VraySpawner job using Python dictionaries.
import Deadline.DeadlineConnect as Connect

if __name__ == '__main__':
    Deadline = Connect.DeadlineCon('PulseName', 8080)

    JobInfo = {
        "Name": "Submitted via Python",
        "UserName": "UserName",
        "Frames": "0-1",
        "Plugin": "VraySpawner"
    }
    PluginInfo = {
        "Version": "Max2014"
    }

    try:
        newJob = Deadline.Jobs.SubmitJob(JobInfo, PluginInfo)
        print newJob
    except:
        print "Sorry, Web Service is currently down!"
Note that when submitting a job, the JobInfo and PluginInfo dictionaries should contain ALL the KEY=VALUE pairs necessary to successfully run that plugin job type in Deadline. Because the KEY=VALUE pairs are internal and vary between application plugins, it is recommended that you submit a job normally to Deadline and then inspect the job’s Submission Params to see which KEY=VALUE pairs should be submitted for that job type. You can also use the “Export” button to take a copy of the JobInfo and PluginInfo files and submit the job using those files instead of Python dictionaries.
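Since exported JobInfo and PluginInfo files are plain KEY=VALUE lines, a small helper (a sketch, not part of the Deadline API) can turn an exported file’s contents into the dictionary that SubmitJob expects:

```python
def params_to_dict(text):
    """Parse KEY=VALUE lines, as found in exported job info files, into a dict."""
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue  # skip blank or malformed lines
        key, _, value = line.partition("=")
        result[key] = value
    return result

# Build a JobInfo dictionary from exported-file style text:
JobInfo = params_to_dict("Name=Submitted via Python\nFrames=0-1\nPlugin=VraySpawner\n")
```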
CHAPTER EIGHT: REST API
8.1 REST Overview
8.1.1 Overview
The RESTful HTTP API can be used to interact with an instance of the web service. HTTP requests can be made
to request information from the database, store new data, alter existing data or remove entries from the database.
Requests to the API can be categorized by the type of data you are attempting to access and by the type of HTTP
request you are using to access said data. In order to use the HTTP API you must have the Web Service running on a
machine whose address and port number you know.
Note that all communication with Deadline travels through the machine running the Web Service, not the local host, which has consequences that should be considered carefully. Any file paths provided must be valid on the Web Service machine, including any differences between operating systems (for example, if your local host is running Windows but the Web Service machine is running Linux). When submitting a job, the job’s username will be the user account currently running the Web Service, NOT the local user submitting the job, unless a UserName is provided in the job info.
Requests that alter data are primarily POST or PUT messages, and they typically return text stating whether they succeeded or if there was an error. Requests made to retrieve data are done using GET messages and return JavaScript Object Notation (JSON) formatted objects if successful, and text explaining the error if not. Some POST or PUT messages will return JSON objects as well, but usually only if there is information about the action that the user may need (for example, a request to create a new object may return the object’s primary key on creation). Requests made to remove data are typically done using DELETE messages and, just like POST and PUT messages, return text stating whether they succeeded or if there was an error. If an error message is returned, the HTTP Status Code will also be set to describe the error.
8.1.2 Request Types
• Jobs
• Job Reports
• Groups
• Pools
• Limits
• Repository
• Pulse
• Slaves
• Tasks
• Task Reports
• Users
• Balancer
8.1.3 Request Formats and Responses
• GET
Request for some data. These messages are constructed entirely within the URL. Successful requests
will usually return a JSON object and failed requests will return a brief error message along with the
HTTP Status Code. There are some GET requests that will return plain text for a successful request.
• PUT
Typically a request to modify some data. These messages use the URL to specify what type of
data that you wish to alter, and use the message body for storing the message to the database. The
message body must be a JSON object, although how this object must be built depends on the data
being modified. PUT messages for data that does not exist will often fail, but in some cases will act
as a POST. Successful requests will usually return text stating success. Failed requests will return
a brief error message along with the HTTP Status Code. There are some PUT messages that return
JSON objects, and this usually occurs when data has been created instead of altered.
• POST
Request to create some data. These messages use the URL to specify what type of data that you wish
to create, and use the message body for storing the message to the database. The message body must
be a JSON object, although how this object must be built depends on the data being modified. POST
messages for data that already exists will fail. Successful requests will usually return text stating
success. Failed requests will return a brief error message along with the HTTP Status Code. There
are some POST messages that return JSON objects.
• DELETE
Request to delete some data. These messages are constructed entirely within the URL. Successful
requests will usually return text stating success. Failed requests will return a brief error message
along with the HTTP Status Code.
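As a concrete sketch of these four formats (the hostname, port, and job ID below are placeholders), the Python standard library’s urllib can build such requests. urllib only chooses GET or POST on its own, so the verb is overridden explicitly:

```python
import json
try:
    from urllib.request import Request  # Python 3
except ImportError:
    from urllib2 import Request         # Python 2

def build_request(method, url, payload=None):
    """Build (but do not send) an HTTP request for the Deadline Web Service.

    GET and DELETE messages are constructed entirely within the URL;
    PUT and POST messages carry a JSON object in the message body.
    """
    data = None
    if payload is not None:
        data = json.dumps(payload).encode("utf-8")
    req = Request(url, data=data)
    req.add_header("Content-Type", "application/json")
    # urllib only knows GET and POST natively, so force the verb:
    req.get_method = lambda: method
    return req

# GET all jobs (a successful response is a JSON list of job objects):
get_req = build_request("GET", "http://localhost:8080/api/jobs")

# PUT a command that alters a job (a successful response is "Success"):
put_req = build_request("PUT", "http://localhost:8080/api/jobs",
                        {"Command": "suspend", "JobID": "validjobid"})
```

Passing either request to urlopen() would then perform the actual call against a running Web Service.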
8.1.4 HTTP Status Codes
The following are the HTTP Status Codes that can be returned, and what they signify in Deadline.
• 200 - OK
Request completed without error. Note that this does not always mean the request modified everything as intended. For example, sending a “complete” message to an already completed job will do nothing and return this status code; similarly, trying to release a job from pending when the job is not pending will return this status code and do nothing.
• 400 - Bad Request
Request could not be completed due to incorrect request message structure in either the URL or the
body of the request message.
• 404 - Not Found
Requested data could not be found, or requested command could not be found.
• 405 - Method Not Allowed
Requested operation could not be completed using the request format given.
• 500 - Internal Server Error
Request message could not be interpreted properly, or the action being attempted caused an exception in Deadline.
• 501 - Not Implemented
Request type is not supported. For example, a JobReport PUT request would return this because only
GET is supported.
8.1.5 Additional Information
If a request is made for a JSON object, and an empty JSON object is returned, then the information provided for the
request did not match any entry in the repository.
Adding additional key-value pairs to a JSON object for a request that does not specify their use can have surprising consequences. Keys that a command does not use will simply be ignored, but be sure to read the documentation for every possible query of each request type before building a JSON object for your query, as some commands are identical except for the presence of a single key and have vastly different effects.
If a documented query requires a JSON object that you do not know how to properly construct, it is often possible to
do a GET query for the same object type and receive the JSON format that the query expects.
A query that returns “Success” does not guarantee that the actions your query requested occurred. Some actions are impossible but do not warrant an error message (for example, sending a Suspend message to an already Suspended job, or Deleting a Slave that does not exist or was already Deleted).
8.2 Jobs
8.2.1 Overview
Job requests can be used to set and retrieve information for one or many jobs. Job requests support GET, PUT, POST
and DELETE request types. For more about these request types and their uses see the Request Formats and Responses
documentation.
8.2.2 Requests and Responses
List of possible requests for Jobs. All PUT and POST requests may also return a 400 Bad Request error if there was
no message body in the request. All PUT requests may also return a 400 Bad Request error message if the command
key is not present in the message body’s JSON object. All PUT requests may also return a 500 Internal Server Error
error message if the command key in the message body contained an invalid command.
Get All The Jobs
URL: http://hostname:portnumber/api/jobs
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job information for every job in the repository.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Jobs In Specified State
Gets jobs in the specified state(s). Valid states are Active, Suspended, Completed, Failed, and Pending. Note that Active covers both Queued and Rendering jobs. Specify more than one state by separating states with commas (e.g. Active,Completed,Suspended).
URL: http://hostname:portnumber/api/jobs?States=states
Request Type: GET
Message Body: N/A
Response: JSON object containing all the jobs in the specified state(s).
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All The Job IDs
URL: http://hostname:portnumber/api/jobs?IdOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job IDs in the repository.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Job
Gets job info for the given job ID.
URL: http://hostname:portnumber/api/jobs?JobID=validjobidhere
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job information for the job ID provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Save Job
Saves the job info provided. Job info must be in JSON format.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = save
• Job = JSON object containing the job info
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no Job entry in the JSON object in the message body.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Suspend Job
Puts the job with the matching ID into the Suspended state.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = suspend
• JobID = the ID of the Job to be suspended
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Suspend Job: Non-rendering tasks
Suspends the non-rendering tasks of the job with the matching ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = suspendnonrendering
• JobID = the ID of the Job to be suspended
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Resume Job
Resumes the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = resume
• JobID = the ID of the Job to be resumed
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Resume Failed Job
Resumes the failed job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = resumefailed
• JobID = the ID of the failed Job to be resumed
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Requeue Job
Requeues the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = requeue
• JobID = the ID of the Job to be requeued
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Archive Job
Archives the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = archive
• JobID = the ID of the Job to be archived
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Import Job
Imports the archived job(s) from the file path(s) provided.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = import
• File = the file location of the archived job/s (May be an array)
The following keys are optional:
• DeleteFile = true (deletes the archive file/s after importing)
Response: The job ids of the imported jobs and of the jobs that were not imported.
Possible Errors:
• 400 Bad Request: There was no File path provided.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Pend Job
Puts the job with the ID that matches the provided ID in the pending state.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = pend
• JobID = the ID of the Job to be put in the pending state
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Release Pending Job
Releases the job with the ID that matches the provided ID from the pending state.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = releasepending
• JobID = the ID of the Job to be released from the pending state
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Complete Job
Marks the job with the ID that matches the provided ID as complete.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = complete
• JobID = the ID of the Job to be marked as complete
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Fail Job
Marks the job with the ID that matches the provided ID as failed.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = fail
• JobID = the ID of the Job to be marked as failed
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Update Job Submission Date
Updates the Submission Date for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = updatesubmissiondate
• JobID = the ID of the Job to have the submission date updated for
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Set Job Machine Limit
Sets the Job Machine Limit for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = setjobmachinelimit
• JobID = the ID of the Job
The following keys are optional:
• Limit = the new job machine limit, must be an integer
• SlaveList = the slave/s to be set as the slave list (May be an array)
• WhiteListFlag = boolean : sets the whitelistflag to true or false
• Progress = Floating point number for the release percentage
Response: “Success”
Possible Errors:
• 400 Bad Request: There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Add Slaves To Job Machine Limit List
Adds the provided Slaves to the Job Machine Limit List for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = addslavestojobmachinelimitlist
• JobID = the ID of the Job
• SlaveList = the slave/s to be added to the slave list (May be an array)
Response: “Success”
Possible Errors:
• 400 Bad Request:
– There was no JobID entry in the JSON object in the message body, or
– There needs to be at least one Slave passed.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Remove Slaves From Job Machine Limit List
Removes the provided Slaves from the Job Machine Limit List for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = removeslavesfromjobmachinelimitlist
• JobID = the ID of the Job
• SlaveList = the slave/s to be removed from the slave list (May be an array)
Response: “Success”
Possible Errors:
• 400 Bad Request:
– There was no JobID entry in the JSON object in the message body, or
– There needs to be at least one Slave passed.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Set Job Machine Limit Listed Slaves
Sets the provided Slaves as the Job Machine Limit Listed Slaves for the Job whose ID matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = setjobmachinelimitlistedslaves
• JobID = the ID of the Job
• SlaveList = the slave/s to be set as the slave list (May be an array)
Response: “Success”
Possible Errors:
• 400 Bad Request:
– There was no JobID entry in the JSON object in the message body, or
– There needs to be at least one Slave passed.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Set Job Machine Limit White List Flag
Sets the Job Machine Limit White List Flag for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = setjobmachinelimitwhitelistflag
• JobID = the ID of the Job
• WhiteListFlag = boolean : sets the whitelistflag to true or false
Response: “Success”
Possible Errors:
• 400 Bad Request:
– There was no JobID entry in the JSON object in the message body, or
– Must pass a boolean WhiteListFlag.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Set Job Machine Limit Maximum
Sets the Job Machine Limit Maximum for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = setjobmachinelimitmaximum
• JobID = the ID of the Job
• Limit = the new job machine limit, must be an integer
Response: “Success”
Possible Errors:
• 400 Bad Request:
– There was no JobID entry in the JSON object in the message body, or
– Must pass an integer Limit
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Set Job Frame Range
Sets the frame range for the job with the ID that matches the provided ID.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = setjobframerange
• JobID = the ID of the Job
• FrameList = the new frame list
• ChunkSize = the new chunk size
Response: “Success”
Possible Errors:
• 400 Bad Request:
– There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Append Job Frame Range
Appends frames to the job with the ID that matches the provided ID. This adds new tasks without affecting the job’s existing tasks.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = appendjobframerange
• JobID = the ID of the Job
• FrameList = the frame list to append to the job’s existing frames
Response: “Success”
Possible Errors:
• 400 Bad Request:
– There was no JobID entry in the JSON object in the message body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Submit Job
Submits a job using the job info provided.
URL: http://hostname:portnumber/api/jobs
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
• JobInfo = JSON object containing the Job Info
• PluginInfo = JSON object containing the Plugin Info
• AuxFiles = Array of Auxiliary File paths (May be empty, but must be provided)
The following keys are optional:
• IdOnly = Set to “true” to only return the job ID (defaults to “false”)
Response: JSON object containing the new Job that was submitted or the Job ID
Possible Errors:
• 400 Bad Request: Missing one or more of the mandatory keys listed above.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Could not access the file path specified in NetworkRoot.
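Following the key requirements above, a helper that assembles the POST message body might look like the sketch below. The dictionary contents are illustrative; as noted in the Standalone Python API section, your plugin may require many more KEY=VALUE pairs:

```python
import json

def build_submit_body(job_info, plugin_info, aux_files=None, id_only=False):
    """Assemble the JSON message body for a Submit Job POST request."""
    body = {
        "JobInfo": job_info,
        "PluginInfo": plugin_info,
        "AuxFiles": aux_files or [],  # may be empty, but must be provided
    }
    if id_only:
        body["IdOnly"] = "true"       # only the new job's ID will be returned
    return json.dumps(body)

body = build_submit_body({"Name": "Test", "Plugin": "VraySpawner", "Frames": "0-1"},
                         {"Version": "Max2014"})
```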
Delete Jobs
Deletes the job(s) corresponding to the job ID(s) provided.
URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIdsToDelete
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors:
• 400 Bad Request: Need to provide at least one job ID to delete.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get Job Details
Gets the Job Details, similar to the Job Details panel, for the Jobs corresponding to the provided Job IDs.
URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIds&Details=true
Request Type: GET
Message Body: N/A
Response: A JSON object containing the Job Details.
Possible Errors:
• 400 Bad Request: Need to provide at least one job ID to get details for.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get Deleted Jobs
Gets the Deleted Jobs that correspond to the provided Job IDs.
URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIds&Deleted=true
Request Type: GET
Message Body: N/A
Response: A JSON object containing the deleted Jobs.
Possible Errors:
• 400 Bad Request: Need to provide at least one deleted job ID.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Deleted Jobs
Gets all the Deleted Jobs.
URL: http://hostname:portnumber/api/jobs?Deleted=true
Request Type: GET
Message Body: N/A
Response: A JSON object containing the deleted Jobs.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Purge Deleted Jobs Purges the Deleted Jobs that correspond to the provided Job IDs.
URL: http://hostname:portnumber/api/jobs?JobID=listOfJobIdsToDelete&Purge=true
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors:
• 400 Bad Request: Need to provide at least one job ID to purge.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Undelete Jobs Undeletes the Deleted Jobs that correspond to the provided Job IDs.
URL: http://hostname:portnumber/api/jobs
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = undelete
• JobID/s = job ID/list of job IDs to undelete
Response: “Success”
Possible Errors:
• 400 Bad Request: Need to provide at least one job ID to undelete.
• 500 Internal Server Error: An exception occurred within the Deadline code.
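The Undelete Jobs body can be built like this (a minimal sketch; the manual writes the key as “JobID/s”, and this sketch assumes the singular “JobID” form for a single job):

```python
import json

def build_undelete_body(job_id):
    """Build the PUT message body that undeletes one job.

    The manual's key is written "JobID/s"; a list of IDs is also
    accepted, but this sketch assumes the single-ID form.
    """
    return json.dumps({"Command": "undelete", "JobID": job_id})

body = build_undelete_body("5a1b2c3d4e5f6a7b8c9d0e1f")
```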
8.2.3 Job Property Values
Values for some Job properties are represented by numbers. Those properties and their possible values are listed below.
Stat (Status)
• 0 = Unknown
• 1 = Active
• 2 = Suspended
• 3 = Completed
• 4 = Failed
• 6 = Pending
Note that an active job can either be idle or rendering. Use the RenderingChunks property to determine if anything is
rendering.
Timeout (OnTaskTimeout)
• 0 = Both
• 1 = Error
• 2 = Notify
OnComp (OnJobComplete)
• 0 = Archive
• 1 = Delete
• 2 = Nothing
Schd (ScheduledType)
• 0 = None
• 1 = Once
• 2 = Daily
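The numeric property values above can be decoded with simple lookup tables (a sketch; where these fields live in the job JSON, e.g. under a "Props" object, is an assumption to adjust against an actual response):

```python
# Lookup tables mirroring the Job property values listed above.
JOB_STATUS = {0: "Unknown", 1: "Active", 2: "Suspended",
              3: "Completed", 4: "Failed", 6: "Pending"}
ON_TASK_TIMEOUT = {0: "Both", 1: "Error", 2: "Notify"}
ON_JOB_COMPLETE = {0: "Archive", 1: "Delete", 2: "Nothing"}
SCHEDULED_TYPE = {0: "None", 1: "Once", 2: "Daily"}

def describe_job(job):
    """Translate a job dict's numeric fields into readable names.

    Assumes the short keys (Stat, Timeout, OnComp, Schd) are nested
    under "Props"; check an actual response and adjust the paths.
    """
    props = job.get("Props", {})
    return {
        "Status": JOB_STATUS.get(props.get("Stat"), "Unknown"),
        "OnTaskTimeout": ON_TASK_TIMEOUT.get(props.get("Timeout")),
        "OnJobComplete": ON_JOB_COMPLETE.get(props.get("OnComp")),
        "ScheduledType": SCHEDULED_TYPE.get(props.get("Schd")),
    }
```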
8.3 Job Reports
8.3.1 Overview
Job Report requests can be used to retrieve Job Reports for a Job using the GET request type. PUT, POST and
DELETE are not supported and sending a message of any of these types will result in a 501 Not Implemented error
message. For more about these request types and their uses see the Request Formats and Responses documentation.
8.3.2 Requests and Responses
List of possible requests for Job Reports. It is possible to get a 400 Bad Request error message for any of the requests
if the value for Data is incorrect.
Get All Job Reports Gets all the Job Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=all&JobID=validJobID
http://hostname:portnumber/api/jobreports?JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job reports for the requested job, or a message stating that there are
no reports for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.
Get Job Error Reports Gets all the Job Error Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=error&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job error reports for the requested job, or a message stating that there
are no error reports for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.
Get Job Log Reports Gets all the Job Log Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=log&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job log reports for the requested job, or a message stating that there
are no log reports for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.
Get Job Requeue Reports Gets all the Job Requeue Reports for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=requeue&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job requeue reports for the requested job, or a message stating that
there are no requeue reports for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.
Get Job History Entries Gets all the Job History Entries for the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/jobreports?Data=history&JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the job history entries for the requested job, or a message stating that
there are no history entries for the job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– The Job ID provided does not correspond to any Job in the repository.
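The report requests above differ only in the Data query value, so one URL builder covers them all (a sketch; the base URL is a placeholder for your Web Service address):

```python
from urllib.parse import urlencode

def job_reports_url(base, job_id, data="all"):
    """Build a GET URL for Job Reports.

    data selects the report type: all, error, log, requeue,
    or history, matching the requests documented above.
    """
    if data not in ("all", "error", "log", "requeue", "history"):
        raise ValueError("unknown Data value: %s" % data)
    return base + "/api/jobreports?" + urlencode({"Data": data, "JobID": job_id})
```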
8.3.3 Job Report Property Values
Values for some Job Report properties are represented by numbers. Those properties and their possible values are
listed below.
Type (ReportType)
• 0 = LogReport
• 1 = ErrorReport
• 2 = RequeueReport
8.4 Tasks
8.4.1 Overview
Task requests can be used to set and retrieve Task information using GET and PUT request types. POST and DELETE
are not supported and sending a message of either of these types will result in a 501 Not Implemented error message.
For more about these request types and their uses see the Request Formats and Responses documentation.
8.4.2 Requests and Responses
List of possible requests for Tasks. For all PUT requests, a 400 Bad Request error message may be returned if the
message body is empty or if no command key is provided. All requests may return a 400 Bad Request error message
if no Job ID is provided, or a 500 Internal Server Error if the Job ID provided does not correspond to any Job in the
repository.
Get Task IDs
Gets all the Task IDs for the Job that corresponds to the Job ID provided.
URL: http://hostname:portnumber/api/tasks?IdOnly=true&JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Task IDs for the Job.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Task
Gets the Task that corresponds to the Task ID provided for the Job that corresponds to the Job ID provided.
URL: http://hostname:portnumber/api/tasks?TaskID=oneValidTaskID&JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task information for the requested Task.
Possible Errors:
• 400 Bad Request:
– No Task ID provided, or
– Task ID must be an integer value.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Tasks
Gets the Tasks for the Job that corresponds to the Job ID provided.
URL: http://hostname:portnumber/api/tasks?JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task information for all the Job Tasks.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Requeue Tasks
Requeues the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID
provided. If no Task IDs are provided, all Job tasks will be requeued.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = requeue
• JobID = the id of the Job
The following keys are optional:
• TaskList = integer Task ID/s (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: TaskList contains entries, but none of them are valid integers.
• 404 Not Found: Requested Task ID does not correspond to a Task for the Job.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Complete Tasks
Completes the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID
provided. If no Task IDs are provided, all Job tasks will be completed.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = complete
• JobID = the id of the Job
The following keys are optional:
• TaskList = integer Task ID/s (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: TaskList contains entries, but none of them are valid integers.
• 404 Not Found: Requested Task ID does not correspond to a Task for the Job.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Suspend Tasks
Suspend the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID
provided. If no Task IDs are provided, all Job tasks will be suspended.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = suspend
• JobID = the id of the Job
The following keys are optional:
• TaskList = integer Task ID/s (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: TaskList contains entries, but none of them are valid integers.
• 404 Not Found: Requested Task ID does not correspond to a Task for the Job.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Fail Tasks
Fails the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID provided. If no Task IDs are provided, all Job tasks will be failed.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = fail
• JobID = the id of the Job
The following keys are optional:
• TaskList = integer Task ID/s (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: TaskList contains entries, but none of them are valid integers.
• 404 Not Found: Requested Task ID does not correspond to a Task for the Job.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Resume Failed Tasks
Resumes the Failed Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job
ID provided. If no Task IDs are provided, all Job failed tasks will be resumed.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = resumefailed
• JobID = the id of the Job
The following keys are optional:
• TaskList = integer Task ID/s (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: TaskList contains entries, but none of them are valid integers.
• 404 Not Found: Requested Task ID does not correspond to a Task for the Job.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Pend Tasks
Pends the Tasks that correspond to the Task IDs provided for the Job that corresponds to the Job ID
provided. If no Task IDs are provided, all Job tasks will be pended.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = pend
• JobID = the id of the Job
The following keys are optional:
• TaskList = integer Task ID/s (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: TaskList contains entries, but none of them are valid integers.
• 404 Not Found: Requested Task ID does not correspond to a Task for the Job.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Trying to pend a task for a Suspended Job.
Release Pending Tasks
Releases the pending Tasks that correspond to the Task IDs provided for the Job that corresponds to the
Job ID provided. If no Task IDs are provided, all Job pending tasks will be released.
URL: http://hostname:portnumber/api/tasks
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = releasepending
• JobID = the id of the Job
The following keys are optional:
• TaskList = integer Task ID/s (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: TaskList contains entries, but none of them are valid integers.
• 404 Not Found: Requested Task ID does not correspond to a Task for the Job.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Trying to release a task from pending for a Suspended Job.
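All seven task commands above share the same message body shape, so a single builder suffices (a sketch; the command strings come from the requests documented above):

```python
import json

def build_task_command_body(command, job_id, task_ids=None):
    """Build the PUT body shared by the task commands above:
    requeue, complete, suspend, fail, resumefailed, pend,
    and releasepending.

    Omitting task_ids leaves out TaskList, which applies the
    command to all of the Job's tasks.
    """
    body = {"Command": command, "JobID": job_id}
    if task_ids is not None:
        body["TaskList"] = task_ids
    return json.dumps(body)
```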
8.4.3 Task Property Values
Values for some Task properties are represented by numbers. Those properties and their possible values are listed
below.
Stat (Status)
• 1 = Unknown
• 2 = Queued
• 3 = Suspended
• 4 = Rendering
• 5 = Completed
• 6 = Failed
• 8 = Pending
8.5 Task Reports
8.5.1 Overview
Task Report requests can be used to retrieve Task Reports for a Job Task using the GET request type. PUT, POST and
DELETE are not supported and sending a message of any of these types will result in a 501 Not Implemented error
message. For more about these request types and their uses see the Request Formats and Responses documentation.
8.5.2 Requests and Responses
List of possible requests for Task Reports. It is possible to get a 400 Bad Request error message for any of the requests
if the value for Data is incorrect. All requests may return a 400 Bad Request error message if no Job ID is provided, or
a 500 Internal Server Error if the Job ID provided does not correspond to any Job in the repository. All requests may
also return a 400 Bad Request error message if the Task ID was not provided, was not valid, or was not an integer.
Get All Task Reports
Gets all the Task Reports for the Job Task that corresponds to the provided Job ID and provided Task ID.
URL: http://hostname:portnumber/api/taskreports?Data=all&JobID=validJobID&TaskID=validTaskID
http://hostname:portnumber/api/taskreports?JobID=validJobID&TaskID=validTaskID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Task reports for the requested Job Task, or a message stating
that there are no reports for the Job Task.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Task Error Reports
Gets all the Task Error Reports for the Job Task that corresponds to the provided Job ID and provided
Task ID.
URL: http://hostname:portnumber/api/taskreports?Data=error&JobID=validJobID&TaskID=validTaskID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task error reports for the requested Job Task, or a message stating
that there are no error reports for the Job Task.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Task Log Reports
Gets all the Task Log Reports for the Job Task that corresponds to the provided Job ID and provided Task
ID.
URL: http://hostname:portnumber/api/taskreports?Data=log&JobID=validJobID&TaskID=validTaskID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task log reports for the requested Job Task, or a message stating
that there are no log reports for the Job Task.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Task Requeue Reports
Gets all the Task Requeue Reports for the Job Task that corresponds to the provided Job ID and provided
Task ID.
URL: http://hostname:portnumber/api/taskreports?Data=requeue&JobID=validJobID&TaskID=validTaskID
Request Type: GET
Message Body: N/A
Response: JSON object containing the Task requeue reports for the requested Job Task, or a message
stating that there are no requeue reports for the Job Task.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
8.5.3 Task Report Property Values
Values for some Task Report properties are represented by numbers. Those properties and their possible values are
listed below.
Type (ReportType)
• 0 = LogReport
• 1 = ErrorReport
• 2 = RequeueReport
8.6 Slaves
8.6.1 Overview
Slave requests can be used to set or retrieve Slave information. Slave requests support GET, PUT and DELETE request
types. POST is not supported and sending such a message will result in a 501 Not Implemented error message. For
more about these request types and their uses see the Request Formats and Responses documentation.
8.6.2 Requests and Responses
List of possible requests for Slaves. For all PUT requests it is possible to return a 400 Bad Request error message if
there is no message body or if the command key is not set. PUT requests may also return a 500 Internal Server Error
message if the command key is set to an invalid command.
Get Slave Names
Gets all the Slave names.
URL: http://hostname:portnumber/api/slaves?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Slave names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves’ InfoSettings
Gets the InfoSettings for every Slave name provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=infosettings
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave InfoSettings for all the Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Slaves’ InfoSettings
Gets the InfoSettings for every Slave.
URL: http://hostname:portnumber/api/slaves?Data=infosettings
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave InfoSettings for all the Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves’ Information
Gets the Slave Information for every Slave name provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=info
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave Information for all the Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Slaves’ Information
Gets the Slave Information for every Slave.
URL: http://hostname:portnumber/api/slaves?Data=info
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave Information for all the Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves’ Settings
Gets the Slave Settings for every Slave name provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=settings
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave Settings for all the Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Slaves’ Settings
Gets the Slave Settings for every Slave.
URL: http://hostname:portnumber/api/slaves?Data=settings
Request Type: GET
Message Body: N/A
Response: JSON object containing the Slave Settings for all the Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
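The GET variants above differ only in their query parameters, so a small URL builder covers them (a sketch; passing multiple Slave names as a comma-separated list is an assumption, not something this manual specifies):

```python
from urllib.parse import urlencode

def slaves_url(base, names=None, data=None):
    """Build a GET URL for the slaves endpoint.

    names is a list of Slave names (omitted = all Slaves);
    data is infosettings, info, settings, reports, or history.
    """
    params = {}
    if names:
        # Assumption: multiple names are joined with commas.
        params["Name"] = ",".join(names)
    if data:
        params["Data"] = data
    query = urlencode(params, safe=",")
    return base + "/api/slaves" + ("?" + query if query else "")
```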
Save Slave Information
Saves the Slave Information provided.
URL: http://hostname:portnumber/api/slaves
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = saveinfo
• SlaveInfo = JSON object containing the Slave information to save.
Response: “Success”
Possible Errors:
• 400 Bad Request: JSON object containing Slave Information was not provided.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Save Slave Settings
Saves the Slave Settings provided.
URL: http://hostname:portnumber/api/slaves
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = savesettings
• SlaveInfo = JSON object containing the Slave Settings to save.
Response: “Success”
Possible Errors:
• 400 Bad Request: JSON object containing Slave Settings was not provided.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Delete Slaves
Deletes every Slave that corresponds to a Slave name provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors:
• 400 Bad Request: Need to provide at least one Slave name to delete.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves’ Reports
Gets all Slave Reports for all Slave names provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=reports
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave Reports for all Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slave Reports For All Slaves
Gets all Slave Reports for all Slaves.
URL: http://hostname:portnumber/api/slaves?Data=reports
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave Reports for all Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves’ History
Gets all Slave History Entries for all Slave names provided.
URL: http://hostname:portnumber/api/slaves?Name=oneOrMoreSlaveNames&Data=history
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave History Entries for all Slave names provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slave History For All Slaves
Gets all Slave History Entries for all Slaves.
URL: http://hostname:portnumber/api/slaves?Data=history
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave History for all Slaves.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slave Names Rendering Job
Gets the names of all Slaves rendering the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/slavesrenderingjob?JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Slave names rendering the Job.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Get Host Names of Machines Rendering Job
Gets the machine host names of all Slaves rendering the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/machinessrenderingjob?JobID=validJobID
Request Type: GET
Message Body: N/A
Response: JSON object containing all the host names.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
Get IP Addresses of Machines Rendering Job
Gets the machine IP addresses of all Slaves rendering the Job that corresponds to the provided Job ID.
URL: http://hostname:portnumber/api/machinessrenderingjob?JobID=validJobID&GetIpAddress=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the IP addresses.
Possible Errors:
• 400 Bad Request: No Job ID was provided.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Job ID provided does not correspond to a Job in the repository.
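The two machine lookups above differ only by the GetIpAddress flag (a sketch; the endpoint spelling follows the URLs printed above, and the fetch helper requires a reachable Web Service):

```python
import json
import urllib.request

def machines_rendering_job_url(base, job_id, ip=False):
    """Build the GET URL; pass ip=True to request IP addresses
    instead of host names."""
    url = "%s/api/machinessrenderingjob?JobID=%s" % (base, job_id)
    return url + ("&GetIpAddress=true" if ip else "")

def fetch_json(url):
    """Issue the GET and decode the JSON response
    (only works against a running Web Service)."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode())
```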
8.6.3 Slave Property Values
Values for some Slave Info, Settings, and Report properties are represented by numbers. Those properties and their
possible values are listed below.
Stat (SlaveStatus)
• 0 = Unknown
• 1 = Rendering
• 2 = Idle
• 3 = Offline
• 4 = Stalled
• 8 = StartingJob
Type (ReportType)
• 0 = LogReport
• 1 = ErrorReport
• 2 = RequeueReport
8.7 Pulse
8.7.1 Overview
Pulse requests can be used to set and retrieve Pulse information using GET and PUT. POST and DELETE are not
supported and sending a message of either of these types will result in a 501 Not Implemented error message. For
more about these request types and their uses see the Request Formats and Responses documentation.
8.7.2 Requests and Responses
List of possible requests for Pulse.
Get Pulse Names
Gets all the Pulse names.
URL: http://hostname:portnumber/api/pulse?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pulse names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Pulse Information
Gets the Pulse information for the Pulse names provided.
URL:
http://hostname:portnumber/api/pulse?Info=true&Names=oneOrMorePulseNames OR
http://hostname:portnumber/api/pulse?Info=true&Name=onePulseName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pulse information for the requested Pulse names.
Possible Errors:
• 404 Not Found: Pulse name provided does not exist (can only occur if you use Name=)
• 500 Internal Server Error: An exception occurred within the Deadline code.
Save Pulse Information
Saves the Pulse information provided.
URL: http://hostname:portnumber/api/pulse
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = saveinfo
• PulseInfo = JSON object containing all the Pulse information.
Response: “Success”
Possible Errors:
• 400 Bad Request: Did not provide a Pulse Information JSON object
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get Pulse Settings
Gets the Pulse settings for the Pulse names provided.
URL:
http://hostname:portnumber/api/pulse?Settings=true&Names=oneOrMorePulseNames OR
http://hostname:portnumber/api/pulse?Settings=true&Name=onePulseName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pulse settings for the requested Pulse names.
Possible Errors:
• 404 Not Found: Pulse name provided does not exist (can only occur if you use Name=)
• 500 Internal Server Error: An exception occurred within the Deadline code.
Save Pulse Settings
Saves the Pulse settings provided.
URL: http://hostname:portnumber/api/pulse
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = savesettings
• PulseSettings = JSON object containing all the Pulse settings.
Response: “Success”
Possible Errors:
• 400 Bad Request: Did not provide a Pulse Settings JSON object
• 500 Internal Server Error: An exception occurred within the Deadline code.
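The two Pulse save requests share a body shape and differ only in the command and payload key (a sketch; the payload contents here are placeholders):

```python
import json

def build_pulse_save_body(command, payload):
    """Build the PUT body for the Pulse save requests above.

    command is "saveinfo" or "savesettings"; payload is the
    PulseInfo or PulseSettings JSON object, respectively.
    """
    if command == "saveinfo":
        key = "PulseInfo"
    elif command == "savesettings":
        key = "PulseSettings"
    else:
        raise ValueError("unknown command: %s" % command)
    return json.dumps({"Command": command, key: payload})
```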
Get Pulse InfoSettings
Gets the Pulse information and settings for the Pulse names provided.
URL:
http://hostname:portnumber/api/pulse?Names=oneOrMorePulseNames OR
http://hostname:portnumber/api/pulse?Name=onePulseName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pulse information and settings for the requested Pulse names.
Possible Errors:
• 404 Not Found: Pulse name provided does not exist (can only occur if you use Name=)
• 500 Internal Server Error: An exception occurred within the Deadline code.
8.7.3 Pulse Property Values
Values for some Pulse properties are represented by numbers. Those properties and their possible values are listed
below.
Stat (PulseStatus)
• 0 = Unknown
• 1 = Running
• 2 = Offline
• 4 = Stalled
8.8 Balancer
8.8.1 Overview
Balancer requests can be used to set and retrieve Balancer information using GET and PUT. POST and DELETE are
not supported and sending a message of either of these types will result in a 501 Not Implemented error message. For
more about these request types and their uses see the Request Formats and Responses documentation.
8.8.2 Requests and Responses
List of possible requests for Balancer.
Get Balancer Names
Gets all the Balancer names.
URL: http://hostname:portnumber/api/balancer?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Balancer names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Balancer Information
Gets the Balancer information for the Balancer names provided.
URL:
http://hostname:portnumber/api/balancer?Info=true&Names=oneOrMoreBalancerNames OR
http://hostname:portnumber/api/balancer?Info=true&Name=oneBalancerName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Balancer information for the requested Balancer names.
Possible Errors:
• 404 Not Found: Balancer name provided does not exist (can only occur if you use Name=)
• 500 Internal Server Error: An exception occurred within the Deadline code.
Save Balancer Information
Saves the Balancer information provided.
URL: http://hostname:portnumber/api/balancer
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = saveinfo
• BalancerInfo = JSON object containing all the Balancer information.
Response: “Success”
Possible Errors:
• 400 Bad Request: Did not provide a Balancer Information JSON object
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get Balancer Settings
Gets the Balancer settings for the Balancer names provided.
URL:
http://hostname:portnumber/api/balancer?Settings=true&Names=oneOrMoreBalancerNames OR
http://hostname:portnumber/api/balancer?Settings=true&Name=oneBalancerName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Balancer settings for the requested Balancer names.
Possible Errors:
• 404 Not Found: Balancer name provided does not exist (can only occur if you use Name=)
• 500 Internal Server Error: An exception occurred within the Deadline code.
Save Balancer Settings
Saves the Balancer settings provided.
URL: http://hostname:portnumber/api/balancer
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = savesettings
• BalancerSettings = JSON object containing all the Balancer settings.
Response: “Success”
Possible Errors:
• 400 Bad Request: Did not provide a Balancer Settings JSON object
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get Balancer InfoSettings
Gets the Balancer information and settings for the Balancer names provided.
URL:
http://hostname:portnumber/api/balancer?Names=oneOrMoreBalancerNames OR
http://hostname:portnumber/api/balancer?Name=oneBalancerName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Balancer information and settings for the requested Balancer
names.
Possible Errors:
• 404 Not Found: Balancer name provided does not exist (can only occur if you use Name=)
• 500 Internal Server Error: An exception occurred within the Deadline code.
8.8.3 Balancer Property Values
Values for some Balancer properties are represented by numbers. Those properties and their possible values are listed
below.
Stat (BalancerStatus)
• 0 = Unknown
• 1 = Running
• 2 = Offline
• 4 = Stalled
8.9 Limits
8.9.1 Overview
Limit Group requests can be used to set and retrieve information about one or many Limit Groups. Limit Group
requests support GET, PUT, POST and DELETE request types. For more about these request types and their uses see
the Request Formats and Responses documentation.
8.9.2 Requests and Responses
List of possible requests for Limit Groups. All PUT and POST requests can return a 400 Bad Request error message
if no message body is passed, or if no command key is present in the message body. All PUT and POST requests may
also return a 500 Internal Server Error error message if the command key in the message body contained an invalid
command.
Chapter 8. REST API
Get Limit Group Names
Gets the names of all Limit Groups in the repository.
URL: http://hostname:portnumber/api/limitgroups?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Limit Group names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Limit Groups
Gets the Limit Groups for the provided Limit Group names.
URL: http://hostname:portnumber/api/limitgroups?Names=listOfOneOrMoreLimitGroupNames
http://hostname:portnumber/api/limitgroups?Name=aSingleLimitGroupName
Request Type: GET
Message Body: N/A
Response: JSON object containing the requested Limit Group/s
Possible Errors:
• 404 Not Found: There is no Limit Group with provided Name (this can only occur if a single name is
passed)
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get All Limit Groups
Gets all the Limit Groups in the repository.
URL: http://hostname:portnumber/api/limitgroups
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Limit Groups.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Limit Group
Sets the Limit, Slave List, White List Flag, Release Percentage and/or Excluded Slaves for an existing Limit Group, or creates a new Limit Group with the provided properties.
URL: http://hostname:portnumber/api/limitgroups
Request Type: PUT/POST
Message Body:
JSON object where the following keys are mandatory:
• Command = set
• Name = name of Limit Group
The following keys are optional:
• Limit= integer limit
• Slaves = list of slave names to include
• SlavesEx = list of slave names to exclude
• RelPer = floating point number for release percentage
• White = boolean white list flag
Response: “Success”
Possible Errors:
• 400 Bad Request: No name provided for the Limit Group
• 500 Internal Server Error: An exception occurred within the Deadline code.
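Since only Command and Name are mandatory, a helper can add the optional keys only when they are supplied. This is a sketch; the limit name and values are placeholders:

```python
import json

# Build the PUT/POST body for Command = set; only Name is mandatory.
def build_set_limit_group(name, limit=None, slaves=None,
                          slaves_ex=None, rel_per=None, white=None):
    body = {"Command": "set", "Name": name}
    if limit is not None:
        body["Limit"] = limit          # integer limit
    if slaves is not None:
        body["Slaves"] = slaves        # slave names to include
    if slaves_ex is not None:
        body["SlavesEx"] = slaves_ex   # slave names to exclude
    if rel_per is not None:
        body["RelPer"] = rel_per       # release percentage (float)
    if white is not None:
        body["White"] = white          # white list flag (bool)
    return json.dumps(body)

print(build_set_limit_group("nuke_license", limit=10, white=False))
```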
Save Limit Group
Updates a Limit Group using a JSON object containing all the Limit Group information.
URL: http://hostname:portnumber/api/limitgroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = save
• LimitGroup = JSON object containing all relevant Limit Group information
Response: “Success”
Possible Errors:
• 400 Bad Request: No valid Limit Group object provided.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Reset Limit Group
Resets the counts for a Limit Group.
URL: http://hostname:portnumber/api/limitgroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = reset
• Name = name of Limit Group
Response: “Success”
Possible Errors:
• 400 Bad Request: No name provided for the Limit Group
• 404 Not Found: Provided Limit Group name does not correspond to a Limit Group in the repository.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Delete Limit Groups
Deletes the Limit Groups for the provided Limit Group names.
URL: http://hostname:portnumber/api/limitgroups?Names=listOfOneOrMoreLimitGroupNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors:
• 400 Bad Request: Must provide at least one Limit Group name to delete.
• 500 Internal Server Error: An exception occurred within the Deadline code.
8.9.3 Limit Group Property Values
Values for some Limit Group properties are represented by numbers. Those properties and their possible values are
listed below.
Type (LimitGroupType)
• 0 = General
• 1 = JobSpecific
• 2 = MachineSpecific
StubLevel (currently not used)
• 0 = Slave
• 1 = Task
• 2 = Machine
8.10 Users
8.10.1 Overview
User requests can be used to set and retrieve information for one or many Users. User requests support GET, PUT,
POST and DELETE request types. For more about these request types and their uses see the Request Formats and
Responses documentation.
8.10.2 Requests and Responses
List of possible requests for Users.
Get User Names
Gets all the User names.
URL: http://hostname:portnumber/api/users?NamesOnly=true
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Users
Gets all the User information for the provided User names.
URL: http://hostname:portnumber/api/users?Name=oneOrMoreValidUserNames
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User information for the Users provided.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
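The Name= parameter above accepts one or more valid User names. A small helper can assemble the query string; joining multiple names with commas is an assumption about the format, and the host and port are placeholders:

```python
from urllib.parse import quote

def users_url(host, port, names):
    """Build the Get Users URL; multiple names are joined with commas
    (an assumption about the Name= format, not stated above)."""
    return "http://%s:%d/api/users?Name=%s" % (host, port,
                                               quote(",".join(names)))

print(users_url("hostname", 8082, ["alice", "bob"]))
```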
Get All Users
Gets all the Users.
URL: http://hostname:portnumber/api/users
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User information for all Users.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Save User
Saves the User Information provided.
URL: http://hostname:portnumber/api/users
Request Type: PUT/POST
Message Body: JSON object containing all the User Information to save.
Response: “Success” for PUT, the User name and ID for POST.
Possible Errors:
• 400 Bad Request:
– No user information provided, or
– No User name provided, or
– User info already exists (POST error only).
• 500 Internal Server Error: An exception occurred within the Deadline code.
Delete User
Deletes the Users corresponding to the User names provided.
URL: http://hostname:portnumber/api/users?Name=oneOrMoreValidUserNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors:
• 400 Bad Request:
– No user information provided, or
– No User names provided.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Get User Group Names
Gets all the User Group names.
URL: http://hostname:portnumber/api/usergroups
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User Group names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get User Names For User Group
Gets all the User names for the User Group that corresponds to the provided User Group name.
URL: http://hostname:portnumber/api/usergroups?Name=oneValidUserGroupName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User names in the User Group.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get User Group Names For User
Gets all the User Group names for the User corresponding to the provided User name.
URL: http://hostname:portnumber/api/usergroups?User=oneValidUserName
Request Type: GET
Message Body: N/A
Response: JSON object containing all the User Group names for the User.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Users To User Groups
Adds the Users corresponding to the User names provided to the User Groups corresponding with the
User Group names provided.
URL: http://hostname:portnumber/api/usergroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = add
• User = the user name/s to add (May be an Array)
• Group = the user group name/s to add to (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: Missing one or more of the required keys in the JSON object in the message
body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Command key does not contain a valid command string, or
– None of the provided User names correspond to real Users, or
– None of the provided User Group names correspond to real User Groups.
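The add body above can be sketched as follows; the user and group names are placeholders:

```python
import json

def build_membership_body(command, users, groups):
    """Build the PUT body for user-group membership commands such as
    the add request described above; User and Group may be arrays."""
    return json.dumps({"Command": command, "User": users, "Group": groups})

body = build_membership_body("add", ["alice"], ["artists", "leads"])
print(body)
```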
Remove Users From User Groups
Removes the Users corresponding to the User names provided from the User Groups corresponding with
the User Group names provided.
URL: http://hostname:portnumber/api/usergroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = remove
• User = the user name/s to remove (May be an Array)
• Group = the user group name/s to remove from (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: Missing one or more of the required keys in the JSON object in the message
body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Command key does not contain a valid command string, or
– None of the provided User names correspond to real Users, or
– None of the provided User Group names correspond to real User Groups.
Set Users For User Groups
Sets the Users corresponding to the User names provided for the User Groups corresponding with the
User Group names provided.
URL: http://hostname:portnumber/api/usergroups
Request Type: PUT
Message Body:
JSON object where the following keys are mandatory:
• Command = set
• User = the user name/s to set (May be an Array)
• Group = the user group name/s to set them for (May be an Array)
Response: “Success”
Possible Errors:
• 400 Bad Request: Missing one or more of the required keys in the JSON object in the message
body.
• 500 Internal Server Error:
– An exception occurred within the Deadline code, or
– Command key does not contain a valid command string, or
– None of the provided User names correspond to real Users, or
– None of the provided User Group names correspond to real User Groups.
Create New User Groups
Creates and saves new user groups with the given names.
URL: http://hostname:portnumber/api/usergroups
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
• Group = the user group name/s to create (array)
Response: “Success”
Possible Errors:
• 400 Bad Request: Missing one or more of the required keys in the JSON object in the message
body.
• 500 Internal Server Error: An exception occurred within the Deadline code
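The POST body for this request is minimal; a sketch with placeholder group names:

```python
import json

# POST body for creating new user groups; names are illustrative only.
new_groups_body = json.dumps({"Group": ["lighting", "comp"]})
print(new_groups_body)
```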
Delete User Groups
Deletes the user group with the given name.
URL: http://hostname:portnumber/api/usergroups?Name=user+group+name+to+delete
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors:
• 400 Bad Request: Must provide a user group name to delete.
• 500 Internal Server Error: An exception occurred within the Deadline code
8.11 Repository
8.11.1 Overview
Repository requests can be used to retrieve Repository information, such as directories or paths, using the GET request
type. Repository requests can also be used for adding history entries for jobs, slaves or the repository using the POST
request type. PUT and DELETE are not supported and sending a message of either of these types will result in a
501 Not Implemented error message. For more about these request types and their uses see the Request Formats and
Responses documentation.
8.11.2 Requests and Responses
List of possible requests for the Repository.
Get Root Directory
URL: http://hostname:portnumber/api/repository?Directory=root
Request Type: GET
Message Body: N/A
Response: JSON Object containing the root directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Bin Directory
URL: http://hostname:portnumber/api/repository?Directory=bin
Request Type: GET
Message Body: N/A
Response: JSON Object containing the bin directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Settings Directory
URL: http://hostname:portnumber/api/repository?Directory=settings
Request Type: GET
Message Body: N/A
Response: JSON Object containing the settings directory, or a message stating that the directory is not
set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Events Directory
URL: http://hostname:portnumber/api/repository?Directory=events
Request Type: GET
Message Body: N/A
Response: JSON Object containing the events directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Custom Events Directory
URL: http://hostname:portnumber/api/repository?Directory=customevents
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom events directory, or a message stating that the directory is
not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Plugins Directory
URL: http://hostname:portnumber/api/repository?Directory=plugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the plugins directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Custom Plugins Directory
URL: http://hostname:portnumber/api/repository?Directory=customplugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom plugins directory, or a message stating that the directory
is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Scripts Directory
URL: http://hostname:portnumber/api/repository?Directory=scripts
Request Type: GET
Message Body: N/A
Response: JSON Object containing the scripts directory, or a message stating that the directory is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Custom Scripts Directory
URL: http://hostname:portnumber/api/repository?Directory=customscripts
Request Type: GET
Message Body: N/A
Response: JSON Object containing the custom scripts directory, or a message stating that the directory is
not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
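The directory requests above differ only in the Directory value, so one helper covers them all. This is a sketch; the host and port are placeholders:

```python
# Valid Directory values, per the requests listed above.
VALID_DIRECTORIES = {"root", "bin", "settings", "events", "customevents",
                     "plugins", "customplugins", "scripts", "customscripts"}

def repository_directory_url(host, port, directory):
    """Build a Get ... Directory URL, rejecting unknown directory names."""
    if directory not in VALID_DIRECTORIES:
        raise ValueError("unknown directory: %s" % directory)
    return "http://%s:%d/api/repository?Directory=%s" % (host, port, directory)

print(repository_directory_url("hostname", 8082, "plugins"))
```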
Get Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=job&JobID=aValidJobID
Request Type: GET
Message Body: N/A
Response: JSON Object containing the auxiliary path for the provided job id, or a message stating that
the path is not set.
Possible Errors:
• 400 Bad Request:
– Must provide a Directory or an Auxiliary Path to find, or
– Must provide a Job ID.
• 404 Not Found:
– Requested Directory could not be found, or
– Job ID provided does not correspond to a Job in the repository.
Get Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=alternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the alternate auxiliary path, or a message stating that the path is not
set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Windows Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=windowsalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the windows alternate auxiliary path, or a message stating that the
path is not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Linux Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=linuxalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the linux alternate auxiliary path, or a message stating that the path is
not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Mac Alternate Auxiliary Path
URL: http://hostname:portnumber/api/repository?AuxiliaryPath=macalternate
Request Type: GET
Message Body: N/A
Response: JSON Object containing the mac alternate auxiliary path, or a message stating that the path is
not set.
Possible Errors:
• 400 Bad Request: Must provide a Directory or an Auxiliary Path to find.
• 404 Not Found: Requested Directory could not be found.
Get Maximum Priority
URL: http://hostname:portnumber/api/maximumpriority
Request Type: GET
Message Body: N/A
Response: JSON Object containing the Maximum Priority.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Path Mapping
URL: http://hostname:portnumber/api/mappedpaths
Request Type: POST
Message Body:
JSON object that must contain the following keys:
• OS = Operating system (“Windows”, “Linux”, or “Mac”).
• Paths = Array of paths to map.
• Region = The region to be used for mapping paths (optional, defaults to “none”).
Response: JSON Object containing the updated paths.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
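A sketch of the Path Mapping POST body; the paths are illustrative only:

```python
import json

# POST body for the Path Mapping request described above.
mapping_body = json.dumps({
    "OS": "Linux",                      # "Windows", "Linux", or "Mac"
    "Paths": ["C:\\projects\\shot01"],  # paths to be remapped
    "Region": "none",                   # optional; defaults to "none"
})
print(mapping_body)
```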
Get Plugin Names
URL: http://hostname:portnumber/api/plugins
Request Type: GET
Message Body: N/A
Response: JSON Object containing the plugin names
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Plugin Event Names
URL: http://hostname:portnumber/api/plugins?EventNames=true
Request Type: GET
Message Body: N/A
Response: JSON Object containing the plugin event names
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Database Connection String
URL: http://hostname:portnumber/api/repository?DatabaseConnection
Request Type: GET
Message Body: N/A
Response: The Database Connection string in the form of: (server:port,server:port...).
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Job History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
• Command = jobhistoryentry
• JobID = The job id string.
• Entry = The entry string to be added.
Response: “Success”
Possible Errors:
• 400 Bad Request:
– JSON object was not provided in message body or,
– The provided JSON object is missing some values.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Add Slave History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
• Command = slavehistoryentry
• SlaveName = The slave name.
• Entry = The entry string to be added.
Response: “Success”
Possible Errors:
• 400 Bad Request:
– JSON object was not provided in message body or,
– The provided JSON object is missing some values.
• 500 Internal Server Error: An exception occurred within the Deadline code.
Add Repository History Entry
URL: http://hostname:portnumber/api/repository
Request Type: POST
Message Body:
JSON object where the following keys are mandatory:
• Command = repositoryhistoryentry
• Entry = The entry string to be added.
Response: “Success”
Possible Errors:
• 400 Bad Request:
– JSON object was not provided in message body or,
– The provided JSON object is missing some values.
• 500 Internal Server Error: An exception occurred within the Deadline code.
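The three history-entry commands above share a common shape, so one builder can cover them; the JobID and entry strings here are placeholders:

```python
import json

def build_history_body(command, entry, **extra):
    """Build the POST body for a history-entry command; extra keys are
    the command-specific identifiers, e.g. JobID=... or SlaveName=..."""
    body = {"Command": command, "Entry": entry}
    body.update(extra)
    return json.dumps(body)

job_entry = build_history_body("jobhistoryentry", "Requeued frames 1-10",
                               JobID="aValidJobID")
repo_entry = build_history_body("repositoryhistoryentry", "Plugin updated")
print(job_entry)
```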
8.12 Pools
8.12.1 Overview
Pool requests can be used to set and retrieve information for one or many Pools. Pool requests support GET, PUT,
POST and DELETE request types. For more about these request types and their uses see the Request Formats and
Responses documentation.
8.12.2 Requests and Responses
List of possible requests for Pools
Get Pool Names
Gets the names of all Pools.
URL: http://hostname:portnumber/api/pools
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Pool names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves For Pools
Gets all the Slave names for the provided Pool names.
URL: http://hostname:portnumber/api/pools?Pool=oneOrMorePoolNames
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave names that are in the provided Pools.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Pools
Creates new Pools using the provided Pool names.
URL: http://hostname:portnumber/api/pools
Request Type: POST
Message Body:
JSON object that must contain the following keys:
• Pool = pool name/s (May be an Array)
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Pools
Removes all pools not provided and creates any provided pools that did not exist.
URL: http://hostname:portnumber/api/pools
Request Type: POST
Message Body:
JSON object that must contain the following keys:
• Pool = pool name/s (May be an Array)
• OverWrite = true
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
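The Add Pools and Set Pools bodies differ only in the OverWrite key, so one builder can produce both; the pool names are placeholders:

```python
import json

def build_pool_body(pools, overwrite=False):
    """Without OverWrite the body adds pools; with OverWrite = true it
    replaces the full pool list, per the requests described above."""
    body = {"Pool": pools}
    if overwrite:
        body["OverWrite"] = True
    return json.dumps(body)

print(build_pool_body(["characters", "fx"], overwrite=True))
```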
Add Pools to Slaves
Adds the provided Pools to the assigned pools for each provided Slave. For both Pools and Slaves, only the names are required.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• Slave = slave name/s (May be an Array)
• Pool = pool name/s (May be an Array)
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Pools for Slaves
Sets provided Pools as the assigned pools for each provided Slave. For both Pools and Slaves, only the names are required.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• Slave = slave name/s (May be an Array)
• Pool = pool name/s to assign (May be an Array)
• OverWrite = true
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Purge Pools
Purges all obsolete pools using the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• OverWrite = true
• ReplacementPool = pool name to replace the pools being purged
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Pool name provided does not exist.
Set and Purge Pools
Sets the list of pools to the provided list of pool names, creating them if necessary. Purges all the obsolete pools using the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• OverWrite = true
• ReplacementPool = pool name to replace the pools being purged
• Pool = the pool/s provided for setting, the replacement pool must be in this pool list or must be
“none” (May be an Array)
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Pool name provided does not exist.
Add and Purge Pools
Adds the list of provided pools, creating them if necessary. Purges all the obsolete pools using the provided replacement pool.
URL: http://hostname:portnumber/api/pools
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• OverWrite = true
• ReplacementPool = pool name to replace the pools being purged
• Pool = the pool/s provided for adding (May be an Array)
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Pool name provided does not exist.
Delete Pools
Deletes all Pools with the provided Pool names.
URL: http://hostname:portnumber/api/pools?Pool=oneOrMorePoolNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Delete Pools From Slaves
Deletes the provided Pools from the provided Slaves’ lists of pools.
URL: http://hostname:portnumber/api/pools?Pool=oneOrMorePoolNames&Slaves=oneOrMoreSlaveNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
8.13 Groups
8.13.1 Overview
Group requests can be used to set and retrieve information for one or many Groups. Group requests support GET,
PUT, POST and DELETE request types. For more about these request types and their uses see the Request Formats
and Responses documentation.
8.13.2 Requests and Responses
List of possible requests for Groups
Get Group Names
Gets the names of all Groups.
URL: http://hostname:portnumber/api/groups
Request Type: GET
Message Body: N/A
Response: JSON object containing all the Group names.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Get Slaves For Groups
Gets all the Slave names for the provided Group names.
URL: http://hostname:portnumber/api/groups?Group=oneOrMoreGroupNames
Request Type: GET
Message Body: N/A
Response: JSON object containing all Slave names that are in the provided Groups.
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Groups
Creates new Groups using the provided Group names.
URL: http://hostname:portnumber/api/groups
Request Type: POST
Message Body:
JSON object that must contain the following keys:
• Group = group name/s (May be an Array)
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Set Groups
Removes all groups not provided and creates any provided groups that did not exist.
URL: http://hostname:portnumber/api/groups
Request Type: POST
Message Body:
JSON object that must contain the following keys:
• Group = group name/s (May be an Array)
• OverWrite = true
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Add Groups to Slaves
Adds the provided Groups to the assigned groups for each provided Slave. For both Groups and Slaves, only the names are required.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• Slave = slave name/s (May be an Array)
• Group = group name/s (May be an Array)
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
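A sketch of the PUT body above; the slave and group names are placeholders:

```python
import json

# PUT body assigning groups to slaves; both values may be arrays.
assign_body = json.dumps({"Slave": ["slave01", "slave02"],
                          "Group": ["gpu"]})
print(assign_body)
```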
Set Groups for Slaves
Sets provided Groups as the assigned groups for each provided Slave. For both Groups and Slaves, only the names are required.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• Slave = slave name/s (May be an Array)
• Group = group name/s to assign (May be an Array)
• OverWrite = true
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Purge Groups
Purges all obsolete groups using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• OverWrite = true
• ReplacementGroup = group name to replace the groups being purged
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Group name provided does not exist.
Set and Purge Groups
Sets the list of groups to the provided list of group names, creating them if necessary. Purges all the obsolete groups using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• OverWrite = true
• ReplacementGroup = group name to replace the groups being purged
• Group = the group/s provided for setting, the replacement group must be in this group list or must
be “none” (May be an Array)
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Group name provided does not exist.
Add and Purge Groups
Adds the list of provided groups, creating them if necessary. Purges all the obsolete groups using the provided replacement group.
URL: http://hostname:portnumber/api/groups
Request Type: PUT
Message Body:
JSON object that must contain the following keys:
• OverWrite = true
• ReplacementGroup = group name to replace the groups being purged
• Group = the group/s provided for adding (May be an Array)
Response: “Success”
Possible Errors:
• 500 Internal Server Error: An exception occurred within the Deadline code, or
• Replacement Group name provided does not exist.
Delete Groups
Deletes all Groups with the provided Group names.
URL: http://hostname:portnumber/api/groups?Group=oneOrMoreGroupNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
Delete Groups From Slaves
Deletes the provided Groups from the provided Slaves’ lists of groups.
URL: http://hostname:portnumber/api/groups?Group=oneOrMoreGroupNames&Slaves=oneOrMoreSlaveNames
Request Type: DELETE
Message Body: N/A
Response: “Success”
Possible Errors: 500 Internal Server Error: An exception occurred within the Deadline code.
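A sketch of building the DELETE URL above; joining multiple names with commas is an assumption about the query format, and the host and port are placeholders:

```python
def delete_groups_url(host, port, groups, slaves):
    """Build the Delete Groups From Slaves URL; multiple names are
    joined with commas (an assumption, not stated above)."""
    return ("http://%s:%d/api/groups?Group=%s&Slaves=%s"
            % (host, port, ",".join(groups), ",".join(slaves)))

print(delete_groups_url("hostname", 8082, ["gpu"], ["slave01"]))
```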
CHAPTER NINE
APPLICATION PLUGINS
9.1 3ds Command
9.1.1 Job Submission
You can submit jobs from within 3ds Max by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within 3ds Max, select the Deadline (3dsCmd) menu item that you created during the integrated
submission script setup.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The 3ds Command specific options are:
• Force Build: You can force 32 bit or 64 bit rendering.
• Path Config: Allows you to specify an alternate path file in the MXP format that the slaves can use to find
bitmaps that are not found on the primary map paths.
• Show Virtual Frame Buffer: Enable the virtual frame buffer during rendering.
• Apply VideoPost To Scene: Whether or not to use VideoPost during rendering.
• Continue On Errors: Enable to have the 3ds command line renderer ignore errors during rendering.
• Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network
location.
• Gamma Correction: Enable to apply gamma correction during rendering.
• Split Rendering: Enable split rendering. Specify the number of strips to split the frame into, as well as the
overlap you want to use.
• VRay/Mental Ray DBR: Enable this option to offload a VRay or Mental Ray DBR render to Deadline. See the
VRay/Mental Ray DBR section for more information.
• Run Sanity Check On Submission: Check for scene problems during submission.
VRay/Mental Ray off-load DBR
You can offload a VRay or Mental Ray DBR job to Deadline by enabling the Distributed Rendering option in your
VRay or Mental Ray settings, and by enabling the VRay/Mental Ray DBR checkbox in the submission dialog. With
this option enabled, a job will be submitted with its task count equal to the number of Slaves you specify, and it will
render the current frame in the scene file.
The slave that picks up task 0 will be the “master”, and will wait until all other tasks are picked up by other slaves.
Once the other tasks have been picked up, the “master” will update its local VRay or Mental Ray config file with the
names of the machines that are rendering the other tasks. It will then start the distributed render by connecting to the
other machines. Note that the render will not start until ALL tasks have been picked up by a slave.
It is recommended to set up VRay DBR or Mental Ray DBR for 3ds Max and verify that it is working correctly before submitting a DBR off-load job to Deadline. RTT (Render To Texture) is not supported with distributed bucket rendering. If you are running multiple Deadline slaves on one machine, it is not supported for two or more of those slaves to concurrently pick up different DBR jobs, whether as master or slave.
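The master's startup sequence described above can be sketched as a polling loop. Everything here is hypothetical scaffolding (poll_task_states, the state names) standing in for the real plugin's queries against the Deadline database:

```python
import time

def all_tasks_rendering(task_states):
    """True once every task in the DBR job has been picked up by a slave."""
    return all(state == "Rendering" for state in task_states.values())

def master_wait_for_slaves(poll_task_states, timeout=600.0, interval=5.0):
    """Poll until every task is rendering, then return the task ids so the
    "master" can collect the slave names, update its VRay or Mental Ray
    config file, and start the distributed render. 'poll_task_states' is a
    hypothetical callable returning {task_id: state}.
    """
    waited = 0.0
    while waited <= timeout:
        states = poll_task_states()
        if all_tasks_rendering(states):
            return sorted(states)
        time.sleep(interval)
        waited += interval
    raise TimeoutError("not all DBR tasks were picked up in time")
```

As the text notes, the render cannot start until ALL tasks are picked up, which is exactly what the loop above waits for.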
Notes for VRay DBR:
• Ensure VRay is the currently assigned renderer in the 3ds Max scene file prior to submission.
• You must have the Distributed Rendering option enabled in your VRay settings under the Settings tab.
• Ensure “Save servers in the scene” (“Save hosts in the scene” in VRay v2) option in VRay distributed rendering
settings is DISABLED as otherwise it will ignore the vray_dr.cfg file list!
• Ensure “Max servers” value is set to 0. When set to 0 all listed servers will be used.
• It is recommended to disable “Use local host” checkbox to reduce network traffic on the “master” machine,
when using a large number of slaves (5+). If disabled, the “master” machine only organises the DBR process,
sending rendering tasks to the Deadline slaves. This is particularly important if you intend to use the VRay v3+
“Transfer missing assets” feature. Note that Windows 7 OS has a limitation of a maximum of 20 other machines
concurrently ‘connecting’ to the “master” machine.
• VRay v3.00.0x has a bug in DBR: when "Use local host" is unchecked, it still demands a render node license. This is resolved in newer versions of VRay. Please contact Chaos Group for more information.
• The slaves will launch the VRay Spawner executable found in the 3ds Max root directory. Do NOT install the
VRay Spawner as a service on the master or slave machines. Additionally, Drive Mappings are unsupported
when running as a service.
• The vray_dr.cfg file in the 3ds Max’s plugcfg directory must be writeable so that the “master” machine can
update it. This is typically located in the user profile directory, in which case it will be writeable already.
• Chaos Group recommends that each machine used for DBR should have rendered at least one other 3ds Max job before attempting DBR on that machine.
• Ensure all slaves can correctly access any mapped drives or resolve all UNC paths to obtain any assets required
by the 3ds Max scene file to render successfully. Use the Deadline Mapped Drives feature to ensure the necessary
drive mappings are in place.
• Default lights are not supported by Chaos Group in DBR mode and will not render.
• Ensure you have sufficient VRay DR licenses if processing multiple VRay DBR jobs through Deadline concurrently. Use the Deadline Limits feature to limit the number of licenses being used at any time.
• Ensure the necessary VRay executables & TCP/UDP ports are allowed to pass through the Windows Firewall. Please consult the VRay user manual for specific information.
• VRay in 3ds Max does NOT currently support dynamically adding or removing DBR slaves from a DBR render once it has started on the "master" slave.
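As a rough illustration of the config update the "master" performs, the sketch below writes one server per line to vray_dr.cfg. The exact line layout and the default port are assumptions; inspect a vray_dr.cfg generated by your own VRay version for the authoritative format.

```python
def write_vray_dr_cfg(path, servers, port=20204):
    """Illustrative sketch of rewriting vray_dr.cfg with the machines that
    are rendering the other tasks. The one-server-per-line layout and the
    port number are assumptions -- check a cfg file written by your VRay
    version for the real format.
    """
    with open(path, "w") as cfg:
        for server in servers:
            cfg.write("%s 1 port=%d\n" % (server, port))

# Example: write_vray_dr_cfg("vray_dr.cfg", ["node01", "node02"])
```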
Notes for Mental Ray DBR:
• Ensure Mental Ray is the currently assigned renderer in the 3ds Max scene file prior to submission.
• You must have the Distributed Render option enabled in your Mental Ray settings under the Processing tab.
• The Mental Ray Satellite service must be running on your slave machines. It is installed by default during the
3ds Max installation.
• The max.rayhosts file must be writeable so that the "master" machine can update it. Its location differs between versions of 3ds Max:
• 2010 and earlier: It will be in the “mentalray” folder in the 3ds Max root directory.
• 2011 and 2012: It will be in the “mentalimages” folder in the 3ds Max root directory.
• 2013 and later: It will be in the “NVIDIA” folder in the 3ds Max root directory.
• Ensure the “Use Placeholder Objects” checkbox is enabled in the “Translator Options” rollout of the “Processing” tab. When placeholder objects are enabled, geometry is sent to the renderer only on demand.
• Ensure “Bucket Order” is set to “Hilbert” in the “Options” section of the “Sampling Quality” rollout of the
“Renderer” tab. With Hilbert order, the sequence of buckets to render uses the fewest number of data transfers.
• Contour shading is not supported with distributed bucket rendering.
• Autodesk Mental Ray licensing in 3ds Max is restricted. Autodesk says “Satellite processors allow any owner
of a 3ds Max license to freely use up to four slave machines (with up to four processors each and an unlimited
number of cores) to render an image using distributed bucket rendering, not counting the one, two, or four
processors on the master system that runs 3ds Max.” Mental Ray Standalone licensing can be used to go beyond
this license limit. Use the Deadline Limits feature to limit the number of licenses being used at any time if
required.
• Ensure the necessary Mental Ray executables & TCP/UDP ports are allowed to pass through the Windows Firewall. Please consult the Autodesk 3ds Max user manual for specific information.
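The version-dependent max.rayhosts location listed above maps naturally to a small helper. This is a sketch, with the folder names taken directly from the bullets:

```python
import os

def rayhosts_folder(max_version, max_root):
    """Return the folder containing max.rayhosts for a given 3ds Max
    release, following the version ranges listed above.
    """
    if max_version <= 2010:
        sub = "mentalray"      # 2010 and earlier
    elif max_version <= 2012:
        sub = "mentalimages"   # 2011 and 2012
    else:
        sub = "NVIDIA"         # 2013 and later
    return os.path.join(max_root, sub)
```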
Sanity Check
The 3ds Command Sanity Check script defines a set of functions to be called to ensure that the scene submission does
not contain typical errors like wrong render view and frame range settings, incorrect output path, etc.
The Sanity Check is enabled by the Run Sanity Check Automatically Before Submission checkbox in the User Options group of controls in the Submit To Deadline (3dsmaxCmd) dialog. You can also run the Sanity Check manually by clicking the Run Now! button.
The dialog contains the following elements:
• The upper area (Error Report) lists the problems found in the current scene.
• The lower area (Feedback Messages) lists the actions the Sanity Check performs and gives feedback to the user.
The latest message is always on top.
• Between the two areas, there is a summary text line listing the total number of errors and a color indicator of the
current Sanity Check state. When red, the Sanity Check will not allow a job submission to be performed.
The Error Report
The left column of the Error Report displays a checkbox and the type of the error. The checkbox determines whether
the error will be taken into account by the final result of the check. Currently, there are 3 types of errors:
• FATAL: The error cannot be fixed automatically and requires manual changes to the scene itself. A job submission with such an error would be pointless. The state of the checkbox is ignored and assumed always checked.
• Can Be Fixed: The error can be fixed automatically or manually. If the checkbox is active, the error contributes
to the final result. If unchecked, the error is ignored and handled as a warning.
• Warning: The problem might not require fixing, but could be of importance to the user. It is not taken into
account by the final result (the state of the checkbox is ignored and assumed always unchecked).
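A minimal sketch of how the three error types combine into the final pass/fail result (FATAL always counts, "Can Be Fixed" counts only while checked, Warnings never count). The function name and the (type, checked) pair representation are illustrative, not the actual MAXScript data structures:

```python
def sanity_check_passes(errors):
    """Compute the final Sanity Check result from (error_type, checked)
    pairs: FATAL errors always fail the check, 'Can Be Fixed' errors fail
    it only while their checkbox is checked, and Warnings never do.
    """
    for error_type, checked in errors:
        if error_type == "FATAL":
            return False
        if error_type == "Can Be Fixed" and checked:
            return False
    return True
```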
Repairing Errors
Right-clicking an Error Message in the Error Report window will cause an associated repair function to be executed and/or a Report Message to be output in the Feedback Messages window. (Repair is bound to right-click because the switch to DotNet controls made double-clicks act as checked events, toggling the checkbox in front of the error instead.)
Updating the Error Report
You can rerun/update the Sanity Check in one of the following ways:
• Clicking the dialog anywhere outside of the two message areas will rerun the Sanity Check and update all
messages.
• Double-clicking any Message in the Feedback Messages window will rerun the Sanity Check and update all
messages.
• Repairing an error will also automatically rerun the Sanity Check.
• Pressing the Run Now! button in the Submit To Deadline dialog will update the Sanity Check.
The following Sanity Checks are FATAL. These are errors that must be fixed manually before the job can be submitted.
• Message: The scene does not contain ANY objects!
  Description: The scene is empty and should not be sent to Deadline.
  Fix: Load a valid scene or create/merge objects, then try again.
• Message: Maxwell is the renderer and the current view is NOT a Camera.
  Description: The Maxwell renderer must render through an actual camera and will fail through a viewport.
  Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.
• Message: The scene contains objects or groups with the same name as a camera!
  Description: The scene contains objects or groups with a duplicate name to a camera, which could result in an incorrect object being used as the camera.
  Fix: Ensure you remove any duplicate named objects from your scene.
The following Sanity Checks can be automatically fixed before the job is submitted.
• Message: The current Scene Name is Untitled.
  Description: The scene has never been saved to a MAX file. While it is possible to submit an untitled scene to Deadline, it is not a good practice.
  Fix: Double-click the error message to open a Save As dialog and save to disk.
• Message: The current view is NOT a camera.
  Description: The active viewport is not a camera viewport.
  Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.
• Message: The Render Time Output is set to SINGLE FRAME!
  Description: While it is ok to send a single frame to Deadline, users are sending animations in 99% of cases.
  Fix: Double-click the error message to set the Render Time Output to "Active Time Segment". The Render Dialog will open so you can check the options and set to Range or Frames instead.
• Message: The Render Output Path appears to point at a LOCAL DRIVE!
  Description: While it is technically possible to save locally on each Slave, this is a bad idea - all Slaves should write their output to a central location on the network. Currently, disks C:, D: and E: are considered local and will be tested against the output path.
  Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.
• Message: The Render Output File Name ends with a DIGIT - trailing numbers might fail.
  Description: The name to be saved to ends with one, two or three digits. Rendering to this file name will append 4 more digits and make loading sequential files in other applications hard or impossible. This check is performed only when the type is not AVI or MOV, and will ignore 4 trailing digits, which will be replaced by 3ds Max correctly when rendering to sequential files.
  Fix: Double-click the error message to add an underscore (_) to the end of the file name; for example, z:\temp\test123.tga will be changed to z:\temp\test123_.tga.
• Message: The Render Output will not be saved to a file.
  Description: No renders will be saved, as the Save File checkbox in the Render Scene Dialog is currently disabled.
  Fix: Double-click the error message to open the Render Dialog and enable the Save File checkbox.
• Message: The Distributed Rendering option is enabled for this renderer.
  Description: Checks whether Distributed Rendering is enabled for the Mental Ray or V-Ray renderer.
  Fix: Double-click the error message to disable Distributed Rendering.
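The trailing-digit check and its automatic fix can be sketched as follows. This is an illustrative Python reimplementation, not the MAXScript the Sanity Check actually runs, and the handling of five or more trailing digits is an interpretation of the "ignore 4 trailing digits" rule:

```python
import re

def trailing_digit_warning(filename):
    """Flag output names ending in trailing digits (skipping AVI/MOV).
    Four trailing digits are discounted, since 3ds Max replaces those when
    numbering sequential frames; any digits left over still trigger the
    warning (an interpretation of the rule described above).
    """
    base, dot, ext = filename.rpartition(".")
    if not dot:
        base, ext = filename, ""
    if ext.lower() in ("avi", "mov"):
        return False
    match = re.search(r"\d+$", base)
    if not match:
        return False
    count = len(match.group())
    if count >= 4:
        count -= 4
    return 1 <= count <= 3

def suggested_fix(filename):
    """The automatic fix: append an underscore before the extension."""
    base, dot, ext = filename.rpartition(".")
    if not dot:
        return filename + "_"
    return "%s_.%s" % (base, ext)
```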
The following Sanity Checks are simply warnings.
• Message: The Render Output Path is NOT DEFINED!
  Description: No frames will be saved to disk. This is allowed if you want to output render elements only.
  Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.
• Message: The Render Output is set to a MOVIE format.
  Description: The file extension is set to an AVI or MOV format. In the current version of Deadline, this would result in a sequence of single-frame MOV files rendered by separate slaves. In the future, the behaviour might be changed to render a single MOV or AVI file on a single slave as one Task.
  Fix: Double-click the error message to open the Render Dialog and select a single frame output format, then double-click again to retest.
This list will be extended to include future checks and can be edited by 3rd parties by adding new definitions and
functions to the original script. Documentation on extending the script will be published later. Please email suggestions
for enhancements and additional test cases to Deadline Support.
9.1.2 Plug-in Configuration
You can configure the 3ds Command plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the 3ds Command plug-in from the list on the left.
Render Executables
• 3ds Max Cmd Executable: The path to the 3dsmaxcmd.exe executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.
Render Options
• 3ds Cmd Verbosity Level: The verbose level (0-5).
VRay DBR and Mental Ray Satellite Rendering
• Use IP Addresses: If offloading a VRay DBR or Mental Ray Satellite render to Deadline, Deadline will update
the appropriate config file with the host names of the machines that are running the VRay Spawner or Satellite
service. If this is enabled, the IP addresses of the machines will be used instead.
9.1.3 Integrated Submission Script Setup
The following procedure describes how to install the integrated Autodesk 3ds Command submission script. The
integrated submission script allows for submitting 3ds Command Line render jobs to Deadline directly from within
the Max editing GUI. The integrated render job submission script and the following installation procedure has been
tested with Max versions 2010 and later (including Design editions).
Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work.
However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this patch, it means that you
must submit your 3ds Max 2012 jobs from the Monitor.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/3dsCmd/Installers
Manual Installation of the Submission Script
• Copy [Repository]/submission/3dsCmd/Client/Deadline3dsCmdClient.mcr to [3ds Max Install Directory]/MacroScripts. If you don't have a MacroScripts folder in your 3ds Max install directory, check to see if you have a UI/MacroScripts folder instead, and copy the Deadline3dsCmdClient.mcr file there if you do.
• Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms
• Launch 3ds Max, and find the new Deadline menu.
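The manual steps above amount to two file copies, sketched here in Python. install_3dscmd_submitter is a hypothetical helper name; the source and destination paths mirror the bullets:

```python
import os
import shutil

def install_3dscmd_submitter(repository, max_install_dir):
    """Sketch of the manual installation: copy the macro script into
    MacroScripts (or UI/MacroScripts when only that exists) and
    SMTDSetup.ms into scripts/Startup.
    """
    mcr = os.path.join(repository, "submission", "3dsCmd", "Client",
                       "Deadline3dsCmdClient.mcr")
    macro_dir = os.path.join(max_install_dir, "MacroScripts")
    if not os.path.isdir(macro_dir):
        macro_dir = os.path.join(max_install_dir, "UI", "MacroScripts")
    shutil.copy(mcr, macro_dir)

    setup = os.path.join(repository, "submission", "3dsmax", "Client",
                         "SMTDSetup.ms")
    shutil.copy(setup, os.path.join(max_install_dir, "scripts", "Startup"))
```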
9.1.4 FAQ
Which versions of Max are supported?
The 3dsCommand plugin has been tested with 3ds Max 2010 and later (including Design editions).
Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts
will not work. However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this
patch, it means that you must submit your 3ds Max 2012 jobs from the Monitor.
When should I use the 3dsCommand plugin to render Max jobs instead of the original?
This plugin should only be used when a particular feature doesn’t work with our normal 3dsmax plugin.
For example, there was a time when using the 3dsCommand plugin was the only way to render scenes
that made use of Vray’s Frame Buffer features.
Note that the 3dsCommand plugin has fewer features in the submission dialog, and its error handling isn't as robust. In addition, using 3dsCommand adds extra startup time because 3dsmaxcmd.exe must be launched for each task, so renders might take a little longer to complete.
Is PSoft’s Pencil+ render effects plugin supported?
Yes. Ensure the render output and render element output directory paths all exist on the file server before rendering commences. Please note that at least Pencil+ v3.1 is required if you are using the alternative 3dsmax (Lightning) plugin in Deadline. Note that you will require the correct network render license from PSoft for each Deadline Slave, which is not the same as the full workstation license of Pencil+.
9.1.5 Error Messages And Meanings
This is a collection of known 3ds Command error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email
Deadline Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.2 3ds Max
9.2.1 Job Submission
You can submit jobs from within 3ds Max after installing the integrated Submit Max To Deadline (SMTD) script,
or you can submit them from the Monitor. The instructions for installing the integrated SMTD script can be found
further down this page. You can also submit jobs from within RPManager, the Render Pass Manager for 3ds Max.
The instructions for installing the integrated submitter for RPManager can also be found further down the page.
To submit from within 3ds Max, select the Deadline menu item that you created during the integrated submission
script setup.
If you are submitting from RPManager, just select the Network tab in RPManager after setting up the integrated
submitter.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The 3ds Max specific options are as follows.
Scene File Submission Options
• SAVE and Submit Current Scene File with the Job to the REPOSITORY: The current scene will be saved
to a temporary file which will be sent with the job and will be stored in the Job’s folder in the Repository.
• SAVE and Submit Current Scene File to GLOBAL NETWORK PATH: The current scene will be saved
to a temporary file which will be copied to a Globally-Defined Alternative Network Location (e.g. dedicated
file server). It is specified in [Repository]\submission\3dsmax\Main\SubmitMaxToDeadline_Defaults.ini under
[GlobalSettings] as the SubmitSceneGlobalBasePath key. It will be referenced by the Job via its path only. This
will reduce the load on the Repository server.
• SAVE and Submit Current Scene File to USER-DEFINED NETWORK PATH: The current scene will be
saved to a temporary file which will be copied to a User-Defined Alternative Network Location (e.g. dedicated
file server) stored as a local setting. It will be referenced by the Job via its path only. This will reduce the load
on the Repository server.
• DO NOT SAVE And Use Current Scene’s ORIGINAL NETWORK PATH: The current scene will NOT be
saved, but the original file it was opened from will be referenced by the job. Assuming the file resides on a
dedicated file server, this will speed up submission and rendering significantly, but current changes to the scene
objects will be ignored.
Sanity Check
• Run Sanity Check Automatically Before Submission: This options forces Submit To Deadline to perform a
Sanity Check before submitting the job. The Sanity Check is implemented as a separate set of scripted functions
which can be enhanced by 3rd parties to meet specific studio needs. For more information, please refer to the
Sanity Check section.
• Run Sanity Check Now!: This button performs a Sanity Check without submitting a job. Any potential problems will be reported and can be fixed before actually submitting the job.
Job Tab
Job Options
• Render Task Chunk Size (Frames Per Task): Defines the number of Tasks (Frames) to be processed at once
by a Slave.
• Limit Number of Machines Rendering Concurrently: When checked, only the number of Slaves specified
by the [Machines] value will be allowed to dequeue the job. When unchecked, any number of Slaves can work
on the job.
• Machines: Defines the number of Slaves that will be allowed to dequeue the job at the same time.
• Out-Of-Order Rendering Every Nth Frame: Deadline will render every Nth frame based on the order selected
in the drop down box. This option can be very useful when rendering long test animations - you can render a
rough animation containing every Nth frame early enough to detect any major issues before all frames have been rendered, or in cases where the major action happens at the end of the sequence, reverse the rendering order.
• Log: Prints the frame sequence to the log file. You can then double-click the feedback window to open the log, and copy & paste the sequence into the Job's Frame Range in the Monitor.
• Render Preview Job First: When the checkbox is checked, two jobs will be submitted. The first job will have
[PREVIEW FRAMES] added to its name, have a priority of 100, and will render only N frames based on the
spinner’s value. The step will be calculated internally. If the spinner is set to 2, the first and the last frame will
be rendered. With a value of 3, the first, middle and last frames will be rendered and so on. The second job will
have [REST OF FRAMES] added to its name, and will be DEPENDENT on the first job and will start rendering
once the preview frames job has finished. It will have the priority specified in the dialog, and render all frames
not included in the preview job.
• Priority+: Defines the Priority Increase for the PREVIEW job. For example if the Job Priority is set to 50 and
this value is +5, the PREVIEW job will be submitted with Priority of 55 and the REST job with 50.
• Dependent: When checked, the [REST OF FRAMES] Job will be made dependent on the [PREVIEW
FRAMES] Job. When unchecked, the [REST OF FRAMES] Job will use the same dependencies (none or
custom) as the [PREVIEW FRAMES] Job.
• Frames: Defines the number of frames to be submitted as a PREVIEW job. The frames will be taken at equal
intervals, for example a value of 2 will send the first and last frames, a value of 3 will send first, middle and last
and so on.
• Task Timeout: When checked, a task will be requeued if it runs longer than the specified time. This is useful
when the typical rendering time of the job is known from previous submissions and will prevent stalling.
• Enable Auto Task Timeout: Enables the Auto Task Timeout option.
• Restart 3ds Max Between Tasks: When unchecked (default), 3ds Max will be kept in memory for the duration of the given job's processing. This can reduce render time significantly, as multiple Tasks can be rendered in sequence without reloading 3ds Max. When checked, 3ds Max will be restarted between tasks, releasing all memory and resetting the scene settings at the cost of startup time.
• Enforce Sequential Rendering: When checked, the Tasks will be processed in ascending order in order to
reduce the performance hit from History-Dependent calculations, for example from particle systems. When
unchecked, Tasks can be picked up by Slaves in any order. Recommended for Particle Rendering.
• Submit Visible Objects Only: This option should be used at your own risk, as it is heavily dependent on the
content of your scene. In most cases, it can be used to submit only a subset of the current scene to Deadline,
skipping all hidden objects that would not render anyway. This feature will be automatically disabled if the
current scene contains any Scene XRefs. The feature will create an incorrect file if any of the scene objects
depend INDIRECTLY on hidden objects.
• Concurrent Tasks: Defines the number of Tasks a single Slave can pick up at once (by launching multiple
instances of 3ds Max on the same machine). Note that only one Deadline license will be used, but if rendering
in Workstation Mode, multiple licenses of 3ds Max might be required. This is useful to maximize performance
when the tasks don't saturate all CPUs at 100% and don't use up all memory. Typically, as a rule of thumb, this feature is NOT required, as 3ds Max uses 100% of the CPUs during rendering.
• Limit Tasks To Slave’s Task Limit: When checked, the number of Concurrent Tasks will be limited by the
Slave’s Task Limit which is typically set to the number of available CPUs. For example, if ‘Concurrent Tasks’
is set to 16 but a Slave has 8 cores, only 8 concurrent tasks will be processed.
• On Job Completion: Defines the action to perform when the job has completed rendering successfully. The
job can be either left untouched, ARCHIVED to improve Repository performance, or automatically DELETED
from the Repository.
• Submit Job As Suspended: When checked, the Job will be submitted to the Repository as Suspended. It will
require manual user intervention before becoming active.
• Force 3ds Max Build: This drop-down list allows you to specify which build of 3ds Max (32 bit vs. 64 bit) to
use when rendering the job. The list will be greyed out when running in 3ds Max 8 or earlier.
• Make Force 3ds Max Build Sticky: When the checkbox is unchecked, the "Force 3ds Max Build" drop-down list selection will NOT persist between sessions and will behave as documented above in the "Default" section. When the checkbox is checked, the "Force 3ds Max Build" drop-down list selection will persist between sessions. For example, if you are submitting from a 64 bit build of 3ds Max to an older network consisting of only 32 bit builds, you can set the drop-down list to "32bit" once and lock that setting by checking "Make Force 3ds Max Build Sticky".
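The preview-frame selection described under Render Preview Job First above (a Frames value of 2 gives the first and last frame, 3 gives first, middle, and last) can be sketched as follows. The function name is illustrative; the real step calculation is internal to SMTD:

```python
def preview_frames(first, last, count):
    """Frames for the [PREVIEW FRAMES] job: 'count' frames taken at equal
    intervals across the range, always including the first and last frame.
    """
    if count <= 1 or last == first:
        return [first]
    step = (last - first) / float(count - 1)
    return sorted({int(round(first + i * step)) for i in range(count)})
```

The [REST OF FRAMES] job would then render every frame in the range not returned by this selection.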
Job Dependencies
When the checkbox is checked and one or more jobs have been selected from the multi-list box, the job will be set
to Pending state and will start rendering when all jobs it depends on have finished rendering. Use the Get Jobs List
button to populate the Job List and the Filter options with job data from the Repository.
RPM Pass Dependencies - Global Setup
This option is ONLY available when submitting jobs from RPManager. If enabled, all passes that are submitted will
be dependent on the passes selected in this rollout.
Job Scheduling
Enable job scheduling. See the Scheduling section of the Modifying Job Properties documentation for more information on the available options.
Job Failure Detection
Override the job failure detection settings. See the Scheduling section of the Modifying Job Properties documentation
for more information on the available options.
Render Tab
3ds Max Rendering
• Use Alternate Plugin.ini file: By default, 3ds Max will launch using the default plugin.ini file in the local
installation. You can use this option to select an alternative plugin.ini file to use instead. Alternative plugin.ini
files can be added to [Repository]\plugins\3dsmax, and then they will appear in the drop down box in the
submitter (see the Custom Plugin.ini File Creation section for more information). If you have the [Default]
option selected, it’s the equivalent to having this feature disabled.
• Fail On Black Frames: This option can be used to fail the render if a certain portion of the output image or
its render elements is black. The Black Pixel % defines the minimum percentage of the image’s pixels that
must be black in order for the image to be considered black. If each of RGB are all less than or equal to the
Threshold, and the alpha is not between the Threshold and (1.0 - threshold), then the pixel is considered black.
If the Threshold is greater than or equal to 0.5, then the alpha value has no effect.
• Override Bitmap Pager Setting While Rendering: You can specify if you want the 3dsmax Bitmap Pager
setting to be enabled or disabled.
• Submit External Files With Scene: Whether the external files (bitmaps, xrefs etc.) will be submitted with the
scene or not.
• Merge Object XRefs: If object XRefs will be merged during submission.
• Merge Scene XRefs: If scene XRefs will be merged during submission.
• Force 3dsmax Workstation Mode (Uses up a 3dsmax License): Used mainly for testing and debugging
purposes and should be left unchecked. When this option is unchecked, 3ds max will be started in Slave mode
without the User Interface, which does not require a 3ds Max license. When checked, 3ds max will be launched
in full Interactive mode and will require a license. Note that Workstation mode is set automatically when
submitting MAXScripts to Deadline.
• Enable Silent Mode: This option is only available when Force Workstation Mode is checked. It can help suppress some popups that 3ds Max displays (although some popups ignore this setting).
• Ignore Missing External File Errors: Missing external files could mean that the 3ds Max scene will render incorrectly (with textures missing, etc.). In some cases though, missing external files can be ignored - for example, if the job is meant for test rendering only. If you want the job to fail when a missing external resource is detected, uncheck this checkbox.
• Ignore Missing UVW Errors: Missing UVWs could mean that some 3ds Max object would render incorrectly
(with wrong texture mapping etc). In some cases though, missing UVWs could be ignored (for example if the
job is meant for test rendering).
• Ignore Missing XREF Errors: Missing XRefs could mean that the 3ds Max scene cannot be loaded correctly. In some cases though, missing XRefs can be ignored. If you want the job to fail when a missing XRef message is detected at startup, keep this checkbox unchecked.
• Ignore Missing DLL Errors: Missing DLLs could mean that the 3ds Max scene cannot be loaded or rendered
correctly. In some cases though, missing DLLs could be ignored. If you want the job to fail if a missing DLL
message is detected at startup, keep this checkbox unchecked.
• Do Not Save Render Element Files: Enable this option to have Deadline skip the saving of Render Element
image files during rendering (the elements themselves are still rendered).
• Show Virtual Frame Buffer: If checked, the 3ds Max frame buffer will be displayed on the slave during
rendering.
• Override Renderer Frame Buffer Visibility: If checked, the current renderer's frame buffer visibility will be overridden by the next setting (Show Renderer Frame Buffer).
• Show Renderer Frame Buffer: If checked, the current renderer's frame buffer will be made visible during rendering (V-Ray and Corona frame buffers are currently supported).
• Disable Progress Update Timeout: Enable this option to disable progress update checking. This is useful for
renders like Fume FX sims that don't constantly supply progress to 3ds Max.
• Disable Frame Rendering: Enable this option to skip the rendering process. This is useful for renders like
Fume FX sims that don’t actually require any rendering.
• Restart Renderer Between Frames: This option can be used to force Deadline to restart the renderer after each
frame to avoid some potential problems with specific renderers. Enabling this option has little to no impact on
the actual render times. This feature should be ENABLED to resolve V-Ray renders where the beauty pass
typically renders correctly but the Render Elements are all black or appear to be swapped around. When
enabled, the C++ Lightning plugin (unique to Deadline) will unload the renderer plugins and then reload them
instantly. This forces a memory purge and helps improve renderer stability, as well as ensuring the lowest
possible memory footprint, which can be helpful when rendering close to the physical memory limit
of a machine. Ensure this feature is DISABLED if you are sending FG/LC/IM caching map type jobs to the
farm, as the renderer will be reset for each frame and the FG/LC/IM file(s) won't be incrementally updated
with the additional data per frame.
• Disable Multipass Effects: Enable this option to skip over multipass effects if they are enabled for the camera
to be rendered.
• V-Ray/Mental Ray DBR: Enable this option to offload a V-Ray or Mental Ray DBR render to Deadline. See
the V-Ray/Mental Ray DBR section for more information.
• Job Is Interruptible: If enabled, this job will be cancelled if a job with higher priority is submitted to the queue.
• Apply Custom Material To Scene: If checked, all geometry objects in the scene will be assigned one of the
user-defined materials available in the drop down box.
3ds Max Gamma Options
• Gamma Correction: Enable to apply gamma correction during rendering.
3ds Max Pathing Options
• Remove Filename Padding: If checked, the output filename will be (for example) “output.tga” instead of
“output0000.tga”. This feature should only be used when rendering single frames. If you render a range of
frames with this option checked, each frame will overwrite the previous existing frame.
• Force Strict Output Naming: If checked, the output image filename is automatically modified to include the
scene's name. For example, if the scene name is myScene.max and the output image path is
\\myServer\images\output.tga, the output image path will be changed to \\myServer\images\myScene\myScene.tga.
If the new output image path doesn't exist, it is created by the 3ds Max plugin before rendering begins.
• Purify Filenames: If checked, all render output including Render Elements will be purged of any illegal characters as defined by "PurifyCharacterCodes" in the "SubmitMaxToDeadline_Defaults.ini" file.
• Force Lower-Case Filenames: If checked, all render output including Render Elements will be forced to have
a lowercase filename.
• Update Render Elements’ Paths: Each Render Element has its own output path which is independent from
the render output path. When this option is unchecked, changing the output path will NOT update the Render
Elements’ paths and the Elements could be written to the wrong path, possibly overwriting existing passes
from a previous render. When checked, the paths will be updated to point at sub-folders of the current Render
Output path with names based on the name and class of the Render Element. The actual file name will be left
unchanged.
• Also Update RE’s Filenames: If enabled, the Render Element file names will also be updated along with their
paths.
• Include RE Name in Paths: If enabled, the new Render Element files will be placed in a folder that contains
the RE name.
• Include RE Name in Filenames: If enabled, the new Render Element files will contain the RE name in the
file name.
• Include RE Type in Paths: If enabled, the new Render Element files will be placed in a folder that contains the
RE type.
• Include RE Type in Filenames: If enabled, the new Render Element files will contain the RE type in the file
name.
• Permanent RE Path Changes: When this checkbox is checked and the above option is also enabled, changes
to the Render Elements paths will be permanent (in other words after the submission, all paths will point at
the new locations created for the job). When unchecked, the changes will be performed temporarily during the
submission, but the old path names will be restored right after the submission.
• Rebuild Render Elements: If checked, Render Elements will be automatically removed and rebuilt during
submission to try to work around known 3ds Max issues.
• Include Local Paths With Job: (Thinkbox internal use only) Currently not hooked up to any functionality.
• Use Alternate Path: Allows you to specify an alternate path file in the MXP format that the slaves can use to
find bitmaps that are not found on the primary map paths.
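The Force Strict Output Naming rewrite described above is a pure path transformation. The following is an illustrative sketch of that arithmetic (the function name is hypothetical; this is not Deadline's actual code):

```python
import ntpath  # Windows-style path handling, matching the UNC paths in the example above

def strict_output_name(scene_path, output_path):
    """Rewrite the output path to include the scene's name, as in the
    Force Strict Output Naming example (illustrative helper)."""
    # Scene name without its extension, e.g. "myScene" from "myScene.max".
    scene = ntpath.splitext(ntpath.basename(scene_path))[0]
    folder, filename = ntpath.split(output_path)
    ext = ntpath.splitext(filename)[1]
    # Place the renamed file in a sub-folder named after the scene.
    return ntpath.join(folder, scene, scene + ext)
```

Running this on the manual's example reproduces the documented result: the scene myScene.max with output \\myServer\images\output.tga yields \\myServer\images\myScene\myScene.tga.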
Render Output Autodesk ME Image Sequence (IMSQ) Creation
• Save File: Specify the render output. Note that this updates the 3ds Max Render Output dialog, and is meant as
a convenience to update the output file.
• Create Image Sequence (IMSQ) File: If checked, an Autodesk IMSQ file will be created from the output files
at the output location.
• Copy IMSQ File On Completion: If checked, the IMSQ file will be copied to the location specified in the text
field.
Options Tab
User Options
• Enable Local Rendering: If checked, Deadline will render the frames locally before copying them over to the
final network location.
• One Cpu Per Task: Forces each task of the job to only use a single CPU. This can be useful when doing single
threaded renders and the Concurrent Tasks setting is greater than 1.
• Automatically Update Job Name When Scene File Name Changes: If checked, the Job Name setting in the
submission dialog will automatically match the file name of the scene loaded. So if you load a new scene, the
Job Name will change accordingly.
• Override Renderer’s Low Priority Thread Option (Brazil r/s, V-Ray): When checked, the Low Priority
Thread option of the renderers supporting this feature will be forced to false during the submission. Both
Brazil r/s and V-Ray provide the feature to launch the renderer in a low priority thread mode. This is useful
when working with multiple applications on a workstation and the rendering should continue in the background
without eating all CPU resources. When submitting a job, though, this option should generally be disabled,
since we want all slaves to work at 100% CPU load.
• Clear Material Editor In The Submitted File: Clears the material editor in the submitted file during submission.
• Unlock Material Editor Renderer: If checked, the Material Editor’s Renderer will be unlocked to use the
Default Scanline Renderer to avoid problems with some old versions of V-Ray.
• Delete Empty State Sets In The Submitted File: Deletes any empty State Sets in the submitted file during
submission, and the State Sets dialog/UI will be reset. This works around an Autodesk bug when running 3ds Max
as a service.
• Warn about Missing External Files on Submission: When checked, a warning will be issued if the scene
being submitted contains any missing external files (bitmaps, etc.). Depending on the state of the 'Ignore Missing
External File Errors' checkbox under the Render tab, such files might not cause the job to fail but could cause
the result to look wrong. When unchecked, scenes with missing external files will be submitted without any
warnings.
• Warn about Copying External Files with Job only if: the file count is greater than 100 or the total size is
greater than 1024 MB. Both values can be configured to a studio's needs.
• Override 3ds Max Language: If enabled, you can choose a language to force during rendering.
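The copying warning thresholds above amount to a simple pair of comparisons. A sketch of that logic with the default values mentioned (the function name and parameters are illustrative, not Deadline's API):

```python
def should_warn(file_count, total_size_mb, max_count=100, max_size_mb=1024):
    """Return True if copying the job's external files warrants a warning,
    using the defaults of 100 files or 1024 MB (both studio-configurable)."""
    return file_count > max_count or total_size_mb > max_size_mb
```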
Export Renderer-Specific Advanced Settings
If this option is enabled for a specific renderer, you will be able to modify a variety of settings for that renderer after
submission from the Monitor. To modify these settings from the Monitor, right-click on the job and select Modify
Properties, then select the 3dsmax tab.
Submission Timeouts
• Job Submission Timeout in seconds: This value spinner defines how many seconds to wait for the external
Submitter application to return from the Job submission before stopping the attempt with a timeout message.
• Quicktime Submission Timeout in seconds: This value spinner defines how many seconds to wait for the
external Submitter application to return from the Quicktime submission before stopping the attempt with a
timeout message.
• Data Collection Timeout in seconds: This value spinner defines how many seconds to wait for the external
Submitter application to return from data collecting before stopping the attempt with a timeout message. Data
collecting includes collecting Pools, Categories, Limit Groups, Slave Lists, Slave Info, Jobs etc.
Limits Tab
Blacklist/Whitelist Slaves
Set the whitelist or blacklist for the job. See the Scheduling section of the Modifying Job Properties documentation
for more information on the available options.
Limits
Set the Limits that the job requires. See the Scheduling section of the Modifying Job Properties documentation for
more information on the available options.
StateSets Tab
Select the State Sets you want to submit to Deadline. This option is only available in 3ds Max 2012 (Subscription
Advantage Pack 1) and later.
Integration Tab
Project Management Data
The available Integration options are explained in the Draft and Integration documentation.
Deadline Draft Post-Render Processing
The available Draft/Integration options are explained in the Draft and Integration documentation.
Extra Info
These are some extra arbitrary properties that can be set for the job. Note that some of these are reserved when enabling
the Shotgun, FTrack or Draft settings.
Scripts Tab
Run Python Scripts
• Run Pre-Job Script: Specify the path to a Python script to execute when the job initially starts rendering.
• Run Post-Job Script: Specify the path to a Python script to execute when the job finishes rendering.
• Run Pre-Task Script: Specify the path to a Python script to execute before each task starts rendering.
• Run Post-Task Script: Specify the path to a Python script to execute after each task finishes rendering.
Run Maxscript Script
• Submit Script Job: This checkbox lets you turn the submission into a MAXScript job. When checked, the
scene will NOT be rendered, instead the specified MAXScript code will be executed for the specified frames.
Options that collide with the submission of a MAXScript Job like “Tile Rendering” and “Render Preview Job
First” will be disabled or ignored.
• Single Task: This checkbox lets you run the MAXScript Job on one slave only. When checked, the job will
be submitted with a single task specified for frame 1. This is useful when the script itself will perform some
operations on ALL frames in the scene, or when per-frame operations are not needed at all. When unchecked,
the frame range specified in the Render Scene Dialog of 3ds Max will be used to create the corresponding
number of Tasks. In this case, all related controls in the Job tab will also be taken into account.
• Workstation Mode: This checkbox is a duplicate of the one under the Render tab (checking one will affect the
other). MAXScript Jobs that require file I/O (loading and saving of 3ds Max files) or commands that require the
3ds Max UI to be present, such as manipulating the modifier stack, HAVE TO be run in Workstation mode (using
up a 3ds Max license on the Slave). MAXScript Jobs that do not require file I/O or 3ds Max UI functionality
can be run in Slave mode on any number of machines without using up 3ds Max licenses.
• New Script From Template: This button creates a new MAXScript without any execution code, but with all
the necessary template code to run a MAXScript Job on Deadline.
• Pick Script: This button lets you select an existing script from disk to use for the MAXScript Job. It is advisable
to use scripts created from the Template file using the “New Script From Template” button.
• Edit MAXScript File: This button lets you open the current script file (if any) for editing.
• Run Pre-Load Script: This checkbox lets you run a MAXScript specified in the text field below it BEFORE
the 3ds Max scene is loaded for rendering by the Slave.
• Run Post-Load Script: This checkbox lets you run a MAXScript specified in the text field below it AFTER the
3ds Max scene is loaded for rendering by the Slave.
• Run Pre-Frame Script: This checkbox lets you run a MAXScript specified in the text field below it BEFORE
the Slave renders a frame.
• Run Post-Frame Script: This checkbox lets you run a MAXScript specified in the text field below it AFTER
the Slave renders a frame.
• Post-Submission Function Call: This field can be used by TDs to enter an arbitrary user-defined MAXScript
Expression (NOT a path to a script!) which will be executed after the submission has finished. This can be used
to trigger the execution of user-defined functions or to press a button in a 3rd party script. In the screenshot, the
expression presses a button in a globally defined rollout which is part of an in-house scene management script.
If you want to execute a multi-line script after each submission, you could enter fileIn “c:\temp\somescript.ms”
in this field and the content of the specified file will be evaluated. The content of this field is sticky and saved in
the local INI file - it will persist between sessions until replaced or removed manually.
The MAXScript Job Template file is located in the Repository under \submission\3dsmax\Main\MAXScriptJobTemplate.ms.
When the button is pressed, a copy of the template file with a name pattern "MAXScriptJob_TheSceneName_XXXX.ms"
will be created in the \3dsmax#\scripts\SubmitMaxToDeadline folder, where XXXX is a random ID and 3dsmax# is the
name of the 3ds Max root folder. The script file will open in 3ds Max for editing. You can add the code to be executed
in the marked area and save to disk. The file name of the new template will be set as the current MAXScript Job file
automatically. If a file name is already selected in the UI, you will be prompted about replacing it first.
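The template-copy naming described above can be sketched as follows. The four-character length and character set of the random ID are assumptions for illustration, as is the function name:

```python
import random
import string

def template_copy_name(scene_name, id_length=4):
    """Build a template copy name matching the documented pattern
    'MAXScriptJob_TheSceneName_XXXX.ms' (ID length is an assumption)."""
    rand_id = "".join(random.choices(string.ascii_uppercase + string.digits,
                                     k=id_length))
    return "MAXScriptJob_%s_%s.ms" % (scene_name, rand_id)
```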
Deadline exposes an interface to MAXScript, which allows you to gather information about the job being rendered.
See the Maxscript Interface documentation for the available functions and properties.
Tiles Tab
Tile & Region Rendering Options
• Region Rendering Mode: This drop-down list controls the various rendering modes:
– FULL FRAME Rendering, All Region Options DISABLED - this is the default mode of the Submitter.
No region rendering will be performed and the whole image will be rendered.
– SINGLE FRAME, MULTI-REGION ‘Jigsaw’ Rendering - Single Job, Regions As Tasks - this mode
allows one or more regions to be defined and rendered on one or more network machines. Each region
can be optionally sub-divided to a grid of sub-regions to split between machines. The resulting fragments
will then be combined to a new single image, or optionally composited over a previous version of the full
image using DRAFT. This mode is recommended for large format single frame rendering. Note that the
current frame specified by the 3ds Max TIME SLIDER will be rendered, regardless of the Render Dialog
Time settings.
– ANIMATION, MULTI-REGION ‘Jigsaw’ Rendering - One Job Per Region, Frames As Tasks - this
mode allows one or more regions to be defined and rendered on one or more network machines. Each
region can be optionally sub-divided to a grid of sub-regions to split between machines. Each region
can be optionally animated over time by hand or by using the automatic tracking features. The resulting
fragments from each frame will then be combined to a new single image, or optionally composited over
a previous version of the full image using DRAFT. This mode is recommended for animated sequences
where multiple small portions of the scene are changing relative to the previous render iteration.
– SINGLE FRAME TILE Rendering - Single Job, Tiles As Tasks - this mode splits the final single image
into multiple equally-sized regions (Tiles). Each Tile will be rendered by a different machine and the final
image can be assembled either using DRAFT, or by the legacy command line Tile Assembler. This mode
is recommended when the whole image needs to be re-rendered, but you want to split it between multiple
machines.
– ANIMATION, TILE Rendering - One Job Per Tile, Frames As Tasks - this mode submits a job for each
tile and a post task maxscript will assemble the tiles once they are all rendered per frame for each job.
– 3DS MAX REGION Rendering - Single Job, Frames As Tasks - this mode allows for traditional 3ds
Max REGION, BLOWUP and CROP render modes to be used via Deadline.
• Cleanup Tiles After Assembly: When checked, the Tile image files will be removed after the final image has
been assembled. Keep this unchecked if you intend to resubmit some of the tiles and expect them to re-assemble
with the previous ones.
• Pixel Padding: Default is 4 pixels. This is the number of pixels to be added on each side of the region or tile to
ensure better stitching through some overlapping. Especially when rendering Global Illumination, it might be
necessary to render tiles with significant overlapping to avoid artifacts.
• Copy Draft Config Files To Output Folder: When checked, the configuration files for Draft Assembly jobs
will be duplicated in the output folder(s) for archiving purposes. The actual assembling will be performed using
the copies stored in the Job Auxiliary Files folder. Use this option if you want to preserve a copy next to the
assembled frames even after the Jobs have been deleted from the Deadline Repository.
• Draft Assembly Job Error On Missing Tiles: When unchecked, missing region or tile fragments will not
cause errors and will simply be ignored, leaving either black background or the previous image’s pixels in the
assembled image. When checked, the Assembly will only succeed if all requested input images have been found
and actually put together.
• Override Pool, Group, Priority for Assembly Job: When enabled, the Assembly Pool, Secondary Pool, Group
and Priority settings will be used for the Assembly Job instead of the main job’s settings.
The output formats that are supported by the Tile Assembler jobs are BMP, DDS, EXR, JPG, JPE, JPEG, PNG, RGB,
RGBA, SGI, TGA, TIF, and TIFF.
Jigsaw [Single-Frame | Animation] Multi-Region Rendering
This rollout contains all controls related to defining, managing and animating multiple regions for the 'Jigsaw' modes.
The rollout title will change to include an ACTIVE: prefix and the “Single-Frame” or “Animation” token when the
respective mode is selected in the Region Rendering Mode drop-down list (see above).
• UPDATE List: Press this button to refresh the ListView.
• LOAD/SAVE File...: Click to open a menu with the following options:
– LOAD Regions From Disk Preset File...: Selecting this option will open a file open dialog and let you
select a previously saved Regions Preset. Any existing regions will be replaced by the ones from the file.
– MERGE Regions From Disk Preset File...: Selecting this option will open a file open dialog and let you
select a previously saved Regions Preset. Any existing regions will be preserved, and the file regions will
be appended to the end of the list.
– SAVE Regions To Disk Preset File...: Only enabled if there are valid regions on the list. When selected,
a file save dialog will open and let you save the current regions list to a disk preset for later loading or
merging in the same or different projects.
• GET From Camera...: If the current view is a Camera, a list of region definitions stored in the current view’s
Camera will be displayed, allowing you to replace the current region list with the stored one. If the current view
is not a Camera view, a warning message will be shown asking you to select a Camera view. If the current view’s
Camera does not have any regions stored in it, nothing will happen.
• STORE In Camera...: If the current view is a Camera, a list of region definitions stored in the current view’s
Camera will be displayed, with the added option to Save New Preset... in a new “slot”. Alternatively, you can
select any of the previously stored “slots” to override or update. The Notes text specified in the Notes: field
below will be used to describe the preset. Also, additional information including the number of regions, the
user, machine name, date and time and the MAX scene name will be stored with the preset.
• Notes: Enter a description of the current Region set to be used when saving a Preset to disk or camera. When a
preset is loaded, the field will display the notes stored with the preset.
• ADD New Region: Creates a new region and appends it to the list. If objects are selected in the scene, the
region will be automatically resized to frame the selection. If nothing is selected, the Region will be set to the
full image size.
• CREATE From...: Click to open a context menu with several multi-region creation options:
• Create from SCENE SELECTION...: Select one or more objects in the scene and pick this option to create one
region for each object in the selection. Note that regions might overlap or be completely redundant depending
on the size and location of the selected objects - use the OPTIMIZE options below to reduce them.
• Create from TILES GRID...: Pick this option to create one region for each tile specified in the Tiles rollout.
For example, if the Tiles in X is set to 4 and Tiles in Y is 3, 12 regions resembling the Tile Grid will be created.
Note that once the regions are created, some of them can be merged together, others can be subdivided or split
as needed to distribute regions with different content and size to different machines, providing more flexibility
than the original Tiles mode.
• Create from 3DS MAX REGION...: Create a region with the size specified by the 3ds Max Region gizmo.
• OPTIMAL FILL Of Empty Areas: After the grid is created, two passes are performed: first a Horizontal
Fill, where regions are merged horizontally to produce wider regions, then a Vertical Fill, merging regions with
shared horizontal edges. The result is the fewest tiles, equivalent to manually merging any neighboring
tiles with shared edges in Maya Jigsaw. Thus, it is the top (recommended) option.
• HORIZONTAL FILL Of Empty Areas: After creating the grid, a pass is performed over all regions to find
neighbors sharing vertical edges. When two regions share an edge and the same top and bottom corner, they
get merged. This is the equivalent to the Maya Jigsaw behavior, producing wider regions where possible, but
leaving a lot of horizontal edges between tiles with the same width.
• VERTICAL FILL Of Empty Areas: After creating the grid, a pass is performed to merge neighboring regions
sharing a horizontal edge with the same left/right corners. The result is the opposite of the Horizontal Fill - a lot
of tall regions.
• GRID FILL Of Empty Areas: Takes the horizontal and vertical coordinates of all tiles and creates a grid that
contains them all. No merging of regions will be performed.
• OPTIMIZE Regions, Overlap Threshold > 25%: Compare the overlapping of all highlighted regions and if
the overlapping area is more than 25% of the size of the smaller one of the two, combine the two regions to a
single region. Repeat for all regions until no overlapping can be detected.
• OPTIMIZE Regions, Overlap Threshold > 50%: Same as the previous option, but with a larger overlap
threshold.
• OPTIMIZE Regions, Overlap Threshold > 75%: Same as the previous options, but with an even larger
overlap threshold.
• Clone LEFT|RIGHT: Select a single region in the list and click with the Left Mouse Button to clone the region
to the left, or Right Mouse Button to clone to the right. The height will be retained. The width will be clamped
automatically if the new copy is partially outside the screen.
• Clone UP|DOWN: Select a single region in the list and click with the Left Mouse Button to clone the region up,
or Right Mouse Button to clone down. The width will be retained. The height will be clamped automatically if
the new copy is partially outside the screen.
• FIT to N Objects / Fit Padding Value: Highlight exactly one region in the list and select one or more objects
in the scene, then click with the Left Mouse Button to perform a precise vertex-based Fit to the selection, or
click with the Right Mouse Button to perform a quick bounding-box based Fit to the selection. Click the small
button with the number to the right to select the Padding Percentage to use when fitting in either mode.
• TRACK Region...: Left-click to open the Track dialog in Vertex-based mode for the currently selected region
and scene objects. Right-click for Bounding Box-based mode. While you can switch the mode in the dialog
itself, both the radio buttons and the Padding % values will be adjusted for faster access according to the mouse
button pressed.
• SELECT | INVERT: Left-click to highlight all regions on the list. Right-click to invert the current selection.
• DELETE Regions: Click to delete the highlighted regions on the list.
• SET Keyframe: Highlight one or more regions and click this button to set a keyframe with the current region
settings at the current time.
• << PREVIOUS Key: Click to change the time slider to the previous key of the highlighted region(s), if there
are such keys.
• NEXT Key >>: Click to change the time slider to the next key of the highlighted region(s), if there are
such keys.
• DELETE Keyframe: Click to delete the keys (if any) of the highlighted regions. If there is no key on the
current frame, nothing will happen. Use in conjunction with Previous/Next Key navigation to delete keys that
actually exist.
• Regions ListView: The list view is the main display of the current region settings. It provides several columns
and a set of controls under each column for editing the values on the list:
– On # column: Shows a checkbox to toggle a region on and off for rendering, and the index of the region.
– X and Y columns: These two columns display the coordinates of the upper left corner of the Region. Note
that internally the values are stored in relative screen coordinates, but in the list they are shown in current
output resolution pixel coordinates for convenience. Changing the output resolution in the Render Setup
dialog and pressing the UPDATE List button will recalculate the pixel coordinates accordingly.
– Width and Height columns: These two columns display the width and height of the region in pixels. Like
the upper left corner’s X and Y coordinates, they are stored internally as relative screen coordinates and
are shown as pixels for convenience.
– Tiles column: Each region can additionally be subdivided horizontally and vertically into a grid of sub-tiles,
each to be rendered by a different network machine. This column shows the number of tiles of the
region; the default is 1x1.
– Keys column: This column shows the number of animation keys recorded for the region. By default, regions
have no animation keys and will show – in the column unless animated manually or via the Tracking
option.
– Locked column: After Tracking, the region will be locked automatically to avoid accidental changes to
its position and size. You can also lock the region manually if you want to prevent it from being moved
accidentally.
– Notes column: This column displays auto-generated or user-defined notes for each region. When a region
is created, it might be given a name based on the object it was fitted to, the original region it was cloned or
split from etc. You can enter descriptive notes to explain what every region was meant for.
• UNDO... / REDO...: Most operations performed in the Multi-Region rollout will create undo records automatically.
The Undo buffer is saved to disk in a form similar to the presets, and you can undo or redo individual
steps by left-clicking the button, or multiple steps at once by right-clicking and selecting from a list.
• HOLD: Not all operations produce a valid undo record. If you feel that the next operation might be dangerous,
you can press the HOLD button to force the creation of an Undo record at the current point to ensure you can
return back to it in case the following operations don’t produce desirable results.
• SPLIT To Tiles: Pressing this button will split the highlighted region to new regions according to the Tiles
settings, assuming they are higher than 1x1 subdivisions. You can use this feature together with the Tiles
controls to quickly produce a grid of independent regions from a single large region. For example, if you create
a single region with no scene selection, it will have the size of the full screen. Enter Tile values like 4 and 3 and
hit the SPLIT To Tiles to produce a grid of 12 regions.
• MERGE Selected: Highlight two or more regions to merge them into a single region. The regions don’t have
to necessarily touch or overlap - the minimum and maximum extents of all regions will be found and they will
be replaced by a single region with that position and size.
• Summary Field: This field displays information about the number of regions and sub-regions (tiles), the number
of pixels to be rendered by these regions, and the percentage of pixels that would be rendered compared to the
full image.
• Assemble Over... drop-down list: This list provides the assembly compositing options:
– Assemble Over EMPTY Background: The regions will be assembled into a new image using a black
empty background with zero alpha.
– Compose Over PREVIOUS OUTPUT Image: The regions will be assembled over the previously rendered (or assembled) image matching the current output filename (if it exists). If such an image does not
exist, the regions will be assembled over an empty background.
– Compose Over CUSTOM SINGLE Image: The regions will be assembled over a user-defined bitmap
specified with the controls below. The same image will be used on all frames if an animation is rendered.
– Compose Over CUSTOM Image SEQUENCE: The regions will be assembled over a user-defined image
sequence specified with the controls below. Each frame will use the corresponding frame from the image
sequence.
• Pick Custom Background Image: Press this button to select the custom image or image sequence to be used in
the last compositing modes above. Make sure you specify a network location that can be accessed by the Draft
jobs on Deadline performing the Assembly!
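As noted in the X/Y and Width/Height column descriptions above, regions are stored internally as relative screen coordinates and displayed as pixels at the current output resolution, so changing the resolution simply re-scales the displayed values. A sketch of that conversion (illustrative helper functions, not Deadline's code):

```python
def to_pixels(region, width, height):
    """Convert a region from relative [0, 1] coordinates to pixel
    coordinates at the given output resolution.
    region = (x, y, w, h) with x/w relative to width, y/h to height."""
    x, y, w, h = region
    return (round(x * width), round(y * height),
            round(w * width), round(h * height))

def to_relative(region_px, width, height):
    """Inverse conversion: pixel coordinates back to relative values,
    as stored internally."""
    x, y, w, h = region_px
    return (x / width, y / height, w / width, h / height)
```

Pressing UPDATE List after a resolution change corresponds to calling to_pixels again on the stored relative values.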
[Single-Frame | Animation] Tile Rendering
• Tiles In X / Tiles In Y: These values specify the number of tiles horizontally and vertically. The total number
of tiles (and jobs) to be rendered is calculated as X*Y and is displayed in the UI.
• Show Tiles In Viewport: Enables the tile display gizmo.
• Tile Pixel Padding: This value defines the number of pixels to overlap between tiles. By default it is set to
0, but when rendering Global Illumination, it might be necessary to render tiles with significant overlapping to
avoid artifacts.
• Re-Render User-Defined Tiles: When checked, only user-defined tiles will be submitted for re-rendering. Use
the [Specify Tiles To Re-render...] check-button to open a dialog and select the tiles to be rendered.
• Specify Tiles To Re-render: When checked, a dialog to select the tiles to be re-rendered will open. To close
the dialog, either uncheck the button or press the [X] button on the dialog’s title bar.
• Enable Blowup Mode: If enabled, tile rendering will work by zooming in on the region and rendering it at a
smaller resolution. Then that region is blown up to bring it to the correct resolution. This has been known to
help save memory when rendering large high resolution images.
• Submit All Tiles As A Single Job: By default, a separate job is submitted for each tile (this allows for tile
rendering of a sequence of frames). For easier management of single frame tile rendering, you can choose to
submit all the tiles as a single job.
• Submit Dependent Assembly Job: When rendering a single tile job, you can also submit a dependent assembly
job to assemble the image when the main tile job completes.
• Use Draft For Assembly: If enabled, Draft will be used to assemble the images. Note that you’ll need a Draft
license from Thinkbox.
Region Rendering
When enabled, only the specified region will be rendered and, depending on the region type selected, it can be cropped
or blown up as well. Enabling this option automatically unchecks the Enable Distributed Tiles Rendering checkbox.
This option REPLACES the “Crop” option in the Render mode drop-down list in the 3ds Max UI. In other words, the
3ds Max option does not have to be selected for Region Rendering to be performed on Deadline. The region can be
specified either using the CornerX, CornerY, Width and Height spinners, or by getting the current region from the
active viewport. To do so, set the Render mode drop-down list to either Region or Crop, press the Render icon and
drag the region marker to specify the desired size. Then press ESC to cancel the render, and press the Get Region
From Active View button to capture the new values.
Misc Tab
Quicktime Generation From Rendered Frame Sequence
Create a Quicktime movie from the frames rendered by a 3ds Max job. See the Quicktime documentation for more
information on the available options.
Render To Texture
This option enables texture baking through Deadline. Use the Add, Remove, and Clear All buttons to add and remove
objects from the list of objects to bake.
• One Object Per Task: If enabled, each RTT object will be allocated to an individual task, thereby allowing
multiple machines to carry out RTT processing simultaneously.
Batch Submission
• Use Data from 3ds Max Batch Render: This checkbox enables Batch Submission using the 3ds Max Batch
Render dialog settings. If checked, a single MASTER job will be sent to Deadline which in turn will “spawn”
all necessary BATCH jobs.
• Open Dialog: This button opens the 3ds Max Batch Render dialog in Version 8 and higher.
• Update Info: This button reads the 3ds Max Batch Render dialog settings and displays the number of enabled
vs. defined Views.
Sanity Check
The 3ds Max Sanity Check script defines a set of functions to be called to ensure that the scene submission does not
contain typical errors like wrong render view and frame range settings, incorrect output path, etc.
The Sanity Check is enabled by the Run Sanity Check Automatically Before Submission checkbox in the User Options
group of controls in the Submit To Deadline (3dsmax) dialog. You can also run the Sanity Check manually by
clicking the Run Now! button.
The dialog contains the following elements:
• The upper area (Error Report) lists the problems found in the current scene.
• The lower area (Feedback Messages) lists the actions the Sanity Check performs and gives feedback to the user.
The latest message is always on top.
• Between the two areas, there is a summary text line listing the total number of errors and a color indicator of the
current Sanity Check state. When red, the Sanity Check will not allow a job submission to be performed.
The Error Report
The left column of the Error Report displays a checkbox and the type of the error. The checkbox determines whether
the error will be taken into account by the final result of the check. Currently, there are 3 types of errors:
• FATAL: The error cannot be fixed automatically and requires manual changes to the scene itself. A job submission with such an error would be pointless. The state of the checkbox is ignored and considered always
checked.
• Can Be Fixed: The error can be fixed automatically or manually. If the checkbox is active, the error contributes
to the final result. If unchecked, the error is ignored and handled as a warning.
• Warning: The problem might not require fixing, but could be of importance to the user. It is not taken into
account by the final result (the state of the checkbox is ignored and considered always unchecked).
Repairing Errors
Right-clicking an Error Message in the Error Report window will cause an associated repair function to be executed
and/or a Report Message to be output in the Feedback Messages window. This difference was caused by the switch
to DotNet controls, which handle double-clicks as check events and would change the checkbox state in front of the
error instead.
Updating the Error Report
You can rerun/update the Sanity Check in one of the following ways:
• Clicking the dialog anywhere outside of the two message areas will rerun the Sanity Check and update all
messages.
• Double-clicking any Message in the Feedback Messages window will rerun the Sanity Check and update all
messages.
• Repairing an error by double-clicking will also automatically rerun the Sanity Check.
• Pressing the Run Now! button in the Submit To Deadline dialog will update the Sanity Check.
FATAL Sanity Checks
These are errors that must be fixed manually before the job can be submitted.
• The scene does not contain ANY objects!
– Description: The scene is empty and should not be sent to Deadline.
– Fix: Load a valid scene or create/merge objects, then try again.
• Maxwell is the renderer and the current view is NOT a Camera.
– Description: The Maxwell renderer must render through an actual camera and will fail through a viewport.
– Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.
• The scene contains objects or groups with the same name as a camera!
– Description: The scene contains objects or groups with a duplicate name to a camera, which could result in an incorrect object being used as the camera.
– Fix: Ensure you remove any duplicate named objects from your scene.
• Maxwell is the renderer and the Render Time Output is set to a SINGLE FRAME! (Check is currently disabled in SMTD)
– Description: Maxwell has an issue with single frame rendering.
– Fix: Double-click the error message to change the Rendering Output Time to an animation with just the current frame.
• Render Output Path length exceeds 255 characters!
– Description: Ensure the render output file save path is less than 255 characters in length.
– Fix: Double-click the error message to open the Render Scene Dialog and manually shorten the path length.
• Render Elements Output Path length exceeds 255 characters!
– Description: Ensure any Render Element file save path lengths are less than 255 characters in length.
– Fix: Double-click the error message to open the Render Scene Dialog and manually shorten the Render Element path length.
• Duplicate Render Elements saving to same File Found!
– Description: One or more Render Elements are saving to an identical file path and file name.
– Fix: Double-click the error message to open the Render Scene Dialog and manually resolve the duplication.
• Scene Object(s) contain names > 255 characters!
– Description: One or more objects in the scene has an object name which is greater than 255 characters in length, which will crash Max.
– Fix: Shorten the character length of all the objects in your scene to ensure stability.
• Corrupt Group(s) detected in your Scene!
– Description: One or more objects in your scene are a group head but have no child members!
– Fix: Double-click the error message to automatically have these corrupt nodes deleted from the scene. Results are printed to the Sanity Check Window.
• Multi-Region Rendering Requested, But No Active Regions Found!
– Description: Jigsaw Multi-Region Rendering has been enabled, but there are NO active regions enabled in the SMTD Tiles Tab UI.
– Fix: Ensure at least 1 region is active in the Jigsaw Multi-Region Rendering UI in the Tiles tab of SMTD.
• V-Ray Save Raw Image File is Enabled, but Raw Image File Path is Empty!
– Description: V-Ray VFB Save Raw Image File is enabled but NO save file path has been declared!
– Fix: Double-click the error message to open the Render Scene Dialog and manually enter a valid file save path.
• V-Ray Save Separate Render Channels is Enabled, but Separate Render Channels File Path is Empty!
– Description: V-Ray VFB Save Separate Render Channels is enabled but NO save file path has been declared!
– Fix: Double-click the error message to open the Render Scene Dialog and manually enter a valid file save path.
• V-Ray VFB Save Raw Image File - [Generate preview] should be Disabled!
– Description: V-Ray VFB [Generate Preview] must be disabled for network rendering.
– Fix: Double-click the error message to disable the [Generate Preview] button in the VFB.
• V-Ray VFB - [Region render] button should be Disabled!
– Description: V-Ray VFB [Region render] must be disabled for network rendering.
– Fix: Double-click the error message to disable the [Region render] button in the VFB.
• V-Ray VFB - [Track mouse while rendering] button should be Disabled! (Check is currently disabled in SMTD)
– Description: V-Ray VFB [Track mouse while rendering] button must be disabled for network rendering.
– Fix: Double-click the error message to disable the [Track mouse while rendering] button in the VFB.
• V-Ray RE: [Alpha, Reflection, Refraction] or [Save alpha] requires Draft Tile Assembler. NOT supported with TA.
– Description: When using Jigsaw Single-Frame Tile Rendering with V-Ray Render Elements such as Alpha, Reflection, Refraction, OR [Save alpha] via the VFB, ensure you use the Draft Tile Assembler, which is able to support the higher bit depths created by these Render Elements.
– Fix: Double-click the error message to enable Draft as the Tile Assembler.
Fixable Sanity Checks
The following Sanity Checks can be automatically fixed before the job is submitted.
• The current Scene Name is Untitled.
– Description: The scene has never been saved to a MAX file. While it is possible to submit an untitled scene to Deadline, it is not a good practice.
– Fix: Double-click the error message to open a Save As dialog and save to disk.
• The current view is NOT a camera.
– Description: The active viewport is not a camera viewport.
– Fix: Double-click the error message to open a Select By Name dialog to pick a camera for the current viewport.
• The Render Time Output is set to SINGLE FRAME!
– Description: While it is ok to send a single frame to Deadline, users are sending animations in 99% of cases.
– Fix: Double-click the error message to set the Render Time Output to “Active Time Segment”. The Render Dialog will open so you can check the options and set to Range or Frames instead.
• The Render Output Path appears to point at a LOCAL DRIVE!
– Description: While it is technically possible to save locally on each Slave, this is a bad idea - all Slaves should write their output to a central location on the network. Currently, disks C:, D: and E: are considered local and will be tested against the output path.
– Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.
• The Render Output File Name ends with a DIGIT - trailing numbers might fail.
– Description: The name to be saved to ends with one, two or three digits. Rendering to this file name will append 4 more digits and make loading sequential files in other applications hard or impossible. This check is performed only when the type is not AVI or MOV, and will ignore 4 trailing digits, which will be replaced by 3dsmax correctly when rendering to sequential files.
– Fix: Double-click the error message to add an underscore _ to the end of the file name; for example, z:\temp\test123.tga will be changed to z:\temp\test123_.tga.
• The Render Output will not be saved to a file.
– Description: No renders will be saved, as the Render Scene Dialog checkbox is currently disabled.
– Fix: Double-click the error message to open the Render Dialog and enable the Save File checkbox.
• The Distributed Rendering option is enabled for this renderer.
– Description: Checks if Distributed Rendering is enabled for the MR or V-Ray renderer.
– Fix: Double-click the error message to disable Distributed Rendering.
• Workstation Mode must be enabled to use V-Ray Distributed Rendering.
– Description: 3dsMax must use a workstation license to allow Distributed Rendering to work when it is being offloaded onto the farm.
– Fix: Double-click the error message and Workstation Mode will be enabled in SMTD.
• The Render Time Output is NOT set to single frame, and Remove Filename Padding is enabled!
– Description: When rendering animations, you should allow filename padding to ensure an image sequence is created during rendering.
– Fix: Double-click the error message to change the Rendering Output Time to SINGLE FRAME.
• The current Renderer is Krakatoa and Particle Cache is ON!
– Description: Particle and Lighting Cache should be disabled during SMTD submission to the Deadline queue.
– Fix: Double-click the error message to set the PCache & LCache to be disabled.
• One or more Render Element Save File Paths are EMPTY! (V-Ray? - Disable the Individual RE)
– Description: Ensure that a Render Element Output File has been selected for each Render Element! If using the V-Ray Frame Buffer and ALL Render Elements have been disabled, then IGNORE this Sanity Check!
– Fix: Double-click the error message to open the Render Scene Dialog and manually resolve the issue, or it can be safely ignored.
• Camera Match Background Image(s) in your Scene. Right-click to REMOVE these ref. bitmaps!
– Description: Surplus camera match background images in your scene cause unnecessary bitmap references in your scene file.
– Fix: Double-click the error message to delete any background image in ALL cameras in your scene file.
• Alpha Channel will NOT be stored if saving *.tga file @ 16/24bit depth! Select 32bit for Alpha!
– Description: Ensure you select 32bit in the TGA image plugin file format options to ensure an alpha channel is stored in the TGA file.
– Fix: Double-click the error message to open the Render Scene Dialog and manually configure the bit depth of the TGA image file to be saved.
Warnings
The following Sanity Checks are simply warnings.
• The Render Output Path is NOT DEFINED!
– Description: No frames will be saved to disk. This is allowed if you want to output render elements only.
– Fix: Double-click the error message to open the Render Dialog and select a valid path, then double-click again to retest.
• The Render Output is set to a MOVIE format.
– Description: The file extension is set to an AVI or MOV format. In the current version of Deadline, this would result in a sequence of single frame MOV files rendered by separate slaves. In the future, the behaviour might be changed to render a single MOV or AVI file on a single slave as one Task.
– Fix: Double-click the error message to open the Render Dialog and select a single frame output format, then double-click again to retest.
• Not rendering final image (GI) so Restart Renderer should be disabled, and Machine Limit set to 1.
– Description: “Don’t render final image” is enabled, so Restart Renderer and Machine Limit should be set to 1 in SMTD.
– Fix: Double-click the error message and Restart Renderer will be disabled and Machine Limit enabled and set to 1 in SMTD.
• Restart Renderer Between Frames is disabled and V-Ray or Brazil is the selected renderer.
– Description: The V-Ray & Brazil renderers need Restart Renderer to be enabled to ensure memory levels are purged during rendering.
– Fix: Double-click the error message to enable Restart Renderer in the SMTD settings.
• Viewport is currently locked, which can result in incorrect renders with Deadline.
– Description: The locked viewport setting in 3dsMax 2009-2014 is ignored in the SDK, but is fixed in 3dsMax 2015 onwards.
– Fix: Double-click the error message to disable the Locked Viewport (padlock) in the Render Scene Dialog.
• Tile Rendering is enabled and the V-Ray VFB is currently on.
– Description: Unexpected results can occur when the V-Ray VFB is enabled and you are Tile Rendering. Consider where any Render Elements may be saving to, including the use of the VFB Split Channels and RAW output.
– Fix: Double-click the error message to disable the V-Ray VFB output checkbox.
This list will be extended to include future checks and can be edited by 3rd parties by adding new definitions and
functions to the original script. Documentation on extending the script will be published later. Please email suggestions
for enhancements and additional test cases to Deadline Support.
9.2.2 V-Ray/Mental Ray off-load DBR
You can offload a V-Ray or Mental Ray DBR job to Deadline by enabling the Distributed Rendering option in your
V-Ray or Mental Ray settings, and by enabling the V-Ray/Mental Ray DBR checkbox in the submission dialog (under
the Render tab). With this option enabled, a job will be submitted with its task count equal to the number of Slaves
you specify, and it will render the current frame in the scene file.
The slave that picks up task 0 will be the “master”, and will wait until all other tasks are picked up by other slaves.
Once the other tasks have been picked up, the “master” will update its local V-Ray or Mental Ray config file with the
names of the machines that are rendering the other tasks. It will then start the distributed render by connecting to the
other machines. Note that the render will not start until ALL tasks have been picked up by a slave.
It is recommended to set up V-Ray DBR or Mental Ray DBR for 3ds Max and verify it is working correctly prior
to submitting a DBR off-load job to Deadline. RTT (Render To Texture) is not supported with distributed bucket
rendering. If running multiple Deadline slaves on one machine, having two or more of these slaves concurrently pick
up different DBR jobs (as either master or slave) is not supported.
Notes for V-Ray DBR:
• You MUST have the Force Workstation Mode option enabled in the submission dialog (under the Render tab).
This means that the “master” will use up a 3ds Max license. If you don’t want to use a 3ds Max license, you can
submit to the 3ds Command plugin instead.
• Ensure V-Ray is the currently assigned renderer in the 3ds Max scene file prior to submission.
• You must have the Distributed Rendering option enabled in your V-Ray settings under the Settings tab.
• Ensure “Save servers in the scene” (“Save hosts in the scene” in V-Ray v2) option in V-Ray distributed rendering
settings is DISABLED as otherwise it will ignore the vray_dr.cfg file list!
• Ensure “Max servers” value is set to 0. When set to 0 all listed servers will be used.
• It is recommended to disable the “Use local host” checkbox to reduce network traffic on the “master” machine
when using a large number of slaves (5+). If disabled, the “master” machine only organises the DBR process,
sending rendering tasks to the Deadline slaves. This is particularly important if you intend to use the V-Ray v3+
“Transfer missing assets” feature. Note that the Windows 7 OS has a limitation of a maximum of 20 other machines
concurrently connecting to the “master” machine.
• V-Ray v3.00.0x has a bug in DBR where, even when “Use local host” is unchecked, it still demands a render node
license. This is resolved in newer versions of V-Ray. Please contact Chaos Group for more information.
• The slaves will launch the V-Ray Spawner executable found in the 3ds Max root directory. Do NOT install the
V-Ray Spawner as a service on the master or slave machines. Additionally, Drive Mappings are unsupported
when running as a service.
• The vray_dr.cfg file in the 3ds Max plugcfg directory must be writeable so that the “master” machine can
update it. This is typically located in the user profile directory, in which case it will be writeable already.
• Chaos Group recommends that each machine to be used for DBR has previously rendered at least one other 3ds
Max job prior to trying DBR on the same machine.
• Ensure all slaves can correctly access any mapped drives or resolve all UNC paths to obtain any assets required
by the 3ds Max scene file to render successfully. Use the Deadline Mapped Drives feature to ensure the necessary
drive mappings are in place.
• Default lights are not supported by Chaos Group in DBR mode and will not render.
• Ensure you have sufficient V-Ray DR licenses if processing multiple V-Ray DBR jobs through Deadline concurrently. Use the Deadline Limits feature to limit the number of licenses being used at any time.
• Ensure the necessary V-Ray executables & TCP/UDP ports have been allowed to pass through the Windows
Firewall. Please consult the V-Ray user manual for specific information.
• V-Ray in 3ds Max does NOT currently support dynamically adding or removing DBR slaves from a DBR render
that has already started on the “master” slave.
Notes for Mental Ray DBR:
• Ensure Mental Ray is the currently assigned renderer in the 3ds Max scene file prior to submission.
• You must have the Distributed Render option enabled in your Mental Ray settings under the Processing tab.
• The Mental Ray Satellite service must be running on your slave machines. It is installed by default during the
3ds Max 2014 or earlier installation. Note that ADSK changed this default from 3dsMax 2015 onwards and the
Mental Ray Satellite Service is installed as part of the install process but is NOT automatically started, so you
will need to start it manually the very first time. See this AREA blog post about Distributed Bucket Rendering
in 3ds Max 2015.
• The max.rayhosts file must be writeable so that the “master” machine can update it. Its location is different for
different versions of 3ds Max:
– 2010 and earlier: It will be in the “mentalray” folder in the 3ds Max root directory.
– 2011 and 2012: It will be in the “mentalimages” folder in the 3ds Max root directory.
– 2013 and later: It will be in the “NVIDIA” folder in the 3ds Max root directory.
• Ensure the “Use Placeholder Objects” checkbox is enabled in the “Translator Options” rollout of the “Processing” tab. When placeholder objects are enabled, geometry is sent to the renderer only on demand.
• Ensure “Bucket Order” is set to “Hilbert” in the “Options” section of the “Sampling Quality” rollout of the
“Renderer” tab. With Hilbert order, the sequence of buckets to render uses the fewest number of data transfers.
• Contour shading is not supported with distributed bucket rendering.
• Autodesk Mental Ray licensing in 3ds Max is restricted. Autodesk says “Satellite processors allow any owner
of a 3ds Max license to freely use up to four slave machines (with up to four processors each and an unlimited
number of cores) to render an image using distributed bucket rendering, not counting the one, two, or four
processors on the master system that runs 3ds Max.” Mental Ray Standalone licensing can be used to go beyond
this license limit. Use the Deadline Limits feature to limit the number of licenses being used at any time if
required.
• Ensure the necessary Mental Ray executables & TCP/UDP ports have been allowed to pass through the Windows Firewall. Please consult the Autodesk 3ds Max user manual for specific information.
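As a point of reference, the max.rayhosts file mentioned above is typically just a plain-text list of render host names, one per line. The machine names below are hypothetical examples, not values you should copy verbatim:

```text
rendernode01
rendernode02
rendernode03
```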
9.2.3 Plug-in Configuration
You can configure the 3dsmax plugin settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the 3dsmax plugin from the list on the left.
3ds Max Render Executables
• 3ds Max Executable: The path to the 3ds Max executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.
3ds Max Design Render Executables
• 3ds Max Design Executable: The path to the 3ds Max Design executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
Render Options
• Alternate Plugin ini File: Location of alternate plugin ini file.
• Fail On Existing 3dsmax Process: Prevents Deadline from rendering when 3dsmax is already open.
• Run Render Sanity Check: If enabled, Deadline will do a quick sanity check with 3dsmaxcmd.exe prior to
rendering to ensure 3dsmax is properly set up for network rendering.
• Kill ADSK Comms Center Process: If enabled, Deadline will kill Autodesk Communications Center process
if it’s running during network rendering.
• Disable Saving Output To Alternate File Name: If enabled, Deadline won’t try to rename the output file(s) if
it is unable to save the output to its default file name.
Timeouts
• Timeout For Loading 3dsmax: Maximum time for 3dsmax to load, in seconds.
• Timeout For Starting A Job: Maximum time for 3dsmax to start a job, in seconds.
• Timeout For Progress Updates: Maximum time before progress update times out, in seconds.
V-Ray DBR and Mental Ray Satellite Rendering
• Use IP Addresses: If offloading a V-Ray DBR or Mental Ray Satellite render to Deadline, Deadline will update
the appropriate config file with the host names of the machines that are running the V-Ray Spawner or Satellite
service. If this is enabled, the IP addresses of the machines will be used instead.
9.2.4 Firewall Considerations
Here is a non-exhaustive list of specific 3dsMax-related application executables which should be granted access to
pass through the Windows Firewall for all applicable policy scopes (Windows - domain, private, public) and both
inbound & outbound rules (where <maxroot> is the 3dsMax install directory):
• <maxroot>/3dsmax.exe
• <maxroot>/3dsmaxcmd.exe
• <maxroot>/maxadapter.adp.exe
• <maxroot>/vrayspawnerYYYY.exe where YYYY is the yearDate such as “2015” (Only applicable if V-Ray
installed)
• <maxroot>/python/python.exe
• <maxroot>/python/pythonw.exe
Autodesk Communication Center (InfoCenter) Path (dependent on 3dsMax version being used):
• 3dsMax 2009-2010: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr1.exe
• 3dsMax 2011: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr\lib\WSCommCntr2.exe
• 3dsMax 2012: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr3\lib\WSCommCntr3.exe
• 3dsMax 2013-2015: C:\Program Files\Common Files\Autodesk Shared\WSCommCntr4\lib\WSCommCntr4.exe
It is recommended to always start 3dsMax for the very first time with Administrative rights to ensure the application is
fully initialized correctly. This can also be achieved by right-clicking the 3dsmax.exe application and selecting “Run
as administrator”.
9.2.5 Integrated Submission Script Setup
The following procedures describe how to install the integrated Autodesk 3ds Max submission script. The integrated
submission script allows for submitting 3ds Max render jobs to Deadline directly from within the 3ds Max editing
GUI. The integrated render job submission script and the following installation procedure has been tested with 3ds
Max versions 2010 and later (including Design editions).
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/3dsmax/Installers
Manual Installation of the Submission Script
• Copy [Repository]/submission/3dsmax/Client/Deadline3dsMaxClient.mcr to [3ds Max Install Directory]/MacroScripts. If you don’t have a MacroScripts folder in your 3ds Max install directory, check to
see if you have a UI/Macroscripts folder instead, and copy the Deadline3dsMaxClient.mcr file there if you do.
• Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms.
• Launch 3ds Max, and find the new Deadline menu.
RPManager Script Setup
To install the 3ds Max integrated submission script in RPManager, just follow these steps:
• Create a Deadline7 folder in [maxdir]\scripts\RPManager
• Copy [repo]\submission\3dsmaxRPM\Client\Deadline3dsMaxRPMClient.ms to [maxdir]\scripts\RPManager\Deadline7\Deadline3dsMaxRPMClient.ms
• In RPManager, select Customize -> Preferences to open the preferences window
• In the Network Manager section, choose Custom Submit in the drop-down list, and then choose the Deadline3dsMaxRPMClient.ms script you copied over
• Click OK to close the preferences, and then click on the Network tab to see the submitter
9.2.6 Advanced Features For Technical Directors
MAXScript Interface
When running a MAXScript job through Deadline, there is an interface called DeadlineUtil which you can use to get
information about the job being rendered. The API for the interface between MAXScript and Deadline is as follows:
Functions
• string GetAuxFilename( int index ): Gets the file with the given index that was submitted with the job.
• string GetJobInfoEntry( string key ): Gets a value from the plugin info file that was submitted with the job, and returns an empty string if the key doesn’t exist.
• string GetOutputFilename( int index ): Gets the output file name for the job at the given index.
• string GetSubmitInfoEntry( string key ): Gets a value from the job info file that was submitted with the job, and returns an empty string if the key doesn’t exist.
• int GetSubmitInfoEntryElementCount( string key ): If the job info entry is an array, this gets the number of elements in that array.
• string GetSubmitInfoEntryElement( int index, string key ): If the job info entry is an array, this gets the element at the given index.
• void FailRender( string message ): Fails the render with the given error message.
• void LogMessage( string message ): Logs the message to the slave log.
• void SetProgress( float percent ): Sets the progress of the render in the slave UI.
• void SetTitle( string title ): Sets the render status message in the slave UI.
• void WarnMessage( string message ): Logs a warning message to the slave log.
Properties
• int CurrentFrame: Gets the current frame.
• int CurrentTask: Gets the current task ID.
• string JobsDataFolder: Gets the local folder on the slave where the Deadline job files are copied to.
• string PluginsFolder: Gets the local folder on the slave where the Deadline plugin files are copied to.
• string SceneFileName: Gets the file name of the loaded 3ds Max scene.
• string SceneFilePath: Gets the file path of the loaded 3ds Max scene.
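To illustrate how these pieces fit together, here is a minimal sketch of a MAXScript job body using the DeadlineUtil interface. The DeadlineUtil calls are those documented above; “MyCustomKey” is a made-up plugin info key used purely for demonstration:

```maxscript
-- Illustrative sketch only: a MAXScript job using the DeadlineUtil interface.
-- "MyCustomKey" is a hypothetical plugin info key, not a standard Deadline key.
(
    DeadlineUtil.SetTitle ("Processing task " + (DeadlineUtil.CurrentTask as string))
    DeadlineUtil.LogMessage ("Scene file: " + DeadlineUtil.SceneFileName)

    -- GetJobInfoEntry returns an empty string if the key was not submitted
    local customValue = DeadlineUtil.GetJobInfoEntry "MyCustomKey"
    if customValue == "" then
        DeadlineUtil.WarnMessage "MyCustomKey was not submitted with this job"
    else
        DeadlineUtil.LogMessage ("MyCustomKey = " + customValue)

    DeadlineUtil.SetProgress 50.0  -- report 50% done in the Slave UI
    DeadlineUtil.LogMessage ("Current frame: " + (DeadlineUtil.CurrentFrame as string))
    DeadlineUtil.SetProgress 100.0
)
```

A script like this would run on the slave as part of a MAXScript job, with progress and log lines appearing in the Slave UI and slave log as described in the tables above.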
Submitter’s Sticky Settings and Factory Defaults
The latest version of the Submit Max To Deadline script allows the user to control the stickiness of most User Interface
controls and, in the case of non-sticky settings, the defaults to be used. In previous versions of SMTD, both the
stickiness and the defaults were hard-coded.
Overview
Two INI files located in the Repository in the folder \submission\3dsmax control the stickiness and the defaults:
• SubmitMaxToDeadline_StickySettings.ini - this file can be used to define which controls in the SMTD UI will
be stored locally in an INI file (“sticky”) and which will be reset to defaults after a restart of the Submitter.
• SubmitMaxToDeadline_Defaults.ini - this file can be used to define the default settings of those controls set to
non-sticky in the other file.
In addition, a local copy of the SubmitMaxToDeadline_StickySettings.ini file can be saved in a user’s application
data folder. This file will OVERRIDE the stickiness settings in the Repository and can contain a sub-set of the
definitions in the global file.
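For illustration, a global stickiness file and a local override might look like the following. Note that the section and key names shown here are hypothetical assumptions; the actual names must match the setting names used by SMTD, so check the file shipped in your Repository.

```ini
; SubmitMaxToDeadline_StickySettings.ini in the Repository (global).
; NOTE: section and key names below are hypothetical examples.
[StickySettings]
; not sticky - resets to its default each time the Submitter restarts
SubmitAsSuspended=false
; sticky - the last used value is restored
ChunkSize=true

; A local copy saved in the user's application data folder can override
; a sub-set of these entries, e.g. making SubmitAsSuspended sticky again:
; [StickySettings]
; SubmitAsSuspended=true
```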
Details
When SMTD is initializing, it will perform the following operations:
1. The SMTDSettings Struct will be initialized to the factory defaults of all settings.
2. Each UI setting will be initially assumed to be sticky.
3. The global Stickiness definition file is searched for a key matching the current UI setting’s name.
• If the key is set to “false”, the setting is not sticky.
• If the key is set to anything but “false”, the setting is sticky.
• If the key does not exist, the stickiness defaults to the initial value of true.
4. A local Stickiness definition file is searched for a key matching the current UI setting’s name.
• If the key is set to “false”, the setting is not sticky, overriding whatever was found in the global file.
• If the key is set to anything but “false”, the setting is sticky, overriding whatever was found in the global file.
• If the key does not exist in the local file, the last known value (initial or from the global file) remains in effect.
5. At this point, SMTD knows whether the setting is sticky or not. Now it gets the global default value:
• If a matching key exists in the file SubmitMaxToDeadline_Defaults.ini, the setting is initialized to its value.
• If no matching key exists in the global defaults file, the original factory default defined in the SMTDSettings struct definition remains in effect.
• If the setting is sticky, SMTD loads the last known value from the local INI file. If the value turns out to be invalid or not set, it uses the default instead.
• If the setting is not sticky, the default loaded from the global defaults file (or, if no such default was loaded, the factory default) is assigned to the setting.
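The resolution order described in steps 1 through 5 can be sketched as follows. This is a minimal illustration in Python; SMTD itself implements this logic in MAXScript, and the dictionaries here merely stand in for the INI files:

```python
def resolve_setting(name, factory_default, global_sticky, local_sticky,
                    global_defaults, local_values):
    """Resolve one SMTD UI setting following the precedence described above.

    global_sticky / local_sticky: setting name -> "true"/"false" strings,
    as read from the global and local stickiness INI files.
    global_defaults: defaults from SubmitMaxToDeadline_Defaults.ini.
    local_values: the user's locally saved (sticky) values.
    """
    # Steps 1-2: start from the factory default; assume the setting is sticky.
    sticky = True
    # Step 3: the global stickiness file ("false" disables, anything else keeps it).
    if name in global_sticky:
        sticky = global_sticky[name] != "false"
    # Step 4: the local stickiness file overrides the global one.
    if name in local_sticky:
        sticky = local_sticky[name] != "false"
    # Step 5: pick the default, then prefer the last known value if sticky.
    default = global_defaults.get(name, factory_default)
    if sticky and local_values.get(name) is not None:
        return local_values[name]
    return default
```

For example, a setting marked "false" in the global file ignores the locally saved value and falls back to the global (or factory) default, while a local "true" entry restores stickiness even if the global file disabled it.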
When the User Interface is created, the stickiness info from the local and global files determines whether a star (*) character is added to the control’s name, reflecting the current stickiness settings.
Using this feature, a facility can customize the submitter globally to default to the preferred settings and keep certain settings sticky so their values can be determined by the artists. In addition, single users can override the company-wide stickiness settings using a local file if they feel their workflow requires a different setup.
Custom Job Name Controls
There are two ways to customize the job name. You can use keys in the job name that are replaced with actual values
(like $scene), or you can have the job name be generated from a list of shows, shots, etc. You will then be able to use
the [>>] button to the right of the Job Name field to select these custom job names.
Generate Job Name From Keys
There is a file in the ..\submission\3dsmax\Main\ folder in your Repository called SubmitMaxToDeadline_NameFormats.ini. In addition, a local copy of the SubmitMaxToDeadline_NameFormats.ini file can be saved in a user’s application data folder. The local file will OVERRIDE the name formats in the Repository and can contain a sub-set of the definitions in the global file. These files contain key-value pairs such as:
$scene=(getfilenamefile(maxfilename))
$date=((filterstring (localtime) " ")[1])
$deadlineusername=(SMTDFunctions.GetDeadlineUser())
$username=(sysInfo.username)
$maxversion=(((maxVersion())[1]/1000) as string)
The key to the left of = is the string that will be replaced in the job name. The value to the right of the = is the MAXScript code that is executed to return the replacement string (note that the value must be returned as a string). So if you use $scene in your job name, it will be swapped out for the scene file name. You can append additional key-value pairs or modify the existing ones as you see fit.
By default, the [>>] button will already have $scene or $outputfilename as selectable options. You can then create an
optional JobNames.ini file in the 3dsmax submission folder, with each line representing an option. For example:
$scene
$outputfilename
$scene_$camera_$username
$maxversion_$date
These options will then be available for selection in the submission dialog. This allows for all sorts of customization
with regards to the job name.
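The key replacement behaviour can be illustrated with a short sketch. This is Python for illustration only; in SMTD the replacement values come from evaluating the MAXScript snippets defined in the INI file above:

```python
def expand_job_name(template, values):
    """Replace $keys in a job-name template with their evaluated values.

    In SMTD the values come from running the MAXScript expressions in
    SubmitMaxToDeadline_NameFormats.ini; here they are supplied directly.
    Longer keys are replaced first so that, e.g., a hypothetical
    $scenename key is not clobbered by $scene.
    """
    for key in sorted(values, key=len, reverse=True):
        template = template.replace("$" + key, values[key])
    return template
```

So a JobNames.ini entry like $scene_$username expands to something like "MyShot_jdoe" at submission time.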
Generate Job Name For Shows
This advanced feature allows the addition of custom project, sequence, shot and pass names to the [>>] list to the right of the Job Name field. Producers in larger facilities could provide full shot lists via a central set of files in the Repository, allowing users to pick existing shot names and ensuring consistent naming conventions independent from the 3ds Max scene naming.
To create a new set of files, go to the ..\submission\3dsmax\Main\ folder in your Repository and create the following
files:
Projects.ini - This file describes the projects currently available for Custom Job Naming. Each Project is defined as a
Category inside this file, with two keys: Name and ShortName.
For example:
[SomeProject]
Name=Some Project in 3D
ShortName=SP
[AnotherProject]
Name=Another Project
ShortName=AP
SomeProject.ini - This is a file whose name should match exactly the Category name inside the file Projects.ini and
contains the actual sequence, shot and pass description of the particular project. One file is expected for each project
definition inside the Projects.ini file.
For example:
[SP_SS_010]
Beauty=true
Diffuse=true
Normals=true
ZDepth=true
Utility=true
[SP_SS_150]
Beauty=true
Diffuse=true
Utility=true
[SP_SO_020]
Beauty=true
[SP_SO_030]
Beauty=true
The Submitter will parse this file and try to collect the Sequences by matching the prefix of the shot names, for example
in the above file, it will collect two sequences - SP_SS and SP_SO - and build a list of shots within each sequence,
then also build a list of passes within each shot.
Then, when the [>>] button is pressed, the context menu will contain the name of each project and will provide a
cascade of sub-menus for its sequences, shots and passes.
If you selected the entry SomeProject > SP_SS > SP_SS_150 > Diffuse, the resulting Job Name would be “SP_SS_150_Diffuse”.
You can enter as many projects into your Projects.ini file as you want and provide one INI file for each project
describing all its shots and passes. If an INI file is missing, no data will be displayed for that project.
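The Submitter’s grouping of shots into sequences by name prefix can be sketched as follows. This is illustrative Python only; the two-token prefix rule is an assumption inferred from the SP_SS / SP_SO example above:

```python
from collections import OrderedDict

def group_shots(shot_sections):
    """Group shot section names like 'SP_SS_010' into sequences keyed by
    their project_sequence prefix (the first two underscore-separated
    tokens), preserving the order in which shots appear in the INI file."""
    sequences = OrderedDict()
    for shot in shot_sections:
        prefix = "_".join(shot.split("_")[:2])
        sequences.setdefault(prefix, []).append(shot)
    return sequences
```

Applied to the example file above, this yields the two sequences SP_SS and SP_SO, each with its list of shots; the passes then come from the keys inside each shot’s section.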
Custom Comment Controls
Just like job names, you can use keys in the comment field that are replaced with actual values (like $scene). There is a
file in the ..\submission\3dsmax\Main\ folder in your Repository called SubmitMaxToDeadline_CommentFormats.ini.
In addition, a local copy of the SubmitMaxToDeadline_CommentFormats.ini file can be saved in a user’s application
data folder. This file will OVERRIDE the comment formats in the Repository and can contain a sub-set of the definitions in the global file. This file will contain some key-value pairs such as:
$default=("3ds Max " + SMTDFunctions.getMaxVersion() + " Scene Submission")
$scene=(getfilenamefile(maxfilename))
$date=((filterstring (localtime) " ")[1])
$deadlineusername=(SMTDFunctions.GetDeadlineUser())
$username=(sysInfo.username)
$maxversion=(((maxVersion())[1]/1000) as string)
The key to the left of = is the string that will be replaced in the comment. The value to the right of the = is the MAXScript code that is executed to return the replacement string (note that the value must be returned as a string). So if you use $scene in your comment, it will be swapped out for the scene file name. You can append additional key-value pairs or modify the existing ones as you see fit.
By default, the [>>] button will already have $default. You can then create an optional Comments.ini file in the 3dsmax
submission folder, with each line representing an option. For example:
$default
$scene
$outputfilename
$scene_$camera_$username
$maxversion_$date
These options will then be available for selection in the submission dialog. This allows for all sorts of customization
with regards to the comment field.
Auto-Suggest Category and Priority Mechanism
This feature has been implemented to help Producers suggest categories and priorities based on Shots and Sequence
signatures which are part of the 3ds Max Scene Name.
This feature DOES NOT ENFORCE the Category and Priority for the job, it only suggests a value based on project
guidelines - the Category and Priority can be changed manually after the suggestion.
To use this feature, you have to edit the file called “SubmitMaxToDeadline_CategoryPatterns.ms” located in the Repository in the \submission\3dsmax folder. As a shortcut, you can press the button Edit Patterns... in the Options tab of
the Submitter - the file will open in the built-in MAXScript Editor.
The file defines a global array variable called SMTD_CategoryPatterns which will be used by the Submitter to perform
pattern matching on the Job Name and try to find a corresponding Category and optionally a priority value in the array.
The array can contain one or more sub-arrays, each one representing a separate pattern definition.
Every pattern sub-array consists of four array elements:
• The first element is an array containing zero, one or more string patterns using * wildcards. These strings will
be used to pattern match the Job Name. If it matches, it will be considered for adding to the Category and for
changing the Priority. If the subarray is empty, all jobs will be considered matching the pattern.
• The second element is also an array containing similar pattern strings. These strings will be used to EXCLUDE
jobs matching these patterns from being considered for this Category and Priority. If the subarray is empty, no
exclusion matching will be performed.
• The third element contains the EXACT name (Case Sensitive!) of the category to be set if the Job Name matches
the patterns. If the category specified here does not match any of the categories defined via the Monitor, no action
will be performed.
• The fourth element specifies the Priority to give the job if it matches the patterns. If the value is -1, the existing
priority will NOT be changed.
The pattern array can contain any number of pattern definitions. The higher a definition is on the list, the higher its
priority - if a Job Name matches multiple pattern definitions, only the first one will be used.
The pattern matching will be performed only if the checkbox Auto-Suggest Job Category and Priority in the Options Tab is checked. It will be performed when the dialog first opens or when the Job Name is changed.
An example:
• Let’s assume that a VFX facility is working on a project called “SomeProject” with multiple sequences labelled
“AB”, “CD” and “EF”.
• The network manager has created categories called “SomeProject”, “AB_Sequence”, “CD_Sequence”,
“EF_Sequence” and “High_Priority” via the Monitor.
• The Producers have instructed the Artists to name their 3ds Max files “SP_AB_XXX_YYY_” where SP stands for “SomeProject” and “AB” is the label of the sequence, followed by the scene and shot numbers.
• Now we want to set up the Submitter to suggest the right Categories for all Max files sent to Deadline based on
these naming conventions.
• We want jobs from the CD sequence to be set to Priority of 60 unless they are from the scene with number
“007”.
• We want jobs from the AB sequence to be set to Priority of 50.
• We don’t want to enforce any priority to jobs for sequence EF.
• Also we want shots from the “AB” sequence with scene number “123” and “EF” sequence with scene shot
number “038” to be sent at highest priority and added to the special “High Priority” category for easier filtering
in the Monitor.
• Finally we want to make sure that any SP project files that do not contain a sequence label are added to the
general “SomeProject” category with lower priority.
To implement these rules, we could create the following definitions in the “SubmitMaxToDeadline_CategoryPatterns.ms” - press the Edit Patterns... button in the Options tab to open the file:
SMTD_CategoryPatterns = #(
#(#("*AB_123*","*EF_*_038*"),#(),"High_Priority",100),
#(#("*AB_*"),#(),"AB_Sequence",50),
#(#("*CD_*"),#("*CD_007_*"),"CD_Sequence",60),
#(#("*EF_*"),#(),"EF_Sequence",-1),
#(#("SP_*"),#(),"SomeProject",30)
)
• The first pattern specifies that files from the “AB” sequence, scene “123” and “EF” sequence, shot “038” (regardless of scene number) will be suggested as Category “High_Priority” and set Priority to 100.
• The second pattern specifies all AB jobs to have priority of 50 and be added to Category “AB_Sequence”. Since
the special case of AB_123 has been handled in the previous pattern, this will not apply to it.
• The third pattern sets jobs that contain “CD_” in their name but NOT the signature “CD_007_” to the
“CD_Sequence” Category and sets the Priority to 60.
• The fourth pattern sets jobs that contain “EF_” in their name to the “EF_Sequence” Category but does not
change the priority (-1).
• The fifth pattern specifies that any jobs that have not matched the above rules but still start with the “SP_”
signature should be added to the “SomeProject” Category and set to low priority of 30.
Note that since we used “*” instead of “SP_” at the beginning of the first 4 patterns, the patterns will correctly match the job name even if the job is not named correctly with the project prefix “SP_”.
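The first-match-wins behaviour described above can be modelled with a short sketch, using Python’s fnmatch for the * wildcards. This is an illustration only, not SMTD’s actual MAXScript implementation:

```python
from fnmatch import fnmatch

def suggest_category(job_name, patterns):
    """Return (category, priority) for the first matching pattern definition.

    Each definition is (include_patterns, exclude_patterns, category, priority).
    An empty include list matches every job; a priority of -1 means the
    job's existing priority is left unchanged.
    """
    for includes, excludes, category, priority in patterns:
        included = not includes or any(fnmatch(job_name, p) for p in includes)
        excluded = any(fnmatch(job_name, p) for p in excludes)
        if included and not excluded:
            return category, priority
    return None, -1

# The SMTD_CategoryPatterns example above, transcribed into Python tuples:
PATTERNS = [
    (["*AB_123*", "*EF_*_038*"], [], "High_Priority", 100),
    (["*AB_*"], [], "AB_Sequence", 50),
    (["*CD_*"], ["*CD_007_*"], "CD_Sequence", 60),
    (["*EF_*"], [], "EF_Sequence", -1),
    (["SP_*"], [], "SomeProject", 30),
]
```

Note how a job like "SP_CD_007_010" is excluded from CD_Sequence by the second element of the third definition and falls through to the catch-all SomeProject rule.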
Custom Plugin.ini File Creation
This section covers the Alternate Plugin.ini feature in the 3ds Max Rendering rollout (under the Render tab).
Alternate Plugin.ini File
The plugin.ini list will show a list of alternative plug-in configuration files located in the Repository. By default, there will be no alternative plugin.ini files defined in the repository. The list will show only one entry called [Default], which will cause all slaves to render using their own local plugin.ini configuration and is equivalent to having the Use Custom Plugin.ini File option unchecked.
To define an alternative plugin.ini, copy a local configuration file from one of the slaves to [Repository]\plugins\3dsmax in the repository. Rename the file by adding a description to it, for example plugin_brazil.ini, plugin_vray.ini, plugin_fr.ini, plugin_mentalray.ini, etc. Open the file and edit its content to include the
plug-ins you want and exclude the ones you don’t want to use in the specific case. The next time you launch Submit To Deadline, the list will show all alternative files whose names start with “plugin” and end with “.ini”. The list will be alphabetically sorted, with [Default] always on top. You can then select an alternative plugin.ini file manually from the list.
Pressing the Edit Plugin.ini File button will open the currently selected alternative configuration file in a MAXScript
Editor window for quick browsing and editing, except when [Default] is selected. Pressing the Browse Directory
button will open Windows Explorer, taking you directly to the plug-ins directory containing the alternative plugin.ini
files. Note that if you create a new plugin.ini file, you will have to restart the Submit To Deadline script to update the
list.
Since the alternative plug-in configuration file is located in the Repository and will be used by all slave machines, the
plug-in paths specified inside the alternative plugin.ini will be used as LOCAL paths by each slave. There are two
possible installation configurations that would work with alternative plug-ins (you could mix the two methods, but it’s
not recommended):
• Centralized Plug-ins Repository: In this case, all 3dsmax plug-ins used in the network are located at a centralized location, with all Slaves mapping a drive letter to the central plug-in location and loading the SAME copy
of the plug-in. In this case, the alternative plugin.ini should also specify the common drive letter of the plug-in
repository.
• Local Plug-in: To avoid slow 3dsmax booting in networks with heavy traffic, some studios (including ones we
used to work for) deploy local versions of the plug-ins. Every slave’s 3dsmax installation contains a full set
of all necessary plug-ins (which could potentially be automatically synchronized to a central repository to keep
all machines up-to-date). In this case, the alternative plugin.ini files should use the LOCAL drive letter of the
3dsmax installation, and all Slaves’ 3dsmax copies MUST be installed on the same partition, or at least have the
plug-ins directory on the same drive, for example, “C:”.
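For reference, an alternative plugin.ini uses the standard 3ds Max format, listing plug-in directories under a [Directories] section. A hypothetical plugin_vray.ini for a Centralized Plug-ins Repository might look like this (the entry names and paths are illustrative only):

```ini
[Directories]
Standard MAX plug-ins=C:\Program Files\Autodesk\3ds Max 2015\PlugIns
V-Ray plug-ins=Z:\CentralPlugins\VRay
```

With the centralized approach, Z: would be the drive letter every slave maps to the central plug-in location; with the Local Plug-in approach, the second path would instead point at the same local drive on every slave.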
Auto-Detect Plugin.ini For Current Renderer
When enabled, the following operations will be performed:
1. When you check the checkbox, the current renderer assigned to the scene will be queried.
2. The first 3 characters of the renderer’s name will be compared to a list of known renderers.
3. If the renderer is not on the list, the alternative list will be reset to [Default].
4. If the renderer is the Default Scanline Renderer of 3dsmax, the alternative list will be reset to [Default].
5. If the renderer is a known renderer, the plugin*.ini file that matches its name will be selected.
Supported renderers for auto-suggesting an alternative configuration are:
• Brazil: plugin*.ini should contain “brazil” in its name (e.g. plugin_brazil.ini, plugin-brazil.ini, pluginbrazil_1_2.ini, etc).
• Entropy: plugin*.ini should contain “entropy” in its name (e.g. plugin_entropy.ini, plugin-entropy.ini, pluginentropy.ini, etc).
• finalRender: plugin*.ini should contain “fr” or “final” in its name (e.g. plugin_fr.ini, plugin-finalrender.ini, plugin_finalRender_Stage1.ini, etc).
• MaxMan: plugin*.ini should contain “maxman” in its name (e.g. plugin_maxman.ini, plugin-maxman.ini, pluginmaxman001.ini, etc).
• mentalRay: plugin*.ini should contain “mr” or “mental” in its name (e.g. plugin_mr.ini, plugin-mentalray.ini, plugin_mental33.ini, etc).
• V-Ray: plugin*.ini should contain “vray” in its name (e.g. plugin_vray.ini, plugin-vray.ini, pluginvray109.ini, etc).
Notes:
• In 3dsmax 5 and higher, opening a MAX file while the Auto-Detect option is checked will trigger a callback
which will perform the above check automatically and switch the plugin.ini to match the renderer used by the
scene.
• In 3dsmax 6 and higher, changing the renderer via the “Current Renderers” rollout of the Render dialog will
also trigger the auto-suggesting mechanism.
• You can override the automatic settings anytime by disabling the Auto-Detect option and selecting from the list
manually.
Custom Extra Info Controls
Just like job names and comments, you can use keys in the Extra Info 0-9 fields (under the ‘Integration’ tab in SMTD)
that are replaced with actual values (like $scene). There is a file in the ..\submission\3dsmax\Main\ folder in your
Repository called SubmitMaxToDeadline_ExtraInfoFormats.ini. In addition, a local copy of the SubmitMaxToDeadline_ExtraInfoFormats.ini file can be saved in a user’s application data folder. This file will OVERRIDE the Extra Info formats in the Repository and can contain a sub-set of the definitions in the global file. This file will contain some key-value pairs such as:
$scene=(getfilenamefile(maxfilename))
$date=((filterstring (localtime) " ")[1])
$deadlineusername=(SMTDFunctions.GetDeadlineUser())
$username=(sysInfo.username)
$maxversion=(((maxVersion())[1]/1000) as string)
The key to the left of = is the string that will be replaced in the Extra Info field. The value to the right of the = is the MAXScript code that is executed to return the replacement string (note that the value must be returned as a string). So if you use $scene in an Extra Info field, it will be swapped out for the scene file name. You can append additional key-value pairs or modify the existing ones as you see fit.
NOTE: if you are using Shotgun or FTrack Integration, ExtraInfo0 to ExtraInfo5 will be used automatically and take precedence over any $keys in these particular fields.
As an example, you may wish to use the automatic SMTD ‘BatchName’ functionality to group logical job submissions
together in your Deadline queue, but also use custom Extra Info fields to help track pipeline information such as
Project, Sequence, Shot or Job Number of a particular 3dsMax/Jigsaw/Draft/Quicktime job submission such as:
$project=[execute maxscript code here, returning a string value]
$sequence=123456
$shot=[use maxscript to get shot # from the current render output naming convention]
$jobnumber=[maxscript to query database and get project's job number as a string]
Once this additional ‘pipeline’ information is injected into your Deadline jobs, the Extra Info columns can be given
user friendly names so that they can easily be identified and used to filter and sort jobs in the Monitor. See the Job
Extra Properties section for more information. NOTE: the Extra Info X columns are also injected into the Completed Job Stats, thereby allowing you to store and later analyse/create reports against previous jobs using the data stored in your Extra Info X columns.
9.2.7 FAQ
Which versions of 3ds Max are supported?
3ds Max versions 2010 and later are all supported (including Design editions).
Note: Due to a MAXScript bug in the initial release of 3ds Max 2012, the integrated submission scripts
will not work. However, this bug has been addressed in 3ds Max 2012 Hotfix 1. If you cannot apply this
patch, you must submit your 3ds Max 2012 jobs from the Deadline Monitor.
Which 3ds Max renderers are supported?
Deadline should already be compatible with all 3ds Max renderers, but it has been explicitly tested with
Scanline, MentalRay, Brazil, V-Ray, Corona, finalRender, and Maxwell. If you have successfully used a
3ds Max renderer that is not on this list, please email Deadline Support.
Does Backburner need to be installed to render with Deadline?
Yes. Backburner installs the necessary files that are needed for command line and network rendering, so
it must be installed to render with Deadline.
Does the 3ds Max plugin support Tile Rendering?
Yes. See the Tile Rendering section of the submission dialog documentation for more details.
Does the 3ds Max plugin support multiple arbitrary sized, multi-resolution Tile Rendering for both stills or
animations and automatic re-assembly, including the use of multi-channel image formats and Render Elements
(incl. V-Ray VFB specific image files)?
Yes. We call it ‘Jigsaw’ and it’s unique to the Deadline system! See the Tile Rendering section of the
submission dialog documentation for more details.
Does the 3ds Max plugin support Batch Rendering?
Yes. See the Batch Rendering section of the submission dialog documentation for more details.
Is PSoft’s Pencil+ render effects plugin supported?
Yes. Please note that at least Pencil+ v3.1 is required to resolve an issue with the line render element
failing to render. Note, you will require the correct network render license from PSoft for each Deadline
Slave, or you can render with a Deadline Slave that already has a full workstation license of Pencil+
installed.
When I submit a render with a locked viewport, Deadline sometimes renders a different viewport.
Prior to the release of 3ds Max 2009, the locked viewport feature wasn’t exposed to the 3ds Max SDK,
so it was impossible for Deadline to know whether a viewport is locked or not. Now that the feature has
been exposed, we are working to improve Deadline’s locked viewport support. However, in the 3ds Max
2010 SDK, there is a bug that prevents us from supporting it completely (Autodesk is aware of this bug).
As of 3ds Max 2015, this bug is now resolved. For earlier versions, we can only continue to recommend
that users avoid relying on the locked viewport feature, and instead ensure that the viewport they want to
render is selected before submitting the job. The SMTD sanity check continues to provide a warning for
those versions of 3ds Max, where the locked viewport SDK bug still exists.
When Deadline is running as a service, 3ds Max 2015 render jobs crash during startup.
This can happen if the new Scene (Content) Explorer is docked.
This is a known issue with 3ds Max network rendering when it is launched by a program running as a
service. See this AREA blog post about running 3ds Max 2015 as a service for a workaround and more
information.
Can I mix 3ds Max and 3ds Max Design jobs in Deadline?
Yes. ADSK have introduced (April 2014) a new system environment variable you can set which will
make all jobs from 3ds Max and 3ds Max Design appear as 3ds Max jobs: set “MIX_MAX_DESIGN_BB”
to “1” to enable this feature. Note, Windows typically requires a machine restart or log-off/log-on for
the new environment setting value to become available once set. ADSK have confirmed this works for
3ds Max 2015, 3ds Max Design 2015 with Backburner 2015.0.1. It may also work with 2014 SP5 version
of 3ds Max and 3ds Max Design, with Backburner 2015.0.1. See this AREA blog post about mixing 3ds
Max and 3ds Max design on a render farm for more information. Note, Backburner Manager or Server
are NOT required to be running to make this system work in Deadline, although Backburner software still
needs to be installed on your rendernodes.
When I submit a render job that uses more than one default light, only one default light gets rendered.
The workaround for this problem is to add the default lights to the scene before submitting the job. This
can be done from within 3ds Max by selecting Create Menu -> Lights -> Standard Lights -> Add Default
Lights to Scene.
Is it possible to submit MAXScripts to Deadline instead of just a *.max scene?
Yes. Deadline supports MAXScript jobs from the Scripts tab in the submission dialog.
Does Deadline’s custom interface for rendering with 3ds Max use workstation licenses?
No. Deadline’s custom interface for rendering with 3ds Max does not use any workstation licenses when
running on slaves. However, if you have the Force Workstation Mode option checked in the submission
dialog, a workstation license will be used.
Slaves are rendering their first frame/tile correctly, but subsequent frames and render elements have problems
or are rendered black.
Try enabling the option to “Restart Renderer Between Frames” in the submission dialog before submission, or in the job properties dialog after submission. We have found that this works 99% of the time in
these cases. When enabled, the C++ Lightning plugin (unique to Deadline) will unload the renderer plugins and then reload them instantly. This has the effect of forcing a memory purge and helps to improve
renderer stability, as well as ensure the lowest possible memory footprint. This can be helpful when
rendering close to the physical memory limit of a machine. See the note below for when this feature should
be disabled.
V-Ray Light-Cache / Irradiance Maps are not the correct file size or seem to be getting reset between incremental frames on Deadline but calculate correctly when executed locally.
Ensure the option “Restart Renderer Between Frames” is DISABLED if you are sending FG/LC/IM
caching map type jobs to the farm. Otherwise the renderer will get reset for each frame, and the FG/LC/IM
file(s) won’t get incrementally appended with the additional data per frame; they will only contain the
data from the last frame calculated (the resulting file size will be too small as well).
3dsMax Point Cache Files dropping geometry in renders randomly
Sometimes 3dsMax can drop point cache geometry in renders, in a seemingly random fashion that affects
only certain rigs. Typically, but not exclusively, this happens on the 2nd assigned frame processed by a
particular slave. Ensure the option “Restart Renderer Between Frames” is DISABLED in the submission
dialog before submission, or in the job properties dialog after submission. We have found that this works
99% of the time in these cases.
When rendering with V-Ray/Brazil, it appears as if some maps are not being displayed properly.
Try enabling the option to “Restart Renderer Between Frames” in the submission dialog before submission, or in the job properties dialog after submission. We have found that this works 99% of the time in
these cases.
Tile rendering with a Mental Ray camera shader known as “wraparound” results in an incorrect final image.
How can I fix this?
This is another situation where enabling the option to “Restart Renderer Between Frames” in the submission dialog seems to fix the problem.
When tile rendering with a renderer that supports global/secondary illumination, I get bucket stamps (different
lighting conditions in each tile) on the final image.
Try calculating the irradiance/final gather light caching map first in one pass at full resolution. Then
perform your tile render on a scene that reads the irradiance/final gather map created at full resolution. If
creating the map at full resolution is impossible then you can make it in the tile, but you need to make
sure the tiles are overlapping each other (use Deadline’s tile/jigsaw padding to help here) and make sure to
use the irradiance/final gather map method that appends to the map file. Alternatively, you could consider
using the VRay/Mental Ray DBR off-load system to accelerate the calculation of the light caching map.
In summary: you create (pre-calculate) the secondary/global illumination map first then run the final
render in tiles as a second job. Deadline job dependencies can be used here to release the second job as
the first job successfully completes the lighting pre-calculation job.
Can I perform Distributed Bucket Rendering (DBR) with V-Ray or V-Ray RT?
Yes. A special ‘reserve’ job is submitted that will run the V-Ray Spawner/V-Ray standalone process on
the render nodes. Once the V-Ray Spawner/V-Ray standalone process is running, these nodes will be able
to participate in distributed rendering. Please see the VRay Distributed Rendering (DBR) Plug-in Guide
for more information.
Can I fully off-load 3dsMax V-Ray or Mental Ray DBR rendering from my machine?
Yes, see the VRay/Mental Ray DBR section for more information. The advantages to off-loading a V-Ray DBR job fully from your workstation include: releasing your local workstation to carry out other
processing tasks, and helping to accelerate the irradiance map/photon cache calculation process, as the
V-Ray DBR system supports distributing this across multiple machines. A risk/disadvantage to this way
of working is that if a single machine currently being used to calculate a DBR bucket crashes/fails for an
unknown reason, the whole process will fail at its current stage and start from the beginning again.
Can I Perform Fume FX Simulations With Deadline?
Yes. To do so, follow these steps:
1. Your render nodes need to have Fume FX licensed properly, either with “full” or “simulation”
licenses. This requirement is the same as if you were rendering with Backburner.
2. Before you launch the 3dsmax submission script, make sure that the Fume FX NetRender toggle
button is “ON” in the Fume FX options in 3dsmax.
3. Before you submit the job, make sure the “Disable Progress Update Timeout” option is enabled
under the Render options in the 3dsmax submission window.
4. Note that Fume FX uses its own frame range (in the Fume FX settings/prefs), so submit the
Max scene file to Deadline as a single frame/task.
Can I force a render to use a specific language?
Yes, using the option located in the “User Options” tab of SMTD, or in the “Advanced Options” tab
of the Monitor submission (2013+ only). This changes the default language on the machine the job is
rendered on to the chosen language. Note that the change persists on that machine until 3ds Max is
restarted with a different language forced. You can manually force the language back via the
language-specific shortcuts in the Start menu, which effectively start 3ds Max with the language flag.
In this example, EN-US (default) is forced: “C:/Program Files/Autodesk/3ds Max 2015/3dsmax.exe”
/Language=ENU
When submitting to Deadline, non-ASCII characters in output paths, camera names, etc, are not being sent to
Deadline properly.
You need to enable the “Save strings in legacy non-scene files using UTF8” property in the Preference Settings in 3ds Max. After enabling this, the Deadline submission files will be saved as UTF8 and therefore
non-ASCII characters will be saved properly. See the Character Encoding Defaults in 3ds Max section in
the 3ds Max Character Encoding documentation for more information.
Why do 3ds Max jobs add a period delimiter to the output filename?
9.2. 3ds Max
Deadline User Manual, Release 7.1.0.35
Deadline 7 introduced a new Delimiter option in the integrated 3ds Max submitter (SMTD) to avoid some
problems with the way render elements and other auto-generated names were formatted in previous versions. The Delimiter option is set to a factory default of “.” as this is the typical convention in VFX
pipelines, but it can be overridden via the Defaults INI file in the Repository. Since this setting is considered a company-wide pipeline value that should not be overridden by individual users, it is currently not
exposed in the SMTD UI.
To change the Delimiter to an empty string, you can do the following:
1. Navigate to your Repository folder
2. Go to ...\submission\3dsmax\Main\
3. Locate the SubmitMaxToDeadline_Defaults.ini file and open it in a text editor
4. Add the following to the [RenderingOptions] category:
[RenderingOptions]
Delimiter=
5. Make sure there is nothing after the = sign!
6. Save the file
7. Restart SMTD on your workstation
8. RESULT: At this point, SMTD should behave like it did in Deadline 6.x and earlier.
Note that in some cases render element passes might be misformatted due to the lack of a delimiter;
this was a known issue in Deadline 6.x and earlier. For example, if a V-Ray pass was named automatically
based on a TextureMap name ending with digits, the resulting file name could end up having too many
trailing digits, e.g. SomeMap_420000.exr instead of SomeMap_42.0000.exr. So in the Deadline Monitor, the filename could become SomeMap_######.exr instead of SomeMap_42.####.exr. If you want
to replace the “.” period character with a different character to fit your pipeline requirements (e.g. _
underscore), you can add the character to the INI file:
[RenderingOptions]
Delimiter=_
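To illustrate the effect the Delimiter value has on auto-generated names, here is a small hypothetical sketch. The compose_element_filename helper is not part of SMTD; it simply mimics the naming pattern described above:

```python
# Hypothetical helper mimicking how a delimiter is inserted between the
# base output name, the render element name, and the padded frame number.
def compose_element_filename(base, element, frame, delimiter="."):
    return "%s%s%s%s%04d.exr" % (base, delimiter, element, delimiter, frame)

print(compose_element_filename("SomeMap_42", "VRayReflection", 0))
# With the factory default "." delimiter: SomeMap_42.VRayReflection.0000.exr
print(compose_element_filename("SomeMap_42", "VRayReflection", 0, delimiter=""))
# With an empty delimiter the digits run together: SomeMap_42VRayReflection0000.exr
```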
In summary, you can use the new Delimiter option to provide a consistent file naming convention across
your studio pipeline. A few caveats: the file naming convention for Thinkbox’s tile, region and Jigsaw
rendering remains unchanged, and V-Ray v3 has introduced a MAXScript property #fileName_addDot which can
be accessed via
renderers.current.fileName_addDot
and which is True by default, so it will also try to add a dot character to its filenames if one is not present.
9.2.8 Error Messages and Meanings
This is a collection of known 3ds Max error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Note that when an error occurs in a Max render, we parse the Max render log (Max.log) for any messages that might
explain the problem and include them in the error message. Some examples are:
• ERR: An unexpected exception has occurred in the network renderer and it is terminating.
Chapter 9. Application Plugins
• ERR: Missing dll: BrMaxPluginMgr.dlu
• ERR: [V-Ray] UNHANDLED EXCEPTION: Preparing ray server Last marker is at .srcvrayrenderer.cpp
3dsmax startup: Error getting connection from 3dsmax: 3dsmax startup: Deadline/3dsmax startup error:
lightningMax*.dlx does not appear to have loaded on 3dsmax startup, check that it is the right version and
installed to the right place.
You likely need to install the appropriate Visual C++ Redistributable package; this installation is normally carried out automatically by the Deadline Client installer. Try re-installing the Client software if you see this error.
3dsmax startup: Error getting connection from 3dsmax: Monitored managed process “3dsmaxProcess” has
exited or been terminated.
Full error message:
3dsmax startup: Error getting connection from 3dsmax: Monitored managed...
2012/08/24 14:48:40 DBG: Starting network
2012/08/24 14:48:40 DBG: Calling NetRenderPreLoad
2012/08/24 14:48:40 DBG: in NetWorkerPreLoad. jobFile: ; jobname: C:\Users\...
2012/08/24 14:48:40 DBG: in NetWorkerPreLoad. LoadLib() failed
2012/08/24 14:48:40 DBG: NetRenderPreLoad failed
2012/08/24 14:48:40 ERR: Error loading *.max file
2012/08/24 14:49:10 INF: SYSTEM: Production renderer is changed to Default...
2012/08/24 14:49:10 DBG: Stop network
This is a known issue with 3ds Max, and can occur when IPv6 is enabled on the render node. The issue
can be fixed by disabling IPv6 on the machines, or by disabling the IPv6 to IPv4 tunnel. See this Area blog
post about IPv6 errors for more information.
Could not delete old lightning.dlx... This file may be locked by a copy of 3ds max
Usually this is because a 3dsmax.exe process didn’t quit or wasn’t killed properly. The solution is to look
in Task Manager on the slaves reporting the message for a 3dsmax.exe process and kill it.
3dsmax crashed in GetCoreInterFace()->LoadFromFile()
There are a number of things that can be tried to diagnose the issue:
• Try opening the file on a machine where it crashed. You may already have done this.
• Try rendering a frame of it on a machine where it crashed, using the 3dsmaxcmd.exe renderer.
This will make it open the file in slave mode and possibly give an idea of what’s failing.
• Submit the job to run in workstation mode. In workstation mode there’s often more diagnostic
output. There’s a checkbox in the submission script for this.
• If you’re comfortable sending us the .max file which is crashing, we’d be happy to diagnose the
issue here.
• Try stripping down the max file by deleting objects and seeing if it still crashes then.
Trapped SEH Exception in CurRendererRenderFrame(): Access Violation
An Access Violation means that when rendering the frame, Max either ran out of memory, or memory
became corrupted. The stack trace in the error message usually shows which plugin the error occurred in.
If that doesn’t help track down the issue, try stripping down the max file by deleting objects and seeing if
the error still occurs.
3dsmax: Trapped SEH Exception in LoadFromFile(): Access Violation
An Access Violation means that when loading the scene, Max either ran out of memory, or memory
became corrupted. The stack trace in the error message usually shows which plugin the error occurred in.
If that doesn’t help track down the issue, try stripping down the max file by deleting objects and seeing if
the error still occurs.
3dsmax: PNG Plugin: PNG Library Internal Error
3ds Max Render Elements can become corrupt or be placed in a bad state with regard to the image file format
plugin being used to save each Render Element to your file server. This issue is not limited to
the PNG file format (TGA and TIF can also be affected) but is common. One option, which has been known to fix the issue
in most circumstances, is to rebuild the render elements by deleting and re-creating them in the 3ds Max
scene file. This is automated in SMTD if you enable the checkbox “Rebuild Render Elements”
under the “Render” tab -> “3ds Max Pathing Options”.
RenderTask: 3dsmax exited unexpectedly (it may have crashed, or someone may have terminated)
This generic error message means that max crashed and exited before the actual error could be propagated
up to Deadline. Often when you see this error, it helps to look through the rest of the error reports for that
job to see if they contain any information that’s more specific.
RenderTask: 3dsmax may have crashed (recv: socket error trying to receive data: WSAError code 10054)
This generic error message means that max crashed and exited before the actual error could be propagated
up to Deadline. Often when you see this error, it helps to look through the rest of the error reports for that
job to see if they contain any information that’s more specific.
3dsmax startup: Error getting connection from 3dsmax: 3dsmax startup: Deadline/3dsmax startup error:
lightningMax*.dlx does not appear to have loaded on 3dsmax startup, check that it is the right version and
installed to the right place.
This error is likely the side effect of another error, but the original error wasn’t propagated to Deadline
properly. Often when you see this error, it helps to look through the rest of the error reports for that job to
see if they contain any information that’s more specific.
3dsmax startup: Max exited unexpectedly. Check that 1) max starts up with no dialog messages and in the
case of 3dsmax 6, 2) 3dsmaxcmd.exe produces the message ‘Error opening scene file: “”’ when run with no
command line arguments
This message is often the result of an issue with the way Max starts up. Try starting 3ds Max on the slave
machine that produced the error to see if it starts up properly. Also try running 3dsmaxcmd.exe from the
command line prompt to see if it produces the message ‘Error opening scene file: “”’ when run with no
command line arguments. If it doesn’t produce this message, there may be a problem with the Max install
or with how it is configured. Sometimes reinstalling Max is the best solution.
The 3dsmax command line renderer, ...\3dsmaxcmd.exe, hung during the verification of the 3ds max install
Try running 3dsmaxcmd.exe from the command line prompt to see if it pops up an error dialog or crashes,
which is often the cause of this error message. If this is the case, there may be a problem with the Max
install or with how it is configured. Sometimes reinstalling Max is the best solution.
3dsmax: Failed to load max file: ”...”
There could be many reasons why Max would fail to load the scene file. Check for ERR or WRN messages
included in the error message for information that might explain the problem. Often, this error is the result
of a missing plugin or dll.
Error: “3ds Max The Assembly Autodesk.Max.Wrappers.dll encountered an error while loading”
This is a specific 3ds Max 2015 crash when you try to launch the program. Ensure you perform a Windows
update and get latest updates for Windows 7 or 8. Additionally, install the update for Autodesk 3ds Max
2015 Service Pack 1 and Security Fix. See this ADSK Knowledge post for more information.
Error message: 3dsmax adapter error : Autodesk 3dsMax 17.2 reported error: Could not find the specified file
in DefaultSettingsParser::parse() ; Could not find the specified file in DefaultSettingsParser::parse() ;
The error “Could not find the specified file in DefaultSettingsParser::parse() ;” occurs if you don’t
have the Populate Data installed on each of your Deadline Slave machines. To resolve the issue, ensure
that the Populate Data is installed on all the render machines. You can run the 3dsMax_2015_PopulateData.msi
installer from the “\x64\PDATA\” folder of the 3ds Max 2015 installer. If there was a previous install of
the Populate Data on the machine, please delete the folder
“C:\Program Files\Common Files\Autodesk Shared\PeoplePower\2.0\” before installing. See this Area
blog post for more information.
Error message: ERR: “To use this feature, you need the Evolver data. Please check the Autodesk web site for
more information.”
You may get the above error message when you try to run a Populate simulation in your 3ds Max scene file.
This is a known Autodesk bug, and the fix is to install the Autodesk 3ds Max 2014 64-bit Populate Data
component. The actual file is 3dsMax_2014_PopulateData.msi, which you can find in the “\x64\PDATA\”
folder of the install media. Note that if you’re running 3ds Max Design, the filename will be 3dsMaxDesign_2014_PopulateData.msi. Similarly, the same bug in 3ds Max 2015 doesn’t mention Evolver anymore;
instead, it tells you to install the Populate data. See this Fixing missing Evolver data errors Area blog post
for more information.
Error message: “ERROR: Please, make sure the Populate data is installed.”
This is the same error message as the previous Populate FAQ entry and is fixed by installing the Autodesk Populate Data component. See this Fixing missing Evolver data errors Area blog post for more
information.
Unexpected exception (Error in bm->OpenOutput(): error code 12)
Ensure all instances of 3ds Max are running a consistent LANGUAGE. By default, 3ds Max ships with the
LANGUAGE code set to “ENU” (US English), and this is recommended for the majority of customers.
If you are using a 3rd party plugin in 3ds Max, please contact the plugin developer to verify that their
plugin is capable of running under a different language inside of 3ds Max. Note that the majority of 3rd party
plugins are still only developed to work in “ENU”. Please see this FAQ for more information regarding
options to control the LANGUAGE: 3dsMax Language Code FAQ.
Exception: Failed to render the frame.
There could be many reasons why Max would fail to render the frame. Check for ERR or WRN messages
included in the error message for information that might explain the problem.
DBG: in Init. nrGetIface() failed
This error message is often an indication that 3ds Max or Backburner is out of date on the machine. Updating both to the latest service packs should fix the problem.
ERROR: ImageMagick: Invalid bit depth for RGB image ‘[path to tile/region render output image]’
This error is due to the old Tile Assembler executable not supporting certain bit depth images, such as
V-Ray’s render elements “Reflection”, “Refraction” and “Alpha” when saved from the V-Ray Frame Buffer (VFB).
Please note that the Tile Assembler plugin is EOL (End-Of-Life/deprecated). Please use the newer Draft
Tile Assembler plugin (the “Use Draft for Assembly” checkbox option in SMTD) when rendering using the
older tile system to ensure all image types/bit depths are correctly assembled. Draft Tile Assembler jobs
can also be submitted independently if you already have the *.config file(s); this is explained further in the
Draft Tile Assembler documentation.
Error when using Mental Ray DBR in 3ds Max 2016: Could not locate MDL shared core library.
When you try to use DBR (Distributed Bucket Rendering) you will get the following error message:
Could not locate MDL shared core library.
To help Mental Ray satellite find this .dll, copy libmdl.dll from the main 3ds Max 2016 folder to the
NVIDIA/Satellite folder. Note that you have to do this on all the machines that will be used for DBR. See
this Error when using Mental Ray DBR in 3ds Max 2016 Area blog post for more information.
9.3 After Effects
9.3.1 Job Submission
You can submit jobs from within After Effects by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within After Effects, select File -> Run Script -> DeadlineAfterEffectsClient.jsx.
Project Configuration
In After Effects, place the comps you want to render in the Render Queue (CTRL+ALT+0). Due to an issue with the
Render Queue, if you have more than one comp with the same name, only the settings from the first one will be used
(whether they are checked or not). It is important that all comps in the Render Queue have unique names, and our
submission script will notify you if they do not. Each comp that is in the Render Queue and that has a check mark
next to it will be submitted as a separate job to Deadline.
Note that under the comp’s Output Module settings, the Use Comp Frame Number check box must be checked. If this
is not done, every frame in the submitted comp will try to write to the same file.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. Note that the Draft/Integration options are only available in
After Effects CS4 and later.
The After Effects specific options are:
• Use Comp Name As Job Name: If enabled, the job’s name will be the Comp name.
• Use Frame List From The Comp: Check this option to use the frame range defined for the comp.
• Comps Are Dependent On Previous Comps: If enabled, the job for each comp in the render queue will be
dependent on the job for the comp ahead of it. This is useful if a comp in the render queue uses footage rendered
by a comp ahead of it.
• Render The First And Last Frames Of The Comp First: Enable this option to render the first and last frames first,
followed by the remaining frames in the comp’s frame list. Note that this ignores the Frame List setting in
the submission dialog.
• Submit The Entire Render Queue As One Job With A Single Task: Use this option when the entire render
queue needs to be rendered all at once because some queue items are dependent on others or use proxies. Note
though that only one machine will be able to work on this job.
• Multi-Process Rendering: Enable multi-process rendering.
• Submit Project File With Job: If enabled, the After Effects Project File will be submitted with the job.
• Ignore Missing Layer Dependencies: If enabled, Deadline will ignore errors due to missing layer dependencies.
• Fail On Warning Messages: If enabled, Deadline will fail the job whenever After Effects prints out a warning
message.
• Export XML Project File: Enable to export the project file as an XML file for Deadline to render (After Effects
CS4 and later). The original project file will be restored after submission. If the current project file is already an
XML file, this will do nothing.
• Ignore Missing Effects References: If enabled, Deadline will ignore errors due to missing effect references.
• Continue On Missing Footage: If enabled, rendering will not stop when missing footage is detected.
• Enable Local Rendering: If enabled, Deadline will render the frames locally before copying them over to the
final network location.
• Override Fail On Existing AE Process: If enabled, the global repository setting “Fail on Existing AE Process”
will be overridden.
• Fail on Existing AE Process: If enabled, the job will be failed if any After Effects instances are currently
running on the slave. Existing After Effects instances can sometimes cause 3rd party AE plugins to malfunction
during network rendering.
The following After Effects specific options are only available in After Effects CS4 and later:
• Multi-Machine Rendering: This mode submits a special job where each task represents the full frame range.
The slaves will all work on the same frame range, but if “Skip existing frames” is enabled for the comps, they
will skip frames that other slaves are already rendering.
– This mode requires “Skip existing frames” to be enabled for each comp in the Render Queue.
– Set the number of tasks to be the number of slaves you want working simultaneously on the render.
– This mode ignores the Frame List, Machine Limit, and Frames Per Task settings.
– This mode does not support Local Rendering or Output File Checking.
• Minimum Output File Size: If an output image’s file size is less than what’s specified, the task is requeued
(specify 0 for no limit).
• Enable Memory Management: Whether or not to use the memory management options.
• Image Cache %: The maximum amount of memory After Effects will use to cache frames.
• Max Memory %: The maximum amount of memory After Effects can use overall.
Layer Submission
In addition to normal job submission, you also have the option to submit layers in your After Effects project as separate
jobs. To do so, first select the layers you want to submit. Then run the submission script, set the submission options
mentioned above as usual, and press the Submit Selected Layers button. This will bring up the layers window.
The layer submission options are:
• Render With Unselected Layers: Specify the unselected layers that will render with each of the selected layers.
• Layer Name Parsing: Allows you to specify how the layer names should be formatted. You can
then grab parts of the formatting and place them in either the output name or the subfolder format box
using square brackets. For example, if you’re naming your layers something like “ops024_a_diff”,
you could put “<graphic>_<layer>_<pass>” in this box. Then in the subfolder box, you could put
“[graphic]\[layer]\v001\[pass]”, which would give you “ops024\a\v001\diff” as the subfolder structure.
• Render Settings: Which render settings to use.
• Output Module: Which output module to use.
• Output Format: How the output file name should be formatted.
• Output Folder: Where the output files should be rendered to.
• Use Subfolders: Enable this to render each layer to its own subfolder. If this is enabled, you must also specify
the subfolder format.
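The bracket substitution described above can be sketched in Python. This is only an illustration of the parsing behavior (the actual submitter implements it in ExtendScript, and the function names here are hypothetical):

```python
# Illustrative sketch of layer-name parsing: split a layer name using a
# template like "<graphic>_<layer>_<pass>", then substitute the captured
# parts into a subfolder format such as "[graphic]\[layer]\v001\[pass]".
import re

def parse_layer_name(layer_name, template):
    # Turn "<graphic>_<layer>_<pass>" into a regex with named groups.
    pattern = re.sub(r"<(\w+)>", r"(?P<\1>[^_]+)", template)
    match = re.match(pattern + r"$", layer_name)
    return match.groupdict() if match else {}

def format_subfolder(fmt, parts):
    # Replace each "[key]" token with its captured value.
    for key, value in parts.items():
        fmt = fmt.replace("[%s]" % key, value)
    return fmt

parts = parse_layer_name("ops024_a_diff", "<graphic>_<layer>_<pass>")
print(format_subfolder(r"[graphic]\[layer]\v001\[pass]", parts))
# -> ops024\a\v001\diff
```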
9.3.2 Cross-Platform Rendering Considerations
In order to perform cross-platform rendering with After Effects, you must set up Mapped Paths so that Deadline can
swap out the Project and Output file paths where appropriate. You can access the Mapped Paths Setup in the Monitor
while in super user mode by selecting Tools -> Configure Repository. You’ll find the Mapped Paths Setup in the list
on the left.
You then have two options for how to set up your After Effects project file. The traditional way is to ensure that your
After Effects project file is on a network shared location, and that any footage or assets that the project uses are in the
same folder or in sub-folders. Then when you submit the job, you must make sure that the option to submit the project
file with the job is disabled. If you leave it enabled, the project file will be copied to and loaded from the Slave’s local
machine, and thus won’t be able to find the footage.
You also have the option to save your After Effects project as an AEPX file, which is just an XML file. Deadline will
automatically detect that an AEPX file has been submitted, and will swap out paths within the file itself (because it is
just plain text). This way, you don’t have to worry about setting up the project structure described in the first option.
Note though that all the asset paths still need to be network accessible.
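Since an AEPX file is plain XML text, the path swap amounts to a simple textual substitution. Here is a minimal sketch of the idea; the mapping pair and the XML element shown are simplified stand-ins, not Deadline’s actual configured values or the exact AEPX schema:

```python
# Minimal sketch of AEPX path mapping: because the project file is plain
# text, cross-platform path swapping can be done with string substitution.
# The mappings below are example values; real ones come from the Mapped
# Paths configured in the Repository.
def map_paths(aepx_text, mappings):
    for source_prefix, target_prefix in mappings:
        aepx_text = aepx_text.replace(source_prefix, target_prefix)
    return aepx_text

mappings = [("Z:/projects/", "/mnt/projects/")]
snippet = '<fileReference fullpath="Z:/projects/shot010/footage.mov"/>'
print(map_paths(snippet, mappings))
# -> <fileReference fullpath="/mnt/projects/shot010/footage.mov"/>
```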
9.3.3 Plug-in Configuration
You can configure the After Effects plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the After Effects plug-in from the list on the left.
Render Executables
• After Effects Executable: The path to the After Effects aerender executable file used for rendering. Enter
alternative paths on separate lines. Different executable paths can be configured for each version installed on
your render nodes.
Render Options
• Fail On Existing After Effects Process: Prevent Deadline from rendering when After Effects is already open.
• Force Rendering In English: You can configure the After Effects plug-in to force After Effects to render in
English. This is useful if you are rendering with a non-English version of After Effects, because it ensures that
Deadline’s progress gathering and error checking function properly (since they are currently based on English
output from the After Effects renderer).
Font Folder Synchronization
The new FontSync event plugin that ships with Deadline v7.1 can be used to synchronize fonts on Mac OS X and
Windows before the Slave application starts rendering any job, or when the Slave first starts up. This general
Python-based FontSync event plugin replaces the font synchronization options previously found here in the After
Effects plugin and now works for ALL plugin types in Deadline. The FontSync event plugin is located at
<Repository>/events/FontSync.
Path Mapping For aepx Project Files (For Mixed Farms)
• Enable Path Mapping For aepx Files: If enabled, a temporary aepx file will be created locally on the slave for
rendering and Deadline will do path mapping directly in the aepx file.
9.3.4 Integrated Submission Script Setup
The following procedures describe how to install the integrated After Effects submission script. This script allows for
submitting After Effects render jobs to Deadline directly from within the After Effects editing GUI. The script and the
following installation procedure have been tested with After Effects CS3 and later.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/AfterEffects/Installers
Manual Installation of the Submission Script
• Copy [Repository]\submission\AfterEffects\Client\DeadlineAfterEffectsClient.jsx to [After Effects Install Directory]\Support Files\Scripts
• After starting up After Effects, make sure that under Edit -> Preferences -> General, the Allow Scripts to Write
Files and Access Network option is enabled. This is necessary so that the submission script can create the
necessary files to submit to Deadline.
Custom Sanity Check
A CustomSanityChecks.jsx file can be created alongside the main SubmitAEToDeadline.jsx submission script (in
[Repository]\submission\AfterEffects\Main), and will be evaluated if it exists. This script will let you set any of the
initial properties in the submission script prior to displaying the submission window. You can also use it to run your
own checks and display errors or warnings to the user. Here is a very simple example of what this script could look
like:
{
    initDepartment = "The Best Department";
    initPriority = 33;
    initConcurrentTasks = 2;

    alert( "You are in a custom sanity check!" );
}
9.3.5 FAQ
Which versions of After Effects are supported?
After Effects CS3 and later are supported.
Why is there no Advanced tab in the integrated submission script for After Effects CS3?
Tabs are only supported in CS4 and later, so the Advanced tab and its options are not available in CS3 and
earlier.
Does network rendering with After Effects require a full After Effects license?
In After Effects CS5.0 and earlier, a license is not required. In After Effects CS5.5, a full license is
required. In After Effects CS6.0 and later, a license isn’t required if you enable “non-royalty-bearing”
mode.
Rendering through Deadline seems to take longer than rendering through After Effects locally.
After Effects needs to be restarted at the beginning of each frame, and this loading time results in the
render taking longer than expected. If you know ahead of time that your frames will render quickly, it is
recommended to submit your frames in groups of 5 or 10. This way, After Effects will only load at the
beginning of each group of frames, instead of at the beginning of every frame.
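The grouping advice above can be sketched as follows: with 5 frames per task, a 20-frame comp becomes 4 tasks, so After Effects only loads 4 times instead of 20. This is a simple illustration, not Deadline’s internal chunking code:

```python
# Illustration of grouping frames into tasks so After Effects starts up
# once per group instead of once per frame.
def chunk_frames(first, last, frames_per_task):
    tasks = []
    frame = first
    while frame <= last:
        end = min(frame + frames_per_task - 1, last)
        tasks.append((frame, end))
        frame = end + 1
    return tasks

print(chunk_frames(1, 20, 5))
# -> [(1, 5), (6, 10), (11, 15), (16, 20)]
```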
When rendering a job, only the images from the first task are saved, and subsequent tasks just seem to overwrite
those initial image files.
In the comp’s Output Module Settings, make sure that the “Use Comp Frame Number” checkbox is
checked. Check out step 1 here for complete details.
I get the error that the specified comp cannot be found when rendering, but it is in the render queue.
This can occur for a number of reasons, most of which are related to the name of the comp. Examples are names with two spaces next to each other, or names with apostrophes in them. Try using only
alphanumeric characters and underscores in comp names and output paths to see if that resolves the issue.
Why do the comps in the After Effects Render Queue require unique names?
Due to an issue with the Render Queue, if you have more than one comp with the same name, only the
settings from the first one will be used (whether they are checked or not). It is important that all comps in
the Render Queue have unique names, and our submission script will notify you if they do not.
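The uniqueness check the submitter performs can be sketched as a simple duplicate scan. This is illustrative only (the actual submission script performs the check in ExtendScript):

```python
# Illustrative duplicate-name scan like the one the submission script
# performs on the Render Queue before submitting.
from collections import Counter

def duplicate_comp_names(comp_names):
    counts = Counter(comp_names)
    return sorted(name for name, n in counts.items() if n > 1)

queue = ["shot010_final", "shot020_final", "shot010_final"]
print(duplicate_comp_names(queue))
# -> ['shot010_final']
```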
Understanding the different After Effects command line flags.
Adobe has a web page, Automated Rendering, which explains the different network render command
line options and how they work. Deadline currently supports as many of these options as possible.
How can I optimize After Effects for high performance?
Adobe provides an excellent web page, Memory and Storage, documenting different areas of After Effects
and what users can do to improve performance, particularly in the areas of disk storage/caching
and RAM.
9.3.6 Error Messages and Meanings
This is a collection of known After Effects error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
What does this After Effects error code mean?
A great resource describing a very large number of After Effects error codes, their meanings and possible
solutions can be found on the Mylenium Errors website. If this site helps you, please do consider donating
to keep the site going!
Exception during render: Renderer returned non-zero error code, -1073741819
The error code -1073741819 is equivalent to 0xC0000005, which represents a Memory Access Violation
error. So After Effects is either running out of memory, or memory has become corrupt. If you find
that your frames are still being rendered, you can modify the After Effects plugin to ignore this error.
Just add the following function to the AfterEffectsPlugin class in AfterEffects.py, which can be found in
[Repository]/plugins/AfterEffects.
def CheckExitCode( self, exitCode ):
    if exitCode != 0:
        if exitCode == -1073741819:
            LogInfo( "Ignoring exit code -1073741819" )
        else:
            FailRender( "Renderer returned non-zero error code %d." % exitCode )
You can find another example of the CheckExitCode function in MayaCmd.py, which can be found in
[Repository]/plugins/MayaCmd.
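The equivalence between the signed exit code and the Windows status code can be verified by reinterpreting the value as an unsigned 32-bit integer:

```python
# -1073741819 is the signed (two's-complement) 32-bit reading of the
# Windows status code 0xC0000005 (STATUS_ACCESS_VIOLATION).
exit_code = -1073741819
unsigned = exit_code & 0xFFFFFFFF  # reinterpret as unsigned 32-bit
print(hex(unsigned))  # -> 0xc0000005
```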
aerender ERROR: No comp was found with the given name.
This can occur for a number of reasons, most of which are related to the name of the comp. Examples are
names with two spaces next to each other, or names with apostrophes in them. Try using only alphanumeric
characters and underscores in comp names and output paths to see if that resolves the issue.
Exception during render: Renderer returned non-zero error code, 1
aerender ERROR: An existing connection was forcibly closed by the remote host. Unable to receive at line 287
aerender ERROR: After Effects can not render for aerender. Another instance of aerender, or another script,
may be running; or, AE may be waiting for response from a modal dialog, or for a render to complete. Try
running aerender without the -reuse flag to invoke a separate instance of After Effects.
It is unknown what the exact cause of this error is, but it is likely that After Effects is simply crashing
or running out of memory. If you are rendering with Concurrent Tasks set to a value greater than 1, try
reducing the number and see if that helps.
The Knoll Light Factory plugin has also been known to cause this error message when it can’t get a
license.
9.4 Anime Studio
9.4.1 Job Submission
You can submit Anime Studio Standalone jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Anime Studio specific options are:
• Anime Studio File: The scene file (*.anme) to be rendered.
• Output File: The path to where the rendered images will be saved.
• Add Format Suffix: If this option is enabled, the format name will be appended to the file name of the output
path. Version 9.5 and later.
• Version: The version of Anime Studio to render with.
• Layer Comp: Render a specific layer comp, or select All to render all layer comps to separate files.
Additional Rendering Options:
• Antialiased Edges: Normally, Anime Studio renders your shapes with smoothed edges. Uncheck this box to
turn this feature off.
• Apply Shape Effects: If this box is unchecked, Anime Studio will skip shape effects like shading, texture fills,
and gradients.
• Apply Layer Effects: If this box is unchecked, Anime Studio will skip layer effects like layer shadows and
layer transparency.
• Render At Half Dimensions: Check this box to render a smaller version of your movie. This makes rendering
faster if you just want a quick preview, and is useful for making smaller movies for the web.
• Render At Half Frame Rate: Check this box to skip every other frame in the animation. This makes rendering
faster, and results in smaller movie files.
• Reduced Particles: Some particle effects require hundreds of particles to achieve their effect. Check this box to
render fewer particles. The effect may not look as good, but will render much faster if all you need is a preview.
• Extra-smooth Images: Renders image layers with a higher quality level. Exporting takes longer with this
option on.
• Use NTSC-safe Colors: Automatically limits colors to be NTSC safe. This is only an approximation - you
should still do some testing to make sure your animation looks good on a TV monitor.
• Do Not Premultiply Alpha Channel: Useful if you plan to composite your Anime Studio animation with other
elements in a video editing program.
QT Options:
• Video Codec: The video codec (leave blank to not specify one). Version 10 and later.
• Quality: The quality of the export. Version 10 and later. 0 = Minimum, 1 = Low, 2 = Normal, 3 = High, 4 =
Max, 5 = Lossless
• Depth: The pixel depth of the export. Version 10 and later.
iPhone/iPad Movie Options:
• Format: The available formats for m4v movies.
AVI Options:
• Format: The available formats for avi movies.
SWF Options:
• Variable Line Widths: Exports variable line widths to SWF.
9.4.2 Plug-in Configuration
You can configure the Anime Studio plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Anime Studio plug-in from the list on the left.
Render Executables
• Anime Studio Executable: The path to the Anime Studio executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.
9.4.3 FAQ
Which versions of Anime Studio are supported by Deadline?
Anime Studio 8 and later are supported.
9.4.4 Error Messages And Meanings
This is a collection of known Anime Studio error messages and their meanings, as well as possible solutions. We
want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email
Deadline Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.5 Arion Standalone
9.5.1 Job Submission
You can submit Arion jobs from the Monitor. Note that Arion’s RenderWarrior application does not support animations
and therefore only single Arion files may be submitted. Arion animations can be rendered through the Arion live
plugins.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Integration options are
explained in the Integration documentation. The Arion specific options are:
• Arion File: The Arion scene that will be rendered. Can be a .rcs or .obj file.
• LDR Output File: The name of the rendered LDR output file. If no output file is specified a default image file
will be saved beside the Arion file.
• HDR Output File: The name of the rendered HDR output file. If no output file is specified a default image file
will be saved beside the Arion file.
• Passes: If enabled, Arion will render until the specified number of passes have completed.
• Minutes: If enabled, Arion will render until the specified number of minutes have passed.
• Threads: The number of threads that Arion will use to render the input file. If no threads are specified, a default
of one will be used.
• Command Line Args: Here you can specify additional command line arguments. Arion accepts command line
arguments in the format “-arg:value”.
• Channels: Each channel enabled will generate a different image appended with the channel name.
If both Passes and Minutes are specified, Arion will finish rendering when the first limit is reached. If neither are
enabled, Arion will render indefinitely and the job will have to be stopped manually.
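As the Command Line Args bullet above notes, Arion takes arguments in the "-arg:value" form. A small sketch of assembling such arguments from a settings dictionary (the argument names here are made up for illustration; consult Arion's documentation for the real ones):

```python
# Illustrative only: build "-arg:value" style command line arguments
# from a dict of hypothetical settings.
settings = {"passes": 500, "minutes": 60, "threads": 8}
args = ["-%s:%s" % (name, value) for name, value in sorted(settings.items())]
print(" ".join(args))  # -minutes:60 -passes:500 -threads:8
```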
9.5.2 Plug-in Configuration
You can configure the Arion plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Arion plug-in from the list on the left.
Render Executables
• Arion Engine Executable: The path to the Arion engine executable file used for rendering. Enter alternative
paths on separate lines.
9.5.3 FAQ
Which versions of Arion are supported?
Only the Arion 2 Standalone is supported.
Are there any issues with referencing a file in the global input folder when one or more other files exist with the
same name?
Yes. When there is a file in the scene that has the same name as a file in another subdirectory, the network
renderer will reference the first file with that name that it finds. It ignores the direct path to the correct
subdirectory.
Can I render multiple channels?
Yes! The Arion submitter supports the selection of individual channels.
How can I pass additional information to Arion?
The Command Line Args field allows you to specify additional arguments to Arion. For example, typing
“-h:100 -w:100” in the Command Line Args field will tell Arion to change the image size to 100px by
100px. To find out more information about additional command line arguments, please visit Arion’s
website.
Can I submit Arion animations?
The Arion 2 Standalone does not support animations and can only render single images. Arion does still
support animations through its Live plugins.
9.5.4 Error Messages and Meanings
This is a collection of known Arion error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.6 Arnold Standalone
9.6.1 Job Submission
You can submit Arnold Standalone jobs from the Monitor.
Setup your Arnold Files
Before you can submit an Arnold Standalone job, you must export your scene into .ass files.
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Arnold specific options are:
• Arnold File: The Arnold file(s) to be rendered.
– If you are submitting a sequence of .ass files, select one of the numbered frames in the sequence, and
the frame range will automatically be detected if Calculate Frames From Arnold File is enabled. The
frames you choose to render should correspond to the numbers in the .ass files.
• Output File: The output file. If left blank, Arnold will save the output to the location defined in the .ass file.
• Version: Choose the Beta or Release version of Arnold to render with (these can be configured in the Arnold
plugin configuration).
• Threads: The number of threads to use for rendering.
• Verbosity: The verbosity level for the render output.
• Enable Local Rendering: If enabled, Deadline will render the frames locally before copying them over to the
final network location.
• Command Line Args: Specify additional command line arguments you would like to pass to the Arnold renderer.
• Additional Plugin Folders: Specify up to three additional plugin folders that Arnold should use when rendering.
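The frame detection mentioned under the Arnold File option relies on the numbering in the .ass file names. A sketch of the idea behind Calculate Frames From Arnold File (the submitter's actual detection logic may differ):

```python
import os
import re

def frame_from_ass_name(path):
    # Pull the trailing frame number out of a numbered .ass file name,
    # e.g. "shot_0042.ass" -> 42. Returns None for un-numbered files.
    match = re.search(r"(\d+)\.ass$", os.path.basename(path))
    return int(match.group(1)) if match else None

print(frame_from_ass_name("shot_0042.ass"))  # 42
print(frame_from_ass_name("static.ass"))     # None
```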
9.6.2 Plug-in Configuration
You can configure the Arnold plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Arnold plug-in from the list on the left.
Render Executables
• Arnold Kick Executable: The path to the Arnold kick executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.
9.6.3 FAQ
Is Arnold Standalone supported by Deadline?
Yes.
Can I submit a sequence of Arnold .ass files that each contain one frame?
Yes, this is supported.
9.6.4 Error Messages and Meanings
This is a collection of known Arnold error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.7 AutoCAD
9.7.1 Job Submission
You can submit jobs from within AutoCAD by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within AutoCAD, press the Submit To Deadline button on the Deadline tab or run the command
SubmitToDeadline
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation.
AutoCAD has 3 types of submission jobs each of which have their own specific options.
The render job options are:
• Render Views: Which views to render, each one will be a separate frame in a single job.
• Render Procedure: View or Selected - whether to render everything in the view or only the selected
objects.
The plotter job options are:
• Plotter to use: Which plotter should be used.
• Plot Area: Extents or Display - what area should be plotted, everything in the scene or what is currently
displayed.
• Paper Size: The size of paper to plot to.
• Paper Units: Which units to use for the paper.
• Fit Plot Scale: Whether or not the plot should be scaled as much as possible to fit on the paper.
• Plot Scale: The scale to use if the plot is not being fit to the paper.
• Plot Style Table: Which plot style table should be used.
• Use Line Weight: Whether or not the lines should have extra weight on them.
• Scale Line Weights: Whether or not the lines should be scaled.
The export job options are:
• Selection: Which objects should be exported. Only available in the integrated submitter.
• Types to Export: Which types of objects should be exported.
• Textures: How textures should be handled.
• DGN Settings: DGN specific settings such as version and seed file.
9.7.2 Plug-in Configuration
You can configure the AutoCAD plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the AutoCAD plug-in from the list on the left.
Render Executables
• AutoCAD 2015 Executable: The path to the AutoCAD 2015 executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
• AutoCAD 2016 Executable: The path to the AutoCAD 2016 executable file used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
9.7.3 Integrated Submission Script Setup
The following procedures describe how to install the integrated AutoCAD submission script. This script allows for
submitting AutoCAD render jobs to Deadline directly from within the AutoCAD editing GUI.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/AutoCAD/Installers
Manual Installation of the Submission Script
• Copy [Repository]/AutoCAD/Client/AutoCADSubmitter.bundle to %APPDATA%/Autodesk/ApplicationPlugins
• Restart AutoCAD and the Deadline tool bar should be available.
9.7.4 FAQ
Is AutoCAD supported by Deadline?
Yes.
AutoCAD 2016 requires signed dlls. Are Deadline’s plugins signed?
Yes, all of Deadline’s plugins are signed. Due to the new system, however, you will have to add Thinkbox
as a trusted company on each of your machines. This can be done by opening AutoCAD 2016 on the
machines that have the plugins (including the render plugin) and then allowing the plugins to always load.
9.7.5 Error Messages and Meanings
This is a collection of known AutoCAD error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.8 Blender
9.8.1 Job Submission
You can submit jobs from within Blender by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Blender 2.5 and later, select Render -> Submit To Deadline. For previous versions of Blender,
you must submit from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Blender specific options are:
• Threads: The number of threads to use for rendering.
• Build To Force: You can force 32 bit or 64 bit rendering.
9.8.2 Plug-in Configuration
You can configure the Blender plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Blender plug-in from the list on the left.
Render Executables
• Blender Executable: The path to the Blender executable file used for rendering. Enter alternative paths on
separate lines.
Output
• Suppress Verbose Progress Output To Log: When enabled, this will prevent excessive progress logging to the
Slave and task logs.
9.8.3 Integrated Submission Script Setup
The following procedures describe how to install the integrated Blender submission script. This script allows for
submitting Blender render jobs to Deadline directly from within the Blender editing GUI. Note that this script only
works with Blender 2.5 and later. You can submit to older versions of Blender from the Monitor.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/Blender/Installers
• In Blender, select File -> User Preferences, and then select the Add-Ons tab.
• Click on the Render filter on the left, and check the box next to the Render: Submit To Deadline add-on.
Manual Installation of the Submission Script
• In Blender, select File -> User Preferences, and then select the Add-Ons tab.
• Click the Install Add-On button at the bottom, browse to [Repository]\submission\Blender\Client, and select
the DeadlineBlenderClient.py script. Then press the Install Add-On button to install it. Note that on Windows, you may not be able to browse the UNC repository path, in which case you can just copy [Repository]\submission\Blender\Client\DeadlineBlenderClient.py locally to your machine before pointing the Add-On
installer to it.
• Then click on the Render filter on the left, and check the box next to the Render: Submit To Deadline add-on.
• After closing the User Preferences window, the Submit To Deadline option should now be in your Render menu.
9.8.4 FAQ
Which versions of Blender are supported?
Blender 2.x is currently supported.
9.8.5 Error Messages And Meanings
This is a collection of known Blender error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.9 Cinema 4D
9.9.1 Job Submission
You can submit jobs from within Cinema 4D by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Cinema 4D, select Python -> Plugins -> Submit To Deadline.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Cinema 4D specific options are:
• Threads To Use: The number of threads to use for rendering.
• Build To Force: Force rendering in 32 bit or 64 bit.
• Export Project Before Submission: If your project is local, or you are rendering in a cross-platform environment, you may find it useful to export your project to a network directory before the job is submitted.
• Enable Local Rendering: If enabled, the frames will be rendered locally, and then copied to their final network
location.
9.9.2 Cross-Platform Rendering Considerations
In order to perform cross-platform rendering with Cinema 4D, you must setup Mapped Paths so that Deadline can
swap out the Scene and Output file paths where appropriate. You can access the Mapped Paths Setup in the Monitor
while in super user mode by selecting Tools -> Configure Repository. You’ll find the Mapped Paths Setup in the list
on the left.
When submitting the Cinema 4D job for rendering, you should enable the Export Project Before Submission option,
and choose a network location when prompted for the export path. This will strip any absolute asset paths and make
them relative to the scene file, and will also ensure the option to submit the Cinema 4D scene file with the job is
disabled.
If you don’t enable the Export Project Before Submission option, you need to manually export the project to a network
location. Then, you must submit the exported scene file from the Submit menu in the Monitor and you need to specify
the output and/or multipass output paths in the submitter. Make sure the option to submit the Cinema 4D scene file
with the job is disabled. If you leave it enabled, the scene file will be copied to and loaded from the Slave’s local
machine, which will break the relative asset paths.
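Conceptually, a mapped-path swap replaces a platform-specific path prefix with the equivalent prefix for the render node's platform. This is only an illustrative sketch; Deadline's actual rules are configured in the Mapped Paths Setup described above:

```python
# Hypothetical mapping table: a Windows share prefix and its Linux
# mount-point equivalent (paths are examples, not Deadline defaults).
MAPPINGS = {"Z:\\projects\\": "/mnt/projects/"}

def map_path(path):
    # Swap a known prefix and normalize separators; leave unknown
    # paths untouched.
    for src, dst in MAPPINGS.items():
        if path.startswith(src):
            return dst + path[len(src):].replace("\\", "/")
    return path

print(map_path("Z:\\projects\\shot\\scene.c4d"))  # /mnt/projects/shot/scene.c4d
```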
9.9.3 Plug-in Configuration
You can configure the Cinema 4D plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Cinema 4D plug-in from the list on the left.
Render Executables
• C4D Executable: The path to the C4D executable file used for rendering. Enter alternative paths on separate
lines. Different executable paths can be configured for each version installed on your render nodes.
9.9.4 Integrated Submission Script Setup
The following procedures describe how to install the integrated Cinema 4D submission script. This script allows for
submitting Cinema 4D render jobs to Deadline directly from within the Cinema 4D editing GUI.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/Cinema4D/Installers
Manual Installation of the Submission Script
• Copy [Repository]/submission/Cinema4D/Client/DeadlineC4DClient.pyp to [Cinema 4D Install Directory]/plugins.
• Restart Cinema 4D, and the Submit To Deadline menu should be available from the Python -> Plugins menu.
Custom Sanity Check
A CustomSanityChecks.py file can be created alongside the main SubmitC4DToDeadline.py submission script (in
[Repository]\submission\Cinema4D\Main), and will be evaluated if it exists. This script will let you set any of the
initial properties in the submission script prior to displaying the submission window. You can also use it to run your
own checks and display errors or warnings to the user. Here is a very simple example of what this script could look
like:
import c4d
from c4d import gui
def RunSanityCheck( dialog ):
    dialog.SetString( dialog.DepartmentBoxID, "The Best Department!" )
    dialog.SetLong( dialog.PriorityBoxID, 33 )
    dialog.SetLong( dialog.ConcurrentTasksBoxID, 2 )
    gui.MessageDialog( "This is a custom sanity check!" )
    return True
The available dialog IDs can be found in the SubmitC4DToDeadline.py script mentioned above. They are defined near
the top of the SubmitC4DToDeadlineDialog class. These can be used to set the initial values in the submission dialog.
Finally, if the RunSanityCheck method returns False, the submission will be cancelled.
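A sanity check can also be used to cancel a submission when some condition fails. Because the c4d module is only importable inside Cinema 4D, this hypothetical sketch keeps the check logic separate from the dialog calls; in a real CustomSanityChecks.py the value would come from the dialog (e.g. dialog.GetString( dialog.DepartmentBoxID )):

```python
def department_is_valid(department):
    # Treat an empty or whitespace-only department as invalid
    # (an example policy, not a Deadline requirement).
    return bool(department.strip())

def run_sanity_check(department):
    # Returning False cancels the submission, mirroring how Deadline
    # treats a False return from RunSanityCheck.
    return department_is_valid(department)

print(run_sanity_check(""))    # False
print(run_sanity_check("FX"))  # True
```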
9.9.5 FAQ
Which versions of Cinema 4D are supported?
Cinema 4D 12 and later are supported.
When I use Adobe Illustrator files as textures, the render fails with “Asset missing”
While Cinema 4D is able to use AI files in workstation mode, there are often problems when rendering
in command line mode. Convert the AI files to another known type such as TIFF or JPEG before using
them.
Sometimes when I open the submission dialog in Cinema 4D, the pool list or group list are empty.
Simply close the submission dialog and reopen it to repopulate the lists.
Does rendering with Cinema 4D with Deadline use up a full Cinema 4D license?
There are separate Cinema 4D command line licenses that are required to render with Deadline. Please
contact Maxon for more information regarding licensing requirements.
Can Deadline render with Cinema 4D’s Net Render Client software?
No. It isn’t possible for 3rd party software such as Deadline to control Cinema 4D’s Net Render Client,
which is why it uses the command line renderer.
I have copied over the SubmitToDeadline.pyp file, but the integrated submission script does not show up under the
Python menu.
This is likely caused by some failure in the script. Check your repository path to ensure the client is able
to read and write to that folder. Using the python console within C4D may provide more specific hints.
My frames never seem to finish rendering. When I check the slave machine, it doesn’t appear to be doing
anything.
This can occur if Cinema 4D hasn’t been licensed yet. Try starting Cinema 4D normally on the machine
and see if you are prompted for a license. If you are, configure everything and then try rendering on that
machine again.
9.9.6 Error Messages And Meanings
This is a collection of known Cinema 4D error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.10 Cinema 4D Team Render
9.10.1 Job Submission
You can submit jobs from within Cinema 4D by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Cinema 4D, select Python -> Plugins -> Submit Team Render To Deadline.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Cinema 4D Team Render
specific options are:
• Render Client Count: The number of render clients to use.
• Security Token: The security token that the Team Render application will use on the slaves (it will be generated
automatically if left blank).
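If the Security Token field is left blank, a token is generated automatically. Something along these lines would produce one; this is purely illustrative, as Deadline's own generation scheme is not documented here:

```python
import secrets

# Generate a random 16-character hexadecimal token.
token = secrets.token_hex(8)
print(len(token))  # 16
```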
Rendering
After you’ve configured your submission options, press the Reserve Clients button to submit the Team Render job.
After the job has been submitted, you can press the Update Clients button to update the job’s ID and Status in the
submitter. As nodes pick up the job, pressing the Update Clients button will also show them in the Active Servers list.
Cinema 4D’s Team Render Machines window will also appear after pressing the Reserve Clients button, and will
show you the Team Render machines that are currently available. Before you can render with them though, you must
verify them by following these steps:
1. Copy the Security Token from the submitter to the clipboard (use the Copy to Clipboard button).
2. Right-click on each machine in the Team Render Machines window and select the Verify option, then paste the
Security Token and press OK.
When you are ready to render, select the Team Render To Picture Viewer option in C4D’s Render menu to start
rendering.
9.10.2 Plug-in Configuration
You can configure the Cinema 4D Team Render plug-in settings from the Monitor. While in super user mode, select
Tools -> Configure Plugins and select the Cinema 4D plug-in from the list on the left.
Cinema 4D Options
• C4D Team Render Executable: The path to the Cinema 4D Team Render Client executable file used for
rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each
version installed on your render nodes.
9.10.3 Integrated Submission Script Setup
The following procedures describe how to install the integrated Cinema 4D Team Render submission script. This
script allows for submitting Cinema 4D Team Render render jobs to Deadline directly from within the Cinema 4D
editing GUI.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/Cinema4DTeamRender/Installers
Manual Installation of the Submission Script
• Copy [Repository]/submission/Cinema4DTeamRender/Client/DeadlineC4DTeamRenderClient.pyp to [Cinema
4D Install Directory]/plugins.
• Restart Cinema 4D, and the Submit To Deadline menu should be available from the Python -> Plugins menu.
9.10.4 FAQ
Which versions of Cinema 4D are supported?
Cinema 4D 15 and later are supported.
9.10.5 Error Messages And Meanings
This is a collection of known Cinema 4D error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.11 Clarisse iFX
9.11.1 Job Submission
You can submit jobs from within Clarisse iFX by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Clarisse iFX, click on the custom toolbar item you created during the integrated submission
script setup. You will first be prompted to specify a file to export the render archive to.
After you specify the render archive file, the submitter will come up with the Render Archive and Frame List fields
already populated.
Note that if you are submitting from the Monitor, you will have to manually export your render archive from inside
Clarisse iFX, and then browse to the Render Archive file in the Monitor submitter.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Clarisse iFX specific options
are:
• Threads: The number of threads to use for rendering. If set to 0, the value in the Clarisse configuration file will
be used.
• Verbose Logging: Enables verbose logging during rendering.
9.11.2 Plug-in Configuration
You can configure the Clarisse iFX plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Clarisse plug-in from the list on the left.
Render Executables
• CRender Executable: The path to Clarisse’s crender executable file used for rendering. Enter alternative
paths on separate lines.
Configuration Options
• Global Config File: A global configuration file to be used for rendering. If left blank, the Clarisse.cfg file in the
user home directory will be used instead.
• Module Paths: Additional paths to search for modules (one path per line).
• Search Paths: Additional paths to search for includes (one path per line).
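The fallback described under Global Config File can be sketched as follows; the exact lookup Deadline performs may differ, and the function name here is made up for illustration:

```python
import os

def resolve_config_file(global_config=""):
    # Use the configured global config file when one is given;
    # otherwise fall back to Clarisse.cfg in the user's home directory.
    if global_config:
        return global_config
    return os.path.join(os.path.expanduser("~"), "Clarisse.cfg")

print(resolve_config_file("/shared/render.cfg"))  # /shared/render.cfg
```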
9.11.3 Integrated Submission Script Setup
The following procedures describe how to install the integrated Clarisse iFX submission script. This script allows for
submitting Clarisse iFX render jobs to Deadline directly from within the Clarisse iFX editing GUI.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/Clarisse/Installers
Manual Installation of the Submission Script
• In Clarisse iFX, right-click on the toolbar at the top and select Add Item.
• In the Add New Item dialog, set the following properties:
– Title: Submit To Deadline
– Category: Custom
– Category Custom: Deadline
– Script Path: Choose the DeadlineClarisseClient.py script from [Repository]\submission\Clarisse\Client
• Click Add, and you should now see a Deadline tab in the toolbar with a button that you can click on to submit
the job.
9.11.4 FAQ
Which versions of Clarisse iFX are supported?
The “crender” application is used for rendering, so any version of Clarisse iFX that includes this application is supported.
9.11.5 Error Messages and Meanings
This is a collection of known Clarisse iFX error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.12 Combustion
9.12.1 Job Submission
You can submit Combustion jobs from the Monitor.
Workspace Configuration
• In Combustion, when you are ready to submit your workspace, open the Render Queue by selecting File ->
Render... (CTRL+R).
• Select which items you want to render in the box on the left.
• Configure your output settings under the tab Output Settings.
• Under the tab Global Settings, specify an Input Folder (a shared folder where all the footage for your workspace
can be found) and an Output Folder (a shared folder where the output will be dumped). Note that Combustion
will search any subfolders in your Input Folder for footage as well.
• Close the Render Queue and save your workspace.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Combustion specific options are:
• Workspace File: The Combustion workspace file to be rendered.
• Output Operator: Select the output operator in the workspace file to render. The render will fail if the operator
cannot be found.
• Version: The version of Combustion to render with.
• Skip Existing Frames: Skip over existing frames during rendering (version 4 and later only).
• Use Only One CPU To Render: Limit rendering to one CPU (version 4 and later only).
9.12.2 Plug-in Configuration
You can configure the Combustion plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Combustion plug-in from the list on the left.
Render Executables
• Combustion Executable: The path to the ShellRenderer executable file used for rendering. Enter alternative
paths on separate lines. Different executable paths can be configured for each version installed on your render
nodes.
9.12.3 FAQ
Which versions of Combustion are supported?
Combustion 4 and later are supported.
All my input footage is spread out over the network, so how do I specify a single Input Folder during submission?
When Combustion is given an Input Folder, it will search all subfolders for the required footage until
the footage is found. So if you have a root folder that all of your footage branches off from, you should
specify that root as the Input Folder.
Are there any issues with referencing a file in the global input folder when one or more other files exist with the
same name?
Yes. When there is a file in the scene that has the same name as a file in another subdirectory, the network
renderer will reference the first file with that name that it finds. It ignores the direct path to the correct
subdirectory.
Can Deadline render multiple outputs?
No. Only one output can be enabled in your Combustion workspace. If no outputs are enabled, or multiple
outputs are enabled, the workspace cannot be submitted to Deadline.
When rendering, I receive a pop up error message. Since rendering is supposed to be silent, should I not be
getting error messages like this in the first place?
Make sure that you're using ShellRenderer.exe as the render executable (combustion.exe starts up Combustion normally, while ShellRenderer.exe is the command line rendering application). You can make the
switch in the Plugin Configuration (Tools -> Configure Plugins in the Monitor while in super user mode).
Why isn’t path mapping working properly between Windows and Mac?
On the Mac, the Combustion workspace file saves network paths in the form share:\\folder\..., so you have
to set up your Path Mapping settings in the Repository options accordingly.
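As a conceptual illustration of the kind of prefix substitution that path mapping performs between platforms, here is a minimal sketch. The mapping table is hypothetical, not Deadline's actual Path Mapping API; it only demonstrates the idea of rewriting a platform-specific prefix:

```python
def map_path(path, mappings):
    # Replace the first matching source prefix with its destination prefix,
    # mimicking a simple prefix-based path mapping rule.
    for src, dst in mappings:
        if path.startswith(src):
            return dst + path[len(src):]
    return path

# Hypothetical rule: rewrite the Mac-style "share:\\..." form into a UNC path.
mappings = [("share:\\\\", "\\\\server\\share\\")]
print(map_path("share:\\\\folder\\clip.mov", mappings))
```

A path that matches no rule is returned unchanged, which is the behaviour you want for paths that are already valid on the rendering platform.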
9.12.4 Error Messages And Meanings
This is a collection of known Combustion error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.13 Command Line
9.13.1 Job Submission
Arbitrary command line jobs can be submitted to Deadline that will execute the same command line for each frame of
the job.
To submit arbitrary command line jobs, refer to the Manual Job Submission documentation. To submit from the
Monitor, refer to the documentation below.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Command Line specific options
are:
• Job Type: Choose a normal job or maintenance job. A normal job will let you specify an arbitrary frame list,
but a maintenance job requires a start frame and an end frame.
• Executable: The executable to use for rendering.
• Arguments: The arguments to pass to the executable. Use the Start Frame and End Frame buttons to add their
corresponding tags to the end of the current arguments. See the Manual Job Submission documentation for more
information on these tags.
• Frame Tag Padding: Determines the amount of frame padding to be added to the Start and End Frame tags.
• Start Up Folder: The folder that the executable will be started in. If left blank, the executable’s folder will be
used instead.
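To illustrate how Frame Tag Padding affects the substituted values, here is a minimal sketch. The <STARTFRAME> and <ENDFRAME> tag names and the exact substitution rules should be treated as assumptions; see the Manual Job Submission documentation for the authoritative tag syntax:

```python
def apply_frame_tags(arguments, start_frame, end_frame, padding):
    # Substitute the (assumed) frame tags with zero-padded frame numbers,
    # as the Frame Tag Padding option implies.
    return (arguments
            .replace("<STARTFRAME>", str(start_frame).zfill(padding))
            .replace("<ENDFRAME>", str(end_frame).zfill(padding)))

# With Frame Tag Padding set to 4, a task covering frames 1-10 becomes:
print(apply_frame_tags("-start <STARTFRAME> -end <ENDFRAME>", 1, 10, 4))
# -start 0001 -end 0010
```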
9.13.2 Plug-in Configuration
The Command Line plug-in does not require any configuration.
9.13.3 FAQ
How do I handle paths in the arguments with spaces in them?
Use double-quotes around the path. For example, "T:\projects\path with spaces\project.ext".
Do I need to use the <QUOTE> tags?
These are only needed when submitting manually from the command line. When using the Monitor
submitter, you can just type in the double-quote character in the Arguments field.
9.14 Command Script
9.14.1 Job Submission
You can submit Command Script jobs from the Monitor. Command Script can execute a series of command lines,
which can be configured to do anything from rendering to folder synchronization.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Command Script specific
options are:
• Commands To Execute: Specify a list of commands to execute by either typing them in, or by loading them
from a file. You also have the option to save the current list of commands to a file. To insert file or folder paths
into the Commands field, use the Insert File Path or Insert Folder Path buttons.
• Startup Directory: The directory where each command will start up. This is optional, and if left blank, the
executable's directory will be used as the startup directory.
• Commands Per Task: Number of commands that will be executed for each task.
9.14.2 Manual Submission
Command Script jobs can also be manually submitted from the command line.
Submission File Setup
Three files are required for submission:
• the Job Info File
• the Plugin Info File
• the Command file
The Job Info file contains the general job options, which are explained in the Job Submission documentation.
The Plugin info file contains one line (this is the directory where each command will start up):
StartupDirectory=...
The Command file contains the list of commands to run. There should be one command per line, and no lines should
be left blank. If your executable path has a space in it, make sure to put quotes around the path. The idea is that
one frame in the job represents one command in the Command file. For example, let’s say that your Command file
contains the following:
"C:\Program Files\Executable1.exe"
"C:\Program Files\Executable1.exe" -param1
"C:\Program Files\Executable1.exe"
"C:\Program Files\Executable1.exe" -param1 -param2
"C:\Program Files\Executable1.exe"
Because there are five commands, the Frames specified in the Job Info File should be set to 0-4. If the ChunkSize is set
to 1, then a separate task will be created for each command. When a slave dequeues a task, it will run the command
that is on the corresponding line number in the Command file. Note that the frame range specified must start at 0.
If you wish to run the commands in the order that they appear in the Command file, you can do so by setting the
MachineLimit in the Job Info File to 1. Only one machine will render the job at a given time, thus dequeuing each
task in order. However, if a task throws an error, the slave will move on to dequeue the next task.
To submit the job, run the following command (where DEADLINE_BIN is the path to the Deadline bin directory):
DEADLINE_BIN\deadlinecommand JobInfoFile PluginInfoFile CommandFile
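As a concrete sketch, the three files for the five-command example above might look like the following. The file names and the Plugin/Name keys shown here are illustrative assumptions; see the Job Submission documentation for the full set of job info keys:

```
# job_info.txt
Plugin=CommandScript
Name=Command Script Example
Frames=0-4
ChunkSize=1
MachineLimit=1

# plugin_info.txt
StartupDirectory=C:\Temp

# commands.txt (one command per line; frame N runs the command on line N)
"C:\Program Files\Executable1.exe"
"C:\Program Files\Executable1.exe" -param1
"C:\Program Files\Executable1.exe"
"C:\Program Files\Executable1.exe" -param1 -param2
"C:\Program Files\Executable1.exe"
```

With ChunkSize=1 this produces five tasks, and MachineLimit=1 keeps them running in order on a single machine.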
Manual Submission Example
This example demonstrates how you can render a single frame from a Maya scene with different options, and direct
the output to a specific location. To get the submission script, download Example Script For Command Script Plugin
from the Miscellaneous Deadline Downloads Page. To run the script, run the following command (you must have Perl
installed):
Perl SubmitMayaCommandScript.pl "SceneFile.mb" FrameNumber "OutputPath"
9.14.3 Plug-in Configuration
The Command Script plug-in does not require any configuration.
9.14.4 FAQ
Can I use executables with spaces in the path?
Yes, just add quotes around the executable path.
9.15 Composite
9.15.1 Job Submission
You can submit jobs from within Composite by installing the integrated submission script, or you can submit them
from the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Composite, select the version you would like to submit, hit render, and choose the Background
option when prompted.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Integration options are
explained in the Integration documentation. The Composite specific options are:
• Project File: The Composite .txproject file.
• Composition: Path to the composition that you want to submit.
• Composition Version: The version of the current composition selected.
• Users ini file: The path to the user.ini file for this composition.
• Version: The version of Composite to use.
• Build to Force: Force 32 bit or 64 bit rendering.
9.15.2 Plug-in Configuration
You can configure the Composite plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the Composite plug-in from the list on the left.
Render Executables
• Composite Executable: The path to the txrender executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.
9.15.3 Integrated Submission Script Setup
The following procedures describe how to setup the integrated Composite submission script. This script allows for
submitting Composite render jobs to Deadline directly from within the Composite editing GUI.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/Composite/Installers
Manual Installation of the Submission Script
• Copy [Repository]\submission\Composite\Client\DeadlineCompositeClient.py to [Composite Install Directory]\resources\scripts\
• Setup the Custom Render Action.
– In Composite under the Edit menu select Edit -> Project Preferences
– In the opened dialog select the Render Actions tab
– Under Render Actions, right click and select New
– Name the new action ‘Deadline’
– Enter the following for the Render Command (all on one line):
"<PythonExec>" "<ScriptsFolder>/DeadlineCompositeClient.py" -d "<RenderProjectPath>" -u "<RenderUserPath>" -c "<Composition>" -v "<Version>" -o "<Outputs>" -s "<StartFrame>" -e "<EndFrame>"
– There are two additional options you can add to this line:
* -r “COMPOSITE_VERSION” (where COMPOSITE_VERSION is the version of Composite,
like 2012)
* -b “COMPOSITE_BUILD” (where COMPOSITE_BUILD is the bitness of Composite, which
can be set to None, 32bit, or 64bit)
– In the Render window, select ‘Deadline’ as the Action and press Start.
9.15.4 FAQ
Which versions of Composite are supported?
Composite 2010 and later are supported.
9.15.5 Error Messages and Meanings
This is a collection of known Composite error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.16 Corona Standalone
9.16.1 Job Submission
You can submit Corona jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Corona specific options are:
• Corona Scene: The Corona scene that will be rendered. Must be a .scn file.
• Output File Directory: The directory for the output to be saved to.
• Output File Name: The prefix for the output file names. If not specified it defaults to “output”.
• Frame List: The list of frames to be rendered. Each frame will be rendered to a separate output file.
• Single Frame Job: If selected, the job is a single frame job.
• Configuration File(s): Add any configuration files for Corona here. Configuration files are processed in the
order they are listed.
The Corona specific advanced options are:
• Override maximum # of Passes: You can override the configuration file setting for the maximum number of
passes here if this is enabled.
• Override maximum Render Time: You can override the configuration file setting for the maximum render
time here if this is enabled.
• Override Threads: You can override the configuration file setting for the number of threads here if this is
enabled.
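Configuration files are processed in the order they are listed. Assuming that later files override earlier values (an assumption implied by the ordering, not confirmed here), the effective settings can be sketched as a last-wins merge:

```python
def merge_configs(config_files):
    # Merge a list of config dicts in listed order; later files win,
    # sketching a last-wins interpretation of "processed in order".
    merged = {}
    for cfg in config_files:
        merged.update(cfg)
    return merged

print(merge_configs([{"passes": 100, "threads": 8}, {"passes": 250}]))
# {'passes': 250, 'threads': 8}
```

The override options above would then act as a final layer applied after all listed files.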
9.16.2 Plug-in Configuration
You can configure the Corona plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the Corona plug-in from the list on the left.
Render Executables
• Corona Executable: The path to the corona standalone executable file used for rendering. Enter alternative
paths on separate lines.
9.16.3 Error Messages and Meanings
This is a collection of known Corona error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn't listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.17 Corona Distributed Rendering
9.17.1 Interactive Distributed Rendering
You can submit interactive Corona DR jobs from 3ds Max. The instructions for installing the integrated submission
script can be found further down this page.
The interactive submitter will submit a Corona DR job to reserve render nodes, and the submitter will automatically
update the Corona DR server list in the 3ds Max UI.
Do NOT execute Render Legion’s Corona DR Server executable manually on each intended machine. Deadline is
more flexible here and will spawn the Corona DR Server standalone executable as a child process of the Deadline
Slave. This makes our system flexible and resilient to crashes as when we terminate/complete the Corona DR job
in the Deadline queue, the Deadline Slave application will ‘cleanly’ tidy up the DR Server and more importantly,
any instances of 3dsMax which it in turn has spawned as a child process. This can be helpful if Corona DR or that
instance of 3dsMax becomes unstable and a user wishes to reset the system remotely. You can simply re-queue or
delete/complete the current Corona DR job or re-submit.
Port Configuration
Here is a consolidated list of port requirements for Corona DR. Ensure any applicable firewalls are opened to allow
pass-through communication. Typically if in doubt, opening TCP/UDP ports in the range: 19660-19670 will cover all
Corona implementations for DR. During initial testing, it is recommended to open all ports in this range, verify and
then consider tightening up security.
Protocol   Port Number   Application   Notes
UDP        19666         3dsMax
TCP        19667         3dsMax
TCP        19668         3dsMax        loopback
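During the initial open-all-ports testing phase described above, a quick connectivity probe can help confirm that a node's DR ports are reachable. This is a generic sketch using Python's standard socket module; the render node host name is a placeholder, and note that the UDP port (19666) cannot be verified with a plain TCP connect:

```python
import socket

def tcp_port_reachable(host, port, timeout=1.0):
    # True if a TCP connection to host:port succeeds within the timeout,
    # i.e. something is listening and no firewall blocks the connection.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the Corona DR range on a (placeholder) render node:
# for port in range(19660, 19671):
#     print(port, tcp_port_reachable("render-node-01", port))
```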
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Corona DR specific options
are:
• Maximum Servers: The maximum number of Corona DR Servers to reserve for distributed rendering.
• Enable Verbose Logging (Optional): When checked, Corona DR server will create verbose logs.
• Use Server IP Address Instead of Host Name: If checked, the Active Servers list will show the server IP
addresses instead of host names.
• Automatically Update Server List: When un-checked, this option stops the automatic refresh of the active
servers list based on the current Deadline queue.
• Complete Job after Render: When checked, as soon as the DR session has completed (max quick render
finished), the Deadline job will be marked as complete in the queue.
Rendering
After you’ve configured your submission options, press the Reserve Servers button to submit the Corona DR job. The
job’s ID and Status will be tracked in the submitter, and as nodes pick up the job, they will show up in the Active
Servers list. Once you are happy with the server list, press Start Render to start distributed rendering.
Note that the Corona DR Server process can sometimes take a little while to initialize. This means that a server in
the Active Server list could have started the Corona DR server, but it’s not fully initialized yet. If this is the case, it’s
probably best to wait a minute or so after the last server has shown up before pressing Start Render.
The Update Servers button (3dsMax only) manually updates the Active Servers list. Note that if you modify the
Maximum Servers value, the job's frame range will be updated when this button is pressed or if "Automatically Update
Server List" is enabled.
Whilst using the "interactive" Corona DR Server submission system in 3dsMax, it is recommended to NOT use the
"Search LAN" button or enable the "Search LAN during render" checkbox, as you risk accidentally selecting the wrong
Corona DR servers running on your network if another user in your studio is also running one or more Corona DR
servers for their own rendering needs.
After the render is finished, you can press Release Servers or close the submitter UI (Setup Corona DR With Deadline)
to mark the Corona DR job as complete so that the render nodes can move on to another job in your queue.
9.17.2 Corona DR Submission
You can also submit Corona DR jobs from the Monitor, which can be used to reserve render nodes for distributed rendering. Note that if you submit the job via the Monitor submission script, you will need to manually configure or update
your local workstation settings to point to the correct, corresponding Deadline slaves, either via IP address or hostname
depending on your local network setup. See your local systems administrator if you're not sure whether you should use
a hostname or IP address on your network.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Corona DR specific options
are:
• Maximum Servers: The maximum number of Corona DR Servers to reserve for distributed rendering.
• Verbose Logging: Enable for verbose logging from the DrServer application.
Rendering
After you’ve configured your submission options, press the Submit button to submit the Corona DR job. Note that
this doesn’t start any rendering, it just allows the Corona DR Server application to start up on nodes in the farm. Once
you’re happy with the nodes that have picked up the job, you can initiate the distributed render manually from within
the application. This will likely require manually configuring your Corona Server list; alternatively, you can use
the "Search LAN" button to automatically find ANY Corona DR servers running on your network. Additionally,
Corona provides a "Search LAN during render" checkbox, which can be used to locate additional Corona DR Servers
whilst the render is progressing on your workstation, and it also allows any errored or user-interrupted servers to
re-join the rendering session.
After the distributed render has finished, remember to mark the job as complete or delete it so that the nodes can move
on to other jobs. Alternatively, use the DR Session timeout functionality described below or the auto task timeout to
control whether these type of jobs are automatically completed after a certain period of time.
9.17.3 Plug-in Configuration
You can configure the Corona DR Server plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the CoronaDR plug-in from the list on the left.
Corona DrServer Executables
Here you can specify the Corona DR server executable used for rendering.
DR Process Handling
• Handle Existing DR Process: Only one instance of the same DR process can run over the same port. This
option determines whether Deadline fails the task in this case, or attempts to kill the currently running process
so that the Deadline-managed DR process can run successfully.
DR Session Timeout
• DR Session Auto Timeout Enable: If enabled, when a DR session has successfully completed on a slave, the
task on the slave will be marked as complete after the DR session auto timeout period in seconds has been
reached (Default: False).
• DR Session Auto Timeout (Seconds): This is the timeout period (Default: 30 seconds) when a DR session will
timeout and be marked as complete by a slave.
9.17.4 Integrated Submission Script Setup
There is an integrated Corona DR submission script for 3ds Max. The installation process for this script can be found
below.
3ds Max
The following procedures describe how to install the integrated Corona DR submission script for 3ds Max. The
integrated submission script and the following installation procedure has been tested with Max versions 2012 and later
(including Design editions).
Note: Due to a maxscript bug in the initial release of 3ds Max 2012, the integrated submission scripts will not work.
However, this bug has been addressed in 3ds Max 2012 Hotfix 1.
You can either run the Submitter installer or manually install the submission script.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/3dsmaxCoronaDR/Installers
Manual Installation of the Submission Script
• Copy [Repository]/submission/3dsmaxCoronaDR/Client/Deadline3dsmaxCoronaDRClient.mcr to [3ds Install
Directory]/MacroScripts. If you don’t have a MacroScripts folder in your 3ds Max install directory, check to
see if you have a UI/Macroscripts folder instead, and copy the Deadline3dsmaxCoronaDRClient.mcr file there
if you do.
• Copy [Repository]/submission/3dsmax/Client/SMTDSetup.ms to [3ds Max Install Directory]/scripts/Startup/SMTDSetup.ms
• Launch 3ds Max, and find the new Deadline menu.
9.17.5 FAQ
Is Corona Distributed Rendering (DR) supported?
Yes. A special ‘reserve’ job is submitted that will run the Corona DR Server application on the render
nodes. Once the Corona DR Server process is running, these nodes will be able to participate in distributed
rendering.
Which versions of Corona DR are supported?
Corona interactive rendering is supported for 3ds Max 2012-2015.
Corona DR Server application fails to start manually?
During initial configuration of Corona DR Server & any future debugging, it is recommended to disable
any firewall & anti-virus software on both the DR master host machine and all render slave machines
which are intended to participate in the DR process. We suggest you manually get Corona DR up and
running in your studio pipeline to verify all is well before then introducing Deadline as a framework to
handle the DR Server application.
Is Backburner required for 3dsMax based Corona DR via Deadline?
Yes. Normal 3dsMax rendering via Deadline requires the Backburner DLLs to be present on a system,
and this is the same prerequisite for Corona DR rendering to work correctly. Ensure you have the latest/corresponding version of Backburner so that it supports the version of 3dsMax you are using. You
can submit a normal 3dsmax render job to verify that Backburner & 3dsMax rendering via Deadline are
all operating correctly before attempting to configure Corona DR rendering. Use the Deadline job report
to verify the correctly matched version of Backburner and 3dsMax are in order.
Do I need to run the Corona DR Server application executable on each machine?
Do NOT execute Render Legion’s Corona DR Server executable manually on each intended machine.
Deadline is more flexible here and will spawn the Corona DR Server standalone executable as a child
process of the Deadline Slave. This makes our system flexible and resilient to crashes as when we terminate/complete the Corona DR job in the Deadline queue, the Deadline Slave application will ‘cleanly’
tidy up the DR Server and more importantly, any instances of 3dsMax which it in turn has spawned as a
child process. This can be helpful if Corona DR or that instance of 3dsMax becomes unstable and a user
wishes to reset the system remotely. You can simply re-queue or delete/complete the current Corona DR
job or re-submit.
Can I force Corona DR to run over a certain port?
No. Currently this is not possible and the ports used are fixed. Please see the Port Configuration table at
the top of this page for more information.
Corona DR rendering seems a little unstable sometimes or my machine slows down dramatically!
Depending on the number of slave machines being used (Win7 OS < 20), scene file sizes being moved
around together with asset files, and your network/file storage configuration, it may help to increase the
“Synchronization interval [s]: 60” and decrease the “Max pixels transfer at once: 500000” settings, which
can help to reduce the load on your local machine and network.
9.17.6 Error Messages and Meanings
This is a collection of known Corona error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.18 CSiBridge
9.18.1 Job Submission
You can submit CSiBridge jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The CSiBridge specific options are:
• CSi Bridge Data File(s): The CSi Bridge Data File to be processed. CSi Bridge Files (*.BDB), Microsoft
Access Files (*.MDB), Microsoft Excel Files (*.XLS), CSi Bridge Text Files (*.$BR *.B2K) are supported.
• Override Output Directory: If this option is enabled, an output directory can be used to re-direct all processed
files to.
• Build To Force: You can force 32 or 64 bit processing with this option.
• Submit Data File With Job: If this option is enabled, the Bridge file will be submitted with the job, and then
copied locally to the slave machine during processing.
• Version: The version of CSiBridge to render with.
CSiBridge Process/Solver Options are:
• Process Selection: Choose to execute inside of the existing Bridge application process or as a separate process.
• Solver Selection: Select the Solver to perform the analysis on the data file.
CSiBridge Design Options are:
Four options are available to automatically perform design after the data file has been opened & analysis results are
available.
• Steel Frame Design: Perform steel frame design after the analysis has completed.
• Concrete Frame Design: Perform concrete frame design after the analysis has completed.
• Aluminium Frame Design: Perform aluminium frame design after the analysis has completed.
• Cold Formed Frame Design: Perform cold formed frame design after the analysis has completed.
CSiBridge Deletion Options are:
• Temp File Deletion: Choose a deletion option to cleanup the analysis/log/out files if required.
CSiBridge Additional Options are:
• Include Data File: If enabled, the output zip file will contain the data file OR if outputting to a directory path,
the data file will be included.
• Compress (ZIP) Output: Automatically compress the output to a single zip file.
• Command Line Args: Additional command line flags/options can be added here if required.
9.18.2 Plug-in Configuration
You can configure the CSiBridge plug-in settings from the Monitor. While in super user mode, select Tools -> Configure Plugins and select the CSiBridge plug-in from the list on the left.
Executables
• Bridge 15 Executable: The path to the Bridge 15 executable file used for simulating. Enter alternative paths on
separate lines.
• Bridge 2014 Executable: The path to the Bridge 2014 executable file used for simulating. Enter alternative
paths on separate lines.
• Bridge 2015 Executable: The path to the Bridge 2015 executable file used for simulating. Enter alternative
paths on separate lines.
9.18.3 FAQ
Is CSiBridge supported by Deadline?
Yes.
9.18.4 Error Messages and Meanings
This is a collection of known CSiBridge error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.19 CSiETABS
9.19.1 Job Submission
You can submit CSiETABS jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The CSiETABS specific options
are:
• CSi ETABS Data File(s): The CSi ETABS Data File to be processed. CSi ETABS Files (*.EDB), Microsoft
Access Files (*.MDB), Microsoft Excel Files (*.XLS), CSi ETABS Text Files (*.$ET *.E2K) are supported.
• Override Output Directory: If this option is enabled, all processed files will be redirected to the specified output
directory.
• Build To Force: You can force 32 or 64 bit processing with this option.
• Submit Data File With Job: If this option is enabled, the ETABS file will be submitted with the job, and then
copied locally to the slave machine during processing.
• Version: The version of CSi ETABS to process with.
CSiETABS Design Options are:
Four options are available to automatically perform design after the data file has been opened and analysis results are
available.
• Steel Frame Design: Perform steel frame design after the analysis has completed.
• Concrete Frame Design: Perform concrete frame design after the analysis has completed.
• Composite Beam Design: Perform composite beam design after the analysis has completed.
• Shear Wall Design: Perform shear wall design after the analysis has completed.
CSiETABS Deletion Options are:
• Delete Analysis Results: Choose to delete the analysis results if required.
CSiETABS Additional Options are:
• Include Data File: If enabled, the data file will be included in the output zip file, or in the output directory when
outputting to a directory path.
• Compress (ZIP) Output: Automatically compress the output to a single zip file.
• Command Line Args: Additional command line flags/options can be added here if required.
9.19.2 Plug-in Configuration
You can configure the CSiETABS plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the CSiETABS plug-in from the list on the left.
Executables
• ETABS 2013 Executable: The path to the ETABS 2013 executable file used for simulating. Enter alternative
paths on separate lines.
• ETABS 2014 Executable: The path to the ETABS 2014 executable file used for simulating. Enter alternative
paths on separate lines.
9.19.3 FAQ
Is CSiETABS supported by Deadline?
Yes.
9.19.4 Error Messages and Meanings
This is a collection of known CSiETABS error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.20 CSiSAFE
9.20.1 Job Submission
You can submit CSiSAFE jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The CSiSAFE specific options are:
• CSi SAFE Data File(s): The CSi SAFE Data File to be processed. CSi SAFE Files (*.FDB), Microsoft Access
Files (*.MDB), Microsoft Excel Files (*.XLS), CSi SAFE Text Files (*.$2K *.F2K) are supported.
• Override Output Directory: If this option is enabled, all processed files will be redirected to the specified output
directory.
• Build To Force: You can force 32 or 64 bit processing with this option.
• Submit Data File With Job: If this option is enabled, the SAFE file will be submitted with the job, and then
copied locally to the slave machine during processing.
• Version: The version of CSi SAFE to process with.
CSiSAFE Analysis/Design/Detailing Option:
• Run Method: Choose a run combination option such as “Disabled”, “Run Analysis”, “Run Analysis & Design”
or “Run Analysis, Design & Detailing”.
CSiSAFE Process/Solver Options:
• Process Selection: Choose to execute inside of the existing SAFE application process or as a separate process.
• Solver Selection: Select the Solver to perform the analysis on the data file.
• Force 32bit Process: Force analysis to be calculated in 32 bit even when the computer is 64 bit.
CSiSAFE Report Option:
• Create Report: Create a report based on the current report settings in the model file.
CSiSAFE Export Options:
• File Export: Export the results to a Microsoft Access, Microsoft Excel, or text file.
• DB Named Set (required): The name of the database tables named set that defines the tables to be exported.
This parameter is required.
• DB Group Set (optional): The specified group sets the selection for the exported tables. This parameter is
optional. If it is not specified, the group ALL is assumed.
CSiSAFE Deletion Options:
• Temp File Deletion: Choose a deletion option to clean up the analysis/output files if required, such as “keep
everything”, “delete analysis & output files”, “delete analysis files only” or “delete output files only”.
CSiSAFE Additional Options:
• Include Data File: If enabled, the data file will be included in the output zip file, or in the output directory when
outputting to a directory path.
• Compress (ZIP) Output: Automatically compress the output to a single zip file.
9.20.2 Plug-in Configuration
You can configure the CSiSAFE plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the CSiSAFE plug-in from the list on the left.
Executables
• SAFE 12 Executable: The path to the SAFE 12 executable file used for simulating. Enter alternative paths on
separate lines.
• SAFE 2014 Executable: The path to the SAFE 2014 executable file used for simulating. Enter alternative paths
on separate lines.
9.20.3 FAQ
Is CSiSAFE supported by Deadline?
Yes.
9.20.4 Error Messages and Meanings
This is a collection of known CSiSAFE error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.21 CSiSAP2000
9.21.1 Job Submission
You can submit CSiSAP2000 jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The CSiSAP2000 specific options
are:
• CSi SAP2000 Data File(s): The CSi SAP2000 Data File to be processed. CSi SAP2000 Files (*.SDB), Microsoft Access Files (*.MDB), Microsoft Excel Files (*.XLS), CSi SAP2000 Text Files (*.$2K *.S2K) are
supported.
• Override Output Directory: If this option is enabled, all processed files will be redirected to the specified output
directory.
• Build To Force: You can force 32 or 64 bit processing with this option.
• Submit Data File With Job: If this option is enabled, the SAP2000 file will be submitted with the job, and then
copied locally to the slave machine during processing.
• Version: The version of CSi SAP2000 to process with.
CSiSAP2000 Process/Solver Options are:
• Process Selection: Choose to execute inside of the existing SAP2000 application process or as a separate
process.
• Solver Selection: Select the Solver to perform the analysis on the data file.
CSiSAP2000 Design Options are:
Four options are available to automatically perform design after the data file has been opened and analysis results are
available.
• Steel Frame Design: Perform steel frame design after the analysis has completed.
• Concrete Frame Design: Perform concrete frame design after the analysis has completed.
• Aluminium Frame Design: Perform aluminium frame design after the analysis has completed.
• Cold Formed Frame Design: Perform cold formed frame design after the analysis has completed.
CSiSAP2000 Deletion Options are:
• Temp File Deletion: Choose a deletion option to clean up the analysis/log/out files if required.
CSiSAP2000 Additional Options are:
• Include Data File: If enabled, the data file will be included in the output zip file, or in the output directory when
outputting to a directory path.
• Compress (ZIP) Output: Automatically compress the output to a single zip file.
• Command Line Args: Additional command line flags/options can be added here if required.
9.21.2 Plug-in Configuration
You can configure the CSiSAP2000 plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the CSiSAP2000 plug-in from the list on the left.
Executables
• SAP2000 14 Executable: The path to the SAP2000 14 executable file used for simulating. Enter alternative
paths on separate lines.
• SAP2000 15 Executable: The path to the SAP2000 15 executable file used for simulating. Enter alternative
paths on separate lines.
• SAP2000 16 Executable: The path to the SAP2000 16 executable file used for simulating. Enter alternative
paths on separate lines.
• SAP2000 17 Executable: The path to the SAP2000 17 executable file used for simulating. Enter alternative
paths on separate lines.
9.21.3 FAQ
Is CSiSAP2000 supported by Deadline?
Yes.
9.21.4 Error Messages and Meanings
This is a collection of known CSiSAP2000 error messages and their meanings, as well as possible solutions. We want
to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.22 DJV
9.22.1 Job Submission
You can submit DJV jobs from the Monitor. You can use the Submit menu, or you can right-click on a job and select
Scripts -> Submit DJV Quicktime Job To Deadline to automatically populate some fields in the DJV submitter based
on the job’s output.
Submission Options
The general submission options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. You can get more information about the DJV specific
options by hovering your mouse over the label for each setting. The Settings buttons can be used to quickly save and
load presets, or reset the settings back to their defaults.
9.22.2 Plug-in Configuration
You can configure the DJV plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the DJV plug-in from the list on the left.
DJV Executables
• DJV Executable: The path to the djv_convert executable file used for rendering. Enter alternative paths on
separate lines. Different executable paths can be configured for each version installed on your render nodes.
9.22.3 FAQ
Is DJV supported by Deadline?
Yes.
Can I create Apple Quicktime mov files with DJV?
Yes. On Windows, you must use the 32-bit version of DJV. The LibQuicktime-based codecs are only
available in DJV v1.0.1 or later, and only on Linux. As an alternative, you can use Thinkbox’s Draft
product (an image/movie creation automation toolkit), which is included in Deadline and is licensed
against your active Deadline support subscription. See Draft for more information.
Can I create EXR files compressed with DreamWorks Animations DWAA or DWAB compression?
Yes, but this is only supported in DJV v1.0.1 or later.
9.22.4 Error Messages and Meanings
This is a collection of known DJV error messages and their meanings, as well as possible solutions. We want to keep
this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
[ -auto_tag] and [ -tag Name Value] options not working in DJV plugin
DJV has a bug that causes it to crash, which currently prevents these two command line flag options from
working. The code has been commented out in the DJV plugin and can be re-enabled once the bug is
fixed by the DJV developer.
Various Command Line options failing in DJV
Many of the [djv_convert] command line flags are broken in DJV versions earlier than v1.0.1 due to
spaces being present between the flag options. This is resolved in DJV v1.0.1 and later, so it is
recommended to use at least this version (wrapping the flag options with additional quotation marks does
not resolve the issue, as it is a bug in the actual [djv_convert] command line argument parser).
9.23 Draft
9.23.1 Job Submission
There are many ways to submit Draft jobs to Deadline. As always, you can simply submit a Draft job from within the
Monitor from the Submit menu. In addition, we’ve also added a right-click job script to the Monitor, which will allow
you to submit a Draft job based on an existing job. This will pull over output information from the original job, and
fill in Draft parameters automatically where possible.
On top of the Monitor scripts, you can also get set up to submit Draft jobs directly from Shotgun. This will again
pull over as much information as possible, this time from the Shotgun database, in order to pre-fill several of the Draft
parameter fields. See the Integrated Submission Script Setup section below for more details on this.
We’ve also added a Draft section to all of our other submitters. Submitting a Draft job from any of these uses our
Draft Event Plug-in to submit a Draft job based on the job currently being submitted (this is similar in concept to the
right-click job script described above). The Draft job will get automatically created upon completion of the original
job.
9.23.2 Submission Options
The general Deadline options are available in the Draft submitters, and are explained in the Job Submission documentation. Draft-specific options are explained below. It should be noted, however, that given the nature of Draft scripts,
not all of these parameters will be used by all scripts. They can even feasibly be used for different purposes than listed
here.
• Draft Script: This is the Draft script (or Template) that you want to run.
• Input File: Indicates where the input file(s) for the Draft Script can be found. What kind of file this is will
depend entirely on the Draft Script itself. Passed to the Draft script as ‘inFile’.
• Output Folder: Indicates where the output file(s) of the Draft Script will be placed. Can be a relative path, in
which case it will be relative to the input. This is passed to the Draft script as ‘outFolder’.
• Output File Name: As above, the type of file this is will depend entirely on the selected Draft Script. Passed to
the Draft script as ‘outFile’.
• Frame List: The list of Frames that the Draft Script should work with. Passed to the Draft Script as ‘frameList’,
‘firstFrame’, and ‘lastFrame’.
• User: The name of the user that is submitting the job. Typically used by the Draft script for frame annotations.
Passed to the Draft script as ‘username’.
• Entity: The name of the entity being submitted. Typically used by the Draft script for frame annotations. Passed
to the Draft script as ‘entity’.
• Version: The version of the entity being submitted. Typically used by the Draft script for frame annotations.
Passed to the Draft script as ‘version’.
• Additional Args: Any additional command line arguments that you wish to pass to the Draft script should be
listed here. Appended to arguments listed above.
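To make the parameter hand-off concrete, here is a minimal sketch of how a Draft script might collect these key=value style arguments. The argument format and the sample values shown are assumptions for illustration, not Deadline's precise behaviour:

```python
def parse_draft_args(argv):
    """Collect key=value arguments (inFile, outFolder, outFile, frameList,
    username, entity, version, plus any additional args) into a dict.
    A minimal sketch; real Draft templates may handle these differently."""
    params = {}
    for arg in argv:
        key, sep, value = arg.partition("=")
        if sep:  # ignore anything that isn't key=value
            params[key] = value
    return params

# Hypothetical invocation values, mirroring the options listed above:
params = parse_draft_args(
    ["inFile=//server/shot/comp.exr",
     "outFolder=//server/shot/draft",
     "username=artist"])
print(params["username"])  # artist
```

A script would then pull 'inFile', 'outFolder', and so on from the dictionary, falling back to defaults for anything the submitter did not send.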
9.23.3 Plug-in Configuration
The Draft plug-in does not require any configuration.
9.23.4 Integrated Submission Script Setup
All of our integrated submission scripts have been updated to have a Draft section, in order to submit dependent Draft
jobs. In addition to this, we also have created scripts to allow you to submit a Draft job directly from Shotgun.
Shotgun Action Menu Item
The best way to install the Draft Submission menu item in Shotgun is to use the automated setup script included in the
Monitor. To access this, select Scripts -> Install Integration Submission Scripts from the Monitor’s menu. From there,
click the ‘Install’ button next to the Draft entry.
It should be noted that this functionality is currently only available on the Windows version, and requires administrator
privileges to run successfully. It should also be noted that while this script will create the ‘Submit Draft Job’ entry in
Shotgun for everyone to see, this must still be done on each machine that will be submitting Draft jobs.
9.24 Draft Tile Assembler
9.24.1 Job Submission
You can submit Draft tile assembler jobs from the Monitor. Normally, these jobs are submitted as dependent jobs for
your original tile jobs, but you can submit them manually if you wish.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Draft Tile Assembler specific
options are:
• Input Config File: The file that will control a majority of the assembly.
• Error on Missing File: If enabled, the job will error if any of the tiles in the config file are missing.
• Cleanup Tiles: If enabled, the job will delete all of the tile files after the assembly is complete.
• Build To Force: You can force 32 bit or 64 bit rendering.
Config File Setup
The config file is a plain text file that uses Key/Value pairs (key=value) to control the draft tile assembly.
• TileCount=<#>: The number of tiles that are going to be assembled
• DistanceAsPixels=<true/false>: Whether distances are provided in pixels (true) or in a 0.0-1.0 relative range
(false). Defaults to true.
• BackgroundSource=<BackgroundFile>: If provided, the assembler will attempt to assemble the new tiles
over the specified image.
• TilesCropped=<true/false>: If false, the assembler will crop the tiles before assembling them.
• ImageHeight=<#>: The height of the final image. This will be ignored if a background is provided. If this is
not provided and the tiles are not cropped then the first tile will be used to determine the final image size.
• ImageWidth=<#>: The width of the final image. This will be ignored if a background is provided. If this is
not provided and the tiles are not cropped then the first tile will be used to determine the final image size.
• Tile<#>Filename=<FileName>: The file name of the tile to be assembled. (Only used if ImageFolder is not
included, 0 indexed)
• Tile<#>X=<#>: The X coordinates for the tile that is to be assembled. 0 at the left side.
• Tile<#>Y=<#>: The Y coordinates for the tile that is to be assembled. 0 at the bottom.
• Tile<#>Width=<#>: The width of the tile that is to be cropped. (Only used if TilesCropped is false)
• Tile<#>Height=<#>: The height of the tile that is to be cropped. (Only used if TilesCropped is false)
• ImageFolder=<Folder>: The folder that you would like to assemble images from. (If included, the assembler
will assemble all tiles within the specified folder.)
• ImagePadding=<#>: The amount of padding on the file names within the folder. (Only used if ImageFolder is
included)
• ImageExtension=<ext>: The extension of the files to be assembled. (Only used if ImageFolder is included)
• Tile<#>Prefix=<Prefix>: The Prefix that the file must contain (Only used if ImageFolder is included)
Example Config Files
The first example config file will control a simple tile assembly.
#We are assembling 4 tiles into an image
TileCount=4
#The final image will have the following filename
ImageFileName=C:/ExampleConfig/outputFileName.png
#The final Image will have a resolution of 960x540
ImageWidth=960
ImageHeight=540
#The Images are already Cropped
TilesCropped=True
#What is the file that will be the first tile assembled
Tile0FileName=C:/ExampleConfig/_tile_1x1_2x2_sceneName.png
#Where should the first tile go
Tile0X=0
Tile0Y=0
#What is the file that will be the second tile assembled
Tile1FileName=C:/ExampleConfig/_tile_2x1_2x2_sceneName.png
#Where should the second tile go
Tile1X=480
Tile1Y=0
#What is the file that will be the third tile assembled
Tile2FileName=C:/ExampleConfig/_tile_1x2_2x2_sceneName.png
#Where should the third tile go
Tile2X=0
Tile2Y=270
#What is the file that will be the fourth tile assembled
Tile3FileName=C:/ExampleConfig/_tile_2x2_2x2_sceneName.png
#Where should the fourth tile go
Tile3X=480
Tile3Y=270
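A config like the first example above can also be generated programmatically. The following sketch writes the same style of key/value file for a grid of pre-cropped tiles; the function name and the grid-to-tile ordering are assumptions for illustration (in practice, Deadline's submitters generate these config files for you):

```python
def write_tile_config(path, image_name, width, height, cols, rows, tile_paths):
    """Write a Draft Tile Assembler config for pre-cropped tiles on a
    cols x rows grid. tile_paths is ordered left-to-right, bottom-to-top,
    matching the 0-indexed Tile<#> keys (X=0 at the left, Y=0 at the bottom)."""
    tile_w, tile_h = width // cols, height // rows
    lines = ["TileCount=%d" % len(tile_paths),
             "ImageFileName=%s" % image_name,
             "ImageWidth=%d" % width,
             "ImageHeight=%d" % height,
             "TilesCropped=True"]
    for i, tile in enumerate(tile_paths):
        col, row = i % cols, i // cols
        lines += ["Tile%dFileName=%s" % (i, tile),
                  "Tile%dX=%d" % (i, col * tile_w),
                  "Tile%dY=%d" % (i, row * tile_h)]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

Called with a 960x540 image, a 2x2 grid, and four tile file names, this reproduces the tile coordinates shown in the first example (0, 480 on the X axis; 0, 270 on the Y axis).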
The second example config file controls a folder render. It will assemble all files within the folder C:/ExampleConfig/
that have the extension exr and have the given prefixes. So if the folder contains the files region_0_test.exr, region_1_test.exr, region_2_test.exr, and region_3_test.exr, then this config will create the image test.exr:
#We are assembling 4 tiles into an image
TileCount=4
#In the config files we are using relative coordinates instead of pixel coordinates
DistanceAsPixels=0
#The tiles have not yet been cropped so the tile assembler has to crop each tile.
TilesCropped=false
#We are going to assemble all files within the specified folder.
ImageFolder=C:/ExampleConfig
#We are going to only assemble files with the following extension
ImageExtension=exr
#The first tile in each of the images will start with the following prefix
Tile0Prefix=region_0_
#Where should the tile go
Tile0X=0
Tile0Y=0
#Because we are cropping the tiles we need to give it a width and height to crop to
Tile0Width=0.5
Tile0Height=0.5
#The second tile in each of the images will start with the following prefix
Tile1Prefix=region_1_
#Where should the tile go
Tile1X=0.5
Tile1Y=0
#Because we are cropping the tiles we need to give it a width and height to crop to
Tile1Width=0.5
Tile1Height=0.5
Tile2Prefix=region_2_
Tile2X=0
Tile2Y=0.5
Tile2Width=0.5
Tile2Height=0.5
Tile3Prefix=region_3_
Tile3X=0.5
Tile3Y=0.5
Tile3Width=0.5
Tile3Height=0.5
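When DistanceAsPixels is false, as in this second example, each coordinate and size is a fraction of the final image dimensions. A small sketch of that conversion, assuming the 960x540 resolution used in the first example:

```python
def tile_pixel_rect(rel_x, rel_y, rel_w, rel_h, image_width, image_height):
    """Convert a relative (0.0-1.0) tile rectangle into pixel values.
    X is measured from the left and Y from the bottom, matching the
    config format described above."""
    return (int(rel_x * image_width), int(rel_y * image_height),
            int(rel_w * image_width), int(rel_h * image_height))

# Tile1 from the example above (X=0.5, Y=0, width/height 0.5):
print(tile_pixel_rect(0.5, 0.0, 0.5, 0.5, 960, 540))  # (480, 0, 480, 270)
```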
9.24.2 Plug-in Configuration
The Draft Tile Assembler plug-in does not require any configuration.
9.24.3 FAQ
There are no FAQ entries at this time.
9.24.4 Error Messages And Meanings
This is a collection of known Draft Tile Assembler error messages and their meanings, as well as possible solutions.
We want to keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please
email Deadline Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.25 EnergyPlus
9.25.1 Job Submission
You can submit EnergyPlus jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The EnergyPlus specific options
are:
• EnergyPlus IDF File(s): The EnergyPlus IDF file(s) to be processed.
• Weather EPW File(s): The Weather EPW File(s) to be referenced (Optional).
• Override Output Directory: If this option is enabled, all processed files will be redirected to the specified output
directory.
• Build To Force: You can force 32 or 64 bit processing with this option.
• Submit File(s) With The Job: If this option is enabled, the data file(s) will be submitted with the job, and then
copied locally to the slave machine during processing.
EnergyPlus Post-Process Options are:
• ../ReadVarsESO.exe Max.Columns: Limit the maximum number of columns used when calling readVarsESO.exe.
• Execute ../convertESOMTR.exe: Execute the convertESOMTR.exe application as a post-process.
• Execute ../CSVproc.exe: Execute the csvProc.exe application as a post-process.
EnergyPlus Processing Options are:
• Multithreading: If enabled, EnergyPlus simulations will use multithreading. Ignored if Concurrent Tasks > 1.
• Pause Mode (DEBUG only): Only for Debug purposes. Will PAUSE the program execution at key steps.
EnergyPlus Other Options are:
• Include Data File: If enabled, the data file will be included in the output zip file, or in the output directory when
outputting to a directory path.
• Compress (ZIP) Output: Automatically compress the EP output to a single zip file.
9.25.2 Plug-in Configuration
You can configure the EnergyPlus plug-in settings from the Monitor. While in super user mode, select Tools ->
Configure Plugins and select the EnergyPlus plug-in from the list on the left.
Executables
• EnergyPlus Executable: The path to the EnergyPlus executable file used for simulating. Enter alternative paths
on separate lines.
9.25.3 FAQ
Is EnergyPlus supported by Deadline?
Yes.
9.25.4 Error Messages and Meanings
This is a collection of known EnergyPlus error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.26 FFmpeg
9.26.1 Job Submission
You can submit FFmpeg jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The FFmpeg specific options are:
• Input File: The input file.
• Input Arguments: Additional command line arguments for the input file.
• Replace Frame in Input File(s) With Padding: If enabled, the frame number in the file name will be replaced
by frame padding before being passed to FFmpeg. This should be enabled if you are passing a sequence of
images as input.
• Output File: The output file.
• Output Arguments: Additional command line arguments for the output file.
• Additional Arguments: Additional general command line arguments.
• Additional Input Files: Specify up to 9 additional input files. You can give each file its own arguments, or
use the same arguments as the main input file.
• FFmpeg Preset Files: Specify preset files for video, audio, or subtitle.
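As an illustration of the padding replacement described above, the sketch below rewrites a trailing frame number into printf-style padding. The exact rule Deadline applies may differ, so treat this as a conceptual example only:

```python
import re

def replace_frame_with_padding(path):
    """Replace the trailing frame number in an image file name with
    printf-style padding, e.g. shot.0001.png -> shot.%04d.png, so FFmpeg
    treats the input as an image sequence."""
    match = re.search(r"(\d+)(\.[^.]+)$", path)
    if match is None:
        return path  # no frame number found; pass the path through
    frame, ext = match.groups()
    return path[:match.start()] + "%%0%dd" % len(frame) + ext

print(replace_frame_with_padding("shot.0001.png"))  # shot.%04d.png
```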
9.26.2 Plug-in Configuration
You can configure the FFmpeg plug-in settings from the Monitor. While in super user mode, select Tools -> Configure
Plugins and select the FFmpeg plug-in from the list on the left.
Render Executables
• FFmpeg Executable: The path to the FFmpeg executable file used for rendering. Enter alternative paths on
separate lines.
9.26.3 FAQ
Currently, there are no FAQs for this plug-in.
9.26.4 Error Messages and Meanings
This is a collection of known FFmpeg error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Currently, no error messages have been reported for this plug-in.
9.27 Fusion
9.27.1 Job Submission
You can submit jobs from within Fusion by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Fusion, select Script -> DeadlineFusionClient.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation.
• Fusion Comp: The flow/comp file to be rendered.
• Frame List: The list of frames to render.
• Frames Per Task: This is the number of frames that will be rendered at a time for each job task.
• Proxy: The proxy level to use (not supported in command line mode).
• Version: The version of Fusion to render with.
• Build: Force 32 or 64 bit rendering. Default is None.
• Use Frame List In Comp: Enable this option to pull the frame range from the comp file.
• Check Output: If checked, Deadline will check all savers to ensure they have saved their image file (not
supported in command line mode).
• High Quality: Whether or not to render with high quality (not supported in command line mode).
• Command Line Mode: Render using separate command line calls instead of keeping the scene loaded in
memory between tasks. Using this feature disables the High Quality, Proxy, and Check Saver Output options.
This uses the FusionCmd plug-in, instead of the Fusion one.
• Submit Comp File: If this option is enabled, the flow/comp file will be submitted with the job, and then copied
locally to the slave machine during rendering.
The in-app submitter provides the following additional submission options:
• Render First And Last Frames First: The first and last frame of the flow/comp will be rendered first, followed
by the remaining frames in the sequence. Note that the Frame List above is ignored if this box is checked (the
frame list is pulled from the flow/comp itself).
• Submit Comp File With Job: If this option is enabled, the flow/comp file will be submitted with the job, and
then copied locally to the slave machine during rendering.
• Check Saver Output: If checked, Deadline will check all savers to ensure they have saved their image file (not
supported in command line mode).
9.27.2 Plug-in Configuration
You can configure the Fusion and FusionCmd plug-in settings from the Monitor. While in super user mode, select
Tools -> Configure Plugins and select the Fusion plug-in from the list on the left.
Fusion
Fusion Options
• Fusion Render Executable: The path to the Fusion Render Slave executable used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
• Fusion Wait For Executable: If you use a proxy RenderSlave.exe, set this to the name of the renamed original.
For example, it might be set to RenderSlave_original.exe. Leave blank to disable this feature.
• Fusion Version To Enforce: Deadline will only render Fusion jobs on slaves running this version of Fusion.
Use a ; to separate alternative versions. Leave blank to disable this feature.
• Fusion Slave Preference File: The path to a global RenderSlave.prefs preference file that is copied over before
starting the Render Slave. Leave blank to disable this feature.
General Fusion Options
• Load Comp Timeout: Maximum time for Fusion to load a comp, in seconds.
• Script Connect Timeout: Amount of time allowed for Fusion to start up and accept a script connection, in
seconds.
FusionCmd
• Fusion Render Executable: The path to the Fusion Console Slave executable used for rendering. Enter alternative paths on separate lines. Different executable paths can be configured for each version installed on your
render nodes.
• Fusion Slave Preference File: The path to a global RenderSlave.prefs preference file that is copied over before
starting the Render Slave. Leave blank to disable this feature.
9.27.3 Integrated Submission Script Setup
The following procedures describe how to install the integrated Fusion submission script. This script allows for
submitting Fusion render jobs to Deadline directly from within the Fusion editing GUI.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/Fusion/Installers
Manual Installation of the Submission Script
• Copy [Repository]/submission/Fusion/Client/DeadlineFusionClient.eyeonscript to [Fusion Install Directory]/Scripts/Comp
• Restart Fusion to find the DeadlineFusionClient option in the Script menu.
Custom Sanity Check Setup
In the [Repository]/submission/Fusion/Main folder, you can create a file called CustomSanityChecks.eyeonscript.
This script will be called by the main Fusion submission script before submission, and can be used to perform sanity
checks. Within this script file, you must define this function, which is called by the main script:
function CustomDeadlineSanityChecks(comp)
    local message = ""
    ...
    return message
end
All your checks should be placed within this function. This function should return a message that contains the sanity
check warnings. If an empty message is returned, then it is assumed the sanity check was a success and no warning is
displayed to the user. Here is a simple example that checks if any CineFusion tools are being used in the comp file:
function CustomDeadlineSanityChecks(comp)
    local message = ""

    ----------------------------------------------------
    -- RULE: Check to make sure Cinefusion is disabled
    ----------------------------------------------------
    cinefusionAttrs = fusion:GetRegAttrs("CineFusion")
    if not (cinefusionAttrs == nil) then
        cinefusion_regID = cinefusionAttrs.REGS_ID
        local i = nil
        for i, v in comp:GetToolList() do
            if (v:GetID() == cinefusion_regID) then
                if (v:GetAttrs().TOOLB_PassThrough == false) then
                    message = message ..
                        "CineFusion '" ..
                        v:GetAttrs().TOOLS_Name ..
                        "' should be disabled\n"
                end
            end
        end
    end

    return message
end
9.27.4 FAQ
Which versions of Fusion are supported?
Fusion 5 and later are supported.
What’s the difference between the Fusion and FusionCmd plugins?
The Fusion plugin starts the Fusion Render Node in server mode and uses eyeonscript to communicate
with the Fusion renderer. Fusion and the comp remain loaded in memory between tasks to reduce overhead. This is usually the preferred way of rendering with Fusion.
The FusionCmd plugin renders with Fusion by executing command lines, and can be used by selecting
the Command Line mode option in the Fusion submitter. Because Fusion needs to be launched for each
task, there is some additional overhead when using this plugin. In addition, the Proxy, High Quality, and
Saver Output Checking features are not supported in this mode. However, this mode tends to print out
better debugging information when there are problems (especially when Fusion complains that it can’t
load the comp), so we recommend using it to help figure out problems that may be occurring when using
the Fusion plugin.
Can I use both workstation and render node licenses to render jobs in Deadline?
You can use workstation licenses to render; you just need to do a little tweaking to get this to work nicely.
In the Plugin Configuration settings, you need to specify two paths for the render executable option. The
first path will be the render node path, and the second will be the actual Fusion executable path. You then
have to make sure that the render node is not installed on your workstations. Because you have specified
two paths, Deadline will only use the second path if the first one doesn’t exist, which is why the render
nodes can’t be installed on your workstations.
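The fallback behaviour described above can be sketched in Python. This is a minimal illustration of the "first existing path wins" logic; the function name and the example paths are invented for illustration and are not part of Deadline's API:

```python
import os

def pick_render_executable(candidate_paths):
    """Return the first path in the list that exists on this machine.

    Mirrors how Deadline walks the configured executable paths in order
    and uses the first one it finds.
    """
    for path in candidate_paths:
        if os.path.isfile(path):
            return path
    return None  # no configured executable exists on this machine

# Example: render node path first, workstation Fusion path second.
# On a workstation where the Render Node is NOT installed, the
# second (workstation) path is the one that gets used.
candidates = [
    r"C:\Program Files\eyeon\Fusion Render Node\RenderSlave.exe",
    r"C:\Program Files\eyeon\Fusion\Fusion.exe",
]
executable = pick_render_executable(candidates)
```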
Why is it not possible to have 2 instances of Fusion running?
With Fusion there is only one TCP/IP port to which eyeonscript (the scripting language used to run Fusion
renders on a slave computer) can connect. If Fusion is open on a slave computer then the port will be in
use and the Fusion Render Node will have to wait for the port to become available before rendering of
Fusion jobs on that slave can begin.
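As a quick diagnostic, you can check whether something is already holding a TCP port with a sketch like the following. The port number shown is a placeholder for illustration only, not the actual port eyeonscript connects to:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on the given TCP port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

# SCRIPT_PORT is a hypothetical value -- substitute the port your
# Fusion installation actually listens on for script connections.
SCRIPT_PORT = 1144
if port_in_use(SCRIPT_PORT):
    print("Fusion (or another process) is holding the script port")
```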
Fusion alone renders fine, but with Deadline, the slaves are failing on the last frame.
This is usually accompanied by this error message:
INFO: Checking file \\path\to\filename####.ext
INFO: Saver "SaverName" did not produce output file.
INFO: Expected file "\\path\to\filename####.ext" to exist.
The issue likely has to do with the processing of fields as opposed to full frames. When processing your
output as fields, the frames are rendered in two halves (for example, frame 1 would be rendered as 1.0 and
1.5). This error often occurs when the Global Timeline is not set to include the second half of the final
frame. Simply adding a .5 to the Global End Time should resolve this issue.
For example, let us assume that you are processing fields and your output range is 0 - 100. If the Global
Timeline is set to be 0.0 - 100.0, Fusion will render everything, but Deadline will fail on the last frame. If
the Global Timeline is set to be 0.0 - 100.5, Deadline will render everything just fine.
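The half-frame arithmetic above can be illustrated with a small sketch; the function name is invented for illustration:

```python
def field_render_times(start, end):
    """List the sub-frame times processed when rendering fields:
    each whole frame n is rendered as two halves, n.0 and n.5."""
    times = []
    frame = start
    while frame <= end:
        times.append(float(frame))
        times.append(frame + 0.5)
        frame += 1
    return times

# An output range of 0-100 rendered as fields needs times up to 100.5,
# so a Global Timeline ending at 100.0 cuts off the last half-frame.
times = field_render_times(0, 100)
last_needed = times[-1]   # 100.5
```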
Is there a way to increase Deadline’s efficiency when rendering Fusion frames that only take a few seconds to
complete?
Rendering these frames in groups (groups of 5 for example) tends to reduce the job’s overall rendering
time. The group size can be set in the Fusion submission dialog using the Task Group Size option.
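The effect of the Task Group Size option can be sketched as follows; the helper name is illustrative, not part of Deadline:

```python
def chunk_frames(first, last, group_size):
    """Split a frame range into task groups of at most group_size
    frames, the way a larger Task Group Size batches quick frames
    together into fewer, longer tasks."""
    tasks = []
    start = first
    while start <= last:
        end = min(start + group_size - 1, last)
        tasks.append((start, end))
        start = end + 1
    return tasks

# Frames 1-12 in groups of 5 -> three tasks instead of twelve,
# so the per-task startup overhead is paid three times, not twelve.
tasks = chunk_frames(1, 12, 5)   # [(1, 5), (6, 10), (11, 12)]
```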
Does Fusion cache data between frames on the network, in the same way it does when rendering sequences
locally?
Deadline renders each block of frames using the eyeonscript comp.render function. The Fusion Render
Node is kept running between each block rendered, so when Fusion caches static results, they can be used
by the next block of frames to be rendered on the same machine.
Fusion seems to be taking a long time to start up when rendering. What can I do to fix this?
If you are running Fusion off a remote share, this can occur when there is a large number of files in the
Autosave folder. If this is the case, deleting the files in the Autosave folder should fix the problem.
Can I use relative paths in my Fusion comp when rendering with Deadline?
If your comp is on a network location, and everything is relative to that network path, you can use relative
paths if you choose the option to not submit the comp file with the job. In this case, the slaves will load the
comp directly over the network, and there shouldn’t be any problems with the relative paths. Just make
sure that your render nodes resolve the paths the same way your workstation does.
9.27.5 Error Messages and Meanings
This is a collection of known Fusion error messages and their meanings, as well as possible solutions. We want to
keep this list as up to date as possible, so if you run into an error message that isn’t listed here, please email Deadline
Support and let us know.
Exception during render: Failed to load the comp “[flowname].comp” in startjob.eyeonscript
This error usually occurs because the render node is missing a plug-in that is referenced by the flow
in question. Often this is because there is a plug-in installed on the machine from which the job was
submitted that is not in the Fusion Render Node plug-in directory on the slave machine. It is important
to remember that the Fusion Render Node has a different plug-in store than Fusion – even on the same
machine – thus one should ensure that the needed plug-ins are copied/installed in both locations.
Exception during render: The Fusion renderer reported that the render failed. Scroll down to the bottom of the
log below for more details.
This can occur for a number of reasons, but often Fusion will print out the cause for the error. In the
error log window, scroll to the end of the Slave Log capture which is near the bottom of the error message
window, and there will be a part which looks something like the following message. This particular
message indicates that a font was missing on the machine.
INFO: Render started at Wed 8:17PM (Range: 198 to 198)
INFO:
INFO: Comments: Could not find font “SwitzerlandCondensed”
INFO:
INFO: Saver 1 failed at time 198
INFO:
INFO: Render failed at Wed 8:18PM! Last frame rendered: (none)!
INFO:
INFO: Render failed
We’ve usually found that the problem behind this error was a plug-in that was installed for Fusion, but
not for the Fusion Render Node. Try updating your Fusion Render Node plug-ins to match your Fusion
plug-ins exactly, and check whether the error still occurs.
Exception during render: Eyeonscript failed to make a connection in startjob.eyeonscript - check that Eyeonscript is set to no login required
In order to connect to the Fusion Render Node and communicate with it, Deadline uses eyeonscript,
the scripting language provided for Fusion. The script connects to the Fusion Render Node via a socket
connection, which by default requires a login username and password to connect.
In order for Deadline to be able to render using a given Fusion Render Node, you must change its settings
so that it no longer requires the username and password. This is done by running the Fusion Render Node,
right clicking on the icon it creates, and choosing preferences. From there, pick the Script option, and you
will see radio buttons, one of which says ‘No login required’. Make sure that option is selected,
then click Save to save the preferences, and exit the Fusion Render Node.
9.28 Fusion Quicktime
9.28.1 Job Submission
You can submit Fusion Quicktime jobs from the Monitor.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation.
Fusion Options
• Fusion Version: Select the version of Fusion to generate the Quicktime with.
• Build: Force 32 or 64 bit rendering.
• Load/Save Preset: Allows you to save your Fusion Quicktime options to a preset file, so that you can load them
again later.
Input/Output Options
• Input Images: The frames you would like to generate the Quicktime from. If a sequence of frames exists in
the same folder, Deadline will automatically collect the range of the frames and set the Frame Range field
accordingly.
• Frames: The frame range used to generate the Quicktime.
• Frame Rate: The frame rate of the Quicktime.
• Override Start: Allows the starting frame in the Quicktime to be overridden. For example, if you are making
a Quicktime from images with a range of 101-150, you can override the start frame to be 1, and the range in the
Quicktime will appear as 1-50.
• Output Movie File: The name of the Quicktime to be generated.
• Codec: The codec format to use for the Quicktime.
• On Missing Frames: What the generator will do when a frame is missing or is unable to load. There are 4
options:
– Fail: Nothing will be generated until the missing frame becomes available.
– Hold Previous: The last valid frame will be included instead of the missing frame.
– Output Black: A black frame will be included instead of the missing frame.
– Wait: The generator will wait until the missing frame becomes available.
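The Override Start remapping described in the options above boils down to simple arithmetic; this sketch uses invented names for illustration:

```python
def remap_frame(source_frame, source_start, override_start):
    """Shift a source frame number so the Quicktime timeline begins
    at override_start instead of source_start."""
    return source_frame - source_start + override_start

# Images 101-150 with the start overridden to 1 appear as 1-50.
remap_frame(101, 101, 1)   # -> 1
remap_frame(150, 101, 1)   # -> 50
```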
Quicktime Options
• BG Plate: Specify an optional background plate. The Quicktime will render using the selected file as the background.
• Template: Specify an optional comp template. See the Template documentation below for more information.
• Artist Name: If you have a text tool with “artist” in its name in the selected template comp, its text will be set
to the name that is specified.
• Curve Correction: Select to turn on the color curves tool (available when using templates only).
• Quality %: The quality of the Quicktime.
• Proxy: The ratio of pixels to render (for example, if set to 4, one out of every four pixels will be rendered).
• Gamma: The gamma level of the Quicktime.
• Exposure Compensation: The “stops” value used to calculate the gain parameter of the Brightness/Contrast
tool. The gain parameter is calculated by using the value pow(2,stops).
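The gain calculation stated above can be checked with a one-line sketch:

```python
def gain_from_stops(stops):
    """Gain applied by the Brightness/Contrast tool for a given
    exposure compensation in stops: gain = pow(2, stops)."""
    return 2.0 ** stops

gain_from_stops(0)    # -> 1.0 (no change)
gain_from_stops(1)    # -> 2.0 (one stop brighter doubles the gain)
gain_from_stops(-1)   # -> 0.5 (one stop darker halves it)
```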
9.28.2 Quicktime Templates
A comp template can be specified to put all the messages and watermarks that you want into the Quicktime. It has
some standardized comp naming conventions so that the renderer can set some standard text tool values, as well as the
input and output images. Here is an example of a very simple template file.
As you can see, this simple template consists of a loader, a saver, a text tool, and a merge tool. This template simply
merges the text tool with the loader so that “This is a test” appears in your Quicktime. You can create your own
template files, but they must meet the following requirements. As long as these requirements are met, you can add
whatever you like between the loader and the saver.
• There must be exactly one loader and one saver.
• The loader must have a dummy file name specified (the file doesn’t have to exist).
9.28.3 Plug-in Configuration
The Fusion Quicktime submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for information on
configuring the Fusion plug-in.
9.28.4 FAQ
Which versions of Fusion are supported?
Fusion 5 and later are supported.
How is this different from submitting regular Quicktime jobs?
Regular Quicktime jobs are more generic, and provide more general Quicktime options. Fusion Quicktime
jobs are more customizable (e.g. using templates), but require Fusion to render.
9.28.5 Error Messages and Meanings
The Fusion Quicktime submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for Fusion error
messages and meanings.
9.29 Generation
9.29.1 Job Submission
You can submit comp jobs to Fusion from within Generation by installing the integrated submission script. The
instructions for installing the integrated submission script can be found further down this page.
In Generation, select the comp(s) you want to submit, and then right-click and select Submit.
This will bring up the submission window. Note that the submission window is only shown once, and all jobs that are
submitted will use the same job settings.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Fusion options are:
• Use Frame List In Comp: Uses the frame list defined in the comp files instead of the Frame List setting. If you
are submitting more than one comp from Generation, you should leave this option enabled unless you want the
Frame List setting to be used for each comp.
• Proxy: The proxy level to use.
• High Quality Mode: Whether or not to render with high quality.
• Check Output: If checked, Deadline will check all savers to ensure they have saved their image file.
• Version: The version of Fusion to render with.
• Build: Force 32 or 64 bit rendering.
• Command Line Mode: Render using separate command line calls instead of keeping the scene loaded in
memory between tasks. Using this feature disables the High Quality, Proxy, and Check Saver Output options.
This uses the FusionCmd plug-in, instead of the Fusion one.
9.29.2 Plug-in Configuration
The Generation submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for information on configuring the Fusion plug-in.
9.29.3 Integrated Submission Script Setup
The following procedures describe how to install the integrated Generation submission script. This script allows for
submitting Generation comp jobs to Deadline directly from within the Generation editing GUI.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/Generation/Installers
Manual Installation of the Submission Script
• Copy [Repository]\submission\Generation\Client\DeadlineGenerationClient.lua to the Generation scripts folder
([Generation Install Folder]\scripts\generation).
• In the Generation program data folder (%PROGRAMDATA%\eyeon\Generation), you’ll need to edit your Generation.cfg file. If you do not currently have a Generation.cfg file, create an empty one. Open Generation.cfg and add this line:
SCRIPT_FARMSUBMIT="scripts\generation\DeadlineGenerationClient.lua"
• Save the file. The next time you start up Generation, this script will be used when you select the Submit option
for the selected comps.
9.29.4 FAQ
Which versions of Generation are supported?
Generation 2 and later are supported.
9.29.5 Error Messages and Meanings
The Generation submitter submits jobs to the Fusion plug-in. See the Fusion Plug-in Guide for Fusion error messages
and meanings.
9.30 Hiero
9.30.1 Job Submission
You can submit transcoding jobs to Nuke from within Hiero by installing the integrated submission script. The
instructions for installing the integrated submission script can be found further down this page.
To submit from within Hiero, open the Export window from the File menu, or by right-clicking on a sequence. Then
choose the Submit To Deadline option in the Render Background Tasks drop down and press Export.
This will bring up the submission window. Note that the submission window is only shown once, and all jobs that are
submitted will use the same job settings.
Submission Options
The general Deadline options are explained in the Job Submission documentation. The Nuke specific options are:
• Render With NukeX: Enable this option if you want to render with NukeX instead of Nuke.
• Render Threads: The number of threads to use for rendering.
• Continue On Error: If enabled, Nuke will attempt to keep rendering if an error occurs.
• Maximum RAM Usage: The maximum RAM usage (in MB) to be used for rendering.
• Use Batch Mode: If enabled, Deadline will keep the Nuke file loaded in memory between tasks.
• Build To Force: Force 32 or 64 bit rendering.
9.30.2 Cross-Platform Rendering Considerations
The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for cross-platform rendering
considerations.
9.30.3 Plug-in Configuration
The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for information on configuring the
Nuke plug-in.
9.30.4 Integrated Submission Script Setup
The following procedures describe how to install the integrated Hiero submission script. This script allows for submitting Hiero transcoding jobs to Deadline directly from within the Hiero editing GUI. These jobs are then rendered
using the Nuke plugin.
Submitter Installer
• Run the Submitter Installer located at <Repository>/submission/Hiero/Installers
Manual Installation of the Submission Script
• Go to your .hiero user folder (~/.hiero or %USERPROFILE%\.hiero) and create a folder called “Python” if it
doesn’t exist.
• Open the “Python” folder and create another folder called “Startup” if it doesn’t exist.
• Copy [Repository]\submission\Hiero\Client\DeadlineHieroClient.py to the “Startup” folder (~/.hiero/Python/Startup or %USERPROFILE%\.hiero\Python\Startup).
The next time you launch Hiero, there should be a Submit To Deadline option in the Hiero Export window, in the
Render Background Tasks drop down.
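The manual copy steps above could be scripted roughly as follows; the function and its arguments are illustrative, and you would substitute your actual repository and ~/.hiero paths:

```python
import os
import shutil

def install_hiero_submitter(repository, hiero_user_dir):
    """Copy DeadlineHieroClient.py from the repository into the Hiero
    user Startup folder, creating Python/Startup if needed."""
    src = os.path.join(repository, "submission", "Hiero", "Client",
                       "DeadlineHieroClient.py")
    dst_dir = os.path.join(hiero_user_dir, "Python", "Startup")
    os.makedirs(dst_dir, exist_ok=True)
    shutil.copy(src, dst_dir)
    return os.path.join(dst_dir, "DeadlineHieroClient.py")

# Usage (paths are examples):
#   install_hiero_submitter("/mnt/DeadlineRepository",
#                           os.path.expanduser("~/.hiero"))
```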
9.30.5 FAQ
The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for additional FAQs related to
Nuke.
Which versions of Hiero are supported?
Hiero 1.0 and later are supported.
How does the Deadline submission script for Hiero work?
The submission script submits transcoding jobs from Hiero to Deadline, which are rendered with the Nuke
plugin.
9.30.6 Error Messages and Meanings
The Hiero submitter submits jobs to the Nuke plug-in. See the Nuke Plug-in Guide for Nuke error messages and
meanings.
9.31 Houdini
9.31.1 Job Submission
You can submit jobs from within Houdini by installing the integrated submission script, or you can submit them from
the Monitor. The instructions for installing the integrated submission script can be found further down this page.
To submit from within Houdini, select ‘Render’ -> ‘Submit To Deadline’.
Submission Options
The general Deadline options are explained in the Job Submission documentation, and the Draft/Integration options
are explained in the Draft and Integration documentation. The Houdini specific options are:
• ROP To Render:
– Choose: Allows you to choose your ROP from the drop-down to the right.
– Selected: Allows you to render each ROP that you currently have selected in Houdini (in the order that
you selected them).
– All: Allows you to render every ROP in the Houdini file.
• Ignore Inputs: If enabled, only the selected ROP will be rendered. No dependencies will be rendered.
• Build to Force: Force 32 or 64 bit rendering.
• Submit Wedges as Separate Jobs: If enabled, each Wedge in a Wedge ROP will be submitted as a separate job
with the current Wedge settings. This option is only enabled if the selected ROP is a Wedge ROP, or if all ROPs
are being rendered and at least one of them is a Wedge ROP.
Tile Rendering Options
Enable Tile Rendering to split up a single