Pentaho Data Integration
Abort Job
Add filenames to result
Amazon EMR Job Executor
Amazon Hive Job Executor
Build Model (Job Entry)
BulkLoad from Mysql to file
BulkLoad into MSSQL
Bulkload into MySQL
Check Db connections
Check Files Locked
Check if a folder is empty
Check if connected to Repository
Check if XML file is well formed
Check webservice availability
Checks if files exist
Columns exist in a table
Compare folders
Convert file between Windows and Unix
Copy Files
Create a file
Create a folder
Delete a file
Delete filenames from result
Delete Files
Delete folders
Display Msgbox info
DTD Validator (Job Entry)
Dummy Job Entry
Evaluate files metrics
Example Plugin
Export repository to XML file
File compare
File Exists (Job Entry)
FTP Delete
Get a file with FTP
Get a file with FTPS
Get a file with SFTP
Get Mails from POP
Hadoop Copy Files
Hadoop Job Executor
HL7 MLLP Acknowledge
HL7 MLLP Input
HTTP
JavaScript (job entry)
Job (Job Entry)
Mail
Move files
MS Access Bulk Load (Deprecated)
Oozie Job Executor
Palo Cube Create (Deprecated)
Palo Cube Delete (Deprecated)
Pentaho MapReduce
Ping a host
Process result filenames
Publish Model (Job Entry)
Put a file with FTP
Put a file with SFTP
Send information using Syslog
Send Nagios passive check
Send SNMP trap
Set variables (job entry)
Shell
SQL
Sqoop Export
Sqoop Import
SSH2 Get
SSH2 Put
Start
Start a PDI Cluster on YARN
Stop a PDI Cluster on YARN
Success
Table Exists (Job Entry)
Talend Job Execution (Deprecated)
Telnet
Transformation (job entry)
Truncate tables
Unzip file
Upload files to FTPS
Verify file signature with PGP
Wait for
Wait for a file
Wait for SQL
Write to File
Write to log