
The definition of a PDI transformation is represented by a TransMeta object. Create this object using the default constructor. The transformation definition includes the name, the declared parameters, and the required database connections.

Populate the TransMeta Object with Transformation Steps

The data flow of a transformation is defined by steps that are connected by hops. Perform the following tasks to populate the object with a transformation step: create the step by instantiating its meta class directly, and configure it by using its get and set methods. Transformation steps reside in sub-packages of org.pentaho.di.trans.steps. For example, to use the Get File Names step, create an instance of its step meta class and use its get and set methods to configure it.
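The tasks above, creating a TransMeta object and populating it with steps connected by a hop, can be sketched as follows. This is a minimal illustration, not a complete transformation: it assumes the classic (non-OSGi) org.pentaho.di API, uses GetFileNamesMeta and DummyTransMeta as example step meta classes, and the step names and transformation name are arbitrary; verify the exact accessors against the Javadoc of your PDI version.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.plugins.PluginRegistry;
import org.pentaho.di.core.plugins.StepPluginType;
import org.pentaho.di.trans.TransHopMeta;
import org.pentaho.di.trans.TransMeta;
import org.pentaho.di.trans.step.StepMeta;
import org.pentaho.di.trans.steps.dummytrans.DummyTransMeta;
import org.pentaho.di.trans.steps.getfilenames.GetFileNamesMeta;

public class BuildTransformation {
  public static void main( String[] args ) throws Exception {
    KettleEnvironment.init();              // always the first PDI API call

    TransMeta transMeta = new TransMeta(); // default constructor
    transMeta.setName( "file-listing" );   // arbitrary example name

    // Instantiate a step meta class directly, then configure it with its
    // get and set methods (setDefault() gives sensible starting values).
    GetFileNamesMeta getFiles = new GetFileNamesMeta();
    getFiles.setDefault();

    // Wrap the meta object in a StepMeta, using the registered plugin id.
    PluginRegistry registry = PluginRegistry.getInstance();
    StepMeta input = new StepMeta(
        registry.getPluginId( StepPluginType.class, getFiles ),
        "Get File Names", getFiles );
    transMeta.addStep( input );

    // A second step, connected by a hop, defines the data flow.
    DummyTransMeta dummy = new DummyTransMeta();
    StepMeta output = new StepMeta(
        registry.getPluginId( StepPluginType.class, dummy ),
        "Dummy", dummy );
    transMeta.addStep( output );
    transMeta.addTransHop( new TransHopMeta( input, output ) );
  }
}
```

Because the step meta object is wrapped in a StepMeta before being added, the same pattern works for any step type: only the meta class and its setters change.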
Make the kettle plugins (non-OSGi) available to your application. With a standard install, the kettle engine looks for plugins in either the plugins directory or /.kettle/plugins. You can use either of the following methods to make the default kettle plugins available:

- Copy the pentaho/design-tools/data-integration/plugins directory into the /plugins directory of your application.
- Set the KETTLE_PLUGIN_BASE_FOLDERS system property to point to the PDI pentaho/design-tools/data-integration directory, either through the command-line option ( -DKETTLE_PLUGIN_BASE_FOLDERS=/data-integration ) or directly in your code ( for example, System.setProperty( "KETTLE_PLUGIN_BASE_FOLDERS", "/data-integration" ) ).

Once the plugin location(s) are properly configured, you can add custom plugins to your specific locations. You can also add custom plugins in other locations, as long as they are registered with the appropriate implementation of PluginTypeInterface prior to initializing the kettle environment, as shown in the following code example:

StepPluginType.getInstance().getPluginFolders().add( new PluginFolder( "", false, true ) )

The sample class for running transformations is an example of how to run a PDI transformation from Java code in a stand-alone application. This class sets parameters and executes the sample transformations in the pentaho/design-tools/data-integration/etl directory. You can run a transformation from a .ktr file using runTransformationFromFileSystem() or from a PDI repository using runTransformationFromRepository().

Consider the following general steps while trying to run an embedded transformation:

1. Always make the first call to KettleEnvironment.init() whenever you are working with the PDI APIs.
2. The definition of a PDI transformation is represented by a TransMeta object. You can load this object from a .ktr file, from a PDI repository, or generate it dynamically. To query the declared parameters of the transformation definition, use listParameters(); to assign parameter values, use setParameterValue().
3. An executable Trans object is derived from the TransMeta object that is passed to its constructor. The Trans object starts, then executes asynchronously. To ensure that all steps of the Trans object have completed, call waitUntilFinished().
4. After the Trans object completes, you can access the result using getResult(). The Result object can be queried for success by evaluating getNrErrors(): this method returns zero ( 0 ) on success and a non-zero value when there are errors. To get more information, retrieve the transformation log lines.
5. When the transformations have completed, call KettleEnvironment.shutdown() to ensure the proper shutdown of all kettle listeners.

The sample class for running jobs is an example of how to run a PDI job from Java code in a stand-alone application.
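The general transformation-running steps described above can be sketched in one short program. This is a hedged sketch rather than a definitive implementation: it assumes the classic org.pentaho.di API, a hypothetical transformation file etl/sample.ktr, and a hypothetical parameter name; substitute your own file path and parameter names.

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
  public static void main( String[] args ) throws Exception {
    // 1. Initialize the kettle environment before any other PDI API call.
    KettleEnvironment.init();

    // 2. Load the transformation definition from a .ktr file.
    TransMeta transMeta = new TransMeta( "etl/sample.ktr" ); // hypothetical path
    for ( String param : transMeta.listParameters() ) {      // query declared parameters
      System.out.println( "declared parameter: " + param );
    }
    transMeta.setParameterValue( "param.input", "value" );   // hypothetical parameter

    // 3. Derive an executable Trans object; execution is asynchronous.
    Trans trans = new Trans( transMeta );
    trans.execute( null );          // null: no command-line arguments
    trans.waitUntilFinished();      // block until all steps have completed

    // 4. Evaluate the outcome: getNrErrors() is zero on success.
    Result result = trans.getResult();
    if ( result.getNrErrors() != 0 ) {
      System.err.println( "Transformation finished with errors; check the log lines." );
    }

    // 5. Shut down all kettle listeners once the work is done.
    KettleEnvironment.shutdown();
  }
}
```

Because execute() returns immediately, omitting waitUntilFinished() would let the program reach getResult() before the steps have finished; the explicit wait is what makes the result evaluation in step 4 reliable.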
