Logging in Kettle 4.1 and 5.4 embedded (API) development

log4j version: log4j-1.2.17.jar

Kettle 4.1 logging

Problem:
kettle-engine.jar bundles a log4j.xml file, and log4j loads an XML configuration file by default;
the bundled log4j.xml only configures console output; no file output is provided;
adding a log4j.properties file to the classpath has no effect either, because log4j picks up the bundled log4j.xml first.

Solution:
1) Extract log4j.xml from the kettle-engine jar and add an appender that writes the log to a file;
2) attach that appender in code:

KettleEnvironment.init();
DBCache.getInstance();
// Attach the "etl" file appender defined in log4j.xml to Kettle's LogWriter
LogWriter.getInstance().addAppender(Logger.getRootLogger().getAppender("etl"));
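
If editing the bundled log4j.xml is inconvenient, roughly the same effect can be had by building the file appender in code. The sketch below uses only the standard log4j 1.2 API plus the LogWriter call shown above; the class name, file path and pattern are illustrative.

import org.apache.log4j.DailyRollingFileAppender;
import org.apache.log4j.Level;
import org.apache.log4j.PatternLayout;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.logging.LogWriter;

public class Kettle41FileLogging {
	public static void main(String[] args) throws Exception {
		KettleEnvironment.init();
		// Build a daily rolling file appender equivalent to the "etl" appender in log4j.xml
		PatternLayout layout = new PatternLayout("%-5p %d{MM-dd HH:mm:ss} - %m%n");
		DailyRollingFileAppender etl =
				new DailyRollingFileAppender(layout, "logs/etl.log", "'.'yyyy-MM-dd");
		etl.setThreshold(Level.INFO);
		etl.setEncoding("UTF-8");
		// Attach the appender to Kettle's LogWriter so engine output reaches the file
		LogWriter.getInstance().addAppender(etl);
	}
}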

Kettle 5.4 logging

Problem:
Kettle 5.4 no longer bundles a log4j configuration file, and configuring log4j.properties alone still produces no log output, because the 5.x engine routes its log lines through its own LoggingBuffer (KettleLogStore) rather than writing to log4j directly;

Solution:
1) configure file output in log4j.properties;
2) register a log4j listener on Kettle's logging buffer, as in the following code:

private static Log4jLogging etlLog = null;
...
KettleEnvironment.init(false);
// Kettle 5.x collects its log lines in a central LoggingBuffer; register a
// Log4jLogging listener so those lines are forwarded to log4j
LoggingBuffer loggingBuffer = KettleLogStore.getAppender();
if (etlLog == null) {
	etlLog = new Log4jLogging();
	loggingBuffer.addLoggingEventListener(etlLog);
}
DBCache.getInstance();
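
The Log4jLogging class used above appears to come from Kettle's log4j plugin (kettle5-log4j-plugin); if that plugin is not on the classpath, a small hand-written listener can forward Kettle's log lines to log4j instead. This is only a sketch, assuming the KettleLoggingEventListener interface and KettleLoggingEvent accessors of the 5.x API; the class and logger names are illustrative.

import org.apache.log4j.Logger;
import org.pentaho.di.core.logging.KettleLogStore;
import org.pentaho.di.core.logging.KettleLoggingEvent;
import org.pentaho.di.core.logging.KettleLoggingEventListener;

public class Log4jForwardingListener implements KettleLoggingEventListener {
	private static final Logger LOG = Logger.getLogger("org.pentaho.di");

	@Override
	public void eventAdded(KettleLoggingEvent event) {
		// Forward each Kettle logging event to the log4j logger configured in log4j.properties
		LOG.info(event.getMessage());
	}
}

Registration, in place of the Log4jLogging lines above:

KettleLogStore.getAppender().addLoggingEventListener(new Log4jForwardingListener());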

Complete code example

Kettle 4.1 log4j configuration file: log4j.xml

<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/"
	debug="false">

	<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
		<param name="Target" value="System.out" />
		<param name="Threshold" value="INFO" />
		<param name="encoding" value="utf-8" />
		<layout class="org.apache.log4j.PatternLayout">
			<param name="ConversionPattern" value="%-5p %d{MM-dd HH:mm:ss} - %m%n" />
		</layout>
	</appender>

	<!-- file appender for the ETL log -->
	<appender name="etl" class="org.apache.log4j.DailyRollingFileAppender">
		<param name="File" value="logs/etl.log" />
		<param name="Threshold" value="INFO" />
		<param name="encoding" value="utf-8" />
		<layout class="org.apache.log4j.PatternLayout">
			<param name="ConversionPattern" value="%-5p %d{MM-dd HH:mm:ss} - %m%n" />			
		</layout>
	</appender>
	
	<category name="org.pentaho">
		<priority value="DEBUG"/>
		
	</category>

	<category name="com.healthmarketscience.jackcess">
		<priority value="WARN" />
	</category>

	<category name="org.apache.commons.httpclient">
		<priority value="WARN" />
	</category>

	<category name="org.mortbay">
		<priority value="ERR" />
	</category>

	<category name="java.net">
		<priority value="NONE" />
	</category>

	<category
		name="org.apache.commons.logging.simplelog.log.org.apache.commons.httpclient">
		<priority value="WARN" />
	</category>

	<category
		name="org.apache.commons.logging.simplelog.log.org.apache.commons.httpclient.auth">
		<priority value="WARN" />
	</category>
	
	<root>
		<priority value="info" />
		<appender-ref ref="CONSOLE" />
		<appender-ref ref="etl" />
	</root>

</log4j:configuration>

Kettle 5.4 log4j configuration file: log4j.properties

# Threshold OFF FATAL ERROR WARN INFO DEBUG ALL
log4j.rootLogger=DEBUG,CONSOLE,etl
log4j.logger.org.pentaho.di=info,etl

log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.Threshold=info
log4j.appender.CONSOLE.Target=System.out
log4j.appender.CONSOLE.Encoding=UTF-8
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=%-5p %d{MM-dd HH:mm:ss} - %m%n

log4j.appender.etl=org.apache.log4j.DailyRollingFileAppender
log4j.appender.etl.File=logs/etl-core.log
log4j.appender.etl.Encoding=utf-8
log4j.appender.etl.Threshold=info
log4j.appender.etl.DatePattern='.'yyyy-MM-dd
log4j.appender.etl.layout=org.apache.log4j.PatternLayout
log4j.appender.etl.layout.ConversionPattern=%-5p %d{MM-dd HH:mm:ss} - %m%n
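
Note: the root logger and the org.pentaho.di logger both reference the etl appender, so if Kettle's lines arrive via a logger under org.pentaho.di (as the line above assumes), each line is written to the file twice. If that is unwanted, additivity can be switched off for the child logger with a standard log4j setting:

log4j.additivity.org.pentaho.di=false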

Java code

import java.util.HashMap;
import java.util.Map;
import org.apache.log4j.Logger;
import org.junit.Before;
import org.junit.Test;
import org.pentaho.di.core.DBCache;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.core.exception.KettleXMLException;
import org.pentaho.di.core.logging.KettleLogStore;
import org.pentaho.di.core.logging.LogLevel;
import org.pentaho.di.core.logging.LoggingBuffer;
import org.pentaho.di.core.logging.log4j.Log4jLogging;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

/**
 * @description Kettle API usage example
 * @author hury
 */

public class EtlExample {
	private static Logger logger = Logger.getLogger(EtlExample.class);
	private static Log4jLogging etlLog = null;

	@Before
	public void init() {

		try {
			// Kettle 4.1 logging setup
			// KettleEnvironment.init();
			// DBCache.getInstance();
			// LogWriter.getInstance().addAppender(Logger.getRootLogger().getAppender("etl"));

			// Kettle 5.4 logging setup
			KettleEnvironment.init(false);
			LoggingBuffer loggingBuffer = KettleLogStore.getAppender();
			if (etlLog == null) {
				etlLog = new Log4jLogging();
				loggingBuffer.addLoggingEventListener(etlLog);
			}
			DBCache.getInstance();
		} catch (KettleException e) {
			logger.error("初始化kettle环境异常。");
			e.printStackTrace();
		}

	}

	@Test
	public void TestEtlJob() {
		logger.info("测试 Job");
		String jobname = "config/test/test_job.kjb";
		runJob(jobname);
	}

	@Test
	public void TestTransformation() {
		logger.info("测试 transformation");
		String filename = "config/test/test_trans.ktr";
		runTrans(filename);
	}

	/**
	 * Run an ETL job script
	 * 
	 * @param fileName
	 *            path of the job (.kjb) script
	 */
	public static void runJob(String fileName) {
		try {
			Map<String, String> ps = new HashMap<String, String>();
			ps.put("jgid", "机构编码1");
			ps.put("jgmc", "机构名称1");
			String[] args1 = { "参数1" };
			// fileName is the path and name of the job (.kjb) script
			JobMeta meta = new JobMeta(fileName, null);
			for (String key : ps.keySet()) {
				String value = ps.get(key);
				meta.setParameterValue(key, value);
			}
			Job job = new Job(null, meta);
			job.setArguments(args1);
			job.setLogLevel(LogLevel.DETAILED);
			// Pass a variable to the job script (placeholder name/value); read it in the script as ${variableName}
			job.setVariable("", "");
			job.start();
			job.waitUntilFinished();
			if (job.getErrors() > 0) {
				logger.error("runJob fail!");
			}
		} catch (KettleException e) {
			logger.error(e);
		}
	}

	/**
	 * Run an ETL transformation script
	 * 
	 * @param fileName
	 *            path of the transformation (.ktr) script
	 */
	public static void runTrans(String fileName) {
		try {
			// External named parameters
			Map<String, String> ps = new HashMap<String, String>();
			ps.put("jgid", "机构编码1");
			ps.put("jgmc", "机构名称1");
			String[] args1 = { "参数1" };
			// Load the .ktr script
			TransMeta transMeta = new TransMeta(fileName);
			for (String key : ps.keySet()) {
				String value = ps.get(key);
				transMeta.setParameterValue(key, value);
			}

			Trans trans = new Trans(transMeta);
			trans.setArguments(args1);
			trans.prepareExecution(null);
			trans.startThreads();
			trans.waitUntilFinished();
			if (trans.getErrors() != 0) {
				logger.error("Error");
			}
		} catch (KettleXMLException e) {
			e.printStackTrace();
		} catch (KettleException e) {
			e.printStackTrace();
		}
	}
}
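
For running outside of JUnit, the same calls can be driven from a plain main method. The class below is only an illustrative wrapper around the EtlExample methods above.

import org.pentaho.di.core.DBCache;
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.logging.KettleLogStore;
import org.pentaho.di.core.logging.log4j.Log4jLogging;

public class EtlMain {
	public static void main(String[] args) throws Exception {
		// Same initialization as EtlExample.init(), without the JUnit lifecycle
		KettleEnvironment.init(false);
		KettleLogStore.getAppender().addLoggingEventListener(new Log4jLogging());
		DBCache.getInstance();

		EtlExample.runJob("config/test/test_job.kjb");
		EtlExample.runTrans("config/test/test_trans.ktr");
	}
}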

References

https://forums.pentaho.com/threads/156592-Kettle-5-0-1-Log4j-plugin-usage/

Source: https://blog.csdn.net/huryer/article/details/95167868
