Compare commits

...

21 Commits

Author SHA1 Message Date
pyzhou-talend
1561df4ee2 patch(TPS-3236):release note 2019-07-16 14:40:34 +08:00
clesaec
f61b793263 fix(TDI-42230) : file copy java11 bug (#3670)
* fix(TDI-42230) : using standard java.nio.Files classes (since Java 7)
2019-07-08 11:31:35 +02:00
nrousseau
1e694d07f0 fix(TMDM-13727) remove not existing plugin 2019-07-08 16:43:34 +08:00
Hanna Liashchuk
bfb26a7bfb fix(TBD-8925): table won't be created unless createTable option is checked OAuth (#3686) 2019-07-05 18:02:43 +03:00
qiongli
47ee1e1ed0 feat(TDQ-16419) Add a filter by the column data type (#3675) 2019-07-05 17:57:21 +08:00
hcyi
164641111e fix(TUP-23264):fix problems for guess schema and run job. (#3683) 2019-07-05 17:17:05 +08:00
hzhao-talendbj
3251e99e35 fix(TUP-23363)'save the property' to metadata get error (#3682) 2019-07-05 10:50:02 +08:00
kjwang-talend
c69d2348b8 Fix TUP-22583 [CVE:high] Backend : pkg:maven/ch.qos.logback:logback-core:1.0.6 requires version upgrade (#3661)
* Fix TUP-22583 [CVE:high] Backend :
pkg:maven/ch.qos.logback:logback-core:1.0.6 requires version upgrade
https://jira.talendforge.org/browse/TUP-22583
2019-07-04 18:45:52 +08:00
Jane Ding
88f2a8734b fix(TUP-22200)Import XML tree from xsd file throw (#3669)
* fix(TUP-22200)Import XML tree from xsd file throw
indexOutOfBoundsException
https://jira.talendforge.org/browse/TUP-22200
2019-07-04 10:24:40 +08:00
chmyga
592be6abb1 fix(TDI-42536): tFileFetch NTLM PROXY issues (#3653)
* fix(TDI-42536): tFileFetch NTLM PROXY issues

* Fix basic auth when server allows different auth mechanisms

* fix(TDI-42536): tFileFetch NTLM PROXY issues

* Fix PR Comment
2019-07-03 15:07:54 +03:00
jiezhang-tlnd
cb7468510a fix(TUP-23371)Studio_will_dead_when_TAC_accidentally_disconnected (#3678) 2019-07-03 18:01:36 +08:00
wang wei
626748e4cb fix(TDI-42380): Migration issues in the Studio after migrating the jobs from 6.5 to 7.1(#3648) 2019-07-03 11:23:03 +08:00
hcyi
0a5277ae6e fix(TUP-23550):Select query for DBInput component page will list (#3674)
Couchbase.
2019-07-02 11:30:11 +08:00
Chao MENG
761fbcf5b0 Revert "fix(TUP-23371)[GIT] Studio will dead when TAC accidentally disconnected (#3665)" (#3673)
This reverts commit 059332dd05.
2019-07-01 14:46:02 +08:00
hcyi
8560b26217 fix(TUP-23264):Studio can't find dependencies for tacokit components. (#3672) 2019-07-01 11:28:59 +08:00
jiezhang-tlnd
059332dd05 fix(TUP-23371)[GIT] Studio will dead when TAC accidentally disconnected (#3665)
* fix(TUP-23371)[GIT] Studio will dead when TAC accidentally disconnected

* fix(TUP-23371)[GIT] Studio will dead when TAC accidentally disconnected
2019-07-01 10:21:24 +08:00
Zhiwei Xue
06d54bf796 fix(TUP-23096):update context takes about 10 mins (#3663) 2019-06-28 18:27:07 +08:00
Zhiwei Xue
b13cf8c27c fix(TUP-23052):Import project failed by Import an existing project (#3667) 2019-06-28 18:22:15 +08:00
Irene Wang
cdb0912a5a Revert "DEVOPS-3416 Deploy all plugins for Black Duck scans"
This reverts commit 1ce0e429ca.
2019-06-28 12:16:22 +02:00
Irene Wang
2777cc69a9 DEVOPS-6106 Update copyright year to 2019 2019-06-28 12:16:22 +02:00
apoltavtsev
3b70073b38 TESB-26293 Duplicated libraries in private & import packages of the build manifest file 2019-06-27 22:57:53 +03:00
35 changed files with 788 additions and 648 deletions

57
PATCH_RELEASE_NOTE.md Normal file
View File

@@ -0,0 +1,57 @@
---
version: 7.1.1
module: https://talend.poolparty.biz/coretaxonomy/42
product:
- https://talend.poolparty.biz/coretaxonomy/23
---
# TPS-3236
| Info | Value |
| ---------------- | ---------------- |
| Patch Name | Patch\_20190716\_TPS\-3236\_v1\-7.1.1 |
| Release Date | 2019-07-16 |
| Target Version | Talend-Studio-20181026_1147-V7.1.1 |
| Product affected | Talend Studio |
## Introduction
This is a self-contained patch.
**NOTE**: For information on how to obtain this patch, reach out to your Support contact at Talend.
## Fixed issues
This patch contains the following fixes:
- TPS-3236 [7.1.1] tExaBulkExec component has a wrong number formatting in bulk imports to Exasol (TDI-42572)
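A purely illustrative sketch of this class of bug (the actual TPS-3236/TDI-42572 fix is not part of this comparison): decimal values formatted with the default locale can use a comma as the decimal separator, which a bulk loader expecting a dot will misread, while formatting with a fixed locale avoids this. All values below are made up.

import java.util.Locale;

public class NumberFormattingExample {
    public static void main(String[] args) {
        double value = 12345.678;
        // Locale-dependent formatting: under a German default locale this yields "12345,678".
        String localeDependent = String.format(Locale.GERMANY, "%.3f", value);
        // Locale-independent formatting always uses '.' as the decimal separator.
        String localeIndependent = String.format(Locale.ROOT, "%.3f", value);
        System.out.println(localeDependent);   // 12345,678
        System.out.println(localeIndependent); // 12345.678
    }
}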
## Prerequisites
Consider the following requirements for your system:
- Talend Studio 7.1.1 must be installed.
## Installation
### Installing the patch using Software update
1) Log on to TAC and go to Configuration -> Software Update, then enter the correct values and save, referring to the documentation: https://help.talend.com/reader/f7Em9WV_cPm2RRywucSN0Q/j9x5iXV~vyxMlUafnDejaQ
2) Switch to the Software Update page, where the new patch is listed. From there, the patch can be downloaded into the Nexus repository.
3) On the Studio side: log on to Studio in remote mode; the Update button is displayed on the logon page. Click this button to install the patch.
### Installing the patch using Talend Studio
1) Create a folder named "patches" under your Studio installation directory and copy the patch .zip file to this folder.
2) Restart your Studio: a window pops up; click OK to install the patch. Alternatively, restart the commandline and the patch will be installed automatically.
### Installing the patch using Commandline
Execute the following commands:
1. Talend-Studio-win-x86_64.exe -nosplash -application org.talend.commandline.CommandLine -consoleLog -data commandline-workspace startServer -p 8002 --talendDebug
2. initRemote {tac_url} -ul {TAC login username} -up {TAC login password}
3. checkAndUpdate -tu {TAC login username} -tup {TAC login password}

View File

@@ -11,7 +11,6 @@
<import feature="org.talend.tos.feature" version="0.0.0" match="greaterOrEqual"/>
</requires>
<plugin id="org.talend.designer.maven.tos" download-size="0" install-size="0" version="0.0.0" fragment="true"/>
<plugin id="org.talend.libraries.mdm.webservice.ce" download-size="0" install-size="0" version="0.0.0" fragment="true"/>
<plugin id="org.talend.presentation.onboarding" download-size="0" install-size="0" version="0.0.0" unpack="false"/>
<plugin id="org.talend.presentation.onboarding.nl" download-size="0" install-size="0" version="0.0.0" fragment="true" unpack="false"/>
<plugin id="org.talend.presentation.onboarding.resource" download-size="0" install-size="0" version="0.0.0" unpack="false"/>

View File

@@ -69,145 +69,140 @@ if(hasInput){
}
}
}
boolean hasValidInput = inputConn!=null;
IMetadataTable metadata = null;
List<IMetadataTable> metadatas = node.getMetadataList();
boolean haveValidNodeMetadata = ((metadatas != null) && (metadatas.size() > 0) && (metadata = metadatas.get(0)) != null);
if (hasValidInput && haveValidNodeMetadata) {
if (hasValidInput) {
List<IMetadataColumn> input_columnList = inputConn.getMetadataTable().getListColumns();
if(input_columnList == null) {
input_columnList = new ArrayList<IMetadataColumn>();
}
// add incoming (not present) columns to enforcer for this comps
if (cid.contains("tDataStewardship") || cid.contains("tMarkLogic")){
%>
boolean shouldCreateRuntimeSchemaForIncomingNode = false;
<%
for (int i = 0; i < input_columnList.size(); i++) {
if(!input_columnList.get(i).getTalendType().equals("id_Dynamic")) {
%>
if (incomingEnforcer_<%=cid%>.getDesignSchema().getField("<%=input_columnList.get(i)%>") == null){
incomingEnforcer_<%=cid%>.addIncomingNodeField("<%=input_columnList.get(i)%>", ((Object) <%=inputConn.getName()%>.<%=input_columnList.get(i)%>).getClass().getCanonicalName());
shouldCreateRuntimeSchemaForIncomingNode = true;
}
<%
}
}
%>
if (shouldCreateRuntimeSchemaForIncomingNode){
incomingEnforcer_<%=cid%>.createRuntimeSchema();
}
<%
}
// If there are dynamic columns in the schema, they need to be
// initialized into the runtime schema of the actual IndexedRecord
// provided to the component.
int dynamicPos = -1;
for (int i = 0; i < input_columnList.size(); i++) {
if (input_columnList.get(i).getTalendType().equals("id_Dynamic")) {
dynamicPos = i;
break;
}
}
if (dynamicPos != -1) {
%>
if (!incomingEnforcer_<%=cid%>.areDynamicFieldsInitialized()) {
// Initialize the dynamic columns when they are first encountered.
for (routines.system.DynamicMetadata dm_<%=cid%> : <%=inputConn.getName()%>.<%=input_columnList.get(dynamicPos).getLabel()%>.metadatas) {
incomingEnforcer_<%=cid%>.addDynamicField(
dm_<%=cid%>.getName(),
dm_<%=cid%>.getType(),
dm_<%=cid%>.getLogicalType(),
dm_<%=cid%>.getFormat(),
dm_<%=cid%>.getDescription(),
dm_<%=cid%>.isNullable());
}
incomingEnforcer_<%=cid%>.createRuntimeSchema();
}
<%
}
%>
incomingEnforcer_<%=cid%>.createNewRecord();
<%
for (int i = 0; i < input_columnList.size(); i++) { // column
IMetadataColumn column = input_columnList.get(i);
if (dynamicPos != i) {
%>
//skip the put action if the input column doesn't appear in component runtime schema
if (incomingEnforcer_<%=cid%>.getRuntimeSchema().getField("<%=input_columnList.get(i)%>") != null){
incomingEnforcer_<%=cid%>.put("<%=column.getLabel()%>", <%=inputConn.getName()%>.<%=column.getLabel()%>);
}
<%
} else {
%>
for (int i = 0; i < <%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnCount(); i++) {
incomingEnforcer_<%=cid%>.put(<%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnMetadata(i).getName(),
<%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnValue(i));
}
<%
}
} // column
// If necesary, generate the code to handle outgoing connections.
// TODO: For now, this can only handle one outgoing record for
// each incoming record. To handle multiple outgoing records, code
// generation needs to occur in component_begin in order to open
// a for() loop.
// There will be a ClassCastException if the output component does
// not implement WriterWithFeedback, but permits outgoing
// connections.
if (input_columnList!=null && !input_columnList.isEmpty()) {
// add incoming (not present) columns to enforcer for this comps
if (cid.contains("tDataStewardship") || cid.contains("tMarkLogic")){
%>
boolean shouldCreateRuntimeSchemaForIncomingNode = false;
<%
for (int i = 0; i < input_columnList.size(); i++) {
if(!input_columnList.get(i).getTalendType().equals("id_Dynamic")) {
%>
if (incomingEnforcer_<%=cid%>.getDesignSchema().getField("<%=input_columnList.get(i)%>") == null){
incomingEnforcer_<%=cid%>.addIncomingNodeField("<%=input_columnList.get(i)%>", ((Object) <%=inputConn.getName()%>.<%=input_columnList.get(i)%>).getClass().getCanonicalName());
shouldCreateRuntimeSchemaForIncomingNode = true;
}
<%
}
}
%>
if (shouldCreateRuntimeSchemaForIncomingNode){
incomingEnforcer_<%=cid%>.createRuntimeSchema();
}
<%
}
// If there are dynamic columns in the schema, they need to be
// initialized into the runtime schema of the actual IndexedRecord
// provided to the component.
int dynamicPos = -1;
for (int i = 0; i < input_columnList.size(); i++) {
if (input_columnList.get(i).getTalendType().equals("id_Dynamic")) {
dynamicPos = i;
break;
}
}
if (dynamicPos != -1) {
%>
if (!incomingEnforcer_<%=cid%>.areDynamicFieldsInitialized()) {
// Initialize the dynamic columns when they are first encountered.
for (routines.system.DynamicMetadata dm_<%=cid%> : <%=inputConn.getName()%>.<%=input_columnList.get(dynamicPos).getLabel()%>.metadatas) {
incomingEnforcer_<%=cid%>.addDynamicField(
dm_<%=cid%>.getName(),
dm_<%=cid%>.getType(),
dm_<%=cid%>.getLogicalType(),
dm_<%=cid%>.getFormat(),
dm_<%=cid%>.getDescription(),
dm_<%=cid%>.isNullable());
}
incomingEnforcer_<%=cid%>.createRuntimeSchema();
}
<%
}
%>
incomingEnforcer_<%=cid%>.createNewRecord();
<%
for (int i = 0; i < input_columnList.size(); i++) { // column
IMetadataColumn column = input_columnList.get(i);
if (dynamicPos != i) {
%>
//skip the put action if the input column doesn't appear in component runtime schema
if (incomingEnforcer_<%=cid%>.getRuntimeSchema().getField("<%=input_columnList.get(i)%>") != null){
incomingEnforcer_<%=cid%>.put("<%=column.getLabel()%>", <%=inputConn.getName()%>.<%=column.getLabel()%>);
}
<%
} else {
%>
for (int i = 0; i < <%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnCount(); i++) {
incomingEnforcer_<%=cid%>.put(<%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnMetadata(i).getName(),
<%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnValue(i));
}
<%
}
} // column
// If necesary, generate the code to handle outgoing connections.
// TODO: For now, this can only handle one outgoing record for
// each incoming record. To handle multiple outgoing records, code
// generation needs to occur in component_begin in order to open
// a for() loop.
// There will be a ClassCastException if the output component does
// not implement WriterWithFeedback, but permits outgoing
// connections.
ComponentProperties componentProps = node.getComponentProperties();
ProcessPropertiesGenerator generator = new ProcessPropertiesGenerator(cid, component);
List<Component.CodegenPropInfo> propsToProcess = component.getCodegenPropInfos(componentProps);
for (Component.CodegenPropInfo propInfo : propsToProcess) { // propInfo
List<NamedThing> properties = propInfo.props.getProperties();
for (NamedThing prop : properties) { // property
if (prop instanceof Property) { // if, only deal with valued Properties
Property property = (Property)prop;
if (property.getFlags() != null && (property.getFlags().contains(Property.Flags.DESIGN_TIME_ONLY) || property.getFlags().contains(Property.Flags.HIDDEN)))
continue;
if(property.getTaggedValue(IGenericConstants.DYNAMIC_PROPERTY_VALUE)!=null && Boolean.valueOf(String.valueOf(property.getTaggedValue(IGenericConstants.DYNAMIC_PROPERTY_VALUE)))) {
generator.setPropertyValues(property, propInfo, null, false, false);
}
ComponentProperties componentProps = node.getComponentProperties();
ProcessPropertiesGenerator generator = new ProcessPropertiesGenerator(cid, component);
List<Component.CodegenPropInfo> propsToProcess = component.getCodegenPropInfos(componentProps);
for (Component.CodegenPropInfo propInfo : propsToProcess) { // propInfo
List<NamedThing> properties = propInfo.props.getProperties();
for (NamedThing prop : properties) { // property
if (prop instanceof Property) { // if, only deal with valued Properties
Property property = (Property)prop;
if (property.getFlags() != null && (property.getFlags().contains(Property.Flags.DESIGN_TIME_ONLY) || property.getFlags().contains(Property.Flags.HIDDEN)))
continue;
if(property.getTaggedValue(IGenericConstants.DYNAMIC_PROPERTY_VALUE)!=null && Boolean.valueOf(String.valueOf(property.getTaggedValue(IGenericConstants.DYNAMIC_PROPERTY_VALUE)))) {
generator.setPropertyValues(property, propInfo, null, false, false);
}
} // property
} // propInfo
%>
org.apache.avro.generic.IndexedRecord data_<%=cid%> = incomingEnforcer_<%=cid%>.getCurrentRecord();
<%
boolean isParallelize ="true".equalsIgnoreCase(ElementParameterParser.getValue(node, "__PARALLELIZE__"));
if (isParallelize) {
String sourceComponentId = inputConn.getSource().getUniqueName();
if(sourceComponentId!=null && sourceComponentId.contains("tAsyncIn")) {
%>
globalMap.put(buffersSizeKey_<%=cid%>, buffersSize_<%=sourceComponentId%>);
<%
}
}
%>
}
} // property
} // propInfo
%>
org.apache.avro.generic.IndexedRecord data_<%=cid%> = incomingEnforcer_<%=cid%>.getCurrentRecord();
<%
boolean isParallelize ="true".equalsIgnoreCase(ElementParameterParser.getValue(node, "__PARALLELIZE__"));
if (isParallelize) {
String sourceComponentId = inputConn.getSource().getUniqueName();
if(sourceComponentId!=null && sourceComponentId.contains("tAsyncIn")) {
%>
globalMap.put(buffersSizeKey_<%=cid%>, buffersSize_<%=sourceComponentId%>);
<%
}
}
%>
writer_<%=cid%>.write(data_<%=cid%>);
nb_line_<%=cid %>++;
<%if(hasMainOutput){
%>
if(!(writer_<%=cid%> instanceof org.talend.components.api.component.runtime.WriterWithFeedback)) {
// For no feedback writer,just pass the input record to the output
if (data_<%=cid%>!=null) {
outgoingMainRecordsList_<%=cid%> = java.util.Arrays.asList(data_<%=cid%>);
}
}
<%
}
writer_<%=cid%>.write(data_<%=cid%>);
nb_line_<%=cid %>++;
<%if(hasMainOutput){
%>
if(!(writer_<%=cid%> instanceof org.talend.components.api.component.runtime.WriterWithFeedback)) {
// For no feedback writer,just pass the input record to the output
if (data_<%=cid%>!=null) {
outgoingMainRecordsList_<%=cid%> = java.util.Arrays.asList(data_<%=cid%>);
}
}
<%
}
}
} // canStart

View File

@@ -2,10 +2,9 @@
<project name="org.talend.designer.components.libs" default="buildall" basedir=".">
<target name="buildall">
<ant antfile="filecopy/build.xml" target="process" inheritall="no" />
<ant antfile="talend_file_enhanced_20070724/build.xml" target="process" inheritall="no" />
<ant antfile="sugarCRMManagement/build.xml" target="process" inheritall="no" />
<ant antfile="TalendSAX/build.xml" target="process" inheritall="no" />
</target>
</project>
</project>

View File

@@ -0,0 +1,3 @@
.classpath
.project
target/

View File

@@ -1,88 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project name="org.talend.designer.components.libs" default="process" basedir=".">
<property name="component.plugin.home" value="../../../org.talend.designer.components.localprovider/components" />
<!-- #################################################### -->
<!-- modification 1: config -->
<property name="jar.name" value="filecopy.jar" />
<property name="component.name" value="tFileCopy" />
<property name="author.name" value="wyang" />
<!-- modification 2: compile classpath -->
<path id="compile.classpath">
</path>
<!-- #################################################### -->
<!-- sourcecode and final jar path -->
<property name="source.home" value="." />
<property name="jar.home" value="${component.plugin.home}/${component.name}/${jar.name}" />
<!-- temp dir for clasee files -->
<property name="build.dir" value="../../build" />
<!-- compile option -->
<property name="compile.debug" value="true" />
<property name="compile.deprecation" value="false" />
<property name="compile.optimize" value="true" />
<target name="process" description="prepare a temp dir">
<antcall target="prepare" />
<antcall target="compile" />
<antcall target="clean" />
</target>
<target name="prepare" description="prepare a temp dir">
<delete dir="${build.dir}" />
<mkdir dir="${build.dir}" />
<mkdir dir="${build.dir}/classes" />
</target>
<target name="compile" description="Compile Java sources">
<!-- compile -->
<javac srcdir="${source.home}" destdir="${build.dir}/classes" debug="${compile.debug}" deprecation="${compile.deprecation}" optimize="${compile.optimize}">
<classpath refid="compile.classpath" />
</javac>
<!-- include source code -->
<copy todir="${build.dir}/classes">
<fileset dir="${source.home}">
<exclude name="build.xml" />
</fileset>
</copy>
<!-- make jar -->
<tstamp>
<format property="date" pattern="yyyy-MM-dd HH:mm:ss" />
</tstamp>
<jar destfile="${build.dir}/${jar.name}" basedir="${build.dir}/classes">
<manifest>
<!-- who -->
<attribute name="Built-By" value="${author.name}" />
<!-- when -->
<attribute name="Built-Date" value="${date}"/>
<!-- JDK version -->
<attribute name="Created-By" value="${java.version} (${java.vendor})" />
<!-- Information about the program itself -->
<attribute name="Implementation-Vendor" value="Talend SA" />
<attribute name="Implementation-Title" value="${jar.name}" />
<attribute name="Implementation-Version" value="1.0" />
</manifest>
</jar>
<!-- move jar -->
<move file="${build.dir}/${jar.name}" tofile="${jar.home}" />
</target>
<target name="clean" description="clean the temp dir">
<delete dir="${build.dir}" />
<mkdir dir="${build.dir}" />
</target>
</project>

View File

@@ -1,190 +0,0 @@
// ============================================================================
//
// Copyright (C) 2006-2019 Talend Inc. - www.talend.com
//
// This source code is available under agreement available at
// %InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt
//
// You should have received a copy of the agreement
// along with this program; if not, write to Talend SA
// 9 rue Pages 92150 Suresnes, France
//
// ============================================================================
package org.talend;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.lang.reflect.Method;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.security.AccessController;
import java.security.PrivilegedAction;
/**
* DOC Administrator class global comment. Detailled comment
*/
public class FileCopy {
private final static long L_SIZE = 100 * 1024 * 1024; // 100M
private final static long M_SIZE = 10 * 1024 * 1024; // 10M
private final static long S_SIZE = 0; // 0M
public static void copyFile(String srcFileName, String desFileName, boolean delSrc) throws Exception {
FileInputStream srcInputStream = null;
try{
srcInputStream = new FileInputStream(srcFileName);
long lastModified = new File(srcFileName).lastModified();
int available = srcInputStream.available();
if (available > L_SIZE) {// X > 100M
copyFileL(srcFileName, srcInputStream, desFileName, delSrc);
} else if (available > M_SIZE) {// 10M < X <100M
copyFileM(srcFileName, srcInputStream, desFileName, delSrc);
} else { // X < 10M
copyFileS(srcFileName, srcInputStream, desFileName, delSrc);
}
// keep modification_time
new File(desFileName).setLastModified(lastModified);
}finally{
if(srcInputStream!=null){
srcInputStream.close();
}
}
}
private static void copyFileS(String srcFileName, FileInputStream srcInputStream, String desFileName, boolean delSrc)
throws IOException {
File source = new File(srcFileName);
File dest = new File(desFileName);
FileInputStream in = null;
FileOutputStream out = null;
try {
in = srcInputStream;
out = new FileOutputStream(dest);
byte[] buf = new byte[1024];
int len = 0;
while ((len = in.read(buf)) != -1) {
out.write(buf, 0, len);
}
in.close();
out.close();
if (delSrc) {
source.delete();
}
} finally {
if (in != null) {
in.close();
}
if (out != null) {
out.close();
}
}
}
private static void copyFileM(String srcFileName, FileInputStream srcInputStream, String desFileName, boolean delSrc)
throws IOException {
File source = new File(srcFileName);
File dest = new File(desFileName);
FileChannel in = null;
FileChannel out = null;
try {
in = srcInputStream.getChannel();
out = new FileOutputStream(dest).getChannel();
int maxCount = (32 * 1024 * 1024) - (28 * 1024);
long size = in.size();
long position = 0;
while (position < size) {
position += in.transferTo(position, maxCount, out);
}
in.close();
out.close();
if (delSrc) {
source.delete();
}
} finally {
if (in != null) {
in.close();
}
if (out != null) {
out.close();
}
}
}
private static void copyFileL(String srcFileName, FileInputStream srcInputStream, String desFileName, boolean delSrc)
throws Exception {
File source = new File(srcFileName);
File dest = new File(desFileName);
FileChannel in = null, out = null;
try {
in = srcInputStream.getChannel();
out = new FileOutputStream(dest).getChannel();
long size = in.size();
long position = 0;
final long MAP_SIZE = 33525760;
MappedByteBuffer buf = null;
while (true) {
if (position + MAP_SIZE >= size) {
buf = in.map(FileChannel.MapMode.READ_ONLY, position, size - position);
out.write(buf);
//For But TDI-26493, here must clean first, or it can't delete
clean(buf);
break;
} else {
buf = in.map(FileChannel.MapMode.READ_ONLY, position, MAP_SIZE);
out.write(buf);
// here must clean first, or it can't delete
clean(buf);
position += MAP_SIZE;
}
}
in.close();
out.close();
if (delSrc) {
source.delete();
}
} finally {
if (in != null) {
in.close();
}
if (out != null) {
out.close();
}
}
}
@SuppressWarnings("unchecked")
private static void clean(final Object buffer) throws Exception {
AccessController.doPrivileged(new PrivilegedAction() {
public Object run() {
try {
Method getCleanerMethod = buffer.getClass().getMethod("cleaner", new Class[0]);
getCleanerMethod.setAccessible(true);
sun.misc.Cleaner cleaner = (sun.misc.Cleaner) getCleanerMethod.invoke(buffer, new Object[0]);
cleaner.clean();
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
});
}
}

View File

@@ -0,0 +1,73 @@
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.talend.libraries</groupId>
<artifactId>filecopy</artifactId>
<version>2.0.0</version>
<packaging>jar</packaging>
<name>talend-copy</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<talend.nexus.url>https://artifacts-oss.talend.com</talend.nexus.url>
<java.source.version>1.8</java.source.version>
<junit5.version>5.4.2</junit5.version>
</properties>
<distributionManagement>
<snapshotRepository>
<id>talend_nexus_deployment</id>
<url>${talend.nexus.url}/nexus/content/repositories/TalendOpenSourceSnapshot/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
<releases>
<enabled>false</enabled>
</releases>
</snapshotRepository>
<repository>
<id>talend_nexus_deployment</id>
<url>${talend.nexus.url}/nexus/content/repositories/TalendOpenSourceRelease/</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
<releases>
<enabled>true</enabled>
</releases>
</repository>
</distributionManagement>
<dependencies>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>${junit5.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-engine</artifactId>
<version>${junit5.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.1</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.source.version}</target>
<showDeprecation>true</showDeprecation>
<showWarnings>true</showWarnings>
<fork>true</fork>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -0,0 +1,49 @@
// ============================================================================
//
// Copyright (C) 2006-2019 Talend Inc. - www.talend.com
//
// This source code is available under agreement available at
// %InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt
//
// You should have received a copy of the agreement
// along with this program; if not, write to Talend SA
// 9 rue Pages 92150 Suresnes, France
//
// ============================================================================
package org.talend;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;
/**
* DOC Administrator class global comment. Detailled comment
*/
public class FileCopy {
/** Private constructor, only static methods */
private FileCopy() {
}
/**
* Copy files.
*
* @param srcFileName : file name for source file.
* @param desFileName : file name for destination file.
* @param delSrc : true if delete source.
* @throws IOException : if IO pb.
*/
public static void copyFile(String srcFileName, String desFileName, boolean delSrc) throws IOException {
final File source = new File(srcFileName);
final File destination = new File(desFileName);
if (delSrc) {
// move : more efficient if in same FS and must delete existing file.
Files.move(source.toPath(), destination.toPath(), StandardCopyOption.REPLACE_EXISTING);
} else {
Files.copy(source.toPath(), destination.toPath(), StandardCopyOption.REPLACE_EXISTING);
}
}
}
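A minimal usage sketch of the copyFile method above (not part of the patch; the file paths are hypothetical). The destination is overwritten if it exists, and with delSrc set to true the call becomes a move.

import java.io.IOException;
import org.talend.FileCopy;

public class FileCopyUsage {
    public static void main(String[] args) throws IOException {
        // Copy, keeping the source file.
        FileCopy.copyFile("/tmp/source.txt", "/tmp/destination.txt", false);
        // Move: the source file is deleted after the copy.
        FileCopy.copyFile("/tmp/destination.txt", "/tmp/archive.txt", true);
    }
}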

View File

@@ -0,0 +1,128 @@
package org.talend;
import java.io.BufferedWriter;
import java.io.File;
import java.io.IOException;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.StandardOpenOption;
import java.util.concurrent.TimeUnit;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
/**
* Test for FileCopy.class with different file sizes.
*
* @author clesaec
*
*/
class FileCopyTest {
@Test
void testCopyFile() throws Exception {
final URL repCopy = Thread.currentThread().getContextClassLoader().getResource("copy");
File small = this.buildFile("small.txt", 10L * 1024L);
small.deleteOnExit();
File smallCopy = new File(repCopy.getPath(), "small.txt");
smallCopy.deleteOnExit();
FileCopy.copyFile(small.getPath(), smallCopy.getPath(), false);
Assertions.assertTrue(smallCopy.exists(), "small file fail to copy (not created)");
Assertions.assertTrue(small.exists(), "small file : original file deleted");
Assertions.assertEquals(smallCopy.length(), small.length(), "Size error");
File medium = this.buildFile("medium.txt", 30L * 1024L * 1024L);
medium.deleteOnExit();
File mediumCopy = new File(repCopy.getPath(), "medium.txt");
mediumCopy.deleteOnExit();
FileCopy.copyFile(medium.getPath(), mediumCopy.getPath(), false);
Assertions.assertTrue(mediumCopy.exists(), "medium file fail to copy (not created)");
Assertions.assertTrue(medium.exists(), "medium file : original file deleted");
Assertions.assertEquals(mediumCopy.length(), medium.length(), "Size error");
File large = this.buildFile("large.txt", 110L * 1024L * 1024L);
large.deleteOnExit();
long startTime = System.nanoTime();
File largeCopy = new File(repCopy.getPath(), "large.txt");
long duration = System.nanoTime() - startTime;
System.out.println("Duration for 110 Mo file : " + TimeUnit.NANOSECONDS.toMicros(duration) + " µs");
largeCopy.deleteOnExit();
FileCopy.copyFile(large.getPath(), largeCopy.getPath(), false);
Assertions.assertTrue(largeCopy.exists(), "small file fail to copy (not created)");
Assertions.assertTrue(large.exists(), "small file : original file deleted");
Assertions.assertEquals(largeCopy.length(), large.length(), "Size error");
}
@Test
void testCopyMv() throws Exception {
final URL repCopy = Thread.currentThread().getContextClassLoader().getResource("copy");
File file = this.buildFile("fileToMove.txt", 10L * 1024L);
file.deleteOnExit();
File copy = new File(repCopy.getPath(), "fileToMove.txt");
long referenceSize = file.length();
if (copy.exists()) {
copy.delete();
}
copy.deleteOnExit();
FileCopy.copyFile(file.getPath(), copy.getPath(), true);
Assertions.assertFalse(file.exists(), "file not delete");
Assertions.assertTrue(copy.exists(), "small file : original file deleted");
Assertions.assertEquals(referenceSize, copy.length(), "Size error");
}
@Test
void testCopyWithDelete() throws Exception {
final URL repCopy = Thread.currentThread().getContextClassLoader().getResource("copy");
File file = this.buildFile("fileToDelete.txt", 10L * 1024L);
file.deleteOnExit();
File copy = new File(repCopy.getPath(), "fileToDelete.txt");
long referenceSize = file.length();
if (!copy.exists()) {
copy.createNewFile();
}
copy.deleteOnExit();
FileCopy.copyFile(file.getPath(), copy.getPath(), true);
Assertions.assertFalse(file.exists(), "file not delete");
Assertions.assertTrue(copy.exists(), "small file : original file deleted");
Assertions.assertEquals(referenceSize, copy.length(), "Size error");
}
/**
* Generate a new file for testing.
*
* @param name : name of file.
* @param minSize : minimum size.
* @return the new file.
* @throws IOException : on IO pb.
*/
private File buildFile(String name, long minSize) throws IOException {
final URL repGenerated = Thread.currentThread().getContextClassLoader().getResource("generated");
final File generatedFile = new File(repGenerated.getPath(), name);
if (generatedFile.exists()) {
generatedFile.delete();
}
final String data = "{ data to put in generated file for it have the desired sized }" + System.lineSeparator();
long nbeIteration = (minSize / data.length()) + 1;
try (BufferedWriter writer = Files.newBufferedWriter(generatedFile.toPath(), StandardOpenOption.CREATE)) {
for (long i = 0; i < nbeIteration; i++) {
writer.append(data);
}
}
return generatedFile;
}
}

View File

@@ -0,0 +1 @@
Just here to not have an empty directory.

View File

@@ -0,0 +1 @@
Just here to not have an empty directory.

View File

@@ -258,76 +258,80 @@
com.google.api.services.bigquery.model.JobConfiguration config_<%=cid%> = new com.google.api.services.bigquery.model.JobConfiguration();
com.google.api.services.bigquery.model.JobConfigurationLoad queryLoad_<%=cid%> = new com.google.api.services.bigquery.model.JobConfigurationLoad();
com.google.api.services.bigquery.model.TableSchema schema_<%=cid%> = new com.google.api.services.bigquery.model.TableSchema();
<%
if(isLog4jEnabled){
%>
log.info("<%=cid%> - Table field schema:");
<%
}
%>
java.util.List<com.google.api.services.bigquery.model.TableFieldSchema> fields_<%=cid%> = new java.util.ArrayList<com.google.api.services.bigquery.model.TableFieldSchema>();
<%
List<IMetadataTable> metadatas = node.getMetadataList();
if ((metadatas!=null) && (metadatas.size() > 0)) {
IMetadataTable metadata = metadatas.get(0);
if (metadata != null) {
List<IMetadataColumn> columns = metadata.getListColumns();
int nbColumns = columns.size();
for (int i = 0; i < nbColumns; i++ ) {
IMetadataColumn column = columns.get(i);
String columnName = column.getLabel();
String typeToGenerate = "string";
if("id_Float".equals(column.getTalendType()) || "id_Double".equals(column.getTalendType())) {
typeToGenerate = "float";
}else if("id_Integer".equals(column.getTalendType()) || "id_Long".equals(column.getTalendType()) || "id_Short".equals(column.getTalendType())) {
typeToGenerate = "integer";
} else if("id_Character".equals(column.getTalendType())) {
typeToGenerate = "string";
} else if("id_BigDecimal".equals(column.getTalendType())) {
typeToGenerate = "numeric";
} else if("id_Boolean".equals(column.getTalendType())) {
typeToGenerate = "boolean";
} else if("id_Date".equals(column.getTalendType())) {
String pattern = column.getPattern();
if(pattern.length() == 12 || pattern.isEmpty() || "\"\"".equals(pattern)) {
typeToGenerate = "date";
}else if(pattern.length() > 12){
typeToGenerate = "timestamp";
}else{
typeToGenerate = "string";
}
}
%>
<%
String modeType = null;
if (!column.isNullable()) {
modeType = "REQUIRED";
} else {
modeType = "NULLABLE";
if (<%=ElementParameterParser.getBooleanValue(node, "__CREATE_TABLE_IF_NOT_EXIST__")%>) {
com.google.api.services.bigquery.model.TableSchema schema_<%=cid%> = new com.google.api.services.bigquery.model.TableSchema();
<%
if(isLog4jEnabled){
%>
log.info("<%=cid%> - Table field schema:");
<%
}
%>
java.util.List<com.google.api.services.bigquery.model.TableFieldSchema> fields_<%=cid%> = new java.util.ArrayList<com.google.api.services.bigquery.model.TableFieldSchema>();
<%
List<IMetadataTable> metadatas = node.getMetadataList();
if ((metadatas!=null) && (metadatas.size() > 0)) {
IMetadataTable metadata = metadatas.get(0);
if (metadata != null) {
List<IMetadataColumn> columns = metadata.getListColumns();
int nbColumns = columns.size();
for (int i = 0; i < nbColumns; i++ ) {
IMetadataColumn column = columns.get(i);
String columnName = column.getLabel();
String typeToGenerate = "string";
if("id_Float".equals(column.getTalendType()) || "id_Double".equals(column.getTalendType())) {
typeToGenerate = "float";
}else if("id_Integer".equals(column.getTalendType()) || "id_Long".equals(column.getTalendType()) || "id_Short".equals(column.getTalendType())) {
typeToGenerate = "integer";
} else if("id_Character".equals(column.getTalendType())) {
typeToGenerate = "string";
} else if("id_BigDecimal".equals(column.getTalendType())) {
typeToGenerate = "numeric";
} else if("id_Boolean".equals(column.getTalendType())) {
typeToGenerate = "boolean";
} else if("id_Date".equals(column.getTalendType())) {
String pattern = column.getPattern();
if(pattern.length() == 12 || pattern.isEmpty() || "\"\"".equals(pattern)) {
typeToGenerate = "date";
}else if(pattern.length() > 12){
typeToGenerate = "timestamp";
}else{
typeToGenerate = "string";
}
}
%>
<%
String modeType = null;
if (!column.isNullable()) {
modeType = "REQUIRED";
} else {
modeType = "NULLABLE";
}
%>
com.google.api.services.bigquery.model.TableFieldSchema <%=columnName%>_<%=cid%> = new com.google.api.services.bigquery.model.TableFieldSchema();
<%=columnName%>_<%=cid%>.setName("<%=columnName%>");
<%=columnName%>_<%=cid%>.setType("<%=typeToGenerate%>");
<%=columnName%>_<%=cid%>.setMode("<%=modeType%>");
fields_<%=cid%>.add(<%=columnName%>_<%=cid%>);
<%
if(isLog4jEnabled){
%>
log.debug("<%=cid%> - Field index[<%=i%>] {\"name\":\"<%=columnName%>\",\"type\":\"<%=typeToGenerate%>\",\"mode\":\"<%=modeType%>\"}");
<%
}
%>
com.google.api.services.bigquery.model.TableFieldSchema <%=columnName%>_<%=cid%> = new com.google.api.services.bigquery.model.TableFieldSchema();
<%=columnName%>_<%=cid%>.setName("<%=columnName%>");
<%=columnName%>_<%=cid%>.setType("<%=typeToGenerate%>");
<%=columnName%>_<%=cid%>.setMode("<%=modeType%>");
fields_<%=cid%>.add(<%=columnName%>_<%=cid%>);
<%
if(isLog4jEnabled){
%>
log.debug("<%=cid%> - Field index[<%=i%>] {\"name\":\"<%=columnName%>\",\"type\":\"<%=typeToGenerate%>\",\"mode\":\"<%=modeType%>\"}");
<%
}
}
}
}
%>
%>
schema_<%=cid%>.setFields(fields_<%=cid%>);
queryLoad_<%=cid%>.setSchema(schema_<%=cid%>);
schema_<%=cid%>.setFields(fields_<%=cid%>);
queryLoad_<%=cid%>.setSchema(schema_<%=cid%>);
}
<%
if("true".equals(ElementParameterParser.getValue(node, "__CREATE_TABLE_IF_NOT_EXIST__"))) {
%>

View File

@@ -143,7 +143,10 @@
<CODEGENERATION>
<IMPORTS>
<IMPORT NAME="filecopy" MODULE="filecopy.jar" MVN="mvn:org.talend.libraries/filecopy/6.0.0" REQUIRED="true" />
<IMPORT NAME="filecopy" MODULE="filecopy-2.0.0.jar"
MVN="mvn:org.talend.libraries/filecopy/2.0.0"
UrlPath="platform:/plugin/org.talend.libraries.custom/lib/filecopy-2.0.0.jar"
REQUIRED="true" />
</IMPORTS>
</CODEGENERATION>

View File

@@ -253,6 +253,9 @@ if ("http".equals(protocol) || "https".equals(protocol)) {
client_<%=cid %>.getState().setProxyCredentials(
new org.apache.commons.httpclient.auth.AuthScope(<%=proxyHost %>, Integer.parseInt(<%=proxyPort%>), null),
new org.apache.commons.httpclient.UsernamePasswordCredentials(<%=proxyUser %>, decryptedPassword_<%=cid%>));
java.util.List<String> authPrefs_<%=cid %> = java.util.Collections.singletonList(org.apache.commons.httpclient.auth.AuthPolicy.BASIC);
client_<%=cid %>.getParams().setParameter(org.apache.commons.httpclient.auth.AuthPolicy.AUTH_SCHEME_PRIORITY, authPrefs_<%=cid %>);
<%}
}

View File

@@ -72,7 +72,7 @@
SHOW="false"
REPOSITORY_VALUE="TYPE"
>
<DEFAULT>PostgreSQL</DEFAULT>
<DEFAULT>PostgresPlus</DEFAULT>
</PARAMETER>
<PARAMETER

View File

@@ -89,7 +89,7 @@
SHOW="false"
REPOSITORY_VALUE="TYPE"
>
<DEFAULT>PostgreSQL</DEFAULT>
<DEFAULT>PostgresPlus</DEFAULT>
</PARAMETER>
<PARAMETER

View File

@@ -115,7 +115,7 @@
SHOW="false"
REPOSITORY_VALUE="TYPE"
>
<DEFAULT>PostgreSQL</DEFAULT>
<DEFAULT>PostgresPlus</DEFAULT>
</PARAMETER>
<PARAMETER

View File

@@ -8322,15 +8322,13 @@
context="plugin:org.talend.libraries.apache"
id="logback-classic-1.0.9.jar"
mvn_uri="mvn:org.talend.libraries/logback-classic-1.0.9/6.0.0"
name="logback-classic-1.0.9.jar"
uripath="platform:/plugin/org.talend.libraries.apache/lib/logback-classic-1.0.9.jar">
name="logback-classic-1.0.9.jar">
</libraryNeeded>
<libraryNeeded
context="plugin:org.talend.libraries.apache"
id="logback-core-1.0.9.jar"
mvn_uri="mvn:org.talend.libraries/logback-core-1.0.9/6.0.0"
name="logback-core-1.0.9.jar"
uripath="platform:/plugin/org.talend.libraries.apache/lib/logback-core-1.0.9.jar">
name="logback-core-1.0.9.jar">
</libraryNeeded>
<libraryNeeded
context="plugin:"

View File

@@ -14,6 +14,7 @@ package org.talend.designer.core.ui.editor.properties.controllers;
import java.beans.PropertyChangeEvent;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@@ -76,6 +77,7 @@ import org.talend.designer.rowgenerator.data.Function;
import org.talend.utils.json.JSONArray;
import org.talend.utils.json.JSONException;
import org.talend.utils.json.JSONObject;
import org.talend.utils.sql.TalendTypeConvert;
import orgomg.cwm.objectmodel.core.TaggedValue;
@@ -112,6 +114,12 @@ public class ColumnListController extends AbstractElementPropertySectionControll
*/
private static final String FILTER_PREFIX_CUSTOM = "CUSTOM_COLUMNS:"; //$NON-NLS-1$
/**
* Indicate you want to filter columns by data type
* DATA_TYPE:Date,String will only keep Date and String columns
*/
private static final String FILTER_DATA_TYPE = "DATA_TYPE:"; //$NON-NLS-1$
private static Logger log = Logger.getLogger(ColumnListController.class);
private boolean updateColumnListFlag;
@@ -778,9 +786,15 @@ public class ColumnListController extends AbstractElementPropertySectionControll
}
boolean unlimited = !onlyFilterCustom && !onlyFilterNoneCustom;
boolean hasReg = false;
boolean hasDataTypeFilter = false;
List<String> datatypeNameList = null;
if (filter.startsWith(FILTER_PREFIX_REGEXP)) {
filter = filter.substring(FILTER_PREFIX_REGEXP.length());
hasReg = true;
} else if (filter.startsWith(FILTER_DATA_TYPE)) {
filter = filter.substring(FILTER_DATA_TYPE.length());
hasDataTypeFilter = true;
datatypeNameList = Arrays.asList(filter.split(FILTER_SEPARATOR));
}
boolean filterAll = false;
if (filter.equals(FILTER_ALL)) {
@@ -798,6 +812,17 @@ public class ColumnListController extends AbstractElementPropertySectionControll
columnValueList = (String[]) ArrayUtils.removeElement(columnValueList, colName);
}
}
} else if (hasDataTypeFilter && datatypeNameList != null) {
IMetadataTable metadataTable = getMetadataTable(param.getElement(), param.getContext());
for (String colName : tmpColumnNameList) {
IMetadataColumn metadataColumn = metadataTable.getColumn(colName);
String dataType = metadataColumn.getTalendType();
if (!(datatypeNameList.contains(dataType)
|| datatypeNameList.contains(TalendTypeConvert.convertToJavaType(dataType)))) {
columnNameList = (String[]) ArrayUtils.removeElement(columnNameList, colName);
columnValueList = (String[]) ArrayUtils.removeElement(columnValueList, colName);
}
}
} else {
if (filterAll) {
for (String colName : tmpColumnNameList) {
@@ -923,28 +948,7 @@ public class ColumnListController extends AbstractElementPropertySectionControll
private static List<String> getColumnList(IElement element, String context, Map<String, Boolean> customColMap) {
List<String> columnList = new ArrayList<String>();
IMetadataTable table = null;
if (element instanceof INode) {
table = ((INode) element).getMetadataFromConnector(context);
if (table == null) {
List<IMetadataTable> tableList = ((INode) element).getMetadataList();
if (tableList.size() == 1) {
table = tableList.get(0);
} else {
for (IMetadataTable itable : tableList) {
if (itable.getAttachedConnector() != null && !itable.getAttachedConnector().equals("REJECT")) {
table = itable;
break;
}
}
}
// if (tableList.size() > 0) {
// table = tableList.get(0);
// }
}
} else if (element instanceof IConnection) {
table = ((IConnection) element).getMetadataTable();
}
IMetadataTable table = getMetadataTable(element, context);
if (table != null) {
for (IMetadataColumn column : table.getListColumns()) {
@@ -968,6 +972,31 @@ public class ColumnListController extends AbstractElementPropertySectionControll
return columnList;
}
private static IMetadataTable getMetadataTable(IElement element, String context) {
IMetadataTable table = null;
if (element instanceof INode) {
table = ((INode) element).getMetadataFromConnector(context);
if (table == null) {
List<IMetadataTable> tableList = ((INode) element).getMetadataList();
if (tableList.size() == 1) {
table = tableList.get(0);
} else {
for (IMetadataTable itable : tableList) {
if (itable.getAttachedConnector() != null && !itable.getAttachedConnector().equals("REJECT")) {
table = itable;
break;
}
}
}
}
} else if (element instanceof IConnection) {
table = ((IConnection) element).getMetadataTable();
}
return table;
}
private static List<String> getPrevColumnList(INode node, Map<String, Boolean> customColMap) {
List<String> columnList = new ArrayList<String>();

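A standalone sketch of the DATA_TYPE filter semantics described by the FILTER_DATA_TYPE comment in the ColumnListController diff above: a column is kept when its Talend type, or the corresponding Java type name, appears in the comma-separated list after the DATA_TYPE: prefix. The column names, types, and the id_ stripping used here as a stand-in for TalendTypeConvert.convertToJavaType are illustrative assumptions.

import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DataTypeFilterSketch {
    public static void main(String[] args) {
        // Hypothetical column -> Talend type mapping.
        Map<String, String> columns = new LinkedHashMap<>();
        columns.put("id", "id_Integer");
        columns.put("name", "id_String");
        columns.put("created", "id_Date");

        String filter = "DATA_TYPE:Date,String"; // same syntax as in the FILTER_DATA_TYPE comment
        if (filter.startsWith("DATA_TYPE:")) {
            List<String> wanted = Arrays.asList(filter.substring("DATA_TYPE:".length()).split(","));
            // Keep a column when either its Talend type or its simplified Java type name matches.
            columns.entrySet().removeIf(e ->
                    !(wanted.contains(e.getValue()) || wanted.contains(e.getValue().replace("id_", ""))));
        }
        System.out.println(columns.keySet()); // [name, created]
    }
}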
View File

@@ -384,7 +384,6 @@ public class UpdateDetectionDialog extends SelectionDialog {
tree.setLayoutData(new GridData(GridData.FILL_BOTH));
addViewerListener();
createColumns(tree);
helper.selectAll(true);
return composite;
}

View File

@@ -450,7 +450,7 @@ public final class UpdateManagerUtils {
} else {
((ProcessItem) item).setProcess(processType);
}
factory.save(item);
factory.save(item, true);
} catch (IOException e) {
ExceptionHandler.process(e);
} catch (PersistenceException e) {

View File

@@ -61,4 +61,7 @@ NameSpaceDialog.prefixInvalid=Prefix value is invalid\!
Schema2XMLDragAndDropHandler.HasChildrenWarning=has element children, can not have linker.
Schema2XMLDragAndDropHandler.IsRootWarning=is root, can not have linker.
Schema2XMLDragAndDropHandler.IsNotElementWarning=isn't a Element, can not create sub-elements or attributes.
Schema2XMLDragAndDropHandler.IsNotElementWarning=isn't a Element, can not create sub-elements or attributes.
ImportTreeFromXMLAction.ImportSchemaNotExistError=Required reference schema files are missing.
ImportTreeFromXMLAction.schemaFileNotExistDetailTitle=The following files do not exist:

View File

@@ -14,6 +14,7 @@ package org.talend.designer.fileoutputxml.action;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import org.eclipse.jface.dialogs.IDialogConstants;
import org.eclipse.jface.viewers.IStructuredSelection;
@@ -24,9 +25,11 @@ import org.eclipse.ui.actions.SelectionProviderAction;
import org.eclipse.xsd.XSDSchema;
import org.talend.commons.runtime.xml.XmlUtil;
import org.talend.commons.ui.runtime.exception.ExceptionHandler;
import org.talend.commons.ui.swt.dialogs.ErrorDialogWidthDetailArea;
import org.talend.datatools.xml.utils.ATreeNode;
import org.talend.datatools.xml.utils.SchemaPopulationUtil;
import org.talend.datatools.xml.utils.XSDPopulationUtil2;
import org.talend.designer.fileoutputxml.i18n.Messages;
import org.talend.designer.fileoutputxml.ui.FOXUI;
import org.talend.metadata.managment.ui.dialog.RootNodeSelectDialog;
import org.talend.metadata.managment.ui.wizard.metadata.xml.node.Attribute;
@@ -43,6 +46,8 @@ import org.talend.metadata.managment.ui.wizard.metadata.xml.utils.TreeUtil;
*/
public class ImportTreeFromXMLAction extends SelectionProviderAction {
private static final String LINEFEED = "\n";//$NON-NLS-1$
// the xml viewer, see FOXUI.
private TreeViewer xmlViewer;
@@ -172,19 +177,32 @@ public class ImportTreeFromXMLAction extends SelectionProviderAction {
try {
if (XmlUtil.isXSDFile(filePath)) {
XSDSchema xsdSchema = TreeUtil.getXSDSchema(filePath);
List<ATreeNode> list = new XSDPopulationUtil2().getAllRootNodes(xsdSchema);
if (list.size() > 1) {
RootNodeSelectDialog dialog = new RootNodeSelectDialog(xmlViewer.getControl().getShell(), list);
if (dialog.open() == IDialogConstants.OK_ID) {
ATreeNode selectedNode = dialog.getSelectedNode();
newInput = TreeUtil.getFoxTreeNodesByRootNode(xsdSchema, selectedNode);
changed = true;
} else {
changed = false;
// check if there have some (<xs:import>) import reference schema xsd file don't exist
Set<String> notExistImportSchema = TreeUtil.getNotExistImportSchema(filePath, xsdSchema);
if (!notExistImportSchema.isEmpty()) {
StringBuffer detail = new StringBuffer();
detail.append(Messages.getString("ImportTreeFromXMLAction.schemaFileNotExistDetailTitle")).append(LINEFEED);//$NON-NLS-1$
for (String xsdfilePath : notExistImportSchema) {
detail.append(xsdfilePath).append(LINEFEED);
}
new ErrorDialogWidthDetailArea(xmlViewer.getControl().getShell(), Messages.PLUGIN_ID,
Messages.getString("ImportTreeFromXMLAction.ImportSchemaNotExistError"), detail.toString());//$NON-NLS-1$
}else {
List<ATreeNode> list = new XSDPopulationUtil2().getAllRootNodes(xsdSchema);
if (list.size() > 1) {
RootNodeSelectDialog dialog = new RootNodeSelectDialog(xmlViewer.getControl().getShell(), list);
if (dialog.open() == IDialogConstants.OK_ID) {
ATreeNode selectedNode = dialog.getSelectedNode();
newInput = TreeUtil.getFoxTreeNodesByRootNode(xsdSchema, selectedNode);
changed = true;
} else {
changed = false;
}
} else {
newInput = TreeUtil.getFoxTreeNodesByRootNode(xsdSchema, list.get(0));
changed = true;
}
} else {
newInput = TreeUtil.getFoxTreeNodesByRootNode(xsdSchema, list.get(0));
changed = true;
}
} else {
newInput = treeNodeAdapt(filePath);

View File

@@ -26,7 +26,7 @@ public class Messages extends MessagesCore {
private static final String BUNDLE_NAME = "messages"; //$NON-NLS-1$
private static final String PLUGIN_ID = "org.talend.designer.fileoutputxml"; //$NON-NLS-1$
public static final String PLUGIN_ID = "org.talend.designer.fileoutputxml"; //$NON-NLS-1$
private static final ResourceBundle RESOURCE_BUNDLE = ResourceBundle.getBundle(BUNDLE_NAME);

View File

@@ -1180,7 +1180,12 @@ public class JavaProcessor extends AbstractJavaProcessor implements IJavaBreakpo
list.addAll(Arrays.asList(cmd2));
return list.toArray(new String[0]);
} else {
return cmd2;
List<String> asList = convertArgsToList(cmd2);
if ((!isExternalUse() && isStandardJob()) || isGuessSchemaJob(property)) {
String localM2Path = "-Dtalend.component.manager.m2.repository=\"" + PomUtil.getLocalRepositoryPath() + "\""; //$NON-NLS-1$ //$NON-NLS-2$
asList.add(3, localM2Path);
}
return asList.toArray(new String[0]);
}
}

View File

@@ -58,6 +58,11 @@
<artifactId>talendzip</artifactId>
<version>1.0-20190527</version>
</artifactItem>
<artifactItem>
<groupId>org.talend.libraries</groupId>
<artifactId>filecopy</artifactId>
<version>2.0.0</version>
</artifactItem>
</artifactItems>
</configuration>
</execution>

View File

@@ -1,5 +1,5 @@
Talend Open Studio for Data Integration
Copyright (c) 2006 - 2017 Talend Inc. - www.talend.com
Copyright (c) 2006-2019 Talend Inc. - www.talend.com
All rights reserved.
@@ -21,565 +21,565 @@ The Eclipse Foundation (http://www.eclipse.org).
Licensed under the Eclipse Public License - v1.0
This product includes software developed at
This product includes software developed at
ASM Helper Minidev.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
AWS SDK Java (https://github.com/aws/aws-sdk-java).
Licensed under Apache-2.0
© 2016, Amazon Web Services
(C) 2016, Amazon Web Services
This product includes software developed at
This product includes software developed at
Amazon Aws Libraries.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Amazon SDK for Java (https://aws.amazon.com/sdkforjava).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Amazon-S3 (https://github.com/aws/aws-sdk-java/blob/master/LICENSE.txt).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
AtInject (https://code.google.com/p/atinject/).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Avro MapReduce.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Box Java SDK (V2) (https://github.com/box/box-java-sdk-v2/blob/master/LICENSE).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
CSV Tools.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Castor.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Ehcache.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Ezmorph.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Google APIs Client Library for Java.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Google Gson (https://code.google.com/p/google-gson/).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Groovy.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Guava (https://github.com/google/guava).
Licensed under Apache-2.0
Copyright (C) 2010 The Guava Authors
This product includes software developed at
This product includes software developed at
Guava (https://github.com/google/guava/blob/master/COPYING).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Guava: Google Core Libraries for Java (https://code.google.com/p/guava-libraries/).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Guava: Google Core Libraries for Java 1.6+ (https://code.google.com/p/guava-libraries/).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Guava: Google Core Libraries for Java 1.6+.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Hadoop Libraries.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Hadoop Libraries (Retrieved from CDH3u0. LICENSE.txt explains that Zookeeper is distributed under Apache License 2.0).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Ini4j.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
JClouds.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Jackcess (http://jackcess.sourceforge.net/).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Jackson.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Jackson JSON Processor (http://wiki.fasterxml.com/JacksonLicensing).
Licensed under Apache-2.0
Copyright © 2012-2013 FasterXML.
Copyright (C) 2012-2013 FasterXML.
This product includes software developed at
This product includes software developed at
Jackson JSON processor.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Jackson Java JSON-processor.
Licensed under Apache-2.0
Copyright ©2009 FasterXML, LLC
This product includes software developed at
This product includes software developed at
Jackson Java JSON-processor.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Jackson Libraries.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Jasypt : Java Simplified Encryption.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
JetS3t.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Jettison.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Joda Time.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Joda-Time (http://www.joda.org/joda-time/).
Licensed under Apache-2.0
Copyright ©2002-2016 Joda.org
This product includes software developed at
This product includes software developed at
Joda-Time (http://www.joda.org/joda-time/).
Licensed under Apache-2.0
Copyright ©2002-2015 Joda.org
This product includes software developed at
This product includes software developed at
Joda-Time.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Json Simple (https://code.google.com/p/json-simple/).
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
Json-Simple (https://github.com/fangyidong/json-simple).
Licensed under Apache-2.0
Yidong Fang
Chris Nokleberg
Dave Hughes
This product includes software developed at
This product includes software developed at
Lucene Core.
Licensed under Apache-2.0
This product includes software developed at
This product includes software developed at
MarkLogic Java Client API.
Licensed under Apache-2.0
Copyright 2012-2015 MarkLogic Corporation
This product includes software developed at
This product includes software developed at
Microsoft Azure SDK for Java (https://github.com/Azure/azure-sdk-for-java).
Licensed under Apache-2.0
This product includes software developed at
OpenSAML (https://wiki.shibboleth.net/confluence/display/OpenSAML/Home/).
Licensed under Apache-2.0
This product includes software developed at
Plexus (https://codehaus-plexus.github.io/index.html).
Licensed under Apache-2.0
This product includes software developed at
Resty : A simple HTTP REST client for Java.
Licensed under Apache-2.0
This product includes software developed at
Rocoto (http://99soft.github.io/rocoto/index.html).
Licensed under Apache-2.0
This product includes software developed at
Sisu Guice (https://github.com/sonatype/sisu-guice).
Licensed under Apache-2.0
Copyright (c) 2006 Google, Inc. All rights reserved.
This product includes software developed at
Sonatype Plexus (https://github.com/sonatype?utf8=%E2%9C%93&query=plexus).
Licensed under Apache-2.0
Copyright The Codehaus Foundation.
This product includes software developed at
Spring.
Licensed under Apache-2.0
Pivotal Software Inc. / The original author or authors
This product includes software developed at
SshJ (https://github.com/shikhar/sshj/blob/master/LICENSE).
Licensed under Apache-2.0
This product includes software developed at
StAX API.
Licensed under Apache-2.0
This product includes software developed at
StAXON - JSON via StAX (https://github.com/beckchr/staxon/).
Licensed under Apache-2.0
This product includes software developed at
Talend ESB Libraries.
Licensed under Apache-2.0
Talend
This product includes software developed at
WSS4J.
Licensed under Apache-2.0
This product includes software developed at
Woden.
Licensed under Apache-2.0
This product includes software developed at
Woodstox.
Licensed under Apache-2.0
This product includes software developed at
XMLSchema.
Licensed under Apache-2.0
This product includes software developed at
Xerces2 Java Parser.
Licensed under Apache-2.0
This product includes software developed at
XmlBeans.
Licensed under Apache-2.0
This product includes software developed at
XmlSchema Core.
Licensed under Apache-2.0
This product includes software developed at
Zip4J.
Licensed under Apache-2.0
This product includes software developed at
google-gson.
Licensed under Apache-2.0
This product includes software developed at
google-guice (https://code.google.com/p/google-guice/).
Licensed under Apache-2.0
This product includes software developed at
json-path (https://code.google.com/p/json-path/).
Licensed under Apache-2.0
This product includes software developed at
json-smart.
Licensed under Apache-2.0
This product includes software developed at
Jetty (http://www.eclipse.org/jetty/).
Licensed under Apache-2.0;EPL-1.0
Copyright © 2016 The Eclipse Foundation.
Copyright (C) 2016 The Eclipse Foundation.
This product includes software developed at
Jetty.
Licensed under Apache-2.0;EPL-1.0
This product includes software developed at
Jetty (http://www.eclipse.org/jetty/licenses.php).
Licensed under Apache-2.0;EPL-1.0
This product includes software developed at
Jackson Java JSON-processor.
Licensed under Apache-2.0;LGPL-2.1
This product includes software developed at
Woodstox : High-performance XML processor (http://woodstox.codehaus.org/ http://woodstox.codehaus.org/Download).
Licensed under Apache-2.0;LGPL-2.1
This product includes software developed at
Woodstox : High-performance XML processor (http://woodstox.codehaus.org).
Licensed under Apache-2.0;LGPL-2.1
This product includes software developed at
Cryptacular.
Licensed under Apache-2.0;LGPL-3.0
This product includes software developed at
SqliteJDBC.
Licensed under BSD-2-Clause
This product includes software developed at
ASM.
Licensed under BSD-3-Clause
INRIA
This product includes software developed at
AntlR.
Licensed under BSD-3-Clause
This product includes software developed at
Antlr 3 Runtime.
Licensed under BSD-3-Clause
This product includes software developed at
Force.com Web Service Connector (WSC).
Licensed under BSD-3-Clause
This product includes software developed at
Ganymed SSH-2 for Java.
Licensed under BSD-3-Clause
This product includes software developed at
HsqlDB.
Licensed under BSD-3-Clause
This product includes software developed at
Jaxen.
Licensed under BSD-3-Clause
This product includes software developed at
Paraccel JDBC Driver.
Licensed under BSD-3-Clause
This product includes software developed at
PostgreSQL JDBC Driver.
Licensed under BSD-3-Clause
This product includes software developed at
Salesforce.com (http://www.force.com).
Licensed under BSD-3-Clause
Copyright (c) 2005-2013, salesforce.com
This product includes software developed at
Scala (http://www.scala-lang.org/).
Licensed under BSD-3-Clause
Copyright (c) 2002-2016 EPFL; Copyright (c) 2011-2016 Lightbend, Inc. (formerly Typesafe, Inc.)
This product includes software developed at
XStream (http://xstream.codehaus.org/license.html).
Licensed under BSD-3-Clause
This product includes software developed at
XStream.
Licensed under BSD-3-Clause
This product includes software developed at
XStream Core.
Licensed under BSD-3-Clause
This product includes software developed at
jsr-305 (JSR 305: Annotations for Software Defect Detection in Java;https://code.google.com/p/jsr-305/).
Licensed under BSD-3-Clause
This product includes software developed at
Java API for RESTful Services.
Licensed under CDDL-1.1
This product includes software developed at
Java API for RESTful Services (http://grepcode.com/snapshot/repo1.maven.org/maven2/javax.ws.rs/javax.ws.rs-api/2.0-m10).
Licensed under CDDL-1.1
This product includes software developed at
Jaxb.
Licensed under CDDL-1.1;GPL-2.0-with-classpath-exception
This product includes software developed at
WSDL4J.
Licensed under CPL-1.0
This product includes software developed at
Aether (http://www.eclipse.org/aether/).
Licensed under EPL-1.0
Copyright (c) 2010, 2014 Sonatype, Inc.
This product includes software developed at
Eclipse Sisu.
Licensed under EPL-1.0
Copyright (c) 2010, 2015 Sonatype, Inc.
This product includes software developed at
H2 Embedded Database and JDBC Driver (H2 is EPL 1.0 !).
Licensed under EPL-1.0
This product includes software developed at
Jetty.
Licensed under EPL-1.0
Copyright (c) ${copyright-range} Mort Bay Consulting Pty. Ltd.
This product includes software developed at
Mondrian.
Licensed under EPL-1.0
This product includes software developed at
SWTChart.
Licensed under EPL-1.0
This product includes software developed at
Logback.
Licensed under EPL-1.0;LGPL-2.1
This product includes software developed at
Java Json (http://www.json.org/license.html).
Licensed under Json
This product includes software developed at
org-json-java.
Licensed under Json
This product includes software developed at
AOP Alliance (Java/J2EE AOP standards) (http://aopalliance.sourceforge.net/).
Licensed under Public Domain
This product includes software developed at
Simple API for CSS.
Licensed under W3C
This product includes software developed at
BCProv.
Licensed under X11
Copyright (c) 2000 - 2016 The Legion of the Bouncy Castle Inc. (https://www.bouncycastle.org)
This product includes software developed at
BouncyCastle.
Licensed under X11
This product includes software developed at
SL4J.
Licensed under X11
This product includes software developed at
SL4J (http://www.slf4j.org/license.html).
Licensed under X11
This product includes software developed at
SL4J.
Licensed under X11
Copyright (c) 2004-2013 QOS.ch
This product includes software developed at
Simple Logging Facade for Java (http://www.slf4j.org/license.html).
Licensed under X11
This product includes software developed at
Simple Logging Facade for Java.
Licensed under X11
This product includes software developed at
dropbox-sdk-java : Java library for the Dropbox Core API.
Licensed under X11

View File

@@ -1,4 +1,4 @@
Copyright (c) 2006 - 2019 Talend Inc. - www.talend.com
Copyright (c) 2006-2019 Talend Inc. - www.talend.com
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.

View File

@@ -974,9 +974,7 @@ public class RepositoryService implements IRepositoryService, IRepositoryContext
        @Override
        public void run() {
            Shell shell = DisplayUtils.getDefaultShell(false);
            retry.set(askRetryForNetworkIssueInDialog(shell, ex));
            shell.dispose();
            retry.set(askRetryForNetworkIssueInDialog(null, ex));
        }
    });
}

View File

@@ -374,7 +374,7 @@ public class ImportProjectHelper {
while (childrenEnum.hasNext()) {
    Object child = childrenEnum.next();
    if (level < 1) {
        if (provider.isFolder(child)) {
        if (provider.isFolder(child) && !".svnlog".equals(provider.getLabel(child))) { // $NON-NLS-1$
            collectProjectFilesFromProvider(files, provider, child, level + 1, monitor, searchFileName);
        }
    }

View File

@@ -16,6 +16,7 @@ import org.eclipse.jface.viewers.Viewer;
import org.talend.core.model.metadata.MetadataTable;
import org.talend.core.model.metadata.Query;
import org.talend.core.model.repository.ERepositoryObjectType;
import org.talend.repository.model.IRepositoryNode.ENodeType;
import org.talend.repository.model.RepositoryNode;
/**
@@ -56,6 +57,9 @@ public class QueryTypeProcessor extends SingleTypeProcessor {
    if (isCDCConnection(node)) {
        return false;
    }
    if (node.getType() == ENodeType.STABLE_SYSTEM_FOLDER) {
        return false;
    }
    return true;
}

View File

@@ -769,6 +769,7 @@ public class JobJavaScriptOSGIForESBManager extends JobJavaScriptsManager {
Manifest manifest = null;
try {
    manifest = analyzer.calcManifest();
    filterImportPackages(manifest);
} catch (IOException e) {
    throw e;
} catch (Exception e) {
@@ -792,6 +793,38 @@ public class JobJavaScriptOSGIForESBManager extends JobJavaScriptsManager {
    return manifest;
}
private void filterImportPackages(Manifest manifest) {
    // remove import packages which are present in private packages
    List<String> privatePackages = new ArrayList<String>();
    String privatePackagesString = manifest.getMainAttributes().getValue(Analyzer.PRIVATE_PACKAGE);
    if (privatePackagesString != null) {
        String [] packages = privatePackagesString.split(",");
        for (String p : packages) {
            privatePackages.add(p);
        }
    }
    StringBuilder fileterdImportPackage = new StringBuilder();
    String importPackagesString = manifest.getMainAttributes().getValue(Analyzer.IMPORT_PACKAGE);
    if (importPackagesString != null) {
        String [] packages = importPackagesString.split(",");
        for (String p : packages) {
            String importPackage = p.split(";")[0];
            if (!privatePackages.contains(importPackage) || importPackage.startsWith("routines")) {
                fileterdImportPackage.append(p).append(",");
            }
        }
    }
    String str = fileterdImportPackage.toString();
    if (str != null && str.length() > 0 && str.endsWith(",")) {
        str = str.substring(0, str.length() - 1);
    }
    manifest.getMainAttributes().putValue(Analyzer.IMPORT_PACKAGE, str);
}
protected Analyzer createAnalyzer(ExportFileResource libResource, ProcessItem processItem) throws IOException {
    Analyzer analyzer = new Analyzer();

11
pom.xml
View File

@@ -165,6 +165,17 @@
                <version>${tycho.version}</version>
                <extensions>true</extensions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-deploy-plugin</artifactId>
                <version>2.8.2</version>
                <executions>
                    <execution>
                        <id>default-deploy</id>
                        <phase>none</phase>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>