Compare commits

...

152 Commits

Author SHA1 Message Date
hcyi-talend
0011cc14a7 fix(TUP-29072):improve for When renaming link (table) between teltinput
and teltmap, the generated SQL query is not updated.
2020-12-16 16:24:52 +08:00
hcyi-talend
093d78f386 fix(TUP-29072):improve for When renaming link (table) between teltinput
and teltmap, the generated SQL query is not updated.
2020-12-16 15:57:39 +08:00
bhe-talendbj
e80c5726e1 fix(TUP-29693): Fix migrating jarname to maven uri if jar name contai… (#5627)
* fix(TUP-29693): Fix migrating jarname to maven uri if jar name contains context

* fix(TUP-29693): Keep main logic unchanged
2020-12-14 20:54:54 +08:00
Jane Ding
fffd887f39 fix(TUP-28657):Wrong behavior while Job Setting project config to choose (#5597) (#5622)
* fix(TUP-28657):Wrong behavior while Job Setting project config to choose
From Database
https://jira.talendforge.org/browse/TUP-28657

Signed-off-by: jding-tlnd <jding@talend.com>
2020-12-11 11:57:42 +08:00
AlixMetivier
6088e2c2ca fix(TBD-11616): set dataset parameter to true for job already importe… (#5620)
* fix(TBD-11616): set dataset parameter to true for job already imported to 7.3 studios

* refacto

* fix
2020-12-10 17:31:47 +01:00
zshen-talend
1637e6220c Zshen/bugfix/tdq 18791 fix conflict2 (#5619)
* fix(TDQ-18817): support context in Confidence Weight on 7.3 and 7.4

* fix(TDQ-18791): change migration version
2020-12-10 18:09:50 +08:00
AlixMetivier
e96d2af073 fix(TBD-11616): parameterize dataset API utilization (#5562)
* fix(TBD-11616): parameterize dataset API utilization

* apply migration to only 7.3 and 7.4 studios

* fix for future studio versions

* fix bad merge
2020-12-10 09:57:35 +01:00
hcyi
a1e453e79a feat(TUP-25346):tELTTeradataOutput - aliases are mandatory for calculated columns (#5572)
* feat(TUP-25346):tELTTeradataOutput - aliases are mandatory for
calculated columns

* feat(TUP-25346):add junits for calculated columns.

* feat(TUP-25346):NEW RULE for when the output column name is different from the
output db column.

* fix(TUP-29598):[BUG] The "property setting" setting is missing after
migrating data(TUP-25346)

* feat(TUP-25346):NEW RULE when the alias option is checked.

* fix(TUP-29636):[BUG] Studio does not support SQL scripts with alias when
execute update database(TUP-25346)

* feat(TUP-25346):change the junits for the NEW RULE when the alias option
is checked.

* feat(TUP-25346):remove some junits that are no longer needed.

* feat(TUP-25346):format the code

* feat(TUP-25346):format the code
2020-12-10 15:17:08 +08:00
wchen-talend
c8c4a586d0 fix(TESB-31294):move velocity.log to configuration folder (#5586) 2020-12-10 15:00:15 +08:00
Dmytro Sylaiev
89435124fe fix(TDI-45135): Reuse same outputStream when 2 outputDelimited writing to the same file (#5584) 2020-12-09 09:23:50 +02:00
wang wei
ae8e41f36b fix(TDI-45310): trim the label string when logging it to avoid the compiler issue (#5589) 2020-12-09 14:09:22 +08:00
Mike Yan
e6fd8b0614 fix(TESB-30734): Backport from patch 7.2.1 (#5468) 2020-12-08 18:07:03 +08:00
Andrii Medvedenko
f4223f7a2b fix(TBD-11724): manually CPing changes from 7.2, TBD-9864 (#5599)
* fix(TBD-11724): manually CPing changes from 7.2, TBD-9864

* fix(TBD-11724): manually CPing changes from 7.2, TBD-9864
2020-12-08 10:36:26 +01:00
wang wei
64dcafb080 fix(TDI-45331): Exit code for SSH component is not working as expected (#5598) 2020-12-08 16:14:40 +08:00
Dmytro Grygorenko
402fe1ffbc fix(TDI-45006): Hide "LIKE" operator from CRM-2016 On-Premise setup. (#5566)
* fix(TDI-45006): Removed unavailable 'CRM_2018' option so that the whole condition could be evaluated.

* fix(TDI-45006): correct way to fix.
2020-12-08 10:00:00 +02:00
hcyi
02c66a7e93 fix(TUP-29366):[bug] studio can rename alias to an existing name. (#5587)
* fix(TUP-29366):[bug] studio can rename alias to an existing name.

* fix(TUP-29366):[bug] studio can rename alias to an existing name.
2020-12-04 16:57:47 +08:00
sbliu
207d1c9635 fix(TUP-29391) fix problem that "Save As" loses changes in the studio for items such as job and joblet. (#5560) 2020-12-02 15:38:06 +08:00
Jane Ding
a4d0adb671 fix(TUP-29383):DBInput component miss delta lake when use TDI.license (#5525)
* fix(TUP-29383):DBInput component miss delta lake when use TDI.license
https://jira.talendforge.org/browse/TUP-29383

Signed-off-by: jding-tlnd <jding@talend.com>

* fix(TUP-29383):DBInput component miss delta lake when use TDI.license
https://jira.talendforge.org/browse/TUP-29383
Delta lake doesn't support SP, should not list tDBSP

Signed-off-by: jding-tlnd <jding@talend.com>
2020-12-02 14:42:54 +08:00
sbliu
0ce6a06f8e fix(TUP-29224) auto increment the index according to the drag/drop column numbers. (#5555) 2020-12-02 12:00:02 +08:00
hcyi
15cbbf362c fix(TUP-29090):improve since some junits were broken. (#5578) 2020-11-30 20:15:08 +08:00
Oleksandr Zhelezniak
c139b0893d fix(TDI-45089): fix date for excel event mode (#5466)
* use the date pattern from the studio scheme when parsing the date column for event mode
* update version of simpleexcel to [2.5-20201119] (the source code of the lib in master branch)
* apply inter-exchange pattern to avoid date rounding
2020-11-30 13:37:53 +02:00
Richard Lecomte
4c2e419bc0 Rlecomte/tdi 45147 t gs copy target issue (#5502)
* TDI-45147 : tGSCopy issue with path

* TDI-45147 : tGSCopy issue with path

Add checkbox to keep legacy behavior by default

* TDI-45147 : tGSCopy issue with path

Add checkbox to keep legacy behavior by default

* TDI-45147 : tGsCopy issue

Resolved conflicts
2020-11-30 11:18:36 +01:00
Dmytro Grygorenko
adc91e4169 fix(TDI-45162): Rearrange imports in tELTPostgresql* and tELTTeradata components. (#5501)
* fix(TDI-45162): Rearrange imports in tELTPostgresql* and tELTTeradata* components.

* fix(TDI-45162): some corrections after branch conflict resolve.

* fix(TDI-45162): more cleanup.
2020-11-30 11:18:25 +02:00
bhe-talendbj
09607ed581 fix(TUP-29166): Remove decorationfigure (#5549)
* fix(TUP-29166): Remove decorationfigure

* fix(TUP-29166): add back arrows

* fix(TUP-29166): remove drawString

* fix(TUP-29166): remove unused method
2020-11-30 10:58:47 +08:00
pyzhou
0a5b925dc4 Pyzhou/tdi 45160 add encording t file unarchive (#5540)
* fix(TDI-45160):add encoding for tFileUnarchive

* fix error

* typo

* add encoding for pass

* upgrade version

* remove import

* remove useless code

* Keep the old behavior

* correct mvn path
2020-11-30 10:38:20 +08:00
ypiel
414ab39434 feat(TDI-44873) : cve - bump olingo odata to 4.7.1 - maintenance
* feat(TDI-44873) : update odata/olingo to v4.7.1

* feat(TDI-44873) : bump to olingo 4.7.1

* feat(TDI-44873) : fix odata lib names

* feat(TDI-44873) : add version to odata MODULE

* feat(TDI-44873) : add version to odata IMPORT name
2020-11-27 17:27:11 +01:00
Dmytro Grygorenko
a269f74a30 fix(TDI-45224): Review and update tJasper* dependencies. (#5545)
* fix(TDI-45224): Review and update tJasper* dependencies.

* fix(TDI-45224): additional dependencies reviewed and updated
2020-11-27 12:15:18 +02:00
clesaec
530814c490 TDI-45193 : dynamic col on tFileInputPositional (#5511)
* TDI-45193 : dynamic col on tFileInputPositional
2020-11-27 10:35:19 +01:00
hcyi
1da69fb285 fix(TUP-29072):When renaming link (table) between teltinput and teltmap, the generated SQL query is not updated (#5467)
* fix(TUP-29072):When renaming link (table) between teltinput and teltmap,
the generated SQL query is not updated

* fix(TUP-29072):When renaming link (table) between teltinput and teltmap,
the generated SQL query is not updated.

* fix(TUP-29072):add junits

* fix(TUP-29072):add more junits
2020-11-27 17:13:04 +08:00
mbasiuk-talend
3c00488dc8 feat(TDI-44915) access token feature (#5473)
* feat(TDI-44915): integrate Balazs POC

* feat(TDI-44915): implement other BigQuery component with access token

* feat(TDI-44915): update BigQuery bulkexec with new common jet

* feat(TDI-44915): update tGSBucketCreate

* feat(TDI-44915): update tGSBucketDelete

* feat(TDI-44915): update tGSBucketExist

* feat(TDI-44915): update tGSBucketList

* feat(TDI-44915): update connection and close part

* feat(TDI-44915): update tGSCopy

* feat(TDI-44915): update tGSDelete

* feat(TDI-44915): update tGSGet

* feat(TDI-44915): update tGSList

* feat(TDI-44915): update tGSPut

* feat(TDI-44915): use proxy to communicate with GS

* feat(TDI-44915): update code due to PR comments

* feat(TDI-44915): update Input code generation

* feat(TDI-44915): fix tGSPut xml definition

* feat(TDI-44915): update BigQuery Output and generic connection

* feat(TDI-44915): fix _end javajet parts

* feat(TDI-44915): fix bigqueryoutput config mappings

* feat(TDI-44915): fix tGSBucketCreate dependencies

* feat(TDI-44915): fix PR comments
2020-11-27 10:04:14 +02:00
kjwang
17f54191cf Kjwang/feat tup 28891 temp folder (#5465)
TUP-28891:Shared Studio: Check which functions will write data into
folder "temp" of Studio installation folder from code
https://jira.talendforge.org/browse/TUP-28891
2020-11-27 14:47:48 +08:00
kjwang
ed99155812 Fix : TUP-29358 Performance: It takes more than 1 hour to show "Update Detection" after clicking "detect and update all jobs" icon in a special project (#5548)
* Fix : TUP-29358 Performance: It takes more than 1 hour to show "Update
Detection" after clicking "detect and update all jobs" icon in a special
project
https://jira.talendforge.org/browse/TUP-29358
2020-11-27 14:38:28 +08:00
wang wei
2276f4b51a fix(TDI-45227): talendStats_STATSProcess Error about ELTComponents and Stat&Log tables creation (#5537) 2020-11-27 12:02:29 +08:00
hcyi
da5744d1e5 fix(TUP-29090):[7.2.1] Extra plus '+' signs in generated SQL (#5453) 2020-11-25 10:30:46 +08:00
vyu-talend
3c04002b5e fix(TDI-45159):fix bug in xml of azuresynapse. (#5508) 2020-11-24 15:49:24 +08:00
sbliu
8b67961ade fix(TUP-26486): Can't quickly refresh the GUI when switching format.
When creating Azure metadata, quickly refresh the wizard GUI only once when switching format. Also fix the problem that 'Netsuite/tck: in Metadata credentials fields are not changed based on Auth type selected'.
2020-11-24 15:44:01 +08:00
clesaec
ec914f50fe TDI-45161 : tFile input delimited correction (#5510) 2020-11-24 08:19:06 +01:00
bhe-talendbj
9ab7f01201 fix(TUP-29424): fix junit failures related to Parameter change (#5551) 2020-11-23 11:57:26 +08:00
bhe-talendbj
af79e71c25 fix(TUP-29424): OutputSchemaParameterTest (#5546) 2020-11-23 10:09:59 +08:00
Emmanuel GALLOIS
c50e437c59 feat(TCOMP-1761): Support of complete schema definition in Studio (#5270)
* feat(TCOMP-1761): add nestedProperties for metadata
* feat(TCOMP-1761): update configuration.javajet
* feat(TCOMP-1761): change temporary variable name
2020-11-20 14:17:32 +01:00
Zhiwei Xue
ca07dd16cf fix(TUP-29360): Missing log4j2 jar on user routines (#5529)
* fix(TUP-29360): Missing log4j2 jar on user routines

* fix(TUP-29360): fix switch log4j level problem
2020-11-20 17:12:54 +08:00
ovladyka
64794a596c Fix/TDI45204_IncorrectOutlineFortFileFetch (#5532)
Co-authored-by: Oleksandr Vladyka <oleksandr.vladyka@synapse.com>
2020-11-19 14:08:00 +02:00
pyzhou
ff595fd205 fix(TDI-45167):Close pre Workbook for tFileOutputExcel (#5505) 2020-11-18 11:26:19 +08:00
apoltavtsev
c035091f88 fix(TESB-29553) Publishing a route with cTalendJob from Studio and commandline gives different results 2020-11-12 11:06:56 +01:00
SunChaoqun
3d8c28928a TESB-28330:cConfig/Beans - Support customer groupid and artifact name (#5458)
* TESB-28330:cConfig/Beans - Support customer groupid and artifact name
(additional to custom version)

* TESB-30961:[7.3.1 cConfig] The external jar name and mvn uri is changed
when import a route with cConfig of 721
2020-11-12 16:29:39 +08:00
sbliu
54af03d3ef Revert "feat(TUP-26486) quickly refresh the GUI only one time when switch format."
This reverts commit 1adad9fe93.
2020-11-11 16:05:43 +08:00
sbliu
1adad9fe93 feat(TUP-26486) quickly refresh the GUI only one time when switch format. 2020-11-11 15:38:45 +08:00
Jane Ding
a20ea15f4d fix(TUP-28934):JDBC connection created under folder cannot generate (#5495)
all the components for jdbc when dragged to a job.
https://jira.talendforge.org/browse/TUP-28934

Signed-off-by: jding-tlnd <jding@talend.com>
2020-11-11 10:45:23 +08:00
apoltavtsev
1df79674e1 feat(TESB-29949) Pass the data source to a job using a context variable 2020-11-10 15:57:31 +01:00
hcyi
c0fedf4ef4 feat(TUP-25235):tELTTeradataMap : allow to rename the alias. (#5491) 2020-11-10 11:59:47 +08:00
hcyi
f3b45bf229 feat(TUP-25235):tELTTeradataMap : allow to rename the alias. (#5284)
* feat(TUP-25235):tELTTeradataMap : allow to rename the alias.

* feat(TUP-25235):tELTTeradataMap : allow to rename the alias.

* feat(TUP-25235):improve and add some junits for rename alias.

* fix(TUP-25235):update some text messages for tELTTeradataMap : allow to
rename the alias

* fix(TUP-25235):update some text messages for tELTTeradataMap : allow to
rename the alias

* fix(TUP-25235):TUP-29225 [bug]ELTMAP should check alias name validation.

* fix(TUP-25235):TUP-29225 [bug]ELTMAP should check alias name validation

* fix(TUP-25235):TUP-29225 [bug]ELTMAP should check alias name validation
2020-11-10 11:20:41 +08:00
wang wei
243a1f3326 fix(TDI-44993): tS3Get performance issue using multipart target to EFS/NAS(#5429) 2020-11-09 21:41:12 +08:00
wang wei
4abf245ca1 fix(TDI-44910): Add memsql support in studio (#5479) 2020-11-09 21:38:42 +08:00
Jane Ding
d71b4b148a Fix junit failure (#5489)
Signed-off-by: jding-tlnd <jding@talend.com>
2020-11-09 19:24:56 +08:00
Chao MENG
45f94c22be feat(TUP-28790): Enhance detection of localhost (#5379)
* feat(TUP-28790): Enhance detection of localhost

https://jira.talendforge.org/browse/TUP-28790

* feat(TUP-28790): Enhance detection of localhost

https://jira.talendforge.org/browse/TUP-28790
2020-11-09 17:31:55 +08:00
chmyga
1c41f0c05d chore(TDI-44004): update CXF and talend-ws version (#5485)
Co-authored-by: Dmytro Chmyga <dmytro.chmyga@synapse.com>
2020-11-09 10:34:51 +02:00
Chao MENG
d70ad09a49 feat(TUP-29126): Github Renaming the default branch from master (#5457)
* feat(TUP-29126): Github Renaming the default branch from master
https://jira.talendforge.org/browse/TUP-29126

* feat(TUP-29126): Github Renaming the default branch from master
https://jira.talendforge.org/browse/TUP-29126
2020-11-09 10:10:45 +08:00
bhe-talendbj
6e440ed726 fix(TUP-29164): Remove duplicated invokes of checkStartNodes (#5454) 2020-11-06 16:44:28 +08:00
kjwang
d8ace9d577 temp commit (#5411)
TUP-28833 Multi-User: support custom javajet component in shared studio
https://jira.talendforge.org/browse/TUP-28833
2020-11-06 15:02:22 +08:00
Jane Ding
8b3bbf7dcb feat(TUP-29076):support memsql in studio metadata and components (#5441)
* feat(TUP-29076):support memsql in studio metadata and components
https://jira.talendforge.org/browse/TUP-29076

Signed-off-by: jding-tlnd <jding@talend.com>

* feat(TUP-29076):support memsql in studio metadata and components
https://jira.talendforge.org/browse/TUP-29076
fix(TUP-29101):[bug] data viewer failed when export to context
https://jira.talendforge.org/browse/TUP-29101
Signed-off-by: jding-tlnd <jding@talend.com>

* feat(TUP-29076):support memsql in studio metadata and components
https://jira.talendforge.org/browse/TUP-29076

Signed-off-by: jding-tlnd <jding@talend.com>
2020-11-06 10:52:34 +08:00
pyzhou
0d0a67be0c Pyzhou/tdi 45062 t file copy last modified time 7.3 (#5472)
* fix(TDI-45062):add back last modification time

* add copy
2020-11-05 17:04:50 +08:00
clesaec
3e410b5373 TDI-44941 : ftp put connectors, renames (#5366)
* TDI-44941 : ftp put connectors, renames
2020-11-04 12:17:06 +01:00
pyzhou-talend
aad3651c38 fix(TDI-44924):change UI string 2020-11-04 16:04:59 +08:00
Colm O hEigeartaigh
6515a196e2 TDI-44685 - Close streams after calling getResourceAsStream in generated code (#5065) 2020-11-04 15:58:28 +08:00
mbasiuk-talend
cdc51a076c chore(TDI-44994): bump library version (#5427)
* chore(TDI-44994): bump library version

* chore(TDI-44994): change version to lower version, avoid bumping other libs

* chore(TDI-44994): update correct library
2020-11-03 09:49:45 +02:00
pyzhou
5e327d255d fix(TDI-44622):Correct Bonita mvn URL (#4992)
* fix(TDI-44622):Correct Bonita mvn URL

* revert default version
2020-11-03 14:47:28 +08:00
kjwang
63fffa6d6c Kjwang/fix tup 28050 avoid possible actions (#5283)
Fix TUP-28050 Multi-User: Avoid possible actions that are not supported on shared studio
https://jira.talendforge.org/browse/TUP-28050
2020-11-03 11:23:27 +08:00
vyu-talend
9a974a67c0 fix(TDI-45054):fix the protected file error. (#5428)
* fix(TDI-45054):fix the protected file error.

* fix(TDI-45054):update the excel jar in component.

* fix(TDI-45054):update lib version.
2020-11-03 11:17:29 +08:00
bhe-talendbj
2a8a6c074a bugfix(TUP-29131) Replace jar name by maven uri for spark jobs (#5444)
* fix(TUP-29131): Replace jar name by uri for spark job

* fix(TUP-29131): Replace jar name by maven uri for spark jobs
2020-11-02 15:11:42 +08:00
pyzhou
53435511fb fix(TDI-45098):tHttpRequest accept 2XX as success (#5442) 2020-11-02 13:53:21 +08:00
Dmytro Grygorenko
7840e1783e fix(TDI-45023): Update tJasper* components to use Apache POI 4.1.2 (#5426)
* fix(TDI-45023): Update libraries for tJasper* components.

* fix(TDI-45023): fix for HTML report generation.
2020-10-29 15:01:22 +02:00
Jane Ding
48c26b2d21 fix(TUP-28945):For Snowflake connection, the tDBConnection which is (#5407)
* fix(TUP-28945):For Snowflake connection, the tDBConnection which is
displayed in the joblet (in tDBInput) is not retained in the job from
where the joblet is invoked
https://jira.talendforge.org/browse/TUP-28945

Signed-off-by: jding-tlnd <jding@talend.com>

* fix(TUP-28945):For Snowflake connection, the tDBConnection which is
displayed in the joblet (in tDBInput) is not retained in the job from
where the joblet is invoked
https://jira.talendforge.org/browse/TUP-28945

Signed-off-by: jding-tlnd <jding@talend.com>
2020-10-29 20:59:28 +08:00
bhe-talendbj
8027965b4c fix(TUP-28978): Add migration (#5422)
* fix(TUP-28978): Add migration

* fix(TUP-28978): show jar for tELMap, show uri for routine/bean dependencies
2020-10-29 15:55:55 +08:00
qiongli
e9bad7dd32 fix(TDQ-18826)Set required to true for Dynamic jars (#5425) 2020-10-28 18:02:47 +08:00
Dmytro Grygorenko
593a717084 fix(TDI-44087): Update Xstream to 1.4.11.1 version (#5430) 2020-10-28 10:39:29 +02:00
cbadillo1603
fb884595dd fix(TBD-11405): there's no button to browse credential files for tHiveXXX components in DI job (#5333) 2020-10-28 09:10:25 +01:00
clesaec
756ba629ed TDI-44686 : potential NPE (#5164)
* TDI-44686 : potential NPE
2020-10-26 09:59:02 +01:00
Roman
aac496bf8f chore(TDI-44004): update talend-ws lib (#5415)
* chore(TDI-44004): update talend-ws lib

* chore(TDI-44004): update cxf for MicrosoftCRM components
2020-10-26 09:33:28 +02:00
Andrii Medvedenko
7ac98d48a3 fix(TBD-11378): tHiveInput in DI has authentication type for all distributions (#5313)
* fix(TBD-11378): wrong property fix

* another part of the fix
2020-10-23 15:37:34 +03:00
Dmytro Grygorenko
8752a75abd fix(TDI-45015): tFileInputMail - slow attachments retrieval. (#5399)
* fix(TDI-45015): fix for low Input/OutputStreams performance.

* fix(TDI-45015): increase buffer size to align sizes of internal and external buffers.

* fix(TDI-45015): moved flush() out of cycle.
2020-10-23 11:24:33 +03:00
pyzhou
9e85699b85 fix(TDI-44924):tFileFetch should not trust all server by default (#5328)
* fix(TDI-44924):tFileFetch should not trust all server by default

* add migration task

* correct class name

* fix plugin.xml
2020-10-23 10:41:45 +08:00
bhe-talendbj
c09016f435 fix(TUP-28066): remove the lower jar based on gavct (#5409) 2020-10-22 14:17:38 +08:00
bhe-talendbj
028b6681c3 bugfix(TUP-29022) Fix detection warning popup (#5400) (#5401)
* fix(TUP-29022): fix change warning popup

* fix(TUP-29022): show changes warning popup for implicit context and jobsettings
2020-10-21 14:07:00 +08:00
bhe-talendbj
22fbf0ff0b fix(TUP-29022): fix jobsettings jdbc change detector (#5394) 2020-10-20 18:14:28 +08:00
Oleksandr Zhelezniak
ec74ac0f35 fix(TDI-44935): reuse custom query expression (#5349)
* make query expression idempotent in the generated code
* use a variable instead of duplicating the custom query expression
2020-10-20 12:33:53 +03:00
Mike Yan
c5d44ab48b fix(TESB-30713): Beans folder is missing in routines.jar for route (#5391) 2020-10-20 15:43:04 +08:00
Chao MENG
1db9854428 fix(TUP-29012): If I open spark job before di job, on di job run tab I (#5387)
can see spark configuration tab

https://jira.talendforge.org/browse/TUP-29012
2020-10-20 11:31:51 +08:00
Jane Ding
5d1ab30b34 fix(TUP-28952):TOS: NPE when delete items to recycle bin (#5383)
https://jira.talendforge.org/browse/TUP-28952

Signed-off-by: jding-tlnd <jding@talend.com>
2020-10-20 10:15:34 +08:00
Dmytro Sylaiev
3502a8e79a fix(TDI-44896): Get nbline from stdout of gpload (#5304)
* fix(TDI-44896): Get nbline from stdout of gpload

* fix(TDI-44896): Fix the error message for failing get NBLine

* fix(TDI-44896): Fix codegen error

* fix(TDI-44896): Fix log printing 0 despite the real result

* fix(TDI-44897): Add exit code of GPLoad (#5307)
2020-10-19 18:22:46 +03:00
ypiel
f3b91a3cac feat(TDI-44950) : Support oauth 2.0 in mscrm 2016 on-premise (#5300)
* feat(TDI-44950) : Support oauth 2.0 in mscrm 2016 on-premise

* feat(TDI-44950) : some adjustments after meeting + qa

* feat(TDI-44950) : update talend-mscrm version
2020-10-19 14:50:45 +02:00
AlixMetivier
e1bceeea2d fix(TBD-11446): add tFileOutputDelimited to UpdateSeparatorAndEscapeForDatasetAPI migration task (#5374) 2020-10-19 11:24:23 +02:00
cbadillo1603
967a3d94ce fix(TBD-5167): tBigQueryInput project label should same as tBigQueryConfiguration (#5375) 2020-10-16 15:38:21 +02:00
bhe-talendbj
69a5234730 feat(TUP-28342):Save maven url for components with parameter MODULE_LIST (#5278)
* feat(TUP-25246): support custom maven uri for components and dbwizard

* feat(TUP-28342):Save maven url for components with parameter MODULE_LIST

* feat(TUP-28342): Save driver jar as mvnurl for implicit context.

* feat(TUP-25246): Fix necessary UI updates

* feat(TUP-25246): add migration to replace jar name by uri

* feat(TUP-25246): add migration

* feat(TUP-25246): generate poms after migration

* feat(TUP-25246): fix migrate driver jar path

* feat(TUP-25246): Fix context jar path and disable migration and show jar name instead of uri for modulelist

* feat(TUP-25246): fix stats and logs

* feat(TUP-25246): show jar name instead of uri for module table

* feat(TUP-25246): remove quotes

* feat(TUP-25246): fix parse maven uri from context

* feat(TUP-25246): use maven uri of component instead of driver jar path

* feat(TUP-25246): add workaround for components

Co-authored-by: Zhiwei Xue <zwxue@talend.com>
2020-10-16 16:22:59 +08:00
pyzhou
c8f4dd2e6d fix(TDI-44997): fix compile error (#5355) 2020-10-16 14:20:28 +08:00
mbasiuk-talend
a0b0366bcb fix(TDI-44893): update component, and localprovider (#5372)
* fix(TDI-44893): update component, and localprovider

* fix(TDI-44893): use latest soap library
2020-10-16 07:28:01 +03:00
Jane Ding
f24633190e feat(TUP-28640):Improve JDBC database support framework to load (#5368)
supported DBs and components automatically
https://jira.talendforge.org/browse/TUP-28640
2020-10-16 01:37:17 +08:00
jiezhang-tlnd
8e4e04c9cd feat(TUP-28758)Add a warning when we login to a project and if there is (#5295)
migration to do
2020-10-15 15:18:09 +08:00
Liu Xinquan
05a0815778 fix(TDQ-18784) add a new migration task (#5363) 2020-10-15 10:08:01 +08:00
Jane Ding
965c02c58d feat(TUP-28640):Improve JDBC database support framework to load (#5367)
supported DBs and components automatically
https://jira.talendforge.org/browse/TUP-28640
2020-10-15 08:48:14 +08:00
Emmanuel GALLOIS
5eb4100d98 feat(TCOMP-1757): support context in design time actions (#5220)
* feat(TCOMP-1757): support context in design time actions
* feat(TCOMP-1757): add context selection when multiple contexts
2020-10-14 16:53:38 +02:00
vdrokov
b0059fdb12 TESB-30427: Unable to utilize the Runtime datasource in DI Jobs with DB credentials blank (#5271)
Discover jndi source by different keys from the job note.
2020-10-14 16:30:12 +02:00
ypiel
061f3dc431 fix(TDI-44866) : update mscrm lib to 3.4-20200923 2020-10-14 15:40:21 +02:00
Chao MENG
e85afa939c fix(TUP-28612): [TCK] Guess Schema button run the subjob instead of (#5345)
calling discover schema method
https://jira.talendforge.org/browse/TUP-28612
2020-10-14 15:39:40 +08:00
jzhao
cb475f4e5e fix(TDI-45008):tFileOutputDelimited, 'Use OS line separator as row separator when CSV Row separator is set to CR, LF or CRLF' does not work properly (#5364) 2020-10-14 15:37:49 +08:00
Xilai Dai
a76b52f3b9 fix(TESB-30136) javax.activation.UnsupportedDataTypeException (#5329) 2020-10-14 10:21:20 +08:00
bhe-talendbj
4689d45132 fix(TUP-28659): migrate jobs because remove snapshot jars (#5294) 2020-10-14 09:45:15 +08:00
Jane Ding
9b5eccc67c feat(TUP-28640):Improve JDBC database support framework to load (#5287)
* feat(TUP-28640):Improve JDBC database support framework to load
supported DBs and components automatically
https://jira.talendforge.org/browse/TUP-28640

* feat(TUP-28640):Improve JDBC database support framework to load
supported DBs and components automatically
https://jira.talendforge.org/browse/TUP-28640

* feat(TUP-28640):Improve JDBC database support framework to load
supported DBs and components automatically
https://jira.talendforge.org/browse/TUP-28640
fix(TUP-28746):[Bug] after do "save the property to metadata" studio
will throw error logs.
cf: connection infos with quotes

* feat(TUP-28640):Improve JDBC database support framework to load
supported DBs and components automatically
https://jira.talendforge.org/browse/TUP-28640
2020-10-13 16:22:42 +08:00
Jane Ding
53d4b392bc fix(TUP-28618):[Bug] db type dont show Delta in impact page. (#5275)
https://jira.talendforge.org/browse/TUP-28618
2020-10-13 16:20:48 +08:00
Dmytro Grygorenko
a2510f5e2a fix(TDI-44957): tSSH timeouts use (#5303)
* fix(TDI-44957): add timeouts use and configuration.

* fix(TDI-44957): replicate old component's behavior (unlimited session if no timeout specified)

* fix(TDI-44957): set NIO2_READ_TIMEOUT to be infinite too.
2020-10-13 11:01:48 +03:00
Dmytro Grygorenko
539710015b fix(TDI-44839): update POSTGRESQL driver (#5273)
Co-authored-by: wwang-talend <wwang@talend.com>
2020-10-13 10:40:46 +03:00
Mike Yan
a398a56e95 Yyan/feat/tesb 29271 route debug 731 (#5351)
* fix(TESB-29271): Fix NPE for Spark job debug type

* fix(TESB-29271): Hide add breakpoint menu in routelet editor
2020-10-13 15:15:36 +08:00
hzhao-talendbj
ba647bde38 tup-28783 (#5331) 2020-10-13 14:57:05 +08:00
sbliu
3b37b58fd0 TUP-28778 Add 'if' judgment before type conversion. 2020-10-13 10:53:12 +08:00
kjwang
cc07722ebb kjwang/Feat_TUP-27762_new_version_of_ci (#5248)
kjwang/Feat_TUP-27762_new_version_of_ci
2020-10-13 10:50:05 +08:00
hzhao-talendbj
625792e472 tup 27356 (#5279)
* tup 27356

* modify code
2020-10-12 16:09:23 +08:00
mbasiuk-talend
b55e6c1c02 fix(TDI-44893): upgrade libraries, update code, remove old libs (#5229)
* fix(TDI-44893): upgrade libraries, update code, remove old libs

* fix(TDI-44893): update local provider pom with newest version
2020-10-12 10:33:23 +03:00
Andreas Mattes
b43b149ba4 TESB-30623 Ensure routine classes are compiled before the library is created. (#5322) 2020-10-10 10:43:53 +02:00
Mike Yan
918cf4eed5 fix(TESB-29271): Fix NPE for Spark job debug type (#5321) 2020-10-09 23:01:05 +08:00
Mike Yan
5fcaae4e48 fix(TESB-29271): Fix for Spark job debug type (#5320) 2020-10-09 17:48:22 +08:00
Mike Yan
41231a43b6 fix(TESB-29271): Fix for headless issue (#5319) 2020-10-09 10:47:47 +08:00
Mike Yan
03eeaac507 feat(TESB-29271): Route debugging feature (#5280)
* feat(TESB-29271):Initial update for route debugger

* feat(TESB-29271): conditional breakpoint component tab

* feat(TESB-29271): Route debugging feature

* fix(TESB-29271): Cumulative fixes and code improvement

* fix(TESB-29271): Added default view(DI)

* fix(TESB-29271): Code improvements by code review

* feat(TESB-29271): code updated by review
2020-10-07 21:50:35 +08:00
chmyga
ad7316b2ce Dchmyga/tdi 44868 tgsput huge files (#5230)
* fix(TDI-44868): tGSPut with huge files

* Add property to set part size to upload files in parts

* fix(TDI-44868): tGSPut with huge files

* add migration task

* fix(TDI-44868): tGSPut with huge files

* Fix PR comment

* fix(TDI-44868): tGSPut with huge files

* Fix overflow problem

Co-authored-by: Dmytro Chmyga <dmytro.chmyga@synapse.com>
2020-10-05 17:41:49 +03:00
Dmytro Grygorenko
31e27f97e0 fix(TDI-44880): set ERROR_MESSAGE in components. (#5207) 2020-10-05 17:20:34 +03:00
Dmytro Grygorenko
09fd069243 fix(TDI-44883): set value of ERROR_MESSAGE (#5216) 2020-09-29 16:26:04 +03:00
Dmytro Sylaiev
f5ea29812e fix(TDI-44771): Fix error code for prejob in multithread run (#5208) 2020-09-28 09:33:05 +03:00
Jane Ding
5acbcd229e fix(TUP-26156)tCreateTable change DBType and Property Type not work (#4465) (#5276)
* fix(TUP-26156)tCreateTable: change "DBType" and "Property Type" not work
https://jira.talendforge.org/browse/TUP-26156

* fix(TUP-26156)tCreateTable: change "DBType" and "Property Type" not work
https://jira.talendforge.org/browse/TUP-26156

* fix(TUP-26156)tCreateTable change DBType and Property Type not work
https://jira.talendforge.org/browse/TUP-26156
2020-09-26 17:11:27 +08:00
Zhiwei Xue
203187c05f Revert "feat(TUP-28342):Save maven url for components with parameter MODULE_LIST"
This reverts commit 50e7057eb9.
2020-09-25 21:35:06 +08:00
Dmytro Grygorenko
39996404a7 fix(TDI-44568): Cumulative changes for POI update task (all-in-one) (#5269)
* fix(TDI-44568): cumulative commit for all changes

* fix(TDI-44568): Distribution plugin added

* fix(TDI-44568): all latest changes

* fix(TDI-44568): recent changes

* fix(TDI-44568): latest + build fix

* fix(TDI-44568): group name changed, versions updated

* fix(TDI-44568): classpath and manifest files updated
2020-09-25 14:32:39 +03:00
Zhiwei Xue
50e7057eb9 feat(TUP-28342):Save maven url for components with parameter MODULE_LIST 2020-09-25 17:45:08 +08:00
pyzhou
94c39f79ff fix(TDI-44911):tFileInputExcel has malposition header when enabling (#5244)
column
2020-09-25 14:10:51 +08:00
SunChaoqun
a1b85f19c1 Bugfix/maintenance/7.3/tesb 30423 (#5267)
* TESB-30423:Install Patch_20200925_R2020-09_v1-7.3.1 with TDI.license,
studio export job fail

* TESB-30423:Install Patch_20200925_R2020-09_v1-7.3.1 with TDI.license,
studio export job fail
2020-09-24 13:33:18 +08:00
SunChaoqun
66755538e8 TESB-30423:Install Patch_20200925_R2020-09_v1-7.3.1 with TDI.license, (#5259)
studio export job fail
2020-09-23 18:32:09 +08:00
Emmanuel GALLOIS
a2209548ae fix(TCOMP-1772): typo 2020-09-23 11:43:37 +02:00
Emmanuel GALLOIS
d8e604bdb0 fix(TCOMP-1772): fix MEMO_X type in configuration.javajet (#5257) 2020-09-23 09:41:20 +02:00
Jane Ding
e4194eea26 feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE (#5255)
* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
https://jira.talendforge.org/browse/TUP-27654

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
https://jira.talendforge.org/browse/TUP-27654

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
https://jira.talendforge.org/browse/TUP-27654
2020-09-23 14:20:12 +08:00
Dmytro Grygorenko
3a9ea6fafd Revert "fix(TDI-44568): Update TalendExcel library. (#5001)" (#5254)
This reverts commit e8c2b70985.
2020-09-22 17:06:25 +03:00
Dmytro Grygorenko
d71fce6268 Revert "fix(TDI-44568): Update talendMsgMailUtil library (#5040)" (#5253)
This reverts commit 75fb043c1c.
2020-09-22 17:06:18 +03:00
Dmytro Grygorenko
11368d3910 Revert "fix(TDI-44568): Update simpleexcel library (#5041)" (#5252)
This reverts commit 9b70682a48.
2020-09-22 17:06:12 +03:00
Dmytro Grygorenko
d01477ab11 Revert "fix(TDI-44568): Update org.talend.libraries.excel plugin (#5042)" (#5251)
This reverts commit e0b54b7df4.
2020-09-22 17:06:03 +03:00
Dmytro Grygorenko
7900e44e13 Revert "fix(TDI-44568): Update Apache POI version in components (#5043)" (#5250)
This reverts commit 04988ed52b.
2020-09-22 17:05:50 +03:00
Dmytro Grygorenko
04988ed52b fix(TDI-44568): Update Apache POI version in components (#5043)
* fix(TDI-44568): Update Apache POI version in components

* fix(TDI-44568): use custom POI libraries in components

* fix(TDI-44568): "simpleexcel" version updated.

* fix(TDI-44568): more complete version update

* fix(TDI-44568): Jasper components to use previous POI version (could not be updated that easy).
2020-09-22 13:28:11 +03:00
Dmytro Grygorenko
e0b54b7df4 fix(TDI-44568): Update org.talend.libraries.excel plugin (#5042)
* fix(TDI-44568): Talend libraries versions updated.

* fix(TDI-44568): remove unnecessary files.

* fix(TDI-44568): classpath file restored

* fix(TDI-44568): use custom POI libraries

* fix(TDI-44568): Manifest file restored, versions updated.
2020-09-22 13:27:48 +03:00
Dmytro Grygorenko
9b70682a48 fix(TDI-44568): Update simpleexcel library (#5041)
* fix(TDI-44568): Update simpleexcel library

* fix(TDI-44568): Aligning code with "master" branch

* fix(TDI-44568): version updated

* fix(TDI-44568): code alignment reverted, library version changed.
2020-09-22 13:27:32 +03:00
Dmytro Grygorenko
75fb043c1c fix(TDI-44568): Update talendMsgMailUtil library (#5040) 2020-09-22 13:26:16 +03:00
Dmytro Grygorenko
e8c2b70985 fix(TDI-44568): Update TalendExcel library. (#5001)
* fix(TDI-44568): Bump up versions, remove unused dependencies, some refactoring.

* fix(TDI-44568): addressed issue with preWb.

* fix(TDI-44568): GAV update, removed "version = 6.0.0"

* fix(TDI-44568): Version updated
2020-09-22 13:25:56 +03:00
wang wei
e727191389 fix(TDI-44908): pass TMC function jar for trunjob independent process case and dynamic job case (#5235) 2020-09-22 18:14:48 +08:00
Jane Ding
340b59a17e feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE (#5240)
https://jira.talendforge.org/browse/TUP-27654
2020-09-21 19:59:33 +08:00
OleksiiNimych
6a2257f107 feat(TDI-39689): tTeradataTPT support for multiple schemas (#5193)
* feat(TDI-39689): tTeradataTPT support for multiple schemas

* feat(TDI-39689): tTeradataTPTUtility

* feat(TDI-39689): tTeradataTPT add migration task

* feat(TDI-39689): rename new property

* feat(TDI-39689): tTeradataTPT fix migration task
2020-09-21 11:36:14 +03:00
Jane Ding
d4751f4cfb feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE (#5233)
https://jira.talendforge.org/browse/TUP-27654
2020-09-21 14:39:27 +08:00
hcyi
9e9406e2fa fix(TUP-25171):Issue when using components inside Joblet. (#5119)
* fix(TUP-25171):Issue when using components inside Joblet.

* fix(TUP-25171):Issue when using components inside Joblet.

* fix(TUP-25171):code format for Issue when using components inside
Joblet.
2020-09-18 15:55:09 +08:00
Chao MENG
647dbe0676 fix(TUP-26413): Be able to compare metadata connection conflicts (#5222)
https://jira.talendforge.org/browse/TUP-26413
2020-09-18 12:25:18 +08:00
Jane Ding
775e5a59f6 feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE (#5078)
* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
https://jira.talendforge.org/browse/TUP-27654

Signed-off-by: jding-tlnd <jding@talend.com>

* fix(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
https://jira.talendforge.org/browse/TUP-27654

Signed-off-by: jding-tlnd <jding@talend.com>

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
https://jira.talendforge.org/browse/TUP-27654
TUP-28598:[Bug] driver class show as mvn:SparkJDBC42-2.6.14.1018.jar is
wrong.

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
TUP-28615:[Bug] Save the property to metadata will change Delta Lake to
JDBC.
https://jira.talendforge.org/browse/TUP-27654

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
TUP-28610:[Bug] drag Delta Lake component from Palatte to job meet NPE.
https://jira.talendforge.org/browse/TUP-27654

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
https://jira.talendforge.org/browse/TUP-27654
Hard code to filter unsupported components
Set useAutoCommit and autocommit default value as true

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
TUP-28616:[Bug] in stat&logs/extra page, when select Delta Lake DB, it
shows as JDBC.
To fit on support database, delta lake should be filter
https://jira.talendforge.org/browse/TUP-27654

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
set delta lake/jdbc default mapping
https://jira.talendforge.org/browse/TUP-27654

* feat(TUP-27654):Databricks Delta Lake - Support ELT & MERGE
https://jira.talendforge.org/browse/TUP-27654
2020-09-18 12:01:14 +08:00
385 changed files with 11208 additions and 4429 deletions

View File

@@ -91,10 +91,10 @@ if((codePart.equals(ECodePart.END))&&(stat || logstashCurrent)){
String sourceNodeId = source.getUniqueName();
String sourceLabel = ElementParameterParser.getValue(source, "__LABEL__");
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel);
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel.trim());
String targetLabel = ElementParameterParser.getValue(node, "__LABEL__");
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel);
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel.trim());
String sourceNodeComponent = source.getComponent().getName();
for (INode jobStructureCatcher : jobCatcherNodes) {
@@ -125,10 +125,10 @@ if((codePart.equals(ECodePart.END))&&(stat || logstashCurrent)){
String sourceNodeId = source.getUniqueName();
String sourceLabel = ElementParameterParser.getValue(source, "__LABEL__");
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel);
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel.trim());
String targetLabel = ElementParameterParser.getValue(node, "__LABEL__");
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel);
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel.trim());
String sourceNodeComponent = source.getComponent().getName();
for (INode jobStructureCatcher : jobCatcherNodes) {

View File

@@ -197,10 +197,10 @@
String sourceNodeId = source.getUniqueName();
String sourceLabel = ElementParameterParser.getValue(source, "__LABEL__");
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel);
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel.trim());
String targetLabel = ElementParameterParser.getValue(node, "__LABEL__");
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel);
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel.trim());
String sourceNodeComponent = source.getComponent().getName();
%>
@@ -233,10 +233,10 @@
String sourceNodeId = source.getUniqueName();
String sourceLabel = ElementParameterParser.getValue(source, "__LABEL__");
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel);
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel.trim());
String targetLabel = ElementParameterParser.getValue(node, "__LABEL__");
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel);
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel.trim());
String sourceNodeComponent = source.getComponent().getName();
%>
@@ -260,10 +260,10 @@
String sourceNodeId = source.getUniqueName();
String sourceLabel = ElementParameterParser.getValue(source, "__LABEL__");
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel);
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel.trim());
String targetLabel = ElementParameterParser.getValue(node, "__LABEL__");
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel);
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel.trim());
String sourceNodeComponent = source.getComponent().getName();
@@ -296,10 +296,10 @@
String sourceNodeId = source.getUniqueName();
String sourceLabel = ElementParameterParser.getValue(source, "__LABEL__");
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel);
String sourceNodeLabel = ((sourceLabel==null || "__UNIQUE_NAME__".equals(sourceLabel) || sourceLabel.contains("\"")) ? sourceNodeId : sourceLabel.trim());
String targetLabel = ElementParameterParser.getValue(node, "__LABEL__");
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel);
String targetNodeLabel = ((targetLabel==null || "__UNIQUE_NAME__".equals(targetLabel) || targetLabel.contains("\"")) ? node.getUniqueName() : targetLabel.trim());
String sourceNodeComponent = source.getComponent().getName();
@@ -435,7 +435,7 @@
if(logstashCurrent) {
for (INode jobStructureCatcher : jobCatcherNodes) {
String label = ElementParameterParser.getValue(node, "__LABEL__");
String nodeLabel = ((label==null || "__UNIQUE_NAME__".equals(label) || label.contains("\"")) ? node.getUniqueName() : label);
String nodeLabel = ((label==null || "__UNIQUE_NAME__".equals(label) || label.contains("\"")) ? node.getUniqueName() : label.trim());
%>
if(enableLogStash) {
<%=jobStructureCatcher.getUniqueName() %>.addCM("<%=node.getUniqueName()%>", "<%=nodeLabel%>", "<%=node.getComponent().getName()%>");
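
Both template files above receive the same TDI-45310 change: the label read from __LABEL__ is trimmed before it is written into the generated stats/logstash calls, so stray whitespace around a component label can no longer break compilation of the generated job. A minimal, self-contained sketch of the label-resolution rule used above; the class name and the sample labels are hypothetical, not part of the templates.

// Sketch of the label-resolution rule applied in the hunks above (hypothetical values).
public class LabelResolutionSketch {

    static String resolveLabel(String label, String uniqueName) {
        // Fall back to the unique node name when the label is unusable,
        // otherwise trim it so the generated string literal stays valid.
        if (label == null || "__UNIQUE_NAME__".equals(label) || label.contains("\"")) {
            return uniqueName;
        }
        return label.trim();
    }

    public static void main(String[] args) {
        System.out.println(resolveLabel("tFileInputDelimited_1 ", "tFileInputDelimited_1")); // trimmed
        System.out.println(resolveLabel("__UNIQUE_NAME__", "tLogRow_1"));                    // falls back to the unique name
    }
}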

View File

@@ -649,13 +649,15 @@
inContext = <%=className%>.class.getClassLoader().getResourceAsStream("config/contexts/" + contextStr + ".properties");
}
if (inContext != null) {
//defaultProps is in order to keep the original context value
if(context != null && context.isEmpty()) {
try {
//defaultProps is in order to keep the original context value
if(context != null && context.isEmpty()) {
defaultProps.load(inContext);
context = new ContextProperties(defaultProps);
}
} finally {
inContext.close();
}
inContext.close();
} else if (!isDefaultContext) {
//print info and job continue to run, for case: context_param is not empty.
System.err.println("Could not find the context " + contextStr);
@@ -1194,6 +1196,24 @@ if (execStat) {
}
%>
int returnCode = 0;
<%
if (isRunInMultiThread) {
%>
Integer localErrorCode = (Integer)(((java.util.Map)threadLocal.get()).get("errorCode"));
String localStatus = (String)(((java.util.Map)threadLocal.get()).get("status"));
if (localErrorCode != null) {
if (errorCode == null || localErrorCode.compareTo(errorCode) > 0) {
errorCode = localErrorCode;
}
}
if (localStatus != null && !status.equals("failure")){
status = localStatus;
}
<%
}
%>
if(errorCode == null) {
returnCode = status != null && status.equals("failure") ? 1 : 0;
} else {
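
For multi-thread runs, this hunk merges the per-thread "errorCode" and "status" entries stored in threadLocal into the job-level return code. A self-contained sketch of that merge rule; the class name is hypothetical, and the else branch that is cut off above is assumed to return the merged error code.

import java.util.HashMap;
import java.util.Map;

public class ReturnCodeMergeSketch {

    // Mirrors the generated per-thread state: each thread keeps its own result map.
    static final ThreadLocal<Map<String, Object>> threadLocal =
            ThreadLocal.withInitial(HashMap::new);

    static int computeReturnCode(Integer errorCode, String status) {
        Integer localErrorCode = (Integer) threadLocal.get().get("errorCode");
        String localStatus = (String) threadLocal.get().get("status");
        if (localErrorCode != null && (errorCode == null || localErrorCode.compareTo(errorCode) > 0)) {
            errorCode = localErrorCode; // keep the highest error code seen by any thread
        }
        if (localStatus != null && !"failure".equals(status)) {
            status = localStatus; // an existing failure status is never downgraded
        }
        if (errorCode == null) {
            return "failure".equals(status) ? 1 : 0;
        }
        return errorCode;
    }
}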

View File

@@ -60,6 +60,10 @@ if ((metadatas != null) && (metadatas.size() > 0)) { // metadata
org.talend.components.api.component.ComponentDefinition def_<%=cid %> =
new <%= def.getClass().getName()%>();
org.talend.components.api.component.runtime.Writer writer_<%=cid%> = null;
org.talend.components.api.component.runtime.Reader reader_<%=cid%> = null;
<%
List<Component.CodegenPropInfo> propsToProcess = component.getCodegenPropInfos(componentProps);
%>
@@ -240,11 +244,11 @@ if(isTopologyNone) {
if (hasOutputOnly || asInputComponent) {
%>
org.talend.components.api.component.runtime.Source source_<%=cid%> =
(org.talend.components.api.component.runtime.Source)sourceOrSink_<%=cid%>;
org.talend.components.api.component.runtime.Reader reader_<%=cid%> =
source_<%=cid%>.createReader(container_<%=cid%>);
reader_<%=cid%> = new org.talend.codegen.flowvariables.runtime.FlowVariablesReader(reader_<%=cid%>, container_<%=cid%>);
if (sourceOrSink_<%=cid%> instanceof org.talend.components.api.component.runtime.Source) {
org.talend.components.api.component.runtime.Source source_<%=cid%> =
(org.talend.components.api.component.runtime.Source)sourceOrSink_<%=cid%>;
reader_<%=cid%> = source_<%=cid%>.createReader(container_<%=cid%>);
reader_<%=cid%> = new org.talend.codegen.flowvariables.runtime.FlowVariablesReader(reader_<%=cid%>, container_<%=cid%>);
<%
IConnection main = null;
@@ -266,19 +270,19 @@ if (hasOutputOnly || asInputComponent) {
IConnection schemaSourceConnector = (main!=null ? main : reject);
String schemaSourceConnectorName = schemaSourceConnector.getMetadataTable().getAttachedConnector();
%>
boolean multi_output_is_allowed_<%=cid%> = false;
boolean multi_output_is_allowed_<%=cid%> = false;
<% //take care SourceOrSink.validate will change the schema if it contains include-all-fields, so need to get design Avro schema before validate %>
org.talend.components.api.component.Connector c_<%=cid%> = null;
for (org.talend.components.api.component.Connector currentConnector : props_<%=cid %>.getAvailableConnectors(null, true)) {
if (currentConnector.getName().equals("<%=schemaSourceConnectorName%>")) {
c_<%=cid%> = currentConnector;
}
org.talend.components.api.component.Connector c_<%=cid%> = null;
for (org.talend.components.api.component.Connector currentConnector : props_<%=cid %>.getAvailableConnectors(null, true)) {
if (currentConnector.getName().equals("<%=schemaSourceConnectorName%>")) {
c_<%=cid%> = currentConnector;
}
if (currentConnector.getName().equals("REJECT")) {//it's better to move the code to javajet
multi_output_is_allowed_<%=cid%> = true;
if (currentConnector.getName().equals("REJECT")) {//it's better to move the code to javajet
multi_output_is_allowed_<%=cid%> = true;
}
}
}
org.apache.avro.Schema schema_<%=cid%> = props_<%=cid %>.getSchema(c_<%=cid%>, true);
org.apache.avro.Schema schema_<%=cid%> = props_<%=cid %>.getSchema(c_<%=cid%>, true);
<%
irToRow = new IndexedRecordToRowStructGenerator(cid, null, columnList);
@@ -286,117 +290,119 @@ if (hasOutputOnly || asInputComponent) {
}
%>
// Iterate through the incoming data.
boolean available_<%=cid%> = reader_<%=cid%>.start();
// Iterate through the incoming data.
boolean available_<%=cid%> = reader_<%=cid%>.start();
resourceMap.put("reader_<%=cid%>", reader_<%=cid%>);
resourceMap.put("reader_<%=cid%>", reader_<%=cid%>);
for (; available_<%=cid%>; available_<%=cid%> = reader_<%=cid%>.advance()) {
nb_line_<%=cid %>++;
for (; available_<%=cid%>; available_<%=cid%> = reader_<%=cid%>.advance()) {
nb_line_<%=cid %>++;
<%if(hasDataOutput) {%>
if (multi_output_is_allowed_<%=cid%>) {
<%if(main!=null){%>
<%=main.getName()%> = null;
<%}%>
<%if(reject!=null){%>
<%=reject.getName()%> = null;
<%}%>
}
<%}%>
<%if(hasDataOutput) {%>
if (multi_output_is_allowed_<%=cid%>) {
<%if(main!=null){%>
<%=main.getName()%> = null;
<%}%>
<%if(reject!=null){%>
<%=reject.getName()%> = null;
<%}%>
}
<%}%>
try {
Object data_<%=cid%> = reader_<%=cid%>.getCurrent();
<%
if (main != null) {
%>
if(multi_output_is_allowed_<%=cid%>) {
<%=main.getName()%> = new <%=main.getName() %>Struct();
}
<%
irToRow.generateConvertRecord("data_" + cid, main.getName(), main.getMetadataTable().getListColumns());
}
%>
} catch (org.talend.components.api.exception.DataRejectException e_<%=cid%>) {
java.util.Map<String,Object> info_<%=cid%> = e_<%=cid%>.getRejectInfo();
<%
if (reject!=null) {
%>
Object data_<%=cid%> = info_<%=cid%>.get("talend_record");
if (multi_output_is_allowed_<%=cid%>) {
<%=reject.getName()%> = new <%=reject.getName() %>Struct();
}
try{
<%
irToRow.generateConvertRecord("data_" + cid, reject.getName());
%>
}catch(java.lang.Exception e){
// do nothing
}
<%
Set<String> commonColumns = new HashSet<String>();
for (IMetadataColumn column : columnList) {
commonColumns.add(column.getLabel());
}
//pass error columns
List<IMetadataColumn> rejectColumns = reject.getMetadataTable().getListColumns();
for(IMetadataColumn column : rejectColumns) {
String columnName = column.getLabel();
// JavaType javaType = JavaTypesManager.getJavaTypeFromId(column.getTalendType());
String typeToGenerate = JavaTypesManager.getTypeToGenerate(column.getTalendType(), column.isNullable());
//error columns
if(!commonColumns.contains(columnName)) {
%>
<%=reject.getName()%>.<%=columnName%> = (<%=typeToGenerate%>)info_<%=cid%>.get("<%=columnName%>");
<%
}
}
} else {
%>
//TODO use a method instead of getting method by the special key "error/errorMessage"
Object errorMessage_<%=cid%> = null;
if(info_<%=cid%>.containsKey("error")){
errorMessage_<%=cid%> = info_<%=cid%>.get("error");
}else if(info_<%=cid%>.containsKey("errorMessage")){
errorMessage_<%=cid%> = info_<%=cid%>.get("errorMessage");
}else{
errorMessage_<%=cid%> = "Rejected but error message missing";
}
errorMessage_<%=cid%> = "Row "+ nb_line_<%=cid %> + ": "+errorMessage_<%=cid%>;
System.err.println(errorMessage_<%=cid%>);
<%
}
if (main != null) {
%>
// If the record is reject, the main line record should put NULL
<%=main.getName()%> = null;
<%
}
%>
}
try {
Object data_<%=cid%> = reader_<%=cid%>.getCurrent();
<%
if (main != null) {
%>
if(multi_output_is_allowed_<%=cid%>) {
<%=main.getName()%> = new <%=main.getName() %>Struct();
}
<%
irToRow.generateConvertRecord("data_" + cid, main.getName(), main.getMetadataTable().getListColumns());
}
%>
} catch (org.talend.components.api.exception.DataRejectException e_<%=cid%>) {
java.util.Map<String,Object> info_<%=cid%> = e_<%=cid%>.getRejectInfo();
<%
if (reject!=null) {
%>
Object data_<%=cid%> = info_<%=cid%>.get("talend_record");
if (multi_output_is_allowed_<%=cid%>) {
<%=reject.getName()%> = new <%=reject.getName() %>Struct();
}
try{
<%
irToRow.generateConvertRecord("data_" + cid, reject.getName());
%>
}catch(java.lang.Exception e){
// do nothing
}
<%
Set<String> commonColumns = new HashSet<String>();
for (IMetadataColumn column : columnList) {
commonColumns.add(column.getLabel());
}
//pass error columns
List<IMetadataColumn> rejectColumns = reject.getMetadataTable().getListColumns();
for(IMetadataColumn column : rejectColumns) {
String columnName = column.getLabel();
// JavaType javaType = JavaTypesManager.getJavaTypeFromId(column.getTalendType());
String typeToGenerate = JavaTypesManager.getTypeToGenerate(column.getTalendType(), column.isNullable());
//error columns
if(!commonColumns.contains(columnName)) {
%>
<%=reject.getName()%>.<%=columnName%> = (<%=typeToGenerate%>)info_<%=cid%>.get("<%=columnName%>");
<%
}
}
} else {
%>
//TODO use a method instead of getting method by the special key "error/errorMessage"
Object errorMessage_<%=cid%> = null;
if(info_<%=cid%>.containsKey("error")){
errorMessage_<%=cid%> = info_<%=cid%>.get("error");
}else if(info_<%=cid%>.containsKey("errorMessage")){
errorMessage_<%=cid%> = info_<%=cid%>.get("errorMessage");
}else{
errorMessage_<%=cid%> = "Rejected but error message missing";
}
errorMessage_<%=cid%> = "Row "+ nb_line_<%=cid %> + ": "+errorMessage_<%=cid%>;
System.err.println(errorMessage_<%=cid%>);
<%
}
if (main != null) {
%>
// If the record is reject, the main line record should put NULL
<%=main.getName()%> = null;
<%
}
%>
} // end of catch
<%
// The for loop around the incoming records from the reader is left open.
} else if (hasInput) {
%>
org.talend.components.api.component.runtime.Sink sink_<%=cid%> =
(org.talend.components.api.component.runtime.Sink)sourceOrSink_<%=cid%>;
org.talend.components.api.component.runtime.WriteOperation writeOperation_<%=cid%> = sink_<%=cid%>.createWriteOperation();
writeOperation_<%=cid%>.initialize(container_<%=cid%>);
org.talend.components.api.component.runtime.Writer writer_<%=cid%> = writeOperation_<%=cid%>.createWriter(container_<%=cid%>);
writer_<%=cid%>.open("<%=cid%>");
resourceMap.put("writer_<%=cid%>", writer_<%=cid%>);
org.talend.codegen.enforcer.IncomingSchemaEnforcer incomingEnforcer_<%=cid%> = null;
if (sourceOrSink_<%=cid%> instanceof org.talend.components.api.component.runtime.Sink) {
org.talend.components.api.component.runtime.Sink sink_<%=cid%> =
(org.talend.components.api.component.runtime.Sink)sourceOrSink_<%=cid%>;
org.talend.components.api.component.runtime.WriteOperation writeOperation_<%=cid%> = sink_<%=cid%>.createWriteOperation();
writeOperation_<%=cid%>.initialize(container_<%=cid%>);
writer_<%=cid%> = writeOperation_<%=cid%>.createWriter(container_<%=cid%>);
writer_<%=cid%>.open("<%=cid%>");
resourceMap.put("writer_<%=cid%>", writer_<%=cid%>);
} // end of "sourceOrSink_<%=cid%> instanceof ...Sink"
org.talend.components.api.component.Connector c_<%=cid%> = null;
for (org.talend.components.api.component.Connector currentConnector : props_<%=cid %>.getAvailableConnectors(null, false)) {
if (currentConnector.getName().equals("MAIN")) {
@@ -405,8 +411,7 @@ if (hasOutputOnly || asInputComponent) {
}
}
org.apache.avro.Schema designSchema_<%=cid%> = props_<%=cid %>.getSchema(c_<%=cid%>, false);
org.talend.codegen.enforcer.IncomingSchemaEnforcer incomingEnforcer_<%=cid%>
= new org.talend.codegen.enforcer.IncomingSchemaEnforcer(designSchema_<%=cid%>);
incomingEnforcer_<%=cid%> = new org.talend.codegen.enforcer.IncomingSchemaEnforcer(designSchema_<%=cid%>);
<%
List<? extends IConnection> outgoingConns = node.getOutgoingSortedConnections();
if (outgoingConns!=null){
@@ -442,7 +447,8 @@ if (hasOutputOnly || asInputComponent) {
}
}
}
%>
%>
java.lang.Iterable<?> outgoingMainRecordsList_<%=cid%> = new java.util.ArrayList<Object>();
java.util.Iterator outgoingMainRecordsIt_<%=cid%> = null;

View File

@@ -58,13 +58,24 @@ if(isTopologyNone) {
else if(hasOutputOnly || asInputComponent){
%>
} // while
reader_<%=cid%>.close();
final java.util.Map<String, Object> resultMap_<%=cid%> = reader_<%=cid%>.getReturnValues();
<%
if (hasOutputOnly || asInputComponent) {
%>
} // end of "if (sourceOrSink_<%=cid%> instanceof ...Source)"
<% } %>
java.util.Map<String, Object> resultMap_<%=cid%> = null;
if (reader_<%=cid%> != null) {
reader_<%=cid%>.close();
resultMap_<%=cid%> = reader_<%=cid%>.getReturnValues();
}
<%
}else if(hasInput){
%>
org.talend.components.api.component.runtime.Result resultObject_<%=cid%> = (org.talend.components.api.component.runtime.Result)writer_<%=cid%>.close();
final java.util.Map<String, Object> resultMap_<%=cid%> = writer_<%=cid%>.getWriteOperation().finalize(java.util.Arrays.<org.talend.components.api.component.runtime.Result>asList(resultObject_<%=cid%>), container_<%=cid%>);
java.util.Map<String, Object> resultMap_<%=cid%> = null;
if (writer_<%=cid%> != null) {
org.talend.components.api.component.runtime.Result resultObject_<%=cid%> = (org.talend.components.api.component.runtime.Result)writer_<%=cid%>.close();
resultMap_<%=cid%> = writer_<%=cid%>.getWriteOperation().finalize(java.util.Arrays.<org.talend.components.api.component.runtime.Result>asList(resultObject_<%=cid%>), container_<%=cid%>);
}
<%
} else {
return stringBuffer.toString();
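Taken together, the template above wires a components-runtime Sink into a Writer and, with this change, tears it down behind a null guard. A condensed, hand-written sketch of the generated lifecycle follows; the plain names (sourceOrSink, sink, writer, container, data) stand in for the <%=cid%>-suffixed identifiers and are assumed to be in scope.

// Sketch of the generated Sink/Writer lifecycle; only calls shown in the diff are used.
org.talend.components.api.component.runtime.Sink sink =
        (org.talend.components.api.component.runtime.Sink) sourceOrSink;
org.talend.components.api.component.runtime.WriteOperation writeOperation = sink.createWriteOperation();
writeOperation.initialize(container);
org.talend.components.api.component.runtime.Writer writer = writeOperation.createWriter(container);
writer.open("uid");
// ... one writer.write(record) call per incoming row ...
java.util.Map<String, Object> resultMap = null;
if (writer != null) {
    // Guarded teardown, mirroring the null-safe close added in this change.
    org.talend.components.api.component.runtime.Result result =
            (org.talend.components.api.component.runtime.Result) writer.close();
    resultMap = writer.getWriteOperation().finalize(
            java.util.Arrays.<org.talend.components.api.component.runtime.Result> asList(result), container);
}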

View File

@@ -84,7 +84,7 @@ if(hasInput){
for (int i = 0; i < input_columnList.size(); i++) {
if(!input_columnList.get(i).getTalendType().equals("id_Dynamic")) {
%>
if (incomingEnforcer_<%=cid%>.getDesignSchema().getField("<%=input_columnList.get(i)%>") == null){
if (incomingEnforcer_<%=cid%> != null && incomingEnforcer_<%=cid%>.getDesignSchema().getField("<%=input_columnList.get(i)%>") == null){
incomingEnforcer_<%=cid%>.addIncomingNodeField("<%=input_columnList.get(i)%>", ((Object) <%=inputConn.getName()%>.<%=input_columnList.get(i)%>).getClass().getCanonicalName());
shouldCreateRuntimeSchemaForIncomingNode = true;
}
@@ -92,7 +92,7 @@ if(hasInput){
}
}
%>
if (shouldCreateRuntimeSchemaForIncomingNode){
if (shouldCreateRuntimeSchemaForIncomingNode && incomingEnforcer_<%=cid%> != null){
incomingEnforcer_<%=cid%>.createRuntimeSchema();
}
<%
@@ -111,7 +111,7 @@ if(hasInput){
if (dynamicPos != -1) {
%>
if (!incomingEnforcer_<%=cid%>.areDynamicFieldsInitialized()) {
if (incomingEnforcer_<%=cid%> != null && !incomingEnforcer_<%=cid%>.areDynamicFieldsInitialized()) {
// Initialize the dynamic columns when they are first encountered.
for (routines.system.DynamicMetadata dm_<%=cid%> : <%=inputConn.getName()%>.<%=input_columnList.get(dynamicPos).getLabel()%>.metadatas) {
incomingEnforcer_<%=cid%>.addDynamicField(
@@ -128,22 +128,26 @@ if(hasInput){
}
%>
incomingEnforcer_<%=cid%>.createNewRecord();
if (incomingEnforcer_<%=cid%> != null) {
incomingEnforcer_<%=cid%>.createNewRecord();
}
<%
for (int i = 0; i < input_columnList.size(); i++) { // column
IMetadataColumn column = input_columnList.get(i);
if (dynamicPos != i) {
%>
//skip the put action if the input column doesn't appear in the component runtime schema
if (incomingEnforcer_<%=cid%>.getRuntimeSchema().getField("<%=input_columnList.get(i)%>") != null){
if (incomingEnforcer_<%=cid%> != null && incomingEnforcer_<%=cid%>.getRuntimeSchema().getField("<%=input_columnList.get(i)%>") != null){
incomingEnforcer_<%=cid%>.put("<%=column.getLabel()%>", <%=inputConn.getName()%>.<%=column.getLabel()%>);
}
<%
} else {
%>
for (int i = 0; i < <%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnCount(); i++) {
incomingEnforcer_<%=cid%>.put(<%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnMetadata(i).getName(),
<%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnValue(i));
if (incomingEnforcer_<%=cid%> != null) {
for (int i = 0; i < <%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnCount(); i++) {
incomingEnforcer_<%=cid%>.put(<%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnMetadata(i).getName(),
<%=inputConn.getName()%>.<%=column.getLabel()%>.getColumnValue(i));
}
}
<%
}
@@ -177,7 +181,11 @@ if(hasInput){
} // propInfo
%>
org.apache.avro.generic.IndexedRecord data_<%=cid%> = incomingEnforcer_<%=cid%>.getCurrentRecord();
org.apache.avro.generic.IndexedRecord data_<%=cid%> = null;
if (incomingEnforcer_<%=cid%> != null) {
data_<%=cid%> = incomingEnforcer_<%=cid%>.getCurrentRecord();
}
<%
boolean isParallelize ="true".equalsIgnoreCase(ElementParameterParser.getValue(node, "__PARALLELIZE__"));
@@ -190,8 +198,9 @@ if(hasInput){
}
}
%>
writer_<%=cid%>.write(data_<%=cid%>);
if (writer_<%=cid%> != null && data_<%=cid%> != null) {
writer_<%=cid%>.write(data_<%=cid%>);
}
nb_line_<%=cid %>++;
<%if(hasMainOutput){
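The null checks added throughout this template all guard the same per-row sequence. A condensed sketch of the code it now generates, with illustrative names (incomingEnforcer, writer, row1, the "name" column) in place of the JET placeholders:

// Per-row flow as generated after this change; every enforcer/writer access is null-guarded.
if (incomingEnforcer != null) {
    incomingEnforcer.createNewRecord();
    // skip the put when the column is absent from the runtime schema
    if (incomingEnforcer.getRuntimeSchema().getField("name") != null) {
        incomingEnforcer.put("name", row1.name);
    }
}
org.apache.avro.generic.IndexedRecord data = null;
if (incomingEnforcer != null) {
    data = incomingEnforcer.getCurrentRecord();
}
if (writer != null && data != null) {
    writer.write(data);
}
nb_line++;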

View File

@@ -437,6 +437,20 @@ private RunTrace runTrace = new RunTrace();
globalMap.put(KEY_DB_DATASOURCES, talendDataSources);
globalMap.put(KEY_DB_DATASOURCES_RAW, new java.util.HashMap<String, javax.sql.DataSource>(dataSources));
}
public void setDataSourceReferences(List serviceReferences) throws Exception{
java.util.Map<String, routines.system.TalendDataSource> talendDataSources = new java.util.HashMap<String, routines.system.TalendDataSource>();
java.util.Map<String, javax.sql.DataSource> dataSources = new java.util.HashMap<String, javax.sql.DataSource>();
for (java.util.Map.Entry<String, javax.sql.DataSource> entry : BundleUtils.getServices(serviceReferences, javax.sql.DataSource.class).entrySet()) {
dataSources.put(entry.getKey(), entry.getValue());
talendDataSources.put(entry.getKey(), new routines.system.TalendDataSource(entry.getValue()));
}
globalMap.put(KEY_DB_DATASOURCES, talendDataSources);
globalMap.put(KEY_DB_DATASOURCES_RAW, new java.util.HashMap<String, javax.sql.DataSource>(dataSources));
}
<%
for (INode logCatcher : process.getNodesOfType("tLogCatcher")) {

View File

@@ -68,6 +68,14 @@
id="org.talend.designer.components.model.UserComponentsProvider">
</ComponentsProvider>
</extension>
<extension
point="org.talend.core.components_provider">
<ComponentsProvider
class="org.talend.designer.codegen.components.model.SharedStudioUserComponentProvider"
folderName="user"
id="org.talend.designer.codegen.components.model.SharedStudioUserComponentProvider">
</ComponentsProvider>
</extension>
<extension
point="org.eclipse.core.runtime.preferences">
<initializer

View File

@@ -26,10 +26,8 @@ import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Locale;
import java.util.Map;
import java.util.Optional;
import java.util.ResourceBundle;
import java.util.Set;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
@@ -75,6 +73,7 @@ import org.talend.core.model.components.IComponentsFactory;
import org.talend.core.model.components.IComponentsHandler;
import org.talend.core.model.components.filters.ComponentsFactoryProviderManager;
import org.talend.core.model.components.filters.IComponentFactoryFilter;
import org.talend.core.runtime.util.ComponentsLocationProvider;
import org.talend.core.ui.IJobletProviderService;
import org.talend.core.ui.ISparkJobletProviderService;
import org.talend.core.ui.ISparkStreamingJobletProviderService;
@@ -83,8 +82,6 @@ import org.talend.core.ui.images.CoreImageProvider;
import org.talend.core.utils.TalendCacheUtils;
import org.talend.designer.codegen.CodeGeneratorActivator;
import org.talend.designer.codegen.i18n.Messages;
import org.talend.designer.core.ITisLocalProviderService;
import org.talend.designer.core.ITisLocalProviderService.ResClassLoader;
import org.talend.designer.core.model.components.ComponentBundleToPath;
import org.talend.designer.core.model.components.ComponentFilesNaming;
import org.talend.designer.core.model.components.EmfComponent;
@@ -164,7 +161,11 @@ public class ComponentsFactory implements IComponentsFactory {
throw new RuntimeException(e);
}
isInitialising.set(true);
removeOldComponentsUserFolder(); // not used anymore
try {
removeOldComponentsUserFolder();
} catch (IOException ex) {
ExceptionHandler.process(ex);
} // not used anymore
long startTime = System.currentTimeMillis();
// TimeMeasure.display = true;
@@ -387,10 +388,12 @@ public class ComponentsFactory implements IComponentsFactory {
ComponentManager.saveResource();
}
private void removeOldComponentsUserFolder() {
private void removeOldComponentsUserFolder() throws IOException {
String userPath = IComponentsFactory.COMPONENTS_INNER_FOLDER + File.separatorChar
+ ComponentUtilities.getExtFolder(OLD_COMPONENTS_USER_INNER_FOLDER);
File componentsLocation = getComponentsLocation(userPath);
ComponentsProviderManager componentsProviderManager = ComponentsProviderManager.getInstance();
AbstractComponentsProvider componentsProvider = componentsProviderManager.loadUserComponentsProvidersFromExtension();
File componentsLocation = getComponentsLocation(componentsProvider, userPath);
if (componentsLocation != null && componentsLocation.exists()) {
FilesUtils.removeFolder(componentsLocation, true);
}
@@ -671,114 +674,38 @@ public class ComponentsFactory implements IComponentsFactory {
*
* @param currentFolder
* @return
* @throws IOException
* @throws BusinessException
*/
private File getComponentsLocation(String folder) {
String componentsPath = IComponentsFactory.COMPONENTS_LOCATION;
IBrandingService breaningService = (IBrandingService) GlobalServiceRegister.getDefault()
.getService(IBrandingService.class);
if (breaningService.isPoweredOnlyCamel()) {
componentsPath = IComponentsFactory.CAMEL_COMPONENTS_LOCATION;
}
Bundle b = Platform.getBundle(componentsPath);
File file = null;
try {
URL url = FileLocator.find(b, new Path(folder), null);
if (url == null) {
return null;
private File getComponentsLocation(AbstractComponentsProvider componentsProvider, String folder) throws IOException {
if (componentsProvider instanceof ComponentsLocationProvider) {
return componentsProvider.getInstallationFolder();
} else {
String componentsPath = IComponentsFactory.COMPONENTS_LOCATION;
IBrandingService breaningService = (IBrandingService) GlobalServiceRegister.getDefault()
.getService(IBrandingService.class);
if (breaningService.isPoweredOnlyCamel()) {
componentsPath = IComponentsFactory.CAMEL_COMPONENTS_LOCATION;
}
URL fileUrl = FileLocator.toFileURL(url);
file = new File(fileUrl.getPath());
} catch (Exception e) {
// e.printStackTrace();
ExceptionHandler.process(e);
}
Bundle b = Platform.getBundle(componentsPath);
return file;
}
private File getComponentsLocation(String folder, AbstractComponentsProvider provider) {
File file = null;
try {
if (provider != null) {
file = provider.getInstallationFolder();
} else {
String componentsPath = IComponentsFactory.COMPONENTS_LOCATION;
Bundle b = Platform.getBundle(componentsPath);
IBrandingService breaningService = (IBrandingService) GlobalServiceRegister.getDefault()
.getService(IBrandingService.class);
if (breaningService.isPoweredOnlyCamel()) {
componentsPath = IComponentsFactory.CAMEL_COMPONENTS_LOCATION;
}
File file = null;
try {
URL url = FileLocator.find(b, new Path(folder), null);
if (url == null) {
return null;
}
URL fileUrl = FileLocator.toFileURL(url);
file = new File(fileUrl.getPath());
} catch (Exception e) {
// e.printStackTrace();
ExceptionHandler.process(e);
}
} catch (Exception e) {
ExceptionHandler.process(e);
}
return file;
}
private ResourceBundle getComponentResourceBundle(IComponent currentComp, String source, String cachedPathSource,
AbstractComponentsProvider provider) {
try {
AbstractComponentsProvider currentProvider = provider;
if (currentProvider == null) {
ComponentsProviderManager componentsProviderManager = ComponentsProviderManager.getInstance();
Collection<AbstractComponentsProvider> providers = componentsProviderManager.getProviders();
for (AbstractComponentsProvider curProvider : providers) {
String path = new Path(curProvider.getInstallationFolder().toString()).toPortableString();
if (source.startsWith(path)) {
// fix for TDI-19889 and TDI-20507 to get the correct component provider
if (cachedPathSource != null) {
if (path.contains(cachedPathSource)) {
currentProvider = curProvider;
break;
}
} else {
currentProvider = curProvider;
break;
}
}
}
}
String installPath = currentProvider.getInstallationFolder().toString();
String label = ComponentFilesNaming.getInstance().getBundleName(currentComp.getName(),
installPath.substring(installPath.lastIndexOf(IComponentsFactory.COMPONENTS_INNER_FOLDER)));
if (currentProvider.isUseLocalProvider()) {
// if the component use local provider as storage (for user / ecosystem components)
// then get the bundle resource from the current main component provider.
// note: code here to review later, service like this shouldn't be used...
ResourceBundle bundle = null;
IBrandingService brandingService = (IBrandingService) GlobalServiceRegister.getDefault()
.getService(IBrandingService.class);
if (brandingService.isPoweredOnlyCamel()) {
bundle = currentProvider.getResourceBundle(label);
} else {
ITisLocalProviderService service = (ITisLocalProviderService) GlobalServiceRegister.getDefault()
.getService(ITisLocalProviderService.class);
bundle = service.getResourceBundle(label);
}
return bundle;
} else {
ResourceBundle bundle = ResourceBundle.getBundle(label, Locale.getDefault(),
new ResClassLoader(currentProvider.getClass().getClassLoader()));
return bundle;
}
} catch (IOException e) {
ExceptionHandler.process(e);
}
return null;
return file;
}
}
private String getCodeLanguageSuffix() {
@@ -1082,5 +1009,13 @@ public class ComponentsFactory implements IComponentsFactory {
public void setComponentsHandler(IComponentsHandler componentsHandler) {
this.componentsHandler = componentsHandler;
}
public String getCustomComponentBundlePath() {
ComponentsProviderManager componentsProviderManager = ComponentsProviderManager.getInstance();
AbstractComponentsProvider componentsProvider = componentsProviderManager.loadUserComponentsProvidersFromExtension();
String bundle = componentsProvider.getComponentsBundle();
return ComponentBundleToPath.getPathFromBundle(bundle);
}
}

View File

@@ -23,6 +23,7 @@ import org.eclipse.core.runtime.IExtensionRegistry;
import org.eclipse.core.runtime.Platform;
import org.talend.core.GlobalServiceRegister;
import org.talend.core.model.components.AbstractComponentsProvider;
import org.talend.core.runtime.util.SharedStudioInfoProvider;
import org.talend.core.ui.branding.IBrandingService;
import org.talend.designer.codegen.i18n.Messages;
@@ -69,6 +70,9 @@ public final class ComponentsProviderManager {
try {
AbstractComponentsProvider componentsProvider = (AbstractComponentsProvider) configurationElement
.createExecutableExtension("class"); //$NON-NLS-1$
if (componentsProvider instanceof SharedStudioInfoProvider && !((SharedStudioInfoProvider)componentsProvider).isSupportCurrentMode()) {
continue;
}
componentsProvider.setId(id);
componentsProvider.setFolderName(folderName);
componentsProvider.setContributer(contributerName);
@@ -81,15 +85,15 @@ public final class ComponentsProviderManager {
}
}
public AbstractComponentsProvider loadUserComponentsProvidersFromExtension() {
if (providers == null) {
loadComponentsProvidersFromExtension();
}
for (AbstractComponentsProvider provider : providers) {
if ("org.talend.designer.components.model.UserComponentsProvider".equals(provider.getId())) {
return provider;
}
}
return null;
}
public AbstractComponentsProvider loadUserComponentsProvidersFromExtension() {
if (providers == null) {
loadComponentsProvidersFromExtension();
}
for (AbstractComponentsProvider provider : providers) {
if (provider instanceof UserComponentsProvider) {
return provider;
}
}
return null;
}
}
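For context, the ComponentsFactory changes earlier in this compare resolve the user provider through this manager. A rough caller sketch, with exception handling omitted:

// Sketch of the lookup used by removeOldComponentsUserFolder() and getCustomComponentBundlePath().
ComponentsProviderManager manager = ComponentsProviderManager.getInstance();
AbstractComponentsProvider userProvider = manager.loadUserComponentsProvidersFromExtension();
if (userProvider != null) {
    java.io.File installationFolder = userProvider.getInstallationFolder(); // shared-studio aware for the new provider
    String bundlePath = ComponentBundleToPath.getPathFromBundle(userProvider.getComponentsBundle());
}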

View File

@@ -0,0 +1,61 @@
package org.talend.designer.codegen.components.model;
//============================================================================
//
//Copyright (C) 2006-2019 Talend Inc. - www.talend.com
//
//This source code is available under agreement available at
//%InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt
//
//You should have received a copy of the agreement
//along with this program; if not, write to Talend SA
//9 rue Pages 92150 Suresnes, France
//
//============================================================================
import java.io.File;
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ResourceBundle;
import org.eclipse.core.runtime.IPath;
import org.eclipse.core.runtime.Path;
import org.eclipse.core.runtime.Platform;
import org.talend.core.model.components.ComponentUtilities;
import org.talend.core.model.components.IComponentsFactory;
import org.talend.core.runtime.util.ComponentsLocationProvider;
import org.talend.core.runtime.util.SharedStudioUtils;
import org.talend.designer.core.model.components.ComponentBundleToPath;
public class SharedStudioUserComponentProvider extends UserComponentsProvider implements ComponentsLocationProvider{
@Override
public File getInstallationFolder() throws IOException {
File componentFolder = SharedStudioUtils.getSharedStudioComponentsParentFolder();
IPath path = new Path(IComponentsFactory.COMPONENTS_INNER_FOLDER);
path = path.append(IComponentsFactory.EXTERNAL_COMPONENTS_INNER_FOLDER).append(ComponentUtilities.getExtFolder(getFolderName()));
File installationFolder = new File (componentFolder, path.toOSString());
return installationFolder;
}
public String getComponentsBundle() {
return ComponentBundleToPath.SHARED_STUDIO_CUSTOM_COMPONENT_BUNDLE;
}
public boolean isSupportCurrentMode() {
if (SharedStudioUtils.isSharedStudioMode()) {
return true;
}
return false;
}
@Override
public ResourceBundle getResourceBundle(String label) {
URL configFolderUrl = Platform.getConfigurationLocation().getURL();
URLClassLoader urlLoader = new URLClassLoader(new java.net.URL[]{configFolderUrl});
java.util.ResourceBundle bundle = java.util.ResourceBundle.getBundle( label ,
java.util.Locale.getDefault(), urlLoader );
return bundle;
}
}

View File

@@ -34,13 +34,15 @@ import org.talend.core.model.components.ComponentUtilities;
import org.talend.core.model.components.IComponentsFactory;
import org.talend.core.model.general.Project;
import org.talend.core.model.repository.ERepositoryObjectType;
import org.talend.core.runtime.util.SharedStudioInfoProvider;
import org.talend.core.runtime.util.SharedStudioUtils;
import org.talend.core.ui.branding.IBrandingService;
import org.talend.designer.codegen.CodeGeneratorActivator;
import org.talend.designer.codegen.components.ui.IComponentPreferenceConstant;
import org.talend.repository.ProjectManager;
/***/
public class UserComponentsProvider extends AbstractCustomComponentsProvider {
public class UserComponentsProvider extends AbstractCustomComponentsProvider implements SharedStudioInfoProvider{
@Override
protected File getExternalComponentsLocation() {
@@ -147,5 +149,11 @@ public class UserComponentsProvider extends AbstractCustomComponentsProvider {
public String getComponentsBundle() {
return IComponentsFactory.COMPONENTS_LOCATION;
}
public boolean isSupportCurrentMode() {
if (SharedStudioUtils.isSharedStudioMode()) {
return false;
}
return true;
}
}

View File

@@ -18,6 +18,7 @@ import java.util.Map;
import org.eclipse.core.runtime.Platform;
import org.talend.commons.exception.ExceptionHandler;
import org.talend.commons.utils.StringUtils;
import org.talend.designer.core.model.components.ComponentBundleToPath;
/**
* Jet container for a particular component.
@@ -213,8 +214,17 @@ public class JetBean {
if (pluginIdToBundle.containsKey(pluginId)) {
base = pluginIdToBundle.get(pluginId);
} else {
base = Platform.getBundle(pluginId).getEntry("/").toString(); //$NON-NLS-1$
pluginIdToBundle.put(pluginId, base);
if (ComponentBundleToPath.SHARED_STUDIO_CUSTOM_COMPONENT_BUNDLE.equals(pluginId)) {
base = ComponentBundleToPath.getPathFromBundle(pluginId);
if (!base.endsWith("/")) {
base = base + "/";
}
pluginIdToBundle.put(pluginId, base);
} else {
base = Platform.getBundle(pluginId).getEntry("/").toString(); //$NON-NLS-1$
pluginIdToBundle.put(pluginId, base);
}
}
String result = base + relativeUri;
return result;

View File

@@ -136,13 +136,11 @@ public class TalendJETCompiler extends JETCompiler {
// get the plugin name from fileURI
String refPluginName = matcher.group(1);
// retrieve the plugin URI by pluginName.
Bundle refBundle = Platform.getBundle(refPluginName);
if (refBundle != null) {
String realURI = TemplateUtil.getPlatformUrlOfBundle(refPluginName);
String realURI = TemplateUtil.getPlatformUrlOfBundle(refPluginName);
if (realURI != null) {
// replace the old fileURI to new one by pluginURI
String newFileURI = fileURI.replaceFirst(PLUGIN_VAR_PATTERN.pattern(), realURI);
return newFileURI;
}
}
}

View File

@@ -14,6 +14,7 @@ package org.talend.designer.codegen.config;
import org.eclipse.core.runtime.Platform;
import org.osgi.framework.Bundle;
import org.talend.designer.core.model.components.ComponentBundleToPath;
/**
* CodeGenerator Templates Ressources Utils.
@@ -161,10 +162,25 @@ public class TemplateUtil {
* @return
*/
public static String getPlatformUrlOfBundle(String bundleName) {
Bundle bundle = Platform.getBundle(bundleName);
if (bundle == null) {
return null;
}
return "platform:/plugin/" + bundle.getSymbolicName() + "_" + bundle.getVersion().toString() + "/";
if (ComponentBundleToPath.SHARED_STUDIO_CUSTOM_COMPONENT_BUNDLE.equals(bundleName)) {
String basePath = ComponentBundleToPath.getPathFromBundle(bundleName);
if (!basePath.endsWith("/")) {
basePath = basePath + "/";
}
return basePath;
} else {
Bundle bundle = Platform.getBundle(bundleName);
if (bundle == null) {
return null;
}
StringBuilder sb = new StringBuilder();
sb.append("platform:/plugin/");
sb.append(bundle.getSymbolicName());
sb.append("_");
sb.append(bundle.getVersion().toString());
sb.append("/");
return sb.toString();
}
}
}
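A short caller-side sketch of the two branches above; "some.plugin.id" is a placeholder bundle id, not a real bundle:

// Regular bundle: a platform URL of the form "platform:/plugin/<symbolicName>_<version>/",
// or null when the bundle is not installed.
String bundleBase = TemplateUtil.getPlatformUrlOfBundle("some.plugin.id");
// Shared-studio custom components: a path resolved through ComponentBundleToPath, always ending with "/".
String customBase = TemplateUtil.getPlatformUrlOfBundle(
        ComponentBundleToPath.SHARED_STUDIO_CUSTOM_COMPONENT_BUNDLE);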

View File

@@ -47,6 +47,7 @@ import org.talend.core.ui.component.ComponentsFactoryProvider;
import org.talend.designer.codegen.CodeGeneratorActivator;
import org.talend.designer.codegen.config.TemplateUtil;
import org.talend.designer.codegen.i18n.Messages;
import org.talend.designer.core.model.components.ComponentBundleToPath;
/**
* DOC xtan
@@ -256,10 +257,9 @@ public final class JetSkeletonManager {
};
for (TemplateUtil template : CodeGeneratorInternalTemplatesFactoryProvider.getInstance().getTemplates()) {
Bundle b = Platform.getBundle(template.getJetPluginRepository());
URL resourcesUrl = null;
try {
resourcesUrl = FileLocator.toFileURL(FileLocator.find(b, new Path(template.getTemplateRelativeUri()), null));
resourcesUrl = FileLocator.toFileURL(ComponentBundleToPath.findComponentsBundleURL(template.getJetPluginRepository(), new Path(template.getTemplateRelativeUri()), null));
} catch (IOException e) {
ExceptionHandler.process(e);
}

View File

@@ -9,6 +9,14 @@
id="org.talend.designer.components.exchange.ExchangeComponentsProvider">
</ComponentsProvider>
</extension>
<extension
point="org.talend.core.components_provider">
<ComponentsProvider
class="org.talend.designer.components.exchange.SharedStudioExchangeComponentsProvider"
folderName="exchange"
id="org.talend.designer.components.exchange.SharedStudioExchangeComponentsProvider">
</ComponentsProvider>
</extension>
<extension
point="org.talend.core.runtime.service">
<Service

View File

@@ -28,13 +28,15 @@ import org.talend.core.GlobalServiceRegister;
import org.talend.core.model.components.AbstractComponentsProvider;
import org.talend.core.model.components.ComponentUtilities;
import org.talend.core.model.components.IComponentsFactory;
import org.talend.core.runtime.util.SharedStudioInfoProvider;
import org.talend.core.runtime.util.SharedStudioUtils;
import org.talend.core.ui.branding.IBrandingService;
import org.talend.designer.components.exchange.util.ExchangeUtils;
/**
* DOC hcyi class global comment. Detailled comment
*/
public class ExchangeComponentsProvider extends AbstractComponentsProvider {
public class ExchangeComponentsProvider extends AbstractComponentsProvider implements SharedStudioInfoProvider{
/**
* ExchangeComponentsProvider constructor.
@@ -184,4 +186,10 @@ public class ExchangeComponentsProvider extends AbstractComponentsProvider {
return IComponentsFactory.COMPONENTS_LOCATION;
}
public boolean isSupportCurrentMode() {
if (SharedStudioUtils.isSharedStudioMode()) {
return false;
}
return true;
}
}

View File

@@ -0,0 +1,59 @@
package org.talend.designer.components.exchange;
//============================================================================
//
//Copyright (C) 2006-2019 Talend Inc. - www.talend.com
//
//This source code is available under agreement available at
//%InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt
//
//You should have received a copy of the agreement
//along with this program; if not, write to Talend SA
//9 rue Pages 92150 Suresnes, France
//
//============================================================================
import java.io.File;
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ResourceBundle;
import org.eclipse.core.runtime.IPath;
import org.eclipse.core.runtime.Path;
import org.eclipse.core.runtime.Platform;
import org.talend.core.model.components.ComponentUtilities;
import org.talend.core.model.components.IComponentsFactory;
import org.talend.core.runtime.util.ComponentsLocationProvider;
import org.talend.core.runtime.util.SharedStudioUtils;
import org.talend.designer.core.model.components.ComponentBundleToPath;
public class SharedStudioExchangeComponentsProvider extends ExchangeComponentsProvider implements ComponentsLocationProvider{
@Override
public File getInstallationFolder() throws IOException {
File componentFolder = SharedStudioUtils.getSharedStudioComponentsParentFolder();
IPath path = new Path(IComponentsFactory.COMPONENTS_INNER_FOLDER);
path = path.append(IComponentsFactory.EXTERNAL_COMPONENTS_INNER_FOLDER).append(ComponentUtilities.getExtFolder(getFolderName()));
File installationFolder = new File (componentFolder, path.toOSString());
return installationFolder;
}
public String getComponentsBundle() {
return ComponentBundleToPath.SHARED_STUDIO_CUSTOM_COMPONENT_BUNDLE;
}
public boolean isSupportCurrentMode() {
if (SharedStudioUtils.isSharedStudioMode()) {
return true;
}
return false;
}
@Override
public ResourceBundle getResourceBundle(String label) {
URL configFolderUrl = Platform.getConfigurationLocation().getURL();
URLClassLoader urlLoader = new URLClassLoader(new java.net.URL[]{configFolderUrl});
java.util.ResourceBundle bundle = java.util.ResourceBundle.getBundle( label ,
java.util.Locale.getDefault(), urlLoader );
return bundle;
}
}

View File

@@ -52,6 +52,7 @@ import org.talend.core.download.IDownloadHelper;
import org.talend.core.model.components.ComponentManager;
import org.talend.core.model.components.IComponent;
import org.talend.core.model.components.IComponentsFactory;
import org.talend.core.runtime.util.SharedStudioUtils;
import org.talend.core.ui.component.ComponentPaletteUtilities;
import org.talend.core.ui.component.ComponentsFactoryProvider;
import org.talend.designer.codegen.ICodeGeneratorService;
@@ -312,51 +313,54 @@ public class DownloadComponenentsAction extends Action implements IIntroAction {
protected void afterDownload(IProgressMonitor monitor, ComponentExtension extension, File localZipFile) throws Exception {
if (UpdatesHelper.isComponentUpdateSite(localZipFile)) {
final File workFolder = org.talend.utils.files.FileUtils.createTmpFolder("downloadedComponents", ""); //$NON-NLS-1$ //$NON-NLS-2$
if (!SharedStudioUtils.isSharedStudioMode()) {
final File workFolder = org.talend.utils.files.FileUtils.createTmpFolder("downloadedComponents", ""); //$NON-NLS-1$ //$NON-NLS-2$
try {
FilesUtils.copyFile(localZipFile, new File(workFolder, localZipFile.getName()));
try {
FilesUtils.copyFile(localZipFile, new File(workFolder, localZipFile.getName()));
ComponentsInstallComponent component = LocalComponentInstallHelper.getComponent();
if (component != null) {
try {
component.setComponentFolder(workFolder);
if (component.install()) {
ComponentsInstallComponent component = LocalComponentInstallHelper.getComponent();
if (component != null) {
try {
component.setComponentFolder(workFolder);
if (component.install()) {
if (component.needRelaunch()) {
askReboot();
} else {
MessageDialog.openInformation(DisplayUtils.getDefaultShell(),
Messages.getString("DownloadComponenentsAction.installComponentsTitle"),
component.getInstalledMessages());
if (component.needRelaunch()) {
askReboot();
} else {
MessageDialog.openInformation(DisplayUtils.getDefaultShell(),
Messages.getString("DownloadComponenentsAction.installComponentsTitle"),
component.getInstalledMessages());
}
} else {// install failure
MessageDialog.openWarning(DisplayUtils.getDefaultShell(),
Messages.getString("DownloadComponenentsAction_failureTitle"), //$NON-NLS-1$
Messages.getString("DownloadComponenentsAction_failureMessage", extension.getLabel())); //$NON-NLS-1$
}
} else {// install failure
MessageDialog.openWarning(DisplayUtils.getDefaultShell(),
} finally {
// after install, clear the setting for service.
component.setComponentFolder(null);
}
}
} catch (Exception e) {
// Popup dialog to warn the user that the install failed.
Display.getDefault().syncExec(new Runnable() {
@Override
public void run() {
MessageDialog.openError(DisplayUtils.getDefaultShell(false),
Messages.getString("DownloadComponenentsAction_failureTitle"), //$NON-NLS-1$
Messages.getString("DownloadComponenentsAction_failureMessage", extension.getLabel())); //$NON-NLS-1$
}
} finally {
// after install, clear the setting for service.
component.setComponentFolder(null);
}
});
throw e;
} finally {
FilesUtils.deleteFolder(workFolder, true);
}
} catch (Exception e) {
// Popup dialog to warn the user that the install failed.
Display.getDefault().syncExec(new Runnable() {
@Override
public void run() {
MessageDialog.openError(DisplayUtils.getDefaultShell(false),
Messages.getString("DownloadComponenentsAction_failureTitle"), //$NON-NLS-1$
Messages.getString("DownloadComponenentsAction_failureMessage", extension.getLabel())); //$NON-NLS-1$
}
});
throw e;
} finally {
FilesUtils.deleteFolder(workFolder, true);
}
monitor.done();
ExchangeManager.getInstance().saveDownloadedExtensionsToFile(extension);
monitor.done();
ExchangeManager.getInstance().saveDownloadedExtensionsToFile(extension);
}
} else {
File installedLocation = ComponentInstaller.unzip(localZipFile.getAbsolutePath(), getComponentsFolder()
.getAbsolutePath());

View File

@@ -37,6 +37,7 @@ import org.eclipse.swt.widgets.Shell;
import org.eclipse.ui.PlatformUI;
import org.talend.commons.ui.runtime.exception.ExceptionHandler;
import org.talend.core.download.DownloadHelper;
import org.talend.core.runtime.util.SharedStudioUtils;
import org.talend.designer.components.exchange.i18n.Messages;
import org.talend.designer.components.exchange.model.Category;
import org.talend.designer.components.exchange.model.VersionRevision;
@@ -105,7 +106,7 @@ public class ImportExchangeDialog extends Dialog {
@Override
protected void okPressed() {
IPath tempPath = new Path(System.getProperty("user.dir")).append("temp"); //$NON-NLS-1$ //$NON-NLS-2$
IPath tempPath = SharedStudioUtils.getTempFolderPath();
File pathFile = tempPath.toFile();
if (downloadproperty.getFileName() == null || downloadproperty.getFileName() == null) {
MessageBox box = new MessageBox(Display.getCurrent().getActiveShell(), SWT.ICON_WARNING | SWT.OK);

View File

@@ -55,6 +55,7 @@ import org.talend.core.language.ECodeLanguage;
import org.talend.core.language.LanguageManager;
import org.talend.core.model.components.IComponentsFactory;
import org.talend.core.model.general.Project;
import org.talend.core.runtime.util.SharedStudioUtils;
import org.talend.core.ui.component.ComponentPaletteUtilities;
import org.talend.core.ui.component.ComponentsFactoryProvider;
import org.talend.designer.components.exchange.ExchangePlugin;
@@ -205,14 +206,19 @@ public class ExchangeUtils {
* @return
*/
public static File getComponentFolder(String componentfolder) {
URL url = FileLocator.find(ExchangePlugin.getDefault().getBundle(), new Path(componentfolder), null);
try {
URL fileUrl = FileLocator.toFileURL(url);
return new File(fileUrl.getPath());
} catch (Exception e) {
ExceptionHandler.process(e);
}
return null;
if (SharedStudioUtils.isSharedStudioMode()) {
File componentFolder = SharedStudioUtils.getSharedStudioComponentsExtFolder();
return new File (componentFolder, componentfolder);
} else {
URL url = FileLocator.find(ExchangePlugin.getDefault().getBundle(), new Path(componentfolder), null);
try {
URL fileUrl = FileLocator.toFileURL(url);
return new File(fileUrl.getPath());
} catch (Exception e) {
ExceptionHandler.process(e);
}
return null;
}
}
/**

View File

@@ -2,9 +2,9 @@
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.talend.libraries</groupId>
<groupId>org.talend.components</groupId>
<artifactId>filecopy</artifactId>
<version>2.0.0</version>
<version>2.0.1</version>
<packaging>jar</packaging>
<name>talend-copy</name>

View File

@@ -15,7 +15,10 @@ package org.talend;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.nio.file.attribute.FileTime;
/**
* DOC Administrator class global comment. Detailled comment
@@ -35,14 +38,17 @@ public class FileCopy {
* @throws IOException : if IO pb.
*/
public static void copyFile(String srcFileName, String desFileName, boolean delSrc) throws IOException {
final File source = new File(srcFileName);
final File destination = new File(desFileName);
final Path source = Paths.get(srcFileName);
final Path destination = Paths.get(desFileName);
if (delSrc) {
// move: more efficient if on the same FS, and must replace any existing destination file.
Files.move(source.toPath(), destination.toPath(), StandardCopyOption.REPLACE_EXISTING);
FileTime lastModifiedTime = Files.getLastModifiedTime(source);
Files.move(source, destination, StandardCopyOption.REPLACE_EXISTING);
Files.setLastModifiedTime(destination,lastModifiedTime);
} else {
Files.copy(source.toPath(), destination.toPath(), StandardCopyOption.REPLACE_EXISTING);
Files.copy(source, destination, StandardCopyOption.REPLACE_EXISTING);
Files.setLastModifiedTime(destination,Files.getLastModifiedTime(source));
}
}
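A hypothetical call site (the paths are placeholders): with delSrc=true the file is moved, otherwise copied, and in both branches the destination now keeps the source's last-modified time, which the new test below verifies.

// Placeholder paths; both branches preserve the source's last-modified time.
FileCopy.copyFile("/data/in/report.csv", "/data/out/report.csv", false);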

View File

@@ -100,6 +100,24 @@ class FileCopyTest {
Assertions.assertEquals(referenceSize, copy.length(), "Size error");
}
@Test
void testLastModifiedTime() throws Exception {
final URL repCopy = Thread.currentThread().getContextClassLoader().getResource("copy");
File file = this.buildFile("fileLMT.txt", 10L * 1024L);
file.deleteOnExit();
long referenceTime = 324723894L;
file.setLastModified(referenceTime);
File copy = new File(repCopy.getPath(), "fileLMTDestination.txt");
if (copy.exists()) {
copy.delete();
}
copy.deleteOnExit();
FileCopy.copyFile(file.getPath(), copy.getPath(), true);
Assertions.assertEquals(referenceTime, copy.lastModified(), "modified time is not identical");
}
/**
* Generate a new file for testing.
*

View File

@@ -2,9 +2,9 @@
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.talend.libraries</groupId>
<artifactId>simpleexcel-2.2-20190722</artifactId>
<version>6.0.0</version>
<groupId>org.talend.components</groupId>
<artifactId>simpleexcel</artifactId>
<version>2.4-20200923</version>
<packaging>jar</packaging>
<name>simpleexcel</name>
@@ -13,7 +13,7 @@
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<talend.nexus.url>https://artifacts-oss.talend.com</talend.nexus.url>
<java.source.version>1.6</java.source.version>
<java.source.version>1.8</java.source.version>
</properties>
<distributionManagement>
@@ -43,47 +43,30 @@
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-scratchpad</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml -->
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml-schemas -->
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml-schemas</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.geronimo.specs/geronimo-stax-api_1.0_spec -->
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-stax-api_1.0_spec</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>org.dom4j</groupId>
<artifactId>dom4j</artifactId>
<version>2.1.3</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.xmlbeans/xmlbeans -->
<dependency>
<groupId>org.apache.xmlbeans</groupId>
<artifactId>xmlbeans</artifactId>
<version>3.1.0</version>
</dependency>
</dependencies>
<build>
<resources>

View File

@@ -1,6 +1,6 @@
// ============================================================================
//
// Copyright (C) 2006-2019 Talend Inc. - www.talend.com
// Copyright (C) 2006-2020 Talend Inc. - www.talend.com
//
// This source code is available under agreement available at
// %InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt

View File

@@ -1,6 +1,6 @@
// ============================================================================
//
// Copyright (C) 2006-2019 Talend Inc. - www.talend.com
// Copyright (C) 2006-2020 Talend Inc. - www.talend.com
//
// This source code is available under agreement available at
// %InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt

View File

@@ -1,6 +1,6 @@
// ============================================================================
//
// Copyright (C) 2006-2019 Talend Inc. - www.talend.com
// Copyright (C) 2006-2020 Talend Inc. - www.talend.com
//
// This source code is available under agreement available at
// %InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt

View File

@@ -1,6 +1,6 @@
// ============================================================================
//
// Copyright (C) 2006-2019 Talend Inc. - www.talend.com
// Copyright (C) 2006-2020 Talend Inc. - www.talend.com
//
// This source code is available under agreement available at
// %InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt

View File

@@ -1,6 +1,6 @@
// ============================================================================
//
// Copyright (C) 2006-2019 Talend Inc. - www.talend.com
// Copyright (C) 2006-2020 Talend Inc. - www.talend.com
//
// This source code is available under agreement available at
// %InstallDIR%\features\org.talend.rcp.branding.%PRODUCTNAME%\%PRODUCTNAME%license.txt

View File

@@ -3,14 +3,13 @@
<modelVersion>4.0.0</modelVersion>
<groupId>org.talend.components</groupId>
<artifactId>components-soap</artifactId>
<version>2.2-20200730</version>
<version>2.3-20200918</version>
<packaging>jar</packaging>
<name>talend-soap</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<cxf.version>3.1.1</cxf.version>
<talend.nexus.url>https://artifacts-oss.talend.com</talend.nexus.url>
</properties>
@@ -46,19 +45,14 @@
<systemPath>${java.home}/lib/rt.jar</systemPath>
</dependency>
<dependency>
<groupId>jdom</groupId>
<artifactId>jdom</artifactId>
<version>1.1</version>
<groupId>org.dom4j</groupId>
<artifactId>dom4j</artifactId>
<version>2.1.3</version>
</dependency>
<dependency>
<groupId>com.sun.xml.messaging.saaj</groupId>
<artifactId>saaj-impl</artifactId>
<version>1.3.2</version>
</dependency>
<dependency>
<groupId>javax.activation</groupId>
<artifactId>activation</artifactId>
<version>1.1</version>
<groupId>com.sun.xml.messaging.saaj</groupId>
<artifactId>saaj-impl</artifactId>
<version>1.5.2</version>
</dependency>
<dependency>
<groupId>xerces</groupId>
@@ -68,7 +62,7 @@
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>1.9</version>
<version>1.14</version>
</dependency>
</dependencies>
<build>
@@ -108,4 +102,4 @@
</plugin>
</plugins>
</build>
</project>
</project>

View File

@@ -32,8 +32,7 @@ import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.apache.commons.codec.binary.Base64;
import org.jdom.input.DOMBuilder;
import org.jdom.output.XMLOutputter;
import org.dom4j.io.DOMReader;
import org.talend.soap.sun.SunNtlmAuthenticationUpdater;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
@@ -45,8 +44,6 @@ public class SOAPUtil {
private static final String vmVendor = System.getProperty("java.vendor.url");
private static final String ibmVmVendor = "http://www.ibm.com/";
private static final String sunVmVendor = "http://java.sun.com/";
private static final String oracleVmVendor = "http://java.oracle.com/";
@@ -140,12 +137,7 @@ public class SOAPUtil {
StreamSource preppedMsgSrc = new StreamSource(stream);
soapPart.setContent(preppedMsgSrc);
// InputStream stream = new FileInputStream(new File("d://soap.txt"));
// StreamSource preppedMsgSrc = new StreamSource(stream);
// soapPart.setContent(preppedMsgSrc);
message.saveChanges();
// Send the message
SOAPMessage reply = connection.call(message, destination);
@@ -226,7 +218,7 @@ public class SOAPUtil {
Node content;
Element headerRootElem = document.createElement("Header");
Iterator childElements = header.getChildElements();
Iterator<javax.xml.soap.Node> childElements = header.getChildElements();
org.w3c.dom.Node domNode = null;
while (childElements.hasNext()) {
domNode = (org.w3c.dom.Node) childElements.next();
@@ -245,12 +237,11 @@ public class SOAPUtil {
return reHeaderMessage;
}
private String Doc2StringWithoutDeclare(Document doc) {
DOMBuilder builder = new DOMBuilder();
org.jdom.Document jdomDoc = builder.build(doc);
XMLOutputter outputter = new XMLOutputter();
return outputter.outputString(jdomDoc.getRootElement());
}
private String Doc2StringWithoutDeclare(Document doc) {
DOMReader reader = new DOMReader();
org.dom4j.Document document = reader.read(doc);
return document.getRootElement().asXML();
}
/**
* invoke soap and return the response document
@@ -363,4 +354,4 @@ public class SOAPUtil {
headers.setHeader("Authorization", "Basic " + encodeUserInfo);
}
}
}
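For clarity, the JDOM-to-dom4j swap in Doc2StringWithoutDeclare reduces to this conversion; w3cDoc is assumed to be an org.w3c.dom.Document already in scope:

// dom4j path that replaces the former DOMBuilder/XMLOutputter pair.
org.dom4j.io.DOMReader reader = new org.dom4j.io.DOMReader();
org.dom4j.Document dom4jDoc = reader.read(w3cDoc);
String rootXmlWithoutDeclaration = dom4jDoc.getRootElement().asXML();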

View File

@@ -0,0 +1,66 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.talend.components.lib</groupId>
<artifactId>talend-aws</artifactId>
<version>1.0</version>
<packaging>jar</packaging>
<name>talend-aws</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<talend.nexus.url>https://artifacts-oss.talend.com</talend.nexus.url>
<java.source.version>1.8</java.source.version>
</properties>
<distributionManagement>
<snapshotRepository>
<id>talend_nexus_deployment</id>
<url>${talend.nexus.url}/nexus/content/repositories/TalendOpenSourceSnapshot/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
<releases>
<enabled>false</enabled>
</releases>
</snapshotRepository>
<repository>
<id>talend_nexus_deployment</id>
<url>${talend.nexus.url}/nexus/content/repositories/TalendOpenSourceRelease/</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
<releases>
<enabled>true</enabled>
</releases>
</repository>
</distributionManagement>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.11.848</version>
</dependency>
</dependencies>
<build>
<resources>
<resource>
<directory>src/main/java</directory>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.source.version}</target>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -0,0 +1,277 @@
package org.talend.aws;
import static com.amazonaws.event.SDKProgressPublisher.publishProgress;
import java.util.Collection;
import java.util.LinkedList;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonServiceException;
import com.amazonaws.event.ProgressEventType;
import com.amazonaws.event.ProgressListener;
import com.amazonaws.event.ProgressListenerChain;
import com.amazonaws.services.s3.model.LegacyS3ProgressListener;
import com.amazonaws.services.s3.transfer.Transfer;
import com.amazonaws.services.s3.transfer.TransferProgress;
import com.amazonaws.services.s3.transfer.internal.TransferMonitor;
import com.amazonaws.services.s3.transfer.internal.TransferStateChangeListener;
/**
* Abstract transfer implementation.
*/
public abstract class AbstractTransfer implements Transfer {
/** The current state of this transfer. */
protected volatile TransferState state = TransferState.Waiting;
protected TransferMonitor monitor;
/** The progress of this transfer. */
private final TransferProgress transferProgress;
private final String description;
/** Hook for adding/removing more progress listeners. */
protected final ProgressListenerChain listenerChain;
/** Collection of listeners to be notified for changes to the state of this transfer via setState() */
protected final Collection<TransferStateChangeListener> stateChangeListeners = new LinkedList<TransferStateChangeListener>();
AbstractTransfer(String description, TransferProgress transferProgress, ProgressListenerChain progressListenerChain) {
this(description, transferProgress, progressListenerChain, null);
}
AbstractTransfer(String description, TransferProgress transferProgress,
ProgressListenerChain progressListenerChain, TransferStateChangeListener stateChangeListener) {
this.description = description;
this.listenerChain = progressListenerChain;
this.transferProgress = transferProgress;
addStateChangeListener(stateChangeListener);
}
/**
* Returns whether or not the transfer is finished (i.e. completed successfully,
* failed, or was canceled). This method should never block.
*
* @return Returns <code>true</code> if this transfer is finished (i.e. completed successfully,
* failed, or was canceled). Returns <code>false</code> if otherwise.
*/
public final synchronized boolean isDone() {
return (state == TransferState.Failed ||
state == TransferState.Completed ||
state == TransferState.Canceled);
}
/**
* Waits for this transfer to complete. This is a blocking call; the current
* thread is suspended until this transfer completes.
*
* @throws AmazonClientException
* If any errors were encountered in the client while making the
* request or handling the response.
* @throws AmazonServiceException
* If any errors occurred in Amazon S3 while processing the
* request.
* @throws InterruptedException
* If this thread is interrupted while waiting for the transfer
* to complete.
*/
public void waitForCompletion()
throws AmazonClientException, AmazonServiceException, InterruptedException {
try {
Object result = null;
while (!monitor.isDone() || result == null) {
Future<?> f = monitor.getFuture();
result = f.get();
}
} catch (ExecutionException e) {
rethrowExecutionException(e);
}
}
/**
* Waits for this transfer to finish and returns any error that occurred, or
* returns <code>null</code> if no errors occurred.
* This is a blocking call; the current thread
* will be suspended until this transfer either fails or completes
* successfully.
*
* @return Any error that occurred while processing this transfer.
* Otherwise returns <code>null</code> if no errors occurred.
*
* @throws InterruptedException
* If this thread is interrupted while waiting for the transfer
* to complete.
*/
public AmazonClientException waitForException() throws InterruptedException {
try {
/**
* Do not remove the while loop. We need this as the future returned by
* monitor.getFuture() is set two times during the upload and copy operations.
*/
while (!monitor.isDone()) {
monitor.getFuture().get();
}
monitor.getFuture().get();
return null;
} catch (ExecutionException e) {
return unwrapExecutionException(e);
}
}
/**
* Returns a human-readable description of this transfer.
*
* @return A human-readable description of this transfer.
*/
public String getDescription() {
return description;
}
/**
* Returns the current state of this transfer.
*
* @return The current state of this transfer.
*/
public synchronized TransferState getState() {
return state;
}
/**
* Sets the current state of this transfer.
*/
public void setState(TransferState state) {
synchronized (this) {
this.state = state;
}
for ( TransferStateChangeListener listener : stateChangeListeners ) {
listener.transferStateChanged(this, state);
}
}
/**
* Notifies all the registered state change listeners of the state update.
*/
public void notifyStateChangeListeners(TransferState state) {
for ( TransferStateChangeListener listener : stateChangeListeners ) {
listener.transferStateChanged(this, state);
}
}
/**
* Adds the specified progress listener to the list of listeners
* receiving updates about this transfer's progress.
*
* @param listener
* The progress listener to add.
*/
public synchronized void addProgressListener(ProgressListener listener) {
listenerChain.addProgressListener(listener);
}
/**
* Removes the specified progress listener from the list of progress
* listeners receiving updates about this transfer's progress.
*
* @param listener
* The progress listener to remove.
*/
public synchronized void removeProgressListener(ProgressListener listener) {
listenerChain.removeProgressListener(listener);
}
/**
* @deprecated Replaced by {@link #addProgressListener(ProgressListener)}
*/
@Deprecated
public synchronized void addProgressListener(com.amazonaws.services.s3.model.ProgressListener listener) {
listenerChain.addProgressListener(new LegacyS3ProgressListener(listener));
}
/**
* @deprecated Replaced by {@link #removeProgressListener(ProgressListener)}
*/
@Deprecated
public synchronized void removeProgressListener(com.amazonaws.services.s3.model.ProgressListener listener) {
listenerChain.removeProgressListener(new LegacyS3ProgressListener(listener));
}
/**
* Adds the given state change listener to the collection of listeners.
*/
public synchronized void addStateChangeListener(TransferStateChangeListener listener) {
if ( listener != null )
stateChangeListeners.add(listener);
}
/**
* Removes the given state change listener from the collection of listeners.
*/
public synchronized void removeStateChangeListener(TransferStateChangeListener listener) {
if ( listener != null )
stateChangeListeners.remove(listener);
}
/**
* Returns progress information about this transfer.
*
* @return The progress information about this transfer.
*/
public TransferProgress getProgress() {
return transferProgress;
}
/**
* Sets the monitor used to poll for transfer completion.
*/
public void setMonitor(TransferMonitor monitor) {
this.monitor = monitor;
}
public TransferMonitor getMonitor() {
return monitor;
}
protected void fireProgressEvent(final ProgressEventType eventType) {
publishProgress(listenerChain, eventType);
}
/**
* Examines the cause of the specified ExecutionException and either
* rethrows it directly (if it's a type of AmazonClientException) or wraps
* it in an AmazonClientException and rethrows it.
*
* @param e
* The execution exception to examine.
*/
protected void rethrowExecutionException(ExecutionException e) {
throw unwrapExecutionException(e);
}
/**
* Unwraps the root exception that caused the specified ExecutionException
* and returns it. If it was not an instance of AmazonClientException, it is
* wrapped as an AmazonClientException.
*
* @param e
* The ExecutionException to unwrap.
*
* @return The root exception that caused the specified ExecutionException.
*/
protected AmazonClientException unwrapExecutionException(ExecutionException e) {
Throwable t = e;
while (t.getCause() != null && t instanceof ExecutionException) {
t = t.getCause();
}
if (t instanceof AmazonClientException) {
return (AmazonClientException) t;
}
return new AmazonClientException("Unable to complete transfer: " + t.getMessage(), t);
}
}
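A hedged caller-side sketch of the two blocking entry points above; "transfer" is assumed to be an initialized subclass instance produced elsewhere, and InterruptedException still has to be handled by the caller:

// Either block and let failures surface as exceptions ...
transfer.waitForCompletion();
// ... or block and inspect the failure without an exception flow.
AmazonClientException failure = transfer.waitForException();
if (failure != null) {
    // handle or log the failure
}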

View File

@@ -0,0 +1,39 @@
package org.talend.aws;
import com.amazonaws.annotation.SdkInternalApi;
import com.amazonaws.services.s3.internal.ServiceUtils;
import com.amazonaws.services.s3.transfer.Transfer;
import java.io.File;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.Future;
/**
* Helper class to merge all the individual part files into a destinationFile.
*/
@SdkInternalApi
public class CompleteMultipartDownload implements Callable<File> {
private final List<Future<File>> partFiles;
private final File destinationFile;
private final DownloadImpl download;
private Integer currentPartNumber;
public CompleteMultipartDownload(List<Future<File>> files, File destinationFile, DownloadImpl download, Integer currentPartNumber) {
this.partFiles = files;
this.destinationFile = destinationFile;
this.download = download;
this.currentPartNumber = currentPartNumber;
}
@Override
public File call() throws Exception {
for (Future<File> file : partFiles) {
ServiceUtils.appendFile(file.get(), destinationFile);
download.updatePersistableTransfer(currentPartNumber++);
}
download.setState(Transfer.TransferState.Completed);
return destinationFile;
}
}

View File

@@ -0,0 +1,60 @@
package org.talend.aws;
import java.io.IOException;
import com.amazonaws.services.s3.model.CryptoMode;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.transfer.Transfer;
import com.amazonaws.services.s3.transfer.exception.PauseException;
/**
* Represents an asynchronous download from Amazon S3.
*/
public interface Download extends Transfer {
/**
* Returns the ObjectMetadata for the object being downloaded.
*
* @return The ObjectMetadata for the object being downloaded.
*/
public ObjectMetadata getObjectMetadata();
/**
* The name of the bucket where the object is being downloaded from.
*
* @return The name of the bucket where the object is being downloaded from.
*/
public String getBucketName();
/**
* The key under which this object was stored in Amazon S3.
*
* @return The key under which this object was stored in Amazon S3.
*/
public String getKey();
/**
* Cancels this download.
*
* @throws IOException
*/
public void abort() throws IOException;
/**
* Pause the current download operation and returns the information that can
* be used to resume the download at a later time.
*
* Resuming a download would not perform ETag check as range get is
* performed for downloading the object's remaining contents.
*
* Resuming a download for an object encrypted using
* {@link CryptoMode#StrictAuthenticatedEncryption} would result in
* AmazonClientException as authenticity cannot be guaranteed for a range
* get operation.
*
* @throws PauseException
* If any errors were encountered while trying to pause the
* download.
*/
public PersistableDownload pause() throws PauseException;
}

View File

@@ -0,0 +1,312 @@
package org.talend.aws;
import java.io.File;
import java.io.RandomAccessFile;
import java.net.SocketException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Future;
import javax.net.ssl.SSLProtocolException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import com.amazonaws.AmazonClientException;
import com.amazonaws.SdkClientException;
import com.amazonaws.annotation.SdkInternalApi;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.internal.FileLocks;
import com.amazonaws.services.s3.internal.ServiceUtils;
import com.amazonaws.services.s3.internal.ServiceUtils.RetryableS3DownloadTask;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.transfer.Transfer.TransferState;
import com.amazonaws.services.s3.transfer.exception.FileLockException;
import com.amazonaws.util.IOUtils;
@SdkInternalApi
final class DownloadCallable implements Callable<File> {
private static final Log LOG = LogFactory.getLog(DownloadCallable.class);
private final AmazonS3 s3;
private final CountDownLatch latch;
private final GetObjectRequest req;
private final boolean resumeExistingDownload;
private final DownloadImpl download;
private final File dstfile;
private final long origStartingByte;
private final long timeout;
private final ScheduledExecutorService timedExecutor;
/** The thread pool in which parts are downloaded. */
private final ExecutorService executor;
private final List<Future<File>> futureFiles;
private final boolean isDownloadParallel;
private Integer lastFullyMergedPartNumber;
private final boolean resumeOnRetry;
private long expectedFileLength;
DownloadCallable(AmazonS3 s3, CountDownLatch latch,
GetObjectRequest req, boolean resumeExistingDownload,
DownloadImpl download, File dstfile, long origStartingByte,
long expectedFileLength, long timeout,
ScheduledExecutorService timedExecutor,
ExecutorService executor,
Integer lastFullyDownloadedPartNumber, boolean isDownloadParallel, boolean resumeOnRetry)
{
if (s3 == null || latch == null || req == null || dstfile == null || download == null)
throw new IllegalArgumentException();
this.s3 = s3;
this.latch = latch;
this.req = req;
this.resumeExistingDownload = resumeExistingDownload;
this.download = download;
this.dstfile = dstfile;
this.origStartingByte = origStartingByte;
this.expectedFileLength = expectedFileLength;
this.timeout = timeout;
this.timedExecutor = timedExecutor;
this.executor = executor;
this.futureFiles = new ArrayList<Future<File>>();
this.lastFullyMergedPartNumber = lastFullyDownloadedPartNumber;
this.isDownloadParallel = isDownloadParallel;
this.resumeOnRetry = resumeOnRetry;
}
/**
* This method must return a non-null object, or else the existing
* implementation in {@link AbstractTransfer#waitForCompletion()}
* would block forever.
*
* @return the downloaded file
*/
@Override
public File call() throws Exception {
try {
latch.await();
if (isTimeoutEnabled()) {
timedExecutor.schedule(new Runnable() {
public void run() {
try {
if (download.getState() != TransferState.Completed) {
download.abort();
}
} catch(Exception e) {
throw new SdkClientException(
"Unable to abort download after timeout", e);
}
}
}, timeout, TimeUnit.MILLISECONDS);
}
download.setState(TransferState.InProgress);
ServiceUtils.createParentDirectoryIfNecessary(dstfile);
if (isDownloadParallel) {
downloadInParallel(ServiceUtils.getPartCount(req, s3));
} else {
S3Object s3Object = retryableDownloadS3ObjectToFile(dstfile,
new DownloadTaskImpl(s3, download, req));
updateDownloadStatus(s3Object);
}
return dstfile;
} catch (Throwable t) {
// Cancel all the futures
for (Future<File> f : futureFiles) {
f.cancel(true);
}
// Downloads aren't allowed to move from canceled to failed
if (download.getState() != TransferState.Canceled) {
download.setState(TransferState.Failed);
}
if (t instanceof Exception)
throw (Exception) t;
else
throw (Error) t;
}
}
/**
* Takes the result of a serial download and updates the transfer state and
* monitor in the DownloadImpl object based on the result.
*/
private void updateDownloadStatus(S3Object result) {
if (result == null) {
download.setState(TransferState.Canceled);
download.setMonitor(new DownloadMonitor(download, null));
} else {
download.setState(TransferState.Completed);
}
}
/**
* Submits a download task for each part of the object (each part is written
* to a separate temporary file) and then submits a task that merges the part
* files into the destination file.
*/
private void downloadInParallel(int partCount) throws Exception {
if (lastFullyMergedPartNumber == null) {
lastFullyMergedPartNumber = 0;
}
for (int i = lastFullyMergedPartNumber + 1; i <= partCount; i++) {
GetObjectRequest getPartRequest = new GetObjectRequest(req.getBucketName(), req.getKey(),
req.getVersionId()).withUnmodifiedSinceConstraint(req.getUnmodifiedSinceConstraint())
.withModifiedSinceConstraint(req.getModifiedSinceConstraint())
.withResponseHeaders(req.getResponseHeaders()).withSSECustomerKey(req.getSSECustomerKey())
.withGeneralProgressListener(req.getGeneralProgressListener());
getPartRequest.setMatchingETagConstraints(req.getMatchingETagConstraints());
getPartRequest.setNonmatchingETagConstraints(req.getNonmatchingETagConstraints());
getPartRequest.setRequesterPays(req.isRequesterPays());
futureFiles.add(
executor.submit(new DownloadPartCallable(s3, getPartRequest.withPartNumber(i), dstfile)));
}
truncateDestinationFileIfNecessary();
Future<File> future = executor.submit(new CompleteMultipartDownload(futureFiles, dstfile, download, ++lastFullyMergedPartNumber));
((DownloadMonitor) download.getMonitor()).setFuture(future);
}
/**
* If only a partial part has been merged into the dstfile (due to a pause
* operation), adjusts the file length so that the next part starts writing
* from the correct position.
*/
private void truncateDestinationFileIfNecessary() {
RandomAccessFile raf = null;
if (!FileLocks.lock(dstfile)) {
throw new FileLockException("Fail to lock " + dstfile);
}
try {
raf = new RandomAccessFile(dstfile, "rw");
if (lastFullyMergedPartNumber == 0) {
raf.setLength(0);
} else {
long lastByte = ServiceUtils.getLastByteInPart(s3, req, lastFullyMergedPartNumber);
if (dstfile.length() < lastByte) {
throw new SdkClientException(
"File " + dstfile.getAbsolutePath() + " has been modified since last pause.");
}
raf.setLength(lastByte + 1);
download.getProgress().updateProgress(lastByte + 1);
}
} catch (Exception e) {
throw new SdkClientException("Unable to append part file to dstfile " + e.getMessage(), e);
} finally {
IOUtils.closeQuietly(raf, LOG);
FileLocks.unlock(dstfile);
}
}
/**
* This method is called only if it is a resumed download.
*
* Adjusts the range of the GET request and the expected (i.e. current)
* length of the destination file to append to.
*/
private void adjustRequest(GetObjectRequest req) {
long[] range = req.getRange();
long lastByte = range[1];
long totalBytesToDownload = lastByte - this.origStartingByte + 1;
if (dstfile.exists()) {
if (!FileLocks.lock(dstfile)) {
throw new FileLockException("Fail to lock " + dstfile
+ " for range adjustment");
}
try {
expectedFileLength = dstfile.length();
long startingByte = this.origStartingByte + expectedFileLength;
LOG.info("Adjusting request range from " + Arrays.toString(range)
+ " to "
+ Arrays.toString(new long[] { startingByte, lastByte })
+ " for file " + dstfile);
req.setRange(startingByte, lastByte);
totalBytesToDownload = lastByte - startingByte + 1;
} finally {
FileLocks.unlock(dstfile);
}
}
if (totalBytesToDownload < 0) {
throw new IllegalArgumentException(
"Unable to determine the range for download operation. lastByte="
+ lastByte + ", origStartingByte=" + origStartingByte
+ ", expectedFileLength=" + expectedFileLength
+ ", totalBytesToDownload=" + totalBytesToDownload);
}
}
private S3Object retryableDownloadS3ObjectToFile(File file,
RetryableS3DownloadTask retryableS3DownloadTask) {
boolean hasRetried = false;
S3Object s3Object;
for (;;) {
final boolean appendData = resumeExistingDownload || (resumeOnRetry && hasRetried);
if (appendData && hasRetried) {
// Need to adjust the get range or else we risk corrupting the downloaded file
adjustRequest(req);
}
s3Object = retryableS3DownloadTask.getS3ObjectStream();
if (s3Object == null)
return null;
try {
if (testing && resumeExistingDownload && !hasRetried) {
throw new SdkClientException("testing");
}
ServiceUtils.downloadToFile(s3Object, file,
retryableS3DownloadTask.needIntegrityCheck(),
appendData, expectedFileLength);
return s3Object;
} catch (AmazonClientException ace) {
if (!ace.isRetryable())
throw ace;
// Determine whether an immediate retry is needed according to the captured AmazonClientException.
// (There are three cases when downloadToFile() throws an AmazonClientException:
// 1) SocketException or SSLProtocolException when writing to disk (e.g. when the user aborts the download)
// 2) Other IOException when writing to disk
// 3) MD5 hashes don't match)
// For 1), the download is retried only if the SocketException was caused by the client side resetting the connection.
// Cases 2) and 3) are always retried.
final Throwable cause = ace.getCause();
if ((cause instanceof SocketException && !cause.getMessage().equals("Connection reset"))
|| (cause instanceof SSLProtocolException)) {
throw ace;
} else {
if (hasRetried)
throw ace;
else {
LOG.info("Retry the download of object " + s3Object.getKey() + " (bucket " + s3Object.getBucketName() + ")", ace);
hasRetried = true;
}
}
} finally {
s3Object.getObjectContent().abort();
}
}
}
private boolean isTimeoutEnabled() {
return timeout > 0;
}
private static boolean testing;
/**
* Used for testing purposes only.
*/
static void setTesting(boolean b) {
testing = b;
}
}

View File

@@ -0,0 +1,202 @@
package org.talend.aws;
import java.io.File;
import java.io.IOException;
import com.amazonaws.annotation.SdkInternalApi;
import com.amazonaws.event.ProgressEventType;
import com.amazonaws.event.ProgressListenerChain;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.transfer.TransferProgress;
import com.amazonaws.services.s3.transfer.exception.PauseException;
import com.amazonaws.services.s3.transfer.internal.S3ProgressPublisher;
import com.amazonaws.services.s3.transfer.internal.TransferManagerUtils;
import com.amazonaws.services.s3.transfer.internal.TransferStateChangeListener;
public class DownloadImpl extends AbstractTransfer implements Download {
private S3Object s3Object;
/**
* Information needed to resume the download if it is paused.
*/
private PersistableDownload persistableDownload;
/**
* The last part that has been successfully written into the downloaded file.
*/
private Integer lastFullyDownloadedPartNumber;
private final GetObjectRequest getObjectRequest;
private final File file;
private final ObjectMetadata objectMetadata;
private final ProgressListenerChain progressListenerChain;
@Deprecated
public DownloadImpl(String description, TransferProgress transferProgress,
ProgressListenerChain progressListenerChain, S3Object s3Object, TransferStateChangeListener listener,
GetObjectRequest getObjectRequest, File file) {
this(description, transferProgress, progressListenerChain, s3Object, listener,
getObjectRequest, file, null, false);
}
public DownloadImpl(String description, TransferProgress transferProgress,
ProgressListenerChain progressListenerChain, S3Object s3Object, TransferStateChangeListener listener,
GetObjectRequest getObjectRequest, File file,
ObjectMetadata objectMetadata, boolean isDownloadParallel) {
super(description, transferProgress, progressListenerChain, listener);
this.s3Object = s3Object;
this.objectMetadata = objectMetadata;
this.getObjectRequest = getObjectRequest;
this.file = file;
this.progressListenerChain = progressListenerChain;
this.persistableDownload = captureDownloadState(getObjectRequest, file);
S3ProgressPublisher.publishTransferPersistable(progressListenerChain, persistableDownload);
}
/**
* Returns the ObjectMetadata for the object being downloaded.
*
* @return The ObjectMetadata for the object being downloaded.
*/
public synchronized ObjectMetadata getObjectMetadata() {
if (s3Object != null) {
return s3Object.getObjectMetadata();
}
return objectMetadata;
}
/**
* The name of the bucket where the object is being downloaded from.
*
* @return The name of the bucket where the object is being downloaded from.
*/
public String getBucketName() {
return getObjectRequest.getBucketName();
}
/**
* The key under which this object was stored in Amazon S3.
*
* @return The key under which this object was stored in Amazon S3.
*/
public String getKey() {
return getObjectRequest.getKey();
}
/**
* Only for internal use.
* For parallel downloads, updates the persistableTransfer each time a
* part is successfully merged into the download file, then notifies the
* listeners that a new persistableTransfer is available.
*/
@SdkInternalApi
public void updatePersistableTransfer(Integer lastFullyDownloadedPartNumber) {
synchronized (this) {
this.lastFullyDownloadedPartNumber = lastFullyDownloadedPartNumber;
}
persistableDownload = captureDownloadState(getObjectRequest, file);
S3ProgressPublisher.publishTransferPersistable(progressListenerChain, persistableDownload);
}
/**
* For parallel downloads, returns the last part number that was
* successfully written into the download file.
* Returns null for serial downloads.
*/
public synchronized Integer getLastFullyDownloadedPartNumber() {
return lastFullyDownloadedPartNumber;
}
/**
* Cancels this download.
*
* @throws IOException
*/
public synchronized void abort() throws IOException {
this.monitor.getFuture().cancel(true);
if ( s3Object != null ) {
s3Object.getObjectContent().abort();
}
setState(TransferState.Canceled);
}
/**
* Cancels this download, but skips notifying the state change listeners.
*
* @throws IOException
*/
public synchronized void abortWithoutNotifyingStateChangeListener() throws IOException {
this.monitor.getFuture().cancel(true);
this.state = TransferState.Canceled;
}
/**
* Set the S3 object to download.
*/
public synchronized void setS3Object(S3Object s3Object) {
this.s3Object = s3Object;
}
/**
* This method is also responsible for firing the COMPLETED signal to the
* listeners.
*/
@Override
public void setState(TransferState state) {
super.setState(state);
switch (state) {
case Completed :
fireProgressEvent(ProgressEventType.TRANSFER_COMPLETED_EVENT);
break;
case Canceled:
fireProgressEvent(ProgressEventType.TRANSFER_CANCELED_EVENT);
break;
case Failed:
fireProgressEvent(ProgressEventType.TRANSFER_FAILED_EVENT);
break;
default:
break;
}
}
/**
* Returns the captured state of the download, or null if it should not be
* captured (for security reasons).
*/
private PersistableDownload captureDownloadState(
final GetObjectRequest getObjectRequest, final File file) {
if (getObjectRequest.getSSECustomerKey() == null) {
return new PersistableDownload(
getObjectRequest.getBucketName(), getObjectRequest.getKey(),
getObjectRequest.getVersionId(), getObjectRequest.getRange(),
getObjectRequest.getResponseHeaders(), getObjectRequest.isRequesterPays(),
file.getAbsolutePath(), getLastFullyDownloadedPartNumber(),
getObjectMetadata().getLastModified().getTime());
}
return null;
}
/*
* (non-Javadoc)
*
* @see com.amazonaws.services.s3.transfer.Download#pause()
*/
@Override
public PersistableDownload pause() throws PauseException {
boolean forceCancel = true;
TransferState currentState = getState();
this.monitor.getFuture().cancel(true);
if (persistableDownload == null) {
throw new PauseException(TransferManagerUtils.determinePauseStatus(
currentState, forceCancel));
}
return persistableDownload;
}
}
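
Since updatePersistableTransfer publishes the new state through S3ProgressPublisher, a caller can observe it by supplying an S3ProgressListener. A hedged sketch follows; the state-file path is hypothetical, and serialize() is assumed from the SDK's PersistableTransfer base class. An instance of this listener would be passed as the S3ProgressListener argument of TransferManager#download defined later in this changeset.

package org.talend.aws.examples; // hypothetical example package

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import com.amazonaws.event.ProgressEvent;
import com.amazonaws.services.s3.transfer.PersistableTransfer;
import com.amazonaws.services.s3.transfer.internal.S3ProgressListener;

public class SaveStateListener implements S3ProgressListener {

    private final Path stateFile = Paths.get("/tmp/download-state.json"); // hypothetical location

    @Override
    public void onPersistableTransfer(PersistableTransfer persistableTransfer) {
        // Invoked whenever the resume state is (re)captured, including each time
        // a part is fully merged into the destination file of a parallel download.
        try {
            Files.write(stateFile, persistableTransfer.serialize().getBytes(StandardCharsets.UTF_8));
        } catch (IOException e) {
            throw new RuntimeException("Could not persist download state", e);
        }
    }

    @Override
    public void progressChanged(ProgressEvent progressEvent) {
        // Byte-level progress is not needed for this sketch.
    }
}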

View File

@@ -0,0 +1,30 @@
package org.talend.aws;
import com.amazonaws.services.s3.transfer.internal.TransferMonitor;
import java.util.concurrent.Future;
public class DownloadMonitor implements TransferMonitor {
private Future<?> future;
private final DownloadImpl download;
public DownloadMonitor(DownloadImpl download, Future<?> future) {
this.download = download;
this.future = future;
}
@Override
public synchronized Future<?> getFuture() {
return future;
}
public synchronized void setFuture(Future<?> future) {
this.future = future;
}
@Override
public boolean isDone() {
return download.isDone();
}
}

View File

@@ -0,0 +1,52 @@
package org.talend.aws;
import com.amazonaws.util.StringUtils;
import java.io.File;
import java.util.UUID;
import java.util.concurrent.Callable;
import com.amazonaws.SdkClientException;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GetObjectRequest;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
/**
* Helper class that gets a part from S3, writes the part data to a
* temporary file and returns the temporary file.
*/
public class DownloadPartCallable implements Callable<File> {
private static final Log LOG = LogFactory.getLog(DownloadPartCallable.class);
private static final String TEMP_FILE_MIDDLE_NAME = ".part.";
private final AmazonS3 s3;
private final GetObjectRequest getPartRequest;
private final File destinationFile;
private final String destinationFilePath;
public DownloadPartCallable(AmazonS3 s3, GetObjectRequest getPartRequest, File destinationFile) {
this.s3 = s3;
this.getPartRequest = getPartRequest;
this.destinationFile = destinationFile;
this.destinationFilePath = destinationFile.getAbsolutePath();
}
public File call() throws Exception {
final File partFile = File.createTempFile(
UUID.nameUUIDFromBytes(destinationFile.getName().getBytes(StringUtils.UTF8)).toString(),
TEMP_FILE_MIDDLE_NAME + getPartRequest.getPartNumber().toString(),
new File(destinationFilePath.substring(0, destinationFilePath.lastIndexOf(File.separator))));
try {
partFile.deleteOnExit();
} catch (SecurityException exception) {
LOG.warn("SecurityException denied delete access to file " + partFile.getAbsolutePath());
}
if (s3.getObject(getPartRequest, partFile) == null) {
throw new SdkClientException(
"There is no object in S3 satisfying this request. The getObject method returned null");
}
return partFile;
}
}

View File

@@ -0,0 +1,37 @@
package org.talend.aws;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Encryption;
import com.amazonaws.services.s3.internal.ServiceUtils;
import com.amazonaws.services.s3.internal.SkipMd5CheckStrategy;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
final class DownloadTaskImpl implements
ServiceUtils.RetryableS3DownloadTask
{
private final AmazonS3 s3;
private final DownloadImpl download;
private final GetObjectRequest getObjectRequest;
private final SkipMd5CheckStrategy skipMd5CheckStrategy = SkipMd5CheckStrategy.INSTANCE;
DownloadTaskImpl(AmazonS3 s3, DownloadImpl download,
GetObjectRequest getObjectRequest) {
this.s3 = s3;
this.download = download;
this.getObjectRequest = getObjectRequest;
}
@Override
public S3Object getS3ObjectStream() {
S3Object s3Object = s3.getObject(getObjectRequest);
download.setS3Object(s3Object);
return s3Object;
}
@Override
public boolean needIntegrityCheck() {
// Don't perform the integrity check if the checksum won't match up.
return !(s3 instanceof AmazonS3Encryption) && !skipMd5CheckStrategy.skipClientSideValidationPerRequest(getObjectRequest);
}
}

View File

@@ -0,0 +1,159 @@
package org.talend.aws;
import com.amazonaws.services.s3.model.ResponseHeaderOverrides;
import com.amazonaws.services.s3.transfer.PersistableTransfer;
import com.fasterxml.jackson.annotation.JsonProperty;
/**
* An opaque token that holds some private state and can be used to resume a
* paused download operation.
*/
public final class PersistableDownload extends PersistableTransfer {
static final String TYPE = "download";
@JsonProperty
private final String pauseType = TYPE;
/** The name of the Amazon S3 bucket from which the object is downloaded. */
@JsonProperty
private final String bucketName;
/** The key of the object in Amazon S3 to be downloaded. */
@JsonProperty
private final String key;
/** The version id of the object in Amazon S3 to download. */
@JsonProperty
private final String versionId;
/** Optional member indicating the byte range of data to retrieve */
@JsonProperty
private final long[] range;
/**
* Optional field that overrides headers on the response.
*/
@JsonProperty
private final ResponseHeaderOverrides responseHeaders;
/**
* If enabled, the requester is charged for downloading the data from
* Requester Pays Buckets.
*/
@JsonProperty
private final boolean isRequesterPays;
/**
* File where the downloaded data is written.
*/
@JsonProperty
private final String file;
/**
* The last part that has been successfully written into the downloaded file.
*/
@JsonProperty
private final Integer lastFullyDownloadedPartNumber;
/**
* Last Modified/created time on Amazon S3 for this object.
*/
@JsonProperty
private final long lastModifiedTime;
public PersistableDownload() {
this(null, null, null, null, null, false, null, null, 0L);
}
public PersistableDownload(
@JsonProperty(value = "bucketName") String bucketName,
@JsonProperty(value = "key") String key,
@JsonProperty(value = "versionId") String versionId,
@JsonProperty(value = "range") long[] range,
@JsonProperty(value = "responseHeaders") ResponseHeaderOverrides responseHeaders,
@JsonProperty(value = "isRequesterPays") boolean isRequesterPays,
@JsonProperty(value = "file") String file,
@JsonProperty(value = "lastFullyDownloadedPartNumber") Integer lastFullyDownloadedPartNumber,
@JsonProperty(value = "lastModifiedTime") long lastModifiedTime) {
this.bucketName = bucketName;
this.key = key;
this.versionId = versionId;
this.range = range == null ? null : range.clone();
this.responseHeaders = responseHeaders;
this.isRequesterPays = isRequesterPays;
this.file = file;
this.lastFullyDownloadedPartNumber = lastFullyDownloadedPartNumber;
this.lastModifiedTime = lastModifiedTime;
}
/**
* Returns the name of the bucket.
*/
String getBucketName() {
return bucketName;
}
/**
* Returns the name of the object.
*/
String getKey() {
return key;
}
/**
* Returns the version id of the object.
*/
String getVersionId() {
return versionId;
}
/**
* Returns the byte range of the object to download.
*/
long[] getRange() {
return range == null ? null : range.clone();
}
/**
* Returns the optional response headers.
*/
ResponseHeaderOverrides getResponseHeaders() {
return responseHeaders;
}
/**
* Returns true if Requester Pays is enabled on the Amazon S3 bucket,
* otherwise false.
*/
boolean isRequesterPays() {
return isRequesterPays;
}
/**
* Returns the file where the object is to be downloaded.
*/
String getFile() {
return file;
}
String getPauseType() {
return pauseType;
}
/**
* Returns the last part number that was successfully written into the downloaded file.
*/
Integer getLastFullyDownloadedPartNumber() {
return lastFullyDownloadedPartNumber;
}
/**
* Returns the last modified/created time of the object represented by
* the bucketName and key.
*/
Long getlastModifiedTime() {
return lastModifiedTime;
}
}
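
The getters above are package-private, so any resume logic would live in the same package. This changeset does not show a resume entry point; the following is only a hedged illustration of how the persisted fields could be mapped back onto a GetObjectRequest.

package org.talend.aws;

import com.amazonaws.services.s3.model.GetObjectRequest;

// Hypothetical helper, not part of this changeset.
final class ResumeRequestSketch {

    static GetObjectRequest toGetObjectRequest(PersistableDownload state) {
        GetObjectRequest req = new GetObjectRequest(
                state.getBucketName(), state.getKey(), state.getVersionId());
        long[] range = state.getRange();
        if (range != null && range.length == 2) {
            // Restore the byte range captured at pause time.
            req.setRange(range[0], range[1]);
        }
        req.setResponseHeaders(state.getResponseHeaders());
        req.setRequesterPays(state.isRequesterPays());
        return req;
    }
}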

View File

@@ -0,0 +1,17 @@
package org.talend.aws;
import com.amazonaws.event.ProgressEvent;
import com.amazonaws.event.ProgressEventFilter;
import com.amazonaws.event.ProgressEventType;
final class TransferCompletionFilter implements ProgressEventFilter {
@Override
public ProgressEvent filter(ProgressEvent progressEvent) {
// Block COMPLETE events from the low-level GetObject operation,
// but still pass through the BytesTransferred events.
return progressEvent.getEventType() == ProgressEventType.TRANSFER_COMPLETED_EVENT
? null // discard this event
: progressEvent
;
}
}

View File

@@ -0,0 +1,233 @@
package org.talend.aws;
import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonWebServiceRequest;
import com.amazonaws.event.ProgressListenerChain;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.internal.FileLocks;
import com.amazonaws.services.s3.internal.RequestCopyUtils;
import com.amazonaws.services.s3.internal.ServiceUtils;
import com.amazonaws.services.s3.model.GetObjectMetadataRequest;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.transfer.TransferManagerConfiguration;
import com.amazonaws.services.s3.transfer.TransferProgress;
import com.amazonaws.services.s3.transfer.exception.FileLockException;
import com.amazonaws.services.s3.transfer.internal.S3ProgressListener;
import com.amazonaws.services.s3.transfer.internal.S3ProgressListenerChain;
import com.amazonaws.services.s3.transfer.internal.TransferManagerUtils;
import com.amazonaws.services.s3.transfer.internal.TransferStateChangeListener;
import com.amazonaws.services.s3.transfer.internal.TransferProgressUpdatingListener;
import com.amazonaws.util.VersionInfoUtils;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import java.io.File;
import java.util.Date;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;
public class TransferManager {
private static final Log log = LogFactory.getLog(TransferManager.class);
private final AmazonS3 s3;
private final ExecutorService executorService;
private final TransferManagerConfiguration configuration;
private final boolean shutDownThreadPools;
public TransferManager(AmazonS3 s3) {
this.s3 = s3;
this.executorService = TransferManagerUtils.createDefaultExecutorService();
this.configuration = resolveConfiguration();
this.shutDownThreadPools = true;
}
private TransferManagerConfiguration resolveConfiguration() {
TransferManagerConfiguration configuration = new TransferManagerConfiguration();
configuration.setDisableParallelDownloads(false);
return configuration;
}
public Download download(GetObjectRequest getObjectRequest, File file, S3ProgressListener progressListener,
long timeoutMillis, boolean resumeOnRetry) {
return doDownload(getObjectRequest, file, null, progressListener, ServiceUtils.OVERWRITE_MODE, timeoutMillis, null, 0L,
resumeOnRetry);
}
private Download doDownload(final GetObjectRequest getObjectRequest,
final File file, final TransferStateChangeListener stateListener,
final S3ProgressListener s3progressListener,
final boolean resumeExistingDownload,
final long timeoutMillis,
final Integer lastFullyDownloadedPart,
final long lastModifiedTimeRecordedDuringPause,
final boolean resumeOnRetry)
{
assertParameterNotNull(getObjectRequest,
"A valid GetObjectRequest must be provided to initiate download");
assertParameterNotNull(file,
"A valid file must be provided to download into");
appendSingleObjectUserAgent(getObjectRequest);
String description = "Downloading from " + getObjectRequest.getBucketName() + "/" + getObjectRequest.getKey();
TransferProgress transferProgress = new TransferProgress();
// S3 progress listener to capture the persistable transfer when available
S3ProgressListenerChain listenerChain = new S3ProgressListenerChain(
// The listener for updating transfer progress
new TransferProgressUpdatingListener(transferProgress),
getObjectRequest.getGeneralProgressListener(),
s3progressListener); // Listeners included in the original request
// The listener chain used by the low-level GetObject request.
// This listener chain ignores any COMPLETE event, so that we could
// delay firing the signal until the high-level download fully finishes.
getObjectRequest
.setGeneralProgressListener(new ProgressListenerChain(new TransferCompletionFilter(), listenerChain));
GetObjectMetadataRequest getObjectMetadataRequest = RequestCopyUtils.createGetObjectMetadataRequestFrom(getObjectRequest);
final ObjectMetadata objectMetadata = s3.getObjectMetadata(getObjectMetadataRequest);
// Used to check if the object is modified between pause and resume
long lastModifiedTime = objectMetadata.getLastModified().getTime();
long startingByte = 0;
long lastByte;
long[] range = getObjectRequest.getRange();
if (range != null && range.length == 2) {
startingByte = range[0];
lastByte = range[1];
} else {
lastByte = objectMetadata.getContentLength() - 1;
}
final long origStartingByte = startingByte;
final boolean isDownloadParallel = !configuration.isDisableParallelDownloads()
&& TransferManagerUtils.isDownloadParallelizable(s3, getObjectRequest, ServiceUtils.getPartCount(getObjectRequest, s3));
// We still pass the unfiltered listener chain into DownloadImpl
final DownloadImpl download = new DownloadImpl(description, transferProgress, listenerChain, null,
stateListener, getObjectRequest, file, objectMetadata, isDownloadParallel);
long totalBytesToDownload = lastByte - startingByte + 1;
transferProgress.setTotalBytesToTransfer(totalBytesToDownload);
// Range information is needed for auto retry of downloads so a retry
// request can start at the last downloaded location in the range.
//
// For obvious reasons, setting a Range header only makes sense if the
// object actually has content because it's inclusive, otherwise S3
// responds with 4xx
//
// In addition, we only set the range if the download was *NOT*
// determined to be parallelizable above. One of the conditions for
// parallel downloads is that getRange() returns null so preserve that.
if (totalBytesToDownload > 0 && !isDownloadParallel) {
getObjectRequest.withRange(startingByte, lastByte);
}
long fileLength = -1;
if (resumeExistingDownload) {
if (isS3ObjectModifiedSincePause(lastModifiedTime, lastModifiedTimeRecordedDuringPause)) {
throw new AmazonClientException("The requested object in bucket " + getObjectRequest.getBucketName()
+ " with key " + getObjectRequest.getKey() + " is modified on Amazon S3 since the last pause.");
}
// There's still a chance the object is modified while the request
// is in flight. Set this header so S3 fails the request if this happens.
getObjectRequest.setUnmodifiedSinceConstraint(new Date(lastModifiedTime));
if (!isDownloadParallel) {
if (!FileLocks.lock(file)) {
throw new FileLockException("Fail to lock " + file + " for resume download");
}
try {
if (file.exists()) {
fileLength = file.length();
startingByte = startingByte + fileLength;
getObjectRequest.setRange(startingByte, lastByte);
transferProgress.updateProgress(Math.min(fileLength, totalBytesToDownload));
totalBytesToDownload = lastByte - startingByte + 1;
if (log.isDebugEnabled()) {
log.debug("Resume download: totalBytesToDownload=" + totalBytesToDownload
+ ", origStartingByte=" + origStartingByte + ", startingByte=" + startingByte
+ ", lastByte=" + lastByte + ", numberOfBytesRead=" + fileLength + ", file: "
+ file);
}
}
} finally {
FileLocks.unlock(file);
}
}
}
if (totalBytesToDownload < 0) {
throw new IllegalArgumentException(
"Unable to determine the range for download operation.");
}
final CountDownLatch latch = new CountDownLatch(1);
Future<?> future = executorService.submit(
new DownloadCallable(s3, latch,
getObjectRequest, resumeExistingDownload,
download, file, origStartingByte, fileLength, timeoutMillis, timedThreadPool,
executorService, lastFullyDownloadedPart, isDownloadParallel, resumeOnRetry));
download.setMonitor(new DownloadMonitor(download, future));
latch.countDown();
return download;
}
public void shutdownNow(boolean shutDownS3Client) {
if (shutDownThreadPools) {
executorService.shutdownNow();
timedThreadPool.shutdownNow();
}
if (shutDownS3Client) {
s3.shutdown();
}
}
private void assertParameterNotNull(Object parameterValue, String errorMessage) {
if (parameterValue == null) throw new IllegalArgumentException(errorMessage);
}
public static <X extends AmazonWebServiceRequest> X appendSingleObjectUserAgent(X request) {
request.getRequestClientOptions().appendUserAgent(USER_AGENT);
return request;
}
private static final String USER_AGENT = TransferManager.class.getName() + "/" + VersionInfoUtils.getVersion();
private boolean isS3ObjectModifiedSincePause(final long lastModifiedTimeRecordedDuringResume,
long lastModifiedTimeRecordedDuringPause) {
return lastModifiedTimeRecordedDuringResume != lastModifiedTimeRecordedDuringPause;
}
private final ScheduledExecutorService timedThreadPool = new ScheduledThreadPoolExecutor(1, daemonThreadFactory);
private static final ThreadFactory daemonThreadFactory = new ThreadFactory() {
final AtomicInteger threadCount = new AtomicInteger( 0 );
public Thread newThread(Runnable r) {
int threadNumber = threadCount.incrementAndGet();
Thread thread = new Thread(r);
thread.setDaemon(true);
thread.setName("S3TransferManagerTimedThread-" + threadNumber);
return thread;
}
};
@Override
protected void finalize() throws Throwable {
shutdownThreadPools();
}
private void shutdownThreadPools() {
if (shutDownThreadPools) {
executorService.shutdown();
timedThreadPool.shutdown();
}
}
}
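
A hedged end-to-end sketch of the class above, combining the timeout and resume-on-retry parameters with the SaveStateListener sketched after DownloadImpl. The bucket, key and paths are hypothetical, and waitForCompletion() is assumed to be inherited from AbstractTransfer.

package org.talend.aws.examples; // hypothetical example package

import java.io.File;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GetObjectRequest;
import org.talend.aws.Download;
import org.talend.aws.TransferManager;

public class ParallelDownloadSketch {
    public static void main(String[] args) throws Exception {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        TransferManager tm = new TransferManager(s3);
        try {
            Download download = tm.download(
                    new GetObjectRequest("my-bucket", "my-large-object"), // hypothetical
                    new File("/tmp/my-large-object"),                     // hypothetical
                    new SaveStateListener(),   // see the listener sketch after DownloadImpl
                    30L * 60L * 1000L,         // abort the transfer after 30 minutes
                    true);                     // resume from the downloaded bytes on a retry
            download.waitForCompletion();      // assumed to be inherited from AbstractTransfer
        } finally {
            tm.shutdownNow(true);              // also shuts down the AmazonS3 client
        }
    }
}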

View File

@@ -2,9 +2,9 @@
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.talend.libraries</groupId>
<artifactId>talendExcel-1.5-20200825</artifactId>
<version>6.0.0</version>
<groupId>org.talend.components</groupId>
<artifactId>talendExcel</artifactId>
<version>1.8-20201113</version>
<packaging>jar</packaging>
<name>talendExcel</name>
@@ -43,47 +43,30 @@
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-scratchpad</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml -->
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.poi/poi-ooxml-schemas -->
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml-schemas</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.geronimo.specs/geronimo-stax-api_1.0_spec -->
<dependency>
<groupId>org.apache.geronimo.specs</groupId>
<artifactId>geronimo-stax-api_1.0_spec</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>org.dom4j</groupId>
<artifactId>dom4j</artifactId>
<version>2.1.3</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.xmlbeans/xmlbeans -->
<dependency>
<groupId>org.apache.xmlbeans</groupId>
<artifactId>xmlbeans</artifactId>
<version>3.1.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-lang3 -->
<dependency>
<groupId>org.apache.commons</groupId>

View File

@@ -35,12 +35,12 @@ public class ExcelTool {
private Workbook wb = null;
private Workbook preWb = null;
private String sheetName = null;
private Sheet sheet = null;
private Workbook preWb = null;
private Sheet preSheet = null;
private Row curRow = null;
@@ -161,12 +161,8 @@ public class ExcelTool {
}
private void appendActionForFile(String fileName) throws Exception {
if (password == null) {
InputStream inp = new FileInputStream(fileName);
wb = WorkbookFactory.create(inp);
} else {
wb = readEncryptedFile(fileName);
}
InputStream inp = new FileInputStream(fileName);
wb = WorkbookFactory.create(inp, password);
sheet = wb.getSheet(sheetName);
if (sheet != null) {
if (appendSheet) {
@@ -191,27 +187,10 @@ public class ExcelTool {
}
private void initPreXlsx(String fileName) throws Exception {
if(password == null) {
InputStream preIns = new FileInputStream(fileName);
preWb = WorkbookFactory.create(preIns);
} else {
preWb = readEncryptedFile(fileName);
}
InputStream preIns = new FileInputStream(fileName);
preWb = WorkbookFactory.create(preIns, password);
preSheet = preWb.getSheet(sheetName);
}
private Workbook readEncryptedFile(String fileName)
throws IOException, GeneralSecurityException {
InputStream inp = new FileInputStream(fileName);
POIFSFileSystem fs = new POIFSFileSystem(inp);
EncryptionInfo info = new EncryptionInfo(fs);
Decryptor decryptor = Decryptor.getInstance(info);
if (!decryptor.verifyPassword(password)) {
throw new GeneralSecurityException("Error: Incorrect password!");
}
InputStream dataStream = decryptor.getDataStream(fs);
return WorkbookFactory.create(dataStream);
}
public void setFont(String fontName) {
if (StringUtils.isNotEmpty(fontName)) {
@@ -222,11 +201,7 @@ public class ExcelTool {
public void addRow() {
if (isAbsY && keepCellFormat) {
if (preSheet != null) {
preRow = preSheet.getRow(curY);
} else {
preRow = null;
}
preRow = (preSheet != null) ? preSheet.getRow(curY) : null;
}
curRow = sheet.getRow(curY);
if (curRow == null) {
@@ -244,11 +219,7 @@ public class ExcelTool {
private void addCell() {
if (isAbsY && keepCellFormat) {
if (preRow != null) {
preCell = preRow.getCell(startX + xOffset);
} else {
preCell = null;
}
preCell = (preRow != null) ? preRow.getCell(startX + xOffset) : null;
}
curCell = curRow.createCell(startX + xOffset);
xOffset++;
@@ -267,10 +238,8 @@ public class ExcelTool {
cellStylesMapping.put("normal", style);
return style;
}
} else {
return preCellStyle;
}
return preCellStyle;
}
private CellStyle getDateCellStyle(String pattern) {
@@ -289,9 +258,8 @@ public class ExcelTool {
cellStylesMapping.put(pattern, style);
return style;
}
} else {
return preCellStyle;
}
return preCellStyle;
}
private CellStyle getPreCellStyle() {
@@ -308,9 +276,8 @@ public class ExcelTool {
return targetCellStyle;
} else {
return null;
}
return null;
}
public void addCellValue(boolean booleanValue) {
@@ -364,6 +331,9 @@ public class ExcelTool {
try {
wb.write(outputStream);
wb.close();
if(preWb != null){
preWb.close();
}
} finally {
if (outputStream != null) {
outputStream.close();
@@ -382,13 +352,11 @@ public class ExcelTool {
if (appendWorkbook && appendSheet && recalculateFormula) {
evaluateFormulaCell();
}
FileOutputStream fileOutput = new FileOutputStream(fileName);
POIFSFileSystem fs = null;
try {
try (FileOutputStream fileOutput = new FileOutputStream(fileName);
POIFSFileSystem fs = new POIFSFileSystem()) {
if (password == null) {
wb.write(fileOutput);
} else {
fs = new POIFSFileSystem();
Encryptor encryptor = new EncryptionInfo(EncryptionMode.agile).getEncryptor();
encryptor.confirmPassword(password);
OutputStream encryptedDataStream = encryptor.getDataStream(fs);
@@ -398,9 +366,8 @@ public class ExcelTool {
}
} finally {
wb.close();
fileOutput.close();
if (fs != null) {
fs.close();
if(preWb != null){
preWb.close();
}
}
}
@@ -411,8 +378,8 @@ public class ExcelTool {
sheet = wb.getSheetAt(sheetNum);
for (Row r : sheet) {
for (Cell c : r) {
if (c.getCellTypeEnum() == CellType.FORMULA) {
evaluator.evaluateFormulaCellEnum(c);
if (c.getCellType() == CellType.FORMULA) {
evaluator.evaluateFormulaCell(c);
}
}
}
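
For reference, a minimal standalone sketch of the POI agile-encryption write path that the hunk above converts to try-with-resources; the output path and password are hypothetical.

package org.talend.excel.examples; // hypothetical example package

import java.io.FileOutputStream;
import java.io.OutputStream;
import org.apache.poi.poifs.crypt.EncryptionInfo;
import org.apache.poi.poifs.crypt.EncryptionMode;
import org.apache.poi.poifs.crypt.Encryptor;
import org.apache.poi.poifs.filesystem.POIFSFileSystem;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

public class EncryptedWriteSketch {
    public static void main(String[] args) throws Exception {
        try (Workbook wb = new XSSFWorkbook();
             POIFSFileSystem fs = new POIFSFileSystem()) {
            wb.createSheet("Sheet1").createRow(0).createCell(0).setCellValue("hello");

            Encryptor encryptor = new EncryptionInfo(EncryptionMode.agile).getEncryptor();
            encryptor.confirmPassword("secret"); // hypothetical password

            // The workbook is written into the encryptor's stream inside the POIFS container...
            try (OutputStream encrypted = encryptor.getDataStream(fs)) {
                wb.write(encrypted);
            }
            // ...and the container itself is what ends up on disk.
            try (FileOutputStream out = new FileOutputStream("/tmp/out.xlsx")) { // hypothetical path
                fs.writeFilesystem(out);
            }
        }
    }
}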

View File

@@ -1,61 +1,61 @@
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.talend.libraries</groupId>
<artifactId>talendMsgMailUtil-1.1-20191012</artifactId>
<name>talendMsgMailUtil</name>
<version>6.0.0</version>
<groupId>org.talend.components</groupId>
<artifactId>talendMsgMailUtil</artifactId>
<version>1.2-20200923</version>
<packaging>jar</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<talend.nexus.url>https://artifacts-oss.talend.com</talend.nexus.url>
<java.source.version>1.5</java.source.version>
</properties>
<distributionManagement>
<snapshotRepository>
<id>talend_nexus_deployment</id>
<url>${talend.nexus.url}/nexus/content/repositories/TalendOpenSourceSnapshot/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
<releases>
<enabled>false</enabled>
</releases>
</snapshotRepository>
<repository>
<id>talend_nexus_deployment</id>
<url>${talend.nexus.url}/nexus/content/repositories/TalendOpenSourceRelease/</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
<releases>
<enabled>true</enabled>
</releases>
</repository>
</distributionManagement>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<talend.nexus.url>https://artifacts-oss.talend.com</talend.nexus.url>
<java.source.version>1.8</java.source.version>
</properties>
<distributionManagement>
<snapshotRepository>
<id>talend_nexus_deployment</id>
<url>${talend.nexus.url}/nexus/content/repositories/TalendOpenSourceSnapshot/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
<releases>
<enabled>false</enabled>
</releases>
</snapshotRepository>
<repository>
<id>talend_nexus_deployment</id>
<url>${talend.nexus.url}/nexus/content/repositories/TalendOpenSourceRelease/</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
<releases>
<enabled>true</enabled>
</releases>
</repository>
</distributionManagement>
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.25</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.25</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-scratchpad</artifactId>
<version>4.1.0</version>
<version>4.1.2</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.13</version>
<version>1.2.17</version>
</dependency>
</dependencies>
<build>
@@ -65,19 +65,19 @@
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.source.version}</target>
<showDeprecation>true</showDeprecation>
<showWarnings>true</showWarnings>
<compilerArgument>-XDignore.symbol.file</compilerArgument>
<fork>true</fork>
</configuration>
</plugin>
</plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.source.version}</target>
<showDeprecation>true</showDeprecation>
<showWarnings>true</showWarnings>
<compilerArgument>-XDignore.symbol.file</compilerArgument>
<fork>true</fork>
</configuration>
</plugin>
</plugins>
</build>
</project>

View File

@@ -7,10 +7,6 @@ import java.io.OutputStream;
import org.apache.poi.hsmf.MAPIMessage;
import org.apache.poi.hsmf.datatypes.AttachmentChunks;
import org.apache.poi.hsmf.datatypes.Chunk;
import org.apache.poi.hsmf.datatypes.MAPIProperty;
import org.apache.poi.hsmf.datatypes.StringChunk;
import org.apache.poi.hsmf.datatypes.Types;
import org.apache.poi.hsmf.exceptions.ChunkNotFoundException;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
@@ -36,12 +32,12 @@ public class MsgMailUtil {
public MsgMailUtil(String fileName, String outAttachmentPath)
throws IOException {
msg = new MAPIMessage(fileName);
this.msg = new MAPIMessage(fileName);
this.outAttachmentPath = outAttachmentPath;
}
public void activeLog(String logger_name, String position) {
this.log = LoggerFactory.getLogger(logger_name);
public void activeLog(String loggerName, String position) {
this.log = LoggerFactory.getLogger(loggerName);
this.position = position;
}
@@ -70,20 +66,14 @@ public class MsgMailUtil {
}
File attachedFile = new File(dir, fileName);
OutputStream fileOut = null;
try {
try(OutputStream fileOut = new FileOutputStream(attachedFile)) {
processLog(Level.INFO, "Exporting attachment file :" + fileName);
processLog(Level.INFO,
"File location:" + attachedFile.getAbsolutePath());
fileOut = new FileOutputStream(attachedFile);
fileOut.write(attachment.getEmbeddedAttachmentObject());
processLog(Level.INFO, "Export successfully");
} finally {
if (fileOut != null) {
fileOut.close();
}
}
}

View File

@@ -2,9 +2,21 @@
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.talend.libraries</groupId>
<groupId>org.talend.components</groupId>
<artifactId>talendzip</artifactId>
<version>1.0-20190917</version>
<version>1.1-20201120</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
</plugins>
</build>
<packaging>jar</packaging>
<properties>

View File

@@ -2,6 +2,7 @@ package com.talend.compress.zip;
import java.io.File;
import java.util.List;
import java.util.Optional;
import net.lingala.zip4j.core.ZipFile;
import net.lingala.zip4j.model.FileHeader;
@@ -44,6 +45,8 @@ public class Unzip {
this.useZip4jDecryption = useZip4jDecryption;
}
public void setEncording(String encording){this.encording = encording;}
private boolean needPassword = false;
private boolean useZip4jDecryption = false;
@@ -55,6 +58,8 @@ public class Unzip {
private String sourceZip;
private String targetDir;
private String encording;
public Unzip(String sourceZip, String targetDir) {
this.sourceZip = sourceZip;
@@ -92,6 +97,9 @@ public class Unzip {
}
ZipFile zipFile = new ZipFile(sourceZip);
if(encording != null){
zipFile.setFileNameCharset(encording);
}
if (checkArchive) {
if (!zipFile.isValidZipFile()) {
@@ -152,9 +160,8 @@ public class Unzip {
is = new javax.crypto.CipherInputStream(is,
org.talend.archive.IntegrityUtil.createCipher(
javax.crypto.Cipher.DECRYPT_MODE, password));
org.apache.commons.compress.archivers.zip.ZipArchiveInputStream input = new org.apache.commons.compress.archivers.zip.ZipArchiveInputStream(
new java.io.BufferedInputStream(is));
new java.io.BufferedInputStream(is),Optional.ofNullable(encording).orElse("UTF8"));
org.apache.commons.compress.archivers.zip.ZipArchiveEntry entry;
while ((entry = input.getNextZipEntry()) != null) {
@@ -190,7 +197,7 @@ public class Unzip {
org.apache.commons.compress.archivers.zip.ZipFile zip = null;
try {
zip = new org.apache.commons.compress.archivers.zip.ZipFile(
sourceZip);
sourceZip,Optional.ofNullable(encording).orElse("UTF8"));
java.util.Enumeration enuFiles = zip.getEntries();
java.io.InputStream is = null;
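
The encoding hook added above is passed to both extraction paths; as a point of comparison, a minimal hedged sketch of listing a zip with an explicit filename encoding through commons-compress (archive path and charset are hypothetical).

package com.talend.compress.zip.examples; // hypothetical example package

import java.util.Enumeration;
import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.apache.commons.compress.archivers.zip.ZipFile;

public class ZipEncodingSketch {
    public static void main(String[] args) throws Exception {
        // Pass an explicit filename encoding, as the patch above does with
        // Optional.ofNullable(encording).orElse("UTF8").
        try (ZipFile zip = new ZipFile("/tmp/archive.zip", "GBK")) { // hypothetical path/charset
            Enumeration<ZipArchiveEntry> entries = zip.getEntries();
            while (entries.hasMoreElements()) {
                System.out.println(entries.nextElement().getName());
            }
        }
    }
}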

View File

@@ -54,11 +54,11 @@ for(IConnection conn : outgoingConns) {
<%
}
}
log4jCodeGenerateUtil.query(node);
%>
query_<%=cid %> = <%=dbquery%>;
whetherReject_<%=cid%> = false;
<%
log4jCodeGenerateUtil.query(node, "query_" + cid);
List<IMetadataTable> metadatas = node.getMetadataList();
if ((metadatas!=null)&&(metadatas.size()>0)) {
IMetadataTable metadata = metadatas.get(0);
@@ -131,7 +131,7 @@ try {
<%
}
}
log4jCodeGenerateUtil.logInfo(node,"info",cid+" - Execute the query: '\" + "+dbquery +" + \"' has finished.");
log4jCodeGenerateUtil.logInfo(node,"info",cid+" - Execute the query: '\" + query_" + cid + " + \"' has finished.");
%>
<% //feature 0010425
if(usePrepareStatement){

View File

@@ -200,7 +200,7 @@
<DEFAULT>false</DEFAULT>
</PARAMETER>
<PARAMETER NAME="DATE_PATTERN" GROUP="LOAD_DETAILS" FIELD="TEXT" NUM_ROW="15" SHOW_IF="isShow[(DATE_FORMAT] AND (DATE_FORMAT == 'true')">
<PARAMETER NAME="DATE_PATTERN" GROUP="LOAD_DETAILS" FIELD="TEXT" NUM_ROW="15" SHOW_IF="isShow[DATE_FORMAT] AND (DATE_FORMAT == 'true')">
<DEFAULT>"yyyy-MM-dd"</DEFAULT>
</PARAMETER>

View File

@@ -1,10 +1,10 @@
<%@ jet
imports="
org.talend.core.model.process.INode
org.talend.core.model.process.ElementParameterParser
org.talend.core.model.process.INode
org.talend.core.model.process.ElementParameterParser
org.talend.designer.codegen.config.CodeGeneratorArgument
org.talend.core.model.metadata.IMetadataTable
org.talend.core.model.metadata.IMetadataColumn
org.talend.core.model.metadata.IMetadataTable
org.talend.core.model.metadata.IMetadataColumn
org.talend.core.model.process.IConnection
java.util.List
"
@@ -37,7 +37,7 @@
boolean useCustomNullMarker = ElementParameterParser.getBooleanValue(node, "__USE_CUSTOM_NULL_MARKER__");
String nullMarker = useCustomNullMarker ? ElementParameterParser.getValue(node, "__CUSTOM_NULL_MARKER__") : "\"\\\\N\"";
String passwordFieldName = "";
IConnection incomingConnection = null;
if(node.getUniqueName().startsWith("tBigQueryOutput_")) {
List< ? extends IConnection> conns = node.getIncomingConnections();
@@ -49,7 +49,7 @@
incomingConnection = virtConnection.getSource().getIncomingConnections().get(0);
}
}
if (authMode.equals("OAUTH")) {
%>
/* ----START-CREATING-CLIENT (OAuth 2.0)---- */
@@ -58,22 +58,22 @@
<%
passwordFieldName = "__CLIENT_SECRET__";
%>
<%@ include file="@{org.talend.designer.components.localprovider}/components/templates/password.javajet"%>
<%@ include file="@{org.talend.designer.components.localprovider}/components/templates/password.javajet"%>
final String CLIENT_SECRET_<%=cid%> = "{\"web\": {\"client_id\": \""+<%=clientId%>+"\",\"client_secret\": \"" +decryptedPassword_<%=cid%>+ "\",\"auth_uri\": \"https://accounts.google.com/o/oauth2/auth\",\"token_uri\": \"https://accounts.google.com/o/oauth2/token\"}}";
final String PROJECT_ID_<%=cid %> = <%=projectId %>;
// Static variables for API scope, callback URI, and HTTP/JSON functions
final List<String> SCOPES_<%=cid%> = java.util.Arrays.asList("https://www.googleapis.com/auth/bigquery");
final String REDIRECT_URI_<%=cid%> = "urn:ietf:wg:oauth:2.0:oob";
final com.google.api.client.http.HttpTransport TRANSPORT_<%=cid %> = new com.google.api.client.http.javanet.NetHttpTransport();
final com.google.api.client.json.JsonFactory JSON_FACTORY_<%=cid %> = new com.google.api.client.json.jackson2.JacksonFactory();
com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets clientSecrets_<%=cid%> = com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets.load(
new com.google.api.client.json.jackson2.JacksonFactory(), new java.io.InputStreamReader(new java.io.ByteArrayInputStream(
CLIENT_SECRET_<%=cid%>.getBytes())));
com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeFlow flow_<%=cid%> = null;
com.google.api.services.bigquery.Bigquery bigqueryclient_<%=cid%> = null;
long nb_line_<%=cid%> = 0;
@@ -111,7 +111,7 @@
%>
}
String storedRefreshToken_<%=cid%> = (String) properties_<%=cid%>.get("refreshtoken");
// Check to see if the an existing refresh token was loaded.
// If so, create a credential and call refreshToken() to get a new
// access token.
@@ -120,7 +120,7 @@
com.google.api.client.googleapis.auth.oauth2.GoogleCredential credential_<%=cid%> = new com.google.api.client.googleapis.auth.oauth2. GoogleCredential.Builder().setTransport(TRANSPORT_<%=cid%>)
.setJsonFactory(JSON_FACTORY_<%=cid%>).setClientSecrets(clientSecrets_<%=cid%>)
.build().setFromTokenResponse(new com.google.api.client.auth.oauth2.TokenResponse().setRefreshToken(storedRefreshToken_<%=cid%>));
credential_<%=cid%>.refreshToken();
<%
if(isLog4jEnabled){
@@ -186,13 +186,13 @@
if (outputStream_<%=cid%> != null) {
outputStream_<%=cid%>.close();
}
bigqueryclient_<%=cid%> = new com.google.api.services.bigquery.Bigquery.Builder(new com.google.api.client.http.javanet.NetHttpTransport(),new com.google.api.client.json.jackson2.JacksonFactory(),credential_<%=cid%>).build();
}
}
/* ----END-CREATING-CLIENT (OAuth 2.0)---- */
<%
} else if (authMode.equals("SERVICEACCOUNT")) {
%>
@@ -213,9 +213,35 @@
/* ----END-CREATING-CLIENT (Cloud API)---- */
long nb_line_<%=cid%> = 0;
<%
} else if (authMode.equals("TOKEN")) {
if (ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) {%>
final String decryptedAccessToken_<%=cid%> = routines.system.PasswordEncryptUtil.decryptPassword(<%=ElementParameterParser.getEncryptedValue(node, "__ACCESS_TOKEN__")%>);
<%} else {%>
final String decryptedAccessToken_<%=cid%> = <%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>;
<%}%>
/* ----START-CREATING-CLIENT (OAuth based Token)---- */
final String PROJECT_ID_<%=cid %> = <%=projectId %>;
com.google.api.client.auth.oauth2.Credential cred_<%=cid%> =
new com.google.api.client.auth.oauth2.Credential(com.google.api.client.auth.oauth2.BearerToken.authorizationHeaderAccessMethod()).setFromTokenResponse(
(new com.google.api.client.auth.oauth2.TokenResponse()).setAccessToken(decryptedAccessToken_<%=cid%>));
com.google.api.services.bigquery.Bigquery bigqueryclient_<%=cid%> = null;
long nb_line_<%=cid%> = 0;
<%
if(isLog4jEnabled){
%>
log.info("<%=cid%> - Creating client.");
<%
}
%>
bigqueryclient_<%=cid%> =
new com.google.api.services.bigquery.Bigquery.Builder(new com.google.api.client.http.javanet.NetHttpTransport(), new com.google.api.client.json.jackson2.JacksonFactory(), cred_<%=cid%>).setApplicationName("Talend").build();
/* ----END-CREATING-CLIENT (OAuth based Token)---- */
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\", \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
boolean bulkFileAlreadyExists = ElementParameterParser.getBooleanValue(node, "__BULK_FILE_ALREADY_EXIST__");
String accessKey = ElementParameterParser.getValue(node, "__GS_ACCESS_KEY__");
@@ -227,17 +253,17 @@
if(!bulkFileAlreadyExists) {
if (!useServiceAccountForConnection){
if ("USER_ACCOUNT_HMAC".equals(authType)){
%>
/* ----START-UPLOADING-FILE WITH HMAC ACCOUNT---- */
<%
passwordFieldName = "__GS_SECRET_KEY__";
%>
<%if (ElementParameterParser.canEncrypt(node, passwordFieldName)) {%>
<%if (ElementParameterParser.canEncrypt(node, passwordFieldName)) {%>
final String decryptedPwd_<%=cid%> = routines.system.PasswordEncryptUtil.decryptPassword(<%=ElementParameterParser.getEncryptedValue(node, passwordFieldName)%>);
<%} else {%>
final String decryptedPwd_<%=cid%> = <%= ElementParameterParser.getValue(node, passwordFieldName)%>;
final String decryptedPwd_<%=cid%> = <%= ElementParameterParser.getValue(node, passwordFieldName)%>;
<%}%>
@@ -262,33 +288,12 @@
log.info("<%=cid%> - Upload Done.");
<%
}
} else {
String pathToServerAccoutKeyFile = ElementParameterParser.getValue(node, "__GS_SERVICE_ACCOUNT_KEY__");
%>
/* ----START-UPLOADING-FILE WITH SERVICE ACCOUNT---- */
com.google.cloud.storage.Storage storage_<%=cid%> = null;
com.google.auth.Credentials credential_<%=cid%> = null;
try {
credential_<%=cid%> = com.google.auth.oauth2.GoogleCredentials
.fromStream(new java.io.FileInputStream(<%=pathToServerAccoutKeyFile%>));
} catch (IOException e_<%=cid%>) {
<%
if (("true").equals(dieOnError)) {
%>
throw(e_<%=cid%>);
<%
}
%>
<%
if(isLog4jEnabled){
%>
log.error("<%=cid%> - Exception in component <%=cid%>.", e_<%=cid%>);
<%
}
%>
}
com.google.cloud.storage.StorageOptions.Builder builder = com.google.cloud.storage.StorageOptions.newBuilder();
storage_<%=cid%> = builder.setCredentials(credential_<%=cid%>).build().getService();
} else {
boolean useExistingConn = false;
String gsProjectID = projectId;
String connection = cid;
%>
<%@ include file="@{org.talend.designer.components.localprovider}/components/templates/googleStorageConnection.javajet"%>
java.io.File file_<%=cid%> = new java.io.File(<%=localFilename%>);
@@ -302,6 +307,7 @@
<%
}
%>
/* ----END-UPLOADING-FILE---- */
<%
}
}
@@ -313,7 +319,7 @@
int currIndex_<%=cid%> = 0;
<%
}
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
if(isLog4jEnabled){
%>
log.info("<%=cid%> - Starting build a job.");
@@ -323,12 +329,15 @@
/* ----START-CREATING-JOB (OAuth 2.0)---- */
com.google.api.services.bigquery.model.Job job_<%=cid%> = new com.google.api.services.bigquery.model.Job();
job_<%=cid%>.setJobReference(new com.google.api.services.bigquery.model.JobReference().setProjectId(PROJECT_ID_<%=cid%>));
com.google.api.services.bigquery.model.JobConfiguration config_<%=cid%> = new com.google.api.services.bigquery.model.JobConfiguration();
com.google.api.services.bigquery.model.JobConfigurationLoad queryLoad_<%=cid%> = new com.google.api.services.bigquery.model.JobConfigurationLoad();
<%if (dropTable) {%>
try {
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
cred_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
bigqueryclient_<%=cid%>.tables().delete(PROJECT_ID_<%=cid%>, <%=dataset%>, <%=table%>).execute();
} catch (com.google.api.client.googleapis.json.GoogleJsonResponseException e_<%=cid%>) {
if (e_<%=cid%>.getDetails().getCode() != 404) {
@@ -351,10 +360,13 @@
}
}
<%}%>
<%if (createTableIfNotExist) { %>
try {
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
cred_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
com.google.api.services.bigquery.model.Table getTable = bigqueryclient_<%=cid%>.tables().get(PROJECT_ID_<%=cid%>, <%=dataset%>, <%=table%>).execute();
queryLoad_<%=cid%>.setCreateDisposition("CREATE_NEVER");
} catch (com.google.api.client.googleapis.json.GoogleJsonResponseException e_<%=cid%>){
@@ -497,15 +509,17 @@
destinationTable_<%=cid%>.setProjectId(PROJECT_ID_<%=cid%>);
destinationTable_<%=cid%>.setDatasetId(<%=dataset%>);
destinationTable_<%=cid%>.setTableId(<%=table%>);
queryLoad_<%=cid%>.setDestinationTable(destinationTable_<%=cid%>);
queryLoad_<%=cid%>.setSourceUris(java.util.Arrays.asList(<%=gsFile%>));
queryLoad_<%=cid%>.setSkipLeadingRows(<%=gsFileHeader%>);
queryLoad_<%=cid%>.setNullMarker(<%= nullMarker %>);
config_<%=cid%>.setLoad(queryLoad_<%=cid%>);
job_<%=cid%>.setConfiguration(config_<%=cid%>);
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
cred_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
com.google.api.services.bigquery.Bigquery.Jobs.Insert insertReq_<%=cid%> = bigqueryclient_<%=cid%>.jobs().insert("", job_<%=cid%>);
insertReq_<%=cid%>.setProjectId(PROJECT_ID_<%=cid%>);
<%
@@ -514,11 +528,17 @@
log.info("<%=cid%> - Build a job successfully.");
log.info("<%=cid%> - Starting load the job.");
<%
} else {
%>
System.out.println("Starting load job.");
<%
}
%>
com.google.api.services.bigquery.model.Job jobExec_<%=cid%> = null;
try {
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
cred_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
jobExec_<%=cid%> = insertReq_<%=cid%>.execute();
} catch (Exception ee_<%=cid%>) {
<%
@@ -540,9 +560,15 @@
}
if (jobExec_<%=cid%>.getStatus().getState().equals("RUNNING")
|| jobExec_<%=cid%>.getStatus().getState().equals("PENDING")) {
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
cred_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
com.google.api.services.bigquery.model.Job pollJob_<%=cid%> = bigqueryclient_<%=cid%>.jobs().get(PROJECT_ID_<%=cid%>,jobExec_<%=cid%>.getJobReference().getJobId()).setLocation(jobExec_<%=cid%>.getJobReference().getLocation()).execute();
while (pollJob_<%=cid%>.getStatus().getState().equals("RUNNING") || pollJob_<%=cid%>.getStatus().getState().equals("PENDING")) {
Thread.sleep(1000);
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
cred_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
pollJob_<%=cid%> = bigqueryclient_<%=cid%>.jobs().get(PROJECT_ID_<%=cid%>,jobExec_<%=cid%>.getJobReference().getJobId()).setLocation(jobExec_<%=cid%>.getJobReference().getLocation()).execute();
System.out.println(String.format(
"Waiting on job %s ... Current status: %s", jobExec_<%=cid%>
@@ -556,14 +582,17 @@
}
%>
}
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
cred_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
com.google.api.services.bigquery.model.Job doneJob_<%=cid%> = pollJob_<%=cid%>;
if ((doneJob_<%=cid%>.getStatus() != null) && (doneJob_<%=cid%>.getStatus().getErrors() != null)) {
status = "failure";
throw new Exception(doneJob_<%=cid%>.getStatus().getErrors().toString());
}
System.out.println("Done: " + doneJob_<%=cid%>.toString());
com.google.api.services.bigquery.model.JobStatistics jobStatistics_<%=cid%>= doneJob_<%=cid%>.getStatistics();
if(jobStatistics_<%=cid%>!=null && jobStatistics_<%=cid%>.getLoad() != null){
@@ -605,11 +634,11 @@
/* ----START-CREATING-JOB (Cloud API)---- */
com.google.cloud.bigquery.TableId tableId_<%=cid%> = com.google.cloud.bigquery.TableId.of(<%=projectId%>, <%=dataset%>, <%=table%>);
com.google.cloud.bigquery.LoadJobConfiguration.Builder loadJobBuilder_<%=cid%> = com.google.cloud.bigquery.LoadJobConfiguration.newBuilder(tableId_<%=cid%>, <%=gsFile%>);
boolean dropTable_<%=cid%> = <%=dropTable%>;
if (dropTable_<%=cid%> && bigquery_<%=cid%>.getTable(tableId_<%=cid%>) != null) {
boolean deleted = bigquery_<%=cid%>.delete(tableId_<%=cid%>);
if (deleted) {
<%
if(isLog4jEnabled){
@@ -645,10 +674,10 @@
} else if ("id_Short".equals(column.getTalendType()) || "id_Integer".equals(column.getTalendType()) || "id_Long".equals(column.getTalendType())) {
typeToGenerate = "com.google.cloud.bigquery.LegacySQLTypeName.INTEGER";
} else if ("id_BigDecimal".equals(column.getTalendType())) {
typeToGenerate = "com.google.cloud.bigquery.LegacySQLTypeName.NUMERIC";
} else if ("id_Boolean".equals(column.getTalendType())) {
typeToGenerate = "com.google.cloud.bigquery.LegacySQLTypeName.BOOLEAN";
} else if ("id_Date".equals(column.getTalendType())) {
String pattern = column.getPattern();
if(pattern.length() == 12 || pattern.isEmpty() || "\"\"".equals(pattern)) {
typeToGenerate = "com.google.cloud.bigquery.LegacySQLTypeName.DATE";
@@ -660,11 +689,11 @@
}
String modeType = (!column.isNullable()) ? "REQUIRED" : "NULLABLE";
%>
com.google.cloud.bigquery.Field field_<%=i%> = com.google.cloud.bigquery.Field.newBuilder("<%=columnName%>", <%=typeToGenerate%>)
.setMode(com.google.cloud.bigquery.Field.Mode.valueOf("<%=modeType%>"))
.build();
fields_<%=cid%>.add(field_<%=i%>);
<%
if(isLog4jEnabled){
%>
@@ -753,32 +782,32 @@
}
}
%>
com.google.cloud.bigquery.Schema schema_<%=cid%> = com.google.cloud.bigquery.Schema.of(fields_<%=cid%>);
com.google.cloud.bigquery.TableInfo tableInfo_<%=cid%> = com.google.cloud.bigquery.TableInfo.newBuilder(tableId_<%=cid%>, com.google.cloud.bigquery.StandardTableDefinition.of(schema_<%=cid%>)).build();
com.google.cloud.bigquery.Table table_<%=cid%> = bigquery_<%=cid%>.create(tableInfo_<%=cid%>);
loadJobBuilder_<%=cid%>.setSchema(schema_<%=cid%>);
loadJobBuilder_<%=cid%>.setCreateDisposition(com.google.cloud.bigquery.JobInfo.CreateDisposition.CREATE_IF_NEEDED);
} else {
loadJobBuilder_<%=cid%>.setCreateDisposition(com.google.cloud.bigquery.JobInfo.CreateDisposition.CREATE_NEVER);
}
<%} else {%>
loadJobBuilder_<%=cid%>.setCreateDisposition(com.google.cloud.bigquery.JobInfo.CreateDisposition.CREATE_NEVER);
<%}%>
loadJobBuilder_<%=cid%>.setWriteDisposition(com.google.cloud.bigquery.JobInfo.WriteDisposition.WRITE_<%=actionOnData%>);
loadJobBuilder_<%=cid%>.setDestinationTable(tableId_<%=cid%>);
com.google.cloud.bigquery.CsvOptions.Builder csvOptions_<%=cid%> = com.google.cloud.bigquery.CsvOptions.newBuilder();
csvOptions_<%=cid%>.setAllowQuotedNewLines(true);
csvOptions_<%=cid%>.setSkipLeadingRows(<%=gsFileHeader%>);
<%if(setFieldDelimiter) {
%>
csvOptions_<%=cid%>.setFieldDelimiter(<%=fieldDelimiter%>);
<%
}
%>
loadJobBuilder_<%=cid%>.setFormatOptions(csvOptions_<%=cid%>.build());
loadJobBuilder_<%=cid%>.setNullMarker(<%= nullMarker %>);
com.google.cloud.bigquery.Job job_<%=cid%> = bigquery_<%=cid%>.create(com.google.cloud.bigquery.JobInfo.of(loadJobBuilder_<%=cid%>.build()));
@@ -795,10 +824,10 @@
List<com.google.cloud.bigquery.BigQueryError> errorList = job_<%=cid%>.getStatus().getExecutionErrors();
throw new RuntimeException("Job failed: " + errorList.get(errorList.size() - 1));
}
/* ----END-CREATING-JOB (Cloud API)---- */
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\", \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
%>
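A minimal standalone sketch of the load-job polling pattern the template above generates (class and method names here are illustrative assumptions; only the Bigquery jobs().get(...) calls mirror the generated code):

import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.model.Job;

public class LoadJobPollerSketch {
    // Poll a BigQuery load job until it leaves the RUNNING/PENDING states,
    // re-fetching it once per second, as the generated while-loop does.
    public static Job waitForCompletion(Bigquery client, String projectId, Job started) throws Exception {
        String jobId = started.getJobReference().getJobId();
        String location = started.getJobReference().getLocation();
        Job poll = client.jobs().get(projectId, jobId).setLocation(location).execute();
        while ("RUNNING".equals(poll.getStatus().getState())
                || "PENDING".equals(poll.getStatus().getState())) {
            Thread.sleep(1000L); // same one-second back-off as the template
            poll = client.jobs().get(projectId, jobId).setLocation(location).execute();
        }
        return poll; // caller then checks getStatus().getErrors(), as the generated code does
    }
}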


@@ -39,8 +39,31 @@
<ITEMS DEFAULT="SERVICEACCOUNT">
<ITEM NAME="SERVICEACCOUNT" VALUE="SERVICEACCOUNT" />
<ITEM NAME="OAUTH" VALUE="OAUTH" />
<ITEM NAME="TOKEN" VALUE="TOKEN" />
</ITEMS>
</PARAMETER>
<PARAMETER
NAME="ACCESS_TOKEN"
FIELD="PASSWORD"
NUM_ROW="10"
REQUIRED="true"
SHOW_IF="AUTH_MODE == 'TOKEN'"
GROUP="AUTHENTICATION"
>
<DEFAULT>""</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="ACCESS_TOKEN_WARNING"
FIELD="LABEL"
NUM_ROW="11"
REQUIRED="false"
GROUP="AUTHENTICATION"
SHOW_IF="AUTH_MODE == 'TOKEN'"
>
<DEFAULT>*Note: If the Access Token expires, we won't be able to refresh it!</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="SERVICE_ACCOUNT_CREDENTIALS_FILE"
@@ -144,6 +167,7 @@
<ITEMS DEFAULT="GS_SERVICE_ACCOUNT">
<ITEM NAME="USER_ACCOUNT_HMAC" VALUE="USER_ACCOUNT_HMAC"/>
<ITEM NAME="GS_SERVICE_ACCOUNT" VALUE="GS_SERVICE_ACCOUNT"/>
<ITEM NAME="TOKEN" VALUE="TOKEN" />
</ITEMS>
</PARAMETER>
@@ -177,6 +201,28 @@
<DEFAULT>"__COMP_DEFAULT_FILE_DIR__/key.json"</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="GS_ACCESS_TOKEN"
FIELD="PASSWORD"
NUM_ROW="60"
REQUIRED_IF="(BULK_FILE_ALREADY_EXIST=='false') AND (AUTH_TYPE == 'TOKEN')"
SHOW_IF="(BULK_FILE_ALREADY_EXIST=='false') AND (AUTH_TYPE == 'TOKEN')"
GROUP="GS_CONF"
>
<DEFAULT>""</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="GS_ACCESS_TOKEN_WARNING"
FIELD="LABEL"
NUM_ROW="61"
REQUIRED="false"
GROUP="GS_CONF"
SHOW_IF="(BULK_FILE_ALREADY_EXIST=='false') AND (AUTH_TYPE == 'TOKEN')"
>
<DEFAULT>*Note: If the Access Token expires, we won't be able to refresh it!</DEFAULT>
</PARAMETER>
<PARAMETER NAME="GS_LOCAL_FILE" FIELD="FILE" NUM_ROW="63" REQUIRED="true" GROUP="GS_CONF" SHOW_IF="BULK_FILE_ALREADY_EXIST=='false'">
<DEFAULT>""</DEFAULT>
</PARAMETER>
@@ -237,7 +283,7 @@
<IMPORT NAME="google-api-services-oauth2-v2-rev151-1.25.0.jar" MODULE="google-api-services-oauth2-v2-rev151-1.25.0.jar" MVN="mvn:com.google.apis/google-api-services-oauth2/v2-rev151-1.25.0" REQUIRED="true" />
<IMPORT NAME="google-api-services-bigquery-v2-rev454-1.25.0.jar" MODULE="google-api-services-bigquery-v2-rev454-1.25.0.jar" MVN="mvn:com.google.apis/google-api-services-bigquery/v2-rev454-1.25.0" REQUIRED="true" />
<IMPORT NAME="google-http-client-1.25.0.jar" MODULE="google-http-client-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client/1.25.0" REQUIRED="true" />
<IMPORT NAME="google-oauth-client-1.25.0.jar" MODULE="google-oauth-client-1.25.0.jar" MVN="mvn:com.google.oauth-client/google-oauth-client/1.25.0" REQUIRED="true" />
<IMPORT NAME="google-oauth-client-1.31.0.jar" MODULE="google-oauth-client-1.31.0.jar" MVN="mvn:com.google.oauth-client/google-oauth-client/1.31.0" REQUIRED="true" />
<IMPORT NAME="google-http-client-jackson2-1.25.0.jar" MODULE="google-http-client-jackson2-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client-jackson2/1.25.0" REQUIRED="true" />
<IMPORT NAME="guava-20.0.jar" MODULE="guava-20.0.jar" MVN="mvn:com.google.guava/guava/20.0" REQUIRED="true" />
<IMPORT NAME="jackson-core-2.10.1.jar" MODULE="jackson-core-2.10.1.jar" MVN="mvn:com.fasterxml.jackson.core/jackson-core/2.10.1" REQUIRED="true" />
@@ -247,33 +293,33 @@
<IMPORT NAME="httpclient-4.5.12" MODULE="httpclient-4.5.12.jar" MVN="mvn:org.apache.httpcomponents/httpclient/4.5.12" REQUIRED="true" />
<IMPORT NAME="httpcore-4.4.13" MODULE="httpcore-4.4.13.jar" MVN="mvn:org.apache.httpcomponents/httpcore/4.4.13" REQUIRED="true" />
<IMPORT NAME="commons-codec-1.14" MODULE="commons-codec-1.14.jar" MVN="mvn:commons-codec/commons-codec/1.14" REQUIRED="true" />
<IMPORT NAME="google-cloud-bigquery-1.60.0.jar" MODULE="google-cloud-bigquery-1.60.0.jar" MVN="mvn:com.google.cloud/google-cloud-bigquery/1.60.0" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="google-http-client-jackson-1.25.0.jar" MODULE="google-http-client-jackson-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client-jackson/1.25.0" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="threetenbp-1.3.3.jar" MODULE="threetenbp-1.3.3.jar" MVN="mvn:org.threeten/threetenbp/1.3.3" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="google-auth-library-credentials-0.20.0.jar" MODULE="google-auth-library-credentials-0.20.0.jar" MVN="mvn:com.google.auth/google-auth-library-credentials/0.20.0" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="gax-httpjson-0.44.0.jar" MODULE="gax-httpjson-0.44.0.jar" MVN="mvn:com.google.api/gax-httpjson/0.44.0" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="jackson-core-asl-1.9.13.jar" MODULE="jackson-core-asl-1.9.13.jar" MVN="mvn:org.codehaus.jackson/jackson-core-asl/1.9.13" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="google-auth-library-oauth2-http-0.20.0.jar" MODULE="google-auth-library-oauth2-http-0.20.0.jar" MVN="mvn:com.google.auth/google-auth-library-oauth2-http/0.20.0" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="google-cloud-core-1.93.4.jar" MODULE="google-cloud-core-1.93.4.jar" MVN="mvn:com.google.cloud/google-cloud-core/1.93.4" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="google-cloud-core-http-1.32.0.jar" MODULE="google-cloud-core-http-1.32.0.jar" MVN="mvn:com.google.cloud/google-cloud-core-http/1.32.0" REQUIRED_IF="AUTH_MODE == 'SERVICEACCOUNT'" />
<IMPORT NAME="gax-1.27.0.jar" MODULE="gax-1.27.0.jar" MVN="mvn:com.google.api/gax/1.27.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="google-http-client-appengine-1.25.0.jar" MODULE="google-http-client-appengine-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client-appengine/1.25.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="api-common-1.6.0.jar" MODULE="api-common-1.6.0.jar" MVN="mvn:com.google.api/api-common/1.6.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="google-cloud-bigquery-1.60.0.jar" MODULE="google-cloud-bigquery-1.60.0.jar" MVN="mvn:com.google.cloud/google-cloud-bigquery/1.60.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="google-http-client-jackson-1.25.0.jar" MODULE="google-http-client-jackson-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client-jackson/1.25.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="threetenbp-1.3.3.jar" MODULE="threetenbp-1.3.3.jar" MVN="mvn:org.threeten/threetenbp/1.3.3" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="google-auth-library-credentials-0.20.0.jar" MODULE="google-auth-library-credentials-0.20.0.jar" MVN="mvn:com.google.auth/google-auth-library-credentials/0.20.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="gax-httpjson-0.44.0.jar" MODULE="gax-httpjson-0.44.0.jar" MVN="mvn:com.google.api/gax-httpjson/0.44.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="jackson-core-asl-1.9.13.jar" MODULE="jackson-core-asl-1.9.13.jar" MVN="mvn:org.codehaus.jackson/jackson-core-asl/1.9.13" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="google-auth-library-oauth2-http-0.20.0.jar" MODULE="google-auth-library-oauth2-http-0.20.0.jar" MVN="mvn:com.google.auth/google-auth-library-oauth2-http/0.20.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="google-cloud-core-1.93.4.jar" MODULE="google-cloud-core-1.93.4.jar" MVN="mvn:com.google.cloud/google-cloud-core/1.93.4" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="google-cloud-core-http-1.32.0.jar" MODULE="google-cloud-core-http-1.32.0.jar" MVN="mvn:com.google.cloud/google-cloud-core-http/1.32.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="gax-1.27.0.jar" MODULE="gax-1.27.0.jar" MVN="mvn:com.google.api/gax/1.27.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="google-http-client-appengine-1.25.0.jar" MODULE="google-http-client-appengine-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client-appengine/1.25.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="api-common-1.6.0.jar" MODULE="api-common-1.6.0.jar" MVN="mvn:com.google.api/api-common/1.6.0" REQUIRED_IF="(AUTH_MODE == 'SERVICEACCOUNT') OR (AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="google-cloud-storage-1.104.0" MODULE="google-cloud-storage-1.104.0.jar" MVN="mvn:com.google.cloud/google-cloud-storage/1.104.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="google-cloud-storage-1.104.0" MODULE="google-cloud-storage-1.104.0.jar" MVN="mvn:com.google.cloud/google-cloud-storage/1.104.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<!-- Transitive dependencies of google-cloud-storage -->
<IMPORT NAME="google-api-services-storage-v1-rev20191011-1.30.3" MODULE="google-api-services-storage-v1-rev20191011-1.30.3.jar" MVN="mvn:com.google.apis/google-api-services-storage/v1-rev20191011-1.30.3" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="gson-2.8.6" MODULE="gson-2.8.6.jar" MVN="mvn:com.google.code.gson/gson/2.8.6" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="javax.annotation-api" MODULE="javax.annotation-api-1.3.jar" MVN="mvn:javax.annotation/javax.annotation-api/1.3" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="protobuf-java-2.5.0" MODULE="protobuf-java-2.5.0.jar" MVN="mvn:com.google.protobuf/protobuf-java/2.5.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="error_prone_annotation-2.1.3" MODULE="error_prone_annotation-2.1.3.jar" MVN="mvn:com.google.errorprone/error_prone_annotations/2.1.3" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="opencensus-api-0.21.0" MODULE="opencensus-api-0.21.0.jar" MVN="mvn:io.opencensus/opencensus-api/0.21.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="opencensus-contrib-http-util-0.21.0" MODULE="opencensus-contrib-http-util-0.21.0.jar" MVN="mvn:io.opencensus/opencensus-contrib-http-util/0.21.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="grpc-context-1.19.0" MODULE="grpc-context-1.19.0.jar" MVN="mvn:io.grpc/grpc-context/1.19.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="proto-google-common-protos-1.17.0" MODULE="proto-google-common-protos-1.17.0.jar" MVN="mvn:com.google.api.grpc/proto-google-common-protos/1.17.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="proto-google-iam-v1-0.13.0" MODULE="proto-google-iam-v1-0.13.0.jar" MVN="mvn:com.google.api.grpc/proto-google-iam-v1/0.13.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="protobuf-java-util-3.11.4" MODULE="protobuf-java-util-3.11.4.jar" MVN="mvn:com.google.protobuf/protobuf-java-util/3.11.4" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="annotation-1.1.0" MODULE="annotation-1.1.0.jar" MVN="mvn:androidx.annotation/annotation/1.1.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT')" />
<IMPORT NAME="google-api-services-storage-v1-rev20191011-1.30.3" MODULE="google-api-services-storage-v1-rev20191011-1.30.3.jar" MVN="mvn:com.google.apis/google-api-services-storage/v1-rev20191011-1.30.3" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="gson-2.8.6" MODULE="gson-2.8.6.jar" MVN="mvn:com.google.code.gson/gson/2.8.6" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="javax.annotation-api" MODULE="javax.annotation-api-1.3.jar" MVN="mvn:javax.annotation/javax.annotation-api/1.3" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="protobuf-java-2.5.0" MODULE="protobuf-java-2.5.0.jar" MVN="mvn:com.google.protobuf/protobuf-java/2.5.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="error_prone_annotation-2.1.3" MODULE="error_prone_annotation-2.1.3.jar" MVN="mvn:com.google.errorprone/error_prone_annotations/2.1.3" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="opencensus-api-0.21.0" MODULE="opencensus-api-0.21.0.jar" MVN="mvn:io.opencensus/opencensus-api/0.21.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="opencensus-contrib-http-util-0.21.0" MODULE="opencensus-contrib-http-util-0.21.0.jar" MVN="mvn:io.opencensus/opencensus-contrib-http-util/0.21.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="grpc-context-1.19.0" MODULE="grpc-context-1.19.0.jar" MVN="mvn:io.grpc/grpc-context/1.19.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="proto-google-common-protos-1.17.0" MODULE="proto-google-common-protos-1.17.0.jar" MVN="mvn:com.google.api.grpc/proto-google-common-protos/1.17.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="proto-google-iam-v1-0.13.0" MODULE="proto-google-iam-v1-0.13.0.jar" MVN="mvn:com.google.api.grpc/proto-google-iam-v1/0.13.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="protobuf-java-util-3.11.4" MODULE="protobuf-java-util-3.11.4.jar" MVN="mvn:com.google.protobuf/protobuf-java-util/3.11.4" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
<IMPORT NAME="annotation-1.1.0" MODULE="annotation-1.1.0.jar" MVN="mvn:androidx.annotation/annotation/1.1.0" REQUIRED_IF="(AUTH_TYPE == 'GS_SERVICE_ACCOUNT') OR (AUTH_TYPE == 'TOKEN')" />
</IMPORTS>
</CODEGENERATION>
<RETURNS>


@@ -8,11 +8,13 @@ SERVICE_ACCOUNT_CREDENTIALS_FILE.NAME=Service account credentials file
AUTH_MODE.NAME=Authentication mode
AUTH_MODE.ITEM.SERVICEACCOUNT=Service account
AUTH_MODE.ITEM.OAUTH=OAuth 2.0
AUTH_MODE.ITEM.TOKEN=OAuth Access Token
CLIENT_ID.NAME=Client Id
CLIENT_SECRET.NAME=Client Secret
PROJECT_ID.NAME=Project Id
PROJECT_ID.NAME=Project ID
AUTHORIZATION_CODE.NAME=Authorization Code
ACCESS_TOKEN.NAME=OAuth Access Token
SCHEMA.NAME=Schema
@@ -50,3 +52,6 @@ AUTH_TYPE.NAME=Credential type
AUTH_TYPE.ITEM.USER_ACCOUNT_HMAC=HMAC key (deprecated)
AUTH_TYPE.ITEM.GS_SERVICE_ACCOUNT=Service account
GS_SERVICE_ACCOUNT_KEY.NAME=Service account key
GS_ACCESS_TOKEN.NAME=OAuth Access Token
AUTH_TYPE.ITEM.TOKEN=OAuth Access Token


@@ -2,7 +2,7 @@ LONG_NAME=Connects and loads data efficiently from a file into Google BigQuery
HELP=org.talend.help.tBigQueryBulkExec
CLIENT_ID.NAME=Client Id
CLIENT_SECRET.NAME=Client Secret
PROJECT_ID.NAME=Project Id
PROJECT_ID.NAME=Project ID
AUTHORIZATION_CODE.NAME=Authorization Code
SCHEMA.NAME=Schema
DATASET.NAME=Dataset


@@ -6,24 +6,31 @@
class BigQueryUtil_<%=cid%> {
String projectId;
<%=basePackage%>com.google.api.services.bigquery.Bigquery bigqueryclient = null;
<%=basePackage%>com.google.api.client.auth.oauth2.Credential credentials = null;
String tokenFile;
boolean useLargeResult = false;
String tempDataset;
String tempTable;
final boolean isAutoGeneratedTemporaryDataset;
public BigQueryUtil_<%=cid%>(String projectId, <%=basePackage%>com.google.api.services.bigquery.Bigquery bigqueryclient, String tokenFile) {
this (projectId, bigqueryclient, tokenFile, null);
this (projectId, bigqueryclient, tokenFile, null, null);
}
public BigQueryUtil_<%=cid%>(String projectId, <%=basePackage%>com.google.api.services.bigquery.Bigquery bigqueryclient, String tokenFile, String tempDataset) {
this (projectId, bigqueryclient, tokenFile, tempDataset, null);
}
public BigQueryUtil_<%=cid%>(String projectId, <%=basePackage%>com.google.api.services.bigquery.Bigquery bigqueryclient, String tokenFile, String tempDataset,
<%=basePackage%>com.google.api.client.auth.oauth2.Credential credentials) {
this.projectId = projectId;
this.bigqueryclient = bigqueryclient;
this.tokenFile = tokenFile;
this.tempDataset = tempDataset;
this.isAutoGeneratedTemporaryDataset = tempDataset == null;
this.credentials = credentials;
}
private String genTempName(String prefix){
@@ -32,13 +39,21 @@ class BigQueryUtil_<%=cid%> {
public void cleanup() throws Exception{
if(useLargeResult){
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
if (credentials !=null)
credentials.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
bigqueryclient.tables().delete(projectId, tempDataset, tempTable).execute();
if (isAutoGeneratedTemporaryDataset) {
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
if (credentials !=null)
credentials.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
bigqueryclient.datasets().delete(projectId, tempDataset).execute();
}
}
}
private String getLocation(<%=basePackage%>com.google.api.services.bigquery.model.JobConfigurationQuery queryConfig) throws Exception {
String location = null;
<%=basePackage%>com.google.api.services.bigquery.model.JobConfiguration config = new <%=basePackage%>com.google.api.services.bigquery.model.JobConfiguration();
@@ -46,6 +61,10 @@ class BigQueryUtil_<%=cid%> {
config.setDryRun(true);
<%=basePackage%>com.google.api.services.bigquery.model.Job job = new <%=basePackage%>com.google.api.services.bigquery.model.Job();
job.setConfiguration(config);
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
if (credentials !=null)
credentials.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
location = bigqueryclient.jobs().insert(projectId, job).execute().getJobReference().getLocation();
location = location == null ? "US" : location;
return location;
@@ -59,13 +78,17 @@ class BigQueryUtil_<%=cid%> {
String description = "Dataset for BigQuery query job temporary table";
dataset.setFriendlyName(description);
dataset.setDescription(description);
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
if (credentials !=null)
credentials.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
bigqueryclient.datasets().insert(projectId, dataset).execute();
}
public <%=basePackage%>com.google.api.services.bigquery.model.Job executeQuery(String query, boolean useLargeResult) throws Exception{
return executeQuery(query, useLargeResult, true);
}
public <%=basePackage%>com.google.api.services.bigquery.model.Job executeQuery(String query, boolean useLargeResult, boolean useLegacySql) throws Exception{
<%=basePackage%>com.google.api.services.bigquery.model.JobConfigurationQuery queryConfig = new <%=basePackage%>com.google.api.services.bigquery.model.JobConfigurationQuery();
queryConfig.setQuery(query);
@@ -84,16 +107,20 @@ class BigQueryUtil_<%=cid%> {
.setDatasetId(tempDataset)
.setTableId(tempTable));
}
<%=basePackage%>com.google.api.services.bigquery.model.JobConfiguration config = new <%=basePackage%>com.google.api.services.bigquery.model.JobConfiguration();
config.setQuery(queryConfig);
<%=basePackage%>com.google.api.services.bigquery.model.Job job = new <%=basePackage%>com.google.api.services.bigquery.model.Job();
job.setConfiguration(config);
<%=basePackage%>com.google.api.services.bigquery.model.Job insert = null;
<%=basePackage%>com.google.api.services.bigquery.model.JobReference jobId = null;
try {
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
if (credentials !=null)
credentials.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
insert = bigqueryclient.jobs().insert(projectId, job).execute();
jobId = insert.getJobReference();
} catch (<%=basePackage%>com.google.api.client.googleapis.json.GoogleJsonResponseException e) {
@@ -142,7 +169,10 @@ class BigQueryUtil_<%=cid%> {
%>
// wait for query execution
while (true) {
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
if (credentials !=null)
credentials.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
<%
if("".equals(basePackage)) {
%>
@@ -154,7 +184,7 @@ class BigQueryUtil_<%=cid%> {
<%
}
%>
<%=basePackage%>com.google.api.services.bigquery.model.JobStatus status = pollJob.getStatus();
if (status.getState().equals("DONE")) {
<%=basePackage%>com.google.api.services.bigquery.model.ErrorProto errorProto = status.getErrorResult();


@@ -32,7 +32,7 @@
String basePackage = "";
boolean isCustomTemporaryName = ElementParameterParser.getBooleanValue(node,"__USE_CUSTOM_TEMPORARY_DATASET__");
String tempDataset = ElementParameterParser.getValue(node,"__TEMPORARY_DATASET_NAME__");
String encoding = ElementParameterParser.getValue(node,"__ENCODING__");
String advancedSeparatorStr = ElementParameterParser.getValue(node, "__ADVANCED_SEPARATOR__");
boolean advancedSeparator = (advancedSeparatorStr!=null&&!("").equals(advancedSeparatorStr))?("true").equals(advancedSeparatorStr):false;
@@ -44,7 +44,7 @@
String tokenFile = ElementParameterParser.getValue(node,"__TOKEN_NAME__");
boolean isLog4jEnabled = ("true").equals(ElementParameterParser.getValue(node.getProcess(), "__LOG4J_ACTIVATE__"));
//Dynamic start
List<IMetadataTable> metadatas = node.getMetadataList();
int sizeListColumns = 0;
@@ -59,11 +59,20 @@
}
}
int dynamic_index=-1;
//Dynamic end
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
%>
final String PROJECT_ID_<%=cid %> = <%=projectId %>;
com.google.api.services.bigquery.Bigquery bigqueryclient_<%=cid%> = null;
com.google.api.client.auth.oauth2.Credential credential_<%=cid%> = null;
long nb_line_<%=cid%> = 0;
final com.google.api.client.http.HttpTransport TRANSPORT_<%=cid %> = new com.google.api.client.http.javanet.NetHttpTransport();
final com.google.api.client.json.JsonFactory JSON_FACTORY_<%=cid %> = new com.google.api.client.json.jackson2.JacksonFactory();
<%
if (authMode.equals("OAUTH")) {
%>
final String CLIENT_ID_<%=cid %> = <%=clientId %>;
<%
@@ -73,21 +82,17 @@
<%@ include file="@{org.talend.designer.components.localprovider}/components/templates/password.javajet"%>
final String CLIENT_SECRET_<%=cid%> = "{\"web\": {\"client_id\": \""+<%=clientId%>+"\",\"client_secret\": \"" +decryptedPassword_<%=cid%>+ "\",\"auth_uri\": \"https://accounts.google.com/o/oauth2/auth\",\"token_uri\": \"https://accounts.google.com/o/oauth2/token\"}}";
final String PROJECT_ID_<%=cid %> = <%=projectId %>;
// Static variables for API scope, callback URI, and HTTP/JSON functions
final List<String> SCOPES_<%=cid%> = java.util.Arrays.asList("https://www.googleapis.com/auth/bigquery");
final String REDIRECT_URI_<%=cid%> = "urn:ietf:wg:oauth:2.0:oob";
final com.google.api.client.http.HttpTransport TRANSPORT_<%=cid %> = new com.google.api.client.http.javanet.NetHttpTransport();
final com.google.api.client.json.JsonFactory JSON_FACTORY_<%=cid %> = new com.google.api.client.json.jackson2.JacksonFactory();
com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets clientSecrets_<%=cid%> = com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets.load(
new com.google.api.client.json.jackson2.JacksonFactory(), new java.io.InputStreamReader(new java.io.ByteArrayInputStream(
CLIENT_SECRET_<%=cid%>.getBytes())));
com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeFlow flow_<%=cid%> = null;
com.google.api.services.bigquery.Bigquery bigqueryclient_<%=cid%> = null;
long nb_line_<%=cid%> = 0;
<%
if(isLog4jEnabled){
%>
@@ -129,7 +134,7 @@
// access token.
if (storedRefreshToken_<%=cid%> != null) {
// Request a new Access token using the refresh token.
com.google.api.client.googleapis.auth.oauth2.GoogleCredential credential_<%=cid%> = new com.google.api.client.googleapis.auth.oauth2.GoogleCredential.Builder().setTransport(TRANSPORT_<%=cid%>)
credential_<%=cid%> = new com.google.api.client.googleapis.auth.oauth2.GoogleCredential.Builder().setTransport(TRANSPORT_<%=cid%>)
.setJsonFactory(JSON_FACTORY_<%=cid%>).setClientSecrets(clientSecrets_<%=cid%>)
.build().setFromTokenResponse(new com.google.api.client.auth.oauth2.TokenResponse().setRefreshToken(storedRefreshToken_<%=cid%>));
@@ -141,7 +146,6 @@
<%
}
%>
bigqueryclient_<%=cid%> = new com.google.api.services.bigquery.Bigquery.Builder(new com.google.api.client.http.javanet.NetHttpTransport(),new com.google.api.client.json.jackson2.JacksonFactory(),credential_<%=cid%>).setApplicationName("Talend").build();
} else {
<%
if(isLog4jEnabled){
@@ -183,7 +187,7 @@
.build();
}
com.google.api.client.googleapis.auth.oauth2.GoogleTokenResponse response_<%=cid%> = flow_<%=cid%>.newTokenRequest(authorizationCode_<%=cid%>).setRedirectUri(REDIRECT_URI_<%=cid%>).execute();
com.google.api.client.auth.oauth2.Credential credential_<%=cid%> = flow_<%=cid%>.createAndStoreCredential(response_<%=cid%>, null);
credential_<%=cid%> = flow_<%=cid%>.createAndStoreCredential(response_<%=cid%>, null);
<%
if(isLog4jEnabled){
@@ -195,17 +199,25 @@
// Store the refresh token for future use.
java.util.Properties storeProperties_<%=cid%> = new java.util.Properties();
storeProperties_<%=cid%>.setProperty("refreshtoken", credential_<%=cid%>.getRefreshToken());
java.io.FileOutputStream outputStream_<%=cid%> = new java.io.FileOutputStream(tokenFile_<%=cid %>);
storeProperties_<%=cid%>.store(outputStream_<%=cid%>,null);
if (outputStream_<%=cid%> != null) {
outputStream_<%=cid%>.close();
}
bigqueryclient_<%=cid%> = new com.google.api.services.bigquery.Bigquery.Builder(new com.google.api.client.http.javanet.NetHttpTransport(),new com.google.api.client.json.jackson2.JacksonFactory(),credential_<%=cid%>).build();
try (java.io.FileOutputStream outputStream_<%=cid%> = new java.io.FileOutputStream(tokenFile_<%=cid %>)) {
storeProperties_<%=cid%>.store(outputStream_<%=cid%>,null);
}
}
}
<%} else {
if (ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) {%>
final String decryptedAccessToken_<%=cid%> = routines.system.PasswordEncryptUtil.decryptPassword(<%=ElementParameterParser.getEncryptedValue(node, "__ACCESS_TOKEN__")%>);
credential_<%=cid%> =
new com.google.api.client.auth.oauth2.Credential(com.google.api.client.auth.oauth2.BearerToken.authorizationHeaderAccessMethod()).setFromTokenResponse(
(new com.google.api.client.auth.oauth2.TokenResponse()).setAccessToken(decryptedAccessToken_<%=cid%>));
<%} else {%>
credential_<%=cid%> =
new com.google.api.client.auth.oauth2.Credential(com.google.api.client.auth.oauth2.BearerToken.authorizationHeaderAccessMethod()).setFromTokenResponse(
(new com.google.api.client.auth.oauth2.TokenResponse()).setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>));
<%}
} %>
bigqueryclient_<%=cid%> = new com.google.api.services.bigquery.Bigquery.Builder(TRANSPORT_<%=cid %>, JSON_FACTORY_<%=cid %>,credential_<%=cid%>).setApplicationName("Talend").build();
<%@ include file="@{org.talend.designer.components.localprovider}/components/tBigQueryInput/BigQueryInputQueryHelper.javajet"%>
@@ -218,11 +230,12 @@
log.debug("<%=cid%> - Running Query: "+querySql_<%=cid %>);
<%
}
%>
if (authMode.equals("OAUTH")) { %>
BigQueryUtil_<%=cid%> bigQueryUtil_<%=cid%> = new BigQueryUtil_<%=cid%>(PROJECT_ID_<%=cid%>, bigqueryclient_<%=cid%>, tokenFile_<%=cid%><% if (isCustomTemporaryName) {%>, <%=tempDataset%> <%} %>);
<%} else {%>
BigQueryUtil_<%=cid%> bigQueryUtil_<%=cid%> = new BigQueryUtil_<%=cid%>(PROJECT_ID_<%=cid%>, bigqueryclient_<%=cid%>, null, <% if (isCustomTemporaryName) {%><%=tempDataset%> <%} else {%>null<% } %>, credential_<%=cid%>);
<%}
<%
if(isLog4jEnabled){
%>
log.info("<%=cid%> - Try <%="LARGE".equals(resultSizeType) ? "with" : "without"%> the allowLargeResults flag");
@@ -242,15 +255,17 @@
//Dynamic start
<%
if(isDynamic){
%>
<%@ include file="@{org.talend.designer.components.localprovider}/components/templates/DB/Input/BigQuery/BigQueryHelper_Oauth.javajet"%>
BigQueryHelper_<%=cid%> helper_<%=cid%> = new BigQueryHelper_<%=cid%>();
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
credential_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
com.google.api.services.bigquery.model.TableSchema schema_<%=cid%> = bigqueryclient_<%=cid%>
.jobs().getQueryResults(PROJECT_ID_<%=cid%>, insert_<%=cid%>.getJobReference().getJobId()).execute().getSchema();
routines.system.Dynamic dcg_<%=cid%> = new routines.system.Dynamic();
<%
for(int i=0; i< columnList.size(); i++) {
@@ -272,7 +287,7 @@
dcg_<%=cid%>.metadatas.add(dynamicMetadata_<%=cid%>);
}
<%
}
%>
//Dynamic end
@@ -375,14 +390,14 @@
//Dynamic start
<%
if(isDynamic){
%>
<%@ include file="@{org.talend.designer.components.localprovider}/components/templates/DB/Input/BigQuery/BigQueryHelper_ServiceAccount.javajet"%>
BigQueryHelper_<%=cid%> helper_<%=cid%> = new BigQueryHelper_<%=cid%>();
com.google.cloud.bigquery.Schema schema_<%=cid%> = result_<%=cid%>.getSchema();
routines.system.Dynamic dcg_<%=cid%> = new routines.system.Dynamic();
<%
for(int i=0; i< columnList.size(); i++) {
@@ -411,21 +426,24 @@
dcg_<%=cid%>.metadatas.add(dynamicMetadata_<%=cid%>);
}
<%
}
%>
//Dynamic end
long nb_line_<%=cid%> = 0;
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\", \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
%>
while (true) {
// Fetch Results
<% if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
credential_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
com.google.api.services.bigquery.model.TableDataList dataList_<%=cid %> = bigqueryclient_<%=cid%>.tabledata()
.list(PROJECT_ID_<%=cid %>,
insert_<%=cid %>.getConfiguration().getQuery()
@@ -452,7 +470,7 @@
nb_line_<%=cid%> ++;
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\", \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
%>
int fieldsCount_<%=cid %> = field_<%=cid %>.size();
@@ -466,8 +484,8 @@
IConnection conn =conns.get(0);
String connName = conn.getName();
if (conn.getLineStyle().hasConnectionCategory(IConnectionCategory.DATA)) {
if ((metadatas!=null) && (metadatas.size() > 0)) {
IMetadataTable metadata = metadatas.get(0);
if (metadata != null) {
@@ -500,7 +518,7 @@
<%=connName%>.<%=column.getLabel()%> = <%=defVal%>;
} else {
<%
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
%>
value_<%=cid%> = field_<%=cid %>.get(column_index_<%=cid%>).getV();
<%
@@ -509,7 +527,7 @@
value_<%=cid%> = field_<%=cid %>.get(column_index_<%=cid%>).getValue();
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\", \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
%>
if(com.google.api.client.util.Data.isNull(value_<%=cid%>)) value_<%=cid%> = null;
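A minimal standalone sketch of what the new TOKEN branch above builds: a raw OAuth access token wrapped in a Credential and used to construct the Bigquery client (the class and method names are illustrative assumptions; only the google-api-client calls mirror the generated code):

import com.google.api.client.auth.oauth2.BearerToken;
import com.google.api.client.auth.oauth2.Credential;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.bigquery.Bigquery;

public class TokenClientSketch {
    // Wrap a plain access token in a Bearer-token Credential and build the client with it.
    // There is no refresh token, so the token cannot be renewed once it expires,
    // which is why the templates re-apply it before each execute() call.
    public static Bigquery fromAccessToken(String accessToken) {
        Credential credential = new Credential(BearerToken.authorizationHeaderAccessMethod());
        credential.setAccessToken(accessToken);
        return new Bigquery.Builder(new NetHttpTransport(), new JacksonFactory(), credential)
                .setApplicationName("Talend")
                .build();
    }
}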


@@ -15,7 +15,7 @@ imports="
String resultSizeType = ElementParameterParser.getValue(node,"__RESULT_SIZE__");
boolean isCustomTemporaryName = ElementParameterParser.getBooleanValue(node,"__USE_CUSTOM_TEMPORARY_DATASET__");
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
%>
}
pageToken_<%=cid%> = dataList_<%=cid %>.getPageToken();
@@ -41,7 +41,7 @@ imports="
}
}
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
log4jFileUtil.retrievedDataNumberInfo(node);
%>


@@ -43,21 +43,43 @@
REQUIRED="true"
NUM_ROW="1"
/>
<PARAMETER NAME="AUTH_MODE" FIELD="CLOSED_LIST" NUM_ROW="3" REQUIRED="true" GROUP="AUTHENTICATION" REPOSITORY_VALUE="AUTH_MODE">
<ITEMS DEFAULT="SERVICEACCOUNT">
<ITEM NAME="SERVICEACCOUNT" VALUE="SERVICEACCOUNT" />
<ITEM NAME="OAUTH" VALUE="OAUTH" />
<ITEM NAME="TOKEN" VALUE="TOKEN" />
</ITEMS>
</PARAMETER>
<PARAMETER
NAME="SERVICE_ACCOUNT_CREDENTIALS_FILE"
<PARAMETER
NAME="ACCESS_TOKEN"
FIELD="PASSWORD"
NUM_ROW="10"
REQUIRED="true"
SHOW_IF="AUTH_MODE == 'TOKEN'"
GROUP="AUTHENTICATION"
>
<DEFAULT>""</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="ACCESS_TOKEN_WARNING"
FIELD="LABEL"
NUM_ROW="11"
REQUIRED="false"
GROUP="AUTHENTICATION"
SHOW_IF="AUTH_MODE == 'TOKEN'"
>
<DEFAULT>*Note: If the Access Token expires, we won't be able to refresh it!</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="SERVICE_ACCOUNT_CREDENTIALS_FILE"
REPOSITORY_VALUE="SERVICE_ACCOUNT_CREDENTIALS_FILE"
GROUP="AUTHENTICATION"
FIELD="FILE"
NUM_ROW="10"
REQUIRED="true"
SHOW_IF="AUTH_MODE == 'SERVICEACCOUNT'"
>
<DEFAULT>""</DEFAULT>
@@ -104,7 +126,7 @@
>
<DEFAULT>""</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="USE_LEGACY_SQL"
FIELD="CHECK"
@@ -113,7 +135,7 @@
>
<DEFAULT>false</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="QUERY"
FIELD="MEMO_SQL"
@@ -122,13 +144,13 @@
>
<DEFAULT>"select id, name from employee"</DEFAULT>
</PARAMETER>
<!--
https://cloud.google.com/bigquery/querying-data#large-results
SMALL: allowLargeResult is false
LARGE: allowLargeResult is true and auto create/clean temp dataset/table
AUTO: try SMALL first, if get responseTooLarge error then try LARGE
-->
<PARAMETER
NAME="RESULT_SIZE"
FIELD="CLOSED_LIST"
NUM_ROW="60"
@@ -183,7 +205,7 @@
<IMPORT NAME="google-api-services-oauth2-v2-rev151-1.25.0.jar" MODULE="google-api-services-oauth2-v2-rev151-1.25.0.jar" MVN="mvn:com.google.apis/google-api-services-oauth2/v2-rev151-1.25.0" REQUIRED="true" />
<IMPORT NAME="google-api-services-bigquery-v2-rev454-1.25.0.jar" MODULE="google-api-services-bigquery-v2-rev454-1.25.0.jar" MVN="mvn:com.google.apis/google-api-services-bigquery/v2-rev454-1.25.0" REQUIRED="true" />
<IMPORT NAME="google-http-client-1.25.0.jar" MODULE="google-http-client-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client/1.25.0" REQUIRED="true" />
<IMPORT NAME="google-oauth-client-1.25.0.jar" MODULE="google-oauth-client-1.25.0.jar" MVN="mvn:com.google.oauth-client/google-oauth-client/1.25.0" REQUIRED="true" />
<IMPORT NAME="google-oauth-client-1.31.0.jar" MODULE="google-oauth-client-1.31.0.jar" MVN="mvn:com.google.oauth-client/google-oauth-client/1.31.0" REQUIRED="true" />
<IMPORT NAME="google-http-client-jackson2-1.25.0.jar" MODULE="google-http-client-jackson2-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client-jackson2/1.25.0" REQUIRED="true" />
<IMPORT NAME="guava-20.0.jar" MODULE="guava-20.0.jar" MVN="mvn:com.google.guava/guava/20.0" REQUIRED="true" />
<IMPORT NAME="jackson-core-2.10.1.jar" MODULE="jackson-core-2.10.1.jar" MVN="mvn:com.fasterxml.jackson.core/jackson-core/2.10.1" REQUIRED="true" />


@@ -1,6 +1,6 @@
#Created by JInto - www.guh-software.de
#Wed Mar 19 09:39:53 CST 2008
LONG_NAME=Connect and run a query on Google BigQuery
HELP=org.talend.help.tBigQueryInput
AUTHENTICATION.NAME=Authentication
@@ -8,11 +8,14 @@ SERVICE_ACCOUNT_CREDENTIALS_FILE.NAME=Service account credentials file
AUTH_MODE.NAME=Authentication mode
AUTH_MODE.ITEM.SERVICEACCOUNT=Service account
AUTH_MODE.ITEM.OAUTH=OAuth 2.0
AUTH_MODE.ITEM.TOKEN=OAuth Access Token
CLIENT_ID.NAME=Client Id
CLIENT_SECRET.NAME=Client Secret
PROJECT_ID.NAME=Project Id
PROJECT_ID.NAME=Project ID
AUTHORIZATION_CODE.NAME=Authorization Code
ACCESS_TOKEN.NAME=OAuth Access Token
QUERY.NAME=Query
SCHEMA.NAME=Schema
@@ -29,4 +32,4 @@ RESULT_SIZE.ITEM.LARGE=Large(with allowLargeResults)
RESULT_SIZE.ITEM.AUTO=Auto
USE_CUSTOM_TEMPORARY_DATASET.NAME=Use custom temporary Dataset name
TEMPORARY_DATASET_NAME.NAME=Temporary Dataset name
USE_LEGACY_SQL.NAME=Use Legacy SQL


@@ -2,7 +2,7 @@ LONG_NAME=Connect and run a query on Google BigQuery
HELP=org.talend.help.tBigQueryInput
CLIENT_ID.NAME=Client Id
CLIENT_SECRET.NAME=Client Secret
PROJECT_ID.NAME=Project Id
PROJECT_ID.NAME=Project ID
AUTHORIZATION_CODE.NAME=Authorization Code
QUERY.NAME=Query
SCHEMA.NAME=Schema


@@ -54,20 +54,43 @@
<PARAMETER NAME="SCHEMA" FIELD="SCHEMA_TYPE" REQUIRED="true" NUM_ROW="13">
<DEFAULT/>
</PARAMETER>
<PARAMETER NAME="AUTH_MODE" FIELD="CLOSED_LIST" NUM_ROW="15" REQUIRED="true" GROUP="AUTHENTICATION" REPOSITORY_VALUE="AUTH_MODE">
<ITEMS DEFAULT="SERVICEACCOUNT">
<ITEM NAME="SERVICEACCOUNT" VALUE="SERVICEACCOUNT" />
<ITEM NAME="OAUTH" VALUE="OAUTH" />
<ITEM NAME="TOKEN" VALUE="TOKEN" />
</ITEMS>
</PARAMETER>
<PARAMETER
NAME="SERVICE_ACCOUNT_CREDENTIALS_FILE"
<PARAMETER
NAME="ACCESS_TOKEN"
FIELD="PASSWORD"
NUM_ROW="18"
REQUIRED="true"
SHOW_IF="AUTH_MODE == 'TOKEN'"
GROUP="AUTHENTICATION"
>
<DEFAULT>""</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="ACCESS_TOKEN_WARNING"
FIELD="LABEL"
NUM_ROW="20"
REQUIRED="false"
GROUP="AUTHENTICATION"
SHOW_IF="AUTH_MODE == 'TOKEN'"
>
<DEFAULT>*Note: If the Access Token expires, we won't be able to refresh it!</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="SERVICE_ACCOUNT_CREDENTIALS_FILE"
REPOSITORY_VALUE="SERVICE_ACCOUNT_CREDENTIALS_FILE"
FIELD="FILE"
NUM_ROW="18"
REQUIRED="true"
SHOW_IF="AUTH_MODE == 'SERVICEACCOUNT'"
GROUP="AUTHENTICATION"
>
@@ -162,25 +185,26 @@
<ITEMS DEFAULT="GS_SERVICE_ACCOUNT">
<ITEM NAME="USER_ACCOUNT_HMAC" VALUE="USER_ACCOUNT_HMAC"/>
<ITEM NAME="GS_SERVICE_ACCOUNT" VALUE="GS_SERVICE_ACCOUNT"/>
<ITEM NAME="TOKEN" VALUE="TOKEN" />
</ITEMS>
</PARAMETER>
<PARAMETER
NAME="GS_ACCESS_KEY"
FIELD="TEXT"
NUM_ROW="60"
REQUIRED="true"
GROUP="GS_CONF"
SHOW_IF="(BULK_FILE_ALREADY_EXIST=='false') AND (AUTH_TYPE == 'USER_ACCOUNT_HMAC')">
<DEFAULT>""</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="GS_SECRET_KEY"
FIELD="PASSWORD"
NUM_ROW="61"
REQUIRED="true"
GROUP="GS_CONF"
SHOW_IF="(BULK_FILE_ALREADY_EXIST=='false') AND (AUTH_TYPE == 'USER_ACCOUNT_HMAC')">
<DEFAULT>""</DEFAULT>
</PARAMETER>
@@ -195,6 +219,28 @@
<DEFAULT>"__COMP_DEFAULT_FILE_DIR__/key.json"</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="GS_ACCESS_TOKEN"
FIELD="PASSWORD"
NUM_ROW="60"
REQUIRED_IF="(BULK_FILE_ALREADY_EXIST=='false') AND (AUTH_TYPE == 'TOKEN')"
SHOW_IF="(BULK_FILE_ALREADY_EXIST=='false') AND (AUTH_TYPE == 'TOKEN')"
GROUP="GS_CONF"
>
<DEFAULT>""</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="GS_ACCESS_TOKEN_WARNING"
FIELD="LABEL"
NUM_ROW="61"
REQUIRED="false"
GROUP="GS_CONF"
SHOW_IF="(BULK_FILE_ALREADY_EXIST=='false') AND (AUTH_TYPE == 'TOKEN')"
>
<DEFAULT>*Note: If the Access Token expires, we won't be able to refresh it!</DEFAULT>
</PARAMETER>
<PARAMETER NAME="BUCKET_NAME" FIELD="TEXT" NUM_ROW="63" REQUIRED="true" GROUP="GS_CONF" SHOW_IF="BULK_FILE_ALREADY_EXIST=='false'">
<DEFAULT>""</DEFAULT>
</PARAMETER>
@@ -303,6 +349,8 @@
<TEMPLATE_PARAM SOURCE="self.AUTH_MODE" TARGET="tBQBE.AUTH_MODE" />
<TEMPLATE_PARAM SOURCE="self.SERVICE_ACCOUNT_CREDENTIALS_FILE" TARGET="tBQBE.SERVICE_ACCOUNT_CREDENTIALS_FILE" />
<TEMPLATE_PARAM SOURCE="self.ACCESS_TOKEN" TARGET="tBQBE.ACCESS_TOKEN" />
<TEMPLATE_PARAM SOURCE="self.ACCESS_TOKEN_WARNING" TARGET="tBQBE.ACCESS_TOKEN_WARNING" />
<TEMPLATE_PARAM SOURCE="self.CLIENT_ID" TARGET="tBQBE.CLIENT_ID" />
<TEMPLATE_PARAM SOURCE="self.CLIENT_SECRET" TARGET="tBQBE.CLIENT_SECRET" />
<TEMPLATE_PARAM SOURCE="self.PROJECT_ID" TARGET="tBQBE.PROJECT_ID" />
@@ -313,6 +361,8 @@
<TEMPLATE_PARAM SOURCE="self.ACTION_ON_DATA" TARGET="tBQBE.ACTION_ON_DATA" />
<TEMPLATE_PARAM SOURCE="self.AUTH_TYPE" TARGET="tBQBE.AUTH_TYPE" />
<TEMPLATE_PARAM SOURCE="self.GS_SERVICE_ACCOUNT_KEY" TARGET="tBQBE.GS_SERVICE_ACCOUNT_KEY" />
<TEMPLATE_PARAM SOURCE="self.GS_ACCESS_TOKEN" TARGET="tBQBE.GS_ACCESS_TOKEN" />
<TEMPLATE_PARAM SOURCE="self.GS_ACCESS_TOKEN_WARNING" TARGET="tBQBE.GS_ACCESS_TOKEN_WARNING" />
<TEMPLATE_PARAM SOURCE="self.GS_ACCESS_KEY" TARGET="tBQBE.GS_ACCESS_KEY" />
<TEMPLATE_PARAM SOURCE="self.GS_SECRET_KEY" TARGET="tBQBE.GS_SECRET_KEY" />
<TEMPLATE_PARAM SOURCE="self.BUCKET_NAME" TARGET="tBQBE.BUCKET_NAME" />


@@ -22,11 +22,13 @@ SERVICE_ACCOUNT_CREDENTIALS_FILE.NAME=Service account credentials file
AUTH_MODE.NAME=Authentication mode
AUTH_MODE.ITEM.SERVICEACCOUNT=Service account
AUTH_MODE.ITEM.OAUTH=OAuth 2.0
AUTH_MODE.ITEM.TOKEN=OAuth Access Token
CLIENT_ID.NAME=Client Id
CLIENT_SECRET.NAME=Client Secret
PROJECT_ID.NAME=Project Id
PROJECT_ID.NAME=Project ID
AUTHORIZATION_CODE.NAME=Authorization Code
ACCESS_TOKEN.NAME=OAuth Access Token
SCHEMA.NAME=Schema
@@ -62,4 +64,6 @@ DIE_ON_ERROR.NAME=Die on error
AUTH_TYPE.NAME=Credential type
AUTH_TYPE.ITEM.USER_ACCOUNT_HMAC=HMAC key (deprecated)
AUTH_TYPE.ITEM.GS_SERVICE_ACCOUNT=Service account
AUTH_TYPE.ITEM.TOKEN=OAuth Access Token
GS_SERVICE_ACCOUNT_KEY.NAME=Service account key
GS_ACCESS_TOKEN.NAME=OAuth Access Token


@@ -16,7 +16,7 @@ LONG.NAME=Connects and loads data from a flow into Google BigQuery
HELP=org.talend.help.tBigQueryOutput
CLIENT_ID.NAME=Client Id
CLIENT_SECRET.NAME=Client Secret
PROJECT_ID.NAME=Project Id
PROJECT_ID.NAME=Project ID
AUTHORIZATION_CODE.NAME=Authorization Code
SCHEMA.NAME=Schema
DATASET.NAME=Dataset


@@ -1,7 +1,7 @@
<%@ jet
imports="
org.talend.core.model.process.INode
org.talend.core.model.process.ElementParameterParser
org.talend.designer.codegen.config.CodeGeneratorArgument
java.util.List
"
@@ -23,33 +23,39 @@
String query = ElementParameterParser.getValue(node,"__QUERY__");
boolean useLegacySql = ElementParameterParser.getBooleanValue(node,"__USE_LEGACY_SQL__");
String basePackage = "";
query = query.replaceAll("\n"," ").replaceAll("\r", " ");
String tokenFile = ElementParameterParser.getValue(node,"__TOKEN_NAME__");
boolean isLog4jEnabled = ("true").equals(ElementParameterParser.getValue(node.getProcess(), "__LOG4J_ACTIVATE__"));
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
%>
final String PROJECT_ID_<%=cid %> = <%=projectId %>;
final com.google.api.client.http.HttpTransport TRANSPORT_<%=cid %> = new com.google.api.client.http.javanet.NetHttpTransport();
final com.google.api.client.json.JsonFactory JSON_FACTORY_<%=cid %> = new com.google.api.client.json.jackson2.JacksonFactory();
com.google.api.services.bigquery.Bigquery bigqueryclient_<%=cid%> = null;
com.google.api.client.auth.oauth2.Credential credential_<%=cid%> = null;
long nb_line_<%=cid%> = 0;
<%
if (authMode.equals("OAUTH")) {
String passwordFieldName = "__CLIENT_SECRET__";
%>
<%@ include file="@{org.talend.designer.components.localprovider}/components/templates/password.javajet"%>
final String CLIENT_SECRET_<%=cid%> = "{\"web\": {\"client_id\": \""+<%=clientId%>+"\",\"client_secret\": \"" +decryptedPassword_<%=cid%>+ "\",\"auth_uri\": \"https://accounts.google.com/o/oauth2/auth\",\"token_uri\": \"https://accounts.google.com/o/oauth2/token\"}}";
final String PROJECT_ID_<%=cid %> = <%=projectId %>;
// Static variables for API scope, callback URI, and HTTP/JSON functions
final List<String> SCOPES_<%=cid%> = java.util.Arrays.asList("https://www.googleapis.com/auth/bigquery");
final String REDIRECT_URI_<%=cid%> = "urn:ietf:wg:oauth:2.0:oob";
final com.google.api.client.http.HttpTransport TRANSPORT_<%=cid %> = new com.google.api.client.http.javanet.NetHttpTransport();
final com.google.api.client.json.JsonFactory JSON_FACTORY_<%=cid %> = new com.google.api.client.json.jackson2.JacksonFactory();
com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets clientSecrets_<%=cid%> = com.google.api.client.googleapis.auth.oauth2.GoogleClientSecrets.load(
JSON_FACTORY_<%=cid %>, new java.io.InputStreamReader(new java.io.ByteArrayInputStream(
CLIENT_SECRET_<%=cid%>.getBytes())));
com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeFlow flow_<%=cid%> = null;
com.google.api.services.bigquery.Bigquery bigqueryclient_<%=cid%> = null;
long nb_line_<%=cid%> = 0;
<%
if(isLog4jEnabled){
%>
@@ -59,7 +65,7 @@
<%
}
%>
// Attempt to load existing refresh token
String tokenFile_<%=cid %> = <%=tokenFile%>;
java.util.Properties properties_<%=cid%> = new java.util.Properties();
@@ -75,16 +81,16 @@
%>
}
String storedRefreshToken_<%=cid%> = (String) properties_<%=cid%>.get("refreshtoken");
// Check to see if an existing refresh token was loaded.
// If so, create a credential and call refreshToken() to get a new
// access token.
if (storedRefreshToken_<%=cid%> != null) {
// Request a new Access token using the refresh token.
com.google.api.client.googleapis.auth.oauth2.GoogleCredential credential_<%=cid%> = new com.google.api.client.googleapis.auth.oauth2. GoogleCredential.Builder().setTransport(TRANSPORT_<%=cid%>)
credential_<%=cid%> = new com.google.api.client.googleapis.auth.oauth2.GoogleCredential.Builder().setTransport(TRANSPORT_<%=cid%>)
.setJsonFactory(JSON_FACTORY_<%=cid%>).setClientSecrets(clientSecrets_<%=cid%>)
.build().setFromTokenResponse(new com.google.api.client.auth.oauth2.TokenResponse().setRefreshToken(storedRefreshToken_<%=cid%>));
credential_<%=cid%>.refreshToken();
<%
if(isLog4jEnabled){
@@ -93,7 +99,6 @@
<%
}
%>
bigqueryclient_<%=cid%> = new com.google.api.services.bigquery.Bigquery.Builder(new com.google.api.client.http.javanet.NetHttpTransport(),new com.google.api.client.json.jackson2.JacksonFactory(),credential_<%=cid%>).setApplicationName("Talend").build();
} else {
<%
if(isLog4jEnabled){
@@ -106,7 +111,7 @@
if(authorizationCode_<%=cid%> == null || "".equals(authorizationCode_<%=cid%>) || "\"\"".equals(authorizationCode_<%=cid%>)) {
String authorizeUrl_<%=cid%> = new com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeRequestUrl(
clientSecrets_<%=cid%>, REDIRECT_URI_<%=cid%>, SCOPES_<%=cid%>).setState("").build();
<%
if(isLog4jEnabled){
%>
@@ -131,13 +136,12 @@
}
%>
// Exchange the auth code for an access token and refresh token
flow_<%=cid%> = new com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeFlow.Builder(new com.google.api.client.http.javanet.NetHttpTransport(),
new com.google.api.client.json.jackson2.JacksonFactory(), clientSecrets_<%=cid%>, SCOPES_<%=cid%>)
flow_<%=cid%> = new com.google.api.client.googleapis.auth.oauth2.GoogleAuthorizationCodeFlow.Builder(TRANSPORT_<%=cid %>, JSON_FACTORY_<%=cid%>, clientSecrets_<%=cid%>, SCOPES_<%=cid%>)
.setAccessType("offline").setApprovalPrompt("force")
.build();
com.google.api.client.googleapis.auth.oauth2.GoogleTokenResponse response_<%=cid%> = flow_<%=cid%>.newTokenRequest(authorizationCode_<%=cid%>).setRedirectUri(REDIRECT_URI_<%=cid%>).execute();
com.google.api.client.auth.oauth2.Credential credential_<%=cid%> = flow_<%=cid%>.createAndStoreCredential(response_<%=cid%>, null);
credential_<%=cid%> = flow_<%=cid%>.createAndStoreCredential(response_<%=cid%>, null);
<%
if(isLog4jEnabled){
%>
@@ -151,17 +155,28 @@
try(java.io.FileOutputStream outputStream_<%=cid%> = new java.io.FileOutputStream(tokenFile_<%=cid %>)) {
storeProperties_<%=cid%>.store(outputStream_<%=cid%>,null);
}
bigqueryclient_<%=cid%> = new com.google.api.services.bigquery.Bigquery.Builder(new com.google.api.client.http.javanet.NetHttpTransport(),new com.google.api.client.json.jackson2.JacksonFactory(),credential_<%=cid%>).build();
}
}
<% } else {
if (ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) {%>
final String decryptedAccessToken_<%=cid%> = routines.system.PasswordEncryptUtil.decryptPassword(<%=ElementParameterParser.getEncryptedValue(node, "__ACCESS_TOKEN__")%>);
credential_<%=cid%> =
new com.google.api.client.auth.oauth2.Credential(com.google.api.client.auth.oauth2.BearerToken.authorizationHeaderAccessMethod()).setFromTokenResponse(
(new com.google.api.client.auth.oauth2.TokenResponse()).setAccessToken(decryptedAccessToken_<%=cid%>));
<%} else {%>
credential_<%=cid%> =
new com.google.api.client.auth.oauth2.Credential(com.google.api.client.auth.oauth2.BearerToken.authorizationHeaderAccessMethod()).setFromTokenResponse(
(new com.google.api.client.auth.oauth2.TokenResponse()).setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>));
<%}
} %>
bigqueryclient_<%=cid%> = new com.google.api.services.bigquery.Bigquery.Builder(TRANSPORT_<%=cid %>, JSON_FACTORY_<%=cid%>, credential_<%=cid%>).setApplicationName("Talend").build();
<%
String resultSizeType = ElementParameterParser.getValue(node,"__RESULT_SIZE__");
%>
<%@ include file="@{org.talend.designer.components.localprovider}/components/tBigQueryInput/BigQueryInputQueryHelper.javajet"%>
BigQueryUtil_<%=cid%> bigQueryUtil_<%=cid%> = new BigQueryUtil_<%=cid%>(PROJECT_ID_<%=cid%>, bigqueryclient_<%=cid%>, tokenFile_<%=cid%>);
<%@ include file="@{org.talend.designer.components.localprovider}/components/tBigQueryInput/BigQueryInputQueryHelper.javajet"%>
BigQueryUtil_<%=cid%> bigQueryUtil_<%=cid%> = new BigQueryUtil_<%=cid%>(PROJECT_ID_<%=cid%>, bigqueryclient_<%=cid%>, <% if (authMode.equals("OAUTH")) { %> tokenFile_<%=cid%> <% } else { %> null <%}%>);
<%
} else if (authMode.equals("SERVICEACCOUNT")) {
%>
@@ -176,7 +191,7 @@
.setProjectId(<%=projectId%>)
.build()
.getService();
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);

View File

@@ -44,9 +44,31 @@
<ITEMS DEFAULT="SERVICEACCOUNT">
<ITEM NAME="SERVICEACCOUNT" VALUE="SERVICEACCOUNT"/>
<ITEM NAME="OAUTH" VALUE="OAUTH"/>
</ITEMS>
</PARAMETER>
<ITEM NAME="TOKEN" VALUE="TOKEN" />
</ITEMS>
</PARAMETER>
<PARAMETER
NAME="ACCESS_TOKEN"
FIELD="PASSWORD"
NUM_ROW="10"
REQUIRED="true"
SHOW_IF="AUTH_MODE == 'TOKEN'"
GROUP="AUTHENTICATION"
>
<DEFAULT>""</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="ACCESS_TOKEN_WARNING"
FIELD="LABEL"
NUM_ROW="11"
REQUIRED="false"
GROUP="AUTHENTICATION"
SHOW_IF="AUTH_MODE == 'TOKEN'"
>
<DEFAULT>*Note: If the Access Token expires, it cannot be refreshed!</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="SERVICE_ACCOUNT_CREDENTIALS_FILE"
REPOSITORY_VALUE="SERVICE_ACCOUNT_CREDENTIALS_FILE"
@@ -170,7 +192,7 @@
<IMPORT NAME="google-api-services-oauth2-v2-rev151-1.25.0.jar" MODULE="google-api-services-oauth2-v2-rev151-1.25.0.jar" MVN="mvn:com.google.apis/google-api-services-oauth2/v2-rev151-1.25.0" REQUIRED="true" />
<IMPORT NAME="google-api-services-bigquery-v2-rev454-1.25.0.jar" MODULE="google-api-services-bigquery-v2-rev454-1.25.0.jar" MVN="mvn:com.google.apis/google-api-services-bigquery/v2-rev454-1.25.0" REQUIRED="true" />
<IMPORT NAME="google-http-client-1.25.0.jar" MODULE="google-http-client-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client/1.25.0" REQUIRED="true" />
<IMPORT NAME="google-oauth-client-1.25.0.jar" MODULE="google-oauth-client-1.25.0.jar" MVN="mvn:com.google.oauth-client/google-oauth-client/1.25.0" REQUIRED="true" />
<IMPORT NAME="google-oauth-client-1.31.0.jar" MODULE="google-oauth-client-1.31.0.jar" MVN="mvn:com.google.oauth-client/google-oauth-client/1.31.0" REQUIRED="true" />
<IMPORT NAME="google-http-client-jackson2-1.25.0.jar" MODULE="google-http-client-jackson2-1.25.0.jar" MVN="mvn:com.google.http-client/google-http-client-jackson2/1.25.0" REQUIRED="true" />
<IMPORT NAME="guava-20.0.jar" MODULE="guava-20.0.jar" MVN="mvn:com.google.guava/guava/20.0" REQUIRED="true"/>
<IMPORT NAME="jackson-core-2.10.1.jar" MODULE="jackson-core-2.10.1.jar"

View File

@@ -1,10 +1,10 @@
<%@ jet
imports="
org.talend.core.model.process.INode
org.talend.core.model.process.ElementParameterParser
org.talend.core.model.process.INode
org.talend.core.model.process.ElementParameterParser
org.talend.designer.codegen.config.CodeGeneratorArgument
org.talend.core.model.metadata.IMetadataTable
org.talend.core.model.metadata.IMetadataColumn
org.talend.core.model.metadata.IMetadataTable
org.talend.core.model.metadata.IMetadataColumn
org.talend.core.model.process.IConnection
org.talend.core.model.process.IConnectionCategory
org.talend.core.model.metadata.types.JavaTypesManager
@@ -23,14 +23,14 @@
String authMode = ElementParameterParser.getValue(node,"__AUTH_MODE__");
String query = ElementParameterParser.getValue(node,"__QUERY__");
boolean useLegacySql = ElementParameterParser.getBooleanValue(node,"__USE_LEGACY_SQL__");
query = query.replaceAll("\n"," ").replaceAll("\r"," ");
boolean isLog4jEnabled = ("true").equals(ElementParameterParser.getValue(node.getProcess(), "__LOG4J_ACTIVATE__"));
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
String resultSizeType = ElementParameterParser.getValue(node,"__RESULT_SIZE__");
%>
// Start a Query Job
String querySql_<%=cid %> = <%=query %>;
System.out.format("Running Query : %s\n", querySql_<%=cid %>);
@@ -41,17 +41,20 @@
<%
}
%>
<%
if(isLog4jEnabled){
%>
log.info("<%=cid%> - Try <%="LARGE".equals(resultSizeType) ? "with" : "without"%> allow large results flag");
<%
}
%>
if(authMode.equals("TOKEN") && !ElementParameterParser.canEncrypt(node, "__ACCESS_TOKEN__")) { %>
credential_<%=cid%>.setAccessToken(<%= ElementParameterParser.getValue(node, "__ACCESS_TOKEN__")%>);
<% } %>
com.google.api.services.bigquery.model.Job insert_<%=cid %> = bigQueryUtil_<%=cid%>.executeQuery(querySql_<%=cid%>, <%="LARGE".equals(resultSizeType) ? true : false%>, <%=useLegacySql%>);
<%
<%
if(isLog4jEnabled){
%>
log.info("<%=cid%> - Executing query.");
@@ -88,12 +91,12 @@
<%
}
%>
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\", \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
List< ? extends IConnection> conns = node.getOutgoingSortedConnections();
if (conns != null){
if (conns.size()>0){
@@ -101,7 +104,7 @@
String connName = conn.getName();
if (conn.getLineStyle().hasConnectionCategory(IConnectionCategory.DATA)) {
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
%>
while (true) {
// Fetch Results
@@ -112,14 +115,14 @@
insert_<%=cid %>.getConfiguration().getQuery()
.getDestinationTable().getTableId())
.setPageToken(pageToken_<%=cid%>).execute();
List<com.google.api.services.bigquery.model.TableRow> rows_<%=cid %> = dataList_<%=cid %>.getRows();
if(rows_<%=cid %> == null) {
// Means there is no record.
rows_<%=cid %> = new java.util.ArrayList<com.google.api.services.bigquery.model.TableRow>();
}
for (com.google.api.services.bigquery.model.TableRow row_<%=cid %> : rows_<%=cid %>) {
java.util.List<com.google.api.services.bigquery.model.TableCell> field_<%=cid %> = row_<%=cid %>.getF();
Object value_<%=cid%> = null;
@@ -130,15 +133,15 @@
com.google.cloud.bigquery.TableResult result_<%=cid%> = job_<%=cid%>.getQueryResults();
long nb_line_<%=cid%> = 0;
for (com.google.cloud.bigquery.FieldValueList field_<%=cid %> : result_<%=cid%>.iterateAll()) {
Object value_<%=cid%>;
nb_line_<%=cid%> ++;
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\", \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
List<IMetadataTable> metadatas = node.getMetadataList();
if ((metadatas!=null) && (metadatas.size() > 0)) {
IMetadataTable metadata = metadatas.get(0);
@@ -148,17 +151,17 @@
boolean advancedSeparator = (advancedSeparatorStr!=null&&!("").equals(advancedSeparatorStr))?("true").equals(advancedSeparatorStr):false;
String thousandsSeparator = ElementParameterParser.getValueWithJavaType(node, "__THOUSANDS_SEPARATOR__", JavaTypesManager.CHARACTER);
String decimalSeparator = ElementParameterParser.getValueWithJavaType(node, "__DECIMAL_SEPARATOR__", JavaTypesManager.CHARACTER);
List<IMetadataColumn> columns = metadata.getListColumns();
int nbColumns = columns.size();
for (int i = 0; i < nbColumns; i++ ) {
IMetadataColumn column = columns.get(i);
String columnName = column.getLabel();
String typeToGenerate = JavaTypesManager.getTypeToGenerate(column.getTalendType(), column.isNullable());
JavaType javaType = JavaTypesManager.getJavaTypeFromId(column.getTalendType());
String patternValue = column.getPattern() == null || column.getPattern().trim().length() == 0 ? null : column.getPattern();
if (authMode.equals("OAUTH")) {
String patternValue = column.getPattern() == null || column.getPattern().trim().length() == 0 ? null : column.getPattern();
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
%>
value_<%=cid%> = field_<%=cid %>.get(<%=i%>).getV();
<%
@@ -167,22 +170,22 @@
value_<%=cid%> = field_<%=cid %>.get(<%=i%>).getValue();
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\", \"OAUTH\" or \"TOKEN\", but it is " + authMode);
}
%>
if(com.google.api.client.util.Data.isNull(value_<%=cid%>)) value_<%=cid%> = null;
if(value_<%=cid%> != null){
<%
if (javaType == JavaTypesManager.STRING) {
%>
<%=connName%>.<%=columnName%> = value_<%=cid%>.toString();
<%
} else if (javaType == JavaTypesManager.OBJECT) {
%>
%>
<%=connName%>.<%=columnName%> = value_<%=cid%>;
<%
} else if(javaType == JavaTypesManager.DATE) {
<%
} else if(javaType == JavaTypesManager.DATE) {
%>
if (value_<%=cid%>.toString().contains("-")) {
String sValue_<%=cid%> = value_<%=cid%>.toString();
@@ -195,13 +198,13 @@
} else{
<%=connName%>.<%=columnName%>=ParserUtils.parseTo_Date(value_<%=cid%>.toString());
}
<%
} else if(advancedSeparator && JavaTypesManager.isNumberType(javaType)) {
<%
} else if(advancedSeparator && JavaTypesManager.isNumberType(javaType)) {
%>
<%=connName%>.<%=columnName%> = ParserUtils.parseTo_<%= typeToGenerate %>(ParserUtils.parseTo_Number(value_<%=cid%>.toString(), <%= thousandsSeparator %>, <%= decimalSeparator %>));
<%
} else if(javaType == JavaTypesManager.BYTE_ARRAY) {
%>
} else if(javaType == JavaTypesManager.BYTE_ARRAY) {
%>
<%=connName%>.<%=columnName%> = value_<%=cid%>.toString().getBytes(<%=encoding %>);
<%
} else {
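Across these hunks, row values are read through two different client libraries depending on the authentication mode: the OAUTH/TOKEN path uses the BigQuery v2 REST model (TableRow/TableCell), while the SERVICEACCOUNT path iterates a FieldValueList from google-cloud-bigquery. A small sketch of the two accessors, assuming the row objects have already been fetched (class and method names are the ones visible in the hunks above):

// Sketch only: contrasts the two cell accessors used above; rows are assumed already fetched.
public class CellValueSketch {
    // OAUTH / TOKEN path: bigquery v2 REST model.
    static Object cellValue(com.google.api.services.bigquery.model.TableRow row, int i) {
        Object v = row.getF().get(i).getV();
        return com.google.api.client.util.Data.isNull(v) ? null : v;   // the library's null placeholders become real nulls
    }

    // SERVICEACCOUNT path: google-cloud-bigquery client.
    static Object cellValue(com.google.cloud.bigquery.FieldValueList row, int i) {
        return row.get(i).getValue();
    }
}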

View File

@@ -1,4 +1,4 @@
LONG_NAME=Connect and run a query on Google BigQuery
LONG_NAME=Connect and run a query on Google BigQuery
HELP=org.talend.help.tBigQuerySQLRow
AUTHENTICATION.NAME=Authentication
@@ -6,11 +6,14 @@ SERVICE_ACCOUNT_CREDENTIALS_FILE.NAME=Service account credentials file
AUTH_MODE.NAME=Authentication mode
AUTH_MODE.ITEM.SERVICEACCOUNT=Service account
AUTH_MODE.ITEM.OAUTH=OAuth 2.0
AUTH_MODE.ITEM.TOKEN=OAuth Access Token
CLIENT_ID.NAME=Client Id
CLIENT_SECRET.NAME=Client Secret
PROJECT_ID.NAME=Project Id
PROJECT_ID.NAME=Project ID
AUTHORIZATION_CODE.NAME=Authorization Code
ACCESS_TOKEN.NAME=OAuth Access Token
QUERY.NAME=Query
SCHEMA.NAME=Schema

View File

@@ -7,7 +7,7 @@ AUTH_MODE.ITEM.SERVICEACCOUNT=Service account
AUTH_MODE.ITEM.OAUTH=OAuth 2.0
CLIENT_ID.NAME=Client Id
CLIENT_SECRET.NAME=Client Secret
PROJECT_ID.NAME=Project Id
PROJECT_ID.NAME=Project ID
AUTHORIZATION_CODE.NAME=Authorization Code
QUERY.NAME=Query
SCHEMA.NAME=Schema

View File

@@ -1,4 +1,4 @@
<%@ jet
<%@ jet
imports="
org.talend.core.model.process.INode
org.talend.designer.codegen.config.CodeGeneratorArgument
@@ -6,7 +6,7 @@ imports="
org.talend.core.model.process.IConnection
org.talend.core.model.process.IConnectionCategory
java.util.List
"
"
%>
<%@ include file="@{org.talend.designer.components.localprovider}/components/templates/Log4j/Log4jFileUtil.javajet"%>
<%
@@ -20,7 +20,7 @@ imports="
IConnection conn =conns.get(0);
String connName = conn.getName();
if (conn.getLineStyle().hasConnectionCategory(IConnectionCategory.DATA)) {
if (authMode.equals("OAUTH")) {
if (authMode.equals("OAUTH") || authMode.equals("TOKEN")) {
%>
}
pageToken_<%=cid%> = dataList_<%=cid %>.getPageToken();
@@ -35,7 +35,7 @@ imports="
}
<%
} else {
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\", but it is " + authMode);
throw new IllegalArgumentException("authentication mode should be either \"SERVICEACCOUNT\" or \"OAUTH\"or \"TOKEN\", but it is " + authMode);
}
log4jFileUtil.retrievedDataNumberInfo(node);
}

View File

@@ -165,7 +165,7 @@
<CODEGENERATION>
<IMPORTS>
<IMPORT NAME="xstream-1.4.10.jar" MODULE="xstream-1.4.10.jar" MVN="mvn:com.thoughtworks.xstream/xstream/1.4.10" REQUIRED_IF="(DB_VERSION=='BONITA_523') OR (((DB_VERSION=='BONITA_652')OR(DB_VERSION=='BONITA_724')) AND (CLIENT_MODE=='JAVA_CLIENT'))" />
<IMPORT NAME="xstream-1.4.11.1.jar" MODULE="xstream-1.4.11.1.jar" MVN="mvn:com.thoughtworks.xstream/xstream/1.4.11.1" REQUIRED_IF="(DB_VERSION=='BONITA_523') OR (((DB_VERSION=='BONITA_652')OR(DB_VERSION=='BONITA_724')) AND (CLIENT_MODE=='JAVA_CLIENT'))" />
<!-- Bonita 5.2.3 -->
<IMPORT NAME="bonita_client_523" MODULE="bonita-client-5.2.3.jar" MVN="mvn:org.talend.libraries/bonita-client-5.2.3/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.bonita/lib/bonita-client-5.2.3.jar" REQUIRED_IF="DB_VERSION=='BONITA_523'" />
<IMPORT NAME="bonita_server_523" MODULE="bonita-server-5.2.3.jar" MVN="mvn:org.talend.libraries/bonita-server-5.2.3/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.bonita/lib/bonita-server-5.2.3.jar" REQUIRED_IF="DB_VERSION=='BONITA_523'" />
@@ -221,7 +221,7 @@
<IMPORT NAME="javassist_3120_GA" MODULE="javassist-3.12.0.GA.jar" MVN="mvn:org.talend.libraries/javassist-3.12.0.GA/6.0.0" REQUIRED_IF="DB_VERSION=='BONITA_5101'" />
<!-- Bonita 6.5.2 HTTP API -->
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client-1.0.0/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client/1.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="bonita-client-6.5.2.jar" MODULE="bonita-client-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-client-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="bonita-common-6.5.2.jar" MODULE="bonita-common-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-common-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
@@ -232,7 +232,7 @@
<IMPORT NAME="commons-logging-1.2.jar" MODULE="commons-logging-1.2.jar" MVN="mvn:commons-logging/commons-logging/1.2" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<!-- Bonita 6.5.2 JAVA API -->
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client-1.0.0/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client/1.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-client-6.5.2.jar" MODULE="bonita-client-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-client-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-common-6.5.2.jar" MODULE="bonita-common-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-common-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-server-6.5.2.jar" MODULE="bonita-server-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-server-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
@@ -245,7 +245,7 @@
<IMPORT NAME="commons-codec-1.14.jar" MODULE="commons-codec-1.14.jar" MVN="mvn:commons-codec/commons-codec/1.14" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<!-- Bonita 7.2.4 HTTP API -->
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client-1.0.0/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client/1.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="bonita-client-7.2.4.jar" MODULE="bonita-client-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-client-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="bonita-common-7.2.4.jar" MODULE="bonita-common-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-common-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
@@ -256,7 +256,7 @@
<IMPORT NAME="commons-codec-1.14.jar" MODULE="commons-codec-1.14.jar" MVN="mvn:commons-codec/commons-codec/1.14" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<!-- Bonita 7.2.4 JAVA API -->
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client-1.0.0/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client/1.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-client-7.2.4.jar" MODULE="bonita-client-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-client-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-common-7.2.4.jar" MODULE="bonita-common-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-common-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-server-7.2.4.jar" MODULE="bonita-server-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-server-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />

View File

@@ -214,7 +214,7 @@
<CODEGENERATION>
<IMPORTS>
<IMPORT NAME="xstream-1.4.10.jar" MODULE="xstream-1.4.10.jar" MVN="mvn:com.thoughtworks.xstream/xstream/1.4.10" REQUIRED_IF="((DB_VERSION!='BONITA_652') AND (DB_VERSION!='BONITA_724')) OR (((DB_VERSION=='BONITA_652')OR(DB_VERSION=='BONITA_724')) AND (CLIENT_MODE=='JAVA_CLIENT'))" />
<IMPORT NAME="xstream-1.4.11.1.jar" MODULE="xstream-1.4.11.1.jar" MVN="mvn:com.thoughtworks.xstream/xstream/1.4.11.1" REQUIRED_IF="((DB_VERSION!='BONITA_652') AND (DB_VERSION!='BONITA_724')) OR (((DB_VERSION=='BONITA_652')OR(DB_VERSION=='BONITA_724')) AND (CLIENT_MODE=='JAVA_CLIENT'))" />
<!-- Bonita 5.2.3 -->
<IMPORT NAME="bonita_client_523" MODULE="bonita-client-5.2.3.jar" MVN="mvn:org.talend.libraries/bonita-client-5.2.3/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.bonita/lib/bonita-client-5.2.3.jar" REQUIRED_IF="DB_VERSION=='BONITA_523'" />
<IMPORT NAME="bonita_server_523" MODULE="bonita-server-5.2.3.jar" MVN="mvn:org.talend.libraries/bonita-server-5.2.3/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.bonita/lib/bonita-server-5.2.3.jar" REQUIRED_IF="DB_VERSION=='BONITA_523'" />
@@ -269,7 +269,7 @@
<IMPORT NAME="javassist_3120_GA" MODULE="javassist-3.12.0.GA.jar" MVN="mvn:org.talend.libraries/javassist-3.12.0.GA/6.0.0" REQUIRED_IF="(CLIENT_MODE=='JAVA_CLIENT') AND (DB_VERSION=='BONITA_5101')" />
<!-- Bonita 6.5.2 HTTP API -->
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client-1.0.0/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client/1.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="bonita-client-6.5.2.jar" MODULE="bonita-client-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-client-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="bonita-common-6.5.2.jar" MODULE="bonita-common-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-common-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
@@ -281,7 +281,7 @@
<IMPORT NAME="commons-lang3-3.8.1.jar" MODULE="commons-lang3-3.8.1.jar" MVN="mvn:org.apache.commons/commons-lang3/3.8.1" UrlPath="platform:/plugin/org.talend.libraries.apache.common/lib/commons-lang3-3.8.1.jar" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<!-- Bonita 6.5.2 JAVA API -->
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client-1.0.0/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client/1.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-client-6.5.2.jar" MODULE="bonita-client-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-client-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-common-6.5.2.jar" MODULE="bonita-common-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-common-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-server-6.5.2.jar" MODULE="bonita-server-6.5.2.jar" MVN="mvn:org.talend.libraries/bonita-server-6.5.2/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
@@ -294,7 +294,7 @@
<IMPORT NAME="commons-codec-1.14.jar" MODULE="commons-codec-1.14.jar" MVN="mvn:commons-codec/commons-codec/1.14" REQUIRED_IF="(DB_VERSION=='BONITA_652') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<!-- Bonita 7.2.4 HTTP API -->
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client-1.0.0/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client/1.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="bonita-client-7.2.4.jar" MODULE="bonita-client-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-client-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<IMPORT NAME="bonita-common-7.2.4.jar" MODULE="bonita-common-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-common-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
@@ -306,7 +306,7 @@
<IMPORT NAME="commons-lang3-3.8.1.jar" MODULE="commons-lang3-3.8.1.jar" MVN="mvn:org.apache.commons/commons-lang3/3.8.1" UrlPath="platform:/plugin/org.talend.libraries.apache.common/lib/commons-lang3-3.8.1.jar" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='HTTP_CLIENT')" />
<!-- Bonita 7.2.4 JAVA API -->
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client-1.0.0/6.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="talend-bonita-client-1.0.0.jar" MODULE="talend-bonita-client-1.0.0.jar" MVN="mvn:org.talend.libraries/talend-bonita-client/1.0.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-client-7.2.4.jar" MODULE="bonita-client-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-client-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-common-7.2.4.jar" MODULE="bonita-common-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-common-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />
<IMPORT NAME="bonita-server-7.2.4.jar" MODULE="bonita-server-7.2.4.jar" MVN="mvn:org.talend.libraries/bonita-server-7.2.4/6.3.0" REQUIRED_IF="(DB_VERSION=='BONITA_724') AND (CLIENT_MODE=='JAVA_CLIENT')" />

View File

@@ -606,11 +606,12 @@
SHOW_IF="(DBTYPE=='SYBASE') and (USE_EXISTING_CONNECTION == 'true')"
/>
<PARAMETER NAME="DB_SYBASE_VERSION" FIELD="CLOSED_LIST" NUM_ROW="50"
<PARAMETER NAME="DB_SYBASE_VERSION" FIELD="CLOSED_LIST" NUM_ROW="50" REPOSITORY_VALUE="DB_VERSION"
SHOW_IF="(USE_EXISTING_CONNECTION == 'false')AND(DBTYPE=='SYBASE')">
<ITEMS DEFAULT="SYBSEIQ_16">
<ITEM NAME="SYBSEIQ_12_15" VALUE="SYBSEIQ_12_15" />
<ITEM NAME="SYBSEIQ_16" VALUE="SYBSEIQ_16" />
<ITEM NAME="SYBSEIQ_16_SA" VALUE="SYBSEIQ_16_SA" />
</ITEMS>
</PARAMETER>
<!--Sybase configuration end-->
@@ -877,7 +878,7 @@
<IMPORT NAME="Driver-Oracle12c" MODULE="ojdbc7.jar" MVN="mvn:org.talend.libraries/ojdbc7/6.0.0" REQUIRED_IF="(DBTYPE=='DBORACLE') AND (DB_VERSION == 'ORACLE_12') AND (USE_EXISTING_CONNECTION == 'false')" />
<IMPORT NAME="ORACLE_18" MODULE="ojdbc8-19.3.0.0.jar" MVN="mvn:com.oracle.ojdbc/ojdbc8/19.3.0.0" REQUIRED_IF="(DBTYPE=='DBORACLE') AND (DB_VERSION == 'ORACLE_18') AND (USE_EXISTING_CONNECTION == 'false')" />
<IMPORT NAME="Driver-POSTGRESQL" MODULE="postgresql-8.4-703.jdbc4.jar" MVN="mvn:postgresql/postgresql/8.4-703.jdbc4" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND ((((DBTYPE=='POSTGRE') or (DBTYPE=='POSTGREPLUS')) AND (DB_POSTGRE_VERSION =='PRIOR_TO_V9')) or (DBTYPE=='GREENPLUM'))" />
<IMPORT NAME="Driver-Postgres9" MODULE="postgresql-42.2.9.jar" MVN="mvn:org.postgresql/postgresql/42.2.9" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (((DBTYPE=='POSTGRE') or (DBTYPE=='POSTGREPLUS')) AND (DB_POSTGRE_VERSION =='V9_X'))" />
<IMPORT NAME="Driver-Postgres9" MODULE="postgresql-42.2.14.jar" MVN="mvn:org.postgresql/postgresql/42.2.14" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (((DBTYPE=='POSTGRE') or (DBTYPE=='POSTGREPLUS')) AND (DB_POSTGRE_VERSION =='V9_X'))" />
<IMPORT NAME="Driver-FIREBIRD" MODULE="jaybird-full-2.1.1.jar" MVN="mvn:org.talend.libraries/jaybird-full-2.1.1/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.jdbc.firebird/lib/jaybird-full-2.1.1.jar" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (DBTYPE=='FIREBIRD')" />
<IMPORT NAME="Driver-HSQLDb" MODULE="hsqldb.jar" MVN="mvn:org.talend.libraries/hsqldb/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.jdbc.hsql/lib/hsqldb.jar" REQUIRED_IF="DBTYPE=='HSQLDB'"/>
<IMPORT NAME="Driver-INFORMIX-JDBC" MODULE="ifxjdbc.jar" MVN="mvn:org.talend.libraries/ifxjdbc/6.0.0" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (DBTYPE=='INFORMIX')" />
@@ -894,6 +895,7 @@
<IMPORT NAME="Driver-SQLITE-JDBC-NESTED" MODULE="sqlitejdbc-v056.jar" MVN="mvn:org.talend.libraries/sqlitejdbc-v056/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.jdbc.sqlite3/lib/sqlitejdbc-v056.jar" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (DBTYPE=='SQLITE')" />
<IMPORT NAME="Driver-SYBASE-JCONN3" MODULE="jconn3.jar" MVN="mvn:org.talend.libraries/jconn3/6.0.0" BundleID="" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (DBTYPE=='SYBASE') AND (DB_SYBASE_VERSION == 'SYBSEIQ_12_15')" />
<IMPORT NAME="Driver-SYBASE-JCONN4" MODULE="jconn4.jar" MVN="mvn:org.talend.libraries/jconn4/6.0.0" BundleID="" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (DBTYPE=='SYBASE') AND (DB_SYBASE_VERSION == 'SYBSEIQ_16')" />
<IMPORT NAME="Driver-SqlAnywhere" MODULE="sajdbc4.jar" MVN="mvn:sap.jdbc4.sqlanywhere/sajdbc4/17.0.0" BundleID="" REQUIRED_IF = "(DB_VERSION == 'SYBSEIQ_16_SA') AND (USE_EXISTING_CONNECTION == 'false')" />
<IMPORT NAME="Driver-Teradata_jdbc" MODULE="terajdbc4-16.20.00.02.jar" MVN="mvn:com.teradata/terajdbc4/16.20.00.02" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (DBTYPE=='TERADATA')" />
<IMPORT NAME="Driver-Teradata_config" MODULE="tdgssconfig-16.20.00.02.jar" MVN="mvn:com.teradata/tdgssconfig/16.20.00.02" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (DBTYPE=='TERADATA')" />
<IMPORT NAME="Driver-Netezza" MODULE="nzjdbc.jar" MVN="mvn:org.talend.libraries/nzjdbc/6.0.0" REQUIRED_IF="(USE_EXISTING_CONNECTION == 'false') AND (DBTYPE=='NETEZZA')" />

View File

@@ -159,6 +159,7 @@ MSSQL_DRIVER.ITEM.MSSQL_PROP=Microsoft
DB_SYBASE_VERSION.NAME=DB Version
DB_SYBASE_VERSION.ITEM.SYBSEIQ_12_15=Sybase 12/15
DB_SYBASE_VERSION.ITEM.SYBSEIQ_16=Sybase 16
DB_SYBASE_VERSION.ITEM.SYBSEIQ_16_SA=Sybase 16 (SQL Anywhere)
ACCOUNT.NAME=Account
ROLE.NAME=Role

View File

@@ -87,6 +87,19 @@
REPOSITORY_VALUE="DATABASE:POSTGRESQL"
/>
<PARAMETER
NAME="DB_VERSION"
FIELD="CLOSED_LIST"
NUM_ROW="30"
SHOW_IF="(#LINK@CONNECTOR.OUT.TABLE_REF == '0') AND (USE_EXISTING_CONNECTION == 'false')"
REPOSITORY_VALUE="DB_VERSION"
>
<ITEMS DEFAULT="V9_X">
<ITEM NAME="PRIOR_TO_V9" VALUE="PRIOR_TO_V9" />
<ITEM NAME="V9_X" VALUE="V9_X" />
</ITEMS>
</PARAMETER>
<PARAMETER
NAME="HOST"
@@ -175,7 +188,11 @@
</ADVANCED_PARAMETERS>
<CODEGENERATION>
</CODEGENERATION>
<IMPORTS>
<IMPORT NAME="Driver-Postgres9" MODULE="postgresql-42.2.14.jar" MVN="mvn:org.postgresql/postgresql/42.2.14" REQUIRED_IF="DB_VERSION =='V9_X'" />
<IMPORT NAME="Driver-Postgres" MODULE="postgresql-8.4-703.jdbc4.jar" MVN="mvn:postgresql/postgresql/8.4-703.jdbc4" REQUIRED_IF="DB_VERSION =='PRIOR_TO_V9'" />
</IMPORTS>
</CODEGENERATION>
<RETURNS>
</RETURNS>

View File

@@ -38,3 +38,7 @@ LINK_STYLE.ITEM.AUTO=Auto
LINK_STYLE.ITEM.BEZIER_CURVE=Bezier curve
LINK_STYLE.ITEM.LINE=Line (fastest)
CONNECTION.NAME=Component List
DB_VERSION.NAME=DB Version
DB_VERSION.ITEM.PRIOR_TO_V9=Prior to v9
DB_VERSION.ITEM.V9_X=v9 and later

View File

@@ -118,12 +118,8 @@
<DEFAULT>false</DEFAULT>
</PARAMETER>
</ADVANCED_PARAMETERS>
<CODEGENERATION>
<IMPORTS>
<IMPORT NAME="Driver-Postgres" MODULE="postgresql-8.4-703.jdbc4.jar" MVN="mvn:postgresql/postgresql/8.4-703.jdbc4" REQUIRED="true" />
</IMPORTS>
</CODEGENERATION>
<CODEGENERATION/>
<RETURNS>
<RETURN NAME="NB_LINE" TYPE="id_Integer" AVAILABILITY="AFTER"/>

View File

@@ -78,7 +78,7 @@
REPOSITORY_VALUE="DATABASE:SYBASE"
/>
<PARAMETER NAME="DB_VERSION" FIELD="CLOSED_LIST" NUM_ROW="25">
<PARAMETER NAME="DB_VERSION" FIELD="CLOSED_LIST" NUM_ROW="25" REPOSITORY_VALUE="DB_VERSION">
<ITEMS DEFAULT="SYBSEIQ_12_15">
<ITEM NAME="SYBSEIQ_12_15" VALUE="SYBSEIQ_12_15" />
<ITEM NAME="SYBSEIQ_16" VALUE="SYBSEIQ_16" />

View File

@@ -126,7 +126,12 @@
</PARAMETER>
</ADVANCED_PARAMETERS>
<CODEGENERATION></CODEGENERATION>
<CODEGENERATION>
<IMPORTS>
<IMPORT NAME="Driver-Teradata" MODULE="terajdbc4-16.20.00.02.jar" MVN="mvn:com.teradata/terajdbc4/16.20.00.02" REQUIRED="true" />
<IMPORT NAME="Driver-Teradata" MODULE="tdgssconfig-16.20.00.02.jar" MVN="mvn:com.teradata/tdgssconfig/16.20.00.02" REQUIRED="true" />
</IMPORTS>
</CODEGENERATION>
<RETURNS></RETURNS>

View File

@@ -119,10 +119,6 @@
</ADVANCED_PARAMETERS>
<CODEGENERATION>
<IMPORTS>
<IMPORT NAME="Driver-Teradata" MODULE="terajdbc4-16.20.00.02.jar" MVN="mvn:com.teradata/terajdbc4/16.20.00.02" REQUIRED="true" />
<IMPORT NAME="Driver-Teradata" MODULE="tdgssconfig-16.20.00.02.jar" MVN="mvn:com.teradata/tdgssconfig/16.20.00.02" REQUIRED="true" />
</IMPORTS>
</CODEGENERATION>
<RETURNS>

View File

@@ -19,6 +19,16 @@ imports="
<%
CodeGeneratorArgument codeGenArgument = (CodeGeneratorArgument) argument;
INode node = (INode)codeGenArgument.getArgument();
boolean useAlias = false;
List<IConnection> inConnections = (List<IConnection>) node.getIncomingConnections();
if(inConnections != null && inConnections.size() > 0 && inConnections.get(0) != null) {
IConnection inConnection = inConnections.get(0);
INode sourceNode = inConnection.getSource();
if(sourceNode!=null){
useAlias = "true".equals(ElementParameterParser.getValue(sourceNode, "__USE_ALIAS_IN_OUTPUT_TABLE__"));
}
}
boolean isLog4jEnabled = ("true").equals(ElementParameterParser.getValue(node.getProcess(), "__LOG4J_ACTIVATE__"));
String cid = node.getUniqueName();
@@ -27,19 +37,23 @@ imports="
public String transform(String content){
String result = null;
content = content.trim();
String[] splitArr = content.split("\\s");
int lgth = splitArr.length;
if(lgth > 1){// has expr alias
result = "src." + splitArr[lgth - 1];
}else{
int index = content.indexOf(".");
if(index != -1){
result = "src" + content.substring(index);
}else{
result = content;
}
}
return result;
<%if(!useAlias){%>
String[] splitArr = content.split("\\s");
int lgth = splitArr.length;
if(lgth > 1){// has expr alias
result = "src." + splitArr[lgth - 1];
}else{
int index = content.indexOf(".");
if(index != -1){
result = "src" + content.substring(index);
}else{
result = content;
}
}
return result;
<%}else{%>
return "src." + content;
<%}%>
}
}
StrUtils_<%= cid %> strUtil_<%= cid %> = new StrUtils_<%= cid %>();
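With the new USE_ALIAS_IN_OUTPUT_TABLE branch, the generated transform either prefixes the whole column expression with the source alias, or falls back to the previous heuristic (keep the trailing alias token, or rewrite table.column to src.column). A plain-Java sketch of the two strategies, with illustrative names only:

// Sketch of the generated transform logic above; "src" stands for the source-table alias.
public class AliasTransformSketch {
    static String transform(String content, boolean useAlias) {
        content = content.trim();
        if (useAlias) {
            return "src." + content;                          // alias mode: always qualify the expression as-is
        }
        String[] parts = content.split("\\s");
        if (parts.length > 1) {                               // expression with a trailing alias, e.g. "t1.amount total"
            return "src." + parts[parts.length - 1];
        }
        int dot = content.indexOf('.');
        return (dot != -1) ? "src" + content.substring(dot)   // "t1.amount" -> "src.amount"
                           : content;                         // bare column name stays as-is
    }
}

For example, transform("t1.amount total", false) yields "src.total", while transform("amount", true) yields "src.amount".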

View File

@@ -232,10 +232,6 @@ COMPATIBILITY="ALL"
</ITEMS>
</PARAMETER>
<PARAMETER NAME="NOTE_CUSTOMER" FIELD="LABEL" REQUIRED="true" NUM_ROW="9" COLOR="255;0;0" SHOW_IF="(SFTPOVERWRITE == 'APPEND')">
<DEFAULT>Note: File can not be renamed while using "append" mode in SFTP</DEFAULT>
</PARAMETER>
<PARAMETER NAME="PERL5_REGEX" FIELD="CHECK" REQUIRED="false" NUM_ROW="11">
<DEFAULT>false</DEFAULT>
</PARAMETER>

View File

@@ -103,7 +103,14 @@ try{
if(remoteDir_<%=cid%>.endsWith("/")) {
remoteDir_<%=cid%> = remoteDir_<%=cid%>.substring(0, remoteDir_<%=cid%>.length()-1);
}
String dest_<%=cid%> = remoteDir_<%=cid%>+"/"+listings<%=cid %>[m<%=cid %>].getName();
String destRename_<%= cid %> = map<%=cid %>.get(key<%=cid %>);
final String dest_<%=cid%>;
if (destRename_<%= cid %> == null || destRename_<%= cid %>.isEmpty()) {
dest_<%=cid%> = remoteDir_<%=cid%> + "/" + listings<%=cid %>[m<%=cid %>].getName();
}
else {
dest_<%=cid%> = remoteDir_<%=cid%> + "/" + destRename_<%= cid %>;
}
try{
c_<%=cid%>.put(listings<%=cid %>[m<%=cid %>].getAbsolutePath(), dest_<%=cid%>, monitor<%=cid%>, mode<%=cid%>);
@@ -121,6 +128,8 @@ try{
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS",true);
}
}catch(com.jcraft.jsch.SftpException e_<%=cid%>) {
if (e_<%=cid%>.id != 4 && // if the exception is not a normal resume problem
(e_<%=cid%>.getMessage() == null || !e_<%=cid%>.getMessage().startsWith("failed to resume"))) {
<%if(isLog4jEnabled){%>
log.error("<%=cid%> - File transfer fail."+e_<%= cid %>.getMessage());
<%}%>
@@ -145,6 +154,7 @@ try{
<%
}
%>
} // end if: exception was not a resume problem
}catch(java.lang.Exception e_<%=cid%>){
if(!(e_<%=cid%> instanceof com.jcraft.jsch.SftpException)) {
msg_<%=cid%>.add("file " + listings<%=cid %>[m<%=cid %>].getAbsolutePath() + " not found?");
@@ -162,27 +172,6 @@ try{
<%}%>
System.err.println("No matches found for mask '" + key<%=cid %> + "'!");
}
//do rename
if (!((map<%=cid %>.get(key<%=cid %>) == null) || map<%=cid %>.get(key<%=cid %>).isEmpty() || key<%=cid %>.equals(map<%=cid %>.get(key<%=cid %>)))){
<%
if(("overwrite").equals(sftpoverwrite) || ("resume").equals(sftpoverwrite)){
%>
try{
c_<%=cid%>.rm(<%=remotedir %>+"/"+map<%=cid %>.get(key<%=cid %>));
}catch(com.jcraft.jsch.SftpException e_<%=cid%>){
}
<%
}
%>
try{
c_<%=cid%>.rename(<%=remotedir %>+"/"+key<%=cid %>, <%=remotedir %>+"/"+map<%=cid %>.get(key<%=cid %>));
globalMap.put("<%=cid %>_CURRENT_STATUS", "File rename OK.");
}catch(com.jcraft.jsch.SftpException e_<%=cid%>){
globalMap.put("<%=cid %>_CURRENT_STATUS", "File rename fail.");
throw e_<%=cid%>;
}
}
}
<%
@@ -194,7 +183,7 @@ try{
String currentStatus_<%=cid %> = "No file transfered.";
globalMap.put("<%=cid %>_CURRENT_STATUS", currentStatus_<%=cid %>);
java.util.Set<String> keySet<%=cid %> = map<%=cid %>.keySet();
boolean needRename_<%=cid%> = false;
for (String key<%=cid %> : keySet<%=cid %>){
if(key<%=cid %> == null || "".equals(key<%=cid%>)){
<%if(isLog4jEnabled){%>
@@ -235,7 +224,7 @@ try{
});
}
java.util.List<String> remoteExistsFiles_<%=cid%> = new java.util.ArrayList<String>();
java.util.Set<String> remoteExistsFiles_<%=cid%> = new java.util.HashSet<>();
String[] ftpFileNames_<%=cid%> = ftp_<%=cid %>.listNames();
for (String ftpFileName : ftpFileNames_<%=cid%>) {
@@ -255,55 +244,67 @@ try{
if (listings<%=cid %>[m<%=cid %>].getName().matches(mask<%=cid %>)){
java.io.File file_in_localDir_<%=cid%> = listings<%=cid %>[m<%=cid %>];
java.io.FileInputStream file_stream_<%=cid %> = new java.io.FileInputStream(file_in_localDir_<%=cid%>);
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS", remoteExistsFiles_<%=cid%>.contains(file_in_localDir_<%=cid%>.getName()));
String newName_<%=cid%> = ("".equals(map<%=cid %>.get(key<%=cid %>)))?file_in_localDir_<%=cid%>.getName():map<%=cid %>.get(key<%=cid %>);
needRename_<%=cid%> = true;
final String destRename_<%= cid %> = map<%=cid %>.get(key<%=cid %>);
final String dest_<%=cid%>;
if (destRename_<%= cid %> == null || destRename_<%= cid %>.isEmpty()) {
dest_<%=cid%> = listings<%=cid %>[m<%=cid %>].getName();
}
else {
dest_<%=cid%> = destRename_<%= cid %>;
}
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS", remoteExistsFiles_<%=cid%>.contains(dest_<%=cid%>));
<%
if (!ftps && append) {
%>
if ((remoteExistsFiles_<%=cid%>.contains(newName_<%=cid%>))){
ftp_<%=cid %>.appendFile(file_in_localDir_<%=cid%>.getName(), file_stream_<%=cid %>);
if ((remoteExistsFiles_<%=cid%>.contains(dest_<%=cid%>))){
ftp_<%=cid %>.appendFile(dest_<%=cid%>, file_stream_<%=cid %>);
} else {
ftp_<%=cid %>.storeFile(file_in_localDir_<%=cid%>.getName(), file_stream_<%=cid %>);
ftp_<%=cid %>.storeFile(dest_<%=cid%>, file_stream_<%=cid %>);
}
<%if(isLog4jEnabled){%>
log.debug("<%=cid%> - Uploaded file '" + newName_<%=cid%> + "' successfully.");
log.debug("<%=cid%> - Uploaded file '" + dest_<%=cid%> + "' successfully.");
<%}%>
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS", remoteExistsFiles_<%=cid%>.contains(newName_<%=cid%>));
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS", remoteExistsFiles_<%=cid%>.contains(dest_<%=cid%>));
remoteExistsFiles_<%=cid%>.add(dest_<%=cid%>);
<%
} else if ("size_differ".equals(ftpsoverwrite)) {
%>
if ((remoteExistsFiles_<%=cid%>.contains(newName_<%=cid%>))){
long ftpSize = java.util.Arrays.stream(ftp_<%=cid %>.listFiles(newName_<%=cid%>)).filter(org.apache.commons.net.ftp.FTPFile::isFile).findFirst().get().getSize();
if ((remoteExistsFiles_<%=cid%>.contains(dest_<%=cid%>))){
long ftpSize = java.util.Arrays.stream(ftp_<%=cid %>.listFiles(dest_<%=cid%>))
.filter(org.apache.commons.net.ftp.FTPFile::isFile)
.findFirst()
.get().getSize();
long localSize = file_in_localDir_<%=cid%>.length();
if (ftpSize != localSize) {
ftp_<%=cid %>.deleteFile(newName_<%=cid%>);
ftp_<%=cid %>.deleteFile(dest_<%=cid%>);
}
}
ftp_<%=cid %>.storeFile(file_in_localDir_<%=cid%>.getName(), file_stream_<%=cid %>);
ftp_<%=cid %>.storeFile(dest_<%=cid%>, file_stream_<%=cid %>);
remoteExistsFiles_<%=cid%>.add(dest_<%=cid%>);
<%
} else if("never".equals(ftpsoverwrite)){
%>
if (!(remoteExistsFiles_<%=cid%>.contains(newName_<%=cid%>))){
ftp_<%=cid %>.storeFile(file_in_localDir_<%=cid%>.getName(), file_stream_<%=cid %>);
if (!(remoteExistsFiles_<%=cid%>.contains(dest_<%=cid%>))){
ftp_<%=cid %>.storeFile(dest_<%=cid%>, file_stream_<%=cid %>);
<%if(isLog4jEnabled){%>
log.debug("<%=cid%> - Uploaded file '" + newName_<%=cid%> + "' successfully.");
log.debug("<%=cid%> - Uploaded file '" + dest_<%=cid%> + "' successfully.");
<%}%>
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS", remoteExistsFiles_<%=cid%>.contains(newName_<%=cid%>));
} else {
needRename_<%=cid %> = false;
}
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS", remoteExistsFiles_<%=cid%>.contains(dest_<%=cid%>));
remoteExistsFiles_<%=cid%>.add(dest_<%=cid%>);
}
<%}else if("always".equals(ftpsoverwrite)){%>
if ((remoteExistsFiles_<%=cid%>.contains(newName_<%=cid%>))){
ftp_<%=cid %>.deleteFile(newName_<%=cid%>);
if ((remoteExistsFiles_<%=cid%>.contains(dest_<%=cid%>))){
ftp_<%=cid %>.deleteFile(dest_<%=cid%>);
}
ftp_<%=cid %>.storeFile(file_in_localDir_<%=cid%>.getName(), file_stream_<%=cid %>);
ftp_<%=cid %>.storeFile(dest_<%=cid%>, file_stream_<%=cid %>);
<%if(isLog4jEnabled){%>
log.debug("<%=cid%> - Overwrote file '" + newName_<%=cid%> + "' successfully.");
log.debug("<%=cid%> - Overwrote file '" + dest_<%=cid%> + "' successfully.");
<%}%>
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS", remoteExistsFiles_<%=cid%>.contains(newName_<%=cid%>));
globalMap.put("<%=cid %>_CURRENT_FILE_EXISTS", remoteExistsFiles_<%=cid%>.contains(dest_<%=cid%>));
remoteExistsFiles_<%=cid%>.add(dest_<%=cid%>);
<%}%>
file_stream_<%=cid %>.close();
<%
@@ -323,20 +324,6 @@ try{
<%}%>
System.err.println("No matches found for mask '" + key<%=cid %> + "'!");
}
//do rename
if (!((map<%=cid %>.get(key<%=cid %>) == null) || map<%=cid %>.get(key<%=cid %>).isEmpty() || key<%=cid %>.equals(map<%=cid %>.get(key<%=cid %>)))){
try{
if(needRename_<%=cid%>){
ftp_<%=cid%>.rename(key<%=cid %>, map<%=cid %>.get(key<%=cid %>));
}
globalMap.put("<%=cid %>_CURRENT_STATUS", "File rename OK.");
}catch(IOException e_<%=cid%>){
globalMap.put("<%=cid %>_CURRENT_STATUS", "File rename fail.");
throw e_<%=cid%>;
}
}
needRename_<%=cid%> = false;
}
<%}%>
}catch(java.lang.Exception e_<%=cid%>){
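In both the SFTP and FTP branches the remote name is now decided before the upload (the rename target from the filemask map when present, otherwise the local file name), so the file is written directly under its final name and the old post-upload rename blocks become unnecessary. A tiny sketch of that selection, with hypothetical parameter names:

// Sketch: choose the final remote path once, then upload under it (no rename afterwards).
public class RemoteDestSketch {
    static String remoteDest(String remoteDir, String localName, String renameTarget) {
        if (remoteDir.endsWith("/")) {
            remoteDir = remoteDir.substring(0, remoteDir.length() - 1);   // avoid a double slash
        }
        String name = (renameTarget == null || renameTarget.isEmpty()) ? localName : renameTarget;
        return remoteDir + "/" + name;
    }
}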

View File

@@ -134,7 +134,7 @@
<IMPORT NAME="commons-compress-1.19" MODULE="commons-compress-1.19.jar" MVN="mvn:org.apache.commons/commons-compress/1.19" REQUIRED="true" />
<IMPORT NAME="Encrypt-Zip" MODULE="checkArchive-1.1-20190917.jar" MVN="mvn:org.talend.libraries/checkArchive-1.1-20190917/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.custom/lib/checkArchive-1.1-20190917.jar" REQUIRED="true" />
<IMPORT NAME="zip4j-1.3.3.jar" MODULE="zip4j-1.3.3.jar" MVN="mvn:net.lingala.zip4j/zip4j/1.3.3" REQUIRED="true" />
<IMPORT NAME="talendzip-1.0-20190917.jar" MODULE="talendzip-1.0-20190917.jar" MVN="mvn:org.talend.libraries/talendzip/1.0-20190917" UrlPath="platform:/plugin/org.talend.libraries.custom/lib/talendzip-1.0-20190917.jar" REQUIRED="true" />
<IMPORT NAME="talendzip-1.1-20201120.jar" MODULE="talendzip-1.1-20201120.jar" MVN="mvn:org.talend.components/talendzip/1.1-20201120" UrlPath="platform:/plugin/org.talend.libraries.custom/lib/talendzip-1.1-20201120.jar" REQUIRED="true" />
</IMPORTS>
</CODEGENERATION>

View File

@@ -144,8 +144,8 @@
<CODEGENERATION>
<IMPORTS>
<IMPORT NAME="filecopy" MODULE="filecopy.jar"
MVN="mvn:org.talend.libraries/filecopy/2.0.0"
UrlPath="platform:/plugin/org.talend.libraries.custom/lib/filecopy-2.0.0.jar"
MVN="mvn:org.talend.components/filecopy/2.0.1"
UrlPath="platform:/plugin/org.talend.libraries.custom/lib/filecopy-2.0.1.jar"
REQUIRED="true" />
</IMPORTS>
</CODEGENERATION>

View File

@@ -401,6 +401,15 @@ COMPATIBILITY="ALL"
NUM_ROW="60"
>
<DEFAULT>false</DEFAULT>
</PARAMETER>
<PARAMETER
NAME="TRUST_ALL_SERVER"
FIELD="CHECK"
REQUIRED="true"
SHOW_IF="PROTO == 'HTTPS_PROTO'"
NUM_ROW="70"
>
<DEFAULT>false</DEFAULT>
</PARAMETER>
</ADVANCED_PARAMETERS>

View File

@@ -47,6 +47,7 @@ if ("http".equals(protocol) || "https".equals(protocol)) {
boolean useProxyNTLM = "true".equals(ElementParameterParser.getValue(node, "__PROXY_NTLM__"));
boolean addHeader = "true".equals(ElementParameterParser.getValue(node, "__ADD_HEADER__"));
boolean encodeURI = "true".equals(ElementParameterParser.getValue(node, "__ENCODE_URI__"));
boolean trustAll = "true".equals(ElementParameterParser.getValue(node, "__TRUST_ALL_SERVER__"));
String scaccepted_uploadfile = ((uploadFile) ? " || status_"+cid+" == org.apache.commons.httpclient.HttpStatus.SC_ACCEPTED": "");
%>
@@ -116,7 +117,7 @@ if ("http".equals(protocol) || "https".equals(protocol)) {
e.printStackTrace();
}
}
javax.net.ssl.TrustManager[] trustManagers = new javax.net.ssl.TrustManager[]{new TrustAnyTrustManager()};
javax.net.ssl.TrustManager[] trustManagers = <%=trustAll?"new javax.net.ssl.TrustManager[]{new TrustAnyTrustManager()}":"null"%>;
String trustStoreType = java.util.Optional.ofNullable(System.getProperty("javax.net.ssl.trustStoreType")).orElse("");
String trustStoreFile = java.util.Optional.ofNullable(System.getProperty("javax.net.ssl.trustStore")).orElse("");
String trustStorePassword = java.util.Optional.ofNullable(System.getProperty("javax.net.ssl.trustStorePassword")).orElse("");
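The new TRUST_ALL_SERVER check box means the permissive TrustAnyTrustManager is only installed when explicitly requested; passing null keeps the JVM's default trust store in charge. For reference, a trust-all manager in plain JSSE looks roughly like the sketch below (illustrative only; the real TrustAnyTrustManager is generated elsewhere in this template, and disabling certificate validation should be limited to test servers):

import java.security.cert.X509Certificate;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;

public class TrustAllSketch {
    // Permissive trust manager: accepts any server certificate without validation.
    static final class TrustAnyTrustManager implements X509TrustManager {
        public void checkClientTrusted(X509Certificate[] chain, String authType) { }
        public void checkServerTrusted(X509Certificate[] chain, String authType) { }
        public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
    }

    static SSLContext context(boolean trustAll) throws Exception {
        SSLContext ctx = SSLContext.getInstance("TLS");
        // A null TrustManager[] falls back to the default trust store, which is what the
        // template now does when TRUST_ALL_SERVER is unchecked.
        ctx.init(null, trustAll ? new TrustManager[]{ new TrustAnyTrustManager() } : null, null);
        return ctx;
    }
}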

View File

@@ -44,6 +44,7 @@ PROXY_DOMAIN.NAME=Domain
USE_CACHE.NAME=Use cache to save the resource
REDIRECT.NAME=Support redirection
REDIRECT_302_AS_303.NAME=Force GET method for 302 redirection
TRUST_ALL_SERVER.NAME=Trust all servers
SAVE_COOKIE.NAME=Save cookie
READ_COOKIE.NAME=Read cookie
COOKIE_DIR.NAME= Cookie file
@@ -59,3 +60,5 @@ ADD_HEADER.NAME=Add header
HEADERS.NAME=Headers
HEADERS.ITEM.HEADER_NAME=Name
HEADERS.ITEM.HEADER_VALUE=Value
INPUT_STREAM.NAME=Input stream

View File

@@ -227,18 +227,19 @@
this.isBehindDynamic=true;
dynamic_index = valueN;
%>
<%=dynamicName%>.clearColumnValues();
<%=targetConnName %>.<%=column.getLabel() %> = <%=dynamicName%>.copyMetadata();
int fieldCount = <%=sourceValueName%>.getColumnsCountOfCurrentRow();
dynamic_column_count_<%=cid%> = <%=dynamicName%>.getColumnCount();
for (int i = 0; i < dynamic_column_count_<%=cid%> ; i++) {
if ((<%=dynamic_index%>+i) < fieldCount){
<%=dynamicName%>.addColumnValue(<%=sourceValueName%>.get(<%=dynamic_index%>+i)<%=(isTrimAll || (!trimSelects.isEmpty() && ("true").equals(trimSelects.get(valueN).get("TRIM"))))?".trim()":"" %>);
<%=targetConnName %>.<%=column.getLabel() %>.addColumnValue(<%=sourceValueName%>.get(<%=dynamic_index%>+i)<%=(isTrimAll || (!trimSelects.isEmpty() && ("true").equals(trimSelects.get(valueN).get("TRIM"))))?".trim()":"" %>);
}
else{
<%=dynamicName%>.addColumnValue("");
<%=targetConnName %>.<%=column.getLabel() %>.addColumnValue("");
}
}
<%=targetConnName %>.<%=column.getLabel() %>=<%=dynamicName%>;
<%
}else{
@@ -464,6 +465,7 @@
log.error("<%=cid%> - " +e.getMessage());
<%}%>
System.err.println(e.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
<% } %>
}
java.util.zip.ZipEntry entry_<%=cid %> = null;
@@ -479,6 +481,7 @@
log.error("<%=cid%> - " +e.getMessage());
<%}%>
System.err.println(e.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
break;
<% } %>
}
@@ -503,6 +506,7 @@
log.error("<%=cid%> - " +e.getMessage());
<%}%>
System.err.println(e.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
<% } %>
}
<%
@@ -528,6 +532,7 @@
log.error("<%=cid%> - " +e.getMessage());
<%}%>
System.err.println(e.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
<% } %>
}
<%
@@ -606,7 +611,9 @@
} else if(rejectConnName.equals(firstConnName)){%>
<%=rejectConnName%>.errorMessage = e.getMessage() + " - Line: " + tos_count_<%=node.getUniqueName() %>;
<%
}
} %>
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
<%
}
%>
}
@@ -698,8 +705,7 @@
IMetadataColumn column1 =metadata.getListColumns().get(colNo);
if("id_Dynamic".equals(column1.getTalendType())) {
%>
<%=dynamicName%>.clearColumnValues();
<%=targetConnName %>.<%=column1.getLabel() %> = <%=dynamicName%>;
<%=targetConnName %>.<%=column1.getLabel() %> = <%=dynamicName%>.copyMetadata();
<%
} else {
%>
@@ -867,17 +873,16 @@
%>
dynamic_column_count_<%=cid%> = <%=dynamicName%>.getColumnCount();
if(dynamic_column_count_<%=cid%> > 0) {
<%=dynamicName%>.clearColumnValues();
<%=targetConnName %>.<%=column.getLabel() %> = <%=dynamicName%>.copyMetadata();
}
int fieldCount = <%=sourceValueName%>.length;
for (int i = 0; i < dynamic_column_count_<%=cid%>; i++) {
if ((<%=dynamic_index%>+i) < fieldCount) {
<%=dynamicName%>.addColumnValue(<%=sourceValueName%>[<%=dynamic_index%>+i]);
<%=targetConnName %>.<%=column.getLabel() %>.addColumnValue(<%=sourceValueName%>[<%=dynamic_index%>+i]);
} else {
<%=dynamicName%>.addColumnValue("");
<%=targetConnName %>.<%=column.getLabel() %>.addColumnValue("");
}
}
<%=targetConnName %>.<%=column.getLabel() %> = <%=dynamicName%>;
<%
}
}
@@ -1076,6 +1081,7 @@
log.error("<%=cid%> - " + e.getMessage());
<%}%>
System.err.println(e.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
<% } %>
}
java.util.zip.ZipEntry entry_<%=cid %> = null;
@@ -1091,6 +1097,7 @@
log.error("<%=cid%> - " + e.getMessage());
<%}%>
System.err.println(e.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
break;
<% } %>
}
@@ -1283,6 +1290,7 @@
log.error("<%=cid%> - " +e.getMessage());
<%}%>
System.err.println(e.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
<% } %>
}//TD110 end
@@ -1386,7 +1394,9 @@
%>
<%=rejectConnName%>.errorMessage = e.getMessage() + " - Line: " + tos_count_<%=node.getUniqueName() %>;
<%
}
} %>
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
<%
}
%>
}

View File

@@ -113,7 +113,7 @@ if ((metadatas!=null)&&(metadatas.size()>0)) {
}
%>
dynamic_column_count_<%=cid%> = colsLen_<%=cid %>-<%=colLen-1%>;
for (int i = <%=dynamic_index-1%>; i < colsLen_<%=cid %>-<%=colLen-dynamic_index%>; i++) {
for (int i = <%=dynamic_index-1%> + start_column_<%= cid %>; i < colsLen_<%=cid %>-<%=colLen-dynamic_index%> + start_column_<%= cid %>; i++) {
routines.system.DynamicMetadata dynamicMetadata_<%=cid%> = new routines.system.DynamicMetadata();
<%
if(!version07){
@@ -1276,6 +1276,17 @@ if ((metadatas!=null)&&(metadatas.size()>0)) {
java.text.DecimalFormat df_<%=cid %> = new java.text.DecimalFormat("#.####################################");
char decimalChar_<%=cid %> = df_<%=cid %>.getDecimalFormatSymbols().getDecimalSeparator();
<%
for(int i=0; columnList!=null && i< columnList.size(); i++) {
if ("id_Date".equals(columnList.get(i).getTalendType())) {
// this is an input component, so avoid data rounding here; dynamic schema uses a separate algorithm
%>
excelReader_<%=cid%>.addDateFormat(<%=i%>,new java.text.SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"));
<%
}
}
%>
if(source_<%=cid %> instanceof String){
excelReader_<%=cid%>.parse((String)source_<%=cid %>,<%=encoding %>, password_<%=cid%>);
} else if(source_<%=cid %> instanceof java.io.InputStream) {
@@ -1382,7 +1393,8 @@ if ((metadatas!=null)&&(metadatas.size()>0)) {
IMetadataColumn column = listColumns.get(i);
String typeToGenerate = JavaTypesManager.getTypeToGenerate(column.getTalendType(), column.isNullable());
JavaType javaType = JavaTypesManager.getJavaTypeFromId(column.getTalendType());
String patternValue = column.getPattern() == null || column.getPattern().trim().length() == 0 ? null : column.getPattern();
// this is an input component, so avoid data rounding here; ideally the date-to-string-and-back conversion would be avoided entirely
String patternValue = "\"yyyy-MM-dd'T'HH:mm:ss.SSS'Z'\"";
if(metadata.isDynamicSchema()){
%>
columnIndex_<%=cid%> = <%=i-1%>+dynamic_column_count_<%=cid%>;
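Both hunks above pin date handling to the fixed pattern yyyy-MM-dd'T'HH:mm:ss.SSS'Z' so that Excel date cells survive the text round trip without rounding (excelReader_ is a Talend-internal helper; only the pattern itself is illustrated here):

import java.text.SimpleDateFormat;
import java.util.Date;

public class DatePatternSketch {
    public static void main(String[] args) throws Exception {
        // Note: the quoted 'Z' is a literal character in this pattern, not a time-zone designator.
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
        String asText = fmt.format(new Date());   // e.g. 2020-12-16T16:24:52.000Z
        Date back = fmt.parse(asText);            // millisecond precision survives the round trip
        System.out.println(asText + " -> " + back.getTime());
    }
}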

View File

@@ -177,20 +177,24 @@
<CODEGENERATION>
<IMPORTS>
<IMPORT NAME="Java_Excel" MODULE="jxl.jar" MVN="mvn:org.talend.libraries/jxl/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.jexcel/lib/jxl.jar" REQUIRED_IF="(VERSION_2007 == 'false')" />
<IMPORT NAME="simpleexcel" MODULE="simpleexcel-2.5-20201119.jar" MVN="mvn:org.talend.components/simpleexcel/2.5-20201119" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/simpleexcel-2.5-20201119.jar" REQUIRED_IF="(VERSION_2007 == 'true') AND GENERATION_MODE == 'EVENT_MODE')" />
<IMPORT NAME="Java_DOM4J2" MODULE="dom4j-2.1.3.jar" MVN="mvn:org.dom4j/dom4j/2.1.3" REQUIRED_IF="(VERSION_2007 == 'true')" BundleID="" />
<IMPORT NAME="geronimo-stax-api" MODULE="geronimo-stax-api_1.0_spec-1.0.1.jar" MVN="mvn:org.talend.libraries/geronimo-stax-api_1.0_spec-1.0.1/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.apache.axis2/lib/geronimo-stax-api_1.0_spec-1.0.1.jar" REQUIRED_IF="(VERSION_2007 == 'true')" BundleID="" />
<IMPORT NAME="log4j" MODULE="log4j-1.2.17.jar" MVN="mvn:log4j/log4j/1.2.17" UrlPath="platform:/plugin/org.talend.libraries.apache/lib/log4j-1.2.17.jar" REQUIRED_IF="(VERSION_2007 == 'true')" BundleID="" />
<IMPORT NAME="poi" MODULE="poi-4.1.0-20190523141255_modified_talend.jar" MVN="mvn:org.apache.poi/poi/4.1.0-20190523141255_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-4.1.0-20190523141255_modified_talend.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="poi-ooxml" MODULE="poi-ooxml-4.1.0-20190523141255_modified_talend.jar" MVN="mvn:org.apache.poi/poi-ooxml/4.1.0-20190523141255_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-ooxml-4.1.0-20190523141255_modified_talend.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="poi-ooxml-schemas" MODULE="poi-ooxml-schemas-4.1.0-20190523141255_modified_talend.jar" MVN="mvn:org.apache.poi/poi-ooxml-schemas/4.1.0-20190523141255_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-ooxml-schemas-4.1.0-20190523141255_modified_talend.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="poi-scratchpad" MODULE="poi-scratchpad-4.1.0-20190523141255_modified_talend.jar" MVN="mvn:org.apache.poi/poi-scratchpad/4.1.0-20190523141255_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-scratchpad-4.1.0-20190523141255_modified_talend.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="xmlbeans" MODULE="xmlbeans-3.1.0.jar" MVN="mvn:org.apache.xmlbeans/xmlbeans/3.1.0" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="simpleexcel" MODULE="simpleexcel-2.2-20190722.jar" MVN="mvn:org.talend.libraries/simpleexcel-2.2-20190722/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/simpleexcel-2.2-20190722.jar" REQUIRED_IF="(VERSION_2007 == 'true') AND GENERATION_MODE == 'EVENT_MODE')" />
<IMPORT NAME="commons-collections4" MODULE="commons-collections4-4.1.jar" MVN="mvn:org.talend.libraries/commons-collections4-4.1/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/commons-collections4-4.1.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="commons-compress" MODULE="commons-compress-1.19.jar" MVN="mvn:org.apache.commons/commons-compress/1.19" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="poi" MODULE="poi-4.1.2-20200903124306_modified_talend.jar" MVN="mvn:org.apache.poi/poi/4.1.2-20200903124306_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-4.1.2-20200903124306_modified_talend.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="SparseBitSet" MODULE="SparseBitSet-1.2.jar" MVN="mvn:com.zaxxer/SparseBitSet/1.2" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="commons-codec-1.14.jar" MODULE="commons-codec-1.14.jar" MVN="mvn:commons-codec/commons-codec/1.14" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="commons-collections4" MODULE="commons-collections4-4.4.jar" MVN="mvn:org.apache.commons/commons-collections4/4.4" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="commons-math3" MODULE="commons-math3-3.6.1.jar" MVN="mvn:org.apache.commons/commons-math3/3.6.1" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="poi-ooxml" MODULE="poi-ooxml-4.1.2-20200903124306_modified_talend.jar" MVN="mvn:org.apache.poi/poi-ooxml/4.1.2-20200903124306_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-ooxml-4.1.2-20200903124306_modified_talend.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="curvesapi" MODULE="curvesapi-1.06.jar" MVN="mvn:com.github.virtuald/curvesapi/1.06" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="commons-compress" MODULE="commons-compress-1.19.jar" MVN="mvn:org.apache.commons/commons-compress/1.19" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="poi-ooxml-schemas" MODULE="poi-ooxml-schemas-4.1.2-20200903124306_modified_talend.jar" MVN="mvn:org.apache.poi/poi-ooxml-schemas/4.1.2-20200903124306_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-ooxml-schemas-4.1.2-20200903124306_modified_talend.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="xmlbeans" MODULE="xmlbeans-3.1.0.jar" MVN="mvn:org.apache.xmlbeans/xmlbeans/3.1.0" REQUIRED_IF="(VERSION_2007 == 'true')" />
<IMPORT NAME="poi-scratchpad" MODULE="poi-scratchpad-4.1.2-20200903124306_modified_talend.jar" MVN="mvn:org.apache.poi/poi-scratchpad/4.1.2-20200903124306_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-scratchpad-4.1.2-20200903124306_modified_talend.jar" REQUIRED_IF="(VERSION_2007 == 'true')" />
</IMPORTS>
</CODEGENERATION>

View File

@@ -46,6 +46,7 @@ Object filenameOrStream_<%=cid %> = null;
<%if(isLog4jEnabled){%>
log.error("<%=cid%> - " + e_<%=cid%>.getMessage());
<%}%>
globalMap.put("<%=cid %>_ERROR_MESSAGE", e_<%=cid%>.getMessage());
System.err.println(e_<%=cid%>.getMessage());
<%
}
@@ -85,6 +86,7 @@ if(dieOnError){
if(isLog4jEnabled){%>
log.error("<%=cid%> - " + e_<%=cid%>.getMessage());
<%}%>
globalMap.put("<%=cid %>_ERROR_MESSAGE", e_<%=cid%>.getMessage());
System.err.println(e_<%=cid%>.getMessage());
<%
}
@@ -275,7 +277,9 @@ for(Object row_<%=cid%> : resultset_<%=cid%>) {
%>
<%=rejectConnName%>.errorMessage = e_<%=cid%>.getMessage() + " - Line: " + tos_count_<%=node.getUniqueName() %>;
<%
}
} %>
globalMap.put("<%=cid %>_ERROR_MESSAGE", e_<%=cid%>.getMessage());
<%
}
%>
}
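The three hunks in this file all add the same thing to the generated catch blocks: the component's error message is now published to globalMap under "<cid>_ERROR_MESSAGE" alongside the existing log/stderr output, and for reject flows it is also copied onto the reject row. The same change is repeated in the next two files. As a rough, standalone illustration of what the generated Java ends up doing (globalMap as a plain HashMap, the RejectRow class, the process method, and the component id "tFileInputExcel_1" are stand-ins for this sketch, not the generated identifiers):

import java.util.HashMap;
import java.util.Map;

public class ErrorMessageSketch {

    // Stand-in for the job-level globalMap the generated code writes into.
    static final Map<String, Object> globalMap = new HashMap<>();

    // Hypothetical reject-row holder, mirroring <rejectConnName>.errorMessage.
    static class RejectRow {
        String errorMessage;
    }

    static void process(String cid, RejectRow rejectRow, int currentLine) {
        try {
            throw new IllegalStateException("simulated component failure");
        } catch (Exception e) {
            // Log and print, as the template already did ...
            System.err.println(e.getMessage());
            // ... and, with this change, also expose the message to downstream components.
            globalMap.put(cid + "_ERROR_MESSAGE", e.getMessage());
            // For reject flows, the message is carried on the reject row as well.
            if (rejectRow != null) {
                rejectRow.errorMessage = e.getMessage() + " - Line: " + currentLine;
            }
        }
    }

    public static void main(String[] args) {
        RejectRow reject = new RejectRow();
        process("tFileInputExcel_1", reject, 42);
        System.out.println(globalMap.get("tFileInputExcel_1_ERROR_MESSAGE"));
        System.out.println(reject.errorMessage);
    }
}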

View File

@@ -166,6 +166,7 @@ if (jsEngine_<%=cid%> == null) {
log.error("<%=cid%> - " + e_<%=cid%>.getMessage());
<%}%>
System.err.println(e_<%=cid%>.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e_<%=cid%>.getMessage());
<%
}
%>
@@ -203,6 +204,7 @@ if (jsEngine_<%=cid%> == null) {
log.error("<%=cid%> - " + e_<%=cid%>.getMessage());
<%}%>
System.err.println(e_<%=cid%>.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e_<%=cid%>.getMessage());
<%}%>
} finally {
if(fr_<%=cid%> != null ) {
@@ -316,7 +318,9 @@ if ((metadatas!=null)&&(metadatas.size()>0)) {
%>
<%=rejectConnName%>.errorMessage = e.getMessage() + " - Line: " + tos_count_<%=node.getUniqueName() %>;
<%
}
} %>
globalMap.put("<%=cid %>_ERROR_MESSAGE", e.getMessage());
<%
}
%>
}

View File

@@ -210,6 +210,7 @@ boolean checkDate = (checkDateStr!=null&&!("").equals(checkDateStr))?("true").eq
log.error("<%=cid%> - " + e_<%=cid%>.getMessage());
<%}%>
System.err.println(e_<%=cid%>.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e_<%=cid%>.getMessage());
<%
}
%>
@@ -275,6 +276,7 @@ boolean checkDate = (checkDateStr!=null&&!("").equals(checkDateStr))?("true").eq
log.error("<%=cid%> - " + e_<%=cid%>.getMessage());
<%}%>
System.err.println( e_<%=cid%>.getMessage());
globalMap.put("<%=cid %>_ERROR_MESSAGE", e_<%=cid%>.getMessage());
isValidFile_<%=cid %> = false;
<%
}
@@ -550,7 +552,9 @@ if ((metadatas!=null)&&(metadatas.size()>0)) {
%>
<%=rejectConnName%>.errorMessage = e_<%=cid%>.getMessage() + " - Line: " + tos_count_<%=node.getUniqueName() %>;
<%
}
} %>
globalMap.put("<%=cid %>_ERROR_MESSAGE", e_<%=cid%>.getMessage());
<%
}
%>
}

View File

@@ -222,19 +222,18 @@
}
path = path + attachFileName;
<% if(isLog4jEnabled){ %>
log.info("<%= cid %> - Extracted attachment: '" + attachFileName + "'.");
log.info("<%= cid %> - Extracting attachment: '" + attachFileName + "'.");
<% } %>
java.io.File attachFile = new java.io.File(path);
java.io.BufferedOutputStream out = new java.io.BufferedOutputStream(new java.io.FileOutputStream(attachFile));
java.io.BufferedInputStream in = new java.io.BufferedInputStream(mpart.getInputStream());
int buffer = 0;
while ((buffer = in.read()) != -1) {
out.write(buffer);
try (java.io.BufferedOutputStream out = new java.io.BufferedOutputStream(new java.io.FileOutputStream(attachFile));
java.io.BufferedInputStream in = new java.io.BufferedInputStream(mpart.getInputStream())){
byte[] buffer = new byte[8192];
int bytesRead = 0;
while ((bytesRead = in.read(buffer)) > 0) {
out.write(buffer, 0, bytesRead);
}
out.flush();
}
out.close();
in.close();
}
}
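This hunk swaps the old byte-at-a-time copy and manual close() calls for a try-with-resources block with an 8 KiB buffer, so the attachment streams are closed even when the copy fails. A self-contained sketch of that copy pattern outside the JET template (AttachmentCopySketch, copyToFile, and the main method's usage are hypothetical names for illustration, not part of the component):

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

public final class AttachmentCopySketch {

    /** Copies the given stream to a file through an 8 KiB buffer; both streams close automatically. */
    static void copyToFile(InputStream source, File target) throws IOException {
        try (BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(target));
             BufferedInputStream in = new BufferedInputStream(source)) {
            byte[] buffer = new byte[8192];
            int bytesRead;
            while ((bytesRead = in.read(buffer)) > 0) {
                out.write(buffer, 0, bytesRead);
            }
            out.flush();
        } // try-with-resources closes out and in even if an exception is thrown
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical usage: copy this class file's own bytes into a temp file.
        File target = File.createTempFile("attachment", ".bin");
        try (InputStream in = AttachmentCopySketch.class.getResourceAsStream("AttachmentCopySketch.class")) {
            if (in != null) {
                copyToFile(in, target);
            }
        }
        System.out.println("Wrote " + target.length() + " bytes to " + target);
    }
}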

View File

@@ -87,9 +87,16 @@
<IMPORTS>
<IMPORT NAME="JavaMail" MODULE="mail.jar" MVN="mvn:org.talend.libraries/mail/6.0.0" REQUIRED_IF="(MAIL_TYPE == 'MIME')" />
<IMPORT NAME="JAF" MODULE="activation.jar" MVN="mvn:org.talend.libraries/activation/6.0.0" REQUIRED_IF="(MAIL_TYPE == 'MIME')" />
<IMPORT NAME="poi" MODULE="poi-4.1.0-20190523141255_modified_talend.jar" MVN="mvn:org.apache.poi/poi/4.1.0-20190523141255_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-4.1.0-20190523141255_modified_talend.jar" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="poi-scratchpad" MODULE="poi-scratchpad-4.1.0-20190523141255_modified_talend.jar" MVN="mvn:org.apache.poi/poi-scratchpad/4.1.0-20190523141255_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-scratchpad-4.1.0-20190523141255_modified_talend.jar" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="talendMsgMailUtil" MODULE="talendMsgMailUtil-1.1-20191012.jar" MVN="mvn:org.talend.libraries/talendMsgMailUtil-1.1-20191012/6.0.0" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/talendMsgMailUtil-1.1-20191012.jar" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="poi" MODULE="poi-4.1.2-20200903124306_modified_talend.jar" MVN="mvn:org.apache.poi/poi/4.1.2-20200903124306_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-4.1.2-20200903124306_modified_talend.jar" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="SparseBitSet" MODULE="SparseBitSet-1.2.jar" MVN="mvn:com.zaxxer/SparseBitSet/1.2" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="commons-codec-1.14.jar" MODULE="commons-codec-1.14.jar" MVN="mvn:commons-codec/commons-codec/1.14" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="commons-collections4" MODULE="commons-collections4-4.4.jar" MVN="mvn:org.apache.commons/commons-collections4/4.4" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="commons-math3" MODULE="commons-math3-3.6.1.jar" MVN="mvn:org.apache.commons/commons-math3/3.6.1" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="poi-scratchpad" MODULE="poi-scratchpad-4.1.2-20200903124306_modified_talend.jar" MVN="mvn:org.apache.poi/poi-scratchpad/4.1.2-20200903124306_modified_talend" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/poi-scratchpad-4.1.2-20200903124306_modified_talend.jar" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="talendMsgMailUtil" MODULE="talendMsgMailUtil-1.2-20200923.jar" MVN="mvn:org.talend.components/talendMsgMailUtil/1.2-20200923" UrlPath="platform:/plugin/org.talend.libraries.excel/lib/talendMsgMailUtil-1.2-20200923.jar" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
<IMPORT NAME="log4j-1.2.17.jar" MODULE="log4j-1.2.17.jar" MVN="mvn:log4j/log4j/1.2.17" UrlPath="platform:/plugin/org.talend.libraries.apache/lib/log4j-1.2.17.jar" REQUIRED_IF="(MAIL_TYPE == 'MSG')" />
</IMPORTS>
</CODEGENERATION>

View File

@@ -278,7 +278,7 @@
if(useExistingDynamic){
%>
routines.system.Dynamic dynamic_<%=cid %> = (routines.system.Dynamic)globalMap.get("<%=dyn%>");
dynamic_<%=cid %>.clearColumnValues();
<%=firstConnName %>.<%=column.getLabel() %> = dynamic_<%=cid %>.copyMetadata();
int maxColumnCount_<%=cid %> = dynamic_<%=cid %>.getColumnCount();
int substringBegin<%=cid %> = begins_<%=cid %>[<%=valueN%>]; int substringEnd<%=cid %> =0;
for (int i<%=cid%>=0;i<%=cid%><maxColumnCount_<%=cid %>;i<%=cid%>++) {
@@ -331,10 +331,9 @@
<%
}
%>
dynamic_<%=cid %>.addColumnValue(currentColumnValue_<%=cid%>);
<%=firstConnName %>.<%=column.getLabel() %>.addColumnValue(currentColumnValue_<%=cid%>);
substringBegin<%=cid %> = substringEnd<%=cid %>;
}
<%=firstConnName %>.<%=column.getLabel() %> = dynamic_<%=cid %>;
<%
}
} else {
@@ -387,7 +386,7 @@
for(int i=0;i < columnList.size();i++){
if(i % 100 == 0){
%>
int parseValue_<%=i%>(String row_<%=cid%>, int substringBegin<%=cid%>, int substringEnd<%=cid%>, int rowLen_<%=cid%>, String[] columnValue<%=cid%> <%if("BYTES".equals(patternUnits)){%>, Arrays_<%=cid%> arrays_<%=cid%>, byte[][] byteArray_<%=cid%><%}%><%if(useExistingDynamic){%>, routines.system.Dynamic dynamic_<%=cid%><%}%>)throws java.lang.Exception{
int parseValue_<%=i%>(String row_<%=cid%>, int substringBegin<%=cid%>, int substringEnd<%=cid%>, int rowLen_<%=cid%>, String[] columnValue<%=cid%> <%if("BYTES".equals(patternUnits)){%>, Arrays_<%=cid%> arrays_<%=cid%>, byte[][] byteArray_<%=cid%><%}%><%if(useExistingDynamic){%>, routines.system.Dynamic dynamicLocal_<%=cid%><%}%>)throws java.lang.Exception{
<%
}
IMetadataColumn column = columnList.get(i);
@@ -396,10 +395,10 @@
String paddingChar = formats.get(i).get("PADDING_CHAR");
String align = formats.get(i).get("ALIGN");
%>
dynamic_<%=cid %>.clearColumnValues();
int maxColumnCount_<%=cid %> = dynamic_<%=cid %>.getColumnCount();
int maxColumnCount_<%=cid %> = dynamicLocal_<%=cid %>.getColumnCount();
for (int i<%=cid%>=0;i<%=cid%><maxColumnCount_<%=cid %>;i<%=cid%>++) {
routines.system.DynamicMetadata dynamicMetadataColumn_<%=cid%> = dynamic_<%=cid %>.getColumnMetadata(i<%=cid%>);
routines.system.DynamicMetadata dynamicMetadataColumn_<%=cid%> = dynamicLocal_<%=cid %>.getColumnMetadata(i<%=cid%>);
int currentFieldLength_<%=cid%> = dynamicMetadataColumn_<%=cid%>.getLength();
<%
if(!"RARE_SYMBOLS".equals(patternUnits)){
@@ -448,7 +447,7 @@
<%
}
%>
dynamic_<%=cid %>.addColumnValue(currentColumnValue_<%=cid%>);
dynamicLocal_<%=cid %>.addColumnValue(currentColumnValue_<%=cid%>);
substringBegin<%=cid %> = substringEnd<%=cid %>;
}
<%
@@ -997,8 +996,13 @@
IMetadataColumn column = columnList.get(i);
if(i % 100 == 0){
%>
substringBegin<%=cid%> = positionalUtil_<%=cid%>.parseValue_<%=i%>(row_<%=cid%>, substringBegin<%=cid%>, substringEnd<%=cid%>, rowLen_<%=cid%>, columnValue<%=cid%> <%if("BYTES".equals(patternUnits)){%>, arrays_<%=cid%>, byteArray_<%=cid%><%}%><%if(useExistingDynamic){%>, dynamic_<%=cid%><%}%>);
substringEnd<%=cid%> = substringBegin<%=cid%>;
<% if (useExistingDynamic) { %>
<%=firstConnName %>.<%=metadata.getDynamicColumn().getLabel()%> = dynamic_<%=cid %>.copyMetadata();
substringBegin<%=cid%> = positionalUtil_<%=cid%>.parseValue_<%=i%>(row_<%=cid%>, substringBegin<%=cid%>, substringEnd<%=cid%>, rowLen_<%=cid%>, columnValue<%=cid%> <%if("BYTES".equals(patternUnits)){%>, arrays_<%=cid%>, byteArray_<%=cid%><%}%>, <%=firstConnName %>.<%=metadata.getDynamicColumn().getLabel()%>);
<% } else { %>
substringBegin<%=cid%> = positionalUtil_<%=cid%>.parseValue_<%=i%>(row_<%=cid%>, substringBegin<%=cid%>, substringEnd<%=cid%>, rowLen_<%=cid%>, columnValue<%=cid%> <%if("BYTES".equals(patternUnits)){%>, arrays_<%=cid%>, byteArray_<%=cid%><%}%>);
<% } %>
substringEnd<%=cid%> = substringBegin<%=cid%>;
<%
}
}
@@ -1010,12 +1014,6 @@
<%
}
}
if(useExistingDynamic && !advanced){
%>
<%=firstConnName %>.<%=metadata.getDynamicColumn().getLabel()%>=dynamic_<%=cid %>;
<%
}
}
log4jFileUtil.debugRetriveData(node,false);
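The hunks above rework the fixed-width parsing so the parseValue_<n> helpers no longer mutate the shared dynamic_<cid> instance; the caller now takes a fresh per-row object via copyMetadata() and passes it in as dynamicLocal_<cid>. A rough sketch of that idea using a simplified stand-in class (DynamicStandIn below is hypothetical and only mimics the handful of calls visible in the diff; it is not routines.system.Dynamic):

import java.util.ArrayList;
import java.util.List;

public class DynamicParseSketch {

    // Hypothetical, simplified stand-in for routines.system.Dynamic.
    static class DynamicStandIn {
        final List<String> columnNames = new ArrayList<>();
        final List<Object> columnValues = new ArrayList<>();

        // Returns a value-free copy carrying only the column metadata, like copyMetadata().
        DynamicStandIn copyMetadata() {
            DynamicStandIn copy = new DynamicStandIn();
            copy.columnNames.addAll(columnNames);
            return copy;
        }

        void addColumnValue(Object value) {
            columnValues.add(value);
        }

        int getColumnCount() {
            return columnNames.size();
        }
    }

    // Before: the helper wrote into a shared, job-level dynamic instance.
    // After (sketched here): the caller hands in the per-row instance to fill.
    static int parseValue(String row, int begin, DynamicStandIn dynamicLocal, int fieldLength) {
        int end = Math.min(begin + fieldLength, row.length());
        dynamicLocal.addColumnValue(row.substring(begin, end).trim());
        return end;
    }

    public static void main(String[] args) {
        DynamicStandIn template = new DynamicStandIn();
        template.columnNames.add("col0");
        template.columnNames.add("col1");

        String row = "alpha     beta      ";
        DynamicStandIn perRow = template.copyMetadata(); // fresh values for each row
        int pos = 0;
        for (int i = 0; i < perRow.getColumnCount(); i++) {
            pos = parseValue(row, pos, perRow, 10);
        }
        System.out.println(perRow.columnValues); // [alpha, beta]
    }
}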

View File

@@ -821,8 +821,13 @@ if(("false").equals(ElementParameterParser.getValue(node,"__CSV_OPTION__"))) {
fileToDelete_<%=cid%>.delete();
}
<%}%>
out<%=cid%> = new routines.system.BufferedOutput(new java.io.OutputStreamWriter(
new java.io.FileOutputStream(fileName_<%=cid%>, <%=isAppend%>), <%=encoding%>));
if (resourceMap.get("outWriter_for_" + <%=fileName %>) == null) {
out<%=cid%> = new routines.system.BufferedOutput(new java.io.OutputStreamWriter(
new java.io.FileOutputStream(fileName_<%=cid%>, <%=isAppend%>), <%=encoding%>));
resourceMap.put("outWriter_for_" + <%=fileName %>, out<%=cid%>);
} else {
out<%=cid%> = (java.io.Writer) resourceMap.get("outWriter_for_" + <%=fileName %>);
}
java.io.StringWriter strWriter<%=cid%> = new java.io.StringWriter();
CsvWriter<%=cid%> = new com.talend.csv.CSVWriter(strWriter<%=cid%>);
CsvWriter<%=cid%>.setSeparator(csvSettings_<%=cid%>.getFieldDelim());
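This last hunk (TDI-45135) has the generated writer setup check resourceMap for an entry keyed by "outWriter_for_" + fileName before opening a new stream, so two delimited outputs targeting the same file share one writer instead of overwriting each other. A minimal sketch of that cache-by-filename idea (resourceMap here is a plain HashMap standing in for the job's resource map, and openSharedWriter is a hypothetical helper, not the generated code):

import java.io.BufferedWriter;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class SharedWriterSketch {

    // Stand-in for the job's resourceMap that the generated code consults.
    static final Map<String, Object> resourceMap = new HashMap<>();

    /** Returns the writer already registered for this file, or opens and registers a new one. */
    static Writer openSharedWriter(String fileName, boolean append) throws IOException {
        String key = "outWriter_for_" + fileName;
        Writer existing = (Writer) resourceMap.get(key);
        if (existing != null) {
            return existing; // second component reuses the first component's stream
        }
        Writer writer = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream(fileName, append), StandardCharsets.UTF_8));
        resourceMap.put(key, writer);
        return writer;
    }

    public static void main(String[] args) throws IOException {
        String file = System.getProperty("java.io.tmpdir") + "/shared-output.csv";
        Writer first = openSharedWriter(file, false);
        Writer second = openSharedWriter(file, false); // same instance, not a new FileOutputStream
        System.out.println("same writer reused: " + (first == second));
        first.write("id;label\n");
        second.write("1;hello\n"); // lands in the same stream, after the header
        first.close();
    }
}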

Some files were not shown because too many files have changed in this diff.