

This article describes the main changes made to the Hadoop Templates.

The Template download section can be found at this page, and the Hadoop section at this page.

Note:

Stambia DI is a flexible and agile solution. It can be quickly adapted to your needs.

If you have any questions, feature requests, or issues, do not hesitate to contact us.

 

templates.hadoop.2019-02-08

INTEGRATION Hive and INTEGRATION Impala
Recycling of previous rejects fixed

When the option to recycle the rejects of a previous execution is used, an extra step is executed to add those previous rejects to the integration flow.

Possible duplicates retrieved during this step are now filtered out using the DISTINCT keyword.
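The effect of the DISTINCT filter can be illustrated with a small standalone sketch. This uses sqlite3 purely for illustration (the actual Templates generate Hive / Impala SQL), and the table and column names below are hypothetical:

```python
import sqlite3

# In-memory table standing in for the reject table of a previous execution.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE previous_rejects (id INTEGER, err_message TEXT)")

# The same reject row may have been stored more than once.
con.executemany(
    "INSERT INTO previous_rejects VALUES (?, ?)",
    [(1, "not null violated"), (1, "not null violated"), (2, "bad type")],
)

# Without DISTINCT, the duplicated reject would be recycled twice.
all_rows = con.execute("SELECT id, err_message FROM previous_rejects").fetchall()

# With DISTINCT, each previous reject is recycled only once.
unique_rows = con.execute(
    "SELECT DISTINCT id, err_message FROM previous_rejects"
).fetchall()

print(len(all_rows))     # 3
print(len(unique_rows))  # 2
```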

 

templates.hadoop.2019-01-17

TOOL HBase Operation.proc

The HBase Operation tool now supports performing snapshot operations on HBase.

Three new operations have been added to take, restore, or delete a snapshot on HBase.
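For reference, these three operations correspond to the standard HBase snapshot commands. A rough equivalent in the HBase shell (the table and snapshot names below are hypothetical) would be:

```
# Take a snapshot of table 'my_table'
snapshot 'my_table', 'my_table_snap'

# Restore a snapshot (the table must be disabled first)
disable 'my_table'
restore_snapshot 'my_table_snap'
enable 'my_table'

# Delete a snapshot
delete_snapshot 'my_table_snap'
```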

 

templates.hadoop.2018-09-18

hive.tech and impala.tech

Previous versions of the Stambia Hive and Impala technologies included a mechanism that automatically added some of the required Kerberos properties to the JDBC URLs, such as the "principal" property, which was retrieved automatically from the Kerberos Metadata.

This caused issues, because the client Kerberos keytab and principal may differ from the Hive / Impala service principal that must be defined in the JDBC URL.

  • To avoid any misunderstanding or issue with the automatic mechanism, we decided to remove it and let the user define all the JDBC URL properties.
  • This does not change how Kerberos is used with Hive and Impala in Stambia; it only means that the JDBC URL must now be defined entirely by the user.

We also updated the Hive and Impala getting started articles accordingly, to indicate what should be defined in the JDBC URL when using Kerberos.
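As an illustration, a Kerberos-secured Hive JDBC URL now typically carries the service principal explicitly. The host, port, database, and realm below are hypothetical placeholders; refer to the getting started articles for the exact parameters expected by your driver:

```
jdbc:hive2://hive-host.example.com:10000/default;principal=hive/hive-host.example.com@EXAMPLE.COM
```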

 

Notes

If you were using Kerberos with a previous version of the Hadoop Templates, make sure to update the JDBC URLs of your Hive and Impala Metadata.

Examples of the necessary parameters are listed in the getting started articles.

For reference, the parameters that were previously added automatically were the following: AuthMech=1;principal=<principal name>

Make sure the JDBC URL corresponds to the examples listed in the articles.
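For instance, with the Cloudera Impala JDBC driver, a Kerberos-secured URL usually spells out the authentication properties along these lines. The host, port, realm, and service name below are hypothetical placeholders; check the getting started article for your driver version:

```
jdbc:impala://impala-host.example.com:21050;AuthMech=1;KrbRealm=EXAMPLE.COM;KrbHostFQDN=impala-host.example.com;KrbServiceName=impala
```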

 
