Verifying a third party JDBC driver for Hybrid Data Pipeline compatibility

The Hybrid Data Pipeline verification tool should be used to verify whether a third party JDBC driver is compatible with Hybrid Data Pipeline. The following files and scripts are used in the verification process. These files are located in the tools folder of either a Hybrid Data Pipeline server installation or an On-Premises Connector installation.
*jdbcVerificationTool.jar - The JDBC driver verification tool.
*config.properties - This file must be updated with driver-specific information before running the verification tool. The following settings must be specified in the config.properties file (a completed example follows this list).
# Configure the database URL
DBURL= database_url
# Configure the driver class name
CLASSNAME=classname
# Configure the user name
USER=username
# Configure the password
PASSWORD=password
# Configure the schema name
SCHEMA=schemaname
# Configure the comma separated table names
TABLES=tablename1, tablename2
# Configure the top term supported by the database. Supported top term keywords are {LIMIT, ROWNUM, FIRST, FETCHFIRST, TOP}
TOPTERM=topterm
# Configure the location of the third party driver files.
LOCATION=\default\location
*jdbcVerificationTool.sh - This shell script reads the config.properties file and runs the verification tool. Use this script to run the tool on Linux.
*jdbcVerificationTool.bat - This batch file reads the config.properties file and runs the verification tool. Use this file to run the tool on Windows.
*datastore_profile_template.xml - This is the template profile file for the third party JDBC driver.
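For example, a completed config.properties for a hypothetical PostgreSQL deployment might look like the following. The URL, class name, credentials, schema, table names, and driver location shown here are placeholders; replace them with values appropriate for your driver and database.
DBURL=jdbc:postgresql://dbhost:5432/salesdb
CLASSNAME=org.postgresql.Driver
USER=dbadmin
PASSWORD=secret
SCHEMA=public
TABLES=CUSTOMERS, ORDERS
TOPTERM=LIMIT
LOCATION=/opt/thirdparty/postgresql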
1. Navigate to the tools folder.
a. For the Hybrid Data Pipeline service: <install_dir>/ddcloud/tools
b. For an On-Premises Connector service: <opc_install_dir>\tools
2. Update the config.properties file with driver information.
3. Run the JDBC verification tool. Use the jdbcVerificationTool.sh file for Linux or the jdbcVerificationTool.bat file for Windows (see the example after these steps).
4. Review the reports generated by the verification tool. The reports are written to the Reports folder.
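For example, on a Linux installation of the Hybrid Data Pipeline server, steps 1 and 3 might look like the following. The installation directory depends on your environment, and the script may need execute permission.
cd <install_dir>/ddcloud/tools
./jdbcVerificationTool.sh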
The tool generates the following files:
*A summary report summarizes the findings of the verification tool. It reports the percentage of test cases that succeeded and provides an overview of warnings and exceptions, indicating whether any of them are critical. Here is an example of a summary report:
---------------------------------------------------------------
Summary of Results:
---------------------------------------------------------------
Total:20
Succeeded:19
Failed:1
Pass Percentage:95%


---------------------------------------------------------------
Conclusion:
---------------------------------------------------------------
This driver is compatible and can be used in HDP. However some of the functionality will be affected due to the following failures.
Found un-supported data types, respective columns will not be exposed via OData.
Found columns with longer size than the supported column size in table 'GTABLE', the list of columns that will not be exposed via OData: LVCOL,LVARBINCOL.
*A verbose report provides information on the full range of test cases, including metadata and SQL queries. This report also details all of the errors, warnings, and exceptions. Here is an example of a verbose report:
---------------------------------------------------------------
JDBC Metadata API Verification
---------------------------------------------------------------

API: getMaxCatalogNameLength
[SUCCESS]Succeeded with Value:32

API: getTypeInfo
[SUCCESS]Succeeded with Table:null
[SUCCESS][BIT][UNSIGNED_ATTRIBUTE]Received:false
[SUCCESS][BIT][TYPE_NAME]Received:BIT


API: getColumns TABLE:GTABLE
[SUCCESS]Succeeded with Table:GTABLE
[SUCCESS][CHARCOL][COLUMN_DEF]Received:null
[SUCCESS][CHARCOL][COLUMN_NAME]Received:CHARCOL
[SUCCESS][LVCOL][DATA_TYPE]Received:-1
[FAILURE][LVCOL][COLUMN_SIZE]Failed with exception:Actual size is 2147483647 and supported size is 32768
[SUCCESS][BITCOL][COLUMN_DEF]Received:null
[SUCCESS][BITCOL][DATA_TYPE]Received:-7
[FAILURE][LVARBINCOL][COLUMN_SIZE]Failed with exception:Actual size is 2147483647 and supported size is 32768
[SUCCESS][DATECOL][COLUMN_DEF]Received:null

...

---------------------------------------------------------------
SQL Query Processing
---------------------------------------------------------------

ODATA QUERY: SELECT
[SUCCESS]Succeeded with Query:SELECT T0.`CHARCOL`, T0.`VCHARCOL`, T0.`DECIMALCOL`, T0.`NUMERICCOL`, T0.`SMALLCOL` FROM `GTABLE` T0

ODATA QUERY: COUNT
[SUCCESS]Succeeded with Query:SELECT count(*) FROM `GTABLE` T0

...


*A <driverclass>.datastore-profile.xml file is generated. This file can be used to specify any changes that need to be made to the third party JDBC driver. If you create this file from scratch, it must be named using the format <driverclass>.datastore-profile.xml. If you make any changes, place the updated file in the same location as the third party JDBC driver, as shown in the example below.
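For example, if a hypothetical driver class com.example.jdbc.ExampleDriver is packaged in exampledriver.jar under /opt/thirdparty/exampledb (the same directory that would be specified as LOCATION in config.properties), the profile file would be placed alongside the driver jar:
/opt/thirdparty/exampledb/exampledriver.jar
/opt/thirdparty/exampledb/com.example.jdbc.ExampleDriver.datastore-profile.xml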