Integrating the third party JDBC driver into Hybrid Data Pipeline
After you confirm that the third party JDBC driver is compatible, you can integrate it with the Hybrid Data Pipeline environment by copying it to the drivers folder. The location of the drivers folder depends on the type of Hybrid Data Pipeline installation.
In a standalone installation, the driver must be copied to one of the following locations:
<install_dir>/ddcloud/keystore/drivers if the default key location was selected during installation of the server
<user_specified_location>/shared/drivers if a non-default key location was specified during installation of the server
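The copy step above can be sketched as follows. This is a minimal illustration, not part of the product: the helper name and the example paths are assumptions, and you must substitute the drivers folder that matches your installation (default key location, user-specified key location, or load balancer key location).

```python
import shutil
from pathlib import Path

def copy_driver(jar_path: str, drivers_dir: str) -> Path:
    """Copy a third party JDBC driver jar into a drivers folder.

    Hypothetical helper for illustration; the drivers_dir value must be
    the location that applies to your Hybrid Data Pipeline installation.
    """
    target = Path(drivers_dir)
    target.mkdir(parents=True, exist_ok=True)  # ensure the folder exists
    dest = target / Path(jar_path).name
    shutil.copy2(jar_path, dest)  # copy the jar, preserving metadata
    return dest

# Example with illustrative paths (default key location, standalone install):
# copy_driver("/tmp/vendor-jdbc.jar", "<install_dir>/ddcloud/keystore/drivers")
```

After copying the jar, restart or refresh the server as your deployment requires so the new driver is picked up.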
In a load balancer installation, the driver must be copied to the drivers directory in the key location specified during the installation of the initial Hybrid Data Pipeline node.
In an On-Premises Connector installation, the drivers must be updated in the On-Premises Connector installation directory: <opc_install_dir>\OPDAS\server\drivers. The profile XML for the third party driver is still read from the Hybrid Data Pipeline server.
After the third party driver has been integrated with the Hybrid Data Pipeline environment, you can create a data source to access backend data. If you attempt to create a JDBC data source before the driver has been plugged in, an error is returned. Data sources can be created either through the Web UI or through the Hybrid Data Pipeline API. For information on creating data sources through the Web UI, see Creating data sources with the Web UI and JDBC parameters for third party drivers. For information on creating data sources through the Hybrid Data Pipeline API, see Data Sources API.
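As a rough sketch of the API route, the request body for a new JDBC data source might be assembled as below. The field names, the "JDBC" data store identifier, and the endpoint shown in the comment are assumptions for illustration only; consult the Data Sources API documentation for the exact contract.

```python
import json

def build_datasource_payload(name: str, driver_class: str, url: str,
                             user: str, password: str) -> str:
    """Build an illustrative JSON body for creating a JDBC data source.

    All field names here are assumptions, not the documented API schema.
    """
    payload = {
        "name": name,              # data source name as it would appear in the Web UI
        "dataStore": "JDBC",       # assumed identifier for a third party JDBC store
        "options": {
            "DriverClassName": driver_class,  # class provided by the plugged-in driver
            "URL": url,                       # backend JDBC connection URL
            "User": user,
            "Password": password,
        },
    }
    return json.dumps(payload)

# The body would then be POSTed to the server, for example (illustrative):
#   POST https://<hdp_server>/api/mgmt/datasources
#   Content-Type: application/json
```

If the driver jar has not been copied into the drivers folder first, a request like this is expected to fail, as noted above.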
Note: A current limitation is that driver classes must not conflict across the integrated drivers. Multiple drivers cannot use different versions of the same library.