README
     Progress(R) DataDirect(R)
     Progress(R) DataDirect(R) for JDBC(TM) for Apache Spark(TM) SQL Driver
     Release 6.0.1
     May 2017


***********************************************************************
Copyright (C) 1994-2017 Progress and/or its 
subsidiaries or affiliates. All Rights Reserved.

***********************************************************************


CONTENTS

Requirements
Installation Directory
Changes since Release 6.0.1
Changes for Release 6.0.1
Product Features
Notes, Known Problems, and Restrictions
Documentation
Installed Files
Third Party Acknowledgements


     Requirements

Java SE 6 or higher must be installed and the JVM must be defined on your system
path.


     Installation Directory

The default installation directory for the driver is:

* Windows:
  C:\Program Files\Progress\DataDirect\JDBC_60

* UNIX/Linux:
  /opt/Progress/DataDirect/JDBC_60
  Note: For UNIX/Linux, if you do not have access to "/opt", the driver is
  installed in your user's home directory instead.


     Changes since Release 6.0.1

Certifications
--------------
The driver has been certified with Spark SQL 2.0.
Driver version 6.0.1.000030 (F000034.U000009)


     Changes for Release 6.0.1

Certifications
--------------
The driver has been certified with Apache Spark SQL 1.4 and 1.5.

Enhancements
------------
* The driver has been enhanced to support the Decimal and Varchar data types.
  See "Notes, Known Problems, and Restrictions" below for details about support
  for the Varchar data type.

* The ArrayFetchSize connection property has been added to the driver to improve
  performance and reduce out of memory errors. ArrayFetchSize can be used to
  increase throughput or, alternately, improve response time in Web-based
  applications.
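
As a sketch, ArrayFetchSize can be supplied as a standard JDBC connection
property. The credentials and the value 200 below are only illustrative; see
the USER'S GUIDE for guidance on choosing a value for your workload:

```java
import java.util.Properties;

public class ArrayFetchSizeConfig {
    // Builds connection properties with ArrayFetchSize raised from its
    // default. Larger values reduce network round trips and favor throughput;
    // smaller values favor response time and a smaller memory footprint.
    static Properties connectionProperties(int arrayFetchSize) {
        Properties props = new Properties();
        props.setProperty("user", "sparkuser");   // hypothetical credentials
        props.setProperty("password", "secret");
        props.setProperty("ArrayFetchSize", String.valueOf(arrayFetchSize));
        return props;
    }

    public static void main(String[] args) {
        // These properties would be passed to DriverManager.getConnection()
        // together with a connection URL such as (host/port are placeholders):
        // jdbc:datadirect:sparksql://myserver:10000;DatabaseName=default
        System.out.println(connectionProperties(200));
    }
}
```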

Changed Behavior
----------------
The driver no longer registers the Statement Pool Monitor as a JMX MBean by
default. To register the Statement Pool Monitor and manage statement pooling
with standard JMX API calls, the new RegisterStatementPoolMonitorMBean
connection property must be set to true. See "Notes, Known Problems, and
Restrictions" for details.
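
Once RegisterStatementPoolMonitorMBean is set to true on a connection, the
monitor can be located through the standard JMX API. The sketch below shows
only the generic platform MBean lookup; the ObjectName pattern the driver
registers under is not shown here (see the USER'S GUIDE):

```java
import java.lang.management.ManagementFactory;
import java.util.Set;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class PoolMonitorLookup {
    // Returns the names of registered MBeans matching the given pattern.
    // With RegisterStatementPoolMonitorMBean=true, the driver's Statement
    // Pool Monitor would appear among the platform MBeans alongside the
    // standard java.lang:* beans queried here.
    static Set<ObjectName> findMBeans(String pattern) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        return server.queryNames(new ObjectName(pattern), null);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(findMBeans("*:*").size() + " MBeans registered");
    }
}
```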


     Product Features

The driver for Apache Spark SQL supports the standard SQL query language for
read-write access to Apache Spark SQL version 1.2.0 and higher. By using
DataDirect's Wire Protocol technology, the driver eliminates the need for client
libraries, improving response time and throughput. The following features are
also included with the product.

* Supports all JDBC Core functions

* Supports the core SQL-92 grammar

* Supports SSL data encryption

* Supports Kerberos authentication

* Supports connection pooling

* Returns result set metadata for parameterized statements that have been
  prepared but not yet executed
  
* Includes a set of timeout connection properties which allow you to limit the
  duration of active sessions and how long the driver waits to establish a
  connection before timing out

* Includes the TransactionMode connection property which allows you to configure
  the driver to report that it supports transactions, although Spark SQL does
  not support transactions. This provides a workaround for applications that do
  not operate with a driver that reports transactions are not supported.

* The driver supports the following data types:
  - BIGINT        maps to BIGINT
  - BOOLEAN       maps to BOOLEAN
  - DATE          maps to DATE
  - DECIMAL       maps to DECIMAL
  - DOUBLE        maps to DOUBLE
  - FLOAT         maps to REAL
  - INT           maps to INTEGER
  - SMALLINT      maps to SMALLINT
  - STRING        maps to VARCHAR or LONGVARCHAR
  - TIMESTAMP     maps to TIMESTAMP
  - TINYINT       maps to TINYINT
  - VARCHAR       maps to VARCHAR

* The driver package includes the following components:
  - DataDirect Connection Pool Manager
  - DataDirect Statement Pool Monitor
  - DataDirect Test for JDBC
  - DataDirect Spy for JDBC
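
The data type mapping above can be checked in client code against the
java.sql.Types constants returned by ResultSetMetaData.getColumnType(). The
helper below simply mirrors the table and is not part of the driver:

```java
import java.sql.Types;
import java.util.LinkedHashMap;
import java.util.Map;

public class SparkTypeMapping {
    // Mirrors the Spark SQL -> JDBC mapping table; type codes come from
    // the standard java.sql.Types constants.
    static Map<String, Integer> sparkToJdbc() {
        Map<String, Integer> m = new LinkedHashMap<>();
        m.put("BIGINT", Types.BIGINT);
        m.put("BOOLEAN", Types.BOOLEAN);
        m.put("DATE", Types.DATE);
        m.put("DECIMAL", Types.DECIMAL);
        m.put("DOUBLE", Types.DOUBLE);
        m.put("FLOAT", Types.REAL);      // note: FLOAT maps to REAL
        m.put("INT", Types.INTEGER);
        m.put("SMALLINT", Types.SMALLINT);
        m.put("STRING", Types.VARCHAR);  // or LONGVARCHAR for long values
        m.put("TIMESTAMP", Types.TIMESTAMP);
        m.put("TINYINT", Types.TINYINT);
        m.put("VARCHAR", Types.VARCHAR);
        return m;
    }

    public static void main(String[] args) {
        sparkToJdbc().forEach((k, v) -> System.out.println(k + " -> " + v));
    }
}
```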


     Notes, Known Problems, and Restrictions

The following are notes, known problems, and restrictions with the driver.

Varchar Data Type
-----------------
When returning result set metadata for Varchar columns, the Spark Thrift server
reports the column type as (12) STRING and the precision as 2147483647. For the
latest information about this issue, refer to the Apache JIRA SPARK-5918 issue
Web page:
https://issues.apache.org/jira/browse/SPARK-5918
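
In client code, the reported values correspond to java.sql.Types.VARCHAR (12)
and Integer.MAX_VALUE. The helper below is a hypothetical workaround, not part
of the driver, that treats such columns as unbounded strings rather than
applying the reported precision:

```java
import java.sql.Types;

public class VarcharMetadataCheck {
    // The Spark Thrift server reports Varchar columns with type code 12
    // (java.sql.Types.VARCHAR) and precision 2147483647 (Integer.MAX_VALUE).
    // A client can treat that precision as "unbounded" rather than a limit.
    static boolean isUnboundedString(int jdbcType, int precision) {
        return jdbcType == Types.VARCHAR && precision == Integer.MAX_VALUE;
    }

    public static void main(String[] args) {
        System.out.println(isUnboundedString(Types.VARCHAR, 2147483647));
        // prints: true
    }
}
```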

Multiple Simultaneous Connections
---------------------------------
For Spark SQL versions 1.5 and earlier, the Spark Thrift server supports only a
single connection from a single application (client) per instance. Multiple
connections can result in a number of unexpected behaviors, including corrupted
data or a crashed Thrift server. While Spark SQL 1.4 resolved some issues
related to this limitation, not all of the multi-connection issues are expected
to be fixed until Spark SQL 1.6 or later. Refer to
https://github.com/apache/spark/pull/8909 for more information.

Rand() Function
---------------
For Spark SQL version 1.4, the rand() function is not supported. If the rand()
function is used, the driver throws the following exception:
org.apache.spark.sql.AnalysisException: For input string: "TOK_TABLE_OR_COL";

Date/Time Hour() Function
-------------------------
For Spark SQL version 1.5, the Spark Thrift server returns an incorrect value
of -17 for the hour() function.

SQL Support
-----------
The driver supports Spark SQL and the core SQL grammar (primarily SQL-92). For
information about how the driver handles SQL queries, see the "Supported SQL
Functionality" topic in the PROGRESS DATADIRECT FOR JDBC FOR APACHE SPARK SQL
DRIVER USER'S GUIDE.
 
Note that Spark SQL uses a subset of SQL and HiveQL. While it provides much of
the functionality of SQL, the Spark SQL subset of SQL and HiveQL has some
differences and limitations. For the latest information, refer to the "Spark SQL
and DataFrame Guide":
http://spark.apache.org/docs/latest/sql-programming-guide.html

Spark SQL Compatibility with Apache Hive
----------------------------------------
Spark SQL currently supports most Hive features and additional features are
continuously being added. For the latest information, refer to the
"Compatibility with Apache Hive" section of the "Spark SQL and DataFrame Guide":
https://spark.apache.org/docs/latest/sql-programming-guide.html#compatibility-with-apache-hive

Transactions
------------
Apache Spark SQL does not support transactions, and by default, the driver
reports that transactions are not supported. However, some applications will not
operate with a driver that reports transactions are not supported. The
TransactionMode connection property allows you to configure the driver to report
that it supports transactions. In this mode, the driver ignores requests to
enter manual commit mode, start a transaction, or commit a transaction, and
returns success. Requests to roll back a transaction return an error regardless
of the transaction mode specified.
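
As a sketch, TransactionMode is supplied as part of the connection URL. The
host, port, and the value "ignore" below are assumptions based on other
DataDirect drivers; see the USER'S GUIDE for the values this driver accepts:

```java
public class TransactionModeUrl {
    // Appends TransactionMode to a DataDirect connection URL. The property
    // value passed in must be one the driver actually accepts; "ignore" is
    // used below only as an illustration.
    static String withTransactionMode(String baseUrl, String mode) {
        return baseUrl + ";TransactionMode=" + mode;
    }

    public static void main(String[] args) {
        String url = withTransactionMode(
            "jdbc:datadirect:sparksql://myserver:10000;DatabaseName=default",
            "ignore");
        System.out.println(url);
    }
}
```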

Executing DataDirect Shell Script
---------------------------------
For UNIX/Linux users: If you receive an error message when executing any
DataDirect for JDBC shell script, make sure that the file has EXECUTE
permission. To do this, use the chmod command. For example, to grant EXECUTE
permission to the testforjdbc.sh file, change to the directory containing
testforjdbc.sh and enter: chmod +x testforjdbc.sh

PreparedStatement and ResultSet methods on Clob Data Types
----------------------------------------------------------
The driver allows PreparedStatement.setXXX methods and ResultSet.getXXX
methods on Clob data types, in addition to the functionality described in
the JDBC specification. The supported conversions typically are the same as
those for LONGVARCHAR, except where limited by database support.
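
For example, the Clob-to-String conversion that ResultSet.getString() performs
on a Clob column can be reproduced with the standard Clob API. SerialClob from
the JDK is used here only to stand in for a driver-created Clob:

```java
import java.sql.Clob;
import javax.sql.rowset.serial.SerialClob;

public class ClobAsString {
    // Converts a Clob to a String, the same conversion the driver applies
    // when ResultSet.getString() is called on a Clob column. Clob positions
    // are 1-based per the JDBC specification.
    static String clobToString(Clob clob) throws Exception {
        return clob.getSubString(1, (int) clob.length());
    }

    public static void main(String[] args) throws Exception {
        Clob clob = new SerialClob("hello spark".toCharArray());
        System.out.println(clobToString(clob)); // prints: hello spark
    }
}
```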

Performance Tuning Wizard
-------------------------
The Performance Tuning Wizard is not available with the Apache Spark SQL driver.

Help System Compatibility
-------------------------
Internet Explorer with the Google Toolbar installed sometimes displays the
following error when the browser is closed: "An error has occurred in the
script on this page." This is a known issue with the Google Toolbar and has
been reported to Google. This error may display when you close the driver's
help system.


     Documentation	 	 

PROGRESS DATADIRECT FOR JDBC FOR APACHE SPARK SQL DRIVER USER'S GUIDE
---------------------------------------------------------------------
The USER'S GUIDE is available as an HTML help system and as a PDF.

* The HTML version of the USER'S GUIDE is installed in the SparkSQLHelp
  subdirectory of your product installation directory. This help system is also
  available on the Progress DataDirect Web site:
  https://www.progress.com/resources/documentation/by-data-source

* The PDF version of the USER'S GUIDE is available on the Progress DataDirect
  Web site:
  https://www.progress.com/resources/documentation/by-data-source


     Installed Files

When you extract the contents of the installation download package to your
installer directory, it contains the following files, which are required to
install the driver:

*  Windows:
   - PROGRESS_DATADIRECT_JDBC_INSTALL.exe
   - PROGRESS_DATADIRECT_JDBC_SPARKSQL_6.0.1_INSTALL.iam.zip
   - PROGRESS_DATADIRECT_JDBC_COMMON_6.0.0_INSTALL.iam.zip
   - PROGRESS_DATADIRECT_JDBC_DOCUMENTATION_6.0.0_INSTALL.iam.zip

*  Non-Windows:
   - PROGRESS_DATADIRECT_JDBC_INSTALL.jar
   - PROGRESS_DATADIRECT_JDBC_SPARKSQL_6.0.1_INSTALL.iam.zip
   - PROGRESS_DATADIRECT_JDBC_COMMON_6.0.0_INSTALL.iam.zip
   - PROGRESS_DATADIRECT_JDBC_DOCUMENTATION_6.0.0_INSTALL.iam.zip

When you install the driver, the installer creates the following directories
and files in the product installation directory, either the default
installation directory or an installation directory you specify, represented by
INSTALL_DIR.


INSTALL_DIR/:
-------------
LicenseTool.jar            Product license file manager

DDProcInfo.exe             Windows executable to start the Processor Information
                           Utility

DDProcInfo                 UNIX/Linux script to start the Processor Information
                           Utility


INSTALL_DIR/Examples/Bulk/:
---------------------------
Load From File/bulkLoadFileDemo.java
                           Java source example for bulk loading from a CSV file

Load From File/load.txt    Sample data for the example

Streaming/bulkLoadStreamingDemo.java
                           Java source example for bulk loading from a result
                           set

Threaded Streaming/bulkLoadThreadedStreamingDemo.java
                           Java source example for multi-threaded bulk loading
                           from a result set

Threaded Streaming/README.txt
                           Instructions on how to use the thread.properties file

Threaded Streaming/thread.properties
                           Properties file for the example


INSTALL_DIR/Examples/Connector/:
--------------------------------
ConnectorSample.ear        J2EE Application Enterprise Archive file containing
                           the ConnectorSample application 

connectorsample.htm        USING DATADIRECT CONNECT FOR JDBC RESOURCE ADAPTERS
                           document

graphics/*.*               Images referenced by the USING DATADIRECT CONNECT FOR
                           JDBC RESOURCE ADAPTERS document

src/ConnectorSample.jsp    Source for the JavaServer Page used to access the
                           ConnectorSample application

src/connectorsample/ConnectorSample.java     
                           Java source file defining the remote interface for
                           the ConnectorSample EJB

src/connectorsample/ConnectorSampleBean.java  
                           Java source file containing the implementation of
                           the ConnectorSample EJB

src/connectorsample/ConnectorSampleHome.java  
                           Java source file defining the home interface for
                           the ConnectorSample EJB


INSTALL_DIR/Examples/JNDI/:
---------------------------
JNDI_FILESYSTEM_Example.java
                           Example Java(TM) source file

JNDI_LDAP_Example.java     Example Java source file


INSTALL_DIR/Examples/SforceSamples/:
------------------------------------
buildsamples.bat           Batch file to build the Salesforce example

buildsamples.sh            Shell script to build the Salesforce example
 
ddlogging.properties       Logging properties file

runsalesforceconnectsample.bat
                           Batch file to run the Salesforce example

runsalesforceconnectsample.sh
                           Shell script to run the Salesforce example

bin/com/ddtek/jdbc/samples/SalesforceConnectSample.class
                           Java example class

bin/com/ddtek/jdbc/samples/SampleException.class
                           Java example class

src/com/ddtek/jdbc/samples/SalesforceConnectSample.java
                           Java source example


INSTALL_DIR/Help/: 
------------------
SparkSQLHelp/index.html    Driver HTML help system entry file

SparkSQLHelp/*             Support files and folders for the driver help system


INSTALL_DIR/install/: 
---------------------
.psc_dd_inst_reg.xml       Support file for the installation logs

logs/*.*                   Log file generated upon installation


INSTALL_DIR/jre/:
-----------------
*.*                        Files associated with the driver for Apache Spark SQL


INSTALL_DIR/lib/:
-----------------
sparksql.jar               Driver and DataSource classes


INSTALL_DIR/NOTICES/:
---------------------
JDBC for Apache Spark SQL v6.0 notices.txt
                           Third party agreement information


INSTALL_DIR/pool manager/:
--------------------------
pool.jar                   All DataDirect Connection Pool Manager classes


INSTALL_DIR/READMES/:
---------------------
JDBC for Apache Spark SQL v6.0 readme.txt
                           This file


INSTALL_DIR/testforjdbc/:
-------------------------
Config.txt                 Configuration file for DataDirect Test

ddlogging.properties       Logging properties file

testforjdbc.bat            Batch file to start DataDirect Test

testforjdbc.sh             Shell script to start DataDirect Test

lib/testforjdbc.jar        DataDirect Test classes


INSTALL_DIR/uninstall/:
-----------------------
.com.zerog.registry.xml    Support file for the uninstaller

.psc_dd_uninst_reg.xml     Support file for the uninstaller

InstallScript.iap_xml      Support file for the uninstaller

installvariables.properties
                           Support file for the Windows uninstaller

Uninstall_JDBC_60.exe      Windows uninstaller for all 6.0 drivers

Uninstall_JDBC_60.lax      Support file for the Windows uninstaller

uninstaller.jar            Java uninstaller for all 6.0 drivers

resource/*.*               Resource files for the Windows uninstaller


2 May 2017
=============
End of README