FAQ: JDBC Snap Pack jOOQ Upgrade from v3.9 to v3.17
Overview
This update to the JDBC Snap Pack upgrades the jOOQ library to version 3.17, which brings several improvements, performance optimizations, and key changes that may impact your current SnapLogic integrations. This FAQ explains the reasons for the update, the expected impact, the upgrade timeline, the databases tested, and the expected behavior changes.
What is happening?
The jOOQ library for the JDBC Snap Pack is being upgraded from v3.9 to v3.17.
Why are we upgrading the JDBC jOOQ library?
To improve security, compatibility, performance, stability, and reliability.
What is the timeline for this upgrade?
The upgrade to jOOQ 3.17 for the JDBC Snap Pack will be rolled out according to the following timeline:
- Early Availability / Latest release: This phase allows early adopters to test the new version and provide feedback. You can choose to use the latest version of the Snap Pack for testing and report any issues you encounter.
- General Availability (GA) / Stable release: The full upgrade to jOOQ 3.17 will be available to everyone with the GA release on February 12, 2025. The stable version is then pushed to all production environments.
Will I be impacted?
Yes, you may be impacted if you use the latest version. If you prefer not to proceed with the jOOQ upgrade, you can revert to the previous working version of the Snap Pack.
What are the key areas affected by this upgrade?
- SQL generation and parsing
- Database connection pooling
- Query performance and execution time
- Error and exception handling
- Compatibility with some legacy database features
Which databases were tested for the jOOQ upgrade?
Given the variety of databases in use, we tested against only the databases listed below. However, we recommend testing your pipelines in a lower-level environment (development or QA) to confirm compatibility and functionality with your specific database before planning the production rollout.
MySQL, PostgreSQL, Redshift, Snowflake, Oracle, SQL Server, MariaDB, DB2, Informix, Cassandra, Teradata, Sybase, Denodo, Vertica, and SAP HANA.
Do I need to reconfigure my accounts?
Yes, you need to reconfigure your accounts if your current drivers (JARs) are incompatible with the upgraded jOOQ version (v3.17.x). For more details, refer to the breaking changes below.
What are the recommended JDBC drivers (JARs) for various databases?
| Database | Latest certified and recommended drivers | Dependencies |
|---|---|---|
| SQL Server | mssql-jdbc-11.2.0.jre11.jar | For this driver version, you must set the URL property trustServerCertificate=true in the Generic Database Account (see the connection sketch after this table). |
| MySQL | mysql-connector-java-5.1.47.jar (MySQL 5.x.x server version), mysql-connector-j-8.0.17.jar (MySQL 8.x.x server version) | |
| Oracle | ojdbc10.jar | |
| Snowflake | snowflake-jdbc-3.13.25.jar | |
| Redshift | RedshiftJDBC42-1.1.17.1017.jar | |
| PostgreSQL | postgresql-42.7.2.jar | |
| Vertica | vertica-jdbc-11.0.0-0.jar | |
| Teradata | 17.20.12 | |
| Cassandra | v4.11.1 | |
| MariaDB | mariadb-java-client-2.6.0.jar | |
| Informix | ifxjdbc.jar | |
| Db2 | db2jcc4.jar | |
| SAP HANA | ngdbc-2.8.14.jar | |
| IBM Db2 i | jt400-20.0.7.jar | |
| Sybase | jconn4.jar | |
| AWS Athena | AthenaJDBC42_2.0.24.1000.jar | |
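The following connection sketch shows where the trustServerCertificate=true property from the SQL Server row fits in a JDBC URL. The host, port, database name, and credentials are placeholders, not values from this document.

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class SqlServerConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical host, port, database, and credentials: replace with your own values.
        // trustServerCertificate=true mirrors the URL property the table calls out for
        // mssql-jdbc-11.2.0.jre11.jar in the Generic Database Account.
        String url = "jdbc:sqlserver://myhost:1433;databaseName=mydb;trustServerCertificate=true";
        try (Connection conn = DriverManager.getConnection(url, "dbuser", "dbpassword")) {
            System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductName());
        }
    }
}
```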
Potential Breaking Changes
- After the jOOQ library upgrade, we recommend that you use the SnapLogic-certified drivers for your target databases. If you do not use the certified drivers, previously functioning pipelines might be impacted.
- Additionally, we highly recommend that you migrate to dedicated Database Snap Packs (for example, Oracle Snap Pack, SQL Server Snap Pack, MySQL Snap Pack, Redshift Snap Pack) instead of using the JDBC Snap Pack for your integration needs.
Behavior changes
The following table lists the changes in the output messages:
| JDBC Snaps | Endpoints | Scenario | Output from the old (v3.9.1) jOOQ library | Output from the new (v3.17.x) jOOQ library |
|---|---|---|---|---|
| Generic JDBC - Execute | All databases - Execute | When calling stored procedures | Output message: Message: Success | Output message: $UPDATE_COUNT=-1, 0, or 1 (based on the Snap Pack behavior) |
| Generic JDBC - Select and Generic JDBC - Execute | IBM DB2 | When the Snaps integrate with DB2 for the XML data type | Output message: metadata + XML data | Output message: XML data |
| Generic JDBC - Select and Generic JDBC - Execute | PostgreSQL | When the Snaps integrate with PostgreSQL for the XML data type | Output message: metadata + XML data | Output message: XML data |
The following table lists the changes in the error messages (blank cells indicate the same value as the row above):
| JDBC Snaps | Endpoints | Scenario | Error from the old (v3.9.1) jOOQ library | Error from the new (v3.17.x) jOOQ library |
|---|---|---|---|---|
| Generic JDBC - Execute | Sybase - Execute | When the Generic JDBC - Execute Snap inserts out-of-range values for int types like BigInt, SmallInt, and TinyInt | error=Batch operation failed, reason=Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478' to an int field. , error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478.5807' to a MONEY field. , resolution=Please check for valid Snap properties and input data. | error=Batch operation failed, reason=Data truncation, error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478' to an int field. , resolution=Please check for valid Snap properties and input data. Reason: Expected and input data are not equal, Resolution: Please check expected and input data. |
| | Oracle - Execute | When you pass a special character (') within the single quotes (''). | "reason": "Invalid query: INSERT INTO \"enron\".\"EMP7400\"(NAME) VALUES ('windyi'eee') | reason=You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"enron"."EMP7400"(NAME) VALUES ('windyi'eee')' at line 1, error code: 1064, SQL state: 42000 |
| | MySQL - Execute | When you pass a special character (') within the single quotes (''). | "reason": "Invalid query: INSERT INTO \"enron\".\"EMP7400\"(NAME) VALUES ('windyi'eee');" | reason=You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"enron"."EMP7400"(NAME) VALUES ('windyi'eee')' at line 1, error code: 1064, SQL state: 42000 |
| | Redshift - Execute | When you pass a special character (') within the single quotes (''). | "reason": "Invalid query: insert into singlequote values (101,'na'me') | "reason": "Invalid query: This may be raised due to limitations in the JDBC driver or syntax errors in the SQL statement", |
| Generic JDBC - Insert | SQL Server - Insert | When the Generic JDBC - Insert Snap has a datatype mismatch. | reason=Error occurred while converting the value of bigintcol field, error=Invalid value for number field definition. | reason=Error converting value bigintcol to type class java.lang.Long, error=Error converting value |
| | Oracle - Insert | | reason=ORA-01438: value larger than specified precision allowed for this column , error code: 1438, SQL state: 22003, error=SQL operation failed | [{"reason":"The string value is not a number","error":"Invalid integer: value10"}] |
| | SAP HANA - Insert | | reason=The string value is not a long integer or too large for a long integer, error=Invalid long integer: bigintcol | reason=SQL [null]; SAP DBTech JDBC: Cannot convert data bigintcol to type long., error=SQL operation failed |
| | Vertica - Insert | When you attempt to insert null and there is a datatype mismatch. | reason=Error converting value bigintcol to type class java.lang.Integer, error=Error converting value | reason=[Vertica][VJDBC](3681) ERROR: Invalid input syntax for integer: "bigintcol", error code: 3681, SQL state: 22V02, error=Batch operation failed |
| | PostgreSQL | When you integrate with the PostgreSQL database and attempt to insert a String value into an Int type. | "error": "Invalid long integer: bigintcol", | "error": "Error converting value", |
| | DB2 | When you integrate DB2 and JDBC Snaps with a char data type that exceeds the prescribed range. | Error: SQLCODE=-433 | Error: SQLCODE=-302 |
| | Sybase - Insert | When you insert a value beyond the range of the Int signed type. | error=Batch operation failed, reason=Data truncation, error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of DECIMAL value '2147483648' to a INT field. | error=Number overflow: 2147483648, reason=Error occurred while converting the value of int_signed_col field, resolution=Please correct the value to match the expected data type, Reason: Expected and input data are not equal |
| | | When you insert a varchar into an Int type. | error=Batch operation failed, reason=Implicit conversion from datatype 'VARCHAR' to 'INT' is not allowed. Use the CONVERT function to run this query. , error code: 257, SQL state: 37000, resolution=Please check for valid Snap properties and input data. | error=Error converting value, reason=Error converting value P to type class java.lang.Integer, resolution=Please provide input data in the appropriate data type., Reason: Expected and input data are not equal, Resolution: Please check expected and input data. |
| | | When you insert a varchar into a date type. | error=Batch operation failed, reason=Syntax error during implicit conversion of VARCHAR value 'Hello' to a DATETIME field. , error code: 249, SQL state: S1000, resolution=Please check for valid Snap properties and input data. | error=Error converting value, reason=Error converting value Hello to type class java.sql.Timestamp, resolution=Please provide input data in the appropriate data type. Reason: Expected and input data are not equal, Resolution: Please check expected and input data. |
| | | When you insert a varchar into a TinyInt Signed type. | error=Error converting value, reason=Error converting value P to type class java.lang.Byte, resolution=Please provide input data in the appropriate data type. | error=Error converting value, reason=Error converting value P to type class org.jooq.types.UByte, resolution=Please provide input data in the appropriate data type. Reason: Expected and input data are not equal, Resolution: Please check expected and input data. |
| Generic JDBC - Execute and Generic JDBC - Insert | Sybase - Execute | When the Generic JDBC - Execute Snap inserts an out-of-range positive value for the money type. | reason=Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478.5807' to a MONEY field . , error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478.5807' to a MONEY field | reason=Data truncation, error code: 0, SQL state: 22001, next exception: Arithmetic overflow during implicit conversion of NUMERIC value '922337203685478.5807' to a MONEY field. |
- Previously, when you ran multiple statements in the Generic JDBC - Execute Snap, only the reason appeared in the error. If your target database is Snowflake, the error reason now includes the Snowflake exception class at the beginning.
  - Before: reason=Actual statement count 2 did not match the desired statement count 1
  - After: reason=net.snowflake.client.jdbc.SnowflakeSQLException: Actual statement count 2 did not match the desired statement count 1
- When the Generic JDBC - Insert Snap inserts null into Vertica and there is a data type mismatch, the error message changes.
  - Before: reason=Error converting value bigintcol to type class java.lang.Integer, error=Error converting value
  - After: reason=[Vertica][VJDBC](3681) ERROR: Invalid input syntax for integer: "bigintcol", error code: 3681, SQL state: 22V02, error=Batch operation failed
- Previously, when you integrated DB2 and JDBC Snaps with the char data type, the error was SQLCODE=-433. Now, the error message is SQLCODE=-302.
  - Before: SQLCODE=-433
  - After: SQLCODE=-302
- Previously, when you called stored procedures, the Execute Snap displayed only the success message in the output. Now, the Snap displays both the status and the message in the output for all databases.
  - Before: Message: Success
  - After: $UPDATE_COUNT=-1, 0, or 1 (based on the Snap Pack behavior) in the output
- Previously, when the Generic JDBC - Select and Generic JDBC - Execute Snaps were integrated with PostgreSQL or IBM DB2, the metadata was displayed along with the XML data. Now, you receive only the XML data.
  - Before: metadata + XML data
  - After: XML data
- Previously, when the Generic JDBC - Insert and Generic JDBC - Execute Snaps were integrated with the Sybase database and attempted to insert an out-of-range negative value for the money type, the error reason changed slightly.
  - Before: reason=Arithmetic overflow during implicit conversion of NUMERIC value
  - After: reason=Data truncation
- Previously, when the Generic JDBC - Insert and Generic JDBC - Execute Snaps were integrated with the Sybase database and attempted to insert out-of-range values for int types like big int, small int, and tiny int, the error reason changed slightly.
  - Before: reason=Arithmetic overflow during implicit conversion of NUMERIC value
  - After: reason=Data truncation
- Previously, when the Generic JDBC - Insert Snap was integrated with the Sybase database and a varchar value was inserted into an int datatype, the error message and reason changed slightly.
  - Before: error=Batch operation failed, reason=Implicit conversion from datatype 'VARCHAR' to 'INT' is not allowed. Use the CONVERT function to run this query. , error code: 257, SQL state: 37000, resolution=Please check for valid Snap properties and input data.
  - After: error=Error converting value, reason=Error converting value P to type class java.lang.Integer, resolution=Please provide input data in the appropriate data type., Reason: Expected and input data are not equal, Resolution: Please check expected and input data
- Previously, when the Generic JDBC - Insert Snap was integrated with the Sybase database and a varchar value was inserted into a tiny int datatype, the reason changed slightly.
  - Before: reason=Error converting value P to type class java.lang.Byte
  - After: reason=Error converting value P to type class org.jooq.types.UByte
- When the Generic JDBC - Execute Snap is integrated with the Oracle, MySQL, PostgreSQL, or Redshift databases and you attempt to pass a special character ('), the error reason changes slightly. Example: Insert into table(ids) values ('windyi'eee'). See the parameter-binding sketch after this list.
  - Before: "reason": "Invalid query: Insert into TECTONIC.SNPQA2055 (ids) values ('windyi'eee')"
  - After: "reason": "error occurred during batching: ORA-00917: missing comma\n, error code: 17081",
- When the Generic JDBC - Insert Snap is integrated with the PostgreSQL database and you attempt to insert a String value into Int types, the error message and reason change.
  - Before: "error": "Invalid long integer: bigintcol", "reason": "The string value is not a long integer or too large for a long integer"
  - After: "error": "Error converting value", "reason": "Error converting value bigintcol to type class java.lang.Long"