SNOW-1621535: DatabaseMetaData - handling of escape characters #1869
Hello @bauer-at-work, thanks for raising the issue. We are looking into it and will update. Regards,
Hello @bauer-at-work, could you please share the code snippet?
Hi @sfc-gh-sghosh,

In case (i), the underscore is unescaped and may therefore act as a wildcard. In case (ii), however, the underscore is escaped and should not be treated as a wildcard; instead, the schema string should be stripped of the backslashes and treated as a literal schema identifier.

Also, you need to provide four backslashes instead of only two in your Java code examples. Please re-check your triage. Thank you!

P.S.: GitHub's Markdown renderer also treats a backslash as an escape character, so I hope nothing got stripped from your message.
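The four-backslash point above can be illustrated with plain Java (no Snowflake dependency). This is a minimal sketch of how the source literal expands; the class and method names are made up for the demo:

```java
public class BackslashDemo {
    // Returns the runtime value of the four-backslash source literal.
    static String escapedPattern() {
        // Four backslashes in Java source become two backslashes at runtime;
        // the driver/engine should then read "\\_" as an escaped literal '_'.
        return "UNDER\\\\_SCORE";
    }

    public static void main(String[] args) {
        String p = escapedPattern();
        System.out.println(p);          // UNDER\\_SCORE (two backslashes)
        System.out.println(p.length()); // 13 characters
    }
}
```

One level of doubling is consumed by the Java compiler, the second by the metadata pattern parser, which is why two backslashes in source code are not enough.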
What version of JDBC driver are you using?
v3.18
What operating system and processor architecture are you using?
Fedora 40, amd64
What version of Java are you using?
OpenJDK 64-Bit Server VM (Red_Hat-21.0.4.0.7-2) (build 21.0.4+7, mixed mode, sharing)
What did you do?
Run
DatabaseMetaData.getSchemas()

with two kinds of schema parameters in the JDBC URL: (i) the schema name with an unescaped underscore, and (ii) the schema name with the underscore escaped by backslashes. Then, I compared the results. Both queries yield one row of results (the only matching schema is "UNDER_SCORE").
However, I noticed a difference in the
IS_CURRENT
field:
In case (i), IS_CURRENT equals 'Y'.
In case (ii), IS_CURRENT equals 'N'.
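To make the two cases concrete, here is a hedged repro sketch. The `<account>` placeholder, the `urlFor` helper, and the `printSchemas` method are my own illustrations (not from the original report); `printSchemas` needs a live Snowflake connection and is therefore not invoked from `main`:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;

public class GetSchemasRepro {
    // Builds a JDBC URL with the given schema parameter.
    // <account> is a placeholder for a real Snowflake account locator.
    static String urlFor(String schemaParam) {
        return "jdbc:snowflake://<account>.snowflakecomputing.com/?schema=" + schemaParam;
    }

    // Connects with the given URL and prints each schema with its IS_CURRENT
    // flag (a Snowflake-specific column in the getSchemas() result set).
    // Requires a live Snowflake account, so it is not called from main.
    static void printSchemas(String url) throws SQLException {
        try (Connection con = DriverManager.getConnection(url);
             ResultSet rs = con.getMetaData().getSchemas()) {
            while (rs.next()) {
                System.out.println(rs.getString("TABLE_SCHEM")
                        + " IS_CURRENT=" + rs.getString("IS_CURRENT"));
            }
        }
    }

    public static void main(String[] args) {
        // Case (i): unescaped underscore ('_' acts as a single-character wildcard).
        System.out.println(urlFor("UNDER_SCORE"));
        // Case (ii): escaped underscore; four backslashes in Java source become
        // two at runtime, which the driver should treat as a literal '_'.
        System.out.println(urlFor("UNDER\\\\_SCORE"));
    }
}
```

Per the report, both URLs return the single row for UNDER_SCORE, but only case (i) reports IS_CURRENT='Y'.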
What did you expect to see?
In case (ii), IS_CURRENT equalling 'Y'.
It seems, however, that either the driver or the Snowflake query engine does not properly handle the escape characters (\\).

Given that this is the second issue I have encountered regarding DatabaseMetaData operations on schemas with special characters, please closely examine the driver implementation.
We're struggling to get decent performance on metadata operations on our Snowflake-to-SAP-HANA interface.
Thank you!