Thursday, September 17, 2015

Using Debug Mode in Sql Developer with KERBEROS Debug

SQL Developer debug mode can be enabled in two steps:

1-) Set the Log Directory path in Tools->Preferences->Environment

2-) In the sqldeveloper\sqldeveloper\bin directory, edit the sqldeveloper.conf file: comment out the line "IncludeConfFile  sqldeveloper-nondebug.conf" and add (or uncomment) the line "IncludeConfFile  sqldeveloper-debug.conf":

#IncludeConfFile  sqldeveloper-nondebug.conf
IncludeConfFile  sqldeveloper-debug.conf

Restart SQL Developer. You will see that debug logs are created under the log directory with the name format SQLDeveloper_YYYYMMDDHH24MISS.log

In order to add Kerberos debugging, you should add the following Java option (the standard JDK Kerberos debug flag) to the sqldeveloper.conf file in the sqldeveloper\sqldeveloper\bin directory:

AddVMOption -Dsun.security.krb5.debug=true

However, the Kerberos debug information will not be written to the SQL Developer debug log file; it is sent to stdout. In order to capture the Kerberos debug output, you should start SQL Developer from the command line. Also set the screen buffer size of the cmd window to a value higher than the system default, so that debug output is not lost because of a small buffer.
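A minimal sketch of starting it from cmd (paths and the output file name are examples; adjust them to your install):

```shell
rem Launch SQL Developer from cmd so the Kerberos debug output written to stdout is visible.
cd C:\sqldeveloper\sqldeveloper\bin

rem Either enlarge the console screen buffer so lines are not lost...
mode con lines=3000

rem ...or redirect stdout/stderr to a file (krb5_debug.txt is an example name):
sqldeveloper.exe > krb5_debug.txt 2>&1
```

Redirecting to a file is usually more convenient than scrolling the console buffer.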

Sql Developer: How to set up Hive connection with Kerberos authentication on Windows client

By default, Windows restricts applications from retrieving the TGT session key. The following registry values should be set in order to allow it:

Windows 2003, Windows Vista, and later:
• Key: HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Lsa\Kerberos\Parameters
• Value Name: allowtgtsessionkey
• Value Type: REG_DWORD
• Value: 0x01
Windows XP and Windows 2000:
• Key: HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Lsa\Kerberos
• Value Name: allowtgtsessionkey
• Value Type: REG_DWORD
• Value: 0x01
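On Vista and later, the registry value above can be set from an elevated command prompt, for example:

```shell
rem Allow applications to retrieve the TGT session key (Vista-and-later key path;
rem for XP/2000 drop the trailing \Parameters). Run from an elevated prompt;
rem a logoff or reboot may be needed before the change takes effect.
reg add "HKLM\System\CurrentControlSet\Control\Lsa\Kerberos\Parameters" ^
    /v allowtgtsessionkey /t REG_DWORD /d 0x1 /f
```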

SQL Developer 4.0.3 and later supports Hive connections. My version is:

Hive JDBC drivers should be downloaded and registered with SQL Developer as Third Party JDBC drivers. My Hadoop distribution is Cloudera, version 2.5.4. I downloaded the driver from the following URL:
Here is how it looks:

"Java Cryptography Extension (JCE) Unlimited Strength Jurisdiction Policy Files" should be downloaded. You should extract (overwriting) the files local_policy.jar and US_export_policy.jar into the sqldeveloper/jdk/jre/lib/security directory.
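One way to check that the unlimited-strength policy files were picked up is to query the maximum allowed AES key length with the JDK's jrunscript tool (a sketch; the path assumes SQL Developer's bundled JDK):

```shell
rem Print the maximum allowed AES key length for the bundled JDK (path is an example).
rem With the unlimited-strength policy files installed this prints 2147483647;
rem with the default policy files it prints 128.
sqldeveloper\jdk\bin\jrunscript -e "print(javax.crypto.Cipher.getMaxAllowedKeyLength('AES'))"
```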

krb5.conf is also needed by Java for Kerberos authentication. If the java.security.krb5.conf system property is not set, Java first looks in jre/lib/security, then in the %WINDIR% directory. You should create this file under sqldeveloper/jdk/jre/lib/security (the same directory where you put the local_policy.jar and US_export_policy.jar files). The parameters in this file depend on your KDC configuration. In general, the /etc/krb5.conf file from any Hadoop cluster node works fine on Windows too. My krb5.conf file looks like this:

[libdefaults]
default_realm = MYDOMAIN.COM
default_tgs_enctypes = aes256-cts aes128-cts arcfour-hmac-md5 des-cbc-md5 des-cbc-crc
default_tkt_enctypes = aes256-cts aes128-cts arcfour-hmac-md5 des-cbc-md5 des-cbc-crc
permitted_enctypes = aes256-cts aes128-cts arcfour-hmac-md5 des-cbc-md5 des-cbc-crc
dns_lookup_realm = true
dns_lookup_kdc = true
passwd_check_s_address = false
noaddresses = true
udp_preference_limit = 1
ccache_type = 3
kdc_timesync = 1
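Before creating the connection, you can verify the krb5.conf with the Kerberos tools shipped in the bundled Windows JDK (a sketch; the path and the principal name are examples):

```shell
rem Request and list a TGT using the kinit/klist tools bundled with the Windows JDK.
rem Adjust the path and principal (myuser@MYDOMAIN.COM) to your own setup.
cd sqldeveloper\jdk\jre\bin
kinit myuser@MYDOMAIN.COM
klist
```

If kinit succeeds and klist shows a valid ticket, the krb5.conf and encryption-type settings are working.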

Now, you can create a Hive connection with Kerberos authentication on Windows client: