Use a separate logger for unsafe thrift responses #153


Merged: 6 commits merged into main from issue-152 on Jun 23, 2023

Conversation

susodapop
Contributor

Description

This pull request adds a dedicated logger for unredacted Thrift responses. The standard databricks.sql.thrift_backend logger now records only the request and response types.
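The separation relies on Python's standard logging hierarchy: giving the unredacted records their own logger name lets applications attach (or withhold) handlers for them independently of the safe logger. A minimal sketch of the mechanism; the logger name `databricks.sql.unsafe` is an assumption for illustration, not necessarily the name the library uses:

```python
import logging

# Hypothetical logger names for illustration; the safe logger matches the
# one shown in the log output below, the unsafe name is assumed.
safe_logger = logging.getLogger("databricks.sql.thrift_backend")
unsafe_logger = logging.getLogger("databricks.sql.unsafe")
unsafe_logger.propagate = False  # keep unredacted records out of ancestor handlers

class ListHandler(logging.Handler):
    """Collects formatted records in a list so we can inspect them."""
    def __init__(self):
        super().__init__()
        self.records = []
    def emit(self, record):
        self.records.append(self.format(record))

safe_handler, unsafe_handler = ListHandler(), ListHandler()
safe_logger.addHandler(safe_handler)
safe_logger.setLevel(logging.DEBUG)
unsafe_logger.addHandler(unsafe_handler)
unsafe_logger.setLevel(logging.DEBUG)

# The safe logger sees only redacted summaries; the full thrift
# payload goes to the dedicated unsafe logger.
safe_logger.debug("Sending request: OpenSession(<REDACTED>)")
unsafe_logger.debug("Sending request: TOpenSessionReq(username=None, password=None, ...)")
```

Because the unredacted records never reach the default handler chain, a consumer that only configures root logging sees just the redacted lines.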

For example, here is the live logging output from running the test_timezone_with_timestamp e2e test before this change:

Log output before this PR
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:321 Sending request: TOpenSessionReq(client_protocol=None, username=None, password=None, configuration={'spark.thriftserver.arrowBasedRowSet.timestampAsString': 'false'}, getInfos=None, client_protocol_i64=42247, connectionProperties=None, initialNamespace=None, canUseMultipleCatalogs=True, sessionId=None)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:327 Received response: TOpenSessionResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), serverProtocolVersion=42247, sessionHandle=TSessionHandle(sessionId=THandleIdentifier(guid=b'\x01\xee\x10v\xe88\x14\xdd\x9dF\x89(\x80?<\x98', secret=b'3\x8dR\x9d\x82rF\xeb\x84\x82\xcbA\x94f\x83\x9d', executionVersion=None), serverProtocolVersion=None), configuration=None, initialNamespace=TNamespace(catalogName='hive_metastore', schemaName=None), canUseMultipleCatalogs=True, getInfos=[])
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:321 Sending request: TExecuteStatementReq(sessionHandle=TSessionHandle(sessionId=THandleIdentifier(guid=b'\x01\xee\x10v\xe88\x14\xdd\x9dF\x89(\x80?<\x98', secret=b'3\x8dR\x9d\x82rF\xeb\x84\x82\xcbA\x94f\x83\x9d', executionVersion=None), serverProtocolVersion=None), statement="SET TIME ZONE 'Europe/Amsterdam'", confOverlay={'spark.thriftserver.arrowBasedRowSet.timestampAsString': 'false'}, runAsync=True, getDirectResults=TSparkGetDirectResults(maxRows=1000, maxBytes=10485760), queryTimeout=0, canReadArrowResult=True, canDownloadResult=False, canDecompressLZ4Result=True, maxBytesPerFile=None, useArrowNativeTypes=TSparkArrowTypes(timestampAsArrow=True, decimalAsArrow=True, complexTypesAsArrow=True, intervalTypesAsArrow=False), resultRowLimit=None, operationId=None, sessionConf=None, rejectHighCostQueries=None, estimatedCost=None, executionVersion=None, requestValidation=None, resultPersistenceMode=None, trimArrowBatchesToLimit=None)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:327 Received response: TExecuteStatementResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), operationHandle=TOperationHandle(operationId=THandleIdentifier(guid=b'\x01\xee\x10v\xe8Q\x17Z\x9b\x00_-(\xa5[\xd5', secret=b'3\x8dR\x9d\x82rF\xeb\x84\x82\xcbA\x94f\x83\x9d', executionVersion=0), operationType=0, hasResultSet=True, modifiedRowCount=None), directResults=TSparkDirectResults(operationStatus=TGetOperationStatusResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), operationState=2, sqlState=None, errorCode=None, errorMessage=None, taskStatus=None, operationStarted=1687381338618, operationCompleted=1687381338749, hasResultSet=None, progressUpdateResponse=None, numModifiedRows=None, displayMessage=None, diagnosticInfo=None, responseValidation=None, idempotencyType=1, statementTimeout=172800, statementTimeoutLevel=None), resultSetMetadata=TGetResultSetMetadataResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), schema=TTableSchema(columns=[TColumnDesc(columnName='key', typeDesc=TTypeDesc(types=[TTypeEntry(primitiveEntry=TPrimitiveTypeEntry(type=7, typeQualifiers=None), arrayEntry=None, mapEntry=None, structEntry=None, unionEntry=None, userDefinedTypeEntry=None)]), position=1, comment=''), TColumnDesc(columnName='value', typeDesc=TTypeDesc(types=[TTypeEntry(primitiveEntry=TPrimitiveTypeEntry(type=7, typeQualifiers=None), arrayEntry=None, mapEntry=None, structEntry=None, unionEntry=None, userDefinedTypeEntry=None)]), position=2, comment='')]), resultFormat=0, lz4Compressed=True, 
arrowSchema=b'\xff\xff\xff\xff\xa8\x01\x00\x00\x10\x00\x00\x00\x00\x00\n\x00\x0e\x00\x06\x00\r\x00\x08\x00\n\x00\x00\x00\x00\x00\x04\x00\x10\x00\x00\x00\x00\x01\n\x00\x0c\x00\x00\x00\x08\x00\x04\x00\n\x00\x00\x00\x08\x00\x00\x00\x08\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\xbc\x00\x00\x00\x04\x00\x00\x00^\xff\xff\xff\x14\x00\x00\x00\x88\x00\x00\x00\x88\x00\x00\x00\x00\x00\x00\x05\x84\x00\x00\x00\x02\x00\x00\x00@\x00\x00\x00\x04\x00\x00\x00\x14\xff\xff\xff\x08\x00\x00\x00\x14\x00\x00\x00\x08\x00\x00\x00"string"\x00\x00\x00\x00\x17\x00\x00\x00Spark:DataType:JsonType\x00L\xff\xff\xff\x08\x00\x00\x00\x10\x00\x00\x00\x06\x00\x00\x00STRING\x00\x00\x16\x00\x00\x00Spark:DataType:SqlName\x00\x00\x00\x00\x00\x00D\xff\xff\xff\x05\x00\x00\x00value\x00\x12\x00\x18\x00\x14\x00\x00\x00\x13\x00\x0c\x00\x00\x00\x08\x00\x04\x00\x12\x00\x00\x00\x14\x00\x00\x00\x90\x00\x00\x00\x94\x00\x00\x00\x00\x00\x00\x05\x90\x00\x00\x00\x02\x00\x00\x00H\x00\x00\x00\x04\x00\x00\x00\xc8\xff\xff\xff\x08\x00\x00\x00\x14\x00\x00\x00\x08\x00\x00\x00"string"\x00\x00\x00\x00\x17\x00\x00\x00Spark:DataType:JsonType\x00\x08\x00\x0c\x00\x08\x00\x04\x00\x08\x00\x00\x00\x08\x00\x00\x00\x10\x00\x00\x00\x06\x00\x00\x00STRING\x00\x00\x16\x00\x00\x00Spark:DataType:SqlName\x00\x00\x00\x00\x00\x00\x04\x00\x04\x00\x04\x00\x00\x00\x03\x00\x00\x00key\x00\x00\x00\x00\x00', cacheLookupResult=0, uncompressedBytes=280, compressedBytes=182, isStagingOperation=False, reasonForNoCloudFetch=1, resultFiles=None, manifestFile=None, manifestFileFormat=None), resultSet=TFetchResultsResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), hasMoreRows=False, results=TRowSet(startRowOffset=0, rows=[], columns=None, binaryColumns=None, columnCount=None, 
arrowBatches=[TSparkArrowBatch(batch=b'\x04"M\x18`ps\xa7\x00\x00\x00\xa2\xff\xff\xff\xff\xd8\x00\x00\x00\x14\x00\x01\x00\xf2\x02\x0c\x00\x16\x00\x0e\x00\x15\x00\x10\x00\x04\x00\x0c\x00\x00\x008\x17\x00`\x00\x00\x00\x04\x00\x10\x08\x00\xc3\x03\n\x00\x18\x00\x0c\x00\x08\x00\x04\x00\n8\x00Px\x00\x00\x00\x01\x1c\x00\x03\x02\x00\x13\x06\x08\x00\x00\x02\x00\x04\x18\x00\x10\x08\r\x00\x07\x08\x00\x01L\x00\x07\x10\x00\x11\x18\x0e\x00\x060\x00\x10 \r\x00\x07 \x00\x11(\x0e\x00\x068\x00\x00\x02\x00\x1b\x02x\x00\x00\x02\x00\x04p\x00\x04\x02\x00\x08\x10\x00\x00\x84\x00\x88timezone\x18\x00\x00\x8c\x00\xf0\x01Europe/Amsterdam\x00\x00\x00\x00', rowCount=1)], resultLinks=None), resultSetMetadata=None, responseValidation=None), closeOperation=TCloseOperationResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None))), executionRejected=None, maxClusterCapacity=None, queryCost=None, sessionConf=TDBSqlSessionConf(confs={'spark.databricks.sqlgateway.useCreateViewCommandWithResult': 'false', 'enable_photon': 'false', 'timezone': 'Europe/Amsterdam', 'statement_timeout': '0', 'spark.thriftserver.arrowBasedRowSet.timestampAsString': 'false', 'use_cached_result': 'true'}, tempViews=[], currentDatabase='default', currentCatalog='hive_metastore', sessionCapabilities=TDBSqlSessionCapabilities(supportsMultipleCatalogs=True), expressionsInfos=[], internalConfs=None), currentClusterLoad=None, idempotencyType=None)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:321 Sending request: TExecuteStatementReq(sessionHandle=TSessionHandle(sessionId=THandleIdentifier(guid=b'\x01\xee\x10v\xe88\x14\xdd\x9dF\x89(\x80?<\x98', secret=b'3\x8dR\x9d\x82rF\xeb\x84\x82\xcbA\x94f\x83\x9d', executionVersion=None), serverProtocolVersion=None), statement="select CAST('2022-03-02 12:54:56' as TIMESTAMP)", confOverlay={'spark.thriftserver.arrowBasedRowSet.timestampAsString': 'false'}, runAsync=True, getDirectResults=TSparkGetDirectResults(maxRows=1000, maxBytes=10485760), queryTimeout=0, canReadArrowResult=True, canDownloadResult=False, canDecompressLZ4Result=True, maxBytesPerFile=None, useArrowNativeTypes=TSparkArrowTypes(timestampAsArrow=True, decimalAsArrow=True, complexTypesAsArrow=True, intervalTypesAsArrow=False), resultRowLimit=None, operationId=None, sessionConf=None, rejectHighCostQueries=None, estimatedCost=None, executionVersion=None, requestValidation=None, resultPersistenceMode=None, trimArrowBatchesToLimit=None)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:327 Received response: TExecuteStatementResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), operationHandle=TOperationHandle(operationId=THandleIdentifier(guid=b'\x01\xee\x10v\xe8\xa3\x1d@\xb1\x11\xc3\xcc\xfeB\xfd\xef', secret=b'3\x8dR\x9d\x82rF\xeb\x84\x82\xcbA\x94f\x83\x9d', executionVersion=0), operationType=0, hasResultSet=True, modifiedRowCount=None), directResults=TSparkDirectResults(operationStatus=TGetOperationStatusResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), operationState=2, sqlState=None, errorCode=None, errorMessage=None, taskStatus=None, operationStarted=1687381339080, operationCompleted=1687381339385, hasResultSet=None, progressUpdateResponse=None, numModifiedRows=None, displayMessage=None, diagnosticInfo=None, responseValidation=None, idempotencyType=2, statementTimeout=172800, statementTimeoutLevel=None), resultSetMetadata=TGetResultSetMetadataResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), schema=TTableSchema(columns=[TColumnDesc(columnName='CAST(2022-03-02 12:54:56 AS TIMESTAMP)', typeDesc=TTypeDesc(types=[TTypeEntry(primitiveEntry=TPrimitiveTypeEntry(type=8, typeQualifiers=None), arrayEntry=None, mapEntry=None, structEntry=None, unionEntry=None, userDefinedTypeEntry=None)]), position=1, comment='')]), resultFormat=0, lz4Compressed=True, 
arrowSchema=b'\xff\xff\xff\xffP\x01\x00\x00\x10\x00\x00\x00\x00\x00\n\x00\x0e\x00\x06\x00\r\x00\x08\x00\n\x00\x00\x00\x00\x00\x04\x00\x10\x00\x00\x00\x00\x01\n\x00\x0c\x00\x00\x00\x08\x00\x04\x00\n\x00\x00\x00\x08\x00\x00\x00\x08\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x18\x00\x00\x00\x00\x00\x12\x00\x18\x00\x14\x00\x13\x00\x12\x00\x0c\x00\x00\x00\x08\x00\x04\x00\x12\x00\x00\x00\x14\x00\x00\x00\x94\x00\x00\x00\x9c\x00\x00\x00\x00\x00\n\x01\xb8\x00\x00\x00\x02\x00\x00\x00H\x00\x00\x00\x04\x00\x00\x00\xc8\xff\xff\xff\x08\x00\x00\x00\x14\x00\x00\x00\x0b\x00\x00\x00"timestamp"\x00\x17\x00\x00\x00Spark:DataType:JsonType\x00\x08\x00\x0c\x00\x08\x00\x04\x00\x08\x00\x00\x00\x08\x00\x00\x00\x14\x00\x00\x00\t\x00\x00\x00TIMESTAMP\x00\x00\x00\x16\x00\x00\x00Spark:DataType:SqlName\x00\x00\x00\x00\x00\x00\x08\x00\x0c\x00\n\x00\x04\x00\x08\x00\x00\x00\x08\x00\x00\x00\x00\x00\x02\x00\x10\x00\x00\x00Europe/Amsterdam\x00\x00\x00\x00&\x00\x00\x00CAST(2022-03-02 12:54:56 AS TIMESTAMP)\x00\x00\x00\x00\x00\x00', cacheLookupResult=0, uncompressedBytes=160, compressedBytes=122, isStagingOperation=False, reasonForNoCloudFetch=1, resultFiles=None, manifestFile=None, manifestFileFormat=None), resultSet=TFetchResultsResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), hasMoreRows=False, results=TRowSet(startRowOffset=0, rows=[], columns=None, binaryColumns=None, columnCount=None, arrowBatches=[TSparkArrowBatch(batch=b'\x04"M\x18`psk\x00\x00\x00\xa2\xff\xff\xff\xff\x88\x00\x00\x00\x14\x00\x01\x00\xf2\x02\x0c\x00\x16\x00\x0e\x00\x15\x00\x10\x00\x04\x00\x0c\x00\x00\x00\x10\x17\x00B\x00\x00\x00\x04\x0c\x00\xc3\x03\n\x00\x18\x00\x0c\x00\x08\x00\x04\x00\n8\x00Q8\x00\x00\x00\x01$\x00\x02\x02\x00\x12\x02\x07\x00\x01\x02\x00\x04\x18\x00\x11\x08\x0e\x00\x06\x08\x00\x00\x02\x00\x00\x1c\x00\x00\x04\x00\x08\x02\x00\x05\x10\x00p\xe4@\xf3:\xd9\x05\x00\x00\x00\x00\x00', rowCount=1)], resultLinks=None), 
resultSetMetadata=None, responseValidation=None), closeOperation=TCloseOperationResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None))), executionRejected=False, maxClusterCapacity=10.0, queryCost=0.5, sessionConf=None, currentClusterLoad=1.0, idempotencyType=2)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:321 Sending request: TExecuteStatementReq(sessionHandle=TSessionHandle(sessionId=THandleIdentifier(guid=b'\x01\xee\x10v\xe88\x14\xdd\x9dF\x89(\x80?<\x98', secret=b'3\x8dR\x9d\x82rF\xeb\x84\x82\xcbA\x94f\x83\x9d', executionVersion=None), serverProtocolVersion=None), statement="select CAST('2022-03-02 12:54:56' as TIMESTAMP)", confOverlay={'spark.thriftserver.arrowBasedRowSet.timestampAsString': 'false'}, runAsync=True, getDirectResults=TSparkGetDirectResults(maxRows=1000, maxBytes=10485760), queryTimeout=0, canReadArrowResult=True, canDownloadResult=False, canDecompressLZ4Result=True, maxBytesPerFile=None, useArrowNativeTypes=TSparkArrowTypes(timestampAsArrow=True, decimalAsArrow=True, complexTypesAsArrow=True, intervalTypesAsArrow=False), resultRowLimit=None, operationId=None, sessionConf=None, rejectHighCostQueries=None, estimatedCost=None, executionVersion=None, requestValidation=None, resultPersistenceMode=None, trimArrowBatchesToLimit=None)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:327 Received response: TExecuteStatementResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), operationHandle=TOperationHandle(operationId=THandleIdentifier(guid=b'\x01\xee\x10v\xe9\x0b\x12\xa7\x97\x19JLK\x8d\xa2\xbf', secret=b'3\x8dR\x9d\x82rF\xeb\x84\x82\xcbA\x94f\x83\x9d', executionVersion=0), operationType=0, hasResultSet=True, modifiedRowCount=None), directResults=TSparkDirectResults(operationStatus=TGetOperationStatusResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), operationState=2, sqlState=None, errorCode=None, errorMessage=None, taskStatus=None, operationStarted=1687381339734, operationCompleted=1687381339959, hasResultSet=None, progressUpdateResponse=None, numModifiedRows=None, displayMessage=None, diagnosticInfo=None, responseValidation=None, idempotencyType=2, statementTimeout=172800, statementTimeoutLevel=None), resultSetMetadata=TGetResultSetMetadataResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), schema=TTableSchema(columns=[TColumnDesc(columnName='CAST(2022-03-02 12:54:56 AS TIMESTAMP)', typeDesc=TTypeDesc(types=[TTypeEntry(primitiveEntry=TPrimitiveTypeEntry(type=8, typeQualifiers=None), arrayEntry=None, mapEntry=None, structEntry=None, unionEntry=None, userDefinedTypeEntry=None)]), position=1, comment='')]), resultFormat=0, lz4Compressed=True, 
arrowSchema=b'\xff\xff\xff\xffP\x01\x00\x00\x10\x00\x00\x00\x00\x00\n\x00\x0e\x00\x06\x00\r\x00\x08\x00\n\x00\x00\x00\x00\x00\x04\x00\x10\x00\x00\x00\x00\x01\n\x00\x0c\x00\x00\x00\x08\x00\x04\x00\n\x00\x00\x00\x08\x00\x00\x00\x08\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x18\x00\x00\x00\x00\x00\x12\x00\x18\x00\x14\x00\x13\x00\x12\x00\x0c\x00\x00\x00\x08\x00\x04\x00\x12\x00\x00\x00\x14\x00\x00\x00\x94\x00\x00\x00\x9c\x00\x00\x00\x00\x00\n\x01\xb8\x00\x00\x00\x02\x00\x00\x00H\x00\x00\x00\x04\x00\x00\x00\xc8\xff\xff\xff\x08\x00\x00\x00\x14\x00\x00\x00\x0b\x00\x00\x00"timestamp"\x00\x17\x00\x00\x00Spark:DataType:JsonType\x00\x08\x00\x0c\x00\x08\x00\x04\x00\x08\x00\x00\x00\x08\x00\x00\x00\x14\x00\x00\x00\t\x00\x00\x00TIMESTAMP\x00\x00\x00\x16\x00\x00\x00Spark:DataType:SqlName\x00\x00\x00\x00\x00\x00\x08\x00\x0c\x00\n\x00\x04\x00\x08\x00\x00\x00\x08\x00\x00\x00\x00\x00\x02\x00\x10\x00\x00\x00Europe/Amsterdam\x00\x00\x00\x00&\x00\x00\x00CAST(2022-03-02 12:54:56 AS TIMESTAMP)\x00\x00\x00\x00\x00\x00', cacheLookupResult=0, uncompressedBytes=160, compressedBytes=122, isStagingOperation=False, reasonForNoCloudFetch=1, resultFiles=None, manifestFile=None, manifestFileFormat=None), resultSet=TFetchResultsResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None), hasMoreRows=False, results=TRowSet(startRowOffset=0, rows=[], columns=None, binaryColumns=None, columnCount=None, arrowBatches=[TSparkArrowBatch(batch=b'\x04"M\x18`psk\x00\x00\x00\xa2\xff\xff\xff\xff\x88\x00\x00\x00\x14\x00\x01\x00\xf2\x02\x0c\x00\x16\x00\x0e\x00\x15\x00\x10\x00\x04\x00\x0c\x00\x00\x00\x10\x17\x00B\x00\x00\x00\x04\x0c\x00\xc3\x03\n\x00\x18\x00\x0c\x00\x08\x00\x04\x00\n8\x00Q8\x00\x00\x00\x01$\x00\x02\x02\x00\x12\x02\x07\x00\x01\x02\x00\x04\x18\x00\x11\x08\x0e\x00\x06\x08\x00\x00\x02\x00\x00\x1c\x00\x00\x04\x00\x08\x02\x00\x05\x10\x00p\xe4@\xf3:\xd9\x05\x00\x00\x00\x00\x00', rowCount=1)], resultLinks=None), 
resultSetMetadata=None, responseValidation=None), closeOperation=TCloseOperationResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None))), executionRejected=False, maxClusterCapacity=10.0, queryCost=0.5, sessionConf=None, currentClusterLoad=1.0, idempotencyType=2)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:321 Sending request: TCloseSessionReq(sessionHandle=TSessionHandle(sessionId=THandleIdentifier(guid=b'\x01\xee\x10v\xe88\x14\xdd\x9dF\x89(\x80?<\x98', secret=b'3\x8dR\x9d\x82rF\xeb\x84\x82\xcbA\x94f\x83\x9d', executionVersion=None), serverProtocolVersion=None))
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:327 Received response: TCloseSessionResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None, displayMessage=None, responseValidation=None))

Log output after this PR

DEBUG    databricks.sql.thrift_backend:thrift_backend.py:325 Sending request: OpenSession(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:333 Received response: TOpenSessionResp(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:325 Sending request: ExecuteStatement(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:333 Received response: TExecuteStatementResp(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:325 Sending request: ExecuteStatement(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:333 Received response: TExecuteStatementResp(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:325 Sending request: ExecuteStatement(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:333 Received response: TExecuteStatementResp(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:325 Sending request: CloseSession(<REDACTED>)
DEBUG    databricks.sql.thrift_backend:thrift_backend.py:333 Received response: TCloseSessionResp(<REDACTED>)
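When the full unredacted payloads are needed for debugging, they can be opted back into by configuring the dedicated logger directly. A hedged sketch, again assuming the logger name `databricks.sql.unsafe` (check the library source for the actual name):

```python
import io
import logging

# Opt in to unredacted thrift logging. Routing it to its own buffer or
# file keeps sensitive payloads out of shared application logs.
unsafe = logging.getLogger("databricks.sql.unsafe")
unsafe.setLevel(logging.DEBUG)

buf = io.StringIO()  # stand-in for a dedicated file handler
handler = logging.StreamHandler(buf)
handler.setFormatter(logging.Formatter("%(levelname)s %(name)s %(message)s"))
unsafe.addHandler(handler)

unsafe.debug("Received response: TOpenSessionResp(...)")
print(buf.getvalue(), end="")
```

In production you would more likely use a `logging.FileHandler` pointed at a restricted-permission file rather than an in-memory buffer.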

Related Tickets & Documents

Closes #152

Jesse Whitehouse added 5 commits June 21, 2023 16:24
This gives an example of how to actually view the unsafe logs when needed.

Signed-off-by: Jesse Whitehouse <[email protected]>
do have a `name` property.

Signed-off-by: Jesse Whitehouse <[email protected]>
Signed-off-by: Jesse Whitehouse <[email protected]>
Signed-off-by: Jesse Whitehouse <[email protected]>
Signed-off-by: Jesse Whitehouse <[email protected]>
@susodapop
Contributor Author

I pushed this branch to pypi as version 2.6.3dev1 so I can experiment with it in the dbt-databricks connector here: databricks/dbt-databricks#364

@susodapop susodapop merged commit 7fcfa7b into main Jun 23, 2023
@susodapop susodapop deleted the issue-152 branch June 23, 2023 22:50
susodapop pushed a commit to unj1m/databricks-sql-python that referenced this pull request Sep 19, 2023

Successfully merging this pull request may close these issues.

Stop emitting unredacted thrift responses by default