Description
It appears that server-side default values aren't supported?
Trying to create a table with the server_default below
gives a Spark error:
insert_timestamp: Mapped[datetime] = mapped_column(
    sa.DateTime(timezone=True),
    init=False,
    nullable=False,
    server_default=sa.func.current_timestamp(),
)
DatabaseError: (databricks.sql.exc.ServerOperationError) [INTERNAL_ERROR] The Spark SQL phase planning failed with an internal error. You hit a bug in Spark or the Spark plugins you use. Please, report this bug to the corresponding communities or vendors, and provide the full stack trace.
[SQL:
CREATE TABLE test.`ModelMetadata` (
    pkid BIGINT GENERATED ALWAYS AS IDENTITY,
    name STRING NOT NULL,
    version STRING NOT NULL,
    insert_timestamp TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP NOT NULL,
    PRIMARY KEY (pkid)
) USING DELTA
]
(Background on this error at: https://sqlalche.me/e/20/4xp6)
Edit:
It looks like the internal error is caused by combining the TIMESTAMP_NTZ type with the CURRENT_TIMESTAMP default.
Using the correct TIMESTAMP type for the column instead gives a "table feature was not enabled" error.
Working around that second error requires setting table properties on the Delta table, which AFAICS can't be done from SQLAlchemy, at least not directly.
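
For reference, a sketch of one way to restructure the model. If this is the databricks-sqlalchemy v2 dialect, the generic DateTime appears to map to TIMESTAMP_NTZ even with timezone=True, and the dialect ships its own timezone-aware TIMESTAMP type; leaving server_default off the column also keeps the failing DEFAULT clause out of the CREATE TABLE. The databricks.sqlalchemy.TIMESTAMP import is an assumption taken from the databricks-sqlalchemy docs, not from this issue:

from datetime import datetime

import sqlalchemy as sa
from sqlalchemy.orm import DeclarativeBase, Mapped, MappedAsDataclass, mapped_column

# Assumed dialect-specific type: databricks-sqlalchemy documents a
# timezone-aware TIMESTAMP alongside its default TIMESTAMP_NTZ mapping
# for the generic DateTime.
from databricks.sqlalchemy import TIMESTAMP


class Base(MappedAsDataclass, DeclarativeBase):
    pass


class ModelMetadata(Base):
    __tablename__ = "ModelMetadata"
    __table_args__ = {"schema": "test"}

    pkid: Mapped[int] = mapped_column(
        sa.BigInteger, init=False, primary_key=True, autoincrement=True
    )
    name: Mapped[str]
    version: Mapped[str]
    # No server_default here: an inline DEFAULT clause is what requires the
    # Delta column-defaults table feature during CREATE TABLE.
    insert_timestamp: Mapped[datetime] = mapped_column(
        TIMESTAMP, init=False, nullable=False
    )

Even with the timezone-aware type, adding the default still needs the Delta table feature enabled, which the next sketch works around.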
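
With the default left out of the model, the table feature and the default can be applied after create_all() with raw SQL. A minimal sketch, assuming the Delta table property 'delta.feature.allowColumnDefaults' (the name comes from the Delta Lake column-defaults feature, not from this issue) and a placeholder connection URL:

import sqlalchemy as sa

# Placeholder URL; fill in the real host, HTTP path, token, catalog and schema.
engine = sa.create_engine(
    "databricks://token:<access-token>@<server-hostname>"
    "?http_path=<http-path>&catalog=<catalog>&schema=test"
)

Base.metadata.create_all(engine)  # Base / ModelMetadata from the sketch above

with engine.begin() as conn:
    # Enable the Delta column-defaults table feature (assumed property name).
    conn.execute(sa.text(
        "ALTER TABLE test.`ModelMetadata` "
        "SET TBLPROPERTIES ('delta.feature.allowColumnDefaults' = 'supported')"
    ))
    # Add the server-side default after the table exists, instead of in the
    # CREATE TABLE statement that was failing.
    conn.execute(sa.text(
        "ALTER TABLE test.`ModelMetadata` "
        "ALTER COLUMN insert_timestamp SET DEFAULT CURRENT_TIMESTAMP()"
    ))

This keeps the default on the server side without the dialect having to emit TBLPROPERTIES itself; the trade-off is that the DDL is split between create_all() and hand-written statements.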