diff --git a/docs/changelog/next_release/+.doc.1.rst b/docs/changelog/next_release/+.doc.1.rst
new file mode 100644
index 00000000..5e1a4f3e
--- /dev/null
+++ b/docs/changelog/next_release/+.doc.1.rst
@@ -0,0 +1 @@
+Fix links to MSSQL date & time type documentation.
diff --git a/docs/connection/db_connection/mssql/types.rst b/docs/connection/db_connection/mssql/types.rst
index 13c7874a..1770c91b 100644
--- a/docs/connection/db_connection/mssql/types.rst
+++ b/docs/connection/db_connection/mssql/types.rst
@@ -197,8 +197,7 @@ Temporal types
     So not all of values in Spark DataFrame can be written to MSSQL.

     References:
-        * `Clickhouse DateTime documentation `_
-        * `Clickhouse DateTime documentation `_
+        * `MSSQL date & time types documentation `_
        * `Spark DateType documentation `_
        * `Spark TimestampType documentation `_
@@ -213,7 +212,7 @@ Temporal types
    Last digit will be lost during read or write operations.

.. [5]
-    ``time`` type is the same as ``datetime2`` with date ``1970-01-01``. So instead of reading data from MSSQL like ``23:59:59.999999``
+    ``time`` type is the same as ``datetime2`` with date ``1970-01-01``. So instead of reading data from MSSQL like ``23:59:59.999999``
    it is actually read ``1970-01-01 23:59:59.999999``, and vice versa.

String types