.. dialect:: mssql+pyodbc
    :name: PyODBC
    :dbapi: pyodbc
    :connectstring: mssql+pyodbc://<username>:<password>@<dsnname>
    :url: https://pypi.org/project/pyodbc/

Connecting to PyODBC
--------------------

The URL here is to be translated to PyODBC connection strings, as
detailed in the PyODBC ``ConnectionStrings`` documentation.

DSN Connections
^^^^^^^^^^^^^^^

A DSN connection in ODBC means that a pre-existing ODBC datasource is
configured on the client machine.  The application then specifies the name of
this datasource, which encompasses details such as the specific ODBC driver in
use as well as the network address of the database.  Assuming a datasource is
configured on the client, a basic DSN-based connection looks like::

    engine = create_engine("mssql+pyodbc://scott:tiger@some_dsn")

The above URL will pass the following connection string to PyODBC::

    DSN=some_dsn;UID=scott;PWD=tiger

If the username and password are omitted, the DSN form will also add the
``Trusted_Connection=yes`` directive to the ODBC string.

Hostname Connections
^^^^^^^^^^^^^^^^^^^^

Hostname-based connections are also supported by pyodbc.  These are often
easier to use than a DSN and have the additional advantage that the specific
database name to connect towards may be specified locally in the URL, rather
than it being fixed as part of a datasource configuration.

When using a hostname connection, the driver name must also be specified in
the query parameters of the URL.  As these names usually have spaces in them,
the name must be URL encoded, which means using plus signs for spaces::

    engine = create_engine("mssql+pyodbc://scott:tiger@myhost:port/databasename?driver=ODBC+Driver+17+for+SQL+Server")

The ``driver`` keyword is significant to the pyodbc dialect and must be
specified in lowercase.
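When building such a URL by hand, the plus-sign encoding of the driver name
can be produced with the standard library; the following sketch uses
``urllib.parse.quote_plus`` (the host, port, and credentials are
placeholders, not part of any real configuration):

```python
from urllib.parse import quote_plus

# URL-encode the driver name: spaces become plus signs
driver = quote_plus("ODBC Driver 17 for SQL Server")

# placeholder host and credentials, for illustration only
url = f"mssql+pyodbc://scott:tiger@myhost:1433/databasename?driver={driver}"

print(driver)  # ODBC+Driver+17+for+SQL+Server
```

Passing the same driver name through the ``query`` dictionary of
``URL.create`` (shown further below) avoids the need for manual encoding.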
Any other names passed in the query string are passed through in the pyodbc
connect string, such as ``authentication``, ``TrustServerCertificate``, etc.
Multiple keyword arguments must be separated by an ampersand (``&``); these
will be translated to semicolons when the pyodbc connect string is generated
internally::

    e = create_engine(
        "mssql+pyodbc://scott:tiger@mssql2017:1433/test?"
        "driver=ODBC+Driver+18+for+SQL+Server&TrustServerCertificate=yes"
        "&authentication=ActiveDirectoryIntegrated"
    )

The equivalent URL can be constructed using :class:`_sa.engine.URL`::

    from sqlalchemy.engine import URL

    connection_url = URL.create(
        "mssql+pyodbc",
        username="scott",
        password="tiger",
        host="mssql2017",
        port=1433,
        database="test",
        query={
            "driver": "ODBC Driver 18 for SQL Server",
            "TrustServerCertificate": "yes",
            "authentication": "ActiveDirectoryIntegrated",
        },
    )

Pass through exact Pyodbc string
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

A PyODBC connection string can also be sent in pyodbc's format directly, as
specified in the PyODBC documentation, using the parameter ``odbc_connect``.
A :class:`_sa.engine.URL` object can help make this easier::

    from sqlalchemy.engine import URL

    connection_string = "DRIVER={SQL Server Native Client 10.0};SERVER=dagger;DATABASE=test;UID=user;PWD=password"
    connection_url = URL.create("mssql+pyodbc", query={"odbc_connect": connection_string})

    engine = create_engine(connection_url)

.. _mssql_pyodbc_access_tokens:

Connecting to databases with access tokens
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Some database servers are set up to only accept access tokens for login.  For
example, SQL Server allows the use of Azure Active Directory tokens to connect
to databases.  This requires creating a credential object using the
``azure-identity`` library.  More information about the authentication step
can be found in Microsoft's documentation.
After getting an engine, the credentials need to be sent to ``pyodbc.connect``
each time a connection is requested.  One way to do this is to set up an event
listener on the engine that adds the credential token to the dialect's connect
call.  This is discussed more generally in :ref:`engines_dynamic_tokens`.  For
SQL Server in particular, the token is passed as an ODBC connection attribute
with a data structure described by Microsoft.

The following code snippet will create an engine that connects to an Azure SQL
database using Azure credentials::

    import struct
    from sqlalchemy import create_engine, event
    from sqlalchemy.engine.url import URL
    from azure import identity

    SQL_COPT_SS_ACCESS_TOKEN = 1256  # Connection option for access tokens, as defined in msodbcsql.h
    TOKEN_URL = "https://database.windows.net/"  # The token URL for any Azure SQL database

    connection_string = "mssql+pyodbc://@my-server.database.windows.net/myDb?driver=ODBC+Driver+17+for+SQL+Server"

    engine = create_engine(connection_string)

    azure_credentials = identity.DefaultAzureCredential()

    @event.listens_for(engine, "do_connect")
    def provide_token(dialect, conn_rec, cargs, cparams):
        # remove the "Trusted_Connection" parameter that SQLAlchemy adds
        cargs[0] = cargs[0].replace(";Trusted_Connection=Yes", "")

        # create token credential
        raw_token = azure_credentials.get_token(TOKEN_URL).token.encode("utf-16-le")
        token_struct = struct.pack(f"<I{len(raw_token)}s", len(raw_token), raw_token)

        # apply it to keyword arguments
        cparams["attrs_before"] = {SQL_COPT_SS_ACCESS_TOKEN: token_struct}

.. tip::

    The ``Trusted_Connection`` token is currently added by the SQLAlchemy
    pyodbc dialect when no username or password is present.  This needs to be
    removed per Microsoft's documentation for Azure access tokens, stating
    that a connection string when using an access token must not contain
    ``UID``, ``PWD``, ``Authentication`` or ``Trusted_Connection`` parameters.
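The length-prefixed token structure used by the listener can be exercised on
its own; this sketch packs a dummy token (no Azure connection is made, and the
token bytes are placeholders):

```python
import struct

# placeholder standing in for a real Azure AD token string
raw_token = "example-access-token".encode("utf-16-le")

# 4-byte little-endian length prefix followed by the UTF-16-LE token bytes
token_struct = struct.pack(f"<I{len(raw_token)}s", len(raw_token), raw_token)

# round-trip to verify the layout
length, payload = struct.unpack(f"<I{len(raw_token)}s", token_struct)
```

The round-trip confirms that the packed buffer is exactly four bytes of
length followed by the token itself, which is the shape the ODBC attribute
expects.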
.. _azure_synapse_ignore_no_transaction_on_rollback:

Avoiding transaction-related exceptions on Azure Synapse Analytics
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Azure Synapse Analytics has a significant difference in its transaction
handling compared to plain SQL Server; in some cases an error within a Synapse
transaction can cause it to be arbitrarily terminated on the server side,
which then causes the DBAPI ``.rollback()`` method (as well as ``.commit()``)
to fail.  The issue prevents the usual DBAPI contract of allowing
``.rollback()`` to pass silently if no transaction is present, as the driver
does not expect this condition.  The symptom of this failure is an exception
with a message resembling 'No corresponding transaction found. (111214)' when
attempting to emit a ``.rollback()`` after an operation had a failure of some
kind.

This specific case can be handled by passing
``ignore_no_transaction_on_rollback=True`` to the SQL Server dialect via the
:func:`_sa.create_engine` function as follows::

    engine = create_engine(connection_url, ignore_no_transaction_on_rollback=True)

Using the above parameter, the dialect will catch ``ProgrammingError``
exceptions raised during ``connection.rollback()`` and emit a warning if the
error message contains code ``111214``; however, it will not raise an
exception.

.. versionadded:: 1.4.40  Added the
   ``ignore_no_transaction_on_rollback=True`` parameter.

Enable autocommit for Azure SQL Data Warehouse (DW) connections
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Azure SQL Data Warehouse does not support transactions, and that can cause
problems with SQLAlchemy's "autobegin" (and implicit commit/rollback)
behavior.
We can avoid these problems by enabling autocommit at both the pyodbc and
engine levels::

    connection_url = sa.engine.URL.create(
        "mssql+pyodbc",
        username="scott",
        password="tiger",
        host="dw.azure.example.com",
        database="mydb",
        query={
            "driver": "ODBC Driver 17 for SQL Server",
            "autocommit": "True",
        },
    )

    engine = create_engine(connection_url).execution_options(
        isolation_level="AUTOCOMMIT"
    )

Avoiding sending large string parameters as TEXT/NTEXT
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

By default, for historical reasons, Microsoft's ODBC drivers for SQL Server
send long string parameters (greater than 4000 SBCS characters or 2000
Unicode characters) as TEXT/NTEXT values.  TEXT and NTEXT have been
deprecated for many years and are starting to cause compatibility issues with
newer versions of SQL Server/Azure; see, for example, the related issue
reported against pyodbc.

Starting with ODBC Driver 18 for SQL Server we can override the legacy
behavior and pass long strings as varchar(max)/nvarchar(max) using the
``LongAsMax=Yes`` connection string parameter::

    connection_url = sa.engine.URL.create(
        "mssql+pyodbc",
        username="scott",
        password="tiger",
        host="mssqlserver.example.com",
        database="mydb",
        query={
            "driver": "ODBC Driver 18 for SQL Server",
            "LongAsMax": "Yes",
        },
    )

Pyodbc Pooling / connection close behavior
------------------------------------------

PyODBC uses internal pooling by default, which means connections will be
longer lived than they are within SQLAlchemy itself.  As SQLAlchemy has its
own pooling behavior, it is often preferable to disable this behavior.
This behavior can only be disabled globally at the PyODBC module level,
**before** any connections are made::

    import pyodbc

    pyodbc.pooling = False

    # don't use the engine before pooling is set to False
    engine = create_engine("mssql+pyodbc://user:pass@dsn")

If this variable is left at its default value of ``True``, **the application
will continue to maintain active database connections**, even when the
SQLAlchemy engine itself fully discards a connection or if the engine is
disposed.

.. seealso::

    The "pooling" section of the PyODBC documentation.

Driver / Unicode Support
-------------------------

PyODBC works best with Microsoft ODBC drivers, particularly in the area of
Unicode support on both Python 2 and Python 3.

Using the FreeTDS ODBC drivers on Linux or OSX with PyODBC is **not**
recommended; there have historically been many Unicode-related issues in this
area, including before Microsoft offered ODBC drivers for Linux and OSX.  Now
that Microsoft offers drivers for all platforms, these are recommended for
PyODBC support.  FreeTDS remains relevant for non-ODBC drivers such as
pymssql, where it works very well.

Rowcount Support
----------------

Pyodbc only has partial support for rowcount.  See the notes at
:ref:`mssql_rowcount_versioning` for important notes when using ORM
versioning.

.. _mssql_pyodbc_fastexecutemany:

Fast Executemany Mode
---------------------

The PyODBC driver has added support for a "fast executemany" mode of
execution which greatly reduces round trips for a DBAPI ``executemany()``
call when using Microsoft ODBC drivers, for **limited size batches that fit
in memory**.  The feature is enabled by setting the flag
``.fast_executemany`` on the DBAPI cursor when an executemany call is to be
used.
The SQLAlchemy pyodbc SQL Server dialect supports setting this flag
automatically when the ``fast_executemany`` flag is passed to
:func:`_sa.create_engine`; note that the ODBC driver must be the Microsoft
driver in order to use this flag::

    engine = create_engine(
        "mssql+pyodbc://scott:tiger@mssql2017:1433/test?driver=ODBC+Driver+13+for+SQL+Server",
        fast_executemany=True)

.. warning:: The pyodbc fast_executemany mode **buffers all rows in memory**
   and is not compatible with very large batches of data.  A future version
   of SQLAlchemy may support this flag as a per-execution option instead.

.. versionadded:: 1.3

.. seealso::

    The "fast executemany" feature description on the pyodbc GitHub wiki.

.. _mssql_pyodbc_setinputsizes:

Setinputsizes Support
---------------------

The pyodbc ``cursor.setinputsizes()`` method can be used if necessary.  To
enable this hook, pass ``use_setinputsizes=True`` to
:func:`_sa.create_engine`::

    engine = create_engine("mssql+pyodbc://...", use_setinputsizes=True)

The behavior of the hook can then be customized, as may be necessary
particularly if fast_executemany is in use, via the
:meth:`.DialectEvents.do_setinputsizes` hook.  See that method for usage
examples.

.. versionchanged:: 1.4.1  The pyodbc dialects will not use setinputsizes
   unless ``use_setinputsizes=True`` is passed.