This is the official team Web Log for Microsoft Customer Service and Support (CSS) SQL Support. Posts are provided by the CSS SQL Escalation Services team.

    I have worked more Linked Server cases that connect to an Oracle database than to any other non-SQL Server data source. Being in Business Intelligence Support, we deal with plenty of connectivity issues, and this is one area of connectivity that does not get covered very often.

    In less than a month I got four Oracle Linked Server cases that all had different issues. What really struck me was that I did not understand the Oracle side of things well enough to troubleshoot them effectively. For example, in one case I did not have a good understanding of the ODAC providers (Oracle’s providers for connecting to different tools and applications), the tnsnames.ora file, or how they fit into the whole setup. Having the whole picture makes Oracle Linked Server setups much easier to understand.

    Walkthrough:

    So for me to understand how the Oracle side worked I needed to get an Oracle server up and running.

    As such, I decided to create an Oracle 11G server. You can download the bits using the following link.

    Oracle Database Software
    http://www.oracle.com/technetwork/database/enterprise-edition/downloads/index-092322.html

    After all that, I created a table in the system (default) schema.

    Then I needed to create a listener, which I learned is very important on the Oracle side because it handles incoming client connections to the database.

    What’s a table without any data? After creating the table I added some test data so I could compare the results from the Oracle database against the Linked Server results.

    image

    Once I had the Oracle side all up and ready I started to create my Linked Server in SSMS.

    Now that the Oracle server was up and operational, I needed to find one very important file, because it handles the connectivity between Oracle and SQL Server: the tnsnames.ora file. Make sure you can locate it on the Oracle database server itself. The typical default location is:

    C:\<database folder>\product\11.2.0\dbhome_1\NETWORK\ADMIN\tnsnames.ora

    ex: C:\OracleDatabase\product\11.2.0\dbhome_1\NETWORK\ADMIN\tnsnames.ora

    The Service ID you have set up is the connection information you will need when creating the Linked Server in SSMS. In this case I am going to use MSORATEST.

    image

    Now that we know the Oracle server is setup and we have our tnsnames.ora information ready, we need to start setting up the SQL Server to have the ability to create a Linked Server that connects to an Oracle database.

    So at this point we would need to download and install the proper ODAC provider from ORACLE to get that process started. REMEMBER – BITNESS MATTERS!

    Listed below are the sites on where to download the proper provider needed:

    For 64-bit providers
    http://www.oracle.com/technetwork/database/windows/downloads/index-090165.html

    For 32-bit providers
    http://www.oracle.com/technetwork/topics/dotnet/utilsoft-086879.html

    For this example we are using the 64-bit providers for Oracle version 11g.

    image

    For a quick way to verify the provider is downloaded and installed properly, you can do a UDL test. On the desktop, create a new text file (make sure file extensions are visible so you can see the .txt part of the name). Rename the entire file, including the extension, to Test.udl and confirm the change. Open the file and go to the Provider tab; you should see “Oracle Provider for OLE DB” listed.

    image

    Once you have confirmed the provider is installed, locate the tnsnames.ora file on the SQL Server machine. The default location is normally C:\<folder chosen to save it in>\app\oracle\product\11.2.0\client\network\ADMIN.

    Example location we are going to use will be: D:\app\sql2012\product\11.2.0\client_1\network\ADMIN.

    image

    What we would add to the SQL Server tnsnames.ora file:

    SPORTS =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = ORACLE11GMG.markg.local)(PORT = 4977))
        (ADDRESS = (PROTOCOL = TCP)(HOST = ORACLE11GMG.markg.local)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = Sports)
        )
      )

    Once you have the tnsnames.ora file correctly filled in on the SQL Server machine, you can now setup the Linked Server in SQL Server Management Studio.

    Using SQL Server Management Studio to create a linked server to another instance of SQL Server
    https://technet.microsoft.com/en-us/library/ff772782(v=sql.110).aspx#SSMSProcedure

    Before we start going through the actual steps, you need to make sure the “OraOLEDB.Oracle” provider is listed under Linked Servers > Providers.

    image
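    If you prefer to check from a query window rather than the Object Explorer tree, the system procedure sp_enum_oledb_providers lists the OLE DB providers registered on the machine. This is a quick-check sketch and not part of the original walkthrough:

    -- List the OLE DB providers registered on this SQL Server machine;
    -- OraOLEDB.Oracle should appear in the output if the ODAC install succeeded.
    EXEC master.dbo.sp_enum_oledb_providers;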

    Also make sure that, under the properties for the provider, Allow inprocess is selected.

    image

    When you use the “Allow inprocess” option for a Linked Server provider, SQL Server loads the provider’s COM DLL inside its own process. We do not normally recommend this because it can lead to stability issues in SQL Server (if the provider crashes, it takes SQL Server down with it), but some providers, such as the Oracle one, require it.

    When running out of process, SQL Server launches the MSDAINITIALIZE process, and that process loads the COM server (in this case, the OLE DB provider). If it sits idle for a number of minutes, or if the driver crashes the process, it unloads, and the next linked server request loads a new MSDAINITIALIZE process. You can see MSDAINITIALIZE by running dcomcnfg and drilling down into Component Services.

    Generally only Administrators or the local system account can launch this, so if SQL is running under a domain account, you should add it to the local Administrators group or have it run as Local System.
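    The “Allow inprocess” setting can also be scripted. The sp_MSset_oledb_prop procedure is commonly used in setup scripts to flip provider options; treat this as a hedged sketch and verify it against your build before relying on it:

    -- Enable "Allow inprocess" for the Oracle OLE DB provider (equivalent to the checkbox above).
    EXEC master.dbo.sp_MSset_oledb_prop N'OraOLEDB.Oracle', N'AllowInProcess', 1;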

    Now we can start creating the Oracle Linked Server in Management Studio.

    Go to Server Objects > Linked Servers > right click and select New Linked Server…

    image

    Then start filling in the necessary information to create the Oracle Linked Server.

    General Tab:

    Linked server – the name of your Linked Server

    Server type – choose “Other data source” when using Oracle or any other non-SQL Server database

    Provider – Oracle Provider for OLE DB (downloaded from the Oracle site)

    Product name – Oracle

    Data source – MSORATEST (this comes from the information in the tnsnames.ora file you added onto the SQL machine)

    Provider string – leave blank

    Sample image of what it would look like once completed.

    image

    Then you will need to go to the Security Tab.

    Select the option – Be made using this security context. The credentials you need to add are the ones that get you logged into your Oracle database.

    Note: this is probably not the safest option; mapping logins would be more secure. With this setting, every user hitting this linked server connects to Oracle using that single security context. I did it this way because it was easier for me and I am my own admin.

    image
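    If you would rather script the linked server than click through the dialogs, a T-SQL sketch along these lines mirrors what the screens above collect. The SPORTS and MSORATEST names come from this walkthrough, and the Oracle user name and password are placeholders for whatever account logs into your Oracle database:

    -- Create the linked server definition (matches the General tab above).
    EXEC master.dbo.sp_addlinkedserver
         @server     = N'SPORTS',           -- linked server name
         @srvproduct = N'Oracle',           -- product name
         @provider   = N'OraOLEDB.Oracle',  -- Oracle Provider for OLE DB
         @datasrc    = N'MSORATEST';        -- entry from the tnsnames.ora file on the SQL machine

    -- Map all local logins to a single Oracle account (matches "Be made using this security context").
    EXEC master.dbo.sp_addlinkedsrvlogin
         @rmtsrvname  = N'SPORTS',
         @useself     = N'False',
         @locallogin  = NULL,               -- applies to every local login
         @rmtuser     = N'oracle_user',     -- placeholder Oracle user
         @rmtpassword = N'oracle_password'; -- placeholder password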

    Then open up the Linked Server in Management Studio and search for the system tables:

    image

    Test that it works by running a four-part query:

    <Linked server name>.<Database name>.<Schema>.<Table name>  (if there is no specific database name, leave that part empty so the name contains “..”)

    Ex: [SPORTS]..[MARK].[SPORTSDALLAS]

    image

    image
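    For comparison, the same data can be pulled either with the four-part name or with OPENQUERY, which sends the inner statement to Oracle as-is. The object names below are the ones used in this example:

    -- Four-part name: SQL Server parses the query and remotes what it can to Oracle.
    SELECT * FROM [SPORTS]..[MARK].[SPORTSDALLAS];

    -- Pass-through query: the inner statement runs entirely on the Oracle side.
    SELECT * FROM OPENQUERY(SPORTS, 'SELECT * FROM MARK.SPORTSDALLAS');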

    If you get the error “Cannot create an instance of OLE DB provider” after filling in all the information when trying to create the Linked Server, follow this blog:

    Troubleshooting “Cannot create an instance of OLE DB provider”
    http://blogs.msdn.com/b/dataaccesstechnologies/archive/2011/09/28/troubleshooting-cannot-create-an-instance-of-ole-db-provider.aspx

    Mark Ghanayem
    Microsoft Business Intelligence Support



    I was attempting to add the SSMS connection dialog to my utility and ran into problems with referenced assemblies.

    The ConnectionDialog is documented here: https://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.ui.connectiondlg.aspx

    The following is a snippet using the SSMS ConnectionDialog in a C# application.

    // Requires references to Microsoft.SqlServer.Management.UI.ConnectionDlg (ManagementControls)
    // and Microsoft.SqlServer.Management.Smo.RegSvrEnum.
    Microsoft.SqlServer.Management.UI.ConnectionDlg.ConnectionDialog dlg =
        new Microsoft.SqlServer.Management.UI.ConnectionDlg.ConnectionDialog();

    // UIConnectionInfo carries the connection settings the dialog collects.
    Microsoft.SqlServer.Management.Smo.RegSvrEnum.UIConnectionInfo connInfo =
        new Microsoft.SqlServer.Management.Smo.RegSvrEnum.UIConnectionInfo { ApplicationName = "My App" };

    IDbConnection conn;
    dlg.AddServer(new Microsoft.SqlServer.Management.UI.ConnectionDlg.SqlServerType());
    dlg.Text = "Connect to My Database";

    // Shows the SSMS-style connection dialog and validates the connection before returning.
    // m_strConnString and bRC are members of the surrounding class in the utility.
    DialogResult dr = dlg.ShowDialogValidateConnection(this, ref connInfo, out conn);
    if (DialogResult.OK == dr)
    {
        m_strConnString = conn.ConnectionString + ";Pooling=TRUE";

        // Make sure a database name is present in the connection string.
        if (false == m_strConnString.Contains("Database="))
            m_strConnString += ";Database=MyDb";

        bRC = true;
    }
    else
    {
        bRC = false;
    }

    To compile this properly, references to the RegSvrEnum and ManagementControls assemblies are required.  I compiled it on my system and provided it to a peer, who quickly caused the application to fail with missing assembly references.

    I had built the application using the SQL Server SSMS 2015 July Preview, and they had SQL Server 2014 on their system.  No problem, I thought; I made sure both assemblies were in the same directory as my application, but it still failed.

    Following the trail of missing assemblies I had to provide the following in order for the application to execute.

    • Microsoft.SqlServer.Management.Controls
    • Microsoft.SqlServer.Management.UserSettings
    • Microsoft.SqlServer.RegSvrEnum
    • SqlWorkbench.Interfaces

    There is not a redistributable package containing these assemblies.  The supported way is to install the matching SSMS package on the target system.   SSMS can be installed separately using the following link: https://msdn.microsoft.com/en-us/library/mt238290.aspx?f=255&MSPPError=-2147217396  

    Bob Dorr - Principal SQL Server Escalation Engineer



    In SQL Server 2014 and above, you can create memory optimized tables with the In-Memory OLTP feature.  When you use this feature, SQL Server actually generates native code to optimize performance.  As a result, there will be dll and pdb files plus other intermediate files.  In fact, native compilation is one of the three pillars of its high performance.  The other two are the lock-free/latch-free implementation and optimizing for memory-resident data (no buffer pool handling).

    Each natively compiled stored procedure or memory optimized table has a separate set of files generated.  These are managed by SQL Server, and normally you don’t need to worry about them.  But we recently got a report from a customer who received the following error when starting their database:

    "Msg 41322, Level 16, State 13, Line 0
    MAT/PIT export/import encountered a failure for memory optimized table or natively compiled stored procedure with object ID 214291823 in database ID 6. The error code was 0x80030070".

    The error 0x80030070 is the operating system error ERROR_DISK_FULL: “There is not enough space on the disk”.

    It turned out that the customer had lots of memory optimized objects (tables and stored procedures), which resulted in lots of generated files.

    Where do these files get stored?

    They are stored in the default data file location for the server instance.

    SQL Server always creates a subfolder like <default data file location>\xtp\<dbid> and stores the files there.  The file names follow the convention xtp_<p or t>_<dbid>_<objectid>.*.  For example, when I created a sample In-Memory OLTP database with just one memory optimized table named t, my instance of SQL Server generated the following files.

    image
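    For readers who want to reproduce this, a minimal sketch of the setup looks like the following. The database name, file path and column definitions are made up for illustration; only the table name t matches the example above:

    -- A MEMORY_OPTIMIZED_DATA filegroup and container are required before creating memory optimized tables.
    ALTER DATABASE imoltp_demo ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;
    ALTER DATABASE imoltp_demo ADD FILE (NAME = N'imoltp_dir', FILENAME = N'C:\Data\imoltp_dir')
        TO FILEGROUP imoltp_fg;
    GO

    -- One memory optimized table named t; creating it triggers native code generation (dll/pdb files).
    USE imoltp_demo;
    GO
    CREATE TABLE dbo.t
    (
        id  INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1024),
        val INT NULL
    ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
    GO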

     

    If you query sys.dm_os_loaded_modules, you will see the native dlls loaded; see the screenshot below.

    image
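    If you want to filter the list rather than scan the whole output, a query like this narrows it to the native In-Memory OLTP modules. The LIKE patterns simply assume the xtp_p/xtp_t naming convention described above:

    -- Show only the natively generated In-Memory OLTP dlls currently loaded.
    SELECT name, description
    FROM sys.dm_os_loaded_modules
    WHERE name LIKE N'%xtp_p_%'
       OR name LIKE N'%xtp_t_%';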

    Additionally, these files are deleted and recreated under the following conditions:

    1. SQL Server restarts
    2. The database is taken offline and brought back online
    3. A table or procedure is dropped and recreated

     

    How can I relocate these files?

    If you want these files stored in a different location, all you need to do is change the default data file location.  SQL Server Management Studio allows you to do that, but you will need to restart SQL Server after the change.  Once you do, the In-Memory OLTP related files will be generated in the new location.

    image
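    You can confirm where the files will land after the restart by checking the instance default data path. This sketch assumes a build that exposes the InstanceDefaultDataPath server property:

    -- The xtp\<dbid> subfolder is created under this path.
    SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath;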

     

    Jack Li |Senior Escalation Engineer | Microsoft SQL Server

    twitter| pssdiag |Sql Nexus


    11/09/15--07:10: Are My Statistics Correct?

    The question is often “Are my statistics up-to-date?” which can be a bit misleading.  I can make sure I have up-to-date statistics, but the statistics may not be accurate.

    I recently engaged in an issue where the statistics were rebuilt nightly.  A maintenance job change had been made, moving from FULLSCAN to WITH SAMPLE statistics creation/update, and it dramatically altered the statistical layout.  The underlying data was skewed, so the generated execution plans varied significantly.  Queries that ran in 1 minute now took over an hour to complete, using an alternate plan with significant memory grants and TEMPDB usage.

    As you can imagine, this issue has resulted in a series of DCR (design change request) asks from the product team.

    The dilemma we all run into is: what level of SAMPLED statistics is appropriate?  The answer is that you have to test, but that is not always feasible, and in the case of Microsoft CSS we generally don’t have historical histogram states to revisit.

    Microsoft CSS is often engaged to help track down the source of a poorly performing query.  A common step is to locate possible cardinality mismatches and study them more closely.  Reviewing the statistics dates, row modification counter(s), atypical parameter usage and the like are among the fundamental troubleshooting steps.

    The script below is one way Microsoft CSS may help determine the accuracy of the current statistics.  You can use similar techniques to check the accuracy of your SAMPLING choices or to store historical information.  The example loads a specific histogram for the ‘SupportCases’ table, then uses the key values and range information to determine the actual counts (as if FULLSCAN had been executed).  The final select over the captured data can be used to detect variations between the current actuals and the in-use histogram.

    create table #tblHistogram
    (
        vData sql_variant,
        range_rows bigint,
        eq_rows bigint,
        distinct_range_rows bigint,
        avg_range_rows bigint,
        actual_eq_rows bigint DEFAULT(NULL),
        actual_range_rows bigint DEFAULT(NULL)
    )
    go

    create procedure #spHistogram @strTable sysname, @strIndex sysname
    as
        dbcc show_statistics(@strTable, @strIndex) with HISTOGRAM
    go

    truncate table #tblHistogram
    go

    insert into #tblHistogram(vData, range_rows, eq_rows, distinct_range_rows, avg_range_rows)
        exec #spHistogram 'SupportCases', 'cix_SupportCases'
    go

    -- EQ_ROWS
    update #tblHistogram
        set actual_eq_rows = (select count(*) from SupportCases with(NOLOCK) where ServiceRequestNumber = h.vData)
        from #tblHistogram h;

    -- RANGE_ROWS
    with BOUNDS(LowerBound, UpperBound)
    as
    (
        select LAG(vData) over(order by vData) as [LowerBound], vData [UpperBound] from #tblHistogram
    )
    update #tblHistogram
        set actual_range_rows = ActualRangeRows
        from (select LowerBound, UpperBound,
                (select count(*) from SupportCases with(NOLOCK)
                    where ServiceRequestNumber > LowerBound and ServiceRequestNumber < UpperBound) as ActualRangeRows
              from BOUNDS
             ) as t
        where vData = t.UpperBound
    go

    select /*TOP 10 NEWID(),*/ vData, eq_rows, actual_eq_rows, range_rows, actual_range_rows
        from #tblHistogram
        where eq_rows <> actual_eq_rows or range_rows <> actual_range_rows
    --order by 1
    go

    To test the script, I leveraged UPDATE STATISTICS WITH SAMPLE 1 PERCENT against skewed data in my table.  This resulted in several steps of the histogram having a statistical variation of +200% from the actual (FULLSCAN) values.

    I continued to test variants of the SAMPLE percentage until the deviation from the actuals fell within an acceptable noise range.  For my data this was 65 PERCENT.  Sampling at 65 percent reduces the statistics creation/modification time while retaining the necessary statistical relevance.
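    For reference, the statements behind that test look like the following; the table and index names are the ones used in the script above, and the right SAMPLE value is whatever your own data tolerates:

    -- Sampled update: faster, but may miss skew in the data.
    UPDATE STATISTICS SupportCases cix_SupportCases WITH SAMPLE 65 PERCENT;

    -- Full scan: the baseline the sampled histogram is compared against.
    UPDATE STATISTICS SupportCases cix_SupportCases WITH FULLSCAN;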

    Bob Dorr - Principal SQL Server Escalation Engineer



    Have you seen this message before? We see our customers encounter this message while performing SQL Server installation. If there is a problem, you will normally get this message in the “Instance Configuration” page of the “new SQL Server Failover Cluster setup” sequence.

    Here is how the screen appears with the message at the bottom:

    image

    After you provide the SQL Server Network Name and instance name, you will click Next. At this point the setup program performs a few validations. If those validations fail, you will notice the error message at the bottom of the screen. If you click on the error message, you will see some additional information embedded at the end of the message which is not visible by default in this view. Here is an example:

    image

    In general you might encounter one of the following messages:

    The given network name is unusable because there was a failure trying to determine if the network name is valid for use by the clustered SQL instance due to the following error: 'The network address is invalid.'

    The given network name is unusable because there was a failure trying to determine if the network name is valid for use by the clustered SQL instance due to the following error: 'Access is denied.'

    The troubleshooting steps and resolution for these situations depend on what the last part of the error message indicates. Let’s take a quick look at how the setup program performs the validation of the network name. The setup program calls the Windows API NetServerGetInfo and passes two parameters: the network name that you typed in the setup screen and level 101. There are multiple possible outcomes from this API call:

    1. The API call returns OS error code 53 [The network path was not found]. This tells the setup program that the network name provided is good to use, since nobody on the network is currently using that same name. This is what you ideally want to happen, and setup can proceed to the next steps.

    2. The API call returns success. This tells the setup program that there is another computer active with this same name, and hence we cannot use the network name provided in the setup screen. This is essentially a duplicate name scenario. It is straightforward to resolve: provide a different name for setup to use.

    3. The API call returns other unexpected failure states like the following:

    RPC error code 1707 which translates to "The network address is invalid"
    Security error code 5 which translates to "Access is denied"

        These are the same error messages you actually get on the setup screen in the last part of that long error message. Now, let us review the steps you can take to troubleshoot these errors and resolve them.

    As a first step, you can isolate the issue to this specific API call and take SQL Server setup out of the picture. You can take the sample code for the Windows API NetServerGetInfo, build a console application, and pass the same network name as the parameter to the call. Observe which of the error codes discussed above is returned. You want to get back OS error 53, but you might be getting 1707 or 5 instead.

    If you now use Process Monitor to track the activity, you will notice a CreateFile call to \\SQL-VNN-TEST\PIPE\srvsvc encountering BAD NETWORK NAME or ACCESS DENIED.

    If you do not have the required permissions to create computer objects, make sure that the computer objects are pre-staged with the appropriate permissions as described in the document Failover Cluster Step-by-Step Guide: Configuring Accounts in Active Directory. Also validate that there is no stale entry in the DNS server pointing this network name to a different IP address. If possible, clean up all entries related to this network name from Active Directory and other name resolution servers such as DNS. It is a good idea to create fresh entries for this network name as described in the sections “Steps for prestaging the cluster name account” and “Steps for prestaging an account for a clustered service or application”.

    In the past when our networking team debugged this, they noticed that the error code changes (from 53 to 1707) while the network request flows through the various drivers in the network stack. RDBSS shows the correct error code, but by the time the request reaches MUP it gets changed to one of the incorrect error codes we finally encounter. Typically this happens when there is a filter driver sitting in the network stack that intercepts these calls and changes the return codes. So the next step is to review all processes and services running on the system and evaluate whether you can disable or remove the non-critical ones, at least during the installation or troubleshooting timeframe.

    Check whether the problem happens only for a specific name or for any network name you pass in for validation. This can help establish whether there is a generic network issue at play rather than a problem with looking up a specific network name.

    It would be great to hear from you if you have encountered this issue and which of the above steps you used to resolve it. Also, if there is something we have overlooked, please let us know so we can add it to this list.

    Thanks,

    Suresh Kandoth – SQL Server Escalation Services



    We had a customer who opened an issue with us and wanted to know the behavior of statistics during online index rebuild.  Specifically, he suspected that SQL Server might have used ‘incomplete’ statistics because his application uses read uncommitted isolation level.

    This type of question comes up frequently.  I thought I’d share my research and the answers I gave this customer so that readers can benefit as well.

    In order to answer the question more accurately, let’s be specific.  Let’s call index1’s statistics before the online index rebuild stats1, and the statistics after the rebuild stats2.  Furthermore, let’s call any incomplete stats produced during the rebuild stats3.  Now the question becomes: during an online index rebuild of index1 (started but not completed), which stats will a query compiled during the rebuild use (stats1, stats2 or stats3)?

    Here are a few key points that answer the above question:

    1. First of all, there is no stats3.  SQL Server never stuffs in-flight stats into the stats blob for use during an online index rebuild.  Even if you are doing dirty reads, you won’t get the non-existent stats3.
    2. During the online index rebuild, stats1 (the old stats) continues to be available for use until the very end.
    3. Stats2 (the new stats) is swapped in at the very end of the index rebuild.
    4. During the brief period when SQL Server switches to the new stats (stats2), no one can access the stats at all, even with read uncommitted isolation level.  This is because SQL Server acquires a schema modification lock at the very end of the online index rebuild to make the metadata changes, including the stats change.  Even under read uncommitted isolation, you still need a schema stability lock on the table, and you can’t get it while someone else holds the schema modification lock.  In short, you will never see anything in between; you either see before (stats1) or after (stats2).
    5. After the online index rebuild, all queries involving the table will recompile.
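    A quick way to see points 2 through 5 in action is to watch the statistics date around the rebuild. This is only a sketch; the table and index names are placeholders:

    -- Rebuild the index online; stats1 remains in use until the final metadata switch.
    ALTER INDEX index1 ON dbo.MyTable REBUILD WITH (ONLINE = ON);

    -- STATS_DATE only reflects stats2 once the rebuild has completed.
    SELECT s.name, STATS_DATE(s.object_id, s.stats_id) AS stats_last_updated
    FROM sys.stats AS s
    WHERE s.object_id = OBJECT_ID(N'dbo.MyTable');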

    What about Index reorg?

    REORG does nothing related to statistics updates; in other words, REORG doesn’t update the stats for the index at all.  I have posted a blog on this before.  In the interest of finding the impact of reorg on locks and recompiles, I did more research.  Reorg won’t cause your query to recompile, and it doesn’t hold a schema modification lock.  It requests a schema stability lock, which is much lighter weight.  Reorg does acquire and release X locks on pages or rows, but these have no effect on stats or on queries running under read uncommitted isolation.  In other words, your query under read uncommitted isolation will continue to run without any impact.  Reorg only helps with how the data is accessed physically.  No stats update, no recompile.
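    By contrast, a reorganize can be run the same way and the statistics date will not move, since REORG leaves the stats alone. Again a sketch with placeholder names:

    -- Reorganize compacts the index in place; no stats update, no recompile.
    ALTER INDEX index1 ON dbo.MyTable REORGANIZE;

    -- The stats_last_updated value returned here is unchanged by the reorganize.
    SELECT s.name, STATS_DATE(s.object_id, s.stats_id) AS stats_last_updated
    FROM sys.stats AS s
    WHERE s.object_id = OBJECT_ID(N'dbo.MyTable');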

    What is the duration of the schema modification lock?

    For an online index rebuild, the schema modification lock is held only very briefly, toward the end of the operation.  All it does while holding the lock is the metadata update.

    Jack Li |Senior Escalation Engineer | Microsoft SQL Server

    twitter| pssdiag |Sql Nexus



    Starting with SQL Server 2012, we support creating databases on a remote SMB file share without requiring a trace flag.  You can even configure a SQL Server cluster to use an SMB file share for databases. This is documented here. When creating or opening data or log files, SQL Server calls various file manipulation APIs, including a WIN32 API...



    I get to be the good news messenger today.    We have made changes to the SQL Server Client Provider.  The provider detects when multiple IP addresses are present for a listener.   The links below detail the behavior making it easier for your multi-subnet AlwaysOn deployments. Improved MultiSubnet Listener Behavior With Newly Released SQL Client...


    Microsoft is always seeking out ways to improve the customer experience and satisfaction.  A project that is currently active looks at the SQL Server incidents reported to Microsoft SQL Server Support and applies Machine Learning.   A specific aspect of the project is to predict when a case needs advanced assistance (escalation, onsite, development or upper...


    If you have seen enough query plans, you for sure ran into spool operators (index spool or table spool). It is documented in https://technet.microsoft.com/en-us/library/ms181032(v=sql.105).aspx The spool operator helps improve a query performance because it stores intermediate results so that SQL doesn’t have to rescan or re-compute for repeated uses.  Spool operator has many usage. For...


    While data for memory optimized tables resides in memory all the time with SQL Server 2014 and 2016’s In-Memory OLTP feature, we still need a means to cut down recovery time in case of crash or restart.  For disk based table, checkpoint flushes the dirty pages into data file(s).  With In-memory OLTP, there are separate set of...


    Even with SQL Server support for so many years, we still face something new almost every day.   Sometimes you will just have to combine things together to achieve what you need.  Here is an example due to troubleshooting a customer’s issue. A couple of months ago, we ran into a need to enable a trace...


    Microsoft is pleased to announce the release of (Transport Layer Security) TLS 1.2 support in all major client drivers and SQL Server releases. The updates made available on January 29th, 2016 provide TLS 1.2 support for SQL Server 2008, SQL Server 2008 R2, SQL Server 2012 and SQL Server 2014. The client drivers that have...
