Feed aggregator

A faster CHECKDB – Part III

SQL Server SQL CSS - Mon, 11/10/2014 - 08:33

Bob Ward introduced Part 1 and Part 2 of ‘A faster CHECKDB’ in the following posts.

Part 1: http://blogs.msdn.com/b/psssql/archive/2011/12/20/a-faster-checkdb-part-i.aspx 
Part 2: http://blogs.msdn.com/b/psssql/archive/2012/02/23/a-faster-checkdb-part-ii.aspx 

Recently, Jonathan pointed out a memory grant issue in the following post.

https://www.sqlskills.com/blogs/jonathan/dbcc-checkdb-execution-memory-grants-not-quite-what-you-expect/

I always enjoy my interactions with Jonathan, and this was yet another positive experience for us all.  After digging into it I found a bug, which was corrected in the SQL Server 2014 release.

The heart of the matter is a cardinality problem for the estimated number of fact rows.  The cardinality estimate drives a large portion of the memory grant size calculation for the DBCC check commands.  As Jonathan outlines in his post, the overestimate is often unnecessary and reduces the overall performance of the DBCC check operation.

The checkdb/checktable component responsible for returning the number of fact rows (cardinality) for each object mistakenly returned the size of the object, in bytes, as the number of rows.

The following example shows 10,000 rows, requiring 182,000 bytes on disk.
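As a hedged sketch, a table of this shape can be built and measured with sp_spaceused (the table name and load pattern are illustrative, not the exact repro from the post; byte counts will vary somewhat):

```sql
-- Illustrative only: a 10,000-row table whose on-disk size is far larger than its row count.
CREATE TABLE dbo.demo (c1 INT NOT NULL, c2 CHAR(10) NOT NULL);

INSERT INTO dbo.demo (c1, c2)
SELECT TOP (10000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)), 'x'
FROM sys.all_objects a CROSS JOIN sys.all_objects b;

-- "rows" vs. "data" in the output shows the two numbers the DBCC code confused.
EXEC sp_spaceused 'dbo.demo';
```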

Prior to SQL Server 2014, the SQL Server code would return a cardinality estimate based on 182,000 instead of 10,000.  As you can see, this is a significant row-estimate variance.

If you capture the query_post_execution_showplan (or query_pre_execution_showplan) extended event, you can see the check-index plan used by the DBCC check operation.
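As a sketch, an Extended Events session to capture these plans might look like the following (session name, target file, and table name are illustrative; the post-execution showplan event is expensive, so use a test system):

```sql
-- Illustrative session to capture actual showplans while a DBCC check runs.
CREATE EVENT SESSION [checkdb_plans] ON SERVER
ADD EVENT sqlserver.query_post_execution_showplan
ADD TARGET package0.event_file (SET filename = N'checkdb_plans.xel');
GO
ALTER EVENT SESSION [checkdb_plans] ON SERVER STATE = START;
GO
DBCC CHECKTABLE ('dbo.demo') WITH NO_INFOMSGS;  -- hypothetical table
GO
ALTER EVENT SESSION [checkdb_plans] ON SERVER STATE = STOP;
```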

Shown below are plan excerpts from SQL Server 2012 and SQL Server 2014, using an EMPTY table.  Notice the SQL Server 2012 estimate is near 2 pages in size (8192 * 2), while for an empty table SQL Server produces only 3 total facts related to allocation state.

SQL 2012

<StmtSimple StatementEstRows="129.507" StatementOptmLevel="FULL"

          <QueryPlan DegreeOfParallelism="0" MemoryGrant="33512" NonParallelPlanReason="MaxDOPSetToOne" CachedPlanSize="24" CompileTime="0" CompileCPU="0" CompileMemory="128">

  <RelOp NodeId="1" PhysicalOp="Sort" LogicalOp="Sort" EstimateRows="16772"

       <RunTimeInformation>

                       <RunTimeCountersPerThread Thread="0" ActualRows="3" ActualEndOfScans="1" ActualExecutions="1" />

SQL 2014

<StmtSimple StatementEstRows="10" StatementOptmLevel="FULL"

          <QueryPlan DegreeOfParallelism="0" MemoryGrant="1024" NonParallelPlanReason="MaxDOPSetToOne" CachedPlanSize="24" CompileTime="0" CompileCPU="0" CompileMemory="128">

  <RelOp NodeId="1" PhysicalOp="Sort" LogicalOp="Sort" EstimateRows="9"             

       <RunTimeInformation>

                       <RunTimeCountersPerThread Thread="0" ActualRows="3" ActualEndOfScans="1" ActualExecutions="1" />


A more dramatic difference comes from a test I ran, without the fix, against a 1.3 trillion row table.  The estimated rows were 900 trillion, with a memory grant size of 90 GB.


Prior to SQL Server 2014, you can follow Jonathan’s advice and limit the memory available to the DBCC check using Resource Governor, or move to SQL Server 2014 to execute your DBCC check operations faster.
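Jonathan’s Resource Governor approach might be sketched as follows (the pool, workload group, classifier function, and login names are all hypothetical, not taken from his post; Resource Governor requires Enterprise edition):

```sql
-- Illustrative sketch: cap memory grants for a login dedicated to DBCC work.
USE master;
GO
CREATE RESOURCE POOL DbccPool WITH (MAX_MEMORY_PERCENT = 100);
CREATE WORKLOAD GROUP DbccGroup
    WITH (REQUEST_MAX_MEMORY_GRANT_PERCENT = 10)  -- limit each grant to 10% of the pool
    USING DbccPool;
GO
CREATE FUNCTION dbo.DbccClassifier() RETURNS SYSNAME
WITH SCHEMABINDING
AS
BEGIN
    -- Route the hypothetical maintenance login into the capped group.
    RETURN CASE WHEN SUSER_SNAME() = N'dbcc_login' THEN N'DbccGroup' ELSE N'default' END;
END;
GO
ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.DbccClassifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```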

Bob Dorr - Principal SQL Server Escalation Engineer

Categories: SQL Server MS Blogs

Database Configuration Management for SQL Server

Simple-Talk on SQL - Tue, 11/04/2014 - 17:00

It is not just the rapid and painless testing, deployment and update of databases that requires care in the retention and management of configuration information. Configuration information is also essential for audit, resilience, and support. The range of documentation varies widely with the database and its setting, but the underlying principles remain the same. Without appropriate configuration management, automation is likely to be futile.

Questions About Using TSQL to Import Excel Data You Were Too Shy to Ask

Simple-Talk on SQL - Tue, 11/04/2014 - 17:00

It is easy to import Excel data into database tables via TSQL, using OLEDB, either by the OPENROWSET function or linking to the spreadsheet as a server. The problem is that there are certain things that aren't obvious that you need to know about, and you feel awkward about asking such simple questions.

The Oracle and Teradata connector V3.0 for SQL Server 2014 Integration Service is now available for download

SQL Server Release Blog - Tue, 11/04/2014 - 13:37
Dear Customers, The Oracle and Teradata connector V3.0 for SQL Server 2014 Integration Service is now available for download at the Microsoft Download Center. Microsoft SSIS Connectors by Attunity Version 3.0 is a minor release. It supports SQL...(read more)
Categories: SQL Server MS Blogs

SSIS 2012 Projects: Setup, Project Creation and Deployment

Simple-Talk on SQL - Mon, 11/03/2014 - 17:00

It used to be that SQL Server Integration Services (SSIS) packages had to be deployed individually. Now, they can be all deployed together from a single file by means of the Project Deployment Model introduced in SSIS 2012. Where there are tens or hundreds of SSIS packages to deploy, this system is essential. Feodor Georgiev talks us through the basics in the first of a three-part series.

Where in the Application Should Data Validation be Done?

Simple-Talk on SQL - Mon, 11/03/2014 - 17:00

Whereabouts in the application should the business logic of data-validation checks be made? The advantages of a layered approach to validation that includes database constraints, would seem to outweigh the disadvantages. William Sisson explains some interesting issues.

Can you restore from your backups? Are you sure?

Tibor Karaszi - Mon, 11/03/2014 - 10:52
A few days ago, we were doing restore practice drills with a client. I had tested the stuff before this, so the practice was more for the client's DBAs to test various restore scenarios, with me being able to point to the right direction (when needed),...(read more)

Ola Hallengrens Maintenance Solution now supports mirrored backup

Tibor Karaszi - Mon, 11/03/2014 - 06:27
You probably know that you can mirror a backup to several destinations, assuming you are on a supported edition (Enterprise or Developer). This is not the same as striping; you can compare striping to RAID 0, and mirroring to RAID 1. Ola now supports...(read more)

SQL Server MAX DOP Beyond 64 – Is That Possible?

SQL Server SQL CSS - Thu, 10/30/2014 - 11:14

I recently posted a blog outlining how the partitions of a table can be used in the calculation for the achievable max degree of parallelism (MAX DOP). http://blogs.msdn.com/b/psssql/archive/2014/09/04/a-partitioned-table-may-limit-the-runtime-max-dop-of-create-alter-index.aspx 

Discussing this with various peers, I uncovered a perception that SQL Server is always limited to a maximum of 64 CPUs, even if the machine has more (128, 160, …).  This is not the case.  The perception stems from how the documentation is worded, and once you understand it, maintenance operations can take advantage of more than 64 CPUs.

It is not hard to understand how the perception started or why it continues to propagate.

SQL Server Books Online states: “Setting maximum degree of parallelism to 0 allows SQL Server to use all the available processors up to 64 processors. “ and that is where most of us quit reading and assume the MAX DOP for SQL Server is limited to 64.

However, if you read a bit further: “If a value greater than the number of available processors is specified, the actual number of available processors is used.”

Simply stated: if you tell SQL Server to use more than 64 CPUs, it will attempt to do just that.
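As an illustrative sketch (the table name is hypothetical, and ONLINE = ON assumes Enterprise edition), an explicit MAXDOP above 64 on a large server looks like this:

```sql
-- On a 128-CPU server, MAXDOP 0 caps a single statement at 64 schedulers,
-- but an explicit higher value is honored.
ALTER INDEX ALL ON dbo.BigTable  -- hypothetical table
REBUILD WITH (MAXDOP = 128, ONLINE = ON);

-- The same applies as a query hint:
SELECT COUNT_BIG(*) FROM dbo.BigTable OPTION (MAXDOP 128);
```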

Bob Dorr - Principal SQL Server Escalation Engineer

Categories: SQL Server MS Blogs

SQL CLR assembly fails verification with “Unable to resolve token”

SQL Server SQL CSS - Wed, 10/22/2014 - 13:05

Recently we worked with a customer who has a SQL CLR assembly. The customer decided to upgrade from SQL Server 2008 R2 to SQL Server 2012, but the assembly failed to register with SQL Server and he received the following error:
Msg 6218, Level 16, State 2, Line 11
CREATE ASSEMBLY for assembly 'test3.5' failed because assembly 'test3.5' failed verification. Check if the referenced assemblies are up-to-date and trusted (for external_access or unsafe) to execute in the database. CLR Verifier error messages if any will follow this message [ : Test.Test::my_func][mdToken=0x6000001][offset 0x00000000] Unable to resolve token.

First of all, SQL Server 2008 R2 and SQL Server 2012 use different versions of the CLR: SQL Server 2008 R2 and below use CLR 2.0/3.5, while SQL Server 2012 was upgraded to use CLR 4.0.
What's interesting in this customer's case is that if they compiled the assembly using the 4.0 compiler, they could register the assembly with CREATE ASSEMBLY.
When we compared the IL generated, there was just one difference, involving a local variable.
For the assembly compiled for 2.0/3.5, you see ".locals init ([0] void& pinned pData)", but for the assembly compiled for 4.0, you see ".locals init ([0] native int& pinned pData)". See the ildasm screenshots below:
IL generated by 2.0 compiler


IL generated by 4.0 compiler



The IL in question is generated for code like fixed (void* pData = &buf[1024]). Basically, the intention is to pin the memory for a native call.

Cause

There are two changes in the CLR that cause CREATE ASSEMBLY to fail. First, the CLR 4.0 compiler no longer generates the IL "void& pinned" for code like fixed (void* pData = &buf[1024]); instead, it generates IL like .locals init ([0] native int& pinned pData). Additionally, the CLR 4.0 peverify code was updated and no longer recognizes that particular IL generated by the CLR 2.0 compiler. When you run CREATE ASSEMBLY, SQL Server has to run peverify to ensure the assembly passes verification. In this case, SQL Server 2012 uses the 4.0 peverify code to verify the assembly compiled with the 2.0 compiler, and therefore it fails.

Solution

There are two solutions.
The first option is to compile your assembly with the CLR 4.0 compiler, targeting the 4.0 framework. This is the best option because SQL Server 2012 uses CLR 4.0.
If you need your assembly to continue to target the 2.0/3.5 framework, you can use the 4.0 compiler but link against the 2.0 version of mscorlib.dll. Here is an example command:
C:\Windows\Microsoft.NET\Framework\v4.0.30319\csc.exe /nostdlib+ /noconfig /r:c:\Windows\Microsoft.NET\Framework\v2.0.50727\mscorlib.dll -unsafe -optimize -debug:pdbonly -target:library -out:test.dll test.cs

Repro

Step 1: Save the following code into test.cs

using System;
using System.Collections;
using System.Runtime.InteropServices;
using System.Text;
using System.Reflection;
using System.Reflection.Emit;
using Microsoft.Win32;

namespace Test
{
    unsafe public class Test
    {
        unsafe public delegate void my_Delegate(ushort comp,
                                                ushort func,
                                                void* ptr,
                                                uint length);
        public static my_Delegate delegate1;

        uint dataLength = 0;

        public void my_func(String objectId,
                            uint component,
                            uint method,
                            ushort level,
                            String caption,
                            uint offset,
                            int length,
                            byte[] buf)
        {
            fixed (void* pData = &buf[1024])
            {
                delegate1((ushort)component,
                          (ushort)method,
                          pData,
                          dataLength);
            }
        }
    }
}

Step 2: Compile the assembly

Compile using the following command:
C:\Windows\Microsoft.NET\Framework\v3.5\csc.exe -unsafe -optimize -debug:pdbonly -target:library -out:test3.5.dll test.cs

Step 3: CREATE ASSEMBLY

If you "create assembly asem from 'C:\repro2\test3.5.dll' with permission_set=unsafe", you will receive the above error.

Step 4: solution and workaround

However, the following two commands won't result in errors:
C:\Windows\Microsoft.NET\Framework\v4.0.30319\csc.exe -unsafe -optimize -debug:pdbonly -target:library -out:test4.0.dll test.cs
C:\Windows\Microsoft.NET\Framework\v4.0.30319\csc.exe /nostdlib+ /noconfig /r:c:\Windows\Microsoft.NET\Framework\v2.0.50727\mscorlib.dll -unsafe -optimize -debug:pdbonly -target:library -out:test.dll test.cs

Jack Li | Senior Escalation Engineer | Microsoft SQL Server Support

Categories: SQL Server MS Blogs

Hekaton in 1000 Words

Simple-Talk on SQL - Tue, 10/21/2014 - 17:00

The SQL Server 2014 In-Memory OLTP engine (a.k.a. Hekaton) is a radical change for relational databases. This article, an excerpt from Kalen Delaney's book "SQL Server Internals: In-Memory OLTP", provides a brief overview of what Hekaton is and why it's important.

Cumulative Update #4 for SQL Server 2014 RTM

SQL Server Release Blog - Tue, 10/21/2014 - 13:31
Dear Customers, The 4 th cumulative update release for SQL Server 2014 RTM is now available for download at the Microsoft Support site. To learn more about the release or servicing model, please visit: CU#4 KB Article: http://support.microsoft...(read more)
Categories: SQL Server MS Blogs

Report Builder of SQL Server 2008 R2 Service Pack 3 does not launch.

SQL Server Release Blog - Thu, 10/16/2014 - 17:55
Dear Customers we have discovered a problem with Report Builder that ships with SQL Server 2008 R2 Service Pack 3. If you installed SQL Server 2008 R2, have upgraded it to Service Pack 2 and then applied Service Pack 3, then Report Builder will...(read more)
Categories: SQL Server MS Blogs

The Mindset of the Enterprise DBA: Delegating Work

Simple-Talk on SQL - Thu, 10/16/2014 - 17:00

A lot of the routine jobs demanded of a DBA can be automated, but a tougher prospect is to automate these jobs in a way that the requestor, rather than the DBA, can actually set the job running themselves without compromising security and without risk. Is it true to say that some tasks can be made self-service? In the final part of his series, Joshua considers delegation.

Exploring Your SQL Server Databases with T-SQL

Simple-Talk on SQL - Wed, 10/08/2014 - 17:00

Most DBAs hoard their own favourite T-SQL scripts to help them with their work, often on a USB 'thumbdrive', but it is rare that one of them offers us a glimpse of the sort of scripts that they find useful. It is our good fortune that Scott Swanberg shows us those scripts he uses for discovering more about database objects.

Set-based Constraint Violation Reporting in SQL Server

Simple-Talk on SQL - Mon, 10/06/2014 - 17:00

When you're importing data into an RDBMS in bulk and an exception condition is raised because of a constraint violation, you generally need to fix the problem with the data and try again. The error won't tell you which rows are causing the violation. What if you have thousands of rows to search when it happens? There are solutions, writes William Sisson.

Questions about Primary and Foreign Keys You Were Too Shy to Ask

Simple-Talk on SQL - Sun, 10/05/2014 - 17:00

It is strange that one can ask simple questions about extended events or Hekaton at professional events and conferences without feeling embarrassed, yet nobody likes to ask vital questions about SQL Server primary keys and foreign keys. Once more, Rob Sheldon is 'drawn to one side' to answer those questions about keys that one is too shy to ask.

SQL Server 2008 Service Pack 4 has released.

SQL Server Release Blog - Tue, 09/30/2014 - 13:29
Dear Customers, Microsoft SQL Server Product team is pleased to announce the release of SQL Server 2008 Service Pack 4 (SP4). As part of our continued commitment to software excellence for our customers, this upgrade is free and doesn’t...(read more)
Categories: SQL Server MS Blogs

SQL Server 2008 R2 Service Pack 3 has released.

SQL Server Release Blog - Fri, 09/26/2014 - 13:41
Dear Customers, Microsoft SQL Server Product team is pleased to announce the release of SQL Server 2008 R2 Service Pack 3 (SP3). As part of our continued commitment to software excellence for our customers, this upgrade is free and doesn’t...(read more)
Categories: SQL Server MS Blogs

Improving the Quality of SQL Server Database Connections in the Cloud

Simple-Talk on SQL - Sun, 09/21/2014 - 17:00

To access SQL Server from the client, you use TDS protocol over TCP. This is fine over reliable LANs but over the internet these connections are relatively slow and fragile, TDS is still used to connect to databases in the cloud, but you need to use a combination of the new features such as connection pools and idle connection resiliency to make applications faster and more reliable.