Pages

Monday, October 11, 2010

SQL Compact Performance: PostSyncCleanup to turn off UpdateStatistics during initial download of SQL Compact Replication


SQL Compact Replication initial subscription may take significant time on devices, and the UpdateStatistics operation adds further overhead to the initial download.

Indicators: The Client Agent log may show the start and end times of the UpdateStatistics operation. For example:

    SQLCECA: 04/09/2009-12:21:26 UpdateStatistics Start app=<appName>.exe
    SQLCECA: 04/09/2009-12:27:59 UpdateStatistics End hr=0 app=<appName>.exe

Workaround: Recent hotfixes for SQL Compact 3.5 and 3.1 (KB article URLs given below) expose a new API that lets SQL Compact applications turn off UpdateStatistics during the initial download of Replication. Disabling UpdateStatistics is possible only for merge replication, not for RDA operations. Use:

    SqlCeReplication repl = new SqlCeReplication();
    ...
    repl.PostSyncCleanup = 2;

to make sure that update statistics is not run after the merge replication sync.

To use the new property in your SQL Compact app, in addition to applying the hotfix to the device, you will also need to deploy the updated System.Data.SqlServerCe.dll to the desktop folder where Visual Studio can find it; otherwise this error will occur:

    'PostSyncCleanup' is not a member of 'System.Data.SqlServerCe.SqlCeReplication'.

You can follow these steps to resolve the above error: Extract the SYSTEM~1.004 file from the device cab, rename it to "System.Data.SqlServerCe.dll" and replace it at the appropriate path. When working with SQL Server Compact 3.5 (Visual Studio 2008), that path is:

    "C:\Program Files\Microsoft SQL Server Compact Edition\v3.5\Devices\System.Data.SqlServerCe.dll"

If your project refers to "System.Data.SqlServerCe.dll" in another location, replace the DLL at that location as well.
Once the modified app runs on the device, if the hotfix bits were not deployed you may see the following error (as expected):

    "Method not found: set_PostSyncCleanup System.Data.SqlServerCe.SqlCeReplication"

Uninstalling SQL Compact 3.5 SP1 RTM and installing the hotfix .cab files should resolve the above error.

References:

http://support.microsoft.com/kb/963060
http://support.microsoft.com/kb/960142

Posted by: Syed

Plan guides (plan freezing) in SQL Server 2005/2008
SQL Server 2005

Plan guides were first introduced in SQL Server 2005. You may find detailed information about plan guides in SQL Server 2005 here:

General information
http://msdn.microsoft.com/en-us/library/ms187032(SQL.90).aspx

Description of sp_create_plan_guide
http://msdn.microsoft.com/en-us/library/ms179880(SQL.90).aspx

This feature can be useful when you want to force the optimizer to use a specific execution plan without changing the original query. In SQL Server 2005, however, you have to specify the statement exactly as it was submitted to SQL Server. Sometimes this can be difficult, since you have to capture a SQL Profiler trace and copy the SQL statement from there. There are also certain rules that apply: for example, the SQL parser ignores spaces and escape characters inside the query text but not at the end of it, which can add complexity to capturing the actual statement.

In SQL Server 2005, query statistics appear in DMVs together with the T-SQL statement text. To make this easier, you can create a plan guide based on the query statistics DMVs directly. Here is a script that demonstrates this.

NOTE: Before you execute the examples below, please make sure you have the AdventureWorks database installed and its compatibility level set to 90.
If you don't have the AdventureWorks database you can download it from: http://www.codeplex.com/SqlServerSamples

    use AdventureWorks
    go

    --- Cleaning cache for this sample
    dbcc freeproccache
    go

    --- Running query first time to get plan generated for freezing
    set statistics xml on
    exec sp_executesql N'SELECT COUNT(*) FROM [HumanResources].[Employee] e INNER JOIN [Person].[Contact] c ON c.[ContactID] = e.[ContactID]'
    set statistics xml off
    go

    --- Based on query pattern creating a plan guide - freezing plan
    declare @sql_text nvarchar(max),
            @sql_xml_plan nvarchar(max)

    select @sql_text = substring(sqt.text, (qs.statement_start_offset/2)+1,
               ((CASE qs.statement_end_offset WHEN -1 THEN DATALENGTH(sqt.text)
                 ELSE qs.statement_end_offset END - qs.statement_start_offset)/2) + 1),
           @sql_xml_plan = convert(nvarchar(max), sqp.query_plan)
    from sys.dm_exec_query_stats qs
         cross apply sys.dm_exec_sql_text(qs.sql_handle) sqt
         cross apply sys.dm_exec_query_plan(qs.plan_handle) sqp
    where text like '%Employee%'

    if @sql_text <> ''
    begin
        select @sql_text, @sql_xml_plan
        set @sql_xml_plan = 'OPTION(USE PLAN '''+@sql_xml_plan+''')'
        exec sp_create_plan_guide @name = N'MyPlan_Guide_1'
            , @stmt = @sql_text
            , @type = N'SQL'
            , @module_or_batch = NULL
            , @params = NULL
            , @hints = @sql_xml_plan
    end

You may check that the plan guide was created by querying the sys.plan_guides catalog view:

    select * from sys.plan_guides

Now execute the query again.

    --- This time we will see USEPLAN=1 and the plan guide name in the XML plan output
    set statistics xml on
    exec sp_executesql N'SELECT COUNT(*) FROM [HumanResources].[Employee] e INNER JOIN [Person].[Contact] c ON c.[ContactID] = e.[ContactID]'
    set statistics xml off

Click on the ShowPlanXML hyperlink. SQL Server Management Studio 2005 will show you the XML. Look at the <StmtSimple> and <QueryPlan> tags.
As you can see, SQL Server picked up the created plan guide and the USEPLAN option.

WARNING! You should be careful using plan guides in SQL Server 2005. If the metadata and/or data distribution has changed, the optimizer will no longer be able to use the plan guide, and the query will fail with the following error:

    Msg 8698, Level 16, State 0, Line 1
    Query processor could not produce query plan because USE PLAN hint contains plan that could not be verified to be legal for query. Remove or replace USE PLAN hint. For best likelihood of successful plan forcing, verify that the plan provided in the USE PLAN hint is one generated automatically by SQL Server for the same query.

To demonstrate this, disable the existing index that is used in the plan guide above

    ALTER INDEX IX_Employee_ManagerID ON HumanResources.Employee DISABLE
    GO

and try running the query again.

    set statistics xml on
    exec sp_executesql N'SELECT COUNT(*) FROM [HumanResources].[Employee] e INNER JOIN [Person].[Contact] c ON c.[ContactID] = e.[ContactID]'
    set statistics xml off

To clean up your server after this demonstration:

    --- Rebuild disabled index to enable it
    ALTER INDEX IX_Employee_ManagerID ON HumanResources.Employee REBUILD
    GO

    --- Drop plan guide
    EXEC sp_control_plan_guide N'DROP', N'MyPlan_Guide_1';

SQL Server 2008

In SQL Server 2008 the plan guides feature has been improved.
In addition to sp_create_plan_guide you can also use the sp_create_plan_guide_from_handle stored procedure:

Understanding plan guides
http://msdn.microsoft.com/en-us/library/ms190417(SQL.90).aspx

sp_create_plan_guide_from_handle
http://technet.microsoft.com/en-us/library/bb964726.aspx

So now you can create a plan guide based on the actual plan handle, without pulling out the T-SQL statement text. Here is the example:

    use AdventureWorks
    go

    --- Cleaning cache for this sample
    dbcc freeproccache
    go

    --- Running query first time to get plan generated for freezing
    set statistics xml on
    exec sp_executesql N'SELECT COUNT(*) FROM [HumanResources].[Employee] e INNER JOIN [Person].[Contact] c ON c.[ContactID] = e.[ContactID]'
    set statistics xml off
    go

    --- Based on query pattern creating a plan guide - freezing plan
    declare @plan_handle varbinary(1000)

    select @plan_handle = plan_handle
    from sys.dm_exec_query_stats qs
         cross apply sys.dm_exec_sql_text(qs.sql_handle) sqt
    where text like '%Employee%'

    select @plan_handle

    exec sp_create_plan_guide_from_handle 'MyPlan_Guide_1', @plan_handle = @plan_handle

As you can see, creating the plan guide is easier now; you may also simply copy and paste the plan handle from the DMV output and pass it to sp_create_plan_guide_from_handle manually.

Run the query again:

    set statistics xml on
    exec sp_executesql N'SELECT COUNT(*) FROM [HumanResources].[Employee] e INNER JOIN [Person].[Contact] c ON c.[ContactID] = e.[ContactID]'
    set statistics xml off
    go

Click on the ShowPlanXML hyperlink. In SQL Server 2008, SQL Server Management Studio will take you directly to the graphical execution plan automatically. Then right-click on the execution plan page and choose Properties.

To clean up your server after this demonstration:

    --- Drop plan guide
    EXEC sp_control_plan_guide N'DROP', N'MyPlan_Guide_1';

Another difference in SQL Server 2008 from SQL Server 2005 is the optimizer behavior if the metadata has been
changed. In that case the plan guide can no longer be used, but the SQL Server 2008 optimizer will silently skip it and continue executing the query with whatever execution plan it considers best. This makes plan guides more robust and less dangerous to use than in SQL Server 2005.

To monitor plan guide behavior you may use two new SQL Profiler events in SQL Server 2008: Plan Guide Successful and Plan Guide Unsuccessful.

Posted by: Sergey Pustovit – SQL Server Escalation Services

Tools of the Trade: Part IV - Developing WinDbg Extension DLLs
A WinDbg extension DLL is a set of exported callback functions that implement user-defined commands to extract specific, customized information from memory dumps. Extension DLLs are loaded by the debugger engine and can automate tasks while performing user-mode or kernel-mode debugging.

An extension DLL may export any number of functions that are used to execute extension commands. Each function is explicitly declared as an export in the DLL's module-definition (.def) file, and function names must be in lowercase letters.

A WinDbg (DbgEng) extension DLL must export DebugExtensionInitialize. This is called when the DLL is loaded, to initialize the DLL; it may be used by the DLL to initialize global variables. An extension DLL may export an optional function DebugExtensionUninitialize; if this is exported, it will be called just before the extension DLL is unloaded. An extension DLL may export DebugExtensionNotify; if this is exported, it will be called when a session begins or ends, and when a target starts or stops executing. These notifications are also provided to IDebugEventCallbacks objects registered with a client. An extension DLL may export KnownStructOutput; if this is exported, it will be called when the DLL is loaded. This function returns a list of structures that the DLL knows how to print on a single line, and it may be called later to format instances of these structures for printing.

So, how do you develop your own WinDbg extension DLL? Let's follow these steps:

1. Download and install Debugging Tools for Windows from http://www.microsoft.com/whdc/devtools/debugging/installx86.Mspx

2. Create a "Win32 Console Application" project in VS 2008.

3. Select "DLL" as the application type and click "Finish".

4. Add a "Module-Definition File (.def)" called "wdbrowser" to the project. One way to export your extension function is by specifying the function names in the EXPORTS section of the .def file.
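As a sketch, the module-definition file for this example might look like the following (the export names are taken from the functions implemented later in these steps; only the extension command name must be lowercase):

```
; wdbrowser.def - exports for the sample WinDbg extension
LIBRARY wdbrowser
EXPORTS
    WinDbgExtensionDllInit
    ExtensionApiVersion
    dvf3
```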
You may also use other ways of exporting functions, such as __declspec(dllexport).

5. Configure the project's "Additional Include Directories" to point to the header files that come with WinDbg. The default folder for x86 is "C:\Program Files\Debugging Tools for Windows (x86)\sdk\inc".

6. Configure the project's "Additional Library Directories" to point to the library files that come with WinDbg. The default folder for x86 libraries is "C:\Program Files\Debugging Tools for Windows (x86)\sdk\lib\i386".

7. The library "dbgeng.lib" (with "dbgeng.dll") contains the implementation of the debug engine's exported functions, so add "dbgeng.lib" to "Additional Dependencies".

8. Add the name of the module-definition file created in step #4 above.

9. Now include the following required header files in "stdafx.h":

    #include <windows.h>
    #include <imagehlp.h>
    #include <wdbgexts.h>
    #include <dbgeng.h>
    #include <extsfns.h>

10. Declare the following two global variables in your extension project's main implementation file:

    //Version.
    EXT_API_VERSION g_ExtApiVersion = {1,1,EXT_API_VERSION_NUMBER,0};
    WINDBG_EXTENSION_APIS ExtensionApis = {0};

11. Declare the following debug engine COM interface pointers:

    IDebugAdvanced2*  gAdvancedDebug2=NULL;
    IDebugControl4*   gDebugControl4=NULL;
    IDebugControl*    gExecuteCmd=NULL;
    IDebugClient*     gDebugClient=NULL;

12. The next step is to declare and implement the WinDbgExtensionDllInit function in your DLL's main implementation source file - in this example, "wdbrowser.cpp". WinDbgExtensionDllInit is the first function that WinDbg will call, so it is the ideal place for any extension-specific initialization. Please refer to http://msdn.microsoft.com/en-us/library/cc267872.aspx for more details about this function.
    VOID WDBGAPI WinDbgExtensionDllInit(PWINDBG_EXTENSION_APIS lpExtensionApis, USHORT usMajorVersion, USHORT usMinorVersion)
    {
        ExtensionApis = *lpExtensionApis;
        HRESULT hResult = S_FALSE;

        if ((hResult = DebugCreate(__uuidof(IDebugClient), (void**) &gDebugClient)) != S_OK)
        {
            dprintf("Acquiring IDebugClient* failed\n\n");
            return;
        }
        if ((hResult = gDebugClient->QueryInterface(__uuidof(IDebugControl), (void**) &gExecuteCmd)) != S_OK)
        {
            dprintf("Acquiring IDebugControl* failed\n\n");
            return;
        }
        if ((hResult = gDebugClient->QueryInterface(__uuidof(IDebugAdvanced2), (void**) &gAdvancedDebug2)) != S_OK)
        {
            dprintf("Acquiring IDebugAdvanced2* failed\n\n");
            return;
        }
        if ((hResult = gDebugClient->QueryInterface(__uuidof(IDebugControl4), (void**) &gDebugControl4)) != S_OK)
        {
            dprintf("Acquiring IDebugControl4* failed\n\n");
            return;
        }
    }

(Note the extra parentheses around each assignment: without them, != binds more tightly than =, and hResult would receive the comparison result rather than the HRESULT.)

13. Declare another exported function, ExtensionApiVersion, to report the version of your extension to WinDbg. Please refer to http://msdn.microsoft.com/en-us/library/cc267873.aspx for detailed information about this function.

    LPEXT_API_VERSION WDBGAPI ExtensionApiVersion (void)
    {
        return &g_ExtApiVersion;
    }

14. Define the debug engine's interface pointers so that your extension module can interact with the debug engine.
For more information please refer to:

http://msdn.microsoft.com/en-us/library/cc265976.aspx - IDebugClient
http://msdn.microsoft.com/en-us/library/cc266102.aspx - IDebugControl
http://msdn.microsoft.com/en-us/library/cc265957.aspx - IDebugAdvanced

    IDebugAdvanced2* gAdvancedDebug2=NULL;
    IDebugControl4* gDebugControl4=NULL;
    IDebugControl* gExecuteCmd=NULL;
    IDebugClient* gDebugClient=NULL;

15. The next step is to implement the debug engine's callback interface IDebugOutputCallbacks. The debug engine calls your implementation of IDebugOutputCallbacks::Output() with the output produced by the commands that your extension function executes. Refer to http://msdn.microsoft.com/en-us/library/cc265716.aspx for detailed information about IDebugOutputCallbacks::Output().

16. Add the following class, which inherits from the IDebugOutputCallbacks interface, in a header file:

    #ifndef __OUT_HPP__
    #define __OUT_HPP__

    #include <string>
    #include <sstream>

    class StdioOutputCallbacks : public IDebugOutputCallbacks
    {
    private:
        //
        // This buffer holds the output from the command execution.
        //
        std::string m_OutputBuffer;

    public:
        void InitOutPutBuffer();
        std::string GetOutputBuffer()
        {
            return m_OutputBuffer;
        };
        void ClearOutPutBuffer()
        {
            m_OutputBuffer = "";
        };
        STDMETHOD(QueryInterface)(
            THIS_
            IN REFIID InterfaceId,
            OUT PVOID* Interface
            );
        STDMETHOD_(ULONG, AddRef)(
            THIS
            );
        STDMETHOD_(ULONG, Release)(
            THIS
            );
        // IDebugOutputCallbacks.
        STDMETHOD(Output)(
            THIS_
            IN ULONG Mask,
            IN PCSTR Text
            );
    };

    extern StdioOutputCallbacks g_OutputCb;

    #endif // #ifndef __OUT_HPP__

17. Add the following code, which implements the IDebugOutputCallbacks interface methods, especially Output():

    #include "stdafx.h"
    #include <stdio.h>
    #include <windows.h>
    #include <dbgeng.h>
    #include "OutputCallBack.h"

    StdioOutputCallbacks g_OutputCb;

    STDMETHODIMP
    StdioOutputCallbacks::QueryInterface(
        THIS_
        IN REFIID InterfaceId,
        OUT PVOID* Interface
        )
    {
        *Interface = NULL;
        if (IsEqualIID(InterfaceId, __uuidof(IUnknown)) ||
            IsEqualIID(InterfaceId, __uuidof(IDebugOutputCallbacks)))
        {
            *Interface = (IDebugOutputCallbacks *)this;
            AddRef();
            return S_OK;
        }
        else
        {
            return E_NOINTERFACE;
        }
    }

    STDMETHODIMP_(ULONG)
    StdioOutputCallbacks::AddRef(
        THIS
        )
    {
        // This class is designed to be static so
        // there's no true refcount.
        return 1;
    }

    STDMETHODIMP_(ULONG)
    StdioOutputCallbacks::Release(
        THIS
        )
    {
        // This class is designed to be static so
        // there's no true refcount.
        return 0;
    }

    STDMETHODIMP
    StdioOutputCallbacks::Output(
        THIS_
        IN ULONG Mask,
        IN PCSTR Text
        )
    {
        UNREFERENCED_PARAMETER(Mask);
        m_OutputBuffer += Text;
        return S_OK;
    }

    void StdioOutputCallbacks::InitOutPutBuffer()
    {
        m_OutputBuffer.erase();
    }

18. Add the implementation of your extension function. In this example, we implement an extension that displays the names and types of the variables in frame 2 of the current thread. The implementation is:

    DECLARE_API (dvf3)
    {
        //
        // Install output callbacks.
        //
        if ((gDebugClient->SetOutputCallbacks((PDEBUG_OUTPUT_CALLBACKS)&g_OutputCb)) != S_OK)
        {
            dprintf("*****Error while installing output callbacks.*****\n\n");
            return;
        }

        //
        // Since the frame count starts from index 0, we pass 2 to the
        // .frame command to select frame #2.
        //
        // Execute the command to switch to the 2nd frame.
        if (gExecuteCmd->Execute(DEBUG_OUTCTL_THIS_CLIENT |   // Send output only to output callbacks
                                 DEBUG_OUTCTL_OVERRIDE_MASK |
                                 DEBUG_OUTCTL_NOT_LOGGED,
                                 ".frame 2",
                                 DEBUG_EXECUTE_DEFAULT) != S_OK)
        {
            dprintf("Executing .frame 2 failed\n");
            return;
        }

        // Execute the command to dump the local variables and parameters.
        if (gExecuteCmd->Execute(DEBUG_OUTCTL_THIS_CLIENT |   // Send output only to output callbacks
                                 DEBUG_OUTCTL_OVERRIDE_MASK |
                                 DEBUG_OUTCTL_NOT_LOGGED,
                                 "dv /i /t /v",
                                 DEBUG_EXECUTE_DEFAULT) != S_OK)
        {
            dprintf("Executing dv /i /t /v failed\n");
            return;
        }

        dprintf("***** Extracting locals & formal params from frame 2 *****");
        dprintf("\n%s\n", g_OutputCb.GetOutputBuffer().c_str());
    }

19. Re-build the project and copy the DLL from the release folder to a folder where WinDbg looks for extension DLLs. On an x86 machine the default location is "<drive letter>\Program Files\Debugging Tools for Windows (x86)\winext".

20. The extension is now ready to use or test.

21. Start WinDbg and open a full user-mode dump. Type .load myextension and hit Enter to load the extension DLL into the WinDbg process space.

22. Run the .chain command to verify that your extension was loaded by WinDbg. You will see output similar to the below output if your extension is loaded.

23. Type !dvf3 to run the extension function, which extracts and displays the variable names and types from frame 2.

Additional references:

http://msdn.microsoft.com/en-us/library/cc265826.aspx - describes how to interact with the debug engine: I/O operations, memory access, and using symbols and source files.
http://www.codeplex.com/ODbgExt - Microsoft Open Debugger Extension for WinDbg

Happy developing debug engine extensions!
Posted By: Srini Gajjela & Enamul Khaleque (DSD-SQLDeveloper group @ Microsoft)

SQL Null: a set of posts on the null
Null is a special marker used in Structured Query Language (SQL) to indicate that a data value does not exist in the database. Introduced by the creator of the relational database model, E. F. Codd, SQL Null serves to fulfill the requirement that all true relational database management systems (RDBMS) support a representation of "missing information and inapplicable information". Codd also introduced the use of the lowercase Greek omega (ω) symbol to represent Null in database theory. NULL is also an SQL reserved keyword used to identify the Null special marker. Null has been the focus of controversy and a source of debate because of its associated three-valued logic (3VL), special requirements for its use in SQL joins, and the special handling required by aggregate functions and SQL grouping operators. Although special functions and predicates are provided to properly handle Nulls, opponents feel that resolving these issues introduces unnecessary complexity and inconsistency into the relational model of databases. Since Null is not a member of any data domain, it is not considered a "value", but rather a marker (or placeholder) indicating the absence of value. Because of this, comparisons with Null can never result in either True or False, but always in a third logical result, Unknown. In my next few posts, I will explore the null.
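The three-valued logic described above can be sketched outside SQL. Here is a minimal Python model (an illustration only, not a SQL engine) using None to stand in for Unknown, showing why a comparison involving Null yields neither True nor False:

```python
# Minimal model of SQL's three-valued logic (3VL).
# Python's None stands in for the Unknown truth value.

NULL = None  # marker for a missing value


def eq3(a, b):
    """SQL-style equality: any comparison with NULL is Unknown (None)."""
    if a is NULL or b is NULL:
        return None
    return a == b


def and3(p, q):
    """3VL AND: False dominates, then Unknown, then True."""
    if p is False or q is False:
        return False
    if p is None or q is None:
        return None
    return True


print(eq3(NULL, NULL))    # Unknown (None) - not True!
print(eq3(1, 1))          # True
print(and3(True, None))   # Unknown (None)
print(and3(False, None))  # False: the AND can never become true
```

This is exactly why `WHERE col = NULL` matches no rows in SQL and `IS NULL` exists as a separate predicate.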

Tools of the Trade: Part II – The Repro
In this series of posts, we're going to take a look at some of the tools we at Microsoft Developer Support -- Database commonly use, how to use them, and the types of issues they can be used to resolve.

This post might seem like a bit of a stretch. For one thing, it's not about a SQL-related tool. That's not such a big deal -- there are plenty of tools that we use that have applications beyond SQL, and you can expect more posts in this series about tools with applications far beyond troubleshooting databases and database connectivity. However, this isn't about a specific, pre-existing tool either. Call it a meta-tool, I guess. Pseudo-tool. Something along those lines. One way or the other, even if it isn't already built for you, a reproduction can be one of the most invaluable and time-saving tools you'll use to troubleshoot your most difficult issues.

The Repro

In the Product Support Services world at Microsoft, we refer to a simplified, stripped-down series of steps that lead to a specific behavior as a Repro. This may simply be a step-by-step recipe to recreate an error, or a simple application that recreates an unusual behavior. When we are working on an issue that is buried in a hulking production application, we often like to tackle it by attempting to recreate the issue outside of the in vivo environment. There are a lot of benefits to investing some time in developing a Repro.

It's probably pretty obvious why you would not want to troubleshoot in a production environment. I personally like to avoid debugging anything that generates revenue -- it just seems to ease the tension. There's more to it, though: not only do you avoid tinkering with your company's livelihood, a Repro also gives you the advantage of portability. Whatever technology you're working with, it is much, much easier to get the assistance of Subject Matter Experts if you can actually send them something to play with.
Another big benefit of the Repro is that it helps you to focus on the issue. This is particularly helpful when you need to use other tools to analyze the behavior you're dealing with. Whatever kind of tracing you're doing, you are bound to pick up extraneous information if you're running the trace on a production application. Unless you're very fond of wading through gigantic logs looking for clues, a Repro really helps to avoid getting lost in superfluous data.

Finally, a Repro can save you a huge amount of time. It can get irritating having to click through the same series of screens and options, over and over again, as you troubleshoot an issue in a full-scale application. Being able to make a change to the code or the environment and quickly test it dramatically improves the speed of the troubleshooting process. It also has the added benefit of keeping debugging from getting boring.

That's my case for investing the time in creating a Repro. The next question is: what is the best way to build one? Well, that's going to depend greatly on the issue and the environment.

One thing that I like to do is work from the inside out. In other words, if I know that a specific line of code is throwing an exception, I'll create an application with the smallest amount of code I can use to execute that line. If that doesn't reproduce the behavior, I'll add more code from the application around it, building up the Repro until it exhibits the behavior I'm looking for.

Another thing that I've found important is to attempt my Repro in an environment that's as similar to production as possible. I do not recommend attempting to reproduce an issue occurring in a data-access application running on Windows Server 2003 on your grandma's Mac. It'll be much easier to create a Repro on something that's as close to the original scenario as possible and then move it to other environments to see if the behavior is any different.
One of the Technical Leads on our team has a good tip for reproducing issues that occur intermittently. For example, say you find that every so often your application gets an error when attempting to connect to a data store. In this situation, it's probably worth a try to take the connection you're trying to make and wrap it in a tight loop. It'll be much easier to launch this and catch the failure than it will be to keep running and re-running a small repro that makes the connection a single time and exits. If you are troubleshooting an issue occurring in a web application, you may try a tool like Application Center Test to execute your repro many times simultaneously (see: this link for a discussion of using ACT).

Finally, I really recommend carefully documenting your repro. Again, it makes it much easier to get the help of others, and it will also prevent you from forgetting what you had done on Friday after a long weekend. Even if you really cannot find a way to reproduce the issue in a stripped-down environment, documenting the steps an end user takes when the issue occurs can be valuable for clarifying the problem statement and focusing your efforts.

Posted By: Justin Trobec
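The tight-loop tip above can be sketched as follows. This is a generic illustration, not tied to any particular driver; flaky_connect is a hypothetical stand-in for whatever intermittent operation (such as opening a database connection) you are chasing:

```python
# Wrap an intermittently failing operation in a tight loop so the
# failure is caught quickly, along with the attempt number on which
# it occurred. flaky_connect() is a hypothetical stand-in for the
# real operation (e.g. opening a connection to a data store).
import random

random.seed(7)  # deterministic for the sake of the example


def flaky_connect():
    # Fails roughly 1 time in 20, like an intermittent production error.
    if random.random() < 0.05:
        raise ConnectionError("intermittent failure")


def repro(attempts=10000):
    """Hammer the operation until it fails; report which attempt failed."""
    for i in range(1, attempts + 1):
        try:
            flaky_connect()
        except ConnectionError as e:
            return i, str(e)  # caught it: this is the moment to debug
    return None, "no failure observed"


attempt, message = repro()
print(f"failure on attempt {attempt}: {message}")
```

Running the loop once catches the error far faster than launching a single-shot repro over and over by hand, and the attempt count gives a rough feel for the failure rate.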

How to enable TDS parser to display TDS frames when SQLServer is listening on port other than default 1433
If you view a netmon trace in Netmon 3.4, you will see that TDS traffic is nicely parsed for you. You will see a display similar to this:

The parsing works nicely because SQL Server is listening on the default TCP port, 1433. But if your SQL Server is listening on a different port (other than 1433), this TDS parsing won't work by default. You will see a display similar to this (no frames resolving to TDS):

To enable the parser to parse TDS traffic on a port other than 1433 (in this case, 63959), we need to take the following steps:

1. Click on the "Parsers" tab in the Netmon UI. This will display the list of installed parsers. You will see tcp.npl and tds.npl along with several other parsers.

2. Double-click on tcp.npl and search for "1433". You will land in a switch/case code block saying "case 1433". We basically need to include our port, 63959, here. Just add a case statement above "case 1433" without any parsing code:

    case 63959:
    case 1433:
    //TDS-parsing code goes here

This is what it looks like after the change:

3. Save your changes.

4. Reload your netmon trace; now it should look like the following. With the change, the TDS parser will resolve TDS traffic on this particular non-default port (63959).

Author: Enamul (MSFT), SQL Developer Technical Lead

SQL 2005 SP2 is available now!
Finally, after rounds of CTPs (Community Technology Previews) and great feedback from customers, we have officially released Service Pack 2 (SP2) for SQL 2005.

Download SP2
You can download it by visiting the Microsoft website here.

Enhancements in SQL 2005 SP2
SP2 provides several new features and improvements. You can learn about them here.

Download SQL 2005 SP2 sample databases
You can download the sample databases and learn about their enhancements by visiting here.

Download Books Online refresh for SP2
Books Online gets updated with every release of a service pack. Please download the latest version by visiting here.

As with every service pack, please make sure that you first test it in your test and QA environments before putting it on a production server.

Posted By: Sanchan Saxena

List all SQL Agent jobs and the owner of those jobs
A new article has been posted in the wiki.

After Installation of .NET Framework 3.5 SP1 (or .Net framework 2.0 SP2) , SSIS packages using ODBC 3rd party drivers may fail
The Issue: In the last few months, we have seen quite a few cases where, after installing .NET Framework 3.5 SP1 (KB http://support.microsoft.com/kb/951847), SSIS packages that use certain 3rd party ODBC drivers to import character data fail with the error below.

Error message:

[DataReader Source [<number>]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "component "DataReader Source" (<number>)" failed because error code 0x80131937 occurred, and the error row disposition on "output column "ColumnName" (<number>)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

The error messages may vary depending on whether you are using the DataReader Source in Visual Studio 2005 or the ADO .Net Source in Visual Studio 2008:

[ADO NET Source [<number>]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "component "ADO NET Source" (<number>)" failed because error code 0x80131937 occurred, and the error row disposition on "output column "ColumnName" (<number>)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

Error 0xc0209029: Data Flow Task: The "component "<ComponentName>" (<number>)" failed because error code 0x80131937 occurred...

[ADO NET Source [<number>]] Error: The component "ADO NET Source" (<number>) was unable to process the data. Pipeline component has returned HRESULT error code 0xC0209029 from a method call.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "ADO NET Source" (<number>) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with more information about the failure.

NOTE: Not all ODBC drivers are affected by this problem. It primarily affects SSIS packages that import character data using certain 3rd party ODBC drivers, through the ADO.Net connection manager and the System.Data.Odbc managed provider.

In Visual Studio 2005: SSIS DataReader Source --> ADO.Net connection manager --> .Net Providers \ ODBC Data Provider (System.Data.Odbc) --> vendor ODBC driver --> character data

In Visual Studio 2008: ADO .Net Source --> ADO.Net connection manager --> .Net Providers \ ODBC Data Provider (System.Data.Odbc) --> vendor ODBC driver --> character data

Cause: The problem started with .NET Framework 2.0 SP2, after a change was made in the way OdbcDataReader checks for NULLs in SequentialAccess mode: it now calls SQLGetData with a BufferLength argument of 0 for variable-width data types. Certain ODBC drivers are not compatible with this pattern. For more information on this function and its arguments, see:

SQLGetData Function
http://msdn.microsoft.com/en-us/library/ms715441.aspx

.NET Framework 2.0 SP2 is also included in the .NET Framework 3.5 SP1 components.

Microsoft is still investigating the issue and understands the difficult impact that this change has on customers. Determining the full impact of the issue (for example, all the data types that can cause it) is still a work in progress. You can track the progress from the Connect site below:

http://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=416455

In the meantime, we suggest that you use one of the workarounds mentioned below.

Workaround #1 (recommended): Modify the SSIS package to redirect the DataReader (or ADO .Net Source in Visual Studio 2008) error output to a text file, thereby allowing the package to succeed even though no rows are written out to the error output file.

A. Go to the Data Flow Task.
B. On DataReader Source Properties -> "Input and Output Properties" tab -> DataReader Output -> Output Columns.
C. Select the columns (only those with character data types) and change the ErrorRowDisposition and TruncationRowDisposition properties to RD_RedirectRow.
D. Redirect the DataReader error output to a text file.
E. Repeat A-D for all Data Flow Tasks that fail with the above error.

Workaround #2: If you need a quick fix (because this is a production environment and you cannot afford to modify SSIS packages at this time), you may need to uninstall the components of .NET Framework 3.5 SP1 or .NET Framework 2.0 SP2, depending on which framework version you installed that caused the issue.

a. If your issue is caused by installing only .NET Framework 2.0 SP2 (for example, from the Microsoft download site http://www.microsoft.com/downloads/details.aspx?familyid=5B2C0358-915B-4EB5-9B1D-10E506DA9D0F&displaylang=en), then you can uninstall .NET Framework 2.0 SP2.

b. If your issue is caused by installing .NET Framework 3.5 SP1 (KB http://support.microsoft.com/kb/951847), remove the .NET Framework 3.5 SP1 components and bring the system back to the state it was in before the update. This process involves more than simply uninstalling .NET Framework 3.5 SP1, as discussed in the following blogs:

http://blogs.msdn.com/dougste/archive/2008/08/19/uninstalling-net-framework-3-5-sp1.aspx
http://blogs.msdn.com/astebner/archive/2008/08/01/8803442.aspx

Posted By: Azim Uddin & Enamul Khaleque

Dealing with very large SQL Compact database files
Working with very large SQL Compact files, you may run into two issues:

1. Manipulating a large SQL Compact database in the VS 2008 IDE requires a hotfix: http://support.microsoft.com/kb/968436

"Error message when you use the Visual Studio 2008 IDE to manipulate a SQL Server Compact 3.5 Database file which is larger than 128 Megabytes: 'The database file that is larger than the configured maximum database size'"

2. Upgrading a very large database from SQL Compact 3.1 to 3.5 requires another hotfix: http://support.microsoft.com/kb/971027

"FIX: Error message when you upgrade a very large database to SQL Server Compact 3.5: 'The database file is larger than the configured maximum database size. This setting takes effect on the first concurrent database connection only'"

Hotfix 3.5.5692.12 fixes a problem where a large SQL CE database upgrade fails with the error: "The database file is larger than the configured maximum database size. This setting takes effect on the first concurrent database connection only. [ Required Max Database Size (in MB; 0 if unknown) = <size>]"

Depending upon the file size and the resources available on the machine, the upgrade process for a very large database may consume significant memory. This is expected behavior. You may want to stop other applications to make room for the upgrade process to complete.
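Both errors trace back to the "Max Database Size" connection string property, which defaults to 128 MB. A minimal C# sketch of raising the limit when opening and upgrading a large database (file names below are placeholders; the value is in MB, with 4091 MB the maximum SQL Compact supports):

```csharp
using System.Data.SqlServerCe;

class LargeDbExample
{
    static void Main()
    {
        // Raise the limit above the 128 MB default. Per the error text,
        // the setting takes effect on the first concurrent connection only,
        // so pass it before the file is first opened.
        string connStr = "Data Source=MyLargeDb.sdf;Max Database Size=4091;";

        using (SqlCeConnection conn = new SqlCeConnection(connStr))
        {
            conn.Open();
        }

        // Upgrading a 3.1 file to 3.5: include the same property in the
        // destination connection string so the upgraded file is not
        // rejected for exceeding the default limit.
        SqlCeEngine engine = new SqlCeEngine("Data Source=MyLargeDb31.sdf");
        engine.Upgrade("Data Source=MyLargeDb35.sdf;Max Database Size=4091;");
    }
}
```

Raising the limit in the connection string is independent of the hotfixes above; the hotfixes address cases where the correct size is not honored during IDE manipulation or upgrade.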

SQL Server 2008 Service Pack 2 Released
A new blog post has been posted by SQLDenis on Sep 29, 2010 in category Microsoft SQL Server Admin.
