
Fun and games with the Management Data Warehouse (MDW and Data Collectors)

The SQL Server Management Data Warehouse (MDW), when you first come across it, seems to promise so much if the verbiage from Microsoft and some other websites is to be believed. But when you install it you may find that it is not as useful as it could be. This is a shame, but we are currently only on v2 of the product with SQL Server 2012, so one hopes it will improve in subsequent versions.

However, it is probably worth playing with if you have never used it before - at least you can show your boss some reports on general server health when asked, if you have nothing else in place.

There is one big problem with it, though: if you decide that you don't want to use it any more, uninstalling it is not supported! Mad, I know. But, as usual, some very helpful people in the community have worked out what seems to me a pretty safe way of doing it.

I had a problem with my MDW. The data collector jobs were causing a lot of deadlocking on some production servers and impacting performance. It looks like there may be a workaround for this now, but due to time constraints I didn't have the opportunity to investigate further, so I disabled the associated SQL Agent jobs on the monitored servers. I thought I would revisit MDW in the future, as it looked unlikely that my department was going to buy a third-party application offering similar functionality.
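Disabling the collector jobs on a monitored server can be scripted with the standard msdb procedures. A minimal sketch - the exact job names vary by collection set, so the name used in the EXEC below is an assumed example; list them first and substitute your own:

```sql
-- List the data collector jobs on the monitored server
SELECT name, enabled
FROM msdb.dbo.sysjobs
WHERE name LIKE 'collection_set%';

-- Disable one of them; repeat for each collection/upload job found above.
-- The job name here is an assumed example - substitute a name from the query.
EXEC msdb.dbo.sp_update_job
     @job_name = N'collection_set_2_upload',
     @enabled  = 0;
```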

Some time later I noticed that the MDW database had grown very large, to about 136GB, and the server it was running on was struggling for space. This was odd, because I thought data wasn't being uploaded to it.
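A quick way to confirm how big the database has become is the standard sp_spaceused system procedure (assuming, as in this post, the database is called MDW):

```sql
USE [MDW];
-- Reports reserved, data, index and unused space for the whole database
EXEC sp_spaceused;
```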

There is a purge job installed by default, called mdw_purge_data_[MDW], but though it was running each day it didn't seem to clear out much data, so I couldn't shrink the file. Looking at the stored procedure run by this job, core.sp_purge_data, I saw that it takes some parameters, so I thought I would try executing it for each monitored instance. The parameters are:

@retention_days
@instance_name
@collection_set_uid
@duration

Making an educated guess that I could set @retention_days and @duration to 1, all I had to do was find the values for @instance_name and @collection_set_uid. To do so I ran this query against the MDW database:

--Show the databases that have been configured for MDW
SELECT DISTINCT [instance_name]
       ,[collection_set_uid]
FROM [MDW].[core].[snapshots]
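With an instance name and collection set UID from that query, a manual purge call looks something like this. The instance name and UID below are placeholders - substitute the values returned for your environment:

```sql
-- Sketch of a manual purge call; @instance_name and @collection_set_uid
-- are placeholder values - use the results of the query above
EXEC [MDW].[core].[sp_purge_data]
     @retention_days     = 1,
     @instance_name      = N'MYSERVER\MYINSTANCE',
     @collection_set_uid = '00000000-0000-0000-0000-000000000000',
     @duration           = 1;
```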

I used the results as parameters for the core.sp_purge_data stored procedure, but even though it appeared to delete several thousand rows, not much space was released. Meanwhile, the database kept growing. There was only one thing for it - I had to "uninstall" MDW and set it up again. The MSSQLTips.com article did the job for me on this 2008 R2 server.

I have reinstalled MDW now and am only monitoring one server for the moment, while keeping an eye on the amount of data being collected. But to help matters I have enabled page compression on some tables that I previously identified as having grown very large. They are:

snapshots.active_sessions_and_requests
snapshots.io_virtual_file_stats
snapshots.notable_query_plan
snapshots.notable_query_text
snapshots.os_memory_clerks
snapshots.os_wait_stats
snapshots.performance_counter_values
snapshots.query_stats

To generate that list of tables I used the following query:

SELECT SCHEMA_NAME(o.schema_id) AS [SchemaName]
       ,OBJECT_NAME(o.object_id) AS [ObjectName]
       ,p.[rows]
       ,p.[data_compression_desc]
       ,p.[index_id] AS [IndexID_on_Table]
FROM sys.partitions AS p
INNER JOIN sys.objects AS o
       ON p.object_id = o.object_id
WHERE p.data_compression > 0
       AND SCHEMA_NAME(o.schema_id) <> 'sys'
ORDER BY SchemaName
       ,ObjectName;

We are now rolling out SCOM so it will be interesting to see what sort of statistics I will be able to generate from it. The reports won't be as useful as those from Confio Ignite, for instance, but it will have to do for now.

Comments

  1. Hey Paulie,

    Excellent article and, hopefully, better things to come for MDW in 2014+


