
You probably have a ton of old event data in your Data Warehouse

 


 

Prior to SCOM 2012 R2 UR7, there was an issue where we did not groom out old data from the Event Parameter and Event Rule tables in the DW.  This shows up as those tables growing quite large, especially the event parameter tables, because the old, orphaned data is never groomed out.

It isn’t a big deal, but if you’d like to free up some space in your Data Warehouse database – read on.

 

I’ll just come out and say that ANYONE who ever ran a SCOM management group prior to SCOM 2012 R2 UR7 is affected.  How much just depends on how many events you were collecting and shoving into your DW.

Once you apply UR7 or later, this issue stops, and the normal grooming will groom out the data as events get groomed.  HOWEVER – we will never go back and clean out the old, already orphaned event parameters and event rules.

 

Nicole was the first person I saw write about this issue:

https://blogs.msdn.microsoft.com/nicole_welch/2016/01/07/scom-2012-large-event-parameter-tables/

 

Essentially – to know if you are affected, there are some SQL statements you can run… but I wrote my own.  These take a long time to run, but they give you an idea of how many orphaned rows are in scope to be groomed.

 

SELECT COUNT(*)
FROM Event.vEventParameter ep
WHERE ep.EventOriginId NOT IN (SELECT DISTINCT EventOriginId FROM Event.vEvent)

SELECT COUNT(*)
FROM Event.vEventRule er
WHERE er.EventOriginId NOT IN (SELECT DISTINCT EventOriginId FROM Event.vEvent)
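
If you just want to see how much space the Event tables are consuming before you dig in, a quick size check is easier on the server than the counts above.  This is only a sketch using the standard SQL Server partition-stats DMV (nothing SCOM-specific beyond the Event schema name) – run it against the OperationsManagerDW database:

-- Approximate row counts and reserved space for the Event dataset tables
-- Row counts come from the heap/clustered index only; space excludes nonclustered indexes
SELECT t.name AS TableName
      ,SUM(ps.row_count) AS [RowCount]
      ,SUM(ps.reserved_page_count) * 8 / 1024 AS ReservedMB
FROM sys.dm_db_partition_stats ps
JOIN sys.tables t ON t.object_id = ps.object_id
JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE s.name = 'Event'
  AND ps.index_id IN (0, 1)
GROUP BY t.name
ORDER BY ReservedMB DESC

The event parameter tables are usually the worst offenders, so they should float to the top of that list.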

 

 

Nicole has a stored procedure listed on her site – you run that script to create the stored proc, then call the sproc with a “max rows to groom” parameter.   It works well and I recommend it.
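
If you use the stored procedure and want it to keep churning until everything is gone, you can wrap it in a simple WHILE loop.  This is just a minimal sketch – the procedure and parameter names below (dbo.GroomOrphanedEventData, @MaxRowsToGroom) are placeholders, so substitute whatever names Nicole’s script actually creates:

-- Minimal sketch: call the grooming proc in batches until no orphaned rows remain
-- dbo.GroomOrphanedEventData and @MaxRowsToGroom are placeholder names
WHILE EXISTS (SELECT 1 FROM Event.vEventParameter ep
              WHERE ep.EventOriginId NOT IN (SELECT EventOriginId FROM Event.vEvent))
   OR EXISTS (SELECT 1 FROM Event.vEventRule er
              WHERE er.EventOriginId NOT IN (SELECT EventOriginId FROM Event.vEvent))
BEGIN
    EXEC dbo.GroomOrphanedEventData @MaxRowsToGroom = 1000000
END

The EXISTS checks are not fast on big tables, but they make the loop self-terminating.  If you would rather avoid them, just re-run the proc manually until it reports zero rows deleted.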

 

Alternatively – you can just run this as a straight SQL query.  I will post that below:

I set MaxRowsToGroom hard-coded to 1,000,000 rows.  I found this runs pretty quickly and doesn’t use a lot of transaction log space.  You can adjust this depending on how much cleanup you need to do if you prefer the query approach, or just use the stored proc and the loop from the blog post linked above.

 

-- Grooms orphaned Event Rule and Event Parameter rows from the SCOM Data Warehouse
-- Deletes up to @MaxRowsToGroom rows per table per run - run repeatedly until 0 rows are affected
SET NOCOUNT ON;

DECLARE @MaxRowsToGroom int = 1000000   -- adjust the batch size to control transaction log usage
       ,@RowsDeleted    int

DECLARE @RuleTableName   sysname
       ,@DetailTableName sysname
       ,@ParamTableName  sysname
       ,@DatasetId       uniqueidentifier = (SELECT DatasetId FROM StandardDataset WHERE SchemaName = 'Event')
       ,@TableGuid       uniqueidentifier
       ,@Statement       nvarchar(max)
       ,@SchemaName      sysname = 'Event'

SET @TableGuid = (SELECT TableGuid FROM StandardDatasetTableMap WHERE DatasetId = @DatasetId)

BEGIN TRAN

-- Resolve the GUID-suffixed Event Rule table name and delete orphaned rows
SELECT TOP 1 @RuleTableName = BaseTableName + '_' + REPLACE(CAST(@TableGuid AS varchar(50)), '-', '')
FROM StandardDatasetAggregationStorage
WHERE (DatasetId = @DatasetId) AND (AggregationTypeId = 0) AND (DependentTableInd = 1) AND (TableTag = 'Rule')

SET @Statement = 'DELETE TOP (' + CAST(@MaxRowsToGroom AS varchar(15)) + ')'
    + ' FROM ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@RuleTableName)
    + ' WHERE (EventOriginId NOT IN (SELECT EventOriginId FROM Event.vEvent)) '
EXECUTE (@Statement)

-- Resolve the GUID-suffixed Event Parameter table name and delete orphaned rows
SELECT TOP 1 @ParamTableName = BaseTableName + '_' + REPLACE(CAST(@TableGuid AS varchar(50)), '-', '')
FROM StandardDatasetAggregationStorage
WHERE (DatasetId = @DatasetId) AND (AggregationTypeId = 0) AND (DependentTableInd = 1) AND (TableTag = 'Parameter')

SET @Statement = 'DELETE TOP (' + CAST(@MaxRowsToGroom AS varchar(15)) + ')'
    + ' FROM ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@ParamTableName)
    + ' WHERE (EventOriginId NOT IN (SELECT EventOriginId FROM Event.vEvent)) '
EXECUTE (@Statement)

SET @RowsDeleted = @@ROWCOUNT

COMMIT

 

 

I do recommend you clean this up.  The orphaned data doesn’t hurt anything sitting there, other than potentially making event-based reports run slower, but the big impact to me is the cost of owning a DW that large for little reason – backups, restores, and storage.

Make sure you update statistics when you are done, if not also a full DB reindex.   To update statistics, run:    exec sp_updatestats
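
If you also want to rebuild the indexes on the two tables you just groomed, something like the following works.  Again, just a sketch – the GUID-suffixed table names differ per environment, so this resolves them the same way the grooming query above does:

-- Update statistics across the DW after grooming
EXEC sp_updatestats

-- Optionally rebuild all indexes on the groomed Event Rule / Event Parameter tables
DECLARE @DatasetId uniqueidentifier = (SELECT DatasetId FROM StandardDataset WHERE SchemaName = 'Event')
       ,@TableGuid uniqueidentifier
       ,@Sql nvarchar(max) = N''

SET @TableGuid = (SELECT TableGuid FROM StandardDatasetTableMap WHERE DatasetId = @DatasetId)

SELECT @Sql = @Sql + 'ALTER INDEX ALL ON [Event].'
            + QUOTENAME(BaseTableName + '_' + REPLACE(CAST(@TableGuid AS varchar(50)), '-', ''))
            + ' REBUILD; '
FROM StandardDatasetAggregationStorage
WHERE DatasetId = @DatasetId
  AND AggregationTypeId = 0
  AND DependentTableInd = 1
  AND TableTag IN ('Rule', 'Parameter')

EXECUTE (@Sql)

Rebuilding is optional and can take a while on a large DW, so save it for a quiet window.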

 

 

Here is an example of my before and after:

Before:

(screenshot: Event Parameter and Event Rule table sizes before grooming)

 

After:

(screenshot: Event Parameter and Event Rule table sizes after grooming)

 

Trimmed from 3.3 GB to 117 MB!!!!!   If this were a large production environment, this could be a substantial amount of data.

 

 

And remember – most collected events are worthless to begin with.  As a tuning exercise, I recommend disabling MOST of the out-of-the-box event collections, and also reducing your event retention in the DW.  The posts below cover both, and there is a quick retention check query after them:

https://blogs.technet.microsoft.com/kevinholman/2009/11/25/tuning-tip-turning-off-some-over-collection-of-events/

https://blogs.technet.microsoft.com/kevinholman/2010/01/05/understanding-and-modifying-data-warehouse-retention-and-grooming/
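
If you want to see where your Event retention currently stands before changing it, you can query the dataset aggregation settings directly in the DW.  Just a quick sketch – MaxDataAgeDays is the retention value (in days) that the second post above walks through modifying:

-- Current retention (in days) for the Event dataset in the Data Warehouse
SELECT sd.SchemaName
      ,sda.AggregationTypeId       -- 0 = raw data
      ,sda.MaxDataAgeDays          -- retention in days
FROM StandardDataset sd
JOIN StandardDatasetAggregation sda ON sda.DatasetId = sd.DatasetId
WHERE sd.SchemaName = 'Event'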

