How do you calculate the age of an individual? This can become an extremely hot topic, and the number of proposed solutions is numerous. Before attacking this “controversial” topic, it helps to set some ground rules for designing an aging algorithm.
The most difficult part, of course, is figuring out what to do with leap years. When does a person (or object or document, for that matter) born on February 29th become another year older? For the purposes of this short article, I propose that this occurs on February 28th of non-leap years.
Some Basic Date Arithmetic, SQL Style
Before diving into the aging calculation, let’s look at some very simple date arithmetic using SQL Server 2005.
First, let’s see what adding 1 year to February 28, 2008 and February 29, 2008 returns.
declare @date1 datetime,
@date2 datetime
set @date1 = '20080228'
set @date2 = '20080229'
select dateadd(yy, 1, @date1), dateadd(yy, 1, @date2)
Result:
2009-02-28 00:00:00.000 2009-02-28 00:00:00.000
Interesting: adding 1 year to both dates results in the same date. Let’s see what happens when you instead add 4 years.
declare @date1 datetime,
@date2 datetime
set @date1 = '20080228'
set @date2 = '20080229'
select dateadd(yy, 4, @date1), dateadd(yy, 4, @date2)
Result:
2012-02-28 00:00:00.000 2012-02-29 00:00:00.000
Now that is interesting: the two results are not the same. This is what you would expect, since 2012 is a leap year.
Now, let’s take a look at how DATEDIFF works.
declare @date1 datetime,
@date2 datetime
set @date1 = '20080229'
set @date2 = '20090228'
select datediff(yy, @date1, @date2)
Result: 1
But wait, what is the result of the following?
declare @date1 datetime,
@date2 datetime
set @date1 = '20081231'
set @date2 = '20090101'
select datediff(yy, @date1, @date2)
Result: 1
Wait! That isn’t right; there isn’t a year between those two dates. Something must be wrong. As you can see, the DATEDIFF function returns the difference between the year values of the two dates, not the actual number of full years between them.
We can use this information to create a simple algorithm:
Age = datediff(yy, date_of_birth, Today)
      - (if Today < dateadd(yy, datediff(yy, date_of_birth, Today), date_of_birth)
         then 1 else 0)
Or, to put it into T-SQL:
declare @dob datetime,
@age int,
@day datetime
set @day = '2008-02-28'
set @dob = '2007-03-01'
set @age = datediff(yy, @dob, @day) -
    case
        when @day < dateadd(yy, datediff(yy, @dob, @day), @dob)
        then 1
        else 0
    end
select @age
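As a quick sanity check of the February 28th rule proposed at the start, here is the same formula run against a leap-day date of birth (a sketch; the dates are chosen purely for illustration):
declare @dob datetime,
@age int,
@day datetime
set @day = '2009-02-28' -- February 28th of a non-leap year
set @dob = '2008-02-29' -- a leap-day date of birth
set @age = datediff(yy, @dob, @day) -
    case
        when @day < dateadd(yy, datediff(yy, @dob, @day), @dob)
        then 1
        else 0
    end
select @age
Result: 1
Here dateadd(yy, 1, @dob) lands on 2009-02-28, which is not greater than @day, so nothing is subtracted: the leap-day birthday is counted on February 28th of the non-leap year, exactly as proposed.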
Conclusion
This is just one method you can use to calculate the age of a person in code. There are numerous other methods. All I can suggest is to use the method that works best for you in your situation.
Thursday, June 19, 2008
Scripting Traces for SQL 2005
I wrote this with the assumption that you will be running traces around job steps, but you can adapt it for other situations. I used the default traces for both the SQL and AS trace definitions, but removed most of the events from both in this write-up for brevity; you can generate the full definitions from Profiler by following the steps below. I have sections for SQL Server and Analysis Services.
SQL Trace
1. Create the Trace you want in SQL Profiler, on a SQL connection, setting a Trace Name (I used 'OLAP Error SQL Data'), Save To File with a file name and a file size large enough to hold the data in a single file (I used 32MB), and the Events / Data fields you want to trace. Run the Trace.
2. Stop the Trace - we don't actually want to collect anything at this point.
3. Export the trace definition - select "File > Export > Script Trace Definition > For SQL Server 2005...". Export the definition to a .SQL file of your choice.
4. Close Profiler; we're done with it at this point.
In the SQL job with the step you want to trace, you're going to add a couple of steps before the step you want to trace...
5. Add the following code to a T-SQL job step, named 'Delete SQL Trace File', before the step you want to trace (this step just makes sure there isn't a file left over from a previous run):
DECLARE @OS_Cmd VARCHAR(1024)
SELECT @OS_Cmd = 'DEL /Q /F "C:\Trace Files\OLAP Error SQL Data.trc"'
EXECUTE master.dbo.xp_cmdshell @OS_Cmd
NOTE - you will get an error code 12 from the sp_trace_create call in the next step if the output file already exists.
6. Add the code from the SQL file you just created to a T-SQL job step, named 'Start SQL Trace', immediately after the 'Delete SQL Trace File' step. Replace the literal 'InsertFileNameHere' with the destination you want for the trace file (don't add .trc to the end of the file name - it will automatically get added for you). It should look like this:
-- Create a Queue
declare @rc int
declare @TraceID int
declare @maxfilesize bigint
set @maxfilesize = 32
-- Please replace the text InsertFileNameHere, with an appropriate
--filename prefixed by a path, e.g., c:\MyFolder\MyTrace. The .trc extension
--will be appended to the filename automatically. If you are writing from
-- remote server to local drive, please use UNC path and make sure server has
-- write access to your network share
exec @rc = sp_trace_create @TraceID output, 0, N'C:\Trace Files\OLAP Error SQL Data', @maxfilesize, NULL
if (@rc != 0) goto error
-- Client side File and Table cannot be scripted
-- Set the events
declare @on bit
set @on = 1
exec sp_trace_setevent @TraceID, 14, 1, @on
exec sp_trace_setevent @TraceID, 14, 9, @on
exec sp_trace_setevent @TraceID, 14, 6, @on
exec sp_trace_setevent @TraceID, 14, 10, @on
exec sp_trace_setevent @TraceID, 14, 14, @on
exec sp_trace_setevent @TraceID, 14, 11, @on
exec sp_trace_setevent @TraceID, 14, 12, @on
-- Set the Filters
declare @intfilter int
declare @bigintfilter bigint
exec sp_trace_setfilter @TraceID, 10, 0, 7, N'SQL Server Profiler - 1745445d-46a5-4050-9922-16caf3851690'
-- Set the trace status to start
exec sp_trace_setstatus @TraceID, 1
-- display trace id for future references
select TraceID=@TraceID
goto finish
error: select ErrorCode=@rc
finish:
Now you're going to add a couple of steps after the step you want to trace...
7. You need to stop the trace. Add this code to a T-SQL step named 'Stop SQL Trace'. Note that the file name in the WHERE clause must match the file name you specified above in the 'Start SQL Trace' step, with '.trc' appended to the end:
DECLARE @TraceID INT
SELECT @TraceID = TraceID FROM ::fn_trace_getinfo(0)
WHERE CAST([value] AS VARCHAR(256)) = 'C:\Trace Files\OLAP Error SQL Data.trc'
IF @TraceID IS NOT NULL
BEGIN
PRINT 'Closing Trace ID ' + CAST (@TraceID AS VARCHAR(5))
EXEC sp_trace_setstatus @TraceID, 0
EXEC sp_trace_setstatus @TraceID, 2
END
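Once the trace has been stopped and closed, you can pull the captured events back into a queryable result set with the built-in fn_trace_gettable function. A minimal sketch, using the file name from the steps above and the data columns the trace definition actually sets:
SELECT TextData, NTUserName, ApplicationName, LoginName, SPID, StartTime
FROM ::fn_trace_gettable('C:\Trace Files\OLAP Error SQL Data.trc', DEFAULT)
ORDER BY StartTime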
10 Steps to Secure SQL Server
1. KEEP SYSTEMS UP TO DATE ON SERVICE PACKS, PATCHES, AND NEW VERSIONS OF THE OS AND SQL SERVER.
Tools to help with this: Idera SQLsecure is a tool for assessing security across your SQL Servers. SQLsecure collects SQL Server version number and service pack as part of the overall security assessment it performs. The version number returned by SQLsecure will also reflect the latest patches applied, so you can determine which servers are out of date.
2. ENFORCE STANDARDS FOR SECURE PASSWORDS.
Tools to help with this: If you are using SQL Server 2005, use the built-in security options to enforce password complexity standards. For previous versions of SQL Server, there are many publicly available scripts you can use to identify accounts with weak or non-existent passwords.
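For example, on SQL Server 2005 you can tie a SQL login to the Windows password policy when it is created. A sketch, with a hypothetical login name and password:
-- CHECK_POLICY enforces the Windows complexity and lockout rules;
-- CHECK_EXPIRATION enforces password expiration as well.
CREATE LOGIN AppUser WITH PASSWORD = 'Ch4nge!Me#2008',
CHECK_POLICY = ON,
CHECK_EXPIRATION = ON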
3. SECURE THE FULL ENVIRONMENT, INCLUDING THE OS AND NETWORK
Tools to help with this: Microsoft has many documents and knowledge base articles that provide more detail and best practices on how to ensure server and application security.
4. ENSURE APPROPRIATE SQL SERVER SETUP
Tools to help with this: Idera SQLsecure checks most of these settings across your SQL Servers and reports back where you have potential security concerns.
5. REGULARLY ASSESS WHO CAN ACCESS YOUR SQL SERVERS AND TAKE ACTION TO LIMIT ACCESS.
Tools to help with this: Idera SQLsecure can make this task a whole lot easier for you. SQLsecure automatically generates a complete list of all users with access to your SQL Server, across Windows, Active Directory, and SQL Server, including users that have access as a result of group membership.
6. ASSESS WHAT ACTIONS USERS CAN PERFORM ON WHAT OBJECTS
Tools to help with this: Accurately assessing effective permissions on SQL Server objects is really, really hard to do manually. Idera SQLsecure can perform this time-consuming task for you. SQLsecure calculates both effective (derived) and assigned permissions for all users, on any database object, making it much easier to ensure that users have appropriate access rights.
7. KEEP AN AUDIT TRAIL OF DATABASE ACTIVITY.
Tools to help with this: SQL Server does provide native C2 auditing to help with gathering this information. However, C2 auditing can be very resource intensive and does not provide fine-grained controls to let you specify what you want to collect. You may instead want to use a 3rd party auditing tool, such as Idera SQL compliance manager. SQL compliance manager provides customizable, low-impact auditing, alerting and reporting on virtually all activity across multiple SQL servers.
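If you do want to try the native route first, C2 audit mode is turned on through sp_configure. A sketch; note it is an advanced option, and the SQL Server service must be restarted before auditing actually begins:
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
-- Enable C2 audit mode; requires a service restart to take effect.
EXEC sp_configure 'c2 audit mode', 1
RECONFIGURE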
8. AUDIT THE ACTIONS OF ADMIN USERS ON YOUR SQL SERVERS
Tools to help with this: The native SQL Server C2 auditing can also be used for this purpose; however, there is no way to ensure an immutable trail of audit data with the native tools. So, what’s to stop an admin user from deleting the audit trail after making off with your customer list? A better solution is to invest in a 3rd party tool such as SQL compliance manager, which can audit ALL admin user activity and protect the audit trail from tampering by anyone, even admin users.
9. SECURE AND AUDIT THE APPLICATIONS THAT ACCESS YOUR SQL SERVER DATABASES
Tools to help with this: Idera SQL compliance manager will track the activity of any application within a targeted database. And, alerts can be configured to watch for and provide immediate notification of questionable behavior, such as a business application accessing a database it shouldn’t.
10. ENSURE PROPER DATABASE BACKUPS, AND SECURE YOUR BACKUPS
Tools to help with this: While SQL Server does give you the ability to back up your databases, it does not include the ability to encrypt the backups, or to compress them to save space and reduce backup time. Idera SQLsafe provides a high-performance backup solution that includes encryption and compression. SQLsafe also lets you create backup policies with exception reporting to notify you of any backups that failed to run as scheduled.
Wednesday, June 18, 2008
Automate Audit Requests
Periodically an internal auditor would stop by my desk and ask me to show them all the users and permissions on a particular database. This is easily gathered, and most of us know how to do it. I produce the list and send it on to the requestor. The next question is ‘What has changed?’. I do not have this information, and have to tell them I do not know. So, after a couple iterations of this (I will not admit how many), I finally devised a simple way to store this information so I could adequately respond to the question. A snapshot of the user information is gathered from time to time and stored in a history table. The next time a request comes in, I can compare the current values to the historical ones. This became tedious, as it usually involved some spreadsheets or query results and a manual review of the data, looking for new records, changed records, and so on. When I produced these two sets of data (current and one historical snapshot) and gave them back to the auditor, they were initially happy, until they realized all the time that an adequate review of the two sets would involve. The next question would invariably be: is there a way to automate this? There is always a way to automate anything, I would respond, and skulk back to my DBA hole and pound out some more code. After more review of the results I produced, and even more back and forth between me and the auditor, we finally decided which fields we needed to see, the differences we should show, and so on. I will now try to explain the system we devised to assist us in this simple, yet complex, endeavor.
System
We basically want to know what the users looked like at a point in time, compare that to the current state of the users, and show the differences. Differences are defined as: new records, changed records, removed records, and old records. We want to be able to dynamically include any number of new and existing servers in this system, though we started with a single server. We want to be able to display these results for a period of time, and allow reporting to occur based on the results. Since we already have a monitoring server set up, this was the perfect place to locate the system. We already use linked servers to connect to and monitor all our remote servers, so we will continue in this vein; justifications can be read in other articles I’ve written about monitoring with linked servers. A configuration table indicates which servers we will be monitoring. Some stored procedures use dynamic SQL to cycle through the configured list of servers, gather data from each, stage the data, and then do comparisons against it. The resulting data can then be reported on. That’s the high-level description. I will now go through all the objects in more detail.
Tables
DatabaseRolesConfig
This table contains a ServerName and an Enabled flag, which lets us set up multiple servers to be monitored, and enable or disable them at whim. The fields are as follows:
ServerName sysname not null,
Enabled bit not null default 0
DatabaseRolesStaging
This table stores the data we have pulled down from all the remote servers. Once staged, we can query and process it as we like. The fields in this table are as follows:
Type sysname null default '',
ServerName sysname null default '',
DBName sysname null default '',
LoginName sysname null default '',
UserName sysname null default '',
db_owner varchar(3) null default '',
db_accessadmin varchar(3) null default '',
db_securityadmin varchar(3) null default '',
db_ddladmin varchar(3) null default '',
db_datareader varchar(3) null default '',
db_datawriter varchar(3) null default '',
db_denydatareader varchar(3) null default '',
db_denydatawriter varchar(3) null default '',
denylogin int null default '',
hasaccess int null default '',
isntname int null default '',
isntgroup int null default '',
isntuser int null default '',
sysadmin int null default '',
securityadmin int null default '',
serveradmin int null default '',
setupadmin int null default '',
processadmin int null default '',
diskadmin int null default '',
dbcreator int null default '',
bulkadmin int null default ''
All the fields accept a default of blank, for reporting purposes. Not every field in every row will have a value, and the blank default eliminates the display of NULL values in the report. This is just a simple step I chose to make life easy.
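For reference, here is a minimal sketch of how the two tables described so far could be created. The column names and types come straight from the field lists above; the dbo schema and the abbreviation of the role columns are my own shorthand:
create table dbo.DatabaseRolesConfig (
ServerName sysname not null,
Enabled bit not null default 0
)
create table dbo.DatabaseRolesStaging (
Type sysname null default '',
ServerName sysname null default '',
DBName sysname null default '',
LoginName sysname null default '',
UserName sysname null default '',
db_owner varchar(3) null default '',
-- ...the remaining database-role and server-role columns follow
-- the same pattern as the field list above...
sysadmin int null default ''
)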
DatabaseRolesHistory
This table is identical to the staging table, and will hold the processed data from the last execution for comparison to the next execution.
DatabaseRolesArchive
This table is identical to the staging table, except that it has an added identity field for uniqueness and a [Date] field that contains the date of the archived data. It is a holding area for all the data processed and displayed by past executions. The extra fields are as follows:
ID integer not NULL IDENTITY(1,1),
[Date] datetime not null default getdate(),
DatabaseRolesDisplay
This table is similar to the two tables above, with an added identity field for uniqueness and a Version field that keeps track of each record's type. This is the table that contains the processed data, grouped by type and cleaned up. We report from this table, as it holds the result set after processing. The extra fields are as follows:
ID integer not NULL IDENTITY(1,1),
Version VARCHAR(8) not null,
Stored Procedures
sp_GetDBRoles
This procedure is passed a ServerName, a DatabaseName, and a UserName (the last two parameters were never implemented). The ServerName determines which server we pull data from. We use dynamic SQL to pull information from syslogins on the remote server and store it in the DatabaseRolesStaging table, with the ServerName and Type included. We then create a cursor that cycles through each database in sysdatabases, except a few. Inside the cursor, we dynamically execute SQL that pulls more data from sysmembers, sysusers, and syslogins, retrieving those users that have roles set; this data goes into the staging table too. Finally, we use dynamic SQL to pull more data from sysusers and syslogins, retrieving those users that do not have roles set. This data is also stored in the staging table.
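To make the dynamic SQL concrete, here is a hypothetical sketch of the login-level pull. The server name is an assumption, and only a few of the staged columns are shown:
declare @ServerName sysname,
@sql varchar(2000)
set @ServerName = 'REMOTESQL01' -- assumed linked server name
-- Build and run a query against the remote syslogins table,
-- tagging each row with its Type and source server.
set @sql = 'insert into dbo.DatabaseRolesStaging (Type, ServerName, LoginName, sysadmin)
select ''Login'', ''' + @ServerName + ''', loginname, sysadmin
from [' + @ServerName + '].master.dbo.syslogins'
exec (@sql)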
sp_GetAllDBRoles
This procedure cycles through the DatabaseRolesConfig table and calls sp_GetDBRoles for each server that is enabled for processing.
sp_ProduceDatabaseRolesResults
Since we have previously gathered records into the DatabaseRolesStaging table, we can now compare these results to some other tables of historical data. We have a history table called DatabaseRolesHistory that contains the last set of data we gathered about Users.
1. We select the new values, and insert them into a memory table @DatabaseRolesStaging with a flag indicating they are ‘new’.
2. We select the old values that have changed, and insert them into the table @DatabaseRolesStaging with a flag indicating they are ‘old’.
3. We update the ‘new’ values that have changed. We determine this when there is a new record and an old one with the same key values (ServerName, DBName, UserName, and LoginName). These are updated in the table @DatabaseRolesStaging with a flag indicating they are ‘changed’.
4. We find the records that were removed from the DatabaseRolesHistory table, compared to the DatabaseRolesStaging table. These records are inserted into the table @DatabaseRolesStaging with a flag indicating they are ‘removed’. (Steps 1 and 4 are sketched in code after this list.)
5. Records that were simply altered are removed from the memory table @DatabaseRolesStaging.
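Here is a sketch of how the detection in steps 1 and 4 might look. The table variable declaration is abbreviated to the key columns; the real one mirrors the staging table plus the flag:
declare @DatabaseRolesStaging table (
Version varchar(8),
ServerName sysname,
DBName sysname,
UserName sysname,
LoginName sysname
)
-- Step 1: staging rows with no matching history row are 'New'.
insert into @DatabaseRolesStaging
select 'New', s.ServerName, s.DBName, s.UserName, s.LoginName
from dbo.DatabaseRolesStaging s
left join dbo.DatabaseRolesHistory h
on s.ServerName = h.ServerName
and s.DBName = h.DBName
and s.UserName = h.UserName
and s.LoginName = h.LoginName
where h.ServerName is null
-- Step 4: history rows with no matching staging row are 'Removed'.
insert into @DatabaseRolesStaging
select 'Removed', h.ServerName, h.DBName, h.UserName, h.LoginName
from dbo.DatabaseRolesHistory h
left join dbo.DatabaseRolesStaging s
on s.ServerName = h.ServerName
and s.DBName = h.DBName
and s.UserName = h.UserName
and s.LoginName = h.LoginName
where s.ServerName is null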
The resulting data is now labeled and ready to display to the requestor. We process the data from the @DatabaseRolesStaging table and order it with ‘Old’ and ‘Changed’ first, then the ‘Removed’ records, followed by the ‘New’ records. This just helps in viewing the data, with the important records first. The result is dumped into a real table called DatabaseRolesDisplay, where it lives until the next execution of the process. This lets me reselect from the data when needed between executions; I used to have it returned once as part of the proc call, but tended to need to look at the data again later, and this solves that need.
sp_ProcessDatabaseRoles
This stored procedure takes a ServerName as a parameter. If you supply one, it calls sp_GetDBRoles for just that ServerName; if you leave it blank, it calls sp_GetAllDBRoles and processes all enabled ServerNames from the config table. The proc gets the user data into the staging table as described above, then calls sp_ProduceDatabaseRolesResults, which processes the data in the staging table, comparing it to the historical data. The data in the history table is then pumped into the archive table, the history table is truncated, and the current staging data is pumped into the history table to await the next execution.
This is the gem of the system, taking all the other parts into account and doing it all for you. It can be called singly when the auditor requests it, or you can schedule it to run as a job and simply query the resulting data in the DatabaseRolesDisplay table, as shown below.
There are many options you can now follow, depending on your own needs. This system allows you to gather user information, stage it, and store it historically, giving you the chance to see back into the past at a snapshot of what once was. No more will you be stymied by auditors or others asking what users we have in the system and how they compare to a year ago. You have the data; you are empowered; you are the DBA, in control and informed. I hope this system helps you gather the needed data and have it on hand for your user reviews.
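A hypothetical invocation, first for a single (assumed) server name and then for every enabled server in the config table:
-- Process one server on demand, e.g. when the auditor asks.
exec dbo.sp_ProcessDatabaseRoles @ServerName = 'REMOTESQL01'
-- Or leave the ServerName blank to process all enabled servers.
exec dbo.sp_ProcessDatabaseRoles @ServerName = ''
-- Then review the labeled differences from the last run.
select * from dbo.DatabaseRolesDisplay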