Friday, August 22, 2014

Security Updates released in MS14-044, and An Approach to Microsoft’s Incremental Servicing Model

On August 12th, 2014, another infamous ‘Patch Tuesday,’ Microsoft released a series of Security Updates for multiple versions of SQL Server to address potential Denial of Service attacks and an exploit in Master Data Services. Having already made my way through hundreds of instances, and all their respective environments, with a recent applicable Cumulative Update, the release of all these Security Updates has most definitely thrown a wrench in the patching plans. The details for this specific bulletin are here: https://technet.microsoft.com/en-us/library/security/ms14-044.aspx

The question is, if you’re a DBA, how do you make sense of all the Cumulative Updates (CUs, which contain critical on-demand fixes requested by clients), Service Packs (SPs), Security Updates (SUs), General Distribution Releases (GDRs), and the acronym I have only noticed recently, QFE (most have heard of hotfixes, but this one stands for Quick Fix [from] Engineering)? This is where this explanation of Microsoft's Incremental Servicing Model from the SQL Server Team steps in to help. In fact, after 15 years of administering SQL Server, I had not found a page with such an up-to-date description of how SQL Server is patched, and this one came thanks to a recent visit from Felix Avemore, a Microsoft SQL Server Premier Field Engineer based in Quebec City.

For Microsoft Premier Field Engineers for SQL Server, the priority is clear: apply important Security Updates before anything else. Often, however, those updates require a CU or an SP as a prerequisite, which makes patching a painful affair when you face the daunting task of updating 300-400 servers! That is where updated, clear documentation, system backups, and checklists come in rather handy, along with deeper recommendations from the vendor, such as validating registry keys, if your system is in production and ultra-sensitive. If you ever end up with a corrupt master, attempt a restore, but always remember you can rebuild the instance cleanly with the exact ConfigurationFile.ini found within the Setup Bootstrap folder (please see a previous post on command line installs for more).
Which updates to apply depends on the build you are at, so for SQL Server 2008 through 2014, here's a quick guide:

SQL Server Version    General Distribution Release (GDR)    Quick Fix [from] Engineering (QFE)
2014                  RTM                                    CU1 - CU2
2012                  SP1 (without any CUs)                  SP1 CU1 - CU11
2008 R2               SP2 (without any CUs)                  SP2 CU1 - CU13
2008                  SP3 (without any CUs)                  SP3 CU1 - CU17
Note that if you are on SQL 2014 RTM CU3 or SQL 2012 SP2, you are already covered at those build levels.
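Before deciding between the GDR and QFE branch, check the exact build each instance is running. A quick check using the standard SERVERPROPERTY function (nothing instance-specific is assumed here):

```sql
-- Returns the exact build number and product level (RTM/SPn),
-- which you can compare against the GDR/QFE table above
SELECT SERVERPROPERTY('ProductVersion') AS BuildNumber,   -- e.g. 11.0.5058.0
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,  -- e.g. SP2
       SERVERPROPERTY('Edition')        AS Edition;
```

Run it across your estate (via a CMS or multi-server query) and you have an instant inventory of who needs which branch of MS14-044.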

There are clear arguments, as laid out well by Glenn Berry here, that you should apply recent Cumulative Updates to maintain proper performance, stability, and regular maintenance of your SQL Server Instances.
Are QFEs cumulative? Judging by their build levels, it would appear so, and after reading several definitions, I can confirm that they are indeed cumulative hotfixes as well.

Hope this clears up some of the muddy pathway you'll find attempting to keep up with patches on SQL Server.
Happy Patching


 

Tuesday, May 13, 2014

How to Fix that Never-Ending Join Dragging Down the Whole DB Server Instance – Pre-Populate the Largest Bugger in Temp with Indexes


Now that I have been blogging away here and on SSC for a good five years, the editors recently thanked us for our work. They also provided valuable feedback that we should describe real-world situations DBAs encounter. The following post targets performance optimisation, drawn from an actual task that has recurred several times since I first wrote on the subject, in various production environments: an instance bogged down by that one massive query, inside a stored procedure, that has to run all the time, yet is so huge, important and/or complex that everyone is afraid or unsure how to fix it.


In this post I hope to explain clearly how combining data definition language (DDL) for your temporary tables with non-clustered indexes can improve the performance of stored procedures that join one or many large tables by up to seventeen times (at least, that was the case the previous time I optimised this type of query), as I have seen on stored procedures working with tables in the tens of millions of rows.


Temporary tables, if used frequently or in stored procedures, end up consuming significant disk input/output. To start, one thing we should be aware of is that they are created as a heap by default. As experience has shown, if you are slicing up a very large table via tempdb, it is best to do your DDL first, before running the rest of your operation against the temporary data, rather than using SELECT * INTO #temp. Thus, we should avoid SELECT * INTO #temp as much as possible, unless the number of rows is insignificant, because as a single statement it creates great disk contention within tempdb:

(N.B. the assumed prerequisite is that you have identified the worst query from your plan cache, or have seen the code under Recent Expensive Queries in Activity Monitor, sorted by the worst-performing resource)
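If you have not pinned down the culprit yet, a quick sweep of the plan cache will surface it. A sketch using the standard sys.dm_exec_query_stats and sys.dm_exec_sql_text DMVs (adjust the ORDER BY for whichever resource hurts most on your instance):

```sql
-- Top 10 cached statements by total logical reads, with their statement text
SELECT TOP (10)
       qs.total_logical_reads,
       qs.total_worker_time,
       qs.execution_count,
       SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
                 (CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;
```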

CREATE TABLE #MyLot  -- you'll see that we only need a few columns for the join in the end
       (
       [LotId] [int] NOT NULL,  -- no IDENTITY here: we load LotId straight from the source table
       [LotCode] [nvarchar](10) NOT NULL
       )

INSERT INTO #MyLot ( LotId, LotCode )
 -- e.g. you can also avoid NULLs by using ISNULL(col,0)
SELECT LotId, LotCode
FROM MyBigLotTable
WHERE ...  -- your matching predicates go here
 -- this is where you found out what joins this massive table with the others and slice it up
 -- horizontally and vertically before (!) making that big join,
 -- and that is where we obtain the significant performance gains

CREATE NONCLUSTERED INDEX idx_Lot ON #MyLot ([LotCode] ASC)
-- create the index on the matching column used in the 'big' join (in this case it was a 5-table join)
-- the glaring ID field

-- integrate all this preparation of #MyLot into the main super-slow query
INSERT INTO @result
([Number],[LocId],[BinId],[LotCode],[LotId],[PCode],[PId],[Stock],[StatusCode],[UnitCode])

SELECT
[BIResult].[Number], [Loc].[LocId], [BLoc].[BILocId], [BIResult].[LotCode], #MyLot.[LotId],
[BIResult].[PCode], [P].[PId], [BIResult].[Stock],
ISNULL([BIResult].[StatusCode], ''), [BIResult].[UnitCode]  -- closing paren restored; '' default assumed
FROM OPENXML (@handle, N'/Root/Row')  -- row pattern was lost in the original; adjust to your XML shape
             WITH
                        (
                              [Number] SMALLINT N'@Number',
                              [LocID] NVARCHAR(10) N'@LocID',
                              [PCode] NVARCHAR(18) N'@PCode',
                              [LotCode] NVARCHAR(4) N'@LotCode',
                              [LotId] NVARCHAR(10) N'@LotId',
                              [Stock] NVARCHAR(MAX) N'@Stock',
                              [StatusCode] NVARCHAR(3) N'@StatusCode',
                              [UnitCode] NVARCHAR(1) N'@UnitCode'
                        ) AS [BIResult]
                JOIN [Pt] ON [Pt].[Number] = [BIResult].[Number]
                LEFT JOIN #MyLot -- [Lot], the huge table, was here before
                            ON #MyLot.[LotCode] = [BIResult].[LotCode]
                JOIN [P] ON [P].[PtId] = [Pt].[PtId]
                 AND [P].[PCode] = [BIResult].[PCode]
                JOIN [SLoc] ON [SLoc].[PtId] = [Pt].[PtId]
                 AND [SLoc].[SLocCode] = [BIResult].[SLocCode]
                JOIN [BLoc] ON [BLoc].[LocId] = [Loc].[LocId]
                 AND [BLoc].[BLocCode] = [BIResult].[BLocCode]
               WHERE CAST([BIResult].[Stock] AS DECIMAL(13)) > 0  -- predicate was truncated; comparison assumed

-- always explicitly drop the temp table at the end of the stored proc.
DROP TABLE #MyLot -- the respective index is dropped along with it
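To quantify the gain from the pre-populated, indexed temp table, capture I/O and timing figures before and after the change. A sketch using the built-in session statistics settings (the procedure name is a hypothetical stand-in for your own slow proc):

```sql
-- Emit per-statement I/O and elapsed-time figures to the Messages tab;
-- run the procedure both ways and compare logical reads / elapsed ms
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

EXEC dbo.MyBigLotProcedure;  -- hypothetical name for the stored proc being tuned

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```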


Happy optimising!

Shed the heavy weight of that extra slow query bogging your instance down.



Tuesday, January 14, 2014

Microsoft Project Migration Template for the move to SQL 2012


For those planning a move to SQL Server 2012, although this process can apply for many Database Migrations, perhaps this Microsoft Project Migration Template could help? *
In this plan there are so many possible steps that it is better to trim down from too many to just those applicable, rather than risk missing steps. As an experienced migrator, you may already know an even better order of tasks for accomplishing a successful migration; by all means share it with us below in the comments. My approach here is to delve well into the domain of Project Management, as a DBA must/should do from time to time, so that if an official project manager is assigned to you, this document can be handed over to them as a good way of organising the upcoming work.
A quick nota bene at this planning stage: do not skip the time estimations, which in turn lead to the analysis of the critical path. There might be a point where you have to pool resources with another team member or pull in a consultant to ensure project delivery timelines are met, and somewhere along the critical path you may want to take that proactive step to mitigate deadline risk. In this way, whole-project planning with respect to time estimations is a predecessor to accomplishing this task.
And sorry for the notes sometimes being in FR; I just tend to mix up the sources/references often enough. This template has a little bit of history: while migrating databases in early 2005 for the Artificial Insemination [of cows] Centre of Quebec (CIAQ), Mathieu Noel (his profile on linkedin.com) helped me out greatly while writing this document. This version has had four major revisions so far, the most recent being this one for SQL 2012.
 * To view an MPP file without installing Project itself, you can use this free viewer. Exports of the Migration project plan to PDF and Excel are also available on my SkyDrive.
PS: as with all migrations, one should constantly try to adhere to the keep-it-simple rule (K.I.S.S.). Even this old post about a simple Oracle 10 migration to SQL Server 2008 was no exception. What we did from the very beginning was create insert scripts of all the data in the tables (not a huge database, in the tens of megabytes only), since the schema export had already been done for us by a vendor (and only needed minor tweaks, appreciatively). Before going through each table's insert script one by one to adjust the fully qualified table names, add SET IDENTITY_INSERT ON/OFF statements, and put a quick TRUNCATE before each BEGIN TRAN/inserts/COMMIT batch, I had scripted out, in a previous step, all the DROP/CREATE foreign key and constraint statements, to bring all the data in quickly without per-table FK/constraint drop and recreation.
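The constraint-drop scripting mentioned above can be generated rather than hand-written. A minimal sketch against the standard sys.foreign_keys catalog view (table and schema names come from your own database; script the CREATE statements out first, e.g. via SSMS Generate Scripts, so the keys can be re-applied after the load):

```sql
-- Generate a DROP statement for every foreign key in the current database;
-- run the output before the bulk inserts, then re-apply the scripted CREATEs
SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name)
       + ' DROP CONSTRAINT ' + QUOTENAME(fk.name) + ';' AS drop_stmt
FROM sys.foreign_keys AS fk
JOIN sys.tables AS t
  ON t.object_id = fk.parent_object_id;
```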

Monday, December 16, 2013

SQL Server 2012 AlwaysOn Presentation Given at Vermont PASS SQL Server User Group Last Week

This past Wednesday evening, 11th Dec, I braved the snowy roads down from Montreal to Winooski (Burlington area) to join a very friendly crowd at MyWebGrocer and presented all I know about AlwaysOn for the Vermont Professional Association for SQL Server, run by Roman Rehak.
I shall be posting an AlwaysOn script shortly, once I have cleaned up the code; for those who were there, Roman Rehak was also provided with all the files to redistribute.
If the presentation link is blocked for you, please try this one on LinkedIn's server http://www.linkedin.com/profile/view?id=2308075&trk=wvmp-profile (see right after summary).
Thanks again to Roman and especially My Web Grocer for sharing its amazing work space with us.

Wednesday, December 11, 2013

SQL Server Installation Folder Setup Log and Command Line Install Information


The other morning, during our regular meeting amongst fellow DBAs, I was commenting on where to read up on installation issues, or simply to validate in text form which components were added during an installation. This is the folder I want to point out: rootDrive:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\Log

Every install action (adding a node, settings validation, repair, remove, uninstall) appears in this folder. As soon as you have installation issues, go straight to the specific Detail.txt under the Log subfolder corresponding to the date and time the installation ran. You will see that Summary.txt has very little information, as its name suggests.

For those of you who have a solid state drive at home (desktop, laptop, etc.), to become familiar with how SQL Server is installed I recommend downloading the SQL 2012 Developer Edition ISO and familiarising yourself with the command line installation. If we could compare total install times versus a GUI install (it's all in the logs, no need for a stopwatch), that would be cool to share :)
For those who are curious about command line installs, here are some examples. Note that you can now mount your ISO natively in Windows Server 2012/Windows 8 and run the command directly from the mounted drive letter. I have avoided the use of /QS below because I like to see the GUI during the install, to validate the parameters a second time and ensure the instance starts off on the right foot.
A command line install for a non-clustered instance with Analysis Services:
setup.exe /ACTION="Install" /AGTSVCPASSWORD="19CharacterPassword" /ASSVCPASSWORD="19CharacterPassword" /SAPWD="19CharacterPassword" /SQLSVCPASSWORD="19CharacterPassword" /INDICATEPROGRESS="TRUE" /ENU="True" /UpdateEnabled="TRUE" /UpdateSource="Drive:\FOLDERCONTAININGLatestUpdate" /FEATURES=SQLENGINE,REPLICATION,FULLTEXT,DQ,AS,RS /HELP="False" /X86="False" /INSTALLSHAREDDIR="C:\Program Files\Microsoft SQL Server" /INSTALLSHAREDWOWDIR="C:\Program Files (x86)\Microsoft SQL Server" /INSTANCENAME="InstanceName" /INSTANCEID="InstanceName" /ERRORREPORTING="True" /INSTANCEDIR="C:\Program Files\Microsoft SQL Server" /AGTSVCACCOUNT="InstanceSpecificServiceAccountName" /ASSVCACCOUNT="InstanceSpecificServiceAccountName" /ASSVCSTARTUPTYPE="Automatic" /ASCOLLATION="Latin1_General_CI_AS" /ASDATADIR="DriveName:\olapdb_InstanceName" /ASLOGDIR="DriveName:\olaplog_InstanceName" /ASBACKUPDIR="DriveName:\olapbakup_InstanceName" /ASTEMPDIR="DriveName:\olaptmp_InstanceName" /ASCONFIGDIR="DriveName:\OLAP\Config" /ASPROVIDERMSOLAP="1" /ASSYSADMINACCOUNTS="ListOfUsers" "ADDINSTANCESPECIFICACCOUNT" /ASSERVERMODE="MULTIDIMENSIONAL" /FILESTREAMLEVEL="0" /SQLCOLLATION="Latin1_General_CI_AS" /SQLSVCACCOUNT="InstanceSpecificServiceAccountName" /SQLSYSADMINACCOUNTS="InstanceSpecificServiceAccountName" "ADDINSTANCESPECIFICACCOUNT" /SECURITYMODE="SQL" /INSTALLSQLDATADIR="DriveName:\sqlsysdb_InstanceName" /SQLBACKUPDIR="DriveName:\sqlbakup_InstanceName" /SQLUSERDBDIR="DriveName:\sqlappdb_InstanceName" /SQLUSERDBLOGDIR="DriveName:\sqlapplog_InstanceName" /SQLTEMPDBDIR="DriveName:\sqltmpdb_InstanceName" /RSSVCACCOUNT="NT Service\ReportServer$InstanceName" /RSSVCSTARTUPTYPE="Automatic" /FTSVCACCOUNT="NT Service\MSSQLFDLauncher$InstanceName"

And a command line instance Repair:
setup.exe /QS /ACTION="repair" /ENU="True" /INSTANCENAME="NAME" /ASSVCACCOUNT="domain\InstanceSpecificServiceAccount" /ASSVCPASSWORD="19CharacterPassword" /SAPWD="19CharacterPassword" /SQLSVCPASSWORD="19CharacterPassword"

For the most part, the cluster installation is exactly the same as the standalone SQL Server installation, with the exception of a few screens in the GUI. I would not recommend a Failover Cluster installation from the CMD prompt, since you miss all the checks of whether parameters are valid for installation, unless you run it without the /QS parameter, which means an attended installation launched from the command line. I find this a faster way of feeding the GUI installation procedure, validating as you go along that the parameters actually work before clicking Next (or equivalent) at each step.

Adding a node, however, is straightforward unattended and a real time-saver. N.B. when you add a node, you must provide the passwords for the service accounts again.


setup.exe /ACTION="AddNode" /AGTSVCPASSWORD="StrongPassword" /SQLSVCPASSWORD="StrongPassword" /INDICATEPROGRESS="true" /ENU="True" /UpdateEnabled="False" /UpdateSource="Drive:\FOLDERCONTAININGLatestUpdate" /HELP="False" /INDICATEPROGRESS="TRUE" /X86="False" /INSTANCENAME="InstanceName" /FAILOVERCLUSTERGROUP="ClusterRoleName" /FAILOVERCLUSTERIPADDRESSES="IPv4;159.208.196.63;Public;255.255.252.0" /FAILOVERCLUSTERNETWORKNAME="SQLVirtualClusterName" /CONFIRMIPDEPENDENCYCHANGE=1 /AGTSVCACCOUNT="domain\InstanceSpecificServiceAccount" /SQLSVCACCOUNT="domain\InstanceSpecificServiceAccount"

---this one is when you have to add AS also on the second node
setup.exe /ACTION="AddNode" /AGTSVCPASSWORD="StrongPassword" /SQLSVCPASSWORD="StrongPassword" /INDICATEPROGRESS="true" /ENU="True" /UpdateEnabled="False" /UpdateSource="MU" /HELP="False" /INDICATEPROGRESS="TRUE" /X86="False" /INSTANCENAME="InstanceName" /FAILOVERCLUSTERGROUP="ClusterRoleName" /FAILOVERCLUSTERIPADDRESSES="IPv4;IPADDRESSFORSQLVIRTUALSEVER;Public;255.255.252.0" /FAILOVERCLUSTERNETWORKNAME="DNSVirtualServerEntry" /CONFIRMIPDEPENDENCYCHANGE=1 /AGTSVCACCOUNT="domain\InstanceAccountName" /SQLSVCACCOUNT="domain\InstanceAccountName" /ASSVCACCOUNT="domain\user" /ASSVCPASSWORD="StrongPassword"

Using a Configuration file to add a second node to a cluster:
setup.exe /qs /ACTION="AddNode" /CONFIGURATIONFILE="DRIVEONOTHERNODE:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\ConfigurationFileINSTANCENAME.ini" /AGTSVCPASSWORD="15CharacterPassword" /ASSVCPASSWORD="15CharacterPassword" /SQLSVCPASSWORD="15CharacterPassword" /INDICATEPROGRESS="TRUE"

Changing the database server collation, if you set it wrong by accident (works exclusively for standalone instances, in my experience):
Setup /QS /ACTION=REBUILDDATABASE /INSTANCENAME="InstanceName" /INDICATEPROGRESS="TRUE" /SQLSYSADMINACCOUNTS="ML\oth_mlsqldbms" "listOfAccounts" "domain\userGroup" /SAPWD="StrongPassword" /SQLCOLLATION=SQL_Latin1_General_CP1_CI_AS
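After a REBUILDDATABASE run, it is worth confirming the collation actually changed before reattaching user databases. A quick check using the standard SERVERPROPERTY function and sys.databases catalog view (nothing here is specific to the rebuilt instance):

```sql
-- Server-level collation after the rebuild, plus per-database collations for comparison
SELECT SERVERPROPERTY('Collation') AS ServerCollation;

SELECT name, collation_name
FROM sys.databases
ORDER BY name;
```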


References (for all the other options):
http://msdn.microsoft.com/en-us/library/ms144259.aspx

Wednesday, October 09, 2013

Leveraging LinkedIn to Contribute to the DBA Community & Attract Opportunities with Internationally-Oriented Organisations


LinkedIn has matured over the past decade and I can sincerely say it is well worth the time to set up your profile and actually complete it. It is pretty much the ultimate business networking tool, until replaced by a competitor of course! Many of Facebook's features have been mimicked on the site as well, which helps in evaluating shared topic interests.

The prerequisite to taking full advantage of LinkedIn is completing one's profile to the 100% level and obtaining as many recommendations as possible, although now, with Endorsements (from LinkedIn contacts), written recommendations have largely been taken over by the latter. Pre-MVP award, I made efforts to request or exchange recommendations to grow past the ten-person threshold, since at the time I believed establishing credibility through online references was a significant prerequisite to mastering LinkedIn's networking potential. When someone recommends you online, they are taking a leap of faith in you, since they are stating in front of the entire world, basically, how they feel about your workplace conduct (i.e. playing nice in the sandbox). However, once endorsements for specific skills became available, their ease of use simply opened the floodgates to hundreds of endorsements (at least if you are contributing publicly through blogs or writing; that has been my experience, as you can see below).
LinkedIn endorsements, accumulated after five years of blogging, writing, speaking, etc.

You'll also be pleasantly surprised that if you describe exactly the way you prefer to work (e.g. personally, I described following Brad McGehee's Exceptional DBA guide) or the methodology you follow, it will bring in qualified clients/opportunities and give you the chance to filter out unwanted mandates. Ironically, my current job in Montreal was found through recruitment agencies working in another province, although my current and former colleagues play hockey together!

LinkedIn also provides opportunities for diversity of work, which contributes to experience as a whole, proves invaluable, and sustains the profession of being a DBA across platforms (or a DBA polyglot, as I have put it), even if only through minimal tasks executed over a few days here and there; these accumulate into a personal body of knowledge that bloggers can draw on themselves, in addition to the contribution to the community. Inside organisations, I encourage DBAs to post blogs to demystify our profession and approach, as well as to help educate developers and head off some pretty foul code.


Further, it should be treated as a longer-than-usual Curriculum Vitæ or Résumé in North America (unless you are in MX or the province of QC), in accordance with the format obviously, because if you place details in the wrong portion of your profile, an opportunity could easily be missed. I love the way a mate here in Montreal (Martin Arvisais) describes it: a great place 'pour vendre ta salade' (a cute local French way of saying to sell your stuff). You can now upload Word documents directly to your LinkedIn profile for those who would like to see the traditional format.

The other improvement, although not that recent, is that LinkedIn works much like a blog platform too, since you can share almost anything. It is a way to build a pillar of the all-important (in this net-oriented generation) Online Persona.

Another good reason, to be quite forthright, is showing how you can contribute to your professional community, and thus leveraging your contacts within the tool. There are several SQL Server related groups on LinkedIn, and my contributions through them are part of the reason why Canada's MVP Lead approached me back in 2009 for a nomination (also thanks to a referral from SQLServerToolBox.com's Scott Stauffer, a frequent speaker and SQL DBA based in Vancouver). What more motivation could one need to Link oneself In?



Friday, September 13, 2013

How to Avoid the 'Abuse' of SysAdmin by Applying User Defined Roles in SQL 2012 (and keep Exec.s and Auditors Happy)

This will not be a typical post: just a dive right into Data Access Language code, which provides a method to avoid the sysadmin fixed server role in SQL Server for DBAs and monitoring/auditing accounts, thanks to extensive explicit permissions and taking full advantage of SQL 2012 User Defined Server Roles, also known as Flexible Server Roles.

-- This script would be a required step to do post instance install and to apply flexible server roles
-- We are to apply this as a security policy in production environments, and then perform validation 
-- Could be applied on some Dev/UAT servers

-- start with a rollback / back-out - or clean out roles to start again (alternatively skip to line 50)
USE [master]
GO
Drop Server Role DBAs;
Drop Server Role Monitoring;
go
-- Add DBAs back to fixed server role sysadmin, unless on servers that will not be managed by DBAs
ALTER SERVER ROLE [sysadmin] ADD MEMBER [Group1]
ALTER SERVER ROLE [sysadmin] ADD MEMBER [MonitoringAccount1]
ALTER SERVER ROLE [sysadmin] ADD MEMBER [MonitoringAccount2]
ALTER SERVER ROLE [sysadmin] ADD MEMBER [AuditingAccount]
GO
-- drop explicit rights in master, model and msdb also
USE [msdb]
GO
ALTER ROLE [ServerGroupAdministratorRole] DROP MEMBER [DBAGroup2]
ALTER ROLE [ServerGroupAdministratorRole] DROP MEMBER [DBAGroup1]
GO
ALTER ROLE [SQLAgentOperatorRole] DROP MEMBER [DBAGroup2]
ALTER ROLE [SQLAgentOperatorRole] DROP MEMBER [DBAGroup1]
GO -- not supposed to exist, but just in case
ALTER ROLE [db_owner] DROP MEMBER [DBAGroup2]
ALTER ROLE [db_owner] DROP MEMBER [DBAGroup1]
GO
use model
go
GRANT select, insert, TAKE OWNERSHIP, view definition, update, execute, CONTROL, REFERENCES
         on schema::dbo to [DBAGroup2]
GRANT select, insert, TAKE OWNERSHIP, view definition, update, execute, CONTROL, REFERENCES
         ON SCHEMA::[dbo] TO [DBAGroup1]
GO -- don't forget you can easily apply DENY permissions too, to prevent data modification
use master
go
-- All sections of this Security Hardening should correspond to a master Document/ed procedure
ALTER LOGIN [sa] enable
GO
use [Master]
go
DROP USER [AuditingAccount]
DROP USER [MonitoringAccount1]
DROP USER [MonitoringAccount2]
--- END CLEAN UP / Rollback of Role Security hardening


-- BEGIN SQL Security Hardening
--- disable SA, but do not drop it, maybe needed for service packs or for backout in Startup -m option (single-user mode)
ALTER LOGIN [sa] disable  -- ALTER LOGIN [sa] enable
GO
-- before applying any security policy, ensure BUILTIN\Administrators not there
USE MASTER
GO
IF EXISTS (SELECT * FROM sys.server_principals
WHERE name = N'BUILTIN\Administrators')
DROP LOGIN [BUILTIN\Administrators]
GO


-- New in SQL 2012 - User Defined Server Roles, begin to take advantage of them
-- using roles rather than granting access to individuals, is a best practice in itself,
-- and the flexibility of user defined roles has become essential for many reasons
-- not limited to but including auditing, compliance, best management practices
Create Server Role DBAs Authorization [securityadmin]; 
-- where SecurityAdmin contains just a few who have FireCall IDs (elevated accounts, for rare use)

-- example with a few DBAs
IF NOT EXISTS (SELECT * FROM sys.server_principals
WHERE name = N'SeniorSQLDBA1')
CREATE LOGIN [SeniorSQLDBA1] FROM WINDOWS WITH DEFAULT_DATABASE=[master]

IF NOT EXISTS (SELECT * FROM sys.server_principals
WHERE name = N'SeniorSQLDBA2')
CREATE LOGIN [SeniorSQLDBA2] FROM WINDOWS WITH DEFAULT_DATABASE=[master]
go
-- add a few senior dbas to [securityadmin] only for now (unless Super User/System Admin account exists)
ALTER SERVER ROLE [securityadmin] ADD MEMBER [SeniorSQLDBA2]
ALTER SERVER ROLE [securityadmin] ADD MEMBER [SeniorSQLDBA1]
-- these senior dbas should backup each other in case of role issues and revised grants have to be applied
GO

IF NOT EXISTS (SELECT * FROM sys.server_principals
WHERE name = N'DBAGroup1')
CREATE LOGIN [DBAGroup1] FROM WINDOWS WITH DEFAULT_DATABASE=[master]

IF NOT EXISTS (SELECT * FROM sys.server_principals
WHERE name = N'DBAGroup2')
CREATE LOGIN [DBAGroup2] FROM WINDOWS WITH DEFAULT_DATABASE=[master]

CREATE USER [DBAGroup1] FOR LOGIN [DBAGroup1]
CREATE USER [DBAGroup2] FOR LOGIN [DBAGroup2]
GO
-- add DBA groups to the role
Alter server role DBAs add member [DBAGroup2]
-- we could use a subset of the following grants for different 'levels' of DBAs
-- (i.e. a JuniorDBA flexible server role)
Alter server role DBAs add member [DBAGroup1]

-- now lock down for operations we as DBAs should not be doing anyway
-- do not forget to give WITH GRANT rights when necessary
-- (please validate in the GUI afterwards that DBAs role has the correct DAL)

-- (all necessary permissions to do DBA job, minus unnecessary privileges)
GRANT ADMINISTER BULK OPERATIONS TO [DBAs] WITH GRANT OPTION
GRANT ALTER ANY CONNECTION TO [DBAs] WITH GRANT OPTION
GRANT ALTER ANY CREDENTIAL TO [DBAs] WITH GRANT OPTION
GRANT ALTER ANY DATABASE TO [DBAs] WITH GRANT OPTION
GRANT Shutdown to DBAs
GRANT control server TO [DBAs]
-- Luckily Control Server permission respects the following DENYs
-- which is not the case for sysadmin fixed role
GRANT ALTER ANY EVENT NOTIFICATION TO [DBAs] WITH GRANT OPTION
GRANT ALTER ANY EVENT SESSION TO [DBAs] WITH GRANT OPTION
GRANT ALTER RESOURCES TO [DBAs]
GRANT ALTER SERVER STATE TO [DBAs]
GRANT ALTER SETTINGS TO [DBAs]
GRANT AUTHENTICATE SERVER TO [DBAs]
-- Grants or denies the ability to use a particular signature across all databases on the server
-- when impersonation is used.
GRANT CONNECT SQL TO [DBAs] WITH GRANT OPTION
-- Grants or denies the ability to connect to the SQL Server.
-- All logins, when newly created, are granted this permission automatically
GRANT CREATE ANY DATABASE TO [DBAs] WITH GRANT OPTION
-- GRANT CREATE AVAILABILITY GROUP TO [DBAs] (if you have Av. Groups at all)
GRANT CREATE DDL EVENT NOTIFICATION TO [DBAs]
GRANT CREATE TRACE EVENT NOTIFICATION TO [DBAs]
GRANT VIEW ANY DATABASE TO [DBAs] WITH GRANT OPTION
GRANT VIEW ANY DEFINITION TO [DBAs]
GRANT VIEW SERVER STATE TO [DBAs]
GRANT ALTER ANY EVENT NOTIFICATION TO [DBAs]
GRANT ALTER ANY EVENT SESSION TO [DBAs]
GRANT ALTER ANY LOGIN TO [DBAs]  -- some may want this as a DENY
--- Now the explicit denys
DENY ALTER ANY AVAILABILITY GROUP TO [DBAs]
DENY ALTER ANY ENDPOINT TO [DBAs]
DENY ALTER ANY LINKED SERVER TO [DBAs]
-- debatable regarding linked servers
DENY ALTER ANY SERVER ROLE TO [DBAs] 
-- obviously, we want control over the number of roles
DENY ALTER TRACE TO [DBAs]
DENY CREATE ENDPOINT TO [DBAs]
Deny impersonate on login::sa to DBAs
-- add any other accounts that are individual users on the server with elevated rights, and the Service Account(s)
Deny Alter any Server Audit to DBAs
Deny Unsafe Assembly to DBAs;
GO
-- resolve master grants
USE [master]
GO
GRANT select, view definition, execute, CONTROL, REFERENCES on schema::dbo to [DBAGroup2]
GRANT select, view definition, execute on schema::sys to [DBAGroup2]
GRANT EXECUTE ON xp_readerrorlog TO [DBAGroup2]  -- helpful to get DBAs to query the error log
GRANT EXECUTE ON sp_readerrorlog TO [DBAGroup2]
ALTER ROLE [db_datareader] ADD MEMBER [DBAGroup2]
GO
GRANT select, view definition, execute, CONTROL, REFERENCES on schema::dbo to [DBAGroup1]
GRANT select, view definition, execute on schema::sys to [DBAGroup1]
GRANT EXECUTE ON xp_readerrorlog TO [DBAGroup1]
GRANT EXECUTE ON sp_readerrorlog TO [DBAGroup1]
ALTER ROLE [db_datareader] ADD MEMBER [DBAGroup1]
GO
-- resolve MSDB grants
USE [msdb]
GO
CREATE USER [DBAGroup1] FOR LOGIN [DBAGroup1]
CREATE USER [DBAGroup2] FOR LOGIN [DBAGroup2]
go
ALTER ROLE [ServerGroupAdministratorRole] ADD MEMBER [DBAGroup1]
ALTER ROLE SQLAgentOperatorRole ADD MEMBER [DBAGroup1]
GO 
ALTER ROLE [ServerGroupAdministratorRole] ADD MEMBER [DBAGroup2]
ALTER ROLE SQLAgentOperatorRole ADD MEMBER [DBAGroup2]
GO -- after roles, grant explicit rights to be sure nothing is missing
GRANT select, insert, TAKE OWNERSHIP, view definition, update, execute, CONTROL, REFERENCES
         on schema::dbo to [DBAGroup2]
GO
GRANT select, insert, TAKE OWNERSHIP, view definition, update, execute, CONTROL, REFERENCES
         ON SCHEMA::[dbo] TO [DBAGroup1]
GO -- allow DBAs to be part of msdb ownership?
--ALTER ROLE [db_owner] ADD MEMBER [DBAGroup1]
GO -- not necessary since they are in the above Administrator and Operator roles
ALTER ROLE [db_ssisadmin] ADD MEMBER [DBAGroup2]
go

-- resolve issues for all new databases created - fix MODEL database to include DBAs.
Use Model
GO
-- setup DenyData reader role by default for groups to cover Prod data constraint
CREATE USER [DBAGroup1] FOR LOGIN [DBAGroup1]
CREATE USER [DBAGroup2] FOR LOGIN [DBAGroup2]
-- all user databases for production, will have deny read on the data
GO
-- use roles even at the database level
CREATE ROLE [DBAs] AUTHORIZATION [dbo]
GRANT VIEW DATABASE State, execute, view definition TO [DBAs]
-- remove read access when necessary, but allow administration
-- add appropriate groups
ALTER ROLE [DBAs] ADD MEMBER [DBAGroup2]
ALTER ROLE [DBAs] ADD MEMBER [DBAGroup1]
-- and finally, requirements for some prod. environments
Deny select, insert, update TO [DBAs] 
GO
-- or deny select in prod user dbs this way
ALTER ROLE [db_denydatareader] ADD MEMBER [DBAGroup2]
ALTER ROLE [db_denydatareader] ADD MEMBER [DBAGroup1]
GO


-- WE DO NOT STOP THERE, CONTINUE with Monitoring Accounts
-- which traditionally have far too many privileges,
-- and the goal is to adhere to the principle of least privilege
use master
GO
-- create a monitoring server role and add the accounts that monitor the servers
CREATE SERVER ROLE [Monitoring] AUTHORIZATION [securityadmin];
-- create the monitoring and auditing logins
CREATE LOGIN [MonitoringAccount1] FROM WINDOWS WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[us_english]
CREATE LOGIN [MonitoringAccount2] FROM WINDOWS WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[us_english]
CREATE LOGIN [AuditingAccount] FROM WINDOWS WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[us_english]
ALTER SERVER ROLE [Monitoring] ADD MEMBER [MonitoringAccount1]
ALTER SERVER ROLE [Monitoring] ADD MEMBER [MonitoringAccount2]
ALTER SERVER ROLE [Monitoring] ADD MEMBER [AuditingAccount]
-- minimum WMI permissions required, so add to local admins on the server too
GRANT CONNECT SQL TO [Monitoring]
GRANT CONTROL SERVER TO [Monitoring]
GRANT ALTER TRACE TO [Monitoring]
GRANT VIEW ANY DATABASE TO [Monitoring]
GRANT VIEW ANY DEFINITION TO [Monitoring]
GRANT VIEW SERVER STATE TO [Monitoring]
GRANT CREATE DDL EVENT NOTIFICATION TO [Monitoring]
GRANT CREATE TRACE EVENT NOTIFICATION TO [Monitoring]
-- because we have granted control server, we must apply these DENY statements
DENY ALTER ANY AVAILABILITY GROUP TO [Monitoring]
DENY ALTER ANY ENDPOINT TO [Monitoring]
DENY ALTER ANY LINKED SERVER TO [Monitoring]
DENY ALTER ANY LOGIN TO [Monitoring]
DENY ALTER ANY SERVER ROLE TO [Monitoring] 
-- obviously, we want control over the number of roles
DENY CREATE ENDPOINT TO [Monitoring]
DENY IMPERSONATE ON LOGIN::[sa] TO [Monitoring]
-- add any other accounts that are individual users on the server
DENY ALTER ANY SERVER AUDIT TO [Monitoring]
DENY UNSAFE ASSEMBLY TO [Monitoring];
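
-- Optional sanity check (my addition, assuming the role name above): list the
-- explicit GRANT/DENY entries recorded against [Monitoring], so the broad
-- CONTROL SERVER grant can be reviewed alongside its compensating DENYs.
SELECT pr.name, pe.state_desc, pe.permission_name
FROM sys.server_permissions AS pe
JOIN sys.server_principals AS pr ON pe.grantee_principal_id = pr.principal_id
WHERE pr.name = N'Monitoring'
ORDER BY pe.state_desc, pe.permission_name;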

-- Monitoring users need to exist in master and model, and in msdb as datareader
GO
CREATE USER [MonitoringAccount1] FOR LOGIN [MonitoringAccount1]
CREATE USER [MonitoringAccount2] FOR LOGIN [MonitoringAccount2]
CREATE USER [AuditingAccount] FOR LOGIN [AuditingAccount]
--
ALTER ROLE [db_datareader] ADD MEMBER [MonitoringAccount1]
ALTER ROLE [db_datareader] ADD MEMBER [MonitoringAccount2]
ALTER ROLE [db_datareader] ADD MEMBER [AuditingAccount]
-- or more explicitly
GRANT SELECT, VIEW DEFINITION, EXECUTE ON SCHEMA::dbo TO [AuditingAccount]
GRANT SELECT, VIEW DEFINITION, EXECUTE ON SCHEMA::sys TO [AuditingAccount]
GRANT EXECUTE ON xp_readerrorlog TO [AuditingAccount]
GRANT EXECUTE ON sp_readerrorlog TO [AuditingAccount]
GO
GRANT SELECT, VIEW DEFINITION, EXECUTE ON SCHEMA::dbo TO [MonitoringAccount1]
GRANT SELECT, VIEW DEFINITION, EXECUTE ON SCHEMA::sys TO [MonitoringAccount1]
GRANT EXECUTE ON xp_readerrorlog TO [MonitoringAccount1]
GRANT EXECUTE ON sp_readerrorlog TO [MonitoringAccount1]
GO
GRANT SELECT, VIEW DEFINITION, EXECUTE ON SCHEMA::dbo TO [MonitoringAccount2]
GRANT SELECT, VIEW DEFINITION, EXECUTE ON SCHEMA::sys TO [MonitoringAccount2]
GRANT EXECUTE ON xp_readerrorlog TO [MonitoringAccount2]
GRANT EXECUTE ON sp_readerrorlog TO [MonitoringAccount2]
GO
USE [msdb]
GO -- for the system databases, either continue creating user-defined roles, or use fixed roles where possible
CREATE USER [MonitoringAccount1] FOR LOGIN [MonitoringAccount1]
CREATE USER [MonitoringAccount2] FOR LOGIN [MonitoringAccount2]
CREATE USER [AuditingAccount] FOR LOGIN [AuditingAccount]
GO
GRANT EXECUTE ON SCHEMA::dbo TO [MonitoringAccount1]
GRANT EXECUTE ON SCHEMA::dbo TO [MonitoringAccount2]
GRANT EXECUTE ON SCHEMA::dbo TO [AuditingAccount]
GO
-- improve this by adding appropriate role?
ALTER ROLE [db_datareader] ADD MEMBER [AuditingAccount] -- may need more elevated fixed roles here
ALTER ROLE [db_datareader] ADD MEMBER [MonitoringAccount1]
ALTER ROLE [db_datareader] ADD MEMBER [MonitoringAccount2]
GO
-- update model for defaults, similar to the DBA groups above
USE [model]
GO
CREATE USER [AuditingAccount] FOR LOGIN [AuditingAccount]
CREATE USER [MonitoringAccount1] FOR LOGIN [MonitoringAccount1]
CREATE USER [MonitoringAccount2] FOR LOGIN [MonitoringAccount2]
-- all production user databases will have read access denied on the data
GO -- create the monitoring database role
CREATE ROLE [Monitoring] AUTHORIZATION [dbo]
GRANT VIEW DATABASE STATE, EXECUTE, VIEW DEFINITION TO [Monitoring]
-- remove read access when necessary, but allow administration
DENY SELECT, INSERT, UPDATE TO [Monitoring]  -- requirements for some prod. environments
-- add appropriate groups or accounts for auditing or monitoring
ALTER ROLE [Monitoring] ADD MEMBER [AuditingAccount]
ALTER ROLE [Monitoring] ADD MEMBER [MonitoringAccount1]
ALTER ROLE [Monitoring] ADD MEMBER [MonitoringAccount2]

GO

USE [master]
GO
-- clean up the principals that were in sysadmin
ALTER SERVER ROLE [sysadmin] DROP MEMBER [DBAGroup2]
ALTER SERVER ROLE [sysadmin] DROP MEMBER [DBAGroup1]
ALTER SERVER ROLE [sysadmin] DROP MEMBER [AuditingAccount]
ALTER SERVER ROLE [sysadmin] DROP MEMBER [MonitoringAccount1]
-- you can now truly state that you have reduced the elevated sysadmin access granted to your auditors
GO
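
-- Final verification (my addition, not part of the original script): confirm
-- exactly who remains in sysadmin after the cleanup, for the audit trail.
SELECT m.name AS sysadmin_member
FROM sys.server_role_members AS srm
JOIN sys.server_principals AS r ON srm.role_principal_id = r.principal_id
JOIN sys.server_principals AS m ON srm.member_principal_id = m.principal_id
WHERE r.name = N'sysadmin';
GO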