Configuring DPM Agents for Non-domain Machines using Certificates

Hi All,

There are lots of articles on this; Microsoft’s own documentation is far more detailed for 2012 R2 than it is for 2016, but the concept remains the same.

My problems started because the certificate authority forming the certificate chain (comprising an offline root CA and a domain enterprise subordinate CA) could not be contacted by agents on the other side of a firewall, at least not without making swiss cheese of the firewall by allowing port 80 through to the CRL distribution point, or standing up an alternative CRL location.

Follow the cert template setup guide here.

In the settings for the template, change the subject name handling to “Supply in the request”, as these certs are for non-domain machines.


Make sure the permissions on the template allow Read & Enroll. Then we can add the template to the CA for use, and head for the MMC console with the local computer certificate snap-in on your DPM server.

Add the cert to your DPM server following the guide above, generate the BIN file and copy it to the agent.
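For reference, generating the BIN file on the DPM server is done from the DPM Management Shell with Set-DPMCredentials; a minimal sketch (the server name, output path and thumbprint are placeholders for your own values):

# Generates CertificateConfiguration_<DPMServerName>.bin in the output path
Set-DPMCredentials -DPMServerName "YOURDPMSERVER" -Type Certificate -Action Configure -OutputFilePath "C:\Temp" -Thumbprint "<your cert thumbprint>"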

If your agent is behind a firewall, make sure to open the additional TCP 6076 port required for the certificate communication.
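If the agent’s own Windows Firewall is in play too, a one-liner (a sketch using the built-in NetSecurity cmdlets) opens it up:

# Allow inbound DPM certificate communication on TCP 6076
New-NetFirewallRule -DisplayName "DPM Certificate Communication (TCP 6076)" -Direction Inbound -Protocol TCP -LocalPort 6076 -Action Allow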

If your non-domain agent is unable to access the CRL distribution point (required for initial verification of the cert), then you will need to manually import the CRL files. Open Explorer to your cert server at \\yourcertserver\certenroll, where you will find your CRL files.

Import your CRL files (copied from the CA) using:

certutil -addstore CA "C:\CRLfolderpath\CRLfilename.CRL"

Remember to import the full chain of valid CRL files. Once they are added in, run your DPM agent setup command and you will have a BIN file with which to complete setup.
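If you have a handful of CRL files to pull in, a quick loop saves typing, and for completeness here is a sketch of the agent-side setup command (the paths, DPM server name and BIN filename are placeholders for your own):

# Import every CRL file copied from the CA in one go
Get-ChildItem "C:\CRLfolderpath\*.crl" | ForEach-Object { certutil -addstore CA $_.FullName }

# Configure the agent for certificate authentication - this generates the agent's BIN file to copy back to the DPM server
& "C:\Program Files\Microsoft Data Protection Manager\DPM\bin\SetDpmServer.exe" -dpmCredential CertificateConfiguration_YOURDPMSERVER.yourdomain.com.bin -OutputFilePath C:\Temp -Type NonDomain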

Support

Bizarrely – Microsoft doesn’t support backup of machines in a “Perimeter Network” – https://technet.microsoft.com/en-us/library/hh757801(v=sc.12).aspx

I’m not sure about anyone else, but I make extensive use of VLANs, which are subject to firewall rule sets, and agents work fine in these scenarios. So why would a “perimeter network” be any different?

I suspect this is just noted because MS have not fully tested support of agents in this scenario, but I do not see any reason why it should not work.

However, follow best practice and remember that if backup of a system is essential, you are best sticking to the official supported guidelines 🙂

Qlik Sense – PostgreSQL, PGPASS and backups…

A little off my usual subject matter, but I was tasked with this recently and couldn’t find anything really detailing it all in one place.

Anyway, those of you familiar (or not) with this DB will be looking for some easy way to back it up. Well, there isn’t. It’s a good old-fashioned dump-to-flat-file affair.

Command lines to do this are detailed here: Qlik Sense Backup Guide

pg_dump.exe -h localhost -p 4432 -U postgres -b -F t -f "c:\QSR_backup.tar" QSR

As you will read there, when you back up you can either be prompted for a password (???) or specify it in plain text (sigh). To store the password instead, you need to use a PGPASS file.

PGPASS files are detailed here: PostgreSQL docs

File format needs to be:

hostname:port:database:username:password

What it doesn’t clearly explain is that this is a lookup list (think HOSTS file). If you were to use the pg_dump command in the example above, your PGPASS file should read:

localhost:4432:QSR:postgres:YourPassword

For multiple databases across multiple hosts, you need to list each combination as its own entry in the file. When pg_dump.exe references this information, it looks up the hostname, port, database and username given on the command line to select the matching password.
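For example, a pgpass.conf covering databases on more than one host might look like this (the extra hostnames, databases and accounts are made up for illustration):

localhost:4432:QSR:postgres:YourPassword
dbhost02:5432:SalesDB:svc_pgbackup:AnotherPassword
dbhost03:5432:FinanceDB:svc_pgbackup:YetAnotherPassword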

Security

I like to work with service accounts for applications, so it makes sense to put the pgpass.conf file in a folder with restricted permissions, so that only the service account user or a domain admin can get at it. Run your script using this account or one with permissions to read the file.
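Locking the folder down can be scripted as well; a rough sketch (the folder path, domain and service account name are placeholders):

# Strip inherited permissions, then grant access to the service account and domain admins only
icacls "C:\securefolder" /inheritance:r /grant:r "YOURDOMAIN\svc_qliksense:(OI)(CI)R" /grant:r "YOURDOMAIN\Domain Admins:(OI)(CI)F"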

Scripting Backup

My preference here is to call the script on a schedule. I’m using PowerShell as I have multiple nodes in the site, and services need to be stopped on every node for the backup to take place.

 

$scriptblock1={
    # Stop the Qlik services, leaving the repository service until last
    Stop-Service QlikSenseEngineService
    Stop-Service QlikSensePrintingService
    Stop-Service QlikSenseProxyService
    Stop-Service QlikSenseSchedulerService
    Stop-Service QlikSenseServiceDispatcher
    Get-Service QlikSenseRepositoryService | Stop-Service
}
$scriptblock2={
    # Start the repository service first, then any remaining stopped Qlik services
    Get-Service QlikSenseRepositoryService | Start-Service
    Get-Service qlik* | ? status -eq 'stopped' | Start-Service
}

# Shutdown services on primary & secondary nodes
Invoke-Command -ScriptBlock $scriptblock1
Invoke-Command -ComputerName secondnode -ScriptBlock $scriptblock1

# Call backup file
(Start-Process -FilePath "cmd.exe" -ArgumentList '/c c:\backup.bat' -Wait -PassThru).ExitCode

# Start services up on primary & secondary nodes
Invoke-Command -ScriptBlock $scriptblock2
Invoke-Command -ComputerName secondnode -ScriptBlock $scriptblock2

# Move backup to another location
$backupfile="d:\backups\QSR.tar"
$destination="\\locationofyourchoice\CONFIG\Backup"
Move-Item $backupfile -Destination $destination -Force

The script calls a backup.bat file which contains the following:

SET PGPASSFILE=C:\securefolder\pgpass.conf
"C:\Path to Qliksense\Repository\PostgreSQL\9.3\bin\pg_dump.exe" -U yourDBadminuser -h localhost -p 4432 -b -F t -f "d:\backups\QSR.tar" QSR

The file sets a path reference to your pgpass.conf file, then calls pg_dump with the required parameters, so it can look up the password and output your backup to D:\backups.

PowerShell then picks the file up and moves it elsewhere. I suggest moving it into the same file structure as your Root, Apps, Logs, CustomData & Static content (specified on install), so that you can use your preferred backup software (DPM etc…) to protect it.
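To run the whole thing on a schedule under that service account, something like this will register a nightly task (the script path, account, password handling and time below are placeholders, adjust to taste):

# Register a 2am nightly task that runs the backup script as the service account
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-File C:\scripts\QlikBackup.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "Qlik Sense QSR Backup" -Action $action -Trigger $trigger -User "YOURDOMAIN\svc_qliksense" -Password "YourPassword"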


Deploying custom registry keys – to use with System Center

It can be useful to automatically tattoo servers on deployment with a custom set of registry keys: environment, service and component.

Example:

AssetName,Environment,Service,Component
SERVER01,PROD,WEB APP,WFE
SERVER02,UAT,WEB APP,WFE
SERVER03,DR,WEB APP,WFE
SERVER04,PROD,WEB APP,APP
SERVER05,UAT,WEB APP,APP
SERVER06,DR,WEB APP,APP
SERVER07,PROD,WEB APP,SQL
SERVER08,PROD,WEB APP,SQL
SERVER09,UAT,WEB APP,SQL
SERVER10,DR,WEB APP,SQL

Unfortunately we had a bunch of legacy servers out there, with a flaky app holding this information centrally.

Not only did we want this information in SCSM, but we also wanted it available for SCOM and SCCM to use for different purposes. So, armed with a CSV of data (in the format above), I needed to get this applied quickly to a few hundred VMs.

#Set Variables
$REGPATH="HKLM:\SOFTWARE\MyCompanyName"
$SCRIPT={
    $REGPATH = $args[0]
    $ENVVAL = $args[1]
    $SERVAL = $args[2]
    $COMVAL = $args[3]
    # Create the key if it doesn't exist yet, then tattoo the three values
    if (-not (Test-Path $REGPATH)) { New-Item -Path $REGPATH | Out-Null }
    New-ItemProperty -Path $REGPATH -Name Environment -PropertyType String -Value $ENVVAL -Force
    New-ItemProperty -Path $REGPATH -Name Service -PropertyType String -Value $SERVAL -Force
    New-ItemProperty -Path $REGPATH -Name Component -PropertyType String -Value $COMVAL -Force
}
 
# Import CSV file
$list = Import-Csv C:\temp\ServiceData\servicedata.csv
 
# Pipe variable contents and invoke script
$list | foreach-object{
    $obj = $_
    Invoke-Command -ComputerName $obj.AssetName -ScriptBlock $SCRIPT -ArgumentList $REGPATH,$obj.Environment,$obj.Service,$obj.Component
}
 
# End of Script

The script above sets a variable for the reg path, then a script block, which will be passed to each server remotely using Invoke-Command.

This script block sets variables based on the command arguments received in the loop at the end of the script. The CSV data is formatted as the example table above, so the command connects to the computer (defined as AssetName), sends the script (variable $SCRIPT) and passes the reg path, Environment, Service & Component data as argument positions 0, 1, 2 & 3.

At the other end, it runs the script passed; for line 1 of the example CSV above, this works out as:

$REGPATH = "HKLM:\SOFTWARE\MyCompanyName"
$ENVVAL = "PROD"
$SERVAL = "WEB APP"
$COMVAL = "WFE"
if (-not (Test-Path $REGPATH)) { New-Item -Path $REGPATH | Out-Null }
New-ItemProperty -Path $REGPATH -Name Environment -PropertyType String -Value $ENVVAL -Force
New-ItemProperty -Path $REGPATH -Name Service -PropertyType String -Value $SERVAL -Force
New-ItemProperty -Path $REGPATH -Name Component -PropertyType String -Value $COMVAL -Force

It will proceed to loop round and apply the keys to each server in turn. Yes, it’s raw and there’s no error handling, but you could easily put a try/catch in there to verify each server can be contacted (see the sketch below), plus you can output the results to a file etc…
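Something like this (a rough sketch reusing the variables from the script above) would log any servers that couldn’t be reached:

$list | ForEach-Object {
    $obj = $_
    try {
        Invoke-Command -ComputerName $obj.AssetName -ScriptBlock $SCRIPT -ArgumentList $REGPATH,$obj.Environment,$obj.Service,$obj.Component -ErrorAction Stop
    }
    catch {
        # Record unreachable or failed servers for a retry later
        "$($obj.AssetName): $($_.Exception.Message)" | Add-Content C:\temp\ServiceData\failures.log
    }
}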

Now you can build out dynamically adjusting patch groups in SCCM based on Environment & Service, gather data into SCSM for services, customise SCOM monitoring & alerting based on Environment, and set scheduled maintenance mode in SCOM for these groups when they patch.

After all, you don’t want to be dragged out of bed for a non-prod server going offline or a routine patch cycle.

Deploy DPM Remote Management Console 2016 + UR2

Unlike all the other System Center products, which can normally accept a straightforward setup.exe /install /client and install silently, DPM is different (no shock there then!)

After a long search for documentation on the available install switches, it led me to a blog post by Steve Buchanan, which is for the 2012 console install.

So, 2016 follows the same principle, but for some very bizarre reason the source media contains:

  • 2012 Console
  • 2012 SP1 Console
  • 2012 R2 Console
  • 2016 Console

The only command line for install – Setup.exe /i /cc /client – installs all 4 versions – FAIL.

So, the only way round as far as I can see is to live with it and then remove the unnecessary components after install, then apply UR2.

Follow Steve’s post to get it into Config Manager (I’m not rewriting his post). In the source directory, add your source media, a copy of the UR2 console patch (you can extract the file and grab the MSP, called DPMMANAGEMENTSHELL2016-KB3209593.MSP) and finally a batch file for the install, then reference that batch file instead.

So, your install folder should contain the source media, the extracted UR2 MSP and your batch file side by side.

In your batch file:

REM Install all four console versions (no selective switch available)
start /wait cmd /c "Setup.exe /i /cc /client"

REM Uninstall the three unwanted 2012-era consoles
start /wait cmd /c "msiexec.exe /x {DFF93860-2113-4207-A7AC-3901ABCE8002} /passive"
start /wait cmd /c "msiexec.exe /x {FF6E79E3-66E5-4079-BE10-2B9CFBE3B458} /passive"
start /wait cmd /c "msiexec.exe /x {88E17747-6E2C-48A0-88CC-396AC8D9C5BB} /passive"

REM Repair the 2016 console, then apply the UR2 patch
start /wait cmd /c "msiexec.exe /f {BF23ED54-5484-4AC1-8EA7-6ACAFBBA6A45} /qn"
start /wait cmd /c "msiexec.exe /update DPMMANAGEMENTSHELL2016-KB3209593.MSP /qb"

So, we are installing all the consoles, then removing 3 of the 4 versions. For me this caused the 2016 console icons to go awry, hence a quick repair of the 2016 one before finally installing the UR2 MSP.

Don’t forget to reference Visual C++ 2008 Redist x64 in your dependencies list in SCCM – otherwise it won’t install 🙂

Enjoy!


System Center 2016 UR3

It’s out now –

https://support.microsoft.com/en-hk/help/4020906/update-rollup-3-for-system-center-2016

A lot of good VMM fixes in there, which I will be testing soon. A bulk host agent update script is in Charbel’s blog here: https://charbelnemnom.com/2017/05/update-rollup-3-for-system-center-2016-is-now-available-sysctr-systemcenter-scvmm/

Details of SCOM fixes in Kevin’s blog here: http://kevingreeneitblog.blogspot.co.uk/2017/05/scom-2016-update-rollup-3-ur3-now.html

I’m a little disappointed to see DPM missed out on an update in UR3. VMware support is still missing from 2016, but all will be forgiven if it turns up in UR4, along with fixes for the woes currently experienced with UR2:

  • Tape Library Sharing – a 2012 OS cannot remove TL sharing, and re-establishing TL sharing on a 2016 OS required a manual DB cleanout (with Premier Support).
  • Console crashing on PG alteration – requires a DLL from MS (see my previous posts).
  • Mount points – whilst supported for the storage (see my other posts), they uncover a known issue with DPM mounting the VHDX files for Modern Backup Storage; the workaround is to add a drive letter to the storage.

If you don’t urgently need supported SQL 2016 backups / SharePoint 2016 protection from DPM, I would seriously consider sticking to UR1 for now.

Roll on UR4! 🙂


DPM 2016 agent installations – Making your life easier with SCCM

Take the pain away from manual deployment – grab the agent and put it into SCCM. The command lines for agent install (2016 UR2) are:

DPMAgentInstaller_KB3209593_AMD64.exe /q /IAcceptEULA

DPMAgentInstaller_KB3209593.exe /q /IAcceptEULA (for x86)

Just make sure all the agent pre-reqs are in place (WMF 4.0 for 2008 R2 etc…) and make the detection of those a pre-req for the SCCM deployment.

If you know which DPM server you are going to protect with, simply add the server name to the install command above; that will open the ports and make the agent ready to be attached.
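Something like this, with your own DPM server name in place of the placeholder:

DPMAgentInstaller_KB3209593_AMD64.exe /q YOURDPMSERVER.yourdomain.com /IAcceptEULA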

If you don’t know just yet, then run a second SCCM task to call a batch file running setdpmserver.exe (in the DPM agent bin directory) to configure the agent.
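A one-line batch file along these lines does the job (the path and server name are placeholders):

"%ProgramFiles%\Microsoft Data Protection Manager\DPM\bin\setdpmserver.exe" -dpmservername YOURDPMSERVER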

Run an “Application deployment type compliance details” report in SCCM, using your target collections, application, deployment type and status of “Success” to generate a CSV file of the installed agents.

Take the computer name column in Excel, append your domain name (using concatenate) and put the resulting list into a .txt file (no headings or any other info required).
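If you would rather skip Excel, a quick PowerShell equivalent (the CSV path, column name and domain are assumptions, check them against your report output):

# Append the domain to each computer name and write one FQDN per line
Import-Csv C:\temp\agentreport.csv | ForEach-Object { "$($_.'Computer Name').yourdomain.com" } | Set-Content C:\temp\agents.txt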

Open the DPM console – select Install, Attach Agents, click add from file and point to your txt file.

Output from the SCCM report, manipulated and imported in around 10 minutes, saving many hours of manual config.

Job Done!

VMM 2016 UR2 on Windows 2016 – Storage Management Bug

Recent installs of VMM 2016 have shown nice improvements over 2012, especially a much needed performance boost with storage operations through SMI-S.

Our latest SMI-S provider from NetApp (for ONTAP 9.1) in combination with VMM 2016 seems to be light years ahead of 2012 R2. It’s responsive and carries out tasks in a tenth of the time that VMM 2012 took with ONTAP 8.

All issues with the SMI-S provider going unresponsive have been found to be down to a little service running in Windows: the svchost.exe-hosted storage management service (MSStrgSvc in the crash log below).

This little chap seems to randomly keel over – with system and application logs revealing nothing as to why.

Log Name: Application
Source: Application Error
Date: 28/04/2017 17:44:20
Event ID: 1000
Task Category: (100)
Level: Error
Keywords: Classic
User: N/A
Computer:
Description:
Faulting application name: svchost.exe_MSStrgSvc, version: 10.0.14393.0, time stamp: 0x57899b1c
Faulting module name: concrete.dll, version: 10.0.14393.0, time stamp: 0x57899a8c
Exception code: 0xc0000005
Fault offset: 0x0000000000002eb0
Faulting process ID: 0xe38
Faulting application start time: 0x01d2c03e31b6c56d
Faulting application path: C:\Windows\system32\svchost.exe
Faulting module path: c:\windows\system32\concrete.dll
Report ID: 1bc955fe-6a3b-469b-8c0c-de7992f3858d
Faulting package full name:
Faulting package-relative application ID:

I have logged it with Premier Support; they know about it, but a fix isn’t available as yet. Frustrating, but not the end of the world: just start the service again, wait 2 minutes, then refresh the provider in SCVMM and all is good!
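If it bites often, the recovery can be scripted from the VMM server; a rough sketch (it assumes the service name MSStrgSvc taken from the crash log above, and your provider name in place of the placeholder):

# Restart the crashed storage service, give it a couple of minutes, then refresh the provider
Start-Service MSStrgSvc
Start-Sleep -Seconds 120
Get-SCStorageProvider -Name "YOURPROVIDER" | Read-SCStorageProvider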

As soon as a fix is available – I will post an update here.

Updated:

  • WinCXE is planning a hotfix for Windows 10 version 1607. The KB article ID is going to be 4019472.

This should be available towards the end of May 2017.