May 27, 2016 | Exchange, Powershell

The other day I had to write a Powershell script that utilised some Exchange Powershell cmdlets. This script had to run as a scheduled task on a non-Exchange server that did not have the Exchange Management Tools installed, and installing them was not an option.

To do this I knew I had to import the Exchange cmdlets from the Exchange server into the local session and then run the script.

Here is the command line that I used in task scheduler to achieve my goal:

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -command "$s = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://YourexchangeServer.yourDomain.com/PowerShell/ -Authentication Kerberos ; Import-PSSession $s; &'c:\Path\to\script\ExchangeScript.ps1'"
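
If you would rather keep Task Scheduler simple and put the logic in a wrapper script instead, here is a minimal sketch of the equivalent (using the same placeholder server and script path as above, and tidying the session up afterwards):

$s = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://YourexchangeServer.yourDomain.com/PowerShell/ -Authentication Kerberos
Import-PSSession $s | Out-Null
try {
    # Run the script now that the Exchange cmdlets are available locally
    & 'C:\Path\to\script\ExchangeScript.ps1'
}
finally {
    # Tidy up the implicit remoting session
    Remove-PSSession $s
}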

April 16, 2016 | Configuration Manager 2012

I’ve implemented SCCM 2012 R2 at our corporate HQ and now I’ve started to deploy remote distribution points at our other offices. So there I am configuring the first one at our office in Germany and I get to the boundary groups page, where ‘Allow fallback source location for content’ is selected by default. My design does not include this feature for this particular DP, so I uncheck it as per the screenshot below:
[Screenshot: RemoteDP-Fallback1]
When I get to the summary page, I have a read through (you do too, right?) to make sure I haven’t made a mistake, and I notice that it says ‘Allow fallback source location for content = Yes’!! I never said that!
[Screenshot: RemoteDP-Fallback2]

So I go back and sure enough it’s unchecked. It looks like the logic is slightly wrong when the code presents the information on the summary screen: whatever state the checkbox is in, the summary shows the opposite!

So I left it unchecked and, even though the summary says it’s a big yes, when I checked post-installation it was in fact all OK… Phew! (See screenshot below)
[Screenshot: RemoteDP-Fallback3]

Right… one down, twenty to go…

March 11, 2016 | Configuration Manager 2012

I needed to back up bookmarks from the Chrome web browser using USMT as part of an image refresh task I’ve recently implemented using SCCM 2012R2 + MDT Integration.

Searching the Internet (why re-invent the wheel, eh?) only gave me a couple of results, and when I tried the ‘solution’ I found that it did not work.

Here is the main post that I used as a reference: http://www.itninja.com/question/user-state-migration-tool-1

The reason it failed was that the detection rule path in migapp.xml (referred to in the above link) was failing. When I installed Chrome on my system, the registry key being detected, HKLM\SOFTWARE\Wow6432Node\Google\Chrome, did not exist:
[Screenshot: usmt google reg]

I shortened the path of the detection rule to HKLM\SOFTWARE\Wow6432Node\Google, which was the only path that existed on my test systems, and that did the trick.

So all you need to do is modify migapp.xml…

This is the original; remove the two existing detection rules (highlighted in yellow):
[Screenshot: USMTOriginal]

…and replace them with the single new detection rule:
[Screenshot: USMTModified]
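
For reference, based on the standard migapp.xml detection syntax, the single replacement rule looks something like this (a sketch only — check it against the rules in your own migapp.xml):

<detects>
    <detect>
        <!-- Shortened path: the ...\Google\Chrome sub-key is not always present -->
        <condition>MigXmlHelper.DoesObjectExist("Registry","HKLM\SOFTWARE\Wow6432Node\Google")</condition>
    </detect>
</detects>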

February 4, 2016 | Powershell

We have decided to try to use network locations (see screenshot below) instead of drive maps where practical at my workplace, for a number of reasons: limiting the damage Cryptolocker can do and not running out of drive letters being two of the primary ones.

[Screenshot: Nloc]

I was trying to find a way of automating this that would give me the greatest flexibility, and naturally PowerShell once again came to the rescue.

I got most of the code from here and so cannot take credit for it. All I did was strip out some of the validation (as I did not need it) and turn it into a function, which lets me use it in a more versatile manner and meets my needs perfectly. I’ll be writing a new script that calls this function over the next couple of days.

You can find it on my Github here.
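
For reference, the underlying mechanism is simple: a network location is just a read-only folder under the user’s Network Shortcuts directory containing a desktop.ini (with the shell’s ‘folder shortcut’ CLSID) and a target.lnk pointing at the UNC path. Here is a minimal sketch of that idea, stripped of all validation — the function name and parameters are illustrative, not the ones from the linked code:

function New-NetworkLocation {
    param(
        [string]$Name,       # Display name of the network location
        [string]$TargetPath  # UNC path, e.g. \\server\share
    )

    # Network locations live in the user's Network Shortcuts folder
    $locationDir = Join-Path $env:APPDATA "Microsoft\Windows\Network Shortcuts\$Name"
    New-Item -Path $locationDir -ItemType Directory -Force | Out-Null

    # desktop.ini marks the folder as a shell 'folder shortcut'
    $ini = "[.ShellClassInfo]`r`nCLSID2={0AFACED1-E828-11D1-9187-B532F1E9575D}`r`nFlags=2"
    Set-Content -Path (Join-Path $locationDir 'desktop.ini') -Value $ini

    # target.lnk is what actually points at the UNC path
    $shell = New-Object -ComObject WScript.Shell
    $lnk   = $shell.CreateShortcut((Join-Path $locationDir 'target.lnk'))
    $lnk.TargetPath = $TargetPath
    $lnk.Save()

    # The shell only honours desktop.ini in read-only (or system) folders
    $folder = Get-Item $locationDir
    $folder.Attributes = $folder.Attributes -bor [IO.FileAttributes]::ReadOnly
}

New-NetworkLocation -Name 'Finance' -TargetPath '\\fileserver\finance'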

November 8, 2015 | Tools

I’m migrating data to a NetApp filer SAN and as part of this I hit an issue whereby some rather unscrupulous staff members had denied the Domain Administrator access to various folders and files, resulting in failed copy operations.

Unfortunately, due to the way the permissions were originally configured, I could not take ownership of the root directory and let inheritance do its magic.

I started to take ownership manually until I realised the extent of the work involved. This job was going to take hours!

Err…no.  Enter into the ring something that I suddenly remembered reading about a few years ago but had never actually used.  It’s a great tool and even better, it’s built right into the Windows operating system: takeown
Typing in:

takeown /? 

gave me the help screen and from that I constructed and ran the following command:

 takeown /F \\Path\to\RootDir /R

The above command gives ownership to the current user and recurses through subfolders and files. Depending on who you are logged in as, you may also want to use /A, which gives ownership to the Administrators group instead of the currently logged-in user. I ran this and 4 hours of work was completed in about a minute. Perfect. Back to migrating that data…
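
For example, this variant gives ownership to the Administrators group and automatically answers Yes when the current user lacks the ‘list folder’ permission on a directory (an illustrative line rather than the exact one I ran):

takeown /F \\Path\to\RootDir /R /A /D Y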

September 26, 2015 | Group Policy, Project: 2008R2 to 2012R2, Server 2012R2

I’m currently upgrading domain controllers from 2008R2 to 2012R2 in various countries in my workplace. As I was planning our UK and Germany upgrades I noticed that our UK PDC has its NTP time source set manually. As part of the project I will be moving the PDC FSMO role from its existing DC to another, and then moving it once again at a later stage in the project!

Naturally I didn’t want to set the NTP time source manually each time, so here’s how I configured it via GPO so I don’t have to worry about it:

The first thing I did was to create a GPO filter that would target only my PDC:

1.
In the Group Policy Management Console, select the WMI Filters node, right-click it and select New:

[Screenshot: Where to set wmi filter]

2.
Give the filter a meaningful name then click the Add button:

[Screenshot: Click Add on filter]

3.
Type the query to target the PDC emulator as shown in the screenshot below.  DomainRole = 5 targets only the PDC.  I found this information here, where you can also learn how to target other roles if need be.

[Screenshot: The wmi filter]
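
For reference (in case the screenshot is hard to read), the query is the standard Win32_ComputerSystem one, run against the default root\CIMv2 namespace:

Select * from Win32_ComputerSystem where DomainRole = 5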

4.
When I clicked OK on my 2012R2 DC I received the following error:
[Screenshot: Error message - ignore]

On investigation I discovered that it can be safely ignored as it seems to be a bug.  There are a few posts out there saying to enclose the where clause in parentheses or quotes, but neither worked for me.  At any rate, ignoring the message worked: I tried transferring the PDC role a couple of times and the GPO switched accordingly despite the message, so all’s good.

5.
Click Save on your newly created filter:
[Screenshot: Click save]

6.
Now for the GPO.  Create a new GPO and navigate to the following:
Computer Configuration\Administrative Templates\System\Windows Time Service\Time Providers

7.
Select ‘Configure Windows NTP Client’ and enter the name or IP address of your NTP server followed by ,0x01 (incidentally, if you want to know more about the flags, check out this excellent post).

If you wish to add more than one NTP server then note that they are space separated, e.g. (note the space between the 0x01 and the second server name):
0.pool.ntp.org,0x01 1.pool.ntp.org,0x01
[Screenshot: Configure NTP Client]

8.
Enable this too while you are there…
[Screenshot: enable client]

9.
And this one…
[Screenshot: Enable NTP Server]

10.
Now all you need to do is select the WMI filter you created earlier in your GPO, and link the GPO to your Domain Controllers OU:
[Screenshot: Select your filter on the GPO]

11.
When you flip the PDC FSMO role you will see the GPO applied to the new PDC once the DCs refresh their Group Policy (every 5 minutes by default):
[Screenshot: GPO Applied to PDC]
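
You can also confirm on the new PDC itself that the settings have taken effect by asking w32tm where it is getting time from:

w32tm /query /source

(w32tm /query /status gives a more detailed view if you need it.)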

That’s it – now when I move the PDC FSMO role throughout my UK\Germany project I have one less thing to worry about!

June 26, 2015 | Powershell

This post is for my own reference, although if you stumble upon it and find it useful then cool!

I have configured AppLocker to run in ‘Audit’ mode and wanted a quick method of seeing what would be blocked without having to log on to individual computers and check the event logs.

That way I can proactively monitor computers and build the whitelist rules before I switch AppLocker to ‘Enforce’ mode.

I’ve seen there are a number of AppLocker cmdlets that I’ll be exploring later, but for now, here’s my solution on GitHub.
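
The script on GitHub does the legwork, but the heart of the approach is just reading the AppLocker audit events remotely. A minimal sketch of that idea (the computer name is a placeholder; event ID 8003 is the audit-mode ‘would have been blocked’ event for EXE and DLL rules):

# Audit-mode events: what AppLocker *would* have blocked if enforced
Get-WinEvent -ComputerName 'PC01' -FilterHashtable @{
    LogName = 'Microsoft-Windows-AppLocker/EXE and DLL'
    Id      = 8003
} | Select-Object TimeCreated, Message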

June 4, 2015 | Group Policy, Powershell

Currently at my workplace, our GPOs are backed up every now and then using a manual process via the GPMC.  I decided it was time to automate this process, and I made a few interesting discoveries along the way which I thought I would share here.

Backing up all of the GPOs is very easy in Powershell and it’s been documented a billion times all over the Internet:

Backup-GPO -Path C:\GPOBackups -All

Setting this up as a scheduled task completed the job nicely.
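
For completeness, the scheduled task action is nothing more exotic than running the cmdlet via powershell.exe. Something along these lines (the module import is optional on PowerShell 3 and later, and the backup path is illustrative):

C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -command "Import-Module GroupPolicy; Backup-GPO -Path C:\GPOBackups -All"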

The tricky part was backing up the GPO links.  In my environment there are a lot of GPOs linked to a myriad of OUs throughout our organisation.  In the event of a disaster I want an easy method of seeing what those links were, plus an automated method of not only restoring the GPOs but also re-creating the links.  What follows is the first part of my journey…

I went with the cmdlet Get-GPOReport for this.  First of all I ran it and generated an HTML report to see what it contained:

Get-GPOReport -GUID 42917962-6bfd-4d17-ade0-bfe411245bef -Path C:\GPOReport.html -ReportType html

The information presented included the GPO links – perfect:

[Screenshot: htmlLinks]

The next step was to script it and set it up as a scheduled task to run alongside my backup GPO task.  My criteria were to create a separate report for each GPO, and to name each report after the GPO along with the date the report was run.  I also decided to export the report as XML as this would give me greater flexibility later on and help me achieve my ultimate goal:

Get-GPO -All | ForEach-Object { Get-GPOReport -Guid $_.Id -Path "c:\GPOBackups\Reports\$($_.DisplayName) $(Get-Date -Format 'dd-MM-yy').xml" -ReportType Xml }

I can now easily interrogate the XML for any info I need – for example, to see the GPO links for a specified GPO:

$Path = 'c:\GPOBackups\reports'
$GPOXML = [xml](Get-Content "$path\Set Desktop Wallpaper 04-06-15.xml")
$GPOXML.GPO.LinksTo

Which displays the following info:

[Screenshot: the Links]

If I wanted the GUID I could further query the xml by adding the following:

[string]$GPOGUID = $GPOXML.gpo.Identifier.Identifier.InnerText
#Clean up the GUID...
$GPOGUID = $GPOGUID.Trim("{,}")

And if I wanted the GPO name:

$GPOName = $GPOXML.gpo.Name

You get the idea…

I now have all of the information I need, and automated measures in place, to write a script that would restore all or some of my GPOs, including the links, in the event of a disaster. (But that’s a post for another day!)

If there’s a better/simpler method then let me know as I’m still on my Powershell learning journey!

May 11, 2015 | Powershell

I needed a script to delete local user profiles on computers, and all of the existing scripts I found on the Internet were either overly complicated or would only delete the directory whilst leaving the registry entries in place.

Other scripts I found would run synchronously when you had multiple computers, i.e. one at a time… you would have to sit and watch them plod along whilst 100 or so profiles were slowly deleted before moving on to the next computer and doing the whole thing again. Not fun when you have 50 or 60 computers to get through.

Although I considered using a workflow for this (as I needed to run it on lots of computers at once), I found that it overcomplicated things when really I only needed Powershell jobs, which did the (no pun intended) job.  This is a great Powershell feature: in essence, by using -AsJob I’ve achieved what I would be looking to do with a workflow but without the complexity that workflows bring. Rather than wait for computer1 to delete its profiles, then computer2, then computer3 and so on, I send the ‘job’ to all computers in the pipeline at once, so every computer runs the script at the same time.

The script itself is nothing flashy; it simply does what it needs to do – no user menu, just fire and forget – which met my requirements. (Although maybe not yours – you can always embellish the script for your own purposes if you need more from it.)

Special accounts and any logged-on user will be ignored and won’t be deleted – this is great as it means you can run it on machines that are already logged on.  If you have any other accounts that you want to keep, simply add them to the filter in the code – that part is pretty straightforward.

It also accepts pipeline input so if you want to pipe in a text file of computer names or whatever then go ahead.
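
To give you a flavour of the pattern (a stripped-down sketch rather than the actual script): deleting a Win32_UserProfile instance removes both the profile directory and its registry entries, and -AsJob fans the work out to every computer at once:

# Sketch only: delete non-special, non-loaded local profiles on many computers at once
$computers = Get-Content '.\computers.txt'
$job = Invoke-Command -ComputerName $computers -AsJob -ScriptBlock {
    Get-WmiObject -Class Win32_UserProfile |
        Where-Object { -not $_.Special -and -not $_.Loaded } |
        ForEach-Object { $_.Delete() }   # removes the directory AND the registry entries
}
Wait-Job $job | Receive-Job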

You can find the script in my Powershell repo on GitHub; it’s called Delete-Profiles.