Encrypt data synced to OneDrive with BitLocker

Hey there!

Long time no article, but here is a pretty nifty idea to enhance the security of your public-storage files – if you worry that much 🙂

The high level steps are simple:
1) Create a VHDX in the OneDrive sync location (or any other drive you want to use; it shouldn’t matter)
2) Mount the drive
3) Enable BitLocker protection (Windows 10 Pro or higher)
4) Dismount the drive to enable syncing

Now, to the details.

1. VHDX creation and mounting

You can do this via the Disk Management GUI: right-click Start, choose Disk Management, then Action -> Create VHD. A disk created this way is neither initialized nor formatted, so you need to right-click the disk, select Initialize Disk, then choose a partition type and file system and finish. Here is a PowerShell one-liner to create a 15 GB disk. Note you can compute the byte count with the formula (GB * 1024 * 1024 * 1024). Also note it requires Hyper-V to be installed (the New-VHD cmdlet comes from there).

New-VHD -Path S:\OneDrive\TestEncryption.vhdx -Dynamic -SizeBytes 16106127360 |
    Mount-VHD -Passthru | Initialize-Disk -PartitionStyle MBR -PassThru |
    New-Partition -UseMaximumSize -AssignDriveLetter | Format-Volume -FileSystem NTFS

With the above, you will get a new drive mounted.

2. Encrypt with Bitlocker

Right-click your new drive and choose “Turn on BitLocker”. The following prompt will appear:

[Screenshot: BitLocker setup wizard]

Enter a password and save your recovery key somewhere safe, as it will be your last resort if your password stops working, then complete the wizard to encrypt the drive.
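If you prefer scripting, the same can be done with the BitLocker PowerShell module – a minimal sketch, assuming the mounted VHDX received drive letter E: (adjust the letter to your setup):

```powershell
# Assumes the mounted VHDX got drive letter E: - adjust as needed
$password = Read-Host -AsSecureString -Prompt "BitLocker password"

# Protect the volume with a password, then add a recovery password as a fallback
Enable-BitLocker -MountPoint "E:" -PasswordProtector -Password $password
Add-BitLockerKeyProtector -MountPoint "E:" -RecoveryPasswordProtector

# Show the generated recovery password - store it somewhere safe
(Get-BitLockerVolume -MountPoint "E:").KeyProtector |
    Where-Object KeyProtectorType -eq "RecoveryPassword"
```

The BitLocker module ships with Windows 10 Pro and Server 2012+; run it elevated.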

With the VHDX located in the OneDrive sync folder, copy files directly to the new drive. After you are done, just dismount the drive – OneDrive (and likely other sync clients) cannot sync a mounted disk.
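The day-to-day routine can be scripted too – a sketch assuming the same path and drive letter as above:

```powershell
# Mount the encrypted container (OneDrive cannot sync the file while mounted)
Mount-VHD -Path S:\OneDrive\TestEncryption.vhdx

# Unlock it with your BitLocker password
$password = Read-Host -AsSecureString -Prompt "BitLocker password"
Unlock-BitLocker -MountPoint "E:" -Password $password

# ... work with your files on E: ...

# Dismount so OneDrive can pick the VHDX up again
Dismount-VHD -Path S:\OneDrive\TestEncryption.vhdx
```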

Enjoy Bitlocker-Protected OneDrive 😉
By the way, public cloud providers will hate you for this, as it breaks deduplication on their storage – encrypted data doesn’t deduplicate well 🙂

AlexP

NB: I also explored the option of using a smart card (I own a YubiKey) – I was able to encrypt the drive once, but couldn’t get the certificate detected when trying to unlock it. Let me know if you had more luck with it!

Publish SSO SCOM Web Console to Internet via Azure Web Application proxy

Hey there!

With the new 1801 web console, more and more companies are likely to depend on it in the foreseeable future, especially as SCOM has been given more focus lately. This post covers publishing the SCOM web console to the internet via Azure Web Application Proxy, with single sign-on if your environment allows it. As Azure AD is widely adopted as an authentication mechanism, it makes sense for enterprises to consolidate their sign-on efforts there, given all the capabilities it provides.

Prerequisites:

  1. Azure-AD-enabled environment. You don’t need to have it synced via AD Connect; you just need a tenant that can be used. At least the AAD Basic plan is required.
  2. SCOM Installation (well, it is a prerequisite in the end 🙂 )
  3. Some knowledge on Kerberos Constrained Delegation
  4. Some understanding of Federation/SSO features
  5. Global Administrator rights on Azure tenant

Disclaimer: I will describe what worked for me to get SSO working. Your environment will most likely be configured differently, so please do not treat this post as a guide, but rather as an inspiration.

1. Enable Azure AD Web Application proxy

Log in to the Azure portal and navigate to the Azure Active Directory blade. Click “Application proxy” and the following blade should appear:

[Screenshot: Application proxy blade]

Ignore the existing connector here; it’s a leftover from my lab setup. You need to enable Application Proxy for your tenant first. Once you enable it, you can register an AAD Proxy connector on your server. For production environments, I would suggest dedicated servers working as a group – a sort of cluster, in that at least one gateway in a group must be active.

Click the “Download Connector” link and install it on your desired Web Application Proxy gateway. During installation it will ask you to sign in; use your privileged credentials to register the connector. Also bear in mind the server needs to be able to connect remotely to the Azure datacenters.

Once you have installed the gateway, it’s time to configure it. Assign the connector to a group that works for you. Remember one connector group can serve many applications – in my example “SCOM” is valid, but you might want to name the group after the on-premises network boundary it reaches (LAN, DMZ, etc.).

Now it’s time to register the on-premises application for authentication. I went to “Enterprise applications -> Categories -> Add an application -> Add your own on-premises application” and filled in the details as follows:

[Screenshot: on-premises application settings]

You might notice that I chose the internal URL to be only the FQDN of the SCOM web console server. I will explain later in this post why I used that instead of http://w16d-scom/OperationsManager.

Now that this part is complete, we may want to create an additional security group for SCOM web console access. I encourage using groups instead of direct user assignments for obvious reasons – it helps a lot with access management.

You need to grant the user or group access to the enterprise application you created. You can do so by navigating to “Enterprise Applications -> SCOM Web Console -> Add Assignment -> User or Group”:

[Screenshot: user and group assignment]

Now we are reaching the tricky part: how will the SCOM server know that the person who authenticated really is that person? Since the SCOM web console uses Integrated Windows Authentication, Kerberos Constrained Delegation can help here. See more details here.

To get IWA to work with our SCOM web console, we need to modify KCD settings in Active Directory. In the AD Users and Computers snap-in, navigate to the computer object of the Web Application Proxy server and choose the Delegation tab. In my example I’m running the Web Application Proxy on the same server as the SCOM web console, so it’s actually the same computer object. In this specific case delegation is not even required (the authenticating server and the authentication target are the same), but I did it for clarity. Also, this part needs revision for your specific environment, as the identity needs to be matched to a local AD account with enough permissions to access the SCOM web console. In my setup, my local AD is oblivious to Azure AD, meaning there is no trust between these domains at all. To overcome this, I created a user “alexpawlak@aleksanderpawlakfalck.onmicrosoft.com” in my local AD so that the User Principal Name from AAD could be matched properly. I also registered HTTP SPNs for my SCOM web console server so they would be used:

SetSPN -S http/w16d-scom w16d-scom
SetSPN -S http/w16d-scom.ad.alexpawlak.pl w16d-scom
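The Delegation-tab change can also be sketched with the ActiveDirectory PowerShell module. This is an illustration using my lab's server name; in your environment the first identity is the Web Application Proxy computer object and the SPNs are the ones you registered:

```powershell
Import-Module ActiveDirectory

# Allow the Web Application Proxy computer account to delegate
# to the SCOM web console SPNs (constrained delegation)
Set-ADComputer -Identity "W16D-SCOM" -Add @{
    'msDS-AllowedToDelegateTo' = 'http/w16d-scom', 'http/w16d-scom.ad.alexpawlak.pl'
}

# Enable protocol transition ("use any authentication protocol"),
# needed because the user authenticated against AAD, not with Kerberos
Set-ADAccountControl -Identity (Get-ADComputer "W16D-SCOM") -TrustedToAuthForDelegation $true
```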

Once you have the username ready, configure the SSO settings of the Web Application Proxy application as follows. Play around with these settings and see what works for you in the end 🙂

[Screenshot: single sign-on settings]

Furthermore, the app is made available on the Office 365 application dashboard:

[Screenshot: app tile on the Office 365 dashboard]

Now that the Azure AD part is covered, there is one extra thing you need to do to trick SCOM into loading properly. As we set the web application proxy’s internal URL to the FQDN of the web server, we currently get the generic IIS greeting screen. To overcome this, log on to the SCOM web server and, in IIS Manager, open HTTP Redirect:

[Screenshot: HTTP Redirect feature in IIS Manager]

And configure this feature like that:

[Screenshot: HTTP Redirect configuration]

This way any request that goes to the bare FQDN will be redirected to the OperationsManager application.
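For reference, the same HTTP Redirect setting can be applied from PowerShell with the WebAdministration module – a sketch assuming the web console lives under the Default Web Site:

```powershell
Import-Module WebAdministration

# Redirect requests hitting the bare FQDN to the OperationsManager application
Set-WebConfigurationProperty -PSPath 'IIS:\Sites\Default Web Site' `
    -Filter 'system.webServer/httpRedirect' -Name 'enabled' -Value $true
Set-WebConfigurationProperty -PSPath 'IIS:\Sites\Default Web Site' `
    -Filter 'system.webServer/httpRedirect' -Name 'destination' -Value '/OperationsManager'
```

Depending on how the GUI was configured, the exactDestination and childOnly attributes of the same section may need setting as well.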

At least for me, this step was enough to get everything working.

Let me know if you are using Azure Web Application Proxy and to what extent – as far as I know it isn’t used at all in my current company, and it could well help us provide secure remote access to resources we might need to reach, all with credentials protected by AAD features:

[Screenshot: SCOM web console loaded through the proxy]

Thanks for reading!

Br.

Alex Pawlak

SCOM – Upgrade to 1801 on production

Hey

This post covers my experiences with upgrading a multi-server SCOM 2016 setup to SCOM 1801, and highlights some aspects of the upgrade that are, in my opinion, skipped or not given enough attention in the upgrade guide.

If you haven’t done so already, here is the link.

Read it. Then think if you understand everything, and then read it again 🙂

In case of Ooops….

I can’t stress enough how important it is to have VM checkpoints (aka snapshots) in place. And make sure your storage has enough space to hold the differencing disks long enough to complete the upgrade. I assume your OM environment is fully virtualized; if it isn’t, make sure you have a reliable backup of your physical servers. My test environment actually didn’t survive the upgrade – the agent didn’t start up due to the cryptic error “Class not registered”. The MS support engineer gave up on my case, saying I had missed (and I did!) the minimum requirements of 4 cores and 8 GB of RAM on my test server. Shame 🙂

Patch it twice.

Once you get the VM snapshots done, my recommendation is to check Microsoft Update for patches to the SCOM installation software. If you don’t have that kind of access, try to bribe the network guys – throw some routers at them or whatever works in your case ;). The thing is, MS frequently updates the installation software but not necessarily the ISO files it ships with, and there might be fairly important bugfixes which decide whether your deployment is a success or not.

A little bit of scripting here and there

You need to stop all SCOM services on all other management servers. A short PS snippet:

$Servers = (Get-SCOMResourcePool "All Management Servers Resource Pool" | % Members | % DisplayName).Where{$_ -notlike "$env:computername*"}

This gets the management servers other than the one you are running on.

Get-Service omsdk,healthservice,cshost -ComputerName $servers

This should return all services as stopped. If it does, you have met one of the important prerequisites.
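If any of them are still running, they can be stopped remotely in one go – a sketch building on the $Servers variable from above:

```powershell
# Stop the SCOM services on all other management servers
Invoke-Command -ComputerName $Servers -ScriptBlock {
    Stop-Service -Name cshost, healthservice, omsdk -Force
}

# Verify they are all stopped before launching the upgrade
Get-Service -Name omsdk, healthservice, cshost -ComputerName $Servers |
    Select-Object MachineName, Name, Status
```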

Also – you will fail to upgrade SCOM if you don’t know your service account passwords, so check your documentation or password manager.

The first management server takes a really long time to upgrade. Subsequent servers upgrade much faster – this is because the SCOM database / DWH is upgraded along with the first management server.

Telemetry data / Application Insights in web console

Another nifty thing, which I encourage you to make the most of, is the SCOM 1801 web console. It is a seemingly little-known fact that MS has incorporated Application Insights into the new web console, so telemetry data is sent to MS to check performance and usage. See more details here. Application Insights, if you are unfamiliar with it, is a telemetry service that gives developers pretty robust information on how an application is used, its response times, and so on. Unless you have a very specific requirement not to, I’d suggest forwarding this data (actually all telemetry data) to MS – it’s in their interest to make SCOM better, and in ours to make our everyday monitoring tool a bit better.

Upgrade agents – *poof* audit data gone

Last time I upgraded SCOM agents, I learned the hard way that agents lose their audit configuration upon upgrade. So if you use ACS collectors, you need to set it again. With a little help from ConfigMgr, I made myself a quick compliance check to make sure all domain controllers are running a properly configured ACS agent.

# SilentlyContinue so a missing AdtAgent service yields "Not OK" instead of an error
$AdtService = Get-Service AdtAgent -ErrorAction SilentlyContinue

if ($AdtService.Status -eq "Running") {
    return "OK"
}
else {
    return "Not OK"
}

And also a registry query against the following value, with a matching check in SCCM:

HKLM:\SOFTWARE\Policies\Microsoft\AdtAgent\Parameters\AdtServers
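The registry side of the check can be sketched in the same compliance-script style as above (same OK / Not OK strings for SCCM to evaluate):

```powershell
# Check that the ACS forwarder policy still points at a collector
$key = 'HKLM:\SOFTWARE\Policies\Microsoft\AdtAgent\Parameters'

if ((Test-Path $key) -and (Get-ItemProperty -Path $key).AdtServers) {
    return "OK"
}
else {
    return "Not OK"
}
```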

[Screenshot: ACS configuration item in SCCM]

These rules are collected into a compliance baseline deployed to a Domain Controllers collection. Now I’m certain I’ll get alerted if I forget to set it up again. In case anyone needs an SCCM AD domain controller collection query definition, here it is:

select SMS_R_SYSTEM.ResourceID, SMS_R_SYSTEM.ResourceType, SMS_R_SYSTEM.Name,
SMS_R_SYSTEM.SMSUniqueIdentifier, SMS_R_SYSTEM.ResourceDomainORWorkgroup,
SMS_R_SYSTEM.Client
from SMS_R_System
inner join SMS_G_System_COMPUTER_SYSTEM
    on SMS_G_System_COMPUTER_SYSTEM.ResourceId = SMS_R_System.ResourceId
where SMS_R_System.PrimaryGroupID = "516"

Timeouts, all timeouts

The upgrade was successful – well, to some extent. The default configuration in the ConfigService.config file may contain timeout thresholds that are pretty low. All my management servers kept erroring out, mostly with event 29181:

OpsMgr Management Configuration Service failed to execute 'LocalHealthServiceDirtyNotification' engine work item due to the following exception:

Microsoft.EnterpriseManagement.ManagementConfiguration.DataAccessLayer.DataAccessOperationTimeoutException

I dared to increase the timeouts for these operations (I have no proper insight into the consequences of this, so follow at your own risk), modifying the relevant line in the config file and raising the timeout until the problem was gone.

Then I copied the config file and bounced the cshost services on the management servers. I really hope this didn’t set up a delayed bomb of some sort [please reassure me or prove me wrong].

That’s about it so far; I might drop in a short post later if anything new comes up. In the meantime I’ll enjoy the shiny new SCOM, if you don’t mind.

Br.
~Alex

Manually approve Microsoft Bookings appointments

Set Bookings to manual approval

Microsoft Bookings is a pretty nice feature that allows you to create services for your customers to request. It’s built on the Exchange platform and available for select Office subscriptions (Business Premium and above at the time of writing). In short, you define a set of services you offer, publish it to the internet (or as a Facebook-page plugin(!)), and let customers self-serve. Pretty nifty, especially for small local businesses, which usually lack IT resources.

One of the obstacles I stumbled onto is the lack of manual approvals – that is, a customer can book a time which you have forgotten to mark as unavailable. Bookings is able to read your calendar and does not display dates where you are busy, but currently, once a customer completes a booking, they immediately receive a confirmation e-mail and a new meeting is created. This has even been raised on Bookings UserVoice (204 votes so far). The ETA for implementing this feature natively is unknown, but thankfully we have the mail flow feature in the Exchange Admin Center.

Here is a quick roundup of steps that you need to take to set this up:

  1. Log in to the Exchange Admin Center
  2. Go to Mail Flow and create a new rule as follows:

[Screenshot: mail flow rule settings]

The Bookings mail address is created when you first set up the Bookings app. If you don’t know it, you can find it in the URL: https://outlook.office365.com/owa/calendar/BookingSite@yourorganization.com/bookings/

A quick explanation of the settings: any mail sent from your Bookings mailbox that contains the mentioned keywords in the subject and goes outside your organization (to the customer) will be sent for approval to a person of your choice. Save the rule and test it. An approval mail will shortly be sent to you that looks as follows:

[Screenshot: approval mail with decision buttons]

Once you press “Approve”, the Bookings mailbox will send the message to the customer, confirming their appointment. One thing I noticed is that these decision buttons seem not to display properly in Outlook for Android (and possibly other mobile apps).
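If you prefer scripting, roughly the same rule can be created from Exchange Online PowerShell – a sketch with placeholder addresses and a hypothetical subject keyword; match them to whatever your own rule uses:

```powershell
# Hold outgoing Bookings confirmations for manual approval.
# The addresses and the subject keyword are placeholders - adjust to your tenant.
New-TransportRule -Name "Approve Bookings confirmations" `
    -From "BookingSite@yourorganization.com" `
    -SubjectContainsWords "confirmed" `
    -SentToScope NotInOrganization `
    -ModerateMessageByUser "approver@yourorganization.com"
```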

Let me know if this helped you; if you need further assistance, feel free to contact me.

~AlexP

Automatic portfolio created from Instagram tags

Hey there!

Recently my SO had the idea to try her strengths in the make-up industry. Details aside, she needed some IT to promote her business, and that’s where I come in. I created a website and loaded a Bootstrap theme that included a nice portfolio (I’m no frontend guy; any interaction with JS/CSS is a no-no for me), and this portfolio had to be populated with actual images. To save us the work, I wrote ASP.NET code which pulls image data from Azure Table storage and injects it into the view, and I decided to use Flow to actually put the data from Instagram into Table storage.

For her clients she takes before-and-after photos, to show the actual difference make-up can make. Every time she posts a photo to Instagram with a specific tag, Flow pulls the photo into Azure Table storage, which the website in turn queries to display the photos. You can see the end result here (mind you, some photos are not yet replaced – work in progress 😉):

Lyada Website – Makeup

The tags we agreed to use are #LyadaDaily for all the occasional photos, #LyadaWedding for when she does make-up for brides, and #LyadaArt for advanced make-up serving a rather theatrical purpose. This is how it looks on the website:

[Screenshot: Lyada Make-up portfolio page]

The Flow itself is triggered by new Instagram media:

[Screenshot: Instagram trigger in Flow]

Afterwards, the ID of the uploaded media is used in the “Get my recent media” activity. This retrieves all recent objects, as I want to use the opportunity to update the previous posts as well (I need to update the “Likes” count to show only the top works on the website).

[Screenshot: “Get my recent media” activity in Flow]

The condition tree is built that way because, so far, Flow does not support a Switch statement that would simplify this task. Each condition looks for text matching a specific tag and processes the entry into the Azure Table; if the content doesn’t match, the second and then the third condition is evaluated.
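For comparison, the missing Switch would collapse the whole condition tree into something like this PowerShell sketch (tag names from above; the function name is mine):

```powershell
# Map an Instagram caption to the website category it should land in.
# A Flow "Switch" action would express the same logic in a single step.
function Get-LyadaCategory {
    param([string]$Caption)

    switch -Wildcard ($Caption) {
        '*#LyadaWedding*' { return 'Wedding' }
        '*#LyadaArt*'     { return 'Art' }
        '*#LyadaDaily*'   { return 'Daily' }
        default           { return $null }
    }
}

Get-LyadaCategory 'Bridal trial #LyadaWedding'   # Wedding
```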

Finally, “Insert or Merge Entity” is used. I had to spend some time figuring that one out, but it finally worked. In the Azure Table I chose the PartitionKey \ ID pair as the primary key constraints, as these are going to be rather unique.

[Screenshot: “Insert or Merge Entity” action in Flow]

That aside, if you are not familiar with Microsoft Flow, check it out at flow.microsoft.com – for non-Office users there is a free tier which allows 750 Flow runs per month, with polling triggers every 15 minutes.

If you have any trouble with it, let me know and maybe I can help out 🙂

~Alex Pawlak